WorldWideScience

Sample records for unit computer file

  1. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  2. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)
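    To make the plane-sweep idea concrete, here is a minimal Python sketch (ours, not from the paper): the sweep line visits interval endpoints in sorted order while maintaining the set of intervals it currently crosses, answering a simple geometric question, the maximum number of simultaneously overlapping intervals, in O(n log n) time.

```python
# Minimal plane-sweep sketch: sweep across interval endpoints (the "events"),
# maintaining the count of intervals currently intersecting the sweep line.

def max_overlap(intervals):
    """intervals: list of (start, end) pairs with start <= end."""
    events = []
    for start, end in intervals:
        events.append((start, +1))   # interval enters the sweep line
        events.append((end, -1))     # interval leaves the sweep line
    # Process events left to right; on ties, process departures first so
    # intervals that merely touch are not counted as overlapping.
    events.sort(key=lambda e: (e[0], e[1]))
    active = best = 0
    for _, delta in events:
        active += delta
        best = max(best, active)
    return best

print(max_overlap([(0, 3), (1, 5), (4, 6)]))  # -> 2
```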

  3. A computer controlled tele-cobalt unit

    International Nuclear Information System (INIS)

    Brace, J.A.

    1982-01-01

    A computer controlled cobalt treatment unit was commissioned for treating patients in January 1980. Initially the controlling computer was a minicomputer, but now the control of the therapy unit is by a microcomputer. The treatment files, which specify the movement and configurations necessary to deliver the prescribed dose, are produced on the minicomputer and then transferred to the microcomputer using minitape cartridges. The actual treatment unit is based on a standard cobalt unit with a few additional features e.g. the drive motors can be controlled either by the computer or manually. Since the treatment unit is used for both manual and automatic treatments, the operational procedure under computer control is made to closely follow the manual procedure for a single field treatment. The necessary safety features which protect against human, hardware and software errors as well as the advantages and disadvantages of computer controlled radiotherapy are discussed

  4. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
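    As an illustration of the offset-and-length metadata described in this record, here is a hedged Python sketch (ours, not the patented implementation): small files are concatenated into one aggregated file while a JSON side file records each member's offset and length, which is enough to unpack any member later.

```python
# Pack many small files into one aggregated file plus an (offset, length)
# index; unpack any member by seeking to its recorded offset.

import json

def aggregate(file_bytes_by_name, packed_path, index_path):
    """file_bytes_by_name: dict mapping file name -> bytes content."""
    index, offset = {}, 0
    with open(packed_path, "wb") as packed:
        for name, data in file_bytes_by_name.items():
            packed.write(data)
            index[name] = {"offset": offset, "length": len(data)}
            offset += len(data)
    with open(index_path, "w") as f:
        json.dump(index, f)            # the metadata: offset + length per file

def unpack(name, packed_path, index_path):
    with open(index_path) as f:
        entry = json.load(f)[name]
    with open(packed_path, "rb") as packed:
        packed.seek(entry["offset"])
        return packed.read(entry["length"])

aggregate({"a.dat": b"hello", "b.dat": b"world!"}, "packed.bin", "packed.idx")
print(unpack("b.dat", "packed.bin", "packed.idx"))  # b'world!'
```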

  5. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
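    A minimal sketch of such a list-based index, under our own assumptions about layout (the patent specifies no on-disk format): each entry is a list holding the storage locations of the file and its replicas together with a checksum, and a query walks the list and returns the first copy that validates.

```python
# One index entry per file: a list of locations (primary first, then
# replicas) plus a checksum usable to validate any copy.

import hashlib

def make_index_entry(primary_path, replica_paths):
    with open(primary_path, "rb") as f:
        checksum = hashlib.sha256(f.read()).hexdigest()
    return {"locations": [primary_path, *replica_paths], "sha256": checksum}

def first_valid_copy(entry):
    """Process a query using the list: return the first location whose
    contents match the stored checksum."""
    for path in entry["locations"]:
        try:
            with open(path, "rb") as f:
                if hashlib.sha256(f.read()).hexdigest() == entry["sha256"]:
                    return path
        except OSError:
            continue                  # storage node unreachable / file missing
    raise IOError("no valid replica found")
```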

  6. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  7. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  8. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  9. How Do We Really Compute with Units?

    Science.gov (United States)

    Fiedler, B. H.

    2010-01-01

    The methods that we teach students for computing with units of measurement are often not consistent with the practice of professionals. For professionals, the vast majority of computations with quantities of measure are performed within programs on electronic computers, for which an accounting for the units occurs only once, in the design of the…

  10. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system

  11. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
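    To illustrate, a hedged Python sketch of the parser hook (function names are ours, purely illustrative): the storage layer calls the application-supplied parser on each file, skips files that fail its semantic check, and stores any extracted metadata alongside the accepted files for later searching.

```python
# Storage layer driven by a user-specified parser: the parser both filters
# files (semantic requirements) and extracts searchable metadata.

def store_with_parser(files, parser, store):
    """files: dict name -> bytes; parser(data) -> metadata dict or None;
    store(name, data, metadata): writes to a storage node (stubbed)."""
    for name, data in files.items():
        metadata = parser(data)
        if metadata is None:          # semantic requirement not met: skip
            continue
        store(name, data, metadata)   # metadata kept for later searching

# Example parser: accept only CSV-like files and record their column names.
def csv_header_parser(data):
    text = data.decode("utf-8", errors="replace")
    header = text.splitlines()[0] if text else ""
    return {"columns": header.split(",")} if "," in header else None
```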

  12. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. They now describe a new software package which allows the reporting and filing of roentgenograms. This program was written by a radiologist using a well-known database management system, dBASE III. The program was shaped to fit the radiologist's needs: it helps in reporting, and allows the filing of, radiological data with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer

  13. Citizens unite for computational immunology!

    Science.gov (United States)

    Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M

    2015-07-01

    Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  15. 77 FR 2492 - United States Pharmacopeial Convention; Filing of Food Additive Petition; Amendment

    Science.gov (United States)

    2012-01-18

    ..., and 180 [Docket No. FDA-2010-F-0320] United States Pharmacopeial Convention; Filing of Food Additive... Food and Drug Administration (FDA) is amending the filing notice for a food additive petition filed by the U.S. Pharmacopeial Convention requesting that the food additive regulations that incorporate by...

  16. 75 FR 48353 - United States Pharmacopeial Convention; Filing of Food Additive Petition

    Science.gov (United States)

    2010-08-10

    ...] United States Pharmacopeial Convention; Filing of Food Additive Petition AGENCY: Food and Drug.... Pharmacopeial Convention has filed a petition proposing that the food additive regulations that incorporate by... that a food additive petition (FAP 0A4782) has been filed by U.S. Pharmacopeial Convention, 12601...

  17. Control of peripheral units by satellite computer

    International Nuclear Information System (INIS)

    Tran, K.T.

    1974-01-01

A computer system was developed allowing the control of nuclear physics experiments and the use of the results by means of graphical and conversational assemblies. This system, which is made of two computers, an IBM-370/135 and a Telemecanique Electrique T1600, controls the conventional IBM peripherals as well as special ones made in the laboratory, such as data acquisition display and graphics units. The visual display is implemented by a scanning-type television equipped with a light-pen. These units are in themselves universal, but their specifications were established to meet the requirements of nuclear physics experiments. The input-output channels of the two computers have been connected together by an interface designed and implemented in the laboratory. This interface allows the exchange of control signals and data (the data are converted from bytes into words and vice versa). The T1600 controls the peripherals mentioned above according to the commands of the IBM 370. Hence the T1600 plays here the part of a satellite computer which allows conversation with the main computer and also ensures the control of its special peripheral units

  18. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
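    A minimal sketch of the idea, assuming the semantic information says the payload is a flat array of samples: lower-resolution replicas keep only every k-th element (a "different sub-set of data elements"), so a coarse replica can answer approximate queries at a fraction of the storage and I/O cost.

```python
# Multi-resolution replicas of an array-valued file: stride-k subsampling
# yields progressively coarser copies alongside the full-resolution file.

import numpy as np

def make_replicas(samples, strides=(1, 4, 16)):
    """Return {stride: array}; stride 1 is the full-resolution file."""
    return {k: samples[::k].copy() for k in strides}

full = np.arange(1_000_000, dtype=np.float32)
replicas = make_replicas(full)
print({k: v.nbytes for k, v in replicas.items()})   # storage per resolution
# A coarse replica can answer approximate queries cheaply, e.g.:
print(replicas[16].mean(), full.mean())
```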

  19. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e+e− collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order of magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and low quality of network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  20. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared with rotary ProTaper files...

  1. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  2. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
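    A small sketch, with illustrative names, of what semantically meaningful boundaries buy: the file is split into sub-files at record boundaries (here, simply newline-delimited records) rather than at arbitrary byte offsets, so each sub-file is independently parseable and can be replicated on its own.

```python
# Split data into sub-files of roughly target_size bytes without ever
# cutting a record in half; the "data structure description" here is just
# "newline-delimited records".

def split_on_record_boundaries(data: bytes, target_size: int):
    chunk = bytearray()
    for record in data.splitlines(keepends=True):
        if chunk and len(chunk) + len(record) > target_size:
            yield bytes(chunk)
            chunk = bytearray()
        chunk.extend(record)
    if chunk:
        yield bytes(chunk)

subs = list(split_on_record_boundaries(b"r1\nr2\nlong-record-3\nr4\n", 8))
print(subs)  # each sub-file ends exactly at a record boundary
```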

  3. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  4. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

O'Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  5. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  6. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
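    For example, the DICOM scrubbing step might look like the following Python sketch using the pydicom library (our choice of tool; the article names no specific software, and a real anonymization profile covers many more tags than shown here):

```python
# Blank a few identifying DICOM header elements and drop private tags
# before sharing an image for teaching or research.

import pydicom

def scrub(in_path, out_path):
    ds = pydicom.dcmread(in_path)
    for keyword in ("PatientName", "PatientID", "PatientBirthDate"):
        if keyword in ds:
            setattr(ds, keyword, "")   # blank identifying header fields
    ds.remove_private_tags()           # drop vendor-specific private elements
    ds.save_as(out_path)

# scrub("study.dcm", "study_anon.dcm")   # paths are illustrative
```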

  7. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  8. Computing with impure numbers - Automatic consistency checking and units conversion using computer algebra

    Science.gov (United States)

    Stoutemyer, D. R.

    1977-01-01

    The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and features automatic detection of dimensionally-inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Some examples illustrate these features.
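    The same style of automatic consistency checking is available today in ordinary programming languages; a sketch with Python's pint units library (a substitute for MACSYMA, which the paper actually used) shows both features the paper describes, automatic unit conversion and detection of dimensionally inhomogeneous formulas:

```python
# Units-aware arithmetic: conversions happen automatically, and adding
# quantities of different dimensions raises an error.

import pint

ureg = pint.UnitRegistry()

distance = 3.0 * ureg.meter
time = 2.0 * ureg.second
speed = distance / time
print(speed.to(ureg.kilometer / ureg.hour))   # automatic unit conversion

try:
    bad = distance + time                     # dimensionally inhomogeneous
except pint.DimensionalityError as err:
    print("caught:", err)
```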

  9. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

A film-based teaching file takes up space, and the need to search through such a file places limits on the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, so for these reasons we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet. This was suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM-PC compatible computers (386DX, 486DX-II, and Pentium) in downloading the images and in developing a digitized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps), and Twinsock and Netscape were used to navigate the Internet. A Korean word-processing package (version 3.0) was used to create the HTML files, and the downloaded images were linked to them. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a high-speed computer (at least a 486DX-II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that they were good enough to use as a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital image teaching file using HTML is easy to use. Our method of creating a digital imaging teaching file using the Internet and HTML is easy to follow, and radiologists with little computer experience who want to study various digital radiologic imaging cases would find it easy to use

  10. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

Globus software must be installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page, select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop-down menu, select Transfer Files. Then click Get

  11. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
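    The quoted block-size rule is simple enough to state in a few lines; this Python sketch (ours; a real system would do the subsequent data exchange with MPI, which is out of scope here) computes the dynamically determined block size as the total amount of data divided by the number of parallel processes:

```python
# Dynamic block-size rule: total data across all processes divided by the
# number of processes; each process then exchanges bytes until it holds
# exactly one block of this size to write to the file system.

def dynamic_block_size(bytes_per_process):
    total = sum(bytes_per_process)
    n = len(bytes_per_process)
    return total // n               # e.g., PLFS-style equal-sized blocks

sizes = [700, 100, 400]             # uneven data held by 3 processes
block = dynamic_block_size(sizes)
print(block)                        # 400: each process writes one 400-byte block
```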

  12. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

All existing file browsers display 3 timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate these temporal attributes to conceal the traces of file use. However, every file in NTFS has 8 timestamps, stored in its file record, which can be used to detect the fact of attribute substitution. The authors suggest a method of revealing the original timestamps after replacement, and an automated variant of it for a set of files.
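    For orientation, the three timestamps that ordinary file browsers show are the ones portable APIs expose; a short Python sketch reads them with os.stat. (The remaining NTFS timestamps live in the $FILE_NAME attribute of the MFT file record and require forensic tooling rather than os.stat to read.)

```python
# Read the three user-visible file timestamps via the portable stat API.

import datetime
import os

st = os.stat(__file__)              # stat this script itself, so it always exists
for label, ts in (("modified", st.st_mtime),
                  ("accessed", st.st_atime),
                  ("created/changed", st.st_ctime)):  # creation time on Windows,
                                                      # metadata change time on Unix
    print(label, datetime.datetime.fromtimestamp(ts))
```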

  13. New Generation General Purpose Computer (GPC) compact IBM unit

    Science.gov (United States)

    1991-01-01

    New Generation General Purpose Computer (GPC) compact IBM unit replaces a two-unit earlier generation computer. The new IBM unit is documented in table top views alone (S91-26867, S91-26868), with the onboard equipment it supports including the flight deck CRT screen and keypad (S91-26866), and next to the two earlier versions it replaces (S91-26869).

  14. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

Cloud computing has pervaded every aspect of information technology in the past decade. It has become easier to process the plethora of data generated by various devices in real time with the advent of cloud networks. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.

  15. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 x 390 to 545 x 414, 256 gray levels) and displayed on a 17-inch flat monitor (1024 x 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer's monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost appliances, we could implement the image database system of a teaching file using a personal computer by a relatively inexpensive method

  16. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  17. Energy efficiency of computer power supply units - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Aebischer, B. [cepe - Centre for Energy Policy and Economics, Swiss Federal Institute of Technology Zuerich, Zuerich (Switzerland); Huser, H. [Encontrol GmbH, Niederrohrdorf (Switzerland)

    2002-11-15

    This final report for the Swiss Federal Office of Energy (SFOE) takes a look at the efficiency of computer power supply units, which decreases rapidly during average computer use. The background and the purpose of the project are examined. The power supplies for personal computers are discussed and the testing arrangement used is described. Efficiency, power-factor and operating points of the units are examined. Potentials for improvement and measures to be taken are discussed. Also, action to be taken by those involved in the design and operation of such power units is proposed. Finally, recommendations for further work are made.

  18. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems traceability when measuring 3D complex geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty

  19. Installation of new Generation General Purpose Computer (GPC) compact unit

    Science.gov (United States)

    1991-01-01

    In the Kennedy Space Center's (KSC's) Orbiter Processing Facility (OPF) high bay 2, Spacecraft Electronics technician Ed Carter (right), wearing clean suit, prepares for (26864) and installs (26865) the new Generation General Purpose Computer (GPC) compact IBM unit in Atlantis', Orbiter Vehicle (OV) 104's, middeck avionics bay as Orbiter Systems Quality Control technician Doug Snider looks on. Both men work for NASA contractor Lockheed Space Operations Company. All three orbiters are being outfitted with the compact IBM unit, which replaces a two-unit earlier generation computer.

  20. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  1. Computer systems for the control of teletherapy units

    International Nuclear Information System (INIS)

    Brace, J.A.

    1985-01-01

    This paper describes a computer-controlled tracking cobalt unit installed at the Royal Free Hospital. It is based on a standard TEM MS90 unit and operates at 90-cm source-axis distance with a geometric field size of 45 x 45 cm at that distance. It has been modified so that it can be used either manually or under computer control. There are nine parameters that can be controlled positionally and two that can be controlled in rate mode; these are presented in a table

  2. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

Vol. 33, September (2012), pp. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords: neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  3. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
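    A serial NumPy sketch of the two-dimensional explicit finite-difference heat-diffusion update that such a GPU code parallelizes (the paper's implementation is CUDA; this serial analogue, written under our own assumptions, only shows the stencil that each CUDA thread would apply to one grid point):

```python
# One explicit FTCS (forward-time, centered-space) step of 2-D heat
# diffusion; on a GPU, each interior grid point maps to one thread.

import numpy as np

def heat_step(u, alpha=0.1):
    """alpha = D*dt/dx^2 must be <= 0.25 for stability in 2-D."""
    un = u.copy()
    un[1:-1, 1:-1] = u[1:-1, 1:-1] + alpha * (
        u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
        - 4.0 * u[1:-1, 1:-1]
    )
    return un

u = np.zeros((64, 64))
u[32, 32] = 100.0                    # hot spot in the middle of the plate
for _ in range(50):
    u = heat_step(u)
print(round(u.max(), 3))             # peak temperature after diffusion
```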

  4. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
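    The article's running example, the all-pairs distance computation, can be written compactly in vectorized NumPy (our sketch, not the article's CUDA code); the same data-parallel structure is what maps onto CUDA thread blocks in the GPU version:

```python
# All-pairs Euclidean distances via the expansion
# ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, computed for all pairs at once.

import numpy as np

def all_pairs_distances(X):
    """X: (n, d) data matrix -> (n, n) distance matrix."""
    sq = (X * X).sum(axis=1)                        # squared norms, shape (n,)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # squared distances
    return np.sqrt(np.maximum(d2, 0.0))             # clamp tiny negatives

X = np.random.default_rng(0).normal(size=(5, 3))
D = all_pairs_distances(X)
print(D.shape, np.allclose(np.diag(D), 0.0))        # (5, 5) True
```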

  5. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  6. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  7. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  8. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  9. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect an individual's use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and for regulatory bodies.

  10. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  11. Implicit Theories of Creativity in Computer Science in the United States and China

    Science.gov (United States)

    Tang, Chaoying; Baer, John; Kaufman, James C.

    2015-01-01

    To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…

  12. For what reasons do patients file a complaint? A retrospective study on patient rights units' registries.

    Science.gov (United States)

    Önal, Gülsüm; Civaner, M Murat

    2015-01-01

In 2004, Patient Rights Units were established in all public hospitals in Turkey to allow patients to voice their complaints about services. To determine which violations are reflected in the complaint mechanism, the pattern over time, and patients' expectations of the services. Descriptive study. A retrospective study was performed using the complaint database of the Istanbul Health Directorate, from 2005 to 2011. The results indicate that people who are older than 40 years, women, and those with less than high school education are the most common applicants to these units. A total of 218,186 complaints were filed. Each year, the number of complaints increased compared to the previous year, and nearly half of the applications were made in 2010 and 2011 (48.9%). The three most frequent complaints were "not benefiting from services in general" (35.4%), "not being treated in a respectable manner and in comfortable conditions" (17.8%), and "not being properly informed" (13.5%). Two-thirds of the overall applications were found in favour of the patients (63.3%), but this rate has decreased over the years. Patients would like to be treated in a manner that respects their human dignity. Educating healthcare workers on communication skills might be a useful initiative. More importantly, health policies and the organisation of services should prioritise patient rights. Only then would it be possible to exercise patient rights in reality.

  13. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to the file to which they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of the parity-update problem.
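    Zebra's parity scheme reduces to XOR across the fragments of a stripe, so any one lost fragment can be rebuilt from the survivors, as in RAID-5; a minimal Python sketch (ours, simplified to fixed-size fragments):

```python
# Parity for one stripe is the byte-wise XOR of its data fragments; a lost
# fragment is recovered by XOR-ing the surviving fragments with the parity.

def xor_fragments(fragments):
    out = bytearray(len(fragments[0]))
    for frag in fragments:
        for i, b in enumerate(frag):
            out[i] ^= b
    return bytes(out)

stripe = [b"AAAA", b"BBBB", b"CCCC"]        # data fragments on 3 servers
parity = xor_fragments(stripe)              # stored on a 4th server

lost = stripe[1]                            # server 1 fails...
rebuilt = xor_fragments([stripe[0], stripe[2], parity])
assert rebuilt == lost                      # ...and its fragment is recovered
```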

  15. Measurement of parameters for the quality control of X-ray units by using PIN diodes and a personal computer

    International Nuclear Information System (INIS)

    Ramirez, F.; Gaytan, E.; Mercado, I.; Estrada, M.; Cerdeira, A.

    2000-01-01

The design of a new system for the measurement of the main parameters of X-ray units used in medicine is presented. The system automatically measures the exposure time, the high voltage applied, the waveform of the detected signal, the exposure rate and the total exposure (dose). The X-ray detectors employed are PIN diodes developed at CINVESTAV; the measurements are done in a single shot, without invasion of the X-ray unit. The results are shown on the screen of the computer and can be saved in a file for later analysis. The proposed system is intended to be used in the quality control of X-ray units for clinical radiodiagnosis. It is simple and inexpensive compared with available commercial equipment, which uses ionization chambers and accurate electrometers that small facilities and hospitals cannot afford

  16. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

The problems of reliability standardization in computer-aided manufacturing of NPP units are analyzed under two approaches: computer-aided manufacturing of NPP units as part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the set of reliability indices for computer-aided manufacturing of NPP units is substantiated for each of the approaches considered

  17. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  18. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards

  19. 17 CFR 230.487 - Effectiveness of registration statements filed by certain unit investment trusts.

    Science.gov (United States)

    2010-04-01

    ... Investment Companies; Business Development Companies § 230.487 Effectiveness of registration statements filed... Company Act of 1940 that files a registration statement pursuant to the Act in connection with the..., may designate a date and time for such registration statement to become effective. If the registrant...

  20. Modification to the Monte N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  1. Sandia's computer support units: The first three years

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.N. [Sandia National Labs., Albuquerque, NM (United States). Labs. Computing Dept.

    1997-11-01

This paper describes the method by which Sandia National Laboratories has deployed information technology to the line organizations and to the desktop as part of the integrated information services organization under the direction of the Chief Information Officer. This deployment has been done by the Computer Support Unit (CSU) Department. The CSU approach is based on the principle of providing local customer service with a corporate perspective. Success required an approach that was at times customer-compelled and in most cases market- or corporate-focused. Above all, a complete solution was required that included a comprehensive method of technology choices and development, process development, technology implementation, and support. It is the author's hope that this information will be useful in the development of a customer-focused business strategy for information technology deployment and support. Descriptions of current status reflect the status as of May 1997.

  2. Computer aided design of fast neutron therapy units

    International Nuclear Information System (INIS)

    Gileadi, A.E.; Gomberg, H.J.; Lampe, I.

    1980-01-01

    Conceptual design of a radiation-therapy unit using fusion neutrons is presently being considered by KMS Fusion, Inc. As part of this effort, a powerful and versatile computer code, TBEAM, has been developed which enables the user to determine physical characteristics of the fast neutron beam generated in the facility under consideration, using certain given design parameters of the facility as inputs. TBEAM uses the method of statistical sampling (Monte Carlo) to solve the space, time and energy dependent neutron transport equation relating to the conceptual design described by the user-supplied input parameters. The code traces the individual source neutrons as they propagate throughout the shield-collimator structure of the unit, and it keeps track of each interaction by type, position and energy. In its present version, TBEAM is applicable to homogeneous and laminated shields of spherical geometry, to collimator apertures of conical shape, and to neutrons emitted by point sources or such plate sources as are used in neutron generators of various types. TBEAM-generated results comparing the performance of point or plate sources in otherwise identical shield-collimator configurations are presented in numerical form. (H.K.)

  3. 2016 KML Boundary File, United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  4. 2014 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  5. 2015 Cartographic Boundary File, Region for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. 2014 Cartographic Boundary File, United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. 2014 Cartographic Boundary File, Combined Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  8. 2016 Cartographic Boundary File, Current County and Equivalent for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  9. 2014 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  10. 2015 Cartographic Boundary File, Combined Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  11. 2016 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  12. 2016 Cartographic Boundary File, 115th Congressional Districts for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  13. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  14. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  15. 2015 Cartographic Boundary File, State-Congressional District for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  16. 2016 Cartographic Boundary File, 115th Congressional Districts for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  17. 2015 Cartographic Boundary File, State-Congressional District for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  18. 2016 Cartographic Boundary File, 115th Congressional Districts for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /Topologically...

  19. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  20. 2015 Cartographic Boundary File, State-Congressional District for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  1. 2015 Cartographic Boundary File, State-Congressional District for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  2. 2016 Cartographic Boundary File, 115th Congressional Districts for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  3. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  4. 2016 Cartographic Boundary File, 115th Congressional Districts for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  5. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. 2015 Cartographic Boundary File, State-Congressional District for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. 2014 Cartographic Boundary File, State-Congressional District for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  8. 2016 Cartographic Boundary File, Current Combined Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  9. 2014 Cartographic Boundary File, Division for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  10. 2016 Cartographic Boundary File, Current Combined Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  11. 2014 Cartographic Boundary File, State-County for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  12. 2014 Cartographic Boundary File, State-County for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  13. 2016 Cartographic Boundary File, Region for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  14. 2016 Cartographic Boundary File, Region for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  15. 2016 Cartographic Boundary File, Current County and Equivalent for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  16. 2016 Cartographic Boundary File, Current Combined Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  17. 2015 Cartographic Boundary File, Combined Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  18. 2015 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  19. 2014 Cartographic Boundary File, Combined Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  20. 2014 Cartographic Boundary File, Urban Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  1. 2016 Cartographic Boundary File, Division for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  2. 2016 Cartographic Boundary File, Current Combined Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  3. 2015 Cartographic Boundary File, Combined Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  4. 2014 Cartographic Boundary File, Combined Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  5. 2015 Cartographic Boundary File, Region for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. 2014 Cartographic Boundary File, Division for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. 2016 Cartographic Boundary File, Region for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  8. 2016 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  9. 2016 Cartographic Boundary File, United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  10. 2014 Cartographic Boundary File, State for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  11. 2016 Cartographic Boundary File, United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  12. 2014 Cartographic Boundary File, State for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  13. 2015 Cartographic Boundary File, Division for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  14. 2014 Cartographic Boundary File, Region for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  15. 2016 Cartographic Boundary File, United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  16. 2016 Cartographic Boundary File, Current County and Equivalent for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  17. 2014 Cartographic Boundary File, Region for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  18. 2016 Cartographic Boundary File, Current County and Equivalent for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  19. 2014 Cartographic Boundary File, State-County for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  20. 2014 Cartographic Boundary File, State-County for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  1. 2014 Cartographic Boundary File, State-County for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  2. 2015 Cartographic Boundary File, State-County for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  3. 2015 Cartographic Boundary File, Combined Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  4. 2014 Cartographic Boundary File, State for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  5. 2014 Cartographic Boundary File, State for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. 2016 Cartographic Boundary File, Region for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. 2014 Cartographic Boundary File, State for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  8. 2015 Cartographic Boundary File, State-County for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  9. 2015 Cartographic Boundary File, Division for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  10. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography -An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

    Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is equally important; otherwise, canal aberrations such as transportation will be created. The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in first mandibular molars using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial Surgery. Teeth were prepared using one rotary and two reciprocating single-file systems. Teeth were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). When canal transportation was evaluated for each file system over a distance of 7 mm (starting from 0 mm and then evaluating at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029 respectively). The mean value of cervical dentin removal was maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001 in each case). It was concluded that reciprocating motion is better than rotary motion in all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness.

  11. NURE [National Uranium Resource Evaluation] HSSR [Hydrogeochemical and Stream Sediment Reconnaissance] Introduction to Data Files, United States: Volume 1

    International Nuclear Information System (INIS)

    1985-01-01

    One product of the Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program, a component of the National Uranium Resource Evaluation (NURE), is a database of interest to scientists and professionals in the academic, business, industrial, and governmental communities. This database contains individual records for water and sediment samples taken during the reconnaissance survey of the entire United States, excluding Hawaii. The purpose of this report is to describe the NURE HSSR data by highlighting its key characteristics and providing user guides to the data. A companion report, "A Technical History of the NURE HSSR Program," summarizes those aspects of the HSSR Program which are likely to be important in helping users understand the database. Each record on the database contains varying information on general field or site characteristics and analytical results for elemental concentrations in the sample; the database is potentially valuable for describing the geochemistry of specified locations and addressing issues or questions in other areas such as water quality, geoexploration, and hydrologic studies. This report is organized in twelve volumes. This first volume presents a brief history of the NURE HSSR program, a description of the data files produced by ISP, a Users' Dictionary for the Analysis File and graphs showing the distribution of elemental concentrations for sediments at the US level. Volumes 2 through 12 are comprised of Data Summary Tables displaying the percentile distribution of the elemental concentrations on the file. Volume 2 contains data for the individual states. Volumes 3 through 12 contain data for the 1° x 2° quadrangles, organized into eleven regional files; the data for the two regional files for Alaska (North and South) are bound together as Volume 12.

  12. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str...
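
    A minimal sketch of the file-writing step such a protocol needs (not the authors' script): each conformer geometry is written out in the standard Gaussian 09 input layout of route section, title, charge/multiplicity and Cartesian coordinates. The route keywords, charge and multiplicity below are placeholder assumptions.

      ROUTE = "# B3LYP/6-311+G(2d,p) NMR=GIAO"   # assumed NMR route section

      def write_gjf(path, title, atoms, charge=0, multiplicity=1):
          """atoms: list of (symbol, x, y, z) tuples in angstroms."""
          with open(path, "w") as f:
              f.write(f"{ROUTE}\n\n{title}\n\n{charge} {multiplicity}\n")
              for symbol, x, y, z in atoms:
                  f.write(f"{symbol:2s} {x:12.6f} {y:12.6f} {z:12.6f}\n")
              f.write("\n")                      # Gaussian requires a trailing blank line

      # Toy two-atom "conformer" just to exercise the writer.
      conformers = {"conf01": [("C", 0.0, 0.0, 0.0), ("O", 1.2, 0.0, 0.0)]}
      for name, atoms in conformers.items():
          write_gjf(f"{name}.gjf", f"NMR calculation for {name}", atoms)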

  13. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self- perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  14. Hand held control unit for controlling a display screen-oriented computer game, and a display screen-oriented computer game having one or more such control units

    NARCIS (Netherlands)

    2001-01-01

    A hand-held control unit is used to control a display screen-oriented computer game. The unit comprises a housing with a front side, a set of control members lying generally flush with the front side, the actuation of which controls actions of in-game display items, and an output for

  15. Comparison of Tissue Density in Hounsfield Units in Computed Tomography and Cone Beam Computed Tomography.

    Science.gov (United States)

    Varshowsaz, Masoud; Goorang, Sepideh; Ehsani, Sara; Azizi, Zeynab; Rahimian, Sepideh

    2016-03-01

    Bone quality and quantity assessment is one of the most important steps in implant treatment planning. Different methods, such as computed tomography (CT) and the recently suggested cone beam computed tomography (CBCT), with lower radiation dose, time and cost, are used for bone density assessment. This in vitro study aimed to compare the tissue density values in Hounsfield units (HUs) in CBCT and CT scans of different tissue phantoms with two different thicknesses, two different image acquisition settings, and at three locations in the phantoms. Four different tissue phantoms, namely hard tissue, soft tissue, air and water, were scanned by three different CBCT systems and one CT system at two thicknesses (full and half) and two image acquisition settings (high and low kVp and mA). The images were analyzed at three sites (middle, periphery and intermediate) using eFilm software. The difference in density values was analyzed by ANOVA and a correction coefficient test (P<0.05). There was a significant difference between density values in CBCT and CT scans in most situations, and CBCT values were not similar to CT values in any of the phantoms at the different thicknesses, acquisition parameters or the three sites. The correction coefficients confirmed the results. CBCT is not reliable for tissue density assessment. The results were not affected by changes in thickness, acquisition parameters or location.

  16. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Purpose: A computational system was developed in the C++ programming language to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, for performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system regarding each seed's location and depth, and an entry file is automatically created with all the cards (instructions) for each seed, with their cell blocks and surfaces spread out spatially in the 3D environment. The system reproduces the clinical scenario with precision in the MCNPX code's simulation environment, thereby allowing in-depth study of the technique. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds inserted in the phantom's MAX06 prostate region, with the initial activity of the seeds set to 0.27 mCi. Isodose curves were obtained in all the prostate slices in 5 mm steps over the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cut-offs at 4 keV and forced collisions in cells of interest. The isodose curves obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of correct source positioning, which the computational system developed provides, so as not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume, at the same time that it demonstrates a large
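
    The card-generation idea can be illustrated compactly (the authors' system is in C++; this Python fragment is only a sketch). MCNP allows a cell to be cloned with the "like m but trcl=(x y z)" construct, so each planned seed position can become one cell card referencing a template seed cell; the cell numbers below are arbitrary assumptions.

      def seed_cards(positions_cm, template_cell=100, first_cell=200):
          """Emit one MCNP cell card per seed, cloning a template seed cell."""
          cards = []
          for i, (x, y, z) in enumerate(positions_cm):
              cell = first_cell + i
              cards.append(f"{cell} like {template_cell} but trcl=({x:.3f} {y:.3f} {z:.3f})")
          return "\n".join(cards)

      print(seed_cards([(1.0, 2.0, 7.5), (1.5, 2.0, 7.5), (2.0, 2.5, 8.0)]))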

  17. Noncontrast computed tomographic Hounsfield unit evaluation of cerebral venous thrombosis: a quantitative evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Besachio, David A. [University of Utah, Department of Radiology, Salt Lake City (United States); United States Navy, Bethesda, MD (United States); Quigley, Edward P.; Shah, Lubdha M.; Salzman, Karen L. [University of Utah, Department of Radiology, Salt Lake City (United States)

    2013-08-15

    Our objective is to determine the utility of noncontrast Hounsfield unit values, Hounsfield unit values corrected for the patient's hematocrit, and venoarterial Hounsfield unit difference measurements in the identification of intracranial venous thrombosis on noncontrast head computed tomography. We retrospectively reviewed noncontrast head computed tomography exams performed in both normal patients and those with cerebral venous thrombosis, acquiring Hounsfield unit values in normal and thrombosed cerebral venous structures. Also, we acquired Hounsfield unit values in the internal carotid artery for comparison to thrombosed and nonthrombosed venous structures and compared the venous Hounsfield unit values to the patient's hematocrit. A significant difference is identified between Hounsfield unit values in thrombosed and nonthrombosed venous structures. Applying Hounsfield unit threshold values of greater than 65, a Hounsfield unit to hematocrit ratio of greater than 1.7, and venoarterial difference values greater than 15 alone and in combination, the majority of cases of venous thrombosis are identifiable on noncontrast head computed tomography. Absolute Hounsfield unit values, Hounsfield unit to hematocrit ratios, and venoarterial Hounsfield unit value differences are a useful adjunct in noncontrast head computed tomographic evaluation of cerebral venous thrombosis. (orig.)

  18. Noncontrast computed tomographic Hounsfield unit evaluation of cerebral venous thrombosis: a quantitative evaluation

    International Nuclear Information System (INIS)

    Besachio, David A.; Quigley, Edward P.; Shah, Lubdha M.; Salzman, Karen L.

    2013-01-01

    Our objective is to determine the utility of noncontrast Hounsfield unit values, Hounsfield unit values corrected for the patient's hematocrit, and venoarterial Hounsfield unit difference measurements in the identification of intracranial venous thrombosis on noncontrast head computed tomography. We retrospectively reviewed noncontrast head computed tomography exams performed in both normal patients and those with cerebral venous thrombosis, acquiring Hounsfield unit values in normal and thrombosed cerebral venous structures. Also, we acquired Hounsfield unit values in the internal carotid artery for comparison to thrombosed and nonthrombosed venous structures and compared the venous Hounsfield unit values to the patient's hematocrit. A significant difference is identified between Hounsfield unit values in thrombosed and nonthrombosed venous structures. Applying Hounsfield unit threshold values of greater than 65, a Hounsfield unit to hematocrit ratio of greater than 1.7, and venoarterial difference values greater than 15 alone and in combination, the majority of cases of venous thrombosis are identifiable on noncontrast head computed tomography. Absolute Hounsfield unit values, Hounsfield unit to hematocrit ratios, and venoarterial Hounsfield unit value differences are a useful adjunct in noncontrast head computed tomographic evaluation of cerebral venous thrombosis. (orig.)
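
    The decision rule reported in this abstract is simple enough to state as code. The thresholds below come straight from the text (HU > 65, HU/hematocrit > 1.7, venoarterial difference > 15); this is a reading aid, not clinical software.

      def suspicious_for_cvt(venous_hu, hematocrit_pct, arterial_hu):
          """Flag possible cerebral venous thrombosis when any criterion is met."""
          return (venous_hu > 65
                  or venous_hu / hematocrit_pct > 1.7
                  or venous_hu - arterial_hu > 15)

      print(suspicious_for_cvt(venous_hu=72, hematocrit_pct=40, arterial_hu=50))  # True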

  19. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission ("Commission") is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  20. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of ‘copyright industries’ that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party’s legislative input. The essay will then examine the central issues raised by illegal file sharing, in particular the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party’s proposal is a viable solution to the music industry’s problems.

  1. Computer Backgrounds of Soldiers in Army Units: FY00

    National Research Council Canada - National Science Library

    Fober, Gene

    2001-01-01

    .... Soldiers from four Army installations were given a survey that examined their experiences with computers, self-perceptions of their skill, and an objective test of their ability to identify Windows-based icons...

  2. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study was to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups: Group A - rotary (A1 - Neolix, A2 - OneShape) and Group B - reciprocating (B1 - WaveOne, B2 - Reciproc). Preoperative scans of each were taken, followed by conventional access cavity preparation and working length determination with a size 10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. Slices 90 μm thick were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, measured on four surfaces (facial, lingual, mesial, and distal) for all the groups in the two scans. No significant difference was found between the rotary and reciprocating single-file systems in their effect on PCD, but in Group B2 there was the most significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc file system removes more PCD compared with the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  3. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial for the software Espectro de Raios-X (ERX). This software reads and performs numerical and graphical analysis of text files containing diagnostic X-ray spectra for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. The ERX allows the user to select among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of the ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and the spectrum. In addition, the ERX allows the user to make comparative analyses between the PDF graphics of the two catalogs of spectra available, and it can perform dosimetric evaluations with the selected spectrum. A software tool of this kind is important for researchers in numerical dosimetry because of the diversity of diagnostic X-ray machines, which implies highly diverse input data; the ERX gives the group independence from the origin of the data contained in the catalogs it creates. (author)
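
    A minimal sketch of the PDF/CDF construction ERX performs, assuming a plain two-column text file of energy/intensity pairs (the real mspectra.dat layout may differ):

      import numpy as np

      def spectrum_pdf_cdf(filename):
          energy, counts = np.loadtxt(filename, unpack=True)
          pdf = counts / counts.sum()      # normalize intensities to a PDF
          cdf = np.cumsum(pdf)             # running sum gives the CDF
          return energy, pdf, cdf

      # Sampling photon energies from the CDF, as a source routine in an ECM would:
      #   idx = np.searchsorted(cdf, rng.random()); photon_energy = energy[idx]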

  4. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers rely mainly on computationally intensive cryptographic algorithms and, in particular, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model, called the directed tree with linked leaf model, is proposed for further implementation in data structures and algorithms. An extensive analysis is given to justify the soundness and completeness of F2AC.
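
    The abstract does not spell out the directed tree with linked leaf model, but the flavor of iterative authorization with subtree revocation can be sketched as follows; this is one reading of the idea, not the paper's actual data structure.

      class Node:
          """A delegation-tree node: privileges flow down, revocation cuts a subtree."""
          def __init__(self, user, privileges):
              self.user, self.privileges = user, set(privileges)
              self.children, self.revoked = [], False

          def delegate(self, user, privileges):
              # A delegate can only receive privileges the delegator already holds.
              child = Node(user, self.privileges & set(privileges))
              self.children.append(child)
              return child

          def can(self, user, privilege):
              if self.revoked:
                  return False
              if self.user == user and privilege in self.privileges:
                  return True
              return any(c.can(user, privilege) for c in self.children)

      owner = Node("alice", {"read", "write"})
      bob = owner.delegate("bob", {"read"})
      print(owner.can("bob", "read"), owner.can("bob", "write"))   # True False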

  5. A 1.5 GFLOPS Reciprocal Unit for Computer Graphics

    DEFF Research Database (Denmark)

    Nannarelli, Alberto; Rasmussen, Morten Sleth; Stuart, Matthias Bo

    2006-01-01

    The reciprocal operation 1/d is a frequent operation performed in graphics processors (GPUs). In this work, we present the design of a radix-16 reciprocal unit based on an algorithm combining the traditional digit-by-digit algorithm and the approximation of the reciprocal by one Newton-Raphson iteration.
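
    For readers unfamiliar with the refinement step, the Newton-Raphson iteration for 1/d is x <- x(2 - dx), which roughly doubles the number of correct bits per step. The software sketch below seeds the iteration with the classic linear approximation on [0.5, 1); a hardware unit such as the one described would instead seed from a small table.

      def reciprocal(d, iterations=4):
          """Approximate 1/d in software; hardware would seed from a lookup table."""
          assert d > 0
          shift = 0
          # Normalize d into [0.5, 1) so the linear seed below applies.
          while d >= 1.0:
              d /= 2.0
              shift += 1
          while d < 0.5:
              d *= 2.0
              shift -= 1
          x = 48.0 / 17.0 - 32.0 / 17.0 * d    # classic initial approximation on [0.5, 1)
          for _ in range(iterations):
              x = x * (2.0 - d * x)            # Newton-Raphson step for f(x) = 1/x - d
          return x / (2.0 ** shift)

      print(reciprocal(3.0), 1 / 3.0)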

  6. Status of computational fluid dynamics in the United States

    International Nuclear Information System (INIS)

    Kutler, P.; Steger, J.L.; Bailey, F.R.

    1987-01-01

    CFD-related progress in U.S. aerospace industries and research institutions is evaluated with respect to methods employed, their applications, and the computer technologies employed in their implementation. Goals for subsonic CFD are primarily aimed at greater fuel efficiency; those of supersonic CFD involve the achievement of high sustained cruise efficiency. Transatmospheric/hypersonic vehicles are noted to have recently become important concerns for CFD efforts. Attention is given to aspects of discretization, Euler and Navier-Stokes general purpose codes, zonal equation methods, internal and external flows, and the impact of supercomputers and their networks in advancing the state-of-the-art. 91 references

  7. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-Adjusting File (SAF) and ProTaper Universal (PTU), and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-Adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area and structure model index, as well as transportation, uninstrumented area and the correlation between working time and curvature, were analyzed. Although no statistically significant difference was observed between the two systems in the distal canals (p>0.05), a significantly greater removed dentin volume and a lower uninstrumented area were provided by ProTaper in the mesial canals (p<0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF: r² = 0.792, p<0.0004; PTU: r² = 0.9098, p<0.0001).

  8. Computer Use and Vision-Related Problems Among University Students in Ajman, United Arab Emirates

    OpenAIRE

    Shantakumari, N; Eldeeb, R; Sreedharan, J; Gopal, K

    2014-01-01

    Background: The extensive use of computers as medium of teaching and learning in universities necessitates introspection into the extent of computer related health disorders among student population. Aim: This study was undertaken to assess the pattern of computer usage and related visual problems, among University students in Ajman, United Arab Emirates. Materials and Methods: A total of 500 Students studying in Gulf Medical University, Ajman and Ajman University of Science and Technology we...

  9. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  10. Units for on-line control with the ES computer in physical investigations

    International Nuclear Information System (INIS)

    Efimov, L.G.

    1983-01-01

    The peripheral part of a complex of facilities created to organize on-line operation of an ES computer with experimental devices is described; it comprises two units. The first unit is employed as part of a universal driver of the CAMAC branch for connection with the microprogram ES computer channel controller and provides multioperational device software service (up to 44 record varieties). Bilateral data exchange between the device and the computer can be performed by bytes as well as by 16- or 24-bit words, using CAMAC group modes, with a maximum rate of 1.25 Mbyte/s. The second unit is meant for synchronizing the data acquisition process with the device starting system and for supporting the device operator's dialogue with the computer.

  11. Request queues for interactive clients in a shared file system of a parallel computing system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin

    2015-08-18

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
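
    A toy sketch of the prioritization step described above, assuming the simplest possible policy (interactive requests always drain before batch ones); the patent itself leaves the policy and the virtual machine monitor's resource allocation open.

      import heapq, itertools

      INTERACTIVE, BATCH = 0, 1        # lower number = higher priority
      counter = itertools.count()      # tie-breaker preserves FIFO order within a class
      metadata_queue = []

      def enqueue(request, source):
          heapq.heappush(metadata_queue, (source, next(counter), request))

      enqueue("stat /home/user/file", INTERACTIVE)
      enqueue("create /scratch/job42/out", BATCH)
      enqueue("open /home/user/file", INTERACTIVE)

      while metadata_queue:
          _, _, req = heapq.heappop(metadata_queue)
          print(req)                   # interactive requests drain first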

  12. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc. or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program provides the researcher, manager, or librarian with a means of maintaining a list and index to documents in a small library, reprint collection, or office file.
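
    The core of KWIC generation is small enough to sketch: every non-stopword in a title produces an index entry with the title rotated so that the keyword leads. The stopword list and output layout below are illustrative assumptions, not the program's actual behavior.

      STOPWORDS = {"a", "an", "and", "for", "of", "the", "to", "in", "on"}

      def kwic(titles):
          entries = []
          for title in titles:
              words = title.split()
              for i, word in enumerate(words):
                  if word.lower() not in STOPWORDS:
                      # Rotate the title so the keyword comes first; "/" marks the wrap.
                      rotated = " ".join(words[i:] + ["/"] + words[:i])
                      entries.append((word.lower(), rotated, title))
          return sorted(entries)

      for key, rotated, _ in kwic(["Computer program for creating keyword indexes"]):
          print(f"{key:10s} {rotated}")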

  13. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)]

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore, the verification calculations were focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. The verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  14. Software for a magnetic disk drive unit connected with a TPA-1001-i computer

    International Nuclear Information System (INIS)

    Elizarov, O.I.; Mateeva, A.; Salamatin, I.M.

    1977-01-01

    The disk drive unit, with a capacity of 1250K and a minimum addressable part of memory of 1 sector (128 12-bit words), is connected to a TPA-1001-i computer. The operating regimes of the controller, the functions and formats of the commands used, and the software are described. Data transfer between the computer and the magnetic disk drive unit is realized by means of programs relocatable in binary form. These are inserted in a standard program library with a modular structure. The manner of control handling and data transfer between programs stored in the library on the magnetic disk drive is described. The resident program (100 (octal) words) inserted in the monitor takes into account the special features of the disk drive unit being used. The algorithms of the correction programs for the disk drive unit, of the program for rewriting the library from paper tape to the disk drive unit, and of the program for writing and reading the monitor are described.

  15. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than both the RR and MR groups at the cervical and apical levels (P < .05). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  16. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
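
    As an illustration of how such a binary format might be read (the field names, types and offsets below are hypothetical; the actual PCF layout must be taken from the GADRAS documentation cited above):

      import struct

      # Assumed record header: title, live time, real time, channel count.
      RECORD_HEADER = struct.Struct("<64s f f i")

      def read_record(f):
          """Parse one hypothetical spectrum record from an open binary file."""
          raw = f.read(RECORD_HEADER.size)
          if len(raw) < RECORD_HEADER.size:
              return None                                   # end of file
          title, live_t, real_t, n = RECORD_HEADER.unpack(raw)
          counts = struct.unpack(f"<{n}f", f.read(4 * n))   # spectrum as 32-bit floats (assumed)
          return title.rstrip(b"\0").decode(), live_t, real_t, counts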

  17. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be most challenging due to their complex anatomy. "Wave one" and "one shape" are single-file systems with reciprocating and rotary motion respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of wave one and one shape files in primary root canals using a cone beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation times were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave one showed less canal transportation as compared with one shape, and the mean instrumentation time of wave one was significantly less than one shape. Reciprocating single-file systems was found to be faster with much less procedural errors and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08: During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successes prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. SHIVGAMI : Simplifying tHe titanIc blastx process using aVailable GAthering of coMputational unIts

    Directory of Open Access Journals (Sweden)

    Naman Mangukia

    2017-10-01

    Full Text Available Assembling novel genomes from scratch is a never-ending process unless and until Homo sapiens covers all living organisms! On top of that, this de novo approach is employed by RNA-Seq and metagenomics analyses. Functional identification of the scaffolds or transcripts from such draft assemblies is a substantial step that routinely employs the well-known BlastX program, which facilitates a user searching a DNA query against the NCBI non-redundant protein database (NR, ~120 GB). In spite of having a multicore-processing option, BlastX is a drawn-out process for bulk, lengthy query inputs. Tremendous efforts are constantly being applied to solve this problem by increasing computational power through GPU-based computing, cloud computing, and Hadoop-based approaches, which ultimately require gigantic costs in terms of money and processing. To address this issue, here we have come up with SHIVGAMI, which automates the entire process using Perl and shell scripts that divide, distribute, and process the input FASTA sequences among the individual computational units according to their CPU-core availability. A Linux operating system and installations of the NR database and the BlastX program are prerequisites for each system.  The beauty of this stand-alone automation program SHIVGAMI is that it requires the LAN connection exactly twice: during 'query distribution' and at the time of 'process completion'. In the initial phase, it divides the FASTA sequences according to the individual computer's core capability. It then distributes the data, along with small automation scripts, which run the BlastX process on the respective computational unit and send the result files back to the master computer. The master computer finally combines and compiles the files into a single result. This simple automation converts a computer lab into a grid without investment in any software, hardware, or man-power. In short, SHIVGAMI is a time and cost savior for all users, from commercial firms…
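    The heart of such a workflow is the proportional division of a FASTA file across heterogeneous workers. The sketch below is not SHIVGAMI's actual Perl/shell code; it is a minimal Python illustration, assuming each worker is known by a hostname and a core count, of how query sequences could be apportioned into per-host files before being dispatched over the LAN.

        def read_fasta(path):
            """Yield (header, sequence) records from a FASTA file."""
            header, seq = None, []
            with open(path) as fh:
                for line in fh:
                    line = line.rstrip()
                    if line.startswith(">"):
                        if header is not None:
                            yield header, "".join(seq)
                        header, seq = line, []
                    else:
                        seq.append(line)
                if header is not None:
                    yield header, "".join(seq)

        def split_by_cores(records, workers):
            """Apportion records to workers proportionally to their core counts."""
            records = list(records)
            total = sum(cores for _, cores in workers)
            chunks, start = {}, 0
            for i, (host, cores) in enumerate(workers):
                # The last worker takes the remainder to avoid rounding loss.
                n = (len(records) - start if i == len(workers) - 1
                     else round(len(records) * cores / total))
                chunks[host] = records[start:start + n]
                start += n
            return chunks

        # Hypothetical lab of three Linux machines with differing core counts.
        workers = [("node1", 8), ("node2", 4), ("node3", 16)]
        for host, recs in split_by_cores(read_fasta("queries.fasta"), workers).items():
            with open(host + ".fasta", "w") as out:
                for header, seq in recs:
                    out.write(header + "\n" + seq + "\n")
        # Each per-host file would then be shipped to its node and run through
        # BLAST+, e.g.: blastx -query node1.fasta -db nr -out node1.out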

  20. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…

  1. Whole Language, Computers and CD-ROM Technology: A Kindergarten Unit on "Benjamin Bunny."

    Science.gov (United States)

    Balajthy, Ernest

    A kindergarten teacher, two preservice teachers, and a college consultant on educational computer technology designed and developed a 10-day whole-language integrated unit on the theme of Beatrix Potter's "Benjamin Bunny." The project was designed as a demonstration of the potential of integrating the CD-ROM-based version of…

  2. Using Videos and 3D Animations for Conceptual Learning in Basic Computer Units

    Science.gov (United States)

    Cakiroglu, Unal; Yilmaz, Huseyin

    2017-01-01

    This article draws on a one-semester study to investigate the effect of videos and 3D animations on students' conceptual understandings about basic computer units. A quasi-experimental design was carried out in two classrooms; videos and 3D animations were used in classroom activities in one group and those were used for homework in the other…

  3. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  4. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Full Text Available Use of rotary nickel-titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed by NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in all sections. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.

  5. Computer Drawing Method for Operating Characteristic Curve of PV Power Plant Array Unit

    Science.gov (United States)

    Tan, Jianbin

    2018-02-01

    According to the engineering design of large-scale grid-connected photovoltaic power stations and the research and development of many simulation and analysis systems, it is necessary to draw by computer the operating characteristic curves of photovoltaic array units, for which a good piecewise non-linear interpolation algorithm is proposed. In the calculation method, module performance parameters are taken as the main design basis, from which the computer obtains five characteristic performance points of the PV module. Combined with the series and parallel connection of the PV array, computer drawing of the performance curve of the PV array unit can then be realized. The computed data can also be passed to modules of PV development software, improving the operation of PV array units in practical applications.
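    As an illustration of the kind of computation described, the sketch below implements a widely used simplified engineering model (not necessarily the paper's own algorithm) that reconstructs a module's I-V curve from datasheet parameters and scales it by the series/parallel layout of the array; all parameter values are invented.

        import numpy as np

        def module_iv(v, isc, voc, imp, vmp):
            """Simplified engineering I-V model of a PV module."""
            c2 = (vmp / voc - 1.0) / np.log(1.0 - imp / isc)
            c1 = (1.0 - imp / isc) * np.exp(-vmp / (c2 * voc))
            return isc * (1.0 - c1 * (np.exp(v / (c2 * voc)) - 1.0))

        # Hypothetical datasheet values for one module.
        isc, voc, imp, vmp = 8.9, 37.8, 8.3, 30.2
        n_series, n_parallel = 20, 10          # array layout
        v = np.linspace(0.0, voc, 200)         # per-module voltage sweep
        i = module_iv(v, isc, voc, imp, vmp)
        v_array, i_array = n_series * v, n_parallel * i
        # The (v_array, i_array) points trace the operating characteristic
        # curve of the PV array unit and can be drawn directly.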

  6. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  7. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    Energy Technology Data Exchange (ETDEWEB)

    Arbanas, G.; Dunn, M.E.; Wiarda, D., E-mail: arbanasg@ornl.gov, E-mail: dunnme@ornl.gov, E-mail: wiardada@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2011-07-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days, took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)
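    The reported speed-up comes from routing the dominant matrix product through a vendor BLAS. As a rough modern illustration (not SAMMY's own code), the Python/CuPy sketch below offloads a product of the quoted dimensions to a cuBLAS-backed GEMM on the GPU.

        import cupy as cp  # CUDA arrays; matmul dispatches to cuBLAS

        # Dimensions quoted in the abstract; shrink them on GPUs with little
        # memory (these double-precision arrays occupy roughly 5 GB in total).
        a = cp.random.rand(16_000, 20_000)
        c = a @ a.T                          # double-precision GEMM on the GPU
        cp.cuda.Stream.null.synchronize()    # block until the kernel finishes
        print(float(cp.linalg.norm(c)))      # pull a scalar back to the host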

  8. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    International Nuclear Information System (INIS)

    Arbanas, G.; Dunn, M.E.; Wiarda, D.

    2011-01-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days, took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)

  9. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  10. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  11. 2015 Cartographic Boundary File, 5-Digit ZIP Code Tabulation Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  12. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  13. 2014 Cartographic Boundary File, New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  14. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  15. 2016 Cartographic Boundary File, Current New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  16. 2016 Cartographic Boundary File, 115th Congressional Districts within Current County and Equivalent for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  17. 2014 Cartographic Boundary File, State-Congressional District-County for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  18. 2014 Cartographic Boundary File, State-Congressional District-County for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  19. 2015 Cartographic Boundary File, State-Congressional District-County for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  20. 2016 Cartographic Boundary File, 115th Congressional Districts within Current County and Equivalent for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  1. 2015 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  2. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  3. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  4. 2016 Cartographic Boundary File, Current New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  5. 2016 Cartographic Boundary File, Current Combined New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  6. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  7. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:5,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  8. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  9. 2015 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  10. 2014 Cartographic Boundary File, 5-Digit ZIP Code Tabulation Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  11. 2014 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  12. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  13. 2015 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  14. 2015 Cartographic Boundary File, Metropolitan Statistical Area/Micropolitan Statistical Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  15. 2015 Cartographic Boundary File, New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  16. 2016 Cartographic Boundary File, Current Metropolitan/Micropolitan Statistical Area for United States, 1:20,000,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  17. 2014 Cartographic Boundary File, 5-Digit ZIP Code Tabulation Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  18. 2014 Cartographic Boundary File, New England City and Town Area for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  19. Simplified techniques of cerebral angiography using a mobile X-ray unit and computed radiography

    International Nuclear Information System (INIS)

    Gondo, Gakuji; Ishiwata, Yusuke; Yamashita, Toshinori; Iida, Takashi; Moro, Yutaka

    1989-01-01

    Simplified techniques of cerebral angiography using a mobile X-ray unit and computed radiography (CR) are discussed. Computed radiography is a digital radiography system in which an imaging plate is used as an X-ray detector and a final image is displayed on the film. In the angiograms performed with CR, the spatial frequency components can be enhanced for the easy analysis of fine blood vessels. Computed radiography has an automatic sensitivity and a latitude-setting mechanism, thus serving as an 'automatic camera.' This mechanism is useful for radiography with a mobile X-ray unit in hospital wards, intensive care units, or operating rooms where the appropriate setting of exposure conditions is difficult. We applied this mechanism to direct percutaneous carotid angiography and intravenous digital subtraction angiography with a mobile X-ray unit. Direct percutaneous carotid angiograms using CR and a mobile X-ray unit were taken after the manual injection of a small amount of a contrast material through a fine needle. We performed direct percutaneous carotid angiography with this method 68 times on 25 cases from August 1986 to December 1987. Of the 68 angiograms, 61 were evaluated as good, compared with conventional angiography. Though the remaining seven were evaluated as poor, they were still diagnostically effective. This method is found useful for carotid angiography in emergency rooms, intensive care units, or operating rooms. Cerebral venography using CR and a mobile X-ray unit was done after the manual injection of a contrast material through the bilateral cubital veins. The cerebral venous system could be visualized from 16 to 24 seconds after the beginning of the injection of the contrast material. We performed cerebral venography with this method 14 times on six cases. These venograms were better than conventional angiograms in all cases. This method may be useful in managing patients suffering from cerebral venous thrombosis. (J.P.N.)

  20. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. Aim The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in mandibular first molars using cone beam computed tomography. Materials and Methods Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. Results The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p<0.05). Comparing canal transportation for each file system over a distance of 7 mm (starting from 0 mm and then evaluating at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029 respectively). The mean value of cervical dentin removal was maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p<0.001, <0.001, <0.001). Conclusion It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness. PMID:26023639

  1. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  2. Computation studies into architecture and energy transfer properties of photosynthetic units from filamentous anoxygenic phototrophs

    Energy Technology Data Exchange (ETDEWEB)

    Linnanto, Juha Matti [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu (Estonia); Freiberg, Arvi [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu, Estonia and Institute of Molecular and Cell Biology, University of Tartu, Riia 23, 51010 Tartu (Estonia)

    2014-10-06

    We have used different computational methods to study the structural architecture and the light-harvesting and energy transfer properties of the photosynthetic unit of filamentous anoxygenic phototrophs. Due to the huge number of atoms in the photosynthetic unit, a combination of atomistic and coarse-grained methods was used for electronic structure calculations. The calculations reveal that the light energy absorbed by the peripheral chlorosome antenna complex transfers efficiently via the baseplate and the core B808–866 antenna complexes to the reaction center complex, in general agreement with the present understanding of this complex system.

  3. Development of DUST: A computer code that calculates release rates from a LLW disposal unit

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1992-01-01

    Performance assessment of a Low-Level Waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the disposal unit source term). The major physical processes that influence the source term are water flow, container degradation, waste form leaching, and radionuclide transport. A computer code, DUST (Disposal Unit Source Term), has been developed which incorporates these processes in a unified manner. The DUST code improves upon existing codes as it has the capability to model multiple container failure times, multiple waste form release properties, and radionuclide-specific transport properties. Verification studies performed on the code are discussed.
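    As a toy illustration of a disposal-unit source term of the kind DUST models (not DUST's actual algorithms), the sketch below combines a container-failure time with first-order leaching and radioactive decay for a single nuclide; every parameter value is invented.

        import numpy as np

        def release_rate(t, inventory0, decay_const, leach_rate, t_fail):
            """Activity released per year at time t (years): nothing escapes
            before container failure; afterwards the decayed remaining
            inventory leaches at a first-order rate."""
            t = np.asarray(t, dtype=float)
            leached = leach_rate * np.clip(t - t_fail, 0.0, None)
            remaining = inventory0 * np.exp(-decay_const * t - leached)
            return np.where(t < t_fail, 0.0, leach_rate * remaining)

        years = np.linspace(0.0, 500.0, 501)
        r = release_rate(years,
                         inventory0=100.0,              # initial inventory, Ci
                         decay_const=np.log(2) / 30.0,  # ~30-year half-life
                         leach_rate=0.01,               # fraction leached per year
                         t_fail=50.0)                   # container breach at 50 yr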

  4. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Science.gov (United States)

    2012-05-24

    ...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency... entitled: ``Use of Computer Simulation of the United States Blood Supply in Support of Planning for... and panel discussions with experts from academia, regulated industry, government, and other...

  5. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  6. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  7. 24 CFR 290.21 - Computing annual number of units eligible for substitution of tenant-based assistance or...

    Science.gov (United States)

    2010-04-01

    Title 24, Housing and Urban Development, Multifamily Projects, § 290.21: Computing annual number of units eligible for substitution of tenant-based assistance or alternative uses.

  8. Initial quantitative evaluation of computed radiography in an intensive care unit

    International Nuclear Information System (INIS)

    Hillis, D.J.; McDonald, I.G.; Kelly, W.J.

    1996-01-01

    The first computed radiography (CR) unit in Australia was installed at St Vincent's Hospital, Melbourne, in February 1994. An initial qualitative evaluation of the attitude of the intensive care unit (ICU) physicians to the CR unit was conducted by use of a survey. The results of the survey of ICU physicians indicated that images were available faster than under the previous system and that the use of the CR system was preferred to evaluate chest tubes and line placements. While it is recognized that a further detailed radiological evaluation of the CR system is required to establish the diagnostic performance of CR compared with conventional film, some comments on the implementation of the system and ICU physician attitudes to the CR system are put forward for consideration by other hospitals examining the possible use of CR systems. 11 refs., 1 tab

  9. All-optical quantum computing with a hybrid solid-state processing unit

    International Nuclear Information System (INIS)

    Pei Pei; Zhang Fengyang; Li Chong; Song Heshan

    2011-01-01

    We develop an architecture for a hybrid quantum solid-state processing unit for universal quantum computing. The architecture allows distant and nonidentical solid-state qubits in distinct physical systems to interact and work collaboratively. All the quantum computing procedures are controlled by optical methods using classical fields and cavity QED. Our methods have the prominent advantage of insensitivity to dissipation processes, benefiting from the virtual excitation of subsystems. Moreover, quantum nondemolition measurements and state transfer for the solid-state qubits are proposed. The architecture opens promising perspectives for implementing scalable quantum computation, in the broader sense that different solid-state systems can be merged and integrated into one quantum processor.

  10. Unit cell-based computer-aided manufacturing system for tissue engineering

    International Nuclear Information System (INIS)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-01-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering. (paper)
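    The unit-cell idea lends itself to a short sketch (not the authors' CAM system): define one cubic pore cell by its strut segments, then tile it over a lattice to obtain the scaffold's full line geometry, from which deposition tool paths could be derived. All dimensions are arbitrary.

        import numpy as np

        def unit_cell_struts(size):
            """Return the 12 edges of one cubic pore cell as (start, end) pairs."""
            corners = np.array([[x, y, z] for x in (0, size)
                                for y in (0, size) for z in (0, size)], float)
            pairs = [(i, j) for i in range(8) for j in range(i + 1, 8)
                     if np.count_nonzero(corners[i] != corners[j]) == 1]
            return [(corners[i], corners[j]) for i, j in pairs]

        def tile_scaffold(cell_size, nx, ny, nz):
            """Tile the unit cell over an nx x ny x nz lattice (shared edges
            are duplicated here; a real system would deduplicate them)."""
            base = unit_cell_struts(cell_size)
            struts = []
            for ix in range(nx):
                for iy in range(ny):
                    for iz in range(nz):
                        offset = np.array([ix, iy, iz], float) * cell_size
                        struts += [(p + offset, q + offset) for p, q in base]
            return struts

        # 200-micrometer pores tiled into a 10 x 10 x 5 block (units: mm).
        scaffold = tile_scaffold(0.2, 10, 10, 5)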

  11. Unit cell-based computer-aided manufacturing system for tissue engineering.

    Science.gov (United States)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.

  12. Computing the Density Matrix in Electronic Structure Theory on Graphics Processing Units.

    Science.gov (United States)

    Cawkwell, M J; Sanville, E J; Mniszewski, S M; Niklasson, Anders M N

    2012-11-13

    The self-consistent solution of a Schrödinger-like equation for the density matrix is a critical and computationally demanding step in quantum-based models of interatomic bonding. This step was tackled historically via the diagonalization of the Hamiltonian. We have investigated the performance and accuracy of the second-order spectral projection (SP2) algorithm for the computation of the density matrix via a recursive expansion of the Fermi operator in a series of generalized matrix-matrix multiplications. We demonstrate that owing to its simplicity, the SP2 algorithm [Niklasson, A. M. N. Phys. Rev. B 2002, 66, 155115] is exceptionally well suited to implementation on graphics processing units (GPUs). In double and single precision arithmetic, hybrid GPU/central processing unit (CPU) and full GPU implementations of the SP2 algorithm outperform both a CPU-only implementation of the SP2 algorithm and traditional matrix diagonalization when the dimensions of the matrices exceed about 2000 × 2000. Padding schemes for arrays allocated in the GPU memory that optimize the performance of the CUBLAS implementations of the level 3 BLAS DGEMM and SGEMM subroutines for generalized matrix-matrix multiplications are described in detail. The analysis of the relative performance of the hybrid CPU/GPU and full GPU implementations indicates that the transfer of arrays between the GPU and CPU constitutes only a small fraction of the total computation time. The errors measured in the self-consistent density matrices computed using the SP2 algorithm are generally smaller than those measured in matrices computed via diagonalization. Furthermore, the errors in the density matrices computed using the SP2 algorithm do not exhibit any dependence on system size, whereas the errors increase linearly with the number of orbitals when diagonalization is employed.
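    A minimal NumPy rendering of the SP2 recursion discussed here may help; the production code described in the paper pushes the same generalized matrix products through GPU GEMM calls (CUBLAS DGEMM/SGEMM). The Hamiltonian's spectrum is first mapped into [0, 1]; each iteration then applies X² or 2X - X², steering the trace toward the occupation number.

        import numpy as np

        def sp2_density_matrix(h, n_occ, max_iter=100, tol=1e-10):
            """SP2 purification (after Niklasson 2002): density matrix of a
            symmetric Hamiltonian h with n_occ occupied orbitals."""
            n = h.shape[0]
            # Gershgorin bounds on the spectrum, then map it into [0, 1],
            # reversed so low-energy (occupied) states head toward 1.
            r = np.sum(np.abs(h), axis=1) - np.abs(np.diag(h))
            e_min, e_max = np.min(np.diag(h) - r), np.max(np.diag(h) + r)
            x = (e_max * np.eye(n) - h) / (e_max - e_min)
            for _ in range(max_iter):
                x2 = x @ x                       # the GEMM dominating the cost
                tr, tr2 = np.trace(x), np.trace(x2)
                if abs(tr2 - tr) < tol:          # idempotent: converged
                    break
                # Pick the branch whose trace lands closer to n_occ.
                if abs(tr2 - n_occ) <= abs(2.0 * tr - tr2 - n_occ):
                    x = x2
                else:
                    x = 2.0 * x - x2
            return x

        rng = np.random.default_rng(0)
        a = rng.standard_normal((8, 8))
        p = sp2_density_matrix(0.5 * (a + a.T), n_occ=3)
        assert abs(np.trace(p) - 3) < 1e-6 and np.allclose(p @ p, p, atol=1e-6)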

  13. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) – full-sequence rotary control group, OneShape OS (group II) – single file continuous rotation, WaveOne WO – single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then accessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey’s honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm as well as 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18) while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal

  14. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO - single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then accessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm as well as 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18) while the differences between OS and WO were not statistically significant. It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence

  15. Computer use and vision-related problems among university students in Ajman, United Arab Emirates.

    Science.gov (United States)

    Shantakumari, N; Eldeeb, R; Sreedharan, J; Gopal, K

    2014-03-01

    The extensive use of computers as a medium of teaching and learning in universities necessitates introspection into the extent of computer-related health disorders among the student population. This study was undertaken to assess the pattern of computer usage and related visual problems among university students in Ajman, United Arab Emirates. A total of 500 students studying in Gulf Medical University, Ajman and Ajman University of Science and Technology were recruited into this study. Demographic characteristics, pattern of usage of computers and associated visual symptoms were recorded in a validated self-administered questionnaire. The chi-square test was used to determine the significance of the observed differences between the variables, with the level of statistical significance set at P < 0.05. The most common symptoms among computer users were headache - 53.3% (251/471), burning sensation in the eyes - 54.8% (258/471) and tired eyes - 48% (226/471). Female students were found to be at a higher risk. Nearly 72% of students reported frequent interruption of computer work. Headache caused interruption of work in 43.85% (110/168) of the students while tired eyes caused interruption of work in 43.5% (98/168) of the students. When the screen was viewed at a distance of more than 50 cm, the prevalence of headaches decreased by 38% (50-100 cm - OR: 0.62, 95% confidence interval [CI]: 0.42-0.92). The prevalence of tired eyes increased by 89% when screen filters were not used (OR: 1.894, 95% CI: 1.065-3.368). A high prevalence of vision-related problems was noted among university students. Sustained periods of close screen work without screen filters were found to be associated with the occurrence of the symptoms and increased interruptions of the students' work. There is a need to increase ergonomic awareness among students, and corrective measures need to be implemented to reduce the impact of computer-related vision problems.
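    For readers unfamiliar with the odds ratios quoted above, the sketch below shows the standard computation of an odds ratio and its 95% Wald confidence interval from a 2x2 exposure-outcome table; the counts are invented, not the study's data.

        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """OR and Wald 95% CI from a 2x2 table: a = exposed cases,
            b = exposed non-cases, c = unexposed cases, d = unexposed non-cases."""
            or_ = (a * d) / (b * c)
            se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
            lo = math.exp(math.log(or_) - z * se)
            hi = math.exp(math.log(or_) + z * se)
            return or_, lo, hi

        # Invented counts: tired eyes with vs. without a screen filter.
        print(odds_ratio_ci(120, 80, 60, 100))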

  16. On dosimetry of radiodiagnosis facilities, mainly focused on computed tomography units

    International Nuclear Information System (INIS)

    Ghitulescu, Zoe

    2008-01-01

    The talk addresses the dosimetry of computed tomography units and is structured in three parts: 1) basics of image acquisition using the computed tomography technique; 2) effective dose calculation for a patient and its assessment using the BERT concept; 3) recommended actions for achieving a good compromise between delivered dose and image quality. The aim of the first part is to acquaint the reader with the CT technique so as to follow the worked example of effective dose calculation and its conversion into time units using the BERT concept. The conclusion drawn is that the effective dose, calculated by the medical physicist (using dedicated software for the CT scanner and the exam type) and converted into time units through the BERT concept, can then be communicated by the radiologist together with the diagnostic notes. A minimum of patient information regarding the nature and type of the radiation is thus obviously necessary, for instance by means of leaflets. The third part discusses the factors that lead to good image quality while respecting the ALARA principle of radiation protection, which states that the dose should be 'as low as reasonably achievable'. (author)
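    The BERT (Background Equivalent Radiation Time) conversion mentioned here is simple arithmetic: an effective dose is restated as the time over which natural background radiation delivers the same dose. A sketch, assuming a background of about 2.4 mSv per year (a commonly cited worldwide average; the talk itself quotes no figure):

        def bert_months(effective_dose_msv, background_msv_per_year=2.4):
            """Express an effective dose as months of natural background."""
            return 12.0 * effective_dose_msv / background_msv_per_year

        # A hypothetical CT examination delivering 8 mSv:
        print(round(bert_months(8.0), 1), "months of background")  # -> 40.0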

  17. Computation of a Canadian SCWR unit cell with deterministic and Monte Carlo codes

    International Nuclear Information System (INIS)

    Harrisson, G.; Marleau, G.

    2012-01-01

    The Canadian SCWR has the potential to achieve the goals that generation IV nuclear reactors must meet. As part of the optimization process for this design concept, lattice cell calculations are routinely performed using deterministic codes. In this study, the first step (self-shielding treatment) of the computation scheme developed with the deterministic code DRAGON for the Canadian SCWR has been validated. Some options available in the module responsible for the resonance self-shielding calculation in DRAGON 3.06, together with different microscopic cross-section libraries based on the ENDF/B-VII.0 evaluated nuclear data file, have been tested and compared to a reference calculation performed with the Monte Carlo code SERPENT under the same conditions. Compared to SERPENT, DRAGON underestimates the infinite multiplication factor in all cases. In general, the original Stammler model with the Livolant-Jeanpierre approximations is the most appropriate self-shielding option to use in this study. In addition, the 89-group WIMS-AECL library for slightly enriched uranium and the 172-group WLUP library for a mixture of plutonium and thorium give the most consistent results with those of SERPENT. (authors)

  18. Computer-aided modeling of aluminophosphate zeolites as packings of building units

    KAUST Repository

    Peskov, Maxim

    2012-03-22

    New building schemes of aluminophosphate molecular sieves from packing units (PUs) are proposed. We have investigated 61 framework types discovered in zeolite-like aluminophosphates and have identified important PU combinations using a recently implemented computational algorithm of the TOPOS package. All PUs whose packing completely determines the overall topology of the aluminophosphate framework were described and catalogued. We have enumerated 235 building models for the aluminophosphates belonging to 61 zeolite framework types, from ring- or cage-like PU clusters. It is indicated that PUs can be considered as precursor species in the zeolite synthesis processes. © 2012 American Chemical Society.

  19. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  20. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  1. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  2. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the use of SAF in primary teeth. The aim of this study was to evaluate and compare the use of the SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amount of dentin removed, using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for the K file, .04/30 for Profile, and 2 mm thickness, size 25 for SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. The amount of dentin removed was measured on the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data were statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal when compared to rotary instrumentation (Profile and SAF), while the SAF system removed significantly less dentin than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (p<0.05). Within the experimental conditions of the present study, the SAF seems to be a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less

  3. Feasibility Study and Cost Benefit Analysis of Thin-Client Computer System Implementation Onboard United States Navy Ships

    National Research Council Canada - National Science Library

    Arbulu, Timothy D; Vosberg, Brian J

    2007-01-01

    The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...

  4. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were respectively instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU) to #25/0.08 following the recommended protocol. Time for root canal instrumentation (accumulation of time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25× magnification. The presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional shape images of canals were reconstructed, calculated and evaluated. The amount of canal central transportation in the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). Canal transportation was likewise significantly smaller in the TF group than in the PU group [(0.097 ± 0.084) mm for PU, P<0.05]. No instrument separation was observed in either group. Cracks were not found in either group, whether on micro-CT images or under an optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability and less canal transportation.

  5. Chemical Equilibrium, Unit 2: Le Chatelier's Principle. A Computer-Enriched Module for Introductory Chemistry. Student's Guide and Teacher's Guide.

    Science.gov (United States)

    Jameson, A. Keith

    Presented are the teacher's guide and student materials for one of a series of self-instructional, computer-based learning modules for an introductory, undergraduate chemistry course. The student manual for this unit on Le Chatelier's principle includes objectives, prerequisites, pretest, instructions for executing the computer program, and…

  6. Computer interfacing of the unified systems for personnel supervising in nuclear units

    International Nuclear Information System (INIS)

    Staicu, M.

    1997-01-01

    The dosimetric supervision of personnel working in nuclear units is based on the information supplied by: 1) the dosimetric data obtained by the method of thermoluminescence; 2) the dosimetric data obtained by the method of photo dosimetry; 3) the records from periodic medical control. To create a unified supervision system, the following elements were combined: a) an Automatic System of TLD Reading and Data Processing (SACDTL). The data from this system are transmitted 'on line' to the computer; b) the measuring line for the optical density of exposed dosimetric films. The interface achieved within the general SACDTL ensemble could be adapted to this measurement line. The transmission of the data from the measurement line to the computer is made 'on line'; c) the medical surveillance data for each person, transmitted 'off line' to the database computer. The unified system resulting from the unification of the three supervising systems achieves the following general functions: - registering of the personnel working in the nuclear field; - recording of the dosimetric data; - processing and presentation of the data; - issuing of measurement bulletins. Thus, by means of the unified database, dosimetric intercomparisons and correlative studies can be undertaken. (author)

  7. Cone beam computed tomography image guidance system for a dedicated intracranial radiosurgery treatment unit.

    Science.gov (United States)

    Ruschin, Mark; Komljenovic, Philip T; Ansell, Steve; Ménard, Cynthia; Bootsma, Gregory; Cho, Young-Bin; Chung, Caroline; Jaffray, David

    2013-01-01

    Image guidance has improved the precision of fractionated radiation treatment delivery on linear accelerators. Precise radiation delivery is particularly critical when high doses are delivered to complex shapes with steep dose gradients near critical structures, as is the case for intracranial radiosurgery. To reduce potential geometric uncertainties, a cone beam computed tomography (CT) image guidance system was developed in-house to generate high-resolution images of the head at the time of treatment, using a dedicated radiosurgery unit. The performance and initial clinical use of this imaging system are described. A kilovoltage cone beam CT system was integrated with a Leksell Gamma Knife Perfexion radiosurgery unit. The X-ray tube and flat-panel detector are mounted on a translational arm, which is parked above the treatment unit when not in use. Upon descent, a rotational axis provides 210° of rotation for cone beam CT scans. Mechanical integrity of the system was evaluated over a 6-month period. Subsequent clinical commissioning included end-to-end testing of targeting performance and subjective image quality performance in phantoms. The system has been used to image 2 patients, 1 of whom received single-fraction radiosurgery and 1 who received 3 fractions, using a relocatable head frame. Images of phantoms demonstrated soft tissue contrast visibility and submillimeter spatial resolution. A contrast difference of 35 HU was easily detected at a calibration dose of 1.2 cGy (center of head phantom). The shape of the mechanical flex vs scan angle was highly reproducible. The cone beam CT image guidance system was successfully adapted to a radiosurgery unit. The system is capable of producing high-resolution images of bone and soft tissue. The system is in clinical use and provides excellent image guidance without invasive frames. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
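
    As a hedged illustration of the kind of policy the companion paper evaluates, the sketch below implements a simple space-time-product rule: a file becomes a migration candidate once its size multiplied by its idle time exceeds a threshold. The rule, field names, and numbers are illustrative assumptions, not the paper's algorithms.

        # Minimal sketch of an automatic file-migration policy
        # (illustrative; not the specific algorithms evaluated here).
        from dataclasses import dataclass

        @dataclass
        class FileRecord:
            name: str
            size_kb: int
            last_ref_day: int    # day of most recent reference

        def select_for_migration(files, today, threshold_kb_days):
            """Return files whose space-time product exceeds the threshold.

            The space-time product (size * days since last use) is a
            classic criterion for moving cold files off disk.
            """
            victims = [f for f in files
                       if f.size_kb * (today - f.last_ref_day) > threshold_kb_days]
            # Migrate the coldest, largest files first.
            victims.sort(key=lambda f: f.size_kb * (today - f.last_ref_day),
                         reverse=True)
            return victims

        files = [FileRecord("edit1.dat", 400, last_ref_day=10),
                 FileRecord("edit2.dat", 20, last_ref_day=90),
                 FileRecord("edit3.dat", 900, last_ref_day=95)]
        for f in select_for_migration(files, today=100, threshold_kb_days=5000):
            print("migrate:", f.name)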

  9. 78 FR 24199 - Streak Products, Inc. v. UTi, United States, Inc.; Notice of Filing of Complaint and Assignment

    Science.gov (United States)

    2013-04-24

    ... FEDERAL MARITIME COMMISSION [Docket No. 13--04] Streak Products, Inc. v. UTi, United States, Inc...,'' against UTi, United States, Inc. (``UTi''), hereinafter ``Respondent.'' Complainant states that it is a... therefore, has violated 46 U.S.C. 41104(2). Complainant also alleges that ``UTi engaged in an unfair or...

  10. Computer programs for unit-cell determination in electron diffraction experiments

    International Nuclear Information System (INIS)

    Li, X.Z.

    2005-01-01

    A set of computer programs for unit-cell determination from an electron diffraction tilt series and pattern indexing has been developed on the basis of several well-established algorithms. In this approach, a reduced direct primitive cell is first determined from experimental data; in the meantime, the measurement errors of the tilt angles are checked and minimized. The derived primitive cell is then checked for possible higher lattice symmetry and transformed into a proper conventional cell. Finally, a least-squares refinement procedure is adopted to generate optimum lattice parameters on the basis of the lengths of basic reflections in each diffraction pattern and the indices of these reflections. Examples are given to show the usage of the programs
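
    The final refinement step has a convenient linear form: for each indexed reflection, 1/d^2 is linear in the six components of the reciprocal metric tensor, so a least-squares solve recovers the lattice parameters. The numpy sketch below illustrates this with hypothetical orthorhombic data; it reflects the general algorithm, not the published programs.

        # Least-squares lattice-parameter refinement from indexed
        # reflections (generic sketch, hypothetical data).
        import numpy as np

        # (h, k, l, measured d-spacing in angstrom)
        refs = [(1, 0, 0, 5.00), (0, 1, 0, 4.00), (0, 0, 1, 3.00),
                (1, 1, 0, 3.1235), (1, 0, 1, 2.5725), (0, 1, 1, 2.4000)]

        # 1/d^2 = h^2*m1 + k^2*m2 + l^2*m3 + hk*m4 + hl*m5 + kl*m6
        A = np.array([[h*h, k*k, l*l, h*k, h*l, k*l] for h, k, l, _ in refs], float)
        b = np.array([1.0 / d**2 for *_, d in refs])

        m, *_ = np.linalg.lstsq(A, b, rcond=None)   # reciprocal metric components

        # For an (assumed) orthogonal cell the cross terms vanish and the
        # direct-cell edges follow immediately from the diagonal terms.
        a, bb, c = 1/np.sqrt(m[0]), 1/np.sqrt(m[1]), 1/np.sqrt(m[2])
        print(f"a = {a:.3f} A, b = {bb:.3f} A, c = {c:.3f} A")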

  11. Development of Thermal Performance Analysis Computer Program on Turbine Cycle of Yonggwang 3,4 Units

    Energy Technology Data Exchange (ETDEWEB)

    Hong, S.Y.; Choi, K.H.; Jee, M.H.; Chung, S.I. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    The objective of the study ''Development of Thermal Performance Analysis Computer Program on Turbine Cycle of Yonggwang 3,4 Units'' is to apply a computerized program to performance testing of the turbine cycle and to analysis of the operational status of thermal plants. In addition, the results are applicable to analysis of thermal output under abnormal conditions and provide a powerful tool for identifying the main problems in such cases. As a result, the output of this study provides a means to confirm the technical capability to operate the plants efficiently and to obtain remarkable economic gains. (author). 27 refs., 73 figs., 6 tabs.

  12. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential health effects related to electromagnetic field exposure raise public concern, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.

  13. UPTF test instrumentation. Measurement system identification, engineering units and computed parameters

    International Nuclear Information System (INIS)

    Sarkar, J.; Liebert, J.; Laeufer, R.

    1992-11-01

    This updated version of the previous report /1/ contains, besides the additional instrumentation needed for the 2D/3D Programme, the supplementary instrumentation in the inlet plenum of the SG simulator, the hot and cold legs of the broken loop, the cold legs of the intact loops, and the upper plenum, to meet the requirements (Test Phase A) of the UPTF Programme TRAM, sponsored by the Federal Minister of Research and Technology (BMFT) of the Federal Republic of Germany. For clarity, the derivation and description of the identification codes for the entire conventional and advanced measurement systems, classifying the function and the equipment unit key as adopted in conventional power plants, have been included. Amendments have also been made to the appendices. In particular, the list of measurement systems covering the measurement identification code, instrument, measured quantity, measuring range, bandwidth, uncertainty and sensor location has been updated and extended to include the supplementary instrumentation. Beyond these amendments, the uncertainties of measurements have been precisely specified. The measurement identification codes, which also identify the corresponding measured quantities in engineering units, and the identification codes derived therefrom for the computed parameters, have been adequately detailed. (orig.)

  14. Unit physics performance of a mix model in Eulerian fluid computations

    Energy Technology Data Exchange (ETDEWEB)

    Vold, Erik [Los Alamos National Laboratory; Douglass, Rod [Los Alamos National Laboratory

    2011-01-25

    In this report, we evaluate the performance of a K-L drag-buoyancy mix model, described in a reference study by Dimonte-Tipton [1] hereafter denoted as [D-T]. The model was implemented in an Eulerian multi-material AMR code, and the results are discussed here for a series of unit physics tests. The tests were chosen to calibrate the model coefficients against empirical data, principally from RT (Rayleigh-Taylor) and RM (Richtmyer-Meshkov) experiments, and the present results are compared to experiments and to results reported in [D-T]. Results show the Eulerian implementation of the mix model agrees well with expectations for test problems in which there is no convective flow of the mass averaged fluid, i.e., in RT mix or in the decay of homogeneous isotropic turbulence (HIT). In RM shock-driven mix, the mix layer moves through the Eulerian computational grid, and there are differences with the previous results computed in a Lagrange frame [D-T]. The differences are attributed to the mass averaged fluid motion and examined in detail. Shock and re-shock mix are not well matched simultaneously. Results are also presented and discussed regarding model sensitivity to coefficient values and to initial conditions (IC), grid convergence, and the generation of atomically mixed volume fractions.

  15. Real-time computation of parameter fitting and image reconstruction using graphical processing units

    Science.gov (United States)

    Locans, Uldis; Adelmann, Andreas; Suter, Andreas; Fischer, Jannis; Lustermann, Werner; Dissertori, Günther; Wang, Qiulin

    2017-06-01

    In recent years graphical processing units (GPUs) have become a powerful tool in scientific computing. Their potential to speed up highly parallel applications brings the power of high performance computing to a wider range of users. However, programming these devices and integrating their use in existing applications is still a challenging task. In this paper we examined the potential of GPUs for two different applications. The first application, created at Paul Scherrer Institut (PSI), is used for parameter fitting during data analysis of μSR (muon spin rotation, relaxation and resonance) experiments. The second application, developed at ETH, is used for PET (Positron Emission Tomography) image reconstruction and analysis. Applications currently in use were examined to identify parts of the algorithms in need of optimization. Efficient GPU kernels were created in order to allow applications to use a GPU, to speed up the previously identified parts. Benchmarking tests were performed in order to measure the achieved speedup. During this work, we focused on single GPU systems to show that real-time data analysis of these problems can be achieved without the need for large computing clusters. The results show that the currently used application for parameter fitting, which uses OpenMP to parallelize calculations over multiple CPU cores, can be accelerated around 40 times through the use of a GPU. The speedup may vary depending on the size and complexity of the problem. For PET image analysis, the obtained speedups of the GPU version were more than 40 times those of a single-core CPU implementation. The achieved results show that it is possible to improve the execution time by orders of magnitude.

  16. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that microcrack lengths were notably increased following root canal preparation (P<0.05) with all files. Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  17. Assessment of Undiscovered Deposits of Gold, Silver, Copper, Lead, and Zinc in the United States: A Portable Document (PDF) Recompilation of USGS Open-File Report 96-96 and Circular 1178

    Science.gov (United States)

    U.S. Geological Survey National Mineral Resource Assessment Team Recompiled by Schruben, Paul G.

    2002-01-01

    This publication contains the results of a national mineral resource assessment study. The study (1) identifies regional tracts of ground believed to contain most of the nation's undiscovered resources of gold, silver, copper, lead, and zinc in conventional types of deposits; and (2) includes probabilistic estimates of the amounts of these undiscovered resources in most of the tracts. It also contains a table of the significant known deposits in the tracts, and includes descriptions of the mineral deposit models used for the assessment. The assessment was previously released in two major publications. The conterminous United States assessment was published in 1996 as USGS Open-File Report 96-96. Subsequently, the Alaska assessment was combined with the conterminous assessment in 1998 and released as USGS Circular 1178. This new recompilation was undertaken for several reasons. First, the graphical browser software used in Circular 1178 was only compatible with the Microsoft Windows operating system. It was incompatible with the Macintosh operating system, Linux, and other types of Unix computers. Second, the browser on Circular 1178 was much less intuitive to operate, requiring most users to follow a tutorial to understand how to navigate the information on the CD. Third, this release corrects several errors and numbering inconsistencies in Circular 1178.

  18. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been remarkable interest in making computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) type of professional development organization and source of funding, (2) professional development structure and participants, (3) goal of professional development and type of evaluation used, (4) specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  19. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as, ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users
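
    The quoted query suggests how tag-based selection avoids reading unwanted events. The sketch below is a hedged, self-contained illustration of that idea in Python; the tag fields and catalog layout are hypothetical, not the actual STAR or Grid Collector interfaces.

        # Hedged sketch of tag-based event selection in the spirit of the
        # example query; field names and catalog structure are invented.
        from datetime import date

        catalog = [  # one tag record per event
            {"file": "st_physics_01.daq", "event": 7, "prod": date(2003, 3, 12), "ntrk": 152},
            {"file": "st_physics_01.daq", "event": 9, "prod": date(2003, 3, 15), "ntrk": 64},
            {"file": "st_physics_02.daq", "event": 3, "prod": date(2003, 3, 18), "ntrk": 205},
        ]

        def select(catalog, lo, hi, min_ntrk):
            """Select events whose production date is in [lo, hi] and ntrk > min_ntrk."""
            return [t for t in catalog
                    if lo <= t["prod"] <= hi and t["ntrk"] > min_ntrk]

        hits = select(catalog, date(2003, 3, 10), date(2003, 3, 20), 100)
        # Only the files that actually hold selected events need staging from tape.
        files_needed = sorted({t["file"] for t in hits})
        print(files_needed, [t["event"] for t in hits])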

  20. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as, ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  1. Utilizing General Purpose Graphics Processing Units to Improve Performance of Computer Modelling and Visualization

    Science.gov (United States)

    Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.

    2009-12-01

    With the introduction of the G8X series of cards by nVidia, an architecture called CUDA was released, and virtually all subsequent video cards have had CUDA support. With this new architecture nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then the concept of GPGPU (general-purpose computation on graphics processing units) has been growing: the GPU is very good at algebra and at running things in parallel, so that power can be put to use for other applications. This is highly appealing in the area of geodynamic modeling, as computing parallel solutions of the same differential equations at many points in space leads to a large speedup in simulation. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to enabling computation and rendering on the video card, the CUDA framework allows for a large speedup in situations, such as a tiled display wall, where the rendered pixels are to be displayed in a different location from where they are rendered. A CUDA extension for VirtualGL was developed, allowing faster readback at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC. It demonstrates how performance can be significantly improved when rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers. It also discusses restrictions caused by readback and blitting rates and how these are affected by the size of the virtual display being rendered.

  2. Computational fluid dynamics simulation of wind-driven inter-unit dispersion around multi-storey buildings: Upstream building effect

    DEFF Research Database (Denmark)

    Ai, Zhengtao; Mak, C.M.; Dai, Y.W.

    2017-01-01

    This study investigates the effect of the airflow patterns changed by an upstream building on inter-unit dispersion characteristics around a multi-storey building under wind effect. The computational fluid dynamics (CFD) method in the framework of Reynolds-averaged Navier-Stokes modelling was employed to predict the coupled outdoor and indoor airflow field, and the tracer gas technique was used to simulate the dispersion of infectious agents between units. Based on the predicted concentration field, a mass-conservation-based parameter, namely the re-entry ratio, was used to evaluate quantitatively the inter-unit dispersion possibilities and thus assess risks along

  3. Usefulness of computed tomography hounsfield unit measurement for diagnosis of congenital cholesteatoma

    International Nuclear Information System (INIS)

    Ahn, Sang Hyuk; Kim, Yong Woo; Baik, Seung Kug; Hwang, Jae Yeon; Lee, Il Woo

    2014-01-01

    To evaluate the usefulness of Hounsfield unit (HU) measurements for the diagnosis of congenital cholesteatoma. A total of 43 patients who underwent surgery for middle ear cavity lesions were enrolled. Twenty-one patients were confirmed to have congenital cholesteatoma by histopathological results and the other 22 patients were confirmed to have otitis media (OM) by operation. Their computed tomography images were retrospectively reviewed. We measured HU of the soft tissue mass in the middle ear cavity. In addition, we evaluated the largest diameter and location of the mass, the presence of bony erosion in the ear ossicle, and the status of the tympanic membrane in the cholesteatoma group. The mean HU was 37.36 ± 6.11 (range, 27.5-52.5) in the congenital cholesteatoma group and 76.09 ± 8.74 (range, 58.5-96) in the OM group (p < 0.001). The cut-off value was 55.5. The most common location for congenital cholesteatoma was the mesotympanum, and ear ossicle erosion was present in 24%. All patients had an intact tympanic membrane. HU measurement may be useful as an additional indicator to diagnose congenital cholesteatoma.

  4. Usefulness of computed tomography hounsfield unit measurement for diagnosis of congenital cholesteatoma

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Hyuk; Kim, Yong Woo; Baik, Seung Kug; Hwang, Jae Yeon; Lee, Il Woo [Medical Research Institute, Pusan National University Yangsan Hospital, College of Medicine, Pusan National University, Yangsan (Korea, Republic of)

    2014-02-15

    To evaluate the usefulness of Hounsfield unit (HU) measurements for the diagnosis of congenital cholesteatoma. A total of 43 patients who underwent surgery for middle ear cavity lesions were enrolled. Twenty-one patients were confirmed to have congenital cholesteatoma by histopathological results and the other 22 patients were confirmed to have otitis media (OM) by operation. Their computed tomography images were retrospectively reviewed. We measured HU of the soft tissue mass in the middle ear cavity. In addition, we evaluated the largest diameter and location of the mass, the presence of bony erosion in the ear ossicle, and the status of the tympanic membrane in the cholesteatoma group. The mean HU was 37.36 ± 6.11 (range, 27.5-52.5) in the congenital cholesteatoma group and 76.09 ± 8.74 (range, 58.5-96) in the OM group (p < 0.001). The cut-off value was 55.5. The most common location for congenital cholesteatoma was the mesotympanum, and ear ossicle erosion was present in 24%. All patients had an intact tympanic membrane. HU measurement may be useful as an additional indicator to diagnose congenital cholesteatoma.

  5. 76 FR 70651 - Fee for Filing a Patent Application Other Than by the Electronic Filing System

    Science.gov (United States)

    2011-11-15

    ... government; or (3) preempt tribal law. Therefore, a tribal summary impact statement is not required under... 0651-AC64 Fee for Filing a Patent Application Other Than by the Electronic Filing System AGENCY: United..., that is not filed by electronic means as prescribed by the Director of the United States Patent and...

  6. Effect of Jigsaw II, Reading-Writing-Presentation, and Computer Animations on the Teaching of "Light" Unit

    Science.gov (United States)

    Koç, Yasemin; Yildiz, Emre; Çaliklar, Seyma; Simsek, Ümit

    2016-01-01

    The aim of this study is to determine the effect of Jigsaw II technique, reading-writing-presentation method, and computer animation on students' academic achievements, epistemological beliefs, attitudes towards science lesson, and the retention of knowledge in the "Light" unit covered in the 7th grade. The sample of the study consists…

  7. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer based resource units were developed in the areas of set theory, relations and function, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  8. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural

  9. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide paths to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at the ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections over 360° were acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of maximum curvature (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDRs) and the ratio of cross-sectional areas (RAs) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and of the canal-curvature factor, as well as their interactions, for both RDRs and RAs. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause fewer canal aberrations. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. Performance characterization of megavoltage computed tomography imaging on a helical tomotherapy unit

    International Nuclear Information System (INIS)

    Meeks, Sanford L.; Harmon, Joseph F. Jr.; Langen, Katja M.; Willoughby, Twyla R.; Wagner, Thomas H.; Kupelian, Patrick A.

    2005-01-01

    Helical tomotherapy is an innovative means of delivering IGRT and IMRT using a device that combines features of a linear accelerator and a helical computed tomography (CT) scanner. The HI-ART II can generate CT images from the same megavoltage x-ray beam it uses for treatment. These megavoltage CT (MVCT) images offer verification of the patient position prior to and potentially during radiation therapy. Since the unit uses the actual treatment beam as the x-ray source for image acquisition, no surrogate telemetry systems are required to register image space to treatment space. The disadvantage to using the treatment beam for imaging, however, is that the physics of radiation interactions in the megavoltage energy range may force compromises between the dose delivered and the image quality in comparison to diagnostic CT scanners. The performance of the system is therefore characterized in terms of objective measures of noise, uniformity, contrast, and spatial resolution as a function of the dose delivered by the MVCT beam. The uniformity and spatial resolutions of MVCT images generated by the HI-ART II are comparable to those of diagnostic CT images. Furthermore, the MVCT scan contrast is linear with respect to the electron density of the material imaged. MVCT images do not have the same performance characteristics as state-of-the-art diagnostic CT scanners when one objectively examines noise and low-contrast resolution. These inferior results may be explained, at least partially, by the low doses delivered by our unit; the dose is 1.1 cGy in a 20 cm diameter cylindrical phantom. In spite of the poorer low-contrast resolution, these relatively low-dose MVCT scans provide sufficient contrast to delineate many soft-tissue structures. Hence, these images are useful not only for verifying the patient's position at the time of therapy, but they are also sufficient for delineating many anatomic structures. In conjunction with the ability to recalculate radiotherapy doses on

  11. Suitable exposure conditions for CB Throne? New model cone beam computed tomography unit for dental use

    International Nuclear Information System (INIS)

    Tanabe, Kouji; Nishikawa, Keiichi; Yajima, Aya; Mizuta, Shigeru; Sano, Tsukasa; Yajima, Yasutomo; Nakagawa, Kanichi; Kousuge, Yuuji

    2008-01-01

    The CB Throne is a cone beam computed tomography unit for dental use and a smaller version of the CB MercuRay developed by Hitachi Medico Co. We investigated which exposure conditions are suitable for clinical use. Suitable exposure conditions were determined by simple subjective comparisons. The right temporomandibular joint of a head phantom was scanned at all possible combinations of tube voltage (60, 80, 100, 120 kV) and tube current (10, 15 mA). Oblique-sagittal images of the same position were obtained using the multiplanar reconstruction (MPR) function. Images obtained at 120 kV and 15 mA, the highest exposure conditions and certain to produce images of the best quality, were used to establish the standard. Eight oral radiologists observed each image and the standard image on an LCD monitor. They subjectively compared spatial resolution and noise between each image and the standard image using a 10 cm scale. Evaluation points were obtained from the check positions on the scales. The Steel method was used to determine significant differences. The images at 60 kV/10 mA and 80 kV/15 mA showed significantly lower evaluation points for spatial resolution. The images at 60 kV/10 mA, 60 kV/15 mA and 80 kV/10 mA showed significantly lower evaluation points for noise. In conclusion, even if exposure conditions are reduced to 100 kV/10 mA, 100 kV/15 mA or 120 kV/10 mA, the CB Throne will produce images of the best quality. (author)

  12. Absolute Hounsfield unit measurement on noncontrast computed tomography cannot accurately predict struvite stone composition.

    Science.gov (United States)

    Marchini, Giovanni Scala; Gebreselassie, Surafel; Liu, Xiaobo; Pynadath, Cindy; Snyder, Grace; Monga, Manoj

    2013-02-01

    The purpose of our study was to determine, in vivo, whether single-energy noncontrast computed tomography (NCCT) can accurately predict the presence/percentage of struvite stone composition. We retrospectively searched for all patients with struvite components on stone composition analysis between January 2008 and March 2012. Inclusion criteria were NCCT prior to stone analysis and stone size ≥4 mm. A single urologist, blinded to stone composition, reviewed all NCCT to acquire stone location, dimensions, and Hounsfield unit (HU). HU density (HUD) was calculated by dividing mean HU by the stone's largest transverse diameter. Stone analysis was performed via Fourier transform infrared spectrometry. Independent sample Student's t-test and analysis of variance (ANOVA) were used to compare HU/HUD among groups. Spearman's correlation test was used to determine the correlation between HU and stone size, and between HU/HUD and the percentage of each component within the stone. Significance was considered if p<0.05. The correlation of stone size was positive with HU (R=0.017; p=0.912) and negative with HUD (R=-0.20; p=0.898). When pure struvite stones (n=5) were compared with other miscellaneous struvite-containing stones (n=39), no difference was found for HU (p=0.09), but HUD was significantly lower for pure stones (27.9±23.6 vs 72.5±55.9, respectively; p=0.006). Again, significant overlaps were seen. Pure struvite stones have significantly lower HUD than mixed struvite stones, but overlap exists. A low HUD may increase the suspicion for a pure struvite calculus.

  13. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.
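
    The portability PFF provides rests on fixing byte order and field sizes inside the file rather than inheriting them from the host. The sketch below illustrates that general idea with Python's struct module; the 'DEMO' record layout is invented for the example and is not the actual PFF encoding documented in the report.

        # Illustration of platform-portable binary I/O (generic sketch;
        # the record layout is invented and is NOT the real PFF encoding).
        import struct

        # '>' fixes big-endian byte order, so the bytes mean the same
        # thing on every host regardless of native endianness.
        HEADER = struct.Struct(">4sII")      # magic, version, payload length

        def write_record(path, payload, version=1):
            with open(path, "wb") as f:
                f.write(HEADER.pack(b"DEMO", version, len(payload)))
                f.write(payload)

        def read_record(path):
            with open(path, "rb") as f:
                magic, version, n = HEADER.unpack(f.read(HEADER.size))
                if magic != b"DEMO":
                    raise ValueError("not a DEMO file")
                return version, f.read(n)

        write_record("sample.bin", b"\x00\x01\x02\x03")
        print(read_record("sample.bin"))     # (1, b'\x00\x01\x02\x03')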

  14. Mechanical properties of regular porous biomaterials made from truncated cube repeating unit cells: Analytical solutions and computational models.

    Science.gov (United States)

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-03-01

    Additive manufacturing (AM) has enabled fabrication of open-cell porous biomaterials based on repeating unit cells. The micro-architecture of the porous biomaterials and, thus, their physical properties could then be precisely controlled. Due to their many favorable properties, porous biomaterials manufactured using AM are considered as promising candidates for bone substitution as well as for several other applications in orthopedic surgery. The mechanical properties of such porous structures including static and fatigue properties are shown to be strongly dependent on the type of the repeating unit cell based on which the porous biomaterial is built. In this paper, we study the mechanical properties of porous biomaterials made from a relatively new unit cell, namely truncated cube. We present analytical solutions that relate the dimensions of the repeating unit cell to the elastic modulus, Poisson's ratio, yield stress, and buckling load of those porous structures. We also performed finite element modeling to predict the mechanical properties of the porous structures. The analytical solution and computational results were found to be in agreement with each other. The mechanical properties estimated using both the analytical and computational techniques were somewhat higher than the experimental data reported in one of our recent studies on selective laser melted Ti-6Al-4V porous biomaterials. In addition to porosity, the elastic modulus and Poisson's ratio of the porous structures were found to be strongly dependent on the ratio of the length of the inclined struts to that of the uninclined (i.e. vertical or horizontal) struts, α, in the truncated cube unit cell. The geometry of the truncated cube unit cell approaches the octahedral and cube unit cells when α respectively approaches zero and infinity. Consistent with those geometrical observations, the analytical solutions presented in this study approached those of the octahedral and cube unit cells when

  15. Grid collector an event catalog with automated file management

    CERN Document Server

    Ke Sheng Wu; Sim, A; Jun Min Gu; Shoshani, A

    2004-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select ev...

  16. Porting of the transfer-matrix method for multilayer thin-film computations on graphics processing units

    Science.gov (United States)

    Limmer, Steffen; Fey, Dietmar

    2013-07-01

    Thin-film computations are often a time-consuming task during optical design. An efficient way to accelerate these computations with the help of graphics processing units (GPUs) is described. It turned out that significant speed-ups can be achieved. We investigate the circumstances under which the best speed-up values can be expected. Therefore we compare different GPUs among themselves and with a modern CPU. Furthermore, the effect of thickness modulation on the speed-up and the runtime behavior depending on the input data is examined.
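
    For reference, the underlying method is compact enough to state in a few lines: at normal incidence each layer contributes a 2x2 characteristic matrix, and the product of these matrices gives the stack's reflectance. The numpy sketch below is a plain CPU version under non-absorbing, normal-incidence assumptions; the GPU port described in the paper parallelizes exactly this kind of computation over many wavelengths and layer variations.

        # Transfer-matrix reflectance of a thin-film stack at normal
        # incidence (plain CPU/numpy sketch of the general method).
        import numpy as np

        def reflectance(n0, layers, n_sub, wavelength):
            """layers: list of (refractive index, physical thickness) pairs."""
            M = np.eye(2, dtype=complex)
            for n, d in layers:
                delta = 2 * np.pi * n * d / wavelength       # phase thickness
                c, s = np.cos(delta), 1j * np.sin(delta)
                M = M @ np.array([[c, s / n], [s * n, c]])   # characteristic matrix
            B, C = M @ np.array([1.0, n_sub])                # stack admittance vector
            r = (n0 * B - C) / (n0 * B + C)                  # amplitude reflection
            return abs(r) ** 2

        # Quarter-wave MgF2 layer (n=1.38) on glass (n=1.52) at 550 nm:
        # yields the classic ~1.3% antireflection result.
        print(reflectance(1.0, [(1.38, 550 / (4 * 1.38))], 1.52, 550.0))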

  17. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    Science.gov (United States)

    2017-08-01

    The system used for its GPU computing capability during the experiment has Nvidia Tesla K40 GPU accelerators, with 32 GPU nodes consisting of 1024 cores. CUDA is a parallel computing platform and application programming interface (API) model that was created and designed by Nvidia to give direct access to the GPU.

  18. In-vitro Assessing the Shaping Ability of Three Nickel-Titanium Rotary Single File Systems by Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Ali Imad Al-Asadi

    2018-02-01

    The aim of the study was to evaluate the canal transportation and centering ability of three nickel-titanium single-file rotary systems by cone beam computed tomography (CBCT). Materials and methods: Thirty permanent maxillary first molars with mesiobuccal canal curvatures ranging from 20 to 30 degrees were selected and assigned to three groups (n=10), according to the biomechanical preparation system used: Hyflex EDM (HF), Reciproc Blue (RB) and OneShape (OS). The samples were scanned by CBCT after being mounted on a customized acrylic base and then rescanned after instrumentation. Axial slices were taken from both exposures at 3 mm, 6 mm and 9 mm from the root apex, corresponding to the apical, middle, and coronal thirds, respectively. Data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests at the 5% significance level. Results: There were no significant differences at the apical and coronal thirds and a significant difference at the middle third regarding canal transportation. However, there was a significant difference at the apical third and no significant difference at the middle and coronal thirds regarding centering ratio. Conclusion: All three single-file rotary systems produced some degree of canal transportation and deviation in centering ratio, but Hyflex EDM produced the least.

  19. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had significantly better centering and less canal transportation in the buccolingual plane compared with the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of both systems was comparable.

  20. Application of Computer Technology to Educational Administration in the United States.

    Science.gov (United States)

    Bozeman, William C.; And Others

    1991-01-01

    Description of evolution of computer applications in U.S. educational administration is followed by an overview of the structure and governance of public education and Visscher's developmental framework. Typical administrative computer applications in education are discussed, including student records, personnel management, budgeting, library…

  1. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scales. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and with reasonable fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.
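
    To make the modeling idea concrete, here is a toy sketch, far simpler than the simulator described, of the kind of quantity such a tool estimates: the completion time of a file read striped round-robin over a set of storage servers. All sizes and bandwidths are assumed for illustration.

        # Toy model of a striped parallel read (illustrative only; not
        # the authors' simulator).
        def read_time(file_mb, stripe_mb, servers, server_bw_mb_s):
            """Time for one client to read a file striped round-robin."""
            nstripes = -(-file_mb // stripe_mb)        # ceiling division
            per_server = [0] * servers
            for i in range(nstripes):                  # round-robin placement
                per_server[i % servers] += stripe_mb
            # Servers work in parallel; the most-loaded one finishes last.
            return max(per_server) / server_bw_mb_s

        for n in (1, 2, 4, 8):
            print(f"{n} servers: {read_time(1024, 64, n, 100):6.2f} s")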

  2. 2014 Cartographic Boundary File, American Indian Area/Alaska Native Area/Hawaiian Home Land for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  3. 2016 Cartographic Boundary File, 2010 Urban Areas (UA) within 2010 County and Equivalent for United States Virgin Islands, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  4. 2015 Cartographic Boundary File, American Indian Area/Alaska Native Area/Hawaiian Home Land for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  5. 2015 Cartographic Boundary File, American Indian Area/Alaska Native Area/Hawaiian Home Land for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2015 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. 2014 Cartographic Boundary File, American Indian Area/Alaska Native Area/Hawaiian Home Land for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2014 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. 2016 Cartographic Boundary File, Current American Indian/Alaska Native/Native Hawaiian Areas for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  8. 2016 Cartographic Boundary File, Current American Indian/Alaska Native/Native Hawaiian Areas for United States, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  9. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    The File Transfer Protocol (FTP) helps users transfer files between computers on the Internet. However, FTP cannot meet users' needs on some special occasions, so programmers need to define their own file transfer protocols based on user requirements. The method of realization and an application of a user-defined file transfer protocol are introduced. (authors)
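
    As an illustration of what a small user-defined file transfer protocol can look like (a sketch only, not the protocol designed by the authors), the following Python code frames each transfer as a fixed binary header carrying the name and payload lengths, followed by the bytes themselves.

        # Minimal user-defined file-transfer protocol over TCP
        # (illustrative sketch, not the paper's protocol).
        import socket
        import struct
        import threading

        def recv_exact(conn, n):
            """Read exactly n bytes or raise."""
            buf = b""
            while len(buf) < n:
                chunk = conn.recv(n - len(buf))
                if not chunk:
                    raise ConnectionError("peer closed early")
                buf += chunk
            return buf

        def send_file(host, port, name, data):
            with socket.create_connection((host, port)) as s:
                header = name.encode()
                # Fixed-size prefix: big-endian name length and payload length.
                s.sendall(struct.pack(">II", len(header), len(data)))
                s.sendall(header)
                s.sendall(data)

        def serve_once(srv):
            conn, _ = srv.accept()
            with conn:
                name_len, data_len = struct.unpack(">II", recv_exact(conn, 8))
                name = recv_exact(conn, name_len).decode()
                data = recv_exact(conn, data_len)
                print("received", name, ":", len(data), "bytes")

        srv = socket.socket()
        srv.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
        srv.listen(1)
        port = srv.getsockname()[1]
        t = threading.Thread(target=serve_once, args=(srv,))
        t.start()
        send_file("127.0.0.1", port, "report.txt", b"hello custom protocol")
        t.join()
        srv.close()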

  10. Computer-aided modeling of aluminophosphate zeolites as packings of building units

    KAUST Repository

    Peskov, Maxim; Blatov, Vladislav A.; Ilyushin, Gregory D.; Schwingenschlö gl, Udo

    2012-01-01

    New building schemes of aluminophosphate molecular sieves from packing units (PUs) are proposed. We have investigated 61 framework types discovered in zeolite-like aluminophosphates and have identified important PU combinations using a recently

  11. Critical Vulnerability: Defending the Decisive Point of United States Computer Networked Information Systems

    National Research Council Canada - National Science Library

    Virden, Roy

    2003-01-01

    .... The military's use of computer networked information systems is thus a critical strength. These systems are then critical vulnerabilities because they may lack adequate protection and are open to enemy attack...

  12. Performance evaluation for volumetric segmentation of multiple sclerosis lesions using MATLAB and computing engine in the graphical processing unit (GPU)

    Science.gov (United States)

    Le, Anh H.; Park, Young W.; Ma, Kevin; Jacobs, Colin; Liu, Brent J.

    2010-03-01

    Multiple Sclerosis (MS) is a progressive neurological disease affecting myelin pathways in the brain. Multiple lesions in the white matter can cause paralysis and severe motor disabilities in affected patients. To address the inconsistency and user-dependency of manual lesion measurement on MRI, we have proposed a 3-D automated lesion quantification algorithm to enable objective and efficient lesion volume tracking. The computer-aided detection (CAD) of MS, written in MATLAB, utilizes the K-Nearest Neighbors (KNN) method to compute the probability of lesions on a per-voxel basis. Despite the highly optimized image-processing algorithms used in CAD development, MS CAD integration and evaluation in the clinical workflow is technically challenging due to the high computation rates and memory bandwidth required by the recursive nature of the algorithm. In this paper, we present the development and evaluation of a computing engine in the graphical processing unit (GPU) with MATLAB for segmentation of MS lesions. The paper investigates the utilization of a high-end GPU for parallel computing of KNN in the MATLAB environment to improve algorithm performance. The integration is accomplished using NVIDIA's CUDA developmental toolkit for MATLAB. The results of this study will validate the practicality and effectiveness of the prototype MS CAD in a clinical setting. The GPU method may allow MS CAD to integrate rapidly into an electronic patient record or any disease-centric health care system.
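
    As a hedged illustration of per-voxel KNN classification (the general technique named in the abstract, not the CAD's MATLAB/CUDA implementation), the sketch below trains a KNN model on hypothetical voxel features and produces a lesion-probability volume; the feature set, labels, and k are invented for the example.

        # Per-voxel KNN lesion probability, sketched with scikit-learn.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)

        # Hypothetical training set: rows of [T1 intensity, FLAIR intensity, z]
        X_train = rng.normal(size=(5000, 3))
        y_train = (X_train[:, 1] > 1.0).astype(int)     # stand-in lesion labels

        knn = KNeighborsClassifier(n_neighbors=15)
        knn.fit(X_train, y_train)

        # Classify every voxel of a toy volume: flatten to an
        # (n_voxels, 3) feature matrix, predict, reshape back.
        vol = rng.normal(size=(8, 8, 8, 3))
        prob = knn.predict_proba(vol.reshape(-1, 3))[:, 1].reshape(8, 8, 8)
        print("voxels above 0.5:", int((prob > 0.5).sum()))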

  13. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    Science.gov (United States)

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
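
    A rough CPU sketch of the two stages that were offloaded, a spatial filter as one matrix-matrix multiply and a per-channel autoregressive (Yule-Walker) spectrum, is given below in numpy; the channel count, window length, model order, and the common-average-reference filter are illustrative assumptions, not the study's exact configuration.

        # CPU sketch of the BCI signal chain's two heavy stages.
        import numpy as np

        rng = np.random.default_rng(1)
        n_ch, n_samp, order = 64, 250, 10      # e.g. 250 samples ~ 250 ms at 1 kHz
        raw = rng.normal(size=(n_ch, n_samp))

        # (1) Spatial filter: each output channel is a weighted sum of
        # inputs (common average reference shown here).
        W = np.eye(n_ch) - np.ones((n_ch, n_ch)) / n_ch
        sig = W @ raw

        # (2) Autoregressive power spectrum via the Yule-Walker equations.
        def ar_psd(x, order, nfreq=64):
            n = len(x)
            r = np.correlate(x, x, "full")[n - 1 : n + order] / n   # lags 0..order
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            a = np.linalg.solve(R, r[1:])          # AR coefficients
            var = r[0] - a @ r[1:]                 # driving-noise variance
            w = np.linspace(0, np.pi, nfreq)
            E = np.exp(-1j * np.outer(w, np.arange(1, order + 1)))
            return var / np.abs(1 - E @ a) ** 2

        psd = np.array([ar_psd(ch, order) for ch in sig])
        print(psd.shape)      # (64, 64): channels x frequency bins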

  14. A study on the optimal replacement periods of digital control computer's components of Wolsung nuclear power plant unit 1

    International Nuclear Information System (INIS)

    Mok, Jin Il; Seong, Poong Hyun

    1993-01-01

    Due to failures of instrument and control devices caused by aging, nuclear power plants occasionally trip. Even a trip of a single nuclear power plant (NPP) causes a considerable economic loss and deteriorates public acceptance of nuclear power. Therefore, replacement of the instrument and control devices with proper consideration of the aging effect is necessary in order to prevent inadvertent trips. In this paper we investigated the optimal replacement periods of the control computer's components of Wolsung nuclear power plant Unit 1. We first derived mathematical models of the optimal replacement periods for the digital control computer's components of Wolsung NPP Unit 1 and calculated the optimal replacement periods analytically. We compared these periods with the replacement periods currently used at Wolsung NPP Unit 1, which are based on empirical knowledge rather than mathematical analysis. As a consequence, the optimal replacement periods obtained analytically and those used in the field show only a small difference. (Author)
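
    The abstract does not give the authors' model, so the sketch below uses the classic age-replacement formulation as an assumed stand-in: the expected cost per unit time for replacement age T is C(T) = (cp*R(T) + cf*F(T)) / integral_0^T R(t)dt, with failure CDF F, survival R = 1 - F, and planned/failure costs cp and cf, minimized numerically for an assumed Weibull failure law. All parameter values are illustrative, not Wolsung data.

        import numpy as np

        def cost_rate(T, cp=1.0, cf=10.0, shape=2.0, scale=5.0, n=2000):
            t = np.linspace(0, T, n)
            R = np.exp(-(t / scale) ** shape)          # Weibull survival function
            integral = np.sum(R) * (t[1] - t[0])       # rectangle-rule integral of R
            return (cp * R[-1] + cf * (1.0 - R[-1])) / integral

        Ts = np.linspace(0.5, 10, 200)
        T_opt = Ts[np.argmin([cost_rate(T) for T in Ts])]  # optimal replacement period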

  15. Trinary arithmetic and logic unit (TALU) using savart plate and spatial light modulator (SLM) suitable for optical computation in multivalued logic

    Science.gov (United States)

    Ghosh, Amal K.; Bhattacharya, Animesh; Raul, Moumita; Basuray, Amitabha

    2012-07-01

    The arithmetic logic unit (ALU) is the most important unit in any computing system. Optical computing is becoming more popular by the day because of its ultrahigh processing speed and huge data handling capability. For fast processing, an optical TALU compatible with multivalued logic is needed. Here we report a trinary arithmetic and logic unit (TALU) in the modified trinary number (MTN) system, which is suitable for optical computation and other applications in multivalued logic systems. Savart plate and spatial light modulator (SLM) based optoelectronic circuits have been used to exploit the optical tree architecture (OTA) in an optical interconnection network.
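
    The MTN encoding is not detailed in the abstract, so as a generic multivalued-logic illustration the sketch below adds two numbers in balanced ternary (digits -1, 0, +1), a common trinary system; it is not the authors' MTN arithmetic.

        def bt_add(a, b):
            """Add two balanced-ternary digit lists (least-significant digit first)."""
            out, carry = [], 0
            for i in range(max(len(a), len(b))):
                s = carry + (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
                # digit sum s is in [-3, 3]; fold it back into {-1, 0, 1} plus a carry
                carry = (s + 1) // 3 if s > 1 else (-((-s + 1) // 3) if s < -1 else 0)
                out.append(s - 3 * carry)
            if carry:
                out.append(carry)
            return out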

  16. Computer finite element analysis of stress derived from particular units of torsionally flexible metal coupling

    Directory of Open Access Journals (Sweden)

    Mariusz KUCZAJ

    2010-01-01

    Full Text Available In this article the results of Finite Element Analysis (FEA) of stresses in selected units of a torsionally flexible metal coupling are presented. The Autodesk Inventor Professional 2009 program was used as the modelling and simulation tool for the particular component loads.

  17. Reduction of computing time for seismic applications based on the Helmholtz equation by Graphics Processing Units

    NARCIS (Netherlands)

    Knibbe, H.P.

    2015-01-01

    The oil and gas industry makes use of computational intensive algorithms to provide an image of the subsurface. The image is obtained by sending wave energy into the subsurface and recording the signal required for a seismic wave to reflect back to the surface from the Earth interfaces that may have

  18. Development of the Computer Code to Determine an Individual Radionuclides in the Rad-wastes Container for Ulchin Units 3 and 4

    Energy Technology Data Exchange (ETDEWEB)

    Kang, D.W.; Chi, J.H.; Goh, E.O. [Korea Electric Power Research Institute, Taejon (Korea)

    2001-07-01

    A computer program, RASSAY, was developed to accurately evaluate the activities of various nuclides in the rad-waste container for Ulchin units 3 and 4. This is the final report of the project 'Development of the Computer Code to Determine an Individual Radionuclides in the Rad-wastes Container for Ulchin Units 3 and 4' and includes the following: 1) Structure of the computer code, RASSAY 2) An example of surface dose calculation by computer simulation using the MCNP code 3) Methods of sampling and activity measurement of various rad-wastes. (author). 21 refs., 35 figs., 6 tabs.

  19. Comparing ProFile Vortex to ProTaper Next for the efficacy of removal of root filling material: An ex vivo micro-computed tomography study

    Directory of Open Access Journals (Sweden)

    Emad AlShwaimi

    2018-01-01

    Conclusion: Our findings suggest that PV is as effective as PTN for removal of root canal filling material. Therefore, PV can be considered for use in endodontic retreatment, although more effective files or techniques are still required.

  20. A dual computed tomography linear accelerator unit for stereotactic radiation therapy: a new approach without cranially fixated stereotactic frames

    International Nuclear Information System (INIS)

    Uematsu, Minoru; Fukui, Toshiharu; Shioda, Akira; Tokumitsu, Hideyuki; Takai, Kenji; Kojima, Tadaharu; Asai, Yoshiko; Kusano, Shoichi

    1996-01-01

    Purpose: To perform stereotactic radiation therapy (SRT) without cranially fixated stereotactic frames, we developed a dual computed tomography (CT) linear accelerator (linac) treatment unit. Methods and Materials: This unit is composed of a linac, CT, and motorized table. The linac and CT are set up at opposite ends of the table, which is suitable for both machines. The gantry axis of the linac is coaxial with that of the CT scanner. Thus, the center of the target detected with the CT can be matched easily with the gantry axis of the linac by rotating the table. Positioning is confirmed with the CT for each treatment session. Positioning and treatment errors with this unit were examined by phantom studies. Between August and December 1994, 8 patients with 11 lesions of primary or metastatic brain tumors received SRT with this unit. All lesions were treated with 24 Gy in three fractions to 30 Gy in 10 fractions to the 80% isodose line, with or without conventional external beam radiation therapy. Results: Phantom studies revealed that treatment errors with this unit were within 1 mm after careful positioning. The position was easily maintained using two tiny metallic balls as vertical and horizontal marks. Motion of patients was negligible using a conventional heat-flexible head mold and dental impression. The overall time for a multiple noncoplanar arcs treatment for a single isocenter was less than 1 h on the initial treatment day and usually less than 20 min on subsequent days. Treatment was outpatient-based and well tolerated with no acute toxicities. Satisfactory responses have been documented. Conclusion: Using this treatment unit, multiple fractionated SRT is performed easily and precisely without cranially fixated stereotactic frames

  1. Vortex particle method in parallel computations on graphical processing units used in study of the evolution of vortex structures

    International Nuclear Information System (INIS)

    Kudela, Henryk; Kosior, Andrzej

    2014-01-01

    Understanding the dynamics and the mutual interaction among various types of vortical motions is a key ingredient in clarifying and controlling fluid motion. In the paper several different cases related to vortex tube interactions are presented. Due to the very long computation times on a single processor, the vortex-in-cell (VIC) method was implemented on the multicore architecture of a graphics processing unit (GPU). Numerical results for the leapfrogging of two vortex rings in inviscid and viscous fluid are presented as test cases for the new multi-GPU implementation of the VIC method. The influence of the Reynolds number on the reconnection process is shown for two examples: antiparallel vortex tubes and orthogonally offset vortex tubes. Our aim is to show the great potential of the VIC method for solutions of three-dimensional flow problems and that the VIC method is very well suited to parallel computation. (paper)
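
    For orientation, a minimal 2D vortex-particle sketch: the velocity induced at each particle by all the others via direct Biot-Savart summation. The VIC method replaces this O(N^2) sum with a grid-based Poisson solve, which is what maps so well onto GPUs; the smoothing parameter here is illustrative.

        import numpy as np

        def induced_velocity(pos, gamma, delta=1e-3):
            # pos: (N, 2) particle positions; gamma: (N,) circulations
            dx = pos[:, 0, None] - pos[None, :, 0]
            dy = pos[:, 1, None] - pos[None, :, 1]
            r2 = dx**2 + dy**2 + delta**2              # smoothing avoids the singularity
            u = -np.sum(gamma[None, :] * dy / r2, axis=1) / (2 * np.pi)
            v = np.sum(gamma[None, :] * dx / r2, axis=1) / (2 * np.pi)
            return np.stack([u, v], axis=1)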

  2. Sweep efficiency improvement of waterfloods in Steelman Units V and VII through the application of computer models

    Energy Technology Data Exchange (ETDEWEB)

    Woods, W S

    1967-01-01

    The use of a digital computer program to investigate the position of flood fronts in 2 Steelman units is described. The program simulates a potentiometric analyzer. Several years of historical performance were utilized, and the model was altered until a satisfactory match to the historical performance was obtained. After matching the historical performance, future predictions were made to evaluate the efficiency of the ultimate sweep configuration in the reservoir. These data are used as directives for improving the operation of the waterfloods. It is suggested that this application of simple techniques, rather than the complicated and elaborate computer techniques currently in use, provides sufficient economic operating directives.

  3. Intensive-care unit lung infections: The role of imaging with special emphasis on multi-detector row computed tomography

    International Nuclear Information System (INIS)

    Romano, Luigia; Pinto, Antonio; Merola, Stefanella; Gagliardi, Nicola; Tortora, Giovanni; Scaglione, Mariano

    2008-01-01

    Nosocomial pneumonia is the most frequent hospital-acquired infection. Among mechanically ventilated patients admitted to an intensive-care unit, as many as 7-41% may develop pneumonia. The role of imaging is to identify the presence, location and extent of pulmonary infection and the presence of complications. However, the poor resolution of bedside plain films frequently limits the value of radiography as an accurate diagnostic tool. To date, multi-detector row computed tomography, with its excellent contrast resolution, is the most sensitive modality for evaluating lung parenchyma infections.

  4. Computational design of metal-organic frameworks with paddlewheel-type secondary building units

    Science.gov (United States)

    Schwingenschlogl, Udo; Peskov, Maxim V.; Masghouni, Nejib

    We employ the TOPOS package to study 697 coordination polymers containing paddlewheel-type secondary building units. The underlying nets are analyzed, and 3 novel nets are chosen as potential topologies for paddlewheel-type metal-organic frameworks (MOFs). Dicarboxylate linkers are used to build basic structures for novel isoreticular MOF series, aiming at relatively compact structures with a low number of atoms per unit cell. The structures are optimized using density functional theory. Afterwards, the Grand Canonical Monte Carlo approach is employed to generate adsorption isotherms for CO2, CO, and CH4 molecules. We utilize the universal force field for simulating the interaction between the molecules and the host MOF. The diffusion behavior of the molecules inside the MOFs is analyzed by molecular dynamics simulations.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Self-Organizing Units in an Interdisciplinary Course for Pervasive Computing Design

    OpenAIRE

    McNair, Lisa; Newswander, Chad; Coupey, Eloise; Dorsa, Ed; Martin, Tom; Paretti, Marie

    2009-01-01

    We conducted a case study of a design course that focused on bringing together students from engineering, industrial design, and marketing to use pervasive computing technologies to design, coordinate, and build a “smart” dorm room for disabled individuals. The class was loosely structured to encourage innovation, critical thinking and interdisciplinarity. In this environment, teams were created, disassembled, and re-created in a self-organizing fashion. With few norms, teams were expected to...

  7. Specialists' meeting on fuel element performance computer modelling, Preston, United Kingdom, 15-19 March 1982

    International Nuclear Information System (INIS)

    1983-03-01

    The 46 papers of this meeting on computer models of water reactor fuel elements cover practically all aspects of fuel element behavior in normal operation and in accident conditions. Each session of the meeting produced a critical evaluation of one of the 5 topics into which the subject area had been divided. The session reports summarize the papers and make recommendations for further work. Separate abstracts were prepared for all the papers presented at this meeting.

  8. Ten years of CLIVE (Computer-Aided Learning in Veterinary Education) in the United Kingdom.

    Science.gov (United States)

    Dale, Vicki H M; McConnell, Gill; Short, Andrew; Sullivan, Martin

    2005-01-01

    This paper outlines the work of the CLIVE (Computer-Aided Learning in Veterinary Education) project over a 10-year period, set against the backdrop of changes in education policy and learning technology developments. The consortium of six UK veterinary schools and 14 international Associate Member Schools has been very successful. Sustaining these partnerships requires that the project redefine itself and adapt to cater to the diverse learning needs of today's students and to changing professional and societal needs on an international scale.

  9. Arithmetical unit, interrupt hardware and input-output channel for the computer Bel

    International Nuclear Information System (INIS)

    Fyroe, Karl-Johan

    1969-01-01

    This thesis describes a small general-purpose computer using characters, variable word length and two-address instructions, working in decimal (NBCD). Three interrupt lines with fixed priority have been realized. The channel is selective and generally has access to the entire memory. With slow IO devices, time sharing between the channel and the processor is possible in the central memory buffer area. (author) [fr

  10. Control and management unit for a computation platform at the PANDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Galuska, Martin; Gessler, Thomas; Kuehn, Wolfgang; Lang, Johannes; Lange, Jens Soeren; Liang, Yutie; Liu, Ming; Spruck, Bjoern; Wang, Qiang [II. Physikalisches Institut, Justus-Liebig-Universitaet Giessen (Germany)

    2010-07-01

    The FAIR facility will provide high intensity antiproton and heavy ion beams for the PANDA and HADES experiments, leading to very high reaction rates. PANDA is expected to run at 10-20 MHz with a raw data output rate of up to 200 GB/s. A sophisticated data acquisition system is needed in order to select physically relevant events online. For this purpose a network of interconnected compute nodes can be used. Each compute node can be programmed to run various algorithms, such as online particle track recognition for high-level triggering. An ATCA communication shelf provides power, cooling and high-speed interconnections to up to 14 nodes. A single shelf manager supervises and regulates the power distribution and temperature inside the shelf. The shelf manager relies on a local control chip on each node to relay sensor read-outs, provide hardware addresses, power requirements, etc. An IPM controller based on an Atmel microcontroller was designed for this purpose, and a prototype was produced. The necessary software is being developed to allow local communication with the components of the compute node and remote communication with the shelf manager, conforming to the ATCA specification.

  11. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  12. XML Files

    Science.gov (United States)

    MedlinePlus XML Files (https://medlineplus.gov/xml.html): MedlinePlus produces XML data sets that you are welcome to download ...

  13. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  17. Effect of field-of-view size on gray values derived from cone-beam computed tomography compared with the Hounsfield unit values from multidetector computed tomography scans.

    Science.gov (United States)

    Shokri, Abbas; Ramezani, Leila; Bidgoli, Mohsen; Akbarzadeh, Mahdi; Ghazikhanlu-Sani, Karim; Fallahi-Sichani, Hamed

    2018-03-01

    This study aimed to evaluate the effect of field-of-view (FOV) size on the gray values derived from cone-beam computed tomography (CBCT) compared with the Hounsfield unit values from multidetector computed tomography (MDCT) scans as the gold standard. A radiographic phantom was designed with 4 acrylic cylinders. One cylinder was filled with distilled water, and the other 3 were filled with 3 types of bone substitute: namely, Nanobone, Cenobone, and Cerabone. The phantom was scanned with 2 CBCT systems using 2 different FOV sizes, and 1 MDCT system was used as the gold standard. The mean gray values (MGVs) of each cylinder were calculated for each imaging protocol. In both CBCT systems, significant differences were noted in the MGVs of all materials between the 2 FOV sizes (P<.05) except for Cerabone in the Cranex3D system. Significant differences were found in the MGVs of each material compared with the others in both FOV sizes for each CBCT system. No significant difference was seen between the Cranex3D CBCT system and the MDCT system in the MGVs of bone substitutes on images obtained with a small FOV. The size of the FOV significantly changed the MGVs of all bone substitutes, except for Cerabone in the Cranex3D system. Both CBCT systems had the ability to distinguish the 3 types of bone substitutes based on a comparison of their MGVs. The Cranex3D CBCT system used with a small FOV had a significant correlation with MDCT results.

  18. Computer simulation with TRNSYS for a mobile refrigeration system incorporating a phase change thermal storage unit

    International Nuclear Information System (INIS)

    Liu, Ming; Saman, Wasim; Bruno, Frank

    2014-01-01

    Highlights: • A mobile refrigeration system incorporating phase change thermal storage was simulated using TRNSYS. • A TRNSYS component of a phase change thermal storage unit was created and linked to other components from the TRNSYS library. • The temperature in the refrigerated space can be predicted using this TRNSYS model under various conditions. • A mobile refrigeration system incorporating PCM and an off-peak electric driven refrigeration unit is feasible. • The phase change material with the lowest melting temperature should be selected. - Abstract: This paper presents a new TRNSYS model of a refrigeration system incorporating phase change material (PCM) for mobile transport. The phase change thermal storage unit (PCTSU) is charged by an off-vehicle refrigeration unit; during discharge, the cooling released by the PCM is used to cool the refrigerated space. The advantage of this refrigeration system over a conventional system is that it consumes less energy and produces significantly lower greenhouse gas emissions. A refrigeration system for a typical refrigerated van is modelled, and simulations are performed with climatic data from four different locations. The main components of the TRNSYS model are Type 88 (cooling load estimation) and Type 300 (new PCTSU component), accompanied by other additional components. The results show that in order to maintain the temperature of the products at −18 °C for 10 h, a total of 250 kg and 390 kg of PCM are required for no door openings and 20 door openings during transportation, respectively. In addition, a parametric study is carried out to evaluate the effects of location, size of the refrigerated space, number of door openings and melting temperature of the PCM on the thermal performance.
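
    A back-of-the-envelope check of the PCM sizing result quoted above, under assumed values for the latent heat and average cooling load (neither is given in the abstract):

        # PCM mass needed to absorb a given cooling load over a trip; the latent
        # heat and load values below are assumptions, not the paper's inputs.
        latent_heat = 200e3      # J/kg, typical eutectic/salt-hydrate PCM (assumed)
        cooling_load = 1.4e3     # W, average load on the refrigerated space (assumed)
        trip_hours = 10
        m_pcm = cooling_load * trip_hours * 3600 / latent_heat   # ~252 kg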

  19. Computer aided heat transfer analysis in a laboratory scaled heat exchanger unit

    International Nuclear Information System (INIS)

    Gunes, M.

    1998-01-01

    In this study, a laboratory-scaled heat exchanger unit and a software package developed to analyze heat transfer, particularly for use in heat transfer courses, are described. The analyses carried out in the software with sample values measured in the heat exchanger are: (1) determination of heat transfer rate, logarithmic mean temperature difference and overall heat transfer coefficient; (2) determination of the convection heat transfer coefficient inside and outside the tube and the effect of fluid velocity on these; (3) investigation of the relationship between Nusselt number, Reynolds number and Prandtl number using multiple non-linear regression analysis. Results are displayed on the screen graphically.
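
    A worked sketch of the first analysis listed, using illustrative sample temperatures: heat transfer rate, logarithmic mean temperature difference, and the overall heat transfer coefficient. All numbers are assumed, not measurements from the unit described.

        import math

        def lmtd(dT1, dT2):
            return (dT1 - dT2) / math.log(dT1 / dT2)   # log-mean temperature difference

        m_dot, cp = 0.12, 4180.0                       # hot stream: kg/s, J/(kg K)
        Th_in, Th_out, Tc_in, Tc_out = 70.0, 50.0, 20.0, 35.0   # deg C, counter-flow
        Q = m_dot * cp * (Th_in - Th_out)              # heat transfer rate, W
        dTlm = lmtd(Th_in - Tc_out, Th_out - Tc_in)    # counter-flow end differences
        A = 0.5                                        # heat-transfer area, m^2
        U = Q / (A * dTlm)                             # overall coefficient, W/(m^2 K)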

  20. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculations of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form, together with estimated results for 470 nuclides for which experimental decay data are not available. (author)
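
    A hedged sketch of the summation calculation such a file supports: total decay heat as the sum over nuclides of decay rate times mean beta plus gamma energy per decay. This simplified form ignores chain buildup of daughters (a real summation code tracks full decay chains), and the inputs are placeholders, not values from the JNDC file.

        import numpy as np

        def decay_heat(N0, lam, e_beta, e_gamma, t):
            """Decay heat (MeV/s) at time t, assuming independently decaying nuclides:
            P(t) = sum_i (E_beta_i + E_gamma_i) * lambda_i * N_i(t)."""
            N_t = N0 * np.exp(-lam * t)        # surviving atoms of each nuclide
            return np.sum((e_beta + e_gamma) * lam * N_t)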

  1. Our experience in using the whole body computed tomography unit (GE CT/T)

    International Nuclear Information System (INIS)

    Murakawa, Yasuhiro; Morimoto, Mitsuo; Ishigaki, Naoya; Zaitsu, Hiroaki; Kawabata, Kohji

    1983-01-01

    Since our hospital installed the head and neck CT unit (SCT-100N) in April 1979, we have reported our experience with this equipment in 1980 and 1982. Since the whole body CT unit (GE CT/T) was installed in our hospital in April 1982, the total number of CT examinations has reached approximately three thousand five hundred as of this August. The CT images obtained seem to be superior in quality to those obtained with other CT equipment. The most important characteristic of this equipment is that retrospective and prospective reviews of the target image, and coronal and sagittal reconstruction from contiguous transverse axial scans, are possible. In this report we show two experimental CT photograms obtained by reviews of the target image using a microchart phantom; CT photograms of uterus myoma, metastatic thyroid and liver cancers; and contiguous transverse axial scans of cerebral embolism with coronal reconstruction. The important problems for the future of this equipment are the absorbed x-ray dose to the patients and the scan time. (author)

  2. [Introducing computer units into the reception office as part of the Vrapce Psychiatric Hospital Information System].

    Science.gov (United States)

    Majdancić, Zeljko; Jukić, Vlado; Bojić, Miroslav

    2005-01-01

    Computerized medical records have become a necessity today, because of both the amount of present-day medical data and the need to handle and process them better. In more than 120 years of the Vrapce Psychiatric Hospital's existence, the most important changes in the working concept of the reception office took place when computer technology was introduced into routine use. The reception office of the Hospital is the vital place where administrative activities intersect with medical care for a patient presenting to the Hospital. The importance of this segment of the Hospital is emphasized by the fact that the reception office is in function and at patients' disposition round-the-clock, 365 days a year, with a great frequency of patients. The shift from the established way of registering medical data on patient admission in handwriting or, later, typescript, to computer recording was a challenging and demanding task (in terms of hardware, software, networking and education) for the development team as well as for the physicians, because it changed the concept (the logic of the working process) of the previous way of collecting data from the patient (history, status, diagnostic procedures, therapy, etc.). The success in the development and implementation of this project, and the confirmation of its usefulness during four years of practice at Vrapce Psychiatric Hospital, are best illustrated by the fact that other psychiatric hospitals in Croatia have already introduced it, or are introducing it, in their daily practice.

  3. Teacher's Guide for Computational Models of Animal Behavior: A Computer-Based Curriculum Unit to Accompany the Elementary Science Study Guide "Behavior of Mealworms." Artificial Intelligence Memo No. 432.

    Science.gov (United States)

    Abelson, Hal; Goldenberg, Paul

    This experimental curriculum unit suggests how dramatic innovations in classroom content may be achieved through use of computers. The computational perspective is viewed as one which can enrich and transform traditional curricula, act as a focus for integrating insights from diverse disciplines, and enable learning to become more active and…

  4. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root-canal-filling material. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). The teeth were then rescanned, and the percentage volume of residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file = TF Adaptive > ProTaper UR = Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed using Reciproc and ProTaper UR instruments than TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  6. 77 FR 35432 - Privacy Act of 1974, Computer Matching Program: United States Postal Service and the Defense...

    Science.gov (United States)

    2012-06-13

    ... system for permanent employees in a current pay status. USPS will validate the identification of the RC... the USPS Payroll reply file where inconsistencies exist. Any discrepancies as furnished by USPS...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per RAW event. The central collisions are more complex and...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  9. Compute-unified device architecture implementation of a block-matching algorithm for multiple graphical processing unit cards.

    Science.gov (United States)

    Massanes, Francesc; Cadennes, Marie; Brankov, Jovan G

    2011-07-01

    In this paper we describe and evaluate a fast implementation of a classical block matching motion estimation algorithm for multiple Graphical Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) computing engine. The implemented block matching algorithm (BMA) uses the summed absolute difference (SAD) error criterion and full grid search (FS) for finding the optimal block displacement. In this evaluation we compared the execution time of GPU and CPU implementations for images of various sizes, using integer and non-integer search grids. The results show that use of a GPU card can shorten computation time by a factor of 200 for an integer search grid and 1000 for a non-integer search grid. The additional speedup for the non-integer search grid comes from the fact that the GPU has built-in hardware for image interpolation. Further, when using multiple GPU cards, the presented evaluation shows the importance of the data splitting method across multiple cards, but an almost linear speedup with the number of cards is achievable. In addition, we compared the execution time of the proposed FS GPU implementation with two existing, highly optimized non-full grid search CPU-based motion estimation methods, namely the implementation of the Pyramidal Lucas Kanade Optical Flow algorithm in OpenCV and the Simplified Unsymmetrical multi-Hexagon search in the H.264/AVC standard. In these comparisons, the FS GPU implementation still showed modest improvement even though the computational complexity of the FS GPU implementation is substantially higher than that of the non-FS CPU implementations. We also demonstrated that for an image sequence of 720×480 pixels in resolution, commonly used in video surveillance, the proposed GPU implementation is sufficiently fast for real-time motion estimation at 30 frames per second using two NVIDIA C1060 Tesla GPU cards.
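
    A CPU reference sketch of the full-search SAD block matching the paper ports to CUDA: for one block, scan every integer displacement in a search window and keep the offset with the smallest summed absolute difference. The block size and search range below are illustrative defaults, not the paper's settings.

        import numpy as np

        def full_search_sad(ref, cur, by, bx, bsize=16, srange=8):
            """Best (dy, dx) for the block of `cur` at (by, bx) matched within `ref`."""
            block = cur[by:by + bsize, bx:bx + bsize].astype(np.int32)
            best, best_off = None, (0, 0)
            for dy in range(-srange, srange + 1):
                for dx in range(-srange, srange + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                        continue              # candidate block falls outside the frame
                    sad = np.abs(ref[y:y + bsize, x:x + bsize].astype(np.int32) - block).sum()
                    if best is None or sad < best:
                        best, best_off = sad, (dy, dx)
            return best_off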

  10. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. It is difficult to reconstruct such a file, because it often loses its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) method that consists of three steps, namely training, testing and validation, to identify the file type of file fragments. From all testing results we conclude that our proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments for three data types.
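
    A standard dynamic-programming LCS length in Python, the core similarity measure such a method would apply between a fragment's byte sequence and per-type reference sequences; the training and validation stages described above are not shown. Note the O(len(a)*len(b)) cost, which is why fragments are usually compared against short signatures.

        def lcs_length(a: bytes, b: bytes) -> int:
            """Length of the longest common subsequence, using two rolling rows."""
            prev = [0] * (len(b) + 1)
            for x in a:
                cur = [0]
                for j, y in enumerate(b, 1):
                    cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
                prev = cur
            return prev[-1]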

  11. Radiation dose reduction in a neonatal intensive care unit in computed radiography.

    Science.gov (United States)

    Frayre, A S; Torres, P; Gaona, E; Rivera, T; Franco, J; Molina, N

    2012-12-01

    The purpose of this study was to evaluate the dose received by chest x-rays in neonatal care with thermoluminescent dosimetry and to determine the level of exposure where the quantum noise level does not affect the diagnostic image quality in order to reduce the dose to neonates. In pediatric radiology, especially the prematurely born children are highly sensitive to the radiation because of the highly mitotic state of their cells; in general, the sensitivity of a tissue to radiation is directly proportional to its rate of proliferation. The sample consisted of 208 neonatal chest x-rays of 12 neonates admitted and treated in a Neonatal Intensive Care Unit (NICU). All the neonates were preterm in the range of 28-34 weeks, with a mean of 30.8 weeks. Entrance Surface Doses (ESD) values for chest x-rays are higher than the DRL of 50 μGy proposed by the National Radiological Protection Board (NRPB). In order to reduce the dose to neonates, the optimum image quality was achieved by determining the level of ESD where level noise does not affect the diagnostic image quality. The optimum ESD was estimated for additional 20 chest x-rays increasing kVp and reducing mAs until quantum noise affects image quality. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Transportable GPU (General Processor Units) chip set technology for standard computer architectures

    Science.gov (United States)

    Fosdick, R. E.; Denison, H. C.

    1982-11-01

    The USAFR-developed GPU Chip Set has been utilized by Tracor to implement both USAF and Navy Standard 16-Bit Airborne Computer Architectures. Both configurations are currently being delivered into DOD full-scale development programs. Leadless Hermetic Chip Carrier packaging has facilitated implementation of both architectures on single 4 1/2 x 5 substrates. The CMOS and CMOS/SOS implementations of the GPU Chip Set have allowed both CPU implementations to use less than 3 watts of power each. Recent efforts by Tracor for USAF have included the definition of a next-generation GPU Chip Set that will retain the application-proven architecture of the current chip set while offering the added cost advantages of transportability across ISO-CMOS and CMOS/SOS processes and across numerous semiconductor manufacturers using a newly-defined set of common design rules. The Enhanced GPU Chip Set will increase speed by an approximate factor of 3 while significantly reducing chip counts and costs of standard CPU implementations.

  13. Radiation dose reduction in a neonatal intensive care unit in computed radiography

    International Nuclear Information System (INIS)

    Frayre, A.S.; Torres, P.; Gaona, E.; Rivera, T.; Franco, J.; Molina, N.

    2012-01-01

    The purpose of this study was to evaluate the dose received by chest x-rays in neonatal care with thermoluminescent dosimetry and to determine the level of exposure where the quantum noise level does not affect the diagnostic image quality in order to reduce the dose to neonates. In pediatric radiology, especially the prematurely born children are highly sensitive to the radiation because of the highly mitotic state of their cells; in general, the sensitivity of a tissue to radiation is directly proportional to its rate of proliferation. The sample consisted of 208 neonatal chest x-rays of 12 neonates admitted and treated in a Neonatal Intensive Care Unit (NICU). All the neonates were preterm in the range of 28–34 weeks, with a mean of 30.8 weeks. Entrance Surface Doses (ESD) values for chest x-rays are higher than the DRL of 50 μGy proposed by the National Radiological Protection Board (NRPB). In order to reduce the dose to neonates, the optimum image quality was achieved by determining the level of ESD where level noise does not affect the diagnostic image quality. The optimum ESD was estimated for additional 20 chest x-rays increasing kVp and reducing mAs until quantum noise affects image quality. - Highlights: ► Entrance surface doses (ESD) in neonates were measured. ► Doses measured in neonates examinations were higher than those reported by literature. ► Reference levels in neonatal studies are required. ► Radiation protection optimization was proposed.

  14. Decomposing the Hounsfield unit: probabilistic segmentation of brain tissue in computed tomography.

    Science.gov (United States)

    Kemmling, A; Wersching, H; Berger, K; Knecht, S; Groden, C; Nölte, I

    2012-03-01

    The aim of this study was to present and evaluate a standardized technique for brain segmentation of cranial computed tomography (CT) using probabilistic partial volume tissue maps based on a database of high resolution T1 magnetic resonance images (MRI). Probabilistic tissue maps of white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) were derived from 600 normal brain MRIs (3.0 Tesla, T1-3D-turbo-field-echo) of 2 large community-based population studies (BiDirect and SEARCH Health studies). After partial tissue segmentation (FAST 4.0), MR images were linearly registered to MNI-152 standard space (FLIRT 5.5) with non-linear refinement (FNIRT 1.0) to obtain non-binary probabilistic volume images for each tissue class which were subsequently used for CT segmentation. From 150 normal cerebral CT scans a customized reference image in standard space was constructed with iterative non-linear registration to MNI-152 space. The inverse warp of tissue-specific probability maps to CT space (MNI-152 to individual CT) was used to decompose a CT image into tissue specific components (GM, WM, CSF). Potential benefits and utility of this novel approach with regard to unsupervised quantification of CT images and possible visual enhancement are addressed. Illustrative examples of tissue segmentation in different pathological cases including perfusion CT are presented. Automated tissue segmentation of cranial CT images using highly refined tissue probability maps derived from high resolution MR images is feasible. Potential applications include automated quantification of WM in leukoaraiosis, CSF in hydrocephalic patients, GM in neurodegeneration and ischemia and perfusion maps with separate assessment of GM and WM.
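
    A hedged sketch of the decomposition idea: each CT voxel's intensity is apportioned among the tissue classes in proportion to the MRI-derived prior probability maps warped into CT space. The registration itself (FLIRT/FNIRT-style) is assumed to have been done already; array names are illustrative, not the authors' implementation.

        import numpy as np

        def decompose_ct(ct, priors):
            """ct: (Z, Y, X) volume; priors: dict tissue -> (Z, Y, X) probability map."""
            total = sum(priors.values())
            total = np.where(total > 0, total, 1.0)    # avoid divide-by-zero outside brain
            return {tissue: ct * (p / total) for tissue, p in priors.items()}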

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  1. Quantifying morphological parameters of the terminal branching units in a mouse lung by phase contrast synchrotron radiation computed tomography.

    Directory of Open Access Journals (Sweden)

    Jeongeun Hwang

    Full Text Available An effective technique of phase contrast synchrotron radiation computed tomography was established for the quantitative analysis of the microstructures in the respiratory zone of a mouse lung. Heitzman's method was adopted for the whole-lung sample preparation, and Canny's edge detector was used for locating the air-tissue boundaries. This technique revealed detailed morphology of the respiratory zone components, including terminal bronchioles and alveolar sacs, with a sufficiently high resolution of 1.74 µm isotropic voxel size. The technique enabled visual inspection of the respiratory zone components and comprehension of their relative positions in three dimensions. To check the method's feasibility for quantitative imaging, morphological parameters such as diameter, surface area and volume were measured and analyzed for sixteen randomly selected terminal branching units, each consisting of a terminal bronchiole and a pair of succeeding alveolar sacs. The four types of asymmetry ratios concerning alveolar sac mouth diameter, alveolar sac surface area, and alveolar sac volume were measured. This is the first reported measurement of the asymmetry ratio for terminal bronchioles and alveolar sacs, and it is noteworthy that an appreciable degree of branching asymmetry was observed among the alveolar sacs at the terminal end of the airway tree, even though the number of samples was small. The series of efficient techniques developed and confirmed in this study, from sample preparation to quantification, is expected to contribute to a wider and more exact application of phase contrast synchrotron radiation computed tomography to a variety of studies.
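
    One plausible form of the asymmetry ratio discussed above, as a sketch: the ratio of the smaller to the larger daughter alveolar-sac value (volume, surface area, or mouth diameter) within a terminal branching unit, where 1.0 means perfect symmetry. The paper's exact definition may differ.

        def asymmetry_ratio(value_sac_a: float, value_sac_b: float) -> float:
            """Smaller-to-larger ratio for a pair of sibling alveolar sacs."""
            return min(value_sac_a, value_sac_b) / max(value_sac_a, value_sac_b)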

  2. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation and use of VDS, which is used to manage large files effectively on the Cyber computer system. The purposes of the VDS installation are to increase virtual disk storage by utilizing magnetic tape, to assist users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  3. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  4. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  6. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  7. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been at a lower level as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, and we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS installation and its components are now deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  11. 48 CFR 750.7109-1 - Filing requests.

    Science.gov (United States)

    2010-10-01

    ... CONTRACT MANAGEMENT EXTRAORDINARY CONTRACTUAL ACTIONS Extraordinary Contractual Actions To Protect Foreign Policy Interests of the United States 750.7109-1 Filing requests. Any person (hereinafter called the...

  12. 28 CFR 10.5 - Incorporation of papers previously filed.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  13. Computer-aided design system for a complex of problems on calculation and analysis of engineering and economical indexes of NPP power units

    International Nuclear Information System (INIS)

    Stepanov, V.I.; Koryagin, A.V.; Ruzankov, V.N.

    1988-01-01

    A computer-aided design system for a complex of problems concerning the calculation and analysis of engineering and economical indices of NPP power units is described. The system includes means for the automated preparation and debugging of the database of the software complex, which realizes the plotted algorithm in the power unit control system. In addition, the system includes devices for the automated preparation and registration of technical documentation

  14. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, the data on all the reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The range of applications is wide, covering neutron engineering and shielding for fast reactors, thermal neutron reactors and nuclear fusion reactors; it is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As to their format, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α, n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and the gas production cross section file have already been completed; for the others, the expected time of completion is given. As these files are completed, they will be made available to the public. (K.I.)

  15. 75 FR 20849 - Notice of Agreements Filed

    Science.gov (United States)

    2010-04-21

    ... Shipping & Construction Co., Ltd. and United Abaco Shipping Company Limited. Filing Parties: Neal M. Mayer, Esq.; Hoppel, Mayer & Coleman; 1050 Connecticut Avenue NW., 10th Floor; Washington, DC 20036. Synopsis...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the availability of more sites so that they can participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  17. Application of an EPID for fast daily dosimetric quality control of a fully computer-controlled treatment unit

    International Nuclear Information System (INIS)

    Dirkx, M.L.P.; Kroonwijk, M.; De Boer, J.C.J.; Heijmen, B.J.M.

    1995-01-01

    The MM50 Racetrack Microtron, suited for sophisticated three-dimensional computer-controlled conformal radiotherapy techniques, is a complex treatment unit in various respects. Therefore, for a number of gantry angles, daily quality control of the absolute output and the profiles of the scanned photon beams is mandatory. A fast method for these daily checks, based on dosimetric measurements with the Philips SRI-100 Electronic Portal Imaging Device, has been developed and tested. Open beams are checked for four different gantry angles; for gantry angle 0, a wedged field is checked as well. The fields are set up one after another under full computer control. Performing and analyzing the measurements takes about ten minutes. The applied EPID has favourable characteristics for dosimetric quality control measurements: absolute measurements reproduce within 0.5% (1 SD) and the reproducibility of a relative (2-D) fluence profile is 0.2% (1 SD). The day-to-day sensitivity stability over a period of a month is 0.6% (1 SD). EPID signals are linear with the applied dose to within 0.2%. The 2-D fluence profile of the 25 MV photon beam of the MM50 is very stable in time: during a period of one year, a maximum fluctuation of 2.6% was observed. Once, a deviation in the cGy/MU value of 6% was detected. Only because of the morning quality control checks performed with the EPID could erroneous dose delivery to patients be avoided; there is no interlock in the MM50 system that would have prevented patient treatment. Based on our experiences and on clinical requirements regarding the acceptability of deviations of beam characteristics, a protocol has been developed including action levels for additional investigations. Studies on the application of the SRI-100 for in vivo dosimetry on the MM50 have been started

  18. Application of an EPID for fast daily dosimetric quality control of a fully computer-controlled treatment unit

    Energy Technology Data Exchange (ETDEWEB)

    Dirkx, M.L.P.; Kroonwijk, M.; De Boer, J.C.J.; Heijmen, B.J.M. [Nederlands Kanker Inst. 'Antoni van Leeuwenhoekhuis', Amsterdam (Netherlands)]

    1995-12-01

    The MM50 Racetrack Microtron, suited for sophisticated three-dimensional computer-controlled conformal radiotherapy techniques, is a complex treatment unit in various respects. Therefore, for a number of gantry angles, daily quality control of the absolute output and the profiles of the scanned photon beams is mandatory. A fast method for these daily checks, based on dosimetric measurements with the Philips SRI-100 Electronic Portal Imaging Device, has been developed and tested. Open beams are checked for four different gantry angles; for gantry angle 0, a wedged field is checked as well. The fields are set up one after another under full computer control. Performing and analyzing the measurements takes about ten minutes. The applied EPID has favourable characteristics for dosimetric quality control measurements: absolute measurements reproduce within 0.5% (1 SD) and the reproducibility of a relative (2-D) fluence profile is 0.2% (1 SD). The day-to-day sensitivity stability over a period of a month is 0.6% (1 SD). EPID signals are linear with the applied dose to within 0.2%. The 2-D fluence profile of the 25 MV photon beam of the MM50 is very stable in time: during a period of one year, a maximum fluctuation of 2.6% was observed. Once, a deviation in the cGy/MU value of 6% was detected. Only because of the morning quality control checks performed with the EPID could erroneous dose delivery to patients be avoided; there is no interlock in the MM50 system that would have prevented patient treatment. Based on our experiences and on clinical requirements regarding the acceptability of deviations of beam characteristics, a protocol has been developed including action levels for additional investigations. Studies on the application of the SRI-100 for in vivo dosimetry on the MM50 have been started.

  19. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available The study was carried out to detect changes in audio files using spectrographs. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of an audio file after altering it, to compare those changes with the spectrograph of the original file, and to check for similarities and differences between the MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV file by cut and copy, the file was opened in Audacity and a different audio segment was pasted into it; the resulting file was then analyzed to view the differences. By adjusting the necessary parameters the noise was reduced, and the differences between the new file and the original file were analyzed. After the necessary changes were made through the dialog box, the edited audio file was opened in the software named Spek, which after analysis produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to see the alterations.
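
    As an illustration of the comparison step, the sketch below computes spectrograms of an original and an edited recording and reports their mean level difference. This is a minimal sketch only, assuming Python with NumPy/SciPy and WAV input; the file names are hypothetical, and the study itself used Audacity and Spek rather than code.

      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import spectrogram

      def power_spectrogram(path):
          rate, samples = wavfile.read(path)
          if samples.ndim > 1:                      # mix stereo down to mono
              samples = samples.mean(axis=1)
          _, _, power = spectrogram(samples, fs=rate, nperseg=1024)
          return power

      def to_db(power):
          return 10.0 * np.log10(power + 1e-12)     # avoid log of zero

      original = power_spectrogram("original.wav")  # hypothetical file names
      edited = power_spectrogram("edited.wav")
      n = min(original.shape[1], edited.shape[1])   # align the time axes
      difference = np.abs(to_db(original[:, :n]) - to_db(edited[:, :n]))
      print("mean dB difference:", difference.mean())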

  20. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.

  1. The optimal parameter design for a welding unit of manufacturing industry by Taguchi method and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zahraee, S.M.; Chegeni, A.; Toghtamish, A.

    2016-07-01

    Manufacturing systems include a complicated combination of resources, such as materials, labor, and machines. Hence, when the manufacturing systems are faced with a problem related to the availability of resources it is difficult to identify the root of the problem accurately and effectively. Managers and engineers in companies are trying to achieve a robust production line based on the maximum productivity. The main goal of this paper is to design a robust production line, taking productivity into account in the selected manufacturing industry. This paper presents the application of the Taguchi method along with computer simulation for finding an optimum factor setting for three controllable factors, which are the number of welding machines, hydraulic machines, and cutting machines, by analyzing the effect of noise factors in a selected manufacturing industry. Based on the final results, the optimal design parameters of the welding unit in the selected manufacturing industry are obtained when factor A is located at level 2 and B and C are located at level 1. Therefore, maximum productive desirability is achieved when the number of welding machines, hydraulic machines, and cutting machines is equal to 17, 2, and 1, respectively. This paper has a significant role in designing a robust production line by considering the lowest cost and timely manner based on the Taguchi method. (Author)

  2. Economic Impacts of Potential Foot and Mouth Disease Agro-terrorism in the United States: A Computable General Equilibrium Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oladosu, Gbadebo A [ORNL; Rose, Adam [University of Southern California, Los Angeles; Bumsoo, Lee [University of Illinois

    2013-01-01

    The foot and mouth disease (FMD) virus has high agro-terrorism potential because it is contagious, can be easily transmitted via inanimate objects and can be spread by wind. An outbreak of FMD in developed countries results in massive slaughtering of animals (for disease control) and disruptions in meat supply chains and trade, with potentially large economic losses. Although the United States has been FMD-free since 1929, the potential of FMD as a deliberate terrorist weapon calls for estimates of the physical and economic damage that could result from an outbreak. This paper estimates the economic impacts of three alternative scenarios of potential FMD attacks using a computable general equilibrium (CGE) model of the US economy. The three scenarios range from a small outbreak successfully contained within a state to a large multi-state attack resulting in slaughtering of 30 percent of the national livestock. Overall, the value of total output losses in our simulations range between $37 billion (0.15% of 2006 baseline economic output) and $228 billion (0.92%). Major impacts stem from the supply constraint on livestock due to massive animal slaughtering. As expected, the economic losses are heavily concentrated in agriculture and food manufacturing sectors, with losses ranging from $23 billion to $61 billion in the two industries.

  3. The optimal parameter design for a welding unit of manufacturing industry by Taguchi method and computer simulation

    Directory of Open Access Journals (Sweden)

    Seyed Mojib Zahraee

    2016-05-01

    Full Text Available Purpose: Manufacturing systems include a complicated combination of resources, such as materials, labor, and machines. Hence, when the manufacturing systems are faced with a problem related to the availability of resources it is difficult to identify the root of the problem accurately and effectively. Managers and engineers in companies are trying to achieve a robust production line based on the maximum productivity. The main goal of this paper is to design a robust production line, taking productivity into account in the selected manufacturing industry. Design/methodology/approach: This paper presents the application of the Taguchi method along with computer simulation for finding an optimum factor setting for three controllable factors, which are the number of welding machines, hydraulic machines, and cutting machines, by analyzing the effect of noise factors in a selected manufacturing industry. Findings and Originality/value: Based on the final results, the optimal design parameters of the welding unit in the selected manufacturing industry are obtained when factor A is located at level 2 and B and C are located at level 1. Therefore, maximum productive desirability is achieved when the number of welding machines, hydraulic machines, and cutting machines is equal to 17, 2, and 1, respectively. This paper has a significant role in designing a robust production line by considering the lowest cost and timely manner based on the Taguchi method.

  4. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software needed to measure file system performance was installed on this computer. Based on the results, conclusions are drawn and recommendations are made for the use of each file system, identifying the best ways to store data.
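
    A minimal sketch of the kind of throughput measurement such a benchmark relies on is given below. It assumes Python on a POSIX system; the path, transfer size and block size are illustrative, not the article's actual benchmark settings.

      import os, time

      def write_throughput(path, total_mb=256, block_kb=1024):
          """Write total_mb of random data in block_kb chunks; return MB/s."""
          block = os.urandom(block_kb * 1024)
          start = time.perf_counter()
          with open(path, "wb") as f:
              for _ in range(total_mb * 1024 // block_kb):
                  f.write(block)
              f.flush()
              os.fsync(f.fileno())                  # include time to reach the disk
          elapsed = time.perf_counter() - start
          os.remove(path)
          return total_mb / elapsed

      print("sequential write: %.1f MB/s" % write_throughput("/tmp/fs_bench.bin"))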

  5. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
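
    The core idea, hashing a fixed number of randomly sampled blocks so that the cost stays flat in file size, can be sketched as follows. This is a simplified Python illustration, not the published pfff algorithm; the sample count, block size and choice of SHA-256 are assumptions.

      import hashlib, os, random

      def sampled_fingerprint(path, samples=1024, block=64, seed=0):
          """Hash `samples` randomly chosen blocks instead of the whole file,
          so the cost does not grow with file size (sketch of the PFFF idea)."""
          size = os.path.getsize(path)
          rng = random.Random(seed)       # fixed seed keeps fingerprints comparable
          digest = hashlib.sha256(str(size).encode())
          with open(path, "rb") as f:
              for _ in range(samples):
                  f.seek(rng.randrange(max(size - block, 1)))
                  digest.update(f.read(block))
          return digest.hexdigest()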

  6. Experience with a mobile data storage device for transfer of studies from the critical care unit to a central nuclear medicine computer

    International Nuclear Information System (INIS)

    Cradduck, T.D.; Driedger, A.A.

    1981-01-01

    The introduction of mobile scintillation cameras has enabled the more immediate provision of nuclear medicine services in areas remote from the central nuclear medicine laboratory. Since a large number of such studies involve the use of a computer for data analysis, the concurrent problem of how to transmit those data to the computer becomes critical. A device is described using hard magnetic discs as the recording media and which can be wheeled from the patient's bedside to the central computer for playback. Some initial design problems, primarily associated with the critical timing which is necessary for the collection of gated studies, were overcome and the unit has been in service for the past two years. The major limitations are the relatively small capacity of the discs and the fact that the data are recorded in list mode. These constraints result in studies having poor statistical validity. The slow turn-around time, which results from the necessity to transport the system to the department and replay the study into the computer before analysis can begin, is also of particular concern. The use of this unit has clearly demonstrated the very important role that nuclear medicine can play in the care of the critically ill patient. The introduction of a complete acquisition and analysis unit is planned so that prompt diagnostic decisions can be made available within the intensive care unit. (author)

  7. Effect of Computer Animation Technique on Students' Comprehension of the "Solar System and Beyond" Unit in the Science and Technology Course

    Science.gov (United States)

    Aksoy, Gokhan

    2013-01-01

    The purpose of this study is to determine the effect of computer animation technique on academic achievement of students in the "Solar System and Beyond" unit lecture as part of the Science and Technology course of the seventh grade in primary education. The sample of the study consists of 60 students attending to the 7th grade of primary school…

  8. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  9. Tax Unit Boundaries

    Data.gov (United States)

    Kansas Data Access and Support Center — The Statewide GIS Tax Unit boundary file was created through a collaborative partnership between the State of Kansas Department of Revenue Property Valuation...

  10. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  11. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  12. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
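
    The partitioning idea can be pictured with a toy key-value store in which each sub-file's (offset, length, location) record is owned by the partition selected by hashing its key. The Python sketch below keeps all partitions in one process, whereas the actual system distributes them across compute nodes over MPI; the names are illustrative, not the PLFS or MDHIM API.

      from dataclasses import dataclass

      @dataclass
      class SubFileMeta:
          logical_offset: int        # where this sub-file sits in the shared file
          length: int
          storage_object: str        # object-storage location of the actual bytes

      class PartitionedMetaStore:
          """Toy stand-in for a partitioned key-value store of sub-file metadata."""
          def __init__(self, n_partitions):
              self.partitions = [dict() for _ in range(n_partitions)]

          def _owner(self, key):
              # a real deployment needs a hash that is stable across nodes
              return hash(key) % len(self.partitions)

          def put(self, key, meta):
              self.partitions[self._owner(key)][key] = meta

          def get(self, key):
              return self.partitions[self._owner(key)].get(key)

      store = PartitionedMetaStore(n_partitions=4)
      store.put(("shared.out", 0), SubFileMeta(0, 4096, "oss1/obj17"))
      print(store.get(("shared.out", 0)))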

  13. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Peregrine has several classes of nodes that users access. Login Nodes: Peregrine has four login nodes, each of which has Intel E5 processors; in addition to the /scratch file systems, the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592

  14. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  15. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes basic graphics knowledge and the understanding and realization of graphic file formats. The first part deals with graphic data, the storage of graphic data and data compression, and programming topics such as assembly, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file and TIFF file, hardware considerations such as high-speed mono and color screen drivers, the basic concept of dithering, and format conversion.

  16. Predicting the stone composition of children preoperatively by Hounsfield unit detection on non-contrast computed tomography.

    Science.gov (United States)

    Altan, Mesut; Çitamak, Burak; Bozaci, Ali Cansu; Güneş, Altan; Doğan, Hasan Serkan; Haliloğlu, Mithat; Tekgül, Serdar

    2017-10-01

    Many studies have been performed on adult patients to reveal the relationship between Hounsfield unit (HU) value and stone composition, but none have focused on childhood. We aimed to predict stone composition from HU properties on pre-intervention non-contrast computed tomography (NCCT) in children. This could help to orient patients towards more successful interventions. Data of 94 children whose pre-intervention NCCT and post-interventional stone analysis were available were included. Stones were grouped into three groups: calcium oxalate (CaOx), cystine, and struvite. Besides the spot urine pH value, core HU, periphery HU, and Hounsfield density (HUD) values were measured and the groups were compared statistically. The mean age of patients was 7 ± 4 (2-17) years and the female/male ratio was 51/43. The mean stone size was 11.7 ± 5 (4-24) mm. There were 50, 38, and 6 patients in the CaOx, cystine, and struvite groups, respectively. The median values for core HU, periphery HU, and mean HU in the CaOx group were significantly higher than the corresponding median values in the cystine and struvite groups. A significant median HUD difference was seen only between the CaOx and cystine groups. No difference was seen between the cystine and struvite groups in terms of HU parameters. To distinguish these groups, mean spot urine pH values were compared and were found to be higher in the struvite group than in the cystine group (Table). The retrospective nature and small number of patients in some groups are limitations of this study, which also does not include all stone compositions. Our cystine stone rate was higher than the childhood stone composition distribution in the literature. This is because our center is a reference center in a region with high recurrence rates of cystine stones. In fact, the high number of cystine stones helped us to compare them with calcium stones more accurately and became an advantage for this study. NCCT at diagnosis can provide some information for

  17. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Full Text Available Video files store motion pictures and sound as in real life. In today's world, the need for automated processing of information in video files is increasing. Automated processing of information has a wide range of applications, including office/home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, the detection and tracking of object movement in video files plays an important role. This article describes methods of detecting objects in video files. Today, this problem in the field of computer vision is being studied worldwide.

  18. Design, Assembly, Integration, and Testing of a Power Processing Unit for a Cylindrical Hall Thruster, the NORSAT-2 Flatsat, and the Vector Gravimeter for Asteroids Instrument Computer

    Science.gov (United States)

    Svatos, Adam Ladislav

    This thesis describes the author's contributions to three separate projects. The bus of the NORSAT-2 satellite was developed by the Space Flight Laboratory (SFL) for the Norwegian Space Centre (NSC) and Space Norway. The author's contributions to the mission were performing unit tests for the components of all the spacecraft subsystems as well as designing and assembling the flatsat from flight spares. Gedex's Vector Gravimeter for Asteroids (VEGA) is an accelerometer for spacecraft. The author's contributions to this payload were modifying the instrument computer board schematic, designing the printed circuit board, developing and applying test software, and performing thermal acceptance testing of two instrument computer boards. The SFL's cylindrical Hall effect thruster combines the cylindrical configuration for a Hall thruster and uses permanent magnets to achieve miniaturization and low power consumption, respectively. The author's contributions were to design, build, and test an engineering model power processing unit.

  19. Use of computational methods for assessing the radiological status of units 1-4 of Kozloduy NPP. Evaluation of activated materials and computation of surface contamination

    International Nuclear Information System (INIS)

    Radovanov, P.

    2015-01-01

    For planning purposes, calculation methods are a good approach for predicting the amount of RAW and the radionuclide inventory, as well as dose rates from the equipment, so that personnel exposure is decreased to the minimum (even to 0). In the future, the development of computing software and hardware will result in even better predictions, contributing to more accurate planning of the decommissioning process

  20. 76 FR 52650 - Federal Energy Regulatory Commission Combined Notice of Filings #1

    Science.gov (United States)

    2011-08-23

    ... Depreciation Rate Update) to be effective 1/1/2012. Filed Date: 08/11/2011. Accession Number: 20110811-5114...)(iii: JEA Scherer Unit 4 TSA Amendment Filing (SEGCO Depreciation Rate Update) to be effective 1/1/2012...

  1. Semiempirical and DFT computations of the influence of Tb(III) dopant on unit cell dimensions of cerium(III) fluoride.

    Science.gov (United States)

    Shyichuk, Andrii; Runowski, Marcin; Lis, Stefan; Kaczkowski, Jakub; Jezierski, Andrzej

    2015-01-30

    Several computational methods, both semiempirical and ab initio, were used to study the influence of the amount of dopant on the crystal cell dimensions of CeF3 doped with Tb(3+) ions (CeF3:Tb(3+)). The AM1, RM1, PM3, PM6, and PM7 semiempirical parameterization models were used, while the Sparkle model was used to represent the lanthanide cations in all cases. Ab initio calculations were performed by means of GGA+U/PBE projector augmented wave density functional theory. The computational results agree well with the experimental data. According to both computation and experiment, the crystal cell parameters undergo a linear decrease with increasing amount of the dopant. The computations performed using the Sparkle/PM3 and DFT methods resulted in the best agreement with experiment, with an average deviation of about 1% in both cases. A typical Sparkle/PM3 computation on a 2×2×2 supercell of CeF3:Tb3+ took about two orders of magnitude less time than the DFT computation on a unit cell of this material. © 2014 Wiley Periodicals, Inc.

  2. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    International Nuclear Information System (INIS)

    Buesch, D.C.; Nelson, J.E.; Dickerson, R.P.; Drake, R.M. II; San Juan, C.A.; Spengler, R.W.; Geslin, J.K.; Moyer, T.C.

    1996-01-01

    Yucca Mountain, Nevada is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3° to 20° and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified-bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure-contour map from which isochore values of the subjacent model unit are subtracted to produce the structure contour map on the base of the unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model. Surfaces can be converted to form volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations
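
    The iterative cycle of subtracting isochore (thickness) grids from structure contour maps can be sketched as follows; Python with NumPy is assumed, and the grids are toy values rather than model data.

      import numpy as np

      def structure_contours(top_surface, isochores):
          """Subtract each unit's isochore grid from the structure-contour grid
          above it, yielding the base surface of each successive model unit."""
          surfaces = [top_surface]
          for thickness in isochores:       # ordered from the top unit downward
              surfaces.append(surfaces[-1] - thickness)
          return surfaces

      # Toy 2x2 elevation grids in metres; the real maps are interpolated
      # from the 75 boreholes mentioned above.
      base_tiva = np.array([[1200.0, 1180.0], [1150.0, 1130.0]])
      thicknesses = [np.array([[30.0, 28.0], [25.0, 24.0]]),
                     np.array([[110.0, 115.0], [120.0, 118.0]])]
      for i, surface in enumerate(structure_contours(base_tiva, thicknesses)):
          print("surface", i, surface.tolist())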

  3. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    National Research Council Canada - National Science Library

    Beal, Scott A

    2007-01-01

    Fifty-two leaders in the Basic Non-Commissioned Officer Course (BNCOC) at Fort Benning, Georgia, participated in an assessment of two desk-top computer simulations used to train tactical decision making...

  4. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated with use of radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis for an unknown case by entering the findings of the unknown case
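
    A nearest-neighbour index of this kind can be sketched by ranking the filed cases on the number of radiographic findings they share with the query case. The Python below is illustrative only; the findings vocabulary is hypothetical, not the paper's actual feature coding.

      def nearest_cases(query_findings, cases, k=5):
          """Return the k teaching-file cases sharing the most findings with
          the query; their diagnoses are then offered to the user for review."""
          def overlap(case):
              return len(set(case["findings"]) & set(query_findings))
          return sorted(cases, key=overlap, reverse=True)[:k]

      cases = [
          {"diagnosis": "rheumatoid arthritis",
           "findings": ["marginal erosions", "osteopenia", "symmetric involvement"]},
          {"diagnosis": "gout",
           "findings": ["erosions with overhanging edges", "tophi"]},
          {"diagnosis": "inflammatory osteoarthritis",
           "findings": ["central erosions", "osteophytes"]},
      ]
      for case in nearest_cases(["marginal erosions", "osteopenia"], cases, k=2):
          print(case["diagnosis"])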

  6. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is not an exception. In 2017, ATLAS steadily exploited almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed in geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used also by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and primarily supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources and extensive software re...

  7. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is based simply on factors such as the pool water level and the intervals for migration, so it lacks precise control over individual files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps the required files on disk for high energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tapes. In this paper we first give a brief introduction to the GRASS system and then present the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance, and a simple analysis is made based on them. (authors)
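
    The reservation mechanism can be pictured as a table consulted by the migration policy: files reserved by a user stay pinned on disk until the reservation expires. The Python sketch below is a toy illustration of that idea; the names and the expiry rule are assumptions, not taken from GFRS itself.

      import time

      class ReservationTable:
          """Toy file-reservation table checked before disk-to-tape migration."""
          def __init__(self):
              self._expiry = {}                 # file path -> expiry timestamp

          def reserve(self, path, days=7):
              self._expiry[path] = time.time() + days * 86400

          def migratable(self, path):
              return self._expiry.get(path, 0) < time.time()

      table = ReservationTable()
      table.reserve("/grass/user/run2010.root")
      candidates = ["/grass/user/run2010.root", "/grass/user/old.root"]
      print([p for p in candidates if table.migratable(p)])   # only old.root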

  8. Computer aided design of operational units for tritium recovery from Li17Pb83 blanket of a DEMO fusion reactor

    International Nuclear Information System (INIS)

    Malara, C.; Viola, A.

    1995-01-01

    The problem of tritium recovery from Li 17 Pb 83 blanket of a DEMO fusion reactor is analyzed with the objective of limiting tritium permeation into the cooling water to acceptable levels. To this aim, a mathematical model describing the tritium behavior in blanket/recovery unit circuit has been formulated. By solving the model equations, tritium permeation rate into the cooling water and tritium inventory in the blanket are evaluated as a function of dimensionless parameters describing the combined effects of overall resistance for tritium transfer from Li 17 Pb 83 alloy to cooling water, circulating rate of the molten alloy in blanket/recovery unit circuit and extraction efficiency of tritium recovery unit. The extraction efficiency is, in turn, evaluated as a function of the operating conditions of recovery unit. The design of tritium recovery unit is then optimized on the basis of the above parametric analysis and the results are herein reported and discussed for a tritium permeation limit of 10 g/day into the cooling water. 14 refs., 9 figs., 2 tabs

  9. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the JENDL special purpose files. The problems of using the ENSDF file as the primary data source for the JENDL decay data file are presented. (author)

  10. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  11. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The file distribution and the continuity character of file references are closely approximated by a memory retention function; a density function based on the memory retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
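
    The report does not give the functional form of the memory retention function; assuming a simple exponential decay, a migration policy built on it might look like the Python sketch below, with every parameter illustrative.

      import math, time

      def retention_weight(last_access, half_life_days=30.0, now=None):
          """Exponentially decaying weight: recently referenced files score
          near 1, long-idle files decay toward 0 and are migrated first."""
          now = time.time() if now is None else now
          age_days = (now - last_access) / 86400.0
          return math.exp(-math.log(2.0) * age_days / half_life_days)

      # Toy catalogue of path -> last access time; migrate lowest weights first.
      catalogue = {"a.dat": time.time() - 2 * 86400,
                   "b.dat": time.time() - 90 * 86400}
      order = sorted(catalogue, key=lambda p: retention_weight(catalogue[p]))
      print("migration order:", order)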

  12. The Dynamic Interplay between Spatialization of Written Units in Writing Activity and Functions of Tools on the Computer

    Science.gov (United States)

    Huh, Joo Hee

    2012-01-01

    I criticize the typewriting model and linear writing structure of Microsoft Word software for writing in the computer. I problematize bodily movement in writing that the error of the software disregards. In this research, writing activity is viewed as bodily, spatial and mediated activity under the premise of the unity of consciousness and…

  13. Demographics of undergraduates studying games in the United States: a comparison of computer science students and the general population

    Science.gov (United States)

    McGill, Monica M.; Settle, Amber; Decker, Adrienne

    2013-06-01

    Our study gathered data to serve as a benchmark of demographics of undergraduate students in game degree programs. Due to the high number of programs that are cross-disciplinary with computer science programs or that are housed in computer science departments, the data is presented in comparison to data from computing students (where available) and the US population. Participants included students studying games at four nationally recognized postsecondary institutions. The results of the study indicate that there is no significant difference between the ratio of men to women studying in computing programs or in game degree programs, with women being severely underrepresented in both. Women, blacks, Hispanics/Latinos, and heterosexuals are underrepresented compared to the US population. Those with moderate and conservative political views and with religious affiliations are underrepresented in the game student population. Participants agree that workforce diversity is important and that their programs are adequately diverse, but only one-half of the participants indicated that diversity has been discussed in any of their courses.

  14. Massively parallel signal processing using the graphics processing unit for real-time brain-computer interface feature extraction

    Directory of Open Access Journals (Sweden)

    J. Adam Wilson

    2009-07-01

    Full Text Available The clock speeds of modern computer processors have nearly plateaued in the past five years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card (GPU) was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a CPU-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
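
    The offloaded stage is essentially a dense matrix-matrix multiply per block of data. The sketch below uses the CuPy library as a stand-in for the paper's CUDA implementation and replaces the auto-regressive spectral estimate with a crude mean-square power, so it shows the data flow only; all sizes are illustrative.

      import numpy as np
      import cupy as cp            # assumed GPU array library (NumPy-like API)

      channels, samples, virtual_channels = 1000, 400, 16
      data = cp.asarray(np.random.randn(channels, samples).astype(np.float32))
      weights = cp.asarray(
          np.random.randn(virtual_channels, channels).astype(np.float32))

      filtered = weights @ data    # the spatial filter is a matrix-matrix multiply
      band_power = cp.mean(filtered ** 2, axis=1)   # crude stand-in for the AR PSD
      print(cp.asnumpy(band_power)[:4])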

  15. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, file archival system for DEC VAX, provides for easy offline storage and retrieval of arbitrary files on DEC VAX system. System designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of same programs and associated files.

  16. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    File Comparator program IFCOMP is text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
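
    In spirit the program produces a line-oriented difference listing; a rough modern analogue in Python is shown below (the exact pseudo-update output format of IFCOMP is not reproduced here).

      import difflib

      old = ["      SUBROUTINE INIT", "      I = 1", "      RETURN", "      END"]
      new = ["      SUBROUTINE INIT", "      I = 2", "      RETURN", "      END"]

      # A unified diff is a compact listing of differences, similar in spirit
      # to IFCOMP's pseudo-update form.
      for line in difflib.unified_diff(old, new, fromfile="old.f", tofile="new.f",
                                       lineterm=""):
          print(line)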

  17. New FORTRAN computer programs to acquire and process isotopic mass-spectrometric data

    International Nuclear Information System (INIS)

    Smith, D.H.

    1982-08-01

    The computer programs described in New Computer Programs to Acquire and Process Isotopic Mass Spectrometric Data have been revised. This report describes in some detail the operation of these programs, which acquire and process isotopic mass spectrometric data. Both functional and overall design aspects are addressed. The three basic program units - file manipulation, data acquisition, and data processing - are discussed in turn. Step-by-step instructions are included where appropriate, and each subsection is described in enough detail to give a clear picture of its function. Organization of file structure, which is central to the entire concept, is extensively discussed with the help of numerous tables. Appendices contain flow charts and outline file structure to help a programmer unfamiliar with the programs to alter them with a minimum of lost time

  18. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but also support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
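
    A recognizer for a simple chunk-based grammar (file -> chunk*, chunk -> tag length payload) can be sketched as below; the format is a toy one invented for illustration, and Python's struct module stands in for a generated parser.

      import struct

      def parse_chunks(blob):
          """Validate and parse a toy chunk-based format: each chunk is a
          4-byte ASCII tag, a little-endian uint32 length, then the payload."""
          chunks, offset = [], 0
          while offset < len(blob):
              tag, length = struct.unpack_from("<4sI", blob, offset)
              offset += 8
              payload = blob[offset:offset + length]
              if len(payload) != length:        # validation: truncated chunk
                  raise ValueError("truncated chunk %r" % tag)
              chunks.append((tag.decode("ascii"), payload))
              offset += length
          return chunks

      blob = b"HDR " + struct.pack("<I", 4) + b"\x01\x02\x03\x04"
      print(parse_chunks(blob))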

  19. Mobile phones and computer keyboards: unlikely reservoirs of multidrug-resistant organisms in the tertiary intensive care unit.

    Science.gov (United States)

    Smibert, O C; Aung, A K; Woolnough, E; Carter, G P; Schultz, M B; Howden, B P; Seemann, T; Spelman, D; McGloughlin, S; Peleg, A Y

    2018-03-02

    Few studies have used molecular epidemiological methods to study transmission links to clinical isolates in intensive care units. Ninety-four multidrug-resistant organisms (MDROs) cultured from routine specimens from intensive care unit (ICU) patients over 13 weeks were stored (11 meticillin-resistant Staphylococcus aureus (MRSA), two vancomycin-resistant enterococci and 81 Gram-negative bacteria). Medical staff personal mobile phones, departmental phones, and ICU keyboards were swabbed and cultured for MDROs; MRSA was isolated from two phones. Environmental and patient isolates of the same genus were selected for whole genome sequencing. On whole genome sequencing, the mobile phone isolates had a pairwise single nucleotide polymorphism (SNP) distance of 183. However, >15,000 core genome SNPs separated the mobile phone and clinical isolates. In a low-endemic setting, mobile phones and keyboards appear unlikely to contribute to hospital-acquired MDROs. Copyright © 2018 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  20. Functional needs which led to the use of digital computing devices in the protection system of 1300 MW units

    International Nuclear Information System (INIS)

    Dalle, H.

    1986-01-01

    After a review of the classical protection functions used in 900 MW power plants, it is concluded that, in order to have operating margins, it is useful to compute the controlled parameters more finely. These computing needs led to the use of digital computing devices. Taking advantage of the new possibilities, one can improve the general performance of the protection system with regard to availability, safety and maintenance. These options, in the case of PALUEL, led to the realization of SPIN, described here

  1. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  2. 48 CFR 750.7110-5 - Contract files.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Contract files. 750.7110-5 Section 750.7110-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACT... Interests of the United States 750.7110-5 Contract files. The fully executed action memorandum indicating...

  3. 19 CFR 210.53 - Motion filed after complaint.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Motion filed after complaint. 210.53 Section 210.53 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT Temporary Relief § 210.53 Motion filed after complaint. (a) A...

  4. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can be routinely compressed by a factor of 5. (orig.)
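
    The abstract does not spell out the algorithm, so the following is only an illustrative sketch of one lossless approach in the same spirit: delta-encode the channel counts, then apply a general-purpose compressor. The structure and the compression factor achieved are not the paper's.

    ```python
    # Lossless compression sketch for a nuclear spectrum (array of channel counts).
    # Delta encoding plus zlib is one illustrative approach, not the paper's method.
    import zlib, struct

    def compress_spectrum(counts):
        deltas = [counts[0]] + [b - a for a, b in zip(counts, counts[1:])]
        raw = struct.pack(f"<{len(deltas)}i", *deltas)   # 32-bit signed deltas
        return zlib.compress(raw, level=9)

    def decompress_spectrum(blob, n_channels):
        deltas = struct.unpack(f"<{n_channels}i", zlib.decompress(blob))
        counts, total = [], 0
        for d in deltas:
            total += d
            counts.append(total)
        return counts

    spectrum = [100 + (i % 7) for i in range(4096)]     # synthetic, smooth spectrum
    blob = compress_spectrum(spectrum)
    assert decompress_spectrum(blob, 4096) == spectrum  # no loss of information
    print(len(blob), "bytes vs", 4096 * 4)
    ```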

  5. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap
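
    As an illustration of what a standardized XML implant record might look like, here is a hypothetical sketch built with Python's standard library; the element names and fields are invented for illustration, not the paper's actual schema:

    ```python
    # Hypothetical XML implant record in the spirit of the standardized database
    # described above; all element and attribute names are illustrative only.
    import xml.etree.ElementTree as ET

    implant = ET.Element("implant", id="plate-001", status="active")
    ET.SubElement(implant, "manufacturer").text = "ExampleMed"            # assumed field
    geometry = ET.SubElement(implant, "geometry", cadFile="plate-001.stl")
    ET.SubElement(geometry, "lengthMm").text = "120"
    calibration = ET.SubElement(implant, "calibration")
    ET.SubElement(calibration, "referencePoint", x="0.0", y="4.5", z="12.0")

    print(ET.tostring(implant, encoding="unicode"))
    ```

    Such records can then be added, revised, deleted, exported, or backed up by the management tool without touching the CAS applications that consume them.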

  6. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
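
    To make the idea of signed status messages concrete, the following is a generic sketch in which the storage system authenticates its size/checksum report so the catalogue need not trust the client; AliEn's actual envelope format and key handling differ, and the shared-secret HMAC here is only illustrative:

    ```python
    # Generic sketch of a signed status message: the storage system reports a
    # file's size and checksum, authenticated with a shared-secret HMAC. This
    # is not AliEn's real protocol, just the underlying pattern.
    import hmac, hashlib, json

    SHARED_KEY = b"catalogue-storage-secret"   # assumed out-of-band shared secret

    def sign_status(size: int, checksum: str) -> dict:
        msg = json.dumps({"size": size, "md5": checksum}, sort_keys=True).encode()
        return {"msg": msg, "sig": hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()}

    def verify_status(report: dict) -> dict:
        expected = hmac.new(SHARED_KEY, report["msg"], hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, report["sig"]):
            raise ValueError("status message failed authentication")
        return json.loads(report["msg"])

    report = sign_status(1048576, "d41d8cd98f00b204e9800998ecf8427e")
    print(verify_status(report))   # trusted size/checksum for the catalogue
    ```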

  7. 78 FR 54713 - Self-Regulatory Organizations; ICE Clear Europe Limited; Notice of Filing and Immediate...

    Science.gov (United States)

    2013-09-05

    ... filing, the United Kingdom, Mexico, Ireland, Switzerland, Spain, Norway, Denmark, Italy, and Germany have... condition of continuing to do business. Customary legal agreements in the financial services industry...

  8. DJFS: Providing Highly Reliable and High‐Performance File System with Small‐Sized

    Directory of Open Access Journals (Sweden)

    Junghoon Kim

    2017-11-01

    Full Text Available File systems and applications try to implement their own update protocols to guarantee data consistency, which is one of the most crucial aspects of computing systems. However, we found that storage devices are substantially under-utilized when preserving data consistency, because doing so generates massive storage write traffic with many disk cache flush operations and force-unit-access (FUA) commands. In this paper, we present DJFS (Delta-Journaling File System), which provides both a high level of performance and data consistency for different applications. We made three technical contributions to achieve our goal. First, to remove all storage accesses with disk cache flush operations and FUA commands, DJFS uses small-sized NVRAM for the file system journal. Second, to reduce the access latency and space requirements of NVRAM, DJFS journals the compressed differences of the modified blocks. Finally, to relieve explicit checkpointing overhead, DJFS aggressively reflects the checkpoint transactions to the file system area in units of a specified region. Our evaluation on the TPC-C SQLite benchmark shows that, using our novel optimization schemes, DJFS outperforms Ext4 by up to 64.2 times with only 128 MB of NVRAM.
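
    A minimal sketch of the delta-journaling idea, assuming an XOR-plus-compression delta; DJFS's actual on-NVRAM layout is not described in the abstract, but the sketch shows why small in-place updates shrink journal traffic:

    ```python
    # Instead of writing a whole modified block to the journal, store the
    # compressed difference against the old block. XOR of old and new is mostly
    # zeros when the change is small, so it compresses very well.
    import zlib

    BLOCK = 4096

    def delta_journal_entry(old: bytes, new: bytes) -> bytes:
        xor = bytes(a ^ b for a, b in zip(old, new))
        return zlib.compress(xor)

    def apply_entry(old: bytes, entry: bytes) -> bytes:
        xor = zlib.decompress(entry)
        return bytes(a ^ b for a, b in zip(old, xor))

    old = bytes(BLOCK)
    new = bytearray(old); new[100:108] = b"UPDATED!"   # small in-place update
    entry = delta_journal_entry(old, bytes(new))
    assert apply_entry(old, entry) == bytes(new)
    print(f"journal entry: {len(entry)} bytes vs full block: {BLOCK}")
    ```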

  9. Tax_Units_2011_Final

    Data.gov (United States)

    Kansas Data Access and Support Center — The Statewide GIS Tax Unit boundary file was created through a collaborative partnership between the State of Kansas Department of Revenue Property Valuation...

  10. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  11. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of the ubiquitous MPI in cluster computing environments and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.
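
    As a simplified stand-in for the work-stealing MPI design, the following sketch fans per-file checksum jobs out to a local process pool; it shows the embarrassingly parallel core of the task, not pcircle's actual implementation:

    ```python
    # Parallel per-file checksumming with a process pool. pcircle distributes
    # the same work over MPI ranks with work stealing; this is only the local
    # analogue of that idea.
    import hashlib
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def file_sha1(path: str) -> tuple:
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                h.update(chunk)
        return path, h.hexdigest()

    if __name__ == "__main__":
        paths = [str(p) for p in Path(".").rglob("*") if p.is_file()]
        with ProcessPoolExecutor() as pool:
            for path, digest in pool.map(file_sha1, paths):
                print(digest, path)
    ```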

  12. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology, creating the need for worldwide sharing of an underlying hierarchical file-system. The Andrew File System (AFS) is a well-known and widely used global distributed file-system. Multiple-Resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS may therefore be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from supercomputer applications. Experiences and scalability issues are discussed

  13. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  14. Prostate contouring uncertainty in megavoltage computed tomography images acquired with a helical tomotherapy unit during image-guided radiation therapy

    International Nuclear Information System (INIS)

    Song, William Y.; Chiu, Bernard; Bauman, Glenn S.; Lock, Michael; Rodrigues, George; Ash, Robert; Lewis, Craig; Fenster, Aaron; Battista, Jerry J.; Van Dyk, Jake

    2006-01-01

    Purpose: To evaluate the image-guidance capabilities of megavoltage computed tomography (MVCT), this article compares the interobserver and intraobserver contouring uncertainty in kilovoltage computed tomography (KVCT) used for radiotherapy planning with MVCT acquired with a helical tomotherapy unit. Methods and Materials: Five prostate-cancer patients were evaluated. Each patient underwent a KVCT and an MVCT study, a total of 10 CT studies. For interobserver variability analysis, four radiation oncologists, one physicist, and two radiation therapists (seven observers in total) contoured the prostate and seminal vesicles (SV) in the 10 studies. The intraobserver variability was assessed by asking all observers to repeat the contouring of 1 patient's KVCT and MVCT studies. Quantitative analysis of contour variations was performed by use of volumes and radial distances. Results: The interobserver and intraobserver contouring uncertainty was larger in MVCT compared with KVCT. Observers consistently segmented larger volumes on MVCT, where the ratios of average prostate and SV volumes were 1.1 and 1.2, respectively. On average (interobserver and intraobserver), the local delineation variability, in terms of standard deviations [Δσ = √(σ²_MVCT − σ²_KVCT)], increased by 0.32 cm from KVCT to MVCT. Conclusions: Although MVCT was inferior to KVCT for prostate delineation, the application of MVCT in prostate radiotherapy remains useful

  15. Polyhedral meshing as an innovative approach to computational domain discretization of a cyclone in a fluidized bed CLC unit

    Directory of Open Access Journals (Sweden)

    Sosnowski Marcin

    2017-01-01

    Full Text Available Chemical Looping Combustion (CLC) is a technology that allows the separation of CO2, which is generated by the combustion of fossil fuels. The majority of process designs currently under investigation are systems of coupled fluidized beds. Advances in the development of power generation systems using CLC cannot be introduced without using numerical modelling as a research tool. The primary and critical activity in numerical modelling is the computational domain discretization. It influences the numerical diffusion as well as the convergence of the model and therefore the overall accuracy of the obtained results. Hence an innovative approach to computational domain discretization using a polyhedral (POLY) mesh is proposed in the paper. This method reduces both the numerical diffusion of the mesh and the time cost of preparing the model for subsequent calculation. The major advantage of the POLY mesh is that each individual cell has many neighbours, so gradients can be much better approximated in comparison to the commonly used tetrahedral (TET) mesh. POLYs are also less sensitive to stretching than TETs, which results in better numerical stability of the model. Therefore a detailed comparison of numerical modelling results concerning a subsection of the CLC system using tetrahedral and polyhedral meshes is covered in the paper.

  16. Making Friends in Dark Shadows: An Examination of the Use of Social Computing Strategy Within the United States Intelligence Community Since 9/11

    Directory of Open Access Journals (Sweden)

    Andrew Chomik

    2011-01-01

    Full Text Available The tragic events of 9/11/2001 in the United States highlighted failures in communication and cooperation in the U.S. intelligence community. Agencies within the community failed to “connect the dots” by not collaborating in intelligence gathering efforts, which resulted in severe gaps in data sharing that eventually contributed to the terrorist attack on American soil. Since then, and under the recommendation made by the 9/11 Commission Report, the United States intelligence community has made organizational and operational changes to intelligence gathering and sharing, primarily with the creation of the Office of the Director of National Intelligence (ODNI). The ODNI has since introduced a series of web-based social computing tools to be used by all members of the intelligence community, primarily its closed-access wiki entitled “Intellipedia” and its social networking service called “A-Space”. This paper argues that, while these and other social computing tools have been adopted successfully into the intelligence workplace, they have reached a plateau in their use and serve only as complementary tools to otherwise pre-existing information sharing processes. Agencies continue to ‘stove-pipe’ their respective data, a chronic challenge that plagues the community due to bureaucratic policy, technology use and workplace culture. This paper identifies and analyzes these challenges, and recommends improvements in the use of these tools, both in the business processes behind them and the technology itself. These recommendations aim to provide possible solutions for using these social computing tools as part of a more trusted, collaborative information sharing process.

  17. Development of new process network for gas chromatograph and analyzers connected with SCADA system and Digital Control Computers at Cernavoda NPP Unit 1

    International Nuclear Information System (INIS)

    Deneanu, Cornel; Popa Nemoiu, Dragos; Nica, Dana; Bucur, Cosmin

    2007-01-01

    The continuous monitoring of the concentrations of gas mixtures (deuterium/hydrogen/oxygen/nitrogen) accumulated in the 'Moderator Cover Gas', 'Liquid Control Zone' and 'Heat Transport D2O Storage Tank Cover Gas', as well as the continuous monitoring of the Heavy Water into Light Water concentration in the 'Boilers Steam', 'Boilers Blown Down', 'Moderator heat exchangers' and 'Recirculated Water System', sensing any leaks at Cernavoda NPP U1, led to the requirement of developing a new process network for the gas chromatograph and analyzers connected to the SCADA system and Digital Control Computers of Cernavoda NPP Unit 1. In 2005, the process network for the gas chromatograph was designed and implemented, connecting the gas chromatograph equipment to the SCADA system and Digital Control Computers of Cernavoda NPP Unit 1. This process network was later extended to connect the AE13 and AE14 Fourier Transform Infrared (FTIR) analyzers as well. The gas chromatograph equipment measures with high accuracy the concentrations of the gas mixture (deuterium/hydrogen/oxygen/nitrogen). The Fourier Transform Infrared (FTIR) AE13 and AE14 analyzers measure the Heavy Water into Light Water concentration in the Boilers Steam, Boilers Blown Down, Moderator heat exchangers and Recirculated Water System, monitoring and signaling any leaks. The gas chromatograph equipment and the FTIR AE13 and AE14 analyzers use the new OPC (Object Linking and Embedding for Process Control) technologies available in ABB's VistaNet network for interoperability with automation equipment. This new process network has interconnected the ABB chromatograph and the FTIR analyzers with the plant Digital Control Computers using new technology. The result was increased reliability and capability for inspection and improved system safety.

  18. 76 FR 34811 - Notice of Renewal Charter and Filing Letters

    Science.gov (United States)

    2011-06-14

    ... Government Entities (ACT). The renewal charter was filed on June 3, 2011, with the Committee on Finance of the United States Senate, the Committee on Ways and Means of the U.S. House of Representatives, and the...

  19. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  20. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data to offer solutions for improving patient care.

  1. Computational procedure of optimal inventory model involving controllable backorder rate and variable lead time with defective units

    Science.gov (United States)

    Lee, Wen-Chuan; Wu, Jong-Wuu; Tsou, Hsin-Hui; Lei, Chia-Ling

    2012-10-01

    This article considers that the number of defective units in an arrival order is a binomial random variable. We derive a modified mixture inventory model with backorders and lost sales, in which the order quantity and lead time are decision variables. In our studies, we also assume that the backorder rate is dependent on the length of lead time through the amount of shortages, and we let the backorder rate be a control variable. In addition, we assume that the lead time demand follows a mixture of normal distributions; we then relax the assumption about the form of the mixture of distribution functions of the lead time demand and apply the minimax distribution-free procedure to solve the problem. Furthermore, we develop an algorithmic procedure to obtain the optimal ordering strategy for each case. Finally, three numerical examples are given to illustrate the results.
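
    For orientation, the assumed lead-time demand distribution can be written as a two-component normal mixture; the weight p and component parameters below are generic symbols, not values from the paper:

    ```latex
    % Lead-time demand X as a two-component mixture of normals, as assumed in
    % the abstract; p, \mu_i, \sigma_i are generic symbols.
    F_X(x) = p \, \Phi\!\left(\frac{x - \mu_1}{\sigma_1}\right)
           + (1 - p) \, \Phi\!\left(\frac{x - \mu_2}{\sigma_2}\right),
    \qquad 0 \le p \le 1
    ```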

  2. The impact of increased efficiency in the industrial use of energy: A computable general equilibrium analysis for the United Kingdom

    International Nuclear Information System (INIS)

    Allan, Grant; Hanley, Nick; McGregor, Peter; Swales, Kim; Turner, Karen

    2007-01-01

    The conventional wisdom is that improving energy efficiency will lower energy use. However, there is an extensive debate in the energy economics/policy literature concerning 'rebound' effects. These occur because an improvement in energy efficiency produces a fall in the effective price of energy services. The response of the economic system to this price fall at least partially offsets the expected beneficial impact of the energy efficiency gain. In this paper we use an economy-energy-environment computable general equilibrium (CGE) model for the UK to measure the impact of a 5% across-the-board improvement in the efficiency of energy use in all production sectors. We identify rebound effects of the order of 30-50%, but no backfire (no increase in energy use). However, these results are sensitive to the assumed structure of the labour market, key production elasticities, the time period under consideration and the mechanism through which increased government revenues are recycled back to the economy.

  3. PURDU-WINCOF: A computer code for establishing the performance of a fan-compressor unit with water ingestion

    Science.gov (United States)

    Leonardo, M.; Tsuchiya, T.; Murthy, S. N. B.

    1982-01-01

    A model for predicting the performance of a multi-spool axial-flow compressor with a fan during operation with water ingestion was developed incorporating several two-phase fluid flow effects as follows: (1) ingestion of water, (2) droplet interaction with blades and resulting changes in blade characteristics, (3) redistribution of water and water vapor due to centrifugal action, (4) heat and mass transfer processes, and (5) droplet size adjustment due to mass transfer and mechanical stability considerations. A computer program, called the PURDU-WINCOF code, was generated based on the model utilizing a one-dimensional formulation. An illustrative case serves to show the manner in which the code can be utilized and the nature of the results obtained.

  4. United States Adolescents' Television, Computer, Videogame, Smartphone, and Tablet Use: Associations with Sugary Drinks, Sleep, Physical Activity, and Obesity.

    Science.gov (United States)

    Kenney, Erica L; Gortmaker, Steven L

    2017-03-01

    To quantify the relationships between youth use of television (TV) and other screen devices, including smartphones and tablets, and obesity risk factors. TV and other screen device use, including smartphones, tablets, computers, and/or videogames, was self-reported by a nationally representative, cross-sectional sample of 24 800 US high school students (2013-2015 Youth Risk Behavior Surveys). Students also reported on health behaviors including sugar-sweetened beverage (SSB) intake, physical activity, sleep, and weight and height. Sex-stratified logistic regression models, adjusting for the sampling design, estimated associations between TV and other screen device use and SSB intake, physical activity, sleep, and obesity. Approximately 20% of participants used other screen devices for ≥5 hours daily. Watching TV ≥5 hours daily was associated with daily SSB consumption (aOR = 2.72, 95% CI: 2.23, 3.32) and obesity (aOR = 1.78, 95% CI: 1.40, 2.27). Using other screen devices ≥5 hours daily was associated with daily SSB consumption (aOR = 1.98, 95% CI: 1.69, 2.32), inadequate physical activity (aOR = 1.94, 95% CI: 1.69, 2.25), and inadequate sleep (aOR = 1.79, 95% CI: 1.54, 2.08). Using smartphones, tablets, computers, and videogames is associated with several obesity risk factors. Although further study is needed, families should be encouraged to limit both TV viewing and newer screen devices. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field-format BCD data into well-defined binary files and providing for printing and punching the data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables

  6. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key...
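
    A generic hybrid-encryption sketch of this idea, using the Python cryptography package: file bytes are encrypted with a per-file symmetric key, which is wrapped under a recipient's public key so the untrusted cache host only ever sees ciphertext. This is the standard pattern, not CryptoCache's actual protocol:

    ```python
    # Hybrid encryption for caching a file on an untrusted host: symmetric
    # encryption of the content, public-key wrapping of the per-file key.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    file_key = Fernet.generate_key()                      # per-file symmetric key
    ciphertext = Fernet(file_key).encrypt(b"working file contents")
    wrapped_key = recipient_key.public_key().encrypt(
        file_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # (ciphertext, wrapped_key) can now be stored on the untrusted cache server.

    unwrapped = recipient_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    assert Fernet(unwrapped).decrypt(ciphertext) == b"working file contents"
    ```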

  7. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  8. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that the HDF4 file content map can be used to organize data efficiently in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require the HDF4 APIs to access the data.
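
    A sketch of how a content map enables access without the HDF4 APIs: the map records each dataset's byte offset, length, type and shape, so a plain byte-range read suffices. The map entries and dataset below are hypothetical:

    ```python
    # Reading a dataset via a file content map instead of the HDF4 library.
    # The map dict stands in for the XML content map; entries are invented.
    import numpy as np

    content_map = {
        "Temperature": {"offset": 2048, "length": 720 * 360 * 4,
                        "dtype": "<f4", "shape": (360, 720)},
    }

    def read_dataset(path: str, name: str) -> np.ndarray:
        entry = content_map[name]
        with open(path, "rb") as f:        # an object store would use a range GET
            f.seek(entry["offset"])
            raw = f.read(entry["length"])
        return np.frombuffer(raw, dtype=entry["dtype"]).reshape(entry["shape"])
    ```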

  9. 37 CFR 1.251 - Unlocatable file.

    Science.gov (United States)

    2010-07-01

    ..., patent, or other patent-related proceeding after a reasonable search, the Office will notify the... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Unlocatable file. 1.251 Section 1.251 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF...

  10. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  11. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  12. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer-based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12-bit) character set, or punched cards.

  13. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to scientists to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  14. Portable Brain-Computer Interface for the Intensive Care Unit Patient Communication Using Subject-Dependent SSVEP Identification.

    Science.gov (United States)

    Dehzangi, Omid; Farooq, Muhamed

    2018-01-01

    A major predicament for Intensive Care Unit (ICU) patients is inconsistent and ineffective means of communication. Patients rated most communication sessions as difficult and unsuccessful. This, in turn, can cause distress, unrecognized pain, anxiety, and fear. As such, we designed a portable BCI system for ICU communications (BCI4ICU) optimized to operate effectively in an ICU environment. The system utilizes a wearable EEG cap coupled with an Android app designed on a mobile device that serves as the visual stimulus and data processing module. Furthermore, to overcome the challenges that BCI systems face today in real-world scenarios, we propose a novel subject-specific Gaussian Mixture Model- (GMM-) based training and adaptation algorithm. First, we incorporate subject-specific information in the training phase of the SSVEP identification model using GMM-based training and adaptation. We evaluate subject-specific models against other subjects. Subsequently, from the GMM discriminative scores, we generate the transformed vectors, which are passed to our predictive model. Finally, the adapted mixture mean scores of the subject-specific GMMs are utilized to generate the high-dimensional supervectors. Our experimental results demonstrate that the proposed system achieved 98.7% average identification accuracy, which is promising in order to provide effective and consistent communication for patients in intensive care.
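
    A rough sketch of the subject-specific GMM step using scikit-learn; the feature dimensions, component count, and adaptation details are placeholders rather than the paper's tuned pipeline:

    ```python
    # Fit a mixture on one subject's SSVEP feature vectors, score epochs, and
    # concatenate the mixture means into a supervector for a downstream
    # classifier. All sizes here are illustrative.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    features = rng.normal(size=(500, 8))     # stand-in for per-epoch EEG features

    gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
    gmm.fit(features)

    scores = gmm.score_samples(features)     # discriminative scores per epoch
    supervector = gmm.means_.ravel()         # 4 components x 8 dims -> 32-d vector
    print(scores.shape, supervector.shape)
    ```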

  15. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap in access times between memory and disk. To solve this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks to significantly improve the accuracy, success-per-reference, and effective-success-rate-per-reference, by using a neural-network-based file access predictor with proper tuning. In particular, we verified that the misprediction rate was reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference using a standard configuration. Simulations on distributed file system (DFS) traces reveal that exact-fit radial basis function (RBF) networks give better prediction in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms in systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources to deal with, and the former predictor is more efficient than the latter for servers having maximum system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better success rate of file prediction than simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
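
    A toy version of such a predictor, using a scikit-learn multilayer perceptron to predict the next file ID from the last two accesses; the window size, encoding, and network shape are illustrative only, not the configurations tuned in the paper:

    ```python
    # Next-file prediction from a short history window on a synthetic trace.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    trace = [0, 1, 2, 0, 1, 2, 0, 1, 2, 3, 0, 1, 2, 3] * 20   # synthetic access trace
    X = np.array([trace[i:i + 2] for i in range(len(trace) - 2)])
    y = np.array([trace[i + 2] for i in range(len(trace) - 2)])

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X[:-20], y[:-20])
    print("held-out accuracy:", clf.score(X[-20:], y[-20:]))
    ```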

  16. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  17. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  18. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  19. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid in the creation of new radiology teaching files by analysing present radiology teaching file sites on the Internet in many aspects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites: (1) Country distribution was the highest in USA (57.5%). (2) The average number of cases was 186, and 9 sites (22.5%) had a search engine. (3) In the method of case arrangement, the anatomic-area type and the diagnosis type were each found at 10 sites (25%), and the question-and-answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites with oro-maxillofacial disorders numbered 9 (22.5%). (5) In image format, the GIF format was found at 14 sites (35%) and the JPEG format at 14 sites (35%). (6) The creation year was most commonly 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files: (1) The average file size of the GIF format (71 Kbyte) was greater than that of the JPEG format (24 Kbyte). (P<0.001) (2) The image quality of the GIF format was better than that of the JPEG format. (P<0.001)

  20. Mode selectivity in the intramolecular cyclization of ketenimines bearing N-acylimino units: a computational and experimental study.

    Science.gov (United States)

    Alajarín, Mateo; Sánchez-Andrada, Pilar; Vidal, Angel; Tovar, Fulgencio

    2005-02-18

    [reaction: see text] The mode selectivity in the intramolecular cyclization of a particular class of ketenimines bearing N-acylimino units has been studied by ab initio and DFT calculations. In the model compounds the carbonyl carbon atom and the keteniminic nitrogen atom are linked either by a vinylic or an o-phenylene tether. Two cyclization modes have been analyzed: the [2+2] cycloaddition furnishing compounds with an azeto[2,1-b]pyrimidinone moiety and a 6pi-electrocyclic ring closure leading to compounds enclosing a 1,3-oxazine ring. The [2+2] cycloaddition reaction takes place via a two-step process with formation of a zwitterionic intermediate, which has been characterized as a cross-conjugated mesomeric betaine. The 6pi-electrocyclic ring closure occurs via a transition state whose pseudopericyclic character has been established on the basis of its magnetic properties, geometry, and NBO analysis. The 6pi-electrocyclic ring closure is energetically favored over the [2+2] cycloaddition, although the [2+2] cycloadducts are the thermodynamically controlled products. A quantitative kinetic analysis predicts that 1,3-oxazines would be the kinetically controlled products, but they should transform rapidly and totally into the [2+2] cycloadducts at room temperature. In the experimental study, a number of N-acylimino-ketenimines, in which both reactive functions are supported on an o-phenylene scaffold, have been successfully synthesized in three steps starting from 2-azidobenzoyl chloride. These compounds rapidly convert into azeto[2,1-b]quinazolin-8-ones in moderate to good yields as a result of a formal [2+2] cycloaddition.

  1. Design of a linear detector array unit for high energy x-ray helical computed tomography and linear scanner

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jeong Tae; Park, Jong Hwan; Kim, Gi Yoon [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of); Kim, Dong Geun [Medical Imaging Department, ASTEL Inc., Seongnam (Korea, Republic of); Park, Shin Woong; Yi, Yun [Dept. of Electronics and Information Eng, Korea University, Seoul (Korea, Republic of); Kim, Hyun Duk [Research Center, Luvantix ADM Co., Ltd., Daejeon (Korea, Republic of)

    2016-11-15

    A linear detector array unit (LDAU) was proposed and designed for high-energy X-ray 2-D and 3-D imaging systems for industrial non-destructive testing. Especially for 3-D imaging, a helical CT with a 15 MeV linear accelerator and a curved detector is proposed. The arc-shaped detector can be formed by many LDAUs, all of which are arranged to face the focal spot when the source-to-detector distance is fixed depending on the application. An LDAU is composed of 10 modules, and each module has 48 channels of CdWO4 (CWO) blocks and Si PIN photodiodes with 0.4 mm pitch. This modular design was made for easy manufacturing and maintenance. Through Monte Carlo simulation, the CWO detector thickness of 17 mm was optimally determined. The silicon PIN photodiodes were designed as 48-channel arrays and fabricated with NTD (neutron transmutation doping) wafers of high resistivity, and showed excellent leakage current properties below 1 nA at 10 V reverse bias. To minimize low-voltage breakdown, the edges of the active layer and the guard ring were designed with a curved shape. The data acquisition system was also designed and fabricated as three independent functional boards: a sensor board, a capture board and a communication board to a PC. This paper describes the design of the detectors (CWO blocks and Si PIN photodiodes) and the 3-board data acquisition system with their simulation results.

  2. Comparison of adult and child radiation equivalent doses from 2 dental cone-beam computed tomography units.

    Science.gov (United States)

    Al Najjar, Anas; Colosi, Dan; Dauer, Lawrence T; Prins, Robert; Patchell, Gayle; Branets, Iryna; Goren, Arthur D; Faber, Richard D

    2013-06-01

    With the advent of cone-beam computed tomography (CBCT) scans, there has been a transition toward these scans' replacing traditional radiographs for orthodontic diagnosis and treatment planning. Children represent a significant proportion of orthodontic patients. Similar CBCT exposure settings are predicted to result in higher equivalent doses to the head and neck organs in children than in adults. The purpose of this study was to measure the difference in equivalent organ doses from different scanners under similar settings in children compared with adults. Two phantom heads were used, representing a 33-year-old woman and a 5-year-old boy. Optically stimulated dosimeters were placed at 8 key head and neck organs, and equivalent doses to these organs were calculated after scanning. The manufacturers' predefined exposure settings were used. One scanner had a pediatric preset option; the other did not. Scanning the child's phantom head with the adult settings resulted in significantly higher equivalent radiation doses to children compared with adults, ranging from a 117% average ratio of equivalent dose to 341%. Readings at the cervical spine level were decreased significantly, down to 30% of the adult equivalent dose. When the pediatric preset was used for the scans, there was a decrease in the ratio of equivalent dose to the child mandible and thyroid. CBCT scans with adult settings on both phantom heads resulted in higher radiation doses to the head and neck organs in the child compared with the adult. In practice, this might result in excessive radiation to children scanned with default adult settings. Collimation should be used when possible to reduce the radiation dose to the patient. While CBCT scans offer a valuable tool, use of CBCT scans should be justified on a specific case-by-case basis. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  3. Computer says 2.5 litres--how best to incorporate intelligent software into clinical decision making in the intensive care unit?

    Science.gov (United States)

    Lane, Katie; Boyd, Owen

    2009-01-01

    What will be the role of the intensivist when computer-assisted decision support reaches maturity? Celi's group reports that Bayesian theory can predict a patient's fluid requirement on day 2 in 78% of cases, based on data collected on day 1 and the known associations between those data, based on observations in previous patients in their unit. There are both advantages and limitations to the Bayesian approach, and this test study identifies areas for improvement in future models. Although such models have the potential to improve diagnostic and therapeutic accuracy, they must be introduced judiciously and locally to maximize their effect on patient outcome. Efficacy is thus far undetermined, and these novel approaches to patient management raise new challenges, not least medicolegal ones.

  4. A COMPUTATIONAL FLUID DYNAMICS ANALYSIS OF AIR FLOW THROUGH A TELECOM BACK-UP UNIT POWERED BY AN AIR-COOLED PROTON EXCHANGE MEMBRANE FUEL CELL

    DEFF Research Database (Denmark)

    Gao, Xin; Berning, Torsten; Kær, Søren Knudsen

    2016-01-01

    Proton exchange membrane fuel cells (PEMFCs) are currently being commercialized for various applications ranging from automotive to stationary, such as powering telecom back-up units. In PEMFCs, oxygen from air is internally combined with hydrogen to form water and produce electricity and heat. This product heat has to be effectively removed from the fuel cell, and while automotive fuel cells are usually liquid-cooled using a secondary coolant loop similar to internal combustion engines, stationary fuel cell systems as used for telecom back-up applications often rely on excess air fed to the fuel cell cathode to remove the heat. Thereby, the fuel cell system is much simpler and cheaper, while the fuel cell performance is substantially lower compared to automotive fuel cells. This work presents a computational fluid dynamics analysis of the heat management of an air-cooled fuel cell powered

  5. Internal fit of three-unit fixed dental prostheses produced by computer-aided design/computer-aided manufacturing and the lost-wax metal casting technique assessed using the triple-scan protocol.

    Science.gov (United States)

    Dahl, Bjørn E; Dahl, Jon E; Rønold, Hans J

    2018-02-01

    Suboptimal adaptation of fixed dental prostheses (FDPs) can lead to technical and biological complications. It is unclear if the computer-aided design/computer-aided manufacturing (CAD/CAM) technique improves adaptation of FDPs compared with FDPs made using the lost-wax and metal casting technique. Three-unit FDPs were manufactured by CAD/CAM based on a digital impression of a typodont model. The FDPs were made from one of five materials: pre-sintered zirconium dioxide; hot isostatic pressed zirconium dioxide; lithium disilicate glass-ceramic; milled cobalt-chromium; and laser-sintered cobalt-chromium. FDPs made using the lost-wax and metal casting technique were used as the reference. The fit of the FDPs was analysed using the triple-scan method. The fit was evaluated for both single abutments and three-unit FDPs. The average cement space varied between 50 μm and 300 μm. Insignificant differences in internal fit were observed between the CAD/CAM-manufactured FDPs, and none of the FDPs had cement spaces that were statistically significantly different from those of the reference FDP. For all FDPs, the cement space at a marginal band 0.5-1.0 mm from the preparation margin was less than 100 μm. The milled cobalt-chromium FDP had the closest fit. The cement space of FDPs produced using the CAD/CAM technique was similar to that of FDPs produced using the conventional lost-wax and metal casting technique. © 2017 Eur J Oral Sci.

  6. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  7. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  8. Identification and control of factors influencing flow-accelerated corrosion in HRSG units using computational fluid dynamics modeling, full-scale air flow testing, and risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pietrowski, Ronald L. [The Consolidated Edison Company of New York, Inc., New York, NY (United States)

    2010-11-15

    In 2009, Consolidated Edison's East River heat recovery steam generator units 10 and 20 both experienced economizer tube failures which forced each unit offline. Extensive inspections indicated that the primary failure mechanism was flow-accelerated corrosion (FAC). The inspections revealed evidence of active FAC in all 7 of the economizer modules, with the most advanced stages of degradation being noted in center modules. Analysis determined that various factors were influencing and enabling this corrosion mechanism. Computational fluid dynamics and full-scale air flow testing showed very turbulent feedwater flow prevalent in areas of the modules corresponding with the pattern of FAC damage observed through inspection. It also identified preferential flow paths, with higher flow velocities, in certain tubes directly under the inlet nozzles. A FAC risk analysis identified more general susceptibility to FAC in the areas experiencing damage due to feedwater pH, operating temperatures, local shear fluid forces, and the chemical composition of the original materials of construction. These, in combination, were the primary root causes of the failures. Corrective actions were identified, analyzed, and implemented, resulting in equipment replacements and repairs. (orig.)

  9. Status of the JENDL activation file

    International Nuclear Information System (INIS)

    Nakajima, Yutaka

    1996-01-01

    The preliminary JENDL activation file was completed in February 1995 and has been used in the Japanese Nuclear Data Committee and as one of the data sources for the Fusion Evaluated Nuclear Data Library of the IAEA. Since there are already large activation libraries in western Europe and the United States, we are aiming at more accurate evaluation of reactions important to applications in nuclear energy development, rather than at compiling as many reaction data as these large libraries. In the preliminary file, 1,158 reaction cross sections have been compiled for 225 nuclides up to 20 MeV. (author)

  10. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  11. Motor unit number index (MUNIX) derivation from the relationship between the area and power of surface electromyogram: a computer simulation and clinical study

    Science.gov (United States)

    Miralles, Francesc

    2018-06-01

    Objective. The motor unit number index (MUNIX) is a technique based on the surface electromyogram (sEMG) that is gaining acceptance as a method for monitoring motor neuron loss, because it is reliable and produces less discomfort than other electrodiagnostic techniques with the same intended purpose. MUNIX assumes that the relationship between the area of the sEMG obtained at increasing levels of muscle activation and the values of a variable called the 'ideal case motor unit count' (ICMUC), defined as the product of the ratio between the area and power of the compound muscle action potential (CMAP) and that of the sEMG, is described by a decreasing power function. Nevertheless, the reason for this behaviour is unknown. The objective of this work is to investigate whether the definition of MUNIX could derive from more basic properties of the sEMG. Approach. The CMAP and sEMG epochs obtained at different levels of muscle activation from (1) the abductor pollicis brevis (APB) muscle of persons with and without carpal tunnel syndrome (CTS) and (2) a computer model of sEMG generation previously published were analysed. Main results. MUNIX reflects the power relationship existing between the area and power of a sEMG. The exponent of this function was smaller in patients with motor CTS than in the rest of the subjects. The analysis of the relationship between the area and power of a sEMG could aid in distinguishing a MUNIX reduction due to motoneuron loss from one due to a loss of muscle fibres. Significance. MUNIX is derived from the relationship between the area and power of a sEMG. This relationship changes when there is a loss of motor units (MUs), which partially explains the diagnostic sensitivity of MUNIX. Although the reasons for this change are unknown, it could reflect an increase in the proportion of MUs of great amplitude.
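
    For orientation, the commonly published definitions behind the abstract's wording can be written as below; in the original method, as I understand it, MUNIX is then read off the fitted power curve at a fixed reference sEMG area. These are the standard published relations, not formulas quoted from this abstract:

    ```latex
    % Commonly published MUNIX relations (after Nandedkar and colleagues);
    % sEMG area and power are measured at each level of voluntary activation.
    \mathrm{ICMUC}
      = \frac{\mathrm{power}(\mathrm{CMAP}) \times \mathrm{area}(\mathrm{sEMG})}
             {\mathrm{area}(\mathrm{CMAP}) \times \mathrm{power}(\mathrm{sEMG})},
    \qquad
    \mathrm{ICMUC} \approx a \,[\mathrm{area}(\mathrm{sEMG})]^{b}, \quad b < 0
    ```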

  12. Computer experimental analysis of the CHP performance of a 100 kWe SOFC Field Unit by a factorial design

    Science.gov (United States)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

    Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which at present (May 2005) is starting its operation and will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP 100 demonstration at EDB/ELSAM in Westerwoort. Then, the simulated tests were performed in the form of computer experimental sessions, and the measurement uncertainties were simulated with perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effects of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) have been investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, thermal recovered power, single cell voltage, cell operative temperature, consumed fuel flow and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to the generated electric power and the stack heat recovered.
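
    A minimal two-factor slice of such a factorial analysis, sketched with statsmodels; the factor labels follow the abstract, but the response values are synthetic placeholders, not measurements from the CHP 100:

    ```python
    # 2^2 slice of a factorial design: fit main effects and the interaction of
    # two coded factors on a response, then run ANOVA on the fitted model.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "Uox":   [-1, -1, 1, 1, -1, -1, 1, 1],          # coded low/high levels
        "Uf":    [-1, 1, -1, 1, -1, 1, -1, 1],
        "power": [98, 104, 95, 101, 99, 103, 94, 102],  # synthetic response
    })
    model = smf.ols("power ~ Uox * Uf", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))
    ```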

  13. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  14. Efficacy of D-RaCe and ProTaper Universal Retreatment NiTi instruments and hand files in removing gutta-percha from curved root canals - a micro-computed tomography study.

    Science.gov (United States)

    Rödig, T; Hausdörfer, T; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2012-06-01

    To compare the efficacy of two rotary NiTi retreatment systems and Hedström files in removing filling material from curved root canals. Curved root canals of 57 extracted teeth were prepared using FlexMaster instruments and filled with gutta-percha and AH Plus. After determination of root canal curvatures and radii in two directions, the teeth were assigned to three identical groups (n = 19). The root fillings were removed with D-RaCe instruments, ProTaper Universal Retreatment instruments or Hedström files. Pre- and postoperative micro-CT imaging was used to assess the percentage of residual filling material as well as the amount of dentine removal. Working time and procedural errors were recorded. Data were analysed using analysis of covariance and analysis of variance procedures. D-RaCe instruments were significantly more effective than ProTaper Universal Retreatment instruments and Hedström files (P < 0.05). In the ProTaper group, four instrument fractures and one lateral perforation were observed. Five instrument fractures were recorded for D-RaCe. D-RaCe instruments were associated with significantly less residual filling material than ProTaper Universal Retreatment instruments and hand files. Hedström files removed significantly less dentine than both rotary NiTi systems. Retreatment with rotary NiTi systems resulted in a high incidence of procedural errors. © 2012 International Endodontic Journal.

  15. 32 CFR 150.21 - Appeals by the United States.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false Appeals by the United States. 150.21 Section 150... the United States. (a) Restricted filing. Only a representative of the government designated by the Judge Advocate General of the respective service may file an appeal by the United States under Article...

  16. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    Energy Technology Data Exchange (ETDEWEB)

    Kabat, C; Defoor, D; Alexandrian, A; Papanikolaou, N; Stathakis, S [University of Texas HSC SA, San Antonio, TX (United States)

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components’ integrity is paramount. ElektaLog files are generated every 40 milliseconds, which can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, it was the aim of the study to evaluate if ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%–2mm criteria for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-meansquare value of the individual recorded errors during treatment. Additionally, beam on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized to allow for reliable analysis of system accuracy and performance.
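
    A minimal sketch of the root-mean-square error summary and beam-on-time calculation described above, assuming a log already parsed into arrays; the field names and numbers are hypothetical.

```python
# Minimal sketch of the RMS positional-error summary described above, assuming
# a log already parsed into NumPy arrays; values here are synthetic.
import numpy as np

def rms_error(errors_mm: np.ndarray) -> float:
    """Root-mean-square of recorded positional errors (mm)."""
    return float(np.sqrt(np.mean(np.square(errors_mm))))

# Hypothetical per-entry errors sampled every 40 ms during one fraction.
rng = np.random.default_rng(0)
mlc_errors = rng.normal(0.0, 0.3, size=5000)   # MLC leaves (mm)
jaw_errors = rng.normal(0.0, 0.15, size=5000)  # jaws (mm)

print(f"MLC RMS error: {rms_error(mlc_errors):.2f} mm")
print(f"Jaw RMS error: {rms_error(jaw_errors):.2f} mm")

# Beam-on time from the number of 40 ms log entries, as in the abstract.
beam_on_seconds = len(mlc_errors) * 0.040
print(f"Beam-on time: {beam_on_seconds:.1f} s")
```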

  17. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    International Nuclear Information System (INIS)

    Kabat, C; Defoor, D; Alexandrian, A; Papanikolaou, N; Stathakis, S

    2016-01-01

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components’ integrity is paramount. ElektaLog files are generated every 40 milliseconds, and these can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, the aim of the study was to evaluate whether ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%/2 mm criterion for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual errors recorded during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for the MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized to allow for reliable analysis of system accuracy and performance.

  18. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also progressed: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data taking (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  19. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  20. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  1. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  2. Impact on breast cancer diagnosis in a multidisciplinary unit after the incorporation of mammography digitalization and computer-aided detection systems.

    Science.gov (United States)

    Romero, Cristina; Varela, Celia; Muñoz, Enriqueta; Almenar, Asunción; Pinto, Jose María; Botella, Miguel

    2011-12-01

    The purpose of this article is to evaluate the impact on the diagnosis of breast cancer of implementing full-field digital mammography (FFDM) in a multidisciplinary breast pathology unit and, 1 year later, the addition of a computer-aided detection (CAD) system. A total of 13,453 mammograms performed between January and July of the years 2004, 2006, and 2007 were retrospectively reviewed using conventional mammography, digital mammography, and digital mammography plus CAD techniques. Mammograms were classified into two subsets: screening and diagnosis. Variables analyzed included cancer detection rate, rate of in situ carcinoma, tumor size at detection, biopsy rate, and positive predictive value of biopsy. FFDM increased the cancer detection rate, albeit not statistically significantly. The detection rate of in situ carcinoma increased significantly using FFDM plus CAD compared with conventional technique (36.8% vs 6.7%; p = 0.05 without Bonferroni statistical correction) for the screening dataset. Relative to conventional mammography, tumor size at detection decreased with digital mammography (T1, 61.5% vs 88%; p = 0.018) and with digital mammography plus CAD (T1, 79.7%; p = 0.03 without Bonferroni statistical correction). Biopsy rates in the general population increased significantly using CAD (10.6/1000 for conventional mammography, 14.7/1000 for digital mammography, and 17.9/1000 for digital mammography plus CAD; p = 0.02). The positive predictive value of biopsy decreased slightly, but not significantly, for both subsets. The incorporation of new techniques has improved the performance of the breast unit by increasing the overall detection rates and earlier detection (smaller tumors), both leading to an increase in interventionism.

  3. Study on efficiency of time computation in x-ray imaging simulation based on Monte Carlo algorithm using graphics processing unit

    International Nuclear Information System (INIS)

    Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy

    2016-01-01

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were performed about 64–114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20–31 times faster than on a single CPU core. Another result shows that the optimum image quality was obtained with the number of histories starting from 10^8 and energies from 60 keV to 90 keV. Analysed statistically, the quality of the GPU and CPU images is essentially the same.
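
    The one-photon-per-core idea can be illustrated with a toy, vectorized photon-transport step; this is not MC-GPU itself, just a sketch of batching independent photon histories (here with NumPy on the CPU) under a simplified attenuation-only model.

```python
# Toy sketch of the one-photon-per-core idea under a simplified
# attenuation-only model; each array element plays the role of one
# GPU thread's photon history. This is an illustration, not MC-GPU.
import numpy as np

rng = np.random.default_rng(0)

def trace_photons(n_photons: int, mu: float, thickness: float) -> float:
    """Fraction of photons transmitted through a slab (attenuation coeff mu)."""
    # Sample a free path length for every photon simultaneously.
    free_paths = rng.exponential(1.0 / mu, size=n_photons)
    transmitted = free_paths > thickness
    return transmitted.mean()

# 10^8 histories would be the regime quoted above; use fewer here for speed.
print(trace_photons(n_photons=10**6, mu=0.2, thickness=5.0))
# Analytic check: exp(-mu * thickness) = exp(-1.0) ≈ 0.368
```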

  4. Study on efficiency of time computation in x-ray imaging simulation based on Monte Carlo algorithm using graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Suprijadi [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Nuclear Physics and Biophysics Reaserch Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Haryanto, Freddy [Nuclear Physics and Biophysics Reaserch Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia)

    2016-03-11

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were performed about 64–114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20–31 times faster than on a single CPU core. Another result shows that the optimum image quality was obtained with the number of histories starting from 10^8 and energies from 60 keV to 90 keV. Analysed statistically, the quality of the GPU and CPU images is essentially the same.

  5. Study on motion artifacts in coronary arteries with an anthropomorphic moving heart phantom on an ECG-gated multidetector computed tomography unit

    International Nuclear Information System (INIS)

    Greuter, Marcel J.W.; Dorgelo, Joost; Tukker, Wim G.J.; Oudkerk, Matthijs

    2005-01-01

    Acquisition time plays a key role in the quality of cardiac multidetector computed tomography (MDCT) and is directly related to the rotation time of the scanner. The purpose of this study is to examine the influence of heart rate and a multisector reconstruction algorithm on the image quality of coronary arteries of an anthropomorphic adjustable moving heart phantom on an ECG-gated MDCT unit. The heart phantom and a coronary artery phantom were used on an MDCT unit with a rotation time of 500 ms. The movement of the heart was determined by analysis of the images taken at different phases. The results indicate that the movement of the coronary arteries on the heart phantom is comparable to that in a clinical setting. The influence of the heart rate on image quality and artifacts was determined by analysing several heart rates between 40 and 80 bpm, with the movement of the heart synchronized using a retrospective ECG-gated acquisition protocol. The resulting reformatted volume-rendering images of the moving heart and the coronary arteries were qualitatively compared as a function of the heart rate. The evaluation was performed on three independent series by two independent radiologists, assessing the image quality of the coronary arteries and the presence of artifacts. The evaluation shows that at heart rates above 50 bpm the influence of motion artifacts in the coronary arteries becomes apparent. In addition, the influence of a dedicated multisector reconstruction technique on image quality was determined. The results show that the image quality of the coronary arteries is not related to the heart rate alone, and that the influence of the multisector reconstruction technique becomes significant above 70 bpm. This study therefore shows that multisector reconstruction does not yield an actual acquisition time per heart cycle, but only a mathematical one. (orig.)

  6. Detection of Cement Leakage After Vertebroplasty with a Non-Flat-Panel Angio Unit Compared to Multidetector Computed Tomography - An Ex Vivo Study

    International Nuclear Information System (INIS)

    Baumann, Clemens; Fuchs, Heiko; Westphalen, Kerstin; Hierholzer, Johannes

    2008-01-01

    The purpose of this study was to investigate the detection of cement leakages after vertebroplasty using angiographic computed tomography (ACT) in a non-flat-panel angio unit compared to multidetector computed tomography (MDCT). Vertebroplasty was performed in 19 of 33 cadaver vertebrae (23 thoracic and 10 lumbar segments). In the angio suite, ACT (190°; 1.5° per image) was performed to obtain volumetric data. Another volumetric data set of the specimen was obtained by MDCT using a standard algorithm. Nine multiplanar reconstructions in standardized axial, coronal, and sagittal planes of every vertebra were generated from both data sets. Images were evaluated on the basis of a nominal scale with 18 criteria, comprising osseous properties (e.g., integrity of the end plate) and cement distribution (e.g., presence of intraspinal cement). MDCT images were regarded as the gold standard and analyzed by two readers in a consensus mode. Rotational acquisitions were analyzed by six blinded readers. Results were correlated with the gold standard using Cohen's κ-coefficient analysis. Furthermore, interobserver variability was calculated. Correlation with the gold standard ranged from no correlation (osseous margins of the neuroforamen, κ = 0.008) to intermediate (trace of the vertebroplasty cannula; κ = 0.615) for criteria referring to osseous morphology. However, there was an excellent correlation for those criteria referring to cement distribution, with κ values ranging from 0.948 (paravertebral cement distribution) to 0.972 (intraspinal cement distribution). With a minimum of κ = 0.768 ('good correlation') and a maximum of κ = 0.91 ('excellent'), interobserver variability was low. In conclusion, ACT in an angio suite without a flat-panel detector depicts cement leakage after vertebroplasty as well as MDCT does. However, the method does not provide sufficient depiction of osseous morphology.

  7. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system, using methods of selective access to information. The three main aspects of file processing (creation, modification, reorganisation) have thus been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor a list of the numbers of the documents indexed by this descriptor, and a dictionary of descriptors (input file) which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed.
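
    A minimal sketch of the two-file organisation described above, with an inverse file mapping each descriptor to document numbers and a dictionary-style lookup in front of it; the documents and descriptors are invented for illustration.

```python
# Minimal sketch of an inverse (inverted) file fronted by a descriptor
# dictionary, as in the thesis described above. Data are made up.
from collections import defaultdict

documents = {
    1: ["reactor", "fuel", "safety"],
    2: ["fuel", "cladding"],
    3: ["reactor", "thermal", "safety"],
}

# Build the inverse file: descriptor -> sorted list of document numbers.
inverse_file = defaultdict(list)
for doc_no, descriptors in documents.items():
    for d in descriptors:
        inverse_file[d].append(doc_no)

# The dictionary gives access to the inverse file (here, a plain dict lookup).
def search(descriptor: str) -> list[int]:
    return inverse_file.get(descriptor, [])

print(search("safety"))    # -> [1, 3]
print(search("cladding"))  # -> [2]
```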

  8. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  9. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

    This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we really need to share the data across. ...

  10. Workstation environment supports for startup of YGN 3 and 4 nuclear unit

    International Nuclear Information System (INIS)

    Lee, Won Jae; Kim, Won Bong; Lee, Byung Chae

    1995-07-01

    Light water reactor fuel development division of Korea Atomic Energy Research Institute participated in the installation of the plant computer system and software, and in the user support activities of Asea Brown Boveri/Combustion Engineering for the Plant Monitoring System during the startup phase of the YGN-3 nuclear unit. The main purpose of the participation was to acquire self-reliant plant-computer technology for the independent design and startup of the next nuclear units. This report describes the activities performed by KAERI with ABB/CE at the plant site. In addition, it describes the direct transfer of data files between the PMS and a workstation, which was independently carried out by KAERI. Since KAERI must support the site in setting up a plant computer environment independent of ABB-CE from the next nuclear units onward, the technical details of the activities provided to the site were reviewed in order to provide a better computer environment for the next nuclear units. In conclusion, this report is expected to provide the technical background, in the area of the plant computer, for supporting the plant computing environment and the scope of support work at the plant site during the Yonggwang 3 and 4 startup, for the benefit of the next nuclear units. 6 refs. (Author)

  11. Workstation environment supports for startup of YGN 3 and 4 nuclear unit

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Kim, Won Bong; Lee, Byung Chae [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    Light water reactor fuel development division of Korea Atomic Energy Research Institute participated in the installation of the plant computer system and software, and in the user support activities of Asea Brown Boveri/Combustion Engineering for the Plant Monitoring System during the startup phase of the YGN-3 nuclear unit. The main purpose of the participation was to acquire self-reliant plant-computer technology for the independent design and startup of the next nuclear units. This report describes the activities performed by KAERI with ABB/CE at the plant site. In addition, it describes the direct transfer of data files between the PMS and a workstation, which was independently carried out by KAERI. Since KAERI must support the site in setting up a plant computer environment independent of ABB-CE from the next nuclear units onward, the technical details of the activities provided to the site were reviewed in order to provide a better computer environment for the next nuclear units. In conclusion, this report is expected to provide the technical background, in the area of the plant computer, for supporting the plant computing environment and the scope of support work at the plant site during the Yonggwang 3 and 4 startup, for the benefit of the next nuclear units. 6 refs. (Author)

  12. Download this PDF file

    African Journals Online (AJOL)

    5,. May. 1923, p. 287. (South African Military Schools) p 287. CGS Box 231, File 31/0/2. .... One gains the impression that the sphere .... tions, Anthropology, Sociology and Man Management. ... of the word, possesses personality and initiative,.

  13. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  14. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  15. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  16. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  17. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  18. Download this PDF file

    African Journals Online (AJOL)

    countries quite a number of distance education institutions and programmes are more likely to be ... The Open University of Tanzania (OUT), (Ministry of Higher Education, Science and ..... (1991) Comic Relief Funding file. BAI, London, 1st ...

  19. Mapping the Information Trace in Local Field Potentials by a Computational Method of Two-Dimensional Time-Shifting Synchronization Likelihood Based on Graphic Processing Unit Acceleration.

    Science.gov (United States)

    Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You

    2017-12-01

    The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
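
    A much-simplified stand-in for the two-dimensional time-shifting analysis described above: scanning onset times and delays and scoring each pair of windows. Plain windowed correlation is used here instead of the full synchronization likelihood algorithm, and it runs on the CPU with NumPy rather than on a GPU.

```python
# Simplified onset-time x delay-time map between two signals, using windowed
# correlation as a stand-in for synchronization likelihood. Illustration only.
import numpy as np

def shift_map(x, y, win, max_delay, step):
    onsets = list(range(0, len(x) - win - max_delay, step))
    out = np.zeros((len(onsets), max_delay))
    for i, t0 in enumerate(onsets):
        xw = x[t0:t0 + win]
        for d in range(max_delay):
            yw = y[t0 + d:t0 + d + win]
            out[i, d] = np.corrcoef(xw, yw)[0, 1]
    return out  # axes: onset time x delay time

rng = np.random.default_rng(0)
sig = rng.standard_normal(2000)
delayed = np.roll(sig, 15) + 0.5 * rng.standard_normal(2000)  # y lags x by 15
m = shift_map(sig, delayed, win=200, max_delay=40, step=100)
onset_idx, best_delay = np.unravel_index(np.argmax(m), m.shape)
print(best_delay)  # expected ≈ 15
```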

  20. Margin benefit assessment of the YGN 3 cycle 1 fxy error files for COLSS and CPC overall uncertainty analyses

    International Nuclear Information System (INIS)

    Yoon, Rae Young; In, Wang Kee; Auh, Geun Sun; Kim, Hee Cheol; Lee, Sang Keun

    1994-01-01

    Margin benefits are quantitatively assessed for the Yonggwang Unit 3 (YGN 3) Cycle 1 planar radial peaking factor (Fxy) error files for each time-in-life, i.e., BOC, IOC, MOC and EOC. The generic Fxy error file (FXYMEQO) is presently used for the Yonggwang Unit 3 Cycle 1 COLSS (Core Operating Limit Supervisory System) and CPC (Core Protection Calculator) Overall Uncertainty Analyses (OUA). However, because this file is more conservative than the plant/cycle-specific Fxy error files, the COLSS and CPC thermal margins (DNB-OPM) for the generic Fxy error file are less than those of the plant/cycle-specific Fxy error files. Therefore, the YGN 3 Cycle 1 Fxy error files were generated and analyzed by the modified codes for the Yonggwang plants. The YGN 3 Cycle 1 Fxy error files increased the thermal margin by about 1% for both COLSS and CPC

  1. Software for automated evaluation of technical and economic performance factors of nuclear power plant units

    International Nuclear Information System (INIS)

    Cvan, M.; Zadrazil, J.; Barnak, M.

    1989-01-01

    Computer codes TEP V2, TEP EDU and TEP V1 are used especially in real-time evaluation of technical and economic performance factors of the power unit. Their basic functions include credibility filtration of input data obtained by measurement, simultaneous calculation of flows of various types of energy, calculation of technical and economic factors, and listing and filing of the results. Code ZMEK is designed for executing changes in the calculation constants file for codes TEP V2 and TEP EDU. Code TEP DEN is used in processing the complete daily report on the technical and economic performance factors of the unit. Briefly described are the basic algorithms of credibility filtration for the measured quantities, the methodology of fundamental balances, and the method of guaranteeing the continuity of measurement. Experience with the use of the codes is given, and the trends of their future development are outlined. (J.B.). 5 refs

  2. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  3. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujior; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  4. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  5. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored with the patient's identification number (ID) on floppy discs, successively in the order in which scintigraphy was performed. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format, which can also serve as a reporting format. Output or retrieval of filed individual patient data is possible by examination, disease code or ID. This system can be used for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  6. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single shared file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which results in generating a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune the parallel I/O performance of this feature with the parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show performance benefits of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets for storing files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
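
    A minimal sketch of the subfiling idea itself, written with h5py: each writer stores its block in its own HDF5 subfile, and a small index records the mapping back to one logical array. This illustrates the compromise described above; it is not the actual HDF5 subfiling implementation.

```python
# Minimal subfiling sketch: one HDF5 subfile per writer plus a JSON index.
# This illustrates the idea, not the real HDF5 subfiling feature.
import json
import h5py
import numpy as np

def write_subfiles(array: np.ndarray, n_subfiles: int, prefix: str) -> None:
    chunks = np.array_split(array, n_subfiles)
    index, offset = [], 0
    for i, chunk in enumerate(chunks):
        name = f"{prefix}.{i}.h5"
        with h5py.File(name, "w") as f:           # one subfile per writer
            f.create_dataset("data", data=chunk)
        index.append({"file": name, "offset": offset, "length": len(chunk)})
        offset += len(chunk)
    with open(f"{prefix}.index.json", "w") as f:  # mapping back to one array
        json.dump(index, f)

write_subfiles(np.arange(1_000_000, dtype=np.float64), n_subfiles=4,
               prefix="checkpoint")
```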

  7. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ~ t^(−1−α), 0 < α < 1. The mean square displacement (MSD) of a tagged sphere, <r^2>, obeys <r^2> ~ (<r^2>_nrml)^α, where <r^2>_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting the probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  8. Total system for manufacture of nuclear vessels by computer: VECTRON

    International Nuclear Information System (INIS)

    Inagawa, Jin; Ueno, Osamu; Hanai, Yoshiharu; Ohkawa, Isao; Washizu, Hideyuki

    1980-01-01

    VECTRON (Vessel Engineering by Computer Tool and Rapid Operating for the N/C System) is a CAM (Computer Aided Manufacturing) system that has been developed to produce high-quality and highly accurate vessels for nuclear power plants and other industrial plants. The outputs of this system are design drawings, manufacturing information and magnetic tapes for the N/C marking machine for vessel shell plates, including their attachments. It can also output information at each stage of designing, marking, cutting, forming and assembling, by treating the vessels in three dimensions and by using data filing systems and a general-purpose plotting program. The data filing systems consist of functional and manufacturing data for each part of the vessels. This system not only realizes a change from manual work to computer work, but also leads to improvements in production engineering and production jigs for safety and high quality. At present, VECTRON is being applied to the manufacture of the shell plates of the primary containment vessels in the Kashiwazaki-Kariwa Nuclear Power Station Unit 1 (K-1) and the Fukushima Daini Nuclear Power Station Unit 3 (2F-3), to realize increased productivity. (author)

  9. Use of computed tomography scout film and Hounsfield unit of computed tomography scan in predicting the radio-opacity of urinary calculi in plain kidney, ureter and bladder radiographs.

    Science.gov (United States)

    Chua, Michael E; Gomez, Odina R; Sapno, Lorelei D; Lim, Steve L; Morales, Marcelino L

    2014-07-01

    The objective of this study is to determine the diagnostic utility of the computed tomography (CT) scout film, together with an optimal non-contrast helical CT Hounsfield unit (HU) value, in predicting the appearance of a urinary calculus on the plain kidneys, ureter, urinary bladder (KUB) radiograph. A prospective cross-sectional study was performed and data were collected from June 2007 to June 2012 at a tertiary hospital. The included subjects were diagnosed to have urolithiasis; for each calculus the HU value, CT-scout film appearance and KUB radiograph appearance were recorded independently by two observers. Univariate logistic analysis with a receiver operating characteristic (ROC) curve was used to determine the best cut-off HU value for urolithiases not identified on the CT-scout film but determined to be radio-opaque on KUB X-ray. Subsequently, its sensitivity, specificity, predictive values and likelihood ratios were calculated. Statistical significance was set at a P value of 0.05 or less. Two hundred and three valid cases were included. 73 out of 75 urolithiases detected on the CT-scout film were identified on the plain radiograph and determined to be radio-opaque. The best cut-off value of HU for predicting radiographic appearance was 630 HU: urinary calculi not seen on the CT-scout film but above this value were still radio-opaque on KUB X-ray. This HU cut-off showed good accuracy, with an overall sensitivity of 82.2%, specificity of 96.9%, a positive predictive value of 96.5% and a negative predictive value of 83.5%. Urolithiases identified on the CT-scout film were also seen as radiopaque on the KUB radiograph, while stones not visible on the CT-scout film but above the optimal HU cut-off value of 630 are also likely to be radiopaque.
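
    The ROC-based choice of an HU cut-off can be sketched as follows, using synthetic data and Youden's J statistic; the 630 HU figure above comes from the study itself, not from this toy example.

```python
# Sketch of choosing an HU cut-off with a ROC-style scan and Youden's J,
# on synthetic data; numbers here are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical HU values: radio-opaque stones tend to have higher HU.
hu_opaque = rng.normal(900, 250, size=120)   # radio-opaque on KUB (label 1)
hu_lucent = rng.normal(420, 180, size=80)    # radiolucent on KUB (label 0)
hu = np.concatenate([hu_opaque, hu_lucent])
label = np.concatenate([np.ones(120), np.zeros(80)])

best_cutoff, best_j = None, -1.0
for cutoff in np.unique(hu):
    pred = hu >= cutoff
    sens = np.mean(pred[label == 1])   # true positive rate
    spec = np.mean(~pred[label == 0])  # true negative rate
    j = sens + spec - 1.0              # Youden's J statistic
    if j > best_j:
        best_cutoff, best_j = cutoff, j

print(f"best cut-off ≈ {best_cutoff:.0f} HU (Youden J = {best_j:.2f})")
```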

  10. Health, United States, 2012: Men's Health

    Science.gov (United States)


  11. Tax_Unit_Certification_Final_2012

    Data.gov (United States)

    Kansas Data Access and Support Center — The Statewide GIS Tax Unit boundary file was created through a collaborative partnership between the State of Kansas Department of Revenue Property Valuation...

  12. Tax_Units_Certification_2013_0301

    Data.gov (United States)

    Kansas Data Access and Support Center — The Statewide GIS Tax Unit boundary file was created through a collaborative partnership between the State of Kansas Department of Revenue Property Valuation...

  13. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience that may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
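
    A minimal sketch of the core checks such a service performs, assuming a simple layout: well-formedness validation of a changed XML file, an authorization check, and a who/when/why change log. The file names and ACL here are made up.

```python
# Minimal sketch of validate/authorize/record for config files; illustrative
# only, not the actual ATLAS service.
import json
import time
import xml.etree.ElementTree as ET

AUTHORIZED = {"trigger_config.xml": {"alice", "bob"}}  # hypothetical ACL

def commit_change(path: str, author: str, reason: str,
                  log_path: str = "changes.log.json") -> None:
    if author not in AUTHORIZED.get(path, set()):
        raise PermissionError(f"{author} may not modify {path}")
    ET.parse(path)  # raises ParseError if the XML is not well-formed
    entry = {"file": path, "author": author, "reason": reason,
             "time": time.strftime("%Y-%m-%dT%H:%M:%S")}
    try:
        with open(log_path) as f:
            log = json.load(f)
    except FileNotFoundError:
        log = []
    log.append(entry)
    with open(log_path, "w") as f:
        json.dump(log, f, indent=2)
```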

  14. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  15. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE - to transfer the main buffer to duplicate magnetic tapes, 2) REPORT - to determine when the main buffer is full enough to archive, 3) INCREMENT - to back up the partially filled main buffer, and 4) FULLBACKUP - to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  16. Renewable Energy Atlas of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Kuiper, J. [Environmental Science Division; Hlava, K. [Environmental Science Division; Greenwood, H. [Environmentall Science Division; Carr, A. [Environmental Science Division

    2013-12-13

    The Renewable Energy Atlas (Atlas) of the United States is a compilation of geospatial data focused on renewable energy resources, federal land ownership, and base map reference information. This report explains how to add the Atlas to your computer and install the associated software. The report also includes: A description of each of the components of the Atlas; Lists of the Geographic Information System (GIS) database content and sources; and A brief introduction to the major renewable energy technologies. The Atlas includes the following: A GIS database organized as a set of Environmental Systems Research Institute (ESRI) ArcGIS Personal GeoDatabases, and ESRI ArcReader and ArcGIS project files providing an interactive map visualization and analysis interface.

  17. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
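
    A minimal sketch of the files-to-objects step, using boto3 against an S3-style object store; this is an illustration under assumed names, not the PLFS middleware itself.

```python
# Minimal sketch of archiving local checkpoint files as objects in an
# S3-style store; bucket name and paths are made up, and this is an
# illustration of the idea, not PLFS.
import os
import boto3

def archive_files_as_objects(paths: list[str], bucket: str,
                             prefix: str = "checkpoints/") -> None:
    s3 = boto3.client("s3")  # assumes credentials are already configured
    for path in paths:
        key = prefix + os.path.basename(path)
        with open(path, "rb") as f:
            s3.put_object(Bucket=bucket, Key=key, Body=f.read())

# Example (requires a reachable bucket):
# archive_files_as_objects(["ckpt.0", "ckpt.1"], bucket="my-archive-bucket")
```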

  18. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  19. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  20. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
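
    A tiny executable model of this abstraction, assuming paths as tuples of names: the file system is a partial function (here a dict) from absolute paths to data, with sketches of the creation and removal operations.

```python
# Tiny model of an abstract file system as a partial function from absolute
# paths to data; simplified sketches of create/remove/read, for illustration.
from typing import Optional

FileSystem = dict[tuple[str, ...], bytes]

def create(fs: FileSystem, path: tuple[str, ...], data: bytes = b"") -> None:
    if path in fs:
        raise FileExistsError("/".join(path))
    fs[path] = data                      # path becomes valid

def remove(fs: FileSystem, path: tuple[str, ...]) -> None:
    del fs[path]                         # path becomes invalid again

def read(fs: FileSystem, path: tuple[str, ...]) -> Optional[bytes]:
    return fs.get(path)                  # None models "undefined at path"

fs: FileSystem = {}
create(fs, ("home", "user", "notes.txt"), b"hello")
print(read(fs, ("home", "user", "notes.txt")))  # b'hello'
remove(fs, ("home", "user", "notes.txt"))
print(read(fs, ("home", "user", "notes.txt")))  # None
```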

  1. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: the vk4 file produced by the Keyence VK software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data directly in MATLAB, binary output at the beginning of the height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read a vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
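
    In the same spirit, reading binary fields at fixed offsets can be sketched with Python's struct module; the offsets and field layout below are hypothetical placeholders, not the real vk4 specification.

```python
# Hedged sketch of reading binary data at fixed offsets; the offsets and
# field layout are hypothetical placeholders, not the vk4 format spec.
import struct

def read_u32(buf: bytes, offset: int) -> int:
    """Read a little-endian unsigned 32-bit integer at a byte offset."""
    return struct.unpack_from("<I", buf, offset)[0]

with open("example.vk4", "rb") as f:     # assumed file name
    data = f.read()

magic = data[:4]                          # file signature bytes
height_offset = read_u32(data, 16)        # hypothetical offset-table entry
width = read_u32(data, height_offset)     # hypothetical image header fields
height = read_u32(data, height_offset + 4)
print(magic, width, height)
```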

  2. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    Full Text Available The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator to allow on-demand checking, and use of tailorable reporting and exception policies targeted to adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes a file digest, then stores those digests in a database file located on write-once media. An Examiner agent computes a new digest to compare with the original digests in the database file. Changes in digest values indicate that the file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
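
    A minimal sketch of the Initiator/Examiner digest workflow described above, using hashlib MD5; the database layout is illustrative only.

```python
# Minimal sketch of the digest-compare approach: record reference MD5 digests
# (Initiator role), then re-compute and compare them (Examiner role).
import hashlib
import json
import os

def md5_of(path: str) -> str:
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def baseline(paths: list[str], db_path: str = "digests.json") -> None:
    """Initiator role: record reference digests for a set of files."""
    with open(db_path, "w") as f:
        json.dump({p: md5_of(p) for p in paths}, f)

def examine(db_path: str = "digests.json") -> list[str]:
    """Examiner role: return files whose contents have changed."""
    with open(db_path) as f:
        reference = json.load(f)
    return [p for p, digest in reference.items()
            if not os.path.exists(p) or md5_of(p) != digest]
```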

  3. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has come to be an important part of the operating system. P2P is a reliable way of sharing files in a distributed operating system. It was introduced in 1999 and later became a topic of high research interest. A peer-to-peer network is a type of network in which peers share the network workload and other load-related tasks. A P2P network can be an ad hoc connection, where a group of computers connected by a USB (Universal Serial Bus) port transfer files or enable disk sharing, i.e. file sharing. Currently, P2P requires a special network designed in a P2P way. Nowadays, browsers have a big influence on our lives. In this project we study the file sharing mechanism of distributed operating systems in web browsers, where we try to find performance bottlenecks; our research aims to improve file sharing performance and scalability in distributed file systems. Additionally, we discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.

  4. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  5. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
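
    Jade's private name space can be pictured as a per-user table mapping mount points to underlying file systems, with longest-prefix matching deciding which backend serves a given path. A toy Python sketch of that resolution step (the API is invented for illustration and is not Jade's actual interface):

        class LogicalNameSpace:
            """A per-user logical name space: mount points map to backends."""

            def __init__(self):
                self.mounts = {}  # mount point -> backend name, e.g. "nfs", "ftp"

            def mount(self, point, backend):
                self.mounts[point.rstrip("/")] = backend

            def resolve(self, path):
                """Pick the longest mount-point prefix covering the path and
                return the backend plus the remainder relative to that mount."""
                covering = [p for p in self.mounts
                            if path == p or path.startswith(p + "/")]
                if not covering:
                    raise KeyError("no mount covers " + path)
                best = max(covering, key=len)
                return self.mounts[best], path[len(best):] or "/"

        ns = LogicalNameSpace()
        ns.mount("/home", "ufs")         # local Unix file system
        ns.mount("/home/shared", "afs")  # a second file system under the same tree
        print(ns.resolve("/home/shared/paper.txt"))  # -> ('afs', '/paper.txt')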

  6. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  7. Generalities of a Computer Monitoring System for Intensive Care Units

    Directory of Open Access Journals (Sweden)

    María del Carmen Tellería Prieto

    2012-02-01

    Full Text Available The use of information and communication technologies in the health sector gains greater importance every day. This paper presents the general requirements from which a computer system for monitoring critically ill patients is being developed for the different critical-care services, although it is initially aimed at intensive care units. The work is part of a branch project run by the National Direction of Medical Emergencies of the Cuban Ministry of Public Health, with the participation of emergency and intensive-care physicians from across the country. The system is being implemented by health informatics staff in Pinar del Río, complying with the regulations established by the National Direction of Informatics and the Softel company. The monitoring system will facilitate the capture, management, processing, and storage of the information generated for each patient, integrating all the information handled in the service. Emphasis is placed on medical and nursing progress notes, the prescription of treatments, and the clinical evaluation of patients, which will allow more effective therapeutic decision-making. Among the general requirements from which the monitoring system will be developed, it is specified that the system be modular, simple and intuitive to use, and implemented with free software.

  8. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of the file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in this environment.
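
    The trade-off the authors observe, better hit ratios that do not always translate into better execution times, shows up even in a one-block sequential lookahead policy. A minimal Python sketch (the policy is illustrative, not the testbed's actual design):

        class PrefetchingCache:
            """Block cache with one-block sequential lookahead.

            Toy model: a real file system overlaps the prefetch I/O with
            computation, which is where both the savings and the wasted
            work on non-sequential access patterns come from.
            """

            def __init__(self, read_block):
                self.read_block = read_block  # function: block number -> bytes
                self.cache = {}
                self.hits = 0
                self.misses = 0

            def get(self, n):
                if n in self.cache:
                    self.hits += 1
                else:
                    self.misses += 1
                    self.cache[n] = self.read_block(n)
                # Guess sequential access and fetch the next block early;
                # on a random access pattern this is pure extra I/O.
                if n + 1 not in self.cache:
                    self.cache[n + 1] = self.read_block(n + 1)
                return self.cache[n]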

  9. Download this PDF file

    African Journals Online (AJOL)


  10. Challenging Ubiquitous Inverted Files

    NARCIS (Netherlands)

    de Vries, A.P.

    2000-01-01

    Stand-alone ranking systems based on highly optimized inverted file structures are generally considered ‘the’ solution for building search engines. Observing various developments in software and hardware, we argue however that IR research faces a complex engineering problem in the quest for more
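
    For reference, the structure in question maps each term to a postings list of the documents containing it, so a query touches only the lists for its own terms. A minimal, deliberately unoptimized Python sketch:

        from collections import defaultdict

        def build_inverted_file(docs):
            """Map each term to a sorted postings list of (doc_id, term_frequency)."""
            index = defaultdict(dict)
            for doc_id, text in enumerate(docs):
                for term in text.lower().split():
                    index[term][doc_id] = index[term].get(doc_id, 0) + 1
            return {t: sorted(p.items()) for t, p in index.items()}

        def search(index, query):
            """Rank documents by summed term frequency over the query terms."""
            scores = defaultdict(int)
            for term in query.lower().split():
                for doc_id, tf in index.get(term, []):
                    scores[doc_id] += tf
            return sorted(scores.items(), key=lambda kv: -kv[1])

        index = build_inverted_file(["the global file system",
                                     "inverted file structures for search"])
        print(search(index, "file system"))  # -> [(0, 2), (1, 1)]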

  11. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; O'Keefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility so that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
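
    The device-held lock that gives GFS its atomic read-modify-write can be sketched as a try-acquire loop around the critical section. A toy Python illustration of the pattern (in GFS the lock state lives on the shared storage device so every cluster node sees it; a host-local threading.Lock is only a stand-in):

        import threading

        class DeviceLock:
            """Stand-in for a lock maintained by the storage device itself."""

            def __init__(self):
                self._lock = threading.Lock()

            def try_acquire(self):
                return self._lock.acquire(blocking=False)

            def release(self):
                self._lock.release()

        def read_modify_write(lock, read, modify, write):
            """Retry until the device lock is held, then perform the update
            atomically with respect to other nodes contending for the block."""
            while not lock.try_acquire():
                pass  # a real client would back off rather than spin
            try:
                write(modify(read()))
            finally:
                lock.release()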

  12. Download this PDF file

    African Journals Online (AJOL)

    AJNS WEBMASTERS

    Incidence is higher in the elderly, about 58 per 100,000 per year. Diagnosis of CSDH is still .... in the other two patients was not stated in the case file. Evacuation of the Subdural .... Personal experience in 39 patients. Br J of Neurosurg. 2003 ...

  13. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    ...4 KB of data is read or written, data is copied back and forth using trampoline buffers (pages that are shared during proxy initialization)...

  14. 47 CFR 1.735 - Copies; service; separate filings against multiple defendants.

    Science.gov (United States)

    2010-10-01

    ... overnight delivery service such as, or comparable to, the US Postal Service Express Mail, United Parcel... 47 Telecommunication 1 2010-10-01 2010-10-01 false Copies; service; separate filings against... Complaints § 1.735 Copies; service; separate filings against multiple defendants. (a) Complaints may...

  15. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  16. Testing the Forensic Interestingness of Image Files Based on Size and Type

    Science.gov (United States)

    2017-09-01

    ...down to 0.18% (Rowe, 2015). When scanning a computer hard drive, many kinds of pictures are found. Digital images are not...

  17. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  18. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  19. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10^-5 eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the 115In(n,n')116mIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  20. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print-image tape files into six-dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed at 8 lines per inch.
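
    The same conversion is easy to express today against the Unicode Braille Patterns block, where the low six bits of the code point select dots 1-6 of the cell. A small Python sketch (the letter table covers only a-j; the 1967 program naturally predates Unicode and drove a line printer instead):

        # Dots 1-3 form the left column of a cell, dots 4-6 the right.
        # Unicode encodes a cell as U+2800 plus a 6-bit mask, bit 0 = dot 1.
        DOTS = {
            "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
            "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5), "i": (2, 4),
            "j": (2, 4, 5),
        }

        def to_braille(text):
            """Convert letters a-j to Unicode Braille cells; pass spaces through."""
            cells = []
            for ch in text.lower():
                if ch == " ":
                    cells.append("\u2800")  # blank cell
                elif ch in DOTS:
                    mask = sum(1 << (d - 1) for d in DOTS[ch])
                    cells.append(chr(0x2800 + mask))
            return "".join(cells)

        print(to_braille("bad cab"))  # -> braille cells for b-a-d, space, c-a-b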