WorldWideScience

Sample records for publications computer file

  1. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  2. Earnings Public-Use File, 2006

    Data.gov (United States)

    Social Security Administration — Social Security Administration released Earnings Public-Use File (EPUF) for 2006. File contains earnings information for individuals drawn from a systematic random...

  3. Ambulance Reasonable Charge Public Use Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Ambulance Reasonable Charge public use files for calendar years (CY) 2003 through 2005 are located in the Downloads section below. These public use files are...

  4. Chronic Condition Public Use File (PUF)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This release contains the Chronic Conditions Public Use Files (PUF) with information from Medicare claims. The CMS Chronic Conditions PUF is an aggregated file in...

  5. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)
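
    As a brief, hedged illustration of the plane-sweep idea mentioned above (not code from the paper), the Python sketch below sweeps across interval endpoints and reports the maximum number of intervals that overlap at any point; the event-queue-plus-running-status pattern is the same one used by 2-D plane-sweep algorithms.

      # Minimal plane-sweep sketch: sort the "events" (endpoints) and maintain
      # a running count of intervals currently intersected by the sweep line.
      def max_overlap(intervals):
          """Return the maximum number of simultaneously overlapping intervals."""
          events = []
          for start, end in intervals:
              events.append((start, +1))   # interval opens
              events.append((end, -1))     # interval closes
          # Process closings before openings at the same coordinate.
          events.sort(key=lambda e: (e[0], e[1]))
          active = best = 0
          for _, delta in events:
              active += delta
              best = max(best, active)
          return best

      print(max_overlap([(0, 5), (2, 7), (6, 9)]))  # -> 2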

  6. 47 CFR 61.14 - Method of filing publications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Method of filing publications. 61.14 Section 61...) TARIFFS Rules for Electronic Filing § 61.14 Method of filing publications. (a) Publications filed... date of a publication received by the Electronic Tariff Filing System will be determined by the date...

  7. Health Insurance Marketplace Public Use Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — A set of seven (7) public use files containing information on health insurance issuers participating in the Health Insurance Marketplace and certified qualified...

  8. Medicare Geographic Variation - Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Geographic Variation Public Use File provides the ability to view demographic, utilization and quality indicators at the state level (including...

  9. Value Modifier Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Center for Medicare (CM) has created a standard analytical file intended to promote transparency. For each Value Modifier performance year, CM will publish a...

  10. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
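
    A minimal sketch of the aggregation scheme described in this abstract, assuming ordinary POSIX files and JSON metadata (both illustrative choices, not taken from the patent): many small files are concatenated into one aggregated file, and an (offset, length) entry per file allows any of them to be unpacked later.

      import json, os

      def aggregate(paths, out_path, meta_path):
          """Concatenate files and record (offset, length) metadata for each."""
          metadata = {}
          with open(out_path, "wb") as out:
              for p in paths:
                  data = open(p, "rb").read()
                  metadata[os.path.basename(p)] = {"offset": out.tell(), "length": len(data)}
                  out.write(data)
          with open(meta_path, "w") as m:
              json.dump(metadata, m)

      def unpack(name, agg_path, meta_path):
          """Recover one original file from the aggregated file using the metadata."""
          meta = json.load(open(meta_path))[name]
          with open(agg_path, "rb") as agg:
              agg.seek(meta["offset"])
              return agg.read(meta["length"])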

  11. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
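
    A hedged sketch of the list-based index idea, not the patented implementation: each index entry lists the storage locations of a file and its replicas together with a checksum that is used to validate whichever copy is read. Paths and the choice of SHA-256 are illustrative assumptions.

      import hashlib

      def checksum(path):
          return hashlib.sha256(open(path, "rb").read()).hexdigest()

      def build_index(name, primary_path, replica_paths):
          return {name: {"locations": [primary_path, *replica_paths],
                         "checksum": checksum(primary_path)}}

      def read_valid_copy(index, name):
          entry = index[name]
          for loc in entry["locations"]:            # query the list of locations
              if checksum(loc) == entry["checksum"]:
                  return open(loc, "rb").read()     # first copy that validates
          raise IOError("no valid replica found for " + name)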

  12. Supplemental Security Income Public-Use Microdata File, 2001 Data

    Data.gov (United States)

    Social Security Administration — The SSI Public-Use Microdata File contains an extract of data fields from SSA's Supplemental Security Record file and consists of a 5 percent random, representative...

  13. 47 CFR 61.32 - Method of filing publications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Method of filing publications. 61.32 Section 61...) TARIFFS General Rules for Dominant Carriers § 61.32 Method of filing publications. (a) Publications sent....” The date on which the publication is received by the Secretary of the Commission (or the Mail Room...

  14. 47 CFR 61.20 - Method of filing publications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Method of filing publications. 61.20 Section 61...) TARIFFS General Rules for Nondominant Carriers § 61.20 Method of filing publications. (a) Publications... 20554.“ The date on which the publication is received by the Secretary of the Commission (or the Mail...

  15. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
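
    The DICOM "scrubbing" step mentioned above can be illustrated with the third-party pydicom package (an assumption; the article does not prescribe a specific tool). Only a few common identifying tags are cleared here.

      import pydicom

      def anonymize(in_path, out_path):
          """Blank a handful of patient-identifying header fields (illustrative list)."""
          ds = pydicom.dcmread(in_path)
          for tag in ("PatientName", "PatientID", "PatientBirthDate",
                      "ReferringPhysicianName", "InstitutionName"):
              if tag in ds:
                  setattr(ds, tag, "")      # clear the identifying value
          ds.save_as(out_path)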

  16. Basic Stand Alone Medicare Claims Public Use Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS is committed to increasing access to its Medicare claims data through the release of de-identified data files available for public use. They contain...

  17. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

    This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  18. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

    WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  19. 47 CFR 73.3527 - Local public inspection file of noncommercial educational stations.

    Science.gov (United States)

    2010-10-01

    ... main studio and public file outside its community of license shall: (i) Make available to persons... 47 Telecommunication 4 2010-10-01 2010-10-01 false Local public inspection file of noncommercial... public inspection file of noncommercial educational stations. (a) Responsibility to maintain a file. The...

  20. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  1. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  2. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system

  3. Resource file: practical publications for energy management, edition III

    Energy Technology Data Exchange (ETDEWEB)

    1980-03-01

    The Resource File is an in-depth bibliography of 166 practical and action-oriented energy conservation publications and materials. It is a reference tool, designed for Federal, state, and local energy managers or people who are asked to recommend how-to conservation guides to the public. Each listing describes a publication's intended audience and provides a summary of its contents. Included are operations and maintenance manuals, life-cycle costing handbooks, home insulation manuals, films on fuel-saving driving techniques, and courses devoted exclusively to home weatherization. 166 items.

  4. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
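
    A hedged sketch of the mechanism described in this abstract: the application supplies a parser that decides which files are stored and extracts metadata to keep alongside them. All names and the CSV example are illustrative, not from the patent.

      def store_with_parser(files, parser, storage):
          """files: {name: bytes}; parser(name, data) -> metadata dict, or None to skip."""
          for name, data in files.items():
              meta = parser(name, data)
              if meta is None:
                  continue                      # file fails the parser's semantic check
              storage[name] = {"data": data, "meta": meta}

      # Example parser: keep only CSV-like files and record their row count.
      def csv_parser(name, data):
          if not name.endswith(".csv"):
              return None
          return {"rows": data.count(b"\n")}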

  5. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Here they describe a new software project which allows the reporting and filing of roentgenograms. The program was written by a radiologist using a well-known database management system, dBASE III, and was shaped to fit the radiologist's needs: it helps to report and to file radiological data, with the diagnostic codes used by the American College of Radiology. The authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer.

  6. Old Age, Survivors, and Disability Insurance (OASDI) Public-Use Microdata File, 2001 Data

    Data.gov (United States)

    Social Security Administration — The OASDI Public-Use Microdata File contains an extract of data fields from SSA's Master Beneficiary Record file and consists of a 1 percent random, representative...

  7. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  8. Bringing computational science to the public.

    Science.gov (United States)

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  9. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
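
    A purely illustrative sketch of the multi-resolution idea, assuming the semantic information tells us the payload is a flat sequence of samples: lower-resolution replicas are produced by keeping every k-th element.

      def make_replicas(samples, strides=(1, 4, 16)):
          """Return replicas at decreasing resolution; stride 1 is the full data."""
          return {f"stride_{s}": samples[::s] for s in strides}

      replicas = make_replicas(list(range(1000)))
      # replicas["stride_16"] holds 1/16 of the data elements, a coarse preview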

  10. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

    The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e + e − collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order of magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and low quality of network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  11. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  12. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  13. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
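
    A hedged sketch of keeping semantic information with a sub-file, using a struct format string as the "data structure description" (an illustrative stand-in for the data formatting library mentioned in the abstract).

      import struct

      def write_subfile(path, records, fmt="<id"):   # e.g. int32 id + float64 value
          with open(path, "wb") as f:
              for rec in records:
                  f.write(struct.pack(fmt, *rec))
          with open(path + ".meta", "w") as m:
              m.write(fmt)                            # semantic description kept with the sub-file

      def read_subfile(path):
          fmt = open(path + ".meta").read()
          size = struct.calcsize(fmt)
          data = open(path, "rb").read()
          return [struct.unpack(fmt, data[i:i + size]) for i in range(0, len(data), size)]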

  14. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  15. 76 FR 799 - Publication of Year 2010 Form M-1 With Electronic Filing Option, Notice

    Science.gov (United States)

    2011-01-06

    ... DEPARTMENT OF LABOR Employee Benefits Security Administration Publication of Year 2010 Form M-1 With Electronic Filing Option, Notice AGENCY: Employee Benefits Security Administration, Department of... Employer Welfare Arrangements and Certain Entities Claiming Exception. It is generally identical to the...

  16. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  17. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs
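
    For readers unfamiliar with the ENDF format the abstract refers to, the hedged sketch below shows the fixed-column record layout such processing codes have to handle: each 80-column record carries six 11-character data fields plus MAT, MF and MT identifiers in fixed columns. This is a generic illustration, not part of FDMXPC.

      def read_endf_ids(path):
          """Extract the (MAT, MF, MT) identifiers from each 80-column ENDF record."""
          ids = []
          with open(path) as f:
              for line in f:
                  line = line.rstrip("\n").ljust(80)
                  mat = int(line[66:70].strip() or "0")   # material number, columns 67-70
                  mf = int(line[70:72].strip() or "0")    # file (data type) number, columns 71-72
                  mt = int(line[72:75].strip() or "0")    # section (reaction) number, columns 73-75
                  ids.append((mat, mf, mt))
          return ids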

  18. Digital publication of EDF's regulatory files

    International Nuclear Information System (INIS)

    Picard, Pauline

    2012-06-01

    The Engineering Centre of Deconstruction and Environment wants to modernize the distribution of its regulatory filings in digital form. This project requires a preliminary study, upstream of its implementation, to determine the publishing solution best suited to EDF's constraints of distribution and heritage protection.

  19. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

    A film-based teaching file takes up space, and the need to search through such a file places limits on the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, so for these reasons we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet. This was suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM-PC compatible computers (386DX, 486DX-II, and Pentium) in downloading the images and in developing a digitalized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps), and Twinsock and Netscape were used to navigate the Internet. A Korean word-processing program (version 3.0) was used to create the HTML files, and the downloaded images were linked to the HTML files. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a high-speed computer (at least a 486DX II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that these were good enough to use as a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital image teaching file using HTML is easy to use. Our method of creating a digital imaging teaching file using the Internet and HTML is easy to implement, and radiologists with little computer experience who want to study various digital radiologic imaging cases would find it easy to use.
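
    A hedged modern analogue of the approach described above: a short script that generates a simple HTML teaching-file page linking locally stored images. Directory and file names are placeholders, not taken from the article.

      import html, pathlib

      def build_page(image_dir, out_file="teaching_file.html"):
          """Write a single HTML page that lists every JPEG in image_dir."""
          rows = []
          for img in sorted(pathlib.Path(image_dir).glob("*.jpg")):
              rows.append(f'<h2>{html.escape(img.stem)}</h2>\n<img src="{img.name}" width="512">')
          page = "<html><body>\n" + "\n".join(rows) + "\n</body></html>"
          pathlib.Path(image_dir, out_file).write_text(page)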

  20. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

    Globus software must be installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page, select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop-down menu, select Transfer Files. Then click Get...

  1. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
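
    A small sketch of the block-size rule quoted in the abstract (total data divided by the number of processes); how the remainder is handled here is an illustrative assumption, not the patented data-exchange method.

      def block_layout(total_bytes, n_procs):
          """Return the block size and one (offset, length) pair per process."""
          block = total_bytes // n_procs
          # rank i writes bytes [i*block, (i+1)*block); the last rank also takes any remainder
          offsets = [(r * block, block + (total_bytes % n_procs if r == n_procs - 1 else 0))
                     for r in range(n_procs)]
          return block, offsets

      print(block_layout(1000, 4))   # block size 250 and four (offset, length) pairs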

  2. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    Full Text Available All existing file browsers display 3 timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate these temporal attributes to conceal the traces of file usage. However, every file in NTFS has 8 timestamps that are stored in the file record and can be used to detect the fact of attribute substitution. The authors suggest a method of revealing the original timestamps after replacement, and an automated variant of it for a set of files.
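
    As a hedged illustration of the kind of comparison such a method can rely on: NTFS keeps four timestamps in the $STANDARD_INFORMATION attribute and four more in the $FILE_NAME attribute of each file record, and a commonly cited heuristic flags records where the first set predates the second, since popular timestamp-forging tools alter only the first. The paper's exact detection logic is not reproduced here.

      def timestomp_suspect(si_times, fn_times):
          """si_times / fn_times: dicts of datetime values, e.g. {'created': ..., 'modified': ...}.

          Returns True if any $STANDARD_INFORMATION timestamp is earlier than the
          corresponding $FILE_NAME timestamp, a classic (though not conclusive) red flag.
          """
          return any(si_times[k] < fn_times[k] for k in si_times if k in fn_times)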

  3. De-identifying a public use microdata file from the Canadian national discharge abstract database

    Directory of Open Access Journals (Sweden)

    Paton David

    2011-08-01

    Full Text Available Abstract Background: The Canadian Institute for Health Information (CIHI) collects hospital discharge abstract data (DAD) from Canadian provinces and territories. There are many demands for the disclosure of this data for research and analysis to inform policy making. To expedite the disclosure of data for some of these purposes, the construction of a DAD public use microdata file (PUMF) was considered. Such purposes include: confirming some published results, providing broader feedback to CIHI to improve data quality, training students and fellows, providing an easily accessible data set for researchers to prepare for analyses on the full DAD data set, and serving as a large health data set for computer scientists and statisticians to evaluate analysis and data mining techniques. The objective of this study was to measure the probability of re-identification for records in a PUMF, and to de-identify a national DAD PUMF consisting of 10% of records. Methods: Plausible attacks on a PUMF were evaluated. Based on these attacks, the 2008-2009 national DAD was de-identified. A new algorithm was developed to minimize the amount of suppression while maximizing the precision of the data. The acceptable threshold for the probability of correct re-identification of a record was set at between 0.04 and 0.05. Information loss was measured in terms of the extent of suppression and entropy. Results: Two different PUMF files were produced, one with geographic information, and one with no geographic information but more clinical information. At a threshold of 0.05, the maximum proportion of records with the diagnosis code suppressed was 20%, but these suppressions represented only 8-9% of all values in the DAD. Our suppression algorithm has less information loss than a more traditional approach to suppression. Smaller regions, patients with longer stays, and age groups that are infrequently admitted to hospitals tend to be the ones with the highest rates of suppression
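
    A hedged sketch of the threshold logic described in the abstract: approximate the probability of correctly re-identifying a record as one over the size of its quasi-identifier equivalence class, and flag records above the threshold for suppression. The actual CIHI algorithm is more sophisticated; the field names below are invented.

      from collections import Counter

      def flag_for_suppression(records, quasi_ids, threshold=0.05):
          """Return records whose equivalence class is small enough to exceed the threshold."""
          classes = Counter(tuple(r[q] for q in quasi_ids) for r in records)
          return [r for r in records
                  if 1.0 / classes[tuple(r[q] for q in quasi_ids)] > threshold]

      recs = [{"age_group": "80+", "region": "North", "dx": "I21"},
              {"age_group": "40-44", "region": "South", "dx": "J45"},
              {"age_group": "40-44", "region": "South", "dx": "E11"}]
      # every toy class here has fewer than 20 records, so all three are flagged
      print(flag_for_suppression(recs, ["age_group", "region"]))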

  4. 1 CFR 5.2 - Documents required to be filed for public inspection and published.

    Science.gov (United States)

    2010-01-01

    ... inspection and published. 5.2 Section 5.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.2 Documents required to be filed for public inspection and published... Register and published in the Federal Register: (a) Presidential proclamations and Executive orders in the...

  5. 77 FR 16494 - Revised Public Utility Filing Requirements for Electric Quarterly Reports

    Science.gov (United States)

    2012-03-21

    ... at http:// www/ ferc.gov/docs-filing/efiling.asp , or via phone from FERC Online Support at 202-502... simultaneous exchange transaction involves the marketing function of a public utility transmission provider... net buyers reporting a negative price spread. \\18\\ There may be transactions in which credits or...

  6. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. It has become easier to process the plethora of data generated by various devices in real time with the advent of cloud networks. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and analyze the ramifications of using each protocol.
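
    One protocol such a survey typically covers is SFTP; the hedged example below uses the third-party paramiko library, with host, credentials and paths as placeholders (none of this is taken from the paper).

      import paramiko

      def sftp_upload(host, user, password, local_path, remote_path):
          """Upload a file over SFTP, which runs on top of an encrypted SSH session."""
          transport = paramiko.Transport((host, 22))
          transport.connect(username=user, password=password)
          sftp = paramiko.SFTPClient.from_transport(transport)
          try:
              sftp.put(local_path, remote_path)
          finally:
              sftp.close()
              transport.close()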

  7. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modality, final diagnosis, radiologic findings, references, and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 to 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply at the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purpose of a teaching file system. Without high-cost appliances, we could complete the image database system for teaching files using a personal computer with a relatively inexpensive method.
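
    A hedged sketch of the record structure described above, using the standard-library sqlite3 module in place of dBASE/FoxPro; the field names follow the abstract and the image is stored as a BLOB.

      import sqlite3

      conn = sqlite3.connect("teaching_file.db")
      conn.execute("""CREATE TABLE IF NOT EXISTS cases (
          hospital_no TEXT, name TEXT, sex TEXT, age INTEGER, exam_date TEXT,
          keyword TEXT, modality TEXT, final_diagnosis TEXT,
          findings TEXT, refs TEXT, image BLOB)""")

      def add_case(fields, image_path):
          """fields: a 10-item tuple matching the text columns above."""
          with open(image_path, "rb") as img:
              conn.execute("INSERT INTO cases VALUES (?,?,?,?,?,?,?,?,?,?,?)",
                           (*fields, img.read()))
          conn.commit()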

  8. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.
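
    A hedged modern analogue of those listings, assuming a serial link and the third-party pyserial package (port names are placeholders and error handling is omitted): numeric values are sent one per line as ASCII text.

      import serial

      def send_numbers(port, values, baud=9600):
          with serial.Serial(port, baud, timeout=1) as link:
              for v in values:
                  link.write(f"{v}\r\n".encode("ascii"))   # one number per line

      def receive_numbers(port, count, baud=9600):
          with serial.Serial(port, baud, timeout=5) as link:
              return [float(link.readline().decode().strip()) for _ in range(count)]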

  9. 77 FR 4568 - Annual Computational Science Symposium; Public Conference

    Science.gov (United States)

    2012-01-30

    ...] Annual Computational Science Symposium; Public Conference AGENCY: Food and Drug Administration, HHS... with the Pharmaceutical Users Software Exchange (PhUSE), is announcing a public conference entitled ``The FDA/PhUSE Annual Computational Science Symposium.'' The purpose of the conference is to help the...

  10. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  11. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile...... techniques. However, the establishment of CT systems traceability when measuring 3D complex geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty...

  12. Publicity in secured transactions law: Towards a European public notice filing system for non-possessory security rights in movable assets?

    NARCIS (Netherlands)

    Hamwijk, D.J.Y.

    2014-01-01

    There is a growing call to introduce a European public notice filing system for security rights in movable goods comparable to the notice filing system of Art. 9 UCC. A proposal to this effect has been adopted in Book IX DCFR, which represents a comprehensive framework of rules for proprietary

  13. To make files of projects of some non-noticeable modifications of base nuclear installations (INB) available to the public

    International Nuclear Information System (INIS)

    Vial, Eugenie

    2013-01-01

    Following the publication by the French Nuclear Safety Authority (ASN) of a decision on making files for projects of modifications available to the public, as specified in the French Code of the Environment, the author proposes an analysis of this decision. He discusses its scope of application within a legal and regulatory context at the crossroads of nuclear and environmental issues, and indicates the procedures concerned by the decision. He outlines the content of the files which the operator must transmit to the ASN: the modification request file and the public availability file. He indicates the operations which must be performed prior to public availability, the content of the notice of availability, and the conditions for informing the local information commission (CLI) and for the advertisement of public availability. He presents the various operations to be performed after the public availability.

  14. Local File Disclosure Vulnerability: A Case Study of Public-Sector Web Applications

    Science.gov (United States)

    Ahmed, M. Imran; Maruf Hassan, Md; Bhuyian, Touhid

    2018-01-01

    Almost all public-sector organisations in Bangladesh now offer online services through web applications, along with the existing channels, in their endeavour to realise the dream of a 'Digital Bangladesh'. Nations across the world have joined the online environment thanks to training and awareness initiatives by their governments. File sharing and downloading activities using web applications have now become very common, not only ensuring the easy distribution of different types of files and documents but also enormously reducing the time and effort of users. Although the online services that are being used frequently have made users' lives easier, they have increased the risk of exploitation of local file disclosure (LFD) vulnerability in the web applications of different public-sector organisations due to insecure design and careless coding. This paper analyses the root cause of LFD vulnerability, its exploitation techniques, and its impact on 129 public-sector websites in Bangladesh using a manual black-box testing approach.

  15. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  16. How You Can Protect Public Access Computers "and" Their Users

    Science.gov (United States)

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  17. Computer-Assisted Management of Instruction in Veterinary Public Health

    Science.gov (United States)

    Holt, Elsbeth; And Others

    1975-01-01

    Reviews a course in Food Hygiene and Public Health at the University of Illinois College of Veterinary Medicine in which students are sequenced through a series of computer-based lessons or autotutorial slide-tape lessons, the computer also being used to route, test, and keep records. Since grades indicated mastery of the subject, the course will…

  18. Public computing options for individuals with cognitive impairments: survey outcomes.

    Science.gov (United States)

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  19. Code and papers: computing publication patterns in the LHC era

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  20. Public Services 2.0: The Impact of Social Computing on Public Services

    NARCIS (Netherlands)

    Huijboom, Noor; Broek, Tijs Van Den; Frissen, Valerie; Kool, Linda; Kotterink, Bas; Nielsen, Morten Meyerhoff; Millard, Jeremy

    2009-01-01

    The report gives an overview of the main trends of Social Computing, in the wider context of an evolving public sector, and in relation to relevant government trends and normative policy visions within and across EU Member States on future public services. It then provides an exhaustive literature

  1. 77 FR 16026 - Cargill Power Markets, LLC v. Public Service Company of New Mexico; Notice of Compliance Filing

    Science.gov (United States)

    2012-03-19

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. EL10-61-002] Cargill Power Markets, LLC v. Public Service Company of New Mexico; Notice of Compliance Filing Take notice that on March 12, 2012, Public Service Company of New Mexico and Cargill Power Markets, LLC submitted a...

  2. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  3. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  4. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  5. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  6. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

    Full Text Available Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect an individual's use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and for regulatory bodies.

  7. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Science.gov (United States)

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... GENERAL PROVISIONS § 201.26 Recordation of documents pertaining to computer shareware and donation of public domain computer software. (a) General. This section prescribes the procedures for submission of...

  8. 76 FR 40961 - Public Company Accounting Oversight Board; Notice of Filing of Proposed Temporary Rule for an...

    Science.gov (United States)

    2011-07-12

    ... of the Board Section 1. General Provisions * * * Rule 1001. Definitions of Terms Employed in Rules... Accounting Oversight Board; Notice of Filing of Proposed Temporary Rule for an Interim Program of Inspection... Act of 2002 (the ``Act''), notice is hereby given that on June 21, 2011, the Public Company Accounting...

  9. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare the canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections, from the apical, mid-root, and coronal levels of the canal, were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and the Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation, followed by the PP system. The PP system showed significantly more transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to computational chemistry. We have created the first web portal (www.chemsem.com) that allows data created in quantum chemistry and other such chemistry calculations to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used, as an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only a start of what must be a long multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a new potential file standard, the Common Standard for eXchange (CSX), for computational chemistry data. This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the Semantic Web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that become part of the graph database that the Semantic Web employs. We propose the CSX file as a convenient way to encapsulate computational chemistry data.
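
    A hedged illustration of the kind of query such a portal enables, using the third-party rdflib package; the file name and predicate URIs are placeholders, not the actual Gainesville Core terms.

      import rdflib

      g = rdflib.Graph()
      g.parse("calculation.rdf")     # RDF produced from a CSX upload (placeholder file name)

      query = """
      SELECT ?molecule ?energy WHERE {
          ?calc <http://example.org/onto#hasMolecule> ?molecule ;
                <http://example.org/onto#totalEnergy> ?energy .
      }
      """
      for molecule, energy in g.query(query):
          print(molecule, energy)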

  11. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  12. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  13. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.
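
    To make the conversion idea concrete, the following minimal Python sketch translates one simplified MCNP spherical surface card into a KENO-style sphere record. It is only an illustration of the mapping problem, not the proposed package: real decks carry cells, mixtures, units, and transformations that a production converter (and the criticality specialist reviewing its output) must handle.

      # Illustrative only: translate a simplified MCNP "so" surface card
      # ("j so R" -- sphere at the origin with radius R) into a KENO-like
      # sphere record. The "1" below is a placeholder for the fields (mixture,
      # bias id) a real KENO geometry record would carry.

      def mcnp_sphere_to_keno(card: str) -> str:
          fields = card.split()
          if len(fields) != 3 or fields[1].lower() != "so":
              raise ValueError(f"not a simple 'so' sphere card: {card!r}")
          surf_id, radius = fields[0], float(fields[2])
          return f"sphere {surf_id} 1 {radius:.4f}"

      print(mcnp_sphere_to_keno("10 so 5.0"))   # -> sphere 10 1 5.0000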

  14. Evidence of Discrimination in Public Accommodations Based on Sexual Orientation and Gender Identity: An Analysis of Complaints Filed with State Enforcement Agencies, 2008-2014

    OpenAIRE

    Mallory, Christy; Sears, Brad

    2016-01-01

    LGBT people file public accommodations discrimination complaints based on sexual orientation and gender identity as frequently as people of color and women file complaints based on race and sex. This study examines complaints filed with state enforcement agencies based on sexual orientation or gender identity, race, and sex, adjusting them by the number of adults most likely to experience each type of discrimination – LGBT people, people of color, and women. Data on discrimination complaint...

  15. Modification to the Monte N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  16. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O' Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  17. 76 FR 38382 - Public Service Company of Colorado; Notice of Filing

    Science.gov (United States)

    2011-06-30

    ... entities into PSCo. Additionally, PSCo requested a waiver of the Commission's accounting regulations to... electronic submission of protests and interventions in lieu of paper using the "eFiling" link at http://www...

  18. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography -An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

    Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. The purpose of this study is to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in mandibular first molars using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). For canal transportation evaluated for each file system over a distance of 7 mm (starting from 0 mm and then evaluated at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029, respectively). The mean cervical dentin removal was greatest at all levels for OneShape and least for WaveOne, indicating the better performance of WaveOne and Reciproc compared with OneShape. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001 at each level). It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness.

  19. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  20. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan'14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan'14, an ensemble of conformational isomers was obtained. To convert the str...
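
    A minimal sketch of the file-writing step such a protocol automates is shown below: given one conformer's Cartesian coordinates, it emits a Gaussian 09 input file requesting GIAO shielding tensors. The Link0 settings, method, and basis set are illustrative assumptions, not the choices prescribed by the protocol, and the geometry optimization would normally be run as its own preceding job.

      def write_gaussian_nmr_input(name, charge, multiplicity, atoms, path):
          """atoms: list of (symbol, x, y, z) tuples in Angstroms."""
          lines = [
              "%nprocshared=4",                 # illustrative Link0 resource requests
              "%mem=4GB",
              "# B3LYP/6-31G(d) NMR=GIAO",      # illustrative method/basis; GIAO shielding tensors
              "",
              f"{name} - NMR shielding calculation for one conformer",
              "",
              f"{charge} {multiplicity}",
          ]
          lines += [f"{sym:2s} {x:12.6f} {y:12.6f} {z:12.6f}" for sym, x, y, z in atoms]
          lines.append("")                      # Gaussian inputs end with a blank line
          with open(path, "w") as fh:
              fh.write("\n".join(lines) + "\n")

      # Placeholder coordinates only -- a real conformer exported from Spartan'14 would supply these.
      write_gaussian_nmr_input("conf_001", 0, 1,
                               [("C", 0.000000, 0.000000, 0.000000),
                                ("O", 1.430000, 0.000000, 0.000000)],
                               "conf_001_nmr.gjf")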

  1. Ethics, big data and computing in epidemiology and public health.

    Science.gov (United States)

    Salerno, Jennifer; Knoppers, Bartha M; Lee, Lisa M; Hlaing, WayWay M; Goodman, Kenneth W

    2017-05-01

    This article reflects on the activities of the Ethics Committee of the American College of Epidemiology (ACE). Members of the Ethics Committee identified an opportunity to elaborate on knowledge gained since the inception of the original Ethics Guidelines published by the ACE Ethics and Standards of Practice Committee in 2000. The ACE Ethics Committee presented a symposium session at the 2016 Epidemiology Congress of the Americas in Miami on the evolving complexities of ethics and epidemiology as it pertains to "big data." This article presents a summary and further discussion of that symposium session. Three topic areas were presented: the policy implications of big data and computing, the fallacy of "secondary" data sources, and the duty of citizens to contribute to big data. A balanced perspective is needed that provides safeguards for individuals but also furthers research to improve population health. Our in-depth review offers next steps for teaching of ethics and epidemiology, as well as for epidemiological research, public health practice, and health policy. To address contemporary topics in the area of ethics and epidemiology, the Ethics Committee hosted a symposium session on the timely topic of big data. Technological advancements in clinical medicine and genetic epidemiology research coupled with rapid advancements in data networks, storage, and computation at a lower cost are resulting in the growth of huge data repositories. Big data increases concerns about data integrity; informed consent; protection of individual privacy, confidentiality, and harm; data reidentification; and the reporting of faulty inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, data on all reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The range of application extends widely, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α, n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and gas production cross section file have already been completed. For the others, the expected time of completion is shown. When these files are completed, they will be made available to the public. (K.I.)

  3. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Purpose: A computational system was developed for this paper in the C++ programming language to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system regarding each seed's location and depth, and an entry file is automatically created with all the cards (instructions) for each seed, with their cell blocks and surfaces spread out spatially in the 3D environment. The system reproduces the clinical scenario with precision in the MCNPX code's simulation environment, thereby allowing an in-depth study of the technique. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds inserted in the prostate region of the MAX06 phantom, with the initial activity of the seeds set to 0.27 mCi. Isodose curves were obtained in all the prostate slices in 5 mm steps in the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cut-offs at 4 keV and forced collisions in cells of interest. The isodose curves obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of correct source positioning, which the computational system developed here provides, in order not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume perfectly, at the same time that it demonstrates a large
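
    The sketch below illustrates the core idea of such a generator in Python rather than C++: turning a list of seed coordinates from the planning system into MCNP-style surface and cell cards. The card layout, numbering, material, and density are placeholder assumptions; a real entry file also needs source definitions, material cards, importances, and the surrounding voxel phantom.

      def seed_cards(seeds, radius=0.04, material=1, density=-4.9):
          """seeds: list of (x, y, z) positions in cm; returns (cell_cards, surface_cards)."""
          cells, surfaces = [], []
          for i, (x, y, z) in enumerate(seeds, start=1):
              sid = 9000 + i                    # placeholder surface/cell numbering
              # general sphere surface: "S x y z R"
              surfaces.append(f"{sid} s {x:.3f} {y:.3f} {z:.3f} {radius:.3f}")
              # cell filled with a placeholder material inside that sphere
              cells.append(f"{sid} {material} {density} -{sid} imp:p=1")
          return cells, surfaces

      cells, surfaces = seed_cards([(1.0, 2.0, 7.5), (1.5, 2.0, 7.5)])
      print("\n".join(cells))
      print("\n".join(surfaces))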

  4. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  5. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of 'copyright industries' that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party's legislative input. The essay then examines the central issues raised by illegal file sharing, in particular the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party's proposal is a viable solution to the music industry's problems.

  6. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study is to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A (rotary): A1 - Neolix and A2 - OneShape, and Group B (reciprocating): B1 - WaveOne and B2 - Reciproc. Preoperative scans of each tooth were taken, followed by conventional access cavity preparation and working length determination with a #10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, which was measured on four surfaces, i.e., facial, lingual, mesial, and distal, for all the groups in the two scans. There was no significant difference found between the rotary single-file systems and the reciprocating single-file systems in their effect on PCD, but in Group B2 there was the most significant loss of tooth structure on the mesial, lingual, and distal surfaces. The Reciproc file system removed more PCD compared with the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  7. A quantitative evaluation of the relative status of journal and conference publications in computer science.

    OpenAIRE

    Coyle, Lorcan; Freyne, Jill; Smyth, Barry; Cunningham, Padraig

    2010-01-01

    While it is universally held by computer scientists that conference publications have a higher status in computer science than in other disciplines there is little quantitative evidence in support of this position. The importance of journal publications in academic promotion makes this a big issue since an exclusive focus on journal papers will miss many significant papers published at conferences in computer science. In this paper we set out to quantify the relative importance of journ...

  8. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial of the software Espectro de Raios-X (ERX). This software reads and performs numerical and graphical analysis of text files containing diagnostic X-ray spectra for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. ERX allows the user to select one among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software can build charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and of the spectrum. In addition, ERX allows the user to make comparative analyses between the PDF graphics of the two catalogs of spectra available and to perform dosimetric evaluations with the selected spectrum. A software tool of this kind is important for researchers in numerical dosimetry because of the diversity of diagnostic X-ray machines, which implies highly diverse input data. ERX therefore gives the group independence with respect to the origin of the data contained in the catalogs it creates, without the need to resort to external sources. (author)
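
    A minimal sketch of the spectrum processing such a tool performs is given below, assuming a simple two-column text file (energy in keV, relative counts); it normalizes the spectrum into a PDF and accumulates the CDF, the two functions the abstract says ERX charts and tabulates. The file layout and names are assumptions, not the actual ERX formats.

      import numpy as np

      def spectrum_pdf_cdf(path):
          energy, counts = np.loadtxt(path, unpack=True)
          pdf = counts / counts.sum()          # normalize so the PDF sums to 1
          cdf = np.cumsum(pdf)                 # running total; cdf[-1] == 1.0
          return energy, pdf, cdf

      # energy, pdf, cdf = spectrum_pdf_cdf("spectrum_80kVp.txt")   # hypothetical file name
      # A sampling routine in an ECM would draw u ~ U(0,1) and pick the first bin
      # with cdf >= u (inverse-transform sampling).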

  9. Public policy and regulatory implications for the implementation of Opportunistic Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning; Henten, Anders

    2012-01-01

    Opportunistic Cloud Computing Services (OCCS) is a social network approach to the provisioning and management of cloud computing services for enterprises. This paper discusses how public policy and regulations will impact on OCCS implementation. We rely on documented, publicly available government and corporate policies on the adoption of cloud computing services and deduce the impact of these policies on the adoption of opportunistic cloud computing services. We conclude that there are regulatory challenges on data protection that raise issues for cloud computing adoption in general, and that the lack of a single globally accepted data protection standard poses challenges for a very successful implementation of OCCS for companies. However, the direction of current public and corporate policies on cloud computing makes a good case for companies to try out opportunistic cloud computing services.

  10. Computer self efficacy as correlate of on-line public access ...

    African Journals Online (AJOL)

    The use of Online Public Access Catalogue (OPAC) by students has a lot of advantages and computer self-efficacy is a factor that could determine its effective utilization. Little appears to be known about colleges of education students' use of OPAC, computer self-efficacy and the relationship between OPAC and computer ...

  11. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers rely mainly on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called the directed tree with linked leaf model is proposed for further implementation in data structures and algorithms. An extensive analysis is given to justify the soundness and completeness of F2AC.
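
    The sketch below illustrates the general flavour of a directed-tree access model with iterative authorization and revocation; all names and fields are invented for illustration, and it is not the paper's construction, which additionally covers tailored policies, privilege transition, and linked leaves to stored files.

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          name: str
          children: list = field(default_factory=list)   # groups/users this node authorized
          revoked: bool = False

          def grant(self, child: "Node") -> "Node":      # iterative authorization
              self.children.append(child)
              return child

      def may_access(holder: Node, user: str) -> bool:
          """True if 'user' is reachable from the file holder via non-revoked nodes."""
          if holder.revoked:
              return False
          if holder.name == user:
              return True
          return any(may_access(child, user) for child in holder.children)

      owner = Node("owner")
      team = owner.grant(Node("team"))
      guest = team.grant(Node("guest"))
      print(may_access(owner, "guest"))   # True: owner -> team -> guest chain is intact
      team.revoked = True
      print(may_access(owner, "guest"))   # False: revoking the intermediate group cuts access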

  12. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key...
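
    A hedged sketch of the generic pattern such a design can rely on is shown below: encrypt each cached file under a fresh symmetric key and wrap that key with the intended reader's public key, so the untrusted cache host never sees plaintext and sharing does not need a global authentication framework. This is an illustration of the idea, not CryptoCache's actual protocol.

      from cryptography.fernet import Fernet
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)

      def cache_encrypt(plaintext: bytes, reader_public_key):
          file_key = Fernet.generate_key()                   # per-file symmetric key
          ciphertext = Fernet(file_key).encrypt(plaintext)   # what the cache host stores
          wrapped_key = reader_public_key.encrypt(file_key, oaep)
          return ciphertext, wrapped_key

      def cache_decrypt(ciphertext: bytes, wrapped_key: bytes, reader_private_key):
          file_key = reader_private_key.decrypt(wrapped_key, oaep)
          return Fernet(file_key).decrypt(ciphertext)

      reader = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      ct, wk = cache_encrypt(b"working notes", reader.public_key())
      assert cache_decrypt(ct, wk, reader) == b"working notes"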

  13. The Battle to Secure Our Public Access Computers

    Science.gov (United States)

    Sendze, Monique

    2006-01-01

    Securing public access workstations should be a significant part of any library's network and information-security strategy because of the sensitive information patrons enter on these workstations. As the IT manager for the Johnson County Library in Kansas City, Kan., this author is challenged to make sure that thousands of patrons get the access…

  14. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  15. Integrating publicly-available data to generate computationally ...

    Science.gov (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar
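
    The graph-based workflow can be pictured with the small sketch below, which links a reference chemical, perturbed pathways, and an outcome as typed edges and then extracts the surrounding subnetwork by topology. The node names, edge sources, and path cutoff are invented placeholders, not entities from the TG-GATEs or ToxCast analyses.

      import networkx as nx

      g = nx.Graph()
      g.add_edge("CCl4", "oxidative_stress_pathway", source="gene_expression")
      g.add_edge("oxidative_stress_pathway", "hepatic_steatosis", source="literature")
      g.add_edge("CCl4", "PPAR_signaling", source="high_throughput_screen")
      g.add_edge("PPAR_signaling", "hepatic_steatosis", source="frequent_itemset")

      # Extract a putative cpAOP subnetwork: every node on a short path between
      # the reference chemical and the outcome of interest.
      paths = nx.all_simple_paths(g, "CCl4", "hepatic_steatosis", cutoff=3)
      sub = g.subgraph({node for path in paths for node in path})
      print(sorted(sub.nodes()))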

  16. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-adjusting File and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area and structure model index, transportation, uninstrumented area and the correlation between working time and curvature were analyzed. Although no statistically significant difference was observed between the two systems in the distal canals (p>0.05), a significantly higher amount of removed dentin volume and a lower uninstrumented area were obtained with ProTaper in the mesial canals (p<0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF r²=0.792, p<0.0004; PTU r²=0.9098, p<0.0001).

  17. 78 FR 20313 - American Municipal Power, Inc.; Michigan Public Power Agency; Notice of Filing

    Science.gov (United States)

    2013-04-04

    ... requirement for reactive supply and voltage control from generation or other sources service (Fremont Energy... Commission's Public Reference Room in Washington, DC. There is an "eSubscription" link on the Web site that...

  18. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to substitute the computer terms in foreign languages that had been used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). After sixteen years, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation was conducted to discover the extent to which official computer terms are used in scientific publications written in Bahasa Indonesia. The data source for this observation is publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting using the Open Archive Initiative - Protocol for Metadata Harvesting (OAI-PMH). The second is converting the harvested documents (in PDF format) to plain text. The third stage is text preprocessing in preparation for string matching. The final stage is searching for the official computer terms, based on the 629 SPI terms, using the Boyer-Moore algorithm. We observed that there are 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
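
    The string-matching stage can be illustrated with the simplified Boyer-Moore sketch below (bad-character rule only); it is a teaching version, not the authors' code, and the example text and SPI-style terms are placeholders.

      def boyer_moore_find(text: str, pattern: str) -> list:
          last = {ch: i for i, ch in enumerate(pattern)}   # rightmost position of each char
          m, n = len(pattern), len(text)
          hits, s = [], 0
          while s <= n - m:
              j = m - 1
              while j >= 0 and pattern[j] == text[s + j]:
                  j -= 1
              if j < 0:
                  hits.append(s)        # full match at shift s
                  s += 1
              else:
                  # bad-character rule: align the mismatched text character with its
                  # rightmost occurrence in the pattern (or skip past it entirely)
                  s += max(1, j - last.get(text[s + j], -1))
          return hits

      print(boyer_moore_find("tetikus dan papan ketik", "papan ketik"))   # [12]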

  19. School Survey on Crime and Safety (SSOCS) 2000 Public-Use Data Files, User's Manual, and Detailed Data Documentation. [CD-ROM].

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    This CD-ROM contains the raw, public-use data from the 2000 School Survey on Crime and Safety (SSOCS) along with a User's Manual and Detailed Data Documentation. The data are provided in SAS, SPSS, STATA, and ASCII formats. The User's Manual and the Detailed Data Documentation are provided as .pdf files. (Author)

  20. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    This paper presents an analysis of the research articles published in SACJ over the first 40 volumes of the journal, using the ACM Computing Classification Scheme as basis. In their analysis the authors divided the publications into three cycles of more or less six years in order to identify...

  1. Social science. Publication bias in the social sciences: unlocking the file drawer.

    Science.gov (United States)

    Franco, Annie; Malhotra, Neil; Simonovits, Gabor

    2014-09-19

    We studied publication bias in the social sciences by analyzing a known population of conducted studies--221 in total--in which there is a full accounting of what is published and unpublished. We leveraged Time-sharing Experiments in the Social Sciences (TESS), a National Science Foundation-sponsored program in which researchers propose survey-based experiments to be run on representative samples of American adults. Because TESS proposals undergo rigorous peer review, the studies in the sample all exceed a substantial quality threshold. Strong results are 40 percentage points more likely to be published than are null results and 60 percentage points more likely to be written up. We provide direct evidence of publication bias and identify the stage of research production at which publication bias occurs: Authors do not write up and submit null findings. Copyright © 2014, American Association for the Advancement of Science.

  2. Request queues for interactive clients in a shared file system of a parallel computing system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin

    2015-08-18

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
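
    A minimal sketch of the queueing idea is shown below: separate queues collect interactive and batch metadata requests, and a scheduler drains them into the single metadata queue under a simple allocation policy (here, a fixed weight favouring interactive requests). The policy and request strings are illustrative assumptions, not the patented mechanism.

      from collections import deque

      interactive_q, batch_q, metadata_q = deque(), deque(), deque()

      def enqueue(request, interactive: bool):
          (interactive_q if interactive else batch_q).append(request)

      def schedule(interactive_weight: int = 3):
          """Merge client queues into the metadata queue, favouring interactive requests."""
          while interactive_q or batch_q:
              for _ in range(interactive_weight):
                  if interactive_q:
                      metadata_q.append(interactive_q.popleft())
              if batch_q:
                  metadata_q.append(batch_q.popleft())

      enqueue("stat /home/alice", True)
      enqueue("readdir /scratch/job42", False)
      enqueue("open /home/alice/notes.txt", True)
      schedule()
      print(list(metadata_q))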

  3. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc., or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program provides the researcher, manager, or librarian with a means of maintaining a list and index to documents in a small library, reprint collection, or office file.
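
    The permutation step behind a KWIC index can be sketched in a few lines of modern Python (clearly not the original program): every non-stopword in a title becomes an index entry showing its surrounding context, and the entries are sorted by keyword.

      STOPWORDS = {"a", "an", "and", "for", "of", "the", "to", "in", "on"}

      def kwic_index(titles):
          entries = []
          for title in titles:
              words = title.split()
              for i, word in enumerate(words):
                  if word.lower() in STOPWORDS:
                      continue
                  left, right = " ".join(words[:i]), " ".join(words[i + 1:])
                  entries.append((word.lower(), f"{left} [{word}] {right}".strip()))
          return sorted(entries)                      # alphabetical by keyword

      for key, context in kwic_index(["Geothermal energy files in computer storage"]):
          print(f"{key:12s} {context}")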

  4. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore verification calculations were focused on a catalytic cracking reactor which was developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudo-reaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  5. The Administrative Impact of Computers on the British Columbia Public School System.

    Science.gov (United States)

    Gibbens, Trevor P.

    This case study analyzes and evaluates the administrative computer systems in the British Columbia public school organization in order to investigate the costs and benefits of computers, their impact on managerial work, their influence on centralization in organizations, and the relationship between organizational objectives and the design of…

  6. The Impact of Social Computing on Public Services : a Rationale for Government 2 . 0

    NARCIS (Netherlands)

    Broek, Tijs Van Den; Frissen, Valerie; Huijboom, Noor; Punie, Yves

    2010-01-01

    In this article the impact of the fast emerging social computing trend on the public sector is explored. This exploration is based on the results of a study commissioned by the Institute for Prospective Technological Studies (IPTS). Three cases of social computing initiatives in diverse

  7. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    Science.gov (United States)

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  8. 12 CFR 345.43 - Content and availability of public file.

    Science.gov (United States)

    2010-01-01

    ..., banking by telephone or computer, loan production offices, and bank-at-work or bank-by-mail programs); (6... contained within the area, either on the map or in a separate list; and (7) Any other information the bank... report Home Mortgage Disclosure Act (HMDA) data. A bank required to report home mortgage loan data...

  9. 12 CFR 25.43 - Content and availability of public file.

    Science.gov (United States)

    2010-01-01

    ... exclusively for the bank, banking by telephone or computer, loan production offices, and bank-at-work or bank... identifying the geographies contained within the area, either on the map or in a separate list; and (7) Any... the OCC. (2) Banks required to report Home Mortgage Disclosure Act (HMDA) data. A bank required to...

  10. 12 CFR 228.43 - Content and availability of public file.

    Science.gov (United States)

    2010-01-01

    ... for the bank, banking by telephone or computer, loan production offices, and bank-at-work or bank-by... the geographies contained within the area, either on the map or in a separate list; and (7) Any other.... (2) Banks required to report Home Mortgage Disclosure Act (HMDA) data. A bank required to report home...

  11. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological researches, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: • Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic process, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. • Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. • The development, evaluation and validation of software and

  12. Public Services 2.0: The Impact of Social Computing on Public Services

    OpenAIRE

    Punie, Y.; Misuraca, G.; Osimo, D.; Huijboom, N.; Broek, T.A. van den; Frissen, V.; Kool, L.

    2010-01-01

    Since 2003, the Internet has seen impressive growth in user-driven applications such as blogs, podcasts, wikis and social networking sites. This trend is referred to here as ‘social computing’ as online applications increasingly support the creation of value by social networks of people. The social computing trend has been recognised and monitored by the Institute for Prospective and Technological Studies (IPTS) over the past few years. IPTS observed a viral take up of social computing applic...

  13. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than that of both the RR and MR groups at the cervical and apical levels. File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
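
    Because the abstract does not reproduce the byte layout itself, the sketch below only illustrates how one might parse a binary spectrum record with Python's struct module; the field order, types, and sizes are invented and are not the real PCF specification.

      import struct

      RECORD_HEADER = "<64s f I"   # hypothetical: title, live time (s), channel count

      def read_spectrum(fh):
          header = fh.read(struct.calcsize(RECORD_HEADER))
          title, live_time, n_channels = struct.unpack(RECORD_HEADER, header)
          counts = struct.unpack(f"<{n_channels}f", fh.read(4 * n_channels))
          return title.rstrip(b"\0").decode("ascii"), live_time, list(counts)

      # with open("measurement.pcf", "rb") as fh:   # hypothetical file name
      #     title, live_time, counts = read_spectrum(fh)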

  15. The Importance of Computer Science for Public Health Training: An Opportunity and Call to Action.

    Science.gov (United States)

    Kunkle, Sarah; Christie, Gillian; Yach, Derek; El-Sayed, Abdulrahman M

    2016-01-01

    A century ago, the Welch-Rose Report established a public health education system in the United States. Since then, the system has evolved to address emerging health needs and integrate new technologies. Today, personalized health technologies generate large amounts of data. Emerging computer science techniques, such as machine learning, present an opportunity to extract insights from these data that could help identify high-risk individuals and tailor health interventions and recommendations. As these technologies play a larger role in health promotion, collaboration between the public health and technology communities will become the norm. Offering public health trainees coursework in computer science alongside traditional public health disciplines will facilitate this evolution, improving public health's capacity to harness these technologies to improve population health.

  16. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be most challenging due to their complex anatomy. "Wave one" and "one shape" are single-file systems with reciprocating and rotary motion respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of wave one and one shape files in primary root canals using a cone beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation times were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave one showed less canal transportation as compared with one shape, and the mean instrumentation time of wave one was significantly less than one shape. Reciprocating single-file systems was found to be faster with much less procedural errors and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  19. 76 FR 67418 - Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing...

    Science.gov (United States)

    2011-11-01

    ...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing... Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is... (USG) agencies to accelerate their adoption of cloud computing. The roadmap has been developed through...

  20. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Use of rotary nickel-titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in all sections. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.

  1. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background: Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. Aim: The purpose of this study is to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in mandibular first molars using cone beam computed tomography. Materials and Methods: Sixty mandibular first molars extracted due to periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. Results: The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). For canal transportation evaluated for each file system over a distance of 7 mm (starting from 0 mm and then evaluated at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029, respectively). The mean cervical dentin removal was greatest at all levels for OneShape and least for WaveOne, indicating the better performance of WaveOne and Reciproc compared with OneShape. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001 at each level). Conclusion: It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness. PMID:26023639

  2. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  3. HEDIS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Healthcare Effectiveness Data and Information Set (HEDIS) is a tool used by more than 90 percent of America's health plans to measure performance on important...

  4. Leveraging Cloud Computing to Address Public Health Disparities: An Analysis of the SPHPS.

    Science.gov (United States)

    Jalali, Arash; Olabode, Olusegun A; Bell, Christopher M

    2012-01-01

    As the use of certified electronic health record technology (CEHRT) has continued to gain prominence in hospitals and physician practices, public health agencies and health professionals have the ability to access health data through health information exchanges (HIE). With such knowledge health providers are well positioned to positively affect population health, and enhance health status or quality-of-life outcomes in at-risk populations. Through big data analytics, predictive analytics and cloud computing, public health agencies have the opportunity to observe emerging public health threats in real-time and provide more effective interventions addressing health disparities in our communities. The Smarter Public Health Prevention System (SPHPS) provides real-time reporting of potential public health threats to public health leaders through the use of a simple and efficient dashboard and links people with needed personal health services through mobile platforms for smartphones and tablets to promote and encourage healthy behaviors in our communities. The purpose of this working paper is to evaluate how a secure virtual private cloud (VPC) solution could facilitate the implementation of the SPHPS in order to address public health disparities.

  5. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are worked out, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components, and the bioinformatics software services are published to remote users. Finally, a basic prototype system of the biological cloud is achieved. PMID:24078906

  6. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are worked out, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components, and the bioinformatics software services are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  7. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) – full-sequence rotary control group, OneShape OS (group II) – single file continuous rotation, WaveOne WO – single file reciprocating motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey’s honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal

  8. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO - single file reciprocating motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence

  9. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
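
    The trade-off bookkeeping such a tool performs can be illustrated with a toy portfolio calculator. The sketch below is not the CEDM tool described above; the technology list, emission factors and costs are placeholder values chosen only to show how a mix of generation shares maps to projected CO2 emissions and cost.

        # Illustrative toy portfolio calculator (not the CEDM tool); all factors are placeholders.
        # Emission factors in kg CO2 per MWh, costs in $ per MWh.
        FACTORS = {
            "coal_ccs":    {"co2": 150, "cost": 120},
            "natural_gas": {"co2": 400, "cost": 70},
            "nuclear":     {"co2": 10,  "cost": 100},
            "wind":        {"co2": 12,  "cost": 80},
            "efficiency":  {"co2": 0,   "cost": 40},
        }

        def evaluate_portfolio(shares, demand_mwh=1_000_000):
            """Return (tonnes of CO2, total cost in $) for generation shares that sum to 1."""
            if abs(sum(shares.values()) - 1.0) > 1e-9:
                raise ValueError("shares must sum to 1")
            co2_kg = sum(demand_mwh * s * FACTORS[t]["co2"] for t, s in shares.items())
            cost = sum(demand_mwh * s * FACTORS[t]["cost"] for t, s in shares.items())
            return co2_kg / 1000.0, cost

        if __name__ == "__main__":
            mix = {"coal_ccs": 0.1, "natural_gas": 0.2, "nuclear": 0.3, "wind": 0.3, "efficiency": 0.1}
            tonnes, dollars = evaluate_portfolio(mix)
            print(f"CO2: {tonnes:,.0f} t, cost: ${dollars:,.0f}")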

  10. 6th International Workshop on Computer-Aided Scheduling of Public Transport

    CERN Document Server

    Branco, Isabel; Paixão, José

    1995-01-01

    This proceedings volume consists of papers presented at the Sixth International Workshop on Computer-Aided Scheduling of Public Transport, which was held at the Fundação Calouste Gulbenkian in Lisbon from July 6th to 9th, 1993. In the tradition of alternating Workshops between North America and Europe - Chicago (1975), Leeds (1980), Montreal (1983), Hamburg (1987) and again Montreal (1990), the European city of Lisbon was selected as the venue for the Workshop in 1993. As in earlier Workshops, the central theme dealt with vehicle and duty scheduling problems and the employment of operations-research-based software systems for operational planning in public transport. However, as was initiated in Hamburg in 1987, the scope of this Workshop was broadened to include topics in related fields. This fundamental alteration was an inevitable consequence of the growing demand over the last decade for solutions to the complete planning process in public transport through integrated systems. Therefore, the program of thi...

  11. 76 FR 12397 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...

  12. 76 FR 40950 - Public Company Accounting Oversight Board; Notice of Filing of Proposed Board Funding Final Rules...

    Science.gov (United States)

    2011-07-12

    ... available, the issuer's net asset value. (i)(v) Issuer Accounting Support Fee The term ``issuer accounting... Accounting Oversight Board; Notice of Filing of Proposed Board Funding Final Rules for Allocation of the Board's Accounting Support Fee Among Issuers, Brokers, and Dealers, and Other Amendments to the Board's...

  13. 18 CFR 4.32 - Acceptance for filing or rejection; information to be made available to the public; requests for...

    Science.gov (United States)

    2010-04-01

    ...; and (B) The entities identified in paragraph (a)(2) of this section, as well as any other Federal... alleged in the application or other materials filed, be subscribed and verified under oath in the form set... petition. The applicant or petitioner must serve one copy of the application or petition on the Director of...

  14. THE USE OF COMPUTER APPLICATIONS IN THE STUDY OF ROMANIA'S PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Popeanga Vasile

    2011-07-01

    Full Text Available Total public debt represents all monetary obligations of the state (government, public institutions, financial and administrative-territorial units) at a given time, resulting from internal and external loans (in lei and foreign currencies) contracted on short, medium and long term, as well as the state treasury's own obligations for the amounts advanced temporarily to cover the budget deficit. Loans may be contracted by the state through the Ministry of Finance, in its own name or guaranteed by it. Public debt is expressed in local currency or foreign currency, depending on the contract and loan conditions. In order to evaluate Romania's public debt, obligations denominated in a currency other than the national currency are converted using the exchange rate of the National Bank of Romania. Also, the total public debt of a country can be expressed in absolute values (to know the burden that country's economy bears towards its creditors), in relative values as a percentage of GDP (to allow comparison over time and between countries) and as the average size per capita (to allow comparisons and analysis in time and space). Total public debt is calculated and managed separately in its two forms, namely domestic public debt and external public debt. The Ministry of Finance prepares and submits annually, to the Government for approval and to Parliament for information, a report on public debt, which contains information on the government debt portfolio, debt service, public indebtedness indicators, information about the primary and secondary markets for state securities, and how the medium-term strategy for managing government debt was implemented in the previous year. In order to make quick and effective comparisons of public debt dynamics in Romania, Excel 2010 offers new features such as sparklines and slicers, which can help discover trends and statistics in the existing data. The aim of this article is an accurate assessment of Romania's public debt and its
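
    The indicators mentioned above (absolute debt, debt as a percentage of GDP, and debt per capita, with foreign-currency obligations converted at the central bank exchange rate) reduce to a few lines of arithmetic. The sketch below uses entirely hypothetical figures and is independent of the Excel workflow described in the article.

        # Minimal sketch of the debt indicators described above; all figures are hypothetical.
        def total_debt_in_lei(domestic_debt_lei, external_debt_by_currency, exchange_rates):
            """Convert external obligations to lei at the given exchange rates and add domestic debt."""
            external_lei = sum(amount * exchange_rates[ccy]
                               for ccy, amount in external_debt_by_currency.items())
            return domestic_debt_lei + external_lei

        def debt_indicators(total_debt, gdp, population):
            return {"absolute": total_debt,
                    "percent_of_gdp": 100.0 * total_debt / gdp,
                    "per_capita": total_debt / population}

        if __name__ == "__main__":
            debt = total_debt_in_lei(120e9, {"EUR": 30e9, "USD": 10e9}, {"EUR": 4.9, "USD": 4.5})
            print(debt_indicators(debt, gdp=1.2e12, population=19e6))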

  15. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of architectures and studying the performance of a parallel computing environment for Monte Carlo simulation in particle therapy planning, using a high-performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed a speed approximately 28 times faster than that of a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
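
    The reported 28-fold speed-up rests on the embarrassingly parallel nature of Monte Carlo histories. The sketch below is not the authors' particle-therapy code; it uses a toy kernel as a stand-in for particle histories to show how independent batches can be farmed out to worker processes and how the speed-up is measured.

        # Toy illustration of parallel Monte Carlo batches and speed-up measurement.
        import random, time
        from multiprocessing import Pool

        def histories(n):
            """Toy Monte Carlo kernel: count hits inside the unit quarter circle."""
            rng = random.Random()
            return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

        def run(total, workers):
            start = time.perf_counter()
            chunk = total // workers
            if workers == 1:
                hits = histories(chunk)
            else:
                with Pool(workers) as pool:
                    hits = sum(pool.map(histories, [chunk] * workers))
            elapsed = time.perf_counter() - start
            return 4.0 * hits / (chunk * workers), elapsed   # estimate of pi, wall-clock time

        if __name__ == "__main__":
            est1, t1 = run(2_000_000, workers=1)
            estN, tN = run(2_000_000, workers=4)
            print(f"serial {t1:.2f}s, parallel {tN:.2f}s, speedup {t1 / tN:.1f}x")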

  16. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the usage of the SAF in primary teeth. The aim of this study was to evaluate and compare the use of the SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amount of dentin removed using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for the K file, .04/30 for the Profile, and 2 mm thickness, size 25 for the SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. The amount of dentin removed was measured on the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data were statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal when compared to rotary instrumentation (Profile and SAF), while the SAF system generated significantly less dentin removal than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (p<0.05 for comparisons between systems). Within the experimental conditions of the present study, the SAF seems a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less

  17. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU), respectively, to #25/0.08 following the recommended protocols. Time for root canal instrumentation (accumulation of the time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25× magnification and the presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional images of the canals were reconstructed, calculated and evaluated, and the amount of canal central transportation in the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). Canal transportation in the TF group was also smaller than that in the PU group [(0.097 ± 0.084) mm in the PU group, P<0.05]. No instrument separation was observed in either group. Cracks were not found in either group, whether on micro-CT images or under an optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability and less canal transportation.

  18. Societes Eolienne du Camelia - Unique authorisation request presented by the Societe Eoliennes du Camelia company in view of exploitation of a wind farm comprising six wind turbines and two supply stations located on the territory of the commune of Reboursin. Complete file, Public observations

    International Nuclear Information System (INIS)

    2017-01-01

    This document first contains a Prefect decree authorising a public inquiry, the opinion produced by the public inquiry, the opinion of the environmental authority, the response of the operator to this opinion, and decrees extending the duration of the public inquiry. Then comes the complete file elaborated for the wind farm project: request file, architecture file, map, impact studies (ecology, landscape, sound, hazards), and plans

  19. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    Full Text Available This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund the article discusses the experience of digitally scripted spaces, and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  20. High-performance parallel computing in the classroom using the public goods game as an example

    Science.gov (United States)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
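
    For readers who want to see the Monte Carlo dynamics before moving to a graphics card, a serial sketch of the spatial public goods game is given below. It follows the standard setup (groups of five on a square lattice with periodic boundaries, a synergy factor r, and a Fermi strategy-update rule with noise K); the parameter values are illustrative, and the paper's documented GPU source code remains the reference implementation.

        # Serial CPU sketch of the spatial public goods game (illustrative parameters).
        import math, random

        L = 40          # lattice size (L x L, periodic boundaries)
        r = 3.8         # synergy factor multiplying the common pool
        K = 0.5         # noise in the Fermi update rule
        STEPS = 100     # full Monte Carlo steps

        random.seed(1)
        strat = [[random.randint(0, 1) for _ in range(L)] for _ in range(L)]  # 1 = cooperator

        def neighbors(x, y):
            return [((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L)]

        def payoff(x, y):
            """Total payoff of player (x, y) from the five groups it belongs to."""
            total = 0.0
            for gx, gy in [(x, y)] + neighbors(x, y):           # groups centered on self and neighbors
                members = [(gx, gy)] + neighbors(gx, gy)
                pool = sum(strat[i][j] for i, j in members)     # contributions of cooperators
                total += r * pool / len(members) - strat[x][y]  # share of the pot minus own contribution
            return total

        for step in range(STEPS):
            for _ in range(L * L):                              # one MC step = L*L elementary updates
                x, y = random.randrange(L), random.randrange(L)
                nx, ny = random.choice(neighbors(x, y))
                if strat[x][y] != strat[nx][ny]:
                    prob = 1.0 / (1.0 + math.exp((payoff(x, y) - payoff(nx, ny)) / K))
                    if random.random() < prob:                  # Fermi rule: imitate the neighbor
                        strat[x][y] = strat[nx][ny]
            if step % 10 == 0:
                rho = sum(map(sum, strat)) / (L * L)
                print(f"step {step:4d}  cooperator fraction {rho:.3f}")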

  1. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
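
    The reference statistics discussed above can be reproduced from any trace of (day, file) reference records. The sketch below is a minimal illustration under that assumed trace format: it computes uses per file, mean interreference intervals, and their coefficient of variation, a simple proxy for the skewness the paper reports.

        # Minimal interreference-interval analysis of a (day, file_id) reference trace.
        from collections import defaultdict
        from statistics import mean, pstdev

        def interreference_stats(trace):
            """trace: iterable of (day, file_id) pairs, in chronological order."""
            last_ref, intervals, uses = {}, defaultdict(list), defaultdict(int)
            for day, fid in trace:
                uses[fid] += 1
                if fid in last_ref:
                    intervals[fid].append(day - last_ref[fid])
                last_ref[fid] = day
            stats = {}
            for fid, ivals in intervals.items():
                m = mean(ivals)
                cv = pstdev(ivals) / m if m > 0 else float("nan")  # CV > 1 suggests a skewed process
                stats[fid] = {"uses": uses[fid], "mean_interval": m, "cv": cv}
            return stats

        if __name__ == "__main__":
            demo = [(1, "a"), (2, "b"), (3, "a"), (9, "a"), (30, "a"), (31, "b")]
            for fid, s in interreference_stats(demo).items():
                print(fid, s)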

  2. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...

  3. RISE OF BIOINFORMATICS AND COMPUTATIONAL BIOLOGY IN INDIA: A LOOK THROUGH PUBLICATIONS

    Directory of Open Access Journals (Sweden)

    Anjali Srivastava

    2017-09-01

    Full Text Available Computational biology and bioinformatics have been part and parcel of biomedical research for a few decades now. However, the institutionalization of bioinformatics research took place with the establishment of Distributed Information Centres (DISCs) in research institutions of repute across various disciplines by the Department of Biotechnology, Government of India. Though at the initial stages this endeavor was mainly focused on providing infrastructure for using information technology and internet-based communication and tools for carrying out computational biology and in-silico assisted research in varied arenas, from disease biology to agricultural crops, spices, veterinary science and many more, the natural outcome of establishing such facilities was new experiments with bioinformatics tools. Thus, the Biotechnology Information System (BTIS) grew into a solid movement and a large number of publications started coming out of these centres. At the end of the last century, bioinformatics started developing into a full-fledged research subject. In the last decade, a need was felt to make a factual estimation of the results of this endeavor of the DBT, which had, by then, established about two hundred centres in almost all disciplines of biomedical research. In a bid to evaluate the efforts and outcome of these centres, the BTIS Centre at CSIR-CDRI, Lucknow was entrusted with collecting and collating the publications of these centres. However, when the full data was compiled, the DBT task force felt that the study must include non-BTIS centres as well, so as to expand the report to give a glimpse of bioinformatics publications from the country.

  4. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE USING RECORDS AND DONATED... for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  5. SPEEDI: a computer code system for the real-time prediction of radiation dose to the public due to an accidental release

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi; Ishikawa, Hirohiko

    1985-10-01

    SPEEDI, a computer code system for the prediction of environmental doses from radioactive materials accidentally released from a nuclear plant, has been developed to assist the organizations responsible for emergency planning. For realistic simulation, a model has been developed which statistically predicts the basic wind data and then calculates the three-dimensional mass-consistent wind field by interpolating these predicted data, together with a model for calculating the diffusion of released materials using a combination of the random-walk and PICK methods. These calculations are carried out in conversational mode with a computer so that the system can be used with ease in an emergency. SPEEDI also has versatile files, which make it easy to control the complicated flow of calculations. In order to attain a short computation time, a large-scale computer with a performance of 25 MIPS and a vector processor with a peak of 250 MFLOPS are used for calculation of the models, so that quick responses are obtained. Simplified models are also prepared for calculation on a minicomputer of the kind widely used by local governments and research institutes, although the same precision of calculation as with the above models cannot be expected. The present report outlines the structure and functions of SPEEDI, the methods for prediction of the wind field, the models for calculation of the concentration of released materials in air and on the ground, and the doses to the public. Some of the diffusion models have been compared with the field experiments which had been carried out as a part of the SPEEDI development program. The report also discusses the reliability of the diffusion models on the basis of the compared results, and shows that they can reasonably simulate the diffusion in the internal boundary layer which commonly occurs near coastal regions. (J.P.N.)
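
    As a rough illustration of the diffusion model class described above (and emphatically not the SPEEDI code itself), the sketch below advects particles with a placeholder wind field, adds Gaussian random-walk displacements, and counts particles on a grid in the spirit of a particle-in-cell concentration estimate.

        # Toy random-walk dispersion step; the wind field and all parameters are placeholders.
        import random

        def wind(x, y):
            """Placeholder interpolated wind field (m/s); a real system would use predicted data."""
            return 3.0, 1.0

        def step(particles, dt=60.0, sigma=20.0):
            out = []
            for x, y in particles:
                u, v = wind(x, y)
                out.append((x + u * dt + random.gauss(0.0, sigma),
                            y + v * dt + random.gauss(0.0, sigma)))
            return out

        def concentration(particles, cell=500.0, nx=20, ny=20):
            grid = [[0] * nx for _ in range(ny)]
            for x, y in particles:
                i, j = int(x // cell), int(y // cell)
                if 0 <= i < nx and 0 <= j < ny:
                    grid[j][i] += 1            # particles per cell ~ relative air concentration
            return grid

        if __name__ == "__main__":
            random.seed(0)
            puff = [(0.0, 0.0)] * 5000         # release at the origin
            for _ in range(30):                # thirty one-minute steps
                puff = step(puff)
            grid = concentration(puff)
            print("max particles in a cell:", max(max(row) for row in grid))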

  6. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that the microcrack lengths were notably increased following root canal preparation (P<0.05) with all files. Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  7. ICRP Publication 116—the first ICRP/ICRU application of the male and female adult reference computational phantoms

    CERN Document Server

    Petoussi-Henss, Nina; Eckerman, Keith F; Endo, Akira; Hertel, Nolan; Hunt, John; Menzel, Hans G; Pelliccioni, Maurizio; Schlattl, Helmut; Zankl, Maria

    2014-01-01

    ICRP Publication 116 on `Conversion coefficients for radiological protection quantities for external radiation exposures', provides fluence-to-dose conversion coefficients for organ-absorbed doses and effective dose for various types of external exposures (ICRP 2010 ICRP Publication 116). The publication supersedes the ICRP Publication 74 (ICRP 1996 ICRP Publication 74, ICRU 1998 ICRU Report 57), including new particle types and expanding the energy ranges considered. The coefficients were calculated using the ICRP/ICRU computational phantoms (ICRP 2009 ICRP Publication 110) representing the reference adult male and reference adult female (ICRP 2002 ICRP Publication 89), together with a variety of Monte Carlo codes simulating the radiation transport in the body. Idealized whole-body irradiation from unidirectional and rotational parallel beams as well as isotropic irradiation was considered for a large variety of incident radiations and energy ranges. Comparison of the effective doses with operational quantit...

  8. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    Full Text Available Due to the extensive social influence, public health emergency has attracted great attention in today’s society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event’s social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  9. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    Science.gov (United States)

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  10. Bringing numerous methods for expression and promoter analysis to a public cloud computing service.

    Science.gov (United States)

    Polanski, Krzysztof; Gao, Bo; Mason, Sam A; Brown, Paul; Ott, Sascha; Denby, Katherine J; Wild, David L

    2018-03-01

    Every year, a large number of novel algorithms are introduced to the scientific community for a myriad of applications, but using these across different research groups is often troublesome, due to suboptimal implementations and specific dependency requirements. This does not have to be the case, as public cloud computing services can easily house tractable implementations within self-contained dependency environments, making the methods easily accessible to a wider public. We have taken 14 popular methods, the majority related to expression data or promoter analysis, developed these up to a good implementation standard and housed the tools in isolated Docker containers which we integrated into the CyVerse Discovery Environment, making these easily usable for a wide community as part of the CyVerse UK project. The integrated apps can be found at http://www.cyverse.org/discovery-environment, while the raw code is available at https://github.com/cyversewarwick and the corresponding Docker images are housed at https://hub.docker.com/r/cyversewarwick/. info@cyverse.warwick.ac.uk or D.L.Wild@warwick.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  11. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.
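
    The framework's filtering and transformation steps amount to mapping heterogeneous source records onto one common schema before joint analysis. The sketch below is a schematic illustration only, not the authors' framework; the field names in the mapping and in the demo record are invented for the example.

        # Schematic harmonisation of records from heterogeneous sources into a common schema.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class SafetyRecord:
            source: str      # e.g., "FAERS", "PubMed", "Twitter"
            drug: str
            event: str
            date: str
            text: str

        def harmonise(raw_items: List[dict], source: str, mapping: dict) -> List[SafetyRecord]:
            """Map source-specific field names onto the common schema."""
            return [SafetyRecord(source=source,
                                 drug=item.get(mapping["drug"], "").lower(),
                                 event=item.get(mapping["event"], "").lower(),
                                 date=item.get(mapping["date"], ""),
                                 text=item.get(mapping["text"], ""))
                    for item in raw_items]

        def select(records: List[SafetyRecord], drug: str, event_terms: List[str]) -> List[SafetyRecord]:
            terms = [t.lower() for t in event_terms]
            return [r for r in records
                    if r.drug == drug.lower()
                    and any(t in r.event or t in r.text.lower() for t in terms)]

        if __name__ == "__main__":
            raw = [{"drug_name": "Clozapine", "reaction": "Myocarditis",
                    "report_date": "2016-05-01", "narrative": ""}]
            records = harmonise(raw, "FAERS",
                                {"drug": "drug_name", "event": "reaction",
                                 "date": "report_date", "text": "narrative"})
            print(select(records, "clozapine", ["myocarditis", "cardiomyopathy"]))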

  12. Associations between neck musculoskeletal complaints and work related factors among public service computer workers in Kaunas

    Directory of Open Access Journals (Sweden)

    Gintaré Kaliniene

    2013-10-01

    Full Text Available Objectives:Information technologies have been developing very rapidly, also in the case of occupational activities. Epidemiological studies have shown that employees, who work with computers, are more likely to complain of musculoskeletal disorders (MSD. The aim of this study was to evaluate associations between neck MSD and individual and work related factors. Materials and Methods: The investigation which consisted of two parts - a questionnaire study (using Nordic Musculoskeletal questionnaire and Copenhagen Psychosocial Questionnaire and a direct observation (to evaluate ergonomic work environment using RULA method was carried out in three randomly selected public sector companies of Kaunas. The study population consisted of 513 public service office workers. Results: The survey showed that neck MSDs were very common in the investigated population. The prevalence rate amounted to 65.7%. According to our survey neck MSDs were significantly associated with older age, bigger work experience, high quantitative and cognitive job demands, working for longer than 2 h without taking a break as well as with higher ergonomic risk score. The fully adjusted model working for longer than 2 h without taking a break had the strongest associations with neck complaints. Conclusion: It was confirmed, that neck MSDs were significantly associated with individual factors as well as conditions of work, therefore, preventive acions against neck complaints should be oriented at psychosocial and ergonomic work environment as well as at individual factors.

  13. Documentation for the NCES Common Core of Data National Public Education Financial Survey (NPEFS), School Year 2008-09 (Fiscal Year 2009). Revised File Version 1b. NCES 2011-330rev

    Science.gov (United States)

    Cornman, Stephen Q.; Zhou, Lei; Nakamoto, Nanae

    2012-01-01

    This documentation is for the revised file (Version 1b) of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) National Public Education Financial Survey (NPEFS) for school year 2008-2009, fiscal year 2009 (FY 09). It contains a brief description of the data collection along with information required to understand and…

  14. panMetaDocs, eSciDoc, and DOIDB - an infrastructure for the curation and publication of file-based datasets for 'GFZ Data Services'

    Science.gov (United States)

    Ulbricht, Damian; Elger, Kirsten; Bertelmann, Roland; Klump, Jens

    2016-04-01

    With the foundation of DataCite in 2009 and the technical infrastructure installed in the last six years it has become very easy to create citable dataset DOIs. Nowadays, dataset DOIs are increasingly accepted and required by journals in reference lists of manuscripts. In addition, DataCite provides usage statistics [1] of assigned DOIs and offers a public search API to make research data count. By linking related information to the data, they become more useful for future generations of scientists. For this purpose, several identifier systems, as ISBN for books, ISSN for journals, DOI for articles or related data, Orcid for authors, and IGSN for physical samples can be attached to DOIs using the DataCite metadata schema [2]. While these are good preconditions to publish data, free and open solutions that help with the curation of data, the publication of research data, and the assignment of DOIs in one software seem to be rare. At GFZ Potsdam we built a modular software stack that is made of several free and open software solutions and we established 'GFZ Data Services'. 'GFZ Data Services' provides storage, a metadata editor for publication and a facility to moderate minted DOIs. All software solutions are connected through web APIs, which makes it possible to reuse and integrate established software. Core component of 'GFZ Data Services' is an eSciDoc [3] middleware that is used as central storage, and has been designed along the OAIS reference model for digital preservation. Thus, data are stored in self-contained packages that are made of binary file-based data and XML-based metadata. The eSciDoc infrastructure provides access control to data and it is able to handle half-open datasets, which is useful in embargo situations when a subset of the research data are released after an adequate period. The data exchange platform panMetaDocs [4] makes use of eSciDoc's REST API to upload file-based data into eSciDoc and uses a metadata editor [5] to annotate the files

  15. 47 CFR 1.1704 - Station files.

    Science.gov (United States)

    2010-10-01

    ... System (COALS) § 1.1704 Station files. Applications, notifications, correspondence, electronic filings... Television Relay Service (CARS) are maintained by the Commission in COALS and the Public Reference Room...

  16. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
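
    The computation the article describes is a straightforward sum of direct and indirect costs. The sketch below is a hypothetical worked example; the staff levels, hourly rates, space and utility figures are invented and would be replaced with a library's own records.

        # Hypothetical worked example of the tax form distribution cost computation.
        def tax_form_distribution_cost(staff_hours, space_sqft, utility_rate_per_sqft,
                                       pr_cost, supplies_cost):
            """staff_hours: {staff_level: (hours, hourly_rate_including_benefits)}"""
            staff_cost = sum(hours * rate for hours, rate in staff_hours.values())
            space_cost = space_sqft * utility_rate_per_sqft   # seasonal display/storage space
            return staff_cost + space_cost + pr_cost + supplies_cost

        if __name__ == "__main__":
            cost = tax_form_distribution_cost(
                staff_hours={"librarian": (40, 32.0), "clerk": (120, 18.5)},
                space_sqft=150, utility_rate_per_sqft=2.4,
                pr_cost=250.0, supplies_cost=90.0)
            print(f"Estimated seasonal cost of tax form distribution: ${cost:,.2f}")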

  17. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    ‘File sharing’ has become generally accepted on the Internet. Users share files for downloading music, films, games, software etc. In this note, we have a closer look at the definition of file sharing, the legal and policy-based context as well as enforcement issues. The economic and cultural

  18. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide path to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections on 360° have been acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of the maximum curvature level (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDRs) and the ratio of cross-sectional areas (RAs) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and of canal curvature factor as well as the interactions of the factors both with RDRs and RAs. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause less canal aberrations. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  19. Publicity.

    Science.gov (United States)

    Chisholm, Joan

    Publicity for preschool cooperatives is described. Publicity helps produce financial support for preschool cooperatives. It may take the form of posters, brochures, newsletters, open house, newspaper coverage, and radio and television. Word of mouth and general good will in the community are the best avenues of publicity that a cooperative nursery…

  20. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.
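
    The binary layout itself is defined in the report and is not reproduced here. As a generic illustration of the kind of cross-platform binary access a format like PFF enables, the sketch below reads and writes a purely hypothetical fixed-endianness header (magic string, version, dataset count); none of these fields should be taken as the actual PFF structure.

        # Generic fixed-endianness binary header example (hypothetical layout, not PFF).
        import struct

        HEADER = struct.Struct(">4sHI")   # 4-byte magic, uint16 version, uint32 dataset count

        def write_demo(path):
            with open(path, "wb") as fh:
                fh.write(HEADER.pack(b"DEMO", 1, 3))

        def read_header(path):
            with open(path, "rb") as fh:
                magic, version, count = HEADER.unpack(fh.read(HEADER.size))
            if magic != b"DEMO":
                raise ValueError("not a DEMO file")
            return version, count

        if __name__ == "__main__":
            write_demo("demo.bin")
            print(read_header("demo.bin"))   # -> (1, 3)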

  1. The national public's values and interests related to the Arctic National Wildlife Refuge: A computer content analysis

    Science.gov (United States)

    David N. Bengston; David P. Fan; Roger Kaye

    2010-01-01

    This study examined the national public's values and interests related to the Arctic National Wildlife Refuge. Computer content analysis was used to analyze more than 23,000 media stories about the refuge from 1995 through 2007. Ten main categories of Arctic National Wildlife Refuge values and interests emerged from the analysis, reflecting a diversity of values,...

  2. A Computer Scientist’s Evaluation of Publically Available Hardware Trojan Benchmarks

    Science.gov (United States)

    2015-09-01

    in the Verilog file round.v. This module represents the tenth round of the AES encryption process. This round is similar to previous rounds. It...the round keys used during the AES encryption process. Note that the last two round keys are not leaked, but we assume that the attacker either...undermine the effectiveness of a 128-bit AES encryption circuit. These circuits are labelled AES-T100 through AES-T2100. The naming convention

  3. CIGEO public debate. Presentation of the Citizens' Conference - Press file. Presentation of the citizens panel opinion

    International Nuclear Information System (INIS)

    Hermitte, Marie-Angele; Bedu, Clemence; Besnus, Francois; Brom, Jean-Marie; Grambow, Bernd; Ruedinger, Andreas; Fourniau, Jean-Michel; Bobbio, Luigi; Blatrix, Cecile

    2013-01-01

    After a presentation of the organisation and implementation of the Citizens' Conference, a presentation of the members of the steering committee and of the assessment committee of this conference, this document presents the opinion of the citizens panel in the framework of the public debate on the Cigeo project of deep underground disposal of radioactive wastes. This opinion notably outlines the waste issue as an inter-generational issue, comments on the calendar and conditions of the Cigeo project, states the panel's opinion on risks which are specific to the Cigeo project, discusses the issues of recoverability and reversibility, discusses the issue of site memory, evokes the possibility of exploitation of geothermal energy, outlines the importance of health and environmental monitoring, comments on opportunities for local development, and discusses cost and financing assessments. An appendix presents the different training programmes proposed during the citizens' conference.

  4. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Science.gov (United States)

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  5. 77 FR 26509 - Notice of Public Meeting-Cloud Computing Forum & Workshop V

    Science.gov (United States)

    2012-05-04

    ...--Cloud Computing Forum & Workshop V AGENCY: National Institute of Standards & Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop V to be held on Tuesday... workshop. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology...

  6. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Science.gov (United States)

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  7. File: nuclear safety and transparency

    International Nuclear Information System (INIS)

    Martinez, J.P.; Etchegoyen, A.; Jeandron, C.

    2001-01-01

    Several experiences of nuclear safety and transparency are related in this file. Public information, access to documents, and transparency in nuclear regulation are among the subjects developed in this debate. (N.C.)

  8. In-vitro Assessing the Shaping Ability of Three Nickel-Titanium Rotary Single File Systems by Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Ali Imad Al-Asadi

    2018-02-01

    Full Text Available The aim of the study was to evaluate the canal transportation and centering ability of three nickel-titanium single-file rotary systems by cone beam computed tomography (CBCT). Materials and methods: Thirty permanent maxillary first molars with mesiobuccal canal curvatures ranging from 20 to 30 degrees were selected and assigned to three groups (n=10) according to the biomechanical preparation system used: Hyflex EDM (HF), Reciproc Blue (RB) and OneShape (OS). The samples were scanned by CBCT after being mounted on a customized acrylic base and then rescanned after instrumentation. Axial slices were taken from both exposures at 3 mm, 6 mm and 9 mm from the root apex, corresponding to the apical, middle and coronal thirds respectively. Data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests at the 5% level. Results: The results showed that there were no significant differences at the apical and coronal thirds and a significant difference at the middle third regarding canal transportation. However, there was a significant difference at the apical third and no significant difference at the middle and coronal thirds regarding centering ratio. Conclusion: It was concluded that all three single-file rotary systems produced a degree of canal transportation and deviation in centering ratio, but Hyflex EDM showed the least.

  9. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and the t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of both systems was comparable.
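
    Both of the preceding studies report canal transportation and a centering ratio measured from matched pre- and post-instrumentation cross-sections. A formulation commonly attributed to Gambill et al. is sketched below; the individual papers may use variants, so the code is illustrative rather than a restatement of their methods.

        # Illustrative computation of canal transportation and centering ratio from one cross-section.
        def transportation(a1, a2, b1, b2):
            """a1/a2: pre/post canal-to-root-edge distance on one side (mm), b1/b2: opposite side."""
            return (a1 - a2) - (b1 - b2)        # 0 means no transportation; positive values
                                                # indicate more dentine removed toward the 'a' side

        def centering_ratio(a1, a2, b1, b2):
            da, db = a1 - a2, b1 - b2           # dentine removed toward each side
            lo, hi = sorted((da, db))
            return 1.0 if hi == 0 else lo / hi  # 1.0 indicates a perfectly centered preparation

        if __name__ == "__main__":
            print(transportation(1.20, 1.05, 0.90, 0.85))              # 0.10 mm
            print(round(centering_ratio(1.20, 1.05, 0.90, 0.85), 2))   # 0.33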

  10. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte- and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the class room. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.
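
    To give a flavour of the models such a simulator is built from, the sketch below estimates the read time of a file striped round-robin over object servers, with completion gated by the slowest server. It is a deliberately tiny stand-in, not the simulator described above.

        # Toy model: round-robin striping over object servers, limited by the slowest server.
        def striped_read_time(file_size, stripe_size, servers):
            """servers: list of per-server bandwidths in bytes/s."""
            per_server = [0] * len(servers)
            offset, idx = 0, 0
            while offset < file_size:
                chunk = min(stripe_size, file_size - offset)
                per_server[idx % len(servers)] += chunk
                offset += chunk
                idx += 1
            return max(bytes_ / bw for bytes_, bw in zip(per_server, servers) if bw > 0)

        if __name__ == "__main__":
            # 1 GiB file, 1 MiB stripes, four servers where one straggler runs at half speed
            t = striped_read_time(2**30, 2**20, [500e6, 500e6, 500e6, 250e6])
            print(f"modelled read time: {t:.2f} s")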

  11. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    The File Transfer Protocol (FTP) helps users transfer files between computers on the Internet, but FTP cannot meet users' needs on some special occasions, so programmers need to define their own file transfer protocols based on the users' requirements. The method of realization and the application of such a user-defined file transfer protocol are introduced. (authors)
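
    As an illustration of what such a user-defined protocol can look like, the sketch below frames each transfer as a length-prefixed header (file name and payload size) followed by the raw bytes over a plain TCP socket. This framing is an assumption for illustration only; the paper does not specify its protocol, and the host, port and helper names are hypothetical.

      # Illustrative user-defined file transfer protocol: 2-byte name length,
      # file name, 8-byte payload size, then the file contents (not the paper's protocol).
      import os
      import socket
      import struct

      def send_file(path, host="127.0.0.1", port=9000):
          name = os.path.basename(path).encode("utf-8")
          size = os.path.getsize(path)
          with socket.create_connection((host, port)) as sock, open(path, "rb") as f:
              sock.sendall(struct.pack("!H", len(name)) + name + struct.pack("!Q", size))
              while chunk := f.read(65536):
                  sock.sendall(chunk)

      def recv_exact(conn, n):
          # Read exactly n bytes, since recv() may return short reads.
          buf = b""
          while len(buf) < n:
              part = conn.recv(n - len(buf))
              if not part:
                  raise ConnectionError("peer closed the connection early")
              buf += part
          return buf

      def receive_file(port=9000, out_dir="."):
          with socket.create_server(("0.0.0.0", port)) as srv:
              conn, _ = srv.accept()
              with conn:
                  name_len, = struct.unpack("!H", recv_exact(conn, 2))
                  name = recv_exact(conn, name_len).decode("utf-8")
                  size, = struct.unpack("!Q", recv_exact(conn, 8))
                  with open(os.path.join(out_dir, os.path.basename(name)), "wb") as out:
                      remaining = size
                      while remaining:
                          chunk = conn.recv(min(65536, remaining))
                          if not chunk:
                              break
                          out.write(chunk)
                          remaining -= len(chunk)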

  12. Public Computer Assisted Learning Facilities for Children with Visual Impairment: Universal Design for Inclusive Learning

    Science.gov (United States)

    Siu, Kin Wai Michael; Lam, Mei Seung

    2012-01-01

    Although computer assisted learning (CAL) is becoming increasingly popular, people with visual impairment face greater difficulty in accessing computer-assisted learning facilities. This is primarily because most of the current CAL facilities are not visually impaired friendly. People with visual impairment also do not normally have access to…

  13. Public census data on CD-ROM at Lawrence Berkeley Laboratory. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1992-07-02

    In connection with the Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL) is using public socioeconomic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 67 CD-ROM diskettes (approximately 35 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and most pertain to the 1990 Census of Population and Housing. This paper contains a list of the CD-ROMs available.

  14. Public census data on CD-ROM at Lawrence Berkeley Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1992-07-02

    In connection with the Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL) is using public socioeconomic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 67 CD-ROM diskettes (approximately 35 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and most pertain to the 1990 Census of Population and Housing. This paper contains a list of the CD-ROMs available.

  15. 42 CFR 430.63 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Filing and service of papers. 430.63 Section 430.63... Conformity of State Medicaid Plans and Practice to Federal Requirements § 430.63 Filing and service of papers. (a) Filing. All papers in the proceedings are filed with the CMS Docket Clerk, in an original and two...

  16. 45 CFR 1386.85 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Filing and service of papers. 1386.85 Section 1386... Requirements General § 1386.85 Filing and service of papers. (a) All papers in the proceedings must be filed... transcripts of testimony need be filed. (b) Copies of papers in the proceedings must be served on all parties...

  17. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
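
    The parity idea described above can be illustrated in a few lines of code: a client's write stream is cut into equal-sized fragments, the fragments are XORed into a parity fragment, and any single lost fragment can be rebuilt from the survivors plus the parity. This is a generic RAID-style illustration of the principle, not Zebra's implementation.

      # Illustrative only: parity over a stripe of equal-sized fragments,
      # and recovery of one missing fragment from the survivors plus the parity.
      from functools import reduce

      def parity(fragments):
          """XOR equal-length byte fragments into a single parity fragment."""
          return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), fragments)

      stripe = [b"aaaa", b"bbbb", b"cccc"]      # one client's write stream, cut into fragments
      p = parity(stripe)                        # would be stored on a separate parity server
      lost = 1                                  # pretend the server holding fragment 1 failed
      survivors = [f for i, f in enumerate(stripe) if i != lost]
      recovered = parity(survivors + [p])       # XOR of the survivors and the parity
      assert recovered == stripe[lost]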

  18. Selected Publications in Image Understanding and Computer Vision from 1974 to 1983

    Science.gov (United States)

    1985-04-18

    The available excerpt of this report is a fragment of its front matter: it cites conference proceedings such as Reconnaissance des Formes et Intelligence Artificielle (2nd AFCET-IRIA Congress, Toulouse) and a Plenum (New York, 1979) proceedings volume, and lists the bibliography's abbreviations, including AI (Artificial Intelligence), BC (Biological Cybernetics), CACM (Communications of the ACM), CG (Computer Graphics), PACM (Proceedings of the ACM), P-IEEE (Proceedings of the IEEE), P-NCC (Proceedings of the National Computer Conference), PR (Pattern Recognition) and PRL.

  19. Comparing ProFile Vortex to ProTaper Next for the efficacy of removal of root filling material: An ex vivo micro-computed tomography study

    Directory of Open Access Journals (Sweden)

    Emad AlShwaimi

    2018-01-01

    Conclusion: Our findings suggest that PV is as effective as PTN for removal of root canal filling material. Therefore, PV can be considered for use in endodontic retreatment, although more effective files or techniques are still required.

  20. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters, including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level
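
    The compartmental dynamics that such a simulation engine advances can be illustrated with a deliberately simplified, single-population stochastic SIR step. This is only a toy illustration of the modelling idea: GLEaM itself is a metapopulation scheme coupling many subpopulations through mobility data, and the rates below are arbitrary assumptions.

      # Toy single-population stochastic SIR step (not the GLEaM engine itself).
      import random

      def sir_step(s, i, r, beta=0.3, gamma=0.1):
          n = s + i + r
          new_inf = sum(random.random() < beta * i / n for _ in range(s))  # infections this step
          new_rec = sum(random.random() < gamma for _ in range(i))         # recoveries this step
          return s - new_inf, i + new_inf - new_rec, r + new_rec

      state = (9990, 10, 0)
      for day in range(120):
          state = sir_step(*state)
      print("Final S, I, R:", state)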

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. Computer-Based Video Instruction to Teach Students with Intellectual Disabilities to Use Public Bus Transportation

    Science.gov (United States)

    Mechling, Linda; O'Brien, Eileen

    2010-01-01

    This study investigated the effectiveness of computer-based video instruction (CBVI) to teach three young adults with moderate intellectual disabilities to push a "request to stop bus signal" and exit a city bus in response to target landmarks. A multiple probe design across three students and one bus route was used to evaluate effectiveness of…

  3. Computers in Education: An Overview. Publication Number One. Software Engineering/Education Cooperative Project.

    Science.gov (United States)

    Collis, Betty; Muir, Walter

    The first of four major sections in this report presents an overview of the background and evolution of computer applications to learning and teaching. It begins with the early attempts toward "automated teaching" of the 1920s, and the "teaching machines" of B. F. Skinner of the 1940s through the 1960s. It then traces the…

  4. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  5. XML Files

    Science.gov (United States)

    MedlinePlus XML Files (https://medlineplus.gov/xml.html): MedlinePlus produces XML data sets that you are welcome to download ...

  6. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  7. Factors Influencing the Adoption of and Business Case for Cloud Computing in the Public Sector

    NARCIS (Netherlands)

    Kuiper, E.; Van Dam, F.; Reiter, A.; Janssen, M.F.W.H.A.

    2014-01-01

    Cloud adoption in the public sector is taking off slowly, which is perceived as a problem. Models of factors influencing cloud adoption are derived for better understanding using literature and results obtained via desk research and surveys by the Cloud for Europe project. We conclude that several

  8. Energy expenditure of three public and three home based active computer games in children.

    NARCIS (Netherlands)

    Simons, M.; de Vries, S.I.; Jongert, M.W.A.

    2014-01-01

    The purpose of this study was to assess the energy expenditure (EE) experienced by children when playing six active video games, which can be used in a home environment and in a public setting (e.g. game center), and to evaluate whether the intensity of playing these games can meet the threshold for

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, a live collaboration notebook via Google Docs, and better support for languages other than Python.
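
    The notebook-conversion step mentioned above can also be scripted against the nbconvert library; the sketch below exports a notebook to a standalone HTML page. The file name analysis.ipynb is only an assumed example.

      # Minimal sketch: convert a notebook document to HTML with nbconvert.
      import nbformat
      from nbconvert import HTMLExporter

      nb = nbformat.read("analysis.ipynb", as_version=4)       # load the notebook document
      body, resources = HTMLExporter().from_notebook_node(nb)  # render it to HTML
      with open("analysis.html", "w", encoding="utf-8") as out:
          out.write(body)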

  13. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for the summation calculation of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form together with the estimated results for 470 nuclides for which decay data are not available experimentally. (author)
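
    The summation calculation referred to here is conventionally written in the following form (a textbook statement, not an equation quoted from the JNDC documentation):

      f(t) = \sum_i \left( \bar{E}_{\beta,i} + \bar{E}_{\gamma,i} \right) \lambda_i N_i(t)

    where \bar{E}_{\beta,i} and \bar{E}_{\gamma,i} are the average beta and gamma energies released per decay of fission product i, \lambda_i is its decay constant, and N_i(t) is the number of atoms of that nuclide present at cooling time t.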

  14. Factors Influencing the Adoption of and Business Case for Cloud Computing in the Public Sector

    OpenAIRE

    Kuiper, E.; Van Dam, F.; Reiter, A.; Janssen, M.F.W.H.A.

    2014-01-01

    Cloud adoption in the public sector is taking off slowly, which is perceived as a problem. Models of factors influencing cloud adoption are derived for better understanding using literature and results obtained via desk research and surveys by the Cloud for Europe project. We conclude that several factors require further research, such as the culture in countries, climate, legislation, economics and politics, IT staff shortage and feelings of uncertainty, fear and impatience. Adoption factors...

  15. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root-canal-filling material. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). Then, the teeth were rescanned, and the percentage volume of the residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file=TF Adaptive>ProTaper UR=Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed with Reciproc and ProTaper UR instruments than with TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. Reference computations of public dose and cancer risk from airborne releases of plutonium. Nuclear safety technical report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, V.L.

    1993-12-23

    This report presents results of computations of doses and the associated health risks of postulated accidental atmospheric releases from the Rocky Flats Plant (RFP) of one gram of weapons-grade plutonium in a form that is respirable. These computations are intended to be reference computations that can be used to evaluate a variety of accident scenarios by scaling the dose and health risk results presented here according to the amount of plutonium postulated to be released, instead of repeating the computations for each scenario. The MACCS2 code has been used as the basis of these computations. The basis and capabilities of MACCS2 are summarized, the parameters used in the evaluations are discussed, and results are presented for the doses and health risks to the public, both the Maximum Offsite Individual (a maximally exposed individual at or beyond the plant boundaries) and the population within 50 miles of RFP. A number of different weather scenarios are evaluated, including constant weather conditions and observed weather for 1990, 1991, and 1992. The isotopic mix of weapons-grade plutonium will change as it ages, the ²⁴¹Pu decaying into ²⁴¹Am. The ²⁴¹Am reaches a peak concentration after about 72 years. The doses to the bone surface, liver, and whole body will increase slightly but the dose to the lungs will decrease slightly. The overall cancer risk will show almost no change over this period. This change in cancer risk is much smaller than the year-to-year variations in cancer risk due to weather. Finally, χ/Q values are also presented for other applications, such as for hazardous chemical releases. These include the χ/Q values for the MOI, for a collocated worker at 100 meters downwind of an accident site, and the χ/Q value integrated over the population out to 50 miles.
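
    The scaling the report describes amounts to a simple proportionality (a generic illustration, not a formula quoted from the report): for a postulated release of m grams of respirable material,

      D_{\text{scenario}} = m \,[\text{g}] \times D_{\text{ref}} \,[\text{dose per gram released}]

    so, for example, a hypothetical release of 0.2 g would carry one fifth of the tabulated one-gram reference dose and risk.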

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, the file that needs to be presented as digital evidence has been deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) approach that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of file fragments for three data types.
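
    The core of such a classifier is the standard dynamic-programming LCS computation between a fragment and reference samples of each candidate type; the fragment is assigned the type of the sample that yields the longest common subsequence. The sketch below is a generic implementation of that recurrence, not the authors' code, and the byte strings are made-up examples.

      # Standard dynamic-programming LCS length between two byte strings.
      def lcs_length(a: bytes, b: bytes) -> int:
          dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
          for i in range(1, len(a) + 1):
              for j in range(1, len(b) + 1):
                  if a[i - 1] == b[j - 1]:
                      dp[i][j] = dp[i - 1][j - 1] + 1
                  else:
                      dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
          return dp[len(a)][len(b)]

      # A fragment would be assigned to whichever reference sample gives the longest LCS.
      fragment = b"\x89PNG\r\n\x1a\n....chunk data...."
      reference = b"\x89PNG\r\n\x1a\nIHDR"
      print(lcs_length(fragment, reference))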

  1. ACUTRI a computer code for assessing doses to the general public due to acute tritium releases

    CERN Document Server

    Yokoyama, S; Noguchi, H; Ryufuku, S; Sasaki, T

    2002-01-01

    Tritium, which is used as a fuel of a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in a discussion of licensing of a fusion experimental reactor and an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and ICRP dose models. In this calculation it is also possible to perform a statistical analysis of the meteorology, in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: i...

  2. Virtual Space Exploration: Let's Use Web-Based Computer Game Technology to Boost IYA 2009 Public Interest

    Science.gov (United States)

    Hussey, K.; Doronila, P.; Kulikov, A.; Lane, K.; Upchurch, P.; Howard, J.; Harvey, S.; Woodmansee, L.

    2008-09-01

    With the recent releases of both Google's "Sky" and Microsoft's "WorldWide Telescope" and the large and increasing popularity of video games, the time is now for using these tools, and those crafted at NASA's Jet Propulsion Laboratory, to engage the public in astronomy like never before. This presentation will use "Cassini at Saturn Interactive Explorer " (CASSIE) to demonstrate the power of web-based video-game engine technology in providing the public a "first-person" look at space exploration. The concept of virtual space exploration is to allow the public to "see" objects in space as if they were either riding aboard or "flying" next to an ESA/NASA spacecraft. Using this technology, people are able to immediately "look" in any direction from their virtual location in space and "zoom-in" at will. Users can position themselves near Saturn's moons and observe the Cassini Spacecraft's "encounters" as they happened. Whenever real data for their "view" exists it is incorporated into the scene. Where data is missing, a high-fidelity simulation of the view is generated to fill in the scene. The observer can also change the time of observation into the past or future. Our approach is to utilize and extend the Unity 3d game development tool, currently in use by the computer gaming industry, along with JPL mission specific telemetry and instrument data to build our virtual explorer. The potential of the application of game technology for the development of educational curricula and public engagement are huge. We believe this technology can revolutionize the way the general public and the planetary science community views ESA/NASA missions and provides an educational context that is attractive to the younger generation. This technology is currently under development and application at JPL to assist our missions in viewing their data, communicating with the public and visualizing future mission plans. Real-time demonstrations of CASSIE and other applications in development

  3. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  9. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    Science.gov (United States)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advances in Information and Communication Technologies (ICT) have created strong demand for cloud services that share users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private data involves medical, census, voter registration, social network, and customer service information. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses this confidentiality concern is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, I propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms the K-anonymity, L-diversity, and (α, k)-anonymity measures.
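
    The K-anonymity property that the proposed framework builds on can be stated compactly: every combination of quasi-identifier values in the published table must be shared by at least k records. The check below is a generic illustration of that definition, not the heuristic framework proposed in the paper; the column names and values are made up.

      # Generic k-anonymity check over a list of record dictionaries.
      from collections import Counter

      def is_k_anonymous(rows, quasi_identifiers, k):
          groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
          return all(count >= k for count in groups.values())

      records = [
          {"age": "30-40", "zip": "537**", "disease": "flu"},
          {"age": "30-40", "zip": "537**", "disease": "cold"},
          {"age": "30-40", "zip": "537**", "disease": "flu"},
      ]
      print(is_k_anonymous(records, ["age", "zip"], k=3))   # True: one group of size 3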

  10. ACUTRI: a computer code for assessing doses to the general public due to acute tritium releases

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Sumi; Noguchi, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ryufuku, Susumu; Sasaki, Toshihisa; Kurosawa, Naohiro [Visible Information Center, Inc., Tokai, Ibaraki (Japan)

    2002-11-01

    Tritium, which is used as a fuel of a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in a discussion of licensing of a fusion experimental reactor and an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and ICRP dose models. In this calculation it is also possible to perform a statistical analysis of the meteorology, in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: inhalation from a primary plume (HT and/or HTO) released from the facilities and inhalation from a secondary plume (HTO) reemitted from the ground following deposition of HT and HTO. This report describes an outline of the ACUTRI code, a user guide and the results of a test calculation. (author)
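
    The Gaussian plume model mentioned in both ACUTRI abstracts is conventionally written, for the ground-level air concentration at downwind distance x and crosswind offset y from a continuous release of strength Q at effective height H (a textbook form, not an equation reproduced from the ACUTRI report):

      \chi(x, y) = \frac{Q}{\pi\, u\, \sigma_y(x)\, \sigma_z(x)} \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right) \exp\!\left(-\frac{H^2}{2\sigma_z^2}\right)

    where u is the wind speed and \sigma_y, \sigma_z are the horizontal and vertical dispersion parameters; dividing out the source term Q gives the \chi/Q dispersion factors commonly quoted in dose assessments of this kind.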

  11. The Perceptions of Globalization at a Public Research University Computer Science Graduate Department

    Science.gov (United States)

    Nielsen, Selin Yildiz

    Based on a qualitative methodological approach, this study focuses on the understanding of a phenomenon called globalization in a research university computer science department. The study looks into the participants' perspectives about the department, its dynamics, culture and academic environment as related to globalization. The economic, political, academic and social/cultural aspects of the department are taken into consideration in investigating the influences of globalization. Three questions guide this inquiry: 1) How is the notion of globalization interpreted in this department? 2) How does the perception of globalization influence the department in terms of finances, academics, policies and social life? And 3) How do these perceptions influence the selection of students? Globalization and the neo-institutional view of legitimacy are used as theoretical lenses to conceptualize responses to these questions. The data include interviews, field notes, and official and non-official documents. Interpretations of these data are compared to findings from prior research on the impact of globalization in order to clarify and validate findings. Findings show that there is disagreement in how the notion of globalization is interpreted between the doctoral students and the faculty in the department. This disagreement revealed the attitudes and interpretations of globalization in the light of the policies and procedures related to the department. In this project, how the faculty experience globalization is not consistent with the literature. The literature states that globalization is a big part of higher education and it is a phenomenon that causes changes in the goals and missions of higher education institutions (Knight, 2003; De Witt, 2005). The data revealed that globalization is not the cause for change but more of a consequence of actions that take place in achieving the goals and missions of the department.

  12. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation and use of VDS, which is used to manage large files effectively on the Cyber computer system. The purpose of the VDS installation is to increase the virtual disk storage by utilizing magnetic tape, to assist the users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  13. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  14. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  16. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  17. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. New details emerge from the Einstein files

    CERN Multimedia

    Overbye, D

    2002-01-01

    For many years the FBI spied on Einstein. New details of this surveillance are emerging in "The Einstein File: J. Edgar Hoover's Secret War Against the World's Most Famous Scientist," by Fred Jerome, who sued the government with the help of the Public Citizen Litigation Group to obtain a less censored version of the file (1 page).

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  2. Reference computations of public dose and cancer risk from airborne releases of uranium and Class W plutonium

    International Nuclear Information System (INIS)

    Peterson, V.L.

    1995-01-01

    This report presents "reference" computations that can be used by safety analysts in evaluating the consequences of postulated atmospheric releases of radionuclides from the Rocky Flats Environmental Technology Site. These computations deal specifically with doses and health risks to the public. The radionuclides considered are Class W Plutonium, all classes of Enriched Uranium, and all classes of Depleted Uranium. (The other class of plutonium, Y, was treated in an earlier report.) In each case, one gram of the respirable material is assumed to be released at ground level, both with and without fire. The resulting doses and health risks can be scaled to whatever amount of release is appropriate for a postulated accident being investigated. The report begins with a summary of the organ-specific stochastic risk factors appropriate for alpha radiation, which poses the main health risk of plutonium and uranium. This is followed by a summary of the atmospheric dispersion factors for unfavorable and typical weather conditions for the calculation of consequences to both the Maximum Offsite Individual and the general population within 80 km (50 miles) of the site.

  3. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report performance of two commercial and one academic CAD system. The influence of presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)

  4. 49 CFR 1312.4 - Filing of tariffs.

    Science.gov (United States)

    2010-10-01

    ... identifying each publication filed, and by the appropriate filing fee (see 49 CFR part 1002). Acknowledgment... OF TRANSPORTATION (CONTINUED) CARRIER RATES AND SERVICE TERMS REGULATIONS FOR THE PUBLICATION... English with rates explicitly stated in U.S. dollars and cents. Two copies of each tariff publication...

  5. 77 FR 9912 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-02-21

    .... Docket Numbers: ER12-75-003. Applicants: Public Power, LLC. Description: Compliance Filing for MBR Tariff... Energy MBR Tariff to be effective 2/1/2012. Filed Date: 2/10/12. Accession Number: 20120210-5132... Cancellation of MBR Tariff to be effective 2/10/2012. Filed Date: 2/10/12. Accession Number: 20120210-5153...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  7. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using spectrographs. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of an audio file after altering it, to compare those changes with the spectrograph of the original file, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV audio file by cut/copy, the file was opened in Audacity and a different audio clip was pasted into it; the new file was then analyzed to view the differences. Noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original file were analyzed. Further changes were made by adjusting the parameters in the dialog box. Each edited audio file was opened in the software Spek, which produces a graph of that file after analysis; the graph was saved for further analysis. The graph of the original audio was then compared with the graph of the edited audio file to see the alterations.
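
    A spectrogram of the kind compared in this study can also be generated programmatically; the sketch below computes and saves one for a mono WAV file. The file name sample.wav is an assumed example, and this is an illustration using SciPy and Matplotlib rather than the Spek workflow used in the paper.

      # Compute and save a spectrogram of a WAV file (illustrative sketch).
      import matplotlib.pyplot as plt
      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import spectrogram

      rate, samples = wavfile.read("sample.wav")      # sampling rate (Hz) and PCM samples
      if samples.ndim > 1:                            # keep one channel for simplicity
          samples = samples[:, 0]
      freqs, times, sxx = spectrogram(samples, fs=rate, nperseg=1024)
      plt.pcolormesh(times, freqs, 10 * np.log10(sxx + 1e-12), shading="gouraud")
      plt.xlabel("Time [s]")
      plt.ylabel("Frequency [Hz]")
      plt.title("Spectrogram of sample.wav")
      plt.savefig("sample_spectrogram.png")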

  8. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using an English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries, within the limitation of available core, to be active concurrently.

  9. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software necessary for measuring file performance was installed on this machine. Based on the results, conclusions are drawn, recommendations for the use of the file systems are proposed, and the best ways to store data are identified.
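
    The article's own measurement setup is not reproduced here. As a rough illustration of the kind of sequential write/read throughput measurement such a study involves, the following minimal sketch times block I/O on the file system under test; the file name, block size and total size are arbitrary choices, and dedicated tools (fio, bonnie++, iozone) control caching far more carefully.

```python
# Minimal sketch of a sequential write/read throughput measurement.
# Target path and block/file sizes are arbitrary illustrative values.
import os
import time

TARGET = "testfile.bin"          # place this on the file system under test
BLOCK = 1 << 20                  # 1 MiB blocks
BLOCKS = 256                     # 256 MiB total

def write_test():
    buf = os.urandom(BLOCK)
    start = time.perf_counter()
    with open(TARGET, "wb") as f:
        for _ in range(BLOCKS):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())     # make sure data actually reaches the device
    return BLOCKS * BLOCK / (time.perf_counter() - start)

def read_test():
    start = time.perf_counter()
    with open(TARGET, "rb") as f:
        while f.read(BLOCK):
            pass
    return BLOCKS * BLOCK / (time.perf_counter() - start)

print(f"write: {write_test() / 1e6:.1f} MB/s")
print(f"read:  {read_test() / 1e6:.1f} MB/s   (page cache may inflate this)")
os.remove(TARGET)
```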

  10. Using computer-assisted process facilitation techniques in government sponsored public meetings and working sessions - a paper addressing the East Fork Poplar Creek Working Group Experience

    International Nuclear Information System (INIS)

    Armstrong, L.D.; Rymer, G.; Perkins, S.

    1994-01-01

    This paper addresses a process facilitation technique using computer hardware and software that assists its users in group decision-making, consensus building, surveying and polling, and strategic planning. The process and equipment have been successfully used by the Department of Energy and the Martin Marietta Energy Systems, Inc., Environmental Restoration and Waste Management Community Relations program. The technology is used to solicit and encourage qualitative and documented public feedback in government-mandated or -sponsored public meetings in Oak Ridge, Tennessee.

  11. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
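
    The pfff tool itself is available at the URL above; the sketch below only illustrates the core idea described in the abstract - hashing a fixed number of pseudo-randomly sampled bytes (plus the file size) instead of reading the file in full - and is not the tool's actual algorithm. The sample count, seed key and example file name are assumptions made for illustration.

```python
# Illustration of the core idea behind probabilistic file fingerprinting:
# hash a fixed number of pseudo-randomly chosen bytes plus the file size
# instead of reading the whole file. Sketch only; not the pfff algorithm.
import hashlib
import os
import random

def probabilistic_fingerprint(path, samples=1024, key=42):
    size = os.path.getsize(path)
    rng = random.Random(key)            # same key => same offsets => comparable prints
    offsets = sorted(rng.randrange(size) for _ in range(min(samples, size)))
    digest = hashlib.sha256(str(size).encode())
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            digest.update(f.read(1))
    return digest.hexdigest()

# Two files compare equal with high probability iff their sampled bytes match.
# print(probabilistic_fingerprint("big_dataset.fastq"))   # hypothetical file name
```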

  12. The DNA Files

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-09

    The DNA Files is a radio documentary which disseminates genetics information over public radio. The documentaries explore subjects which include the following: How genetics affects society. How human life began and how it evolved. Could new prenatal genetic tests hold the key to disease prevention later in life? Would a national genetic data base sacrifice individual privacy? and Should genes that may lead to the cure for cancer be privately owned? This report serves as a project update for the second quarter of 1998. It includes the spring/summer 1998 newsletter, the winter 1998 newsletter, the program clock, and the latest flyer.

  13. Public census data on CD-ROM at Lawrence Berkeley Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1992-10-01

    The Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, of the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL), are using public socio-economic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 70 CD-ROM diskettes (approximately 36 gigabytes) are on line via the Unix file server cedrcd. lbl. gov. Most of the files are from the US Bureau of the Census, and most pertain to the 1990 Census of Population and Housing. All the CD-ROM diskettes contain documentation in the form of ASCII text files. Printed documentation for most files is available for inspection at University of California Data and Technical Assistance (UC DATA), or the UC Documents Library. Many of the CD-ROM diskettes distributed by the Census Bureau contain software for PC compatible computers, for easily accessing the data. Shared access to the data is maintained through a collaboration among the CEDR and PAREP projects at LBL, and UC DATA, and the UC Documents Library. Via the Sun Network File System (NFS), these data can be exported to Internet computers for direct access by the user's application program(s).

  15. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  16. 45 CFR 79.26 - Form, filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Form, filing and service of papers. 79.26 Section... CIVIL REMEDIES § 79.26 Form, filing and service of papers. (a) Form. (1) Documents filed with the ALJ shall include an original and two copies. (2) Every pleading and paper filed in the proceeding shall...

  17. 42 CFR 3.524 - Form, filing, and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Form, filing, and service of papers. 3.524 Section..., filing, and service of papers. (a) Forms. (1) Unless the ALJ directs the parties to do otherwise, documents filed with the ALJ must include an original and two copies. (2) Every pleading and paper filed in...

  18. Summative report of the public competition research and development on software for computational science and engineering in the fiscal year 1997 through 2002

    International Nuclear Information System (INIS)

    2005-09-01

    The Japan Atomic Energy Research Institute started the public competition for research and development on software for computational science and engineering in 1997, and closed it in 2002. As a summative report on six years of operating the system, this report describes the framework of the competitive research and development, the application situation, the R and D subjects adopted, the evaluation findings, the outputs produced, and the achievements and problems. (author)

  19. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  20. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    Science.gov (United States)

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis and the other factors influencing health professions students' computer use for data analysis. We conducted a cross sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer based data analysis. The following factors were significant predictors of having ever done computer based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.

  1. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  2. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
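
    As a toy illustration of the metadata layout described above - per-write records of logical offset, length and physical location kept in a partitioned store - the following sketch replaces MDHIM and the message passing interface with plain in-process dictionaries selected by a hash. It is not the patented implementation; all names are illustrative.

```python
# Toy sketch of shared-file metadata: each write to the logical shared file is
# recorded as (logical offset -> length, per-process data file, physical offset).
# The partitioned key-value store is replaced here by plain dictionaries.
from dataclasses import dataclass

N_PARTITIONS = 4
partitions = [dict() for _ in range(N_PARTITIONS)]   # stand-in for the KV ranges

@dataclass
class Extent:
    length: int
    data_file: str       # per-process log file holding the bytes
    phys_offset: int

def partition_for(logical_offset):
    return partitions[hash(logical_offset) % N_PARTITIONS]

def record_write(logical_offset, length, data_file, phys_offset):
    partition_for(logical_offset)[logical_offset] = Extent(length, data_file, phys_offset)

def lookup(logical_offset):
    return partition_for(logical_offset).get(logical_offset)

# Example: rank 3 wrote 1 MiB at logical offset 64 MiB into its own log file.
record_write(64 << 20, 1 << 20, "rank3.data", 0)
print(lookup(64 << 20))
```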

  3. 45 CFR 99.5 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Filing and service of papers. 99.5 Section 99.5 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION PROCEDURE FOR HEARINGS FOR THE CHILD CARE AND DEVELOPMENT FUND General § 99.5 Filing and service of papers. (a) An original and...

  4. 76 FR 23320 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-04-26

    ... Tuesday, May 10, 2011. Docket Numbers: ER10-3096-002. Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits tariff filing per 35: WestConnect Experimental... tariff filing per 35.13(a)(2)(iii): IPL Changes in Depreciation Rates for Wholesale Production Service to...

  5. 78 FR 2978 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-01-15

    .... Applicants: Lea Power Partners, LLC. Description: Lea Power Partners, LLC's Updated Market Power Analysis and....m. ET 1/28/13. Docket Numbers: ER12-72-002. Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits tariff filing per 35: PNM Filing of Former NITSA and...

  6. 78 FR 54453 - Notice of Public Meeting-Intersection of Cloud Computing and Mobility Forum and Workshop

    Science.gov (United States)

    2013-09-04

    ...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...

  7. 77 FR 74653 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-12-17

    .... Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits... Market-Based Rate Tariff to be effective 2/1/2013. Filed Date: 12/7/12. Accession Number: 20121207-5219...

  8. Malpractice by physical therapists: descriptive analysis of reports in the National Practitioner Data Bank public use data file, 1991-2004.

    Science.gov (United States)

    Sandstrom, Robert

    2007-01-01

    As physical therapists increase autonomous practice, medical error becomes more important to public safety and public perceptions of the profession. The purpose of this study was to describe malpractice by physical therapists in the United States based on physical therapist malpractice reports in the National Practitioner Data Bank between January 1, 1991, and December 31, 2004. A frequency analysis of data related to physical therapist malpractice reports was performed. The relationship between size of malpractice payment and public policy related to access to physical therapist services and malpractice experience was explored. A total of 664 malpractice reports were found in the study period (mean, 47.73 events annually). California had 114 malpractice events, while Maine and Wyoming had none. The median payment amount for physical therapist malpractice was $10,000 to $15,000. "Treatment-related" events and events related to "improper technique" were the most common reasons for a malpractice report. Incidence of malpractice by physical therapists is low (estimated at 2.5 events/10,000 working therapists/year), and the average malpractice payment is small. The relationship between the size of malpractice payments and public policy related to direct patient access to physical therapy services is also discussed.

  9. Disaggregating the Truth: A Re-Analysis of the Costs and Benefits of Michigan's Public Universities. Professional File. Number 125, Summer 2012

    Science.gov (United States)

    Daun-Barnett, Nathan J.

    2012-01-01

    For more than 50 years, human capital theory has been the cornerstone for understanding the value of investing in individuals' productive capacities in terms of both personal social and economic gain and the collective benefits that accrue to society. Vedder and Denhart (2007) challenge the hypothesis that public investment in higher education…

  10. School Survey on Crime and Safety (SSOCS): 2015-16. Public-Use Data File User's Manual. NCES 2018-107

    Science.gov (United States)

    Jackson, Michael; Diliberti, Melissa; Kemp, Jana; Hummel, Steven; Cox, Christina; Gbondo-Tugbawa, Komba; Simon, Dillon

    2018-01-01

    The School Survey on Crime and Safety (SSOCS) is managed by the National Center for Education Statistics (NCES) within the Institute of Education Sciences of the U.S. Department of Education. SSOCS collects extensive crime and safety data from principals and administrators of public schools in the United States. Data from this collection can be…

  11. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Science.gov (United States)

    2010-04-01

    ..., shall not affect the date of filing. (2) If the conditions of paragraph (a)(1) of this section are... and consistent with the public interest and the protection of investors. (c) Payment of fees. Fees...

  12. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Describes the classes of nodes that users access. Login nodes: Peregrine has four login nodes, each of which has Intel E5 processors; the /scratch file systems and the /mss file system are mounted on all login nodes. Compute nodes: Peregrine has 2592 ...

  13. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  14. Yeu and Noirmoutier islands - Electric connection to the offshore wind farm. Brochure 'Electric connection of the offshore wind farm to the Yeu and Noirmoutier islands' - Consultation file, December 2015, District of Vendee. Document of contribution to the public debate. Yeu and Noirmoutier islands offshore wind farm project - Project commissioner file

    International Nuclear Information System (INIS)

    Boyadjis, Jean-Marc; Corallo, Christian

    2015-05-01

    After a brief presentation of the context, a first document presents the connection characteristics, the general route of the electrical connection, how the sea and ground environment is taken into account, how the population is consulted for the project, the organisation of the public inquiry. The second document presents the consultation file with a description of the adopted solution and main characteristics of the project of connection of offshore wind farms about the Yeu and Noirmoutier islands, an analysis of the initial condition of the environment (physical and natural environment, landscapes and heritage, human and social-economic context), a description and a comparative analysis of proposed beams for the underwater connection, landfall and underground connection, and the planned continuation of the project (consultation, administrative procedure, planning). The next document aims at presenting the project before public consultation: objectives, underwater and underground connections with some of their technical characteristics, planning. Another document also presents the offshore wind farm project, its energetic and technical characteristics, how it is integrated into its environment (stakes and challenges, impacts on landscape and on activities), how it is an example of the development of offshore wind energy sector. It is completed by several appendices containing a variety of data on these issues. A last document is a presentation of the project by the project commissioner (RTE) which notably addresses how technical and environmental issues are to be addressed

  15. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes basic graphics knowledge and the understanding and realization of graphic file formats. The first part deals with graphic data, the storage and compression of that data, and programming topics such as assembly, the stack, compiling and linking of programs, and practical debugging. The next part covers graphic file formats such as the MacPaint, GEM/IMG, PCX, GIF, and TIFF files, hardware considerations such as high-speed monochrome and color screen drivers, the basic concept of dithering, and format conversion.

  16. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Full Text Available Video files are files that store moving pictures and sound as they occur in real life. In today's world, the need for automated processing of the information in video files is increasing. Automated processing has a wide range of applications, including office and home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, the detection and tracking of object movement in video files play an important role. This article describes methods for detecting objects in video files; this problem is being studied worldwide in the field of computer vision.
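
    The article surveys several detection methods; one common baseline is background subtraction, in which moving regions are flagged against a learned background model. The sketch below illustrates that baseline with OpenCV; the input file name, thresholds and minimum contour area are arbitrary choices, not values from the article.

```python
# Baseline moving-object detection by background subtraction with OpenCV.
# "traffic.mp4" is a hypothetical input file; thresholds are arbitrary.
import cv2

cap = cv2.VideoCapture("traffic.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                        # foreground mask
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
    print(f"frame: {len(boxes)} moving object(s) detected")

cap.release()
```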

  17. panMetaDocs, eSciDoc, and DOIDB—An Infrastructure for the Curation and Publication of File-Based Datasets for GFZ Data Services

    Directory of Open Access Journals (Sweden)

    Damian Ulbricht

    2016-03-01

    Full Text Available The GFZ German Research Centre for Geosciences is the national laboratory for Geosciences in Germany. As part of the Helmholtz Association, providing and maintaining large-scale scientific infrastructures are an essential part of GFZ activities. This includes the generation of significant volumes and numbers of research data, which subsequently become source materials for data publications. The development and maintenance of data systems is a key component of GFZ Data Services to support state-of-the-art research. A challenge lies not only in the diversity of scientific subjects and communities, but also in different types and manifestations of how data are managed by research groups and individual scientists. The data repository of GFZ Data Services provides a flexible IT infrastructure for data storage and publication, including minting of digital object identifiers (DOI. It was built as a modular system of several independent software components linked together through Application Programming Interfaces (APIs provided by the eSciDoc framework. Principal application software are panMetaDocs for data management and DOIDB for logging and moderating data publications activities. Wherever possible, existing software solutions were integrated or adapted. A summary of our experiences made in operating this service is given. Data are described through comprehensive landing pages and supplementary documents, like journal articles or data reports, thus augmenting the scientific usability of the service.

  18. Public census data on CD-ROM at Lawrence Berkeley Laboratory. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1993-03-12

    The Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, of the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL), are using public socioeconomic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 89 CD-ROM diskettes (approximately 45 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and many of these pertain to the 1990 Census of Population and Housing. All the CD-ROM diskettes contain documentation in the form of ASCII text files. In addition, printed documentation for most files is available for inspection at University of California Data and Technical Assistance (UC DATA), tel. (510) 642-6571, or the UC Documents Library, tel. (510) 642-2569, both located on the UC Berkeley Campus. Many of the CD-ROM diskettes distributed by the Census Bureau contain software for PC compatible computers, for easily accessing the data. Shared access to the data is maintained through a collaboration among the CEDR and PAREP projects at LBL, and UC DATA, and the UC Documents Library. LBL is grateful to UC DATA and the UC Documents Library for the use of their CD-ROM diskettes. Shared access to LBL facilities may be restricted in the future if costs become prohibitive. Via the Sun Network File System (NFS), these data can be exported to Internet computers for direct access by the user's application program(s). Due to the size of the files, this access method is preferred over File Transfer Protocol (FTP) access.

  19. Public census data on CD-ROM at Lawrence Berkeley Laboratory. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1993-01-16

    The Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, of the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL), are using public socioeconomic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 72 CD-ROM diskettes (approximately 37 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and many of these pertain to the 1990 Census of Population and Housing. All the CD-ROM diskettes contain documentation in the form of ASCII text files. In addition, printed documentation for most files is available for inspection at University of California Data and Technical Assistance (UC DATA), tel. (510) 642-6571, or the UC Documents Library, tel. (510) 642-2569, both located on the UC Berkeley Campus. Many of the CD-ROM diskettes distributed by the Census Bureau contain software for PC compatible computers, for easily accessing the data. Shared access to the data is maintained through a collaboration among the CEDR and PAREP projects at LBL, and UC DATA, and the UC Documents Library. LBL is grateful to UC DATA and the UC Documents Library for the use of their CD-ROM diskettes. Shared access to LBL facilities may be restricted in the future if costs become prohibitive. Via the Sun Network File System (NFS), these data can be exported to Internet computers for direct access by the user's application program(s). Due to the size of the files, this access method is preferred over File Transfer Protocol (FTP) access. Please contact Deane Merrill (dwmerrill@lbl.gov) if you wish to make use of the data.

  1. Users' guide to the positron camera DDP516 computer system

    International Nuclear Information System (INIS)

    Bracher, B.H.

    1979-08-01

    This publication is a guide to the operation, use and software for a DDP516 computer system provided by the Data Handling Group primarily for the development of a Positron Camera. The various sections of the publication fall roughly into three parts. (1) Sections forming the Operators Guide cover the basic operation of the machine, system utilities and back-up procedures. Copies of these sections are kept in a 'Nyrex' folder with the computer. (2) Sections referring to the software written particularly for Positron Camera Data Collection describe the system in outline and lead to details of file formats and program source files. (3) The remainder of the guide, describes General-Purpose Software. Much of this has been written over some years by various members of the Data Handling Group, and is available for use in other applications besides the positron camera. (UK)

  2. Assessing and Filing the Gap as a New Mode of Governance, lessons from a Preliminary Study carried out in the Cosenza's Public Prosecutors' Office

    Directory of Open Access Journals (Sweden)

    Diana Piana

    2012-06-01

    Full Text Available In all European countries, courts and public prosecutor offices have been undergoing a long and comprehensive process of reform, targeting several different components of their organization and management. This phenomenon can be explained as the outcome of two combined forces: an increased demand for justice and pressure from international and supranational institutions. Accordingly, innovation has become a major issue in the judicial sector. Despite the attention devoted to it, much less effort has been made to comprehend the mechanisms that make organizations innovative. To what extent is leadership important? How may different organizational cultures facilitate or create obstacles to innovation? And to what extent can innovation be implemented through a top-down approach in a peculiar organization such as a judicial office? This article tells the story of a case study on a public prosecutors' office located in the South of Italy. The pilot study was framed and conducted as both normative and empirical in nature. Moreover, it represents a case study with a certain number of policy effects, as it turned into a roadmap which was adopted by the judicial office to improve its own organization and human resource management. This is the first study carried out in Italy using such methods, with the objectives to: (a) describe the implementation of organizational innovation (in this respect there had already been a study of the General Registry Office); (b) map intra-organizational routines and inter-connections between the administration and the General Registry Office, and between the GRO and the criminal records office, etc.; (c) identify skills to adapt and learn from daily routines and the execution of specific tasks, as they are done in all public sectors. "If you really want to understand something, try first to change it" - Thomas Eliot.

  3. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  4. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  5. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  6. Download this PDF file

    African Journals Online (AJOL)

    gains, revenge, terrorism and simple publicity. People in communication and computer engineering have really been working their best to introduce new security mechanisms on the one hand, and to make the already existing security mechanisms tighter, in both software and hardware solutions, on the other hand.

  7. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using the radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index, and the accuracy of the index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case by entering the findings of the unknown case.
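
    The paper's exact feature coding is not reproduced in the abstract; purely as an illustration of the retrieval idea, the sketch below encodes each teaching-file case as a binary vector of radiographic findings and returns the stored case(s) nearest to an unknown case by Hamming distance. The findings, cases and diagnoses listed are invented examples.

```python
# Sketch of a nearest-neighbor teaching-file index: each case is a binary
# vector of radiographic findings plus a diagnosis; an unknown case retrieves
# the stored case(s) with the smallest Hamming distance. All entries below are
# invented for illustration, not taken from the paper.
FINDINGS = ["erosions", "osteophytes", "joint_space_narrowing",
            "soft_tissue_swelling", "chondrocalcinosis"]

teaching_file = [
    ({"erosions", "soft_tissue_swelling"},            "rheumatoid arthritis"),
    ({"osteophytes", "joint_space_narrowing"},        "inflammatory osteoarthritis"),
    ({"chondrocalcinosis", "joint_space_narrowing"},  "pyrophosphate arthropathy"),
]

def encode(findings):
    return [1 if f in findings else 0 for f in FINDINGS]

def nearest(unknown_findings, k=1):
    u = encode(unknown_findings)
    scored = sorted(teaching_file,
                    key=lambda case: sum(a != b for a, b in zip(u, encode(case[0]))))
    return [diagnosis for _, diagnosis in scored[:k]]

print(nearest({"erosions", "soft_tissue_swelling"}))   # -> ['rheumatoid arthritis']
```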

  8. file 7

    African Journals Online (AJOL)

    chemotherapy using effective drugs and constitute major public health and deworming campaigns targeted at high risk ... They are classified among the therapies in school children have produced neglected tropical ... anthelminthic drug is the main stay of including hookworm (Necator americanus, programmes aimed at the ...

  9. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  10. National Household Education Surveys of 2003. Data File User's Manual, Volume II: Parent and Family Involvement in Education Survey. NCES 2004-102

    Science.gov (United States)

    Hagedorn, Mary; Montaquila, Jill; Vaden-Kiernan, Nancy; Kim, Kwang; Roth, Shelley Brock; Chapman, Christopher

    2004-01-01

    This manual provides documentation and guidance for users of the public-use data file for PFI-NHES: 2003. This volume contains a description of the content and organization of the data file, including useful information regarding questionnaire items and the various derived variables found on the file. Appended are the public-use data file layout,…

  11. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users plus much more data. Files can be accessed from anywhere, via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN, it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. At least this happened in the past to the Andrew File System (AFS) - the Linux equivalent of DFS - and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments recently agreed to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  12. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  13. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is simply based on factors such as pool water level, the intervals for migration and so on, so it lacks precise control over individual files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps the required files on disk for high-energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tape. In this paper we first give a brief introduction to the GRASS system and then present the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance, and a simple analysis is made based on them. (authors)

  14. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the JENDL special purpose files. The problems of using the ENSDF file as the primary data source for the JENDL decay data file are presented. (author)

  15. Automatic processing of list of journals and publications in the Nuclear Research Institute

    International Nuclear Information System (INIS)

    Vymetal, L.

    Using an EC 1040 computer, the Institute of Nuclear Research processed the list of journals in the reference library of the Czechoslovak Atomic Energy Commission, including journals acquired by all institutions subordinated to the Commission, i.e., UJV Rez (Nuclear Research Institute), the Nuclear Information Centre Prague, UVVVR Prague (Institute for Research, Production and Application of Radioisotopes) and the Institute of Radioecology and Applied Nuclear Techniques Kosice. Computer processing produced files arranged by library, subject matter of the journals, country of publication, and journal title. Automated processing of publications by UJV staff is being prepared. The preparation of data for computer processing of both files is described, and specimens of the printouts are shown. (Ha)

  16. External validation of a publicly available computer assisted diagnostic tool for mammographic mass lesions with two high prevalence research datasets.

    Science.gov (United States)

    Benndorf, Matthias; Burnside, Elizabeth S; Herda, Christoph; Langer, Mathias; Kotter, Elmar

    2015-08-01

    Lesions detected at mammography are described with a highly standardized terminology: the breast imaging-reporting and data system (BI-RADS) lexicon. Up to now, no validated semantic computer assisted classification algorithm exists to interactively link combinations of morphological descriptors from the lexicon to a probabilistic risk estimate of malignancy. The authors therefore aim at the external validation of the mammographic mass diagnosis (MMassDx) algorithm. A classification algorithm like MMassDx must perform well in a variety of clinical circumstances and in datasets that were not used to generate the algorithm in order to ultimately become accepted in clinical routine. The MMassDx algorithm uses a naïve Bayes network and calculates post-test probabilities of malignancy based on two distinct sets of variables, (a) BI-RADS descriptors and age ("descriptor model") and (b) BI-RADS descriptors, age, and BI-RADS assessment categories ("inclusive model"). The authors evaluate both the MMassDx (descriptor) and MMassDx (inclusive) models using two large publicly available datasets of mammographic mass lesions: the digital database for screening mammography (DDSM) dataset, which contains two subsets from the same examinations-a medio-lateral oblique (MLO) view and cranio-caudal (CC) view dataset-and the mammographic mass (MM) dataset. The DDSM contains 1220 mass lesions and the MM dataset contains 961 mass lesions. The authors evaluate discriminative performance using area under the receiver-operating-characteristic curve (AUC) and compare this to the BI-RADS assessment categories alone (i.e., the clinical performance) using the DeLong method. The authors also evaluate whether assigned probabilistic risk estimates reflect the lesions' true risk of malignancy using calibration curves. The authors demonstrate that the MMassDx algorithms show good discriminatory performance. AUC for the MMassDx (descriptor) model in the DDSM data is 0.876/0.895 (MLO/CC view) and AUC
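
    MMassDx itself is a naïve Bayes network over BI-RADS descriptors (and age or assessment category); the sketch below only illustrates how such a model converts per-descriptor likelihoods into a post-test probability of malignancy. The prior and likelihood values are invented for illustration and are not the MMassDx parameters.

```python
# Illustration of how a naive Bayes model turns per-descriptor likelihoods into
# a post-test probability of malignancy. All numbers are invented; they are not
# the MMassDx parameters.
PRIOR_MALIGNANT = 0.30

# P(descriptor value | malignant) and P(descriptor value | benign)
LIKELIHOODS = {
    ("margin", "spiculated"):   (0.45, 0.05),
    ("shape", "irregular"):     (0.50, 0.15),
    ("age", "60-69"):           (0.25, 0.15),
}

def post_test_probability(observed):
    p_m, p_b = PRIOR_MALIGNANT, 1.0 - PRIOR_MALIGNANT
    for key in observed:
        like_m, like_b = LIKELIHOODS[key]
        p_m *= like_m          # naive assumption: descriptors independent given class
        p_b *= like_b
    return p_m / (p_m + p_b)

case = [("margin", "spiculated"), ("shape", "irregular"), ("age", "60-69")]
print(f"post-test probability of malignancy: {post_test_probability(case):.2f}")
```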

  17. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is no exception. In 2017, ATLAS steadily used almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed across geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. The ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now also used by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and, primarily, supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources and extensive software re...

  18. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  19. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, a mass storage system (MSS) and magnetic tape is described. The concepts and techniques introduced in this file management system are as follows. (1) The distribution and continuity of file references are closely approximated by a memory retention function, and a density function based on this memory retention function is defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of the file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
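
    The report's memory retention function is not given in the abstract, so the sketch below assumes a simple exponential decay of access recency purely to illustrate how such a score could drive automatic migration between disk, MSS and tape. The half-life and the thresholds are arbitrary illustrative values.

```python
# Sketch of migration control driven by a retention score. An exponential decay
# of access recency is assumed here purely for illustration; it is not the
# report's actual memory retention function.
import math
import time

HALF_LIFE_DAYS = 7.0          # assumed: score halves every week without access

def retention_score(last_access_epoch, now=None):
    now = now or time.time()
    age_days = (now - last_access_epoch) / 86400.0
    return math.exp(-math.log(2) * age_days / HALF_LIFE_DAYS)

def target_device(score):
    if score > 0.5:
        return "magnetic disk"        # recently referenced: keep online
    if score > 0.05:
        return "mass storage system"  # lukewarm: near-line MSS
    return "magnetic tape"            # cold: archive

now = time.time()
for name, days_ago in [("run_0423.dat", 1), ("run_0217.dat", 20), ("run_1979.dat", 200)]:
    s = retention_score(now - days_ago * 86400, now)
    print(f"{name}: score={s:.3f} -> {target_device(s)}")
```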

  20. BUILDING A COMPLETE FREE AND OPEN SOURCE GIS INFRASTRUCTURE FOR HYDROLOGICAL COMPUTING AND DATA PUBLICATION USING GIS.LAB AND GISQUICK PLATFORMS

    Directory of Open Access Journals (Sweden)

    M. Landa

    2017-07-01

    Full Text Available Building a complete free and open source GIS computing and data publication platform can be a relatively easy task. This paper describes an automated deployment of such a platform using two open source software projects – GIS.lab and Gisquick. GIS.lab (http://web.gislab.io) is a project for rapid deployment of a complete, centrally managed and horizontally scalable GIS infrastructure in the local area network, data center or cloud. It provides a comprehensive set of free geospatial software seamlessly integrated into one easy-to-use system. A platform for GIS computing (in our case demonstrated on hydrological data processing) requires core components such as a geoprocessing server, a map server, and a computation engine, e.g. GRASS GIS, SAGA, or other similar GIS software. All these components can be rapidly and automatically deployed by the GIS.lab platform. In our demonstrated solution, PyWPS is used for serving WPS processes built on top of the GRASS GIS computation platform. GIS.lab can be easily extended by other components running in Docker containers; this approach is shown with the seamless integration of Gisquick. Gisquick (http://gisquick.org) is an open source platform for publishing geospatial data in the sense of rapid sharing of QGIS projects on the web. The platform consists of a QGIS plugin, a Django-based server application, QGIS server, and web/mobile clients. This paper shows how to easily deploy a complete open source GIS infrastructure allowing all required operations, such as data preparation on the desktop, data sharing, and geospatial computation as a service. It also includes data publication in the sense of OGC Web Services and, importantly, also as interactive web mapping applications.

  1. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, file archival system for DEC VAX, provides for easy offline storage and retrieval of arbitrary files on DEC VAX system. System designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of same programs and associated files.

  2. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    File Comparator program IFCOMP is text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.

  3. Quality Computer Assisted Mobile Learning (CAML and Distance Education Leadership in Managing Technology Enhanced Learning Management System (TELMS in the Malaysian Public Tertiary Education

    Directory of Open Access Journals (Sweden)

    Lee Tan Luck

    2009-07-01

    Full Text Available Abstract - The success in the implementation of a quality computer assisted mobile learning and distance education in a Technology Enhanced Learning Management System is highly rely on the academic leadership in managing and application of Information and Communication Technology (ICT in the tertiary level. The effectiveness of its leadership, knowledge, application and management of ICT and learning management system is of utmost important. Successful application and management includes quality and cost effectiveness of universities administration, CAML and distance education leadership development, organizational culture, academic staffs and students’ attitude and their commitment towards teaching and learning process, support towards the usage of state of the art techno-educational facilities, availability of ICT resources, maintenance and funding of a Learning Management System. This paper will discuss the above factors, which present a comprehensive framework for the implementation of a quality CAML and distance education environment in ICT application and management in the Malaysian public universities. Selected Fifty-two respondents from two Malaysian public universities which offer e Learning and distance education with Learning Management System were acquired. A survey questionnaire is used to determine the effectiveness of ICT and mobile learning application management. Data from the questionnaires were analyzed by using non-parametric and parametric statistics testing. Results of this study show there is significant different in the CAML and distance education leadership in TELMS and the application of ICT and its management in the Malaysian public universities. The study will also address the implementation elements necessary for transforming the public universities and its CAML and distance education teaching and learning process into an effective and result oriented computer assisted mobile learning management model in public

  4. The Cigeo project, Meuse/Haute-Marne reversible geological disposal facility for radioactive waste. Project Owner File, Public debate of 15 May to 15 October 2013

    International Nuclear Information System (INIS)

    Dupuis, Marie-Claude; Gonnot, Francois-Michel

    2013-07-01

    Andra is exploring several options for the disposal of low-level long-lived waste (LLW-LL). With the French Government's approval, in June 2008 Andra began looking around France for a site to build an LLW-LL repository. In late 2008 it provided the Government with a report analysing the geological, environmental and socio-economic aspects of the forty odd municipalities that expressed an interest in the project. After the withdrawal of the two municipalities chosen in 2009 to conduct geological investigations, the government asked Andra to re-explore the various management options for graphite and radium-bearing waste, focusing in particular on ways to manage these types of waste separately. The High Committee for Transparency and Information on Nuclear Safety (HCTISN) created a working group to provide feedback on the search for a site for LLW-LL. Andra submitted a report to the Government in late 2012. This report contains proposals for continuing the search and draw in particular on the HCTISN's recommendations. Contents: 1 - Radioactive waste (Sources, Types, Management, Waste to be disposed of at Cigeo, Cigeo's estimated disposal capacities, Where IS HLW and ILW-LL being stored until Cigeo is commissioned? 2 - Why deep geological disposal? (A 15-year research programme, Presentation and assessment of the research results, The public debate of 2005-2006, Deep geological disposal ratified by the 2006 Planning Act, The 2006 Planning Act: other areas of research complementary to deep geological disposal, The situation in other countries); 3 - Why the Meuse/Haute-Marne site? (Selection of the Meuse and Haute-Marne site to host an underground research laboratory, The geological formation in the Meuse and Haute-Marne site, Callovo-Oxfordian clay, Siting of Cigeo's installations); 4 - How will Cigeo operate? (The installations at Cigeo, Construction of Cigeo, Transport of waste packages, Operation of Cigeo, Closure of Cigeo); 5 - Safety at Cigeo

  5. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization, generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as a part of a general, cost effective curation strategy for binary file formats.
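
    A parser generated from an attribute array grammar is beyond a short example; instead, the sketch below hand-codes a recognizer for a generic chunk-based layout (4-byte tag, 4-byte little-endian length, payload), which is the kind of structure such grammars describe and validate. The format is generic and invented, not any particular real file format.

```python
# Hand-coded recognizer for a generic chunk-based binary layout (4-byte ASCII
# tag, 4-byte little-endian payload length, payload). A generated parser would
# be derived from the grammar instead of written by hand.
import struct

def parse_chunks(data: bytes):
    chunks, pos = [], 0
    while pos < len(data):
        if pos + 8 > len(data):
            raise ValueError(f"truncated chunk header at offset {pos}")
        tag, length = struct.unpack_from("<4sI", data, pos)
        pos += 8
        if pos + length > len(data):
            raise ValueError(f"chunk {tag!r} overruns the file")
        chunks.append((tag.decode("ascii", "replace"), data[pos:pos + length]))
        pos += length
    return chunks

# Example: two well-formed chunks followed by validation.
blob = struct.pack("<4sI", b"HDR ", 4) + b"\x01\x00\x00\x00" \
     + struct.pack("<4sI", b"DATA", 3) + b"abc"
for tag, payload in parse_chunks(blob):
    print(tag, len(payload), "bytes")
```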

  6. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  7. Trends of Mobile Learning in Computing Education from 2006 to 2014: A Systematic Review of Research Publications

    Science.gov (United States)

    Anohah, Ebenezer; Oyelere, Solomon Sunday; Suhonen, Jarkko

    2017-01-01

    The majority of the existing research regarding mobile learning in computing education has primarily focused on studying the effectiveness of, and in some cases reporting about, implemented mobile learning solutions. However, it is equally important to explore development and application perspectives on the integration of mobile learning into…

  8. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can be routinely compressed by a factor of 5. (orig.)
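
    The paper's algorithm is not reproduced in this record; the sketch below only illustrates the general idea of lossless spectrum compression by delta-encoding channel counts (neighbouring channels are strongly correlated) before handing them to a general-purpose compressor. The synthetic Poisson spectrum and the choice of zlib are assumptions.

      import zlib
      import numpy as np

      def compress_spectrum(counts: np.ndarray) -> bytes:
          """Losslessly compress channel counts: delta-encode, then deflate."""
          deltas = np.diff(counts.astype(np.int64), prepend=0)
          return zlib.compress(deltas.tobytes(), level=9)

      def decompress_spectrum(blob: bytes) -> np.ndarray:
          deltas = np.frombuffer(zlib.decompress(blob), dtype=np.int64)
          return np.cumsum(deltas)

      spectrum = np.random.poisson(lam=50, size=4096)  # stand-in for a measured spectrum
      blob = compress_spectrum(spectrum)
      assert np.array_equal(decompress_spectrum(blob), spectrum)  # no information lost
      print(f"compression factor: {spectrum.nbytes / len(blob):.1f}")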

  9. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap

  10. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
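
    The actual AliEn/xrootd envelope format is not given in this record, so the sketch below only illustrates the underlying idea: the storage system returns a signed status message covering a file's size and checksum, which the catalogue can verify without trusting the client. The use of HMAC, the JSON field names and the shared-key handling are illustrative assumptions.

      import hmac, hashlib, json, time

      SHARED_KEY = b"storage-element-secret"  # illustrative; a real deployment would manage keys per storage element

      def sign_status(lfn: str, size: int, checksum: str) -> dict:
          """Storage side: emit a status message plus an HMAC over its fields."""
          msg = {"lfn": lfn, "size": size, "md5": checksum, "ts": int(time.time())}
          payload = json.dumps(msg, sort_keys=True).encode()
          msg["sig"] = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
          return msg

      def verify_status(msg: dict) -> bool:
          """Catalogue side: recompute the HMAC and compare in constant time."""
          claimed = msg.get("sig", "")
          body = {k: v for k, v in msg.items() if k != "sig"}
          payload = json.dumps(body, sort_keys=True).encode()
          expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
          return hmac.compare_digest(claimed, expected)

      status = sign_status("/alice/data/run123/file.root", 1_048_576,
                           "d41d8cd98f00b204e9800998ecf8427e")
      assert verify_status(status)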

  11. 42 CFR 84.40 - Quality control plans; filing requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control plans; filing requirements. 84.40... Control § 84.40 Quality control plans; filing requirements. As a part of each application for approval or... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  12. 45 CFR 213.5 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Filing and service of papers. 213.5 Section 213.5... and service of papers. (a) All papers in the proceedings shall be filed with the FSA Hearing Clerk, in...) All papers in the proceedings shall be served on all parties by personal delivery or by mail. Service...

  13. 78 FR 11635 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-02-19

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice.... Applicants: Wildcat Wind Farm I, LLC. Description: Wildcat Wind Farm I, LLC withdraws request for... electric rate filings: Docket Numbers: ER13-916-000. Applicants: Public Service Company of New Mexico...

  14. 77 FR 2284 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-01-17

    ...: CPV Cimarron Renewable Energy Company, LLC. Description: CPV Cimarron Renewable Energy Company, LLC... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice... Service Company of New Mexico. Description: Public Service Company of New Mexico submits tariff filing per...

  15. 28 CFR 32.32 - Time for filing claim.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Time for filing claim. 32.32 Section 32.32 Judicial Administration DEPARTMENT OF JUSTICE PUBLIC SAFETY OFFICERS' DEATH, DISABILITY, AND EDUCATIONAL ASSISTANCE BENEFIT CLAIMS Educational Assistance Benefit Claims § 32.32 Time for filing claim. (a...

  16. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  17. 45 CFR 1705.3 - Procedures for requests pertaining to individual records in the D/AC File.

    Science.gov (United States)

    2010-10-01

    ... records in the D/AC File. 1705.3 Section 1705.3 Public Welfare Regulations Relating to Public Welfare... for requests pertaining to individual records in the D/AC File. (a) An individual who wishes to know whether the D/AC File contains a record pertaining to him or her shall submit a written request to that...

  18. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  19. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most of the software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of the benefits of a large-scale parallel file system. The "pcircle" software builds on top of the ubiquitous MPI in cluster computing environments and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.
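
    A much-simplified sketch of rank-parallel checksumming with mpi4py: files are statically partitioned across MPI ranks and the per-rank digests are gathered on rank 0. pcircle itself uses work-stealing rather than a static partition, so this is only an illustration of the general approach, not its implementation.

      # Run with: mpirun -n 4 python checksum.py <directory>
      import sys, hashlib, pathlib
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      root = pathlib.Path(sys.argv[1])
      files = sorted(p for p in root.rglob("*") if p.is_file())

      local = {}
      for path in files[rank::size]:            # every size-th file belongs to this rank
          h = hashlib.sha1()
          with open(path, "rb") as fh:
              for block in iter(lambda: fh.read(1 << 20), b""):
                  h.update(block)
          local[str(path)] = h.hexdigest()

      gathered = comm.gather(local, root=0)     # collect per-rank results on rank 0
      if rank == 0:
          checksums = {k: v for part in gathered for k, v in part.items()}
          for name, digest in sorted(checksums.items()):
              print(digest, name)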

  20. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology, creating the need for world-wide sharing of an underlying hierarchical file-system. The Andrew File System (AFS) is a well-known and widespread global distributed file-system. Multiple-Resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS may therefore be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from supercomputer applications. Experiences and scalability issues are discussed

  1. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  2. Provider Specific Data for Public Use in SAS Format

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Fiscal Intermediary maintains the Provider Specific File (PSF). The file contains information about the facts specific to the provider that affects computations...

  3. Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

  4. 42 CFR 1005.11 - Form, filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Form, filing and service of papers. 1005.11 Section... and service of papers. (a) Forms. (1) Unless the ALJ directs the parties to do otherwise, documents filed with the ALJ will include an original and two copies. (2) Every pleading and paper filed in the...

  5. 45 CFR 160.524 - Form, filing, and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Form, filing, and service of papers. 160.524..., and service of papers. (a) Forms. (1) Unless the ALJ directs the parties to do otherwise, documents filed with the ALJ must include an original and two copies. (2) Every pleading and paper filed in the...

  6. Computer ethics and cyber laws to mental health professionals.

    Science.gov (United States)

    Raveesh, B N; Pande, Sanjay

    2004-04-01

    The explosive growth of computer and communications technology raises new legal and ethical challenges that reflect tensions between individual rights and societal needs. For instance, should cracking into a computer system be viewed as a petty prank, as trespassing, as theft, or as espionage? Should placing copyrighted material onto a public file server be treated as freedom of expression or as theft? Should ordinary communications be encrypted using codes that make it impossible for law-enforcement agencies to perform wiretaps? As we develop shared understandings and norms of behaviour, we are setting standards that will govern the information society for decades to come.

  7. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  8. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field format, BCD data into well-defined binary files and providing for printing and punching data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables

  9. The acoustics of public squares/places: A comparison between results from a computer simulation program and measurements in situ

    DEFF Research Database (Denmark)

    Paini, Dario; Rindel, Jens Holger; Gade, Anders

    2004-01-01

    In the context of a PhD thesis whose main purpose is to analyse the importance of the public square/place (“agora”) as a meeting point of sound and music, with particular regard to its use for concerts (amplified or not), a first step was taken, making comparisons between measurements in situ ... (or a band during, for instance, summer music festivals) and the best position for the audience. A further result could be to propose some acoustic adjustments to achieve better acoustic quality by considering the acoustic parameters which are typically used for concert halls and opera houses.

  10. 76 FR 49467 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-08-10

    ... per 35.13(a)(2)(iii: Modification of its Transmission Depreciation Rate to be effective 10/1/2011...: Arizona Public Service Company submits tariff filing per 35.17(b): Amendment to the APS depreciation...

  11. 78 FR 20907 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-04-08

    ... Independent Transmission System Operator, Inc. submits 2013-03-29 MidAm Att O Depreciation Rates to be...-Retirement Benefits Other than Pensions of Public Service Company of Colorado. Filed Date: 3/29/13. Accession...

  12. Air and Soil Data Files from Sumas Study

    Data.gov (United States)

    U.S. Environmental Protection Agency — The data are summarized in the manuscript, but users may wish to apply them from these files. This dataset is associated with the following publication: Wroble, J.,...

  13. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  14. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate in a prototype study that an HDF4 file content map can be used to organize data efficiently in a cloud object storage system and so facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as long-term preservation of NASA data that does not require HDF4 APIs to access the data.
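
    A hypothetical sketch of the idea: given a content-map entry recording where a dataset's raw bytes live inside the original file, only that byte range is fetched from the object store with an HTTP Range request, so no HDF4 library is needed on the reading side. The map layout, URL and field names are invented for illustration.

      import json
      import requests  # third-party HTTP client

      def read_object_range(url: str, offset: int, length: int) -> bytes:
          """Fetch just the bytes [offset, offset+length) of an object via HTTP Range."""
          headers = {"Range": f"bytes={offset}-{offset + length - 1}"}
          resp = requests.get(url, headers=headers, timeout=30)
          resp.raise_for_status()
          return resp.content

      # A hypothetical content-map entry derived from an HDF4 file map: it records
      # where a named dataset's raw bytes live inside the original granule.
      content_map = json.loads("""
      {
        "object_url": "https://example-bucket.s3.amazonaws.com/granule.hdf",
        "datasets": [
          {"name": "Latitude", "offset": 4096, "length": 291600}
        ]
      }
      """)

      entry = content_map["datasets"][0]
      raw = read_object_range(content_map["object_url"], entry["offset"], entry["length"])
      print(entry["name"], len(raw), "bytes fetched without HDF4 APIs")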

  15. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  16. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  17. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer-based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12-bit) character set, or punched cards.

  18. A simulation model for visitors’ thermal comfort at urban public squares using non-probabilistic binary-linear classifier through soft-computing methodologies

    International Nuclear Information System (INIS)

    Kariminia, Shahab; Shamshirband, Shahaboddin; Hashim, Roslan; Saberi, Ahmadreza; Petković, Dalibor; Roy, Chandrabhushan; Motamedi, Shervin

    2016-01-01

    Outdoor life in cities is declining because of recent rapid urbanisation that has not considered climate-responsive urban design concepts. Such inadvertent climatic modifications have imposed considerable demand on urban energy resources. It is important to provide a comfortable ambient climate at open urban squares, and researchers need to be able to predict the comfort conditions at such outdoor spaces. The main objective of this study is to predict visitors' outdoor comfort indices by using a computational model termed SVM-WAVELET (Support Vector Machines combined with a Discrete Wavelet Transform algorithm). For data collection, the field study was conducted in downtown Isfahan, Iran (51°41′ E, 32°37′ N), which has hot and arid summers. Based on different environmental elements, four separate locations were monitored across two public squares. Meteorological data were measured while simultaneously surveying the visitors' thermal sensations. According to the subjects' thermal feeling and their characteristics, their level of comfort was estimated. Further, the adapted computational model was used to estimate the visitors' thermal sensations in terms of thermal comfort indices. The SVM-WAVELET results indicate that the R² values for the input parameters, including thermal sensation, PMV (predicted mean vote), PET (physiologically equivalent temperature), SET (standard effective temperature) and T_mrt, were estimated at 0.482, 0.943, 0.988, 0.969 and 0.840, respectively. - Highlights: • To explore visitors' thermal sensation at urban public squares. • This article introduces findings on outdoor comfort prediction. • The developed SVM-WAVELET soft-computing technique was used. • SVM-WAVELET estimation results are more reliable and accurate.
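
    A loose sketch of the general "SVM plus discrete wavelet transform" idea using PyWavelets and scikit-learn: wavelet coefficients of a short, synthetic meteorological trace serve as features for an SVR predicting a comfort-index-like target. The data, wavelet choice and model settings are assumptions and do not reproduce the authors' model.

      import numpy as np
      import pywt
      from sklearn.svm import SVR
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n_samples, series_len = 200, 64
      signals = rng.normal(25, 5, size=(n_samples, series_len))       # e.g. air-temperature traces
      targets = signals.mean(axis=1) * 0.1 - 2.0 + rng.normal(0, 0.1, n_samples)  # stand-in comfort index

      def wavelet_features(x: np.ndarray) -> np.ndarray:
          coeffs = pywt.wavedec(x, "db4", level=3)   # multilevel discrete wavelet transform
          return np.concatenate(coeffs)

      X = np.vstack([wavelet_features(s) for s in signals])
      X_tr, X_te, y_tr, y_te = train_test_split(X, targets, random_state=0)

      model = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
      print("R^2 on held-out samples:", round(model.score(X_te, y_te), 3))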

  19. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit, and the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  20. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap between memory and disk access times. To solve this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks that significantly improve the accuracy, success-per-reference, and effective-success-rate-per-reference of neural-network-based file access prediction with proper tuning. In particular, we verified that incorrect predictions were reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference obtained with a standard configuration. Simulations on distributed file system (DFS) traces reveal that an exact-fit radial basis function (RBF) network gives better predictions in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms it on systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources, and the former predictor is more efficient than the latter for servers having the most system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better file prediction success rate than the simple perceptron, last successor, stable successor, and best k out of m predictors.
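
    A simplified stand-in for such predictors: a scikit-learn MLP is trained to predict the next file accessed from a one-hot-encoded window of the k previous accesses on a synthetic trace. Note that scikit-learn trains with Adam rather than Levenberg-Marquardt, so this only illustrates the setup, not the paper's exact networks or traces.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(1)
      n_files, k = 20, 3
      pattern = list(rng.integers(0, n_files, size=50))
      trace = pattern * 20                                  # synthetic repeating access trace

      X_idx = np.array([trace[i:i + k] for i in range(len(trace) - k)])
      y = np.array([trace[i + k] for i in range(len(trace) - k)])
      X = np.eye(n_files)[X_idx].reshape(len(X_idx), -1)    # one-hot encode the history window

      split = int(0.8 * len(X))
      clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
      clf.fit(X[:split], y[:split])
      print("hit rate on held-out accesses:", round(clf.score(X[split:], y[split:]), 3))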

  1. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  2. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  3. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  4. 78 FR 7424 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-02-01

    .... Applicants: Public Service Company of New Mexico, Delta Person GP, LLC, BHB Power, LLC, Delta Person, Limited...-003. Applicants: Southwestern Public Service Company. Description: Supplement to June 29, 2012 Triennial Market Power Analysis of Southwestern Public Service Company. Filed Date: 1/24/13. Accession...

  5. 78 FR 20904 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-04-08

    ... Market Power Analysis of Southwestern Public Service Company. Filed Date: 4/1/13. Accession Number....m. ET 4/22/13. Docket Numbers: ER13-1211-000. Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits Modification of Real Power Loss Factor SA to be...

  6. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid in the creation of a new radiology teaching file by analysing the existing radiology teaching file sites on the Internet from many aspects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites (1) The country distribution was highest for the USA (57.5%). (2) The average number of cases was 186, and 9 sites (22.5%) provided a search engine. (3) Regarding the method of case arrangement, the anatomic-area type and the diagnosis type were each found at 10 sites (25%), and the question-and-answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites covering oro-maxillofacial disorders numbered 9 (22.5%). (5) Regarding image format, the GIF format was found at 14 sites (35%) and the JPEG format at 14 sites (35%). (6) The most common creation year was 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files (1) The average file size of the GIF format (71 Kbyte) was greater than that of the JPEG format (24 Kbyte). (P<0.001) (2) The image quality of the GIF format was better than that of the JPEG format. (P<0.001)

  7. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  8. An Examination of Publication Bias in an International Journal of Information Technology in Education

    Science.gov (United States)

    Liu, Leping; Aberasturi, Suzanne M.; Axtell, Kulwadee; Richmond, Aaron

    2007-01-01

    Publication bias refers to a tendency to publish articles with significant results over publications with nonsignificant results. In this article we first review the literature of publication bias focusing on the three major determinants (file drawer significance, file drawer effect size, and file drawer sample size) and two interrelated sources…

  9. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  10. [Analysis and evaluation of the visual effort in remote-control public traffic operators working with computer-based equipments].

    Science.gov (United States)

    Gullà, F; Zambelli, P; Bergamaschi, A; Piccoli, B

    2007-01-01

    The aim of this study was the objective evaluation, by means of electronic equipment, of the visual effort of 6 public traffic controllers (4 male, 2 female, mean age 29.6). The equipment quantifies the observation distance and the observation time within each controller's occupational visual field. These parameters are obtained by emitting ultrasound at 40 kHz from an emitting sensor (placed by the VDT screen) and receiving it with a sensor placed on the operator's head. Since the speed of sound in air is known and constant (about 340 m/s), the travelling time of the ultrasound (US) is used to calculate the distance between the emitting and receiving sensors. The results show that the visual acuity required is of average level, while the accommodation and convergence effort varies from average to intense (depending on the visual characteristics of the operator considered), ranging from 26.41% to 43.92% of the accommodation and convergence available to each operator. The time actually spent in "near observation within the c.v.p." (Tscr) ranged from 2 h 54' to 4 h 05'.

  11. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  12. 46 CFR 531.9 - Publication.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 9 2010-10-01 2010-10-01 false Publication. 531.9 Section 531.9 Shipping FEDERAL... Publication of Essential Terms § 531.9 Publication. (a) Contents. All authorized persons who choose to file... contained in a publication widely available to the public and well known within the industry. Reference may...

  13. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  14. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  15. Efficacy of D-RaCe and ProTaper Universal Retreatment NiTi instruments and hand files in removing gutta-percha from curved root canals - a micro-computed tomography study.

    Science.gov (United States)

    Rödig, T; Hausdörfer, T; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2012-06-01

    To compare the efficacy of two rotary NiTi retreatment systems and Hedström files in removing filling material from curved root canals. Curved root canals of 57 extracted teeth were prepared using FlexMaster instruments and filled with gutta-percha and AH Plus. After determination of root canal curvatures and radii in two directions, the teeth were assigned to three identical groups (n = 19). The root fillings were removed with D-RaCe instruments, ProTaper Universal Retreatment instruments or Hedström files. Pre- and postoperative micro-CT imaging was used to assess the percentage of residual filling material as well as the amount of dentine removal. Working time and procedural errors were recorded. Data were analysed using analysis of covariance and analysis of variance procedures. D-RaCe instruments were significantly more effective than ProTaper Universal Retreatment instruments and Hedström files. In the ProTaper group, four instrument fractures and one lateral perforation were observed. Five instrument fractures were recorded for D-RaCe. D-RaCe instruments were associated with significantly less residual filling material than ProTaper Universal Retreatment instruments and hand files. Hedström files removed significantly less dentine than both rotary NiTi systems. Retreatment with rotary NiTi systems resulted in a high incidence of procedural errors. © 2012 International Endodontic Journal.

  16. Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011). User's Manual for the ECLS-K:2011 Kindergarten-First Grade Data File and Electronic Codebook, Public Version. NCES 2015-078

    Science.gov (United States)

    Tourangeau, Karen; Nord, Christine; Lê, Thanh; Wallner-Allen, Kathleen; Hagedorn, Mary C.; Leggitt, John; Najarian, Michelle

    2015-01-01

    This manual provides guidance and documentation for users of the longitudinal kindergarten-first grade (K-1) data file of the Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011). It mainly provides information specific to the first-grade rounds of data collection. Data for the ECLS-K:2011 are released in both a…

  17. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  18. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  19. 75 FR 52267 - Waiver of Statement of Account Filing Deadline for the 2010/1 Period

    Science.gov (United States)

    2010-08-25

    ..., Public Television Claimants, Music Claimants, Devotional Claimants, National Public Radio and Canadian... time so operators can make that single filing will alleviate burdens on the cable industry as well as...

  20. Copyright and personal use of CERN’s computing infrastructure

    CERN Multimedia

    IT Department

    2009-01-01

    (The French version will be online shortly.) The rules covering the personal use of CERN’s computing infrastructure are defined in Operational Circular No. 5 and its Subsidiary Rules (see http://cern.ch/ComputingRules). All users of CERN’s computing infrastructure must comply with these rules, whether they access CERN’s computing facilities from within the Organization’s site or at another location. In particular, OC5 clause 17 requires that proprietary rights (the rights in software, music, video, etc.) must be respected. The user is liable for damages resulting from non-compliance. Recently, there have been several violations of OC5, where copyright material was discovered on public world-readable disk space. Please ensure that all material under your responsibility (in particular in files owned by your account) respects proprietary rights, including with respect to the restriction of access by third parties. CERN Security Team

  1. 75 FR 29817 - Proposed Collection; Comment Request for Publication 1345

    Science.gov (United States)

    2010-05-27

    ... Publication 1345 AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice and request for comments... Publication 1345, Handbook for Authorized IRS e-file Providers. DATES: Written comments should be received on....gov . SUPPLEMENTARY INFORMATION: Title: Publication 1345, Handbook for Authorized IRS e-file Providers...

  2. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  3. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  4. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. The three main aspects of file processing (creation, modification, reorganisation) have thus been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the files indexed by this descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed
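
    An in-memory sketch of the two structures described above: the inverse file maps each descriptor to the numbers of the documents it indexes, with the dictionary's access role played here by the hash-map keys. On disk these would be separate indexed files; the names and API are illustrative only.

      from collections import defaultdict

      # "Inverse file": descriptor -> list of document numbers indexed by it.
      inverse_file: dict[str, list[int]] = defaultdict(list)

      def index_document(doc_number: int, descriptors: list[str]) -> None:
          for d in descriptors:
              inverse_file[d].append(doc_number)

      def search(*descriptors: str) -> set[int]:
          """Selective access: documents indexed by every requested descriptor."""
          postings = [set(inverse_file.get(d, ())) for d in descriptors]
          return set.intersection(*postings) if postings else set()

      index_document(1, ["reactor", "neutron", "dosimetry"])
      index_document(2, ["reactor", "fuel"])
      index_document(3, ["neutron", "dosimetry"])
      print(sorted(search("reactor", "neutron")))   # -> [1]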

  5. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

    This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we need to really share the data across. ...

  6. Download this PDF file

    African Journals Online (AJOL)

    5, May 1923, p. 287. (South African Military Schools) p 287. CGS Box 231, File 31/0/2. .... One gains the impression that the sphere .... tions, Anthropology, Sociology and Man Management. ... of the word, possesses personality and initiative.

  7. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  8. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  9. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  10. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  11. Download this PDF file

    African Journals Online (AJOL)

    countries quite a number of distance education institutions and programmes are more likely to be ... The Open University of Tanzania (OUT), (Ministry of Higher Education, Science and ..... (1991) Comic Relief Funding file. BAI, London, 1st ...

  12. 77 FR 31237 - Electronic Filing in the Copyright Office of Notices of Intention To Obtain a Section 115...

    Science.gov (United States)

    2012-05-25

    ... law, such notices may be filed in the Office only when the public records of the Copyright Office do... filed in the Copyright Office is sufficient as a matter of law under this section, that issue shall be... LIBRARY OF CONGRESS Copyright Office 37 CFR Part 201 [Docket No. RM 2012-4] Electronic Filing in...

  13. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  14. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  15. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  16. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored, together with the patient's identification number (ID), on floppy discs successively in the order in which scintigraphy was performed. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format, which can also be used as a reporting format. Output or retrieval of filed individual patient data is possible by examination type, disease code or ID. This system appears suitable for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  17. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single-shared-file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which generates a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune the parallel I/O performance of this feature with the parallel file systems of the Cray XC40 system at NERSC (Cori), which include burst buffer storage and Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a performance advantage of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters for obtaining superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of the subfiling feature.
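
    The HDF5 subfiling feature itself is not sketched here; the snippet below mimics the same compromise by hand: every group of ranks_per_subfile MPI ranks shares one HDF5 file opened with the MPI-IO driver, so contention is spread over several files instead of one shared file or one file per process. It assumes an MPI-enabled ("parallel") build of h5py plus mpi4py, and the file names and sizes are illustrative.

      # Run with: mpirun -n 8 python subfile_write.py
      import numpy as np
      import h5py                       # must be built with parallel (MPI) support
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      ranks_per_subfile = 4
      color = rank // ranks_per_subfile
      subcomm = comm.Split(color, rank)             # one communicator per subfile
      local_rank = subcomm.Get_rank()

      data = np.full(1024, rank, dtype=np.float64)  # this rank's slab of the dataset

      # Each group of ranks writes collectively into its own subfile.
      with h5py.File(f"output.subfile_{color}.h5", "w", driver="mpio", comm=subcomm) as f:
          dset = f.create_dataset("slab", (subcomm.Get_size(), data.size), dtype="f8")
          dset[local_rank, :] = data                # shared file per group, independent slabs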

  18. 41 CFR 101-26.308 - Obtaining filing cabinets.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Obtaining filing cabinets. 101-26.308 Section 101-26.308 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 26-PROCUREMENT SOURCES AND...

  19. 78 FR 68431 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-11-14

    ... Company's Petition for Limited Waiver. Filed Date: 11/4/13. Accession Number: 20131104-5152. Comments Due.... ET 11/26/13. Docket Numbers: ER14-313-000. Applicants: Public Service Company of New Hampshire. Description: Public Service Company of New Hampshire submits Cancellation of LCRA with CMEEC to be effective 1...

  20. 78 FR 2381 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-01-11

    ...: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits its Triennial Market Power Update pursuant to Order No. 697. Filed Date: 12/21/12. Accession Number: 20121226... Pacific Power Company. Description: Updated Market Power Analysis for Southwest Region of Sierra Pacific...

  1. 77 FR 41399 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-07-13

    .... Description: Updated Market Power Analysis for the Central Region of NextEra Energy Companies. Filed Date: 7/2.... Applicants: Public Service Company of New Mexico. Description: Notice of Cancellation of Service Schedule G to the Interconnection Agreement between Public Service Company of New Mexico and Los Alamos County...

  2. PRO/Mapper: a plotting program for the DEC PRO/300 personal computers utilizing the MAPPER graphics language

    International Nuclear Information System (INIS)

    Wachter, J.W.

    1986-05-01

    PRO/Mapper is an application for the Digital Equipment Corporation PRO/300 series of personal computers that facilitates the preparation of visuals such as graphs, charts, and maps in color or black and white. The user prepares an input data file containing English-language commands using a standard editor. PRO/Mapper then reads these files and draws graphs, maps, boxes, and complex line segments onto the computer screen. Axes, curves, and error bars may be plotted in graphical presentations. The commands of PRO/Mapper are a subset of the commands of the more sophisticated MAPPER program written for mainframe computers. The PRO/Mapper commands were chosen primarily for the production of linear graphs. Command files written for the PRO/300 are upward compatible with the Martin Marietta Energy Systems version of MAPPER and can be used to produce publication-quality slides, drawings, and maps on the various output devices of the Oak Ridge National Laboratory mainframe computers.
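
    To make the command-file idea concrete, here is a hedged, present-day sketch of the same pattern in Python: a tiny interpreter that reads plotting commands from a text file and renders them with matplotlib. The command names (TITLE, POINT, PLOT) are invented for illustration and are not the actual MAPPER command set.

```python
# Illustrative sketch of a tiny command-file plotter in the spirit of PRO/Mapper;
# the command names (TITLE, POINT, PLOT) are hypothetical, not the real MAPPER set.
import matplotlib.pyplot as plt

def plot_command_file(path):
    xs, ys, title = [], [], ""
    with open(path) as fh:
        for line in fh:
            tokens = line.split()
            if not tokens:
                continue
            cmd = tokens[0].upper()
            if cmd == "TITLE":
                title = " ".join(tokens[1:])
            elif cmd == "POINT":            # POINT x y
                xs.append(float(tokens[1]))
                ys.append(float(tokens[2]))
            elif cmd == "PLOT":             # draw everything collected so far
                plt.plot(xs, ys, marker="o")
                plt.title(title)
                plt.savefig("plot.png")
```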

  3. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. It is shown that the mean square displacement (MSD), ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.
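
    For readers who want to experiment with the underlying model, the sketch below simulates ordinary (Brownian, exponential-waiting-time) single-file dynamics with a hard-core non-crossing constraint; the particle number, step size, and density are arbitrary choices, and the renewal-anomalous case would replace the per-step clock with draws from a heavy-tailed waiting-time distribution.

```python
# Minimal sketch of single-file (hard-core, 1D) Brownian dynamics: a proposed
# displacement is rejected if it would let a particle pass its neighbour.
# Parameters are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_moves, step = 50, 10_000, 0.1
x = np.sort(rng.uniform(0.0, 50.0, n_particles))   # initial ordered positions
x0 = x.copy()

for _ in range(n_moves):
    i = rng.integers(n_particles)                  # pick a random particle
    trial = x[i] + rng.normal(0.0, step)
    left = x[i - 1] if i > 0 else -np.inf
    right = x[i + 1] if i < n_particles - 1 else np.inf
    if left < trial < right:                       # hard-core (non-crossing) constraint
        x[i] = trial

msd = np.mean((x - x0) ** 2)
print(f"mean square displacement after {n_moves} attempted moves: {msd:.3f}")
```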

  4. The Cigeo project - Industrial centre of deep reversible storage of radioactive wastes in Meuse/Haute-Marne - Synthesis of the commissioner's file. Public debate from the 15 May to the 15 October 2013

    International Nuclear Information System (INIS)

    2013-03-01

    This synthesis of the commissioner's file of the Cigeo project (project of an industrial center of deep reversible storage of radioactive wastes in Meuse/Haute-Marne region, France), explains the reason of the project, the waste management choices made by other countries, the planning of the project, the type and volume of wastes that will be stored at Cigeo, the operation of the facility, the reversibility aspect, the selection and sitting of the site (surface and underground installations), the safety warranties, and the socio-economic benefits of the project

  5. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to
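
    The blocked-binary idea translates directly into a small machine-independent reader. The sketch below is a hedged illustration only: the header fields and sample type are invented placeholders, not the actual DR100/VFBB layout, which a real reader would take from that format's specification.

```python
# Hedged sketch of reading a simple blocked-binary waveform file shared over NFS.
# The header layout (8-byte station code, float64 sample rate, int32 sample count)
# is hypothetical; a real DR100/VFBB reader would follow that format's actual spec.
import struct
import numpy as np

def read_waveform(path):
    with open(path, "rb") as fh:
        station, rate, count = struct.unpack("<8sdi", fh.read(20))
        samples = np.fromfile(fh, dtype="<f4", count=count)   # assumed 32-bit samples
    return station.decode("ascii").strip("\x00"), rate, samples
```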

  6. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented to validate any modifications, to check the authorization of anyone trying to modify a file, to record who had made changes (and when and why), and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
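
    One of the checks such a service performs, validating a modified configuration file against its schema before it is accepted, can be illustrated with the hedged sketch below; it uses lxml and hypothetical file names, and is not the ATLAS service's actual code.

```python
# Illustrative validation step only (not the actual ATLAS service): check a
# configuration file against its XML schema before it is committed.
# File names are hypothetical.
from lxml import etree

schema = etree.XMLSchema(etree.parse("partition_schema.xsd"))
doc = etree.parse("candidate_partition.xml")

if schema.validate(doc):
    print("configuration file is schema-valid")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```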

  7. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  8. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE - to transfer the main buffer to duplicate magnetic tapes, 2) REPORT - to determine when the main buffer is full enough to archive, 3) INCREMENT - to back up the partially filled main buffer, and 4) FULLBACKUP - to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  9. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100." The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.
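
    The core idea, an event catalog that maps tag predicates to the files holding matching events so that only those files are staged from mass storage, can be sketched as follows; the catalog structure, tag names, and file names are hypothetical and do not reflect the Grid Collector's actual interfaces.

```python
# Sketch of the idea behind an event catalog: map a tag query to the files that
# hold matching events, so only those files need to be staged from mass storage.
# Catalog structure, tag names, and file names are hypothetical.
from datetime import date

catalog = [
    {"file": "run1001.root", "production_date": date(2003, 3, 12), "n_charged_tracks": 142},
    {"file": "run1002.root", "production_date": date(2003, 3, 25), "n_charged_tracks": 87},
]

def select_files(events, start, end, min_tracks):
    return sorted({e["file"] for e in events
                   if start <= e["production_date"] <= end
                   and e["n_charged_tracks"] > min_tracks})

print(select_files(catalog, date(2003, 3, 10), date(2003, 3, 20), 100))
# -> ['run1001.root']: only this file would be copied from mass storage.
```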

  10. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100." The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  11. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  12. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
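
    A hedged sketch of the final step only, pushing per-process checkpoint files into a cloud object store as one object each, is shown below using boto3 against an S3-compatible endpoint; the endpoint, bucket, and paths are assumptions, and the PLFS middleware layer itself is not modelled.

```python
# Hedged sketch of the last step only: pushing per-process checkpoint files into
# a cloud object store as objects. The PLFS/middleware details are not modelled;
# endpoint, bucket name, and paths are hypothetical.
import pathlib
import boto3

s3 = boto3.client("s3", endpoint_url="https://objectstore.example.org")
bucket = "hpc-checkpoints"

for ckpt in sorted(pathlib.Path("checkpoints/step_000420").glob("rank_*.ckpt")):
    key = f"app/step_000420/{ckpt.name}"       # one object per checkpoint file
    s3.upload_file(str(ckpt), bucket, key)
```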

  13. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  14. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  15. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
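
    The definition in these two records translates almost literally into executable form. The sketch below models a file system as a partial function from absolute paths to data (a dictionary keyed by path tuples), with read, write, and removal operations; the operation names are ours, and the formal treatment in the papers is considerably richer.

```python
# Minimal executable model of the papers' definition: a file system as a partial
# function from absolute paths to data, here a dict keyed by path tuples.
# Operation names are ours, not the papers'.
FileSystem = dict[tuple[str, ...], bytes]

def valid(fs: FileSystem, path: tuple[str, ...]) -> bool:
    return path in fs                       # the domain of the partial function

def write(fs: FileSystem, path: tuple[str, ...], data: bytes) -> None:
    fs[path] = data                         # create-or-update at a path

def read(fs: FileSystem, path: tuple[str, ...]) -> bytes:
    return fs[path]                         # only defined on valid paths

def remove(fs: FileSystem, path: tuple[str, ...]) -> None:
    del fs[path]                            # shrink the domain

fs: FileSystem = {}
write(fs, ("home", "alice", "notes.txt"), b"hello")
assert valid(fs, ("home", "alice", "notes.txt"))
assert read(fs, ("home", "alice", "notes.txt")) == b"hello"
```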

  16. Interactive publications: creation and usage

    Science.gov (United States)

    Thoma, George R.; Ford, Glenn; Chung, Michael; Vasudevan, Kirankumar; Antani, Sameer

    2006-02-01

    As envisioned here, an "interactive publication" has similarities to multimedia documents that have been in existence for a decade or more, but possesses specific differentiating characteristics. In common usage, the latter refers to online entities that, in addition to text, consist of files of images and video clips residing separately in databases, rarely providing immediate context to the document text. While an interactive publication has many media objects as does the "traditional" multimedia document, it is a self-contained document, either as a single file with media files embedded within it, or as a "folder" containing tightly linked media files. The main characteristic that differentiates an interactive publication from a traditional multimedia document is that the reader would be able to reuse the media content for analysis and presentation, and to check the underlying data and possibly derive alternative conclusions leading, for example, to more in-depth peer reviews. We have created prototype publications containing paginated text and several media types encountered in the biomedical literature: 3D animations of anatomic structures; graphs, charts and tabular data; cell development images (video sequences); and clinical images such as CT, MRI and ultrasound in the DICOM format. This paper presents developments to date including: a tool to convert static tables or graphs into interactive entities, authoring procedures followed to create prototypes, and advantages and drawbacks of each of these platforms. It also outlines future work including meeting the challenge of network distribution for these large files.

  17. Publications | Page 293 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Use this search tool to locate a specific publication for your field of research. ... Via a graphical user interface, raw binary data files can be converted into several ... Participatory Geographic Information Systems (P-GIS) for natural resource ...

  18. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
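
    The offset-based reading pattern described in the slides looks roughly like the following in Python (the slides themselves use MATLAB); the offsets, header fields, and pixel type below are placeholders rather than the real vk4 layout, which must be taken from the format notes.

```python
# Pattern sketch (Python rather than MATLAB) for pulling an image block out of a
# binary file by offset. The offsets and field sizes below are placeholders,
# NOT the real vk4 layout, which the slides derive from the binary file itself.
import struct
import numpy as np

def read_height_image(path, offset_table_pos=12):
    with open(path, "rb") as fh:
        fh.seek(offset_table_pos)
        (height_block_offset,) = struct.unpack("<I", fh.read(4))   # placeholder field
        fh.seek(height_block_offset)
        width, height = struct.unpack("<II", fh.read(8))           # placeholder header
        data = np.fromfile(fh, dtype="<u4", count=width * height)
    return data.reshape(height, width)
```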

  19. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    Full Text Available The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator to allow on-demand checking, and use of tailorable reporting and exception policies targeted to adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer, as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes file digests, then stores those digests in a database file located on write-once media. An Examiner agent computes a new digest to compare with the original digests in the database file. Changes in digest values indicate that the file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
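
    The digest step at the heart of the Initiator/Examiner roles is plain MD5 hashing and comparison, sketched below; storing the baseline as a JSON file is our simplification of the paper's write-once database file.

```python
# The digest step underlying the Initiator/Examiner roles: record MD5 digests of
# files once, then re-compute and compare to detect tampering. Storing the
# baseline as JSON is a simplification of the paper's database file.
import hashlib, json, pathlib

def md5_of(path, chunk=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as fh:
        while block := fh.read(chunk):
            h.update(block)
    return h.hexdigest()

def build_baseline(root, out="baseline.json"):
    digests = {str(p): md5_of(p) for p in pathlib.Path(root).rglob("*") if p.is_file()}
    pathlib.Path(out).write_text(json.dumps(digests, indent=2))

def examine(baseline="baseline.json"):
    digests = json.loads(pathlib.Path(baseline).read_text())
    # report files that disappeared or whose contents changed
    return [p for p, d in digests.items()
            if not pathlib.Path(p).is_file() or md5_of(p) != d]
```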

  20. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of the distributed operating system, the distributed file system has become an important part of the operating system. P2P is a reliable way of sharing files in a distributed operating system. It was introduced in 1999 and later became a topic of high research interest. A peer-to-peer network is a type of network where peers share the network workload and other related tasks. A P2P network can also be a temporary connection, where a group of computers connected by a USB (Universal Serial Bus) port transfer files or enable disk sharing. Currently P2P requires a special network designed in a P2P way. Nowadays, browsers have a big influence on our lives. In this project we study the file-sharing mechanism for distributed operating systems in web browsers, where we try to find performance bottlenecks; our research aims to improve file sharing in terms of performance and scalability in distributed file systems. Additionally, we discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.
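
    Whatever the transport, a peer-to-peer file transfer needs sender-side chunking; the hedged sketch below uses a generic send callback standing in for the data-channel write (a WebRTC data channel in the browser case), and the chunk size is an assumption.

```python
# Library-agnostic sketch of sender-side chunking for a peer-to-peer transfer;
# `send` stands in for whatever data-channel write the transport provides.
# The 16 KiB chunk size is an assumption, chosen as a conservative message size.
import hashlib

CHUNK = 16 * 1024

def send_file(path, send):
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        seq = 0
        while chunk := fh.read(CHUNK):
            digest.update(chunk)
            send(seq.to_bytes(4, "big") + chunk)   # prefix a sequence number
            seq += 1
    send(b"EOF" + digest.digest())                 # lets the receiver verify integrity
```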

  1. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  2. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  3. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
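
    The private-name-space idea can be modelled in a few lines: a per-user table maps logical path prefixes to underlying file systems, and lookups resolve by longest-prefix match. The sketch below is a toy illustration with callables standing in for real file-system clients, not Jade's implementation.

```python
# Toy model of a private name space in the Jade spirit: a per-user table maps
# logical prefixes to underlying file systems (plain callables here), and a
# lookup resolves a logical path by longest-prefix match. Not Jade's code.
def resolve(namespace, logical_path):
    best = max((p for p in namespace if logical_path.startswith(p)), key=len, default=None)
    if best is None:
        raise FileNotFoundError(logical_path)
    handler = namespace[best]
    return handler(logical_path[len(best):] or "/")

namespace = {
    "/papers": lambda rest: f"NFS server A gets {rest}",
    "/papers/drafts": lambda rest: f"FTP site B gets {rest}",   # two systems under one directory
}
print(resolve(namespace, "/papers/drafts/jade.tex"))   # longest prefix wins -> FTP site B
```

    Longest-prefix resolution is what lets one subdirectory be served by a different underlying system than its parent, mirroring the record's point about mounting multiple file systems under one directory.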

  4. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of a file into the block cache can effectively reduce the overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in this environment.
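
    A minimal version of the policy being evaluated, a block cache that performs one-block sequential read-ahead on every access, is sketched below; the backing read_block function, cache size, and LRU replacement are illustrative choices rather than details of the Butterfly Plus testbed.

```python
# Minimal block cache with one-block sequential read-ahead, the kind of policy
# the study evaluates; read_block, capacity, and LRU eviction are illustrative.
from collections import OrderedDict

class PrefetchingCache:
    def __init__(self, read_block, capacity=64):
        self.read_block = read_block          # fetches a block from the file system
        self.capacity = capacity
        self.cache = OrderedDict()            # LRU order: oldest first

    def _insert(self, block_no):
        if block_no not in self.cache:
            self.cache[block_no] = self.read_block(block_no)
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)   # evict least recently used

    def get(self, block_no):
        hit = block_no in self.cache
        self._insert(block_no)
        self.cache.move_to_end(block_no)
        self._insert(block_no + 1)            # sequential read-ahead (the prefetch)
        return self.cache[block_no], hit
```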

  5. Download this PDF file

    African Journals Online (AJOL)


  6. Challenging Ubiquitous Inverted Files

    NARCIS (Netherlands)

    de Vries, A.P.

    2000-01-01

    Stand-alone ranking systems based on highly optimized inverted file structures are generally considered ‘the’ solution for building search engines. Observing various developments in software and hardware, we argue however that IR research faces a complex engineering problem in the quest for more

  7. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; OKeefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility so that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
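
    The consistency primitive GFS relies on, an atomic read-modify-write guarded by a lock, can be illustrated with an ordinary POSIX advisory lock as a stand-in for the device-maintained locks described in the record; the sketch below is therefore an analogy, not the GFS mechanism.

```python
# The atomic read-modify-write pattern, illustrated with a plain POSIX advisory
# lock as a stand-in for the device-maintained locks described in the record
# (an analogy, not the GFS mechanism). File name is an example.
import fcntl, os

def increment_counter(path="counter.txt"):
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o644)
    with os.fdopen(fd, "r+") as fh:
        fcntl.flock(fh, fcntl.LOCK_EX)        # hold the lock for the whole read-modify-write
        raw = fh.read().strip()
        value = int(raw) if raw else 0
        fh.seek(0)
        fh.truncate()
        fh.write(str(value + 1))
        fcntl.flock(fh, fcntl.LOCK_UN)
```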

  8. Download this PDF file

    African Journals Online (AJOL)

    AJNS WEBMASTERS

    Incidence is higher in the elderly, about 58 per 100,000 per year. Diagnosis of CSDH is still .... in the other two patients was not stated in the case file. Evacuation of the Subdural .... Personal experience in 39 patients. Br J of Neurosurg. 2003 ...

  9. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    4 KB of data is read or written, data is copied back and forth using trampoline buffers — pages that are shared during proxy initialization — because...in 2008. CIO Magazine. 104 · File system virtual appliances [64] Megiddo, N. and Modha, D. S. 2003. ARC: A Self-Tuning, Low Over- head Replacement

  10. Publicity and public relations

    Science.gov (United States)

    Fosha, Charles E.

    1990-01-01

    This paper addresses approaches to using publicity and public relations to meet the goals of the NASA Space Grant College. Methods universities and colleges can use to publicize space activities are presented.

  11. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  12. Testing the Forensic Interestingness of Image Files Based on Size and Type

    Science.gov (United States)

    2017-09-01

    down to 0.18% (Rowe, 2015). III. IMAGE FILE FORMATS: When scanning a computer hard drive, many kinds of pictures are found. Digital images are not ... Interchange Format; JPEG Joint Photographic Experts Group; LSH Locality Sensitive Hashing; NSRL National Software Reference Library; PDF Portable Document ...

  13. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  14. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  15. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10⁻⁵ eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the ¹¹⁵In(n,n')¹¹⁶ᵐIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as a part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  16. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  17. 46 CFR 530.12 - Publication.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 9 2010-10-01 2010-10-01 false Publication. 530.12 Section 530.12 Shipping FEDERAL MARITIME COMMISSION REGULATIONS AFFECTING OCEAN SHIPPING IN FOREIGN COMMERCE SERVICE CONTRACTS Publication of Essential Terms § 530.12 Publication. (a) Contents. All authorized persons who have a duty to file...

  18. Code 672 observational science branch computer networks

    Science.gov (United States)

    Hancock, D. W.; Shirk, H. G.

    1988-01-01

    In general, networking increases productivity due to the speed of transmission, easy access to remote computers, ability to share files, and increased availability of peripherals. Two different networks within the Observational Science Branch are described in detail.

  19. A History of the Andrew File System

    CERN Multimedia

    CERN. Geneva; Altman, Jeffrey

    2011-01-01

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  20. 77 FR 21551 - New York City Department of Environmental Protection; Notice of Application Accepted for filing...

    Science.gov (United States)

    2012-04-10

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13287-004] New York City Department of Environmental Protection; Notice of Application Accepted for filing And Soliciting Motions To Intervene and Protests Take notice that the following hydroelectric application has been filed with the Commission and is available for public...

  1. 76 FR 39263 - Antidumping and Countervailing Duty Proceedings: Electronic Filing Procedures; Administrative...

    Science.gov (United States)

    2011-07-06

    ... Department requires that manually filed documents be bound only with a paper clip, butterfly/binder clip, or..., the CRU will continue to maintain the official record in paper form for those documents that were... documents filed after its implementation. The CRU will continue to maintain the public record in paper form...

  2. Perceptions of Cataloguers and End-Users towards Bilingual Authority Files.

    Science.gov (United States)

    Abdoulaye, Kaba

    2002-01-01

    Analyzes and describes bilingual authority files at the main library of the International Islamic University of Malaysia. Highlights include a review of multilingual research; perceptions of end users and catalogers; problems with bilingual files; and use of the OPAC (online public access catalog) by users. (Author/LRW)

  3. 76 FR 52012 - Notice of Filing of Plats of Survey, New Mexico

    Science.gov (United States)

    2011-08-19

    ... Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication. SUPPLEMENTARY INFORMATION: New Mexico Principal Meridian...

  4. 45 CFR 672.4 - Filing, service, and form of pleadings and documents.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Filing, service, and form of pleadings and... SCIENCE FOUNDATION ENFORCEMENT AND HEARING PROCEDURES § 672.4 Filing, service, and form of pleadings and... local officer, agency, department, corporation or other instrumentality shall be made by serving a copy...

  5. Radiological digital teaching file development: an overview

    International Nuclear Information System (INIS)

    Scarsbrook, A.F.; Foley, P.T.; Perriss, R.W.; Graham, R.N.J.

    2005-01-01

    Radiologists are collectors of interesting films for teaching purposes or for use in presentations and publications. Traditionally, hard copies of films have been stored in an organized fashion, usually in a filing cabinet or film library. This system has inherent limitations, such as the physical space required. Many of the shortcomings can be circumvented by development of an electronic teaching file. Whereas the implementation of an institutional radiological digital image database can require significant developmental effort and programming expertise, there are a number of web-based solutions which are freely available and can be relatively easily employed to establish a contemporary electronic image library. This article will review the various options and discuss the process of developing a digital image database

  6. SIDS-toADF File Mapping Manual

    Science.gov (United States)

    McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)

    2002-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-toADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database. Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of

  7. Cloud Computing: Architecture and Services

    OpenAIRE

    Ms. Ravneet Kaur

    2018-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid. It is a method for delivering information technology (IT) services where resources are retrieved from the Internet through web-based tools and applications, as opposed to a direct connection to a server. Rather than keeping files on a proprietary hard drive or local storage device, cloud-based storage makes it possib...

  8. File structure and organization in the automation system for operative account of equipment and materials in JINR

    International Nuclear Information System (INIS)

    Gulyaeva, N.D.; Markova, N.F.; Nikitina, V.I.; Tentyukova, G.N.

    1975-01-01

    The structure and organization of files in the information bank for the first variant of a JINR material and technical supply subsystem are described. An automated system for operative stock-taking of equipment has been developed on the basis of the SDS-6200 computer. Information is stored on magnetic discs. The arrangement of each file depends on its purpose and the structure of its data. Access to the files can be random or sequential. The files are divided into groups: primary document files, long-term reference files, and information on items that may change as a result of administrative decisions. [ru]

  9. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  10. Experience with an on-line computer for controlling and optimizing the gas supply and the application of various peak-load supply plants of a public utility

    Energy Technology Data Exchange (ETDEWEB)

    Poll, J [Technische Werke der Stadt Stuttgart A.G. (Germany, F.R.)

    1977-02-01

    The computer system has the following tasks: 1) On-line control; 2) supply of an information system; 3) performance of a gas marketing forecast; 4) background computations. Measured data are compiled, processed, monitored, recorded, prepared, and stored. The process is controlled by about a dozen programmes, the remaining tasks are taken over by 22 programmes. The system has proved a success.

  11. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  12. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  13. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  14. File: International bilateral relations

    International Nuclear Information System (INIS)

    Feltin, Ch.; Rabouhams, J.; Bravo, X.; Rousseau, M.; Le Breton, S.; Saint Raymond, Ph.; Brigaud, O.; Pertuis, V.; McNair, J.; Sayers, M.R.; Bye, R.; Scherrer, J.

    1998-01-01

    Since its creation in 1973, the Safety Authority has been assigned missions in the international field with the following objectives: to develop information exchanges with its foreign counterparts, to make known and explain the French approach and practice, and to give the concerned countries useful information on French nuclear facilities situated near the border. This file shows, with some examples, how bilateral relations fulfil these objectives and how the French Authority benefits from foreign experience. (N.C.)

  15. FEDGROUP - A program system for producing group constants from evaluated nuclear data of files disseminated by IAEA

    International Nuclear Information System (INIS)

    Vertes, P.

    1976-06-01

    A program system for calculating group constants from several evaluated nuclear data files has been developed. These files are distributed by the Nuclear Data Section of the IAEA. Our program system - FEDGROUP - has certain advantages over well-known similar codes: 1. it requires only a medium-sized computer (greater than or approximately equal to 20000 words of memory), 2. it is easily adaptable to any type of computer, 3. it is flexible with respect to the input evaluated nuclear data file and the output group constant file. Nowadays, FEDGROUP calculates practically all types of group constants needed for reactor physics calculations by using the most frequent representations of evaluated data. (author)

  16. Enkripsi dan Dekripsi File dengan Algoritma Blowfish pada Perangkat Mobile Berbasis Android

    Directory of Open Access Journals (Sweden)

    Siswo Wardoyo

    2016-03-01

    Full Text Available Cryptography is one of the ways used to secure data in the form of files, by encrypting them so that others who are not entitled to do so cannot read files that are private and confidential. One such method is the Blowfish algorithm, a symmetric-key cryptographic algorithm used to perform encryption and decryption. The application that was built can encrypt files in the form of images, videos, and documents. The application can run on a mobile phone with at least Android version 2.3. The software used to build the application is Eclipse. The results of this research indicate that the application built is capable of performing encryption and decryption. Encryption makes a file unintelligible to others. Using a key of 72 bits (9 characters), it would take about 1.49x10^8 years to break it at a computation speed of 10^6 keys/sec.
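
    For readers who want to reproduce the encrypt/decrypt behaviour on a desktop, the hedged sketch below uses PyCryptodome's Blowfish in CBC mode; the paper's application is a separate Android/Java implementation, and the key, file names, and IV handling here are illustrative choices only.

```python
# Hedged sketch of Blowfish file encryption/decryption using PyCryptodome
# (Crypto.Cipher.Blowfish); not the paper's Android implementation.
# The key below is only an example (9 characters = 72 bits, as in the paper).
from Crypto.Cipher import Blowfish
from Crypto.Util.Padding import pad, unpad

KEY = b"9charKey_"

def encrypt_file(src, dst, key=KEY):
    cipher = Blowfish.new(key, Blowfish.MODE_CBC)          # a random IV is generated
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        fout.write(cipher.iv)                               # store the IV with the ciphertext
        fout.write(cipher.encrypt(pad(fin.read(), Blowfish.block_size)))

def decrypt_file(src, dst, key=KEY):
    with open(src, "rb") as fin:
        iv, body = fin.read(8), fin.read()                  # Blowfish block size is 8 bytes
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    with open(dst, "wb") as fout:
        fout.write(unpad(cipher.decrypt(body), Blowfish.block_size))
```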

  17. File-System Workload on a Scientific Multiprocessor

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

  18. Defining nuclear medical file format based on DICOM standard

    International Nuclear Information System (INIS)

    He Bin; Jin Yongjie; Li Yulan

    2001-01-01

    With the wide application of computer technology in the medical area, DICOM is becoming the standard for digital imaging and communication. The author discusses how to define a medical imaging file format based on the DICOM standard. It also introduces the format of the ANMIS system defined by the authors, and the validity and integrity of this format.
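
    As a hedged illustration of working with DICOM files in general (not the ANMIS format defined in the paper), the snippet below reads a file's standard data elements and pixel data with pydicom; the file name is an example.

```python
# Hedged sketch of reading a DICOM file's standard attributes with pydicom;
# illustrates the DICOM standard generally, not the paper's ANMIS format.
# The file name is an example; pixel_array requires NumPy.
import pydicom

ds = pydicom.dcmread("nm_study.dcm")
print(ds.PatientID, ds.Modality, ds.StudyDate)   # standard DICOM data elements
pixels = ds.pixel_array                          # image data as a NumPy array
print(pixels.shape, pixels.dtype)
```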

  19. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file

  20. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  1. Huygens File Service and Storage Architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  2. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  3. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  4. RECOLA2: REcursive Computation of One-Loop Amplitudes 2

    Science.gov (United States)

    Denner, Ansgar; Lang, Jean-Nicolas; Uccirati, Sandro

    2018-03-01

    We present the Fortran95 program RECOLA2 for the perturbative computation of next-to-leading-order transition amplitudes in the Standard Model of particle physics and extended Higgs sectors. New theories are implemented via model files in the 't Hooft-Feynman gauge in the conventional formulation of quantum field theory and in the Background-Field method. The present version includes model files for Two-Higgs-Doublet Model and the Higgs-Singlet Extension of the Standard Model. We support standard renormalization schemes for the Standard Model as well as many commonly used renormalization schemes in extended Higgs sectors. Within these models the computation of next-to-leading-order polarized amplitudes and squared amplitudes, optionally summed over spin and colour, is fully automated for any process. RECOLA2 allows the computation of colour- and spin-correlated leading-order squared amplitudes that are needed in the dipole subtraction formalism. RECOLA2 is publicly available for download at http://recola.hepforge.org.

  5. The crystallographic information file (CIF): A new standard archive file for crystallography

    International Nuclear Information System (INIS)

    Hall, S.R.; Allen, F.H.; Brown, I.D.

    1991-01-01

    The specification of a new standard Crystallographic Information File (CIF) is described. Its development is based on the Self-Defining Text Archive and Retrieval (STAR) procedure. The CIF is a general, flexible and easily extensible free-format archive file; it is human and machine readable and can be edited by a simple editor. The CIF is designed for the electronic transmission of crystallographic data between individual laboratories, journals and databases; it has been adopted by the International Union of Crystallography as the recommended medium for this purpose. The file consists of data names and data items, together with a loop facility for repeated items. The data names, constructed hierarchically so as to form data categories, are self-descriptive within a 32-character limit. The sorted list of data names, together with their precise definitions, constitutes the CIF dictionary (core version 1991). The CIF core dictionary is presented in full and covers the fundamental and most commonly used data items relevant to crystal structure analysis. The dictionary is also available as an electronic file suitable for CIF computer applications. Future extensions to the dictionary will include data items used in more specialized areas of crystallography. (orig.)
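
    To show what the tag-value structure looks like in practice, here is a hedged sketch of a reader for the simplest case, single name-value items; real CIF files also use the loop_ facility and multi-line text fields, which this deliberately ignores, and the example tag simply follows the core dictionary naming style.

```python
# Minimal sketch of reading simple name-value pairs from a CIF file; real CIFs
# also use the loop_ facility and multi-line text fields, which this ignores.
def read_cif_items(path):
    items = {}
    for line in open(path, encoding="ascii"):
        line = line.strip()
        if line.startswith("_"):                     # data names begin with an underscore
            name, _, value = line.partition(" ")
            if value:
                items[name] = value.strip().strip("'\"")
    return items

# e.g. a file containing the line "_cell_length_a  5.4307" would yield
# {'_cell_length_a': '5.4307'}
```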

  6. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  7. Wind farm project on the territory of municipalities of Equennes-Eramecourt, Saulchoy-sous-Poix, Thieulloy-la-Ville (80). Non technical summary of the exploitation authorisation request file. Opinion of the authority for the environment on the impact study and hazard study. Public inquiry report

    International Nuclear Information System (INIS)

    Lignier, Jean-Pierre; Gourio, Yann

    2016-01-01

    A first report recalls some general data about wind energy development, exploitation (in the world, in Europe and in France) and interest. It presents the project to be built (installation characteristics, location, wind turbine description, wind turbine safety systems), the requester and its financial and technical capacities, and the authorisation request file. The next parts propose a brief presentation of the studied area and a description of the initial environment in terms of hydro-geology, hydraulics and hydrography, natural environment, cultural heritage, soil use, town planning issues, activities, natural and technological risks, and landscape issues. They propose an assessment of the potential impact of the project on the environment, evoke substitution solutions, address compliance with planning documents, and analyse and characterise potential hazards. The next document states the opinion of the environmental authority. It presents the project and its context (environmental, physical, urban, landscape and legal aspects) and states an opinion on the content of the impact study and the hazard study. The last report concerns the public inquiry. It contains some generalities about the legal framework and the project, a report on the inquiry organisation and procedure (decrees, modalities, meetings and visits, public information, noticed incidents, general atmosphere), and the statement of the inquiry commissioner on the various aspects of the project.

  8. Download this PDF file

    African Journals Online (AJOL)

    computer networks. The calculation of the path in real time is useful in a number of situations. ... The most effective protocol in computer networks is ... in reconfigurable communication systems [7], it ..... The operating characteristic of a routing ...

  9. Download this PDF file

    African Journals Online (AJOL)

    The Effect of Language on Human-Computer Interactions in Cameroon. ONIBEREE. A.', NGOLAHC.E.º, SHU W.S.*. Department of Computer Science, University of Buea, Cameroon ..... Multiple-choice questions asked the respondent for their.

  10. Download this PDF file

    African Journals Online (AJOL)

    Dr Olaleye

    study is to determine the prevalence of computer related eye problems and their associations among computer users .... there were bright lights in their field of view while viewing .... range close to the near point of accommodation for a variable.

  11. Download this PDF file

    African Journals Online (AJOL)

    licenses and even security holograms. They are made to ... onto a computer system to encourage customers into buying their computer hardware. ..... producers, the government, the users and ICT business circles. The ... It is important that the.

  12. Download this PDF file

    African Journals Online (AJOL)

    GB

    internet connection and access to computers with in hospitals were found to be statistically ... hospitals use their computers for recording patient ..... relationship between the predictor and outcome ... information by physicians for patient care in.

  13. Grid collector an event catalog with automated file management

    CERN Document Server

    Ke Sheng Wu; Sim, A; Jun Min Gu; Shoshani, A

    2004-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select ev...
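
    As an editorial illustration of the event-catalog idea (not the Grid Collector's actual modules or API; the data layout and names below are invented), the Python sketch shows how an index of event attributes can map a selection directly to file locations, so that only the wanted events need to be read.

      # Toy event-catalog lookup; data layout and names are invented and are
      # not the Grid Collector's actual software.
      event_index = [
          # (event_id, file_name, offset_in_file, n_charged_tracks)
          (1, "run1001.data", 0,     12),
          (2, "run1001.data", 40960, 57),
          (3, "run1002.data", 0,     61),
      ]

      def select_events(index, predicate):
          """Return (file, offset) pairs only for events passing the selection."""
          return [(name, off) for (_, name, off, ntrk) in index if predicate(ntrk)]

      # Only the two high-multiplicity events are located; the other files never
      # need to be staged from mass storage or read into memory.
      print(select_events(event_index, lambda ntracks: ntracks > 50))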

  14. SLIB77, Source Library Data Compression and File Maintenance System

    International Nuclear Information System (INIS)

    Lunsford, A.

    1989-01-01

    Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in a compressed form on magnetic disk. The program was prepared to meet program maintenance requirements for ongoing program development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C and COBOL. It can handle libraries with records of up to 160 characters. Records are grouped into DECKS and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input and a built-in line editor is used for modification of single decks. Transferring of a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run.
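
    As a rough editorial illustration of the deck/modification bookkeeping described above (not SLIB77's actual file format, commands, or record-numbering rules; everything below is invented and simplified), a Python toy might apply a modification set to a numbered deck as follows.

      # Toy deck/modification bookkeeping; SLIB77 itself uses its own compressed
      # file format, command syntax and record-numbering rules.
      deck = {1: "      PROGRAM DEMO", 2: "      X = 1.0", 3: "      END"}

      def apply_modification(deck, delete=(), insert_after=None, new_records=()):
          """Delete records by number, then insert new records after a given one."""
          kept = {n: text for n, text in deck.items() if n not in set(delete)}
          result, next_no = {}, 1
          for n in sorted(kept):
              result[next_no] = kept[n]
              next_no += 1
              if n == insert_after:
                  for text in new_records:
                      result[next_no] = text        # newly numbered records
                      next_no += 1
          return result

      deck = apply_modification(deck, delete=[2], insert_after=1,
                                new_records=["      X = 2.0", "      Y = X*X"])
      for number, text in sorted(deck.items()):
          print(number, text)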

  15. Federating LHCb datasets using the DIRAC File catalog

    CERN Document Server

    Haen, Christophe; Frank, Markus; Tsaregorodtsev, Andrei

    2015-01-01

    In the distributed computing model of LHCb the File Catalog (FC) is a central component that keeps track of each file and replica stored on the Grid. It federates the LHCb data files in a logical namespace used by all LHCb applications. As a replica catalog, it is used for brokering jobs to sites where their input data is meant to be present, but also by jobs for finding alternative replicas if necessary. The LCG File Catalog (LFC) used originally by LHCb and other experiments is now being retired and needs to be replaced. The DIRAC File Catalog (DFC) was developed within the framework of the DIRAC Project and presented during CHEP 2012. From the technical point of view, the code powering the DFC follows an aspect-oriented programming (AOP) approach: each type of entity manipulated by the DFC (users, files, replicas, etc.) is treated as a separate 'concern' in the AOP terminology. Hence, the database schema can also be adapted to the needs of a Virtual Organization. LHCb opted for a highly tuned MySQL datab...

  16. 77 FR 31835 - Marine Mammals; File No. 16580

    Science.gov (United States)

    2012-05-30

    ... Public Comment'' from the Features box on the Applications and Permits for Protected Species (APPS) home page, https://apps.nmfs.noaa.gov , and then selecting File No. 16580 from the list of available.... Marine mammal parts will be used incidentally for educational purposes. Import/export activities would...

  17. 76 FR 4867 - Marine Mammals; File No. 15453

    Science.gov (United States)

    2011-01-27

    ... Public Comment'' from the Features box on the Applications and Permits for Protected Species (APPS) home page, https://apps.nmfs.noaa.gov , and then selecting File No. 15453 from the list of available... educational graphics about the Hawaiian monk seal. In compliance with the National Environmental Policy Act of...

  18. 5 CFR 2634.602 - Filing of reports.

    Science.gov (United States)

    2010-01-01

    ... part. (b) The President, the Vice President, any independent counsel, and persons appointed by independent counsel under 28 U.S.C. chapter 40, shall file the public financial disclosure reports required... part. (e) Candidates for President and Vice President identified in § 2634.201(d), other than an...

  19. 75 FR 33296 - Columbia Gulf Transmission Company; Notice of Filing

    Science.gov (United States)

    2010-06-11

    ... Transmission Company; Notice of Filing June 2, 2010. Take notice that on May 20, 2010, Columbia Gulf..., pursuant to section 7(c) of the Natural Gas Act (NGA), for a certificate of public convenience and... of its existing transmission system to transport unprocessed gas (wet gas) near Centerville...

  20. 78 FR 40472 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-07-05

    .... Applicants: Bangor Hydro Electric Company, Maine Public Service Company. Description: Errata to March 19, 2013 Section 203 Application of Bangor Hydro Electric Company, et. al. and Request for Shortened Answer... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice...

  1. 49 CFR 1152.24 - Filing and service of application.

    Science.gov (United States)

    2010-10-01

    ... inches with 11/2 inch left margin, shall be filed with the Chief, Section of Administration, Office of... discontinuance application, the Board will review the application and determine whether it conforms with all... subjected to new notice and publication under § 1152.20, unless the defect causing the rejection was in the...

  2. 77 FR 1064 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-01-09

    ... confirm timely development of new interface pricing software of New York Independent System Operator, Inc... Market Power Analysis of Northern Indiana Public Service Company. Filed Date: 12/28/11. Accession Number..., Duke Energy Indiana, Inc., St. Paul Cogeneration, LLC. Description: Updated market power analysis of...

  3. 78 FR 41014 - Online Political File and Petition for Reconsideration

    Science.gov (United States)

    2013-07-09

    ... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Part 73 [MB Docket No. 00-168; DA 13-1440] Online... of the rules requiring broadcast television stations to post their political files online, and on a.... The full text of this document is available for public inspection and copying during regular business...

  4. 75 FR 14589 - Combined Notice of Filings #1

    Science.gov (United States)

    2010-03-26

    ... Energy Marketing, LLC. Description: Public Service Company of New Mexico et al. submits Supplement to... as Second Revised Service Agreement No. 1688, effective 2/28/2010 with WM Renewable Energy LLC et al... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 March 18...

  5. 76 FR 19976 - Marine Mammals; File No. 15537

    Science.gov (United States)

    2011-04-11

    ... Mammals; File No. 15537 AGENCY: National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric...) prepared in response to a public display permit application received from the Institute for Marine Mammal... of the Marine Mammal Protection Act of 1972, as amended (16 U.S.C. 1361 et seq.) and the regulations...

  6. 77 FR 30314 - Eastern States: Filing of Plat of Survey

    Science.gov (United States)

    2012-05-22

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior ACTION: Notice... the BLM-Eastern States office in Springfield, Virginia, 30 calendar days from the date of publication in the Federal Register. FOR FURTHER INFORMATION CONTACT: Bureau of Land Management-Eastern States...

  7. 77 FR 58575 - Eastern States: Filing of Plat of Survey

    Science.gov (United States)

    2012-09-21

    ...] Eastern States: Filing of Plat of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice... the BLM-Eastern States office in Springfield, Virginia, 30 calendar days from the date of publication in the Federal Register. FOR FURTHER INFORMATION CONTACT: Bureau of Land Management-Eastern States...

  8. Access to DIII-D data located in multiple files and multiple locations

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1993-10-01

    The General Atomics DIII-D tokamak fusion experiment is now collecting over 80 MB of data per discharge once every 10 min, and that quantity is expected to double within the next year. The size of the data files, even in compressed format, is becoming increasingly difficult to handle. Data is also being acquired now on a variety of UNIX systems as well as MicroVAX and MODCOMP computer systems. The existing computers collect all the data into a single shot file, and this data collection is taking an ever increasing amount of time as the total quantity of data increases. Data is not available to experimenters until it has been collected into the shot file, which is in conflict with the substantial need for data examination on a timely basis between shots. The experimenters are also spread over many different types of computer systems (possibly located at other sites). To improve data availability and handling, software has been developed to allow individual computer systems to create their own shot files locally. The data interface routine PTDATA that is used to access DIII-D data has been modified so that a user's code on any computer can access data from any computer where that data might be located. This data access is transparent to the user. Breaking up the shot file into separate files in multiple locations also impacts software used for data archiving, data management, and data restoration

  9. 76 FR 21739 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Science.gov (United States)

    2011-04-18

    ... 73.3526(b) and 73.3527(b) require that the public inspection file be maintained at the main studio of... that maintain their main studios and public file outside their communities of license are required to... FEDERAL COMMUNICATIONS COMMISSION Notice of Public Information Collection(s) Being Reviewed by the...

  10. Recalling ISX shot data files from the off-line archive

    International Nuclear Information System (INIS)

    Stanton, J.S.

    1981-02-01

    This document describes a set of computer programs designed to allow access to ISX shot data files stored on off-line disk packs. The programs accept user requests for data files and build a queue of end requests. When an operator is available to mount the necessary disk packs, the system copies the requested files to an on-line disk area. The program runs on the Fusion Energy Division's DECsystem-10 computer. The request queue is implemented under the System 1022 data base management system. The support programs are coded in MACRO-10 and FORTRAN-10

  11. Assessing the economic impact of public investment in Malaysia: a case study on MyRapid Transit project using a dynamic computable general equilibrium model

    OpenAIRE

    Muniandy, Meenachi

    2017-01-01

    The central focus of this thesis is the question of whether public investment in transport infrastructure contributes positively to Malaysia’s economic growth and welfare. Although there are strong analytical reasons to believe that public investment spending is one of the important variables that influence growth, there remains significant uncertainty about its actual degree of influence. In Malaysia, whenever there is a collapse in domestic demand, government spending becomes an important m...

  12. Download this PDF file

    African Journals Online (AJOL)

    Eghosa O Ekhator

    Firstly, why should we expect higher ethical standards from the public sector than from the ... interested in doing business with such an organization. 1. .... Public sector ethics is a combination of both the internal and external issues, which.

  13. Download this PDF file

    African Journals Online (AJOL)

    countries legal aid at public expense is restricted to capital cases, given the scarcity of public defenders, as borne out .... paralegal to speak for an indigent defendant on matters of bail.18 ..... See: The socioeconomic impact of pretrial detention,.

  14. Download this PDF file

    African Journals Online (AJOL)

    Administrator

    significant public health challenge9-12. As Liberia emerges ... public health facilities, significant gaps exist regarding ... of socio-economic disparity, sexual violence, gender inequality ... (3) identify potential sites for speaking engagements.

  15. Download this PDF file

    African Journals Online (AJOL)

    DR Nneka

    2015-01-13

    Jan 13, 2015 ... and costs over runs, the standard of living of the people has not been adequately ... promotes positive alliances among business partners. ... public procurement refers to all acquisitions of goods and services by public.

  16. 37 CFR 1.221 - Voluntary publication or republication of patent application publication.

    Science.gov (United States)

    2010-07-01

    ... filing system requirements and be accompanied by the publication fee set forth in § 1.18(d) and the..., the Office will not publish the application and will refund the publication fee. (b) The Office will... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Voluntary publication or...

  17. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
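
    As an editorial illustration of machine-readable experiment metadata in the general spirit of the tool described above (the element and attribute names below are invented and are not the actual XEML vocabulary or the Xeml Environment Ontology), a Python sketch using only the standard library might write such a file as follows.

      # Sketch of writing machine-readable experiment metadata as XML; the tag
      # and attribute names are invented and are NOT the actual XEML vocabulary.
      import xml.etree.ElementTree as ET

      experiment = ET.Element("experiment", id="exp-001")
      ET.SubElement(experiment, "genotype",
                    species="Arabidopsis thaliana", line="Col-0")
      growth = ET.SubElement(experiment, "growthConditions")
      ET.SubElement(growth, "parameter", name="photoperiod", value="12", unit="h")
      ET.SubElement(growth, "parameter", name="temperature", value="21", unit="C")
      sampling = ET.SubElement(experiment, "sampling")
      ET.SubElement(sampling, "timepoint", hoursAfterDawn="6", tissue="rosette leaf")

      # Write the metadata file alongside the experimental data.
      ET.ElementTree(experiment).write("experiment_metadata.xml",
                                       encoding="utf-8", xml_declaration=True)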

  18. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to get computing power. This increases the visibility of the gateway to the general public as well as increasing computing capacity at little cost.

  19. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  20. The access of political representatives to information and the new laws on transparency and access to public information. In particular, their capacity to file claims with the transparency authorities

    Directory of Open Access Journals (Sweden)

    Emilio Guichot Reina

    2017-11-01

    Full Text Available The new independent public authorities in charge of access-to-information claims have adopted conflicting positions on the rules applicable to requests for information made by political representatives and on their own competence to hear the complaints those representatives may raise. The right of access to information of political representatives cannot have a smaller scope, in its substantive, procedural and guarantee-related content, than the one that the new regulation on transparency and access to information recognizes for any person. The case law of the Supreme Court has consistently affirmed this. It would contribute to legal certainty if this criterion were followed by all the new independent public authorities and expressly established by a future legislative reform.

  1. 75 FR 51994 - Combined Notice of Filings

    Science.gov (United States)

    2010-08-24

    ...: Panther Interstate Pipeline Energy, LLC. Description: Panther Interstate Pipeline Energy, LLC submits tariff filing per 154.203: Panther Baseline eTariff Filing to be effective 8/ 12/2010. Filed Date: 08/13...

  2. 76 FR 52323 - Combined Notice of Filings; Filings Instituting Proceedings

    Science.gov (United States)

    2011-08-22

    .... Applicants: Young Gas Storage Company, Ltd. Description: Young Gas Storage Company, Ltd. submits tariff..., but intervention is necessary to become a party to the proceeding. The filings are accessible in the.... More detailed information relating to filing requirements, interventions, protests, and service can be...

  3. Author fees for online publication

    Science.gov (United States)

    Like the journals themselves, AGU publication fees have been restructured to accommodate the new online, publish-as-ready approach. The new fee structure is based on authors' providing electronic files of their text and art in acceptable formats (Word, WordPerfect, and LaTeX for text, and .eps or .tif for digital art). However, if you are unable to supply electronic files, you can opt for a higher-charge, full-service route in which AGU will create electronic files from hard copy. All authors for AGU journals are expected to support the journal archive through fees based on number as well as size of article files. The revenue from these fees is set aside for the "Perpetual Care Trust Fund," which will support the migration of the journal archive to new formats or media as technology changes. For several journals, excess length fees remain in place to encourage submission of concisely written articles. During this first transition year, most author fees are based on the number of print page equivalents (pdf) in an article; in the future, however, charges are expected to be associated with file size. The specific fees for each journal are posted on AGU's Web site under Publications-Tools for Authors.

  4. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  5. Computer system operation

    International Nuclear Information System (INIS)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A.

    1993-12-01

    The report describes the operation and troubleshooting of the main computers and KAERINet. The results of the project are as follows: 1. Operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. Operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. Development of applications: the Electronic Document Approval and Delivery System and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  6. Computer system operation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    The report describes the operation and troubleshooting of the main computers and KAERINet. The results of the project are as follows: 1. Operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. Operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. Development of applications: the Electronic Document Approval and Delivery System and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  7. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

    Computer-Communication Networks presents a collection of articles the focus of which is on the field of modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer. It addresses the design of multi-hop, mobile-user radio networks. Some of the topics covered in the book are the distributed packet switching queuing network design, some investigations on communication switching techniques in computer networks and the minimum hop flow assignment and routing subject to an average message delay constraint.

  8. Development of a script for converting DICOM files to .TXT

    International Nuclear Information System (INIS)

    Abrantes, Marcos E.S.; Oliveira, A.H. de

    2014-01-01

    Background: with the increased use of computer simulation techniques for diagnosis or therapy in patients, the MCNP and SCMS software packages are widely used. To use SCMS as a data entry interface for MCNP, it is necessary to transform DICOM images into text files. Objective: to produce a semi-automatic script that converts DICOM images generated by computed tomography or magnetic resonance into .txt files within the ImageJ software. Methodology: this study was developed on the ImageJ software platform, on an Intel Core 2 Duo computer with a 2.00 GHz CPU and 2.00 GB of RAM running a 32-bit system. The script was developed in a text editor using the Java language and inserted into ImageJ through the software's plug-in tool. When run, a window opens asking for the path of the files to be read, the first and last names of the DICOM files to be converted, and where the new files will be stored. Results: manual conversion of a cerebral computed tomography examination with 600 DICOM images to .txt requires about 8 hours; using the script reduces the conversion time to 12 minutes. Conclusion: the script demonstrates the ability to convert DICOM to .txt with a significant saving in processing time.
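
    The script described in this record is an ImageJ plug-in written in Java. As an editorial illustration of the same DICOM-to-text conversion idea in Python (assuming the pydicom and numpy packages, which are not part of the authors' workflow, and hypothetical folder names), one might write each slice as a text file as follows.

      # Illustration of the DICOM -> .txt idea with pydicom/numpy; this is not
      # the authors' ImageJ/Java script, and folder names are hypothetical.
      import glob
      import os

      import numpy as np
      import pydicom

      def dicom_series_to_txt(input_dir, output_dir):
          """Write each DICOM slice in input_dir as a whitespace-separated .txt file."""
          for path in sorted(glob.glob(os.path.join(input_dir, "*.dcm"))):
              ds = pydicom.dcmread(path)             # read one DICOM slice
              pixels = ds.pixel_array                # 2-D array of pixel values
              name = os.path.splitext(os.path.basename(path))[0] + ".txt"
              np.savetxt(os.path.join(output_dir, name), pixels, fmt="%d")

      dicom_series_to_txt("ct_head_dicom", "ct_head_txt")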

  9. 1 CFR 17.3 - Criteria for emergency publication.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Criteria for emergency publication. 17.3..., AND PROCESSING OF DOCUMENTS FILING FOR PUBLIC INSPECTION AND PUBLICATION SCHEDULES Emergency Schedule § 17.3 Criteria for emergency publication. The emergency schedule is designed to provide the fastest...

  10. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  11. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  12. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  13. Mixed-Media File Systems

    NARCIS (Netherlands)

    Bosch, H.G.P.

    1999-01-01

    This thesis addresses the problem of implementing mixed-media storage systems. In this work a mixed-media file system is defined to be a system that stores both conventional (best-effort) file data and real-time continuous-media data. Continuous-media data is usually bulky, and servers storing and

  14. Evaluation of the TSC Dolphin Computer Assisted Instructional System in the Chapter 1 Program of the District of Columbia Public Schools. Final Report 85-9.

    Science.gov (United States)

    Harris, Carolyn DeMeyer; And Others

    Dolphin is a computer-assisted instruction system used to teach and reinforce skills in reading, language arts, and mathematics. An evaluation of this system was conducted to provide information to TSC Division of Houghton Mifflin regarding its effectiveness and possible modifications to the system. The general design of the evaluation was to…

  15. Bulletin bibliographique sur l'E.A.O. (l'enseignement assiste par ordinateur) (Bibliographic Bulletin on Computer Assisted Instruction). Publication K-4.

    Science.gov (United States)

    LaForge, Lorne, Ed.

    The bibliography contains about 150 citations of journal articles, monographs, collected works, research reports, and essays drawn from the BIBELO database and concerning computer-assisted language instruction. The first half of the volume is an annotated bibliography in alphabetical order by author. The second section contains subject and…

  16. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS.

  17. Design and creation of a direct access nuclear data file

    International Nuclear Information System (INIS)

    Charpentier, P.

    1981-06-01

    General considerations on the structure of instructions and files are reviewed. Design, organization and mode of use of the different files: instruction file, index files, inverted files, automatic analysis and inquiry programs are examined [fr

  18. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on the new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which checks the correct operation of the connection line as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to fiducial marks; reconstruction of track fragments in the chambers from sparks. The BESM-6 computer has been chosen for the filtration. The complex of filtration programs is shaped as a RAM file, and the required version of the program is assembled by the PATCHY program. The subprograms performing the greater part of the calculations are written in the MADLEN autocode, the rest in FORTRAN and ALGOL. The filtration time for one image is 1.2-2 s of computation. The BESM-6 computer processes up to 12 thousand images a day.

  19. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Final technical report (May 2013), approved for public release; reporting period 2009 to September 2012. The report discusses two neuromorphic computing paradigms (few sensors with complex computations, and many sensors with simple computation) and challenges with nano-enabled neuromorphic chips.

  20. Temperature increases on the external root surface during endodontic treatment using single file systems.

    Science.gov (United States)

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: the OneShape endodontic file no. 25; Group 2: the Reciproc endodontic file no. 25; Group 3: the WaveOne endodontic file no. 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05); the highest temperature increases were observed with the reciprocating files. However, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.

  1. Introduction [International Reactor Dosimetry File 2002 (IRDF-2002)

    International Nuclear Information System (INIS)

    Paviotti-Corcuera, R.; Zolnay, E.M.

    2006-01-01

    The most recently tested version of the International Reactor Dosimetry File, IRDF-90 Version 2 (IRDF-90.2), was released in 1993. Most of the evaluations used in this file were prepared in the mid-1980s, and in the meantime a large amount of new experimental data has become available, along with two new national reactor dosimetry libraries (the Russian Reactor Dosimetry File (RRDF-98) and the Japanese Evaluated Nuclear Data Library (JENDL/D-99)). The cross-sections and related uncertainties for several reactions in these libraries may be of better quality than the data in the older IRDF-90 file. These developments have resulted in different cross-section values being applied to the evaluation of experimental data, creating difficulties in comparing the results of reactor dosimetry calculations from the same types of nuclear facility. Therefore, there has been a strong demand from the reactor dosimetry community for an updated and standardized version of the IRDF. The IAEA has in the past supported similar efforts to improve the quality of data for reactor dosimetry applications. A major objective of the present data development project was to prepare and distribute a standardized, updated and tested reactor dosimetry cross-section library accompanied by uncertainty information (IRDF-2002) for use in service life assessments of nuclear power reactors. In order to achieve this objective, two technical meetings were organized. Both meetings were held at the IAEA in Vienna. The first meeting took place from 27 to 29 August 2002, the second from 1 to 3 October 2003. Recommendations were made concerning the following topics and the preparation of the library: reactions to be included, requirements for new evaluations or revisions, nuclear decay data, radiation damage data, testing of the data in benchmark fields and inclusion of computer codes. The participants emphasized that good quality nuclear data for reactor dosimetry are essential to improve assessments of the

  2. Download this PDF file

    African Journals Online (AJOL)

    SEGUN ADENIJI

    Key words: Teenage pregnancy, Cross-sectional Survey, Secondary School, Nigeria, High School Drop-out ... Counselling was given before suspension or expulsion in 4% of public schools and 15% of private .... In their opinions, most public (80%) and private (70%) school .... junior secondary school grade one students in.

  3. Download this PDF file

    African Journals Online (AJOL)

    Dr Kazungu

    Governments need financial resources to provide public goods and services to the citizens. ... governments budgetary procedures frameworks, fiscal forecasting and ... In Tanzania, the Public Finance Act (2001) stipulates the government budget .... monthly and annual total tax revenue collection data published by the Bank ...

  4. Download this PDF file

    African Journals Online (AJOL)

    This article examines the three-, five- and ten-year long-run performance of .... multiples (price-earnings or market-to-book reflecting optimistic assessments of ... prior to it going public, investors often judge the company's real value based on ..... Initial Public Offerings (IPOs)', Journal of Business Finance & Accounting, vol.

  5. Public Library Training Program for Older Adults Addresses Their Computer and Health Literacy Needs. A Review of: Xie, B. (2011). Improving older adults’ e-health literacy through computer training using NIH online resources. Library & Information Science Research, 34, 63-71. doi: 10.1016/j.lisr.2011.07.006

    Directory of Open Access Journals (Sweden)

    Cari Merkley

    2012-12-01

    Full Text Available Objective – To evaluate the efficacy of an e-health literacy educational intervention aimed at older adults. Design – Pre and post intervention questionnaires administered in an experimental study. Setting – Two public library branches in Maryland. Subjects – 218 adults between 60 and 89 years of age. Methods – A convenience sample of older adults was recruited to participate in a four-week training program structured around the National Institutes of Health toolkit Helping Older Adults Search for Health Information Online. During the program, classes met at the participating libraries twice a week. Sessions were two hours in length and employed hands-on exercises led by Master of Library Science students. The training included an introduction to the Internet, as well as in-depth training in the use of the NIHSeniorHealth and MedlinePlus websites. In the first class, participants were asked to complete a pre-training questionnaire that included questions relating to demographics and previous computer and Internet experience, as well as measures from the Computer Anxiety Scale and two subscales of the Attitudes toward Computers Questionnaire. Participants between September 2008 and June 2009 also completed pre-training computer and web knowledge tests that asked individuals to label the parts of a computer and of a website using a provided list of terms. At the end of the program, participants were asked to complete post-training questionnaires that included the previously employed questions from the Computer Anxiety Scale and Attitudes towards Computers Questionnaire. New questions were added relating to the participants' satisfaction with the training, its impact on their health decision making, their perceptions of public libraries, and the perceived usability and utility of the two websites highlighted during the training program. Those who completed pre-training knowledge tests were also asked to complete the same exercises at the end of the program. Main Results

  6. Extending the Online Public Access Catalog into the Microcomputer Environment.

    Science.gov (United States)

    Sutton, Brett

    1990-01-01

    Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…
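
    As an editorial illustration of converting downloaded plain-text search results into structured records (the tagged layout below is invented; PCBIS's actual conversion utility targets the specific OPAC download format), a small Python sketch might look like this.

      # Toy conversion of a downloaded plain-text result list into structured
      # records; the tagged layout is invented for illustration.
      SEARCH_RESULT = """\
      TI: Managing personal bibliographic files
      AU: Example, A.
      PY: 1989

      TI: Online catalogs in practice
      AU: Sample, B.
      PY: 1990
      """

      def parse_records(text):
          """Split the download into records; one 'TAG: value' field per line."""
          records, current = [], {}
          for line in text.splitlines():
              line = line.strip()
              if not line:                      # blank line ends a record
                  if current:
                      records.append(current)
                      current = {}
              elif ":" in line:
                  tag, value = line.split(":", 1)
                  current[tag.strip()] = value.strip()
          if current:
              records.append(current)
          return records

      for record in parse_records(SEARCH_RESULT):
          print(record["PY"], record["TI"])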

  7. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible, stored for long periods of time (as though they were paper), protected from disaster, and accumulating to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower-performance optical media based on a least-frequently-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
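
    As an editorial illustration of a least-frequently-used migration policy in the spirit of the system described above (the function names, sizes and threshold below are invented; this is not the PSDS migration software), a Python toy might select migration candidates as follows.

      # Toy least-frequently-used (LFU) selection; names, sizes and thresholds
      # are invented and this is not the PSDS migration software.
      from collections import Counter

      access_counts = Counter()                 # file name -> number of accesses

      def record_access(name):
          access_counts[name] += 1

      def select_for_migration(resident_files, bytes_needed, sizes):
          """Pick the least-frequently-used files until enough space is freed."""
          victims, freed = [], 0
          for name in sorted(resident_files, key=lambda n: access_counts[n]):
              if freed >= bytes_needed:
                  break
              victims.append(name)
              freed += sizes[name]
          return victims

      sizes = {"doc_a.pdf": 1_500_000_000, "doc_b.pdf": 800_000_000,
               "doc_c.pdf": 600_000_000}
      for name, hits in [("doc_a.pdf", 12), ("doc_b.pdf", 1), ("doc_c.pdf", 3)]:
          for _ in range(hits):
              record_access(name)

      # Free roughly 1 GB on the fast tier: the two least-used documents move
      # to optical media, while the frequently used one stays resident.
      print(select_for_migration(list(sizes), 1_000_000_000, sizes))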

  8. Nuclear decay data for dosimetry calculation. Revised data of ICRP Publication 38

    International Nuclear Information System (INIS)

    Endo, Akira; Yamaguchi, Yasuhiro

    2005-02-01

    New nuclear decay data used for dose calculation have been compiled for 1034 radionuclides, which are significant in medical, environmental and occupational exposures. The decay data were assembled from decay data sets of the Evaluated Nuclear Structure Data File (ENSDF), the latest version as of 2003. Basic nuclear properties in the ENSDF that are particularly important for calculating energies and intensities of radiations were examined and updated by referring to UNBASE2003/AME2003, the database for nuclear and decay properties of nuclides. In addition, modification of incomplete ENSDF was done for their format errors, level schemes, normalization records, and so on. The energies and intensities of emitted radiations by the nuclear decay and the subsequent atomic process were computed from the ENSDF using the computer code EDISTR04. EDISTR04 is an enhanced version of EDISTR used for assembling ICRP Publication 38 (ICRP38), and incorporates updates of atomic data and computation methods for calculating atomic radiations and spontaneous fission radiations. Quality assurance of the compiled data has been made by comparisons with various experimental data and decay databases prepared from different computer codes and data libraries. A package of the data files, called DECDC2 (Nuclear DECay Data for Dosimetry Calculation, Version 2), will succeed ICRP38 that has been used extensively in dose calculation and will be utilized in various fields. (author)

  9. PKA spectrum file

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, M. [Toshiba Corp., Kawasaki, Kanagawa (Japan). Nuclear Engineering Lab.

    1997-03-01

    In the Japanese Nuclear Data Committee, the PKA/KERMA file containing PKA spectra, KERMA factors and DPA cross sections in the energy range between 10{sup -5} eV and 50 MeV is being prepared from the evaluated nuclear data. The processing code ESPERANT was developed to calculate quantities of PKA, KERMA and DPA from evaluated nuclear data for medium and heavy elements by using the effective single particle emission approximation (ESPEA). For light elements, the PKA spectra are evaluated by the SCINFUL/DDX and EXIFON codes, simultaneously with other neutron cross sections. The DPA cross sections due to charged particle emitted from light elements are evaluated for high neutron energy above 20 MeV. (author)

  10. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, the emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from an internal IT system to a public service, from a cost-saving tool to a revenue generator, and from ISP to telecom. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization effort.

  11. Competency Reference for Computer Assisted Drafting.

    Science.gov (United States)

    Oregon State Dept. of Education, Salem. Div. of Vocational Technical Education.

    This guide, developed in Oregon, lists competencies essential for students in computer-assisted drafting (CAD). Competencies are organized in eight categories: computer hardware, file usage and manipulation, basic drafting techniques, mechanical drafting, specialty disciplines, three dimensional drawing/design, plotting/printing, and advanced CAD.…

  12. 78 FR 11023 - Changes To Implement the First Inventor To File Provisions of the Leahy-Smith America Invents Act

    Science.gov (United States)

    2013-02-14

    ... which the patent or application, as the case may be, names another inventor and was effectively filed... publication derived the claimed invention from the inventor or a joint inventor named in the application or... filed not later than twelve months (six months in the case of a design application) after the date on...

  13. 7 CFR 400.171 - Qualifying when a state does not require that an Annual Statutory Financial Statement be filed.

    Science.gov (United States)

    2010-01-01

    ... Annual Statutory Financial Statement be filed. 400.171 Section 400.171 Agriculture Regulations of the... filing an Annual Statutory Financial Statement must, in addition to the requirements of § 400.170 (a), (b), (c) and (d), submit an Annual Statutory Financial Statement audited by a Certified Public Accountant...

  14. 76 FR 47210 - Notices of Filing of Petitions for Food Additives and Color Additives; Relocation in the Federal...

    Science.gov (United States)

    2011-08-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2011-N-0518] Notices of Filing of Petitions for Food Additives and Color Additives; Relocation in the Federal Register...) is notifying the public that notices of filing of petitions for food additives and color additives...

  15. 78 FR 17967 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Science.gov (United States)

    2013-03-25

    ...'') and Amazon.com Inc. (``AMZN'') (``mini-options''). The purpose of this proposed rule change is to...://www.c2exchange.com/Legal/ ), at the Exchange's Office of the Secretary, and at the Commission's Public... Web site. The C2 rule filing Web site is located at: http://www.c2exchange.com/Legal/RuleFilings.aspx...

  16. 78 FR 63177 - Order on Voluntary Remand and Clarifying Policy on Filing of Reactive Power Service Rate...

    Science.gov (United States)

    2013-10-23

    ... the obligation to follow a voltage schedule.'' \\19\\ The Commission distinguished Hot Spring Power Co..., to explore the mechanics of public utilities filing reactive power rate schedules for which there is...'' jurisdictional service and, accordingly, must be filed for Commission review); Sulphur Springs Valley Elec. Coop...

  17. 5 CFR 1203.13 - Filing pleadings.

    Science.gov (United States)

    2010-01-01

    ... delivery, by facsimile, or by e-filing in accordance with § 1201.14 of this chapter. If the document was... submitted by e-filing, it is considered to have been filed on the date of electronic submission. (e... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing pleadings. 1203.13 Section 1203.13...

  18. 12 CFR 16.33 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES OFFERING DISCLOSURE RULES § 16.33 Filing fees. (a) Filing fees must accompany certain filings made under the provisions of this part... Comptroller of the Currency Fees published pursuant to § 8.8 of this chapter. (b) Filing fees must be paid by...

  19. 77 FR 13587 - Combined Notice of Filings

    Science.gov (United States)

    2012-03-07

    .... Applicants: Transcontinental Gas Pipe Line Company. Description: Annual Electric Power Tracker Filing... Company. Description: 2012 Annual Fuel and Electric Power Reimbursement to be effective 4/1/2012. Filed... submits tariff filing per 154.403: Storm Surcharge 2012 to be effective 4/1/2012. Filed Date: 3/1/12...

  20. 75 FR 4689 - Electronic Tariff Filings

    Science.gov (United States)

    2010-01-29

    ... elements ``are required to properly identify the nature of the tariff filing, organize the tariff database... (or other pleading) and the Type of Filing code chosen will be resolved in favor of the Type of Filing...'s wish expressed in its transmittal letter or in other pleadings, the Commission may not review a...