WorldWideScience

Sample records for policy computer file

  1. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)
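
The plane-sweep idea described in this abstract can be illustrated in one dimension: sweep across sorted endpoint events while maintaining the set of "active" intervals. This is a minimal sketch, not the authors' algorithm; the intervals are made-up example data.

```python
# 1-D plane sweep: find the maximum number of simultaneously
# overlapping intervals by sweeping over sorted endpoint events.

def max_overlap(intervals):
    events = []
    for start, end in intervals:
        events.append((start, 1))   # interval enters the sweep front
        events.append((end, -1))    # interval leaves the sweep front
    # Sort by position; at equal positions, process "leave" before "enter"
    events.sort(key=lambda e: (e[0], e[1]))
    active = best = 0
    for _, delta in events:
        active += delta
        best = max(best, active)
    return best

print(max_overlap([(0, 3), (1, 4), (2, 5), (6, 7)]))  # → 3
```

The same event-queue pattern generalizes to the 2-D topological problems the abstract mentions, with a more elaborate sweep-line status structure.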

  2. Security and policy driven computing

    CERN Document Server

    Liu, Lei

    2010-01-01

    Security and Policy Driven Computing covers recent advances in security, storage, parallelization, and computing, as well as applications. The author incorporates a wealth of analysis, including studies on intrusion detection and key management, computer storage policy, and transactional management. The book first describes multiple variables and index structure derivation for high-dimensional data distribution and applies numeric methods to proposed search methods. It also focuses on discovering relations, logic, and knowledge for policy management. To manage performance, the text discusses con

  3. Optimizing Input/Output Using Adaptive File System Policies

    Science.gov (United States)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.

  4. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
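
The aggregation scheme this abstract describes (one aggregated file plus per-file offset/length metadata) can be sketched as follows; the file names and in-memory representation are illustrative, not the patented implementation.

```python
# Pack many small files into one aggregated blob, recording an
# (offset, length) pair for each so it can be unpacked later.

def aggregate(files):
    """files: dict of name -> bytes. Returns (blob, metadata)."""
    blob = b""
    metadata = {}
    for name, data in files.items():
        metadata[name] = (len(blob), len(data))  # offset, length
        blob += data
    return blob, metadata

def unpack(blob, metadata, name):
    offset, length = metadata[name]
    return blob[offset:offset + length]

files = {"a.dat": b"hello", "b.dat": b"world!"}
blob, meta = aggregate(files)
assert unpack(blob, meta, "b.dat") == b"world!"
```

In a real parallel file system the blob would be a single shared object and the metadata a separate index, but the offset/length bookkeeping is the same.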

  5. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
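
A minimal sketch of the list-based index described above: each entry in the list points at a storage location of the file or a replica, and a stored checksum lets any copy be validated. The storage locations and CRC choice are illustrative assumptions, not details from the patent.

```python
import zlib

# Build an index holding a list of storage locations (primary first,
# then replicas) together with a checksum of the file contents.

def make_index(locations, data):
    return {
        "locations": list(locations),
        "checksum": zlib.crc32(data),
    }

def validate(index, data):
    """Return True if a retrieved copy matches the indexed checksum."""
    return zlib.crc32(data) == index["checksum"]

data = b"simulation output"
index = make_index(["node3:/scratch/f", "node7:/scratch/f.r1"], data)
assert validate(index, data)
assert not validate(index, b"corrupted copy")
```

A query against the index simply walks the location list until a copy that passes validation is found.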

  6. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

    This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  7. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

    WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  8. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  9. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were the greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system.

  10. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
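
The user-specified parser idea above can be sketched simply: the application supplies a parser that decides which files satisfy its semantic requirements and extracts metadata to store alongside them. The example parser and file names are hypothetical illustrations.

```python
# Store files through a user-supplied parser: the parser returns
# metadata for files it accepts, or None for files to skip.

def store_with_parser(files, parser):
    """files: dict of name -> bytes. Returns the stored subset."""
    stored = {}
    for name, data in files.items():
        meta = parser(name, data)
        if meta is not None:            # only semantically valid files
            stored[name] = {"data": data, "meta": meta}
    return stored

# Example parser: keep only CSV-like files and record their row count.
def csv_parser(name, data):
    if not name.endswith(".csv"):
        return None
    return {"rows": data.count(b"\n")}

stored = store_with_parser({"run.csv": b"1,2\n3,4\n", "core.bin": b"\x00"},
                           csv_parser)
assert list(stored) == ["run.csv"]
assert stored["run.csv"]["meta"]["rows"] == 2
```

The extracted metadata can then feed a search index, as the abstract notes.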

  11. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Today they describe the project of a new software program which allows the reporting and filing of roentgenograms. This program was realized by a radiologist using a well-known database management system, dBASE III. The program was shaped to fit the radiologist's needs: it helps to report, and allows the filing of, radiological data with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer

  12. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  13. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
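
One of the resolution schemes named above (a different sub-set of data elements per replica) can be sketched as stride-based subsampling. The strides chosen here are illustrative; the abstract also mentions varying the number of bits per element, which is not shown.

```python
# Multi-resolution replication by subsampling: each replica keeps
# every k-th sample; stride 1 is the full-resolution file.

def make_replicas(samples, strides=(1, 2, 4)):
    """Return one replica per stride, keyed by stride."""
    return {s: samples[::s] for s in strides}

samples = list(range(16))
replicas = make_replicas(samples)
assert replicas[1] == samples            # full resolution
assert replicas[4] == [0, 4, 8, 12]      # coarsest replica
```

A coarse replica can be read quickly for visualization, while the full-resolution copy is fetched only when needed.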

  14. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

    The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e+e− collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order of magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and the low quality of network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  15. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared with rotary ProTaper files.

  16. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  17. Security policies and trust in ubiquitous computing.

    Science.gov (United States)

    Joshi, Anupam; Finin, Tim; Kagal, Lalana; Parker, Jim; Patwardhan, Anand

    2008-10-28

    Ubiquitous environments comprise resource-constrained mobile and wearable devices and computational elements embedded in everyday artefacts. These are connected to each other using both infrastructure-based as well as short-range ad hoc networks. Limited Internet connectivity limits the use of conventional security mechanisms such as public key infrastructures and other forms of server-centric authentication. Under these circumstances, peer-to-peer interactions are well suited for not just information interchange, but also managing security and privacy. However, practical solutions for protecting mobile devices, preserving privacy, evaluating trust and determining the reliability and accuracy of peer-provided data in such interactions are still in their infancy. Our research is directed towards providing stronger assurances of the reliability and trustworthiness of information and services, and the use of declarative policy-driven approaches to handle the open and dynamic nature of such systems. This paper provides an overview of some of the challenges and issues, and points out directions for progress.

  18. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.

  19. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  20. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  1. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004

  2. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  3. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  4. Cloud Computing in the EU Policy Sphere

    NARCIS (Netherlands)

    Sluijs, J.P.J.B.; Larouche, P.; Sauter, W.

    2011-01-01

    Cloud computing is a new development that is based on the premise that data and applications are stored centrally and can be accessed through the Internet. Our Article sets up a broad analysis of how the emergence of clouds relates to European law. We single out European competition law, network

  5. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

    A film-based teaching file takes up space, and the need to search through such a file places limits on the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, and so for these reasons we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet. This was suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM-PC compatible computers (386DX, 486DX-II, and Pentium) in downloading the images and in developing a digitized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps); Twinsock and Netscape were used to navigate the Internet. Korean word-processing software (version 3.0) was used to create HTML files, and the downloaded images were linked to these files. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a high-speed computer (at least a 486DX-II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that these were good enough to use as a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital image teaching file using HTML is easy to use. Our method of creating a digital imaging teaching file using the Internet and HTML is easy to implement, and radiologists with little computer experience who want to study various digital radiologic imaging cases should find it easy to use

  6. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

    installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop down menu, select Transfer Files. Then click Get

  7. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
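
The block-size rule quoted in the abstract (total data divided by the number of processes) can be sketched as a rebalancing computation; the per-process byte counts are made-up example values, and the actual data exchange between neighbours is elided.

```python
# Dynamic block sizing: each of N processes should end up writing
# total_bytes / N, exchanging surplus data with its neighbours.

def rebalance(per_process_bytes):
    total = sum(per_process_bytes)
    n = len(per_process_bytes)
    block = total // n                   # dynamically determined block size
    # Bytes each process must give away (+) or receive (-) to reach it
    deltas = [have - block for have in per_process_bytes]
    return block, deltas

block, deltas = rebalance([10, 30, 20])
assert block == 20
assert deltas == [-10, 10, 0]
assert sum(deltas) == 0   # data is exchanged, not created or lost
```

After the exchange, every process writes one uniformly sized block, which is what makes the subsequent parallel write to the shared object efficient.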

  8. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    All existing file browsers display three timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate these temporal attributes to conceal the traces of file usage. However, every file in NTFS has eight timestamps, which are stored in the file record and can be used to detect the substitution of attributes. The authors suggest a method of revealing the original timestamps after replacement, along with an automated variant of it for a set of files.
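
Some background on the eight timestamps: NTFS keeps four in the $STANDARD_INFORMATION attribute and four more in $FILE_NAME. Common timestamp-forging utilities rewrite only the first set, so comparing the two can reveal a substitution. A minimal sketch of that comparison follows; the values are illustrative epoch seconds, not parsed from a real MFT record, and this is not the authors' full method.

```python
# Compare the $STANDARD_INFORMATION (si) and $FILE_NAME (fn) timestamp
# sets of one NTFS file record. $SI values earlier than $FN values are
# a classic indicator of backdated (forged) timestamps.

FIELDS = ("created", "modified", "mft_changed", "accessed")

def suspicious(si, fn):
    """Return the fields where $SI predates $FN."""
    return [f for f in FIELDS if si[f] < fn[f]]

si = {"created": 1000, "modified": 1000, "mft_changed": 5000, "accessed": 1000}
fn = {"created": 4000, "modified": 4000, "mft_changed": 4000, "accessed": 4000}
print(suspicious(si, fn))  # → ['created', 'modified', 'accessed']
```

Running the same check over a directory tree gives the automated, multi-file variant the abstract mentions.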

  9. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks, it has become easier to process the plethora of data generated by various devices in real time. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.

  10. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM-PC compatible, 486DX2) including an image capture card (Window vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 to 545 X 414, 256 gray levels) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer's monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost appliances, we could complete the image database system of a teaching file using a personal computer by a relatively inexpensive method

  11. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.
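
One detail such a transfer must reconcile is line-ending conventions: Apple machines of that era terminated text lines with CR, IBM PCs with CRLF. A minimal conversion sketch for numeric ASCII data follows; the data format is an assumed example, and the original listings (in the paper) also handled the serial-port side, which is omitted here.

```python
# Convert numeric ASCII data between Apple (CR) and IBM PC (CRLF)
# line-ending conventions.

def apple_to_ibm(text):
    return text.replace("\r", "\r\n")

def ibm_to_apple(text):
    return text.replace("\r\n", "\r")

apple_data = "1.5\r2.7\r3.9\r"
ibm_data = apple_to_ibm(apple_data)
assert ibm_data == "1.5\r\n2.7\r\n3.9\r\n"
assert ibm_to_apple(ibm_data) == apple_data
```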

  12. An empirical analysis of journal policy effectiveness for computational reproducibility.

    Science.gov (United States)

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
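
The reported rates translate back into approximate paper counts for the sample of 204 (rounded to whole papers):

```python
# Convert the reported reproducibility rates into approximate counts.
sample = 204
artifacts = round(0.44 * sample)    # authors provided data/code
reproduced = round(0.26 * sample)   # findings successfully replicated

print(artifacts, reproduced)  # → 90 53
```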

  13. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers mainly rely on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called the directed tree with linked leaf model is proposed for further implementation in data structures and algorithms. An extensive analysis is given to justify the soundness and completeness of F2AC.

  14. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems traceability when measuring 3D complex geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty

  15. A Computable OLG Model for Gender and Growth Policy Analysis

    OpenAIRE

    Pierre-Richard Agénor

    2012-01-01

    This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...

  16. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  17. Computational Models Used to Assess US Tobacco Control Policies.

    Science.gov (United States)

    Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C

    2017-11-01

    Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and…

  18. Do Your School Policies Provide Equal Access to Computers? Are You Sure?

    Science.gov (United States)

    DuBois, Phyllis A.; Schubert, Jane G.

    1986-01-01

    Outlines how school policies can unintentionally perpetuate gender discrimination in student computer use and access. Describes four areas of administrative policies that can cause inequities and provides ways for administrators to counteract these policies. Includes discussion of a program to balance computer use, and an abstract of an article…

  19. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural…

  20. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  1. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  2. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  3. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

    Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect individuals' use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P file sharing industry and for regulatory bodies.

  4. Request queues for interactive clients in a shared file system of a parallel computing system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin

    2015-08-18

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
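The metadata-queue policy described in this record can be sketched in a few lines. The class below is an illustrative toy, not the patented implementation: the two-level priority table and all names are our assumptions; it simply drains interactive metadata requests before batch ones, FIFO within each class.

```python
import heapq
import itertools

class MetadataQueue:
    """Toy merged metadata queue: interactive requests beat batch requests."""
    PRIORITY = {"interactive": 0, "batch": 1}  # lower value = served first

    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # FIFO tie-breaker within a class

    def submit(self, client_class, request):
        # Requests arrive via the interactive or batch client proxy.
        heapq.heappush(self._heap, (self.PRIORITY[client_class], next(self._order), request))

    def next_request(self):
        _, _, request = heapq.heappop(self._heap)
        return request

q = MetadataQueue()
q.submit("batch", "stat /scratch/job42")
q.submit("interactive", "ls /home/alice")
print(q.next_request())  # → "ls /home/alice" (interactive is served first)
```

A real allocation policy could instead weight the two queues (e.g. serve N interactive requests per batch request), but the heap-of-tuples structure stays the same.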

  5. Public policy and regulatory implications for the implementation of Opportunistic Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning; Henten, Anders

    2012-01-01

    Opportunistic Cloud Computing Services (OCCS) is a social network approach to the provisioning and management of cloud computing services for enterprises. This paper discusses how public policy and regulations will impact on OCCS implementation. We rely on documented, publicly available government and corporate policies on the adoption of cloud computing services and deduce the impact of these policies on the adoption of opportunistic cloud computing services. We conclude that there are regulatory challenges on data protection that raise issues for cloud computing adoption in general, and that the lack of a single globally accepted data protection standard poses some challenges for a very successful implementation of OCCS for companies. However, the direction of current public and corporate policies on cloud computing makes a good case for enterprises to try out opportunistic cloud computing services.

  6. Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals.

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one-year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
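The kind of predictive model the abstract mentions, policy adoption as a function of impact factor and publisher type, can be illustrated with a logistic form. The coefficients below are invented for this sketch and are not the paper's estimates.

```python
import math

# Illustrative logistic model of open-data-policy adoption as a function of
# (log) journal impact factor and publisher type. Coefficients are made up.
def p_open_data_policy(impact_factor, society_publisher):
    b0, b_if, b_soc = -2.0, 0.9, 0.8          # hypothetical coefficients
    z = b0 + b_if * math.log(impact_factor) + (b_soc if society_publisher else 0.0)
    return 1.0 / (1.0 + math.exp(-z))         # probability of having a policy

# Direction matches the paper's finding: higher-impact, society-published
# journals come out more likely to have an open data policy.
low = p_open_data_policy(2.0, society_publisher=False)
high = p_open_data_policy(20.0, society_publisher=True)
print(round(low, 2), round(high, 2))
```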

  8. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study.
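The transportation and centering measures used in studies like this one are conventionally computed from pre- and post-instrumentation CT sections using Gambill's formulas; the sketch below shows those formulas with made-up thickness values (the variable names are ours).

```python
# Gambill et al.'s standard CT metrics:
# a1/a2 = mesial dentin thickness before/after instrumentation (mm),
# b1/b2 = distal dentin thickness before/after instrumentation (mm).
def transportation(a1, a2, b1, b2):
    return abs((a1 - a2) - (b1 - b2))  # 0 means no transportation

def centering_ratio(a1, a2, b1, b2):
    da, db = a1 - a2, b1 - b2
    if max(da, db) == 0:
        return 1.0  # nothing removed on either side: perfectly centred
    return min(da, db) / max(da, db)  # 1 = perfectly centred preparation

# Example with invented measurements at one canal level:
print(round(transportation(1.20, 0.95, 1.10, 1.00), 2))   # → 0.15 (mm)
print(round(centering_ratio(1.20, 0.95, 1.10, 1.00), 2))  # → 0.4
```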

  9. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  10. Cohabiting Couple, Filing Jointly? Resource Pooling and U.S. Poverty Policies

    Science.gov (United States)

    Kenney, Catherine

    2004-01-01

    Social policy in the United States is inconsistent in its treatment of cohabiting-parent households. For example, although welfare policy generally assumes that marital status should not affect the extent to which children benefit from each adult's income, tax policy and the poverty classification assume income pooling among married but not…

  11. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.

  12. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components.

  13. Modification to the Monte N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  14. Mapping an Emergent Field of "Computational Education Policy": Policy Rationalities, Prediction and Data in the Age of Artificial Intelligence

    Science.gov (United States)

    Gulson, Kalervo N.; Webb, P. Taylor

    2017-01-01

    Contemporary education policy involves the integration of novel forms of data and the creation of new data platforms, in addition to the infusion of business principles into school governance networks, and intensification of socio-technical relations. In this paper, we examine how "computational rationality" may be understood as…

  15. The Impact of Fiscal Policy on Poverty in Ethiopia: A Computable ...

    African Journals Online (AJOL)

    Ethiopia has implemented various fiscal policy reforms in the past decade. Most of these reforms center on indirect taxes and pro-poor expenditure patterns. This study investigates the economy-wide impacts of these fiscal policy changes on poverty. To this effect, the study used a static computable general equilibrium ...

  16. Computational intelligence in economic games and policy design

    NARCIS (Netherlands)

    Dawid, H.; Poutré, La J.A.; Yao, X.

    2008-01-01

    Developing CI techniques for economic games and policies is a very promising and fast-growing field. Several interesting multi-disciplinary subfields exist, which require researchers of various disciplines to collaborate with each other and contribute to the advances of knowledge in this emerging…

  17. Computer Attack and Cyberterrorism: Vulnerabilities and Policy Issues for Congress

    National Research Council Canada - National Science Library

    Wilson, Clay

    2005-01-01

    Many international terrorist groups now actively use computers and the Internet to communicate, and several may develop or acquire the necessary technical skills to direct a coordinated attack against...

  18. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O'Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  19. The All-or-Nothing Anti-Theft Policy - Theft Protection for Pervasive Computing

    DEFF Research Database (Denmark)

    Pedersen, Michael Østergaard; Pagter, Jakob Illeborg

    2007-01-01

    In many application scenarios for pervasive computing, theft is a serious security threat. In this paper we present the All-Or-Nothing anti-theft policy aimed at providing theft protection for pervasive computing. The overall idea behind the All-Or-Nothing anti-theft policy is to chain devices together in friendly networks so that any device will only work when it can see all of its friends. Thus a thief will have to keep the network of friendly devices together even if he only desires to steal one of the devices; otherwise the device will not work. We provide a detailed security policy, present the required cryptographic protocols, provide three different applications, and finally we document that the policy is suitable for implementation on typical pervasive computing devices.
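The availability rule at the heart of the All-Or-Nothing policy is easy to sketch. The toy below deliberately omits the cryptographic protocols the paper actually relies on and only captures the "works only when all friends are visible" behaviour.

```python
# Toy model of the All-Or-Nothing rule (no crypto, no real device discovery):
# a device only operates if every friend in its network is currently reachable.
class Device:
    def __init__(self, name, friends):
        self.name = name
        self.friends = set(friends)  # names of devices it must "see"

    def can_operate(self, visible_devices):
        # All-or-nothing: a single missing friend disables the device.
        return self.friends <= set(visible_devices)

phone = Device("phone", friends={"laptop", "watch"})
print(phone.can_operate({"laptop", "watch"}))  # → True: all friends present
print(phone.can_operate({"laptop"}))           # → False: watch missing, device refuses to work
```

The point of the design is that stealing one device forces the thief to steal (and keep together) the whole friendly network; in the real scheme, "seeing" a friend means completing an authenticated protocol exchange, not a simple set test.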

  20. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.

  1. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography -An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

    Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is equally important; otherwise, canal aberrations such as transportation will be created. The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc systems and the rotary OneShape single-file instrumentation system with respect to cervical dentin thickness, cross-sectional area, and canal transportation in mandibular first molars, using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentinal thickness, and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm, and 7 mm. When canal transportation was compared for each file system over a distance of 7 mm (starting from 0 mm, with evaluation at 1 mm, 2 mm, 3 mm, 5 mm, and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005, and 0.029, respectively). The mean value of cervical dentinal removal was highest at all levels for OneShape and lowest for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system; a significant difference was found at 9 mm, 11 mm, and 12 mm between all three file systems (p < 0.001 at each level). It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area, and cervical dentinal thickness.

  2. CERN’s Computing rules updated to include policy for control systems

    CERN Multimedia

    IT Department

    2008-01-01

    The use of CERN’s computing facilities is governed by rules defined in Operational Circular No. 5 and its subsidiary rules of use. These rules are available from the web site http://cern.ch/ComputingRules. Please note that the subsidiary rules for Internet/Network use have been updated to include a requirement that control systems comply with the CNIC(Computing and Network Infrastructure for Control) Security Policy. The security policy for control systems, which was approved earlier this year, can be accessed at https://edms.cern.ch/document/584092 IT Department

  3. "Life" and Education Policy: Intervention, Augmentation and Computation

    Science.gov (United States)

    Gulson, Kalervo N.; Webb, P. Taylor

    2018-01-01

    In this paper, we are interested in the notion of multiple ways of thinking, knowing and transforming life, namely an increasing capacity to intervene in "life" as a "molecular biopolitics," and the changing ways in which "life" can be understood computationally. We identify and speculate on the ways different ideas…

  4. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str…
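A minimal version of the conversion step can be sketched as a function that writes one Gaussian 09 input file per conformer. The route section below is an assumed example; the protocol's exact keywords, basis set, and file naming may differ.

```python
# Sketch: write a Gaussian 09 NMR (GIAO) input file for each conformer.
# The route line is an illustrative assumption, not the protocol's exact choice.
def write_gjf(path, title, atoms, charge=0, mult=1):
    # atoms: list of (symbol, x, y, z) tuples, Cartesian coordinates in angstroms
    with open(path, "w") as f:
        f.write("# B3LYP/6-311+G(2d,p) NMR=GIAO\n\n")   # route section, then blank line
        f.write(f"{title}\n\n{charge} {mult}\n")          # title, blank, charge/multiplicity
        for sym, x, y, z in atoms:
            f.write(f"{sym:2s} {x:12.6f} {y:12.6f} {z:12.6f}\n")
        f.write("\n")  # Gaussian inputs must end with a blank line

# Hypothetical one-conformer ensemble (water) standing in for Spartan output:
conformers = {"conf01": [("O", 0.0, 0.0, 0.0), ("H", 0.96, 0.0, 0.0), ("H", -0.24, 0.93, 0.0)]}
for name, atoms in conformers.items():
    write_gjf(f"{name}.gjf", f"{name} NMR", atoms)
```

In practice the geometries would be parsed out of the Spartan’14 conformer export rather than typed in by hand.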

  5. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  6. New Directions for Hardware-assisted Trusted Computing Policies (Position Paper)

    Science.gov (United States)

    Bratus, Sergey; Locasto, Michael E.; Ramaswamy, Ashwin; Smith, Sean W.

    The basic technological building blocks of the TCG architecture seem to be stabilizing. As a result, we believe that the focus of the Trusted Computing (TC) discipline must naturally shift from the design and implementation of the hardware root of trust (and the subsequent trust chain) to the higher-level application policies. Such policies must build on these primitives to express new sets of security goals. We highlight the relationship between enforcing these types of policies and debugging, since both activities establish the link between expected and actual application behavior. We argue that this new class of policies better fits developers' mental models of expected application behaviors, and we suggest a hardware design direction for enabling the efficient interpretation of such policies.

  7. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Purpose: A computational system was developed for this paper in the C++ programming language to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system with regard to each seed's location and its depth, and an entry file is automatically created with all the cards (instructions) for each seed regarding their cell blocks and surfaces spread out spatially in the 3D environment. The system provides a precise reproduction of the clinical scenario for the MCNPX code's simulation environment, thereby allowing in-depth study of the technique. Results and Conclusion: To validate the computational system, an entry file was created with 88 125I seeds inserted in the phantom's MAX06 prostate region, with the seeds' initial activity set at 0.27 mCi. Isodose curves were obtained in all the prostate slices in 5 mm steps in the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cut-offs at 4 keV and forced collisions in cells of interest. The isodose curves obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of correct positioning of the sources, which the computational system developed provides, in order not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume, at the same time that it demonstrates a large…
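The card-generation idea can be sketched compactly. The snippet below is a heavy simplification of the system described: a real 125I seed needs several cells (capsule, radioactive core, end welds), whereas here each seed is a single sphere, and the material number, density, and importance card are placeholders.

```python
# Simplified sketch: emit MCNP-style surface and cell cards for each seed
# position on a template grid. Numbers here are placeholders, not clinical data.
def seed_cards(positions, radius=0.04, first_id=100):
    cells, surfaces = [], []
    for i, (x, y, z) in enumerate(positions):
        sid = first_id + i
        # general sphere surface card: "j s x y z r"
        surfaces.append(f"{sid} s {x:.3f} {y:.3f} {z:.3f} {radius:.3f}")
        # cell card filled with placeholder material 1 at placeholder density
        cells.append(f"{sid} 1 -4.93 -{sid} imp:p=1")
    return cells, surfaces

# Two hypothetical seed positions (cm) from a planning grid:
cells, surfaces = seed_cards([(0.0, 0.0, 7.0), (0.5, 0.0, 7.5)])
print("\n".join(cells + surfaces))
```

The described system does essentially this for 88 seeds, in C++, while also splicing the cards into the correct blocks of the voxel-phantom input deck.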

  8. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is based simply on factors such as pool water level and the intervals between migrations, so it lacks precise control over individual files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps required files on disk for high energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tape. In this paper we first give a brief introduction to the GRASS system and then present the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance, and a simple analysis is made based on them. (authors)
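The reservation policy GFRS adds on top of GRASS can be sketched as a migration pass that skips pinned files. This is our own simplification; GRASS's actual water-level and interval logic is more involved.

```python
# Toy HSM disk pool with user-based reservation: the migration pass evicts
# unreserved files to tape until the pool drops below a water level, but
# never touches a file a user has reserved (pinned) to disk.
class DiskPool:
    def __init__(self):
        self.on_disk = {}        # path -> size in bytes
        self.reserved = set()    # paths users asked to keep on disk

    def reserve(self, path):
        self.reserved.add(path)

    def migrate_to_tape(self, low_water_bytes):
        migrated = []
        for path, size in sorted(self.on_disk.items()):  # snapshot: safe to delete
            if sum(self.on_disk.values()) <= low_water_bytes:
                break
            if path in self.reserved:
                continue  # reservation pins the file to disk
            migrated.append(path)
            del self.on_disk[path]
        return migrated

pool = DiskPool()
pool.on_disk = {"/data/run1.root": 10, "/data/run2.root": 10, "/data/run3.root": 10}
pool.reserve("/data/run2.root")
print(pool.migrate_to_tape(low_water_bytes=10))  # reserved file stays on disk
```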

  9. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Criteria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  10. National Degree of Computerization: A Context for Evaluating Computer Education Policies in Developing Countries.

    Science.gov (United States)

    Boehm, Barry W.

    Developing countries should take immediate steps to avoid some of the serious problems now facing the United States with regard to the pool of trained computer professionals. Problem areas to be reconciled involve a diverse range of topics, from general national policy to salary structures and conversion efforts. By using the…

  11. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    Full Text Available The Pirate Party believes the state and big business are in the process of protecting stale and inefficient business models for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of 'copyright industries' affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party's legislative input. The essay then examines the central issues raised by illegal file sharing: in particular, the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party's proposal is a viable solution to the music industry's problems.

  12. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study was to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups: Group A, rotary (A1, Neolix; A2, OneShape) and Group B, reciprocating (B1, WaveOne; B2, Reciproc). Preoperative scans of each tooth were taken, followed by conventional access cavity preparation and working length determination with a 10-K file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, measured on four surfaces (facial, lingual, mesial, and distal) for all groups in the two scans. There was no significant difference between the rotary and reciprocating single-file systems in their effect on PCD, but in Group B2 there was significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc file system removed more PCD than the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  13. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial for the software Espectro de Raios-X (ERX). This software reads and performs numerical and graphical analysis of text files containing diagnostic X-ray spectra, for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. ERX allows the user to select among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of ERX there are two types of input file: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and of the spectrum. In addition, ERX allows the user to make comparative analyses between the PDF graphics of the two available catalogs of spectra, and dosimetric evaluations can be performed with the selected spectrum. Software of this kind is an important computational tool for researchers in numerical dosimetry because of the diversity of diagnostic X-ray machines, which implies highly diverse input data. ERX thus gives the group independence with respect to the origin of the data contained in the catalogs created, without the need to resort to other sources. (author)
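The PDF/CDF construction the tutorial describes can be sketched in a few lines (an illustration of the operation, not the ERX code itself): normalise a binned spectrum to a probability density function, then accumulate it into a cumulative distribution function.

```python
# Build the PDF and CDF of a binned spectrum; counts[i] is the relative
# intensity in energy bin i.

def spectrum_pdf_cdf(counts):
    total = float(sum(counts))
    pdf = [c / total for c in counts]  # normalise so the PDF sums to 1
    cdf, running = [], 0.0
    for p in pdf:
        running += p
        cdf.append(running)            # running sum of the PDF gives the CDF
    return pdf, cdf

# toy 4-bin spectrum of relative intensities
pdf, cdf = spectrum_pdf_cdf([10, 30, 40, 20])
```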

  14. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    Hill, M.

    2001-01-01

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing the effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues: the costs and benefits of a domestic carbon dioxide (CO2) tax reform, with special attention to how these costs and benefits depend on the structure of the tax system and on policy-induced changes in 'secondary' pollutants; the effects of allowing for emission permit trading through time when the long-term domestic environmental goal is specified in CO2 stock terms; and the effects on long-term projected economic growth and welfare that are due to damages from the emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework.

  15. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services

  16. 78 FR 63177 - Order on Voluntary Remand and Clarifying Policy on Filing of Reactive Power Service Rate...

    Science.gov (United States)

    2013-10-23

    ... the obligation to follow a voltage schedule.'' \\19\\ The Commission distinguished Hot Spring Power Co..., to explore the mechanics of public utilities filing reactive power rate schedules for which there is...'' jurisdictional service and, accordingly, must be filed for Commission review); Sulphur Springs Valley Elec. Coop...

  17. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-adjusting File and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area, and structure model index, as well as transportation, uninstrumented area, and the correlation between working time and curvature, were analyzed. Although no statistically significant difference was observed between the two systems in the distal canals (p>0.05), a significantly greater volume of removed dentin and a lower uninstrumented area were obtained with ProTaper in the mesial canals (p<0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF: r²=0.792, p<0.0004; PTU: r²=0.9098, p<0.0001).

  18. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change but the extent to which these meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are relatively little present or non-existent, as well as poorly connected to language about practical strategies and socio-economic contexts, both also little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  19. Computation of a near-optimal service policy for a single-server queue with homogeneous jobs

    DEFF Research Database (Denmark)

    Johansen, Søren Glud; Larsen, Christian

    2001-01-01

    We present an algorithm for computing a near-optimal service policy for a single-server queueing system when the service cost is a convex function of the service time. The policy has state-dependent service times, and it includes the options to remove jobs from the system and to let the server be off. The system's semi-Markov decision model has infinite action sets for the positive states. We design a new tailor-made policy-iteration algorithm for computing a policy for which the long-run average cost is at most a positive tolerance above the minimum average cost. For any positive tolerance
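The overall policy-iteration scheme referred to above can be sketched on a tiny finite Markov decision process. This is a sketch of the generic scheme only; the papers' tailor-made algorithm handles a semi-Markov model with infinite action sets under a long-run average-cost criterion, which this discounted toy does not, and all names and numbers here are illustrative assumptions.

```python
# Generic policy iteration: alternate policy evaluation and greedy improvement
# until the policy stops changing.

def policy_iteration(states, actions, P, c, gamma=0.9, tol=1e-9):
    """P[a][s][t]: transition probability; c[s][a]: one-step cost."""
    policy = {s: actions[0] for s in states}
    while True:
        # policy evaluation: iterative sweeps until the values stop moving
        V = {s: 0.0 for s in states}
        while True:
            delta = 0.0
            for s in states:
                a = policy[s]
                v = c[s][a] + gamma * sum(P[a][s][t] * V[t] for t in states)
                delta = max(delta, abs(v - V[s]))
                V[s] = v
            if delta < tol:
                break
        # policy improvement: act greedily with respect to V
        stable = True
        for s in states:
            best = min(actions,
                       key=lambda a: c[s][a] + gamma * sum(P[a][s][t] * V[t] for t in states))
            if best != policy[s]:
                policy[s], stable = best, False
        if stable:
            return policy, V

# toy single-server queue: state 0 = idle, 1 = job in service
states, actions = [0, 1], ["slow", "fast"]
P = {"slow": {0: {0: 1.0, 1: 0.0}, 1: {0: 0.5, 1: 0.5}},
     "fast": {0: {0: 1.0, 1: 0.0}, 1: {0: 0.9, 1: 0.1}}}
c = {0: {"slow": 0.0, "fast": 0.0}, 1: {"slow": 1.0, "fast": 1.5}}
policy, V = policy_iteration(states, actions, P, c)
```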

  20. Computation of a near-optimal service policy for a single-server queue with homogeneous jobs

    DEFF Research Database (Denmark)

    Johansen, Søren Glud; Larsen, Christian

    2000-01-01

    We present an algorithm for computing a near optimal service policy for a single-server queueing system when the service cost is a convex function of the service time. The policy has state-dependent service times, and it includes the options to remove jobs from the system and to let the server be off. The system's semi-Markov decision model has infinite action sets for the positive states. We design a new tailor-made policy iteration algorithm for computing a policy for which the long-run average cost is at most a positive tolerance above the minimum average cost. For any positive tolerance

  1. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc., or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program provides the researcher, manager, or librarian with a means of maintaining a list of and index to documents in a small library, reprint collection, or office file.
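The permutation idea behind a KWIC index can be shown in a few lines (an illustrative sketch, not the USGS program): every non-stopword in a title becomes an index entry, with the title rotated so the keyword leads and its context follows.

```python
# Minimal KWIC index: rotate each title once per significant word, then sort
# the (keyword, rotated context) pairs alphabetically.

STOPWORDS = {"a", "an", "the", "of", "to", "for", "and", "in", "on"}

def kwic_index(titles):
    entries = []
    for title in titles:
        words = title.split()
        for i, word in enumerate(words):
            if word.lower() in STOPWORDS:
                continue
            rotated = " ".join(words[i:] + words[:i])  # keyword-first rotation
            entries.append((word.lower(), rotated))
    return sorted(entries)  # alphabetical by keyword

index = kwic_index(["Keyword Indexes for Textual Data Files"])
for keyword, context in index:
    print(keyword.ljust(10), context)
```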

  2. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore, verification calculations were focused on a catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  3. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than that of both the RR and MR groups at the cervical and apical levels (P < .05). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  4. Large-scale automated analysis of news media: a novel computational method for obesity policy research.

    Science.gov (United States)

    Hamad, Rita; Pomeranz, Jennifer L; Siddiqi, Arjumand; Basu, Sanjay

    2015-02-01

    Analyzing news media allows obesity policy researchers to understand popular conceptions about obesity, which is important for targeting health education and policies. A persistent dilemma is that investigators have to read and manually classify thousands of individual news articles to identify how obesity and obesity-related policy proposals may be described to the public in the media. A machine learning method called "automated content analysis," which permits researchers to train computers to "read" and classify massive volumes of documents, was demonstrated. 14,302 newspaper articles that mentioned the word "obesity" during 2011-2012 were identified. Four states that vary in obesity prevalence and policy (Alabama, California, New Jersey, and North Carolina) were examined. The reliability of an automated program in categorizing the media's framing of obesity as an individual-level problem (e.g., diet) and/or an environmental-level problem (e.g., obesogenic environment) was tested. The automated program performed similarly to human coders. The proportion of articles with individual-level framing (27.7-31.0%) was higher than the proportion with neutral (18.0-22.1%) or environmental-level framing (16.0-16.4%) across all states and over the entire study period (P < .05). The feasibility of large-scale automated analysis of news media was thus demonstrated. © 2014 The Obesity Society.
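The framing classification described above can be caricatured with a dictionary-based sketch. This is a deliberate simplification: the study trained a supervised "automated content analysis" model on hand-coded articles, whereas this toy just counts assumed cue words for each frame.

```python
# Toy frame classifier: count cue words per frame, pick the larger count.
# The cue-word lists are illustrative assumptions, not from the study.

INDIVIDUAL_CUES = {"diet", "exercise", "willpower", "habits"}
ENVIRONMENTAL_CUES = {"obesogenic", "advertising", "zoning", "access"}

def frame_article(text):
    words = set(text.lower().split())
    individual = len(words & INDIVIDUAL_CUES)
    environmental = len(words & ENVIRONMENTAL_CUES)
    if individual == environmental:
        return "neutral"
    return "individual" if individual > environmental else "environmental"

frame_article("poor diet and lack of exercise drive obesity")  # -> "individual"
```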

  5. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
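A parser of the kind the document enables can be sketched with Python's struct module. The field layout here (two float32 calibration terms, a channel count, then uint32 channel data) is a hypothetical simplification for illustration; the real layout is the one the PCF document defines.

```python
# Round-trip a simplified binary spectrum record with struct (hypothetical
# layout, not the actual PCF format).

import struct

HEADER = "<ffI"  # little-endian: energy-calibration offset, gain, channel count

def pack_spectrum(offset, gain, channels):
    return struct.pack(HEADER, offset, gain, len(channels)) + \
           struct.pack("<%dI" % len(channels), *channels)

def unpack_spectrum(blob):
    offset, gain, n = struct.unpack_from(HEADER, blob, 0)
    body = struct.unpack_from("<%dI" % n, blob, struct.calcsize(HEADER))
    return offset, gain, list(body)

blob = pack_spectrum(0.0, 3.0, [5, 9, 2])
offset, gain, channels = unpack_spectrum(blob)
```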

  6. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered among the most challenging due to their complex anatomy. WaveOne and OneShape are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare the dentin thickness, centering ability, canal transportation, and instrumentation time of WaveOne and OneShape files in primary root canals using cone-beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with a minimum 7 mm root length were included in the study. Cone-beam computed tomographic images were taken before and after instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. A significant difference was found in instrumentation time and canal transportation between the two groups. WaveOne showed less canal transportation than OneShape, and the mean instrumentation time of WaveOne was significantly shorter than that of OneShape. The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. Computable general equilibrium modelling in the context of trade and environmental policy

    Energy Technology Data Exchange (ETDEWEB)

    Koesler, Simon Tobias

    2014-10-14

    This thesis is dedicated to the evaluation of environmental policies in the context of climate change. Its objectives are twofold. Its first part is devoted to the development of potent instruments for quantitative impact analysis of environmental policy. In this context, the main contributions include the development of a new computable general equilibrium (CGE) model which makes use of the new comprehensive and coherent World Input-Output Dataset (WIOD) and which features a detailed representation of bilateral and bisectoral trade flows. Moreover it features an investigation of input substitutability to provide modellers with adequate estimates for key elasticities as well as a discussion and amelioration of the standard base year calibration procedure of most CGE models. Building on these tools, the second part applies the improved modelling framework and studies the economic implications of environmental policy. This includes an analysis of so called rebound effects, which are triggered by energy efficiency improvements and reduce their net benefit, an investigation of how firms restructure their production processes in the presence of carbon pricing mechanisms, and an analysis of a regional maritime emission trading scheme as one of the possible options to reduce emissions of international shipping in the EU context.

  9. Computable general equilibrium modelling in the context of trade and environmental policy

    International Nuclear Information System (INIS)

    Koesler, Simon Tobias

    2014-01-01

    This thesis is dedicated to the evaluation of environmental policies in the context of climate change. Its objectives are twofold. Its first part is devoted to the development of potent instruments for quantitative impact analysis of environmental policy. In this context, the main contributions include the development of a new computable general equilibrium (CGE) model which makes use of the new comprehensive and coherent World Input-Output Dataset (WIOD) and which features a detailed representation of bilateral and bisectoral trade flows. Moreover it features an investigation of input substitutability to provide modellers with adequate estimates for key elasticities as well as a discussion and amelioration of the standard base year calibration procedure of most CGE models. Building on these tools, the second part applies the improved modelling framework and studies the economic implications of environmental policy. This includes an analysis of so called rebound effects, which are triggered by energy efficiency improvements and reduce their net benefit, an investigation of how firms restructure their production processes in the presence of carbon pricing mechanisms, and an analysis of a regional maritime emission trading scheme as one of the possible options to reduce emissions of international shipping in the EU context.

  10. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Full Text Available Use of rotary nickel-titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this area has undergone dramatic modifications in order to achieve improved shaping ability. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned by CBCT before and after preparation at different apical levels. Specimens were divided into two groups of 15. In the first group Twisted File and in the second BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments in canal centering ability in any section. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.
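Transportation and centering measures in CBCT studies like this one are commonly computed with the Gambill et al. definitions; that this study used exactly these formulas is an assumption. Here a1/a2 are pre-/post-instrumentation dentin thicknesses toward one side of the root, and b1/b2 toward the other.

```python
# Standard canal transportation and centering-ratio computations
# (Gambill-style definitions; the numbers below are illustrative).

def canal_transportation(a1, a2, b1, b2):
    # 0 means the canal stayed put; the sign indicates the direction of movement
    return (a1 - a2) - (b1 - b2)

def centering_ratio(a1, a2, b1, b2):
    removed_a, removed_b = a1 - a2, b1 - b2
    if max(removed_a, removed_b) == 0:
        return 1.0  # nothing removed on either side: perfectly centred
    return min(removed_a, removed_b) / max(removed_a, removed_b)

canal_transportation(1.2, 0.9, 1.1, 1.0)  # ~0.2 mm toward side a
centering_ratio(1.2, 0.9, 1.1, 1.0)       # ~0.33 (1.0 = perfectly centred)
```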

  11. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background: Successful endodontic therapy depends on many factors, and one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. Aim: The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in first mandibular molars using cone beam computed tomography. Materials and Methods: Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into three groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentin thickness, and cross-sectional area. Results were analysed statistically using ANOVA with post-hoc Tukey analysis. Results: The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p<0.05). For canal transportation, evaluated for each file system over a distance of 7 mm (starting from 0 mm, with evaluation at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p=0.014, 0.046, 0.004, 0.028, 0.005 and 0.029, respectively). The mean value of cervical dentin removal was maximal at all levels for OneShape and minimal for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p<0.001, <0.001, <0.001). Conclusion: It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area, and cervical dentin thickness. PMID:26023639

  12. Dynamic virtual machine allocation policy in cloud computing complying with service level agreement using CloudSim

    Science.gov (United States)

    Aneri, Parikh; Sumathy, S.

    2017-11-01

    Cloud computing provides services over the internet, delivering application resources and data to users on demand. The basis of cloud computing is the consumer-provider model: the cloud provider supplies resources that consumers access in order to build applications to their own specification. A cloud data center is a bulk pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other the system has to serve a huge number of requests effectively. Therefore, the resource allocation and scheduling policies play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy using the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component. The monitor component helps to increase cloud resource utilization by observing the Hungarian algorithm's state and altering it based on artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
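The assignment problem at the heart of the proposed policy can be sketched as follows: choose which host serves each VM request so the total placement cost is minimal. A brute-force search over permutations stands in for the Hungarian algorithm here (it finds the same optimum and keeps the sketch short, whereas the real algorithm runs in O(n³)); the cost numbers are illustrative assumptions.

```python
# Minimum-cost assignment by exhaustive search (Hungarian-algorithm stand-in,
# fine only for small square cost matrices).

from itertools import permutations

def min_cost_assignment(cost):
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):          # perm[i] = host chosen for VM i
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# cost[i][j]: load penalty of placing VM request i on host j
cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
best_total, assignment = min_cost_assignment(cost)
```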

  13. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    Science.gov (United States)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach that integrates diverse sources of data in context to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, these form an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches, convey what kinds of research inquiries each one is best suited to address, and show how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach, the panel will cover the relevant "assumptions" and how differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted, and the limitations of each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  14. The Development of Computer Policies in Government, Political Parties, and Trade Unions in Norway 1961-1983

    Science.gov (United States)

    Elgsaas, Knut; Hegna, Håvard

    A “Council for Government Electronic Data Processing” was established in 1961. This was the start of the development of a common policy for computers and data within the public administration. In 1969-70, computers got onto the agenda of the political parties and the trade unions. In the course of the seventies and the beginning of the eighties, the government, the political parties, and the trade unions established a more comprehensive view of data-political questions that we designate by the term data policy. This paper sheds light on the causes and forces that drove the evolution of a data policy within these central sectors in Norway. We also show how various actors from research, trade and industry, and political life influenced the development of data policy, and present links between the actors that indicate that they mutually influenced each other.

  15. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the national economic production departments. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and on each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP); however, waste water may be effectively controlled. This study also demonstrates that, along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from heavy pollution to light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  16. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  17. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    Full Text Available The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator, allowing on-demand checking and the use of tailorable reporting and exception policies targeted at adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer, as a prototype that achieves this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, perform the file integrity tasks. An Initiator travels to client systems, computes file digests, and stores them in a database file located on write-once media. An Examiner agent later computes new digests and compares them with the originals in the database file; changes in digest values indicate that file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
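The digest/compare cycle performed by the Initiator and Examiner agents can be sketched in a few lines of Python. This is a minimal single-host illustration of the MD5 baseline idea, not M-FICA's mobile-agent code; the function names and demo file are hypothetical.

```python
import hashlib
import tempfile
from pathlib import Path

def md5_digest(path, chunk_size=65536):
    """Stream the file so large files are not loaded fully into memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """'Initiator' role: record one digest per monitored file."""
    return {str(p): md5_digest(p) for p in paths}

def find_tampered(baseline):
    """'Examiner' role: recompute digests and report mismatches."""
    return [p for p, d in baseline.items() if md5_digest(p) != d]

# demo on a temporary file
tmp = Path(tempfile.mkdtemp())
target = tmp / "config.txt"
target.write_text("original contents\n")
baseline = build_baseline([target])
target.write_text("modified contents\n")   # simulate tampering
print(find_tampered(baseline))             # the modified file is reported
```

In M-FICA the baseline database lives on write-once media, so an attacker cannot cover their tracks by rewriting the stored digests.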

  18. Comprehensive optimisation of China’s energy prices, taxes and subsidy policies based on the dynamic computable general equilibrium model

    International Nuclear Information System (INIS)

    He, Y.X.; Liu, Y.Y.; Du, M.; Zhang, J.X.; Pang, Y.X.

    2015-01-01

    Highlights: • Energy policy is defined as a compilation of energy price, tax and subsidy policies. • The maximisation of total social benefit is the optimised objective. • A more rational carbon tax ranges from 10 to 20 Yuan/ton under the current situation. • The optimal coefficient pricing is more conducive to maximising total social benefit. - Abstract: Under conditions of increasingly serious environmental pollution, rational energy policy plays an important role in the practical significance of energy conservation and emission reduction. This paper defines energy policies as the compilation of energy prices, taxes and subsidy policies. Moreover, it establishes an optimisation model of China’s energy policy based on the dynamic computable general equilibrium model, which maximises the total social benefit, in order to explore the comprehensive influences of a carbon tax, the sales pricing mechanism and the renewable energy fund policy. The results show that when the change rates of gross domestic product and consumer price index are ±2%, ±5% and the renewable energy supply structure ratio is 7%, the more reasonable carbon tax ranges from 10 to 20 Yuan/ton, and the optimal coefficient pricing mechanism is more conducive to the objective of maximising the total social benefit. From the perspective of optimising the overall energy policies, if the upper limit of the change rate in the consumer price index is 2.2%, the existing renewable energy fund should be improved.

  19. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocating motion. Pre- and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey’s honestly significant difference test. Results There were no differences in the magnitude of transportation between the rotary instruments (p>0.05) at either 3 mm or 6 mm from the apex. At 9 mm from the apex, group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to group II OS (0.12±0.07 and 0.54±0.24) and group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to the full-sequence rotary file system.

  20. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocating motion. Pre- and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. There were no differences in the magnitude of transportation between the rotary instruments (p>0.05) at either 3 mm or 6 mm from the apex. At 9 mm from the apex, group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to group II OS (0.12±0.07 and 0.54±0.24) and group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to the full-sequence rotary file system.

  1. An approach to computing marginal land use change carbon intensities for bioenergy in policy applications

    International Nuclear Information System (INIS)

    Wise, Marshall; Hodson, Elke L.; Mignone, Bryan K.; Clarke, Leon; Waldhoff, Stephanie; Luckow, Patrick

    2015-01-01

    Accurately characterizing the emissions implications of bioenergy is increasingly important to the design of regional and global greenhouse gas mitigation policies. Market-based policies, in particular, often use information about carbon intensity to adjust relative deployment incentives for different energy sources. However, the carbon intensity of bioenergy is difficult to quantify because carbon emissions can occur when land use changes to expand production of bioenergy crops rather than simply when the fuel is consumed as for fossil fuels. Using a long-term, integrated assessment model, this paper develops an approach for computing the carbon intensity of bioenergy production that isolates the marginal impact of increasing production of a specific bioenergy crop in a specific region, taking into account economic competition among land uses. We explore several factors that affect emissions intensity and explain these results in the context of previous studies that use different approaches. Among the factors explored, our results suggest that the carbon intensity of bioenergy production from land use change (LUC) differs by a factor of two depending on the region in which the bioenergy crop is grown in the United States. Assumptions about international land use policies (such as those related to forest protection) and crop yields also significantly impact carbon intensity. Finally, we develop and demonstrate a generalized method for considering the varying time profile of LUC emissions from bioenergy production, taking into account the time path of future carbon prices, the discount rate and the time horizon. When evaluated in the context of power sector applications, we found electricity from bioenergy crops to be less carbon-intensive than conventional coal-fired electricity generation and often less carbon-intensive than natural-gas fired generation. - Highlights: • Modeling methodology for assessing land use change emissions from bioenergy • Use GCAM
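One simple way to picture the time-profile issue the abstract raises is to amortize the one-time LUC emission pulse over the discounted energy stream it enables. The function below is a generic capital-recovery sketch with invented numbers; the paper's GCAM-based method additionally weights by the future carbon-price path, which is not reproduced here.

```python
def amortized_luc_intensity(upfront_emissions, annual_energy, years, discount_rate):
    """Spread a one-time land-use-change emission pulse (tCO2) over an
    annual energy stream (GJ/yr) using a capital-recovery factor."""
    r = discount_rate
    crf = r / (1 - (1 + r) ** -years) if r > 0 else 1.0 / years
    annualized = upfront_emissions * crf   # tCO2 per year
    return annualized / annual_energy      # tCO2 per GJ

# hypothetical: a 500 tCO2 pulse enabling 100 GJ/yr for 30 years at 5%
print(round(amortized_luc_intensity(500, 100, 30, 0.05), 4))  # -> 0.3253
# with no discounting the same pulse is simply spread evenly
print(round(amortized_luc_intensity(500, 100, 30, 0.0), 4))   # -> 0.1667
```

A higher discount rate front-loads the burden and raises the effective intensity; a longer horizon lowers it — which is why the time path of carbon prices and the choice of horizon matter so much in the paper's results.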

  2. Balancing Exploration, Uncertainty Representation and Computational Time in Many-Objective Reservoir Policy Optimization

    Science.gov (United States)

    Zatarain-Salazar, J.; Reed, P. M.; Quinn, J.; Giuliani, M.; Castelletti, A.

    2016-12-01

    As we confront the challenges of managing river basin systems with a large number of reservoirs and increasingly uncertain tradeoffs impacting their operations (due to, e.g. climate change, changing energy markets, population pressures, ecosystem services, etc.), evolutionary many-objective direct policy search (EMODPS) solution strategies will need to address the computational demands associated with simulating more uncertainties and therefore optimizing over increasingly noisy objective evaluations. Diagnostic assessments of state-of-the-art many-objective evolutionary algorithms (MOEAs) to support EMODPS have highlighted that search time (or number of function evaluations) and auto-adaptive search are key features for successful optimization. Furthermore, auto-adaptive MOEA search operators are themselves sensitive to having a sufficient number of function evaluations to learn successful strategies for exploring complex spaces and for escaping from local optima when stagnation is detected. Fortunately, recent parallel developments allow coordinated runs that enhance auto-adaptive algorithmic learning and can handle scalable and reliable search with limited wall-clock time, but at the expense of the total number of function evaluations. In this study, we analyze this tradeoff between parallel coordination and depth of search using different parallelization schemes of the Multi-Master Borg on a many-objective stochastic control problem. We also consider the tradeoff between better representing uncertainty in the stochastic optimization, and simplifying this representation to shorten the function evaluation time and allow for greater search. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system, where multiple competing objectives for hydropower production, urban water supply, recreation and environmental flows need to be balanced. Our results provide guidance for balancing exploration, uncertainty, and computational demands when using the EMODPS

  3. A security-awareness virtual machine management scheme based on Chinese wall policy in cloud computing.

    Science.gov (United States)

    Yu, Si; Gui, Xiaolin; Lin, Jiancai; Tian, Feng; Zhao, Jianqiang; Dai, Min

    2014-01-01

    Cloud computing gets increasing attention for its capacity to free developers from infrastructure management tasks. However, recent works reveal that side channel attacks can lead to privacy leakage in the cloud. Enhancing isolation between users is an effective way to eliminate such attacks. In this paper, we investigate an isolation enhancement scheme from the perspective of virtual machine (VM) management. The security-awareness VM management scheme (SVMS), a VM isolation enhancement scheme to defend against side channel attacks, is proposed. First, we use the aggressive conflict-of-interest relation (ACIR) and the aggressive in-ally-with relation (AIAR) to describe user constraint relations. Second, based on the Chinese wall policy, we put forward four isolation rules. Third, VM placement and migration algorithms are designed to enforce VM isolation between conflicting users. Finally, based on the normal distribution, we conduct a series of experiments to evaluate SVMS. The experimental results show that SVMS is efficient in guaranteeing isolation between VMs owned by conflicting users, while the resource utilization rate decreases only slightly.
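The Chinese wall intuition behind such placement rules can be shown with a toy allocator that refuses to co-locate VMs of users whose interests conflict. The conflict set, user names and greedy first-fit strategy below are all illustrative assumptions, not the paper's actual SVMS rules or algorithms.

```python
def place_vm(hosts, conflicts, owner):
    """Return the index of the first host with no conflicting tenant,
    or None if isolation cannot be guaranteed on any host."""
    for i, tenants in enumerate(hosts):
        if not any(frozenset((owner, t)) in conflicts for t in tenants):
            tenants.append(owner)
            return i
    return None

conflicts = {frozenset(("bankA", "bankB"))}  # competing users (hypothetical)
hosts = [[], []]                             # two empty physical hosts
print(place_vm(hosts, conflicts, "bankA"))   # -> 0
print(place_vm(hosts, conflicts, "bankB"))   # -> 1 (kept off bankA's host)
print(place_vm(hosts, conflicts, "alice"))   # -> 0 (no conflict with bankA)
```

A migration algorithm would apply the same predicate when rebalancing: a VM may only move to a host where no co-tenant conflicts with its owner.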

  4. Computer Security: Coming soon, a pragmatic Data Protection Policy for an open Organisation

    CERN Multimedia

    Computer Security Team

    2014-01-01

    Like any other organisation/employer, CERN holds confidential data, e.g. medical records, personnel files, files on harassment cases, NDAs & contracts, credit card information, and even unpublished scientific results. Unfortunately, our current methods of handling such documents are inadequate owing to a lack of clarity with regard to responsibilities and obligations.   So, from time to time, some documents have become public that should not have (such as the premature publication of videos about the 2012 “Higgs” announcement); some of us have accidentally leaked confidential information (such as passwords used to access accelerator and experiment control systems in 2011); other colleagues have lost their laptops or had them stolen (e.g. from a delegation on duty travel in 2013) along with the e-mails and private files saved on them. Fortunately, these times of inadvertent data loss and lack of clarity concerning our obligations should soon...

  5. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth, since it offers more conservative and effective root-canal preparation compared to traditional rotary systems. However, no study had evaluated the use of the SAF in primary teeth. The aim of this study was to evaluate and compare the SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amount of dentin removed, using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for K file, .04/30 for Profile, and 2 mm thickness, size 25 for SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. The amount of dentin removed was measured on the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data were statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal than rotary instrumentation (Profile and SAF), while the SAF system removed significantly less dentin than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (p<0.05). Within the experimental conditions of the present study, the SAF seems a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less

  6. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU) to #25/0.08, following the recommended protocols. Time for root canal instrumentation (the accumulated time of every single file) was recorded. The 0-3 mm of root surface from the apex was observed under an optical stereomicroscope at 25× magnification and the presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional images of the canals were reconstructed, calculated and evaluated, and the amount of canal central transportation in the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). Canal transportation was also lower in the TF group [… vs. (0.097 ± 0.084) mm, P<0.05]. No instrument separation was observed in either group. Cracks were not found in either group, whether on micro-CT images or under the optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability and less canal transportation.

  7. Flat Files - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data file File name: jsnp_flat_files File URL: ftp://ftp.biosciencedbc.jp/archiv... The database description, download license and update history are available from the JSNP entry in the LSDB Archive.

  8. Assessing the performance of a computer-based policy model of HIV and AIDS.

    Science.gov (United States)

    Rydzak, Chara E; Cotich, Kara L; Sax, Paul E; Hsu, Heather E; Wang, Bingxia; Losina, Elena; Freedberg, Kenneth A; Weinstein, Milton C; Goldie, Sue J

    2010-09-09

    Model-based analyses, conducted within a decision analytic framework, provide a systematic way to combine information about the natural history of disease and effectiveness of clinical management strategies with demographic and epidemiological characteristics of the population. Challenges of disease-specific modeling include the need to identify influential assumptions and to assess the face validity and internal consistency of the model. We describe a series of exercises involved in adapting a computer-based simulation model of HIV disease to the Women's Interagency HIV Study (WIHS) cohort and assess model performance as we re-parameterized the model to address policy questions in the U.S. relevant to HIV-infected women using data from the WIHS. Empiric calibration targets included 24-month survival curves stratified by treatment status and CD4 cell count. The most influential assumptions in untreated women included chronic HIV-associated mortality following an opportunistic infection, and in treated women, the 'clinical effectiveness' of HAART and the ability of HAART to prevent HIV complications independent of virologic suppression. Good-fitting parameter sets required reductions in the clinical effectiveness of 1st and 2nd line HAART and improvements in 3rd and 4th line regimens. Projected rates of treatment regimen switching using the calibrated cohort-specific model closely approximated independent analyses published using data from the WIHS. The model demonstrated good internal consistency and face validity, and supported cohort heterogeneities that have been reported in the literature. Iterative assessment of model performance can provide information about the relative influence of uncertain assumptions and provide insight into heterogeneities within and between cohorts. Description of calibration exercises can enhance the transparency of disease-specific models.

  9. Assessing the performance of a computer-based policy model of HIV and AIDS.

    Directory of Open Access Journals (Sweden)

    Chara E Rydzak

    2010-09-01

    Full Text Available Model-based analyses, conducted within a decision analytic framework, provide a systematic way to combine information about the natural history of disease and effectiveness of clinical management strategies with demographic and epidemiological characteristics of the population. Challenges of disease-specific modeling include the need to identify influential assumptions and to assess the face validity and internal consistency of the model. We describe a series of exercises involved in adapting a computer-based simulation model of HIV disease to the Women's Interagency HIV Study (WIHS) cohort and assess model performance as we re-parameterized the model to address policy questions in the U.S. relevant to HIV-infected women using data from the WIHS. Empiric calibration targets included 24-month survival curves stratified by treatment status and CD4 cell count. The most influential assumptions in untreated women included chronic HIV-associated mortality following an opportunistic infection, and in treated women, the 'clinical effectiveness' of HAART and the ability of HAART to prevent HIV complications independent of virologic suppression. Good-fitting parameter sets required reductions in the clinical effectiveness of 1st and 2nd line HAART and improvements in 3rd and 4th line regimens. Projected rates of treatment regimen switching using the calibrated cohort-specific model closely approximated independent analyses published using data from the WIHS. The model demonstrated good internal consistency and face validity, and supported cohort heterogeneities that have been reported in the literature. Iterative assessment of model performance can provide information about the relative influence of uncertain assumptions and provide insight into heterogeneities within and between cohorts. Description of calibration exercises can enhance the transparency of disease-specific models.

  10. Pilot Study: Impact of Computer Simulation on Students' Economic Policy Performance. Pilot Study.

    Science.gov (United States)

    Domazlicky, Bruce; France, Judith

    Fiscal and monetary policies taught in macroeconomic principles courses are concepts that might require both lecture and simulation methods. The simulation models, which apply principles gleaned from comparative statics to a dynamic world, may give students an appreciation of the problems facing policy makers. This paper is a report of a…

  11. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
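The central statistic in such an analysis is the interreference interval: the gap, in days, between successive uses of the same file. A minimal sketch of its computation (with an invented reference log; not the SLAC analysis code) might look like:

```python
from collections import defaultdict
from statistics import mean

def interreference_intervals(log):
    """log: chronologically ordered (day, file_id) reference events.
    Returns file_id -> list of gaps (in days) between successive uses."""
    last, gaps = {}, defaultdict(list)
    for day, file_id in log:
        if file_id in last:
            gaps[file_id].append(day - last[file_id])
        last[file_id] = day
    return gaps

log = [(1, "a"), (2, "b"), (3, "a"), (10, "a"), (40, "a")]
gaps = interreference_intervals(log)
print(gaps["a"])        # -> [2, 7, 30]: widening gaps, i.e. a declining reference rate
print(mean(gaps["a"]))  # -> 13
```

Statistics derived from these gaps (means, moments, skew) are the raw material for the migration algorithms developed in the companion paper.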

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  13. Computer Attack and Cyber Terrorism: Vulnerabilities and Policy Issues for Congress

    National Research Council Canada - National Science Library

    Wilson, Clay

    2003-01-01

    Persistent computer security vulnerabilities may expose U.S. critical infrastructure and government computer systems to possible cyber attack by terrorists, possibly affecting the economy or other areas of national security...

  14. The Impact of Fiscal Policy on Poverty in Ethiopia: A Computable ...

    African Journals Online (AJOL)

    Daniel

    study investigates the economy-wide impacts of these fiscal policy changes on poverty. ... of the World Bank and the International Monetary Fund in the 1990s and ..... factors of production (labor, land, livestock and capital), 7 institutions (an.

  15. Tobacco Town: Computational Modeling of Policy Options to Reduce Tobacco Retailer Density.

    Science.gov (United States)

    Luke, Douglas A; Hammond, Ross A; Combs, Todd; Sorg, Amy; Kasman, Matt; Mack-Crane, Austen; Ribisl, Kurt M; Henriksen, Lisa

    2017-05-01

    To identify the behavioral mechanisms and effects of tobacco control policies designed to reduce tobacco retailer density. We developed the Tobacco Town agent-based simulation model to examine 4 types of retailer reduction policies: (1) random retailer reduction, (2) restriction by type of retailer, (3) limiting proximity of retailers to schools, and (4) limiting proximity of retailers to each other. The model examined the effects of these policies alone and in combination across 4 different types of towns, defined by 2 levels of population density (urban vs suburban) and 2 levels of income (higher vs lower). Model results indicated that reduction of retailer density has the potential to decrease accessibility of tobacco products by driving up search and purchase costs. Policy effects varied by town type: proximity policies worked better in dense, urban towns whereas retailer type and random retailer reduction worked better in less-dense, suburban settings. Comprehensive retailer density reduction policies have excellent potential to reduce the public health burden of tobacco use in communities.

  16. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that the microcrack lengths were notably increased following root canal preparation. Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  17. Feedback control policies employed by people using intracortical brain-computer interfaces

    Science.gov (United States)

    Willett, Francis R.; Pandarinath, Chethan; Jarosiewicz, Beata; Murphy, Brian A.; Memberg, William D.; Blabe, Christine H.; Saab, Jad; Walter, Benjamin L.; Sweet, Jennifer A.; Miller, Jonathan P.; Henderson, Jaimie M.; Shenoy, Krishna V.; Simeral, John D.; Hochberg, Leigh R.; Kirsch, Robert F.; Bolu Ajiboye, A.

    2017-02-01

    Objective. When using an intracortical BCI (iBCI), users modulate their neural population activity to move an effector towards a target, stop accurately, and correct for movement errors. We call the rules that govern this modulation a ‘feedback control policy’. A better understanding of these policies may inform the design of higher-performing neural decoders. Approach. We studied how three participants in the BrainGate2 pilot clinical trial used an iBCI to control a cursor in a 2D target acquisition task. Participants used a velocity decoder with exponential smoothing dynamics. Through offline analyses, we characterized the users’ feedback control policies by modeling their neural activity as a function of cursor state and target position. We also tested whether users could adapt their policy to different decoder dynamics by varying the gain (speed scaling) and temporal smoothing parameters of the iBCI. Main results. We demonstrate that control policy assumptions made in previous studies do not fully describe the policies of our participants. To account for these discrepancies, we propose a new model that captures (1) how the user’s neural population activity gradually declines as the cursor approaches the target from afar, then decreases more sharply as the cursor comes into contact with the target, (2) how the user makes constant feedback corrections even when the cursor is on top of the target, and (3) how the user actively accounts for the cursor’s current velocity to avoid overshooting the target. Further, we show that users can adapt their control policy to decoder dynamics by attenuating neural modulation when the cursor gain is high and by damping the cursor velocity more strongly when the smoothing dynamics are high. Significance. Our control policy model may help to build better decoders, understand how neural activity varies during active iBCI control, and produce better simulations of closed-loop iBCI movements.
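The decoder dynamics described above (velocity control with exponential smoothing and a speed gain) can be sketched in a few lines. This is a hedged illustration: the `step` function, `gain` and `alpha` parameters are assumed names for exposition, not the BrainGate2 implementation.

```python
def step(decoded, prev_velocity, gain, alpha):
    """One cursor-velocity update with exponential smoothing:
    blend the previous velocity (weight alpha) with the newly
    decoded neural velocity scaled by a speed gain."""
    return alpha * prev_velocity + (1.0 - alpha) * gain * decoded

# Heavier smoothing (larger alpha) damps the cursor more strongly;
# a larger gain speeds it up, inviting overshoot near the target.
v = 0.0
for _ in range(50):
    v = step(decoded=1.0, prev_velocity=v, gain=2.0, alpha=0.9)
# under constant neural input, v converges toward gain * decoded
```

In this simplified picture, the user's reported adaptations correspond to attenuating `decoded` when `gain` is high and issuing stronger braking commands when `alpha` (smoothing) is high.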

  18. Evaluating the efficiency of divestiture policy in promoting competitiveness using an analytical method and agent-based computational economics

    Energy Technology Data Exchange (ETDEWEB)

    Rahimiyan, Morteza; Rajabi Mashhadi, Habib [Department of Electrical Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Mashhad (Iran)

    2010-03-15

    Choosing a desired policy for divestiture of dominant firms' generation assets has been a challenging task and open question for regulatory authority. To deal with this problem, in this paper, an analytical method and agent-based computational economics (ACE) approach are used for ex-ante analysis of divestiture policy in reducing market power. The analytical method is applied to solve a designed concentration boundary problem, even for situations where the cost data of generators are unknown. The concentration boundary problem is the problem of minimizing or maximizing market concentration subject to operation constraints of the electricity market. It is proved here that the market concentration corresponding to operation condition is certainly viable in an interval calculated by the analytical method. For situations where the cost function of generators is available, the ACE is used to model the electricity market. In ACE, each power producer's profit-maximization problem is solved by the computational approach of Q-learning. The power producer using the Q-learning method learns from past experiences to implicitly identify the market power, and find desired response in competing with the rivals. Both methods are applied in a multi-area power system and effects of different divestiture policies on market behavior are analyzed. (author)
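The Q-learning component of the ACE approach can be illustrated with a minimal single-state sketch. Everything below is an assumption for illustration: a fixed reward table stands in for market clearing, whereas the paper's producers learn over a far richer market state.

```python
import random

def q_learning(rewards, n_actions, episodes=2000, alpha=0.1, gamma=0.9, eps=0.1):
    """Tabular Q-learning for a one-state repeated game: each episode the
    producer picks a bid 'action', observes a profit from the (stand-in)
    market, and updates its action-value estimate."""
    q = [0.0] * n_actions
    for _ in range(episodes):
        # epsilon-greedy: mostly exploit the best-known bid, sometimes explore
        if random.random() < eps:
            a = random.randrange(n_actions)
        else:
            a = max(range(n_actions), key=q.__getitem__)
        r = rewards[a]
        # single-state update: Q(a) += alpha * (r + gamma * max Q - Q(a))
        q[a] += alpha * (r + gamma * max(q) - q[a])
    return q

random.seed(0)
q = q_learning(rewards=[1.0, 3.0, 2.0], n_actions=3)
best = max(range(3), key=q.__getitem__)  # learns the most profitable bid
```

The learned `best` action is the bid with the highest long-run profit, mirroring how the paper's producers implicitly identify market power from past experience.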

  19. Evaluating the efficiency of divestiture policy in promoting competitiveness using an analytical method and agent-based computational economics

    International Nuclear Information System (INIS)

    Rahimiyan, Morteza; Rajabi Mashhadi, Habib

    2010-01-01

    Choosing a desired policy for divestiture of dominant firms' generation assets has been a challenging task and open question for regulatory authority. To deal with this problem, in this paper, an analytical method and agent-based computational economics (ACE) approach are used for ex-ante analysis of divestiture policy in reducing market power. The analytical method is applied to solve a designed concentration boundary problem, even for situations where the cost data of generators are unknown. The concentration boundary problem is the problem of minimizing or maximizing market concentration subject to operation constraints of the electricity market. It is proved here that the market concentration corresponding to operation condition is certainly viable in an interval calculated by the analytical method. For situations where the cost function of generators is available, the ACE is used to model the electricity market. In ACE, each power producer's profit-maximization problem is solved by the computational approach of Q-learning. The power producer using the Q-learning method learns from past experiences to implicitly identify the market power, and find desired response in competing with the rivals. Both methods are applied in a multi-area power system and effects of different divestiture policies on market behavior are analyzed.

  20. States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17

    Science.gov (United States)

    Tilley-Coulson, Eve

    2016-01-01

    While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…

  1. 76 FR 28018 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-05-13

    ... tariff filing per 35.13(a)(2)(iii: Information Policy Revisions to be effective 6/20/ 2011. Filed Date... Interconnection, L.L.C. Description: PJM Interconnection, L.L.C. submits tariff filing per 35.13(a)(2)(iii: Queue... New Mexico submits tariff filing per 35.13(a)(2)(iii: PNM LGIP Filing to be effective 7/5/2011. Filed...

  2. File-System Workload on a Scientific Multiprocessor

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload of an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

  3. A Feasibility Study of Implementing a Bring-Your-Own-Computing-Device Policy

    Science.gov (United States)

    2013-12-01

    telecom charges is applicable to a corporate environment that allows for telecommuting or where employees require data access to their devices while...do not want to try to control their students’ computers, but the focus of BYOD in education is generally on educational outcomes (Sweeney, 2012). C...of the computer system, while application software is responsible for controlling the specific command tasks. Therefore, the relationship between

  4. In search of synergies between policy-based systems management and economic models for autonomic computing

    OpenAIRE

    Anthony, Richard

    2011-01-01

    Policy-based systems management (PBM) and economics-based systems management (EBM) are two of the many techniques available for implementing autonomic systems, each having specific benefits and limitations, and thus different applicability; choosing the most appropriate technique is the first of many challenges faced by the developer. This talk begins with a critical discussion of the general design goals of autonomic systems and the main issues involved with their development and deployme...

  5. 76 FR 59672 - Combined Notice of Filings (September 19, 2011)

    Science.gov (United States)

    2011-09-27

    ... Natural Gas Company, LLC. Description: Southern Natural Gas Company, L.L.C. submits tariff filing per 154... Policies. Filed Date: 09/14/2011. Accession Number: 20110914-5143. Comment Date: 5 p.m. Eastern Time on.... Description: Midcontinent Express Pipeline LLC submits tariff filing per 154.204: Filing to Remove Expired...

  6. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide path to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections on 360° have been acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of the maximum curvature level (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDRs) and the ratio of cross-sectional areas (RAs) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and of canal curvature factor as well as the interactions of the factors both with RDRs and RAs. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause less canal aberrations. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  7. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  8. Computing a constrained control policy for a single-server queueing system

    DEFF Research Database (Denmark)

    Larsen, Christian

    We consider a single-server queueing system designed to serve homogeneous jobs. The jobs arrive at the system according to a Poisson process and all processing times are deterministic. There is a set-up cost for starting up production and a holding cost rate is incurred for each job present. Also, there is a service cost per job, which is a convex function of the service time. The control policy specifies when the server is on or off. It also specifies the state-dependent processing times. In order to avoid a very detailed control policy (which could be hard to implement) we will only allow the server to use n different processing times. Hence, we must subdivide the infinite state space into n disjoint sets and for each set decide which processing time to use. We show how to derive a mathematical expression for the long-run average cost per time unit. We also present an algorithm to find the optimal...
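As a simplified illustration of the long-run average cost per time unit, consider a hypothetical special case that ignores the set-up decision and the n state-dependent processing times: an always-on server with a single deterministic service time (an M/D/1 queue), whose mean queue length follows the Pollaczek-Khinchine formula. The parameter names are assumptions, not the paper's notation.

```python
def average_cost_rate(lam, d, h, c):
    """Long-run average cost per time unit for an always-on M/D/1 queue:
    Poisson arrivals at rate lam, deterministic service time d,
    holding cost rate h per job present, service cost c per job.
    Mean number in system L uses the Pollaczek-Khinchine result."""
    rho = lam * d
    assert rho < 1, "queue must be stable"
    L = rho + rho * rho / (2 * (1 - rho))  # mean jobs in system (M/D/1)
    return h * L + lam * c                 # holding cost flow + service cost flow

cost = average_cost_rate(lam=0.8, d=1.0, h=2.0, c=3.0)
```

The paper's policy optimization can be thought of as trading this holding cost flow off against set-up costs and the convex service cost of running faster.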

  9. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  10. POLICIES FOR GREEN COMPUTING AND E-WASTE - THE ROMANIAN CASE -

    Directory of Open Access Journals (Sweden)

    STEGAROIU CARINA

    2014-12-01

    Full Text Available Computers today are an integral part of individuals’ lives all around the world; but unfortunately these devices are toxic to the environment given the materials used, their limited battery life and technological obsolescence. Individuals are concerned about the hazardous materials ever present in computers, even if the importance of various attributes differs, and that a more environment‐friendly attitude can be obtained through exposure to educational materials. In this paper, we aim to delineate the problem of e-waste in Romania and highlight a series of measures and the advantage they herald for our country and propose a series of action steps to develop in these areas further. It is possible for Romania to have an immediate economic stimulus and job creation while moving quickly to abide by the requirements of climate change legislation and energy efficiency directives. The costs of implementing energy efficiency and renewable energy measures are minimal as they are not cash expenditures but rather investments paid back by future, continuous energy savings.

  11. In-vitro Assessing the Shaping Ability of Three Nickel-Titanium Rotary Single File Systems by Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Ali Imad Al-Asadi

    2018-02-01

    Full Text Available The aim of the study was to evaluate the canal transportation and centering ability of three nickel-titanium single-file rotary systems by cone beam computed tomography (CBCT). Materials and methods: Thirty permanent maxillary first molars with mesiobuccal canal curvatures ranging from 20-30 degrees were selected and assigned into three groups (n=10), according to the biomechanical preparation system used: Hyflex EDM (HF), Reciproc blue (RB) and OneShape (OS). The samples were scanned by CBCT after being mounted on a customized acrylic base and then rescanned after the instrumentation. Slices from the axial section were taken from both exposures at 3 mm, 6 mm and 9 mm from the root apex corresponding to the apical, middle, and coronal third respectively. Data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests at the 5% confidence level. Results: The results showed that there were no significant differences at the apical and coronal third and a significant difference at the middle third regarding canal transportation. However, there was a significant difference at the apical third and no significant difference at the middle and coronal third regarding centering ratio. Conclusion: It was concluded that all three single-file rotary systems produced some degree of canal transportation and deviation from centered preparation, but the Hyflex EDM produced the least.

  12. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of ProTaper (PT) and Self-Adjusting File (SAF) system using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups ( n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t -test. The SAF showed better centering ability and lesser canal transportation than the PT only in the buccolingual plane at 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both the planes. The SAF had statistically significant better centering and lesser canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of both the systems was parallel.

  13. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte- and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the class room. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  14. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    File transfer protocol (FTP) can help users transfer files between computers on the internet. However, FTP cannot fulfill users' needs in special situations, so programmers must define their own file transfer protocols based on user requirements. The method of realization and application of a user-defined file transfer protocol is introduced. (authors)
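A user-defined file transfer protocol of the kind described can be sketched with simple length-prefixed framing over a socket. The framing layout and function names below are illustrative assumptions, not the authors' protocol:

```python
import socket
import struct
import threading

def send_file(conn, name, payload):
    """Send one file with a minimal self-defined framing:
    4-byte big-endian name length, name bytes,
    8-byte big-endian payload length, payload bytes."""
    nb = name.encode("utf-8")
    header = struct.pack(">I", len(nb)) + nb + struct.pack(">Q", len(payload))
    conn.sendall(header + payload)

def recv_exact(conn, n):
    """Read exactly n bytes, or raise if the peer closes early."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_file(conn):
    """Parse one framed file: name length, name, payload length, payload."""
    (nlen,) = struct.unpack(">I", recv_exact(conn, 4))
    name = recv_exact(conn, nlen).decode("utf-8")
    (plen,) = struct.unpack(">Q", recv_exact(conn, 8))
    return name, recv_exact(conn, plen)

# Loopback demonstration over a connected socket pair.
a, b = socket.socketpair()
sender = threading.Thread(target=send_file, args=(a, "report.txt", b"hello"))
sender.start()
name, data = recv_file(b)
sender.join()
a.close(); b.close()
```

Explicit length prefixes are what distinguish such a custom protocol from plain FTP streams: the receiver always knows exactly how many bytes belong to each file.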

  15. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
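The parity scheme described above can be illustrated with a small XOR sketch (hypothetical helper names, not Zebra's actual code): XOR-ing all fragments of a stripe yields a parity fragment, from which any single lost fragment can be rebuilt using the survivors.

```python
def parity(fragments):
    """XOR parity across equal-sized stripe fragments, RAID-style."""
    size = len(fragments[0])
    assert all(len(f) == size for f in fragments)
    p = bytearray(size)
    for frag in fragments:
        for i, byte in enumerate(frag):
            p[i] ^= byte
    return bytes(p)

def reconstruct(surviving, parity_frag):
    """Rebuild the one missing fragment: XOR of survivors and parity."""
    return parity(surviving + [parity_frag])

frags = [b"abcd", b"efgh", b"ijkl"]   # one stripe's fragments
p = parity(frags)                      # written to the parity server
# Suppose the server holding frags[1] fails:
rebuilt = reconstruct([frags[0], frags[2]], p)
```

Because Zebra writes whole stripe fragments from each client's log, the parity of a stripe can be computed once over freshly written data, avoiding the read-modify-write parity update that per-file striping would require.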

  16. Traces of the Book Vatican Policy in the Manuscript File of Ramón del Valle-Inclán

    Directory of Open Access Journals (Sweden)

    Amparo de Juan Bolufer

    2017-01-01

    Full Text Available In various interviews done in 1933 Valle-Inclán said he was preparing a book for the historic series El Ruedo Ibérico that would explore the influence of the Vatican in Spanish politics. This article seeks to examine the traces of this narrative plan that have been preserved in the manuscript file belonging to the Valle-Inclán Alsina family, because we know that the writer continued to work on this historical project until his death. After examining the indirect documentation located primarily in the periodical press which bears witness to the process of writing this never published story, we will describe the unknown plans of structure and the drafts of work that are part of this creation file and that allow us to perceive the plot lines as well as the fundamental features of characterization for the prominent protagonists in this incomplete project.

  17. Healthcare Policy Statement on the Utility of Coronary Computed Tomography for Evaluation of Cardiovascular Conditions and Preventive Healthcare: From the Health Policy Working Group of the Society of Cardiovascular Computed Tomography.

    Science.gov (United States)

    Slim, Ahmad M; Jerome, Scott; Blankstein, Ron; Weigold, Wm Guy; Patel, Amit R; Kalra, Dinesh K; Miller, Ryan; Branch, Kelley; Rabbat, Mark G; Hecht, Harvey; Nicol, Edward D; Villines, Todd C; Shaw, Leslee J

    The rising cost of healthcare is prompting numerous policy and advocacy discussions regarding strategies for constraining growth and creating a more efficient and effective healthcare system. Cardiovascular imaging is central to the care of patients at risk of, and living with, heart disease. Estimates are that utilization of cardiovascular imaging exceeds 20 million studies per year. The Society of Cardiovascular CT (SCCT), alongside Rush University Medical Center, and in collaboration with government agencies, regional payers, and industry healthcare experts met in November 2016 in Chicago, IL to evaluate obstacles and hurdles facing the cardiovascular imaging community and how they can contribute to efficacy while maintaining or even improving outcomes and quality. The summit incorporated inputs from payers, providers, and patients' perspectives, providing a platform for all voices to be heard, allowing for a constructive dialogue with potential solutions moving forward. This article outlines the proceedings from the summit, with a detailed review of past hurdles, current status, and potential solutions as we move forward in an ever-changing healthcare landscape. Copyright © 2017 Society of Cardiovascular Computed Tomography. All rights reserved.

  18. 48 CFR 750.7109-1 - Filing requests.

    Science.gov (United States)

    2010-10-01

    ... CONTRACT MANAGEMENT EXTRAORDINARY CONTRACTUAL ACTIONS Extraordinary Contractual Actions To Protect Foreign Policy Interests of the United States 750.7109-1 Filing requests. Any person (hereinafter called the...

  19. Comparing ProFile Vortex to ProTaper Next for the efficacy of removal of root filling material: An ex vivo micro-computed tomography study

    Directory of Open Access Journals (Sweden)

    Emad AlShwaimi

    2018-01-01

    Conclusion: Our findings suggest that PV is as effective as PTN for removal of root canal filling material. Therefore, PV can be considered for use in endodontic retreatment, although more effective files or techniques are still required.

  20. Protecting Files Hosted on Virtual Machines With Out-of-Guest Access Control

    Science.gov (United States)

    2017-12-01

    of the system call, we additionally check for a match on the newname. As enforced by our SACL, the first part ensures that if the user or group...file, as per the SACL-enforced policy. Figure 3.8 shows the code for the permission checks done in the case of the open() and openat() system calls... When an operating system (OS) runs on a virtual machine (VM), a hypervisor, the software that facilitates virtualization of computer

  1. 37 CFR 401.16 - Electronic filing.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Electronic filing. 401.16 Section 401.16 Patents, Trademarks, and Copyrights ASSISTANT SECRETARY FOR TECHNOLOGY POLICY, DEPARTMENT... GOVERNMENT GRANTS, CONTRACTS, AND COOPERATIVE AGREEMENTS § 401.16 Electronic filing. Unless otherwise...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  3. The evidence-policy divide: a 'critical computational linguistics' approach to the language of 18 health agency CEOs from 9 countries.

    Science.gov (United States)

    Bell, Erica; Seidel, Bastian M

    2012-10-30

    There is an emerging body of literature suggesting that the evidence-practice divide in health policy is complex and multi-factorial, but less is known about the processes by which health policy-makers use evidence and their views about the specific features of useful evidence. This study aimed to contribute to understandings of how the most influential health policy-makers view useful evidence, in ways that help explore and question how the evidence-policy divide is understood and what research might be supported to help overcome this divide. A purposeful sample of 18 national and state health agency CEOs from 9 countries was obtained. Participants were interviewed using open-ended questions that asked them to define specific features of useful evidence. The analysis involved two main approaches: 1) quantitative mapping of interview transcripts using Bayesian-based computational linguistics software, and 2) qualitative critical discourse analysis to explore the nuances of language extracts so identified. The decision-making, conclusions-oriented world of policy-making is constructed separately, but not exclusively, by policy-makers from the world of research. Research is not so much devalued by them as described as too technical, yet at the same time not methodologically complex enough to engage with localised policy-making contexts. It is not that policy-makers are negative about academics or universities; it is that they struggle to find complexity-oriented methodologies for understanding their stakeholder communities and improving systems. They did not describe themselves as having a more positive role in solving this challenge than academics. These interviews do not support simplistic definitions of policy-makers and researchers as coming from two irreconcilable worlds. 
They suggest that qualitative and quantitative research is valued by policy-makers but that to be policy-relevant health research may need to focus on building complexity-oriented research methods for

  4. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  5. XML Files

    Science.gov (United States)

    MedlinePlus XML Files (https://medlineplus.gov/xml.html): MedlinePlus produces XML data sets that you are welcome to download ...

  6. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. 12 CFR Appendix F to Part 360 - Customer File Structure

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Customer File Structure F Appendix F to Part... POLICY RESOLUTION AND RECEIVERSHIP RULES Pt. 360, App. F Appendix F to Part 360—Customer File Structure This is the structure of the data file to provide to the FDIC information related to each customer who...

  11. 76 FR 22390 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-04-21

    ...: Conforming Tariff Record--Exhibit 1D Billing Policy to be effective 5/1/2011. Filed Date: 04/12/2011.... Description: Entergy Arkansas, Inc. submits tariff filing per 35.17(b): West Memphis Corrected NITSA to be...: ER11-3333-000. Applicants: NV Energy, Inc. Description: NV Energy, Inc. submits tariff filing per 35.12...

  12. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculation of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form, together with the estimated results for 470 nuclides for which decay data are not available experimentally. (author)
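    The summation method mentioned above computes decay heat as the sum, over all fission-product nuclides, of each nuclide's activity multiplied by its average β- and γ-energy per decay. A minimal sketch, with purely hypothetical nuclide data and decay chains ignored:

```python
import math

# Hypothetical nuclide data: decay constant lam (1/s), average beta and
# gamma energies per decay (MeV), and initial inventory n0 (atoms).
nuclides = {
    "FP-A": {"lam": 1e-3, "e_beta": 0.8, "e_gamma": 0.5, "n0": 1e20},
    "FP-B": {"lam": 5e-5, "e_beta": 1.2, "e_gamma": 0.9, "n0": 5e19},
}

def decay_heat(t):
    # Summation method: total heat is the sum over nuclides of
    # activity x (mean energy released per decay), ignoring chains.
    mev_to_watt = 1.602e-13
    total = 0.0
    for d in nuclides.values():
        activity = d["lam"] * d["n0"] * math.exp(-d["lam"] * t)  # decays/s
        total += activity * (d["e_beta"] + d["e_gamma"]) * mev_to_watt
    return total  # watts
```

    A real calculation would sum over all 470+ nuclides tabulated in the file and account for parent-daughter build-up along decay chains, which this sketch deliberately omits.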

  13. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root-canal-filling. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). Then, the teeth were rescanned, and the percentage volume of the residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file = TF Adaptive > ProTaper UR = Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed by using Reciproc and ProTaper UR instruments than TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  17. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, the file that needed to be presented as digital evidence was deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence approach that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of a file fragment for three data types.
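    The abstract does not give the algorithmic details, but the core of any LCS-based matcher is the standard dynamic program over two byte sequences. A minimal sketch, with a hypothetical `classify_fragment` helper that scores a fragment against per-type training signatures (the paper's actual training/testing/validation procedure is not reproduced here):

```python
def lcs_length(a: bytes, b: bytes) -> int:
    # Classic O(len(a) * len(b)) dynamic program, kept to one row of state.
    prev = [0] * (len(b) + 1)
    for x in a:
        curr = [0]
        for j, y in enumerate(b, 1):
            curr.append(prev[j - 1] + 1 if x == y else max(prev[j], curr[j - 1]))
        prev = curr
    return prev[-1]

def classify_fragment(fragment: bytes, signatures: dict) -> str:
    # Pick the type whose (hypothetical) training signature shares the
    # longest common subsequence with the fragment, normalised by length.
    score = lambda sig: lcs_length(fragment, sig) / max(len(sig), 1)
    return max(signatures, key=lambda t: score(signatures[t]))
```

    The `signatures` dictionary stands in for the training step: one representative byte sequence per file type, against which an unknown fragment is scored.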

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  3. Flexibility and Performance of Parallel File Systems

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1996-01-01

    As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (APIs). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
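    A strided read interface, as mentioned above, lets an application fetch regularly spaced records in a single call instead of issuing many small reads. A minimal Python sketch of the idea (a simplification; the paper's actual interfaces are not reproduced here):

```python
import io

def strided_read(f, offset: int, record_size: int, stride: int, count: int) -> bytes:
    # Fetch `count` records of `record_size` bytes each, starting at
    # `offset`, with record starts spaced `stride` bytes apart.
    parts = []
    for i in range(count):
        f.seek(offset + i * stride)
        parts.append(f.read(record_size))
    return b"".join(parts)

# Example: extract the first byte of each 3-byte record.
buf = io.BytesIO(b"aXXbXXcXX")
column = strided_read(buf, 0, 1, 3, 3)  # b"abc"
```

    In a real parallel file system the point of such an interface is that the I/O nodes can service the whole strided request at once, rather than paying per-call overhead for each record.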

  4. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation, and usage of VDS, which is used to manage large files effectively on the Cyber computer system. The purpose of the VDS installation is to increase the virtual disk storage by utilizing magnetic tape, to assist the users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  5. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  6. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been at a lower level as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. Policy Analysis Implications of a Model to Improve the Delivery of Financial Aid to Disadvantaged Students. AIR 1983 Annual Forum Paper.

    Science.gov (United States)

    Fenske, Robert H.; Porter, John D.

    The role of institutional research in policy analysis regarding the operation of a computer model for delivery of financial aid to disadvantaged students is considered. A student financial aid model at Arizona State University is designed to develop a profile of late appliers for aid funds and also those who file inaccurate or incomplete…

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  14. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, the data on all the reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The range of application extends widely, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α,n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and gas production cross section file have already been completed. For the others, the expected time of completion is given. When these files are completed, they will be made open to the public. (K.I.)

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  16. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using spectrographs. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrographs of audio files after altering them, to compare the alterations against the spectrographs of the original files, and to check for similarities and differences between MP3 and WAV. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV audio file by cut-copy, the file was opened in Audacity and a different audio segment was pasted into it; the resulting file was analyzed to view the differences. The noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original file were analyzed. After the necessary changes were made via the dialog-box parameters, the edited audio file was opened in the software Spek, which after analysis produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to see the alterations.

  17. Verification of Security Policy Enforcement in Enterprise Systems

    Science.gov (United States)

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.

  18. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.

  19. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software necessary for measuring file-system performance was installed on this machine. Based on the results, conclusions are drawn and recommendations are proposed for the use of the file systems, identifying the best ways to store data.

  20. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
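    The sampling idea behind PFFF can be illustrated with a short sketch: hash the file size plus a fixed number of randomly chosen chunks instead of the whole file, so I/O cost stays flat regardless of file size. This is a minimal illustration of the approach, not the actual pfff implementation; the function name and parameters are invented.

```python
import hashlib
import os
import random

def sampled_fingerprint(path, n_samples=1024, chunk=64, seed=0):
    """Fingerprint a file by hashing its size plus a fixed number of
    randomly sampled chunks, instead of reading the file in full.
    Illustrative sketch only -- not the actual pfff implementation."""
    size = os.path.getsize(path)
    h = hashlib.sha256(str(size).encode())
    rng = random.Random(seed)  # fixed seed so all parties sample identical offsets
    with open(path, "rb") as f:
        for _ in range(n_samples):
            offset = rng.randrange(max(size - chunk, 1)) if size > chunk else 0
            f.seek(offset)
            h.update(f.read(chunk))
    return h.hexdigest()
```

    Because both sides derive the sample offsets from the same seed and file size, identical files always agree; a change that falls outside every sampled chunk can go undetected, which is why the method is probabilistic and its miss risk has to be bounded.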

  1. The rice growth image files - The Rice Growth Monitoring for The Phenotypic Functional Analysis | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us The Rice Growth Monitoring for The Phenotypic Functional Analysis The rice growth image files Data detail Data name The rice growth image files DOI 10.18908/lsdba.nbdc00945-004 Description of data contents The rice growth image files categorized based on file size. Data file File name: image files (directory) File URL: ftp://ftp.biosciencedbc.jp/archive/agritogo-rice-phenome/LATEST/image... Site Policy | Contact Us The rice growth image files - The Rice Growth Monitoring for The Phenotypic Functional Analysis | LSDB Archive ...

  2. Download this PDF file

    African Journals Online (AJOL)

    Administrator

    health, adolescents, Family life education (FLE), ABC approach, abstinence, curriculum, Youths,. Teenagers ... They are seen as children and divulging .... opponents say it should be the indicator of behavior ... climate for policies and laws on adolescent health. .... have computer laboratories that students can use to.

  3. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  4. Online file sharing innovations in media consumption

    CERN Document Server

    Andersson Schwarz, Jonas

    2013-01-01

    It is apparent that file sharing on the Internet has become an emerging norm of media consumption, especially among young people. This book provides a critical perspective on this phenomenon, exploring issues related to file sharing, downloading, peer-to-peer networks, "piracy," and (not least) policy issues regarding these practices. Andersson Schwarz critically engages with the justificatory discourses of the actual file-sharers, taking Sweden as a geographic focus. By focusing on the example of Sweden, home to both The Pirate Bay and Spotify, he provides a unique insight into a mentality th

  5. Router Agent Technology for Policy-Based Network Management

    Science.gov (United States)

    Chow, Edward T.; Sudhir, Gurusham; Chang, Hsin-Ping; James, Mark; Liu, Yih-Chiao J.; Chiang, Winston

    2011-01-01

    This innovation can be run as a standalone network application on any computer in a networked environment. This design can be configured to control one or more routers (one instance per router), and can also be configured to listen to a policy server over the network to receive new policies based on policy-based network management technology. The Router Agent Technology transforms the received policies into suitable Access Control List syntax for the routers it is configured to control. It commits the newly generated access control lists to the routers and provides feedback regarding any errors encountered. The innovation also automatically generates a time-stamped log file recording all updates to the router it is configured to control. This technology, once installed on a local network computer and started, is autonomous: it keeps listening for new policies from the policy server, transforms those policies into router-compliant access lists, and commits those access lists to a specified interface on the specified router on the network, with error feedback on the commit process. The stand-alone application is named RouterAgent and is currently realized as a fully functional (version 1) implementation for the Windows operating system and for CISCO routers.
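    The policy-to-ACL transformation the agent performs can be sketched as follows. The rule schema and function below are hypothetical illustrations; real policies and Cisco ACL syntax carry many more options (wildcard masks, port ranges, logging, and so on).

```python
def policy_to_acl(acl_name, rules):
    """Render abstract policy rules as Cisco-style extended ACL lines.
    The rule schema is a hypothetical, simplified illustration."""
    lines = [f"ip access-list extended {acl_name}"]
    for r in rules:
        lines.append(
            f" {r['action']} {r['protocol']} {r['src']} {r['dst']} eq {r['port']}"
        )
    return lines

# Invented example policy: allow HTTPS from one subnet, block telnet.
rules = [
    {"action": "permit", "protocol": "tcp",
     "src": "10.0.0.0 0.0.0.255", "dst": "any", "port": 443},
    {"action": "deny", "protocol": "tcp",
     "src": "any", "dst": "any", "port": 23},
]
acl = policy_to_acl("POLICY-IN", rules)  # lines ready to commit to a router
```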

  6. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  7. Numerical computation of inventory policies, based on the EOQ/sigma-x value for order-point systems

    DEFF Research Database (Denmark)

    Alstrøm, Poul

    2001-01-01

    This paper examines the numerical computation of two control parameters, order size and order point in the well-known inventory control model, an (s,Q)system with a beta safety strategy. The aim of the paper is to show that the EOQ/sigma-x value is both sufficient for controlling the system and e...

  8. Numerical computation of inventory policies, based on the EOQ/sigma-x value for order-point systems

    DEFF Research Database (Denmark)

    Alstrøm, Poul

    2000-01-01

    This paper examines the numerical computation of two control parameters, order size and order point in the well-known inventory control model, an (s,Q)system with a beta safety strategy. The aim of the paper is to show that the EOQ/sigma-x value is both sufficient for controlling the system and e...

  9. The International Computer and Information Literacy Study (ICILS): Main Findings and Implications for Education Policies in Europe

    Science.gov (United States)

    European Commission, 2014

    2014-01-01

    The 2013 European Commission Communication on Opening up Education underlined the importance of solid evidence to assess developments and take full advantage of the impact of technology on education, and called for sustained effort and international cooperation to improve our knowledge-base in this area. The International Computer and Information…

  10. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  11. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
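    The partitioning scheme can be pictured with a toy stand-in: each compute node owns one partition of the metadata, selected by hashing the key. Plain in-process dicts replace the MPI-connected MDHIM stores here, so this sketches the data layout only; all names and the key/record shapes are invented.

```python
from dataclasses import dataclass

@dataclass
class SubFileMeta:
    logical_offset: int   # where this sub-file's bytes sit in the shared file
    length: int
    storage_object: str   # object-storage location of the sub-file's data

class PartitionedMetaStore:
    """Toy stand-in for a partitioned key-value metadata store: each
    'node' owns one partition, chosen by hashing the metadata key.
    Real PLFS uses MDHIM over a message passing interface."""
    def __init__(self, n_partitions):
        self.partitions = [dict() for _ in range(n_partitions)]

    def _owner(self, key):
        return hash(key) % len(self.partitions)  # which partition holds this key

    def put(self, key, meta):
        self.partitions[self._owner(key)][key] = meta

    def get(self, key):
        return self.partitions[self._owner(key)].get(key)

store = PartitionedMetaStore(n_partitions=4)
store.put(("shared.out", 0), SubFileMeta(0, 4096, "oss1/obj17"))
store.put(("shared.out", 1), SubFileMeta(4096, 4096, "oss2/obj03"))
```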

  12. Computer Games as a Tool for Implementation of Memory Policy (on the Example of Displaying Events of The Great Patriotic War in Video Games

    Directory of Open Access Journals (Sweden)

    Сергей Игоревич Белов

    2018-12-01

    Full Text Available The presented work is devoted to the study of the practice of using computer games as a tool of memory policy. The relevance of this study stems both from the growing importance of video games as a means of forming ideas about events of the past and from the low degree to which this topic has been studied. The goal of the research is to identify the prospects for using computer games as an instrument for implementing memory policy, taking the events of the Great Patriotic War as a case. The empirical base of the work was formed by generalizing the content of the video games “Call of Duty 1”, “Call of Duty 14: WWII”, “Company of Heroes 2” and “Commandos 3: Destination Berlin”. The methodological base of the research draws on elements of descriptive political analysis, B.F. Skinner's theory of operant conditioning, and the social identity theory of H. Tajfel and J. Turner. The author concludes that familiarizing users with these games contributes to consolidating, in users' minds, negative stereotypes regarding the participation of the Red Army in the Great Patriotic War. The integration of negative images is carried out using the methods of operant conditioning. The integration of a system of negative images into the mass consciousness of the inhabitants of the post-Soviet space makes it difficult to preserve the remnants of Soviet political symbols and the elements of identity constructed on their basis. The author puts forward the hypothesis that in the case of complete desovietization of the public policy space in the states that emerged from the collapse of the USSR, the task of revising the history of the Great Patriotic War will be greatly facilitated, and with the subsequent departure from life of the last eyewitnesses of the relevant events, achieving this goal will be only a

  13. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    classes of nodes that users access: Login Nodes: Peregrine has four login nodes, each of which has Intel E5... /scratch file systems; the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592

  14. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  15. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes the basics of graphics knowledge and the understanding and implementation of graphic file formats. The first part deals with graphic data, the storage of graphic data and the compression of data, and with programming topics such as assembly, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file, and TIFF file, hardware considerations such as mono screen drivers and high-speed color screen drivers, the basic concept of dithering, and conversion between formats.

  16. 76 FR 72993 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed...

    Science.gov (United States)

    2011-11-28

    ... Policies are also available on OPRA's Web site.) Footnote 3 of the Policy notes that a disaster would... disaster recovery sites. Accordingly, OPRA will implement the Policies upon filing with the Commission. \\6... a Policy Named ``Policy With Respect to Disaster Recovery Facilities'' November 21, 2011. Pursuant...

  17. Image files - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data file File URL: ftp://ftp.biosciencedbc.jp/archive/rpd/LATEST/rpd_gel_image.zip File size: 38.5 MB Image files - RPD | LSDB Archive ...

  18. 76 FR 61956 - Electronic Tariff Filing System (ETFS)

    Science.gov (United States)

    2011-10-06

    ...] Electronic Tariff Filing System (ETFS) AGENCY: Federal Communications Commission. ACTION: Final rule... with the Commission's Electronic Tariff Filing System (ETFS), Report and Order (Order). This notice is...: Pamela Arluk, Pricing Policy Division, Wireline Competition Bureau, at (202) 418-1520, or email: pamela...

  19. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Full Text Available Video files are files that store motion pictures and sound as they occur in real life. In today's world, the need for automated processing of the information in video files is increasing. Automated processing of such information has a wide range of applications, including office/home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, the detection and tracking of object movement in video files plays an important role. This article describes methods of detecting objects in video files. This problem in the field of computer vision is being studied worldwide today.

  20. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users plus much more data. Files can be accessed from anywhere, via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN, it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. This happened in the past, at least, to the Andrew File System (AFS, the Linux equivalent of DFS) and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed recently to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  1. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  2. 78 FR 64294 - Loan Guaranty: Mandatory Electronic Delivery of Loan Files for Review

    Science.gov (United States)

    2013-10-28

    ... DEPARTMENT OF VETERANS AFFAIRS Loan Guaranty: Mandatory Electronic Delivery of Loan Files for... Affairs (VA) Loan Guaranty Service (LGY) announces a new policy with regard to lender submission of VA- guaranteed closed loan files for review. Currently, lenders can submit loan files selected for review by LGY...

  3. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  4. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using the radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index, and the accuracy of the index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. Results are presented. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user, by entering the findings of an unknown case, to review cases with the correct diagnosis
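    The retrieval step can be sketched as follows, using made-up binary finding vectors and Hamming distance as the similarity measure (the study's exact feature encoding is not given in the abstract):

```python
def hamming_neighbors(query, cases, k=5):
    """Rank teaching-file cases by Hamming distance between binary
    radiographic-finding vectors; return the k closest. Hypothetical
    sketch -- the study's actual feature encoding is not specified."""
    def dist(a, b):
        return sum(x != y for x, y in zip(a, b))
    return sorted(cases, key=lambda c: dist(query, c["findings"]))[:k]

# Each case: a diagnosis plus presence/absence flags for findings
# (erosions, osteophytes, ... -- invented for illustration).
cases = [
    {"diagnosis": "rheumatoid arthritis", "findings": [1, 1, 0, 0, 1]},
    {"diagnosis": "gout",                 "findings": [0, 1, 1, 0, 0]},
    {"diagnosis": "psoriatic arthritis",  "findings": [1, 0, 1, 1, 0]},
]
unknown = [1, 1, 0, 0, 0]
best = hamming_neighbors(unknown, cases, k=1)[0]  # closest teaching case
```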

  6. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the JENDL special purpose files. The problems of using the ENSDF file as the primary data source for the JENDL decay data file are presented. (author)

  7. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  8. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The distribution of files and the continuity of file references are closely approximated by a memory retention function; a density function based on this memory retention function is defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands across the respective file devices, is presented. (4) A simple, practical, effective method of automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)

  9. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides for easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  10. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts two text files as input and produces a listing of the differences in pseudo-update form. IFCOMP is very useful for monitoring changes made to software at the source code level.
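    IFCOMP itself targets IBM OS/VS, but the same kind of difference listing can be sketched with Python's standard difflib module; the source lines below are invented:

```python
import difflib

# Two versions of an invented source file.
old = ["MOVE A TO B", "ADD 1 TO COUNT", "DISPLAY COUNT"]
new = ["MOVE A TO B", "ADD 2 TO COUNT", "DISPLAY COUNT", "STOP RUN"]

# unified_diff plays the role of IFCOMP's pseudo-update listing:
# '-' lines were removed from the old file, '+' lines were added.
diff = list(difflib.unified_diff(old, new, "old.src", "new.src", lineterm=""))
for line in diff:
    print(line)
```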

  11. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
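    As a concrete illustration of what a generated recognizer checks for a chunk-based format, here is a minimal parser for a made-up tag/length/payload layout (the format and names are hypothetical, not one of the paper's grammars):

```python
import struct

def parse_chunks(data):
    """Parse a toy chunk-based binary format: each chunk is a 4-byte
    ASCII tag, a 4-byte little-endian payload length, then the payload.
    Hypothetical format, sketching what a generated recognizer checks."""
    chunks, pos = [], 0
    while pos < len(data):
        if pos + 8 > len(data):
            raise ValueError("truncated chunk header")
        tag, length = struct.unpack_from("<4sI", data, pos)
        pos += 8
        if pos + length > len(data):
            raise ValueError("payload runs past end of file")
        chunks.append((tag.decode("ascii"), data[pos:pos + length]))
        pos += length
    return chunks

# Build a two-chunk file and validate it.
blob = struct.pack("<4sI", b"HEAD", 2) + b"\x01\x02" \
     + struct.pack("<4sI", b"DATA", 3) + b"abc"
assert parse_chunks(blob) == [("HEAD", b"\x01\x02"), ("DATA", b"abc")]
```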

  12. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  13. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage space and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can routinely be compressed by a factor of 5. (orig.)
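    The paper's algorithm is not reproduced in the abstract; as a hedged illustration of the same goal, a lossless scheme that exploits a property typical of nuclear spectra (long runs of empty channels) can be sketched with zero run-length encoding:

```python
def rle_zeros(counts):
    """Losslessly compress a channel-count spectrum by collapsing runs
    of zero channels into (0, run_length) pairs. A simple sketch in the
    spirit of the paper; not the authors' actual algorithm."""
    out, i = [], 0
    while i < len(counts):
        if counts[i] == 0:
            j = i
            while j < len(counts) and counts[j] == 0:
                j += 1
            out.append((0, j - i))   # a run of (j - i) empty channels
            i = j
        else:
            out.append((counts[i], 1))
            i += 1
    return out

def rle_expand(pairs):
    """Exact inverse: reconstruct the original spectrum."""
    counts = []
    for value, run in pairs:
        counts.extend([value] * run)
    return counts

spectrum = [0, 0, 0, 12, 57, 3, 0, 0, 0, 0, 1]
packed = rle_zeros(spectrum)
assert rle_expand(packed) == spectrum   # no information is lost
```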

  14. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. 
It makes it possible to fill the current gap in standardized implant and instrument data management.

  15. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, are based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called the LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Thanks to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved, as well as a reduction of up to 50% in the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum, and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part of, and beyond, the development of AliEn version 2.19.
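
    The signed status message idea (the storage system, not the client, vouches for a file's size and checksum) can be sketched with a keyed signature. The following is a minimal illustration using HMAC; the names, shared-key setup, and message format are hypothetical and are not AliEn's actual protocol.

    ```python
    import hashlib
    import hmac
    import json

    # Hypothetical shared secret between a storage element and the catalogue.
    SECRET = b"storage-element-shared-key"

    def sign_status(lfn: str, size: int, checksum: str) -> dict:
        """Storage side: report a file's size and checksum, signed with HMAC."""
        payload = json.dumps({"lfn": lfn, "md5": checksum, "size": size},
                             sort_keys=True).encode()
        tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        return {"payload": payload.decode(), "tag": tag}

    def verify_status(msg: dict) -> dict:
        """Catalogue side: accept the reported size/checksum only if the
        signature verifies, so the protocol does not depend on client trust."""
        expected = hmac.new(SECRET, msg["payload"].encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, msg["tag"]):
            raise ValueError("untrusted status message")
        return json.loads(msg["payload"])
    ```

    A client that tampers with the reported size invalidates the tag, so the catalogue rejects the message.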

  16. 18 CFR 281.204 - Tariff filing requirements.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Tariff filing... COMMISSION, DEPARTMENT OF ENERGY OTHER REGULATIONS UNDER THE NATURAL GAS POLICY ACT OF 1978 AND RELATED AUTHORITIES NATURAL GAS CURTAILMENT UNDER THE NATURAL GAS POLICY ACT OF 1978 Permanent Curtailment Rule § 281...

  17. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access file. A description of the matrices written on these files is contained herein.

  18. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. "pcircle" builds on ubiquitous MPI in the cluster computing environment and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
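
    The work-stealing pattern can be illustrated on a single node with threads: each worker owns a task deque and, when its own runs dry, steals from the tail of a peer's deque. This is a toy sketch of the pattern only; pcircle itself distributes work across MPI ranks, and the function names here are invented.

    ```python
    import hashlib
    import threading
    from collections import deque

    def parallel_checksums(blobs, n_workers=4):
        # Each worker owns a deque of (index, data) tasks; an idle worker
        # steals from the tail of a peer's deque (single-node illustration,
        # not pcircle's MPI implementation).
        queues = [deque() for _ in range(n_workers)]
        for i, blob in enumerate(blobs):
            queues[i % n_workers].append((i, blob))

        results = {}
        lock = threading.Lock()  # guards the deques and the result dict

        def worker(me):
            while True:
                with lock:
                    if queues[me]:
                        task = queues[me].popleft()  # own queue: take from head
                    else:
                        # steal from the tail of the first non-empty peer queue
                        task = next((q.pop() for q in queues if q), None)
                if task is None:
                    return  # every queue is empty: all work done
                idx, data = task
                digest = hashlib.md5(data).hexdigest()
                with lock:
                    results[idx] = digest

        threads = [threading.Thread(target=worker, args=(w,))
                   for w in range(n_workers)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return [results[i] for i in range(len(blobs))]
    ```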

  19. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology creating the need for world wide sharing of an underlying hierarchical file-system. The Andrew file system (AFS) is a well known and widely spread global distributed file-system. Multiple-resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS therefore may be migrated on secondary storage, such as roboted tape libraries. MR-AFS is in use at IPP for the current experiments and data originating from super-computer applications. Experiences and scalability issues are discussed

  20. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  1. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  2. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field format, BCD data into well-defined binary files and providing for printing and punching data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables

  3. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key......

  4. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management, and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  5. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that HDF4 file content maps can be used to efficiently organize data in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require HDF4 APIs to access the data.
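
    The content-map idea is simply that byte offsets recorded once let a reader pull a single dataset out of an object with a range request, with no HDF4 library needed. Below is a toy sketch under that assumption; the real map format is defined by the HDF4 mapping project's tools, and the function names here are invented.

    ```python
    def pack_with_map(datasets):
        """Concatenate datasets into one object and record each dataset's
        byte offset and length, which is the role a file content map plays."""
        blob, cmap, off = b"", {}, 0
        for name, data in datasets.items():
            cmap[name] = (off, len(data))
            blob += data
            off += len(data)
        return blob, cmap

    def range_get(blob, cmap, name):
        """Fetch one dataset via its map entry, as a cloud object-store
        range-GET would, without touching any HDF4 API."""
        off, length = cmap[name]
        return blob[off:off + length]
    ```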

  6. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  7. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  8. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  9. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstraction and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in (unstructured) file systems can only optimize to the extent that file system interfaces permit, and the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  10. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap in access times between the memory and the disk. To address this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks that significantly improve the accuracy, success-per-reference, and effective-success-rate-per-reference when properly tuned. In particular, we verified that incorrect predictions were reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to further improve the misprediction rate and effective-success-rate-per-reference over the standard configuration. Simulations on distributed file system (DFS) traces reveal that an exact-fit radial basis function (RBF) network gives better predictions in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms it in systems with good computational capability. Probabilistic and competitive predictors are the most suitable for workstations with limited resources, and the former is more efficient than the latter for servers handling the most system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better file prediction success rate than the simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
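
    For reference, the "last successor" baseline that such studies compare against is tiny: predict that the file which followed A last time will follow A again. A sketch with hit/miss counting follows; this is the classic baseline from the file-prediction literature, not the paper's neural-network model, and the function name is invented.

    ```python
    def last_successor_stats(trace):
        """Predict each access from the previous file's last-seen successor;
        return (hits, misses) over accesses where a prediction existed."""
        successor = {}
        hits = misses = 0
        for prev, cur in zip(trace, trace[1:]):
            if prev in successor:
                if successor[prev] == cur:
                    hits += 1
                else:
                    misses += 1
            successor[prev] = cur  # remember the newest successor of `prev`
        return hits, misses
    ```

    On the trace a, b, a, b, a, c the predictor is right twice (b follows a, a follows b) and wrong once (c follows a).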

  11. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  12. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  13. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  14. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid in the creation of new radiology teaching files by analysing many aspects of the present radiology teaching file sites on the Internet and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites: (1) Country distribution was highest in the USA (57.5%). (2) The average number of cases was 186, and 9 sites (22.5%) had a search engine. (3) Regarding the method of case arrangement, the anatomic area type and the diagnosis type were each found at 10 sites (25%), and the question-and-answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites covering oro-maxillofacial disorders numbered 9 (22.5%). (5) Regarding the image format, the GIF format was found at 14 sites (35%) and the JPEG format at 14 sites (35%). (6) The creation year was most commonly 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images in the radiology teaching files: (1) The average file size of the GIF format (71 Kbyte) was greater than that of the JPEG format (24 Kbyte) (P<0.001). (2) The image quality of the GIF format was better than that of the JPEG format (P<0.001).

  15. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  16. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  17. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  18. Improving CMS data transfers among its distributed computing facilities

    International Nuclear Information System (INIS)

    Flix, J; Magini, N; Sartirana, A

    2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on their usage, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels in a reliable and robust way.

  19. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  20. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  1. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  2. Efficacy of D-RaCe and ProTaper Universal Retreatment NiTi instruments and hand files in removing gutta-percha from curved root canals - a micro-computed tomography study.

    Science.gov (United States)

    Rödig, T; Hausdörfer, T; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2012-06-01

    To compare the efficacy of two rotary NiTi retreatment systems and Hedström files in removing filling material from curved root canals. Curved root canals of 57 extracted teeth were prepared using FlexMaster instruments and filled with gutta-percha and AH Plus. After determination of root canal curvatures and radii in two directions, the teeth were assigned to three identical groups (n = 19). The root fillings were removed with D-RaCe instruments, ProTaper Universal Retreatment instruments or Hedström files. Pre- and postoperative micro-CT imaging was used to assess the percentage of residual filling material as well as the amount of dentine removal. Working time and procedural errors were recorded. Data were analysed using analysis of covariance and analysis of variance procedures. D-RaCe instruments were significantly more effective than ProTaper Universal Retreatment instruments and Hedström files. In the ProTaper group, four instrument fractures and one lateral perforation were observed. Five instrument fractures were recorded for D-RaCe. D-RaCe instruments were associated with significantly less residual filling material than ProTaper Universal Retreatment instruments and hand files. Hedström files removed significantly less dentine than both rotary NiTi systems. Retreatment with rotary NiTi systems resulted in a high incidence of procedural errors. © 2012 International Endodontic Journal.

  3. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  4. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  5. MalHaploFreq: A computer programme for estimating malaria haplotype frequencies from blood samples

    Directory of Open Access Journals (Sweden)

    Smith Thomas A

    2008-07-01

    Abstract. Background: Molecular markers, particularly those associated with drug resistance, are important surveillance tools that can inform policy choice. People infected with falciparum malaria often contain several genetically-distinct clones of the parasite; genotyping the patients' blood reveals whether or not the marker is present (i.e. its prevalence), but does not reveal its frequency. For example, a person with four malaria clones may contain both mutant and wildtype forms of a marker, but it is not possible to distinguish the relative frequencies of the mutant and wildtype forms, i.e. 1:3, 2:2 or 3:1. Methods: An appropriate method for obtaining frequencies from prevalence data is Maximum Likelihood analysis. A computer programme has been developed that allows the frequency of markers, and of haplotypes defined by up to three codons, to be estimated from blood phenotype data. Results: The programme has been fully documented [see Additional File 1: user manual] and provided with a user-friendly interface suitable for large-scale analyses. It returns accurate frequencies and 95% confidence intervals from simulated data sets and has been extensively tested on field data sets. Conclusion: The programme is included [see Additional File 2: executable compiled for DOS or Windows] and may be freely downloaded. It can then be used to extract molecular marker and haplotype frequencies from their prevalence in human blood samples. This should enhance the use of frequency data to inform antimalarial drug policy choice.
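
    The estimation idea is compact: if a patient carries k clones and the marker has population frequency f, the marker is detected whenever at least one clone carries it, so P(present | k) = 1 - (1 - f)^k, and the maximum-likelihood estimate maximizes the product of these terms over patients. The following is a toy grid-search sketch of that likelihood under those assumptions, not MalHaploFreq's actual code.

    ```python
    import math

    def mle_frequency(observations, grid=1000):
        """MLE of a marker's frequency f from presence/absence data in
        multi-clone infections. Each observation is (k, present): k clones
        in the patient, and whether the marker was detected at all.
        Model: P(present | k) = 1 - (1 - f)^k."""
        best_f, best_ll = None, -math.inf
        for i in range(1, grid):          # candidate frequencies in (0, 1)
            f = i / grid
            ll = 0.0
            for k, present in observations:
                p = 1.0 - (1.0 - f) ** k  # detection probability
                ll += math.log(p if present else 1.0 - p)
            if ll > best_ll:
                best_f, best_ll = f, ll
        return best_f
    ```

    With single-clone infections this reduces to the usual binomial estimate; with multi-clone infections the estimated frequency is pulled below the raw prevalence, which is the point of the method.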

  6. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  7. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  8. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the file system's general organisation. This system is based on the use of two main files: an inverse file, which contains for each descriptor a list of the numbers of the documents indexed by this descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both these files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing steps applied to these different files. Performance and possible developments are finally discussed
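
    The two-file layout the thesis describes (a dictionary of descriptors giving access to an inverse file of document-number lists) is the classic inverted-file structure. A minimal in-memory sketch follows; the function names are invented for illustration.

    ```python
    def build_inverse_file(records):
        """Build the inverse file: map each descriptor to the list of
        document numbers indexed by it."""
        inverse = {}
        for num, descriptors in records.items():
            for d in descriptors:
                inverse.setdefault(d, []).append(num)
        return inverse

    def search(inverse, *descriptors):
        """Selective access: document numbers carrying all the given
        descriptors (intersection of the inverse-file lists)."""
        sets = [set(inverse.get(d, ())) for d in descriptors]
        return sorted(set.intersection(*sets)) if sets else []
    ```

    The dictionary role here is played by the hash lookup on `inverse`; on disk, the thesis-era design would store the descriptor dictionary and the posting lists as two separate files.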

  9. 17 CFR 229.1003 - (Item 1003) Identity and background of filing person.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false (Item 1003) Identity and background of filing person. 229.1003 Section 229.1003 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION STANDARD INSTRUCTIONS FOR FILING FORMS UNDER SECURITIES ACT OF 1933, SECURITIES EXCHANGE ACT OF 1934 AND ENERGY POLICY AND...

  10. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we need to share data across systems. ...

  11. Download this PDF file

    African Journals Online (AJOL)

    5,. May. 1923, p. 287. ISouth African Military Schools) p 287. CGS Box 231, File 31/0/2. .... One gains the impression that the sphere .... tions, Anthropology, Sociology and Man Manage- ment. ... of the word, possesses personality and initiative,.

  12. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  13. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  14. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  15. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  16. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  17. Download this PDF file

    African Journals Online (AJOL)

    countries quite a number of distance education institutions and programmes are more likely to be ... The Open University of Tanzania (OUT), (Ministry of Higher Education, Science and ..... (1991) Comic Relief Funding file. BAI, London, 1st ...

  18. 47 CFR 64.1002 - International settlements policy.

    Science.gov (United States)

    2010-10-01

    ... (CONTINUED) MISCELLANEOUS RULES RELATING TO COMMON CARRIERS International Settlements Policy and Modification... accounting rate modification, filed pursuant to § 64.1001, that includes a settlement rate that is at or... behavior that is harmful to U.S. customers. Carriers and other parties filing complaints must support their...

  19. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  20. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujior; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  1. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed to develop the first HEC-targeted file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  2. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored, with the patient's identification number (ID), on floppy discs successively in the order the scintigraphy was received. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format, which can also serve as a reporting format. Output or retrieval of filed individual patient data is possible by examination, disease code or ID. This system appears suitable for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  3. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single-shared-file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which results in a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune parallel I/O performance of this feature on the parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a performance advantage of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
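The compromise described here, between one shared file and one file per process, can be illustrated with a toy model that groups writer ranks onto a fixed number of subfiles. Round-robin assignment is one common strategy; this is a simplified sketch, not the actual HDF5 implementation:

```python
# Simplified model of subfiling: N writer ranks are grouped onto k
# subfiles instead of 1 shared file (maximum lock contention) or N
# files (unmanageable file counts). Round-robin assignment shown here
# is one plausible strategy; the real HDF5 feature is more elaborate.

def assign_subfiles(n_ranks, n_subfiles):
    """Return {subfile_index: [ranks writing to it]}."""
    groups = {i: [] for i in range(n_subfiles)}
    for rank in range(n_ranks):
        groups[rank % n_subfiles].append(rank)
    return groups

groups = assign_subfiles(n_ranks=8, n_subfiles=3)
print(groups)  # {0: [0, 3, 6], 1: [1, 4, 7], 2: [2, 5]}
```

Each subfile now serves a small, fixed group of writers, which is why the number of subfiles (like the number of Lustre storage targets) becomes a tunable optimization parameter.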

  4. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. The mean square displacement (MSD) in such files, ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  5. 76 FR 4867 - Marine Mammals; File No. 15453

    Science.gov (United States)

    2011-01-27

    ... Public Comment'' from the Features box on the Applications and Permits for Protected Species (APPS) home page, https://apps.nmfs.noaa.gov , and then selecting File No. 15453 from the list of available... educational graphics about the Hawaiian monk seal. In compliance with the National Environmental Policy Act of...

  6. Legal, economic and cultural aspects of file sharing

    NARCIS (Netherlands)

    van Eijk, N.; Poort, J.P.; Rutten, P.

    2010-01-01

    This contribution seeks to identify the short and long-term economic and cultural effects of file sharing on music, films and games, while taking into account the legal context and policy developments. The short-term implications examined concern direct costs and benefits to society, whereas the

  7. 75 FR 42689 - Marine Mammals; File Nos. 15498 and 15500

    Science.gov (United States)

    2010-07-22

    ...-9394; and File No. 15500: Southeast Region, NMFS, 263 13th Avenue South, Saint Petersburg, FL 33701... Beard, (301) 713-2289. SUPPLEMENTARY INFORMATION: On May 3, 2010, notice was published in the Federal... Policy Act of 1969 (42 U.S.C. 4321 et seq.), a final determination has been made that the activity...

  8. 75 FR 23242 - Marine Mammals; File Nos. 15498 and 15500

    Science.gov (United States)

    2010-05-03

    ... (978) 281-9394; and File No. 15500: Southeast Region, NMFS, 263 13th Avenue South, Saint Petersburg, FL... the address listed above. Comments may also be submitted by facsimile to (301) 713-0376, or by email... activities stated in the applications. In compliance with the National Environmental Policy Act of 1969 (42 U...

  9. Privacy Impact Assessment for the Claims Office Master Files

    Science.gov (United States)

    The Claims Office Master Files System collects information on companies in debt to the EPA. Learn how this data is collected, how it will be used, access to the data, the purpose of data collection, and record retention policies for this data.

  10. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.

  11. Sustainability Policy and Environmental Policy

    OpenAIRE

    John C. V. Pezzey

    2001-01-01

    A theoretical, representative agent economy with a depletable resource stock, polluting emissions and productive capital is used to contrast environmental policy, which internalises externalised environmental values, with sustainability policy, which achieves some form of intergenerational equity. The obvious environmental policy comprises an emissions tax and a resource stock subsidy, each equal to the respective external cost or benefit. Sustainability policy comprises an incentive affectin...

  12. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  13. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities hold the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE- to transfer the main buffer to duplicate magnetic tapes, 2) REPORT- to determine when the main buffer is full enough to archive, 3) INCREMENT- to back up the partially filled main buffer, and 4) FULLBACKUP- to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.
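The restore path described above (a file may live in any of three storage areas, searched in order) can be sketched as a simple lookup chain. This is purely illustrative; the real system is written in DEC VAX DCL and uses the VAX BACKUP utility:

```python
# Sketch of the NASA ARCH restore path: an archived file may exist in
# any of three storage areas, which are searched in order. Names and
# data structures here are illustrative only.

def restore(filename, temp_buffer, main_buffer, tape_library):
    """Return (area name, contents) for the first area holding the file."""
    for area_name, area in (("temporary buffer", temp_buffer),
                            ("main buffer", main_buffer),
                            ("tape library", tape_library)):
        if filename in area:
            return area_name, area[filename]
    raise FileNotFoundError(filename)

temp = {}
main = {"RESULTS.DAT;1": b"data"}   # VMS-style name;version
tape = {"OLD.LOG;3": b"log"}
print(restore("RESULTS.DAT;1", temp, main, tape)[0])  # main buffer
```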

  14. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. Users can select events based on tags such as ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users
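The tag-based selection workflow this abstract describes (select events by tag, then locate only the files containing them) can be sketched as a query over a small event catalog. The catalog layout and tag names below are invented for illustration:

```python
# Sketch of the Grid Collector idea: an event catalog records which
# file holds each event and its tags, so a selection identifies both
# the matching events and the (hopefully small) set of files that must
# be located and transferred. Catalog layout and tags are invented.

catalog = [
    # (file_name, event_id, tags)
    ("run1.root", 1, {"date": "2003-03-12", "n_charged": 150}),
    ("run1.root", 2, {"date": "2003-03-15", "n_charged": 80}),
    ("run2.root", 3, {"date": "2003-03-18", "n_charged": 120}),
    ("run3.root", 4, {"date": "2003-04-02", "n_charged": 200}),
]

def select_events(catalog, predicate):
    """Return (event_ids, files_to_fetch) for events matching predicate."""
    hits = [(f, e) for f, e, tags in catalog if predicate(tags)]
    events = [e for _, e in hits]
    files = sorted({f for f, _ in hits})
    return events, files

# "production date between March 10 and 20, and charged tracks > 100"
events, files = select_events(
    catalog,
    lambda t: "2003-03-10" <= t["date"] <= "2003-03-20"
              and t["n_charged"] > 100)
print(events, files)  # [1, 3] ['run1.root', 'run2.root']
```

Only the two files containing matching events would need to be fetched from mass storage, which is the access-cost saving the Grid Collector exploits.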

  15. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. Users can select events based on tags such as ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  16. Is the "Net Generation" Ready for Digital Citizenship? Perspectives from the IEA International Computer and Information Literacy Study 2013. Policy Brief No. 6

    Science.gov (United States)

    Watkins, Ryan; Engel, Laura C.; Hastedt, Dirk

    2015-01-01

    The rise of digital information and communication technologies (ICT) has made the acquisition of computer and information literacy (CIL) a leading factor in creating an engaged, informed, and employable citizenry. However, are young people, often described as "digital natives" or the "net generation," developing the necessary…

  17. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
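The conversion step described here (archived files turned into objects by a middleware process) can be sketched as chunking each file into fixed-size objects keyed by file name and offset. The key scheme and chunk size below are assumptions for illustration, not the actual PLFS layout:

```python
# Sketch of the middleware conversion step: each archived file is
# split into fixed-size objects keyed by (file name, byte offset),
# ready to be put into a cloud object store. Key scheme and chunk
# size are assumptions, not the actual PLFS on-disk layout.

CHUNK = 4  # bytes per object; deliberately tiny for demonstration

def file_to_objects(name, data, chunk=CHUNK):
    """Return {object_key: bytes} for one archived file."""
    return {f"{name}/{off}": data[off:off + chunk]
            for off in range(0, len(data), chunk)}

def objects_to_file(objects, name):
    """Reassemble the file from its objects, ordered by offset."""
    parts = sorted((int(k.rsplit("/", 1)[1]), v)
                   for k, v in objects.items()
                   if k.startswith(name + "/"))
    return b"".join(v for _, v in parts)

objs = file_to_objects("ckpt.0001", b"checkpoint-data")
assert objects_to_file(objs, "ckpt.0001") == b"checkpoint-data"
```

Because each object is independently addressable, the object store can ingest chunks in parallel, which suits checkpoint bursts staged through a burst buffer node.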

  18. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  19. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  20. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
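The definition in these two records, a file system as a partial function from absolute paths to data, has a direct sketch as a dictionary keyed by path tuples. The class and method names are illustrative, not taken from the paper:

```python
# Sketch of the records' definition: an abstract file system is a
# partial function from (absolute) paths to data, modelled here as a
# dict keyed by path tuples. Method names are illustrative.

class AbstractFS:
    def __init__(self):
        self.f = {}  # the partial function: path tuple -> data

    def valid(self, path):
        """A path is valid iff the partial function is defined on it."""
        return path in self.f

    def read(self, path):
        return self.f[path]          # defined only at valid paths

    def write(self, path, data):
        if not self.valid(path):
            raise KeyError(path)     # writing requires a valid path
        self.f[path] = data

    def create(self, path, data=b""):
        self.f[path] = data          # Unix-style creation

    def remove(self, path):
        del self.f[path]             # Unix-style removal

fs = AbstractFS()
fs.create(("usr", "doc", "readme"), b"hello")
assert fs.valid(("usr", "doc", "readme"))
fs.write(("usr", "doc", "readme"), b"hi")
fs.remove(("usr", "doc", "readme"))
assert not fs.valid(("usr", "doc", "readme"))
```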

  1. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
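The offset-following approach the slides use (shown there in MATLAB) translates naturally to Python's struct module. The offsets and field layout below are placeholders for illustration; the actual vk4 offset table is in the slides, not reproduced here:

```python
import struct

# Reading little-endian binary fields at known offsets, as the slides
# do for the Keyence vk4 format in MATLAB. The offsets and field
# layout here are PLACEHOLDERS: the real vk4 offset table is not
# reproduced in this abstract.

HEIGHT_OFFSET_POS = 8  # hypothetical: where the height-image offset lives

def read_u32(buf, pos):
    """Read one little-endian unsigned 32-bit integer at byte pos."""
    return struct.unpack_from("<I", buf, pos)[0]

def read_height_image(buf):
    """Follow a stored offset to a (width, height, pixels...) block."""
    start = read_u32(buf, HEIGHT_OFFSET_POS)
    width = read_u32(buf, start)
    height = read_u32(buf, start + 4)
    n = width * height
    pixels = struct.unpack_from("<%dI" % n, buf, start + 8)
    return width, height, pixels

# Tiny synthetic "file": a 12-byte header whose third word points at a
# 2x1 image block.
blob = struct.pack("<III", 0, 0, 12) + struct.pack("<IIII", 2, 1, 7, 9)
print(read_height_image(blob))  # (2, 1, (7, 9))
```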

  2. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

Since the evolution of the distributed operating system, the distributed file system has become an important part of the operating system. P2P is a reliable way to share files in a distributed operating system. Introduced in 1999, it later became a topic of high research interest. A peer-to-peer network is a type of network in which peers share the network workload and other related tasks. A P2P network can also be a temporary connection, where a group of computers connected by a USB (Universal Serial Bus) port transfer files or enable disk sharing. Currently P2P requires a special network designed in a P2P way. Nowadays, browsers have a big influence on our lives. In this project we study the file sharing mechanism of distributed operating systems in web browsers, where we will try to find performance bottlenecks; our research aims to improve file sharing performance and scalability in distributed file systems. Additionally, we will discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.

  3. Privacy Policy

    Science.gov (United States)

    ... Home → NLM Privacy Policy URL of this page: https://medlineplus.gov/privacy.html NLM Privacy Policy To ... out of cookies in the most popular browsers, http://www.usa.gov/optout_instructions.shtml. Please note ...

  4. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  5. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
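The private name space idea, mounting multiple underlying file systems into one logical space, can be sketched as longest-prefix resolution over a per-user mount table. The mount points and backend names below are invented for illustration:

```python
# Sketch of a Jade-style private name space: a per-user mount table
# maps logical path prefixes to underlying file systems, and a logical
# name resolves to (backend, remaining path) by longest-prefix match.
# Mount points and backend names are invented for illustration.

mounts = {
    "/papers": "AFS",
    "/papers/drafts": "NFS",   # multiple systems under one directory
    "/archive": "FTP",
}

def resolve(mounts, logical_path):
    """Return (backend, path inside backend) via longest-prefix match."""
    best = None
    for prefix, backend in mounts.items():
        if logical_path == prefix or logical_path.startswith(prefix + "/"):
            if best is None or len(prefix) > len(best[0]):
                best = (prefix, backend)
    if best is None:
        raise KeyError(logical_path)
    prefix, backend = best
    return backend, logical_path[len(prefix):] or "/"

print(resolve(mounts, "/papers/jade.tex"))       # ('AFS', '/jade.tex')
print(resolve(mounts, "/papers/drafts/v2.tex"))  # ('NFS', '/v2.tex')
```

Longest-prefix matching is what lets `/papers/drafts` live on a different underlying system than the rest of `/papers`, one of the two novel features the abstract mentions.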

  6. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  7. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

The question of whether prefetching blocks of a file into the block cache can effectively reduce the overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns; (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation; and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in this environment.
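The caching-with-prefetching scheme the abstract evaluates can be sketched as a block cache that, on each miss, also fetches the next few sequential blocks, while tracking the hit ratio (the traditional measure whose adequacy the paper questions). Parameters and structure are invented for illustration:

```python
# Sketch of block prefetching: on a miss, fetch the requested block
# plus the next `depth` sequential blocks, and track the hit ratio.
# Purely illustrative; parameters are invented.

class PrefetchCache:
    def __init__(self, prefetch_depth=2):
        self.cache = set()
        self.depth = prefetch_depth
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
        else:
            self.misses += 1
            # fetch the missed block and prefetch the next `depth` blocks
            for b in range(block, block + self.depth + 1):
                self.cache.add(b)

    def hit_ratio(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

c = PrefetchCache(prefetch_depth=2)
for blk in range(6):          # purely sequential access pattern
    c.read(blk)
print(c.hit_ratio())          # 4 hits out of 6 reads
```

Even with a perfect hit ratio, prefetching can still lengthen execution: speculative reads consume disk bandwidth that concurrent processes need, which is exactly the gap between per-request savings and overall time that the authors investigate.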

  8. The policies

    International Nuclear Information System (INIS)

    Laruelle, Ph.; Snegaroff, Th.; Moreau, S.; Tellenne, C.; Brunel, S.

    2005-01-01

    Fourth chapter of the book on the geopolitics of sustainable development, this chapter deals with the various national and international policies concerned with the problem. The authors analyze the American energy attitude and policy; the European policy's confrontation between economic and environmental equilibrium; the Japanese attitude toward a nature both sanctified and sacrificed; India and China, the great fears of the 21st century; and sustainable development in Africa. (A.L.B.)

  9. Trade Policy

    OpenAIRE

    Murray Gibbs

    2007-01-01

    In an otherwise insightful and thoughtful article, Sebastian Pfotenhauer (“Trade Policy Is Science Policy,” Issues, Fall 2013) might better have entitled his contribution “Trade Policy Needs to Be Reconciled with Science Policy.” The North American Free Trade Agreement (NAFTA) and the agreements administered by the World Trade Organization, particularly the General Agreement on Tariffs and Trade (GATT) and the Agreement on Technical Barriers to Trade (TBT), were adopted to promote international trade and i...

  10. Download this PDF file

    African Journals Online (AJOL)


  11. Challenging Ubiquitous Inverted Files

    NARCIS (Netherlands)

    de Vries, A.P.

    2000-01-01

    Stand-alone ranking systems based on highly optimized inverted file structures are generally considered ‘the’ solution for building search engines. Observing various developments in software and hardware, we argue however that IR research faces a complex engineering problem in the quest for more

  12. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; O'Keefe, Matthew T.

    1996-01-01

    The Global File System (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility such that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of in the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
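The consistency idea can be sketched in a few lines: a lock held at the shared storage device serializes each node's read-modify-write of shared data. This is an illustrative analogy only (a Python `threading.Lock` standing in for a device-maintained lock), not the GFS/IRIX implementation.

```python
# Illustrative sketch: a per-device lock makes concurrent
# read-modify-write operations on shared storage atomic.
import threading

class SharedDevice:
    """A storage device exposing blocks plus a device-held lock."""
    def __init__(self):
        self.blocks = {"counter": 0}
        self.lock = threading.Lock()   # stands in for a device lock primitive

    def read_modify_write(self, key, fn):
        with self.lock:                # acquire the device-held lock
            self.blocks[key] = fn(self.blocks[key])

device = SharedDevice()
threads = [threading.Thread(
    target=lambda: [device.read_modify_write("counter", lambda v: v + 1)
                    for _ in range(1000)])
    for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(device.blocks["counter"])        # 4000: no updates lost
```

Without the lock, concurrent increments could interleave between the read and the write and updates would be lost; holding the lock at the device (rather than at a central server) is what lets GFS avoid a server bottleneck.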

  13. Download this PDF file

    African Journals Online (AJOL)

    AJNS WEBMASTERS

    Incidence is higher in the elderly, about 58 per 100,000 per year. Diagnosis of CSDH is still .... in the other two patients was not stated in the case file. Evacuation of the Subdural .... Personal experience in 39 patients. Br J of Neurosurg. 2003 ...

  14. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    4 KB of data is read or written, data is copied back and forth using trampoline buffers (pages that are shared during proxy initialization) because...

  15. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  16. Testing the Forensic Interestingness of Image Files Based on Size and Type

    Science.gov (United States)

    2017-09-01

    down to 0.18% (Rowe, 2015). When scanning a computer hard drive, many kinds of pictures are found. Digital images are not...

  17. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  18. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  19. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10^-5 eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the ^115In(n,n')^116mIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as a part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs.

  20. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.
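The six-dot cell idea can be shown with a short sketch. This is not the NASA translator (which processed print-image tapes); it only illustrates how characters map to six-dot Braille cells, here rendered via the Unicode Braille Patterns block, where each cell is U+2800 plus a bit mask with dots 1-6 as bits 0x01..0x20.

```python
# Hedged sketch: map a few characters to six-dot Braille cells using
# the standard dot numbering for Braille letters.
DOTS = {  # dot numbers for a few standard Braille letters
    "a": [1], "b": [1, 2], "c": [1, 4], "d": [1, 4, 5], "e": [1, 5],
}

def to_cell(ch):
    # dot n corresponds to bit (n - 1) of the Unicode offset
    mask = sum(1 << (d - 1) for d in DOTS[ch])
    return chr(0x2800 + mask)

print("".join(to_cell(c) for c in "bad"))  # ⠃⠁⠙
```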

  1. Code 672 observational science branch computer networks

    Science.gov (United States)

    Hancock, D. W.; Shirk, H. G.

    1988-01-01

    In general, networking increases productivity due to the speed of transmission, easy access to remote computers, ability to share files, and increased availability of peripherals. Two different networks within the Observational Science Branch are described in detail.

  2. A History of the Andrew File System

    CERN Multimedia

    CERN. Geneva; Altman, Jeffrey

    2011-01-01

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  3. Xenotransplantation: science, ethics, and public policy

    National Research Council Canada - National Science Library

    Committee on Xenograft, Transplantation Institute; Institute of Medicine

    ... Division of Health Sciences Policy, Division of Health Care Services. Institute of Medicine. National Academy Press, Washington, D.C., 1996.

  4. SIDS-to-ADF File Mapping Manual

    Science.gov (United States)

    McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)

    2002-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-to-ADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database.
Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of

  5. Cloud Computing: Architecture and Services

    OpenAIRE

    Ms. Ravneet Kaur

    2018-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid. It is a method for delivering information technology (IT) services where resources are retrieved from the Internet through web-based tools and applications, as opposed to a direct connection to a server. Rather than keeping files on a proprietary hard drive or local storage device, cloud-based storage makes it possib...

  6. 75 FR 7467 - Gary E. Hall and Rita C. Hall; Notice of Application Accepted for Filing With the Commission...

    Science.gov (United States)

    2010-02-19

    ... Rita C. Hall; Notice of Application Accepted for Filing With the Commission, Soliciting Motions To.... Project No.: 13652-000. c. Date filed: January 11, 2010. d. Applicant: Gary E. Hall and Rita C. Hall. e... Policies Act of 1978, 16 U.S.C. 2705, 2708. h. Applicant Contact: Mr. Gary E. Hall and Ms. Rita C. Hall, P...

  7. File structure and organization in the automation system for operative account of equipment and materials in JINR

    International Nuclear Information System (INIS)

    Gulyaeva, N.D.; Markova, N.F.; Nikitina, V.I.; Tentyukova, G.N.

    1975-01-01

    The structure and organization of files in the information bank for the first variant of a JINR material and technical supply subsystem are described. An automated system for the operative stock-taking of equipment, based on the SDS-6200 computer, has been developed. Information is stored on magnetic discs. The arrangement of each file depends on its purpose and on the structure of its data. Access to the files can be arbitrary or sequential. The files are divided into groups: primary document files, long-term reference files, and information on items that may change as a result of administrative decisions.

  8. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirement of the files, providing load-balancing, and reducing delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.
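The fragmentation-and-replication idea can be sketched briefly. The splitting rule, replica count, and round-robin placement below are invented for illustration and are not the paper's actual scheme; the point is only that no single server ends up holding a whole high-security file.

```python
# Rough sketch: split a file into fragments, then place each fragment
# on several distinct servers.
def fragment(data: bytes, n: int):
    size = -(-len(data) // n)            # ceiling division
    return [data[i * size:(i + 1) * size] for i in range(n)]

def allocate(fragments, servers, replicas=2):
    placement = {}
    for i, frag in enumerate(fragments):
        # round-robin placement onto `replicas` distinct servers
        placement[i] = [servers[(i + r) % len(servers)] for r in range(replicas)]
    return placement

frags = fragment(b"top-secret payload", 3)
print(frags)                             # [b'top-se', b'cret p', b'ayload']
print(allocate(frags, ["s1", "s2", "s3", "s4"]))
```

Replication keeps every fragment available when a server fails, while fragmentation ensures a compromised low-security server reveals at most one piece of the file.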

  9. Energy taxation and the double dividend effect in Taiwan's energy conservation policy-an empirical study using a computable general equilibrium model

    International Nuclear Information System (INIS)

    Bor, Yunchang Jeffrey; Huang Yophy

    2010-01-01

    Faced with pressure from greenhouse gas reductions and energy price hikes, the Taiwan government is in the process of developing an energy tax regime to reflect environmental external costs and effectively curb energy consumption, as well as mitigate CO2 emissions through an adequate pricing system. This study utilizes a CGE model to simulate and analyze the economic impacts of the draft Energy Tax Bill and its complementary fiscal measures. Under the assumption of tax revenue neutrality, the use of energy tax revenue generated for the purpose of reducing income tax is the best choice, with double dividend effects, since it will effectively stimulate domestic consumption and investment and, consequently, mitigate the negative impacts of the distortionary tax regime. The double dividend effect is less significant, however, when the supplementary measures being used are for government expenditure. Nevertheless, all supplementary measures have effectively reduced energy consumption, which means they have delivered at least the first dividend, in the sense of CO2 emissions control. It has been verified in this study that having adequate public-finance policy measures is the key to realizing the double dividend effect.

  10. File: International bilateral relations

    International Nuclear Information System (INIS)

    Feltin, Ch.; Rabouhams, J.; Bravo, X.; Rousseau, M.; Le Breton, S.; Saint Raymond, Ph.; Brigaud, O.; Pertuis, V.; McNair, J.; Sayers, M.R.; Bye, R.; Scherrer, J.

    1998-01-01

    Since its creation in 1973, the Safety Authority has been assigned missions in the international field with the following objectives: to develop information exchanges with its foreign counterparts; to make known and explain the French approach and practice; and to give the countries concerned useful information on French nuclear facilities situated near the border. This file shows, with some examples, how bilateral relations make it possible to fulfil these objectives and how the French Authority benefits from foreign experience. (N.C.)

  11. Telecommunications Companies to Build Computer Network Security Policy Inquiry%电信企业计算机网络安全构建策略探究

    Institute of Scientific and Technical Information of China (English)

    张国平

    2013-01-01

    The rapid development of Internet technology has created good conditions for information exchange and business development across all industries. In China's telecommunications companies in particular, Internet technology has significantly improved service quality and enterprise management; at the same time, more computer network security problems have gradually emerged, placing higher demands on these companies. This paper analyzes the current state of China's telecommunications computer networks and proposes countermeasures to the existing problems from both a technical and a managerial perspective, in order to provide a reference for building a secure telecommunications enterprise network system.

  12. FEDGROUP - A program system for producing group constants from evaluated nuclear data of files disseminated by IAEA

    International Nuclear Information System (INIS)

    Vertes, P.

    1976-06-01

    A program system for calculating group constants from several evaluated nuclear data files has been developed. These files are distributed by the Nuclear Data Section of the IAEA. Our program system, FEDGROUP, has certain advantages over well-known similar codes: 1. it requires only a medium-sized computer (at least approximately 20,000 words of memory); 2. it is easily adaptable to any type of computer; 3. it is flexible with regard to the input evaluated nuclear data file and the output group constant file. Nowadays, FEDGROUP calculates practically all types of group constants needed for reactor physics calculations by using the most frequent representations of evaluated data. (author)

  13. Enkripsi dan Dekripsi File dengan Algoritma Blowfish pada Perangkat Mobile Berbasis Android

    Directory of Open Access Journals (Sweden)

    Siswo Wardoyo

    2016-03-01

    Full Text Available Cryptography is one of the ways used to secure data in the form of files, by encrypting them so that others not entitled to them cannot read files that are private and confidential. One such method is the Blowfish algorithm, a symmetric-key cryptographic algorithm used to perform both encryption and decryption. The application that was built can encrypt files such as images, videos, and documents, and runs on mobile phones with at least the Android 2.3 operating system. The software used to build the application is Eclipse. The results of this research indicate that the application is capable of performing encryption and decryption. File encryption turns files into unintelligible data. Using a 72-bit (9-character) key, it would take about 1.49x10^8 years to break it by brute force at a computation speed of 10^6 keys/sec.
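The abstract's key-strength estimate can be checked with a few lines of arithmetic: exhaustively searching a 72-bit key space at 10^6 keys per second.

```python
# Verify the brute-force estimate: 2^72 keys tried at 10^6 keys/sec.
keys = 2 ** 72                       # size of a 72-bit key space
rate = 1e6                           # keys per second, as stated
seconds = keys / rate
years = seconds / (365 * 24 * 3600)  # non-leap years, for a rough figure
print(f"{years:.2e}")                # 1.50e+08, consistent with ~1.49x10^8
```

(The exact figure depends on the year length used, and on average only half the key space needs to be searched; the order of magnitude is what matters.)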

  14. Defining nuclear medical file format based on DICOM standard

    International Nuclear Information System (INIS)

    He Bin; Jin Yongjie; Li Yulan

    2001-01-01

    With the wide application of computer technology in the medical area, DICOM is becoming the standard for digital imaging and communication. The author discusses how to define a medical imaging file format based on the DICOM standard, and introduces the file format of the ANMIS system defined by the authors, along with the validity and integrity of this format.

  15. Introducing a new operational policy : the PIS operational policy

    CSIR Research Space (South Africa)

    Pattinson, T

    2009-01-01

    Full Text Available. Published in Computers and Chemical Engineering (2009), this paper introduces a new operational policy, the PIS operational policy, alongside established intermediate-storage policies such as Finite Intermediate Storage (FIS) and Unlimited Intermediate Storage (UIS).

  16. 48 CFR 227.7202-1 - Policy.

    Science.gov (United States)

    2010-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-1 Policy. (a) Commercial computer software or commercial computer software documentation shall be acquired under the licenses customarily provided to the public...

  17. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file

  18. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  19. Huygens File Service and Storage Architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  20. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  1. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  2. The crystallographic information file (CIF): A new standard archive file for crystallography

    International Nuclear Information System (INIS)

    Hall, S.R.; Allen, F.H.; Brown, I.D.

    1991-01-01

    The specification of a new standard Crystallographic Information File (CIF) is described. Its development is based on the Self-Defining Text Archive and Retrieval (STAR) procedure. The CIF is a general, flexible and easily extensible free-format archive file; it is human and machine readable and can be edited by a simple editor. The CIF is designed for the electronic transmission of crystallographic data between individual laboratories, journals and databases; it has been adopted by the International Union of Crystallography as the recommended medium for this purpose. The file consists of data names and data items, together with a loop facility for repeated items. The data names, constructed hierarchically so as to form data categories, are self-descriptive within a 32-character limit. The sorted list of data names, together with their precise definitions, constitutes the CIF dictionary (core version 1991). The CIF core dictionary is presented in full and covers the fundamental and most commonly used data items relevant to crystal structure analysis. The dictionary is also available as an electronic file suitable for CIF computer applications. Future extensions to the dictionary will include data items used in more specialized areas of crystallography. (orig.)
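The name/item structure described above can be illustrated with a tiny sketch. This is not a full STAR/CIF parser (it ignores loops, multi-line values, and data blocks); the sample fragment uses real CIF core data names, but the parser itself is a simplification for illustration.

```python
# Minimal sketch of the CIF's basic shape: underscore-prefixed data
# names paired with data items, in free format.
sample = """\
data_example
_cell_length_a    5.431
_cell_length_b    5.431
_symmetry_cell_setting cubic
"""

def parse_pairs(text):
    items = {}
    for line in text.splitlines():
        if line.startswith("_"):
            # a data name, then whitespace, then the data item
            name, value = line.split(None, 1)
            items[name] = value.strip()
    return items

print(parse_pairs(sample)["_cell_length_a"])   # 5.431
```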

  3. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  4. Download this PDF file

    African Journals Online (AJOL)

    computer networks. The calculation of the path in real time is useful in a number of situations. ... The most effective protocol in computer networks is ... in reconfigurable communication systems [7], it ..... The operating characteristic of a routing ...

  5. Download this PDF file

    African Journals Online (AJOL)

    The Effect of Language on Human-Computer Interactions in Cameroon. ONIBEREE. A.', NGOLAHC.E.º, SHU W.S.*. Department of Computer Science, University of Buea, Cameroon ..... Multiple-choice questions asked the respondent for their.

  6. Download this PDF file

    African Journals Online (AJOL)

    Dr Olaleye

    study is to determine the prevalence of computer related eye problems and their associations among computer users .... there were bright lights in their field of view while viewing .... range close to the near point of accommodation for a variable.

  7. Download this PDF file

    African Journals Online (AJOL)

    licenses and even security holograms. They are made to ... onto a computer system to encourage customers into buying their computer hardware. ..... producers, the government, the users and ICT business circles. The ... It is important that the.

  8. Download this PDF file

    African Journals Online (AJOL)

    GB

    internet connection and access to computers with in hospitals were found to be statistically ... hospitals use their computers for recording patient ..... relationship between the predictor and outcome ... information by physicians for patient care in.

  9. Grid collector an event catalog with automated file management

    CERN Document Server

    Ke Sheng Wu; Sim, A; Jun Min Gu; Shoshani, A

    2004-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select ev...

  10. SLIB77, Source Library Data Compression and File Maintenance System

    International Nuclear Information System (INIS)

    Lunsford, A.

    1989-01-01

    Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in a compressed form on magnetic disk. The program was prepared to meet program maintenance requirements for ongoing program development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C and COBOL. It can handle libraries with records of up to 160-characters. Records are grouped into DECKS and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input and a built-in line editor is used for modification of single decks. Transferring of a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run

  11. Federating LHCb datasets using the DIRAC File catalog

    CERN Document Server

    Haen, Christophe; Frank, Markus; Tsaregorodtsev, Andrei

    2015-01-01

    In the distributed computing model of LHCb the File Catalog (FC) is a central component that keeps track of each file and replica stored on the Grid. It is federating the LHCb data files in a logical namespace used by all LHCb applications. As a replica catalog, it is used for brokering jobs to sites where their input data is meant to be present, but also by jobs for finding alternative replicas if necessary. The LCG File Catalog (LFC) used originally by LHCb and other experiments is now being retired and needs to be replaced. The DIRAC File Catalog (DFC) was developed within the framework of the DIRAC Project and presented during CHEP 2012. From the technical point of view, the code powering the DFC follows an Aspect oriented programming (AOP): each type of entity that is manipulated by the DFC (Users, Files, Replicas, etc) is treated as a separate 'concern' in the AOP terminology. Hence, the database schema can also be adapted to the needs of a Virtual Organization. LHCb opted for a highly tuned MySQL datab...

  12. Building sustainable policy framework for transport development: A review of national transport policy initiatives in Nigeria

    Directory of Open Access Journals (Sweden)

    Sumaila A.F.

    2013-06-01

    Full Text Available This paper is concerned with building a sustainable policy framework for transport development in Nigeria. Its objective is to review the country’s transport policy initiatives in order to understand the extent to which they address Nigeria’s mobility and transportation problems. From published materials and official government documents and files, the study identifies four national policy initiatives, which are reviewed and analysed with regard to their context, contents, and consequences. The study reveals that while the policy initiatives could be said to be adequate and comprehensive in terms of their context and contents, the major challenge is implementation of recommended solutions. The study therefore provides a general checklist to guide policy direction, while advocating policy-based research and empirical studies in order to provide the database for formulation of a sustainable national transport policy for Nigeria.

  13. Access to DIII-D data located in multiple files and multiple locations

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1993-10-01

    The General Atomics DIII-D tokamak fusion experiment is now collecting over 80 MB of data per discharge once every 10 min, and that quantity is expected to double within the next year. The size of the data files, even in compressed format, is becoming increasingly difficult to handle. Data is also being acquired now on a variety of UNIX systems as well as MicroVAX and MODCOMP computer systems. The existing computers collect all the data into a single shot file, and this data collection is taking an ever increasing amount of time as the total quantity of data increases. Data is not available to experimenters until it has been collected into the shot file, which is in conflict with the substantial need for data examination on a timely basis between shots. The experimenters are also spread over many different types of computer systems (possibly located at other sites). To improve data availability and handling, software has been developed to allow individual computer systems to create their own shot files locally. The data interface routine PTDATA that is used to access DIII-D data has been modified so that a user's code on any computer can access data from any computer where that data might be located. This data access is transparent to the user. Breaking up the shot file into separate files in multiple locations also impacts software used for data archiving, data management, and data restoration
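    The location-transparent access described for the modified PTDATA routine amounts to a catalog lookup followed by a local or remote read. The sketch below is a hypothetical illustration: the catalog contents, pointnames, paths, and helper functions are all invented, and the real PTDATA is a FORTRAN routine, not Python.

    ```python
    # Hedged sketch of location-transparent data access: a catalog maps each
    # (pointname, shot) to the host and file holding it, and one accessor hides
    # the location from the caller. All names/paths below are invented.
    CATALOG = {
        ("densv2", 77500): ("unix01", "/data/77500/cer.shot"),
        ("ip",     77500): ("vax03",  "DUA0:[SHOTS]77500.DAT"),
    }

    def read_local(path, name):
        return f"<{name} from {path}>"                 # stand-in for a file read

    def read_remote(host, path, name):
        return f"<{name} fetched from {host}:{path}>"  # stand-in for a network fetch

    def ptdata(name, shot, local_host="unix01"):
        """Return data for name/shot, wherever the shot file lives."""
        host, path = CATALOG[(name, shot)]
        if host == local_host:
            return read_local(path, name)
        return read_remote(host, path, name)
    ```

    A user's code calls `ptdata` the same way regardless of which computer produced the shot file, which is exactly the transparency the abstract describes.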

  14. Macroeconomic impact of a mild influenza pandemic and associated policies in Thailand, South Africa and Uganda: a computable general equilibrium analysis.

    Science.gov (United States)

    Smith, Richard D; Keogh-Brown, Marcus R

    2013-11-01

    Previous research has demonstrated the value of macroeconomic analysis of the impact of influenza pandemics. However, previous modelling applications focus on high-income countries and there is a lack of evidence concerning the potential impact of an influenza pandemic on lower- and middle-income countries. To estimate the macroeconomic impact of pandemic influenza in Thailand, South Africa and Uganda with particular reference to pandemic (H1N1) 2009. A single-country whole-economy computable general equilibrium (CGE) model was set up for each of the three countries in question and used to estimate the economic impact of declines in labour attributable to morbidity, mortality and school closure. Overall GDP impacts were less than 1% of GDP for all countries and scenarios. Uganda's losses were proportionally larger than those of Thailand and South Africa. Labour-intensive sectors suffer the largest losses. The economic cost of unavoidable absence in the event of an influenza pandemic could be proportionally larger for low-income countries. The cost of mild pandemics, such as pandemic (H1N1) 2009, appears to be small, but could increase for more severe pandemics and/or pandemics with greater behavioural change and avoidable absence. © 2013 John Wiley & Sons Ltd.

  15. Recalling ISX shot data files from the off-line archive

    International Nuclear Information System (INIS)

    Stanton, J.S.

    1981-02-01

    This document describes a set of computer programs designed to allow access to ISX shot data files stored on off-line disk packs. The programs accept user requests for data files and build a queue of these requests. When an operator is available to mount the necessary disk packs, the system copies the requested files to an on-line disk area. The programs run on the Fusion Energy Division's DECsystem-10 computer. The request queue is implemented under the System 1022 data base management system. The support programs are coded in MACRO-10 and FORTRAN-10.

  16. ENERGY POLICY

    OpenAIRE

    Avrupa Topluluğu Enstitüsü, Marmara Üniversitesi

    2015-01-01

    John Mitchell considers EU policies on energy supply security; Tera Allas on energy security of supply in the UK: the way forward; Peter Odell assesses public/private partnerships on the UKCS; Olivier Appert provides an overview of French energy policy.

  17. Energy policy

    International Nuclear Information System (INIS)

    Forrester, J.W.

    1979-01-01

    The author places the energy problem in the context of the world economy. The various obstacles encountered in the United States in spelling out a viable national energy policy are cited. A number of practical proposals are given to lead to an 'effective policy' that would allow energy savings at the same time as energy development, that is, including nuclear energy. [fr]

  18. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
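    Generating machine-readable experiment metadata of this kind can be sketched with the standard library. The element and attribute names below are invented for illustration; the real XEML dialect and the Xeml Environment Ontology define their own vocabulary.

    ```python
    # Hedged sketch: emit a small machine-readable metadata file describing a
    # genotype, growth conditions and sampling, in the spirit of XEML.
    # Element/attribute names here are illustrative, not the actual dialect.
    import xml.etree.ElementTree as ET

    exp = ET.Element("experiment", id="exp-001", species="Arabidopsis thaliana")
    ET.SubElement(exp, "genotype").text = "Col-0"

    env = ET.SubElement(exp, "environment")
    ET.SubElement(env, "condition", name="photoperiod", value="12", unit="h")
    ET.SubElement(env, "condition", name="temperature", value="21", unit="C")

    ET.SubElement(exp, "sampling", time="ZT6", tissue="rosette leaf")

    xml_text = ET.tostring(exp, encoding="unicode")
    ```

    Because the output is plain XML, downstream meta-analyses can parse it back with the same library, which is the data-mining property the abstract emphasizes.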

  19. 75 FR 51994 - Combined Notice of Filings

    Science.gov (United States)

    2010-08-24

    ...: Panther Interstate Pipeline Energy, LLC. Description: Panther Interstate Pipeline Energy, LLC submits tariff filing per 154.203: Panther Baseline eTariff Filing to be effective 8/12/2010. Filed Date: 08/13...

  20. Download this PDF file

    African Journals Online (AJOL)

    NESG PUBLICATIONS

    improving its agriculture, its breed of useful animals, and other ... redistribution of productive land. ... holdings, to those who work the land, or the consolidation of small .... the grantee for life, and for his ..... policy that fails to balance the.

  1. Download this PDF file

    African Journals Online (AJOL)

    MJZ

    illnesses, seriously affecting their daily functioning. The World Bank in ... deficits, frequently resulting in loss of independent. 6, 8, 9 living skills. ... plans and policies. In Zambia, the .... Joshua demonstrates hyperactivity and attention- seeking ...

  2. Download this PDF file

    African Journals Online (AJOL)

    Dr Kazungu

    order to test for the effect of the Economic Recovery Programme on the relationship ... control inflation and increase employment is partly explained by failure of policy makers to ... Given the rapid rural-urban migration in Ghana, inability.

  3. Download this PDF file

    African Journals Online (AJOL)

    Dr Olaleye

    explored the role of PMVs in the provision of contraceptive services in ... survey was conducted in four rural Local Government Areas (LGAs) in Oyo ... Interventions and policy actions ... developed by the Nigerian Federal Ministry of Health.

  4. Download this PDF file

    African Journals Online (AJOL)

    2013-07-01

    Jul 1, 2013 ... policies shift to normalize HIV testing as routine in a range of clinical settings, greater effort .... actions; (2) disclosing to sexual partners helps prevent HIV trans- ..... person would react, PLHIV reported they would listen to what.

  5. Download this PDF file

    African Journals Online (AJOL)

    Irohibe and Agwu

    Climate change is a clear threat to all sectors of the Nigerian socio - ... build the capacities of local institutions to support disaster management policies. ... change agenda and communicating relevant information to the public is very crucial.

  6. Download this PDF file

    African Journals Online (AJOL)

    EBELE

    The study examined the determinants of adaptation to climate change ... of the public extension service, access to adequate information on climate change by ..... is therefore recommended that policy issues on climate change should be ...

  7. Download this PDF file

    African Journals Online (AJOL)

    DR Nneka

    2015-01-13

    Jan 13, 2015 ... Ethnicity, religious fundamentalism and threats to National ..... security policy should have political accommodation as a primary and persistent aim. ... Reasons Advanced for Ethnicity, Religious Extremism and Internal Security.

  8. Download this PDF file

    African Journals Online (AJOL)

    eliasn

    ers.4 On the other hand, policy refers to the general principles by which a ... price controls and institutions like marketing boards phase out in many coun- .... tained, firms could achieve a legitimate dominant market position, for exam-.

  9. Download this PDF file

    African Journals Online (AJOL)

    corruption, the state of policing, and social ... article explores how their experiences of crime and police raids are reshaping the dynamics of the ... arrested,8 the dominant policy, political and media ..... pregnant women and children) are met.

  10. Download this PDF file

    African Journals Online (AJOL)

    RAGHAVENDRA

    training institutions should strengthen constructivist te teachers trainees, discussion is ..... pilot study at Diza secondary school to 13 teachers and. 27 students by purposive ..... large, was given due attention in the education and training policy ...

  11. Download this PDF file

    African Journals Online (AJOL)

    Administrator

    Skype: almedom. Resilience research and policy/practice discourse in health, social, ... Introduction. An overview of recent developments and current ... Journal of Social and Clinical Psychology·(2000) Special Issue. Classical Sources of ...

  12. Download this PDF file

    African Journals Online (AJOL)

    ADOWIE PERE

    1Institute of Engineering, Technology, and Innovation Management (METI), University of Port Harcourt, Rivers/Science Policy and .... governments, environmental bodies and the scientific ..... assess, project and proffer workable management.

  13. 46 CFR 308.523 - Application for revision of Open Cargo Policy, Form MA-303.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Application for revision of Open Cargo Policy, Form MA... Application for revision of Open Cargo Policy, Form MA-303. An application for the revision of an Open Cargo Policy shall be filed in duplicate with the Underwriting Agent on a form which may be obtained from the...

  14. 76 FR 52323 - Combined Notice of Filings; Filings Instituting Proceedings

    Science.gov (United States)

    2011-08-22

    .... Applicants: Young Gas Storage Company, Ltd. Description: Young Gas Storage Company, Ltd. submits tariff..., but intervention is necessary to become a party to the proceeding. The filings are accessible in the.... More detailed information relating to filing requirements, interventions, protests, and service can be...

  15. Computer security engineering management

    International Nuclear Information System (INIS)

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than being appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next, the paper turns to analysis and management of security-related risks, followed by discussion of design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users and by coverage of life cycle support for the security of the system.

  16. The DNA Files

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-09

    The DNA Files is a radio documentary which disseminates genetics information over public radio. The documentaries explore subjects which include the following: How genetics affects society. How human life began and how it evolved. Could new prenatal genetic tests hold the key to disease prevention later in life? Would a national genetic data base sacrifice individual privacy? and Should genes that may lead to the cure for cancer be privately owned? This report serves as a project update for the second quarter of 1998. It includes the spring/summer 1998 newsletter, the winter 1998 newsletter, the program clock, and the latest flyer.

  17. Data Policy

    Directory of Open Access Journals (Sweden)

    Mark A Parsons

    2013-07-01

    Full Text Available The first purpose of data policy should be to serve the objectives of the organization or project sponsoring the collection of the data. With research data, data policy should also serve the broader goals of advancing scientific and scholarly inquiry and society at large. This is especially true with government-funded data, which likely comprise the vast majority of research data. Data policy should address multiple issues, depending on the nature and objectives of the data. These issues include data access requirements, data preservation and stewardship requirements, standards and compliance mechanisms, data security issues, privacy and ethical concerns, and potentially even specific collection protocols and defined data flows. The specifics of different policies can vary dramatically, but all data policies need to address data access and preservation. Research data gain value with use and must therefore be accessible and preserved for future access. This article focuses on data access. While policy might address multiple issues, at a first level it must address where the data stand on what Lyon (2009) calls the continuum of openness. Making data as openly accessible as possible provides the greatest societal benefit, and a central purpose of data policy is to work toward ethically open data access. An open data regime not only maximizes the benefit of the data, it also simplifies most of the other issues around effective research data stewardship and infrastructure development.

  18. Computer system operation

    International Nuclear Information System (INIS)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A.

    1993-12-01

    The report describes the operation and troubleshooting of the main computer and KAERINet. The results of the project are as follows: 1. Operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. Operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. Development of applications: an Electronic Document Approval and Delivery System, and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  19. Computer system operation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    The report describes the operation and troubleshooting of the main computer and KAERINet. The results of the project are as follows: 1. Operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. Operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. Development of applications: an Electronic Document Approval and Delivery System, and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  20. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

    Computer-Communication Networks presents a collection of articles focused on modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer and addresses the design of multi-hop, mobile-user radio networks. Among the topics covered are distributed packet-switching queuing network design, investigations of communication switching techniques in computer networks, and minimum-hop flow assignment and routing subject to an average message delay constraint.

  1. CRYPTOGRAPHIC SECURE CLOUD STORAGE MODEL WITH ANONYMOUS AUTHENTICATION AND AUTOMATIC FILE RECOVERY

    Directory of Open Access Journals (Sweden)

    Sowmiya Murthy

    2014-10-01

    Full Text Available We propose a secure cloud storage model that addresses security and storage issues for cloud computing environments. Security is achieved by anonymous authentication, which ensures that cloud users remain anonymous while being duly authenticated. For achieving this goal, we propose a digital-signature-based authentication scheme with a decentralized architecture for distributed key management with multiple Key Distribution Centers. A homomorphic encryption scheme using the Paillier public key cryptosystem is used for encrypting the data stored in the cloud. We incorporate a query-driven approach for validating the access policies defined by an individual user for his/her data, i.e. access is granted to a requester only if his credentials match the hidden access policy. Further, since data is vulnerable to losses or damages due to the vagaries of the network, we propose an automatic retrieval mechanism in which lost data is recovered by data replication and file replacement with a string-matching algorithm. We describe a prototype implementation of our proposed model.
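    The Paillier cryptosystem named in the abstract is additively homomorphic: multiplying two ciphertexts modulo n² yields an encryption of the sum of the plaintexts. A toy sketch of that property, with deliberately tiny (completely insecure) demonstration primes:

    ```python
    # Toy Paillier cryptosystem showing the additive homomorphism:
    # E(m1) * E(m2) mod n^2 decrypts to m1 + m2. The small primes are for
    # demonstration only; real deployments use primes of ~1024 bits or more.
    import math
    import random

    p, q = 293, 433                  # insecure demo primes
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)     # Carmichael function of n = p*q
    g = n + 1                        # standard choice of generator
    mu = pow(lam, -1, n)             # modular inverse of lambda mod n

    def encrypt(m):
        r = random.randrange(1, n)   # fresh randomness per ciphertext
        while math.gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        L = (pow(c, lam, n2) - 1) // n   # the L function: L(x) = (x - 1) / n
        return (L * mu) % n

    c1, c2 = encrypt(41), encrypt(1)
    assert decrypt((c1 * c2) % n2) == 42   # addition performed under encryption
    ```

    This homomorphism is what allows a cloud server to combine encrypted values without ever seeing the plaintexts.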

  2. Earnings Public-Use File, 2006

    Data.gov (United States)

    Social Security Administration — Social Security Administration released Earnings Public-Use File (EPUF) for 2006. File contains earnings information for individuals drawn from a systematic random...

  3. Development of a script for converting DICOM files to .TXT

    International Nuclear Information System (INIS)

    Abrantes, Marcos E.S.; Oliveira, A.H. de

    2014-01-01

    Background: with the increased use of computer simulation techniques for diagnosis or therapy in patients, the MCNP and SCMS software packages are widely used. To use SCMS as a data-entry interface for MCNP, DICOM images must be converted to text files. Objective: to produce, in the IMAGEJ software, a semi-automatic script that converts DICOM images generated by computed tomography or magnetic resonance to .txt. Methodology: the study was developed on the IMAGEJ software platform with an Intel Core 2 Duo computer (2.00 GHz CPU, 2.00 GB of RAM, 32-bit system). The script was written in a text editor in the JAVA language and installed in IMAGEJ using the software's plug-in tool. When run, a window opens asking for the path of the files to be read, the first and last names of the DICOM files to be converted, and where the new files will be stored. Results: manual conversion of a 600-image cerebral computed tomography study from DICOM to .txt requires about 8 hours; using the script reduces the conversion time to 12 minutes. Conclusion: the script converts DICOM to .txt with a significant saving in processing time.
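    The core of such a conversion is dumping each image's pixel matrix as whitespace-separated text, one row per line. The sketch below illustrates only that step with a made-up 3x3 "slice"; the paper does the DICOM parsing inside IMAGEJ, and a Python equivalent would typically load the slice with the pydicom package first.

    ```python
    # Stand-in for the pixel-dump step of a DICOM-to-.txt script: write a 2D
    # pixel matrix to a text file, one image row per line. DICOM parsing itself
    # is omitted here.
    import os
    import tempfile

    def matrix_to_txt(pixels, path):
        with open(path, "w") as fh:
            for row in pixels:
                fh.write(" ".join(str(v) for v in row) + "\n")

    # toy 3x3 "slice" standing in for one CT image
    slice_px = [[0, 120, 255], [12, 34, 56], [7, 8, 9]]
    out = os.path.join(tempfile.gettempdir(), "slice_001.txt")
    matrix_to_txt(slice_px, out)

    with open(out) as fh:
        lines = fh.read().splitlines()
    # lines[0] -> "0 120 255"
    ```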

  4. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  5. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  6. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  7. Mixed-Media File Systems

    NARCIS (Netherlands)

    Bosch, H.G.P.

    1999-01-01

    This thesis addresses the problem of implementing mixed-media storage systems. In this work a mixed-media file system is defined to be a system that stores both conventional (best-effort) file data and real-time continuous-media data. Continuous-media data is usually bulky, and servers storing and

  8. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used in linked applications for managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS.

  9. 28 March 2014 - Italian Minister of Education, University and Research S. Giannini welcomed by CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci in the ATLAS experimental cavern with Former Collaboration Spokesperson F. Gianotti. Signature of the guest book with Belgian State Secretary for the Scientific Policy P. Courard.

    CERN Multimedia

    Gadmer, Jean-Claude

    2014-01-01

    28 March 2014 - Italian Minister of Education, University and Research S. Giannini welcomed by CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci in the ATLAS experimental cavern with Former Collaboration Spokesperson F. Gianotti. Signature of the guest book with Belgian State Secretary for the Scientific Policy P. Courard.

  10. Design and creation of a direct access nuclear data file

    International Nuclear Information System (INIS)

    Charpentier, P.

    1981-06-01

    General considerations on the structure of instructions and files are reviewed. The design, organization and mode of use of the different files (instruction file, index files, inverted files) and the automatic analysis and inquiry programs are examined. [fr]

  11. 3rd ICTs and Society Meeting; Paper Session - Inequalities: social, economic, political; Paper 1: Information Society Policies 2.0. A Critical Analysis of the Potential and Pit-falls of Social Computing & Informatics in the Light of E-inclusion

    Directory of Open Access Journals (Sweden)

    Pieter Verdegem

    2010-06-01

    Full Text Available In this paper we reflect on how research and policies can and/or should help in the development of a sustainable information society for all. More specifically, we critically investigate how social computing & informatics involve both potential and pitfalls, especially with regard to the difficult relationship between digital and social inclusion. First of all, traditional information society policies are scrutinized. Furthermore, we point to the existence of digital inequalities and reflect briefly on policy intervention in this area (e-inclusion). In addition, we also evaluate the rise of social computing & informatics. Finally, attention is given to the challenge of how research can contribute to the participation of all in the information society.

  12. 78 FR 34099 - FCC Extends Pleading Cycle for Indecency Cases Policy

    Science.gov (United States)

    2013-06-06

    ... the Commission's Web site through its Electronic Document Management System (EDOCS) at http... Electronic Comment Filing System (ECFS). See Electronic Filing of Documents in Rulemaking Proceedings, 63 FR... Indecency Cases Policy AGENCY: Federal Communications Commission. ACTION: Notice. SUMMARY: In this document...

  13. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on a new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which checks the correctness of the connection-line operation as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to fiducial marks; and plotting of track fragments in the chambers from sparks. The BESM-6 computer was chosen for the filtration. The complex of filtration programs is shaped as a RAM-file, and the required version of the program is collected by the PATCHY program. The subprograms performing the greater part of the calculations are written in the MADLEN autocode, the rest in FORTRAN and ALGOL. The filtration time for one image is 1.2-2 s of calculation. The BESM-6 computer processes up to 12 thousand images a day.

  14. Temperature increases on the external root surface during endodontic treatment using single file systems.

    Science.gov (United States)

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: the OneShape endodontic file no. 25; Group 2: the Reciproc endodontic file no. 25; Group 3: the WaveOne endodontic file no. 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05), which showed the highest temperature increases; however, there were no significant differences between the Reciproc and WaveOne files. The single-file rotary systems used in this study may be recommended for clinical use.
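    The Kruskal-Wallis test used for the group comparison can be computed from average ranks. A minimal stdlib sketch (no tie-correction factor; the sample values are made up, not the study's temperature data):

    ```python
    # Kruskal-Wallis H statistic: rank all observations together (average ranks
    # for ties), then H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1).
    def kruskal_wallis_h(*groups):
        pooled = sorted(v for g in groups for v in g)
        # average rank for each distinct value (handles ties)
        rank = {}
        i = 0
        while i < len(pooled):
            j = i
            while j < len(pooled) and pooled[j] == pooled[i]:
                j += 1
            rank[pooled[i]] = (i + 1 + j) / 2   # mean of ranks i+1 .. j
            i = j
        n = len(pooled)
        total = 0.0
        for g in groups:
            r = sum(rank[v] for v in g)         # rank sum R_i for this group
            total += r * r / len(g)
        return 12.0 / (n * (n + 1)) * total - 3 * (n + 1)

    h = kruskal_wallis_h([1, 2, 3], [4, 5, 6])  # complete separation: H ~ 3.857
    ```

    For real analyses one would normally reach for `scipy.stats.kruskal`, which also returns the p-value; the sketch shows only where the H statistic comes from.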

  15. Policy Innovation in Innovation Policy

    DEFF Research Database (Denmark)

    Borras, Susana

    During the past two decades Europe has experienced important changes and transformations in the way in which governments approach the issue of science, technology and innovation, and their relation to economic growth and competitiveness. This has to do with the European Union level as well as with national and sub-national governments in Europe, all of them introducing interesting novelties in their innovation policy. These changes refer to different aspects of policy, mainly the content of policy initiatives towards science, technology and innovation; the instruments governments are using … at the EU level, and mentions similar trends taking place at national and sub-national levels. The questions that guide the contents here are essentially three, namely, what are the main traits of innovation policies in Europe since the 1990s and how have the EU and different national governments approached …

  16. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible; stored for long periods of time, as though they were paper; protected from disaster; and accumulating to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower-performance optical media based on a least-frequently-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
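
    A least-frequently-used migration policy like the one described can be sketched as follows; the function, field layout and file names are illustrative, not PSDS's actual schema:

```python
def pick_migration_victims(files, bytes_needed):
    """Select files to migrate from fast media to optical cartridges,
    least-frequently-used first, until enough space is freed.
    Sketch of the policy only; (name, size, access_count) is a
    hypothetical record layout."""
    by_frequency = sorted(files, key=lambda f: f[2])  # ascending access count
    victims, freed = [], 0
    for name, size, _count in by_frequency:
        if freed >= bytes_needed:
            break
        victims.append(name)
        freed += size
    return victims

files = [("specA.pdf", 400, 12), ("specB.pdf", 700, 2), ("specC.pdf", 300, 5)]
print(pick_migration_victims(files, 800))  # → ['specB.pdf', 'specC.pdf']
```

    A real migration daemon would also track cartridge capacity and record the moves in the vital-statistics database.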

  17. PKA spectrum file

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, M. [Toshiba Corp., Kawasaki, Kanagawa (Japan). Nuclear Engineering Lab.

    1997-03-01

    In the Japanese Nuclear Data Committee, the PKA/KERMA file containing PKA spectra, KERMA factors and DPA cross sections in the energy range between 10⁻⁵ eV and 50 MeV is being prepared from the evaluated nuclear data. The processing code ESPERANT was developed to calculate quantities of PKA, KERMA and DPA from evaluated nuclear data for medium and heavy elements by using the effective single particle emission approximation (ESPEA). For light elements, the PKA spectra are evaluated by the SCINFUL/DDX and EXIFON codes, simultaneously with other neutron cross sections. The DPA cross sections due to charged particles emitted from light elements are evaluated for high neutron energies above 20 MeV. (author)

  18. Competency Reference for Computer Assisted Drafting.

    Science.gov (United States)

    Oregon State Dept. of Education, Salem. Div. of Vocational Technical Education.

    This guide, developed in Oregon, lists competencies essential for students in computer-assisted drafting (CAD). Competencies are organized in eight categories: computer hardware, file usage and manipulation, basic drafting techniques, mechanical drafting, specialty disciplines, three dimensional drawing/design, plotting/printing, and advanced CAD.…

  19. Download this PDF file

    African Journals Online (AJOL)

    NESG PUBLICATIONS

    United Kingdom, France, Germany and the nations of the European Union, ..... and Chinweizu that a past history of ..... As illustrated in our review of the .... Professor Anya O. Anya, Ph.D (Cambridge), D.Sc (Hon), FAS, OFR, NNOM, is a Policy ...

  20. Download this PDF file

    African Journals Online (AJOL)

    introduction. It is widely accepted that development in Information and Communication ... For effective and efficient adoption and application of ICT by libraries there is a ... and technological obsolescence as software and hardware platforms change. .... joint efforts with a multi pronged approach in establishing these policies,.

  1. Download this PDF file

    African Journals Online (AJOL)

    NESG PUBLICATIONS

    fuelled by the persistent global economic crisis, which ... Some studies (e.g., Perotti: 1999, and Giavazzi ... consumption response to fiscal ... that government spending; taxes and .... In the case ..... effects of fiscal policy changes: International evidence and the swedish ... Department of Economics, Lund University, 2003.

  2. Download this PDF file

    African Journals Online (AJOL)

    with the general population. In the same vein, a world collaborative report on in vitro fertilization recorded a multiple birth rate of 29%, the majority of. 5 ... Conclusion: With the desire for twins and high poverty level in Nigeria, a policy of single ...

  3. Download this PDF file

    African Journals Online (AJOL)

    CJN

    2009-12-15

    Dec 15, 2009 ... of an attorney who found himself in a conflict of interests situation. .... Whether the contracts were contra bonos mores or against public policy ... Rights. This approach leaves space for the doctrine of pacta sunt servanda ... respondent breached the standards of professional ethics by knowingly entering into.

  4. Download this PDF file

    African Journals Online (AJOL)

    puts us far below neighbouring Botswana, ... This article will argue that masculine domination is a crucial factor in black male ... likely to be at the receiving end of fatal political violence. ... political leaders, policy makers and police chiefs to speak out more often, ... Even in cases where .... Ironically, in recent times South.

  5. Download this PDF file

    African Journals Online (AJOL)

    McConnell, W. J.

    2009-06-01

    Jun 1, 2009 ... able analyses of the role of human agency and the effects of various policy ... Grâce à de nouvelles sources d'informations mais ... Center for Systems Integration & Sustainability .... be compared with 'control' locations that were similar in other ... areas with greater internal disparity of income (Gorenflo et al.

  6. Download this PDF file

    African Journals Online (AJOL)

    PUBLICATIONS1

    clinics and hospitals, the cost of healthcare, socio-cultural beliefs and norms and the confidence ... tional bone-setters and the West Mamprusi District Hospital and the integration of traditional bone-setting into the national public health framework need urgent policy attention ...... ogy in Kumasi should forge a partnership with.

  7. Download this PDF file

    African Journals Online (AJOL)

    In his book Beggars can be Choosers he integrated .... Most interventions from outside (whether public or private) adhered to programmes of poverty ... to transformation, the development that frees the person from the culture of poverty into ..... Partnership in administration means that a policy for administration will have to be ...

  8. Download this PDF file

    African Journals Online (AJOL)

    Nubidga

    of this new area of knowledge is to explain the formation of a large ... answer the above question by policy makers in different rapidly .... extension to the roads such as payment of fines for poor parking .... implication of taxes on substitute goods in terms of consumers ... of transport choice and location decision of firms. τ* τ*τ* ...

  9. Download this PDF file

    African Journals Online (AJOL)

    ETIWISTIC

    2013-04-29

    Apr 29, 2013 ... payments system, the harmonization of economic policies and .... the consequent grave implications of one or two members ... states by removing some of the payment obstacles to trade. .... average size of firms in the integrating area. .... integration decision in the ECOWAS since 1983 are discussed below.

  10. Download this PDF file

    African Journals Online (AJOL)

    ABSTRACT. Dairy farmers face various challenges in developing their businesses. The current ... agricultural policy report of the Malawian government identifies three major restrictions ... of respondents were female (n = 352), and the average age was 49.4 years. In an open ..... Presentation for the FAO Asia. Regional ...

  11. Download this PDF file

    African Journals Online (AJOL)

    raoul

    2012-01-24

    Jan 24, 2012 ... contributing in this way for a relative reduction in the difference between the ... class and private sector steadily grows in urban areas, and the policy focus is ... On the other hand, public sector health workers true brain drain ...

  12. Download this PDF file

    African Journals Online (AJOL)

    2010-12-04

    Dec 4, 2010 ... This study therefore sought to empirically examine whether in the ... Her research interests focus on human resource management, ... its disproportionate effect on the most productive segment of the ..... learn about relevant new developments. .... some companies might fail to implement policy commitments.

  13. 5 CFR 1203.13 - Filing pleadings.

    Science.gov (United States)

    2010-01-01

    ... delivery, by facsimile, or by e-filing in accordance with § 1201.14 of this chapter. If the document was... submitted by e-filing, it is considered to have been filed on the date of electronic submission. (e... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing pleadings. 1203.13 Section 1203.13...

  14. 12 CFR 16.33 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES OFFERING DISCLOSURE RULES § 16.33 Filing fees. (a) Filing fees must accompany certain filings made under the provisions of this part... Comptroller of the Currency Fees published pursuant to § 8.8 of this chapter. (b) Filing fees must be paid by...

  15. 77 FR 13587 - Combined Notice of Filings

    Science.gov (United States)

    2012-03-07

    .... Applicants: Transcontinental Gas Pipe Line Company. Description: Annual Electric Power Tracker Filing... Company. Description: 2012 Annual Fuel and Electric Power Reimbursement to be effective 4/1/2012. Filed... submits tariff filing per 154.403: Storm Surcharge 2012 to be effective 4/1/2012. Filed Date: 3/1/12...

  16. 75 FR 4689 - Electronic Tariff Filings

    Science.gov (United States)

    2010-01-29

    ... elements ``are required to properly identify the nature of the tariff filing, organize the tariff database... (or other pleading) and the Type of Filing code chosen will be resolved in favor of the Type of Filing...'s wish expressed in its transmittal letter or in other pleadings, the Commission may not review a...

  17. ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2016-01-01

    Digital access to chemical journals has resulted in a vast array of molecular information that is now available in supplementary material files in PDF format. However, extracting this molecular information from a PDF document is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publishers' resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java based application, namely ChemEngine. This program recognizes textual patterns in the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinate data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of the extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large-scale conversion of molecular information from supplementary files available in PDF format into a collection of ready-to-compute molecular data, creating an automated workflow for advanced computational processes. Software, source code and instructions are available at https://sourceforge.net/projects/chemengine/files/?source=navbar.
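
    Harvesting Cartesian coordinates from the extracted text of a supplementary file might look like the following sketch; the regular expression and sample text are illustrative and far simpler than ChemEngine's actual recognizers:

```python
import re

# Matches lines like "C   1.2345  -0.6789   0.0000": an element symbol
# followed by three decimal coordinates, a common layout in supporting
# information. Pattern is a hypothetical simplification.
COORD = re.compile(
    r"^\s*([A-Z][a-z]?)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s*$"
)

def harvest_atoms(text):
    """Return (symbol, x, y, z) tuples for every coordinate line found."""
    atoms = []
    for line in text.splitlines():
        m = COORD.match(line)
        if m:
            sym, x, y, z = m.groups()
            atoms.append((sym, float(x), float(y), float(z)))
    return atoms

sample = """Optimized geometry (B3LYP):
C   0.0000   0.0000   0.0000
O   1.2100   0.0000   0.0000
"""
print(harvest_atoms(sample))
# → [('C', 0.0, 0.0, 0.0), ('O', 1.21, 0.0, 0.0)]
```

    The harvested tuples can then be written out as a standard structure file (e.g. XYZ or MOL) for downstream computation.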

  18. Study and development of a document file system with selective access; Etude et realisation d'un systeme de fichiers documentaires a acces selectif

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Jean-Claude

    1974-06-21

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the files indexed by this descriptor, and a dictionary of descriptors (input file) which gives access to the inverse file. The organisation of both of these files is then described in detail. Other related or associated files are created, and the overall architecture and the mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed.
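
    The two-file organisation described (a dictionary of descriptors giving access to an inverse file of record numbers) can be sketched as follows; the names and data layout are illustrative, not the thesis's implementation:

```python
from collections import defaultdict

def build_inverse_file(records):
    """records: record number -> set of descriptors.
    The inverse file maps each descriptor to the sorted list of
    record numbers indexed by it."""
    inverse = defaultdict(list)
    for number, descriptors in sorted(records.items()):
        for d in descriptors:
            inverse[d].append(number)
    return dict(inverse)

def select(inverse, *descriptors):
    """Selective access: records carrying ALL the given descriptors."""
    sets = [set(inverse.get(d, ())) for d in descriptors]
    return sorted(set.intersection(*sets)) if sets else []

records = {1: {"reactor", "safety"}, 2: {"reactor", "fuel"}, 3: {"safety"}}
inv = build_inverse_file(records)
print(select(inv, "reactor", "safety"))  # → [1]
```

    On 1970s hardware both structures would live on disk, with the descriptor dictionary acting as the access path into the inverse file.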

  19. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    The object, library and executable code is stored in binary files. The functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how their content can be controlled by a possible intruder, and the ways to identify malicious code in such files. Because object files are inputs to linking processes, early detection of malicious content is crucial to avoid infection of the binary executable files.
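
    A first structural sanity check on a binary file is to inspect its magic number before any deeper analysis of headers and sections; the sketch below is illustrative (the magic table is deliberately tiny) and is not the paper's method:

```python
# A few well-known "magic numbers" that open common binary formats.
# Table is illustrative, not exhaustive; e.g. 0xCAFEBABE also opens
# Mach-O fat binaries, so real tools check further fields.
MAGIC = {
    b"\x7fELF": "ELF object/executable",
    b"MZ": "Windows PE executable",
    b"\xca\xfe\xba\xbe": "Java class file",
}

def identify_header(data):
    """Classify a byte string by its leading magic number."""
    for magic, name in MAGIC.items():
        if data.startswith(magic):
            return name
    return "unknown"

print(identify_header(b"\x7fELF\x02\x01\x01"))  # → ELF object/executable
```

    A file whose extension and magic number disagree, or whose header fields point outside the file, is a candidate for closer malicious-content inspection.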

  20. Policy stories

    DEFF Research Database (Denmark)

    Ren, Carina Bregnholm; Rasmussen, Rasmus Kjærgaard

    This article uses Arctic Winter 2016 as an exploration site of values and futures in Greenland. By taking a valuation approach where the creation and interpretation of event values are seen as an ongoing and taxing accomplishment, we firstly expand the understanding of events beyond their actual...... present three central policy stories from the field. The stories tell of how the event was first interested, then activated and finally evaluated. Besides adding a new understanding to policy-driven events as a locus of value creation, we also argue that the AWG 2016 offer speculative bets for new...... planning and execution and of event outcomes beyond the narrow confines of bed nights and legacies. Second, we introduce policies as an entry point to unlock discussions and manifestations of value and futures which connect to AWG. In order to exemplify the workings of the AWG event in these domains, we...

  1. Download this PDF file

    African Journals Online (AJOL)

    user

    particularly addresses computer crime. 6 ... Hence, the other crimes in terms of ... by way of questionnaires and are available in binary column electronic format ..... Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) ...

  2. Download this PDF file

    African Journals Online (AJOL)

    AJRH Managing Editor

    emphasised the need to bridge gender inequalities and empower ..... A single composite variable was computed using .... expand in scope to include over-age out of school children ... advantages and disadvantages of home birthing revealed ...

  3. Download this PDF file

    African Journals Online (AJOL)

    NWUuser

    addresses questions such as what is the nature of science and what distinguishes science ... See Russell Human Knowledge; Russell Scientific Outlook. 6 ..... that Computer Science (as the study of algorithms) is a humanities discipline, even.

  4. Download this PDF file

    African Journals Online (AJOL)

    Prof. Osuagwwu

    ICT Education: A Tool For Quality Assurance In Tertiary Institution In Nigeria ... Department of Computer Science, College of Education, Pankshin, Plateau .... informal education opportunity. ... example secondary School students who must.

  5. Download this PDF file

    African Journals Online (AJOL)

    User

    Information Impact | Journal of information and knowledge management. Page 15 ... Little appears to be known about colleges of education students‟ use of OPAC, computer self-efficacy and the ... and saving time for a library‟s users.

  6. Download this PDF file

    African Journals Online (AJOL)

    USER

    use of ICT in instruction enhance academic performance of students. The study ... computer based tools used by teachers to teach ... like video, audio, camera, and so on, which convert .... example about 84.9% of the respondents agreed.

  7. Download this PDF file

    African Journals Online (AJOL)

    Unknown User

    1National University of Rwanda, Applied Mathematics Department, B.P. 117, Huye,. Rwanda, email: ... A linear programming problem (LPP) is a mathematical programming ..... reduced cost obtained from the model computations for a pipe of.

  8. Download this PDF file

    African Journals Online (AJOL)

    User

    EFFECTIVENESS OF FIREFLY ALGORITHM BASED NEURAL NETWORK IN. TIME SERIES FORECASTING .... Most forecasting problems employs the traditional NN often called ..... SAGA 2009, Lecture Notes in Computer. Sciences 5792: ...

  9. Download this PDF file

    African Journals Online (AJOL)

    Dr Olaleye

    genetic background, in brain slice preparations in normal and high K+ conditions, we studied the effect of 4 ..... (2013) used an interface recording chamber, superfused with ... experimental and computer modeling studies (Hahn et al., 2001 ...

  10. Download this PDF file

    African Journals Online (AJOL)

    Nigerian agriculture still maintained peasant oriented economy that was prominent in the pre- ... demand (Baba, 2010). The Food and .... For the Linear functional form, the elasticity with respect to the production inputs was computed using the ...

  11. Download this PDF file

    African Journals Online (AJOL)

    Library _info_Sc_ 1

    Twenty (120) questionnaires were placed at the circulation desk (from October to December ... analysis was carried out using Statistical Package for social Sciences (SPSS), and the ... learning modules for medical students, and the computer.

  12. Download this PDF file

    African Journals Online (AJOL)

    pc

    2018-03-22

    Mar 22, 2018 ... Abstract: an essential issue in pervasive computing application. FM radio ... This integration allows the user of mobile or any device which contains ..... [6] A. Bekkelien, “Bluetooth Indoor Positioning”, master thesis, University.

  13. Download this PDF file

    African Journals Online (AJOL)

    User

    Technology simply implies the application of knowledge to meet the goals, goods and services desired by ... and television to telephones (fixed and mobile), computers and the internet. ..... Doctoral dissertation, University of Ghana, Accra.

  14. Download this PDF file

    African Journals Online (AJOL)

    RAGHAVENDRA

    teaching is enabling learners to become an independent learner, develop .... i.e Mathematics and Biology from Natural and. Computational Science faculty, Geography and English from Social ...... Perceptions of Woldia University. Instructors ...

  15. Download this PDF file

    African Journals Online (AJOL)

    *School of Computer, Statistical and Mathematical Sciences North-West University ( ... Following on an overview of the pipeline design problem, ... Key words: Network flow models, integer linear programming, extended tree knapsack model.

  16. Download this PDF file

    African Journals Online (AJOL)

    2013-12-07

    Dec 7, 2013 ... Department of Mechanical Engineering, College of Engineering and ... piece quality in the CNC (Computer Numerical Control) turning process. An effective approach of optimization techniques genetic algorithm (GA) and ...

  17. Download this PDF file

    African Journals Online (AJOL)

    Ada

    Software project management is the control of the transformation of users' requirements ... AcadSoft Solutions, Calabar, Unical Computer Centre, and OmegaBiz.ng ... for managing people, technology, resources and risks in software projects,.

  18. Download this PDF file

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    Technologies (ICTs) on Practices of Enterprises: The Nigerian. Perspective. 1. Oladimeji, S. A., ... 3Department of Computer Science, Federal Polytechnic Damaturu, P O Box 358, Damaturu, Nigeria .... project failures in developing countries is.

  19. Download this PDF file

    African Journals Online (AJOL)

    Erah

    2008-06-02

    Jun 2, 2008 ... namely, sociodemographic data, access to computer and internet ... Keywords: LMS, e-Learning, pharmacy students, ICT, teaching and ... changes in Universities in many countries, ... lecturers is very low because of time, and.

  20. Download this PDF file

    African Journals Online (AJOL)

    USER

    for the adoption of cloud computing in NOUN library such as the need to disclose their ... are delivered as a service to external customers using ... cloud deployment (Creeger, 2009), Library .... continuous development that reshape the way.

  1. A lightweight high availability strategy for Atlas LCG File Catalogs

    International Nuclear Information System (INIS)

    Martelli, Barbara; Salvo, Alessandro de; Anzellotti, Daniela; Rinaldi, Lorenzo; Cavalli, Alessandro; Pra, Stefano dal; Dell'Agnello, Luca; Gregori, Daniele; Prosperini, Andrea; Ricci, Pier Paolo; Sapunenko, Vladimir

    2010-01-01

    The LCG File Catalog is a key component of the LHC Computing Grid middleware [1], as it contains the mapping between Logical File Names and Physical File Names on the Grid. The Atlas computing model foresees multiple local LFCs housed in each Tier-1 and Tier-0, containing all information about files stored in the regional cloud. As the local LFC contents are presently not replicated anywhere, this turns out to be a dangerous single point of failure for all of the Atlas regional clouds. In order to solve this problem we propose a novel solution for high availability (HA) of Oracle based Grid services, obtained by composing an Oracle Data Guard deployment and a series of application level scripts. This approach has the advantage of being very easy to deploy and maintain, and represents a good candidate solution for all Tier-2s, which are usually small centres with little manpower dedicated to service operations. We also present the results of a wide range of functionality and performance tests run on a test-bed having characteristics similar to the ones required for production. The test-bed consists of a failover deployment between the Italian LHC Tier-1 (INFN - CNAF) and an Atlas Tier-2 located at INFN - Roma1. Moreover, we explain how the proposed strategy can be deployed on the present Grid infrastructure, without requiring any change to the middleware and in a way that is totally transparent to end users and applications.

  2. Population policy.

    Science.gov (United States)

    1987-03-01

    Participants in the Seminar on Population Policies for Top-level Policy Makers and Program Managers, meeting in Thailand during January 1987, examined the challenges now facing them regarding the implementation of fertility regulation programs in their respective countries -- Bangladesh, China, India, Indonesia, Malaysia, Nepal, Pakistan, the Philippines, the Republic of Korea, and Thailand. This Seminar was organized to coincide with the completion of an Economic and Social Commission for Asia and the Pacific (ESCAP) study investigating the impact and efficiency of family planning programs in the region. Country studies were reviewed at the Seminar along with policy issues about the status of women, incentive and disincentive programs, and socioeconomic factors affecting fertility. In Bangladesh the government recognizes population growth as its top priority problem related to the socioeconomic development of the country and is working to promote a reorientation strategy from the previous clinic-oriented to a multidimensional family welfare program. China's family planning program seeks to postpone marriage, space the births of children between 3-5 years, and promote the 1-child family. Its goal is to reduce the rate of natural increase from 12/1000 in 1978 to 5/1000 by 1985 and 0 by 2000. India's 7th Five-Year-Plan (1986-90) calls for establishing a 2-child family norm by 2000. In Indonesia the government's population policy includes reducing the rate of population growth, achieving a redistribution of the population, adjusting economic factors, and creating prosperous families. The government of Malaysia reversed its policy to reduce the population growth rate in 1984 and announced its goal of achieving a population of 70 million by 2100 in order to support mass consumption industries. It has created an income tax deduction system favoring large families and maternity benefits for women who have up to 5 children as incentives. Nepal's official policy is to ...

  3. Policy Reader

    International Nuclear Information System (INIS)

    1985-09-01

    This policy reader comprises: Correspondence; Memorandum of Understanding between the US Department of Transportation and the US Department of Energy for the Transportation of Radioactive Materials under the Nuclear Waste Policy Act; Internal Guidelines for Interactions with Communities and Local Governments; Statement by Ben C. Rusche before the Committee on Interior and Insular Affairs, Subcommittee on Energy and the Environment, US House of Representatives, September 13, 1985; Speech presented by Ben C. Rusche before the ANS/CNS/AESJ/ENS Topical Meeting, Pasco, Washington, September 24, 1985 - ''Status of the United States' High-Level Nuclear Waste Disposal Program''; and ''DOE Seeks Comments on Nuclear Transportation Planning,'' DOE News, September 30, 1985

  4. Language Policy

    DEFF Research Database (Denmark)

    Lauridsen, Karen M.

    2008-01-01

    Like any other text, instructive texts function within a given cultural and situational setting and may only be available in one language. However, the end users may not be familiar with that language and therefore unable to read and understand the instructions. This article therefore argues...... that instructive texts should always be available in a language that is understood by the end users, and that a corporate communication policy which includes a language policy should ensure that this is in fact the case for all instructive texts....

  5. Kepler Data Validation Time Series File: Description of File Format and Content

    Science.gov (United States)

    Mullally, Susan E.

    2016-01-01

    The Kepler space mission searches its time series data for periodic, transit-like signatures. The ephemerides of these events, called Threshold Crossing Events (TCEs), are reported in the TCE tables at the NASA Exoplanet Archive (NExScI). Those TCEs are then further evaluated to create planet candidates and populate the Kepler Objects of Interest (KOI) table, also hosted at the Exoplanet Archive. The search, evaluation and export of TCEs is performed by two pipeline modules, TPS (Transit Planet Search) and DV (Data Validation). TPS searches for the strongest, believable signal and then sends that information to DV to fit a transit model, compute various statistics, and remove the transit events so that the light curve can be searched for other TCEs. More on how this search is done and on the creation of the TCE table can be found in Tenenbaum et al. (2012), Seader et al. (2015), Jenkins (2002). For each star with at least one TCE, the pipeline exports a file that contains the light curves used by TPS and DV to find and evaluate the TCE(s). This document describes the content of these DV time series files, and this introduction provides a bit of context for how the data in these files are used by the pipeline.
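
    Given a TCE ephemeris (an epoch and a period), the mid-transit times inside an observation window follow directly; the sketch below uses hypothetical values and is not the pipeline's actual code:

```python
import math

def transit_times(epoch, period, t_start, t_end):
    """Mid-transit times epoch + n * period (integer n) that fall
    inside the window [t_start, t_end]. Times in days, as in the
    Kepler time system; values here are hypothetical."""
    n = math.ceil((t_start - epoch) / period)  # first transit in window
    times = []
    t = epoch + n * period
    while t <= t_end:
        times.append(t)
        t += period
    return times

# Hypothetical TCE: first transit at t = 131.5 days, period 3.25 days
print(transit_times(131.5, 3.25, 140.0, 150.0))  # → [141.25, 144.5, 147.75]
```

    DV uses the ephemeris this way to locate and then remove each transit event before the light curve is searched again.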

  6. 77 FR 74839 - Combined Notice of Filings

    Science.gov (United States)

    2012-12-18

    ..., LP. Description: National Grid LNG, LP submits tariff filing per 154.203: Adoption of NAESB Version 2... with Order to Amend NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12. Accession...: Refile to comply with Order on NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12...

  7. The File System Interface is an Anachronism

    OpenAIRE

    Ellard, Daniel

    2003-01-01

    Contemporary file systems implement a set of abstractions and semantics that are suboptimal for many (if not most) purposes. The philosophy of using the simple mechanisms of the file system as the basis for a vast array of higher-level mechanisms leads to inefficient and incorrect implementations. We propose several extensions to the canonical file system model, including explicit support for lock files, indexed files, and resource forks, and the benefit of session semantics for write updates...

  8. Teaching, Learning, and Collaborating in the Cloud: Applications of Cloud Computing for Educators in Post-Secondary Institutions

    Science.gov (United States)

    Aaron, Lynn S.; Roche, Catherine M.

    2012-01-01

    "Cloud computing" refers to the use of computing resources on the Internet instead of on individual personal computers. The field is expanding and has significant potential value for educators. This is discussed with a focus on four main functions: file storage, file synchronization, document creation, and collaboration--each of which has…
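
    Of the four functions, file synchronization is the most algorithmic: a client uploads only those files whose content differs from the server's copy. A minimal content-hash sketch (hypothetical API; real services also handle deletions, conflicts and chunked transfers):

```python
import hashlib

def digest(data):
    """SHA-256 fingerprint of a file's content."""
    return hashlib.sha256(data).hexdigest()

def plan_sync(local, remote):
    """local: name -> bytes on this machine;
    remote: name -> content digest already stored in the cloud.
    Returns the names that need uploading (new or changed)."""
    return sorted(
        name for name, data in local.items()
        if remote.get(name) != digest(data)
    )

local = {"notes.txt": b"v2", "slides.ppt": b"v1"}
remote = {"notes.txt": digest(b"v1"), "slides.ppt": digest(b"v1")}
print(plan_sync(local, remote))  # → ['notes.txt']
```

    Comparing digests rather than full contents is what lets a sync client decide cheaply which of thousands of files actually changed.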

  9. CEP energy policy : Policy 917

    International Nuclear Information System (INIS)

    2002-10-01

    Energy and global warming are among the environmental challenges facing the world in the twenty-first century. Vital human needs such as warmth, light and transportation require energy, which is also required in the production of goods. Absent from the debate concerning the energy industry and its efforts to stop climate change is the voice of energy workers. Previous policies of the Communications, Energy and Paperworkers Union of Canada (CEP) were replaced by this policy document. After a brief introduction, the document tackles the global challenge of climate change, followed by the global challenge of corporate rule. Canada's energy industries are examined from the workers' perspective, and the state of Canada's energy reserves is discussed. 'From national policies to national betrayal' is the title of the following section of the document. Energy de-regulation and privatization are discussed, and an argument is made for a Canadian energy policy. The industrial policy is explored, as is the environment, and a transition to sustainability is examined. refs

  10. Curved canals: Ancestral files revisited

    Directory of Open Access Journals (Sweden)

    Jain Nidhi

    2008-01-01

    The aim of this article is to provide an insight into different techniques for cleaning and shaping curved root canals with hand instruments. Although a plethora of root canal instruments like ProFile, ProTaper, LightSpeed® etc. dominate the current scenario, inexpensive conventional root canal hand files such as K-files and flexible files can be used to obtain optimum results when handled meticulously. Special emphasis has been put on the modifications to biomechanical canal preparation in a variety of curved canal cases. This article compiles a series of clinical cases of root canals with curvatures in the middle and apical third, and with S-shaped curvatures, that were successfully completed by employing only conventional root canal hand instruments.

  11. Value Modifier Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Center for Medicare (CM) has created a standard analytical file intended to promote transparency. For each Value Modifier performance year, CM will publish a...

  12. Cytoscape file of chemical networks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The maximum connectivity scores of pairwise chemical conditions summarized from Cmap results in a file with Cytoscape format (http://www.cytoscape.org/). The figures...

  13. File: nuclear safety and transparency

    International Nuclear Information System (INIS)

    Martinez, J.P.; Etchegoyen, A.; Jeandron, C.

    2001-01-01

    Several experiences of nuclear safety and transparency are related in this file. Public information, access to documents, transparency in nuclear regulation are such subjects developed in this debate. (N.C.)

  14. Physician Compare National Downloadable File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Physician Compare National Downloadable File is organized at the individual eligible professional level; each line is unique at the professional/enrollment...

  15. Registering Researchers in Authority Files

    NARCIS (Netherlands)

    Altman, M.; Conlon, M.; Cristan, A.L.; Dawson, L.; Dunham, J.; Hickey, T.; Hook, D.; Horstmann, W.; MacEwan, A.; Schreur, P.; Smart, L.; Smith-Yoshimura, K.; Wacker, M.; Woutersen, S.

    2014-01-01

    Registering researchers in some type of authority file or identifier system has become more compelling as both institutions and researchers recognize the need to compile their scholarly output. The report presents functional requirements and recommendations for six stakeholders: researchers,

  16. Health Topic XML File Description

    Science.gov (United States)

    ... this page: https://medlineplus.gov/xmldescription.html Health Topic XML File Description: MedlinePlus To use the sharing ... information categories assigned. Example of a Full Health Topic Record A record for a MedlinePlus health topic ...

  17. Dynamic file-access characteristics of a production parallel scientific workload

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1994-01-01

    Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three-week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.
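
A trace-driven caching simulation of the kind mentioned can be sketched as a replay of a block-access trace through an LRU cache; this is a generic illustration of the technique, not the authors' actual simulator:

```python
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Replay a block-access trace through an LRU cache and report the hit rate."""
    cache = OrderedDict()  # block id -> None, ordered by recency of use
    hits = 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)  # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[block] = None
    return hits / len(trace)

# A toy trace with strong reuse: repeated sequential sweeps over 4 blocks.
trace = [0, 1, 2, 3] * 25
print(lru_hit_rate(trace, capacity=4))  # -> 0.96 (only the first sweep misses)
```

A real study would feed recorded per-request traces through such a loop and compare eviction policies and cache sizes.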

  18. Agent-Mining of Grid Log-Files: A Case Study

    NARCIS (Netherlands)

    Stoter, A.; Dalmolen, Simon; Mulder, .W.

    2013-01-01

    Grid monitoring requires analysis of large amounts of log files across multiple domains. An approach is described for automated extraction of job-flow information from large computer grids, using software agents and genetic computation. A prototype was created as a first step towards communities of

  19. Specifying a Realistic File System

    Directory of Open Access Journals (Sweden)

    Sidney Amani

    2015-11-01

    Full Text Available We present the most interesting elements of the correctness specification of BilbyFs, a performant Linux flash file system. The BilbyFs specification supports asynchronous writes, a feature that has been overlooked by several file system verification projects, and has been used to verify the correctness of BilbyFs's fsync() C implementation. It makes use of nondeterminism to be concise and is shallowly-embedded in higher-order logic.

  20. Informed policies

    International Development Research Centre (IDRC) Digital Library (Canada)

    INTERNATIONAL DEVELOPMENT RESEARCH CENTRE. Informed ... more evidence-based policy on social ... Community involvement is key to the success of CBMS in reducing poverty. IDRC ... nationwide network of “telecentres” that ... and holidays for young people to use for ... National Conference on Youth led to the.

  1. Vaccination Policies

    NARCIS (Netherlands)

    Verweij, M.F.

    2013-01-01

    Vaccination involves priming the immune system with an antigenic agent that mimics a virus or bacterium, which results in immunity against the “real” microorganism. Collective vaccination policies have played an important role in the control of infectious disease worldwide. They can serve the

  2. 76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling

    Science.gov (United States)

    2011-07-21

    ... list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010... files from Office 2007 or Office 2010 in an Office 2003 format prior to submission. Dated: July 15, 2011...

  3. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1979-07-01

    User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated
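
The flow the report describes (read formatted user input, convert, and emit a binary interface file) can be sketched generically; the 12-column field layout below is invented for illustration and is not VENTURE's actual card format:

```python
import io
import struct

# A formatted "card image" of three values in fixed 12-column fields (hypothetical layout).
card = "     1.0E+00     2.5E-01     3.0E+02"
values = [float(card[i:i + 12]) for i in range(0, len(card), 12)]

# Write the converted values as a binary interface record (little-endian doubles).
buf = io.BytesIO()
buf.write(struct.pack(f"<{len(values)}d", *values))

# A downstream computation module would read the binary record back directly.
buf.seek(0)
back = struct.unpack(f"<{len(values)}d", buf.read())
print(back)  # -> (1.0, 0.25, 300.0)
```

The point of the binary interface file is that downstream modules skip text parsing entirely and read machine-format records.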

  4. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1976-11-01

    This report presents user input data requirements for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user-oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated

  5. FROM CAD MODEL TO 3D PRINT VIA “STL” FILE FORMAT

    Directory of Open Access Journals (Sweden)

    Cătălin IANCU

    2010-06-01

    Full Text Available The paper presents the STL file format, which is now used for transferring information from CAD software to a 3D printer, to obtain the solid model in rapid prototyping and computer-aided manufacturing. Also presented are the STL format's structure, history, limitations and further development, as well as the new version to come and other similar file formats. As a conclusion, STL files used to transfer data from a CAD package to 3D printers have a series of limitations, and therefore new formats will soon replace them.
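
The ASCII variant of STL encodes each triangular facet as a normal vector plus three vertices between `solid`/`endsolid` markers. A minimal writer sketch (illustrative only, not a full CAD exporter) looks like:

```python
def write_ascii_stl(name, facets):
    """Serialize triangles to the ASCII STL format.
    facets: list of (normal, (v1, v2, v3)) tuples; each normal/vertex is a 3-tuple of floats."""
    lines = [f"solid {name}"]
    for normal, verts in facets:
        lines.append("  facet normal {:e} {:e} {:e}".format(*normal))
        lines.append("    outer loop")
        for v in verts:
            lines.append("      vertex {:e} {:e} {:e}".format(*v))
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# One facet: a unit right triangle in the XY plane, normal pointing along +Z.
tri = ((0.0, 0.0, 1.0), ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
print(write_ascii_stl("demo", [tri]))
```

The format's limitations the paper alludes to are visible even here: STL stores only bare triangles, so topology, units, and color must be carried by newer formats.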

  6. EVALUATED NUCLEAR STRUCTURE DATA FILE. A MANUAL FOR PREPARATION OF DATA SETS

    International Nuclear Information System (INIS)

    TULI, J.K.

    2001-01-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. For every mass number (presently, A ≤ 293), the Evaluated Nuclear Structure Data File (ENSDF) contains evaluated structure information. For masses A ≥ 44, this information is published in the Nuclear Data Sheets; for A < 44, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chain or by nuclide with a varying cycle time dependent on the availability of new information

  7. Benchmarking and monitoring framework for interconnected file synchronization and sharing services

    DEFF Research Database (Denmark)

    Mrówczyński, Piotr; Mościcki, Jakub T.; Lamanna, Massimo

    2018-01-01

    computing and storage infrastructure in the research labs. In this work we present a benchmarking and monitoring framework for file synchronization and sharing services. It allows service providers to monitor the operational status of their services, understand the service behavior under different load...... types and with different network locations of the synchronization clients. The framework is designed as a monitoring and benchmarking tool to provide performance and robustness metrics for interconnected file synchronization and sharing services such as Open Cloud Mesh....

  8. The Fifth Workshop on HPC Best Practices: File Systems and Archives

    Energy Technology Data Exchange (ETDEWEB)

    Hick, Jason; Hules, John; Uselton, Andrew

    2011-11-30

    The workshop on High Performance Computing (HPC) Best Practices on File Systems and Archives was the fifth in a series sponsored jointly by the Department Of Energy (DOE) Office of Science and DOE National Nuclear Security Administration. The workshop gathered technical and management experts for operations of HPC file systems and archives from around the world. Attendees identified and discussed best practices in use at their facilities, and documented findings for the DOE and HPC community in this report.

  9. Silvabase: A flexible data file management system

    Science.gov (United States)

    Lambing, Steven J.; Reynolds, Sandra J.

    1991-01-01

    The need for a more flexible and efficient data file management system for mission planning in the Mission Operations Laboratory (EO) at MSFC has spawned the development of Silvabase. Silvabase is a new data file structure based on a B+ tree data structure. This data organization allows for efficient forward and backward sequential reads, random searches, and appends to existing data. It also provides random insertions and deletions with reasonable efficiency, utilizes storage space well without sacrificing speed, and performs these functions on large volumes of data. Mission planners required that some data be keyed and manipulated in ways not found in a commercial product. Mission planning software is currently being converted to use Silvabase in the Spacelab and Space Station Mission Planning Systems. Silvabase runs on Digital Equipment Corporation's popular VAX/VMS computers in VAX Fortran. Silvabase has unique features involving time histories and intervals such as in operations research. Because of its flexibility and unique capabilities, Silvabase could be used in almost any government or commercial application that requires efficient reads, searches, and appends in medium to large amounts of almost any kind of data.
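
The access pattern Silvabase supports (keyed random searches plus efficient sequential scans over time-ordered records) can be illustrated with a simplified sorted-array index. The class and method names below are hypothetical stand-ins, and a real B+ tree would replace the sorted lists:

```python
import bisect

class TimeKeyedFile:
    """Simplified stand-in for a B+-tree-indexed data file: records kept
    sorted by time key, supporting random search and range scans."""
    def __init__(self):
        self.keys = []    # sorted time keys
        self.values = []  # records, parallel to keys

    def insert(self, t, record):
        i = bisect.bisect_left(self.keys, t)
        self.keys.insert(i, t)
        self.values.insert(i, record)

    def search(self, t):
        """Random search: first record at or after time t."""
        i = bisect.bisect_left(self.keys, t)
        return self.values[i] if i < len(self.keys) else None

    def scan(self, t0, t1):
        """Forward sequential read of all records with keys in [t0, t1)."""
        lo = bisect.bisect_left(self.keys, t0)
        hi = bisect.bisect_left(self.keys, t1)
        return self.values[lo:hi]

f = TimeKeyedFile()
for t, ev in [(30, "LOS"), (10, "AOS"), (20, "maneuver")]:
    f.insert(t, ev)
print(f.search(15))    # -> "maneuver"
print(f.scan(10, 30))  # -> ["AOS", "maneuver"]
```

A B+ tree offers the same interface but keeps insertions and deletions logarithmic and chains its leaves, which is what makes the backward as well as forward sequential reads cheap.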

  10. An Evaluation of American English File Series

    Directory of Open Access Journals (Sweden)

    Behnam Ghasemi

    2012-03-01

    Full Text Available Textbooks play a pivotal role in language learning classrooms. The problem is deciding which, among the wide range of textbooks on the market, is appropriate for a specific classroom and group of learners. In order to evaluate ELT textbooks, theorists and writers have offered different kinds of evaluative frameworks based on a number of principles and criteria. This study evaluates a series of ELT textbooks, namely American English File, using Littlejohn's (1998) evaluative framework to see what the explicit features of the book are, what pedagogic values it has, whether it is in line with its claimed objectives, and what its merits and demerits are. Littlejohn believes that we should evaluate a textbook based on its own pedagogic values and should see what is in it, not what teachers and evaluators think must exist in it. Consequently, his framework is claimed to be devoid of impressionistic ideas, and to be in-depth and objective rather than subjective. Nine ELT experts and ten ELT teachers helped the researcher rate the evaluative checklists. The results of the study show that although a number of shortcomings and drawbacks were found in American English File, it stood up reasonably well to a detailed and in-depth analysis, and its pedagogic values and positive attributes far outweighed its shortcomings. The internal consistency between ratings was computed via the statistical tool of Cronbach's alpha, which indicated a desirable inter-rater reliability.
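
Cronbach's alpha, the inter-rater reliability statistic mentioned above, can be computed directly from its definition, alpha = k/(k-1) * (1 - sum of per-rater variances / variance of the totals). This is a generic sketch, not the study's actual analysis script:

```python
def cronbach_alpha(ratings):
    """Cronbach's alpha. ratings: one list of scores per rater (the 'items'),
    each scoring the same subjects in the same order."""
    k = len(ratings)            # number of raters
    n = len(ratings[0])         # number of subjects rated

    def var(xs):                # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(r[j] for r in ratings) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(r) for r in ratings) / var(totals))

# Two raters scoring three checklist criteria; identical scores give alpha of 1.
print(cronbach_alpha([[4, 2, 5], [4, 2, 5]]))
```

In practice a statistics package would be used, but the hand computation makes clear that alpha rises as raters co-vary relative to their individual score spread.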

  11. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... Keywords: Virus lifecycle, Petri nets, modeling, simulation. .... complex process. Figure 2 .... by creating Matlab files for five different computer ...

  12. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,
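
The core idea (searching for policy parameters by mutation and selection rather than gradient updates) can be sketched with a minimal (1+1) evolution strategy. The fitness function here is a toy stand-in for an episode return, not a real reinforcement-learning environment:

```python
import random

def evolve_policy(fitness, dim, generations=200, sigma=0.1, seed=0):
    """(1+1) evolution strategy: mutate a real-valued policy parameter
    vector and keep the mutant whenever it scores at least as well."""
    rng = random.Random(seed)
    parent = [0.0] * dim
    best = fitness(parent)
    for _ in range(generations):
        child = [w + rng.gauss(0, sigma) for w in parent]
        score = fitness(child)
        if score >= best:
            parent, best = child, score
    return parent, best

# Toy "episode return": maximized when the policy weights reach (1, -1).
fitness = lambda w: -((w[0] - 1) ** 2 + (w[1] + 1) ** 2)
weights, score = evolve_policy(fitness, dim=2)
print(weights, score)
```

Because only relative fitness matters, this kind of search needs no gradient of the return, which is part of why evolutionary methods handle non-differentiable policy representations.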

  13. Download this PDF file

    African Journals Online (AJOL)

    ADOWIE PERE

    Initial reference volume pressure for Boyle's law grain volume (psig). Bo = oil formation ..... Sor (volumetric balance). Voi, Vop, Vp ... (1989): computed variables and derived uncertainty equations for basic properties: pore volume VP (saturation), VP (Boyle's Law), VP (difference).

  14. Download this PDF file

    African Journals Online (AJOL)

    all parts of the world use ICT to promote development, poverty reduction, empowerment and participation .... lines and cellular phones, 3 use personal computers, and 41 are internet users. A more current report for ..... ern telecommunication services to poor people in Bangladesh with the objective of providing access to ...

  15. Download this PDF file

    African Journals Online (AJOL)

    Dr Obe

    indicative of defects in a vibration time series for diagnosis using computers. Such problems may be interpretational in nature arising either ... periodic structure in the vibration time series, it is possible to identify the sources of defect ... implemented and ranked for diagnostic accuracy in order to choose the most appropriate ...

  16. Download this PDF file

    African Journals Online (AJOL)

    uom

    communication. Since the frequency band width is limited, the optimal assignment problem ... 2 CHANNEL ASSIGNMENT PROBLEM IN CELLULAR RADIO .... Distribution Algorithms” (EDAs) (Larranaga, 2002) helps to make prediction of the ..... chaotic simulated annealing', Lecture Notes in Computer Science, vol. 2084 ...

  17. Download this PDF file

    African Journals Online (AJOL)

    dental practice, education, research, and management is called Dental .... Informatics: • Computers, offering immediate access .... the ability for multiple users to work simultaneously, ... Previous versions of the Windows operating systems.

  18. Download this PDF file

    African Journals Online (AJOL)

    phones, TV screens, latest brands of cars, refrigerators and computers readily available at a price right here. Yet when it comes to preventing mothers from ... they will be in touch with reality and through this information they will be able to fight from their political platform for improvement. They are our partners! We need them ...

  19. Download this PDF file

    African Journals Online (AJOL)

    2005-06-20

    Jun 20, 2005 ... that the fruits have high sugar content and low acidity. The air dried seeds of Annona squamosa L. ... Coefficient of similarity was computed using the formula of Sokal and Sneath. (1963). .... Pino, J.A., Rosado, A., Roncal, E., Marbot, R., Aguaro J. and. Gonzalez, G., 1998. Study of the aromatic and flavour.

  20. Download this PDF file

    African Journals Online (AJOL)

    Information Technology

    continuation of traditional practices in the design of new general-purpose dictionaries which were published within this period, the increasingly central role of the computer in dictionary work, and the negative effects computerization is having on lexicography as "industry". Why the change in title? The author hints at an ...

  1. Download this PDF file

    African Journals Online (AJOL)

    abp

    2017-10-18

    Oct 18, 2017 ... Our purpose was to describe the epidemiological, diagnostic, and common causes of new-born haemorrhagic syndrome in paediatric ... exclusive breast feeding, chronic diarrhea and prolonged use of ... ultrasonography and computed tomography (CT) were performed in ..... Paediatric Blood Cancer. 2008 ...

  2. Download this PDF file

    African Journals Online (AJOL)

    gains, revenge, terrorism and simple publicity. People in communication and computer engineering have really been working at their best to introduce new security mechanisms on the one hand, and to make the already existing security mechanisms tighter, in both software and hardware solutions, on the other hand.

  3. Download this PDF file

    African Journals Online (AJOL)

    Mr Olusoji

    for their co-operation and assistance. I must thank Mrs Odengle, Mrs Aigbe and Mr Ojo of the computer science dept, ABU Zaria, whose contribution helped me in using the SPSS software for analysis. ... may help to audit the quality of care in a more meaningful manner. The ICU of the 730-bed Ahmadu Bello University Teaching ...

  4. Download this PDF file

    African Journals Online (AJOL)

    De Don

    It also looked at problems that could be solved by computer and those that could in ... It also reported that the concept of automata, or manlike machines, has ..... symbols in the source program should be considered as representations of ...

  5. Download this PDF file

    African Journals Online (AJOL)

    eobe

    2017-10-04

    Oct 4, 2017 ... problem-solving using a computer, a programmer chooses from a number of ... otherwise the running program may run out of memory required to do its ...... [6] I. Han and M. Kamber, “Data mining concepts and techniques ...

  6. Download this PDF file

    African Journals Online (AJOL)

    PUBLICATIONS1

    layers, which comprise the topsoil, weathered bedrock and the fresh basement rock. The topsoil ... the Dipole–dipole and computed ground resistance distribution revealed typical 3- layer earth ..... zero resistance, the upper limit of 10 ohm was .... Theory and Practice of electrochemical corrosion protection techniques.

  7. Download this PDF file

    African Journals Online (AJOL)

    pc

    the Faculty of Computer Science and Information Technology, Bayero University, Kano-Nigeria. The ... technologies for the implementation of web applications .... View Latest Update. View General Timetable. Customize Timetable. Customize Invigilation Schedules. View Lecture Guidelines/Etiquettes. View Examination ...

  8. Download this PDF file

    African Journals Online (AJOL)

    RAGHAVENDRA

    compression coat weight to 200 mg. Estimation of Tablet Physical Parameters. The prepared tablets were assessed for weight variation, hardness, friability and for drug content. To compute the weight variation, 20 tablets of each formulation were weighed using an electronic weighing balance (Shimadzu, Japan) and ...

  9. The state of social media policies in higher education.

    Science.gov (United States)

    Pomerantz, Jeffrey; Hank, Carolyn; Sugimoto, Cassidy R

    2015-01-01

    This paper presents an analysis of the current state of development of social media policies at institutions of higher education. Content analysis of social media policies for all institutions listed in the Carnegie Classification Data File revealed that less than one-quarter of institutions had an accessible social media policy. Analysis was done by institution and campus unit, finding that social media policies were most likely to appear at doctorate-granting institutions and health, athletics, and library units. Policies required that those affiliated with the institution post appropriate content, represent the unit appropriately, and moderate conversations with coworkers and external agencies. This analysis may inform the development and revision of social media policies across the field of higher education, taking into consideration the rapidly changing landscape of social media, issues of academic freedom, and notions of interoperability with policies at the unit and campus levels.

  10. Energy policy

    International Nuclear Information System (INIS)

    1992-09-01

    Gasoline consumption by passenger cars and light trucks is a major source of air pollution. It also adds to the economy's dependence on petroleum and vulnerability to oil price shocks. Despite these environmental and other costs, called external costs, the price of gasoline, adjusted for inflation, has generally been declining since 1985, encouraging increased consumption. This paper reports that, with these concerns in mind, the Chairman, Subcommittee on Environment, House Committee on Science, Space, and Technology, requested that GAO assess policy options for addressing the external costs of gasoline consumption. To do this, GAO identified six major policy options and evaluated whether they addressed several relevant objectives, including economic growth, environmental quality, equity, petroleum conservation, visibility of costs, energy security, traffic congestion, competitiveness, and administrative feasibility

  11. Internet Policy

    Science.gov (United States)

    Lehr, William H.; Pupillo, Lorenzo Maria

    The Internet is now widely regarded as essential infrastructure for our global economy and society. It is in our homes and businesses. We use it to communicate and socialize, for research, and as a platform for E-commerce. In the late 1990s, much was predicted about what the Internet has become at present; but now, we have actual experience living with the Internet as a critical component of our everyday lives. Although the Internet has already had profound effects, there is much we have yet to realize. The present volume represents a third installment in a collaborative effort to highlight the all-encompassing, multidisciplinary implications of the Internet for public policy. The first installment was conceived in 1998, when we initiated plans to organize an international conference among academic, industry, and government officials to discuss the growing policy agenda posed by the Internet. The conference was hosted by the European Commission in Brussels in 1999 and brought together a diverse mix of perspectives on what the pressing policy issues would be confronting the Internet. All of the concerns identified remain with us today, including how to address the Digital Divide, how to modify intellectual property laws to accommodate the new realities of the Internet, what to do about Internet governance and name-space management, and how to evolve broadcast and telecommunications regulatory frameworks for a converged world.

  12. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  13. Database organization for computer-aided characterization of laser diode

    International Nuclear Information System (INIS)

    Oyedokun, Z.O.

    1988-01-01

    Computer-aided data logging involves a huge amount of data which must be properly managed for optimized storage space and easy access, retrieval and utilization. An organization method is developed to enhance the advantages of computer-based data logging in the testing of the semiconductor injection laser, which optimizes storage space, permits authorized users easy access, and inhibits penetration. This method is based on a unique file identification protocol, a tree structure, and command-file-oriented access procedures

  14. 76 FR 72986 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed...

    Science.gov (United States)

    2011-11-28

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65795; File No. SR-OPRA-2011-04] Options Price... Implement the Datafeed Policy November 21, 2011. Pursuant to Section 11A of the Securities Exchange Act of... Options Price Reporting Authority (``OPRA'') submitted to the Securities and Exchange Commission...

  15. 12 CFR Appendix G to Part 360 - Deposit-Customer Join File Structure

    Science.gov (United States)

    2010-01-01

    ..._Code Relationship CodeThe code indicating how the customer is related to the account. Possible values... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Deposit-Customer Join File Structure G Appendix... GENERAL POLICY RESOLUTION AND RECEIVERSHIP RULES Pt. 360, App. G Appendix G to Part 360—Deposit-Customer...

  16. Learning analytics in serious gaming: uncovering the hidden treasury of game log files

    NARCIS (Netherlands)

    Westera, Wim; Nadolski, Rob; Hummel, Hans

    2018-01-01

    This paper presents an exploratory analysis of existing log files of the VIBOA environmental policy games at Utrecht University. For reasons of statistical power we’ve combined student cohorts 2008, 2009, 2010, and 2011, which led to a sample size of 118 students. The VIBOA games are inquiry-based

  17. 76 FR 20298 - Insurer Reporting Requirements; List of Insurers; Required To File Reports

    Science.gov (United States)

    2011-04-12

    ... vehicle insurers that are required to file reports on their motor vehicle theft loss experiences. An... the agency. Each insurer's report includes information about thefts and recoveries of motor vehicles... more vehicles not covered by theft insurance policies issued by insurers of motor vehicles, other than...

  18. 76 FR 41138 - Insurer Reporting Requirements; List of Insurers Required To File Reports

    Science.gov (United States)

    2011-07-13

    ... passenger motor vehicle insurers that are required to file reports on their motor vehicle theft loss... information about thefts and recoveries of motor vehicles, the rating rules used by the insurer to establish... companies with a fleet of 20 or more vehicles not covered by theft insurance policies issued by insurers of...

  19. 77 FR 28343 - Insurer Reporting Requirements; List of Insurers Required To File Reports

    Science.gov (United States)

    2012-05-14

    ... vehicle insurers that are required to file reports on their motor vehicle theft loss experiences. An... vehicles not covered by theft insurance policies issued by insurers of motor vehicles, other than any... than any governmental entity) used for rental or lease whose vehicles are not covered by theft...

  20. 75 FR 34966 - Insurer Reporting Requirements; List of Insurers Required To File Reports

    Science.gov (United States)

    2010-06-21

    ... vehicle insurers that are required to file reports on their motor vehicle theft loss experiences. An... the agency. Each insurer's report includes information about thefts and recoveries of motor vehicles... vehicles not covered by theft insurance policies issued by insurers of motor vehicles, other than any...

  1. 76 FR 46283 - Peoples Natural Gas Company LLC; Notice of Baseline Filing

    Science.gov (United States)

    2011-08-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-3-001] Peoples Natural Gas Company LLC; Notice of Baseline Filing Take notice that on July 20, 2011, Peoples Natural Gas... provided under Section 311 of the Natural Gas Policy Act of 1978 (NGPA). Any person desiring to participate...

  2. Simulating Policy Processes through Electronic Mail.

    Science.gov (United States)

    Flynn, John P.

    1987-01-01

    Focuses on the use of electronic mail for teaching and learning about social welfare policy processes and compares electronic mail as a simulation medium to more structured computer applications. (Author)

  3. PFS: a distributed and customizable file system

    OpenAIRE

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed. Each of the components in the library is easily replaced by another implementation to accommodate a wide range of applications.

  4. Column: File Cabinet Forensics

    Directory of Open Access Journals (Sweden)

    Simson Garfinkel

    2011-12-01

    Full Text Available Researchers can spend their time reverse engineering, performing reverse analysis, or making substantive contributions to digital forensics science. Although work in all of these areas is important, it is the scientific breakthroughs that are the most critical for addressing the challenges that we face. Reverse engineering is the traditional bread-and-butter of digital forensics research. Companies like Microsoft and Apple deliver computational artifacts (operating systems, applications and phones) to the commercial market. These artifacts are bought and used by billions. Some have evil intent, and (if society is lucky) the computers end up in the hands of law enforcement. Unfortunately the original vendors rarely provide digital forensics tools that make their systems amenable to analysis by law enforcement. Hence the need for reverse engineering. (see PDF for full column)

  5. A comprehensive review of the policy and programmatic response to ...

    African Journals Online (AJOL)

    Journal Home > Vol 46, No 2 (2012) > ... Yet, until recently, they have been neglected and not considered a health priority. ... Methods: Unpublished reports, documents, relevant files of the Ghana Health Service (GHS) were examined ... Keywords: chronic non-communicable diseases, health systems, health policy, funding, ...

  6. 7 CFR 407.8 - The application and policy.

    Science.gov (United States)

    2010-01-01

    ..., DEPARTMENT OF AGRICULTURE GROUP RISK PLAN OF INSURANCE REGULATIONS § 407.8 The application and policy. (a... applicable sales closing date on file in the insurance provider's local office. (b) FCIC or the reinsured... is excessive. The Manager of the Corporation is authorized in any crop year to extend the sales...

  7. Search the SEC website for the latest EDGAR filings

    Data.gov (United States)

    Securities and Exchange Commission — This listing contains the most recent filings for the current official filing date (including filings made after the 5:30pm deadline on the previous filing day)....

  8. POPULATION POLICY OR SOCIAL POLICY?

    Directory of Open Access Journals (Sweden)

    ANDREI STANOIU

    2011-04-01

    Full Text Available After 1989, the demographic situation of Romania's population followed a dramatic, very concerning and dangerous trend. One of the first measures of the new political power was to abolish the very restrictive, anti-human and abusive legal regulation of abortion adopted in 1966 by the communist regime, together with the whole old demographic policy. As a result of this measure and of the worsening economic and social situation of the great majority of the Romanian population, the birth rate declined sharply and, from 1992, the natural demographic growth rate became negative. The absolute number of the Romanian population decreased steadily and, if nothing changes, within the next few decades it will be no larger than 15 million people. At the same time, the demographic ageing of the population will accentuate, generating serious problems from a demographic and socio-economic point of view. Taking into account the present demographic situation and, especially, the foreseen trend of evolution, it is more than clear that urgent, coherent and consistent measures should be taken in order to stop this dangerous demographic evolution, before it is too late, and to avoid, as much as possible, a potential demographic disaster. The problem is: what kind of measures should be taken and what kind of policy should be adopted? Some social scientists believe that a new population policy should be adopted; others believe that rather a social policy should be adopted. The purpose of my paper is to analyze these different opinions and to show that, beyond the dispute over terminology, consistent measures should be taken at governmental level in order to assure a substantial improvement of the demographic situation, not only from a quantitative but also from a qualitative point of view, and to identify some of these measures.

  9. Internet Policy

    Science.gov (United States)

    1999-11-17

    equipment, including information technology, includes, but is not limited to, personal computers and related peripheral equipment and software, office...classified or Sensitive Unclassified Information through the Internet unless the Designated Approving Authority has approved the method of transmission in...writing. 2. Government office equipment, including Information Technology (IT), shall only be used for official purposes, except as specifically

  10. De-identifying a public use microdata file from the Canadian national discharge abstract database

    Directory of Open Access Journals (Sweden)

    Paton David

    2011-08-01

    Full Text Available Abstract Background The Canadian Institute for Health Information (CIHI) collects hospital discharge abstract data (DAD) from Canadian provinces and territories. There are many demands for the disclosure of this data for research and analysis to inform policy making. To expedite the disclosure of data for some of these purposes, the construction of a DAD public use microdata file (PUMF) was considered. Such purposes include: confirming some published results, providing broader feedback to CIHI to improve data quality, training students and fellows, providing an easily accessible data set for researchers to prepare for analyses on the full DAD data set, and serving as a large health data set for computer scientists and statisticians to evaluate analysis and data mining techniques. The objective of this study was to measure the probability of re-identification for records in a PUMF, and to de-identify a national DAD PUMF consisting of 10% of records. Methods Plausible attacks on a PUMF were evaluated. Based on these attacks, the 2008-2009 national DAD was de-identified. A new algorithm was developed to minimize the amount of suppression while maximizing the precision of the data. The acceptable threshold for the probability of correct re-identification of a record was set at between 0.04 and 0.05. Information loss was measured in terms of the extent of suppression and entropy. Results Two different PUMF files were produced, one with geographic information, and one with no geographic information but more clinical information. At a threshold of 0.05, the maximum proportion of records with the diagnosis code suppressed was 20%, but these suppressions represented only 8-9% of all values in the DAD. Our suppression algorithm has less information loss than a more traditional approach to suppression. Smaller regions, patients with longer stays, and age groups that are infrequently admitted to hospitals tend to be the ones with the highest rates of suppression.
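    The threshold logic the abstract describes can be sketched in a few lines. Under the common prosecutor-risk model, an attacker who knows a record's quasi-identifiers re-identifies it with probability 1/(size of its equivalence class), so a 0.05 threshold flags any class with fewer than 20 members. The field names and class sizes below are invented for illustration; the study's actual suppression algorithm is more elaborate.

    ```python
    THRESHOLD = 0.05  # from the abstract: acceptable risk set between 0.04 and 0.05

    # Hypothetical equivalence-class sizes keyed by a quasi-identifier tuple
    # (region, age group); the real DAD uses different and richer fields.
    class_sizes = {
        ("ON", "40-49"): 250,  # 1/250 = 0.004 -> safe to release as-is
        ("NU", "90+"): 3,      # 1/3 = 0.33    -> must be suppressed or generalized
    }

    def must_suppress(size: int, threshold: float = THRESHOLD) -> bool:
        # Prosecutor-risk model: re-identification probability is 1/class_size.
        return 1.0 / size > threshold

    flagged = {qi for qi, n in class_sizes.items() if must_suppress(n)}
    ```

    At the 0.05 threshold the cutoff falls exactly at a class size of 20: a class of 20 (probability 0.05) passes, while a class of 19 (probability ≈ 0.053) is flagged.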

  11. The climate file

    International Nuclear Information System (INIS)

    2010-01-01

    A series of interviews with a member of the IPCC (Intergovernmental Panel on Climate Change) and with researchers gives an overview of scientific knowledge on climate, discusses what a good agreement at the Copenhagen conference could look like, outlines what is at stake in these negotiations, and proposes an overview of the French policy for the struggle against climate change. An article comments on the content of a report published by the CAS (Centre d'Analyse Strategique), and more particularly on the position of Russia and of OPEC before the Copenhagen negotiations. A final article comments on the results of three opinion surveys conducted in France about climate change, its origins and solutions, and about the representation French people have of the greenhouse effect

  12. Legitimizing policies

    DEFF Research Database (Denmark)

    Jørgensen, Martin Bak

    2012-01-01

    The focus of this article is on representations of irregular migration in a Scandinavian context and how irregular migrants are constructed as a target group. A common feature in many (Western-)European states is the difficult attempt to navigate between an urge for control and respecting, upholding and promoting humanitarian aspects of migration management. Legitimizing policies therefore become extremely important as governments have to appease national voters to remain in power and have to respect European regulations and international conventions. Doing so raises questions of social...

  13. Overview and Status of the Ceph File System

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The Ceph file system (CephFS) is the POSIX-compatible distributed file system running on top of Ceph's powerful and stable object store. This presentation will give a general introduction of CephFS and detail the recent work the Ceph team has done to improve its stability and usability. In particular, we will cover directory fragmentation, multiple active metadata servers, and directory subtree pinning to metadata servers, features slated for stability in the imminent Luminous release. This talk will also give an overview of how we are measuring performance of multiple active metadata servers using large on-demand cloud deployments. The results will highlight how CephFS distributes metadata load across metadata servers to achieve scaling. About the speaker Patrick Donnelly is a software engineer at Red Hat, Inc. currently working on the Ceph distributed file system. In 2016 he completed his Ph.D. in computer science at the University of Notre Dame with a dissertation on the topic of file transfer management...
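    The core idea behind multiple active metadata servers is that each directory subtree has exactly one authoritative MDS rank, so metadata operations within a subtree never span servers. The toy model below makes that assignment with a static hash purely for illustration; real CephFS balances subtrees dynamically and honors explicit pins set through the ceph.dir.pin extended attribute, and the NUM_MDS value here is an assumption.

    ```python
    import hashlib

    NUM_MDS = 3  # number of active metadata server ranks (hypothetical)

    def mds_rank(subtree_path: str, num_mds: int = NUM_MDS) -> int:
        """Map a directory subtree to a metadata server rank.

        Toy model of 'one subtree -> one authoritative MDS'; CephFS itself
        migrates subtrees between ranks based on load rather than hashing.
        """
        digest = hashlib.sha256(subtree_path.encode()).digest()
        return int.from_bytes(digest[:8], "big") % num_mds

    # The mapping is deterministic, so every lookup under the same subtree
    # is served by the same rank.
    assert mds_rank("/home/alice") == mds_rank("/home/alice")
    ```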

  14. Chronic Condition Public Use File (PUF)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This release contains the Chronic Conditions Public Use Files (PUF) with information from Medicare claims. The CMS Chronic Conditions PUF is an aggregated file in...

  15. Improving File System Performance by Striping

    Science.gov (United States)

    Lam, Terance L.; Kutler, Paul (Technical Monitor)

    1998-01-01

    This document discusses the performance and advantages of striped file systems on the SGI AD workstations. Performance of several striped file system configurations are compared and guidelines for optimal striping are recommended.
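    Round-robin striping, the scheme such striped file systems typically use, can be illustrated with a small offset-mapping sketch: consecutive stripe units of a file rotate across the member disks, so large sequential reads engage all disks in parallel. The stripe unit size and disk count below are hypothetical, not values from the SGI study.

    ```python
    STRIPE_UNIT = 64 * 1024  # bytes per stripe unit (hypothetical)
    NUM_DISKS = 4            # stripe width (hypothetical)

    def locate(offset: int, stripe_unit: int = STRIPE_UNIT, num_disks: int = NUM_DISKS):
        """Translate a logical file offset into (disk index, offset on that disk)
        under simple round-robin striping."""
        unit = offset // stripe_unit      # which stripe unit the byte falls in
        disk = unit % num_disks           # units rotate across the disks
        stripe_row = unit // num_disks    # full units already placed on this disk
        return disk, stripe_row * stripe_unit + offset % stripe_unit
    ```

    For example, byte 0 lives at the start of disk 0, byte 64 KiB at the start of disk 1, and byte 256 KiB wraps back to disk 0 at its second stripe unit.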

  16. 47 CFR 1.1704 - Station files.

    Science.gov (United States)

    2010-10-01

    ... System (COALS) § 1.1704 Station files. Applications, notifications, correspondence, electronic filings... Television Relay Service (CARS) are maintained by the Commission in COALS and the Public Reference Room...

  17. source files for manuscript in tex format

    Data.gov (United States)

    U.S. Environmental Protection Agency — Source tex files used to create the manuscript including original figure files and raw data used in tables and inline text. This dataset is associated with the...

  18. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  19. Physician-Supplier Procedure Summary Master File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is a 100 percent summary of all Part B Carrier and DMERC Claims processed through the Common Working File and stored in the National Claims History...

  20. Data_files_Reyes_EHP_phthalates

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset contains three files in comma separated values (.csv) format. “Reyes_EHP_Phthalates_US_metabolites.csv” contains information about the National Health and...