WorldWideScience

Sample records for survey computer file

  1. Computer files.

    Science.gov (United States)

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers, regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, you may have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware that the default directories suggested when installing new software are often not optimal. For instance, it is better to put different graphics packages under a common subdirectory rather than to install them at the same level as all other packages, including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is bad practice to keep many different and logically unsorted files in the root directory of any of your volumes. Only system and important service files should be kept there. Although a file may be written anywhere on the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way. (ABSTRACT TRUNCATED AT 250 WORDS)

  2. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks, it has become easier to process the plethora of data generated by various devices in real time. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.
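
    As a minimal illustration of one such protocol, the sketch below uses Python's standard-library ftplib to upload a file over FTPS (FTP secured with TLS). The host name, credentials, and file names are placeholders; the survey itself does not prescribe this particular protocol.

        import ftplib

        HOST = "ftps.example.com"   # hypothetical endpoint

        def upload_securely(local_path, remote_name):
            """Upload one file over FTP secured with TLS (FTPS)."""
            ftps = ftplib.FTP_TLS(HOST)
            ftps.login("user", "password")   # authenticate on the control channel
            ftps.prot_p()                    # switch the data channel to TLS too
            with open(local_path, "rb") as fh:
                ftps.storbinary(f"STOR {remote_name}", fh)
            ftps.quit()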

  3. Administration of Library-Owned Computer Files. SPEC Kit 159.

    Science.gov (United States)

    Shaw, Suzanne J.

    This document reports the results of a follow-up survey of 34 Association of Research Libraries member libraries which was conducted in 1989 to measure changes that had taken place in the administration of computer files (CF)--previously referred to as machine readable data files--since the original survey in 1984. It is noted that this survey…

  4. Permanent-File-Validation Utility Computer Program

    Science.gov (United States)

    Derry, Stephen D.

    1988-01-01

    Errors in files detected and corrected during operation. Permanent File Validation (PFVAL) utility computer program provides CDC CYBER NOS sites with mechanism to verify integrity of permanent file base. Locates and identifies permanent file errors in Mass Storage Table (MST) and Track Reservation Table (TRT), in permanent file catalog entries (PFC's) in permit sectors, and in disk sector linkage. All detected errors written to listing file and system and job day files. Program operates by reading system tables, catalog track, permit sectors, and disk linkage bytes to validate expected and actual file linkages. Used extensively to identify and locate errors in permanent files and enable online correction, reducing computer-system downtime.

  5. Identifiable Data Files - Health Outcomes Survey (HOS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Health Outcomes Survey (HOS) identifiable data files are comprised of the entire national sample for a given 2-year cohort (including both respondents...

  6. 76 FR 4934 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2011-01-27

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  7. 76 FR 23333 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2011-04-26

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  8. 77 FR 42759 - IDAHO: Filing of Plats of Survey

    Science.gov (United States)

    2012-07-20

    ... Bureau of Land Management IDAHO: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  9. 78 FR 64530 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2013-10-29

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  10. 75 FR 66788 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2010-10-29

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plat of survey of the lands described below in the BLM Idaho State Office, Boise,...

  11. 77 FR 3791 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2012-01-25

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  12. 77 FR 77089 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2012-12-31

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Filing of Plats of Surveys. SUMMARY: The Bureau of Land Management (BLM) has officially filed the plats of survey of the lands described below in the BLM Idaho State Office, Boise,...

  13. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
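
    A minimal sketch of the idea rather than the patented implementation, in Python with illustrative names: concatenate many small files into one aggregate, record each file's offset and length as metadata, and use that metadata to unpack a single file without reading the rest.

        import json, os

        def aggregate(paths, out_path):
            """Concatenate small files into one aggregate plus a metadata file."""
            metadata, offset = {}, 0
            with open(out_path, "wb") as out:
                for path in paths:
                    with open(path, "rb") as f:
                        data = f.read()
                    out.write(data)
                    metadata[os.path.basename(path)] = (offset, len(data))
                    offset += len(data)
            with open(out_path + ".meta", "w") as meta:
                json.dump(metadata, meta)   # offset and length of every packed file

        def unpack(out_path, name):
            """Read back one original file using its (offset, length) entry."""
            with open(out_path + ".meta") as meta:
                offset, length = json.load(meta)[name]
            with open(out_path, "rb") as agg:
                agg.seek(offset)
                return agg.read(length)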

  14. 76 FR 50492 - Idaho: Filing of Plats of Survey

    Science.gov (United States)

    2011-08-15

    ... Bureau of Land Management Idaho: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of filing of plats of surveys. SUMMARY: The Bureau of Land Management (BLM) has officially accepted the plat of survey of the lands described below in the BLM Idaho State Office, Boise,...

  15. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  16. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  17. Music information retrieval in compressed audio files: a survey

    Science.gov (United States)

    Zampoglou, Markos; Malamos, Athanasios G.

    2014-07-01

    In this paper, we present an organized survey of the existing literature on music information retrieval systems in which descriptor features are extracted directly from the compressed audio files, without prior decompression to pulse-code modulation format. Avoiding the decompression step and utilizing the readily available compressed-domain information can significantly lighten the computational cost of a music information retrieval system, allowing application to large-scale music databases. We identify a number of systems relying on compressed-domain information and form a systematic classification of the features they extract, the retrieval tasks they tackle, and the degree to which they achieve an actual increase in overall speed, as well as any resulting loss in accuracy. Finally, we discuss recent developments in the field, and the potential research directions they open toward ultra-fast, scalable systems.

  18. 77 FR 24218 - Filing of Plats of Survey; Nevada

    Science.gov (United States)

    2012-04-23

    Filing of Plats of Survey; Nevada AGENCY: Bureau of Land Management, Interior. ACTION: Notice.... ... was executed to meet certain administrative needs of Pershing County Water Conservation District. ... The Plats of Survey of the following described...

  19. 77 FR 50530 - Filing of Plats of Survey; Nevada

    Science.gov (United States)

    2012-08-21

    Filing of Plats of Survey; Nevada AGENCY: Bureau of Land Management, Interior. ACTION: Notice.... ... administrative needs of the Pershing County Water Conservation District. A plat, in 3 sheets, representing the... May 3, 2012. This survey was executed to meet certain administrative needs of the Pershing County...

  20. Survey Stresses Need for Filing Instruction

    Science.gov (United States)

    Stutsman, Galen

    1974-01-01

    With the mass of paperwork which today's high-speed machines can produce, the job of filing and retrieving records and of records management is increasing in importance. Results of a follow-up of previous students in records management emphasize the value of such training in the preparation of future office workers. (Author/SC)

  1. Distributing an executable job load file to compute nodes in a parallel computer

    Science.gov (United States)

    Gooding, Thomas M.

    2016-08-09

    Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
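
    A toy sketch of the recursive rule the abstract describes, assuming a simple in-memory tree of nodes (names invented, not the patented implementation): a node joins the class route if it, or any of its descendants, participates in the job, and the load file is then broadcast along the members of that route.

        class Node:
            def __init__(self, node_id, participating, children=()):
                self.node_id = node_id
                self.participating = participating
                self.children = list(children)

        def build_class_route(node, route):
            """Return True if this subtree participates; collect route members."""
            in_route = node.participating
            for child in node.children:
                in_route |= build_class_route(child, route)
            if in_route:
                route.append(node.node_id)   # node must relay toward descendants
            return in_route

        root = Node(0, False, [Node(1, True), Node(2, False, [Node(3, True)])])
        route = []
        build_class_route(root, route)
        print(sorted(route))   # [0, 1, 2, 3] -- all nodes on paths to participants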

  2. Distributing an executable job load file to compute nodes in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Gooding, Thomas M.

    2016-09-13

    Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.

  3. Sharing digital micrographs and other data files between computers.

    Science.gov (United States)

    Entwistle, A

    2004-01-01

    It ought to be easy to exchange digital micrographs and other computer data files with a colleague, even one on another continent. In practice, this often is not the case. The advantages and disadvantages of various methods that are available for exchanging data files between computers are discussed. When possible, data should be transferred through computer networking. When data are to be exchanged locally between computers with similar operating systems, the use of a local area network is recommended. For computers in commercial or academic environments that have dissimilar operating systems or are more widely spaced, the use of FTP is recommended. Failing this, posting the data on a website and transferring by hypertext transfer protocol is suggested. If peer-to-peer exchange between computers in domestic environments is needed, the use of messenger services such as Microsoft Messenger or Yahoo Messenger is the method of choice. When it is not possible to transfer the data files over the internet, single-use writable CD-ROMs are the best media for transferring data. If for some reason this is not possible, DVD-R/RW, DVD+R/RW, 100 MB ZIP disks, and USB flash media are potentially useful media for exchanging data files.
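
    For the website route the authors suggest, a minimal sketch using only Python's standard library: one machine serves a directory over HTTP and the other fetches a file from it (addresses and file names are placeholders).

        # On the sending machine, serve the current directory over HTTP:
        #     python -m http.server 8000
        # On the receiving machine, fetch a file from it:
        from urllib.request import urlopen

        url = "http://192.0.2.10:8000/micrograph.tif"   # placeholder address
        with urlopen(url) as response, open("micrograph.tif", "wb") as out:
            out.write(response.read())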

  4. 29 CFR 459.1 - Computation of time for filing papers.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Computation of time for filing papers. 459.1 Section 459.1... OF CONDUCT MISCELLANEOUS § 459.1 Computation of time for filing papers. In computing any period of... computations. When these regulations require the filing of any paper, such document must be received by...

  5. NET: an inter-computer file transfer command

    Energy Technology Data Exchange (ETDEWEB)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system.

  6. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  7. 5 CFR 2429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Computation of time for filing papers... REQUIREMENTS General Requirements § 2429.21 Computation of time for filing papers. (a) In computing any period... § 2429.23(a) of this part, when this subchapter requires the filing of any paper with the Authority,...

  8. A Survey on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Poulami dalapati

    2013-06-01

    Full Text Available Cloud Computing is a very recent term which is mainly based on distributed computing, virtualization, utility computing, networking, and web and software services. This kind of service-oriented architecture reduces the information technology overhead for the end user and the total cost of ownership, and supports flexibility and on-demand services.

  9. A survey of computational steering environments

    NARCIS (Netherlands)

    Mulder, J.D.; Wijk, J.J. van; Liere, R. van

    1998-01-01

    Computational steering is a powerful concept that allows scientists to interactively control a computational process during its execution. In this paper, a survey of computational steering environments for the on-line steering of ongoing scientific and engineering simulations is presented. These env

  10. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  11. A Survey of Parallel Computing

    Science.gov (United States)

    1988-07-01

    ... is being developed as a more powerful restructuring compiler that will produce executable code. It will be a multilanguage compiler allowing... JOURNALS AND BOOKS: IEEE Computer: this journal is published by the Computer Society of the Institute of Electrical and Electronics Engineers (IEEE); an annual subscription is included in society member dues. Computer Architecture News: this journal is...

  12. Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory

    Science.gov (United States)

    Green, J. Wayne

    1991-01-01

    This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data-base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.
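
    Purely to illustrate the hierarchical-to-relational shift described above (the table and field names below are invented, not the WDSD's actual definitions), a relational layout links organizations to their offices and data holdings through keys:

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE organization (
                org_id   INTEGER PRIMARY KEY,
                name     TEXT,
                org_type TEXT           -- (1) type of organization
            );
            CREATE TABLE office (       -- (3) offices from which data may be obtained
                office_id INTEGER PRIMARY KEY,
                org_id    INTEGER REFERENCES organization(org_id),
                address   TEXT,
                telephone TEXT
            );
            CREATE TABLE data_holding ( -- (4) data types and collection locations
                holding_id INTEGER PRIMARY KEY,
                org_id     INTEGER REFERENCES organization(org_id),
                data_type  TEXT,
                location   TEXT
            );
        """)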

  13. Development of CONSER Cataloging Policies for Remote Access Computer File Serials.

    Science.gov (United States)

    Anderson, Bill; Hawkins, Les

    1996-01-01

    Describes the development of CONSER (Cooperative Online Serials) policies and practices for cataloging remote access computer file serials. Topics include electronic serials on the Internet, cataloging standards for computer files, OCLC and Internet resources, networked resources as published documents, multiple file formats, sources of…

  14. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights into future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  15. Storing files in a parallel computing system based on user or application specification

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Nick, Jeffrey M.; Grider, Gary; Torres, Aaron

    2016-03-29

    Techniques are provided for storing files in a parallel computing system based on a user-specification. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a specification from the distributed application indicating how the plurality of files should be stored; and storing one or more of the plurality of files in one or more storage nodes of a multi-tier storage system based on the specification. The plurality of files comprise a plurality of complete files and/or a plurality of sub-files. The specification can optionally be processed by a daemon executing on one or more nodes in a multi-tier storage system. The specification indicates how the plurality of files should be stored, for example, identifying one or more storage nodes where the plurality of files should be stored.
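
    A schematic reading of the claim, with invented tier names and paths: the application supplies a specification mapping each file to a storage tier, and a small dispatcher places the files accordingly (a sketch, not the patented method).

        import os, shutil

        TIERS = {"flash": "/mnt/flash", "disk": "/mnt/disk", "archive": "/mnt/tape"}

        def store_files(paths, specification):
            """Place each file in the tier named by its specification entry."""
            for path in paths:
                name = os.path.basename(path)
                tier = specification.get(name, "disk")   # default tier: disk
                shutil.copy(path, os.path.join(TIERS[tier], name))

        # Example specification supplied by the (hypothetical) application:
        spec = {"checkpoint.dat": "flash", "results.h5": "archive"}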

  16. Motivators, Barriers and Concerns in Adoption of Electronic Filing System: Survey Evidence from Malaysian Professional Accountants

    Directory of Open Access Journals (Sweden)

    Ming-Ling Lai

    2010-01-01

    Full Text Available Problem statement: Worldwide, the electronic filing (e-filing) system and its adoption have attracted much attention; however, scholarly study on accounting professionals' acceptance of e-filing systems is scant. Approach: This study aimed (i) to examine factors that motivated professional accountants to use e-filing, (ii) to solicit their usage experience, and (iii) to assess the barriers to adoption and other compliance considerations. The questionnaire survey was administered to 700 professionals from tax practice and commercial sectors who attended "Budget 2008" Tax Seminars, organized by the Malaysian Institute of Accountants in Peninsular Malaysia. In total, 456 usable responses from accounting and tax professionals were collected and analyzed. Results: The survey found that of 456 respondents, just 23.7% had used e-filing in 2007 to file personal tax return forms. The majority of the e-filers opted to use e-filing for the sake of convenience (55.8%), in the faith of getting a faster tax refund (16.8%), and for speed of filing (15.9%). For those who did not use e-filing, the key impediments were concerns over security and distrust of the e-filing system. Some (4.8%) were unable to access the e-filing website. Overall, just 26.1% of the professionals surveyed had confidence in the IRBM managing the e-filing system successfully. The majority (41.2%) thought 'speedy tax refund' to be the most desirable incentive to motivate individuals to use e-filing. Conclusion: As the IRBM is counting on professional accountants to promote the usage of the e-filing system, this study provides important insights to the IRBM for developing marketing and business strategies to motivate professional accountants in business to use e-filing, in order to accelerate the diffusion of the e-filing system in a developing country like Malaysia.

  17. Converting Between PLY and Ballistic Research Laboratory-Computer-Aided Design (BRL-CAD) File Formats

    Science.gov (United States)

    2015-02-01

    Converting Between PLY and Ballistic Research Laboratory-Computer-Aided Design (BRL-CAD) File Formats, by Rishub Jain, US Army Research Laboratory report ARL-CR-0760, February 2015 (contract number W911NF-10-2-0076).

  18. Medical Expenditure Panel Survey (MEPS) Restricted Data Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — Restricted Data Files Available at the Data Centers Researchers and users with approved research projects can access restricted data files that have not been...

  19. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Energy Technology Data Exchange (ETDEWEB)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
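
    One way to picture replicas "with different resolutions", assuming the file holds a sequence of numeric data elements: the primary copy keeps every element and each replica keeps a progressively sparser subset. A hedged sketch, not the patented method:

        def make_replicas(samples, strides=(1, 4, 16)):
            """Return replicas keeping every 1st, 4th, 16th, ... data element."""
            return {f"stride_{s}": samples[::s] for s in strides}

        data = list(range(1000))            # stand-in for a file's data elements
        replicas = make_replicas(data)
        print({k: len(v) for k, v in replicas.items()})
        # {'stride_1': 1000, 'stride_4': 250, 'stride_16': 63}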

  20. 12 CFR 269b.720 - Computation of time for filing papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Computation of time for filing papers. 269b.720... papers. In computing any period of time prescribed by or allowed by the panel, the day of the act, event... regulations in this subchapter require the filing of any paper, such document must be received by the panel...

  1. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    Science.gov (United States)

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…
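
    The modeling step is ordinary logistic regression; here is a sketch with scikit-learn and made-up feature names (the paper's actual log-file predictors are not reproduced here):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical per-problem features mined from tutor log files:
        # [hints requested, time on task (s), prior success rate]
        X = np.array([[0, 120, 0.9], [5, 600, 0.4], [2, 300, 0.7], [7, 800, 0.2]])
        y = np.array([1, 0, 1, 0])          # 1 = problem solved, 0 = failed

        model = LogisticRegression().fit(X, y)
        print(model.predict_proba([[1, 200, 0.8]])[0, 1])   # estimated P(success)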

  2. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  3. Survey: Risk Assessment for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Drissi S.

    2013-01-01

    Full Text Available With the increase in the growth of cloud computing, and the changes in technology that have resulted in new ways for cloud providers to deliver their services to cloud consumers, the cloud consumers should be aware of the risks and vulnerabilities present in the current cloud computing environment. An information security risk assessment is designed specifically for that task. However, there is a lack of a structured risk assessment approach to do it. This paper aims to survey existing knowledge regarding risk assessment for cloud computing and to analyze existing use cases from cloud computing to identify the level of risk assessment realization in state-of-the-art systems and emerging challenges for future research.

  4. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  5. 76 FR 77551 - Notice of Filing of Plats of Survey, New Mexico

    Science.gov (United States)

    2011-12-13

    ... Bureau of Land Management Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land... below are scheduled to be officially filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication....

  6. 78 FR 64974 - Notice of Filing of Plats of Survey, New Mexico

    Science.gov (United States)

    2013-10-30

    ... Bureau of Land Management Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land... below are scheduled to be officially filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication. FOR FURTHER...

  7. 77 FR 17092 - Notice of Filing of Plats of Survey, New Mexico

    Science.gov (United States)

    2012-03-23

    ... Bureau of Land Management Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land... below are scheduled to be officially filed in the New Mexico State Office, Bureau of Land Management, Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication....

  8. 75 FR 17432 - Notice of Filing of Plats of Survey, New Mexico

    Science.gov (United States)

    2010-04-06

    ... Bureau of Land Management Notice of Filing of Plats of Survey, New Mexico AGENCY: Bureau of Land... below are scheduled to be officially filed in the New Mexico State Office, Bureau of Land Management (BLM), Santa Fe, New Mexico, thirty (30) calendar days from the date of this publication....

  9. Heuristic file sorted assignment algorithm of parallel I/O on cluster computing system

    Institute of Scientific and Technical Information of China (English)

    CHEN Zhi-gang; ZENG Bi-qing; XIONG Ce; DENG Xiao-heng; ZENG Zhi-wen; LIU An-feng

    2005-01-01

    A new file assignment strategy for parallel I/O, named the heuristic file sorted assignment algorithm, was proposed for cluster computing systems. Based on load balancing, it assigns files with similar service times to the same disk. First, the files are sorted into a set I in descending order of service time; then, when files are to be assigned, one disk of a cluster node is selected at random, and consecutive files are taken in order from the set I and placed on that disk until the disk reaches its load maximum. The experimental results show that the new strategy improves performance by 20.2% when the load of the system is light and by 31.6% when the load is heavy. Moreover, the higher the data access rate, the more evident the performance improvement obtained by the heuristic file sorted assignment algorithm.
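
    Read literally, the assignment rule fits in a few lines; this Python sketch assumes each file's service time is known in advance and every disk has the same load maximum (all inputs invented):

        import random

        def heuristic_file_sorted_assignment(service_times, disks, load_max):
            """Assign files (sorted by service time, descending) to random disks."""
            files = sorted(service_times.items(), key=lambda kv: kv[1], reverse=True)
            assignment, loads = {}, {d: 0.0 for d in disks}
            i = 0
            while i < len(files):
                open_disks = [d for d in disks if loads[d] + files[i][1] <= load_max]
                if not open_disks:
                    break                         # no disk can take the next file
                disk = random.choice(open_disks)  # pick a disk of a cluster node
                # take consecutive files in order until this disk is full
                while i < len(files) and loads[disk] + files[i][1] <= load_max:
                    name, t = files[i]
                    assignment[name] = disk
                    loads[disk] += t
                    i += 1
            return assignment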

  10. User's guide for MODTOOLS: Computer programs for translating data of MODFLOW and MODPATH into geographic information system files

    Science.gov (United States)

    Orzol, Leonard L.

    1997-01-01

    MODTOOLS is a set of computer programs for translating data of the ground-water model, MODFLOW, and the particle-tracker, MODPATH, into a Geographic Information System (GIS). MODTOOLS translates data into a GIS software called ARC/INFO. MODFLOW is the recognized name for the U.S. Geological Survey Modular Three-Dimensional Finite-Difference Ground-Water Model. MODTOOLS uses the data arrays input to or output by MODFLOW during a ground-water flow simulation to construct several types of GIS output files. MODTOOLS can also be used to translate data from MODPATH into GIS files. MODPATH and its companion program, MODPATH-PLOT, are collectively called the U.S. Geological Survey Three-Dimensional Particle Tracking Post-Processing Programs. MODPATH is used to calculate ground-water flow paths using the results of MODFLOW and MODPATH-PLOT can be used to display the flow paths in various ways.
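
    MODTOOLS itself writes ARC/INFO data sets, but the flavor of such a translation can be shown with the simpler plain-text ARC ASCII grid format: a 2-D model array becomes a georeferenced grid file that GIS software can read (origin and cell size below are invented).

        def write_ascii_grid(array2d, path, xll=0.0, yll=0.0, cell=100.0, nodata=-9999):
            """Write a 2-D model array as an ARC/INFO-style ASCII grid."""
            nrows, ncols = len(array2d), len(array2d[0])
            with open(path, "w") as f:
                f.write(f"ncols {ncols}\nnrows {nrows}\n")
                f.write(f"xllcorner {xll}\nyllcorner {yll}\n")
                f.write(f"cellsize {cell}\nNODATA_value {nodata}\n")
                for row in array2d:
                    f.write(" ".join(str(v) for v in row) + "\n")

        write_ascii_grid([[1.5, 2.0], [2.5, 3.0]], "head_layer1.asc")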

  11. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  12. Southeast Region Headboat Survey-K-factor files

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The k-factor files contain the correction factors calculated from the headboat activity report. A correction factor is calculated for individual vessels by...

  13. PRIVACY IN CLOUD COMPUTING: A SURVEY

    Directory of Open Access Journals (Sweden)

    Arockiam L

    2012-07-01

    Full Text Available Various cloud computing models are used to increase the profit of an organization. The cloud provides a convenient environment and many advantages for business organizations to run their business. But it has some issues related to the privacy of data. Users' data are stored and maintained outside the users' premises. The failure of data protection causes many issues, such as data theft, which affect the individual organization. Cloud users may be satisfied if their data are protected properly from unauthorized access. This paper presents a survey on different privacy issues involved in cloud services. It also provides some suggestions to cloud users for selecting suitable cloud services by knowing their privacy policies.

  14. 75 FR 19656 - Filing of Plats of Survey; Nevada

    Science.gov (United States)

    2010-04-15

    ... section 12 and metes-and-bounds surveys of the easterly and westerly right-of-way lines of the Nevada... subdivisional lines, metes-and-bounds surveys of Tracts 37 and 38, and metes-and-bounds surveys of the easterly... 25, and metes-and-bounds surveys of portions of the easterly and westerly right-of-way lines of...

  15. DISTRIBUTED COMPUTING SUPPORT SERVICE USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all varieties of desktop computers throughout CERN, as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site, and it provides the installation activities of the IT Division PC Service. We have published a questionnaire, which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired, although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link...

  16. DISTRIBUTED COMPUTING SUPPORT CONTRACT USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all varieties of desktop computers throughout CERN, as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site, and it provides the installation activities of the IT Division PC Service. We have published a questionnaire which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired, although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link ...

  17. Technical documentation for the 1990 Nationwide Truck Activity and Commodity Survey Public Use File

    Energy Technology Data Exchange (ETDEWEB)

    1992-09-01

    The Nationwide Truck Activity and Commodity Survey (NTACS) provides detailed activity data for a sample of trucks covered in the 1987 Truck Inventory and Use Survey (TIUS) for days selected at random over a 12-month period ending in 1990. The NTACS was conducted by the US Bureau of the Census for the US Department of Transportation (DOT). A Public Use File for the NTACS was developed by Oak Ridge National Laboratory (ORNL) under a reimbursable agreement with the DOT. The content of the Public Use File and the design of the NTACS are described in this document.

  19. Fast and Easy Searching of Files in Unisys 2200 Computers

    Science.gov (United States)

    Snook, Bryan E.

    2010-01-01

    A program has been written to enable (1) fast and easy searching of symbolic files for one or more strings of characters, dates, or numerical values in specific fields or columns and (2) summarizing results of searching other fields or columns.
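
    The same idea takes only a few lines in Python; the file name and column positions below are invented. The search is restricted to a fixed column range on each line:

        def search_column(path, needle, start, end):
            """Yield lines whose columns [start:end) contain the search string."""
            with open(path) as f:
                for line in f:
                    if needle in line[start:end]:
                        yield line.rstrip("\n")

        # e.g., find records with "2010" somewhere in columns 20-27:
        for hit in search_column("records.txt", "2010", 20, 28):
            print(hit)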

  20. 76 FR 19787 - Filing of Plats of Survey; Nevada

    Science.gov (United States)

    2011-04-08

    ... portions of the easterly and westerly right-of-way lines of the Nevada Northern Railway, Township 20 North...-bounds surveys of portions of the easterly and westerly right-of-way lines of the Nevada Northern Railway... section 11, and metes-and-bounds surveys of portions of the easterly and westerly right-of-way lines...

  1. Synchronizing files or images among several computers or removable devices. A utility to avoid frequent back-ups.

    Science.gov (United States)

    Leonardi, Rosalia; Maiorana, Francesco; Giordano, Daniela

    2008-06-01

    Many of us use and maintain files on more than one computer--a desktop part of the time, and a notebook, a palmtop, or removable devices at other times. It can be easy to forget which device contains the latest version of a particular file, and time-consuming searches often ensue. One way to solve this problem is to use software that synchronizes the files. This allows users to maintain updated versions of the same file in several locations.
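
    The core of such a synchronizer is a comparison of modification times. A minimal two-way sketch in Python (directory paths are placeholders; real tools also handle subdirectories, deletions, and conflicts):

        import os, shutil

        def sync_newest(dir_a, dir_b):
            """Copy each file to the other directory if it is newer or missing there."""
            for name in set(os.listdir(dir_a)) | set(os.listdir(dir_b)):
                a, b = os.path.join(dir_a, name), os.path.join(dir_b, name)
                ta = os.path.getmtime(a) if os.path.exists(a) else -1
                tb = os.path.getmtime(b) if os.path.exists(b) else -1
                if ta > tb:
                    shutil.copy2(a, b)   # copy2 preserves modification times
                elif tb > ta:
                    shutil.copy2(b, a)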

  2. Equivalency of Paper versus Tablet Computer Survey Data

    Science.gov (United States)

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  3. The 2001 Residential Finance Survey - Rental Property File

    Data.gov (United States)

    Department of Housing and Urban Development — The 2001 Residential Finance Survey (RFS) was sponsored by the Department of Housing and Urban Development and conducted by the Census Bureau. The RFS is a follow-on...

  4. The 2001 Residential Finance Survey - Owners Property File

    Data.gov (United States)

    Department of Housing and Urban Development — The 2001 Residential Finance Survey (RFS) was sponsored by the Department of Housing and Urban Development and conducted by the Census Bureau. The RFS is a follow-on...

  5. Computer Education - A Survey of Seventh and Eighth Grade Teachers.

    Science.gov (United States)

    Bassler, Otto; And Others

    Tennessee is in the process of implementing a computer literacy plan for grades 7 and 8. Determining the views of teachers in those grades about computers, what they think students should be taught about computers, and the extent to which they agree with aspects of the plan was the goal of this survey. Data were analyzed from 122 teachers and…

  6. Computing poverty measures with survey data

    OpenAIRE

    Philippe Van Kerm

    2009-01-01

    I discuss estimation of poverty measures from household survey data in Stata and show how to derive analytic standard errors that take into account survey design features. Where needed, standard errors are adjusted for the estimation of the poverty line as a fraction of the mean or median income. The linearization approach based on influence functions is generally applicable to many estimators.
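
    The basic computation being described, sketched in Python rather than Stata: with sampling weights, estimate the headcount ratio against a poverty line set at a fraction (here 60%, an illustrative choice) of the weighted median income. Standard errors and the linearization step are beyond this sketch.

        import numpy as np

        def weighted_median(values, weights):
            order = np.argsort(values)
            v = np.asarray(values, float)[order]
            w = np.asarray(weights, float)[order]
            cum = np.cumsum(w)
            return v[np.searchsorted(cum, 0.5 * cum[-1])]

        def headcount_ratio(income, weight, fraction=0.6):
            """Weighted share of persons below fraction * median income."""
            income, weight = np.asarray(income, float), np.asarray(weight, float)
            line = fraction * weighted_median(income, weight)
            return weight[income < line].sum() / weight.sum()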

  7. Free Oscilloscope Web App Using a Computer Mic, Built-In Sound Library, or Your Own Files

    Science.gov (United States)

    Ball, Edward; Ruiz, Frances; Ruiz, Michael J.

    2017-01-01

    We have developed an online oscilloscope program which allows users to see waveforms by utilizing their computer microphones, selecting from our library of over 30 audio files, and opening any *.mp3 or *.wav file on their computers. The oscilloscope displays real-time signals against time. The oscilloscope has been calibrated so one can make…

  8. r.maxent.lambdas - Computes raw and/or logistic prediction maps from MaxEnt lambdas files

    OpenAIRE

    Blumentrath, Stefan

    2016-01-01

    The script is intended to compute raw and/or logistic prediction maps from a lambdas file produced with MaxEnt 3.3.3e. It will parse the specified lambdas file from MaxEnt 3.3.3e and translate it into an r.mapcalc expression, which is then stored in a temporary file and finally piped to r.mapcalc. If alias names were used in MaxEnt, they can automatically be replaced according to a CSV-like file provided by the user. This file should contain alias names in the first column and...
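
    For orientation only, since the exact constants live in the lambdas file: MaxEnt's logistic output is commonly derived from the raw output and the entropy H of the raw distribution, as in this hedged Python sketch.

        import math

        def logistic_from_raw(raw, entropy):
            """Logistic score from a raw MaxEnt score, using c = exp(H).

            Follows the published transform p = c*raw / (1 + c*raw); treat this
            as an assumption -- the GRASS script reads H from the lambdas file.
            """
            c = math.exp(entropy)
            return c * raw / (1.0 + c * raw)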

  9. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactil...

  10. Participation of VAX VMS computers in IBM file-transfer networks

    Energy Technology Data Exchange (ETDEWEB)

    Raffenetti, R.C.

    1983-01-01

    Communications software written at Argonne National Laboratory enables VAX VMS computer systems to participate as end nodes in a standard IBM file-transfer network. The software, which emulates the IBM Network Job Entry (NJE) protocol, has been in use at Argonne for over two years, and is in use at other installations. The basic NJE services include transfer of print and punch files, job submittal, execution of remote commands, and transmission of user-to-user messages. The transmit services are asynchronous to the user's VMS session and received files are automatically routed to a designated user directory. Access to files is validated according to the VMS protection mechanism. New features which were added recently include application level software to transfer general, sequential files and to bridge the electronic mail systems of VMS and VM/CMS. This paper will review the NJE emulator and describe the design and implementation of the sequential file transfer service. The performance of the emulator will be described. Another paper at this symposium will describe the mail bridge.

  11. A Survey of Civilian Dental Computer Systems.

    Science.gov (United States)

    1988-01-01

    ...marketplace, the orthodontic community continued to pioneer clinical automation through diagnosis, treat[ment]... (1) patient registration, identification... Recoverable citation fragments from the scanned original: New York State Dental Journal 34:76, 1968; Ehrlich, A., The Role of Computers in Dental Practice Management, Champaign, IL: Colwell...; Council on Dental Practice, Report: Dental Computer Vendors, 1984; Medical Bulletin of the US Army Europe 39:14-16, 1982.

  12. A Survey of Computer Science Capstone Course Literature

    Science.gov (United States)

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  14. Free oscilloscope web app using a computer mic, built-in sound library, or your own files

    Science.gov (United States)

    Ball, Edward; Ruiz, Frances; Ruiz, Michael J.

    2017-07-01

    We have developed an online oscilloscope program which allows users to see waveforms by utilizing their computer microphones, selecting from our library of over 30 audio files, and opening any *.mp3 or *.wav file on their computers. The oscilloscope displays real-time signals against time. The oscilloscope has been calibrated so one can make accurate frequency measurements of periodic waves to within 1%. The web app is ideal for computer projection in class.

  15. DOC-a file system cache to support mobile computers

    Science.gov (United States)

    Huizinga, D. M.; Heflinger, K.

    1995-09-01

    This paper identifies design requirements of system-level support for mobile computing in small form-factor battery-powered portable computers and describes their implementation in DOC (Disconnected Operation Cache). DOC is a three-level client caching system designed and implemented to allow mobile clients to transition between connected, partially disconnected and fully disconnected modes of operation with minimal user involvement. Implemented for notebook computers, DOC addresses not only typical issues of mobile elements such as resource scarcity and fluctuations in service quality but also deals with the pitfalls of MS-DOS, the operating system which prevails in the commercial notebook market. Our experiments performed in the software engineering environment of AST Research indicate not only considerable performance gains for connected and partially disconnected modes of DOC, but also the successful operation of the disconnected mode.

  16. Authentication Methods in Cloud Computing: A Survey

    Directory of Open Access Journals (Sweden)

    Mahnoush Babaeizadeh

    2015-03-01

    Full Text Available This study presents a review of the various methods of authentication in a cloud environment. Authentication plays an important role in the security of cloud computing (CC). It protects cloud service providers (CSPs) against various types of attacks, where the aim is to verify a user's identity when a user wishes to request services from cloud servers. There are multiple authentication technologies that verify the identity of a user before granting access to resources.

  17. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  18. Computational Mechanisms for Metaphor in Languages: A Survey

    Institute of Scientific and Technical Information of China (English)

    Chang-Le Zhou; Yun Yang; Xiao-Xi Huang

    2007-01-01

    Metaphor computation has attracted more and more attention because metaphor, to some extent, is the focus of mind and language mechanisms. However, it encounters problems not only due to the rich expressive power of natural language but also due to the cognitive nature of human beings. Therefore machine understanding of metaphor is now becoming a bottleneck in natural language processing and machine translation. This paper first suggests how a metaphor is understood and then presents a survey of current computational approaches, in terms of their linguistic historical roots, underlying foundations, methods and techniques currently used, advantages, limitations, and future trends. A comparison between metaphors in the English and Chinese languages is also introduced because, compared with developments for the English language, Chinese metaphor computation is just at its starting stage. So a separate summarization of current progress made in Chinese metaphor computation is presented. As a conclusion, a few suggestions are proposed for further research on metaphor computation, especially on Chinese metaphor computation.

  19. Cloud computing in data mining – a survey

    Directory of Open Access Journals (Sweden)

    Viktor Nekvapil

    2015-01-01

    Full Text Available Cloud computing in data mining presents a promising solution for businesses willing to analyse their data at lower costs, and for companies which want to utilise their “big data”. In this survey, reasons for using cloud computing solutions in data mining are studied, and the tools corresponding to these reasons are evaluated. The emphasis is laid on the functionality of the tools and their integration with other applications. In total, 13 solutions were evaluated.

  20. Empirical Validation and Application of the Computing Attitudes Survey

    Science.gov (United States)

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  1. The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.

    Science.gov (United States)

    Morton, Herbert C.; Price, Anne Jamieson

    1986-01-01

    Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)

  2. Empirical Validation and Application of the Computing Attitudes Survey

    Science.gov (United States)

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  3. A Newer User Authentication, File encryption and Distributed Server Based Cloud Computing security architecture

    Directory of Open Access Journals (Sweden)

    Kawser Wazed Nafi

    2012-10-01

    Full Text Available The cloud computing platform gives people the opportunity to share resources, services and information among the people of the whole world. In a private cloud system, information is shared among the persons who are in that cloud, which hampers security and the hiding of personal information. In this paper we propose a new security architecture for the cloud computing platform. It ensures secure communication and the hiding of information from others. An AES-based file encryption system and an asynchronous key system for exchanging information or data are included in this model. This structure can be easily applied to the main cloud computing features, e.g. PaaS, SaaS and IaaS. The model also includes a one-time password system for the user authentication process. Our work mainly deals with the security system of the whole cloud computing platform.
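
    The model described combines AES-based file encryption, asymmetric exchange of the file key, and a one-time password for authentication. As a rough Python sketch of two of those ingredients (not the paper's implementation), the fragment below encrypts a file with the cryptography library's AES-based Fernet recipe and derives a time-windowed one-time password with standard-library HMAC; the file name and shared secret are invented for illustration.

        import hashlib
        import hmac
        import struct
        import time

        from cryptography.fernet import Fernet  # AES-128-CBC plus HMAC under the hood

        def encrypt_file(path: str, key: bytes) -> bytes:
            """Encrypt the file contents and return the ciphertext token."""
            with open(path, "rb") as f:
                return Fernet(key).encrypt(f.read())

        def one_time_password(secret: bytes, window: int = 30) -> str:
            """Derive a 6-digit OTP from the current 30-second time window."""
            counter = struct.pack(">Q", int(time.time()) // window)
            digest = hmac.new(secret, counter, hashlib.sha1).digest()
            offset = digest[-1] & 0x0F
            code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
            return f"{code % 1_000_000:06d}"

        key = Fernet.generate_key()              # in the model, exchanged asymmetrically
        token = encrypt_file("report.txt", key)  # hypothetical file name
        print(one_time_password(b"shared-otp-secret"))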

  4. A Survey of Service Composition Mechanisms in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brønsted, Jeppe; Hansen, Klaus Marius; Ingstrup, Mads

    2007-01-01

    Composition of services, i.e., providing new services by combining existing ones, is a pervasive idea in ubiquitous computing. We surveyed the field by looking at what features are actually present in technologies that support service composition in some form. Condensing this into a list of features allowed us to discuss the qualitative merits and drawbacks of various approaches to service composition, focusing in particular on usability, adaptability and efficiency. Moreover, we found that further research is needed into quality-of-service assurance of composites and into contingency management for composites—one of the concerns differentiating service composition in ubiquitous computing from its counterpart in less dynamic settings.

  5. A survey of GPU-based medical image computing techniques.

    Science.gov (United States)

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, better programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for starters and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  6. Current Trends in Cloud Computing A Survey of Cloud Computing Systems

    Directory of Open Access Journals (Sweden)

    Harjit Singh

    2012-06-01

    Full Text Available Cloud computing, which has become an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to offer the sharing of resources, including infrastructure, software, applications and business processes, to the market environment to fulfill elastic demand. In today’s competitive environment, the service vitality, elasticity, choices and flexibility offered by this scalable technology are so attractive that cloud computing is increasingly becoming an integral part of the enterprise computing environment. This paper presents a survey of the current state of cloud computing. It includes a discussion of the evolution process of cloud computing, characteristics of the cloud, and current technologies adopted in cloud computing. This paper also presents a comparative study of cloud computing platforms (Amazon, Google and Microsoft) and their challenges.

  7. Evaluation of clinical data in childhood asthma. Application of a computer file system

    Energy Technology Data Exchange (ETDEWEB)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-10-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations.

  8. Documentation for the Academic Library Survey (ALS) Data File: Fiscal Year 2000 (Public Use). NCES 2006-342

    Science.gov (United States)

    Schmitt, Carl M.

    2006-01-01

    The purpose of this report is to document the procedures and methodologies employed during the Academic Library Survey of 2000. This report is designed to provide guidance and documentation for users of the public-release and restricted-use data files. Information about the universe of academic libraries and how to access this information is…

  9. Findings from the Teaching, Learning, and Computing Survey

    Directory of Open Access Journals (Sweden)

    Henry Jay Becker

    2000-11-01

    Full Text Available Cuban (1986; 2000) has argued that computers are largely incompatible with the requirements of teaching, and that, for the most part, teachers will continue to reject their use as instruments of student work during class. Using data from a nationally representative survey of 4th through 12th grade teachers, this paper demonstrates that although Cuban correctly characterizes frequent use of computers in academic subject classes as a teaching practice of a small and distinct minority, certain conditions make a big difference in the likelihood of a teacher having her students use computers frequently during class time. In particular, academic subject-matter teachers who have at least five computers present in their classroom, who have at least average levels of technical expertise in their use, and who are in the top quartile on a reliable and extensive measure of constructivist teaching philosophy are very likely to have students make regular use of computers during class. More than 3/4 of such teachers have students use word processing programs regularly during class, and a majority are regular users of at least one other type of software besides skill-based games. In addition, other factors, such as an orientation towards depth rather than breadth in their teaching (perhaps caused by limited pressures to cover large amounts of content) and block scheduling structures that provide for long class periods, are also associated with greater use of computers by students during class. Finally, the paper provides evidence that certain approaches to using computers result in students taking greater initiative in using computers outside of class time, approaches consistent with a constructivist teaching philosophy rather than a standards-based, accountability-oriented approach to teaching. Thus, despite their clear minority status as a primary resource in academic subject classroom teaching, computers are playing a major role in at least one major direction of…

  10. Surveying co-located space geodesy techniques for ITRF computation

    Science.gov (United States)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e.: DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible and cannot be measured directly. Likewise, no mechanically determined ex-center with respect to an external and measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g.: non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and of the ex-center vector computation linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements, which will be presented separately.

  11. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

    Full Text Available Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect individuals’ use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and regulatory bodies.

  12. 7 CFR 47.25 - Filing; extensions of time; effective date of filing; computations of time; official notice.

    Science.gov (United States)

    2010-01-01

    ... Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE MARKETING OF PERISHABLE AGRICULTURAL COMMODITIES RULES OF PRACTICE UNDER THE PERISHABLE AGRICULTURAL COMMODITIES ACT Rules Applicable to Reparation Proceedings § 47.25 Filing; extensions of...

  13. A survey of experience-based preference of Nickel-Titanium rotary files and incidence of fracture among general dentists

    Directory of Open Access Journals (Sweden)

    WooCheol Lee

    2012-11-01

    Full Text Available Objectives: The purpose was to investigate the preference and usage technique of NiTi rotary instruments and to retrieve data on the frequency of re-use and the estimated incidence of file separation in clinical practice among general dentists. Materials and Methods: A survey was disseminated via e-mail and on-site to 673 general dentists. The correlation between the operator's experience or preferred technique and the frequency of re-use or incidence of file fracture was assessed. Results: A total of 348 dentists (51.7%) responded. The most frequently used NiTi instrument was ProFile (39.8%), followed by ProTaper. The most preferred preparation technique was crown-down (44.6%). 54.3% of the respondents re-used NiTi files more than 10 times. There was a significant correlation between experience with NiTi files and the number of re-uses (p = 0.0025). 54.6% of the respondents estimated experiencing file separation less than 5 times per year. The frequency of separation was significantly correlated with the instrumentation technique (p = 0.0003). Conclusions: A large number of general dentists in Korea prefer to re-use NiTi rotary files. As their experience with NiTi files increased, the number of re-uses increased, while the frequency of breakage decreased. Operators who adopted the hybrid technique showed less tendency toward separation, even with an increased number of re-uses.

  14. Survey of patient dose in computed tomography in Syria 2009.

    Science.gov (United States)

    Kharita, M H; Khazzam, S

    2010-09-01

    The radiation doses to patients in computed tomography (CT) in Syria have been investigated and compared with similar studies in different countries. This work surveyed 30 CT scanners from six different manufacturers distributed all over Syria. Some of the results in this paper were part of a project launched by the International Atomic Energy Agency in different regions of the world, covering Asia, Africa and Eastern Europe. The dose quantities covered are the CT dose index (CTDI(w)), dose-length product (DLP), effective dose (E) and collective dose. It was found that most CTDI(w) and DLP values were similar to the European reference levels and in line with the results of similar surveys in the world. The results were in good agreement with the UNSCEAR Report 2007. This study concluded with a recommendation for national diagnostic reference levels for the most common CT protocols in Syria. The results can be used as a basis for future optimisation studies in the country.
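
    The dose quantities surveyed are related by simple arithmetic: the dose-length product is the CT dose index multiplied by the irradiated scan length, and effective dose is commonly estimated as E = k x DLP using a body-region conversion coefficient. A minimal sketch of that calculation follows; the coefficients are illustrative values of the kind tabulated in European guidance, not numbers from this study.

        # Region-specific conversion coefficients k (mSv per mGy*cm); the
        # values below are illustrative, not taken from the Syrian survey.
        K_COEFF = {"head": 0.0021, "chest": 0.014, "abdomen": 0.015}

        def dose_length_product(ctdi_mgy: float, scan_length_cm: float) -> float:
            """DLP (mGy*cm) = CT dose index x irradiated scan length (pitch of 1 assumed)."""
            return ctdi_mgy * scan_length_cm

        def effective_dose(dlp: float, region: str) -> float:
            """Effective dose E (mSv) = k x DLP."""
            return K_COEFF[region] * dlp

        dlp = dose_length_product(ctdi_mgy=12.0, scan_length_cm=35.0)  # example chest scan
        print(f"DLP = {dlp:.0f} mGy*cm, E = {effective_dose(dlp, 'chest'):.1f} mSv")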

  15. A Survey of Software Infrastructures and Frameworks for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Christoph Endres

    2005-01-01

    Full Text Available In this survey, we discuss 29 software infrastructures and frameworks which support the construction of distributed interactive systems. They range from small projects with one implemented prototype to large-scale research efforts, and they come from the fields of Augmented Reality (AR), Intelligent Environments, and Distributed Mobile Systems. In their own way, they can all be used to implement various aspects of the ubiquitous computing vision as described by Mark Weiser [60]. This survey is meant as a starting point for new projects, in order to choose an existing infrastructure for reuse, or to get an overview before designing a new one. It tries to provide a systematic, relatively broad (and necessarily not very deep) overview, while pointing to relevant literature for in-depth study of the systems discussed.

  16. A Survey of Service Composition Mechanisms in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brønsted, Jeppe; Hansen, Klaus Marius; Ingstrup, Mads

    2007-01-01

    Composition of services, i.e., providing new services by combining existing ones, is a pervasive idea in ubiquitous computing. We surveyed the field by looking at what features are actually present in technologies that support service composition in some form. Condensing this into a list of features allowed us to discuss the qualitative merits and drawbacks of various approaches to service composition, focusing in particular on usability, adaptability and efficiency. Moreover, we found that further research is needed into quality-of-service assurance of composites and into contingency management for composites—one of the concerns differentiating service composition in ubiquitous computing from its counterpart in less dynamic settings.

  17. Noise-based deterministic logic and computing: a brief survey

    CERN Document Server

    Kish, Laszlo B; Bezrukov, Sergey M; Peper, Ferdinand; Gingl, Zoltan; Horvath, Tamas

    2010-01-01

    A short survey is provided about our recent explorations of the young topic of noise-based logic. After outlining the motivation behind noise-based computation schemes, we present a short summary of our ongoing efforts in the introduction, development and design of several noise-based deterministic multivalued logic schemes and elements. In particular, we describe classical, instantaneous, continuum, spike and random-telegraph-signal based schemes with applications such as circuits that emulate the brain's functioning and string verification via a slow communication channel.

  18. A survey on top security threats in cloud computing

    Directory of Open Access Journals (Sweden)

    Muhammad Kazim

    2015-03-01

    Full Text Available Cloud computing enables the sharing of resources such as storage, network, applications and software through the internet. Cloud users can lease multiple resources according to their requirements, and pay only for the services they use. However, despite all cloud benefits there are many security concerns related to hardware, virtualization, network, data and service providers that act as a significant barrier in the adoption of cloud in the IT industry. In this paper, we survey the top security concerns related to cloud computing. For each of these security threats we describe (i) how it can be used to exploit cloud components and its effect on cloud entities such as providers and users, and (ii) the security solutions that must be taken to prevent these threats. These solutions include the security techniques from existing literature as well as the best security practices that must be followed by cloud administrators.

  19. A Survey on Security Issues in Cloud Computing

    CERN Document Server

    Bhadauria, Rohit; Chaki, Nabendu; Sanyal, Sugata

    2011-01-01

    Cloud Computing holds the potential to eliminate the requirements for setting up of high-cost computing infrastructure for the IT-based solutions and services that the industry uses. It promises to provide a flexible IT architecture, accessible through internet for lightweight portable devices. This would allow many-fold increase in the capacity or capabilities of the existing and new software. In a cloud computing environment, the entire data reside over a set of networked resources, enabling the data to be accessed through virtual machines. Since these data centers may lie in any corner of the world beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and taken care of. Also, one can never deny the possibility of a server breakdown that has been witnessed, rather quite often in the recent times. There are various issues that need to be dealt with respect to security and privacy in a cloud computing scenario. This extensive survey paper aims...

  20. INFLUENCE OF EVOLUTIONARY COMPUTING ON NUTRITION RECOMMENDATION: A SURVEY

    Directory of Open Access Journals (Sweden)

    S. A. AlFayoumi

    2014-01-01

    Full Text Available This study is a survey of how evolutionary computing has not yet played its important role in a vital field such as nutrition. Evolutionary computing is a subset of the artificial intelligence umbrella that involves continuous optimization and combinatorial optimization based on searching methodologies. It also offers many algorithms that have played a main role in supporting decision-making processes accurately and effectively. It concerns many fields in our life, such as industry, agriculture, engineering, transportation, medicine and nutrition. One of these algorithms is the Genetic Algorithm (GA), which has contributed to many fields. Moreover, nutrition is a wide field of research because it has several sides: medical, physical, psychological and so on. But have genetic algorithms been used to contribute to the field of nutrition? This survey illustrates that GA is not involved in computerized nutrition models or applications, and it suggests, as future work, building a model that promotes a nutrition system using this powerful algorithm.
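
    As a concrete illustration of the kind of contribution the survey calls for, the toy genetic algorithm below evolves a day's food servings toward a calorie target; the foods, calorie values, and GA parameters are all invented for the example.

        import random

        FOODS = {"rice": 200, "chicken": 240, "salad": 50, "yogurt": 120, "bread": 80}
        NAMES = list(FOODS)
        TARGET_KCAL = 2000

        def fitness(servings):
            """Closer to the calorie target means fitter (higher score)."""
            kcal = sum(n * FOODS[name] for n, name in zip(servings, NAMES))
            return -abs(kcal - TARGET_KCAL)

        def mutate(servings):
            """Nudge one food's serving count up or down (never below zero)."""
            child = list(servings)
            i = random.randrange(len(child))
            child[i] = max(0, child[i] + random.choice([-1, 1]))
            return child

        def crossover(a, b):
            """Single-point crossover of two serving vectors."""
            cut = random.randrange(1, len(a))
            return a[:cut] + b[cut:]

        population = [[random.randint(0, 5) for _ in NAMES] for _ in range(30)]
        for _ in range(200):
            population.sort(key=fitness, reverse=True)
            parents = population[:10]  # truncation selection
            population = parents + [
                mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)
            ]
        best = max(population, key=fitness)
        print(dict(zip(NAMES, best)))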

  1. Scanned Hardcopy Maps, Plat of Survey maps filed in the Manitowoc County Real Propery Lister's Office., Published in unknown, Manitowoc County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Scanned Hardcopy Maps dataset, was produced all or in part from Hardcopy Maps information as of unknown. It is described as 'Plat of Survey maps filed in the...

  2. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    Energy Technology Data Exchange (ETDEWEB)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-08-23

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry.

  3. Modern Cosmology: Interactive Computer Simulations that use Recent Observational Surveys

    CERN Document Server

    Moldenhauer, Jacob; Stone, Keenan; Shuler, Ezekiel

    2013-01-01

    We present a collection of new, open-source computational tools for numerically modeling recent large-scale observational data sets using modern cosmology theory. Specifically, these tools will allow both students and researchers to constrain the parameter values in competitive cosmological models, thereby discovering both the accelerated expansion of the universe and its composition (e.g., dark matter and dark energy). These programs have several features to help the non-cosmologist build an understanding of cosmological models and their relation to observational data: a built-in collection of several real observational data sets; sliders to vary the values of the parameters that define different cosmological models; real-time plotting of simulated data; and $\\chi^2$ calculations of the goodness of fit for each choice of parameters (theory) and observational data (experiment). The current list of built-in observations includes several recent supernovae Type Ia surveys, baryon acoustic oscillations, the cosmi...
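
    The goodness-of-fit computation those tools perform reduces to the usual chi-squared sum over data points, chi^2 = sum(((observed - model) / sigma)^2). A minimal stand-alone version follows, with invented placeholder data rather than any of the built-in surveys.

        def chi_squared(observed, model, sigma):
            """chi^2 = sum over points of ((observed - model) / sigma)^2."""
            return sum(((o - m) / s) ** 2 for o, m, s in zip(observed, model, sigma))

        # hypothetical distance-modulus data versus one model's prediction
        observed = [34.1, 36.8, 38.9, 40.2]
        model = [34.0, 36.9, 39.1, 40.0]
        sigma = [0.15, 0.20, 0.25, 0.30]
        print(f"chi^2 = {chi_squared(observed, model, sigma):.2f}")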

  4. Project Integration Architecture (PIA) and Computational Analysis Programming Interface (CAPRI) for Accessing Geometry Data from CAD Files

    Science.gov (United States)

    Benyo, Theresa L.

    2002-01-01

    Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD-vendor-neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD-vendor-neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.

  5. Instructional Uses of Computers in Higher Education: A Survey of Higher Education in Massachusetts.

    Science.gov (United States)

    Demb, Ada Barbara

    A survey of computer use was conducted in 1974 in a small, nonrandom sample of Massachusetts colleges and universities. Allowing for inflation, but adjusting for the increase in computer power per dollar, it is clear that significantly more computer power is being devoted to instruction--both "with" and "about" the computer.…

  6. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O' Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)
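
    The record layout suggests a natural in-memory representation; the sketch below models one hydrothermal site record with the data elements listed above and screens records by subsurface temperature. The field names and example values are invented, not taken from the actual file.

        from dataclasses import dataclass

        @dataclass
        class HydrothermalSite:
            latitude: float
            longitude: float
            township: str
            range_: str
            section: int
            surface_temp_c: float
            subsurface_temp_c: float
            field_potential_mw: float
            well_depth_m: float

        sites = [
            HydrothermalSite(39.5, -119.8, "T20N", "R20E", 14, 35.0, 160.0, 12.5, 1500.0),
            HydrothermalSite(38.8, -122.8, "T11N", "R9W", 3, 48.0, 240.0, 60.0, 2200.0),
        ]
        hot = [s for s in sites if s.subsurface_temp_c >= 150.0]  # simple screening query
        print(len(hot), "site(s) at or above 150 C")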

  7. Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey: polygon format files

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set depicts land use and land cover from the 1970s and 1980s and has been previously published by the U.S. Geological Survey (USGS) in other file formats....

  8. Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey: raster format files

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set depicts land use and land cover from the 1970s and 1980s and has been previously published by the U.S. Geological Survey (USGS) in other file formats....

  9. Novel Framework for Hidden Data in the Image Page within Executable File Using Computation between Advanced Encryption Standard and Distortion Techniques

    CERN Document Server

    Naji, A W; Zaidan, B B; Al-Khateeb, Wajdi F; Khalifa, Othman O; Zaidan, A A; Gunawan, Teddy S

    2009-01-01

    The hurried development of multimedia and the internet allows for wide distribution of digital media data. It becomes much easier to edit, modify and duplicate digital information. In addition, digital documents are also easy to copy and distribute, and therefore they may face many threats. It became necessary to find appropriate protection due to the significance, accuracy and sensitivity of the information. Furthermore, there is no formal method to be followed to discover hidden data. In this paper, a new information hiding framework is presented. The proposed framework's aim is the implementation of a computation between the Advanced Encryption Standard (AES) and a distortion technique (DT), which embeds information in an image page within an executable file (EXE file), to find a secure solution to cover files without changing the size of the cover file. The framework includes two main functions; the first is the hiding of the information in the image page of the EXE file, through the execution of four processes (specify the cover file, spec...
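
    For orientation only, the fragment below shows generic least-significant-bit embedding into a byte buffer, which conveys the basic idea of hiding a payload inside an image region without changing its size. It is a stand-in illustration, not the AES-plus-distortion framework the paper proposes.

        def embed(cover: bytearray, payload: bytes) -> bytearray:
            """Hide payload bits in the lowest bit of successive cover bytes."""
            bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
            if len(bits) > len(cover):
                raise ValueError("payload too large for cover region")
            for i, bit in enumerate(bits):
                cover[i] = (cover[i] & 0xFE) | bit  # rewrite lowest bit only
            return cover

        def extract(cover: bytes, n_bytes: int) -> bytes:
            """Recover n_bytes previously embedded with embed()."""
            out = bytearray()
            for b in range(n_bytes):
                byte = 0
                for i in range(8):
                    byte |= (cover[b * 8 + i] & 1) << i
                out.append(byte)
            return bytes(out)

        region = bytearray(range(256))         # stand-in for image-page bytes
        stego = embed(region, b"secret")
        assert extract(stego, 6) == b"secret"  # cover size is unchanged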

  10. 7 CFR 900.15 - Filing; extensions of time; effective date of filing; and computation of time.

    Science.gov (United States)

    2010-01-01

    ...; and computation of time. 900.15 Section 900.15 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE GENERAL REGULATIONS Rules of Practice and Procedure...

  11. Parallel MOPEX: Computing Mosaics of Large-Area Spitzer Surveys on a Cluster Computer

    Directory of Open Access Journals (Sweden)

    Joseph C. Jacob

    2007-01-01

    Full Text Available The Spitzer Science Center's MOPEX software is a part of the Spitzer Space Telescope's operational pipeline that enables detection of cosmic ray collisions with the detector array, masking of the corrupted pixels due to these collisions, subsequent mosaicking of image fields, and extraction of point sources to create catalogs of celestial objects. This paper reports on our experiences in parallelizing the parts of MOPEX related to cosmic ray rejection and mosaicking on a 1,024-processor cluster computer at NASA's Jet Propulsion Laboratory. The architecture and performance of the new Parallel MOPEX software are described. This work was done in order to rapidly mosaic the IRAC shallow survey data, covering a region of the sky observed with one of Spitzer's infrared instruments for the study of galaxy clusters, large-scale structure, and brown dwarfs.

  12. How Undergraduates Learn Computer Skills: Results of a Survey and Focus Group.

    Science.gov (United States)

    Davis, Philip

    1999-01-01

    Reports on the quantitative findings from a survey of Cornell University undergraduates and on qualitative findings from a prior focus group that investigated computer literacy and the most effective methods to learn computer-literacy skills. Implications for supporting student computing needs are discussed. (LRW)

  13. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Full Text Available Purpose: A computational system was developed for this paper in the C++ programming language, to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system with regard to each seed’s location and its depth, and an entry file is automatically created with all the cards (instructions) for each seed regarding their cell blocks and surfaces spread out spatially in the 3D environment. The system provides a precise reproduction of the clinical scenario for the MCNPX code’s simulation environment, thereby allowing the technique’s in-depth study. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds that were inserted in the phantom’s MAX06 prostate region, with initial activity determined for the seeds at the 0.27 mCi value. Isodose curves were obtained in all the prostate slices in 5 mm steps in the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cutoffs at 4 keV and forced collisions regarding cells of interest. Through the acquisition of isodose curves, the results obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of the sources’ correct positioning, which the computational system developed provides, in order not to release excessive doses in adjacent risk organs. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume, at the same time that it demonstrates a large…
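
    The generation step described, one cell card and one surface card per seed, can be pictured with the schematic Python analogue below (the original system is written in C++). The card layout, material number, density, and seed radius are placeholders rather than the study's actual values.

        SEED_RADIUS_CM = 0.04  # placeholder seed radius
        MATERIAL = 1           # placeholder material card number
        DENSITY = -7.86        # placeholder mass density (MCNP negative = g/cm^3)

        def seed_cards(positions):
            """Emit schematic MCNP-style cell and surface cards, one pair per seed."""
            cells, surfaces = [], []
            for k, (x, y, z) in enumerate(positions, start=1):
                surf_id, cell_id = 100 + k, 200 + k
                surfaces.append(f"{surf_id} s {x:.3f} {y:.3f} {z:.3f} {SEED_RADIUS_CM}")
                cells.append(f"{cell_id} {MATERIAL} {DENSITY} -{surf_id}")
            return "\n".join(cells), "\n".join(surfaces)

        # three seeds on a virtual template grid (coordinates in cm, invented)
        grid = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.5), (0.0, 0.5, 2.0)]
        cell_block, surface_block = seed_cards(grid)
        print(cell_block + "\n\n" + surface_block)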

  14. A Survey on Interoperability in the Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Bahman Rashidi

    2013-07-01

    Full Text Available In recent years, cloud computing has been one of the top ten new technologies, providing various services such as software, platform and infrastructure for internet users. Cloud computing is a promising IT paradigm which enables the Internet's evolution into a global market of collaborating services. In order to provide better services for cloud customers, cloud providers need services that cooperate with other services. Therefore, cloud computing semantic interoperability plays a key role in cloud computing services. In this paper, we address interoperability issues in cloud computing environments. After a description of cloud computing interoperability from different aspects and references, we describe two architectures of cloud service interoperability. Architecturally, we classify existing interoperability challenges and describe them. Moreover, we use these aspects to discuss and compare several interoperability approaches.

  15. The efficacy of the Self-Adjusting File versus WaveOne in removal of root filling residue that remains in oval canals after the use of ProTaper retreatment files: A cone-beam computed tomography study

    Directory of Open Access Journals (Sweden)

    Ajinkya M Pawar

    2016-01-01

    Full Text Available Aim: The current ex vivo study compared the efficacy of removing root fillings using ProTaper retreatment files followed by either the WaveOne reciprocating file or the Self-Adjusting File (SAF). Materials and Methods: Forty maxillary canines with a single oval root canal were selected and sectioned to obtain 18-mm root segments. The root canals were instrumented with WaveOne primary files, followed by obturation using warm lateral compaction, and the sealer was allowed to fully set. The teeth were then divided into two equal groups (N = 20). Initial removal of the bulk of root filling material was performed with ProTaper retreatment files, followed by either WaveOne files (Group 1) or the SAF (Group 2). Endosolv R was used as a gutta-percha softener. Preoperative and postoperative high-resolution cone-beam computed tomography (CBCT) was used to measure the volume of the root filling residue that was left after the procedure. Statistical analysis was performed using the t-test. Results: The mean volume of root filling residue in Group 1 was 9.4 (±0.5) mm³, whereas in Group 2 the residue volume was 2.6 (±0.4) mm³ (P < 0.001; t-test). Conclusions: When the SAF was used after ProTaper retreatment files, significantly less root filling residue was left in the canals compared to when WaveOne was used.

  16. A Survey of DNA Computing and DNA Computer

    Institute of Scientific and Technical Information of China (English)

    丁建立; 陈增强; 袁著祉

    2003-01-01

    DNA computing is a new method based on biochemical reactions and molecular biology technology. The paper first introduces the basic principles and advantages of DNA computing, then surveys DNA computing and DNA computers, and finally points out the current problems and future research directions of DNA computing and DNA computers.

  17. A Fault-Tolerant File Management Algorithm in Distributed Computer System “THUDS”

    Institute of Scientific and Technical Information of China (English)

    廖先Shi; 金兰

    1989-01-01

    Concurrency control that protects a critical section from simultaneous access by independent processes is discussed for the case where there are two distinct classes of processes, known as readers and writers. The readers can share the file with one another, but interleaved execution of readers and writers may produce undesirable conflicts. The file management algorithm proposed in this paper avoids these conflicts. This algorithm not only guarantees the consistency and integrity of the shared file, but also supports optimal parallelism. The concept of a dynamic virtual queue is introduced and serves as the foundation for this algorithm. Our algorithm, with its implicit redundancy, allows software fault-tolerance techniques.
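
    The readers-and-writers discipline at the heart of the algorithm, where readers share the file but a writer excludes everyone, can be sketched with a compact lock. This illustrates the classic discipline only, not the paper's dynamic virtual queue or its fault-tolerance machinery.

        import threading

        class ReadersWriterLock:
            """First readers-writers discipline: readers share, writers exclude."""

            def __init__(self):
                self._readers = 0
                self._mutex = threading.Lock()  # guards the reader count
                self._write = threading.Lock()  # held while any writing occurs

            def acquire_read(self):
                with self._mutex:
                    self._readers += 1
                    if self._readers == 1:
                        self._write.acquire()   # first reader blocks writers

            def release_read(self):
                with self._mutex:
                    self._readers -= 1
                    if self._readers == 0:
                        self._write.release()   # last reader admits writers

            def acquire_write(self):
                self._write.acquire()

            def release_write(self):
                self._write.release()

        lock = ReadersWriterLock()
        lock.acquire_read(); lock.release_read()    # readers interleave freely
        lock.acquire_write(); lock.release_write()  # a writer runs alone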

  18. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps…

  19. A Survey of Formal Models for Computer Security.

    Science.gov (United States)

    1981-09-30

    …presenting the individual models. 6.1 Basic Concepts and Trends: The finite state machine model for computation views a computer system as a finite… top-level specification. The simplest description of the top-level model for DSU is given by Walker, et al. [36]. It is a finite state machine model, with…

  1. Data and shape files for the sedimentation survey of Lago La Plata, Toa Alta, Puerto Rico

    Science.gov (United States)

    Gomez-Fragoso, Julieta

    2016-01-01

    This data release contains spatial data associated with the sedimentation survey conducted by the U.S. Geological Survey for Lago La Plata, Toa Alta, Puerto Rico, during March and April 2015, to provide up-to-date information about the relation of pool elevation to storage volume in the reservoir. The survey was conducted in cooperation with the Puerto Rico Aqueduct and Sewer Authority. A total of 264 navigation lines were surveyed, using a depth-sounding device coupled to a global positioning system. The results of the survey were used to prepare a bathymetric map showing the reservoir bottom referenced with respect to the spillway elevation.

  2. A Survey of Distributed Capability File Systems and Their Application to Cloud Environments

    Science.gov (United States)

    2014-09-01

  3. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  4. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  5. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    Full Text Available The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. At the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of ‘copyright industries’ that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party’s legislative input. The essay will then examine the central issues raised by illegal file sharing, in particular the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party’s proposal is a viable solution to the music industry’s problems.

  6. Models of parallel computation: a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We introduce various models that were developed during the past decades. According to their targeting architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  7. School Survey on Crime and Safety (SSOCS) 2000 Public-Use Data Files, User's Manual, and Detailed Data Documentation. [CD-ROM].

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    This CD-ROM contains the raw, public-use data from the 2000 School Survey on Crime and Safety (SSOCS) along with a User's Manual and Detailed Data Documentation. The data are provided in SAS, SPSS, STATA, and ASCII formats. The User's Manual and the Detailed Data Documentation are provided as .pdf files. (Author)

  8. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    Science.gov (United States)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared Computer Generated Holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype, which includes testing Diffractive Optical Elements (DOEs). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  9. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    Science.gov (United States)

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  10. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Full Text Available Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers mainly rely on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called the directed tree with linked leaf model is proposed for further implementation in data structures and algorithms. Extensive analysis is given justifying the soundness and completeness of F2AC.
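
    The directed tree with linked leaf model can be pictured as privileges flowing down a tree of principals, so that authorization is delegated iteratively and revoked by pruning a subtree. The sketch below captures that intuition with invented class and method names; it is not the paper's actual data structure.

        class Principal:
            """A node in a delegation tree; privileges only narrow downward."""

            def __init__(self, name, privileges):
                self.name = name
                self.privileges = set(privileges)
                self.children = []

            def delegate(self, child_name, privileges):
                """Grant a child a subset of this node's own privileges."""
                child = Principal(child_name, self.privileges & set(privileges))
                self.children.append(child)
                return child

            def revoke(self, child_name):
                """Remove a child, and with it the child's whole subtree."""
                self.children = [c for c in self.children if c.name != child_name]

            def can(self, name, privilege):
                """Check access by walking the tree from this node."""
                if self.name == name:
                    return privilege in self.privileges
                return any(c.can(name, privilege) for c in self.children)

        owner = Principal("owner", {"read", "write", "share"})
        friend = owner.delegate("friend", {"read", "share"})
        friend.delegate("friend-of-friend", {"read"})  # iterative authorization
        print(owner.can("friend-of-friend", "read"))   # True
        owner.revoke("friend")                         # revocation prunes the subtree
        print(owner.can("friend-of-friend", "read"))   # False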

  11. Multigrid Methods on Parallel Computers: A Survey on Recent Developments

    Science.gov (United States)

    1990-12-01

    …multi-color (red-black, four color etc.) ordering of the grid points. Clearly, computation of defects, interpolation and restriction can be also…

    Table 6: Evaluated time.

  12. A Survey of Computer Use Reported in "Teaching of Psychology": 1974-1988.

    Science.gov (United States)

    Beins, Bernard C.

    1989-01-01

    Surveys computer use and the changing role of teachers in the development of computer applications by looking at manuscripts published in "Teaching of Psychology" from 1974 to 1988. Notes that although more teachers are developing software, many teachers do not have the time required for such development. (KO)

  13. Web surveys optimized for smartphones: are there differences between computer and smartphone users?

    OpenAIRE

    Andreadis, Ioannis

    2015-01-01

    "This paper shows that computer users and smartphone users taking part in a web survey optimized for smartphones give responses of almost the same quality. Combining a design of one question in each page and innovative page navigation methods, we can get high quality data by both computer and smartphone users. The two groups of users are also compared with regard to their precisely measured item response times. The analysis shows that using a smartphone instead of a computer increases about 2...

  14. Aqueous Computing:A Survey with an Invitation to Participate

    Institute of Scientific and Technical Information of China (English)

    Tom Head; Xia Chen; Masayuki Yamamura; Susannah Gal

    2002-01-01

    The concept of aqueous computing is presented here, first in full generality, and afterward, using an implementation in a specific enzymatic technology. Aqueous computing arose in the context of biomolecular (DNA) computing, but the concept is independent of the specifics of its biochemical origin. Alternate technologies for realizing aqueous computing are being considered for future implementation. A solution of an instance of the Boolean satisfiability problem, (SAT), is reported here that provides a new example of an aqueous computation that has been carried out successfully. This small instance of the SAT problem is sufficiently complex to allow our current enzymatic technology to be illustrated in detail. The reader is invited to participate in the rich interdisciplinary activity required by wet lab computing. A project is suggested to the reader for determining the three-colorings of a graph. The basic operations required for this project are exhibited in the solution of the SAT example reported here.
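
    The suggested project, determining the three-colorings of a graph, has a simple electronic counterpart that enumerates every assignment of three colors and keeps the proper ones; the wet-lab protocol would realize the same search chemically.

        from itertools import product

        def three_colorings(vertices, edges):
            """Yield every proper 3-coloring of the given graph."""
            for assignment in product(range(3), repeat=len(vertices)):
                coloring = dict(zip(vertices, assignment))
                if all(coloring[u] != coloring[v] for u, v in edges):
                    yield coloring

        # a 4-cycle as a small example graph
        V = ["a", "b", "c", "d"]
        E = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
        print(sum(1 for _ in three_colorings(V, E)))  # 18 proper 3-colorings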

  15. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    OpenAIRE

    Arvind Kumar Tiwari; Rajeev Srivastava

    2014-01-01

    During the past, there was a massive growth of knowledge of unknown proteins with the advancement of high throughput microarray technologies. Protein function prediction is the most challenging problem in bioinformatics. In the past, the homology based approaches were used to predict the protein function, but they failed when a new protein was different from the previous one. Therefore, to alleviate the problems associated with homology based traditional approaches, numerous computational int...

  16. Survey of Energy Computing in the Smart Grid Domain

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar

    2013-07-01

    Full Text Available Resource optimization with advanced computing tools improves the efficient use of energy resources. Renewable energy resources are instantaneous and need to be conserved at the same time. Optimizing this real-time process requires a complex design that includes resource planning and control for effective utilization. Advances in information and communication technology tools enable data formatting and analysis, resulting in optimized use of renewable resources for a sustainable energy solution on the smart grid. The paper presents energy computing models for optimally allocating different types of renewables in the distribution system so as to minimize energy loss. The proposed energy computing model optimizes the integration of renewable energy resources with technical and financial feasibility. An econometric model identifies the potential of renewable energy sources, mapping them for computational analysis, which enables the study to forecast the demand and supply scenario. The enriched database on renewable sources and government policies customizes the delivery model for the potential to transcend the costs-versus-benefits barrier. Simulation and modeling techniques have overcome the drawbacks of traditional information and communication technology (ICT) in tackling the new challenges of maximizing the benefits of a smart hybrid grid. Data management has to start at the initial reception of the energy source data, reviewing it for events that should trigger alarms in outage management systems and other real-time systems, such as portfolio management by a virtual hybrid power plant operator. The paper highlights two renewable sources, solar and wind, for the study, which can be extended to other renewable sources.

  17. A survey of computational intelligence techniques in protein function prediction.

    Science.gov (United States)

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    During the past, there was a massive growth of knowledge of unknown proteins with the advancement of high-throughput microarray technologies. Protein function prediction is the most challenging problem in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they failed when a new protein was different from the previous ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, used in wide areas of applications such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers to solve these problems by using computational intelligence techniques with appropriate datasets to improve the prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction.

  18. Minimizing Power Consumption by Personal Computers: A Technical Survey

    Directory of Open Access Journals (Sweden)

    P. K. Gupta

    2012-09-01

    Full Text Available Recently, the demand for “Green Computing”, which represents an environmentally responsible way of reducing power consumption and involves various environmental issues such as waste management and greenhouse gases, has been increasing explosively. We have laid great emphasis on the need to minimize power consumption and heat dissipation by computer systems, as well as the requirement for changing the current power scheme options in their operating systems (OS). In this paper, we provide a comprehensive technical review of the existing, though challenging, work on minimizing power consumption by computer systems, utilizing various approaches, and emphasize the software approach that makes use of dynamic power management, as it is used by most OSs in their power scheme configurations, seeking a better understanding of power management schemes, current issues, and future directions in this field. Herein, we review the various approaches and techniques, including hardware, software, central processing unit (CPU) usage and algorithmic approaches for power economy. On the basis of analysis and observations, we found that this area still requires a lot of work, and needs to be focused towards some new intelligent approaches so that human inactivity periods for computer systems could be reduced intelligently.

  19. A Survey on Resource Allocation Strategies in Cloud Computing

    Directory of Open Access Journals (Sweden)

    V.Vinothina

    2012-06-01

    Full Text Available Cloud computing has become a new-age technology with huge potential in enterprises and markets. Clouds can make it possible to access applications and associated data from anywhere. Companies are able to rent resources from the cloud for storage and other computational purposes so that their infrastructure cost can be reduced significantly. Further, they can make use of company-wide access to applications based on a pay-as-you-go model, so there is no need to obtain licenses for individual products. However, one of the major pitfalls in cloud computing is related to optimizing the resources being allocated. Because of the uniqueness of the model, resource allocation is performed with the objective of minimizing the associated costs. The other challenges of resource allocation are meeting customer demands and application requirements. In this paper, various resource allocation strategies and their challenges are discussed in detail. It is believed that this paper will benefit both cloud users and researchers in overcoming the challenges faced.

  20. A Survey on Cloud Computing Security, Challenges and Threats

    Directory of Open Access Journals (Sweden)

    Rajnish Choubey,

    2011-03-01

    Full Text Available Cloud computing is an internet-based model that enables convenient, on-demand, pay-per-use access to a pool of shared resources. It is a new technology that satisfies a user’s requirement for computing resources like networks, storage, servers, services, and applications without physically acquiring them. It reduces the overhead of maintaining large systems for an organization, but it also carries associated risks and threats, including security, data leakage, insecure interfaces, sharing of resources, and insider attacks.

  1. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were calculated to measure the reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities, and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about social, leisure, communication, and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information for determining technology readiness for people with TBI in therapy programmes.

  2. A User’s Index to CRREL Land Treatment Computer Programs and Data Files.

    Science.gov (United States)

    1982-11-01

    Excerpt from the program index (heavy metals data files; each entry gives program name, language, and description): ... prints the file in raw form; it reads from DATAMETL. NUMERO (BASIC): program to print out a user-specifiable section of the heavy metals data in tabular form; it reads from METAL and DATAMETL. NUMEROL (BASIC): same as NUMERO, except that there is a printing limit of 11 samples. UPETAL (BASIC): program to update ... Associated file pathnames include EXPER>STATS>NTNSOCT, EXPER>METALS>DART>NUMBER, EXPER>METALS>DART>NUMERO, and EXPER>METALS>DART>NUMEROL.

  3. Documentation for the Academic Library Survey (ALS) Data File: Fiscal Year 2002. NCES 2006-308

    Science.gov (United States)

    Schmitt, Carl M.

    2005-01-01

    This manual describes the methods, procedures, techniques, and activities that were used to produce the Academic Library Survey of 2002 (ALS:2002). This manual is designed to provide guidance and documentation for users of the ALS data. Included in the manual are the following: an overview of the study and its predecessor studies; an account of…

  4. U7 snRNAs: A Computational Survey

    Institute of Scientific and Technical Information of China (English)

    Manja Marz; Axel Mosig; Bärbel M.R. Stadler; Peter F. Stadler

    2007-01-01

    U7 small nuclear RNA (snRNA) sequences have been described only for a handful of animal species in the past. Here we describe a computational search for functional U7 snRNA genes throughout vertebrates, including the upstream sequence elements characteristic of snRNAs transcribed by polymerase II. Based on the results of this search, we discuss the high variability of U7 snRNAs in both sequence and structure, and report on an attempt to find U7 snRNA sequences in basal deuterostomes and non-drosophilid insect genomes based on a combination of sequence, structure, and promoter features. Due to the extremely short sequence and the high variability in both sequence and structure, no unambiguous candidates were found. These results cast doubt on putative U7 homologs in even more distant organisms that are reported in the most recent release of the Rfam database.

  5. Gender stereotypes, aggression, and computer games: an online survey of women.

    Science.gov (United States)

    Norris, Kamala O

    2004-12-01

    Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted on women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play video games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.

  6. Comp Plan: A computer program to generate dose and radiobiological metrics from dose-volume histogram files.

    Science.gov (United States)

    Holloway, Lois Charlotte; Miller, Julie-Anne; Kumar, Shivani; Whelan, Brendan M; Vinod, Shalini K

    2012-01-01

    Treatment planning studies often require the calculation of a large number of dose and radiobiological metrics. To streamline these calculations, a computer program called Comp Plan was developed using MATLAB. Comp Plan calculates common metrics, including equivalent uniform dose, tumor control probability, and normal tissue complication probability from dose-volume histogram data. The dose and radiobiological metrics can be calculated for the original data or for an adjusted fraction size using the linear quadratic model. A homogeneous boost dose can be added to a given structure if desired. The final output is written to an Excel file in a format convenient for further statistical analysis. Comp Plan was verified by independent calculations. A lung treatment planning study comparing 45 plans for 7 structures using up to 6 metrics for each structure was successfully analyzed within approximately 5 minutes with Comp Plan. The code is freely available from the authors on request.
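
    As an illustration of one metric such tools compute (a generic sketch, not code from Comp Plan itself; the dose bins, volume fractions, and tissue parameter a below are hypothetical), the generalized equivalent uniform dose (gEUD) can be obtained directly from differential dose-volume histogram data:

        # Sketch: generalized equivalent uniform dose (gEUD) from a differential DVH.
        # Illustrative only -- not Comp Plan's code; dose bins, volume fractions,
        # and the tissue parameter `a` are hypothetical values.

        def gEUD(doses, volumes, a):
            """doses: dose per DVH bin (Gy); volumes: fractional volume per bin
            (summing to 1); a: tissue-specific parameter (a = 1 gives mean dose)."""
            return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

        doses = [20.0, 40.0, 60.0]         # Gy
        volumes = [0.2, 0.5, 0.3]          # fractions summing to 1
        print(gEUD(doses, volumes, a=1))   # 42.0 Gy: the mean dose
        print(gEUD(doses, volumes, a=10))  # larger a emphasizes hot spots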

  7. International Journal of Computer Science and Engineering Survey (IJCSES

    Directory of Open Access Journals (Sweden)

    Sohini Roychowdhury

    2015-10-01

    Full Text Available Automated facial identification and facial expression recognition have been topics of active research over the past few decades. Facial and expression recognition find applications in human-computer interfaces, subject tracking, real-time security surveillance systems, and social networking. Several holistic and geometric methods have been developed to identify faces and expressions using public and local facial image databases. In this work, we present the evolution of facial image data sets and of the methodologies for facial identification and recognition of expressions such as anger, sadness, happiness, disgust, fear, and surprise. We observe that most of the earlier methods for facial and expression recognition aimed at improving recognition rates for facial-feature-based methods using static images. However, recent methodologies have shifted focus towards robust implementation of facial/expression recognition from large image databases that vary with space (gathered from the internet) and time (video recordings). The evolution trends in databases and methodologies for facial and expression recognition can be useful for assessing next-generation topics that may have applications in security systems or personal identification systems that involve “Quantitative face” assessments.

  8. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
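
    As a generic illustration of how a binary spectrum file of this kind might be parsed (the header layout, field names, and types below are hypothetical placeholders, not the actual PCF specification, which must be taken from the GADRAS documentation):

        # Sketch: reading a hypothetical binary spectrum record with struct.
        # The layout here is illustrative only and is NOT the real PCF format.
        import struct

        def read_spectrum(path):
            with open(path, "rb") as f:
                # Hypothetical header: channel count (int32) plus two
                # energy-calibration coefficients (float32 each).
                n_channels, cal_offset, cal_gain = struct.unpack("<iff", f.read(12))
                # Hypothetical body: one float32 count value per channel.
                counts = struct.unpack("<%df" % n_channels, f.read(4 * n_channels))
            energies = [cal_offset + cal_gain * ch for ch in range(n_channels)]
            return energies, counts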

  9. Cloud computing for energy management in smart grid - an application survey

    Science.gov (United States)

    Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed

    2016-03-01

    The smart grid is an emerging energy system in which the application of information technology, tools, and techniques makes the grid run more efficiently. It possesses demand-response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To focus on these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
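
    For context, classic economic power dispatch chooses generator outputs P_i that minimize total cost subject to meeting demand; with quadratic costs C_i(P) = a_i + b_i P + c_i P^2, the optimum equalizes marginal cost across units. The sketch below is a generic illustration (not the paper's cloud-based model), with hypothetical cost coefficients and no generator limits:

        # Sketch: economic dispatch with quadratic costs C_i(P) = a_i + b_i*P + c_i*P^2.
        # Coefficients and demand are hypothetical; generator limits are ignored.

        def dispatch(units, demand):
            """units: list of (b_i, c_i) pairs; returns (lam, outputs) such that
            b_i + 2*c_i*P_i = lam for every unit and sum(P_i) == demand."""
            inv = sum(1.0 / (2 * c) for _, c in units)
            lam = (demand + sum(b / (2 * c) for b, c in units)) / inv
            return lam, [(lam - b) / (2 * c) for b, c in units]

        lam, outputs = dispatch([(10.0, 0.05), (12.0, 0.04)], demand=500.0)
        print(round(lam, 2), [round(p, 1) for p in outputs])  # 33.33 [233.3, 266.7]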

  10. Analisis Teknik-Teknik Keamanan Pada Future Cloud Computing vs Current Cloud Computing: Survey Paper

    Directory of Open Access Journals (Sweden)

    Beny Nugraha

    2016-08-01

    Full Text Available Cloud computing is one of the network technologies currently developing rapidly, because it can dynamically increase the flexibility and capability of computing processes without the need to invest heavily in new infrastructure; improving the security of cloud computing networks is therefore essential. This study examines the security techniques of current cloud computing and of a future cloud computing architecture, NEBULA. These security techniques are compared in terms of their ability to handle the security attacks that may occur in cloud computing. The method used in this research is attack-centric: the characteristics of each security attack are analyzed, and the security mechanisms for handling it are then examined. Four security attacks are studied in this research; by understanding how an attack works, one also learns which security mechanism can counter it. The study finds that NEBULA has the highest level of security. NEBULA provides three new techniques, Proof of Consent (PoC), Proof of Path (PoP), and the ICING cryptographic technique; these three techniques, combined with onion routing, can handle the security attacks analyzed in this study.

  11. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally, the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore, the verification calculations were focused on a catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. The verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  12. Creation and Use of a Survey Instrument for Comparing Mobile Computing Devices

    Science.gov (United States)

    Macri, Jennifer M.; Lee, Paul P.; Silvey, Garry M.; Lobach, David F.

    2005-01-01

    Both personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. However, little research has been reported comparing these mobile computing devices in specific care settings. In this study we present an approach for comparing functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. This poster describes the development and use of the survey instrument used for comparing mobile computing devices. PMID:16779327

  13. A Survey of Data Management System for Cloud Computing: Models and Searching Methods

    Directory of Open Access Journals (Sweden)

    Linhua Zhou

    2013-06-01

    Full Text Available At present, research on data storage and management in cloud computing mainly focuses on data expression and search. This study gives a comprehensive survey of numerous models and approaches for data-intensive applications in cloud computing in both the academic and industrial communities. We review the various approaches and their design ideas, and then attempt to summarize and appraise the open issues.

  14. Weight loss from maximum body weight and mortality: the Third National Health and Nutrition Examination Survey Linked Mortality File.

    Science.gov (United States)

    Ingram, D D; Mussolino, M E

    2010-06-01

    The aim of this longitudinal study is to examine the relationship between weight loss from maximum body weight, body mass index (BMI), and mortality in a nationally representative sample of men and women. Longitudinal cohort study. In all, 6117 whites, blacks, and Mexican-Americans 50 years and over at baseline who survived at least 3 years of follow-up, from the Third National Health and Nutrition Examination Survey Linked Mortality Files (1988-1994, with passive mortality follow-up through 2000), were included. Measured body weight and self-reported maximum body weight were obtained at baseline. Weight loss (maximum body weight minus baseline weight) was categorized as less than 5%, 5% to less than 15%, or 15% or more. Maximum BMI (reported maximum weight (kg)/measured baseline height (m)(2)) was categorized as healthy weight (18.5-24.9), overweight (25.0-29.9), and obese (>or=30.0). In all, 1602 deaths were identified. After adjusting for age, race, smoking, health status, and preexisting illness, overweight men with weight loss of 15% or more, overweight women with weight loss of 5% to less than 15%, and women with weight loss of 15% or more were at increased risk of death from all causes compared with those in the same BMI category who lost less than 5%. Weight loss of 15% or more from maximum body weight is associated with increased risk of death from all causes among overweight men and among women regardless of maximum BMI.

  15. Contours, Contours created by processing hillshade TIF files derived from the U.S. Geological Survey National Elevation Dataset. Available in 50', 100', 250', and 500' intervals., Published in 2004, Arizona State Land Department.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Contours dataset as of 2004. It is described as 'Contours created by processing hillshade TIF files derived from the U.S. Geological Survey National Elevation...

  16. A Survey on Cloud Computing Security Issues, Vendor Evaluation and Selection Process: World Future Society

    Directory of Open Access Journals (Sweden)

    Arun Sangwan

    2014-05-01

    Full Text Available Cloud computing is an emerging technology that could replace long-established IT systems. Cloud computing made big strides forward in 2013, and if a host of industry experts prove correct, it will make even bigger advances in 2014. As Forbes, CxoToday, GigaOM, and other news services that cover technology report, the experts forecast many more companies joining clouds or creating their own; new professional services emerging to manage the clouds and the data within them; and the clouds' expansion transforming IT and work life in general throughout the world. Cloud computing makes it possible for an organization's IT to be more malleable, save costs, and process information and data faster than with long-established IT. Cloud computing is the operating of programs and the storage of data and files in an online network, not on physical disks and hardware. Cloud computing arises from IT technicians' desire to add another layer of separation in processing information. The Cloud Vendor Evaluation & Selection offering leverages the database to accelerate the identification and screening of candidate vendors in a four-step vendor evaluation and selection process.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    Science.gov (United States)

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008")…

  20. The Asilomar Survey: Stakeholders’ Opinions on Ethical Issues Related to Brain-Computer Interfacing

    NARCIS (Netherlands)

    Nijboer, F.; Clausen, J.; Allison, B.Z.; Haselager, W.F.G.

    2011-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI conference, which took place

  2. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    Science.gov (United States)

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  3. Arecibo PALFA survey and Einstein@Home: binary pulsar discovery by volunteer computing

    NARCIS (Netherlands)

    Knispel, B.; Lazarus, P.; Allen, B.; Anderson, D.; Aulbert, C.; Bhat, N.D.R.; Bock, O.; Bogdanov, S.; Brazier, A.; Camilo, F.; Chatterjee, S.; Cordes, J.M.; Crawford, F.; Deneva, J.S.; Desvignes, G.; Fehrmann, H.; Freire, P.C.C.; Hammer, D.; Hessels, J.W.T.; Jenet, F.A.; Kaspi, V.M.; Kramer, M.; van Leeuwen, J.; Lorimer, D.R.; Lyne, A.G.; Machenschalk, B.; McLaughlin, M.A.; Messenger, C.; Nice, D.J.; Papa, M.A.; Pletsch, H.J.; Prix, R.; Ransom, S.M.; Siemens, X.; Stairs, I.H.; Stappers, B.W.; Stovall, K.; Venkataraman, A.

    2011-01-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 solar masses by analysis of spin period measurements.

  5. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  6. A Massively Parallel Computational Method of Reading Index Files for SOAPsnv.

    Science.gov (United States)

    Zhu, Xiaoqian; Peng, Shaoliang; Liu, Shaojie; Cui, Yingbo; Gu, Xiang; Gao, Ming; Fang, Lin; Fang, Xiaodong

    2015-12-01

    SOAPsnv is the software used for identifying single nucleotide variation in cancer genes. However, its performance has yet to match the massive amount of data to be processed. Experiments reveal that the main performance bottleneck of the SOAPsnv software is the pileup algorithm: the original algorithm's I/O process is time-consuming and reads input files inefficiently, and its scalability is also poor. Therefore, we designed a new algorithm, named BamPileup, aiming to improve sequential read performance; the new pileup algorithm implements a parallel read mode based on an index, so that each thread can read data directly starting from a specific position. The results of experiments on the Tianhe-2 supercomputer show that, when reading data in a multi-threaded parallel I/O way, the processing time of the algorithm is reduced to 3.9 s and the application program achieves a speedup of up to 100×. Moreover, the scalability of the new algorithm is also satisfactory.
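
    As a generic illustration of the indexed parallel-read idea (not BamPileup's actual implementation; the offsets, worker count, and record handling are hypothetical), each thread can seek directly to a byte offset taken from a precomputed index and read only its own slice:

        # Sketch: multi-threaded reads of one file using precomputed index offsets.
        # Illustrative only; not the BamPileup source.
        from concurrent.futures import ThreadPoolExecutor

        def read_slice(path, start, end):
            """Read bytes [start, end) of the file; each worker opens its own
            handle so that seeks do not interfere with one another."""
            with open(path, "rb") as f:
                f.seek(start)
                return f.read(end - start)

        def parallel_read(path, offsets, size, workers=8):
            """offsets: sorted byte positions where records begin (from an index);
            size: total file size. Returns the data chunks in index order."""
            bounds = list(zip(offsets, offsets[1:] + [size]))
            with ThreadPoolExecutor(max_workers=workers) as pool:
                return list(pool.map(lambda b: read_slice(path, *b), bounds))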

  7. A computational simulation study on the acoustic pressure generated by a dental endosonic file: effects of intensity, file shape and volume.

    Science.gov (United States)

    Tiong, T Joyce; Price, Gareth J; Kanagasingam, Shalini

    2014-09-01

    One of the uses of ultrasound in dentistry is in the field of endodontics (i.e. root canal treatment), where it enhances cleaning efficiency during the treatment. The acoustic pressures generated by the oscillation of files in narrow channels have been calculated using the COMSOL simulation package. Acoustic pressures in excess of the cavitation threshold can be generated, and higher values were found in narrower channels. This parallels experimental observations of sonochemiluminescence. The effects of varying the channel width and length and the dimensions and shape of the file are reported. As well as explaining experimental observations, the work provides a basis for the further development and optimisation of the design of endosonic files. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Text files of the navigation logged by the U.S. Geological Survey offshore of Fire Island, NY in 2011 (Geographic, WGS 84, HYPACK ASCII Text Files)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS) mapped approximately 336 square kilometers of the lower shoreface and inner-continental shelf offshore of Fire Island, New York in...

  10. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Full Text Available Use of rotary Nickel-Titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this technology has undergone dramatic modifications in order to achieve improved shaping ability. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File, and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in any section. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.

  11. A Taxonomy and Survey of Energy-Efficient Data Centers and Cloud Computing Systems

    CERN Document Server

    Beloglazov, Anton; Lee, Young Choon; Zomaya, Albert

    2010-01-01

    Traditionally, the development of computing systems has been focused on performance improvements driven by the demand of applications from consumer, scientific, and business domains. However, the ever-increasing energy consumption of computing systems has started to limit further performance growth due to overwhelming electricity bills and carbon dioxide footprints. Therefore, the goal of computer system design has shifted to power and energy efficiency. To identify open challenges in the area and facilitate future advancements, it is essential to synthesize and classify the research on power- and energy-efficient design conducted to date. In this work we discuss the causes and problems of high power/energy consumption, and present a taxonomy of energy-efficient design of computing systems covering the hardware, operating system, virtualization, and data center levels. We survey various key works in the area and map them to our taxonomy to guide future design and development efforts. This chapter is conclu...

  12. Female Under-Representation in Computing Education and Industry - A Survey of Issues and Interventions

    Directory of Open Access Journals (Sweden)

    Joseph Osunde

    2014-10-01

    Full Text Available This survey paper examines the issue of female under-representation in computing education and industry, which has been shown from empirical studies to be a problem for over two decades. While various measures and intervention strategies have been implemented to increase the interest of girls in computing education and industry, the level of success has been discouraging. The primary contribution of this paper is to provide an analysis of the extensive research work in this area. It outlines the progressive decline in female representation in computing education. It also presents the key arguments that attempt to explain the decline and intervention strategies. We conclude that there is a need to further explore strategies that will encourage young female learners to interact more with computer educational games.

  13. The survey of American college students computer technology preferences & purchasing plans

    CERN Document Server

    2009-01-01

    This report presents data from a survey of more than 400 American college students.  The report presents data on student ownership of PCs and laptops and on purchasing plans for PCs and laptops, as well as purchasing plans for cell phones and digital cameras.  The report also provides details on how students finance their computer purchases: how much money comes from parents or guardians, and how much from the students themselves or from other parties.  In addition to data on PCs, the report provides detailed information on the use of popular word processing packages such as Word, WordPerfect and Open Office.

  14. Using the Superpopulation Model for Imputations and Variance Computation in Survey Sampling

    Directory of Open Access Journals (Sweden)

    Petr Novák

    2012-03-01

    Full Text Available This study is aimed at variance computation techniques for estimates of population characteristics based on survey sampling and imputation. We use the superpopulation regression model, which means that the target variable values for each statistical unit are treated as random realizations of a linear regression model with weighted variance. We focus on regression models with one auxiliary variable and no intercept, which have many applications and a straightforward interpretation in business statistics. Furthermore, we deal with cases where the estimates are not independent and thus the covariance must be computed. We also consider chained regression models with auxiliary variables treated as random variables instead of constants.
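
    As an illustration in our own notation (one common choice of weighted variance; not necessarily the authors' exact specification), the no-intercept superpopulation model with a single auxiliary variable can be written as

        y_i = \beta x_i + \varepsilon_i, \qquad
        \mathrm{E}(\varepsilon_i) = 0, \qquad
        \mathrm{Var}(\varepsilon_i) = \sigma^2 x_i, \qquad
        \hat{\beta} = \frac{\sum_{i \in s} y_i}{\sum_{i \in s} x_i},

    where s denotes the sample. Under this variance structure the weighted least-squares estimator of \beta reduces to the familiar ratio estimator above, which is one reason such models have a straightforward interpretation in business statistics.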

  15. Investigating Multiple Household Water Sources and Uses with a Computer-Assisted Personal Interviewing (CAPI Survey

    Directory of Open Access Journals (Sweden)

    Morgan C. MacDonald

    2016-12-01

    Full Text Available The investigation of multiple sources in household water management is considered overly complicated and time-consuming using paper-and-pen interviewing (PAPI). We assess the advantages of computer-assisted personal interviewing (CAPI) in Pacific Island Countries (PICs). We adapted an existing PAPI survey on multiple water sources and expanded it to incorporate the location of water use and the impacts of extreme weather events, using SurveyCTO on Android tablets. We then compared the efficiency and accuracy of data collection using the PAPI version (n = 44) with the CAPI version (n = 291), including interview duration, error rate, and trends in interview duration with enumerator experience. CAPI surveys facilitated high-quality data collection and were an average of 15.2 min faster than PAPI. CAPI survey duration decreased by 0.55% per survey delivered (p < 0.0001), whilst embedded skip patterns and answer lists lowered data entry error rates relative to PAPI (p < 0.0001). Large-scale household surveys commonly used in global monitoring and evaluation do not differentiate multiple water sources and uses. CAPI equips water researchers with a quick and reliable tool to address these knowledge gaps and advance our understanding of development research priorities.

  16. Survey and future directions of fault-tolerant distributed computing on board spacecraft

    Science.gov (United States)

    Fayyaz, Muhammad; Vladimirova, Tanya

    2016-12-01

    Current and future space missions demand highly reliable on-board computing systems that are capable of carrying out high-performance data processing. At present, no single computing scheme satisfies both the highly reliable operation requirement and the high-performance computing requirement. The aim of this paper is to review existing systems and offer a new approach to addressing the problem. In the first part of the paper, a detailed survey of fault-tolerant distributed computing systems for space applications is presented. Fault types and assessment criteria for fault-tolerant systems are introduced. Redundancy schemes for distributed systems are analyzed. A review of the state of the art in fault-tolerant distributed systems is presented and the limitations of current approaches are discussed. In the second part of the paper, a new fault-tolerant distributed computing platform with wireless links among the computing nodes is proposed. Novel algorithms enabling important aspects of the architecture, such as time-slot-priority adaptive fault-tolerant channel access and fault-tolerant distributed computing using task migration, are introduced.
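
    As a simple illustration of one redundancy scheme used in such systems (a generic sketch, not the platform proposed in the paper; the replicated functions below are hypothetical placeholders), triple modular redundancy (TMR) runs the same computation on three nodes and takes a majority vote:

        # Sketch: triple modular redundancy (TMR) by majority vote.
        # Generic illustration; the replicas below are hypothetical.
        from collections import Counter

        def tmr(replicas, *args):
            """Run three replicas of the same computation and return the
            majority result; raise if all three disagree (double fault)."""
            results = [f(*args) for f in replicas]
            value, votes = Counter(results).most_common(1)[0]
            if votes < 2:
                raise RuntimeError("no majority: uncorrectable fault")
            return value

        healthy = lambda x: x * x
        faulty = lambda x: x * x + 1   # simulates a transient bit error
        print(tmr([healthy, healthy, faulty], 7))  # -> 49, the fault is outvoted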

  17. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations and in particular of the computational cost of their numerical solution from the viewpoint of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N^2 M), O(NM^2), and O(NM(M + N)), compared with O(MN) for classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions to this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operator), the short memory principle, fast Fourier transform (FFT)-based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships of these solutions for both the space fractional derivative and the time fractional derivative are discussed. The authors point out that parallel computing technologies should be regarded as a basic method to overcome this challenge, and that some attention should be paid to fractional killer applications, high-performance iteration methods, high-order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from the areas of mathematics and computer science have the opportunity to invent cornerstones in the area of fractional calculus.

  18. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  19. Clinical Computer Systems Survey (CLICS): learning about health information technology (HIT) in its context of use.

    Science.gov (United States)

    Lichtner, Valentina; Cornford, Tony; Klecun, Ela

    2013-01-01

    Successful health information technology (HIT) implementations need to be informed by the context of use and by users' attitudes. To this end, we developed the CLinical Computer Systems Survey (CLICS) instrument. CLICS reflects a socio-technical view of HIT adoption and is designed to encompass all members of the clinical team. We used the survey in a large English hospital as part of its internal evaluation of the implementation of an electronic patient record system (EPR). The survey revealed the extent and type of use of the EPR; how it related to and integrated with other existing systems; and people's views on its use, usability and emergent safety issues. Significantly, participants really appreciated 'being asked'. They also reminded us of the wider range of administrative roles engaged with the EPR. This observation raises pertinent questions about our understanding of the boundaries between administrative tasks and clinical medicine - what we propose as the field of 'administrative medicine'.

  20. Evaluation of the Efficacy of TRUShape and Reciproc File Systems in the Removal of Root Filling Material: An Ex Vivo Micro-Computed Tomographic Study.

    Science.gov (United States)

    de Siqueira Zuolo, Arthur; Zuolo, Mario Luis; da Silveira Bueno, Carlos Eduardo; Chu, Rene; Cunha, Rodrigo Sanches

    2016-02-01

    The purpose of this study was to evaluate the efficacy of TRUShape (Dentsply Tulsa Dental Specialties, Tulsa, OK) compared with the Reciproc file (VDW, Munich, Germany) in the removal of filling material from oval canals filled with 2 different sealers, and to assess differences in working time. Sixty-four mandibular canines with oval canals were prepared and divided into 4 groups (n = 16). Half of the specimens were filled with gutta-percha and pulp canal sealer (PCS), and the remainder were filled with gutta-percha and bioceramic sealer (BCS). The specimens were retreated using either the Reciproc or TRUShape files. A micro-computed tomographic scanner was used to assess filling material removal, and the time taken for removal was also recorded. Data were analyzed using the Kruskal-Wallis and Mann-Whitney U tests. The mean volume of remaining filling material was similar for both files (P ≥ .05). However, in the groups filled with BCS, the percentage of remaining filling material was higher than in the groups filled with PCS (P < .05). There was no significant difference in the removal of filling material between the two file systems; however, Reciproc was faster than TRUShape. BCS groups exhibited significantly more remaining filling material in the canals and required more time for retreatment. Remaining filling material was observed in all samples regardless of the technique or sealer used. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  1. Using electronic surveys in nursing research.

    Science.gov (United States)

    Cope, Diane G

    2014-11-01

    Computer and Internet use in businesses and homes in the United States has dramatically increased since the early 1980s. In 2011, 76% of households reported having a computer, compared with only 8% in 1984 (File, 2013). A similar increase in Internet use has also been seen, with 72% of households reporting access of the Internet in 2011 compared with 18% in 1997 (File, 2013). This emerging trend in technology has prompted use of electronic surveys in the research community as an alternative to previous telephone and postal surveys. Electronic surveys can offer an efficient, cost-effective method for data collection; however, challenges exist. An awareness of the issues and strategies to optimize data collection using web-based surveys is critical when designing research studies. This column will discuss the different types and advantages and disadvantages of using electronic surveys in nursing research, as well as methods to optimize the quality and quantity of survey responses.

  2. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    MR.S.SUBBIAH

    2013-01-01

    Full Text Available Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising design that provides a very flexible architecture, accessible through the internet. In the cloud computing environment, the data reside in data centers. Because of that, some data centers may leak the data stored there, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and privacy of the data stored in the data centers through the cloud computing environment. This survey paper elaborates and analyzes the various unresolved issues in the cloud computing environment and proposes an alternate method that can be useful to the various kinds of users who are willing to enter the new era of cloud computing. Moreover, this paper offers some suggestions on securing the data while storing it in the cloud server, implementing new data displacement strategies, the Service Level Agreement between the user and the cloud service provider, and finally how to improve the Quality of Service.

  3. Structure and Application pf WAV File

    Institute of Scientific and Technical Information of China (English)

    Guo, Xingji

    2005-01-01

    With regard to audio digitization, the researcher introduces several computer processing methods for audio information and then presents application patterns based on the WAV file, after thoroughly analyzing the structure of the widely used WAV file format in the computer application field.
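
    As an illustration of working with this file structure (a minimal sketch using Python's standard-library wave module; "example.wav" is a placeholder path), the format's header parameters and raw sample data can be read directly:

        # Sketch: reading the basic parameters of a WAV (RIFF) file with the
        # standard-library wave module. "example.wav" is a placeholder path.
        import wave

        with wave.open("example.wav", "rb") as w:
            print("channels:    ", w.getnchannels())       # 1 = mono, 2 = stereo
            print("sample width:", w.getsampwidth(), "bytes")
            print("frame rate:  ", w.getframerate(), "Hz")
            print("frames:      ", w.getnframes())
            pcm = w.readframes(w.getnframes())             # raw PCM sample data
        print("bytes of audio data:", len(pcm))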

  4. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    Science.gov (United States)

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally

  5. Computer-assisted measurement of perceived stress: an application for a community-based survey.

    Science.gov (United States)

    Kimura, Tomoaki; Uchida, Seiya; Tsuda, Yasutami; Eboshida, Akira

    2005-09-01

    The assessment of stress is a key issue in health promotion policies as well as in treatment strategies for patients. The aim of this study was to confirm the accessibility and reliability of computer-assisted data collection for perceived stress measurement, using the Japanese version of the Perceived Stress Scale (JPSS), within the setting of a community-based survey. There were two groups of participants in this survey: one group responded to a Web-based application, and the other to a VBA form in spreadsheet software. The total scores of the JPSS were almost normally distributed. The mean total JPSS scores were 23.6 and 23.1; these results were lower than in a previous study of the JPSS. Since Cronbach's alpha coefficients in both surveys were above 0.8, high reliability was demonstrated despite a number of computer-illiterate and/or elderly participants. Participants felt that the spreadsheet form was easier to respond to. Two components were extracted with Varimax rotation in a principal component analysis and were named "perception of stress and stressors" and "behavior to stress". This finding suggests that it is possible to determine sub-scales. From the viewpoint of preventive medicine, it is expected that JPSS applications will be used to investigate the relationship between stress and other factors such as lifestyle, environment and quality of life.
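
    For reference (the standard formula, not the authors' code; the response matrix below contains hypothetical values), Cronbach's alpha for k items is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores):

        # Sketch: Cronbach's alpha from a respondents-by-items score matrix.
        # The sample data are hypothetical illustration values.

        def cronbach_alpha(scores):
            """scores: one row per respondent, one column per item."""
            k = len(scores[0])
            def var(xs):
                m = sum(xs) / len(xs)
                return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
            item_var = sum(var(col) for col in zip(*scores))
            total_var = var([sum(row) for row in scores])
            return (k / (k - 1)) * (1 - item_var / total_var)

        data = [[3, 4, 3, 5], [2, 2, 3, 3], [4, 5, 4, 5], [1, 2, 2, 2]]
        print(round(cronbach_alpha(data), 3))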

  6. 76 FR 39757 - Filing Procedures

    Science.gov (United States)

    2011-07-06

    ... an accurate filing history can be maintained. (b) Changes to user information other than the Firm... computer or internet resources. If the request is granted, the Secretary will promptly inform you and... applicable hardware and software requirements for electronic filing. (2) To file or submit a document in...

  7. Raw navigation files logged with HYPACK Survey software during a geophysical survey conducted by the USGS within Red Brook Harbor, MA, 2009

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS), Coastal...

  8. Raw navigation files logged with HYPACK Survey software during a geophysical survey conducted by the USGS within Red Brook Harbor, MA, 2009

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  9. Raw navigation files logged with HYPACK Survey software during a geophysical survey conducted by the USGS within Red Brook Harbor, MA, 2009

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS), Coastal...

  10. Sensitivity Data File Formats

    Energy Technology Data Exchange (ETDEWEB)

    Rearden, Bradley T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    The format of the TSUNAMI-A sensitivity data file produced by SAMS for cases with deterministic transport solutions is given in Table 6.3.A.1. The occurrence of each entry in the data file is followed by an identification of the data contained on each line of the file and the FORTRAN edit descriptor denoting the format of each line. A brief description of each line is also presented. A sample of the TSUNAMI-A data file for the Flattop-25 sample problem is provided in Figure 6.3.A.1. Here, only two profiles out of the 130 computed are shown.
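
    As a generic illustration of reading data written under a FORTRAN edit descriptor (the descriptor 5E12.5 and the sample line below are hypothetical, not the actual TSUNAMI-A layout), fixed-width fields can simply be sliced by width:

        # Sketch: parsing a line written with a FORTRAN edit descriptor such as
        # 5E12.5 (five 12-character floating-point fields). Hypothetical layout,
        # not the real TSUNAMI-A format.

        def read_fixed(line, count, width):
            """Slice `count` fields of `width` characters and parse as floats;
            FORTRAN E-format values parse directly with float()."""
            return [float(line[i * width:(i + 1) * width]) for i in range(count)]

        line = " 1.23450E-02-4.56780E+01 7.89000E+00 0.00000E+00 3.14159E+00"
        print(read_fixed(line, 5, 12))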

  11. Arecibo PALFA Survey and Einstein@Home: Binary Pulsar Discovery by Volunteer Computing

    OpenAIRE

    Knispel, B.; Lazarus, P; Allen, B.; Anderson, D; Aulbert, C.; Bhat, N; Bock, O.; Bogdanov, S.; Brazier, A.; Camilo, F.; S Chatterjee; Cordes, J.; Crawford, F; Deneva, J.; Desvignes, G.

    2011-01-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 solar masses by analysis of spin period measurements. No evidence of orbita...

  12. A Comparison of Paper vs. Computer-Assisted Self Interview for School, Alcohol, Tobacco, and Other Drug Surveys.

    Science.gov (United States)

    Hallfors, Denise; Khatapoush, Shereen; Kadushin, Charles; Watson, Kim; Saxe, Leonard

    2000-01-01

    Examined whether computer assisted self-interview (CASI) alcohol, tobacco, and drug use surveys are feasible with 2,296 7th, 9th, and 11th graders in 2 communities. CASI surveys did not increase reported rates of substance abuse, but did improve the speed of data processing and decrease missing data. (SLD)

  13. City locations for all places in the TIGER files

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — City locations for all places in the TIGER files; this file was extracted from dbf files posted on the internet by the Bureau of the Census. This is basically a...

  14. Survey results of Internet and computer usage in veterans with epilepsy.

    Science.gov (United States)

    Pramuka, Michael; Hendrickson, Rick; Van Cott, Anne C

    2010-03-01

    After our study of a self-management intervention for epilepsy, we gathered data on Internet use and computer availability to assess the feasibility of computer-based interventions in a veteran population. Veterans were asked to complete an anonymous questionnaire that gathered information regarding seizures/epilepsy in addition to demographic data, Internet use, computer availability, and interest in distance education regarding epilepsy. Three hundred twenty-four VA neurology clinic patients completed the survey. One hundred twenty-six self-reported a medical diagnosis of epilepsy and constituted the epilepsy/seizure group. For this group of veterans, the need for remote/distance-based interventions was validated given the majority of veterans traveled long distances (>2 hours). Only 51% of the epilepsy/seizure group had access to the Internet, and less than half (42%) expressed an interest in getting information on epilepsy self-management on their computer, suggesting that Web-based interventions may not be an optimal method for a self-management intervention in this population.

  15. An ergonomic questionnaire survey on the use of computers in schools.

    Science.gov (United States)

    Sotoyama, Midori; Bergqvist, Ulf; Jonai, Hiroshi; Saito, Susumu

    2002-04-01

    A questionnaire was sent to elementary, junior high, and high schools in Yokohama and Kawasaki Cities from January to March 1998 regarding the use of personal computers by pupils and students. The survey included questions on how often and in what environment computers are used, whether any instructions are given as to their use, children's working posture, and the effects on health. The results show that most schools are slow to develop instructive programs from the environmental or ergonomic point of view. So far, not many children complain of serious symptoms such as pain in the neck, head, or shoulders, but a future increase in the number of classes that involve computing, as well as the widespread popularity of home computers, will surely arouse legitimate concern about the health of pupils and students, since they will spend more and more time operating these devices. An effective way to anticipate the problem is to provide young students with adequate knowledge of ergonomically sound usage and environmental design, and there is now an urgent need for specific guidelines to protect them.

  16. Survey on Security Issues in Cloud Computing and Associated Mitigation Techniques

    CERN Document Server

    Bhadauria, Rohit

    2012-01-01

    Cloud Computing holds the potential to eliminate the requirement for setting up high-cost computing infrastructure for the IT-based solutions and services that industry uses. It promises to provide a flexible IT architecture, accessible through the internet from lightweight portable devices. This would allow a multi-fold increase in the capacity and capabilities of existing and new software. In a cloud computing environment, the entire data reside over a set of networked resources, enabling the data to be accessed through virtual machines. Since these data centers may lie in any corner of the world beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and taken care of. Also, one can never rule out a server breakdown, something that has been witnessed rather often in recent times. There are various issues that need to be dealt with in respect of security and privacy in a cloud computing scenario. This extensive survey paper aims to...

  17. A State-Wide Survey of South Australian Secondary Schools to Determine the Current Emphasis on Ergonomics and Computer Use

    Science.gov (United States)

    Sawyer, Janet; Penman, Joy

    2012-01-01

    This study investigated the pattern of teaching of healthy computing skills to high school students in South Australia. A survey approach was used to collect data, specifically to determine the emphasis placed by schools on ergonomics that relate to computer use. Participating schools were recruited through the Department for Education and Child…

  18. A survey on resource allocation in high performance distributed computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    Efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. This study reports a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate, under a joint framework, the existing solutions for HPC and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in improving the performance of all classes of HPC systems; a comprehensive discussion of the resource allocation strategies widely deployed in HPC environments is therefore required, which is one of the motivations of this survey. Moreover, we classify HPC systems into three broad categories, namely (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify the approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  19. Research on surveying technology applied for DTM modelling and volume computation in open pit mines

    Directory of Open Access Journals (Sweden)

    Jaroslaw Wajs

    2016-01-01

    Full Text Available The spatial information systems of a mining company can be used for monitoring mining activity, excavation planning, calculation of ore volume and decision making. Nowadays, the database has to be updated from sources such as surveying positioning technologies and remotely sensed photogrammetric data. This paper reviews the methodology for digital terrain model (DTM) modelling and for obtaining data from surveying technologies in an open pit mine or quarry. It reviews the application of GPS, total station measurements and ground photogrammetry for the volume accuracy assessment of a selected object. The test field was situated in the Belchatow lignite open pit mine, where a suitable object was selected. The test layer of the coal seam was located in the excavation area of the 8th pit sidewall. The data were acquired twice within a one-month period, in connection with the monthly DTM update of the excavation. This paper presents the technological process and the results of research on using digital photogrammetry for opencast mining in the scope of numerical volume computation and mine monitoring by comparison of different sources. The results show that the presented workflow allows a DTM to be built both manually and from remotely sensed data, with the accuracy assessment expressed through the volume computation pathway. Major advantages of the techniques are presented, illustrating how terrestrial photogrammetry provides rapid spatial measurements of 3-D breakline data used for volume calculation.
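
    The volume computation step described above reduces, for gridded DTMs, to integrating the elevation difference between two epochs over the cell area. Below is a minimal Python sketch of that idea, assuming two co-registered elevation grids of equal shape and cell size (the arrays are placeholders, not the Belchatow data):

    ```python
    import numpy as np

    def excavated_volume(dtm_before, dtm_after, cell_size):
        """Volume removed between two epochs of a gridded DTM (same extent and resolution)."""
        dz = dtm_before - dtm_after            # positive where material was excavated
        return float(np.nansum(dz) * cell_size ** 2)

    # Placeholder 1 m grids standing in for the monthly DTM epochs
    before = np.full((100, 100), 210.0)        # elevations in metres
    after = before - np.random.uniform(0.0, 2.0, before.shape)
    print(f"Excavated volume: {excavated_volume(before, after, cell_size=1.0):.0f} m^3")
    ```

    Comparing such grid-based volumes across data sources (GPS, total station, photogrammetry) is what the accuracy assessment above refers to.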

  20. Introduction to Hadoop Distributed File System

    Directory of Open Access Journals (Sweden)

    Vaibhav Gopal korat

    2012-04-01

    Full Text Available HDFS is a distributed file system designed to hold very large amounts of data (terabytes or even petabytes) and to provide high-throughput access to this information. Files are stored in a redundant fashion across multiple machines to ensure their durability under failure and their high availability to highly parallel applications. This paper gives a step-by-step introduction from file systems in general, to distributed file systems, to the Hadoop Distributed File System. Section I introduces what a file system is, the need for file systems, conventional file systems and their advantages, the need for a distributed file system, what a distributed file system is, and the benefits of distributed file systems; it also includes an analysis of large datasets and a comparison of MapReduce with RDBMS, noting that the HPC and Grid Computing communities have been doing large-scale data processing for years. Section II introduces the concept of the Hadoop Distributed File System. Finally, Section III contains the conclusion, followed by the references.
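
    As a concrete illustration of client access to such a file system (not part of the paper itself), HDFS exposes the WebHDFS REST gateway. A minimal sketch, assuming a NameNode reachable at a hypothetical host on the default Hadoop 3 HTTP port:

    ```python
    import requests

    NAMENODE = "http://namenode.example.com:9870"   # hypothetical host

    # List a directory (op=LISTSTATUS is part of the public WebHDFS REST API)
    resp = requests.get(f"{NAMENODE}/webhdfs/v1/user/demo",
                        params={"op": "LISTSTATUS"}, timeout=30)
    resp.raise_for_status()
    for entry in resp.json()["FileStatuses"]["FileStatus"]:
        print(entry["pathSuffix"], entry["length"], "replicas:", entry["replication"])

    # Read a file; the NameNode redirects the OPEN call to a DataNode holding a block replica
    data = requests.get(f"{NAMENODE}/webhdfs/v1/user/demo/sample.txt",
                        params={"op": "OPEN"}, timeout=30).content
    ```

    The per-file replication count printed above is the redundancy mechanism the abstract refers to.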

  1. Influence of the glide path on various parameters of root canal prepared with WaveOne reciprocating file using cone beam computed tomography

    Directory of Open Access Journals (Sweden)

    Anil Dhingra

    2015-01-01

    Full Text Available Background: Nickel-titanium (NiTi) rotary instrumentation carries a risk of fracture, mainly as a result of flexural (fatigue) fracture and torsional (shear) failure stresses. This risk might be reduced by creating a glide path before NiTi rotary instrumentation. The aim of this study was to compare various root canal parameters with the new WaveOne single-file reciprocating system in mesial canals of mandibular molars, with and without a glide path, using cone beam computed tomography (CBCT). Materials and Methods: One hundred mandibular molar teeth with canal curvatures between 20° and 30° were divided into two groups of 50 teeth each. In Group 1, no glide path was created, whereas in Group 2, a glide path was created with PathFiles at working length (WL). In both groups, canals were shaped with WaveOne primary reciprocating files to the WL. Canals were scanned in a CBCT unit before and after instrumentation. Postinstrumentation changes in canal curvature, cross-sectional area, centric ability, residual dentin thickness, and the extent of canal transportation were calculated using image analysis software and subjected to statistical analysis. Data were analyzed using Student's t-test and the Mann-Whitney U-test (P < 0.05). Results: The mean differences in root canal curvature, cross-sectional area, centric ability, and residual dentin thickness increased in Group 2, whereas canal transportation was significantly reduced. Conclusion: WaveOne NiTi files appeared to maintain the original canal anatomy, and the presence of a glide path further improves their performance; a glide path was found to be beneficial for all the parameters tested in this study.
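
    For readers unfamiliar with how transportation and centering figures are usually derived in such CBCT studies, a common convention (following Gambill and colleagues; the abstract does not state which exact variant was used here) compares pre- and postinstrumentation dentin thickness on matched cross-sections:

    $$ T = (m_1 - m_2) - (d_1 - d_2), \qquad C = \frac{\min(m_1 - m_2,\; d_1 - d_2)}{\max(m_1 - m_2,\; d_1 - d_2)} $$

    where m and d are the shortest distances from the canal wall to the mesial and distal root surfaces, subscript 1 taken before and subscript 2 after instrumentation; T = 0 indicates no transportation and C = 1 indicates perfect centering.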

  2. WATSTORE Stream Flow Basin Characteristics File

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Stream Flow Basin Characteristics file contains information about the drainage basins of selected USGS gaging stations. Data elements of this file were converted...

  3. Risk, Benefit, and Moderators of the Affect Heuristic in a Widespread Unlawful Activity: Evidence from a Survey of Unlawful File-Sharing Behavior.

    Science.gov (United States)

    Watson, Steven J; Zizzo, Daniel J; Fleming, Piers

    2016-09-13

    Increasing the perception of legal risk via publicized litigation and lobbying for copyright law enforcement has had limited success in reducing unlawful content sharing by the public. We consider the extent to which engaging in file sharing online is motivated by the perceived benefits of this activity as opposed to perceived legal risks. Moreover, we explore moderators of the relationship between perceived risk and perceived benefits; namely, trust in industry and legal regulators, and perceived online anonymity. We examine these questions via a large two-part survey of consumers of music (n = 658) and eBooks (n = 737). We find that perceptions of benefit, but not of legal risk, predict stated file-sharing behavior. An affect heuristic is employed: as perceived benefit increases, perceived risk falls. This relationship is increased under high regulator and industry trust (which actually increases perceived risk in this study) and low anonymity (which also increases perceived risk). We propose that, given the limited impact of perceived legal risk upon unlawful downloading, it would be better for the media industries to target enhancing the perceived benefit and availability of lawful alternatives.

  4. Documentation for the NCES Common Core of Data National Public Education Financial Survey (NPEFS), School Year 2008-09 (Fiscal Year 2009). Revised File Version 1b. NCES 2011-330rev

    Science.gov (United States)

    Cornman, Stephen Q.; Zhou, Lei; Nakamoto, Nanae

    2012-01-01

    This documentation is for the revised file (Version 1b) of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) National Public Education Financial Survey (NPEFS) for school year 2008-2009, fiscal year 2009 (FY 09). It contains a brief description of the data collection along with information required to understand and…

  5. Wireless data collection of self-administered surveys using tablet computers.

    Science.gov (United States)

    Singleton, Kyle W; Lan, Mars; Arnold, Corey; Vahidi, Mani; Arangua, Lisa; Gelberg, Lillian; Bui, Alex A T

    2011-01-01

    The accurate and expeditious collection of survey data by coordinators in the field is critical to supporting research studies. Early methods that used paper documentation have slowly evolved into electronic capture systems; indeed, tools such as REDCap and others illustrate this transition. However, many current systems are tailored to web browsers running on desktop/laptop computers and require keyboard and mouse input. We present a system that uses a touch-screen interface running on a tablet PC, designed with consideration for portability, limited screen space, wireless connectivity, and potentially inexperienced and low-literacy users. The system was developed in C#, ASP.net, and SQL Server by multiple programmers over the course of a year. It was developed in coordination with UCLA Family Medicine and is currently deployed for data collection in a group of Los Angeles-area community health center clinics for a study on drug addiction and intervention.

  6. A primary care physician perspective survey on the limited use of handwriting and pen computing in the electronic medical record

    Directory of Open Access Journals (Sweden)

    Gary Arvary

    2002-09-01

    The use of handwriting in the EMR was broadly supported by this group of PCPs in private practice. Likewise, wireless pen computers were the overwhelming choice of computer for use during a consultation. In this group, older and lower volume physicians were less likely to desire a computer for use during a consultation. User acceptance of the EMR may be related to how closely it resembles the processes that are being automated. More surveys are required to determine the needs and expectations of physicians. The data also support other research studies that demonstrate the preference for handwriting and wireless computers, and the need for a limited, standardised and controlled vocabulary.

  7. Sci—Thur PM: Imaging — 06: Canada's National Computed Tomography (CT) Survey

    Energy Technology Data Exchange (ETDEWEB)

    Wardlaw, GM; Martel, N [Medical Imaging Division, Consumer and Clinical Radiation Protection Bureau, Healthy Environments and Consumer Safety Branch, Health Canada (Canada); Blackler, W; Asselin, J-F [Data Analysis and Information Systems, Applied Research and Analysis Directorate, Strategic Policy Branch, Health Canada (Canada)

    2014-08-15

    The value of computed tomography (CT) in medical imaging is reflected in its increased use and availability since the early 1990s; however, given CT's relatively larger exposures (vs. planar x-ray), greater care must be taken to ensure that CT procedures are optimised in terms of providing the smallest dose possible while maintaining sufficient diagnostic image quality. The development of CT Diagnostic Reference Levels (DRLs) supports this process. DRLs have been suggested and supported by international and national bodies since the early 1990s and widely adopted elsewhere, but not on a national basis in Canada. Essentially, CT DRLs provide guidance on what is considered good practice for common CT exams, but they require a representative sample of CT examination data before any recommendations can be made. Canada's National CT Survey project, in collaboration with provincial/territorial authorities, has collected a large national sample of CT practice data for 7 common examinations (with associated clinical indications) of both adult and pediatric patients. Following completion of data entry into a common database, a survey summary report will be produced and recommendations made on CT DRLs from these data. It is hoped that these can then be used by local regions to promote CT practice optimisation and support dose reduction initiatives.

  8. Computer Programs for Obtaining and Analyzing Daily Mean Steamflow Data from the U.S. Geological Survey National Water Information System Web Site

    Science.gov (United States)

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
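
    For readers who want to reproduce the first step (retrieving daily mean streamflow), the current NWISWeb REST endpoint can be queried directly. A minimal sketch, assuming the public waterservices.usgs.gov daily-values service and parameter code 00060 (daily mean discharge); the site number and dates are examples only:

    ```python
    import requests

    url = "https://waterservices.usgs.gov/nwis/dv/"
    params = {
        "format": "json",
        "sites": "01646500",        # example station: Potomac River near Washington, D.C.
        "parameterCd": "00060",     # discharge, cubic feet per second
        "startDT": "2020-01-01",
        "endDT": "2020-12-31",
    }
    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()
    series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
    flows = [(pt["dateTime"], float(pt["value"])) for pt in series]
    print(f"Retrieved {len(flows)} daily values; first: {flows[0]}")
    ```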

  9. OTS: a program for converting Noldus Observer data files to SDIS files.

    Science.gov (United States)

    Bakeman, R; Quera, V

    2000-02-01

    A program for converting Noldus Observer data files (ODF) to sequential data interchange standard (SDIS) files is described. Observer users who convert their data files can then take advantage of various flexible and powerful data modification and computational procedures available in the Generalized Sequential Querier, a program that assumes SDIS-formatted files.

  10. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    OpenAIRE

    Holstein, Bjørn E.; Pedersen, Trine Pagh; Bendtsen, Pernille; Madsen, Katrine Rich; Meilstrup, Charlotte Riebeling; Nielsen, Line; Rasmussen, Mette

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods Cross-sectional school-survey of 11-, 13-, and 15-year old students in thirteen schoo...

  11. Selective File Dumper

    Science.gov (United States)

    Bassetti, Nanni; Frati, Denis

    During a computer forensics investigation we faced a problem: how to get all the interesting files we need, fast. We work mainly with Open Source software and the Linux OS, and we consider the Sleuthkit and Foremost two very useful tools; but for reaching our target they were too complicated and time-consuming to use. For this reason we developed the Selective File Dumper, a Linux Bash script which makes it possible to extract all the referenced, deleted and unallocated files and, finally, to perform a keyword search, in a simple way.
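
    The script itself is not reproduced in the record, but the workflow it describes can be approximated with the standard Sleuthkit command-line tools it mentions. A hedged Python sketch (the image path, keyword, and output-parsing details are illustrative assumptions):

    ```python
    import subprocess

    IMAGE = "evidence.dd"          # hypothetical disk image
    KEYWORD = b"invoice"

    # fls -r -p lists allocated and deleted files recursively with full paths;
    # each line looks roughly like "r/r [*] <inode>:\t<name>"
    listing = subprocess.run(["fls", "-r", "-p", IMAGE],
                             capture_output=True, check=True).stdout.decode()

    for line in listing.splitlines():
        meta, _, name = line.partition("\t")
        inode = meta.split()[-1].rstrip(":")
        if not inode.split("-")[0].isdigit():
            continue                                    # skip virtual/unparseable entries
        # icat streams the content of a file given its inode address
        content = subprocess.run(["icat", IMAGE, inode], capture_output=True).stdout
        if KEYWORD in content:
            print(f"keyword hit: {name} (inode {inode})")
    ```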

  12. Documentation for the Academic Library Survey (ALS) Data File: Fiscal Year 2004 (Public Use). NCES 2007-343

    Science.gov (United States)

    Schmitt, Carl M.; O'Shae, Patricia; Vaden, Kaleen

    2007-01-01

    This manual describes the methods, procedures, techniques, and activities that were used to produce the Academic Library Survey of 2004 (ALS:2004). This manual is designed to provide guidance and documentation for users of the ALS data. Included in the manual are the following: (1) an overview of the study and its predecessor studies; (2) an…

  13. A Comprehensive Survey on the Status of Social and Professional Issues in United States Undergraduate Computer Science Programs and Recommendations

    Science.gov (United States)

    Spradling, Carol; Soh, Leen-Kiat; Ansorge, Charles J.

    2009-01-01

    A national web-based survey was administered to 700 undergraduate computer science (CS) programs in the United States as part of a stratified random sample of 797 undergraduate CS programs. The 251 program responses (36% response rate) regarding social and professional issues are presented. This article describes the demographics of the…

  14. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    Science.gov (United States)

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes for archival purposes surveyed perceptions, use, and access by 259 United States based exemplar Primary and Secondary educators of computer-based games and technology for classroom instruction. Participating respondents were considered exemplary as they each won the Milken Educator Award during the 1996-2009…

  15. Do Mathematicians Integrate Computer Algebra Systems in University Teaching? Comparing a Literature Review to an International Survey Study

    Science.gov (United States)

    Marshall, Neil; Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2012-01-01

    We present a comparative study of a literature review of 326 selected contributions (Buteau, Marshall, Jarvis & Lavicza, 2010) to an international (US, UK, Hungary) survey of mathematicians (Lavicza, 2008) regarding the use of Computer Algebra Systems (CAS) in post-secondary mathematics education. The comparison results are organized with respect…

  16. Promoting CLT within a Computer Assisted Learning Environment: A Survey of the Communicative English Course of FLTC

    Science.gov (United States)

    Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed

    2012-01-01

    This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state of the art language laboratories. As…

  17. The Development of a Computer Assisted Math Review for Physical Science Survey Students at Brevard Community College.

    Science.gov (United States)

    Sherman, Joel F.

    A computer assisted mathematics review unit was designed for students enrolled in a community college physical science survey course, who had severe mathematical deficiences in their backgrounds. The CAI program (written in BASIC) covered multiplication and division of numbers written in scientific notation. Thirty-five students who scored zero…

  18. 1999 Survey of Active Duty Personnel: Administration, Datasets, and Codebook. Appendix G: Frequency and Percentage Distributions for Variables in the Survey Analysis Files.

    Science.gov (United States)

    2000-12-01

    Survey of Active Duty Personnel - Form A. SRED - What is the highest degree or level of school that you have completed? (MARK THE ONE ANSWER...) [Frequency and percentage distribution table not recoverable from the source.] Although this item asks for one (highest grade or degree) response, respondents frequently mark multiple responses. SRED is coded as a standard mark-one item, while SREDA-SREDH are coded as mark-all-that-apply. SREDHI equals SRED except where SRED has multiple responses.

  19. Nationwide radiation dose survey of computed tomography for fetal skeletal dysplasias

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, Osamu [National Center for Child Health and Development, Department of Radiology, Setagaya-ku, Tokyo (Japan); Sawai, Hideaki [Hyogo College of Medicine, Department of Obstetrics and Gynecology, Nishinomiya-shi, Hyogo (Japan); Murotsuki, Jun [Miyagi Children's Hospital, Department of Maternal and Fetal Medicine, Sendai-shi, Miyagi (Japan); Tohoku University Graduate School of Medicine, Department of Advanced Fetal and Developmental Medicine, Sendai-shi, Miyagi (Japan); Nishimura, Gen [Tokyo Metropolitan Children's Medical Center, Department of Pediatric Imaging, Fuchu-shi, Tokyo (Japan); Horiuchi, Tetsuya [National Center for Child Health and Development, Department of Radiology, Setagaya-ku, Tokyo (Japan); Osaka University, Department of Medical Physics and Engineering, Division of Medical Technology and Science, Course of Health Science, Graduate School of Medicine, Suita, Osaka (Japan)

    2014-08-15

    Recently, computed tomography (CT) has been used to diagnose fetal skeletal dysplasia. However, no surveys have been conducted to determine the radiation exposure dose and the diagnostic reference level (DRL). To collect CT dose index volume (CTDIvol) and dose length product (DLP) data from domestic hospitals implementing fetal skeletal 3-D CT and to establish DRLs for Japan. Scan data of 125 cases of 20 protocols from 16 hospitals were analyzed. The minimum, first-quartile, median, third-quartile and maximum values of CTDIvol and DLP were determined. The time-dependent change in radiation dose setting in hospitals with three or more cases with scans was also examined. The minimum, first-quartile, median, third-quartile and maximum CTDIvol values were 2.1, 3.7, 7.7, 11.3 and 23.1 mGy, respectively, and these values for DLP were 69.0, 122.3, 276.8, 382.6 and 1025.6 mGy.cm, respectively. Six of the 12 institutions reduced the dose setting during the implementation period. The DRLs of CTDIvol and DLP for fetal CT were 11.3 mGy and 382.6 mGy.cm, respectively. Institutions implementing fetal CT should use these established DRLs as the standard and make an effort to reduce radiation exposure by voluntarily decreasing the dose. (orig.)
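
    A DRL of this kind is conventionally set at the third quartile of the surveyed dose distribution, which is how the 11.3 mGy and 382.6 mGy.cm figures above arise. A minimal sketch of that computation (the values below are illustrative placeholders, not the survey data):

    ```python
    import numpy as np

    # Illustrative CTDIvol values (mGy) from a hypothetical dose survey
    ctdi_vol = np.array([2.1, 3.7, 5.2, 7.7, 9.8, 11.3, 14.6, 23.1])

    # Diagnostic reference level: third quartile (75th percentile) of the distribution
    drl = np.percentile(ctdi_vol, 75)
    print(f"Proposed DRL (CTDIvol): {drl:.1f} mGy")
    ```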

  20. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey.

    Science.gov (United States)

    Promponas, Vasilis J; Ouzounis, Christos A; Iliopoulos, Ioannis

    2014-05-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility of detecting functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected in citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of the 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or have yielded additional, corroborating evidence from biochemical observations. Our conclusion is that, with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology.

  1. Arecibo PALFA Survey and Einstein@Home: Binary Pulsar Discovery by Volunteer Computing

    Science.gov (United States)

    Knispel, B.; Lazarus, P.; Allen, B.; Anderson, D.; Aulbert, C.; Bhat, N. D. R.; Bock, O.; Bogdanov, S.; Brazier, A.; Camilo, F.; Chatterjee, S.; Cordes, J. M.; Crawford, F.; Deneva, J. S.; Desvignes, G.; Fehrmann, H.; Freire, P. C. C.; Hammer, D.; Hessels, J. W. T.; Jenet, F. A.; Kaspi, V. M.; Kramer, M.; van Leeuwen, J.; Lorimer, D. R.; Lyne, A. G.; Machenschalk, B.; McLaughlin, M. A.; Messenger, C.; Nice, D. J.; Papa, M. A.; Pletsch, H. J.; Prix, R.; Ransom, S. M.; Siemens, X.; Stairs, I. H.; Stappers, B. W.; Stovall, K.; Venkataraman, A.

    2011-05-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 M sun by analysis of spin period measurements. No evidence of orbital eccentricity is apparent; we set a 2σ upper limit e <~ 1.7 × 10-3. The orbital parameters suggest a massive white dwarf companion with a minimum mass of 0.95 M sun, assuming a pulsar mass of 1.4 M sun. Most likely, this pulsar belongs to the rare class of intermediate-mass binary pulsars. Future timing observations will aim to determine the parameters of this system further, measure relativistic effects, and elucidate the nature of the companion star.
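
    The quoted minimum companion mass follows from the standard binary mass function (a textbook relation, not specific to this paper), which can be checked directly against the published numbers:

    $$ f(M_p, M_c) = \frac{4\pi^2 (a_p \sin i)^3}{G P_b^2} = \frac{(M_c \sin i)^3}{(M_p + M_c)^2} $$

    Taking f = 0.15 M_sun, M_p = 1.4 M_sun and an edge-on orbit (sin i = 1), the companion mass satisfies M_c^3/(1.4 + M_c)^2 = 0.15, which gives M_c ≈ 0.95 M_sun, the minimum value quoted above.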

  2. Machine Learning in Computer-aided Diagnosis of the Thorax and Colon in CT: A Survey.

    Science.gov (United States)

    Suzuki, Kenji

    2013-04-01

    Computer-aided detection (CADe) and diagnosis (CAD) has been a rapidly growing, active area of research in medical imaging. Machine learning (ML) plays an essential role in CAD, because objects such as lesions and organs may not be represented accurately by a simple equation; thus, medical pattern recognition essentially requires "learning from examples." One of the most popular uses of ML is the classification of objects such as lesion candidates into certain classes (e.g., abnormal or normal, and lesions or non-lesions) based on input features (e.g., contrast and area) obtained from segmented lesion candidates. The task of ML is to determine "optimal" boundaries for separating classes in the multidimensional feature space formed by the input features. ML algorithms for classification include linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), multilayer perceptrons, and support vector machines (SVMs). Recently, pixel/voxel-based ML (PML) has emerged in medical image processing/analysis; it uses pixel/voxel values in images directly, instead of features calculated from segmented lesions, as input information, so that feature calculation or segmentation is not required. In this paper, ML techniques used in CAD schemes for the detection and diagnosis of lung nodules in thoracic CT and for the detection of polyps in CT colonography (CTC) are surveyed and reviewed.
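
    To make the classification step concrete, here is a minimal sketch of a feature-based lesion-candidate classifier in the spirit the survey describes (the feature values are synthetic, and scikit-learn is an assumed tool, not one named in the paper):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Synthetic candidate features [contrast, area]: class 0 = non-lesion, class 1 = lesion
    X = np.vstack([rng.normal([0.3, 20.0], [0.1, 5.0], (200, 2)),
                   rng.normal([0.6, 35.0], [0.1, 8.0], (200, 2))])
    y = np.repeat([0, 1], 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))   # boundary in feature space
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```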

  3. Arecibo PALFA Survey and Einstein@Home: Binary Pulsar Discovery by Volunteer Computing

    CERN Document Server

    Knispel, B; Allen, B; Anderson, D; Aulbert, C; Bhat, N D R; Bock, O; Bogdanov, S; Brazier, A; Camilo, F; Chatterjee, S; Cordes, J M; Crawford, F; Deneva, J S; Desvignes, G; Fehrmann, H; Freire, P C C; Hammer, D; Hessels, J W T; Jenet, F A; Kaspi, V M; Kramer, M; van Leeuwen, J; Lorimer, D R; Lyne, A G; Machenschalk, B; McLaughlin, M A; Messenger, C; Nice, D J; Papa, M A; Pletsch, H J; Prix, R; Ransom, S M; Siemens, X; Stairs, I H; Stappers, B W; Stovall, K; Venkataraman, A

    2011-01-01

    We report the discovery of the 20.7-ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular orbital solution with an orbital period of 9.4 hr, a projected orbital radius of 2.8 lt-s, and a mass function of f = 0.15 solar masses by analysis of spin period measurements. No evidence of orbital eccentricity is apparent; we set a 2-sigma upper limit e < 1.7e-3. The orbital parameters suggest a massive white dwarf companion with a minimum mass of 0.95 solar masses, assuming a pulsar mass of 1.4 solar masses. Most likely, this pulsar belongs to the rare class of intermediate mass binary pulsars. Future timing observations will aim to determine the parameters of this system further, measure relativistic effects, and elucidate the nature of the companion star.

  4. Research on Technology of Recovery Algorithm based on File Feature in Computer Forensics

    Institute of Scientific and Technical Information of China (English)

    李贵华; 荣世辉; 王刚

    2013-01-01

    To solve data recovery problems in computer forensics, this paper proposes a data recovery method based on the New Technology File System (NTFS). By analyzing the structure of the master file table (MFT), a recovery algorithm based on file features is presented. The algorithm identifies the start and end sectors of a lost file by scanning all sectors of the disk at fine granularity and matching them against the header and footer feature codes of each file type, and then recovers the file by restoring the data between the start and end sectors. Experimental results show that software designed around this algorithm achieves clear improvements in both search volume and efficiency.
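
    The header/footer matching the abstract describes is the classic signature-based carving idea. A minimal sketch for a single file type, assuming JPEG magic bytes and a raw image small enough to read into memory (real tools scan sector by sector):

    ```python
    # Signature-based carving: locate header/footer byte patterns, dump the span between them.
    JPEG_HEADER = b"\xff\xd8\xff"
    JPEG_FOOTER = b"\xff\xd9"

    def carve_jpegs(image_path, out_prefix="carved"):
        data = open(image_path, "rb").read()
        start, count = 0, 0
        while (head := data.find(JPEG_HEADER, start)) != -1:
            tail = data.find(JPEG_FOOTER, head + len(JPEG_HEADER))
            if tail == -1:
                break
            with open(f"{out_prefix}_{count}.jpg", "wb") as f:
                f.write(data[head:tail + len(JPEG_FOOTER)])
            count += 1
            start = tail + len(JPEG_FOOTER)
        return count

    print(carve_jpegs("disk.img"))    # hypothetical raw disk image
    ```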

  5. Survey update on implementation, indications, and technical performance of computed tomography colonography in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Fisichella, Valeria A.; Hellstroem, Mikael (Dept. of Radiology, Sahlgrenska Univ. Hospital and Sahlgrenska Academy at Univ. of Goeteborg, Goeteborg (Sweden)), e-mail: valeria.fisichella@vgregion.se

    2010-01-15

    Background: Computed tomographic colonography (CTC) has gained increased acceptance in the last few years as a valid substitute for double-contrast barium enema (DCBE). However, implementation of new technologies is complex, since several factors may influence the process. Purpose: To evaluate the current situation in Sweden concerning implementation of CTC, as compared to a previous national survey in 2005. Material and Methods: In December 2008, a structured, self-assessed questionnaire regarding implementation and technical performance of CTC was mailed to all radiology departments in Sweden. In March 2009, departments that had not replied were contacted by e-mail or by telephone. All (100%, 119/119) departments answered the questionnaire. Results: CTC is currently performed in 50/119 (42%) departments, i.e., 18 additional departments compared to 2005. Twenty-three out of 60 (38%) responding departments stated that they intend to start to perform CTC in the near future. DCBE is currently performed in 77/119 (65%) departments, 12 departments fewer than in 2005. The most common reasons for non-implementation of CTC are non-availability of a spiral CT scanner (41%, 26/64) and/or a multidetector-row CT scanner (39%, 25/64), and lack of doctors' time (34%, 22/64). Only 3% (2/64) of departments are 'awaiting further scientific documentation' on CTC, a significant reduction compared to 2005 (P=0.002). Until 2009, 59% (29/49) of CTC centers had performed more than 200 CTCs, compared to 13% (4/32) of CTC centers in 2005. Intravenous contrast material is routinely administered in 86% (42/49), and carbon dioxide is used to distend the colon in 90% (44/49). Almost all radiology departments (93%, 93/100) currently believe that CTC will 'absolutely' or 'probably' replace barium enema in the future, while in 2005 only 56% (55/99) gave similar answers. Conclusion: The survey reflects a further transition process from DCBE to CTC, with attitudes

  6. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  7. A Survey of Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    郑增威; 吴朝晖

    2003-01-01

    Pervasive computing, now receiving attention worldwide, is the computing mode of the 21st century. This article first analyzes the origins of Mark Weiser's pervasive computing idea and the forces driving its research over the course of computing's development. It then briefly describes the pervasive computing projects through which research organizations in the major developed countries, including companies and universities, have pursued Mark Weiser's vision. Next, it sketches a Layered Pervasive Computing (LPC) model and its related terms. Finally, the article closes with a discussion of the broad influence of pervasive computing technologies and, in light of current developments in computer hardware and software and the evolution of pervasive computing research, offers prospects for human living environments such as the economy, daily life, and work.

  8. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural impact

  9. Micro-computed Tomography Assessment of Dentinal Micro-cracks after Root Canal Preparation with TRUShape and Self-adjusting File Systems.

    Science.gov (United States)

    Zuolo, Mario Luis; De-Deus, Gustavo; Belladonna, Felipe Gonçalves; Silva, Emmanuel João Nogueira Leal da; Lopes, Ricardo Tadeu; Souza, Erick Miranda; Versiani, Marco Aurélio; Zaia, Alexandre Augusto

    2017-04-01

    The aim of the present study was to evaluate the percentage frequency of dentinal micro-cracks observed after root canal preparation with TRUShape and Self-Adjusting File (SAF) systems by means of micro-computed tomography imaging analysis. A conventional full-sequence rotary system (BioRace) and a single-file reciprocation system (Reciproc) were used as reference techniques for comparison because of their known assertive cutting efficiency. Forty anatomically matched mandibular incisors were selected, scanned at a resolution of 14.25 μm, and assigned to 4 experimental groups (n = 10), according to the preparation protocol: TRUShape, SAF, BioRace, and Reciproc systems. After the experimental procedures, the specimens were scanned again, and the registered preoperative and postoperative cross-section images of the roots (n = 70,030) were screened to identify the presence of dentinal micro-cracks. Overall, dentinal defects were observed in 28,790 cross-section images (41.11%). In the TRUShape, SAF, BioRace, and Reciproc groups, dentinal micro-cracks were visualized in 56.47% (n = 9842), 42.38% (n = 7450), 32.90% (n = 5826), and 32.77% (n = 5672) of the slices, respectively. All dentinal defects observed in the postoperative data sets were already present in the corresponding preoperative images. None of the preparation systems induced the formation of new dentinal micro-cracks. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. A hospital survey on the utilization of the master file of the standardized nursing practice terminology in Japan.

    Science.gov (United States)

    Tsuru, Satoko; Wako, Fumiko; Watanabe, Chitose; Uchiyama, Makiko; Okamine, Eiko; Inoue, Manami; Omori, Miho

    2013-01-01

    A common language in nursing facilitates better communication among nurses and other healthcare providers, assuring better nursing care and hence better patient outcomes. Having developed and disseminated a standardized terminology of nursing, which provides nurses with a set of terms to describe nursing observations and nursing actions, we ran a survey to see how widely it was recognized and utilized in actual clinical settings. The results showed that approximately 60% of the respondents were cognizant of our terminology, and again 60% of them were either actually using the terminology or interested in using it in the future. For them, the main purposes of utilizing the terminology were nursing documentation and care planning. Sometimes it was used as an educational tool. This suggests that we should further develop a tool to assist nurses with their documentation and care planning alongside the revision of the terminology itself.

  11. SURVEY

    DEFF Research Database (Denmark)

    SURVEY is a widely used method in, among other fields, social science, the humanities, psychology and health research. Outside the research world, too, many organizations, such as consulting firms, public institutions and the marketing departments of private companies, work with surveys. This book goes through all phases of survey work and gives a practical introduction to: • designing the study and selecting samples, • formulating questionnaires and collecting and coding data, • methods for analyzing the results

  12. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  13. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    Science.gov (United States)

    Barzaghi, Riccardo; Carrion, Daniela; Pepe, Massimiliano; Prezioso, Giuseppina

    2016-01-01

    Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η, the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations. PMID:27472333

  14. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    Directory of Open Access Journals (Sweden)

    Riccardo Barzaghi

    2016-07-01

    Full Text Available Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high-degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations.
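
    As background to both versions of this record, the two deflection components are conventionally tied to the geoid undulation N through the standard spherical approximations of physical geodesy (textbook relations, not specific to the collocation procedure used in the paper):

    $$ \xi = -\frac{1}{R}\frac{\partial N}{\partial \varphi}, \qquad \eta = -\frac{1}{R\cos\varphi}\frac{\partial N}{\partial \lambda} $$

    where R is a mean Earth radius, φ is latitude and λ is longitude; this is why an accurate geoid or geopotential model (ITALGEO05 or EGM2008 above) suffices to estimate (ξ, η) at survey altitudes.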

  15. A Survey of Denial-of-Service and Distributed Denial of Service Attacks and Defenses in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Adrien Bonguet

    2017-08-01

    Full Text Available Cloud Computing is a computing model that allows ubiquitous, convenient and on-demand access to a shared pool of highly configurable resources (e.g., networks, servers, storage, applications and services). Denial-of-Service (DoS) and Distributed Denial-of-Service (DDoS) attacks are serious threats to the Cloud services' availability due to numerous new vulnerabilities introduced by the nature of the Cloud, such as multi-tenancy and resource sharing. In this paper, new types of DoS and DDoS attacks in Cloud Computing are explored, especially the XML-DoS and HTTP-DoS attacks, and some possible detection and mitigation techniques are examined. This survey also provides an overview of the existing defense solutions and investigates the experiments and metrics that are usually designed and used to evaluate their performance, which is helpful for future research in the domain.

  16. Iowa Intensive Archaeological Survey

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — This shape file contains intensive level archaeological survey areas for the state of Iowa. All intensive Phase I surveys that are submitted to the State Historic...

  17. MAX and Survey Linkages

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS is interested in linking MAX files with survey data, including four surveys conducted by the National Center for Health Statistics (NCHS) - the National Health...

  19. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  20. Chapter 6. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment-East Texas basin and Louisiana-Mississippi salt basins provinces, Jurassic Smackover interior salt basins total petroleum system (504902), Travis Peak and Hosston formations.

    Science.gov (United States)

    ,

    2006-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on the CD-ROM. Computers and software may import the data without transcription from the Portable Document Format files (.pdf files) of the text by the reader. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).
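
    Loading these tab-delimited .tab files into analysis software is a one-line operation in most environments. A minimal sketch, assuming pandas and an illustrative file name (the actual names follow the CD-ROM layout):

    ```python
    import pandas as pd

    # Tab-delimited assessment table as distributed on the NOGA CD-ROM (name is illustrative)
    table = pd.read_csv("assessment_results.tab", sep="\t")
    print(table.head())
    ```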

  1. Chapter 3. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment--East Texas basin and Louisiana-Mississippi salt basins provinces, Jurassic Smackover Interior salt basins total petroleum system (504902), Cotton Valley group.

    Science.gov (United States)

    Klett, T.R.; Le, P.A.

    2006-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on the CD-ROM. Computers and software may import the data without transcription from the Portable Document Format files (.pdf files) of the text by the reader. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).

  2. Tabular data and graphical images in support of the U.S. Geological Survey National Oil and Gas Assessment--San Juan Basin Province (5022): Chapter 7 in Total petroleum systems and geologic assessment of undiscovered oil and gas resources in the San Juan Basin Province, exclusive of Paleozoic rocks, New Mexico and Colorado

    Science.gov (United States)

    Klett, T.R.; Le, P.A.

    2013-01-01

    This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report and archival data that permit the user to perform further analyses are available elsewhere on this CD–ROM. Computers and software may import the data without transcription from the Portable Document Format files (.pdf files) of the text by the reader. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in a raw form as tab-delimited text files (.tab files).

  3. A Survey of High-Quality Computational Libraries and their Impact in Science and Engineering Applications

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Hernandez, V.; Marques, O.; Roman, J.E.; Vidal, V.

    2004-09-20

    Recently, a number of important scientific and engineering problems have been successfully studied and solved by means of computational modeling and simulation. Many of these computational models and simulations benefited from the use of available software tools and libraries to achieve high performance and portability. In this article, we present a reference matrix of the performance of robust, reliable and widely used tools mapped to scientific and engineering applications that use them. We aim at regularly maintaining and disseminating this matrix to the computational science community. This matrix will contain information on state-of-the-art computational tools, their applications and their use.

  4. Some gender issues in educational computer use: results of an international comparative survey

    NARCIS (Netherlands)

    Janssen Reinen, Ingeborg; Plomp, Tjeerd

    1993-01-01

    In the framework of the Computers in Education international study of the International Association for the Evaluation of Educational Achievement (IEA), data have been collected concerning the use of computers in 21 countries. This article examines some results regarding the involvement of women in

  5. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    Science.gov (United States)

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  6. The use of computers in education worldwide : results from a comparative survey in 18 countries

    NARCIS (Netherlands)

    Pelgrum, Willem J.; Plomp, Tjeerd

    1991-01-01

    In 1989, the International Association for the Evaluation of Educational Achievement (IEA) Computers in Education study collected data on computer use in elementary, and lower- and upper-secondary education in 22 countries. Although all data sets from the participating countries had not been receive

  7. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  8. Brain-computer interfaces for multimodal interaction: a survey and principles

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Nijholt, Anton

    2012-01-01

    For decades, brain–computer interfaces (BCIs) have been used for restoring the communication and mobility of disabled people through applications such as spellers, web browsers, and wheelchair controls. In parallel to advances in computational intelligence and the production of consumer BCI products

  9. Product identification file

    Energy Technology Data Exchange (ETDEWEB)

    Gray, C.E. (ed.)

    1978-06-01

    This product identification file was compiled as an aid to the industrial hygienist who may encounter the products listed in surveys for and studies of occupational health hazards. It is pointed out that the chemical composition of a product may vary from year to year and some components may be added or deleted without an indication on the label. Some of the data in this file may not be complete depending on the analysis requested. For example, a solvent may be the only component for which the product was analyzed. The file is arranged by listing the chemical manufacturer, followed by the trade name. In cases where no manufacturer is known, the trade name appears in alphabetical order. The log number and the chemist who analyzed the product are listed for reference.

  10. Analysis of Computer System Vulnerability Based on the System Log File

    Institute of Scientific and Technical Information of China (English)

    黄波

    2012-01-01

    With the development of the information society, the security of computer systems has become an important part of information security assurance. Based on the form, structure and content of the log files produced by Windows systems, database systems and firewall systems, this article analyses the security of log files in computer operating systems and network systems and the protective role they play, and explains how system log files can serve as the basis for computer system vulnerability analysis.

  11. An exploratory survey of design science research amongst South African computing scholars

    CSIR Research Space (South Africa)

    Naidoo, R

    2012-10-01

    Full Text Available The debate continues as to whether the traditional focus of computing research on theory development and verification has adequate immediate practical relevance. Despite increasing claims of the potential of design science research (DSR...

  12. Survey of ONERA activities on adaptive-wall applications and computation of residual corrections

    Science.gov (United States)

    Chevallier, J. P.

    1984-01-01

    The research undertaken concerning the computation and/or reduction of wall interference follows two main axes: improvement of wall correction determinations, and use of adaptive flexible walls. The use of wall-measured data to compute interference effects is reliable when the model representation is assessed by signatures with known boundary conditions. When the computed interferences are not easily applicable to correcting the results (especially for gradients in two-dimensional cases), the flexible adaptive walls in operation in T2 are an efficient and assessed means of reducing the boundary effects to a negligible level, if the direction and speed of the flow are accurately measured on the boundary. The extension of the use of adaptive walls to three-dimensional cases may be attempted since the residual corrections are assumed to be small and are computable.

  13. Secure Human-Computer Identification against Peeping Attacks (SecHCI): A Survey

    OpenAIRE

    Li, SJ; Shum, HY

    2003-01-01

    This paper focuses on human-computer identification systems against peeping attacks, in which adversaries can observe (and even control) interactions between humans (provers) and computers (verifiers). Real cases of peeping attacks were reported by Ross J. Anderson ten years ago. Fixed passwords are insecure against peeping attacks, since adversaries can simply replay the observed passwords. Some identification techniques can be used to defeat peeping attacks, but auxiliary devices must be used ...

  14. Holism, ambiguity and approximation in the logics of quantum computation: a survey

    Science.gov (United States)

    Dalla Chiara, Maria Luisa; Giuntini, Roberto; Leporini, Roberto

    2011-01-01

    Quantum computation has suggested some new forms of quantum logic (called quantum computational logics), where meanings of sentences are identified with quantum information quantities. This provides a mathematical formalism for an abstract theory of meanings that can be applied to investigate different kinds of semantic phenomena (in social sciences, in medicine, in natural languages and in the languages of art), where both ambiguity and holism play an essential role.

  15. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    Science.gov (United States)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?
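
    As a concrete illustration of the signal processing chain the survey examines (feature extraction followed by classification), here is a minimal, self-contained sketch on synthetic EEG-like data. It is not any specific design from the surveyed literature; the sampling rate, band limits and nearest-mean classifier are all illustrative choices.

    ```python
    # Minimal sketch of a canonical BCI signal-processing chain: band-power
    # feature extraction followed by a simple classifier. Synthetic two-class
    # EEG-like data; a real BCI uses proper filtering, artifact handling and
    # validated classifiers. All parameters are illustrative.
    import numpy as np

    fs = 250                      # sampling rate in Hz (illustrative)
    rng = np.random.default_rng(0)

    def band_power(trial, lo, hi):
        """Power of one EEG trial in the [lo, hi] Hz band, via the FFT."""
        spec = np.abs(np.fft.rfft(trial)) ** 2
        freqs = np.fft.rfftfreq(len(trial), 1 / fs)
        return spec[(freqs >= lo) & (freqs <= hi)].sum()

    def make_trial(mu_amp):
        """Synthetic 1 s trial: noise plus a mu-band (10 Hz) oscillation."""
        t = np.arange(fs) / fs
        return rng.normal(0, 1, fs) + mu_amp * np.sin(2 * np.pi * 10 * t)

    # Two classes, e.g. "rest" (strong mu rhythm) vs "motor imagery" (mu suppressed).
    train = [(make_trial(2.0), 0) for _ in range(40)] + \
            [(make_trial(0.5), 1) for _ in range(40)]
    feats = np.array([[band_power(x, 8, 12)] for x, _ in train])
    labels = np.array([y for _, y in train])

    # Nearest-class-mean classifier on the single band-power feature.
    means = [feats[labels == c].mean(axis=0) for c in (0, 1)]
    def classify(trial):
        f = np.array([band_power(trial, 8, 12)])
        return int(np.argmin([np.linalg.norm(f - m) for m in means]))

    print("predicted class for a suppressed-mu trial:", classify(make_trial(0.5)))
    ```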

  16. A Comparative Survey of Lotka and Pao’s Laws Conformity with the Number of Researchers and Their Articles in Computer Science and Artificial Intelligence Fields in Web of Science (1986-2009)

    Directory of Open Access Journals (Sweden)

    Farideh Osareh

    2011-10-01

    Full Text Available The purpose of this research was to examine the validity of Lotka's and Pao's laws against the authorship distribution of the "Computer Science" and "Artificial Intelligence" fields in the Web of Science (WoS) during 1986 to 2009, and to compare the results. The study used citation analysis, a scientometrics technique. The research sample included all articles in the computer science and artificial intelligence fields indexed in the databases accessible via Web of Science during 1986-2009; these were stored in files of 500 records and loaded into the "ISI.exe" software for analysis, and the required output of this software was saved in Excel. There were 19150 articles in the computer science field (by 45713 authors) and 958 articles in the artificial intelligence field (by 2487 authors). For final counting and analysis, the data were converted to Excel spreadsheets. Lotka's law was tested using Lotka's inverse square formula, x^2 · y_x = c, where y_x is the proportion of authors contributing x articles; for testing Pao's law, the values of the exponent n and the constant c of the generalized form x^n · y_x = c were computed, and Kolmogorov-Smirnov goodness-of-fit tests were applied. The results suggested that the author productivity distribution predicted by Lotka's generalized inverse square law was not applicable to computer science and artificial intelligence, but that Pao's law was applicable to these subject areas. Both the literature survey and the original examination of Lotka's and Pao's laws showed that several aspects should be considered. The main elements involved in fitting such a bibliometrics model are: the use of Lotka's or Pao's law, the subject area, the period of time, the measurement of authors, and a criterion for assessing goodness-of-fit.
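
    As an illustration of the fitting procedure described above, the following sketch estimates the exponent n and constant c of the generalized law y_x = c / x^n and applies a Kolmogorov-Smirnov test. It is a reconstruction of the general method, not the authors' ISI.exe/Excel workflow, and the author-productivity sample is invented.

    ```python
    # Sketch: fit Lotka's generalized law y_x = c / x**n to author-productivity
    # data and test the fit with a Kolmogorov-Smirnov statistic.
    # Illustrative reconstruction; the `papers_per_author` sample is invented.
    from collections import Counter
    import math

    papers_per_author = [1]*600 + [2]*150 + [3]*70 + [4]*35 + [5]*20 + [6]*10

    counts = Counter(papers_per_author)            # x -> number of authors with x papers
    total_authors = sum(counts.values())
    xs = sorted(counts)
    ys = [counts[x] / total_authors for x in xs]   # observed proportions y_x

    # Estimate the exponent n by least squares on log y = log c - n log x.
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    mean_lx, mean_ly = sum(lx) / len(lx), sum(ly) / len(ly)
    n = -sum((a - mean_lx) * (b - mean_ly) for a, b in zip(lx, ly)) / \
         sum((a - mean_lx) ** 2 for a in lx)

    # Pao's method: c = 1 / sum_{x=1..P} x^-n (the infinite sum truncated at P).
    P = 20
    c = 1.0 / sum(x ** -n for x in range(1, P + 1))

    # Kolmogorov-Smirnov statistic between observed and theoretical CDFs.
    obs_cdf = theo_cdf = ks = 0.0
    for x, y in zip(xs, ys):
        obs_cdf += y
        theo_cdf += c / x ** n
        ks = max(ks, abs(obs_cdf - theo_cdf))

    # Critical value at the 0.01 level, as commonly used in Lotka-law studies.
    critical = 1.63 / math.sqrt(total_authors)
    print(f"n={n:.2f} c={c:.3f} KS={ks:.4f} critical={critical:.4f} "
          f"fit={'accepted' if ks < critical else 'rejected'}")
    ```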

  17. Comma-Separated Value files of the raw sound velocity profiles collected by the U.S. Geological Survey off the southern shore of Martha's Vineyard, MA, 2007 (CSV Files)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Woods Hole Science Center conducted a nearshore geophysical survey offshore of the southern coast of Martha's Vineyard, in the vicinity of the Martha's...

  18. OTEC cold water pipe: a survey of available shell analysis computer programs and implications of hydrodynamic loadings

    Energy Technology Data Exchange (ETDEWEB)

    Pompa, J.A.; Allik, H.; Webman, K.; Spaulding, M.

    1979-02-01

    The design and analysis of the cold water pipe (CWP) is one of the most important technological problems to be solved in the OTEC ocean engineering program. Analytical computer models have to be developed and verified in order to provide an engineering approach for the OTEC CWP with regard to environmental factors such as waves, currents, and platform motions, and for various structural configurations and materials such as rigid-wall, compliant, and stockade CWPs. To this end, Analysis and Technology, Inc. has performed a review and evaluation of shell structural analysis computer programs applicable to the design of an OTEC CWP. Included in this evaluation are discussions of the hydrodynamic flow field, structure-fluid interaction, and state-of-the-art analytical procedures for the analysis of offshore structures. The analytical procedures which must be incorporated into the design of a CWP are described, along with a brief review of the state of the art for analysis of offshore structures and the need for a shell analysis for the OTEC CWP. The report surveys available shell computer programs, both special purpose and general purpose, discusses the features of these dynamic shell programs and how the hydrodynamic loads are represented within them, describes the hydrodynamic loads design criteria for the CWP, and presents an assessment of the current state of knowledge for hydrodynamic loads. (WHK)

  19. Drug Metabolism in Preclinical Drug Development: A Survey of the Discovery Process, Toxicology, and Computational Tools.

    Science.gov (United States)

    Issa, Naiem T; Wathieu, Henri; Ojo, Abiola; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2017-03-15

    Drug development suffers from increased R&D spending and high failure rates, due in part to inadequate prediction of drug metabolism and its consequences in the human body. Hence, there is a need for computational methods to supplement and complement current biological assessment strategies. In this review, we provide an overview of drug metabolism in pharmacology, and discuss the current in vitro and in vivo strategies for assessing drug metabolism in preclinical drug development. We highlight computational tools available to the scientific community for the in silico prediction of drug metabolism, and examine how these tools have been implemented to produce drug-target signatures relevant to metabolic routes. Computational workflows that assess drug metabolism and its toxicological and pharmacokinetic effects, such as by applying the adverse outcome pathway framework for risk assessment, may improve the efficiency and speed of preclinical drug development.

  20. Audio computer-assisted self interview compared to traditional interview in an HIV-related behavioral survey in Vietnam.

    Science.gov (United States)

    Le, Linh Cu; Vu, Lan T H

    2012-10-01

    Globally, population surveys on HIV/AIDS and other sensitive topics have been using audio computer-assisted self interview for many years. This interview technique, however, is still new to Vietnam and little is known about its application and impact in general population surveys. One plausible hypothesis is that residents of Vietnam interviewed using this technique may provide a higher response rate and be more willing to reveal their true behaviors than if interviewed with traditional methods. This study aims to compare audio computer-assisted self interview with traditional face-to-face personal interview and self-administered interview with regard to rates of refusal and affirmative responses to questions on sensitive topics related to HIV/AIDS. In June 2010, a randomized study was conducted in three cities (Ha Noi, Da Nang and Can Tho), using a sample of 4049 residents aged 15 to 49 years. Respondents were randomly assigned to one of three interviewing methods: audio computer-assisted self interview, personal face-to-face interview, and self-administered paper interview. Instead of providing answers directly to interviewer questions as with traditional methods, audio computer-assisted self-interview respondents read the questions displayed on a laptop screen, while listening to the questions through audio headphones, then entered responses using a laptop keyboard. A MySQL database was used for data management and the SPSS statistical package version 18 was used for data analysis with bivariate and multivariate statistical techniques. Rates of high-risk behaviors and mean values of continuous variables were compared for the three data collection methods. Audio computer-assisted self interview showed advantages over the comparison techniques, achieving lower refusal rates and reporting higher prevalence of some sensitive and risk behaviors (perhaps an indication of more truthful answers). Premarital sex was reported by 20.4% in the audio computer-assisted self-interview survey…

  1. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    Science.gov (United States)

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  2. Effects of Gender on Computer-Mediated Communication: A Survey of University Faculty

    Science.gov (United States)

    Valenziano, Laura

    2007-01-01

    The influence of gender on computer-mediated communication is a research area with tremendous growth. This study sought to determine what gender effects exist in email communication between professors and students. The study also explored the amount of lying and misinterpretation that occurs through online communication. The study results indicate…

  3. Using Computers in Distance Study: Results of a Survey amongst Disabled Distance Students.

    Science.gov (United States)

    Ommerborn, Rainer; Schuemer, Rudolf

    2002-01-01

    In the euphoria about new technologies in distance education there exists the danger of not sufficiently considering how ever increasing "virtualization" may exclude some student groups. An explorative study was conducted that asked disabled students about their experiences with using computers and the Internet. Overall, those questioned…

  4. Photo-Modeling and Cloud Computing. Applications in the Survey of Late Gothic Architectural Elements

    Science.gov (United States)

    Casu, P.; Pisu, C.

    2013-02-01

    This work proposes the application of the latest methods of photo-modeling to the study of Gothic architecture in Sardinia. The aim is to assess the versatility and ease of use of such documentation tools for studying architecture and its ornamental details. The paper illustrates a procedure of integrated survey and restitution, with the purpose of obtaining an accurate 3D model of some Gothic portals. We combined the contact survey and the photographic survey oriented to photo-modeling. The software used is 123D Catch by Autodesk, a freely available Image Based Modelling (IBM) system. It is a web-based application that requires a few simple steps to produce a mesh from a set of unoriented photos. We tested the application on four portals, working at different scales of detail: first the whole portal and then the different architectural elements that compose it. We were able to model all the elements and to quickly extrapolate simple sections, in order to make a comparison between the moldings, highlighting similarities and differences. Working at different sites and at different scales of detail allowed us to test the procedure under different conditions of exposure, sunshine, accessibility, surface degradation and material type, and with different equipment and operators, showing whether the final result could be affected by these factors. We tested a procedure, articulated in a few repeatable steps, that can be applied, with the right corrections and adaptations, to similar cases and/or larger or smaller elements.

  5. A Method to Defend File-Attacking

    Institute of Scientific and Technical Information of China (English)

    HE Hongjun; LUO Li; CAO Sihua; FENG Tao; PAN Li; ZOU Zhiji

    2006-01-01

    The paper points out that the deep reason why modern computer systems fail to defend against malware lies in the fact that the user has no means to control access to information, and it proposes an explicit authorization mechanism. Its basic idea is that the user explicitly authorizes a program with the set of files it may access, and all file access operations are monitored; once a program requests access to a file outside its authorized file set, the request is refused, which means that the program is malicious or has design errors. Computers based on this novel mechanism can reliably protect information from attack, and have good software and hardware compatibility. A testing system is presented to validate the theory.
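
    A minimal sketch of the explicit-authorization idea, under stated assumptions: the paper describes an OS-level mechanism and gives no implementation details, so the in-process Python wrapper, the AUTHORIZED set and guarded_open below are purely illustrative.

    ```python
    # Sketch of the explicit-authorization idea: the user grants a program a
    # file set up front; any access outside that set is refused and flagged.
    # Illustrative only -- the paper describes an OS-level monitor, not a
    # Python wrapper; `AUTHORIZED` and `guarded_open` are invented names.
    import builtins
    from pathlib import Path

    # File set the user has explicitly authorized for this program.
    AUTHORIZED = {Path("data/input.txt").resolve(), Path("data/output.txt").resolve()}

    class UnauthorizedAccess(PermissionError):
        """Raised when a program touches a file outside its authorized set."""

    _real_open = builtins.open

    def guarded_open(file, mode="r", *args, **kwargs):
        path = Path(file).resolve()
        if path not in AUTHORIZED:
            # Per the paper's policy: a request outside the authorized set means
            # the program is malicious or has a design error, so refuse it.
            raise UnauthorizedAccess(f"access to {path} was not authorized")
        return _real_open(file, mode, *args, **kwargs)

    builtins.open = guarded_open  # monitor all file access in this process
    ```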

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  7. Market Survey and Analysis in Support of ASAS Computer-Based Training System Design

    Science.gov (United States)

    1988-11-01

    response. 10. CBT Can Be Made Adaptive. Just as a good instructor adapts his teaching technique, pace, and content to the abilities and perceived... Figure 4-2. A PAWS is converted to training mode by unplugging or switching off the standard terminals and connecting the Matrox EIDS stations. The... LANGUAGE: TenCORE Language Authoring System and TenCORE Assistant, Computer Teaching Corporation, 1713 South Neil Street, Champaign, IL 61820, (217) 352-6363

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  11. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  12. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  13. Computational analysis in epilepsy neuroimaging: A survey of features and methods

    Directory of Open Access Journals (Sweden)

    Lohith G. Kini

    2016-01-01

    Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug-resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low-grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for aggregating and sharing data and algorithms, can play a vital role in reducing the cost of care, the risks of invasive treatments, and improve overall outcomes for patients with epilepsy.

  14. From Monge to Higgs: a survey of distance computations in noncommutative geometry

    CERN Document Server

    Martinetti, Pierre

    2016-01-01

    This is a review of explicit computations of Connes distance in noncommutative geometry, covering finite dimensional spectral triples, almost-commutative geometries, and spectral triples on the algebra of compact operators. Several applications to physics are covered, like the metric interpretation of the Higgs field, and the comparison of Connes distance with the minimal length that emerges in various models of quantum spacetime. Links with other areas of mathematics are studied, in particular the horizontal distance in sub-Riemannian geometry. The interpretation of Connes distance as a noncommutative version of the Monge-Kantorovich metric in optimal transport is also discussed.
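
    For reference, the quantity whose explicit computations are being reviewed is Connes' spectral distance between states; this standard formula (well established in the literature the review covers) reads:

    ```latex
    % Connes' spectral distance between two states \omega_1, \omega_2 on an
    % algebra \mathcal{A}, for a spectral triple with Dirac operator D:
    d(\omega_1, \omega_2) = \sup_{a \in \mathcal{A}}
      \left\{ \left|\omega_1(a) - \omega_2(a)\right| \;:\; \left\|[D, a]\right\| \le 1 \right\}
    ```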

  15. THE VIEW OF THE TECHNICAL AND COMPUTER SCIENCE SPECIALIZATION IN THE SURVEY OF THEIR STUDENTS

    Directory of Open Access Journals (Sweden)

    Mirosław Malec

    2012-12-01

    Full Text Available The article presents the results of selected questionnaire research among Institute of Technology students in the Technical and Computer Science specialization. Students from the first and the last year of study were chosen, so that the attitude towards the specialization and the university could be defined for students who are finishing their studies, as well as the expectations of those who had just begun them. The collected data were analyzed with the use of the SPSS programme.

  16. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined.

  17. Survey on Reservoir Computing

    Institute of Scientific and Technical Information of China (English)

    彭宇; 王建民; 彭喜元

    2011-01-01

    A novel training method for recurrent neural networks, called reservoir computing, was proposed to deal with the difficulties in training traditional recurrent neural networks. The core idea of reservoir computing is to train only part of the connection weights of the network; the remaining weights are generated randomly and remain unchanged during training, so that training generally reduces to solving a linear regression problem. More broadly, the reservoir can be regarded as a temporal kernel function, which extends the applications of reservoir computing well beyond a mere modification of recurrent-network training algorithms. Starting from the basic mathematical model of reservoir computing, this paper comprehensively analyses the current research status, hot topics and applications of reservoir computing from the perspective of reservoir adaptation, a question that has recently attracted much research interest.
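
    A minimal echo-state-network sketch of the idea described in this abstract: the reservoir weights are generated randomly and left untouched, and only the linear readout is trained by solving a (ridge) linear regression. All hyperparameters and the toy task are illustrative, not taken from the surveyed literature.

    ```python
    # Minimal echo state network: the recurrent "reservoir" weights are random
    # and fixed; only the linear readout is trained, via ridge regression.
    # Illustrative sketch of the idea in the abstract; hyperparameters invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res = 1, 100

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed input weights
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed reservoir weights
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

    def run_reservoir(u):
        """Collect reservoir states for an input sequence u of shape (T, n_in)."""
        x = np.zeros(n_res)
        states = []
        for u_t in u:
            x = np.tanh(W_in @ u_t + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Toy task: predict the next value of a sine wave.
    t = np.linspace(0, 20 * np.pi, 2000)
    u = np.sin(t).reshape(-1, 1)
    X = run_reservoir(u[:-1])
    y = u[1:]

    # Train only the readout: ridge regression on the collected states.
    ridge = 1e-6
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y).T

    pred = X @ W_out.T
    print("train MSE:", np.mean((pred - y) ** 2))
    ```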

  18. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    Computer forensic analysts are in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. It is difficult to reconstruct such a file, because it often loses its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) method that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments for three data types.
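
    A minimal sketch of the classification step: score a fragment by its LCS similarity to per-type reference byte patterns and pick the best match. The training/testing/validation pipeline of the article is not reproduced here, and the reference signatures below are invented for illustration.

    ```python
    # Sketch: classify a file fragment by the length of its longest common
    # subsequence (LCS) with reference byte patterns per file type.
    # The signatures below are invented; the article derives its references
    # from a training/testing/validation pipeline not reproduced here.

    def lcs_length(a: bytes, b: bytes) -> int:
        """Classic O(len(a)*len(b)) dynamic-programming LCS length."""
        prev = [0] * (len(b) + 1)
        for x in a:
            cur = [0]
            for j, y in enumerate(b, 1):
                cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
            prev = cur
        return prev[-1]

    # Hypothetical per-type reference patterns (e.g. learned from training data).
    SIGNATURES = {
        "jpeg": bytes.fromhex("ffd8ffe000104a464946"),
        "png":  bytes.fromhex("89504e470d0a1a0a0000"),
        "pdf":  b"%PDF-1.4\n%",
    }

    def classify_fragment(fragment: bytes) -> str:
        """Pick the type whose signature has the longest relative LCS."""
        scores = {t: lcs_length(fragment, sig) / len(sig)
                  for t, sig in SIGNATURES.items()}
        return max(scores, key=scores.get)

    print(classify_fragment(b"%PDF-1.4\n% some deleted page content"))  # -> pdf
    ```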

  19. Vertical Files in Midlands Academic Libraries.

    Science.gov (United States)

    Lillis, John G.

    1991-01-01

    Reviews survey responses from 127 nonmedical academic libraries in Nebraska, Iowa, and Kansas regarding their vertical files (e.g., acquisitions, weeding, size, nature, collection management, frequency of use, maintenance of statistics, types of users, circulation, and security), reporting that 109 had vertical files, with most emphasizing topics…

  20. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root-canal-filling. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). Then, the teeth were rescanned, and the percentage volume of the residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file = TF Adaptive > ProTaper UR = Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed by using Reciproc and ProTaper UR instruments than TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  1. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    Energy Technology Data Exchange (ETDEWEB)

    Benatti, Fabio [Trieste Univ., Miramare (Italy). Dipt. Fisica Teorica; Fannes, Mark [Leuven Univ. (Belgium). Inst. voor Theoretische Fysica; Floreanini, Roberto [INFN, Trieste (Italy). Dipt. di Fisica Teorica; Petritis, Dimitri (eds.) [Rennes 1 Univ., 35 (France). Inst. de Recherche Mathematique de Rennes

    2010-07-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data).] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. Identifiable Data Files - Denominator File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Denominator File combines Medicare beneficiary entitlement status information from administrative enrollment records with third-party payer information and GHP...

  4. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer, the ubiquitous portal of work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible by only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati…

  5. Survey and comparison for Open and closed sources in cloud computing

    Directory of Open Access Journals (Sweden)

    Nadir Kamal Salih

    2012-05-01

    Full Text Available Cloud computing is a new technology that has been widely studied in recent years. There are now many cloud platforms, both in industry and in academic circles. How to understand and use these platforms is a big issue. This paper presents a detailed comparison focused on aspects such as architecture, characteristics and applications. To show the differences between open source and closed source in the cloud environment, we give examples of Software-as-a-Service, Platform-as-a-Service and Infrastructure-as-a-Service offerings and compare them. Before concluding, we demonstrate some convergences and differences between open and closed platforms, and we conclude that open source should be the best option.

  6. Self-report computer-based survey of technology use by people with intellectual and developmental disabilities.

    Science.gov (United States)

    Tanis, Emily Shea; Palmer, Susan; Wehmeyer, Michael; Davies, Daniel K; Stock, Steven E; Lobb, Kathy; Bishop, Barbara

    2012-02-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a technological divide with regard to the use of such technologies by people with intellectual and developmental disabilities when compared with the use reported by the general public. To provide current information on technology use by people with intellectual and developmental disabilities, we examined the technology needs, use, and barriers to such use experienced by 180 adults with intellectual and developmental disabilities, using QuestNet, a self-directed computer survey program. Results suggest that although there has been progress in technology acquisition and use by people with intellectual and developmental disabilities, an underutilization of technologies across the population remains.

  7. A Self-Report Computer-Based Survey of Technology Use by People with Intellectual and Developmental Disabilities

    Science.gov (United States)

    Tanis, Emily Shea; Palmer, Susan B.; Wehmeyer, Michael L.; Davies, Daniel; Stock, Steven; Lobb, Kathy; Bishop, Barbara

    2014-01-01

    Advancements of technologies in the areas of mobility, hearing and vision, communication, and daily living for people with intellectual and developmental disabilities (IDD) have the potential to greatly enhance independence and self-determination. Previous research, however, suggests that there is a “technological divide” with regard to the use of such technologies by people with IDD when compared with the general public. The present study sought to provide current information with regard to technology use by people with IDD by examining the technology needs, use, and barriers to such use experienced by 180 adults with IDD through QuestNet, a self-directed computer survey program. The study findings suggest that although there has been progress in technology acquisition and use by people with IDD, there remains an underutilization of technologies across the population. PMID:22316226

  8. Text Files of the DGPS Navigation Logged with HYPACK Software on U.S. Geological Survey Cruise 2012-002-FA from June 11 to June 14, 2012

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS), in cooperation with the National Oceanic and Atmospheric Administration (NOAA), is producing detailed geologic maps of the coastal...

  10. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    Science.gov (United States)

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions in, the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  11. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    Directory of Open Access Journals (Sweden)

    Minkyu Ahn

    2014-08-01

    Full Text Available In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions in, the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. Text files of the Wide Area Augmentation System (WAAS) navigation collected by the U.S. Geological Survey in Moultonborough Bay, Lake Winnipesaukee, New Hampshire in 2005 (Geographic, WGS 84, HYPACK ASCII Text Files)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — In freshwater bodies of New Hampshire, the most problematic aquatic invasive plant species is Myriophyllum heterophyllum or variable leaf water-milfoil. Once...

  15. 2005-004-FA_HYPACK: Text files of the Wide Area Augmentation System (WAAS) navigation collected by the U.S. Geological Survey in Moultonborough Bay, Lake Winnipesaukee, New Hampshire in 2005 (Geographic, WGS 84, HYPACK ASCII Text Files)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — In freshwater bodies of New Hampshire, the most problematic aquatic invasive plant species is Myriophyllum heterophyllum or variable leaf water-milfoil. Once...

  16. Download this PDF file

    African Journals Online (AJOL)

    An Empirical Survey of Technology Application in Teaching Geography in Nigerian Secondary … 55% of Geography teachers had access to computers but did not have the pre-requisite ICT skills … widely affect the students in future as regards …

  17. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv; Jayaraman, Prem Prakash; Kolodziej, Joanna; Balaji, Pavan; Zeadally, Sherali; Malluhi, Qutaibah Marwan; Tziritas, Nikos; Vishnu, Abhinav; Khan, Samee U.; Zomaya, Albert

    2014-06-06

    In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications, and the like) having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaptation policy, objective function, allocation method, allocation operation, and interoperability.
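
    As a toy illustration of the kind of allocation method such a taxonomy classifies, the sketch below packs virtual machines onto as few servers as possible (first-fit decreasing) so that unused servers can be powered down. All capacities and demands are invented, and real techniques in the survey involve far more dimensions (adaptation policies, migration, multiple resource types).

    ```python
    # Toy energy-aware allocation heuristic: first-fit decreasing placement
    # packs VM CPU demands onto as few servers as possible, so idle servers
    # can be powered down. Capacities and demands are invented; real schemes
    # in the survey handle multiple resources, SLAs, migration, etc.
    def first_fit_decreasing(vm_demands, server_capacity):
        """Pack VM demands onto servers; returns remaining capacity per server."""
        servers = []  # each entry: remaining capacity of one active server
        for demand in sorted(vm_demands, reverse=True):
            for i, free in enumerate(servers):
                if demand <= free:
                    servers[i] = free - demand
                    break
            else:
                servers.append(server_capacity - demand)  # power on a new server
        return servers

    vms = [0.5, 0.2, 0.7, 0.1, 0.4, 0.3]     # normalized CPU demands
    active = first_fit_decreasing(vms, server_capacity=1.0)
    print(f"{len(active)} active servers; residual capacities: {active}")
    ```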

  18. A survey of radiation dose to patients and operators during radiofrequency ablation using computed tomography

    Science.gov (United States)

    Saidatul, A; Azlan, CA; Megat Amin, MSA; Abdullah, BJJ; Ng, KH

    2010-01-01

    Computed tomography (CT) fluoroscopy is able to give real time images to a physician undertaking minimally invasive procedures such as biopsies, percutaneous drainage, and radio frequency ablation (RFA). Both operators executing the procedure and patients are thus at risk of radiation exposure during CT fluoroscopy. This study focuses on the radiation exposure present during a series of radio frequency ablation (RFA) procedures, and used Gafchromic film (Type XR-QA; International Specialty Products, USA) and thermoluminescent dosimeters (TLD-100H; Bicron, USA) to measure the radiation received by patients undergoing treatment, and also by operators subject to scatter radiation. The voltage was held constant at 120 kVp and the current at 70 mA, with 5 mm thickness. The duration of irradiation was between 150-638 seconds. Ultimately, from a sample of 30 livers that had undergone RFA, the study revealed that the operator received the highest dose at the hands, followed by the eyes and thyroid, while secondary staff dosage was moderately uniform across all parts of the body that were measured. PMID:21611060

  19. A survey of radiation dose to patients and operators during radiofrequency ablation using computed tomography.

    Science.gov (United States)

    Saidatul, A; Azlan, CA; Megat Amin, MSA; Abdullah, BJJ; Ng, KH

    2010-01-01

    Computed tomography (CT) fluoroscopy is able to give real time images to a physician undertaking minimally invasive procedures such as biopsies, percutaneous drainage, and radio frequency ablation (RFA). Both operators executing the procedure and patients are thus at risk of radiation exposure during CT fluoroscopy. This study focuses on the radiation exposure present during a series of radio frequency ablation (RFA) procedures, and used Gafchromic film (Type XR-QA; International Specialty Products, USA) and thermoluminescent dosimeters (TLD-100H; Bicron, USA) to measure the radiation received by patients undergoing treatment, and also by operators subject to scatter radiation. The voltage was held constant at 120 kVp and the current at 70 mA, with 5 mm thickness. The duration of irradiation was between 150-638 seconds. Ultimately, from a sample of 30 livers that had undergone RFA, the study revealed that the operator received the highest dose at the hands, followed by the eyes and thyroid, while secondary staff dosage was moderately uniform across all parts of the body that were measured.

  20. Computational survey of representative energetic materials as propellants for microthruster applications

    Science.gov (United States)

    Fuchs, Brian; Stec, Daniel, III

    2007-04-01

    Microthrusters are critical for the development of terrestrial micromissiles and nano air vehicles for reconnaissance, surveillance, and sensor emplacement. With the maturation of MEMS manufacturing technology, the physical components of the thrusters can be readily fabricated. The most straightforward thruster type uses chemical combustion of a propellant that is ignited by a heating element, giving a single-shot thrust. Arrays of MEMS manufactured thrusters can be ganged to give multiple firings. The basic model for such a system is a solid rocket motor. The desired elements for the propellant of a chemical thruster are high specific impulse (Isp), high temperature and pressure, and low molecular weight combustion gases. Since the combustion chamber of a microthruster is extremely small, the propellant material must be able to ignite, sustain and complete its burn inside the chamber. The propellant can be either a solid or a liquid. There are a large number of energetic materials available as candidates for a propellant for microthrusters. There has been no systematic evaluation of the available energetic materials as propellant candidates for microthrusters. This report summarizes computations done on a series of energetic materials to address their suitabilities as microthruster propellants.

  1. Accurate treatments of electrostatics for computer simulations of biological systems: A brief survey of developments and existing problems

    Science.gov (United States)

    Yi, Sha-Sha; Pan, Cong; Hu, Zhong-Han

    2015-12-01

    Modern computer simulations of biological systems often involve an explicit treatment of the complex interactions among a large number of molecules. While it is straightforward to compute the short-ranged Van der Waals interaction in classical molecular dynamics simulations, it has been a long-lasting issue to develop accurate methods for the long-ranged Coulomb interaction. In this short review, we discuss three types of methodologies for the accurate treatment of electrostatics in simulations of explicit molecules: truncation-type methods, Ewald-type methods, and mean-field-type methods. Throughout the discussion, we briefly describe the formulations and developments of these methods, emphasize the intrinsic connections among the three types of methods, and focus on the existing problems which are often associated with the boundary conditions of electrostatics. This brief survey is summarized with a short perspective on future trends along the method developments and applications in the field of biological simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 91127015 and 21522304) and the Open Project from the State Key Laboratory of Theoretical Physics, and the Innovation Project from the State Key Laboratory of Supramolecular Structure and Materials.
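
    A minimal sketch of the simplest of the three families above, a truncation-type treatment: pairwise Coulomb energy with a spherical cutoff, shifted so the pair potential vanishes continuously at the cutoff. Reduced units and all data are illustrative; this is a generic textbook scheme, not a specific method from the review.

    ```python
    # Sketch of a truncation-type electrostatics treatment: pairwise Coulomb
    # energy with a spherical cutoff, shifted so the potential is continuous
    # at the cutoff radius. Illustrative only (reduced units, invented data);
    # the review covers more sophisticated truncation, Ewald and mean-field
    # methods than this.
    import numpy as np

    def shifted_coulomb_energy(pos, q, r_cut):
        """Total pairwise energy with shifted-potential truncation.

        pos : (N, 3) particle positions, q : (N,) charges, r_cut : cutoff.
        Pair potential: u(r) = q_i q_j (1/r - 1/r_cut) for r < r_cut, else 0.
        """
        energy = 0.0
        n = len(q)
        for i in range(n - 1):
            rij = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
            mask = rij < r_cut
            energy += np.sum(q[i] * q[i + 1:][mask] * (1.0 / rij[mask] - 1.0 / r_cut))
        return energy

    rng = np.random.default_rng(1)
    pos = rng.uniform(0.0, 10.0, (50, 3))    # 50 particles in a 10x10x10 box
    q = rng.choice([-1.0, 1.0], size=50)     # neutral-ish system of unit charges
    print("shifted-truncation energy:", shifted_coulomb_energy(pos, q, r_cut=3.0))
    ```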

  2. ReRep: Computational detection of repetitive sequences in genome survey sequences (GSS)

    Directory of Open Access Journals (Sweden)

    Alves-Ferreira Marcelo

    2008-09-01

    Full Text Available Abstract Background Genome survey sequences (GSS) offer a preliminary global view of a genome since, unlike ESTs, they cover coding as well as non-coding DNA and include repetitive regions of the genome. A more precise estimation of the nature, quantity and variability of repetitive sequences very early in a genome sequencing project is of considerable importance, as such data strongly influence the estimation of genome coverage, library quality and progress in scaffold construction. Also, the elimination of repetitive sequences from the initial assembly process is important to avoid errors and unnecessary complexity. Repetitive sequences are also of interest in a variety of other studies, for instance as molecular markers. Results We designed and implemented a straightforward pipeline called ReRep, which combines bioinformatics tools for identifying repetitive structures in a GSS dataset. In a case study, we first applied the pipeline to a set of 970 GSSs, sequenced in our laboratory from the human pathogen Leishmania braziliensis, the causative agent of leishmaniasis, an important public health problem in Brazil. We also verified the applicability of ReRep to new sequencing technologies using a set of 454 reads from Escherichia coli. The behaviour of several parameters in the algorithm is evaluated and suggestions are made for tuning the analysis. Conclusion The ReRep approach for identification of repetitive elements in GSS datasets proved to be straightforward and efficient. Several potential repetitive sequences were found in a L. braziliensis GSS dataset generated in our laboratory, and further validated by the analysis of a more complete genomic dataset from the EMBL and Sanger Centre databases. ReRep also identified most of the E. coli K12 repeats prior to assembly in an example dataset obtained by automated sequencing using 454 technology. The parameters controlling the algorithm behaved consistently and may be tuned to the properties…
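
    A toy sketch of the underlying idea of repeat detection in sequence reads, flagging sequence whose k-mers are over-represented across the dataset. ReRep itself is a pipeline combining existing bioinformatics tools; the k-mer heuristic, reads and thresholds below are invented purely for illustration.

    ```python
    # Toy sketch of repeat detection in reads: flag reads whose k-mers are
    # over-represented across the whole dataset. ReRep is a pipeline of
    # existing bioinformatics tools; this k-mer heuristic only illustrates
    # the underlying idea, with invented reads and thresholds.
    from collections import Counter

    K = 8                 # k-mer length (illustrative)
    DEPTH_FACTOR = 5      # "repetitive" = k-mer seen 5x more than the median

    reads = [
        "ACGTACGTACGTACGTACGTACGTACGT",   # tandem repeat
        "TTGACCGTAAGGCTTACCGGATCCGATA",   # unique-ish sequence
        "ACGTACGTACGTTTGACCGTAAGGCTTA",   # half repeat, half unique
    ]

    def kmers(seq, k=K):
        return [seq[i:i + k] for i in range(len(seq) - k + 1)]

    # Global k-mer frequency table over all reads.
    table = Counter(km for r in reads for km in kmers(r))
    median = sorted(table.values())[len(table) // 2]
    threshold = DEPTH_FACTOR * max(median, 1)

    def repeat_fraction(read):
        """Fraction of a read's k-mers that look repetitive dataset-wide."""
        kms = kmers(read)
        return sum(table[km] >= threshold for km in kms) / len(kms)

    for r in reads:
        print(f"{r[:20]}...  repeat fraction = {repeat_fraction(r):.2f}")
    ```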

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  9. Proximity of premolar roots to maxillary sinus: a radiographic survey using cone-beam computed tomography.

    Science.gov (United States)

    von Arx, Thomas; Fodich, Ivo; Bornstein, Michael M

    2014-10-01

    The proximity of the roots of the posterior maxillary teeth to the maxillary sinus is a constant challenge to the dental practitioner. Because the majority of studies have assessed the relationship regarding molars, the present study focused on premolars. Cone-beam computed tomographic images of 192 patients were reconstructed in sagittal, coronal, and axial planes to quantify the distances between the root apices of the maxillary premolars and the adjacent maxillary sinus. Measurements were taken for each root, and data were correlated with age, sex, side, and presence of both or absence of 1 of the 2 premolars. A total of 296 teeth (177 first and 119 second premolars) were evaluated. The mean distances from buccal roots of the first premolars to the border of the maxillary sinus in the sagittal, coronal, and axial planes ranged from 5.15 ± 2.99 to 8.28 ± 6.27 mm. From palatal roots, the mean distances ranged from 4.20 ± 3.69 to 7.17 ± 6.14 mm. The mean distances for second premolars were markedly shorter, ranging from 2.32 ± 2.19 to 3.28 ± 3.17 mm for buccal roots and from 2.68 ± 3.58 to 3.80 ± 3.71 mm for palatal roots. The frequency of a premolar root protruding into the maxillary sinus was very low in first premolars (0%-7.2%) but higher in second premolars (2.5%-13.6%). Sex, age, side, and presence/absence of premolars failed to significantly influence the mean distances between premolar roots and the maxillary sinus. Based on the calculated mean distances of the present study, only a few premolars (and, if so, second premolars) would present a risk of violating the border of the maxillary sinus during conventional or surgical endodontic treatment or in case of tooth extraction. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    Science.gov (United States)

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  11. Text files of the navigation logged with HYPACK Software during survey 2010-004-FA conducted in Buzzards Bay and Vineyard Sound by the U.S. Geological Survey offshore of Massachusetts in 2010.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  12. Text files of the navigation logged with HYPACK Software during survey 2009-002-FA conducted in Buzzards Bay and Vineyard Sound by the U.S. Geological Survey offshore of Massachusetts in 2009.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  13. Text files of the navigation logged with Microsoft HyperTerminal during sampling survey 07003 conducted aboard the R/V Connecticut by the U.S. Geological Survey offshore of Massachusetts between Duxbury and Hull (DH_SAMPLING_NAV)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS), Coastal...

  14. Text files of the navigation logged with HYPACK Software during survey 2011-004-FA conducted in Buzzards Bay and Vineyard Sound by the U.S. Geological Survey offshore of Massachusetts in 2011.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  15. Text files of the navigation logged with HYPACK Software during surveys 07002, and 08002 conducted by the U.S. Geological Survey offshore of Massachusetts within northern Cape Cod Bay (CCB_Hypack_Nav)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS), Coastal...

  18. Text files of the navigation logged with HYPACK Software during surveys 06012 and 07001 conducted by the U.S. Geological Survey offshore of Massachusetts between Duxbury and Hull (DH_HYPACK_NAV)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement with the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS), Coastal...

  19. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using spectrograph analysis. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to examine how the spectrograph of an audio file changes after alteration, to compare the altered spectrographs with those of the original files, and to check for similarities and differences between the MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV file by cut-copy, the file was opened in Audacity and a different audio segment was pasted into it; this new file was then analyzed to view the differences. Noise reduction was applied by adjusting the necessary parameters, and the differences between the new file and the original were analyzed. After the necessary changes were made via the dialog box, the edited audio file was opened in the software Spek, which produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was then compared with the graph of the edited file to identify the alterations.
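
    The comparison workflow the study describes can be reproduced with standard signal-processing libraries. Below is a minimal sketch assuming SciPy and NumPy and restricted to WAV input (MP3 requires a separate decoder); the file names are hypothetical.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def spectrogram_db(path):
    """Return the spectrogram of a WAV file in decibels."""
    rate, data = wavfile.read(path)
    if data.ndim > 1:                  # mix stereo down to mono
        data = data.mean(axis=1)
    f, t, sxx = spectrogram(data, fs=rate, nperseg=1024)
    return f, t, 10 * np.log10(sxx + 1e-12)

# Hypothetical file names; compare time-frequency energy of the two versions.
f, t1, orig = spectrogram_db("original.wav")
_, t2, edited = spectrogram_db("edited.wav")

n = min(orig.shape[1], edited.shape[1])    # align on the shorter recording
diff = np.abs(orig[:, :n] - edited[:, :n])
print("mean |difference| (dB):", diff.mean())
print("most-altered time frame index:", diff.mean(axis=0).argmax())
```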

  20. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, involving all centres and experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  2. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  3. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions The Heavy Ions group has been actively analysing data and preparing for conferences. Operations Office [Figure 6: Transfers from all sites in the last 90 days.] For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: Number of events per month in 2012.] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  5. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. [Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.] [Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.] [Figure 3: The volume of data moved between CMS sites in the last six months.] Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers.

    Science.gov (United States)

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D K; Kumar, Rajesh

    2016-01-01

    A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health facilities was 16.6% less
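
    The study's catastrophic-expenditure definition (healthcare spending above 10% of total annual household expenditure) is straightforward to compute. The pandas sketch below is an illustration with hypothetical column names and toy numbers, not the survey's actual analysis code.

```python
import pandas as pd

# Hypothetical columns: total annual household expenditure and annual
# health spend, both in Indian Rupees.
df = pd.DataFrame({
    "annual_expenditure": [120000, 250000, 90000, 400000],
    "health_expenditure": [20000, 15000, 5000, 30000],
})

# Catastrophic expenditure as defined in the study: more than 10% of
# total annual household expenditure spent on healthcare.
df["catastrophic"] = df["health_expenditure"] > 0.10 * df["annual_expenditure"]

# Expenditure quartiles, as used for the trend comparison in the paper.
df["quartile"] = pd.qcut(df["annual_expenditure"], 4, labels=[1, 2, 3, 4])
print(df.groupby("quartile", observed=True)["catastrophic"].mean())
```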

  8. Design and Implementation of Log Structured FAT and ExFAT File Systems

    Directory of Open Access Journals (Sweden)

    Keshava Munegowda

    2014-08-01

    Full Text Available The File Allocation Table (FAT) file system is supported by multiple Operating Systems (OS). Hence, the FAT file system is a universal exchange format for files/directories used on Solid State Drives (SSD) and Hard Disk Drives (HDD). Microsoft Corporation introduced a new file system called the Extended FAT file system (ExFAT) to support larger-size storage devices. The ExFAT file system is optimized for use with SSDs. However, both FAT and ExFAT are not power-fail safe. This means that an uncontrolled power loss, or abrupt removal of the storage device from the computer system during a file system update, corrupts the file system metadata and hence leads to loss of data on the storage device. This paper implements Logging and Committing features for the FAT and ExFAT file systems and ensures that the file system metadata stays consistent across abrupt power loss or device removal from the computer system.
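
    The logging-and-committing idea can be sketched generically as write-ahead logging: record the intended metadata change durably before applying it, so a crash between the two steps leaves a replayable log rather than corrupt metadata. The Python sketch below illustrates the concept only; the paper's implementation works on on-disk FAT/ExFAT structures, not JSON files, and the file names here are hypothetical.

```python
import json, os

LOG = "metadata.log"            # hypothetical log file
META = "metadata.json"          # stands in for the on-disk metadata region

def update_metadata(changes):
    """Log-then-commit: a crash at any point leaves replayable state."""
    # 1. Durably record the intent before touching the metadata itself.
    with open(LOG, "a") as log:
        log.write(json.dumps(changes) + "\n")
        log.flush()
        os.fsync(log.fileno())
    # 2. Apply the change to a shadow copy of the metadata.
    meta = {}
    if os.path.exists(META):
        with open(META) as f:
            meta = json.load(f)
    meta.update(changes)
    tmp = META + ".tmp"
    with open(tmp, "w") as out:
        json.dump(meta, out)
        out.flush()
        os.fsync(out.fileno())
    # 3. Atomic rename is the commit point; then the log can be cleared.
    os.replace(tmp, META)
    open(LOG, "w").close()

update_metadata({"/docs/report.txt": {"start_cluster": 42, "size": 1337}})
```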

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation, with its components, is now deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  10. Adolescent Fertility: National File [Machine-Readable Data File].

    Science.gov (United States)

    Moore, Kristin A.; And Others

    This computer file contains recent cross sectional data on adolescent fertility in the United States for 1960, 1965, 1970, 1975 and 1980-85. The following variables are included: (1) births; (2) birth rates; (3) abortions; (4) non-marital childbearing; (5) infant mortality; and (6) low birth weight. Data for both teenagers and women aged 20-24 are…

  11. Technology Policy Survey: A Study of State Policies Supporting the Use of Calculators and Computers in the Study of Precollege Mathematics.

    Science.gov (United States)

    Kansky, Bob

    The Technology Advisory Committee of the National Council of Teachers of Mathematics recently conducted a survey to assess the status of state-level policies affecting the use of calculators and computers in the teaching of mathematics in grades K-12. The committee determined that state-level actions related to the increased availability of…

  12. Prevention of supine hypotensive syndrome in pregnant women undergoing computed tomography - A national survey of current practice

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, Michelle A.; Fenwick, Alison [Department of Diagnostic Imaging, Queen's Medical Centre Campus, Nottingham University Hospitals NHS Trust, Derby Road, Nottingham, NG7 2UH (United Kingdom); Banks, Amelia [Department of Anaesthesia, City Hospital Campus, Nottingham University Hospitals NHS Trust, Hucknall Road, Nottingham, NG5 1PB (United Kingdom); Dineen, Robert A. [Department of Diagnostic Imaging, Queen's Medical Centre Campus, Nottingham University Hospitals NHS Trust, Derby Road, Nottingham, NG7 2UH (United Kingdom)], E-mail: Robert.dineen@nhs.net

    2009-05-15

    Aim: Supine hypotensive syndrome (SHS) can occur in women in the second half of pregnancy due to compression of the aorta and inferior vena cava by the gravid uterus. This results in a decrease in cardiac output with effects ranging from transient asymptomatic hypotension to cardiovascular collapse. SHS can be easily avoided by left lateral tilt positioning. We undertook a nationwide survey to assess the awareness amongst senior computed tomography (CT) radiographers of the potential risk of SHS in women in this patient group, and to identify the extent to which preventative practices and protocols are in place. Methods and materials: A questionnaire was sent to superintendent CT radiographers at all acute NHS Trusts in England and Wales examining awareness of the risk of SHS and the preventative practices and protocols currently used. Results: Completed questionnaires were received from 64% of institutions. Of respondents who scan women in this patient group, only 44% were aware of the risk of SHS. No institution had a written protocol specifying positioning of women in this patient group. Seventy-five percent of institutions never employed oblique positioning. Eighty-five percent felt that specific guidelines from the Society of Radiographers or Royal College of Radiologists would be helpful. Conclusion: Current awareness and practices for preventing this easily avoidable but potentially harmful condition are inadequate. Central guidance would be welcomed by a large majority of respondents.

  13. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    Science.gov (United States)

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from (1) digitized suspended-sediment-concentration traces, (2) linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and (3) nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
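
    At its core, the discharge computation rests on a standard USGS relation: suspended-sediment discharge in tons per day equals 0.0027 times streamflow (ft³/s) times concentration (mg/L), where 0.0027 is the unit-conversion constant. A one-function sketch of that relation (ours, not SEDCALC's code):

```python
def suspended_sediment_discharge(q_cfs, c_mg_per_l):
    """Daily suspended-sediment discharge in tons/day.

    Standard USGS relation: Qs = 0.0027 * Q * C, where the coefficient
    0.0027 converts (ft^3/s) * (mg/L) to tons/day.
    """
    return 0.0027 * q_cfs * c_mg_per_l

# e.g. 1,500 ft^3/s at 230 mg/L -> about 932 tons/day
print(suspended_sediment_discharge(1500, 230))
```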

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission, and to keep exercised, all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  15. Common File Formats.

    Science.gov (United States)

    Mills, Lauren

    2014-03-21

    An overview of the many file formats commonly used in bioinformatics and genome sequence analysis is presented, including various data file formats, alignment file formats, and annotation file formats. Example workflows illustrate how some of the different file types are typically used.

  16. [Digital library for archiving files of radiology and medical imaging].

    Science.gov (United States)

    Duvauferrier, R; Rambeau, M; Moulène, F

    1993-01-01

    The Conseil des Enseignants de Radiologie de France, in collaboration with the Ilab-TSI company and Schering laboratories, has developed a computer programme allowing the storage and consultation of radiological teaching files. This programme, developed on Macintosh from standard Hypercard and Quicktime applications, allows, in consultation mode, the multicriteria search and visualisation of selected radiological files. In the author mode, new files can be included after digitising the author's own images or after obtaining images from another image library. This programme, which allows juxtaposition of digitised radiological files, is designed to be extremely open and can easily be combined with other computer-assisted teaching or computer-assisted presentation applications.

  17. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  18. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  19. Parallel file system with metadata distributed across partitioned key-value store c

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
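
    The partitioning idea can be illustrated independently of MPI and MDHIM: hash each metadata key to an owning partition, and route puts and gets to that partition. The sketch below is a single-process toy with a list of dicts standing in for per-node stores; it is not the patented system's API.

```python
import hashlib

class PartitionedMetadataStore:
    """Toy partitioned key-value store for sub-file metadata.

    Each compute node owns one partition; keys (e.g. logical offsets of
    sub-files within the shared file) are hashed to an owning node. In a
    real system such as MDHIM, the cross-node put/get would travel over
    a message passing interface rather than a local list of dicts.
    """
    def __init__(self, n_nodes):
        self.partitions = [{} for _ in range(n_nodes)]

    def _owner(self, key):
        digest = hashlib.md5(str(key).encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def put(self, key, value):
        self.partitions[self._owner(key)][key] = value

    def get(self, key):
        return self.partitions[self._owner(key)].get(key)

store = PartitionedMetadataStore(n_nodes=4)
store.put(("shared.out", 0), {"node": 12, "len": 65536})   # sub-file record
print(store.get(("shared.out", 0)))
```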

  20. Aspects on Transfer of Aided-Design Files

    Science.gov (United States)

    Goanta, A. M.; Anghelache, D. G.

    2016-08-01

    At this stage of development of hardware and software, each company that makes design software packages has created and customized over time its own file type, to distinguish itself from its competitors. Thus the DWG files of AutoCAD, the IPT/IAM files of Inventor, the PAR/ASM files of Solid Edge, the PRT files of NX and so on are widely known today. Behind every file type there is a mathematical model that is common to several file types. A specific aspect of computer-aided design is that all packages work with both individual parts and assemblies, but their approaches differ: some use the same file type for each part and for the assembly (PRT), while others use different file types (IPT/IAM, PAR/ASM, etc.). Another aspect of computer-aided design is the transfer of files between companies that use different software packages, or even the same package in different versions. Each of these situations raises distinct issues. Thus, to allow a project to be at least partially read by software other than its native package, transfer files of the STEP and IGES type are used

  1. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  2. 43 CFR 4.1352 - Who may file; where to file; when to file.

    Science.gov (United States)

    2010-10-01

    43 Public Lands: Interior (2010-10-01) ... Indian Lands) § 4.1352 Who may file; where to file; when to file. (a) The applicant or operator may file... Failure to file a timely request constitutes a waiver of the opportunity for a hearing before OSM makes...

  3. Text files of the navigation logged with HYPACK Software during survey 2015-001-FA conducted along the Delmarva Peninsula, MD and VA by the U.S. Geological Survey in 2015.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Delmarva Peninsula is a 220-kilometer-long headland, spit, and barrier island complex that was significantly affected by Hurricane Sandy in the fall of 2012. The...

  4. Text files of the navigation logged with HYPACK Software during survey 2014-002-FA conducted along the Delmarva Peninsula, MD and VA by the U.S. Geological Survey in 2014.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Delmarva Peninsula is a 220-kilometer-long headland, spit, and barrier island complex that was significantly affected by Hurricane Sandy. A U.S. Geological...

  6. Fact File

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Are delegates selected according to predetermined quotas of ethnicity and gender? The delegates who attend the CPC National Congress represent a broad spectrum of Party members. They include leading officials at various levels, and rank-and-file Party members working at the front line of production and those from more regular walks of life. A large proportion of the delegates are model Party members who have made outstanding contributions in various sectors and undertakings of the economy, science and technology, national defense, politics and law, education, public relations, public health, culture and sports.

  7. The Use of General Practice Computer Systems for Data Handling and Clinical Audit - A Survey of General Practices in Leicestershire

    Directory of Open Access Journals (Sweden)

    Farooqi A

    1998-11-01

    Conclusion: Despite considerable investment in GP computer systems there is evidence of both under-utilisation and inefficient use. Most practices identified a number of training needs. This suggests that lack of training is a barrier to the effective use of computers. Health authorities and general practices need urgently to develop strategies to improve computer skills.

  8. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    Science.gov (United States)

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  9. Text Classification: Classifying Plain Source Files with Neural Network

    Directory of Open Access Journals (Sweden)

    Jaromir Veber

    2010-10-01

    Full Text Available Automated text file categorization has an important place in computer engineering, particularly in the process called data management automation. A lot has been written about text classification, and the methods allowing the classification of these files are well known. Unfortunately, most studies are theoretical, and more research is needed for practical implementation. I decided to contribute with research focused on creating a classifier for different kinds of programs (source files, scripts…). This paper describes a practical implementation of a classifier for text files based on file content.
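
    A present-day minimal version of such a source-file classifier can be built from character n-gram features feeding a small neural network. The scikit-learn sketch below is an illustration under those assumptions, not the paper's implementation; the corpus and labels are toy data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy training corpus: raw file contents labelled by language.
contents = [
    "#include <stdio.h>\nint main(void) { return 0; }",
    "def main():\n    print('hi')\n",
    "#!/bin/sh\necho hi\n",
    "int add(int a, int b) { return a + b; }",
    "import os\nprint(os.getcwd())\n",
    "#!/bin/bash\nls -l | wc -l\n",
]
labels = ["c", "python", "shell", "c", "python", "shell"]

# Character n-grams capture punctuation and keywords typical of each language.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(1, 3)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
clf.fit(contents, labels)
print(clf.predict(["for f in *.txt; do cat $f; done"]))   # expected: ['shell']
```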

  10. National Survey on Drug Use and Health: 2-Year R-DAS (NSDUH-2002-2003)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file includes data from the 2002 through 2003 National Survey on Drug Use and Health (NSDUH) survey. The only variables included in the data file are ones that...

  11. National Survey on Drug Use and Health: 4-Year R-DAS (NSDUH-2002-2005)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file includes data from the 2002 through 2005 National Survey on Drug Use and Health (NSDUH) survey. The only variables included in the data file are ones that...

  12. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  13. Peak data for U.S. Geological Survey gaging stations, Texas network and computer program to estimate peak-streamflow frequency

    Science.gov (United States)

    Slade, R.M.; Asquith, W.H.

    1996-01-01

    About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin and the nature or cause of the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflows. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Instructions for installing the program are presented, along with an example and a discussion of its options.
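
    The cited guidelines (Bulletin 17B of the Interagency Advisory Committee on Water Data) are based on fitting a log-Pearson Type III distribution to the annual peaks. The sketch below shows the bare statistical core in Python/SciPy, omitting the skew weighting, outlier tests, and historical-peak adjustments that the full procedure (and the USGS program) applies; the peak values are invented.

```python
import numpy as np
from scipy import stats

def peak_flow_quantiles(annual_peaks_cfs, recurrence_years=(2, 5, 10, 25, 50, 100)):
    """Bulletin-17B-style log-Pearson Type III peak-flow estimates.

    Bare-bones sketch: sample statistics of log10(peaks) only, with no
    regional skew weighting or outlier/historical adjustments.
    """
    logs = np.log10(np.asarray(annual_peaks_cfs, dtype=float))
    mean, std = logs.mean(), logs.std(ddof=1)
    skew = stats.skew(logs, bias=False)
    return {
        t: 10 ** stats.pearson3.ppf(1 - 1 / t, skew, loc=mean, scale=std)
        for t in recurrence_years
    }

peaks = [4200, 8900, 3100, 15200, 6700, 9800, 2400, 12100, 5600, 7800]
for t, q in peak_flow_quantiles(peaks).items():
    print(f"{t:>3}-yr flood: {q:,.0f} ft^3/s")
```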

  14. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  15. Tables of file names, times, and locations of images collected during unmanned aerial systems (UAS) flights over Coast Guard Beach, Nauset Spit, Nauset Inlet, and Nauset Marsh, Cape Cod National Seashore, Eastham, Massachusetts on 1 March 2016 (text files)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These text files contain tables of the file names, times, and locations of images obtained from an unmanned aerial systems (UAS) flown in the Cape Cod National...

  17. ShoreZone Survey Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a point file showing GPS trackline data collected during a ShoreZone aerial imaging survey. This flight trackline is recorded at 1-second intervals...

  18. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2013-06-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the methodologies list and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim and scope is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. © 2013 Elsevier Inc.
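
    Among the computational methods such surveys cover, the simplest target-prediction idea is seed matching: searching a 3' UTR for the reverse complement of miRNA bases 2-8. A minimal sketch follows (real predictors add conservation, site context, and thermodynamic scoring; the UTR string here is invented, while the miRNA is the let-7 sequence used as an example):

```python
def seed_sites(mirna, utr):
    """Positions in a 3' UTR matching the miRNA 7-mer seed (bases 2-8).

    Minimal seed-match rule only: find the reverse complement of the
    seed region as an exact 7-nucleotide match in the UTR.
    """
    complement = {"A": "U", "U": "A", "G": "C", "C": "G"}
    seed = mirna[1:8]                                   # bases 2-8, 5'->3'
    site = "".join(complement[b] for b in reversed(seed))
    return [i for i in range(len(utr) - 6) if utr[i:i + 7] == site]

# let-7 seed region as an example; the UTR string is made up.
print(seed_sites("UGAGGUAGUAGGUUGUAUAGUU", "AAACUACCUCAGGGAAACUACCUCA"))
# -> [3, 17]
```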

  19. Does modality of survey administration impact data quality: audio computer assisted self interview (ACASI) versus self-administered pen and paper?

    Directory of Open Access Journals (Sweden)

    William M Reichmann

    Full Text Available BACKGROUND: In the context of a randomized controlled trial (RCT) on HIV testing in the emergency department (ED) setting, we evaluated preferences for survey modality and data quality arising from each modality. METHODS: Enrolled participants were offered the choice of answering a survey via audio computer assisted self-interview (ACASI) or pen and paper self-administered questionnaire (SAQ). We evaluated factors influencing choice of survey modality. We defined unusable data for a particular survey domain as answering fewer than 75% of the questions in the domain. We then compared ACASI and SAQ with respect to unusable data for domains that address sensitive topics. RESULTS: Of 758 enrolled ED patients, 218 (29%) chose ACASI, 343 (45%) chose SAQ and 197 (26%) opted not to complete either. Results of the log-binomial regression indicated that older (RR = 1.08 per decade) and less educated participants (RR = 1.25) were more likely to choose SAQ over ACASI. ACASI yielded substantially less unusable data than SAQ. CONCLUSIONS: In the ED setting there may be a tradeoff between increased participation with SAQ versus better data quality with ACASI. Future studies of novel approaches to maximize the use of ACASI in the ED setting are needed.

  20. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the special-purpose files of JENDL. Problems in using the ENSDF file as the primary source data for the JENDL decay data file are presented. (author)

  1. ERF1 -- Enhanced River Reach File 1.2

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — U.S. Environmental Protection Agency's River Reach File 1 (RF1) to ensure the hydrologic integrity of the digital reach traces and to quantify the mean water time of...

  2. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    29 Labor (2010-07-01). § 4000.28 What if I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission...

  3. Survey of new vector computers: The CRAY 1S from CRAY research; the CYBER 205 from CDC and the parallel computer from ICL - architecture and programming

    Science.gov (United States)

    Gentzsch, W.

    1982-01-01

    Problems which can arise with vector and parallel computers are discussed in a user-oriented context. Emphasis is placed on the algorithms used and the programming techniques adopted. Three recently developed supercomputers are examined and typical application examples are given in CRAY FORTRAN, CYBER 205 FORTRAN and DAP (distributed array processor) FORTRAN. The systems' performance is compared. The addition of parts of two N x N arrays is considered. The influence of the architecture on the algorithms and programming language is demonstrated. Numerical analysis of magnetohydrodynamic differential equations by an explicit difference method is illustrated, showing very good results for all three systems. The prognosis for supercomputer development is assessed.
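
    The array-addition kernel used to compare the machines has a direct modern analogue: the contrast between an element-by-element scalar loop and a single whole-array (vector) operation. A NumPy sketch of that contrast, not tied to any of the surveyed machines:

```python
import time
import numpy as np

n = 1000
a, b = np.random.rand(n, n), np.random.rand(n, n)

# Scalar-style loop, element by element (what vector hardware avoids).
t0 = time.perf_counter()
c_loop = np.empty_like(a)
for i in range(n):
    for j in range(n):
        c_loop[i, j] = a[i, j] + b[i, j]
t_loop = time.perf_counter() - t0

# Whole-array (vector) formulation: one SIMD-friendly operation.
t0 = time.perf_counter()
c_vec = a + b
t_vec = time.perf_counter() - t0

assert np.allclose(c_loop, c_vec)
print(f"loop: {t_loop:.2f}s  vector: {t_vec:.4f}s  speed-up: {t_loop / t_vec:.0f}x")
```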

  4. ActSds and OdfSds: Programs for Converting INTERACT and The Observer Data Files into SDIS Timed-Event Sequential Data Files

    OpenAIRE

    Bakeman, Roger; Quera, Vicenç

    2008-01-01

    Programs for converting Mangold International’s INTERACT and Noldus Information Technology’s The Observer data files to Sequential Data Interchange Standard (SDIS) timed-event sequential data files are described. Users who convert their INTERACT or The Observer data files can then take advantage of various flexible and powerful data modification and computational procedures available in the Generalized Sequential Querier (GSEQ), a program that assumes SDIS-formatted files.

  5. ActSds and OdfSds: programs for converting INTERACT and The Observer data files into SDIS timed-event sequential data files.

    Science.gov (United States)

    Bakeman, Roger; Quera, Vicenç

    2008-08-01

    In this article, we describe programs for converting Mangold International's INTERACT and Noldus Information Technology's The Observer data files to sequential data interchange standard (SDIS) timed-event sequential data files. Users who convert their INTERACT or The Observer data files can then take advantage of various flexible and powerful data modification and computational procedures available in the Generalized Sequential Querier, a program that assumes SDIS-formatted files.

  6. Social presence reinforcement and computer-mediated communication: the effect of the solicitor's photography on compliance to a survey request made by e-mail.

    Science.gov (United States)

    Guéguen, Nicolas; Jacob, Céline

    2002-04-01

    Personal information is scarce in computer-mediated communication, so when information about the sender is attached to an e-mail, it could induce a positive feeling toward the sender. An experiment was carried out in which a male and a female student-solicitor requested, by e-mail, that a student-subject participate in a survey. In half of the cases, a digital photograph of the solicitor appeared at the end of the e-mail. Results show that subjects agreed more readily to the request in the experimental condition than in the control condition, where no digital photograph was sent with the e-mail. The importance of social information in computer-mediated communication is used to explain these results.

  7. Survey of computed tomography doses in head and chest protocols; Levantamento de doses em tomografia computadorizada em protocolos de cranio e torax

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Giordana Salvi de; Silva, Ana Maria Marques da, E-mail: giordana.souza@acad.pucrs.br [Pontificia Universidade Catolica do Rio Grande do Sul (PUC-RS), Porto Alegre, RS (Brazil). Faculdade de Fisica. Nucleo de Pesquisa em Imagens Medicas

    2016-07-01

    Computed tomography is a clinical tool for patient diagnosis; however, it subjects the patient to a complex dose distribution. The aim of this study was to survey dose indicators in head and chest CT protocols, in terms of Dose-Length Product (DLP) and effective dose, for adult and pediatric patients, comparing them with diagnostic reference levels in the literature. Patients were divided into age groups and the following image acquisition parameters were collected: age, kV, mAs, Volumetric Computed Tomography Dose Index (CTDIvol) and DLP. The effective dose was found by multiplying DLP by correction factors. The results were obtained from the third quartile and showed the importance of determining kV and mAs values for each patient depending on the studied region, age and thickness. (author)
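
    The DLP-to-effective-dose step the authors describe is a simple multiplication by a region-specific conversion factor. The sketch below uses the commonly cited European Commission adult k-factors for head and chest as an assumption; actual factors should be checked against the protocol and patient age group before any clinical interpretation.

```python
# Commonly cited DLP-to-effective-dose conversion factors, in mSv per
# mGy*cm (European Commission adult reference values; assumed here,
# verify against the protocol and age group actually used).
K_FACTORS = {"head": 0.0021, "chest": 0.014}

def effective_dose_msv(dlp_mgy_cm, region):
    """Effective dose estimate: E = k * DLP."""
    return K_FACTORS[region] * dlp_mgy_cm

print(effective_dose_msv(900, "head"))    # -> ~1.9 mSv
print(effective_dose_msv(400, "chest"))   # -> ~5.6 mSv
```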

  8. A Survey of eye discomfort and headache associated with computer use among dormitory students of Tehran University of Medical Sciences

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: Eye discomfort and headache are prevalent among computer users. The aim of the present study was to determine eye discomfort and headache associated with computer use among dormitory students of Tehran University of Medical Sciences. Material and Method: In this cross-sectional study, data were collected using the MIRTH (Musculoskeletal Injury Reduction Tools for Health and Safety) questionnaire. The results were analyzed using SPSS software with descriptive statistical indexes as well as correlation tests. Result: Of the 744 distributed questionnaires, 631 students completed the questionnaire (84.4%). The mean age of the studied population was 22.72 ± 3.6 years. The relative frequency of eye discomfort for female and male students was 76.38% and 70.11%, respectively. The correlation test showed a significant relationship between eye discomfort as well as headache and gender, daily hours of computer work, and use of glasses (P-value < 0.05). Also, the correlation between eye discomfort and headache was statistically significant (P-value < 0.01). Conclusion: Eye discomfort and headache related to computer use are prevalent among students. The condition is more prevalent among females and prolonged computer users. Interventional and training programs should be considered to prevent and reduce related problems.

  9. National Survey on Drug Use and Health: 10-Year Substate R-DAS (NSDUH-2002-2011)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file includes data from the 2002 through 2011 National Survey on Drug Use and Health (NSDUH) survey. The only variables included in the data file are ones that...

  10. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  11. Standard interface file handbook

    Energy Technology Data Exchange (ETDEWEB)

    Shapiro, A.; Huria, H.C. (Cincinnati Univ., OH (United States))

    1992-10-01

    This handbook documents many of the standard interface file formats that have been adopted by the US Department of Energy to facilitate communication between, and portability of, various large reactor physics and radiation transport software packages. The emphasis is on those files needed for use of the VENTURE/PC diffusion-depletion code system. File structures, contents and some practical advice on use of the various files are provided.

  12. 43 CFR 4.1381 - Who may file; when to file; where to file.

    Science.gov (United States)

    2010-10-01

    (a) Any person who receives a written decision issued by OSM under 30 CFR 773.28 on a challenge to an ownership or control listing or finding may file a request for...

  13. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    Science.gov (United States)

    Schreiner, Steffen; Bagnasco, Stefano; Sankar Banerjee, Subho; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Zhu, Jianlin

    2011-12-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, are based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration, and presents a more secure and revised design: a new mechanism, called the LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved, as well as a reduction of up to 50% in the size of the credential. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum, and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part of and beyond the development of AliEn version 2.19.
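
    The signed status messages described above can be pictured with a small sketch: the storage system signs the size and checksum of a file it holds, and the File Catalogue verifies the signature before trusting the reported values. The field names, the JSON layout, and the shared-secret HMAC below are illustrative assumptions only, not the actual AliEn 2.19 protocol.

        import hashlib, hmac, json

        SHARED_KEY = b"demo-key"  # stand-in for a real storage-element credential

        def sign_status(lfn, size, checksum, key=SHARED_KEY):
            """Storage side: sign the reported file size and checksum."""
            payload = json.dumps({"lfn": lfn, "size": size, "md5": checksum},
                                 sort_keys=True).encode()
            tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
            return payload, tag

        def verify_status(payload, tag, key=SHARED_KEY):
            """Catalogue side: trust size/checksum only if the signature verifies."""
            expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, tag)

        payload, tag = sign_status("/alice/data/run1234/file.root", 104857600,
                                   "9e107d9d372bb6826bd81d3542a419d6")
        assert verify_status(payload, tag)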

  14. Protecting Your Computer from Viruses

    Science.gov (United States)

    Descy, Don E.

    2006-01-01

    A computer virus is defined as a software program capable of reproducing itself and usually capable of causing great harm to files or other programs on the same computer. The existence of computer viruses--or the necessity of avoiding viruses--is part of using a computer. With the advent of the Internet, the door was opened wide for these…

  15. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes the fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched, and the documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  16. Inter-method reliability of paper surveys and computer assisted telephone interviews in a randomized controlled trial of yoga for low back pain.

    Science.gov (United States)

    Cerrada, Christian J; Weinberg, Janice; Sherman, Karen J; Saper, Robert B

    2014-04-09

    Little is known about the reliability of different methods of survey administration in low back pain trials. This analysis was designed to determine the reliability of responses to self-administered paper surveys compared to computer-assisted telephone interviews (CATI) for the primary outcomes of pain intensity and back-related function, and the secondary outcomes of patient satisfaction, SF-36, and global improvement, among participants enrolled in a study of yoga for chronic low back pain. Pain intensity, back-related function, and both physical and mental health components of the SF-36 showed excellent reliability at all three time points; ICC scores ranged from 0.82 to 0.98. Pain medication use showed good reliability; kappa statistics ranged from 0.68 to 0.78. Patient satisfaction had moderate to excellent reliability; ICC scores ranged from 0.40 to 0.86. Global improvement showed poor reliability at 6 weeks (ICC = 0.24) and 12 weeks (ICC = 0.10). CATI shows excellent reliability for primary outcomes and at least some secondary outcomes when compared to self-administered paper surveys in a low back pain yoga trial. Having two reliable options for data collection may be helpful to increase response rates for core outcomes in back pain trials. ClinicalTrials.gov: NCT01761617. Date of trial registration: December 4, 2012.
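
    The ICC figures above can be reproduced from raw paired scores. A minimal sketch, assuming the two-way random-effects, absolute-agreement, single-measures model (ICC(2,1)) that is a common choice for comparing two administration modes; the demonstration data are invented:

        import numpy as np

        def icc_2_1(x):
            """ICC(2,1): two-way random effects, absolute agreement, single measures.
            x is an (n subjects) x (k methods) array of scores."""
            n, k = x.shape
            grand = x.mean()
            ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
            ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # methods
            ss_total = ((x - grand) ** 2).sum()
            ms_e = (ss_total - (n - 1) * ms_r - (k - 1) * ms_c) / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        paper = np.array([6.0, 4.0, 7.0, 3.0, 5.0])   # invented pain scores
        cati = np.array([5.0, 4.0, 6.0, 3.0, 5.0])
        print(round(icc_2_1(np.column_stack([paper, cati])), 2))   # ~0.9, "excellent"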

  17. Efficient load rebalancing for distributed file system in Clouds

    Directory of Open Access Journals (Sweden)

    Mr. Mohan S. Deshmukh

    2016-05-01

    Full Text Available Cloud computing is an emerging era in the software industry and a vast, rapidly developing technology. Distributed file systems play an important role in cloud computing applications based on MapReduce techniques. In distributed file systems for cloud computing, nodes serve computing and storage functions at the same time, and a file is divided into small chunks so that MapReduce tasks can process them in parallel. However, in a cloud, nodes may be added, deleted or replaced at any time, and operations on files may be performed dynamically. This causes unequal distribution of load among the nodes, which leads to a load-imbalance problem in the distributed file system. Newly developed distributed file systems mostly depend upon a central node for load distribution, but this method is not helpful at large scale or where the chances of failure are high. Using a central node for load distribution creates a single point of dependency and increases the chances of a performance bottleneck. Issues such as the movement cost and network traffic caused by migration of nodes and file chunks also need to be resolved. We therefore propose an algorithm which overcomes these problems and achieves uniform load distribution efficiently. To verify the feasibility and efficiency of our algorithm, we use a simulation setup and compare our algorithm with existing techniques on factors such as load imbalance factor, movement cost and network traffic.
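
    The core decision in any such scheme, which chunks to move from overloaded to underloaded nodes and at what movement cost, can be sketched in a few lines. The greedy, centralized rule below is a generic illustration of load rebalancing, not the authors' distributed algorithm:

        def rebalance(chunks_per_node, chunk_size_mb=64):
            """Greedy rebalance: move chunks from overloaded to underloaded nodes.
            chunks_per_node maps node -> list of chunk ids; returns a migration plan."""
            loads = {n: len(c) for n, c in chunks_per_node.items()}
            avg = sum(loads.values()) / len(loads)
            over = [n for n, l in loads.items() if l > avg]
            under = [n for n, l in loads.items() if l < avg]
            plan, cost = [], 0
            for src in over:
                for dst in under:
                    while loads[src] - 1 >= avg and loads[dst] + 1 <= avg:
                        chunk = chunks_per_node[src].pop()
                        plan.append((chunk, src, dst))
                        loads[src] -= 1
                        loads[dst] += 1
                        cost += chunk_size_mb   # movement cost (assumed chunk size)
            return plan, cost

        plan, cost = rebalance({"n1": list(range(10)), "n2": [10], "n3": []})
        print(len(plan), "migrations,", cost, "MB moved")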

  18. Survey of CPU/GPU Synergetic Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    卢风顺; 宋君强; 银福康; 张理论

    2011-01-01

    With their tremendous computing capability, high performance/price ratio and low power consumption, heterogeneous hybrid CPU/GPU parallel systems have become the new high-performance computing platforms. However, the architectural complexity of these hybrid systems poses great challenges for parallel algorithm design on such infrastructure. CPU/GPU synergetic parallel computing is an emerging and open research area. According to the scale of computational resources involved in the synergetic parallel computing, we classify recent research into three categories, detail the motivations, methodologies and applications of several hybrid computing projects, and discuss on-going research issues in this direction, in the hope that domain scientists can gain useful information about synergetic parallel computing from this work.
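
    The basic scheduling question in CPU/GPU synergetic computing, how to divide a workload so that neither processor idles, can be illustrated with a toy static partitioner. The throughput figures and the proportional-split heuristic are invented for illustration and do not come from the surveyed projects:

        def split_work(n_items, cpu_rate, gpu_rate):
            """Split n_items between CPU and GPU in proportion to their measured
            throughputs (items/s), so both sides finish at about the same time."""
            gpu_share = gpu_rate / (cpu_rate + gpu_rate)
            n_gpu = round(n_items * gpu_share)
            return n_items - n_gpu, n_gpu

        # e.g. CPU at 2e6 items/s, GPU at 14e6 items/s: the GPU gets 7/8 of the work
        n_cpu, n_gpu = split_work(1_000_000, 2e6, 14e6)
        print(n_cpu, n_gpu)   # 125000 875000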

  19. Semantic File Annotation and Retrieval on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Sadaqat Jan

    2011-01-01

    Full Text Available The rapid development of mobile technologies has facilitated users to generate and store files on mobile devices such as mobile phones and PDAs. However, it has become a challenging issue for users to efficiently and effectively search for files of interest in a mobile environment involving a large number of mobile nodes. This paper presents the SemFARM framework, which facilitates users to publish, annotate and retrieve files which are geographically distributed in a mobile network enabled by Bluetooth. The SemFARM framework is built on semantic web technologies in support of file retrieval on low-end mobile devices. A generic ontology is developed which defines a number of keywords, their possible domains and properties. Based on semantic reasoning, similarity degrees are computed to match user queries with published file descriptions. The SemFARM prototype is implemented using the Java mobile platform (J2ME). The performance of SemFARM is evaluated from a number of aspects in comparison with traditional mobile file systems and enhanced alternatives. Experimental results are encouraging, showing the effectiveness of SemFARM in file retrieval. We can conclude that the use of semantic web technologies has facilitated file retrieval in mobile computing environments, maximizing user satisfaction in searching for files of interest.
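
    The similarity-degree computation can be pictured with a small sketch. The toy ontology (keywords with weighted "related term" links) and the scoring rule below are invented for illustration; SemFARM's generic ontology and reasoning rules are richer:

        # Toy ontology: keyword -> {related keyword: relatedness in [0, 1]}
        ONTOLOGY = {
            "music": {"song": 0.9, "audio": 0.8},
            "photo": {"image": 0.9, "picture": 0.9},
        }

        def term_similarity(a, b):
            if a == b:
                return 1.0
            return max(ONTOLOGY.get(a, {}).get(b, 0.0),
                       ONTOLOGY.get(b, {}).get(a, 0.0))

        def similarity_degree(query_terms, annotation_terms):
            """Average best-match similarity of each query term against a file's annotation."""
            scores = [max(term_similarity(q, t) for t in annotation_terms)
                      for q in query_terms]
            return sum(scores) / len(scores)

        files = {"holiday.mp3": ["music", "audio"], "beach.jpg": ["photo", "beach"]}
        best = max(files, key=lambda f: similarity_degree(["song"], files[f]))
        print(best)   # holiday.mp3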

  20. Internet Use for Health-Related Information via Personal Computers and Cell Phones in Japan: A Cross-Sectional Population-Based Survey

    Science.gov (United States)

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro

    2011-01-01

    Background The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. Objectives This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. Methods We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15–79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. Results The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Conclusions Japanese moderately used the Internet via

  1. Effect of survey instrument on participation in a follow-up study: a randomization study of a mailed questionnaire versus a computer-assisted telephone interview

    Directory of Open Access Journals (Sweden)

    Rocheleau Carissa M

    2012-07-01

    Full Text Available Background: Many epidemiological and public health surveys report increasing difficulty obtaining high participation rates. We conducted a pilot follow-up study to determine whether a mailed or telephone survey would better facilitate data collection in a subset of respondents to an earlier telephone survey conducted as part of the National Birth Defects Prevention Study. Methods: We randomly assigned 392 eligible mothers to receive a self-administered, mailed questionnaire (MQ) or a computer-assisted telephone interview (CATI) using similar recruitment protocols. If mothers gave permission to contact the fathers, fathers were recruited to complete the same instrument (MQ or CATI) as mothers. Results: Mothers contacted for the MQ, within all demographic strata examined, were more likely to participate than those contacted for the CATI (86.6% vs. 70.6%). The median response time for mothers completing the MQ was 17 days, compared to 29 days for mothers completing the CATI. Mothers completing the MQ also required fewer reminder calls or letters to finish participation than those assigned to the CATI (median 3 versus 6), though they were less likely to give permission to contact the father (75.0% vs. 85.8%). Fathers contacted for the MQ, however, had higher participation compared to fathers contacted for the CATI (85.2% vs. 54.5%). Fathers recruited to the MQ also had a shorter response time (median 17 days) and required fewer reminder calls and letters (median 3 reminders) than those completing the CATI (medians 28 days and 6 reminders). Conclusions: We concluded that offering a MQ substantially improved participation rates and reduced recruitment effort compared to a CATI in this study. While a CATI has the advantage of being able to clarify answers to complex questions or eligibility requirements, our experience suggests that a MQ might be a good survey option for some studies.

  2. Influence of flexion angle of files on the decentralization of oval canals during instrumentation

    OpenAIRE

    Maria Antonieta Veloso Carvalho OLIVEIRA; Letícia Duarte ALVES; Pereira, Analice Giovani; RAPOSO, Luís Henrique Araújo; João Carlos Gabrielli BIFFI

    2015-01-01

    The aim of this study was to evaluate the influence of the flexion angle of files on the decentralization of root canals during instrumentation. Fifteen lower incisors were instrumented with Protaper Universal files and radiographed in two directions (mesiodistal and buccolingual) before and after instrumentation with a #15 K-file in position for evaluating the flexion angle of files. The specimens were also scanned before and after instrumentation using micro-computed tomography to obtain th...

  3. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer-based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12-bit) character set, or punched cards.

  4. A Generic, Computer-assisted Method for Rapid Vegetation Classification and Survey: Tropical and Temperate Case Studies

    Directory of Open Access Journals (Sweden)

    Andrew N. Gillison

    2002-12-01

    Full Text Available Standard methods of vegetation classification and survey tend to be either too broad for management purposes or too reliant on local species to support inter-regional comparisons. A new approach to this problem uses species-independent plant functional types with a wide spectrum of environmental sensitivity. By means of a rule set, plant functional types can be constructed according to specific combinations from within a generic set of 35 adaptive, morphological plant functional attributes. Each combination assumes that a vascular plant individual can be described as a "coherent" functional unit. When used together with vegetation structure, plant functional types facilitate rapid vegetation assessment that complements species-based data and makes possible uniform comparisons of vegetation response to environmental change within and between countries. Recently developed user-friendly software (VegClass facilitates data entry and the analysis of biophysical field records from a standardized, rapid, survey pro forma. Case studies are presented at a variety of spatial scales and for vegetation types ranging from species-poor arctic tundra to intensive, multitaxa, baseline biodiversity assessments in complex, humid tropical forests. These demonstrate how such data can be rapidly acquired, analyzed, and communicated to conservation managers. Sample databases are linked to downloadable software and a training manual.

  5. Computer Virus Protection

    Science.gov (United States)

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  6. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change: your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  7. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  8. Survey on In-Memory Computing Technology

    Institute of Scientific and Technical Information of China (English)

    罗乐; 刘轶; 钱德沛

    2016-01-01

    In the era of big data, systems need to process massive data efficiently to meet the performance requirements of applications. In-memory computing makes full use of large-capacity memory for data processing and reduces or even avoids I/O operations, thereby improving the performance of massive data processing significantly; at the same time it faces a series of challenges that remain to be solved. This paper first analyzes the characteristics of in-memory computing technology, lays out its classification, and introduces the principles, related work and hot topics of each category. Several typical applications of in-memory computing are then analyzed. Finally, challenges and opportunities for in-memory computing are discussed at both the system and application levels, with an outlook on its future development.

  9. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap
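
    The XML implant records at the heart of such a database can be sketched as follows; the element and attribute names here are invented for illustration, since the paper's standardized schema is not reproduced in this abstract:

        import xml.etree.ElementTree as ET

        # Hypothetical implant record; the real schema differs.
        RECORD = """
        <implant id="plate-042" revision="3">
          <name>Narrow locking plate</name>
          <geometry file="plate-042.stl" unit="mm"/>
          <calibration>
            <landmark name="proximal-hole" x="0.0" y="4.5" z="1.2"/>
            <landmark name="distal-hole" x="0.0" y="84.5" z="1.2"/>
          </calibration>
        </implant>
        """

        root = ET.fromstring(RECORD)
        print(root.get("id"), "rev", root.get("revision"))
        for lm in root.iter("landmark"):
            point = tuple(float(lm.get(k)) for k in ("x", "y", "z"))
            print(lm.get("name"), point)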

  10. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate in a prototype study that HDF4 file content maps can be used to organize data efficiently in a cloud object storage system and so facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require the HDF4 APIs to access the data.
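
    A file content map is essentially a listing of where each named object lives inside the binary file, which lets a cloud service read only the bytes it needs from object storage. A minimal sketch, assuming a map of (offset, length) entries and a standard HTTP range request; the map values and the URL are invented:

        import urllib.request

        # Hypothetical content map, extracted offline from an HDF4 file:
        # dataset name -> (byte offset, byte length) within the stored object.
        CONTENT_MAP = {"temperature": (2048, 400000), "latitude": (402048, 7200)}

        def read_dataset(url, name):
            """Fetch one dataset's raw bytes with an HTTP range request,
            without reading the rest of the file or using the HDF4 APIs."""
            offset, length = CONTENT_MAP[name]
            req = urllib.request.Request(url)
            req.add_header("Range", f"bytes={offset}-{offset + length - 1}")
            with urllib.request.urlopen(req) as resp:
                return resp.read()

        # raw = read_dataset("https://example-bucket.s3.amazonaws.com/granule.hdf",
        #                    "temperature")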

  11. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file
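
    The flavor of a declarative interface over file contents can be suggested with a toy: expose a structured file as a queryable view and let the query engine, rather than hand-written scan code, pick out the data. This stand-in uses CSV and an in-memory SQLite table purely for brevity; Damasc itself operates on files in their native byte-stream formats:

        import csv, io, sqlite3

        # Stand-in for a structured scientific file.
        FILE_TEXT = io.StringIO("time,temp\n0,291.4\n1,293.9\n2,290.2\n")

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE readings (time REAL, temp REAL)")
        conn.executemany("INSERT INTO readings VALUES (?, ?)",
                         ((float(r["time"]), float(r["temp"]))
                          for r in csv.DictReader(FILE_TEXT)))

        # A declarative query over the file 'view' instead of a hand-coded scan.
        for row in conn.execute("SELECT time FROM readings WHERE temp > 291"):
            print(row)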

  12. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management, and how these capabilities may continue to evolve in the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  13. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  14. A survey of advancements in nucleic acid-based logic gates and computing for applications in biotechnology and biomedicine.

    Science.gov (United States)

    Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun; Tan, Weihong

    2015-03-04

    Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread application of nucleic acids (NA) for logic gates and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications.

  15. Statistical models and regularization strategies in statistical image reconstruction of low-dose X-ray computed tomography: a survey

    CERN Document Server

    Zhang, Hao; Ma, Jianhua; Lu, Hongbing; Liang, Zhengrong

    2014-01-01

    Statistical image reconstruction (SIR) methods have shown potential to substantially improve the image quality of low-dose X-ray computed tomography (CT) as compared to the conventional filtered back-projection (FBP) method for various clinical tasks. According to the maximum a posteriori (MAP) estimation, the SIR methods can be typically formulated by an objective function consisting of two terms: (1) a data-fidelity (or equivalently, data-fitting or data-mismatch) term modeling the statistics of projection measurements, and (2) a regularization (or equivalently, prior or penalty) term reflecting prior knowledge or expectation on the characteristics of the image to be reconstructed. Existing SIR methods for low-dose CT can be divided into two groups: (1) those that use calibrated transmitted photon counts (before log-transform) with the penalized maximum likelihood (pML) criterion, and (2) those that use calibrated line-integrals (after log-transform) with the penalized weighted least-squares (PWLS) criterion. Accurate s...
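
    In the notation standard in this literature, the two-term objective can be written out explicitly. For example, the PWLS criterion used by the second group of methods takes the form

        \hat{\mu} = \arg\min_{\mu \ge 0} \left\{ (y - A\mu)^{T} \Sigma^{-1} (y - A\mu) + \beta \, R(\mu) \right\}

    where y is the vector of calibrated line-integrals, A is the system (projection) matrix, \mu is the attenuation image to be reconstructed, \Sigma is the diagonal covariance matrix of the measurements (its inverse supplies the statistical weights), R(\mu) is the regularization term, and \beta balances data fidelity against the prior. This notation is supplied here for concreteness; the survey itself covers many variants of both terms.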

  16. Location of sea floor video tracklines along with videos collected in 2014 by the U.S. Geological Survey offshore of Fire Island, NY (MP4 videos files and Esri polyline shapefile, Geographic, WGS 84).

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS) conducted a geophysical and sampling survey in October 2014 that focused on a series of shoreface-attached ridges offshore of...

  17. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap of access times between the memory and the disk. To solve this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks to significantly improve upon the accuracy, success-per-reference, and effective-success-rate-per-reference by using a neural-network-based file access predictor with proper tuning. In particular, we verified that incorrect prediction has been reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference using a standard configuration. Simulations on distributed file system (DFS) traces reveal that the exact-fit radial basis function (RBF) gives better prediction in high-end systems, whereas the multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms in systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources, and the former predictor is more efficient than the latter for servers having maximum system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better success rate of file prediction than simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
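
    A toy version of such a predictor: encode the most recently accessed file one-hot and train a small multilayer perceptron to guess the next file. Plain stochastic gradient descent stands in for Levenberg-Marquardt here, and the access trace is synthetic; this illustrates the idea rather than the authors' tuned configurations:

        import numpy as np

        rng = np.random.default_rng(0)
        trace = [0, 1, 2, 0, 1, 2, 0, 3, 2, 0, 1, 2] * 50   # synthetic access trace
        n_files, hidden, lr = 4, 8, 0.1

        W1 = rng.normal(0.0, 0.5, (n_files, hidden))
        W2 = rng.normal(0.0, 0.5, (hidden, n_files))

        def forward(x):
            h = np.tanh(x @ W1)
            z = h @ W2
            p = np.exp(z - z.max())
            return h, p / p.sum()

        for epoch in range(30):
            for prev, nxt in zip(trace, trace[1:]):
                x = np.eye(n_files)[prev]
                h, p = forward(x)
                grad_z = p - np.eye(n_files)[nxt]        # softmax cross-entropy gradient
                grad_h = (W2 @ grad_z) * (1.0 - h ** 2)
                W2 -= lr * np.outer(h, grad_z)
                W1 -= lr * np.outer(x, grad_h)

        hits = sum(forward(np.eye(n_files)[prev])[1].argmax() == nxt
                   for prev, nxt in zip(trace, trace[1:]))
        print(f"success per reference: {hits / (len(trace) - 1):.2f}")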

  18. Identifiable Data Files - Name and Address File and Vital...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Names and Addresses File and the Vital Status File are subsets of the data elements in the Enrollment Database (EDB). The particular information in each file is...

  19. HIV-related risk behaviors among the general population: a survey using Audio Computer-Assisted Self-Interview in 3 cities in Vietnam.

    Science.gov (United States)

    Vu, Lan T H; Nadol, Patrick; Le, Linh Cu

    2015-03-01

    This study used a confidential survey method, Audio Computer-Assisted Self-Interview (ACASI), to gather data about HIV-related risk knowledge and behaviors among the general population in Vietnam. The study sample included 1371 people aged 15 to 49 years in 3 cities: Hanoi, Da Nang, and Can Tho. Results indicated that 7% of participants had ever had nonconsensual sex, and 3.6% had ever had a one-night stand. The percentage of male participants who reported ever having sex with sex workers was 9.6%, and ever injecting drugs, 4.3%. The proportion of respondents who had ever been tested for HIV was 17.6%. The risk factors and attitudes reported in the survey indicate the importance of analyzing risk behaviors related to HIV infection among the general population. Young people, especially men in more urbanized settings, are engaging in risky behaviors and may act as a "bridge" for the transmission of HIV from high-risk groups to the general population in Vietnam.

  20. Survey of Search and Replication Schemes in Unstructured P2P Networks

    CERN Document Server

    Thampi, Sabu M

    2010-01-01

    P2P computing raises challenging issues in various areas of computer science. The widely used decentralized unstructured P2P systems are ad hoc in nature and present a number of research challenges. In this paper, we provide a comprehensive theoretical survey of various state-of-the-art search and replication schemes in unstructured P2P networks for file-sharing applications. The classifications of search and replication techniques and their advantages and disadvantages are briefly explained. Finally, the various issues in searching and replication for unstructured P2P networks are discussed.
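
    The baseline search scheme that most of the surveyed techniques refine (via random walks, expanding rings, or replication of popular files) is TTL-limited flooding. A minimal sketch on a toy overlay; the graph and file names are invented:

        from collections import deque

        # Toy unstructured overlay: node -> neighbours, plus per-node shared files.
        NEIGHBOURS = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}
        FILES = {4: {"song.mp3"}, 5: {"paper.pdf"}}

        def flood_search(start, filename, ttl=3):
            """Breadth-first flood with a hop limit; returns nodes holding the file."""
            seen, hits = {start}, []
            queue = deque([(start, ttl)])
            while queue:
                node, t = queue.popleft()
                if filename in FILES.get(node, ()):
                    hits.append(node)
                if t == 0:
                    continue                    # TTL exhausted: stop forwarding
                for nb in NEIGHBOURS[node]:
                    if nb not in seen:
                        seen.add(nb)
                        queue.append((nb, t - 1))
            return hits

        print(flood_search(1, "paper.pdf"))     # [5]: reachable within 3 hops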

  1. Survey of computed tomography doses and establishment of national diagnostic reference levels in the Republic of Belarus.

    Science.gov (United States)

    Kharuzhyk, S A; Matskevich, S A; Filjustin, A E; Bogushevich, E V; Ugolkova, S A

    2010-01-01

    Computed tomography dose index (CTDI) was measured on eight CT scanners at seven public hospitals in the Republic of Belarus. The effective dose was calculated using normalised values of effective dose per dose-length product (DLP) over various body regions. Considerable variations of the dose values were observed. Mean effective doses amounted to 1.4 +/- 0.4 mSv for brain, 2.6 +/- 1.0 mSv for neck, 6.9 +/- 2.2 mSv for thorax, 7.0 +/- 2.3 mSv for abdomen and 8.8 +/- 3.2 mSv for pelvis. Diagnostic reference levels (DRLs) were proposed by calculating the third quartiles of the dose value distributions (body region / CTDIvol in mGy / DLP in mGy cm): brain/60/730, neck/55/640, thorax/20/500, abdomen/25/600 and pelvis/25/490. It is evident that the protocols need to be optimised on some of the CT scanners, in view of the fact that these are the first formulated DRLs for the Republic of Belarus.

  2. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: a methodological and comparative survey.

    Science.gov (United States)

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-06-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. © 2015 Wiley Periodicals, Inc.

  3. Computer Game Use and Television Viewing Increased Risk for Overweight among Low Activity Girls: Fourth Thai National Health Examination Survey 2008-2009

    Directory of Open Access Journals (Sweden)

    Ladda Mo-suwan

    2014-01-01

    Full Text Available Studies of the relationship between sedentary behaviors and overweight among children and adolescents show mixed results. The fourth Thai National Health Examination Survey data, collected between 2008 and 2009, were used to explore this association in 5,999 children aged 6 to 14 years. The prevalence of overweight, defined by the age- and gender-specific body mass index cut-points of the International Obesity Task Force, was 16%. Using multiple logistic regression, computer game use for more than 1 hour a day was found to be associated with an increased risk of overweight (adjusted odds ratio (AOR) = 1.4; 95% confidence interval: 1.02–1.93). The effect of computer game use and TV viewing on the risk for overweight was significantly pronounced among girls who spent ≤3 days/week in 60 minutes of moderate-intensity physical activity (AOR = 1.99 and 1.72, respectively). On the contrary, these sedentary behaviors did not exert a significant risk for overweight among boys. The moderating effect on risk of overweight by physical inactivity and media use should be taken into consideration in designing interventions for overweight control in children and adolescents. Tracking societal changes is essential for identification of potential areas for targeted interventions.

  4. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  5. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  6. SCR Algorithm: Saving/Restoring States of File Systems

    Institute of Scientific and Technical Information of China (English)

    魏晓辉; 鞠九滨

    2000-01-01

    Fault-tolerance is very important in cluster computing and has been implemented in many famous cluster-computing systems using checkpoint/restart mechanisms. But existing checkpointing algorithms cannot restore the states of a file system when rolling back the running of a program, so there are many restrictions on file accesses in existing fault-tolerance systems. The SCR algorithm, an algorithm based on atomic operations and consistent scheduling which can restore the states of file systems, is presented in this paper. In the SCR algorithm, system calls on file systems are classified into idempotent operations and non-idempotent operations. A non-idempotent operation modifies a file system's state, while an idempotent operation does not. The SCR algorithm tracks changes of the file system state. It logs each non-idempotent operation used by user programs, together with the information needed to undo that operation, on disk. When rolling the program back to a checkpoint, the SCR algorithm reverts the file system state to that of the last checkpoint time. By using the SCR algorithm, users are allowed to use any file operation in their programs.
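
    The idempotent/non-idempotent distinction can be made concrete with a small sketch: before each state-changing file operation, record enough information to undo it, and replay the log backwards on rollback. This is a generic illustration of the idea, not the SCR implementation (which logs to disk and cooperates with the checkpointing system):

        import os

        undo_log = []   # SCR keeps this information on disk so it survives failures

        def logged_write(path, data):
            """Non-idempotent: save the file's prior contents before overwriting."""
            old = open(path, "rb").read() if os.path.exists(path) else None
            undo_log.append(("write", path, old))
            with open(path, "wb") as f:
                f.write(data)

        def logged_read(path):
            """Idempotent: reads change nothing, so nothing is logged."""
            with open(path, "rb") as f:
                return f.read()

        def rollback():
            """Revert the file system to its state at the last checkpoint."""
            while undo_log:
                op, path, old = undo_log.pop()
                if op == "write":
                    if old is None:
                        os.remove(path)         # the file did not exist before
                    else:
                        with open(path, "wb") as f:
                            f.write(old)

        logged_write("demo.txt", b"v1")
        logged_write("demo.txt", b"v2")
        rollback()                               # demo.txt is gone again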

  7. Preconsult interactive computer-assisted client assessment survey for common mental disorders in a community health centre: a randomized controlled trial

    Science.gov (United States)

    Ahmad, Farah; Lou, Wendy; Shakya, Yogendra; Ginsburg, Liane; Ng, Peggy T.; Rashid, Meb; Dinca-Panaitescu, Serban; Ledwos, Cliff; McKenzie, Kwame

    2017-01-01

    Background: Access disparities for mental health care exist for vulnerable ethnocultural and immigrant groups. Community health centres that serve these groups could be supported further by interactive, computer-based, self-assessments. Methods: An interactive computer-assisted client assessment survey (iCCAS) tool was developed for preconsult assessment of common mental disorders (using the Patient Health Questionnaire [PHQ-9], Generalized Anxiety Disorder 7-item [GAD-7] scale, Primary Care Post-traumatic Stress Disorder [PTSD-PC] screen and CAGE [concern/cut-down, anger, guilt and eye-opener] questionnaire), with point-of-care reports. The pilot randomized controlled trial recruited adult patients, fluent in English or Spanish, who were seeing a physician or nurse practitioner at the partnering community health centre in Toronto. Randomization into iCCAS or usual care was computer generated, and allocation was concealed in sequentially numbered, opaque envelopes that were opened after consent. The objectives were to examine the interventions' efficacy in improving mental health discussion (primary) and symptom detection (secondary). Data were collected by exit survey and chart review. Results: Of the 1248 patients assessed, 190 were eligible for participation. Of these, 148 were randomly assigned (response rate 78%). The iCCAS (n = 75) and usual care (n = 72) groups were similar in sociodemographics; 98% were immigrants, and 68% were women. Mental health discussion occurred for 58.7% of patients in the iCCAS group and 40.3% in the usual care group (p ≤ 0.05). The effect remained significant while controlling for potential covariates (language, sex, education, employment) in generalized linear mixed model (GLMM; adjusted odds ratio [OR] 2.2; 95% confidence interval [CI] 1.1-4.5). Mental health symptom detection occurred for 38.7% of patients in the iCCAS group and 27.8% in the usual care group (p > 0.05). The effect was not significant beyond potential

  8. Key Statistics from the National Survey of Family Growth: Vasectomy

    Science.gov (United States)

  9. 77 FR 66497 - Self-Regulatory Organizations; The Depository Trust Company; Notice of Filing and Immediate...

    Science.gov (United States)

    2012-11-05

    ... and distributing information to its Participants using its proprietary computer-to-computer facility ("CCF") files. In order to reduce risk, improve transparency and increase efficiency in the announcing... the end-of-day batch CCF files. Participants that have volunteered to participate in a pilot...

  10. Register file soft error recovery

    Science.gov (United States)

    Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.

    2013-10-15

    Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.

  11. OE CAI: COMPUTER-ASSISTED INSTRUCTION OF OLD ENGLISH

    Directory of Open Access Journals (Sweden)

    Alejandro Alcaraz Sintes

    2002-06-01

    Full Text Available This article offers a general but thorough survey of Computer Assisted Instruction as applied to the Old English language, from the work of the late 80s pioneers to December 2001. It embraces all the different facets of the question: stand-alone and web-based applications, Internet sites, CD-ROMs, grammars, dictionaries, general courses, reading software, extralinguistic material, exercises, handouts, audio files... Each instruction item, whether it be a website, a Java exercise, an online course or an electronic book, is reviewed, and URLs are provided in footnotes. These reviews are accompanied throughout by the pertinent theoretical background and practical advice.

  12. Computation of Flow Through Water-Control Structures Using Program DAMFLO.2

    Science.gov (United States)

    Sanders, Curtis L.; Feaster, Toby D.

    2004-01-01

    As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
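
    As an example of the kind of hydraulic relation such a program evaluates, free flow under a sluice gate is commonly modeled as Q = Cd * b * w * sqrt(2 * g * H), with the discharge coefficient Cd calibrated against flow measurements. The numbers below are invented, and DAMFLO.2's actual equation forms and calibration procedure are those documented in the report:

        import math

        def sluice_gate_flow(cd, width_ft, opening_ft, head_ft, g=32.2):
            """Free-flow discharge under a sluice gate, in cubic feet per second.
            cd is the dimensionless discharge coefficient fitted to measurements."""
            return cd * width_ft * opening_ft * math.sqrt(2 * g * head_ft)

        # e.g. a 20-ft-wide gate opened 2 ft under 12 ft of head, with Cd = 0.61
        print(f"{sluice_gate_flow(0.61, 20.0, 2.0, 12.0):.0f} ft^3/s")   # ~678

    Consistent with the report's convention, the computation is in units of feet and seconds.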

  13. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  14. Next generation WLCG File Transfer Service (FTS)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  15. Bi-weekly waterfowl survey data entry

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Data sheet for the entry of bi-weekly waterfowl survey data from the state of Kansas. This Excel file contains the data entry sheet and a chart displaying waterfowl...

  16. At-sea aerial survey GPS points in southern California, 1999-2002

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This file contains the flight trackline Global Positioning System (GPS) point data from the aerial surveys. Surveys were flown at 60 meters (200 feet) above sea...

  17. 78 FR 45513 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-07-29

    .... DESCRIPTION OF COMPUTER MATCHING PROGRAM: Each participating SPAA will send ACF an electronic file of eligible public assistance client information. These files are non-Federal computer records maintained by the... on no more than 10,000,000 public assistance beneficiaries. 2. The DMDC computer database...

  18. Data Processing Factory for the Sloan Digital Sky Survey

    Science.gov (United States)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per gigabyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
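
    The four consecutive phases noted above (preparation, submission, checking, and quality control) suggest a simple wrapper pattern. A schematic sketch; the phase functions and their chaining are illustrative rather than the SDSS scripts themselves:

        def run_pipeline(name, phases):
            """Run a pipeline's phases in order, stopping at the first failure
            so operators can fix the problem and resubmit."""
            for phase in phases:
                ok = phase()
                print(f"{name}:{phase.__name__}: {'ok' if ok else 'FAILED'}")
                if not ok:
                    return False
            return True

        # Hypothetical phases for one imaging run; real scripts would submit
        # batch jobs and inspect their logs and outputs.
        def prepare():
            return True      # stage raw frames, write parameter files
        def submit():
            return True      # launch the pipeline on the compute farm
        def check():
            return True      # verify that all expected outputs exist
        def qa():
            return True      # run quality-control plots and cuts

        run_pipeline("imaging-run-94", [prepare, submit, check, qa])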

  19. Assessing the Accuracy of High Resolution Digital Surface Models Computed by PhotoScan® and MicMac® in Sub-Optimal Survey Conditions

    Directory of Open Access Journals (Sweden)

    Marion Jaud

    2016-06-01

    Full Text Available For monitoring purposes and in the context of geomorphological research, Unmanned Aerial Vehicles (UAVs) appear to be a promising solution to provide multi-temporal Digital Surface Models (DSMs) and orthophotographs. There are a variety of photogrammetric software tools available for UAV-based data. The objective of this study is to investigate the level of accuracy that can be achieved using two of these software tools: Agisoft PhotoScan® Pro and an open-source alternative, IGN© MicMac®, in sub-optimal survey conditions (rugged terrain, with a large variety of morphological features covering a range of roughness sizes, poor GPS reception). A set of UAV images was taken by a hexacopter drone above the Rivière des Remparts, a river on Reunion Island. This site was chosen for its challenging survey conditions: the topography of the study area (i) involved constraints on the flight plan; (ii) implied errors on some GPS measurements; (iii) prevented an optimal distribution of the Ground Control Points (GCPs); and (iv) was very complex to reconstruct. Several image processing tests were performed with different scenarios in order to analyze the sensitivity of each software package to different parameters (image quality, number of GCPs, etc.). When computing the horizontal and vertical errors within a control region on a set of ground reference targets, both methods provide rather similar results. A precision of up to 3–4 cm is achievable with these software packages. The DSM quality is also assessed over the entire study area by comparing the PhotoScan DSM and the MicMac DSM with a Terrestrial Laser Scanner (TLS) point cloud. The PhotoScan and MicMac DSMs are also compared at the scale of particular features. Both software packages provide satisfying results: PhotoScan is more straightforward to use but its source code is not open; MicMac is recommended for experienced users as it is more flexible.

  20. ASCII Text File of the Original 1-m Bathymetry (Partial Coverage) from National Oceanic and Atmospheric Administration (NOAA) Survey H11322 in Western Rhode Island Sound (H11322_1M_UTM19NAD83.TXT)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The United States Geological Survey (USGS) is working cooperatively with the National Oceanic and Atmospheric Administration (NOAA) to interpret the surficial...

  1. ASCII text file of the Original 1-m Gridded Bathymetry from NOAA Survey H11310 in Central Narragansett Bay (H11310_1M_UTM19NAD83.TXT)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The United States Geological Survey (USGS) is working cooperatively with the National Oceanic and Atmospheric Association (NOAA) to interpret the surficial geology...

  2. Text files of the navigation logged with HYPACK Software during field activity 2013-003-FA in 2013 by the U.S. Geological Survey south of Martha's Vineyard and north of Nantucket, Massachusetts

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  3. 2010-010-FA HYPACK NAVIGATION: Text Files of the DGPS Navigation Logged with HYPACK Software During SEABOSS Operations on U.S. Geological Survey (USGS) Cruise 2010-010-FA from April 17 to April 18, 2010

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS), in cooperation with the National Oceanic and Atmospheric Administration (NOAA) and the Connecticut Department of Energy and...

  4. ASCII Text File of the Original 1-m Bathymetry from National Oceanic and Atmospheric Administration (NOAA) Survey H11321 in Central Rhode Island Sound (H11321_1M_UTM19NAD83.TXT)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The United States Geological Survey (USGS) is working cooperatively with the National Oceanic and Atmospheric Administration (NOAA) to interpret the surficial...

  5. 2013-005-FA_HYPACK: Text Files of the DGPS Navigation Logged with HYPACK Software on U.S. Geological Survey Cruise 2013-005-FA from June 17 to June 20, 2013

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS), in cooperation with the National Oceanic and Atmospheric Administration (NOAA), is producing detailed geologic maps of the coastal...

  7. Sidescan sonar polyline shapefile of trackline navigation files collected by the U.S. Geological Survey in the Madison Swanson and Steamboat Lumps Marine Protected Areas, Gulf of Mexico in 2000 (Geographic, WGS 84)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey (USGS) mapped approximately 22 square miles of the Madison Swanson Marine Protected Area (MPA) and Steamboat Lumps MPA, which are located...

  8. IT Barometer survey, Denmark

    DEFF Research Database (Denmark)

    Howard, Rob

    1998-01-01

    Survey results from Danish architects, engineers, contractors and property managers in the construction industry concerning their use of computers, communications, problems and needs.

  9. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [PI; Miller, Ethan L [Co PI

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40-year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search
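    As a toy illustration of the attribute-based search idea this record contrasts with hierarchical naming (a sketch written for this survey entry, not the project's file system), the Python fragment below indexes files by metadata and queries them independently of directory layout:

```python
import os
import time

def build_metadata_index(root):
    """Walk a directory tree and index files by metadata attributes
    (size, modification time, extension) rather than by path alone."""
    index = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip unreadable entries
            index.append({"path": path, "size": st.st_size,
                          "mtime": st.st_mtime,
                          "ext": os.path.splitext(name)[1].lower()})
    return index

def query(index, ext=None, modified_after=None):
    """Return entries matching attribute predicates, independent of
    where the files live in the directory hierarchy."""
    hits = index
    if ext is not None:
        hits = [e for e in hits if e["ext"] == ext]
    if modified_after is not None:
        hits = [e for e in hits if e["mtime"] > modified_after]
    return hits

if __name__ == "__main__":
    idx = build_metadata_index(".")
    week_ago = time.time() - 7 * 24 * 3600
    for entry in query(idx, ext=".dat", modified_after=week_ago):
        print(entry["path"], entry["size"])
```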

  10. CIF (Crystallographic Information File): A Standard for Crystallographic Data Interchange

    Science.gov (United States)

    Brown, I. D.

    1996-01-01

    The Crystallographic Information File (CIF) uses the self-defining STAR file structure. This requires the creation of a dictionary of data names and definitions. A basic dictionary of terms needed to describe the crystal structures of small molecules was approved in 1991 and is currently used for the submission of papers to Acta Crystallographica C. A number of extensions to this dictionary are in preparation. By storing the dictionary itself as a STAR file, the definitions and relationships in the CIF dictionary become computer interpretable. This offers many possibilities for the automatic handling of crystallographic information. PMID:27805170
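    To illustrate the tag-value flavour of CIF data (a deliberate simplification: real CIF/STAR files also carry loop_ tables and multi-line semicolon-delimited values, which this sketch ignores), a minimal Python parser might look like:

```python
def parse_simple_cif(text):
    """Parse flat CIF-style '_tag value' lines into a dict.
    Loop_ tables and multi-line values are not handled here."""
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("_"):
            tag, _, value = line.partition(" ")
            data[tag] = value.strip().strip("'\"")
    return data

example = """
_cell_length_a    5.431
_cell_length_b    5.431
_symmetry_space_group_name_H-M  'F d -3 m'
"""
print(parse_simple_cif(example)["_cell_length_a"])  # -> 5.431
```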

  11. The Design of a Secure File Storage System

    Science.gov (United States)

    1979-12-01

    ...into the FM process memory to check for proper discretionary access. The complete pathname, in terms of the FSS file system, is passed to...research shows that a viable approach to the question of internal computer security exists. This approach, sometimes termed the "security kernel approach"...a significant advantage if the data file is long. After the file is stored by the IO process, the FM process gets a ticket to the

  12. Value-Based File Retention: File Attributes as File Value and Information Waste Indicators

    NARCIS (Netherlands)

    Wijnhoven, Fons; Amrit, Chintan; Dietz, Pim

    2014-01-01

    Several file retention policy methods propose that a file retention policy should be based on file value. Though such a retention policy might increase the value of accessible files, the method to arrive at such a policy is underresearched. This article discusses how one can arrive at a method for d

  13. DCFPAK: Dose coefficient data file package for Sandia National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, K.F.; Leggett, R.W.

    1996-07-31

    The FORTRAN-based computer package DCFPAK (Dose Coefficient File Package) has been developed to provide electronic access to the dose coefficient data files summarized in Federal Guidance Reports 11 and 12. DCFPAK also provides access to standard information regarding decay chains and assembles dose coefficients for all dosimetrically significant radioactive progeny of a specified radionuclide. DCFPAK was designed for application on a PC but, with minor modifications, may be implemented on a UNIX workstation.

  14. Low-Carbon Computing

    Science.gov (United States)

    Hignite, Karla

    2009-01-01

    Green information technology (IT) is grabbing more mainstream headlines--and for good reason. Computing, data processing, and electronic file storage collectively account for a significant and growing share of energy consumption in the business world and on higher education campuses. With greater scrutiny of all activities that contribute to an…

  15. Isothiourea-catalysed enantioselective pyrrolizine synthesis: synthetic and computational studies

    Science.gov (United States)

    Stark, Daniel G.; Williamson, Patrick; Gayner, Emma R.; Musolino, Stefania F.; Kerr, Ryan W. F.; Taylor, James E.; Slawin, Alexandra M. Z.; O'Riordan, Timothy J. C.

    2016-01-01

    The catalytic enantioselective synthesis of a range of cis-pyrrolizine carboxylate derivatives with outstanding stereocontrol (14 examples, >95 : 5 dr, >98 : 2 er) through an isothiourea-catalyzed intramolecular Michael addition-lactonisation and ring-opening approach from the corresponding enone acid is reported. An optimised and straightforward three-step synthetic route to the enone acid starting materials from readily available pyrrole-2-carboxaldehydes is delineated, with benzotetramisole (5 mol%) proving the optimal catalyst for the enantioselective process. Ring-opening of the pyrrolizine dihydropyranone products with either MeOH or a range of amines leads to the desired products in excellent yield and enantioselectivity. Computation has been used to probe the factors leading to high stereocontrol, with the formation of the observed cis-stereoisomer predicted to be kinetically and thermodynamically favoured. PMID:27489030

  16. Closed Claim Query File

    Data.gov (United States)

    Social Security Administration — This file is used to hold information about disability claims that have been closed and have been selected for sampling. Sampling is the process whereby OQR reviews...

  17. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  18. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  19. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  20. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  1. A Survey on The Technology & Application of Grid Computing

    Institute of Scientific and Technical Information of China (English)

    洪学海; 许卓群; 丁文魁

    2003-01-01

    Grid computing is a new kind of distributed computing technology and computing environment, and an attention-drawing hot point of information technology in the world. It opens a space for a new generation of Internet applications, owing to its efficient support for complicated services and usable resources on the Internet. This paper sums up the results and application instances in grid computing in the world during recent years, then analyzes emphatically the system architecture, components, and working principles of grid computing and some typical grid systems, then discusses the problems of large-scale scientific computing and network services in grid computing in China, and also points out the future trends.

  2. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    ...file system virtual appliances. (4) We analyze the sources of latency in traditional inter-VM communication techniques and present a novel energy- and..."multiple file system implementations within the Sun UNIX kernel" [52]. This was achieved through two techniques. First, outside the file system layer

  3. INTERNET: FILE TRANSFER

    Directory of Open Access Journals (Sweden)

    Zainul Bakri

    2012-10-01

    Full Text Available One reason computer users connect to the Internet is the opportunity to download information stored on servers of other computer networks (for example, to copy computer application programs, raw data, and so on). File Transfer Protocol (FTP) is the Internet mechanism for sending files from one location to the user's computer. For this purpose, a dedicated FTP program or a Web browser can be used.
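    A minimal Python sketch of the FTP download described in this record, using the standard library's ftplib module (the host and file names below are placeholders, not real endpoints):

```python
from ftplib import FTP

# Hypothetical host and remote file, for illustration only.
HOST = "ftp.example.org"
REMOTE_FILE = "data/report.txt"

def download(host, remote_path, local_path):
    """Fetch one file over FTP using anonymous login."""
    with FTP(host) as ftp:
        ftp.login()  # anonymous login
        with open(local_path, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_path}", fh.write)

download(HOST, REMOTE_FILE, "report.txt")
```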

  4. Prior to the Oral Therapy, What Do We Know About HCV-4 in Egypt: A Randomized Survey of Prevalence and Risks Using Data Mining Computed Analysis

    Science.gov (United States)

    Abd Elrazek, Abd Elrazek; Bilasy, Shymaa E.; Elbanna, Abduh E. M.; Elsherif, Abd Elhalim A.

    2014-01-01

    Abstract Hepatitis C virus (HCV) affects over 180 million people worldwide and is the leading cause of chronic liver diseases and hepatocellular carcinoma. HCV is classified into seven major genotypes and a series of subtypes. In general, HCV genotype 4 (HCV-4) is common in the Middle East and Africa, where it is responsible for more than 80% of HCV infections. Although HCV-4 is the cause of approximately 20% of the 180 million cases of chronic hepatitis C worldwide, it has not been a major subject of research yet. The aim of the current study is to survey the morbidities and disease complications among the Egyptian population infected with HCV-4, using mainly advanced data mining computing methods together with complementary statistical analysis. Six thousand six hundred sixty subjects, aged between 17 and 58 years old, from different Egyptian Governorates were screened for HCV infection by ELISA and qualitative PCR. HCV-positive patients were further investigated for the incidence of liver cirrhosis and esophageal varices. The obtained data were analyzed by a data mining approach. Among the 6660 subjects enrolled in this survey, 1018 patients (15.28%) were HCV-positive. The proportion of infected males was significantly higher than that of females; 61.6% versus 38.4% (P = 0.0052). Around two-thirds of infected patients (635/1018; 62.4%) presented with liver cirrhosis. Additionally, approximately half of the cirrhotic patients (301/635; 47.4%) showed degrees of large esophageal varices (LEVs), with higher variceal grade observed in males. Age for esophageal variceal development was 47 ± 1. Data mining analysis yielded esophageal wall thickness (>6.5 mm), determined by conventional U/S, as the only independent predictor for esophageal varices. This study emphasizes the high prevalence of HCV infection among the Egyptian population, in particular among males. Egyptians with HCV-4 infection are at a higher risk of developing liver cirrhosis and esophageal varices. Data mining, a new

  5. Tax Filing and Other Financial Behaviors of EITC-Eligible Households: Differences of Banked and Unbanked

    Science.gov (United States)

    Lim, Younghee; Livermore, Michelle; Davis, Belinda Creel

    2011-01-01

    Holding a bank account is crucial to the income-maximizing and asset-building of households. This study uses 2008 survey data of EITC-eligible households assisted at Volunteer Income Tax Assistance (VITA) sites to document their tax filing behavior and use of Alternate Financial Services (AFS). Specifically, the differences in tax filing and AFS…

  6. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  7. An Approach to Analyze Physical Memory Image File of Mac OS X

    Institute of Scientific and Technical Information of China (English)

    LiJuan Xu; LianHai Wang

    2014-01-01

    Memory analysis is one of the key techniques in computer live forensics. In particular, the analysis of a Mac OS X operating system's memory image file plays an important role in identifying the running status of an Apple computer. However, how to analyze the image file without using an extra "mach-kernel" file is one of the unsolved difficulties. In this paper, we first compare several approaches for physical memory acquisition and analyze the effects of each approach on physical memory. Then, we discuss the traditional methods for physical memory file analysis of Mac OS X. A novel physical memory image file analysis approach that does not require an extra "mach-kernel" file is proposed based on the discussion. We verify the performance of the new approach on Mac OS X 10.8.2. The experimental results show that the proposed approach is simpler and more practical than previous ones.

  8. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ... comparison file compiled of records from our expanded Medicare Database (MDB) File system of records in order to support our administration of the prescription drug subsidy program. The MDB File system of... computer systems and provide the response file to us as soon as possible. This agreement covers...

  9. The clustering of galaxies in the completed SDSS-III Baryon Oscillation Spectroscopic Survey: towards a computationally efficient analysis without informative priors

    Science.gov (United States)

    Pellejero-Ibanez, Marcos; Chuang, Chia-Hsun; Rubiño-Martín, J. A.; Cuesta, Antonio J.; Wang, Yuting; Zhao, Gongbo; Ross, Ashley J.; Rodríguez-Torres, Sergio; Prada, Francisco; Slosar, Anže; Vazquez, Jose A.; Alam, Shadab; Beutler, Florian; Eisenstein, Daniel J.; Gil-Marín, Héctor; Grieb, Jan Niklas; Ho, Shirley; Kitaura, Francisco-Shu; Percival, Will J.; Rossi, Graziano; Salazar-Albornoz, Salvador; Samushia, Lado; Sánchez, Ariel G.; Satpathy, Siddharth; Seo, Hee-Jong; Tinker, Jeremy L.; Tojeiro, Rita; Vargas-Magaña, Mariana; Brownstein, Joel R.; Nichol, Robert C.; Olmstead, Matthew D.

    2017-07-01

    We develop a new computationally efficient methodology called double-probe analysis with the aim of minimizing informative priors (those coming from extra probes) in the estimation of cosmological parameters. Using our new methodology, we extract the dark energy model-independent cosmological constraints from the joint data sets of the Baryon Oscillation Spectroscopic Survey (BOSS) galaxy sample and Planck cosmic microwave background (CMB) measurements. We measure the mean values and covariance matrix of {R, l_a, Ω_b h², n_s, log(A_s), Ω_k, H(z), D_A(z), f(z)σ_8(z)}, which give an efficient summary of the Planck data and two-point statistics from the BOSS galaxy sample. The CMB shift parameters are R = √(Ω_m H_0²) r(z_*) and l_a = π r(z_*)/r_s(z_*), where z_* is the redshift at the last scattering surface, and r(z_*) and r_s(z_*) denote our comoving distance to z_* and the sound horizon at z_*, respectively; Ω_b is the baryon fraction at z = 0. This approximate methodology guarantees that we will not need to put informative priors on the cosmological parameters that galaxy clustering is unable to constrain, i.e. Ω_b h² and n_s. The main advantage is that the computational time required for extracting these parameters is decreased by a factor of 60 with respect to exact full-likelihood analyses. The results obtained show no tension with the flat Λ cold dark matter (ΛCDM) cosmological paradigm. By comparing with the full-likelihood exact analysis with fixed dark energy models, on one hand we demonstrate that the double-probe method provides robust cosmological parameter constraints that can be conveniently used to study dark energy models, and on the other hand we provide a reliable set of measurements assuming dark energy models to be used, for example, in distance estimations. We extend our study to measure the sum of the neutrino mass using different methodologies, including double-probe analysis (introduced in this study), full-likelihood analysis and single-probe analysis
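    For readability, the two CMB shift parameters quoted in this abstract can be restated in display form (nothing beyond the definitions already given above):

```latex
R = \sqrt{\Omega_m H_0^2}\, r(z_*), \qquad
l_a = \frac{\pi\, r(z_*)}{r_s(z_*)}
```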

  10. Water and streambed-material data, Eagle Creek watershed, Indiana, August 1980, October and December 1982, and April 1983; updating of U.S. Geological Survey Open-file report 83-215

    Science.gov (United States)

    Wangsness, David J.

    1983-01-01

    Water-quality surveys within the Eagle Creek watershed were done by the U.S. Geological Survey in August 1980, October and December 1982, and April 1983 in cooperation with the city of Indianapolis, Department of Public Works. Streambed-material and water samples were collected from Finley and Eagle Creeks and were analyzed for selected metals, insecticides, and acid-extractable and base-neutral-extractable compounds. Water samples also were analyzed for volatile organic compounds. The 1982-83 surveys represent different flow conditions. This report lists all the data collected and analyzed by the U.S. Geological Survey but does not interpret any of the results.

  11. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
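    As a rough, hedged illustration of the checkpoint-to-object idea (not the patented PLFS middleware itself), the sketch below uploads a directory of checkpoint files to an S3-style object store using boto3, one object per file; the bucket name and paths are hypothetical, and the real middleware additionally merges per-process write logs before shipping them out:

```python
import os
import boto3  # assumes AWS-style object-store credentials are configured

def archive_checkpoints(checkpoint_dir, bucket):
    """Upload every checkpoint file in a directory as one object each,
    keyed by file name -- a stand-in for log-structured middleware
    that converts a parallel file set into cloud objects."""
    s3 = boto3.client("s3")
    for name in sorted(os.listdir(checkpoint_dir)):
        path = os.path.join(checkpoint_dir, name)
        if os.path.isfile(path):
            s3.upload_file(path, bucket, f"checkpoints/{name}")

# Hypothetical usage:
# archive_checkpoints("/scratch/ckpt/step_001000", "my-hpc-archive")
```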

  12. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  14. LASIP-III, a generalized processor for standard interface files. [For creating binary files from BCD input data and printing binary file data in BCD format (devised for fast reactor physics codes)

    Energy Technology Data Exchange (ETDEWEB)

    Bosler, G.E.; O' Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field format, BCD data into well-defined binary files and providing for printing and punching data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables.
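    The two-module split described here (one module creating binary files from text records, one printing binary files back in readable form) is easy to picture. Below is a minimal Python sketch of the same pattern under an assumed record layout of one integer and one float per record; it is an illustration, not the LASIP-III interface-file format:

```python
import struct

RECORD = struct.Struct("<i d")  # assumed layout: one int, one double per record

def text_to_binary(text_path, bin_path):
    """Module 1: read free-field 'id value' text records, write binary."""
    with open(text_path) as src, open(bin_path, "wb") as dst:
        for line in src:
            fields = line.split()
            if fields:
                dst.write(RECORD.pack(int(fields[0]), float(fields[1])))

def binary_to_text(bin_path):
    """Module 2: print the binary file's records in readable form.
    Assumes the file holds whole records."""
    with open(bin_path, "rb") as src:
        while chunk := src.read(RECORD.size):
            ident, value = RECORD.unpack(chunk)
            print(f"{ident:8d} {value:14.6e}")
```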

  15. International survey of emergency physicians' awareness and use of the Canadian Cervical-Spine Rule and the Canadian Computed Tomography Head Rule.

    Science.gov (United States)

    Eagles, Debra; Stiell, Ian G; Clement, Catherine M; Brehaut, Jamie; Taljaard, Monica; Kelly, Anne-Maree; Mason, Suzanne; Kellermann, Arthur; Perry, Jeffrey J

    2008-12-01

    The derivation and validation studies for the Canadian Cervical-Spine (C-Spine) Rule (CCR) and the Canadian Computed Tomography (CT) Head Rule (CCHR) have been published in major medical journals. The objectives were to determine: 1) physician awareness and use of these rules in Australasia, Canada, the United Kingdom, and the United States and 2) physician characteristics associated with awareness and use. A self-administered e-mail and postal survey was sent to members of four national emergency physician (EP) associations using a modified Dillman technique. Results were analyzed using repeated-measures logistic regression models. The response rate was 54.8% (1,150/2,100). Reported awareness of the CCR ranged from 97% (Canada) to 65% (United States); for the CCHR it ranged from 86% (Canada) to 31% (United States). Reported use of the CCR ranged from 73% (Canada) to 30% (United States); for the CCHR, it was 57% (Canada) to 12% (United States). Predictors of awareness were country, type of rule, full-time employment, younger age, and teaching hospital. Awareness and use of the rules were highest in Canada and lowest in the United States. While younger physicians, those employed full-time, and those working in teaching hospitals were more likely to be aware of a decision rule, age and employment status were not significant predictors of use. A better understanding of factors related to awareness and use of emergency medicine (EM) decision rules will enhance our understanding of knowledge translation and facilitate strategies to enhance dissemination and implementation of future rules.

  16. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  17. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  18. ERF1_2 -- Enhanced River Reach File 2.0

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The digital segmented network based on watershed boundaries, ERF1_2, includes enhancements to the U.S. Environmental Protection Agency's River Reach File 1 (RF1)...

  19. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND DOCUMENTATION

    Science.gov (United States)

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  20. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    Full Text Available The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator to allow on-demand checking, and use of tailorable reporting and exception policies targeted to adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes a file digest, then stores those digests in a database file located on write-once media. An Examiner agent computes a new digest to compare with the original digests in the database file. Changes in digest values indicate that the file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
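    The Initiator/Examiner digest workflow maps directly onto standard-library calls. A minimal Python sketch of the same MD5 baseline-and-compare pattern follows (the JSON database format and function names are illustrative choices of this sketch, not M-FICA's):

```python
import hashlib
import json
import os

def md5_digest(path, chunk_size=1 << 20):
    """Compute the MD5 digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def baseline(paths, db_path):
    """Initiator role: record current digests in a database file."""
    digests = {p: md5_digest(p) for p in paths}
    with open(db_path, "w") as fh:
        json.dump(digests, fh, indent=2)

def check(db_path):
    """Examiner role: recompute digests and report any changes."""
    with open(db_path) as fh:
        recorded = json.load(fh)
    for path, old in recorded.items():
        if not os.path.exists(path):
            print(f"MISSING  {path}")
        elif md5_digest(path) != old:
            print(f"MODIFIED {path}")
```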

  1. Log files can and should be prepared for a functionalistic approach

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Johnsen, Mia

    2007-01-01

     User surveys of printed dictionaries may be characterised as non-representative and non-realistic laboratory tests, often with retrospective questions based on memory. Log file analy­ses concerning the use of Internet dictionaries, on the other hand, are based on large numbers of users and look......-ups. However, log file analyses have also been characterised by a juggling of num­bers based on data calculations of limited direct relevance to practical and theoretical lexicography. This article proposes the development of lexicographically relevant log files for the use in log file analyses in order...

  2. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND ...

    Science.gov (United States)

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  3. Challenges of Hidden Data in the Unused Area Two within Executable Files

    Directory of Open Access Journals (Sweden)

    A. W. Naji

    2009-01-01

    Full Text Available Problem statement: Executable files are among the most important files in operating systems and in most systems designed by developers (programmers/software engineers), and hiding information in these files is the basic goal of this study, because most users of a system cannot alter or modify the content of these files. There are many challenges to hiding data in the unused area two within executable files: the dependency of the size of the hidden information on the size of the cover file, differences in the size of the file before and after the hiding process, availability of the cover file after the hiding process to perform normally, and detection by antivirus software as a result of changes made to the file. Approach: The system was designed to accommodate a release mechanism that consists of two functions; the first is hiding the information in the unused area 2 of the PE file (exe file), through the execution of four processes (specify the cover file, specify the information file, encrypt the information, and hide the information), and the second is extraction of the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). Results: The programs were coded in the Java computer language and implemented on a Pentium PC. The designed algorithms help the proposed system hide and retrieve an information (data) file within the unused area 2 of any executable file (exe file). Conclusion: The size of the hidden data depends on the size of the unused area 2 within the cover file, which is about 20% of the size of the exe file before the hiding process; most antivirus systems do not allow direct writes to executable files, so the approach of the proposed system is to keep the hidden information unobserved by these systems, and the exe file still functions as usual after the hiding process.

  4. M-FILE FOR MIX DESIGN OF STRUCTURAL LIGHTWEIGHT CONCRETE USING DEVELOPED MODELS

    Directory of Open Access Journals (Sweden)

    M. ABDULLAHI

    2011-08-01

    Full Text Available An m-file for mix design of structural lightweight concrete is presented. Mix design of structural lightweight concrete is normally conducted using guidance in the standards. This can be a tasking process involving reading and understanding the relevant standards, which renders the process inefficient and liable to computational errors. A computer approach to mix design alleviates this problem. An m-file was developed in the MATLAB environment for the concrete mix design. The m-file has been tested and has proved efficient in computing the mix composition for the first trial batch of lightweight concrete mixes. It can also perform concrete mixture proportioning adjustment.

  5. Efficient compression of molecular dynamics trajectory files.

    Science.gov (United States)

    Marais, Patrick; Kenwood, Julian; Smith, Keegan Carruthers; Kuttel, Michelle M; Gain, James

    2012-10-15

    We investigate whether specific properties of molecular dynamics trajectory files can be exploited to achieve effective file compression. We explore two classes of lossy, quantized compression scheme: "interframe" predictors, which exploit temporal coherence between successive frames in a simulation, and more complex "intraframe" schemes, which compress each frame independently. Our interframe predictors are fast, memory-efficient and well suited to on-the-fly compression of massive simulation data sets, and significantly outperform the benchmark BZip2 application. Our schemes are configurable: atomic positional accuracy can be sacrificed to achieve greater compression. For high fidelity compression, our linear interframe predictor gives the best results at very little computational cost: at moderate levels of approximation (12-bit quantization, maximum error ≈ 10^-2 Å), we can compress a 1-2 fs trajectory file to 5-8% of its original size. For 200 fs time steps, typically used in fine-grained water diffusion experiments, we can compress files to ~25% of their input size, still substantially better than BZip2. While compression performance degrades with high levels of quantization, the simulation error is typically much greater than the associated approximation error in such cases.
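    The interframe idea is simple enough to sketch. Below is a minimal, hypothetical Python/NumPy illustration in the spirit of (but not taken from) the paper: quantize coordinates, predict each frame linearly from the two previous ones, and keep only the small residuals, which a generic entropy coder then compresses well.

```python
import numpy as np

SCALE = 1 << 12  # 12-bit-style uniform quantization step

def encode(frames):
    """Linear interframe predictor: predict frame t as
    2*frame[t-1] - frame[t-2]; store quantized residuals."""
    q = np.round(frames * SCALE).astype(np.int32)
    residuals = np.empty_like(q)
    residuals[0] = q[0]
    if len(q) > 1:
        residuals[1] = q[1] - q[0]
    residuals[2:] = q[2:] - (2 * q[1:-1] - q[:-2])
    return residuals  # small integers: compress well with a generic coder

def decode(residuals):
    """Invert the predictor to recover quantized coordinates exactly."""
    q = np.empty_like(residuals)
    q[0] = residuals[0]
    if len(residuals) > 1:
        q[1] = residuals[1] + q[0]
    for t in range(2, len(residuals)):
        q[t] = residuals[t] + 2 * q[t - 1] - q[t - 2]
    return q.astype(np.float64) / SCALE

# Toy trajectory: 100 frames of 50 atoms drifting by small random steps.
frames = np.cumsum(np.random.randn(100, 50, 3) * 0.01, axis=0)
assert np.allclose(decode(encode(frames)), frames, atol=1.0 / SCALE)
```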

  6. A survey of TRIPOLI-4

    Energy Technology Data Exchange (ETDEWEB)

    Both, J.P.; Derriennic, H.; Morillon, B.; Nimal, J.C.

    1994-12-31

    A survey of the new version of the TRIPOLI code, used in shielding calculations, is presented. The main new features introduced in this version are combinatorial geometry, a multigroup automatic weighting scheme, and complete treatment of nuclear evaluation files; a simulation implementation, including a strategy for parallelization of the code, is presented. Some benchmark calculations are also presented. 7 figs., 5 tabs., 4 refs.

  7. Measurements of file transfer rates over dedicated long-haul connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S [ORNL; Settlemyer, Bradley W [ORNL; Imam, Neena [ORNL; Hinkel, Gregory Carl [ORNL

    2016-01-01

    Wide-area file transfers are an integral part of several High-Performance Computing (HPC) scenarios. Dedicated network connections with high capacity, low loss rate and low competing traffic, are increasingly being provisioned over current HPC infrastructures to support such transfers. To gain insights into these file transfers, we collected transfer rate measurements for Lustre and xfs file systems between dedicated multi-core servers over emulated 10 Gbps connections with round trip times (rtt) in 0-366 ms range. Memory transfer throughput over these connections is measured using iperf, and file IO throughput on host systems is measured using xddprof. We consider two file system configurations: Lustre over IB network and xfs over SSD connected to PCI bus. Files are transferred using xdd across these connections, and the transfer rates are measured, which indicate the need to jointly optimize the connection and host file IO parameters to achieve peak transfer rates. In particular, these measurements indicate that (i) peak file transfer rate is lower than peak connection and host IO throughput, in some cases by as much as 50% or lower, (ii) xdd request sizes that achieve peak throughput for host file IO do not necessarily lead to peak file transfer rates, and (iii) parallelism in host IO and TCP transport does not always improve the file transfer rates.

  8. Cloud Based Log file analysis with Interactive Spark as a Service

    Directory of Open Access Journals (Sweden)

    Nurudeen Sherif

    2016-07-01

    Full Text Available Software applications are usually programmed to generate auxiliary text files referred to as log files. Such files are used throughout various stages of software development, primarily for debugging and identification of errors. Use of log files makes debugging easier during testing: it permits following the logic of the program, at a high level, without having to run it in debug mode. Nowadays, log files are commonly used at commercial software installations for permanent software observation and fine-tuning. Log files have become a standard part of software applications and are essential in operating systems, networks and distributed systems. Log files are often the only way to identify and locate errors in a software application, because the probe effect has no influence on log file analysis. Log files are usually massive and may have an intricate structure. Although the process of generating log files is quite easy and simple, log file analysis can be a tremendous task that requires immense computing resources and complex procedures.
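    As a concrete, minimal illustration of Spark applied to log files (an assumption of this edit, not code from the article; the file path and the "ERROR" marker are placeholders), the sketch below reads a line-oriented log into a Spark DataFrame and counts error lines:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes a local Spark installation; "app.log" is an illustrative path.
spark = SparkSession.builder.appName("log-analysis").getOrCreate()

logs = spark.read.text("app.log")  # one row per log line, column "value"
errors = logs.filter(F.col("value").contains("ERROR"))

print("total lines :", logs.count())
print("error lines :", errors.count())
errors.show(10, truncate=False)

spark.stop()
```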

  9. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  10. Building Hot Snapshot Copy Based on Windows File System

    Institute of Scientific and Technical Information of China (English)

    WANG Lina; GUO Chi; WANG Dejun; ZHU Qin

    2006-01-01

    This paper describes a method for building a hot snapshot copy based on the Windows file system (HSCF). The architecture and running mechanism of HSCF are discussed after a comparison with other online backup technologies. HSCF, based on a file system filter driver, protects computer data and ensures their integrity and consistency with the following three steps: access to open files, synchronization, and copy-on-write. Its strategies for improving system performance are analyzed, including priority setting, incremental snapshots, and load balancing. HSCF is a new kind of snapshot technology for solving the data integrity and consistency problem in online backup, which is different from other storage-level snapshots and Open File Solution.

  11. Data file, Continental Margin Program, Atlantic Coast of the United States: vol. 2 sample collection and analytical data

    Science.gov (United States)

    Hathaway, John C.

    1971-01-01

    The purpose of the data file presented below is twofold: the first purpose is to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; the second purpose is to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of information or derivatives from the information may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.

  12. Location and analyses of sediment samples collected by the U.S. Geological Survey in 2015 along the Delmarva Peninsula, MD and VA (Esri point shapefile and CSV file, Geographic, WGS 84).

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Delmarva Peninsula is a 220-kilometer-long headland, spit, and barrier island complex that was significantly affected by Hurricane Sandy in the fall of 2012. The...

  13. Text Files of the DGPS Navigation Logged with HYPACK Software on U.S. Geological Survey (USGS) Cruise 2011-006-FA from June 13 to June 21, 2011

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS, in cooperation with NOAA, is producing detailed maps of the seafloor off southern New England. The current phase of this cooperative research program is...

  14. CAPENORTH_GEO4M_XYZ.TXT: ASCII formatted file of the 4-m bathymetry from the northern half of USGS survey 98015 of the Sea Floor off Eastern Cape Cod (Geographic)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set includes bathymetry of the sea floor offshore of eastern Cape Cod, Massachusetts. The data were collected with a multibeam sea floor mapping system...

  15. ASCII formatted file of the 4-m bathymetry from the northern half of USGS survey 98015 of the Sea Floor off Eastern Cape Cod (CAPENORTH_GEO4M_XYZ.TXT, Geographic, NAD83)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set includes bathymetry of the sea floor offshore of eastern Cape Cod, Massachusetts. The data were collected with a multibeam sea floor mapping system...

  16. Single-Beam Bathymetry Data Collected in 2015 nearshore Dauphin Island, Alabama, U.S. Geological Survey (USGS). This metadata file is specific to the International Terrestrial Reference Frame 2000 (ITRF00) xyz point data.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Dauphin Island, Alabama is a barrier island located in the Gulf of Mexico that supports local residence, tourism, commercial infrastructure, and the historical Fort...

  17. ASCII formatted file of the 4-m bathymetry from the southern half of USGS Survey 98015 of the Sea Floor off Eastern Cape Cod (CAPESOUTH_GEO4M_XYZ.TXT, Geographic, NAD83)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set includes bathymetry of the sea floor offshore of eastern Cape Cod, Massachusetts. The data were collected with a multibeam sea floor mapping system...

  18. Location and analysis of sediment samples collected by the U.S. Geological Survey in 2014 along the Delmarva Peninsula, MD and VA (Esri point shapefile and CSV file, Geographic, WGS 84)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Delmarva Peninsula is a 220-kilometer-long headland, spit, and barrier island complex that was significantly affected by Hurricane Sandy. A U.S. Geological...

  20. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal (rm)

  1. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    2012-01-01

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal, a

  2. The File. Case Study in Correction (1977-1979).

    Science.gov (United States)

    Lang, Serge

    In reaction to the 1977 Survey of the American Professoriate by Everett C. Ladd and Seymour Martin Lipset, Serge Lang began a file of letters expressing concern over and opposition to the study. The controversy began when Lang wrote a letter to the researchers expressing his opposition, with copies to members of the academic community,…

  4. Evaluating audio computer assisted self-interviews in urban south African communities: evidence for good suitability and reduced social desirability bias of a cross-sectional survey on sexual behaviour

    Directory of Open Access Journals (Sweden)

    Beauclair Roxanne

    2013-01-01

    Full Text Available Abstract Background Efficient HIV prevention requires accurate identification of individuals with risky sexual behaviour. However, self-reported data from sexual behaviour surveys are prone to social desirability bias (SDB). Audio Computer-Assisted Self-Interviewing (ACASI) has been suggested as an alternative to face-to-face interviewing (FTFI), because it may promote interview privacy and reduce SDB. However, little is known about the suitability and accuracy of ACASI in urban communities with high HIV prevalence in South Africa. To test this, we conducted a sexual behaviour survey in Cape Town, South Africa, using ACASI methods. Methods Participants (n = 878) answered questions about their sexual relationships on a touch screen computer in a private mobile office. We included questions at the end of the ACASI survey that were used to assess participants’ perceived ease of use, privacy, and truthfulness. Univariate logistic regression models, supported by multivariate models, were applied to identify groups of people who had adverse interviewing experiences. Further, we constructed male–female ratios of self-reported sexual behaviours as indicators of SDB. We used these indicators to compare SDB in our survey and in recent FTFI-based Demographic and Health Surveys (DHSs) from Lesotho, Swaziland, and Zimbabwe. Results Most participants found our methods easy to use (85.9%), perceived privacy (96.3%) and preferred ACASI to other modes of inquiry (82.5%) when reporting on sexual behaviours. Unemployed participants and those in the 40–70 year old age group were the least likely to find our methods easy to use (OR 0.69; 95% CI: 0.47–1.01 and OR 0.37; 95% CI: 0.23–0.58, respectively). In our survey, the male–female ratio for reporting >2 sexual partners in the past year, a concurrent relationship in the past year, and >2 sexual partners in a lifetime was 3.4, 2.6, and 1.2, respectively, far lower than the ratios observed in the Demographic and Health Surveys

  5. Computational Chemistry Data Management Platform Based on the Semantic Web.

    Science.gov (United States)

    Wang, Bing; Dobosh, Paul A; Chalk, Stuart; Sopek, Mirek; Ostlund, Neil S

    2017-01-12

    This paper presents a formal data publishing platform for computational chemistry using semantic web technologies. This platform encapsulates computational chemistry data from a variety of packages in an Extensible Markup Language (XML) file called CSX (Common Standard for eXchange). On the basis of a Gainesville Core (GC) ontology for computational chemistry, a CSX XML file is converted into the JavaScript Object Notation for Linked Data (JSON-LD) format using an XML Stylesheet Language Transformation (XSLT) file. Ultimately the JSON-LD file is converted to subject-predicate-object triples in a Turtle (TTL) file and published on the web portal. By leveraging semantic web technologies, we are able to place computational chemistry data onto web portals as a component of a Giant Global Graph (GGG) such that computer agents, as well as individual chemists, can access the data.
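    To make the JSON-LD-to-Turtle step concrete, here is a hedged Python sketch using the rdflib library (an assumed tool choice for this illustration; the paper's own pipeline uses XSLT and the CSX schema, and the gc: vocabulary URI below is invented):

```python
from rdflib import Graph

# A minimal JSON-LD document standing in for a CSX-derived record.
# Requires rdflib >= 6, which bundles JSON-LD support.
jsonld = """
{
  "@context": {"gc": "http://example.org/gainesville-core#"},
  "@id": "http://example.org/calc/42",
  "gc:package": "NWChem",
  "gc:totalEnergy": -76.026
}
"""

g = Graph()
g.parse(data=jsonld, format="json-ld")
print(g.serialize(format="turtle"))  # subject-predicate-object triples
```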

  6. JavaFIRE: A Replica and File System for Grids

    Science.gov (United States)

    Petek, Marko; da Silva Gomes, Diego; Resin Geyer, Claudio Fernando; Santoro, Alberto; Gowdy, Stephen

    2012-12-01

    The work is focused on the creation of a replica and file transfer system for Computational Grids, inspired by the needs of High Energy Physics (HEP). Due to the high volume of data created by the HEP experiments, an efficient file and dataset replica system may play an important role in the computing model. Data replica systems allow the creation of copies distributed between the different storage elements on the Grid. In the HEP context, the data files are basically immutable. This eases the task of the replica system, because given sufficient local storage resources any dataset just needs to be replicated to a particular site once. Concurrent with the advent of computational Grids, another important theme in the distributed systems area that has also seen significant interest is that of peer-to-peer networks (p2p). P2p networks are an important and evolving mechanism that eases the use of distributed computing and storage resources by end users. One common technique to achieve faster file downloads from possibly overloaded storage elements over congested networks is to split the files into smaller pieces. This way, each piece can be transferred from a different replica, in parallel or not, optimizing the moments when the network conditions are better suited to the transfer. The main tasks achieved by the system are: the creation of replicas, the development of a system for replica transfers (RFT) and for replica location (RLS) with a different architecture than the one provided by Globus, and the development of a system for file transfer in pieces on computational grids with interfaces for several storage elements. The RLS uses a p2p overlay based on the Kademlia algorithm.
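    The piece-wise transfer idea can be sketched briefly. The following hypothetical Python fragment (not from the JavaFIRE code base) splits a file into fixed-size pieces and reassembles them, the local analogue of fetching each piece from a different replica:

```python
def split_file(path, piece_size=1 << 20):
    """Split a file into numbered pieces so each piece could be fetched
    from a different replica (the transfer itself is not shown here)."""
    pieces = []
    with open(path, "rb") as src:
        index = 0
        while data := src.read(piece_size):
            piece = f"{path}.part{index:04d}"
            with open(piece, "wb") as dst:
                dst.write(data)
            pieces.append(piece)
            index += 1
    return pieces

def join_pieces(pieces, out_path):
    """Reassemble the pieces, in order, into the original file."""
    with open(out_path, "wb") as dst:
        for piece in pieces:
            with open(piece, "rb") as src:
                dst.write(src.read())
```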

  7. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    Science.gov (United States)

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), a recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64). Ownership of a computer, participation in research, and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.

  8. 43 CFR 4.1371 - Who may file, where to file, when to file.

    Science.gov (United States)

    2010-10-01

    ... of proposed suspension or rescission under 30 CFR 773.22 or a notice of suspension or rescission under 30 CFR 773.23 may file a request for review with the Hearings Division, Office of Hearings and...

  9. 78 FR 17394 - Filing via the Internet; Electronic Tariff Filings; Revisions to Electric Quarterly Report Filing...

    Science.gov (United States)

    2013-03-21

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Filing via the Internet; Electronic Tariff Filings; Revisions to Electric Quarterly Report Filing Process; Notice of Technical Conference Take notice that on April 16, 2013,...

  10. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18....

  11. Preprocessor and postprocessor computer programs for a radial-flow finite-element model

    Science.gov (United States)

    Pucci, A.A.; Pope, D.A.

    1987-01-01

    Preprocessing and postprocessing computer programs that enhance the utility of the U.S. Geological Survey radial-flow model have been developed. The preprocessor program: (1) generates a triangular finite element mesh from minimal data input, (2) produces graphical displays and tabulations of data for the mesh, and (3) prepares an input data file to use with the radial-flow model. The postprocessor program is a version of the radial-flow model, which was modified to (1) produce graphical output for simulation and field results, (2) generate a statistic for comparing the simulation results with observed data, and (3) allow hydrologic properties to vary in the simulated region. Examples of the use of the processor programs for a hypothetical aquifer test are presented. Instructions for the data files, format instructions, and a listing of the preprocessor and postprocessor source codes are given in the appendixes. (Author's abstract)
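    To illustrate the preprocessor's first task, generating a triangular mesh from minimal input, here is a hypothetical Python sketch (not the USGS program itself) that lattices an (r, z) grid and splits each rectangular cell into two triangles:

```python
def triangulate_radial_grid(radii, depths):
    """Build a simple triangular mesh over a radial (r, z) grid:
    nodes on a rectangular lattice, each cell split into two triangles.
    Illustrative only; real mesh generators grade element sizes."""
    nodes = [(r, z) for z in depths for r in radii]
    ncols = len(radii)
    elements = []
    for j in range(len(depths) - 1):
        for i in range(ncols - 1):
            n0 = j * ncols + i          # lower-left node of the cell
            n1, n2, n3 = n0 + 1, n0 + ncols, n0 + ncols + 1
            elements.append((n0, n1, n3))
            elements.append((n0, n3, n2))
    return nodes, elements

nodes, elements = triangulate_radial_grid([0.0, 10.0, 50.0], [0.0, -5.0, -15.0])
print(len(nodes), "nodes,", len(elements), "triangles")  # 9 nodes, 8 triangles
```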

  12. A History of the Andrew File System

    CERN Document Server

    CERN. Geneva; Altman, Jeffrey

    2011-01-01

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  13. FORMATION OF A COMPUTER SECURITY POLICY BOARD

    CERN Multimedia

    2001-01-01

    In view of the increasing number of security incidents at CERN, the Directorate has set up a Computer Security Policy Board. Information about the mandate and the meetings of the Board is linked from http://cern.ch/security, which is the entry point for computer security information at CERN. File Services Computing Rule: The use of CERN's computing facilities is governed by Operational Circular No 5 and its subsidiary rules. To protect file servers at CERN from unauthorised use, the Organization has issued a new subsidiary rule related to file services. Details of this rule and of the complete set of rules applicable to the use of CERN computing facilities are available at http://cern.ch/ComputingRules.

  14. FORMATION OF A COMPUTER SECURITY POLICY BOARD

    CERN Multimedia

    2001-01-01

    In view of the increasing number of security incidents at CERN, the Directorate has set up a Computer Security Policy Board. Information about the mandate and the meetings of the Board is linked from http://cern.ch/security, which is the entry point for computer security information at CERN. FILE SERVICES COMPUTING RULE The use of CERN's Computing facilities are governed by Operational Circular No 5 and its subsidiary rules. To protect file servers at CERN from unauthorised use, the Organization has issued a new subsidiary rule related to file services. Details hereof and of the complete set of rules applicable to the use of CERN computing facilities are available at http://cern.ch/ComputingRules.

  14. Download this PDF file

    African Journals Online (AJOL)

    Buchi

    Method: The case files of the 225 maternal deaths which occurred between ... cause related to or aggravated by the pregnancy or its ... Multiple factors have been suggested for this fall ... responsible for only 2 (0.9%) of deaths. In other ...

  15. Download this PDF file

    African Journals Online (AJOL)

    User

    MARC (machine-readable catalogue) format. The records are ... software in the library, while in Web OPAC, HTML files are used ... training in other fields of applied learning that is relevant to the ... the Libraries of Engineering Colleges in ...

  16. Download this PDF file

    African Journals Online (AJOL)

    Fr. Ikenga

    Nigeria; and the gains and challenges of utilizing e-taxation in tax ... teething problems usually encountered in all new schemes ... is to help tax authorities reduce and possibly eliminate tax evasion ... issues that include how to use the tax authorities' electronic tax filing system and ... financial institution for Direct Deposit.

  17. The New Resource File

    Science.gov (United States)

    Luck, Donald D.

    2011-01-01

    The development of the resource file is a common experience in teacher preparation programs. The author examines strategies for transforming what has been a project composed of physical resources to one emphasizing digital resources. Methods for finding, tagging, storing and retrieving resources are explored.

  18. FROM CAD MODEL TO 3D PRINT VIA “STL” FILE FORMAT

    National Research Council Canada - National Science Library

    Cătălin IANCU; Daniela IANCU; Alin STĂNCIOIU

    2010-01-01

    The paper presents the STL file format, which is now used for transferring information from CAD software to a 3D printer, for obtaining the solid model in Rapid Prototyping and Computer Aided Manufacturing...
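
    For readers unfamiliar with the format: an ASCII STL file is simply a list of triangular facets, each carrying an outward unit normal, bracketed by solid/endsolid lines. The sketch below is illustrative rather than taken from the paper; it writes a minimal single-facet STL file, whereas real CAD exporters tessellate the whole model and typically emit the more compact binary variant.

        # A minimal sketch of an ASCII STL writer.
        import numpy as np

        def write_ascii_stl(path, name, facets):
            """facets: iterable of 3x3 vertex arrays, vertices in CCW order."""
            with open(path, "w") as f:
                f.write(f"solid {name}\n")
                for v in facets:
                    v = np.asarray(v, dtype=float)
                    nrm = np.cross(v[1] - v[0], v[2] - v[0])  # facet normal
                    nrm /= np.linalg.norm(nrm)
                    f.write(f"  facet normal {nrm[0]:e} {nrm[1]:e} {nrm[2]:e}\n")
                    f.write("    outer loop\n")
                    for x, y, z in v:
                        f.write(f"      vertex {x:e} {y:e} {z:e}\n")
                    f.write("    endloop\n  endfacet\n")
                f.write(f"endsolid {name}\n")

        # One right triangle in the x-y plane.
        write_ascii_stl("demo.stl", "demo", [[(0, 0, 0), (1, 0, 0), (0, 1, 0)]])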

  19. National Survey on Drug Use and Health: 8-Year R-DAS (NSDUH-2002-2009)

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file includes data from the 2002 through 2009 administrations of the National Survey on Drug Use and Health (NSDUH). The only variables included in the 8-year 2002-2009 data...