WorldWideScience

Sample records for technology database desktop

  1. Remote desktop with HTML5 technology

    OpenAIRE

    Banič, Žiga

    2013-01-01

In our thesis we implemented a remote desktop using HTML5, the newest web technology. The main reason we chose this technology is that it makes the system accessible from many different computing platforms, and current web browsers provide proper support for it. With the new canvas element we can display and manipulate the remote computer's screen. To control the system, we simply add event listeners for mouse, keyboard, and touch events on the element. The WebSockets technology served as...
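The input-forwarding idea described in the abstract (event listeners serialize input, the server replays it) can be sketched as a minimal message protocol. This is an illustrative sketch only, assuming JSON text frames over the WebSocket; the function and field names are invented, not taken from the thesis.

```python
import json

# Hypothetical frames a remote-desktop client might send over a WebSocket:
# the page's event listeners serialize input events, the server decodes and
# replays them on the remote machine. All names here are illustrative.
def encode_event(kind: str, **fields) -> str:
    """Serialize an input event (mouse/keyboard/touch) to a JSON text frame."""
    return json.dumps({"type": kind, **fields})

def decode_event(frame: str) -> dict:
    """Parse a frame back into an event dict on the server side."""
    return json.loads(frame)

frame = encode_event("mousemove", x=120, y=45)
event = decode_event(frame)
print(event["type"], event["x"], event["y"])
```

A real implementation would also carry screen updates in the opposite direction, drawing them onto the canvas element.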

  2. MICA: desktop software for comprehensive searching of DNA databases

    Directory of Open Access Journals (Sweden)

    Glick Benjamin S

    2006-10-01

Full Text Available Abstract Background Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion MICA is suitable as a search engine for desktop DNA analysis software.
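The core ideas in this abstract (a k-mer index over the sequence, plus resolution of partially degenerate IUPAC queries) can be illustrated with a toy sketch. This is not MICA's actual compact-array implementation, just the general technique; the query length here must equal the index's k.

```python
from itertools import product

# Toy k-mer index in the spirit of MICA (not its actual data structure):
# map every k-mer to its positions, then resolve a partially degenerate
# IUPAC query by expanding it into its concrete k-mers.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "Y": "CT",
         "S": "CG", "W": "AT", "K": "GT", "M": "AC", "B": "CGT",
         "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def build_index(seq: str, k: int) -> dict:
    """Index every k-mer of seq by its start positions."""
    index = {}
    for i in range(len(seq) - k + 1):
        index.setdefault(seq[i:i + k], []).append(i)
    return index

def search(index: dict, query: str) -> list:
    """Sorted positions of all exact matches of a (degenerate) length-k query."""
    hits = []
    for kmer in ("".join(p) for p in product(*(IUPAC[c] for c in query))):
        hits.extend(index.get(kmer, []))
    return sorted(hits)

idx = build_index("ACGTACGT", 3)
print(search(idx, "ACN"))  # → [0, 4]
```

MICA's advantage over a plain dictionary like this is its ~2L-byte compact-array layout, which lets searches read only a small fraction of the index from disk.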

  3. A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories

    Science.gov (United States)

    Brown, Christa L.

    National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.

  4. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  5. From Server to Desktop: Capital and Institutional Planning for Client/Server Technology.

    Science.gov (United States)

    Mullig, Richard M.; Frey, Keith W.

    1994-01-01

    Beginning with a request for an enhanced system for decision/strategic planning support, the University of Chicago's biological sciences division has developed a range of administrative client/server tools, instituted a capital replacement plan for desktop technology, and created a planning and staffing approach enabling rapid introduction of new…

  6. Desktop Simulation: Towards a New Strategy for Arts Technology Education

    Science.gov (United States)

    Eidsheim, Nina Sun

    2009-01-01

    For arts departments in many institutions, technology education entails prohibitive equipment costs, maintenance requirements and administrative demands. There are also inherent pedagogical challenges: for example, recording studio classes where, due to space and time constraints, only a few students in what might be a large class can properly…

  7. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    Science.gov (United States)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  8. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database

    OpenAIRE

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both pen stylus and also t...

  9. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

This contribution shows what the workplace of the future might look like and where the Semantic Web opens up new possibilities. To this end, approaches from the fields of Semantic Web, knowledge representation, desktop applications, and visualization are presented that allow us to reinterpret and reuse a user's existing data. The combination of Semantic Web and desktop computers brings particular advantages, a paradigm known under the title Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, but can equally be used in web applications.

  10. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs).The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focus on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  11. Desktop Virtual Reality: A Powerful New Technology for Teaching and Research in Industrial Teacher Education

    Science.gov (United States)

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2004-01-01

    Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…

  12. Desktop Genetics.

    Science.gov (United States)

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-11-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  13. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

Full Text Available As a high-level development environment, the Java technologies support the development of distributed, platform-independent applications, providing a robust set of methods for accessing databases and for creating software components on both the server side and the client side. Analyzing the evolution of Java data-access tools, we notice that they evolved from simple methods permitting the query, insertion, update, and deletion of data to advanced implementations such as distributed transactions, cursors, and batch files. Client-server architectures allow, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) instructions and the manipulation of results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to issue SQL queries to any DBMS (Database Management System). The native JDBC driver, the ODBC (Open Database Connectivity)-JDBC bridge, and the classes and interfaces of the JDBC API are described. The four steps needed to build a JDBC-driven application are presented briefly, emphasizing how each step is accomplished and the expected results. At each step, the characteristics of the database systems and the way the JDBC programming interface adapts to each are evaluated. The data types provided by the SQL2 and SQL3 standards are analyzed by comparison with the Java data types, emphasizing the discrepancies between them, as well as the methods of the ResultSet object that allow conversion between different data types. Next, starting from the role of metadata and studying the Java programming interfaces that allow querying result sets, we describe the advanced features of data mining with JDBC. As an alternative to result sets, RowSets add new functionalities that
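The connect / execute / iterate / inspect-metadata flow the article describes for JDBC has a direct analogue in Python's DB-API, which can illustrate the same pattern without a JVM. This sketch uses an in-memory SQLite database; the table and column names are invented for illustration.

```python
import sqlite3

# The JDBC pattern (Connection -> Statement -> ResultSet, plus result-set
# metadata) expressed in Python DB-API terms. Illustrative schema only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE proposals (id INTEGER PRIMARY KEY, title TEXT)")
cur.executemany("INSERT INTO proposals (title) VALUES (?)",
                [("Galaxy survey",), ("Exoplanet search",)])
conn.commit()

# Query and iterate, as one would over a JDBC ResultSet.
cur.execute("SELECT id, title FROM proposals ORDER BY id")
columns = [d[0] for d in cur.description]  # metadata, like ResultSetMetaData
rows = cur.fetchall()
print(columns)  # → ['id', 'title']
print(rows[0])  # → (1, 'Galaxy survey')
conn.close()
```

In JDBC the same steps map to `DriverManager.getConnection`, `Statement.executeQuery`, `ResultSet.next`, and `ResultSetMetaData`.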

  14. Desktop-Virtualisierung

    Science.gov (United States)

    von Liebisch, Daniel

The subject of this article is desktop virtualization, its advantages for managing desktops, and Citrix XenDesktop as a complete solution for desktop virtualization. The following aspects are considered: the challenges of desktop management in environments without desktop virtualization

  15. "Mr. Database" : Jim Gray and the History of Database Technologies.

    Science.gov (United States)

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e. g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  16. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  17. Designing Corporate Databases to Support Technology Innovation

    Science.gov (United States)

    Gultz, Michael Jarett

    2012-01-01

    Based on a review of the existing literature on database design, this study proposed a unified database model to support corporate technology innovation. This study assessed potential support for the model based on the opinions of 200 technology industry executives, including Chief Information Officers, Chief Knowledge Officers and Chief Learning…

  18. XML technology planning database : lessons learned

    Science.gov (United States)

    Some, Raphael R.; Neff, Jon M.

    2005-01-01

A hierarchical Extensible Markup Language (XML) database called XCALIBR (XML Capability Analysis LIBRary) has been developed by the New Millennium Program to assist in technology return on investment (ROI) analysis and technology portfolio optimization. The database contains mission requirements and technology capabilities, which are related by use of an XML dictionary. The XML dictionary codifies a standardized taxonomy for space missions, systems, subsystems and technologies. In addition to being used for ROI analysis, the database is being examined for use in project planning, tracking and documentation. During the past year, the database has moved from development into alpha testing. This paper describes the lessons learned during construction and testing of the prototype database and the motivation for moving from an XML taxonomy to a standard XML-based ontology.
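The abstract's central idea, relating mission requirements to technology capabilities through an XML dictionary, can be sketched in a few lines. The element and attribute names below are invented for illustration and are not XCALIBR's actual schema.

```python
import xml.etree.ElementTree as ET

# A minimal mission/technology taxonomy: an XML "dictionary" relating
# mission requirements to technology capabilities. Illustrative names only.
doc = ET.fromstring("""
<taxonomy>
  <mission name="EarthObs">
    <requirement capability="propulsion"/>
    <requirement capability="imaging"/>
  </mission>
  <technology name="SolarSail" capability="propulsion" trl="5"/>
  <technology name="HyperSpec" capability="imaging" trl="6"/>
</taxonomy>
""")

def technologies_for(mission_name: str) -> list:
    """Match a mission's required capabilities against technology entries."""
    needed = {r.get("capability")
              for m in doc.iter("mission") if m.get("name") == mission_name
              for r in m.iter("requirement")}
    return sorted(t.get("name") for t in doc.iter("technology")
                  if t.get("capability") in needed)

print(technologies_for("EarthObs"))  # → ['HyperSpec', 'SolarSail']
```

Moving from such an ad-hoc taxonomy to a standard XML-based ontology, as the paper motivates, would replace these invented tags with shared, formally defined terms.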

  19. Evolution of Database Replication Technologies for WLCG

    CERN Document Server

    Baranowski, Zbigniew; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience on database replication technologies used at WLCG and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvement in this area in recent past has been the introduction of Oracle GoldenGate as a replacement of Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 database administrators, including the experience from running Oracle GoldenGate in production. Moreover, we report on another key technology in this area: Oracle Active Data Guard which has been adopted in several of the mission critical use cases for database replication between online and offline databases for the LHC experiments.

  20. Training Database Technology in DBMS MS Access

    Directory of Open Access Journals (Sweden)

    Nataliya Evgenievna Surkova

    2015-05-01

Full Text Available The article describes methodological issues in teaching relational database technology and relational database management systems, with Microsoft Access as the primer DBMS. This methodology allows forming general cultural competences, such as command of the main methods, ways, and means of producing, storing, and processing information, and computer skills as a means of managing information. Professional competences must also be formed, such as the ability to collect, analyze, and process the data necessary for solving professional tasks, and the ability to use modern information technology solutions for analytical and research tasks.
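The kind of relational exercise such a course builds in Access, two tables linked by a key and a join across them, can be expressed in portable SQL. The schema and data below are invented for illustration; SQLite stands in for Access here.

```python
import sqlite3

# A classic teaching exercise: two tables related by a foreign key,
# queried with a join. Illustrative schema and data, run on SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE grades (student_id INTEGER REFERENCES students(id),
                     course TEXT, grade INTEGER);
INSERT INTO students VALUES (1, 'Ivanova'), (2, 'Petrov');
INSERT INTO grades VALUES (1, 'Databases', 5), (2, 'Databases', 4);
""")
rows = conn.execute("""
    SELECT s.name, g.grade FROM students s
    JOIN grades g ON g.student_id = s.id
    WHERE g.course = 'Databases' ORDER BY s.name
""").fetchall()
print(rows)  # → [('Ivanova', 5), ('Petrov', 4)]
conn.close()
```

In Access the same relationship would be drawn in the Relationships window and the join built in the visual query designer.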

  1. Desktop Virtualization: Applications and Considerations

    Science.gov (United States)

    Hodgman, Matthew R.

    2013-01-01

    As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…

  2. Evolution of Database Replication Technologies for WLCG

    OpenAIRE

    Baranowski, Zbigniew; Pardavila, Lorena Lobato; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience on database replication technologies used at WLCG and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvement in this area in recent past has been the introduction of Oracle GoldenGate as a replacement of Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 databas...

  3. Solar Sail Propulsion Technology Readiness Level Database

    Science.gov (United States)

    Adams, Charles L.

    2004-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring 2 solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem and system fabrication and testing has been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30 meter diameter thermal-vacuum chamber at NASA Glenn Plum Brook in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to-date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definition and current TRL assessment, as stored and manipulated within the database are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the Solar Sail community through the Space Transportation Information Network (STIN).

  4. Conductive Carbon Nanotube Inks for Use with Desktop Inkjet Printing Technology

    Science.gov (United States)

    Roberson, Luke; Williams, Martha; Tate, LaNetra; Fortier, Craig; Smith, David; Davia, Kyle; Gibson, Tracy; Snyder, Sarah

    2013-01-01

    Inkjet printing is a common commercial process. In addition to the familiar use in printing documents from computers, it is also used in some industrial applications. For example, wire manufacturers are required by law to print the wire type, gauge, and safety information on the exterior of each foot of manufactured wire, and this is typically done with inkjet or laser printers. The goal of this work was the creation of conductive inks that can be applied to a wire or flexible substrates via inkjet printing methods. The use of inkjet printing technology to print conductive inks has been in testing for several years. While researchers have been able to get the printing system to mechanically work, the application of conductive inks on substrates has not consistently produced adequate low resistances in the kilohm range. Conductive materials can be applied using a printer in single or multiple passes onto a substrate including textiles, polymer films, and paper. The conductive materials are composed of electrical conductors such as carbon nanotubes (including functionalized carbon nanotubes and metal-coated carbon nanotubes); graphene, a polycyclic aromatic hydrocarbon (e.g., pentacene and bisperipentacene); metal nanoparticles; inherently conductive polymers (ICP); and combinations thereof. Once the conductive materials are applied, the materials are dried and sintered to form adherent conductive materials on the substrate. For certain formulations, increased conductivity can be achieved by printing on substrates supported by low levels of magnetic field alignment. The adherent conductive materials can be used in applications such as damage detection, dust particle removal, smart coating systems, and flexible electronic circuitry. By applying alternating layers of different electrical conductors to form a layered composite material, a single homogeneous layer can be produced with improved electrical properties. It is believed that patterning alternate layers of

  5. Instant Citrix XenDesktop 5 starter

    CERN Document Server

    Magdy, Mahmoud

    2013-01-01

Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This easy-to-follow, hands-on guide shows you how to implement desktop virtualization with real-life cases and step-by-step instructions. It is a tutorial with step-by-step instructions and adequate screenshots for the installation and administration of Citrix XenDesktop. If you are new to XenDesktop or are looking to build your skills in desktop virtualization, this is your step-by-step guide to learning Citrix XenDesktop. For those architects a

  6. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers while continuing to manage the problems already mentioned. Geographic information systems are no exception and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve them. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. When considering the applicability of relational database technology to GIS, an important distinction needs to be made between the storage and management of geographic data and the manipulation and analysis of geographic data.
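The paper's point that ordinary relational storage can serve geographic data can be illustrated with a spatial selection expressed as plain SQL predicates. The schema below is invented for illustration and is not geoManager's actual storage model.

```python
import sqlite3

# Point features stored in an ordinary relational table; a bounding-box
# query becomes plain comparison predicates. Illustrative schema only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (name TEXT, x REAL, y REAL)")
conn.executemany("INSERT INTO features VALUES (?, ?, ?)",
                 [("well", 2.0, 3.0), ("pump", 8.5, 1.0), ("valve", 4.2, 4.8)])

def in_bbox(xmin, ymin, xmax, ymax):
    """Spatial selection as ordinary relational predicates."""
    return conn.execute(
        "SELECT name FROM features WHERE x BETWEEN ? AND ? "
        "AND y BETWEEN ? AND ? ORDER BY name",
        (xmin, xmax, ymin, ymax)).fetchall()

result = in_bbox(0, 0, 5, 5)
print(result)  # → [('valve',), ('well',)]
conn.close()
```

The extensions the paper highlights (spatial indexing, geometry types) exist precisely because such linear scans over comparison predicates do not scale to large geographic datasets.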

  7. Research and implementation of a Web-based remote desktop image monitoring system

    International Nuclear Information System (INIS)

    Ren Weijuan; Li Luofeng; Wang Chunhong

    2010-01-01

An ISS (Image Snapshot Server) system based on the Web was studied and implemented using Java Web technology. The ISS system consists of a client web browser and a server. The server part can be divided into three modules: the screenshot software, the web server, and an Oracle database. The screenshot software captures the desktop environment of the remotely monitored PC and sends the pictures to a Tomcat web server for real-time display on the web. At the same time, the pictures are saved in an Oracle database. Through a web browser, monitoring personnel can view both real-time and historical desktop pictures of the monitored PC for a given period. It is very convenient for any user to monitor the desktop image of a remote PC. (authors)

  8. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  9. Linux Desktop Pocket Guide

    CERN Document Server

    Brickner, David

    2005-01-01

    While Mac OS X garners all the praise from pundits, and Windows XP attracts all the viruses, Linux is quietly being installed on millions of desktops every year. For programmers and system administrators, business users, and educators, desktop Linux is a breath of fresh air and a needed alternative to other operating systems. The Linux Desktop Pocket Guide is your introduction to using Linux on five of the most popular distributions: Fedora, Gentoo, Mandriva, SUSE, and Ubuntu. Despite what you may have heard, using Linux is not all that hard. Firefox and Konqueror can handle all your web bro

  10. Revolutionary Database Technology for Data Intensive Research

    NARCIS (Netherlands)

    Kersten, M.; Manegold, S.

    2012-01-01

    The ability to explore huge digital resources assembled in data warehouses, databases and files, at unprecedented speed, is becoming the driver of progress in science. However, existing database management systems (DBMS) are far from capable of meeting the scientists' requirements. The Database

  11. Towards P2P XML Database Technology

    NARCIS (Netherlands)

    Y. Zhang (Ying)

    2007-01-01

    textabstractTo ease the development of data-intensive P2P applications, we envision a P2P XML Database Management System (P2P XDBMS) that acts as a database middle-ware, providing a uniform database abstraction on top of a dynamic set of distributed data sources. In this PhD work, we research which

  12. Applying artificial intelligence to astronomical databases - a survey of applicable technology.

    Science.gov (United States)

    Rosenthal, D. A.

    This paper surveys several emerging technologies which are relevant to astronomical database issues such as interface technology, internal database representation, and intelligent data reduction aids. Among the technologies discussed are natural language understanding, frame and object representations, planning, pattern analysis, machine learning and the nascent study of simulated neural nets. These techniques will become increasingly important for astronomical research, and in particular, for applications with large databases.

  13. Revolutionary Database Technology for Data Intensive Research

    NARCIS (Netherlands)

    M.L. Kersten (Martin); S. Manegold (Stefan)

    2012-01-01

    textabstractThe ability to explore huge digital resources assembled in data warehouses, databases and files, at unprecedented speed, is becoming the driver of progress in science. However, existing database management systems (DBMS) are far from capable of meeting the scientists’ requirements.

  14. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    Science.gov (United States)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  15. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  16. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  17. Digital video for the desktop

    CERN Document Server

    Pender, Ken

    1999-01-01

Practical introduction to creating and editing high quality video on the desktop. Using examples from a variety of video applications, benefit from a professional's experience, step-by-step, through a series of workshops demonstrating a wide variety of techniques. These include producing short films, multimedia and internet presentations, animated graphics and special effects. The opportunities for the independent videomaker have never been greater - make sure you bring your understanding fully up to date with this invaluable guide. No prior knowledge of the technology is assumed, with explanati…

  18. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    Science.gov (United States)

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  19. Campus Computing, 1996. The Seventh National Survey of Desktop Computing and Information Technology in American Higher Education.

    Science.gov (United States)

    Green, Kenneth C.

    This report presents the findings of a June, 1996, survey of computing officials at 660 two- and four-year colleges and universities across the United States concerning the use of computer technology on college campuses. The survey found that instructional integration and user support emerged as the two most important information technology (IT)…

  20. Scalable Database Access Technologies for ATLAS Distributed Computing

    CERN Document Server

    Vaniachine, A

    2009-01-01

    ATLAS event data processing requires access to non-event data (detector conditions, calibrations, etc.) stored in relational databases. The database-resident data are crucial for the event data reconstruction processing steps and often required for user analysis. A main focus of ATLAS database operations is on the worldwide distribution of the Conditions DB data, which are necessary for every ATLAS data processing job. Since Conditions DB access is critical for operations with real data, we have developed the system where a different technology can be used as a redundant backup. Redundant database operations infrastructure fully satisfies the requirements of ATLAS reprocessing, which has been proven on a scale of one billion database queries during two reprocessing campaigns of 0.5 PB of single-beam and cosmics data on the Grid. To collect experience and provide input for a best choice of technologies, several promising options for efficient database access in user analysis were evaluated successfully. We pre...

  1. Database mirroring in fault-tolerant continuous technological process control

    Directory of Open Access Journals (Sweden)

    R. Danel

    2015-10-01

Full Text Available This paper describes the implementations of mirroring technology in selected database systems: Microsoft SQL Server, MySQL, and Caché. By simulating critical failures, the systems' behavior and their resilience against failure were tested. The aim was to determine whether database mirroring is suitable for use in continuous metallurgical processes to ensure a fault-tolerant solution at affordable cost. Present-day database systems are characterized by high robustness and are resistant to sudden system failure. Database mirroring technologies are reliable, and even low-budget projects can be provided with a decent fault-tolerant solution. However, the database system technologies available for low-budget projects are not suitable for use in real-time systems.
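The failover behavior exercised by such critical-failure simulations can be illustrated with a minimal sketch. The classes and the synchronous-commit scheme below are illustrative assumptions, not code from any of the systems the paper evaluated:

```python
class DatabaseNode:
    """Toy stand-in for a database server participating in mirroring."""
    def __init__(self, name):
        self.name = name
        self.records = {}
        self.alive = True

    def write(self, key, value):
        if not self.alive:
            raise RuntimeError(f"{self.name} is down")
        self.records[key] = value


class MirroredPair:
    """Synchronous mirroring: a commit succeeds only after the mirror has
    acknowledged the write, so a primary crash loses no committed data."""
    def __init__(self):
        self.primary = DatabaseNode("primary")
        self.mirror = DatabaseNode("mirror")

    def commit(self, key, value):
        self.primary.write(key, value)
        self.mirror.write(key, value)   # synchronous acknowledgement

    def fail_primary(self):
        """Simulate a sudden critical failure: the mirror is promoted
        and a fresh standby takes its place."""
        self.primary.alive = False
        self.primary, self.mirror = self.mirror, DatabaseNode("standby")

    def read(self, key):
        return self.primary.records[key]


pair = MirroredPair()
pair.commit("heat_id", 4711)   # a committed process value
pair.fail_primary()            # sudden primary crash
print(pair.read("heat_id"))    # the committed value survives the failover
```

The same harness can then be extended with asynchronous commits to observe the data-loss window that makes some configurations unsuitable for real-time process control.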

  2. Citrix XenApp 7.5 desktop virtualization solutions

    CERN Document Server

    Paul, Andy

    2014-01-01

    If you are a Citrix® engineer, a virtualization consultant, or an IT project manager with prior experience of using Citrix XenApp® and related technologies for desktop virtualization and want to further explore the power of XenApp® for flawless desktop virtualization, then this book is for you.

  3. Potential use of routine databases in health technology assessment.

    Science.gov (United States)

    Raftery, J; Roderick, P; Stevens, A

    2005-05-01

    To develop criteria for classifying databases in relation to their potential use in health technology (HT) assessment and to apply them to a list of databases of relevance in the UK. To explore the extent to which prioritized databases could pick up those HTs being assessed by the National Coordinating Centre for Health Technology Assessment (NCCHTA) and the extent to which these databases have been used in HT assessment. To explore the validation of the databases and their cost. Electronic databases. Key literature sources. Experienced users of routine databases. A 'first principles' examination of the data necessary for each type of HT assessment was carried out, supplemented by literature searches and a historical review. The principal investigators applied the criteria to the databases. Comments of the 'keepers' of the prioritized databases were incorporated. Details of 161 topics funded by the NHS R&D Health Technology Assessment (HTA) programme were reviewed iteratively by the principal investigators. Uses of databases in HTAs were identified by literature searches, which included the title of each prioritized database as a keyword. Annual reports of databases were examined and 'keepers' queried. The validity of each database was assessed using criteria based on a literature search and involvement by the authors in a national academic network. The costs of databases were established from annual reports, enquiries to 'keepers' of databases and 'guesstimates' based on cost per record. For assessing effectiveness, equity and diffusion, routine databases were classified into three broad groups: (1) group I databases, identifying both HTs and health states, (2) group II databases, identifying the HTs, but not a health state, and (3) group III databases, identifying health states, but not an HT. Group I datasets were disaggregated into clinical registries, clinical administrative databases and population-oriented databases. 
Group III were disaggregated into adverse…
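The three-group classification above can be restated as a small decision function. This is a paraphrase of the text for illustration, not the authors' code:

```python
def classify_database(identifies_ht, identifies_health_state):
    """Classify a routine database by what it can identify, following the
    three broad groups defined for health technology (HT) assessment."""
    if identifies_ht and identifies_health_state:
        return "Group I"    # both HT and health state, e.g. clinical registries
    if identifies_ht:
        return "Group II"   # HT recorded, but no health state
    if identifies_health_state:
        return "Group III"  # health state recorded, but no HT
    return "unclassified"

print(classify_database(True, True))   # Group I
```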

  4. Analysis of technologies databases use in physical education and sport

    Directory of Open Access Journals (Sweden)

    Usychenko V.V.

    2010-03-01

Full Text Available An analysis and systematization of the scientific, methodological, and specialist literature is conducted. The use of database technologies in the athlete preparation system is examined, and the need for technologies that rapidly process large arrays of sports information is demonstrated. Data were collected on the use of computer technologies for recording and analyzing the results of testing training-process parameters, and the influence of these technologies on training and competitive activity is considered. A database, «Athlete», is presented; it contains anthropometric and myometric indicators of highly qualified bodybuilding athletes.

  5. Fusion research and technology records in INIS database

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1998-01-01

This article summarizes the survey study "A survey on publications in Fusion Research and Technology. Science and Technology Indicators in Fusion R and T" by the same author on Fusion R and T records in the International Nuclear Information System (INIS) bibliographic database. In that study, for the first time, all scientometric and bibliometric information contained in a bibliographic database, using INIS records, is analyzed and quantified for a selected field of science and technology. The study also presents a variety of new science and technology indicators that can be used for evaluating research and development activities.

  6. Analysis of technologies databases use in physical education and sport

    OpenAIRE

    Usychenko V.V.; Byshevets N.G.

    2010-01-01

An analysis and systematization of the scientific, methodological, and specialist literature is conducted. The use of database technologies in the athlete preparation system is examined, and the need for technologies that rapidly process large arrays of sports information is demonstrated. Data were collected on the use of computer technologies for recording and analyzing the results of testing training-process parameters. The question of the influence of these technologies is …

  7. JENDL. Nuclear databases for science and technology

    International Nuclear Information System (INIS)

    Shibata, Keiichi

    2013-01-01

    It is exactly 50 years since the Japanese Nuclear Data Committee was founded both in the Atomic Energy Society of Japan and in the former Japan Atomic Energy Research Institute. The committee promoted the development of Japan's own evaluated nuclear data libraries. As a result, we managed to produce a series of Japanese Evaluated Nuclear Data Libraries (JENDLs) to be used in various fields for science and technology. The libraries are categorized into general-purpose and special-purpose ones. The general-purpose libraries have been updated periodically by considering the latest knowledge on experimental and theoretical nuclear physics that was available at the time of the updates. On the other hand, the special-purpose libraries have been issued in order to meet the needs for particular application fields. This paper reviews the research and development for those libraries. (author)

  8. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. 
We believe that an innovative database architecture where the secure…

  9. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the…

  10. Fatigue monitoring desktop guide

    International Nuclear Information System (INIS)

    Woods, K.; Thomas, K.

    2012-01-01

The development of a program for managing material aging (MMG) in the nuclear industry requires a new and different perspective. The classical method for MMG is cycle counting, which has been shown to have limited success. The classical method has been successful in satisfying the ductile condition per the American Society of Mechanical Engineers' (ASME) design criteria. However, the defined material failure mechanism has transformed from through-wall cracking and leakage (ASME) to crack initiation (NUREG-6909). This transformation is based on current industry experience with material degradation early in plant life and can be attributed to fabrication issues and environmental concerns where cycle counting has been unsuccessful. This new perspective provides a different approach to cycle counting that incorporates all of the information about the material conditions. This approach goes beyond the consideration of a static analysis and includes a dynamic assessment of component health, which is required for operating plants. This health definition should consider fabrication, inspections, transient conditions and industry operating experience. In addition, this collection of information can be transparent to a broader audience that may not have a full understanding of the system design or the potential causes of early material degradation. This paper will present the key points that are needed for a successful fatigue monitoring desktop guide. (authors)

  11. Experience with a run file archive using database technology

    International Nuclear Information System (INIS)

    Nixdorf, U.

    1993-12-01

    High Energy Physics experiments are known for their production of large amounts of data. Even small projects may have to manage several Giga Byte of event information. One possible solution for the management of this data is to use today's technology to archive the raw data files in tertiary storage and build on-line catalogs which reference interesting data. This approach has been taken by the Gammas, Electrons and Muons (GEM) Collaboration for their evaluation of muon chamber technologies at the Superconducting Super Collider Laboratory (SSCL). Several technologies were installed and tested during a 6 month period. Events produced were first recorded in the UNIX filesystem of the data acquisition system and then migrated to the Physics Detector Simulation Facility (PDSF) for long term storage. The software system makes use of a commercial relational database management system (SYBASE) and the Data Management System (DMS), a tape archival system developed at the SSCL. The components are distributed among several machines inside and outside PDSF. A Motif-based graphical user interface (GUI) enables physicists to retrieve interesting runs from the archive using the on-line database catalog

  12. Strength of PLA Components Fabricated with Fused Deposition Technology Using a Desktop 3D Printer as a Function of Geometrical Parameters of the Process

    Directory of Open Access Journals (Sweden)

    Vladimir E. Kuznetsov

    2018-03-01

Full Text Available The current paper studies the influence of geometrical parameters of the fused deposition modeling (FDM), or fused filament fabrication (FFF), 3D printing process on printed part strength for open-source desktop 3D printers and the most popular material used for that purpose, polylactic acid (PLA). The study was conducted using a set of different nozzles (0.4, 0.6, and 0.8 mm) and a range of layer heights from the minimum to maximum physical limits of the machine. To assess print strength, a novel assessment method is proposed: a tubular sample is loaded in the weakest direction (across layers) in a three-point bending fixture. Mesostructure evaluation through scanning electron microscopy (SEM) scans of the samples was used to explain the obtained results. We detected a significant influence of geometric process parameters on sample mesostructure, and consequently, on sample strength.
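For context, the maximum bending stress in a tubular three-point bending sample follows from sigma = M*c/I, with M = F*L/4 at the mid-span load point and I the second moment of area of the tube cross-section. The dimensions and load below are illustrative, not values from the paper:

```python
import math

def tube_bending_stress(force_n, span_mm, outer_d_mm, inner_d_mm):
    """Maximum bending stress (MPa) in a three-point bending test of a
    tubular sample: sigma = M*c/I with M = F*L/4 for a center load."""
    moment = force_n * span_mm / 4.0                            # N*mm
    inertia = math.pi * (outer_d_mm**4 - inner_d_mm**4) / 64.0  # mm^4
    c = outer_d_mm / 2.0                                        # outer fiber, mm
    return moment * c / inertia                                 # N/mm^2 == MPa

# Illustrative numbers only: 10 mm tube with 1.6 mm wall,
# 60 mm support span, 200 N failure load.
print(round(tube_bending_stress(200, 60, 10.0, 6.8), 1))
```

Comparing such a nominal stress across nozzle diameters and layer heights is one way to normalize failure loads measured on the same sample geometry.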

  13. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  14. The Point Lepreau Desktop Simulator

    International Nuclear Information System (INIS)

    MacLean, M.; Hogg, J.; Newman, H.

    1997-01-01

    The Point Lepreau Desktop Simulator runs plant process modeling software on a 266 MHz single CPU DEC Alpha computer. This same Alpha also runs the plant control computer software on an SSCI 125 emulator. An adjacent Pentium PC runs the simulator's Instructor Facility software, and communicates with the Alpha through an Ethernet. The Point Lepreau Desktop simulator is constructed to be as similar as possible to the Point Lepreau full scope training simulator. This minimizes total maintenance costs and enhances the benefits of the desktop simulator. Both simulators have the same modeling running on a single CPU in the same schedule of calculations. Both simulators have the same Instructor Facility capable of developing and executing the same lesson plans, doing the same monitoring and control of simulations, inserting all the same malfunctions, performing all the same overrides, capable of making and restoring all the same storepoints. Both simulators run the same plant control computer software - the same assembly language control programs as the power plant uses for reactor control, heat transport control, annunciation, etc. This is a higher degree of similarity between a desktop simulator and a full scope training simulator than previously reported for a computer controlled nuclear plant. The large quantity of control room hardware missing from the desktop simulator is replaced by software. The Instructor Facility panel override software of the training simulator provides the means by which devices (switches, controllers, windows, etc.) on the control room panels can be controlled and monitored in the desktop simulator. The CRT of the Alpha provides a mouse operated DCC keyboard mimic for controlling the plant control computer emulation. Two emulated RAMTEK display channels appear as windows for monitoring anything of interest on plant DCC displays, including one channel for annunciation. (author)

  15. Health technology management: a database analysis as support of technology managers in hospitals.

    Science.gov (United States)

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy.
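Benchmarking indicators of this kind can be derived directly from maintenance work orders. The records, device classes, and service period below are hypothetical, not data from the Careggi Hospital database:

```python
from collections import defaultdict

# Hypothetical maintenance records: (device_class, downtime_hours)
work_orders = [
    ("infusion_pump", 2.0),
    ("infusion_pump", 3.5),
    ("ventilator", 6.0),
    ("infusion_pump", 1.5),
]
PERIOD_HOURS = 8760  # assumed one year of service per device class

failures = defaultdict(int)
downtime = defaultdict(float)
for device, hours in work_orders:
    failures[device] += 1
    downtime[device] += hours

# Two simple indicators per device class: mean time between failures
# and availability over the service period.
for device in sorted(failures):
    mtbf = (PERIOD_HOURS - downtime[device]) / failures[device]
    availability = 1 - downtime[device] / PERIOD_HOURS
    print(device, round(mtbf, 1), round(availability, 4))
```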

  16. Scaling up ATLAS Database Release Technology for the LHC Long Run

    International Nuclear Information System (INIS)

    Borodin, M; Nevski, P; Vaniachine, A

    2011-01-01

    To overcome scalability limitations in database access on the Grid, ATLAS introduced the Database Release technology replicating databases in files. For years Database Release technology assured scalable database access for Monte Carlo production on the Grid. Since previous CHEP, Database Release technology was used successfully in ATLAS data reprocessing on the Grid. Frozen Conditions DB snapshot guarantees reproducibility and transactional consistency isolating Grid data processing tasks from continuous conditions updates at the 'live' Oracle server. Database Release technology fully satisfies the requirements of ATLAS data reprocessing and Monte Carlo production. We parallelized the Database Release build workflow to avoid linear dependency of the build time on the length of LHC data-taking period. In recent data reprocessing campaigns the build time was reduced by an order of magnitude thanks to a proven master-worker architecture used in the Google MapReduce. We describe further Database Release optimizations scaling up the technology for the LHC long run.
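The master-worker pattern used to parallelize the build can be sketched as below; `build_slice`, the run ranges, and the snapshot naming are placeholders, not the ATLAS build workflow itself:

```python
from concurrent.futures import ThreadPoolExecutor

def build_slice(run_range):
    """Hypothetical worker task: extract one run range of conditions data
    into a file-resident snapshot (summarized here as a label)."""
    first, last = run_range
    return f"snapshot_{first}_{last}"

def build_release(run_ranges, workers=4):
    """Master: farm independent run ranges out to workers, so build time
    no longer grows linearly with the length of the data-taking period."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves input order, so the release pieces stay sorted by run
        return list(pool.map(build_slice, run_ranges))

ranges = [(1, 100), (101, 200), (201, 300), (301, 400)]
print(build_release(ranges))
```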

  17. Nielsen PrimeLocation Web/Desktop: Assessing and GIS Mapping Market Area

    Data.gov (United States)

    Social Security Administration — Nielsen PrimeLocation Web and Desktop Software Licensed for Internal Use only: Pop-Facts Demographics Database, Geographic Mapping Data Layers, Geo-Coding locations.

  18. Structure health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok

    2003-01-01

A structural health monitoring system should be developed based on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs abnormal sensor conditions, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify the replacement of sensors. As further study, a real application test will be performed to check for remaining shortcomings.
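A minimal self-monitoring check of the kind described might flag sensors whose readings leave a plausible range or stop changing entirely. The sensor names, values, and thresholds are hypothetical:

```python
def detect_faulty_sensors(readings, low=-100.0, high=100.0):
    """Flag a sensor as faulty if its readings go out of a plausible
    range or are stuck at a single value (a dead or frozen channel)."""
    faulty = []
    for name, values in readings.items():
        out_of_range = any(v < low or v > high for v in values)
        stuck = len(values) > 1 and len(set(values)) == 1
        if out_of_range or stuck:
            faulty.append(name)
    return sorted(faulty)

readings = {
    "strain_01": [10.1, 10.4, 9.8, 10.0],   # healthy: varies within range
    "strain_02": [10.2, 10.2, 10.2, 10.2],  # stuck at one value
    "accel_01": [5.0, 999.0, 4.8, 5.1],     # out-of-range spike
}
print(detect_faulty_sensors(readings))   # ['accel_01', 'strain_02']
```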

  19. When the Internet Knocks, Unlock the Desktop.

    Science.gov (United States)

    Nyerges, Mike

    1999-01-01

    Describes one public school library's struggles and solutions to problems associated with managing its new Windows desktop. To implement goals for organizing the desktop, the library selected the FoolProof desktop security system, utilized networks administration to enforce the school's Acceptable Use Policy, and used Windows 95 System Policies.…

  20. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  1. Scaling up ATLAS Database Release Technology for the LHC Long Run

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Vaniachine, A

    2010-01-01

    To overcome scalability limitations in database access on the Grid, ATLAS introduced the Database Release technology replicating databases in files. For years Database Release technology assured scalable database access for Monte Carlo production on the Grid. Since previous CHEP, Database Release technology was used successfully in ATLAS data reprocessing on the Grid. Frozen Conditions DB snapshot guarantees reproducibility and transactional consistency isolating Grid data processing tasks from continuous conditions updates at the “live” Oracle server. Database Release technology fully satisfies the requirements of ATLAS data reprocessing and Monte Carlo production. It is fast (on-demand access to ~100 MB of data takes less than 10 s), robust (failure rate less than 10**-6 per job that makes 10K queries), and scalable (served 1B queries in one of the reprocessing campaigns). We parallelized the Database Release build workflow to avoid linear dependency of the build time on the length of LHC data-taking pe...

  2. Ceramics Technology Project database: September 1991 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1992-06-01

The piston ring-cylinder liner area of the internal combustion engine must withstand very high temperature gradients, highly corrosive environments, and constant friction. Improving engine efficiency requires ring and cylinder liner materials that can survive this abusive environment and lubricants that resist decomposition at elevated temperatures. Wear and friction tests have been done on many material combinations, in environments similar to actual use, to find the right materials for the situation. This report covers tribology information produced from 1986 through July 1991 by Battelle Columbus Laboratories, Caterpillar Inc., and Cummins Engine Company, Inc. for the Ceramic Technology Project (CTP). All data in this report were taken from the project's semiannual and bimonthly progress reports and cover base materials, coatings, and lubricants. The data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from the data.

  3. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
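The kind of SQL-driven management described above can be sketched with an in-memory SQLite stand-in. The schema and feature records below are illustrative, not the actual Minnesota KFD schema:

```python
import sqlite3

# In-memory stand-in for a karst feature database.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE karst_feature (
    id INTEGER PRIMARY KEY,
    feature_type TEXT,   -- e.g. sinkhole, spring, stream sink
    county TEXT,
    depth_m REAL)""")

features = [
    (1, "sinkhole", "Fillmore", 4.2),
    (2, "spring", "Olmsted", None),
    (3, "sinkhole", "Fillmore", 7.9),
]
con.executemany("INSERT INTO karst_feature VALUES (?, ?, ?, ?)", features)

# SQL of the kind used to manipulate the database and feed user interfaces:
rows = con.execute("""SELECT county, COUNT(*) FROM karst_feature
                      WHERE feature_type = 'sinkhole'
                      GROUP BY county""").fetchall()
print(rows)   # [('Fillmore', 2)]
```

In a production DBMS, the same queries would run against tables protected by the per-user access permissions and backup regime the paper describes.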

  4. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely available. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities for creating the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  5. Clinical evaluation of a desktop robotic assistant.

    Science.gov (United States)

    Hammel, J; Hall, K; Lees, D; Leifer, L; Van der Loos, M; Perkash, I; Crigler, R

    1989-01-01

    A desktop vocational assistant robotic workstation was evaluated by 24 high-level quadriplegics from the Palo Alto Veterans Affairs Spinal Cord Injury Center. The system is capable of performing daily living and vocational activities for individuals with high-level quadriplegia via voice control. Subjects were asked to use the robot to perform a repertoire of daily living activities, including preparing a meal and feeding themselves, washing their face, shaving, and brushing teeth. Pre- and post-test questionnaires, interviews, and observer assessments were conducted to determine the quality of the robot performance and the reaction of the disabled users toward this technology. Results of the evaluations were generally positive and demonstrated the usefulness of this technology in assisting high-level quadriplegics to perform daily activities and to gain a modicum of independence and privacy in their lives.

  6. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  7. NoSQL technologies for the CMS Conditions Database

    Science.gov (United States)

    Sipos, Roland

    2015-12-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions is therefore a strong reason to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. The definition of the database infrastructure is based on the need to store the conditions as BLOBs. Because of this, each condition can reach a size that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be problematic in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure how these databases handle arbitrary binary data. Based on the evaluation, prototypes of a document store, a column-oriented store, and a plain key-value store are deployed. An adaptation layer to access the backends in the CMS Offline software was developed to provide transparent support for these NoSQL databases in the CMS context. Additional data modelling approaches and considerations in the software layer, as well as deployment and automation of the databases, are also covered in the research. In this paper we present the results of the evaluation as well as a performance comparison of the prototypes studied.
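    The BLOB-splitting treatment mentioned above can be illustrated with a simple chunking scheme over a key-value store. The key layout and chunk size below are our own invention, not the CMS implementation (real payloads would use chunks of hundreds of kilobytes, not 4 bytes):

```python
CHUNK = 4  # bytes per chunk; tiny on purpose, just for illustration

def split_blob(key, blob, chunk_size=CHUNK):
    """Split a payload into sub-key/chunk pairs plus a small index record."""
    chunks = [blob[i:i + chunk_size] for i in range(0, len(blob), chunk_size)]
    kv = {f"{key}:{n}": c for n, c in enumerate(chunks)}
    kv[key] = len(chunks)  # index record: how many chunks to fetch back
    return kv

def join_blob(kv, key):
    """Reassemble the payload from its chunks."""
    return b"".join(kv[f"{key}:{n}"] for n in range(kv[key]))

store = split_blob("cond/EcalPedestals/42", b"0123456789abcdef!")
assert join_blob(store, "cond/EcalPedestals/42") == b"0123456789abcdef!"
```

    The trade-off the paper measures is exactly this: whether a backend handles one large value better or worse than many small ones plus an index lookup.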

  8. multiplierz: an extensible API based desktop environment for proteomics data analysis.

    Science.gov (United States)

    Parikh, Jignesh R; Askenazi, Manor; Ficarro, Scott B; Cashorali, Tanya; Webber, James T; Blank, Nathaniel C; Zhang, Yi; Marto, Jarrod A

    2009-10-29

    Efficient analysis of results from mass spectrometry-based proteomics experiments requires access to disparate data types, including native mass spectrometry files, output from algorithms that assign peptide sequence to MS/MS spectra, and annotation for proteins and pathways from various database sources. Moreover, proteomics technologies and experimental methods are not yet standardized; hence a high degree of flexibility is necessary for efficient support of high- and low-throughput data analytic tasks. Development of a desktop environment that is sufficiently robust for deployment in data analytic pipelines, and simultaneously supports customization for programmers and non-programmers alike, has proven to be a significant challenge. We describe multiplierz, a flexible and open-source desktop environment for comprehensive proteomics data analysis. We use this framework to expose a prototype version of our recently proposed common API (mzAPI) designed for direct access to proprietary mass spectrometry files. In addition to routine data analytic tasks, multiplierz supports generation of information-rich, portable spreadsheet-based reports. Moreover, multiplierz is designed around a "zero infrastructure" philosophy, meaning that it can be deployed by end users with little or no system administration support. Finally, access to multiplierz functionality is provided via high-level Python scripts, resulting in a fully extensible data analytic environment for rapid development of custom algorithms and deployment of high-throughput data pipelines. Collectively, mzAPI and multiplierz facilitate a wide range of data analysis tasks, spanning technology development to biological annotation, for mass spectrometry-based proteomics research.

  9. The research of network database security technology based on web service

    Science.gov (United States)

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and the security levels of network databases, studies the security technology of the network database, analyzes the sub-key encryption algorithm in detail, and applies this algorithm successfully to a campus one-card system. The realization process of the encryption algorithm is discussed; the method can serve as a reference in many fields, particularly in management information system security and e-commerce.
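    The abstract does not spell out the sub-key algorithm, but the general idea of deriving independent per-field sub-keys from a single master key can be sketched as follows. The derivation scheme and the toy XOR cipher are illustrative only; they are not secure and are not claimed to match the paper's algorithm:

```python
import hashlib
import secrets

def field_subkey(master_key: bytes, table: str, column: str) -> bytes:
    """Derive a per-column sub-key from a master key (illustrative only)."""
    material = master_key + table.encode() + b"/" + column.encode()
    return hashlib.sha256(material).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher, only to show per-field keys in action; NOT secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

master = secrets.token_bytes(32)
k_name = field_subkey(master, "student", "name")
ct = xor_encrypt(b"Alice", k_name)
assert xor_encrypt(ct, k_name) == b"Alice"           # decryption round-trips
assert k_name != field_subkey(master, "student", "card_no")  # keys differ per field
```

    The point of such a scheme is that compromising one field's sub-key does not expose the master key or the other fields.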

  10. NoSQL technologies for the CMS Conditions Database

    CERN Document Server

    Sipos, Roland

    2015-01-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions is therefore a strong reason to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. An important detail about the Conditions is that the payloads are stored as BLOBs, and they can reach sizes that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be a bottleneck in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure how these databases handle arbitrary binary data. Based on the evaluation, prototypes of a document store, a column-oriented store, and a plain key-value store are deployed. An adaption l...

  11. Development of Integrated PSA Database and Application Technology

    International Nuclear Information System (INIS)

    Han, Sang Hoon; Kang, Dae Il; Park, Jin Hee; Kim, Seung Hwan; Choi, Sun Yeong; Jung, Woo Sik; Ha, Jae Joo; Ahn, Kwang Il

    2007-06-01

    The high quality of PSA is essential for risk-informed regulation and applications. The main elements of PSA are the model, methodology, reliability data, and tools. The purpose of the project is to develop the reliability database for the Korean nuclear power plants and a PSA analysis and management system. The reliability database system has been developed and reliability data has been collected for 4 types of data: reactor trips, piping, components, and common cause failures. The database provides the reliability data for PSAs and risk-informed applications. The FTREX software is the fastest PSA quantification engine in the world. A license agreement between KAERI and EPRI was made to sell FTREX to the members of EPRI. The advanced PSA management system AIMS-PSA has been developed. The PSA model is stored in the database and solved by clicking one button. All the information necessary for the KSNP Level-1 and 2 PSA is stored in the PSA information database, which provides PSA users a useful means to review and analyze the PSA.
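    The kind of quantification an engine such as FTREX performs can be illustrated with a minimal cut-set calculation. The event names, probabilities, and the rare-event approximation below are a textbook sketch, not FTREX's actual (far more sophisticated) algorithms:

```python
# Basic-event probabilities and minimal cut sets for a toy fault tree.
# All names and numbers are invented for illustration.
P = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 2e-5}
cut_sets = [{"pump_A", "pump_B"}, {"valve", "pump_A"}, {"power"}]

def cut_set_prob(cs):
    """Probability of one cut set, assuming independent basic events."""
    p = 1.0
    for event in cs:
        p *= P[event]
    return p

# Rare-event approximation: top-event probability ~ sum of cut-set probabilities
top = sum(cut_set_prob(cs) for cs in cut_sets)
print(f"{top:.2e}")  # 2.15e-05
```

    Real quantification engines additionally truncate negligible cut sets and correct for the overcounting the rare-event approximation introduces.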

  12. Data-Base Software For Tracking Technological Developments

    Science.gov (United States)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    Technology Tracking System (TechTracS) is a computer program developed for storing and retrieving information on technology and related patent information developed under the auspices of NASA Headquarters and NASA's field centers. Contents of the database include multiple scanned still images and QuickTime movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. The system performs routine functions automatically and serves multiple users.

  13. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Science.gov (United States)

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, and their use has surged in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Database characteristics, e.g. name, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases, 20 from Thailand and 20 from Japan, were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA among the Asia-Pacific region is needed.

  14. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.
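    The abstract does not name the decorrelated colour space used, but a standard example of a reversible decorrelating transform is RGB to YCoCg-R, sketched here purely as an illustration of the idea (the chroma channels concentrate near zero, which is what makes low bitrates achievable):

```python
def rgb_to_ycocg_r(r, g, b):
    """Reversible integer RGB -> YCoCg-R transform (lossless)."""
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def ycocg_r_to_rgb(y, co, cg):
    """Exact inverse of rgb_to_ycocg_r."""
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

# Round-trips exactly for any integer RGB triple
for rgb in [(100, 150, 200), (255, 0, 0), (12, 12, 12)]:
    assert ycocg_r_to_rgb(*rgb_to_ycocg_r(*rgb)) == rgb
```

    Whether the paper's codec uses this particular transform is not stated in the abstract; the point is that decorrelation plus entropy coding of near-zero channels is what yields bitrates below 1 bpp.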

  15. Real-Time Wildfire Monitoring Using Scientific Database and Linked Data Technologies

    NARCIS (Netherlands)

    M. Koubarakis (Manolis); C. Kontoes (Charalampos); S. Manegold (Stefan); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); K. Bereta (Konstantina); G. Garbis (George); C. Nikolaou (Charalampos); D. Michail (Dimitrios); I. Papoutsis (Ioannis); T. Herekakis (Themistocles); M.G. Ivanova (Milena); Y. Zhang (Ying); H. Pirk (Holger); M.L. Kersten (Martin); K. Dogani (Kallirroi); S. Giannakopoulou (Stella); P. Smeros (Panayiotis)

    2013-01-01

    We present a real-time wildfire monitoring service that exploits satellite images and linked geospatial data to detect hotspots and monitor the evolution of fire fronts. The service makes heavy use of scientific database technologies (array databases, SciQL, data vaults) and linked data technologies.

  16. Development of a national neutron database for nuclear technology

    International Nuclear Information System (INIS)

    Ignatyuk, A.V.; Kononov, V.N.; Kuzminov, B.D.; Manokhin, V.N.; Nikolaev, M.N.; Furzov, B.I.

    1997-01-01

    This paper describes the stages of many years of activity at the IPPE: the measurement, theoretical description and evaluation of neutron data, and the establishment of a national data bank of neutron data for nuclear technology. A list of the libraries stored at the Nuclear Data Centre is given. (author). 16 refs, 14 tabs

  17. A comparison of different database technologies for the CMS AsyncStageOut transfer database

    Science.gov (United States)

    Ciangottini, D.; Balcas, J.; Mascheroni, M.; Rupeika, E. A.; Vaandering, E.; Riahi, H.; Silva, J. M. D.; Hernandez, J. M.; Belforte, S.; Ivanov, T. T.

    2017-10-01

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers constantly increased up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.
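    The document-versus-relational trade-off discussed above can be made concrete by expressing one transfer-bookkeeping record both ways. The field names are illustrative, not the actual CRAB/ASO schema, and SQLite stands in for the SQL backend:

```python
import sqlite3

# One ASO-style transfer record shaped as a NoSQL document (field names
# invented for illustration, not the real CRAB/ASO schema).
doc = {"_id": "user1_task42_file007", "user": "user1", "task": "task42",
       "source": "T2_IT_Bari", "dest": "T1_US_FNAL", "state": "acquired"}

# The same record in a relational schema: state changes become indexed
# UPDATEs rather than document rewrites plus view recomputation.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE transfers (
    id TEXT PRIMARY KEY, user TEXT, task TEXT,
    source TEXT, dest TEXT, state TEXT)""")
db.execute("INSERT INTO transfers VALUES (?, ?, ?, ?, ?, ?)",
           (doc["_id"], doc["user"], doc["task"],
            doc["source"], doc["dest"], doc["state"]))
db.execute("UPDATE transfers SET state = 'done' WHERE id = ?", (doc["_id"],))
print(db.execute("SELECT state FROM transfers WHERE user = 'user1'")
        .fetchone()[0])  # done
```

    At small scale the two representations are interchangeable; the paper's point is how they diverge under hundreds of thousands of state updates per day.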

  18. A Comparison of Different Database Technologies for the CMS AsyncStageOut Transfer Database

    Energy Technology Data Exchange (ETDEWEB)

    Ciangottini, D. [INFN, Perugia; Balcas, J. [Caltech; Mascheroni, M. [Fermilab; Rupeika, E. A. [Vilnius U.; Vaandering, E. [Fermilab; Riahi, H. [CERN; Silva, J. M.D. [Sao Paulo, IFT; Hernandez, J. M. [Madrid, CIEMAT; Belforte, S. [INFN, Trieste; Ivanov, T. T. [Sofiya U.

    2017-11-22

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers constantly increased up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.

  19. Full-scope nuclear training simulator -brought to the desktop

    International Nuclear Information System (INIS)

    LaPointe, D.J.; Manz, A.; Hall, G.S.

    1997-01-01

    RighTSTEP is a suite of simulation software initially designed to facilitate the upgrade of Ontario Hydro's full-scope simulators, but it is also adaptable to a variety of other roles. It is presently being commissioned at the Bruce A Training Simulator and has seen preliminary use in desktop and classroom roles. Because of the flexibility of the system, we anticipate it will see common use in the corporation for full-scope simulation roles. A key reason for developing RighTSTEP (Real Time Simulator Technology, Extensible and Portable) was the need to modernize and upgrade the full-scope training simulator while protecting the investment in modelling code. This modelling code represents the end product of 18 years of evolution from the beginning of its development in 1979. Bringing this modelling code to a modern and more useful framework (the combination of simulator host, operating system, and simulator operating system) could also provide many spin-off benefits. The development (and first implementation) of the RighTSTEP system was cited for saving the corporation $5.6M and was recognized by a corporate New Technology Award last year. The most important spin-off from this project has been the desktop version of the full-scope simulator. The desktop simulator uses essentially the same software as its full-scope counterpart, and may be used for a variety of new purposes. Classroom and individual simulator training can now be easily accommodated since a desktop simulator is both affordable and relatively easy to use. Further, a wide group of people can be trained using the desktop simulator; by contrast, the full-scope simulators were almost exclusively devoted to front-line operating staff. The desktop is finding increasing use in support of engineering applications, resulting from its easy accessibility, breadth of station systems represented, and tools for analysis and viewing.
As further plant models are made available on the new simulator platform and

  20. Using XML technology for the ontology-based semantic integration of life science databases.

    Science.gov (United States)

    Philippi, Stephan; Köhler, Jacob

    2004-06-01

    Several hundred internet-accessible life science databases with constantly growing contents and varying areas of specialization are publicly available. Database integration, consequently, is a fundamental prerequisite for answering complex biological questions. Due to the presence of syntactic, schematic, and semantic heterogeneities, large-scale database integration at present takes considerable effort. As there is growing acceptance of the extensible markup language (XML) as a means for data exchange in the life sciences, this article focuses on the impact of XML technology on database integration in this area. In detail, a general architecture for ontology-driven data integration based on XML technology is introduced, which overcomes some of the traditional problems in this area. As a proof of concept, a prototypical implementation of this architecture, based on a native XML database and an expert system shell, is described for the realization of a real-world integration scenario.
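    The ontology-driven mapping idea can be sketched with the standard library's XML support: records from two schematically heterogeneous sources are rewritten into one integrated view via a shared concept mapping. All tag names below are invented for illustration; the paper's actual schemas and ontology are not given in the abstract:

```python
import xml.etree.ElementTree as ET

# Two source records with schematic heterogeneity: the same concept
# ("gene name") hides under different tags in each source.
src_a = ET.fromstring("<entry><geneName>TP53</geneName></entry>")
src_b = ET.fromstring("<record><symbol>TP53</symbol></record>")

# Ontology mapping: source tag -> shared concept in the global schema
MAPPING = {"geneName": "Gene", "symbol": "Gene"}

def to_global(elem):
    """Rewrite a source record into the integrated, ontology-tagged view."""
    out = ET.Element("integrated")
    for child in elem:
        concept = MAPPING.get(child.tag)
        if concept:
            ET.SubElement(out, concept).text = child.text
    return out

a, b = to_global(src_a), to_global(src_b)
assert a.find("Gene").text == b.find("Gene").text == "TP53"
```

    In the architecture the paper describes, this mapping table would itself be derived from an ontology rather than hard-coded, and an expert system shell would drive the rewriting.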

  1. Ceramic Technology Project database: September 1993 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1994-01-01

    Data presented in this report represent an intense effort to improve processing methods, testing methods, and general mechanical properties of candidate ceramics for use in advanced heat engines. Materials discussed include GN-10, GS-44, GTE PY6, NT-154, NT-164, sintered reaction-bonded silicon nitrides, silicon nitride combined with rare-earth oxides, NT-230, Hexoloy SX-G1, Dow Corning's β-Si₃N₄, and a few whisker-reinforced ceramic composites. Information in this report was taken from the project's semiannual and bimonthly progress reports and from final reports summarizing the results of individual studies. Test results are presented in tabular form and in graphs. All data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from those data.

  2. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object- oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  3. VMware Horizon 6 desktop virtualization solutions

    CERN Document Server

    Cartwright, Ryan; Langone, Jason; Leibovici, Andre

    2014-01-01

    If you are a desktop architect, solution provider, end-user consultant, virtualization engineer, or anyone who wants to learn how to plan and design the implementation of a virtual desktop solution based on Horizon 6, then this book is for you. An understanding of VMware vSphere fundamentals, coupled with experience in the installation or administration of a VMware environment, would be a plus.

  4. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959
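    The dasymetric idea behind a Population Spatialization Model (redistributing a census total over grid cells in proportion to per-cell weights) can be sketched as follows. The land-use weights here are invented for illustration; the actual PSM derives its weights from natural and socio-economic variables:

```python
# Relative population density weight per land-use class (invented numbers).
LU_WEIGHT = {"urban": 10.0, "cropland": 2.0, "forest": 0.2, "water": 0.0}

def spatialize(total_pop, cells):
    """Distribute a census total over 1 km x 1 km cells by land-use weight.

    cells: list of land-use classes, one per grid cell in the region.
    """
    weights = [LU_WEIGHT[c] for c in cells]
    s = sum(weights)
    return [total_pop * w / s for w in weights]

cells = ["urban", "urban", "cropland", "forest", "water"]
pop = spatialize(22_200, cells)
assert abs(sum(pop) - 22_200) < 1e-6  # census total is preserved
assert pop[-1] == 0.0                 # no population assigned to water cells
```

    The validation step the abstract mentions then compares such gridded values against finer-grained census units (townships in Yishui County).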

  5. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    Full Text Available The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.

  6. multiplierz: an extensible API based desktop environment for proteomics data analysis

    Directory of Open Access Journals (Sweden)

    Webber James T

    2009-10-01

    Full Text Available Abstract Background Efficient analysis of results from mass spectrometry-based proteomics experiments requires access to disparate data types, including native mass spectrometry files, output from algorithms that assign peptide sequence to MS/MS spectra, and annotation for proteins and pathways from various database sources. Moreover, proteomics technologies and experimental methods are not yet standardized; hence a high degree of flexibility is necessary for efficient support of high- and low-throughput data analytic tasks. Development of a desktop environment that is sufficiently robust for deployment in data analytic pipelines, and simultaneously supports customization for programmers and non-programmers alike, has proven to be a significant challenge. Results We describe multiplierz, a flexible and open-source desktop environment for comprehensive proteomics data analysis. We use this framework to expose a prototype version of our recently proposed common API (mzAPI) designed for direct access to proprietary mass spectrometry files. In addition to routine data analytic tasks, multiplierz supports generation of information-rich, portable spreadsheet-based reports. Moreover, multiplierz is designed around a "zero infrastructure" philosophy, meaning that it can be deployed by end users with little or no system administration support. Finally, access to multiplierz functionality is provided via high-level Python scripts, resulting in a fully extensible data analytic environment for rapid development of custom algorithms and deployment of high-throughput data pipelines. Conclusion Collectively, mzAPI and multiplierz facilitate a wide range of data analysis tasks, spanning technology development to biological annotation, for mass spectrometry-based proteomics research.

  7. A study on retrieval of article and making database in radio technology with personal computer

    International Nuclear Information System (INIS)

    Kim, Sung Hwan

    1997-01-01

    Although many useful articles appear in journals published in Korea, they are not always cited by researchers, mainly due to the absence of an efficient searching system. The author made a program with 4 predefined filtering forms to retrieve published articles rapidly and accurately. The program was coded using the database management system CA-Clipper ver. 5.2 on a 486DX-II PC (8 Mbyte RAM, VGA, 560 Mbyte hard disk) with a DeskJet printer (HP-560K) and MS-DOS ver. 5.0. Twenty articles from the Journal of the Korean Society of Radiological Technology were entered, and the program was tested for article retrieval and database creation.
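    The predefined-filter retrieval described above (originally implemented in CA-Clipper) might look like this in modern terms. The record fields and the four filter forms are invented for illustration; the abstract does not list the original forms:

```python
# Toy article database; field names are invented for illustration.
ARTICLES = [
    {"title": "CT dose optimisation", "author": "Kim", "year": 1996,
     "keywords": ["CT", "dose"]},
    {"title": "MRI artefacts", "author": "Lee", "year": 1995,
     "keywords": ["MRI"]},
]

# Four predefined filtering forms, each a predicate over one record.
FILTERS = {
    "by_author":  lambda rec, q: q.lower() in rec["author"].lower(),
    "by_title":   lambda rec, q: q.lower() in rec["title"].lower(),
    "by_year":    lambda rec, q: rec["year"] == int(q),
    "by_keyword": lambda rec, q: q.upper() in (k.upper() for k in rec["keywords"]),
}

def search(form, query):
    """Apply one predefined filtering form and return matching titles."""
    return [r["title"] for r in ARTICLES if FILTERS[form](r, query)]

assert search("by_author", "kim") == ["CT dose optimisation"]
assert search("by_year", "1995") == ["MRI artefacts"]
```

    Each "form" corresponds to one of the fixed query screens a dBase-family tool like Clipper would present to the user.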

  8. Public open space desktop auditing tool

    DEFF Research Database (Denmark)

    Mygind, Lærke; Bentsen, Peter; Badland, Hannah

    2016-01-01

    Features of public open space (POS) have traditionally been described using on-site direct observation, but recently, low-cost and time-efficient remote desktop auditing tools have been developed. We adapted an existing, validated desktop auditing tool (the Public Open Space Desktop Auditing Tool: POSDAT) and tested it in a pilot sample of regional and metropolitan settings in Victoria, Australia. Using Google Maps and Street View, local government webpages, the National Public Toilet Registry and spatial data, we captured POSDAT items in 171 POS across 17 suburbs, of which 9 were regional. POSDAT ... resolution Google Street View imagery available for some outer regional areas and the inconsistency of detail in information on local government webpages hindered a consistent assessment of POS. Thus, POSDAT, based on the spatial data applied in this study, is appropriate for use in metropolitan...

  9. Proposal for Implementing Multi-User Database (MUD) Technology in an Academic Library.

    Science.gov (United States)

    Filby, A. M. Iliana

    1996-01-01

    Explores the use of MOO (multi-user object oriented) virtual environments in academic libraries to enhance reference services. Highlights include the development of multi-user database (MUD) technology from gaming to non-recreational settings; programming issues; collaborative MOOs; MOOs as distinguished from other types of virtual reality; audio…

  10. A Database for Reviewing and Selecting Radioactive Waste Treatment Technologies and Vendors

    International Nuclear Information System (INIS)

    P. C. Marushia; W. E. Schwinkendorf

    1999-01-01

    Several attempts have been made in past years to collate and present waste management technologies and solutions to waste generators. These efforts have been manifested as reports, buyers' guides, and databases. While this information is helpful at the time it is assembled, the principal weakness is maintaining the timeliness and accuracy of the information over time. In many cases, updates have to be published or developed as soon as the product is disseminated. The recently developed National Low-Level Waste Management Program's Technologies Database is a vendor-updated, Internet-based database designed to overcome this problem. It contains information about waste types, typical treatments for each waste type, and the vendors who provide those treatment methods. The vendors who provide services update their own contact information, their treatment processes, and the types of wastes for which their treatment process is applicable. This information is queryable by a generator of low-level or mixed low-level radioactive waste who is seeking information on waste treatment methods and the vendors who provide them. Timeliness of the information in the database is assured using time clocks and automated messaging to remind featured vendors to keep their information current. Failure to keep the entries current results in a vendor first being warned and then ultimately dropped from the database. This assures that the user is dealing with the most current information available and with vendors who are active in reaching and serving their market.
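
    The warn-then-drop currency policy described above amounts to a simple age check on each vendor entry. A minimal sketch follows; the 180- and 365-day thresholds are assumptions, since the report does not state the actual intervals:

    ```python
    from datetime import date, timedelta

    WARN_AFTER = timedelta(days=180)   # hypothetical thresholds; the report
    DROP_AFTER = timedelta(days=365)   # does not state the actual intervals

    def vendor_status(last_updated, today):
        """Classify a vendor entry as current, warned, or dropped by its age."""
        age = today - last_updated
        if age >= DROP_AFTER:
            return "dropped"
        if age >= WARN_AFTER:
            return "warned"
        return "current"

    today = date(1999, 6, 1)
    ```

    A nightly job over all entries would send the reminder message on the transition to "warned" and remove the record on "dropped".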

  11. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Directory of Open Access Journals (Sweden)

    Leonardo Bastos Pontes

    2017-01-01

    Full Text Available This paper is set in the context of information technology service management, with some ideas from information technology governance, and proposes a hybrid model for managing the services of a database in a supplementary health operator, based on the principles of information technology service management. The approach draws on fundamental nuances of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services, and studies Availability and Continuity Management together, as most of these guides do. This work is important because it maintains a good data flow in the database and improves the agility of the systems in the clinics accredited to the health plan.

  12. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  13. Desktop Virtualization in Action: Simplicity Is Power

    Science.gov (United States)

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  14. Aquatic Habitats: Exploring Desktop Ponds. Teacher's Guide.

    Science.gov (United States)

    Barrett, Katharine; Willard, Carolyn

    This book, for grades 2-6, is designed to provide students with a highly motivating and unique opportunity to investigate an aquatic habitat. Students set up, observe, study, and reflect upon their own "desktop ponds." Accessible plants and small animals used in these activities include Elodea, Tubifex worms, snails, mosquito larvae, and fish.…

  15. Architectural Desktop : release 3.3

    DEFF Research Database (Denmark)

    Skauge, Jørn

    2002-01-01

    Exercises in the most fundamental parts of ADT, based on the modelling of a small and simple house. The work is primarily with the tools in the design menu. 1st edition titled: Architectural Desktop (Fiskers hus). 2002. 60 pages.

  16. Preference of computer technology for analytical support of large databases of medical information systems

    Directory of Open Access Journals (Sweden)

    Biryukov А.P.

    2013-12-01

    Full Text Available Aim: to study the use of intelligent technologies for the analytical support of large databases of medical information systems. Material and methods. We used techniques of object-oriented software design and database design. Results. Based on expert review of models and algorithms for the analysis of clinical and epidemiological data, and on principles of knowledge representation in large-scale health information systems, data-mining schemas were implemented in the software package of the register of the Research Center named after A. I. Burnazyan of Russia. Areas were identified for effective implementation of the abstract EAV (Entity-Attribute-Value) data model and Data Mining procedures in the design of databases for biomedical registers. Conclusions. Using an intelligent software platform that supports different sets of APIs and object models for different operations in different software environments makes it possible to build and maintain an information system through biomedical data-processing procedures.
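
    The EAV (Entity-Attribute-Value) model mentioned above stores each fact as a separate entity-attribute-value row, so new attributes require no schema change. A minimal sketch using the standard-library sqlite3 module; table and attribute names are invented, not those of the Burnazyan Center register:

    ```python
    import sqlite3

    # One generic table holds all facts as (entity, attribute, value) rows.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE eav (entity TEXT, attribute TEXT, value TEXT)")
    rows = [
        ("patient-1", "dose_mSv", "12.5"),
        ("patient-1", "diagnosis", "I25.1"),
        ("patient-2", "dose_mSv", "3.1"),
    ]
    conn.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

    # A brand-new attribute needs no ALTER TABLE -- just another row:
    conn.execute("INSERT INTO eav VALUES ('patient-2', 'followup', 'yes')")

    # Pivot one attribute back out across all entities:
    doses = dict(conn.execute(
        "SELECT entity, value FROM eav WHERE attribute = 'dose_mSv'"))
    ```

    This flexibility is the reason EAV suits registers whose attribute sets evolve, at the cost of pushing type checking and pivoting into the application.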

  17. The Neotoma Paleoecology Database

    Science.gov (United States)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community
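
    The web services described above deliver current data to desktop and web applications without a full database download. A hedged sketch of such a call follows; the endpoint path and parameter name are illustrative assumptions rather than the documented Neotoma API, and the JSON payload is invented:

    ```python
    import json
    from urllib.parse import urlencode

    # Hypothetical Neotoma-style endpoint; the real API may differ.
    BASE = "https://api.neotomadb.org/v2.0/data/sites"

    def site_query_url(base, **params):
        """Build a query URL from keyword parameters."""
        return base + "?" + urlencode(sorted(params.items()))

    url = site_query_url(BASE, sitename="Nelson Lake")

    # An invented JSON payload of the general shape such services return:
    payload = json.loads('{"data": [{"siteid": 666, "sitename": "Nelson Lake"}]}')
    site_names = [rec["sitename"] for rec in payload["data"]]
    ```

    An application issuing this request at run time always sees the most recently uploaded steward data, which is the point of the service layer.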

  18. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    OpenAIRE

    Butt, Javaid; Onimowo, Dominic A.; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-01-01

    3D printing has opened new horizons for the manufacturing industry in general and 3D printers have become the tools for technological advancements. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side, capable of producing excellent quality products, and the latter being on the low-cost side with moderate quality results. However, there is larger room for improvements and enhancements for the desktop systems as compared to th...

  19. Database use and technology in Japan: JTEC panel report. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Wiederhold, G.; Beech, D.; Bourne, C.; Farmer, N.; Jajodia, Sushil; Kahaner, D.; Minoura, Toshi; Smith, D.; Smith, J.M.

    1992-04-01

    This report presents the findings of a group of database experts, sponsored by the Japanese Technology Evaluation Center (JTEC), based on an intensive study trip to Japan during March 1991. Academic, industrial, and governmental sites were visited. The primary findings are that Japan is supporting its academic research establishment poorly, that industry is making progress in key areas, and that both academic and industrial researchers are well aware of current domestic and foreign technology. Information sharing between industry and academia is effectively supported by governmental sponsorship of joint planning and review activities, and enhances technology transfer. In two key areas, multimedia and object-oriented databases, the authors can expect to see future export of Japanese database products, typically integrated into larger systems. Support for academic research is relatively modest. Nevertheless, the senior faculty are well-known and respected, and communicate frequently and in depth with each other, with government agencies, and with industry. In 1988 there were a total of 1,717 Ph.D.s in engineering and 881 in science. It appears that only about 30 of these were academic Ph.D.s in the basic computer sciences.

  20. A Cross-Case Analysis of Gender Issues in Desktop Virtual Reality Learning Environments

    Science.gov (United States)

    Ausburn, Lynna J.; Martens, Jon; Washington, Andre; Steele, Debra; Washburn, Earlene

    2009-01-01

    This study examined gender-related issues in using new desktop virtual reality (VR) technology as a learning tool in career and technical education (CTE). Using relevant literature, theory, and cross-case analysis of data and findings, the study compared and analyzed the outcomes of two recent studies conducted by a research team at Oklahoma State…

  1. Use of Signaling to Integrate Desktop Virtual Reality and Online Learning Management Systems

    Science.gov (United States)

    Dodd, Bucky J.; Antonenko, Pavlo D.

    2012-01-01

    Desktop virtual reality is an emerging educational technology that offers many potential benefits for learners in online learning contexts; however, a limited body of research is available that connects current multimedia learning techniques with these new forms of media. Because most formal online learning is delivered using learning management…

  2. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    Science.gov (United States)

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
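
    The hub role of ortholog information can be illustrated with a toy triple store: a gene in one organism reaches annotation attached to its ortholog in another organism via the shared cluster. All predicate and gene names below are invented for illustration, not actual OrthO or MBGD identifiers:

    ```python
    # Minimal in-memory set of (subject, predicate, object) triples, in the
    # spirit of the RDF setup described above. Names are invented.
    triples = {
        ("cluster:K001", "ortho:hasMember", "ecoli:geneA"),
        ("cluster:K001", "ortho:hasMember", "bsub:geneB"),
        ("ecoli:geneA", "go:annotation", "GO:0006260"),
    }

    def objects(subject, predicate):
        """All objects of triples matching (subject, predicate, ?)."""
        return {o for s, p, o in triples if s == subject and p == predicate}

    # Link a B. subtilis gene to GO terms via its ortholog cluster:
    members = objects("cluster:K001", "ortho:hasMember")
    go_terms = {t for gene in members for t in objects(gene, "go:annotation")}
    ```

    A SPARQL endpoint generalizes exactly this pattern matching to arbitrary user-specified queries over the full RDF graph.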

  3. High energy nuclear database: a test-bed for nuclear data information technology

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.A.; Vogt, R.; Beck, B.; Pruet, J. [Lawrence Livermore National Lab, Livermore, CA (United States); Vogt, R. [Davis Univ. of California, CA (United States)

    2008-07-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a test-bed for the further development of an object-oriented nuclear data format and database system. By using 'off-the-shelf' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a 'Grand Unified Nuclear Format' encapsulating data types used in the ENSDF, ENDF/B, EXFOR, NSR and other formats, including processed data formats. (authors)

  4. Calculation of Investments for the Distribution of GPON Technology in the village of Bishtazhin through database

    Directory of Open Access Journals (Sweden)

    MSc. Jusuf Qarkaxhija

    2013-12-01

    Full Text Available According to daily reports, the income from internet services is getting lower each year. Landline phone services are running at a loss, whereas mobile phone services are getting too mainstream, and the only bright spot keeping cable operators (ISPs) in positive balance is the income from broadband services (fast internet, IPTV). Broadband technology is a term that covers multiple methods of distributing information over the internet at high speed. Some of the broadband technologies are: optic fiber, coaxial cable, DSL, wireless, mobile broadband, and satellite connection. The ultimate goal of any broadband service provider is to deliver voice, data and video through a single network, the so-called triple play service. Internet distribution remains an important issue in Kosovo, particularly in rural zones. Considering the immense development of these technologies and the different alternatives available, the goal of this paper is to emphasize the necessity of forecasting such an investment and to share experience in this respect. Because the investment involves many factors related to population, geography and several technologies, and because these factors change continuously, the best way is to store all the data in a database and use this database for different results. The database lets us replace the previous manual calculations with an automatic calculation procedure. This way of working improves the workflow, providing all the tools needed to make the right decision about an Internet investment while considering all aspects of that investment.
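
    The kind of manual calculation that such a database automates can be sketched as follows; all cost figures and parameters are hypothetical, not the paper's data for Bishtazhin:

    ```python
    # Hypothetical unit costs for a GPON roll-out (illustrative values only).
    COST_PER_KM_FIBER = 1200.0   # trenching + cable, per km
    COST_PER_SPLITTER = 150.0    # passive optical splitter
    COST_PER_ONT = 40.0          # subscriber-side optical terminal

    def gpon_investment(route_km, households, take_rate, splitter_ratio=32):
        """Total investment for covering a village with GPON."""
        subscribers = round(households * take_rate)
        splitters = -(-subscribers // splitter_ratio)  # ceiling division
        return (route_km * COST_PER_KM_FIBER
                + splitters * COST_PER_SPLITTER
                + subscribers * COST_PER_ONT)

    total = gpon_investment(route_km=8, households=400, take_rate=0.5)
    ```

    With the population, geography and cost factors stored as database rows, re-running this calculation whenever a factor changes is exactly the "automatic procedure" the paper advocates.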

  5. Rhinoplasty perioperative database using a personal digital assistant.

    Science.gov (United States)

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  6. Value of databases other than medline for rapid health technology assessments.

    Science.gov (United States)

    Lorenzetti, Diane L; Topfer, Leigh-Ann; Dennett, Liz; Clement, Fiona

    2014-04-01

    The objective of this study was to explore the degree to which databases other than MEDLINE contribute studies relevant for inclusion in rapid health technology assessments (HTA). We determined the extent to which the clinical, economic, and social studies included in twenty-one full and four rapid HTAs published by three Canadian HTA agencies from 2007 to 2012 were indexed in MEDLINE. Other electronic databases, including EMBASE, were then searched, in sequence, to assess whether or not they indexed studies not found in MEDLINE. Assessment topics ranged from purely clinical (e.g., drug-eluting stents) to those with broader social implications (e.g., spousal violence). MEDLINE contributed the majority of studies in all but two HTA reports, indexing a mean of 89.6 percent of clinical studies across all HTAs, and 88.3 percent of all clinical, economic, and social studies in twenty-four of twenty-five HTAs. While EMBASE contributed unique studies to twenty-two of twenty-five HTAs, three rapid HTAs did not include any EMBASE studies. In some instances, PsycINFO and CINAHL contributed as many, if not more, non-MEDLINE studies than EMBASE. Our findings highlight the importance of assessing the topic-specific relative value of including EMBASE, or more specialized databases, in HTA search protocols. Although MEDLINE continues to be a key resource for HTAs, the time and resource limitations inherent in the production of rapid HTAs require that researchers carefully consider the value and limitations of other information sources to identify relevant studies.

  7. GREEN SUPERCOMPUTING IN A DESKTOP BOX

    Energy Technology Data Exchange (ETDEWEB)

    HSU, CHUNG-HSING [Los Alamos National Laboratory; FENG, WU-CHUN [NON LANL; CHING, AVERY [NON LANL

    2007-01-17

    The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured, with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren - a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, when coupled with the ever-increasing performance gap between the PC and cluster supercomputer, provide the motivation for a 'green' desktop supercomputer - a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, they present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than their reference SMP platform.
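
    The quoted figures work out as follows; the reference SMP platform's efficiency is not given in this excerpt, so the 18 Mflops/W used for comparison is an assumption chosen only to illustrate the "over 300% better" claim:

    ```python
    # Performance-power ratio from the figures quoted above: 14 Gflops at 185 W.
    desktop_flops_per_watt = 14e9 / 185          # roughly 75.7 Mflops/W

    # Assumed reference-platform efficiency (not stated in the abstract):
    reference_flops_per_watt = 18e6
    ratio = desktop_flops_per_watt / reference_flops_per_watt
    ```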

  8. Migrating to the Cloud IT Application, Database, and Infrastructure Innovation and Consolidation

    CERN Document Server

    Laszewski, Tom

    2011-01-01

    Whether your company is planning on database migration, desktop application migration, or has IT infrastructure consolidation projects, this book gives you all the resources you'll need. It gives you recommendations on tools, strategy and best practices and serves as a guide as you plan, determine effort and budget, design, execute and roll your modern Oracle system out to production. Focusing on Oracle grid relational database technology and Oracle Fusion Middleware as the target cloud-based architecture, your company can gain organizational efficiency, agility, increase innovation and reduce

  9. Research on Construction of Road Network Database Based on Video Retrieval Technology

    Directory of Open Access Journals (Sweden)

    Wang Fengling

    2017-01-01

    Full Text Available Based on the characteristics of video databases, the basic structure of a video database, and several typical video data models, a segmentation-based multi-level data model is used to describe a landscape-information video database, together with a network database model and a road-network management database system. The detailed design and implementation of the landscape information management system are then described.

  10. The application of database technologies to the study of terrorism and counter-terrorism : a post 9/11 analysis

    OpenAIRE

    Bowie, Neil Gordon

    2012-01-01

    Data and information of the highest quality are critical to understanding and countering acts of terrorism. As a tool, database technologies are becoming integral to the field of terrorism studies. The intelligence failings of September 11th 2001 illustrate the need for timely, relevant and accurate data, derived from a plethora of complex intelligence sources. This thesis will argue that, at least until 9/11, the academic study of terrorism and counter-terrorism databases h...

  11. Search and Graph Database Technologies for Biomedical Semantic Indexing: Experimental Analysis.

    Science.gov (United States)

    Segura Bedmar, Isabel; Martínez, Paloma; Carruana Martín, Adrián

    2017-12-01

    Biomedical semantic indexing is a very useful support tool for human curators in their efforts for indexing and cataloging the biomedical literature. The aim of this study was to describe a system to automatically assign Medical Subject Headings (MeSH) to biomedical articles from MEDLINE. Our approach relies on the assumption that similar documents should be classified by similar MeSH terms. Although previous work has already exploited document similarity by using a k-nearest neighbors algorithm, we represent documents as document vectors by search engine indexing and then compute the similarity between documents using cosine similarity. Once the most similar documents for a given input document are retrieved, we rank their MeSH terms to choose the most suitable set for the input document. To do this, we define a scoring function that takes into account the frequency of the term in the set of retrieved documents and the similarity between the input document and each retrieved document. In addition, we implement guidelines proposed by human curators to annotate MEDLINE articles; in particular, the heuristic that says if 3 MeSH terms are proposed to classify an article and they share the same ancestor, they should be replaced by this ancestor. The representation of the MeSH thesaurus as a graph database allows us to employ graph search algorithms to quickly and easily capture hierarchical relationships such as the lowest common ancestor between terms. Our experiments show promising results with an F1 of 69% on the test dataset. To the best of our knowledge, this is the first work that combines search and graph database technologies for the task of biomedical semantic indexing. Due to its horizontal scalability, ElasticSearch becomes a real solution to index large collections of documents (such as the bibliographic database MEDLINE). Moreover, the use of graph search algorithms for accessing MeSH information could provide a support tool for cataloging MEDLINE
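
    The neighbor-based scoring described above (each MeSH term accumulates the cosine similarity of every retrieved document that proposes it) can be sketched as follows; the vectors and MeSH assignments are toy examples, not the paper's weighting scheme:

    ```python
    import math
    from collections import Counter

    def cosine(a, b):
        """Cosine similarity between two sparse term-frequency vectors."""
        dot = sum(a[t] * b.get(t, 0) for t in a)
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def score_mesh_terms(input_vec, neighbors):
        """neighbors: list of (term_frequency_vector, [MeSH terms])."""
        scores = Counter()
        for vec, mesh_terms in neighbors:
            sim = cosine(input_vec, vec)
            for term in mesh_terms:
                scores[term] += sim
        return scores

    doc = {"glioma": 2, "mri": 1}
    neighbors = [
        ({"glioma": 1, "mri": 1}, ["Glioma", "Magnetic Resonance Imaging"]),
        ({"surgery": 3}, ["General Surgery"]),
    ]
    ranked = score_mesh_terms(doc, neighbors).most_common()
    ```

    A cutoff on the ranked list, followed by the shared-ancestor heuristic over the MeSH graph, would then yield the final term set.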

  12. Using of Gis Technology For Arctic Geophysical, Geological and Magmatic Rock Database

    Science.gov (United States)

    Ryakhovsky, V. M.; Mironov, Yu. V.; Pustovoy, A. A.

    The software technology is developed on the basis of Oracle ODBC and unites hardware, software, and multi-aspect subject-oriented databases on geology and geophysics, including data on the composition of magmatic rocks, for purposes of geodynamic and metallogenic analysis. The technology provides users with the opportunity to form attributive tables through arbitrary queries with cross-indices on various types of objects. When the data in the tables have geographic coordinates, they can be adapted to a wide spectrum of specialized digitized maps using ArcView. Besides that, the tables can be used in the environment of popular processing software such as MS Excel, MS Access, Surfer, etc. On the basis of the developed technology, a GIS structure chart is created for the multi-purpose processing of huge data files containing multi-aspect geological information. Users get the opportunity to model objects and situations; the dialog language is quasi-natural; consulting on specific and restricted problems is possible. Such a multi-contour system is able, at the analytical level, to reconcile different informational models with reference ones, which significantly increases the efficiency of scientific research as a whole. One of the important results of the software technology is the revealing of a specific Arctic isotope province, which includes the spreading ridges of the North Atlantic, the Norwegian-Greenland Sea and the Arctic Ocean, Iceland and Jan Mayen Island, the Iceland-Faeroe Rise, and also the traps of Norway, Britain, and Greenland. MORB and the island rocks of this province are analogous, in the ratios of most Sr, Nd, and Pb isotopes, to basalts of the well-known Southern Hemisphere DUPAL anomaly, but by their 207Pb/204Pb and 206Pb/204Pb ratios they correspond to normal MORB. This specificity is connected with the admixture of the special component ARCTIC.
This component represents one of the end-components of trends, which are formed by compositions of continental

  13. Analytical Hierarchy Process for the selection of strategic alternatives for introduction of infrastructure virtual desktop infrastructure in the university

    Directory of Open Access Journals (Sweden)

    Katerina A. Makoviy

    2017-12-01

    Full Text Available The task of choosing a strategy for implementing a virtual desktop infrastructure within the IT infrastructure of a university is considered. The infrastructure of virtual desktops is a technology that centralizes the management of client workplaces and increases the service life of computers in classrooms. The strengths, weaknesses, threats and opportunities of introducing virtualization in the university are analyzed. Alternatives for implementation are developed on the basis of a pilot project. To obtain quantitative estimates in the SWOT analysis of the pilot project, the analytical hierarchy process is used. Experts analyze the implementation of the pilot project, and an integral value of the quantitative estimates of the various alternatives is generated. The combination of the analytical hierarchy process and SWOT analysis allows you to choose the optimal strategy for implementing desktop virtualization.
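
    The analytical hierarchy process step can be sketched with the common geometric-mean approximation for deriving priority weights from a pairwise-comparison matrix; the matrix below is invented for illustration, not the paper's expert judgments:

    ```python
    from math import prod

    def ahp_weights(matrix):
        """Priority weights from a reciprocal pairwise-comparison matrix,
        via the geometric-mean approximation to the principal eigenvector."""
        n = len(matrix)
        geo_means = [prod(row) ** (1.0 / n) for row in matrix]
        total = sum(geo_means)
        return [g / total for g in geo_means]

    # Three hypothetical implementation alternatives compared on Saaty's
    # 1-9 scale (entry [i][j] = how much alternative i is preferred over j):
    pairwise = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ]
    weights = ahp_weights(pairwise)
    ```

    In the paper's setting, the SWOT factors supply the comparison judgments and the resulting weights rank the deployment alternatives.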

  14. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  15. Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.

    Science.gov (United States)

    Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles

    2017-04-01

    The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology, and this paper describes key features such as efficient management of the network data, examples of querying the network for particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into the SBML and SIF formats in order to facilitate further exploration, enhancement, or sharing of results. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The Java code files developed for this work are available from the following URL: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online.
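
The JSON-to-SIF conversion the parser performs can be sketched as follows; the record fields and metabolite names here are illustrative assumptions, not taken from the actual Java component:

```python
import json

def json_result_to_sif(json_text):
    """Convert a hypothetical JSON query result -- a list of
    {"source", "interaction", "target"} records -- into SIF lines
    of the form: source <TAB> interaction <TAB> target."""
    records = json.loads(json_text)
    return "\n".join(
        f'{r["source"]}\t{r["interaction"]}\t{r["target"]}' for r in records
    )

# Toy result mimicking a subnetwork query (names are invented).
result = json.dumps([
    {"source": "glucose", "interaction": "substrate_of", "target": "hexokinase"},
    {"source": "hexokinase", "interaction": "produces", "target": "g6p"},
])
print(json_result_to_sif(result))
```

SIF's one-interaction-per-line layout is what makes the exported subnetworks easy to load into other network tools.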

  16. Mars Propellant Liquefaction Modeling in Thermal Desktop

    Science.gov (United States)

    Desai, Pooja; Hauser, Dan; Sutherlin, Steven

    2017-01-01

    NASA's current Mars architectures assume the production and storage of 23 tons of liquid oxygen on the surface of Mars over a duration of 500+ days. In order to do this in a mass-efficient manner, an energy-efficient refrigeration system will be required. Based on previous analysis, NASA has decided to perform all liquefaction in the propulsion vehicle's storage tanks. To allow for transient Martian environmental effects, a propellant liquefaction and storage system for a Mars Ascent Vehicle (MAV) was modeled using Thermal Desktop. The model consisted of a propellant tank containing a broad area cooling loop heat exchanger integrated with a reverse turbo Brayton cryocooler. Cryocooler sizing and performance modeling was conducted using MAV diurnal heat loads and radiator rejection temperatures predicted by a previous thermal model of the MAV. A system was also sized and modeled using an alternative heat rejection system that relies on a forced convection heat exchanger. Cryocooler mass, input power, and heat rejection for both systems were estimated and compared against sizing based on non-transient estimates.

  17. A Five-Year Hedonic Price Breakdown for Desktop Personal Computer Attributes in Brazil

    Directory of Open Access Journals (Sweden)

    Nuno Manoel Martins Dias Fouto

    2009-07-01

    Full Text Available The purpose of this article is to identify the attributes that discriminate the prices of desktop personal computers. We employ the hedonic price method in evaluating such characteristics. This approach allows market prices to be expressed as a function of the set of attributes present in the products and services offered. Prices and characteristics of up to 3,779 desktop personal computers offered in the IT pages of one of the main Brazilian newspapers were collected from January 2003 to December 2007. Several specifications for the hedonic (multivariate linear) regression were tested. In this particular study, the main attributes were found to be hard drive capacity, screen technology, main board brand, random access memory size, microprocessor brand, video board memory, digital video and compact disc recording devices, screen size, and microprocessor speed. These results highlight the novel contribution of this study: the manner and means in which hedonic price indexes may be estimated in Brazil.
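
A hedonic regression of this kind can be sketched with ordinary least squares: price is regressed on product attributes, and the fitted coefficients are interpreted as the implicit price of each attribute. The data below are synthetic and the attribute set is simplified for illustration; this is not the study's Brazilian dataset:

```python
import numpy as np

# Synthetic data: price explained by RAM (GB), HDD (GB), CPU speed (GHz).
rng = np.random.default_rng(0)
n = 200
ram = rng.integers(1, 9, n).astype(float)
hdd = rng.integers(80, 501, n).astype(float)
ghz = rng.uniform(1.5, 3.5, n)
# "True" implicit prices used to generate the data, plus noise.
price = 300 + 120 * ram + 0.8 * hdd + 200 * ghz + rng.normal(0, 50, n)

# Hedonic specification: price = b0 + b1*ram + b2*hdd + b3*ghz
X = np.column_stack([np.ones(n), ram, hdd, ghz])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta)  # estimated implicit price of each attribute
```

With real data, a semi-log or log-log specification is often preferred, which is why the study tested several functional forms.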

  18. Nuclear Plant Analyzer desktop workstation: An integrated interactive simulation, visualization and analysis tool

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1991-01-01

    The advanced, best-estimate, reactor thermal-hydraulic codes were originally developed as mainframe computer applications because of speed, precision, memory and mass storage requirements. However, the productivity of numerical reactor safety analysts has historically been hampered by mainframe dependence due to limited mainframe CPU allocation, accessibility and availability, poor mainframe job throughput, and delays in obtaining and difficulty comprehending printed numerical results. The Nuclear Plant Analyzer (NPA) was originally developed as a mainframe computer-graphics aid for reactor safety analysts in addressing the latter consideration. Rapid advances in microcomputer technology have since enabled the installation and execution of these reactor safety codes on desktop computers thereby eliminating mainframe dependence. The need for a complementary desktop graphics display generation and presentation capability, coupled with the need for software standardization and portability, has motivated the redesign of the NPA as a UNIX/X-Windows application suitable for both mainframe and microcomputer

  19. Training Capabilities of Wearable and Desktop Simulator Interfaces

    Science.gov (United States)

    2011-11-01

    Intrinsic Motivation Inventory) than the control group, with no differences being found between the desktop and wearable groups.  The Desktop and...associated with the use of the simulator for training, with each of these scales ranging from 0 - 500. Intrinsic Motivation Inventory (IMI). The IMI...subscale of the Presence Questionnaire (r = .350, p = .001) as well as the Perceived Competence subscale of the Intrinsic Motivation Inventory (r

  20. Microsoft Virtualization Master Microsoft Server, Desktop, Application, and Presentation Virtualization

    CERN Document Server

    Olzak, Thomas; Boomer, Jason; Keefer, Robert M

    2010-01-01

    Microsoft Virtualization helps you understand and implement the latest virtualization strategies available with Microsoft products. This book focuses on: Server Virtualization, Desktop Virtualization, Application Virtualization, and Presentation Virtualization. Whether you are managing Hyper-V, implementing desktop virtualization, or even migrating virtual machines, this book is packed with coverage of all aspects of these processes. Written by a talented team of Microsoft MVPs, Microsoft Virtualization is the leading resource for a full installation, migration, or integration of virtual systems

  1. Desktop system for accounting, audit, and research in A&E.

    Science.gov (United States)

    Taylor, C J; Brain, S G; Bull, F; Crosby, A C; Ferguson, D G

    1997-01-01

    The development of a database for audit, research, and accounting in accident and emergency (A&E) is described. The system uses a desktop computer, an optical scanner, sophisticated optical mark reader software, and workload management data. The system is highly flexible, easy to use, and, at a cost of around £16,000, affordable for larger departments wishing to move towards accounting. For smaller departments, it may be an alternative to full computerisation. PMID:9132200

  2. Post-Caesarean Section Surgical Site Infection Surveillance Using an Online Database and Mobile Phone Technology.

    Science.gov (United States)

    Castillo, Eliana; McIsaac, Corrine; MacDougall, Bhreagh; Wilson, Douglas; Kohr, Rosemary

    2017-08-01

    Obstetric surgical site infections (SSIs) are common and expensive to the health care system but remain underreported given shorter postoperative hospital stays and suboptimal post-discharge surveillance systems. SSIs, for the purpose of this paper, are defined according to the Centers for Disease Control and Prevention (1999) as infection occurring within 30 days of the operative procedure (in this case, Caesarean section [CS]). Demonstrate the feasibility of real-life use of a patient-driven SSI post-discharge surveillance system consisting of an online database and mobile phone technology (surgical mobile app - how2trak) among women undergoing CS in a Canadian urban centre. Estimate the rate of SSIs and associated predisposing factors. Prospective cohort of consecutive women delivering by CS at one urban Canadian hospital. Using the surgical mobile app how2trak, predetermined demographics, comorbidities, procedure characteristics, and self-reported symptoms and signs of infection were collected and linked to patients' incision self-portraits (photos) on postpartum days 3, 7, 10, and 30. A total of 105 patients were enrolled over a 5-month period. Mean age was 31 years, 13% were diabetic, and most were at low risk of surgical complications. Forty-six percent of surgeries were emergency CSs, and 104/105 received antibiotic prophylaxis. Forty-five percent of patients (47/105) submitted at least one photo, and among those, one surgical site infection was detected by photo appearance and self-reported symptoms by postpartum day 10. The majority of patients who uploaded photos did so multiple times, and 43% of them submitted photos up to day 30. Patients with either a diagnosis of diabetes or self-reported Asian ethnicity were less likely to submit photos. Post-discharge surveillance for CS-related SSIs using the surgical mobile app how2trak is feasible and deserves further study in the post-discharge setting. Copyright © 2017. Published by Elsevier Inc.

  3. Increasing Open Source Software Integration on the Department of Defense Unclassified Desktop

    Science.gov (United States)

    2008-06-01

    Figure 3. Windows Remote Desktop from Linux................................................ 49 Figure 4. Citrix XenDesktop Virtualization...released XenDesktop product from Citrix is one such example (Figure 4). 50 Figure 4. Citrix XenDesktop Virtualization Source: http...described in Chapter IV are currently used to satisfy the needs of these COI’s, namely remote desktop computing via terminal servers and Citrix -based

  4. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    Science.gov (United States)

    Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-03-01

    3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become tools for technological advancement. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being expensive but capable of producing excellent-quality products and the latter being low-cost with moderate-quality results. However, there is more room for improvement and enhancement in desktop systems than in industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. Communication between the two extruders has been established by making use of the In-Chip Serial Programming port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electric paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer can provide a new prospect for producing multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.

  5. Life cycle assessment study of a Chinese desktop personal computer

    International Nuclear Information System (INIS)

    Duan Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li Jinhui

    2009-01-01

    Associated with the tremendous prosperity of the world's electronic information and telecommunication industry, there is increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products at a global level. A desktop personal computer system was selected for a detailed and modular LCA which follows the ISO 14040 series. The LCA was constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used to estimate the influence of the choice of assessment method on the result. Life cycle inventory information was compiled from the ecoinvent 1.3 databases, combined with literature and field investigations of the present Chinese situation. The LCA study shows that the manufacturing and use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, its impact is due to the way the electricity is produced. The final process steps, i.e. the end-of-life phase, lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study

  6. Basic survey for promoting energy efficiency in developing countries. Database development project directory of energy conservation technology in Japan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    In order to promote energy conservation in developing countries, the gist of Japanese energy-saving technologies was edited into a database. The Asian region is expected to see remarkable economic development and increased energy consumption, including consumption of fossil fuels. Therefore, this database project has urgent importance for the Asian countries. New and wide-ranging discussions were held to revise the 1995 edition. The committee was composed of members from high-energy-consuming areas such as the iron and steel, paper and pulp, chemical, oil refining, cement, electric power, machinery, electric devices, and industrial machinery industries. Technical literature and reports were consulted, and opinions were heard from specialists and committee members representing the respective areas. In order to reflect the current status and particular conditions in specific industrial areas, additions were made under the assistance and guidance of the specialists. The energy-saving technologies recorded in the database may be called small- to medium-scale technologies, with the target placed on saving energy by 10% or more. Small-scale energy-saving technologies were omitted. Flow charts for manufacturing processes were also added. (NEDO)

  7. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  8. Indiana Humanities Council Request for the Indianapolis Energy Conversion Inst. For Phase I of the Indianapolis Energy Conservation Res Initiative also called the smartDESKTOP Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Keller, John B.

    2007-12-06

    The smartDESKTOP Initiative at the Indiana Humanities Council received critical support in building and delivering a digital desktop for Indiana educators through the Department of Energy Grant DE-FG02-06ER64282. During the project period September 2006 through October of 2007, the number of Indiana educators with accounts on the smartDESKTOP more than tripled from under 2,000 to more than 7,000 accounts. An external review of the project conducted for the purposes of understanding the impact of the service in Indiana schools revealed that the majority of respondents felt that using the smartDESKTOP did reduce the time they spent managing paper. The same study revealed the challenges of implementing a digital desktop meant to help teachers leverage technology to improve their teaching and ultimately student learning. The most significant outcome of this project is that the Indiana Department of Education expressed interest in assuming responsibility for sustaining this project. The transition of the smartDESKTOP to the Indiana Department of Education was effective on November 1, 2007.

  9. Investigation of an artificial intelligence technology--Model trees. Novel applications for an immediate release tablet formulation database.

    Science.gov (United States)

    Shao, Q; Rowe, R C; York, P

    2007-06-01

    This study has investigated an artificial intelligence technology, model trees, as a modelling tool applied to an immediate release tablet formulation database. The modelling performance was compared with artificial neural networks, which are well established and widely applied in pharmaceutical product formulation. The predictive ability of the generated models was validated on unseen data and judged by the correlation coefficient R². Output from the model tree analyses produced multivariate linear equations which predicted tablet tensile strength, disintegration time, and drug dissolution profiles with quality similar to that of neural network models. However, additional and valuable knowledge hidden in the formulation database was extracted from these equations. It is concluded that, as a transparent technology, model trees are useful tools for formulators.
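
The idea of a model tree (partitioning the input space and fitting a linear equation in each leaf) can be sketched as follows; this toy single-split, single-feature version is for illustration only, not the algorithm used in the study:

```python
import numpy as np

def fit_leaf(x, y):
    """Least-squares linear fit y ~ a + b*x; returns coefficients and SSE."""
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return coef, float(resid @ resid)

def fit_model_tree(x, y):
    """One-split model tree: pick the split that minimises the summed
    squared error of the two per-leaf linear fits."""
    best = None
    for s in np.unique(x)[1:]:
        left, right = x < s, x >= s
        cl, el = fit_leaf(x[left], y[left])
        cr, er = fit_leaf(x[right], y[right])
        if best is None or el + er < best[0]:
            best = (el + er, s, cl, cr)
    return best[1:]  # (split point, left leaf model, right leaf model)

# Piecewise-linear target: the tree should recover the breakpoint.
x = np.linspace(0, 10, 100)
y = np.where(x < 5, 2 * x + 1, -x + 16)
split, left_model, right_model = fit_model_tree(x, y)
print(split, left_model, right_model)
```

The per-leaf equations are exactly the kind of transparent multivariate linear output the abstract credits with revealing extra knowledge about the formulation space.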

  10. Bibliometric analysis of Spanish scientific publications in the subject Construction & Building Technology in Web of Science database (1997-2008)

    OpenAIRE

    Rojas-Sola, J. I.; de San-Antonio-Gómez, C.

    2010-01-01

    In this paper, the publications from Spanish institutions listed in the journals of the Construction & Building Technology subject of the Web of Science database for the period 1997-2008 are analyzed. The number of journals in which they were published is 35, and the number of articles was 760 (Article or Review). A bibliometric assessment has also been done, and we propose two new parameters: Weighted Impact Factor and Relative Impact Factor; the assessment also includes the number of citations and the number of documents ...

  11. Application of SIG and OLAP technologies on IBGE databases as a decision support tool for the county administration

    Directory of Open Access Journals (Sweden)

    REGO, E. A.

    2008-06-01

    Full Text Available This paper shows the development of a Decision Support System for any Brazilian county, built at no cost to the user. To do so, it combines data warehouse, OLAP, and GIS technologies with IBGE's database to give the user a query-building tool, showing the results as maps and/or tables in a very simple and efficient way.

  12. A more complete library on your desktop

    CERN Multimedia

    2003-01-01

    The CERN library announces two new services: a complete database on standards containing the description of 400,000 standards, and a collection of scientific journals with more than three million articles. These include historical papers, some of them dating from the end of the 19th century.

  13. Desktop Computing - Distributed Cognition in a Tax Office

    Directory of Open Access Journals (Sweden)

    Martin Nielsen

    2004-05-01

    Full Text Available Based on a detailed study of the use of representations in a tax assessment process, this paper presents an analysis of the use of the physical desktop and of paper documents, files, and electronic information. This analysis challenges the ways in which the computer desktop is normally designed and used, and we present a number of challenges to user interface design. Taking these seriously means revisiting several taken-for-granted elements of the current WIMP regime: the randomly overlapping windows on a non-structured background; the lack of traces of time and past location; and the individualised and non-activity-oriented set-up of the desktop.

  14. Perception Analysis of Desktop and Mobile Service Website

    Directory of Open Access Journals (Sweden)

    Rizqiyatul Khoiriyah

    2016-12-01

    Full Text Available The research was conducted as a qualitative study of websites, to explore and examine in depth users' perceptions of desktop and mobile website services. This research reviewed user perceptions of desktop and mobile website services using qualitative methods adapted from the WebQual and User Experience approaches. This qualitative research referred to the theoretical framework written by Creswell (2014). The expected outcome is to know users' perceptions of the services and information available on the website, along with possible desktop-mobile gaps arising from differences between the two services. These results can be used as a service model for the user experience of the website.

  15. Distributed multimedia database technologies supported by MPEG-7 and MPEG-21

    CERN Document Server

    Kosch, Harald

    2003-01-01

    Contents: Introduction (Multimedia Content: Context; Multimedia Systems and Databases; (Multi)Media Data and Multimedia Metadata; Purpose and Organization of the Book); MPEG-7: The Multimedia Content Description Standard (Introduction; MPEG-7 and Multimedia Database Systems; Principles for Creating MPEG-7 Documents; MPEG-7 Description Definition Language; Step-by-Step Approach for Creating an MPEG-7 Document; Extending the Description Schema of MPEG-7; Encoding and Decoding of MPEG-7 Documents for Delivery - Binary Format for MPEG-7; Audio Part of MPEG-7; MPEG-7 Supporting Tools and Referen

  16. New Desktop Virtual Reality Technology in Technical Education

    Science.gov (United States)

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  17. New generation of 3D desktop computer interfaces

    Science.gov (United States)

    Skerjanc, Robert; Pastoor, Siegmund

    1997-05-01

    Today's computer interfaces use 2-D displays showing windows, icons, and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs, and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can (graphically) connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object-oriented programming for tasks ranging from, e.g., low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest using off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and connected video-based interaction techniques which allow viewpoint-dependent imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick 3-D objects just by looking at them).

  18. Design of All Digital Flight Program Training Desktop Application System

    Directory of Open Access Journals (Sweden)

    Li Yu

    2017-01-01

    Full Text Available The all-digital flight program training desktop application system has simple operating requirements. It can closely integrate aircrew theory learning with operational training, improving training efficiency and effectiveness. This paper studies the application field and design requirements of flight program training systems. Based on a WINDOWS desktop application, the design idea and system architecture of the all-digital flight program training system are put forward. Flight characteristics, key airborne systems, and the aircraft cockpit are simulated. Finally, by comparing it with a flight training simulator and a specific script-based program training system, the characteristics and advantages of the training system are analyzed.

  19. An Introduction to Version Control Using GitHub Desktop

    Directory of Open Access Journals (Sweden)

    Daniel van Strien

    2016-06-01

    Full Text Available In this lesson you will be introduced to the basics of version control, understand why it is useful, and implement basic version control for a plain text document using GitHub Desktop. By the end of this lesson you should understand: * what version control is and why it can be useful * the differences between Git and GitHub * how to implement version control using 'GitHub Desktop,' a Graphical User Interface for GitHub * what other resources will help you implement version control in your academic writing

  20. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    Science.gov (United States)

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.
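
The integration of the nonspatial attribute database with the spatial trace database described above amounts to a join on a shared structure identifier. A minimal sketch with invented field names and coordinates (not actual Colorado Geological Survey data):

```python
# Nonspatial records: structure type, age, slip rate (illustrative).
attributes = {
    "F001": {"type": "normal fault", "age": "Quaternary", "slip_rate_mm_yr": 0.2},
    "F002": {"type": "anticline", "age": "late Cenozoic", "slip_rate_mm_yr": None},
}
# Spatial records: digitized trace vertices as (lon, lat) pairs.
traces = {
    "F001": [(-105.1, 39.7), (-105.0, 39.8)],
    "F002": [(-104.9, 38.9), (-104.8, 39.0), (-104.7, 39.1)],
}

def integrate(attributes, traces):
    """Merge the two databases into one record per structure ID."""
    return {
        fid: {**attrs,
              "trace": traces.get(fid, []),
              "n_vertices": len(traces.get(fid, []))}
        for fid, attrs in attributes.items()
    }

merged = integrate(attributes, traces)
print(merged["F001"]["n_vertices"])  # -> 2
```

Keeping the join keyed on a stable identifier is what lets either side be updated continuously and republished as a single dataset.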

  1. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1999-05-01

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  2. Telecollaborative Desktop-Videoconferencing Exchange: The Case of Mark

    Science.gov (United States)

    Martin, Véronique

    2014-01-01

    This presentation is a case study of the Intercultural Communicative Competence (ICC) development of Mark, one of ten American students engaged in a desktop-videoconferencing telecollaborative exchange with a class of French students. Due in part to its inherent complexity, this context has not been widely researched. To observe ICC development, I…

  3. Perancangan Sistem Otomatis Update Pada Aplikasi Desktop Abios

    Directory of Open Access Journals (Sweden)

    Karto Iskandar

    2010-12-01

    Full Text Available Unlike web applications, which are easy to update to the latest version, desktop applications are more difficult to update and must involve the user in doing so. This is because a desktop application is installed on the user's computer. The purpose of this research is to design an automatic update system for a desktop application, with an example case: the Binus International Operational Support (ABIOS) application. This research used a literature study and system design. In desktop applications, the latest updates are often unknown to the user, which can sometimes be fatal and disrupt business operations. Generally, the developer informs users of version changes so that they can update the application. Application updates should be performed automatically by the system, not manually by users, since users do not always have a computing background. From the research, it can be concluded that the automatic update system benefits users by providing information about the latest version and by helping to update the application automatically. Further development of this system is expected to target multiple platforms and/or mobile applications.
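
The core of such an automatic update scheme is a client-side version check at start-up: compare the installed version against the latest version advertised by the server and decide whether to download. A minimal sketch in which the version strings and decision logic are illustrative assumptions, not ABIOS's actual design:

```python
def parse_version(v):
    """Turn a dotted version string like '1.4.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def update_needed(installed, latest):
    """True when the server advertises a strictly newer version."""
    return parse_version(latest) > parse_version(installed)

# In the real system `latest` would come from a server request made at
# application start-up; here it is hard-coded for illustration.
installed, latest = "1.4.2", "1.5.0"
if update_needed(installed, latest):
    print(f"Updating from {installed} to {latest}")  # then download and install
```

Tuple comparison handles multi-digit components correctly (e.g. 1.10.0 is newer than 1.9.9), which naive string comparison would get wrong.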

  4. An Exercise in Desktop Publishing: Using the "Newsroom."

    Science.gov (United States)

    Kiteka, Sebastian F.

    This guide provides a description and step-by-step instructions for the use of "Newsroom," a desktop-publishing program for the Apple II series of microcomputers produced by Springboard Software Inc. Based on the 1984 version of the program, this two-hour exercise focuses on the design and production of a newsletter with text and…

  5. Versatile Desktop Experiment Module (DEMo) on Heat Transfer

    Science.gov (United States)

    Minerick, Adrienne R.

    2010-01-01

    This paper outlines a new Desktop Experiment Module (DEMo) engineered for a chemical engineering junior-level Heat Transfer course. This new DEMo learning tool is versatile, fairly inexpensive, and portable such that it can be positioned on student desks throughout a classroom. The DEMo system can illustrate conduction of various materials,…

  6. Laevo: A Temporal Desktop Interface for Integrated Knowledge Work

    DEFF Research Database (Denmark)

    Jeuris, Steven; Houben, Steven; Bardram, Jakob

    2014-01-01

    states and transitions of an activity. The life cycle is used to inform the design of Laevo, a temporal activity-centric desktop interface for personal knowledge work. Laevo allows users to structure work within dedicated workspaces, managed on a timeline. Through a centralized notification system which...

  7. Comparing Web Applications with Desktop Applications: An Empirical Study

    DEFF Research Database (Denmark)

    Pop, Paul

    2002-01-01

    of developing and using such applications. In this paper we present a comparison of web and desktop applications from the usability point of view. The comparison is based on an empirical study that investigates the performance of a group of users on two calendaring applications: Yahoo!Calendar and Microsoft...

  8. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  9. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    Science.gov (United States)

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles, and templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are fine-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  10. EPA Region 8, Memo on Desktop Printer Ink Cartridges Policy & Voluntary Printer Turn-in

    Science.gov (United States)

    This memo requests EPA Region 8 users to voluntarily turn-in their desktop printers and notifies users of the Region 8 policy to not provide maintenance or ink and toner cartridges for desktop printers.

  11. MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop

    Science.gov (United States)

MedlinePlus® Everywhere provides a consistent user experience from a desktop, tablet, or phone, for all users regardless of how they access the site.

  12. Science and Technology Text Mining: Origins of Database Tomography and Multi-Word Phrase Clustering

    Science.gov (United States)

    2003-08-15

Management of Engineering and Technology, October 27-31, 1991. Kostoff, R. N., "Research Impact Quantification," R&D Management, 24:3, July 1994. … "Analysis of the Research Impact Assessment Literature and the Journal of the American Chemical Society." DTIC Technical Report Number ADA… … Technology. 5:5. 24-26. June 2001. Kostoff, R. N., and Del Rio, J. A. "Physics Research Impact Assessment." Physics World. 14:6. 47-52. June…

  13. Use of Dynamic Technologies for Web-enabled Database Management Systems

    OpenAIRE

    Bogdanova, Galina; Todorov, Todor; Blagoev, Dimitar; Todorova, Mirena

    2007-01-01

    In this paper we consider two computer systems and the dynamic Web technologies they are using. Different contemporary dynamic web technologies are described in details and their advantages and disadvantages have been shown. Specific applications are developed, clinic and studying systems, and their programming models are described. Finally we implement these two applications in the students education process: Online studying has been tested in the Technical University – Va...

  14. Survey of the situation of technology succession. Databases of articles including in industrial technology museums; Gijutsu keisho jokyo chosa. Sangyo gijutsu hakubutsukan shuzohin D.B. hen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

To promote the preservation of the history of industrial science and technology and its creative use, lists and databases were compiled of the articles held by industrial technology museums and material halls in Japan. Recording, preserving, collecting and systematizing the history of industrial technology helps form the bases needed to promote future research and development and international contribution, and museums and material halls are the venues for such comprehensive and practical activities. The data were compiled as one of the basic databases, a first step toward examining the state of technology succession continuously and systematically over the long term. In the classification of the data, the energy field was divided into electric power, nuclear power, oil, coal, gas and energy in general. Other fields were metal/mining, electricity/electronics/communication, chemistry/food, shipbuilding/heavy machinery, printing/precision instruments, and textiles/spinning. The transport field was classified into railroads, automobiles/two-wheeled vehicles, aviation/space, and ships. Categories were also set for daily life, civil engineering/architecture, and general topics. The survey covered a total of 208 museums.

  15. Modern SQL and NoSQL database technologies for the ATLAS experiment

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2017-01-01

    Structured data storage technologies evolve very rapidly in the IT world. LHC experiments, and ATLAS in particular, try to select and use these technologies balancing the performance for a given set of use cases with the availability, ease of use and of getting support, and stability of the product. We definitely and definitively moved from the “one fits all” (or “all has to fit into one”) paradigm to choosing the best solution for each group of data and for the applications that use these data. This talk describes the solutions in use, or under study, for the ATLAS experiment and their selection process and performance.

  16. Modern SQL and NoSQL database technologies for the ATLAS experiment

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2017-01-01

    Structured data storage technologies evolve very rapidly in the IT world. LHC experiments, and ATLAS in particular, try to select and use these technologies balancing the performance for a given set of use cases with the availability, ease of use and of getting support, and stability of the product. We definitely and definitively moved from the “one fits all” (or “all has to fit into one”) paradigm to choosing the best solution for each group of data and for the applications that use these data. This paper describes the solutions in use, or under study, for the ATLAS experiment and their selection process and performance measurements.

  17. Development of prostate cancer research database with the clinical data warehouse technology for direct linkage with electronic medical record system.

    Science.gov (United States)

    Choi, In Young; Park, Seungho; Park, Bumjoon; Chung, Byung Ha; Kim, Choung-Soo; Lee, Hyun Moo; Byun, Seok-Soo; Lee, Ji Youl

    2013-01-01

Despite the increasing number of prostate cancer patients, little is known from nationwide data about the impact of treatments for prostate cancer and the outcomes of different treatments. To obtain more comprehensive information on Korean prostate cancer patients, many professionals have urged the creation of a national system to monitor the quality of prostate cancer care. To that end, the prostate cancer database system was planned while carefully accommodating different views from various professions. This prostate cancer research database system incorporates information about prostate cancer research including demographics, medical history, operation information, laboratory results, and quality-of-life surveys. It supports three different modes of clinical data collection to produce a comprehensive database: direct data extraction from the electronic medical record (EMR) system, manual data entry after linking EMR documents such as magnetic resonance imaging findings, and paper-based collection of patient surveys. We implemented clinical data warehouse technology to test the direct EMR link method with the St. Mary's Hospital system. Using this method, a total of 2,300 eligible patients were identified from 1997 to 2012; 538 of them underwent surgery and the others received different treatments. Our database system can provide the infrastructure for collecting error-free data to support various retrospective and prospective studies.

  18. Kajian Unified Theory of Acceptance and Use of Technology Dalam Penggunaan Open Source Software Database Management System

    Directory of Open Access Journals (Sweden)

    Michael Sonny

    2016-06-01

Full Text Available Computer software is developing at a remarkable pace today, and not only software released under particular licenses: open source software is developing as well. This development is welcome news for computer users, especially in education and among students, because it gives users several application choices. Open source software also offers products that are generally free, with the source code provided and the freedom to modify and extend them. Open source applications are highly diverse, ranging from programming tools (PHP, Gambas) and Database Management Systems (MySQL, SQLite) to browsers (Mozilla Firefox, Opera). This study examines the acceptance of DBMS applications such as MySQL and SQLite using UTAUT (Unified Theory of Acceptance and Use of Technology), a model developed by Venkatesh (2003). Certain factors, called moderating factors, also influence the learning of these open source applications and can affect effectiveness and efficiency. The results are intended to support smoother learning of these open-source-based applications. Keywords: open source, Database Management System (DBMS), moderating

  19. Development of an Expanded, High Reliability Cost and Performance Database for In Situ Remediation Technologies

    Science.gov (United States)

    2016-03-01

At many sites, restoring groundwater to a potentially usable source of drinking water is the ultimate goal, requiring that contaminant… Remediation Technologies, ER-201120, 9 May. Heron, G., S. Carroll, and S.G. Nielsen, 2005. "Full-Scale Removal of DNAPL Constituents Using Steam-Enhanced…

  20. Establishment of database and network for research of stream generator and state of the art technology review

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Bong; Hur, Nam Su; Moon, Seong In; Seo, Hyeong Won; Park, Bo Kyu; Park, Sung Ho; Kim, Hyung Geun [Sungkyunkwan Univ., Seoul (Korea, Republic of)

    2004-02-15

A significant number of steam generator tubes worldwide are defective and have been removed from service or repaired. This widespread damage has been caused by diverse degradation mechanisms, some of which are difficult to detect and predict. For domestic nuclear power plants as well, the growing number of operating plants and their increasing operating periods may lead to more steam generator tube failures, so it is important to carry out the integrity evaluation process to prevent steam generator tube damage. This research has two objectives. The first is to build a database for steam generator research at domestic research institutions; sharing data and information through a network organization will increase the efficiency and capability of limited domestic research resources, and will also improve the current integrity evaluation procedure, which is considerably conservative but could be made more reasonable. The second objective is to establish a standard integrity evaluation procedure for steam generator tubes by reviewing the state of the art. The research resources related to steam generator tubes are managed by the established web-based database system. The following topics are covered in this project: development of a web-based network for research on steam generator tubes, and a review of state-of-the-art technology.

  1. The Energy Science and Technology Database on a local library system: A case study at the Los Alamos National Research Library

    Energy Technology Data Exchange (ETDEWEB)

    Holtkamp, I.S.

    1994-10-01

This paper presents an overview of efforts at Los Alamos National Laboratory to acquire and mount the Energy Science and Technology Database (EDB) as a citation database on the Research Library's Geac Advance system. The rationale for undertaking this project and expected benefits are explained. Significant issues explored are loading non-USMARC records into a MARC-based library system, the use of EDB records to replace or supplement in-house cataloging of technical reports, the impact of different cataloging standards and database size on searching and retrieval, and how integrating an external database into the library's online catalog may affect staffing and workflow.

  2. ELSEVIER SCIENTIFIC JOURNALS AVAILABLE ON YOUR DESKTOP

    CERN Multimedia

    1999-01-01

    Elsevier Science Publishers have for decades distributed renowned journals in science and technology, which are now accessible on the Web through their Science Direct service. CERN has been granted a site licence trial period until the end of 1999.Included among the titles are: Astroparticle physics, Computer physics communications, Nuclear instruments and methods in physics research A and B, Nuclear physics A and B, Physics letters A and B, Physics reports, Surface science and Thin solid films.Links to the individual titles appear in our electronic journals list at:http://wwwas.cern.ch/library/electronic_journals/ejAH.htmlThe Library invites all readers to search and download articles of the journals currently subscribed to. You can also access the full Science Direct site at: http://www.sciencedirect.com/(Choose 'group-wide login' or, for a 'personal login' registration, please contact us)All questions and comments are welcome and can be addressed to: library.desk@cern.ch

  3. Cleanup of a HLW nuclear fuel-reprocessing center using 3-D database modeling technology

    International Nuclear Information System (INIS)

    Sauer, R.C.

    1992-01-01

A significant challenge in decommissioning any large nuclear facility is how to solidify the large volume of residual high-level radioactive waste (HLW) without structurally interfering with the existing equipment and piping of the original facility, and without rework caused by interferences not identified during the design process. The problem is compounded when the facility to be decommissioned is a 35-year-old nuclear fuel reprocessing center designed to recover usable uranium and plutonium. Facilities of this vintage usually lack full documentation of the design changes made over the years, so crude traps or pockets of high-level contamination may not be fully recognized. Any miscalculation in the construction or modification sequences could complicate the overall dismantling and decontamination of the facility. This paper reports that the development of a 3-dimensional (3-D) computer database tool was considered critical in defining the most complex portions of this one-of-a-kind vitrification facility.

  4. TOPCAT: Desktop Exploration of Tabular Data for Astronomy and Beyond

    Directory of Open Access Journals (Sweden)

    Mark Taylor

    2017-06-01

    Full Text Available TOPCAT, the Tool for OPerations on Catalogues And Tables, is an interactive desktop application for retrieval, analysis and manipulation of tabular data, offering a powerful and flexible range of interactive visualization options amongst other features. Its visualization capabilities focus on enabling interactive exploration of large static local tables—millions of rows and hundreds of columns can easily be handled on a standard desktop or laptop machine, and various options are provided for meaningful graphical representation of such large datasets. TOPCAT has been developed in the context of astronomy, but many of its features are equally applicable to other domains. The software, which is free and open source, is written in Java, and the underlying high-performance visualisation library is suitable for re-use in other applications.

  5. Systems Engineering Model and Training Application for Desktop Environment

    Science.gov (United States)

    May, Jeffrey T.

    2010-01-01

This simulator provides a graphical user interface for desktop training, operations and procedure development, and system reference. It allows engineers to train on and better understand the dynamics of their system from their local desktops, at a pace and skill level matched to the user's competency and from a perspective based on the user's needs. The simulator requires no special resources to execute and should generally be available for use. The interface presents the model of the system in the ways that best suit the user's application or training needs. The three levels of views are the Component View, the System View (overall system), and the Console View (monitor). These views are portals into a single model, so a change made to the model from one view, or from a model-manager graphical user interface, is reflected in all other views.

  6. Feasibility of Bioprinting with a Modified Desktop 3D Printer.

    Science.gov (United States)

    Goldstein, Todd A; Epstein, Casey J; Schwartz, John; Krush, Alex; Lagalante, Dan J; Mercadante, Kevin P; Zeltsman, David; Smith, Lee P; Grande, Daniel A

    2016-12-01

    Numerous studies have shown the capabilities of three-dimensional (3D) printing for use in the medical industry. At the time of this publication, basic home desktop 3D printer kits can cost as little as $300, whereas medical-specific 3D bioprinters can cost more than $300,000. The purpose of this study is to show how a commercially available desktop 3D printer could be modified to bioprint an engineered poly-l-lactic acid scaffold containing viable chondrocytes in a bioink. Our bioprinter was used to create a living 3D functional tissue-engineered cartilage scaffold. In this article, we detail the design, production, and calibration of this bioprinter. In addition, the bioprinted cells were tested for viability, proliferation, biochemistry, and gene expression; these tests showed that the cells survived the printing process, were able to continue dividing, and produce the extracellular matrix expected of chondrocytes.

  7. Application of desktop computers in nuclear engineering education

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options

  8. Research focus and trends in nuclear science and technology in Ghana: a bibliometric study based on the INIS database

    International Nuclear Information System (INIS)

    Agyeman, E. A.; Bilson, A.

    2015-01-01

The peaceful application of atomic energy was introduced into Ghana about fifty years ago. This is the first bibliometric study of nuclear science and technology research publications originating from Ghana and listed in the International Nuclear Information System (INIS) Database. The purpose was to use the simple document counting method to determine the geographical distribution, annual growth and the subject areas of the publications as well as communication channels, key journals and authorship trends. The main findings of the study were that, a greater number of the nuclear science and technology records listed in the Database were published in Ghana (598 or 56.57% against 459 or 43.43% published outside Ghana). There has been a steady growth in the number of publications over the years with the most productive year being 2012. The main focus of research has been in the area of applied life sciences, comprising plant cultivation & breeding, pest & disease control, food protection and preservation, human nutrition and animal husbandry; followed by chemistry; environmental sciences; radiation protection; nuclear reactors; physics; energy; and radiology and nuclear medicine. The area with the least number of publications was safeguards and physical protection. The main channel of communicating research results was peer reviewed journals and a greater number of the journal articles were published in Ghana followed by the United Kingdom, Hungary and the Netherlands. The core journals identified in this study were Journal of Applied Science and Technology; Journal of Radioanalytical and Nuclear Chemistry; Journal of the Ghana Science Association; Radiation Protection Dosimetry; Journal of the Kumasi University of Science and Technology; West African Journal of Applied Ecology; Ghana Journal of Science; Applied Radiation and Isotopes; Annals of Nuclear Energy, IOP Conference Series (Earth and Environmental Science) and Radiation Physics and Chemistry. Eighty percent…

  9. Virtual reality exposure therapy: 150-degree screen to desktop PC.

    Science.gov (United States)

    Tichon, Jennifer; Banks, Jasmine

    2006-08-01

Virtual reality exposure therapy (VRET) programs developed in immersive or semi-immersive virtual environments present a usability problem for practitioners. To meet practitioner requirements for lower cost and portability, VRET programs must often be ported to desktop environments such as the personal computer (PC). However, the success of VRET has been shown to be linked to presence: the environment's ability to evoke the same reactions and emotions as a real experience. It is generally accepted that high-end virtual environments (VEs) are more immersive than desktop PCs, but level of immersion does not always predict level of presence. This paper reports on how porting a therapeutic VR application for schizophrenia from its initial research environment, a semi-immersive curved screen, to a PC affected presence. Presence in the two environments is measured both introspectively and across a number of causal factors thought to underlie the experience of presence. Results show that the VR exposure program successfully made users feel they were "present" on both platforms. Although the desktop PC achieved higher scores on presence across causal factors, participants reported feeling more present in the curved-screen environment. The comparison of the two groups was statistically significant for the PQ but not for the IPQ; subjective reports of experiences in the environments should be considered in future research, as the success of VRET relies heavily on the emotional response of patients to the therapeutic program.

  10. DIaaS: Resource Management System for the Intra-Cloud with On-Premise Desktops

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2017-01-01

Full Text Available Infrastructure as a service with desktops (DIaaS), based on the extensible mark-up language (XML), is herein proposed to utilize surplus resources. DIaaS is a traditional surplus-resource integrated management technology, designed to provide fast work distribution and computing services based on user service requests, as well as storage services, through desktop-based distributed computing and storage resource integration. DIaaS includes a nondisruptive resource service and an auto-scalable scheme to enhance the availability and scalability of intra-cloud computing resources. A performance evaluation of the proposed scheme measured the clustering time for surplus resource utilization. The results showed improved computing and storage services in a cluster of at least two computers compared to the traditional method for high-availability measurement of nondisruptive services. Furthermore, an artificial server-error environment was used to create a clustering delay for computing, storage, and nondisruptive services, and the system was compared to the Hadoop distributed file system (HDFS).

  11. Users' attitude towards science and technology database system : INIS user needs survey

    International Nuclear Information System (INIS)

    Fukazawa, Takeyasu; Takahashi, Satoko; Yonezawa, Minoru; Kajiro, Tadashi; Mineo, Yukinobu; Habara, Takako; Komatsubara, Yasutoshi; Hiramatsu, Nobuaki; Habara, Tadashi.

    1995-01-01

    The International Nuclear Information System (INIS) is the world's leading information system on the peaceful use of nuclear energy which is being operated by the International Atomic Energy Agency (IAEA) in collaboration with its member-states and other international organizations. After more than 20 years of the operation of INIS, a user needs survey was conducted with the aim of assisting the INIS Secretariat to decide which way INIS should go. This report describes users' attitude towards that system on the basis of the conclusions drawn from the questionnaires sent out to the users by the Japan Atomic Energy Research Institute, the INIS national center in Japan, in close collaboration with the Japan Information Center of Science and Technology. (author)

  12. National Database for Autism Research (NDAR): Big Data Opportunities for Health Services Research and Health Technology Assessment.

    Science.gov (United States)

    Payakachat, Nalin; Tilford, J Mick; Ungar, Wendy J

    2016-02-01

    The National Database for Autism Research (NDAR) is a US National Institutes of Health (NIH)-funded research data repository created by integrating heterogeneous datasets through data sharing agreements between autism researchers and the NIH. To date, NDAR is considered the largest neuroscience and genomic data repository for autism research. In addition to biomedical data, NDAR contains a large collection of clinical and behavioral assessments and health outcomes from novel interventions. Importantly, NDAR has a global unique patient identifier that can be linked to aggregated individual-level data for hypothesis generation and testing, and for replicating research findings. As such, NDAR promotes collaboration and maximizes public investment in the original data collection. As screening and diagnostic technologies as well as interventions for children with autism are expensive, health services research (HSR) and health technology assessment (HTA) are needed to generate more evidence to facilitate implementation when warranted. This article describes NDAR and explains its value to health services researchers and decision scientists interested in autism and other mental health conditions. We provide a description of the scope and structure of NDAR and illustrate how data are likely to grow over time and become available for HSR and HTA.

  13. Gigabits to the Desktop: Installing tomorrow's networks today

    Energy Technology Data Exchange (ETDEWEB)

    Kuhfuss, T.C.; Phillips, P.T.

    1994-03-01

Argonne is one of the US Department of Energy's world-class research institutions, and leading-edge computing tools and networks allow Argonne to maintain and enhance this reputation. One current effort to deploy leading-edge tools is the Argonne "Gigabits to the Desktop" project. While delivering and using gigabits to the desktop is little more than a hope at this time, this paper discusses the hurdles to achieving it and how to tear down as many of them as possible. Under this project, four distinct areas are being investigated and enhanced. The paper briefly discusses the applications and tools that we see driving the requirement for gigabits to the desktop. It touches on a functional description of our "ideal" workstations and architectures, and the candidates for the next-generation network capable of delivering gigabits. Lastly, it provides an in-depth analysis of physical-layer options and attempts to prove that this area, while the least risky, must be done properly, with the proper media. This paper assumes one important point: that bandwidth is essentially free. We discuss network architectures and physical installation recommendations that have a fixed cost; on a campus, there is no marginal cost for additional packets once the network infrastructure is installed. This point is important when extrapolating our conclusions to the wide area, where the marginal cost of a packet sent to a commercial network is usually nonzero. This fact may prove to be a great hindrance in migrating the applications mentioned beyond organizational boundaries.

  14. Working Inside The Box: An Example Of Google Desktop Search in a Forensic Examination

    Directory of Open Access Journals (Sweden)

    Timothy James LaTulippe

    2011-12-01

    Full Text Available The amount of information mankind stores, and the technology developed to store it, have increased tremendously over the past few decades. As the total amount of stored data grows rapidly, along with the number of widely available computer-driven devices, solutions are being developed to better harness this data. These advancements continually assist investigators and computer forensic examiners. One application that houses copious amounts of fruitful data is the Google Desktop Search program. Coupled with tested and verified techniques, examiners can exploit the power of this application to serve their investigative needs. This paper presents a real-world case example of these techniques and their outcome.

  15. Desktop computer graphics for RMS/payload handling flight design

    Science.gov (United States)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, written in BASIC for a desktop computer system (the Hewlett-Packard 9845/C), uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user-definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.
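    The pan-tilt-roll rotation the abstract describes composes three axis rotations about the eye point; because the eye position itself does not move, the set of visible surfaces is unchanged and the hidden-line solution can be reused. A minimal numpy sketch of that composition (hypothetical; the original MADRAS code was BASIC on an HP 9845/C):

```python
import numpy as np

def rot(axis, deg):
    """3x3 rotation about x (tilt), y (pan), or z (roll); angle in degrees."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Compose pan, tilt, and roll into one view rotation. The eye point stays
# fixed, so which surfaces are hidden does not change under this rotation.
view = rot("y", 30) @ rot("x", 10) @ rot("z", 5)
rotated = view @ np.array([1.0, 0.0, 0.0])
print(np.round(np.linalg.norm(rotated), 6))   # rotations preserve length -> 1.0
```

    Each factor is orthogonal, so the composite preserves lengths and angles, which is what makes reusing the precomputed visibility safe.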

  16. Improving "tail" computations in a BOINC-based Desktop Grid

    Science.gov (United States)

    Kolokoltsev, Yevgeniy; Ivashko, Evgeny; Gershenson, Carlos

    2017-12-01

    A regular Desktop Grid bag-of-tasks project can take a long time to complete its computations. An important part of the process is the tail phase: when the number of tasks left to perform becomes smaller than the number of computing nodes. At this stage, dynamic replication can be used to reduce the time needed to complete the computations. In this paper, we propose a mathematical model and a strategy for dynamic replication at the tail stage. The results of numerical experiments are given.
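    The replication idea can be made concrete with a toy simulation (our illustration, not the authors' mathematical model): once fewer tasks remain than nodes, each idle node reruns a copy of an outstanding task, and a task completes as soon as its fastest replica does.

```python
import random

def tail_time(task_times, nodes, replicate):
    """Completion time of the tail stage, when tasks are fewer than nodes.

    Each remaining task runs on one node; with replication, the idle
    nodes each rerun a copy of some task (round-robin), and a task is
    done as soon as its fastest replica finishes.
    """
    replicas = [[t] for t in task_times]            # primary run per task
    if replicate:
        for i in range(nodes - len(task_times)):    # one replica per idle node
            k = i % len(task_times)
            # replica runtime varies per node (heterogeneous volunteer hosts)
            replicas[k].append(task_times[k] * random.uniform(0.5, 1.5))
    # the tail ends when the slowest task's fastest replica completes
    return max(min(r) for r in replicas)

random.seed(1)
tasks = [random.uniform(10, 100) for _ in range(4)]  # 4 tasks left, 10 nodes free
plain = tail_time(tasks, 10, replicate=False)
repl = tail_time(tasks, 10, replicate=True)
print(repl <= plain)   # replication can only shorten the tail -> True
```

    Taking the fastest replica per task means replication never lengthens the tail; the cost is the wasted work on the losing replicas, which is what a real strategy must trade off.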

  17. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Full Text Available Following the rapid growth of ubiquitous computing, many jobs that were previously manual have now been automated. This automation has increased the amount of time available for leisure, and diverse services are now being developed for this leisure time. In addition, with the development of small, portable devices such as smartphones, diverse Internet services can be used regardless of time and place. Studies on virtualization are currently in progress, aiming to determine how to efficiently store and process the big data generated by the multitude of devices and services in use. One such topic is desktop storage virtualization, which integrates distributed desktop resources and provides them to users by virtualizing distributed legacy desktops. For desktop storage virtualization, high availability is necessary and important for providing reliability to users. Studies on hierarchical structures and resource integration are in progress, aiming at efficient data distribution and storage for distributed desktops in resource-integration environments. However, studies on efficient responses to server faults occurring in desktop-based resource-integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS) for high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments, and it activates alternative servers when a fault occurs within the system.

  18. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R&D program. IOC is a linkage control system between subprojects, used to share and integrate research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced since project accomplishment.

  19. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R&D program. IOC is a linkage control system between subprojects, used to share and integrate research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced since project accomplishment.

  20. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us JSNP Database Description General information of database Database name JSNP Alternative nam...n Science and Technology Agency Creator Affiliation: Contact address E-mail : Database... classification Human Genes and Diseases - General polymorphism databases Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description A database of about 197,000 polymorphisms in Japanese populat... and manner of utilization of database Allele frequencies in Japanese population are also available. License

  1. Desktop-VR system for preflight 3D navigation training

    Science.gov (United States)

    Aoki, Hirofumi; Oman, Charles M.; Buckland, Daniel A.; Natapoff, Alan

    Crews who inhabit spacecraft with complex 3D architecture frequently report in-flight disorientation and navigation problems. Preflight virtual reality (VR) training may reduce those risks. Although immersive VR techniques may better support spatial-orientation training in a local environment, a non-immersive desktop (DT) system may be more convenient for navigation training in "building scale" spaces, especially if the two methods achieve comparable results. In this study, trainees' orientation and navigation performance during simulated space station emergency egress tasks was compared while using immersive head-mounted display (HMD) and DT VR systems. Analyses showed no differences in pointing angular error or egress time among the groups. The HMD group was significantly faster than the DT group when pointing from the destination to the start location and from the start toward a different destination; however, this may be attributed to differences in the input device used (a head tracker for the HMD group vs. a keyboard touchpad or a gamepad in the DT group). All other 3D navigation performance measures were similar using the immersive and non-immersive VR systems, suggesting that the simpler desktop VR system may be useful for astronaut 3D navigation training.

  2. 3d visualization of atomistic simulations on every desktop

    International Nuclear Information System (INIS)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-01-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its Mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D à la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort, a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole-room system, we settled on an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, viewed through colored glasses, or through two squares of cellophane, from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new version, 6.1, which is RedHat, CentOS, and Ubuntu compatible. Examples using data from our own research and that of other groups will be given.
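    The construction is simple enough to sketch: render the scene twice from slightly displaced viewpoints, then take the red channel from the left image and the green and blue channels from the right. A minimal numpy illustration (not the actual AViz 6.1 code):

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Red/cyan anaglyph: red from the left eye, green+blue from the right."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]   # red channel <- left image
    return out

# two synthetic 4x4 "renderings", the right one displaced by one pixel
left = np.zeros((4, 4, 3), dtype=np.uint8)
left[:, 1, :] = 255                  # white stripe at column 1
right = np.roll(left, 1, axis=1)     # same stripe shifted to column 2

ana = make_anaglyph(left, right)
print(ana[0, 1])   # pure red where only the left-eye stripe is
print(ana[0, 2])   # cyan where only the right-eye stripe is
```

    Viewed through red/cyan glasses, each eye sees only its own displaced copy, and the brain fuses the pair into depth.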

  3. SERVICE HANDBOOK FOR THE DESKTOP SUPPORT CONTRACT WITH IT DIVISION

    CERN Document Server

    2000-01-01

    A Desktop Support Contract has been running since January 1999 to offer help to all users at CERN with problems that occur with their desktop computers. The contract is run jointly by the Swedish company WM-data and the Swiss company DCS. The contract comprises the Computing Helpdesk, a General Service for all parts of CERN, and a Local Service for those divisions and groups that want faster response times and additional help with their specific computer environment. In order to describe what services are being offered, and to give a better understanding of the structure of the contract, a Service Handbook has been created. The intended audience for the Service Handbook is everyone using the contract, i.e. users, managers, and the service staff inside the contract. In the handbook you will find what help you can get from the contract, how to get in touch with it, and what response times you can expect. Since the computer environment at CERN is an ever-changing entity, ...

  4. Bibliometric analysis of Spanish scientific publications in the subject Construction & Building Technology in Web of Science database (1997-2008)

    Directory of Open Access Journals (Sweden)

    Rojas-Sola, J. I.

    2010-12-01

    Full Text Available In this paper, publications from Spanish institutions appearing in journals of the Construction & Building Technology subject category of the Web of Science database for the period 1997-2008 are analyzed. The number of journals involved is 35, and the number of articles published (Article or Review) was 760. A bibliometric assessment was carried out, and we propose two new parameters: the Weighted Impact Factor and the Relative Impact Factor; the number of citations and the number of documents at the institutional level are also included. Among the institutions with the greatest scientific production is, as expected, the Institute of Construction Science Eduardo Torroja (CSIC), while by Weighted Impact Factor the University of Vigo ranks first. On the other hand, just two journals, Cement and Concrete Materials and Materials de Construction, account for 45.26% of the Spanish scientific production published in the Construction & Building Technology category, with 172 papers each. Regarding international cooperation, the main partner countries include England, Mexico, the United States, Italy, Argentina, and France.

  5. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, new technologies are often hard to accept and implement. This is the case with cloud technology, on which hoteliers' opinions are divided between those who see it as just another fashion trend not worth consideration, and those who believe it helps perform daily operations more easily, leaving more room for interaction with guests in both the virtual and the real world. Usage of cloud technology in hotels is still in its beginning phase, and hoteliers still have to learn more about its advantages and its proper use for the benefit of overall hotel operations. Using the example of the hotel property management system (PMS), and a comparison between the features of its older desktop versions and newer web-based programs, this research aims to find out at what stage, and how effective, the usage of cloud technology in hotels is. To this end, qualitative research was conducted, with semi-structured interviews with hotel managers who use one of these programs. Reasons for usage and the advantages of each version are discussed.

  6. Veterans Administration Databases

    Science.gov (United States)

    The Veterans Administration Information Resource Center provides database and informatics experts, customer service, expert advice, information products, and web technology to VA researchers and others.

  7. New types of and access to grey literature databases generated by the Russian National Public Library for Science and Technology

    OpenAIRE

    Shraiberg, Yakov (Russian National Public Library for Science and Technology); GreyNet, Grey Literature Network Service

    1996-01-01

    The paper presents new types of databases offered as part of the services provided by the Library, and the sources that may be regarded as "grey literature": patents, reports, unpublished translations, and industrial catalogs. The paper describes services built on these and other databases based on grey-literature processing, local and remote access, interaction with the Union Catalog, and pilot CD-ROM projects. The paper provides sample records of the database on "grey" literature and explains the differences in dat...

  8. Emission of particulate matter from a desktop three-dimensional (3D) printer.

    Science.gov (United States)

    Yi, Jinghai; LeBouf, Ryan F; Duling, Matthew G; Nurkiewicz, Timothy; Chen, Bean T; Schwegler-Berry, Diane; Virji, M Abbas; Stefaniak, Aleksandr B

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5 m³ chamber and in a small room (32.7 m³) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies, while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color.

  9. A Personal Desktop Liquid-Metal Printer as a Pervasive Electronics Manufacturing Tool for Society in the Near Future

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-12-01

    Full Text Available It has long been a dream in the electronics industry to be able to write out electronics directly, as simply as printing a picture onto paper with an office printer. The first-ever prototype of a liquid-metal printer has been invented and demonstrated by our lab, bringing this goal a key step closer. As part of a continuous endeavor, this work is dedicated to significantly extending such technology to the consumer level by making a very practical desktop liquid-metal printer for society in the near future. Through the industrial design and technical optimization of a series of key technical issues such as working reliability, printing resolution, automatic control, human-machine interface design, software, hardware, and integration between software and hardware, a high-quality personal desktop liquid-metal printer that is ready for mass production in industry was fabricated. Its basic features and important technical mechanisms are explained in this paper, along with demonstrations of several possible consumer end-uses for making functional devices such as light-emitting diode (LED displays. This liquid-metal printer is an automatic, easy-to-use, and low-cost personal electronics manufacturing tool with many possible applications. This paper discusses important roles that the new machine may play for a group of emerging needs. The prospective future of this cutting-edge technology is outlined, along with a comparative interpretation of several historical printing methods. This desktop liquid-metal printer is expected to become a basic electronics manufacturing tool for a wide variety of emerging practices in the academic realm, in industry, and in education as well as for individual end-users in the near future.

  10. Online referral and OPD booking from the GP desktop.

    Science.gov (United States)

    Nicholson, Caroline; Jackson, Claire L; Wright, Bernadette; Mainwaring, Paul; Holliday, Dimity; Lankowski, Andrew; Kardash, Christine

    2006-08-01

    The Brisbane Inner South E-referral Project (BISEP) developed an application that allowed general practitioners, from their desktop, to successfully search for and book an available hospital outpatient appointment for patients with suspected cancer, send the referral electronically, and inform the patient of both the appointment and the referral during the consultation. The hospital changed its outpatient department processes to allow such functionality for local GPs with patients with suspected cancer, working from a mutually agreed set of best-practice referral criteria. A group of 19 GPs participated in an 11-week pilot implementation of the application and were enthusiastic about continuing and expanding the approach. Patient satisfaction measures post-intervention indicated that patients perceived no major disadvantage in this form of outpatient department referral.

  11. Desk-top microcomputer for lab-scale process control

    International Nuclear Information System (INIS)

    Overman, R.F.; Byrd, J.S.; Goosey, M.H.; Sand, R.J.

    1981-01-01

    A desk-top microcomputer was programmed to acquire data from various process-control sensors installed in a laboratory-scale liquid-liquid extraction, pulse-column facility. The parameters monitored included valve positions, gamma spectra, alpha radioactivity, temperature, pH, density, and flow rates. The program for the microcomputer is written in BASIC and requires about 31000 8-bit bytes of memory. All data are stored on floppy discs and can be displayed or printed. Unexpected data values are brought to the process operator's attention via CRT display or print-out. The general organization of the program and a few subroutines unique to polling instruments are explained. Some of the data-acquisition devices were designed and built at the Savannah River Laboratory; these include a pulse-height analyzer, a data multiplexer, and a data-acquisition instrument. A general description of the electronics design of these instruments is also given, with emphasis placed on data formatting and bus addressing.

  12. Direct Desktop Printed-Circuits-on-Paper Flexible Electronics

    Science.gov (United States)

    Zheng, Yi; He, Zhizhu; Gao, Yunxia; Liu, Jing

    2013-05-01

    There is currently no way to directly write out electronics, just like printing pictures on paper with an office printer. Here we show desktop printing of flexible circuits on paper via the development of a liquid metal ink and its working mechanisms. By modifying the adhesion of the ink, overcoming its high surface tension with a dispensing machine, designing a brush-like porous pinhead for printing the alloy, and identifying matched substrate materials among different papers, the slightly oxidized alloy ink was demonstrated to print flexibly on coated paper, from which various functional electronics can be composed; the concept of Printed-Circuits-on-Paper is thus presented. Further, RTV silicone rubber was adopted as an isolating ink and packaging material to guarantee the functional stability of the circuits, which suggests an approach for printing 3D hybrid electro-mechanical devices. The present work paves the way for a low-cost and straightforward method of directly printing paper electronics.

  13. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from its use, and allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  14. Visual attention for a desktop virtual environment with ambient scent.

    Science.gov (United States)

    Toet, Alexander; van Schaik, Martin G

    2013-01-01

    In the current study, participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group) or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk, together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention toward unpleasant and negative features), and fewer signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations, the results provide no indication that the presence of an ambient odor affected the participants' visual attention for signs of disorder or their emotional response. However, the paradigm used in the present study does not allow us to draw any conclusions in this respect. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user's attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).

  15. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.

    2012-12-01

    Despite the ongoing efforts in turbulence research, the universal properties of the turbulence small-scale structure and the relationships between small- and large-scale turbulent motions are not yet fully understood. The visually guided exploration of turbulence features, including the interactive selection and simultaneous visualization of multiple features, can further progress our understanding of turbulence. Accomplishing this task for flow fields in which the full turbulence spectrum is well resolved is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence visualization that works on a compressed flow field representation. We use a wavelet-based compression scheme including run-length and entropy encoding, which can be decoded on the GPU and embedded into brick-based volume ray-casting. This enables a drastic reduction of the data to be streamed from disk to GPU memory. Our system derives turbulence properties directly from the velocity gradient tensor, and it either renders these properties in turn or generates and renders scalar feature volumes. The quality and efficiency of the system is demonstrated in the visualization of two unsteady turbulence simulations, each comprising a spatio-temporal resolution of 1024⁴. On a desktop computer, the system can visualize each time step in 5 seconds, and it achieves about three times this rate for the visualization of a scalar feature volume. © 1995-2012 IEEE.
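    Run-length encoding, one stage of the compression pipeline described above, is easy to illustrate on the long zero runs that quantized wavelet coefficients typically contain (a generic sketch, unrelated to the paper's actual codec):

```python
def rle_encode(values):
    """Collapse runs of equal values into [value, count] pairs."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1          # extend the current run
        else:
            out.append([v, 1])       # start a new run
    return out

def rle_decode(pairs):
    """Expand [value, count] pairs back into the original sequence."""
    return [v for v, n in pairs for _ in range(n)]

data = [0, 0, 0, 5, 5, 0, 0, 0, 0]   # sparse quantized coefficients
packed = rle_encode(data)
print(packed)                        # [[0, 3], [5, 2], [0, 4]]
assert rle_decode(packed) == data    # lossless round trip
```

    In a real codec such runs are then entropy-coded; the point here is only that sparsity after the wavelet transform is what makes the stream compressible enough to decode on the GPU.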

  16. UNIDIRECTIONAL REPLICATION IN HETEROGENEOUS DATABASES

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2013-01-01

    The use of diverse database technologies in today's enterprises cannot be avoided, so technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we replicate from a Windows-based MS SQL Server database to a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly, with testing of working models of the...
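    The control flow of such unidirectional replication can be sketched with two SQLite in-memory databases standing in for the MS SQL Server source and the Oracle target (table and column names are made up for illustration):

```python
import sqlite3

def replicate(source, target, table):
    """One-way sync: copy rows from source that the target does not have yet."""
    rows = source.execute(f"SELECT id, val FROM {table}").fetchall()
    for row in rows:
        # primary key collisions are skipped, so rerunning is harmless
        target.execute(f"INSERT OR IGNORE INTO {table} (id, val) VALUES (?, ?)", row)
    target.commit()

src = sqlite3.connect(":memory:")   # stand-in for the source database
dst = sqlite3.connect(":memory:")   # stand-in for the target database
for db in (src, dst):
    db.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, val REAL)")
src.executemany("INSERT INTO readings VALUES (?, ?)", [(1, 3.5), (2, 7.1)])

replicate(src, dst, "readings")
print(dst.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 2
```

    Production replication between SQL Server and Oracle would use vendor tooling and change capture rather than full-table copies; this sketch only shows the one-directional source-to-target flow the title refers to.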

  17. Image Format Conversion to DICOM and Lookup Table Conversion to Presentation Value of the Japanese Society of Radiological Technology (JSRT) Standard Digital Image Database.

    Science.gov (United States)

    Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki

    2016-01-01

    The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images were simply digitized as relative density values by a scanned-film digitizer. As a result, the pixel values are completely different from the standardized display system input values of digital imaging and communications in medicine (DICOM), called presentation values (P-values), which maintain visual consistency when images are observed on displays of different luminance. Therefore, we converted all the images in the JSRT standard digital image database to DICOM format, followed by conversion of the pixel values to P-values, using an original program we developed. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of images is maintained among displays of different luminance.
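    Mechanically, such a conversion pushes every raw pixel value through a lookup table; a minimal numpy sketch (the table below is a made-up monotone curve, not the actual DICOM Part 14 grayscale standard display function):

```python
import numpy as np

def apply_lut(image, lut):
    """Map each raw pixel value through a lookup table (index = pixel value)."""
    return lut[image]

# hypothetical 12-bit raw density values and a monotone conversion curve
lut = np.linspace(0, 1023, 4096).astype(np.uint16)   # 4096 in -> 10-bit out
raw = np.array([[0, 2048], [4095, 1024]], dtype=np.uint16)
pv = apply_lut(raw, lut)
print(pv)
```

    The real table would be derived from the GSDF so that equal P-value steps are perceptually equal steps in luminance; the indexing mechanism, however, is exactly this array-as-index lookup.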

  18. Food Composition Database Format and Structure: A User Focused Approach.

    Science.gov (United States)

    Clancy, Annabel K; Woods, Kaitlyn; McMahon, Anne; Probst, Yasmine

    2015-01-01

    This study aimed to investigate the needs of Australian food composition database users regarding database format, and to relate this to the formats of databases available globally. Three semi-structured synchronous online focus groups (M = 3, F = 11) and n = 6 female key-informant interviews were recorded. Beliefs surrounding the use, training, understanding, benefits, and limitations of food composition data and databases were explored. Verbatim transcriptions underwent preliminary coding followed by thematic analysis with NVivo qualitative analysis software to extract the final themes. Schematic analysis was applied to the final themes related to database format. Desktop analysis also examined the format of six key globally available databases. Twenty-four dominant themes were established, of which five related to format: database use, food classification, framework, accessibility and availability, and data derivation. Desktop analysis revealed that food classification systems varied considerably between databases. Microsoft Excel was a common file format used in all databases, and available software varied between countries. Users also recognised that food composition database formats should ideally be designed specifically for the intended use, have a user-friendly food classification system, incorporate accurate data with a clear explanation of data derivation, and feature user input. However, such databases are limited by data availability and resources. Further exploration of data-sharing options should be considered. Furthermore, users' understanding of the limitations of food composition data and databases is inherent to the correct application of non-specific databases; therefore, further exploration of user FCDB training should also be considered.

  19. Food Composition Database Format and Structure: A User Focused Approach

    Science.gov (United States)

    Clancy, Annabel K.; Woods, Kaitlyn; McMahon, Anne; Probst, Yasmine

    2015-01-01

    This study aimed to investigate the needs of Australian food composition database users regarding database format and relate this to the format of databases available globally. Three semi-structured synchronous online focus groups (M = 3, F = 11) and n = 6 female key informant interviews were recorded. Beliefs surrounding the use, training, understanding, benefits and limitations of food composition data and databases were explored. Verbatim transcriptions underwent preliminary coding followed by thematic analysis with NVivo qualitative analysis software to extract the final themes. Schematic analysis was applied to the final themes related to database format. Desktop analysis also examined the format of six key globally available databases. Twenty-four dominant themes were established, of which five related to format: database use, food classification, framework, accessibility and availability, and data derivation. Desktop analysis revealed that food classification systems varied considerably between databases. Microsoft Excel was a common file format used in all databases, and available software varied between countries. Users also recognised that food composition database format should ideally be designed specifically for the intended use, have a user-friendly food classification system, incorporate accurate data with clear explanation of data derivation, and feature user input. However, such databases are limited by data availability and resources. Further exploration of data sharing options should be considered. Furthermore, users' understanding of the limitations of food composition data and databases is inherent to the correct application of non-specific databases. Therefore, further exploration of user FCDB training should also be considered. PMID:26554836

  20. VRLane: a desktop virtual safety management program for underground coal mine

    Science.gov (United States)

    Li, Mei; Chen, Jingzhu; Xiong, Wei; Zhang, Pengpeng; Wu, Daozheng

    2008-10-01

    VR technologies, which generate immersive, interactive, three-dimensional (3D) environments, are seldom applied to coal mine safety management. In this paper, a new method that combines VR technologies with an underground mine safety management system was explored. A desktop virtual safety management program for underground coal mines, called VRLane, was developed. The paper mainly concerns the current state of VR research, the system design, key techniques, and system application. Two important techniques are introduced in the paper. First, an algorithm was designed and implemented with which 3D laneway models and equipment models can be built automatically from the latest 2D mine drawings, whereas common VR programs establish the 3D environment with 3DS Max or other 3D modeling software packages, in which laneway models are built manually and laboriously. Second, VRLane realized system integration with underground industrial automation. VRLane not only presents a realistic 3D laneway environment, but also describes the status of coal mining, with functions for displaying the running states and related parameters of equipment, pre-alarming abnormal mining events, and animating mine cars, mine workers, or long-wall shearers. The system, which is cheap, dynamic, and easy to maintain, provides a useful tool for production safety management in coal mines.

  1. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: PSCDB. Alternative n… …rial Science and Technology (AIST), Takayuki Amemiya. Database classification: Structure Databases - Protein structure. Database description: The purpose of this database is to represent the relationship between p… License: CC BY-SA. Reference(s): Article title: PSCDB: a database for protein structural change upon ligand binding. Author name(s): T. A…

  2. Analyst productivity and the RELAP5 desktop analyzer

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1989-01-01

    Historically, the productivity of a numerical reactor safety analyst has been hampered by several factors: poor mainframe computer turnaround for problem setup, checkout, and initialization; limited mainframe CPU allocation, accessibility, and availability for transient advancement; lost or delayed output; and difficulty assimilating numerical results. Clearly, an economical engineering workstation capable of running RELAP5 interactively, and of simultaneously displaying the results in a coherent graphic fashion as they are produced, would alleviate many of these concerns. The RELAP5 desktop analyzer (RDA) is such a workstation. Although not yet capable of real-time simulation, the RDA will nevertheless reduce analysis costs and enhance analyst productivity, since analysis cannot be done in real time anyway. The RDA is a microcomputer-based reactor transient simulation, visualization, and analysis tool developed at the Idaho National Engineering Laboratory (INEL) to assist an analyst in simulating and evaluating the transient behavior of nuclear power plants. The RDA integrates RELAP5 advanced best-estimate engineering simulation capabilities with on-line computer graphics routines, allowing interactive reactor plant transient simulation and on-line analysis of results, or replay of past simulations, by means of graphic displays.

  3. BDE-209 in the Australian Environment: Desktop review

    International Nuclear Information System (INIS)

    English, Karin; Toms, Leisa-Maree L.; Gallen, Christie; Mueller, Jochen F.

    2016-01-01

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  4. BDE-209 in the Australian Environment: Desktop review

    Energy Technology Data Exchange (ETDEWEB)

    English, Karin, E-mail: k.english@uq.edu.au [School of Medicine, The University of Queensland, Brisbane (Australia); Children’s Health and Environment Program, Child Health Research Centre, The University of Queensland, Brisbane (Australia); Queensland Children’s Medical Research Institute, Children’s Health Research Centre, Brisbane (Australia); Toms, Leisa-Maree L. [School of Public Health and Social Work, and Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane (Australia); Gallen, Christie; Mueller, Jochen F. [The University of Queensland, National Research Centre for Environmental Toxicology (Entox), Brisbane (Australia)

    2016-12-15

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  5. DATABASE REPLICATION IN HETEROGENOUS PLATFORM

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2014-01-01

    The application of diverse database technologies in enterprises today is increasingly common practice. To provide high availability and survivability of real-time information, a database replication technology capable of replicating databases across heterogeneous platforms is required. The purpose of this research is to find a technology with such capability. In this research, the data source is stored in an MSSQL database server running on Windows. The data will be replicated to MyS...
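The replication pattern the abstract describes, reading committed rows from one DBMS and applying them to another, can be sketched as follows. SQLite stands in here for both the MSSQL source and the replication target, and the table and column names are invented; real heterogeneous replication must also map data types and capture changes incrementally.

```python
# Toy snapshot replication between two database connections. SQLite plays both
# the source and the target; the schema is invented for illustration.
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])

def replicate(src, dst, table):
    """Naive snapshot replication: upsert every source row into the target."""
    rows = src.execute(f"SELECT id, total FROM {table}").fetchall()
    dst.executemany(
        f"INSERT INTO {table} (id, total) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET total = excluded.total", rows)
    dst.commit()
    return len(rows)

print(replicate(source, target, "orders"))  # number of rows applied
```

Running `replicate` repeatedly is safe because the upsert makes it idempotent; a production system would instead ship only changed rows (change data capture) rather than a full snapshot.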

  6. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy-to-learn and easy-to-use freeware image-processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently, MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes, one example being the GLOBE program (www.globe.gov), and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently, MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi
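The supervised-classification exercise mentioned above can be illustrated with a minimal nearest-mean (minimum-distance) classifier, one of the simplest techniques taught in such labs. The class names and training pixels below are invented, and MultiSpec's own classifiers are considerably more capable.

```python
# Minimal nearest-mean supervised classifier over labeled training pixels,
# the kind of technique used in an introductory flood-mapping lab.

def class_means(training):
    """training: {class_name: [pixel vectors]} -> {class_name: mean vector}."""
    means = {}
    for name, pixels in training.items():
        n = len(pixels)
        means[name] = tuple(sum(p[i] for p in pixels) / n
                            for i in range(len(pixels[0])))
    return means

def classify(pixel, means):
    """Assign the pixel to the class with the nearest mean (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(means, key=lambda name: dist2(pixel, means[name]))

# Toy 2-band training samples (e.g., red and near-infrared reflectance).
training = {
    "water":      [(10, 5), (12, 6), (11, 4)],
    "vegetation": [(30, 80), (28, 75), (32, 82)],
}
means = class_means(training)
print(classify((11, 5), means))    # a water-like pixel
print(classify((29, 78), means))   # a vegetation-like pixel
```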

  7. Database Technology Activities and Assessment for Defense Modeling and Simulation Office (DMSO) (August 1991-November 1992). A Documented Briefing

    Science.gov (United States)

    1994-01-01

    secure, fault-tolerant, or real-time. Note, however, that a particular DBMS could be of more than one type (e.g., Teradata implements a relational... Database machines: Teradata (Intel 80486), Teradata -> RISC processors, transputer-based, commercially available... promise of database machines is manifested in the commercial product from Teradata, a linearly scalable architecture that can add processors and I/O

  8. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    Science.gov (United States)

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data in the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user, and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  9. A Look Under the Hood: How the JPL Tropical Cyclone Information System Uses Database Technologies to Present Big Data to Users

    Science.gov (United States)

    Knosp, B.; Gangl, M.; Hristova-Veleva, S. M.; Kim, R. M.; Li, P.; Turk, J.; Vu, Q. A.

    2015-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing observations and model forecasts related to tropical cyclones. Since 2010, the TCIS has run a near-real-time (NRT) data portal during each North Atlantic hurricane season, which typically lasts from June through October. Data collected by the TCIS vary by type, format, contents, and frequency and are served to the user in two ways: (1) as image overlays on a virtual globe and (2) as derived output from a suite of analysis tools. In order to support these two functions, the data must be collected and then made searchable by criteria such as date, mission, product, pressure level, and geospatial region. Creating a database architecture that is flexible enough to manage, intelligently interrogate, and ultimately present this disparate data to the user in a meaningful way has been the primary challenge. The database solution for the TCIS has been a hybrid MySQL + Solr implementation. After testing other relational database and NoSQL solutions, such as PostgreSQL and MongoDB respectively, this solution has given the TCIS the best query speed and result reliability. This database solution also supports the challenging (and memory-intensive) geospatial queries that are necessary to support the analysis tools requested by users. Though hardly new technologies on their own, our implementation of MySQL + Solr had to be customized and tuned to accurately store, index, and search the TCIS data holdings. In this presentation, we will discuss how we arrived at our MySQL + Solr database architecture, why it offers us the most consistently fast and reliable results, and how it supports our front end so that we can offer users a look into our "big data" holdings.
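The hybrid lookup pattern described above, a relational store answering structured metadata queries while a search engine handles the spatial filter, can be sketched roughly as follows. SQLite stands in for MySQL, a simple bounding-box test plays the role of Solr's spatial query, and the schema and field names are invented.

```python
# Sketch of a hybrid metadata + spatial lookup. A relational store answers the
# structured part (mission, product, date); a spatial filter -- played by Solr
# in the real system -- narrows the results to a geographic region.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE granules
              (id INTEGER PRIMARY KEY, mission TEXT, product TEXT,
               obs_date TEXT, lat REAL, lon REAL)""")
db.executemany("INSERT INTO granules VALUES (?, ?, ?, ?, ?, ?)", [
    (1, "GPM",  "rain_rate",  "2015-08-20", 25.0, -70.0),
    (2, "GPM",  "rain_rate",  "2015-08-21", 40.0, -50.0),
    (3, "SMAP", "soil_moist", "2015-08-20", 26.0, -71.0),
])

def search(mission, bbox):
    """Relational filter by mission, then a bounding-box 'spatial' filter."""
    rows = db.execute("SELECT id, lat, lon FROM granules WHERE mission = ?",
                      (mission,)).fetchall()
    lat0, lat1, lon0, lon1 = bbox
    return [gid for gid, lat, lon in rows
            if lat0 <= lat <= lat1 and lon0 <= lon <= lon1]

print(search("GPM", (20.0, 30.0, -80.0, -60.0)))  # granules near a storm region
```

In the real TCIS the two stores are kept in sync at ingest time, and Solr's indexes make the spatial test fast over millions of documents rather than a Python loop.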

  10. Low-cost Method for Obtaining Medical Rapid Prototyping Using Desktop 3D printing: A Novel Technique for Mandibular Reconstruction Planning.

    Science.gov (United States)

    Velasco, Ignacio; Vahdani, Soheil; Ramos, Hector

    2017-09-01

    Three-dimensional (3D) printing is a relatively new technology with clinical applications that enables us to create rapid, accurate prototypes of a selected anatomic region, making it possible to plan complex surgery and pre-bend hardware for individual surgical cases. This study aimed to present our experience with the use of medical rapid prototypes (MRP) of the maxillofacial region created by a desktop 3D printer and their application in maxillofacial reconstructive surgeries. Three patients with benign mandible tumors were included in this study after obtaining informed consent. Each patient's maxillofacial CT scan data was processed with segmentation and isolation software, and a mandible MRP was printed using our desktop 3D printer. These models were used for preoperative surgical planning and pre-bending of the reconstruction plate. MRP created by a desktop 3D printer is a cost-efficient, quickly and easily produced appliance for the planning of reconstructive surgery. It can contribute to patient orientation, helping patients better understand their condition and the proposed surgical treatment. It helps surgeons with preoperative planning in resection or reconstruction cases and represents an excellent tool for resident training in academic settings. The reconstruction plate pre-bent on the MRP resulted in decreased surgery time, cost, and anesthesia risk for the patients. Key words: 3D printing, medical modeling, rapid prototype, mandibular reconstruction, ameloblastoma.

  11. Utilization and success rates of unstimulated in vitro fertilization in the United States: an analysis of the Society for Assisted Reproductive Technology database.

    Science.gov (United States)

    Gordon, John David; DiMattina, Michael; Reh, Andrea; Botes, Awie; Celia, Gerard; Payson, Mark

    2013-08-01

    To examine the utilization and outcomes of natural cycle (unstimulated) IVF as reported to the Society for Assisted Reproductive Technology (SART) in 2006 and 2007. Retrospective analysis. Dataset analysis from the SART Clinical Outcome Reporting System national database. All patients undergoing IVF as reported to SART in 2006 and 2007. None. Utilization of unstimulated IVF; description of patient demographics; and comparison of implantation and pregnancy rates between unstimulated and stimulated IVF cycles. During 2006 and 2007 a total of 795 unstimulated IVF cycles were initiated. Success rates were age dependent, with patients … Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  12. [Public scientific knowledge distribution in health information, communication and information technology indexed in MEDLINE and LILACS databases].

    Science.gov (United States)

    Packer, Abel Laerte; Tardelli, Adalberto Otranto; Castro, Regina Célia Figueiredo

    2007-01-01

    This study explores the distribution of international, regional and national scientific output in health information and communication, indexed in the MEDLINE and LILACS databases, between 1996 and 2005. A selection of articles was based on the hierarchical structure of Information Science in MeSH vocabulary. Four specific domains were determined: health information, medical informatics, scientific communications on healthcare and healthcare communications. The variables analyzed were: most-covered subjects and journals, author affiliation and publication countries and languages, in both databases. The Information Science category is represented in nearly 5% of MEDLINE and LILACS articles. The four domains under analysis showed a relative annual increase in MEDLINE. The Medical Informatics domain showed the highest number of records in MEDLINE, representing about half of all indexed articles. The importance of Information Science as a whole is more visible in publications from developed countries and the findings indicate the predominance of the United States, with significant growth in scientific output from China and South Korea and, to a lesser extent, Brazil.

  13. Efficiency Sustainability Resource Visual Simulator for Clustered Desktop Virtualization Based on Cloud Infrastructure

    Directory of Open Access Journals (Sweden)

    Jong Hyuk Park

    2014-11-01

    Full Text Available Following IT innovations, manual operations have been automated, improving the overall quality of life. This has been possible because an organic topology has formed among the many diverse smart devices embedded in daily life. To provide services to these smart devices, enterprises and users turn to the cloud. Cloud services are divided into infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS). SaaS is operated on PaaS, and PaaS is operated on IaaS. Since IaaS is the foundation of all services, algorithms for the efficient operation of virtualized resources are required. Among these algorithms, desktop resource virtualization is used for high resource availability when existing desktop PCs are unavailable. For this high resource availability, clustering into hierarchical structures is important. In addition, since clustering algorithms yield different shares of the main resources depending on the desktop PC distributions and environments, selecting an appropriate algorithm is very important. Finding algorithms suited to a given desktop resource virtualization environment by trial and error incurs huge costs in power, time, and labor. Therefore, in the present paper, a desktop resource virtualization clustering simulator (DRV-CS), a clustering simulator for selecting desktop virtualization clusters to be maintained sustainably, is proposed. The DRV-CS provides simulations through which clustering algorithms can be selected and their elements properly applied in different desktop PC environments.
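The kind of what-if question such a simulator answers can be illustrated by comparing two toy clustering policies for a pool of idle desktop PCs, without touching real hardware. The policies, capacities, and balance metric below are invented for illustration and are not those of the DRV-CS.

```python
# Compare two toy assignment policies for grouping desktop PCs into clusters:
# round-robin vs. "add to the least-loaded cluster". A lower capacity spread
# means the clusters are more evenly balanced.

def round_robin(capacities, k):
    clusters = [[] for _ in range(k)]
    for i, c in enumerate(capacities):
        clusters[i % k].append(c)
    return clusters

def least_loaded(capacities, k):
    clusters = [[] for _ in range(k)]
    for c in sorted(capacities, reverse=True):
        min(clusters, key=sum).append(c)   # balance total capacity
    return clusters

def spread(clusters):
    totals = [sum(c) for c in clusters]
    return max(totals) - min(totals)       # lower = more balanced

caps = [8, 4, 4, 2, 2, 1]                  # e.g., CPU cores per idle desktop
print(spread(round_robin(caps, 2)), spread(least_loaded(caps, 2)))
```

Running such comparisons in simulation, over many capacity distributions, is exactly the cost the DRV-CS is meant to save relative to trial and error on real machines.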

  14. Food traceability systems in China: The current status of and future perspectives on food supply chain databases, legal support, and technological research and support for food safety regulation.

    Science.gov (United States)

    Tang, Qi; Li, Jiajia; Sun, Mei; Lv, Jun; Gai, Ruoyan; Mei, Lin; Xu, Lingzhong

    2015-02-01

    Over the past few decades, the field of food security has witnessed numerous problems and incidents that have garnered public attention. Given this serious situation, the food traceability system (FTS) has become part of the expanding food safety continuum to reduce the risk of food safety problems. This article reviews the related literature and results from previous studies of FTS to corroborate this contention. It describes the development and benefits of FTS in developed countries such as the United States of America (USA), Japan, and some European countries. Problems with existing FTS in China are noted, including the lack of a complete database, inadequate laws and regulations, and lagging technological research into FTS. The article puts forward several suggestions for the future, including improvement of information websites, clarification of regulatory responsibilities, and promotion of technological research.

  15. Desktop NMR for structure elucidation and identification of strychnine adulteration.

    Science.gov (United States)

    Singh, Kawarpal; Blümich, Bernhard

    2017-05-02

    Elucidating the structure of complex molecules is difficult at low magnetic fields due to the overlap of different peak multiplets and second-order coupling effects. This is even more challenging for rigid molecules with small chemical shift differences and with prochiral centers. Since low-field NMR spectroscopy is sometimes presumed to be restricted to the analysis of only small and simple molecules, this paper aims at countering this misconception: it demonstrates the use of low-field NMR spectroscopy in chemical forensics for identifying strychnine and its counterions by exploring the chemical shift as a signature in different 1D 1H and 13C experiments. The applied methodologies combine various 1D and 2D experiments, such as 1D 1H, 13C, and DEPT, and 2D COSY, HETCOR, HSQC, HMBC, and J-resolved spectroscopy, to elucidate the molecular structure and skeleton of strychnine at 1 Tesla. Strychnine is exemplified here because it is a basic precursor in the chemistry of natural products and has been employed as a chemical weapon and as a doping agent in sports, including the Olympics. In our study, the molecular structure of the compound could be identified either with a 1D experiment at high magnetic field or with HMBC and HSQC experiments at 1 T. In conclusion, low-field NMR spectroscopy enables the chemical elucidation of the strychnine structure through a simple click with a computer mouse. In situations where a high-field NMR spectrometer is unavailable, compact NMR spectrometers can nevertheless generate knowledge of the structure, important for identifying the different chemical reaction mechanisms associated with the molecule. Desktop NMR is a cost-effective, viable option in chemical forensics. It can prove adulteration and identify the origin of different strychnine salts, in particular the strychnine free base, strychnine hemisulphate, and strychnine hydrochloride. The chemical shift signatures report the chemical structure of the molecules due to the impact of …

  16. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms.

    Science.gov (United States)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-12-21

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing image quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent-magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high-contrast spatial resolution, low-contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of the three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criteria, including: resonance frequency, high-contrast spatial resolution, low-contrast object detectability, slice thickness, geometrical distortion, image uniformity, and slice position accuracy. However, SNR was significantly lower than in some references. Imaging tests and quality control are necessary for desktop MRI systems and should be performed with the applicable phantom and corresponding standards.
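Two of the metrics above, SNR and image uniformity, reduce to simple arithmetic on pixel samples once regions of interest are chosen. The sketch below follows common QC definitions (mean signal over background standard deviation; percent integral uniformity); the pixel values are invented.

```python
# Illustrative computation of SNR and percent integral uniformity from pixel
# samples of a uniform phantom region. Sample values are invented.
import statistics

def snr(signal_roi, background_roi):
    """Mean signal divided by the standard deviation of background noise."""
    return statistics.mean(signal_roi) / statistics.stdev(background_roi)

def integral_uniformity(signal_roi):
    """Percent integral uniformity: 100 * (1 - (max-min)/(max+min))."""
    hi, lo = max(signal_roi), min(signal_roi)
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

signal = [520, 515, 530, 525, 510]   # pixel values inside the phantom
noise  = [3, 5, 4, 6, 2]             # pixel values in air/background
print(round(snr(signal, noise), 1))
print(round(integral_uniformity(signal), 1))
```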

  17. Waste management and technologies analytical database project for Los Alamos National Laboratory/Department of Energy. Final report, June 7, 1993--June 15, 1994

    International Nuclear Information System (INIS)

    1995-01-01

    The Waste Management and Technologies Analytical Database System (WMTADS), supported by the Department of Energy's (DOE) Office of Environmental Management (EM), Office of Technology Development (EM-50), was developed and based at the Los Alamos National Laboratory (LANL), Los Alamos, New Mexico, to collect, identify, organize, track, update, and maintain information on existing, available, developing, and planned technologies for characterizing, treating, and handling mixed, hazardous, and radioactive waste for storage and disposal, in support of EM strategies, goals, and focus-area projects. WMTADS was developed as a centralized source of on-line information on environmental management technologies that can be accessed with a computer, modem, phone line, and communications software through a Local Area Network (LAN) or via server connectivity on the Internet. With file transfer protocol (FTP), files can also be transferred from the server to the user's computer, and the system is accessible on the World Wide Web (WWW) using Mosaic

  18. Visualization of multidimensional database

    Science.gov (United States)

    Lee, Chung

    2008-01-01

    The concept of multidimensional databases has been extensively researched and is widely used in actual database applications. It plays an important role in contemporary information technology, but due to the complexity of its inner structure, database design is a complicated process and users have a hard time fully understanding and using the database. An effective visualization tool for a higher-dimensional information system helps database designers and users alike. Most visualization techniques focus on displaying dimensional data using spreadsheets and charts. This may be sufficient for databases with three or fewer dimensions, but for higher dimensions various combinations of projection operations are needed, and a full grasp of the total database architecture is very difficult. This study reviews existing visualization techniques for multidimensional databases and then proposes an alternate approach to visualize a database of any dimension by adopting the diagram proposed by Kiviat for software engineering processes. In this diagramming method, each dimension is represented by one branch of concentric spikes. This paper documents a C++-based visualization tool making extensive use of the OpenGL graphics library and GUI functions. Detailed examples of actual databases demonstrate the tool's feasibility and effectiveness in visualizing multidimensional databases.
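The Kiviat-style view places each dimension on its own spoke of a radial diagram. The sketch below computes the (x, y) vertices such a plot needs for one record, the geometry a rendering backend (OpenGL in the paper's tool) would then draw; the dimension count and values are invented.

```python
# Map one record's normalized values onto the spokes of a Kiviat (radar)
# diagram: dimension i gets the direction 2*pi*i/n, and the value sets how far
# out along that spoke the vertex lies.
import math

def kiviat_points(values):
    """values: normalized (0..1), one per dimension -> list of (x, y) vertices
    on evenly spaced spokes of a unit circle."""
    n = len(values)
    pts = []
    for i, v in enumerate(values):
        angle = 2 * math.pi * i / n          # spoke i's direction
        pts.append((v * math.cos(angle), v * math.sin(angle)))
    return pts

# A 4-dimensional record with values already normalized to [0, 1].
pts = kiviat_points([1.0, 0.5, 0.25, 0.75])
print([(round(x, 2), round(y, 2)) for x, y in pts])
```

Connecting the vertices in order (and closing the polygon) yields the familiar radar silhouette; overlaying several records then makes their per-dimension differences visible at a glance.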

  19. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  20. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  1. The File Sync Algorithm of the ownCloud Desktop Clients

    CERN Document Server

    CERN. Geneva

    2014-01-01

    The ownCloud desktop clients provide file syncing between desktop machines and the ownCloud server and are available for the major desktop platforms. This presentation will give an overview of the sync algorithm used by the clients to provide a fast, reliable, and robust syncing experience for the users. It will describe the phases a sync run goes through and how it is triggered. It will also provide insight into the algorithms that decide whether a file is uploaded, downloaded, or even deleted, either on the local machine or in the cloud. Some examples of non-obvious situations in file syncing will be described and discussed. As the ownCloud sync protocol is based on the open WebDAV standard, the resulting challenges and their solutions will be illustrated. Finally, a couple of frequently proposed enhancements will be reviewed and assessed for the future development of the ownCloud server and syncing clients.
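The upload/download/delete decision described above is essentially a three-way comparison between the local state, the remote state, and the last-synced record. The toy logic below illustrates that idea with invented version tokens; the real ownCloud clients compare WebDAV ETags, modification times, and a sync journal, and handle many more cases.

```python
# Toy three-way sync decision: compare the local and remote state of a path
# against the last-synced record (the "journal") to pick an action.

def decide(local, remote, journal):
    """Each argument is a content-version token, or None if the file is absent."""
    if local == remote:
        return "in_sync" if local is not None else "nothing"
    local_changed = local != journal
    remote_changed = remote != journal
    if local_changed and remote_changed:
        return "conflict"                  # both sides diverged since last sync
    if local_changed:
        return "delete_remote" if local is None else "upload"
    return "delete_local" if remote is None else "download"

print(decide("v2", "v1", "v1"))   # edited locally only
print(decide("v1", "v3", "v1"))   # edited remotely only
print(decide(None, "v1", "v1"))   # deleted locally
print(decide("v2", "v3", "v1"))   # edited on both sides
```

The journal is what lets the client distinguish "deleted locally" from "never downloaded", one of the non-obvious situations the talk alludes to.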

  2. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  3. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor Design Technology Development using web applications. The KALIMER design database consists of the Results Database, Inter-Office Communication (IOC), the 3D CAD database, the Team Cooperation System, and Reserved Documents. The Results Database holds the research results of phase II of Liquid Metal Reactor Design Technology Development under the mid- and long-term nuclear R and D program. IOC is a linkage control system between subprojects for sharing and integrating the research results for KALIMER. The 3D CAD Database is a schematic design overview for KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents was developed to manage collected data and various documents after project completion. This report describes the hardware and software features and the database design methodology for KALIMER

  4. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  5. IMIS desktop & smartphone software solutions for monitoring spacecrafts' payload from anywhere

    Science.gov (United States)

    Baroukh, J.; Queyrut, O.; Airaud, J.

    In past years, the demand for satellite remote operations has increased, driven on the one hand by the desire to reduce operations cost (on-call operators out of business hours), and on the other hand by the development of cooperative space missions resulting in a worldwide distribution of engineers and science team members. Only a few off-the-shelf solutions exist to fulfill the need for remote payload monitoring, and they mainly use proprietary devices. The recent advent of mobile technologies (laptops, smartphones and tablets), as well as the worldwide deployment of broadband networks (3G, Wi-Fi hotspots), has opened a technical window that brings new options. As part of the Mars Science Laboratory (MSL) mission, the Centre National d'Etudes Spatiales (CNES, the French space agency) has developed a new software solution for monitoring spacecraft payloads. The Instrument Monitoring Interactive Software (IMIS) offers state-of-the-art operational features for payload monitoring and can be accessed remotely. It was conceived as a generic tool that can be used for heterogeneous payloads and missions. IMIS was designed as a classical client/server architecture. The server is hosted at CNES and acts as a data provider, while two different kinds of clients are available depending on the level of mobility required. The first is a rich client application, built on the Eclipse framework, which can be installed on the usual operating systems and communicates with the server through the Internet. The second is a smartphone application for the Android platform, connected to the server through the mobile broadband network or a Wi-Fi connection. This second client is mainly devoted to on-call operations and thus only contains a subset of the IMIS functionality.
This paper describes the operational context, including security aspects, that led to the development of IMIS, presents the selected software architecture, and details the various features of both clients, desktop and smartphone.

  6. Characterization of emissions from a desktop 3D printer and indoor air measurements in office settings.

    Science.gov (United States)

    Steinle, Patrick

    2016-01-01

    Emissions from a desktop 3D printer based on fused deposition modeling (FDM) technology were measured in a test chamber, and indoor air was monitored in office settings. Ultrafine aerosol (UFA) emissions were higher while printing a standard object with polylactic acid (PLA) than with acrylonitrile butadiene styrene (ABS) polymer (2.1 × 10⁹ vs. 2.4 × 10⁸ particles/min). Prolonged use of the printer led to higher emission rates (factor 2 with PLA and 4 with ABS, measured after seven months of occasional use). UFA consisted mainly of volatile droplets; some small (100-300 nm diameter) iron-containing and soot-like particles were also found. Emissions of inhalable and respirable dust were below the limit of detection (LOD) when measured gravimetrically, and only slightly higher than background when measured with an aerosol spectrometer. Emissions of volatile organic compounds (VOC) were in the range of 10 µg/min. Styrene accounted for more than 50% of total VOC emitted when printing with ABS; for PLA, methyl methacrylate (MMA, 37% of TVOC) was detected as the predominant compound. Two polycyclic aromatic hydrocarbons (PAH), fluoranthene and pyrene, were observed in very low amounts. All other analyzed PAH, as well as inorganic gases and metal emissions except iron (Fe) and zinc (Zn), were below the LOD or did not differ from background without printing. A single 3D print (165 min) in a large, well-ventilated office did not significantly increase the UFA and VOC concentrations, whereas these were readily detectable in a small, unventilated room, with UFA concentrations increasing by 2,000 particles/cm³ and MMA reaching a peak of 21 µg/m³ and still being detectable in the room even 20 hr after printing.

  7. Applications and a three-dimensional desktop environment for an immersive virtual reality system

    International Nuclear Information System (INIS)

    Kageyama, Akira; Masada, Youhei

    2013-01-01

    We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes "teleportation" into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE.

  8. Usability Comparisons of Head-Mounted vs. Stereoscopic Desktop Displays in a Virtual Reality Environment with Pain Patients.

    Science.gov (United States)

    Tong, Xin; Gromala, Diane; Gupta, Dimple; Squire, Pam

    2016-01-01

    Researchers have shown that immersive Virtual Reality (VR) can serve as an unusually powerful pain control technique. However, research assessing the reported symptoms and negative effects of VR systems indicates that it is important to ascertain whether these symptoms arise from the use of particular VR display devices, particularly for users who are deemed "at risk," such as chronic pain patients. Moreover, these patients have specific and often complex needs and requirements, and because basic issues such as comfort may trigger anxiety or panic attacks, it is important to examine basic questions of the feasibility of using VR displays. Therefore, this repeated-measures experiment was conducted with two VR displays: the Oculus Rift head-mounted display (HMD) and Firsthand Technologies' immersive desktop display, DeepStream3D. The characteristics of these immersive displays differ: one is worn, enabling patients to move their heads, while the other is peered into, allowing less head movement. To assess the severity of physical discomforts, 20 chronic pain patients tried both displays while watching a VR pain management demo in clinical settings. Results indicated that participants experienced higher levels of simulator sickness using the Oculus Rift HMD. However, results also indicated other preferences between the two VR displays among patients, including physical comfort levels and sense of immersion. Few studies have been conducted that compare the usability of specific VR devices with chronic pain patients using a therapeutic virtual environment in pain clinics. Thus, the results may help clinicians and researchers choose the most appropriate VR displays for chronic pain patients and guide VR designers in enhancing the usability of VR displays for long-term pain management interventions.

  9. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  10. Forecasting and management of technology

    National Research Council Canada - National Science Library

    Roper, A. T

    2011-01-01

    ... what the authors see as the innovations to technology management in the last 17 years: the Internet; the greater focus on group decision-making including process management and mechanism design; and desktop software that has transformed the analytical capabilities of technology managers"--Provided by publisher.

  11. [Conceptual foundations of creation of branch database of technology and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of healthcare sphere of Ukraine].

    Science.gov (United States)

    Horban', A Ie

    2013-09-01

    The implementation of state policy in the field of technology transfer in the medical branch, pursuant to the law of Ukraine of 02.10.2012 No 5407-VI "On Amendments to the Law of Ukraine 'On State Regulation of Activity in the Field of Technology Transfer'", is considered, namely ensuring the formation of a branch database of technology and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of the healthcare sphere of Ukraine and established from the budget. An analysis of international and domestic experience in processing information about intellectual property rights and in implementing systems supporting the transfer of new technologies is made. The main conceptual principles for the creation of this branch technology transfer database and the branch technology transfer network are defined.

  12. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g., presently the open community version of MySQL and a single-instance Oracle database server. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  13. Database Replication

    Directory of Open Access Journals (Sweden)

    Marius Cristian MAZILU

    2010-12-01

    Full Text Available For someone who has worked in an environment in which the same database is used for data entry and reporting, or perhaps managed a single database server that was utilized by too many users, the advantages brought by data replication are clear. The main purpose of this paper is to emphasize those advantages, as well as to present the different types of database replication and the cases in which their use is recommended.

  14. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  15. Preparatory Study for the Design of a Desktop Videoconferencing Platform for Synchronous Language Teaching

    Science.gov (United States)

    Guichon, Nicolas

    2010-01-01

    This case study paves the way for a research and development project aiming to design a desktop videoconferencing platform specifically dedicated to synchronous language teaching. It starts by defining a model of language pedagogy adapted to distance and taking into account the affordances of videoconferencing for language teaching. It then…

  16. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

    Full Text Available The transition from physical servers to a virtual server infrastructure (VSI) and from desktop devices to a virtual desktop infrastructure (VDI) raises the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Besides, how to appropriately choose a hypervisor for the desired server/desktop virtualization is really challenging, because a trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on a C/P ratio that is derived from a composite index of consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though ESX Server obtains the highest ROI and lowest TCO in server virtualization, and Hyper-V R2 gains the best performance in virtual machine management, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only greatly reduces the initial investment needed to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.
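    The abstract does not give the exact formula behind its C/P ratio, but the idea of ranking hypervisors by cost over performance can be sketched as follows. The figures and the simple ratio are hypothetical illustrations, not the paper's composite index:

```python
def cost_performance_ratio(tco, perf_index):
    """Hypothetical C/P ratio: total cost of ownership divided by a
    composite performance index. Lower is better: the same performance
    at lower cost, or more performance at the same cost."""
    return tco / perf_index

# Hypothetical figures for three hypervisors (not taken from the paper):
candidates = {
    "ESX":        cost_performance_ratio(tco=50000, perf_index=90),
    "Hyper-V R2": cost_performance_ratio(tco=45000, perf_index=95),
    "Proxmox VE": cost_performance_ratio(tco=8000,  perf_index=80),
}
best = min(candidates, key=candidates.get)
```

    Under these made-up numbers the low-cost option wins despite a lower performance index, which mirrors the trade-off the paper describes.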

  17. How Simulator Interfaces Affect Transfer of Training: Comparing Wearable and Desktop Systems

    Science.gov (United States)

    2012-06-01

    a keyboard and mouse to control a virtual Soldier. The desktop computers used were Dell XPS systems, with a 2.66 GHz Intel Core 2 Duo CPU, 4 GB of... significantly slower to complete the scenarios than the control condition, with the magnitude of this difference diminishing over time.

  18. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    Science.gov (United States)

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

    Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in the offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment during computer use. Trial registration no: ACTRN12617000326392.

  19. Negotiation of Meaning in Desktop Videoconferencing-Supported Distance Language Learning

    Science.gov (United States)

    Wang, Yuping

    2006-01-01

    The aim of this research is to reveal the dynamics of focus on form in task completion via videoconferencing. This examination draws on current second language learning theories regarding effective language acquisition, research in Computer Mediated Communication (CMC) and empirical data from an evaluation of desktop videoconferencing-supported…

  20. Fostering Second Language Oral Communication through Constructivist Interaction in Desktop Videoconferencing

    Science.gov (United States)

    Lee, Lina

    2007-01-01

    This article describes a classroom project using one-to-one desktop videoconferencing to enhance the development of second language (L2) oral skills. Eighteen university students worked collaboratively with expert speakers to complete task-based activities. The author gathered data from video-recording samples, reflections, and oral interviews to…

  1. Multimodal Language Learner Interactions via Desktop Videoconferencing within a Framework of Social Presence: Gaze

    Science.gov (United States)

    Satar, H. Muge

    2013-01-01

    Desktop videoconferencing (DVC) offers many opportunities for language learning through its multimodal features. However, it also brings some challenges such as gaze and mutual gaze, that is, eye-contact. This paper reports some of the findings of a PhD study investigating social presence in DVC interactions of English as a Foreign Language (EFL)…

  2. The Use of the Webcam for Teaching a Foreign Language in a Desktop Videoconferencing Environment

    Science.gov (United States)

    Develotte, Christine; Guichon, Nicolas; Vincent, Caroline

    2010-01-01

    This paper explores how language teachers learn to teach with a synchronous multimodal setup ("Skype"), and it focuses on their use of the webcam during the pedagogical interaction. First, we analyze the ways that French graduate students learning to teach online use the multimodal resources available in a desktop videoconferencing (DVC)…

  3. Using M@th Desktop Notebooks and Palettes in the Classroom

    Science.gov (United States)

    Simonovits, Reinhard

    2011-01-01

    This article explains the didactical design of M@th Desktop (MD), a teaching and learning software application for high schools and universities. The use of two types of MD resources is illustrated: notebooks and palettes, focusing on the topic of exponential functions. The handling of MD in a blended learning approach and the impact on the…

  4. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    Science.gov (United States)

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  5. Cybersickness and desktop simulations : Field of view effects and user experience

    NARCIS (Netherlands)

    Toet, Alexander; De Vries, Sjoerd C.; Van Emmerik, Martijn L.; Bos, Jelte E.

    2008-01-01

    We used a desktop computer game environment to study the effect Field-of-View (FOV) on cybersickness. In particular, we examined the effect of differences between the internal FOV (iFOV, the FOV which the graphics generator is using to render its images) and the external FOV (eFOV, the FOV of the

  6. Cybersickness and desktop simulations : field of view effects and user experience

    NARCIS (Netherlands)

    Toet, A.; Vries, S.C. de; Emmerik, M.L. van; Bos, J.E.

    2008-01-01

    We used a desktop computer game environment to study the effect Field-of-View (FOV) on cybersickness. In particular, we examined the effect of differences between the internal FOV (iFOV, the FOV which the graphics generator is using to render its images) and the external FOV (eFOV, the FOV of the

  7. Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms

    Science.gov (United States)

    Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick

    2009-01-01

    This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although this study considers the case of construction design as an example to illustrate the framework, the method can readily be extended to other engineering design problems as well.…

  8. Multimodal Desktop Interaction: The Face–Object–Gesture–Voice Example

    DEFF Research Database (Denmark)

    Vidakis, Nikolas; Vlasopoulos, Anastasios; Kounalakis, Tsampikos

    2013-01-01

    This paper presents a natural user interface system based on multimodal human computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system which gives users the ability to interact with desktop...

  9. GTfold: Enabling parallel RNA secondary structure prediction on multi-core desktops

    DEFF Research Database (Denmark)

    Swenson, M Shel; Anderson, Joshua; Ash, Andrew

    2012-01-01

    achieved significant improvements in runtime, but their implementations were not portable from niche high-performance computers or easily accessible to most RNA researchers. With the increasing prevalence of multi-core desktop machines, a new parallel prediction program is needed to take full advantage...

  10. Campus Computing 1992. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    Science.gov (United States)

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1992 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=970) were individuals specifically responsible for the operation and future direction of academic…

  11. Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.

    Science.gov (United States)

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…

  12. Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    Science.gov (United States)

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…

  13. Meaning-Making in Online Language Learner Interactions via Desktop Videoconferencing

    Science.gov (United States)

    Satar, H. Müge

    2016-01-01

    Online language learning and teaching in multimodal contexts has been identified as one of the key research areas in computer-assisted language learning (CALL) (Lamy, 2013; White, 2014). This paper aims to explore meaning-making in online language learner interactions via desktop videoconferencing (DVC) and in doing so illustrate multimodal transcription and…

  14. JICST Factual Database: JICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency in 1987. JICST has modified the JETOC database system, added data, and started the online service through JOIS-F (JICST Online Information Service - Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  15. Calculating length of gestation from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) database versus vital records may alter reported rates of prematurity.

    Science.gov (United States)

    Stern, Judy E; Kotelchuck, Milton; Luke, Barbara; Declercq, Eugene; Cabral, Howard; Diop, Hafsatou

    2014-05-01

    To compare length of gestation after assisted reproductive technology (ART) as calculated by three methods from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) and vital records (birth and fetal death) in the Massachusetts Pregnancy to Early Life Longitudinal Data System (PELL). Historical cohort study. Database linkage analysis. Live or stillborn deliveries. None. ART deliveries were linked to live birth or fetal death certificates. Length of gestation in 7,171 deliveries from fresh autologous ART cycles (2004-2008) was calculated and compared with that of SART CORS using three methods: M1 = outcome date - cycle start date; M2 = outcome date - transfer date + 17 days; and M3 = outcome date - transfer date + 14 days + day of transfer. Generalized estimating equation models were used to compare methods. Singleton and multiple deliveries were included. Overall prematurity (delivery <37 weeks) estimates differed between data sources in >45% of deliveries and by more than 1 week in >22% of deliveries. Each method differed from the others. Estimates of preterm birth in ART vary depending on the source of data and method of calculation. Some estimates may overestimate preterm birth rates for ART conceptions. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
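    The three calculation methods listed in the abstract differ only in their reference date, which is easiest to see when written out. A sketch with hypothetical dates (a day-5 transfer; none of these figures come from the study):

```python
from datetime import date

def gestation_days(cycle_start, transfer, outcome, day_of_transfer, method):
    """Length of gestation in days under the three methods in the abstract."""
    if method == "M1":    # outcome date - cycle start date
        return (outcome - cycle_start).days
    if method == "M2":    # outcome date - transfer date + 17 days
        return (outcome - transfer).days + 17
    if method == "M3":    # outcome date - transfer date + 14 days + day of transfer
        return (outcome - transfer).days + 14 + day_of_transfer
    raise ValueError(method)

cycle_start = date(2008, 1, 1)    # hypothetical cycle start
transfer = date(2008, 1, 20)      # hypothetical day-5 blastocyst transfer
outcome = date(2008, 9, 24)       # hypothetical delivery date

m1 = gestation_days(cycle_start, transfer, outcome, 5, "M1")  # 267 days
m2 = gestation_days(cycle_start, transfer, outcome, 5, "M2")  # 265 days
m3 = gestation_days(cycle_start, transfer, outcome, 5, "M3")  # 267 days
```

    Even in this single example M2 disagrees with M1 and M3 by two days; across thousands of deliveries such offsets can shift which pregnancies fall under a 37-week prematurity cutoff.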

  16. Teaching a Foreign Language in a Desktop Videoconferencing Environment

    Science.gov (United States)

    Kotula, Krzysztof

    2016-01-01

    This paper aims to explore how language instructors teach with a synchronous multimodal setup (Skype). It reports on findings from research which evaluated how teachers use technologies to enable them to work in distance learning contexts. A total of 124 teachers (86 female and 38 male), offering online private lessons, were asked to complete a…

  17. Portable Desktop Apps with GitHub Electron

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Wouldn't it be nice if you could develop applications that work everywhere, regardless of Operating System or Platform? Even better, what if you could employ the same front-end technologies you use for your web/mobile apps? Meet GitHub Electron.

  18. Brasilia’s Database Administrators

    Directory of Open Access Journals (Sweden)

    Jane Adriana

    2016-06-01

    Full Text Available Database administration has gained an essential role in the management of new database technologies. Different data models, beyond the traditional relational database, are being created to support enormous data volumes. These new models are called NoSQL (Not only SQL) databases. The adoption of best practices and procedures has become essential for the operation of database management systems. Thus, this paper investigates some of the techniques and tools used by database administrators. The study highlights features and particularities of databases within the area of Brasilia, the capital of Brazil. The results point to which new database management technologies are currently the most relevant, as well as the central issues in this area.

  19. LandIT Database

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    and reporting purposes. This paper presents the LandIT database, which is the result of the LandIT project, an industrial collaboration project that developed technologies for communication and data integration between farming devices and systems. The LandIT database is in principle based...... on the ISOBUS standard; however, the standard is extended with additional requirements, such as gradual data aggregation and flexible exchange of farming data. This paper describes the conceptual and logical schemas of the proposed database, based on a real-life farming case study....

  20. National Geochronological Database

    Science.gov (United States)

    Revised by Sloan, Jan; Henry, Christopher D.; Hopkins, Melanie; Ludington, Steve; Original database by Zartman, Robert E.; Bush, Charles A.; Abston, Carl

    2003-01-01

    The National Geochronological Data Base (NGDB) was established by the United States Geological Survey (USGS) to collect and organize published isotopic (also known as radiometric) ages of rocks in the United States. The NGDB (originally known as the Radioactive Age Data Base, RADB) was started in 1974. A committee appointed by the Director of the USGS was given the mission to investigate the feasibility of compiling the published radiometric ages for the United States into a computerized data bank for ready access by the user community. A successful pilot program, which was conducted in 1975 and 1976 for the State of Wyoming, led to a decision to proceed with the compilation of the entire United States. For each dated rock sample reported in published literature, a record containing information on sample location, rock description, analytical data, age, interpretation, and literature citation was constructed and included in the NGDB. The NGDB was originally constructed and maintained on a mainframe computer, and later converted to a Helix Express relational database maintained on an Apple Macintosh desktop computer. The NGDB and a program to search the data files were published and distributed on Compact Disc-Read Only Memory (CD-ROM) in standard ISO 9660 format as USGS Digital Data Series DDS-14 (Zartman and others, 1995). As of May 1994, the NGDB consisted of more than 18,000 records containing over 30,000 individual ages, which is believed to represent approximately one-half the number of ages published for the United States through 1991. Because the organizational unit responsible for maintaining the database was abolished in 1996, and because we wanted to provide the data in more usable formats, we have reformatted the data, checked and edited the information in some records, and provided this online version of the NGDB. This report describes the changes made to the data and formats, and provides instructions for the use of the database in geographic

  1. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on an examination of the accident databases conducted by personal contact with the federal staff responsible for administering the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and whom to contact were prime questions put to each of the database program managers. Additionally, how each agency uses the accident data was of major interest

  2. Beginning ArcGIS for desktop development using .NET

    CERN Document Server

    Amirian, Pouria

    2013-01-01

    Pouria Amirian holds a Ph.D. in Geospatial Information Systems (GIS). Dr. Amirian is a developer and GIS/IT lecturer with extensive experience developing and deploying small to large-scale Geospatial Information Systems. Wrox Beginning guides are crafted to make learning programming languages and technologies easier than you think, providing a structured, tutorial format that guides you through all the techniques involved.

  3. A desktop 3D printer in safety-critical Java

    DEFF Research Database (Denmark)

    Strøm, Tórur Biskopstø; Schoeberl, Martin

    2012-01-01

    It is desirable to bring Java technology to safety-critical systems. To this end The Open Group has created the safety-critical Java specification, which will allow Java applications, written according to the specification, to be certifiable in accordance with safety-critical standards. Although...... there exist several safety-critical Java framework implementations, there is a lack of safety-critical use cases implemented according to the specification. In this paper we present a 3D printer and its safety-critical Java level 1 implementation as a use case. With basis in the implementation we evaluate...

  4. Java in a Nutshell a Desktop Quick Reference

    CERN Document Server

    Flanagan, David

    2005-01-01

    With more than 700,000 copies sold to date, Java in a Nutshell from O'Reilly is clearly the favorite resource amongst the legion of developers and programmers using Java technology. And now, with the release of the 5.0 version of Java, O'Reilly has given the book that defined the "in a Nutshell" category another impressive tune-up. In this latest revision, readers will find Java in a Nutshell, 5th Edition, does more than just cover the extensive changes implicit in 5.0, the newest version of Java. It's undergone a complete makeover--in scope, size, and type of coverage--in order to more closely meet

  5. Java Foundation Classes in a Nutshell Desktop Quick Reference

    CERN Document Server

    Flanagan, David

    1999-01-01

    Java Foundation Classes in a Nutshell is an indispensable quick reference for Java programmers who are writing applications that use graphics or graphical user interfaces. The author of the bestselling Java in a Nutshell has written fast-paced introductions to the Java APIs that comprise the Java Foundation Classes (JFC), such as the Swing GUI components and Java 2D, so that you can start using these exciting new technologies right away. This book also includes O'Reilly's classic-style, quick-reference material for all of the classes in the javax.swing and java.awt packages and their numerous

  6. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even to geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and
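
    The mechanics described here can be illustrated with a toy read-one/write-all (ROWA) scheme: writes go to every live copy, reads can be served by any live copy. This is a minimal sketch with hypothetical class and method names, not an implementation from the book:

```python
import itertools

class ReplicatedStore:
    """Toy fully-replicated store: writes go to every live replica,
    reads are balanced round-robin across live replicas."""

    def __init__(self, n_replicas):
        self.replicas = [dict() for _ in range(n_replicas)]
        self.alive = [True] * n_replicas
        self._rr = itertools.cycle(range(n_replicas))

    def write(self, key, value):
        # ROWA: update every available copy so any replica can serve reads.
        for store, up in zip(self.replicas, self.alive):
            if up:
                store[key] = value

    def read(self, key):
        # Skip failed replicas; any live copy holds the latest value.
        for _ in range(len(self.replicas)):
            i = next(self._rr)
            if self.alive[i]:
                return self.replicas[i][key]
        raise RuntimeError("no live replicas")

    def fail(self, i):
        self.alive[i] = False

store = ReplicatedStore(3)
store.write("x", 1)
store.fail(0)           # one replica crashes...
print(store.read("x"))  # ...surviving replicas still answer: 1
```

    The sketch shows the fault-tolerance and load-distribution benefits in one place; what it omits (and what makes real replication hard) is keeping copies consistent under concurrent updates and failures mid-write.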

  7. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    Science.gov (United States)

    Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that the mobile search volume surpasses the desktop search volume and mobile search patterns differ from desktop search patterns, previous digital surveillance systems did not distinguish mobile from desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries and the number of queries with an r-value of ≥ 0.7 equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even surpassed desktop search queries over time. In the future development of influenza surveillance using search queries, it may be necessary to recognize this changing trend in mobile search data.
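
    The study's core computation, Spearman's rank correlation between a weekly query series and ILI data at several lags, can be sketched in a few lines of plain Python. The weekly series below are hypothetical, not the Naver or KCDC data:

```python
def rank(xs):
    # Average ranks (1-based), with ties sharing their mean rank.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def spearman_lag(queries, ili, lag):
    """Spearman correlation of query volume at week t vs ILI at week t+lag."""
    if lag:
        queries, ili = queries[:-lag], ili[lag:]
    return pearson(rank(queries), rank(ili))

# Hypothetical weekly series in which search volume leads ILI by one week.
q   = [3, 8, 15, 40, 32, 20, 9, 5]
ili = [2, 4,  9, 18, 41, 35, 21, 10]
best = max(range(3), key=lambda l: spearman_lag(q, ili, l))
print(best)  # -> 1: the lag-1 correlation is strongest
```

    Scanning lags like this is how a lag correlation analysis detects that search activity anticipates the surveillance signal by some number of weeks.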

  8. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    Directory of Open Access Journals (Sweden)

    Soo-Yong Shin

    Full Text Available Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that the mobile search volume surpasses the desktop search volume and mobile search patterns differ from desktop search patterns, previous digital surveillance systems did not distinguish mobile from desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries and the number of queries with an r-value of ≥ 0.7 equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even surpassed desktop search queries over time. In the future development of influenza surveillance using search queries, it may be necessary to recognize this changing trend in mobile search data.

  9. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...

  10. The desktop muon detector: A simple, physics-motivated machine- and electronics-shop project for university students

    Science.gov (United States)

    Axani, S. N.; Conrad, J. M.; Kirby, C.

    2017-12-01

    This paper describes the construction of a desktop muon detector, an undergraduate-level physics project that develops machine-shop and electronics-shop technical skills. The desktop muon detector is a self-contained apparatus that employs a plastic scintillator as the detection medium and a silicon photomultiplier for light collection. This detector can be battery powered and is used in conjunction with the provided software. The total cost per detector is approximately $100. We describe physics experiments we have performed, and then suggest several other interesting measurements that are possible with one or more desktop muon detectors.

  11. Replikasi Unidirectional pada Heterogen Database

    Directory of Open Access Journals (Sweden)

    Hendro Nindito

    2013-12-01

    Full Text Available The use of diverse database technologies in enterprises today cannot be avoided. Thus, technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we use a Windows-based MS SQL Server database as the source and a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly and working models of the interaction process are tested through repeated iteration. From this research it is concluded that database replication technology using Oracle GoldenGate can be applied in heterogeneous environments in real time.

  12. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  13. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  14. Snowstorm Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Snowstorm Database is a collection of over 500 snowstorms dating back to 1900 and updated operationally. Only storms having large areas of heavy snowfall (10-20...

  15. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article on the national database for nursing research established at Dansk Institut for Sundheds- og Sygeplejeforskning (the Danish Institute for Health and Nursing Research). The aim of the database is to gather knowledge about research and development activities within nursing.

  16. Ceramics Technology Project database: September 1991 summary report. [Materials for piston ring-cylinder liner for advanced heat/diesel engines

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1992-06-01

    The piston ring-cylinder liner area of the internal combustion engine must withstand very high temperature gradients, highly corrosive environments, and constant friction. Improving the efficiency of the engine requires ring and cylinder liner materials that can survive this abusive environment and lubricants that resist decomposition at elevated temperatures. Wear and friction tests have been done on many material combinations in environments similar to actual use to find the right materials for the situation. This report covers tribology information produced from 1986 through July 1991 by Battelle Columbus Laboratories, Caterpillar Inc., and Cummins Engine Company, Inc. for the Ceramic Technology Project (CTP). All data in this report were taken from the project's semiannual and bimonthly progress reports and cover base materials, coatings, and lubricants. The data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from the data.

  17. Fast and sensitive alignment of microbial whole genome sequencing reads to large sequence datasets on a desktop PC: application to metagenomic datasets and pathogen identification.

    Directory of Open Access Journals (Sweden)

    Lőrinc S Pongor

    Full Text Available Next generation sequencing (NGS) of metagenomic samples is becoming a standard approach to detect individual species or pathogenic strains of microorganisms. Computer programs used in the NGS community have to balance between speed and sensitivity and, as a result, species or strain level identification is often inaccurate and low abundance pathogens can sometimes be missed. We have developed Taxoner, an open source taxon assignment pipeline that includes a fast aligner (e.g., Bowtie2) and a comprehensive DNA sequence database. We tested the program on simulated datasets as well as experimental data from Illumina, IonTorrent, and Roche 454 sequencing platforms. We found that Taxoner performs as well as, and often better than, BLAST, but requires two orders of magnitude less running time, meaning that it can be run on desktop or laptop computers. Taxoner is slower than approaches that use small marker databases but is more sensitive due to the comprehensive reference database. In addition, it can be easily tuned to specific applications using small tailored databases. When applied to metagenomic datasets, Taxoner can provide a functional summary of the genes mapped and can provide strain level identification. Taxoner is written in C for Linux operating systems. The code and documentation are available for research applications at http://code.google.com/p/taxoner.
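
    Taxoner itself wraps a full read aligner (e.g., Bowtie2) and a comprehensive reference database. As a toy illustration of the underlying idea only (vote for the taxon whose reference shares the most subsequences with the read), here is a minimal k-mer-matching sketch; the reference fragments, read, and function names are hypothetical:

```python
def kmers(seq, k=8):
    # All overlapping substrings of length k.
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(references, k=8):
    # Map each k-mer to the set of taxa whose reference contains it.
    index = {}
    for taxon, seq in references.items():
        for km in kmers(seq, k):
            index.setdefault(km, set()).add(taxon)
    return index

def assign(read, index, k=8):
    # Vote: the taxon hit by the most k-mers of the read wins.
    votes = {}
    for km in kmers(read, k):
        for taxon in index.get(km, ()):
            votes[taxon] = votes.get(taxon, 0) + 1
    return max(votes, key=votes.get) if votes else None

refs = {  # hypothetical reference fragments
    "E. coli":   "ATGGCGTACGTTAGCCGTATCGGA",
    "S. aureus": "TTGACCGGATCCTAGGCAATTCGC",
}
index = build_index(refs)
read = "GCGTACGTTAGCCG"  # sampled from the E. coli fragment
print(assign(read, index))  # -> E. coli
```

    Real pipelines replace the exact k-mer vote with gapped alignment and score thresholds, and the dictionary with a disk-backed index, but the speed/sensitivity trade-off the abstract describes lives in exactly these choices.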

  18. The PEP-II project-wide database

    International Nuclear Information System (INIS)

    Chan, A.; Calish, S.; Crane, G.; MacGregor, I.; Meyer, S.; Wong, J.

    1995-05-01

    The PEP-II Project Database is a tool for monitoring the technical and documentation aspects of this accelerator construction project. It holds the PEP-II design specifications, fabrication, and installation data in one integrated system. Key pieces of the database include the machine parameter list, magnet and vacuum fabrication data, CAD drawings, publications and documentation, survey and alignment data, and property control. The database can be extended to contain information required for the operations phase of the accelerator and detector. Features such as viewing CAD drawing graphics from the database will be implemented in the future. This central Oracle database on a UNIX server is built using ORACLE Case tools. Users at the three collaborating laboratories (SLAC, LBL, LLNL) can access the data remotely, using various desktop computer platforms and graphical interfaces

  19. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    Functionally Graded Materials Database (hereinafter referred to as FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries, along with data on 2,429 researchers, 509 institutions, and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap, and how to use it. From the access analysis, user access results and users' interests are discussed.

  20. Telemedicine in rural areas. Experience with medical desktop-conferencing via satellite.

    Science.gov (United States)

    Ricke, J; Kleinholz, L; Hosten, N; Zendel, W; Lemke, A; Wielgus, W; Vöge, K H; Fleck, E; Marciniak, R; Felix, R

    1995-01-01

    Cooperation between physicians in hospitals in rural areas can be assisted by desktop-conferencing using a satellite link. For six weeks, medical desktop-conferencing was tested during daily clinical conferences between the Virchow-Klinikum, Berlin, and the Medical Academy, Wroclaw. The communications link was provided by the German Telekom satellite system MCS, which allowed temporary connections to be established on demand by manual dialling. Standard hardware and software were used for videoconferencing, as well as software for medical communication developed in the BERMED project. Digital data, such as computed tomography or magnetic resonance images, were transmitted by a digital data channel in parallel to the transmission of analogue video and audio signals. For conferences involving large groups of people, hardware modifications were required. These included the installation of a video projector, adaptation of the audio system with improved echo cancellation, and installation of extra microphones. Learning to use an unfamiliar communication medium proved to be uncomplicated for the participating physicians.

  1. The use of the webcam for teaching a foreign language in a desktop videoconferencing environment

    OpenAIRE

    Develotte, Christine; Guichon, Nicolas; Vincent, Caroline

    2010-01-01

    This paper explores how language teachers learn to teach with a synchronous multimodal setup (Skype), and it focuses on their use of the webcam during the pedagogical interaction. First, we analyze the ways that French graduate students learning to teach online use the multimodal resources available in a desktop videoconferencing (DVC) environment to monitor pedagogical interactions with intermediate level learners of French in a North-American university. Then, we exa...

  2. In and out the frame: teacher gestures during desktop videoconferencing interactions

    OpenAIRE

    Wigham, Ciara R.; Guichon, Nicolas

    2014-01-01

    This presentation will focus on the production of gestures by teacher trainees when they are interacting online with distant language learners during a desktop videoconferencing interaction. Previous research has provided a mixed picture of the use of the webcam in pedagogical situations: while it is seen as a useful medium to provide complementary information in communicative breakdown (Buckett, Stringer and Datta, 1999), the limited access to visual cues provided by the webcam is felt as li...

  3. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  4. Fab the coming revolution on your desktop : from personal computers to personal fabrication

    CERN Document Server

    Gershenfeld, Neil

    2005-01-01

    What if you could someday put the manufacturing power of an automobile plant on your desktop? According to Neil Gershenfeld, the renowned MIT scientist and inventor, the next big thing is personal fabrication-the ability to design and produce your own products, in your own home, with a machine that combines consumer electronics and industrial tools. Personal fabricators are about to revolutionize the world just as personal computers did a generation ago, and Fab shows us how.

  5. A Case for Database Filesystems

    Energy Technology Data Exchange (ETDEWEB)

    Adams, P A; Hax, J C

    2009-05-13

    Data intensive science is offering new challenges and opportunities for Information Technology and traditional relational databases in particular. Database filesystems offer the potential to store Level Zero data and analyze Level 1 and Level 3 data within the same database system [2]. Scientific data is typically composed of both unstructured files and scalar data. Oracle SecureFiles is a new database filesystem feature in Oracle Database 11g that is specifically engineered to deliver high performance and scalability for storing unstructured or file data inside the Oracle database. SecureFiles presents the best of both the filesystem and the database worlds for unstructured content. Data stored inside SecureFiles can be queried or written at performance levels comparable to that of traditional filesystems while retaining the advantages of the Oracle database.

  6. Experiment Databases

    Science.gov (United States)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Next to running machine learning algorithms based on inductive queries, much can be learned by immediately querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queriable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
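
    A minimal sketch of the idea, using an in-memory SQLite table as the experiment repository. The schema and results below are hypothetical; real experiment databases use much richer, principled descriptions of algorithm executions:

```python
import sqlite3

# Toy experiment database: one row per (algorithm, dataset) run.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE runs (
    algorithm TEXT, dataset TEXT, param_k INTEGER, accuracy REAL)""")
con.executemany(
    "INSERT INTO runs VALUES (?, ?, ?, ?)",
    [  # hypothetical stored results from prior studies
        ("knn",  "iris", 1, 0.93),
        ("knn",  "iris", 5, 0.96),
        ("tree", "iris", 0, 0.94),
        ("knn",  "wine", 5, 0.71),
        ("tree", "wine", 0, 0.88),
    ],
)

# A meta-question answered directly from the stored runs, without
# re-running any experiment: which algorithm has the best mean accuracy?
rows = con.execute("""
    SELECT algorithm, AVG(accuracy) AS mean_acc
    FROM runs GROUP BY algorithm ORDER BY mean_acc DESC
""").fetchall()
print(rows[0][0])  # -> tree
```

    The point of the sketch is the query at the end: once results are stored with enough detail, further analysis is a database query rather than a new round of experiments.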

  7. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  8. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Science.gov (United States)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids allow the idle computer resources of an enterprise or community to be utilized in a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization, both in cloud computing and in desktop grids, is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
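
    The replication trade-off the authors optimize can be sketched with a crude Monte Carlo experiment: replicate each request to r idle servers and keep the first copy to finish. This ignores queueing and the MAP arrival structure entirely and uses an illustrative heavy-tailed Weibull service time, so it only shows why the level of redundancy matters, not the paper's analysis:

```python
import random

def mean_latency(replicas, shape=0.5, scale=1.0, trials=20000, seed=42):
    """Mean completion time when each request is replicated to
    `replicas` idle servers and the first finished copy wins."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Latency of one request = minimum over its replicas' service times.
        total += min(rng.weibullvariate(scale, shape)
                     for _ in range(replicas))
    return total / trials

# With a heavy-tailed service time (Weibull shape < 1), adding replicas
# sharply cuts mean latency, at the cost of extra server work.
for r in (1, 2, 3):
    print(r, round(mean_latency(r), 3))
```

    The same experiment with a light-tailed service time shows a much smaller gain, which is one reason the optimal replication level depends on the service-time distribution.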

  9. Investigation Methodology of a Virtual Desktop Infrastructure for IoT

    Directory of Open Access Journals (Sweden)

    Doowon Jeong

    2015-01-01

    Full Text Available Cloud computing for IoT (Internet of Things) has exhibited the greatest growth in the IT market in the recent past, and this trend is expected to continue. Many companies are adopting a virtual desktop infrastructure (VDI) for private cloud computing to reduce costs and enhance the efficiency of their servers. As VDI becomes widely used, threats of cyber terror and intrusion are also increasing. To minimize the damage, the response procedure for a cyber intrusion on a VDI should be systematized. Therefore, we propose an investigation methodology for VDI solutions in this paper. Here we focus on virtual desktop infrastructure and introduce various widely used desktop virtualization solutions, such as VMware, Citrix, and Microsoft. In addition, we verify the integrity of the acquired data in order that the results of our proposed methodology are acceptable as evidence in a court of law. During the experiment, we observed an error: one of the commonly used digital forensic tools failed to mount a dynamically allocated virtual disk properly.

  10. Cycle 1 as predictor of assisted reproductive technology treatment outcome over multiple cycles: an analysis of linked cycles from the Society for Assisted Reproductive Technology Clinic Outcomes Reporting System online database.

    Science.gov (United States)

    Stern, Judy E; Brown, Morton B; Luke, Barbara; Wantman, Ethan; Lederman, Avi; Hornstein, Mark D

    2011-02-01

    To determine whether the first cycle of assisted reproductive technology (ART) predicts treatment course and outcome. Retrospective study of linked cycles. Society for Assisted Reproductive Technology Clinic Outcome Reporting System database. A total of 6,352 ART patients residing or treated in Massachusetts with a first treatment cycle in 2004-2005 using fresh, autologous oocytes and no prior ART. Women were categorized by first cycle as follows: Group I, no retrieval; Group II, retrieval, no transfer; Group III, transfer, no embryo cryopreservation; Group IV, transfer plus cryopreservation; and Group V, all embryos cryopreserved. None. Cumulative live-birth delivery per woman, use of donor eggs, intracytoplasmic sperm injection (ICSI), or frozen embryo transfers (FET). Groups differed in age, baseline FSH level, prior gravidity, diagnosis, and failure to return for Cycle 2. Live-birth delivery rates per woman for Groups I through V, for women with no delivery in Cycle 1, were 32.1%, 35.9%, 40.1%, 53.4%, and 51.3%, respectively. Groups I and II were more likely to subsequently use donor eggs (14.5% and 10.9%). Group II had the highest use of ICSI (73.3%); Group III had the lowest use of FET (8.9%). Course of treatment in the first ART cycle is related to different cumulative live-birth delivery rates and eventual use of donor eggs, ICSI, and FET. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  11. Teaching, Technology and Terminal Tutoring.

    Science.gov (United States)

    Page, M.

    1980-01-01

    Reviews computer assisted instructional (CAI) technology now available in large scale systems such as PLATO, and in videocassette-linked computer systems. Computer Managed Instruction (CMI) is described as a means of fitting instruction to the student's cognitive style. Discusses briefly cable TV, electronic newspapers, desk-top computers, and…

  12. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder.

    Directory of Open Access Journals (Sweden)

    Alexander Toet

    Full Text Available Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similar to those of its real world counterpart. Previous studies have shown that factors like simulated lighting, sound and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool to tune the affective appraisal of desktop VEs. This study investigated if exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder. Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air or subliminal levels of unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety related concerns and associated affective feelings. Signs of crime in the desktop VE were associated with negative affective feelings and concerns for personal safety and personal property. However, there was no significant difference between reported safety related concerns and affective connotations in the control (no-odor) condition and in each of the two ambient odor conditions. Ambient odor did not affect safety related concerns and affective connotations associated with signs of disorder in the desktop VE. Thus, semantic congruency between ambient odor and a desktop VE may not be sufficient to influence its affective appraisal, and a more realistic simulation in which simulated objects appear to emit scents may be required to achieve this goal.

  13. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder.

    Science.gov (United States)

    Toet, Alexander; van Schaik, Martin; Theunissen, Nicolet C M

    2013-01-01

    Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similar to those of its real world counterpart. Previous studies have shown that factors like simulated lighting, sound and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool to tune the affective appraisal of desktop VEs. This study investigated if exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder. Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air or subliminal levels of unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety related concerns and associated affective feelings. Signs of crime in the desktop VE were associated with negative affective feelings and concerns for personal safety and personal property. However, there was no significant difference between reported safety related concerns and affective connotations in the control (no-odor) and in each of the two ambient odor conditions. Ambient odor did not affect safety related concerns and affective connotations associated with signs of disorder in the desktop VE. Thus, semantic congruency between ambient odor and a desktop VE may not be sufficient to influence its affective appraisal, and a more realistic simulation in which simulated objects appear to emit scents may be required to achieve this goal.

  14. Solubility Database

    Science.gov (United States)

    SRD 106 IUPAC-NIST Solubility Database (Web, free access)   These solubilities are compiled from 18 volumes of the International Union of Pure and Applied Chemistry (IUPAC)-NIST Solubility Data Series. The database includes liquid-liquid, solid-liquid, and gas-liquid systems. Typical solvents and solutes include water, seawater, heavy water, inorganic compounds, and a variety of organic compounds such as hydrocarbons, halogenated hydrocarbons, alcohols, acids, esters and nitrogen compounds. There are over 67,500 solubility measurements and over 1,800 references.

  15. Outline of the Desktop Severe Accident Graphic Simulator Module for OPR-1000

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. Y.; Ahn, K. I. [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    This paper introduces the desktop severe accident graphic simulator module (VMAAP), a window-based severe accident simulator using MAAP as its engine. VMAAP is one of the submodules of the SAMEX system (Severe Accident Management Support Expert System), a decision support system for use in severe accident management following an incident at a nuclear power plant. The SAMEX system consists of four major modules as sub-systems: (a) Severe accident risk database module (SARDB): stores the results of integrated severe accident analysis codes such as MAAP and MELCOR for hundreds of high-frequency scenarios for the reference plant; (b) Risk-informed severe accident risk database management module (RI-SARD): provides a platform to identify the initiating event, determine plant status and equipment availability, diagnose the status of the reactor core, reactor vessel and containment building, and predict the plant behavior; (c) Severe accident management simulator module (VMAAP): runs the MAAP4 code with a user-friendly graphic interface for input deck and output display; (d) On-line severe accident management guidance module (On-line SAMG): provides available accident management strategies in an electronic format. The role of VMAAP in SAMEX can be described as follows. SARDB contains most of the high-frequency scenarios based on a level 2 probabilistic safety analysis, so there is a good chance that a real accident sequence is similar to one of the database cases. In such a case, RI-SARD can predict the accident progression by a scenario-based or symptom-based search, depending on the available plant parameter information. Nevertheless, there may still be deviations or variations between the actual scenario and the database scenario. These deviations can be decreased by using a real-time graphic accident simulator, VMAAP. VMAAP is a MAAP4-based severe accident simulation model for the OPR-1000 plant. It can simulate a spectrum of physical processes

  16. Outline of the Desktop Severe Accident Graphic Simulator Module for OPR-1000

    International Nuclear Information System (INIS)

    Park, S. Y.; Ahn, K. I.

    2015-01-01

    This paper introduces the desktop severe accident graphic simulator module (VMAAP), a window-based severe accident simulator using MAAP as its engine. VMAAP is one of the submodules of the SAMEX system (Severe Accident Management Support Expert System), a decision support system for use in severe accident management following an incident at a nuclear power plant. The SAMEX system consists of four major modules as sub-systems: (a) Severe accident risk database module (SARDB): stores the results of integrated severe accident analysis codes such as MAAP and MELCOR for hundreds of high-frequency scenarios for the reference plant; (b) Risk-informed severe accident risk database management module (RI-SARD): provides a platform to identify the initiating event, determine plant status and equipment availability, diagnose the status of the reactor core, reactor vessel and containment building, and predict the plant behavior; (c) Severe accident management simulator module (VMAAP): runs the MAAP4 code with a user-friendly graphic interface for input deck and output display; (d) On-line severe accident management guidance module (On-line SAMG): provides available accident management strategies in an electronic format. The role of VMAAP in SAMEX can be described as follows. SARDB contains most of the high-frequency scenarios based on a level 2 probabilistic safety analysis, so there is a good chance that a real accident sequence is similar to one of the database cases. In such a case, RI-SARD can predict the accident progression by a scenario-based or symptom-based search, depending on the available plant parameter information. Nevertheless, there may still be deviations or variations between the actual scenario and the database scenario. These deviations can be decreased by using a real-time graphic accident simulator, VMAAP. VMAAP is a MAAP4-based severe accident simulation model for the OPR-1000 plant. It can simulate a spectrum of physical processes

  17. Establishment and application of an analytical in-house database (IHDB) for rapid discrimination of Bacillus subtilis group (BSG) using whole-cell MALDI-TOF MS technology.

    Science.gov (United States)

    Huang, Chien-Hsun; Huang, Lina; Chang, Mu-Tzu; Chen, Kuo-Lung

    2016-10-01

    Members of the Bacillus subtilis group (BSG) possess industrial applicability; unfortunately, B. subtilis and its phylogenetically closest species are indistinguishable from one another using 16S rDNA sequencing, physiological and biochemical tests. Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) is a relatively novel technique for the fast and reliable identification of microorganisms. The aim of this study was to construct a unique analytical in-house database (IHDB) for BSG discrimination based on whole-cell protein fingerprinting using MALDI-TOF MS, as well as to discover biomarkers from the MS peaks to generate a classification model for further differentiation using the ClinProTools software. Type strains of 12 species (including five subspecies) of the BSG were used to build a main spectrum profile (MSP) to create an IHDB under the optimized parameters. The BSG isolates identified by partial recA gene sequencing were used for IHDB validation. A total of 84 (100%) isolates were correctly identified to the species level and had high score values (mean score: 2.52). However, the IHDB gave ambiguous identification at the subspecies level of Bacillus amyloliquefaciens. After implementation of the classification models, the strains could be clearly differentiated. We have successfully developed a rapid, accurate and cost-effective platform for the species- and subspecies-level discrimination of BSG based on the implementation of the IHDB coupled with ClinProTools, which can be employed as an alternative technology to DNA sequencing and applied for efficient quality control of microbial agents. Copyright © 2016 Elsevier Ltd. All rights reserved.
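
    The matching step this record describes - scoring a query spectrum against reference main spectrum profiles in an in-house database - can be sketched in a few lines. The peak lists, the 2 Da tolerance, and the intensity-weighted scoring rule below are illustrative assumptions, not the actual Biotyper/ClinProTools algorithm.

```python
# Hypothetical sketch of fingerprint matching against an in-house database (IHDB):
# each reference "main spectrum profile" is a list of (m/z, relative intensity)
# peaks; a query spectrum is scored by the fraction of reference peak intensity
# matched within an m/z tolerance, and the best-scoring species is reported.

def match_score(query, reference, tol=2.0):
    """Fraction of reference peak intensity matched by query peaks within tol Da."""
    total = sum(inten for _, inten in reference)
    matched = 0.0
    for mz_r, inten_r in reference:
        if any(abs(mz_q - mz_r) <= tol for mz_q, _ in query):
            matched += inten_r
    return matched / total

def identify(query, ihdb, tol=2.0):
    """Return (species, score) for the best-matching database entry."""
    return max(((sp, match_score(query, msp, tol)) for sp, msp in ihdb.items()),
               key=lambda t: t[1])

# Toy database with two species (peak lists are illustrative, not real spectra).
ihdb = {
    "B. subtilis":          [(3400.0, 1.0), (5240.0, 0.8), (6900.0, 0.6)],
    "B. amyloliquefaciens": [(3400.0, 1.0), (5100.0, 0.9), (7150.0, 0.5)],
}
query = [(3401.2, 0.9), (5239.1, 0.7), (6898.5, 0.5)]
species, score = identify(query, ihdb)
```

    In practice the discriminating biomarker peaks would be selected by the classification model rather than hand-picked as here.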

  18. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  19. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Evans-Illidge

    2008-10-01

    Full Text Available Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist, and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article.
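
    Overlaying custom citation data on Google Earth, as described above, amounts to emitting KML. A minimal standard-library sketch, with an invented record schema of (title, latitude, longitude):

```python
# Sketch: writing citation sites as Google Earth placemarks using only the
# standard library. The output is a minimal KML document that Google Earth or
# ArcGIS Explorer can open.
import xml.etree.ElementTree as ET

def records_to_kml(records):
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for title, lat, lon in records:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = title
        # KML coordinate order is longitude,latitude[,altitude]
        point = ET.SubElement(pm, "Point")
        ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Hypothetical record: a citation georeferenced to a reef site.
kml_text = records_to_kml([("Natural products survey, GBR site", -18.29, 147.7)])
```

    Saving `kml_text` to a `.kml` file is enough for the desktop visualisation tools to display the geobibliography.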

  20. Exploitation of Existing Voice Over Internet Protocol Technology for Department of the Navy Application

    National Research Council Canada - National Science Library

    Vegter, Henry

    2002-01-01

    ..., reduced cost associated with toll calls and the merger of the telephone with the desktop will keep adoption of this technology on the path to ubiquitous use. Topics explored in the thesis include...

  1. Inorganic Crystal Structure Database (ICSD)

    Science.gov (United States)

    SRD 84 FIZ/NIST Inorganic Crystal Structure Database (ICSD) (PC database for purchase)   The Inorganic Crystal Structure Database (ICSD) is produced cooperatively by the Fachinformationszentrum Karlsruhe (FIZ) and the National Institute of Standards and Technology (NIST). The ICSD is a comprehensive collection of crystal structure data of inorganic compounds containing more than 140,000 entries and covering the literature from 1915 to the present.

  2. The implementation of virtualization technology in EAST data system

    International Nuclear Information System (INIS)

    Wang, Feng; Sun, Xiaoyang; Li, Shi; Wang, Yong; Xiao, Bingjia; Chang, Sidi

    2014-01-01

    Highlights: • The server virtualization based on XenServer has been used in the EAST data center for common servers and the software development platform. • The application virtualization based on XenApp has been demonstrated in EAST to provide an easy and unified data browsing method. • The desktop virtualization based on XenDesktop has been adopted in the new EAST central control room. - Abstract: Virtualization technology is currently popular in many fields and offers many advantages, such as reduced costs, unified management, mobile applications, and cross-platform operation. We have implemented virtualization technology in the EAST control and data system. There are primarily four providers of virtualization technology: VMware, Citrix, Microsoft Hyper-V, and open source solutions. We chose the Citrix solution to implement our virtualization system, which mainly covers three aspects. First, we adopt XenServer to realize virtual servers for the EAST data management and service system. Second, we use XenApp to realize a cross-platform system for unified data access. Third, in order to simplify the management of client computers, we adopt XenDesktop to realize virtual desktops for the new central control room. The details of the implementation are described in this paper.

  3. The implementation of virtualization technology in EAST data system

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: wangfeng@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); Sun, Xiaoyang; Li, Shi; Wang, Yong [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); Xiao, Bingjia [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230031 (China); Chang, Sidi [Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China)

    2014-05-15

    Highlights: • The server virtualization based on XenServer has been used in the EAST data center for common servers and the software development platform. • The application virtualization based on XenApp has been demonstrated in EAST to provide an easy and unified data browsing method. • The desktop virtualization based on XenDesktop has been adopted in the new EAST central control room. - Abstract: Virtualization technology is currently popular in many fields and offers many advantages, such as reduced costs, unified management, mobile applications, and cross-platform operation. We have implemented virtualization technology in the EAST control and data system. There are primarily four providers of virtualization technology: VMware, Citrix, Microsoft Hyper-V, and open source solutions. We chose the Citrix solution to implement our virtualization system, which mainly covers three aspects. First, we adopt XenServer to realize virtual servers for the EAST data management and service system. Second, we use XenApp to realize a cross-platform system for unified data access. Third, in order to simplify the management of client computers, we adopt XenDesktop to realize virtual desktops for the new central control room. The details of the implementation are described in this paper.

  4. National Patient Care Database (NPCD)

    Data.gov (United States)

    Department of Veterans Affairs — The National Patient Care Database (NPCD), located at the Austin Information Technology Center, is part of the National Medical Information Systems (NMIS). The NPCD...

  5. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the necessity to improve the quality of information in the organization. The data, proceeding from different sources and having a variety of forms - both structured and unstructured - are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that data stored in operational systems - including databases - are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demands for complex analysis, which could not be properly achieved with operational databases. The present paper emphasizes some of the criteria that information application developers can use in order to choose between a database solution and a data warehouse one.
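
    The operational-versus-analytical distinction the abstract draws can be made concrete with a toy example: the same engine (here SQLite from the standard library) serving an operational table of individual transactions and a warehouse-style aggregate query over the accumulated data. Table and column names are invented for the illustration.

```python
# Toy illustration: operational inserts vs. an analytical aggregation.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")

# Operational workload: many small inserts as business events occur.
con.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("2007-01-01", "north", 120.0),
                 ("2007-01-01", "south", 80.0),
                 ("2007-01-02", "north", 200.0)])

# Analytical workload: a filtered, integrated view answering a decision-making
# question in one query - the kind of access pattern warehouses are built for.
totals = dict(con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())
```

    A real warehouse would add the extraction, filtering, and integration steps the abstract mentions; the point here is only the difference in access pattern.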

  6. DYNALIGHT DESKTOP

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin; Kjær, Katrine Heinsvig; Ottosen, Carl-Otto

    2018-01-01

    … for energy- and cost-efficient climate control strategies that do not compromise product quality. In this paper, we present a novel approach addressing dynamic control of supplemental light in greenhouses, aiming to decrease electricity costs and energy consumption without loss in plant productivity. Our approach uses weather forecasts and electricity prices to compute energy- and cost-efficient supplemental light plans, which fulfil the production goals of the grower. The approach is supported by a set of newly developed planning software, which interfaces with a greenhouse climate computer. The planning … resulted in large daily variation in the distribution and duration of light periods and daily light integrals (DLI). However, plant growth and flowering were mainly affected by differences in average DLI and only marginally affected by the irregular light periods, and a 25% reduction in electricity use and cost …

  7. DYNALIGHT DESKTOP

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin; Kjær, Katrine Heinsvig; Ottosen, Carl-Otto

    2018-01-01

    … Our approach uses weather forecasts and electricity prices to compute energy- and cost-efficient supplemental light plans, which fulfil the production goals of the grower. The approach is supported by a set of newly developed planning software, which interfaces with a greenhouse climate computer. The planning …

  8. Design and Development of Web-Based SQL Server Database Management Software

    Directory of Open Access Journals (Sweden)

    Muchammad Husni

    2005-01-01

    Full Text Available Microsoft SQL Server is a client/server desktop database server application: it comprises a client component, which displays and manipulates data, and a server component, which stores, retrieves, and secures the database. Management operations on all database servers in the network are performed by the database administrator using SQL Server's main administrative tool, Enterprise Manager. As a result, the database administrator can only perform these operations on a computer on which Microsoft SQL Server has been installed. In this research, a web-based application for managing database servers was designed using ASP.Net. The application uses ADO.NET, employing Transact-SQL and stored procedures on the server, to perform database management operations on a SQL database server and display the results on the web. The database administrator can run the web-based application from any computer on the network and connect to the SQL database server through a web browser, making it easier to carry out administrative tasks without having to use the server computer.   Keywords: Transact-SQL, ASP.Net, ADO.NET, SQL Server

  9. Oracle Application Express 5 for beginners a practical guide to rapidly develop data-centric web applications accessible from desktop, laptops, tablets, and smartphones

    CERN Document Server

    2015-01-01

    Oracle Application Express has taken another big leap towards becoming a true next-generation RAD tool. It has entered into its fifth version to build robust web applications. One of the most significant features in this release is a new page designer that helps developers create and edit page elements within a single page design view, which enormously maximizes developer productivity. Without involving the audience too much in the boring bits, this full-color edition adopts an inspiring approach that helps beginners practically evaluate almost every feature of Oracle Application Express, including all features new to version 5. The most convincing way to explore a technology is to apply it to a real world problem. In this book, you’ll develop a sales application that demonstrates almost every feature to practically expose the anatomy of Oracle Application Express 5. The short list below presents some main topics of Oracle APEX covered in this book: Rapid web application development for desktops, la...

  10. Multi-memetic Mind Evolutionary Computation Algorithm for Loosely Coupled Systems of Desktop Computers

    Directory of Open Access Journals (Sweden)

    M. K. Sakharov

    2015-01-01

    Full Text Available This paper deals with the development and software implementation of a hybrid multi-memetic algorithm for distributed computing systems. The main algorithm is based on a modification of the MEC algorithm proposed by the authors. The multi-memetic algorithm utilizes three different local optimization methods. The software implementation was developed using MPI for Python and tested on a grid network made of twenty desktop computers. Performance of the proposed algorithm and its software implementation was investigated using multi-dimensional multi-modal benchmark functions from CEC’14.
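
    The multi-memetic idea - refining each candidate with several local optimization "memes" and keeping the best result - can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the three memes are simply three step sizes of a random hill climb, and the Rastrigin function stands in for the CEC'14 benchmarks.

```python
# Illustrative multi-memetic refinement step on a multimodal benchmark.
import math
import random

def rastrigin(x):
    """Classic multimodal benchmark: global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def local_search(x, f, step, iters=50, rng=random):
    """Random hill climb: accept a perturbed point only if it improves f."""
    best = list(x)
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in best]
        if f(cand) < f(best):
            best = cand
    return best

def memetic_refine(x, f, rng=random):
    # Three memes = three local searches with different step sizes;
    # keep the meme whose result has the lowest cost.
    trials = [local_search(x, f, step, rng=rng) for step in (1.0, 0.1, 0.01)]
    return min(trials, key=f)

rng = random.Random(0)           # seeded for reproducibility
start = [3.0, -2.0]
refined = memetic_refine(start, rastrigin, rng)
```

    In the distributed version described by the record, each node of the grid would apply such a refinement to its share of the population, with MPI for Python handling the exchange of candidates between nodes.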

  11. Development of an automated desktop procedure for defining macro-reaches for river longitudinal profiles

    CSIR Research Space (South Africa)

    Dollar, LH

    2006-07-01

    Full Text Available [Extraction residue: a flattened table of change points along the longitudinal profile, including those from the WLRT method, for the Crocodile and Olifants rivers, followed by reference-list fragments (Dollar, 2002; Dollar, 2003).]

  12. [Teaching Desktop] Video Conferencing in a Collaborative and Problem Based Setting

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Mouritzen, Per

    2013-01-01

    … teachers and assistant teachers wanted to find ways in the design for learning that enable the learners to acquire knowledge about the theories, models and concepts of the subject, as well as hands-on competencies in a learning-by-doing manner. In particular, we address the area of desktop video … their educational design and their role as teachers in this video conference setting. The students reflect on their experiences and designs in a blog, and the group collaboratively hands in a reflection paper online. Both blog posts and reflection papers need to relate to the literature of the module. Our analysis …

  13. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: service-level requirements and risk management; building and evolving an architecture for operational visibility ...

  14. Performance and function of a desktop viewer at Mayo Clinic Scottsdale.

    Science.gov (United States)

    Eversman, W G; Pavlicek, W; Zavalkovskiy, B; Erickson, B J

    2000-05-01

    A clinical viewing system was integrated with the Mayo Clinic Scottsdale picture archiving and communication system (PACS) for providing images and the report as part of the electronic medical record (EMR). Key attributes of the viewer include a single user log-on, an integrated patient centric EMR image access for all ordered examinations, prefetching of the most recent prior examination of the same modality, and the ability to provide comparison of current and past exams at the same time on the display. Other functions included preset windows, measurement tools, and multiformat display. Images for the prior 12 months are stored on the clinical server and are viewable in less than a second. Images available on the desktop include all computed radiography (CR), chest, magnetic resonance images (MRI), computed tomography (CT), ultrasound (U/S), nuclear, angiographic, gastrointestinal (GI) digital spots, and portable C-arm digital spots. Ad hoc queries of examinations from PACS are possible for those patients whose image may not be on the clinical server, but whose images reside on the PACS archive (10TB). Clinician satisfaction was reported to be high, especially for those staff heavily dependent on timely access to images, as well as those having heavy film usage. The desktop viewer is used for resident access to images. It is also useful for teaching conferences with large-screen projection without film. We report on the measurements of functionality, reliability, and speed of image display with this application.
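
    The prefetch rule the record describes - for a newly ordered examination, fetch the patient's most recent prior exam of the same modality so current and prior can be displayed together - is simple to state precisely. The data model below (a list of dicts with `modality` and `date` keys) is invented for the illustration.

```python
# Sketch of the viewer's prior-exam prefetch rule on a hypothetical exam list.
from datetime import date

def most_recent_prior(exams, current):
    """Return the latest exam of the same modality strictly before the
    current exam's date, or None if the patient has no such prior."""
    priors = [e for e in exams
              if e["modality"] == current["modality"] and e["date"] < current["date"]]
    return max(priors, key=lambda e: e["date"], default=None)

history = [
    {"modality": "CR", "date": date(1999, 11, 2)},
    {"modality": "CT", "date": date(2000, 1, 15)},
    {"modality": "CR", "date": date(2000, 3, 8)},
]
current = {"modality": "CR", "date": date(2000, 5, 1)}
prior = most_recent_prior(history, current)
```

    Exams outside the 12-month clinical-server window would instead be fetched by an ad hoc query of the PACS archive, as the record notes.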

  15. Fabrication of low cost soft tissue prostheses with the desktop 3D printer

    Science.gov (United States)

    He, Yong; Xue, Guang-Huai; Fu, Jian-Zhong

    2014-11-01

    Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods.

  16. Fabrication of low cost soft tissue prostheses with the desktop 3D printer

    Science.gov (United States)

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-01-01

    Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods. PMID:25427880

  18. Stackfile Database

    Science.gov (United States)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval across either the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  19. Tight-coupling of groundwater flow and transport modelling engines with spatial databases and GIS technology: a new approach integrating Feflow and ArcGIS

    Directory of Open Access Journals (Sweden)

    Ezio Crestaz

    2012-09-01

    Full Text Available. Implementation of groundwater flow and transport numerical models is generally a challenging, time-consuming and financially demanding task, in the charge of specialized modelers and consulting firms. At a later stage, within clearly stated limits of applicability, these models are often expected to be made available to less knowledgeable personnel to support the design and running of predictive simulations within environments more familiar than specialized simulation systems. GIS systems coupled with spatial databases appear to be ideal candidates to address the problem above, due to their much wider diffusion and the availability of expertise. The current paper discusses the issue from a tight-coupling architecture perspective, aimed at the integration of spatial databases, GIS and numerical simulation engines, addressing both observed and computed data management, retrieval and spatio-temporal analysis issues. Observed data can be migrated to the central database repository and then used to set up transient simulation conditions in the background, at run time, while limiting additional complexity and integrity-failure risks such as data duplication during data transfer through proprietary file formats. Similarly, simulation scenarios can be set up in a familiar GIS system and stored to the spatial database for later reference. As the numerical engine is tightly coupled with the GIS, simulations can be run within the environment and the results themselves saved to the database. Further tasks, such as spatio-temporal analysis (i.e. for post-calibration auditing scopes), cartography production and geovisualization, can then be addressed using traditional GIS tools. Benefits of such an approach include more effective data management practices, integration and availability of modeling facilities in a familiar environment, and streamlining of spatial analysis processes and geovisualization requirements for the non-modelers community. Major drawbacks include limited 3D and time-dependent support in
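    As a miniature illustration of the tight-coupling idea above (computed results stored in a central database for later GIS retrieval and spatio-temporal analysis), the sketch below uses SQLite as a stand-in for the spatial database; the table and column names are hypothetical and do not reflect Feflow's or ArcGIS's actual schemas:

```python
# Sketch: a simulation engine writes computed heads, keyed by cell and
# time step, into a SQL table that GIS tools could later join against
# cell geometry. Schema and values are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE computed_head (
    cell_id  INTEGER,
    sim_time REAL,
    head_m   REAL,
    PRIMARY KEY (cell_id, sim_time))""")

# The engine would append rows as the run progresses.
rows = [(1, 0.0, 12.4), (1, 1.0, 12.1), (2, 0.0, 9.8)]
conn.executemany("INSERT INTO computed_head VALUES (?, ?, ?)", rows)

# Later, spatio-temporal analysis retrieves a time series per cell.
series = conn.execute(
    "SELECT sim_time, head_m FROM computed_head "
    "WHERE cell_id = 1 ORDER BY sim_time").fetchall()
```

    Because results live in one queryable store rather than proprietary files, post-calibration auditing and cartography can reuse the same data without duplication.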

  20. Virtualisation Devices for Student Learning: Comparison between Desktop-Based (Oculus Rift) and Mobile-Based (Gear VR) Virtual Reality in Medical and Health Science Education

    Science.gov (United States)

    Moro, Christian; Stromberga, Zane; Stirling, Allan

    2017-01-01

    Consumer-grade virtual reality has recently become available for both desktop and mobile platforms and may redefine the way that students learn. However, the decision regarding which device to utilise within a curriculum is unclear. Desktop-based VR has considerably higher setup costs involved, whereas mobile-based VR cannot produce the quality of…

  1. Innovative GIS and Information Technologies Supporting Wide Area Assessment of UXO Sites

    Science.gov (United States)

    2008-10-01

    documentation, project status information and contacts. Access to the ESRI ArcGIS® client software and licenses through the Citrix MetaFrame... ArcGIS environment. They were available using the ArcMap desktop client, either locally or remotely, by using the Citrix ArcGIS Desktop Server. Saved... available via Citrix Presentation Server, which provided users

  2. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2011-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fifth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  3. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2005-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fourth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  4. Detection of analyte binding to microarrays using gold nanoparticle labels and a desktop scanner

    DEFF Research Database (Denmark)

    Han, Anpan; Dufva, Martin; Belleville, Erik

    2003-01-01

    Microarray hybridization or antibody binding can be detected by many techniques; however, only a few are suitable for widespread use, since many of these detection techniques rely on bulky and expensive instruments. Here, we describe the usefulness of a simple and inexpensive detection method based on gold nanoparticle labeled antibodies visualized by a commercial, office desktop flatbed scanner. Scanning electron microscopy studies showed that the signal from the flatbed scanner was proportional to the surface density of the bound antibody-gold conjugates, and that the flatbed scanner could detect six attomoles of antibody-gold conjugates. This detection system was used in a competitive immunoassay to measure the concentration of the pesticide metabolite 2,6-dichlorobenzamide (BAM) in water samples. The results showed that the gold labeled antibodies functioned comparably with a fluorescent...

  5. FRAMEWORK PARA CONVERSÃO DE APLICATIVOS DELPHI DESKTOP EM APLICATIVOS ANDROID NATIVO

    Directory of Open Access Journals (Sweden)

    Rodrigo da Silva Riquena

    2014-08-01

    Full Text Available. With the growing use of mobile devices by companies and organizations, there is increasing demand for applications on the mobile platform. For certain companies, business success may depend on a mobile application which brings them closer to customers or improves the performance of internal processes. However, developing software for the mobile platform is an expensive process which takes time and resources. A framework to convert Delphi desktop applications into native Android applications automatically constitutes a useful tool with which architects and software developers can contribute to the implementation phase of the application. Therefore, this work is based on methods and processes for software reengineering, such as PRE/OO (Process of Reengineering Object Oriented), for the automatic conversion of an application developed in the Delphi environment into an application for the Android mobile platform. Finally, an experiment was performed with a real case to corroborate the goals.

  6. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
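    The fundamental discrete-event machinery the abstract refers to (a future-event list, a simulation clock, and state updates) is compact; the single-server queue below is an illustrative stand-in written in Python, not the paper's Excel supply-chain model:

```python
# Minimal discrete-event simulation: a single-server queue with
# exponential interarrival and service times. Parameters are illustrative.
import heapq
import random

def simulate_queue(n_customers, arrival_rate, service_rate, seed=1):
    """Return the average waiting time over n_customers served."""
    random.seed(seed)
    events = []                      # future-event list: (time, kind)
    waiting = []                     # arrival times of queued customers
    busy, served, total_wait = False, 0, 0.0
    heapq.heappush(events, (random.expovariate(arrival_rate), "arrival"))
    while served < n_customers:
        clock, kind = heapq.heappop(events)      # advance the clock
        if kind == "arrival":
            # schedule the next arrival, then seize or join the queue
            heapq.heappush(events,
                (clock + random.expovariate(arrival_rate), "arrival"))
            if busy:
                waiting.append(clock)
            else:
                busy = True
                heapq.heappush(events,
                    (clock + random.expovariate(service_rate), "departure"))
        else:                                     # departure
            served += 1
            if waiting:
                total_wait += clock - waiting.pop(0)
                heapq.heappush(events,
                    (clock + random.expovariate(service_rate), "departure"))
            else:
                busy = False
    return total_wait / served

avg_wait = simulate_queue(1000, arrival_rate=0.8, service_rate=1.0)
```

    The same three ingredients (event list, clock, state) underlie any discrete-event model, which is the basis of the paper's claim of generality.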

  7. The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector

    Science.gov (United States)

    Axani, S. N.; Frankiewicz, K.; Conrad, J. M.

    2018-03-01

    The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection, and the data can be recorded either directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.
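    As an illustration of how the recorded data might be post-processed, the sketch below computes a count rate from a logged run; the two-column "event_number time_ms" format is an assumption made for the example, not the actual CosmicWatch file format:

```python
# Sketch: derive an event rate from a detector log, assuming a simplified
# format with one "event_number time_ms" line per muon event (the real
# firmware writes more columns). Comment lines start with '#'.
def count_rate(lines):
    events = [ln.split() for ln in lines
              if ln.strip() and not ln.startswith("#")]
    n = len(events)
    t0, t1 = float(events[0][1]), float(events[-1][1])   # milliseconds
    return n / ((t1 - t0) / 1000.0)                      # events per second

sample = ["# CosmicWatch log", "1 0", "2 1900", "3 4100", "4 6000"]
rate = count_rate(sample)   # 4 events over 6 s
```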

  8. Using the rear projection of the Socibot Desktop robot for creation of applications with facial expressions

    Science.gov (United States)

    Gîlcă, G.; Bîzdoacă, N. G.; Diaconu, I.

    2016-08-01

    This article aims to implement some practical applications using the Socibot Desktop social robot. We realize three applications: creating a speech sequence using the Kiosk menu of the browser interface, creating a program in the Virtual Robot browser interface, and making a new guise to be loaded into the robot's memory in order to be projected onto its face. The first application is created in the Compose submenu, which contains 5 file categories: audio, eyes, face, head, and mood, these being helpful in the creation of the projected sequence. The second application is more complex, the completed program containing audio files, speeches (which can be created in over 20 languages), head movements, the robot's facial parameters as a function of the action units (AUs) of the facial muscles, its expressions, and its line of sight. The last application aims to change the robot's appearance with a guise created by us. The guise was created in Adobe Photoshop and then loaded into the robot's memory.

  9. Advanced Stirling Radioisotope Generator Thermal Power Model in Thermal Desktop SINDA/FLUINT Analyzer

    Science.gov (United States)

    Wang, Xiao-Yen; Fabanich, William A.; Schmitz, Paul C.

    2012-01-01

    This paper presents a three-dimensional Advanced Stirling Radioisotope Generator (ASRG) thermal power model that was built using the Thermal Desktop SINDA/FLUINT thermal analyzer. The model was correlated with ASRG engineering unit (EU) test data and ASRG flight unit predictions from Lockheed Martin's Ideas TMG thermal model. ASRG performance under (1) ASC hot-end temperatures, (2) ambient temperatures, and (3) years of mission for the general purpose heat source fuel decay was predicted using this model for the flight unit. The results were compared with those reported by Lockheed Martin and showed good agreement. In addition, the model was used to study the performance of the ASRG flight unit for operations on the ground and on the surface of Titan, and the concept of using gold film to reduce thermal loss through insulation was investigated.

  10. Thermoelectric cooling of microelectronic circuits and waste heat electrical power generation in a desktop personal computer

    International Nuclear Information System (INIS)

    Gould, C.A.; Shammas, N.Y.A.; Grainger, S.; Taylor, I.

    2011-01-01

    Thermoelectric cooling and micro-power generation from waste heat within a standard desktop computer has been demonstrated. A thermoelectric test system has been designed and constructed, with typical test results presented for thermoelectric cooling and micro-power generation when the computer is executing a number of different applications. A thermoelectric module, operating as a heat pump, can lower the operating temperature of the computer's microprocessor and graphics processor to temperatures below ambient conditions. A small amount of electrical power, typically in the micro-watt or milli-watt range, can be generated by a thermoelectric module attached to the outside of the computer's standard heat sink assembly, when a secondary heat sink is attached to the other side of the thermoelectric module. Maximum electrical power can be generated by the thermoelectric module when a water cooled heat sink is used as the secondary heat sink, as this produces the greatest temperature difference between both sides of the module.
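    The milliwatt-scale figures quoted above follow from the standard matched-load estimate for a thermoelectric generator; the sketch below uses illustrative module parameters, not values measured in the study:

```python
# Matched-load electrical power from a thermoelectric module:
#   P_max = (S * dT)^2 / (4 * R)
# delivered when the load resistance equals the module's internal
# resistance R. The parameter values below are illustrative only.
def teg_max_power(seebeck_v_per_k, delta_t_k, internal_resistance_ohm):
    open_circuit_v = seebeck_v_per_k * delta_t_k      # V_oc = S * dT
    return open_circuit_v ** 2 / (4 * internal_resistance_ohm)

# e.g. a hypothetical module with S = 0.05 V/K and R = 3 ohm across a
# 5 K difference between heat sink and ambient:
p = teg_max_power(0.05, 5.0, 3.0)   # about 5 mW, i.e. the milliwatt range
```

    The quadratic dependence on the temperature difference is why a water-cooled secondary heat sink, which maximizes that difference, yields the most power.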

  11. Fabrication of cerebral aneurysm simulator with a desktop 3D printer.

    Science.gov (United States)

    Liu, Yu; Gao, Qing; Du, Song; Chen, ZiChen; Fu, JianZhong; Chen, Bing; Liu, ZhenJie; He, Yong

    2017-05-17

    Now, more and more patients are suffering from cerebral aneurysms. However, long training times limit the rapid growth of cerebrovascular neurosurgeons. Here we developed a novel cerebral aneurysm simulator which better represents the dynamic bulging process of a cerebral aneurysm. The proposed simulator features the integration of a hollow elastic vascular model, a skull model and a brain model, which can be affordably fabricated at the clinic (Fab@Clinic), under $25.00 each, with the help of a low-cost desktop 3D printer. Moreover, blood flow and pulsation pressure similar to the human case can be well simulated, which can be used to train neurosurgical residents to clip aneurysms more effectively.

  12. Qualitative research ethics on the spot: Not only on the desktop.

    Science.gov (United States)

    Øye, Christine; Sørensen, Nelli Øvre; Glasdam, Stinne

    2016-06-01

    The increase in medical ethical regulations and bureaucracy handled by institutional review boards and healthcare institutions puts researchers using qualitative methods in a challenging position. Based on three cases from three different research studies, the article explores and discusses research ethical dilemmas. First, and especially, the article addresses the challenges for gatekeepers who influence informants' decisions to participate in research. Second, the article addresses the challenges of following research ethical guidelines related to informed consent and doing no harm. Third, the article argues for the importance of having research ethical guidelines and review boards to question and discuss the possible ethical dilemmas that occur in qualitative research. Research ethics in qualitative research must be understood as relational, situational, and emerging. That is, attention has to be paid to ethical issues and dilemmas on the spot, and not only at the desktop. © The Author(s) 2015.

  13. Economic analysis of cloud-based desktop virtualization implementation at a hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-10-30

    Cloud-based desktop virtualization infrastructure (VDI) is known as providing simplified management of application and desktop, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at anytime, anywhere with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis to changing the number of VMs (in terms of number of users), the greater the number of adopted VMs was the more investable the system was. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
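    The NPV and IRR indexes used in the analysis can be computed in a few lines; the cash flows below are illustrative placeholders for an up-front outlay followed by five years of operating savings, not the hospital's actual figures:

```python
# NPV and IRR as used for corporate investment decisions.
def npv(rate, cashflows):
    """Net present value of cashflows[t] occurring at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return by bisection.

    Assumes the NPV is decreasing in the rate, which holds for an
    investment-type profile (one outflow followed by inflows).
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical five-year profile (year 0 outlay, then annual savings).
flows = [-500_000, 120_000, 130_000, 140_000, 150_000, 160_000]
npv_5pct = npv(0.05, flows)   # positive: the project beats a 5% hurdle rate
irr_value = irr(flows)        # break-even discount rate
```

    The break-even point reported in the study corresponds to the year in which the cumulative discounted benefits first exceed the cumulative discounted costs.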

  14. Economic analysis of cloud-based desktop virtualization implementation at a hospital

    Directory of Open Access Journals (Sweden)

    Yoo Sooyoung

    2012-10-01

    Full Text Available. Abstract. Background: Cloud-based desktop virtualization infrastructure (VDI) is known as providing simplified management of application and desktop, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at anytime, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods: This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results: The results of five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis of changing the number of VMs (in terms of number of users), the greater the number of adopted VMs was, the more investable the system was. Conclusions: This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.

  15. Assessing soil erosion risk using RUSLE through a GIS open source desktop and web application.

    Science.gov (United States)

    Duarte, L; Teodoro, A C; Gonçalves, J A; Soares, D; Cunha, M

    2016-06-01

    Soil erosion is a serious environmental problem. An estimate of the expected soil loss from water-caused erosion can be calculated using the Revised Universal Soil Loss Equation (RUSLE). Geographical Information Systems (GIS) provide different tools to create categorical maps of soil erosion risk, which help in assessing the risk of soil loss. The objective of this study was to develop a GIS open source application (in QGIS) using the RUSLE methodology for estimating erosion rate at the watershed scale (desktop application) and to provide the same application via web access (web application). The applications developed allow one to generate all the maps necessary to evaluate soil erosion risk. Several libraries and algorithms from SEXTANTE were used to develop these applications, which were tested in the Montalegre municipality (Portugal). The maps involved in the RUSLE method (soil erosivity factor, soil erodibility factor, topographic factor, cover management factor, and support practices) were created. The estimated mean value of the soil loss obtained was 220 ton km(-2) year(-1), ranging from 0.27 to 1283 ton km(-2) year(-1). The results indicated that most of the study area (80%) is characterized by a very low soil erosion level; only in a small portion of the area was soil erosion higher than 962 ton km(-2) year(-1). It was also concluded that areas with high slope values and bare soil are associated with high levels of erosion, and that the higher the P and C values, the higher the soil erosion percentage. The RUSLE web and desktop applications are freely available.
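    The RUSLE estimate itself is a per-cell product of the five factor maps listed above; a minimal sketch with illustrative factor values, not data from the Montalegre study:

```python
# RUSLE: mean annual soil loss A is the product of five factors computed
# per grid cell from the raster maps named in the abstract. Units are
# folded into the R and K factors; values below are illustrative only.
def rusle(r, k, ls, c, p):
    """A = R * K * LS * C * P."""
    return r * k * ls * c * p

# One hypothetical grid cell: erosivity R, erodibility K, topographic LS,
# cover management C, support practice P.
a = rusle(r=700.0, k=0.30, ls=1.2, c=0.15, p=1.0)
```

    A GIS implementation evaluates the same product cell by cell over the factor rasters and then classifies the result into the categorical risk map.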

  16. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  17. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder

    NARCIS (Netherlands)

    Toet, A.; Schaik, M.G. van; Theunissen, N.C.M.

    2013-01-01

    Background : Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the

  18. Influences of Gender and Computer Gaming Experience in Occupational Desktop Virtual Environments: A Cross-Case Analysis Study

    Science.gov (United States)

    Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.

    2013-01-01

    This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…

  19. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    Science.gov (United States)

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  20. Reach a New Threshold of Freedom and Control with Dell's Flexible Computing Solution: On-Demand Desktop Streaming

    Science.gov (United States)

    Technology & Learning, 2008

    2008-01-01

    When it comes to IT, there has always been an important link between data center control and client flexibility. As computing power increases, so do the potentially crippling threats to security, productivity and financial stability. This article talks about Dell's On-Demand Desktop Streaming solution which is designed to centralize complete…

  1. Environmental Effects of Hydrokinetic Turbines on Fish: Desktop and Laboratory Flume Studies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Amaral, Stephen V. [Alden Research Laboratory; Castro-Santos, Theodore [U.S. Geological Survey; Giza, Dan [Alden Research Laboratory; Haro, Alexander J. [U.S. Geological Survey; Hecker, George [Alden Research Laboratory; McMahon, Brian [Alden Research Laboratory; Perkins, Norman [Alden Research Laboratory; Pioppi, Nick [Alden Research Laboratory

    2012-12-31

    This collection of three reports describes desktop and laboratory flume studies that provide information to support assessment of the potential for injury and mortality of fish that encounter hydrokinetic turbines of various designs installed in tidal and river environments. Behavioral responses to turbine exposure also are investigated to support assessment of the potential for disruptions to upstream and downstream movements of fish. The studies: (1) conducted an assessment of potential injury mechanisms using available data from studies with conventional hydro turbines; (2) developed theoretical models for predicting blade strike probabilities and mortality rates; and (3) performed flume testing with three turbine designs and several fish species and size groups in two laboratory flumes to estimate survival rates and document fish behavior. The project yielded three reports, which this document comprises. The three constituent documents are addressed individually below.

    Fish Passage Through Turbines: Application of Conventional Hydropower Data to Hydrokinetic Technologies

    Fish passing through the blade sweep of a hydrokinetic turbine experience a much less harsh physical environment than do fish entrained through conventional hydro turbines. The design and operation of conventional turbines result in high flow velocities, abrupt changes in flow direction, relatively high runner rotational and blade speeds, rapid and significant changes in pressure, and the need for various structures throughout the turbine passageway that can be impacted by fish. These conditions generally do not occur or are not significant factors for hydrokinetic turbines. Furthermore, compared to conventional hydro turbines, hydrokinetic turbines typically produce relatively minor changes in shear, turbulence, and pressure levels from ambient conditions in the surrounding environment. Injuries and mortality from mechanical injuries will be less as well, mainly due to low rotational speeds and
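    One common deterministic starting point for the blade-strike models mentioned in item (2) above is a von Raben-style exposure estimate; the sketch below uses illustrative inputs and is not the report's actual model:

```python
# Deterministic blade-strike sketch (after von Raben): a fish of body
# length l passing through the blade sweep at water velocity v is exposed
# for l / v seconds, during which N * n / 60 blades pass per second, so
#   P_strike ~= (l / v) * (N * n / 60), capped at 1.
# All numeric inputs below are illustrative assumptions.
def strike_probability(fish_length_m, water_velocity_ms, n_blades, rpm):
    passages_per_s = n_blades * rpm / 60.0
    exposure_s = fish_length_m / water_velocity_ms
    return min(1.0, passages_per_s * exposure_s)

p = strike_probability(fish_length_m=0.2, water_velocity_ms=2.0,
                       n_blades=3, rpm=40)
```

    The low rotational speeds typical of hydrokinetic turbines keep the blade-passage term, and hence the strike probability, small, which is consistent with the abstract's argument.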

  2. Analysis of low and medium energy physics records in databases. Science and technology indicators in low and medium energy physics. With particular emphasis on nuclear data

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1998-12-01

    An analysis of the literature on low and medium energy physics, with particular emphasis on nuclear data, was performed on the basis of the contents of the bibliographic database INIS (International Nuclear Information System). Quantitative data were obtained on various characteristics of relevant INIS records such as subject categories, language and country of publication, publication types, etc. Rather surprisingly, it was found that the number of records in nuclear physics has remained nearly constant over the last decade. The analysis opens up the possibility of further studies, e.g. on international research co-operation and on publication patterns. (author)

  3. INIS: International Nuclear Information System. The world's first international database on peaceful uses of nuclear sciences and technologies

    Energy Technology Data Exchange (ETDEWEB)

    Surmont, J.; Constant, A.; Guille, N.; Le Blanc, A.; Mouffron, O.; Anguise, P.; Jouve, J.J

    2007-07-01

    This poster, prepared for the 2007 CEA meetings on scientific and technical information, presents the INIS information system; the document types, content and subject coverage of the database; the French contribution to this system through the INIS team of CEA-Saclay; the input preparation process; and an example of the valorization of a scientific and historical patrimony: the joint CEA/IAEA project to digitize about 2760 CEA reports published between 1948 and 1969. All these reports have been digitized by the IAEA, analyzed by the CEA, and entered into the INIS database with a link to the full text. (J.S.)

  4. Curcumin Resource Database

    Science.gov (United States)

    Kumar, Anil; Chetia, Hasnahana; Sharma, Swagata; Kabiraj, Debajyoti; Talukdar, Narayan Chandra; Bora, Utpal

    2015-01-01

    Curcumin is one of the most intensively studied diarylheptanoids, Curcuma longa being its principal producer. Apart from this, a class of promising curcumin analogs has been generated in laboratories, aptly named curcuminoids, which show huge potential in the fields of medicine, food technology, etc. The lack of a universal source of data on curcumin as well as curcuminoids has long been felt by the curcumin research community. Hence, in an attempt to address this stumbling block, we have developed the Curcumin Resource Database (CRDB), which aims to serve as a gateway-cum-repository for accessing all relevant data and related information on curcumin and its analogs. Currently, this database encompasses 1186 curcumin analogs, 195 molecular targets, 9075 peer-reviewed publications, 489 patents, and 176 varieties of C. longa, obtained by extensive data mining and careful curation from numerous sources. Each data entry is identified by a unique CRDB ID (identifier). Furnished with a user-friendly web interface and in-built search engine, CRDB provides well-curated and cross-referenced information that is hyperlinked with external sources. CRDB is expected to be highly useful to researchers working on structure- as well as ligand-based molecular design of curcumin analogs. Database URL: http://www.crdb.in PMID:26220923

  5. Adoption of new technologies in a highly uncertain environment : the case of knowledge discovery in databases for customer relationship management in Egyptian public banks

    NARCIS (Netherlands)

    Khedr, Ayman El_Sayed

    2008-01-01

    “How can we better understand the process of adopting a new technology and its impact on business value in situations of high uncertainty?” In short, this is the central research question addressed in this thesis. The dissertation explores how uncertainty factors affect the adoption process of a new

  6. Technology.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  7. FY 1997 report on the outline and summary of research for a database of unutilized energy technologies; 1997 nendo chosa hokokusho (miriyo energy ni kansuru data shu sakusei chosa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Data on new energy have been systematically investigated and arranged for use in advisory projects for the introduction of new energy projects and the formation of a new energy vision for the future. Heat supply systems which utilize unutilized energy (UE) technologies were defined, UE sources were categorized by type and characteristics, and a number of typical heat supply systems were cited. Research was done on actual heat supply facilities in Japan which utilize UE technology, data were classified by energy source, temperature level and region, and a detailed database on actual models was prepared. Data were also organized and classified on overseas models which have been introduced, especially in Europe and the USA. Japanese, European, and the US policies, laws and regulations, and subsidies for fostering use were researched, and the data obtained were organized and classified. Representative models of heat supply systems utilizing UE technology were researched to determine the effects produced. The future amount of UE was estimated on the basis of the basic guideline for new energy introduction. Two representative models of facilities which utilize UE technology for heat supply were researched to analyze the costs for these facilities. 17 figs.

  8. A concept for the modernization of underground mining master maps based on the enrichment of data definitions and spatial database technology

    Science.gov (United States)

    Krawczyk, Artur

    2018-01-01

    In this article, topics regarding the technical and legal aspects of creating digital underground mining maps are described. Currently used technologies and solutions for creating, storing and making digital maps accessible are described in the context of the Polish mining industry. Also, some problems with the use of these technologies are identified and described. One of the identified problems is the need to expand the range of mining map data provided by survey departments to other mining departments, such as ventilation maintenance or geological maintenance. Three solutions are proposed and analyzed, and one is chosen for further analysis. The analysis concerns data storage and making survey data accessible not only from paper documentation, but also directly from computer systems. Based on enrichment data, new processing procedures are proposed for a new way of presenting information that allows the preparation of new cartographic representations (symbols) of data with regard to users' needs.

  9. A concept for the modernization of underground mining master maps based on the enrichment of data definitions and spatial database technology

    Directory of Open Access Journals (Sweden)

    Krawczyk Artur

    2018-01-01

    Full Text Available In this article, topics regarding the technical and legal aspects of creating digital underground mining maps are described. Currently used technologies and solutions for creating, storing and making digital maps accessible are described in the context of the Polish mining industry. Also, some problems with the use of these technologies are identified and described. One of the identified problems is the need to expand the range of mining map data provided by survey departments to other mining departments, such as ventilation maintenance or geological maintenance. Three solutions are proposed and analyzed, and one is chosen for further analysis. The analysis concerns data storage and making survey data accessible not only from paper documentation, but also directly from computer systems. Based on enrichment data, new processing procedures are proposed for a new way of presenting information that allows the preparation of new cartographic representations (symbols) of data with regard to users’ needs.

  10. Databases for Assessment of Military Speech Technology Equipment. (les Bases de donnees pour l’evatuation des equipements de technologie vocale militaire)

    Science.gov (United States)

    2000-03-01


  11. Lung segmentation refinement based on optimal surface finding utilizing a hybrid desktop/virtual reality user interface.

    Science.gov (United States)

    Sun, Shanhui; Sonka, Milan; Beichel, Reinhard R

    2013-01-01

    Recently, the optimal surface finding (OSF) and layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) approaches have been reported with applications to medical image segmentation tasks. While providing high levels of performance, these approaches may locally fail in the presence of pathology or other local challenges. Due to the image data variability, finding a suitable cost function that would be applicable to all image locations may not be feasible. This paper presents a new interactive refinement approach for correcting local segmentation errors in the automated OSF-based segmentation. A hybrid desktop/virtual reality user interface was developed for efficient interaction with the segmentations utilizing state-of-the-art stereoscopic visualization technology and advanced interaction techniques. The user interface allows a natural and interactive manipulation of 3-D surfaces. The approach was evaluated on 30 test cases from 18 CT lung datasets, which showed local segmentation errors after employing an automated OSF-based lung segmentation. The experiments showed a significant improvement in mean absolute surface distance error (2.54±0.75 mm before refinement vs. 1.11±0.43 mm after refinement, p≪0.001). Speed of the interactions is one of the most important aspects leading to the acceptance or rejection of the approach by users expecting a real-time interaction experience. The average algorithm computing time per refinement iteration was 150 ms, and the average total user interaction time required for reaching complete operator satisfaction was about 2 min per case. This time was mostly spent on human-controlled manipulation of the object to identify whether additional refinement was necessary and to approve the final segmentation result. The reported principle is generally applicable to segmentation problems beyond lung segmentation in CT scans as long as the underlying segmentation utilizes the OSF framework.
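    The optimal-surface idea underlying OSF can be illustrated with a toy dynamic-programming version: choose one boundary row per image column so that the summed cost is minimal and adjacent choices satisfy a smoothness constraint. This is a hedged sketch, not the paper's graph-based LOGISMOS implementation; `find_surface`, `smooth`, and the cost matrix are invented for illustration.

```python
# Toy "optimal surface finding": pick one row per column of a 2-D cost
# image so the summed cost is minimal and adjacent picks differ by at
# most `smooth` rows. Solved by dynamic programming as a minimal,
# illustrative stand-in for the graph-based OSF machinery.

def find_surface(cost, smooth=1):
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    dp = [cost[r][0] for r in range(rows)]   # best cost ending at row r
    back = []                                # per-column predecessor rows
    for c in range(1, cols):
        ndp, bk = [INF] * rows, [0] * rows
        for r in range(rows):
            for pr in range(max(0, r - smooth), min(rows, r + smooth + 1)):
                if dp[pr] + cost[r][c] < ndp[r]:
                    ndp[r], bk[r] = dp[pr] + cost[r][c], pr
        dp = ndp
        back.append(bk)
    r = min(range(rows), key=dp.__getitem__)  # cheapest final row
    surface = [r]
    for bk in reversed(back):                 # trace predecessors back
        r = bk[r]
        surface.append(r)
    surface.reverse()
    return surface
```

    On a cost image whose low values trace the desired boundary, `find_surface` returns one row index per column; an interactive refinement could simply raise the cost at user-rejected locations and re-run.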

  12. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  13. SmallSat Database

    Science.gov (United States)

    Petropulos, Dolores; Bittner, David; Murawski, Robert; Golden, Bert

    2015-01-01

    The SmallSat has unrealized potential in both private industry and the federal government. Currently over 70 companies, 50 universities and 17 governmental agencies are involved in SmallSat research and development. In 1994, the U.S. Army Missile and Defense mapped the moon using smallSat imagery. Since then, smart phones have introduced this imagery to the people of the world as diverse industries watched this trend. The deployment cost of smallSats is also greatly reduced compared to traditional satellites because multiple units can be deployed in a single mission. Imaging payloads have become more sophisticated, smaller and lighter. In addition, the growth of small technology obtained from private industries has led to more widespread use of smallSats. This includes greater revisit rates in imagery, significantly lower costs, the ability to update technology more frequently and the ability to decrease vulnerability to enemy attacks. The popularity of smallSats shows a changing mentality in this fast-paced world of tomorrow. What impact has this created on the NASA communication networks now and in future years? In this project, we are developing the SmallSat Relational Database, which can support a simulation of smallSats within the NASA SCaN Compatibility Environment for Networks and Integrated Communications (SCENIC) Modeling and Simulation Lab. The NASA Space Communications and Networks (SCaN) Program can use this modeling to project required network support needs in the next 10 to 15 years. The SmallSat Relational Database could model smallSats just as the other SCaN databases model the more traditional larger satellites, with a few exceptions, one being that the smallSat database is designed to be built-to-order. The SmallSat database holds various hardware configurations that can be used to model a smallSat. It will require significant effort to develop, as the research material can only be populated by hand to obtain the unique data.
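    A built-to-order store of hardware configurations of the kind described can be sketched as a small relational table. The schema, names, and values below are invented for illustration and are not the actual SCENIC/SCaN database design.

```python
import sqlite3

# Minimal sketch of a relational table of smallSat hardware
# configurations; everything here is hypothetical example data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE smallsat (
    name TEXT PRIMARY KEY,
    mass_kg REAL,
    radio_band TEXT,
    downlink_mbps REAL)""")
conn.executemany("INSERT INTO smallsat VALUES (?, ?, ?, ?)", [
    ("cubesat-3u", 4.0, "S", 2.0),
    ("microsat-50", 50.0, "X", 100.0),
])
# A simulation could then select only the configurations that meet a
# given link requirement.
rows = conn.execute(
    "SELECT name FROM smallsat WHERE downlink_mbps >= 10").fetchall()
```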

  14. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available The traditional answer-card reading method uses OMR (Optical Mark Reader), which typically requires special-purpose cards for special uses, is not very versatile, and has a high cost. To address these problems, an answer-card identification method based on pattern recognition is proposed. The method uses a Line Segment Detector to detect the tilt of the image, rotates tilted images to correct them, and finally achieves positioning and detection of the answers on the answer sheet. The pattern recognition technology enables automatic reading with high accuracy and faster detection.
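    The tilt-detection step can be sketched in miniature: fit a line to pixels that lie along a detected printed line and take its angle as the deskew correction. This is an illustrative stand-in for a Line Segment Detector, not the authors' implementation; `tilt_angle` and the sample points are invented.

```python
import math

# Toy tilt estimation: given foreground pixel coordinates that lie
# roughly along a printed line of the sheet, fit a line by least
# squares and report the tilt angle to correct by.

def tilt_angle(points):
    """Least-squares slope of (x, y) points -> tilt angle in degrees."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    return math.degrees(math.atan(sxy / sxx))

# Pixels sampled along a line tilted by exactly 5 degrees.
theta = math.radians(5.0)
pts = [(x, x * math.tan(theta)) for x in range(100)]
correction = -tilt_angle(pts)  # rotate by the negative angle to deskew
```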

  15. Dietary Supplement Ingredient Database

    Science.gov (United States)

    ... and US Department of Agriculture Dietary Supplement Ingredient Database ... values can be saved to build a small database or add to an existing database for national, ...

  16. AS COMPOSIÇÕES DA ESCUTA E A INTERNET: OS DJ'S DE DESKTOP

    Directory of Open Access Journals (Sweden)

    Rodrigo Fonseca e Rodrigues

    2006-12-01

    Full Text Available The text addresses questions emerging from occasions of experience that, activated by the idiosyncratic use of networked machines, come to permeate genuine circumstances and practices of musical composition, as well as other dispositions of listening. This happens when "desktop DJ producers" remodulate latent forces in the musical utterances of corporate culture in order to reveal, through the practice of the remix, other events, establishing singular regimes for sensation. Such poetics of networked listening should be thought of as joint implications of connections not only between musical materials but also between incorporeal universes and unsounded forces, drawing for this purpose on Deleuzian conceptual directions concerning the virtual and the event in musical sensation.

  17. A novel desktop device for lapping thin-walled micro groove

    Science.gov (United States)

    Wang, Shilei; Wang, Bo; Che, Lin; Ding, Fei; Li, Duo

    2014-08-01

    This paper presents a novel desktop device for lapping the thin-walled micro groove of a specimen used in optical equipment. The device aims to remove the metamorphic layer (about 1 μm thick) formed on the groove's upper surface while ensuring its thickness accuracy. It adopts a combined macro/micro motion approach: the macro-motion table uses a stepper motor and ball screws to realize large-stroke, high-speed motion and micron-level positioning, while the micro-motion table uses an electrostrictive actuator to drive a flexible four-bar mechanism, realizing small-stroke, low-speed motion and submicron-level positioning. The system uses two feedback paths: the macro/micro motion table uses a precise linear grating for closed-loop position feedback, and the sensing holder uses an eddy current transducer for force and deformation feedback of the elastic fixture. The most novel aspect is the first proposed idea of realizing automatic feeding by the elastic recovery of the fixture, whose structure has been delicately designed. In order to ensure a small lapping force and a relatively high natural frequency, both static and modal analyses of the fixture were done in ANSYS; the results were in good agreement with experiments. Lapping experiments showed that this device can remove the metamorphic layer efficiently while obtaining good surface quality.

  18. Prototype Implementation of Web and Desktop Applications for ALMA Science Verification Data and the Lessons Learned

    Science.gov (United States)

    Eguchi, S.; Kawasaki, W.; Shirasaki, Y.; Komiya, Y.; Kosugi, G.; Ohishi, M.; Mizumoto, Y.

    2013-10-01

    ALMA is estimated to generate TB-scale data in a single observation; astronomers need to identify which part of the data they are really interested in. We have been developing new GUI software for this purpose utilizing the VO interface: the ALMA Web Quick Look System (ALMAWebQL) and the ALMA Desktop Application (Vissage). The former is written in JavaScript and HTML5 generated from Java code by the Google Web Toolkit, and the latter is in pure Java. An essential point of our approach is how to reduce network traffic: we prepare, in advance, “compressed” FITS files of 2 x 2 x 1 binning (horizontal, vertical, and spectral directions, respectively), 2 x 2 x 2 binning, 4 x 4 x 2 binning, and so on. These files are hidden from users, and ALMAWebQL automatically chooses the proper one for each user operation. Through this work, we found that network traffic in our system is still a bottleneck towards TB-scale data distribution. Hence we have to develop alternative data containers for much faster data processing. In this paper, we introduce our data analysis systems and describe what we learned through the development.
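    The block binning that would produce such pre-computed "compressed" levels can be sketched for one spectral channel: average each non-overlapping 2 x 2 block of pixels, and bin again for the next coarser level. This is purely illustrative; the actual ALMAWebQL pipeline is not described in detail in the abstract.

```python
# Sketch of server-side 2 x 2 spatial binning of one image channel.
# Pure Python; `bin2x2` and the sample data are invented for
# illustration.

def bin2x2(img):
    """Average non-overlapping 2 x 2 blocks of a 2-D list of floats."""
    return [[(img[2 * i][2 * j] + img[2 * i][2 * j + 1] +
              img[2 * i + 1][2 * j] + img[2 * i + 1][2 * j + 1]) / 4.0
             for j in range(len(img[0]) // 2)]
            for i in range(len(img) // 2)]

channel = [[float(4 * r + c) for c in range(4)] for r in range(4)]
level1 = bin2x2(channel)   # the "2 x 2"-binned level
level2 = bin2x2(level1)    # bin again for the next coarser level
```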

  19. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  20. A Unified Algorithm for Virtual Desktops Placement in Distributed Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiangtao Zhang

    2016-01-01

    Full Text Available Distributed cloud has been widely adopted to support service requests from dispersed regions, especially for large enterprise which requests virtual desktops for multiple geodistributed branch companies. The cloud service provider (CSP aims to deliver satisfactory services at the least cost. CSP selects proper data centers (DCs closer to the branch companies so as to shorten the response time to user request. At the same time, it also strives to cut cost considering both DC level and server level. At DC level, the expensive long distance inter-DC bandwidth consumption should be reduced and lower electricity price is sought. Inside each tree-like DC, servers are trying to be used as little as possible so as to save equipment cost and power. In nature, there is a noncooperative relation between the DC level and server level in the selection. To attain these objectives and capture the noncooperative relation, multiobjective bilevel programming is used to formulate the problem. Then a unified genetic algorithm is proposed to solve the problem which realizes the selection of DC and server simultaneously. The extensive simulation shows that the proposed algorithm outperforms baseline algorithm in both quality of service guaranteeing and cost saving.
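    The two-level trade-off the paper formalizes can be illustrated with a toy objective: each virtual-desktop request is assigned to a (data center, server) pair, and the cost combines DC-level terms (latency to the branch, electricity price) with a server-level term (number of distinct servers powered on). All numbers and weights below are invented, and the tiny instance is solved exhaustively; the paper's unified genetic algorithm is what would scale this to realistic sizes.

```python
import itertools

# Hypothetical data centers: (latency to the branch, electricity price).
DCS = {
    "dc_east": (5.0, 1.2),
    "dc_west": (20.0, 0.8),
}
SERVERS_PER_DC = 2
REQUESTS = 3  # three desktop requests from one branch company

def cost(assignment):
    """DC-level cost per request plus a per-server power term."""
    dc_cost = sum(DCS[dc][0] + DCS[dc][1] for dc, _ in assignment)
    servers_on = len(set(assignment))     # distinct (dc, server) pairs
    return dc_cost + 3.0 * servers_on     # 3.0 = invented power weight

slots = [(dc, s) for dc in DCS for s in range(SERVERS_PER_DC)]
best = min(itertools.product(slots, repeat=REQUESTS), key=cost)
```

    On this instance the optimum packs all three desktops onto one server in the nearby DC, showing how the server-level term rewards consolidation while the DC-level term rewards proximity.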

  1. Clinical predictors of the optimal spectacle correction for comfort performing desktop tasks.

    Science.gov (United States)

    Leffler, Christopher T; Davenport, Byrd; Rentz, Jodi; Miller, Amy; Benson, William

    2008-11-01

    The best strategy for spectacle correction of presbyopia for near tasks has not been determined. Thirty volunteers over the age of 40 years were tested for subjective accommodative amplitude, pupillary size, fusional vergence, interpupillary distance, arm length, preferred working distance, near and far visual acuity, and preferred reading correction in the phoropter and trial frames. Subjects performed near tasks (reading, writing and counting change) using various spectacle correction strengths. Predictors of the correction maximising near task comfort were determined by multivariable linear regression. The mean age was 54.9 years (range 43 to 71) and 40 per cent had diabetes. Significant univariate predictors of the most comfortable addition included age and the preferred reading correction in the phoropter (p=0.002) or trial frames; the remaining candidates were not significant (p>0.15). The preferred addition wearing trial frames, holding a reading target at a distance selected by the patient, was the only independent predictor. Excluding this variable, distance visual acuity was predictive independent of age or near vision wearing distance correction. The distance selected for task performance was predicted by vision wearing distance correction at near and at distance. Multivariable linear regression can be used to generate tables, based on distance visual acuity and age or near vision wearing distance correction, to determine a tentative near spectacle addition. The final spectacle correction for desktop tasks can be estimated by subjective refraction with trial frames.
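    The kind of table-generating fit described can be sketched as ordinary least squares via the normal equations. The predictors, coefficients, and synthetic subjects below are invented for illustration; they are not the study's fitted values.

```python
# Toy multivariable linear regression (normal equations + Gaussian
# elimination) predicting a near addition from age and distance acuity.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares: add intercept, solve X'X beta = X'y."""
    X = [[1.0] + row for row in X]
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)]
           for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    return solve(XtX, Xty)

# Synthetic subjects: [age, distance acuity (logMAR)]; the addition is
# generated from an assumed rule 0.25 + 0.04*(age-40) + 0.5*acuity.
subjects = [[43, 0.0], [50, 0.1], [55, 0.0],
            [60, 0.2], [65, 0.1], [71, 0.3]]
addition = [0.25 + 0.04 * (a - 40) + 0.5 * v for a, v in subjects]
beta = ols(subjects, addition)  # [intercept, age slope, acuity slope]
```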

  2. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (graphic processor units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617
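    The building block of such deep belief networks, a restricted Boltzmann machine trained by contrastive divergence, can be shown in miniature. The sketch below is deterministic (it uses mean-field probabilities instead of stochastic samples) and purely illustrative of the CD-1 update; it is not the authors' GPU code, and all sizes and the learning rate are invented.

```python
import math

# Tiny restricted Boltzmann machine with a mean-field CD-1 update.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

N_V, N_H, LR = 6, 3, 0.5
W = [[0.0] * N_V for _ in range(N_H)]   # hidden x visible weights
b_v = [0.0] * N_V                       # visible biases
b_h = [0.0] * N_H                       # hidden biases
data = [1, 1, 0, 0, 1, 0]               # one binary training pattern

def up(v):    # visible -> hidden probabilities
    return [sigmoid(b_h[i] + sum(W[i][j] * v[j] for j in range(N_V)))
            for i in range(N_H)]

def down(h):  # hidden -> visible probabilities
    return [sigmoid(b_v[j] + sum(W[i][j] * h[i] for i in range(N_H)))
            for j in range(N_V)]

def recon_error(v):
    return sum((vj - rj) ** 2 for vj, rj in zip(v, down(up(v))))

err_start = recon_error(data)
for _ in range(100):                    # CD-1 training passes
    h0 = up(data)
    v1 = down(h0)                       # "reconstruction"
    h1 = up(v1)
    for i in range(N_H):
        for j in range(N_V):
            W[i][j] += LR * (h0[i] * data[j] - h1[i] * v1[j])
    for j in range(N_V):
        b_v[j] += LR * (data[j] - v1[j])
    for i in range(N_H):
        b_h[i] += LR * (h0[i] - h1[i])
err_end = recon_error(data)
```

    In practice the same update is expressed as a handful of matrix products, which is exactly what makes GPU execution through high-level MATLAB or Python routines so effective.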

  3. MD Simulations of Viruslike Particles with Supra CG Solvation Affordable to Desktop Computers.

    Science.gov (United States)

    Machado, Matías R; González, Humberto C; Pantano, Sergio

    2017-10-10

    Viruses are tremendously efficient molecular devices that optimize the packing of genetic material using a minimalistic number of proteins to form a capsid or envelope that protects them from external threats, being also part of cell recognition, fusion, and budding machineries. Progress in experimental techniques has provided a large number of high-resolution structures of viruses and viruslike particles (VLP), while molecular dynamics simulations may furnish lively and complementary insights on the fundamental forces ruling viral assembly, stability, and dynamics. However, the large size and complexity of these macromolecular assemblies pose significant computational challenges. Alternatively, Coarse-Grained (CG) methods, which resign atomistic resolution privileging computational efficiency, can be used to characterize the dynamics of VLPs. Still, the massive amount of solvent present in empty capsids or envelopes suggests that hybrid schemes keeping a higher resolution on regions of interest (i.e., the viral proteins and their surroundings) and a progressively coarser description on the bulk may further improve efficiency. Here we introduce a mesoscale explicit water model to be used in double- or triple-scale simulations in combination with popular atomistic parameters and the CG water used by the SIRAH force field. Simulations performed on VLPs of different sizes, along with a comprehensive analysis of the PDB, indicate that most of the VLPs so far reported are amenable to be handled on a GPU-accelerated desktop computer using this simulation scheme.

  4. Correlation of the SAGE III on ISS Thermal Models in Thermal Desktop

    Science.gov (United States)

    Amundsen, Ruth M.; Davis, Warren T.; Liles, Kaitlin, A. K.; McLeod, Shawn C.

    2017-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III was launched on February 19, 2017 and mounted to the International Space Station (ISS) to begin its three-year mission. A detailed thermal model of the SAGE III payload, which consists of multiple subsystems, has been developed in Thermal Desktop (TD). Correlation of the thermal model is important since the payload will be expected to survive a three-year mission on ISS under varying thermal environments. Three major thermal vacuum (TVAC) tests were completed during the development of the SAGE III Instrument Payload (IP); two subsystem-level tests and a payload-level test. Additionally, a characterization TVAC test was performed in order to verify performance of a system of heater plates that was designed to allow the IP to achieve the required temperatures during payload-level testing; model correlation was performed for this test configuration as well as those including the SAGE III flight hardware. This document presents the methods that were used to correlate the SAGE III models to TVAC at the subsystem and IP level, including the approach for modeling the parts of the payload in the thermal chamber, generating pre-test predictions, and making adjustments to the model to align predictions with temperatures observed during testing. Model correlation quality will be presented and discussed, and lessons learned during the correlation process will be shared.

  5. Generation of orientation tools for automated zebrafish screening assays using desktop 3D printing.

    Science.gov (United States)

    Wittbrodt, Jonas N; Liebel, Urban; Gehrig, Jochen

    2014-05-01

    The zebrafish has been established as the main vertebrate model system for whole organism screening applications. However, the lack of consistent positioning of zebrafish embryos within wells of microtiter plates remains an obstacle for the comparative analysis of images acquired in automated screening assays. While technical solutions to the orientation problem exist, dissemination is often hindered by the lack of simple and inexpensive ways of distributing and duplicating tools. Here, we provide a cost effective method for the production of 96-well plate compatible zebrafish orientation tools using a desktop 3D printer. The printed tools enable the positioning and orientation of zebrafish embryos within cavities formed in agarose. Their applicability is demonstrated by acquiring lateral and dorsal views of zebrafish embryos arrayed within microtiter plates using an automated screening microscope. This enables the consistent visualization of morphological phenotypes and reporter gene expression patterns. The designs are refined versions of previously demonstrated devices with added functionality and strongly reduced production costs. All corresponding 3D models are freely available and digital design can be easily shared electronically. In combination with the increasingly widespread usage of 3D printers, this provides access to the developed tools to a wide range of zebrafish users. Finally, the design files can serve as templates for other additive and subtractive fabrication methods.

  6. Mars Propellant Liquefaction and Storage Performance Modeling using Thermal Desktop with an Integrated Cryocooler Model

    Science.gov (United States)

    Desai, Pooja; Hauser, Dan; Sutherlin, Steven

    2017-01-01

    NASA's current Mars architectures assume the production and storage of 23 tons of liquid oxygen on the surface of Mars over a duration of 500+ days. In order to do this in a mass-efficient manner, an energy-efficient refrigeration system will be required. Based on previous analysis, NASA has decided to do all liquefaction in the propulsion vehicle storage tanks. In order to allow for transient Martian environmental effects, a propellant liquefaction and storage system for a Mars Ascent Vehicle (MAV) was modeled using Thermal Desktop. The model consisted of a propellant tank containing a broad area cooling loop heat exchanger integrated with a reverse turbo Brayton cryocooler. Cryocooler sizing and performance modeling was conducted using MAV diurnal heat loads and radiator rejection temperatures predicted from a previous thermal model of the MAV. A system was also sized and modeled using an alternative heat rejection system that relies on a forced convection heat exchanger. Cryocooler mass, input power, and heat rejection for both systems were estimated and compared against non-transient sizing estimates.

  7. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (graphic processor units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.

  8. Deep unsupervised learning on a desktop PC: A primer for cognitive scientists

    Directory of Open Access Journals (Sweden)

    Alberto eTestolin

    2013-05-01

    Full Text Available Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (GPUs) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.

  9. A desktop system of virtual morphometric globes for Mars and the Moon

    Science.gov (United States)

    Florinsky, I. V.; Filippov, S. V.

    2017-03-01

    Global morphometric models can be useful for earth and planetary studies. Virtual globes - programs implementing interactive three-dimensional (3D) models of planets - are increasingly used in geo- and planetary sciences. We describe the development of a desktop system of virtual morphometric globes for Mars and the Moon. As the initial data, we used 15'-gridded global digital elevation models (DEMs) extracted from the Mars Orbiter Laser Altimeter (MOLA) and the Lunar Orbiter Laser Altimeter (LOLA) gridded archives. For the two celestial bodies, we derived global digital models of several morphometric attributes, such as horizontal curvature, vertical curvature, minimal curvature, maximal curvature, and catchment area. To develop the system, we used Blender, the free open-source software for 3D modeling and visualization. First, a 3D sphere model was generated. Second, the global morphometric maps were mapped onto the sphere surface as textures. Finally, Blender's real-time 3D graphics engine was used to implement rotation and zooming of the globes. Testing of the developed system demonstrated its good performance. Morphometric globes clearly represent peculiarities of planetary topography, according to the physical and mathematical sense of a particular morphometric variable.
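
The morphometric attributes are computed from the gridded DEMs before being textured onto the globes. The paper's exact curvature formulas are not reproduced here; as a simplified sketch of the finite-difference approach, the following NumPy code derives slope and the Laplacian (a crude stand-in for a curvature attribute) from a toy DEM:

```python
import numpy as np

def slope_and_laplacian(dem, spacing=1.0):
    """Slope (degrees) and Laplacian of a gridded DEM via finite differences."""
    # First-order partial derivatives of elevation (rows = y, cols = x).
    dz_dy, dz_dx = np.gradient(dem, spacing)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Second-order derivatives; their sum is the Laplacian.
    d2z_dx2 = np.gradient(dz_dx, spacing, axis=1)
    d2z_dy2 = np.gradient(dz_dy, spacing, axis=0)
    return slope, d2z_dx2 + d2z_dy2

# Toy DEM: a smooth 50 x 50 bowl (a crude "crater" interior).
y, x = np.mgrid[-25:25, -25:25]
dem = 0.01 * (x**2 + y**2)
slope, lap = slope_and_laplacian(dem)
print(lap[25, 25] > 0)   # the bowl curves upward everywhere -> True
```

A real implementation would evaluate the full curvature formulas on a spherical grid; the finite-difference machinery, however, is the same.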

  10. Desktop Software for Patch-Clamp Raw Binary Data Conversion and Preprocessing

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2011-01-01

    Full Text Available Since raw data recorded by patch-clamp systems are always stored in binary format, electrophysiologists may experience difficulties with patch-clamp data preprocessing, especially when they want to analyze the data with custom-designed algorithms. In this study, we present desktop software, called PCDReader, which can be an effective and convenient solution for patch-clamp data preprocessing in daily laboratory use. We designed a novel class module, called clsPulseData, to directly read the raw data along with the parameters recorded from HEKA instruments without any other program support. Through a graphical user interface, raw binary data files can be converted into several kinds of ASCII text files for further analysis, with several preprocessing options. The parameters can also be viewed, modified, and exported into ASCII files in a user-friendly Explorer-style window. The real-time data loading technique and optimized memory management make PCDReader a fast and efficient tool. The compiled software, along with the source code of the clsPulseData class module, is freely available to academic and nonprofit users.
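
The record does not include PCDReader's code; as a rough sketch of the core "binary trace to ASCII" step such a converter performs, the following Python snippet decodes little-endian 16-bit samples and writes tab-separated rows. The sample encoding and channel layout here are assumptions for illustration; HEKA's real file format additionally carries headers and parameter trees that clsPulseData parses.

```python
import io
import struct

def convert_to_ascii(raw, n_channels=1):
    """Decode little-endian int16 samples from a raw binary trace and
    return them as tab-separated ASCII text, one sample row per line."""
    count = len(raw) // 2
    samples = struct.unpack("<%dh" % count, raw[:count * 2])
    out = io.StringIO()
    for i in range(0, count, n_channels):
        row = samples[i:i + n_channels]
        out.write("\t".join(str(v) for v in row) + "\n")
    return out.getvalue()

# Toy trace: four int16 samples from two interleaved channels.
raw = struct.pack("<4h", 10, -20, 30, -40)
print(convert_to_ascii(raw, n_channels=2))  # two rows, two columns each
```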

  11. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    International Nuclear Information System (INIS)

    Oh, Wang Kyun

    2014-01-01

    The usefulness and clinical availability of a model manufactured by a desktop 3D printer for improving surgical efficiency were evaluated by conducting preoperative planning with a model based on clavicle CT images. The patient-customized clavicle fracture model was manufactured on a desktop FDM (fused deposition modeling) 3D printer by converting the CT images into an STL file in the open-source DICOM viewer Osirix. In addition, a model of the original, undamaged shape was restored and manufactured by the mirror technique, based on the STL file of the uninjured contralateral clavicle, exploiting the symmetry of the human body. The model reproduced the position, size, and degree of the fracture. Manufacturing the clavicle model directly in the Department of Radiology at low cost and in little time is considered useful, because it can reduce secondary damage during surgery and increase surgical efficiency with minimally invasive percutaneous plate osteosynthesis (MIPO).

  12. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Wang Kyun [Dept. of Diagnostic Radiology, Cheongju Medical Center, Cheongju (Korea, Republic of)

    2014-09-15

    The usefulness and clinical availability of a model manufactured by a desktop 3D printer for improving surgical efficiency were evaluated by conducting preoperative planning with a model based on clavicle CT images. The patient-customized clavicle fracture model was manufactured on a desktop FDM (fused deposition modeling) 3D printer by converting the CT images into an STL file in the open-source DICOM viewer Osirix. In addition, a model of the original, undamaged shape was restored and manufactured by the mirror technique, based on the STL file of the uninjured contralateral clavicle, exploiting the symmetry of the human body. The model reproduced the position, size, and degree of the fracture. Manufacturing the clavicle model directly in the Department of Radiology at low cost and in little time is considered useful, because it can reduce secondary damage during surgery and increase surgical efficiency with minimally invasive percutaneous plate osteosynthesis (MIPO).

  13. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    Science.gov (United States)

    Ling, Guangming

    2016-01-01

    To investigate possible iPad related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  14. Do small fish mean no voucher? Using a flatbed desktop scanner to document larval and small specimens before destructive analyses

    Czech Academy of Sciences Publication Activity Database

    Kalous, L.; Šlechtová, Věra; Petrtýl, M.; Kohout, Jan; Čech, Martin

    2010-01-01

    Roč. 26, č. 4 (2010), s. 614-617 ISSN 0175-8659 R&D Projects: GA ČR GA206/06/1371; GA ČR GP206/09/P266 Institutional research plan: CEZ:AV0Z50450515; CEZ:AV0Z60170517 Keywords : small fish * voucher * desktop scanner Subject RIV: GL - Fishing Impact factor: 0.945, year: 2010

  15. NoSQL Databases

    OpenAIRE

    PANYKO, Tomáš

    2013-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  16. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  17. USAID Anticorruption Projects Database

    Data.gov (United States)

    US Agency for International Development — The Anticorruption Projects Database (Database) includes information about USAID projects with anticorruption interventions implemented worldwide between 2007 and...

  18. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web.This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  19. Curcumin Resource Database.

    Science.gov (United States)

    Kumar, Anil; Chetia, Hasnahana; Sharma, Swagata; Kabiraj, Debajyoti; Talukdar, Narayan Chandra; Bora, Utpal

    2015-01-01

    Curcumin is one of the most intensively studied diarylheptanoids, with Curcuma longa being its principal producer. In addition, a class of promising curcumin analogs, aptly named curcuminoids, has been generated in laboratories; these show huge potential in the fields of medicine, food technology, etc. The lack of a universal source of data on curcumin as well as curcuminoids has long been felt by the curcumin research community. Hence, in an attempt to address this stumbling block, we have developed the Curcumin Resource Database (CRDB), which aims to serve as a gateway-cum-repository to access all relevant data and related information on curcumin and its analogs. Currently, this database encompasses 1186 curcumin analogs, 195 molecular targets, 9075 peer-reviewed publications, 489 patents, and 176 varieties of C. longa, obtained by extensive data mining and careful curation from numerous sources. Each data entry is identified by a unique CRDB ID (identifier). Furnished with a user-friendly web interface and an in-built search engine, CRDB provides well-curated and cross-referenced information that is hyperlinked with external sources. CRDB is expected to be highly useful to researchers working on structure- as well as ligand-based molecular design of curcumin analogs. © The Author(s) 2015. Published by Oxford University Press.

  20. Optimizing the number of cleavage stage embryos to transfer on day 3 in women 38 years of age and older: a Society for Assisted Reproductive Technology database study.

    Science.gov (United States)

    Stern, Judy E; Goldman, Marlene B; Hatasaka, Harry; MacKenzie, Todd A; Surrey, Eric S; Racowsky, Catherine

    2009-03-01

    To determine the optimal number of day 3 embryos to transfer in women ≥38 years by conducting an evidence-based evaluation. Retrospective analysis of 2000-2004 national SART data. National writing group. A total of 36,103 day 3 embryo transfers in women ≥38 years undergoing their first assisted reproductive technology cycle. None. Logistic regression was used to model the probability of pregnancy, delivery, and multiple births (twin or high order) based on age- and cycle-specific parameters. Pregnancy rates, delivery rates, and multiple rates increased up to transfer of three embryos in 38-year-olds and four in 39-year-olds; beyond this number, only multiple rates increased. In women ≥40 years, delivery rates and multiple rates climbed steadily with increasing numbers transferred. Multivariate analysis confirmed the statistically significant effect of age, number of oocytes retrieved, and embryo cryopreservation on delivery and multiple rates. Maximum FSH level was not an independent predictor by multivariate analysis. Use of intracytoplasmic sperm injection was associated with lowered delivery rate. No more than three or four embryos should be transferred in 38- and 39-year-olds, respectively, whereas up to five embryos could be transferred in women ≥40 years. Numbers of embryos to transfer should be adjusted according to number of oocytes retrieved and availability of excess embryos for cryopreservation.

  1. Beginning C# 2008 databases from novice to professional

    CERN Document Server

    Fahad Gilani, Syed; Reid, Jon; Raghuram, Ranga; Huddleston, James; Hammer Pedersen, Jacob

    2008-01-01

    This book is for every C# programmer. It assumes no prior database experience and teaches through hands-on examples how to create and use relational databases with the standard database language SQL and how to access them with C#.Assuming only basic knowledge of C# 3.0, Beginning C# 3.0 Databases teaches all the fundamentals of database technology and database programming readers need to quickly become highly proficient database users and application developers. A comprehensive tutorial on both SQL Server 2005 and ADO.NET 3.0, this book explains and demonstrates how to create database objects

  2. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the state of present database technology, in order to build the Chinese nuclear power information database (NPIDB) for the nuclear industry efficiently and from a high starting point, an important task is to select a proper database management system (DBMS), which is pivotal to building the database successfully. This article therefore explains how to build a practical nuclear power information database, the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, the recommendation of the ORACLE management system as the software with which to build the database, and so on.

  3. Analisis Kebutuhan Bandwidth Pada Pemanfaatan Web Streaming Justin.tv Sebagai Media E-Learning Dengan Menggunakan Wirecast Dan Desktop Presenter

    Directory of Open Access Journals (Sweden)

    Muhammad Ubaidilah

    2014-05-01

    Full Text Available The rapid development of information technology has changed the perspective of many people, including the way they view improving education. One example is learning based on Information and Communication Technologies (ICT), namely learning using streaming video. The software Wirecast and Desktop Presenter were installed and used to create streaming instructional video, broadcast in real time through the broadcast medium Justin.tv (an internet TV channel), which is expected to better support the concept of learning anytime and anywhere. The biggest problem with this technology is limited bandwidth. Bandwidth is an important parameter for streaming over a network, while communication using digital video consumes considerable resources; Wireshark is therefore needed here to analyze the bandwidth of the packets received by the client. From measurements of H.264 video at 720 x 540 resolution, with an average sampling duration of 20 minutes, over 30 streaming-video test samples analyzed with Wireshark, the overall average throughput was 0.343 Mbps, the lowest average throughput 0.309 Mbps, and the highest throughput 0.372 Mbps. It can be concluded that higher throughput yields better streaming-video quality, while lower throughput degrades it.
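
The throughput figures reported in the study are derived from packet captures. As a sketch of the underlying arithmetic (not the study's actual data), the following Python snippet computes average throughput in Mbps from a list of (timestamp, packet size) pairs such as one could export from Wireshark:

```python
def average_throughput_mbps(packets):
    """Average throughput in Mbps from (timestamp_s, size_bytes) pairs."""
    if len(packets) < 2:
        return 0.0
    duration = packets[-1][0] - packets[0][0]   # capture duration in seconds
    total_bits = sum(size * 8 for _, size in packets)
    return total_bits / duration / 1e6

# Toy capture: 401 packets of 1500 bytes spread evenly over 4 seconds.
capture = [(t * 0.01, 1500) for t in range(401)]
print(round(average_throughput_mbps(capture), 3))  # -> 1.203
```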

  4. Numeric Databases in the 80s.

    Science.gov (United States)

    Fried, John B.; Kovacs, Gabor J.

    1982-01-01

    Defining a numeric database as a computer-readable collection of data predominantly numeric in nature, this article reviews techniques and technologies having a positive influence on the growth of numeric databases, such as videotex, mini- and microcomputers, artificial intelligence, improved software, telecommunications, and office automation.…

  5. KALIMER design database development and operation manual

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    The KALIMER Design Database was developed to provide integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER Design Database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds research results from mid-term and long-term nuclear R and D. IOC is a linkage control system between sub-projects for sharing and integrating research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, KALIMER Reserved Documents manages collected data and various documents produced over the course of the project.

  6. KALIMER design database development and operation manual

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    The KALIMER Design Database was developed to provide integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER Design Database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds research results from mid-term and long-term nuclear R and D. IOC is a linkage control system between sub-projects for sharing and integrating research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, KALIMER Reserved Documents manages collected data and various documents produced over the course of the project.

  7. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefits of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address the problems of a specific community, so each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with substantial user communities: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We
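
The Common Tool Descriptor documents mentioned above are a structured, platform-free record of a tool's inputs, outputs, and parameters; their actual schema is not reproduced here. As a purely hypothetical illustration of the idea (all names and fields are invented), such a descriptor could be modeled like this:

```python
from dataclasses import dataclass, field

@dataclass
class ToolDescriptor:
    """Hypothetical platform-free description of a command-line tool,
    in the spirit of (but not identical to) a Common Tool Descriptor."""
    name: str
    version: str
    inputs: dict = field(default_factory=dict)    # port name -> file type
    outputs: dict = field(default_factory=dict)
    parameters: dict = field(default_factory=dict)

    def to_cli(self, values):
        """Render concrete port/parameter values into a command line,
        which a workflow engine could then execute."""
        args = [self.name]
        for key, val in values.items():
            args += ["--" + key, str(val)]
        return " ".join(args)

td = ToolDescriptor("peak_picker", "1.0",
                    inputs={"in": "mzML"}, outputs={"out": "mzML"},
                    parameters={"signal_to_noise": 1.0})
print(td.to_cli({"in": "run1.mzML", "out": "picked.mzML"}))
```

The value of such a neutral representation is that each workflow engine can generate its own wrapper from the same descriptor.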

  8. Logical database design principles

    CERN Document Server

    Garmany, John; Clark, Terry

    2005-01-01

    INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint

  9. Oracle database systems administration

    OpenAIRE

    Šilhavý, Dominik

    2017-01-01

    The master's thesis entitled Oracle database systems administration describes problems in databases and how to solve them, which is important for database administrators. It helps them deliver solutions faster, without the need to search for or work out solutions on their own. The thesis describes database backup and recovery methods that are closely related to those solutions. The main goal is to provide guidance and recommendations regarding database troubles and how to solve them. It ...

  10. Issues in Big-Data Database Systems

    Science.gov (United States)

    2014-06-01

    that big data will not be manageable using conventional relational database technology, and it is true that alternative paradigms, such as NoSQL systems and search engines, have much to offer...scale well, and because integration with external data sources is so difficult. NoSQL systems are more open to this integration, and provide excellent

  11. 3D Printing in Technology and Engineering Education

    Science.gov (United States)

    Martin, Robert L.; Bowden, Nicholas S.; Merrill, Chris

    2014-01-01

    In the past five years, there has been tremendous growth in the production and use of desktop 3D printers. This growth has been driven by the increasing availability of inexpensive computing and electronics technologies. The ability to rapidly share ideas and intelligence over the Internet has also played a key role in the growth. Growth is also…

  12. Database Description - RED | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: RED (alternative name: Rice Expression Database). Contact: Shoshi Kikuchi, Genome Research Unit. Database classification: Plant databases - Rice; Microarray, Gene Expression. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: The Rice Expression Database (RED) is a database that aggregates the gene expression data from the Rice Microarray Project and other research groups.

  13. Centralized vs. Distributed Databases. Case Study

    Directory of Open Access Journals (Sweden)

    Nicoleta Magdalena Iacob

    2015-12-01

    Full Text Available Currently, in the information technology domain, and implicitly in the database domain, two apparently contradictory approaches can be observed: centralization and distribution. Although both aim to produce benefits, it is a known fact that every advantage comes at a price. In this paper we present a case study: optimizing the performance of an e-learning portal by using distributed database technology. When institutions have branches distributed over a wide geographic area, distributed database systems become more appropriate to use, because they offer a higher degree of flexibility and adaptability than centralized ones.
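
Much of the flexibility claimed for distributed databases comes from data fragmentation: each branch stores and answers queries over its own fragment. A toy Python sketch of horizontal fragmentation with query routing (all names invented, fragments modeled as plain dicts rather than real databases):

```python
# Toy horizontal fragmentation: each branch owns its own fragment.
FRAGMENTS = {
    "eu": {"students": [{"id": 1, "branch": "eu"}]},
    "us": {"students": [{"id": 2, "branch": "us"}]},
}

def query_students(branch):
    """Route the query to the fragment owning the branch's data,
    instead of scanning a single centralized store."""
    return FRAGMENTS[branch]["students"]

print([row["id"] for row in query_students("us")])  # -> [2]
```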

  14. DEPOT database: Reference manual and user's guide

    International Nuclear Information System (INIS)

    Clancey, P.; Logg, C.

    1991-03-01

    DEPOT has been developed to provide tracking for the Stanford Linear Collider (SLC) control system equipment. For each piece of equipment entered into the database, complete location, service, maintenance, modification, certification, and radiation exposure histories can be maintained. To facilitate data entry accuracy, efficiency, and consistency, barcoding technology has been used extensively. DEPOT has been an important tool in improving the reliability of the microsystems controlling SLC. This document describes the components of the DEPOT database, the elements in the database records, and the use of the supporting programs for entering data, searching the database, and producing reports from the information

  15. Information Technology: A Road to the Future? To Promote Academic Justice and Excellence Series.

    Science.gov (United States)

    Gilbert, Steven W.; Green, Kenneth C.

    This publication is intended to provide college faculty and staff with a guide to information technology issues in higher education. Midway through the 1990s, higher education confronts the second phase of the information technology (IT) revolution, a shift in emphasis from the computer as a desktop tool to the computer as a communications…

  16. Online Databases for Health Professionals

    OpenAIRE

    Marshall, Joanne Gard

    1987-01-01

    Recent trends in the marketing of electronic information technology have increased interest among health professionals in obtaining direct access to online biomedical databases such as Medline. During 1985, the Canadian Medical Association (CMA) and Telecom Canada conducted an eight-month trial of the use made of online information retrieval systems by 23 practising physicians and one pharmacist. The results of this project demonstrated both the value and the limitations of these systems in p...

  17. Open Source Vulnerability Database Project

    Directory of Open Access Journals (Sweden)

    Jake Kouns

    2008-06-01

    Full Text Available This article introduces the Open Source Vulnerability Database (OSVDB) project, which manages a global collection of computer security vulnerabilities, available for free use by the information security community. This collection contains information on known security weaknesses in operating systems, software products, protocols, hardware devices, and other infrastructure elements of information technology. The OSVDB project is intended to be the centralized global open source vulnerability collection on the Internet.

  18. Characterization of chemical contaminants generated by a desktop fused deposition modeling 3-dimensional Printer.

    Science.gov (United States)

    Stefaniak, Aleksandr B; LeBouf, Ryan F; Yi, Jinghai; Ham, Jason; Nurkewicz, Timothy; Schwegler-Berry, Diane E; Chen, Bean T; Wells, J Raymond; Duling, Matthew G; Lawrence, Robert B; Martin, Stephen B; Johnson, Alyson R; Virji, M Abbas

    2017-07-01

    Printing devices are known to emit chemicals into the indoor atmosphere. Understanding factors that influence release of chemical contaminants from printers is necessary to develop effective exposure assessment and control strategies. In this study, a desktop fused deposition modeling (FDM) 3-dimensional (3-D) printer using acrylonitrile butadiene styrene (ABS) or polylactic acid (PLA) filaments and two monochrome laser printers were evaluated in a 0.5 m³ chamber. During printing, chamber air was monitored for vapors using a real-time photoionization detector (results expressed as isobutylene equivalents) to measure total volatile organic compound (TVOC) concentrations, evacuated canisters to identify specific VOCs by off-line gas chromatography-mass spectrometry (GC-MS) analysis, and liquid bubblers to identify carbonyl compounds by GC-MS. Airborne particles were collected on filters for off-line analysis using scanning electron microscopy with an energy dispersive x-ray detector to identify elemental constituents. For 3-D printing, TVOC emission rates were influenced by a printer malfunction, filament type, and to a lesser extent, by filament color; however, rates were not influenced by the number of printer nozzles used or the manufacturer's provided cover. TVOC emission rates were significantly lower for the 3-D printer (49-3552 µg h⁻¹) compared to the laser printers (5782-7735 µg h⁻¹). A total of 14 VOCs were identified during 3-D printing that were not present during laser printing. 3-D printed objects continued to off-gas styrene, indicating potential for continued exposure after the print job is completed. Carbonyl reaction products were likely formed from emissions of the 3-D printer, including 4-oxopentanal. Ultrafine particles generated by the 3-D printer using ABS and a laser printer contained chromium. Consideration of the factors that influenced the release of chemical contaminants (including known and suspected asthmagens such as styrene and

  19. Database Description - RMOS | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: RMOS (alternative name: Rice Microarray Opening Site). Contact: Shoshi Kikuchi, Research Unit. Database classification: Plant databases - Rice Microarray Data and other Gene Expression Databases. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: The Rice Microarray Opening Site is a database of comprehensive information for rice microarrays.

  20. Kentucky geotechnical database.

    Science.gov (United States)

    2005-03-01

    Development of a comprehensive, dynamic geotechnical database is described. The computer software selected to program the client/server application in a Windows environment, the components and structure of the geotechnical database, and the primary factors cons...

  1. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1991-11-01

    The first edition of the Directory of IAEA Databases is intended to describe the computerized information sources available to IAEA staff members. It contains a listing of all databases produced at the IAEA, together with information on their availability

  2. Cell Centred Database (CCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Cell Centered Database (CCDB) is a web accessible database for high resolution 2D, 3D and 4D data from light and electron microscopy, including correlated imaging.

  3. Physiological Information Database (PID)

    Science.gov (United States)

    EPA has developed a physiological information database (created using Microsoft ACCESS) intended to be used in PBPK modeling. The database contains physiological parameter values for humans from early childhood through senescence as well as similar data for laboratory animal spec...

  4. E3 Staff Database

    Data.gov (United States)

    US Agency for International Development — E3 Staff database is maintained by E3 PDMS (Professional Development & Management Services) office. The database is Mysql. It is manually updated by E3 staff as...

  5. Database Urban Europe

    NARCIS (Netherlands)

    Sleutjes, B.; de Valk, H.A.G.

    2016-01-01

    Database Urban Europe: ResSegr database on segregation in The Netherlands. Collaborative research on residential segregation in Europe 2014–2016 funded by JPI Urban Europe (Joint Programming Initiative Urban Europe).

  6. Development of technical information database for high level waste disposal

    International Nuclear Information System (INIS)

    Kudo, Koji; Takada, Susumu; Kawanishi, Motoi

    2005-01-01

    A concept design of the high level waste disposal information database and the disposal technologies information database is explained. The high level waste disposal information database contains information on technologies, waste, management and rules, R and D, each step of disposal site selection, characteristics of sites, demonstration of disposal technology, design of the disposal site, application for a disposal permit, construction of the disposal site, operation, and closing. Construction of the disposal technologies information system and the geological disposal technologies information system is described. A screen image of the geological disposal technologies information system is shown; in it, users can perform both full-text and attribute retrieval. (S.Y.)

  7. Scopus database: a review.

    Science.gov (United States)

    Burnham, Judy F

    2006-03-08

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive, but they complement each other. If a library can afford only one, the choice must be based on institutional needs.

  8. Scopus database: a review

    OpenAIRE

    Burnham, Judy F

    2006-01-01

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive, but they complement each other. If a library can afford only one, the choice must be based on institutional needs.

  9. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Changes at any level of the computing infrastructure (OS parameters and packages, kernel versions, database parameters and patches, or even schema changes) can all potentially harm production services. This presentation shows how automatic and regular testing of Oracle databases can be achieved in such an agile environment.

  10. Library Databases as Unexamined Classroom Technologies

    Science.gov (United States)

    Faix, Allison

    2014-01-01

    In their 1994 article, "The Politics of the Interface: Power and its Exercise in Electronic Contact Zones," compositionists Cynthia Selfe and Richard Selfe give examples of how certain features of word processing software and other programs used in writing classrooms (including their icons, clip art, interfaces, and file structures) can…

  11. Keyword Search in Databases

    CERN Document Server

    Yu, Jeffrey Xu; Chang, Lijun

    2009-01-01

    It has become highly desirable to provide users with flexible ways to query/search information over databases as simple as keyword search like Google search. This book surveys the recent developments on keyword search over databases, and focuses on finding structural information among objects in a database using a set of keywords. Such structural information to be returned can be either trees or subgraphs representing how the objects, that contain the required keywords, are interconnected in a relational database or in an XML database. The structural keyword search is completely different from
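    As a toy illustration of keyword search over a relational database, the sketch below finds connected author/paper tuples whose combined text contains every query keyword. The two-table schema and data are invented for illustration, and the join tree is fixed here, whereas real structural keyword search enumerates candidate join trees automatically:

    ```python
    import sqlite3

    # Hypothetical two-table schema used only for illustration.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE paper  (id INTEGER PRIMARY KEY, title TEXT,
                         author_id INTEGER REFERENCES author(id));
    INSERT INTO author VALUES (1, 'Yu'), (2, 'Chang');
    INSERT INTO paper  VALUES (10, 'Keyword search surveys', 1),
                              (11, 'Graph indexing', 2);
    """)

    def keyword_search(keywords):
        """Return (author, title) pairs whose joined text contains every keyword."""
        rows = conn.execute(
            "SELECT a.name, p.title FROM author a JOIN paper p ON p.author_id = a.id"
        ).fetchall()
        hits = []
        for name, title in rows:
            text = (name + " " + title).lower()
            if all(k.lower() in text for k in keywords):
                hits.append((name, title))
        return hits

    print(keyword_search(["yu", "keyword"]))  # [('Yu', 'Keyword search surveys')]
    ```

    The answer here is a joined tuple tree (author connected to paper), the simplest case of the tree- and subgraph-shaped results the book discusses.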

  12. Nuclear power economic database

    International Nuclear Information System (INIS)

    Ding Xiaoming; Li Lin; Zhao Shiping

    1996-01-01

    Nuclear power economic database (NPEDB), based on ORACLE V6.0, consists of three parts, i.e., an economic database of nuclear power stations, an economic database of the nuclear fuel cycle, and an economic database of nuclear power planning and the nuclear environment. The economic database of nuclear power stations includes data on general economics, technique, capital cost and benefit, etc. The economic database of the nuclear fuel cycle includes data on technique and nuclear fuel prices. The economic database of nuclear power planning and the nuclear environment includes data on energy history, forecasts, energy balance, and electric power and energy facilities

  13. Protein sequence databases.

    Science.gov (United States)

    Apweiler, Rolf; Bairoch, Amos; Wu, Cathy H

    2004-02-01

    A variety of protein sequence databases exist, ranging from simple sequence repositories, which store data with little or no manual intervention in the creation of the records, to expertly curated universal databases that cover all species and in which the original sequence data are enhanced by the manual addition of further information in each sequence record. As the focus of researchers moves from the genome to the proteins encoded by it, these databases will play an even more important role as central comprehensive resources of protein information. Several of the leading protein sequence databases are discussed here, with special emphasis on the databases now provided by the Universal Protein Knowledgebase (UniProt) consortium.

  14. Antimüllerian hormone as a predictor of live birth following assisted reproduction: an analysis of 85,062 fresh and thawed cycles from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System database for 2012-2013.

    Science.gov (United States)

    Tal, Reshef; Seifer, David B; Wantman, Ethan; Baker, Valerie; Tal, Oded

    2018-02-01

    To determine if serum antimüllerian hormone (AMH) is associated with and/or predictive of live birth assisted reproductive technology (ART) outcomes. Retrospective analysis of Society for Assisted Reproductive Technology Clinic Outcome Reporting System database from 2012 to 2013. Not applicable. A total of 69,336 (81.8%) fresh and 15,458 (18.2%) frozen embryo transfer (FET) cycles with AMH values. None. Live birth. A total of 85,062 out of 259,499 (32.7%) fresh and frozen-thawed autologous non-preimplantation genetic diagnosis cycles had AMH reported for cycles over this 2-year period. Of those, 70,565 cycles which had embryo transfers were included in the analysis. Serum AMH was significantly associated with live birth outcome per transfer in both fresh and FET cycles. Multiple logistic regression demonstrated that AMH is an independent predictor of live birth in fresh transfer cycles and FET cycles when controlling for age, body mass index, race, day of transfer, and number of embryos transferred. Receiver operating characteristic (ROC) curves demonstrated that the areas under the curve (AUC) for AMH as predictors of live birth in fresh cycles and thawed cycles were 0.631 and 0.540, respectively, suggesting that AMH alone is a weak independent predictor of live birth after ART. Similar ROC curves were obtained also when elective single-embryo transfer (eSET) cycles were analyzed separately in either fresh (AUC 0.655) or FET (AUC 0.533) cycles, although AMH was not found to be an independent predictor in eSET cycles. AMH is a poor independent predictor of live birth outcome in either fresh or frozen embryo transfer for both eSET and non-SET transfers. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  15. Database Description - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us RPD Database Description General information of database Database name RPD Alternative name Rice Proteome Database...titute of Crop Science, National Agriculture and Food Research Organization Setsuko Komatsu E-mail: Database... classification Proteomics Resources Plant databases - Rice Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database... description Rice Proteome Database contains information on protei...AGE) reference maps. Features and manner of utilization of database Proteins extracted from organs and subce

  16. Database Description - ASTRA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us ASTRA Database Description General information of database Database name ASTRA Alternative n...tics Journal Search: Contact address Database classification Nucleotide Sequence Databases - Gene structure,...3702 Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description The database represents classified p...mes. Features and manner of utilization of database This database enables to sear...ch and represent alternative splicing/transcriptional initiation genes and their patterns (ex: cassette) base

  17. Emissions of Ultrafine Particles and Volatile Organic Compounds from Commercially Available Desktop Three-Dimensional Printers with Multiple Filaments.

    Science.gov (United States)

    Azimi, Parham; Zhao, Dan; Pouzet, Claire; Crain, Neil E; Stephens, Brent

    2016-02-02

    Previous research has shown that desktop 3D printers can emit large numbers of ultrafine particles (UFPs, particles less than 100 nm) and some hazardous volatile organic compounds (VOCs) during printing, although very few filament and 3D printer combinations have been tested to date. Here we quantify emissions of UFPs and speciated VOCs from five commercially available filament extrusion desktop 3D printers utilizing up to nine different filaments by controlled experiments in a test chamber. Median estimates of time-varying UFP emission rates ranged from ∼10⁸ to ∼10¹¹ min⁻¹ across all tested combinations, varying primarily by filament material and, to a lesser extent, bed temperature. The individual VOCs emitted in the largest quantities included caprolactam from nylon-based and imitation wood and brick filaments (ranging from ∼2 to ∼180 μg/min), styrene from acrylonitrile butadiene styrene (ABS) and high-impact polystyrene (HIPS) filaments (ranging from ∼10 to ∼110 μg/min), and lactide from polylactic acid (PLA) filaments (ranging from ∼4 to ∼5 μg/min). Results from a screening analysis of potential exposure to these products in a typical small office environment suggest caution should be used when operating many of the printer and filament combinations in poorly ventilated spaces or without the aid of combined gas and particle filtration systems.

  18. Database Description - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Trypanosomes Database Database Description General information of database Database name Trypanosomes Database...stitute of Genetics Research Organization of Information and Systems Yata 1111, Mishima, Shizuoka 411-8540, JAPAN E mail: Database... classification Protein sequence databases Organism Taxonom...y Name: Trypanosoma Taxonomy ID: 5690 Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description The Trypanosomes database... is a database providing the comprehensive information of proteins that is effective t

  19. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database Database Description General information of database Database n...ame Arabidopsis Phenome Database Alternative name - DOI 10.18908/lsdba.nbdc01509-000 Creator Creator Name: H... BioResource Center Hiroshi Masuya Database classification Plant databases - Arabidopsis thaliana Organism T...axonomy Name: Arabidopsis thaliana Taxonomy ID: 3702 Database description The Arabidopsis thaliana phenome i...heir effective application. We developed the new Arabidopsis Phenome Database integrating two novel database

  20. Design of multi-tiered database application based on CORBA component

    International Nuclear Information System (INIS)

    Sun Xiaoying; Dai Zhimin

    2003-01-01

    With the rapid development of computer technology, middleware has changed the traditional two-tier database system. The multi-tiered database system, consisting of client application programs, application servers, and database servers, is now the dominant architecture, and building multi-tiered database systems from CORBA components has become a mainstream technique. In this paper, an example of the DUV-FEL database system is presented, and the realization of a multi-tiered database based on CORBA components is discussed. (authors)
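    The tiering this record describes can be illustrated without CORBA: a middle-tier component exposes methods to clients and hides the SQL behind them. The class name, schema, and data below are hypothetical, and SQLite stands in for both the ORB and the database server in this sketch:

    ```python
    import sqlite3

    class BeamDataServer:
        """Middle-tier component: clients call methods, never SQL directly.
        In the paper this role is played by a CORBA component; plain Python
        stands in for the ORB here."""
        def __init__(self, db_path=":memory:"):
            # Third tier: the database server (SQLite as a stand-in).
            self.conn = sqlite3.connect(db_path)
            self.conn.execute(
                "CREATE TABLE IF NOT EXISTS run (id INTEGER PRIMARY KEY, energy REAL)")

        def add_run(self, energy):
            cur = self.conn.execute("INSERT INTO run (energy) VALUES (?)", (energy,))
            self.conn.commit()
            return cur.lastrowid

        def get_energy(self, run_id):
            row = self.conn.execute(
                "SELECT energy FROM run WHERE id = ?", (run_id,)).fetchone()
            return row[0] if row else None

    # Client tier: talks only to the application server's interface.
    server = BeamDataServer()
    rid = server.add_run(13.7)
    print(server.get_energy(rid))  # 13.7
    ```

    The point of the middle tier is that schema changes stay behind the method interface, so client programs need not change.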

  1. SSC lattice database and graphical interface

    International Nuclear Information System (INIS)

    Trahern, C.G.; Zhou, J.

    1991-11-01

    When completed, the Superconducting Super Collider will be the world's largest accelerator complex. In order to build this system on schedule, the use of database technologies will be essential. In this paper we discuss one of the database efforts underway at the SSC, the lattice database. The SSC lattice database provides a centralized source for the design of each major component of the accelerator complex. This includes the two collider rings, the High Energy Booster, Medium Energy Booster, Low Energy Booster, and the LINAC as well as transfer and test beam lines. These designs have been created using a menagerie of programs such as SYNCH, DIMAD, MAD, TRANSPORT, MAGIC, TRACE3D and TEAPOT. However, once a design has been completed, it is entered into a uniform database schema in the database system. In this paper we discuss the reasons for creating the lattice database and its implementation via the commercial database system SYBASE. Each lattice in the lattice database is composed of a set of tables whose data structure can describe any of the SSC accelerator lattices. In order to allow the user community access to the databases, a programmatic interface known as dbsf (for database to several formats) has been written. Dbsf creates ASCII input files appropriate to the above mentioned accelerator design programs. In addition it has a binary dataset output using the Self Describing Standard data discipline provided with the Integrated Scientific Tool Kit software tools. Finally we discuss the graphical interfaces to the lattice database. The primary interface, known as OZ, is a simulation environment as well as a database browser

  2. Interactive bibliographical database on color

    Science.gov (United States)

    Caivano, Jose L.

    2002-06-01

    The paper describes the methodology and results of a project under development, aimed at the elaboration of an interactive bibliographical database on color in all fields of application: philosophy, psychology, semiotics, education, anthropology, physical and natural sciences, biology, medicine, technology, industry, architecture and design, arts, linguistics, geography, history. The project is initially based upon an already developed bibliography, published in different journals, updated in various opportunities, and now available at the Internet, with more than 2,000 entries. The interactive database will amplify that bibliography, incorporating hyperlinks and contents (indexes, abstracts, keywords, introductions, or eventually the complete document), and devising mechanisms for information retrieval. The sources to be included are: books, doctoral dissertations, multimedia publications, reference works. The main arrangement will be chronological, but the design of the database will allow rearrangements or selections by different fields: subject, Decimal Classification System, author, language, country, publisher, etc. A further project is to develop another database, including color-specialized journals or newsletters, and articles on color published in international journals, arranged in this case by journal name and date of publication, but allowing also rearrangements or selections by author, subject and keywords.

  3. Design and validation of a 3D virtual reality desktop system for sonographic length and volume measurements in early pregnancy evaluation.

    Science.gov (United States)

    Baken, Leonie; van Gruting, Isabelle M A; Steegers, Eric A P; van der Spek, Peter J; Exalto, Niek; Koning, Anton H J

    2015-03-01

    To design and validate a desktop virtual reality (VR) system, for presentation and assessment of volumetric data, based on commercially off-the-shelf hardware as an alternative to a fully immersive CAVE-like I-Space VR system. We designed a desktop VR system, using a three-dimensional (3D) monitor and a six degrees-of-freedom tracking system. A personal computer uses the V-Scope (Erasmus MC, Rotterdam, The Netherlands) volume-rendering application, developed for the I-Space, to create a hologram of volumetric data. Inter- and intraobserver reliability for crown-rump length and embryonic volume measurements are investigated using Bland-Altman plots and intraclass correlation coefficients. Time required for the measurements was recorded. Comparing the I-Space and the desktop VR system, the mean difference for crown-rump length is -0.34% (limits of agreement -2.58-1.89, ±2.24%) and for embryonic volume -0.92% (limits of agreement -6.97-5.13, ±6.05%). Intra- and interobserver intraclass correlation coefficients of the desktop VR system were all >0.99. Measurement times were longer on the desktop VR system compared with the I-Space, but the differences were not statistically significant. A user-friendly desktop VR system can be put together using commercially off-the-shelf hardware at an acceptable price. This system provides a valid and reliable method for embryonic length and volume measurements and can be used in clinical practice. © 2014 Wiley Periodicals, Inc.
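    The agreement analysis used in this validation reduces to a Bland-Altman mean difference and 95% limits of agreement (mean ± 1.96 SD of the paired differences). A minimal sketch with made-up paired measurements, not the study's data:

    ```python
    from statistics import mean, stdev

    def bland_altman(a, b):
        """Mean difference and 95% limits of agreement (mean ± 1.96 SD)
        for paired measurements of the same quantity by two methods."""
        diffs = [x - y for x, y in zip(a, b)]
        d = mean(diffs)
        s = stdev(diffs)          # sample standard deviation
        return d, (d - 1.96 * s, d + 1.96 * s)

    # Hypothetical paired crown-rump lengths (mm): I-Space vs. desktop VR.
    ispace  = [10.1, 12.3, 15.0, 18.2]
    desktop = [10.0, 12.5, 14.9, 18.4]
    d, (lo, hi) = bland_altman(ispace, desktop)
    ```

    Narrow limits of agreement around a near-zero mean difference, as reported in the abstract, indicate the two systems can be used interchangeably.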

  4. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  5. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  6. National Database of Geriatrics

    DEFF Research Database (Denmark)

    Kannegaard, Pia Nimann; Vinding, Kirsten L; Hare-Bruun, Helle

    2016-01-01

    AIM OF DATABASE: The aim of the National Database of Geriatrics is to monitor the quality of interdisciplinary diagnostics and treatment of patients admitted to a geriatric hospital unit. STUDY POPULATION: The database population consists of patients who were admitted to a geriatric hospital unit....... Geriatric patients cannot be defined by specific diagnoses. A geriatric patient is typically a frail multimorbid elderly patient with decreasing functional ability and social challenges. The database includes 14-15,000 admissions per year, and the database completeness has been stable at 90% during the past......, percentage of discharges with a rehabilitation plan, and the part of cases where an interdisciplinary conference has taken place. Data are recorded by doctors, nurses, and therapists in a database and linked to the Danish National Patient Register. DESCRIPTIVE DATA: Descriptive patient-related data include...

  7. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

    Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).
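    The plug-in extensibility described here can be sketched as a small registry that dispatches resource indexing to optional handlers, with a default when no plug-in claims a resource type. All names and behavior below are invented; Concierge's actual plug-in API is not documented in this record:

    ```python
    class PluginHost:
        """Toy plug-in host: optional plug-ins register handlers per
        resource type, extending the application without changing its core."""
        def __init__(self):
            self.handlers = {}

        def install(self, kind, handler):
            """Register a plug-in handler for one resource type."""
            self.handlers[kind] = handler

        def index(self, kind, resource):
            """Index a resource, falling back to a generic description."""
            handler = self.handlers.get(kind, lambda r: {"description": str(r)})
            return handler(resource)

    app = PluginHost()
    # A hypothetical literature-management plug-in for PDF resources.
    app.install("pdf", lambda r: {"description": "article: " + r})
    print(app.index("pdf", "smith2007.pdf"))  # {'description': 'article: smith2007.pdf'}
    print(app.index("txt", "notes.txt"))      # {'description': 'notes.txt'}
    ```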

  8. AMDD: Antimicrobial Drug Database

    OpenAIRE

    Danishuddin, Mohd; Kaushal, Lalima; Hassan Baig, Mohd; Khan, Asad U.

    2012-01-01

    Drug resistance is one of the major concerns for antimicrobial chemotherapy against any particular target. Knowledge of the primary structure of antimicrobial agents and their activities is essential for rational drug design. Thus, we developed a comprehensive database, anti microbial drug database (AMDD), of known synthetic antibacterial and antifungal compounds that were extracted from the available literature and other chemical databases, e.g., PubChem, PubChem BioAssay and ZINC, etc. The ...

  9. Molecular Biology Database List.

    Science.gov (United States)

    Burks, C

    1999-01-01

    Molecular Biology Database List (MBDL) includes brief descriptions and pointers to Web sites for the various databases described in this issue as well as other Web sites presenting data sets relevant to molecular biology. This information is compiled into a list (http://www.oup.co.uk/nar/Volume_27/Issue_01/summary/gkc105_gml.html) which includes links both to source Web sites and to on-line versions of articles describing the databases. PMID:9847130

  10. The need for speed: Latest communications technologies instantaneously send information from oilfield to operator's head office

    Energy Technology Data Exchange (ETDEWEB)

    Anon

    2005-03-01

    The role played by satellite phones, cellular phones, telefax machines, electronic mail, desktop and laptop computers, remote computer networks, high-speed satellite links for Voice-over IP, SCADA (supervisory control and data acquisition) systems, and the Internet in the oil and natural gas industry is discussed. Examples of each technology, and the best technology to use in given situations, are reviewed.

  11. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance.Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  12. LOWELL OBSERVATORY COMETARY DATABASE

    Data.gov (United States)

    National Aeronautics and Space Administration — The database presented here is comprised entirely of observations made utilizing conventional photoelectric photometers and narrowband filters isolating 5 emission...

  13. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  14. The Relational Database Dictionary

    CERN Document Server

    Date, C J

    2006-01-01

    Avoid misunderstandings that can affect the design, programming, and use of database systems. Whether you're using Oracle, DB2, SQL Server, MySQL, or PostgreSQL, The Relational Database Dictionary will prevent confusion about the precise meaning of database-related terms (e.g., attribute, 3NF, one-to-many correspondence, predicate, repeating group, join dependency), helping to ensure the success of your database projects. Carefully reviewed for clarity, accuracy, and completeness, this authoritative and comprehensive quick-reference contains more than 600 terms, many with examples, covering i

  15. Key health indicators database.

    Science.gov (United States)

    Menic, J L

    1990-01-01

    A new database developed by the Canadian Centre for Health Information (CCHI) contains 40 key health indicators and lets users select a range of disaggregations, categories and variables. The database can be accessed through CANSIM, Statistics Canada's electronic database and retrieval system, or through a package for personal computers. This package includes the database on diskettes, as well as software for retrieving and manipulating data and for producing graphics. A data dictionary, a user's guide and tables and graphs that highlight aspects of each indicator are also included.

  16. Intermodal Passenger Connectivity Database -

    Data.gov (United States)

    Department of Transportation — The Intermodal Passenger Connectivity Database (IPCD) is a nationwide data table of passenger transportation terminals, with data on the availability of connections...

  17. IVR EFP Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database contains trip-level reports submitted by vessels participating in Exempted Fishery projects with IVR reporting requirements.

  18. Residency Allocation Database

    Data.gov (United States)

    Department of Veterans Affairs — The Residency Allocation Database is used to determine allocation of funds for residency programs offered by Veterans Affairs Medical Centers (VAMCs). Information...

  19. Smart Location Database - Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  20. Smart Location Database - Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  1. Towards Sensor Database Systems

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Gehrke, Johannes; Seshadri, Praveen

    2001-01-01

    . These systems lack flexibility because data is extracted in a predefined way; also, they do not scale to a large number of devices because large volumes of raw data are transferred regardless of the queries that are submitted. In our new concept of sensor database system, queries dictate which data is extracted...... from the sensors. In this paper, we define the concept of sensor databases mixing stored data represented as relations and sensor data represented as time series. Each long-running query formulated over a sensor database defines a persistent view, which is maintained during a given time interval. We...... also describe the design and implementation of the COUGAR sensor database system....
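    The idea that queries, not devices, dictate which data is extracted can be sketched as a persistent view joining a stored relation (sensor metadata) with a reading stream over a time interval. All names and data below are illustrative, not the COUGAR API:

    ```python
    # Stored relation: sensor metadata (an ordinary table).
    sensors = {1: {"room": "lab"}, 2: {"room": "office"}}

    def readings():
        """Simulated sensor time series of (tick, sensor_id, value) tuples.
        In a sensor database the predicate would be pushed toward the
        devices instead of shipping all raw data to a central node."""
        yield (0, 1, 19.5)
        yield (0, 2, 23.1)
        yield (1, 1, 25.2)
        yield (1, 2, 22.8)

    def persistent_view(stream, room, threshold, until_tick):
        """Maintain the view 'sensors in `room` exceeding `threshold`'
        for the duration of a given time interval."""
        view = []
        for tick, sid, value in stream:
            if tick > until_tick:          # query interval has ended
                break
            if sensors[sid]["room"] == room and value > threshold:
                view.append((tick, sid, value))
        return view

    print(persistent_view(readings(), "lab", 20.0, until_tick=1))
    # [(1, 1, 25.2)]
    ```

    Only tuples satisfying the long-running query enter the view; readings outside the room, threshold, or interval are never materialized.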

  2. Database Publication Practices

    DEFF Research Database (Denmark)

    Bernstein, P.A.; DeWitt, D.; Heuer, A.

    2005-01-01

    There has been a growing interest in improving the publication processes for database research papers. This panel reports on recent changes in those processes and presents an initial cut at historical data for the VLDB Journal and ACM Transactions on Database Systems.

  3. Database Description - RMG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us RMG Database Description General information of database Database name RMG Alternative name ...raki 305-8602, Japan National Institute of Agrobiological Sciences E-mail : Database... classification Nucleotide Sequence Databases Organism Taxonomy Name: Oryza sativa Japonica Group Taxonomy ID: 39947 Database... description This database contains information on the rice mitochondrial genome. You ca...sis results. Features and manner of utilization of database The mitochondrial genome information can be used

  4. Update History of This Database - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database Update History of This Database Date Update contents 2017/02/27... Arabidopsis Phenome Database English archive site is opened. - Arabidopsis Phenome Database (http://jphenom...e.info/?page_id=95) is opened. About This Database Database Description Download License Update History of This Database... Site Policy | Contact Us Update History of This Database - Arabidopsis Phenome Database | LSDB Archive ...

  5. Update History of This Database - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database Update History of This Database Date Update contents 2017/03/13 SKIP Stemcell Database... English archive site is opened. 2013/03/29 SKIP Stemcell Database ( https://www.skip.med.k...eio.ac.jp/SKIPSearch/top?lang=en ) is opened. About This Database Database Description Download License Upda...te History of This Database Site Policy | Contact Us Update History of This Database - SKIP Stemcell Database | LSDB Archive ...

  6. Economic Evaluation of Voice Recognition (VR) for the Clinician's Desktop at the Naval Hospital Roosevelt Roads

    National Research Council Canada - National Science Library

    1997-01-01

    This thesis investigates the current status of VR technology, its use in support of Joint Vision 2010, its use in the healthcare environment, and provides an analysis of the VR Pilot Project at NHRR...

  7. Searching and Indexing Genomic Databases via Kernelization

    Directory of Open Access Journals (Sweden)

    Travis eGagie

    2015-02-01

    Full Text Available The rapid advance of DNA sequencing technologies has yielded databases of thousands of genomes. To search and index these databases effectively, it is important that we take advantage of the similarity between those genomes. Several authors have recently suggested searching or indexing only one reference genome and the parts of the other genomes where they differ. In this paper we survey the twenty-year history of this idea and discuss its relation to kernelization in parameterized complexity.
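
A toy sketch of the idea this abstract describes: keep one reference genome, represent each other genome only by the positions where it differs, and re-examine only the query windows that overlap a difference. All data and function names below are invented for illustration; practical kernelized systems use compressed full-text indexes rather than naive scanning:

```python
# Toy "kernelized" exact search: the reference is searched once, and only
# windows overlapping a genome's differences (here, single-base substitutions)
# are re-checked for that genome.
def find_all(text, query):
    """Positions of all exact occurrences of query in text."""
    hits, i = [], text.find(query)
    while i != -1:
        hits.append(i)
        i = text.find(query, i + 1)
    return hits

def search_genome(reference, diffs, query):
    """Search one genome given as (reference, {pos: substituted_base})."""
    genome_hits = set()
    # 1) Reference hits that avoid every difference are hits in the genome too.
    for i in find_all(reference, query):
        if not any(i <= p < i + len(query) for p in diffs):
            genome_hits.add(i)
    # 2) Only windows overlapping a difference need to be re-checked.
    for p in diffs:
        for i in range(max(0, p - len(query) + 1), p + 1):
            window = list(reference[i:i + len(query)])
            if len(window) < len(query):
                continue
            for q, alt in diffs.items():
                if i <= q < i + len(query):
                    window[q - i] = alt
            if "".join(window) == query:
                genome_hits.add(i)
    return sorted(genome_hits)

reference = "ACGTACGTAC"
diffs = {4: "G"}                                  # genome = "ACGTGCGTAC"
print(search_genome(reference, diffs, "GTGC"))    # [2]
print(search_genome(reference, diffs, "ACGT"))    # [0]
```

Step 2 is the point of the kernelization: the extra work scales with the number of differences, not with the number of full genomes stored.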

  8. A Taxonomy of Metrics for Hosted Databases

    OpenAIRE

    Jordan Shropshire

    2006-01-01

    The past three years have seen exponential growth in the number of organizations that have elected to entrust core information technology functions to application service providers. Of particular interest is the outsourcing of critical systems such as corporate databases. Major banks and financial service firms are contracting with third-party organizations, sometimes overseas, for their database needs. These sophisticated contracts require careful supervision by both parties. Due to the comple...

  9. Searching and Indexing Genomic Databases via Kernelization.

    Science.gov (United States)

    Gagie, Travis; Puglisi, Simon J

    2015-01-01

    The rapid advance of DNA sequencing technologies has yielded databases of thousands of genomes. To search and index these databases effectively, it is important that we take advantage of the similarity between those genomes. Several authors have recently suggested searching or indexing only one reference genome and the parts of the other genomes where they differ. In this paper, we survey the 20-year history of this idea and discuss its relation to kernelization in parameterized complexity.

  10. Evolution of the Configuration Database Design

    International Nuclear Information System (INIS)

    Salnikov, A.

    2006-01-01

    The BABAR experiment at SLAC has been successfully collecting physics data since 1999. One of the major parts of its on-line system is the configuration database, which provides other parts of the system with the configuration data necessary for data taking. Originally the configuration database was implemented in the Objectivity/DB ODBMS. Recently BABAR performed a successful migration of its event store from Objectivity/DB to ROOT, and this prompted a complete phase-out of Objectivity/DB in all other BABAR databases. This required a complete redesign of the configuration database to hide any implementation details and to support multiple storage technologies. In this paper we describe the process of the migration of the configuration database, its new design, implementation strategy and details

  11. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    OpenAIRE

    Mustafa GULMEZ; Edina AJANOVIC; Ismail KARAYUN

    2015-01-01

    Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which opinions among hoteliers are divided between those who think that it is just another fashion trend, unnecessary to be taken into consideration, and those who believe that it helps in performing daily operations more easily, leaving space for ...

  12. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of the administration tasks in this database system. The database design was verified by means of an access application developed for this purpose.

  13. CERN pushes the envelope with Oracle9i database

    CERN Multimedia

    2001-01-01

    Oracle Corp. today announced that unique capabilities in Oracle9i Database are helping CERN, the European Organization for Nuclear Research in Geneva. The LHC project will generate petabytes of data - an amount well beyond the capability of any relational database technology today. CERN is developing a new route in data management and analysis using Oracle9i Real Application Cluster technology.

  14. Some Considerations about Modern Database Machines

    Directory of Open Access Journals (Sweden)

    Manole VELICANU

    2010-01-01

    Full Text Available Optimizing the two computing resources of any computing system – time and space – has always been one of the priority objectives of any database. A current and effective solution in this respect is the database machine. Optimizing computer applications by means of database machines has been a steady preoccupation of researchers since the late seventies. Several information technologies have revolutionized the present information framework. Out of these, those which have brought a major contribution to the optimization of databases are: efficient handling of large volumes of data (Data Warehouse, Data Mining, OLAP – On-Line Analytical Processing), the improvement of DBMS – Database Management Systems – facilities through the integration of new technologies, and the dramatic increase in computing power and the efficient use of it (computer networks, massive parallel computing, Grid Computing and so on). All these information technologies, and others, have favored the resumption of research on database machines and the achievement in the last few years of some very good practical results, as far as the optimization of computing resources is concerned.

  15. DEIMOS – an Open Source Image Database

    Directory of Open Access Journals (Sweden)

    M. Blazek

    2011-12-01

    Full Text Available The DEIMOS (DatabasE of Images: Open Source) is created as an open-source database of images and videos for testing, verification and comparison of various image and/or video processing techniques such as enhancement, compression and reconstruction. The main advantage of DEIMOS is its orientation to various application fields – multimedia, television, security, assistive technology, biomedicine, astronomy etc. DEIMOS is/will be created gradually, step by step, based upon the contributions of team members. The paper describes the basic parameters of the DEIMOS database, including application examples.

  16. Solutions for medical databases optimal exploitation.

    Science.gov (United States)

    Branescu, I; Purcarea, V L; Dobrescu, R

    2014-03-15

    The paper discusses methods to apply OLAP techniques to multidimensional databases that leverage an existing performance-enhancing technique, known as practical pre-aggregation, by making this technique relevant to a much wider range of medical applications, as logistic support to data warehousing techniques. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies into current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
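
The practical pre-aggregation the abstract leverages can be illustrated with a minimal sketch (sample data and names invented; real systems materialize rollups inside the DBMS): daily facts are rolled up to month level once, and coarser queries read the small pre-aggregated table instead of rescanning the facts.

```python
# Pre-aggregation along a date hierarchy (day -> month -> year): month totals
# are computed once, then month- and year-level queries never touch the facts.
from collections import defaultdict

facts = [  # (date "YYYY-MM-DD", admissions) -- invented sample data
    ("2013-01-02", 5), ("2013-01-15", 7), ("2013-02-01", 3), ("2014-01-09", 4),
]

month_totals = defaultdict(int)          # pre-aggregated level: month
for date, n in facts:
    month_totals[date[:7]] += n          # key "YYYY-MM"

def admissions(period):
    """Total admissions for a month ("YYYY-MM") or a year ("YYYY")."""
    if len(period) == 7:                 # month query: direct lookup
        return month_totals[period]
    # year query: aggregate the few pre-aggregated month rows, not the facts
    return sum(n for m, n in month_totals.items() if m.startswith(period))

print(admissions("2013-01"))  # 12
print(admissions("2013"))     # 15
```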

  17. Enabling Semantic Queries Against the Spatial Database

    Directory of Open Access Journals (Sweden)

    PENG, X.

    2012-02-01

    Full Text Available The spatial database based upon the object-relational database management system (ORDBMS) has the merits of a clear data model, good operability and high query efficiency. That is why it has been widely used in spatial data organization and management. However, it cannot express the semantic relationships among geospatial objects, making the query results unlikely to meet the user's requirements well. Therefore, this paper represents an attempt to combine Semantic Web technology with the spatial database so as to make up for the traditional database's disadvantages. In this way, on the one hand, users can take advantage of the ORDBMS to store and manage spatial data; on the other hand, if the spatial database is released in the form of the Semantic Web, users can describe a query more concisely with a cognitive pattern similar to that of daily life. As a consequence, this methodology makes the benefits of both the Semantic Web and the object-relational database (ORDB) available. The paper systematically discusses the semantically enriched spatial database's architecture, key technologies and implementation. Subsequently, we demonstrate the function of spatial semantic queries via a practical prototype system. The query results indicate that the method used in this study is feasible.
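
As a rough illustration of the kind of semantic query layer the abstract describes (not the paper's implementation; all triples and names are invented), spatial objects can be exported as subject-predicate-object triples and queried by pattern, which exposes relationships a plain object-relational schema would not:

```python
# Toy triple store: spatial objects as (subject, predicate, object) triples,
# queried by pattern with None as a wildcard -- a simplified stand-in for the
# RDF/SPARQL layer placed over the spatial database.
triples = {
    ("River_A", "type", "River"),
    ("Bridge_B", "type", "Bridge"),
    ("Bridge_B", "crosses", "River_A"),
    ("Road_C", "connectsTo", "Bridge_B"),
}

def match(s=None, p=None, o=None):
    """Return all triples matching the pattern; None matches anything."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# "What crosses River_A?" -- a semantic relationship query
print(sorted(match(p="crosses", o="River_A")))   # [('Bridge_B', 'crosses', 'River_A')]
```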

  18. HIV Structural Database

    Science.gov (United States)

    SRD 102 HIV Structural Database (Web, free access)   The HIV Protease Structural Database is an archive of experimentally determined 3-D structures of Human Immunodeficiency Virus 1 (HIV-1), Human Immunodeficiency Virus 2 (HIV-2) and Simian Immunodeficiency Virus (SIV) Proteases and their complexes with inhibitors or products of substrate cleavage.

  19. Structural Ceramics Database

    Science.gov (United States)

    SRD 30 NIST Structural Ceramics Database (Web, free access)   The NIST Structural Ceramics Database (WebSCD) provides evaluated materials property data for a wide range of advanced ceramics known variously as structural ceramics, engineering ceramics, and fine ceramics.

  20. The international spinach database

    NARCIS (Netherlands)

    Treuren, van R.; Menting, F.B.J.

    2007-01-01

    The database concentrates on passport data of spinach of germplasm collections worldwide. All available passport data of accessions included in the International Spinach Database are downloadable as zipped Excel file. This zip file also contains the decoding tables, except for the FAO institutes

  1. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1992-12-01

    This second edition of the Directory of IAEA Databases has been prepared within the Division of Scientific and Technical Information (NESI). Its main objective is to describe the computerized information sources available to staff members. This directory contains all databases produced at the IAEA, including databases stored on the mainframe, LANs and PCs. All IAEA Division Directors have been requested to register the existence of their databases with NESI. For the second edition, database owners were requested to review the existing entries for their databases and answer four additional questions. The four additional questions concerned the type of database (e.g. Bibliographic, Text, Statistical etc.), the category of database (e.g. Administrative, Nuclear Data etc.), the available documentation and the type of media used for distribution. In the individual entries on the following pages the answers to the first two questions (type and category) are always listed, but the answers to the second two questions (documentation and media) are only listed when information has been made available

  2. Atomic Spectra Database (ASD)

    Science.gov (United States)

    SRD 78 NIST Atomic Spectra Database (ASD) (Web, free access)   This database provides access and search capability for NIST critically evaluated data on atomic energy levels, wavelengths, and transition probabilities that are reasonably up-to-date. The NIST Atomic Spectroscopy Data Center has carried out these critical compilations.

  3. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

    a Dialogue inspired database with documentation, network (individual and institutional profiles) and current news, paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996

  4. Odense Pharmacoepidemiological Database (OPED)

    DEFF Research Database (Denmark)

    Hallas, Jesper; Poulsen, Maja Hellfritzsch; Hansen, Morten Rix

    2017-01-01

    The Odense University Pharmacoepidemiological Database (OPED) is a prescription database established in 1990 by the University of Southern Denmark, covering reimbursed prescriptions from the county of Funen in Denmark and the region of Southern Denmark (1.2 million inhabitants). It is still active...

  5. Consumer Product Category Database

    Science.gov (United States)

    The Chemical and Product Categories database (CPCat) catalogs the use of over 40,000 chemicals and their presence in different consumer products. The chemical use information is compiled from multiple sources while product information is gathered from publicly available Material Safety Data Sheets (MSDS). EPA researchers are evaluating the possibility of expanding the database with additional product and use information.

  6. NoSQL database scaling

    OpenAIRE

    Žardin, Norbert

    2017-01-01

    NoSQL database scaling is a decision where system resources or financial expenses are traded for database performance or other benefits. By scaling a database, database performance and resource usage might increase or decrease, and such changes might have a negative impact on an application that uses the database. In this work it is analyzed how database scaling affects database resource usage and performance. As a result, calculations are acquired, using which database scaling types and differe...

  7. The LHCb configuration database

    CERN Document Server

    Abadie, L; Van Herwijnen, Eric; Jacobsson, R; Jost, B; Neufeld, N

    2005-01-01

    The aim of the LHCb configuration database is to store information about all the controllable devices of the detector. The experiment's control system (which uses PVSS) will configure, start up and monitor the detector from the information in the configuration database. The database will contain devices with their properties, connectivity and hierarchy. The ability to store and rapidly retrieve huge amounts of data, and the navigability between devices, are important requirements. We have collected use cases to ensure the completeness of the design. Using the entity-relationship modelling technique we describe the use cases as classes with attributes and links. We designed the schema for the tables using relational diagrams. This methodology has been applied to the TFC (switches) and DAQ system. Other parts of the detector will follow later. The database has been implemented using Oracle to benefit from central CERN database support. The project also foresees the creation of tools to populate, maintain, and co...

  8. Deposition of PEDOT: PSS Nanoparticles as a Conductive Microlayer Anode in OLEDs Device by Desktop Inkjet Printer

    Directory of Open Access Journals (Sweden)

    S. Ummartyotin

    2011-01-01

    Full Text Available A simple microfabrication technique for delivering macromolecules and patterning microelectrode arrays using a desktop inkjet printer is described. An aqueous solution of nanoparticles of poly(3,4-ethylenedioxythiophene) (PEDOT) doped with polystyrene sulfonic acid (PSS) was prepared, while its particle size, surface tension, and viscosity were adjusted to be suitable for deposition on a flexible cellulose nanocomposite substrate via inkjet printer. A statistical average PEDOT:PSS particle size of 100 nm was observed. The microthickness, surface morphology, and electrical conductivity of the printed substrate were then characterized by profilometer, atomic force microscope (AFM), and four-point probe electrical measurement, respectively. The inkjet deposition of PEDOT:PSS was successfully carried out, whilst retaining its transparency. A highly smooth surface (roughness ~23–44 nm) was achieved.

  9. Database Description - DGBY | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us DGBY Database Description General information of database Database name DGBY Alternative name ... TEL: +81-29-838-8066 E-mail: Database classification Microarray Data and other Gene Expression Databases Organism Taxonomy Name: Saccharomyces cerevisiae Taxonomy ID: 4932 Database description ... (so-called phenomics). We uploaded these data on this website, which is designated DGBY (Database for Gene expression and function of Baker's yeast). Features and manner of utilization of database This database

  10. The simulation of the half-dry stroke based on the force feedback technology

    Science.gov (United States)

    Guo, Chao; Hou, Zeng-xuan; Zheng, Shuan-zhu; Yang, Guang-qing

    2017-02-01

    A novel stroke-simulation method for the half-dry style of Chinese calligraphy, based on force feedback technology, is proposed for virtual painting. Firstly, according to the deformation of the brush when force is exerted on it, the brush footprint between the brush and paper is calculated. The complete brush stroke is obtained by superimposing brush footprints along the painting direction, and the dynamic painting of the brush stroke is implemented. Then, we establish half-dry texture databases and propose the concept of a half-dry value by researching the main factors that affect the appearance of the half-dry stroke. In virtual painting, the half-dry texture is mapped onto the stroke in real time according to the half-dry value and painting technique. A technique of texture blending based on the KM model is applied to avoid seams while texture mapping. The proposed method has been successfully applied to a virtual painting system based on force feedback technology. In this system, users can paint in real time with a Phantom Desktop haptic device, which effectively enhances the sense of realism for users.
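
The KM (Kubelka-Munk) model cited here for seam-free blending mixes colors in absorption/scattering space rather than averaging reflectances directly. A minimal single-channel sketch follows (weights and values are illustrative, not the paper's actual blending code):

```python
# Kubelka-Munk pigment mixing: convert reflectance to the K/S
# (absorption/scattering) ratio, average there, convert back.
import math

def km(r):
    """Reflectance (0..1, exclusive) -> K/S ratio."""
    return (1.0 - r) ** 2 / (2.0 * r)

def km_inv(ks):
    """K/S ratio -> reflectance."""
    return 1.0 + ks - math.sqrt(ks * ks + 2.0 * ks)

def blend(r1, r2, w):
    """Blend two reflectances with weight w on r1, in K/S space."""
    return km_inv(w * km(r1) + (1.0 - w) * km(r2))

# KM blending is darker than the naive linear average, matching how
# pigments actually combine.
print(blend(0.8, 0.2, 0.5) < 0.5)   # True
```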

  11. The use of web internet technologies to distribute medical images

    International Nuclear Information System (INIS)

    Deller, A.L.; Cheal, D.; Field, J.

    1999-01-01

    Full text: In the past, internet browsers were considered ineffective for image distribution. Today we have the technology to use internet standards for picture archive and communication systems (PACS) and teleradiology effectively. Advanced wavelet compression and state-of-the-art Java software allow us to distribute images on normal computer hardware. The use of vendor- and database-neutral software and industry-standard hardware has many advantages. This standards-based approach avoids the costly rapid obsolescence of proprietary PACS and is cheaper to purchase and maintain. Images can be distributed around a hospital site, as well as outside the campus, quickly and inexpensively. It also allows integration between the Hospital Information System (HIS) and the Radiology Information System (RIS). Being able to utilize standard internet technologies and computer hardware for PACS is a cost-effective alternative. A system based on this technology can be used for image distribution, archiving, teleradiology and RIS integration. This can be done without expensive specialized imaging workstations and telecommunication systems. Web distribution of images allows you to send images to multiple places concurrently. A study can be viewed within your Medical Imaging Department, as well as in the ward and on the desktop of referring clinicians - with a report. As long as there is a computer with an internet access account, high-quality images can be at your disposal 24 h a day. The importance of medical images for patient management makes them a valuable component of the patient's medical record. Therefore, an efficient system for displaying and distributing images can improve patient management and make your workplace more effective

  12. Effects of Desktop Virtual Reality Environment Training on State Anxiety and Vocational Identity Scores among Persons with Disabilities during Job Placement

    Science.gov (United States)

    Washington, Andre Lamont

    2013-01-01

    This study examined how desktop virtual reality environment training (DVRET) affected state anxiety and vocational identity of vocational rehabilitation services consumers during job placement/job readiness activities. It utilized a quantitative research model with a quasi-experimental pretest-posttest design plus some qualitative descriptive…

  13. Mediational Effects of Desktop-Videoconferencing Telecollaborative Exchanges on the Intercultural Communicative Competence of Students of French as a Foreign Language

    Science.gov (United States)

    Martin, Veronique

    2013-01-01

    Since the early 2000s, foreign language practitioners and researchers have shown an increasing interest in exploring the affordances of multimodal telecollaborative environments for the linguistic and intercultural development of their students. Due in part to their inherent complexity, one-on-one desktop-videoconferencing contexts have not been…

  14. Development, Implementation, and Analysis of Desktop-Scale Model Industrial Equipment and a Critical Thinking Rubric for Use in Chemical Engineering Education

    Science.gov (United States)

    Golter, Paul B.

    2011-01-01

    In order to address some of the challenges facing engineering education, namely the demand that students be better prepared to practice professional as well as technical skills, we have developed an intervention consisting of equipment, assessments and a novel pedagogy. The equipment consists of desktop-scale replicas of common industrial…

  15. A Database Practicum for Teaching Database Administration and Software Development at Regis University

    Science.gov (United States)

    Mason, Robert T.

    2013-01-01

    This research paper compares a database practicum at the Regis University College for Professional Studies (CPS) with technology oriented practicums at other universities. Successful andragogy for technology courses can motivate students to develop a genuine interest in the subject, share their knowledge with peers and can inspire students to…

  16. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  17. Head-mounted display versus desktop for 3D navigation in virtual reality : A user study

    NARCIS (Netherlands)

    Sousa Santos, B.; Dias, P.; Pimentel, A.; Baggerman, J.W.; Ferreira, C.; Silva, S.; Madeira, J.

    2008-01-01

    Virtual Reality (VR) has been constantly evolving since its early days, and is now a fundamental technology in different application areas. User evaluation is a crucial step in the design and development of VR systems that do respond to users’ needs, as well as for identifying applications that

  18. Campus Computing, 1995: The Sixth National Survey of Desktop Computing in Higher Education.

    Science.gov (United States)

    Green, Kenneth C.

    This monograph reports findings of a Fall, 1995 survey of computing officials at approximately 650 two- and four-year colleges and universities across the United States concerning increasing use of technology on college campuses. Major findings include: the percentage of college courses using e-mail and multimedia resources more than doubled; the…

  19. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as provide a system that simplifies the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  20. JICST Factual Database(2)

    Science.gov (United States)

    Araki, Keisuke

    A computer program that builds atom-bond connection tables from nomenclature has been developed. Chemical substances are input with their nomenclature and a variety of trivial names or experimental code numbers. The chemical structures in the database are stored stereospecifically and can be searched and displayed according to stereochemistry. Source data are from the laws and regulations of Japan, RTECS of the US, and so on. The database plays a central role within the integrated fact database service of JICST and makes interrelational retrieval possible.

  1. The Role of Wireless Computing Technology in the Design of Schools.

    Science.gov (United States)

    Nair, Prakash

    This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classroom. It then explains the features of a wireless local area network (WLAN) and…

  2. Making Choices in the Virtual World: The New Model at United Technologies Information Network.

    Science.gov (United States)

    Gulliford, Bradley

    1998-01-01

    Describes changes in services of the United Technologies Corporation Information Network from a traditional library system to a virtual system of World Wide Web sites, a document-delivery unit, telephone and e-mail reference, and desktop technical support to provide remote access. Staff time, security, and licensing issues are addressed.…

  3. Database Description - SAHG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SAHG Database Description General information of database Database name SAHG Alternative name ... Contact address Chie Motono Tel : +81-3-3599-8067 E-mail : Database classification Structure Databases - Protein structure Human and other Vertebrate Genomes - Human ORFs Protein sequence databases - Protein properties Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description ... 42,577 domain-structure models in ~24,900 unique human protein sequences from the RefSeq database. Features a

  4. Database Description - PLACE | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PLACE Database Description General information of database Database name PLACE Alternative name A Database ... Kannondai, Tsukuba, Ibaraki 305-8602, Japan National Institute of Agrobiological Sciences E-mail : Database classification Plant databases Organism Taxonomy Name: Tracheophyta Taxonomy ID: 58023 Database description PLACE is a database of motifs found in plant cis-acting regulatory DNA elements ... that have been identified in these motifs in other genes or in other plant species in later publications. The database

  5. Marine Jurisdictions Database

    National Research Council Canada - National Science Library

    Goldsmith, Roger

    1998-01-01

    The purpose of this project was to take the data gathered for the Maritime Claims chart and create a Maritime Jurisdictions digital database suitable for use with oceanographic mission planning objectives...

  6. Medicare Coverage Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Coverage Database (MCD) contains all National Coverage Determinations (NCDs) and Local Coverage Determinations (LCDs), local articles, and proposed NCD...

  7. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

    a Dialogue inspired database with documentation, network (individual and institutional profiles) and current news , paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996...

  8. The Danish Melanoma Database

    DEFF Research Database (Denmark)

    Hölmich, Lisbet Rosenkrantz; Klausen, Siri; Spaun, Eva

    2016-01-01

    AIM OF DATABASE: The aim of the database is to monitor and improve the treatment and survival of melanoma patients. STUDY POPULATION: All Danish patients with cutaneous melanoma and in situ melanomas must be registered in the Danish Melanoma Database (DMD). In 2014, 2,525 patients with invasive ... tumor-node-metastasis stage. Information about the date of diagnosis, treatment, type of surgery, including safety margins, results of lymphoscintigraphy in patients for whom this was indicated (tumors > T1a), results of sentinel node biopsy, pathological evaluation hereof, and follow-up information, including recurrence ..., nature, and treatment hereof is registered. In case of death, the cause and date are included. Currently, all data are entered manually; however, data catchment from the existing registries is planned to be included shortly. DESCRIPTIVE DATA: The DMD is an old research database, but new as a clinical...

  9. Danish Urogynaecological Database

    DEFF Research Database (Denmark)

    Hansen, Ulla Darling; Gradel, Kim Oren; Larsen, Michael Due

    2016-01-01

    The Danish Urogynaecological Database is established in order to ensure high quality of treatment for patients undergoing urogynecological surgery. The database contains details of all women in Denmark undergoing incontinence surgery or pelvic organ prolapse surgery, amounting to ~5,200 procedures per year. The variables are collected along the course of treatment of the patient, from referral to a postoperative control. Main variables are prior obstetrical and gynecological history, symptoms, symptom-related quality of life, objective urogynecological findings, type of operation, complications if relevant, implants used if relevant, and 3-6-month postoperative recording of symptoms, if any. A set of clinical quality indicators is maintained by the steering committee for the database and is published in an annual report which also contains extensive descriptive statistics. The database...

  10. Danish Gynecological Cancer Database

    DEFF Research Database (Denmark)

    Sørensen, Sarah Mejer; Bjørn, Signe Frahm; Jochumsen, Kirsten Marie

    2016-01-01

    AIM OF DATABASE: The Danish Gynecological Cancer Database (DGCD) is a nationwide clinical cancer database and its aim is to monitor the treatment quality of Danish gynecological cancer patients, and to generate data for scientific purposes. DGCD also records detailed data on the diagnostic measures...... for gynecological cancer. STUDY POPULATION: DGCD was initiated January 1, 2005, and includes all patients treated at Danish hospitals for cancer of the ovaries, peritoneum, fallopian tubes, cervix, vulva, vagina, and uterus, including rare histological types. MAIN VARIABLES: DGCD data are organized within separate...... is the registration of oncological treatment data, which is incomplete for a large number of patients. CONCLUSION: The very complete collection of available data from more registries form one of the unique strengths of DGCD compared to many other clinical databases, and provides unique possibilities for validation...

  11. Reach Address Database (RAD)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Reach Address Database (RAD) stores the reach address of each Water Program feature that has been linked to the underlying surface water features (streams,...

  12. Atomicity for XML Databases

    Science.gov (United States)

    Biswas, Debmalya; Jiwane, Ashwin; Genest, Blaise

    With more and more data stored in XML databases, there is a need to provide the same level of failure resilience and robustness that users have come to expect from relational database systems. In this work, we discuss strategies for providing the transactional property of atomicity in XML databases. The main contribution of this paper is a novel approach for performing updates-in-place on XML databases, with the undo statements stored in the same high-level language as the update statements. Finally, we give experimental results comparing the performance/storage trade-off of the updates-in-place strategy (based on our undo proposal) against the deferred-updates strategy for providing atomicity.
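    The updates-in-place idea described in this abstract can be illustrated with a minimal sketch: each in-place update records its inverse as another high-level update statement in an undo log, and aborting a transaction replays that log in reverse. This is an assumption-laden illustration, not the paper's actual implementation; the names `XmlTransaction` and `set_text` are hypothetical.

    ```python
    # Hedged sketch of updates-in-place with an undo log over an XML tree.
    # Each update appends its inverse (expressed as the same kind of
    # high-level statement) before mutating the tree in place.
    import xml.etree.ElementTree as ET

    class XmlTransaction:
        def __init__(self, root):
            self.root = root
            self.undo_log = []          # inverse operations, newest last

        def set_text(self, path, new_text):
            node = self.root.find(path)
            # Record the inverse as another high-level update statement.
            self.undo_log.append((path, node.text))
            node.text = new_text        # update-in-place

        def abort(self):
            # Replay undo statements in reverse to restore the old state.
            for path, old_text in reversed(self.undo_log):
                self.root.find(path).text = old_text
            self.undo_log.clear()

        def commit(self):
            self.undo_log.clear()       # changes are already in place

    root = ET.fromstring("<db><item>old</item></db>")
    tx = XmlTransaction(root)
    tx.set_text("item", "new")
    tx.abort()
    print(root.find("item").text)  # -> old
    ```

    The trade-off the paper measures follows from this structure: updates-in-place pays for the undo log on every update but makes commit free, whereas deferred updates make abort free but delay all work to commit.
    
    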

  13. Mouse Phenome Database (MPD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mouse Phenome Database (MPD) has characterizations of hundreds of strains of laboratory mice to facilitate translational discoveries and to assist in selection...

  14. Ganymede Crater Database

    Data.gov (United States)

    National Aeronautics and Space Administration — This web page leads to a database of images and information about the 150 major impact craters on Ganymede and is updated semi-regularly based on continuing analysis...

  15. Dissolution Methods Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — For a drug product that does not have a dissolution test method in the United States Pharmacopeia (USP), the FDA Dissolution Methods Database provides information on...

  16. Toxicity Reference Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Toxicity Reference Database (ToxRefDB) contains approximately 30 years and $2 billion worth of animal studies. ToxRefDB allows scientists and the interested...

  17. Records Management Database

    Data.gov (United States)

    US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata used to access and retrieve the information...

  18. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1994-05-27

    The Refrigerant Database consolidates and facilitates access to information to assist industry in developing equipment using alternative refrigerants. The underlying purpose is to accelerate the phase-out of chemical compounds of environmental concern.

  19. Database for West Africa

    African Journals Online (AJOL)

    NCRS (USDA, English; morphology and analytical data); ISIS (ISRIC, English) … problems. The compilation of the database cannot be carried out without adequate funding. It also needs strong and firm management. It is important that all participants ...

  20. Venus Crater Database

    Data.gov (United States)

    National Aeronautics and Space Administration — This web page leads to a database of images and information about the 900 or so impact craters on the surface of Venus by diameter, latitude, and name.