WorldWideScience

Sample records for technology database desktop

  1. Desktop Technology for Newspapers: Use of the Computer Tool.

    Science.gov (United States)

    Wilson, Howard Alan

    This work considers desktop publishing technology as a means of paginating newspapers electronically, tracing the technology's development from its beginnings in the mid-1980s into the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers. It reports on a Pennsylvania weekly…

  2. Desktop Publishing: Changing Technology, Changing Occupations.

    Science.gov (United States)

    Stanton, Michael

    1991-01-01

    Describes desktop publishing (DTP) and its place in corporations. Lists job titles of those working in desktop publishing and describes DTP as it is taught at secondary and postsecondary levels and by private trainers. (JOW)

  3. MICA: desktop software for comprehensive searching of DNA databases

    Directory of Open Access Journals (Sweden)

    Glick Benjamin S

    2006-10-01

    Background: Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results: MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion: MICA is suitable as a search engine for desktop DNA analysis software.
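    The k-mer indexing idea behind such a search can be illustrated with a minimal sketch: map every k-mer to the positions where it occurs, then verify candidate hits against the full query. This is a toy illustration of the general technique only; MICA's actual compact-array layout and degenerate-base handling are not reproduced here.

```python
# Toy k-mer index for exact-match DNA search -- illustrates the
# general technique, not MICA's compact-array implementation.
from collections import defaultdict

def build_index(seq, k=4):
    """Map every k-mer in `seq` to the list of positions where it occurs."""
    index = defaultdict(list)
    for i in range(len(seq) - k + 1):
        index[seq[i:i + k]].append(i)
    return index

def find_exact(seq, index, query, k=4):
    """Return all start positions of exact matches for `query`.

    Candidate positions come from the index entry for the query's
    first k-mer; each candidate is then verified against the full query.
    """
    if len(query) < k:
        raise ValueError("query shorter than k")
    return [i for i in index.get(query[:k], [])
            if seq[i:i + len(query)] == query]

genome = "ACGTACGTTAGCACGT"
idx = build_index(genome, k=4)
print(find_exact(genome, idx, "ACGT", k=4))  # -> [0, 4, 12]
```

    Because only the positions for one k-mer need to be examined per query, a search touches a small fraction of the index, which is the property the abstract highlights for memory-limited desktop machines.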

  4. A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories

    Science.gov (United States)

    Brown, Christa L.

    National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.

  5. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  6. Potential Pedagogical Benefits and Limitations of Multimedia Integrated Desktop Video Conferencing Technology for Synchronous Learning

    NARCIS (Netherlands)

    drs Maurice Schols

    2009-01-01

    As multimedia gradually becomes more and more an integrated part of video conferencing systems, the use of multimedia integrated desktop video conferencing technology (MIDVCT) will open up new educational possibilities for synchronous learning. However, the possibilities and limitations of this

  7. From Server to Desktop: Capital and Institutional Planning for Client/Server Technology.

    Science.gov (United States)

    Mullig, Richard M.; Frey, Keith W.

    1994-01-01

    Beginning with a request for an enhanced system for decision/strategic planning support, the University of Chicago's biological sciences division has developed a range of administrative client/server tools, instituted a capital replacement plan for desktop technology, and created a planning and staffing approach enabling rapid introduction of new…

  8. Relational Database Technology: An Overview.

    Science.gov (United States)

    Melander, Nicole

    1987-01-01

    Describes the development of relational database technology as it applies to educational settings. Discusses some of the new tools and models being implemented in an effort to provide educators with technologically advanced ways of answering questions about education programs and data. (TW)

  9. Database of Information technology resources

    OpenAIRE

    Barzda, Erlandas

    2005-01-01

    The subject of this master's work is an internet information resource database. The work also addresses the problems of old information systems that no longer meet contemporary requirements. The aim is to create an internet information system based on object-oriented technologies and tailored to computer users' needs. The internet information database system helps computer administrators to get all the needed information about computer network elements and to easily register all changes int...

  10. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database.

    Science.gov (United States)

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing tool. We have experimented with the proposed benchmark using the main existing approaches for signature verification: feature-based and time-function-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis of signatures acquired using the finger as the writing tool. This public e-BioSign database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions.

  11. Nuclear technology databases and information network systems

    International Nuclear Information System (INIS)

    Iwata, Shuichi; Kikuchi, Yasuyuki; Minakuchi, Satoshi

    1993-01-01

    This paper describes the databases related to nuclear (science) technology, and information network. Following contents are collected in this paper: the database developed by JAERI, ENERGY NET, ATOM NET, NUCLEN nuclear information database, INIS, NUclear Code Information Service (NUCLIS), Social Application of Nuclear Technology Accumulation project (SANTA), Nuclear Information Database/Communication System (NICS), reactor materials database, radiation effects database, NucNet European nuclear information database, reactor dismantling database. (J.P.N.)

  12. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on the logical database, interrogation, and phy

  13. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the fields of Semantic Web, knowledge representation, desktop applications and visualization are presented that allow a user's existing data to be reinterpreted and reused. The combination of Semantic Web and desktop computers brings particular advantages, a paradigm known as the Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, and can equally be used in web applications.

  14. Desktop Genetics

    OpenAIRE

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-01-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learni...

  15. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    As a high-level development environment, the Java technologies offer support for the development of distributed, platform-independent applications, providing a robust set of methods for accessing databases, used to create software components on the server side as well as on the client side. Analyzing the evolution of Java tools for data access, we notice that these tools evolved from simple methods permitting queries, insertion, update and deletion of data to advanced implementations such as distributed transactions, cursors and batch files. Client-server architectures allow, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) instructions and the manipulation of the results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to allow SQL queries to be issued against any DBMS (Database Management System). The native JDBC driver and the ODBC (Open Database Connectivity)-JDBC bridge, as well as the classes and interfaces of the JDBC API, will be described. The four steps needed to build a JDBC-driven application are presented briefly, emphasizing the way each step has to be accomplished and the expected results. In each step there are evaluations of the characteristics of the database systems and the way the JDBC programming interface adapts to each one. The data types provided by the SQL2 and SQL3 standards are analyzed by comparison with the Java data types, emphasizing the discrepancies between them and the SQL types, but also the methods that allow conversion between different types of data through the methods of the ResultSet object. Next, starting from the role of metadata and studying the Java programming interfaces that allow querying of result sets, we will describe the advanced features of data mining with JDBC. As an alternative to result sets, the RowSets add new functionalities that
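    The connect, statement, execute, and result-set sequence the abstract walks through for JDBC can be sketched compactly. The sketch below uses Python's DB-API with sqlite3 as a stand-in for the Java-side JDBC classes (Connection, Statement, ResultSet); the schema and data are invented for illustration.

```python
# The four-step pattern described for JDBC-driven applications,
# shown with Python's DB-API (sqlite3) rather than Java's JDBC.
import sqlite3

conn = sqlite3.connect(":memory:")   # step 1: open a connection
cur = conn.cursor()                  # step 2: create a statement/cursor
cur.execute("CREATE TABLE t (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
cur.execute("SELECT name FROM t WHERE id = ?", (2,))  # step 3: execute SQL
rows = cur.fetchall()                # step 4: walk the result set
print(rows)                          # -> [('b',)]
conn.close()
```

    The parameter placeholders play the same role as a JDBC PreparedStatement's `?` markers: the driver, not string concatenation, binds the values.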

  16. Desktop Genetics.

    Science.gov (United States)

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-11-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  17. Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.

    Science.gov (United States)

    Crawford, Walt

    1987-01-01

    Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…

  18. Promises and Realities of Desktop Publishing.

    Science.gov (United States)

    Thompson, Patricia A.; Craig, Robert L.

    1991-01-01

    Examines the underlying assumptions of the rhetoric of desktop publishing promoters. Suggests four criteria to help educators provide insights into issues and challenges concerning desktop publishing technology that design students will face on the job. (MG)

  19. Desktop Publishing.

    Science.gov (United States)

    Stanley, Milt

    1986-01-01

    Defines desktop publishing, describes microcomputer developments and software tools that make it possible, and discusses its use as an instructional tool to improve writing skills. Reasons why students' work should be published, examples of what to publish, and types of software and hardware to facilitate publishing are reviewed. (MBR)

  20. "Mr. Database" : Jim Gray and the History of Database Technologies.

    Science.gov (United States)

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g., leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  1. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  2. XML technology planning database : lessons learned

    Science.gov (United States)

    Some, Raphael R.; Neff, Jon M.

    2005-01-01

    A hierarchical Extensible Markup Language (XML) database called XCALIBR (XML Capability Analysis LIBRary) has been developed by the New Millennium Program to assist in technology return on investment (ROI) analysis and technology portfolio optimization. The database contains mission requirements and technology capabilities, which are related by use of an XML dictionary. The XML dictionary codifies a standardized taxonomy for space missions, systems, subsystems and technologies. In addition to being used for ROI analysis, the database is being examined for use in project planning, tracking and documentation. During the past year, the database has moved from development into alpha testing. This paper describes the lessons learned during construction and testing of the prototype database and the motivation for moving from an XML taxonomy to a standard XML-based ontology.

  3. Evolution of Database Replication Technologies for WLCG

    CERN Document Server

    Baranowski, Zbigniew; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience with database replication technologies used at WLCG and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvements in this area in the recent past has been the introduction of Oracle GoldenGate as a replacement for Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 database administrators, including the experience from running Oracle GoldenGate in production. Moreover, we report on another key technology in this area: Oracle Active Data Guard, which has been adopted in several of the mission-critical use cases for database replication between online and offline databases for the LHC experiments.

  4. Training Database Technology in DBMS MS Access

    Directory of Open Access Journals (Sweden)

    Nataliya Evgenievna Surkova

    2015-05-01

    The article describes the methodological issues of teaching relational database technology and relational database management systems. DBMS Microsoft Access serves as the primer for learning DBMSs. This methodology allows students to develop general cultural competences, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. Professional competences must also be formed, such as the ability to collect, analyze and process the data necessary for solving professional tasks, and the ability to use modern technology and information technology to solve analytical and research tasks.

  5. Training Database Technology in DBMS MS Access

    OpenAIRE

    Nataliya Evgenievna Surkova

    2015-01-01

    The article describes the methodological issues of teaching relational database technology and relational database management systems. DBMS Microsoft Access serves as the primer for learning DBMSs. This methodology allows students to develop general cultural competences, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. Professional competences must also be formed, such as the ability to coll...

  6. Evolution of Database Replication Technologies for WLCG

    OpenAIRE

    Baranowski, Zbigniew; Pardavila, Lorena Lobato; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience with database replication technologies used at WLCG and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvements in this area in the recent past has been the introduction of Oracle GoldenGate as a replacement for Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 databas...

  7. WEB-BASED DATABASE ON RENEWAL TECHNOLOGIES ...

    Science.gov (United States)

    As U.S. utilities continue to shore up their aging infrastructure, renewal needs now represent over 43% of annual expenditures compared to new construction for drinking water distribution and wastewater collection systems (Underground Construction [UC], 2016). An increased understanding of renewal options will ultimately assist drinking water utilities in reducing water loss and help wastewater utilities to address infiltration and inflow issues in a cost-effective manner. It will also help to extend the service lives of both drinking water and wastewater mains. This research effort involved collecting case studies on the use of various trenchless pipeline renewal methods and providing the information in an online searchable database. The overall objective was to further support technology transfer and information sharing regarding emerging and innovative renewal technologies for water and wastewater mains. The result of this research is a Web-based, searchable database that utility personnel can use to obtain technology performance and cost data, as well as case study references. The renewal case studies include: technologies used; the conditions under which the technology was implemented; costs; lessons learned; and utility contact information. The online database also features a data mining tool for automated review of the technologies selected and cost data. Based on a review of the case study results and industry data, several findings are presented on tren

  8. Solar Sail Propulsion Technology Readiness Level Database

    Science.gov (United States)

    Adams, Charles L.

    2004-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring two solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L'Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem and system fabrication and testing have been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30-meter-diameter thermal-vacuum chamber at NASA Glenn's Plum Brook Station in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to date. Illustrations of the hardware components and systems, test results, analytical models, the relevant space environment definition and the current TRL assessment, as stored and manipulated within the database, are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the solar sail community through the Space Transportation Information Network (STIN).

  9. Desktop Virtualization: Applications and Considerations

    Science.gov (United States)

    Hodgman, Matthew R.

    2013-01-01

    As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…

  10. Pages from the Desktop: Desktop Publishing Today.

    Science.gov (United States)

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  11. Conductive Carbon Nanotube Inks for Use with Desktop Inkjet Printing Technology

    Science.gov (United States)

    Roberson, Luke; Williams, Martha; Tate, LaNetra; Fortier, Craig; Smith, David; Davia, Kyle; Gibson, Tracy; Snyder, Sarah

    2013-01-01

    Inkjet printing is a common commercial process. In addition to the familiar use in printing documents from computers, it is also used in some industrial applications. For example, wire manufacturers are required by law to print the wire type, gauge, and safety information on the exterior of each foot of manufactured wire, and this is typically done with inkjet or laser printers. The goal of this work was the creation of conductive inks that can be applied to a wire or flexible substrates via inkjet printing methods. The use of inkjet printing technology to print conductive inks has been in testing for several years. While researchers have been able to get the printing system to mechanically work, the application of conductive inks on substrates has not consistently produced adequate low resistances in the kilohm range. Conductive materials can be applied using a printer in single or multiple passes onto a substrate including textiles, polymer films, and paper. The conductive materials are composed of electrical conductors such as carbon nanotubes (including functionalized carbon nanotubes and metal-coated carbon nanotubes); graphene, a polycyclic aromatic hydrocarbon (e.g., pentacene and bisperipentacene); metal nanoparticles; inherently conductive polymers (ICP); and combinations thereof. Once the conductive materials are applied, the materials are dried and sintered to form adherent conductive materials on the substrate. For certain formulations, increased conductivity can be achieved by printing on substrates supported by low levels of magnetic field alignment. The adherent conductive materials can be used in applications such as damage detection, dust particle removal, smart coating systems, and flexible electronic circuitry. By applying alternating layers of different electrical conductors to form a layered composite material, a single homogeneous layer can be produced with improved electrical properties. It is believed that patterning alternate layers of

  12. Desktop Publishing Made Simple.

    Science.gov (United States)

    Wentling, Rose Mary

    1989-01-01

    The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)

  13. A Course in Desktop Publishing.

    Science.gov (United States)

    Somerick, Nancy M.

    1992-01-01

    Describes "Promotional Publications," a required course for public relations majors, which teaches the basics of desktop publishing. Outlines how the course covers the preparation of publications used as communication tools in public relations, advertising, and organizations, with an emphasis upon design, layout, and technology. (MM)

  14. Advanced information technology: Building stronger databases

    Energy Technology Data Exchange (ETDEWEB)

    Price, D. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    This paper discusses the attributes of the Advanced Information Technology (AIT) tool set, a database application builder designed at the Lawrence Livermore National Laboratory. AIT consists of a C library and several utilities that provide referential integrity across a database, interactive menu and field level help, and a code generator for building tightly controlled data entry support. AIT also provides for dynamic menu trees, report generation support, and creation of user groups. Composition of the library and utilities is discussed, along with relative strengths and weaknesses. In addition, an instantiation of the AIT tool set is presented using a specific application. Conclusions about the future and value of the tool set are then drawn based on the use of the tool set with that specific application.
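    The referential integrity that the AIT library provides across a database can be sketched with a modern DBMS's built-in foreign-key enforcement. The example below uses SQLite from Python as a stand-in (AIT itself is a C library, and the tables here are hypothetical):

```python
# Referential integrity sketch: a child row may only reference an
# existing parent row, enforced by the database engine itself.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in
conn.execute("CREATE TABLE groups (gid INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE users (
                  uid INTEGER PRIMARY KEY,
                  gid INTEGER NOT NULL REFERENCES groups(gid))""")
conn.execute("INSERT INTO groups VALUES (1, 'operators')")
conn.execute("INSERT INTO users VALUES (10, 1)")       # valid reference

rejected = False
try:
    conn.execute("INSERT INTO users VALUES (11, 99)")  # no group 99
except sqlite3.IntegrityError:
    rejected = True
print("orphan row rejected:", rejected)  # -> True
```

    Pushing this check into the engine, rather than into every data-entry screen, is the same design choice the paper attributes to the AIT tool set's tightly controlled data entry.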

  15. Information persistence using XML database technology

    Science.gov (United States)

    Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.

    2005-05-01

    The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network centric core enterprise services and when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information that is designed to support a specific Community of Interest (COI), geographic area or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is persisted for quick and easy retrieval. This paper will address information representation, persistence and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way. This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and
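    The relational versus XML-enabled categories being compared can be sketched in a few lines: the same message is shredded into typed columns in one table, and kept as a whole document with one field promoted for querying in another. This is an illustrative sketch with invented names, not the JBI Reference Implementation.

```python
# Two storage strategies for the same XML message:
# "relational" (shredded into columns) vs. "XML-enabled"
# (whole document stored, one field extracted for indexing/queries).
import sqlite3
import xml.etree.ElementTree as ET

doc = "<msg><topic>weather</topic><body>clear</body></msg>"
root = ET.fromstring(doc)

conn = sqlite3.connect(":memory:")
# Relational: element values shredded into typed columns.
conn.execute("CREATE TABLE msgs_rel (topic TEXT, body TEXT)")
conn.execute("INSERT INTO msgs_rel VALUES (?, ?)",
             (root.findtext("topic"), root.findtext("body")))
# XML-enabled: whole document kept, one field promoted for queries.
conn.execute("CREATE TABLE msgs_xml (topic TEXT, doc TEXT)")
conn.execute("INSERT INTO msgs_xml VALUES (?, ?)",
             (root.findtext("topic"), doc))

row = conn.execute(
    "SELECT doc FROM msgs_xml WHERE topic = 'weather'").fetchone()
print(row[0])  # the original XML document round-trips intact
```

    The trade-off mirrors the paper's comparison: shredding gives fast typed queries but loses document fidelity, while document storage preserves the XML at the cost of richer in-database query support.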

  16. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

    All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers, while continuing to manage the problems already mentioned. Geographic information systems are no exception, and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve the issues mentioned so far. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. An important distinction, which needs to be made when considering the applicability of relational database technology to GIS, is drawn between the storage and management of geographic data and the manipulation and analysis of geographic data.
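    The central idea of the record above, keeping geographic data in a standard RDBMS so that backup, recovery, and concurrent update come from the database engine itself, can be sketched in a few lines of SQL. The snippet below uses Python's stdlib sqlite3 purely as a stand-in for the product discussed; the table and feature names are invented for illustration:

```python
import sqlite3

# In-memory stand-in for a production RDBMS; point geometries are stored as
# plain columns, so ordinary SQL transactions and indexing apply to them.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (id INTEGER PRIMARY KEY, name TEXT, x REAL, y REAL)")
conn.executemany("INSERT INTO features (name, x, y) VALUES (?, ?, ?)",
                 [("well", 1.0, 2.0), ("road_node", 5.5, 3.2), ("pump", 1.2, 2.1)])
conn.commit()

# A bounding-box query: simple spatial retrieval expressed in standard SQL.
rows = conn.execute(
    "SELECT name FROM features WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?",
    (0.0, 2.0, 1.0, 3.0)).fetchall()
print([r[0] for r in rows])  # -> ['well', 'pump']
```

    Extending this scheme to real geometries (lines, polygons, spatial indexes) is exactly where the record notes that standard RDBMS functions need to be extended.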

  17. Instant Citrix XenDesktop 5 starter

    CERN Document Server

    Magdy, Mahmoud

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This easy-to-follow, hands-on guide shows you how to implement desktop virtualization with real life cases and step-by-step instructions. It is a tutorial with step-by-step instructions and adequate screenshots for the installation and administration of Citrix XenDesktop.If you are new to XenDesktop or are looking to build your skills in desktop virtualization, this is your step-by-step guide to learning Citrix XenDesktop. For those architects a

  18. Research and implementation of a Web-based remote desktop image monitoring system

    International Nuclear Information System (INIS)

    Ren Weijuan; Li Luofeng; Wang Chunhong

    2010-01-01

    This paper studied and implemented a Web-based ISS (Image Snapshot Server) system using Java Web technology. The ISS system consisted of a client web browser and a server. The server part can be divided into three modules: the screen-shot software, the web server, and an Oracle database. The screen-shot software captured the desktop of the remotely monitored PC and sent the pictures to a Tomcat web server for display on the web in real time. At the same time, the pictures were also saved in an Oracle database. Through a web browser, the monitoring user can view the real-time and historical desktop pictures of the monitored PC for any given period, making it very convenient to monitor the desktop image of a remote PC. (authors)
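    The persistence half of the ISS design described above, snapshots written to a database with a timestamp so both real-time and historical views are possible, might look like the following sketch. Stdlib sqlite3 stands in for the Oracle database of the paper, and all names and the fake image bytes are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snapshots (taken_at REAL, png BLOB)")

def store_snapshot(image_bytes: bytes, taken_at: float) -> None:
    # In the real system the bytes would come from the screen-shot component.
    conn.execute("INSERT INTO snapshots VALUES (?, ?)", (taken_at, image_bytes))
    conn.commit()

def snapshots_between(start: float, end: float) -> list:
    # Historical view: all desktop pictures captured during some period.
    return [row[0] for row in conn.execute(
        "SELECT png FROM snapshots WHERE taken_at BETWEEN ? AND ? ORDER BY taken_at",
        (start, end))]

store_snapshot(b"fake-png-1", 100.0)
store_snapshot(b"fake-png-2", 200.0)
store_snapshot(b"fake-png-3", 300.0)
print(snapshots_between(150.0, 350.0))  # -> [b'fake-png-2', b'fake-png-3']
```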

  19. Desktop Publishing for Counselors.

    Science.gov (United States)

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  20. Development of Information Technology of Object-relational Databases Design

    Directory of Open Access Journals (Sweden)

    Valentyn A. Filatov

    2012-12-01

    Full Text Available The article is concerned with the development of an information technology for object-relational database design and with the study of the object features of infological and logical database schemas, their entities, and their connections.

  1. Towards P2P XML Database Technology

    NARCIS (Netherlands)

    Y. Zhang (Ying)

    2007-01-01

    textabstractTo ease the development of data-intensive P2P applications, we envision a P2P XML Database Management System (P2P XDBMS) that acts as a database middle-ware, providing a uniform database abstraction on top of a dynamic set of distributed data sources. In this PhD work, we research which

  2. Applying artificial intelligence to astronomical databases - a survey of applicable technology.

    Science.gov (United States)

    Rosenthal, D. A.

    This paper surveys several emerging technologies which are relevant to astronomical database issues such as interface technology, internal database representation, and intelligent data reduction aids. Among the technologies discussed are natural language understanding, frame and object representations, planning, pattern analysis, machine learning and the nascent study of simulated neural nets. These techniques will become increasingly important for astronomical research, and in particular, for applications with large databases.

  3. Linux Desktop Pocket Guide

    CERN Document Server

    Brickner, David

    2005-01-01

    While Mac OS X garners all the praise from pundits, and Windows XP attracts all the viruses, Linux is quietly being installed on millions of desktops every year. For programmers and system administrators, business users, and educators, desktop Linux is a breath of fresh air and a needed alternative to other operating systems. The Linux Desktop Pocket Guide is your introduction to using Linux on five of the most popular distributions: Fedora, Gentoo, Mandriva, SUSE, and Ubuntu. Despite what you may have heard, using Linux is not all that hard. Firefox and Konqueror can handle all your web bro

  4. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  5. Desktop Publishing in Libraries.

    Science.gov (United States)

    Cisler, Steve

    1987-01-01

    Describes the components, costs, and capabilities of several desktop publishing systems, and examines their possible impact on work patterns within organizations. The text and graphics of the article were created using various microcomputer software packages. (CLB)

  6. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  7. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    Science.gov (United States)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  8. Digital video for the desktop

    CERN Document Server

    Pender, Ken

    1999-01-01

    Practical introduction to creating and editing high quality video on the desktop. Using examples from a variety of video applications, benefit from a professional's experience, step-by-step, through a series of workshops demonstrating a wide variety of techniques. These include producing short films, multimedia and internet presentations, animated graphics and special effects.The opportunities for the independent videomaker have never been greater - make sure you bring your understanding fully up to date with this invaluable guide.No prior knowledge of the technology is assumed, with explanati

  9. Database mirroring in fault-tolerant continuous technological process control

    Directory of Open Access Journals (Sweden)

    R. Danel

    2015-10-01

    Full Text Available This paper describes the implementations of the mirroring technology of selected database systems: Microsoft SQL Server, MySQL, and Caché. By simulating critical failures, the systems' behavior and their resilience against failure were tested. The aim was to determine whether database mirroring is suitable for use in continuous metallurgical processes to ensure a fault-tolerant solution at affordable cost. Present-day database systems are characterized by high robustness and are resistant to sudden system failure. Database mirroring technologies are reliable, and even low-budget projects can be provided with a decent fault-tolerant solution; however, the database technologies available to low-budget projects are not suitable for use in real-time systems.
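    The failure simulation described above amounts to verifying that a client can fall back to the mirror when the principal server dies. A minimal, vendor-neutral sketch of that client-side failover logic follows; the connection objects are stubs, since a real system would use the driver for SQL Server, MySQL, or Caché:

```python
class DatabaseDown(Exception):
    pass

class Node:
    """Stub for a database server; a real client would hold a driver connection."""
    def __init__(self, name: str, alive: bool = True):
        self.name, self.alive = name, alive

    def query(self, sql: str) -> str:
        if not self.alive:
            raise DatabaseDown(self.name)
        return f"{self.name}: ok"

def query_with_failover(nodes, sql: str) -> str:
    # Try the principal first, then each mirror in turn.
    last_error = None
    for node in nodes:
        try:
            return node.query(sql)
        except DatabaseDown as exc:
            last_error = exc
    raise RuntimeError("all replicas down") from last_error

principal = Node("principal", alive=False)   # simulated critical failure
mirror = Node("mirror")
print(query_with_failover([principal, mirror], "SELECT 1"))  # -> mirror: ok
```

    Production mirroring adds the hard parts this sketch omits: log shipping, witness-based automatic failover, and split-brain protection.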

  10. Choosing the Right Desktop Publisher.

    Science.gov (United States)

    Eiser, Leslie

    1988-01-01

    Investigates the many different desktop publishing packages available today. Lists the steps to desktop publishing. Suggests which package to use with specific hardware available. Compares several packages for IBM, Mac, and Apple II based systems. (MVL)

  11. Desktop Publishing in Education.

    Science.gov (United States)

    Hall, Wendy; Layman, J.

    1989-01-01

    Discusses the state of desktop publishing (DTP) in education today and describes the weaknesses of the systems available for use in the classroom. Highlights include document design and layout; text composition; graphics; word processing capabilities; a comparison of commercial and educational DTP packages; and skills required for DTP. (four…

  12. Analysis of technologies databases use in physical education and sport

    Directory of Open Access Journals (Sweden)

    Usychenko V.V.

    2010-03-01

    Full Text Available The scientific, methodical, and specialized literature is analyzed and systematized. The use of database technology in the athlete preparation system is examined. The need for technologies that rapidly process large arrays of sports information is shown. Data were collected on the use of computer-aided technologies for recording and analyzing the results of testing training-process parameters. The influence of these technologies on training and competition activity is considered. A database, «Athlete», is presented; it contains anthropometric and myometric indexes of highly qualified bodybuilding athletes.

  13. Fusion research and technology records in INIS database

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1998-01-01

    This article is a summary of a survey study, "A survey on publications in Fusion Research and Technology. Science and Technology Indicators in Fusion R and T", by the same author on Fusion R and T records in the International Nuclear Information System (INIS) bibliographic database. In that study, for the first time, all scientometric and bibliometric information contained in a bibliographic database, using INIS records, is analyzed and quantified for a selected field of science and technology. A variety of new science and technology indicators that can be used for evaluating research and development activities is also presented in that study.

  14. Citrix XenApp 7.5 desktop virtualization solutions

    CERN Document Server

    Paul, Andy

    2014-01-01

    If you are a Citrix® engineer, a virtualization consultant, or an IT project manager with prior experience of using Citrix XenApp® and related technologies for desktop virtualization and want to further explore the power of XenApp® for flawless desktop virtualization, then this book is for you.

  15. Desktop Publishing: A New Frontier for Instructional Technologists.

    Science.gov (United States)

    Bell, Norman T.; Warner, James W.

    1986-01-01

    Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications, a description of new technological advances referred to as "desktop publishing," and suggests the application of this technology to instructional tasks. (TW)

  16. Analysis of technologies databases use in physical education and sport

    OpenAIRE

    Usychenko V.V.; Byshevets N.G.

    2010-01-01

    The scientific, methodical, and specialized literature is analyzed and systematized. The use of database technology in the athlete preparation system is examined. The need for technologies that rapidly process large arrays of sports information is shown. Data were collected on the use of computer-aided technologies for recording and analyzing the results of testing training-process parameters. The influence of these technologies is ...

  17. Nuclear Criticality Technology and Safety Project parameter study database

    International Nuclear Information System (INIS)

    Toffer, H.; Erickson, D.G.; Samuel, T.J.; Pearson, J.S.

    1993-03-01

    A computerized, knowledge-screened, comprehensive database of the nuclear criticality safety documentation has been assembled as part of the Nuclear Criticality Technology and Safety (NCTS) Project. The database is focused on nuclear criticality parameter studies. The database has been computerized using dBASE III Plus and can be used on a personal computer or a workstation. More than 1300 documents have been reviewed by nuclear criticality specialists over the last 5 years to produce over 800 database entries. Nuclear criticality specialists will be able to access the database and retrieve information about topical parameter studies, authors, and chronology. The database places the accumulated knowledge in the nuclear criticality area over the last 50 years at the fingertips of a criticality analyst

  18. JENDL. Nuclear databases for science and technology

    International Nuclear Information System (INIS)

    Shibata, Keiichi

    2013-01-01

    It is exactly 50 years since the Japanese Nuclear Data Committee was founded both in the Atomic Energy Society of Japan and in the former Japan Atomic Energy Research Institute. The committee promoted the development of Japan's own evaluated nuclear data libraries. As a result, we managed to produce a series of Japanese Evaluated Nuclear Data Libraries (JENDLs) to be used in various fields for science and technology. The libraries are categorized into general-purpose and special-purpose ones. The general-purpose libraries have been updated periodically by considering the latest knowledge on experimental and theoretical nuclear physics that was available at the time of the updates. On the other hand, the special-purpose libraries have been issued in order to meet the needs for particular application fields. This paper reviews the research and development for those libraries. (author)

  19. Desk Congest: Desktop Congesting Software for Desktop Clutter Congestion

    Directory of Open Access Journals (Sweden)

    Solomon A. Adepoju

    2015-06-01

    Full Text Available The computer desktop environment is a working environment which can be likened to a user's desk in homes and offices. Oftentimes the computer desktop gets cluttered with files, whether shortcuts used as quick links, files stored temporarily to be accessed later, or files simply dumped there for no obvious reason. However, previous research has shown that a cluttered desktop affects users' productivity, and getting these files organized is a laborious task for most users. To conveniently alleviate the effect clutter has on users' performance and productivity, third-party software is needed that helps organize the desktop environment in a logical and efficient manner. It is to this end that desktop-decongesting software was designed and implemented to help curb clutter problems which existing tools have only partially addressed. The system is designed using Visual Basic .NET and proves effective in tackling the desktop congestion problem.

  20. A desktop PRA

    International Nuclear Information System (INIS)

    Dolan, B.J.; Weber, B.J.

    1989-01-01

    This paper reports that Duke Power Company has completed full-scope PRAs for each of its nuclear stations - Oconee, McGuire and Catawba. These living PRAs are being maintained using desktop personal computers. Duke's PRA group now has powerful personal computer-based tools that have both decreased direct costs (computer analysis expenses) and increased group efficiency (less time to perform analyses). The shorter turnaround time has already resulted in direct savings through analyses provided in support of justification for continued station operation. Such savings are expected to continue with similar future support

  1. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  2. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. 
We believe that an innovative database architecture where the secure

  3. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    Directory of Open Access Journals (Sweden)

    Valentin Vieriu

    2009-01-01

    Full Text Available Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  4. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    OpenAIRE

    Vieriu, Valentin; Tuican, Catalin

    2009-01-01

    Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  5. Fatigue monitoring desktop guide

    International Nuclear Information System (INIS)

    Woods, K.; Thomas, K.

    2012-01-01

    The development of a program for managing material aging (MMG) in the nuclear industry requires a new and different perspective. The classical method for MMG is cycle counting, which has been shown to have limited success. The classical method has been successful in satisfying the ductile condition per the American Society of Mechanical Engineers' (ASME) design criteria. However, the defined material failure mechanism has transformed from through-wall cracking and leakage (ASME) to crack initiation (NUREG-6909). This transformation is based on current industry experience with material degradation early in plant life and can be attributed to fabrication issues and environmental concerns where cycle counting has been unsuccessful. This new perspective provides a different approach to cycle counting that incorporates all of the information about the material conditions. This approach goes beyond the consideration of a static analysis and includes a dynamic assessment of component health, which is required for operating plants. This health definition should consider fabrication, inspections, transient conditions and industry operating experience. In addition, this collection of information can be transparent to a broader audience that may not have a full understanding of the system design or the potential causes of early material degradation. This paper will present the key points that are needed for a successful fatigue monitoring desktop guide. (authors)

  6. Fatigue monitoring desktop guide

    Energy Technology Data Exchange (ETDEWEB)

    Woods, K. [InnoTech Engineering Solutions, LLC (United States); Thomas, K. [Nebraska Public Power District (United States)

    2012-07-01

    The development of a program for managing material aging (MMG) in the nuclear industry requires a new and different perspective. The classical method for MMG is cycle counting, which has been shown to have limited success. The classical method has been successful in satisfying the ductile condition per the American Society of Mechanical Engineers' (ASME) design criteria. However, the defined material failure mechanism has transformed from through-wall cracking and leakage (ASME) to crack initiation (NUREG-6909). This transformation is based on current industry experience with material degradation early in plant life and can be attributed to fabrication issues and environmental concerns where cycle counting has been unsuccessful. This new perspective provides a different approach to cycle counting that incorporates all of the information about the material conditions. This approach goes beyond the consideration of a static analysis and includes a dynamic assessment of component health, which is required for operating plants. This health definition should consider fabrication, inspections, transient conditions and industry operating experience. In addition, this collection of information can be transparent to a broader audience that may not have a full understanding of the system design or the potential causes of early material degradation. This paper will present the key points that are needed for a successful fatigue monitoring desktop guide. (authors)

  7. Desktop publishing com o scribus

    OpenAIRE

    Silva, Fabrício Riff; Uchôa, Kátia Cilene Amaral

    2015-01-01

    This article presents a brief tutorial on Desktop Publishing, with emphasis on the free software Scribus, through the creation of a practical example that explores some of its main features.

  8. Experience with a run file archive using database technology

    International Nuclear Information System (INIS)

    Nixdorf, U.

    1993-12-01

    High Energy Physics experiments are known for their production of large amounts of data. Even small projects may have to manage several gigabytes of event information. One possible solution for the management of this data is to use today's technology to archive the raw data files in tertiary storage and build on-line catalogs which reference interesting data. This approach has been taken by the Gammas, Electrons and Muons (GEM) Collaboration for their evaluation of muon chamber technologies at the Superconducting Super Collider Laboratory (SSCL). Several technologies were installed and tested during a 6-month period. Events produced were first recorded in the UNIX filesystem of the data acquisition system and then migrated to the Physics Detector Simulation Facility (PDSF) for long-term storage. The software system makes use of a commercial relational database management system (SYBASE) and the Data Management System (DMS), a tape archival system developed at the SSCL. The components are distributed among several machines inside and outside PDSF. A Motif-based graphical user interface (GUI) enables physicists to retrieve interesting runs from the archive using the on-line database catalog.
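    The architecture in the record above, raw files on tape plus an on-line relational catalog that references interesting data, reduces to a schema like the following. Stdlib sqlite3 stands in for the SYBASE system of the paper, and the detector names, run numbers, and tape labels are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Each catalog row points at a run file archived in tertiary storage, so a
# physicist can locate interesting runs without scanning the tapes themselves.
conn.execute("""CREATE TABLE runs (
    run_id INTEGER PRIMARY KEY,
    detector TEXT,
    n_events INTEGER,
    tape_label TEXT,
    tape_offset INTEGER)""")
conn.executemany("INSERT INTO runs VALUES (?, ?, ?, ?, ?)", [
    (101, "muon_csc", 52000, "T0001", 0),
    (102, "muon_rpc", 48000, "T0001", 1),
    (103, "muon_csc", 91000, "T0002", 0)])
conn.commit()

# "Retrieve interesting runs": e.g. CSC runs with more than 50k events,
# returning the tape each one lives on.
hits = conn.execute(
    "SELECT run_id, tape_label FROM runs WHERE detector = ? AND n_events > ?",
    ("muon_csc", 50000)).fetchall()
print(hits)  # -> [(101, 'T0001'), (103, 'T0002')]
```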

  9. Strength of PLA Components Fabricated with Fused Deposition Technology Using a Desktop 3D Printer as a Function of Geometrical Parameters of the Process

    Directory of Open Access Journals (Sweden)

    Vladimir E. Kuznetsov

    2018-03-01

    Full Text Available The current paper studies the influence of geometrical parameters of the fused deposition modeling (FDM)/fused filament fabrication (FFF) 3D printing process on printed part strength for open-source desktop 3D printers and the most popular material used for that purpose, polylactic acid (PLA). The study was conducted using a set of different nozzles (0.4, 0.6, and 0.8 mm) and a range of layer heights from the minimum to maximum physical limits of the machine. To assess print strength, a novel assessment method is proposed: a tubular sample is loaded in the weakest direction (across layers) in a three-point bending fixture. Mesostructure evaluation through scanning electron microscopy (SEM) scans of the samples was used to explain the obtained results. We detected a significant influence of geometric process parameters on sample mesostructure and, consequently, on sample strength.
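    For a tubular sample in three-point bending, as in the test method above, the maximum bending stress follows from the standard beam formulas: sigma = M*c/I with mid-span moment M = F*L/4, c = D/2, and I = pi*(D^4 - d^4)/64 for a tube. A small sketch of that calculation (the sample dimensions and load are illustrative, not taken from the paper):

```python
import math

def tube_bending_stress(force_n: float, span_m: float,
                        outer_d_m: float, inner_d_m: float) -> float:
    """Maximum stress (Pa) at mid-span of a tube in three-point bending.

    sigma = M*c/I, with M = F*L/4, c = D/2, I = pi*(D^4 - d^4)/64,
    which simplifies to 8*F*L*D / (pi*(D^4 - d^4)).
    """
    return 8 * force_n * span_m * outer_d_m / (
        math.pi * (outer_d_m**4 - inner_d_m**4))

# Example: 100 N over a 60 mm span, 10 mm outer / 8 mm inner diameter.
sigma = tube_bending_stress(100.0, 0.060, 0.010, 0.008)
print(f"{sigma / 1e6:.1f} MPa")
```

    Comparing such a nominal stress across nozzle diameters and layer heights is one way the strength differences reported above could be normalized.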

  10. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  11. Scaling up ATLAS Database Release Technology for the LHC Long Run

    International Nuclear Information System (INIS)

    Borodin, M; Nevski, P; Vaniachine, A

    2011-01-01

    To overcome scalability limitations in database access on the Grid, ATLAS introduced the Database Release technology, which replicates databases in files. For years the Database Release technology assured scalable database access for Monte Carlo production on the Grid. Since the previous CHEP, the technology has also been used successfully in ATLAS data reprocessing on the Grid. A frozen Conditions DB snapshot guarantees reproducibility and transactional consistency, isolating Grid data processing tasks from continuous conditions updates at the 'live' Oracle server. Database Release technology fully satisfies the requirements of ATLAS data reprocessing and Monte Carlo production. We parallelized the Database Release build workflow to avoid a linear dependency of the build time on the length of the LHC data-taking period. In recent data reprocessing campaigns the build time was reduced by an order of magnitude thanks to the proven master-worker architecture used in Google MapReduce. We describe further Database Release optimizations scaling up the technology for the LHC long run.
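    The master-worker parallelization is described only at a high level in the abstract; a minimal sketch of the idea, with illustrative names and a stub standing in for the real, time-consuming extraction of each run range, might look like this:

```python
from concurrent.futures import ThreadPoolExecutor

def build_slice(run_range):
    """Worker: build the conditions payload for one run range
    (stub; the real step extracts a DB snapshot slice)."""
    lo, hi = run_range
    return (lo, hi), f"conditions[{lo}-{hi}]"

def build_release(run_ranges, workers=4):
    """Master: farm independent run ranges out to workers and merge the
    results, so build time no longer grows linearly with the data-taking
    period but is bounded by the largest slice."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(build_slice, run_ranges))

release = build_release([(1, 100), (101, 200), (201, 300)])
print(sorted(release))  # [(1, 100), (101, 200), (201, 300)]
```

    The design choice mirrors MapReduce: slices are independent, so adding workers shortens the wall-clock build time while the merge step stays trivial.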

  12. Health technology management: a database analysis as support of technology managers in hospitals.

    Science.gov (United States)

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy.

  13. Structure health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected at the structure side. The monitoring system has several functions: self-monitoring, self-diagnosis, and self-control. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system detects the faulty sensors. The self-diagnosis function repairs abnormal sensor conditions, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify sensors that need replacement. For further study, a real application test will be performed to check for remaining inconveniences.
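    The abstract does not specify how the self-monitoring function detects faulty sensors; two common checks are a physical-range test and a stuck-at test. A hedged sketch, with thresholds chosen purely for illustration:

```python
def check_sensor(readings, lo=-50.0, hi=50.0, stuck_run=5):
    """Flag a sensor as faulty if readings leave the plausible physical
    range [lo, hi], or repeat an identical value for `stuck_run`
    consecutive samples (a typical sign of a dead channel)."""
    if any(r < lo or r > hi for r in readings):
        return "out-of-range"
    run = 1
    for prev, cur in zip(readings, readings[1:]):
        run = run + 1 if cur == prev else 1
        if run >= stuck_run:
            return "stuck"
    return "ok"

print(check_sensor([1.2, 1.3, 1.1, 1.2]))            # ok
print(check_sensor([1.2, 99.0, 1.1, 1.2]))           # out-of-range
print(check_sensor([2.0, 2.0, 2.0, 2.0, 2.0]))       # stuck
```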

  14. Structural health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok [Nondestructive Measurement Lab., KRISS, Daejeon (Korea, Republic of)

    2003-07-01

    A structural health monitoring system should be developed on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected at the structure side. The monitoring system has several functions: self-monitoring, self-diagnosis, and self-control. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system detects the faulty sensors. The self-diagnosis function repairs abnormal sensor conditions, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify sensors that need replacement. For further study, a real application test will be performed to check for remaining inconveniences.

  15. Structure health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok [Smart Measurement Group, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2003-05-15

    A structural health monitoring system should be developed on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected at the structure side. The monitoring system has several functions: self-monitoring, self-diagnosis, and self-control. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system detects the faulty sensors. The self-diagnosis function repairs abnormal sensor conditions, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify sensors that need replacement. For further study, a real application test will be performed to check for remaining inconveniences.

  16. Structural health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected at the structure side. The monitoring system has several functions: self-monitoring, self-diagnosis, and self-control. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system detects the faulty sensors. The self-diagnosis function repairs abnormal sensor conditions, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify sensors that need replacement. For further study, a real application test will be performed to check for remaining inconveniences.

  17. The Point Lepreau Desktop Simulator

    International Nuclear Information System (INIS)

    MacLean, M.; Hogg, J.; Newman, H.

    1997-01-01

    The Point Lepreau Desktop Simulator runs plant process modeling software on a 266 MHz single CPU DEC Alpha computer. This same Alpha also runs the plant control computer software on an SSCI 125 emulator. An adjacent Pentium PC runs the simulator's Instructor Facility software, and communicates with the Alpha through an Ethernet. The Point Lepreau Desktop simulator is constructed to be as similar as possible to the Point Lepreau full scope training simulator. This minimizes total maintenance costs and enhances the benefits of the desktop simulator. Both simulators have the same modeling running on a single CPU in the same schedule of calculations. Both simulators have the same Instructor Facility capable of developing and executing the same lesson plans, doing the same monitoring and control of simulations, inserting all the same malfunctions, performing all the same overrides, capable of making and restoring all the same storepoints. Both simulators run the same plant control computer software - the same assembly language control programs as the power plant uses for reactor control, heat transport control, annunciation, etc. This is a higher degree of similarity between a desktop simulator and a full scope training simulator than previously reported for a computer controlled nuclear plant. The large quantity of control room hardware missing from the desktop simulator is replaced by software. The Instructor Facility panel override software of the training simulator provides the means by which devices (switches, controllers, windows, etc.) on the control room panels can be controlled and monitored in the desktop simulator. The CRT of the Alpha provides a mouse operated DCC keyboard mimic for controlling the plant control computer emulation. Two emulated RAMTEK display channels appear as windows for monitoring anything of interest on plant DCC displays, including one channel for annunciation. (author)

  18. Nielsen PrimeLocation Web/Desktop: Assessing and GIS Mapping Market Area

    Data.gov (United States)

    Social Security Administration — Nielsen PrimeLocation Web and Desktop Software Licensed for Internal Use only: Pop-Facts Demographics Database, Geographic Mapping Data Layers, Geo-Coding locations.

  19. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  20. Making the Leap to Desktop Publishing.

    Science.gov (United States)

    Schleifer, Neal

    1986-01-01

    Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)

  1. The Printout: Desktop Publishing in the Classroom.

    Science.gov (United States)

    Balajthy, Ernest; Link, Gordon

    1988-01-01

    Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large-print texts for students with limited sight or for primary students. (NH)

  2. Desktop Publishing in the University.

    Science.gov (United States)

    Burstyn, Joan N., Ed.

    Highlighting changes in the work of people within the university, this book presents nine essays that examine the effects of desktop publishing and electronic publishing on professors and students, librarians, and those who work at university presses and in publication departments. Essays in the book are: (1) "Introduction: The Promise of Desktop…

  3. Desktop Publishing Choices: Making an Appropriate Decision.

    Science.gov (United States)

    Crawford, Walt

    1991-01-01

    Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…

  4. Basics of Desktop Publishing. Second Edition.

    Science.gov (United States)

    Beeby, Ellen; Crummett, Jerrie

    This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…

  5. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  6. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
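    The abstract notes that SQL was used to manage database transactions and support the user interfaces. A minimal, hypothetical sketch of that pattern using SQLite; the table and column names are illustrative, not the actual KFD schema:

```python
import sqlite3

# In-memory sketch of a karst feature table and transactional inserts.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE karst_feature (
    id INTEGER PRIMARY KEY,
    feature_type TEXT,      -- e.g. sinkhole, spring, stream sink
    county TEXT,
    depth_m REAL)""")

with conn:  # transaction: committed atomically, rolled back on error
    conn.executemany(
        "INSERT INTO karst_feature (feature_type, county, depth_m) "
        "VALUES (?, ?, ?)",
        [("sinkhole", "Fillmore", 4.2),
         ("spring", "Olmsted", None),
         ("sinkhole", "Olmsted", 2.7)])

rows = conn.execute(
    "SELECT county, COUNT(*) FROM karst_feature "
    "WHERE feature_type = 'sinkhole' GROUP BY county ORDER BY county"
).fetchall()
print(rows)  # [('Fillmore', 1), ('Olmsted', 1)]
```

    Aggregate queries of this kind are what let a GIS front end summarize feature densities by region without duplicating data in the map layers.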

  7. Ceramics Technology Project database: September 1991 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1992-06-01

    The piston ring-cylinder liner area of the internal combustion engine must withstand very high temperature gradients, highly corrosive environments, and constant friction. Improving engine efficiency requires ring and cylinder liner materials that can survive this abusive environment and lubricants that resist decomposition at elevated temperatures. Wear and friction tests have been done on many material combinations in environments similar to actual use to find the right materials for the situation. This report covers tribology information produced from 1986 through July 1991 by Battelle Columbus Laboratories, Caterpillar Inc., and Cummins Engine Company, Inc. for the Ceramic Technology Project (CTP). All data in this report were taken from the project's semiannual and bimonthly progress reports and cover base materials, coatings, and lubricants. The data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from these data.

  8. Using Desktop Publishing To Enhance the "Writing Process."

    Science.gov (United States)

    Millman, Patricia G.; Clark, Margaret P.

    1997-01-01

    Describes the development of an instructional technology course at Fairmont State College (West Virginia) for education majors that included a teaching module combining steps of the writing process to provide for the interdisciplinary focus of writing across the curriculum. Discusses desktop publishing, the National Writing Project, and student…

  9. NoSQL technologies for the CMS Conditions Database

    Science.gov (United States)

    Sipos, Roland

    2015-12-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions therefore makes it worthwhile to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. The definition of the database infrastructure is based on the need to store the conditions as BLOBs. Because of this, a condition can reach a size that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be problematic in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure the characteristics of handling arbitrary binary data in these databases. Based on the evaluation, prototypes of a document store, a column-oriented store, and a plain key-value store are deployed. An adaptation layer to access the backends in the CMS Offline software was developed to provide transparent support for these NoSQL databases in the CMS context. Additional data modelling approaches and considerations in the software layer, deployment and automation of the databases are also covered in the research. In this paper we present the results of the evaluation as well as a performance comparison of the prototypes studied.
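    The splitting of oversized BLOBs mentioned above is not detailed in the abstract; one generic way to shard a large payload across a key-value store's per-value size limit is to chunk it under derived keys plus a count record. A sketch under that assumption, with an arbitrary example chunk size:

```python
CHUNK = 16 * 1024 * 1024  # assumed per-value size limit of the backend

def split_blob(key, blob, chunk=CHUNK):
    """Split one large payload into backend-sized pieces, keyed so the
    original BLOB can be reassembled in order from a key-value store."""
    parts = [blob[i:i + chunk] for i in range(0, len(blob), chunk)] or [b""]
    items = {f"{key}:{n}": p for n, p in enumerate(parts)}
    items[f"{key}:meta"] = str(len(parts)).encode()  # chunk count record
    return items

def join_blob(key, items):
    """Reassemble the payload from its chunks using the count record."""
    n = int(items[f"{key}:meta"])
    return b"".join(items[f"{key}:{i}"] for i in range(n))

payload = b"x" * 100
store = split_blob("tag42", payload, chunk=30)   # 4 chunks + meta
assert join_blob("tag42", store) == payload
```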

  10. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  11. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  12. The research of network database security technology based on web service

    Science.gov (United States)

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and network database security levels, studies security technology for the network database, analyzes the sub-key encryption algorithm in detail, and applies this algorithm successfully in a campus one-card system. The realization process of the encryption algorithm is discussed; the method can serve as a reference in many fields, particularly in management information system security and e-commerce.
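    The paper's actual sub-key algorithm is not specified in this abstract. As a hedged illustration of the general idea, per-field sub-keys can be derived from one master key so that compromising one field's key does not expose the others; both the derivation and the toy cipher below are assumptions for demonstration, not the paper's scheme:

```python
import hashlib, hmac

def derive_subkey(master_key: bytes, column: str) -> bytes:
    """Derive a deterministic per-column sub-key from one master key
    (illustrative HMAC-based derivation, not the paper's algorithm)."""
    return hmac.new(master_key, column.encode(), hashlib.sha256).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR keystream cipher for demonstration only; not secure."""
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

master = b"campus-card-master-secret"
k_balance = derive_subkey(master, "balance")   # one sub-key per column
ciphertext = xor_encrypt(b"1250", k_balance)
assert xor_encrypt(ciphertext, k_balance) == b"1250"  # same sub-key decrypts
```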

  13. NoSQL technologies for the CMS Conditions Database

    CERN Document Server

    Sipos, Roland

    2015-01-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions therefore makes it worthwhile to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, by evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. An important detail about the Conditions is that the payloads are stored as BLOBs, and they can reach sizes that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be a bottleneck in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure the characteristics of the handling of arbitrary binary data in these databases. Based on the evaluation, prototypes of a document store, using a column-oriented and plain key-value store, are deployed. An adaption l...

  14. Exploiting database technology for object based event storage and retrieval

    International Nuclear Information System (INIS)

    Rawat, Anil; Rajan, Alpana; Tomar, Shailendra Singh; Bansal, Anurag

    2005-01-01

    This paper discusses the storage and retrieval of experimental data in relational databases. Physics experiments carried out using reactors and particle accelerators generate huge amounts of data. Also, most data analysis and simulation programs are developed using object-oriented programming concepts. Hence, one of the most important design features of an experiment-related software framework is the way object persistency is handled. We intend to discuss these issues in the light of the module developed by us for storing C++ objects in relational databases like Oracle. This module was developed under the POOL persistency framework being developed for the LHC, CERN grid. (author)

  15. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    As cloud virtualization technologies become more and more prevalent, cloud users often encounter the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access, from anywhere, a terminal running in our cloud platform. We implemented a sketch of web interfac...

  16. ADAM (Affordable Desktop Application Manager): a Unix desktop application manager

    International Nuclear Information System (INIS)

    Liebana, M.; Marquina, M.; Ramos, R.

    1996-01-01

    ADAM stands for Affordable Desktop Application Manager. It is a GUI developed at CERN with the aim of easing access to applications. The motivation to develop ADAM came from the unavailability of environments like COSE/CDE and their heavy resource consumption. ADAM has proven to be user friendly: new users are able to customize it to their needs in a few minutes. Groups of users may share a common application environment through ADAM. ADAM also integrates the Unix and PC worlds: PC users can access Unix applications in the same way as their usual Windows applications. This paper describes all the ADAM features, how they are used at CERN Public Services, and the future plans for ADAM. (author)

  17. Data-Base Software For Tracking Technological Developments

    Science.gov (United States)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    The Technology Tracking System (TechTracS) computer program was developed for storing and retrieving information on technology and related patent information developed under the auspices of NASA Headquarters and NASA's field centers. The contents of the database include multiple scanned still images and QuickTime movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. The system performs routine functions automatically and serves multiple users.

  18. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Directory of Open Access Journals (Sweden)

    Surasak Saokaew

    Full Text Available Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g., name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases, 20 from Thailand and 20 from Japan, were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases were potentially usable for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed.

  19. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Science.gov (United States)

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g., name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases, 20 from Thailand and 20 from Japan, were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases were potentially usable for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed.

  20. System Testing of Desktop and Web Applications

    Science.gov (United States)

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  1. Desktop Publishing as a Learning Resources Service.

    Science.gov (United States)

    Drake, David

    In late 1988, Midland College in Texas implemented a desktop publishing service to produce instructional aids and reduce and complement the workload of the campus print shop. The desktop service was placed in the Media Services Department of the Learning Resource Center (LRC) for three reasons: the LRC was already established as a campus-wide…

  2. Desktop Publishing for the Gifted/Talented.

    Science.gov (United States)

    Hamilton, Wayne

    1987-01-01

    Examines the nature of desktop publishing and how it can be used in the classroom for gifted/talented students. Characteristics and special needs of such students are identified, and it is argued that desktop publishing addresses those needs, particularly with regard to creativity. Twenty-six references are provided. (MES)

  3. Technical Writing Teachers and the Challenges of Desktop Publishing.

    Science.gov (United States)

    Kalmbach, James

    1988-01-01

    Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

  4. Development of Integrated PSA Database and Application Technology

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Park, Jin Hee; Kim, Seung Hwan; Choi, Sun Yeong; Jung, Woo Sik; Jeong, Kwang Sub; Ha Jae Joo; Yang, Joon Eon; Min Kyung Ran; Kim, Tae Woon

    2005-04-15

    The purpose of this project is to develop 1) a reliability database framework, 2) a methodology for reactor trip and abnormal event analysis, and 3) a prototype PSA information DB system. We already have part of the reactor trip and component reliability data; in this study, we extend the collection of data up to 2002. We construct a pilot reliability database for common cause failure and piping failure data. A reactor trip or a component failure may have an impact on the safety of a nuclear power plant. We perform precursor analysis for such events that occurred in the KSNP, and develop a procedure for the precursor analysis. A risk monitor provides a means to trace the changes in risk following changes in the plant configuration. We develop a methodology incorporating the model of the secondary system related to the reactor trip into the risk monitor model. We develop a prototype PSA information system for the UCN 3 and 4 PSA models, where information for the PSA, such as PSA reports, analysis reports, thermal-hydraulic analysis results, system notebooks, and so on, is entered into the system. We develop a unique coherent BDD method to quantify a fault tree and the fastest fault tree quantification engine, FTREX. We develop quantification software for a full PSA model and a one-top model.
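    The BDD method and FTREX engine are not described in this abstract; as a minimal illustration of what fault tree quantification computes, the top-event probability of a small tree with independent basic events (the gate structure and probabilities below are invented examples) can be evaluated gate by gate:

```python
def and_gate(probs):
    """P(AND) for independent basic events: product of probabilities."""
    q = 1.0
    for p in probs:
        q *= p
    return q

def or_gate(probs):
    """P(OR) for independent events: 1 - prod(1 - p), exact (no
    rare-event approximation, which would just sum the terms)."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

# Example tree: TOP = (pump_a AND pump_b) OR valve_fails
p_top = or_gate([and_gate([1e-2, 1e-2]), 1e-4])
print(f"{p_top:.6e}")  # roughly 2e-4
```

    Real PSA models have thousands of gates with shared basic events, which is why BDD-based engines such as FTREX are needed: shared events break the independence assumption this naive gate-by-gate evaluation relies on.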

  5. A comparison of different database technologies for the CMS AsyncStageOut transfer database

    Science.gov (United States)

    Ciangottini, D.; Balcas, J.; Mascheroni, M.; Rupeika, E. A.; Vaandering, E.; Riahi, H.; Silva, J. M. D.; Hernandez, J. M.; Belforte, S.; Ivanov, T. T.

    2017-10-01

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers has constantly increased, up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.
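
    The bookkeeping ASO needs is essentially per-file transfer state. A hypothetical sketch of what the SQL variant of such bookkeeping could look like, using SQLite for illustration (the table layout, states, and site names are invented, not CRAB's actual schema):

```python
import sqlite3

# One row per managed file transfer; an index on the state column keeps
# worker queries ("give me the next N new transfers") cheap.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transfers (
        file_id     TEXT PRIMARY KEY,
        username    TEXT NOT NULL,
        source_site TEXT NOT NULL,
        dest_site   TEXT NOT NULL,
        state       TEXT NOT NULL DEFAULT 'new'  -- new/submitted/done/failed
    )""")
conn.execute("CREATE INDEX idx_state ON transfers(state)")

conn.executemany(
    "INSERT INTO transfers(file_id, username, source_site, dest_site) "
    "VALUES (?, ?, ?, ?)",
    [("f1", "alice", "T2_IT_Rome", "T2_US_MIT"),
     ("f2", "bob", "T2_DE_DESY", "T2_ES_CIEMAT")])

# A worker claims a batch of new transfers and marks them submitted.
batch = [r[0] for r in conn.execute(
    "SELECT file_id FROM transfers WHERE state='new' LIMIT 10")]
conn.executemany("UPDATE transfers SET state='submitted' WHERE file_id=?",
                 [(f,) for f in batch])
print(batch)  # → ['f1', 'f2']
```
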

  6. A Comparison of Different Database Technologies for the CMS AsyncStageOut Transfer Database

    Energy Technology Data Exchange (ETDEWEB)

    Ciangottini, D. [INFN, Perugia; Balcas, J. [Caltech; Mascheroni, M. [Fermilab; Rupeika, E. A. [Vilnius U.; Vaandering, E. [Fermilab; Riahi, H. [CERN; Silva, J. M.D. [Sao Paulo, IFT; Hernandez, J. M. [Madrid, CIEMAT; Belforte, S. [INFN, Trieste; Ivanov, T. T. [Sofiya U.

    2017-11-22

    AsyncStageOut (ASO) is the component of the CMS distributed data analysis system (CRAB) that manages users' transfers in a centrally controlled way using the File Transfer System (FTS3) at CERN. It addresses a major weakness of the previous, decentralized model, namely that the transfer of the user's output data to a single remote site was part of the job execution, resulting in inefficient use of job slots and an unacceptable failure rate. Currently ASO manages up to 600k files of various sizes per day from more than 500 users per month, spread over more than 100 sites. ASO uses a NoSQL database (CouchDB) for internal bookkeeping and as a way to communicate with other CRAB components. Since ASO/CRAB were put in production in 2014, the number of transfers has constantly increased, up to a point where the pressure on the central CouchDB instance became critical, creating new challenges for the system's scalability, performance, and monitoring. This forced a re-engineering of the ASO application to increase its scalability and lower its operational effort. In this contribution we present a comparison of the performance of the current NoSQL implementation and a new SQL implementation, and how their different strengths and features influenced the design choices and operational experience. We also discuss other architectural changes introduced in the system to handle the increasing load and latency in delivering output to the user.

  7. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  8. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.
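
    The decorrelated colour space mentioned above is the kind of reversible transform sketched below (illustrative only; the abstract does not specify this exact transform). RGB to YCoCg-R concentrates most of the signal energy in the luma channel using only integer additions and shifts, which is what makes very low bitrates achievable:

```python
# Reversible integer colour decorrelation (YCoCg-R), shown as an example
# of the decorrelation step such codecs rely on.

def rgb_to_ycocg(r, g, b):
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

pixel = (200, 120, 40)
assert ycocg_to_rgb(*rgb_to_ycocg(*pixel)) == pixel  # lossless round trip
print(rgb_to_ycocg(*pixel))  # → (120, 160, 0)
```

    After decorrelation, the chroma planes (Co, Cg) are mostly near-constant for natural images and compress far better than raw RGB channels.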

  9. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.; Schneider, J.; Hansen, A.; Lee, M.; Turney, S. G.; Faulkner-Jones, B. E.; Hecht, J. L.; Najarian, R.; Yee, E.; Lichtman, J. W.; Pfister, H.

    2013-01-01

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  10. Development of a national neutron database for nuclear technology

    International Nuclear Information System (INIS)

    Ignatyuk, A.V.; Kononov, V.N.; Kuzminov, B.D.; Manokhin, V.N.; Nikolaev, M.N.; Furzov, B.I.

    1997-01-01

    This paper describes the stages of many years of activity at the IPPE, consisting of the measurement, theoretical description and evaluation of neutron data, and of the establishment of a national data bank of neutron data for nuclear technology. A list of libraries which are stored at the Nuclear Data Centre is given. (author). 16 refs, 14 tabs

  11. Using XML technology for the ontology-based semantic integration of life science databases.

    Science.gov (United States)

    Philippi, Stephan; Köhler, Jacob

    2004-06-01

    Several hundred life science databases with constantly growing contents and varying areas of specialization are publicly accessible via the internet. Database integration, consequently, is a fundamental prerequisite for being able to answer complex biological questions. Due to the presence of syntactic, schematic, and semantic heterogeneities, large-scale database integration at present takes considerable effort. As there is growing acceptance of the extensible markup language (XML) as a means for data exchange in the life sciences, this article focuses on the impact of XML technology on database integration in this area. In detail, a general architecture for ontology-driven data integration based on XML technology is introduced, which overcomes some of the traditional problems in this area. As a proof of concept, a prototypical implementation of this architecture based on a native XML database and an expert system shell is described for the realization of a real-world integration scenario.
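
    The ontology-driven integration idea can be illustrated with a small sketch: two hypothetical XML records use different tag names for the same concepts, and an ontology mapping normalizes both into one shared vocabulary (the tag names, mapping, and data below are invented, not the paper's architecture):

```python
import xml.etree.ElementTree as ET

# Two hypothetical source databases exporting the same fact with
# different element names (schematic heterogeneity).
DB_A = "<entries><protein><name>P53</name><organism>human</organism></protein></entries>"
DB_B = "<records><prot><id>P53</id><species>human</species></prot></records>"

ONTOLOGY = {  # local tag -> shared ontology concept
    "name": "protein_name", "id": "protein_name",
    "organism": "source_organism", "species": "source_organism",
}

def integrate(xml_text, record_tag):
    """Map each record's local tags onto the shared vocabulary."""
    root = ET.fromstring(xml_text)
    for rec in root.iter(record_tag):
        yield {ONTOLOGY[child.tag]: child.text for child in rec}

merged = list(integrate(DB_A, "protein")) + list(integrate(DB_B, "prot"))
print(merged)  # both records normalize to the same dict
```
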

  12. Ceramic Technology Project database: September 1993 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1994-01-01

    Data presented in this report represent an intense effort to improve processing methods, testing methods, and general mechanical properties of candidate ceramics for use in advanced heat engines. Materials discussed include GN-10, GS-44, GTE PY6, NT-154, NT-164, sintered-reaction-bonded silicon nitrides, silicon nitride combined with rare-earth oxides, NT-230, Hexoloy SX-G1, Dow Corning's {beta}-Si{sub 3}N{sub 4}, and a few whisker-reinforced ceramic composites. Information in this report was taken from the project's semiannual and bimonthly progress reports and from final reports summarizing the results of individual studies. Test results are presented in tabular form and in graphs. All data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make available the test results from these studies, not to draw conclusions from those data.

  13. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without compromising the autonomy of the underlying databases.

  14. Full-scope nuclear training simulator -brought to the desktop

    International Nuclear Information System (INIS)

    LaPointe, D.J.; Manz, A.; Hall, G.S.

    1997-01-01

    RighTSTEP is a suite of simulation software initially designed to facilitate the upgrade of Ontario Hydro's full-scope simulators, but it is also adaptable to a variety of other roles. It is presently being commissioned at the Bruce A Training Simulator and has seen preliminary use in desktop and classroom roles. Because of the flexibility of the system, we anticipate it will see common use in the corporation for full-scope simulation roles. A key reason for developing RighTSTEP (Real Time Simulator Technology Extensible and Portable) was the need to modernize and upgrade the full-scope training simulator while protecting the investment in modelling code. This modelling code represents the end product of 18 years of evolution from the beginning of its development in 1979. Bringing this modelling code to a modern and more useful framework - the combination of simulator host, operating system, and simulator operating system - could also provide many spin-off benefits. The development (and first implementation) of the RighTSTEP system was credited with saving the corporation $5.6M and was recognized by a corporate New Technology Award last year. The most important spin-off from this project has been the desktop version of the full-scope simulator. The desktop simulator uses essentially the same software as its full-scope counterpart, and may be used for a variety of new purposes. Classroom and individual simulator training can now be easily accommodated, since a desktop simulator is both affordable and relatively easy to use. Further, a wide group of people can be trained using the desktop simulator; by contrast, the full-scope simulators were almost exclusively devoted to front-line operating staff. The desktop is finding increasing use in support of engineering applications, resulting from its easy accessibility, breadth of station systems represented, and tools for analysis and viewing.
As further plant models are made available on the new simulator platform and

  15. VATE: VAlidation of high TEchnology based on large database analysis by learning machine

    NARCIS (Netherlands)

    Meldolesi, E; Van Soest, J; Alitto, A R; Autorino, R; Dinapoli, N; Dekker, A; Gambacorta, M A; Gatta, R; Tagliaferri, L; Damiani, A; Valentini, V

    2014-01-01

    The interaction between the implementation of new technologies and different outcomes can allow a broad range of research to be expanded. The purpose of this paper is to introduce the VAlidation of high TEchnology based on large database analysis by learning machine (VATE) project that aims to combine

  16. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes efficient updating of geo-referenced population data possible. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959
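
    The core of a population spatialization model of this kind is redistributing a census total over grid cells in proportion to land-use weights, so that cell populations sum back to the census total. An illustrative sketch (the classes and weights are invented, not the PSM's actual parameters):

```python
# Redistribute a county's census population onto 1 km x 1 km grid cells
# in proportion to invented land-use weights.

LANDUSE_WEIGHT = {"urban": 10.0, "cropland": 2.0, "forest": 0.2, "water": 0.0}

def spatialize(total_population, cells):
    """cells: list of land-use classes, one entry per grid cell."""
    weights = [LANDUSE_WEIGHT[c] for c in cells]
    total_w = sum(weights)
    # Mass-preserving: the returned values sum to total_population.
    return [total_population * w / total_w for w in weights]

cells = ["urban", "urban", "cropland", "forest", "water"]
pops = spatialize(22200, cells)
print([round(p, 1) for p in pops])  # → [10000.0, 10000.0, 2000.0, 200.0, 0.0]
```
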

  17. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes efficient updating of geo-referenced population data possible. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.

  18. Design Options for a Desktop Publishing Course.

    Science.gov (United States)

    Mayer, Kenneth R.; Nelson, Sandra J.

    1992-01-01

    Offers recommendations for development of an undergraduate desktop publishing course. Discusses scholastic level and prerequisites, purpose and objectives, instructional resources and methodology, assignments and evaluation, and a general course outline. (SR)

  19. A VM-shared desktop virtualization system based on OpenStack

    Science.gov (United States)

    Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie

    2018-04-01

    With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform weakly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. We modified the connection process and the display data transmission process of the remote display protocol SPICE to support the VM-shared function. On the other hand, we propose a server-push display mode to improve the user's interactive experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.

  20. VMware Horizon 6 desktop virtualization solutions

    CERN Document Server

    Cartwright, Ryan; Langone, Jason; Leibovici, Andre

    2014-01-01

    If you are a desktop architect, solution provider, end-user consultant, virtualization engineer, or anyone who wants to learn how to plan and design the implementation of a virtual desktop solution based on Horizon 6, then this book is for you. An understanding of VMware vSphere fundamentals, coupled with experience in the installation or administration of a VMware environment, would be a plus.

  1. Desktop Publishing: A Brave New World and Publishing from the Desktop.

    Science.gov (United States)

    Lormand, Robert; Rowe, Jane J.

    1988-01-01

    The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…

  2. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    Science.gov (United States)

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  3. Proposal for Implementing Multi-User Database (MUD) Technology in an Academic Library.

    Science.gov (United States)

    Filby, A. M. Iliana

    1996-01-01

    Explores the use of MOO (multi-user object oriented) virtual environments in academic libraries to enhance reference services. Highlights include the development of multi-user database (MUD) technology from gaming to non-recreational settings; programming issues; collaborative MOOs; MOOs as distinguished from other types of virtual reality; audio…

  4. A Database for Reviewing and Selecting Radioactive Waste Treatment Technologies and Vendors

    International Nuclear Information System (INIS)

    P. C. Marushia; W. E. Schwinkendorf

    1999-01-01

    Several attempts have been made in past years to collate and present waste management technologies and solutions to waste generators. These efforts have been manifested as reports, buyers' guides, and databases. While this information is helpful at the time it is assembled, the principal weakness is maintaining the timeliness and accuracy of the information over time. In many cases, updates have to be published or developed as soon as the product is disseminated. The recently developed National Low-Level Waste Management Program's Technologies Database is a vendor-updated, Internet-based database designed to overcome this problem. The National Low-Level Waste Management Program's Technologies Database contains information about waste types, treatment technologies, and vendor information. Information is presented about waste types, typical treatments, and the vendors who provide those treatment methods. The vendors who provide services update their own contact information, their treatment processes, and the types of wastes for which their treatment process is applicable. This information is queryable by a generator of low-level or mixed low-level radioactive waste who is seeking information on waste treatment methods and the vendors who provide them. Timeliness of the information in the database is assured using time clocks and automated messaging to remind featured vendors to keep their information current. Failure to keep the entries current results in a vendor being warned and then ultimately dropped from the database. This assures that the user is dealing with the most current information available and with vendors who are active in reaching and serving their market.
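
    The time-clock mechanism described above can be sketched as a simple staleness check (the grace periods and vendor names below are invented for illustration):

```python
from datetime import date, timedelta

# Entries not updated within a grace period are first warned, then dropped.
WARN_AFTER = timedelta(days=180)
DROP_AFTER = timedelta(days=270)

def review_vendors(vendors, today):
    """Map each vendor (name -> last update date) to an action."""
    actions = {}
    for name, last_update in vendors.items():
        age = today - last_update
        if age > DROP_AFTER:
            actions[name] = "drop"
        elif age > WARN_AFTER:
            actions[name] = "warn"
        else:
            actions[name] = "ok"
    return actions

vendors = {"AcmeRad": date(2022, 11, 1),
           "NukeSafe": date(2023, 3, 1),
           "WasteCo": date(2023, 9, 1)}
print(review_vendors(vendors, date(2023, 10, 1)))
# → {'AcmeRad': 'drop', 'NukeSafe': 'warn', 'WasteCo': 'ok'}
```
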

  5. Advanced technologies for scalable ATLAS conditions database access on the grid

    International Nuclear Information System (INIS)

    Basset, R; Canali, L; Girone, M; Hawkings, R; Valassi, A; Viegas, F; Dimitrov, G; Nevski, P; Vaniachine, A; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions Db data access is limited by the disk I/O throughput. An unacceptable side-effect of the disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions Db data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach for database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library sends to the database server a pilot query first.
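
    The pilot-query idea for peak-load avoidance can be sketched as follows (the probe, thresholds, and backoff policy are assumptions for illustration, not the actual ATLAS utility library): a cheap probe measures server responsiveness, and the expensive query is only sent once the probe answers quickly.

```python
import random
import time

def pilot_latency():
    """Stand-in for a trivial probe query's round-trip time, in seconds;
    occasionally the (simulated) server is overloaded."""
    return random.choice([0.02, 0.05, 2.0])

def query_with_backoff(run_query, threshold=0.5, max_wait=8.0):
    wait = 0.5
    while pilot_latency() > threshold:   # server busy: do not send real query
        time.sleep(min(wait, 0.01))      # sleep shortened for the demo
        wait = min(wait * 2, max_wait)   # exponential backoff between probes
    return run_query()                   # server responsive: run real query

result = query_with_backoff(lambda: "conditions payload")
print(result)  # → conditions payload
```

    The effect is that clients self-throttle during peak load instead of piling expensive queries onto an already-saturated server.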

  6. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Directory of Open Access Journals (Sweden)

    Leonardo Bastos Pontes

    2017-01-01

    This paper is situated in the field of information technology service management, draws on ideas from information technology governance, and proposes a hybrid model to manage the services of a database, based on the principles of information technology service management, in a supplementary health operator. This approach utilizes fundamental nuances of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services; it studies Availability and Continuity Management together, as most of these guides do. This work is important because it keeps the database flowing well and improves the agility of the systems used by the clinics accredited under the health plan.

  7. The Neotoma Paleoecology Database

    Science.gov (United States)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community.
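
    Consuming such web services page by page is what lets client applications avoid downloading the entire database. A sketch against a stubbed service (the endpoint shape, field names, and paging parameters are illustrative assumptions, not the real Neotoma API):

```python
# Paged retrieval: keep requesting fixed-size pages until a short page
# signals the end of the data.

def fetch_page(offset, limit):
    """Stub standing in for an HTTP GET like /data/sites?offset=..&limit=.."""
    SITES = [{"siteid": i, "sitename": f"site-{i}"} for i in range(1, 8)]
    return SITES[offset:offset + limit]

def fetch_all(limit=3):
    records, offset = [], 0
    while True:
        page = fetch_page(offset, limit)
        records.extend(page)
        if len(page) < limit:   # short page: no more data
            break
        offset += limit
    return records

print(len(fetch_all()))  # → 7
```
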

  8. Desktop Virtualization in Action: Simplicity Is Power

    Science.gov (United States)

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  9. Desktop publishing: a useful tool for scientists.

    Science.gov (United States)

    Lindroth, J R; Cooper, G; Kent, R L

    1994-01-01

    Desktop publishing offers features that are not available in word processing programs. The process yields an impressive and professional-looking document that is legible and attractive. It is a simple but effective tool to enhance the quality and appearance of your work and perhaps also increase your productivity.

  10. Desktop Publishing: Things Gutenberg Never Taught You.

    Science.gov (United States)

    Bowman, Joel P.; Renshaw, Debbie A.

    1989-01-01

    Provides a desktop publishing (DTP) overview, including: advantages and disadvantages; hardware and software requirements; and future development. Discusses cost-effectiveness, confidentiality, credibility, effects on volume of paper-based communication, and the need for training in layout and design which DTP creates. Includes a glossary of DTP…

  11. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  12. Basics of Desktop Publishing. Teacher Edition.

    Science.gov (United States)

    Beeby, Ellen

    This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…

  13. Aquatic Habitats: Exploring Desktop Ponds. Teacher's Guide.

    Science.gov (United States)

    Barrett, Katharine; Willard, Carolyn

    This book, for grades 2-6, is designed to provide students with a highly motivating and unique opportunity to investigate an aquatic habitat. Students set up, observe, study, and reflect upon their own "desktop ponds." Accessible plants and small animals used in these activities include Elodea, Tubifex worms, snails, mosquito larvae, and fish.…

  14. Cubby : Multiscreen Desktop VR Part III

    NARCIS (Netherlands)

    Djajadiningrat, J.P.; Gribnau, M.W.

    2000-01-01

    In this month's final episode of our 'Cubby: Multiscreen Desktop VR' trilogy we explain how you read the InputSprocket driver from part II, how you use it as input for the cameras from part I and how you calibrate the input device so that it leads to the correct head position.

  15. Cubby : Multiscreen Desktop VR Part II

    NARCIS (Netherlands)

    Gribnau, M.W.; Djajadiningrat, J.P.

    2000-01-01

In this second part of our 'Cubby: Multiscreen Desktop VR' trilogy, we will introduce you to the art of creating a driver to read an Origin Instruments Dynasight input device. With the Dynasight, the position of the head of the user is established so that Cubby can display the correct images on its…

  16. Semantic document architecture for desktop data integration and management

    OpenAIRE

    Nesic, Sasa; Jazayeri, Mehdi

    2011-01-01

    Over the last decade, personal desktops have faced the problem of information overload due to increasing computational power, easy access to the Web and cheap data storage. Moreover, an increasing number of diverse end-user desktop applications have led to the problem of information fragmentation. Each desktop application has its own data, unaware of related and relevant data in other applications. In other words, personal desktops face a lack of interoperability of data managed by differ...

  17. The Virtual Desktop: Options and Challenges in Selecting a Secure Desktop Infrastructure Based on Virtualization

    Science.gov (United States)

    2011-10-01

    the virtual desktop environment still functions for the users associated with it. Users can access the virtual desktop through the local network and... desktop virtualization technology can help meet the need for secure information sharing within the MDN. The... virtualization. It includes an overview of desktop virtualization, including an in-depth examination of two different architectures: the…

  18. Database use and technology in Japan: JTEC panel report. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Wiederhold, G.; Beech, D.; Bourne, C.; Farmer, N.; Jajodia, Sushil; Kahaner, D.; Minoura, Toshi; Smith, D.; Smith, J.M.

    1992-04-01

    This report presents the findings of a group of database experts, sponsored by the Japanese Technology Evaluation Center (JTEC), based on an intensive study trip to Japan during March 1991. Academic, industrial, and governmental sites were visited. The primary findings are that Japan is supporting its academic research establishment poorly, that industry is making progress in key areas, and that both academic and industrial researchers are well aware of current domestic and foreign technology. Information sharing between industry and academia is effectively supported by governmental sponsorship of joint planning and review activities, and enhances technology transfer. In two key areas, multimedia and object-oriented databases, the authors can expect to see future export of Japanese database products, typically integrated into larger systems. Support for academic research is relatively modest. Nevertheless, the senior faculty are well-known and respected, and communicate frequently and in depth with each other, with government agencies, and with industry. In 1988 there were a total of 1,717 Ph.D.s in engineering and 881 in science. It appears that only about 30 of these were academic Ph.D.s in the basic computer sciences.

  19. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    Science.gov (United States)

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
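The hub role of ortholog information described above can be sketched with plain subject-predicate-object triples; all identifiers and predicate names below are hypothetical stand-ins, not actual OrthO or MBGD terms, and a real deployment would use an RDF store queried via SPARQL.

```python
# Minimal sketch: an ortholog cluster links genes of different
# organisms, so annotations from separate sources can be joined
# through it. All IRIs/names here are invented for illustration.
triples = [
    ("cluster:1", "orthO:member", "gene:ecoli_dnaA"),
    ("cluster:1", "orthO:member", "gene:bsub_dnaA"),
    # annotations contributed by different data sources
    ("gene:ecoli_dnaA", "go:function", "DNA replication initiation"),
    ("gene:bsub_dnaA", "taxon:organism", "Bacillus subtilis"),
]

def members(cluster):
    """Genes grouped into one ortholog cluster (the hub)."""
    return [o for s, p, o in triples
            if s == cluster and p == "orthO:member"]

def annotations(gene):
    """All annotations attached to a gene, from any source."""
    return [(p, o) for s, p, o in triples
            if s == gene and p != "orthO:member"]

# Integrate annotations across organisms via the ortholog hub:
integrated = {g: annotations(g) for g in members("cluster:1")}
```

The dictionary comprehension is the "hub join": taxonomy and Gene Ontology facts about different organisms meet only because both genes belong to the same ortholog cluster.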

  20. High energy nuclear database: a test-bed for nuclear data information technology

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.; Beck, B.; Pruet, J.; Vogt, R.

    2008-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a test-bed for the further development of an object-oriented nuclear data format and database system. By using 'off-the-shelf' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a 'Grand Unified Nuclear Format' encapsulating data types used in the ENSDF, ENDF/B, EXFOR, NSR and other formats, including processed data formats. (authors)

  1. High Energy Nuclear Database: A Testbed for Nuclear Data Information Technology

    International Nuclear Information System (INIS)

    Brown, D A; Vogt, R; Beck, B; Pruet, J

    2007-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a testbed for the further development of an object-oriented nuclear data format and database system. By using 'off-the-shelf' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a 'Grand Unified Nuclear Format' encapsulating data types used in the ENSDF, ENDF/B, EXFOR, NSR and other formats, including processed data formats.

  2. Neurocognitive sparing of desktop microbeam irradiation.

    Science.gov (United States)

    Bazyar, Soha; Inscoe, Christina R; Benefield, Thad; Zhang, Lei; Lu, Jianping; Zhou, Otto; Lee, Yueh Z

    2017-08-11

    Normal tissue toxicity is the dose-limiting side effect of radiotherapy. Spatial fractionation irradiation techniques, like microbeam radiotherapy (MRT), have shown promising results in sparing normal brain tissue. Most MRT studies have been conducted at synchrotron facilities. With the aim of making this promising treatment more available, we have built the first desktop image-guided MRT device based on carbon nanotube x-ray technology. In the current study, our purpose was to evaluate the effects of MRT on rodent normal brain tissue using our device and compare them with the effect of the integrated equivalent homogeneous dose. Twenty-four 8-week-old male C57BL/6 J mice were randomly assigned to three groups: MRT, broad-beam (BB) and sham. The hippocampal region was irradiated with two parallel microbeams in the MRT group (beam width = 300 μm, center-to-center = 900 μm, 160 kVp). The BB group received the equivalent integral dose in the same area of the brain. Rotarod, marble-burying and open-field activity tests were performed pre-irradiation and every month post-irradiation up to 8 months to evaluate cognitive changes and potential irradiation side effects on normal brain tissue. The open-field activity test was replaced by the Barnes maze test at the 8th month. A multilevel-model, random-coefficients approach was used to evaluate the longitudinal and temporal differences among treatment groups. We found significant differences between the BB group and the microbeam-treated and sham mice in the number of buried marbles and in the duration of locomotion around the open-field arena. The Barnes maze revealed that BB mice had a lower capacity for spatial learning than MRT and sham mice. Mice in the BB group tended to gain weight at a slower pace than shams. No meaningful differences were found between MRT and sham groups up to the 8-month follow-up using our measurements. Applying MRT with our newly developed prototype compact CNT-based image-guided MRT system…

  3. Virtual Reality on a Desktop Hailed as New Tool in Distance Education.

    Science.gov (United States)

    Young, Jeffrey R.

    2000-01-01

    Describes college and university educational applications of desktop virtual reality to provide a more human touch to interactive distance education programs and impress the brain with more vivid images. Critics suggest the technology is too costly and time consuming and may even distract students from the content of an online course. (DB)

  4. A Cross-Case Analysis of Gender Issues in Desktop Virtual Reality Learning Environments

    Science.gov (United States)

    Ausburn, Lynna J.; Martens, Jon; Washington, Andre; Steele, Debra; Washburn, Earlene

    2009-01-01

    This study examined gender-related issues in using new desktop virtual reality (VR) technology as a learning tool in career and technical education (CTE). Using relevant literature, theory, and cross-case analysis of data and findings, the study compared and analyzed the outcomes of two recent studies conducted by a research team at Oklahoma State…

  5. Use of Signaling to Integrate Desktop Virtual Reality and Online Learning Management Systems

    Science.gov (United States)

    Dodd, Bucky J.; Antonenko, Pavlo D.

    2012-01-01

    Desktop virtual reality is an emerging educational technology that offers many potential benefits for learners in online learning contexts; however, a limited body of research is available that connects current multimedia learning techniques with these new forms of media. Because most formal online learning is delivered using learning management…

  6. Calculation of Investments for the Distribution of GPON Technology in the village of Bishtazhin through database

    Directory of Open Access Journals (Sweden)

    MSc. Jusuf Qarkaxhija

    2013-12-01

    According to daily reports, income from internet services is declining each year. Landline phone services are running at a loss, mobile phone services have become commonplace, and the only bright spot keeping cable operators (ISPs) in positive balance is income from broadband services (fast internet, IPTV). Broadband technology is a term covering multiple methods of distributing information over the internet at high speed. Some of the broadband technologies are: optical fiber, coaxial cable, DSL, wireless, mobile broadband, and satellite connection. The ultimate goal of any broadband service provider is to deliver voice, data and video through a single network, a so-called triple-play service. Internet distribution remains an important issue in Kosovo, particularly in rural zones. Considering the immense development of technologies and the different alternatives we may face, the goal of this paper is to emphasize the necessity of forecasting such an investment and to share experience in this area. Because this investment involves many factors related to population, geography and several technologies, and because these factors change continuously, the best approach is to store all the data in a database and use that database to produce different results. This database replaces the previous manual calculations with an automatic calculation procedure. This way of working improves the work style, providing all the tools needed to take the right decision about an internet investment, considering all aspects of that investment.
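The replacement of manual calculations with database queries described above can be sketched with an in-memory SQLite table; the schema, zone names, and cost figures here are hypothetical illustrations, not data from the paper.

```python
import sqlite3

# Hypothetical schema: one row per village zone, with household counts
# and an assumed per-household deployment cost. Changing any row
# automatically changes the computed investment total.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE zone (
    name TEXT, households INTEGER, cost_per_household REAL)""")
con.executemany("INSERT INTO zone VALUES (?, ?, ?)", [
    ("zone A", 120, 85.0),   # illustrative figures only
    ("zone B", 80, 110.0),
])

# Total investment computed by the database instead of by hand:
(total,) = con.execute(
    "SELECT SUM(households * cost_per_household) FROM zone").fetchone()
```

Re-running the query after updating population or cost data recalculates the investment with no manual arithmetic, which is the workflow the abstract advocates.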

  7. SAMP: Application Messaging for Desktop and Web Applications

    Science.gov (United States)

    Taylor, M. B.; Boch, T.; Fay, J.; Fitzpatrick, M.; Paioro, L.

    2012-09-01

    SAMP, the Simple Application Messaging Protocol, is a technology which allows tools to communicate. It is deployed in a number of desktop astronomy applications including ds9, Aladin, TOPCAT, World Wide Telescope and numerous others, and makes it straightforward for a user to treat a selection of these tools as a loosely-integrated suite, combining the most powerful features of each. It has been widely used within Virtual Observatory contexts, but is equally suitable for non-VO use. Enabling SAMP communication from web-based content has long been desirable. An obvious use case is arranging for a click on a web page link to deliver an image, table or spectrum to a desktop viewer, but more sophisticated two-way interaction with rich internet applications would also be possible. Use from the web however presents some problems related to browser sandboxing. We explain how the SAMP Web Profile, introduced in version 1.3 of the SAMP protocol, addresses these issues, and discuss the resulting security implications.
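A SAMP interaction such as the "click a link, table appears in a desktop viewer" use case above boils down to sending a typed message; the sketch below builds a `table.load.votable` message as a plain mapping, with a placeholder URL. Actually delivering it requires a running hub and a client library (for example astropy's SAMP client), which is not shown here.

```python
# Sketch of a SAMP message for the standard "table.load.votable"
# MType. The URL and names are placeholders; a hub connection is
# needed to send it (e.g. via astropy.samp), omitted here.
message = {
    "samp.mtype": "table.load.votable",
    "samp.params": {
        "url": "http://example.org/catalogue.vot",  # hypothetical
        "table-id": "catalogue-1",
        "name": "Example catalogue",
    },
}
```

A desktop client subscribed to `table.load.votable` (TOPCAT, Aladin, etc.) would fetch the URL and display the table, regardless of which tool originated the message; that decoupling is what lets the tools act as a loosely-integrated suite.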

  8. Multimedia architectures: from desktop systems to portable appliances

    Science.gov (United States)

    Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.

    1997-01-01

    Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.

  9. Desktop war - data suppliers competing for bigger market share

    International Nuclear Information System (INIS)

    Sword, M.

    1999-01-01

    The intense competition among suppliers of computerized data and computer software to the petroleum and natural gas industry in western Canada is discussed. It is estimated that the Canadian oil patch spends about $400 million annually on geoscience information and related costs, and industry is looking for ways to significantly reduce those costs. There is a need for integrated, desktop-driven data sets. Sensing the determination of industry to reduce information acquisition costs, data providers are responding with major consolidation of data sets. The major evolution in the industry is on-line access to increase the speed of information delivery. Data vendors continue to integrate land, well, log, production and other data sets, whether public or proprietary. The result is stronger foundations as platforms for interpretive software. Another development is the rise of the Internet and intranets and the redefinition of the role of information technology departments in the industry, as both of these are paving the way for electronic delivery of information and software tools to the desktop. Development of proprietary data sets and acquisition of competitors with complementary data sets that enhance products and services are just some of the ways data vendors are trying to win a bigger piece of the exploration and development pie.

  10. MELCOR/VISOR PWR desktop simulator

    International Nuclear Information System (INIS)

    With, Anka de; Wakker, Pieter

    2010-01-01

    Increasingly, there is a need for a learning support and training tool for nuclear engineers, utilities and students in order to broaden their understanding of advanced nuclear plant characteristics, dynamics, transients and safety features. Nuclear system analysis codes like ASTEC, RELAP5, RETRAN and MELCOR provide calculation results, and visualization tools can be used to graphically represent these results. However, for efficient education and training a more interactive tool such as a simulator is needed. The simulator connects the graphical tool with the calculation tool in an interactive manner. A small number of desktop simulators exist [1-3]. The existing simulators are capable of representing different types of power plants and various accident conditions. However, they were found to be too general to be used as a reliable plant-specific accident analysis or training tool. A desktop simulator of the Pressurized Water Reactor (PWR) has been created under contract to the Dutch nuclear regulatory body (KFD). The desktop simulator is a software package that provides a close-to-real simulation of the Dutch nuclear power plant Borssele (KCB) and is used for training of the accident response. The simulator includes the majority of the power plant systems necessary for the successful simulation of the KCB plant during normal operation, malfunctions and accident situations, and it has been successfully validated against the results of the safety evaluations from the KCB safety report. (orig.)

  11. The desktop interface in intelligent tutoring systems

    Science.gov (United States)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  12. Rhinoplasty perioperative database using a personal digital assistant.

    Science.gov (United States)

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  13. Migrating to the Cloud IT Application, Database, and Infrastructure Innovation and Consolidation

    CERN Document Server

    Laszewski, Tom

    2011-01-01

    Whether your company is planning database migration, desktop application migration, or IT infrastructure consolidation projects, this book gives you all the resources you'll need. It gives you recommendations on tools, strategy and best practices and serves as a guide as you plan, determine effort and budget, design, execute and roll your modern Oracle system out to production. Focusing on Oracle grid relational database technology and Oracle Fusion Middleware as the target cloud-based architecture, your company can gain organizational efficiency and agility, increase innovation and reduce…

  14. Research on Construction of Road Network Database Based on Video Retrieval Technology

    Directory of Open Access Journals (Sweden)

    Wang Fengling

    2017-01-01

    Based on the characteristics and basic structure of video databases and on several typical video data models, a segmentation-based multi-level data model is used to describe the landscape-information video database, the road-network database model and the road-network management database system. The detailed design and implementation of the landscape information management system are also described.

  15. Evaluating virtual hosted desktops for graphics-intensive astronomy

    Science.gov (United States)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

    Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware and opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  16. Exploring Graphic Design. A Short Course in Desktop Publishing.

    Science.gov (United States)

    Stanley, MLG

    This course in desktop publishing contains seven illustrated modules designed to meet the following objectives: (1) use a desktop publishing program to explore advanced topics in graphic design; (2) learn about typography and how to make design decisions on the use of typestyles; (3) learn basic principles in graphic communications and apply them…

  17. Desktop Publishing: A Powerful Tool for Advanced Composition Courses.

    Science.gov (United States)

    Sullivan, Patricia

    1988-01-01

    Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)

  18. Desktop Publishing in a PC-Based Environment.

    Science.gov (United States)

    Sims, Harold A.

    1987-01-01

    Identifies, considers, and interrelates the functionality of hardware, firmware, and software types; discusses the relationship of input and output devices in the PC-based desktop publishing environment; and reports some of what has been experienced in three years of working intensively in/with desktop publishing devices and solutions. (MES)

  19. A NICE approach to managing large numbers of desktop PC's

    International Nuclear Information System (INIS)

    Foster, David

    1996-01-01

    The problems of managing desktop systems are far from resolved as we deploy increasing numbers of systems: PCs, Macintoshes and UN*X workstations. This paper will concentrate on the solution adopted at CERN for the management of the rapidly increasing numbers of desktop PCs in use in all parts of the laboratory. (author)

  20. Database created with the operation of environmental monitoring program from the Nuclear Technology Development Center (CDTN) - Brazilian CNEN

    International Nuclear Information System (INIS)

    Peixoto, C.M.

    1995-01-01

    The environmental control of the Nuclear Technology Development Center (CDTN - Brazilian CNEN) is carried out through a Program of Environmental Monitoring (PMA), which has been in operation since 1985. To register all the analytical results of the various samples, a database was created. In this work, the database structure as well as the information used in the evaluation of the results obtained from the operation of the above-mentioned PMA are presented. (author). 5 refs, 1 fig, 3 tabs

  1. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  2. Experimental Setup for Ultrasonic-Assisted Desktop Fused Deposition Modeling System

    OpenAIRE

    Maidin, S.; Muhamad, M. K.; Pei, Eujin

    2014-01-01

    Fused deposition modeling (FDM) is an additive manufacturing (AM) process that has been used in various manufacturing fields. However, a drawback of FDM is the poor surface finish of the parts produced, which leads to surface roughness and requires hand finishing. In this study, ultrasonic technology is integrated into a desktop FDM system. Ultrasound has been applied in various conventional machining processes and shows good machined surface finish. However, very little research regarding the applic...

  3. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflows, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysi...

  4. Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.

    Science.gov (United States)

    Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles

    2017-04-01

    The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The java code files developed for this work are available from the following url: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
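
    The JSON-to-SIF conversion step can be illustrated with a minimal sketch; the record shape and the metabolite and reaction names below are hypothetical, and the project's actual parser is written in Java:

```python
import json

# Hypothetical shape of a Neo4j query result: a JSON array of
# (source, interaction, target) records.
raw = json.dumps([
    {"source": "glc_D[c]", "interaction": "substrate_of", "target": "HEX1"},
    {"source": "HEX1", "interaction": "produces", "target": "g6p[c]"},
])

def json_to_sif(payload):
    # SIF encodes one interaction per line: source <TAB> relation <TAB> target.
    return [f'{r["source"]}\t{r["interaction"]}\t{r["target"]}'
            for r in json.loads(payload)]

sif_lines = json_to_sif(raw)
```

    Writing the resulting lines to a `.sif` file yields a subnetwork that tools such as network viewers can load directly.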

  5. LCCP Desktop Application v1.0 Engineering Reference

    Energy Technology Data Exchange (ETDEWEB)

    Beshr, Mohamed [Univ. of Maryland, College Park, MD (United States); Aute, Vikrant [Univ. of Maryland, College Park, MD (United States)

    2014-04-01

    This Life Cycle Climate Performance (LCCP) Desktop Application Engineering Reference is divided into three parts. The first part of the guide, consisting of the LCCP objective, literature review, and mathematical background, is presented in Sections 2-4. The second part of the guide (given in Sections 5-10) provides a description of the input data required by the LCCP desktop application, including each of the input pages (Application Information, Load Information, and Simulation Information) and details for interfacing the LCCP Desktop Application with the VapCyc and EnergyPlus simulation programs. The third part of the guide (given in Section 11) describes the various interfaces of the LCCP code.

  6. Analytical Hierarchy Process for the selection of strategic alternatives for the introduction of virtual desktop infrastructure in the university

    Directory of Open Access Journals (Sweden)

    Katerina A. Makoviy

    2017-12-01

    Full Text Available The task of choosing a strategy for implementing virtual desktop infrastructure in the IT infrastructure of the university is considered. Virtual desktop infrastructure is a technology that provides centralized management of client workstations and increases the service life of computers in classrooms. An analysis of the strengths and weaknesses of, and the threats and opportunities presented by, introducing virtualization in the university is carried out. Alternatives for implementation have been developed based on the results of a pilot project. To obtain quantitative estimates in the SWOT analysis of the pilot project, the analytical hierarchy process is used. The experts' analysis of the pilot project is carried out, and an integral value of the quantitative estimates of the various alternatives is generated. The combination of the analytical hierarchy process and SWOT analysis allows the optimal strategy for implementing desktop virtualization to be chosen.
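
    The analytical-hierarchy-process step can be sketched as follows; the pairwise comparison matrix is illustrative, not the experts' actual judgements from the study:

```python
import math

# Hypothetical 3x3 pairwise comparison matrix (Saaty's 1-9 scale) for
# three rollout alternatives; the values are invented for illustration.
M = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

def ahp_priorities(matrix):
    # Approximate the AHP priority vector by the normalized row
    # geometric mean, a standard stand-in for the principal eigenvector.
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

weights = ahp_priorities(M)  # one weight per alternative, summing to 1
```

    The alternative with the largest weight is the one the combined SWOT/AHP procedure would recommend.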

  7. A Database for Decision-Making in Training and Distributed Learning Technology

    National Research Council Canada - National Science Library

    Stouffer, Virginia

    1998-01-01

    .... A framework for incorporating data about distributed learning courseware into the existing training database was devised and a plan for a national electronic courseware redistribution network was recommended...

  8. A Five-Year Hedonic Price Breakdown for Desktop Personal Computer Attributes in Brazil

    Directory of Open Access Journals (Sweden)

    Nuno Manoel Martins Dias Fouto

    2009-07-01

    Full Text Available The purpose of this article is to identify the attributes that discriminate the prices of desktop personal computers. We employ the hedonic price method in evaluating such characteristics. This approach allows market prices to be expressed as a function of the set of attributes present in the products and services offered. Prices and characteristics of up to 3,779 desktop personal computers offered in the IT pages of one of the main Brazilian newspapers were collected from January 2003 to December 2007. Several specifications for the hedonic (multivariate linear) regression were tested. In this particular study, the main attributes were found to be hard drive capacity, screen technology, main board brand, random access memory size, microprocessor brand, video board memory, digital video and compact disk recording devices, screen size and microprocessor speed. These results highlight the novel contribution of this study: the manner and means by which hedonic price indexes may be estimated in Brazil.

  9. Nuclear Plant Analyzer desktop workstation: An integrated interactive simulation, visualization and analysis tool

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1991-01-01

    The advanced, best-estimate, reactor thermal-hydraulic codes were originally developed as mainframe computer applications because of speed, precision, memory and mass storage requirements. However, the productivity of numerical reactor safety analysts has historically been hampered by mainframe dependence due to limited mainframe CPU allocation, accessibility and availability, poor mainframe job throughput, and delays in obtaining and difficulty comprehending printed numerical results. The Nuclear Plant Analyzer (NPA) was originally developed as a mainframe computer-graphics aid for reactor safety analysts in addressing the latter consideration. Rapid advances in microcomputer technology have since enabled the installation and execution of these reactor safety codes on desktop computers thereby eliminating mainframe dependence. The need for a complementary desktop graphics display generation and presentation capability, coupled with the need for software standardization and portability, has motivated the redesign of the NPA as a UNIX/X-Windows application suitable for both mainframe and microcomputer

  10. Survey on utilization of database for research and development of global environmental industry technology; Chikyu kankyo sangyo gijutsu kenkyu kaihatsu no tame no database nado no riyo ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    To optimize networks and database systems for the promotion of industry technology development contributing to the solution of the global environmental problem, studies are made on reusable information resources and their utilization methods. Reusable information resources include external databases and network systems for researchers' information exchange and for computer use. The external databases include commercial databases and academic databases. As commercial databases, 6 agents and 13 service systems are selected. As academic databases, there are NACSIS-IR and the databases connected with INTERNET in the U.S. These are used in connection with the UNIX academic research network called INTERNET. For connection with INTERNET, a commercial UNIX network service called IIJ, which starts service in April 1993, can be used. However, a personal computer communication network is used for the time being. 6 figs., 4 tabs.

  11. Mars Propellant Liquefaction Modeling in Thermal Desktop

    Science.gov (United States)

    Desai, Pooja; Hauser, Dan; Sutherlin, Steven

    2017-01-01

    NASA's current Mars architectures assume the production and storage of 23 tons of liquid oxygen on the surface of Mars over a duration of 500+ days. In order to do this in a mass-efficient manner, an energy-efficient refrigeration system will be required. Based on previous analysis, NASA has decided to do all liquefaction in the propulsion vehicle storage tanks. In order to allow for transient Martian environmental effects, a propellant liquefaction and storage system for a Mars Ascent Vehicle (MAV) was modeled using Thermal Desktop. The model consisted of a propellant tank containing a broad area cooling loop heat exchanger integrated with a reverse turbo Brayton cryocooler. Cryocooler sizing and performance modeling was conducted using MAV diurnal heat loads and radiator rejection temperatures predicted from a previous thermal model of the MAV. A system was also sized and modeled using an alternative heat rejection system that relies on a forced convection heat exchanger. Cryocooler mass, input power, and heat rejection for both systems were estimated and compared against sizing based on non-transient estimates.

  12. Desktop Publishing: The New Wave in Business Education.

    Science.gov (United States)

    Huprich, Violet M.

    1989-01-01

    Discusses the challenges of teaching desktop publishing (DTP); the industry is in flux with the software packages constantly being updated. Indicates that the demand for those with DTP skills is great. (JOW)

  13. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.; Burger, K.; Reichl, F.; Meneveau, C.; Szalay, A.; Westermann, R.

    2012-01-01

    is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence

  14. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    2006-07-03

    ... 'break points' such as ... An automated desktop procedure was developed for computing statistically defensible, multiple change ... from source to mouth ... the calculated value was less than the test statistic given in Owen.

  15. Post-Caesarean Section Surgical Site Infection Surveillance Using an Online Database and Mobile Phone Technology.

    Science.gov (United States)

    Castillo, Eliana; McIsaac, Corrine; MacDougall, Bhreagh; Wilson, Douglas; Kohr, Rosemary

    2017-08-01

    Obstetric surgical site infections (SSIs) are common and expensive to the health care system but remain underreported given shorter postoperative hospital stays and suboptimal post-discharge surveillance systems. SSIs, for the purpose of this paper, are defined according to the Centers for Disease Control and Prevention (1999) as infection occurring within 30 days of the operative procedure (in this case, Caesarean section [CS]). Demonstrate the feasibility of real-life use of a patient-driven SSI post-discharge surveillance system consisting of an online database and mobile phone technology (the surgical mobile app how2trak) among women undergoing CS in a Canadian urban centre. Estimate the rate of SSIs and associated predisposing factors. Prospective cohort of consecutive women delivering by CS at one urban Canadian hospital. Using the surgical mobile app how2trak, predetermined demographics, comorbidities, procedure characteristics, and self-reported symptoms and signs of infection were collected and linked to patients' incision self-portraits (photos) on postpartum days 3, 7, 10, and 30. A total of 105 patients were enrolled over a 5-month period. Mean age was 31 years, 13% were diabetic, and most were at low risk of surgical complications. Forty-six percent of surgeries were emergency CSs, and 104/105 received antibiotic prophylaxis. Forty-five percent of patients (47/105) submitted at least one photo, and among those, one surgical site infection was detected by photo appearance and self-reported symptoms by postpartum day 10. The majority of patients who uploaded photos did so multiple times, and 43% of them submitted photos up to day 30. Patients with either a diagnosis of diabetes or self-reported Asian ethnicity were less likely to submit photos. Post-discharge surveillance for CS-related SSIs using the surgical mobile app how2trak is feasible and deserves further study in the post-discharge setting. Copyright © 2017. Published by Elsevier Inc.

  16. Perception Analysis of Desktop and Mobile Service Website

    OpenAIRE

    Khoiriyah, Rizqiyatul

    2016-01-01

    The research was conducted as a qualitative study of websites to explore more deeply and examine user perception of desktop and mobile website services. This research reviewed user perception of desktop and mobile website services using qualitative methods adapted from the WebQual and User Experience approaches. This qualitative research referred to the theoretical reference written by Creswell (2014). The expected outcome is to know the user perceptions of the available ser...

  17. FORMED: Bringing Formal Methods to the Engineering Desktop

    Science.gov (United States)

    2016-02-01

    FORMED: Bringing Formal Methods to the Engineering Desktop. BAE Systems, February 2016. Final technical report, approved for public release. This report is published in the interest of scientific and technical information exchange, and its publication does not constitute the Government's... Contract number: FA8750-14-C-0024; grant number: N/A; program element number: 63781D.

  18. Microsoft Virtualization Master Microsoft Server, Desktop, Application, and Presentation Virtualization

    CERN Document Server

    Olzak, Thomas; Boomer, Jason; Keefer, Robert M

    2010-01-01

    Microsoft Virtualization helps you understand and implement the latest virtualization strategies available with Microsoft products. This book focuses on: Server Virtualization, Desktop Virtualization, Application Virtualization, and Presentation Virtualization. Whether you are managing Hyper-V, implementing desktop virtualization, or even migrating virtual machines, this book is packed with coverage on all aspects of these processes. Written by a talented team of Microsoft MVPs, Microsoft Virtualization is the leading resource for a full installation, migration, or integration of virtual syste

  19. Basic survey for promoting energy efficiency in developing countries. Database development project directory of energy conservation technology in Japan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    In order to promote energy conservation in developing countries, the gist of Japanese energy saving technologies was edited into a database. The Asian region is expected to see remarkable economic development and increased energy consumption, including that of fossil fuels. Therefore, this project of structuring a database has urgent importance for the Asian countries. Broad new discussions were held to revise the 1995 edition. The committee was composed of members from high energy consuming areas such as the iron and steel, paper and pulp, chemical, oil refining, cement, electric power, machinery, electric devices, and industrial machinery industries. Technical literature and reports were consulted, and opinions were heard from specialists and committee members representing the respective areas. In order to reflect the current status and particular conditions in specific industrial areas, additions were made under the assistance and guidance of the specialists. The energy saving technologies recorded in the database may be called small- to medium-scale technologies, with the target placed on saving energy by 10% or more. Small-scale energy saving technologies were omitted. Flow charts for manufacturing processes were also added. (NEDO)

  20. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  1. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    Science.gov (United States)

    Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-03-01

    3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become the tools for technological advancements. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side and capable of producing excellent quality products, and the latter being on the low-cost side with moderate quality results. However, there is more room for improvement and enhancement in the desktop systems than in the industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. The communication between the two extruders has been established by making use of the In-Circuit Serial Programming (ICSP) port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electric paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer can provide a new prospect for producing multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.

  2. Investigation of an artificial intelligence technology--Model trees. Novel applications for an immediate release tablet formulation database.

    Science.gov (United States)

    Shao, Q; Rowe, R C; York, P

    2007-06-01

    This study has investigated an artificial intelligence technology, model trees, as a modelling tool applied to an immediate release tablet formulation database. The modelling performance was compared with artificial neural networks, which have been well established and widely applied in the field of pharmaceutical product formulation. The predictive ability of the generated models was validated on unseen data and judged by the correlation coefficient R². Output from the model tree analyses produced multivariate linear equations which predicted tablet tensile strength, disintegration time, and drug dissolution profiles of similar quality to neural network models. However, additional and valuable knowledge hidden in the formulation database was extracted from these equations. It is concluded that, as a transparent technology, model trees are useful tools for formulators.

  3. Life cycle assessment study of a Chinese desktop personal computer.

    Science.gov (United States)

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

    Associated with the tremendous prosperity in the world electronic information and telecommunication industry, there continues to be an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the past decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system has been selected to carry out a detailed and modular LCA which follows the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used in order to estimate the influence of the choice of assessment method on the result. Life cycle inventory information is compiled from ecoinvent 1.3 databases, combined with literature and field investigations on the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, the impact is due to the way the electricity is produced. The final process steps--i.e. the end of life phase--lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study.

  4. Life cycle assessment study of a Chinese desktop personal computer

    International Nuclear Information System (INIS)

    Duan Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li Jinhui

    2009-01-01

    Associated with the tremendous prosperity in the world electronic information and telecommunication industry, there continues to be an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the past decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system has been selected to carry out a detailed and modular LCA which follows the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used in order to estimate the influence of the choice of assessment method on the result. Life cycle inventory information is compiled from ecoinvent 1.3 databases, combined with literature and field investigations on the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, the impact is due to the way the electricity is produced. The final process steps - i.e. the end of life phase - lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study.

  5. Bibliometric analysis of Spanish scientific publications in the subject Construction & Building Technology in Web of Science database (1997-2008)

    OpenAIRE

    Rojas-Sola, J. I.; de San-Antonio-Gómez, C.

    2010-01-01

    In this paper, the publications from Spanish institutions listed in the journals of the Construction & Building Technology subject of the Web of Science database for the period 1997-2008 are analyzed. The number of journals in which they were published is 35, and the number of articles was 760 (Article or Review). A bibliometric assessment has also been done, and we propose two new parameters: Weighted Impact Factor and Relative Impact Factor; the analysis also includes the number of citations and the number of documents ...

  6. Application of SIG and OLAP technologies on IBGE databases as a decision support tool for the county administration

    Directory of Open Access Journals (Sweden)

    REGO, E. A.

    2008-06-01

    Full Text Available This paper presents the development of a Decision Support System for any Brazilian county, free of any research costs. To do so, data warehouse, OLAP and GIS technologies are used together with IBGE's database to give the user a query-building tool, showing the results in map and/or table format in a very simple and efficient way.

  7. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    Science.gov (United States)

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are not able to decipher the query result that is obtained when non-specific keywords or a combination of keywords are used. Intelligent Access to Sequence and Structure Databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the Web Service Description Language (WSDL) files and Jar files of the E-utilities of various databases, such as the National Centre for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). Apart from that, IASSD allows the user to view protein structure using a Jmol application which supports conditional editing. The Jar file is freely available through e-mail from the corresponding author.
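
    A sketch of the kind of E-utilities request such a front end issues is shown below; the endpoint is NCBI's real esearch URL, but the helper name and the query itself are illustrative (IASSD itself is written in Java):

```python
from urllib.parse import urlencode

# Real base endpoint of NCBI's esearch E-utility.
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_esearch_url(db, term, retmax=20):
    # Assemble the request a desktop front end would issue on the
    # user's behalf; urlencode handles spaces and brackets safely.
    return ESEARCH + "?" + urlencode({"db": db, "term": term, "retmax": retmax})

url = build_esearch_url("protein", "kinase AND human[orgn]")
```

    The response is an XML list of record identifiers, which the application can then fetch and present without the user composing raw queries.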

  8. Hanford Tank Initiative (HTI) and Acquire Commercial Technology for Retrieval Report and Database

    International Nuclear Information System (INIS)

    SEDERBURG, J. P

    2000-01-01

    The database is an annotated bibliography of technology evaluations and demonstrations conducted in previous years by the Hanford Tank Initiative (HTI) and the Acquire Commercial Technology for Retrieval (ACTR) programs.

  9. Indiana Humanities Council Request for the Indianapolis Energy Conversion Inst. For Phase I of the Indianapolis Energy Conservation Res Initiative also called the smartDESKTOP Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Keller, John B.

    2007-12-06

    The smartDESKTOP Initiative at the Indiana Humanities Council received critical support in building and delivering a digital desktop for Indiana educators through the Department of Energy Grant DE-FG02-06ER64282. During the project period September 2006 through October of 2007, the number of Indiana educators with accounts on the smartDESKTOP more than tripled from under 2,000 to more than 7,000 accounts. An external review of the project conducted for the purposes of understanding the impact of the service in Indiana schools revealed that the majority of respondents felt that using the smartDESKTOP did reduce the time they spent managing paper. The same study revealed the challenges of implementing a digital desktop meant to help teachers leverage technology to improve their teaching and ultimately student learning. The most significant outcome of this project is that the Indiana Department of Education expressed interest in assuming responsibility for sustaining this project. The transition of the smartDESKTOP to the Indiana Department of Education was effective on November 1, 2007.

  10. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital

    OpenAIRE

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-01-01

    Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physici...

  11. Distributed multimedia database technologies supported by MPEG-7 and MPEG-21

    CERN Document Server

    Kosch, Harald

    2003-01-01

    Contents include: Introduction; Multimedia Content: Context; Multimedia Systems and Databases; (Multi)Media Data and Multimedia Metadata; Purpose and Organization of the Book; MPEG-7: The Multimedia Content Description Standard; MPEG-7 and Multimedia Database Systems; Principles for Creating MPEG-7 Documents; MPEG-7 Description Definition Language; Step-by-Step Approach for Creating an MPEG-7 Document; Extending the Description Schema of MPEG-7; Encoding and Decoding of MPEG-7 Documents for Delivery (Binary Format for MPEG-7); Audio Part of MPEG-7; MPEG-7 Supporting Tools and Referen...

  12. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    Science.gov (United States)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that ultimately will run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach which will be described in this presentation.

  13. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    Science.gov (United States)

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.
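Integrating a nonspatial attribute table with spatial trace records, as described above, amounts to a relational join on a shared fault identifier. A minimal sketch in Python with SQLite follows; the table layout, field names, and values are invented for illustration and are not the actual CGS schema:

```python
import sqlite3

# In-memory stand-in for the two databases described in the abstract.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE attributes (fault_id INTEGER PRIMARY KEY,
                         name TEXT, age TEXT, slip_rate_mm_yr REAL);
CREATE TABLE traces (fault_id INTEGER, length_km REAL, strike_deg REAL);
INSERT INTO attributes VALUES (1, 'Sawatch fault', 'Quaternary', 0.1);
INSERT INTO traces VALUES (1, 52.0, 345.0);
""")

# Joining the nonspatial attributes with the spatial trace measurements
# yields the single integrated dataset served to the map interface.
row = con.execute("""
SELECT a.name, a.age, t.length_km, t.strike_deg
FROM attributes a JOIN traces t ON a.fault_id = t.fault_id
""").fetchone()
```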

  14. A more complete library on your desktop

    CERN Multimedia

    2003-01-01

    The CERN library announces two new services: a complete database on standards containing the description of 400,000 standards, and a collection of scientific journals with more than three million articles. These include historical papers, some of them dating from the end of the 19th century.

  15. Techno-experiential design assessment and media experience database: A method for emerging technology assessment

    OpenAIRE

    Schick, Dan

    2005-01-01

    This thesis evaluates the Techno-Experiential Design Assessment (TEDA) for social research on new media and emerging technology. Dr. Roman Onufrijchuk developed TEDA to address the shortcomings of current methods designed for studying existing technologies. Drawing from the ideas of Canadian media theorist Marshall McLuhan, TEDA focuses on the environmental changes introduced by a new technology into a user's life. I describe the key components of the TEDA methodology and provide examples of ...

  16. Perception Analysis of Desktop and Mobile Service Website

    Directory of Open Access Journals (Sweden)

    Rizqiyatul Khoiriyah

    2016-12-01

Full Text Available The research was conducted as a qualitative study of websites to explore and examine users' perceptions of desktop and mobile website services in depth. The study reviewed user perceptions of desktop and mobile website services using qualitative methods adapted from the WebQual and User Experience approaches, following the theoretical framework of Creswell (2014). The expected outcome is an understanding of user perceptions of the services and information available on a website, along with any desktop-mobile gap arising from differences between the two services. These results can be used as a user-experience model for website services.

  17. Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.

    Science.gov (United States)

    1985-12-01

... of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an ... distribution (tape replacement) systems ... database distribution ... on-line mass storage ... videogame ROM (juke-box) ... media cost ... training of the great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level ...

  18. Application of Optical Disc Databases and Related Technology to Public Access Settings

    Science.gov (United States)

    1992-03-01

... librarians during one-on-one instruction, and the ability of users to browse the database. Correlation of the James A. Haley Veterans Hospital study findings ... library to another, librarians must collect and study data about the information-gathering characteristics of their own users (Harter and Jackson 1988) ... based training: improving the quality of end-user searching. The Journal of Academic Librarianship 17, no. 3: 152-56. Ciuffetti, Peter D. 1991a. A plea ...

  19. Analysis of condensed matter physics records in databases. Science and technology indicators in condensed matter physics

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1999-05-01

    An analysis of the literature on Condensed Matter Physics, with particular emphasis on High Temperature Superconductors, was performed on the contents of the bibliographic database International Nuclear Information System (INIS). Quantitative data were obtained on various characteristics of the relevant INIS records such as subject categories, language and country of publication, publication types, etc. The analysis opens up the possibility for further studies, e.g. on international research co-operation and on publication patterns. (author)

  20. New generation of 3D desktop computer interfaces

    Science.gov (United States)

    Skerjanc, Robert; Pastoor, Siegmund

    1997-05-01

Today's computer interfaces use 2-D displays showing windows, icons and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can (graphically) connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object-oriented programming for tasks ranging from, e.g., low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest to use off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and connected video-based interaction techniques which allow viewpoint-dependent imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick 3-D objects just by looking at them).
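The idea of graphically connecting application objects so that they can communicate can be caricatured in a few lines of Python. Everything below (class and method names, the message format) is an invented illustration of the concept, not the actual VOS design:

```python
class AppObject:
    """An application represented as an object in the interface space."""

    def __init__(self, name):
        self.name = name
        self.links = set()   # other objects this one is connected to
        self.inbox = []      # messages received from connected objects

    def connect(self, other):
        # Connecting two objects (in VOS: graphically) enables communication.
        self.links.add(other)
        other.links.add(self)

    def send(self, other, message):
        if other not in self.links:
            raise ValueError("objects are not connected")
        other.inbox.append((self.name, message))

# A user links an editor object to a viewer object, then they exchange data.
editor = AppObject("editor")
viewer = AppObject("viewer")
editor.connect(viewer)
editor.send(viewer, "render page 1")
```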

  1. A hypertext-based Internet-accessible database for the MSFC Technology Transfer Office

    Science.gov (United States)

    Jackson, Jeff

    1994-01-01

There exists a continuing need to disseminate technical information and facilities capabilities from NASA field centers in an effort to promote the successful transfer of technologies developed with public funds to the private sector. As technology transfer is a stated NASA mission, there exists a critical need for NASA centers to document technology capabilities and disseminate this information on as wide a basis as possible. Certainly local and regional dissemination is critical, but global dissemination of scientific and engineering facilities and capabilities gives NASA centers the ability to contribute to technology transfer on a much broader scale. Additionally, information should be disseminated in a complete and rapidly available form. To accomplish this information dissemination, the unique capabilities of the Internet are being exploited. The Internet allows wide-scale information distribution in a rapid fashion to aid in the accomplishment of technology transfer goals established by the NASA/MSFC Technology Transfer Office. Rapid information retrieval coupled with appropriate electronic feedback allows the scientific and technical capabilities of Marshall Space Flight Center, often unique in the world, to be explored by a large number of potential benefactors of NASA (or NASA-derived) technologies. Electronic feedback, coupled with personal contact with the MSFC Technology Transfer Office personnel, allows rapid responses to technical requests from industry and academic personnel as well as private citizens. The remainder of this report gives a brief overview of the Mosaic software and a discussion of technology transfer office and laboratory facilities data that have been made available on the Internet to promote technology transfer.

  2. New Desktop Virtual Reality Technology in Technical Education

    Science.gov (United States)

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  3. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    Science.gov (United States)

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are fine-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.
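Per-project, per-user role permissions of the kind described can be sketched as a small access-control map. The role names and permission sets below are assumptions for illustration only, not Djeen's actual model:

```python
# Assumed roles and permissions, invented for this sketch.
ROLE_PERMISSIONS = {
    "viewer":  {"read"},
    "editor":  {"read", "annotate"},
    "manager": {"read", "annotate", "grant"},
}

class ProjectACL:
    """Fine-grained permissions: each (user, project) pair gets one role."""

    def __init__(self):
        self._roles = {}  # (user, project) -> role name

    def assign(self, user, project, role):
        if role not in ROLE_PERMISSIONS:
            raise ValueError(f"unknown role: {role}")
        self._roles[(user, project)] = role

    def can(self, user, project, action):
        role = self._roles.get((user, project))
        return role is not None and action in ROLE_PERMISSIONS[role]

acl = ProjectACL()
acl.assign("alice", "flow-cytometry-2013", "editor")
```

Because the role is keyed per project, the same user can be an editor on one project and merely a viewer (or nothing at all) on another.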

  4. Use of Dynamic Technologies for Web-enabled Database Management Systems

    OpenAIRE

    Bogdanova, Galina; Todorov, Todor; Blagoev, Dimitar; Todorova, Mirena

    2007-01-01

In this paper we consider two computer systems and the dynamic Web technologies they use. Different contemporary dynamic web technologies are described in detail, and their advantages and disadvantages are shown. Two specific applications, a clinic system and a studying system, were developed, and their programming models are described. Finally, we implement these two applications in the student education process: online studying has been tested at the Technical University – Va...

  5. Grid desktop computing for constructive battlefield simulation

    OpenAIRE

    Repetto, Alejandro Juan Manuel

    2009-01-01

It is a fact that gaming technology is a state-of-the-art tool for military training, not only in low-level simulations, e.g. flight training simulations, but also for strategic and tactical training. It is also a fact that users of these kinds of technologies require increasingly realistic representations of the real world. This functional realism strains both hardware and software capabilities, making it almost impossible to keep up with the requirements. Many optimizations have been perf...

  6. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available Database description. General information: database name: PSCDB; alternative n... rial Science and Technology (AIST); contact: Takayuki Amemiya (e-mail). Database classification: Structure Databases - Protein structure. ... 554-D558. External links: original website information. Database maintenance site: Graduate School of Informat... URL of Web services: not available. Need for user registration: not available.

  7. MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop

    Science.gov (United States)

To use the sharing features on ... provide a consistent user experience from a desktop, tablet, or phone. All users, regardless of how they ...

  8. EPA Region 8, Memo on Desktop Printer Ink Cartridges Policy & Voluntary Printer Turn-in

    Science.gov (United States)

This memo requests that EPA Region 8 users voluntarily turn in their desktop printers and notifies users of the Region 8 policy not to provide maintenance or ink and toner cartridges for desktop printers.

  9. Designing for Communication: The Key to Successful Desktop Publishing.

    Science.gov (United States)

    McCain, Ted D. E.

    Written for those who are new to design and page layout, this book focuses on providing novice desktop publishers with an understanding of communication, graphic design, typography, page layout, and page layout techniques. The book also discusses how people read, design as a consequence of understanding, and the principles of page layout. Chapters…

  10. Versatile Desktop Experiment Module (DEMo) on Heat Transfer

    Science.gov (United States)

    Minerick, Adrienne R.

    2010-01-01

    This paper outlines a new Desktop Experiment Module (DEMo) engineered for a chemical engineering junior-level Heat Transfer course. This new DEMo learning tool is versatile, fairly inexpensive, and portable such that it can be positioned on student desks throughout a classroom. The DEMo system can illustrate conduction of various materials,…

  11. Desk-top publishing using IBM-compatible computers.

    Science.gov (United States)

    Grencis, P W

    1991-01-01

    This paper sets out to describe one Medical Illustration Departments' experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.

  12. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  13. Desktop Publishing in the University: Current Progress, Future Visions.

    Science.gov (United States)

    Smith, Thomas W.

    1989-01-01

    Discussion of the workflow involved in desktop publishing focuses on experiences at the College of Engineering at the University of Wisconsin at Madison. Highlights include cost savings and productivity gains in page layout and composition; editing, translation, and revision issues; printing and distribution; and benefits to the reader. (LRW)

  14. What Desktop Publishing Can Teach Professional Writing Students about Publishing.

    Science.gov (United States)

    Dobberstein, Michael

    1992-01-01

    Points out that desktop publishing is a metatechnology that allows professional writing students access to the production phase of publishing, giving students hands-on practice in preparing text for printing and in learning how that preparation affects the visual meaning of documents. (SR)

  15. Warm Hearts/Cold Type: Desktop Publishing Arrives.

    Science.gov (United States)

    Kramer, Felix

    1991-01-01

    Describes desktop publishing (DTP) that may be suitable for community, activist, and nonprofit groups and discusses how it is changing written communication. Topics discussed include costs; laser printers; time savings; hardware and software selection; and guidelines to consider when establishing DTP capability. (LRW)

  16. A Real-World Project for a Desktop Publishing Course.

    Science.gov (United States)

    Marsden, James D.

    1994-01-01

    Describes a project in a desktop publishing course in which students work with nonprofit and campus organizations to design brochures that fulfill important needs. Discusses specific tools students use. Describes the brochure project, project criteria, clients, text and graphics for the project, how to evaluate the project, and guidelines for…

  17. Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.

    Science.gov (United States)

    Danziger, Pamela N.

    This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…

  18. Stop the Presses! An Update on Desktop Publishing.

    Science.gov (United States)

    McCarthy, Robert

    1988-01-01

    Discusses educational applications of desktop publishing at the elementary, secondary, and college levels. Topics discussed include page design capabilities; hardware requirements; software; the production of school newsletters and newspapers; cost factors; writing improvement; university departmental publications; and college book publishing. A…

  19. Desktop Publishing: Its Impact on Community College Journalism.

    Science.gov (United States)

    Grzywacz-Gray, John; And Others

    1987-01-01

    Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)

  20. Visual attention for a desktop virtual environment with ambient scent

    NARCIS (Netherlands)

    Toet, A.; Schaik, M.G. van

    2013-01-01

    In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with

  1. Desktop aligner for fabrication of multilayer microfluidic devices.

    Science.gov (United States)

    Li, Xiang; Yu, Zeta Tak For; Geraldo, Dalton; Weng, Shinuo; Alve, Nitesh; Dun, Wu; Kini, Akshay; Patel, Karan; Shu, Roberto; Zhang, Feng; Li, Gang; Jin, Qinghui; Fu, Jianping

    2015-07-01

Multilayer assembly is a commonly used technique to construct multilayer polydimethylsiloxane (PDMS)-based microfluidic devices with complex 3D architecture and connectivity for large-scale microfluidic integration. Accurate alignment of structure features on different PDMS layers before their permanent bonding is critical in determining the yield and quality of assembled multilayer microfluidic devices. Herein, we report a custom-built desktop aligner capable of both local and global alignments of PDMS layers covering a broad size range. Two digital microscopes were incorporated into the aligner design to allow accurate global alignment of PDMS structures up to 4 in. in diameter. Both local and global alignment accuracies of the desktop aligner were determined to be about 20 μm cm⁻¹. To demonstrate its utility for fabrication of integrated multilayer PDMS microfluidic devices, we applied the desktop aligner to achieve accurate alignment of different functional PDMS layers in multilayer microfluidics including an organs-on-chips device as well as a microfluidic device integrated with vertical passages connecting channels located in different PDMS layers. Owing to its convenient operation, high accuracy, low cost, light weight, and portability, the desktop aligner is useful for microfluidic researchers to achieve rapid and accurate alignment for generating multilayer PDMS microfluidic devices.

  2. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    Science.gov (United States)

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  3. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  4. An Exercise in Desktop Publishing: Using the "Newsroom."

    Science.gov (United States)

    Kiteka, Sebastian F.

    This guide provides a description and step-by-step instructions for the use of "Newsroom," a desktop-publishing program for the Apple II series of microcomputers produced by Springboard Software Inc. Based on the 1984 version of the program, this two-hour exercise focuses on the design and production of a newsletter with text and…

  5. Development of prostate cancer research database with the clinical data warehouse technology for direct linkage with electronic medical record system.

    Science.gov (United States)

    Choi, In Young; Park, Seungho; Park, Bumjoon; Chung, Byung Ha; Kim, Choung-Soo; Lee, Hyun Moo; Byun, Seok-Soo; Lee, Ji Youl

    2013-01-01

Despite the increasing number of prostate cancer patients, little is known about the impact of treatments for prostate cancer patients and the outcomes of different treatments based on nationwide data. In order to obtain more comprehensive information on Korean prostate cancer patients, many professionals have urged the creation of a national system to monitor the quality of prostate cancer care. To meet this objective, the prostate cancer database system was planned and cautiously accommodated different views from various professions. This prostate cancer research database system incorporates information about prostate cancer research including demographics, medical history, operation information, laboratory results, and quality-of-life surveys. The system includes three different ways of collecting clinical data to produce a comprehensive database: direct data extraction from the electronic medical record (EMR) system, manual data entry after linking EMR documents such as magnetic resonance imaging findings, and paper-based data collection for surveys from patients. We implemented clinical data warehouse technology to test the direct EMR link method with the St. Mary's Hospital system. Using this method, the total number of eligible patients was 2,300 from 1997 until 2012. Among them, 538 patients underwent surgery and the others received different treatments. Our database system could provide the infrastructure for collecting error-free data to support various retrospective and prospective studies.
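The three collection paths (direct EMR extraction, manual entry of linked documents, and paper surveys) ultimately feed one research record per patient. A minimal sketch of such a merge in Python follows, with invented field names and the assumption, made only for this illustration, that directly extracted EMR values take precedence on conflicts:

```python
def merge_patient_record(emr_row, manual_fields, survey_fields):
    """Combine direct-EMR data, manually keyed document findings,
    and paper-survey answers into one research record per patient."""
    record = dict(emr_row)              # EMR extraction wins on overlaps
    for source in (manual_fields, survey_fields):
        for key, value in source.items():
            record.setdefault(key, value)   # never overwrite EMR values
    return record

# Hypothetical example rows; all field names are invented.
merged = merge_patient_record(
    {"patient_id": 101, "psa": 6.2},
    {"patient_id": 101, "mri_finding": "T2 lesion, left lobe"},
    {"patient_id": 101, "qol_score": 72},
)
```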

  6. A Study of the Unified Theory of Acceptance and Use of Technology in the Use of Open Source Database Management System Software

    Directory of Open Access Journals (Sweden)

    Michael Sonny

    2016-06-01

Full Text Available Computer software is developing very rapidly today; this development occurs not only in software under particular licenses but also in open source software. This development is of course very welcome for computer users, especially in education and among students, because users have several choices of applications. Open source software also offers products that are generally free, come with their source code, and grant the freedom to modify and extend them. Open-source applications are quite diverse, including programming tools (PHP, Gambas), database management systems (MySQL, SQLite), and browsers (Mozilla Firefox, Opera). This study examines the acceptance of DBMS (Database Management System) applications such as MySQL and SQLite using UTAUT (Unified Theory of Acceptance and Use of Technology), a model developed by Venkatesh (2003). Certain factors, called moderating factors, also influence the learning of these open source applications and can affect effectiveness and efficiency. The results can thus support smoother learning of these open-source-based applications.   Keywords: open source, Database Management System (DBMS), moderating

  7. Survey of the situation of technology succession. Databases of articles including in industrial technology museums; Gijutsu keisho jokyo chosa. Sangyo gijutsu hakubutsukan shuzohin D.B. hen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

To promote the succession of the history of industrial science and technology and its creative use, the paper compiled lists and databases of the holdings of industrial technology museums and material halls in Japan. Recording, preserving, collecting, and systematizing the history of industrial technology helps form the bases necessary for promoting future research and development and for international contribution. Museums and material halls are the venues for comprehensive and practical activities of this kind. The data were prepared as one of the basic databases, as a first step toward examining the state of technology succession continuously and systematically over the long term. In the classification of the data, the energy-related holdings were divided into electric power, nuclear power, oil, coal, gas, and energy in general. Others were classified into metal/mining, electricity/electronics/communication, chemistry/food, shipbuilding/heavy machinery, printing/precision instruments, and textile/spinning. Moreover, the traffic-related holdings were classified into railroads, automobiles/two-wheeled vehicles, airlines/space, and ships. Categories were also set for daily life, civil engineering/architecture, and general topics. The survey covered a total of 208 museums.

  8. Modern SQL and NoSQL database technologies for the ATLAS experiment

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2017-01-01

    Structured data storage technologies evolve very rapidly in the IT world. LHC experiments, and ATLAS in particular, try to select and use these technologies balancing the performance for a given set of use cases with the availability, ease of use and of getting support, and stability of the product. We definitely and definitively moved from the “one fits all” (or “all has to fit into one”) paradigm to choosing the best solution for each group of data and for the applications that use these data. This paper describes the solutions in use, or under study, for the ATLAS experiment and their selection process and performance measurements.

  9. Modern SQL and NoSQL database technologies for the ATLAS experiment

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2017-01-01

    Structured data storage technologies evolve very rapidly in the IT world. LHC experiments, and ATLAS in particular, try to select and use these technologies balancing the performance for a given set of use cases with the availability, ease of use and of getting support, and stability of the product. We definitely and definitively moved from the “one fits all” (or “all has to fit into one”) paradigm to choosing the best solution for each group of data and for the applications that use these data. This talk describes the solutions in use, or under study, for the ATLAS experiment and their selection process and performance.

  10. Building a Database for Life Cycle Performance Assessment of Water and Wastewater Rehabilitation Technologies - abstract

    Science.gov (United States)

    Pipe rehabilitation and trenchless pipe replacement technologies have seen a steady increase in use over the past 30 to 40 years and represent an increasing proportion of the approximately $25 billion annual expenditure on operations and maintenance of the nation’s water and wast...

  11. Building a Database for Life Cycle Performance Assessment of Water and Wastewater Rehabilitation Technologies

    Science.gov (United States)

The deployment of trenchless pipe rehabilitation technologies has steadily increased over the past 30 to 40 years and continues to represent a growing proportion of the approximately $25 billion annual expenditure on operations and maintenance of the nation’s water and wastewater infr...

  12. Establishment of database and network for research of steam generator and state of the art technology review

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Bong; Hur, Nam Su; Moon, Seong In; Seo, Hyeong Won; Park, Bo Kyu; Park, Sung Ho; Kim, Hyung Geun [Sungkyunkwan Univ., Seoul (Korea, Republic of)

    2004-02-15

A significant number of steam generator tubes are defective and are removed from service or repaired worldwide. This widespread damage has been caused by diverse degradation mechanisms, some of which are difficult to detect and predict. In domestic nuclear power plants as well, the growing number of operating plants and their lengthening operating periods may result in more steam generator tube failures. It is therefore important to carry out the integrity evaluation process to prevent steam generator tube damage. There are two objectives of this research. The first is to build a database for research on steam generators at domestic research institutions. It will increase the efficiency and capability of limited domestic research resources by sharing data and information through a network organization. It will also enhance the current standard of the integrity evaluation procedure, which is considerably conservative but can be made more reasonable. The second objective is to establish a standard integrity evaluation procedure for steam generator tubes by reviewing the state of the art. The research resources related to steam generator tubes are managed by the established web-based database system. The following topics are covered in this project: development of a web-based network for research on steam generator tubes and a review of state-of-the-art technology.

  13. Establishment of database and network for research of steam generator and state of the art technology review

    International Nuclear Information System (INIS)

    Choi, Jae Bong; Hur, Nam Su; Moon, Seong In; Seo, Hyeong Won; Park, Bo Kyu; Park, Sung Ho; Kim, Hyung Geun

    2004-02-01

    A significant number of steam generator tubes worldwide are defective and have been removed from service or repaired. This widespread damage has been caused by diverse degradation mechanisms, some of which are difficult to detect and predict. For domestic nuclear power plants as well, the growing number of operating plants and their lengthening operating periods may lead to an increase in steam generator tube failures, so it is important to carry out an integrity evaluation process to prevent steam generator tube damage. This research has two objectives. The first is to build a database for steam generator research at domestic research institutions. It will increase the efficiency and capability of limited domestic research resources by sharing data and information through a network organization, and it will improve the current integrity evaluation procedure, which is considerably conservative but could be made more reasonable. The second objective is to establish a standard integrity evaluation procedure for steam generator tubes by reviewing state-of-the-art technology. The research resources related to steam generator tubes are managed by the established web-based database system. The following topics are covered in this project: development of a web-based network for research on steam generator tubes, and a review of state-of-the-art technology

  14. Desktop Publishing: Organizational Considerations for Adoption and Implementation. TDC Research Report No. 6.

    Science.gov (United States)

    Lee, Paul

    This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…

  15. The Energy Science and Technology Database on a local library system: A case study at the Los Alamos National Research Library

    Energy Technology Data Exchange (ETDEWEB)

    Holtkamp, I.S.

    1994-10-01

    This paper presents an overview of efforts at Los Alamos National Laboratory to acquire and mount the Energy Science and Technology Database (EDB) as a citation database on the Research Library's Geac Advance system. The rationale for undertaking this project and expected benefits are explained. Significant issues explored are loading non-USMARC records into a MARC-based library system, the use of EDB records to replace or supplement in-house cataloging of technical reports, the impact of different cataloging standards and database size on searching and retrieval, and how integrating an external database into the library's online catalog may affect staffing and workflow.

  16. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web-service-based queries. Together, these servers and the central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit, which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party
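Data series downloaded by HydroDesktop arrive as WaterML documents. As a rough illustration of consuming such a response, the sketch below parses a simplified WaterML-like document with Python's standard ElementTree; the element names here are simplified assumptions for illustration, not the exact WaterML schema used by CUAHSI HIS:

```python
# Minimal sketch: parsing a simplified WaterML-like time series.
# The element names below are illustrative assumptions, not the real schema.
import xml.etree.ElementTree as ET

SAMPLE = """\
<timeSeriesResponse>
  <timeSeries>
    <variable>Discharge</variable>
    <values>
      <value dateTime="2010-01-01T00:00:00">3.2</value>
      <value dateTime="2010-01-01T01:00:00">3.5</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

def parse_series(xml_text):
    """Return (variable name, list of (timestamp, float value) pairs)."""
    root = ET.fromstring(xml_text)
    series = root.find("timeSeries")
    variable = series.findtext("variable")
    values = [(v.get("dateTime"), float(v.text))
              for v in series.find("values").findall("value")]
    return variable, values

variable, values = parse_series(SAMPLE)
print(variable, len(values))  # Discharge 2
```

In a real client the XML text would come from an HTTP request to a HydroServer rather than an inline string.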

  17. An Analysis of Database Replication Technologies with Regard to Deep Space Network Application Requirements

    Science.gov (United States)

    Connell, Andrea M.

    2011-01-01

    The Deep Space Network (DSN) has three communication facilities which handle telemetry, commands, and other data relating to spacecraft missions. The network requires these three sites to share data with each other and with the Jet Propulsion Laboratory for processing and distribution. Many database management systems have replication capabilities built in, which means that data updates made at one location will be automatically propagated to other locations. This project examines multiple replication solutions, looking for stability, automation, flexibility, performance, and cost. After comparing these features, Oracle Streams is chosen for closer analysis. Two Streams environments are configured - one with a Master/Slave architecture, in which a single server is the source for all data updates, and the second with a Multi-Master architecture, in which updates originating from any of the servers will be propagated to all of the others. These environments are tested for data type support, conflict resolution, performance, changes to the data structure, and behavior during and after network or server outages. Through this experimentation, it is determined which requirements of the DSN can be met by Oracle Streams and which cannot.
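The two topologies compared in this abstract differ chiefly in where conflicts can arise. The toy model below (plain Python, not Oracle Streams; all names are invented for illustration) shows why a multi-master configuration needs a conflict-resolution rule, here "highest version wins", while a master/slave configuration does not:

```python
# Toy model of the two replication topologies compared in the abstract.
# Illustrative sketch only; this is not Oracle Streams.

class Site:
    def __init__(self, name):
        self.name = name
        self.data = {}            # key -> (value, version)

    def local_update(self, key, value, version):
        self.data[key] = (value, version)

def replicate_master_slave(master, slaves):
    # All updates originate at the master, so slaves simply copy its
    # state: conflicting writes cannot occur.
    for s in slaves:
        s.data = dict(master.data)

def replicate_multi_master(sites):
    # Updates may originate anywhere, so a deterministic rule (here:
    # highest version wins) must resolve conflicting writes.
    merged = {}
    for s in sites:
        for key, (value, version) in s.data.items():
            if key not in merged or version > merged[key][1]:
                merged[key] = (value, version)
    for s in sites:
        s.data = dict(merged)

a, b = Site("DSN-A"), Site("DSN-B")
a.local_update("pass42", "scheduled", version=1)
b.local_update("pass42", "cancelled", version=2)   # conflicting write
replicate_multi_master([a, b])
print(a.data["pass42"])  # ('cancelled', 2) -- the higher version won
```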

  18. Cleanup of a HLW nuclear fuel-reprocessing center using 3-D database modeling technology

    International Nuclear Information System (INIS)

    Sauer, R.C.

    1992-01-01

    A significant challenge in decommissioning any large nuclear facility is how to solidify the large volume of residual high-level radioactive waste (HLW) without structurally interfering with the facility's existing equipment and piping, and without incurring rework due to interferences that were not identified during the design process. The problem is further compounded when the facility to be decommissioned is a 35-year-old nuclear fuel reprocessing center designed to recover usable uranium and plutonium. Facilities of this vintage usually lack full documentation of the design changes made over the years, and as a result crud traps or pockets of high-level contamination may not be fully recognized. Any miscalculation in the construction or modification sequences could complicate the overall dismantling and decontamination of the facility. This paper reports that the development of a 3-dimensional (3-D) computer database tool was considered critical in defining the most complex portions of this one-of-a-kind vitrification facility

  19. Research on database realization technology of seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Zhang Huimin; Jing Ping; Sun Peng; Zheng Jiangling

    2005-01-01

    Developing CTBT verification technology has become the most important means of ensuring that the CTBT is fulfilled conscientiously. Seismic analysis based on a seismic information system (SIS) plays an important role in this field. Built on GIS, the SIS is powerful in spatial analysis, topological analysis and visualization. However, implementing the full system functionality depends critically on the performance of the SIS database. Based on the ArcSDE Geodatabase data model, seamlessly integrated management of spatial and attribute data has been realized in the ORACLE RDBMS, while most functions of ORACLE have been retained. (authors)

  20. Research focus and trends in nuclear science and technology in Ghana: a bibliometric study based on the INIS database

    International Nuclear Information System (INIS)

    Agyeman, E. A.; Bilson, A.

    2015-01-01

    The peaceful application of atomic energy was introduced into Ghana about fifty years ago. This is the first bibliometric study of nuclear science and technology research publications originating from Ghana and listed in the International Nuclear Information System (INIS) Database. The purpose was to use the simple document counting method to determine the geographical distribution, annual growth and subject areas of the publications, as well as communication channels, key journals and authorship trends. The main findings of the study were that a greater number of the nuclear science and technology records listed in the Database were published in Ghana (598, or 56.57%, against 459, or 43.43%, published outside Ghana). There has been a steady growth in the number of publications over the years, with the most productive year being 2012. The main focus of research has been in the area of applied life sciences, comprising plant cultivation and breeding, pest and disease control, food protection and preservation, human nutrition and animal husbandry; followed by chemistry; environmental sciences; radiation protection; nuclear reactors; physics; energy; and radiology and nuclear medicine. The area with the least number of publications was safeguards and physical protection. The main channel of communicating research results was peer-reviewed journals, and a greater number of the journal articles were published in Ghana, followed by the United Kingdom, Hungary and the Netherlands. The core journals identified in this study were Journal of Applied Science and Technology; Journal of Radioanalytical and Nuclear Chemistry; Journal of the Ghana Science Association; Radiation Protection Dosimetry; Journal of the Kumasi University of Science and Technology; West African Journal of Applied Ecology; Ghana Journal of Science; Applied Radiation and Isotopes; Annals of Nuclear Energy; IOP Conference Series (Earth and Environmental Science); and Radiation Physics and Chemistry.
Eighty percent

  1. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name JSNP Alternative nam...n Science and Technology Agency Creator Affiliation: Contact address E-mail : Database...sapiens Taxonomy ID: 9606 Database description A database of about 197,000 polymorphisms in Japanese populat...1):605-610 External Links: Original website information Database maintenance site Institute of Medical Scien...er registration Not available About This Database Database Description Download License Update History of This Database

  2. Users' attitude towards science and technology database system : INIS user needs survey

    International Nuclear Information System (INIS)

    Fukazawa, Takeyasu; Takahashi, Satoko; Yonezawa, Minoru; Kajiro, Tadashi; Mineo, Yukinobu; Habara, Takako; Komatsubara, Yasutoshi; Hiramatsu, Nobuaki; Habara, Tadashi.

    1995-01-01

    The International Nuclear Information System (INIS) is the world's leading information system on the peaceful use of nuclear energy which is being operated by the International Atomic Energy Agency (IAEA) in collaboration with its member-states and other international organizations. After more than 20 years of the operation of INIS, a user needs survey was conducted with the aim of assisting the INIS Secretariat to decide which way INIS should go. This report describes users' attitude towards that system on the basis of the conclusions drawn from the questionnaires sent out to the users by the Japan Atomic Energy Research Institute, the INIS national center in Japan, in close collaboration with the Japan Information Center of Science and Technology. (author)

  3. ELSEVIER SCIENTIFIC JOURNALS AVAILABLE ON YOUR DESKTOP

    CERN Multimedia

    1999-01-01

    Elsevier Science Publishers have for decades distributed renowned journals in science and technology, which are now accessible on the Web through their Science Direct service. CERN has been granted a site licence trial period until the end of 1999. Included among the titles are: Astroparticle physics, Computer physics communications, Nuclear instruments and methods in physics research A and B, Nuclear physics A and B, Physics letters A and B, Physics reports, Surface science and Thin solid films. Links to the individual titles appear in our electronic journals list at: http://wwwas.cern.ch/library/electronic_journals/ejAH.html The Library invites all readers to search and download articles of the journals currently subscribed to. You can also access the full Science Direct site at: http://www.sciencedirect.com/ (Choose 'group-wide login' or, for a 'personal login' registration, please contact us.) All questions and comments are welcome and can be addressed to: library.desk@cern.ch

  4. Feasibility of Bioprinting with a Modified Desktop 3D Printer.

    Science.gov (United States)

    Goldstein, Todd A; Epstein, Casey J; Schwartz, John; Krush, Alex; Lagalante, Dan J; Mercadante, Kevin P; Zeltsman, David; Smith, Lee P; Grande, Daniel A

    2016-12-01

    Numerous studies have shown the capabilities of three-dimensional (3D) printing for use in the medical industry. At the time of this publication, basic home desktop 3D printer kits can cost as little as $300, whereas medical-specific 3D bioprinters can cost more than $300,000. The purpose of this study is to show how a commercially available desktop 3D printer could be modified to bioprint an engineered poly-l-lactic acid scaffold containing viable chondrocytes in a bioink. Our bioprinter was used to create a living 3D functional tissue-engineered cartilage scaffold. In this article, we detail the design, production, and calibration of this bioprinter. In addition, the bioprinted cells were tested for viability, proliferation, biochemistry, and gene expression; these tests showed that the cells survived the printing process, were able to continue dividing, and produced the extracellular matrix expected of chondrocytes.

  5. Comparing Web Applications with Desktop Applications: An Empirical Study

    DEFF Research Database (Denmark)

    Pop, Paul

    2002-01-01

    In recent years, many desktop applications have been ported to the world wide web in order to reduce (multiplatform) development, distribution and maintenance costs. However, there is little data concerning the usability of web applications, and the impact of their usability on the total cost of developing and using such applications. In this paper we present a comparison of web and desktop applications from the usability point of view. The comparison is based on an empirical study that investigates the performance of a group of users on two calendaring applications: Yahoo!Calendar and Microsoft Calendar. The study shows that in the case of web applications the performance of the users is significantly reduced, mainly because of the restricted interaction mechanisms provided by current web browsers.

  6. Application of desktop computers in nuclear engineering education

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment: increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs are sufficiently structured to permit an orderly, disciplined solution to the problem being solved, yet flexible enough to accommodate most problem solution options.

  7. Development and organization of scientific methodology and information databases for nuclear technology calculations

    International Nuclear Information System (INIS)

    Gritzay, O.; Kalchenko, O.

    2010-01-01

    Full text: Scientific support of NPPs has to cover several important aspects of scientific and organizational activity, namely:
    1. Training a group of highly skilled specialists to do the following work: nuclear data generation for engineering calculations; engineering calculations to ensure the safe operation of NPPs; experimental and calculational support of fluence dosimetry at NPPs.
    2. Development of an up-to-date computer base, equipped with the necessary program packages for nuclear data generation and engineering calculations.
    3. Maintenance of the updated libraries of evaluated nuclear data (ENDF), such as ENDF/B-VII (USA), JENDL-3.3 (Japan), JEFF-3.1 (Europe) and RUSFOND (Russia), and, as a result, the generation of specialized multi-group nuclear data libraries for special-purpose engineering calculations.
    To meet these purposes, the Ukrainian Nuclear Data Center (UKRNDC) has been organized and developed for more than 10 years (since 1996). The capabilities of the UKRNDC are detailed below.
    - Modern ENDF libraries, first of all the general-purpose libraries, such as ENDF/B-7.0, -6.8, JEFF-3.1.1, JENDL-3.3, etc. These databases contain recommended, evaluated cross sections, spectra, angular distributions, fission product yields, photo-atomic and thermal scattering law data, with emphasis on neutron-induced reactions.
    - Codes for processing these data, updated to the latest versions of ENDF and other libraries: first of all the PREPRO 2007 package (updated March 17, 2007) and the NJOY package, updated to versions NJOY-158 and NJOY-253 (in 2009). These codes can produce multi-group data for the needed spectrum of interacting particles (neutrons, protons, gammas) and temperatures.
    - A computer base of several specialized server stations, such as an ESCALA S120 (analogous to an IBM-240 with a RISC 6000 processor) operating under OS UNIX (version AIX 5.1), and IBM PCs operating under Linux Red Hat 7.2.
    - A set of PC computers joined in the UKRNDC network, operating mainly under OS Windows
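The multi-group data generation mentioned above boils down to flux-weighted averaging of pointwise cross sections over energy groups. The following is a simplified sketch of that collapse operation, not actual NJOY/PREPRO output; all names and values are illustrative:

```python
# Sketch of the core operation behind multi-group library generation:
# flux-weighted collapse of pointwise cross sections into energy groups.
# Simplified illustration only, not NJOY/PREPRO behavior.

def collapse(energies, sigma, flux, group_edges):
    """Flux-weighted group-average cross sections.

    energies, sigma, flux: parallel lists of point values (eV, barns, a.u.)
    group_edges: ascending energy boundaries defining len(edges)-1 groups.
    """
    ngroups = len(group_edges) - 1
    num = [0.0] * ngroups
    den = [0.0] * ngroups
    for e, s, phi in zip(energies, sigma, flux):
        for g in range(ngroups):
            if group_edges[g] <= e < group_edges[g + 1]:
                num[g] += s * phi   # flux-weighted cross section
                den[g] += phi       # total flux weight in the group
                break
    return [n / d if d else 0.0 for n, d in zip(num, den)]

# Two groups; within each group sigma is averaged with flux weights.
groups = collapse(
    energies=[1.0, 2.0, 10.0, 20.0],
    sigma=[4.0, 2.0, 1.0, 1.0],
    flux=[1.0, 3.0, 1.0, 1.0],
    group_edges=[0.0, 5.0, 100.0],
)
print(groups)  # [2.5, 1.0]
```

Real processing codes additionally handle resonance self-shielding, temperature broadening and continuous integration, which this sketch omits.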

  8. DIaaS: Resource Management System for the Intra-Cloud with On-Premise Desktops

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2017-01-01

    Full Text Available Infrastructure as a service with desktops (DIaaS based on the extensible mark-up language (XML is herein proposed to utilize surplus resources. DIaaS is a traditional surplus-resource integrated management technology. It is designed to provide fast work distribution and computing services based on user service requests as well as storage services through desktop-based distributed computing and storage resource integration. DIaaS includes a nondisruptive resource service and an auto-scalable scheme to enhance the availability and scalability of intra-cloud computing resources. A performance evaluation of the proposed scheme measured the clustering performance time for surplus resource utilization. The results showed improvement in computing and storage services in a connection of at least two computers compared to the traditional method for high-availability measurement of nondisruptive services. Furthermore, an artificial server error environment was used to create a clustering delay for computing and storage services and for nondisruptive services. It was compared to the Hadoop distributed file system (HDFS.

  9. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred over PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop, but also on larger, high-performance computing engines, thus enabling high-speed processing of data sets involving thousands of taxa.
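The UPGMA algorithm being accelerated here can be summarized in a few lines of Python: repeatedly merge the closest pair of clusters and average their distances to the rest, weighted by cluster size. This is an illustrative sketch of the clustering logic only (the optimized reference implementation discussed is PHYLIP's C code); the function name and tuple-tree output are ad hoc choices for this example:

```python
# Minimal UPGMA sketch: merge the closest pair of clusters, averaging
# distances weighted by cluster size. Illustrative only.

def upgma(dist, labels):
    """dist: symmetric matrix (list of lists); returns a nested-tuple tree."""
    clusters = [(lab, 1) for lab in labels]        # (tree, leaf count)
    d = [row[:] for row in dist]
    while len(clusters) > 1:
        # Find the closest pair of clusters (i, j), i < j.
        i, j = min(((i, j) for i in range(len(d))
                    for j in range(i + 1, len(d))),
                   key=lambda ij: d[ij[0]][ij[1]])
        (ti, ni), (tj, nj) = clusters[i], clusters[j]
        # Size-weighted average distance from the merged cluster to the rest.
        new_row = [(ni * d[i][k] + nj * d[j][k]) / (ni + nj)
                   for k in range(len(d)) if k not in (i, j)]
        # Drop rows/columns i and j, then append the merged cluster.
        d = [[d[r][c] for c in range(len(d)) if c not in (i, j)]
             for r in range(len(d)) if r not in (i, j)]
        for r, val in enumerate(new_row):
            d[r].append(val)
        d.append(new_row + [0.0])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(((ti, tj), ni + nj))
    return clusters[0][0]

tree = upgma([[0, 2, 6], [2, 0, 6], [6, 6, 0]], ["A", "B", "C"])
print(tree)  # ('C', ('A', 'B'))
```

The O(n^3) pair search in the inner loop is exactly the part that custom hardware or better data structures can accelerate.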

  10. National Database for Autism Research (NDAR): Big Data Opportunities for Health Services Research and Health Technology Assessment.

    Science.gov (United States)

    Payakachat, Nalin; Tilford, J Mick; Ungar, Wendy J

    2016-02-01

    The National Database for Autism Research (NDAR) is a US National Institutes of Health (NIH)-funded research data repository created by integrating heterogeneous datasets through data sharing agreements between autism researchers and the NIH. To date, NDAR is considered the largest neuroscience and genomic data repository for autism research. In addition to biomedical data, NDAR contains a large collection of clinical and behavioral assessments and health outcomes from novel interventions. Importantly, NDAR has a global unique patient identifier that can be linked to aggregated individual-level data for hypothesis generation and testing, and for replicating research findings. As such, NDAR promotes collaboration and maximizes public investment in the original data collection. As screening and diagnostic technologies as well as interventions for children with autism are expensive, health services research (HSR) and health technology assessment (HTA) are needed to generate more evidence to facilitate implementation when warranted. This article describes NDAR and explains its value to health services researchers and decision scientists interested in autism and other mental health conditions. We provide a description of the scope and structure of NDAR and illustrate how data are likely to grow over time and become available for HSR and HTA.

  11. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage system between subprojects for sharing and integrating research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage the documents and reports produced over the course of the project.

  12. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage system between subprojects for sharing and integrating research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage the documents and reports produced over the course of the project

  13. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    Science.gov (United States)

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
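The pattern GUIdock-VNC implements, a containerized GUI rendered in the browser over VNC, follows the usual Docker workflow of publishing the container's web port to the host. The commands below are a hedged sketch: the image name and port are hypothetical placeholders, not the project's actual artifacts:

```shell
# Hedged sketch of the general container-plus-browser-VNC pattern.
# The image name "example/gui-tool-vnc" and port 6080 are hypothetical
# placeholders; consult the GUIdock-VNC documentation for real values.
docker pull example/gui-tool-vnc

# Run detached, publishing the in-container noVNC web port to the host.
docker run -d --name guitool -p 6080:6080 example/gui-tool-vnc

# The containerized graphical desktop is then reachable in a browser:
#   http://localhost:6080/
```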

  14. Investigation on construction of the database system for research and development of the global environment industry technology; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system no kochiku ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This paper studies a concrete plan to introduce a new database system at the Research Institute of Innovative Technology for the Earth (RITE), which is necessary to promote the industrial technology development that contributes to solving the global environmental problem. Specifications for system introduction cover vendor selection, the operation system, a detailed introduction schedule, etc. The RITE in-house database has problems with its operation system and maintenance cost, and its construction cost tends to be high in comparison with its utilization rate; further study is made of its introduction. Information provided by the in-house database is limited to that owned by the organization, while information from outside the organization is provided by external databases. Information is registered and selected by the registrants themselves. The access network will initially be a personal computer network, with a transition to the Internet planned for the future. For practical construction of the system, it is necessary to clarify users' detailed needs for the system design and to coordinate functions between hardware systems. 32 figs., 9 tabs.

  15. Research on the establishment of the database system for R and D on the innovative technology for the earth; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    For the purpose of structuring a database system of technical information about earth environmental issues, the 'database system for R and D of the earth environmental industrial technology' was evaluated operationally, and studies were made to open it to users and to structure a prototype database. As pointed out in the operational evaluation, in the present state the utilization frequency is low due to lack of UNIX experience, the absence of system managers and a shortage of usable articles listed, so that updating of the database does not progress as intended. Therefore, study was made to introduce tools usable by the originators and to open an information access terminal to the researchers at headquarters via the Internet. In order for earth environment-related researchers to obtain information easily, a database was prototyped to support research exchange. Tasks were identified for selecting the fields of research and compiling common thesauri in Japanese, Western and other languages. 28 figs., 16 tabs.

  16. Desktop publishing and validation of custom near visual acuity charts.

    Science.gov (United States)

    Marran, Lynn; Liu, Lei; Lau, George

    2008-11-01

    Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
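The chart levels described above follow a logarithmic (logMAR) progression. As a rough sketch of the geometry involved, the snippet below computes the physical optotype height for each level, assuming the standard convention that a 20/20 letter subtends 5 arcmin at the test distance; the 40 cm near distance is an assumption for this example, not necessarily the paper's value:

```python
# Sketch: printed letter heights for a logMAR chart progression.
# Assumes the standard 5-arcmin 20/20 optotype; the 40 cm (400 mm)
# near distance is an illustrative assumption.
import math

def letter_height_mm(snellen_denominator, distance_mm=400.0):
    """Physical optotype height for 20/denominator at the given distance."""
    mar_arcmin = snellen_denominator / 20.0            # minimum angle of resolution
    angle_rad = math.radians(5.0 * mar_arcmin / 60.0)  # letter subtends 5x MAR
    return 2.0 * distance_mm * math.tan(angle_rad / 2.0)

# The six chart levels from the abstract, 20/20 through 20/63:
for denom in (20, 25, 32, 40, 50, 63):
    print(f"20/{denom}: {letter_height_mm(denom):.2f} mm")
```

Because the progression is logarithmic, each step multiplies the letter height by roughly 10^0.1, and 20/40 letters come out almost exactly twice the height of 20/20 letters.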

  17. Working Inside The Box: An Example Of Google Desktop Search in a Forensic Examination

    Directory of Open Access Journals (Sweden)

    Timothy James LaTulippe

    2011-12-01

    Full Text Available The amount of information mankind stores, and the technology developed to store it, have grown tremendously over the past few decades. As the total amount of stored data rapidly increases alongside the number of widely available computer-driven devices, solutions are being developed to better harness this data. These advancements continually assist investigators and computer forensic examiners. One application that houses copious amounts of fruitful data is the Google Desktop Search program. Coupled with tested and verified techniques, examiners can exploit the power of this application to serve their investigative needs. This paper presents a real-world case example of these techniques and their outcome.

  18. Effective Web and Desktop Retrieval with Enhanced Semantic Spaces

    Science.gov (United States)

    Daoud, Amjad M.

    We describe the design and implementation of the NETBOOK prototype system for collecting, structuring, and efficiently creating semantic vectors for concepts, noun phrases, and documents from a corpus of free full-text ebooks available on the World Wide Web. Automatic generation of concept maps from correlated index terms and extracted noun phrases is used to build a powerful conceptual index of individual pages. To ensure the scalability of our system, dimension reduction is performed using Random Projection [13]. Furthermore, we present a complete evaluation of the relative effectiveness of the NETBOOK system versus the Google Desktop [8].

  19. Desk-top computer assisted processing of thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Archer, B.R.; Glaze, S.A.; North, L.B.; Bushong, S.C.

    1977-01-01

    An accurate dosimetric system utilizing a desk-top computer and high sensitivity ribbon type TLDs has been developed. The system incorporates an exposure history file and procedures designed for constant spatial orientation of each dosimeter. Processing of information is performed by two computer programs. The first calculates relative response factors to insure that the corrected response of each TLD is identical following a given dose of radiation. The second program computes a calibration factor and uses it and the relative response factor to determine the actual dose registered by each TLD. (U.K.)
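The two-program pipeline described above can be sketched as follows. The function names and the example numbers are illustrative, not taken from the original desk-top programs.

```python
# Sketch of the two-pass TLD processing: pass 1 derives relative response
# factors from a uniform calibration exposure; pass 2 applies them together
# with a calibration factor to convert raw readings into doses.

def relative_response_factors(readings):
    """Pass 1: per-TLD factors that equalize response to a uniform dose."""
    mean = sum(readings) / len(readings)
    return [mean / r for r in readings]

def doses(raw, factors, calibration_factor):
    """Pass 2: apply the calibration factor and each TLD's response factor."""
    return [r * f * calibration_factor for r, f in zip(raw, factors)]

# Uniform calibration exposure: chip 2 reads 10% hot, chip 3 reads 10% cold.
cal = [100.0, 110.0, 90.0]
f = relative_response_factors(cal)
corrected = [r * fi for r, fi in zip(cal, f)]   # identical for every chip
measured = doses([50.0, 55.0, 45.0], f, 2.0)    # same dose seen by all chips
```

After pass 1, the corrected response of each TLD to a given dose is identical, which is exactly the invariant the abstract describes.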

  20. Micro Tools with Pneumatic Actuators for Desktop Factories

    Directory of Open Access Journals (Sweden)

    Björn HOXHOLD

    2009-10-01

    Full Text Available This paper presents the design, simulation, and fabrication process of two novel pneumatically driven auxiliary micro tools that can be used to improve and speed up assembly processes in desktop factories. The described micro systems are designed to function as centrifugal feeders for small glass balls or as active clamping devices with small external dimensions. They are able to deliver more than six balls per second on demand to a gripper, and to align and clamp single chips in a fixed position.

  1. Desktop computer graphics for RMS/payload handling flight design

    Science.gov (United States)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, developed for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user-definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.

  2. Bringing the medical library to the office desktop.

    Science.gov (United States)

    Brown, S R; Decker, G; Pletzke, C J

    1991-01-01

    This demonstration illustrates LRC Remote Computer Services: a dual-operating-system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic-media file services; Novell and AppleTalk network protocol suites and gating; LAN and asynchronous (dial-in) access strategies; commercial applications for MS-DOS and Macintosh workstations; and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of these services, particularly with respect to maintenance, security, training/support, staffing, software licensing, and costs.

  3. Veterans Administration Databases

    Science.gov (United States)

    The Veterans Administration Information Resource Center provides database and informatics experts, customer service, expert advice, information products, and web technology to VA researchers and others.

  4. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Full Text Available Following the rapid growth of ubiquitous computing, many jobs that were previously manual have been automated. This automation has increased the amount of time available for leisure, and diverse services are now being developed for this leisure time. In addition, with the development of small, portable devices such as smartphones, diverse Internet services can be used regardless of time and place. Studies on virtualization are currently in progress, aiming to determine how to efficiently store and process the big data generated by the multitude of devices and services in use. One such topic is desktop storage virtualization, which integrates distributed desktop resources and provides them to users via virtualization. In desktop storage virtualization, high availability is necessary and important for providing reliability to users. Studies on hierarchical structures and resource integration are also in progress, aiming at efficient data distribution and storage for distributed desktops based on resource-integration environments. However, studies on efficient responses to server faults occurring in desktop-based resource-integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS) with high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments, and it activates alternative servers when a fault occurs within the system.

  5. Bibliometric analysis of Spanish scientific publications in the subject Construction & Building Technology in Web of Science database (1997-2008

    Directory of Open Access Journals (Sweden)

    Rojas-Sola, J. I.

    2010-12-01

    Full Text Available In this paper, the publications from Spanish institutions listed in journals of the Construction & Building Technology subject category of the Web of Science database for the period 1997-2008 are analyzed. The number of journals involved is 35, and the number of articles published (Article or Review) was 760. A bibliometric assessment was also performed, and we propose two new parameters: the Weighted Impact Factor and the Relative Impact Factor; the number of citations and the number of documents at the institutional level are also included. Among the institutions with the greatest scientific production is, as expected, the Institute of Constructional Science Eduardo Torroja (CSIC), while by Weighted Impact Factor the University of Vigo ranks first. On the other hand, only two journals, Cement and Concrete Materials and Materiales de Construcción, account for 45.26% of the Spanish scientific production published in the Construction & Building Technology subject category, with 172 papers each. Regarding international cooperation, prominent partner countries include England, Mexico, the United States, Italy, Argentina, and France.


  6. RA radiological characterization database application

    International Nuclear Information System (INIS)

    Steljic, M.M.; Ljubenov, V.Lj. (E-mail address of corresponding author: milijanas@vin.bg.ac.yu)

    2005-01-01

    Radiological characterization of the RA research reactor is one of the main activities in the first two years of the reactor decommissioning project. The raw characterization data from direct measurements or laboratory analyses (defined within the existing sampling and measurement programme) have to be interpreted, organized and summarized in order to prepare the final characterization survey report. This report should be made so that the radiological condition of the entire site is completely and accurately shown, with the radiological condition of the components clearly depicted. This paper presents an electronic database application, designed as a serviceable and efficient tool for characterization data storage, review and analysis, as well as for report generation. A relational database model was designed, and the application was built using Microsoft Access 2002 (SP1), a 32-bit RDBMS for desktop and client/server database applications that run under Windows XP. (author)

  7. 3d visualization of atomistic simulations on every desktop

    Science.gov (United States)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-08-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole room system, we selected an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, and viewed through colored glasses, or two squares of cellophane from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new, 6.1 version, which is RedHat, CentOS and Ubuntu compatible. Examples using data from our own research and that of other groups will be given.
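The anaglyphic stereo technique the abstract revives (two slightly displaced images, one per eye, separated by colored glasses) can be sketched in a few lines. The channel assignment (red = left eye, cyan = right eye) follows the classic red/cyan convention; the use of numpy and np.roll to stand in for two properly rendered eye views is an illustrative shortcut, not how AViz itself works.

```python
import numpy as np

# Red/cyan anaglyph sketch: the left view goes on the red channel, the right
# view on the green and blue (cyan) channels of a single RGB frame.
def anaglyph(left_gray, right_gray):
    h, w = left_gray.shape
    rgb = np.zeros((h, w, 3), dtype=left_gray.dtype)
    rgb[..., 0] = left_gray      # red channel carries the left-eye image
    rgb[..., 1] = right_gray     # green ...
    rgb[..., 2] = right_gray     # ... and blue together read as cyan
    return rgb

scene = np.random.rand(64, 64)                  # stand-in for a rendered frame
frame = anaglyph(np.roll(scene, 2, axis=1),     # left eye: shifted right
                 np.roll(scene, -2, axis=1))    # right eye: shifted left
```

Viewed through red/cyan glasses, each eye sees only its own displaced image, which the brain fuses into depth, with no special display hardware required.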

  8. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students aged 20 to 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers, with measurements taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when using the smaller computers. There were significantly greater neck movements when using the desktop computer compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as computer size decreased. Typing performance and subjective ratings of keyboard difficulty also differed significantly among the 3 types of computers. Computer users need to consider spinal posture and the potential risk of developing musculoskeletal discomfort when choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  9. 3d visualization of atomistic simulations on every desktop

    International Nuclear Information System (INIS)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-01-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole room system, we selected an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, and viewed through colored glasses, or two squares of cellophane from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new, 6.1 version, which is RedHat, CentOS and Ubuntu compatible. Examples using data from our own research and that of other groups will be given

  10. SERVICE HANDBOOK FOR THE DESKTOP SUPPORT CONTRACT WITH IT DIVISION

    CERN Multimedia

    2000-01-01

    A Desktop Support Contract has been running since January 1999 to offer help to all users at CERN with problems that occur with their desktop computers. The contract is run jointly by the Swedish company WM-data and the Swiss company DCS. The contract comprises the Computing Helpdesk, a General Service for all parts of CERN, and a Local Service for those divisions and groups that want faster response times and additional help with their specific computer environment. In order to describe what services are being offered, and also to give a better understanding of the structure of the contract, a Service Handbook has been created. The intended audience for the Service Handbook is everyone using the contract, i.e. users, managers, and also the service staff inside the contract. In the handbook you will find what help you can get from the contract, how to get in touch with the contract, and also what response times you can expect. Since the computer environment at CERN is an ever-changing entity, ...

  11. REPLIKASI UNIDIRECTIONAL PADA HETEROGEN DATABASE

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2013-01-01

    The use of diverse database technologies in enterprises today cannot be avoided, so technology is needed to generate information from them in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we use a Windows-based MS SQL Server database as the source and a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly, with testing of working models of the...
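A minimal sketch of the unidirectional replication idea: rows flow one way from a source database to a target. Two in-memory sqlite3 databases stand in for the study's MS SQL Server source and Oracle target, and the max-id change-detection scheme is an illustrative assumption, not the prototype's actual mechanism.

```python
import sqlite3

# Source and target stand-ins for the heterogeneous pair in the study.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

def replicate_once(source, target):
    """One pull cycle: copy rows the target has not seen yet (by max id)."""
    last = target.execute("SELECT COALESCE(MAX(id), 0) FROM orders").fetchone()[0]
    rows = source.execute(
        "SELECT id, total FROM orders WHERE id > ?", (last,)).fetchall()
    target.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return len(rows)

replicate_once(src, dst)   # target now mirrors the source
```

Real heterogeneous replication additionally has to translate SQL dialects and data types between vendors, which is the hard part the study's prototype addresses.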

  12. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which hoteliers' opinions are divided between those who think it is just another fashion trend, unnecessary to take into consideration, and those who believe it helps in performing daily operations more easily, leaving space for more interaction with guests in both the virtual and real worlds. Usage of cloud technology in hotels is still in its beginning phase, and hoteliers still have to learn more about its advantages and adequate usage for the benefit of overall hotel operation. Using the example of the hotel property management system (PMS) and a comparison between the features of its older desktop version and new web-based programs, this research aims at finding out at which stage cloud technology usage in hotels is and how effective it is. For this, qualitative research with semi-structured interviews with hotel managers that use one of these programs was conducted. Reasons for usage and advantages of each version are discussed.

  13. Attack Potential Evaluation in Desktop and Smartphone Fingerprint Sensors: Can They Be Attacked by Anyone?

    Directory of Open Access Journals (Sweden)

    Ines Goicoechea-Telleria

    2018-01-01

    Full Text Available The use of biometrics keeps growing. Every day, we use biometric recognition to unlock our phones or to gain access to places such as the gym or the office, so we rely on the security manufacturers offer when protecting our privileges and private life. It is well known that it is possible to hack into a fingerprint sensor using fake fingers made of Play-Doh and other easy-to-obtain materials, but to what extent? Is this true for all users, or only for specialists with a deep knowledge of biometrics? Are smartphone fingerprint sensors as reliable as desktop sensors? To answer these questions, we performed 3 separate evaluations. First, we evaluated 4 desktop fingerprint sensors of different technologies by attacking them with 7 different fake-finger materials; all of them were successfully attacked by an experienced attacker. Secondly, we carried out a similar test on 5 smartphones with embedded sensors using the most successful materials, which also defeated all 5 sensors. Lastly, we gathered 15 simulated attackers with no background in biometrics to create fake fingers of several materials; starting from a short video of the creation techniques, they had one week to attack the fingerprint sensors of the same 5 smartphones. All 5 smartphones were successfully attacked by an inexperienced attacker. This paper provides the results achieved, as well as an analysis of the attack potential in every case. All results are given following the metrics of the standard ISO/IEC 30107-3.

  14. Emission of particulate matter from a desktop three-dimensional (3D) printer

    Science.gov (United States)

    Yi, Jinghai; LeBouf, Ryan F.; Duling, Matthew G.; Nurkiewicz, Timothy; Chen, Bean T.; Schwegler-Berry, Diane; Virji, M. Abbas; Stefaniak, Aleksandr B.

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5-m3 chamber and in a small room (32.7 m3) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color.

  15. Emission of particulate matter from a desktop three-dimensional (3D) printer.

    Science.gov (United States)

    Yi, Jinghai; LeBouf, Ryan F; Duling, Matthew G; Nurkiewicz, Timothy; Chen, Bean T; Schwegler-Berry, Diane; Virji, M Abbas; Stefaniak, Aleksandr B

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5-m3 chamber and in a small room (32.7 m3) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color.

  16. A Personal Desktop Liquid-Metal Printer as a Pervasive Electronics Manufacturing Tool for Society in the Near Future

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-12-01

    Full Text Available It has long been a dream in the electronics industry to be able to write out electronics directly, as simply as printing a picture onto paper with an office printer. The first-ever prototype of a liquid-metal printer has been invented and demonstrated by our lab, bringing this goal a key step closer. As part of a continuous endeavor, this work is dedicated to significantly extending such technology to the consumer level by making a very practical desktop liquid-metal printer for society in the near future. Through the industrial design and technical optimization of a series of key technical issues such as working reliability, printing resolution, automatic control, human-machine interface design, software, hardware, and integration between software and hardware, a high-quality personal desktop liquid-metal printer that is ready for mass production in industry was fabricated. Its basic features and important technical mechanisms are explained in this paper, along with demonstrations of several possible consumer end-uses for making functional devices such as light-emitting diode (LED displays. This liquid-metal printer is an automatic, easy-to-use, and low-cost personal electronics manufacturing tool with many possible applications. This paper discusses important roles that the new machine may play for a group of emerging needs. The prospective future of this cutting-edge technology is outlined, along with a comparative interpretation of several historical printing methods. This desktop liquid-metal printer is expected to become a basic electronics manufacturing tool for a wide variety of emerging practices in the academic realm, in industry, and in education as well as for individual end-users in the near future.

  17. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    International Nuclear Information System (INIS)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon

    1996-05-01

    The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project constructed a microfiche research reports database and developed a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS, and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword, and report number. 6 tabs., 8 figs., 11 refs. (Author)

  18. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    Energy Technology Data Exchange (ETDEWEB)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-05-01

    The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project constructed a microfiche research reports database and developed a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS, and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword, and report number. 6 tabs., 8 figs., 11 refs. (Author)

  19. Image Format Conversion to DICOM and Lookup Table Conversion to Presentation Value of the Japanese Society of Radiological Technology (JSRT) Standard Digital Image Database.

    Science.gov (United States)

    Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki

    2016-01-01

    The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images are simply digitized as relative density values obtained with a scanned-film digitizer. As a result, the pixel values are completely different from the standardized display system input value of digital imaging and communications in medicine (DICOM), called the presentation value (P-value), which can maintain visual consistency when images are observed on displays of different luminance. Therefore, we converted all the images in the JSRT standard digital image database to DICOM format, followed by conversion of the pixel values to P-values using an original program we developed. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of images is maintained among displays of different luminance.
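The pixel-value conversion step can be sketched as below. This is a deliberately simplified stand-in: real P-values follow the DICOM Grayscale Standard Display Function (GSDF), whereas the linear lookup table here only illustrates the invert-and-map shape of the pipeline; all names and the 12-bit range are assumptions, not the paper's program.

```python
import numpy as np

# Simplified stand-in: invert scanned-film density (dense film = dark) and
# map it through a monotone lookup table into 12-bit presentation values.
# The real P-value mapping follows the DICOM GSDF; the identity LUT here
# only illustrates where a GSDF-derived LUT would plug in.
def density_to_pvalue(pixels, bits_stored=12):
    pmax = (1 << bits_stored) - 1                  # 4095 for 12 bits
    inverted = pixels.max() - pixels               # density -> brightness
    lut = np.arange(pmax + 1, dtype=np.uint16)     # monotone (identity) LUT
    idx = (inverted * pmax // max(int(inverted.max()), 1)).astype(int)
    return lut[idx]

densities = np.array([[0, 1024], [2048, 4095]])
pvals = density_to_pvalue(densities)               # densest film -> P-value 0
```

Because the LUT is monotone, relative brightness ordering is preserved, which is the property that lets P-values look consistent across displays of different luminance once the GSDF is applied.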

  20. Laevo: A Temporal Desktop Interface for Integrated Knowledge Work

    DEFF Research Database (Denmark)

    Jeuris, Steven; Houben, Steven; Bardram, Jakob

    2014-01-01

    Prior studies show that knowledge work is characterized by highly interlinked practices, including task, file and window management. However, existing personal information management tools primarily focus on a limited subset of knowledge work, forcing users to perform additional manual configuration work to integrate the different tools they use. In order to understand tool usage, we review literature on how users' activities are created and evolve over time as part of knowledge worker practices. From this we derive the activity life cycle, a conceptual framework describing the different states and transitions of an activity. The life cycle is used to inform the design of Laevo, a temporal activity-centric desktop interface for personal knowledge work. Laevo allows users to structure work within dedicated workspaces, managed on a timeline. Through a centralized notification system which...

  1. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.

  2. Desk-top microcomputer for lab-scale process control

    International Nuclear Information System (INIS)

    Overman, R.F.; Byrd, J.S.; Goosey, M.H.; Sand, R.J.

    1981-01-01

    A desk-top microcomputer was programmed to acquire the data from various process control sensors installed in a laboratory scale liquid-liquid extraction, pulse column facility. The parameters monitored included valve positions, gamma spectra, alpha radioactivity, temperature, pH, density, and flow rates. The program for the microcomputer is written in BASIC and requires about 31000 8-bit bytes of memory. All data is stored on floppy discs, and can be displayed or printed. Unexpected data values are brought to the process operator's attention via CRT display or print-out. The general organization of the program and a few subroutines unique to polling instruments are explained. Some of the data acquisition devices were designed and built at the Savannah River Laboratory. These include a pulse height analyzer, a data multiplexer, and a data acquisition instrument. A general description of the electronics design of these instruments is also given with emphasis placed on data formatting and bus addressing

  3. A Desktop Screen Sharing System based on Various Connection Methods

    Science.gov (United States)

    Negishi, Yuya; Kawaguchi, Nobuo

    Recently it has become very common to use information devices such as PCs during presentations and discussions. In these situations, a need arises for techniques that allow a smooth switch of presenters without changing cables, or easy screen sharing in remote videoconferences. In this paper, we propose a desktop screen sharing system that can be used for such purposes and situations. To that end, we designed an automatic control of connections in the VNC system that can be operated remotely over the network. We also propose an interface that assigns a role such as "screen sender" or "screen receiver" to each terminal. In the proposed system, while sharing a screen between multiple terminals, one can easily display and browse the screen without having to understand how the other terminals are connected. We also implemented a "role card" using a contactless IC card, where roles are assigned simply by placing the card on the IC reader.

  4. Direct Desktop Printed-Circuits-on-Paper Flexible Electronics

    Science.gov (United States)

    Zheng, Yi; He, Zhizhu; Gao, Yunxia; Liu, Jing

    2013-01-01

    There is currently no way to directly write out electronics, in the way an office printer prints pictures on paper. Here we show desktop printing of flexible circuits on paper by developing a liquid metal ink and the related working mechanisms. By modifying the adhesion of the ink, overcoming its high surface tension with a dispensing machine, designing a brush-like porous pinhead for printing the alloy, and identifying matched substrate materials among different papers, the slightly oxidized alloy ink was demonstrated to print flexibly on coated paper, from which various functional electronics could be composed; the concept of Printed-Circuits-on-Paper was thus presented. Further, RTV silicone rubber was adopted as the insulating ink and packaging material to guarantee the functional stability of the circuit, which suggests an approach for printing 3D hybrid electro-mechanical devices. The present work paves the way for a low-cost and straightforward method of directly printing paper electronics.

  5. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from their use, and it allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  6. dictyExpress: a Dictyostelium discoideum gene expression database with an explorative data analysis web-based interface

    Science.gov (United States)

    Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz

    2009-01-01

    Background Bioinformatics often leverages recent advances in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advances in interactive web technologies require us to rethink the standard submit-and-wait paradigm and craft bioinformatics web applications that share the analytical and interactive power of their desktop relatives, while retaining simplicity and availability. Results We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database, which consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web application components that are seamlessly interlinked and immediately respond to events triggered by the user, thus providing a powerful explorative data analysis environment. Conclusion dictyExpress is a precursor of a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt it to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms. PMID:19706156

  7. Visual attention for a desktop virtual environment with ambient scent

    Directory of Open Access Journals (Sweden)

    Alexander eToet

    2013-11-01

    Full Text Available In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk, together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and fewer signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations, the results show that the presence of an ambient odor did not affect the participants' visual attention for signs of disorder or their emotional response. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user's attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).

  8. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.

    2012-12-01

    Despite the ongoing efforts in turbulence research, the universal properties of the turbulence small-scale structure and the relationships between small- and large-scale turbulent motions are not yet fully understood. The visually guided exploration of turbulence features, including the interactive selection and simultaneous visualization of multiple features, can further progress our understanding of turbulence. Accomplishing this task for flow fields in which the full turbulence spectrum is well resolved is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence visualization that works on a compressed flow field representation. We use a wavelet-based compression scheme including run-length and entropy encoding, which can be decoded on the GPU and embedded into brick-based volume ray-casting. This enables a drastic reduction of the data to be streamed from disk to GPU memory. Our system derives turbulence properties directly from the velocity gradient tensor, and it either renders these properties in turn or generates and renders scalar feature volumes. The quality and efficiency of the system are demonstrated in the visualization of two unsteady turbulence simulations, each comprising a spatio-temporal resolution of 1024⁴. On a desktop computer, the system can visualize each time step in 5 seconds, and it achieves about three times this rate for the visualization of a scalar feature volume. © 1995-2012 IEEE.
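    The compressed representation described above combines a wavelet transform with run-length and entropy encoding. As a much-simplified illustration of the run-length stage only (not the authors' GPU decoder), the Python sketch below shows how the long zero runs typical of quantized wavelet coefficients collapse into a short list of (value, count) pairs; the function names are ours:

```python
def rle_encode(values):
    """Run-length encode a sequence into (value, count) pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    """Invert rle_encode: expand (value, count) pairs back to a flat list."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# Quantized wavelet coefficients are mostly zero, so runs compress well.
coeffs = [0, 0, 0, 0, 3, 3, 0, 0, 0, 1]
assert rle_encode(coeffs) == [(0, 4), (3, 2), (0, 3), (1, 1)]
assert rle_decode(rle_encode(coeffs)) == coeffs
```

    In the paper's pipeline the run lengths would additionally be entropy-coded and decoded on the GPU inside the ray-caster; the sketch only conveys why sparse coefficient fields shrink so dramatically.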

  9. Visual attention for a desktop virtual environment with ambient scent.

    Science.gov (United States)

    Toet, Alexander; van Schaik, Martin G

    2013-01-01

    In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk, together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and fewer signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations, the results provide no indication that the presence of an ambient odor affected the participants' visual attention for signs of disorder or their emotional response. However, the paradigm used in the present study does not allow us to draw any conclusions in this respect. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user's attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).

  10. Development and deployment of a Desktop and Mobile application on grid for GPS studie

    Science.gov (United States)

    Ntumba, Patient; Lotoy, Vianney; Djungu, Saint Jean; Fleury, Rolland; Petitdidier, Monique; Gemünd, André; Schwichtenberg, Horst

    2013-04-01

    GPS networks for scientific studies are being developed all over the world, and large, regularly updated databases such as IGS are also available. Many GPS receivers have been installed in West and Central Africa during AMMA (African Monsoon Multidisciplinary Analysis), IHY (International Heliophysical Year), and many other projects since 2005. African scientists have been trained to use these data, especially for meteorological and ionospheric studies. Computing the annual variations of ionospheric parameters for a given station, or maps for a given region, is very computationally intensive, so grid or cloud computing may be a solution for obtaining results in a relatively short time. At the University of Kinshasa the chosen solution is a grid of several PCs. It has been deployed by using Globus Toolkit on a Condor pool in order to support the processing of GPS data for ionospheric studies. To be user-friendly, graphical user interfaces (GUIs) have been developed to help the user prepare and submit jobs: a Java GUI for the desktop client and an Android GUI for the mobile client. The interest of a grid is the possibility of sending a batch of jobs with an adequate control agent to monitor job execution and result storage. After the feasibility study, the grid will be extended to a larger number of PCs, and other solutions will be explored in parallel.

  11. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    Science.gov (United States)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.

  12. Providing Databases for Different Indoor Positioning Technologies: Pros and Cons of Magnetic Field and Wi-Fi Based Positioning

    Directory of Open Access Journals (Sweden)

    Joaquín Torres-Sospedra

    2016-01-01

    Full Text Available Localization is one of the main pillars of indoor services. However, it is still very difficult for the mobile sensing community to compare state-of-the-art indoor positioning systems due to the scarcity of publicly available databases. To make fair and meaningful comparisons between indoor positioning systems, they must be evaluated in the same situation, or in the same sets of situations. In this paper, two databases are introduced for studying the performance of magnetic field and Wi-Fi fingerprinting based positioning systems in the same environment (i.e., the same indoor area). The "magnetic" database contains more than 40,000 discrete captures (270 continuous samples), whereas the "Wi-Fi" database contains 1,140. The environment and both databases are fully detailed in this paper. A set of experiments is also presented in which two simple but effective baselines were developed to test the suitability of the databases. Finally, the pros and cons of both types of positioning techniques are discussed in detail.
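    Fingerprinting databases like these are typically consumed by a nearest-neighbor matcher: an online capture is compared against the stored reference fingerprints and a position is interpolated from the closest matches. The Python sketch below shows a generic weighted k-NN estimator for Wi-Fi RSSI fingerprints; the data shapes and the -100 dBm floor for unseen access points are illustrative assumptions, not the baselines used with these databases:

```python
import math

def estimate_position(sample, database, k=3):
    """Weighted k-nearest-neighbor position estimate from a fingerprint DB.

    sample: {ap_id: rssi_dbm} measured online.
    database: list of ({ap_id: rssi_dbm}, (x, y)) reference captures.
    APs missing from either side count as a very weak -100 dBm signal.
    """
    def distance(a, b):
        aps = set(a) | set(b)
        return math.sqrt(sum((a.get(ap, -100.0) - b.get(ap, -100.0)) ** 2
                             for ap in aps))

    nearest = sorted(database, key=lambda rec: distance(sample, rec[0]))[:k]
    # Inverse-distance weights; the epsilon guards against exact matches.
    weights = [1.0 / (distance(sample, fp) + 1e-6) for fp, _ in nearest]
    total = sum(weights)
    x = sum(w * pos[0] for w, (_, pos) in zip(weights, nearest)) / total
    y = sum(w * pos[1] for w, (_, pos) in zip(weights, nearest)) / total
    return (x, y)
```

    Magnetic-field positioning replaces the RSSI vector with a sequence of field magnitudes, but the match-against-database principle is the same.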

  13. DATABASE REPLICATION IN HETEROGENOUS PLATFORM

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2014-01-01

    The application of diverse database technologies in enterprises today is increasingly common practice. To provide high availability and survivability of real-time information, a database replication technology that is capable of replicating databases across heterogeneous platforms is required. The purpose of this research is to find a technology with such capability. In this research, the data source is stored in an MSSQL database server running on Windows. The data will be replicated to MyS...

  14. INIS: international nuclear information system. World's first international database on peaceful uses of nuclear sciences and technologies

    International Nuclear Information System (INIS)

    Surmont, J.; Constant, A.; Guille, N.; Le Blanc, A.; Mouffron, O.; Anguise, P.; Jouve, J.J.

    2007-01-01

    This poster, prepared for the 2007 CEA meetings on scientific and technical information, presents the INIS information system; the document types, content, and subject coverage of the database; the French contribution to this system through the INIS team of CEA-Saclay; the input preparation process; and an example of valorization of a scientific and historical patrimony: the CEA/IAEA joint project to digitize about 2760 CEA reports published between 1948 and 1969. All these reports have been digitized by the IAEA, analyzed by the CEA, and entered into the INIS database with a link to the full text. (J.S.)

  15. VRLane: a desktop virtual safety management program for underground coal mine

    Science.gov (United States)

    Li, Mei; Chen, Jingzhu; Xiong, Wei; Zhang, Pengpeng; Wu, Daozheng

    2008-10-01

    VR technologies, which generate immersive, interactive, three-dimensional (3D) environments, are seldom applied to coal mine safety management. In this paper, a new method that combines VR technologies with an underground mine safety management system was explored, and a desktop virtual safety management program for underground coal mines, called VRLane, was developed. The paper mainly concerns current research advances in VR, system design, key techniques, and system application. Two important techniques are introduced. First, an algorithm was designed and implemented with which 3D laneway and equipment models can be built automatically from the latest 2D mine drawings, whereas common VR programs establish the 3D environment with 3ds Max or other 3D modeling packages, in which laneway models are built manually and laboriously. Second, VRLane realizes system integration with underground industrial automation. VRLane not only depicts a realistic 3D laneway environment, but also describes the status of coal mining, with functions for displaying the running states and related parameters of equipment, raising alarms for abnormal mining events, and animating mine cars, mine workers, and long-wall shearers. The system, which is cheap, dynamic, and easy to maintain, provides a useful tool for safe production management in coal mines.

  16. Integrasi pemrograman web pada pemrograman desktop sebagai alternatif fasilitas laporan dalam pengembangan program aplikasi

    Directory of Open Access Journals (Sweden)

    Mardainis Mardainis

    2017-11-01

    Full Text Available Desktop programming produces application programs that can operate without relying on an Internet connection. Desktop programs are typically used for applications operated at a single location, without the need for a network, whereas web programs depend entirely on the Internet to connect their users. The choice between a desktop program and a web-based program is determined by the requirements and the deployment setting: if the application is only used within a company located at a single site, a desktop-based program is preferable, but if the company has separate locations in several regions, a web-based program is more appropriate. However, many programmers, especially beginners, are reluctant to use desktop programming because producing reports requires a dedicated report-generation application such as Crystal Report. The difficulty with such a dedicated application is that it is not included with the system and must be procured separately, and making reports can feel rather complicated because the report layout must be set up manually. In web-based programming languages, by contrast, information displays can be created easily within the program itself, without additional applications, so producing reports with a web-based program is easier. To spare programmers this difficulty when building reports for desktop programs, the researchers integrated web-based programming with desktop-based programming, with the aim of simplifying report creation. Keywords: desktop programming, implementation, integration, Crystal Report.

  17. BDE-209 in the Australian Environment: Desktop review

    International Nuclear Information System (INIS)

    English, Karin; Toms, Leisa-Maree L.; Gallen, Christie; Mueller, Jochen F.

    2016-01-01

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  18. Analysis of helium-ion scattering with a desktop computer

    Science.gov (United States)

    Butler, J. W.

    1986-04-01

    This paper describes a program, written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in the materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered-ion spectrum; therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communication between user and computer. The composition matrix is edited on screen with a two-dimensional forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.

  19. BDE-209 in the Australian Environment: Desktop review

    Energy Technology Data Exchange (ETDEWEB)

    English, Karin, E-mail: k.english@uq.edu.au [School of Medicine, The University of Queensland, Brisbane (Australia); Children’s Health and Environment Program, Child Health Research Centre, The University of Queensland, Brisbane (Australia); Queensland Children’s Medical Research Institute, Children’s Health Research Centre, Brisbane (Australia); Toms, Leisa-Maree L. [School of Public Health and Social Work, and Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane (Australia); Gallen, Christie; Mueller, Jochen F. [The University of Queensland, National Research Centre for Environmental Toxicology (Entox), Brisbane (Australia)

    2016-12-15

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  20. Big Memory Elegance: HyperCard Information Processing and Desktop Publishing.

    Science.gov (United States)

    Bitter, Gary G.; Gerson, Charles W., Jr.

    1991-01-01

    Discusses hardware requirements, functions, and applications of five information processing and desktop publishing software packages for the Macintosh: HyperCard, PageMaker, Cricket Presents, Power Point, and Adobe illustrator. Benefits of these programs for schools are considered. (MES)

  1. Development of an automated desktop procedure for defining macro-reaches for river longitudinal profiles

    CSIR Research Space (South Africa)

    Dollar, LH

    2006-07-01

    Full Text Available This paper presents an automated desktop procedure for delineating river longitudinal profiles into macro-reaches for use in Ecological Reserve assessments and to aid freshwater ecosystem conservation planning. The procedure was developed for use...

  2. ACID Astronomical and Physics Cloud Interactive Desktop: A Prototype of VUI for CTA Science Gateway

    Science.gov (United States)

    Massimino, P.; Costa, A.; Becciani, U.; Vuerli, C.; Bandieramonte, M.; Petta, C.; Riggi, S.; Sciacca, E.; Vitello, F.; Pistagna, C.

    2014-05-01

    The Astronomical & Physics Cloud Interactive Desktop, developed for the prototype of the CTA Science Gateway in Catania, Italy, allows many software packages to be used without any installation on the local desktop. Users are able to exploit, where applicable, the native Graphical User Interface (GUI) of the programs available in the ACID environment. To use the remote programs interactively, ACID exploits an "ad hoc" VNC-based User Interface (VUI).

  3. A Look Under the Hood: How the JPL Tropical Cyclone Information System Uses Database Technologies to Present Big Data to Users

    Science.gov (United States)

    Knosp, B.; Gangl, M.; Hristova-Veleva, S. M.; Kim, R. M.; Li, P.; Turk, J.; Vu, Q. A.

    2015-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data and model forecasts related to tropical cyclones. Since 2010, the TCIS has run a near-real-time (NRT) data portal during the North Atlantic hurricane season, which typically lasts from June through October each year. Data collected by the TCIS vary by type, format, contents, and frequency, and are served to the user in two ways: (1) as image overlays on a virtual globe and (2) as derived output from a suite of analysis tools. In order to support these two functions, the data must be collected and then made searchable by criteria such as date, mission, product, pressure level, and geospatial region. Creating a database architecture that is flexible enough to manage, intelligently interrogate, and ultimately present this disparate data to the user in a meaningful way has been the primary challenge. The database solution for the TCIS has been a hybrid MySQL + Solr implementation. After testing other relational and NoSQL solutions, such as PostgreSQL and MongoDB respectively, this solution has given the TCIS the best combination of query speed and result reliability. It also supports the challenging (and memory-intensive) geospatial queries that are necessary for the analysis tools requested by users. Though hardly new technologies on their own, our implementation of MySQL + Solr had to be customized and tuned to accurately store, index, and search the TCIS data holdings. In this presentation, we will discuss how we arrived at our MySQL + Solr database architecture, why it offers us the most consistently fast and reliable results, and how it supports our front end so that we can offer users a look into our "big data" holdings.
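    The access pattern described — relational filters on criteria like mission, product, and date combined with geospatial constraints — can be sketched generically. Neither the TCIS schema nor its Solr queries are public in this abstract, so the example below is hypothetical: it uses Python's built-in SQLite as a stand-in relational store, with a made-up `granules` table and a bounding-box overlap test standing in for the geospatial index:

```python
import sqlite3

# Hypothetical schema: each data granule has a mission, product, date,
# and geographic bounding box.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE granules (
    id INTEGER PRIMARY KEY, mission TEXT, product TEXT, obs_date TEXT,
    lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
rows = [
    (1, "GPM", "rain_rate", "2015-08-20", 10.0, 25.0, -80.0, -60.0),
    (2, "GPM", "rain_rate", "2015-08-21", 30.0, 45.0, -50.0, -30.0),
    (3, "CYGNSS", "wind_speed", "2015-08-20", 12.0, 22.0, -75.0, -65.0),
]
conn.executemany("INSERT INTO granules VALUES (?,?,?,?,?,?,?,?)", rows)

def search(mission, date, bbox):
    """Relational filter on mission/date plus a bounding-box overlap test."""
    lat_lo, lat_hi, lon_lo, lon_hi = bbox
    cur = conn.execute(
        """SELECT id FROM granules
           WHERE mission = ? AND obs_date = ?
             AND lat_max >= ? AND lat_min <= ?
             AND lon_max >= ? AND lon_min <= ?""",
        (mission, date, lat_lo, lat_hi, lon_lo, lon_hi))
    return [r[0] for r in cur.fetchall()]

print(search("GPM", "2015-08-20", (5.0, 20.0, -85.0, -55.0)))  # → [1]
```

    In the real system the full-text and faceted side of such queries is what Solr handles well, while the relational store keeps the authoritative metadata; the sketch shows only why both kinds of predicate must be combinable in one search.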

  4. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    Science.gov (United States)

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline collaborative data storage and annotation. Its database model, kept simple, is compliant with most technologies and allows heterogeneous data to be stored and managed within the same system. Advanced permissions are managed through different roles, and templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user, and group. The Djeen component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and in the supplementary material. PMID:23742665

  5. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi

  6. AKA-TPG: a program for kinetic and epidemiological analysis of data from labeled glucose investigations using the two-pool model and database technology

    DEFF Research Database (Denmark)

    Boston, Raymond C; Stefanovski, Darko; Henriksen, Jan E

    2007-01-01

    of technical reasons have deterred researchers from performing TPG analysis. METHODS AND RESULTS: In this paper, we describe AKA-TPG, a new program that combines automatic kinetic analysis of the TPG model data with database technologies. AKA-TPG enables researchers who have no expertise in modeling to quickly...... fit the TPG model to individual FSHGT data sets consisting of plasma concentrations of unlabeled glucose, labeled glucose, and insulin. Most importantly, because the entire process is automated, parameters are almost always identified, and parameter estimates are accurate and reproducible. AKA...

  7. Utilization and success rates of unstimulated in vitro fertilization in the United States: an analysis of the Society for Assisted Reproductive Technology database.

    Science.gov (United States)

    Gordon, John David; DiMattina, Michael; Reh, Andrea; Botes, Awie; Celia, Gerard; Payson, Mark

    2013-08-01

    To examine the utilization and outcomes of natural cycle (unstimulated) IVF as reported to the Society of Assisted Reproductive Technology (SART) in 2006 and 2007. Retrospective analysis. Dataset analysis from the SART Clinical Outcome Reporting System national database. All patients undergoing IVF as reported to SART in 2006 and 2007. None. Utilization of unstimulated IVF; description of patient demographics; and comparison of implantation and pregnancy rates between unstimulated and stimulated IVF cycles. During 2006 and 2007 a total of 795 unstimulated IVF cycles were initiated. Success rates were age dependent, with patients Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  8. Outline and summary of research for a database of unused energy technologies; Miriyo energy ni kansuru data shu sakusei chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    For effective promotion of the new energy utilization project of NEDO and preparation of the future new energy vision, the data on unused energy were systematically arranged. The definition of heat supply systems using unused energy was clarified, and the kind and property of unused energy were arranged. Some typical systems of heat supply facilities using unused energy were presented. The data were classified by energy source, temperature level and region, and the detailed database was prepared for the typical systems. The overseas database was prepared, in particular, for European and American systems. Domestic and overseas policies, laws and subsidy systems were arranged. The introduction effect of two heat supply facilities using unused energy was surveyed, and their operation costs were analyzed. Basic terminology related to unused energy was compiled and explained. Related organizations, in particular, manufacturers were surveyed and arranged. The main manufacturers of heat recovery refrigerators and waste heat boilers, as typical equipment, and the numbers of units shipped were arranged. 51 figs., 116 tabs.

  9. Food traceability systems in China: The current status of and future perspectives on food supply chain databases, legal support, and technological research and support for food safety regulation.

    Science.gov (United States)

    Tang, Qi; Li, Jiajia; Sun, Mei; Lv, Jun; Gai, Ruoyan; Mei, Lin; Xu, Lingzhong

    2015-02-01

    Over the past few decades, the field of food security has witnessed numerous problems and incidents that have garnered public attention. Given this serious situation, the food traceability system (FTS) has become part of the expanding food safety continuum to reduce the risk of food safety problems. This article reviews a great deal of the related literature and results from previous studies of FTS to corroborate this contention. This article describes the development and benefits of FTS in developed countries like the United States of America (USA), Japan, and some European countries. Problems with existing FTS in China are noted, including a lack of a complete database, inadequate laws and regulations, and lagging technological research into FTS. This article puts forward several suggestions for the future, including improvement of information websites, clarification of regulatory responsibilities, and promotion of technological research.

  10. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

    In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows it to connect its database with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point...

  11. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  12. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  13. Structure alerts for carcinogenicity, and the Salmonella assay system: a novel insight through the chemical relational databases technology.

    Science.gov (United States)

    Benigni, Romualdo; Bossa, Cecilia

    2008-01-01

    In the past decades, chemical carcinogenicity has been the object of mechanistic studies that have been translated into valuable experimental (e.g., the Salmonella assays system) and theoretical (e.g., compilations of structure alerts for chemical carcinogenicity) models. These findings remain the basis of the science and regulation of mutagens and carcinogens. Recent advances in the organization and treatment of large databases consisting of both biological and chemical information nowadays allows for a much easier and more refined view of data. This paper reviews recent analyses on the predictive performance of various lists of structure alerts, including a new compilation of alerts that combines previous work in an optimized form for computer implementation. The revised compilation is part of the Toxtree 1.50 software (freely available from the European Chemicals Bureau website). The use of structural alerts for the chemical biological profiling of a large database of Salmonella mutagenicity results is also reported. Together with being a repository of the science on the chemical biological interactions at the basis of chemical carcinogenicity, the SAs have a crucial role in practical applications for risk assessment, for: (a) description of sets of chemicals; (b) preliminary hazard characterization; (c) formation of categories for e.g., regulatory purposes; (d) generation of subsets of congeneric chemicals to be analyzed subsequently with QSAR methods; (e) priority setting. An important aspect of SAs as predictive toxicity tools is that they derive directly from mechanistic knowledge. The crucial role of mechanistic knowledge in the process of applying (Q)SAR considerations to risk assessment should be strongly emphasized. Mechanistic knowledge provides a ground for interaction and dialogue between model developers, toxicologists and regulators, and permits the integration of the (Q)SAR results into a wider regulatory framework, where different types of

  14. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor Design Technology Development using Web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results from phase II of Liquid Metal Reactor Design Technology Development under the mid-term and long-term nuclear R and D program. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents was developed to manage collected data and documents accumulated since project accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER

  15. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  16. Efficiency Sustainability Resource Visual Simulator for Clustered Desktop Virtualization Based on Cloud Infrastructure

    Directory of Open Access Journals (Sweden)

    Jong Hyuk Park

    2014-11-01

    Full Text Available Following IT innovations, manual operations have been automated, improving the overall quality of life. This has been possible because an organic topology has been formed among many diverse smart devices grafted onto real life. To provide services to these smart devices, enterprises or users use the cloud. Cloud services are divided into infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). SaaS is operated on PaaS, and PaaS is operated on IaaS. Since IaaS is the foundation of all services, algorithms for the efficient operation of virtualized resources are required. Among these algorithms, desktop resource virtualization is used for high resource availability when existing desktop PCs are unavailable. For this high resource availability, clustering for hierarchical structures is important. In addition, since many clustering algorithms show different percentages of the main resources depending on the desktop PC distribution rates and environments, selecting appropriate algorithms is very important. If diverse attempts are made to find algorithms suitable for an operating environment’s desktop resource virtualization, huge costs are incurred for the related power, time and labor. Therefore, in the present paper, a desktop resource virtualization clustering simulator (DRV-CS), a simulator for selecting clusters of desktop virtualization to be maintained sustainably, is proposed. The DRV-CS provides simulations, so that clustering algorithms can be selected and elements can be properly applied in different desktop PC environments.
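The simulator's core idea, comparing clustering strategies on simulated desktop PCs before committing to one, can be sketched in a few lines. All numbers and both strategy names below are illustrative assumptions, not the algorithms the paper evaluates:

```python
import random

# Toy simulation in the spirit of a clustering simulator: compare two
# hypothetical strategies for grouping desktop PCs into clusters and see
# which balances total available capacity better.
random.seed(42)

def round_robin(capacities, k):
    """Assign PCs to k clusters in arrival order."""
    clusters = [[] for _ in range(k)]
    for i, c in enumerate(capacities):
        clusters[i % k].append(c)
    return clusters

def greedy_balance(capacities, k):
    """Always add the next-largest PC to the currently lightest cluster."""
    clusters = [[] for _ in range(k)]
    for c in sorted(capacities, reverse=True):
        min(clusters, key=sum).append(c)
    return clusters

def imbalance(clusters):
    """Spread between the heaviest and lightest cluster (lower is better)."""
    sums = [sum(c) for c in clusters]
    return max(sums) - min(sums)

pcs = [random.randint(1, 100) for _ in range(20)]  # simulated PC capacities
for name, strategy in [("round-robin", round_robin), ("greedy", greedy_balance)]:
    print(name, imbalance(strategy(pcs, 4)))
```

Running such a comparison over many simulated PC distributions is far cheaper than trialling each clustering algorithm on real desktop hardware.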

  17. Waste management and technologies analytical database project for Los Alamos National Laboratory/Department of Energy. Final report, June 7, 1993--June 15, 1994

    International Nuclear Information System (INIS)

    1995-01-01

    The Waste Management and Technologies Analytical Database System (WMTADS), supported by the Department of Energy's (DOE) Office of Environmental Management (EM), Office of Technology Development (EM-50), was developed and based at the Los Alamos National Laboratory (LANL), Los Alamos, New Mexico, to collect, identify, organize, track, update, and maintain information on existing, available, developing, and planned technologies to characterize, treat, and handle mixed, hazardous, and radioactive waste for storage and disposal, in support of EM strategies, goals, and focus area projects. WMTADS was developed as a centralized source of on-line information on technologies for environmental management processes that can be accessed with a computer, modem, phone line, and communications software through a Local Area Network (LAN) or via server connectivity on the Internet, the world's largest computer network; with file transfer protocol (FTP), files can also be transferred globally from the server to the user's computer, and the system can be browsed on the World Wide Web (WWW) using Mosaic

  18. A Relational Database Model and Tools for Environmental Sound Recognition

    Directory of Open Access Journals (Sweden)

    Yuksel Arslan

    2017-12-01

    Full Text Available Environmental sound recognition (ESR) has become a hot topic in recent years. ESR is mainly based on machine learning (ML), and ML algorithms first require a training database. This database must comprise the sounds to be recognized and other related sounds. An ESR system needs the database during training, testing and in the production stage. In this paper, we present the design and pilot establishment of a database that will assist all researchers who want to establish an ESR system. This database employs the relational database model, which has not been used for this task before. We explain the design and implementation details of the database and the data collection and loading process. We also describe the tools and the graphical user interfaces developed for a desktop application and for the Web.
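A relational model along these lines would typically separate sound classes from individual recordings. The sqlite3 sketch below is an illustrative guess at such a schema; the table and column names are assumptions, not the paper's actual design:

```python
import sqlite3

# Minimal sketch of a relational model for an ESR training database, with a
# normalized split between sound classes and recordings. Table and column
# names are illustrative, not the paper's actual schema.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sound_class (
    class_id    INTEGER PRIMARY KEY,
    name        TEXT NOT NULL UNIQUE          -- e.g. 'siren', 'dog_bark'
);
CREATE TABLE recording (
    rec_id      INTEGER PRIMARY KEY,
    class_id    INTEGER NOT NULL REFERENCES sound_class(class_id),
    file_path   TEXT NOT NULL,
    sample_rate INTEGER NOT NULL,
    split       TEXT CHECK (split IN ('train', 'test'))
);
""")
db.execute("INSERT INTO sound_class VALUES (1, 'siren'), (2, 'dog_bark')")
db.executemany("INSERT INTO recording VALUES (?, ?, ?, ?, ?)", [
    (1, 1, "siren_001.wav", 44100, "train"),
    (2, 1, "siren_002.wav", 44100, "test"),
    (3, 2, "bark_001.wav", 22050, "train"),
])

# Typical query an ML trainer would issue: training-file counts per class.
rows = db.execute("""
    SELECT c.name, COUNT(*) FROM recording r
    JOIN sound_class c ON c.class_id = r.class_id
    WHERE r.split = 'train'
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # → [('dog_bark', 1), ('siren', 1)]
```

The foreign key and CHECK constraint show the kind of integrity guarantees that motivate a relational model over ad hoc file folders for this task.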

  19. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms.

    Science.gov (United States)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-12-21

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criterion, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. The imaging test and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards.
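The image parameters above can be computed from regions of interest (ROIs) in the phantom images. The sketch below uses common textbook definitions of SNR and percent integral uniformity; the exact formulas in AAPM report no. 28 and the Chinese national standard may differ in detail, and the pixel values are invented for illustration:

```python
from statistics import mean, pstdev

def snr(signal_roi, background_roi):
    """SNR = mean pixel value in the phantom ROI / std dev of background noise."""
    return mean(signal_roi) / pstdev(background_roi)

def integral_uniformity(roi):
    """Percent integral uniformity: 100 * (1 - (max - min) / (max + min))."""
    hi, lo = max(roi), min(roi)
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

signal = [980, 1005, 990, 1010, 1015]     # pixel values inside the phantom
noise = [12, -8, 5, -3, 9, -11, 6, -10]   # background (air) pixel values
print(round(snr(signal, noise), 1))            # → 117.4
print(round(integral_uniformity(signal), 1))   # → 98.2
```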

  20. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms

    International Nuclear Information System (INIS)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-01-01

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criterion, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. The imaging test and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards. (paper)

  1. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  2. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g., presently the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  3. The File Sync Algorithm of the ownCloud Desktop Clients

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The ownCloud desktop clients provide file syncing between desktop machines and the ownCloud server, and are available for the major desktop platforms. This presentation will give an overview of the sync algorithm used by the clients to provide a fast, reliable and robust syncing experience for the users. It will describe the phases a sync run goes through and how it is triggered. It will also provide insight into the algorithms that decide whether a file is uploaded, downloaded or even deleted, either on the local machine or in the cloud. Some examples of non-obvious situations in file syncing will be described and discussed. As the ownCloud sync protocol is based on the open standard WebDAV, the resulting challenges and solutions will be illustrated. Finally, a couple of frequently proposed enhancements will be reviewed and assessed for the future development of the ownCloud server and syncing clients.
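The core upload/download/delete decision in a two-way sync can be sketched as a comparison of each side's version token against the state recorded at the last successful sync. This is an illustrative decision table, not ownCloud's actual implementation:

```python
# Simplified two-way sync decision: compare the current local and remote
# version tokens (e.g. ETags) against the token recorded at the last
# successful sync. Illustrative sketch only, not ownCloud's actual code.
def decide(local, remote, last_synced):
    """Each argument is a version token, or None if the file does not
    exist on that side; returns the action for this file."""
    if local == remote:
        return "in-sync"
    if local == last_synced:            # only the remote side changed
        return "delete-local" if remote is None else "download"
    if remote == last_synced:           # only the local side changed
        return "delete-remote" if local is None else "upload"
    return "conflict"                   # both sides changed independently

print(decide("v2", "v1", "v1"))   # → upload
print(decide("v1", "v2", "v1"))   # → download
print(decide(None, "v1", "v1"))   # → delete-remote
print(decide("v2", "v3", "v1"))   # → conflict
```

The "conflict" branch is exactly the kind of non-obvious situation the talk discusses: without the last-synced record, a client cannot tell a remote edit from a local one.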

  4. Laptops vs. Desktops in a Google Groups Environment: A Study on Collaborative Learning

    Directory of Open Access Journals (Sweden)

    Steven Lopes Abrantes

    2011-01-01

    Full Text Available Current literature on m-learning refers to the lack of studies on the real use of m-learning applications and how they can compete with their desktop counterparts. The study consists of an experiment involving one hundred and twelve students of higher education and a set of learning activities that they have to accomplish. Its main objective is to determine whether students using laptops or desktops reach the flow experience, and which group reaches it more deeply, when using Google Groups. The approach is based on the flow experience introduced by [1]. It was possible to conclude that both laptop and desktop students experienced the flow state, with laptop students showing a more positive effect in the flow experience.

  5. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  6. JICST Factual Database/JICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    The JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency, in 1987. JICST modified the JETOC database system, added data and started the online service through JOIS-F (JICST Online Information Service-Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  7. Survey on construction of the database for new energy technology development. Fuel cell; Shin energy gijutsu kaihatsu kankei data shu sakusei chosa. Nenryo denchi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    As a part of the data related to technological development of new energy, the database for fuel cells was prepared. The major international conferences held in fiscal 1996 were reviewed. As the atmosphere of the whole conference, phosphoric acid fuel cell (PAFC) is in a stage just before practical use, and molten carbonate fuel cell (MCFC) is in a stage of the demonstration study of MW class one. The study on solid oxide fuel cell (SOFC) and polymer electrolyte fuel cell (PEFC) is in considerable progress. In particular, the application of PEFC to automobiles is in real investigation. For the database, kinds and features of various fuel cells, operation principles, system configurations of FC plants, application fields, and characteristics were arranged. Field test examples for public and industrial uses were separately arranged, and in particular, the application examples of PAFC were presented together with developmental conditions of the other fuel cells. Overseas situations were equal to domestic ones, and their marketability was predicted. The Japanese subsidy policy and some U.S. policies were also arranged. 28 refs., 51 figs., 37 tabs.

  8. Information technology and global change science

    Energy Technology Data Exchange (ETDEWEB)

    Baxter, F.P.

    1990-01-01

    The goal of this paper is to identify and briefly describe major existing and near term information technologies that could have a positive impact on the topics being discussed at this conference by helping to manage the data of global change science and helping global change scientists conduct their research. Desktop computer systems have changed dramatically during the past seven years. Faster data processing can be expected in the future through full development of traditional serial computer architectures. Some other proven information technologies may be currently underutilized by global change scientists. Relational database management systems and good organization of data through the use of thoughtful database design would enable the scientific community to better share and maintain quality research data. Custodians of the data should use rigorous data administration to ensure integrity and long term value of the data resource. Still other emerging information technologies that involve the use of artificial intelligence, parallel computer architectures, and new sensors for data collection will be in relatively common use in the near term and should become part of the global science community's technical toolkit. Consideration should also be given to the establishment of Information Analysis Centers to facilitate effective organization and management of interdisciplinary data and the prototype testing and use of advanced information technology to facilitate rapid and cost-effective integration of these tools into global change science. 8 refs.

  9. [Conceptual foundations of creation of branch database of technology and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of healthcare sphere of Ukraine].

    Science.gov (United States)

    Horban', A Ie

    2013-09-01

    The article considers implementation of state policy in the field of technology transfer in the medical branch, pursuant to the Law of Ukraine of 02.10.2012 No. 5407-VI "On Amendments to the Law of Ukraine 'On State Regulation of Activity in the Field of Technology Transfer'", namely ensuring the formation of a branch database of technologies and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of the healthcare sphere of Ukraine and established at budget expense. An analysis of international and domestic experience in processing information about intellectual property rights and of systems supporting the transfer of new technologies is presented. The main conceptual principles for creating this branch technology transfer database and the branch technology transfer network are defined.

  10. Brasilia’s Database Administrators

    Directory of Open Access Journals (Sweden)

    Jane Adriana

    2016-06-01

    Full Text Available Database administration has gained an essential role in the management of new database technologies. Different data models are being created to support enormous data volumes, beyond the traditional relational database. These new models are called NoSQL (Not only SQL) databases. The adoption of best practices and procedures has become essential for the operation of database management systems. Thus, this paper investigates some of the techniques and tools used by database administrators. The study highlights features and particularities of databases within the area of Brasilia, the capital of Brazil. The results point to which new database management technologies are currently the most relevant, as well as the central issues in this area.

  11. IMIS desktop & smartphone software solutions for monitoring spacecrafts' payload from anywhere

    Science.gov (United States)

    Baroukh, J.; Queyrut, O.; Airaud, J.

    In the past years, the demand for satellite remote operations has increased, driven on the one hand by the will to reduce operations costs (on-call operators outside business hours) and on the other by the development of cooperative space missions, resulting in a worldwide distribution of engineers and science team members. Only a few off-the-shelf solutions exist to fulfill the need for remote payload monitoring, and they mainly use proprietary devices. The recent advent of mobile technologies (laptops, smartphones and tablets), as well as the worldwide deployment of broadband networks (3G, Wi-Fi hotspots), has opened up a technical window that brings new options. As part of the Mars Science Laboratory (MSL) mission, the Centre National D'Etudes Spatiales (CNES, the French space agency) has developed a new software solution for monitoring spacecraft payloads. The Instrument Monitoring Interactive Software (IMIS) offers state-of-the-art operational features for payload monitoring, and can be accessed remotely. It was conceived as a generic tool that can be used for heterogeneous payloads and missions. IMIS was designed as a classical client/server architecture. The server is hosted at CNES and acts as a data provider, while two different kinds of clients are available depending on the level of mobility required. The first is a rich client application, built on the Eclipse framework, which can be installed on the usual operating systems and communicates with the server through the Internet. The second is a smartphone application for any Android platform, connected to the server through the mobile broadband network or a Wi-Fi connection. This second client is mainly devoted to on-call operations and thus only contains a subset of the IMIS functionalities.
    This paper describes the operational context, including security aspects, that led to IMIS development, presents the selected software architecture and details the various features of both clients: the desktop and the smartphone.

  12. Characterization of emissions from a desktop 3D printer and indoor air measurements in office settings.

    Science.gov (United States)

    Steinle, Patrick

    2016-01-01

    Emissions from a desktop 3D printer based on fused deposition modeling (FDM) technology were measured in a test chamber and indoor air was monitored in office settings. Ultrafine aerosol (UFA) emissions were higher while printing a standard object with polylactic acid (PLA) than with acrylonitrile butadiene styrene (ABS) polymer (2.1 × 10⁹ vs. 2.4 × 10⁸ particles/min). Prolonged use of the printer led to higher emission rates (factor 2 with PLA and 4 with ABS, measured after seven months of occasional use). UFA consisted mainly of volatile droplets, and some small (100-300 nm diameter) iron-containing and soot-like particles were found. Emissions of inhalable and respirable dust were below the limit of detection (LOD) when measured gravimetrically, and only slightly higher than background when measured with an aerosol spectrometer. Emissions of volatile organic compounds (VOC) were in the range of 10 µg/min. Styrene accounted for more than 50% of total VOC emitted when printing with ABS; for PLA, methyl methacrylate (MMA, 37% of TVOC) was detected as the predominant compound. Two polycyclic aromatic hydrocarbons (PAH), fluoranthene and pyrene, were observed in very low amounts. All other analyzed PAH, as well as inorganic gases and metal emissions except iron (Fe) and zinc (Zn), were below the LOD or did not differ from background without printing. A single 3D print (165 min) in a large, well-ventilated office did not significantly increase the UFA and VOC concentrations, whereas these were readily detectable in a small, unventilated room, with UFA concentrations increasing by 2,000 particles/cm³ and MMA reaching a peak of 21 µg/m³, still detectable in the room even 20 hr after printing.
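As a sanity check on the figures above, the PLA/ABS emission ratio and a crude well-mixed-room concentration estimate can be computed directly. The room volume and the no-loss assumption are ours, not the study's.

```python
# Back-of-the-envelope check of the figures reported above.
pla_rate = 2.1e9   # ultrafine particles/min while printing PLA
abs_rate = 2.4e8   # ultrafine particles/min while printing ABS

ratio = pla_rate / abs_rate   # PLA emits roughly 8.75x more UFA than ABS
print(f"PLA/ABS emission ratio: {ratio:.2f}")

# Crude well-mixed box model for the small unventilated room:
# concentration increase ~ emitted particles / room volume, ignoring
# deposition, coagulation, and air exchange. An ABS print is assumed.
print_minutes = 165
room_volume_m3 = 30          # assumed small office, ~30 m^3
cm3_per_m3 = 1e6

emitted = abs_rate * print_minutes                   # total particles
delta_c = emitted / (room_volume_m3 * cm3_per_m3)    # particles/cm^3
print(f"Ideal-mixing estimate: +{delta_c:.0f} particles/cm^3")
```

With these assumptions the estimate lands in the same order of magnitude as the reported increase of about 2,000 particles/cm³; the actual room size and particle losses move the real number.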

  13. [Teaching Desktop] Video Conferencing in a Collaborative and Problem Based Setting

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Mouritzen, Per

    2013-01-01

    , teachers and assistant teachers wanted to find ways in the design for learning that enables the learners to acquire knowledge about the theories, models and concepts of the subject, as well as hands‐on competencies in a learning‐by‐doing manner. In particular we address the area of desktop video...... shows that the students experiment with various pedagogical situations, and that during the process of design, teaching, and reflection they acquire experiences at both a concrete specific and a general abstract level. The desktop video conference system creates challenges, with technical issues...

  14. Applications and a three-dimensional desktop environment for an immersive virtual reality system

    International Nuclear Information System (INIS)

    Kageyama, Akira; Masada, Youhei

    2013-01-01

    We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes "teleportation" into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE.

  15. At the Turning Point of the Current Techno-Economic Paradigm: Commons-Based Peer Production, Desktop Manufacturing and the Role of Civil Society in the Perezian Framework

    Directory of Open Access Journals (Sweden)

    Vasilis Kostakis

    2013-01-01

    Full Text Available Following the theory of techno-economic paradigm shifts (TEPS, this paper calls attention to the phenomenon of Commons-based peer production (CBPP. In the context of the current paradigm, it argues that civil society can play an important role in creating favourable conditions for a more sustainable global knowledge society. Approaching tentatively the ways in which 3D printing and other desktop manufacturing technologies can be used in CBPP, it also explores the ways in which the partnership with the state may provide a supportive innovative institutional basis for taking the maximum advantage of the emerging synergies in the vein of TEPS theory.

  16. LandIT Database

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    and reporting purposes. This paper presents the LandIT database, which is the result of the LandIT project, an industrial collaboration project that developed technologies for communication and data integration between farming devices and systems. The LandIT database in principle is based...... on the ISOBUS standard; however the standard is extended with additional requirements, such as gradual data aggregation and flexible exchange of farming data. This paper describes the conceptual and logical schemas of the proposed database based on a real-life farming case study....
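The "gradual data aggregation" requirement mentioned above can be illustrated with a minimal roll-up from per-minute sensor readings to hourly averages; the data layout is invented for the example.

```python
# Illustrative sketch of gradual data aggregation: raw readings are kept
# at full resolution for a short window, then rolled up into coarser
# averages for long-term storage and reporting.
from collections import defaultdict

def aggregate_hourly(readings):
    """Average (timestamp_minute, value) readings into hourly buckets."""
    buckets = defaultdict(list)
    for minute, value in readings:
        buckets[minute // 60].append(value)
    return {hour: sum(vals) / len(vals) for hour, vals in buckets.items()}

raw = [(0, 10.0), (30, 14.0), (60, 20.0), (90, 22.0)]
print(aggregate_hourly(raw))   # {0: 12.0, 1: 21.0}
```

The same function applied again (hours into days, days into weeks) gives the "gradual" cascade: each level keeps less detail but covers a longer span.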

  17. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  18. Usability Comparisons of Head-Mounted vs. Stereoscopic Desktop Displays in a Virtual Reality Environment with Pain Patients.

    Science.gov (United States)

    Tong, Xin; Gromala, Diane; Gupta, Dimple; Squire, Pam

    2016-01-01

    Researchers have shown that immersive Virtual Reality (VR) can serve as an unusually powerful pain control technique. However, research assessing the reported symptoms and negative effects of VR systems indicates that it is important to ascertain whether these symptoms arise from the use of particular VR display devices, particularly for users who are deemed "at risk," such as chronic pain patients. Moreover, these patients have specific and often complex needs and requirements, and because basic issues such as 'comfort' may trigger anxiety or panic attacks, it is important to examine basic questions of the feasibility of using VR displays. Therefore, this repeated-measures experiment was conducted with two VR displays: the Oculus Rift head-mounted display (HMD) and Firsthand Technologies' immersive desktop display, DeepStream3D. The characteristics of these two immersive displays differ: one is worn, enabling patients to move their heads, while the other is peered into, allowing less head movement. To assess the severity of physical discomforts, 20 chronic pain patients tried both displays while watching a VR pain management demo in clinical settings. Results indicated that participants experienced higher levels of Simulator Sickness using the Oculus Rift HMD. However, results also indicated other preferences between the two VR displays among patients, including physical comfort levels and a sense of immersion. Few studies have been conducted that compare the usability of specific VR devices with chronic pain patients using a therapeutic virtual environment in pain clinics. Thus, the results may help clinicians and researchers to choose the most appropriate VR displays for chronic pain patients and guide VR designers in enhancing the usability of VR displays for long-term pain management interventions.

  19. National Geochronological Database

    Science.gov (United States)

    Revised by Sloan, Jan; Henry, Christopher D.; Hopkins, Melanie; Ludington, Steve; Original database by Zartman, Robert E.; Bush, Charles A.; Abston, Carl

    2003-01-01

    The National Geochronological Data Base (NGDB) was established by the United States Geological Survey (USGS) to collect and organize published isotopic (also known as radiometric) ages of rocks in the United States. The NGDB (originally known as the Radioactive Age Data Base, RADB) was started in 1974. A committee appointed by the Director of the USGS was given the mission to investigate the feasibility of compiling the published radiometric ages for the United States into a computerized data bank for ready access by the user community. A successful pilot program, which was conducted in 1975 and 1976 for the State of Wyoming, led to a decision to proceed with the compilation of the entire United States. For each dated rock sample reported in published literature, a record containing information on sample location, rock description, analytical data, age, interpretation, and literature citation was constructed and included in the NGDB. The NGDB was originally constructed and maintained on a mainframe computer, and later converted to a Helix Express relational database maintained on an Apple Macintosh desktop computer. The NGDB and a program to search the data files were published and distributed on Compact Disc-Read Only Memory (CD-ROM) in standard ISO 9660 format as USGS Digital Data Series DDS-14 (Zartman and others, 1995). As of May 1994, the NGDB consisted of more than 18,000 records containing over 30,000 individual ages, which is believed to represent approximately one-half the number of ages published for the United States through 1991. Because the organizational unit responsible for maintaining the database was abolished in 1996, and because we wanted to provide the data in more usable formats, we have reformatted the data, checked and edited the information in some records, and provided this online version of the NGDB. This report describes the changes made to the data and formats, and provides instructions for the use of the database in geographic

  20. Forecasting and management of technology

    National Research Council Canada - National Science Library

    Roper, A. T

    2011-01-01

    ... what the authors see as the innovations to technology management in the last 17 years: the Internet; the greater focus on group decision-making including process management and mechanism design; and desktop software that has transformed the analytical capabilities of technology managers"--Provided by publisher.

  1. 36 CFR 1194.26 - Desktop and portable computers.

    Science.gov (United States)

    2010-07-01

    ... BARRIERS COMPLIANCE BOARD ELECTRONIC AND INFORMATION TECHNOLOGY ACCESSIBILITY STANDARDS Technical Standards... input method shall be provided that complies with § 1194.23 (k) (1) through (4). (c) When biometric...

  2. Calculating length of gestation from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) database versus vital records may alter reported rates of prematurity.

    Science.gov (United States)

    Stern, Judy E; Kotelchuck, Milton; Luke, Barbara; Declercq, Eugene; Cabral, Howard; Diop, Hafsatou

    2014-05-01

    To compare length of gestation after assisted reproductive technology (ART) as calculated by three methods from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) and vital records (birth and fetal death) in the Massachusetts Pregnancy to Early Life Longitudinal Data System (PELL). Historical cohort study. Database linkage analysis. Live or stillborn deliveries. None. ART deliveries were linked to live birth or fetal death certificates. Length of gestation in 7,171 deliveries from fresh autologous ART cycles (2004-2008) was calculated and compared with that of SART CORS with the use of methods: M1 = outcome date - cycle start date; M2 = outcome date - transfer date + 17 days; and M3 = outcome date - transfer date + 14 days + day of transfer. Generalized estimating equation models were used to compare methods. Singleton and multiple deliveries were included. Estimates of overall prematurity (delivery <37 weeks) differed between data sources in >45% of deliveries and by more than 1 week in >22% of deliveries. Each method differed from the others. Estimates of preterm birth in ART vary depending on source of data and method of calculation. Some estimates may overestimate preterm birth rates for ART conceptions. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
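The three calculation methods quoted in the abstract translate directly into date arithmetic. The dates below are invented, and were chosen as a day-3 transfer so that all three methods happen to agree.

```python
# The three gestation-length methods from the abstract, as date arithmetic.
from datetime import date

def m1(outcome, cycle_start):
    """M1: outcome date - cycle start date."""
    return (outcome - cycle_start).days

def m2(outcome, transfer):
    """M2: outcome date - transfer date + 17 days."""
    return (outcome - transfer).days + 17

def m3(outcome, transfer, day_of_transfer):
    """M3: outcome date - transfer date + 14 days + day of transfer."""
    return (outcome - transfer).days + 14 + day_of_transfer

# Invented example: cycle starts Dec 10, day-3 embryo transfer Dec 27,
# delivery the following Sep 1.
outcome = date(2008, 9, 1)
cycle_start = date(2007, 12, 10)
transfer = date(2007, 12, 27)

print(m1(outcome, cycle_start))    # 266
print(m2(outcome, transfer))       # 249 + 17 = 266
print(m3(outcome, transfer, 3))    # 249 + 14 + 3 = 266
```

M2 implicitly assumes a day-3 transfer (14 + 3 = 17), which is why the three agree here; for a day-5 transfer, M2 and M3 diverge by two days, one mechanism behind the differing prematurity estimates the study reports.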

  3. Survey on construction of the database for new energy technology development. Cogeneration; Shin energy gijutsu kaihatsu kankei data shu sakusei chosa. Cogeneration

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    As part of the activity promoting the use of new energy, data related to cogeneration were systematically compiled. For new energy technology, various policies for introducing new energy are being promoted alongside technological development, such as the preparation of subsidy systems, field test programs, and advisory support for the introduction of new energy. For further effective promotion, the systematic compilation of the various data and their arrangement as basic data are necessary. The latest published data in the cogeneration field were collected and compiled, covering outlines of new energy systems, concrete applications, subsidy systems, and the approaches of various countries to new energy. The main data items are as follows: trends in cogeneration, system outlines, domestic and foreign concrete applications, prediction data on the use of new energy, an overview of domestic and foreign cogeneration policies, basic terminology, and tables of the main related enterprises and organizations. This database is useful for present activities promoting the use of new energy and for the preparation of a future vision. 29 figs., 33 tabs.

  4. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed clients, if data copies are located close to clients. Despite its advantages, replication is not a straightforward technique to apply, and
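The failover behaviour described above, with available replicas taking over the tasks of a failed one, can be sketched in a few lines; this toy version ignores consistency and load balancing entirely.

```python
# Toy sketch of replica failover: a client falls back to the next
# replica when the preferred one is unavailable.

class Replica:
    def __init__(self, name, alive=True):
        self.name, self.alive = name, alive

    def query(self, sql):
        if not self.alive:
            raise ConnectionError(f"{self.name} is down")
        return f"result from {self.name}"

def query_with_failover(replicas, sql):
    """Try each replica in turn; fail only if all are down."""
    for replica in replicas:
        try:
            return replica.query(sql)
        except ConnectionError:
            continue
    raise RuntimeError("all replicas unavailable")

replicas = [Replica("r1", alive=False), Replica("r2"), Replica("r3")]
print(query_with_failover(replicas, "SELECT 1"))   # result from r2
```

The hard part the chapter alludes to is not this loop but keeping the replicas' states consistent while updates flow in, which is exactly why replication "is not a straightforward technique to apply."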

  5. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...
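One of the smallest refactorings in this spirit, introducing a better-named column and backfilling it while the old column stays readable, can be sketched against SQLite; the schema and data are invented for the example.

```python
# A minimal instance of a schema refactoring done without changing
# semantics: add the new column, backfill it, and keep the old column
# during a transition period so existing readers continue to work.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, fname TEXT)")
conn.execute("INSERT INTO customer (id, fname) VALUES (1, 'Ada')")

# Step 1: add the better-named column alongside the old one.
conn.execute("ALTER TABLE customer ADD COLUMN first_name TEXT")
# Step 2: backfill it from the old column; the data's meaning is unchanged.
conn.execute("UPDATE customer SET first_name = fname")
# (Step 3, after the transition period, would drop 'fname'.)

row = conn.execute("SELECT fname, first_name FROM customer").fetchone()
print(row)   # ('Ada', 'Ada')
```

In a production setting the backfill is typically paired with a trigger or synchronized writes so both columns stay consistent until the old one is retired, matching the book's evolve-in-step-with-code approach.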

  6. Multimodal Desktop Interaction: The Face –Object-Gesture–Voice Example

    DEFF Research Database (Denmark)

    Vidakis, Nikolas; Vlasopoulos, Anastasios; Kounalakis, Tsampikos

    2013-01-01

    This paper presents a natural user interface system based on multimodal human computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system which gives users the ability to interact with desktop...

  7. Delivering an Alternative Medicine Resource to the User's Desktop via World Wide Web.

    Science.gov (United States)

    Li, Jie; Wu, Gang; Marks, Ellen; Fan, Weiyu

    1998-01-01

    Discusses the design and implementation of a World Wide Web-based alternative medicine virtual resource. This homepage integrates regional, national, and international resources and delivers library services to the user's desktop. Goals, structure, and organizational schemes of the system are detailed, and design issues for building such a…

  8. Negotiation of Meaning in Desktop Videoconferencing-Supported Distance Language Learning

    Science.gov (United States)

    Wang, Yuping

    2006-01-01

    The aim of this research is to reveal the dynamics of focus on form in task completion via videoconferencing. This examination draws on current second language learning theories regarding effective language acquisition, research in Computer Mediated Communication (CMC) and empirical data from an evaluation of desktop videoconferencing-supported…

  9. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    Science.gov (United States)

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

    Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .05) during computer use. Trial registration no: ACTRN12617000326392.

  10. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    Science.gov (United States)

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  11. What's New in Software? Mastery of the Computer through Desktop Publishing.

    Science.gov (United States)

    Hedley, Carolyn N.; Ellsworth, Nancy J.

    1993-01-01

    Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)

  12. Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).

    Science.gov (United States)

    Guthrie, Jim

    1995-01-01

    Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…

  13. Using Desktop Publishing in an Editing Class--The Lessons Learned and Students' Assessments.

    Science.gov (United States)

    Tharp, Marty; Zimmerman, Don

    1992-01-01

    Reports students' perceptions of learning desktop publishing (DTP) systems. Finds that (1) students learned the foundations of DTP in under 60 hours of hands-on experience; (2) the incremental introduction of DTP functions and practice sessions before assignments were the most effective teaching strategy; and (3) use of DTP encouraged nonartistic…

  14. A Desktop Publishing Course: An Alternative to Internships for Rural Universities.

    Science.gov (United States)

    Flammia, Madelyn

    1992-01-01

    Suggests that a course in desktop publishing can provide students at rural schools with experience equivalent to internships. Notes that the course provided students with real-world experience and benefited the university in terms of services and public relations. (RS)

  15. Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts

    Science.gov (United States)

    Jacobson, Jeffery

    2013-01-01

    In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…

  16. Using M@th Desktop Notebooks and Palettes in the Classroom

    Science.gov (United States)

    Simonovits, Reinhard

    2011-01-01

    This article explains the didactical design of M@th Desktop (MD), a teaching and learning software application for high schools and universities. The use of two types of MD resources is illustrated: notebooks and palettes, focusing on the topic of exponential functions. The handling of MD in a blended learning approach and the impact on the…

  17. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  18. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

    Full Text Available Physical servers transitioning to virtual server infrastructure (VSI) and desktop devices to virtual desktop infrastructure (VDI) raise the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Besides, how to appropriately choose a hypervisor for the desired server/desktop virtualization is really challenging, because the trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on a C/P ratio derived from a composite index, consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though the ESX server obtains the highest ROI and lowest TCO in server virtualization, and Hyper-V R2 gains the best performance in virtual machine management, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only greatly reduces the initial investment needed to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.
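The abstract does not give the exact weighting behind the C/P ratio, so the sketch below uses an invented composite (performance = consolidation ratio × VM density, cost = TCO / ROI) with illustrative numbers; it shows only the shape of the comparison, not the paper's actual figures.

```python
# Hypothetical cost/performance (C/P) comparison across hypervisors.
# The composite and all numbers are invented for illustration.

def cp_ratio(consolidation_ratio, vm_density, tco, roi):
    performance = consolidation_ratio * vm_density
    cost = tco / roi
    return cost / performance   # lower is better

hypervisors = {
    "ESX":        cp_ratio(12, 20, 90000, 1.8),
    "Hyper-V R2": cp_ratio(10, 18, 80000, 1.5),
    "Proxmox VE": cp_ratio(9, 16, 30000, 1.4),
}
best = min(hypervisors, key=hypervisors.get)
print(best)   # Proxmox VE (lowest C/P with these illustrative numbers)
```

The invented numbers mirror the abstract's qualitative finding: a cheaper hypervisor with somewhat lower raw performance can still win once cost enters the ratio.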

  19. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    Science.gov (United States)

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  20. GTfold: Enabling parallel RNA secondary structure prediction on multi-core desktops

    DEFF Research Database (Denmark)

    Swenson, M Shel; Anderson, Joshua; Ash, Andrew

    2012-01-01

    achieved significant improvements in runtime, but their implementations were not portable from niche high-performance computers or easily accessible to most RNA researchers. With the increasing prevalence of multi-core desktop machines, a new parallel prediction program is needed to take full advantage...

  1. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  2. Snowstorm Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Snowstorm Database is a collection of over 500 snowstorms dating back to 1900 and updated operationally. Only storms having large areas of heavy snowfall (10-20...

  3. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  4. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about a national database for nursing research established at Dansk Institut for Sundheds- og Sygeplejeforskning. The aim of the database is to gather knowledge about research and development activities within nursing.

  5. Replikasi Unidirectional pada Heterogen Database

    Directory of Open Access Journals (Sweden)

    Hendro Nindito

    2013-12-01

    Full Text Available The use of diverse database technologies in enterprises today cannot be avoided. Thus, technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we replicate from a Windows-based MS SQL Server database to a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly and testing of working models of the interaction process is done repeatedly. This research shows that database replication technology using Oracle GoldenGate can be applied in heterogeneous environments in real time as well.
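Unidirectional replication of the kind described, capturing changes on the source and replaying them in order on the target, reduces to a change-log applier. Oracle GoldenGate does this via trail files; this toy sketch uses an in-memory change list and invented record shapes.

```python
# Conceptual sketch of unidirectional replication: an ordered stream of
# changes captured on the source is applied to the target, keyed by
# primary key. Real tools also map datatypes between the two systems.

source_changes = [
    ("INSERT", {"id": 1, "name": "widget"}),
    ("UPDATE", {"id": 1, "name": "gadget"}),
    ("INSERT", {"id": 2, "name": "sprocket"}),
]

def apply_changes(target, changes):
    """Replay captured changes, in order, onto the target store."""
    for op, row in changes:
        if op in ("INSERT", "UPDATE"):
            target[row["id"]] = dict(row)
        elif op == "DELETE":
            target.pop(row["id"], None)
    return target

target_db = {}
apply_changes(target_db, source_changes)
print(target_db)
# {1: {'id': 1, 'name': 'gadget'}, 2: {'id': 2, 'name': 'sprocket'}}
```

Because the flow is one-directional, the target never writes back, which sidesteps the conflict-resolution problems of bidirectional setups.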

  6. Digital Video--From the Desktop to Antarctica.

    Science.gov (United States)

    Hutto, David N.

    This narrative describes the processes and technologies employed to produce and deliver a series of complex interactive learning experiences that brought together working scientists in Antarctica and students and teachers across North America. This multifaceted program included field production in the Antarctic, the use of experimental…

  7. Portable Desktop Apps with GitHub Electron

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Wouldn't it be nice if you could develop applications that work everywhere, regardless of Operating System or Platform? Even better, what if you could employ the same front-end technologies you use for your web/mobile apps? Meet GitHub Electron.

  8. Teaching a Foreign Language in a Desktop Videoconferencing Environment

    Science.gov (United States)

    Kotula, Krzysztof

    2016-01-01

    This paper aims to explore how language instructors teach with a synchronous multimodal setup (Skype). It reports on findings from research which evaluated how teachers use technologies to enable them to work in distance learning contexts. A total of 124 teachers (86 female and 38 male), offering online private lessons, were asked to complete a…

  9. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries, with 2,429 researcher data, 509 institution data, and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap, and how to use it. User access results and users' interests are discussed on the basis of an access analysis.

  10. INIST: databases reorientation

    International Nuclear Information System (INIS)

    Bidet, J.C.

    1995-01-01

    INIST is a CNRS (Centre National de la Recherche Scientifique) laboratory devoted to the treatment of scientific and technical information and to the management of this information compiled in a database. A reorientation of the database content was proposed in 1994 to increase the transfer of research towards enterprises and services, to develop more automated access to the information, and to create a quality assurance plan. The catalog of publications comprises 5800 periodical titles (1300 for fundamental research and 4500 for applied research). A science and technology multi-thematic database will be created in 1995 for the retrieval of applied and technical information. "Grey literature" (reports, theses, proceedings...) and human and social sciences data will be added to the base using information selected from the existing GRISELI and Francis databases. Strong modifications are also planned in the thematic coverage of Earth sciences, which will considerably reduce the geological information content. (J.S.). 1 tab

  11. The PEP-II project-wide database

    International Nuclear Information System (INIS)

    Chan, A.; Calish, S.; Crane, G.; MacGregor, I.; Meyer, S.; Wong, J.

    1995-05-01

    The PEP-II Project Database is a tool for monitoring the technical and documentation aspects of this accelerator's construction. It holds the PEP-II design specifications, fabrication, and installation data in one integrated system. Key pieces of the database include the machine parameter list, magnet and vacuum fabrication data, CAD drawings, publications and documentation, survey and alignment data, and property control. The database can be extended to contain information required for the operations phase of the accelerator and detector. Features such as viewing CAD drawing graphics from the database will be implemented in the future. This central Oracle database on a UNIX server is built using Oracle CASE tools. Users at the three collaborating laboratories (SLAC, LBL, LLNL) can access the data remotely, using various desktop computer platforms and graphical interfaces.

  12. A Case for Database Filesystems

    Energy Technology Data Exchange (ETDEWEB)

    Adams, P A; Hax, J C

    2009-05-13

    Data-intensive science is offering new challenges and opportunities for Information Technology, and for traditional relational databases in particular. Database filesystems offer the potential to store Level Zero data and analyze Level 1 and Level 3 data within the same database system [2]. Scientific data is typically composed of both unstructured files and scalar data. Oracle SecureFiles is a new database filesystem feature in Oracle Database 11g that is specifically engineered to deliver high performance and scalability for storing unstructured or file data inside the Oracle database. SecureFiles presents the best of both the filesystem and the database worlds for unstructured content. Data stored inside SecureFiles can be queried or written at performance levels comparable to that of traditional filesystems while retaining the advantages of the Oracle database.
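    The record above describes storing unstructured file data inside a relational database alongside scalar metadata. A minimal sketch of that idea, using SQLite purely as a stand-in for Oracle SecureFiles (the table and column names are invented for illustration):

```python
import sqlite3

# Illustrative "database filesystem": file content stored as BLOBs next to
# scalar metadata, so files can be queried like any other relational data.
# SQLite stands in for Oracle SecureFiles here, purely for illustration.

def make_store():
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE files (path TEXT PRIMARY KEY, size INTEGER, content BLOB)"
    )
    return conn

def put_file(conn, path, data):
    conn.execute(
        "INSERT OR REPLACE INTO files (path, size, content) VALUES (?, ?, ?)",
        (path, len(data), sqlite3.Binary(data)),
    )

def get_file(conn, path):
    row = conn.execute(
        "SELECT content FROM files WHERE path = ?", (path,)
    ).fetchone()
    return None if row is None else bytes(row[0])

conn = make_store()
put_file(conn, "/level0/run001.dat", b"\x00\x01raw-detector-bytes")
# Files participate in ordinary SQL queries over their metadata:
big = conn.execute("SELECT path FROM files WHERE size > 10").fetchall()
```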

  13. Teaching Historians with Databases.

    Science.gov (United States)

    Burton, Vernon

    1993-01-01

    Asserts that, although pressures to publish have detracted from the quality of teaching at the college level, recent innovations in educational technology have created opportunities for instructional improvement. Describes the use of computer-assisted instruction and databases in college-level history courses. (CFR)

  14. Experiment Databases

    Science.gov (United States)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Beyond running machine learning algorithms based on inductive queries, much can be learned by directly querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queryable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
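    The experiment-database idea above can be sketched as a small relational store of algorithm runs that is later queried at the meta level; the schema and the runs below are invented toy data, not the chapter's actual design:

```python
import sqlite3

# Toy "experiment database": each row records one algorithm execution.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE runs (
    algorithm TEXT, dataset TEXT, param_k INTEGER, accuracy REAL)""")
runs = [
    ("kNN",  "iris", 1, 0.93),
    ("kNN",  "iris", 5, 0.96),
    ("kNN",  "wine", 5, 0.71),
    ("tree", "iris", 0, 0.94),
    ("tree", "wine", 0, 0.88),
]
conn.executemany("INSERT INTO runs VALUES (?, ?, ?, ?)", runs)

# A meta-level question answered directly from stored results:
# which algorithm has the best mean accuracy across datasets?
best = conn.execute("""
    SELECT algorithm, AVG(accuracy) AS mean_acc
    FROM runs GROUP BY algorithm ORDER BY mean_acc DESC
""").fetchone()
```

Because the runs are stored rather than discarded, the same table can answer later questions (per-dataset rankings, parameter sensitivity) without rerunning any experiment.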

  15. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  16. Ceramics Technology Project database: September 1991 summary report. [Materials for piston ring-cylinder liner for advanced heat/diesel engines

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1992-06-01

    The piston ring-cylinder liner area of the internal combustion engine must withstand very high temperature gradients, highly corrosive environments, and constant friction. Improving engine efficiency requires ring and cylinder liner materials that can survive this abusive environment and lubricants that resist decomposition at elevated temperatures. Wear and friction tests have been done on many material combinations, in environments similar to actual use, to find the right materials for the situation. This report covers tribology information produced from 1986 through July 1991 by Battelle Columbus Laboratories, Caterpillar Inc., and Cummins Engine Company, Inc. for the Ceramic Technology Project (CTP). All data in this report were taken from the project's semiannual and bimonthly progress reports and cover base materials, coatings, and lubricants. The data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from the data.

  17. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    Science.gov (United States)

    Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that mobile search volume surpasses desktop search volume, and that mobile search patterns differ from desktop search patterns, previous digital surveillance systems did not distinguish between mobile and desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries, and the number of queries with an r-value of ≥ 0.7, equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even surpassed desktop search queries over time. In the future development of influenza surveillance using search queries, recognizing this shifting trend toward mobile search data may be necessary.
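    The analysis described above (Spearman rank correlation between a weekly search-volume series and an ILI series, plus correlation at several lags) can be sketched in pure Python; the weekly series below are invented toy data, not the study's:

```python
# Spearman correlation = Pearson correlation of the rank-transformed series.

def ranks(xs):
    # average ranks for ties
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

def lag_correlations(search, ili, max_lag=2):
    # correlate search volume at week t with ILI at week t + lag
    out = {}
    for lag in range(max_lag + 1):
        x = search[: len(search) - lag] if lag else search
        out[lag] = spearman(x, ili[lag:])
    return out

search = [3, 5, 9, 14, 11, 6, 4, 2]   # toy weekly query volume
ili    = [2, 4, 8, 15, 12, 7, 3, 1]   # toy weekly ILI rate
lags = lag_correlations(search, ili)
```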

  18. MR efficiency using automated MRI-desktop eProtocol

    Science.gov (United States)

    Gao, Fei; Xu, Yanzhe; Panda, Anshuman; Zhang, Min; Hanson, James; Su, Congzhe; Wu, Teresa; Pavlicek, William; James, Judy R.

    2017-03-01

    MRI protocols are instruction sheets that radiology technologists use in routine clinical practice for guidance (e.g., slice position, acquisition parameters, etc.). In Mayo Clinic Arizona (MCA), there are over 900 MR protocols (ranging across neuro, body, cardiac, breast, etc.), which makes maintaining and updating the protocol instructions a labor-intensive effort. The task is even more challenging given the different vendors (Siemens, GE, etc.). This is a universal problem faced by all hospitals and/or medical research institutions. To increase the efficiency of the MR practice, we designed and implemented a web-based platform (eProtocol) to automate the management of MRI protocols. It is built upon a database that automatically extracts protocol information from DICOM-compliant images and provides a user-friendly interface for technologists to create, edit, and update the protocols. Advanced operations, such as protocol migration from scanner to scanner and the capability to upload multimedia content, were also implemented. To the best of our knowledge, eProtocol is the first automated MR protocol management tool used clinically. It is expected that this platform will significantly improve the efficiency of radiology operations, including better image quality and exam consistency, fewer repeat examinations, and fewer acquisition errors. These protocol instructions will be readily available to technologists during scans. In addition, this web-based platform can be extended to other imaging modalities, such as CT, Mammography, and Interventional Radiology, and to different vendors for imaging protocol management.
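    The extraction step described above can be sketched as grouping image series by the header parameters that define a protocol. Plain dicts stand in for parsed DICOM headers, and the choice of fields is illustrative only (a real system would read headers with a DICOM library such as pydicom):

```python
# Hypothetical protocol store: a protocol record is identified by a tuple of
# acquisition parameters pulled from each series' DICOM header.

PROTOCOL_FIELDS = ("ProtocolName", "RepetitionTime", "EchoTime", "SliceThickness")

def protocol_key(header):
    return tuple(header.get(f) for f in PROTOCOL_FIELDS)

def build_protocol_db(headers):
    # group series descriptions under the protocol they were acquired with
    db = {}
    for h in headers:
        db.setdefault(protocol_key(h), []).append(h.get("SeriesDescription"))
    return db

headers = [
    {"ProtocolName": "Brain T1", "RepetitionTime": 500, "EchoTime": 15,
     "SliceThickness": 5.0, "SeriesDescription": "t1_ax"},
    {"ProtocolName": "Brain T1", "RepetitionTime": 500, "EchoTime": 15,
     "SliceThickness": 5.0, "SeriesDescription": "t1_ax_repeat"},
    {"ProtocolName": "Brain T2", "RepetitionTime": 4000, "EchoTime": 90,
     "SliceThickness": 5.0, "SeriesDescription": "t2_ax"},
]
db = build_protocol_db(headers)
```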

  19. Java in a Nutshell a Desktop Quick Reference

    CERN Document Server

    Flanagan, David

    2005-01-01

    With more than 700,000 copies sold to date, Java in a Nutshell from O'Reilly is clearly the favorite resource amongst the legion of developers and programmers using Java technology. And now, with the release of the 5.0 version of Java, O'Reilly has given the book that defined the "in a Nutshell" category another impressive tune-up. In this latest revision, readers will find Java in a Nutshell, 5th Edition, does more than just cover the extensive changes implicit in 5.0, the newest version of Java. It's undergone a complete makeover--in scope, size, and type of coverage--in order to more closely meet

  20. Java Foundation Classes in a Nutshell Desktop Quick Reference

    CERN Document Server

    Flanagan, David

    1999-01-01

    Java Foundation Classes in a Nutshell is an indispensable quick reference for Java programmers who are writing applications that use graphics or graphical user interfaces. The author of the bestselling Java in a Nutshell has written fast-paced introductions to the Java APIs that comprise the Java Foundation Classes (JFC), such as the Swing GUI components and Java 2D, so that you can start using these exciting new technologies right away. This book also includes O'Reilly's classic-style, quick-reference material for all of the classes in the javax.swing and java.awt packages and their numerous

  1. A desktop 3D printer in safety-critical Java

    DEFF Research Database (Denmark)

    Strøm, Tórur Biskopstø; Schoeberl, Martin

    2012-01-01

    It is desirable to bring Java technology to safety-critical systems. To this end The Open Group has created the safety-critical Java specification, which will allow Java applications, written according to the specification, to be certifiable in accordance with safety-critical standards. Although there exist several safety-critical Java framework implementations, there is a lack of safety-critical use cases implemented according to the specification. In this paper we present a 3D printer and its safety-critical Java level 1 implementation as a use case. With basis in the implementation we evaluate...

  2. The desktop muon detector: A simple, physics-motivated machine- and electronics-shop project for university students

    Science.gov (United States)

    Axani, S. N.; Conrad, J. M.; Kirby, C.

    2017-12-01

    This paper describes the construction of a desktop muon detector, an undergraduate-level physics project that develops machine-shop and electronics-shop technical skills. The desktop muon detector is a self-contained apparatus that employs a plastic scintillator as the detection medium and a silicon photomultiplier for light collection. This detector can be battery powered and is used in conjunction with the provided software. The total cost per detector is approximately $100. We describe physics experiments we have performed, and then suggest several other interesting measurements that are possible, with one or more desktop muon detectors.

  3. Cycle 1 as predictor of assisted reproductive technology treatment outcome over multiple cycles: an analysis of linked cycles from the Society for Assisted Reproductive Technology Clinic Outcomes Reporting System online database.

    Science.gov (United States)

    Stern, Judy E; Brown, Morton B; Luke, Barbara; Wantman, Ethan; Lederman, Avi; Hornstein, Mark D

    2011-02-01

    To determine whether the first cycle of assisted reproductive technology (ART) predicts treatment course and outcome. Retrospective study of linked cycles. Society for Assisted Reproductive Technology Clinic Outcome Reporting System database. A total of 6,352 ART patients residing or treated in Massachusetts with first treatment cycle in 2004-2005 using fresh, autologous oocytes and no prior ART. Women were categorized by first cycle as follows: Group I, no retrieval; Group II, retrieval, no transfer; Group III, transfer, no embryo cryopreservation; Group IV, transfer plus cryopreservation; and Group V, all embryos cryopreserved. None. Cumulative live-birth delivery per woman, use of donor eggs, intracytoplasmic sperm injection (ICSI), or frozen embryo transfers (FET). Groups differed in age, baseline FSH level, prior gravidity, diagnosis, and failure to return for Cycle 2. Live-birth delivery rates per woman for Groups I through V, for women with no delivery in Cycle 1, were 32.1%, 35.9%, 40.1%, 53.4%, and 51.3%, respectively. Groups I and II were more likely to subsequently use donor eggs (14.5% and 10.9%). Group II had the highest use of ICSI (73.3%); Group III had the lowest use of FET (8.9%). Course of treatment in the first ART cycle is related to different cumulative live-birth delivery rates and eventual use of donor egg, ICSI, and FET. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  4. Strabo: An App and Database for Structural Geology and Tectonics Data

    Science.gov (United States)

    Newman, J.; Williams, R. T.; Tikoff, B.; Walker, J. D.; Good, J.; Michels, Z. D.; Ash, J.

    2016-12-01

    Strabo is a data system designed to facilitate digital storage and sharing of structural geology and tectonics data. The data system allows researchers to store and share field and laboratory data as well as construct new multi-disciplinary data sets. Strabo is built on graph database technology, as opposed to a relational database, which provides the flexibility to define relationships between objects of any type. This framework allows observations to be linked in a complex and hierarchical manner that is not possible in traditional database topologies. Thus, the advantage of the Strabo data structure is the ability of graph databases to link objects in both numerous and complex ways, in a manner that more accurately reflects the realities of collecting and organizing geological data sets. The data system is accessible via a mobile interface (iOS and Android devices) that allows these data to be stored, visualized, and shared during primary collection in the field or the laboratory. The Strabo Data System is underlain by the concept of a "Spot," which we define as any observation that characterizes a specific area. This can be anything from a strike and dip measurement of bedding to cross-cutting relationships between faults in complex dissected terrains. Each of these Spots can then contain other Spots and/or measurements (e.g., lithology, slickenlines, displacement magnitude). Hence, the Spot concept is applicable to all relationships and observation sets. Strabo is therefore capable of quantifying and digitally storing large spatial variations and complex geometries of naturally deformed rocks within hierarchically related maps and images. These approaches provide an observational fidelity comparable to a traditional field book, but with the added benefits of digital data storage, processing, and ease of sharing. This approach allows Strabo to integrate seamlessly into the workflow of most geologists. Future efforts will focus on extending Strabo to
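    The "Spot" concept above, in which any observation can contain or relate to any other, is naturally modeled as a graph. A minimal sketch of the idea, not Strabo's actual schema (all names and values are invented):

```python
# Each Spot is a node; edges carry a relation label ("contains", "cross_cuts"),
# which is the flexibility a graph model offers over fixed relational tables.

class Spot:
    def __init__(self, name, **attrs):
        self.name = name
        self.attrs = attrs
        self.edges = []          # list of (relation, other_spot)

    def link(self, relation, other):
        self.edges.append((relation, other))

def contained(spot):
    # all Spots reachable through "contains" edges, at any depth
    out = []
    for rel, other in spot.edges:
        if rel == "contains":
            out.append(other)
            out.extend(contained(other))
    return out

outcrop = Spot("outcrop_17")
bedding = Spot("bedding", strike=210, dip=35)
fault = Spot("fault_A", displacement_m=1.2)
slicks = Spot("slickenlines", trend=140, plunge=20)
outcrop.link("contains", bedding)
outcrop.link("contains", fault)
fault.link("contains", slicks)
fault.link("cross_cuts", bedding)   # a relationship, not containment
```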

  5. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  6. Telemedicine in rural areas. Experience with medical desktop-conferencing via satellite.

    Science.gov (United States)

    Ricke, J; Kleinholz, L; Hosten, N; Zendel, W; Lemke, A; Wielgus, W; Vöge, K H; Fleck, E; Marciniak, R; Felix, R

    1995-01-01

    Cooperation between physicians in hospitals in rural areas can be assisted by desktop-conferencing using a satellite link. For six weeks, medical desktop-conferencing was tested during daily clinical conferences between the Virchow-Klinikum, Berlin, and the Medical Academy, Wroclaw. The communications link was provided by the German Telekom satellite system MCS, which allowed temporary connections to be established on demand by manual dialling. Standard hardware and software were used for videoconferencing, as well as software for medical communication developed in the BERMED project. Digital data, such as computed tomography or magnetic resonance images, were transmitted by a digital data channel in parallel to the transmission of analogue video and audio signals. For conferences involving large groups of people, hardware modifications were required. These included the installation of a video projector, adaptation of the audio system with improved echo cancellation, and installation of extra microphones. Learning to use an unfamiliar communication medium proved to be uncomplicated for the participating physicians.

  7. Fab the coming revolution on your desktop : from personal computers to personal fabrication

    CERN Document Server

    Gershenfeld, Neil

    2005-01-01

    What if you could someday put the manufacturing power of an automobile plant on your desktop? According to Neil Gershenfeld, the renowned MIT scientist and inventor, the next big thing is personal fabrication-the ability to design and produce your own products, in your own home, with a machine that combines consumer electronics and industrial tools. Personal fabricators are about to revolutionize the world just as personal computers did a generation ago, and Fab shows us how.

  8. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  9. National Patient Care Database (NPCD)

    Data.gov (United States)

    Department of Veterans Affairs — The National Patient Care Database (NPCD), located at the Austin Information Technology Center, is part of the National Medical Information Systems (NMIS). The NPCD...

  10. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    National Research Council Canada - National Science Library

    Beal, Scott A

    2007-01-01

    Fifty-two leaders in the Basic Non-Commissioned Officer Course (BNCOC) at Fort Benning, Georgia, participated in an assessment of two desk-top computer simulations used to train tactical decision making...

  11. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Science.gov (United States)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise or community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization, both in cloud computing and in desktop grids, is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.

  12. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Directory of Open Access Journals (Sweden)

    Chakravarthy Srinivas R.

    2018-03-01

    Full Text Available Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise or community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization, both in cloud computing and in desktop grids, is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
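    The replication trade-off studied above can be illustrated with a toy Monte Carlo sketch, not the paper's MAP/G/c model: send r copies of a workunit to independent servers and keep the first finisher, lowering latency at the cost of r times the work.

```python
import random

def mean_latency(replicas, n=2000, rate=1.0, seed=7):
    # exponential service times; observed latency is the fastest replica
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += min(rng.expovariate(rate) for _ in range(replicas))
    return total / n

m1 = mean_latency(1)   # no redundancy
m3 = mean_latency(3)   # triple redundancy

# For exponential service, the minimum of r i.i.d. draws has mean 1/(r*rate),
# so triple redundancy should cut expected latency to roughly one third.
```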

  13. Realization of a Desktop Flight Simulation System for Motion-Cueing Studies

    Directory of Open Access Journals (Sweden)

    Berkay Volkaner

    2016-05-01

    Full Text Available Parallel robotic mechanisms are generally used in flight simulators with a motion-cueing algorithm to create an unlimited motion feeling of a simulated medium in a bounded workspace of the simulator. A major problem in flight simulators is that the simulation has an unbounded space and the manipulator has a limited one. Using a washout filter in the motion-cueing algorithm overcomes this. In this study, a low-cost six degrees of freedom (DoF) desktop parallel manipulator is used to test a classical motion-cueing algorithm; the algorithm's functionality is confirmed with a Simulink real-time environment. Translational accelerations and angular velocities of the simulated medium obtained from FlightGear flight simulation software are processed through a generated washout filter algorithm and the simulated medium's motion information is transmitted to the desktop parallel robotic mechanism as a set point for each leg. The major issues of this paper are designing a desktop simulation system, controlling the parallel manipulator, communicating between the flight simulation and the platform, designing a motion-cueing algorithm and determining the parameters of the washout filters.
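    The washout idea above can be sketched as a discrete first-order high-pass filter: acceleration onsets pass through to the platform, while sustained acceleration is "washed out" so the platform drifts back to neutral within its bounded workspace. The filter form and parameter values below are illustrative, not the paper's actual filter design:

```python
def washout_highpass(signal, dt=0.01, tau=1.0):
    # discrete first-order high-pass: y[n] = a * (y[n-1] + x[n] - x[n-1])
    a = tau / (tau + dt)
    y = [0.0]
    for n in range(1, len(signal)):
        y.append(a * (y[-1] + signal[n] - signal[n - 1]))
    return y

# A step in simulated acceleration: the onset is reproduced on the platform,
# then the cue decays toward zero even though the input stays high.
accel = [0.0] * 10 + [2.0] * 500
cue = washout_highpass(accel)
```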

  14. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, J. B.

    2016-01-01

    It is essential for utilities to possess a full-scope simulator for operator training and operation testing. But a full-scope simulator is very expensive, and sometimes lacks fidelity if the simulator and the plant are designed in parallel. This is because the simulator development stage sometimes precedes the plant design stage, and modifications may occur to the design of the plant during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed. This model is described herein. Using desktop simulators to train operators is an efficient method for familiarizing operators with their plant’s operation. A low-cost and efficient desktop simulator for APR1400 has been developed, and brief features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant’s operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications of the plant before implementing them in the plant.

  15. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    It is essential for utilities to possess a full-scope simulator for operator training and operation testing. But a full-scope simulator is very expensive, and sometimes lacks fidelity if the simulator and the plant are designed in parallel. This is because the simulator development stage sometimes precedes the plant design stage, and modifications may occur to the design of the plant during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed. This model is described herein. Using desktop simulators to train operators is an efficient method for familiarizing operators with their plant’s operation. A low-cost and efficient desktop simulator for APR1400 has been developed, and brief features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant’s operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications of the plant before implementing them in the plant.

  16. Investigation Methodology of a Virtual Desktop Infrastructure for IoT

    Directory of Open Access Journals (Sweden)

    Doowon Jeong

    2015-01-01

    Full Text Available Cloud computing for IoT (Internet of Things) has exhibited the greatest growth in the IT market in the recent past, and this trend is expected to continue. Many companies are adopting a virtual desktop infrastructure (VDI) for private cloud computing to reduce costs and enhance the efficiency of their servers. As VDI is widely used, threats of cyber terror and intrusion are also increasing. To minimize the damage, the response procedure for cyber intrusions on a VDI should be systematized. Therefore, we propose an investigation methodology for VDI solutions in this paper. Here we focus on virtual desktop infrastructure and introduce various desktop virtualization solutions that are widely used, such as VMware, Citrix, and Microsoft. In addition, we verify the integrity of the acquired data so that the result of our proposed methodology is acceptable as evidence in a court of law. During the experiment, we observed an error: one of the commonly used digital forensic tools failed to mount a dynamically allocated virtual disk properly.
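    The integrity-verification step mentioned above is typically done by hashing the acquired evidence, so a court can confirm the analyzed copy matches the original. A minimal sketch with invented data (a real acquisition would hash a disk image read in chunks from disk):

```python
import hashlib

def sha256_of(data, chunk=1 << 20):
    # hash in chunks, as one would for a multi-gigabyte disk image
    h = hashlib.sha256()
    for i in range(0, len(data), chunk):
        h.update(data[i:i + chunk])
    return h.hexdigest()

original = b"virtual-disk-image-bytes" * 1000   # stand-in for the source disk
acquired = bytes(original)                      # forensic copy
tampered = acquired[:-1] + b"\x00"              # a single flipped byte
```

Matching digests demonstrate the copy is bit-identical; any modification, however small, changes the digest.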

  17. An Emerging Knowledge-Based Economy in China? Indicators from OECD Databases. OECD Science, Technology and Industry Working Papers, 2004/4

    Science.gov (United States)

    Criscuolo, Chiara; Martin, Ralf

    2004-01-01

    The main objective of this Working Paper is to show a set of indicators on the knowledge-based economy for China, mainly compiled from databases within EAS, although data from databases maintained by other parts of the OECD are included as well. These indicators are put in context by comparison with data for the United States, Japan and the EU (or…

  18. The LAILAPS Search Engine: Relevance Ranking in Life Science Databases

    Directory of Open Access Journals (Sweden)

    Lange Matthias

    2010-06-01

    Full Text Available Search engines and retrieval systems are popular tools on a life science desktop. The manual inspection of hundreds of database entries that reflect a life science concept or fact is time-intensive daily work. Here, it is not the number of query results that matters, but their relevance. In this paper, we present the LAILAPS search engine for life science databases. The concept is to combine a novel feature model for relevance ranking, a machine learning approach to model user relevance profiles, ranking improvement by user feedback tracking, and an intuitive and slim web user interface that estimates relevance rank by tracking user interactions. Queries are formulated as simple keyword lists and are expanded with synonyms. Supporting a flexible text index and a simple data import format, LAILAPS can easily be used both as a search engine for comprehensive integrated life science databases and for small in-house project databases.
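    Two of the ideas in the record, synonym expansion of keyword queries and relevance ranking of entries, can be sketched as follows; the synonym table, scoring function, and entries are invented toy data, not LAILAPS's actual feature model:

```python
# Hypothetical synonym table; a real system would draw on curated ontologies.
SYNONYMS = {"gene": {"gene", "locus"}, "barley": {"barley", "hordeum"}}

def expand(query_terms):
    # replace each keyword with its synonym set
    expanded = set()
    for t in query_terms:
        expanded |= SYNONYMS.get(t, {t})
    return expanded

def score(entry_text, terms):
    # toy relevance: count of expanded-term occurrences in the entry
    words = entry_text.lower().split()
    return sum(words.count(t) for t in terms)

def search(entries, query_terms):
    terms = expand(query_terms)
    ranked = sorted(entries, key=lambda e: score(e, terms), reverse=True)
    return [e for e in ranked if score(e, terms) > 0]

entries = [
    "hordeum vulgare drought stress locus mapping",
    "yeast cell cycle regulation",
    "barley gene expression under salt stress",
]
hits = search(entries, ["barley", "gene"])
```

Note that the first entry matches only through synonyms ("hordeum", "locus"), which is exactly what plain keyword matching would miss.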

  19. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder.

    Directory of Open Access Journals (Sweden)

    Alexander Toet

    Full Text Available Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety-related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similarly to those of its real-world counterpart. Previous studies have shown that factors like simulated lighting, sound and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool to tune the affective appraisal of desktop VEs. This study investigated whether exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder. Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air or subliminal levels of unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety-related concerns and associated affective feelings. Signs of crime in the desktop VE were associated with negative affective feelings and concerns for personal safety and personal property. However, there was no significant difference between reported safety-related concerns and affective connotations in the control (no-odor) condition and in each of the two ambient odor conditions. Ambient odor did not affect safety-related concerns and affective connotations associated with signs of disorder in the desktop VE. Thus, semantic congruency between ambient odor and a desktop VE may not be sufficient to influence its affective appraisal, and a more realistic simulation in which simulated objects appear to emit scents may be required to achieve this goal.

  20. Database Description - FANTOM5 | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available FANTOM5 Database Description. General information: database name FANTOM5. Taxonomy Name: Rattus norvegicus, Taxonomy ID: 10116; Taxonomy Name: Macaca mulatta, Taxonomy ID: 9544. Database maintenance site: RIKEN Center for Life Science Technologies. Web services: not available. User registration: not required.

  1. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the necessity to improve the quality of information in the organization. The data, coming from different sources and having a variety of forms - both structured and unstructured - are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that data stored in operational systems - including databases - are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demands for complex analysis, which could not be properly achieved with operational databases. The present paper emphasizes some of the criteria that application developers can use in order to choose between a database solution and a data warehouse one.

  2. Technology as Arts-Based Education: Does the Desktop Reflect the Arts?

    Science.gov (United States)

    Gouzouasis, Peter

    2006-01-01

    Since the dawn of time, human imagination has resulted in creating extensions of self (that is, tools) as a means to overcome obstacles produced by genetic limits. Whether the tool extends thought or sense; whether the tool is organic, such as language, or inorganic; and whether electronic, digital, or analog, the artist plies the science or…

  3. DESIGN AND DEVELOPMENT OF WEB-BASED SQL SERVER DATABASE MANAGEMENT SOFTWARE

    Directory of Open Access Journals (Sweden)

    Muchammad Husni

    2005-01-01

    Full Text Available Microsoft SQL Server is a client/server desktop database server application: it has a client component, which displays and manipulates data, and a server component, which stores, retrieves, and secures the database. Management operations on all database servers in the network are performed by the database administrator using SQL Server's main administrative tool, Enterprise Manager. As a consequence, the database administrator can only perform these operations on a computer where Microsoft SQL Server has been installed. In this research, a web-based application for managing the database server was designed using ASP.Net. The application uses ADO.NET, leveraging Transact-SQL and stored procedures on the server, to perform database management operations on a SQL database server and present them on the web. The database administrator can run the web-based application from any computer on the network and connect to the SQL database server using a web browser, making it easier to carry out administrative tasks without having to work at the server computer. Keywords: Transact-SQL, ASP.Net, ADO.NET, SQL Server
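As a rough illustration of the management layer described above, the sketch below builds the kind of Transact-SQL statements such a web administration tool might send to the server through ADO.NET. The helper functions are hypothetical; only the statement syntax follows standard T-SQL:

```python
# Hypothetical helpers producing the T-SQL a web-based admin layer might
# execute on the server; the function names are invented for illustration.
def tsql_create_database(name):
    """T-SQL to create a database (bracket-quoted identifier)."""
    return f"CREATE DATABASE [{name}];"

def tsql_backup_database(name, path):
    """T-SQL to back a database up to a disk file."""
    return f"BACKUP DATABASE [{name}] TO DISK = N'{path}';"

def tsql_list_databases():
    """T-SQL to list databases via the sys.databases catalog view."""
    return "SELECT name FROM sys.databases;"
```

In the architecture the abstract describes, strings like these would be executed server-side (directly or wrapped in stored procedures) and the results rendered back into the web page.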

  4. Outline of the Desktop Severe Accident Graphic Simulator Module for OPR-1000

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. Y.; Ahn, K. I. [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    This paper introduces the desktop severe accident graphic simulator module (VMAAP), a window-based severe accident simulator that uses MAAP as its engine. VMAAP is one of the submodules of the SAMEX system (Severe Accident Management Support Expert System), a decision support system for use in severe accident management following an incident at a nuclear power plant. The SAMEX system consists of four major modules as sub-systems: (a) the severe accident risk database module (SARDB), which stores the results of integrated severe accident analysis codes such as MAAP and MELCOR for hundreds of high-frequency scenarios for the reference plant; (b) the risk-informed severe accident risk database management module (RI-SARD), which provides a platform to identify the initiating event, determine plant status and equipment availability, diagnose the status of the reactor core, reactor vessel and containment building, and predict plant behavior; (c) the severe accident management simulator module (VMAAP), which runs the MAAP4 code with a user-friendly graphic interface for input decks and output display; (d) the on-line severe accident management guidance module (On-line SAMG), which provides available accident management strategies in an electronic format. The role of VMAAP in SAMEX can be described as follows. SARDB contains most of the high-frequency scenarios based on a level 2 probabilistic safety analysis, so there is a good chance that a real accident sequence is similar to one of the database cases. In such a case, RI-SARD can predict the accident progression by a scenario-based or symptom-based search, depending on the available plant parameter information. Nevertheless, there may still be deviations or variations between the actual scenario and the database scenario. These deviations can be decreased by using a real-time graphic accident simulator, VMAAP. VMAAP is a MAAP4-based severe accident simulation model for the OPR-1000 plant. It can simulate a spectrum of physical processes
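The scenario-base search performed by RI-SARD can be pictured as a nearest-case lookup over the pre-computed scenario database. The parameter names and the squared-difference metric below are assumptions for demonstration only, not the module's actual matching logic:

```python
# Illustrative sketch of matching observed plant parameters against a
# database of pre-computed severe accident scenarios; the closest case
# is returned. Parameter names and the distance metric are invented.
def closest_scenario(observed, database):
    """database maps scenario id -> dict with the same parameter keys."""
    def distance(params):
        return sum((observed[k] - params[k]) ** 2 for k in observed)
    return min(database, key=lambda sid: distance(database[sid]))
```

In the SAMEX workflow, the matched case would then seed the real-time simulator (VMAAP) to account for deviations between the actual sequence and the stored one.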

  5. Outline of the Desktop Severe Accident Graphic Simulator Module for OPR-1000

    International Nuclear Information System (INIS)

    Park, S. Y.; Ahn, K. I.

    2015-01-01

    This paper introduces the desktop severe accident graphic simulator module (VMAAP), a window-based severe accident simulator that uses MAAP as its engine. VMAAP is one of the submodules of the SAMEX system (Severe Accident Management Support Expert System), a decision support system for use in severe accident management following an incident at a nuclear power plant. The SAMEX system consists of four major modules as sub-systems: (a) the severe accident risk database module (SARDB), which stores the results of integrated severe accident analysis codes such as MAAP and MELCOR for hundreds of high-frequency scenarios for the reference plant; (b) the risk-informed severe accident risk database management module (RI-SARD), which provides a platform to identify the initiating event, determine plant status and equipment availability, diagnose the status of the reactor core, reactor vessel and containment building, and predict plant behavior; (c) the severe accident management simulator module (VMAAP), which runs the MAAP4 code with a user-friendly graphic interface for input decks and output display; (d) the on-line severe accident management guidance module (On-line SAMG), which provides available accident management strategies in an electronic format. The role of VMAAP in SAMEX can be described as follows. SARDB contains most of the high-frequency scenarios based on a level 2 probabilistic safety analysis, so there is a good chance that a real accident sequence is similar to one of the database cases. In such a case, RI-SARD can predict the accident progression by a scenario-based or symptom-based search, depending on the available plant parameter information. Nevertheless, there may still be deviations or variations between the actual scenario and the database scenario. These deviations can be decreased by using a real-time graphic accident simulator, VMAAP. VMAAP is a MAAP4-based severe accident simulation model for the OPR-1000 plant. It can simulate a spectrum of physical processes

  6. Desktop-Stereolithography 3D-Printing of a Poly(dimethylsiloxane)-Based Material with Sylgard-184 Properties.

    Science.gov (United States)

    Bhattacharjee, Nirveek; Parra-Cabrera, Cesar; Kim, Yong Tae; Kuo, Alexandra P; Folch, Albert

    2018-05-01

    The advantageous physiochemical properties of poly(dimethylsiloxane) (PDMS) have made it an extremely useful material for prototyping in various technological, scientific, and clinical areas. However, PDMS molding is a manual procedure and requires tedious assembly steps, especially for 3D designs, thereby limiting its access and usability. On the other hand, automated digital manufacturing processes such as stereolithography (SL) enable true 3D design and fabrication. Here the formulation, characterization, and SL application of a 3D-printable PDMS resin (3DP-PDMS) based on commercially available PDMS-methacrylate macromers, a high-efficiency photoinitiator and a high-absorbance photosensitizer, is reported. Using a desktop SL-printer, optically transparent submillimeter structures and microfluidic channels are demonstrated. An optimized blend of PDMS-methacrylate macromers is also used to SL-print structures with mechanical properties similar to conventional thermally cured PDMS (Sylgard-184). Furthermore, it is shown that SL-printed 3DP-PDMS substrates can be rendered suitable for mammalian cell culture. The 3DP-PDMS resin enables assembly-free, automated, digital manufacturing of PDMS, which should facilitate the prototyping of devices for microfluidics, organ-on-chip platforms, soft robotics, flexible electronics, and sensors, among others. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    Science.gov (United States)

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  8. Owgis 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop Andmobile Devices

    Science.gov (United States)

    Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.

    2016-12-01

    OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using Compass, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. This application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.
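Configuration by XML files, as described above, can be sketched with a tiny parser. The element and attribute names in this example are invented for illustration and are not OWGIS's actual schema:

```python
# Sketch: read a hypothetical layer-configuration XML and list the layers
# a Web GIS site would display. Element/attribute names are assumptions.
import xml.etree.ElementTree as ET

CONFIG = """
<owgis>
  <layer name="sst" server="http://example.org/ncWMS" type="ncWMS"/>
  <layer name="coastline" server="http://example.org/geoserver" type="WMS"/>
</owgis>
"""

def load_layers(xml_text):
    """Return (name, server, type) tuples for every configured layer."""
    root = ET.fromstring(xml_text)
    return [(layer.get("name"), layer.get("server"), layer.get("type"))
            for layer in root.findall("layer")]
```

A generator in this spirit would then emit, for each configured layer, the OGC requests (WMS/ncWMS) needed by the generated site.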

  9. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged with pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.

  10. Exploitation of Existing Voice Over Internet Protocol Technology for Department of the Navy Application

    National Research Council Canada - National Science Library

    Vegter, Henry

    2002-01-01

    ..., reduced cost associated with toll calls and the merger of the telephone with the desktop will keep adoption of this technology on the path to ubiquitous use, Topics explored in the thesis include...

  11. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  12. The implementation of virtualization technology in EAST data system

    International Nuclear Information System (INIS)

    Wang, Feng; Sun, Xiaoyang; Li, Shi; Wang, Yong; Xiao, Bingjia; Chang, Sidi

    2014-01-01

    Highlights: • Server virtualization based on XenServer has been used in the EAST data center for common servers and the software development platform. • Application virtualization based on XenApp has been demonstrated in EAST to provide an easy and unified data browsing method. • Desktop virtualization based on XenDesktop has been adopted in the new EAST central control room. - Abstract: Virtualization technology is currently very popular in many fields and has many advantages, such as reduced costs, unified management, mobile applications, and cross-platform support. We have implemented virtualization technology in the EAST control and data system. There are primarily four providers of virtualization technology: VMware, Citrix, and Microsoft (Hyper-V), as well as open source solutions. We chose the Citrix solution to implement our virtualization system, which mainly covers three aspects. First, we adopt XenServer to provide virtual servers for the EAST data management and service system. Second, we use XenApp to realize a cross-platform system for unified data access. Third, in order to simplify the management of client computers, we adopt XenDesktop to provide virtual desktops for the new central control room. The details of the implementation are described in this paper

  13. Stackfile Database

    Science.gov (United States)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves on the efficiency and analysis capabilities of existing database software, with greater flexibility and better documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment -- GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
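Efficient retrieval across the time domain, as described above, can be sketched with a sorted index and binary search. The record layout here is an illustrative assumption, not the stackfile format itself:

```python
# Sketch: time-domain range queries over time-ordered measurement records
# using binary search (O(log n + k) per query).
import bisect

class TimeIndexedStore:
    def __init__(self, records):
        # records: (time, value) pairs; keep them sorted by time
        self.records = sorted(records)
        self.times = [t for t, _ in self.records]

    def query(self, t_start, t_end):
        """Return all records with t_start <= time <= t_end."""
        lo = bisect.bisect_left(self.times, t_start)
        hi = bisect.bisect_right(self.times, t_end)
        return self.records[lo:hi]
```

A spatial-domain query would use an analogous index keyed on along-track position instead of time.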

  14. DYNALIGHT DESKTOP

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin; Kjær, Katrine Heinsvig; Ottosen, Carl-Otto

    2018-01-01

    for energy and cost-efficient climate control strategies that do not compromise product quality. In this paper, we present a novel approach addressing dynamic control of supplemental light in greenhouses, aiming to decrease electricity costs and energy consumption without loss in plant productivity. Our approach uses weather forecasts and electricity prices to compute energy- and cost-efficient supplemental light plans which fulfil the production goals of the grower. The approach is supported by a set of newly developed planning software, which interfaces with a greenhouse climate computer. The planning algorithm is based on a new plant-physiological understanding that utilizes the natural plasticity of plants to irregular light periods. The results revealed that different light control strategies using three different set points of daily photosynthesis integral (DPI) compared to a control treatment
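The cost-minimizing core of such a light plan can be sketched very simply: given hourly electricity prices and the number of supplemental-light hours still needed to reach the DPI set point, run the lamps in the cheapest hours. This deliberately ignores the weather forecast and the plant-physiology constraints the real planner uses; it only illustrates the price-driven scheduling idea:

```python
# Minimal sketch: choose the cheapest hours in which to run supplemental
# light. Prices and the hour count are illustrative inputs.
def plan_light(prices, hours_needed):
    """Return the set of hour indices in which to switch the lamps on."""
    cheapest = sorted(range(len(prices)), key=lambda h: prices[h])
    return set(cheapest[:hours_needed])
```

In a fuller planner, `hours_needed` would itself be derived from the forecast natural light and the remaining daily photosynthesis integral.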

  15. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-12-01

    The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place, such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Implementation issues pertaining to the VDI system arose with regard to the high-availability system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted as regards patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of the VDI system discussed here should be examined in advance for its successful adoption and implementation.

  16. 3D Virtual CH Interactive Information Systems for a Smart Web Browsing Experience for Desktop PCs and Mobile Devices

    Science.gov (United States)

    Scianna, A.; La Guardia, M.

    2018-05-01

    Recently, the diffusion of knowledge on Cultural Heritage (CH) has become an element of primary importance for its valorization. At the same time, the diffusion of surveys based on Unmanned Aerial Vehicle (UAV) technologies and new methods of photogrammetric reconstruction have opened new possibilities for 3D CH representation. Furthermore, the recent development of faster and more stable internet connections leads people to increase their use of mobile devices. In the light of all this, the development of Virtual Reality (VR) environments applied to CH is strategic for the diffusion of knowledge in a smart solution. In particular, the present work shows how, starting from a basic survey and the further photogrammetric reconstruction of a cultural good, it is possible to build a 3D CH interactive information system useful for desktop and mobile devices. For this experimentation, the Arab-Norman church of the Trinity of Delia (in Castelvetrano-Sicily-Italy) has been adopted as a case study. The survey operations have been carried out considering different rapid methods of acquisition (UAV camera, SLR camera and smartphone camera). The web platform used to publish the 3D information has been built using the HTML5 markup language and WebGL JavaScript libraries (Three.js libraries). This work presents the construction of a 3D navigation system for web browsing of a virtual CH environment, with the integration of first-person controls and 3D popup links. This contribution adds a further step to enrich the possibilities of open-source technologies applied to the world of CH valorization on the web.

  17. 3D Virtual CH Interactive Information Systems for a smart web browsing experience for desktop PCs and mobile devices

    Directory of Open Access Journals (Sweden)

    A. Scianna

    2018-05-01

    Full Text Available Recently, the diffusion of knowledge on Cultural Heritage (CH) has become an element of primary importance for its valorization. At the same time, the diffusion of surveys based on Unmanned Aerial Vehicle (UAV) technologies and new methods of photogrammetric reconstruction have opened new possibilities for 3D CH representation. Furthermore, the recent development of faster and more stable internet connections leads people to increase their use of mobile devices. In the light of all this, the development of Virtual Reality (VR) environments applied to CH is strategic for the diffusion of knowledge in a smart solution. In particular, the present work shows how, starting from a basic survey and the further photogrammetric reconstruction of a cultural good, it is possible to build a 3D CH interactive information system useful for desktop and mobile devices. For this experimentation, the Arab-Norman church of the Trinity of Delia (in Castelvetrano-Sicily-Italy) has been adopted as a case study. The survey operations have been carried out considering different rapid methods of acquisition (UAV camera, SLR camera and smartphone camera). The web platform used to publish the 3D information has been built using the HTML5 markup language and WebGL JavaScript libraries (Three.js libraries). This work presents the construction of a 3D navigation system for web browsing of a virtual CH environment, with the integration of first-person controls and 3D popup links. This contribution adds a further step to enrich the possibilities of open-source technologies applied to the world of CH valorization on the web.

  18. Oracle Application Express 5 for beginners a practical guide to rapidly develop data-centric web applications accessible from desktop, laptops, tablets, and smartphones

    CERN Document Server

    2015-01-01

    Oracle Application Express has taken another big leap towards becoming a true next-generation RAD tool. It has entered its fifth version to build robust web applications. One of the most significant features in this release is a new page designer that helps developers create and edit page elements within a single page design view, which enormously maximizes developer productivity. Without involving the audience too much in the boring bits, this full-color edition adopts an inspiring approach that helps beginners practically evaluate almost every feature of Oracle Application Express, including all features new to version 5. The most convincing way to explore a technology is to apply it to a real-world problem. In this book, you'll develop a sales application that demonstrates almost every feature to practically expose the anatomy of Oracle Application Express 5. The short list below presents some main topics of Oracle APEX covered in this book: Rapid web application development for desktops, la...

  19. Collection and analysis of environmental radiation data using a desktop computer

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1982-04-01

    A portable instrumentation system using a Hewlett-Packard HP-9825 desktop computer for the collection and analysis of environmental radiation data is described. Procedures for the transmission of data between the HP-9825 and various nuclear counters are given, together with a description of the necessary hardware and software. Complete programs for the analysis of Ge(Li) and NaI(Tl) gamma-ray spectra, high-pressure ionization chamber monitor data, 86Kr monitor data, and air filter sample alpha-particle activity measurements are presented. Some utility programs, intended to increase system flexibility, are included.

  20. Computing on the Desktop: From Batch to Online in Two Large Danish Service Bureaus

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    The advent of the personal computer is often hailed as the major step towards empowering the computer user. This step was indeed significant, but it was preceded by a similar step some 10-15 years earlier: the advent of the video terminal, or "glass-TTY". The video terminal invaded the desktop of many white collar workers and the workplace of many blue collar workers in the 1970s and 1980s. It replaced batch processing and facilitated direct, interactive access to computing services. This had a considerable impact on working conditions. This paper addresses this transition in two large Danish...

  1. Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009

    Science.gov (United States)

    Anderson, Kevin R.; Paine, Chris

    2006-01-01

    This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop® Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by SindaWorks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield comprised of a series of v-groove shaped deployable panels are used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software, using emissivities whose temperature and wavelength dependency is modeled via a Hagen-Rubens relationship.
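A Hagen-Rubens-type relationship ties a metal's normal spectral emissivity to its electrical resistivity and the wavelength, roughly as epsilon proportional to sqrt(rho/lambda); since resistivity grows with temperature, this gives both the temperature and wavelength dependency mentioned above. The prefactor and unit conventions vary between radiative heat transfer texts, so the constant below is an assumption; only the functional form is the point:

```python
# Sketch of a Hagen-Rubens-type emissivity model: emissivity increases
# with resistivity rho and decreases with wavelength lam. The prefactor C
# and the unit conventions are assumptions for illustration.
import math

def hagen_rubens_emissivity(rho, lam, C=0.365):
    """Normal spectral emissivity, clipped to the physical range [0, 1]."""
    return min(1.0, C * math.sqrt(rho / lam))
```

A non-grey solver in the spirit of SindaWorks would evaluate such a relation per wavelength band and per surface temperature instead of using a single grey emissivity.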

  2. Automating Relational Database Design for Microcomputer Users.

    Science.gov (United States)

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  3. Tight-coupling of groundwater flow and transport modelling engines with spatial databases and GIS technology: a new approach integrating Feflow and ArcGIS

    Directory of Open Access Journals (Sweden)

    Ezio Crestaz

    2012-09-01

Full Text Available Implementation of groundwater flow and transport numerical models is generally a challenging, time-consuming and financially demanding task, in the charge of specialized modelers and consulting firms. At a later stage, within clearly stated limits of applicability, these models are often expected to be made available to less knowledgeable personnel to support the design and running of predictive simulations within environments more familiar than specialized simulation systems. GIS systems coupled with spatial databases appear to be ideal candidates to address this problem, due to their much wider diffusion and expertise availability. This paper discusses the issue from a tight-coupling architecture perspective, aimed at the integration of spatial databases, GIS and numerical simulation engines, addressing both observed and computed data management, retrieval and spatio-temporal analysis issues. Observed data can be migrated to the central database repository and then used to set up transient simulation conditions in the background, at run time, while limiting additional complexity and integrity failure risks such as data duplication during data transfer through proprietary file formats. Similarly, simulation scenarios can be set up in a familiar GIS system and stored in the spatial database for later reference. As the numerical engine is tightly coupled with the GIS, simulations can be run within the environment and the results themselves saved to the database. Further tasks, such as spatio-temporal analysis (i.e. for post-calibration auditing scopes), cartography production and geovisualization, can then be addressed using traditional GIS tools. Benefits of such an approach include more effective data management practices, integration and availability of modeling facilities in a familiar environment, and streamlining of spatial analysis processes and geovisualization requirements for the non-modelers community. Major drawbacks include limited 3D and time-dependent support in

  4. Fabrication of low cost soft tissue prostheses with the desktop 3D printer.

    Science.gov (United States)

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-11-27

Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate a soft prosthesis mold with a low-cost desktop 3D printer. The fabrication method is referred to as Scanning Printing Polishing Casting (SPPC). First, the anatomy is scanned with a 3D scanner; then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and producing a smooth surface. Finally, medical grade silicone is cast into the mold. After the silicone has cured, the finished soft prosthesis can be removed from the mold. Using the SPPC method, soft prostheses with smooth surfaces and complicated structures can be fabricated at low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, much lower than with current soft prosthesis fabrication methods.

  5. Effects of boundary-layer separation controllers on a desktop fume hood.

    Science.gov (United States)

    Huang, Rong Fung; Chen, Jia-Kun; Hsu, Ching Min; Hung, Shuo-Fu

    2016-10-02

    A desktop fume hood installed with an innovative design of flow boundary-layer separation controllers on the leading edges of the side plates, work surface, and corners was developed and characterized for its flow and containment leakage characteristics. The geometric features of the developed desktop fume hood included a rearward offset suction slot, two side plates, two side-plate boundary-layer separation controllers on the leading edges of the side plates, a slanted surface on the leading edge of the work surface, and two small triangular plates on the upper left and right corners of the hood face. The flow characteristics were examined using the laser-assisted smoke flow visualization technique. The containment leakages were measured by the tracer gas (sulphur hexafluoride) detection method on the hood face plane with a mannequin installed in front of the hood. The results of flow visualization showed that the smoke dispersions induced by the boundary-layer separations on the leading edges of the side plates and work surface, as well as the three-dimensional complex flows on the upper-left and -right corners of the hood face, were effectively alleviated by the boundary-layer separation controllers. The results of the tracer gas detection method with a mannequin standing in front of the hood showed that the leakage levels were negligibly small (≤0.003 ppm) at low face velocities (≥0.19 m/s).

  6. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2011-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fifth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  7. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2005-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fourth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rul

  8. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Transforming Data Models to Relational Databases; Enforcing Constraints; Creating Database for Business Process; Physical Design and Database

  9. Virtualisation Devices for Student Learning: Comparison between Desktop-Based (Oculus Rift) and Mobile-Based (Gear VR) Virtual Reality in Medical and Health Science Education

    Science.gov (United States)

    Moro, Christian; Stromberga, Zane; Stirling, Allan

    2017-01-01

    Consumer-grade virtual reality has recently become available for both desktop and mobile platforms and may redefine the way that students learn. However, the decision regarding which device to utilise within a curriculum is unclear. Desktop-based VR has considerably higher setup costs involved, whereas mobile-based VR cannot produce the quality of…

  10. Draft secure medical database standard.

    Science.gov (United States)

    Pangalos, George

    2002-01-01

Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assure the quality of care, support effective management of the health services institutions, monitor and contain the cost of care, implement technology into care without violating social values, ensure the equity and availability of care, and preserve humanity despite the proliferation of technology. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but instead are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied. It also gives an overview of the research work in the area. The document also presents in detail the most complete set of security guidelines, to our knowledge, for the development and operation of medical database systems.

  11. Analysis of low and medium energy physics records in databases. Science and technology indicators in low and medium energy physics. With particular emphasis on nuclear data

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1998-12-01

    An analysis of the literature on low and medium energy physics, with particular emphasis on nuclear data, was performed on the basis of the contents of the bibliographic database INIS (International Nuclear Information System). Quantitative data were obtained on various characteristics of relevant INIS records such as subject categories, language and country of publication, publication types, etc. Rather surprisingly, it was found that the number of records in nuclear physics has remained nearly constant over the last decade. The analysis opens up the possibility of further studies, e.g. on international research co-operation and on publication patterns. (author)

  12. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
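The event-list mechanics the paper teaches in Excel apply in any language. As a rough illustration (not the paper's supply-chain model; all names, rates, and the queue discipline are invented here), a single-server discrete-event simulation can be sketched in Python:

```python
import heapq
import random

def simulate_queue(arrival_rate, service_rate, n_customers, seed=1):
    """Minimal single-server discrete-event simulation.

    A priority queue of (time, kind) events and a clock that jumps from
    event to event: the core mechanics of any DES model. Returns the
    mean time customers spend waiting for service to begin.
    """
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    waiting = []          # arrival times of customers queued behind the server
    total_wait = 0.0
    arrivals = 0
    busy = False

    while events:
        clock, kind = heapq.heappop(events)
        if kind == "arrival":
            arrivals += 1
            if arrivals < n_customers:   # schedule the next arrival
                heapq.heappush(
                    events, (clock + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                waiting.append(clock)    # join the queue
            else:
                busy = True              # served immediately, zero wait
                heapq.heappush(
                    events, (clock + rng.expovariate(service_rate), "departure"))
        else:                            # a departure frees the server
            if waiting:
                total_wait += clock - waiting.pop(0)
                heapq.heappush(
                    events, (clock + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return total_wait / n_customers
```

The essential ingredients, a future-event list, a simulation clock that jumps between events, and state updated per event, are exactly what the Excel model implements with worksheet formulas.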

  13. Desktop publishing and medical imaging: paper as hardcopy medium for digital images.

    Science.gov (United States)

    Denslow, S

    1994-08-01

Desktop-publishing software and hardware have progressed to the point that many widely used word-processing programs are capable of printing high-quality digital images with many shades of gray from black to white. Accordingly, it should be relatively easy to print digital medical images on paper for reports, instructional materials, and research notes. The components necessary for extracting image data from medical imaging devices and converting the data to a form usable by word-processing software were assembled. A system incorporating these components was implemented in a medical setting and has been operating for 18 months. The use of this system by medical staff has been monitored.

  14. Detection of analyte binding to microarrays using gold nanoparticle labels and a desktop scanner

    DEFF Research Database (Denmark)

    Han, Anpan; Dufva, Martin; Belleville, Erik

    2003-01-01

on gold nanoparticle labeled antibodies visualized by a commercial, office desktop flatbed scanner. Scanning electron microscopy studies showed that the signal from the flatbed scanner was proportional to the surface density of the bound antibody-gold conjugates, and that the flatbed scanner could detect...... six attomoles of antibody-gold conjugates. This detection system was used in a competitive immunoassay to measure the concentration of the pesticide metabolite 2,6-dichlorobenzamide (BAM) in water samples. The results showed that the gold labeled antibodies functioned comparably with a fluorescent...... based immunoassay for detecting BAM in water. A qualitative immunoassay based on gold-labeled antibodies could determine if a water sample contained BAM above or below 60-70 ng L(-1), which is below the maximum allowed BAM concentration for drinking water (100 ng L(-1)) according to European Union......

  15. Using the rear projection of the Socibot Desktop robot for creation of applications with facial expressions

    Science.gov (United States)

    Gîlcă, G.; Bîzdoacă, N. G.; Diaconu, I.

    2016-08-01

This article aims to implement some practical applications using the Socibot Desktop social robot. We realize three applications: creating a speech sequence using the Kiosk menu of the browser interface, creating a program in the Virtual Robot browser interface, and making a new guise to be loaded into the robot's memory in order to be projected onto its face. The first application is created in the Compose submenu, which contains 5 file categories: audio, eyes, face, head, and mood; these are helpful in creating the projected sequence. The second application is more complex, the completed program containing audio files, speeches (which can be created in over 20 languages), head movements, the robot's facial parameters as a function of the action units (AUs) of the facial muscles, its expressions, and its line of sight. The last application aims to change the robot's appearance with a guise created by us. The guise was created in Adobe Photoshop and then loaded into the robot's memory.

  16. Expanding services in a shrinking economy: desktop document delivery in a dental school library.

    Science.gov (United States)

    Gushrowski, Barbara A

    2011-07-01

    How can library staff develop and promote a document delivery service and then expand the service to a wide audience? The setting is the library at the Indiana University School of Dentistry (IUSD), Indianapolis. A faculty survey and a citation analysis were conducted to determine potential use of the service. Volume of interlibrary loan transactions and staff and equipment capacity were also studied. IUSD Library staff created a desktop delivery service (DDSXpress) for faculty and then expanded the service to practicing dental professionals and graduate students. The number of faculty using DDSXpress remains consistent. The number of practicing dental professionals using the service is low. Graduate students have been quick to adopt the service. Through careful analysis of capacity and need for the service, staff successfully expanded document delivery service without incurring additional costs. Use of DDSXpress is continually monitored, and opportunities to market the service to practicing dental professionals are being investigated.

  17. Thermoelectric cooling of microelectronic circuits and waste heat electrical power generation in a desktop personal computer

    International Nuclear Information System (INIS)

    Gould, C.A.; Shammas, N.Y.A.; Grainger, S.; Taylor, I.

    2011-01-01

    Thermoelectric cooling and micro-power generation from waste heat within a standard desktop computer has been demonstrated. A thermoelectric test system has been designed and constructed, with typical test results presented for thermoelectric cooling and micro-power generation when the computer is executing a number of different applications. A thermoelectric module, operating as a heat pump, can lower the operating temperature of the computer's microprocessor and graphics processor to temperatures below ambient conditions. A small amount of electrical power, typically in the micro-watt or milli-watt range, can be generated by a thermoelectric module attached to the outside of the computer's standard heat sink assembly, when a secondary heat sink is attached to the other side of the thermoelectric module. Maximum electrical power can be generated by the thermoelectric module when a water cooled heat sink is used as the secondary heat sink, as this produces the greatest temperature difference between both sides of the module.
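The milliwatt-scale figures reported are consistent with the standard textbook model of a thermoelectric generator: a Seebeck voltage source V = S·ΔT behind the module's internal resistance. The sketch below uses that model; all parameter values are purely illustrative, none are taken from the paper:

```python
def te_load_power(seebeck, delta_t, r_internal, r_load):
    """Power delivered to an external load by a thermoelectric module.

    The module is modeled as a voltage source V = seebeck * delta_t in
    series with its internal resistance, so the load power is
    P = V**2 * r_load / (r_internal + r_load)**2.
    """
    v_open_circuit = seebeck * delta_t  # open-circuit Seebeck voltage [V]
    return v_open_circuit ** 2 * r_load / (r_internal + r_load) ** 2
```

For a fixed temperature difference, the delivered power peaks when the load resistance matches the module's internal resistance; enlarging ΔT, as the water-cooled secondary heat sink does, is the other lever for raising output.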

  18. FRAMEWORK FOR THE CONVERSION OF DELPHI DESKTOP APPLICATIONS INTO NATIVE ANDROID APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Rodrigo da Silva Riquena

    2014-08-01

Full Text Available With the growing use of mobile devices by companies and organizations, there is an increasing demand for applications on the mobile platform. For certain companies, business success may depend on a mobile application which brings them closer to customers or improves the performance of internal processes. However, developing software for the mobile platform is an expensive process which takes time and resources. A framework that converts Delphi Desktop applications into native Android applications automatically constitutes a useful tool by which architects and software developers can contribute to the implementation phase of the application. Therefore, this work is based on methods and processes for software reengineering, such as PRE/OO (Process of Reengineering Object Oriented), for the automatic conversion of an application developed in the Delphi environment into an application for the Android mobile platform. Finally, an experiment was performed with a real case to validate the goals.

  19. The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector

    Science.gov (United States)

    Axani, S. N.; Frankiewicz, K.; Conrad, J. M.

    2018-03-01

The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection, and the data can be recorded either directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.

  20. Qualitative research ethics on the spot: Not only on the desktop.

    Science.gov (United States)

    Øye, Christine; Sørensen, Nelli Øvre; Glasdam, Stinne

    2016-06-01

The increase in medical ethical regulations and bureaucracy handled by institutional review boards and healthcare institutions puts researchers using qualitative methods in a challenging position. Based on three cases from three different research studies, the article explores and discusses research ethical dilemmas. First, and especially, the article addresses the challenges for gatekeepers who influence informants' decisions to participate in research. Second, the article addresses the challenges of following research ethical guidelines related to informed consent and doing no harm. Third, the article argues for the importance of having research ethical guidelines and review boards to question and discuss the possible ethical dilemmas that occur in qualitative research. Research ethics in qualitative research must be understood as relational, situational, and emerging. That is, attention has to be paid to ethical issues and dilemmas on the spot, not only at the desktop. © The Author(s) 2015.

  1. Fabrication of cerebral aneurysm simulator with a desktop 3D printer.

    Science.gov (United States)

    Liu, Yu; Gao, Qing; Du, Song; Chen, ZiChen; Fu, JianZhong; Chen, Bing; Liu, ZhenJie; He, Yong

    2017-05-17

Now, more and more patients are suffering from cerebral aneurysms. However, long training times limit the rapid growth of cerebrovascular neurosurgeons. Here we developed a novel cerebral aneurysm simulator which better represents the dynamic bulging process of a cerebral aneurysm. The proposed simulator features the integration of a hollow elastic vascular model, a skull model and a brain model, which can be affordably fabricated at the clinic (Fab@Clinic) for under $25.00 each with the help of a low-cost desktop 3D printer. Moreover, blood flow and pulsation pressure similar to those in humans can be well simulated, and the simulator can be used to train neurosurgical residents to clip aneurysms more effectively.

  2. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  3. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    Science.gov (United States)

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  4. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

It is becoming increasingly important to be able to accurately forecast flooding, as flooding accounts for the largest losses due to natural disasters in the world and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated because computations are completed only on inundated cells. The drastic reduction in computation time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
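The domain-tracking idea, updating only wet cells and the cells they may wet instead of sweeping the whole grid each step, can be sketched independently of the paper's solver. The toy redistribution rule below is a stand-in for the shallow water equations the paper actually solves, and the single-threaded loop omits the Java multithreading it adds on top; all names are illustrative:

```python
def flood_step(depth, active, spill=0.2):
    """One explicit update pass restricted to the tracked wet cells.

    Domain tracking: only currently inundated cells (and the neighbours
    they wet) are visited. Each wet cell pushes a fixed share of its
    depth to its four neighbours; mass is conserved exactly.
    """
    rows, cols = len(depth), len(depth[0])
    next_active = set()
    for r, c in sorted(active):          # sorted for deterministic order
        h = depth[r][c]
        if h <= 0.0:
            continue
        next_active.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                flux = h * spill / 4.0   # share moved to this neighbour
                depth[r][c] -= flux
                depth[nr][nc] += flux
                next_active.add((nr, nc))
    return next_active
```

Because `next_active` only ever grows along the wetting front, early time steps touch a tiny fraction of the grid, which is where the reported speed-up comes from.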

  5. Assessing soil erosion risk using RUSLE through a GIS open source desktop and web application.

    Science.gov (United States)

    Duarte, L; Teodoro, A C; Gonçalves, J A; Soares, D; Cunha, M

    2016-06-01

Soil erosion is a serious environmental problem. An estimate of the expected soil loss from water-caused erosion can be calculated using the Revised Universal Soil Loss Equation (RUSLE). Geographical Information Systems (GIS) provide tools to create categorical maps of soil erosion risk, which support risk assessment of soil loss. The objective of this study was to develop a GIS open source application (in QGIS) using the RUSLE methodology for estimating erosion rate at the watershed scale (desktop application), and to provide the same application via web access (web application). The applications developed allow one to generate all the maps necessary to evaluate soil erosion risk. Several libraries and algorithms from SEXTANTE were used to develop these applications. The applications were tested in the Montalegre municipality (Portugal). The maps involved in the RUSLE method, namely the soil erosivity factor, soil erodibility factor, topographic factor, cover management factor, and support practices factor, were created. The estimated mean value of the soil loss obtained was 220 ton km(-2) year(-1), ranging from 0.27 to 1283 ton km(-2) year(-1). The results indicated that most of the study area (80 %) is characterized by very low soil erosion level (soil erosion was higher than 962 ton km(-2) year(-1). It was also concluded that areas with high slope values and bare soil are associated with high levels of erosion, and that the higher the P and C values, the higher the soil erosion percentage. The RUSLE web and desktop applications are freely available.
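At its core, the per-cell computation behind these maps is the RUSLE product A = R · K · LS · C · P. A minimal sketch, with plain Python lists standing in for the raster layers the QGIS application would actually handle:

```python
def rusle_soil_loss(r_grid, k_grid, ls_grid, c_grid, p_grid):
    """Cell-by-cell RUSLE estimate: A = R * K * LS * C * P.

    Each argument is a 2D grid (list of rows) holding one RUSLE factor;
    the result grid holds the expected annual soil loss per cell, in
    the units implied by the factor layers (e.g. ton km^-2 year^-1).
    """
    return [
        [r * k * ls * c * p
         for r, k, ls, c, p in zip(rr, kr, lsr, cr, pr)]
        for rr, kr, lsr, cr, pr in zip(r_grid, k_grid, ls_grid, c_grid, p_grid)
    ]
```

In the real applications each factor grid is itself derived from input data (rainfall records, soil maps, a DEM for the LS factor, land cover for C and P) before this final multiplication.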

  6. Economic analysis of cloud-based desktop virtualization implementation at a hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-10-30

Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis of the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more favorable the investment. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
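The ROI, NPV, and IRR indexes used in the analysis are standard corporate-finance quantities. A minimal sketch of NPV and a bisection-based IRR follows; the cashflows in the test are hypothetical, not the hospital's actual figures:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time zero."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return (the rate at which NPV = 0) by bisection.

    Assumes the usual investment shape (initial outflow followed by
    inflows), for which NPV decreases monotonically in the rate.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0.0:
            lo = mid        # NPV still positive: the root lies above mid
        else:
            hi = mid        # NPV non-positive: the root lies below mid
    return (lo + hi) / 2.0
```

For example, for the hypothetical cashflows [-100.0, 60.0, 60.0], NPV at a zero discount rate is 20.0 and the IRR comes out at roughly 13%; a break-even year is simply the first year in which the cumulative discounted cashflow turns non-negative.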

  7. Economic analysis of cloud-based desktop virtualization implementation at a hospital

    Directory of Open Access Journals (Sweden)

    Yoo Sooyoung

    2012-10-01

Full Text Available Abstract Background Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis of the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more favorable the investment. Conclusions This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.

  8. INIS: international nuclear information system. World's first international database on peaceful uses of nuclear sciences and technologies; INIS: International Nuclear Information System. Premiere base de donnees internationale sur les applications pacifiques des sciences et technologies nucleaires

    Energy Technology Data Exchange (ETDEWEB)

    Surmont, J.; Constant, A.; Guille, N.; Le Blanc, A.; Mouffron, O.; Anguise, P.; Jouve, J.J

    2007-07-01

This poster, prepared for the 2007 CEA meetings on scientific and technical information, presents the INIS information system, the document-type content and subject coverage of the database, the French contribution to this system through the INIS team at CEA-Saclay, the input preparation process, and an example of the valorization of a scientific and historical patrimony: the joint CEA/IAEA project to digitize about 2760 CEA reports published between 1948 and 1969. All these reports have been digitized by the IAEA, analyzed by the CEA, and entered in the INIS database with a link to the full text. (J.S.)

  9. Mathematics for Databases

    NARCIS (Netherlands)

    ir. Sander van Laar

    2007-01-01

    A formal description of a database consists of the description of the relations (tables) of the database together with the constraints that must hold on the database. Furthermore the contents of a database can be retrieved using queries. These constraints and queries for databases can very well be
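The ingredients the abstract lists (relations, constraints that must hold, and queries to retrieve the contents) can be shown concretely; this sketch uses SQLite, and the employee schema, CHECK constraint, and query are invented for illustration only.

```python
# Sketch of a relation with a constraint and a query (illustrative schema).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE employee (
        id     INTEGER PRIMARY KEY,
        name   TEXT NOT NULL,
        salary REAL CHECK (salary >= 0)   -- a constraint that must hold
    )""")
con.execute("INSERT INTO employee VALUES (1, 'Ada', 5000)")
try:
    con.execute("INSERT INTO employee VALUES (2, 'Bob', -1)")  # violates CHECK
except sqlite3.IntegrityError as e:
    print("rejected:", e)

# The contents of the database are retrieved using a query
rows = con.execute("SELECT name, salary FROM employee").fetchall()
print(rows)  # [('Ada', 5000.0)]
```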

  10. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  11. DOT Online Database

    Science.gov (United States)

    Searchable full-text databases of Advisory Circulars and related document records, including data collection and distribution policies. The document database website is provided by MicroSearch.

  12. Environmental Effects of Hydrokinetic Turbines on Fish: Desktop and Laboratory Flume Studies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Amaral, Stephen V. [Alden Research Laboratory; Castro-Santos, Theodore [U.S. Geological Survey; Giza, Dan [Alden Research Laboratory; Haro, Alexander J. [U.S. Geological Survey; Hecker, George [Alden Research Laboratory; McMahon, Brian [Alden Research Laboratory; Perkins, Norman [Alden Research Laboratory; Pioppi, Nick [Alden Research Laboratory

    2012-12-31

    This collection of three reports describes desktop and laboratory flume studies that provide information to support assessment of the potential for injury and mortality of fish that encounter hydrokinetic turbines of various designs installed in tidal and river environments. Behavioral responses to turbine exposure also are investigated to support assessment of the potential for disruptions to upstream and downstream movements of fish. The studies: (1) conducted an assessment of potential injury mechanisms using available data from studies with conventional hydro turbines; (2) developed theoretical models for predicting blade strike probabilities and mortality rates; and (3) performed flume testing with three turbine designs and several fish species and size groups in two laboratory flumes to estimate survival rates and document fish behavior. The project yielded the three reports that this document comprises; the constituent documents are addressed individually below. Fish Passage Through Turbines: Application of Conventional Hydropower Data to Hydrokinetic Technologies. Fish passing through the blade sweep of a hydrokinetic turbine experience a much less harsh physical environment than do fish entrained through conventional hydro turbines. The design and operation of conventional turbines result in high flow velocities, abrupt changes in flow direction, relatively high runner rotational and blade speeds, rapid and significant changes in pressure, and the need for various structures throughout the turbine passageway that can be impacted by fish. These conditions generally do not occur or are not significant factors for hydrokinetic turbines. Furthermore, compared to conventional hydro turbines, hydrokinetic turbines typically produce relatively minor changes in shear, turbulence, and pressure levels from ambient conditions in the surrounding environment. Injuries and mortality from mechanical causes will be lower as well, mainly due to low rotational speeds and

  13. Database Description - NBDC NikkajiRDF | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available or Name: Japan Science and Technology Agency (JST) Creator Affiliation: Contact a...e information Database maintenance site Japan Science and Technology Agency (JST) URL of the original websit

  14. Database Description - fRNAdb | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Affiliation: National Institute of Advanced Industrial Science and Technology (AIST) Journal Search: Creato...D89-92 External Links: Original website information Database maintenance site National Institute of Industrial Science and Technology

  15. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; O'Sullivan, Paul

    2016-01-01

    The currently running International Energy Agency, Energy and Conservation in Buildings, Annex 62 Ventilative Cooling (VC) project is coordinating research towards extended use of VC. Within this Annex 62, the joint research activity of the International VC Application Database has been carried out, systematically investigating the distribution of technologies and strategies within VC. The database is structured as both a ticking-list-like building-spreadsheet and a collection of building-datasheets; the content of both closely follows the Annex 62 State-Of-The-Art Report. The database has been filled with buildings and locations using VC as a means of indoor comfort improvement. The building-spreadsheet highlights distributions of technologies and strategies (numbers in % refer to the sample of the database's 91 buildings). It may be concluded that Ventilative Cooling is applied in temporary…

  16. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet tends to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists. Using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding fundamental database systems issues, which is necessary in order to train specialists in economic informatics higher education. Database systems integrate and interact with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a minimum set of mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents current trends in the evolution of database systems in the context of economic informatics.

  17. SmallSat Database

    Science.gov (United States)

    Petropulos, Dolores; Bittner, David; Murawski, Robert; Golden, Bert

    2015-01-01

    The SmallSat has an unrealized potential in both private industry and the federal government. Currently over 70 companies, 50 universities, and 17 governmental agencies are involved in SmallSat research and development. In 1994, the U.S. Army Missile and Defense mapped the moon using smallSat imagery. Since then, smart phones have introduced this imagery to the people of the world as diverse industries watched this trend. The deployment cost of smallSats is also greatly reduced compared to traditional satellites because multiple units can be deployed in a single mission. Imaging payloads have become more sophisticated, smaller, and lighter. In addition, the growth of small technology obtained from private industries has led to more widespread use of smallSats. This includes greater revisit rates in imagery, significantly lower costs, the ability to update technology more frequently, and the ability to decrease vulnerability to enemy attacks. The popularity of smallSats shows a changing mentality in this fast-paced world of tomorrow. What impact has this created on the NASA communication networks now and in future years? In this project, we are developing the SmallSat Relational Database, which can support a simulation of smallSats within the NASA SCaN Compatibility Environment for Networks and Integrated Communications (SCENIC) Modeling and Simulation Lab. The NASA Space Communications and Networks (SCaN) Program can use this modeling to project required network support needs in the next 10 to 15 years. The SmallSat Relational Database could model smallSats just as the other SCaN databases model the more traditional larger satellites, with a few exceptions, one being that the SmallSat database is designed to be built-to-order. The SmallSat database holds various hardware configurations that can be used to model a smallSat. It will require significant effort to develop, as the research material can only be populated by hand to obtain the unique data.

  18. Adoption of new technologies in a highly uncertain environment : the case of knowledge discovery in databases for customer relationship management in Egyptian public banks

    NARCIS (Netherlands)

    Khedr, Ayman El_Sayed

    2008-01-01

    “How can we better understand the process of adopting a new technology and its impact on business value in situations of high uncertainty?” In short, this is the central research question addressed in this thesis. The dissertation explores how uncertainty factors affect the adoption process of a new

  19. Technostress: Surviving a Database Crash.

    Science.gov (United States)

    Dobb, Linda S.

    1990-01-01

    Discussion of technostress in libraries focuses on a database crash at California Polytechnic State University, San Luis Obispo. Steps taken to restore the data are explained, strategies for handling technological accidents are suggested, the impact on library staff is discussed, and a 10-item annotated bibliography on technostress is provided.

  20. Dietary Supplement Ingredient Database

    Science.gov (United States)

    The US Department of Agriculture Dietary Supplement Ingredient Database (DSID). Analytical values can be saved to build a small database or added to an existing database for national, …

  1. Energy Consumption Database

    Science.gov (United States)

    The California Energy Commission has created this on-line database for informal reporting of energy consumption. The database also provides easy downloading of energy consumption data into Microsoft Excel (XLSX).

  2. A concept for the modernization of underground mining master maps based on the enrichment of data definitions and spatial database technology

    Directory of Open Access Journals (Sweden)

    Krawczyk Artur

    2018-01-01

    Full Text Available In this article, topics regarding the technical and legal aspects of creating digital underground mining maps are described. Currently used technologies and solutions for creating, storing and making digital maps accessible are described in the context of the Polish mining industry. Also, some problems with the use of these technologies are identified and described. One of the identified problems is the need to expand the range of mining map data provided by survey departments to other mining departments, such as ventilation maintenance or geological maintenance. Three solutions are proposed and analyzed, and one is chosen for further analysis. The analysis concerns data storage and making survey data accessible not only from paper documentation, but also directly from computer systems. Based on enrichment data, new processing procedures are proposed for a new way of presenting information that allows the preparation of new cartographic representations (symbols) of data with regard to users’ needs.

  3. A concept for the modernization of underground mining master maps based on the enrichment of data definitions and spatial database technology

    Science.gov (United States)

    Krawczyk, Artur

    2018-01-01

    In this article, topics regarding the technical and legal aspects of creating digital underground mining maps are described. Currently used technologies and solutions for creating, storing and making digital maps accessible are described in the context of the Polish mining industry. Also, some problems with the use of these technologies are identified and described. One of the identified problems is the need to expand the range of mining map data provided by survey departments to other mining departments, such as ventilation maintenance or geological maintenance. Three solutions are proposed and analyzed, and one is chosen for further analysis. The analysis concerns data storage and making survey data accessible not only from paper documentation, but also directly from computer systems. Based on enrichment data, new processing procedures are proposed for a new way of presenting information that allows the preparation of new cartographic representations (symbols) of data with regard to users' needs.

  4. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  5. USAID Anticorruption Projects Database

    Data.gov (United States)

    US Agency for International Development — The Anticorruption Projects Database (Database) includes information about USAID projects with anticorruption interventions implemented worldwide between 2007 and...

  6. NoSQL databases

    OpenAIRE

    Mrozek, Jakub

    2012-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  7. Fiscal 1998 research report (continued from the fiscal 1997 project). International survey project for rational energy use / Basic survey on efficient energy use in developing countries (Database construction project) / Survey on Japanese energy conservation technologies; 1998 nendo (1997 nendo jigyo kurikoshi) kokusai energy shiyo gorika nado chosa jigyo hatten tojokoku energy shohi koritsuka kiso chosa (database kochiku jigyo) chosa hokokusho. Nippon no sho energy gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    As a part of the development of the energy conservation technology database, a revision was made of 'Directory of Energy Conservation Technology in Japan', which was published in fiscal 1996 to present the energy conservation technology of the Japanese energy-intensive industries to Asian countries. The previous directory comprised 307 technical items; 126 of them were deleted and 63 new items were added in this revision. Technologies with energy conservation effects of more than 10% were mainly selected based on the knowledge of the industrial members of the committee and other specialists. Energy conservation effects range widely from low-level to high-level, and the number of energy conservation technologies is enormous. Consequently, it should be considered that the applicable extent of the energy conservation technologies depends on the energy cost and economic situation of Japan and the developing countries. (NEDO)

  8. Jelly Views : Extending Relational Database Systems Toward Deductive Database Systems

    Directory of Open Access Journals (Sweden)

    Igor Wojnicki

    2004-01-01

    Full Text Available This paper regards the Jelly View technology, which provides a new, practical methodology for knowledge decomposition, storage, and retrieval within Relational Database Management Systems (RDBMS). Intensional knowledge clauses (rules) are decomposed and stored in the RDBMS, forming reusable components. The results of the rule-based processing are visible as regular views, accessible through SQL. From the end-user point of view, the processing capability becomes unlimited (arbitrarily complex queries can be constructed using intensional knowledge), while the outermost queries are expressed in standard SQL. The RDBMS functionality thus becomes extended toward that of deductive databases.
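The general idea of exposing rule-based processing as regular SQL views can be illustrated with a recursive view; this sketch (SQLite via Python) uses an invented parent/ancestor rule and is not the paper's actual Jelly View implementation.

```python
# Sketch: an intensional rule (transitive "ancestor") stored as an ordinary
# SQL view, queryable with standard SQL. Schema and data are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE parent (child TEXT, parent TEXT);
    INSERT INTO parent VALUES ('c', 'b'), ('b', 'a');

    -- rule: ancestor(X,Y) :- parent(X,Y).
    --       ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z).
    CREATE VIEW ancestor AS
    WITH RECURSIVE anc(child, anc) AS (
        SELECT child, parent FROM parent
        UNION
        SELECT p.child, a.anc FROM parent p JOIN anc a ON p.parent = a.child
    )
    SELECT * FROM anc;
""")

# The outermost query is plain SQL against a regular view
rows = con.execute("SELECT * FROM ancestor ORDER BY child, anc").fetchall()
print(rows)  # [('b', 'a'), ('c', 'a'), ('c', 'b')]
```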

  9. Q46 Technology Refreshment Assessment

    Science.gov (United States)

    Garcia-Burgos, Axel A.

    2010-01-01

    This document reviews the accomplishments of the author during an internship. During this internship, the author was assigned to the End User Services Office, which coordinates and oversees a wide selection of information technologies that enhance the productivity of the workforce at NASA Kennedy Space Center. During the internship the author performed an assessment of the work of the Outsourcing Desktop Initiative for NASA (ODIN) contract.

  10. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web.This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  11. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Directory of Open Access Journals (Sweden)

    Errol A. Blake

    2007-12-01

    Full Text Available Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that traditional database security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper, a compilation of different journals, articles, and classroom discussions, will focus on unifying the process of securing data or information whether it is in use, in storage, or being transmitted. Promoting a change in database curriculum development trends may also play a role in helping secure databases. This paper takes the position that a conscientious effort to unify the database security process, including the Database Management System (DBMS) selection process, following regulatory compliance, analyzing and learning from the mistakes of others, implementing networking security technologies, and securing the database, may prevent database breaches.

  12. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio, in cooperation with its Modeling, Analysis, and Prediction program, intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also to non-typical users who may want to use the models, such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to high performance computing (HPC) accounts, on which the models are run, can be restrictive, with long wait times in job queues and delays caused by an arduous account-approval process, especially for foreign nationals. This project explores the utility of desktop supercomputers in providing a complete, ready-to-use toolkit of climate research products to investigators, along with on-demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  13. Lung segmentation refinement based on optimal surface finding utilizing a hybrid desktop/virtual reality user interface.

    Science.gov (United States)

    Sun, Shanhui; Sonka, Milan; Beichel, Reinhard R

    2013-01-01

    Recently, the optimal surface finding (OSF) and layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) approaches have been reported with applications to medical image segmentation tasks. While providing high levels of performance, these approaches may locally fail in the presence of pathology or other local challenges. Due to the image data variability, finding a suitable cost function that would be applicable to all image locations may not be feasible. This paper presents a new interactive refinement approach for correcting local segmentation errors in the automated OSF-based segmentation. A hybrid desktop/virtual reality user interface was developed for efficient interaction with the segmentations utilizing state-of-the-art stereoscopic visualization technology and advanced interaction techniques. The user interface allows a natural and interactive manipulation of 3-D surfaces. The approach was evaluated on 30 test cases from 18 CT lung datasets, which showed local segmentation errors after employing an automated OSF-based lung segmentation. The performed experiments exhibited significant increase in performance in terms of mean absolute surface distance errors (2.54±0.75 mm prior to refinement vs. 1.11±0.43 mm post-refinement, p≪0.001). Speed of the interactions is one of the most important aspects leading to the acceptance or rejection of the approach by users expecting real-time interaction experience. The average algorithm computing time per refinement iteration was 150 ms, and the average total user interaction time required for reaching complete operator satisfaction was about 2 min per case. This time was mostly spent on human-controlled manipulation of the object to identify whether additional refinement was necessary and to approve the final segmentation result. The reported principle is generally applicable to segmentation problems beyond lung segmentation in CT scans as long as the underlying segmentation utilizes the

  14. Lung Segmentation Refinement based on Optimal Surface Finding Utilizing a Hybrid Desktop/Virtual Reality User Interface

    Science.gov (United States)

    Sun, Shanhui; Sonka, Milan; Beichel, Reinhard R.

    2013-01-01

    Recently, the optimal surface finding (OSF) and layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) approaches have been reported with applications to medical image segmentation tasks. While providing high levels of performance, these approaches may locally fail in the presence of pathology or other local challenges. Due to the image data variability, finding a suitable cost function that would be applicable to all image locations may not be feasible. This paper presents a new interactive refinement approach for correcting local segmentation errors in the automated OSF-based segmentation. A hybrid desktop/virtual reality user interface was developed for efficient interaction with the segmentations utilizing state-of-the-art stereoscopic visualization technology and advanced interaction techniques. The user interface allows a natural and interactive manipulation on 3-D surfaces. The approach was evaluated on 30 test cases from 18 CT lung datasets, which showed local segmentation errors after employing an automated OSF-based lung segmentation. The performed experiments exhibited significant increase in performance in terms of mean absolute surface distance errors (2.54 ± 0.75 mm prior to refinement vs. 1.11 ± 0.43 mm post-refinement, p ≪ 0.001). Speed of the interactions is one of the most important aspects leading to the acceptance or rejection of the approach by users expecting real-time interaction experience. The average algorithm computing time per refinement iteration was 150 ms, and the average total user interaction time required for reaching complete operator satisfaction per case was about 2 min. This time was mostly spent on human-controlled manipulation of the object to identify whether additional refinement was necessary and to approve the final segmentation result. The reported principle is generally applicable to segmentation problems beyond lung segmentation in CT scans as long as the underlying segmentation

  15. Technology.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  16. PrimateLit Database

    Science.gov (United States)

    PrimateLit: a bibliographic database for primatology, supported by the National Center for Research Resources (NCRR), National Institutes of Health. The database is a collaborative project of the Wisconsin Primate… The PrimateLit database is no longer being updated.

  17. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the present state of database technology, in order to build the Chinese nuclear power information database (NPIDB) for the nuclear industry efficiently and from a high starting point, an important task is to select a proper database management system (DBMS), which is pivotal to building the database successfully. This article therefore explains how to build a practical nuclear power information database, the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, the recommendation of the ORACLE management system as the software with which to build the database, and so on

  18. Beginning C# 2008 databases from novice to professional

    CERN Document Server

    Fahad Gilani, Syed; Reid, Jon; Raghuram, Ranga; Huddleston, James; Hammer Pedersen, Jacob

    2008-01-01

    This book is for every C# programmer. It assumes no prior database experience and teaches through hands-on examples how to create and use relational databases with the standard database language SQL and how to access them with C#.Assuming only basic knowledge of C# 3.0, Beginning C# 3.0 Databases teaches all the fundamentals of database technology and database programming readers need to quickly become highly proficient database users and application developers. A comprehensive tutorial on both SQL Server 2005 and ADO.NET 3.0, this book explains and demonstrates how to create database objects

  19. KALIMER design database development and operation manual

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    KALIMER Design Database is developed to support the integrated management of Liquid Metal Reactor design technology development using Web applications. KALIMER Design Database consists of the Results Database, Inter-Office Communication (IOC), the 3D CAD Database, the Team Cooperation System, and Reserved Documents. The Results Database stores research results for mid-term and long-term nuclear R and D. IOC is a linkage control system between sub-projects used to share and integrate the research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents is developed to manage collected data and several documents since project accomplishment

  20. KALIMER design database development and operation manual

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    KALIMER Design Database is developed to support the integrated management of Liquid Metal Reactor design technology development using Web applications. KALIMER Design Database consists of the Results Database, Inter-Office Communication (IOC), the 3D CAD Database, the Team Cooperation System, and Reserved Documents. The Results Database stores research results for mid-term and long-term nuclear R and D. IOC is a linkage control system between sub-projects used to share and integrate the research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents is developed to manage collected data and several documents since project accomplishment.

  1. Logical database design principles

    CERN Document Server

    Garmany, John; Clark, Terry

    2005-01-01

    INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint

  2. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on t...

  3. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing data and provides ordering information for those wishing to obtain the CHEMTOX Database.

  4. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    International Nuclear Information System (INIS)

    Oh, Wang Kyun

    2014-01-01

    Usefulness and clinical availability for improving surgical efficiency were evaluated by conducting pre-operative planning with a model manufactured on a desktop 3D printer from a clavicle CT image. The patient-customized clavicle fracture model was produced on a desktop FDM (fused deposition modeling) 3D printer after converting the CT image to an STL file in the open-source DICOM viewer OsiriX. In addition, a model of the original shape before the injury was reconstructed with the mirror technique, using the STL file of the unfractured clavicle on the other side and exploiting the symmetry of the human body. The model accurately reproduced the position, size, and degree of the fracture. Using a clavicle model manufactured directly in the Department of Radiology, at low cost and in little time, is considered useful because it can reduce secondary damage during surgery and increase surgical efficiency with minimally invasive percutaneous plate osteosynthesis (MIPO).

  5. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Wang Kyun [Dept. of Diagnostic Radiology, Cheongju Medical Center, Cheongju (Korea, Republic of)

    2014-09-15

    Usefulness and clinical availability for improving surgical efficiency were evaluated by conducting pre-operative planning with a model manufactured on a desktop 3D printer from a clavicle CT image. The patient-customized clavicle fracture model was produced on a desktop FDM (fused deposition modeling) 3D printer after converting the CT image to an STL file in the open-source DICOM viewer OsiriX. In addition, a model of the original shape before the injury was reconstructed with the mirror technique, using the STL file of the unfractured clavicle on the other side and exploiting the symmetry of the human body. The model accurately reproduced the position, size, and degree of the fracture. Using a clavicle model manufactured directly in the Department of Radiology, at low cost and in little time, is considered useful because it can reduce secondary damage during surgery and increase surgical efficiency with minimally invasive percutaneous plate osteosynthesis (MIPO).
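
The mirror technique mentioned above amounts to reflecting the intact clavicle's surface mesh across the body's plane of symmetry. The sketch below shows what that operation does to STL-style triangle data in pure NumPy; it is not tied to any particular DICOM/STL toolchain, and the choice of mirror axis is an assumption for illustration.

```python
import numpy as np

def mirror_mesh(vertices, faces, axis=0):
    """Reflect a triangle mesh across the plane normal to `axis`
    (axis 0 standing in for the left-right direction here).
    Reflection inverts orientation, so triangle winding must be
    reversed to keep surface normals pointing outward."""
    mirrored = vertices.copy()
    mirrored[:, axis] *= -1.0        # flip one coordinate
    flipped_faces = faces[:, ::-1]   # reverse winding order
    return mirrored, flipped_faces

# Toy "mesh": a single triangle at x = 1
verts = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0]])
faces = np.array([[0, 1, 2]])

mv, mf = mirror_mesh(verts, faces)
print(mv[:, 0])  # all x coordinates negated
```

A real pipeline would read the intact side's STL into the vertex/face arrays, mirror it as above, and print the result as the pre-injury template.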

  6. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.

  7. Deep unsupervised learning on a desktop PC: A primer for cognitive scientists

    Directory of Open Access Journals (Sweden)

    Alberto eTestolin

    2013-05-01

    Full Text Available Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (GPUs) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.
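
The building block these networks stack is the restricted Boltzmann machine, trained with one-step contrastive divergence (CD-1). A minimal CPU sketch in NumPy follows; the abstract's point is that the same dense matrix products are what map so well onto GPUs via high-level array libraries. Network size, learning rate, and the toy dataset are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny RBM: 6 visible units, 3 hidden units
n_vis, n_hid = 6, 3
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)

# Toy binary data: two repeated complementary patterns
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 20, dtype=float)

lr = 0.2
for epoch in range(500):
    v0 = data
    # Positive phase: hidden activations given the data
    h0 = sigmoid(v0 @ W + b_hid)
    h_sample = (rng.random(h0.shape) < h0).astype(float)
    # Negative phase (CD-1): one step of reconstruction
    v1 = sigmoid(h_sample @ W.T + b_vis)
    h1 = sigmoid(v1 @ W + b_hid)
    # Gradient step on the difference of correlations
    W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)
    b_vis += lr * (v0 - v1).mean(axis=0)
    b_hid += lr * (h0 - h1).mean(axis=0)

# Reconstruction error should fall well below chance (0.25) after training
recon = sigmoid(sigmoid(data @ W + b_hid) @ W.T + b_vis)
err = float(np.mean((data - recon) ** 2))
print(round(err, 3))
```

Swapping the NumPy arrays for a GPU array type (e.g. CuPy in Python, or gpuArray in MATLAB) leaves the algorithm unchanged, which is essentially the route the authors describe.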

  8. Differences in typing forces, muscle activity, comfort, and typing performance among virtual, notebook, and desktop keyboards.

    Science.gov (United States)

    Kim, Jeong Ho; Aulck, Lovenoor; Bartha, Michael C; Harper, Christy A; Johnson, Peter W

    2014-11-01

    The present study investigated whether there were physical exposure and typing productivity differences between a virtual keyboard with no tactile feedback and two conventional keyboards where key travel and tactile feedback are provided by mechanical switches under the keys. The key size and layout were the same across all the keyboards. Typing forces, finger and shoulder muscle activity, self-reported comfort, and typing productivity were measured from 19 subjects while typing on a virtual (0 mm key travel), notebook (1.8 mm key travel), and desktop keyboard (4 mm key travel). When typing on the virtual keyboard, subjects typed with less force and less finger muscle activity; however, these reductions came at the expense of a 60% reduction in typing productivity. For extended typing sessions, or when typing productivity is at a premium, conventional keyboards with tactile feedback may be the more suitable interface. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  9. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  10. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617

  11. A Unified Algorithm for Virtual Desktops Placement in Distributed Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiangtao Zhang

    2016-01-01

    Full Text Available Distributed cloud has been widely adopted to support service requests from dispersed regions, especially for large enterprises that request virtual desktops for multiple geodistributed branch companies. The cloud service provider (CSP) aims to deliver satisfactory services at the least cost. The CSP selects proper data centers (DCs) closer to the branch companies so as to shorten the response time to user requests. At the same time, it also strives to cut cost, considering both the DC level and the server level. At the DC level, expensive long-distance inter-DC bandwidth consumption should be reduced and lower electricity prices are sought. Inside each tree-like DC, as few servers as possible are used so as to save equipment cost and power. By nature, there is a noncooperative relation between the DC level and the server level in the selection. To attain these objectives and capture the noncooperative relation, multiobjective bilevel programming is used to formulate the problem. Then a unified genetic algorithm is proposed to solve the problem, realizing the selection of DC and server simultaneously. Extensive simulation shows that the proposed algorithm outperforms the baseline algorithm in both guaranteeing quality of service and saving cost.
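
The paper's algorithm is a multiobjective bilevel genetic algorithm; the sketch below collapses it to a single-objective toy to show the core mechanic of encoding the DC choice and the per-server on/off flags in one genome and evolving both at once. All costs, weights, and instance sizes are invented for illustration.

```python
import random

random.seed(42)

# Toy instance: 4 candidate DCs with (latency to branch, electricity price)
DCS = [(10, 5.0), (30, 2.0), (20, 3.5), (50, 1.0)]
N_SERVERS = 8   # servers available per DC
DEMAND = 5      # server slots the virtual desktops need

def cost(genome):
    """Genome = (dc_index, tuple of server flags). Lower is better.
    Unmet demand is heavily penalized; otherwise latency, power price,
    and number of switched-on servers are combined with made-up weights."""
    dc, servers = genome
    used = sum(servers)
    if used < DEMAND:
        return 1e6
    latency, price = DCS[dc]
    return latency + price * used + 2.0 * used

def random_genome():
    return (random.randrange(len(DCS)),
            tuple(random.randint(0, 1) for _ in range(N_SERVERS)))

def mutate(g):
    dc, servers = g
    if random.random() < 0.2:            # occasionally jump to another DC
        dc = random.randrange(len(DCS))
    i = random.randrange(N_SERVERS)      # flip one server flag
    s = list(servers)
    s[i] ^= 1
    return (dc, tuple(s))

# Plain elitist loop: keep the best half, mutate survivors to refill
pop = [random_genome() for _ in range(30)]
for gen in range(200):
    pop.sort(key=cost)
    pop = pop[:15] + [mutate(random.choice(pop[:15])) for _ in range(15)]

best = min(pop, key=cost)
print(cost(best), best[0], sum(best[1]))
```

The single genome spanning both levels is what makes the selection "unified": DC choice and server assignment are optimized in the same evolutionary loop rather than in two nested searches.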

  12. Interactive desktop analysis of high resolution simulations: application to turbulent plume dynamics and current sheet formation

    International Nuclear Information System (INIS)

    Clyne, John; Mininni, Pablo; Norton, Alan; Rast, Mark

    2007-01-01

    The ever-increasing processing capabilities of the supercomputers available to computational scientists today, combined with the need for higher and higher resolution computational grids, have resulted in deluges of simulation data. Yet the computational resources and tools required to make sense of these vast numerical outputs through subsequent analysis are often far from adequate, making such analysis of the data a painstaking, if not a hopeless, task. In this paper, we describe a new tool for the scientific investigation of massive computational datasets. This tool (VAPOR) employs data reduction, advanced visualization, and quantitative analysis operations to permit the interactive exploration of vast datasets using only a desktop PC equipped with a commodity graphics card. We describe VAPOR's use in the study of two problems. The first, motivated by stellar envelope convection, investigates the hydrodynamic stability of compressible thermal starting plumes as they descend through a stratified layer of increasing density with depth. The second looks at current sheet formation in an incompressible helical magnetohydrodynamic flow to understand the early spontaneous development of quasi two-dimensional (2D) structures embedded within the 3D solution. Both of the problems were studied at sufficiently high spatial resolution, a grid of 504² × 2048 points for the first and 1536³ points for the second, to overwhelm the interactive capabilities of typically available analysis resources.

  13. Monte Carlo simulation of electrothermal atomization on a desktop personal computer

    Science.gov (United States)

    Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.

    1996-07-01

    Monte Carlo simulations have been applied to electrothermal atomization (ETA) in a tubular atomizer (e.g. a graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed and can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement, and furnace heating, and kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the dependence of the absorbance profile on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files, and text files.
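
The heart of such a simulation is tracking many independent analyte atoms as they random-walk inside the tube until they are lost out the open ends; the residual in-tube population plays the role of the absorbance signal. The sketch below collapses the geometry to one dimension along the tube axis; the dimensions, step size, and atom count are illustrative stand-ins, not the program's actual parameters.

```python
import random

random.seed(1)

TUBE_LEN = 28.0   # mm, illustrative furnace length
STEP = 0.5        # mm per time step, illustrative mean free path
N_ATOMS = 2000
N_STEPS = 400

# All atoms start at the tube center (the sample deposition site)
positions = [TUBE_LEN / 2.0] * N_ATOMS
alive = [True] * N_ATOMS
in_tube = []      # number of atoms remaining after each time step

for t in range(N_STEPS):
    for i in range(N_ATOMS):
        if not alive[i]:
            continue
        positions[i] += random.choice((-STEP, STEP))  # 1-D random walk
        if positions[i] <= 0.0 or positions[i] >= TUBE_LEN:
            alive[i] = False                          # lost out a tube end
    in_tube.append(sum(alive))

# The in-tube count over time is a crude stand-in for the absorbance profile
print(in_tube[0], in_tube[-1])
```

The real program layers furnace heating, desorption/adsorption kinetics, and 3-D tube geometry on top of this same walk-and-count loop, which is why the method parallelizes so naturally and, as the abstract notes, now runs comfortably on a desktop processor.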

  14. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system, and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using a standard TCP/IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024 × 768 pixels × 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images, and scintigraphic images.

  15. Issues in Big-Data Database Systems

    Science.gov (United States)

    2014-06-01

    that big data will not be manageable using conventional relational database technology, and it is true that alternative paradigms, such as NoSQL systems and search engines, have much to offer... scale well, and because integration with external data sources is so difficult. NoSQL systems are more open to this integration, and provide excellent...

  16. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    Science.gov (United States)

    Ling, Guangming

    2016-01-01

    To investigate possible iPad-related mode effects, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  17. Do small fish mean no voucher? Using a flatbed desktop scanner to document larval and small specimens before destructive analyses

    Czech Academy of Sciences Publication Activity Database

    Kalous, L.; Šlechtová, Věra; Petrtýl, M.; Kohout, Jan; Čech, Martin

    2010-01-01

    Roč. 26, č. 4 (2010), s. 614-617 ISSN 0175-8659 R&D Projects: GA ČR GA206/06/1371; GA ČR GP206/09/P266 Institutional research plan: CEZ:AV0Z50450515; CEZ:AV0Z60170517 Keywords : small fish * voucher * desktop scanner Subject RIV: GL - Fishing Impact factor: 0.945, year: 2010

  18. Evaluation of the use of advanced information technology (expert systems) for data-base system development and emergency management in non-nuclear industries. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, J; Pedersen, O M; Groenberg, C D

    1987-04-01

    During recent years, a number of large industrial accidents have resulted in widespread concern with the organization of emergency services and with means for effectively supporting the distributed organizations involved in emergency management. With the aim of discussing the potential of modern information technology for decision support during accidents, the report gives a brief review of approaches to the design of decision-support systems and expert systems. From the review it is concluded that models of decision-support systems based on a control-theoretic point of view, together with a cognitive approach to decision task analysis, offer a suitable framework. In addition, it is concluded that advanced information tools for database design and for communication support in distributed decision-making should be considered for further development. A number of recent Danish industrial accidents are reviewed and key persons interviewed in order to give a preliminary basis for judging the feasibility of the theoretical discussion. The report includes a number of recommendations for further studies to support the development of a distributed database system for emergency management.

  19. Optimizing the number of cleavage stage embryos to transfer on day 3 in women 38 years of age and older: a Society for Assisted Reproductive Technology database study.

    Science.gov (United States)

    Stern, Judy E; Goldman, Marlene B; Hatasaka, Harry; MacKenzie, Todd A; Surrey, Eric S; Racowsky, Catherine

    2009-03-01

    To determine the optimal number of day 3 embryos to transfer in women ≥38 years by conducting an evidence-based evaluation. Retrospective analysis of 2000-2004 national SART data. National writing group. A total of 36,103 day 3 embryo transfers in women ≥38 years undergoing their first assisted reproductive technology cycle. None. Logistic regression was used to model the probability of pregnancy, delivery, and multiple births (twin or higher order) based on age- and cycle-specific parameters. Pregnancy rates, delivery rates, and multiple rates increased up to transfer of three embryos in 38-year-olds and four in 39-year-olds; beyond this number, only multiple rates increased. In women ≥40 years, delivery rates and multiple rates climbed steadily with increasing numbers transferred. Multivariate analysis confirmed the statistically significant effect of age, number of oocytes retrieved, and embryo cryopreservation on delivery and multiple rates. Maximum FSH level was not an independent predictor by multivariate analysis. Use of intracytoplasmic sperm injection was associated with lowered delivery rate. No more than three or four embryos should be transferred in 38- and 39-year-olds, respectively, whereas up to five embryos could be transferred in women ≥40 years. Numbers of embryos to transfer should be adjusted according to number of oocytes retrieved and availability of excess embryos for cryopreservation.

  20. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available Traditional answer-card reading uses OMR (Optical Mark Reader) hardware, which typically requires special-purpose cards for each use, is not very versatile, and has a high cost. To address these problems, this paper proposes an answer-card identification method based on pattern recognition. A Line Segment Detector is used to detect image tilt; tilted images are corrected by rotation, and the answer regions of the answer sheet are then located and detected. The pattern-recognition approach enables automatic reading with high accuracy and faster detection.
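
The tilt-correction step described above reduces to estimating a skew angle and rotating the image (or its coordinates) back by that angle. Below is a minimal coordinate-level sketch, under the assumption that two registration marks known to lie on a horizontal line on an untilted sheet are available; the actual method derives the angle from Line Segment Detector output instead.

```python
import math

def skew_angle(p1, p2):
    """Angle (radians) of the line through two registration marks
    that would be horizontal on an untilted sheet."""
    (x1, y1), (x2, y2) = p1, p2
    return math.atan2(y2 - y1, x2 - x1)

def rotate(point, angle, center=(0.0, 0.0)):
    """Rotate a point about `center` by `angle` radians."""
    x, y = point[0] - center[0], point[1] - center[1]
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y + center[0], s * x + c * y + center[1])

# Marks that should be level, but scanned with a 5-degree tilt
tilt = math.radians(5.0)
m1 = (0.0, 0.0)
m2 = rotate((100.0, 0.0), tilt)

theta = skew_angle(m1, m2)           # detected tilt angle
corrected = rotate(m2, -theta, m1)   # rotate back to level
print(round(corrected[0], 3), round(corrected[1], 3))
```

On a real scan, the same `-theta` rotation would be applied to the whole image before locating the answer regions.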