WorldWideScience

Sample records for traditional desktop software

  1. Desk Congest Desktop Congesting Software for Desktop Clutter Congestion

    Directory of Open Access Journals (Sweden)

    Solomon A. Adepoju

    2015-06-01

    The computer desktop environment is a working environment that can be likened to a user's desk at home or in the office. The desktop often becomes cluttered with files, whether shortcuts kept for quick access, files stored temporarily to be used later, or files simply dumped there for no clear reason. Previous research has shown that a cluttered desktop reduces users' productivity, and organizing these files is a laborious task for most users. To alleviate the effect clutter has on users' performance and productivity, there is a need for third-party software that organizes the desktop environment in a logical and efficient manner. It is to this end that desktop decongesting software was designed and implemented to help curb clutter problems that existing tools have only partially addressed. The system is designed using Visual Basic .NET and proves effective in tackling the desktop congestion problem.
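
    The record gives no implementation details, but the core idea of decongesting a desktop can be illustrated with a short, hypothetical sketch that groups loose files into per-category folders by extension. The category mapping and folder names below are illustrative assumptions, not taken from the paper (which used Visual Basic .NET).

```python
# Hypothetical sketch of desktop decongestion: move loose files into
# per-category folders based on their extension. Categories and folder
# names are illustrative assumptions, not taken from the paper.
from pathlib import Path
import shutil

CATEGORIES = {
    ".png": "Images", ".jpg": "Images",
    ".pdf": "Documents", ".docx": "Documents", ".txt": "Documents",
    ".mp3": "Media", ".mp4": "Media",
}

def decongest(desktop: Path, dry_run: bool = True) -> None:
    for item in desktop.iterdir():
        if not item.is_file():
            continue  # leave existing folders alone
        folder = CATEGORIES.get(item.suffix.lower(), "Other")
        target_dir = desktop / folder
        if dry_run:
            print(f"would move {item.name} -> {folder}/")
        else:
            target_dir.mkdir(exist_ok=True)
            shutil.move(str(item), str(target_dir / item.name))

if __name__ == "__main__":
    decongest(Path.home() / "Desktop", dry_run=True)
```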

  2. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  3. MICA: desktop software for comprehensive searching of DNA databases

    Directory of Open Access Journals (Sweden)

    Glick Benjamin S

    2006-10-01

    Background: Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database for exact matches to a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results: MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion: MICA is suitable as a search engine for desktop DNA analysis software.
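
    MICA's compact-array index layout is not reproduced in the record, but the underlying idea of k-mer indexing for exact search can be sketched in a few lines. The k value and function names below are illustrative assumptions, and unlike MICA this toy version handles only nondegenerate queries.

```python
# Toy sketch of k-mer indexing for exact (nondegenerate) search.
# MICA's compact-array layout and degenerate-base handling are not
# reproduced; k and all names here are illustrative assumptions.
from collections import defaultdict

K = 8  # assumed k-mer length for this sketch

def build_index(seq: str) -> dict:
    """Map every k-mer to the list of positions where it starts."""
    index = defaultdict(list)
    for i in range(len(seq) - K + 1):
        index[seq[i:i + K]].append(i)
    return index

def find_exact(seq: str, index: dict, query: str) -> list:
    """Return start positions of all exact matches of query (len >= K)."""
    hits = []
    for pos in index.get(query[:K], []):
        if seq.startswith(query, pos):
            hits.append(pos)
    return hits

genome = "ACGTACGTGGTACGTACGTACCGT"
idx = build_index(genome)
print(find_exact(genome, idx, "ACGTACGTAC"))  # -> [11] for this toy string
```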

  4. Desktop Publishing: A Brave New World and Publishing from the Desktop.

    Science.gov (United States)

    Lormand, Robert; Rowe, Jane J.

    1988-01-01

    The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…

  5. Pages from the Desktop: Desktop Publishing Today.

    Science.gov (United States)

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  6. Desktop Publishing Made Simple.

    Science.gov (United States)

    Wentling, Rose Mary

    1989-01-01

    The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)

  7. Desktop Publishing Choices: Making an Appropriate Decision.

    Science.gov (United States)

    Crawford, Walt

    1991-01-01

    Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…

  8. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    Science.gov (United States)

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker package, with the end result that any Docker package can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the OAuth2 authentication protocol when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
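
    As a rough illustration of the pattern GUIdock-VNC automates, the sketch below starts a containerized GUI tool and publishes a browser-reachable VNC/noVNC port using the Docker SDK for Python. The image name, port number and environment variable are hypothetical placeholders, not the actual GUIdock-VNC images or build options.

```python
# Hedged sketch: launch a container whose desktop is exposed through a
# browser-reachable VNC/noVNC port. Image name, port and env var are
# hypothetical placeholders, not the actual GUIdock-VNC artifacts.
import docker

client = docker.from_env()

container = client.containers.run(
    "example/bioinformatics-gui-vnc:latest",   # hypothetical image
    detach=True,
    ports={"6080/tcp": 6080},                  # assumed noVNC web port
    environment={"VNC_PASSWORD": "changeme"},  # assumed option
    name="guidock-vnc-demo",
)

print("Open http://localhost:6080 in a browser to reach the GUI.")
print("Container id:", container.short_id)
```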

  9. The Point Lepreau Desktop Simulator

    International Nuclear Information System (INIS)

    MacLean, M.; Hogg, J.; Newman, H.

    1997-01-01

    The Point Lepreau Desktop Simulator runs plant process modeling software on a 266 MHz single-CPU DEC Alpha computer. The same Alpha also runs the plant control computer software on an SSCI 125 emulator. An adjacent Pentium PC runs the simulator's Instructor Facility software and communicates with the Alpha over Ethernet. The Point Lepreau desktop simulator is constructed to be as similar as possible to the Point Lepreau full-scope training simulator. This minimizes total maintenance costs and enhances the benefits of the desktop simulator. Both simulators have the same modeling running on a single CPU in the same schedule of calculations. Both simulators have the same Instructor Facility, capable of developing and executing the same lesson plans, performing the same monitoring and control of simulations, inserting all the same malfunctions, performing all the same overrides, and making and restoring all the same storepoints. Both simulators run the same plant control computer software - the same assembly language control programs as the power plant uses for reactor control, heat transport control, annunciation, etc. This is a higher degree of similarity between a desktop simulator and a full-scope training simulator than previously reported for a computer-controlled nuclear plant. The large quantity of control room hardware missing from the desktop simulator is replaced by software. The Instructor Facility panel override software of the training simulator provides the means by which devices (switches, controllers, windows, etc.) on the control room panels can be controlled and monitored in the desktop simulator. The CRT of the Alpha provides a mouse-operated DCC keyboard mimic for controlling the plant control computer emulation. Two emulated RAMTEK display channels appear as windows for monitoring anything of interest on plant DCC displays, including one channel for annunciation. (author)

  10. Desktop Publishing.

    Science.gov (United States)

    Stanley, Milt

    1986-01-01

    Defines desktop publishing, describes microcomputer developments and software tools that make it possible, and discusses its use as an instructional tool to improve writing skills. Reasons why students' work should be published, examples of what to publish, and types of software and hardware to facilitate publishing are reviewed. (MBR)

  11. Desktop Publishing for Counselors.

    Science.gov (United States)

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  12. Basics of Desktop Publishing. Second Edition.

    Science.gov (United States)

    Beeby, Ellen; Crummett, Jerrie

    This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…

  13. The Printout: Desktop Publishing in the Classroom.

    Science.gov (United States)

    Balajthy, Ernest; Link, Gordon

    1988-01-01

    Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large print texts for students with limited sight or for primary students. (NH)

  14. Incorporating a Human-Computer Interaction Course into Software Development Curriculums

    Science.gov (United States)

    Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph

    2015-01-01

    Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…

  15. Desktop Publishing in a PC-Based Environment.

    Science.gov (United States)

    Sims, Harold A.

    1987-01-01

    Identifies, considers, and interrelates the functionality of hardware, firmware, and software types; discusses the relationship of input and output devices in the PC-based desktop publishing environment; and reports some of what has been experienced in three years of working intensively in/with desktop publishing devices and solutions. (MES)

  16. Desktop Publishing in Libraries.

    Science.gov (United States)

    Cisler, Steve

    1987-01-01

    Describes the components, costs, and capabilities of several desktop publishing systems, and examines their possible impact on work patterns within organizations. The text and graphics of the article were created using various microcomputer software packages. (CLB)

  17. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  18. Desktop publishing com o scribus

    OpenAIRE

    Silva, Fabrício Riff; Uchôa, Kátia Cilene Amaral

    2015-01-01

    This article presents a brief tutorial on desktop publishing, with emphasis on the free software Scribus, through the creation of a practical example that explores some of its main features.

  19. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    Science.gov (United States)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that will ultimately run on multiple operating systems and provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.
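
    The data-access pattern HydroDesktop implements can be sketched in a hedged way: ask a WaterOneFlow-style web service for one observed time series, returned as WaterML, and hand it to local plotting or export code. The WSDL URL, site and variable codes below are placeholders, and the exact operation signature may differ between servers; this is a sketch of the workflow, not HydroDesktop's actual implementation.

```python
# Hedged sketch of the client-side workflow HydroDesktop automates:
# request one observed time series, encoded as WaterML, from a
# WaterOneFlow-style SOAP service. The WSDL URL, site and variable
# codes are placeholders, and operation signatures may vary by server.
from zeep import Client

WSDL = "http://example.org/hydroserver/cuahsi_1_1.asmx?WSDL"  # placeholder

client = Client(WSDL)
waterml = client.service.GetValues(
    location="EXAMPLE:SiteCode",      # placeholder network:site code
    variable="EXAMPLE:VariableCode",  # placeholder network:variable code
    startDate="2009-01-01",
    endDate="2009-01-31",
    authToken="",
)

# The response is a WaterML document; a desktop client would parse it
# and store the series locally for mapping, graphing and export.
print(str(waterml)[:500])
```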

  20. Application of desktop computers in nuclear engineering education

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options

  1. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    OpenAIRE

    Vieriu, Valentin; Tuican, Catalin

    2009-01-01

    Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  2. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    Directory of Open Access Journals (Sweden)

    Valentin Vieriu

    2009-01-01

    Full Text Available Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  3. Desktop Publishing: The New Wave in Business Education.

    Science.gov (United States)

    Huprich, Violet M.

    1989-01-01

    Discusses the challenges of teaching desktop publishing (DTP); the industry is in flux with the software packages constantly being updated. Indicates that the demand for those with DTP skills is great. (JOW)

  4. Research and implementation of a Web-based remote desktop image monitoring system

    International Nuclear Information System (INIS)

    Ren Weijuan; Li Luofeng; Wang Chunhong

    2010-01-01

    This paper presents the study and implementation of an ISS (Image Snapshot Server) system based on the Web, using Java Web technology. The ISS system consists of a client web browser and a server. The server part is divided into three modules: the screen-capture software, the web server and an Oracle database. The screen-capture software captures the desktop environment of the remotely monitored PC and sends these pictures to a Tomcat web server for real-time display on the web. At the same time, the pictures are saved in an Oracle database. Through the web browser, the monitoring user can view real-time and historical desktop pictures of the monitored PC over a given period. It is very convenient for any user to monitor the desktop image of a remote monitored PC. (authors)
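
    The record summarizes the screenshot-capture client only at a high level. The sketch below illustrates that role in Python rather than the Java used by the authors: grab the desktop, encode it as PNG and post it to the web server on a fixed interval. The upload URL and form field name are hypothetical, and Pillow's ImageGrab is assumed to be available on the monitored PC.

```python
# Minimal sketch of the screen-capture client role described above,
# written in Python rather than the authors' Java. The server URL and
# form field are hypothetical; Pillow's ImageGrab is assumed available.
import io
import time

import requests
from PIL import ImageGrab

UPLOAD_URL = "http://monitor.example.org/iss/upload"  # placeholder

def capture_and_send() -> None:
    shot = ImageGrab.grab()    # grab the whole desktop
    buf = io.BytesIO()
    shot.save(buf, format="PNG")
    buf.seek(0)
    requests.post(UPLOAD_URL, files={"snapshot": ("desktop.png", buf, "image/png")})

if __name__ == "__main__":
    while True:                # send one frame every 10 seconds
        capture_and_send()
        time.sleep(10)
```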

  5. Basics of Desktop Publishing. Teacher Edition.

    Science.gov (United States)

    Beeby, Ellen

    This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…

  6. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party

  7. Warm Hearts/Cold Type: Desktop Publishing Arrives.

    Science.gov (United States)

    Kramer, Felix

    1991-01-01

    Describes desktop publishing (DTP) that may be suitable for community, activist, and nonprofit groups and discusses how it is changing written communication. Topics discussed include costs; laser printers; time savings; hardware and software selection; and guidelines to consider when establishing DTP capability. (LRW)

  8. Desktop Publishing: Things Gutenberg Never Taught You.

    Science.gov (United States)

    Bowman, Joel P.; Renshaw, Debbie A.

    1989-01-01

    Provides a desktop publishing (DTP) overview, including: advantages and disadvantages; hardware and software requirements; and future development. Discusses cost-effectiveness, confidentiality, credibility, effects on volume of paper-based communication, and the need for training in layout and design which DTP creates. Includes a glossary of DTP…

  9. Stop the Presses! An Update on Desktop Publishing.

    Science.gov (United States)

    McCarthy, Robert

    1988-01-01

    Discusses educational applications of desktop publishing at the elementary, secondary, and college levels. Topics discussed include page design capabilities; hardware requirements; software; the production of school newsletters and newspapers; cost factors; writing improvement; university departmental publications; and college book publishing. A…

  10. IMIS desktop & smartphone software solutions for monitoring spacecrafts' payload from anywhere

    Science.gov (United States)

    Baroukh, J.; Queyrut, O.; Airaud, J.

    In the past years, the demand for satellite remote operations has increased, driven on the one hand by the will to reduce operations costs (on-call operators outside business hours) and on the other hand by the development of cooperative space missions resulting in a worldwide distribution of engineers and science team members. Only a few off-the-shelf solutions exist to fulfill the need for remote payload monitoring, and they mainly use proprietary devices. The recent advent of mobile technologies (laptops, smartphones and tablets), as well as the worldwide deployment of broadband networks (3G, Wi-Fi hotspots), has opened up a technical window that brings new options. As part of the Mars Science Laboratory (MSL) mission, the Centre National D'Etudes Spatiales (CNES, the French space agency) has developed a new software solution for monitoring spacecraft payloads. The Instrument Monitoring Interactive Software (IMIS) offers state-of-the-art operational features for payload monitoring, and can be accessed remotely. It was conceived as a generic tool that can be used for heterogeneous payloads and missions. IMIS was designed as a classical client/server architecture. The server is hosted at CNES and acts as a data provider, while two different kinds of clients are available depending on the level of mobility required. The first one is a rich client application, built on the Eclipse framework, which can be installed on the usual operating systems and communicates with the server through the Internet. The second one is a smartphone application for any Android platform, connected to the server via the mobile broadband network or a Wi-Fi connection. This second client is mainly devoted to on-call operations and thus only contains a subset of the IMIS functionalities. This paper describes the operational context, including security aspects, that led to IMIS development, presents the selected software architecture and details the various features of both clients: the desktop and the smartphone clients.

  11. Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).

    Science.gov (United States)

    Guthrie, Jim

    1995-01-01

    Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…

  12. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. Approaches from the areas of the Semantic Web, knowledge representation, desktop applications and visualization are presented that make it possible to reinterpret and reuse a user's existing data. The combination of the Semantic Web and desktop computers brings particular advantages, a paradigm known as the Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, and can equally be used in web applications.

  13. Nielsen PrimeLocation Web/Desktop: Assessing and GIS Mapping Market Area

    Data.gov (United States)

    Social Security Administration — Nielsen PrimeLocation Web and Desktop Software Licensed for Internal Use only: Pop-Facts Demographics Database, Geographic Mapping Data Layers, Geo-Coding locations.

  14. Full-scope nuclear training simulator -brought to the desktop

    International Nuclear Information System (INIS)

    LaPointe, D.J.; Manz, A.; Hall, G.S.

    1997-01-01

    RighTSTEP is a suite of simulation software initially designed to facilitate the upgrade of Ontario Hydro's full-scope simulators, but it is also adaptable to a variety of other roles. It is presently being commissioned at the Bruce A Training Simulator and has seen preliminary use in desktop and classroom roles. Because of the flexibility of the system, we anticipate it will see common use in the corporation for full-scope simulation roles. A key reason for developing RighTSTEP (Real Time Simulator Technology Extensible and Portable) was the need to modernize and upgrade the full-scope training simulator while protecting the investment in modelling code. This modelling code represents the end product of 18 years of evolution since the beginning of its development in 1979. Bringing this modelling code to a modern and more useful framework - the combination of simulator host, operating system, and simulator operating system - could also provide many spin-off benefits. The development (and first implementation) of the RighTSTEP system was cited for saving the corporation $5.6M and was recognized by a corporate New Technology Award last year. The most important spin-off from this project has been the desktop version of the full-scope simulator. The desktop simulator uses essentially the same software as its full-scope counterpart, and may be used for a variety of new purposes. Classroom and individual simulator training can now be easily accommodated since a desktop simulator is both affordable and relatively easy to use. Further, a wide group of people can be trained using the desktop simulator; by contrast, the full-scope simulators were almost exclusively devoted to front-line operating staff. The desktop is finding increasing use in support of engineering applications, resulting from its easy accessibility, breadth of station systems represented, and tools for analysis and viewing. As further plant models are made available on the new simulator platform and

  15. What's New in Software? Mastery of the Computer through Desktop Publishing.

    Science.gov (United States)

    Hedley, Carolyn N.; Ellsworth, Nancy J.

    1993-01-01

    Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)

  16. Big Memory Elegance: HyperCard Information Processing and Desktop Publishing.

    Science.gov (United States)

    Bitter, Gary G.; Gerson, Charles W., Jr.

    1991-01-01

    Discusses hardware requirements, functions, and applications of five information processing and desktop publishing software packages for the Macintosh: HyperCard, PageMaker, Cricket Presents, Power Point, and Adobe illustrator. Benefits of these programs for schools are considered. (MES)

  17. An Exercise in Desktop Publishing: Using the "Newsroom."

    Science.gov (United States)

    Kiteka, Sebastian F.

    This guide provides a description and step-by-step instructions for the use of "Newsroom," a desktop-publishing program for the Apple II series of microcomputers produced by Springboard Software Inc. Based on the 1984 version of the program, this two-hour exercise focuses on the design and production of a newsletter with text and…

  18. MELCOR/VISOR PWR desktop simulator

    International Nuclear Information System (INIS)

    With, Anka de; Wakker, Pieter

    2010-01-01

    Increasingly, there is a need for a learning support and training tool for nuclear engineers, utilities and students in order to broaden their understanding of advanced nuclear plant characteristics, dynamics, transients and safety features. Nuclear system analysis codes like ASTEC, RELAP5, RETRAN and MELCOR provide the calculation results, and visualization tools can be used to graphically represent these results. However, for efficient education and training, a more interactive tool such as a simulator is needed. The simulator connects the graphical tool with the calculation tool in an interactive manner. A small number of desktop simulators exist [1-3]. The existing simulators are capable of representing different types of power plants and various accident conditions. However, they were found to be too general to be used as a reliable plant-specific accident analysis or training tool. A desktop simulator of the Pressurized Water Reactor (PWR) has been created under contract to the Dutch nuclear regulatory body (KFD). The desktop simulator is a software package that provides a close-to-real simulation of the Dutch nuclear power plant Borssele (KCB) and is used for training of the accident response. The simulator includes the majority of the power plant systems necessary for the successful simulation of the KCB plant during normal operation, malfunctions and accident situations, and it has been successfully validated against the results of the safety evaluations from the KCB safety report. (orig.)

  19. Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.

    Science.gov (United States)

    Crawford, Walt

    1987-01-01

    Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…

  20. Desktop war - data suppliers competing for bigger market share

    International Nuclear Information System (INIS)

    Sword, M.

    1999-01-01

    The intense competition among suppliers of computerized data and computer software to the petroleum and natural gas industry in western Canada is discussed. It is estimated that the Canadian oil patch spends a large sum, about $400 million annually, on geoscience information and related costs, and industry is looking for ways to significantly reduce those costs. There is a need for integrated, desktop-driven data sets. Sensing the determination of industry to reduce information acquisition costs, data providers are responding with major consolidation of data sets. The major evolution in the industry is on-line access to increase the speed of information delivery. Data vendors continue to integrate land, well, log, production and other data sets, whether public or proprietary. The result is stronger foundations as platforms for interpretive software. Another development is the rise of the Internet and intranets and the redefinition of the role of information technology departments in the industry, as both of these are paving the way for electronic delivery of information and software tools to the desktop. Development of proprietary data sets and acquisition of competitors with complementary data sets that enhance products and services are just some of the ways data vendors are trying to get a bigger piece of the exploration and development pie.

  1. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  2. ACID Astronomical and Physics Cloud Interactive Desktop: A Prototype of VUI for CTA Science Gateway

    Science.gov (United States)

    Massimino, P.; Costa, A.; Becciani, U.; Vuerli, C.; Bandieramonte, M.; Petta, C.; Riggi, S.; Sciacca, E.; Vitello, F.; Pistagna, C.

    2014-05-01

    The Astronomical & Physics Cloud Interactive Desktop, developed for the prototype of the CTA Science Gateway in Catania, Italy, allows many software packages to be used without any installation on the local desktop. Users are able to exploit, where applicable, the native Graphical User Interface (GUI) of the programs available in the ACID environment. For using the remote programs interactively, ACID exploits an "ad hoc" VNC-based User Interface (VUI).

  3. Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.

    Science.gov (United States)

    Danziger, Pamela N.

    This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…

  4. The desktop muon detector: A simple, physics-motivated machine- and electronics-shop project for university students

    Science.gov (United States)

    Axani, S. N.; Conrad, J. M.; Kirby, C.

    2017-12-01

    This paper describes the construction of a desktop muon detector, an undergraduate-level physics project that develops machine-shop and electronics-shop technical skills. The desktop muon detector is a self-contained apparatus that employs a plastic scintillator as the detection medium and a silicon photomultiplier for light collection. This detector can be battery powered and is used in conjunction with the provided software. The total cost per detector is approximately $100. We describe physics experiments we have performed, and then suggest several other interesting measurements that are possible with one or more desktop muon detectors.
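
    The provided software is not described in the record; as a hedged illustration, the sketch below shows one way a desktop client might count events arriving from the detector over a USB serial link and report a rough rate. The port name, baud rate and one-line-per-event format are assumptions, not the detector's documented protocol.

```python
# Hedged sketch: count events reported by a desktop muon detector over
# a USB serial link and print a rough rate. Port, baud rate and line
# format are assumptions, not the detector's documented protocol.
import time
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 9600  # assumed values

def count_rate(duration_s: float = 60.0) -> float:
    events = 0
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        start = time.time()
        while time.time() - start < duration_s:
            line = link.readline().decode(errors="ignore").strip()
            if line:               # assume one line per triggered event
                events += 1
    return events / duration_s

if __name__ == "__main__":
    print(f"approximate rate: {count_rate():.3f} events/s")
```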

  5. Desktop Genetics

    OpenAIRE

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-01-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  6. Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009

    Science.gov (United States)

    Anderson, Kevin R.; Paine, Chris

    2006-01-01

    This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop(Registered TradeMark) Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by Sindaworks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield comprised of a series of v-grooved shaped deployable panels is used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software using emissivities with temperature and wavelength dependency modeled via a Hagen-Rubens relationship.
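
    For context, the Hagen-Rubens regime mentioned above ties a metal's normal spectral emissivity to its DC electrical resistivity; a commonly quoted first-order form (stated here as an assumption, since the record does not give the paper's exact parameterization) is

    $$ \varepsilon(\lambda, T) \;\approx\; 2\sqrt{2\,\varepsilon_0\,\omega\,\rho(T)}, \qquad \omega = \frac{2\pi c}{\lambda}, $$

    where ρ(T) is the temperature-dependent DC resistivity, ε0 the vacuum permittivity and c the speed of light, which is how both the temperature and wavelength dependence enter the emissivity model.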

  7. Telemedicine in rural areas. Experience with medical desktop-conferencing via satellite.

    Science.gov (United States)

    Ricke, J; Kleinholz, L; Hosten, N; Zendel, W; Lemke, A; Wielgus, W; Vöge, K H; Fleck, E; Marciniak, R; Felix, R

    1995-01-01

    Cooperation between physicians in hospitals in rural areas can be assisted by desktop-conferencing using a satellite link. For six weeks, medical desktop-conferencing was tested during daily clinical conferences between the Virchow-Klinikum, Berlin, and the Medical Academy, Wroclaw. The communications link was provided by the German Telekom satellite system MCS, which allowed temporary connections to be established on demand by manual dialling. Standard hardware and software were used for videoconferencing, as well as software for medical communication developed in the BERMED project. Digital data, such as computed tomography or magnetic resonance images, were transmitted by a digital data channel in parallel to the transmission of analogue video and audio signals. For conferences involving large groups of people, hardware modifications were required. These included the installation of a video projector, adaptation of the audio system with improved echo cancellation, and installation of extra microphones. Learning to use an unfamiliar communication medium proved to be uncomplicated for the participating physicians.

  8. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Ilander, T; Kansanaho, A; Toivonen, H

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes it easy to obtain independent positioning data. The present task under the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit utilizing maps that can be purchased or produced by the user. In addition, the system can easily be enlarged to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.).

  9. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    International Nuclear Information System (INIS)

    Ilander, T.; Kansanaho, A.; Toivonen, H.

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes it easy to obtain independent positioning data. The present task under the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit utilizing maps that can be purchased or produced by the user. In addition, the system can easily be enlarged to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.)

  10. Using M@th Desktop Notebooks and Palettes in the Classroom

    Science.gov (United States)

    Simonovits, Reinhard

    2011-01-01

    This article explains the didactical design of M@th Desktop (MD), a teaching and learning software application for high schools and universities. The use of two types of MD resources is illustrated: notebooks and palettes, focusing on the topic of exponential functions. The handling of MD in a blended learning approach and the impact on the…

  11. Efficiency Sustainability Resource Visual Simulator for Clustered Desktop Virtualization Based on Cloud Infrastructure

    Directory of Open Access Journals (Sweden)

    Jong Hyuk Park

    2014-11-01

    Following IT innovations, manual operations have been automated, improving the overall quality of life. This has been possible because an organic topology has been formed among the many diverse smart devices grafted onto real life. To provide services to these smart devices, enterprises or users use the cloud. Cloud services are divided into infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). SaaS is operated on PaaS, and PaaS is operated on IaaS. Since IaaS is the foundation of all services, algorithms for the efficient operation of virtualized resources are required. Among these algorithms, desktop resource virtualization is used for high resource availability when existing desktop PCs are unavailable. For this high resource availability, clustering for hierarchical structures is important. In addition, since many clustering algorithms show different percentages of the main resources depending on the desktop PC distribution rates and environments, selecting appropriate algorithms is very important. If diverse attempts are made to find algorithms suitable for the operating environments' desktop resource virtualization, huge costs are incurred for the related power, time and labor. Therefore, in the present paper, a desktop resource virtualization clustering simulator (DRV-CS), a clustering simulator for selecting clusters of desktop virtualization clusters to be maintained sustainably, is proposed. The DRV-CS provides simulations, so that clustering algorithms can be selected and elements can be properly applied in different desktop PC environments through the DRV-CS.

  12. Desktop Genetics.

    Science.gov (United States)

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-11-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  13. Instant Citrix XenDesktop 5 starter

    CERN Document Server

    Magdy, Mahmoud

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This easy-to-follow, hands-on guide shows you how to implement desktop virtualization with real life cases and step-by-step instructions. It is a tutorial with step-by-step instructions and adequate screenshots for the installation and administration of Citrix XenDesktop.If you are new to XenDesktop or are looking to build your skills in desktop virtualization, this is your step-by-step guide to learning Citrix XenDesktop. For those architects a

  14. Realization of a Desktop Flight Simulation System for Motion-Cueing Studies

    Directory of Open Access Journals (Sweden)

    Berkay Volkaner

    2016-05-01

    Parallel robotic mechanisms are generally used in flight simulators with a motion-cueing algorithm to create an unlimited motion feeling of a simulated medium in the bounded workspace of the simulator. A major problem in flight simulators is that the simulation has an unbounded space while the manipulator has a limited one. Using a washout filter in the motion-cueing algorithm overcomes this. In this study, a low-cost six degrees of freedom (DoF) desktop parallel manipulator is used to test a classical motion-cueing algorithm; the algorithm's functionality is confirmed in a Simulink real-time environment. Translational accelerations and angular velocities of the simulated medium obtained from the FlightGear flight simulation software are processed through a generated washout filter algorithm, and the simulated medium's motion information is transmitted to the desktop parallel robotic mechanism as a set point for each leg. The major issues of this paper are designing a desktop simulation system, controlling the parallel manipulator, communicating between the flight simulation and the platform, designing a motion-cueing algorithm and determining the parameters of the washout filters.
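
    The record names classical washout filtering without detail. A minimal single-axis sketch of the idea is given below: the simulated translational acceleration is passed through cascaded high-pass stages and then integrated twice into a platform displacement command, so sustained accelerations are washed out and the commanded excursion stays bounded inside the manipulator's workspace. The cutoff frequency, time step and one-axis simplification are illustrative assumptions, not the parameters identified in the paper.

```python
# Minimal single-axis sketch of classical washout filtering: pass the
# simulated acceleration through two cascaded first-order high-pass
# stages, then integrate twice into a platform displacement command.
# Sustained accelerations are washed out, so the commanded displacement
# stays bounded instead of running off the manipulator's workspace.
# Cutoff, time step and the one-axis simplification are assumptions.
import math

DT = 0.01          # assumed 100 Hz update rate
CUTOFF_HZ = 0.5    # assumed washout cutoff frequency

class HighPass:
    """Discrete first-order high-pass filter."""
    def __init__(self, cutoff_hz: float, dt: float):
        self.alpha = 1.0 / (1.0 + 2.0 * math.pi * cutoff_hz * dt)
        self.y_prev = 0.0
        self.x_prev = 0.0

    def step(self, x: float) -> float:
        y = self.alpha * (self.y_prev + x - self.x_prev)
        self.y_prev, self.x_prev = y, x
        return y

def washout_positions(accels):
    """Map simulated accelerations [m/s^2] to platform positions [m]."""
    hp1, hp2 = HighPass(CUTOFF_HZ, DT), HighPass(CUTOFF_HZ, DT)
    vel = pos = 0.0
    positions = []
    for a in accels:
        a_f = hp2.step(hp1.step(a))   # washed-out (onset-only) acceleration
        vel += a_f * DT               # integrate to velocity
        pos += vel * DT               # integrate to position command
        positions.append(pos)
    return positions

# A sustained 1 m/s^2 acceleration: the onset is cued, then washed out.
trace = washout_positions([1.0] * 1000)
print(f"commanded excursion after 10 s: {trace[-1]:.3f} m (bounded)")
```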

  15. Linux Desktop Pocket Guide

    CERN Document Server

    Brickner, David

    2005-01-01

    While Mac OS X garners all the praise from pundits, and Windows XP attracts all the viruses, Linux is quietly being installed on millions of desktops every year. For programmers and system administrators, business users, and educators, desktop Linux is a breath of fresh air and a needed alternative to other operating systems. The Linux Desktop Pocket Guide is your introduction to using Linux on five of the most popular distributions: Fedora, Gentoo, Mandriva, SUSE, and Ubuntu. Despite what you may have heard, using Linux is not all that hard. Firefox and Konqueror can handle all your web browsing

  16. Choosing the Right Desktop Publisher.

    Science.gov (United States)

    Eiser, Leslie

    1988-01-01

    Investigates the many different desktop publishing packages available today. Lists the steps to desktop publishing. Suggests which package to use with specific hardware available. Compares several packages for IBM, Mac, and Apple II based systems. (MVL)

  17. Bringing the medical library to the office desktop.

    Science.gov (United States)

    Brown, S R; Decker, G; Pletzke, C J

    1991-01-01

    This demonstration illustrates LRC Remote Computer Services, a dual operating system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic media file services, Novell and AppleTalk network protocol suites and gating, LAN and asynchronous (dial-in) access strategies, commercial applications for MS-DOS and Macintosh workstations and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of said services, particularly with respect to maintenance, security, training/support, staffing, software licensing and costs.

  18. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Today computing technologies are becoming the backbone of organizations and support individual work; in addition to the computing device itself, software is needed. A set of instructions, or a computer program, is known as software. Software is developed through traditional models or through newer, evolutionary models. Software development has become a key and successful business nowadays, and without software all hardware is useless. The collective steps performed in developing software are known as the software development life cycle (SDLC). There are predictive and adaptive models for developing software. Predictive models are the already well-known ones, such as the waterfall, spiral, prototype and V-shaped models, while adaptive models include agile and Scrum. The methodologies in both categories have their own procedures and steps. Predictive models are static and adaptive models are dynamic, meaning that changes cannot easily be made to a predictive model while an adaptive model has the capability of accommodating change. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will be helpful in deciding which model should be used in which circumstances and what development steps each model includes.

  19. DIaaS: Resource Management System for the Intra-Cloud with On-Premise Desktops

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2017-01-01

    Infrastructure as a service with desktops (DIaaS) based on the extensible mark-up language (XML) is herein proposed to utilize surplus resources. DIaaS is a traditional surplus-resource integrated management technology. It is designed to provide fast work distribution and computing services based on user service requests, as well as storage services, through desktop-based distributed computing and storage resource integration. DIaaS includes a nondisruptive resource service and an auto-scalable scheme to enhance the availability and scalability of intra-cloud computing resources. A performance evaluation of the proposed scheme measured the clustering performance time for surplus resource utilization. The results showed improvement in computing and storage services in a connection of at least two computers compared to the traditional method for high-availability measurement of nondisruptive services. Furthermore, an artificial server error environment was used to create a clustering delay for computing and storage services and for nondisruptive services. It was compared to the Hadoop distributed file system (HDFS).

  20. Desktop Publishing: Changing Technology, Changing Occupations.

    Science.gov (United States)

    Stanton, Michael

    1991-01-01

    Describes desktop publishing (DTP) and its place in corporations. Lists job titles of those working in desktop publishing and describes DTP as it is taught at secondary and postsecondary levels and by private trainers. (JOW)

  1. Promises and Realities of Desktop Publishing.

    Science.gov (United States)

    Thompson, Patricia A.; Craig, Robert L.

    1991-01-01

    Examines the underlying assumptions of the rhetoric of desktop publishing promoters. Suggests four criteria to help educators provide insights into issues and challenges concerning desktop publishing technology that design students will face on the job. (MG)

  2. Making the Leap to Desktop Publishing.

    Science.gov (United States)

    Schleifer, Neal

    1986-01-01

    Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)

  3. Desktop publishing and medical imaging: paper as hardcopy medium for digital images.

    Science.gov (United States)

    Denslow, S

    1994-08-01

    Desktop-publishing software and hardware have progressed to the point that many widely used word-processing programs are capable of printing high-quality digital images with many shades of gray from black to white. Accordingly, it should be relatively easy to print digital medical images on paper for reports, instructional materials, and research notes. Components were assembled that were necessary for extracting image data from medical imaging devices and converting the data to a form usable by word-processing software. A system incorporating these components was implemented in a medical setting and has been operating for 18 months. The use of this system by medical staff has been monitored.
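
    The assembled components are not itemized in the record; as a hedged illustration of the conversion step it describes, the sketch below rescales a raw 16-bit grayscale frame (as might be exported from an imaging device) into an 8-bit image file that a word processor can place on a page. The file names, frame size and min/max windowing are assumptions.

```python
# Hedged sketch of the conversion step: scale a raw 16-bit grayscale
# frame into an 8-bit image a word processor can import and print.
# File names, frame dimensions and min/max windowing are assumptions.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 512, 512  # assumed frame size

def raw_to_printable(raw_path: str, out_path: str) -> None:
    frame = np.fromfile(raw_path, dtype=np.uint16).reshape(HEIGHT, WIDTH)
    lo, hi = frame.min(), frame.max()
    scaled = ((frame - lo) / max(hi - lo, 1) * 255).astype(np.uint8)
    Image.fromarray(scaled, mode="L").save(out_path)  # e.g. PNG or TIFF

if __name__ == "__main__":
    # Synthetic example frame so the sketch runs without device data.
    demo = (np.linspace(0, 65535, WIDTH * HEIGHT, dtype=np.uint16)
            .reshape(HEIGHT, WIDTH))
    demo.tofile("study_frame.raw")
    raw_to_printable("study_frame.raw", "study_frame.png")
```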

  4. CernVM - a virtual software appliance for LHC applications

    International Nuclear Information System (INIS)

    Buncic, P; Sanchez, C Aguado; Blomer, J; Franco, L; Mato, P; Harutyunian, A; Yao, Y

    2010-01-01

    CernVM is a Virtual Software Appliance capable of running physics applications from the LHC experiments at CERN. It aims to provide a complete and portable environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) as well as on the Grid, independently of Operating System platforms (Linux, Windows, MacOS). The experiment application software and its specific dependencies are built independently from CernVM and delivered to the appliance just in time by means of a CernVM File System (CVMFS) specifically designed for efficient software distribution. The procedures for building, installing and validating software releases remain under the control and responsibility of each user community. We provide a mechanism to publish pre-built and configured experiment software releases to a central distribution point from where they find their way to the running CernVM instances via a hierarchy of proxy servers or content delivery networks. In this paper, we present the current state of the CernVM project, compare the performance of CVMFS to that of a traditional network file system like AFS, and discuss possible scenarios that could further improve its performance and scalability.

  5. Evaluating virtual hosted desktops for graphics-intensive astronomy

    Science.gov (United States)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

    Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of its useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  6. Technical Writing Teachers and the Challenges of Desktop Publishing.

    Science.gov (United States)

    Kalmbach, James

    1988-01-01

    Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

  7. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  8. Desktop Publishing for the Gifted/Talented.

    Science.gov (United States)

    Hamilton, Wayne

    1987-01-01

    Examines the nature of desktop publishing and how it can be used in the classroom for gifted/talented students. Characteristics and special needs of such students are identified, and it is argued that desktop publishing addresses those needs, particularly with regard to creativity. Twenty-six references are provided. (MES)

  9. Semantic document architecture for desktop data integration and management

    OpenAIRE

    Nesic, Sasa; Jazayeri, Mehdi

    2011-01-01

    Over the last decade, personal desktops have faced the problem of information overload due to increasing computational power, easy access to the Web and cheap data storage. Moreover, an increasing number of diverse end-user desktop applications have led to the problem of information fragmentation. Each desktop application has its own data, unaware of related and relevant data in other applications. In other words, personal desktops face a lack of interoperability of data managed by differ...

  10. Nuclear Plant Analyzer desktop workstation: An integrated interactive simulation, visualization and analysis tool

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1991-01-01

    The advanced, best-estimate, reactor thermal-hydraulic codes were originally developed as mainframe computer applications because of speed, precision, memory and mass storage requirements. However, the productivity of numerical reactor safety analysts has historically been hampered by mainframe dependence due to limited mainframe CPU allocation, accessibility and availability, poor mainframe job throughput, and delays in obtaining and difficulty comprehending printed numerical results. The Nuclear Plant Analyzer (NPA) was originally developed as a mainframe computer-graphics aid for reactor safety analysts in addressing the latter consideration. Rapid advances in microcomputer technology have since enabled the installation and execution of these reactor safety codes on desktop computers thereby eliminating mainframe dependence. The need for a complementary desktop graphics display generation and presentation capability, coupled with the need for software standardization and portability, has motivated the redesign of the NPA as a UNIX/X-Windows application suitable for both mainframe and microcomputer

  11. The Virtual Desktop: Options and Challenges in Selecting a Secure Desktop Infrastructure Based on Virtualization

    Science.gov (United States)

    2011-10-01

    the virtual desktop environment still functions for the users associated with it. Users can access the virtual desktop through the local network and... desktop virtualization technology can help meet the need for secure information sharing within DND. ... It provides an overview of desktop virtualization, including an in-depth examination of two different architectures: ...

  12. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  13. Multimedia architectures: from desktop systems to portable appliances

    Science.gov (United States)

    Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.

    1997-01-01

    Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.

  14. Collection and analysis of environmental radiation data using a desktop computer

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1982-04-01

    A portable instrumentation system using a Hewlett-Packard HP-9825 desktop computer for the collection and analysis of environmental radiation data is described. Procedures for the transmission of data between the HP-9825 and various nuclear counters are given, together with a description of the necessary hardware and software. Complete programs for the analysis of Ge(Li) and NaI(Tl) gamma-ray spectra, high-pressure ionization chamber monitor data, 86Kr monitor data and air filter sample alpha-particle activity measurements are presented. Some utility programs, intended to increase system flexibility, are included.

  15. Desktop Technology for Newspapers: Use of the Computer Tool.

    Science.gov (United States)

    Wilson, Howard Alan

    This work considers desktop publishing technology as a means of paginating newspapers electronically, tracing the technology's development from the beginning of desktop publishing in the mid-1980s to the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers. It reports on a Pennsylvania weekly…

  16. VMware Horizon 6 desktop virtualization solutions

    CERN Document Server

    Cartwright, Ryan; Langone, Jason; Leibovici, Andre

    2014-01-01

    If you are a desktop architect, solution provider, end-user consultant, virtualization engineer, or anyone who wants to learn how to plan and design the implementation of a virtual desktop solution based on Horizon 6, then this book is for you. An understanding of VMware vSphere fundamentals coupled with experience in the installation or administration of a VMware environment would be a plus during reading.

  17. FRAMEWORK PARA CONVERSÃO DE APLICATIVOS DELPHI DESKTOP EM APLICATIVOS ANDROID NATIVO

    Directory of Open Access Journals (Sweden)

    Rodrigo da Silva Riquena

    2014-08-01

    Full Text Available With the growing use of mobile devices by companies and organizations, there is increasing demand for applications on mobile platforms. For certain companies, business success may depend on a mobile application that brings them closer to their customers or improves the performance of internal processes. However, developing software for mobile platforms is an expensive process that takes time and resources. A framework that automatically converts Delphi desktop applications into native Android applications is therefore a useful tool for software architects and developers and can support the implementation phase of an application. This work is based on methods and processes for software reengineering, such as PRE/OO (Process of Reengineering Object Oriented), for the automatic conversion of an application developed in the Delphi environment into an application for the Android mobile platform. Finally, an experiment was performed on a real case to validate the goals.

  18. Open source software migration: Best practices

    CSIR Research Space (South Africa)

    Molefe, Onkgopotse M

    2010-09-01

    Full Text Available Open source software (OSS) has gained prominence worldwide, largely due to cost savings and security considerations. This has caused a change in the IT sector and has led to the migration of desktops from proprietary to OSS. The problem...

  19. Daylighting simulation : comparison of softwares for architect's utilization

    Energy Technology Data Exchange (ETDEWEB)

    Christakou, D.E.; Amorim, C.N.D. [Brazil Univ., Brasilia (Brazil). Faculty of Architecture and Urbanism

    2005-07-01

    This study analyzed and compared 4 daylighting software packages to determine the primary benefits and limits of each one, while considering the priorities for the use of the software by architects. The complex task of daylight simulation is an important step in designing buildings, particularly when the main objective is comfort and energy conservation. Simulation is not yet commonly practiced by professional architects because of the complexities of various software packages, the lack of user-friendly interfaces and difficulty in interpreting results. The 4 software packages that were evaluated in this study were: (1) Desktop Radiance, (2) Rayfront, (3) Relux 2004 Vision, and (4) Lightscape. Criteria such as interfaces, flexibility, and help manuals were also analyzed in an effort to establish a frame of the main points to be considered when choosing daylighting software for architectural use, both in educational and office environments. Simulations of a test room were performed in which some parameters were modified to verify performance against the following main criteria: flexibility in adapting to the architect's workflow; the use of state-of-the-art algorithms; numerical precision; and accessibility to Brazilian architects. The results demonstrate the potential for improvement of the software, particularly in terms of user interfaces and help manuals. The study showed that Relux 2004 Vision is the most adequate for architects' use. Rayfront and Desktop Radiance presented more difficulties in the design process, but Desktop Radiance had the advantage of being integrated with AutoCAD, a well-known interface. Lightscape had a user-friendly interface but was not as intuitive as Relux. It was concluded that the ideal daylighting simulation software does not yet exist. The ideal software should integrate diverse factors and combine editing and modeling tools beyond luminous evaluation and the thermal consequences of daylight use. 5 refs., 3 tabs., 4 figs.

  20. A software toolkit for implementing low-cost virtual reality training systems

    International Nuclear Information System (INIS)

    Louka, Michael N.

    1999-04-01

    VR is a powerful technology for implementing training systems but better tools are needed to achieve wider usage and acceptance for desktop computer-based training applications. A need has been identified for a software tool kit to support the efficient implementation of well-structured desktop VR training systems. A powerful toolkit for implementing scalable low-cost VR training applications is described in this report (author) (ml)

  1. Desktop Publishing as a Learning Resources Service.

    Science.gov (United States)

    Drake, David

    In late 1988, Midland College in Texas implemented a desktop publishing service to produce instructional aids and reduce and complement the workload of the campus print shop. The desktop service was placed in the Media Services Department of the Learning Resource Center (LRC) for three reasons: the LRC was already established as a campus-wide…

  2. A Reusable Software Copy Protection Using Hash Result and Asymetrical Encryption

    Directory of Open Access Journals (Sweden)

    Aswin Wibisurya

    2014-12-01

    Full Text Available Desktop applications are one of the most popular types of application used on computers, owing to their one-time installation and their quick accessibility from the moment the computer is turned on. Limiting the copying and usage of desktop applications has long been an important issue for application providers. For security reasons, software copy protection is usually integrated with the application; however, developers seek to reuse the copy protection component across software. This paper proposes an approach to reusable software copy protection that consists of a certificate validator on the client computer and a certificate generator on the server. The integrity of the certificate validator is protected using a hash result, while all communications are encrypted using asymmetric encryption to ensure the security of this approach.
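
    A minimal Python sketch of the two ingredients named above - a hash result to check the integrity of the validator component, and asymmetric encryption of the client-server exchange - is given below. It assumes the third-party cryptography package; the key size, message format, and helper names are illustrative, not the authors' implementation.

        # Minimal sketch: a hash to detect tampering with the validator component,
        # and asymmetric (RSA) encryption for client-server messages.
        # Requires the third-party 'cryptography' package.
        import hashlib
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        def file_digest(path: str) -> str:
            """SHA-256 digest used to detect tampering with the validator binary."""
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        # Server side: key pair; the public key ships with the client.
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        public_key = private_key.public_key()

        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        # Client side: encrypt a licence request so only the server can read it.
        request = b"machine-id:ABC123;product:demo"
        ciphertext = public_key.encrypt(request, oaep)

        # Server side: decrypt and answer (e.g., with a generated certificate).
        assert private_key.decrypt(ciphertext, oaep) == request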

  3. System Testing of Desktop and Web Applications

    Science.gov (United States)

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  4. LCCP Desktop Application v1.0 Engineering Reference

    Energy Technology Data Exchange (ETDEWEB)

    Beshr, Mohamed [Univ. of Maryland, College Park, MD (United States); Aute, Vikrant [Univ. of Maryland, College Park, MD (United States)

    2014-04-01

    This Life Cycle Climate Performance (LCCP) Desktop Application Engineering Reference is divided into three parts. The first part of the guide, consisting of the LCCP objective, literature review, and mathematical background, is presented in Sections 2-4. The second part of the guide (given in Sections 5-10) provides a description of the input data required by the LCCP desktop application, including each of the input pages (Application Information, Load Information, and Simulation Information) and details for interfacing the LCCP Desktop Application with the VapCyc and EnergyPlus simulation programs. The third part of the guide (given in Section 11) describes the various interfaces of the LCCP code.

  5. Citrix XenApp 7.5 desktop virtualization solutions

    CERN Document Server

    Paul, Andy

    2014-01-01

    If you are a Citrix® engineer, a virtualization consultant, or an IT project manager with prior experience of using Citrix XenApp® and related technologies for desktop virtualization and want to further explore the power of XenApp® for flawless desktop virtualization, then this book is for you.

  6. Perception Analysis of Desktop and Mobile Service Website

    Directory of Open Access Journals (Sweden)

    Rizqiyatul Khoiriyah

    2016-12-01

    Full Text Available The research was conducted as a qualitative study of websites to explore and examine in depth the analysis of user perceptions of desktop and mobile website services. This research reviewed user perceptions of the desktop and mobile website services used, applying qualitative methods adapted from the WebQual and User Experience approaches. This qualitative research referred to the theoretical framework of Creswell (2014). The expected outcome is to understand user perceptions of the services and information available on the website, along with the possibility of a desktop-mobile gap arising from differences between the two services. These results can be used as a model for website services grounded in user experience.

  7. Desktop Publishing: A Powerful Tool for Advanced Composition Courses.

    Science.gov (United States)

    Sullivan, Patricia

    1988-01-01

    Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)

  8. Software development to implement the TxDOT culvert rating guide.

    Science.gov (United States)

    2013-05-01

    This implementation project created CULVLR: Culvert Load Rating, Version 1.0.0, a Windows-based desktop application software package that automates the process by which Texas Department of Transportation (TxDOT) engineers and their consultants ...

  9. A Personal Desktop Liquid-Metal Printer as a Pervasive Electronics Manufacturing Tool for Society in the Near Future

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-12-01

    Full Text Available It has long been a dream in the electronics industry to be able to write out electronics directly, as simply as printing a picture onto paper with an office printer. The first-ever prototype of a liquid-metal printer has been invented and demonstrated by our lab, bringing this goal a key step closer. As part of a continuous endeavor, this work is dedicated to significantly extending such technology to the consumer level by making a very practical desktop liquid-metal printer for society in the near future. Through the industrial design and technical optimization of a series of key technical issues such as working reliability, printing resolution, automatic control, human-machine interface design, software, hardware, and integration between software and hardware, a high-quality personal desktop liquid-metal printer that is ready for mass production in industry was fabricated. Its basic features and important technical mechanisms are explained in this paper, along with demonstrations of several possible consumer end-uses for making functional devices such as light-emitting diode (LED) displays. This liquid-metal printer is an automatic, easy-to-use, and low-cost personal electronics manufacturing tool with many possible applications. This paper discusses important roles that the new machine may play for a group of emerging needs. The prospective future of this cutting-edge technology is outlined, along with a comparative interpretation of several historical printing methods. This desktop liquid-metal printer is expected to become a basic electronics manufacturing tool for a wide variety of emerging practices in the academic realm, in industry, and in education, as well as for individual end-users in the near future.

  10. A VM-shared desktop virtualization system based on OpenStack

    Science.gov (United States)

    Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie

    2018-04-01

    With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform weakly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. We modified the connection process and the display data transmission process of the remote display protocol SPICE to support the VM-shared function. In addition, we propose a server-push display mode to improve the interactive user experience. The experimental results show that our system performs well in terms of response time and achieves low CPU consumption.

  11. Desktop Publishing in Education.

    Science.gov (United States)

    Hall, Wendy; Layman, J.

    1989-01-01

    Discusses the state of desktop publishing (DTP) in education today and describes the weaknesses of the systems available for use in the classroom. Highlights include document design and layout; text composition; graphics; word processing capabilities; a comparison of commercial and educational DTP packages; and skills required for DTP. (four…

  12. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  13. Exploring Graphic Design. A Short Course in Desktop Publishing.

    Science.gov (United States)

    Stanley, MLG

    This course in desktop publishing contains seven illustrated modules designed to meet the following objectives: (1) use a desktop publishing program to explore advanced topics in graphic design; (2) learn about typography and how to make design decisions on the use of typestyles; (3) learn basic principles in graphic communications and apply them…

  14. FORMED: Bringing Formal Methods to the Engineering Desktop

    Science.gov (United States)

    2016-02-01

    FORMED: Bringing Formal Methods to the Engineering Desktop. BAE Systems, February 2016. Final technical report, approved for public release (contract number FA8750-14-C-0024).

  15. Desktop aligner for fabrication of multilayer microfluidic devices.

    Science.gov (United States)

    Li, Xiang; Yu, Zeta Tak For; Geraldo, Dalton; Weng, Shinuo; Alve, Nitesh; Dun, Wu; Kini, Akshay; Patel, Karan; Shu, Roberto; Zhang, Feng; Li, Gang; Jin, Qinghui; Fu, Jianping

    2015-07-01

    Multilayer assembly is a commonly used technique to construct multilayer polydimethylsiloxane (PDMS)-based microfluidic devices with complex 3D architecture and connectivity for large-scale microfluidic integration. Accurate alignment of structure features on different PDMS layers before their permanent bonding is critical in determining the yield and quality of assembled multilayer microfluidic devices. Herein, we report a custom-built desktop aligner capable of both local and global alignments of PDMS layers covering a broad size range. Two digital microscopes were incorporated into the aligner design to allow accurate global alignment of PDMS structures up to 4 in. in diameter. Both local and global alignment accuracies of the desktop aligner were determined to be about 20 μm cm⁻¹. To demonstrate its utility for fabrication of integrated multilayer PDMS microfluidic devices, we applied the desktop aligner to achieve accurate alignment of different functional PDMS layers in multilayer microfluidics including an organs-on-chips device as well as a microfluidic device integrated with vertical passages connecting channels located in different PDMS layers. Owing to its convenient operation, high accuracy, low cost, light weight, and portability, the desktop aligner is useful for microfluidic researchers to achieve rapid and accurate alignment for generating multilayer PDMS microfluidic devices.

  16. Feasibility of video codec algorithms for software-only playback

    Science.gov (United States)

    Rodriguez, Arturo A.; Morse, Ken

    1994-05-01

    Software-only video codecs can provide good playback performance on desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or greater) frames per second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback on desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease of decoding (i.e., playback performance), memory consumption, compression-to-decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described, since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding, since these methods are amenable to software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
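
    The block-based frame-differencing idea mentioned above can be sketched in a few lines of Python/NumPy: only blocks that changed since the previous frame need to be re-encoded and redrawn. The block size and change threshold are illustrative assumptions, not values from the paper.

        # Minimal block-based frame-differencing sketch (illustrative only).
        import numpy as np

        def changed_blocks(prev, curr, block=16, threshold=4.0):
            """Yield (row, col) of block x block tiles whose mean absolute
            difference between frames exceeds 'threshold' (8-bit grey frames)."""
            h, w = curr.shape
            for r in range(0, h, block):
                for c in range(0, w, block):
                    a = prev[r:r + block, c:c + block].astype(np.int16)
                    b = curr[r:r + block, c:c + block].astype(np.int16)
                    if np.abs(b - a).mean() > threshold:
                        yield r, c

        if __name__ == "__main__":
            prev = np.zeros((240, 320), dtype=np.uint8)
            curr = prev.copy()
            curr[100:140, 150:200] = 255          # simulate a moving object
            dirty = list(changed_blocks(prev, curr))
            print(f"{len(dirty)} of {(240 // 16) * (320 // 16)} blocks need re-encoding")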

  17. The desktop interface in intelligent tutoring systems

    Science.gov (United States)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  18. Metadata Wizard: an easy-to-use tool for creating FGDC-CSDGM metadata for geospatial datasets in ESRI ArcGIS Desktop

    Science.gov (United States)

    Ignizio, Drew A.; O'Donnell, Michael S.; Talbert, Colin B.

    2014-01-01

    Creating compliant metadata for scientific data products is mandated for all federal Geographic Information Systems professionals and is a best practice for members of the geospatial data community. However, the complexity of the Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata, the limited availability of easy-to-use tools, and recent changes in the ESRI software environment continue to make metadata creation a challenge. Staff at the U.S. Geological Survey Fort Collins Science Center have developed a Python toolbox for ESRI ArcGIS Desktop to facilitate a semi-automated workflow for creating and updating metadata records in ESRI's 10.x software. The U.S. Geological Survey Metadata Wizard tool automatically populates several metadata elements: the spatial reference, spatial extent, geospatial presentation format, vector feature count or raster column/row count, native system/processing environment, and the metadata creation date. Once the software auto-populates these elements, users can easily add attribute definitions and other relevant information in a simple graphical user interface. The tool, which offers a simple design free of esoteric metadata language, has the potential to save many government and non-government organizations a significant amount of time and cost by facilitating the development of metadata compliant with the Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata for ESRI software users. A working version of the tool is now available for ESRI ArcGIS Desktop, versions 10.0, 10.1, and 10.2 (downloadable at http:/www.sciencebase.gov/metadatawizard).
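
    A minimal Python sketch of this kind of auto-population - filling the metadata date and bounding coordinates into an FGDC-CSDGM XML skeleton - is shown below, using only the standard library. The element set is illustrative and far from a complete, schema-valid record; it is not the Metadata Wizard code itself.

        # Sketch of auto-populating a few FGDC-CSDGM elements (metadata date,
        # bounding coordinates) from known values. Illustrative only.
        import datetime
        import xml.etree.ElementTree as ET

        def build_skeleton(west, east, north, south):
            meta = ET.Element("metadata")
            spdom = ET.SubElement(ET.SubElement(meta, "idinfo"), "spdom")
            bounding = ET.SubElement(spdom, "bounding")
            for tag, value in (("westbc", west), ("eastbc", east),
                               ("northbc", north), ("southbc", south)):
                ET.SubElement(bounding, tag).text = f"{value:.6f}"
            metainfo = ET.SubElement(meta, "metainfo")
            ET.SubElement(metainfo, "metd").text = datetime.date.today().strftime("%Y%m%d")
            return meta

        if __name__ == "__main__":
            record = build_skeleton(west=-109.05, east=-102.04, north=41.0, south=37.0)
            print(ET.tostring(record, encoding="unicode"))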

  19. Perception Analysis of Desktop and Mobile Service Website

    OpenAIRE

    Khoiriyah, Rizqiyatul

    2016-01-01

    The research was conducted as a qualitative study of the website to deeper explore and examine the analysis of user perception of desktop and mobile website services. This research reviewed about user perception of desktop and mobile service website used by using qualitative methods adapted to WebQual and User Experience approach. This qualitative research refered to the theoretical reference written by Creswell (2014). The expected outcome is to know the user perceptions of the available ser...

  20. Desktop Publishing: Organizational Considerations for Adoption and Implementation. TDC Research Report No. 6.

    Science.gov (United States)

    Lee, Paul

    This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…

  1. Integrasi pemrograman web pada pemrograman desktop sebagai alternatif fasilitas laporan dalam pengembangan program aplikasi

    Directory of Open Access Journals (Sweden)

    Mardainis Mardainis

    2017-11-01

    Full Text Available Abstract: Desktop programming produces application programs that can operate without relying on an internet connection. Desktop programs are typically used when the application will be operated without needing the internet and the working area is confined to a single location, whereas web programs depend heavily on the internet to connect users to one another. The choice between a desktop program and a web-based program is determined by the requirements and the implementation. If the implementation is only for a company environment located in one place, a desktop-based program is the better choice; however, if the company has separate locations in several regions, a web-based program is more appropriate. Many programmers, especially beginners, are nevertheless reluctant to use desktop programming because producing reports requires a dedicated report-generation application such as Crystal Report. The difficulty with such a dedicated application is that it is not available in the system and must be procured separately, and building reports can feel somewhat complicated because the report layout must be configured manually. In web-based programming languages, by contrast, information for display can be produced easily within the program itself without additional applications, so producing reports with a web-based program is easier. To spare programmers these difficulties when building reports for desktop programs, the researchers integrate web-based programming with desktop-based programming with the aim of making report generation easier. Keywords: desktop programming, implementation, integration, Crystal Report.

  2. MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop

    Science.gov (United States)

    MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop. MedlinePlus provides a consistent user experience from a desktop, tablet, or phone, for all users regardless of how they ...

  3. A Course in Desktop Publishing.

    Science.gov (United States)

    Somerick, Nancy M.

    1992-01-01

    Describes "Promotional Publications," a required course for public relations majors, which teaches the basics of desktop publishing. Outlines how the course covers the preparation of publications used as communication tools in public relations, advertising, and organizations, with an emphasis upon design, layout, and technology. (MM)

  4. A NICE approach to managing large numbers of desktop PC's

    International Nuclear Information System (INIS)

    Foster, David

    1996-01-01

    The problems of managing desktop systems are far from resolved as we deploy increasing numbers of systems: PCs, Macintoshes and UN*X workstations. This paper will concentrate on the solution adopted at CERN for the management of the rapidly increasing number of desktop PCs in use in all parts of the laboratory. (author)

  5. Desktop Virtualization: Applications and Considerations

    Science.gov (United States)

    Hodgman, Matthew R.

    2013-01-01

    As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…

  6. The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector

    Science.gov (United States)

    Axani, S. N.; Frankiewicz, K.; Conrad, J. M.

    2018-03-01

    The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100 and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection and the data can either be recorded directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.
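
    A minimal Python sketch of recording detector output over the USB serial connection is shown below. It assumes the third-party pyserial package; the port name, baud rate, and output format are placeholders, not values taken from the CosmicWatch documentation.

        # Sketch of logging detector output over a USB serial connection.
        # Requires pyserial (pip install pyserial). Port and baud are placeholders.
        import serial

        PORT = "/dev/ttyUSB0"   # assumption: adjust for your system (e.g. "COM3")
        BAUD = 9600             # assumption

        def log_events(path="muon_events.txt", n_events=100):
            """Append n_events lines of detector output to a text file."""
            with serial.Serial(PORT, BAUD, timeout=10) as conn, open(path, "a") as out:
                for _ in range(n_events):
                    line = conn.readline().decode("ascii", errors="replace").strip()
                    if line:
                        out.write(line + "\n")

        if __name__ == "__main__":
            log_events()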

  7. Desktop Publishing in the University.

    Science.gov (United States)

    Burstyn, Joan N., Ed.

    Highlighting changes in the work of people within the university, this book presents nine essays that examine the effects of desktop publishing and electronic publishing on professors and students, librarians, and those who work at university presses and in publication departments. Essays in the book are: (1) "Introduction: The Promise of Desktop…

  8. A desktop PRA

    International Nuclear Information System (INIS)

    Dolan, B.J.; Weber, B.J.

    1989-01-01

    This paper reports that Duke Power Company has completed full-scope PRAs for each of its nuclear stations - Oconee, McGuire and Catawba. These living PRAs are being maintained using desktop personal computers. Duke's PRA group now has powerful personal computer-based tools that have both decreased direct costs (computer analysis expenses) and increased group efficiency (less time to perform analyses). The shorter turnaround time has already resulted in direct savings through analyses provided in support of justification for continued station operation. Such savings are expected to continue with similar future support

  9. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Full Text Available Following the rapid growth of ubiquitous computing, many jobs that were previously manual have now been automated. This automation has increased the amount of time available for leisure, and diverse services are now being developed for this leisure time. In addition, with the development of small and portable devices like smartphones, diverse Internet services can be used regardless of time and place. Studies regarding diverse forms of virtualization are currently in progress. These studies aim to determine ways to efficiently store and process the big data generated by the multitude of devices and services in use. One topic of such studies is desktop storage virtualization, which integrates distributed desktop resources and provides them to users by virtualizing distributed legacy desktops. In the case of desktop storage virtualization, high availability is necessary and important for providing reliability to users. Studies regarding hierarchical structures and resource integration are currently in progress; these studies aim to create efficient data distribution and storage for distributed desktops based on resource integration environments. However, studies regarding efficient responses to server faults occurring in desktop-based resource integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS) for high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments, and it activates alternative servers when a fault occurs within the system.

  10. Design Options for a Desktop Publishing Course.

    Science.gov (United States)

    Mayer, Kenneth R.; Nelson, Sandra J.

    1992-01-01

    Offers recommendations for development of an undergraduate desktop publishing course. Discusses scholastic level and prerequisites, purpose and objectives, instructional resources and methodology, assignments and evaluation, and a general course outline. (SR)

  11. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

    Full Text Available The transition from physical servers to a virtual server infrastructure (VSI) and from desktop devices to a virtual desktop infrastructure (VDI) raises the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Moreover, appropriately choosing a hypervisor for the desired server/desktop virtualization is really challenging, because the trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on a C/P ratio that is derived from a composite index, the consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though ESX Server obtains the highest ROI and lowest TCO in server virtualization and Hyper-V R2 delivers the best virtual machine management performance, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only greatly reduces the initial investment required to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.

  12. General-purpose software for science technology calculation

    International Nuclear Information System (INIS)

    Aikawa, Hiroshi

    1999-01-01

    We have developed many general-purpose software packages for parallel processing in science and technology calculations. This paper reports six of them: the STA (Seamless Thinking Aid) basic software, a parallel numerical computation library, grid generation software for parallel computers, a real-time visualization system, a parallel benchmark test system, and an object-oriented parallel programming method. STA is user-interface software that provides a total environment for parallel programming, a network computing environment for various parallel computers, and a desktop computing environment via the Web. Some examples using the above software are explained. One of them is a simultaneous parallel calculation of both the flow and the structure of a supersonic transport for its design. Another covers various kinds of parallel calculations for nuclear fusion, such as molecular dynamics calculations and calculations of reactor structure and fluid. These software packages are available to the public via the home page {http://guide.tokai.jaeri.go.jp/ccse/}. (S.Y.)

  13. Microsoft Virtualization Master Microsoft Server, Desktop, Application, and Presentation Virtualization

    CERN Document Server

    Olzak, Thomas; Boomer, Jason; Keefer, Robert M

    2010-01-01

    Microsoft Virtualization helps you understand and implement the latest virtualization strategies available with Microsoft products. This book focuses on: Server Virtualization, Desktop Virtualization, Application Virtualization, and Presentation Virtualization. Whether you are managing Hyper-V, implementing desktop virtualization, or even migrating virtual machines, this book is packed with coverage on all aspects of these processes. Written by a talented team of Microsoft MVPs, Microsoft Virtualization is the leading resource for a full installation, migration, or integration of virtual syste

  14. Effective UI The Art of Building Great User Experience in Software

    CERN Document Server

    Anderson, Jonathan; Wilson, Robb

    2010-01-01

    People expect effortless, engaging interaction with desktop and web applications, but producing software that generates enjoyable user experiences is much harder than many companies anticipate. With Effective UI, you'll learn proven user-experience strategies that will satisfy your clients and customers, drive business value, and increase brand strength. This book shows you how to capture the collaborative and cooperative spirit among designers, engineers, and management required for building engaging software. You'll also learn valuable methods for maintaining focus throughout the process -

  15. ADAM (Affordable Desktop Application Manager): a Unix desktop application manager

    International Nuclear Information System (INIS)

    Liebana, M.; Marquina, M.; Ramos, R.

    1996-01-01

    ADAM stands for Affordable Desktop Application Manager. It is a GUI developed at CERN with the aim of easing access to applications. The motivation to develop ADAM came from the unavailability of environments like COSE/CDE and their heavy resource consumption. ADAM has proven to be user friendly: new users are able to customize it to their needs in a few minutes. Groups of users may share a common application environment through ADAM. ADAM also integrates the Unix and PC worlds: PC users can access Unix applications in the same way as their usual Windows applications. This paper describes all the ADAM features, how they are used at CERN Public Services, and the future plans for ADAM. (author)

  16. Cubby : Multiscreen Desktop VR Part III

    NARCIS (Netherlands)

    Djajadiningrat, J.P.; Gribnau, M.W.

    2000-01-01

    In this month's final episode of our 'Cubby: Multiscreen Desktop VR' trilogy we explain how you read the InputSprocket driver from part II, how you use it as input for the cameras from part I and how you calibrate the input device so that it leads to the correct head position.

  17. Cubby : Multiscreen Desktop VR Part II

    NARCIS (Netherlands)

    Gribnau, M.W.; Djajadiningrat, J.P.

    2000-01-01

    In this second part of our 'Cubby: Multiscreen Desktop VR' trilogy, we will introduce you to the art of creating a driver to read an Origin Instruments Dynasight input device. With the Dynasight, the position of the head of the user is established so that Cubby can display the correct images on its

  18. The File Sync Algorithm of the ownCloud Desktop Clients

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The ownCloud desktop clients provide file syncing between desktop machines and the ownCloud server and are available for the major desktop platforms. This presentation will give an overview of the sync algorithm used by the clients to provide a fast, reliable and robust syncing experience for the users. It will describe the phases a sync run goes through and how it is triggered. It will also provide insight into the algorithms that decide whether a file is uploaded, downloaded or even deleted, either on the local machine or in the cloud. Some examples of non-obvious situations in file syncing will be described and discussed. As the ownCloud sync protocol is based on the open WebDAV standard, the resulting challenges and their solutions will be illustrated. Finally, a couple of frequently proposed enhancements will be reviewed and assessed for the future development of the ownCloud server and syncing clients.
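
    The upload/download/delete decision can be illustrated with a simplified three-way comparison between the local state, the remote state, and a journal recording the state after the last successful sync. The Python sketch below is an illustration of that idea, not the actual ownCloud client algorithm.

        # Illustrative three-way sync decision (local file, remote file, and a
        # journal of the state after the last successful sync). Simplified sketch.
        def decide(local, remote, journal):
            """Each argument is an ETag/mtime-like token, or None if the file is
            absent. Returns the action to take for one path."""
            if local == remote:
                return "nothing to do"
            if journal is None:                      # never synced before
                if local is not None and remote is None:
                    return "upload"
                if remote is not None and local is None:
                    return "download"
                return "conflict"                    # both sides created independently
            if local == journal:                     # only the remote side changed
                return "download" if remote is not None else "delete local"
            if remote == journal:                    # only the local side changed
                return "upload" if local is not None else "delete remote"
            return "conflict"                        # both sides changed since last sync

        if __name__ == "__main__":
            print(decide(local="v2", remote="v1", journal="v1"))   # -> upload
            print(decide(local=None, remote="v1", journal="v1"))   # -> delete remote
            print(decide(local="v2", remote="v3", journal="v1"))   # -> conflict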

  19. [Porting Radiotherapy Software of Varian to Cloud Platform].

    Science.gov (United States)

    Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin

    2017-09-30

    The aim was to develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, all the Varian radiotherapy software modules were installed on virtual machines on the private cloud platform, and the corresponding functions were configured. Finally, the software on the cloud could be accessed through a virtual desktop client. The functional test results of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud platform migration in this study is economical and practical. The project not only improves the utilization rate of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications to the field of radiation oncology.

  20. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  1. EPA Region 8, Memo on Desktop Printer Ink Cartridges Policy & Voluntary Printer Turn-in

    Science.gov (United States)

    This memo requests EPA Region 8 users to voluntarily turn-in their desktop printers and notifies users of the Region 8 policy to not provide maintenance or ink and toner cartridges for desktop printers.

  2. Increasing Open Source Software Integration on the Department of Defense Unclassified Desktop

    National Research Council Canada - National Science Library

    Schearer, Steven A

    2008-01-01

    .... While some of this expenditure goes to fund special-purpose military software, much of it is absorbed by license fees for computer operating systems and general-purpose office automation applications...

  3. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for the development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
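
    The fundamental functionality referred to above - a simulation clock, a pending-event list, and state updates - can be illustrated with a minimal single-server queue written in Python. The parameters are arbitrary; this is a generic sketch, not the Excel supply-chain model described in the paper.

        # Minimal discrete-event simulation of a single-server queue, showing the
        # core machinery (clock, event list, state update) of any such model.
        import heapq
        import random

        def simulate(horizon=1000.0, arrival_rate=0.9, service_rate=1.0, seed=1):
            random.seed(seed)
            clock, queue_len, busy, served = 0.0, 0, False, 0
            events = [(random.expovariate(arrival_rate), "arrival")]
            while events:
                clock, kind = heapq.heappop(events)
                if clock > horizon:
                    break
                if kind == "arrival":
                    heapq.heappush(events, (clock + random.expovariate(arrival_rate), "arrival"))
                    if busy:
                        queue_len += 1
                    else:
                        busy = True
                        heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
                else:  # departure
                    served += 1
                    if queue_len > 0:
                        queue_len -= 1
                        heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
                    else:
                        busy = False
            return served

        if __name__ == "__main__":
            print("customers served:", simulate())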

  4. Investigation Methodology of a Virtual Desktop Infrastructure for IoT

    Directory of Open Access Journals (Sweden)

    Doowon Jeong

    2015-01-01

    Full Text Available Cloud computing for the IoT (Internet of Things) has exhibited the greatest growth in the IT market in the recent past, and this trend is expected to continue. Many companies are adopting a virtual desktop infrastructure (VDI) for private cloud computing to reduce costs and enhance the efficiency of their servers. As VDI becomes widely used, threats of cyber terror and intrusion are also increasing. To minimize the damage, response procedures for cyber intrusions on a VDI should be systematized. Therefore, we propose an investigation methodology for VDI solutions in this paper. Here we focus on the virtual desktop infrastructure and introduce various desktop virtualization solutions that are widely used, such as VMware, Citrix, and Microsoft. In addition, we verify the integrity of the acquired data so that the results of our proposed methodology are acceptable as evidence in a court of law. During the experiment, we observed an error: one of the commonly used digital forensic tools failed to mount a dynamically allocated virtual disk properly.

  5. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at a time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  6. Desktop publishing: a useful tool for scientists.

    Science.gov (United States)

    Lindroth, J R; Cooper, G; Kent, R L

    1994-01-01

    Desktop publishing offers features that are not available in word processing programs. The process yields an impressive and professional-looking document that is legible and attractive. It is a simple but effective tool to enhance the quality and appearance of your work and perhaps also increase your productivity.

  7. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  8. Desktop Virtualization in Action: Simplicity Is Power

    Science.gov (United States)

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  9. Desk-top publishing using IBM-compatible computers.

    Science.gov (United States)

    Grencis, P W

    1991-01-01

    This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.

  10. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    Science.gov (United States)

    Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that the mobile search volume surpasses the desktop search volume and mobile search patterns differ from desktop search patterns, the previous digital surveillance systems did not distinguish mobile and desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries and the number of queries with an r-value of ≥ 0.7 equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even become greater than desktop search queries over time. In the future development of influenza surveillance using search queries, the recognition of changing trend of mobile search data could be necessary.
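
    The lagged correlation analysis described above can be sketched in a few lines of Python using SciPy: correlate a weekly search-volume series, shifted ahead by zero to two weeks, against weekly ILI rates. The two series below are synthetic placeholders, not the study data.

        # Sketch of a lagged Spearman correlation between weekly search volume and
        # weekly ILI rates. Synthetic placeholder data; requires scipy.
        from scipy.stats import spearmanr

        ili = [1.2, 1.5, 2.1, 3.4, 5.0, 4.2, 3.1, 2.0, 1.4, 1.1]        # weekly ILI rate
        search = [110, 150, 230, 420, 610, 500, 340, 220, 160, 120]      # weekly query volume

        for lag in range(3):  # search volume leading ILI by 0, 1, 2 weeks
            x = search[:len(search) - lag] if lag else search
            y = ili[lag:]
            rho, p = spearmanr(x, y)
            print(f"lag {lag} week(s): rho = {rho:.2f}, p = {p:.3f}")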

  11. Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research

    Science.gov (United States)

    Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.

    2008-12-01

    In both research and education learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using the software, Visualizer, specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.

  12. Open Source Next Generation Visualization Software for Interplanetary Missions

    Science.gov (United States)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  13. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.; Burger, K.; Reichl, F.; Meneveau, C.; Szalay, A.; Westermann, R.

    2012-01-01

    is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence

  14. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    Science.gov (United States)

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

    Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0%. A relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment following computer use. Trial registration no: ACTRN12617000326392.

  15. Desktop Publishing: A New Frontier for Instructional Technologists.

    Science.gov (United States)

    Bell, Norman T.; Warner, James W.

    1986-01-01

    Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications and a description of new technological advances referred to as "desktop publishing," and suggests applications of this technology to instructional tasks. (TW)

  16. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms.

    Science.gov (United States)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-12-21

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criteria, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. Imaging tests and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards.

  17. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms

    International Nuclear Information System (INIS)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-01-01

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criteria, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. Imaging tests and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards. (paper)
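
    The SNR and uniformity figures mentioned in these two records can be computed from a phantom image with a few lines of code. The sketch below uses common textbook definitions of SNR and percent integral uniformity; the file name and ROI coordinates are placeholders, and the exact formulas prescribed by AAPM report no. 28 or the study may differ.

    ```python
    # Hedged sketch: SNR and percent integral uniformity (PIU) from a phantom
    # slice, using common definitions. File name and ROI positions are placeholders.
    import numpy as np

    image = np.load("phantom_slice.npy")          # 2D array of pixel intensities

    signal_roi = image[96:160, 96:160]            # central ROI inside the phantom
    noise_roi = image[0:32, 0:32]                 # background ROI (air)

    snr = signal_roi.mean() / noise_roi.std()     # one common SNR definition

    s_max, s_min = signal_roi.max(), signal_roi.min()
    piu = 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))   # percent integral uniformity

    print(f"SNR = {snr:.1f}, PIU = {piu:.1f}%")
    ```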

  18. Comparing Web Applications with Desktop Applications: An Empirical Study

    DEFF Research Database (Denmark)

    Pop, Paul

    2002-01-01

    In recent years, many desktop applications have been ported to the world wide web in order to reduce (multiplatform) development, distribution and maintenance costs. However, there is little data concerning the usability of web applications, and the impact of their usability on the total cost of developing and using such applications. In this paper we present a comparison of web and desktop applications from the usability point of view. The comparison is based on an empirical study that investigates the performance of a group of users on two calendaring applications: Yahoo!Calendar and Microsoft Calendar. The study shows that in the case of web applications the performance of the users is significantly reduced, mainly because of the restricted interaction mechanisms provided by current web browsers.

  19. Digital video for the desktop

    CERN Document Server

    Pender, Ken

    1999-01-01

    Practical introduction to creating and editing high quality video on the desktop. Using examples from a variety of video applications, benefit from a professional's experience, step-by-step, through a series of workshops demonstrating a wide variety of techniques. These include producing short films, multimedia and internet presentations, animated graphics and special effects.The opportunities for the independent videomaker have never been greater - make sure you bring your understanding fully up to date with this invaluable guide.No prior knowledge of the technology is assumed, with explanati

  20. [Teaching Desktop] Video Conferencing in a Collaborative and Problem Based Setting

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Mouritzen, Per

    2013-01-01

    , teachers and assistant teachers wanted to find ways in the design for learning that enables the learners to acquire knowledge about the theories, models and concepts of the subject, as well as hands‐on competencies in a learning‐by‐doing manner. In particular we address the area of desktop video...... shows that the students experiment with various pedagogical situations, and that during the process of design, teaching, and reflection they acquire experiences at both a concrete specific and a general abstract level. The desktop video conference system creates challenges, with technical issues...

  1. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  2. Aquatic Habitats: Exploring Desktop Ponds. Teacher's Guide.

    Science.gov (United States)

    Barrett, Katharine; Willard, Carolyn

    This book, for grades 2-6, is designed to provide students with a highly motivating and unique opportunity to investigate an aquatic habitat. Students set up, observe, study, and reflect upon their own "desktop ponds." Accessible plants and small animals used in these activities include Elodea, Tubifex worms, snails, mosquito larvae, and fish.…

  3. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    2006-07-03

    break points' such as ... An automated desktop procedure was developed for computing statistically defensible, multiple change .... from source to mouth. .... the calculated value was less than the test statistic given in Owen.

  4. AN INVESTIGATION OF THE APPLICABILITY OF SOFTWARE PRODUCT LINE ENGINEERING FOR ENERGY AND COST-EFFICIENT GREENHOUSE PRODUCTION

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin

    product line architecture, and how the variability is managed and described using SPLE. It also describes how the DynaLight software product line and its products were developed. Doing so, it shows utilization of rich client platform technology in conjunction with SPLE, which has not been described....... Supplementary lighting is utilized to compensate for the light conditions in the darker months of the year in order to grow certain plants. This is both energy consuming and expensive, as the cost of electricity is high. New knowledge on the plasticity in plants to irregular light patterns is basis for a novel...... and cost, and two desktop tools, the first, DynaLight Desktop, a day-ahead-light-planning tool, and the second, DynaLight Desktop w/control, with the added capability to execute the plans by actuating the light inside the greenhouses. This work provides a SPLE methodology explicitly customized to our...

  5. Laptops vs. Desktops in a Google Groups Environment: A Study on Collaborative Learning

    Directory of Open Access Journals (Sweden)

    Steven Lopes Abrantes

    2011-01-01

    Full Text Available Current literature on m-learning points to a lack of studies on the real use of m-learning applications and on how they can compete with their current desktop counterparts. The study consists of an experiment involving one hundred and twelve higher education students and a set of learning activities that they had to accomplish. Its main objective is to validate whether students using laptops or desktops reach the flow experience, and which group reaches it more fully, when using Google Groups. The approach used is based on the flow experience introduced by [1]. It was possible to conclude that students using both laptops and desktops experienced the flow state, with laptop users showing a more positive effect on the flow experience.

  6. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  7. BioContainers: an open-source and community-driven framework for software standardization

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract Motivation BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation The software is freely available at github.com/BioContainers/. Contact yperez@ebi.ac.uk PMID:28379341

  8. BioContainers: an open-source and community-driven framework for software standardization.

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
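
    Because BioContainers images are ordinary container images, a containerized tool can be driven from a script. The sketch below shows one plausible way to call such a container from Python through the Docker CLI; the image tag, tool arguments and directory layout are illustrative assumptions, not part of the BioContainers documentation.

    ```python
    # Hedged sketch: invoking a BioContainers (Docker) image from Python via the
    # Docker CLI. Image name/tag and input paths are placeholders; see
    # biocontainers.pro for real image coordinates.
    import subprocess
    from pathlib import Path

    def run_in_container(image, tool_args, workdir):
        """Run a containerized tool with the working directory bind-mounted."""
        workdir = Path(workdir).resolve()
        cmd = [
            "docker", "run", "--rm",
            "-v", f"{workdir}:/data",      # share input/output files with the container
            "-w", "/data",
            image,
        ] + tool_args
        return subprocess.run(cmd, check=True, capture_output=True, text=True)

    # Example call (hypothetical image tag):
    # result = run_in_container("biocontainers/blast:2.2.31", ["blastp", "-version"], ".")
    # print(result.stdout)
    ```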

  9. Desktop Publishing: Its Impact on Community College Journalism.

    Science.gov (United States)

    Grzywacz-Gray, John; And Others

    1987-01-01

    Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)

  10. SAMP: Application Messaging for Desktop and Web Applications

    Science.gov (United States)

    Taylor, M. B.; Boch, T.; Fay, J.; Fitzpatrick, M.; Paioro, L.

    2012-09-01

    SAMP, the Simple Application Messaging Protocol, is a technology which allows tools to communicate. It is deployed in a number of desktop astronomy applications including ds9, Aladin, TOPCAT, World Wide Telescope and numerous others, and makes it straightforward for a user to treat a selection of these tools as a loosely-integrated suite, combining the most powerful features of each. It has been widely used within Virtual Observatory contexts, but is equally suitable for non-VO use. Enabling SAMP communication from web-based content has long been desirable. An obvious use case is arranging for a click on a web page link to deliver an image, table or spectrum to a desktop viewer, but more sophisticated two-way interaction with rich internet applications would also be possible. Use from the web however presents some problems related to browser sandboxing. We explain how the SAMP Web Profile, introduced in version 1.3 of the SAMP protocol, addresses these issues, and discuss the resulting security implications.
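
    To make the messaging pattern concrete, the sketch below uses astropy's SAMP client to broadcast a table-load message to whatever desktop tools (e.g., TOPCAT, Aladin) are registered with a running hub. The client name and the VOTable URL are placeholders, and a SAMP hub must already be running (TOPCAT starts one automatically) for the call to succeed.

    ```python
    # Hedged sketch: broadcasting a table to desktop SAMP clients with astropy.
    # The URL below is a placeholder; a SAMP hub must already be running.
    from astropy.samp import SAMPIntegratedClient

    client = SAMPIntegratedClient(name="demo-sender")
    client.connect()                                  # registers with the running hub
    try:
        client.notify_all({
            "samp.mtype": "table.load.votable",       # standard MType for VOTable delivery
            "samp.params": {
                "url": "file:///tmp/catalogue.vot",   # placeholder path to a VOTable
                "name": "demo catalogue",
            },
        })
    finally:
        client.disconnect()
    ```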

  11. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    more than 50 years ago. However, radical changes in which software products evolve in a disruptive manner, in their software engineering as much as in their organizational and business aspects, are rather rare. In this paper, we report on the transformation of one of the market-leading product series in water-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition. We argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems.

  12. In Vivo Tooth-Supported Implant Surgical Guides Fabricated With Desktop Stereolithographic Printers: Fully Guided Surgery Is More Accurate Than Partially Guided Surgery.

    Science.gov (United States)

    Bencharit, Sompop; Staffen, Adam; Yeung, Matthew; Whitley, Daniel; Laskin, Daniel M; Deeb, George R

    2018-02-21

    Desktop stereolithographic printers combined with intraoral scanning and implant planning software promise precise and cost-effective guided implant surgery. The purpose of the present study was to determine the overall range of accuracy of tooth-supported guided implant surgery using desktop printed stereolithographic guides. A cross-sectional study comparing fully and partially guided implant surgery was conducted. Preoperative cone beam computed tomography (CBCT) and intraoral scans were used to plan the implant sites. Surgical guides were then fabricated using a desktop stereolithographic 3-dimensional printer. Postoperative CBCT was used to evaluate the accuracy of placement. Deviations from the planned positions were used as the primary outcome variables. The planning software used, implant systems, and anterior/posterior positions were the secondary outcome variables. The differences between the planned and actual implant positions in the mesial, distal, buccal, and lingual dimensions and buccolingual angulations were determined, and the accuracy was compared statistically using the 1-tail F-test (P = .01), box plots, and 95% confidence intervals for the mean. Sixteen partially edentulous patients requiring placement of 31 implants were included in the present study. The implant deviations from the planned positions for mesial, distal, buccal, and lingual dimensions and buccolingual angulations with the fully guided protocol (n = 20) were 0.17 ± 0.78 mm, 0.44 ± 0.78 mm, 0.23 ± 1.08 mm, -0.22 ± 1.44 mm, and -0.32° ± 2.36°, respectively. The corresponding implant deviations for the partially guided protocol (n = 11) were 0.33 ± 1.38 mm, -0.03 ± 1.59 mm, 0.62 ± 1.15 mm, -0.27 ± 1.61 mm, and 0.59° ± 6.83°. The difference between the variances for fully and partially guided surgery for the distal and angulation dimensions was statistically significant (P = .006 for the distal dimension). Fully guided implant surgery is more accurate than partially guided surgery.
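
    For readers unfamiliar with the variance comparison used here, the sketch below shows a one-tailed F-test of the kind the abstract describes, implemented with SciPy. The deviation values are made-up placeholders, not the study's measurements.

    ```python
    # Hedged sketch: one-tailed F-test comparing the variance of placement
    # deviations under two protocols. Sample values are placeholders.
    import numpy as np
    from scipy import stats

    fully_guided = np.array([0.2, -0.1, 0.5, 0.3, -0.4, 0.1])       # placeholder deviations (mm)
    partially_guided = np.array([0.9, -1.2, 1.5, -0.8, 1.1, -1.6])  # placeholder deviations (mm)

    def one_tailed_f_test(a, b):
        """Test H1: var(b) > var(a). Returns the F statistic and one-tailed p-value."""
        var_a = np.var(a, ddof=1)
        var_b = np.var(b, ddof=1)
        f_stat = var_b / var_a
        dfn, dfd = len(b) - 1, len(a) - 1
        p_value = stats.f.sf(f_stat, dfn, dfd)     # upper-tail probability
        return f_stat, p_value

    f_stat, p = one_tailed_f_test(fully_guided, partially_guided)
    print(f"F = {f_stat:.2f}, one-tailed p = {p:.4f}")
    ```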

  13. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware would make it easy to adopt a "user-requirements-oriented" development methodology instead of the traditional "specific-function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.

  14. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a pr...... in building supporting infrastructure for GSE, and describe a proof of concept prototype.

  15. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  16. Using Desktop Publishing To Enhance the "Writing Process."

    Science.gov (United States)

    Millman, Patricia G.; Clark, Margaret P.

    1997-01-01

    Describes the development of an instructional technology course at Fairmont State College (West Virginia) for education majors that included a teaching module combining steps of the writing process to provide for the interdisciplinary focus of writing across the curriculum. Discusses desktop publishing, the National Writing Project, and student…

  17. Cryogenic process simulation

    International Nuclear Information System (INIS)

    Panek, J.; Johnson, S.

    1994-01-01

    Combining accurate fluid property databases with a commercial equation-solving software package running on a desktop computer allows simulation of cryogenic processes without extensive computer programming. Computer simulation can be a powerful tool for process development or optimization. Most engineering simulations to date have required extensive programming skills in languages such as Fortran, Pascal, etc. Authors of simulation code have also usually been responsible for choosing and writing the particular solution algorithm. This paper describes a method of simulating cryogenic processes with a commercial software package on a desktop personal computer that does not require these traditional programming tasks. Applications include modeling of cryogenic refrigerators, heat exchangers, vapor-cooled power leads, vapor pressure thermometers, and various other engineering problems
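
    As a minimal illustration of the equation-solving approach described above, the sketch below solves the steady-state balances of a counterflow heat exchanger with a general-purpose root finder. The constant heat-capacity rates and numerical values are illustrative assumptions, not the paper's models or data.

    ```python
    # Hedged sketch: steady-state counterflow heat exchanger solved by equation
    # solving, analogous to using a desktop equation-solving package. Values are
    # illustrative placeholders; the initial guess must keep both temperature
    # differences positive so the log-mean term is defined.
    import numpy as np
    from scipy.optimize import fsolve

    Th_in, Tc_in = 300.0, 80.0      # inlet temperatures [K]
    C_h, C_c = 100.0, 120.0         # stream heat-capacity rates m*cp [W/K]
    UA = 200.0                      # overall conductance [W/K]

    def residuals(x):
        Th_out, Tc_out = x
        q_hot = C_h * (Th_in - Th_out)              # duty from the hot-side balance
        q_cold = C_c * (Tc_out - Tc_in)             # duty from the cold-side balance
        dT1, dT2 = Th_in - Tc_out, Th_out - Tc_in
        lmtd = (dT1 - dT2) / np.log(dT1 / dT2)      # log-mean temperature difference
        q_rate = UA * lmtd                          # duty from the rate equation
        return [q_hot - q_cold, q_hot - q_rate]

    Th_out, Tc_out = fsolve(residuals, x0=[150.0, 210.0])
    print(f"Th_out = {Th_out:.1f} K, Tc_out = {Tc_out:.1f} K")
    ```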

  18. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  19. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder.

    Directory of Open Access Journals (Sweden)

    Alexander Toet

    Full Text Available Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similar to those of its real world counterpart. Previous studies have shown that factors like simulated lighting, sound and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool to tune the affective appraisal of desktop VEs. This study investigated if exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder. Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air or subliminal levels of unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety related concerns and associated affective feelings. Signs of crime in the desktop VE were associated with negative affective feelings and concerns for personal safety and personal property. However, there was no significant difference between reported safety related concerns and affective connotations in the control (no-odor) and in each of the two ambient odor conditions. Ambient odor did not affect safety related concerns and affective connotations associated with signs of disorder in the desktop VE. Thus, semantic congruency between ambient odor and a desktop VE may not be sufficient to influence its affective appraisal, and a more realistic simulation in which simulated objects appear to emit scents may be required to achieve this goal.

  20. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  1. Versatile Desktop Experiment Module (DEMo) on Heat Transfer

    Science.gov (United States)

    Minerick, Adrienne R.

    2010-01-01

    This paper outlines a new Desktop Experiment Module (DEMo) engineered for a chemical engineering junior-level Heat Transfer course. This new DEMo learning tool is versatile, fairly inexpensive, and portable such that it can be positioned on student desks throughout a classroom. The DEMo system can illustrate conduction of various materials,…

  2. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  3. CLOUD EDUCATIONAL RESOURCES FOR PHYSICS LEARNING RESEARCHES SUPPORT

    Directory of Open Access Journals (Sweden)

    Oleksandr V. Merzlykin

    2015-10-01

    Full Text Available The definition of a cloud educational resource is given in the paper. Its program and information components are characterized. Virtualization is reviewed as the technological ground for transforming traditional electronic educational resources into cloud ones. The following levels of virtualization are described: data storage device virtualization (Data as Service), hardware virtualization (Hardware as Service), computer virtualization (Infrastructure as Service), software system virtualization (Platform as Service), «desktop» virtualization (Desktop as Service), and software user interface virtualization (Software as Service). Possibilities for designing a cloud educational resources system to support physics learning research are shown, taking into account standards for learning object metadata (accessed via the OAI-PMH protocol) and standards for learning tools interoperability (LTI). An example of integrating cloud educational resources into the Moodle learning management system using OAI-PMH and LTI is given.
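
    As an example of the metadata standard mentioned above, the sketch below harvests records from an OAI-PMH endpoint over plain HTTP and prints Dublin Core titles. The repository URL is a hypothetical placeholder; any OAI-PMH-compliant repository should respond to the same verbs.

    ```python
    # Hedged sketch: harvesting learning-object metadata over OAI-PMH.
    # The endpoint URL is a placeholder.
    import requests
    import xml.etree.ElementTree as ET

    ENDPOINT = "https://repository.example.org/oai"          # hypothetical OAI-PMH endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    resp = requests.get(ENDPOINT, params={"verb": "ListRecords", "metadataPrefix": "oai_dc"}, timeout=30)
    resp.raise_for_status()

    root = ET.fromstring(resp.content)
    for record in root.iter(f"{OAI}record"):
        title = record.find(f".//{DC}title")
        identifier = record.find(f"{OAI}header/{OAI}identifier")
        if title is not None and identifier is not None:
            print(identifier.text, "-", title.text)
    ```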

  4. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, J. B.

    2016-01-01

    It is essential for utilities to possess a full-scope simulator for operator training and operational testing. However, such a simulator is very expensive and can lack fidelity if the simulator is developed in parallel with the plant design, since the simulator development stage sometimes precedes the plant design stage and modifications to the plant design may occur during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed; this model is described herein. Using desktop simulators for training operators is an efficient method for familiarizing operators with their plant’s operation. A low-cost and efficient desktop simulator for the APR1400 has been developed, and its main features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant’s operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications before implementing them in the plant.

  5. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    It is essential for utilities to possess a full-scope simulator for operator training and operational testing. However, such a simulator is very expensive and can lack fidelity if the simulator is developed in parallel with the plant design, since the simulator development stage sometimes precedes the plant design stage and modifications to the plant design may occur during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed; this model is described herein. Using desktop simulators for training operators is an efficient method for familiarizing operators with their plant’s operation. A low-cost and efficient desktop simulator for the APR1400 has been developed, and its main features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant’s operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications before implementing them in the plant.

  6. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    Science.gov (United States)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of generation. The application is available at the project home page http://hdx.florida.scripps.edu.

  7. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    Cloud virtualization technologies are becoming more and more prevalent, and cloud users often face the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access terminals running in our cloud platform from anywhere. We implemented a sketch of web interfac...

  8. Desktop Publishing in the University: Current Progress, Future Visions.

    Science.gov (United States)

    Smith, Thomas W.

    1989-01-01

    Discussion of the workflow involved in desktop publishing focuses on experiences at the College of Engineering at the University of Wisconsin at Madison. Highlights include cost savings and productivity gains in page layout and composition; editing, translation, and revision issues; printing and distribution; and benefits to the reader. (LRW)

  9. Applications and a three-dimensional desktop environment for an immersive virtual reality system

    International Nuclear Information System (INIS)

    Kageyama, Akira; Masada, Youhei

    2013-01-01

    We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes ''teleportation'' into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE

  10. Open Source Software The Challenge Ahead

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    The open source community has done amazingly well in terms of challenging the historical epicenter of computing - the supercomputer and data center - and driving change there. Linux now represents a healthy and growing share of infrastructure in large organisations globally. Apache and other infrastructural components have established the new de facto standard for software in the back office: freedom. It would be easy to declare victory. But the real challenge lies ahead - taking free software to the mass market, to your grandparents, to your nieces and nephews, to your friends. This is the next wave, and if we are to be successful we need to articulate the audacious goals clearly and loudly - because that's how the community process works best. Speaker Bio: Mark Shuttleworth founded the Ubuntu Project in early 2004. Ubuntu is an enterprise Linux distribution that is freely available worldwide and has both desktop and enterprise server editions. Mark studied finance and information technology at the Universit...

  11. GPS positioning and desktop mapping. Applications to environmental monitoring. Report on task JNT B898 on the Finnish support programme to IAEA safeguards

    International Nuclear Information System (INIS)

    Kansanaho, A.; Ilander, T.; Toivonen, H.

    1995-10-01

    Satellite navigation has been used for in-field applications by the Finnish Centre for Radiation and Nuclear Safety since 1993. Because of this experience, training in the use of GPS positioning and desktop mapping was chosen as a task under the Finnish Support programme to IAEA safeguards. A lecture and a field experiment were held in the training course on environmental monitoring at the IAEA headquarters in June 1995. Real-time mapping of the co-ordinates and storing information on sampling sites and procedures can make safeguards implementation more efficient and effective. Further software development is needed for these purposes. (author) (6 figs.)

  12. A Real-World Project for a Desktop Publishing Course.

    Science.gov (United States)

    Marsden, James D.

    1994-01-01

    Describes a project in a desktop publishing course in which students work with nonprofit and campus organizations to design brochures that fulfill important needs. Discusses specific tools students use. Describes the brochure project, project criteria, clients, text and graphics for the project, how to evaluate the project, and guidelines for…

  13. The control software framework of the web base

    International Nuclear Information System (INIS)

    Nakatani, Takeshi; Inamura, Yasuhiro; Ito, Takayoshi; Otomo, Toshiya

    2015-01-01

    Web browsers are one of the most platform-independent user interfaces. In particular, web pages created using responsive web design (RWD) are available for use on desktop and laptop computers, as well as tablet terminals and smart phones. We developed a common software framework, IROHA, for the instrument control system in the Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex to build a flexible and scalable system by adopting XML/HTTP. However, its user interface was platform-dependent, and we wanted it to be more user-friendly. In 2013, we developed the prototype of a new software framework, IROHA2, comprising several device control servers and an instrument management server, retaining the flexibility and scalability of IROHA. We also adopted the Bootstrap framework to create an RWD user interface for these servers. (author)

  14. Designing for Communication: The Key to Successful Desktop Publishing.

    Science.gov (United States)

    McCain, Ted D. E.

    Written for those who are new to design and page layout, this book focuses on providing novice desktop publishers with an understanding of communication, graphic design, typography, page layout, and page layout techniques. The book also discusses how people read, design as a consequence of understanding, and the principles of page layout. Chapters…

  15. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    Science.gov (United States)

    Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-03-01

    3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become the tools for technological advancements. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side, capable of producing excellent quality products, and the latter being on the low-cost side with moderate quality results. However, there is far more room for improvements and enhancements in the desktop systems compared to the industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. The communication between the two extruders has been established by making use of the In-Circuit Serial Programming (ICSP) port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electronic paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer provides new prospects for producing multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.

  16. What Desktop Publishing Can Teach Professional Writing Students about Publishing.

    Science.gov (United States)

    Dobberstein, Michael

    1992-01-01

    Points out that desktop publishing is a metatechnology that allows professional writing students access to the production phase of publishing, giving students hands-on practice in preparing text for printing and in learning how that preparation affects the visual meaning of documents. (SR)

  17. Visual attention for a desktop virtual environment with ambient scent

    NARCIS (Netherlands)

    Toet, A.; Schaik, M.G. van

    2013-01-01

    In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with

  18. Feasibility of Bioprinting with a Modified Desktop 3D Printer.

    Science.gov (United States)

    Goldstein, Todd A; Epstein, Casey J; Schwartz, John; Krush, Alex; Lagalante, Dan J; Mercadante, Kevin P; Zeltsman, David; Smith, Lee P; Grande, Daniel A

    2016-12-01

    Numerous studies have shown the capabilities of three-dimensional (3D) printing for use in the medical industry. At the time of this publication, basic home desktop 3D printer kits can cost as little as $300, whereas medical-specific 3D bioprinters can cost more than $300,000. The purpose of this study is to show how a commercially available desktop 3D printer could be modified to bioprint an engineered poly-l-lactic acid scaffold containing viable chondrocytes in a bioink. Our bioprinter was used to create a living 3D functional tissue-engineered cartilage scaffold. In this article, we detail the design, production, and calibration of this bioprinter. In addition, the bioprinted cells were tested for viability, proliferation, biochemistry, and gene expression; these tests showed that the cells survived the printing process, were able to continue dividing, and produce the extracellular matrix expected of chondrocytes.

  19. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop personal PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: The tremendous expansion of PC software and hardware capability of the 90s and the onset of a risk-informed regulation era.

  20. Interfacing of thermal ionization mass spectrometer with PC/XT and related software development

    International Nuclear Information System (INIS)

    Moorthy, A.D.; Gurba, P.B.; Rajendrakumar; Singh, R.K.; Bajpai, D.D.; Coelho, G.J.M.; Das, K.V.; Indurkar, V.S.

    1992-01-01

    A completely automated Thermal Ionization Mass Spectrometer (TIMS) is used in the Power Reactor Fuel Reprocessing Plant (PREFRE), Tarapur, for precise and accurate measurement of the isotopic composition and concentration of special nuclear materials (Uranium and Plutonium) for the purpose of input accounting of the plant. It is provided with one Hewlett-Packard 9845B desktop computer to control various instrument parameters and perform automatic analysis of 13 samples in sequence. The computer gave fairly good service for six years with intermittent minor maintenance before it developed major problems. In view of the fact that its repair and maintenance cost is several times the cost of a locally available computer, it was decided to replace the imported Hewlett-Packard 9845B desktop computer with a PC/XT. This report describes the interfacing of the TIMS with the PC/XT and the related software development. (author). 3 refs., 8 figs., 2 annexures

  1. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  2. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  3. Emotional reactions of different interface formats: Comparing digital and traditional board games

    Directory of Open Access Journals (Sweden)

    Yu-Min Fang

    2016-03-01

    Full Text Available Some games are offered in the market as both traditional board games and digital versions at the same time. Why has the rise of virtual games not forced traditional physical board games to disappear? Do traditional physical games evoke different emotional reactions and interpersonal relationships? This article explored subjects' preferences toward traditional and digital versions of the same game and investigated social interaction while playing games. Based on Norman's three emotional design levels—visceral, behavioral, and reflective—this study examined players' degree of satisfaction. The study also applied the Positive and Negative Affect Schedule to measure subjects' emotional reactions. Monopoly and Jenga were selected as stimuli. A total of 77 subjects were tested with three different interface formats (physical, desktop, and tablet) and then filled out the questionnaire. The findings evidenced significant differences between digital and traditional board games. The statistical results indicated that the satisfaction degree for digital games declined at the visceral, behavioral, and reflective levels. Traditional games not only evoked stronger emotional reactions in users but also received higher preference ratings. Traditional games could improve interpersonal relationships as well.

  4. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    Science.gov (United States)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and the development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a simplified two-dimensional representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and therefore allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  5. Agile Software Development in the Department of Defense Environment

    Science.gov (United States)

    2017-03-31

    traditional project/program life cycle (i.e., waterfall). In the traditional model, security requirements are not evaluated until development is...2015), which may better facilitate adoption of Agile software development in the DoD. Several models are provided for software-dominant and software...the DoD has historically used a traditional, waterfall approach for acquiring systems and services), and oversight requirements that are

  6. Multimodal Desktop Interaction: The Face –Object-Gesture–Voice Example

    DEFF Research Database (Denmark)

    Vidakis, Nikolas; Vlasopoulos, Anastasios; Kounalakis, Tsampikos

    2013-01-01

    This paper presents a natural user interface system based on multimodal human computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system which gives users the ability to interact with desktop...

  7. Negotiation of Meaning in Desktop Videoconferencing-Supported Distance Language Learning

    Science.gov (United States)

    Wang, Yuping

    2006-01-01

    The aim of this research is to reveal the dynamics of focus on form in task completion via videoconferencing. This examination draws on current second language learning theories regarding effective language acquisition, research in Computer Mediated Communication (CMC) and empirical data from an evaluation of desktop videoconferencing-supported…

  8. A Desktop Publishing Course: An Alternative to Internships for Rural Universities.

    Science.gov (United States)

    Flammia, Madelyn

    1992-01-01

    Suggests that a course in desktop publishing can provide students at rural schools with experience equivalent to internships. Notes that the course provided students with real-world experience and benefited the university in terms of services and public relations. (RS)

  9. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  10. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using a desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between ages 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task by using each of the 3 computers. Measurements during the typing task were taken at set intervals. Cervical and thoracic spines adopted a more flexed posture in using the smaller-sized computers. There were significantly greater neck movements in using desktop computers when compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and potential risk of developing musculoskeletal discomfort in choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  11. GTfold: Enabling parallel RNA secondary structure prediction on multi-core desktops

    DEFF Research Database (Denmark)

    Swenson, M Shel; Anderson, Joshua; Ash, Andrew

    2012-01-01

    achieved significant improvements in runtime, but their implementations were not portable from niche high-performance computers or easily accessible to most RNA researchers. With the increasing prevalence of multi-core desktop machines, a new parallel prediction program is needed to take full advantage...

  12. Tableau your data! fast and easy visual analysis with Tableau software

    CERN Document Server

    Murray, Dan

    2013-01-01

    Best practices and step-by-step instructions for using the Tableau Software toolset. Although the Tableau Desktop interface is relatively intuitive, this book goes beyond the simple mechanics of the interface to show best practices for creating effective visualizations for specific business intelligence objectives. It illustrates little-known features and techniques for getting the most from the Tableau toolset, supporting the needs of the business analysts who use the product as well as the data and IT managers who support it. This comprehensive guide covers the core feature set for data anal

  13. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed...... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS)....

  14. Desktop publishing and validation of custom near visual acuity charts.

    Science.gov (United States)

    Marran, Lynn; Liu, Lei; Lau, George

    2008-11-01

    Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.

  15. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    Science.gov (United States)

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  16. Software Considerations for Subscale Flight Testing of Experimental Control Laws

    Science.gov (United States)

    Murch, Austin M.; Cox, David E.; Cunningham, Kevin

    2009-01-01

    The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic coding a common code-base is used for desktop analysis, piloted simulation and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.

  17. SERVICE HANDBOOK FOR THE DESKTOP SUPPORT CONTRACT WITH IT DIVISION

    CERN Multimedia

    2000-01-01

    A Desktop Support Contract has been running since January 1999 to offer help to all users at CERN with problems that occur with their desktop computers. The contract is run jointly by the Swedish company WM-data and the Swiss company DCS. The contract comprises the Computing Helpdesk, a General Service for all parts of CERN, and also a Local Service for those divisions and groups that want faster response times and additional help with their specific computer environment. In order to describe what services are being offered, and also to give a better understanding of the structure of the contract, a Service Handbook has been created. The intended audience for the Service Handbook is everyone using the contract, i.e. users, managers and also the service staff inside the contract. In the handbook you will find what help you can get from the contract, how to get in touch with the contract, and also what response times you can expect. Since the computer environment at CERN is an ever-changing entity, ...

  18. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Science.gov (United States)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization both in cloud computing and in desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study the optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.

  19. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Directory of Open Access Journals (Sweden)

    Chakravarthy Srinivas R.

    2018-03-01

    Full Text Available Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization both in cloud computing and in desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study the optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
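
    The latency benefit of replication that motivates the two records above can be illustrated with a toy simulation. The sketch below is only a minimal Monte Carlo illustration, not the paper's MAP/G/c-type queueing model: it assumes independent exponential service times and cancel-on-first-completion replication, and it ignores arrivals and queueing entirely.

    ```python
    # Toy Monte Carlo sketch of request replication: each work unit is sent to
    # `replicas` independent servers and completes as soon as the fastest copy
    # finishes. Service times are assumed i.i.d. exponential; queueing delay and
    # correlated (MAP) arrivals from the paper are deliberately ignored.
    import random

    def mean_latency(replicas: int, rate: float = 1.0, trials: int = 100_000) -> float:
        """Average completion time when the fastest of `replicas` copies wins."""
        total = 0.0
        for _ in range(trials):
            total += min(random.expovariate(rate) for _ in range(replicas))
        return total / trials

    if __name__ == "__main__":
        for r in (1, 2, 3, 4):
            print(f"replication level {r}: mean latency ~ {mean_latency(r):.3f}")
    ```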

  20. Towards E-CASE Tools for Software Engineering

    OpenAIRE

    Nabil Arman

    2013-01-01

    CASE tools play an important role in all phases of software systems development and engineering. This is evident in the huge benefits obtained from using these tools, including their cost-effectiveness, rapid software application development, and improved possibilities for software reuse, to name just a few. In this paper, the idea of moving towards E-CASE tools, rather than traditional CASE tools, is advocated, since these E-CASE tools have all the benefits and advantages of traditional...

  1. A flexible and reusable software for real-time control applications at JET

    International Nuclear Information System (INIS)

    De Tommasi, G.; Piccolo, F.; Sartori, F.

    2005-01-01

    The fast growth of the JET real-time control network and the increasing demand for new systems have been the triggers that started the development of the JETRT software framework. This new architecture is designed for maximum reuse and is particularly suited for implementation of both real-time control and data acquisition systems in a complex experimental environment such as JET. Most of the software is the same in all applications, independent of the platform. The varying part is the project-specific algorithm, which is compiled into a separate software component in order to achieve a separation from the plant interface code. This design choice maximises the software reliability, reduces development costs and allows non-specialist programmers to contribute to the implementation of real-time projects. JETRT also provides an integrated set of debugging and testing tools, some of them well integrated with the Matlab environment. This feature, together with the framework's portability among different platforms, allows most of the test and validation phase to be performed on a desktop PC running Windows, significantly reducing the commissioning time of a new real-time system.
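
    As a rough illustration of the separation described above (a generic sketch, not JETRT code; every name here is hypothetical), the framework can own a fixed plant I/O loop while the project-specific algorithm is a separate, swappable component behind a small interface:

    ```python
    # Generic sketch (not JETRT code): the framework owns the cyclic plant I/O
    # loop, while the project-specific algorithm is a separate component behind a
    # fixed interface, so it can be swapped without touching the interface code.
    from abc import ABC, abstractmethod

    class ControlAlgorithm(ABC):
        @abstractmethod
        def calculate(self, measurements: dict) -> dict:
            """Map one cycle of plant measurements to actuator demands."""

    class ProportionalCurrentControl(ControlAlgorithm):
        """Hypothetical project-specific algorithm."""
        def calculate(self, measurements):
            error = measurements["current_ref"] - measurements["current"]
            return {"voltage_demand": 0.8 * error}

    def run_cycles(algorithm: ControlAlgorithm, read_plant, write_plant, cycles: int):
        # Framework code: identical for every project, independent of the algorithm.
        for _ in range(cycles):
            write_plant(algorithm.calculate(read_plant()))

    if __name__ == "__main__":
        samples = iter([{"current_ref": 1.0, "current": 0.2 * i} for i in range(5)])
        run_cycles(ProportionalCurrentControl(), lambda: next(samples), print, cycles=5)
    ```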

  2. Design and validation of a 3D virtual reality desktop system for sonographic length and volume measurements in early pregnancy evaluation.

    Science.gov (United States)

    Baken, Leonie; van Gruting, Isabelle M A; Steegers, Eric A P; van der Spek, Peter J; Exalto, Niek; Koning, Anton H J

    2015-03-01

    To design and validate a desktop virtual reality (VR) system, for presentation and assessment of volumetric data, based on commercially off-the-shelf hardware as an alternative to a fully immersive CAVE-like I-Space VR system. We designed a desktop VR system, using a three-dimensional (3D) monitor and a six degrees-of-freedom tracking system. A personal computer uses the V-Scope (Erasmus MC, Rotterdam, The Netherlands) volume-rendering application, developed for the I-Space, to create a hologram of volumetric data. Inter- and intraobserver reliability for crown-rump length and embryonic volume measurements are investigated using Bland-Altman plots and intraclass correlation coefficients. Time required for the measurements was recorded. Comparing the I-Space and the desktop VR system, the mean difference for crown-rump length is -0.34% (limits of agreement -2.58-1.89, ±2.24%) and for embryonic volume -0.92% (limits of agreement -6.97-5.13, ±6.05%). Intra- and interobserver intraclass correlation coefficients of the desktop VR system were all >0.99. Measurement times were longer on the desktop VR system compared with the I-Space, but the differences were not statistically significant. A user-friendly desktop VR system can be put together using commercially off-the-shelf hardware at an acceptable price. This system provides a valid and reliable method for embryonic length and volume measurements and can be used in clinical practice. © 2014 Wiley Periodicals, Inc.

  3. CaveMan Enterprise version 1.0 Software Validation and Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, David

    2014-10-01

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.

  4. On the Current Measurement Practices in Agile Software Development

    OpenAIRE

    Javdani, Taghi; Zulzalil, Hazura; Ghani, Abdul Azim Abd; Sultan, Abu Bakar Md; Parizi, Reza Meimandi

    2013-01-01

    Agile software development (ASD) methods were introduced as a reaction to traditional software development methods. Their principles differ from those of traditional methods, and so agile methods involve some different processes and activities compared with traditional methods. Thus ASD methods require different measurement practices compared with traditional methods. Agile teams often do their projects in the simplest and most effective way, so measurement practices in agile metho...

  5. Micro Tools with Pneumatic Actuators for Desktop Factories

    Directory of Open Access Journals (Sweden)

    Björn HOXHOLD

    2009-10-01

    Full Text Available This paper presents the design, the simulation and the fabrication process of two novel pneumatically driven auxiliary micro tools that can be used to improve and to speed up assembling processes in desktop factories. The described micro systems are designed to function as centrifugal feeders for small glass balls or active clamping devices with small external dimensions. They are able to deliver more than six balls per second on demand to a gripper and align and clamp single chips in a fixed position.

  6. Utilization of Reference Management Software (RMS) for Writing Scientific Papers in Higher Education (Pemanfaatan Reference Management Software untuk Penyusunan Karya Ilmiah di Perguruan Tinggi)

    Directory of Open Access Journals (Sweden)

    mufid mufid

    2014-07-01

    Full Text Available Reference management software (RMS) is a useful application for researchers, lecturers and students, helping them to compose scientific papers. This software is used to search for scientific information online, to save search results, and to write bibliographies, including generating citations and references automatically; it also facilitates sharing references with other users and synchronizing references via the web between desktop and mobile devices. This article introduces RMS. Using RMS, errors in writing citations and references can be avoided, and the publication of scientific works can be increased and of higher quality.

  7. VRLane: a desktop virtual safety management program for underground coal mine

    Science.gov (United States)

    Li, Mei; Chen, Jingzhu; Xiong, Wei; Zhang, Pengpeng; Wu, Daozheng

    2008-10-01

    VR technologies, which generate immersive, interactive, three-dimensional (3D) environments, are seldom applied to coal mine safety management work. In this paper, a new method that combines VR technologies with an underground mine safety management system is explored. A desktop virtual safety management program for underground coal mines, called VRLane, was developed. The paper mainly concerns the current research advances in VR, the system design, key techniques, and system applications. Two important techniques are introduced in the paper. Firstly, an algorithm was designed and implemented with which the 3D laneway models and equipment models can be built automatically on the basis of the latest 2D mine drawings, whereas common VR programs establish the 3D environment with 3DS Max or other 3D modeling software packages, in which laneway models are built manually and laboriously. Secondly, VRLane realizes system integration with underground industrial automation. VRLane not only presents a realistic 3D laneway environment but also describes the status of coal mining, with functions for displaying the running states and related parameters of equipment, pre-alarming abnormal mining events, and animating mine cars, mine workers, and long-wall shearers. The system, which is cheap, dynamic, and easy to maintain, provides a useful tool for production safety management in coal mines.

  8. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using a standard TCP-IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024x768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images and scintigraphic images.

  9. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio in cooperation with its Modeling, Analysis, and Prediction program intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also to non-typical users who may want to use the models, such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to high performance computing (HPC) accounts, from which the models are implemented, can be restrictive, with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of using desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on-demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  10. Building an intelligence-led security program

    CERN Document Server

    Liska, Allan

    2014-01-01

    As recently as five years ago, securing a network meant putting in a firewall, intrusion detection system, and installing antivirus software on the desktop. Unfortunately, attackers have grown more nimble and effective, meaning that traditional security programs are no longer effective. Today's effective cyber security programs take these best practices and overlay them with intelligence. Adding cyber threat intelligence can help security teams uncover events not detected by traditional security platforms and correlate seemingly disparate events across the network. Properly-implemented inte

  11. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.

    2012-12-01

    Despite the ongoing efforts in turbulence research, the universal properties of the turbulence small-scale structure and the relationships between small- and large-scale turbulent motions are not yet fully understood. The visually guided exploration of turbulence features, including the interactive selection and simultaneous visualization of multiple features, can further progress our understanding of turbulence. Accomplishing this task for flow fields in which the full turbulence spectrum is well resolved is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence visualization that works on a compressed flow field representation. We use a wavelet-based compression scheme including run-length and entropy encoding, which can be decoded on the GPU and embedded into brick-based volume ray-casting. This enables a drastic reduction of the data to be streamed from disk to GPU memory. Our system derives turbulence properties directly from the velocity gradient tensor, and it either renders these properties in turn or generates and renders scalar feature volumes. The quality and efficiency of the system is demonstrated in the visualization of two unsteady turbulence simulations, each comprising a spatio-temporal resolution of 1024^4. On a desktop computer, the system can visualize each time step in 5 seconds, and it achieves about three times this rate for the visualization of a scalar feature volume. © 1995-2012 IEEE.

  12. Developing open source, self-contained disease surveillance software applications for use in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Campbell Timothy C

    2012-09-01

    Full Text Available Abstract Background Emerging public health threats often originate in resource-limited countries. In recognition of this fact, the World Health Organization issued revised International Health Regulations in 2005, which call for significantly increased reporting and response capabilities for all signatory nations. Electronic biosurveillance systems can improve the timeliness of public health data collection, aid in the early detection of and response to disease outbreaks, and enhance situational awareness. Methods As components of its Suite for Automated Global bioSurveillance (SAGES) program, The Johns Hopkins University Applied Physics Laboratory developed two open-source, electronic biosurveillance systems for use in resource-limited settings. OpenESSENCE provides web-based data entry, analysis, and reporting. ESSENCE Desktop Edition provides similar capabilities for settings without internet access. Both systems may be configured to collect data using locally available cell phone technologies. Results ESSENCE Desktop Edition has been deployed for two years in the Republic of the Philippines. Local health clinics have rapidly adopted the new technology to provide daily reporting, thus eliminating the two-to-three week data lag of the previous paper-based system. Conclusions OpenESSENCE and ESSENCE Desktop Edition are two open-source software products with the capability of significantly improving disease surveillance in a wide range of resource-limited settings. These products, and other emerging surveillance technologies, can assist resource-limited countries in complying with the revised International Health Regulations.

  13. Desk-top computer assisted processing of thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Archer, B.R.; Glaze, S.A.; North, L.B.; Bushong, S.C.

    1977-01-01

    An accurate dosimetric system utilizing a desk-top computer and high-sensitivity ribbon-type TLDs has been developed. The system incorporates an exposure history file and procedures designed for constant spatial orientation of each dosimeter. Processing of information is performed by two computer programs. The first calculates relative response factors to ensure that the corrected response of each TLD is identical following a given dose of radiation. The second program computes a calibration factor and uses it and the relative response factor to determine the actual dose registered by each TLD. (U.K.)
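
    The two-step scheme described above can be illustrated with a short sketch; the readings and the calibration factor below are made up for illustration, and the original desk-top programs are not reproduced here.

    ```python
    # Illustrative sketch (not the original programs) of the two-step scheme:
    # relative response factors from a uniform calibration exposure, then dose
    # from a calibration factor. All numbers below are hypothetical.
    calibration_readings = [10.2, 9.8, 10.5, 9.9]      # same known dose to every TLD
    mean_reading = sum(calibration_readings) / len(calibration_readings)

    # Step 1: relative response factors make the corrected responses identical.
    relative_factors = [mean_reading / r for r in calibration_readings]

    # Step 2: convert corrected field readings to dose with a calibration factor.
    calibration_factor = 0.52                           # hypothetical, mGy per reader unit
    field_readings = [7.4, 8.1, 6.9, 7.7]
    doses = [r * f * calibration_factor for r, f in zip(field_readings, relative_factors)]

    print([f"{d:.2f} mGy" for d in doses])
    ```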

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  15. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include geometric and densitometric volumes and ejection fraction calculation from radionuclide and cine-angiograms, Fourier analysis of cardiac wall motion, vascular stenosis measurement, color coded parametric display of regional flow distribution from dynamic coronary angiograms, and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  16. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  17. Indiana Humanities Council Request for the Indianapolis Energy Conversion Inst. For Phase I of the Indianapolis Energy Conservation Res Initiative also called the smartDESKTOP Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Keller, John B.

    2007-12-06

    The smartDESKTOP Initiative at the Indiana Humanities Council received critical support in building and delivering a digital desktop for Indiana educators through the Department of Energy Grant DE-FG02-06ER64282. During the project period September 2006 through October of 2007, the number of Indiana educators with accounts on the smartDESKTOP more than tripled from under 2,000 to more than 7,000 accounts. An external review of the project conducted for the purposes of understanding the impact of the service in Indiana schools revealed that the majority of respondents felt that using the smartDESKTOP did reduce the time they spent managing paper. The same study revealed the challenges of implementing a digital desktop meant to help teachers leverage technology to improve their teaching and ultimately student learning. The most significant outcome of this project is that the Indiana Department of Education expressed interest in assuming responsibility for sustaining this project. The transition of the smartDESKTOP to the Indiana Department of Education was effective on November 1, 2007.

  18. Development of research activity supporting system. 2. Evaluation of desktop video conferencing; Kenkyu katsudo shien system no kaihatsu. 2. Enkaku kijo uchiawase hoshiki no jitsuyosei hyoka

    Energy Technology Data Exchange (ETDEWEB)

    Tsutsumi, F.; Hishitani, J.; Matsui, S. [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    1996-03-01

    A desktop video conferencing (DVC) system and its software were investigated, and a test system was actually constructed. Based on the test results, its practical advantages and disadvantages were illustrated. The DVC system was built on a general-purpose LAN, and operational experiments on research activities were conducted. From interviews with participants and analysis of the conversation history, the advantages and disadvantages of the system were clarified. The performance of the trial system was examined on a practical LAN and compared with that of commercial systems; the same problems arose when using the LAN. It was found that the DVC system cannot substitute for conventional face-to-face meetings, but it is suitable for some communication styles that have been difficult until now, for example meetings with free participation or collaboration toward a shared objective. In addition, trial software was developed with which meeting reservation, announcement, and calling can be carried out. 26 refs., 6 figs., 5 tabs.

  19. Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts

    Science.gov (United States)

    Jacobson, Jeffery

    2013-01-01

    In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…

  20. Aerospace Software Engineering for Advanced Systems Architectures (L’Ingenierie des Logiciels Pour les Architectures des Systemes Aerospatiaux)

    Science.gov (United States)

    1993-11-01

    DeskTop Publishing package Framemaker. BSW with the emphasis on the properly modelling of the ASW/BSW...Framemaker is an integrated text and graphics automated tool. From a software developers' point of view, it...and Framemaker, the text and graphics of the documents can be automatically combined into one...FRAMEMAKER is a registered trademark of Frame Technology...starting from the second project. The vertical tools improve the performance of individuals in

  1. Software needs engineering - a position paper

    OpenAIRE

    GRIMSON, JANE BARCLAY

    2000-01-01

    When the general press refers to `software' in its headlines, this is often not to relate a success story, but to expand on yet another `software-risk-turned-problem' story. For many people, the term `software' evokes the image of an application package running on a PC or in some similar stand-alone setting. Over 70% of all software, however, is not developed in traditional software houses as part of the creation of such packages. Much of this software comes in the fo...

  2. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    Science.gov (United States)

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  3. From Server to Desktop: Capital and Institutional Planning for Client/Server Technology.

    Science.gov (United States)

    Mullig, Richard M.; Frey, Keith W.

    1994-01-01

    Beginning with a request for an enhanced system for decision/strategic planning support, the University of Chicago's biological sciences division has developed a range of administrative client/server tools, instituted a capital replacement plan for desktop technology, and created a planning and staffing approach enabling rapid introduction of new…

  4. Fast and predictable video compression in software design and implementation of an H.261 codec

    Science.gov (United States)

    Geske, Dagmar; Hess, Robert

    1998-09-01

    The use of software codecs for video compression is becoming commonplace in several videoconferencing applications. In order to reduce conflicts with other applications used at the same time, mechanisms for resource reservation on end systems need to determine an upper bound for the computing time used by the codec. This leads to the demand for predictable execution times for compression/decompression. Since compression schemes such as H.261 inherently depend on the motion contained in the video, an adaptive admission control is required. This paper presents a data-driven approach based on dynamically reducing the number of processed macroblocks in peak situations. Beyond this, the absolute speed is also a point of interest. The question of whether and how software compression of high-quality video is feasible on today's desktop computers is examined.
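
    The general idea of data-driven load shedding, reducing the number of processed macroblocks when a time budget is about to be exceeded, can be sketched as follows. This is an illustration only: the paper's actual admission-control policy and the H.261 encoding details are not reproduced, and compress_macroblock is a hypothetical stand-in.

    ```python
    # Toy sketch of data-driven load shedding: encode macroblocks until a fixed
    # per-frame time budget is exhausted, then skip the rest (a real decoder would
    # reuse the co-located blocks of the previous frame). compress_macroblock is a
    # stand-in for the real DCT/quantization work.
    import time

    def compress_macroblock(mb):
        return sum(mb) % 256            # placeholder for real encoding work

    def encode_frame(macroblocks, budget_s=0.030):
        """Encode as many macroblocks as the budget allows; report skipped indices."""
        deadline = time.perf_counter() + budget_s
        encoded, skipped = [], []
        for index, mb in enumerate(macroblocks):
            if time.perf_counter() >= deadline:
                skipped.append(index)
                continue
            encoded.append(compress_macroblock(mb))
        return encoded, skipped

    if __name__ == "__main__":
        frame = [[i % 255] * 256 for i in range(396)]   # 396 macroblocks in a CIF frame
        done, dropped = encode_frame(frame)
        print(f"encoded {len(done)} macroblocks, skipped {len(dropped)}")
    ```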

  5. Software for improved field surveys of nesting marine turtles.

    Science.gov (United States)

    Anastácio, R; Gonzalez, J M; Slater, K; Pereira, M J

    2017-09-07

    Field data are still recorded on paper in many beach surveys of nesting marine turtles worldwide. The data must subsequently be transferred into an electronic database, and this can introduce errors into the dataset. To minimize such errors, the "Turtles" software was developed and piloted to record field data by one software user accompanying one Tortuguero on the Akumal beaches, Quintana Roo, Mexico, from June 1st to July 31st during the night patrols. Comparisons were made between the data exported from the software and the paper forms entered into a database (henceforth, the traditional method). Preliminary assessment indicated that the software user tended to record a greater number of metrics (i.e., an average of 18.3 fields ± 5.4 sd vs. 8.6 fields ± 2.1 sd recorded by the traditional method). The traditional method introduced three types of "errors" into the dataset: missing values in relevant fields (40.1%), different answers for the same value (9.8%), and inconsistent data (0.9%). Only 5.8% of these (missing values) were found with the software methodology. Although only tested by a single user, the software suggests increased efficacy and warrants further examination to accurately assess the merit of replacing traditional methods of data recording for beach monitoring programmes.

  6. Analytical Hierarchy Process for the selection of strategic alternatives for the introduction of virtual desktop infrastructure in the university

    Directory of Open Access Journals (Sweden)

    Katerina A. Makoviy

    2017-12-01

    Full Text Available The task of choosing a strategy for implementing virtual desktop infrastructure within the IT infrastructure of the university is considered. Virtual desktop infrastructure is a technology that centralizes the management of client workplaces and increases the service life of computers in classrooms. The strengths, weaknesses, threats and opportunities of introducing virtualization in the university are analyzed. Implementation alternatives based on the results of a pilot project have been developed. To obtain quantitative estimates in the SWOT analysis of the pilot project, the analytical hierarchy process is used. Expert analysis of the pilot project's implementation is carried out, and an integral value of the quantitative estimates of the various alternatives is generated. The combination of the analytical hierarchy process and SWOT analysis makes it possible to choose the optimal strategy for implementing desktop virtualization.
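
    As a rough illustration of the AHP step, the sketch below derives priority weights for three hypothetical implementation alternatives from a pairwise-comparison matrix using the row geometric-mean approximation; the matrix values are illustrative and are not the authors' data.

    ```python
    # Minimal AHP sketch for three hypothetical implementation alternatives:
    # priority weights from a reciprocal pairwise-comparison matrix via the row
    # geometric-mean approximation, plus a rough consistency index.
    import numpy as np

    # pairwise[i, j] = how strongly alternative i is preferred over j (1-9 scale)
    pairwise = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    geo_means = pairwise.prod(axis=1) ** (1.0 / pairwise.shape[0])
    weights = geo_means / geo_means.sum()

    n = pairwise.shape[0]
    lambda_max = float(np.mean((pairwise @ weights) / weights))
    consistency_index = (lambda_max - n) / (n - 1)

    print("priority weights:", np.round(weights, 3))
    print("consistency index:", round(consistency_index, 3))
    ```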

  7. Application of the Spice virtual desktop based on OpenStack (基于OpenStack的虚拟桌面Spice的应用)

    Institute of Scientific and Technical Information of China (English)

    余康; 张鹏; 唐攀

    2014-01-01

    Aiming at the shortcomings of the VNC virtual desktop currently adopted by the open-source cloud platform OpenStack in practical applications, such as mouse ghosting and the screen's inability to resize adaptively, this paper proposes a new virtual desktop solution in which OpenStack is integrated with Spice, an open-source virtual desktop transport protocol. The solution resolves the shortcomings of the VNC virtual desktop currently used by OpenStack; the Spice protocol has more powerful functions than VNC and can provide a better end-user experience.

  8. Accuracy and efficiency of full-arch digitalization and 3D printing: A comparison between desktop model scanners, an intraoral scanner, a CBCT model scan, and stereolithographic 3D printing.

    Science.gov (United States)

    Wesemann, Christian; Muallah, Jonas; Mah, James; Bumann, Axel

    2017-01-01

    The primary objective of this study was to compare the accuracy and time efficiency of an indirect and direct digitalization workflow with that of a three-dimensional (3D) printer in order to identify the most suitable method for orthodontic use. A master model was measured with a coordinate measuring instrument. The distances measured were the intercanine width, the intermolar width, and the dental arch length. Sixty-four scans were taken with each of the desktop scanners R900 and R700 (3Shape), the intraoral scanner TRIOS Color Pod (3Shape), and the Promax 3D Mid cone beam computed tomography (CBCT) unit (Planmeca). All scans were measured with measuring software. One scan was selected and printed 37 times on the D35 stereolithographic 3D printer (Innovation MediTech). The printed models were measured again using the coordinate measuring instrument. The most accurate results were obtained by the R900. The R700 and the TRIOS intraoral scanner showed comparable results. CBCT-3D-rendering with the Promax 3D Mid CBCT unit revealed significantly higher accuracy with regard to dental casts than dental impressions. 3D printing offered a significantly higher level of deviation than digitalization with desktop scanners or an intraoral scanner. The chairside time required for digital impressions was 27% longer than for conventional impressions. Conventional impressions, model casting, and optional digitization with desktop scanners remains the recommended workflow process. For orthodontic demands, intraoral scanners are a useful alternative for full-arch scans. For prosthodontic use, the scanning scope should be less than one quadrant and three additional teeth.

  9. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorce of SPM from software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  10. Experimental Setup for Ultrasonic-Assisted Desktop Fused Deposition Modeling System

    OpenAIRE

    Maidin, S.; Muhamad, M. K.; Pei, Eujin

    2014-01-01

    Fused deposition modeling (FDM) is an additive manufacturing (AM) process that has been used in various manufacturing fields. However, the drawback of FDM is the poor surface finish of the parts produced, which leads to surface roughness and requires hand finishing. In this study, ultrasonic technology is integrated into a desktop FDM system. Ultrasound has been applied in various conventional machining processes and yields good machined surface finish. However, very little research regarding the applic...

  11. Virtual Reality on a Desktop Hailed as New Tool in Distance Education.

    Science.gov (United States)

    Young, Jeffrey R.

    2000-01-01

    Describes college and university educational applications of desktop virtual reality to provide a more human touch to interactive distance education programs and impress the brain with more vivid images. Critics suggest the technology is too costly and time consuming and may even distract students from the content of an online course. (DB)

  12. 3d visualization of atomistic simulations on every desktop

    Science.gov (United States)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-08-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole room system, we selected an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, and viewed through colored glasses, or two squares of cellophane from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new, 6.1 version, which is RedHat, CentOS and Ubuntu compatible. Examples using data from our own research and that of other groups will be given.

  13. 3d visualization of atomistic simulations on every desktop

    International Nuclear Information System (INIS)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-01-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole room system, we selected an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, and viewed through colored glasses, or two squares of cellophane from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new, 6.1 version, which is RedHat, CentOS and Ubuntu compatible. Examples using data from our own research and that of other groups will be given
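
    The anaglyphic technique described in the two records above can be reproduced with a few lines of image processing: the red channel is taken from the left-eye render and the green and blue channels from the right-eye render. The sketch below is a generic illustration (the file names are placeholders), not AViz code.

    ```python
    # Generic red-cyan anaglyph sketch: red channel from the left-eye render,
    # green and blue channels from the right-eye render. The file names are
    # placeholders; the small horizontal displacement between the two renders is
    # assumed to come from the visualization code itself (two camera positions).
    import numpy as np
    from PIL import Image

    left = np.array(Image.open("left_view.png").convert("RGB"))
    right = np.array(Image.open("right_view.png").convert("RGB"))

    anaglyph = right.copy()
    anaglyph[..., 0] = left[..., 0]   # red from the left eye; green/blue stay from the right

    Image.fromarray(anaglyph).save("anaglyph.png")
    ```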

  14. Development of an automated desktop procedure for defining macro-reaches for river longitudinal profiles

    CSIR Research Space (South Africa)

    Dollar, LH

    2006-07-01

    Full Text Available This paper presents an automated desktop procedure for delineating river longitudinal profiles into macro-reaches for use in Ecological Reserve assessments and to aid freshwater ecosystem conservation planning. The procedure was developed for use...

  15. Design and construction of a desktop AC susceptometer using an Arduino and a Bluetooth for serial interface

    Science.gov (United States)

    Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José

    2018-05-01

    We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a pair of coils, a sample holder, a computer running custom software written in freeware C++ code, and an Arduino board coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to connect to almost any computer and thus avoids connectivity problems between the computer and the peripherals, such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These new features reduce the size and increase the versatility of the susceptometer, since it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory, for either research or academic purposes.
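
    On the host side, readings forwarded by the Arduino over the Bluetooth serial link could be logged with a few lines of Python using pyserial. The port name, baud rate, and one-value-per-line format below are assumptions; they depend on how the Arduino sketch and Bluetooth module are actually configured.

    ```python
    # Host-side logging sketch using pyserial. The port name, baud rate and the
    # one-value-per-line format are assumptions about how the Arduino sketch and
    # Bluetooth module are configured; adjust them to the actual setup.
    import serial  # pyserial

    with serial.Serial("/dev/rfcomm0", 9600, timeout=2) as port, \
            open("susceptibility_log.csv", "w") as log:
        log.write("sample,raw_value\n")
        for sample in range(1000):
            line = port.readline().decode("ascii", errors="ignore").strip()
            if line:
                log.write(f"{sample},{line}\n")
    ```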

  16. Desktop computer graphics for RMS/payload handling flight design

    Science.gov (United States)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, developed for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.

  17. A Five-Year Hedonic Price Breakdown for Desktop Personal Computer Attributes in Brazil

    Directory of Open Access Journals (Sweden)

    Nuno Manoel Martins Dias Fouto

    2009-07-01

    Full Text Available The purpose of this article is to identify the attributes that discriminate the prices of desktop personal computers. We employ the hedonic price method in evaluating such characteristics. This approach allows market prices to be expressed as a function of a set of attributes present in the products and services offered. Prices and characteristics of up to 3,779 desktop personal computers offered in the IT pages of one of the main Brazilian newspapers were collected from January 2003 to December 2007. Several specifications for the hedonic (multivariate linear) regression were tested. In this particular study, the main attributes were found to be hard drive capacity, screen technology, main board brand, random-access memory size, microprocessor brand, video board memory, digital video and compact disk recording devices, screen size and microprocessor speed. These results highlight the novel contribution of this study: the manner and means by which hedonic price indexes may be estimated in Brazil.
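
    A hedonic regression of this kind reduces to regressing (log) price on the attribute vector. The sketch below uses made-up data and a handful of the attributes named in the abstract; it is not the article's dataset or specification.

    ```python
    # Hedonic regression sketch with made-up data: ordinary least squares of
    # log(price) on a few attributes named in the abstract. Neither the numbers
    # nor the attribute coding reproduce the article's dataset or specification.
    import numpy as np

    # columns: intercept, HDD (GB), RAM (MB), CPU speed (GHz), LCD screen (0/1)
    X = np.array([
        [1,  40,  256, 1.8, 0],
        [1,  80,  512, 2.4, 0],
        [1,  80,  512, 2.6, 1],
        [1, 120,  512, 2.8, 1],
        [1, 160, 1024, 3.0, 1],
        [1, 250, 2048, 3.2, 1],
    ], dtype=float)
    prices = np.array([1500, 1900, 2300, 2600, 3100, 3900], dtype=float)  # hypothetical BRL

    coef, *_ = np.linalg.lstsq(X, np.log(prices), rcond=None)
    print("implicit (log-price) attribute coefficients:", np.round(coef, 4))
    ```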

  18. Fabrication of low cost soft tissue prostheses with the desktop 3D printer.

    Science.gov (United States)

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-11-27

    Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prosthesis molds with a low-cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). First the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and giving a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prosthesis can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surfaces and complicated structures can be fabricated at a low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods.

  19. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC, which has been under development for the last three years, is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept up with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  20. Fatigue monitoring desktop guide

    International Nuclear Information System (INIS)

    Woods, K.; Thomas, K.

    2012-01-01

    The development of a program for managing material aging (MMG) in the nuclear industry requires a new and different perspective. The classical method for MMG is cycle counting, which has been shown to have limited success. The classical method has been successful in satisfying the ductile condition per the American Society of Mechanical Engineers' (ASME) design criteria. However, the defined material failure mechanism has transformed from through-wall cracking and leakage (ASME) to crack initiation (NUREG-6909). This transformation is based on current industry experience with material degradation early in plant life and can be attributed to fabrication issues and environmental concerns where cycle counting has been unsuccessful. This new perspective provides a different approach to cycle counting that incorporates all of the information about the material conditions. This approach goes beyond the consideration of a static analysis and includes a dynamic assessment of component health, which is required for operating plants. This health definition should consider fabrication, inspections, transient conditions and industry operating experience. In addition, this collection of information can be transparent to a broader audience that may not have a full understanding of the system design or the potential causes of early material degradation. This paper will present the key points that are needed for a successful fatigue monitoring desktop guide. (authors)

  1. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer.

  2. Repository Evaluation of Software Reuse

    OpenAIRE

    Banker, Rajiv D.; Kauffman, Robert J.; Zweig, Dani

    1993-01-01

    The article of record as published may be found at: 10.1109/32.223805. Center for Digital Economy Research, Stern School of Business, Working Paper IS-93-28 (replaces Working Paper IS-93-1). The traditional unit of analysis and control for software managers is the software project, and subsequently the resulting application system. Today, with the emerging capabilities of computer-aided software engineering ...

  3. Delivering an Alternative Medicine Resource to the User's Desktop via World Wide Web.

    Science.gov (United States)

    Li, Jie; Wu, Gang; Marks, Ellen; Fan, Weiyu

    1998-01-01

    Discusses the design and implementation of a World Wide Web-based alternative medicine virtual resource. This homepage integrates regional, national, and international resources and delivers library services to the user's desktop. Goals, structure, and organizational schemes of the system are detailed, and design issues for building such a…

  4. Use of Signaling to Integrate Desktop Virtual Reality and Online Learning Management Systems

    Science.gov (United States)

    Dodd, Bucky J.; Antonenko, Pavlo D.

    2012-01-01

    Desktop virtual reality is an emerging educational technology that offers many potential benefits for learners in online learning contexts; however, a limited body of research is available that connects current multimedia learning techniques with these new forms of media. Because most formal online learning is delivered using learning management…

  5. Utility of ck metrics in predicting size of board-based software games

    International Nuclear Information System (INIS)

    Sabhat, N.; Azam, F.; Malik, A.A.

    2017-01-01

    Software size is one of the most important inputs of many software cost and effort estimation models. Early estimation of software size plays an important role at the time of project inception. An accurate estimate of software size is, therefore, crucial for planning, managing, and controlling software development projects dealing with the development of software games. However, software size is unavailable during the early phase of software development. This research determines the utility of CK (Chidamber and Kemerer) metrics, a well-known suite of object-oriented metrics, in estimating the size of software applications using the information from their UML (Unified Modeling Language) class diagrams. This work focuses on a small subset dealing with board-based software games. Almost sixty games written in an object-oriented programming language were downloaded from open source repositories, analyzed, and used to calibrate a regression-based size estimation model. Forward stepwise MLR (Multiple Linear Regression) is used for model fitting. The model thus obtained is assessed using a variety of accuracy measures such as MMRE (Mean Magnitude of Relative Error), PRED(x) (Prediction at level x) and MdMRE (Median Magnitude of Relative Error), and validated using K-fold cross validation. The accuracy of this model is also compared with an existing model tailored for size estimation of board games. Based on a small subset of desktop games developed in various object-oriented languages, we obtained a model using CK metrics and forward stepwise multiple linear regression with reasonable estimation accuracy, as indicated by the value of the coefficient of determination (R2 = 0.756). Comparison results indicate that the existing size estimation model outperforms the model derived using CK metrics in terms of accuracy of prediction. (author)
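
    To make the evaluation measures concrete, the sketch below fits a linear size model from a few CK-style metrics and computes MMRE and PRED(25). The metric values, sizes and plain least-squares fit are illustrative assumptions, not the study's data or its stepwise selection procedure.

    ```python
    # Hypothetical sketch: fit a linear size-estimation model from CK-style class-diagram
    # metrics and evaluate it with MMRE and PRED(25). All values are invented.
    import numpy as np

    # columns: WMC (weighted methods per class), NOC (number of children), CBO (coupling)
    X = np.array([[12, 1, 4], [30, 0, 7], [8, 2, 3], [22, 1, 6], [15, 0, 5]], dtype=float)
    y = np.array([450, 1200, 300, 900, 600], dtype=float)   # actual size in lines of code

    # ordinary least squares fit (a full study would use forward stepwise selection)
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ coef

    mre = np.abs(y - pred) / y
    mmre = mre.mean()                    # Mean Magnitude of Relative Error
    pred25 = np.mean(mre <= 0.25)        # PRED(25): share of estimates within 25%
    print(f"MMRE={mmre:.3f}, PRED(25)={pred25:.2f}")
    ```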

  6. Align and conquer: moving toward plug-and-play color imaging

    Science.gov (United States)

    Lee, Ho J.

    1996-03-01

    The rapid evolution of the low-cost color printing and image capture markets has precipitated a huge increase in the use of color imagery by casual end users on desktop systems, as opposed to traditional professional color users working with specialized equipment. While the cost of color equipment and software has decreased dramatically, the underlying system-level problems associated with color reproduction have remained the same, and in many cases are more difficult to address in a casual environment than in a professional setting. The proliferation of color imaging technologies so far has resulted in a wide availability of component solutions which work together poorly. A similar situation in the desktop computing market has led to the various 'Plug-and-Play' standards, which provide a degree of interoperability between a range of products on disparate computing platforms. This presentation will discuss some of the underlying issues and emerging trends in the desktop and consumer digital color imaging markets.

  7. Software engineering beyond the project

    DEFF Research Database (Denmark)

    Dittrich, Yvonne

    2014-01-01

    Context The main part of software engineering methods, tools and technologies has developed around projects as the central organisational form of software development. A project organisation depends on clear bounds regarding scope, participants, development effort and lead-time. What happens when...... of traditional software engineering, but makes perfect sense, considering that the frame of reference for product development is not a project but continuous innovation across the respective ecosystem. The article provides a number of concrete points for further research....

  8. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  9. Using Desktop Publishing in an Editing Class--The Lessons Learned and Students' Assessments.

    Science.gov (United States)

    Tharp, Marty; Zimmerman, Don

    1992-01-01

    Reports students' perceptions of learning desktop publishing (DTP) systems. Finds that (1) students learned the foundations of DTP in under 60 hours of hands-on experience; (2) the incremental introduction of DTP functions and practice sessions before assignments were the most effective teaching strategy; and (3) use of DTP encouraged nonartistic…

  10. Modelo de Cuadro de Mando para una Software Factory del Sector Financiero

    OpenAIRE

    Álvarez Pérez, César

    2012-01-01

    Abstract This work aims to identify KPIs, Key Performance Indicators, which make it possible to measure the productivity of financial software generation in a Software Factory, i.e. a workplace where software is developed using principles and techniques associated with traditional industrial production. A set of new indicators is presented which, together with the already used and more traditional ones, allows us to assess productivity and performance for a Software Factory. The ...

  11. CHEMOSTAT, UM SOFTWARE GRATUITO PARA ANÁLISE EXPLORATÓRIA DE DADOS MULTIVARIADOS

    Directory of Open Access Journals (Sweden)

    Gilson A. Helfer

    2015-05-01

    Full Text Available Because of the extensive use of chemometrics and its association with applications that require purchased licenses or routines, the objective of this work was to develop a free-access exploratory data analysis software application for academic use that is easy to install and can be handled without user-level programming. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA) and interval Principal Component Analysis (iPCA), as well as correction methods, data transformation and outlier detection. The data can be imported from the clipboard, text files, ASCII or FT-IR Perkin-Elmer “.sp” files. It generates a variety of charts and tables that allow the analysis of results, which can be exported in several formats. The main features of the software were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. In order to validate the software results, the same sets of data were analyzed using Matlab© and the results in both applications matched in various combinations. In addition to the desktop version, the reuse of algorithms allowed an online version to be provided that offers a unique experience on the web. Both applications are available in English.
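
    As an illustration of the kind of exploratory analysis Chemostat automates, the sketch below runs a bare-bones PCA on synthetic "spectra". It is not Chemostat code, and the data are invented.

    ```python
    # Minimal PCA on synthetic spectra, the kind of exploratory step a chemometrics
    # tool performs. Values are random stand-ins for real spectral measurements.
    import numpy as np

    rng = np.random.default_rng(1)
    spectra = rng.normal(size=(20, 100))      # 20 samples x 100 wavenumbers
    X = spectra - spectra.mean(axis=0)        # mean-center (a common pre-treatment)

    # PCA via singular value decomposition
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U * s                            # sample scores on the principal components
    explained = s**2 / np.sum(s**2)
    print("variance explained by PC1, PC2:", explained[:2])
    print("scores of first sample:", scores[0, :2])
    ```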

  12. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    Science.gov (United States)

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  13. Effective Web and Desktop Retrieval with Enhanced Semantic Spaces

    Science.gov (United States)

    Daoud, Amjad M.

    We describe the design and implementation of the NETBOOK prototype system for collecting, structuring and efficiently creating semantic vectors for concepts, noun phrases, and documents from a corpus of free full-text ebooks available on the World Wide Web. Automatic generation of concept maps from correlated index terms and extracted noun phrases is used to build a powerful conceptual index of individual pages. To ensure scalability of our system, dimension reduction is performed using Random Projection [13]. Furthermore, we present a complete evaluation of the relative effectiveness of the NETBOOK system versus the Google Desktop [8].
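
    To illustrate the dimension reduction step mentioned above, the sketch below applies Gaussian random projection to a toy document-term matrix. The sizes and data are invented, and this is not the NETBOOK implementation.

    ```python
    # Sketch of random projection for dimension reduction: pairwise distances are
    # approximately preserved (Johnson-Lindenstrauss) while the vectors shrink to k dims.
    import numpy as np

    rng = np.random.default_rng(2)
    n_docs, n_terms, k = 200, 10_000, 300
    doc_term = rng.random((n_docs, n_terms))          # stand-in for a sparse term matrix

    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(n_terms, k))  # random projection matrix
    reduced = doc_term @ R                            # n_docs x k semantic vectors
    print(reduced.shape)
    ```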

  14. Principles of Antifragile Software

    OpenAIRE

    Monperrus, Martin

    2014-01-01

    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing is changing the code in response to errors, fault injection in productio...

  15. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture....... This introduces an unfortunate impedance mismatch between the design domain (architecture level) and configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...

  16. Fatigue monitoring desktop guide

    Energy Technology Data Exchange (ETDEWEB)

    Woods, K. [InnoTech Engineering Solutions, LLC (United States); Thomas, K. [Nebraska Public Power District (United States)

    2012-07-01

    The development of a program for managing material aging (MMG) in the nuclear industry requires a new and different perspective. The classical method for MMG is cycle counting, which has been shown to have limited success. The classical method has been successful in satisfying the ductile condition per the American Society of Mechanical Engineers' (ASME) design criteria. However, the defined material failure mechanism has transformed from through-wall cracking and leakage (ASME) to crack initiation (NUREG-6909). This transformation is based on current industry experience with material degradation early in plant life and can be attributed to fabrication issues and environmental concerns where cycle counting has been unsuccessful. This new perspective provides a different approach to cycle counting that incorporates all of the information about the material conditions. This approach goes beyond the consideration of a static analysis and includes a dynamic assessment of component health, which is required for operating plants. This health definition should consider fabrication, inspections, transient conditions and industry operating experience. In addition, this collection of information can be transparent to a broader audience that may not have a full understanding of the system design or the potential causes of early material degradation. This paper will present the key points that are needed for a successful fatigue monitoring desktop guide. (authors)

  17. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: What is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  18. ROMANIAN TRADITIONAL MOTIF ELEMENT OF MODERNITY IN CLOTHING

    Directory of Open Access Journals (Sweden)

    ŞUTEU Marius Darius

    2017-05-01

    Full Text Available This paper presents the phases of improving, from an aesthetic point of view, a clothing item, a T-shirt for women, using software for design patterns, computerised graphics and different modern textile technologies including industrial embroidery, digital printing and sublimation. In the first phase, documentation was prepared at the University of Oradea and a traditional motif was selected from a collection comprising a number of Romanian traditional motifs from different parts of the country; the motifs were reinterpreted and stylized whilst preserving the symbolism and color range specific to the area. For the styling phase the CorelDraw vector graphics program was used, which allows changing the shape, size and color of the drawings without affecting the identity of the pattern. The embroidery was done using BERNINA Embroidery Software Designer Plus. This software allows the model to be exported to any domestic or industrial embroidery machine regardless of brand. Finally, we tested the resistance of the printed and embroidered model in terms of elasticity and resistance to abrasion, and performed a sensory analysis on the preservation of color. After testing we noted the resistance of the imprint applied to the fabric, resulting in a quality that makes it possible to pass the Romanian traditional motif from generation to generation.

  19. A Cross-Case Analysis of Gender Issues in Desktop Virtual Reality Learning Environments

    Science.gov (United States)

    Ausburn, Lynna J.; Martens, Jon; Washington, Andre; Steele, Debra; Washburn, Earlene

    2009-01-01

    This study examined gender-related issues in using new desktop virtual reality (VR) technology as a learning tool in career and technical education (CTE). Using relevant literature, theory, and cross-case analysis of data and findings, the study compared and analyzed the outcomes of two recent studies conducted by a research team at Oklahoma State…

  20. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred compared with that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to not only accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
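
    The paper's speedup comes from custom hardware, but a short software reference helps show what UPGMA itself does. Below is a minimal Python sketch of plain UPGMA clustering (not the paper's FPGA implementation and not PHYLIP); the distance matrix is invented for illustration.

    ```python
    # Plain UPGMA: repeatedly merge the two closest clusters, averaging distances
    # weighted by cluster size. Returns a nested-tuple tree of the taxa labels.
    def upgma(dist, names):
        d = {n: {m: dist[i][j] for j, m in enumerate(names)} for i, n in enumerate(names)}
        size = {n: 1 for n in names}
        clusters = list(names)
        while len(clusters) > 1:
            # find the closest pair of clusters
            a, b = min(((x, y) for x in clusters for y in clusters if x != y),
                       key=lambda p: d[p[0]][p[1]])
            new = (a, b)
            size[new] = size[a] + size[b]
            d[new] = {}
            for c in clusters:
                if c in (a, b):
                    continue
                # proportional (arithmetic) averaging of distances
                d[new][c] = (size[a] * d[a][c] + size[b] * d[b][c]) / size[new]
                d[c][new] = d[new][c]
            clusters = [c for c in clusters if c not in (a, b)] + [new]
        return clusters[0]

    taxa = ["A", "B", "C", "D"]
    matrix = [[0, 4, 8, 8], [4, 0, 8, 8], [8, 8, 0, 6], [8, 8, 6, 0]]
    print(upgma(matrix, taxa))   # groups A with B and C with D
    ```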

  1. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development
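
    The data-driven idea described above can be illustrated with a toy sketch: sequences and their parameter reconfigurations are expressed as data that a generic engine walks, so updating them requires no recompilation. The activity names, parameters and classes below are hypothetical illustrations, not Orion's actual formats or tools.

    ```python
    # Toy illustration of data-driven automated sequencing: the sequence and its
    # parameter reconfigurations live in data, separate from the (compiled) software.
    SEQUENCE = [
        {"activity": "coast",     "params": {"attitude_mode": "inertial_hold"}},
        {"activity": "burn_prep", "params": {"attitude_mode": "burn_attitude"}},
        {"activity": "main_burn", "params": {"throttle": 1.0, "guidance": "powered_explicit"}},
    ]

    class GncSoftware:
        """Stands in for flight software whose behavior is configured by data."""
        def __init__(self):
            self.config = {}

        def apply(self, params):
            self.config.update(params)      # reconfiguration, no recompilation

        def run(self, activity):
            print(f"executing {activity} with {self.config}")

    gnc = GncSoftware()
    for step in SEQUENCE:                   # the automated sequencer walks the data
        gnc.apply(step["params"])
        gnc.run(step["activity"])
    ```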

  2. Virtualisation Devices for Student Learning: Comparison between Desktop-Based (Oculus Rift) and Mobile-Based (Gear VR) Virtual Reality in Medical and Health Science Education

    Science.gov (United States)

    Moro, Christian; Stromberga, Zane; Stirling, Allan

    2017-01-01

    Consumer-grade virtual reality has recently become available for both desktop and mobile platforms and may redefine the way that students learn. However, the decision regarding which device to utilise within a curriculum is unclear. Desktop-based VR has considerably higher setup costs involved, whereas mobile-based VR cannot produce the quality of…

  3. Potential Pedagogical Benefits and Limitations of Multimedia Integrated Desktop Video Conferencing Technology for Synchronous Learning

    NARCIS (Netherlands)

    drs Maurice Schols

    2009-01-01

    As multimedia gradually becomes more and more an integrated part of video conferencing systems, the use of multimedia integrated desktop video conferencing technology (MIDVCT) will open up new educational possibilities for synchronous learning. However, the possibilities and limitations of this

  4. Object oriented development of engineering software using CLIPS

    Science.gov (United States)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and manipulations of a large amount of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than numeric complexity, has become the major software design problem. Object oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object oriented features are more versatile than C++'s. A software design methodology, based on object oriented and procedural approaches appropriate for engineering software and to be implemented in CLIPS, was outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  5. Open source software development : some historical perspectives

    NARCIS (Netherlands)

    Nuvolari, A.

    2005-01-01

    In this paper we suggest that historical studies of technology can help us to account for some perplexing (at least for traditional economic reasoning) features of open source software development. From a historical perspective, open source software seems to be a particular case of what Robert C.

  6. Open source software development : some historical perspectives

    NARCIS (Netherlands)

    Nuvolari, A.

    2003-01-01

    In this paper we suggest that historical studies of technology can help us to account for some perplexing (at least for traditional economic reasoning) features of open source software development. When looked at in historical perspective, open source software seems to be a particular case of what

  7. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular with respect to reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of the software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)
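
    As a small worked companion to the fault tree analysis mentioned above, the sketch below computes a top-event probability from basic-event probabilities through independent AND/OR gates. The component names and failure probabilities are illustrative assumptions, not values from the report.

    ```python
    # Quantitative side of a fault tree: combine basic-event probabilities with
    # AND/OR gates (independence assumed) to get the top-event probability.
    def and_gate(*p):   # all inputs must fail
        out = 1.0
        for x in p:
            out *= x
        return out

    def or_gate(*p):    # any input failing fails the gate
        out = 1.0
        for x in p:
            out *= (1.0 - x)
        return 1.0 - out

    p_sensor, p_logic, p_actuator = 1e-3, 5e-4, 2e-3
    # redundant sensors AND-ed together, then OR-ed with the rest of the chain
    top = or_gate(and_gate(p_sensor, p_sensor), p_logic, p_actuator)
    print(f"top event probability ~ {top:.2e}")
    ```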

  8. Life cycle assessment study of a Chinese desktop personal computer.

    Science.gov (United States)

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

    Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there continues to be an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system has been selected to carry out a detailed and modular LCA which follows the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used in order to estimate the influence of the choice of the assessment method on the result. Life cycle inventory information is compiled from ecoinvent 1.3 databases, combined with literature and field investigations on the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, the impact is due to the way the electricity is produced. The final process steps--i.e. the end of life phase--lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, like here in this study.

  9. Life cycle assessment study of a Chinese desktop personal computer

    International Nuclear Information System (INIS)

    Duan Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li Jinhui

    2009-01-01

    Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there continues to be an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system has been selected to carry out a detailed and modular LCA which follows the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used in order to estimate the influence of the choice of the assessment method on the result. Life cycle inventory information is compiled from ecoinvent 1.3 databases, combined with literature and field investigations on the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, the impact is due to the way the electricity is produced. The final process steps - i.e. the end of life phase - lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, like here in this study.

  10. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

    Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).
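
    A minimal sketch of the plug-in style of extensibility described above, assuming hypothetical module names and a hypothetical register() hook; it is not Concierge's actual plug-in API.

    ```python
    # Generic plug-in discovery: optional modules are imported at run time and, if
    # present, register themselves. Module and hook names here are invented.
    import importlib

    PLUGIN_MODULES = ["plugins.literature_management", "plugins.lab_notebook"]  # hypothetical

    def load_plugins(names):
        loaded = []
        for name in names:
            try:
                mod = importlib.import_module(name)
            except ModuleNotFoundError:
                continue                     # optional plug-in not installed
            if hasattr(mod, "register"):     # assumed hook each plug-in exposes
                loaded.append(mod.register())
        return loaded

    if __name__ == "__main__":
        print(f"{len(load_plugins(PLUGIN_MODULES))} plug-ins registered")
    ```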

  11. Software agent Technology: A Framework for Minimizing Fraud in ...

    African Journals Online (AJOL)

    Software agent Technology: A Framework for Minimizing Fraud in Postpaid Billing Systems. ... Journal of Research in National Development ... to the traditional Object-oriented Software engineering methodology was used to come up with this ...

  12. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI-assessments can still be necessary to stay in the market or to meet demands of multinational owners. The traditional norm driven, centralized and control centered improvement approaches has

  13. Visual attention for a desktop virtual environment with ambient scent

    Directory of Open Access Journals (Sweden)

    Alexander Toet

    2013-11-01

    Full Text Available In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and less signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention towards the vegetation in the environment and away from the signs of disorder). Contrary to our expectations the results show that the presence of an ambient odor did not affect the participants’ visual attention for signs of disorder or their emotional response. We conclude that a closer affective, semantic or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user’s attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).

  14. The Utility of Open Source Software in Military Systems

    National Research Council Canada - National Science Library

    Esperon, Agustin I; Munoz, Jose P; Tanneau, Jean M

    2005-01-01

    .... The companies involved were THALES and GMV. The MILOS project aimed to demonstrate benefits of Open Source Software in large software based military systems, by casting off constraints inherent to traditional proprietary COTS and by taking...

  15. Software as a service approach to sensor simulation software deployment

    Science.gov (United States)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS) predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, an enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on demand availability to connected users, decrease integration costs and timelines, and benefit the domain community through immediate deployment of lessons learned.

  16. Software Startups - A Research Agenda

    Directory of Open Access Journals (Sweden)

    Michael Unterkalmsteiner

    2016-10-01

    Full Text Available Software startup companies develop innovative, software-intensive products within limited time frames and with few resources, searching for sustainable and scalable business models. Software startups are quite distinct from traditional mature software companies, but also from micro-, small-, and medium-sized enterprises, introducing new challenges relevant for software engineering research. This paper's research agenda focuses on software engineering in startups, identifying, in particular, 70+ research questions in the areas of supporting startup engineering activities, startup evolution models and patterns, ecosystems and innovation hubs, human aspects in software startups, applying startup concepts in non-startup environments, and methodologies and theories for startup research. We connect and motivate this research agenda with past studies in software startup research, while pointing out possible future directions. While all authors of this research agenda have their main background in Software Engineering or Computer Science, their interest in software startups broadens the perspective to the challenges, but also to the opportunities that emerge from multi-disciplinary research. Our audience is therefore primarily software engineering researchers, even though we aim at stimulating collaborations and research that crosses disciplinary boundaries. We believe that with this research agenda we cover a wide spectrum of the software startup industry current needs.

  17. Software support environment design knowledge capture

    Science.gov (United States)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  18. Migrating C/C++ Software to Mobile Platforms in the ADM Context

    Directory of Open Access Journals (Sweden)

    Liliana Martinez

    2017-03-01

    Full Text Available Software technology is constantly evolving and therefore the development of applications requires adapting software components and applications in order to be aligned to new paradigms such as Pervasive Computing, Cloud Computing and Internet of Things. In particular, many desktop software components need to be migrated to mobile technologies. This migration faces many challenges due to the proliferation of different mobile platforms. Developers usually make applications tailored for each type of device expending time and effort. As a result, new programming languages are emerging to integrate the native behaviors of the different platforms targeted in development projects. In this direction, the Haxe language allows writing mobile applications that target all major mobile platforms. Novel technical frameworks for information integration and tool interoperability such as Architecture-Driven Modernization (ADM proposed by the Object Management Group (OMG can help to manage a huge diversity of mobile technologies. The Architecture-Driven Modernization Task Force (ADMTF was formed to create specifications and promote industry consensus on the modernization of existing applications. In this work, we propose a migration process from C/C++ software to different mobile platforms that integrates ADM standards with Haxe. We exemplify the different steps of the process with a simple case study, the migration of “the Set of Mandelbrot” C++ application. The proposal was validated in Eclipse Modeling Framework considering that some of its tools and run-time environments are aligned with ADM standards.

  19. GNU polyxmass: a software framework for mass spectrometric simulations of linear (bio-)polymeric analytes

    Directory of Open Access Journals (Sweden)

    Rusconi Filippo

    2006-04-01

    Full Text Available Abstract Background Nowadays, a variety of (bio-)polymers can be analyzed by mass spectrometry. The detailed interpretation of the spectra requires a huge number of "hypothesis cycles", comprising the following three actions: 1) put forth a structural hypothesis, 2) test it, 3) (in)validate it. This time-consuming and painstaking data scrutiny is alleviated by using specialized software tools. However, all the software tools available to date are polymer chemistry-specific. This imposes a heavy overhead on researchers who do mass spectrometry on a variety of (bio-)polymers, as each polymer type will require a different software tool to perform data simulations and analyses. We developed software to address the lack of an integrated software framework able to deal with different polymer chemistries. Results The GNU polyxmass software framework performs common (bio-)chemical simulations–along with simultaneous mass spectrometric calculations–for any kind of linear (bio-)polymeric analyte (DNA, RNA, saccharides or proteins). The framework is organized into three modules, all accessible from one single binary program. The modules let the user 1) define brand new polymer chemistries, 2) perform quick mass calculations using a desktop calculator paradigm, 3) graphically edit polymer sequences and perform (bio-)chemical/mass spectrometric simulations. Any aspect of the mass calculations, polymer chemistry reactions or graphical polymer sequence editing is configurable. Conclusion The scientist who uses mass spectrometry to characterize (bio-)polymeric analytes of different chemistries is provided with a single software framework for his data prediction/analysis needs, whatever the polymer chemistry being involved.
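
    As a small worked example of the mass calculations such a framework generalizes, the sketch below sums monoisotopic residue masses plus terminal water for a short peptide. The residue masses are standard values, the sequence is arbitrary, and this is not polyxmass code.

    ```python
    # Mass of a linear polymer = sum of monomer residue masses + terminal groups.
    # Here: a peptide with free N- and C-termini (one H2O), monoisotopic masses in Da.
    RESIDUE_MASS = {
        "G": 57.02146, "A": 71.03711, "S": 87.03203,
        "P": 97.05276, "V": 99.06841, "L": 113.08406,
    }
    WATER = 18.01056    # H2O added for the free termini

    def peptide_mass(sequence):
        return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER

    print(f"mass of GASPVL: {peptide_mass('GASPVL'):.4f} Da")
    ```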

  20. Effects of boundary-layer separation controllers on a desktop fume hood.

    Science.gov (United States)

    Huang, Rong Fung; Chen, Jia-Kun; Hsu, Ching Min; Hung, Shuo-Fu

    2016-10-02

    A desktop fume hood installed with an innovative design of flow boundary-layer separation controllers on the leading edges of the side plates, work surface, and corners was developed and characterized for its flow and containment leakage characteristics. The geometric features of the developed desktop fume hood included a rearward offset suction slot, two side plates, two side-plate boundary-layer separation controllers on the leading edges of the side plates, a slanted surface on the leading edge of the work surface, and two small triangular plates on the upper left and right corners of the hood face. The flow characteristics were examined using the laser-assisted smoke flow visualization technique. The containment leakages were measured by the tracer gas (sulphur hexafluoride) detection method on the hood face plane with a mannequin installed in front of the hood. The results of flow visualization showed that the smoke dispersions induced by the boundary-layer separations on the leading edges of the side plates and work surface, as well as the three-dimensional complex flows on the upper-left and -right corners of the hood face, were effectively alleviated by the boundary-layer separation controllers. The results of the tracer gas detection method with a mannequin standing in front of the hood showed that the leakage levels were negligibly small (≤0.003 ppm) at low face velocities (≥0.19 m/s).

  1. A Functional Analysis of DOD Implementation of Seat Management

    National Research Council Canada - National Science Library

    Rasmussen, David

    1999-01-01

    .... Seat management, also known as desktop outsourcing, involves the acquisition and management of all hardware and software, desktop and network management, operations management, support services...

  2. Exciting Normal Distribution

    Science.gov (United States)

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD), based on the Computer Algebra System Mathematica, was used for symbolic and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…
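
    For readers without Mathematica, here is a minimal numerical sketch of the normal density that such teaching software handles symbolically; it only illustrates the formula and is unrelated to the M@th Desktop modules themselves.

    ```python
    # Evaluate and plot the normal (Gaussian) density f(x) = exp(-(x-mu)^2/(2*sigma^2)) / (sigma*sqrt(2*pi)).
    import numpy as np
    import matplotlib.pyplot as plt

    def normal_pdf(x, mu=0.0, sigma=1.0):
        return np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    x = np.linspace(-4, 4, 400)
    plt.plot(x, normal_pdf(x), label="N(0, 1)")
    plt.plot(x, normal_pdf(x, mu=1, sigma=0.5), label="N(1, 0.25)")
    plt.legend()
    plt.title("Normal density")
    plt.show()
    ```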

  3. THE DIFFERENCE BETWEEN DEVELOPING SINGLE PAGE APPLICATION AND TRADITIONAL WEB APPLICATION BASED ON MECHATRONICS ROBOT LABORATORY ONAFT APPLICATION

    Directory of Open Access Journals (Sweden)

    V. Solovei

    2018-04-01

    Full Text Available Today most desktop and mobile applications have analogues in the form of web-based applications. With the evolution of development and web technologies, web applications have grown in functionality to approach desktop applications. A web application consists of two parts: the client part and the server part. The client part is responsible for providing the user with visual information through the browser. The server part is responsible for processing and storing data. MPA appeared simultaneously with the Internet. Multiple-page applications work in a "traditional" way: every change, e.g. displaying or submitting data, requires a request to the server and a reload of the page. With the advent of AJAX, MPAs learned to load not the whole page but only a part of it, which eventually led to the appearance of the SPA. SPA is a development principle in which only one page is transferred to the client, and content is loaded into a certain part of that page without reloading it, which speeds up the application and simplifies the user experience to the level of desktop applications. Based on the SPA principle, the Mechatronics Robot Laboratory ONAFT application was designed to automate the management process. The application implements a client-server architecture. The server part consists of a RESTful API, which allows unified access to the application functionality, and a database for storing information. Since the client part is an SPA, the load on the connection to the server is reduced and the user experience is improved.
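
    A minimal server-side sketch of the SPA principle described above, using Python and Flask purely for illustration (this is not the ONAFT application): the page shell is delivered once, and subsequent data arrives as JSON from a small RESTful endpoint that the client renders without a full page reload.

    ```python
    # SPA-style serving: one route returns the single page shell, another returns
    # data-only JSON that client-side script inserts into the page without a reload.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/")
    def shell():
        # single page transferred to the client; the client script fetches /api/robots
        return "<html><body><div id='app'></div><script src='/static/app.js'></script></body></html>"

    @app.route("/api/robots")
    def robots():
        # data-only response; no full page is rebuilt on the server
        return jsonify([{"id": 1, "name": "manipulator"}, {"id": 2, "name": "mobile platform"}])

    if __name__ == "__main__":
        app.run(debug=True)
    ```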

  4. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can multiply the performance of a single computer with the proper software. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount situation, gaining the space saving of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components which multiplies the performance of a single desktop machine, while minimizing occupied space and still remaining cost effective.

  5. A Uniform Ontology for Software Interfaces

    Science.gov (United States)

    Feyock, Stefan

    2002-01-01

    It is universally the case that computer users who are not also computer specialists prefer to deal with computers in terms of a familiar ontology, namely that of their application domains. For example, the well-known Windows ontology assumes that the user is an office worker, and therefore should be presented with a "desktop environment" featuring entities such as (virtual) file folders, documents, appointment calendars, and the like, rather than a world of machine registers and machine language instructions, or even the DOS command level. The central theme of this research has been the proposition that the user interacting with a software system should have at his disposal both the ontology underlying the system, as well as a model of the system. This information is necessary for the understanding of the system in use, as well as for the automatic generation of assistance for the user, both in solving the problem for which the application is designed, and for providing guidance in the capabilities and use of the system.

  6. Visual attention for a desktop virtual environment with ambient scent.

    Science.gov (United States)

    Toet, Alexander; van Schaik, Martin G

    2013-01-01

    In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and less signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations the results provide no indication that the presence of an ambient odor affected the participants' visual attention for signs of disorder or their emotional response. However, the paradigm used in present study does not allow us to draw any conclusions in this respect. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user's attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).

  7. Essence: Team-Based Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2012-01-01

    Essence is a methodology supporting innovative software teams. It is designed with agile development in mind to allow for the problem situation to talk back to the team as they go along building solutions. Traditional software development teams – and for that matter probably also development teams...... using technologies other than software – might also enjoy adapting Essence to suit their situation. Essence is not yet another method for generating ideas. There are plenty of good methods already, and for that reason I choose to focus less on idea generation and more on the thereafter. Most teams....... Essence is based on the idea that challenges are open to interpretation and choice. We may often choose how we understand a challenge and choose among several strategies for answering it. Software development and indeed software innovation are far from linear. Essence is built on structures rather than...

  8. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO software is radically different from testing traditional software developed using imperative/procedural programming. Other authors claim that there is no difference. In this report we will attempt to give an answer to these questions (or at least initiate a discussion).

  9. Towards a typification of software ecosystems

    DEFF Research Database (Denmark)

    Knodel, Jens; Manikas, Konstantinos

    2015-01-01

    Classical software engineering has been traditionally dominated by stand-alone development organizations and collaborations between contractors, integrators and suppliers. The notion of software ecosystems has been established as a new kind of software engineering paradigm in the last decade. In its essence it proposes participative engineering across independent development organizations. This short paper reviews the current state-of-the-art and presents a typification of successful software ecosystems. We further discuss key characteristics of the ecosystem types and present a set of example cases. The characterization reviews and consolidates existing research and discusses variations within the key building block of a software ecosystem. It further enables sharpening the borders of what an ecosystem is (and what not) and how the individual types can be differentiated. Thus, this paper...

  10. A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories

    Science.gov (United States)

    Brown, Christa L.

    National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.

  11. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged and pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.

  12. Cooperative and human aspects of software engineering: CHASE 2010

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Sharp, Helen C.; Winschiers Theophilus, Heike

    2010-01-01

    Software is created by people -- software engineers in cooperation with domain experts, users and other stakeholders--in varied environments, under various conditions. Thus understanding cooperative and human aspects of software development is crucial to comprehend how and which methods and tools...... are required, to improve the creation and maintenance of software. The 3rd workshop on Cooperative and Human Aspects of Software Engineering held at the International Conference on Software Engineering continued the tradition from earlier workshops and provided a lively forum to discuss current developments...... and high quality research in the field. Further dissemination of research results will lead to an improvement of software development and deployment across the globe....

  13. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  14. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resources-based framework arises which, after its first cases of use, appears useful and worthy of further research.
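
    The framework itself is only summarized in this record; as a loose, hypothetical illustration of the combination it names (a domain resource handled in actor style, receiving REST-like commands through a mailbox), a minimal asyncio sketch might look as follows. Nothing below is taken from the authors' framework:

        import asyncio

        class OrderResource:
            """Toy 'active resource': a domain entity that processes REST-like
            messages (GET/PUT) one at a time through its own mailbox (actor style)."""

            def __init__(self, order_id):
                self.order_id = order_id
                self.state = {"status": "new"}
                self.mailbox = asyncio.Queue()

            async def run(self):
                while True:
                    verb, payload, reply = await self.mailbox.get()
                    if verb == "GET":
                        reply.set_result(dict(self.state))
                    elif verb == "PUT":
                        self.state.update(payload)
                        reply.set_result(dict(self.state))
                    self.mailbox.task_done()

            async def send(self, verb, payload=None):
                reply = asyncio.get_running_loop().create_future()
                await self.mailbox.put((verb, payload, reply))
                return await reply

        async def main():
            order = OrderResource("A-17")
            worker = asyncio.create_task(order.run())
            await order.send("PUT", {"status": "paid"})
            print(await order.send("GET"))   # {'status': 'paid'}
            worker.cancel()

        asyncio.run(main())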

  15. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    Science.gov (United States)

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  16. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  17. Economic analysis of cloud-based desktop virtualization implementation at a hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-10-30

Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of the five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. A sensitivity analysis over the number of VMs (in terms of the number of users) showed that the greater the number of adopted VMs, the more attractive the investment became. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
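
    The underlying cash flows are not given in the record, and ROI definitions vary between studies, but the three indexes reported are standard. A small sketch with made-up cash flows (not the SNUBH figures) shows how NPV, IRR and a simple ROI are typically computed over a five-year horizon:

        # Hypothetical VDI cash flows (USD): year 0 is the investment, years 1-5 are
        # net annual benefits. Illustrative numbers only, not the SNUBH data.
        cash_flows = [-500_000, 90_000, 120_000, 150_000, 170_000, 180_000]
        discount_rate = 0.05

        def npv(rate, flows):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

        def irr(flows, low=-0.99, high=1.0, tol=1e-6):
            # Bisection on NPV(rate) = 0; assumes one sign change in the cash flows.
            while high - low > tol:
                mid = (low + high) / 2
                if npv(mid, flows) > 0:
                    low = mid
                else:
                    high = mid
            return (low + high) / 2

        roi = sum(cash_flows[1:]) / -cash_flows[0] - 1   # cumulative benefit vs. cost
        print(f"ROI: {roi:.1%}, NPV: {npv(discount_rate, cash_flows):,.0f} USD, "
              f"IRR: {irr(cash_flows):.1%}")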

  18. Economic analysis of cloud-based desktop virtualization implementation at a hospital

    Directory of Open Access Journals (Sweden)

    Yoo Sooyoung

    2012-10-01

Full Text Available Abstract Background Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results The results of the five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. A sensitivity analysis over the number of VMs (in terms of the number of users) showed that the greater the number of adopted VMs, the more attractive the investment became. Conclusions This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.

  19. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. A comparison of CIPROS with traditional software requirements elicitation methods, SRS templates and standards shows broad consensus but also differences on registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  20. Software in the oil industry

    International Nuclear Information System (INIS)

    Turnill, M.C.

    1982-01-01

    The paper reviews the challenges of today's oil industry which is dominated in Europe by offshore production. Some of the key computer applications are examined, discussing new software development methods which have been adopted in order to achieve significant reduction in development times. The range of modern software development tools is considered, with the decreasing impact of traditional programming languages such as COBOL and FORTRAN. The use and benefits of non procedural languages are also discussed together with some views on their relevance to high energy physics. The paper concludes with a look into the not-too-distant future, stressing the need for new approaches to software development and improving the facilities for information handling. (orig.)

  1. A Lifecycle Which Incorporates Software Metrics

    OpenAIRE

    Li, Wei; Henry, Sallie M.

    1990-01-01

    The traditional waterfall life cycle model of software development provides a systematic method to separate the development process into different stages with explicit communication boundaries between each subsequent stage. But the waterfall model does not provide quantitative measurements for the products of each phase in the software life cycle. The model provides a base to develop methodologies which emphasize the completeness of the documents, the use of certain disciplines, and the cons...

  2. Scrum in the Traditional Development Organization

    DEFF Research Database (Denmark)

    Ovesen, Nis; Friis Sommer, Anita

    2015-01-01

    During the last couple of years, the application of Scrum as a project management framework has been broadened from initially belonging to the software domain. Now companies within the field of traditional product development are starting to implement Scrum in an attempt to improve...

  3. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  4. Software engineers and nuclear engineers: teaming up to do testing

    International Nuclear Information System (INIS)

    Kelly, D.; Cote, N.; Shepard, T.

    2007-01-01

    The software engineering community has traditionally paid little attention to the specific needs of engineers and scientists who develop their own software. Recently there has been increased recognition that specific software engineering techniques need to be found for this group of developers. In this case study, a software engineering group teamed with a nuclear engineering group to develop a software testing strategy. This work examines the types of testing that proved to be useful and examines what each discipline brings to the table to improve the quality of the software product. (author)

  5. Use of collaboration software to improve nuclear power plant outage management

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn

    2015-02-01

Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of the outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  6. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  7. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    National Research Council Canada - National Science Library

    Beal, Scott A

    2007-01-01

    Fifty-two leaders in the Basic Non-Commissioned Officer Course (BNCOC) at Fort Benning, Georgia, participated in an assessment of two desk-top computer simulations used to train tactical decision making...

  8. Mars Propellant Liquefaction Modeling in Thermal Desktop

    Science.gov (United States)

    Desai, Pooja; Hauser, Dan; Sutherlin, Steven

    2017-01-01

NASA's current Mars architectures assume the production and storage of 23 tons of liquid oxygen on the surface of Mars over a duration of 500+ days. In order to do this in a mass-efficient manner, an energy-efficient refrigeration system will be required. Based on previous analysis, NASA has decided to perform all liquefaction in the propulsion vehicle storage tanks. In order to allow for transient Martian environmental effects, a propellant liquefaction and storage system for a Mars Ascent Vehicle (MAV) was modeled using Thermal Desktop. The model consisted of a propellant tank containing a broad area cooling loop heat exchanger integrated with a reverse turbo Brayton cryocooler. Cryocooler sizing and performance modeling was conducted using MAV diurnal heat loads and radiator rejection temperatures predicted from a previous thermal model of the MAV. A system was also sized and modeled using an alternative heat rejection system that relies on a forced convection heat exchanger. Cryocooler mass, input power, and heat rejection for both systems were estimated and compared against sizing based on non-transient sizing estimates.
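
    The Thermal Desktop model itself cannot be reproduced from the abstract, but the first-order trade between heat lift, rejection temperature and electrical input power that drives cryocooler sizing can be sketched with a Carnot-based estimate. All values below are assumptions for illustration, not results from the study:

        # First-order cryocooler input-power estimate for an oxygen liquefaction loop.
        # Assumed values for illustration only; the study sizes its reverse turbo
        # Brayton cooler in Thermal Desktop, not with this hand calculation.
        Q_lift_W   = 500.0    # heat load to remove at the cold end (W), assumed
        T_cold_K   = 90.0     # liquid-oxygen storage temperature (K)
        T_reject_K = 280.0    # heat-rejection temperature (K), assumed
        pct_carnot = 0.15     # fraction of Carnot typical of flight cryocoolers, assumed

        cop_carnot = T_cold_K / (T_reject_K - T_cold_K)   # ideal coefficient of performance
        cop_actual = pct_carnot * cop_carnot
        P_input_W  = Q_lift_W / cop_actual
        Q_reject_W = Q_lift_W + P_input_W                 # heat the rejection system must dump

        print(f"COP (Carnot) = {cop_carnot:.2f}, COP (actual) = {cop_actual:.3f}")
        print(f"electrical input ~ {P_input_W/1000:.1f} kW, heat rejected ~ {Q_reject_W/1000:.1f} kW")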

  9. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from its use, it allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging-in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  10. Laevo: A Temporal Desktop Interface for Integrated Knowledge Work

    DEFF Research Database (Denmark)

    Jeuris, Steven; Houben, Steven; Bardram, Jakob

    2014-01-01

Prior studies show that knowledge work is characterized by highly interlinked practices, including task, file and window management. However, existing personal information management tools primarily focus on a limited subset of knowledge work, forcing users to perform additional manual configuration work to integrate the different tools they use. In order to understand tool usage, we review literature on how users' activities are created and evolve over time as part of knowledge worker practices. From this we derive the activity life cycle, a conceptual framework describing the different states and transitions of an activity. The life cycle is used to inform the design of Laevo, a temporal activity-centric desktop interface for personal knowledge work. Laevo allows users to structure work within dedicated workspaces, managed on a timeline. Through a centralized notification system which...

  11. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; here we outline a range of review conditions and matching software features, for example facilitating contemporaneous collaboration across time and geographical space, in-built bias assessment tools, and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.

  12. Fab the coming revolution on your desktop : from personal computers to personal fabrication

    CERN Document Server

    Gershenfeld, Neil

    2005-01-01

    What if you could someday put the manufacturing power of an automobile plant on your desktop? According to Neil Gershenfeld, the renowned MIT scientist and inventor, the next big thing is personal fabrication-the ability to design and produce your own products, in your own home, with a machine that combines consumer electronics and industrial tools. Personal fabricators are about to revolutionize the world just as personal computers did a generation ago, and Fab shows us how.

  13. Software Technology for E-Commerce Era

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

The rapid growth of Internet usage and electronic commerce (e-commerce) applications will push traditional industries to transform their business models and to re-engineer their information systems. This direction will give the software industry either great opportunities for business growth or crucial challenges to its existence. This article describes two essential challenges the software industry will face and presents relevant new technologies that will be helpful for overcoming those challenges.

  14. Design of a Nanosatellite Ground Monitoring and Control Software – a Case Study

    Directory of Open Access Journals (Sweden)

    Freddy Alexander Díaz González

    2016-04-01

Full Text Available The growing countries that have carried out the development of CubeSat missions for academic purposes do not offer aerospace engineering programs at their universities. This causes difficulties for traditional engineers in formally applying the different standards and frameworks for aerospace development, such as the European Cooperation for Space Standardization and Space Mission Analysis and Design. One way in which traditional software engineers can easily understand the structure of an aerospace framework, in order to apply it to the development of CubeSat mission software, is to compare its most important elements with the elements suggested by a more familiar method. In this paper, we present a hybrid framework between the ECSS-E-ST-40C standard and the Rational Unified Process, which can be used by traditional software engineers as a guide model for the development of software elements in academic nanosatellite missions. The model integrates the processes and documentation suggested by ECSS-E-ST-40C with the disciplines, workflows and artifacts suggested by the Rational Unified Process. This simplifies the structure of ECSS-E-ST-40C and allows traditional software engineers to easily understand its work elements. The paper describes as a case study the implementation of the hybrid model in the analysis and design of ground monitoring and control software for the Libertad-2 satellite mission, which is currently being developed by the Universidad Sergio Arboleda in Colombia.

  15. Romanian traditional motif - element of modernity in clothing

    Science.gov (United States)

    Doble, L.; Stan, O.; Suteu, M. D.; Albu, A.; Bohm, G.; Tsatsarou-Michalaki, A.; Gialinou, E.

    2017-10-01

This paper presents the phases of improving, from an aesthetic point of view, a clothing item, namely a straight-cut women's jacket, using pattern-design software, computerised graphics and different modern textile technologies including industrial embroidery, digital printing and sublimation. In the first phase, documentation was carried out at the Ethnographic Museum of Transylvania in Cluj-Napoca, where several traditional motifs specific to the Transylvanian ethnographic region were selected, reinterpreted and stylized while preserving the symbolism and color range specific to the area. For the styling phase the CorelDraw vector graphics program was used, which allows changing the shape, size and color of the drawings without affecting the identity of the pattern. In the pattern design phase Gemini CAD software was used, and for modeling and model development Optitex software was used. The garnishing of the model was performed using embroidery machine software, reproducing the stylized motif identically. In order to obtain a significantly improved aesthetic look and added artistic value, the pattern chosen for the jacket was produced using a combination of modern textile technologies. This allowed the realization of a particular texture on the surface of the designed product, demonstrating that traditional patterns can be reinterpreted in modern clothing.

  16. SMARTPHONES AND THEIR IMPACT ON NET INCOME PER EMPLOYEE FOR SELECTED U.S. FIRMS

    OpenAIRE

    Amod Choudhary

    2014-01-01

    For the last few years, the number of smartphone users has been on a remarkable rise. The number of users increased from 62.6 million in 2010 to 115.8 million in 2012, and expected to increase to 192.4 million by 2016. This increased usage of smartphones by employees poses a dilemma for organizations. Since smartphones can do almost all the tasks (email, internet, and run applications of popular Microsoft software) of a traditional desktop computer, laptop, and phone; smartphone users are exp...

  17. E-COCOMO: The Extended COst Constructive MOdel for Cleanroom Software Engineering

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2014-02-01

Full Text Available Mistakes create rework. Rework takes time and increases costs. The traditional software engineering methodology defines the ratio of Design:Code:Test as 40:20:40. Since 40% of the time and effort is spent in the testing phase in the traditional approach, rework has to be performed whenever bugs are found during testing. Because this rework occurs after the design and code phases, it increases cost exponentially. The cleanroom software engineering methodology controls this exponential growth in cost by removing the rework: it says to do the work correctly in the first attempt and move to the next phase only after obtaining a proof of correctness. This approach minimizes rework and reduces cost accordingly. Due to the removal of the testing phase, the COCOMO (COst COnstructive MOdel) used in traditional engineering is not directly applicable to cleanroom software engineering, and the traditional cost drivers used in COCOMO need to be revised. We have proposed an extended version of COCOMO (i.e., E-COCOMO), in which we have incorporated some new cost drivers. This paper explains the proposed E-COCOMO and gives a detailed description of the proposed new cost drivers.
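
    For context on the cost-driver discussion, the familiar COCOMO mechanics compute effort as a * KLOC^b scaled by an effort adjustment factor (EAF), the product of the cost-driver multipliers. A minimal sketch with the classic semi-detached coefficients and made-up driver ratings (not E-COCOMO's proposed drivers) illustrates the calculation that E-COCOMO extends:

        # COCOMO mechanics: effort = a * KLOC**b * EAF, where EAF is the product
        # of the selected cost-driver multipliers. Coefficients are the classic
        # semi-detached values; the driver ratings below are invented.
        a, b = 3.0, 1.12          # semi-detached mode coefficients
        kloc = 40                 # estimated size, thousands of lines of code

        cost_drivers = {          # hypothetical multiplier ratings
            "required_reliability": 1.15,
            "product_complexity":   1.30,
            "analyst_capability":   0.86,
            "use_of_tools":         0.91,
        }

        eaf = 1.0
        for multiplier in cost_drivers.values():
            eaf *= multiplier

        effort_pm  = a * kloc ** b * eaf       # person-months
        schedule_m = 2.5 * effort_pm ** 0.35   # nominal schedule, months

        print(f"EAF = {eaf:.2f}, effort ~ {effort_pm:.0f} person-months, "
              f"schedule ~ {schedule_m:.1f} months")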

  18. Desk-top microcomputer for lab-scale process control

    International Nuclear Information System (INIS)

    Overman, R.F.; Byrd, J.S.; Goosey, M.H.; Sand, R.J.

    1981-01-01

    A desk-top microcomputer was programmed to acquire the data from various process control sensors installed in a laboratory scale liquid-liquid extraction, pulse column facility. The parameters monitored included valve positions, gamma spectra, alpha radioactivity, temperature, pH, density, and flow rates. The program for the microcomputer is written in BASIC and requires about 31000 8-bit bytes of memory. All data is stored on floppy discs, and can be displayed or printed. Unexpected data values are brought to the process operator's attention via CRT display or print-out. The general organization of the program and a few subroutines unique to polling instruments are explained. Some of the data acquisition devices were designed and built at the Savannah River Laboratory. These include a pulse height analyzer, a data multiplexer, and a data acquisition instrument. A general description of the electronics design of these instruments is also given with emphasis placed on data formatting and bus addressing
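
    The original program is BASIC on a desk-top microcomputer; as a language-neutral illustration of the poll-log-alert pattern the abstract describes (sensor names, limits and the simulated readings below are invented), a modern sketch could be:

        import csv, random, time

        # Invented sensors and alarm limits, illustrating the poll/log/alert loop;
        # the original instruments were polled over a laboratory data bus.
        SENSORS = {"temperature_C": (20.0, 60.0), "pH": (1.5, 3.5), "flow_L_min": (0.5, 5.0)}

        def read_sensor(name):
            # Stand-in for a bus read; returns a simulated measurement.
            low, high = SENSORS[name]
            return random.uniform(low * 0.9, high * 1.1)

        with open("process_log.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["cycle", "sensor", "value", "alarm"])
            for cycle in range(3):                      # a real loop would run continuously
                for name, (low, high) in SENSORS.items():
                    value = read_sensor(name)
                    alarm = not (low <= value <= high)
                    writer.writerow([cycle, name, round(value, 2), alarm])
                    if alarm:
                        print(f"ALERT: {name} = {value:.2f} outside [{low}, {high}]")
                time.sleep(0.1)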

  19. Direct Desktop Printed-Circuits-on-Paper Flexible Electronics

    Science.gov (United States)

    Zheng, Yi; He, Zhizhu; Gao, Yunxia; Liu, Jing

    2013-01-01

There is currently no way to directly write out electronics, just like printing pictures on paper with an office printer. Here we show desktop printing of flexible circuits on paper by developing a liquid metal ink and the related working mechanisms. By modifying the adhesion of the ink, overcoming its high surface tension with a dispensing machine, designing a brush-like porous pinhead for printing the alloy, and identifying matched substrate materials among different papers, the slightly oxidized alloy ink was demonstrated to print flexibly on coated paper, which could compose various functional electronics; the concept of Printed-Circuits-on-Paper was thus presented. Further, RTV silicone rubber was adopted as isolating ink and packaging material to guarantee the functional stability of the circuit, which suggests an approach for printing 3D hybrid electro-mechanical devices. The present work paves the way for a low-cost and straightforward method of directly printing paper electronics.

  20. Finite-difference method Stokes solver (FDMSS) for 3D pore geometries: Software development, validation and case studies

    Science.gov (United States)

    Gerke, Kirill M.; Vasilyev, Roman V.; Khirevich, Siarhei; Collins, Daniel; Karsanina, Marina V.; Sizonenko, Timofey O.; Korost, Dmitry V.; Lamontagne, Sébastien; Mallants, Dirk

    2018-05-01

Permeability is one of the fundamental properties of porous media and is required for large-scale Darcian fluid flow and mass transport models. Whilst permeability can be measured directly at a range of scales, there are increasing opportunities to evaluate permeability from pore-scale fluid flow simulations. We introduce the free software Finite-Difference Method Stokes Solver (FDMSS) that solves the Stokes equation using a finite-difference method (FDM) directly on voxelized 3D pore geometries (i.e. without meshing). Based on explicit convergence studies, validation on sphere packings with analytically known permeabilities, and comparison against lattice-Boltzmann and other published FDM studies, we conclude that FDMSS provides a computationally efficient and accurate basis for single-phase pore-scale flow simulations. By implementing an efficient parallelization and code optimization scheme, permeability inferences can now be made from 3D images of up to 10^9 voxels using modern desktop computers. Case studies demonstrate the broad applicability of the FDMSS software for both natural and artificial porous media.
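
    The abstract does not reproduce the governing equations, but the permeability inference it refers to is typically a Darcy-law back-calculation from the simulated Stokes flow field. A minimal sketch of that step, with made-up sample values rather than FDMSS output:

        # Back-calculating permeability from a pore-scale flow simulation via Darcy's law:
        #   k = Q * mu * L / (A * dP)
        # The numbers below are made-up sample values, not FDMSS results.
        mu = 1.0e-3        # fluid dynamic viscosity (Pa*s), water
        L  = 1.0e-3        # sample length along the flow direction (m)
        A  = 1.0e-6        # cross-sectional area of the sample (m^2)
        dP = 100.0         # applied pressure drop across the sample (Pa)
        Q  = 2.0e-9        # volumetric flow rate from the Stokes solution (m^3/s)

        k_m2    = Q * mu * L / (A * dP)     # permeability in m^2
        k_darcy = k_m2 / 9.869e-13          # 1 darcy ~ 9.869e-13 m^2

        print(f"permeability ~ {k_m2:.3e} m^2 ({k_darcy:.1f} darcy)")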

  1. Finite-difference method Stokes solver (FDMSS) for 3D pore geometries: Software development, validation and case studies

    KAUST Repository

    Gerke, Kirill M.

    2018-01-17

Permeability is one of the fundamental properties of porous media and is required for large-scale Darcian fluid flow and mass transport models. Whilst permeability can be measured directly at a range of scales, there are increasing opportunities to evaluate permeability from pore-scale fluid flow simulations. We introduce the free software Finite-Difference Method Stokes Solver (FDMSS) that solves the Stokes equation using a finite-difference method (FDM) directly on voxelized 3D pore geometries (i.e. without meshing). Based on explicit convergence studies, validation on sphere packings with analytically known permeabilities, and comparison against lattice-Boltzmann and other published FDM studies, we conclude that FDMSS provides a computationally efficient and accurate basis for single-phase pore-scale flow simulations. By implementing an efficient parallelization and code optimization scheme, permeability inferences can now be made from 3D images of up to 10^9 voxels using modern desktop computers. Case studies demonstrate the broad applicability of the FDMSS software for both natural and artificial porous media.

  2. The Global Climate Dashboard: a Software Interface to Stream Comprehensive Climate Data

    Science.gov (United States)

    Gardiner, N.; Phillips, M.; NOAA Climate Portal Dashboard

    2011-12-01

The Global Climate Dashboard is an integral component of NOAA's web portal to climate data, services, and value-added content for decision-makers, teachers, and the science-attentive public (www.climate.gov). The dashboard provides a rapid view of observational data that demonstrate climate change and variability, as well as outputs from the Climate Model Intercomparison Project version 3, which was built to support the Intergovernmental Panel on Climate Change fourth assessment. The data shown in the dashboard therefore span a range of climate science disciplines with applications that serve audiences with diverse needs. The dashboard is designed with reusable software components that allow it to be implemented incrementally on a wide range of platforms including desktops, tablet devices, and mobile phones. The underlying software components support live streaming of data and provide a way of encapsulating graph styles and other presentation details into a device-independent standard format that results in a common visual look and feel across all platforms. Here we describe the pedagogical objectives, technical implementation, and the deployment of the dashboard through climate.gov and partner web sites and describe plans to develop a mobile application using the same framework.

  3. The Clinical Utilisation of Respiratory Elastance Software (CURE Soft): a bedside software for real-time respiratory mechanics monitoring and mechanical ventilation management.

    Science.gov (United States)

    Szlavecz, Akos; Chiew, Yeong Shiong; Redmond, Daniel; Beatson, Alex; Glassenbury, Daniel; Corbett, Simon; Major, Vincent; Pretty, Christopher; Shaw, Geoffrey M; Benyo, Balazs; Desaive, Thomas; Chase, J Geoffrey

    2014-09-30

Real-time patient respiratory mechanics estimation can be used to guide mechanical ventilation settings, particularly positive end-expiratory pressure (PEEP). This work presents software, Clinical Utilisation of Respiratory Elastance (CURE Soft), that uses a time-varying respiratory elastance model to offer this ability and aid mechanical ventilation treatment. CURE Soft is a desktop application developed in JAVA. It has two modes of operation: 1) online, for real-time monitoring and decision support, and 2) offline, for user education purposes, auditing, or reviewing patient care. CURE Soft has been tested in mechanically ventilated patients with respiratory failure. The clinical protocol, software testing and use of the data were approved by the New Zealand Southern Regional Ethics Committee. Using CURE Soft, patients' respiratory mechanics responses to treatment and to the clinical protocol were monitored. Results showed that a patient's respiratory elastance (stiffness) changed with the use of muscle relaxants, and responded differently to ventilator settings. This information can be used to guide mechanical ventilation therapy and titrate optimal ventilator PEEP. CURE Soft enables real-time calculation of model-based respiratory mechanics for mechanically ventilated patients. Results showed that the system is able to provide detailed, previously unavailable information on patient-specific respiratory mechanics and response to therapy in real time. The additional insight available to clinicians provides the potential for improved decision-making, and thus improved patient care and outcomes.
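
    CURE Soft's time-varying elastance model is not given in the record; as a rough illustration of the kind of estimation involved, the constant-coefficient single-compartment equation P(t) = E*V(t) + R*Q(t) + P0 can be fitted to pressure, flow and volume samples by least squares. The data below are synthetic and the fit is not the software's actual algorithm:

        import numpy as np

        # Single-compartment lung model: P = E*V + R*Q + P0, fitted by least squares.
        # Synthetic breath data for illustration only.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 1.0, 50)                 # inspiration, seconds
        Q = 0.5 * np.sin(np.pi * t)                 # flow (L/s)
        V = np.cumsum(Q) * (t[1] - t[0])            # volume (L), integral of flow
        E_true, R_true, P0 = 25.0, 10.0, 5.0        # cmH2O/L, cmH2O/(L/s), cmH2O
        P = E_true * V + R_true * Q + P0 + rng.normal(0, 0.2, t.size)

        # Solve for [E, R, P0]
        X = np.column_stack([V, Q, np.ones_like(t)])
        (E_hat, R_hat, P0_hat), *_ = np.linalg.lstsq(X, P, rcond=None)

        print(f"E ~ {E_hat:.1f} cmH2O/L, R ~ {R_hat:.1f} cmH2O/(L/s), P0 ~ {P0_hat:.1f} cmH2O")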

  4. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help by OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  5. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    Science.gov (United States)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  6. Factors influencing awareness and attendance of traditional oral ...

    African Journals Online (AJOL)

Data were recorded using SPSS version 16 software. ... Conclusion: The study showed moderate awareness of traditional oral care .... Descriptive and inferential statistics were used as .....

  7. Lowering the Barriers to Using Data: Enabling Desktop-based HPD Science through Virtual Environments and Web Data Services

    Science.gov (United States)

    Druken, K. A.; Trenham, C. E.; Steer, A.; Evans, B. J. K.; Richards, C. J.; Smillie, J.; Allen, C.; Pringle, S.; Wang, J.; Wyborn, L. A.

    2016-12-01

    The Australian National Computational Infrastructure (NCI) provides access to petascale data in climate, weather, Earth observations, and genomics, and terascale data in astronomy, geophysics, ecology and land use, as well as social sciences. The data is centralized in a closely integrated High Performance Computing (HPC), High Performance Data (HPD) and cloud facility. Despite this, there remain significant barriers for many users to find and access the data: simply hosting a large volume of data is not helpful if researchers are unable to find, access, and use the data for their particular need. Use cases demonstrate we need to support a diverse range of users who are increasingly crossing traditional research discipline boundaries. To support their varying experience, access needs and research workflows, NCI has implemented an integrated data platform providing a range of services that enable users to interact with our data holdings. These services include: - A GeoNetwork catalog built on standardized Data Management Plans to search collection metadata, and find relevant datasets; - Web data services to download or remotely access data via OPeNDAP, WMS, WCS and other protocols; - Virtual Desktop Infrastructure (VDI) built on a highly integrated on-site cloud with access to both the HPC peak machine and research data collections. The VDI is a fully featured environment allowing visualization, code development and analysis to take place in an interactive desktop environment; and - A Learning Management System (LMS) containing User Guides, Use Case examples and Jupyter Notebooks structured into courses, so that users can self-teach how to use these facilities with examples from our system across a range of disciplines. We will briefly present these components, and discuss how we engage with data custodians and consumers to develop standardized data structures and services that support the range of needs. We will also highlight some key developments that have
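
    As a concrete illustration of the web data services layer (not NCI documentation; the endpoint, variable and coordinate names below are placeholders), a dataset published via OPeNDAP can typically be opened lazily from a desktop Python session with xarray:

        import xarray as xr

        # Hypothetical OPeNDAP endpoint; substitute a real dataset URL found through
        # the catalogue search. Variable and coordinate names are assumptions.
        url = "https://example-thredds.example.org/thredds/dodsC/some/climate/dataset.nc"

        # Opening over OPeNDAP is lazy: only metadata is read until values are requested.
        ds = xr.open_dataset(url)
        print(ds)

        # Subset before pulling values, so only the slice of interest crosses the network.
        subset = ds["tas"].sel(time="2010-01", lat=slice(-45, -10), lon=slice(110, 155))
        print(subset.mean().values)

    Subsetting before reading is the point of serving data this way: the analysis stays on the desktop while only small slices of a petascale collection are transferred.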

  8. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    Science.gov (United States)

    2011-01-01

Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  9. Software ``Best'' Practices: Agile Deconstructed

    Science.gov (United States)

    Fraser, Steven

Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  10. Resilience Engineering in Critical Long Term Aerospace Software Systems: A New Approach to Spacecraft Software Safety

    Science.gov (United States)

    Dulo, D. A.

Safety critical software systems permeate spacecraft, and in a long-term venture like a starship they would be pervasive in every system of the spacecraft. Yet software failure today continues to plague both the systems and the organizations that develop them, resulting in the loss of life, time, money, and valuable system platforms. A starship cannot afford this type of software failure on long journeys away from home. A single software failure could have catastrophic results for the spaceship and the crew onboard. This paper will offer a new approach to developing safe, reliable software systems by focusing not on the traditional safety/reliability engineering paradigms but on a new paradigm: Resilience and Failure Obviation Engineering. The foremost objective of this approach is the obviation of failure, coupled with the ability of a software system to prevent or adapt to complex changing conditions in real time, as a safety valve should failure occur, to ensure safe system continuity. Through this approach, safety is ensured through foresight to anticipate failure and to adapt to risk in real time before failure occurs. In a starship, this type of software engineering is vital. Through software developed in a resilient manner, a starship would have reduced or eliminated software failure, and would have the ability to rapidly adapt should a software system become unstable or unsafe. As a result, long-term software safety, reliability, and resilience would be present for a successful long-term starship mission.

  11. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    Science.gov (United States)

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  12. Software for math and science education for the deaf.

    Science.gov (United States)

    Adamo-Villani, Nicoletta; Wilbur, Ronnie

    2010-01-01

In this article, we describe the development of two novel approaches to teaching math and science concepts to deaf children using 3D animated interactive software. One approach, Mathsigner, is non-immersive and the other, SMILE, is a virtual reality immersive environment. The content is curriculum-based, and the animated signing characters are constructed with state-of-the-art technology and design. We report preliminary development findings of usability and appeal based on programme features (e.g. 2D/3D, immersiveness, interaction type, avatar and interface design) and subject features (hearing status, gender and age). Programme features of 2D/3D, immersiveness and interaction type were very much affected by subject features. Among subject features, we find significant effects of hearing status (deaf children take longer and make more mistakes than hearing children) and gender (girls take longer than boys; girls prefer immersive environments rather than desktop presentation; girls are more interested in content than technology compared to boys). For avatar type, we found a preference for seamless, deformable characters over segmented ones. For interface comparisons, there were no subject effects, but an animated interface resulted in reduced time to task completion compared to static interfaces with and without sound and highlighting.

  13. Bandwidth Requirements Analysis for Using Justin.tv Web Streaming as an E-Learning Medium with Wirecast and Desktop Presenter

    Directory of Open Access Journals (Sweden)

    Muhammad Ubaidilah

    2014-05-01

Full Text Available The rapid development of information technology has changed the perspective of many people, including the way education is delivered. One example is learning based on Information and Communication Technologies (ICT), namely learning through streaming video. With the installation of the open source software Wirecast and Desktop Presenter, instructional streaming video can be produced and broadcast in real time through the broadcast medium justin.tv (an Internet TV channel), supporting the concept of learning anytime and anywhere. The biggest problem with this technology is limited bandwidth. Bandwidth is an important parameter for streaming over a network, and digital video communication consumes considerable resources, so Wireshark is needed here to analyze the bandwidth of the packets received by the client. From measurements of H.264 video at a resolution of 720 x 540, with an average sampling duration of 20 minutes over 30 streaming video test samples analyzed with Wireshark, the overall average throughput was 0.343 Mbps, the lowest average throughput 0.309 Mbps and the highest 0.372 Mbps. It can be concluded that higher throughput yields better streaming video quality, while lower throughput degrades it.
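
    The throughput figures in such a measurement follow directly from captured packet sizes and timestamps. A minimal sketch of the calculation, using a made-up packet list rather than the study's Wireshark capture:

        # Average throughput from a packet capture: total bits / capture duration.
        # The (timestamp_seconds, payload_bytes) pairs are invented; in practice they
        # would come from a Wireshark/tshark export of the streaming session.
        packets = [(0.00, 1400), (0.03, 1400), (0.07, 900), (0.12, 1400), (0.18, 1200)]

        duration_s = packets[-1][0] - packets[0][0]
        total_bits = sum(size * 8 for _, size in packets)

        throughput_mbps = total_bits / duration_s / 1e6
        print(f"average throughput ~ {throughput_mbps:.3f} Mbps")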

  14. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    International Nuclear Information System (INIS)

    Oh, Wang Kyun

    2014-01-01

Usefulness and clinical availability for surgical efficiency were evaluated by conducting pre-operative planning with a model manufactured on a desktop 3D printer from a clavicle CT image. The patient-customized clavicle fracture model was manufactured on a desktop 3D printer using the FDM wire-laminating process, by converting the CT image into an STL file in the open-source DICOM viewer OsiriX. In addition, a model of the original shape before the damage was restored and manufactured with a mirror technique, based on the STL file of the unfractured clavicle of the other side, using the symmetry of the human body. The model reproduced the position, size and degree of the fracture exactly. A clavicle model manufactured directly in the Department of Radiology, at low cost and in little time, is considered useful because it can reduce secondary damage during surgery and increase surgical efficiency in minimally invasive percutaneous plate osteosynthesis (MIPO).

  15. Evaluation of usefulness and availability for orthopedic surgery using clavicle fracture model manufactured by desktop 3D printer

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Wang Kyun [Dept. of Diagnostic Radiology, Cheongju Medical Center, Cheongju (Korea, Republic of)

    2014-09-15

Usefulness and clinical availability for surgical efficiency were evaluated by conducting pre-operative planning with a model manufactured on a desktop 3D printer from a clavicle CT image. The patient-customized clavicle fracture model was manufactured on a desktop 3D printer using the FDM wire-laminating process, by converting the CT image into an STL file in the open-source DICOM viewer OsiriX. In addition, a model of the original shape before the damage was restored and manufactured with a mirror technique, based on the STL file of the unfractured clavicle of the other side, using the symmetry of the human body. The model reproduced the position, size and degree of the fracture exactly. A clavicle model manufactured directly in the Department of Radiology, at low cost and in little time, is considered useful because it can reduce secondary damage during surgery and increase surgical efficiency in minimally invasive percutaneous plate osteosynthesis (MIPO).

  16. Oak Ridge Institutional Cluster Autotune Test Drive Report

    Energy Technology Data Exchange (ETDEWEB)

    Jibonananda, Sanyal [ORNL; New, Joshua Ryan [ORNL

    2014-02-01

    The Oak Ridge Institutional Cluster (OIC) provides general purpose computational resources for the ORNL staff to run computation heavy jobs that are larger than desktop applications but do not quite require the scale and power of the Oak Ridge Leadership Computing Facility (OLCF). This report details the efforts made and conclusions derived in performing a short test drive of the cluster resources on Phase 5 of the OIC. EnergyPlus was used in the analysis as a candidate user program and the overall software environment was evaluated against anticipated challenges experienced with resources such as the shared memory-Nautilus (JICS) and Titan (OLCF). The OIC performed within reason and was found to be acceptable in the context of running EnergyPlus simulations. The number of cores per node and the availability of scratch space per node allow non-traditional desktop focused applications to leverage parallel ensemble execution. Although only individual runs of EnergyPlus were executed, the software environment on the OIC appeared suitable to run ensemble simulations with some modifications to the Autotune workflow. From a standpoint of general usability, the system supports common Linux libraries, compilers, standard job scheduling software (Torque/Moab), and the OpenMPI library (the only MPI library) for MPI communications. The file system is a Panasas file system which literature indicates to be an efficient file system.
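
    As a hedged sketch of how an EnergyPlus ensemble might be spread across OIC nodes using the OpenMPI stack mentioned in the report (this is not the actual Autotune workflow), the Python script below distributes a set of IDF files over MPI ranks with mpi4py; the energyplus command-line flags, the /scratch path and the file names are assumptions.

```python
# Minimal sketch (assumptions: mpi4py available on top of the cluster's OpenMPI,
# an `energyplus` binary on PATH accepting -w/-d flags, and per-node scratch at
# /scratch). Each MPI rank runs its share of an ensemble of parameterized IDF files.
import os, subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

idf_files = [f"ensemble/run_{i:04d}.idf" for i in range(256)]   # hypothetical inputs
my_runs = idf_files[rank::size]                                 # round-robin split across ranks

for idf in my_runs:
    outdir = os.path.join("/scratch", f"rank{rank}", os.path.basename(idf))
    os.makedirs(outdir, exist_ok=True)
    subprocess.run(["energyplus", "-w", "weather.epw", "-d", outdir, idf], check=True)

comm.Barrier()
if rank == 0:
    print("ensemble complete")
```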

  17. Interactive multicentre teleconferences using open source software in a team of thoracic surgeons.

    Science.gov (United States)

    Ito, Kazuhiro; Shimada, Junichi; Katoh, Daishiro; Nishimura, Motohiro; Yanada, Masashi; Okada, Satoru; Ishihara, Shunta; Ichise, Kaori

    2012-12-01

    Real-time consultation between a team of thoracic surgeons is important for the management of difficult cases. We established a system for interactive teleconsultation between multiple sites, based on open-source software. The graphical desktop-sharing system VNC (virtual network computing) was used for remotely controlling another computer. An image-processing package (OsiriX) was installed on the server to share the medical images. We set up a voice communication system using Voice Chatter, a free, cross-platform voice communication application. Four hospitals participated in the trials. One was connected by gigabit ethernet, one by WiMAX and one by ADSL. Surgeons at three of the sites found that it was comfortable to view images and consult with each other using the teleconferencing system. However, it was not comfortable using the client that connected via WiMAX, because of dropped frames. Apart from the WiMAX connection, the VNC-based screen-sharing system transferred the clinical images efficiently and in real time. We found the screen-sharing software VNC to be a good application for medical image interpretation, especially for a team of thoracic surgeons using multislice CT scans.

  18. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  19. Fabrication of cerebral aneurysm simulator with a desktop 3D printer.

    Science.gov (United States)

    Liu, Yu; Gao, Qing; Du, Song; Chen, ZiChen; Fu, JianZhong; Chen, Bing; Liu, ZhenJie; He, Yong

    2017-05-17

    More and more patients are suffering from cerebral aneurysms, yet the long training time limits the growth of the cerebrovascular neurosurgical workforce. Here we developed a novel cerebral aneurysm simulator that better represents the dynamic bulging process of a cerebral aneurysm. The proposed simulator integrates a hollow elastic vascular model, a skull model and a brain model, and can be affordably fabricated at the clinic (Fab@Clinic) for under $25.00 each with the help of a low-cost desktop 3D printer. Moreover, blood flow and pulsation pressure similar to those in humans can be well simulated, which can be used to train neurosurgical residents to clip aneurysms more effectively.

  20. Protocol independent transmission method in software defined optical network

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Hou, Yanfang; Qiu, Yajun; Ji, Yuefeng

    2016-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (i.e., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). Using a proprietary protocol or encoding format is one way to improve information security; however, a flow carried by a proprietary protocol or encoding cannot traverse the traditional IP network. In addition, ultra-high-definition video transmission has once again become a hot topic. Traditionally, in the IP network, the Serial Digital Interface (SDI) signal must be compressed. Compression offers some advantages but also brings disadvantages such as signal degradation and high latency. To some extent, HD-SDI can also be regarded as a proprietary protocol that requires transparent transmission, for example over an optical channel, yet traditional optical networks cannot support such flexible traffic. In response to these challenges for future networks, one immediate solution is to use NFV technology to abstract the network infrastructure and provide an all-optical switching topology graph for the SDN control plane. This paper proposes a new service-based software-defined optical network architecture consisting of an infrastructure layer, a virtualization layer, a service abstraction layer and an application layer. We then dwell on the corresponding service provisioning method used to implement protocol-independent transport. Finally, we experimentally show that the proposed service provisioning method can transmit the HD-SDI signal in the software-defined optical network.

  1. Comparing internal and external run-time coupling of CFD and building energy simulation software

    NARCIS (Netherlands)

    Djunaedy, E.; Hensen, J.L.M.; Loomans, M.G.L.C.

    2004-01-01

    This paper describes a comparison between internal and external run-time coupling of CFD and building energy simulation software. Internal coupling can be seen as the "traditional" way of developing software, i.e. the capabilities of existing software are expanded by merging codes. With external

  2. DYNALIGHT DESKTOP

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin; Kjær, Katrine Heinsvig; Ottosen, Carl-Otto

    2018-01-01

    for energy and cost-efficient climate control strategies that do not compromise product quality. In this paper, we present a novel approach addressing dynamic control of supplemental light in greenhouses aiming to decrease electricity costs and energy consumption without loss in plant productivity. Our...... approach uses weather forecasts and electricity prices to compute energy and cost-efficient supplemental light plans, which fulfils the production goals of the grower. The approach is supported by a set of newly developed planning software, which interfaces with a greenhouse climate computer. The planning...... algorithm is based on a new plant physiological understanding that utilizes the natural plasticity in plants to irregular light periods. The results revealed that different light control strategies using three different set points of daily photosynthesis integral (DPI) compared to a control treatment...

  3. Towards Activity Context using Software Sensors

    Directory of Open Access Journals (Sweden)

    Kamran Taj Pathan

    2009-06-01

    Full Text Available Service-Oriented Computing delivers the promise of configuring and reconfiguring software systems to address users' needs in a dynamic way. Context-aware computing promises to capture the users' needs and hence the requirements they have on systems. The marriage of both can deliver ad-hoc software solutions relevant to the user in the most current fashion. However, the key here is to gather information on the users' activity (that is, what they are doing). Traditionally, context sensing was conducted with hardware sensors, but software can play the same role and in some situations is more useful for sensing the activity of the user. Furthermore, software sensors can make use of the fact that service-oriented systems exchange information through standard protocols. In this paper we discuss our proposed approach to sensing the activity of the user using software.

  4. Addressing software security and mitigations in the life cycle

    Science.gov (United States)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2004-01-01

    Traditionally, security is viewed as an organizational and Information Technology (IT) systems function comprising firewalls, intrusion detection systems (IDS), system security settings, and patches to the operating system (OS) and the applications running on it. Until recently, little thought has been given to the importance of security as a formal approach in the software life cycle. The Jet Propulsion Laboratory has approached the problem through the development of an integrated formal Software Security Assessment Instrument (SSAI) with six foci for the software life cycle.

  5. Precise Documentation: The Key to Better Software

    Science.gov (United States)

    Parnas, David Lorge

    The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed, whereas in other fields of engineering much of the documentation is written before and during development: it represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  6. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    Science.gov (United States)

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  7. Quality assurance of EDP software in practical application

    International Nuclear Information System (INIS)

    Winkler, H.

    1982-01-01

    Alongside the specific properties of software, it is mainly aspects outside the traditional testing field that matter for its quality assurance. Measures for quality assurance must, in particular, start during development, which presupposes a development process organized around partial results. Because of the high quality demands, tools for testing and inspection are of great importance. The problems in software quality assurance are typical of a young technical field whose necessity is undisputed but which, owing to an insufficient scientific foundation, still has to operate on an empirical-pragmatic level. (orig.) [de

  8. An x-ray detection system development for Tandem Mirror Experiment Upgrade (TMX-U): Hardware and software

    International Nuclear Information System (INIS)

    Jones, R.M.; Coutts, G.W.; Failor, B.H.

    1983-01-01

    This x-ray detection system measures the electron Bremsstrahlung spectrum from the Tandem Mirror Experiment-Upgrade (TMX-U). From this spectrum, we can calculate the electron temperature. The low energy portion of the spectrum (0.5-40 keV) is measured by a liquid-nitrogen-cooled, lithium-drifted silicon detector. The higher energy spectrometer uses an intrinsic germanium detector to accommodate the 100 to 200 keV spectra. The system proceeds as follows. The preamplified detector signals are digitized by a high-speed A-to-D converter located in a Computer Automated Measurement and Control (CAMAC) crate. The data are then stored in a histogramming memory via a data router. The CAMAC crate interfaces with a local desktop computer or the main data acquisition computer that stores the data. The software sets up the modules, acquires the energy spectra (with sample times as short as 2 ms) and plots them. Up to 40 time-resolved spectra are available during one plasma cycle. The actual module configuration, CAMAC interfacing and software that runs the system are the subjects of this paper

  9. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    Science.gov (United States)

    Ling, Guangming

    2016-01-01

    To investigate possible iPad related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  10. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  11. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web.

    Science.gov (United States)

    Miller, Chase A; Anthony, Jon; Meyer, Michelle M; Marth, Gabor

    2013-02-01

    High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported.

  12. Towards reference architectures as an enabler for software ecosystems

    DEFF Research Database (Denmark)

    Knodel, Jens; Manikas, Konstantinos

    2016-01-01

    Software ecosystems - a topic with increasingly growing interest in academia and industry in the past decade - arguably revolutionized many aspects of industrial software engineering (business models, architectures, platforms, project executions, collaboration models, distribution of assets......, to name a few). Software ecosystems enable the contribution of external actors centered around a common technology and the potential distribution of the actor contributions to an existing user set. Reference architectures have been proven successful and beneficial for software product lines...... and traditional software development within distinct domains. They arguably come with a set of benefits that severely counterweights the additional effort of design and implementation. But what is the role of reference architectures in an ecosystem setting? In this position paper, we argue for the use...

  13. Introduction of the digitization software GDgraph

    International Nuclear Information System (INIS)

    Chen Guochang; Jin Yongli; Wang Jimin

    2015-01-01

    The evaluators and experimenters always want complete, up-to-date experimental data sets. However, for some publications and journals the data are published only as figures, without numerical values, and the quality of the figures is not always good enough, especially for figures scanned from hard copies of old publications. On the other hand, researchers would like to retrieve the data directly from the EXFOR database. When only figures are available from a publication, digitization is the only way to obtain the numerical data and the associated uncertainties. CNDC therefore needs a digitization tool that meets the requirements of evaluation, measurement and EXFOR compilation. Before 2000 there was no common software for digitizing experimental and evaluated data, and digitization with traditional coordinate paper or rulers could not meet the quality requirements of evaluation and measurement. By the end of the twentieth century, personal computers had developed so quickly that writing digitization software became practical. CNDC has worked on the design and development of such software since 1997; four years later, the first version of the digitization software GDGraph, written in VC++, was released at CNDC. Although this first version met only the basic requirements of digitization (it could digitize one data group without uncertainties, accepted the BMP image format only, and could not delete arbitrary digitized points), it produced digitization results of higher quality and efficiency than the traditional method
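
    The core arithmetic behind such a digitization tool is a mapping from pixel coordinates to data coordinates. The sketch below is a minimal illustration of that step for linear axes (it is not GDGraph's code); the calibration pixels, axis values and clicked points are hypothetical.

```python
# Minimal sketch of the arithmetic behind plot digitization (not GDGraph's code):
# map pixel coordinates of clicked points to data coordinates, assuming linear
# axes calibrated with two reference pixels of known value on each axis.
def make_axis_map(pixel_a, value_a, pixel_b, value_b):
    """Return a function converting a pixel coordinate to a data value."""
    scale = (value_b - value_a) / (pixel_b - pixel_a)
    return lambda pixel: value_a + (pixel - pixel_a) * scale

# Hypothetical calibration: x axis pixels 100->0.0 and 900->10.0 MeV;
# y axis pixels 600->0.0 and 50->5.0 barn (y grows upward in data, downward in pixels).
x_map = make_axis_map(100, 0.0, 900, 10.0)
y_map = make_axis_map(600, 0.0, 50, 5.0)

digitized_pixels = [(180, 540), (420, 310), (760, 120)]   # points clicked on the figure
for px, py in digitized_pixels:
    print(f"E = {x_map(px):.3f} MeV, sigma = {y_map(py):.3f} b")
```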

  14. THE EVOLUTION OF CARTOGRAPHIC VISUALIZATION IN CONJUNCTION WITH GEOGRAPHIC INFORMATION SCIENCE

    Directory of Open Access Journals (Sweden)

    László Zentai

    2014-01-01

    Full Text Available Digital production techniques improved rapidly around 1980-1990, driven by the requirements of information technology. Both hardware and software components were essential to this development, but its first milestone was the release of the personal computer. In cartographic visualization, GIS technologies were invented in the 1970s; however, for a long time development focused on data input and analysis. The need for real map production features (producing paper maps that conform to cartographic traditions within a GIS software environment) was raised only after most paper maps had been converted into digital ones. Non-GIS map production, on the other hand, could draw on desktop publishing technologies developed about ten years earlier. Nowadays GIS-based map production offers visualization methods that have no antecedents in traditional cartography. Such contemporary cartographic visualization techniques look very trendy, but their efficiency of representation has not been seriously tested, and the interpretation of unusual visualization techniques can be misleading and less efficient than software developers expect. The traditional visualization techniques of thematic cartography can be successfully combined with recent IT platforms: mobile phones, tablets, etc.

  15. Open Source software and social networks: Disruptive alternatives for medical imaging

    International Nuclear Information System (INIS)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris

    2011-01-01

    In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture providing new perspectives and innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software and recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. Methods: This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software developments and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Observations: Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Developed by developers that are themselves part of the user community, these tools are usually better adapted to the user's need and are more robust than traditional software programs being developed and tested by a large number of contributing users. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed the social behavior and habits adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks allowing groups of people to easily communicate

  16. Open Source software and social networks: Disruptive alternatives for medical imaging

    Energy Technology Data Exchange (ETDEWEB)

    Ratib, Osman, E-mail: osman.ratib@hcuge.ch [Department of Medical Imaging and Information Sciences, University Hospital of Geneva, 24, rue Micheli-du-Crest, 1205 Geneva (Switzerland); Rosset, Antoine; Heuberger, Joris [Department of Medical Imaging and Information Sciences, University Hospital of Geneva, 24, rue Micheli-du-Crest, 1205 Geneva (Switzerland)

    2011-05-15

    In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture providing new perspectives and innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software and recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. Methods: This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software developments and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Observations: Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Developed by developers that are themselves part of the user community, these tools are usually better adapted to the user's need and are more robust than traditional software programs being developed and tested by a large number of contributing users. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed the social behavior and habits adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks allowing groups of people to easily

  17. Open Source software and social networks: disruptive alternatives for medical imaging.

    Science.gov (United States)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris

    2011-05-01

    In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture providing new perspectives and innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software and recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software developments and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Developed by developers that are themselves part of the user community, these tools are usually better adapted to the user's need and are more robust than traditional software programs being developed and tested by a large number of contributing users. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed the social behavior and habits adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks allowing groups of people to easily communicate and exchange information

  18. Development of a monitoring system for Software Defined Networks (SDN)

    OpenAIRE

    Navarro Sánchez, Albert

    2015-01-01

    Software Defined Networks (SDN) are an emerging technology that allows software components to extend functionality over the network. This project studies how Polygraph, a monitoring system for traditional networks, can fit into an SDN scenario.

  19. Software approach to minimizing problems of student-lecturer ...

    African Journals Online (AJOL)

    Lecturer Interaction in Higher institutions of learning. The Software was developed using PHP and hosted in the University web server, and the interaction between students and their lecturers was compared using both the traditional approaches ...

  20. Software authority transition through multiple distributors.

    Science.gov (United States)

    Han, Kyusunk; Shon, Taeshik

    2014-01-01

    The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times.
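
    The online authentication idea can be illustrated with a generic signed-token check. The sketch below is not the scheme proposed in the paper; it simply shows how a distributor-side HMAC over a purchase record could be verified before the application runs, with a hypothetical key and record format.

```python
# Generic illustration of online license verification (not the paper's protocol):
# the store signs a purchase record with a secret key, and the device-side check
# recomputes the HMAC before allowing the application to run. Key and record
# fields are hypothetical.
import hmac, hashlib

STORE_KEY = b"store-secret-key"                     # held by the distributor

def sign_purchase(user_id: str, app_id: str) -> str:
    record = f"{user_id}:{app_id}".encode()
    return hmac.new(STORE_KEY, record, hashlib.sha256).hexdigest()

def verify_purchase(user_id: str, app_id: str, token: str) -> bool:
    expected = sign_purchase(user_id, app_id)
    return hmac.compare_digest(expected, token)

token = sign_purchase("alice", "com.example.editor")
print(verify_purchase("alice", "com.example.editor", token))    # True
print(verify_purchase("mallory", "com.example.editor", token))  # False
```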

  1. A research on the application of software defined networking in satellite network architecture

    Science.gov (United States)

    Song, Huan; Chen, Jinqiang; Cao, Suzhi; Cui, Dandan; Li, Tong; Su, Yuxing

    2017-10-01

    Software-defined networking (SDN) is a new type of network architecture that decouples the control plane and data plane of the traditional network, offers flexible configuration, and is one direction for next-generation terrestrial Internet development. The satellite network is an important part of the space-ground integrated information network, but the traditional satellite network suffers from difficult topology maintenance and slow configuration. Applying SDN technology in the satellite network can solve these problems. At present, research on applying SDN in satellite networks is still at a preliminary stage. In this paper, we first introduce SDN technology and satellite network architecture. We then present software-defined satellite network architectures, compare the different architectures, and discuss satellite network virtualization. Finally, the current research status and development trend of SDN technology in satellite networks are analyzed.

  2. Effects of a Technology Supported Project Based Learning (TS-PBL) Approach on the Success of a Mobile Application Development Course and the Students’ Opinions

    Directory of Open Access Journals (Sweden)

    Fezile Ozdamli

    2017-05-01

    Full Text Available Similar to traditional desktop software development processes, teamwork is a necessity in the mobile application development process. Thus, the aim of this study is to examine the effects of the technology supported project-based learning approach in mobile application development courses on the academic achievement of students and to clarify the engineering students’ opinions. A total of 130 engineering students from the Department taking mobile application development courses were the participants of this study. The lessons progressed in one group in the form of technology supported project-based learning steps, while in the other group, they were conducted using traditional methods. Based on the results, the practical implementation of a mobile application with a TS-PBL approach in engineering students’ education will be discussed.

  3. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the greatest losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc...). Moreover, because one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting, engineering design, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
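
    The two ideas highlighted above, desktop parallelism and domain tracking, can be sketched in a few lines. The Python example below is only an illustration under assumed data structures: the paper used Java multithreading and the full shallow water equations, whereas this sketch uses a thread pool and a placeholder diffusion-like update applied only to currently wet cells.

```python
# Minimal sketch of (1) desktop parallelism via a thread pool and (2) "domain
# tracking" by updating only cells that are currently wet. The update rule is a
# placeholder, not the shallow water equations.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def step_block(depth, new_depth, rows, dt=0.1, k=0.25):
    for i in rows:
        wet = np.where(depth[i] > 0.001)[0]            # domain tracking: wet cells only
        for j in wet:
            nbrs = depth[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            new_depth[i, j] = depth[i, j] + dt * k * (nbrs.mean() - depth[i, j])

def simulate(depth, steps=10, workers=4):
    for _ in range(steps):
        new_depth = depth.copy()
        blocks = np.array_split(range(depth.shape[0]), workers)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            pool.map(lambda rows: step_block(depth, new_depth, rows), blocks)
        depth = new_depth
    return depth

grid = np.zeros((200, 200)); grid[100, 100] = 2.0      # a single flooded cell (metres)
print(simulate(grid).max())
```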

  4. Emission of particulate matter from a desktop three-dimensional (3D) printer

    Science.gov (United States)

    Yi, Jinghai; LeBouf, Ryan F.; Duling, Matthew G.; Nurkiewicz, Timothy; Chen, Bean T.; Schwegler-Berry, Diane; Virji, M. Abbas; Stefaniak, Aleksandr B.

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5-m3 chamber and in a small room (32.7 m3) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color. PMID:27196745

  5. Emission of particulate matter from a desktop three-dimensional (3D) printer.

    Science.gov (United States)

    Yi, Jinghai; LeBouf, Ryan F; Duling, Matthew G; Nurkiewicz, Timothy; Chen, Bean T; Schwegler-Berry, Diane; Virji, M Abbas; Stefaniak, Aleksandr B

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5-m(3) chamber and in a small room (32.7 m(3)) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color.
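
    A common way to turn such chamber measurements into an emission rate is a well-mixed mass-balance model. The sketch below illustrates that arithmetic only; it is not the exact method of the study, and the air exchange rate and concentrations used are hypothetical.

```python
# Illustrative arithmetic (not the paper's exact method): estimate a steady-state
# ultrafine particle emission rate from chamber measurements using a well-mixed
# mass-balance model, S = lambda * V * (C_ss - C_bg). All numbers are hypothetical.
def emission_rate(c_ss, c_bg, volume_m3, air_exchange_per_h):
    """Particles emitted per hour for a well-mixed chamber at steady state."""
    return air_exchange_per_h * volume_m3 * (c_ss - c_bg) * 1e6   # #/cm^3 -> #/m^3

V = 0.5              # chamber volume (m^3), as used in the study
ach = 1.0            # assumed air exchanges per hour
c_background = 1e3   # background concentration (#/cm^3), hypothetical
c_steady = 2e5       # steady-state concentration while printing (#/cm^3), hypothetical
print(f"{emission_rate(c_steady, c_background, V, ach):.3e} particles/hour")
```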

  6. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    Science.gov (United States)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational) and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages such as the UML and SysML, are also oriented towards supporting different views (i.e. diagram types) each able to portray a different facets of a system's architecture. More recently, so called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  7. iSDS: a self-configurable software-defined storage system for enterprise

    Science.gov (United States)

    Chen, Wen-Shyen Eric; Huang, Chun-Fang; Huang, Ming-Jen

    2018-01-01

    Storage is one of the most important aspects of IT infrastructure for various enterprises. But enterprises are interested in more than just data storage; they are interested in such things as more reliable data protection, higher performance and reduced resource consumption. Traditional enterprise-grade storage satisfies these requirements at high cost, because it is usually designed and constructed with customised field-programmable gate arrays to achieve high-end functionality. However, in this ever-changing environment, enterprises request storage with more flexible deployment and at lower cost. Moreover, the rise of new application fields, such as social media, big data and video streaming services, makes operational tasks for administrators more complex. In this article, a new storage system called intelligent software-defined storage (iSDS), based on software-defined storage, is described. More specifically, this approach advocates using software to replace features provided by traditional customised chips. To alleviate the management burden, it also advocates applying machine learning to automatically configure storage to meet the dynamic requirements of the workloads running on it. This article focuses on the analysis feature of the iSDS cluster by detailing its architecture and design.

  8. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
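
    Conceptually, a degenerate-codon tool expands an IUPAC ambiguity codon into its concrete codons and evaluates which amino acids they encode. The sketch below is a minimal illustration of that idea (not ANT's code), using Biopython for translation; the example codon NNK is a common library-design choice and is only illustrative.

```python
# Expand an IUPAC degenerate codon into its concrete codons and translate each
# one with Biopython's standard table (requires biopython).
from itertools import product
from Bio.Seq import Seq

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def expand(degenerate_codon):
    """Yield every concrete codon covered by a degenerate codon."""
    for bases in product(*(IUPAC[b] for b in degenerate_codon.upper())):
        yield "".join(bases)

codons = list(expand("NNK"))
amino_acids = sorted({str(Seq(c).translate()) for c in codons})
print(len(codons), "codons ->", amino_acids)   # 32 codons covering all 20 amino acids plus a stop
```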

  9. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, the quality of software used in the nuclear safety field has been ensured through development, validation, safety analysis, and quality assurance activities across the entire life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance, are also required to improve the quality of the software; however, such activities alone cannot guarantee that the quality is improved sufficiently. Therefore, efforts continue to calculate software reliability quantitatively rather than evaluate it only qualitatively. In this paper, we propose a quantitative calculation method for the software used for a specific operation of a digital controller in an NPP. By injecting random faults into the internal space of a developed controller and calculating the ability of the diagnostic software to detect the injected faults, we can evaluate the software reliability of a digital controller in an NPP. We calculated the software reliability of the controller using a new method that differs from the traditional one: it computes the fault detection coverage after injecting faults into the software memory space, rather than assessing activities across the life cycle process. Our approach is distinguished by a new definition of the fault, imitation of software faults using the hardware, and the assignment of considerations and weights to the injected faults.
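
    The quantity at the heart of this method, fault detection coverage, is the (possibly weighted) fraction of injected faults that the diagnostic software detects. The sketch below shows that calculation with hypothetical campaign data; the paper's specific weighting scheme is not reproduced.

```python
# Minimal sketch: fault-detection coverage from a fault-injection campaign,
# optionally weighting each injected fault. Data and weights are hypothetical.
def detection_coverage(results, weights=None):
    """results: list of booleans (True = injected fault was detected)."""
    if weights is None:
        weights = [1.0] * len(results)
    detected = sum(w for r, w in zip(results, weights) if r)
    return detected / sum(weights)

campaign = [True, True, False, True, True, False, True, True]   # 8 injected faults
print(f"coverage = {detection_coverage(campaign):.2f}")         # 0.75
```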

  10. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software belonging to the ToMoCoMD-CARDD suite for the calculation of three-dimensional molecular descriptors (MDs) based on the two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemical Development Kit library for the manipulation of the chemical structures and the calculation of the atomic properties. This software is composed of a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. This program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. The studies of complexity of the main algorithms demonstrate that these were efficiently implemented with respect to their trivial implementation. Lastly, the performance tests reveal that this software behaves suitably as the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of the molecular indices based on N-linear algebraic maps and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
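
    The bilinear, three-linear and four-linear forms mentioned above can be written compactly as tensor contractions. The sketch below evaluates them with numpy's einsum over random placeholder matrices and atomic property vectors; it illustrates only the algebra, not QuBiLS-MIDAS's actual matrix transformations or metrics.

```python
# Bilinear, trilinear and four-linear forms over atomic property vectors,
# evaluated with numpy.einsum. All matrices/tensors and vectors are random
# placeholders, not actual QuBiLS-MIDAS quantities.
import numpy as np

rng = np.random.default_rng(0)
n = 5                                   # number of atoms (toy example)
x = rng.random(n); y = rng.random(n); z = rng.random(n); w = rng.random(n)

M2 = rng.random((n, n))                 # relation between pairs of atoms
M3 = rng.random((n, n, n))              # relation between triples of atoms
M4 = rng.random((n, n, n, n))           # relation between quadruples of atoms

bilinear   = np.einsum("i,ij,j->", x, M2, y)            # x^T M y
trilinear  = np.einsum("i,j,k,ijk->", x, y, z, M3)
fourlinear = np.einsum("i,j,k,l,ijkl->", x, y, z, w, M4)
print(bilinear, trilinear, fourlinear)
```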

  11. BDE-209 in the Australian Environment: Desktop review

    Energy Technology Data Exchange (ETDEWEB)

    English, Karin, E-mail: k.english@uq.edu.au [School of Medicine, The University of Queensland, Brisbane (Australia); Children’s Health and Environment Program, Child Health Research Centre, The University of Queensland, Brisbane (Australia); Queensland Children’s Medical Research Institute, Children’s Health Research Centre, Brisbane (Australia); Toms, Leisa-Maree L. [School of Public Health and Social Work, and Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane (Australia); Gallen, Christie; Mueller, Jochen F. [The University of Queensland, National Research Centre for Environmental Toxicology (Entox), Brisbane (Australia)

    2016-12-15

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  12. BDE-209 in the Australian Environment: Desktop review

    International Nuclear Information System (INIS)

    English, Karin; Toms, Leisa-Maree L.; Gallen, Christie; Mueller, Jochen F.

    2016-01-01

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  13. Software Authority Transition through Multiple Distributors

    Directory of Open Access Journals (Sweden)

    Kyusunk Han

    2014-01-01

    Full Text Available The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times.

  14. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  15. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific work-stations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As the personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expands considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines

  16. Assessment of two different software solutions for the evaluation of CT colonography

    International Nuclear Information System (INIS)

    Andersen, K.; Blondin, D.; Moedder, U.; Cohnen, M.; Beck, A.; Aurich, V.; Vogt, C.

    2005-01-01

    Purpose: To compare a commercial CT colonography software ('Colonography', Siemens, Forchheim) with a non-commercial post-processing system ('Colotux', Department of Informatics). Material and methods: Identical data sets of 10 patients, who underwent an ultra-low-dose multi-detector CT colonography (ULD-MDCTC) (4 x 1 mm collimation, 8 mm pitch, 120 kVp, 10 mAs), were analyzed retrospectively. Assessment was performed using both software solutions by two resident radiologists, who did not have any prior experience with any colonography software tool and who did not know the clinical symptoms of the patients or the results of the conventional colonoscopy. Both systems were analyzed using several subjective quality criteria including workflow, handling, image quality, endoluminal navigation and analysis of lesions, with grading on a 5-point scale. Results concerning polyps were compared between the two systems as well as with conventional colonoscopy. Results: Both colonography systems detected the same number of polyps. Although both showed some advantages for single criteria, no relevant difference was noted in general for subjective assessment. The time for calculation of three-dimensional interactive volumes was three times longer for 'Colotux' compared to 'Colonography'. Linux-based 'Colotux' showed a trend towards better subjective image quality and easier measurement of polyp size. An intuitive desktop and 'Syngo'-workflow integration were advantages of 'Colonography'. Conclusion: The analysis of CT colonographies (4-detector-row CT scanner, ultra-low-dose technique, supine position) can adequately be achieved by both software solutions. There was no significant subjective or objective difference in quality between a 'stand-alone' individual system and a commercial workflow-integrated solution. A relevant factor in deciding between the two systems may be the difference in time needed for the 3D volume calculation, especially in institutes with a high frequency

  17. Proposal for the award of blanket contracts for the supply of Intel-based desktop PCs, display monitors and portable PCs

    CERN Document Server

    2000-01-01

    This document concerns the award of blanket contracts for the supply of the three following categories of equipment for the period 2001-2004: a) desktop PCs (complete PC systems but without display monitors), b) display monitors (conventional CRTs or flat screen LCDs) and c) portable PCs (also called notebooks or laptops). Following a market survey carried out among 41 firms in fourteen Member States, an invitation to tender (IT-2692/IT) was sent on 19 May 2000 to 12 firms and three consortia, each consisting of two firms, in five Member States. By the closing date, CERN had received seven tenders, all from the Swiss subsidiaries of the firms and consortia. The Finance Committee is invited to agree to the negotiation of - blanket contracts with VOBIS (CH), ELONEX (CH) and FUJITSU-SIEMENS (CH), the three lowest bidders complying with the specification, for the supply of Desktop PCs; - blanket contracts with VOBIS (CH), SYNOPTIC (CH) and ELONEX (CH), the three lowest bidders offering display monitors manufactur...

  18. Real-time SHVC software decoding with multi-threaded parallel processing

    Science.gov (United States)

    Gudumasu, Srinivas; He, Yuwen; Ye, Yan; He, Yong; Ryu, Eun-Seok; Dong, Jie; Xiu, Xiaoyu

    2014-09-01

    This paper proposes a parallel decoding framework for scalable HEVC (SHVC). Various optimization technologies are implemented on the basis of SHVC reference software SHM-2.0 to achieve real-time decoding speed for the two layer spatial scalability configuration. SHVC decoder complexity is analyzed with profiling information. The decoding process at each layer and the up-sampling process are designed in parallel and scheduled by a high level application task manager. Within each layer, multi-threaded decoding is applied to accelerate the layer decoding speed. Entropy decoding, reconstruction, and in-loop processing are pipeline designed with multiple threads based on groups of coding tree units (CTU). A group of CTUs is treated as a processing unit in each pipeline stage to achieve a better trade-off between parallelism and synchronization. Motion compensation, inverse quantization, and inverse transform modules are further optimized with SSE4 SIMD instructions. Simulations on a desktop with an Intel i7 processor 2600 running at 3.4 GHz show that the parallel SHVC software decoder is able to decode 1080p spatial 2x at up to 60 fps (frames per second) and 1080p spatial 1.5x at up to 50 fps for those bitstreams generated with SHVC common test conditions in the JCT-VC standardization group. The decoding performance at various bitrates with different optimization technologies and different numbers of threads are compared in terms of decoding speed and resource usage, including processor and memory.
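
    The layer-parallel, CTU-group pipeline described above can be illustrated with a minimal Python sketch of a two-stage producer/consumer pipeline. The stage names, the group size and the placeholder work are assumptions for illustration only and are not taken from SHM-2.0 or the paper.

        import queue
        import threading

        CTU_GROUP_SIZE = 8  # assumed number of CTUs handed to a pipeline stage at once

        def entropy_decode(ctu_groups, out_q):
            """Stage 1: entropy-decode each group of CTUs and pass it downstream."""
            for group in ctu_groups:
                decoded = [f"syntax({ctu})" for ctu in group]  # placeholder work
                out_q.put(decoded)
            out_q.put(None)  # end-of-stream marker

        def reconstruct(in_q, results):
            """Stage 2: reconstruction and in-loop filtering per CTU group."""
            while True:
                group = in_q.get()
                if group is None:
                    break
                results.append([f"pixels({s})" for s in group])  # placeholder work

        ctus = [list(range(i, i + CTU_GROUP_SIZE)) for i in range(0, 64, CTU_GROUP_SIZE)]
        q, results = queue.Queue(maxsize=4), []
        stages = [threading.Thread(target=entropy_decode, args=(ctus, q)),
                  threading.Thread(target=reconstruct, args=(q, results))]
        for t in stages:
            t.start()
        for t in stages:
            t.join()
        print(len(results), "CTU groups reconstructed")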

  19. Use of Diabetes Data Management Software Reports by Health ...

    African Journals Online (AJOL)

    Use of Diabetes Data Management Software Reports by Health Care Providers, Patients With Diabetes, and Caregivers Improves Accuracy and Efficiency of Data Analysis and Interpretation Compared With Traditional Logbook Data: First Results of the Accu-Chek C.

  20. Exploring Coordination Structures in Open Source Software Development

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Harmsen, Frank; Hegeman, J.H.; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    Coordination is difficult to achieve in a large globally distributed project setting. The problem is multiplied in open source software development projects, where most of the traditional means of coordination such as plans, system-level designs, schedules and defined process are not used. In order

  1. Towards Multi-Method Research Approach in Empirical Software Engineering

    Science.gov (United States)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

    This paper presents the results of a literature analysis on Empirical Research Approaches in Software Engineering (SE). The analysis explores the reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting a research approach. In view of the reasons for the weak utilization of traditional methods, we propose stronger use of a Multi-Method approach with Pragmatism as the philosophical standpoint.

  2. Computing on the Desktop: From Batch to Online in Two Large Danish Service Bureaus

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    The advent of the personal computer is often hailed as the major step towards empowering the computer user. This step was indeed significant, but it was preceded by a similar step some 10-15 years earlier: the advent of the video terminal or "glass-TTY". The video terminal invaded the desktop of many white collar workers and the workplace of many blue collar workers in the 1970s and 1980s. It replaced batch processing and facilitated direct, interactive access to computing services. This had a considerable impact on working conditions. This paper addresses this transition in two large Danish...

  3. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service providing method. A different service ID is used to identify each service a device can offer. Finally, we experimentally verify that the proposed service providing method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  4. Introduction to adoption of lean canvas in software test architecture design

    Directory of Open Access Journals (Sweden)

    Padmaraj Nidagundi

    2017-01-01

    Full Text Available The growth of software-dependent businesses, as well as the use of electronic devices in daily life, brings new challenges: the software is required to work error-free all the time, and to achieve this goal it needs to be sufficiently and effectively tested during the various development phases. Most software development companies make great efforts in testing; even so, it is difficult to reach the error-free software goal. Different software development methodologies (e.g. traditional waterfall, agile) brought in a new dimension for both development and testing, introducing new technologies and tools. In software test automation, the test architecture design plays a key role in managing written test cases and effectively executing them. Having a more effective software test automation architecture design in the test process saves resources and effort and reduces technical debt. This paper provides a new dimension and possibilities for using lean canvas in the design of the software test architecture.

  5. Effects of Desktop Virtual Reality Environment Training on State Anxiety and Vocational Identity Scores among Persons with Disabilities during Job Placement

    Science.gov (United States)

    Washington, Andre Lamont

    2013-01-01

    This study examined how desktop virtual reality environment training (DVRET) affected state anxiety and vocational identity of vocational rehabilitation services consumers during job placement/job readiness activities. It utilized a quantitative research model with a quasi-experimental pretest-posttest design plus some qualitative descriptive…

  6. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks.

    Science.gov (United States)

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-09-15

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from the mains supply. Numerous energy saving schemes for WSNs, and even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay.

  7. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yunkai Wei

    2017-09-01

    Full Text Available Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from the mains supply. Numerous energy saving schemes for WSNs, and even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay.

  8. A Desktop Screen Sharing System based on Various Connection Methods

    Science.gov (United States)

    Negishi, Yuya; Kawaguchi, Nobuo

    Recently it became very common to use information devices such as PCs during presentations and discussions. In these situations, a need arises for techniques that allow a smooth switch of presenters without changing cables, or an easy screen sharing in case of remote videoconferences. In this paper, we propose a desktop screen sharing system that can be used for such purposes and situations. For that, we designed an automatic control of connections in the VNC system that can be operated remotely over the network. We also suggested an interface that assigns a role such as “Screen sender" or “Screen receiver" to each terminal. In the proposed system, while sharing a screen between multiple terminals, one can easily display and browse the screen without having to understand how the others are connected. We also implemented a “role card" using contactless IC card, where roles are assigned only by placing the card in the IC reader.

  9. Mining dynamic noteworthy functions in software execution sequences.

    Science.gov (United States)

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static structure analysis, do not consider the actual execution of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of a series of function addresses were acquired. Then these traces were modelled as execution sequences and simplified to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. Comparison and contrast were conducted against two traditional complex network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
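
    The abstract does not give the PE or DNFM algorithms themselves. As a rough illustration of ranking functions by how prominently they appear in execution sequences, the Python sketch below scores each function by its call frequency and by how many distinct traces it occurs in, as simple stand-ins for the inner-importance and inter-importance indicators; the weighting is an assumption, not the authors' formula.

        from collections import Counter

        # Simplified execution traces: each is an ordered list of function names.
        traces = [
            ["main", "init", "parse", "compute", "log"],
            ["main", "init", "compute", "compute", "log"],
            ["main", "parse", "compute", "save"],
        ]

        call_freq = Counter(f for trace in traces for f in trace)        # inner-importance stand-in
        trace_freq = Counter(f for trace in traces for f in set(trace))  # inter-importance stand-in

        def noteworthiness(func):
            """Combine within-trace frequency and across-trace spread (illustrative weighting)."""
            return call_freq[func] * trace_freq[func] / len(traces)

        for func in sorted(call_freq, key=noteworthiness, reverse=True):
            print(f"{func:8s} score = {noteworthiness(func):.2f}")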

  10. libLocation: access to location devices for gvSIG Desktop and Mobile

    OpenAIRE

    Jordán Aldasorro, Juan G.; Planells Jiménez, Manuel

    2009-01-01

    Initially integrated in the gvSIG Mobile pilot, the libLocation library aims to provide the gvSIG Desktop and gvSIG Mobile projects with transparent access to location sources. The library is based on the JSR-179 (Location API for J2ME) and JSR-293 (Location API for J2ME v2.0) specifications, providing a uniform interface to different location sources through high-level functions. Likewise, the functionality is extended...

  11. Introduction to Lean Canvas Transformation Models and Metrics in Software Testing

    Directory of Open Access Journals (Sweden)

    Nidagundi Padmaraj

    2016-05-01

    Full Text Available Software plays a key role nowadays in all fields, from simple up to cutting-edge technologies, and most technology devices now run on software. Software development verification and validation have become very important for producing high quality software according to business stakeholder requirements. Different software development methodologies have given a new dimension to software testing. In traditional waterfall software development, testing is approached at the end: it begins with resource planning, a test plan is designed, and test criteria are defined for acceptance testing. In this process most of the test plan is well documented, which leads to time-consuming processes. For modern software development methodologies such as agile, where long test processes and documentation are not followed strictly due to the small iterations of development and testing, lean canvas transformation models can be a solution. This paper provides a new dimension to find out the possibilities of adopting lean transformation models and metrics in the software test plan, to simplify the test process for further use of these test metrics on canvas.

  12. FreeContact: fast and free software for protein contact prediction from residue co-evolution.

    Science.gov (United States)

    Kaján, László; Hopf, Thomas A; Kalaš, Matúš; Marks, Debora S; Rost, Burkhard

    2014-03-26

    20 years of improved technology and growing sequences now renders residue-residue contact constraints in large protein families through correlated mutations accurate enough to drive de novo predictions of protein three-dimensional structure. The method EVfold broke new ground using mean-field Direct Coupling Analysis (EVfold-mfDCA); the method PSICOV applied a related concept by estimating a sparse inverse covariance matrix. Both methods (EVfold-mfDCA and PSICOV) are publicly available, but both require too much CPU time for interactive applications. On top, EVfold-mfDCA depends on proprietary software. Here, we present FreeContact, a fast, open source implementation of EVfold-mfDCA and PSICOV. On a test set of 140 proteins, FreeContact was almost eight times faster than PSICOV without decreasing prediction performance. The EVfold-mfDCA implementation of FreeContact was over 220 times faster than PSICOV with negligible performance decrease. EVfold-mfDCA was unavailable for testing due to its dependency on proprietary software. FreeContact is implemented as the free C++ library "libfreecontact", complete with command line tool "freecontact", as well as Perl and Python modules. All components are available as Debian packages. FreeContact supports the BioXSD format for interoperability. FreeContact provides the opportunity to compute reliable contact predictions in any environment (desktop or cloud).

  13. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  14. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  15. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.; Schneider, J.; Hansen, A.; Lee, M.; Turney, S. G.; Faulkner-Jones, B. E.; Hecht, J. L.; Najarian, R.; Yee, E.; Lichtman, J. W.; Pfister, H.

    2013-01-01

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  16. WebStruct and VisualStruct: web interfaces and visualization for Structure software implemented in a cluster environment

    Directory of Open Access Journals (Sweden)

    Jayashree B.

    2008-03-01

    Full Text Available Structure is a widely used software tool to investigate population genetic structure with multi-locus genotyping data. The software uses an iterative algorithm to group individuals into "K" clusters, representing possibly K genetically distinct subpopulations. The serial implementation of this programme is processor-intensive even with small datasets. We describe an implementation of the program within a parallel framework. Speedup was achieved by running different replicates and values of K on each node of the cluster. A web-based user-oriented GUI has been implemented in PHP, through which the user can specify input parameters for the programme. The number of processors to be used can be specified in the background command. A web-based visualization tool "Visualstruct", written in PHP (HTML and JavaScript embedded), allows for the graphical display of population clusters output from Structure, where each individual may be visualized as a line segment with K colors defining its possible genomic composition with respect to the K genetic sub-populations. The advantage over available programs is in the increased number of individuals that can be visualized. The analyses of real datasets indicate a speedup of up to four, when comparing the speed of execution on clusters of eight processors with the speed of execution on one desktop. The software package is freely available to interested users upon request.
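
    The speedup reported above comes from running independent Structure jobs (different K values and replicates) concurrently. The Python sketch below illustrates that idea on a single multi-core machine; it assumes the structure command-line binary with its -K, -m, -e and -o options is installed and on the PATH, and it does not reproduce the PHP GUI or the cluster scheduler described by the authors.

        import itertools
        import subprocess
        from multiprocessing import Pool

        K_VALUES = range(1, 6)    # candidate numbers of subpopulations
        REPLICATES = range(1, 4)  # independent runs per K

        def run_structure(job):
            """Launch one Structure run (assumes the 'structure' binary is available)."""
            k, rep = job
            out = f"results/structure_K{k}_rep{rep}"
            cmd = ["structure", "-K", str(k), "-m", "mainparams", "-e", "extraparams", "-o", out]
            subprocess.run(cmd, check=True)
            return out

        if __name__ == "__main__":
            jobs = list(itertools.product(K_VALUES, REPLICATES))
            with Pool(processes=8) as pool:  # one worker per available core or node slot
                for outfile in pool.imap_unordered(run_structure, jobs):
                    print("finished", outfile)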

  17. WebStruct and VisualStruct: Web interfaces and visualization for Structure software implemented in a cluster environment.

    Science.gov (United States)

    Jayashree, B; Rajgopal, S; Hoisington, D; Prasanth, V P; Chandra, S

    2008-09-24

    Structure is a widely used software tool to investigate population genetic structure with multi-locus genotyping data. The software uses an iterative algorithm to group individuals into "K" clusters, representing possibly K genetically distinct subpopulations. The serial implementation of this programme is processor-intensive even with small datasets. We describe an implementation of the program within a parallel framework. Speedup was achieved by running different replicates and values of K on each node of the cluster. A web-based user-oriented GUI has been implemented in PHP, through which the user can specify input parameters for the programme. The number of processors to be used can be specified in the background command. A web-based visualization tool "Visualstruct", written in PHP (HTML and JavaScript embedded), allows for the graphical display of population clusters output from Structure, where each individual may be visualized as a line segment with K colors defining its possible genomic composition with respect to the K genetic sub-populations. The advantage over available programs is in the increased number of individuals that can be visualized. The analyses of real datasets indicate a speedup of up to four, when comparing the speed of execution on clusters of eight processors with the speed of execution on one desktop. The software package is freely available to interested users upon request.

  18. Professional Ethics of Software Engineers: An Ethical Framework.

    Science.gov (United States)

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD). EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  19. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    Science.gov (United States)

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  20. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  1. [Analysis on traditional Chinese medicine prescriptions treating cancer based on traditional Chinese medicine inheritance assistance system and discovery of new prescriptions].

    Science.gov (United States)

    Yu, Ming; Cao, Qi-chen; Su, Yu-xi; Sui, Xin; Yang, Hong-jun; Huang, Lu-qi; Wang, Wen-ping

    2015-08-01

    Malignant tumor is one of the main causes of death in the world at present, as well as a major disease seriously harming human health and life and restricting social and economic development. There are many kinds of reports about traditional Chinese medicine patent prescriptions, empirical prescriptions and self-made prescriptions treating cancer, and prescription rules were often analyzed based on medication frequency. Such methods were applicable for discovering dominant experience but could hardly yield innovative discoveries and knowledge. In this paper, based on the traditional Chinese medicine inheritance assistance system, the software-integrated mutual information improvement method, complex system entropy clustering and unsupervised entropy-level clustering data mining methods were adopted to analyze the rules of traditional Chinese medicine prescriptions for cancer. Totally 114 prescriptions were selected, the frequency of herbs in the prescriptions was determined, and 85 core combinations and 13 new prescriptions were identified. The traditional Chinese medicine inheritance assistance system, as a valuable traditional Chinese medicine research-supporting tool, can be used to record, manage, inquire and analyze prescription data.

  2. Integration of Web Technologies in Software Applications. Is Web 2.0 a Solution?

    Directory of Open Access Journals (Sweden)

    Cezar Liviu CERVINSCHI

    2010-12-01

    Full Text Available Starting from the idea that Web 2.0 represents "the era of the dynamic web", the paper proposes to provide arguments (demonstrated by physical results) regarding the question that is at the foundation of this article. Based on the findings we can affirm that Web 2.0 is a solution for building powerful and robust software, since the Internet has become more than just a simple presence on the users' desktop that provides easy access to information, services, entertainment, online transactions, e-commerce, e-learning and so on; basically every kind of human or institutional interaction can happen online. This paper seeks to study the impact of two of these branches upon the user: e-commerce and e-testing. The statistical reports were made on different sets of people, while the conclusions are the result of a detailed research and study of the applications' behaviour in the actual operating environment.

  3. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  4. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi
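
    MultiSpec itself is an interactive GUI tool, so the following Python sketch is only a rough illustration of the supervised-classification step used in exercises like the flood-mapping one: a simple per-class Gaussian classifier is trained on a handful of analyst-labelled pixels of a multispectral array. The band values and training pixels are synthetic rather than Landsat 5 data, and scikit-learn stands in for MultiSpec's own classifiers.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        # Synthetic stand-in for a Landsat-like scene: 100 x 100 pixels, 6 bands.
        rng = np.random.default_rng(0)
        scene = rng.random((100, 100, 6))

        # Training samples chosen by the analyst: (row, col, class); 0 = dry land, 1 = flooded.
        train_pixels = [(10, 12, 0), (40, 80, 0), (55, 20, 1), (70, 71, 1)]
        X = np.array([scene[r, c] for r, c, _ in train_pixels])
        y = np.array([label for _, _, label in train_pixels])

        clf = GaussianNB().fit(X, y)  # per-class Gaussian model of the band values
        labels = clf.predict(scene.reshape(-1, 6)).reshape(100, 100)
        print("pixels classified as flooded:", int((labels == 1).sum()))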

  5. Green Software Engineering Adaption In Requirement Elicitation Process

    Directory of Open Access Journals (Sweden)

    Umma Khatuna Jannat

    2015-08-01

    Full Text Available A recent line of work investigates the role of environmental concerns in software, that is, green software systems. It is now widely accepted that green software can fit all processes of software development, and it is also suitable for the requirements elicitation process. Nowadays the vast majority of software companies use requirements elicitation techniques, because this process plays an increasingly important role in software development. At present most requirements elicitation processes are improved using various techniques and tools. The intention of this research is therefore to adapt green software engineering to existing elicitation techniques and to recommend suitable actions for improvement. This research involved qualitative data: using a small set of keywords, IEEE, ACM, Springer, Elsevier, Google Scholar, Scopus and Wiley were searched for articles published from 2010 to 2016. From the literature review, 15 traditional requirements elicitation factors and 23 improvement techniques for converting to green engineering were identified. Lastly, the paper includes a short review of the literature, a description of the grounded theory, and some of the identified issues related to the need for requirements elicitation improvement techniques.

  6. Qualitative research ethics on the spot: Not only on the desktop.

    Science.gov (United States)

    Øye, Christine; Sørensen, Nelli Øvre; Glasdam, Stinne

    2016-06-01

    The increase in medical ethical regulations and bureaucracy handled by institutional review boards and healthcare institutions puts the researchers using qualitative methods in a challenging position. Based on three different cases from three different research studies, the article explores and discusses research ethical dilemmas. First, and especially, the article addresses the challenges for gatekeepers who influence the informants' decisions to participate in research. Second, the article addresses the challenges in following research ethical guidelines related to informed consent and doing no harm. Third, the article argues for the importance of having research ethical guidelines and review boards to question and discuss the possible ethical dilemmas that occur in qualitative research. Research ethics in qualitative research must be understood as relational, situational, and emerging; that is, attention has to be paid to ethical issues and dilemmas on the spot and not only at the desktop. © The Author(s) 2015.

  7. A taxonomy for software-defined networking, man-in-the-middle attacks

    OpenAIRE

    Fischer, Briana D.; Lato, Anita M.

    2016-01-01

    Approved for public release; distribution is unlimited In contrast to traditional networks, Software Defined Networking (SDN) allows the programming of network functions via an Application Programming Interface (API). The ability to implement the APIs in software is advantageous for traffic manipulation in SDN. With automated logic being programmed into a centralized component of the SDN, network operators are presented with new and scalable methods for traffic manipulation. Enterprises an...

  8. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  9. Development, Implementation, and Analysis of Desktop-Scale Model Industrial Equipment and a Critical Thinking Rubric for Use in Chemical Engineering Education

    Science.gov (United States)

    Golter, Paul B.

    2011-01-01

    In order to address some of the challenges facing engineering education, namely the demand that students be better prepared to practice professional as well as technical skills, we have developed an intervention consisting of equipment, assessments and a novel pedagogy. The equipment consists of desktop-scale replicas of common industrial…

  10. Attack Potential Evaluation in Desktop and Smartphone Fingerprint Sensors: Can They Be Attacked by Anyone?

    Directory of Open Access Journals (Sweden)

    Ines Goicoechea-Telleria

    2018-01-01

    Full Text Available The use of biometrics keeps growing. Every day, we use biometric recognition to unlock our phones or to have access to places such as the gym or the office, so we rely on the security manufacturers offer when protecting our privileges and private life. It is well known that it is possible to hack into a fingerprint sensor using fake fingers made of Play-Doh and other easy-to-obtain materials but to what extent? Is this true for all users or only for specialists with a deep knowledge on biometrics? Are smartphone fingerprint sensors as reliable as desktop sensors? To answer these questions, we performed 3 separate evaluations. First, we evaluated 4 desktop fingerprint sensors of different technologies by attacking them with 7 different fake finger materials. All of them were successfully attacked by an experienced attacker. Secondly, we carried out a similar test on 5 smartphones with embedded sensors using the most successful materials, which also hacked the 5 sensors. Lastly, we gathered 15 simulated attackers with no background in biometrics to create fake fingers of several materials, and they had one week to attack the fingerprint sensors of the same 5 smartphones, with the starting point of a short video with the techniques to create them. All 5 smartphones were successfully attacked by an inexperienced attacker. This paper will provide the results achieved, as well as an analysis on the attack potential of every case. All results are given following the metrics of the standard ISO/IEC 30107-3.

  11. Assessing soil erosion risk using RUSLE through a GIS open source desktop and web application.

    Science.gov (United States)

    Duarte, L; Teodoro, A C; Gonçalves, J A; Soares, D; Cunha, M

    2016-06-01

    Soil erosion is a serious environmental problem. An estimation of the expected soil loss by water-caused erosion can be calculated using the Revised Universal Soil Loss Equation (RUSLE). Geographical Information Systems (GIS) provide different tools to create categorical maps of soil erosion risk which help to assess the risk of soil loss. The objective of this study was to develop a GIS open source application (in QGIS), using the RUSLE methodology, for estimating erosion rate at the watershed scale (desktop application) and to provide the same application via web access (web application). The applications developed allow one to generate all the maps necessary to evaluate the soil erosion risk. Several libraries and algorithms from SEXTANTE were used to develop these applications. These applications were tested in the Montalegre municipality (Portugal). The maps involved in the RUSLE method - soil erosivity factor, soil erodibility factor, topographic factor, cover management factor, and support practices - were created. The estimated mean value of the soil loss obtained was 220 ton km(-2) year(-1), ranging from 0.27 to 1283 ton km(-2) year(-1). The results indicated that most of the study area (80 %) is characterized by a very low soil erosion level, while only a small part of the watershed showed soil erosion higher than 962 ton km(-2) year(-1). It was also concluded that areas with high slope values and bare soil are associated with high levels of erosion, and that the higher the P and C values, the higher the soil erosion percentage. The RUSLE web and desktop applications are freely available.
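
    The RUSLE estimate itself is a cell-by-cell product of the factor rasters, A = R x K x LS x C x P. The Python sketch below shows that step with synthetic numpy rasters; in the QGIS applications the factor layers are derived from rainfall, soil, terrain and land-cover data, and the values used here are illustrative only.

        import numpy as np

        # Synthetic factor rasters on a common grid; values are illustrative only.
        shape = (200, 200)
        R = np.full(shape, 700.0)   # rainfall erosivity factor
        K = np.full(shape, 0.03)    # soil erodibility factor
        LS = np.random.default_rng(1).uniform(0.1, 8.0, shape)  # topographic factor
        C = np.full(shape, 0.2)     # cover management factor
        P = np.full(shape, 1.0)     # support practices factor

        A = R * K * LS * C * P      # annual soil loss per cell
        print(f"mean soil loss: {A.mean():.1f}, max: {A.max():.1f}")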

  12. Do small fish mean no voucher? Using a flatbed desktop scanner to document larval and small specimens before destructive analyses

    Czech Academy of Sciences Publication Activity Database

    Kalous, L.; Šlechtová, Věra; Petrtýl, M.; Kohout, Jan; Čech, Martin

    2010-01-01

    Vol. 26, No. 4 (2010), pp. 614-617 ISSN 0175-8659 R&D Projects: GA ČR GA206/06/1371; GA ČR GP206/09/P266 Institutional research plan: CEZ:AV0Z50450515; CEZ:AV0Z60170517 Keywords: small fish * voucher * desktop scanner Subject RIV: GL - Fishing Impact factor: 0.945, year: 2010

  13. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software development personnel to describe the software under test and to write test cases by hand, which is time-consuming, inefficient and prone to gaps. Using the high-reliability MBT tools developed by our company, a single modeling pass can automatically generate test case documents, which is efficient and accurate. A UML model describes the process accurately, and expressing a requirement depends on the paths by which it is reached. Existing path generation algorithms are either too simple, unable to combine branch paths with paths containing loops, or too cumbersome, generating overly complicated arrangements of paths that are meaningless and superfluous for aerospace software testing. Drawing on our experience with aerospace payload software, we developed a tailored path generation algorithm for UML graphic models of aerospace test software.
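
    The abstract does not spell out the algorithm, so the Python sketch below only illustrates the general problem it addresses: enumerating test paths through a UML activity/flow graph while still producing a finite, meaningful path set when the model contains branches and loops (here, by capping how often a node may recur on a path). The graph, cap and traversal strategy are assumptions, not the authors' method.

        from collections import defaultdict

        def enumerate_paths(edges, start, end, max_visits=2):
            """Depth-first path enumeration with bounded loop traversal."""
            graph = defaultdict(list)
            for src, dst in edges:
                graph[src].append(dst)

            paths, stack = [], [(start, [start], {start: 1})]
            while stack:
                node, path, visits = stack.pop()
                if node == end:
                    paths.append(path)
                    continue
                for nxt in graph[node]:
                    if visits.get(nxt, 0) < max_visits:  # allow loops, but keep paths finite
                        new_visits = dict(visits)
                        new_visits[nxt] = new_visits.get(nxt, 0) + 1
                        stack.append((nxt, path + [nxt], new_visits))
            return paths

        # Toy activity graph with a branch and a loop back from C to B.
        edges = [("A", "B"), ("B", "C"), ("C", "B"), ("B", "D"), ("C", "D")]
        for p in enumerate_paths(edges, "A", "D"):
            print(" -> ".join(p))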

  14. [Analysis on medication regularity of modern traditional Chinese medicines in treating melancholia based on data mining technology].

    Science.gov (United States)

    Zhao, Yan-qing; Teng, Jing; Yang, Hong-jun

    2015-05-01

    To analyze the prescription and medication regularities of traditional Chinese medicines in the treatment of melancholia in the Chinese journal full text database (CNKI), the Wanfang Data knowledge service platform, VIP, and the Chinese biomedical literature database (CBM), based on the traditional Chinese medicine inheritance support platform software, in order to provide a reference for further mining traditional Chinese medicines for the treatment of melancholia and for new drug development. The traditional Chinese medicine inheritance support platform software V2.0 was used to establish the prescription database of traditional Chinese medicines for treating melancholia. The software-integrated data mining method was adopted to analyze four Qis, five flavors, meridian distribution, frequency statistics, syndrome distribution, composition regularity and new prescriptions. Totally 358 prescriptions for treating melancholia were analyzed to determine the frequency of prescription drugs and the commonly used drug pairs and combinations, and to develop 22 new prescriptions. According to this study, prescriptions for treating depression collected in modern literature databases mainly have the effects of soothing the liver and resolving melancholia, strengthening the spleen and eliminating phlegm, activating and replenishing blood, regulating liver qi, tonifying spleen qi, clearing and purging heat, soothing the mind, and nourishing yin and tonifying the kidney, with neutral drug property and sweet or bitter flavor, and follow the melancholia treatment principle of "regulating qi and opening the mind, regulating qi and empathy".

  15. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software without a license. Not all of the issues surrounding open source software are true; it is therefore necessary to introduce the concept of open source software, starting from its history, its licenses and how to choose a license, as well as the considerations in choosing among the available open source software. Keywords: License, Open Source, HAKI

  16. Design of MPPT Controller Monitoring Software Based on QT Framework

    Science.gov (United States)

    Meng, X. Z.; Lu, P. G.

    2017-10-01

    The MPPT controller was a hardware device for tracking the maximum power point of solar photovoltaic array. Multiple controllers could be working as networking mode by specific communicating protocol. In this article, based on C++ GUI programming with Qt frame, we designed one sort of desktop application for monitoring and analyzing operational parameter of MPPT controller. The type of communicating protocol for building network was Modbus protocol which using Remote Terminal Unit mode and The desktop application of host computer was connected with all the controllers in the network through RS485 communication or ZigBee wireless communication. Using this application, user could monitor the parameter of controller wherever they were by internet.
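
    As an illustration of the kind of Modbus RTU polling such a host application performs, the Python sketch below reads two registers from a controller over RS485 using the third-party minimalmodbus package. The serial port, slave address, register map and scaling are hypothetical and are not taken from the paper or from any particular MPPT controller.

        import minimalmodbus  # third-party package: pip install minimalmodbus

        # Slave address 1 on an RS485 adapter; port name and baud rate are site-specific.
        mppt = minimalmodbus.Instrument("/dev/ttyUSB0", 1)
        mppt.serial.baudrate = 9600
        mppt.serial.timeout = 1.0

        # Hypothetical register map: 0 = PV voltage (0.1 V units), 1 = PV current (0.01 A units).
        pv_voltage = mppt.read_register(0, number_of_decimals=1, functioncode=3)
        pv_current = mppt.read_register(1, number_of_decimals=2, functioncode=3)
        print(f"PV: {pv_voltage:.1f} V, {pv_current:.2f} A, {pv_voltage * pv_current:.1f} W")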

  17. Using Ajax to Empower Dynamic Searching

    Directory of Open Access Journals (Sweden)

    Judith Wusteman

    2006-06-01

    Full Text Available The use of Ajax, or Asynchronous JavaScript + XML, can result in Web applications that demonstrate the flexibility, responsiveness, and usability traditionally found only in desktop software. To illustrate this, a repository metasearch user interface, OJAX, has been developed. OJAX is simple, unintimidating but powerful. It attempts to minimize upfront user investment and provide immediate dynamic feedback, thus encouraging experimentation and enabling enactive learning. This article introduces the Ajax approach to the development of interactive Web applications and discusses its implications. It then describes the OJAX user interface and illustrates how it can transform the user experience.

  18. A free software for pore-scale modelling: solving Stokes equation for velocity fields and permeability values in 3D pore geometries

    KAUST Repository

    Gerke, Kirill

    2015-04-01

    In this contribution we introduce a novel free software which solves the Stokes equation to obtain velocity fields for low Reynolds-number flows within externally generated 3D pore geometries. Provided with velocity fields, one can calculate permeability for known pressure gradient boundary conditions via Darcy's equation. Finite-difference schemes of 2nd and 4th order of accuracy are used together with an artificial compressibility method to iteratively converge to a steady-state solution of Stokes' equation. This numerical approach is much faster and less computationally demanding than the majority of open-source or commercial software packages employing other algorithms (finite elements/volumes, lattice Boltzmann, etc.). The software consists of two parts: 1) a pre- and post-processing graphical interface, and 2) a solver. The latter is efficiently parallelized to use any number of available cores (the speedup on 16 threads was up to 10-12 depending on hardware). Due to parallelization and memory optimization our software can be used to obtain solutions for 300x300x300 voxel geometries on modern desktop PCs. The software was successfully verified by testing it against lattice Boltzmann simulations and analytical solutions. To illustrate the software's applicability for numerous problems in Earth Sciences, a number of case studies have been developed: 1) identifying the representative elementary volume for permeability determination within a sandstone sample, 2) derivation of permeability/hydraulic conductivity values for rock and soil samples and comparing those with experimentally obtained values, 3) revealing the influence of the amount of fine-textured material such as clay on filtration properties of sandy soil. This work was partially supported by RSF grant 14-17-00658 (pore-scale modelling) and RFBR grants 13-04-00409-a and 13-05-01176-a.
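
    Given the steady-state velocity field produced by such a solver, permeability follows from Darcy's law, k = mu * <u> * L / dP, where <u> is the mean (superficial) velocity along the flow axis. The Python sketch below shows that post-processing step on a synthetic velocity field; the grid size, viscosity, voxel size and pressure drop are illustrative values, not results from the software.

        import numpy as np

        # Synthetic steady-state axial velocity field (m/s); a small grid so the example runs quickly.
        rng = np.random.default_rng(0)
        u = rng.uniform(0.0, 1e-4, (50, 50, 50))

        mu = 1.0e-3                    # dynamic viscosity of water, Pa*s
        voxel = 1.0e-6                 # voxel edge length, m
        L = u.shape[0] * voxel         # sample length along the flow axis, m
        delta_p = 100.0                # imposed pressure drop, Pa

        u_mean = u.mean()              # Darcy (superficial) velocity, m/s
        k = mu * u_mean * L / delta_p  # permeability, m^2
        print(f"permeability = {k:.3e} m^2 ({k / 9.869e-13:.3e} darcy)")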

  19. Two-Year Community: Human Anatomy Software Use in Traditional and Online Anatomy Laboratory Classes: Student-Perceived Learning Benefits

    Science.gov (United States)

    Kuyatt, Brian L.; Baker, Jason D.

    2014-01-01

    This study evaluates the effectiveness of human anatomy software in face-to-face and online anatomy laboratory classes. Cognitive, affective, and psychomotor perceived learning was measured for students using Pearson Education's Practice Anatomy Laboratory 2.0 software. This study determined that student-perceived learning was significantly…

  20. “Live” Formulations of International Association for the properties of Water and Steam (IAPWS)

    Science.gov (United States)

    Ochkov, V. F.; Orlov, K. A.; Gurke, S.

    2017-11-01

    Online publication of IAPWS formulations for calculation of the properties of water and steam is reviewed. The advantages of electronic delivery via the Internet over traditional publication on paper are examined. Online calculation can be used with or without the formulas or equations printed in traditional publications. Online calculations should preferably be free of charge and compatible across multiple platforms (Windows, Android, Linux). Other requirements include the availability of a multilingual interface, traditional math operators and functions, 2D and 3D graphic capabilities, animation, numerical and symbolic math, tools for solving equation systems, local functions, etc. The use of online visualization tools for verification of functions for calculating thermophysical properties of substances is reviewed. Specific examples are provided of tools for the modeling of the properties of chemical substances, including desktop and online calculation software, downloadable online calculations, and calculations that use server technologies such as Mathcad Calculation Server (see the site of National Research University "Moscow Power Engineering Institute") and SMath (see the site of Knovel, an Elsevier company).
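
    One way to reproduce such property calculations on the desktop is through the open-source Python iapws package, which implements the IAPWS-IF97 industrial formulation; the package choice and the state points below are illustrative assumptions, not part of the publication under review.

        from iapws import IAPWS97  # third-party package: pip install iapws

        # Superheated steam at 10 MPa and 500 degC (arbitrary example state point).
        steam = IAPWS97(P=10.0, T=773.15)        # pressure in MPa, temperature in K
        print(f"h   = {steam.h:.1f} kJ/kg")      # specific enthalpy
        print(f"s   = {steam.s:.4f} kJ/(kg*K)")  # specific entropy
        print(f"rho = {steam.rho:.2f} kg/m^3")   # density

        # Saturated vapour at 1 MPa, selected via the quality parameter x.
        sat_vapour = IAPWS97(P=1.0, x=1.0)
        print(f"Tsat(1 MPa) = {sat_vapour.T - 273.15:.2f} degC")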

  1. Development of a Traditional/Computer-aided Graphics Course for Engineering Technology.

    Science.gov (United States)

    Anand, Vera B.

    1985-01-01

    Describes a two-semester-hour freshman course in engineering graphics which uses both traditional and computerized instruction. Includes course description, computer graphics topics, and recommendations. Indicates that combining interactive graphics software with development of simple programs gave students a better foundation for upper-division…

  2. Working Inside The Box: An Example Of Google Desktop Search in a Forensic Examination

    Directory of Open Access Journals (Sweden)

    Timothy James LaTulippe

    2011-12-01

    Full Text Available The amount of information mankind stores, and the technology developed to store it, have increased tremendously over the past few decades. As the total amount of data stored rapidly increases in conjunction with the number of widely available computer-driven devices in use, solutions are being developed to better harness this data. These advancements continually assist investigators and computer forensic examiners. One application which houses copious amounts of fruitful data is the Google Desktop Search program. Coupled with tested and verified techniques, examiners can exploit the power of this application to cater to their investigative needs. A real-world case example of these techniques and its outcome is presented.

  3. Nuclear forensics of a non-traditional sample: Neptunium

    International Nuclear Information System (INIS)

    Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav

    2016-01-01

    Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition, however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, which could pose a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software to evaluate a non-traditional nuclear forensic sample of Np. The results of the morphological analyses were compared with those of another Np sample of known pedigree, as well as with other traditional actinide materials, in order to determine potential processing and point of origin.

  4. Teaching with ArcGIS Pro

    OpenAIRE

    Theller, Larry

    2016-01-01

    For Fall semester 2016 the ABE department moved the course ASM 540 Basic GIS from ArcGIS Desktop 10.2 to ArcGIS Pro 1.3. This software from ESRI has a completely new look and feel (ribbon-based rather than cascading menus) and is a true 64-bit application, capable of multi-threading, and built on Python 3. After ArcGIS Desktop 10.5 is released, the Desktop line ends and future releases will be ArcGIS Pro, so it makes sense to switch sooner rather than later. This talk will discuss some issues and...

  5. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process and is affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychologies. Therefore, the simple assumption that debugging is perfect is inconsistent with the actual software debugging process, wherein a new fault can be introduced when removing a fault. Furthermore, the fault introduction process is nonlinear, and the cumulative number of nonlinearly introduced faults increases over time. Thus, this paper proposes a nonlinear NHPP imperfect software debugging model that reflects the fact that fault introduction is a nonlinear process. The fitting and predictive power of the NHPP-based proposed model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policies comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data. - Highlights: • Fault introduction is a nonlinear changing process during the debugging phase. • The assumption that the process of fault introduction is nonlinear is credible. • Our proposed model can better fit and accurately predict software failure behavior. • Research on the fault-introduction process is significant for software-intensive products.
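
    To make the modelling idea concrete, the following sketch contrasts the mean value function of the classical perfect-debugging Goel-Okumoto NHPP model with a well-known imperfect-debugging variant in which the fault content grows over time (Yamada's exponential fault-introduction form). This stands in for the paper's own nonlinear model, whose exact form is not given in the abstract; the parameter values are illustrative assumptions.

```python
import numpy as np

# NHPP mean value functions: expected cumulative number of faults detected by time t.
a, b, alpha = 100.0, 0.05, 0.01   # initial fault content, detection rate, introduction rate

def m_perfect(t):
    """Goel-Okumoto (perfect debugging): fault content stays fixed at a."""
    return a * (1.0 - np.exp(-b * t))

def m_imperfect(t):
    """Yamada exponential fault introduction: fault content grows as a*exp(alpha*t)."""
    return a * b / (alpha + b) * (np.exp(alpha * t) - np.exp(-b * t))

for t in (10, 50, 100, 200):
    print(f"t={t:4d}  perfect={m_perfect(t):7.1f}  imperfect={m_imperfect(t):7.1f}")
```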

  6. Creating the next generation control system software

    International Nuclear Information System (INIS)

    Schultz, D.E.

    1989-01-01

    A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

  7. Instructional Uses of Web-Based Survey Software

    Directory of Open Access Journals (Sweden)

    Concetta A. DePaolo, Ph.D.

    2006-07-01

    Full Text Available Recent technological advances have led to changes in how instruction is delivered. Such technology can create opportunities to enhance instruction and make instructors more efficient in performing instructional tasks, especially if the technology is easy to use and requires no training. One such technology, web-based survey software, is extremely accessible for anyone with basic computer skills. Web-based survey software can be used for a variety of instructional purposes to streamline instructor tasks, as well as enhance instruction and communication with students. Following a brief overview of the technology, we discuss how Web Forms from nTreePoint can be used to conduct instructional surveys, collect course feedback, conduct peer evaluations of group work, collect completed assignments, schedule meeting times among multiple people, and aid in pedagogical research. We also discuss our experiences with these tasks within traditional on-campus courses and how they were enhanced or expedited by the use of web-based survey software.

  8. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada Source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  9. Comparing the efficiency of performing complex tasks with a tablet and a smartphone

    OpenAIRE

    Botella, Federico; Moreno, Juan P.; Peñalver, Antonio

    2015-01-01

    The number of users who today use a smartphone or a tablet as a tool on a daily basis is increasing. Professionals, in certain environments, are moving towards mobile devices instead of the traditional laptop or desktop computer. From manufacturing to e-health and e-commerce, more and more apps can be used to improve companies' efficiency and efficacy. Thus, there are apps developed for tablet and smartphone platforms as well as for the traditional desktop version. But how can we define user ...

  10. Formal methods in software development: A road less travelled

    Directory of Open Access Journals (Sweden)

    John A van der Poll

    2010-08-01

    Full Text Available An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software, and that the testing and maintenance phases are shortened. Critics, on the other hand, maintain that software formalisms are hard to master, tedious to use and not well suited for the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using them during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are established, and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.
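
    As a minimal illustration of the Floyd-Hoare style of reasoning mentioned above, the sketch below states a precondition, a loop invariant and a postcondition for a small routine and checks them at run time; it is a teaching sketch, not any particular formal-methods tool discussed in the paper.

```python
# Hoare-triple style annotations expressed as runtime assertions.
def isqrt(n: int) -> int:
    """{ n >= 0 }  isqrt  { r*r <= n < (r+1)*(r+1) }"""
    assert n >= 0, "precondition violated"
    r = 0
    while (r + 1) * (r + 1) <= n:
        assert r * r <= n              # loop invariant: r never overshoots
        r += 1
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(isqrt(17))   # -> 4
```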

  11. The role of original equipment manufacturers in software distribution

    Directory of Open Access Journals (Sweden)

    Herţanu, A.

    2012-01-01

    Full Text Available Software distribution channels have a significant impact on the marketing mix, not only for large companies in this domain but also for small ones. The Original Equipment Manufacturer (OEM) distribution channel has a significant impact on the marketing strategy of different companies. While traditional distribution channels are still in use, OEM channels are used more and more to distribute software products and services, not only to the customer segment formed by companies but also to the segment formed by individual users.

  12. Using Modern Methodologies with Maintenance Software

    Science.gov (United States)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, many users, and is interruptible based on mission needs, issues and requirements. In order to use Scrum, the methodology needed to be adapted to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.

  13. BioSunMS: a plug-in-based software for the management of patients information and the analysis of peptide profiles from mass spectrometry

    Directory of Open Access Journals (Sweden)

    Zhang Xuemin

    2009-02-01

    Full Text Available Abstract Background With wide applications of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) and surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS), statistical comparison of serum peptide profiles and management of patients information play an important role in clinical studies, such as early diagnosis, personalized medicine and biomarker discovery. However, currently available software tools mainly focus on data analysis rather than providing a flexible platform for both the management of patients information and mass spectrometry (MS) data analysis. Results Here we present a plug-in-based software, BioSunMS, for both the management of patients information and serum peptide profile-based statistical analysis. By integrating all functions into a user-friendly desktop application, BioSunMS provides a comprehensive solution for clinical researchers without any knowledge of programming, as well as a plug-in architecture platform with the possibility for developers to add or modify functions without needing to recompile the entire application. Conclusion BioSunMS provides a plug-in-based solution for managing, analyzing, and sharing high volumes of MALDI-TOF or SELDI-TOF MS data. The software is freely distributed under the GNU General Public License (GPL) and can be downloaded from http://sourceforge.net/projects/biosunms/.
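
    The plug-in architecture described above can be illustrated with a very small registry: analysis functions register themselves with the host and new ones can be added without touching the core. The names used here (register, run_all, the example analyses) are illustrative and are not BioSunMS's actual API.

```python
from typing import Callable, Dict, List

_REGISTRY: Dict[str, Callable[[List[float]], float]] = {}

def register(name: str):
    """Decorator that adds an analysis function to the host's plug-in registry."""
    def wrap(func: Callable[[List[float]], float]):
        _REGISTRY[name] = func
        return func
    return wrap

@register("mean_intensity")
def mean_intensity(spectrum: List[float]) -> float:
    return sum(spectrum) / len(spectrum)

@register("base_peak")
def base_peak(spectrum: List[float]) -> float:
    return max(spectrum)

def run_all(spectrum: List[float]) -> None:
    """The host iterates over whatever plug-ins happen to be installed."""
    for name, func in _REGISTRY.items():
        print(f"{name}: {func(spectrum):.2f}")

run_all([10.0, 250.0, 40.0, 5.0])
```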

  14. Examining the Use of Usability Results in a Software Development Company

    DEFF Research Database (Denmark)

    Høegh, Rune Thaarup; Stage, Jan

    2004-01-01

    This paper presents the first results of a study on a usability evaluation for a Danish software development company. The use of the results from the usability evaluation is examined through interviews with two developers from the software company. Through an interview with a project leader from the company, it was found that the traditional usability report plays a very small role for the development team. Initial results suggest that textual feedback proves more valuable when accompanied by video and oral feedback.

  15. Web-based Quality Control Tool used to validate CERES products on a cluster of Linux servers

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Heckert, E.; Chen, Y.; Mlynczak, P.; Mitrescu, C.; Doelling, D.

    2014-12-01

    There have been a few popular desktop tools used in the Earth Science community to validate science data. Because of the limitations of desktop hardware, such as disk space and CPUs, those software packages are not able to display large amounts of data from files. This poster describes an in-house developed web-based software system built on a cluster of Linux servers, which allows users to take advantage of several Linux servers working in parallel to generate hundreds of images in a short period of time. The poster will demonstrate: (1) the hardware and software architecture used to provide high throughput of images; (2) the software structure, which can incorporate new products and new requirements quickly; and (3) the user interface, including how users can manipulate the data and control how the images are displayed.
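
    The fan-out pattern the poster describes, many validation plots rendered in parallel, can be sketched on a single machine with a process pool; the plotting routine, file names and random stand-in data below are assumptions (the real tool distributes work across several Linux servers and reads CERES product files), and the sketch assumes matplotlib is installed.

```python
from multiprocessing import Pool
import os

import matplotlib
matplotlib.use("Agg")              # headless rendering, as on a server
import matplotlib.pyplot as plt
import numpy as np

def render_granule(index: int) -> str:
    """Render one quality-control image for one (synthetic) data granule."""
    data = np.random.default_rng(index).normal(size=(64, 64))
    fig, ax = plt.subplots()
    im = ax.imshow(data, cmap="viridis")
    fig.colorbar(im, ax=ax)
    out = f"granule_{index:03d}.png"
    fig.savefig(out, dpi=100)
    plt.close(fig)
    return out

if __name__ == "__main__":
    with Pool(os.cpu_count()) as pool:          # one worker process per core
        files = pool.map(render_granule, range(8))
    print("rendered:", files)
```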

  16. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly used in recent years, especially in the development of mobile apps, but also, consistently over time, in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. Because this generic definition covers a wide range of meanings, for the purposes of this paper we narrow it down and use the following working definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.

  17. Enabling System Evolution through Configuration Management on the Hardware/Software Boundary

    NARCIS (Netherlands)

    Krikhaar, R.L.; Mosterman, W.; Veerman, N.P.; Verhoef, C.

    2009-01-01

    As the use of software and electronics in modern products is omnipresent and continuously increasing, companies in the embedded systems industry face increasing complexity in controlling and enabling the evolution of their IT-intensive products. Traditionally, product configurations and their

  18. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
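
    The state estimation step being accelerated is, at its core, a weighted least-squares (WLS) problem. The sketch below shows that core for a linearised (DC-style) measurement model on a toy three-bus example; the network, measurements and noise levels are assumptions for illustration, not the paper's formulation or test system.

```python
import numpy as np

# Linear(ised) measurement model z = H x + e, solved by weighted least squares:
# x_hat = (H^T W H)^-1 H^T W z, with W the inverse measurement variances.
H = np.array([[ 1.0, -1.0,  0.0],    # flow measurement on line 1-2
              [ 0.0,  1.0, -1.0],    # flow measurement on line 2-3
              [ 1.0,  0.0, -1.0],    # flow measurement on line 1-3
              [ 1.0,  0.0,  0.0]])   # angle measurement at the reference bus
x_true = np.array([0.0, -0.05, -0.12])          # bus voltage angles (rad)
sigma = np.array([0.01, 0.01, 0.01, 0.001])     # measurement standard deviations

rng = np.random.default_rng(0)
z = H @ x_true + rng.normal(0.0, sigma)         # noisy measurements
W = np.diag(1.0 / sigma**2)

gain = H.T @ W @ H                              # gain matrix
x_hat = np.linalg.solve(gain, H.T @ W @ z)      # WLS state estimate
print("estimated angles:", np.round(x_hat, 4))
print("residual norm   :", round(float(np.linalg.norm(z - H @ x_hat)), 5))
```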

  19. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which opinions among hoteliers are divided between those who think that it is just another fashion trend, unnecessary to be taken into consideration, and those who believe that it helps in performing daily operations more easily, leaving space for more interaction with guests both in the virtual and the real world. Usage of cloud technology in hotels is still in its beginning phase, and hoteliers still have to learn more about its advantages and adequate usage for the benefit of overall hotel operations. Using the example of the hotel property management system (PMS) and a comparison between the features of its older desktop version and new web-based programs, this research aims at finding out at which stage the usage of cloud technology in hotels is and how effective it is. For this, qualitative research with semi-structured interviews with hotel managers that use one of these programs was conducted. Reasons for usage and advantages of each version are discussed.

  20. A free software for pore-scale modelling: solving Stokes equation for velocity fields and permeability values in 3D pore geometries

    KAUST Repository

    Gerke, Kirill; Vasilyev, Roman; Khirevich, Siarhei; Karsanina, Marina; Collins, Daniel; Korost, Dmitry; Mallants, Dirk

    2015-01-01

    In this contribution we introduce a novel free software package which solves the Stokes equation to obtain velocity fields for low Reynolds-number flows within externally generated 3D pore geometries. Provided with velocity fields, one can calculate permeability for known pressure-gradient boundary conditions via Darcy's equation. Finite-difference schemes of 2nd and 4th order of accuracy are used together with an artificial compressibility method to iteratively converge to a steady-state solution of Stokes' equation. This numerical approach is much faster and less computationally demanding than the majority of open-source or commercial software packages employing other algorithms (finite elements/volumes, lattice Boltzmann, etc.). The software consists of two parts: 1) a pre- and post-processing graphical interface, and 2) a solver. The latter is efficiently parallelized to use any number of available cores (the speedup on 16 threads was up to 10-12, depending on hardware). Due to parallelization and memory optimization our software can be used to obtain solutions for 300x300x300-voxel geometries on modern desktop PCs. The software was successfully verified by testing it against lattice Boltzmann simulations and analytical solutions. To illustrate the software's applicability to numerous problems in the Earth sciences, a number of case studies have been developed: 1) identifying the representative elementary volume for permeability determination within a sandstone sample, 2) deriving permeability/hydraulic conductivity values for rock and soil samples and comparing them with experimentally obtained values, and 3) revealing the influence of the amount of fine-textured material such as clay on the filtration properties of sandy soil. This work was partially supported by RSF grant 14-17-00658 (pore-scale modelling) and RFBR grants 13-04-00409-a and 13-05-01176-a.
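
    A complementary post-processing step mentioned above is recovering permeability from a converged velocity field through Darcy's law, k = <u> * mu * L / dP, where <u> is the superficial velocity averaged over the whole cross-section (solids included). In the sketch below the grid size, voxel size, fluid properties and the placeholder velocity field are assumptions; in practice the velocity array would come from the Stokes solver.

```python
import numpy as np

mu = 1.0e-3                     # dynamic viscosity of water, Pa*s
voxel = 1.0e-6                  # voxel edge length, m (1 micron)
n = 100                         # 100^3 grid keeps the sketch light (the tool handles 300^3)
L = n * voxel                   # sample length along the flow direction, m
dP = 100.0                      # applied pressure drop across the sample, Pa

rng = np.random.default_rng(1)
pore = rng.random((n, n, n)) < 0.3           # toy 30%-porosity mask (stand-in for a CT image)
u_x = np.zeros((n, n, n), dtype=np.float32)  # x-velocity field (placeholder for solver output)
u_x[pore] = 1.0e-5                           # placeholder pore velocities, m/s

u_superficial = float(u_x.mean())            # average over pores AND solids
k = u_superficial * mu * L / dP              # Darcy's law
print(f"superficial velocity: {u_superficial:.3e} m/s")
print(f"permeability: {k:.3e} m^2  ({k / 9.869e-13:.3f} darcy)")
```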

  1. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Science.gov (United States)

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116
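
    The adaptive beam and null steering the receiver performs can be illustrated with a textbook minimum-variance distortionless-response (MVDR) beamformer. The NumPy sketch below uses an assumed 4-element array geometry, signal powers and directions; it shows the principle only and is not the paper's GPU implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 4, 5000                        # antenna elements, snapshots
d = 0.5                               # element spacing in wavelengths (uniform linear array)

def steering(theta_deg):
    """Array response (steering vector) for a plane wave arriving from angle theta."""
    m = np.arange(M)
    return np.exp(2j * np.pi * d * m * np.sin(np.radians(theta_deg)))

a_sat = steering(10.0)                # desired GPS satellite direction
a_jam = steering(-40.0)               # jammer direction

# Snapshots: a weak desired signal buried in noise plus a strong jammer.
s = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
j = 10.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = np.outer(a_sat, s) + np.outer(a_jam, j) + noise

R = X @ X.conj().T / N                # sample covariance matrix
w = np.linalg.solve(R, a_sat)         # MVDR weights: R^-1 a / (a^H R^-1 a)
w /= a_sat.conj() @ w                 # enforce unit gain toward the satellite

print("gain toward satellite:", round(abs(w.conj() @ a_sat), 3))   # ~1 by construction
print("gain toward jammer   :", round(abs(w.conj() @ a_jam), 5))   # deep null
```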

  2. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Directory of Open Access Journals (Sweden)

    Dennis Akos

    2011-09-01

    Full Text Available Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.

  3. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive, due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  4. Development, implementation, and analysis of desktop-scale model industrial equipment and a critical thinking rubric for use in chemical engineering education

    Science.gov (United States)

    Golter, Paul B.

    In order to address some of the challenges facing engineering education, namely the demand that students be better prepared to practice professional as well as technical skills, we have developed an intervention consisting of equipment, assessments and a novel pedagogy. The equipment consists of desktop-scale replicas of common industrial equipment. These are implemented in the form of modular cartridges that can be interchanged in a base unit containing water, power and instrumentation. These Desktop Learning Modules (DLMs) are effective at providing a hands on experience in most classroom environments without requiring either water or power hook-ups. Furthermore, the DLMs respond quickly enough that multiple experiments by multiple groups can be run in a single one hour class. We refined an existing critical thinking rubric to be more specific to the realm of engineering problem solving. By altering our pedagogy to a project based environment using the critical thinking rubric as a primary grading tool, we are able to observe and measure the critical thinking skills of student groups. This rubric is corroborated with an industrial perspective and measures constructs that are important to the students' future careers.

  5. Novel software system development for finance

    OpenAIRE

    Maad, Soha

    2002-01-01

    This paper addresses the need for novel software system development (SSD) practices in finance. It proposes Empirical Modelling as a novel approach for SSD in finance. This approach aims at finding a suitable framework for studying both the traditional and the emerging computing culture to SSD in finance. First, the paper studies the change in the financial industry and identifies key issues of the application of computer-based technology in finance. These key issues are framed in a wider age...

  6. Combining Traditional and New Literacies in a 21st-Century Writing Workshop

    Science.gov (United States)

    Bogard, Jennifer M.; McMackin, Mary C.

    2012-01-01

    This article describes how third graders combine traditional literacy practices, including writer's notebooks and graphic organizers, with new literacies, such as video editing software, to create digital personal narratives. The authors emphasize the role of planning in the recursive writing process and describe how technology-based audio…

  7. Predicting Software Assurance Using Quality and Reliability Measures

    Science.gov (United States)

    2014-12-01

    [Report front matter: list of figures, including "CISQ Assurance", "Distribution of Faulty and Vulnerable Files in Firefox 2.0", and "Boxplot of V/D%".] ...appeared in notebook and desktop machines using the Mac OS X operating system. The vulnerability is described in the National Vulnerability

  8. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  9. [Physical Activity in the Context of Workplace Health Promotion: A Systematic Review on the Effectiveness of Software-Based in Contrast to Personal-Based Interventions].

    Science.gov (United States)

    Rudolph, Sabrina; Göring, Arne; Padrok, Dennis

    2018-01-03

    Sports and physical activity interventions are attracting considerable attention in the context of workplace health promotion. Due to increasing digitalization, especially software-based interventions that promote physical activity are gaining acceptance in practice. Empirical evidence concerning the efficiency of software-based interventions in the context of workplace health promotion is rather low so far. This paper examines the question in what way software-based interventions are more efficient than personal-based interventions in terms of increasing the level of physical activity. A systematic review according to the specifications of the Cochrane Collaboration was conducted. Inclusion criteria and should-have criteria were defined and by means of the should-have criteria the quality score of the studies was calculated. The software-based and personal-based interventions are presented in 2 tables with the categories author, year, country, sample group, aim of the intervention, methods, outcome and study quality. A total of 25 studies are included in the evaluation (12 personal- and 13 software-based interventions). The quality scores of the studies are heterogeneous and range from 3 to 9 points. 5 personal- and 5 software-based studies achieved an increase of physical activity. Other positive effects on health could be presented in the studies, for example, a reduction in blood pressure or body-mass index. A few studies did not show any improvement in health-related parameters. This paper demonstrates that positive effects can be achieved with both intervention types. Software-based interventions show advantages due to the use of new technologies. Use of desktop or mobile applications facilitate organization, communication and data acquisition with fewer resources needed. A schooled trainer, on the other hand, is able to react to specific and varying needs of the employees. This aspect should be considered as very significant. © Georg Thieme Verlag KG

  10. The application of image processing software: Photoshop in environmental design

    Science.gov (United States)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position, in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvements. Many types of specialized design software for rendering environmental designs and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  11. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Science.gov (United States)

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interface between software components are particularly important due to the large number of complex interactions, which requires the traditional modeling languages to overcome the existing shortcomings in the aspects of temporal information description and software testing input controlling. This paper presents the real-time extended interface automata (RTEIA) which adds clearer and more detailed temporal information description by the application of time words. We also establish the input interface automaton for every input in order to solve the problems of input controlling and interface covering nimbly when applied in the software testing field. Detailed definitions of the RTEIA and the testing cases generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of one real aircraft braking system. PMID:24892080
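
    To make the idea of time-guarded interface transitions concrete, here is a small, self-contained sketch of a timed transition check used to reject test inputs that violate a timing guard; the class and field names are illustrative and do not reproduce the paper's formal RTEIA definitions.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class Transition:
    source: str
    action: str            # input action, e.g. "press_brake"
    target: str
    t_min: float           # time guard: the clock must lie in [t_min, t_max]
    t_max: float
    reset_clock: bool = True

class TimedAutomaton:
    def __init__(self, initial: str, transitions):
        self.state, self.clock = initial, 0.0
        self.table: Dict[Tuple[str, str], Transition] = {
            (t.source, t.action): t for t in transitions}

    def step(self, action: str, elapsed: float) -> bool:
        """Advance the clock, then take `action` if its time guard is satisfied."""
        self.clock += elapsed
        t = self.table.get((self.state, action))
        if t is None or not (t.t_min <= self.clock <= t.t_max):
            return False                       # invalid test input at this time
        self.state = t.target
        if t.reset_clock:
            self.clock = 0.0
        return True

brake = TimedAutomaton("idle", [
    Transition("idle", "press_brake", "braking", 0.0, float("inf")),
    Transition("braking", "wheel_lock", "abs_active", 0.0, 0.2),   # must react within 200 ms
])
print(brake.step("press_brake", 1.0))   # True
print(brake.step("wheel_lock", 0.5))    # False: the 0.2 s guard is already exceeded
```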

  12. Epistemology, software engineering and formal methods

    Science.gov (United States)

    Holloway, C. Michael

    1994-01-01

    One of the most basic questions anyone can ask is, 'How do I know that what I think I know is true?' The study of this question is called epistemology. Traditionally, epistemology has been considered to be of legitimate interest only to philosophers, theologians, and three year old children who respond to every statement by asking, 'Why?' Software engineers need to be interested in the subject, however, because a lack of sufficient understanding of epistemology contributes to many of the current problems in the field.

  13. Nurturing reliable and robust open-source scientific software

    Science.gov (United States)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo
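
    As a small illustration of the automated-testing practice the abstract advocates, here is a hypothetical scientific function accompanied by pytest-style tests; the function, file layout and tolerance are assumptions made for the example.

```python
# geodesy.py ------------------------------------------------------------------
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points given in decimal degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * radius_km * math.asin(math.sqrt(a))

# test_geodesy.py (discovered and run automatically by `pytest`) ---------------
def test_zero_distance():
    assert haversine_km(10.0, 20.0, 10.0, 20.0) == 0.0

def test_pole_to_equator_is_quarter_circumference():
    assert abs(haversine_km(90.0, 0.0, 0.0, 0.0) - math.pi * 6371.0 / 2.0) < 1e-6
```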

  14. A systematic review of applying modern software engineering techniques to developing robotic systems

    Directory of Open Access Journals (Sweden)

    Claudia Pons

    2012-01-01

    Full Text Available Robots have become collaborators in our daily life. While robotic systems become more and more complex, the need to engineer their software development grows as well. The traditional approaches used in developing these software systems are reaching their limits; currently used methodologies and tools fall short of addressing the needs of such complex software development. Separating robotics knowledge from short-cycled implementation technologies is essential to foster reuse and maintenance. This paper presents a systematic literature review (SLR) of the current use of modern software engineering techniques for developing robotic software systems and their actual automation level. The survey was aimed at summarizing existing evidence concerning the application of such technologies to the field of robotic systems, in order to identify gaps in current research, suggest areas for further investigation, and provide a background for positioning new research activities.

  15. Integrating R and Hadoop for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2014-06-01

    Full Text Available Analyzing and working with big data can be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics, because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools successfully and widely used for the storage and processing of big data sets on clusters of commodity hardware is Hadoop. The Hadoop framework contains libraries, a distributed file system (HDFS), a resource-management platform, and an implementation of the MapReduce programming model for large-scale data processing. In this paper we investigate the possibilities of integrating Hadoop with R, which is a popular software environment used for statistical computing and data visualization. We present three ways of integrating them: R with Streaming, Rhipe and RHadoop, and we emphasize the advantages and disadvantages of each solution.
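
    The MapReduce programming model that Hadoop implements (and that RHadoop and similar packages expose to R) can be shown with a toy, single-process word count; the sketch only illustrates the map, shuffle/group and reduce stages and is no substitute for a real cluster job.

```python
from collections import defaultdict
from itertools import chain

records = ["desktop software", "cloud software", "desktop hardware"]

def mapper(line):                        # map: emit (key, value) pairs
    for word in line.split():
        yield word, 1

def reducer(key, values):                # reduce: aggregate all values seen for a key
    return key, sum(values)

groups = defaultdict(list)               # shuffle: group intermediate pairs by key
for key, value in chain.from_iterable(mapper(r) for r in records):
    groups[key].append(value)

print(dict(reducer(k, v) for k, v in groups.items()))
# {'desktop': 2, 'software': 2, 'cloud': 1, 'hardware': 1}
```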

  16. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    Science.gov (United States)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags carry the risk of misplacement or disappearance due to mischief or severe windstorms/thunderstorms, respectively. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS)-based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops and laptops.

  17. Techniques to maximize software reliability in radiation fields

    International Nuclear Information System (INIS)

    Eichhorn, G.; Piercey, R.B.

    1986-01-01

    Microprocessor system failures due to memory corruption by single event upsets (SEUs) and/or latch-up in RAM or ROM memory are common in environments where there is a high radiation flux. Traditional methods to harden microcomputer systems against SEUs and memory latch-up have usually involved expensive, large-scale hardware redundancy. Such systems offer higher reliability, but they tend to be more complex and non-standard. At the Space Astronomy Laboratory the authors have developed general programming techniques for producing software which is resistant to such memory failures. These techniques, which may be applied to standard off-the-shelf hardware as well as custom designs, include an implementation of the Maximally Redundant Software (MRS) model, error detection algorithms, and memory verification and management.
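
    One of the software-only hardening ideas described above can be sketched as redundant storage with a bitwise majority vote on every read, so that a single-event upset in one copy is masked and scrubbed; this is an illustrative Python toy, whereas the cited work targets embedded code on radiation-exposed hardware.

```python
def store(value: int):
    """Keep three independent copies of a critical value."""
    return [value, value, value]

def majority_read(copies) -> int:
    """Return the bitwise 2-of-3 majority and repair any corrupted copy."""
    a, b, c = copies
    voted = (a & b) | (a & c) | (b & c)
    if not (a == b == c):
        copies[:] = [voted, voted, voted]   # scrub the upset copy
    return voted

regs = store(0b10110010)
regs[1] ^= 0b00001000                       # simulate a single-event upset (one bit flips)
print(bin(majority_read(regs)))             # -> 0b10110010, the original value survives
```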

  18. The development and evaluation of a medical imaging training immersive environment

    International Nuclear Information System (INIS)

    Bridge, Pete; Gunn, Therese; Kastanis, Lazaros; Pack, Darren; Rowntree, Pamela; Starkey, Debbie; Mahoney, Gaynor; Berry, Clare; Braithwaite, Vicki; Wilson-Stewart, Kelly

    2014-01-01

    A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment

  19. The development and evaluation of a medical imaging training immersive environment

    Energy Technology Data Exchange (ETDEWEB)

    Bridge, Pete, E-mail: pete.bridge@qut.edu.au; Gunn, Therese [School of Clinical Sciences, Queensland University of Technology, Brisbane (Australia); Kastanis, Lazaros; Pack, Darren [End-to-End Visuals, Brisbane (Australia); Rowntree, Pamela; Starkey, Debbie; Mahoney, Gaynor; Berry, Clare; Braithwaite, Vicki; Wilson-Stewart, Kelly [School of Clinical Sciences, Queensland University of Technology, Brisbane (Australia)

    2014-09-15

    A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment.

  20. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests

  1. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital

    OpenAIRE

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-01-01

    Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physici...

  2. Introduction to co-simulation of software and hardware in embedded processor systems

    Energy Technology Data Exchange (ETDEWEB)

    Dreike, P.L.; McCoy, J.A.

    1996-09-01

    From the dawn of the first use of microprocessors and microcontrollers in embedded systems, the software has been blamed for products being late to market. This is due to software being developed after the hardware is fabricated. During the past few years, the use of Hardware Description (or Design) Languages (HDLs) and digital simulation have advanced to a point where the concurrent development of software and hardware can be contemplated using simulation environments. This offers the potential of 50% or greater reductions in time-to-market for embedded systems. This paper is a tutorial on the technical issues that underlie software-hardware (sw-hw) co-simulation and the current state of the art. We review the traditional sequential hardware-software design paradigm and suggest a paradigm for concurrent design, which is supported by co-simulation of software and hardware. This is followed by sections on HDL modeling and simulation; hardware-assisted approaches to simulation; microprocessor modeling methods; brief descriptions of four commercial products for sw-hw co-simulation; and a description of our own experiments to develop a co-simulation environment.
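
    The concurrent-design idea the tutorial introduces can be sketched as a lock-step co-simulation loop in which the embedded firmware logic runs against a cycle-based model of a hardware peripheral instead of waiting for fabricated silicon; the timer peripheral, register names and scheduling below are illustrative assumptions.

```python
class TimerModel:
    """Cycle-based stand-in for an HDL timer with a compare register and an IRQ flag."""
    def __init__(self, compare: int):
        self.count, self.compare, self.irq = 0, compare, False

    def clock(self):                      # advance the "hardware" by one cycle
        self.count += 1
        if self.count >= self.compare:
            self.irq, self.count = True, 0

class Firmware:
    """The embedded software under development, polling the timer's IRQ flag."""
    def __init__(self, hw: TimerModel):
        self.hw, self.ticks = hw, 0

    def run_one_step(self):
        if self.hw.irq:                   # interrupt service routine
            self.hw.irq = False
            self.ticks += 1

hw = TimerModel(compare=5)
fw = Firmware(hw)
for cycle in range(20):                   # co-simulation scheduler: hardware, then software
    hw.clock()
    fw.run_one_step()
print("firmware observed", fw.ticks, "timer interrupts in 20 cycles")   # -> 4
```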

  3. Advanced Visualization Software System for Nuclear Power Plant Inspection

    International Nuclear Information System (INIS)

    Kukic, I.; Jambresic, D.; Reskovic, S.

    2006-01-01

    Visualization techniques have been widely used in industrial environments to enhance process control. Traditional visualization techniques are based on control panels with switches and lights, and on 2D graphic representations of processes. However, modern visualization systems enable significant new opportunities for creating 3D virtual environments. These opportunities arise from the availability of high-end graphics capabilities in low-cost personal computers. In this paper we describe the implementation of process visualization software developed by INETEC. This software is used to visualize the testing equipment, the components being tested, and the overall power plant inspection process. It improves the security of the process through its real-time visualization and collision detection capabilities, and therefore greatly enhances the inspection process. (author)

  4. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Directory of Open Access Journals (Sweden)

    Shunkun Yang

    2014-01-01

    Full Text Available Testing and verification of the interface between software components are particularly important due to the large number of complex interactions, which requires the traditional modeling languages to overcome the existing shortcomings in the aspects of temporal information description and software testing input controlling. This paper presents the real-time extended interface automata (RTEIA), which adds clearer and more detailed temporal information description by the application of time words. We also establish the input interface automaton for every input in order to solve the problems of input controlling and interface covering nimbly when applied in the software testing field. Detailed definitions of the RTEIA and the testing cases generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of one real aircraft braking system.

  5. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    Science.gov (United States)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed using desktops, laptops or computing clusters to effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 an 'Applied Cyberinfrastructure Concepts' project-based learning course (ISTA 420/520) at the University of Arizona focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course, comprising 25 students with varying levels of computational skills and no prior domain background in the geosciences, collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA Campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over

  6. C:\\Users\\AISA\\Desktop\\SORO S..xps

    African Journals Online (AJOL)

    AISA

    Insecticidal and fertilizing properties of the liquid organic fertilizer «Ergofito Defense». STUDY OF ... results were analyzed using the SAS software version 8.2. At 75 days after ... For example, the effectiveness of neem leaf extracts.

  7. A Generic Software Safety Document Generator

    Science.gov (United States)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  8. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  9. Usability Comparisons of Head-Mounted vs. Stereoscopic Desktop Displays in a Virtual Reality Environment with Pain Patients.

    Science.gov (United States)

    Tong, Xin; Gromala, Diane; Gupta, Dimple; Squire, Pam

    2016-01-01

    Researchers have shown that immersive Virtual Reality (VR) can serve as an unusually powerful pain control technique. However, research assessing the reported symptoms and negative effects of VR systems indicates that it is important to ascertain whether these symptoms arise from the use of particular VR display devices, particularly for users who are deemed "at risk," such as chronic pain patients. Moreover, these patients have specific and often complex needs and requirements, and because basic issues such as 'comfort' may trigger anxiety or panic attacks, it is important to examine basic questions of the feasibility of using VR displays. Therefore, this repeated-measures experiment was conducted with two VR displays: the Oculus Rift head-mounted display (HMD) and Firsthand Technologies' immersive desktop display, DeepStream3D. The characteristics of these immersive displays differ: one is worn, enabling patients to move their heads, while the other is peered into, allowing less head movement. To assess the severity of physical discomforts, 20 chronic pain patients tried both displays while watching a VR pain management demo in clinical settings. Results indicated that participants experienced higher levels of Simulator Sickness using the Oculus Rift HMD. However, results also indicated other differences in patients' preferences between the two VR displays, including physical comfort levels and sense of immersion. Few studies have compared the usability of specific VR devices with chronic pain patients using a therapeutic virtual environment in pain clinics. Thus, the results may help clinicians and researchers choose the most appropriate VR displays for chronic pain patients and guide VR designers in enhancing the usability of VR displays for long-term pain management interventions.

  10. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photo and image analysis; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  11. Using Office Simulation Software in Teaching Computer Literacy Using Three Sets of Teaching/Learning Activities

    Directory of Open Access Journals (Sweden)

    Azad Ali

    2016-05-01

    Full Text Available The most common course delivery model is based on the teacher (knowledge provider) - student (knowledge receiver) relationship. The most visible symptom of this situation is over-reliance on textbook tutorials. This traditional delivery model reduces teacher flexibility, causes lack of interest among students, and often makes classes boring. This is especially visible when teaching Computer Literacy courses. Instead, the authors of this paper suggest a new active model based on MS Office simulation. The proposed model is discussed within the framework of three activities: guided software simulation, instructor-led activities, and self-directed learning activities. The proposed model of active teaching based on software simulation proved more effective than the traditional one.

  12. A Unified Algorithm for Virtual Desktops Placement in Distributed Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiangtao Zhang

    2016-01-01

    Full Text Available Distributed cloud has been widely adopted to support service requests from dispersed regions, especially for large enterprises which request virtual desktops for multiple geodistributed branch companies. The cloud service provider (CSP) aims to deliver satisfactory services at the least cost. The CSP selects proper data centers (DCs) closer to the branch companies so as to shorten the response time to user requests. At the same time, it also strives to cut cost, considering both the DC level and the server level. At the DC level, expensive long-distance inter-DC bandwidth consumption should be reduced and lower electricity prices are sought. Inside each tree-like DC, as few servers as possible are used so as to save equipment cost and power. By nature, there is a noncooperative relation between the DC level and the server level in the selection. To attain these objectives and capture the noncooperative relation, multiobjective bilevel programming is used to formulate the problem. Then a unified genetic algorithm is proposed to solve the problem, realizing the selection of DC and server simultaneously. Extensive simulation shows that the proposed algorithm outperforms the baseline algorithm in both guaranteeing quality of service and saving cost.
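
    The paper's formulation is a multiobjective bilevel program solved with a unified genetic algorithm; the sketch below is a deliberately simplified, single-objective Python stand-in that evolves an assignment of branch offices to data centers under a weighted latency-plus-electricity cost. All data and cost weights are invented for illustration and the server-level selection is omitted.

    # Toy genetic algorithm for assigning branch offices to data centers.
    # Simplified single-objective stand-in for the paper's multiobjective bilevel model; all numbers are made up.
    import random

    LATENCY = [[5, 20, 40],    # latency[branch][dc] in ms
               [30, 8, 25],
               [45, 22, 6]]
    ELEC_PRICE = [1.0, 0.7, 0.9]          # relative electricity price per DC
    W_LAT, W_ELEC = 1.0, 10.0             # cost weights (assumed)

    def cost(assign):
        return sum(W_LAT * LATENCY[b][dc] + W_ELEC * ELEC_PRICE[dc]
                   for b, dc in enumerate(assign))

    def evolve(pop_size=30, generations=200, mut_rate=0.2):
        n_branch, n_dc = len(LATENCY), len(ELEC_PRICE)
        pop = [[random.randrange(n_dc) for _ in range(n_branch)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)
            survivors = pop[:pop_size // 2]            # elitist truncation selection
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n_branch)    # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < mut_rate:         # point mutation
                    child[random.randrange(n_branch)] = random.randrange(n_dc)
                children.append(child)
            pop = survivors + children
        return min(pop, key=cost)

    best = evolve()
    print("assignment:", best, "cost:", cost(best))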

  13. Detection of analyte binding to microarrays using gold nanoparticle labels and a desktop scanner

    DEFF Research Database (Denmark)

    Han, Anpan; Dufva, Martin; Belleville, Erik

    2003-01-01

    ... on gold nanoparticle labeled antibodies visualized by a commercial, office desktop flatbed scanner. Scanning electron microscopy studies showed that the signal from the flatbed scanner was proportional to the surface density of the bound antibody-gold conjugates, and that the flatbed scanner could detect...... six attomoles of antibody-gold conjugates. This detection system was used in a competitive immunoassay to measure the concentration of the pesticide metabolite 2,6-dichlorobenzamide (BAM) in water samples. The results showed that the gold labeled antibodies functioned comparably with a fluorescent...... based immunoassay for detecting BAM in water. A qualitative immunoassay based on gold-labeled antibodies could determine if a water sample contained BAM above or below 60-70 ng L(-1), which is below the maximum allowed BAM concentration for drinking water (100 ng L(-1)) according to European Union...

  14. Co-verification of hardware and software for ARM SoC design

    CERN Document Server

    Andrews, Jason

    2004-01-01

    Hardware/software co-verification is how to make sure that embedded system software works correctly with the hardware, and that the hardware has been properly designed to run the software successfully - before large sums are spent on prototypes or manufacturing. This is the first book to apply this verification technique to the rapidly growing field of embedded systems-on-a-chip (SoC). As traditional embedded system design evolves into single-chip design, embedded engineers must be armed with the necessary information to make educated decisions about which tools and methodology to deploy. SoC verification requires a mix of expertise from the disciplines of microprocessor and computer architecture, logic design and simulation, and C and Assembly language embedded software. Until now, the relevant information on how it all fits together has not been available. Andrews, a recognized expert, provides in-depth information about how co-verification really works, how to be successful using it, and pitfalls to avoid. H...

  15. [Clinical effects of micro-implant and traditional anchorage in orthodontic treatments].

    Science.gov (United States)

    Qian, Yi; Zhou, Hua-Jie; Wu, Jian-Hua

    2017-06-01

    To analyze the value of micro-implant and traditional anchorage in the treatment of malocclusion. From Jan 2015 to Jan 2016, 20 cases with malocclusion were randomly divided into a control group (10) and an experimental group (10). A comparison was conducted between the control group, in which traditional anchorage was used, and the experimental group, in which micro-implant anchorage was adopted. The data were analyzed with the SPSS 17.0 software package. There was a significant difference in U1-NA, L1-NB, U1-APg and U6-PtPNS between the 2 groups (P<0.05). Micro-implant anchorage can improve the overjet relation of the anterior teeth and the effect of orthodontic treatment.

  16. The hardware and software design for digital data acquisition system of γ-camera

    International Nuclear Information System (INIS)

    Zhang Chong; Jin Yongjie

    2006-01-01

    A digital data acquisition system for updating traditional γ-cameras is presented, including hardware and software. The system has many advantages, such as small volume, rich functionality, high image quality, low cost, extensibility, and so on. (authors)

  17. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    Science.gov (United States)

    Fisher, W. I.

    2017-12-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work discusses the underlying technologies used by Cloudstream and outlines how to use Cloudstream to run and access an existing desktop application in the cloud.
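
    Application streaming of a legacy desktop tool typically amounts to running the unmodified program inside a container that also exposes a remote display. The snippet below is a minimal, hypothetical illustration of that idea using the Docker CLI from Python; the image name and published port are placeholders, and this is not Cloudstream's actual configuration.

    # Hypothetical illustration of launching a containerized desktop application for in-browser access.
    # The image name and port are placeholders; see the Cloudstream documentation for the real stack.
    import subprocess

    image = "example/legacy-viz-app:latest"   # assumed image bundling the desktop tool plus a web/VNC front end
    port = 6080                               # assumed port exposed by that in-container front end

    result = subprocess.run(
        ["docker", "run", "--rm", "-d", "-p", f"{port}:{port}", image],
        check=True, capture_output=True, text=True,
    )
    print("container id:", result.stdout.strip())
    print(f"open http://<cloud-host>:{port}/ in a browser to reach the streamed application")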

  18. Hybrid Software and System Development in Practice: Waterfall, Scrum, and Beyond

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2017-01-01

    Software and system development faces numerous challenges of rapidly changing markets. To address such challenges, companies and projects design and adopt specific development approaches by combining well-structured comprehensive methods and flexible agile practices. Yet, the number of methods...... and practices is large, and available studies argue that the actual process composition is carried out in a fairly ad-hoc manner. The present paper reports on a survey on hybrid software development approaches. We study which approaches are used in practice, how different approaches are combined, and what...... contextual factors influence the use and combination of hybrid software development approaches. Our results from 69 study participants show a variety of development approaches used and combined in practice. We show that most combinations follow a pattern in which a traditional process model serves...

  19. The future of commodity computing and many-core versus the interests of HEP software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  20. Upside to downsizing : Acceleware's graphic processor technology propels seismic data processing revolution

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.

    2009-11-15

    Acceleware has developed graphics processing unit (GPU) technology that is transforming the petroleum industry. The benefits of the technology are its small footprint, low wattage, and high speed. The software brings supercomputing speed to the desktop by leveraging the massive parallel processing capacity of the very latest GPU technology. This article discussed the GPU technology and its emergence as a powerful supercomputing tool. Acceleware's partnering with California-based NVIDIA was also outlined. The advantages of the technology were also discussed, including its smaller footprint. Acceleware's hardware takes up a fraction of the space and uses up to 70 per cent less power than a traditional central processing unit. By combining Acceleware's core knowledge in making complex algorithms run in parallel with an in-house team of seismic industry experts, the company provides software solutions for seismic data processors that access the massively parallel processing capabilities of GPUs. 1 fig.

  1. New generation of 3D desktop computer interfaces

    Science.gov (United States)

    Skerjanc, Robert; Pastoor, Siegmund

    1997-05-01

    Today's computer interfaces use 2-D displays showing windows, icons and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can graphically connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object-oriented programming for tasks ranging from, e.g., low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest using off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and connected video-based interaction techniques which allow viewpoint-dependent imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick 3-D objects just by looking at them).

  2. Dynamic modeling and simulation of sheave damper based on AMESim software

    Directory of Open Access Journals (Sweden)

    BI Ke

    2017-10-01

    Full Text Available [Objectives] Considering the shortcomings of the traditional sheave damper in buffer performance and in the peak value of the greatest cable tension, [Methods] this paper presents a sheave damper whose damping varies with piston displacement as a replacement for the traditional sheave damper, and AMESim software is used for the modeling and simulation. [Results] The results show that the new sheave damper can significantly improve the arresting gear performance indicators and has better adaptability to aircraft impact loads. Compared with the traditional sheave damper, the new method can reduce cable tension by 25% and reduce the maximum deceleration of the aircraft by 23%. [Conclusions] As such, the research in this paper can provide a theoretical reference for improving the performance of aircraft arresting gear.
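
    To make the idea of displacement-dependent damping concrete, the toy model below integrates a one-degree-of-freedom arrestment (a mass decelerated by a damper whose coefficient grows with piston displacement) and compares the peak force against a constant-damping baseline. All parameters are invented and the model is far cruder than the AMESim simulation in the paper.

    # Toy 1-DOF arrestment model comparing constant vs displacement-dependent damping.
    # Purely illustrative; parameters are invented and unrelated to the AMESim model in the paper.
    def run(c_of_x, m=20000.0, v0=60.0, dt=0.001, t_end=3.0):
        x, v, peak = 0.0, v0, 0.0
        for _ in range(int(t_end / dt)):
            force = c_of_x(x) * v          # damper force opposing motion
            peak = max(peak, force)
            v -= force / m * dt            # explicit Euler integration
            x += v * dt
            if v <= 0.0:
                break
        return peak, x

    const_peak, _ = run(lambda x: 9000.0)               # traditional: constant damping coefficient
    var_peak, _ = run(lambda x: 4000.0 + 80.0 * x)      # new: damping rises with piston displacement
    print(f"peak force  constant: {const_peak:,.0f}   variable: {var_peak:,.0f}")

    With these made-up numbers the variable damper spreads the braking force over the stroke and lowers the peak, which is the qualitative effect the abstract reports.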

  3. Experience Report: Introducing Kanban Into Automotive Software Project

    Directory of Open Access Journals (Sweden)

    Marek Majchrzak

    2017-03-01

    Full Text Available The boundaries between traditional and agile methods are disappearing. A significant number of software projects require a continuous implementation of tasks without dividing them into sprints or strict project phases. Customers expect more flexibility and responsiveness from software vendors in response to the ever-changing business environment. To achieve better results in this field, Capgemini has begun using the Lean philosophy and Kanban techniques. The following article illustrates examples of different uses of Kanban and the main stakeholders of the process. The article presents the main advantages of transparency and ways to improve customer co-operation as well as stakeholder relationships. The authors try to visualise all of the elements in the context of the project. There is also a discussion of different approaches in two software projects. The article focuses on the main challenges and the evolutionary approach used. An attempt is made to answer the question of how to convince both the team and the customer, and how to optimise ways of achieving great results.

  4. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    Directory of Open Access Journals (Sweden)

    Sang-Kyu Jung

    Full Text Available Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
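
    QuantWorm itself is written in Java; purely to illustrate the kind of image analysis behind a module like WormLifespan, the Python sketch below counts changed regions by differencing two time-lapse frames, thresholding, and labeling connected components. The thresholds and minimum object size are arbitrary assumptions, not QuantWorm's algorithm.

    # Illustrative moving-object counter via frame differencing (not QuantWorm's actual Java implementation).
    import numpy as np
    from scipy import ndimage

    def count_moving_objects(frame_a, frame_b, diff_thresh=25, min_pixels=30):
        """frame_a, frame_b: 2-D grayscale arrays taken a fixed interval apart."""
        diff = np.abs(frame_a.astype(int) - frame_b.astype(int))
        mask = diff > diff_thresh                          # pixels that changed between frames
        labels, n = ndimage.label(mask)                    # connected components of changed pixels
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        return int(np.sum(sizes >= min_pixels))            # ignore tiny specks of noise

    # Synthetic example: a blob shifts position, leaving two changed regions (departure and arrival).
    a = np.zeros((100, 100), dtype=np.uint8); a[20:30, 20:30] = 200
    b = np.zeros((100, 100), dtype=np.uint8); b[25:35, 25:35] = 200
    print(count_moving_objects(a, b))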

  5. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  6. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Science.gov (United States)

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
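
    For readers unfamiliar with minimization, the sketch below shows the core of a Pocock-Simon-style allocation: each new participant is assigned to the arm that minimizes the resulting imbalance across the chosen prognostic factors, with ties broken at random. It is a generic illustration, not OxMaR's actual code, and it omits the weighted, probabilistic assignment a production tool would typically add.

    # Generic minimization sketch (Pocock-Simon flavour); not OxMaR's implementation.
    import random
    from collections import defaultdict

    ARMS = ["control", "experimental"]
    FACTORS = ["sex", "age_band", "site"]

    # counts[arm][factor][level] = number of participants already allocated with that level
    counts = {arm: {f: defaultdict(int) for f in FACTORS} for arm in ARMS}

    def allocate(participant):
        """participant: dict mapping factor name -> level, e.g. {'sex': 'F', 'age_band': '50-65', 'site': 'A'}."""
        imbalance = {}
        for arm in ARMS:
            # Simple imbalance score: how many existing participants in this arm already
            # share the new participant's factor levels.
            imbalance[arm] = sum(counts[arm][f][participant[f]] for f in FACTORS)
        best = min(imbalance.values())
        chosen = random.choice([a for a in ARMS if imbalance[a] == best])  # random tie-break
        for f in FACTORS:
            counts[chosen][f][participant[f]] += 1
        return chosen

    print(allocate({"sex": "F", "age_band": "50-65", "site": "A"}))
    print(allocate({"sex": "F", "age_band": "50-65", "site": "A"}))  # a second similar patient goes to the other arm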

  7. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Directory of Open Access Journals (Sweden)

    Christopher A O'Callaghan

    Full Text Available Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.

  8. The Astringency of the GP Algorithm for Forecasting Software Failure Data Series

    Directory of Open Access Journals (Sweden)

    Yong-qiang Zhang

    2007-05-01

    Full Text Available The forecasting of software failure data series by Genetic Programming (GP) can be realized without any assumptions before modeling. This discovery has transformed traditional statistical modeling methods as well as improved the consistency of model applicability. The individuals' different characteristics during the evolution of generations, which are randomly changeable, are treated as Markov random processes. This paper also proposes that a GP algorithm with an "optimal individuals reserved strategy" is the best solution to this problem, so that the most adaptive individuals will finally be evolved. This allows practical applications in software reliability modeling, analysis and forecasting of failure behaviors. Moreover, it verifies the feasibility and availability of the GP algorithm, which is applied to software failure data series forecasting, on a theoretical basis. The results show that the GP algorithm is the best solution for software failure behaviors in a variety of disciplines.

  9. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to SMART MMIS software, in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied.

  10. Expert system for skin problem consultation in Thai traditional medicine.

    Science.gov (United States)

    Nopparatkiat, Pornchai; na Nagara, Byaporn; Chansa-ngavej, Chuvej

    2014-01-01

    This paper aimed to demonstrate the research and development of a rule-based expert system for skin problem consultation in the areas of acne, melasma, freckles, wrinkles, and uneven skin tone, with recommended treatments from Thai traditional medicine knowledge. The tool selected for developing the expert system is a software program written in the PHP language. A MySQL database is used together with PHP to build the database of the expert system. The system is web-based and can be reached from anywhere with Internet access. The developed expert system gave recommendations on skin problem treatment with Thai herbal recipes and Thai herbal cosmetics, based on 416 rules derived from primary and secondary sources. The system was tested by 50 users consisting of dermatologists, Thai traditional medicine doctors, and general users. The developed system was considered good for learning and consultation. The present work showed how a scattered body of traditional knowledge such as Thai traditional medicine and herbal recipes could be collected, organised and made accessible to users and interested parties. The expert system developed herein should contribute in a meaningful way towards preserving this knowledge and helping promote the use of Thai traditional medicine as a practical alternative medicine for the treatment of illnesses.
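
    As a rough illustration of how a rule-based consultation like this can be structured (independent of the PHP/MySQL implementation described), the Python sketch below matches reported facts against a tiny, made-up rule table and returns the first matching recommendation. The rules and remedies are placeholders, not part of the system's 416-rule knowledge base.

    # Toy rule-based consultation engine; rules and remedies are invented placeholders,
    # not the Thai traditional medicine knowledge base described in the paper.
    RULES = [
        {"if": {"problem": "acne", "skin": "oily"},         "then": "hypothetical herbal recipe A"},
        {"if": {"problem": "melasma"},                      "then": "hypothetical herbal recipe B"},
        {"if": {"problem": "wrinkle", "age_over_40": True}, "then": "hypothetical herbal cosmetic C"},
    ]

    def consult(facts):
        """facts: dict of answers collected from the user, e.g. {'problem': 'acne', 'skin': 'oily'}."""
        for rule in RULES:
            if all(facts.get(k) == v for k, v in rule["if"].items()):
                return rule["then"]
        return "no matching rule; refer to a practitioner"

    print(consult({"problem": "acne", "skin": "oily"}))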

  11. Software Replica of Minimal Living Processes

    Science.gov (United States)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena resulting from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only developing and running software makes it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroads. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  12. Composing simulations using persistent software components

    Energy Technology Data Exchange (ETDEWEB)

    Holland, J.V.; Michelsen, R.E.; Powell, D.R.; Upton, S.C.; Thompson, D.R.

    1999-03-01

    The traditional process for developing large-scale simulations is cumbersome, time consuming, costly, and in some cases, inadequate. The topics of software components and component-based software engineering are being explored by software professionals in academic and industrial settings. A component is a well-delineated, relatively independent, and replaceable part of a software system that performs a specific function. Many researchers have addressed the potential to derive a component-based approach to simulations in general, and a few have focused on military simulations in particular. In a component-based approach, functional or logical blocks of the simulation entities are represented as coherent collections of components satisfying explicitly defined interface requirements. A simulation is a top-level aggregate comprised of a collection of components that interact with each other in the context of a simulated environment. A component may represent a simulation artifact, an agent, or any entity that can generate events affecting itself, other simulated entities, or the state of the system. The component-based approach promotes code reuse, contributes to reducing time spent validating or verifying models, and promises to reduce the cost of development while still delivering tailored simulations specific to analysis questions. The Integrated Virtual Environment for Simulation (IVES) is a composition-centered framework to achieve this potential. IVES is a Java implementation of simulation composition concepts developed at Los Alamos National Laboratory for use in several application domains. In this paper, its use in the military domain is demonstrated via the simulation of dismounted infantry in an urban environment.
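
    The component idea described here can be illustrated in a few lines of code: components expose a uniform interface, a top-level composition wires them together, and events generated by one component are delivered to the others. The sketch below is a generic Python illustration of that pattern, not the Java-based IVES framework itself.

    # Minimal illustration of composing a simulation from interchangeable components (not IVES itself).
    class Component:
        def handle(self, event, bus):
            """React to an event; may post follow-up events to the bus."""
            raise NotImplementedError

    class Mover(Component):
        def __init__(self):
            self.position = 0
        def handle(self, event, bus):
            if event == "tick":
                self.position += 1
                bus.post(("moved", self.position))

    class Logger(Component):
        def handle(self, event, bus):
            if isinstance(event, tuple) and event[0] == "moved":
                print("position is now", event[1])

    class Simulation:
        """Top-level aggregate: a collection of components exchanging events."""
        def __init__(self, components):
            self.components = components
            self.queue = []
        def post(self, event):
            self.queue.append(event)
        def run(self, ticks):
            for _ in range(ticks):
                self.post("tick")
                while self.queue:
                    event = self.queue.pop(0)
                    for c in self.components:
                        c.handle(event, self)

    Simulation([Mover(), Logger()]).run(3)

    Because each component only depends on the event interface, a Mover could be swapped for a more detailed model without touching the Logger or the composition, which is the reuse argument the abstract makes.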

  13. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  14. Surgical pathology report in the era of desktop publishing.

    Science.gov (United States)

    Pillarisetti, S G

    1993-01-01

    Since it is believed that "a picture is worth a thousand words," incorporation of computer-generated line art was used as an adjunct to gross description in surgical pathology reporting in selected cases. The lack of an integrated software program was overcome by using commercially available graphic and word processing software. A library of drawings was developed over the last few years. Most time-consuming is the development of templates and the graphic library. With some effort it is possible to integrate graphics of high quality into surgical pathology reports.

  15. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  16. Software Defined Radio (SDR) and Direct Digital Synthesizer (DDS) for NMR/MRI Instruments at Low-Field

    Science.gov (United States)

    Asfour, Aktham; Raoof, Kosai; Yonnet, Jean-Paul

    2013-01-01

    A proof-of-concept of the use of fully digital radiofrequency (RF) electronics for the design of dedicated Nuclear Magnetic Resonance (NMR) systems at low field (0.1 T) is presented. This digital electronics is based on the use of three key elements: a Direct Digital Synthesizer (DDS) for pulse generation, a Software Defined Radio (SDR) for digital reception of NMR signals, and a Digital Signal Processor (DSP) for system control and for the generation of the gradient signals (pulse programmer). The SDR includes direct analog-to-digital conversion and Digital Down Conversion (digital quadrature demodulation, decimation filtering, processing gain…). The various aspects of the concept and of the realization are addressed in some detail. These include both hardware design and software considerations. One of the underlying ideas is to enable such NMR systems to “enjoy” existing advanced technologies that have been realized in other research areas, especially in the telecommunications domain. Another goal is to make these systems easy to build and replicate so as to help research groups in realizing dedicated NMR desktops for a large palette of new applications. We also would like to give readers an idea of the current trends in this field. The performances of the developed electronics are discussed throughout the paper. First FID (Free Induction Decay) signals are also presented. Some development perspectives of our work in the area of low-field NMR/MRI are finally addressed. PMID:24287540
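
    The digital down conversion chain mentioned here (quadrature demodulation, decimation filtering) is straightforward to express in a few lines: mix the sampled signal with a complex local oscillator, low-pass filter, and decimate. The sketch below is an illustrative numpy version with made-up frequencies and a crude moving-average filter; it is not the firmware of the SDR receiver described in the paper.

    # Illustrative digital down conversion: quadrature mixing, low-pass filtering, decimation.
    # Frequencies and filter are arbitrary; this is not the SDR firmware described in the paper.
    import numpy as np

    fs = 1.0e6            # ADC sampling rate (assumed), Hz
    f_rf = 200.0e3        # signal frequency (assumed), Hz
    decim = 50            # decimation factor

    t = np.arange(20000) / fs
    signal = np.cos(2 * np.pi * f_rf * t + 0.3)              # stand-in for the digitized NMR signal

    lo = np.exp(-2j * np.pi * f_rf * t)                      # numerically controlled oscillator
    baseband = signal * lo                                   # quadrature demodulation (I + jQ)

    taps = np.ones(decim) / decim                            # crude moving-average low-pass filter
    filtered = np.convolve(baseband, taps, mode="same")
    iq = filtered[::decim]                                   # decimation to the output rate

    print("decimated samples:", iq.size, "recovered phase (rad):", float(np.angle(iq[10])))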

  17. Software Defined Radio (SDR and Direct Digital Synthesizer (DDS for NMR/MRI Instruments at Low-Field

    Directory of Open Access Journals (Sweden)

    Aktham Asfour

    2013-11-01

    Full Text Available A proof-of-concept of the use of fully digital radiofrequency (RF) electronics for the design of dedicated Nuclear Magnetic Resonance (NMR) systems at low field (0.1 T) is presented. This digital electronics is based on the use of three key elements: a Direct Digital Synthesizer (DDS) for pulse generation, a Software Defined Radio (SDR) for digital reception of NMR signals, and a Digital Signal Processor (DSP) for system control and for the generation of the gradient signals (pulse programmer). The SDR includes direct analog-to-digital conversion and Digital Down Conversion (digital quadrature demodulation, decimation filtering, processing gain…). The various aspects of the concept and of the realization are addressed in some detail. These include both hardware design and software considerations. One of the underlying ideas is to enable such NMR systems to “enjoy” existing advanced technologies that have been realized in other research areas, especially in the telecommunications domain. Another goal is to make these systems easy to build and replicate so as to help research groups in realizing dedicated NMR desktops for a large palette of new applications. We also would like to give readers an idea of the current trends in this field. The performances of the developed electronics are discussed throughout the paper. First FID (Free Induction Decay) signals are also presented. Some development perspectives of our work in the area of low-field NMR/MRI are finally addressed.

  18. 76 FR 52581 - Automated Data Processing and Information Retrieval System Requirements

    Science.gov (United States)

    2011-08-23

    ... forth in the final rule in 7 CFR part 3015, Subpart V and related Notice published at [48 FR 29114 for... publish a separate action in the Federal Register announcing OMB's approval. Title: Supporting Statement... and desktop/office software, FNS recognizes the potential of COTS software in the Human Services...

  19. A Risk-based, Practice-centered Approach to Project Management for HPCMP CREATE

    Science.gov (United States)

    2015-10-05

    form of videoconferencing. These impediments have been mitigated to some extent by using browser-based Software as a Service (SaaS) access to CREATE...one-time password (OTP), and OpenID. Security is managed within the DREN, as opposed to every desktop. As a “Software as a Service” (SaaS

  20. Vertical Interaction in Open Software Engineering Communities

    Science.gov (United States)

    2009-03-01

    FIRMS: BUSINESS COLLABORATION THROUGH OPEN SOURCE PROJECTS the Application Lifecycle Framework (ALF) and several sub-project streams in the SOA Tools...to ...involved in the community[44]. For example, the GNOME project, a successful desktop environment for Linux and Unix systems, has a website called