WorldWideScience

Sample records for multimodal user interfaces

  1. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

Full Text Available From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions, and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning, …) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  2. Multimodal user interfaces to improve social integration of elderly and mobility impaired.

    Science.gov (United States)

    Dias, Miguel Sales; Pires, Carlos Galinho; Pinto, Fernando Miguel; Teixeira, Vítor Duarte; Freitas, João

    2012-01-01

Technologies for Human-Computer Interaction (HCI) and Communication have evolved tremendously over the past decades. However, citizens such as the mobility impaired or the elderly still face many difficulties interacting with communication services, either due to HCI issues or intrinsic design problems with the services. In this paper we start by presenting the results of two user studies, the first one conducted with a group of mobility impaired users, comprising paraplegic and quadriplegic individuals, and the second one with elderly users. The study participants carried out a set of tasks with a multimodal (speech, touch, gesture, keyboard and mouse) and multi-platform (mobile, desktop) system offering integrated access to communication and entertainment services, such as email, agenda, conferencing, instant messaging and social media, referred to as LHC - Living Home Center. The system was designed to take into account the requirements captured from these users, with the objective of evaluating whether the adoption of multimodal interfaces for audio-visual communication and social media services could improve the interaction with such services. Our study revealed that a multimodal prototype system, offering natural interaction modalities, especially supporting speech and touch, can in fact improve access to the presented services, contributing to the reduction of social isolation of the mobility impaired as well as the elderly, and improving their digital inclusion.

  3. Holographic Raman tweezers controlled by multi-modal natural user interface

    International Nuclear Information System (INIS)

    Tomori, Zoltán; Keša, Peter; Nikorovič, Matej; Valušová, Eva; Antalík, Marián; Kaňka, Jan; Jákl, Petr; Šerý, Mojmír; Bernatová, Silvie; Zemánek, Pavel

    2016-01-01

    Holographic optical tweezers provide a contactless way to trap and manipulate several microobjects independently in space using focused laser beams. Although the methods of fast and efficient generation of optical traps are well developed, their user friendly control still lags behind. Even though several attempts have appeared recently to exploit touch tablets, 2D cameras, or Kinect game consoles, they have not yet reached the level of natural human interface. Here we demonstrate a multi-modal ‘natural user interface’ approach that combines finger and gaze tracking with gesture and speech recognition. This allows us to select objects with an operator’s gaze and voice, to trap the objects and control their positions via tracking of finger movement in space and to run semi-automatic procedures such as acquisition of Raman spectra from preselected objects. This approach takes advantage of the power of human processing of images together with smooth control of human fingertips and downscales these skills to control remotely the motion of microobjects at microscale in a natural way for the human operator. (paper)
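
    A purely illustrative sketch (not the authors' software) of how such a multi-modal control flow can be organised: an event loop that fuses gaze for pointing, speech for confirming, and fingertip tracking for dragging. The gaze, speech, fingers and tweezers objects below are hypothetical stand-ins for the real tracking hardware and trap controller.

```python
# Hypothetical fusion loop for a gaze + speech + finger-tracking interface.
# All four injected objects (gaze, speech, fingers, tweezers) are assumed
# stand-ins for real device drivers; only the fusion logic is illustrated.
import time

def control_loop(gaze, speech, fingers, tweezers, period_s=0.05):
    selected = None
    while True:
        target = tweezers.nearest_object(gaze.point_of_regard())
        command = speech.poll()                      # e.g. "trap", "release", or None
        if command == "trap" and target is not None:
            selected = target                        # gaze chooses, voice confirms
            tweezers.create_trap(selected)
        elif command == "release" and selected is not None:
            tweezers.remove_trap(selected)
            selected = None
        if selected is not None:
            # Fingertip motion in the operator's workspace is scaled down
            # to micrometre-scale trap displacements.
            tweezers.move_trap(selected, fingers.tip_position() * 1e-3)
        time.sleep(period_s)
```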

  4. Metawidgets in the multimodal interface

    Energy Technology Data Exchange (ETDEWEB)

    Blattner, M.M. (Lawrence Livermore National Lab., CA (United States) Anderson (M.D.) Cancer Center, Houston, TX (United States)); Glinert, E.P.; Jorge, J.A.; Ormsby, G.R. (Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Computer Science)

    1991-01-01

We analyze two intertwined and fundamental issues concerning computer-to-human communication in multimodal interfaces: the interplay between sound and graphics, and the role of object persistence. Our observations lead us to introduce metawidgets as abstract entities capable of manifesting themselves to users as image, as sound, or as various combinations and/or sequences of the two media. We show examples of metawidgets in action, and discuss mechanisms for choosing among alternative media for metawidget instantiation. Finally, we describe a couple of experimental microworlds we have implemented to test out some of our ideas. 17 refs., 7 figs.

  5. Multimodal interaction with W3C standards toward natural user interfaces to everything

    CERN Document Server

    2017-01-01

This book presents new standards for multimodal interaction published by the W3C and other standards bodies in straightforward and accessible language, while also illustrating the standards in operation through case studies and chapters on innovative implementations. The book illustrates how, as smart technology becomes ubiquitous, and appears in more and more different shapes and sizes, vendor-specific approaches to multimodal interaction become impractical, motivating the need for standards. This book covers standards for voice, emotion, natural language understanding, dialog, and multimodal architectures. The book describes the standards in a practical manner, making them accessible to developers, students, and researchers. Comprehensive resource that explains the W3C standards for multimodal interaction in a clear and straightforward way; Includes case studies of the use of the standards on a wide variety of devices, including mobile devices, tablets, wearables and robots, in applications such as assisted livi...

  6. Model based estimation for multi-modal user interface component selection

    CSIR Research Space (South Africa)

    Coetzee, L

    2009-12-01

    Full Text Available and literacy level of the user should be taken into account. This paper presents one approach to develop a cost-based model which can be used to derive appropriate mappings for specific user profiles. The model is explained through a number of small examples...
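
    A hedged illustration of what such a cost-based modality selection could look like (this is not the paper's model, which is only summarised above); the candidate modalities, ability names and weights are assumptions made for the example.

```python
# Illustrative cost model: pick the output modality whose requirements best
# match a user profile. All names and numbers here are assumptions.
CANDIDATES = {
    "text":   {"needs_literacy": 1.0, "needs_vision": 1.0, "needs_hearing": 0.0},
    "speech": {"needs_literacy": 0.0, "needs_vision": 0.0, "needs_hearing": 1.0},
    "icons":  {"needs_literacy": 0.2, "needs_vision": 1.0, "needs_hearing": 0.0},
}

def modality_cost(requirements, profile):
    """Cost grows when a modality relies on an ability the user lacks (abilities in [0, 1])."""
    return sum(req * (1.0 - profile.get(ability.replace("needs_", ""), 1.0))
               for ability, req in requirements.items())

def select_modality(profile):
    return min(CANDIDATES, key=lambda m: modality_cost(CANDIDATES[m], profile))

# A low-literacy user with good hearing is steered toward speech output:
# select_modality({"literacy": 0.2, "vision": 0.9, "hearing": 1.0}) -> "speech"
```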

  7. Holographic Raman tweezers controlled by multi-modal natural user interface

    Czech Academy of Sciences Publication Activity Database

    Tomori, Z.; Keša, P.; Nikorovič, M.; Kaňka, Jan; Jákl, Petr; Šerý, Mojmír; Bernatová, Silvie; Valušová, E.; Antalík, M.; Zemánek, Pavel

    2016-01-01

    Roč. 18, č. 1 (2016), 015602:1-9 ISSN 2040-8978 R&D Projects: GA MŠk(CZ) LO1212; GA MŠk(CZ) LD14069 Institutional support: RVO:68081731 Keywords : holographic optical tweezers * Raman microspectroscopy * human-computer interface Subject RIV: BH - Optics, Masers, Lasers Impact factor: 1.741, year: 2016

  8. User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms; Myers, Brad A

    2008-01-01

    User Interfaces have been around as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, some papers on the history of Human-Computer Interaction and User Interfaces have appeared, primarily focusing on the graphical interface e...

  9. Designing Multimodal User-Interfaces for Effective E-Learning in the School Primary Stages Applied on Real Fractions

    Directory of Open Access Journals (Sweden)

    Salaheddin Odeh

    2009-06-01

Full Text Available This contribution focuses on the development and design of e-learning tools for school students in the primary stages, dealing with the mathematics of real fractions, which presents an example of learning material that is difficult for many school students to understand and a real challenge for e-learning designers and multimedia authoring. Firstly, we will highlight several problems facing school students and teachers caused by the traditional learning approach. Then, we are going to discuss some aspects related to e-learning, the major theoretical issues of educational psychology and e-learning with various modalities related to our work, and the classification of the interactive multimedia methodologies adopted in this work. Furthermore, the software-ergonomic and software-architectural features of the developed e-learning tool will be introduced. Finally, the paper will conclude with a brief summary of usability testing carried out to compare the developed e-learning user interface with the traditional learning approach.

  10. User interface support

    Science.gov (United States)

    Lewis, Clayton; Wilde, Nick

    1989-01-01

    Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.

  11. User interface design considerations

    DEFF Research Database (Denmark)

    Andersen, Simon Engedal; Jakobsen, Arne; Rasmussen, Bjarne D.

    1999-01-01

When designing a user interface for a simulation model there are several important issues to consider: Who is the target user group, and which a priori information can be expected? What questions do the users want answers to, and what questions are answered using a specific model? When developing the user interface of EESCoolTools these issues led to a series of simulation tools, each with a specific purpose and a carefully selected set of input and output variables. To allow a wider range of questions to be answered by the same model, the user can change between different sets of input and output variables. This feature requires special attention when designing the user interface, and a special approach for controlling the user selection of input and output variables was developed. To obtain a consistent system description the different input variables are grouped corresponding...

  12. APEX_SCOPE: A graphical user interface for visualization of multi-modal data in inter-disciplinary studies.

    Science.gov (United States)

    Kanbar, Lara J; Shalish, Wissam; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E

    2017-07-01

    In multi-disciplinary studies, different forms of data are often collected for analysis. For example, APEX, a study on the automated prediction of extubation readiness in extremely preterm infants, collects clinical parameters and cardiorespiratory signals. A variety of cardiorespiratory metrics are computed from these signals and used to assign a cardiorespiratory pattern at each time. In such a situation, exploratory analysis requires a visualization tool capable of displaying these different types of acquired and computed signals in an integrated environment. Thus, we developed APEX_SCOPE, a graphical tool for the visualization of multi-modal data comprising cardiorespiratory signals, automated cardiorespiratory metrics, automated respiratory patterns, manually classified respiratory patterns, and manual annotations by clinicians during data acquisition. This MATLAB-based application provides a means for collaborators to view combinations of signals to promote discussion, generate hypotheses and develop features.

  13. Equivalent Representations of Multi-Modal User Interfaces Through Parallel Rendering (Equivalente representaties van multi-modale gebruikersomgevingen via parallele weergave)

    OpenAIRE

    Van Hees, Kris

    2012-01-01

    Even though the Graphical User Interface (GUI) has been in existence since 1974, and available for commercial and home use since 1984, blind users still face many obstacles when using computer systems with a GUI. Over the past few years, our daily life has become more and more infused with devices that feature this type of user interface (UI). This continuing trend increasingly impacts blind users primarily due to the implied visual interaction model. Furthermore, the general availability of...

  14. User interface development

    Science.gov (United States)

    Aggrawal, Bharat

    1994-01-01

    This viewgraph presentation describes the development of user interfaces for OS/2 versions of computer codes for the analysis of seals. Current status, new features, work in progress, and future plans are discussed.

  15. Adaptive user interfaces

    CERN Document Server

    1990-01-01

This book describes techniques for designing and building adaptive user interfaces developed in the large AID project undertaken by the contributors. Key features: describes one of the few large-scale adaptive interface projects in the world; outlines the principles of adaptivity in human-computer interaction.

  16. Natural User Interfaces

    OpenAIRE

    Câmara , António

    2011-01-01

Master's dissertation in Informatics Engineering presented to the Faculdade de Ciências e Tecnologia da Universidade de Coimbra. This project's main subject is Natural User Interfaces. These interfaces' main purpose is to allow the user to interact with computer systems in a more direct and natural way. The popularization of touch and gesture devices in the last few years has allowed them to become increasingly common, and today we are experiencing a transition of interface p...

  17. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  18. Planning and User Interface Affordances

    National Research Council Canada - National Science Library

    St. Amant, Robert

    1999-01-01

    .... We identify a number of similarities between executing plans and interacting with a graphical user interface, and argue that affordances for planning environments apply equally well to user interface environments...

  19. Development of Multimodal Human Interface Technology

    Science.gov (United States)

    Hirose, Michitaka

About 20 years have passed since the word “Virtual Reality” became popular. During these two decades, a novel human interface technology, the so-called “multimodal interface technology”, has taken shape. In this paper, recent progress in real-time CG, BCI and five-senses IT is first briefly reviewed. Since the life cycle of information technology is said to be 20 years or so, novel directions and paradigms of VR technology can be found in conjunction with the aforementioned technologies. At the end of the paper, these futuristic directions, such as ultra-realistic media, are briefly introduced.

  20. Power User Interface

    Science.gov (United States)

    Pfister, Robin; McMahon, Joe

    2006-01-01

Power User Interface 5.0 (PUI) is a system of middleware written for expert users in the Earth-science community. PUI enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has a command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry process service. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  1. Portraying User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2008-01-01

The user interface is coming of age. Papers addressing UI history have appeared in fair amounts in the last 25 years. Most of them address particular aspects such as an innovative interface paradigm or the contribution of a visionary or a research lab. Contrasting this, papers addressing UI history... Next the paper analyses a selected sample of papers on UI history at large. The analysis shows that the current state-of-art is featured by three aspects: firstly internalism, in that the papers address the technologies in their own right with little contextualization; secondly whiggism, in that they largely address prevailing UI technologies; and thirdly history from above, in that they focus on the great deeds of the visionaries. The paper then compares this state-of-art in UI history to the much more mature fields of history of computing and history of technology. Based hereon, some speculations...

  2. Overview of Graphical User Interfaces.

    Science.gov (United States)

    Hulser, Richard P.

    1993-01-01

    Discussion of graphical user interfaces for online public access catalogs (OPACs) covers the history of OPACs; OPAC front-end design, including examples from Indiana University and the University of Illinois; and planning and implementation of a user interface. (10 references) (EA)

  3. Multimodal 2D Brain Computer Interface.

    Science.gov (United States)

    Almajidy, Rand K; Boudria, Yacine; Hofmann, Ulrich G; Besio, Walter; Mankodiya, Kunal

    2015-08-01

In this work we used multimodal, non-invasive brain signal recording systems, namely Near Infrared Spectroscopy (NIRS), disc electrode electroencephalography (EEG) and tripolar concentric ring electrode (TCRE) electroencephalography (tEEG). Seven healthy subjects participated in our experiments to control a 2-D Brain Computer Interface (BCI). Four motor imagery tasks were performed: imagined motion of the left hand, the right hand, both hands and both feet. The signal slope (SS) of the change in oxygenated hemoglobin concentration measured by NIRS was used for feature extraction, as was the power spectral density (PSD) of both EEG and tEEG in the 8-30 Hz frequency band. Linear Discriminant Analysis (LDA) was used to classify different combinations of the aforementioned features. The highest classification accuracy (85.2%) was achieved by using features from all three brain signal recording modules. The improvement in classification accuracy was highly significant (p = 0.0033) when using the multimodal signal features as compared to pure EEG features.
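
    A hedged sketch of the kind of pipeline the abstract describes: band-limited power spectral density for EEG/tEEG, straight-line signal slope for the NIRS oxygenated-haemoglobin traces, and LDA classification. Array shapes, sampling rate and variable names are assumptions for illustration, not the authors' code.

```python
# Illustrative feature extraction + LDA classification for a multimodal BCI.
# Assumes eeg/teeg arrays of shape (n_trials, n_channels, n_samples) sampled
# at fs Hz, nirs_hbo of shape (n_trials, n_channels, n_samples_nirs), labels y.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def band_power_features(x, fs, band=(8.0, 30.0)):
    """Mean PSD in the 8-30 Hz band, one feature per trial and channel."""
    freqs, psd = welch(x, fs=fs, axis=-1, nperseg=min(256, x.shape[-1]))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1).reshape(x.shape[0], -1)

def signal_slope_features(hbo):
    """Slope of a first-order fit to each HbO trace, one feature per channel."""
    t = np.arange(hbo.shape[-1])
    slopes = np.polyfit(t, hbo.reshape(-1, hbo.shape[-1]).T, deg=1)[0]
    return slopes.reshape(hbo.shape[0], -1)

def multimodal_accuracy(eeg, teeg, nirs_hbo, y, fs=256.0):
    features = np.hstack([band_power_features(eeg, fs),
                          band_power_features(teeg, fs),
                          signal_slope_features(nirs_hbo)])
    return cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5).mean()
```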

  4. Multimodal Sensing Interface for Haptic Interaction

    Directory of Open Access Journals (Sweden)

    Carlos Diaz

    2017-01-01

Full Text Available This paper investigates the integration of a multimodal sensing system for exploring the limits of vibrotactile haptic feedback when interacting with 3D representations of real objects. In this study, the spatial locations of the objects are mapped to the work volume of the user using a Kinect sensor. The position of the user's hand is obtained using marker-based visual processing. The depth information is used to build a vibrotactile map on a haptic glove enhanced with vibration motors. The users can perceive the location and dimension of remote objects by moving their hand inside a scanning region. A marker detection camera provides the location and orientation of the user's hand (glove) to map the corresponding tactile message. A preliminary study was conducted to explore how different users can perceive such haptic experiences. Factors such as the total number of objects detected, object separation resolution, and dimension-based and shape-based discrimination were evaluated. The preliminary results showed that the localization and counting of objects can be attained with a high degree of success. The users were able to classify groups of objects of different dimensions based on the perceived haptic feedback.
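
    In simplified form, the depth-to-vibrotactile mapping described above could look like the sketch below; the motor layout, distance thresholds and data formats are assumptions for illustration, not the paper's implementation.

```python
# Illustrative mapping from a depth patch around the tracked hand to per-motor
# vibration intensities on a glove. Shapes, thresholds and units are assumed.
import numpy as np

def vibrotactile_map(depth_mm, hand_xy, motors_shape=(3, 3),
                     window=90, near_mm=400.0, far_mm=1200.0):
    """Return per-motor intensities in [0, 1]; closer surfaces vibrate more strongly."""
    x, y = hand_xy
    patch = depth_mm[max(0, y - window):y + window, max(0, x - window):x + window]
    intensities = np.zeros(motors_shape)
    # One cell of the patch per motor; use the nearest valid depth reading in each cell.
    for i, row in enumerate(np.array_split(patch, motors_shape[0], axis=0)):
        for j, cell in enumerate(np.array_split(row, motors_shape[1], axis=1)):
            valid = cell[cell > 0]          # zero is treated as "no depth reading"
            if valid.size == 0:
                continue
            nearest = float(valid.min())
            # Linear ramp: full vibration at near_mm, none beyond far_mm.
            intensities[i, j] = np.clip((far_mm - nearest) / (far_mm - near_mm), 0.0, 1.0)
    return intensities
```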

  5. User habits and multimodal route planning

    Directory of Open Access Journals (Sweden)

    Géza Katona

    2017-10-01

Full Text Available The results of route planning research are monitored by the logistics and automotive industries. The economic aspects of cost saving are the focus of attention. An optimal route can yield time or fuel savings, and effective driving or an optimal route is a good basis for achieving an economic aim. Moreover, with the spread of new automotive solutions, especially electric cars, optimisation has particular significance given limited battery storage. Additionally, autonomous car development cannot be neglected. As a result, society can expect safer roads, better space usage and effective resource management. Nevertheless, the requirements of users are extremely diverse, which is not negligible. Supporting these aims, this paper investigates the connection between multimodal route planning and user requirements. The examination focuses on a sensitivity analysis and a survey to evaluate the data and support the setting of user-habit effects on the final route.
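
    As an illustrative sketch only (not the paper's method), user habits can be folded into multimodal route planning by weighting edge attributes per user profile and running a shortest-path search over the combined network; the attribute names, weights and graph format below are hypothetical.

```python
# Illustrative user-weighted multimodal routing: the score of a route is a
# weighted sum of edge attributes, with weights expressing one user's habits.
import heapq

def cheapest_route(graph, start, goal, weights):
    """graph: {node: [(next_node, {"time": ..., "cost": ..., "transfer": 0/1}), ...]}
    weights: e.g. {"time": 1.0, "cost": 0.5, "transfer": 5.0}, tuned per user profile."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        score, node, path = heapq.heappop(frontier)
        if node == goal:
            return score, path
        for nxt, attrs in graph.get(node, []):
            step = sum(weights.get(k, 0.0) * v for k, v in attrs.items())
            new_score = score + step
            if new_score < best.get(nxt, float("inf")):
                best[nxt] = new_score
                heapq.heappush(frontier, (new_score, nxt, path + [nxt]))
    return float("inf"), []

# A transfer-averse commuter vs. a time-minimising one (hypothetical network 'net'):
# cheapest_route(net, "home", "work", {"time": 1.0, "transfer": 10.0})
# cheapest_route(net, "home", "work", {"time": 1.0, "cost": 0.2})
```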

  6. Flippable User Interfaces for Internationalization

    OpenAIRE

    Khaddam, Iyad; Vanderdonckt, Jean; 3rd ACM Symposium on Engineering Interactive Computing Systems EICS’2011

    2011-01-01

    The language reading direction is probably one of the most determinant factors influencing the successful internationalization of graphical user interfaces, beyond their mere translation. Western languages are read from left to right and top to bottom, while Arabic languages and Hebrew are read from right to left and top to bottom, and Oriental languages are read from top to bottom. In order to address this challenge, we introduce flippable user interfaces that enable the end user to change t...

  7. Practical speech user interface design

    CERN Document Server

    Lewis, James R

    2010-01-01

    Although speech is the most natural form of communication between humans, most people find using speech to communicate with machines anything but natural. Drawing from psychology, human-computer interaction, linguistics, and communication theory, Practical Speech User Interface Design provides a comprehensive yet concise survey of practical speech user interface (SUI) design. It offers practice-based and research-based guidance on how to design effective, efficient, and pleasant speech applications that people can really use. Focusing on the design of speech user interfaces for IVR application

  8. Search-User Interface Design

    CERN Document Server

    Wilson, Max

    2011-01-01

    Search User Interfaces (SUIs) represent the gateway between people who have a task to complete, and the repositories of information and data stored around the world. Not surprisingly, therefore, there are many communities who have a vested interest in the way SUIs are designed. There are people who study how humans search for information, and people who study how humans use computers. There are people who study good user interface design, and people who design aesthetically pleasing user interfaces. There are also people who curate and manage valuable information resources, and people who desi

  9. DIRAC: Secure web user interface

    International Nuclear Information System (INIS)

    Casajus Ramo, A; Sapunov, M

    2010-01-01

Traditionally the interaction between users and the Grid is done with command line tools. However, these tools are difficult for non-expert users to use, providing minimal help and generating outputs that are not always easy to understand, especially in case of errors. Graphical User Interfaces are typically limited to providing access to monitoring or accounting information and concentrate on some particular aspects, failing to cover the full spectrum of grid control tasks. To make the Grid more user friendly, more complete graphical interfaces are needed. Within the DIRAC project we have attempted to construct a Web based User Interface that provides means not only for monitoring the system behavior but also for steering the main user activities on the grid. Using DIRAC's web interface a user can easily track jobs and data. It provides access to job information and allows performing actions on jobs such as killing or deleting. Data managers can define and monitor file transfer activity as well as check requests set by jobs. Production managers can define and follow large data productions and react if necessary by stopping or starting them. The Web Portal is built following all the grid security standards and using modern Web 2.0 technologies, which make it possible to achieve a user experience similar to that of desktop applications. Details of the DIRAC Web Portal architecture and User Interface will be presented and discussed.

  10. Preface (to Playful User Interfaces)

    NARCIS (Netherlands)

    Unknown, [Unknown; Nijholt, A.; Nijholt, Antinus

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  11. Demonstrator 1: User Interface and User Functions

    DEFF Research Database (Denmark)

    Gram, Christian

    1999-01-01

    Describes the user interface and its functionality in a prototype system used for a virtual seminar session. The functionality is restricted to what is needed for a distributed seminar discussion among not too many people. The system is designed to work with the participants distributed at several...

  12. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI) that reviews the design of end-user interfaces.This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI that range from basic definitions of the problem, evaluation of how to look at the problem domain, and fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics-definition of the problem, psychological and social factors, principles of interface design, computer intelligenc

  13. User acquaintance with mobile interfaces.

    Science.gov (United States)

    Ehrler, Frederic; Walesa, Magali; Sarrey, Evelyne; Wipfli, Rolf; Lovis, Christian

    2013-01-01

Handheld technology is slowly finding its place in the healthcare world. Some clinicians already make intensive use of dedicated mobile applications to consult clinical references. However, handheld technology still hasn't been broadly embraced at the core of the healthcare business, the hospitals. The weak penetration of handheld technology in hospitals can be partly explained by the caution of stakeholders, who must be convinced of the efficiency of these tools before going forward. In a domain where temporal constraints are increasingly strong, caregivers cannot lose time playing with gadgets. Not all users are comfortable with tactile manipulation, and the lack of dedicated peripherals complicates data entry for novices. Stakeholders must be convinced that caregivers will be able to master handheld devices. In this paper, we make the assumption that the proper design of an interface may influence users' performance when recording information. We are also interested in finding out whether users increase their efficiency when using handheld tools repeatedly. To answer these questions, we set up a field study to compare users' performance on three different user interfaces while recording vital signs. Some user interfaces were familiar to users, and others were totally innovative. Results showed that users' familiarity with smartphones influences their performance and that users improve their performance by repeating a task.

  14. Towards personalized adaptive user interfaces

    International Nuclear Information System (INIS)

    Kostov, Vlaho; Fukuda, Shuchi; Yanagisawa, Hideyoshi

    2002-01-01

An approach towards standardization of the general rules for synthesis and design of man-machine interfaces that include dynamic adaptive behavior is presented. The link between the personality type (Myers-Briggs or Keirsey Temperament Sorter) and the personal preferences of the users (Kansei) for the purpose of building a Graphical User Interface (GUI) was investigated. The rules for a personalized emotional GUI based on the subjective preferences of the users were defined. The results were tested on a modified TETRIS game that displayed background characters capable of emotional response. When the system responded to a user in a manner customized to his or her preferences, the reaction time was smaller and the information transfer was faster. Usability testing methods were used and it was shown that the development of a pleasant cartoon-face GUI based on the user's inborn personality tendencies was feasible. (Author)

  15. User Interface Technology Survey.

    Science.gov (United States)

    1987-04-01

Menu entries may be grouped and ordered into menu hierarchies. All levels of a hierarchy may be visible at once, or they may have to be navigated with...

  16. User interface user's guide for HYPGEN

    Science.gov (United States)

    Chiu, Ing-Tsau

    1992-01-01

The user interface (UI) of HYPGEN is developed using the Panel Library to shorten the learning curve for new users and provide easier ways to run HYPGEN for casual as well as advanced users. Menus, buttons, sliders, and type-in fields are used extensively in the UI to allow users to point and click with a mouse to choose various available options or to change values of parameters. On-line help is provided to give users information on using the UI without consulting the manual. Default values are set for most parameters, and boundary conditions are determined by the UI to further reduce the effort needed to run HYPGEN; however, users are free to make any changes and save them in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by the UI and saved in a PLOT3D journal file. For large grids which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while the UI stays at the workstation.

  17. User interface and patient involvement.

    Science.gov (United States)

    Andreassen, Hege Kristin; Lundvoll Nilsen, Line

    2013-01-01

Increased patient involvement is a goal in contemporary health care, and of importance to the development of patient-oriented ICT. In this paper we discuss how the design of patient-user interfaces can affect patient involvement. Our discussion is based on 12 semi-structured interviews with patient users of a web-based solution for patient-doctor communication piloted in Norway. We argue that ICT solutions offering a choice of user interfaces on the patient side are preferable to ensure individual accommodation and a high degree of patient involvement. When introducing web-based tools for patient-health professional communication, a free-text option should be provided to the patient users.

  18. The HEASARC graphical user interface

    Science.gov (United States)

    White, N.; Barrett, P.; Jacobs, P.; Oneel, B.

    1992-01-01

    An OSF/Motif-based graphical user interface has been developed to facilitate the use of the database and data analysis software packages available from the High Energy Astrophysics Science Archive Research Center (HEASARC). It can also be used as an interface to other, similar, routines. A small number of tables are constructed to specify the possible commands and command parameters for a given set of analysis routines. These tables can be modified by a designer to affect the appearance of the interface screens. They can also be dynamically changed in response to parameter adjustments made while the underlying program is running. Additionally, a communication protocol has been designed so that the interface can operate locally or across a network. It is intended that this software be able to run on a variety of workstations and X terminals.
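
    A hedged, much-simplified illustration of the table-driven idea described above (not HEASARC code): a small table specifies the commands and their parameters, and a generic layer builds the prompts from it, so changing the tables changes the interface without touching the analysis routines. The command and parameter names are invented for the example.

```python
# Illustrative table-driven command interface; commands and parameters are made up.
COMMAND_TABLE = {
    "extract": {
        "help": "Extract a spectrum from an event file",
        "params": [
            {"name": "infile",  "type": str,   "prompt": "Input event file"},
            {"name": "binsize", "type": float, "prompt": "Bin size (s)", "default": "16.0"},
        ],
    },
    "plot": {
        "help": "Plot a previously extracted spectrum",
        "params": [{"name": "device", "type": str, "prompt": "Plot device", "default": "xw"}],
    },
}

def run_command(name, table=COMMAND_TABLE):
    """Prompt for each parameter listed in the table and return the collected arguments."""
    spec = table[name]
    print(f"{name}: {spec['help']}")
    args = {}
    for p in spec["params"]:
        raw = input(f"{p['prompt']} [{p.get('default', '')}]: ") or p.get("default", "")
        args[p["name"]] = p["type"](raw)
    return args  # handed off to the underlying analysis routine
```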

  19. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  20. Spelling Correction in User Interfaces.

    Science.gov (United States)

    1982-12-20

conventional typescript-oriented command language, where most commands consist of a verb followed by a sequence of arguments. Most user terminals are... and explanations, not part of the typescripts. 2. Design Issues: We were prompted to look for a new correction... remaining 73% led us to wonder what other mechanisms might permit further corrections while retaining the typescript-style interface. Most of the other...

  1. On user behaviour adaptation under interface change

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2014-02-01

Full Text Available International Conference on Intelligent User Interfaces, Haifa, Israel, 24-27 February 2014. On User Behaviour Adaptation Under Interface Change. Benjamin Rosman, Subramanian Ramamoorthy, M. M. Hassan Mahmud, School of Informatics, University of Edinburgh...

  2. User interface inspection methods a user-centered design method

    CERN Document Server

    Wilson, Chauncey

    2014-01-01

    User Interface Inspection Methods succinctly covers five inspection methods: heuristic evaluation, perspective-based user interface inspection, cognitive walkthrough, pluralistic walkthrough, and formal usability inspections. Heuristic evaluation is perhaps the best-known inspection method, requiring a group of evaluators to review a product against a set of general principles. The perspective-based user interface inspection is based on the principle that different perspectives will find different problems in a user interface. In the related persona-based inspection, colleagues assume the

  3. Natural user interfaces for multitouch devices

    OpenAIRE

    Bukovinski, Matej

    2010-01-01

This thesis presents a new class of user interfaces, which are commonly referred to as natural user interfaces. It discusses their main characteristics, evolution and advantages over currently dominant graphical user interfaces. Special attention is devoted to the subgroup of natural user interfaces for multitouch devices. Multitouch technology is first presented from a technical point of view and afterwards also in practice, in the form of a comparative study of six popular multitouch platfo...

  4. Design and Development of a Web Based User Interface

    OpenAIRE

    László, Magda

    2014-01-01

The first objective of the thesis is to study the technological background of application design, more specifically the Unified Modeling Language (hereinafter UML). In this way, the research provides a deeper understanding of the technical aspects of the practical part of the thesis work. The second and third objectives of this thesis are to design and develop a web application, more specifically a Web Based User Interface for a Multimodal Observation and Analysis System for Social Interactions...

  5. End User Development Toolkit for Developing Physical User Interface Applications

    OpenAIRE

    Abrahamsen, Daniel T; Palfi, Anders; Svendsen, Haakon Sønsteby

    2014-01-01

BACKGROUND: Tangible user interfaces and end user development are two growing research areas in software technology. Physical representation promotes opportunities to ease the use of technology and reinforce personality traits such as creativeness, collaboration and intuitive actions. However, designing tangible user interfaces is both cumbersome and requires several layers of architecture. End user development allows users with no programming experience to create or customize their own applications. ...

  6. User Interface Cultures of Mobile Knowledge Workers

    Directory of Open Access Journals (Sweden)

    Petri Mannonen

    2008-10-01

Full Text Available Information and communication tools (ICTs) have become a major influence on how modern work is carried out. Methods of user-centered design do not, however, take into account the full complexity of technology and the user interface context the users live in. User interface culture analysis aims at providing designers with new ways and strategies to better take into account the current user interface environment when designing new products. This paper describes the reasons behind user interface culture analysis and shows examples of its usage when studying mobile and distributed knowledge workers.

  7. User interfaces of information retrieval systems and user friendliness

    Directory of Open Access Journals (Sweden)

    Polona Vilar

    2008-01-01

Full Text Available The paper deals with the characteristics of user interfaces of information retrieval systems, with an emphasis on design and evaluation. It presents users' information retrieval tasks and the functions which are offered through interfaces. Design rules, guidelines and standards are presented, as well as criteria and methods for evaluation. Special emphasis is placed on the concept of user friendliness as one of the most important characteristics of user interfaces. Various definitions of user friendliness are presented and their elements are also discussed. In the end, the paper shows how user interfaces should be designed, taking into consideration all these criteria.

  8. Graphical User Interface in Art

    Science.gov (United States)

    Gwilt, Ian

This essay discusses the use of the Graphical User Interface (GUI) as a site of creative practice. By creatively repositioning the GUI as a work of art it is possible to challenge our understanding and expectations of the conventional computer interface, wherein the icons and navigational architecture of the GUI no longer function as a technological tool. These artistic recontextualizations are often used to question our engagement with technology and to highlight the pivotal place that the domestic computer has taken in our everyday social, cultural and, increasingly, creative domains. Through these works the media specificity of the screen-based GUI can be broken by dramatic changes in scale, form and configuration. This can be seen in the work of new media artists who have re-imagined the GUI in a number of creative forms, both within the digital, as image, animation, net and interactive art, and in the analogue, as print, painting, sculpture, installation and performative event. Furthermore, as a creative work, the GUI can also be utilized as a visual way-finder to explore the relationship between the dynamic potentials of the digital and the concretized qualities of the material artifact.

  9. User interfaces of information retrieval systems and user friendliness

    OpenAIRE

    Polona Vilar; Maja Žumer

    2008-01-01

The paper deals with the characteristics of user interfaces of information retrieval systems, with an emphasis on design and evaluation. It presents users' information retrieval tasks and the functions which are offered through interfaces. Design rules, guidelines and standards are presented, as well as criteria and methods for evaluation. Special emphasis is placed on the concept of user friendliness as one of the most important characteristics of user interfaces. Various definitions of u...

  10. Graphical user interfaces and visually disabled users

    NARCIS (Netherlands)

    Poll, L.H.D.; Waterham, R.P.

    1995-01-01

From February 1992 until the end of 1993, the authors (Institute for Perception Research, IPO) participated in a European project (Technology Initiative for Disabled and Elderly, TIDE) which addressed the problem arising for visually disabled computer users from the growing use of Graphical User

  11. User Interface Design for Dynamic Geometry Software

    Science.gov (United States)

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  12. Learning Analytics for Natural User Interfaces

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Shum, Simon Buckingham; Schneider, Bertrand; Charleer, Sven; Klerkx, Joris; Duval, Erik

    2017-01-01

    The continuous advancement of natural user interfaces (NUIs) allows for the development\tof novel and creative ways to support collocated collaborative work in a wide range of areas, including teaching and learning. The use of NUIs, such as those based on interactive multi-touch surfaces and tangible user interfaces (TUIs), can offer unique…

  13. Distributed user interfaces usability and collaboration

    CERN Document Server

    Lozano, María D; Tesoriero, Ricardo; Penichet, Victor MR

    2013-01-01

    Written by international researchers in the field of Distributed User Interfaces (DUIs), this book brings together important contributions regarding collaboration and usability in Distributed User Interface settings. Throughout the thirteen chapters authors address key questions concerning how collaboration can be improved by using DUIs, including: in which situations a DUI is suitable to ease the collaboration among users; how usability standards can be used to evaluate the usability of systems based on DUIs; and accurately describe case studies and prototypes implementing these concerns

  14. Reasoning about Users' Actions in a Graphical User Interface.

    Science.gov (United States)

    Virvou, Maria; Kabassi, Katerina

    2002-01-01

    Describes a graphical user interface called IFM (Intelligent File Manipulator) that provides intelligent help to users. Explains two underlying reasoning mechanisms, one an adaptation of human plausible reasoning and one that performs goal recognition based on the effects of users' commands; and presents results of an empirical study that…

  15. The Rise of the Graphical User Interface.

    Science.gov (United States)

    Edwards, Alastair D. N.

    1996-01-01

    Discusses the history of the graphical user interface (GUI) and the growing realization that adaptations must be made to it lest its visual nature discriminate against nonsighted or sight-impaired users. One of the most popular commercially developed adaptations is to develop sounds that signal the location of icons or menus to mouse users.…

  16. Gestures in an Intelligent User Interface

    Science.gov (United States)

    Fikkert, Wim; van der Vet, Paul; Nijholt, Anton

    In this chapter we investigated which hand gestures are intuitive to control a large display multimedia interface from a user's perspective. Over the course of two sequential user evaluations, we defined a simple gesture set that allows users to fully control a large display multimedia interface, intuitively. First, we evaluated numerous gesture possibilities for a set of commands that can be issued to the interface. These gestures were selected from literature, science fiction movies, and a previous exploratory study. Second, we implemented a working prototype with which the users could interact with both hands and the preferred hand gestures with 2D and 3D visualizations of biochemical structures. We found that the gestures are influenced to significant extent by the fast paced developments in multimedia interfaces such as the Apple iPhone and the Nintendo Wii and to no lesser degree by decades of experience with the more traditional WIMP-based interfaces.

  17. Applying Cognitive Psychology to User Interfaces

    Science.gov (United States)

    Durrani, Sabeen; Durrani, Qaiser S.

This paper explores some key aspects of cognitive psychology that may be mapped onto user interfaces. The major focus in existing user interface guidelines is on consistency, simplicity, feedback, system messages, display issues, navigation, colors, graphics, visibility and error prevention [8-10]. These guidelines are effective in designing user interfaces. However, they do not handle the issues that may arise due to the innate structure of the human brain and human limitations, for example, where to place graphics on the screen so that the user can easily process them, and what kind of background should be used on the screen given the limitations of the human motor system. In this paper we have collected some available guidelines from the area of cognitive psychology [1, 5, 7]. In addition, we have extracted a few guidelines from theories and studies of cognitive psychology [3, 11] which may be mapped to user interfaces.

  18. Through the Interface - a human activity approach to user interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    In providing a theoretical framework for understanding human- computer interaction as well as design of user interfaces, this book combines elements of anthropology, psychology, cognitive science, software engineering, and computer science. The framework examines the everyday work practices of us...

  19. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  20. Playful User Interfaces. Interfaces that Invite Social and Physical Interaction.

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  1. Projection Mapping User Interface for Disabled People.

    Science.gov (United States)

    Gelšvartas, Julius; Simutis, Rimvydas; Maskeliūnas, Rytis

    2018-01-01

Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only by using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and person independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities.
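
    One common way to realise the camera-to-projector mapping mentioned above is a planar homography estimated from point correspondences; the sketch below is an assumption-laden illustration using OpenCV (not the authors' calibration procedure) that warps a detected tabletop contour into projector pixels so it can be highlighted.

```python
# Illustrative camera-projector mapping via a planar homography (OpenCV).
# Assumes the tabletop is planar and that matching camera/projector points were
# collected beforehand, e.g. by projecting a known pattern and detecting it.
import numpy as np
import cv2

def calibrate_camera_to_projector(cam_pts, proj_pts):
    """cam_pts, proj_pts: Nx2 arrays of corresponding points (N >= 4)."""
    H, _ = cv2.findHomography(np.asarray(cam_pts, np.float32),
                              np.asarray(proj_pts, np.float32), cv2.RANSAC)
    return H

def highlight_object(H, contour_cam, projector_size=(1920, 1080)):
    """Return a projector frame with the object's outline drawn in its mapped position."""
    pts = np.asarray(contour_cam, np.float32).reshape(-1, 1, 2)
    proj_pts = cv2.perspectiveTransform(pts, H)
    frame = np.zeros((projector_size[1], projector_size[0], 3), np.uint8)
    cv2.polylines(frame, [proj_pts.astype(np.int32)], isClosed=True,
                  color=(0, 255, 0), thickness=8)
    return frame  # this image is what gets sent to the projector output
```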

  2. Zoomable User Interfaces for the Semantic WEB

    National Research Council Canada - National Science Library

    Gorniak, Mark

    2004-01-01

... The University of Maryland, College Park (UMCP) developed an interface to visualize the taxonomic hierarchy of data, and applied integrated searching and browsing so that users need not have complete knowledge either of appropriate keyword...

  3. Projection Mapping User Interface for Disabled People

    Science.gov (United States)

    Simutis, Rimvydas; Maskeliūnas, Rytis

    2018-01-01

Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only by using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and person independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities. PMID:29686827

  4. Projection Mapping User Interface for Disabled People

    Directory of Open Access Journals (Sweden)

    Julius Gelšvartas

    2018-01-01

Full Text Available Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only by using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and person independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities.

  5. New ROOT Graphical User Interfaces for fitting

    International Nuclear Information System (INIS)

    Maline, D Gonzalez; Moneta, L; Antcheva, I

    2010-01-01

ROOT, as a scientific data analysis framework, provides extensive capabilities via Graphical User Interfaces (GUI) for performing interactive analysis and visualizing data objects like histograms and graphs. A new interface for fitting has been developed for performing, exploring and comparing fits on data point sets such as histograms, multi-dimensional graphs or trees. With this new interface, users can interactively build the fit model function, set parameter values and constraints, and select fit and minimization methods with their options. Functionality for visualizing the fit results is provided as well, with the possibility of drawing residuals or confidence intervals. Furthermore, the new fit panel acts as a standalone application and does not prevent users from interacting with other windows. We will describe in detail the functionality of this user interface, covering as well the new capabilities provided by the new fitting and minimization tools introduced recently in the ROOT framework.
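
    For context, the fitting workflow that the Fit Panel drives interactively (choosing a model function and fit options, then inspecting results) can also be exercised from a short script; the hedged PyROOT sketch below uses only standard ROOT fitting calls and assumes a ROOT installation with Python bindings. It is not the GUI code itself.

```python
# Illustrative ROOT fit from Python: build a histogram, fit a Gaussian model,
# and read back parameters, errors and the fit quality.
import ROOT

h = ROOT.TH1D("h", "sample;x;entries", 100, -5.0, 5.0)
rng = ROOT.TRandom3(42)
for _ in range(10000):
    h.Fill(rng.Gaus(0.0, 1.0))

model = ROOT.TF1("model", "gaus", -5.0, 5.0)   # normalisation, mean, width
model.SetParameters(1000.0, 0.0, 1.0)

# "S" stores and returns the fit result, "Q" keeps the fit quiet.
result_ptr = h.Fit(model, "SQ")
fit = result_ptr.Get()
print("mean  =", fit.Parameter(1), "+/-", fit.ParError(1))
print("sigma =", fit.Parameter(2), "+/-", fit.ParError(2))
print("chi2/ndf =", fit.Chi2() / fit.Ndf())
```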

  6. Nurses perceptions of a user friendly interface

    OpenAIRE

    Alshafai, Fatimah

    2017-01-01

    Introduction: The successful implementation of clinical information systems depends to a large extent on its usability. Usability can be achieved by a strong focus on interface quality. With a focus on improving the quality of patient care, growing numbers of clinical information systems have been advertised as being "user-friendly". However, the term "user-friendly" may not be quite accurate and in some circumstances could be misleading. Within a clinical setting, an interface designed as ea...

  7. Mobile Phone User Interfaces in Multiplayer Games

    OpenAIRE

    NURMINEN, MINNA

    2007-01-01

    This study focuses on the user interface elements of mobile phones and their qualities in multiplayer games. The mobile phone is not intended as a gaming device, so its technology has many shortcomings when it comes to playing games on it; one of these is the non-standardized user interface design. However, it also has some strengths, such as its portability and networked nature. In addition, many mobile phone models today have a camera, a feature only few gaming devices hav...

  8. Playful user interfaces interfaces that invite social and physical interaction

    CERN Document Server

    2014-01-01

    The book is about user interfaces to applications that have been designed for social and physical interaction. The interfaces are ‘playful’, that is, users feel challenged to engage in social and physical interaction because that will be fun. The topics that will be present in this book are interactive playgrounds, urban games using mobiles, sensor-equipped environments for playing, child-computer interaction, tangible game interfaces, interactive tabletop technology and applications, full-body interaction, exertion games, persuasion, engagement, evaluation, and user experience. Readers of the book will not only get a survey of state-of-the-art research in these areas, but the chapters in this book will also provide a vision of the future where playful interfaces will be ubiquitous, that is, present and integrated in home, office, recreational, sports and urban environments, emphasizing that in the future in these environments game elements will be integrated and welcomed.

  9. Spectrometer user interface to computer systems

    International Nuclear Information System (INIS)

    Salmon, L.; Davies, M.; Fry, F.A.; Venn, J.B.

    1979-01-01

    A computer system for use in radiation spectrometry should be designed around the needs and comprehension of the user and his operating environment. To this end, the functions of the system should be built in a modular and independent fashion such that they can be joined to the back end of an appropriate user interface. The point that this interface should be designed rather than just allowed to evolve is illustrated by reference to four related computer systems of differing complexity and function. The physical user interfaces in all cases are keyboard terminals, and the virtues and otherwise of these devices are discussed and compared with others. The language interface needs to satisfy a number of requirements, often conflicting. Among these, simplicity and speed of operation compete with flexibility and scope. Both experienced and novice users need to be considered, and any individual's needs may vary from naive to complex. To be efficient and resilient, the implementation must use an operating system, but the user needs to be protected from its complex and unfamiliar syntax. At the same time the interface must allow the user access to all services appropriate to his needs. The user must also receive an image of privacy in a multi-user system. The interface itself must be stable and exhibit continuity between implementations. Some of these conflicting needs have been overcome by the SABRE interface with languages operating at several levels. The foundation is a simple semi-mnemonic command language that activates individual and independent functions. The commands can be used with positional parameters or in an interactive dialogue, the precise nature of which depends upon the operating environment and the user's experience. A command procedure or macro language allows combinations of commands with conditional branching and arithmetic features. Thus complex but repetitive operations are easily performed.

  10. Graphical User Interfaces and Library Systems: End-User Reactions.

    Science.gov (United States)

    Zorn, Margaret; Marshall, Lucy

    1995-01-01

    Describes a study by Parke-Davis Pharmaceutical Research Library to determine user satisfaction with the graphical user interface-based (GUI) Dynix Marquis compared with the text-based Dynix Classic Online Public Access Catalog (OPAC). Results show that the GUI-based OPAC was preferred by end-users over the text-based OPAC. (eight references) (DGM)

  11. EPICS system: system structure and user interface

    International Nuclear Information System (INIS)

    West, R.E.; Bartlett, J.F.; Bobbitt, J.S.; Lahey, T.E.; Kramper, B.J.; MacKinnon, B.A.

    1984-02-01

    This paper presents the user's view and the general organization of the EPICS control system at Fermilab. Various subsystems of the EPICS control system are discussed. These include the user command language, software protection, the device database, remote computer interfaces, and several application utilities. This paper is related to two other papers on EPICS: an overview paper and a detailed implementation paper.

  12. Nonspeech audio in user interfaces for TV

    NARCIS (Netherlands)

    Sluis, van de Richard; Eggen, J.H.; Rypkema, J.A.

    1997-01-01

    This study explores the end-user benefits of using nonspeech audio in television user interfaces. A prototype of an Electronic Programme Guide (EPG) served as a carrier for the research. One of the features of this EPG is the possibility to search for TV programmes in a category-based way. The EPG

  13. Interfaces for End-User Information Seeking.

    Science.gov (United States)

    Marchionini, Gary

    1992-01-01

    Discusses essential features of interfaces to support end-user information seeking. Highlights include cognitive engineering; task models and task analysis; the problem-solving nature of information seeking; examples of systems for end-users, including online public access catalogs (OPACs), hypertext, and help systems; and suggested research…

  14. Disjoint forms in graphical user interfaces

    NARCIS (Netherlands)

    Evers, S.; Achten, P.M.; Plasmeijer, M.J.; Loidl, H.W.

    Forms are parts of a graphical user interface (GUI) that show a set of values and allow the user to update them. The declarative form construction library FunctionalForms is extended with disjoint form combinators to capture some common patterns in which the form structure expresses a choice. We

  15. Development of INFRA graphic user interface

    International Nuclear Information System (INIS)

    Yang, Y. S.; Lee, C. B.; Kim, Y. M.; Kim, D. H.; Kim, S. K.

    2004-01-01

    A GUI (Graphic User Interface) has been developed for the high-burnup fuel performance code INFRA. INFRA is written in the FORTRAN programming language and was developed with COMPAQ Visual FORTRAN 6.5. Graphical input and output interfaces have been developed using Visual Basic and MDB, which are among the most widely used programming language and database for Windows application development. The various input parameters required for an INFRA calculation can be entered more conveniently through the newly developed input interface. Without any additional data handling, INFRA calculation results can be inspected intuitively as 2D or 3D graphs on screen and through an animation function.

  16. A multimodal interface to resolve the Midas-Touch problem in gaze controlled wheelchair.

    Science.gov (United States)

    Meena, Yogesh Kumar; Cecotti, Hubert; Wong-Lin, KongFatt; Prasad, Girijesh

    2017-07-01

    Human-computer interaction (HCI) research has been playing an essential role in the field of rehabilitation. The usability of gaze-controlled powered wheelchairs is limited by the Midas-Touch problem. In this work, we propose a multimodal graphical user interface (GUI) to control a powered wheelchair that aims to help upper-limb mobility impaired people in daily living activities. The GUI was designed to include a portable and low-cost eye-tracker and a soft-switch, wherein the wheelchair can be controlled in three different ways: 1) with a touchpad, 2) with an eye-tracker only, and 3) with the eye-tracker plus the soft-switch. The interface includes nine different commands (eight directions and stop) and is integrated within a powered wheelchair system. We evaluated the performance of the multimodal interface in terms of lap-completion time, the number of commands, and the information transfer rate (ITR) with eight healthy participants. The analysis of the results showed that the eye-tracker with soft-switch provides superior performance among the three conditions, with an ITR of 37.77 bits/min.
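
    The ITR figure quoted above is the usual way such interfaces are compared. As a reference point, the sketch below computes the standard (Wolpaw) information transfer rate from the number of commands, the selection accuracy, and the average selection time; the paper's exact computation may differ, and the numbers in the example are hypothetical.

        # Standard information transfer rate (ITR) formula in bits per minute.
        import math

        def itr_bits_per_min(n_commands: int, p: float, t_sel: float) -> float:
            """n_commands = number of selectable commands, p = accuracy,
            t_sel = average time per selection in seconds."""
            if p >= 1.0:
                bits = math.log2(n_commands)
            elif p <= 0.0:
                bits = 0.0
            else:
                bits = (math.log2(n_commands)
                        + p * math.log2(p)
                        + (1.0 - p) * math.log2((1.0 - p) / (n_commands - 1)))
            return bits * (60.0 / t_sel)

        # Hypothetical numbers for a 9-command interface (8 directions + stop)
        print(round(itr_bits_per_min(9, 0.95, 4.0), 2), "bits/min")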

  17. Chandra Source Catalog: User Interfaces

    Science.gov (United States)

    Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.

    2010-03-01

    The CSCview data mining interface is available for browsing the Chandra Source Catalog (CSC) and downloading tables of quality-assured source properties and data products. Once the desired source properties and search criteria are entered into the CSCview query form, the resulting source matches are returned in a table along with the values of the requested source properties for each source. (The catalog can be searched on any source property, not just position.) At this point, the table of search results may be saved to a text file, and the available data products for each source may be downloaded. CSCview save files are output in RDB-like and VOTable format. The available CSC data products include event files, spectra, lightcurves, and images, all of which are processed with the CIAO software. CSC data may also be accessed non-interactively with Unix command-line tools such as cURL and Wget, using ADQL 2.0 query syntax. In fact, CSCview features a separate ADQL query form for those who wish to specify this type of query within the GUI. Several interfaces are available for learning if a source is included in the catalog (in addition to CSCview): 1) the CSC interface to Sky in Google Earth shows the footprint of each Chandra observation on the sky, along with the CSC footprint for comparison (CSC source properties are also accessible when a source within a Chandra field-of-view is clicked); 2) the CSC Limiting Sensitivity online tool indicates if a source at an input celestial location was too faint for detection; 3) an IVOA Simple Cone Search interface locates all CSC sources within a specified radius of an R.A. and Dec.; and 4) the CSC-SDSS cross-match service returns the list of sources common to the CSC and SDSS, either all such sources or a subset based on search criteria.
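
    The record mentions non-interactive access with cURL or Wget using ADQL 2.0. The sketch below shows what such a scripted cone-search query could look like in Python; the endpoint URL, table name, and column names are placeholders for illustration only, not the real CSC service definition.

        # Hedged sketch of a scripted ADQL cone search; endpoint and schema
        # names below are placeholders, not the actual CSC interface.
        import requests

        ADQL = """
        SELECT m.name, m.ra, m.dec
        FROM master_source m
        WHERE 1 = CONTAINS(POINT('ICRS', m.ra, m.dec),
                           CIRCLE('ICRS', 202.48, 47.23, 0.1))
        """

        resp = requests.get(
            "https://example.org/csc/query",          # placeholder endpoint
            params={"query": ADQL, "outputFormat": "votable"},
            timeout=60,
        )
        resp.raise_for_status()
        with open("csc_results.vot", "wb") as f:
            f.write(resp.content)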

  18. Generating User Interfaces with the FUSE-System

    OpenAIRE

    Frank Lonczewski; Siegfried Schreiber

    2017-01-01

    With the FUSE(Formal User interface Specification Environment)-System we present a methodology and a set of integrated tools for the automatic generation of graphical user interfaces. FUSE provides tool-based support for all phases (task-, user-, problem domain analysis, design of the logical user interface, design of user interface in a particular layout style) of the user interface development process. Based on a formal specification of dialogue- and layout guidelines, FUSE allows the autom...

  19. Language workbench user interfaces for data analysis

    Science.gov (United States)

    Benson, Victoria M.

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, this also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929

  20. Language workbench user interfaces for data analysis

    Directory of Open Access Journals (Sweden)

    Victoria M. Benson

    2015-02-01

    Full Text Available Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, this also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/).

  1. Pen-and-Paper User Interfaces

    CERN Document Server

    Steimle, Jurgen

    2012-01-01

    Even at the beginning of the 21st century, we are far from becoming paperless. Pen and paper is still the only truly ubiquitous information processing technology. Pen-and-paper user interfaces bridge the gap between paper and the digital world. Rather than replacing paper with electronic media, they seamlessly integrate both worlds in a hybrid user interface. Classical paper documents become interactive. This opens up a huge field of novel computer applications at our workplaces and in our homes. This book provides readers with a broad and extensive overview of the field, so as to provide a fu

  2. Programming Graphical User Interfaces in R

    CERN Document Server

    Verzani, John

    2012-01-01

    Programming Graphical User Interfaces with R introduces each of the major R packages for GUI programming: RGtk2, qtbase, Tcl/Tk, and gWidgets. With examples woven through the text as well as stand-alone demonstrations of simple yet reasonably complete applications, the book features topics especially relevant to statisticians who aim to provide a practical interface to functionality implemented in R. The book offers: A how-to guide for developing GUIs within R The fundamentals for users with limited knowledge of programming within R and other languages GUI design for specific functions or as l

  3. Liferay 6.2 user interface development

    CERN Document Server

    Chen, Xinsheng

    2013-01-01

    A step-by-step tutorial, targeting the Liferay 6.2 version. This book takes a step-by-step approach to customizing the look and feel of your website, and shows you how to build a great looking user interface as well.""Liferay 6.2 User Interface Development"" is for anyone who is interested in the Liferay Portal. It contains text that explicitly introduces you to the Liferay Portal. You will benefit most from this book if you have Java programming experience and have coded servlets or JavaServer Pages before. Experienced Liferay portal developers will also find this book useful because it expla

  4. The web based user interface of RODOS

    International Nuclear Information System (INIS)

    Raskob, W.; Mueller, A.; Munz, E.; Rafat, M.

    2003-01-01

    Full text: The interaction between the RODOS system and its users has three main objectives: (1) operation of the system in its automatic and interactive modes including the processing of meteorological and radiological on-line data, and the choice of module chains for performing the necessary calculations; (2) input of data defining the accident situation, such as source term information, intervention criteria and timing of emergency actions; (3) selection and presentation of results in the form of spatial and temporal distributions of activity concentrations, areas affected by emergency actions and countermeasures, and their radiological and economic consequences. Users of category A have direct access to the RODOS system via local or wide area networks through the client/server protocol Internet/X. Any internet-connected X desktop machine, such as Unix workstations from different vendors, X-terminals, Linux PCs, and PCs with X-emulation can be used. A number of X-Windows based graphical user interfaces (GUIs) provide direct access to all functionalities of the RODOS system and allow for handling the various user interactions with the RODOS system described above. Among others, the user can trigger or interrupt the automatic processing mode, execute application programs simultaneously, modify and delete data, import data sets from databases, and change configuration files. As the user interacts directly with in-memory active processes, the system responds immediately after having performed the necessary calculations. For obtaining the requested results, the users must know which chain of application software has to be selected, how to interact with their interfaces, which sort of initialization data have to be assigned, etc. This flexible interaction with RODOS implies that only experienced and well-trained users are able to operate the system and to obtain correct and sensible information. A new interface has been developed which is based on the commonly used

  5. Development of graphical user interface for EGS

    International Nuclear Information System (INIS)

    Jin Gang; Liu Liye; Li Junli; Cheng Jianping

    2002-01-01

    In order to make EGS more convenient for engineers to use, a new program named EGS Win was developed with VC++; it runs under the Windows system and provides a graphical user interface. Through this program, the user only has to input simple and intuitive geometric entities to define the regions. This greatly improves the efficiency of using EGS.

  6. Multimodal Semantics Extraction from User-Generated Videos

    Directory of Open Access Journals (Sweden)

    Francesco Cricri

    2012-01-01

    Full Text Available User-generated video content has grown tremendously fast to the point of outpacing professional content creation. In this work we develop methods that analyze contextual information of multiple user-generated videos in order to obtain semantic information about public happenings (e.g., sport and live music events) being recorded in these videos. One of the key contributions of this work is a joint utilization of different data modalities, including data captured by auxiliary sensors during the video recording performed by each user. In particular, we analyze GPS data, magnetometer data, accelerometer data, video- and audio-content data. We use these data modalities to infer information about the event being recorded, in terms of layout (e.g., stadium), genre, indoor versus outdoor scene, and the main area of interest of the event. Furthermore, we propose a method that automatically identifies the optimal set of cameras to be used in a multicamera video production. Finally, we detect the camera users who fall within the field of view of other cameras recording at the same public happening. We show that the proposed multimodal analysis methods perform well on various recordings obtained in real sport events and live music performances.
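
    One of the steps described above, deciding whether one camera user falls inside another camera's field of view, reduces to a simple geometric test once GPS positions and a compass heading are available. The sketch below illustrates that test; the field-of-view width and the coordinates are assumptions, not values from the paper.

        # Decide whether camera B lies inside camera A's horizontal field of
        # view, given GPS positions and a magnetometer heading for A.
        import math

        def bearing_deg(lat1, lon1, lat2, lon2):
            """Initial bearing from point 1 to point 2, degrees from north."""
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            y = math.sin(dlon) * math.cos(phi2)
            x = (math.cos(phi1) * math.sin(phi2)
                 - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
            return math.degrees(math.atan2(y, x)) % 360.0

        def in_field_of_view(cam, other, fov_deg=60.0):
            b = bearing_deg(cam["lat"], cam["lon"], other["lat"], other["lon"])
            diff = abs((b - cam["heading"] + 180.0) % 360.0 - 180.0)
            return diff <= fov_deg / 2.0

        cam_a = {"lat": 60.1699, "lon": 24.9384, "heading": 90.0}  # facing east
        cam_b = {"lat": 60.1700, "lon": 24.9405}
        print(in_field_of_view(cam_a, cam_b))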

  7. EDTGRAF, DISSPLA User Interface Program

    International Nuclear Information System (INIS)

    Bloom, I.

    1989-01-01

    1 - Description of program or function: EDTGRAF is a graphics package that allows the user to access the high-quality graphics available in Computer Associates' (CA) DISSPLA without any knowledge of programming. EDTGRAF reduces the complex syntax of DISSPLA to simple menus. The use of menus decreases computer time spent preprocessing command input. High-quality graphics can be produced quickly in two and three dimensions and in color. EDTGRAF has capabilities for storing and later retrieving information needed to produce graphics. EDTGRAF will screen most input for entries that do not make sense in the current graph and will produce a meaningful error message. Two orientations are available for plots - COMIC, which produces a plot with the y-axis perpendicular to the long axis of the paper, and MOVIE, which produces a plot with the y-axis parallel to the long axis of the paper. 2 - Restrictions on the complexity of the problem - Maxima of: 1000 ordered pairs (or triplets) per dataset, 18 datasets per graph. Errors generated by DISSPLA are not trapped by EDTGRAF. EDTGRAF assumes the paper is 8.5 x 11 inches

  8. Chandra Source Catalog: User Interface

    Science.gov (United States)

    Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one ``master source'' entry and one or more ``source observation'' entries, the details of which are documented on the CSC ``Catalog Columns'' pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials (``threads'') available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.

  9. Detecting users handedness for ergonomic adaptation of mobile user interfaces

    DEFF Research Database (Denmark)

    Löchtefeld, Markus; Schardt, Phillip; Krüger, Antonio

    2015-01-01

    ) for users with average hand sizes. One solution is to offer adaptive user interfaces for such one-handed interactions. These modes have to be triggered manually and thus induce a critical overhead. They are further designed to bring all content closer, regardless of whether the phone is operated...... with the left or right hand. In this paper, we present an algorithm that allows determining the users' interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and twohanded usage as well as left- and right handed unlocking in 98.51% of all cases. This is achieved through a k...
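
    The record reports classifying one- versus two-handed and left- versus right-handed unlocking from the unlock gesture (the abstract is cut off at "a k", presumably a k-nearest-neighbour-style classifier). The sketch below shows the general shape of such a classifier using scikit-learn; the feature set and the data are hypothetical stand-ins, not the study's actual features or results.

        # Hedged sketch of handedness classification from unlock-gesture
        # features with a k-nearest-neighbour classifier; data are synthetic.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)

        # Each row: [mean swipe curvature, swipe start x, swipe duration (s)]
        left  = rng.normal([ 0.8, 0.25, 0.40], 0.1, size=(100, 3))
        right = rng.normal([-0.8, 0.75, 0.40], 0.1, size=(100, 3))
        X = np.vstack([left, right])
        y = np.array([0] * 100 + [1] * 100)   # 0 = left hand, 1 = right hand

        clf = KNeighborsClassifier(n_neighbors=5)
        print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())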

  10. The Promise of Zoomable User Interfaces

    Science.gov (United States)

    Bederson, Benjamin B.

    2011-01-01

    Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…

  11. Flash Builder customizing the user interface

    CERN Document Server

    Rocchi, Cesare

    2010-01-01

    Personalize user interface components of your projects. Example projects are grouped together in an AIR application and the appearance is totally customized. Learn how to change visual properties by means of style directives or create brand new skins by knowing and exploiting their internal architecture.

  12. User Interface of MUDR Electronic Health Record

    Czech Academy of Sciences Publication Activity Database

    Hanzlíček, Petr; Špidlen, Josef; Heroutová, Helena; Nagy, Miroslav

    2005-01-01

    Roč. 74, - (2005), s. 221-227 ISSN 1386-5056 R&D Projects: GA MŠk LN00B107 Institutional research plan: CEZ:AV0Z10300504 Keywords : electronic health record * user interface * data entry * knowledge base Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.374, year: 2005

  13. The Morphing Waldo: An Adaptive User Interface.

    Science.gov (United States)

    Howell, Colby Chambers

    2001-01-01

    Performance Centered Design (PCD) offers an alternative to the old methodology of software development. The author suggests one design possibility that could be used to create a more universally satisfying interface for existing applications, an adaptive mini-program that "sits" between the larger application and the user. Potential…

  14. A human activity approach to User Interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    1989-01-01

    the work situations in which computer-based artifacts are used: The framework deals with the role of the user interface in purposeful human work. Human activity theory is used in this analysis. The purpose of this article is to make the reader curious and hopefully open his or her eyes to a somewhat...

  15. More playful user interfaces: an introduction

    NARCIS (Netherlands)

    Unknown, [Unknown; Nijholt, A.; Nijholt, Antinus

    2015-01-01

    In this chapter we embed recent research advances in creating playful user interfaces in a historical context. We have observations on spending leisure time, in particular predictions from previous decades and views expressed in Science Fiction novels. We confront these views and predictions with

  16. ADELA - user interface for fuel charge design

    International Nuclear Information System (INIS)

    Havluj, Frantisek

    2010-01-01

    ADELA is a supporting computer code - an ANDREA code add-on - for fuel batch design and optimization. It facilitates fuel batch planning, evaluation and archiving through a graphical user interface. ADELA simplifies and automates the design process and is closely linked to the QUADRIGA system for data library creation. (author)

  17. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. For example, the authors

  18. Toward Multimodal Human-Robot Interaction to Enhance Active Participation of Users in Gait Rehabilitation.

    Science.gov (United States)

    Gui, Kai; Liu, Honghai; Zhang, Dingguo

    2017-11-01

    Robotic exoskeletons for physical rehabilitation have been utilized for retraining patients suffering from paraplegia and enhancing motor recovery in recent years. However, users are not voluntarily involved in most systems. This paper aims to develop a locomotion trainer with multiple gait patterns, which can be controlled by the active motion intention of users. A multimodal human-robot interaction (HRI) system is established to enhance subject's active participation during gait rehabilitation, which includes cognitive HRI (cHRI) and physical HRI (pHRI). The cHRI adopts brain-computer interface based on steady-state visual evoked potential. The pHRI is realized via admittance control based on electromyography. A central pattern generator is utilized to produce rhythmic and continuous lower joint trajectories, and its state variables are regulated by cHRI and pHRI. A custom-made leg exoskeleton prototype with the proposed multimodal HRI is tested on healthy subjects and stroke patients. The results show that voluntary and active participation can be effectively involved to achieve various assistive gait patterns.
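
    The central pattern generator mentioned above is the component that turns the user's intention signals into smooth, rhythmic joint trajectories. As a minimal illustration, the sketch below integrates a Hopf oscillator whose amplitude and frequency could be modulated by such cHRI/pHRI signals; the gains and the mapping to a hip angle are illustrative, not the paper's parameters.

        # Minimal Hopf-oscillator central pattern generator producing a
        # rhythmic signal that is mapped to a joint angle.
        import numpy as np

        def hopf_cpg(mu=1.0, omega=2.0 * np.pi * 0.5, dt=0.01, steps=2000):
            """sqrt(mu) sets the amplitude, omega the angular frequency."""
            x, y = 0.01, 0.0
            traj = np.empty(steps)
            for k in range(steps):
                r2 = x * x + y * y
                dx = (mu - r2) * x - omega * y
                dy = (mu - r2) * y + omega * x
                x += dx * dt
                y += dy * dt
                traj[k] = x
            return traj

        # Map the oscillator output to a hip joint angle in degrees
        hip_angle = 10.0 + 15.0 * hopf_cpg()
        print(hip_angle[:5])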

  19. User interface design of electronic appliances

    CERN Document Server

    Baumann, Konrad

    2002-01-01

    Foreword by Brenda Laurel. Part One: Introduction 1. Background, Bruce Thomas 2. Introduction, Konrad Baumann 3. The Interaction Design Process, Georg Rakers Part Two: User Interface Design 4. Creativity Techniques, Irene Mavrommati 5. Design Principals, Irene Mavrommati and Adrian Martel 6. Design of On-Screen Interfaces, Irene Mavrommati Part Three: Input Devices 7. Controls, Konrad Baumann 8. Keyboards, Konrad Baumann 9. Advanced Interaction Techniques, Christopher Baber and Konrad Baumann 10. Speech Control, Christopher Baber and Jan Noyes 11. Wearable Computers, Christopher Baber Part Fou

  20. Graphic user interface for COSMOS code

    International Nuclear Information System (INIS)

    Oh, Je Yong; Koo, Yang Hyun; Lee, Byung Ho; Cheon, Jin Sik; Sohn, Dong Seong

    2003-06-01

    The Graphic User Interface (GUI) - which consists of graphical elements such as windows, menus, buttons, and icons - has made it possible for ordinary users to operate a computer easily. Hence, a GUI was introduced to improve the efficiency of entering parameters in the COSMOS code. Functions to output graphs to the screen and to PostScript files were also added, and the graph library can be applied to other codes. The principles of the GUI and the graphics library are described in detail in this report.

  1. Prototyping of user interfaces for mobile applications

    CERN Document Server

    Bähr, Benjamin

    2017-01-01

    This book investigates processes for the prototyping of user interfaces for mobile apps, and describes the development of new concepts and tools that can improve prototype-driven app development in the early stages. It presents the development and evaluation of a new requirements catalogue for prototyping mobile app tools that identifies the most important criteria such tools should meet at different prototype-development stages. This catalogue is not just a good point of orientation for designing new prototyping approaches, but also provides a set of metrics for comparing the performance of alternative prototyping tools. In addition, the book discusses the development of Blended Prototyping, a new approach for prototyping user interfaces for mobile applications in the early and middle development stages, and presents the results of an evaluation of its performance, showing that it provides a tool for teamwork-oriented, creative prototyping of mobile apps in the early design stages.

  2. Innovative User Interfaces in the Industrial Domain

    OpenAIRE

    Jutterström, Jenny

    2010-01-01

    The goal of this thesis is to explore how the HMI of a process control system can be improved by applying modern interaction technologies. Many new interaction possibilities are arising on the market, while the interaction in the industrial domain still is quite conservative, with computer mouse and keyboard as the central method of interaction. It is believed that by making use of technology available today, the user interface can provide further assistance to the process control operators a...

  3. Workshop AccessibleTV "Accessible User Interfaces for Future TV Applications"

    Science.gov (United States)

    Hahn, Volker; Hamisu, Pascal; Jung, Christopher; Heinrich, Gregor; Duarte, Carlos; Langdon, Pat

    Approximately half of the elderly people over 55 suffer from some type of typically mild visual, auditory, motor or cognitive impairment. For them, interaction, especially with PCs and other complex devices, is sometimes challenging, although accessible ICT applications could make much of a difference to their quality of life. Basically they have the potential to enable or simplify participation and inclusion in their surrounding private and professional communities. However, the availability of accessible user interfaces capable of adapting to the specific needs and requirements of users with individual impairments is very limited. Although there are a number of APIs [1, 2, 3, 4] available for various platforms that allow developers to provide accessibility features within their applications, today none of them provides features for the automatic adaptation of multimodal interfaces capable of automatically fitting the individual requirements of users with different kinds of impairments. Moreover, the provision of accessible user interfaces is still expensive and risky for application developers, as they need special experience and effort for user tests. Today many implementations simply neglect the needs of elderly people, thus locking out a large portion of their potential users. The workshop is organized as part of the dissemination activity for the European-funded project GUIDE "Gentle user interfaces for elderly people", which aims to address this situation with a comprehensive approach to the realization of multimodal user interfaces capable of adapting to the needs of users with different kinds of mild impairments. As an application platform, GUIDE will mainly target TVs and set-top boxes, such as the emerging Connected-TV or WebTV platforms, as they have the potential to address the needs of elderly users with applications for home automation, communication or continuing education.

  4. User interface for a tele-operated robotic hand system

    Science.gov (United States)

    Crawford, Anthony L

    2015-03-24

    Disclosed here is a user interface for a robotic hand. The user interface anchors a user's palm in a relatively stationary position and determines various angles of interest necessary for a user's finger to achieve a specific fingertip location. The user interface additionally conducts a calibration procedure to determine the user's applicable physiological dimensions. The user interface uses the applicable physiological dimensions and the specific fingertip location, and treats the user's finger as a two link three degree-of-freedom serial linkage in order to determine the angles of interest. The user interface communicates the angles of interest to a gripping-type end effector which closely mimics the range of motion and proportions of a human hand. The user interface requires minimal contact with the operator and provides distinct advantages in terms of available dexterity, work space flexibility, and adaptability to different users.

  5. User interface for a tele-operated robotic hand system

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, Anthony L

    2015-03-24

    Disclosed here is a user interface for a robotic hand. The user interface anchors a user's palm in a relatively stationary position and determines various angles of interest necessary for a user's finger to achieve a specific fingertip location. The user interface additionally conducts a calibration procedure to determine the user's applicable physiological dimensions. The user interface uses the applicable physiological dimensions and the specific fingertip location, and treats the user's finger as a two link three degree-of-freedom serial linkage in order to determine the angles of interest. The user interface communicates the angles of interest to a gripping-type end effector which closely mimics the range of motion and proportions of a human hand. The user interface requires minimal contact with the operator and provides distinct advantages in terms of available dexterity, work space flexibility, and adaptability to different users.
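
    The two records above describe treating the user's finger as a two-link, three degree-of-freedom serial linkage and solving for the joint angles that reach a given fingertip position. The sketch below shows the standard two-link planar inverse-kinematics computation behind that idea; the link lengths stand in for the calibrated physiological dimensions, and the third, out-of-plane degree of freedom is not modelled here.

        # Standard two-link planar inverse kinematics: fingertip position to
        # joint angles. Link lengths and the target point are hypothetical.
        import math

        def two_link_ik(x, y, l1, l2):
            """Return (joint1, joint2) in radians reaching fingertip (x, y)."""
            d2 = x * x + y * y
            c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
            if not -1.0 <= c2 <= 1.0:
                raise ValueError("target out of reach")
            theta2 = math.acos(c2)                      # elbow-down solution
            k1 = l1 + l2 * math.cos(theta2)
            k2 = l2 * math.sin(theta2)
            theta1 = math.atan2(y, x) - math.atan2(k2, k1)
            return theta1, theta2

        # Hypothetical finger segment lengths (metres) and fingertip target
        print(two_link_ik(0.05, 0.03, 0.04, 0.03))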

  6. A Design Approach for Tangible User Interfaces

    Directory of Open Access Journals (Sweden)

    Bernard Champoux

    2004-05-01

    Full Text Available This paper proposes a mechanism to design Tangible User Interfaces (TUIs) based on Alexander's (1964) design approach, i.e. achieving fitness between the form and its context. Adapted to the design of TUIs, the fitness-of-use mechanism now takes into consideration the potential conflicts between the hardware of the artifact (electro-mechanical components) and the form of the user's control (physical ergonomics). The design problem is a search for an effortless co-existence (fitness-of-use) between these two aspects. Tangible interface design differs from traditional graphical interface design as unsolved conflicts between hardware and ergonomics can deeply affect the desired interaction. Here we propose a mechanism (in the form of eight questions) that supports the design by defining the boundaries of the task, orienting the hardware (electro-mechanics) and ergonomics of the design space for various sub-tasks, and finally fitting the different components of the hardware and physical ergonomics of the artefact to provide a component-level fitness that delineates the final tangible interface. We further evaluate the effectiveness and efficiency of our approach by quantitative user evaluation.

  7. The crustal dynamics intelligent user interface anthology

    Science.gov (United States)

    Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M. 1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.

  8. User Interface Program for secure electronic tags

    International Nuclear Information System (INIS)

    Cai, Y.; Koehl, E.R.; Carlson, R.D.; Raptis, A.C.

    1995-05-01

    This report summarizes and documents the efforts of Argonne National Laboratory (ANL) in developing a secure tag communication user interface program comprising a tag monitor and a communication tool. This program can perform the same functions as the software that was developed at the Lawrence Livermore National Laboratory (LLNL), but it is enhanced with a user-friendly screen. It represents the first step in updating the TRANSCOM Tracking System (TRANSCOM) by incorporating a tag communication screen menu into the main menu of the TRANSCOM user program. A working version of TRANSCOM, enhanced with ANL secure-tag graphics, will strongly support the Department of Energy Warhead Dismantlement/Special Nuclear Materials Control initiatives. It will allow commercial satellite tracking of the movements and operational activities of treaty-limited items and transportation vehicles throughout Europe and the former USSR, as well as the continental US

  9. Designing Tangible User Interfaces for NFC Phones

    Directory of Open Access Journals (Sweden)

    Mikko Pyykkönen

    2012-01-01

    Full Text Available The increasing number of NFC phones is attracting application developers to utilize NFC functionality. We can hence soon expect a large number of mobile applications that users command by touching NFC tags in their environment with their NFC phones. The communication technology and the data formats have been standardized by the NFC Forum, but there are no conventions for advertising NFC tags to users, or the functionality that touching the tags triggers. Only individual graphical symbols have been suggested, when guidelines for advertising a rich variety of functionality are called for. In this paper, we identify the main challenges and present our proposal, a set of design guidelines based on more than twenty application prototypes we have built. We hope to initiate discussion and research resulting in uniform user interfaces for NFC-based services.

  10. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  11. Representing Graphical User Interfaces with Sound: A Review of Approaches

    Science.gov (United States)

    Ratanasit, Dan; Moore, Melody M.

    2005-01-01

    The inability of computer users who are visually impaired to access graphical user interfaces (GUIs) has led researchers to propose approaches for adapting GUIs to auditory interfaces, with the goal of providing access for visually impaired people. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews…

  12. Graphical User Interface Programming in Introductory Computer Science.

    Science.gov (United States)

    Skolnick, Michael M.; Spooner, David L.

    Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…

  13. PAMLX: a graphical user interface for PAML.

    Science.gov (United States)

    Xu, Bo; Yang, Ziheng

    2013-12-01

    This note announces pamlX, a graphical user interface/front end for the paml (for Phylogenetic Analysis by Maximum Likelihood) program package (Yang Z. 1997. PAML: a program package for phylogenetic analysis by maximum likelihood. Comput Appl Biosci. 13:555-556; Yang Z. 2007. PAML 4: Phylogenetic analysis by maximum likelihood. Mol Biol Evol. 24:1586-1591). pamlX is written in C++ using the Qt library and communicates with paml programs through files. It can be used to create, edit, and print control files for paml programs and to launch paml runs. The interface is available for free download at http://abacus.gene.ucl.ac.uk/software/paml.html.
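
    As the note says, pamlX communicates with the paml programs through control files. For orientation, the sketch below writes out a minimal codeml control file and launches the run, assuming codeml is on the PATH; the file names are placeholders and only a small, commonly used subset of control options is shown, so it is an illustration of the file-based interface rather than a complete configuration.

        # Hedged sketch of the file-based interface pamlX drives: write a
        # minimal codeml control file, then launch codeml on it.
        import subprocess
        import textwrap

        ctl = textwrap.dedent("""\
            seqfile  = alignment.phy   * codon alignment (placeholder name)
            treefile = species.tre     * tree topology (placeholder name)
            outfile  = results.txt

            seqtype  = 1               * 1 = codon sequences
            model    = 0               * one omega for all branches
            NSsites  = 0               * no site-class model
            fix_omega = 0
            omega    = 0.4             * initial dN/dS
        """)

        with open("codeml.ctl", "w") as f:
            f.write(ctl)

        subprocess.run(["codeml", "codeml.ctl"], check=True)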

  14. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    Science.gov (United States)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
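
    The "modality server" idea above amounts to wrapping each recognizer behind a small network server with a simple high-level protocol. The sketch below illustrates that pattern with Python's standard socketserver module; the line-delimited protocol, port, and command names are hypothetical, not the system's actual interface.

        # Minimal sketch of a modality server exposing a line-delimited text
        # protocol over a TCP socket; protocol and port are hypothetical.
        import socketserver

        class SpeechModalityHandler(socketserver.StreamRequestHandler):
            def handle(self):
                for raw in self.rfile:                 # one request per line
                    cmd = raw.decode().strip()
                    if cmd == "RECOGNIZE":
                        # A real server would call the wrapped engine here.
                        self.wfile.write(b"RESULT open the map\n")
                    elif cmd == "QUIT":
                        break
                    else:
                        self.wfile.write(b"ERROR unknown command\n")

        if __name__ == "__main__":
            with socketserver.TCPServer(("127.0.0.1", 5005),
                                        SpeechModalityHandler) as server:
                server.serve_forever()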

  15. Presentation of dynamically overlapping auditory messages in user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Papp, III, Albert Louis [Univ. of California, Davis, CA (United States)

    1997-09-01

    This dissertation describes a methodology and example implementation for the dynamic regulation of temporally overlapping auditory messages in computer-user interfaces. The regulation mechanism exists to schedule numerous overlapping auditory messages in such a way that each individual message remains perceptually distinct from all others. The method is based on the research conducted in the area of auditory scene analysis. While numerous applications have been engineered to present the user with temporally overlapped auditory output, they have generally been designed without any structured method of controlling the perceptual aspects of the sound. The method of scheduling temporally overlapping sounds has been extended to function in an environment where numerous applications can present sound independently of each other. The Centralized Audio Presentation System is a global regulation mechanism that controls all audio output requests made from all currently running applications. The notion of multimodal objects is explored in this system as well. Each audio request that represents a particular message can include numerous auditory representations, such as musical motives and voice. The Presentation System scheduling algorithm selects the best representation according to the current global auditory system state, and presents it to the user within the request constraints of priority and maximum acceptable latency. The perceptual conflicts between temporally overlapping audio messages are examined in depth through the Computational Auditory Scene Synthesizer. At the heart of this system is a heuristic-based auditory scene synthesis scheduling method. Different schedules of overlapped sounds are evaluated and assigned penalty scores. High scores represent presentations that include perceptual conflicts between over-lapping sounds. Low scores indicate fewer and less serious conflicts. A user study was conducted to validate that the perceptual difficulties predicted by
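
    The scheduling idea described above, scoring candidate schedules by their perceptual conflicts and preferring low-penalty ones, can be illustrated with a toy example. In the sketch below the penalty is simply the total pairwise temporal overlap between messages; this stands in for, and is much simpler than, the dissertation's auditory-scene heuristics.

        # Toy penalty-based scheduler: try candidate start times for each
        # message and keep the schedule with the least temporal overlap.
        from itertools import combinations, product

        messages = {"alert": 1.2, "email": 0.8, "status": 1.5}   # durations (s)

        def penalty(schedule):
            """Sum of pairwise temporal overlaps, in seconds."""
            total = 0.0
            for (n1, s1), (n2, s2) in combinations(schedule.items(), 2):
                overlap = (min(s1 + messages[n1], s2 + messages[n2])
                           - max(s1, s2))
                total += max(0.0, overlap)
            return total

        candidate_starts = [0.0, 0.5, 1.0, 1.5]
        best = min(
            (dict(zip(messages, starts))
             for starts in product(candidate_starts, repeat=len(messages))),
            key=penalty,
        )
        print(best, "penalty:", penalty(best))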

  16. The missing graphical user interface for genomics.

    Science.gov (United States)

    Schatz, Michael C

    2010-01-01

    The Galaxy package empowers regular users to perform rich DNA sequence analysis through a much-needed and user-friendly graphical web interface. See research article http://genomebiology.com/2010/11/8/R86 RESEARCH HIGHLIGHT: With the advent of affordable and high-throughput DNA sequencing, sequencing is becoming an essential component in nearly every genetics lab. These data are being generated to probe sequence variations, to understand transcribed, regulated or methylated DNA elements, and to explore a host of other biological features across the tree of life and across a range of environments and conditions. Given this deluge of data, novices and experts alike are facing the daunting challenge of trying to analyze the raw sequence data computationally. With so many tools available and so many assays to analyze, how can one be expected to stay current with the state of the art? How can one be expected to learn to use each tool and construct robust end-to-end analysis pipelines, all while ensuring that input formats, command-line options, sequence databases and program libraries are set correctly? Finally, once the analysis is complete, how does one ensure the results are reproducible and transparent for others to scrutinize and study?In an article published in Genome Biology, Jeremy Goecks, Anton Nekrutenko, James Taylor and the rest of the Galaxy Team (Goecks et al. 1) make a great advance towards resolving these critical questions with the latest update to their Galaxy Project. The ambitious goal of Galaxy is to empower regular users to carry out their own computational analysis without having to be an expert in computational biology or computer science. Galaxy adds a desperately needed graphical user interface to genomics research, making data analysis universally accessible in a web browser, and freeing users from the minutiae of archaic command-line parameters, data formats and scripting languages. Data inputs and computational steps are selected from

  17. Visual Design of User Interfaces by (De)composition

    OpenAIRE

    Lepreux, Sophie; Michotte, Benjamin; Vanderdonckt, Jean; 13th Int. Workshop on Design, Specification, and Verification of Interactive Systems DSV-IS

    2006-01-01

    Most existing graphical user interfaces are designed for a fixed context of use, thus making them rather difficult to modify for other contexts of use, such as for other users, other platforms, and other environments. This paper addresses this problem by introducing a new visual design method for graphical user interfaces referred to as "visual design by (de)composition". In this method, any individual or composite component of a graphical user interface is submitted to a series of o...

  18. Spatial sound in the use of multimodal interfaces for the acquisition of motor skills

    DEFF Research Database (Denmark)

    Hoffmann, Pablo F.

    2008-01-01

    This paper discusses the potential effectiveness of spatial sound in the use of multimodal interfaces and virtual environment technologies for the acquisition of motor skills. Because skills are generally of a multimodal nature, spatial sound is discussed in terms of the role that it may play in facilitating skill acquisition by complementing, or substituting, other sensory modalities. An overview of related research areas on audiovisual and audiotactile interaction is given in connection to the potential benefits of spatial sound as a means to improve the perceptual quality of the interfaces as well as to convey information considered critical for the transfer of motor skills.
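
    The simplest spatial-sound cues that such interfaces exploit are interaural level and time differences. The sketch below renders a tone with constant-power panning and a small interaural delay into a stereo WAV file; the azimuth, delay constant, and output file name are illustrative, and real training systems would typically use HRTF-based rendering instead.

        # Bare-bones interaural level/time difference rendering of a tone.
        import math
        import struct
        import wave

        RATE, DUR, FREQ = 44100, 1.0, 440.0
        azimuth_deg = 45.0                      # source to the right
        pan = (azimuth_deg + 90.0) / 180.0      # 0 = hard left, 1 = hard right
        itd = int(RATE * 0.0006 * math.sin(math.radians(azimuth_deg)))

        n = int(RATE * DUR)
        tone = [math.sin(2.0 * math.pi * FREQ * i / RATE) for i in range(n)]
        left = [math.cos(pan * math.pi / 2.0) * tone[i] for i in range(n)]
        right = [math.sin(pan * math.pi / 2.0) *
                 (tone[i - itd] if i >= itd else 0.0) for i in range(n)]

        with wave.open("spatial_tone.wav", "wb") as w:
            w.setnchannels(2)
            w.setsampwidth(2)
            w.setframerate(RATE)
            w.writeframes(b"".join(
                struct.pack("<hh", int(l * 32767), int(r * 32767))
                for l, r in zip(left, right)))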

  19. User interface issues in supporting human-computer integrated scheduling

    Science.gov (United States)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  20. Contributions to the integrated graphical user interface

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.

    2003-01-01

    The Online Software is part of the distributed Data Acquisition System (DAQ) for the ATLAS experiment that will start taking data in 2007 at the Large Hadron Collider at CERN. The Online Software system is responsible for overall experiment control, including run control, configuration and monitoring of Trigger and Data Acquisition System (TDAQ) and management of data-taking partitions. The system encompasses all the software dealing with configuring, controlling and monitoring the data acquisition system but excludes anything dealing with the management, processing or transportation of physics data. In other words, the Online Software is supposed to act as the 'glue' to a quantity of heterogeneous sub-system, providing not only a uniform control interface, but also the possibility of easily abstracting the specificities of those subsystems in order to provide them with control services. The component model architecture has been adopted for the system, each component being developed as an individual package. All the hardware and software configurations of the data taking partitions are stored in configuration databases. The Process Manager component performs the basic job control of the software components. The Integrated Graphical User Interface (IGUI) is one of the integration components of the Online Software, allowing the operator to control and monitor the status of the current data taking run in terms of its main parameters, detector configuration, trigger rate, buffer occupancy and state of the subsystems. The component has been designed as a Java application, having defined some specialized panels for allowing the user to send the main DAQ commands and displaying messages, state or run specific parameters of the whole system or related to all the other components (Run Control, Run Parameters, DAQ Supervisor, Process Manager, Message Reporting, Monitoring or Data Flow). The design of this component allows the users to develop their own panel to be displayed

  1. Experimental evaluation of user performance in a pursuit tracking task with multimodal feedback

    Directory of Open Access Journals (Sweden)

    Obrenović Željko

    2004-01-01

    Full Text Available In this paper we describe the results of experimental evaluation of user performance in a pursuit-tracking task with multimodal feedback. Our experimental results indicate that audio can significantly improve the accuracy of pursuit tracking. Experiments with 19 participants have shown that the addition of acoustic modalities reduces the error during pursuit tracking by up to 19%. Moreover, experiments indicated the existence of perceptual boundaries of multimodal HCI for different scene complexity and target speeds. We have also shown that the most appealing paradigms are not the most effective ones, which necessitates a careful quantitative analysis of proposed multimodal HCI paradigms.

  2. User interface graphically improves generator AI diagnostics

    International Nuclear Information System (INIS)

    Gray, R.F.; King, I.J.

    1991-01-01

    In April of 1990, the recently developed Diagnostic Graphical User Interface (DGUI) was installed at a large nuclear power plant in the midwestern United States. Since 1988, the power plant has been using the Generator Artificial Intelligence Diagnostics (GenAID) System, which provides online diagnostic capability for the generator and generator auxiliaries through a plant data center (PDC) and communication link to the diagnostic operations center (DOC) in Orlando, Florida. The enhanced system provides the power plant control room operator with a comprehensive tool to understand and better utilize the information provided by the existing knowledge bases. The DGUI represents a significant improvement over existing technology by giving the power plant control room operator the capability of interacting directly with the diagnostic system.

  3. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they have been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  4. Standards for the user interface - Developing a user consensus. [for Space Station Information System

    Science.gov (United States)

    Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.

    1987-01-01

    The user support environment (USE), which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads, is described in detail. Included in the USE concept are a user interface language, a run-time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed, as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.

  5. FGB: A Graphical and Haptic User Interface for Creating Graphical, Haptic User Interfaces

    International Nuclear Information System (INIS)

    ANDERSON, THOMAS G.; BRECKENRIDGE, ARTHURINE; DAVIDSON, GEORGE S.

    1999-01-01

    The emerging field of haptics represents a fundamental change in human-computer interaction (HCI), and presents solutions to problems that are difficult or impossible to solve with a two-dimensional, mouse-based interface. To take advantage of the potential of haptics, however, innovative interaction techniques and programming environments are needed. This paper describes FGB (FLIGHT GHUI Builder), a programming tool that can be used to create an application-specific graphical and haptic user interface (GHUI). FGB is itself a graphical and haptic user interface with which a programmer can intuitively create and manipulate components of a GHUI in real time in a graphical environment through the use of a haptic device. The programmer can create a GHUI without writing any programming code. After a user interface is created, FGB writes the appropriate programming code to a file, using the FLIGHT API, to recreate what the programmer created in the FGB interface. FGB saves programming time and increases productivity, because a programmer can see the end result as it is created, and FGB does much of the programming itself. Interestingly, as FGB was created, it was used to help build itself. The further FGB was in its development, the more easily and quickly it could be used to create additional functionality and improve its own design. As a finished product, FGB can be used to recreate itself in much less time than it originally required, and with much less programming. This paper describes FGB's GHUI components, the techniques used in the interface, how the output code is created, where programming additions and modifications should be placed, and how it can be compared to and integrated with existing APIs such as MFC and Visual C++, OpenGL, and GHOST.

  6. Support for User Interfaces for Distributed Systems

    Science.gov (United States)

    Eychaner, Glenn; Niessner, Albert

    2005-01-01

    An extensible Java(TradeMark) software framework supports the construction and operation of graphical user interfaces (GUIs) for distributed computing systems typified by ground control systems that send commands to, and receive telemetric data from, spacecraft. Heretofore, such GUIs have been custom built for each new system at considerable expense. In contrast, the present framework affords generic capabilities that can be shared by different distributed systems. Dynamic class loading, reflection, and other run-time capabilities of the Java language and JavaBeans component architecture enable the creation of a GUI for each new distributed computing system with a minimum of custom effort. By use of this framework, GUI components in control panels and menus can send commands to a particular distributed system with a minimum of system-specific code. The framework receives, decodes, processes, and displays telemetry data; custom telemetry data handling can be added for a particular system. The framework supports saving and later restoration of users' configurations of control panels and telemetry displays with a minimum of effort in writing system-specific code. GUIs constructed within this framework can be deployed in any operating system with a Java run-time environment, without recompilation or code changes.
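
    The run-time extensibility described above rests on loading GUI component classes by name rather than compiling them in. The framework itself is Java (JavaBeans, reflection); the snippet below is only a rough Python analogy of that dynamic-loading idea, and the module and class names in it are hypothetical placeholders, not part of the actual framework.

```python
# Rough Python analogy of dynamic class loading for GUI panels
# (the real framework is Java/JavaBeans); names are hypothetical.
import importlib


class PanelRegistry:
    """Loads and instantiates GUI panel classes named at run time."""

    def __init__(self):
        self.panels = {}

    def load(self, dotted_name):
        # "telemetry.plots.StripChartPanel" -> module "telemetry.plots",
        # class "StripChartPanel"
        module_name, class_name = dotted_name.rsplit(".", 1)
        module = importlib.import_module(module_name)
        panel = getattr(module, class_name)()
        self.panels[dotted_name] = panel
        return panel


# Usage (the panel list would come from a saved user configuration):
# registry = PanelRegistry()
# registry.load("telemetry.plots.StripChartPanel")
```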

  7. Squidy : a Zoomable Design Environment for Natural User Interfaces

    OpenAIRE

    König, Werner A.; Rädle, Roman; Reiterer, Harald

    2009-01-01

    We introduce the interaction library Squidy, which eases the design of natural user interfaces by unifying relevant frameworks and toolkits in a common library. Squidy provides a central design environment based on high-level visual data flow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. The concept of semantic zooming enables nevertheless access to mo...

  8. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    Science.gov (United States)

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  9. Multimodal and ubiquitous computing systems: supporting independent-living older users.

    Science.gov (United States)

    Perry, Mark; Dowdall, Alan; Lines, Lorna; Hone, Kate

    2004-09-01

    We document the rationale and design of a multimodal interface to a pervasive/ubiquitous computing system that supports independent living by older people in their own homes. The Millennium Home system involves fitting a resident's home with sensors--these sensors can be used to trigger sequences of interaction with the resident to warn them about dangerous events, or to check if they need external help. We draw lessons from the design process and conclude the paper with implications for the design of multimodal interfaces to ubiquitous systems developed for the elderly and in healthcare, as well as for more general ubiquitous computing applications.

  10. How to Develop a User Interface That Your Real Users Will Love

    Science.gov (United States)

    Phillips, Donald

    2012-01-01

    A "user interface" is the part of an interactive system that bridges the user and the underlying functionality of the system. But people sometimes forget that the best interfaces will provide a platform to optimize the users' interactions so that they support and extend the users' activities in effective, useful, and usable ways. To look at it…

  11. Use of force feedback to enhance graphical user interfaces

    Science.gov (United States)

    Rosenberg, Louis B.; Brave, Scott

    1996-04-01

    This project focuses on the use of force feedback sensations to enhance user interaction with standard graphical user interface paradigms. While typical joystick and mouse devices are input-only, force feedback controllers allow physical sensations to be reflected to a user. Tasks that require users to position a cursor on a given target can be enhanced by applying physical forces to the user that aid in targeting. For example, an attractive force field implemented at the location of a graphical icon can greatly facilitate target acquisition and selection of the icon. It has been shown that force feedback can enhance a user's ability to perform basic functions within graphical user interfaces.
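
    The "attractive force field" mentioned above can be pictured as a spring-like pull that switches on when the cursor enters a capture radius around the icon. The following minimal sketch illustrates that idea; the stiffness and radius values are illustrative assumptions, not the authors' parameters.

```python
# Sketch of a spring-like attractive force toward an icon, the kind of
# targeting aid described above; stiffness and radius are illustrative only.
import math


def attractive_force(cursor, icon, stiffness=0.6, capture_radius=80.0):
    """Return (fx, fy) pulling the cursor toward the icon centre (pixels).

    The force is zero outside the capture radius and grows linearly with
    the displacement inside it (a simple spring model, F = k * d).
    """
    dx, dy = icon[0] - cursor[0], icon[1] - cursor[1]
    if math.hypot(dx, dy) > capture_radius:
        return 0.0, 0.0
    return stiffness * dx, stiffness * dy


# Example: a cursor 30 px to the left of the icon feels a rightward pull.
print(attractive_force((100, 100), (130, 100)))   # -> (18.0, 0.0)
```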

  12. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    ... programming. However, in Software Engineering, software engineers who develop user interfaces do not follow it. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms, and can improve task efficiency and user satisfaction. However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipses, etc.) to visualize database contents. In VisTool, visual properties (e.g. color, position, etc.) can be formulas...

  13. Exploring distributed user interfaces in ambient intelligent environments

    NARCIS (Netherlands)

    Dadlani Mahtani, P.M.; Peregrin Emparanza, J.; Markopoulos, P.; Gallud, J.A.; Tesoriero, R.; Penichet, V.M.R.

    2011-01-01

    In this paper we explore the use of Distributed User Interfaces (DUIs) in the field of Ambient Intelligence (AmI). We first introduce the emerging area of AmI, followed by describing three case studies where user interfaces or ambient displays are distributed and blending in the user’s environments.

  14. Reflections on Andes' Goal-Free User Interface

    Science.gov (United States)

    VanLehn, Kurt

    2016-01-01

    Although the Andes project produced many results over its 18 years of activity, this commentary focuses on its contributions to understanding how a goal-free user interface impacts the overall design and performance of a step-based tutoring system. Whereas a goal-aligned user interface displays relevant goals as blank boxes or empty locations that…

  15. Users expect interfaces to behave like the physical world

    DEFF Research Database (Denmark)

    Nørager, Rune

    2006-01-01

    Navigation in folder structures is an essential part of most window based user interfaces. Basic human navigation strategies rely on stable properties of the physical world, which are not by default present in windows style user interfaces. According to the theoretical framework Ecological Cognit...

  16. The Graphical User Interface: Crisis, Danger, and Opportunity.

    Science.gov (United States)

    Boyd, L. H.; And Others

    1990-01-01

    This article describes differences between the graphical user interface and traditional character-based interface systems, identifies potential problems posed by graphic computing environments for blind computer users, and describes some programs and strategies that are being developed to provide access to those environments. (Author/JDD)

  17. Advances in the development of a cognitive user interface

    Directory of Open Access Journals (Sweden)

    Jokisch Oliver

    2018-01-01

    Full Text Available In this contribution, we summarize recent development steps of the embedded cognitive user interface UCUI, which enables user-adaptive scenarios in human-machine or even human-robot interactions by considering sophisticated cognitive and semantic modelling. The interface prototype is developed by different German institutes and companies, with their steering teams at Fraunhofer IKTS and Brandenburg University of Technology. The prototype is able to communicate with users via speech and gesture recognition, speech synthesis and a touch display. The device includes autarkic semantic processing and, beyond that, a cognitive behavior control, which supports intuitive interaction to control different kinds of electronic devices, e.g. in a smart home environment or in interactive or collaborative robotics. Unlike available speech assistance systems such as Amazon Echo or Google Home, the introduced cognitive user interface UCUI ensures user privacy by processing all necessary information without any network access from the interface device.

  18. A multimodal interface for real-time soldier-robot teaming

    Science.gov (United States)

    Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.

    2016-05-01

    Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools to robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart-phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real-time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g. response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.

  19. A VR-User Interface for Design by Features

    NARCIS (Netherlands)

    Coomans, M.K.D.; Timmermans, H.J.P.

    1998-01-01

    We present the design of a Virtual Reality based user interface (VR-UI). It is the interface for the VR-DIS system, a design application for the Building and Construction industry (VRDIS stands for Virtual Reality - Design Information System). The interface is characterised by a mixed representation

  20. The Visual Web User Interface Design in Augmented Reality Technology

    OpenAIRE

    Chouyin Hsu; Haui-Chih Shiau

    2013-01-01

    With the popularity of 3C devices, visual content is all around us, in online games, touch pads, video and animation. Text-based web pages will therefore no longer satisfy users. With the spread of webcams, digital cameras, stereoscopic glasses and head-mounted displays, the user interface is becoming more visual and multi-dimensional. For the consideration of 3D and visual display in web user interface design research, Augmented Reality technology providing the convenient ...

  1. User Interface Aspects of a Human-Hand Simulation System

    Directory of Open Access Journals (Sweden)

    Beifang Yi

    2005-10-01

    Full Text Available This paper describes the user interface design for a human-hand simulation system, a virtual environment that produces ground truth data (life-like human hand gestures and animations) and provides visualization support for experiments on computer vision-based hand pose estimation and tracking. The system allows users to save time in data generation and easily create any hand gestures. We have designed and implemented this user interface with the consideration of usability goals and software engineering issues.

  2. Facility Interface Capability Assessment (FICA) user manual

    International Nuclear Information System (INIS)

    Pope, R.B.; MacDonald, R.R.; Massaglia, J.L.; Williamson, D.A.; Viebrock, J.M.; Mote, N.

    1995-09-01

    The US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing the Civilian Radioactive Waste Management System (CRWMS) to accept spent nuclear fuel from commercial facilities. The objective of the Facility Interface Capability Assessment (FICA) project was to assess the capability of each commercial spent nuclear fuel (SNF) storage facility, at which SNF is stored, to handle various SNF shipping casks. The purpose of this report is to describe the FICA computer software and to provide the FICA user with a guide on how to use the FICA system. The FICA computer software consists of two executable programs: the FICA Reactor Report program and the FICA Summary Report program (written in the CA-Clipper version 5.2 development system). The complete FICA software system is contained on either a 3.5 in. (double density) or a 5.25 in. (high density) diskette and consists of the two FICA programs and all the database files (generated using dBASE III). The FICA programs are provided as 'stand-alone' systems, and neither the CA-Clipper compiler nor dBASE III is required to run the FICA programs. The steps for installing the FICA software system and executing the FICA programs are described in this report. Instructions are given on how to install the FICA software system onto the hard drive of the PC and how to execute the FICA programs from the FICA subdirectory on the hard drive. Both FICA programs are menu-driven, with the up-arrow and down-arrow keys used to move the cursor to the desired selection.

  3. User interface for a partially incompatible system software environment with non-ADP users

    Energy Technology Data Exchange (ETDEWEB)

    Loffman, R.S.

    1987-08-01

    Good user interfaces to computer systems and software applications are the result of combining an analysis of user needs with knowledge of interface design principles and techniques. This thesis reports on an interface for an environment that (a) consists of users who are not computer science or data processing professionals, and (b) is bound by predetermined software and hardware. An interface was designed that combined these considerations with user interface design principles. Current literature was investigated to establish a baseline of knowledge about user interface design. There are many techniques which can be used to implement a user interface, but all should have the same basic goal, which is to assist the user in the performance of a task. This can be accomplished by providing the user with consistent, well-structured interfaces which also provide flexibility to adapt to differences among users. The interface produced used menu selection and command language techniques to make two different operating system environments appear similar. Additional features helped to address the needs of different users. The original goal was also to make the transition between the two systems transparent. This was not fully accomplished due to software and hardware limitations.

  4. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.
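
    The embedded evaluation tool described above judges usability from logged user performance. As a deliberately simplified illustration (not the tool's actual metrics or log format), task time and error counts can be derived from an interaction trace like this:

```python
# Simplified illustration of performance-based evaluation: deriving task
# time and error count from a logged interaction trace. The event format
# and metrics are assumptions, not the NASA tool's actual design.
def summarize_task(events):
    """events: list of (timestamp_in_seconds, event_type) tuples,
    where event_type is 'start', 'error' or 'done'."""
    start = next(t for t, e in events if e == "start")
    done = next(t for t, e in events if e == "done")
    errors = sum(1 for _, e in events if e == "error")
    return {"task_time_s": done - start, "errors": errors}


log = [(0.0, "start"), (3.2, "error"), (9.8, "done")]
print(summarize_task(log))   # {'task_time_s': 9.8, 'errors': 1}
```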

  5. User-interface aspects in recognizing connected-cursive handwriting

    NARCIS (Netherlands)

    Schomaker, L

    1994-01-01

    There are at least two major stumbling blocks for user acceptance of pen-based computers: the recognition performance is not good enough, especially on cursive handwriting; and the user interface technology has not reached a mature stage. The initial reaction of product reviewers and potential user

  6. Framework for Developing a Multimodal Programming Interface Used on Industrial Robots

    Directory of Open Access Journals (Sweden)

    Bogdan Mocan

    2014-12-01

    Full Text Available The proposed approach within this paper shifts the focus from the coordinate-based programming of an industrial robot, which currently dominates the field, to an object-based programming scheme. The general framework proposed in this paper is designed to perform natural language understanding, gesture integration and semantic analysis, which together facilitate the development of a multimodal robot programming interface that supports intuitive programming.
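
    Object-based, multimodal programming of this kind typically has to resolve a deictic word in the spoken command ("this", "that") against whatever object the accompanying gesture points at. The toy fusion step below only illustrates that general idea; it is not the authors' framework, and all names and structures in it are made up.

```python
# Toy sketch of multimodal fusion: a spoken command with a deictic reference
# is resolved against the object selected by a pointing gesture. All names
# and structures here are illustrative assumptions.
def fuse(speech_text, pointed_object_id):
    """Return an object-based robot instruction from speech + gesture."""
    tokens = speech_text.lower().split()
    action = tokens[0]                          # e.g. "pick", "weld", "move"
    if any(word in tokens for word in ("this", "that", "it")):
        target = pointed_object_id              # deixis resolved by gesture
    else:
        target = tokens[-1]                     # fall back to the spoken noun
    return {"action": action, "target": target}


# Operator says "pick up that" while pointing at part_17:
print(fuse("pick up that", "part_17"))   # {'action': 'pick', 'target': 'part_17'}
```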

  7. Distributed user interfaces for clinical ubiquitous computing applications.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik

    2005-08-01

    Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices such as digital pens, an active desk, and walk-up displays that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model that allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs--and similar network-based user interfaces--will be a prerequisite of future mobile user interfaces and essential to developing clinical multi-device environments.

  8. A Mobile User Interface for Low-Literacy Users in Rural South Africa

    African Journals Online (AJOL)

    Information and Communication Technology services for socio-economic ... was conducted to design a mobile user interface to enable low-literacy users in Dwesa ..... common social and religious groups ... layout, buttons and menu.

  9. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one of the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations which can be easily traced through user-computer interaction data and concentrating more on user-world interaction events, typical of co-located and

  10. A Single Rod Multi-modality Multi-interface Level Sensor Using an AC Current Source

    Directory of Open Access Journals (Sweden)

    Abdulgader Hwili

    2008-05-01

    Full Text Available Crude oil separation is an important process in the oil industry. To make efficient use of the separators, it is important to know their internal behaviour, and to measure the levels of the multiple interfaces between different materials, such as gas-foam, foam-oil, oil-emulsion, emulsion-water and water-solids. A single-rod multi-modality multi-interface level sensor is presented, which has current-source and electromagnetic modalities. Some key issues have been addressed, including the effect of salt content and temperature (i.e. conductivity) on the measurement.

  11. Multimodality

    DEFF Research Database (Denmark)

    Buhl, Mie

    2010-01-01

    In this paper, I address an ongoing discussion in Danish E-learning research about how to take advantage of the fact that digital media facilitate other communication forms than text, so-called ‘multimodal' communication, which should not be confused with the term ‘multimedia'. While multimedia...... on their teaching and learning situations. The choices they make involve e-learning resources like videos, social platforms and mobile devices, not just as digital artefacts we interact with, but the entire practice of using digital media. In a life-long learning perspective, multimodality is potentially very...

  12. Safeguarding the User - Developing a Multimodal Design for Surveying and Raising Internet Safety and Security Awareness

    DEFF Research Database (Denmark)

    Gjedde, Lisa; Sharp, Robin; Andersen, Preben

    2009-01-01

    Internet safety and security for the user is an issue of great importance for the successful implementation of ICT, but since it is a complex field, with a specialist vocabulary that cannot immediately be understood by the common user, it is difficult to survey the field. The user may not understand ... The paper describes an ICT-based research method that combines a verbal mode of inquiry with a visual mode employing illustrations, animations and simulations to provide the user with a multimodal media experience. The rationale for this is that we are working in a complex technical field with a specialist vocabulary...

  13. More playful user interfaces: interfaces that invite social and physical interaction

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2015-01-01

    This book covers the latest advances in playful user interfaces – interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback and anticipate the behavior of human users. The decreasing

  14. iPhone User Interface Cookbook

    CERN Document Server

    Banga, Cameron

    2011-01-01

    Written in a cookbook style, this book offers solutions using a recipe based approach. Each recipe contains step-by-step instructions followed by an analysis of what was done in each task and other useful information. The cookbook approach means you can dive into whatever recipes you want in no particular order. The iPhone Interface Cookbook is written from the ground up for people who are new to iOS or application interface design in general. Each chapter discusses the reasoning and design strategy behind critical interface components, as well as how to best integrate each into any iPhone or

  15. User Interfaces for Cooperative Remote Design

    National Research Council Canada - National Science Library

    Hodges, Larry

    1998-01-01

    .... We seek to develop new methods to facilitate cooperative remote design utilizing both high-bandwidth networking capability and virtual reality with appropriate graphical interfaces to support the collaborative effort...

  16. Open|SpeedShop Graphical User Interface Technology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to create a new graphical user interface (GUI) for an existing parallel application performance and profiling tool, Open|SpeedShop. The current GUI has...

  17. Using Vim as User Interface for Your Applications

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The Vim editor offers one of the cleverest user interfaces. It's why many developers write programs with vi keyboard bindings. Now, imagine how powerful it gets to build applications literally on top of Vim itself.

  18. The intelligent user interface for NASA's advanced information management systems

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.

  19. Generating Graphical User Interfaces from Precise Domain Specifications

    OpenAIRE

    Kamil Rybiński; Norbert Jarzębowski; Michał Śmiałek; Wiktor Nowakowski; Lucyna Skrzypek; Piotr Łabęcki

    2014-01-01

    Turning requirements into working systems is the essence of software engineering. This paper proposes automation of one of the aspects of this vast problem: generating user interfaces directly from requirements models. It presents syntax and semantics of a comprehensible yet precise domain specification language. For this language, the paper presents the process of generating code for the user interface elements. This includes model transformation procedures to generate window initiation code...

  20. Designing for User Engagement: Aesthetic and Attractive User Interfaces

    CERN Document Server

    Sutcliffe, Alistair

    2009-01-01

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The chapter starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expande

  1. Multimodality and Ambient Intelligence

    NARCIS (Netherlands)

    Nijholt, Antinus; Verhaegh, W.; Aarts, E.; Korst, J.

    2004-01-01

    In this chapter we discuss multimodal interface technology. We present examples of multimodal interfaces and show problems and opportunities. Fusion of modalities is discussed and some roadmap discussions on research in multimodality are summarized. This chapter also discusses future developments

  2. Human Computer Interface Design Criteria. Volume 1. User Interface Requirements

    Science.gov (United States)

    2010-03-19

    ... meaning assigned to the shape. 13.3.4 Typography: In general, variations in typography are not used for coding, since they may conflict with font attributes selected by users in a system-level or browser-level setting and be illegible when rendered. However, if variations in typography are

  3. INTERNET CONNECTIVITY FOR MASS PRODUCED UNITS WITHOUT USER INTERFACE

    DEFF Research Database (Denmark)

    2000-01-01

    To the manufacturer of mass produced units without a user interface, typically field level units, connection of these units to a communications network for enabling servicing, control and trackability is of interest. To provide this connection, a solution is described in which an interface...

  4. Towards a taxonomy of virtual reality user interfaces

    NARCIS (Netherlands)

    Coomans, M.K.D.; Timmermans, H.J.P.

    1997-01-01

    Virtual Reality-based user interfaces (VRUIs) are expected to bring about a revolution in computing. VR can potentially communicate large amounts of data in an easily understandable format. VR looks very promising, but it is still a very new interface technology for which very little

  5. INTERFACING GOOGLE SEARCH ENGINE TO CAPTURE USER WEB SEARCH BEHAVIOR

    OpenAIRE

    Fadhilah Mat Yamin; T. Ramayah

    2013-01-01

    The behaviour of the searcher when using a search engine, especially during query formulation, is crucial. Search engines capture users’ activities in the search log, which is stored at the search engine server. Due to the difficulty of obtaining this search log, this paper proposes and develops an interface framework to interface with the Google search engine. This interface captures users’ queries before redirecting them to Google. The analysis of the search log will show that users are utili...

  6. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  7. A Functional Programming Technique for Forms in Graphical User Interfaces

    NARCIS (Netherlands)

    Evers, S.; Kuper, Jan; Achten, P.M.; Grelck, G.; Huch, F.; Michaelson, G.; Trinder, Ph.W.

    2005-01-01

    This paper presents FunctionalForms, a new combinator library for constructing fully functioning forms in a concise and flexible way. A form is a part of a graphical user interface (GUI) restricted to displaying a value and allowing the user to modify it. The library is built on top of the

  8. Advanced Displays and Natural User Interfaces to Support Learning

    Science.gov (United States)

    Martin-SanJose, Juan-Fernando; Juan, M. -Carmen; Mollá, Ramón; Vivó, Roberto

    2017-01-01

    Advanced displays and natural user interfaces (NUI) are a very suitable combination for developing systems to provide an enhanced and richer user experience. This combination can be appropriate in several fields and has not been extensively exploited. One of the fields that this combination is especially suitable for is education. Nowadays,…

  9. Activity Walkthrough - A Quick User Interface Evaluation without Users

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege

    2004-01-01

    Based on activity theory, an expert review method, the activity walkthrough, is introduced. The method is a modified version of the cognitive walkthrough, addressing some of the practical issues arising when non-experts apply the cognitive walkthrough to non-trivial interfaces. The presented version...

  10. The graphical user interface for CRISTAL V1

    International Nuclear Information System (INIS)

    Heulers, L.; Courtois, G.; Fernex, F.; Gomit, J.M.; Letang, E.

    2003-01-01

    This paper deals with the new Graphical User Interface (GUI) of the CRISTAL V1 package devoted to criticality studies, including burn-up calculations. The aim of this GUI is to offer users a high level of user-friendliness and flexibility in describing data and analysing the results of the package's codes. The three main components of the GUI (CIGAIES, EJM and OPOSSUM) are presented. The different functionalities of the tools are explained through some applications. (author)

  11. User Interface Framework for the National Ignition Facility (NIF)

    International Nuclear Information System (INIS)

    Fisher, J M; Bowers, G A; Carey, R W; Daveler, S A; Herndon Ford, K B; Ho, J C; Lagin, L J; Lambert, C J; Mauvais, J; Stout, E A; West, S L

    2007-01-01

    A user interface (UI) framework supports the development of user interfaces to operate the National Ignition Facility (NIF) using the Integrated Computer Control System (ICCS). [1] This framework simplifies UI development and ensures consistency for NIF operators. A comprehensive, layered collection of UIs in ICCS provides interaction with system-level processes, shot automation, and subsystem-specific devices. All user interfaces are written in Java, employing CORBA to interact with other ICCS components. ICCS developers use these frameworks to compose two major types of user interfaces: broadviews and control panels. Broadviews provide a visual representation of the NIF beamlines through interactive schematic drawings. Control panels provide status and control at a device level. The UI framework includes a suite of display components to standardize user interaction through data entry behaviors, common connection and threading mechanisms, and a common appearance. With these components, ICCS developers can more efficiently address usability issues in the facility when needed. The ICCS UI framework helps developers create consistent and easy-to-understand user interfaces for NIF operators

  12. Gromita: a fully integrated graphical user interface to gromacs 4.

    Science.gov (United States)

    Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia

    2009-09-07

    Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.
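
    Behind such a front end sits the ordinary Gromacs 4 tool chain. The sketch below shows the kind of sequence a GUI like this drives; it assumes the stand-alone Gromacs 4 binaries are on the PATH, and the file names, force-field and water-model choices are placeholders rather than Gromita's actual defaults.

```python
# Sketch of the Gromacs 4 tool chain a front end like Gromita drives.
# Assumes the stand-alone Gromacs 4 binaries are installed; file names
# and option values are placeholders, not Gromita's actual settings.
import subprocess


def run(cmd):
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)


run(["pdb2gmx", "-f", "protein.pdb", "-o", "processed.gro",
     "-p", "topol.top", "-ff", "oplsaa", "-water", "spce"])
run(["editconf", "-f", "processed.gro", "-o", "boxed.gro", "-c", "-d", "1.0"])
run(["genbox", "-cp", "boxed.gro", "-cs", "spc216.gro",
     "-o", "solvated.gro", "-p", "topol.top"])
run(["grompp", "-f", "md.mdp", "-c", "solvated.gro",
     "-p", "topol.top", "-o", "md.tpr"])
run(["mdrun", "-deffnm", "md"])
```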

  13. Model driven development of user interface prototypes

    DEFF Research Database (Denmark)

    Störrle, Harald

    2010-01-01

    the whole UI development life cycle, connect all stakeholders involved, and support a wide range of levels of granularity and abstraction. This is achieved by using Window/Event-Diagrams (WEDs), a UI specification notation based on UML 2 state machines. It affords closer collaboration between different user...

  14. A Toolkit for Designing User Interfaces

    Science.gov (United States)

    1990-03-01

    ... as the NPS IB can provide prototyping capability. Interface generators are available commercially for nearly every computing machine on the market ... The structure which holds the attributes of the message buffer window is shown in Figure 4.2. The variables nlines and nchars hold the number of lines in the ... window its appearance of scrolling ... /* define a type and structure for the message buffer */ struct messbuf { long nlines; /* number of lines in the

  15. Multimodality

    DEFF Research Database (Denmark)

    Buhl, Mie

    In this paper, I address an ongoing discussion in Danish E-learning research about how to take advantage of the fact that digital media facilitate other communication forms than text, so-called ‘multimodal’ communication, which should not be confused with the term ‘multimedia’. While multimedia...... and learning situations. The choices they make involve E-learning resources like videos, social platforms and mobile devices, not just as digital artefacts we interact with, but the entire practice of using digital media. In a life-long learning perspective, multimodality is potentially very useful...

  16. Customization of user interfaces to reduce errors and enhance user acceptance.

    Science.gov (United States)

    Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram

    2014-03-01

    Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance of the RG compared to the CG while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  17. Designing a Facebook interface for senior users.

    Science.gov (United States)

    Gomes, Gonçalo; Duarte, Carlos; Coelho, José; Matos, Eduardo

    2014-01-01

    The adoption of social networks by older adults has increased in recent years. However, many still cannot make use of social networks as these are simply not adapted to them. Through a series of direct observations, interviews, and focus groups, we identified recommendations for the design of social networks targeting seniors. Based on these, we developed a prototype for tablet devices, supporting sharing and viewing Facebook content. We then conducted a user study comparing our prototype with Facebook's native mobile application. We have found that Facebook's native application does not meet senior users' concerns, like privacy and family focus, while our prototype, designed in accordance with the collected recommendations, supported relevant use cases in a usable and accessible manner.

  18. Designing a Facebook Interface for Senior Users

    Directory of Open Access Journals (Sweden)

    Gonçalo Gomes

    2014-01-01

    Full Text Available The adoption of social networks by older adults has increased in recent years. However, many still cannot make use of social networks as these are simply not adapted to them. Through a series of direct observations, interviews, and focus groups, we identified recommendations for the design of social networks targeting seniors. Based on these, we developed a prototype for tablet devices, supporting sharing and viewing Facebook content. We then conducted a user study comparing our prototype with Facebook's native mobile application. We have found that Facebook's native application does not meet senior users' concerns, like privacy and family focus, while our prototype, designed in accordance with the collected recommendations, supported relevant use cases in a usable and accessible manner.

  19. Concepts of analytical user interface evaluation method for continuous work in NPP main control room

    International Nuclear Information System (INIS)

    Lee, S. J.; Heo, G. Y.; Jang, S. H.

    2003-01-01

    This paper describes a conceptual study of an analytical evaluation method for computer-based user interfaces in the main control room of an advanced nuclear power plant. User interfaces can be classified into two groups: static interfaces and dynamic interfaces. Existing evaluation and design methods have mainly been developed for static user interfaces, yet dynamic user interfaces are what operators use to control complex systems, and suitable evaluation methods for them are scarce. An evaluation method for dynamic user interfaces, suited to continuous work and based on measures of cognitive load and interface similarity, is therefore outlined.

  20. Waste assay measurement integration system user interface

    International Nuclear Information System (INIS)

    Mousseau, K.C.; Hempstead, A.R.; Becker, G.K.

    1995-01-01

    The Waste Assay Measurement Integration System (WAMIS) is being developed to improve confidence in and lower the uncertainty of waste characterization data. There are two major components to the WAMIS: a data access and visualization component and a data interpretation component. The intent of the access and visualization software is to provide simultaneous access to all data sources that describe the contents of any particular container of waste. The visualization software also allows the user to display data at any level from raw to reduced output. Depending on user type, the software displays a menuing hierarchy, related to level of access, that allows the user to observe only those data sources s/he has been authorized to view. Access levels include system administrator, physicist, QA representative, shift operations supervisor, and data entry. Data sources are displayed in separate windows and presently include (1) real-time radiography video, (2) gamma spectra, (3) passive and active neutron, (4) radionuclide mass estimates, (5) total alpha activity (Ci), (6) container attributes, (7) thermal power (w), and (8) mass ratio estimates for americium, plutonium, and uranium isotopes. The data interpretation component is in the early phases of design, but will include artificial intelligence, expert system, and neural network techniques. The system is being developed on a Pentium PC using Microsoft Visual C++. Future generations of WAMIS will be UNIX based and will incorporate more generically radiographic/tomographic, gamma spectroscopic/tomographics, neutron, and prompt gamma measurements

  1. Multimodal game bot detection using user behavioral characteristics.

    Science.gov (United States)

    Kang, Ah Reum; Jeong, Seong Hoon; Mohaisen, Aziz; Kim, Huy Kang

    2016-01-01

    As the online service industry has continued to grow, illegal activities in the online world have drastically increased and become more diverse. Most illegal activities occur continuously because cyber assets, such as game items and cyber money in online games, can be monetized into real currency. The aim of this study is to detect game bots in a massively multiplayer online role playing game (MMORPG). We observed the behavioral characteristics of game bots and found that they execute repetitive tasks associated with gold farming and real money trading. We propose a game bot detection method based on user behavioral characteristics. The method of this paper was applied to real data provided by a major MMORPG company. Detection accuracy rate increased to 96.06 % on the banned account list.
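
    The signal exploited above is that gold-farming bots repeat narrow action sequences far more than human players do. The sketch below is a deliberately simplified illustration of that idea, not the paper's actual features or thresholds: it flags an account when a single action bigram dominates its event stream.

```python
# Simplified illustration of repetitive-behaviour scoring for bot detection;
# the bigram feature and the threshold are assumptions, not the paper's method.
from collections import Counter


def repetitiveness(actions):
    """Fraction of the action stream covered by its most common bigram."""
    if len(actions) < 2:
        return 0.0
    bigrams = Counter(zip(actions, actions[1:]))
    return max(bigrams.values()) / (len(actions) - 1)


def looks_like_bot(actions, threshold=0.4):
    return repetitiveness(actions) >= threshold


human = ["move", "chat", "fight", "loot", "trade", "move", "chat", "fight"]
bot = ["fight", "loot"] * 50
print(looks_like_bot(human), looks_like_bot(bot))   # False True
```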

  2. Business Performer-Centered Design of User Interfaces

    Science.gov (United States)

    Sousa, Kênia; Vanderdonckt, Jean

    Business Performer-Centered Design of User Interfaces is a new design methodology that adopts business process (BP) definition and a business performer perspective for managing the life cycle of user interfaces of enterprise systems. In this methodology, when the organization has a business process culture, the business processes of an organization are first defined according to a traditional methodology for this kind of artifact. These business processes are then transformed into a series of task models that represent the interactive parts of the business processes that will ultimately lead to interactive systems. When the organization has its enterprise systems, but not yet its business processes modeled, the user interfaces of the systems help derive task models, which are then used to derive the business processes. The double linking between a business process and a task model, and between a task model and a user interface model, makes it possible to ensure traceability of the artifacts in multiple paths and enables a more active participation of business performers in analyzing the resulting user interfaces. In this paper, we outline how a human perspective is tied to a model-driven perspective.

  3. AutoCAD platform customization user interface and beyond

    CERN Document Server

    Ambrosius, Lee

    2014-01-01

    Make AutoCAD your own with powerful personalization options Options for AutoCAD customization are typically the domain of administrators, but savvy users can perform their own customizations to personalize AutoCAD. Until recently, most users never thought to customize the AutoCAD platform to meet their specific needs, instead leaving it to administrators. If you are an AutoCAD user who wants to ramp up personalization options in your favorite software, AutoCAD Platform Customization: User Interface and Beyond is the perfect resource for you. Author Lee Ambrosius is recognized as a leader in Au

  4. Reservation system with graphical user interface

    KAUST Repository

    Mohamed, Mahmoud A. Abdelhamid

    2012-01-05

    Techniques for providing a reservation system are provided. The techniques include displaying a scalable visualization object, wherein the scalable visualization object comprises an expanded view element of the reservation system depicting information in connection with a selected interval of time and a compressed view element of the reservation system depicting information in connection with one or more additional intervals of time, maintaining a visual context between the expanded view and the compressed view within the visualization object, and enabling a user to switch between the expanded view and the compressed view to facilitate use of the reservation system.

  5. Earthdata User Interface Patterns: Building Usable Web Interfaces Through a Shared UI Pattern Library

    Science.gov (United States)

    Siarto, J.

    2014-12-01

    As more Earth science software tools and services move to the web, the design and usability of those tools become ever more important. A good user interface is now expected, and users are increasingly intolerant of websites and web applications that work against them. The Earthdata UI Pattern Library attempts to give scientists and developers the design tools they need to make usable, compelling user interfaces without the associated overhead of using a full design team. The patterns are tested, functional user interface elements targeted specifically at the Earth science community, and will include web layouts, buttons, tables, typography, iconography, mapping and visualization/graphing widgets. These UI elements have emerged as the result of extensive user testing, research and software development within the NASA Earthdata team over the past year.

  6. Customizing graphical user interface technology for spacecraft control centers

    Science.gov (United States)

    Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald

    1993-01-01

    The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.

  7. A visual Fortran user interface for CITATION code

    International Nuclear Information System (INIS)

    Albarhoum, M.; Zaidan, N.

    2006-11-01

    A user interface is designed to enable running the CITATION code under Windows. Four sections of CITATION input file are arranged in the form of 4 interfaces, in which all the parameters of the section can be modified dynamically. The help for each parameter (item) can be read from a general help for the section which, in turn, can be visualized upon selecting the section from the program general menu. (author)

  8. New User Interface Architecture for NetWiser Product with AngularJS

    OpenAIRE

    Johansson, Janne

    2015-01-01

    The topic of the thesis was to develop a new user interface for a product called NetWiser. The need for a new user interface originated from the problems experienced with the technology of the old user interface. A decision was made to replace the old Java applets based user interface with a new JavaScript based user interface. In practice, the technology switch meant a complete redevelopment of the user interface application component. A complete redesign of the user interface layout was exc...

  9. Interfacing ANSYS to user's programs using UNIX shell program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, In Yong; Kim, Beom Shig [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

    It has long been considered impossible to interface ANSYS, a commercial finite element code whose source is not open to the public, with a user's own programs. When an analysis must be iterated, the user has to wait until the analysis is finished and read the ANSYS results to build the input data for every iteration. In this report, direct interfacing techniques between ANSYS and other programs using UNIX shell programming are proposed. Detailed program listings and an application example are also provided. (Author) 19 refs., 6 figs., 7 tabs.
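
    The iterate-run-read cycle described above can be sketched in a few lines. The following is a minimal, hypothetical Python sketch of the same loop the report implements with UNIX shell scripts; the solver command name, file names, and convergence test are illustrative assumptions, not the report's actual scripts.

    ```python
    import subprocess

    def write_input(path, seed):
        # Placeholder: a real driver would rewrite the FE input deck
        # using the quantity extracted from the previous pass.
        with open(path, "w") as f:
            f.write(f"! seed value from previous pass: {seed}\n")

    def read_result(path):
        # Placeholder parser: real post-processing would extract values
        # from the solver's output or result file.
        with open(path) as f:
            return float(f.readlines()[-1].split()[-1])

    def run_iteration(max_iter=10, tol=1e-3):
        previous = None
        for i in range(max_iter):
            write_input("model.inp", previous)               # build input from last result
            subprocess.run(["ansys", "-b", "-i", "model.inp",
                            "-o", "model.out"], check=True)  # batch run of the FE code (assumed CLI)
            result = read_result("model.out")                # parse the quantity of interest
            if previous is not None and abs(result - previous) < tol:
                print(f"converged after {i + 1} iterations")
                return result
            previous = result
        raise RuntimeError("did not converge")
    ```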

  10. User Interface Technology Transfer to NASA's Virtual Wind Tunnel System

    Science.gov (United States)

    vanDam, Andries

    1998-01-01

    Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.

  11. WASAT. A graphical user interface for visualization of wave spectrograms

    Energy Technology Data Exchange (ETDEWEB)

    Joergensen, R

    1996-12-01

    The report describes a technique for the decoding and visualization of sounding rocket data sets. A specific application is made for the visualization of three-dimensional wave HF FFT spectra obtained from the SCIFER sounding rocket launched January 25, 1995. The data set was decoded from its original data format, the NASA DITES I/II format. A graphical user interface, WASAT (WAve Spectrogram Analysis Tool), was created using the Interactive Data Language. The data set was visualized using IDL image tools overlaid with contour routines. The user interface was based on the IDL widget concept. 9 refs., 7 figs.

  12. WASAT. A graphical user interface for visualization of wave spectrograms

    International Nuclear Information System (INIS)

    Joergensen, R.

    1996-12-01

    The report describes a technique for the decoding and visualization of sounding rocket data sets. A specific application is made for the visualization of three-dimensional wave HF FFT spectra obtained from the SCIFER sounding rocket launched January 25, 1995. The data set was decoded from its original data format, the NASA DITES I/II format. A graphical user interface, WASAT (WAve Spectrogram Analysis Tool), was created using the Interactive Data Language. The data set was visualized using IDL image tools overlaid with contour routines. The user interface was based on the IDL widget concept. 9 refs., 7 figs.
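
    The two WASAT records above describe an IDL widget-based spectrogram viewer. For readers unfamiliar with this kind of display, the following is a minimal Python/matplotlib sketch (not the IDL implementation) of a frequency-versus-time spectrogram; the sampling rate and the synthetic chirp signal are made up.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic stand-in for a decoded waveform: a chirp sweeping upward in frequency.
    fs = 20000.0                        # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * (1000 + 1000 * t) * t)

    # Spectrogram: FFTs over short overlapping windows, shown as frequency vs. time.
    plt.specgram(signal, NFFT=512, Fs=fs, noverlap=256)
    plt.xlabel("Time (s)")
    plt.ylabel("Frequency (Hz)")
    plt.title("Synthetic spectrogram (WASAT-style display)")
    plt.colorbar(label="Power (dB)")
    plt.show()
    ```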

  13. Implementation of graphical user interfaces in nuclear applications

    International Nuclear Information System (INIS)

    Barmsnes, K.A.; Johnsen, T.; Sundling, C.-V.

    1997-01-01

    In recent years, a demand has emerged for systems that support the design and implementation of graphical user interfaces (GUIs) in the control rooms of nuclear power plants. Picasso-3 is a user interface management system supporting object-oriented definition of GUIs in a distributed computing environment. The system is currently being used in a number of different application areas within the nuclear industry, such as retrofitting of display systems in simulators and control rooms, education and training applications, etc. Some examples are given of nuclear applications where the Picasso-3 system has been used.

  14. On user-friendly interface construction for CACSD packages

    DEFF Research Database (Denmark)

    Ravn, Ole

    1989-01-01

    Some ideas that are used in the development of a user-friendly interface for a computer-aided control system design (CACSD) package are presented. The concepts presented are integration and extensibility through the use of object-oriented programming, man-machine interface and user support using direct manipulation, and multiple views and multiple actions on objects in different domains. The use of multiple views and actions in combination with graphics enhances the user's ability to get an overview of the system to be designed. Good support for iteration is provided, and the short time between action and presentation allows the user to evaluate actions quickly. Object-oriented programming has been used to provide modularity and encapsulation...

  15. An user-interface for retrieval of nuclear data

    International Nuclear Information System (INIS)

    Utsumi, Misako; Fujita, Mitsutane; Noda, Tetsuji

    1996-01-01

    A database storing data on nuclear reactions was built to support simulations of the transmutation behavior of materials. In order to retrieve and maintain the database, a user interface for data retrieval was developed that requires no special knowledge of database handling or machine structure from the end user. Using the database, the possibility of He formation and radioactivity in a material can be easily retrieved, although the evaluation is qualitative. (author)

  16. Ecological user interface for emergency management decision support systems

    DEFF Research Database (Denmark)

    Andersen, V.

    2003-01-01

    The user interface for decision support systems is normally structured to present relevant data to the skilled user, allowing fast assessment of and action on the hazardous situation, or, for more complex situations, to present the relevant rules and procedures to be followed in order to … The purpose of this paper is to discuss the possibility of using the same principles for emergency management with the aim of improved performance in complex and unanticipated situations.

  17. Graphical User Interfaces for Volume Rendering Applications in Medical Imaging

    OpenAIRE

    Lindfors, Lisa; Lindmark, Hanna

    2002-01-01

    Volume rendering applications are used in medical imaging in order to facilitate the analysis of three-dimensional image data. This study focuses on how to improve the usability of graphical user interfaces of these systems, by gathering user requirements. This is achieved by evaluations of existing systems, together with interviews and observations at clinics in Sweden that use volume rendering to some extent. The usability of the applications of today is not sufficient, according to the use...

  18. Model-driven Instrumentation of graphical user interfaces.

    OpenAIRE

    Funk, M.; Hoyer, P.; Link, S.

    2009-01-01

    In today's continuously changing markets, newly developed products often do not meet the demands and expectations of customers. Research on this problem identified a large gap between developer and user expectations. Approaches to bridge this gap are to provide the developers with better information on product usage and to create a fast feedback cycle that helps tackle usage problems. Therefore, the user interface of the product, the central point of human-computer interaction, has to be ins...

  19. User Interface for the SMAC Traffic Accident Reconstruction Program

    Directory of Open Access Journals (Sweden)

    Rok Krulec

    2003-11-01

    Full Text Available This paper describes the development of the user interface for the traffic accident reconstruction program SMAC. Three basic modules of software will be presented. Initial parameters input and visualization, using a graphics library for simulation of 3D space, which form a graphical user interface, will be explained in more detail. The modules have been developed using different technologies and programming approaches to increase flexibility in further development and to take maximum advantage of the currently accessible computer hardware, so that module to module communication is also mentioned.

  20. DEBUGGER: Developing a graphical user interface to debug FPGAs

    CERN Document Server

    AUTHOR|(SzGeCERN)773309

    2015-01-01

    As part of the summer student projects, an FPGA debugger was designed using the Qt framework. The aim of this project is to help Data Acquisition System (DAQ) experts of the COMPASS experiment easily and continually monitor the state of each FPGA being used. A Graphical User Interface (GUI) has been designed to aid the experts in doing so. Via IP-Bus, the content of the FPGA under investigation is displayed to the user.

  1. Visual design for the user interface, Part 1: Design fundamentals.

    Science.gov (United States)

    Lynch, P J

    1994-01-01

    Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.

  2. A graphical user-interface for propulsion system analysis

    Science.gov (United States)

    Curlett, Brian P.; Ryall, Kathleen

    1993-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  3. Automatic User Interface Generation for Visualizing Big Geoscience Data

    Science.gov (United States)

    Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.

    2016-12-01

    Along with advanced computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization becomes an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages interface automata theory to automatically generate the user interface (UI). Our study has three main contributions. First, geoscience data has unique hierarchical structures and complex formats, so it is relatively easy for users to get lost or confused during their exploration of the data. By applying an interface automaton model to the UI design, users can be clearly guided to find exactly the visualization and analysis that they want. In addition, from a development perspective, an interface automaton is also easier to understand than conditional statements, which simplifies the development process. Second, it is common for geoscience data to have discontinuities in its hierarchical structure. The application of interface automata can prevent users from suffering automation surprises and enhance the user experience. Third, to support a variety of different data visualizations and analyses, our design with interface automata also makes applications extensible, in that a new visualization function or a new data group can easily be added to an existing application, which reduces the overhead of maintenance significantly. We demonstrate the effectiveness of our framework using real-world applications.
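
    As an illustration of the idea, not the authors' implementation, an interface automaton can be reduced to a table of states and the actions legal in each state; the UI then only offers the actions that are legal right now. A minimal Python sketch with made-up states for a visualization workflow:

    ```python
    # Minimal interface-automaton sketch: states are steps of a (hypothetical)
    # visualization workflow; transitions are the only actions the UI exposes
    # in each state, so users cannot wander into an illegal sequence.
    TRANSITIONS = {
        "start":             {"open_dataset": "dataset_loaded"},
        "dataset_loaded":    {"select_variable": "variable_selected", "close": "start"},
        "variable_selected": {"render_map": "map_shown", "render_histogram": "histogram_shown"},
        "map_shown":         {"export_image": "map_shown", "close": "start"},
        "histogram_shown":   {"export_image": "histogram_shown", "close": "start"},
    }

    class InterfaceAutomaton:
        def __init__(self, start="start"):
            self.state = start

        def available_actions(self):
            """Actions the UI should enable in the current state."""
            return sorted(TRANSITIONS[self.state])

        def fire(self, action):
            """Apply an action; illegal actions are rejected instead of guessed at."""
            if action not in TRANSITIONS[self.state]:
                raise ValueError(f"{action!r} not allowed in state {self.state!r}")
            self.state = TRANSITIONS[self.state][action]

    ui = InterfaceAutomaton()
    ui.fire("open_dataset")
    ui.fire("select_variable")
    print(ui.available_actions())   # -> ['render_histogram', 'render_map']
    ```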

  4. Impact of English Regional Accents on User Acceptance of Voice User Interfaces

    NARCIS (Netherlands)

    Niculescu, A.I.; White, G.M.; Lan, S.S.; Waloejo, R.U.; Kawaguchi, Y.

    2008-01-01

    In this paper we present an experiment addressing a critical issue in Voice User Interface (VUI) design, namely whether the user acceptance can be improved by having recorded voice prompts imitate his/her regional dialect. The claim was tested within a project aiming to develop voice animated

  5. User-centered design with illiterate persons : The case of the ATM user interface

    NARCIS (Netherlands)

    Cremers, A.H.M.; Jong, J.G.M. de; Balken, J.S. van

    2008-01-01

    One of the major challenges in current user interface research and development is the accommodation of diversity in users and contexts of use in order to improve the self-efficacy of citizens. A common banking service, which should be designed for diversity, is the Automated Teller Machine (ATM).

  6. An Object-Oriented Architecture for User Interface Management in Distributed Applications

    OpenAIRE

    Denzer, Ralf

    2017-01-01

    User interfaces for large distributed applications have to handle specific problems: the complexity of the application itself and the integration of online-data into the user interface. A main task of the user interface architecture is to provide powerful tools to design and augment the end-user system easily, hence giving the designer more time to focus on user requirements. Our experiences developing a user interface system for a process control room showed that a lot of time during the dev...

  7. Feedback from Usability Evaluation to User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2005-01-01

    This paper reports from an exploratory study of means for providing feedback from a usability evaluation to the user interface designers. In this study, we conducted a usability evaluation of a mobile system that is used by craftsmen to register use of time and materials. The results...... and weaknesses of the system. The findings indicate that detailed descriptions of problems and log descriptions of the user's interaction with the system and of system interaction are useful for the designers when trying to understand the usability problems that the users have encountered....

  8. The Impact of User Interface on Young Children's Computational Thinking

    Science.gov (United States)

    Pugnali, Alex; Sullivan, Amanda; Bers, Marina Umaschi

    2017-01-01

    Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have on children's mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is a growing pressure to begin teaching…

  9. Improving the Interplay between Usability Evaluation and User Interface Design

    DEFF Research Database (Denmark)

    Hornbæk, K.; Stage, Jan

    2004-01-01

    of the workshop are motivated and an outline of the contents of the papers that were presented in the workshop is given. In addition we summarize some challenges to the interplay between usability evaluation and user interface design agreed upon at the workshop, as well as some solutions that were debated....

  10. CATO--A General User Interface for CAS

    Science.gov (United States)

    Janetzko, Hans-Dieter

    2015-01-01

    CATO is a new user interface, developed by the author as a response to the significant difficulties faced by scientists, engineers, and students in their usage of computer algebra (CA) systems. Their tendency to use CA systems only occasionally means that they are unfamiliar with the grammar and syntax these systems require. The author…

  11. CATO--A Guided User Interface for Different CAS

    Science.gov (United States)

    Janetzko, Hans-Dieter

    2017-01-01

    CATO is a new user interface, written in Java and developed by the author as a response to the significant difficulties faced by students who only sporadically use computer algebra systems (CAS). The usage of CAS in mathematical lectures should be an integral part of mathematical instruction. However, difficulties arise for those students who have…

  12. Adapting the unified software development process for user interface development

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2006-01-01

    In this paper we describe how existing software developing processes, such as Rational Unified Process, can be adapted in order to allow disciplined and more efficient development of user interfaces. The main objective of this paper is to demonstrate that standard modeling environments, based on the

  13. Evaluating The Role Of Empathy In Crowdsourcing User Interfaces

    NARCIS (Netherlands)

    Khan, J.V.; Dey, D.; Buchina, N.

    2016-01-01

    Empathy induced altruism is believed to motivate people in a crowdsourcing environment to produce better quality work. However, there hasn’t been any considerable investigation regarding how empathy can be effectively conveyed through user interfaces (UI). We conducted a study to find the effects of

  14. Designing the OPAC User Interface to Improve Access and Retrieval.

    Science.gov (United States)

    Basista, Thomas; And Others

    1991-01-01

    Discussion of problems with retrieval of records in library online public access catalogs (OPACs) focuses on an ongoing research project at the Indiana University of Pennsylvania (IUP) that has been trying to improve subject retrieval vocabulary control using natural and thesaural language and on the design of a good graphical user interface.…

  15. The Graphical User Interface Crisis: Danger and Opportunity.

    Science.gov (United States)

    Boyd, Lawrence H.; And Others

    This paper examines graphic computing environments, identifies potential problems in providing access to blind people, and describes programs and strategies being developed to provide this access. The paper begins with an explanation of how graphic user interfaces differ from character-based systems in their use of pixels, visual metaphors such as…

  16. Towards Linking User Interface Translation Needs to Lexicographic ...

    African Journals Online (AJOL)

    In a time of proliferating electronic devices such as smartphones, translators of user interfaces are faced with new challenges, such as the use of existing words in new contexts or in their obtaining new meanings. In this article, three lexicographic reference works available to translators in this field are compared: the ...

  17. Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.

    Science.gov (United States)

    Nowaczyk, Ronald H.; James, E. Christopher

    1993-01-01

    Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…

  18. Circumventing Graphical User Interfaces in Chemical Engineering Plant Design

    Science.gov (United States)

    Romey, Noel; Schwartz, Rachel M.; Behrend, Douglas; Miao, Peter; Cheung, H. Michael; Beitle, Robert

    2007-01-01

    Graphical User Interfaces (GUIs) are pervasive elements of most modern technical software and represent a convenient tool for student instruction. For example, GUIs are used for [chemical] process design software (e.g., CHEMCAD, PRO/II and ASPEN) typically encountered in the senior capstone course. Drag and drop aspects of GUIs are challenging for…

  19. Software Graphical User Interface For Analysis Of Images

    Science.gov (United States)

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.

  20. An approach to developing user interfaces for space systems

    Science.gov (United States)

    Shackelford, Keith; McKinney, Karen

    1993-08-01

    Inherent weakness in the traditional waterfall model of software development has led to the definition of the spiral model. The spiral model software development lifecycle model, however, has not been applied to NASA projects. This paper describes its use in developing real time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.

  1. Helping Students Test Programs That Have Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Matthew Thornton

    2008-08-01

    Full Text Available Within computer science education, many educators are incorporating software testing activities into regular programming assignments. Tools like JUnit and its relatives make software testing tasks much easier, bringing them into the realm of even introductory students. At the same time, many introductory programming courses now include graphical interfaces as part of student assignments to improve student interest and engagement. Unfortunately, writing software tests for programs that have significant graphical user interfaces is beyond the skills of typical students (and many educators). This paper presents initial work at combining educationally oriented and open-source tools to create an infrastructure for writing tests for Java programs that have graphical user interfaces. Critically, these tools are intended to be appropriate for introductory (CS1/CS2) student use and to dovetail with current teaching approaches that incorporate software testing in programming assignments. We also include in our findings our proposed approach to evaluating our techniques.
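
    The paper's infrastructure targets Java programs and JUnit-style tools; purely as an analogous illustration of the idea of driving a GUI programmatically from a unit test, here is a small Python sketch using unittest and Tkinter (a hypothetical counter application, and it needs a display to run).

    ```python
    import tkinter as tk
    import unittest

    # Tiny GUI under test: a counter with a button (illustrative only).
    class CounterApp:
        def __init__(self, root):
            self.count = 0
            self.label = tk.Label(root, text="0")
            self.button = tk.Button(root, text="Increment", command=self.increment)
            self.label.pack()
            self.button.pack()

        def increment(self):
            self.count += 1
            self.label.config(text=str(self.count))

    class CounterAppTest(unittest.TestCase):
        def setUp(self):
            self.root = tk.Tk()        # requires a display; use a virtual one on CI
            self.root.withdraw()       # no need to show the window during tests
            self.app = CounterApp(self.root)

        def tearDown(self):
            self.root.destroy()

        def test_button_updates_label(self):
            self.app.button.invoke()   # programmatic "click"
            self.app.button.invoke()
            self.assertEqual(self.app.label.cget("text"), "2")

    if __name__ == "__main__":
        unittest.main()
    ```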

  2. The phenomenological experience of dementia and user interface development

    DEFF Research Database (Denmark)

    Peterson, Carrie Beth; Mitseva, Anelia; Mihovska, Albena D.

    2009-01-01

    This study follows the project ISISEMD through a phenomenological approach of investigating the experience of Human Computer Interaction (HCI) for someone with dementia. The aim is to accentuate the Assistive Technology (AT) from the end user perspective. It proposes that older adults and those with dementia should no longer be an overlooked population, and shows how the HCI community can learn from their experiences to develop methods and design interfaces which truly benefit these individuals. Guidelines from previous research are incorporated along with eclectic, user-centered strategies as the interface designers for ISISEMD develop an appropriate and effective modality. The paper outlines the interconnected difficulties associated with the characteristics of older adults with mild dementia, which are important to consider when introducing AT to that group of end users. It further presents clear...

  3. Use of natural user interfaces in water simulations

    Science.gov (United States)

    Donchyts, G.; Baart, F.; van Dam, A.; Jagers, B.

    2013-12-01

    Conventional graphical user interfaces, used to edit input and present results of earth science models, have seen little innovation for the past two decades. In most cases model data is presented and edited using 2D projections, even when working with 3D data. The emergence of 3D motion-sensing technologies, such as the Microsoft Kinect and LEAP Motion, opens new possibilities for user interaction by adding more degrees of freedom compared to the classical mouse and keyboard. Here we investigate how interaction with hydrodynamic numerical models can be improved using these new technologies. Our research hypothesis (H1) states that a properly designed 3D graphical user interface paired with a 3D motion sensor can significantly reduce the time required to set up and use numerical models. In this work we have used a LEAP Motion controller combined with the shallow water flow model engine D-Flow Flexible Mesh. Interacting with numerical model using hands

  4. EEG Classification for Hybrid Brain-Computer Interface Using a Tensor Based Multiclass Multimodal Analysis Scheme.

    Science.gov (United States)

    Ji, Hongfei; Li, Jie; Lu, Rongrong; Gu, Rong; Cao, Lei; Gong, Xiaoliang

    2016-01-01

    Electroencephalogram- (EEG-) based brain-computer interface (BCI) systems usually utilize one type of change in the dynamics of brain oscillations for control, such as event-related desynchronization/synchronization (ERD/ERS), steady-state visual evoked potentials (SSVEP), and P300 evoked potentials. There is a recent trend to detect more than one of these signals in one system to create a hybrid BCI. However, in this case, EEG data have typically been divided into groups and analyzed by separate processing procedures, so the interactive effects are ignored when different types of BCI tasks are executed simultaneously. In this work, we propose an improved tensor-based multiclass multimodal scheme especially for hybrid BCI, in which EEG signals are denoted as multiway tensors, a nonredundant rank-one tensor decomposition model is proposed to obtain nonredundant tensor components, a weighted Fisher criterion is designed to select multimodal discriminative patterns without ignoring the interactive effects, and a support vector machine (SVM) is extended to multiclass classification. Experiment results suggest that the proposed scheme can not only identify the different changes in the dynamics of brain oscillations induced by different types of tasks but also capture the interactive effects of simultaneous tasks properly. Therefore, it has great potential for use in hybrid BCI.
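
    A drastically simplified sketch of the general pipeline (trial matrices, low-rank components as features, multiclass SVM) is shown below. This is not the authors' nonredundant decomposition or weighted Fisher criterion; the synthetic data, the number of channels and classes, and the use of a plain SVD are all assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic stand-in for EEG: 60 trials x 8 channels x 128 samples, 3 task classes.
    n_trials, n_channels, n_samples, n_classes = 60, 8, 128, 3
    X = rng.standard_normal((n_trials, n_channels, n_samples))
    y = rng.integers(0, n_classes, size=n_trials)

    def rank_one_features(trial, k=2):
        """Crude stand-in for tensor components: keep the k leading singular
        vectors of the channel x time matrix as a feature vector."""
        u, s, vt = np.linalg.svd(trial, full_matrices=False)
        return np.concatenate([s[i] * u[:, i] for i in range(k)])

    features = np.array([rank_one_features(trial) for trial in X])

    # Multiclass SVM (one-vs-one internally); the train/test split is kept trivial here.
    clf = SVC(kernel="rbf").fit(features[:40], y[:40])
    print("toy accuracy:", clf.score(features[40:], y[40:]))
    ```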

  5. Increasing trend of wearables and multimodal interface for human activity monitoring: A review.

    Science.gov (United States)

    Kumari, Preeti; Mathew, Lini; Syal, Poonam

    2017-04-15

    Activity recognition technology is one of the most important technologies for life-logging and for the care of elderly persons. Elderly people prefer to live in their own houses, within their own locality. If they are able to do so, several benefits follow for society and the economy. However, living alone carries high risks. Wearable sensors have been developed to mitigate these risks, and such sensors are approaching readiness for medical use. They can help in monitoring the wellness of elderly persons living alone by unobtrusively monitoring their daily activities. This study reviews the increasing trend of wearable devices and the need for multimodal recognition for continuous or discontinuous monitoring of human activity and of biological signals such as the electroencephalogram (EEG), electrooculogram (EOG), electromyogram (EMG), and electrocardiogram (ECG), along with other parameters and symptoms. Such monitoring can provide necessary assistance in times of dire need, which is crucial for the advancement of disease diagnosis and treatment. A shared control architecture with a multimodal interface can be used in more complex environments where a larger number of commands must be handled, yielding better control. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. A Tool for Balance Control Training Using Muscle Synergies and Multimodal Interfaces

    Directory of Open Access Journals (Sweden)

    D. Galeano

    2014-01-01

    Full Text Available Balance control plays a key role in neuromotor rehabilitation after stroke or spinal cord injuries. Computerized dynamic posturography (CDP is a classic technological tool to assess the status of balance control and to identify potential disorders. Despite the more accurate diagnosis generated by these tools, the current strategies to promote rehabilitation are still limited and do not take full advantage of the technologies available. This paper presents a novel balance training platform which combines a CDP device made from low-cost interfaces, such as the Nintendo Wii Balance Board and the Microsoft Kinect. In addition, it integrates a custom electrical stimulator that uses the concept of muscle synergies to promote natural interaction. The aim of the platform is to support the exploration of innovative multimodal therapies. Results include the technical validation of the platform using mediolateral and anteroposterior sways as basic balance training therapies.
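
    A core quantity in such balance platforms is the centre of pressure derived from the board's four load cells. The following is a minimal, hypothetical Python sketch of that computation; the cell layout, the board half-dimensions, and the example readings are assumptions, not the platform described above.

    ```python
    # Hypothetical sketch: centre of pressure (COP) from four load cells of a balance board.
    BOARD_HALF_WIDTH = 0.216   # metres, mediolateral half-dimension (assumed)
    BOARD_HALF_DEPTH = 0.117   # metres, anteroposterior half-dimension (assumed)

    def centre_of_pressure(top_left, top_right, bottom_left, bottom_right):
        total = top_left + top_right + bottom_left + bottom_right
        if total <= 0:
            raise ValueError("no load on the board")
        # Mediolateral sway: right-side cells minus left-side cells, weighted by load.
        cop_x = BOARD_HALF_WIDTH * ((top_right + bottom_right) -
                                    (top_left + bottom_left)) / total
        # Anteroposterior sway: front (top) cells minus back (bottom) cells.
        cop_y = BOARD_HALF_DEPTH * ((top_left + top_right) -
                                    (bottom_left + bottom_right)) / total
        return cop_x, cop_y

    # Example reading per cell in kilograms-force (made up).
    print(centre_of_pressure(18.0, 22.0, 17.0, 23.0))
    ```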

  7. NLEdit: A generic graphical user interface for Fortran programs

    Science.gov (United States)

    Curlett, Brian P.

    1994-01-01

    NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
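
    The "ASCII description in, form out" idea can be sketched as follows. This is not the NLEdit implementation (which also provides help text, value ranges, and a calculator for inputs); the one-line-per-variable description format and the variable names are invented for illustration.

    ```python
    import tkinter as tk

    # Invented one-line-per-variable description: name | default | description.
    SPEC = """\
    mach     | 0.8   | Free-stream Mach number
    altitude | 10000 | Cruise altitude (m)
    thrust   | 25000 | Design thrust (N)
    """

    def build_form(root, spec):
        """Generate one labelled entry field per namelist variable."""
        entries = {}
        for row, line in enumerate(spec.strip().splitlines()):
            name, default, description = (part.strip() for part in line.split("|"))
            tk.Label(root, text=f"{name} - {description}").grid(row=row, column=0, sticky="w")
            entry = tk.Entry(root)
            entry.insert(0, default)
            entry.grid(row=row, column=1)
            entries[name] = entry
        return entries

    root = tk.Tk()
    fields = build_form(root, SPEC)
    tk.Button(root, text="Write namelist",
              command=lambda: print("&inputs",
                                    ", ".join(f"{k}={v.get()}" for k, v in fields.items()),
                                    "/")).grid(columnspan=2)
    root.mainloop()
    ```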

  8. Save medical personnel's time by improved user interfaces.

    Science.gov (United States)

    Kindler, H

    1997-01-01

    Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case mix systems for reimbursement by social-security institutions. More data is required to enable quality improvement, increases in clinical effectiveness and for juridical reasons. At first glance, this documentation effort is contradictory to cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort for documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend to client/server systems with relational databases or object-oriented databases as repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.

  9. Multimodal Desktop Interaction: The Face –Object-Gesture–Voice Example

    DEFF Research Database (Denmark)

    Vidakis, Nikolas; Vlasopoulos, Anastasios; Kounalakis, Tsampikos

    2013-01-01

    This paper presents a natural user interface system based on multimodal human computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system which gives users the ability to interact with desktop...

  10. EPICS-QT based graphical user interface for accelerator control

    International Nuclear Information System (INIS)

    Basu, A.; Singh, S.K.; Rosily, Sherry; Bhagwat, P.V.

    2016-01-01

    Particle accelerators, like many complex industrial systems, require robust and efficient control for proper operation, both to achieve the required beam quality and to ensure the safety of subcomponents and of all working personnel. This control is executed via a graphical user interface through which an operator interacts with the accelerator to achieve the desired state of the machine and its output. The Experimental Physics and Industrial Control System (EPICS) is a widely used control system framework in the field of accelerator control. It acts as a middle layer between field devices and the graphical user interface used by the operator. Field devices can also be made EPICS compliant by running EPICS-based software on them. Qt, on the other hand, is a C++ framework widely used for creating professional-looking and user-friendly graphical components. In the Low Energy High Intensity Proton Accelerator (LEHIPA), the first stage of the three-stage Accelerator Driven System (ADS) programme undertaken by Bhabha Atomic Research Centre (BARC), it was decided that EPICS will be used for controlling the accelerator and Qt for developing the various Graphical User Interfaces (GUIs) for operation and diagnostics. This paper discusses the work carried out to achieve this goal in LEHIPA.
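
    For readers unfamiliar with EPICS, the client side of such a GUI boils down to subscribing to process variables (PVs) and updating widgets when values change. The sketch below shows only that EPICS client step, assuming the pyepics Python package and a hypothetical PV name; the LEHIPA GUIs described above are written with Qt/C++, not Python.

    ```python
    import time
    import epics   # pyepics: Python client for EPICS Channel Access (assumed installed)

    # Hypothetical process variable name; a real GUI would bind many such PVs to widgets.
    PV_NAME = "LEHIPA:BEAM:CURRENT"

    pv = epics.PV(PV_NAME)

    def on_change(pvname=None, value=None, **kwargs):
        # Callback fired by Channel Access on every value update;
        # a GUI would update a display widget here instead of printing.
        print(f"{pvname} = {value}")

    pv.add_callback(on_change)

    # Keep the client alive so monitor callbacks can arrive.
    while True:
        time.sleep(1.0)
    ```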

  11. Eliciting user-sourced interaction mappings for body-based interfaces

    OpenAIRE

    May, Aaron

    2015-01-01

    Thanks to technological advancements, whole-body natural user interfaces are becoming increasingly common in modern homes and public spaces. However, because whole-body natural user interfaces lack obvious affordances, users can be unsure how to control the interface. In this thesis, I report the findings of a study of novice and expert users mock controlling a balance-based whole-body natural user interface during a Think Aloud task. I compare the strategies demonstrated by participants whi...

  12. Pemrograman Graphical User Interface (GUI) Dengan Matlab Untuk Mendesain Alat Bantu Opersai Matematika

    OpenAIRE

    Butar Butar, Ronisah Putra

    2011-01-01

    A Graphical User Interface (GUI) is a visually oriented application program built from graphical objects that take the place of text commands for user interaction. In MATLAB, GUIs are built with GUIDE (the Graphical User Interface Development Environment). This paper discusses how to design a tool to assist with mathematical operations using a MATLAB Graphical User Interface (GUI), with the aim of providing an alternative aid...

  13. Bed occupancy monitoring: data processing and clinician user interface design.

    Science.gov (United States)

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5-10 weeks. This data was analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the bed exit weekly average (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate and not cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
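
    One of the extracted parameters, the number of bed exits per night, can be sketched as a threshold-and-transition count over the pressure-mat samples. The threshold, sampling rate, minimum exit duration, and toy data below are assumptions, not the trial's actual processing.

    ```python
    # Sketch: count bed exits from a pressure-mat occupancy signal.
    def count_bed_exits(pressure, threshold=5.0, fs=1.0, min_exit_s=60):
        """pressure: one sample per 1/fs seconds for a single night."""
        occupied = [p >= threshold for p in pressure]
        exits, exit_len = 0, 0
        for prev, curr in zip(occupied, occupied[1:]):
            if prev and not curr:          # left the bed
                exit_len = 1
            elif not curr:                 # still out of bed
                exit_len += 1
            elif not prev and curr:        # returned to bed
                if exit_len / fs >= min_exit_s:
                    exits += 1             # ignore brief dips / sensor dropouts
                exit_len = 0
        return exits

    # Toy night: in bed, one long exit (~5 min) and one dip of a few seconds.
    night = [20.0] * 3600 + [0.0] * 300 + [20.0] * 3600 + [0.0] * 5 + [20.0] * 1800
    print(count_bed_exits(night))   # -> 1 with the 60 s minimum exit duration
    ```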

  14. Dynamic User Interfaces for Service Oriented Architectures in Healthcare.

    Science.gov (United States)

    Schweitzer, Marco; Hoerbst, Alexander

    2016-01-01

    Electronic Health Records (EHRs) play a crucial role in healthcare today. Considering a data-centric view, EHRs are very advanced as they provide and share healthcare data in a cross-institutional and patient-centered way adhering to high syntactic and semantic interoperability. However, the EHR functionalities available for the end users are rare and hence often limited to basic document query functions. Future EHR use necessitates the ability to let the users define their needed data according to a certain situation and how this data should be processed. Workflow and semantic modelling approaches as well as Web services provide means to fulfil such a goal. This thesis develops concepts for dynamic interfaces between EHR end users and a service oriented eHealth infrastructure, which allow the users to design their flexible EHR needs, modeled in a dynamic and formal way. These are used to discover, compose and execute the right Semantic Web services.

  15. A visual user interface program, EGSWIN, for EGS4

    International Nuclear Information System (INIS)

    Qiu Rui; Li Junli; Wu Zhen

    2005-01-01

    To overcome the inconvenience and difficulty in using the EGS4 code by novice users, a visual user interface program, called the EGSWIN system, has been developed by the Monte Carlo Research Center of Tsinghua University in China. EGSWIN allows users to run EGS4 for many applications without any user coding. A mixed-language programming technique with Visual C++ and Visual Fortran is used in order to embed both EGS4 and PEGS4 into EGSWIN. The system has the features of visual geometry input, geometry processing, visual definitions of source, scoring and computing parameters, and particle trajectories display. Comparison between the calculated results with EGS4 and EGSWIN, as well as with FLUKA and GEANT, has been made to validate EGSWIN. (author)

  16. TmoleX--a graphical user interface for TURBOMOLE.

    Science.gov (United States)

    Steffen, Claudia; Thomas, Klaus; Huniar, Uwe; Hellweg, Arnim; Rubner, Oliver; Schroer, Alexander

    2010-12-01

    We herein present the graphical user interface (GUI) TmoleX for the quantum chemical program package TURBOMOLE. TmoleX allows users to execute the complete workflow of a quantum chemical investigation from the initial building of a structure to the visualization of the results in a user friendly graphical front end. The purpose of TmoleX is to make TURBOMOLE easy to use and to provide a high degree of flexibility. Hence, it should be a valuable tool for most users from beginners to experts. The program is developed in Java and runs on Linux, Windows, and Mac platforms. It can be used to run calculations on local desktops as well as on remote computers. © 2010 Wiley Periodicals, Inc.

  17. Graphical user interfaces for McClellan Nuclear Radiation Center

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S.A.; Power, M.; Forsmann, H.

    1998-01-01

    The control console of the TRIGA reactor at McClellan's Nuclear Radiation Center (MNRC) is in the process of being replaced because of spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and by incorporating human factors during all stages of the graphical user interface (GUI) development and control console design. This paper gives a brief description of some of the guidelines used in developing the MNRC's GUIs as continuous, real-time displays

  18. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using the standard TCP/IP protocol and stored locally on magnetic disk. The use of high-resolution screens (1024x768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images, and scintigraphic images.

  19. User Interface Design in Medical Distributed Web Applications.

    Science.gov (United States)

    Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara

    2016-01-01

    User interfaces are important to facilitate easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation, and the underlying technology is an important tool to accomplish this. The present work aims to create a web interface using specific technology (HTML table design combined with CSS3) to provide an optimized, responsive interface for a complex web application. In the first phase, the layout of the current icMED medical web application is analyzed and its structure is designed using specific tools working on source files. In the second phase, a new graphical interface that adapts to different mobile terminals is proposed (using HTML table design (TD) and the CSS3 method); it uses no source files, just lines of code for the layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at a time. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared to the DIV-CSS style method. The presented work is a proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, such as HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.

  20. Social Circles: A 3D User Interface for Facebook

    Science.gov (United States)

    Rodrigues, Diego; Oakley, Ian

    Online social network services are increasingly popular web applications which display large amounts of rich multimedia content: contacts, status updates, photos and event information. Arguing that this quantity of information overwhelms conventional user interfaces, this paper presents Social Circles, a rich interactive visualization designed to support real world users of social network services in everyday tasks such as keeping up with friends and organizing their network. It achieves this by using 3D UIs, fluid animations and a spatial metaphor to enable direct manipulation of a social network.

  1. A Graphical User Interface to Generalized Linear Models in MATLAB

    Directory of Open Access Journals (Sweden)

    Peter Dunn

    1999-07-01

    Full Text Available Generalized linear models unite a wide variety of statistical models in a common theoretical framework. This paper discusses GLMLAB, software that enables such models to be fitted in the popular mathematical package MATLAB. It provides a graphical user interface to the powerful MATLAB computational engine, producing a program that is easy to use but has many features, including offsets, prior weights, and user-defined distributions and link functions. MATLAB's graphical capabilities are also utilized to provide a number of simple residual diagnostic plots.

  2. Robotic and user interface solutions for hazardous and remote applications

    International Nuclear Information System (INIS)

    Schempf, H.

    1997-01-01

    Carnegie Mellon University (CMU) is developing novel robotic and user interface systems to assist in the cleanup activities undertaken by the U.S. Department of Energy (DOE). Under DOE's EM-50 funding, administered by the Federal Energy Technology Center (FETC), CMU has developed a novel asbestos pipe-insulation abatement robot system, called BOA, and a novel generic user interface control and training console, dubbed RoboCon. The use of BOA will allow speedier abatement of the vast DOE piping networks clad with hazardous and contaminated asbestos insulation, reducing overall job costs by as much as 50%. RoboCon will allow the DOE to evaluate different remote and robotic system technologies from the overall man-machine performance standpoint, as well as provide a standardized training platform for training site operators in the operation of remote and robotic equipment.

  3. Knowledge-based critiquing of graphical user interfaces with CHIMES

    Science.gov (United States)

    Jiang, Jianping; Murphy, Elizabeth D.; Carter, Leslie E.; Truszkowski, Walter F.

    1994-01-01

    CHIMES is a critiquing tool that automates the process of checking graphical user interface (GUI) designs for compliance with human factors design guidelines and toolkit style guides. The current prototype identifies instances of non-compliance and presents problem statements, advice, and tips to the GUI designer. Changes requested by the designer are made automatically, and the revised GUI is re-evaluated. A case study conducted at NASA-Goddard showed that CHIMES has the potential for dramatically reducing the time formerly spent in hands-on consistency checking. Capabilities recently added to CHIMES include exception handling and rule building. CHIMES is intended for use prior to usability testing as a means, for example, of catching and correcting syntactic inconsistencies in a larger user interface.
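
    As a toy sketch of the rule-checking idea only (not CHIMES itself or its knowledge base), GUI widgets can be described as data and checked against guideline rules expressed as functions; the widget description format and the two rules below are invented examples.

    ```python
    # Toy sketch of guideline critiquing: widgets as data, rules as functions.
    WIDGETS = [
        {"type": "button", "label": "OK",        "width": 40, "font_pt": 9},
        {"type": "button", "label": "cancel",    "width": 90, "font_pt": 12},
        {"type": "label",  "label": "File name", "width": 80, "font_pt": 12},
    ]

    def check_min_font(widget):
        # Invented rule: text smaller than 10 pt is flagged as hard to read.
        if widget["font_pt"] < 10:
            return f"{widget['type']} '{widget['label']}': font below 10 pt"

    def check_label_capitalisation(widget):
        # Invented rule: button labels should start with a capital letter.
        if widget["type"] == "button" and not widget["label"][:1].isupper():
            return f"button '{widget['label']}': label should start with a capital"

    RULES = [check_min_font, check_label_capitalisation]

    problems = [msg for w in WIDGETS for rule in RULES if (msg := rule(w))]
    for p in problems:
        print("guideline violation:", p)
    ```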

  4. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  5. Automatic Generation of User Interface Layouts for Alternative Screen Orientations

    OpenAIRE

    Zeidler , Clemens; Weber , Gerald; Stuerzlinger , Wolfgang; Lutteroth , Christof

    2017-01-01

    Part 1: Adaptive Design and Mobile Applications; Creating multiple layout alternatives for graphical user interfaces to accommodate different screen orientations for mobile devices is labor intensive. Here, we investigate how such layout alternatives can be generated automatically from an initial layout. Providing good layout alternatives can inspire developers in their design work and support them to create adaptive layouts. We performed an analysis of layout alternat...

  6. A New Layout Method for Graphical User Interfaces

    OpenAIRE

    Scoditti , Adriano; Stuerzlinger , Wolfgang

    2010-01-01

    The layout mechanisms of many GUI toolkits are hard to understand, and the associated tools and APIs are often difficult to use. This work investigates new, easy-to-understand layout mechanisms and evaluates their implementation. We analyze the requirements for defining the layout of a graphical user interface. Part of the issue is that several aspects need to be considered simultaneously while laying out a component: the alignment with other components as well as i...

  7. User interface to an ICAI system that teaches discrete math

    OpenAIRE

    Calcote, Roy Keith.; Howard, Richard Anthony

    1990-01-01

    Approved for public release; distribution is unlimited. The main thrust of this thesis is the design of a usable Intelligent Computer Aided Instruction (ICAI) user interface that does not use a natural language processor and runs on a personal computer. Discrete Mathematics is the knowledge domain for this project and the Discrete Math Tutor (DMT) is the name of the tutoring system. The DMT will allow the average student to benefit from a tutoring system now and not have to wait until the ...

  8. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents the analysis of modern tools for automated testing of various web based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing and overall cost deduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  9. A general graphical user interface for automatic reliability modeling

    Science.gov (United States)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.
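
    Behind such a GUI sits a Markov model of the system's up/down states. As a minimal numerical sketch of that kind of model (not the GUI or the PMS modeling described above), the following solves a two-state repairable component for its steady-state availability; the failure and repair rates are made up.

    ```python
    import numpy as np

    # Two-state repairable component: state 0 = up, state 1 = down.
    lam = 1e-3   # failure rate per hour (assumed)
    mu = 1e-1    # repair rate per hour (assumed)

    # Generator matrix Q of the continuous-time Markov chain (rows sum to zero).
    Q = np.array([[-lam,  lam],
                  [  mu,  -mu]])

    # Steady-state probabilities pi solve pi Q = 0 with the probabilities summing to one.
    A = np.vstack([Q.T, np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print("steady-state availability:", pi[0])        # ~ mu / (mu + lam)
    print("analytic check           :", mu / (mu + lam))
    ```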

  10. Simulated breeding with QU-GENE graphical user interface.

    Science.gov (United States)

    Hathorn, Adrian; Chapman, Scott; Dieters, Mark

    2014-01-01

    Comparing the efficiencies of breeding methods with field experiments is a costly, long-term process. QU-GENE is a highly flexible genetic and breeding simulation platform capable of simulating the performance of a range of different breeding strategies and for a continuum of genetic models ranging from simple to complex. In this chapter we describe some of the basic mechanics behind the QU-GENE user interface and give a simplified example of how it works.

  11. Development of a Graphical User Interface to Visualize Surface Observations

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, R.L.

    1998-07-13

    Thousands of worldwide observing stations provide meteorological information near the earth's surface as often as once each hour. This surface data may be plotted on geographical maps to provide the meteorologist useful information regarding weather patterns for a region of interest. This report describes the components and applications of a graphical user interface which have been developed to visualize surface observations at any global location and time of interest.
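
    The underlying plotting step of such a viewer can be sketched in a few lines of Python/matplotlib. This is not the interface described in the report: the station list is made up, and a real viewer would add coastlines plus interactive selection of region and valid time.

    ```python
    import matplotlib.pyplot as plt

    # Made-up surface observations: (station, lon, lat, temperature in deg C).
    OBS = [
        ("KATL", -84.43, 33.64, 24.0),
        ("KORD", -87.90, 41.98, 18.5),
        ("EGLL",  -0.46, 51.47, 14.0),
        ("RJTT", 139.78, 35.55, 21.5),
    ]

    lons = [o[1] for o in OBS]
    lats = [o[2] for o in OBS]
    temps = [o[3] for o in OBS]

    # Simple lon/lat scatter coloured by temperature, with station labels.
    sc = plt.scatter(lons, lats, c=temps, cmap="coolwarm", s=80)
    for name, lon, lat, t in OBS:
        plt.annotate(f"{name} {t:.0f}C", (lon, lat),
                     textcoords="offset points", xytext=(5, 5))
    plt.colorbar(sc, label="Temperature (deg C)")
    plt.xlabel("Longitude")
    plt.ylabel("Latitude")
    plt.title("Surface observations (synthetic)")
    plt.show()
    ```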

  12. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating systems. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  13. User interface for MAWST limit of error program

    International Nuclear Information System (INIS)

    Crain, B. Jr.

    1991-01-01

    This paper reports on a user-friendly interface which is being developed to aid in preparation of input data for the Los Alamos National Laboratory software module MAWST (Materials Accounting With Sequential Testing) used at Savannah River Site to propagate limits of error for facility material balances. The forms-based interface is being designed using traditional software project management tools and using the Ingres family of database management and application development products (products of Relational Technology, Inc.). The software will run on VAX computers (products of Digital Equipment Corporation) on which the VMS operating system and Ingres database management software are installed. Use of the interface software will reduce time required to prepare input data for calculations and also reduce errors associated with data preparation

  14. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Full Text Available Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.
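
    RGtk2 itself is an R package, but the underlying GTK+ pattern it wraps (create widgets programmatically, connect callbacks to signals, then hand control to the main loop) can be illustrated with a minimal, hedged sketch. The example below uses Python with PyGObject and GTK 3 rather than the RGtk2 R API described in the record; the window layout and labels are illustrative only.

      import gi
      gi.require_version("Gtk", "3.0")
      from gi.repository import Gtk

      def on_click(button):
          # Signal handler: runs when the button emits its "clicked" signal.
          button.set_label("Clicked")

      win = Gtk.Window(title="Minimal GTK sketch")
      button = Gtk.Button(label="Click me")
      button.connect("clicked", on_click)    # map a GTK signal to a callback
      win.add(button)                        # pack the widget into the window
      win.connect("destroy", Gtk.main_quit)  # leave the loop when the window closes
      win.show_all()
      Gtk.main()                             # enter the GTK event loop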

  15. Flair: A powerful but user friendly graphical interface for FLUKA

    International Nuclear Information System (INIS)

    Vlachoudis, V.

    2009-01-01

    FLAIR is an advanced graphical user interface for FLUKA that enables the user to start and control FLUKA jobs completely from a GUI environment, without the need for command-line interactions. It is written entirely in Python and Tkinter, allowing easier portability across various operating systems and great programming flexibility, with a focus on being usable as an Application Programming Interface (API) for FLUKA. FLAIR is an integrated development environment (IDE) for FLUKA: it not only provides means for post-processing the output, but also places a strong emphasis on the creation and checking of error-free input files. It contains a fully featured editor for editing the input files in a human-readable way with syntax highlighting, without hiding the inner functionality of FLUKA from the users. It also provides means for building the executable, debugging the geometry, running the code, monitoring the status of one or many runs, inspecting the output files, post-processing the binary files (data merging) and interfacing to plotting utilities like gnuplot and PovRay for high-quality plots or photo-realistic images. The program also includes a database of selected properties of all known nuclides and their known isotopic compositions, as well as a reference database of ∼ 300 predefined materials together with their Sternheimer parameters. (authors)

  16. A graphical user-interface control system at SRRC

    International Nuclear Information System (INIS)

    Chen, J.S.; Wang, C.J.; Chen, S.J.; Jan, G.J.

    1993-01-01

    A graphical user interface control system for the 1.3 GeV synchrotron radiation light source was designed and implemented for the beam transport line (BTL) and storage ring (SR). A modern control technique has been used to implement and control the third-generation synchrotron light source. A two-level computer hardware configuration, which includes process and console computers at the top level and VME-based intelligent local controllers at the bottom level, was set up and tested. The two levels are linked by a high-speed Ethernet data communication network. A database, including static and dynamic databases as well as access routines, was developed. In order to commission and operate the machine conveniently, a graphical man-machine interface was designed and coded. The graphical user interface (GUI) software was installed on VAX workstations for the BTL and SR at the Synchrotron Radiation Research Center (SRRC). The overall performance has been evaluated at a 10 Hz update rate. The results showed that the graphical operator interface control system is a versatile system and can be implemented into the control system of the accelerator. It will provide the tool to control and monitor the equipment of the radiation light source, especially for machine commissioning and operation.

  17. More playful user interfaces interfaces that invite social and physical interaction

    CERN Document Server

    2015-01-01

    This book covers the latest advances in playful user interfaces: interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback and anticipate the behavior of human users. The decreasing cost of sensor and actuator technology makes it possible to integrate physical behavior information in human-computer interactions. This leads to many new entertainment and game applications that allow or require social and physical interaction in sensor- and actuator-equipped smart environments. The topics discussed include: human-nature interaction, human-animal interaction and the interaction with tangibles that are naturally integrated in our smart environments. Digitally supported remote audience participation in artistic or sport events is also discussed. One important theme that emerges throughout the book is the involvement of users in the digital-entertainment design process or even design and implement...

  18. Beam modelling with a window-oriented user interface

    International Nuclear Information System (INIS)

    Raich, U.

    1990-01-01

    In the near future, graphic workstations will be used as replacements for the present operator consoles in the CERN PS accelerator complex. This implies a major change in the style of work for the operators and in the way programs have to be conceived by the programmers. ULTRIX-based (ULTRIX is Digital's version of UNIX) VAX workstations have been selected and DEC-Windows will be used to construct the user interfaces. As a first prototype application, TRACE3D, a beam-transport simulation program, has been adapted to the new environment. This program was an ideal candidate for tests because it needs a great deal of user interaction, while process access is not necessary, at least in a first implementation. This paper describes the different ways in which users interact in order to calculate the beam envelopes, to plot the results, to print out the beam or transport-line parameters, to modify the parameters and to get help. (orig.)

  19. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

    Full Text Available Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or to test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluated on a collection of 202 full-text articles in which the authors had ranked the figures based on importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers prefer, as their top two choices, the user interfaces in which the most important figures are enlarged. With our automatic figure ranking NLP system, bioscience researchers preferred the UIs in which the most important figures were predicted by our NLP system over the UIs in which the most important figures were randomly assigned. In addition, our results show that there was no statistical difference in bioscience researchers' preference between the UIs generated by automatic figure ranking and the UIs based on human ranking annotation. The evaluation results conclude that automatic figure ranking and user

  20. An Accessible User Interface for Geoscience and Programming

    Science.gov (United States)

    Sevre, E. O.; Lee, S.

    2012-12-01

    The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills with the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only simple single-mouse-clicks to input code. It is my goal to minimize the amount of typing necessary to create simple programs and scripts to increase accessibility for people with disabilities limiting fine motor skills. This interface can be adapted for various programming and scripting languages. Using this interface will simplify development of code for C/C++, Java, and GMT, and can be expanded to support any other text based programming language. The interface is designed around the concept of maximizing the amount of code that can be written using a minimum number of clicks and typing. The screen is split into two sections: a list of click-commands is on the left hand side, and a text area is on the right hand side. When the user clicks on a command on the left hand side the applicable code is automatically inserted at the insertion point in the text area. Currently in the C/C++ interface, there are commands for common code segments that are often used, such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype has been developed for the iPad. Due to the limited number of devices that an iOS application can be used with, the code has been re-written in Java to run on a wider range of devices
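
    The click-to-insert mechanism described above (a command list on the left, a text area on the right, and a click that inserts the corresponding code at the insertion point) can be sketched in a few lines. The following is a hypothetical Python/Tkinter illustration, not the author's prototype; the snippet catalogue is invented for the example.

      import tkinter as tk

      # Hypothetical snippet catalogue; a real tool would offer many more commands.
      SNIPPETS = {
          "for loop": "for (int i = 0; i < n; i++) {\n    \n}\n",
          "comment": "/*  */\n",
          "print": "printf(\"\\n\");\n",
      }

      def insert_snippet(event):
          # Insert the selected snippet at the text widget's insertion cursor.
          selection = listbox.curselection()
          if selection:
              name = listbox.get(selection[0])
              text.insert(tk.INSERT, SNIPPETS[name])

      root = tk.Tk()
      root.title("Click-to-insert sketch")
      listbox = tk.Listbox(root)                       # command list on the left
      for name in SNIPPETS:
          listbox.insert(tk.END, name)
      listbox.bind("<<ListboxSelect>>", insert_snippet)
      listbox.pack(side=tk.LEFT, fill=tk.Y)
      text = tk.Text(root)                             # code editing area on the right
      text.pack(side=tk.RIGHT, fill=tk.BOTH, expand=True)
      root.mainloop()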

  1. User Centred Design of a Multimodal Reading Training System for Dyslexics

    DEFF Research Database (Denmark)

    Pedersen, Jakob Schou

    This thesis presents work in the area of computer-based reading training for dyslexics using speech technology and multimodal information. The study includes an analysis of typical dyslexic reading behaviour and traditional training techniques as well as the detailed development of a prototype...... schemes, evaluation forms and graphical layout solutions are investigated. This is done through the design and evaluation of several prototypes that seek to compensate for the loss of empathy when going from traditional training to an automated system. Through evaluations of the final prototype system......, involving dyslexics, it is shown that it is possible for dyslexic users to carry out reading exercises with the sole assistance and guidance of an automated training tool, given a sufficient speech recognition accuracy. It is furthermore shown that in order to cope with the different preferences...

  2. Designing Multimodal Mobile Interaction for a Text Messaging Application for Visually Impaired Users

    Directory of Open Access Journals (Sweden)

    Carlos Duarte

    2017-12-01

    Full Text Available While mobile devices have experienced important accessibility advances in the past years, people with visual impairments still face important barriers, especially in specific contexts where their hands are not free to hold the mobile device, like when walking outside. By resorting to a multimodal combination of body-based gestures and voice, we aim to achieve fully hands- and vision-free interaction with mobile devices. In this article, we describe this vision and present the design of a prototype, inspired by that vision, of a text messaging application. The article also presents a user study in which the suitability of the proposed approach was assessed, and a performance comparison between our prototype and existing SMS applications was conducted. Study participants received the prototype positively, and it also supported better performance in tasks that involved text editing.

  3. Layout design of user interface components with multiple objectives

    Directory of Open Access Journals (Sweden)

    Peer S.K.

    2004-01-01

    Full Text Available A multi-goal layout problem may be formulated as a Quadratic Assignment model, considering multiple goals (or factors), both qualitative and quantitative, in the objective function. The facilities layout problem, in general, ranges from the location and layout of facilities in a manufacturing plant to the location and layout of textual and graphical user interface components in the human–computer interface. In this paper, we propose two alternate mathematical approaches to the single-objective layout model. The first one presents a multi-goal user interface component layout problem, considering the distance-weighted sum of congruent objectives of closeness relationships and interactions. The second one considers the distance-weighted sum of congruent objectives of normalized weighted closeness relationships and normalized weighted interactions. The results of the first approach are compared with those of an existing single-objective model for the example task under consideration. Then, the results of the first and second approaches of the proposed model are compared for the same example task.
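
    As a rough illustration only (not the authors' exact notation), a distance-weighted multi-goal quadratic assignment formulation of the component layout problem can be written as below, where c_ij is the closeness relationship and f_ij the interaction between components i and j, d_kl is the distance between locations k and l, w_1 and w_2 are the goal weights, and x_ik = 1 if component i is assigned to location k:

      \min_{x} \; \sum_{i}\sum_{j}\sum_{k}\sum_{l} \left( w_1\, c_{ij} + w_2\, f_{ij} \right) d_{kl}\, x_{ik}\, x_{jl}

      \text{subject to} \quad \sum_{k} x_{ik} = 1 \;\; \forall i, \qquad \sum_{i} x_{ik} = 1 \;\; \forall k, \qquad x_{ik} \in \{0, 1\}.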

  4. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.
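
    A hedged sketch of the two-stage idea (first reject trials flagged during video coding, then filter the remaining epochs and average) is shown below in Python with NumPy/SciPy. It is a generic illustration, not the MATLAB/EEGLAB pipeline the interface actually calls; the array shapes, sampling rate and filter band are assumptions.

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 500.0                                 # assumed sampling rate (Hz)
      epochs = np.random.randn(120, 32, 500)     # trials x channels x samples (placeholder data)
      noncompliant = {3, 17, 42}                 # trial indices flagged from the video record (stage 1)

      # Stage 1: drop epochs with noncompliant infant behaviour or other artifacts.
      keep = [i for i in range(epochs.shape[0]) if i not in noncompliant]
      clean = epochs[keep]

      # Stage 2: band-pass filter the remaining epochs (0.3-30 Hz, a common ERP band).
      b, a = butter(4, [0.3 / (fs / 2), 30.0 / (fs / 2)], btype="band")
      filtered = filtfilt(b, a, clean, axis=-1)

      # Bootstrapped group average over trials, mirroring the visualization step.
      idx = np.random.randint(0, filtered.shape[0], size=filtered.shape[0])
      boot_avg = filtered[idx].mean(axis=0)
      print(boot_avg.shape)                      # channels x samples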

  5. An SML Driven Graphical User Interface and Application Management Toolkit

    International Nuclear Information System (INIS)

    White, Greg R

    2002-01-01

    In the past, the features of a user interface were limited by those available in the existing graphical widgets it used. Now, improvements in processor speed have fostered the emergence of interpreted languages, in which the appropriate method to render a given data object can be loaded at runtime. XML can be used to precisely describe the association of data types with their graphical handling (beans), and Java provides an especially rich environment for programming the graphics. We present a graphical user interface builder based on Java Beans and XML, in which the graphical screens are described textually (in files or a database) in terms of their screen components. Each component may be a simple text readback or a complex plot. The programming model provides for dynamic data pertaining to a component to be forwarded, synchronously or asynchronously, to the appropriate handler, which may be a built-in method or a complex applet. This work was initially motivated by the need to move the legacy VMS display interface of the SLAC Control Program to another platform while preserving all of its existing functionality. However, the model gives us a powerful and generic system for adding new kinds of graphics, such as Matlab, data sources, such as EPICS, middleware, such as AIDA[1], and transport, such as XML and SOAP. The system will also include a management console, which will be able to report on the present usage of the system, for instance who is running it where and connected to which channels
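
    The core idea of describing screens textually and dispatching each component's data to a handler can be sketched generically. The XML layout, channel names and handler functions below are hypothetical, and the sketch is written in Python rather than the Java Beans implementation the record describes.

      import xml.etree.ElementTree as ET

      # Hypothetical screen description: each component names a type and a data channel.
      SCREEN_XML = """
      <screen name="demo">
        <component type="textReadback" channel="BPM:01:X"/>
        <component type="plot" channel="BPM:01:WAVEFORM"/>
      </screen>
      """

      # Minimal stand-in handlers keyed by component type.
      def render_text(channel):
          print(f"[text] latest value of {channel}")

      def render_plot(channel):
          print(f"[plot] plotting waveform from {channel}")

      HANDLERS = {"textReadback": render_text, "plot": render_plot}

      root = ET.fromstring(SCREEN_XML)
      for comp in root.findall("component"):
          handler = HANDLERS[comp.get("type")]   # associate the component type with its handler
          handler(comp.get("channel"))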

  6. Adaptive multimodal interaction in mobile augmented reality: A conceptual framework

    Science.gov (United States)

    Abidin, Rimaniza Zainal; Arshad, Haslina; Shukri, Saidatul A'isyah Ahmad

    2017-10-01

    Augmented Reality (AR) has recently emerged as a technology used in many mobile applications. Mobile AR has been defined as a medium for displaying information merged with the real-world environment, mapped to the augmented reality surroundings, in a single view. There are four main types of mobile augmented reality interfaces, and one of them is multimodal interfaces. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework to illustrate the adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces and augmented reality. We analyzed the components of the previous frameworks and assessed which can be applied to mobile devices. Our framework can be used as a guide for designers and developers to develop a mobile AR application with an adaptive multimodal interface.

  7. Exploring Interaction Space as Abstraction Mechanism for Task-Based User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2007-01-01

    Designing a user interface is often a complex undertaking. Model-based user interface design is an approach where models and mappings between them form the basis for creating and specifying the design of a user interface. Such models usually include descriptions of the tasks of the prospective user......, but there is considerable variation in the other models that are employed. This paper explores the extent to which the notion of interaction space is useful as an abstraction mechanism to reduce the complexity of creating and specifying a user interface design. We present how we designed a specific user interface through...... mechanism that can help user interface designers exploit object-oriented analysis results and reduce the complexity of designing a user interface....

  8. End-User Control over Physical User-Interfaces: From Digital Fabrication to Real-Time Adaptability

    OpenAIRE

    Ramakers, Raf

    2016-01-01

    Graphical user interfaces are at the core of the majority of computing devices, including WIMP (windows, icons, menus, pointer) interaction styles and touch interactions. The popularity of graphical user interfaces stems from their ability to adapt to a multitude of tasks, such as document editing, messaging, browsing, etc. New tools and technologies also enabled users without a technical background to author these kind of digital interfaces, for example, filters for quick photo editing, inte...

  9. Tactile and bone-conduction auditory brain computer interface for vision and hearing impaired users.

    Science.gov (United States)

    Rutkowski, Tomasz M; Mori, Hiromu

    2015-04-15

    The paper presents a report on the recently developed BCI alternative for users suffering from impaired vision (lack of focus or eye-movements) or from the so-called "ear-blocking-syndrome" (limited hearing). We report on our recent studies of the extent to which vibrotactile stimuli delivered to the head of a user can serve as a platform for a brain computer interface (BCI) paradigm. In the proposed tactile and bone-conduction auditory BCI, multiple novel head positions are used to evoke combined somatosensory and auditory (via the bone conduction effect) P300 brain responses, in order to define a multimodal tactile and bone-conduction auditory brain computer interface (tbcaBCI). In order to further remove EEG interferences and to improve P300 response classification, the synchrosqueezing transform (SST) is applied. SST outperforms the classical time-frequency analysis methods for non-linear and non-stationary signals such as EEG. The proposed method is also computationally more efficient compared to empirical mode decomposition. The SST filtering allows for online EEG preprocessing, which is essential in the case of BCI. Experimental results with healthy BCI-naive users performing online tbcaBCI validate the paradigm, while the feasibility of the concept is illuminated through information transfer rate case studies. We present a comparison of the proposed SST-based preprocessing method, combined with a logistic regression (LR) classifier, with classical preprocessing and LDA-based classification BCI techniques. The proposed tbcaBCI paradigm, together with data-driven preprocessing methods, is a step forward in robust BCI applications research. Copyright © 2014 Elsevier B.V. All rights reserved.
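
    As a loose illustration of the final classification stage only (logistic regression over preprocessed EEG epochs), here is a hedged Python/scikit-learn sketch. The synchrosqueezing (SST) preprocessing step is omitted and replaced by placeholder feature vectors, so this is not the authors' pipeline; epoch dimensions and labels are invented.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_trials, n_channels, n_samples = 200, 8, 128                # assumed epoch dimensions
      epochs = rng.normal(size=(n_trials, n_channels, n_samples))  # placeholder preprocessed EEG
      labels = rng.integers(0, 2, size=n_trials)                   # 1 = target (P300 expected), 0 = non-target

      # Flatten each epoch into a feature vector; SST-derived features would be used here instead.
      X = epochs.reshape(n_trials, -1)

      clf = LogisticRegression(max_iter=1000)
      scores = cross_val_score(clf, X, labels, cv=5)
      print("mean cross-validated accuracy:", scores.mean())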

  10. User Interface Design to Bring Simulation Data into the Classroom

    International Nuclear Information System (INIS)

    Tebbe, P.A.

    1999-01-01

    The training and simulation staff at the AmerenUE Callaway nuclear plant has been given the task of implementing the plant full-scope simulator in a classroom setting. As part of this project, members of the Nuclear Engineering Program at the University of Missouri, Columbia are working with plant personnel to create desktop software for use in training on fundamental plant principles. Data are created with the same modeling software used to power the simulator and are made available with the existing dynamic database structure. Visualization is provided through a specially designed user interface, created with the G programming language of LabVIEW. It is hoped that by focusing on a specific topic and designing the interface with educational objectives in mind, this software will help provide operators with an improved understanding of fundamental principles

  11. Graphical user interface for image acquisition and processing

    Science.gov (United States)

    Goldberg, Kenneth A.

    2002-01-01

    This is an event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  12. Graphical user interface prototyping for distributed requirements engineering

    CERN Document Server

    Scheibmayr, Sven

    2014-01-01

    Finding and understanding the right requirements is essential for every software project. This book deals with the challenge of improving requirements engineering in distributed software projects. The use of graphical user interface (GUI) prototypes can help stakeholders in such projects to elicit and specify high quality requirements. The research objective of this study is to develop a method and a software artifact to support the activities in the early requirements engineering phase in order to overcome some of the difficulties and improve the quality of the requirements, which should eventu

  13. Graphical user interface for wireless sensor networks simulator

    Science.gov (United States)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They are suited to many applications, from military through environment monitoring, healthcare, home automation and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols, which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator dedicated to WSN studies, especially for the evaluation of routing and data link protocols.

  14. New kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Saroyan, R.A.; Mead, J.E.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  15. The homes of tomorrow: service composition and advanced user interfaces

    Directory of Open Access Journals (Sweden)

    Claudio Di Ciccio

    2011-12-01

    Full Text Available Home automation represents a growing market in the industrialized world. Today’s systems are mainly based on ad hoc and proprietary solutions, with little to no interoperability and smart integration. However, in a not so distant future, our homes will be equipped with many sensors, actuators and devices, which will collectively expose services, able to smartly interact and integrate, in order to offer complex services providing even richer functionalities. In this paper we present the approach and results of SM4ALL- Smart hoMes for All, a project investigating automatic service composition and advanced user interfaces applied to domotics.

  16. A new kind of user interface for controlling MFTF diagnostics

    International Nuclear Information System (INIS)

    Preckshot, G.; Mead, J.; Saroyan, R.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF) at Lawrence Livermore National Laboratory is faced with the problem of controlling a multitude of plasma diagnostics instruments from a central, multiprocessor computer facility. A 16-bit microprocessor-based workstation allows each physicist entree into the central multiprocessor, which consists of nine Perkin-Elmer 32-bit minicomputers. The workstation provides the user interface to the larger system, with display graphics, windowing, and a physics notebook. Controlling a diagnostic is now equivalent to making entries into a traditional physics notebook

  17. NATURAL USER INTERFACE SENSORS FOR HUMAN BODY MEASUREMENT

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2012-08-01

    Full Text Available The recent push for natural user interfaces (NUI) in the entertainment and gaming industry has ushered in a new era of low-cost three-dimensional sensors. While the basic idea of using a three-dimensional sensor for human gesture recognition dates back some years, it is not until recently that such sensors became available on the mass market. The current market leader is PrimeSense, who provide their technology for the Microsoft Xbox Kinect. Since these sensors are developed to detect and observe human users, they should be ideally suited to measure the human body. We describe the technology of a line of NUI sensors and assess their performance in terms of repeatability and accuracy. We demonstrate the implementation of a prototype scanner integrating several NUI sensors to achieve full body coverage. We present the results of the obtained surface model of a human body.

  18. Object-oriented user interfaces for personalized mobile learning

    CERN Document Server

    Alepis, Efthimios

    2014-01-01

    This book presents recent research in mobile learning and advanced user interfaces. It is shown how the combination of this fields can result in personalized educational software that meets the requirements of state-of-the-art mobile learning software. This book provides a framework that is capable of incorporating the software technologies, exploiting a wide range of their current advances and additionally investigating ways to go even further by providing potential solutions to future challenges. The presented approach uses the well-known Object-Oriented method in order to address these challenges. Throughout this book, a general model is constructed using Object-Oriented Architecture. Each chapter focuses on the construction of a specific part of this model, while in the conclusion these parts are unified. This book will help software engineers build more sophisticated personalized software that targets in mobile education, while at the same time retaining a high level of adaptivity and user-friendliness w...

  19. Natural User Interface Sensors for Human Body Measurement

    Science.gov (United States)

    Boehm, J.

    2012-08-01

    The recent push for natural user interfaces (NUI) in the entertainment and gaming industry has ushered in a new era of low-cost three-dimensional sensors. While the basic idea of using a three-dimensional sensor for human gesture recognition dates back some years, it is not until recently that such sensors became available on the mass market. The current market leader is PrimeSense, who provide their technology for the Microsoft Xbox Kinect. Since these sensors are developed to detect and observe human users, they should be ideally suited to measure the human body. We describe the technology of a line of NUI sensors and assess their performance in terms of repeatability and accuracy. We demonstrate the implementation of a prototype scanner integrating several NUI sensors to achieve full body coverage. We present the results of the obtained surface model of a human body.

  20. NASA Access Mechanism - Graphical user interface information retrieval system

    Science.gov (United States)

    Hunter, Judy F.; Generous, Curtis; Duncan, Denise

    1993-01-01

    Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.

  1. NASA access mechanism: Graphical user interface information retrieval system

    Science.gov (United States)

    Hunter, Judy; Generous, Curtis; Duncan, Denise

    1993-01-01

    Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.

  2. Usability evaluation of user interface of thesis title review system

    Science.gov (United States)

    Tri, Y.; Erna, A.; Gellysa, U.

    2018-03-01

    Presentation of programs with a user interface that can be accessed online through a website greatly benefits users: they can easily access the programs they need. There are usability values that serve as benchmarks for the success of a user-accessible program, i.e. efficiency, effectiveness, and convenience. These usability values also determine how the program should be developed for better use. Therefore, a usability evaluation was carried out on the thesis title review program to be implemented in STT Dumai. It aims to see which aspects are not yet perfect and need to be improved to increase the performance and utilization of the program. The usability evaluation was measured using the smartPLS software. The database used was the result of respondent questionnaires, which included questions about their experience when using the program. The review of the thesis title program implemented in STT Dumai yielded an efficiency value of 22.615, effectiveness of 20.612, and satisfaction of 33.177.

  3. Graphical user interface development for the MARS code

    International Nuclear Information System (INIS)

    Jeong, J.-J.; Hwang, M.; Lee, Y.J.; Kim, K.D.; Chung, B.D.

    2003-01-01

    KAERI has developed the best-estimate thermal-hydraulic system code MARS using the RELAP5/MOD3 and COBRA-TF codes. To exploit the excellent features of the two codes, we consolidated the two codes. Then, to improve the readability, maintainability, and portability of the consolidated code, all the subroutines were completely restructured by employing a modular data structure. At present, a major part of the MARS code development program is underway to improve the existing capabilities. The code couplings with three-dimensional neutron kinetics, containment analysis, and transient critical heat flux calculations have also been carried out. At the same time, graphical user interface (GUI) tools have been developed for user friendliness. This paper presents the main features of the MARS GUI. The primary objective of the GUI development was to provide a valuable aid for all levels of MARS users in their output interpretation and interactive controls. Especially, an interactive control function was designed to allow operator actions during simulation so that users can utilize the MARS code like conventional nuclear plant analyzers (NPAs). (author)

  4. User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface

    Science.gov (United States)

    Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.

    1993-01-01

    The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program or it can be called from within a graphical user interface HGUI that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and typein fields in HGUI for users to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.

  5. TaskMaster: a prototype graphical user interface to a schedule optimization model

    OpenAIRE

    Banham, Stephen R.

    1990-01-01

    Approved for public release; distribution is unlimited. This thesis investigates the use of current graphical interface techniques to build more effective computer-user interfaces to Operations Research (OR) schedule optimization models. The design is directed at the scheduling decision maker who possesses limited OR experience. The feasibility and validity of building an interface for this kind of user is demonstrated in the development of a prototype graphical user interface called TaskMa...

  6. User Control Interface for W7-X Plasma Operation

    International Nuclear Information System (INIS)

    Spring, A.; Laqua, H.; Schacht, J.

    2006-01-01

    The WENDELSTEIN 7-X fusion experiment will be a highly complex device operated by a likewise complex control system. The fundamental configuration of the W7-X control system follows two major design principles. First, it reflects the strict hierarchy of the machine set-up with a set of subordinated components, which in turn can be run autonomously during commissioning and testing. Secondly, it links the basic machine operation (mainly given by the infrastructure status and the components' readiness) and the physics program execution (i.e. plasma operation) on each hierarchy level and on different time scales. The complexity of the control system implies great demands on appropriate user interfaces: specialized tools for specific control tasks allowing a dedicated view of the subject to be controlled, hiding complexity wherever possible and reasonable, providing similar operation methods on each hierarchy level, and both manual interaction possibilities and a high degree of intelligent automation. The contribution will describe the operation interface for experiment control, including the necessary links to the machine operation. The users of 'Xcontrol' will be both the W7-X session leaders during plasma discharge experiments and the components' or diagnostics' operators during autonomous mode or even laboratory experiments. The main 'Xcontrol' features, such as program composition and validation, manual and automatic control instruments, resource survey, and process monitoring, will be presented. The implementation principles and the underlying communication will be discussed. (author)

  7. User interface using a 3D model for video surveillance

    Science.gov (United States)

    Hata, Toshihiko; Boh, Satoru; Tsukada, Akihiro; Ozaki, Minoru

    1998-02-01

    These days fewer people, who must carry out their tasks quickly and precisely, are required in industrial surveillance and monitoring applications such as plant control or building security. Utilizing multimedia technology is a good approach to meet this need, and we previously developed Media Controller, which is designed for these applications and provides real-time recording and retrieval of digital video data in a distributed environment. In this paper, we propose a user interface for such a distributed video surveillance system in which 3D models of buildings and facilities are connected to the surveillance video. A novel method of synchronizing camera field data with each frame of a video stream is considered. This method records and reads the camera field data similarly to the video data and transmits it synchronously with the video stream. This enables the user interface to offer such useful functions as comprehending the camera field immediately and providing clues when visibility is poor, for not only live video but also playback video. We have also implemented and evaluated the display function, which makes the surveillance video and the 3D model work together, using Media Controller with Java and the Virtual Reality Modeling Language employed for multi-purpose and intranet use of the 3D model.
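
    A minimal, hypothetical sketch of the synchronization idea (store the camera field data alongside the video, then pair each frame with the field record closest in time) is given below in Python; the data layout and field names are assumptions, not the Media Controller format.

      from bisect import bisect_left

      # Hypothetical streams: video frame timestamps (s) and camera field records (pan angle, etc.).
      frame_times = [0.00, 0.04, 0.08, 0.12]
      field_records = [(0.00, {"pan": 10}), (0.05, {"pan": 12}), (0.11, {"pan": 15})]
      field_times = [t for t, _ in field_records]

      def field_for_frame(t):
          # Find the camera field record recorded closest to frame time t.
          i = bisect_left(field_times, t)
          candidates = [j for j in (i - 1, i) if 0 <= j < len(field_records)]
          best = min(candidates, key=lambda j: abs(field_times[j] - t))
          return field_records[best][1]

      for t in frame_times:
          # The field data would be transmitted/displayed synchronously with this frame.
          print(t, field_for_frame(t))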

  8. The ganga user interface for physics analysis and distributed resources

    CERN Document Server

    Soroko, A; Adams, D; Harrison, K; Charpentier, P; Maier, A; Mato, P; Moscicki, J T; Egede, U; Martyniak, J; Jones, R; Patrick, G N

    2004-01-01

    A physicist analysing data from the LHC experiments will have to deal with data and computing resources that are distributed across multiple locations and have different access methods. Ganga helps by providing a uniform high-level interface to the different low-level solutions for the required tasks, ranging from the specification of input data to the retrieval and post-processing of the output. For LHCb and ATLAS the goal is to assist in running jobs based on the Gaudi/Athena C++ framework. Ganga is written in python and presents the user with a single GUI rather than a set of different applications. It uses pluggable modules to interact with external tools for operations such as querying metadata catalogues, job configuration and job submission. At start-up, the user is presented with a list of templates for common analysis tasks, and information about ongoing tasks is stored from one invocation to the next. Ganga can also be used through a command line interface. This closely mirrors the functionality of ...

  9. Multimodal human-machine interaction for service robots in home-care environments

    OpenAIRE

    Goetze, Stefan; Fischer, S.; Moritz, Niko; Appell, Jens-E.; Wallhoff, Frank

    2012-01-01

    This contribution focuses on multimodal interaction techniques for a mobile communication and assistance system on a robot platform. The system comprises acoustic, visual and haptic input modalities. Feedback is given to the user by a graphical user interface and a speech synthesis system. By this, multimodal and natural communication with the robot system is possible.

  10. A Model-Driven Approach to Graphical User Interface Runtime Adaptation

    OpenAIRE

    Criado, Javier; Vicente Chicote, Cristina; Iribarne, Luis; Padilla, Nicolás

    2010-01-01

    Graphical user interfaces play a key role in human-computer interaction, as they link the system with its end-users, allowing information exchange and improving communication. Nowadays, users increasingly demand applications with adaptive interfaces that dynamically evolve in response to their specific needs. Thus, providing graphical user interfaces with runtime adaptation capabilities is becoming more and more an important issue. To address this problem, this paper proposes a componen...

  11. Java graphical user interface for the supervision of Tore Supra

    International Nuclear Information System (INIS)

    Utzel, Nadine; Guillerminet, Bernard; Leluyer, Mireille; Moulin, Daniele

    2002-01-01

    The graphical user interface (GUI) for the supervision of Tore Supra is intended to supervise the start-up and the shut-down of the installation, to control the general state (state of all diagnostics, state of the system and network) and to follow the pulse sequence. Implementation of a new multi-platform, modular GUI for Tore Supra is in progress. This provides not only a simpler, more structured view for the non-specialist user, but is also open-ended and adaptable to a wide variety of uses. The actual implementation of a GUI is a question of user ergonomics. Hence, a user-directed study in 2000 produced a specification for the interface. The information is treated in a hierarchical order. At the top level, only the global state of the supervised elements appears, i.e. the general state of every diagnostic, the pulse sequence and the safety systems. If a problem occurs, the operator has access to the lower-level detailed state of the concerned element, simply with a double-click. An event log also helps the operator to analyse the chronology of the alarms arising during the pulse. Although the GUI is mainly used in the control room on X terminals under Unix, it should also be accessible via a portable PC for the purpose of maintenance, or directly from any office to see how the physics program is progressing. Java, a multi-platform object-oriented programming language, was thus adopted, with access via any web browser. The modularity of the GUI is made possible by a distributed architecture (remote method invocation) between the graphic client and different servers: one for the diagnostics and the sequence, one for the system and the network and one for the configuration database. All the components interact with each other in a very simple and standard way. This distributed architecture allows the progressive set-up of the new interface. The first step, being produced for mid-2001, is the GUI for the supervision of diagnostics. This prototype will help us to

  12. Java graphical user interface for the supervision of Tore Supra

    Energy Technology Data Exchange (ETDEWEB)

    Utzel, Nadine E-mail: nutzel@cea.fr; Guillerminet, Bernard; Leluyer, Mireille; Moulin, Daniele

    2002-06-01

    The graphical user interface (GUI) for the supervision of Tore Supra is intended to supervise the start-up and the shut-down of the installation, to control the general state (state of all diagnostics, state of the system and network) and to follow the pulse sequence. Implementation of a new multi-platform, modular GUI for Tore Supra is in progress. This provides not only a simpler, more structured view for the non-specialist user, but is also open-ended and adaptable to a wide variety of uses. The actual implementation of a GUI is a question of user ergonomics. Hence, a user-directed study in 2000 produced a specification for the interface. The information is treated in a hierarchical order. At the top level, only the global state of the supervised elements appears, i.e. the general state of every diagnostic, the pulse sequence and the safety systems. If a problem occurs, the operator has access to the lower-level detailed state of the concerned element, simply with a double-click. An event log also helps the operator to analyse the chronology of the alarms arising during the pulse. Although the GUI is mainly used in the control room on X terminals under Unix, it should also be accessible via a portable PC for the purpose of maintenance, or directly from any office to see how the physics program is progressing. Java, a multi-platform object-oriented programming language, was thus adopted, with access via any web browser. The modularity of the GUI is made possible by a distributed architecture (remote method invocation) between the graphic client and different servers: one for the diagnostics and the sequence, one for the system and the network and one for the configuration database. All the components interact with each other in a very simple and standard way. This distributed architecture allows the progressive set-up of the new interface. The first step, being produced for mid-2001, is the GUI for the supervision of diagnostics. This prototype will help us to

  13. Wearable wireless User Interface Cursor-Controller (UIC-C).

    Science.gov (United States)

    Marjanovic, Nicholas; Kerr, Kevin; Aranda, Ricardo; Hickey, Richard; Esmailbeigi, Hananeh

    2017-07-01

    Controlling a computer or a smartphone's cursor allows the user to access a world full of information. For millions of people with limited upper-extremity motor function, controlling the cursor becomes profoundly difficult. Our team has developed the User Interface Cursor-Controller (UIC-C) to assist impaired individuals in regaining control over the cursor. The UIC-C is a hands-free device that utilizes the tongue muscle to control cursor movements. The entire device is housed inside a subject-specific retainer. The user maneuvers the cursor by manipulating a joystick embedded inside the retainer with their tongue. The joystick movement commands are sent to an electronic device via a Bluetooth connection. The device is readily recognizable as a cursor controller by any Bluetooth-enabled electronic device. The device testing results have shown that the time it takes the user to control the cursor accurately via the UIC-C is about three times longer than with a standard computer mouse controlled by hand. The device does not require any permanent modifications to the body; therefore, it could be used during the period of acute rehabilitation of the hands. With the development of modern smart homes and enhanced electronics controlled by the computer, the UIC-C could be integrated into a system that enables individuals with permanent impairment to control the cursor. In conclusion, the UIC-C device is designed with the goal of allowing the user to accurately control a cursor during periods of either acute or permanent upper-extremity impairment.
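
    The mapping from a small embedded joystick to relative cursor motion can be sketched as follows. This is a hypothetical Python illustration using the third-party pyautogui package for cursor movement; read_joystick() is an invented stand-in for the retainer's Bluetooth input and does not reflect the UIC-C protocol.

      import random
      import time

      import pyautogui  # third-party package providing programmatic cursor control

      SENSITIVITY = 15  # pixels per update at full joystick deflection (tuning assumption)

      def read_joystick():
          # Invented stand-in: return a normalized (x, y) deflection in [-1, 1].
          # A real device would deliver these values over the Bluetooth link.
          return random.uniform(-1, 1), random.uniform(-1, 1)

      for _ in range(100):                     # simple polling loop
          x, y = read_joystick()
          if abs(x) > 0.1 or abs(y) > 0.1:     # dead zone so the cursor rests when untouched
              pyautogui.moveRel(int(x * SENSITIVITY), int(y * SENSITIVITY))
          time.sleep(0.02)                     # roughly 50 Hz update rate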

  14. Graphical user interface concepts for tactical augmented reality

    Science.gov (United States)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieving the objectives of ULTRA-Vis, called iLeader, incorporates a full-color 40° field of view (FOV) see-through holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark up the 3D battle-space with symbologic identification of graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, non-disruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  15. StarTrax --- The Next Generation User Interface

    Science.gov (United States)

    Richmond, Alan; White, Nick

    StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e. bulletins, catalogs, proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System), and later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method of accessing StarTrax. Use it if you prefer the point-and-click metaphor of modern GUI technology to the classical command-line interfaces (CLI). Notable strengths include: ease of use; excellent portability; very robust server support; a feedback button on every dialog; a painstakingly crafted User Guide. It is designed to support a large number of input devices including terminals, workstations and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), Macintosh, MS-Windows (DOS), or character systems.

  16. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The outputs of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to show informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of actinides (Thorium, Plutonium, Neptunium and Uranium). Values are visualized in graphical form to support the analysis.
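
    A hedged sketch of the kind of post-processing such an interface automates is shown below: scan a large text output for selected quantities and plot them. It is written in Python (the language the record names), but the output file name, line format and regular expression are invented for illustration and do not correspond to the actual VSOP or MCNPX output layout.

      import re
      import matplotlib.pyplot as plt

      # Invented output line format, e.g. "BURNUP  12000.0  KEFF  1.02345".
      PATTERN = re.compile(r"BURNUP\s+([\d.Ee+-]+)\s+KEFF\s+([\d.Ee+-]+)")

      burnup, keff = [], []
      with open("vsop_output.txt") as f:        # hypothetical output file
          for line in f:
              m = PATTERN.search(line)
              if m:
                  burnup.append(float(m.group(1)))
                  keff.append(float(m.group(2)))

      # Visualize k-eff versus burn-up to support the neutronic analysis.
      plt.plot(burnup, keff, marker="o")
      plt.xlabel("Burn-up (MWd/t)")
      plt.ylabel("k-eff")
      plt.title("k-eff vs burn-up (illustrative)")
      plt.savefig("keff_vs_burnup.png")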

  17. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The outputs of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to show informative results. This research develops user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. The software was developed using iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of actinides (Thorium, Plutonium, Neptunium and Uranium). Values are visualized in graphical form to support the analysis.

  18. CTG Analyzer: A graphical user interface for cardiotocography.

    Science.gov (United States)

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations and decelerations. The UC signal, instead, is characterized by the presence of contractions and the contraction period. Such parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has a well-demonstrated poor reproducibility, due to the complexity of the physiological phenomena affecting fetal heart rhythm and to its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving correctness in CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed under MATLAB®; it is a very intuitive and user-friendly graphical user interface. The FHR time series and the UC signal are represented one under the other, on a grid with reference lines, as usually done for CTG reports printed on paper. Colors help identification of FHR and UC features. Automatic analysis is based on some fixed feature definitions provided by the FIGO guidelines, and on other arbitrary settings whose default values can be changed by the user. Eventually, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
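    As a rough, hypothetical illustration of the automatic analysis described above (not the CTG Analyzer code, which is a MATLAB GUI), the Python sketch below estimates an FHR baseline with a moving median and flags accelerations using the common textbook criterion of at least 15 bpm above baseline for at least 15 s; the sampling rate, window length and synthetic data are illustrative assumptions.

      import numpy as np

      def fhr_baseline(fhr, fs=4.0, window_s=600):
          """Moving-median baseline of an FHR series sampled at fs Hz."""
          half = int(window_s * fs) // 2
          return np.array([np.median(fhr[max(0, i - half):i + half + 1])
                           for i in range(len(fhr))])

      def accelerations(fhr, baseline, fs=4.0, rise_bpm=15, min_dur_s=15):
          """Return (start, end) sample indices of candidate accelerations."""
          above = fhr - baseline >= rise_bpm
          events, start = [], None
          for i, flag in enumerate(above):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  if (i - start) / fs >= min_dur_s:
                      events.append((start, i))
                  start = None
          if start is not None and (len(above) - start) / fs >= min_dur_s:
              events.append((start, len(above)))
          return events

      # Synthetic example: a 140 bpm trace with one 30 s, +20 bpm bump.
      fs = 4.0
      t = np.arange(0, 20 * 60, 1 / fs)
      fhr = 140 + np.where((t > 300) & (t < 330), 20, 0) + np.random.randn(t.size)
      base = fhr_baseline(fhr, fs)
      print(accelerations(fhr, base, fs))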

  19. LTCP 2D Graphical User Interface. Application Description and User's Guide

    Science.gov (United States)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) 2-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  20. A Prototype Graphical User Interface for Co-op: A Group Decision Support System.

    Science.gov (United States)

    1992-03-01

    achieve their potential to communicate. Information-oriented, systematic graphic design is the use of typography, symbols, color, and other static and... application by reducing user effort and enhancing interaction. This thesis designs and develops a prototype Graphical User Interface (GUI) for Co-op... ORGANIZATION. II. INTERFACE DESIGN PRINCIPLES. A. GRAPHICAL USER INTERFACES. 1. Design Principles.

  1. MOO in Your Face: Researching, Designing, and Programming a User-Friendly Interface.

    Science.gov (United States)

    Haas, Mark; Gardner, Clinton

    1999-01-01

    Suggests that the learning curve of a multi-user, object-oriented domain (MOO) impedes effective use. Discusses use of an IBM/PC-compatible interface that allows developers to modify the interface to provide a sense of presence for the user. Concludes that work in programming a variety of interfaces has led to a more intuitive environment for…

  2. Incorporating Speech Recognition into a Natural User Interface

    Science.gov (United States)

    Chapa, Nicholas

    2017-01-01

    The Augmented/Virtual Reality (AVR) Lab has been working to study the applicability of recent virtual and augmented reality hardware and software to KSC operations. This includes the Oculus Rift, HTC Vive, Microsoft HoloLens, and the Unity game engine. My project in this lab is to integrate voice recognition and voice commands into an easy-to-modify system that can be added to an existing portion of a Natural User Interface (NUI). A NUI is an intuitive and simple-to-use interface incorporating visual, touch, and speech recognition. The inclusion of speech recognition capability will allow users to perform actions or make inquiries using only their voice. The simplicity of needing only to speak to control an on-screen object or enact some digital action means that any user can quickly become accustomed to using the system. Multiple programs were tested for use in a speech command and recognition system. Sphinx4 translates speech to text using a Hidden Markov Model (HMM) based language model, an acoustic model, and a word dictionary, running on Java. PocketSphinx has similar functionality to Sphinx4 but runs on C. However, neither of these programs was ideal, as building a Java or C wrapper slowed performance. The most suitable speech recognition system tested was the Unity Engine Grammar Recognizer. A Context Free Grammar (CFG) structure is written in an XML file to specify the structure of phrases and words that will be recognized by the Unity Grammar Recognizer. Using the Speech Recognition Grammar Specification (SRGS) 1.0 makes modifying the recognized combinations of words and phrases very simple and quick. With SRGS 1.0, semantic information can also be added to the XML file, which allows even more control over how spoken words and phrases are interpreted by Unity. Additionally, using a CFG with SRGS 1.0 produces Finite State Machine (FSM) functionality, limiting the potential for incorrectly heard words or phrases. The purpose of my project was to
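    The record's grammar-based recognizer is Unity/SRGS-specific; as a language-neutral illustration of the same finite-state idea, the hedged Python sketch below accepts only <verb> <object> phrases from hypothetical word lists, so anything outside the grammar is rejected rather than misheard.

      # Tiny command grammar of the form <verb> <object>; the word lists are
      # invented placeholders, not KSC vocabulary.
      VERBS = {"open", "close", "rotate", "select"}
      OBJECTS = {"panel", "hatch", "valve", "menu"}

      def parse_command(utterance):
          """Return (verb, object) if the utterance fits the grammar, else None."""
          words = utterance.lower().split()
          if len(words) == 2 and words[0] in VERBS and words[1] in OBJECTS:
              return words[0], words[1]
          return None

      for text in ("Open hatch", "rotate valve", "please open the hatch"):
          print(text, "->", parse_command(text))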

  3. A user interface for mobile robotized tele-echography

    Energy Technology Data Exchange (ETDEWEB)

    Triantafyllidis, G.A. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece)]. E-mail: gatrian@iti.gr; Thomos, N. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece); Canero, C. [Computer Vision Center, UAB, Barcelona (Spain); Vieyres, P. [Laboratoire Vision and Robotique Universite d' Orleans, Bourges (France); Strintzis, M.G. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece)

    2006-12-20

    Ultrasound imaging allows the evaluation of the degree of emergency of a patient. However, in many situations no experienced sonographer is available to perform such an echography. To cope with this issue, the 'mObile Tele-Echography using an ultra-Light rObot' (OTELO) project aims to develop a fully integrated end-to-end mobile tele-echography system using an ultralight, remotely controlled six degree-of-freedom (DOF) robot. In this context, this paper deals with the user interface environment of the OTELO system, composed of the following parts: an ultrasound video transmission system providing real-time images of the scanned area at each moment, an audio/video conference to communicate with the paramedical assistant and the patient, and finally a virtual reality environment providing visual and haptic feedback to the expert, while capturing the expert's hand movements with a one-DOF hand-free input device.

  4. Architecture of collaborating frameworks simulation, visualisation, user interface and analysis

    CERN Document Server

    Pfeier, A; Ferrero-Merlino, B; Giannitrapani, R; Longo, F; Nieminen, P; Pia, M G; Santin, G

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  5. A user interface for mobile robotized tele-echography

    International Nuclear Information System (INIS)

    Triantafyllidis, G.A.; Thomos, N.; Canero, C.; Vieyres, P.; Strintzis, M.G.

    2006-01-01

    Ultrasound imaging allows the evaluation of the degree of emergency of a patient. However, in many situations no experienced sonographer is available to perform such an echography. To cope with this issue, the 'mObile Tele-Echography using an ultra-Light rObot' (OTELO) project aims to develop a fully integrated end-to-end mobile tele-echography system using an ultralight, remotely controlled six degree-of-freedom (DOF) robot. In this context, this paper deals with the user interface environment of the OTELO system, composed of the following parts: an ultrasound video transmission system providing real-time images of the scanned area at each moment, an audio/video conference to communicate with the paramedical assistant and the patient, and finally a virtual reality environment providing visual and haptic feedback to the expert, while capturing the expert's hand movements with a one-DOF hand-free input device

  6. siGnum: graphical user interface for EMG signal analysis.

    Science.gov (United States)

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals, which represent the electrical activity of muscles, can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes a novel graphical user interface (GUI), siGnum, developed in MATLAB, that applies efficient and effective techniques to the processing of raw EMG signals and decomposes them in a simpler manner. It can be used independently of the MATLAB software by employing a deployment tool. This would enable researchers to gain a good understanding of the EMG signal and its analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.
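    For orientation only (this is not siGnum, which is a MATLAB GUI), a minimal EMG processing chain of the kind such a tool automates can be sketched in Python with NumPy/SciPy: band-pass filtering, full-wave rectification and a low-pass envelope. The cut-off frequencies, filter order and sampling rate are illustrative assumptions.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def emg_envelope(raw, fs=1000.0, band=(20.0, 450.0), lp=6.0):
          """Band-pass, rectify and low-pass a raw EMG trace."""
          nyq = fs / 2.0
          b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
          filtered = filtfilt(b, a, raw)
          rectified = np.abs(filtered)
          b_lp, a_lp = butter(4, lp / nyq, btype="low")
          return filtfilt(b_lp, a_lp, rectified)

      # Synthetic example: background noise with one burst simulating a contraction.
      fs = 1000.0
      t = np.arange(0, 2.0, 1 / fs)
      raw = 0.05 * np.random.randn(t.size)
      raw[500:1500] += 0.5 * np.random.randn(1000)
      print("peak envelope amplitude:", emg_envelope(raw, fs).max())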

  7. ShelXle: a Qt graphical user interface for SHELXL.

    Science.gov (United States)

    Hübschle, Christian B; Sheldrick, George M; Dittrich, Birger

    2011-12-01

    ShelXle is a graphical user interface for SHELXL [Sheldrick, G. M. (2008). Acta Cryst. A64, 112-122], currently the most widely used program for small-molecule structure refinement. It combines an editor with syntax highlighting for the SHELXL-associated .ins (input) and .res (output) files with an interactive graphical display for visualization of a three-dimensional structure including the electron density (F(o)) and difference density (F(o)-F(c)) maps. Special features of ShelXle include intuitive atom (re-)naming, a strongly coupled editor, structure visualization in various mono and stereo modes, and a novel way of displaying disorder extending over special positions. ShelXle is completely compatible with all features of SHELXL and is written entirely in C++ using the Qt4 and FFTW libraries. It is available at no cost for Windows, Linux and Mac-OS X and as source code.

  8. Quantifying Quality Aspects of Multimodal Interactive Systems

    CERN Document Server

    Kühnel, Christine

    2012-01-01

    This book systematically addresses the quantification of quality aspects of multimodal interactive systems. The conceptual structure is based on a schematic view on human-computer interaction where the user interacts with the system and perceives it via input and output interfaces. Thus, aspects of multimodal interaction are analyzed first, followed by a discussion of the evaluation of output and input and concluding with a view on the evaluation of a complete system.

  9. User interface issues in radiotherapy CAD software. 116

    International Nuclear Information System (INIS)

    Sherouse, G.W.; Mosher, C.E. Jr.

    1987-01-01

    A major growth area in computerized planning of radiotherapy over the last five years has been the development of programs for treatment design based on the interactive graphic display of three-dimensional patient models. This work has been recognized as being a close relative of Computer-Aided Design (CAD) software, which is used in industry for a broad spectrum of engineering and design applications. A CAD system for radiotherapy has been constructed which combines some of the more attractive aspects of other institutions' treatment design systems with CAD technology and a user interface which is designed for use by clinicians. The importance of an appropriate user interface cannot be overemphasized and is often sadly neglected in radiotherapy software. In order to realize the potential of this new treatment design technology it is essential that clinicians, particularly physicians, perceive RT CAD systems as tools rather than as obstacles. It is felt that the best way to achieve that perception is for the software to implement a superset of the functions used in conventional practice while retaining the ambiance of the traditional methods. The aim is to construct a system which an experienced physician or dosimetrist can use with essentially no retraining. The model for this CAD tool is that of a virtual simulator. It is meant to faithfully reproduce both the function and the feel of a physical simulator. A number of techniques have been identified for realizing this goal. These include simple intuitive input devices, natural coordinate systems, graphic interaction, good interactive responsiveness, and high-quality 3D display modalities. Here a description of these techniques and some details of their implementation are presented. 10 refs.; 2 figs

  10. SPIKY: a graphical user interface for monitoring spike train synchrony.

    Science.gov (United States)

    Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa

    2015-05-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.
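    As a hedged, simplified illustration of what such synchrony measures compute (a time-sampled approximation of the ISI-distance idea, not the SPIKY implementation; the exact piecewise-constant definitions are in the papers cited by the record), consider the following Python sketch.

      import numpy as np

      def current_isi(spikes, t):
          """Length of the inter-spike interval of `spikes` that contains time t."""
          spikes = np.asarray(spikes)
          before = spikes[spikes <= t]
          after = spikes[spikes > t]
          if before.size == 0 or after.size == 0:
              return np.nan          # outside the recorded interval
          return after[0] - before[-1]

      def isi_distance(train_a, train_b, t_start, t_end, n_samples=2000):
          """Average of 1 - min/max of the two current ISIs over sample times."""
          ts = np.linspace(t_start, t_end, n_samples)
          vals = []
          for t in ts:
              xa, xb = current_isi(train_a, t), current_isi(train_b, t)
              if np.isnan(xa) or np.isnan(xb):
                  continue
              vals.append(1.0 - min(xa, xb) / max(xa, xb))
          return float(np.mean(vals)) if vals else 0.0

      # Identical trains give 0; a slightly jittered copy gives a small value.
      a = np.arange(0.1, 10.0, 0.5)
      b = np.sort(a + 0.05 * np.random.randn(a.size))
      print(isi_distance(a, a, 0.5, 9.5), isi_distance(a, b, 0.5, 9.5))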

  11. Multi-Modal Traveler Information System - GCM Corridor Architecture Interface Control Requirements

    Science.gov (United States)

    1997-10-31

    The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

  12. User-based representation of time-resolved multimodal public transportation networks.

    Science.gov (United States)

    Alessandretti, Laura; Karsai, Márton; Gauvin, Laetitia

    2016-07-01

    Multimodal transportation systems, with several coexisting services like bus, tram and metro, can be represented as time-resolved multilayer networks where the different transportation modes connecting the same set of nodes are associated with distinct network layers. Their quantitative description became possible recently due to openly accessible datasets describing the geo-localized transportation dynamics of large urban areas. Advancements call for novel analytics, which combines earlier established methods and exploits the inherent complexity of the data. Here, we provide a novel user-based representation of public transportation systems, which combines representations, accounting for the presence of multiple lines and reducing the effect of spatial embeddedness, while considering the total travel time, its variability across the schedule, and taking into account the number of transfers necessary. After the adjustment of earlier techniques to the novel representation framework, we analyse the public transportation systems of several French municipal areas and identify hidden patterns of privileged connections. Furthermore, we study their efficiency as compared to the commuting flow. The proposed representation could help to enhance resilience of local transportation systems to provide better design policies for future developments.
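    A toy sketch of the user-centred, time-resolved view described above (invented timetable data, not the French datasets used in the record): an earliest-arrival scan over timetabled connections that also counts transfers between lines.

      # Connections: (departure_stop, arrival_stop, dep_time, arr_time, line).
      connections = sorted([
          ("A", "B", 10, 18, "bus-1"),
          ("B", "C", 20, 27, "metro-M"),
          ("A", "C", 12, 40, "tram-T"),
          ("C", "D", 30, 35, "metro-M"),
      ], key=lambda c: c[2])

      def earliest_arrival(source, start_time):
          """Simplified connection scan: stop -> (arrival, last line, transfers)."""
          best = {source: (start_time, None, 0)}
          for dep, arr, t_dep, t_arr, line in connections:
              if dep in best and best[dep][0] <= t_dep:
                  _, last_line, transfers = best[dep]
                  if last_line not in (None, line):
                      transfers += 1
                  if arr not in best or t_arr < best[arr][0]:
                      best[arr] = (t_arr, line, transfers)
          return best

      for stop, (t, line, k) in earliest_arrival("A", 9).items():
          print(f"{stop}: arrive {t} via {line}, {k} transfer(s)")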

  13. Apatia multimodal iatrogênica Multimodal apathy: a unique effect of antidepressant therapy at the neurological-psychiatric interface

    Directory of Open Access Journals (Sweden)

    Ricardo de Oliveira-Souza

    1996-06-01

    Full Text Available The present work documents a peculiar effect of antidepressants in five patients - apathy - defined as the inability to experience emotions. The recognition of apathy in the course of antidepressant treatment should raise the possibility of an iatrogenic effect and lead to withdrawal of the antidepressant in use. We emphasize that apathy must be differentiated from abulia and avolition, with which it is commonly confused. We document that emotional indifference can be confined to a single sensory domain ("unimodal apathy") or, as in our cases, extend to more than one modality ("multimodal apathy"). Anterobasal circuits, centered on the amygdala and the temporal pole, are strong candidates for integrating emotional experience with the mental images and multimodal perceptions of the environment, since the main projection systems of the forebrain converge on them while they occupy strategic positions for modulating the prefrontal and parieto-temporo-occipital cortex. The fact that apathy was produced by chemically distinct classes, such as SSRIs (selective serotonin reuptake inhibitors), MAOIs (reversible monoamine oxidase inhibitors) and tricyclics, indicates that the underlying pathophysiology is due to some action shared by these drugs at the subneuronal level. Intervention in cerebral serotonergic circuits seems the most adequate mechanism to explain this effect.

  14. Natural user interface as a supplement of the holographic Raman tweezers

    Science.gov (United States)

    Tomori, Zoltan; Kanka, Jan; Kesa, Peter; Jakl, Petr; Sery, Mojmir; Bernatova, Silvie; Antalik, Marian; Zemánek, Pavel

    2014-09-01

    Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps via the mouse or joystick. Several attempts have appeared recently to exploit touch tablets, 2D cameras or the Kinect game console instead. We proposed a multimodal "Natural User Interface" (NUI) approach integrating hand tracking, gesture recognition, eye tracking and speech recognition. For this purpose we exploited the low-cost "Leap Motion" and "MyGaze" sensors and a simple speech recognition program, "Tazti". We developed our own NUI software which processes signals from the sensors and sends control commands to the HRT, which subsequently controls the positions of the trapping beams, the micropositioning stage and the acquisition system for Raman spectra. The system allows various modes of operation suited to specific tasks. Virtual tools (called "pin" and "tweezers") used for manipulating particles are displayed on a transparent "overlay" window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus. Laser trap manipulation navigated by the dominant hand can be combined with gesture recognition of the secondary hand. Speech command recognition is useful when both hands are busy. The proposed methods make manual control of the HRT more efficient and they are also a good platform for its future semi-automated and fully automated operation.
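    The control loop sketched below is only a schematic Python illustration of the multimodal dispatching idea (the sensor-reading functions are hypothetical stand-ins for the Leap Motion, MyGaze and Tazti interfaces, which are not reproduced here).

      def read_fingertip():            # placeholder for a hand-tracking query
          return (120.0, 85.0)

      def read_speech_command():       # placeholder for a recognizer result
          return "grab"

      class TrapController:
          """Keeps a list of trap positions and reacts to fused commands."""
          def __init__(self):
              self.traps = []

          def handle(self, command, position):
              if command == "grab":
                  self.traps.append(position)     # new trap at the fingertip
              elif command == "release" and self.traps:
                  self.traps.pop()
              return self.traps

      controller = TrapController()
      print(controller.handle(read_speech_command(), read_fingertip()))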

  15. Model for Educational Game Using Natural User Interface

    Directory of Open Access Journals (Sweden)

    Azrulhizam Shapi’i

    2016-01-01

    Full Text Available Natural User Interface (NUI) is a new approach that has become increasingly popular in Human-Computer Interaction (HCI). This technology is widely used in almost all sectors, including the field of education. In recent years, many educational games using NUI technology, such as Kinect games, have appeared on the market. Kinect is a sensor that can recognize body movements, postures, and voices in three dimensions. It enables users to control and interact with a game without the need for a game controller. However, the contents of most existing Kinect games do not follow the standard classroom curriculum, so they do not fully achieve the learning objectives. Hence, this research proposes a design model as a guideline for designing educational games using NUI. A prototype has been developed as one of the objectives of this study. The prototype is based on the proposed model in order to assess the effectiveness of the model. The outcomes of this study conclude that the proposed model contributes to the design method for the development of educational games using NUI. Furthermore, evaluation results of the prototype show a good response from participants and are in line with the standard curriculum.

  16. GoPhast: a graphical user interface for PHAST

    Science.gov (United States)

    Winston, Richard B.

    2006-01-01

    GoPhast is a graphical user interface (GUI) for the USGS model PHAST. PHAST simulates multicomponent, reactive solute transport in three-dimensional, saturated, ground-water flow systems. PHAST can model both equilibrium and kinetic geochemical reactions. PHAST is derived from HST3D (flow and transport) and PHREEQC (geochemical calculations). The flow and transport calculations are restricted to constant fluid density and constant temperature. The complexity of the input required by PHAST makes manual construction of its input files tedious and error-prone. GoPhast streamlines the creation of the input file and helps reduce errors. GoPhast allows the user to define the spatial input for the PHAST flow and transport data file by drawing points, lines, or polygons on top, front, and side views of the model domain. These objects can have up to two associated formulas that define their extent perpendicular to the view plane, allowing the objects to be three-dimensional. Formulas are also used to specify the values of spatial data (data sets) both globally and for individual objects. Objects can be used to specify the values of data sets independent of the spatial and temporal discretization of the model. Thus, the grid and simulation periods for the model can be changed without respecifying spatial data pertaining to the hydrogeologic framework and boundary conditions. This report describes the operation of GoPhast and demonstrates its use with examples. GoPhast runs on Windows 2000, Windows XP, and Linux operating systems.

  17. UsiGesture: Test and Evaluation of an Environment for Integrating Gestures in User Interfaces

    OpenAIRE

    Beuvens, François; Vanderdonckt, Jean

    2014-01-01

    User interfaces allowing gesture recognition and manipulation have become more and more popular in recent years. It however remains a hard task for programmers to develop such interfaces: some knowledge of recognition systems is required, along with user experience and user interface management knowledge. It is often difficult for a single developer to master all this knowledge, which is why a team gathering different skills is usually needed. We previously presented a...

  18. Graphical User Interface Development for Representing Air Flow Patterns

    Science.gov (United States)

    Chaudhary, Nilika

    2004-01-01

    In the Turbine Branch, scientists carry out experimental and computational work to advance the efficiency and diminish the noise production of jet engine turbines. One way to do this is by decreasing the heat that the turbine blades receive. Most of the experimental work is carried out by taking a single turbine blade and analyzing the air flow patterns around it, because this data indicates the sections of the turbine blade that are getting too hot. Since the cost of doing turbine blade air flow experiments is very high, researchers try to do computational work that fits the experimental data. The goal of computational fluid dynamics is for scientists to find a numerical way to predict the complex flow patterns around different turbine blades without physically having to perform tests or costly experiments. When visualizing flow patterns, scientists need a way to represent the flow conditions around a turbine blade. A researcher will assign specific zones that surround the turbine blade. In a two-dimensional view, the zones are usually quadrilaterals. The next step is to assign boundary conditions which define how the flow enters or exits one side of a zone. What is needed is a program that provides a way of setting up computational zones and grids, visualizing flow patterns, and storing all the flow conditions in a file on the computer for future computation. Such a program is necessary because the only method for creating flow pattern graphs is by hand, which is tedious and time-consuming. By using a computer program to create the zones and grids, the graph would be faster to make and easier to edit. Basically, the user would run a program that is an editable graph. The user could click and drag with the mouse to form various zones and grids, then edit the locations of these grids, add flow and boundary conditions, and finally save the graph for future use and analysis. My goal this summer is to create a graphical user interface (GUI) that incorporates all of these elements. I am writing the program in

  19. A user interface development tool for space science systems Transportable Applications Environment (TAE) Plus

    Science.gov (United States)

    Szczur, Martha R.

    1990-01-01

    The Transportable Applications Environment Plus (TAE Plus), developed at NASA's Goddard Space Flight Center, is a portable What You See Is What You Get (WYSIWYG) user interface development and management system. Its primary objective is to provide an integrated software environment that allows interactive prototyping and development of user interfaces, as well as management of the user interface within the operational domain. Although TAE Plus is applicable to many types of applications, its focus is on supporting user interfaces for space applications. This paper discusses what TAE Plus provides and how the implementation has utilized state-of-the-art technologies within graphic workstations, windowing systems and object-oriented programming languages.

  20. Risk Issues in Developing Novel User Interfaces for Human-Computer Interaction

    KAUST Repository

    Klinker, Gudrun; Huber, Manuel; Tönnis, Marcus

    2014-01-01

    © 2014 Springer International Publishing Switzerland. All rights are reserved. When new user interfaces or information visualization schemes are developed for complex information processing systems, it is not readily clear how much they do, in fact, support and improve users' understanding and use of such systems. Is a new interface better than an older one? In what respect, and in which situations? To provide answers to such questions, user testing schemes are employed. This chapter reports on a range of risks pertaining to the design and implementation of user interfaces in general, and to newly emerging interfaces (3-dimensional, immersive, mobile) in particular.

  1. Glenn Reconfigurable User-interface and Virtual reality Exploration (GRUVE) Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The GRUVE (Glenn Reconfigurable User-interface and Virtual reality Exploration) Lab is a reconfigurable, large screen display facility at Nasa Glenn Research Center....

  2. Risk Issues in Developing Novel User Interfaces for Human-Computer Interaction

    KAUST Repository

    Klinker, Gudrun

    2014-01-01

    © 2014 Springer International Publishing Switzerland. All rights are reserved. When new user interfaces or information visualization schemes are developed for complex information processing systems, it is not readily clear how much they do, in fact, support and improve users' understanding and use of such systems. Is a new interface better than an older one? In what respect, and in which situations? To provide answers to such questions, user testing schemes are employed. This chapter reports on a range of risks pertaining to the design and implementation of user interfaces in general, and to newly emerging interfaces (3-dimensional, immersive, mobile) in particular.

  3. Some computer graphical user interfaces in radiation therapy.

    Science.gov (United States)

    Chow, James C L

    2016-03-28

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practice and research are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC), used in superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC), used in electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT), used in generating fluence maps for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP, used in the calculation of 3D dose distributions using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC), used in photon beam radiation therapy. One common feature of these GUIs is that the user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which do not have to be understood or noted by the user. The user only needs to input the required information, with help from graphical elements, in order to produce the desired results. SUPCALC is a superficial radiation treatment time calculator that uses the GUI technique to provide a convenient way for the radiation therapist to calculate the treatment time and keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing hand calculations according to pre-determined dosimetric tables, the clinical user needs only to input the required drawing of the electron field in a computer graphics file format, the prescription dose, and the beam parameters to EMUC to calculate the required monitor units for the electron beam treatment. EMUC is based on a semi-experimental sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator that generates a fluence map produced by a medical linear accelerator. This machine file controls

  4. BlindSense: An Accessibility-inclusive Universal User Interface for Blind People

    Directory of Open Access Journals (Sweden)

    A. Khan

    2018-04-01

    Full Text Available A large number of blind people use smartphone-based assistive technology to perform their common activities. In order to provide a better user experience, the existing user interface paradigm needs to be revisited. A new user interface model has been proposed in this paper. A simplified, semantically consistent, and blind-friendly adaptive user interface is provided. The proposed solution is evaluated through an empirical study with 63 blind people, demonstrating an improved user experience in performing common activities on a smartphone.

  5. Short report on the evaluation of a graphical user interface for radiation therapy planning systems

    International Nuclear Information System (INIS)

    Martin, M.B.

    1993-01-01

    Since their introduction, graphical user interfaces for computing applications have generally appealed more to users than command-line or menu interfaces. Benefits from using a graphical interface include ease of use, ease of understanding and increased productivity. For a radiation therapy planning application, an additional potential benefit is that the user regards the planning activity as a closer simulation of the real-world situation. A prototype radiation therapy planning system incorporating a graphical user interface was developed on an Apple Macintosh microcomputer. Its graphical interface was then evaluated by twenty-six participants. The results showed markedly that the features associated with a graphical user interface were preferred. 6 refs., 3 figs., 1 tab

  6. Development of a multimodal transportation educational virtual appliance (MTEVA) to study congestion during extreme tropical events.

    Science.gov (United States)

    2011-11-28

    In this study, a prototype Multimodal Transportation Educational Virtual Appliance (MTEVA) is developed to assist in transportation and cyberinfrastructure undergraduate education. This initial version of the MTEVA provides a graphical user interface...

  7. CATE 2016 Indonesia: Camera, Software, and User Interface

    Science.gov (United States)

    Kovac, S. A.; Jensen, L.; Hare, H. S.; Mitchell, A. M.; McKay, M. A.; Bosh, R.; Watson, Z.; Penn, M.

    2016-12-01

    The Citizen Continental-America Telescopic Eclipse (Citizen CATE) Experiment will use a fleet of 60 identical telescopes across the United States to image the inner solar corona during the 2017 total solar eclipse. For a proof of concept, five sites were hosted along the path of totality during the 2016 total solar eclipse in Indonesia. Tanjung Pandan, Belitung, Indonesia was the first site to experience totality. This site had the best seeing conditions and focus, resulting in the highest quality images. This site proved that the equipment that is going to be used is capable of recording high quality images of the solar corona. Because 60 sites will be funded, each setup needs to be cost-effective. This requires us to use an inexpensive camera, which consequently has a small dynamic range. To compensate for the corona's intensity drop-off factor of 1,000, images are taken at seven frames per second, at exposures of 0.4ms, 1.3ms, 4.0ms, 13ms, 40ms, 130ms, and 400ms. Using MATLAB software, we are able to capture a high dynamic range with an Arduino that controls the 2448 x 2048 CMOS camera. A major component of this project is to train average citizens to use the software, meaning it needs to be as user friendly as possible. The CATE team is currently working with MathWorks to create a graphical user interface (GUI) that will make data collection run smoothly. This interface will include tabs for alignment, focus, calibration data, drift data, GPS, totality, and a quick-look function. This work was made possible through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The NSO Training for 2017 Citizen CATE Experiment, funded by NASA (NASA NNX16AB92A), also provided support for this project. The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the NSF.
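    The merging step can be illustrated with a short hedged Python sketch (not the CATE MATLAB pipeline): each bracketed exposure is scaled to counts per millisecond and unsaturated pixels are averaged, which recovers both the bright inner and faint outer corona in one frame. The saturation threshold and the synthetic scene are assumptions.

      import numpy as np

      EXPOSURES_MS = [0.4, 1.3, 4.0, 13.0, 40.0, 130.0, 400.0]

      def merge_hdr(frames, exposures_ms, saturation=4000):
          """frames: list of 2-D arrays of raw counts, one per exposure."""
          num = np.zeros(frames[0].shape, dtype=np.float64)
          den = np.zeros(frames[0].shape, dtype=np.float64)
          for img, exp in zip(frames, exposures_ms):
              valid = img < saturation          # ignore saturated pixels
              num[valid] += img[valid] / exp    # counts per millisecond
              den[valid] += 1
          return num / np.maximum(den, 1)

      # Synthetic test: a radial intensity profile imaged at each exposure.
      y, x = np.mgrid[0:64, 0:64]
      scene = 3e3 / (1 + (x - 32) ** 2 + (y - 32) ** 2)     # counts per ms
      frames = [np.clip(scene * e, 0, 4095) for e in EXPOSURES_MS]
      hdr = merge_hdr(frames, EXPOSURES_MS)
      print("dynamic range recovered:", hdr.min(), "to", hdr.max())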

  8. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces.

    Science.gov (United States)

    Mortimer, Michael; Horan, Ben; Seyedmahmoudian, Mehdi

    2017-03-14

    The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future. In fact, by using the
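    A minimal sketch of the rule idea (shown in Python rather than the record's Matlab toolbox): joint types read from a URDF document, which is standard XML, are mapped to UI controls through a hypothetical rule table; only the URDF tag and attribute names are standard, everything else is assumed.

      import xml.etree.ElementTree as ET

      URDF = """<robot name="demo">
        <joint name="base_rotate" type="continuous"/>
        <joint name="arm_lift" type="prismatic"/>
        <joint name="gripper" type="revolute"/>
      </robot>"""

      JOINT_TO_WIDGET = {              # hypothetical rule set
          "continuous": "rotary dial",
          "revolute": "bounded slider",
          "prismatic": "linear slider",
          "fixed": None,
      }

      def ui_components(urdf_xml):
          """Derive a widget plan (joint name -> control) from joint types."""
          root = ET.fromstring(urdf_xml)
          plan = {}
          for joint in root.findall("joint"):
              widget = JOINT_TO_WIDGET.get(joint.get("type"))
              if widget:
                  plan[joint.get("name")] = widget
          return plan

      print(ui_components(URDF))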

  9. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces

    Directory of Open Access Journals (Sweden)

    Michael Mortimer

    2017-03-01

    Full Text Available The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future. In fact, by

  10. Optoelectronic polarimeter controlled by a graphical user interface of Matlab

    International Nuclear Information System (INIS)

    Vilardy, J M; Torres, R; Jimenez, C J

    2017-01-01

    We show the design and implementation of an optical polarimeter with electronic control. The polarimeter has software with a graphical user interface (GUI) that controls the optoelectronic setup and captures the optical intensity measurements, and finally this software evaluates the Stokes vector of a state of polarization (SOP) by means of the synchronous detection of optical waves. The proposed optoelectronic polarimeter can determine the Stokes vector of a SOP in a rapid and efficient way. Using the polarimeter proposed in this paper, students will be able to observe (on an optical bench) and understand the different interactions of the SOP when the optical waves pass through linear polarizers and retarder wave plates. The polarimeter prototype could be used as a main tool for students to learn the theoretical and experimental aspects of the SOP for optical waves via Stokes vector measurement. The proposed polarimeter controlled by a GUI of Matlab is more attractive and suitable for teaching and learning the polarization of optical waves. (paper)
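    The Stokes-vector evaluation such a GUI performs can be illustrated with the textbook six-measurement method (intensities behind a linear polarizer at 0°, 90°, 45° and 135° plus right- and left-circular analyzers); the short Python sketch below is a generic illustration, not the Matlab code from the record.

      import numpy as np

      def stokes_from_intensities(i0, i90, i45, i135, ircp, ilcp):
          s0 = i0 + i90
          s1 = i0 - i90
          s2 = i45 - i135
          s3 = ircp - ilcp
          return np.array([s0, s1, s2, s3])

      # Example: horizontally polarized light of unit intensity.
      S = stokes_from_intensities(1.0, 0.0, 0.5, 0.5, 0.5, 0.5)
      print("Stokes vector:", S)                       # -> [1. 1. 0. 0.]
      print("degree of polarization:",
            np.sqrt(S[1] ** 2 + S[2] ** 2 + S[3] ** 2) / S[0])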

  11. Motor dysfunction and touch-slang in user interface data.

    Science.gov (United States)

    Klein, Yoni; Djaldetti, Ruth; Keller, Yosi; Bachelet, Ido

    2017-07-05

    The recent proliferation in mobile touch-based devices paves the way for increasingly efficient, easy to use natural user interfaces (NUI). Unfortunately, touch-based NUIs might prove difficult, or even impossible to operate, in certain conditions e.g. when suffering from motor dysfunction such as Parkinson's Disease (PD). Yet, the prevalence of such devices makes them particularly suitable for acquiring motor function data, and enabling the early detection of PD symptoms and other conditions. In this work we acquired a unique database of more than 12,500 annotated NUI multi-touch gestures, collected from PD patients and healthy volunteers, that were analyzed by applying advanced shape analysis and statistical inference schemes. The proposed analysis leads to a novel detection scheme for early stages of PD. Moreover, our computational analysis revealed that young subjects may be using a 'slang' form of gesture-making to reduce effort and attention cost while maintaining meaning, whereas older subjects put an emphasis on content and precise performance.

  12. GCL – An Easy Way for Creating Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Mariusz Trzaska

    2011-02-01

    Full Text Available Graphical User Interfaces (GUI) can be created using several approaches. Besides using visual editors or manually written source code, it is possible to employ a declarative method. Such a solution usually allows working at a higher abstraction level, which saves the developers' time and reduces errors. The approach can follow many ideas. One of them is based on utilizing a Domain Specific Language (DSL). In this paper we present the results of our research concerning a DSL called GCL (GUI Creating Language). The prototype is implemented as a library for Java with an API emulating the syntax and semantics of a DSL. A programmer, using a few keywords, is able to create different types of GUIs, including forms, panels, dialogs, etc. The widgets of the GUI are built automatically during the run-time phase based on a given data instance (an ordinary Java object) and can optionally be customized by the programmer. The main contribution of our work is delivering a working library for a popular platform. The library could easily be ported to other programming languages such as MS C#.
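    The declarative idea (widgets derived from an ordinary data object) translates naturally to other languages; the following Python sketch is only an analogy to GCL, not its Java API, and the type-to-widget table is invented.

      from dataclasses import dataclass, fields

      @dataclass
      class Person:                    # an ordinary data object
          name: str = ""
          age: int = 0
          subscribed: bool = False

      WIDGET_FOR_TYPE = {str: "text field", int: "spin box", bool: "check box"}

      def build_form(instance):
          """Return a simple widget plan derived from the object's fields."""
          return [(f.name, WIDGET_FOR_TYPE.get(f.type, "text field"))
                  for f in fields(instance)]

      print(build_form(Person()))
      # -> [('name', 'text field'), ('age', 'spin box'), ('subscribed', 'check box')]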

  13. Lessons learned from the design and implementation of distributed post-WIMP user interfaces

    OpenAIRE

    Seifried, Thomas; Jetter, Hans-Christian; Haller, Michael; Reiterer, Harald

    2011-01-01

    Creating novel user interfaces that are “natural” and distributed is challenging for designers and developers. “Natural” interaction techniques are barely standardized and in combination with distributed UIs additional technical difficulties arise. In this paper we present the lessons we have learned in developing several natural and distributed user interfaces and propose design patterns to support development of such applications.

  14. Integrating User Interface and Personal Innovativeness into the TAM for Mobile Learning in Cyber University

    Science.gov (United States)

    Joo, Young Ju; Lee, Hyeon Woo; Ham, Yookyoung

    2014-01-01

    This study aims to add new variables, namely user interface, personal innovativeness, and satisfaction in learning, to Davis's technology acceptance model and also examine whether learners are willing to adopt mobile learning. Thus, this study attempted to explain the structural causal relationships among user interface, personal…

  15. User interface design principles for the SSM/PMAD automated power system

    International Nuclear Information System (INIS)

    Jakstas, L.M.; Myers, C.J.

    1991-01-01

    Computer-human interfaces are an integral part of developing software for spacecraft power systems. A well designed and efficient user interface enables an engineer to effectively operate the system, while it concurrently prevents the user from entering data which is beyond boundary conditions or performing operations which are out of context. A user interface should also be designed to ensure that the engineer easily obtains all useful and critical data for operating the system and is aware of all faults and states in the system. Martin Marietta, under contract to NASA George C. Marshall Space Flight Center, has developed a user interface for the Space Station Module Power Management and Distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data form the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined in this paper. An engineer's interactions with the system are also described

  16. Realism is not all! User engagement with task-related interface characters

    NARCIS (Netherlands)

    van Vugt, H.C.; Konijn, E.A.; Hoorn, J.F.; Eliëns, A.P.W.; Keur, I.

    2007-01-01

    Human-like characters in the interface may evoke social responses in users, and literature suggests that realism is the most important factor herein. However, the effects of interface characters on the user are not well understood. We developed an integrative framework, called I-PEFiC, to explain

  17. Developing adaptive user interfaces using a game-based simulation environment

    NARCIS (Netherlands)

    Brake, G.M. te; Greef, T.E. de; Lindenberg, J.; Rypkema, J.A.; Smets-Noor, N.J.J.M.

    2006-01-01

    In dynamic settings, user interfaces can provide more optimal support if they adapt to the context of use. Providing adaptive user interfaces to first responders may therefore be fruitful. A cognitive engineering method that incorporates development iterations in both a simulated and a real-world

  18. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    NARCIS (Netherlands)

    Snellenburg, J.J.; Laptenok, S.; Seger, R.; Mullen, K.M.; van Stokkum, I.H.M.

    2012-01-01

    In this work the software application called Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for the interaction with data, the

  19. Influence of Learning Styles on Graphical User Interface Preferences for e-Learners

    Science.gov (United States)

    Dedic, Velimir; Markovic, Suzana

    2012-01-01

    Implementing Web-based educational environment requires not only developing appropriate architectures, but also incorporating human factors considerations. User interface becomes the major channel to convey information in e-learning context: a well-designed and friendly enough interface is thus the key element in helping users to get the best…

  20. SWATMOD-PREP: Graphical user interface for preparing coupled SWAT-modflow simulations

    Science.gov (United States)

    This paper presents SWATMOD-Prep, a graphical user interface that couples a SWAT watershed model with a MODFLOW groundwater flow model. The interface is based on a recently published SWAT-MODFLOW code that couples the models via mapping schemes. The spatial layout of SWATMOD-Prep guides the user t...

  1. VOILA 2015 Visualizations and User Interfaces for Ontologies and Linked Data : Proceedings of the International Workshop on Visualizations and User Interfaces for Ontologies and Linked Data

    OpenAIRE

    2015-01-01

    A picture is worth a thousand words, we often say, yet many areas are in demand of sophisticated visualization techniques, and the Semantic Web is not an exception. The size and complexity of ontologies and Linked Data in the Semantic Web constantly grow and the diverse backgrounds of the users and application areas multiply at the same time. Providing users with visual representations and intuitive user interfaces can significantly aid the understanding of the domains and knowledge represent...

  2. User interface design principles for the SSM/PMAD automated power system

    Science.gov (United States)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  3. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is open-source and powerful numerical software, but leaves much to be desired in terms of user friendliness. In this thesis the basic operation of OpenFOAM is introduced, and the work culminates in a graphical user interface written in PyQt. The graphical user interface will make the use of OpenFOAM simpler, and hopefully make this powerful tool more available for the gene...

  4. Guidelines for the integration of audio cues into computer user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.

    1985-06-01

    Throughout the history of computers, vision has been the main channel through which information is conveyed to the computer user. As the complexities of man-machine interactions increase, more and more information must be transferred from the computer to the user and then successfully interpreted by the user. A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby using the sense of "hearing" in the computer experience. This allows our visual and auditory capabilities to work naturally together in unison, leading to more effective and efficient interpretation of all information received by the user from the computer. This thesis presents an initial set of guidelines to assist interface developers in designing an effective sight and sound user interface. This study is a synthesis of various aspects of sound, human communication, computer-user interfaces, and psychoacoustics. We introduce the notion of an earcon. Earcons are audio cues used in the computer-user interface to provide information and feedback to the user about some computer object, operation, or interaction. A possible construction technique for earcons, the use of earcons in the interface, how earcons are learned and remembered, and the effects of earcons on their users are investigated. This study takes the point of view that earcons are a language and human/computer communication issue and are therefore analyzed according to the three dimensions of linguistics: syntactics, semantics, and pragmatics.
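    A concrete, minimal example of constructing an earcon (illustrative only; the pitch sequence and durations are arbitrary choices, not drawn from the report) is easy to give with the Python standard library: a short rising tone motif written to a WAV file.

      import math, struct, wave

      def write_earcon(path, notes, rate=44100, amplitude=0.4):
          """notes: list of (frequency_hz, duration_s) tuples played in order."""
          frames = bytearray()
          for freq, dur in notes:
              for n in range(int(rate * dur)):
                  sample = amplitude * math.sin(2 * math.pi * freq * n / rate)
                  frames += struct.pack("<h", int(sample * 32767))
          with wave.open(path, "wb") as w:
              w.setnchannels(1)
              w.setsampwidth(2)
              w.setframerate(rate)
              w.writeframes(bytes(frames))

      # A rising three-note motif that could, for example, signal "file saved".
      write_earcon("earcon_saved.wav", [(440, 0.12), (554, 0.12), (659, 0.2)])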

  5. fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages

    Science.gov (United States)

    Hoffmann, Thomas J.; Laird, Nan M.

    2009-01-01

    The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further facilitates the developer by loading in the help files from the command line functions to provide context sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
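    The same signature-driven idea can be sketched outside R (this is an analogy in Python/Tkinter, not the fgui package itself): inspect a function's parameters, generate one entry widget per parameter, and wire a Run button to the call. The example assumes numeric arguments and uses an invented demo function.

      import inspect
      import tkinter as tk

      def make_gui(func):
          """Build a one-entry-per-parameter window for `func` (numeric args only)."""
          sig = inspect.signature(func)
          root = tk.Tk()
          root.title(func.__name__)
          entries = {}
          for row, (name, param) in enumerate(sig.parameters.items()):
              tk.Label(root, text=name).grid(row=row, column=0, sticky="w")
              entry = tk.Entry(root)
              if param.default is not inspect.Parameter.empty:
                  entry.insert(0, str(param.default))
              entry.grid(row=row, column=1)
              entries[name] = entry
          result = tk.StringVar(value="")
          def run():
              kwargs = {name: float(e.get()) for name, e in entries.items()}
              result.set(str(func(**kwargs)))
          tk.Button(root, text="Run", command=run).grid(row=len(entries), column=0)
          tk.Label(root, textvariable=result).grid(row=len(entries), column=1)
          root.mainloop()

      def body_mass_index(weight_kg=70.0, height_m=1.75):   # demo function
          return round(weight_kg / height_m ** 2, 1)

      if __name__ == "__main__":
          make_gui(body_mass_index)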

  6. Development and evaluation of nursing user interface screens using multiple methods.

    Science.gov (United States)

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  7. Designing distributed user interfaces for ambient intelligent environments using models and simulations

    OpenAIRE

    LUYTEN, Kris; VAN DEN BERGH, Jan; VANDERVELPEN, Chris; CONINX, Karin

    2006-01-01

    There is a growing demand for design support to create interactive systems that are deployed in ambient intelligent environments. Unlike traditional interactive systems, the wide diversity of situations these type of user interfaces need to work in require tool support that is close to the environment of the end-user on the one hand and provide a smooth integration with the application logic on the other hand. This paper shows how the model-based user interface development methodology can be ...

  8. Visibility Aspects Importance of User Interface Reception in Cloud Computing Applications with Increased Automation

    OpenAIRE

    Haxhixhemajli, Denis

    2012-01-01

    Visibility aspects of user interfaces are important; they deal with the crucial phase of human-computer interaction. They allow users to perform tasks while hiding the complexity of the system. Acceptance of new systems depends on how the visibility aspects of their user interfaces are presented. The eyes make the first contact with the appearance of any system, and this marks the very beginning of the human-application interaction. This study emphasizes that visibility aspects...

  9. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that the results may be graphed and compared to other values.
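
    The workflow the GUI automates (read parameter sets from Excel, run the simulation for each case, write the outputs back to Excel) can be sketched generically. The snippet below is a Python illustration under stated assumptions: run_start_box() is a hypothetical stand-in for the Simulink model, and the file names, column names and start-box limits are invented.

        # Sketch of the read-parameters / run-simulation / write-results workflow
        # the GUI automates. The Simulink model itself is not reproduced;
        # run_start_box() is a hypothetical stand-in, and all names and limits
        # below are assumptions made for illustration.
        import pandas as pd

        def run_start_box(heat_leak, mass_flow_rate):
            """Placeholder for the propellant-conditioning simulation."""
            temperature = 20.0 + 0.5 * heat_leak        # illustrative only
            pressure = 30.0 + 2.0 * mass_flow_rate      # illustrative only
            return temperature, pressure

        cases = pd.read_excel("test_cases.xlsx")        # assumed columns: heat_leak, mass_flow_rate
        results = []
        for _, row in cases.iterrows():
            temp, pres = run_start_box(row["heat_leak"], row["mass_flow_rate"])
            results.append({"heat_leak": row["heat_leak"],
                            "mass_flow_rate": row["mass_flow_rate"],
                            "temperature": temp, "pressure": pres,
                            "in_start_box": 15.0 <= temp <= 25.0 and 25.0 <= pres <= 40.0})

        pd.DataFrame(results).to_excel("start_box_results.xlsx", index=False)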

  10. Effect of EHR user interface changes on internal prescription discrepancies.

    Science.gov (United States)

    Turchin, A; Sawarkar, A; Dementieva, Y A; Breydo, E; Ramelson, H

    2014-01-01

    To determine whether specific design interventions (changes in the user interface (UI)) of an electronic health record (EHR) medication module are associated with an increase or decrease in the incidence of contradictions between the structured and narrative components of electronic prescriptions (internal prescription discrepancies). We performed a retrospective analysis of 960,000 randomly selected electronic prescriptions generated in a single EHR between 01/2004 and 12/2011. Internal prescription discrepancies were identified using a validated natural language processing tool with recall of 76% and precision of 84%. A multivariable autoregressive integrated moving average (ARIMA) model was used to evaluate the effect of five UI changes in the EHR medication module on the incidence of internal prescription discrepancies. Over the study period 175,725 (18.4%) prescriptions were found to have internal discrepancies. The highest rate of prescription discrepancies was observed in March 2006 (22.5%) and the lowest in March 2009 (15.0%). Addition of an "as directed" option to the dropdown decreased prescription discrepancies by 195 / month (p = 0.0004). A non-interruptive alert that reminded providers to ensure that structured and narrative components did not contradict each other decreased prescription discrepancies by 145 / month (p = 0.03). Addition of a "Renew / Sign" button to the Medication module (a negative control) did not have an effect on prescription discrepancies. Several UI changes in the electronic medication module were effective in reducing the incidence of internal prescription discrepancies. Further research is needed to identify interventions that can completely eliminate this type of prescription error and their effects on patient outcomes.
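
    The interrupted-time-series logic behind this kind of analysis can be sketched as follows: each UI change is encoded as a step regressor and supplied as an exogenous input to an ARIMA model of the monthly discrepancy counts. The snippet below is a minimal Python illustration with statsmodels on synthetic data and a single invented intervention date; it is not the authors' analysis code, and the model order is arbitrary.

        # Minimal interrupted-time-series sketch: an ARIMA model of monthly
        # discrepancy counts with a step regressor for one UI change.
        # Data, intervention date and model order are illustrative only.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        months = pd.date_range("2004-01", periods=96, freq="MS")
        rng = np.random.default_rng(0)
        baseline = 3000 + rng.normal(0, 120, size=96).cumsum() * 0.1
        step = (months >= "2008-06-01").astype(float)      # hypothetical UI change goes live
        counts = baseline - 195 * step + rng.normal(0, 80, size=96)

        series = pd.Series(counts, index=months)
        exog = pd.DataFrame({"ui_change": step}, index=months)

        model = SARIMAX(series, exog=exog, order=(1, 1, 1))
        fit = model.fit(disp=False)
        print(fit.params["ui_change"])   # estimated monthly shift after the change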

  11. Effect of EHR User Interface Changes on Internal Prescription Discrepancies

    Science.gov (United States)

    Sawarkar, A.; Dementieva, Y.A.; Breydo, E.; Ramelson, H.

    2014-01-01

    Summary Objective To determine whether specific design interventions (changes in the user interface (UI)) of an electronic health record (EHR) medication module are associated with an increase or decrease in the incidence of contradictions between the structured and narrative components of electronic prescriptions (internal prescription discrepancies). Materials and Methods We performed a retrospective analysis of 960,000 randomly selected electronic prescriptions generated in a single EHR between 01/2004 and 12/2011. Internal prescription discrepancies were identified using a validated natural language processing tool with recall of 76% and precision of 84%. A multivariable autoregressive integrated moving average (ARIMA) model was used to evaluate the effect of five UI changes in the EHR medication module on the incidence of internal prescription discrepancies. Results Over the study period 175,725 (18.4%) prescriptions were found to have internal discrepancies. The highest rate of prescription discrepancies was observed in March 2006 (22.5%) and the lowest in March 2009 (15.0%). Addition of an "as directed" option to the dropdown decreased prescription discrepancies by 195 / month (p = 0.0004). A non-interruptive alert that reminded providers to ensure that structured and narrative components did not contradict each other decreased prescription discrepancies by 145 / month (p = 0.03). Addition of a "Renew / Sign" button to the Medication module (a negative control) did not have an effect on prescription discrepancies. Conclusions Several UI changes in the electronic medication module were effective in reducing the incidence of internal prescription discrepancies. Further research is needed to identify interventions that can completely eliminate this type of prescription error and their effects on patient outcomes. PMID:25298811

  12. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    Science.gov (United States)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology in the last decade have changed the ways computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach that allows the user interface to be adapted at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits this description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting applications even to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.
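
    The distribution step can be pictured as matching the requirements of each user interface part against the capabilities of the interaction resources currently available in the environment. The snippet below is a deliberately small Python sketch of that matching idea, with invented devices, parts and rules; it is not the runtime architecture described in the paper.

        # Toy sketch of constraint-driven distribution of UI parts across devices
        # in a smart environment; device capabilities and rules are invented.
        devices = {
            "wall display": {"screen": "large", "input": None},
            "tablet": {"screen": "medium", "input": "touch"},
            "phone": {"screen": "small", "input": "touch"},
        }

        ui_parts = [
            {"name": "energy overview chart", "needs_screen": "large", "needs_input": None},
            {"name": "appliance controls", "needs_screen": "small", "needs_input": "touch"},
        ]

        def distribute(parts, devices):
            sizes = {"small": 0, "medium": 1, "large": 2}
            plan = {}
            for part in parts:
                candidates = [d for d, cap in devices.items()
                              if sizes[cap["screen"]] >= sizes[part["needs_screen"]]
                              and (part["needs_input"] is None
                                   or cap["input"] == part["needs_input"])]
                plan[part["name"]] = candidates[0] if candidates else "unassigned"
            return plan

        print(distribute(ui_parts, devices))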

  13. Development of a Mobile User Interface for Image-based Dietary Assessment.

    Science.gov (United States)

    Kim, Sungye; Schap, Tusarebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2010-12-31

    In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to a client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts, through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application to allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.
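
    The client-server exchange described above follows a familiar pattern: the mobile client uploads a food photograph and receives the estimated content of the meal in response. The sketch below illustrates that round trip in Python with the requests library; the endpoint URL and response fields are hypothetical and are not the project's actual API.

        # Hypothetical client-side sketch of the image-upload / calorie-estimate
        # round trip; the URL and JSON fields are assumptions, not the real API.
        import requests

        def estimate_meal(image_path, server="https://example.org/api/estimate"):
            with open(image_path, "rb") as f:
                response = requests.post(server, files={"image": f}, timeout=30)
            response.raise_for_status()
            return response.json()      # e.g. {"foods": [...], "kcal": 640}

        if __name__ == "__main__":
            print(estimate_meal("lunch.jpg"))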

  14. Comparing two anesthesia information management system user interfaces: a usability evaluation.

    Science.gov (United States)

    Wanderer, Jonathan P; Rao, Anoop V; Rothwell, Sarah H; Ehrenfeld, Jesse M

    2012-11-01

    Anesthesia information management systems (AIMS) have been developed by multiple vendors and are deployed in thousands of operating rooms around the world, yet not much is known about measuring and improving AIMS usability. We developed a methodology for evaluating AIMS usability in a low-fidelity simulated clinical environment and used it to compare an existing user interface with a revised version. We hypothesized that the revised user interface would be more useable. In a low-fidelity simulated clinical environment, twenty anesthesia providers documented essential anesthetic information for the start of the case using both an existing and a revised user interface. Participants had not used the revised user interface previously and completed a brief training exercise prior to the study task. All participants completed a workload assessment and a satisfaction survey. All sessions were recorded. Multiple usability metrics were measured. The primary outcome was documentation accuracy. Secondary outcomes were perceived workload, number of documentation steps, number of user interactions, and documentation time. The interfaces were compared and design problems were identified by analyzing recorded sessions and survey results. Use of the revised user interface was shown to improve documentation accuracy from 85.1% to 92.4%, a difference of 7.3% (95% confidence interval [CI] for the difference 1.8 to 12.7). The revised user interface decreased the number of user interactions by 6.5 for intravenous documentation (95% CI 2.9 to 10.1) and by 16.1 for airway documentation (95% CI 11.1 to 21.1). The revised user interface required 3.8 fewer documentation steps (95% CI 2.3 to 5.4). Airway documentation time was reduced by 30.5 seconds with the revised workflow (95% CI 8.5 to 52.4). There were no significant time differences noted in intravenous documentation or in total task time. No difference in perceived workload was found between the user interfaces. Two user interface

  15. Use of Design Patterns According to Hand Dominance in a Mobile User Interface

    Science.gov (United States)

    Al-Samarraie, Hosam; Ahmad, Yusof

    2016-01-01

    User interface (UI) design patterns for mobile applications provide a solution to design problems and can improve the usage experience for users. However, there is a lack of research categorizing the uses of design patterns according to users' hand dominance in a learning-based mobile UI. We classified the main design patterns for mobile…

  16. Mechanisms for collaboration: a design and evaluation framework for multi-user interfaces

    OpenAIRE

    Yuill, Nicola; Rogers, Yvonne

    2012-01-01

    Multi-user interfaces are said to provide “natural” interaction in supporting collaboration, compared to individual and noncolocated technologies. We identify three mechanisms accounting for the success of such interfaces: high awareness of others' actions and intentions, high control over the interface, and high availability of background information. We challenge the idea that interaction over such interfaces is necessarily “natural” and argue that everyday interaction involves constraints ...

  17. Monitoring and controlling ATLAS data management: The Rucio web user interface

    OpenAIRE

    Lassnig, Mario; Beermann, Thomas Alfons; Vigne, Ralph; Barisits, Martin-Stefan; Garonne, Vincent; Serfon, Cedric

    2015-01-01

    The monitoring and controlling interfaces of the previous data management system DQ2 followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2, and the increased volume of managed information. This interface encompasses both a monitoring and controlling component, and allows easy integration for user-generated views. The interface follows three des...

  18. The web-based user interface for EAST plasma control system

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, R.R., E-mail: rrzhang@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Xiao, B.J. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); School of Nuclear Science and Technology, University of Science and Technology of China, Anhui (China); Yuan, Q.P. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Yang, F. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Department of Computer Science, Anhui Medical University, Anhui (China); Zhang, Y. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Johnson, R.D.; Penaflor, B.G. [General Atomics, DIII-D National Fusion Facility, San Diego, CA (United States)

    2014-05-15

    The plasma control system (PCS) plays a vital role at EAST for fusion science experiments. Its software application consists of two main parts: an IDL graphical user interface for setting a large number of plasma parameters to specify each discharge, several programs for performing the real-time feedback control and managing the whole control system. The PCS user interface can be used from any X11 Windows client with privileged access to the PCS computer system. However, remote access to the PCS system via the IDL user interface becomes an extreme inconvenience due to the high network latency to draw or operate the interfaces. In order to realize lower latency for remote access to the PCS system, a web-based system has been developed for EAST recently. The setup data are retrieved from the PCS system and client-side JavaScript draws the interfaces into the user's browser. The user settings are also sent back to the PCS system for controlling discharges. These technologies allow the web-based user interface to be viewed by authorized users with a web browser and have it communicate with PCS server processes directly. It works together with the IDL interface and provides a new way to aid remote participation.

  19. The web-based user interface for EAST plasma control system

    International Nuclear Information System (INIS)

    Zhang, R.R.; Xiao, B.J.; Yuan, Q.P.; Yang, F.; Zhang, Y.; Johnson, R.D.; Penaflor, B.G.

    2014-01-01

    The plasma control system (PCS) plays a vital role at EAST for fusion science experiments. Its software application consists of two main parts: an IDL graphical user interface for setting a large number of plasma parameters to specify each discharge, several programs for performing the real-time feedback control and managing the whole control system. The PCS user interface can be used from any X11 Windows client with privileged access to the PCS computer system. However, remote access to the PCS system via the IDL user interface becomes an extreme inconvenience due to the high network latency to draw or operate the interfaces. In order to realize lower latency for remote access to the PCS system, a web-based system has been developed for EAST recently. The setup data are retrieved from the PCS system and client-side JavaScript draws the interfaces into the user's browser. The user settings are also sent back to the PCS system for controlling discharges. These technologies allow the web-based user interface to be viewed by authorized users with a web browser and have it communicate with PCS server processes directly. It works together with the IDL interface and provides a new way to aid remote participation

  20. Information visualization to user-friendly interface construction for information retrieval systems

    Directory of Open Access Journals (Sweden)

    Jessica Monique de Lira Vieira

    2011-10-01

    Full Text Available The information presented through visualization helps the Information Retrieval System (IRS) to reach its main goal: to retrieve relevant information that meets the informational needs of its users. The objective of this article is to describe and analyze techniques proposed in the Information Visualization area and interface models discussed in the Information Science literature which, applied to graphical interface construction, would facilitate the appropriation of information by the users of an IRS and would help them to search, browse and retrieve information. The methodology consists of a literature review focusing on the potential contribution of the visual representation of information to the development of user-friendly interfaces for IRS, as well as the identification and analysis of visualizations used as interfaces by IRS. The use of visualizations is of great importance in the communication between the IRS and its users, because information presented through visual representation is better understood by users and allows the discovery of new knowledge.

  1. User interface design in safety parameter display systems

    International Nuclear Information System (INIS)

    Schultz, E.E. Jr.; Johnson, G.L.

    1988-01-01

    The extensive installation of computerized Safety Parameter Display Systems (SPDSs) in nuclear power plants since the Three-Mile Island accident has enhanced plant safety. It has also raised new issues of how best to ensure an effective interface between human operators and the plant via computer systems. New developments in interface technologies since the current generation of SPDSs was installed can contribute to improving display interfaces. These technologies include new input devices, three-dimensional displays, delay indicators, and auditory displays. Examples of how they might be applied to improve current SPDSs are given. These examples illustrate how the new user interface technology could be applied to future nuclear plant displays.

  2. Personalization of XML Content Browsing Based on User Preferences

    Science.gov (United States)

    Encelle, Benoit; Baptiste-Jessel, Nadine; Sedes, Florence

    2009-01-01

    Personalization of user interfaces for browsing content is a key concept to ensure content accessibility. In this direction, we introduce concepts that result in the generation of personalized multimodal user interfaces for browsing XML content. User requirements concerning the browsing of a specific content type can be specified by means of…

  3. Bilinear modeling of EMG signals to extract user-independent features for multiuser myoelectric interface.

    Science.gov (United States)

    Matsubara, Takamitsu; Morimoto, Jun

    2013-08-01

    In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured. When different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose a bilinear model for EMG signals that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features. We can construct a motion classifier on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can extract the user-independent features from the novel user data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subject forearms. Our method resulted in 73% accuracy, which was statistically significantly different from the accuracy of standard nonmultiuser interfaces, based on a two-sample t-test at a significance level of 1%.
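
    The decomposition idea can be made concrete with a toy factorisation: model each observed feature vector as z[u, m] ≈ A[u] b[m], where A[u] is a user-dependent mixing matrix and b[m] is a motion-dependent code usable as a user-independent feature. The NumPy sketch below fits such a factorisation by alternating least squares on synthetic data; it illustrates the bilinear idea only and is not the estimation procedure used in the paper.

        # Toy sketch of a bilinear factorisation z[u, m] ≈ A[u] @ b[m]:
        # A[u] is a user-dependent mixing matrix, b[m] a motion-dependent code.
        # Alternating least squares is used here for illustration only.
        import numpy as np

        rng = np.random.default_rng(1)
        n_users, n_motions, n_channels, k = 4, 5, 8, 3
        A_true = rng.normal(size=(n_users, n_channels, k))
        b_true = rng.normal(size=(n_motions, k))
        Z = np.einsum("uck,mk->umc", A_true, b_true) \
            + 0.01 * rng.normal(size=(n_users, n_motions, n_channels))

        A = rng.normal(size=(n_users, n_channels, k))
        b = rng.normal(size=(n_motions, k))
        for _ in range(100):
            # Fix b, solve each user's mixing matrix by least squares.
            for u in range(n_users):
                A[u] = np.linalg.lstsq(b, Z[u], rcond=None)[0].T
            # Fix A, solve each motion's code using all users' data stacked.
            A_stack = A.reshape(-1, k)              # (n_users * n_channels, k)
            for m in range(n_motions):
                z_stack = Z[:, m, :].reshape(-1)    # matching stacked targets
                b[m] = np.linalg.lstsq(A_stack, z_stack, rcond=None)[0]

        residual = np.linalg.norm(np.einsum("uck,mk->umc", A, b) - Z)
        print(f"reconstruction residual: {residual:.4f}")   # small when the fit works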

  4. User interface development and metadata considerations for the Atmospheric Radiation Measurement (ARM) archive

    Science.gov (United States)

    Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.

    1993-01-01

    This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.

  5. Shortening User Interface Design Iterations through Realtime Visualisation of Design Actions on the Target Device

    OpenAIRE

    MESKENS, Jan; LUYTEN, Kris; CONINX, Karin

    2009-01-01

    In current mobile user interface design tools, it is time consuming to export a design to the target device. This makes it hard for designers to iterate over the user interfaces they are creating. We propose Gummy-live, a GUI builder for mobile devices allowing designers to test and observe immediately on the target device each step they take in the GUI builder. This way, designers are stimulated to iteratively test and refine user interface prototypes in order to take the target device charac...

  6. Age Based User Interface in Mobile Operating System

    OpenAIRE

    Sharma, Sumit; Sharma, Rohitt; Singh, Paramjit; Mahajan, Aditya

    2012-01-01

    This paper proposes the creation of different interfaces in the mobile operating system for different age groups. The different age groups identified are kids, elderly people and all others. The motive behind creating different interfaces is to make the smartphones of today's world usable to all age groups.

  7. Cognitive Awareness Prototype Development on User Interface Design

    Science.gov (United States)

    Rosli, D'oria Islamiah

    2015-01-01

    Human error is a crucial problem in manufacturing industries. Due to the misinterpretation of information on interface system design, accidents or death may occur at workplace. Lack of human cognition criteria in interface system design is also one of the contributions to the failure in using the system effectively. Therefore, this paper describes…

  8. Finding and Exploring Health Information with a Slider-Based User Interface.

    Science.gov (United States)

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon; Chang, Shanton

    2016-01-01

    Despite the fact that search engines are the primary channel to access online health information, there are better ways to find and explore health information on the web. Search engines are prone to problems when they are used to find health information. For instance, users have difficulties in expressing health scenarios with appropriate search keywords, search results are not optimised for medical queries, and the search process does not account for users' literacy levels and reading preferences. In this paper, we describe our approach to addressing these problems by introducing a novel design using a slider-based user interface for discovering health information without the need for precise search keywords. The user evaluation suggests that the interface is easy to use and able to assist users in the process of discovering new information. This study demonstrates the potential value of adopting slider controls in the user interface of health websites for navigation and information discovery.

  9. NFC-Based User Interface for Smart Environments

    OpenAIRE

    Susanna Spinsante; Ennio Gambi

    2015-01-01

    The physical support of a home automation system, joined with a simplified user-system interaction modality, may allow people affected by motor impairments or limitations, such as elderly and disabled people, to live safely and comfortably at home, by improving their autonomy and facilitating the execution of daily life tasks. The proposed solution takes advantage of the Near Field Communications technology, which is simple and intuitive to use, to enable advanced user interaction. The user c...

  10. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  11. In-situ Multimodal Imaging and Spectroscopy of Mg Electrodeposition at Electrode-Electrolyte Interfaces

    Science.gov (United States)

    Wu, Yimin A.; Yin, Zuwei; Farmand, Maryam; Yu, Young-Sang; Shapiro, David A.; Liao, Hong-Gang; Liang, Wen-I.; Chu, Ying-Hao; Zheng, Haimei

    2017-02-01

    We report the study of Mg cathodic electrochemical deposition on Ti and Au electrodes using a multimodal approach, examining the sample area in-situ with liquid cell transmission electron microscopy (TEM), scanning transmission X-ray microscopy (STXM) and X-ray absorption spectroscopy (XAS). Magnesium Aluminum Chloride Complex was synthesized and utilized as the electrolyte, and non-reversible features were observed during in situ charging-discharging cycles. During charging, a uniform Mg film was deposited on the electrode, which is consistent with the intrinsic non-dendritic nature of Mg deposition in Mg ion batteries. The Mg thin film did not dissolve during the following discharge process. Using in-situ STXM and XAS, we found that this Mg thin film consists of hexacoordinated Mg compounds. This study provides insights into the non-reversibility issue and failure mechanism of Mg ion batteries. Our approach also provides a generic method to understand in situ battery chemistry without further sample processing, which preserves the original nature of the battery or electrodeposited materials. This multimodal in situ imaging and spectroscopy offers many opportunities to attack complex problems that span orders of magnitude in length and time scale, and can be applied to a broad range of energy storage systems.

  12. Interface Prostheses With Classifier-Feedback-Based User Training.

    Science.gov (United States)

    Fang, Yinfeng; Zhou, Dalin; Li, Kairu; Liu, Honghai

    2017-11-01

    It is evident that user training significantly affects performance of pattern-recognition-based myoelectric prosthetic device control. Despite plausible classification accuracy on offline datasets, online accuracy usually suffers from the changes in physiological conditions and electrode displacement. The user's ability to generate consistent electromyographic (EMG) patterns can be enhanced via proper user training strategies in order to improve online performance. This study proposes a clustering-feedback strategy that provides real-time feedback to users by means of a visualized online EMG signal input as well as the centroids of the training samples, whose dimensionality is reduced to a minimal number by dimension reduction. Clustering feedback provides a criterion that guides users to adjust motion gestures and muscle contraction forces intentionally. The experiment results have demonstrated that hand motion recognition accuracy increases steadily along the progress of the clustering-feedback-based user training, while conventional classifier-feedback methods, i.e., label feedback, hardly achieve any improvement. The results suggest that the use of proper classifier feedback can accelerate the process of user training, and imply a promising future for amputees with limited or no experience in pattern-recognition-based prosthetic device manipulation.
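
    The feedback loop can be sketched as: reduce the training samples to a low-dimensional space, compute one centroid per gesture, and show each incoming sample relative to those centroids. The Python snippet below illustrates this with PCA on synthetic data; the actual feature extraction, dimension-reduction method and display used in the study are not reproduced.

        # Sketch of the centroid-feedback idea: project EMG training features to
        # 2-D, compute per-gesture centroids, and locate a new sample among them.
        # Data here is synthetic and purely illustrative.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)
        n_gestures, n_train, n_features = 5, 40, 16
        X = np.vstack([rng.normal(loc=g, scale=0.6, size=(n_train, n_features))
                       for g in range(n_gestures)])
        y = np.repeat(np.arange(n_gestures), n_train)

        pca = PCA(n_components=2).fit(X)
        low = pca.transform(X)
        centroids = np.array([low[y == g].mean(axis=0) for g in range(n_gestures)])

        def feedback(sample):
            """Return the nearest gesture centroid for a new EMG feature vector."""
            p = pca.transform(sample.reshape(1, -1))[0]
            distances = np.linalg.norm(centroids - p, axis=1)
            return int(np.argmin(distances)), distances

        new_sample = rng.normal(loc=3, scale=0.6, size=n_features)
        nearest, dists = feedback(new_sample)
        print("closest gesture centroid:", nearest)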

  13. The design and evaluation of an activity monitoring user interface for people with stroke.

    Science.gov (United States)

    Hart, Phil; Bierwirth, Rebekah; Fulk, George; Sazonov, Edward

    2014-01-01

    Usability is an important topic in the field of telerehabilitation research. Older users with disabilities, in particular, present age-related and disability-related challenges that should be accommodated in the design of a user interface for a telerehabilitation system. This paper describes the design, implementation, and assessment of a telerehabilitation system user interface that tries to maximize usability for an elderly user who has experienced a stroke. An Internet-connected Nintendo® Wii™ gaming system is selected as a hardware platform, and a server and website are implemented to process and display the feedback information. The usability of the interface is assessed with a trial consisting of 18 subjects: 10 healthy Doctor of Physical Therapy students and 8 people with a stroke. Results show similar levels of usability and high satisfaction with the gaming system interface from both groups of subjects.

  14. An Efficient User Interface Design for Nursing Information System Based on Integrated Patient Order Information.

    Science.gov (United States)

    Chu, Chia-Hui; Kuo, Ming-Chuan; Weng, Shu-Hui; Lee, Ting-Ting

    2016-01-01

    A user-friendly interface can enhance the efficiency of data entry, which is crucial for building a complete database. In this study, two user interfaces (a traditional pull-down menu vs. check boxes) are proposed and evaluated on medical records with fever medication orders by measuring the time for data entry, the steps for each data entry record, and the completion rate of each medical record. The results revealed that the time for data entry was reduced from 22.8 sec/record to 3.2 sec/record. The data entry procedure was also reduced from 9 steps in the traditional interface to 3 steps in the new one. In addition, the completeness of medical records increased from 20.2% to 98%. All these results indicate that the new user interface provides a more user-friendly and efficient approach to data entry than the traditional interface.

  15. Developing A Web-based User Interface for Semantic Information Retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Keller, Richard M.

    2003-01-01

    While there are now a number of languages and frameworks that enable computer-based systems to search stored data semantically, the optimal design for effective user interfaces for such systems is still unclear. Such interfaces should mask unnecessary query detail from users, yet still allow them to build queries of arbitrary complexity without significant restrictions. We developed a user interface supporting semantic query generation for SemanticOrganizer, a tool used by scientists and engineers at NASA to construct networks of knowledge and data. Through this interface users can select node types, node attributes and node links to build ad-hoc semantic queries for searching the SemanticOrganizer network.
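
    The kind of query such an interface assembles can be thought of as a structured object built from the user's selections of node type, attribute constraints and links, later translated into the repository's native query form. The Python sketch below is a generic illustration of that pattern; the query grammar shown is invented and is not SemanticOrganizer's query language.

        # Generic sketch of assembling a semantic query from UI selections
        # (node type, attribute filters, link constraints). The query grammar
        # printed here is illustrative, not an actual query language.
        from dataclasses import dataclass, field

        @dataclass
        class SemanticQuery:
            node_type: str
            attributes: dict = field(default_factory=dict)   # attribute -> required value
            links: list = field(default_factory=list)        # (link name, target node type)

            def to_text(self):
                parts = [f"FIND {self.node_type}"]
                parts += [f"WHERE {a} = {v!r}" for a, v in self.attributes.items()]
                parts += [f"LINKED-BY {name} TO {target}" for name, target in self.links]
                return "\n".join(parts)

        query = SemanticQuery("Experiment",
                              attributes={"status": "complete"},
                              links=[("producedBy", "Instrument")])
        print(query.to_text())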

  16. A Formal Approach to User Interface Design using Hybrid System Theory, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Optimal Synthesis Inc.(OSI) proposes to develop an aiding tool for user interface design that is based on mathematical formalism of hybrid system theory. The...

  17. An Object-Oriented Graphical User Interface for a Reusable Rocket Engine Intelligent Control System

    Science.gov (United States)

    Litt, Jonathan S.; Musgrave, Jeffrey L.; Guo, Ten-Huei; Paxson, Daniel E.; Wong, Edmond; Saus, Joseph R.; Merrill, Walter C.

    1994-01-01

    An intelligent control system for reusable rocket engines under development at NASA Lewis Research Center requires a graphical user interface to allow observation of the closed-loop system in operation. The simulation testbed consists of a real-time engine simulation computer, a controls computer, and several auxiliary computers for diagnostics and coordination. The system is set up so that the simulation computer could be replaced by the real engine and the change would be transparent to the control system. Because of the hard real-time requirement of the control computer, putting a graphical user interface on it was not an option. Thus, a separate computer used strictly for the graphical user interface was warranted. An object-oriented LISP-based graphical user interface has been developed on a Texas Instruments Explorer 2+ to indicate the condition of the engine to the observer through plots, animation, interactive graphics, and text.

  18. Responsive Graphical User Interface (ReGUI) and its Implementation in MATLAB

    OpenAIRE

    Mikulszky, Matej; Pocsova, Jana; Mojzisova, Andrea; Podlubny, Igor

    2017-01-01

    In this paper we introduce the responsive graphical user interface (ReGUI) approach to creating applications, and demonstrate how this approach can be implemented in MATLAB. The same general technique can be used in other programming languages.

  19. Graphical User Interface Tool Kit for Path-Based Network Policy Language

    National Research Council Canada - National Science Library

    Ekin, Tufan

    2002-01-01

    .... Two of the changes are related to the semantics of the language. A graphical user interface tool kit for creating, validating, archiving and compiling policies represented in PPL has been developed...

  20. Graphics server and action language interpreter greatly simplify the composition of a graphical user interface

    International Nuclear Information System (INIS)

    Mueller, R.

    1992-01-01

    A new control system based on a distributed computing environment is gradually being installed at BESSY, an 800 MeV storage ring dedicated to the generation of synchrotron light in the VUV and soft X-ray region. The new operator consoles are large, high-resolution, bitmap-oriented color graphics screens with mouse and keyboard. A new graphical user interface has been developed with a user interface management system. A graphics server completely encapsulates representational aspects, mediates between user interactions and application variables, and keeps graphical and application objects in a consistent state. Graphical representations, semantics of user interactions and interpreter instructions are defined in a database written in a simple and comprehensible user interface definition language. (R.P.) 7 refs.; 5 figs

  1. User interface to administrative DRMS within a distributed environment

    Science.gov (United States)

    Martin, L. D.; Kirk, R. D.

    1983-01-01

    The implementation of a data base management system (DBMS) in a communications office to control and report on leased communication service contracts is discussed. The system user executes online programs to update five files residing on a UNIVAC 1100/82 through the forms-mode features of the Tektronix 4025 terminal and an IMSAI 8080 microcomputer. The user can call up the appropriate form on the Tektronix 4025 screen and enter new data, update existing data, or discontinue service. Selective online printing of 40 reports is performed by the system user to satisfy management, budget, and bill-payment reporting requirements.

  2. Experimental evaluation of multimodal human computer interface for tactical audio applications

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.; Jovanov, E.; Oy, S.

    2002-01-01

    Mission critical and information overwhelming applications require careful design of the human computer interface. Typical applications include night vision or low visibility mission navigation, guidance through a hostile territory, and flight navigation and orientation. Additional channels of

  3. A Framework for Effective User Interface Design for Web-Based Electronic Commerce Applications

    Directory of Open Access Journals (Sweden)

    Justyna Burns

    2001-01-01

    Full Text Available Efficient delivery of relevant product information is increasingly becoming the central basis of competition between firms. The interface design represents the central component for successful information delivery to consumers. However, interface design for web-based information systems is probably more an art than a science at this point in time. Much research is needed to understand properties of an effective interface for electronic commerce. This paper develops a framework identifying the relationship between user factors, the role of the user interface and overall system success for web-based electronic commerce. The paper argues that web-based systems for electronic commerce have some similar properties to decision support systems (DSS and adapts an established DSS framework to the electronic commerce domain. Based on a limited amount of research studying web browser interface design, the framework identifies areas of research needed and outlines possible relationships between consumer characteristics, interface design attributes and measures of overall system success.

  4. Software programmable multi-mode interface for nuclear-medical imaging

    International Nuclear Information System (INIS)

    Zubal, I.G.; Rowe, R.W.; Bizais, Y.J.C.; Bennett, G.W.; Brill, A.B.

    1982-01-01

    An innovative multi-port interface allows gamma camera events (spatial coordinates and energy) to be acquired concurrently with a sampling of physiological patient data. The versatility of the interface permits all conventional static, dynamic, and tomographic imaging modes, in addition to multi-hole coded aperture acquisition. The acquired list mode data may be analyzed or gated on the basis of various camera, isotopic, or physiological parameters

  5. The high level programmer and user interface of the NSLS control system

    International Nuclear Information System (INIS)

    Tang, Y.N.; Smith, J.D.; Sathe, S.

    1993-01-01

    This paper presents the major components of the high level software in the NSLS upgraded control system. Both programmer and user interfaces are discussed. The use of the high-speed work stations, fast network communications, UNIX system, X-window and Motif have greatly changed and improved these interfaces

  6. Introducing a new open source GIS user interface for the SWAT model

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  7. Developing a Graphical User Interface for the ALSS Crop Planning Tool

    Science.gov (United States)

    Koehlert, Erik

    1997-01-01

    The goal of my project was to create a graphical user interface for a prototype crop scheduler. The crop scheduler was developed by Dr. Jorge Leon and Laura Whitaker for the ALSS (Advanced Life Support System) program. The addition of a system-independent graphical user interface to the crop planning tool will make the application more accessible to a wider range of users and enhance its value as an analysis, design, and planning tool. My presentation will demonstrate the form and functionality of this interface. This graphical user interface allows users to edit system parameters stored in the file system. Data on the interaction of the crew, crops, and waste processing system with the available system resources is organized and labeled. Program output, which is stored in the file system, is also presented to the user in performance-time plots and organized charts. The menu system is designed to guide the user through analysis and decision making tasks, providing some help if necessary. The Java programming language was used to develop this interface in hopes of providing portability and remote operation.

  8. Comparison and Evaluation of End-User Interfaces for Online Public Access Catalogs.

    Science.gov (United States)

    Zumer, Maja

    End-user interfaces for the online public access catalogs (OPACs) of OhioLINK, a system linking major university and research libraries in Ohio, and its 16 member libraries, accessible through the Internet, are compared and evaluated from the user-oriented perspective. A common, systematic framework was used for the scientific observation of the…

  9. The Design and Evaluation of a Front-End User Interface for Energy Researchers.

    Science.gov (United States)

    Borgman, Christine L.; And Others

    1989-01-01

    Reports on the Online Access to Knowledge (OAK) Project, which developed software to support end user access to a Department of Energy database based on the skill levels and needs of energy researchers. The discussion covers issues in development, evaluation, and the study of user behavior in designing an interface tailored to a special…

  10. Preparing for Future Learning with a Tangible User Interface: The Case of Neuroscience

    Science.gov (United States)

    Schneider, B.; Wallace, J.; Blikstein, P.; Pea, R.

    2013-01-01

    In this paper, we describe the development and evaluation of a microworld-based learning environment for neuroscience. Our system, BrainExplorer, allows students to discover the way neural pathways work by interacting with a tangible user interface. By severing and reconfiguring connections, users can observe how the visual field is impaired and,…

  11. A graphical user interface (gui) matlab program Synthetic_Ves For ...

    African Journals Online (AJOL)

    An interactive and robust computer program for 1D forward modeling of Schlumberger Vertical Electrical Sounding (VES) curves for multilayered earth models is presented. The Graphical User Interface (GUI) enabled software, written in MATLAB v.7.12.0.635 (R2011a), accepts user-defined geologic model parameters (i.e. ...

  12. Benefits of the use of natural user interfaces in water simulations

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Van Dam, A.; Jagers, B.

    2014-01-01

    The use of natural user interfaces instead of conventional ones has become a reality with the emergence of 3D motion sensing technologies. However, some problems are still unsolved (for example, no haptic or tactile feedback); so this technology requires careful evaluation before the users can

  13. Designing personal attentive user interfaces in the mobile public safety domain

    NARCIS (Netherlands)

    Streefkerk, J.W.; Esch van-Bussemakers, M.P.; Neerincx, M.A.

    2006-01-01

    In the mobile computing environment, there is a need to adapt the information and service provision to the momentary attentive state of the user, operational requirements and usage context. This paper proposes to design personal attentive user interfaces (PAUI) for which the content and style of

  14. Research and Development for an Operational Information Ecology: The User-System Interface Agent Project

    Science.gov (United States)

    Srivastava, Sadanand; deLamadrid, James

    1998-01-01

    The User System Interface Agent (USIA) is a special type of software agent which acts as the "middle man" between a human user and an information processing environment. USIA consists of a group of cooperating agents which are responsible for assisting users in obtaining information processing services intuitively and efficiently. Some of the main features of USIA include: (1) multiple interaction modes and (2) user-specific and stereotype modeling and adaptation. This prototype system provides us with a development platform towards the realization of an operational information ecology. In the first phase of this project we focused on the design and implementation of a prototype of the User-System Interface Agent (USIA). The second phase of USIA allows user interaction via a restricted query language as well as through a taxonomy of windows. In the third phase, the USIA system architecture was revised.

  15. Image as Interface : Consequences for Users of Museum Knowledge

    NARCIS (Netherlands)

    de Rijcke, Sarah; Beaulieu, Anne

    2011-01-01

    Photographs of objects are ubiquitous in the work and presentation of museums, whether in collection-management infrastructure or in Web-based communication. This article examines the use of images in these settings and traces how they function as interfaces and tools in the production of museum

  16. Towards linking user interface translation needs to lexicographic ...

    African Journals Online (AJOL)

    Lexikos 25 (AFRILEX-reeks/series 25: 2015): 136-150. Towards Linking ... interfaces are faced with new challenges, such as the use of existing words in new contexts or in ... ros Afrikaans–Engels/English–Afrikaans Dictionary (Du Plessis et al. ..... utterance (or text) near or adjacent to a unit which is the focus of attention [...].

  17. A case study on better iconographic design in electronic medical records' user interface.

    Science.gov (United States)

    Tasa, Umut Burcu; Ozcan, Oguzhan; Yantac, Asim Evren; Unluer, Ayca

    2008-06-01

    It is a known fact that there is a conflict between what users expect and what user interface designers create in the field of medical informatics, as in other fields of interface design. The objective of the study is to suggest, from the 'design art' perspective, a method for improving the usability of an electronic medical record (EMR) interface. The suggestion is based on the hypothesis that the user interface of an EMR should be iconographic. The proposed three-step method consists of a questionnaire survey on how hospital users perceive the concepts/terms that are going to be used in the EMR user interface. Icons associated with the terms are then designed by a designer, following a guideline prepared according to the results of the first questionnaire. Finally, the icons are presented back to the target group for validation. A case study was conducted with 64 medical staff and 30 professional designers for the first questionnaire, and with 30 medical staff for the second. In the second questionnaire 7.53 icons out of 10 were matched correctly with a standard deviation of 0.98. Also, all icons except three were matched correctly in at least 83.3% of the forms. The proposed method differs from the majority of previous studies, which are based on user requirements, by relying on user experiments instead. The study demonstrated that the user interface of EMRs should be designed according to a guideline that results from a survey of users' experiences of metaphoric perception of the terms.

  18. VAGUE: a graphical user interface for the Velvet assembler.

    Science.gov (United States)

    Powell, David R; Seemann, Torsten

    2013-01-15

    Velvet is a popular open-source de novo genome assembly software tool, which is run from the Unix command line. Most of the problems experienced by new users of Velvet revolve around constructing syntactically and semantically correct command lines, getting input files into acceptable formats and assessing the output. Here, we present Velvet Assembler Graphical User Environment (VAGUE), a multi-platform graphical front-end for Velvet. VAGUE aims to make sequence assembly accessible to a wider audience and to facilitate better usage amongst existing users of Velvet. VAGUE is implemented in JRuby and targets the Java Virtual Machine. It is available under an open-source GPLv2 licence from http://www.vicbioinformatics.com/. torsten.seemann@monash.edu.

  19. Draft User Functionalities and Interfaces of PN Services (Low-Fi Prototyping)

    DEFF Research Database (Denmark)

    Karamolegkos, P.; Larsen, J. E.; Larsen, Lars Bo

    2006-01-01

    Internal report of WP1 Task 4 activities from January 2006 to August 2006. This report describes the draft user functionalities and coming user interfaces for PN services. It is a working document to be handed over to WP1 Task1 and Task3 for guidelines on specification. State of the art usability...... and user experience, conceptual design work on the two pilot services, MAGNET.CARE and Nomadic@Work, is described....

  20. Four Principles for User Interface Design of Computerised Clinical Decision Support Systems

    DEFF Research Database (Denmark)

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

    emphasises a focus on how users interact with the system, a focus on how information is provided by the system, and four principles of interaction. The four principles for design of user interfaces for CDSS are summarised as four A's: All in one, At a glance, At hand and Attention. It is recommended that all four interaction principles be integrated in the design of user interfaces for CDSS, i.e. the model is an integrated model, which we suggest as a guide for interaction design when working to prevent medication errors.

  1. User productivity as a function of AutoCAD interface design.

    Science.gov (United States)

    Mitta, D A; Flores, P L

    1995-12-01

    Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, where user productivity under display tasks was higher than productivity under the draw and dimension tasks. Implications of these results are presented.
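
    Restating that definition symbolically (this is only the stated definition in formula form, not an additional result from the study): if C is the percentage of the drawing session completed correctly and T is the elapsed session time, then

        P = \frac{C}{T},

    so productivity P is expressed as percent of the session correctly completed per unit time.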

  2. A novel graphical user interface for ultrasound-guided shoulder arthroscopic surgery

    Science.gov (United States)

    Tyryshkin, K.; Mousavi, P.; Beek, M.; Pichora, D.; Abolmaesumi, P.

    2007-03-01

    This paper presents a novel graphical user interface developed for a navigation system for ultrasound-guided computer-assisted shoulder arthroscopic surgery. The envisioned purpose of the interface is to assist the surgeon in determining the position and orientation of the arthroscopic camera and other surgical tools within the anatomy of the patient. The user interface features real time position tracking of the arthroscopic instruments with an optical tracking system, and visualization of their graphical representations relative to a three-dimensional shoulder surface model of the patient, created from computed tomography images. In addition, the developed graphical interface facilitates fast and user-friendly intra-operative calibration of the arthroscope and the arthroscopic burr, capture and segmentation of ultrasound images, and intra-operative registration. A pilot study simulating the computer-aided shoulder arthroscopic procedure on a shoulder phantom demonstrated the speed, efficiency and ease-of-use of the system.

  3. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM modeling technique to improve the design of Electronic Medical Record (EMR user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
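
    The grouping step of a DSM-based analysis can be sketched as clustering over an element-interaction matrix: elements that exchange information, sit spatially adjacent, or are similar receive high weights and end up in the same screen group. The snippet below is a toy Python illustration using hierarchical clustering from SciPy; the element names, weights and clustering choice are invented and do not reproduce the matrix or algorithm of the study.

        # Toy DSM-style grouping: a symmetric interaction matrix over UI elements
        # is converted to a distance matrix and clustered hierarchically.
        # Weights and element names are invented for illustration.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        elements = ["drug name", "dose", "unit", "route", "frequency", "start date"]
        # Higher value = stronger coupling (information exchange / adjacency / similarity).
        dsm = np.array([
            [0, 3, 2, 1, 1, 0],
            [3, 0, 3, 1, 2, 0],
            [2, 3, 0, 1, 1, 0],
            [1, 1, 1, 0, 2, 1],
            [1, 2, 1, 2, 0, 1],
            [0, 0, 0, 1, 1, 0],
        ], dtype=float)

        distance = dsm.max() - dsm            # strong coupling -> small distance
        np.fill_diagonal(distance, 0.0)
        condensed = squareform(distance, checks=False)
        groups = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")

        for g in sorted(set(groups)):
            print(f"screen group {g}:", [e for e, gg in zip(elements, groups) if gg == g])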

  4. WIFIP: a web-based user interface for automated synchrotron beamlines.

    Science.gov (United States)

    Sallaz-Damaz, Yoann; Ferrer, Jean Luc

    2017-09-01

    The beamline control software, through the associated graphical user interface (GUI), is the user access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with limited bandwidth and from a large choice of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, a light, platform-independent control software and associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.

  5. Elckerlyc goes mobile - Enabling natural interaction in mobile user interfaces

    NARCIS (Netherlands)

    Klaassen, Randy; Hendrix, Jordi; Reidsma, Dennis; op den Akker, Hendrikus J.A.; van Dijk, Elisabeth M.A.G.; op den Akker, Harm

    The fast growth of computational resources and speech technology available on mobile devices makes it possible to engage users of these devices in a natural dialogue with service systems. These systems are sometimes perceived as social agents and this can be supported by presenting them on

  6. User interface considerations to prevent self-driving carsickness

    NARCIS (Netherlands)

    Diels, C.; Bos, J.E.

    2015-01-01

    Self-driving cars have the potential to bring significant benefits to drivers and society at large. However, all envisaged scenarios are predicted to increase the risk of motion sickness. This will negatively affect user acceptance and uptake and hence negate the benefits of this technology. Here we

  7. A virtual reality user interface for a design information system

    NARCIS (Netherlands)

    Coomans, M.K.D.

    1998-01-01

    The computer is a tool, a complex artefact that is used to extend our reach. A computer system can provide several kinds of services, but these services come with a supplementary task that the user must deal with: communicating with the computer system. We argued that Virtual Reality (VR)

  8. NFC-Based User Interface for Smart Environments

    Directory of Open Access Journals (Sweden)

    Susanna Spinsante

    2015-01-01

    The physical support of a home automation system, joined with a simplified user-system interaction modality, may allow people affected by motor impairments or limitations, such as elderly and disabled people, to live safely and comfortably at home, by improving their autonomy and facilitating the execution of daily life tasks. The proposed solution takes advantage of the Near Field Communications technology, which is simple and intuitive to use, to enable advanced user interaction. The user can perform normal daily activities, such as lifting a gate or closing a window, through a device enabled to read NFC tags containing the commands for the home automation system. A passive Smart Panel is implemented, composed of multiple Near Field Communications tags properly programmed, to enable the execution of both individual commands and so-called scenarios. The work compares several versions of the proposed Smart Panel, differing in the interrogation modality, the composition of a single command, the number of tags, and the dynamic user interaction model, for the same number of commands to issue. The main conclusions are drawn from the experimental results about the effective adoption of Near Field Communications in smart assistive environments.
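    As a rough sketch of the idea (not the authors' implementation), the snippet below maps text payloads read from NFC tags to home-automation commands and scenarios; the payload strings, command names, and the send_command function are hypothetical.

      # Hypothetical mapping from NFC tag payloads to home-automation actions
      COMMANDS = {
          "GATE_UP": ["lift_gate"],
          "WINDOW_CLOSE": ["close_window"],
          # A "scenario" tag expands into several individual commands
          "SCENARIO_NIGHT": ["close_window", "lower_blinds", "lights_off"],
      }

      def send_command(action: str) -> None:
          # Placeholder for the call into the home automation bus
          print(f"sending: {action}")

      def handle_tag(payload: str) -> None:
          """Execute the command(s) associated with a scanned tag payload."""
          for action in COMMANDS.get(payload.strip(), []):
              send_command(action)

      handle_tag("SCENARIO_NIGHT")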

  9. Model-driven Instrumentation of graphical user interfaces.

    NARCIS (Netherlands)

    Funk, M.; Hoyer, P.; Link, S.

    2009-01-01

    In today's continuously changing markets newly developed products often do not meet the demands and expectations of customers. Research on this problem identified a large gap between developer and user expectations. Approaches to bridge this gap are to provide the developers with better information

  10. Inventions on presenting textual items in Graphical User Interface

    OpenAIRE

    Mishra, Umakant

    2014-01-01

    Although a GUI largely replaces textual descriptions by graphical icons, the textual items are not completely removed. The textual items are inevitably used in window titles, message boxes, help items, menu items and popup items. Textual items are necessary for communicating messages that are beyond the limitation of graphical messages. However, it is necessary to harness the textual items on the graphical interface in such a way that they complement each other to produce the best effect. One...

  11. Spatial issues in user interface design from a graphic design perspective

    Science.gov (United States)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  12. The role of brand loyalty and social media in e-commerce interfaces: survey results and implications for user interfaces

    OpenAIRE

    Rigas, Dimitrios; Hussain, Hammad

    2015-01-01

    This paper explores the role of brand loyalty and social media in e-commerce interfaces. A survey of 118 respondents was conducted to address questions relating to online shopping and brand loyalty. The link between the frequency of access to and time spent on an e-commerce user interface and brand loyalty, gender and age profile differences, and the role of social media in branding and online shopping were analyzed. It was found that online loyalty differs from offline loyalty and l...

  13. Inventions on expressing emotions In Graphical User Interface

    OpenAIRE

    Mishra, Umakant

    2014-01-01

    The conventional GUI is more mechanical and does not recognize or communicate emotions. Modern GUIs try to infer the likely emotional state and personality of the user and communicate through a corresponding emotional state. Emotions are expressed in graphical icons, sounds, pictures and other means. Emotions are found to be useful especially in communication software, interactive learning systems, robotics and other adaptive environments. Various mechanisms have been develo...

  14. A multimodal interface device for online board games designed for sight-impaired people.

    Science.gov (United States)

    Caporusso, Nicholas; Mkrtchyan, Lusine; Badia, Leonardo

    2010-03-01

    Online games between remote opponents playing over computer networks are becoming a common activity of everyday life. However, computer interfaces for board games are usually based on the visual channel. For example, they require players to check their moves on a video display and interact by using pointing devices such as a mouse. Hence, they are not suitable for visually impaired people. The present paper discusses a multipurpose system that allows blind and deafblind people, in particular, to play chess or other board games over a network, thereby reducing the disability barrier. We describe and benchmark a prototype of a special interactive haptic device for online gaming providing a dual tactile feedback. The novel interface of this proposed device is able to guarantee not only a better game experience for everyone but also an improved quality of life for sight-impaired people.

  15. A Hybrid 2D/3D User Interface for Radiological Diagnosis.

    Science.gov (United States)

    Mandalika, Veera Bhadra Harish; Chernoglazov, Alexander I; Billinghurst, Mark; Bartneck, Christoph; Hurrell, Michael A; Ruiter, Niels de; Butler, Anthony P H; Butler, Philip H

    2018-02-01

    This paper presents a novel 2D/3D desktop virtual reality hybrid user interface for radiology that focuses on improving 3D manipulation required in some diagnostic tasks. An evaluation of our system revealed that our hybrid interface is more efficient for novice users and more accurate for both novice and experienced users when compared to traditional 2D only interfaces. This is a significant finding because it indicates, as the techniques mature, that hybrid interfaces can provide significant benefit to image evaluation. Our hybrid system combines a zSpace stereoscopic display with 2D displays, and mouse and keyboard input. It allows the use of 2D and 3D components interchangeably, or simultaneously. The system was evaluated against a 2D only interface with a user study that involved performing a scoliosis diagnosis task. There were two user groups: medical students and radiology residents. We found improvements in completion time for medical students, and in accuracy for both groups. In particular, the accuracy of medical students improved to match that of the residents.

  16. Development of a user interface style guide for the reactor protection system cabinet operator module

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Lee, Dong-Young; Lee, Jung-Woon

    2004-01-01

    The reactor protection system (RPS) plays the roles of generating the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach the predefined limits. A Korean project group is developing a new digitalized RPS, and the Cabinet Operator Module (COM) of the RPS is used for RPS integrity testing and monitoring by an equipment operator. A flat panel display (FPD) with a touch screen capability is provided as the main user interface for RPS operation. To support the RPS COM user interface design, that is, the FPD screen design, we developed a user interface style guide because the system designer could not properly deal with the many general human factors design guidelines. To develop the user interface style guide, the gathering of various design guidelines, a walk-through with a video recorder, guideline selection with respect to user interface design elements, determination of the properties of the design elements, discussions with system designers, and conversion of the properties into the screen design were carried out. This paper describes the process details and the findings in the course of the style guide development. (Author)

  17. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    Directory of Open Access Journals (Sweden)

    Katharine M. Mullen

    2012-06-01

    In this work the software application called Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for the interaction with data, the specification of models and viewing of analysis results. Instead, Glotaran provides a graphical user interface which features interactive and dynamic data inspection, easier (user-interface-assisted) model specification and interactive viewing of results. The interactivity component is especially helpful when working with large, multi-dimensional datasets as often result from time-resolved spectroscopy measurements, allowing the user to easily pre-select and manipulate data before analysis and to quickly zoom in to regions of interest in the analysis results. Glotaran has been developed on top of the NetBeans rich client platform and communicates with R through the Java-to-R interface Rserve. The background and the functionality of the application are described here. In addition, the design, development and implementation process of Glotaran is documented in a generic way.
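    Glotaran itself talks to a running R session through Rserve; as a much simpler illustration of driving R from another process (explicitly not Glotaran's actual mechanism), the Python sketch below shells out to the Rscript executable, assuming R is installed and on the PATH.

      import subprocess

      def run_r_expression(expr: str) -> str:
          """Evaluate an R expression in a fresh Rscript process and return its printed output."""
          result = subprocess.run(
              ["Rscript", "-e", expr],
              capture_output=True, text=True, check=True,
          )
          return result.stdout.strip()

      # Example: ask R for its version string
      print(run_r_expression("cat(R.version.string)"))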

  18. Evaluation of the user interface simplicity in the modern generation of mechanical ventilators.

    Science.gov (United States)

    Uzawa, Yoshihiro; Yamada, Yoshitsugu; Suzukawa, Masayuki

    2008-03-01

    We designed this study to evaluate the simplicity of the user interface in modern-generation mechanical ventilators. We hypothesized that different designs of the user interface could result in different rates of operational failures. The study used a crossover design and was conducted in a laboratory in a tertiary teaching hospital with twenty-one medical resident physicians who did not have operating experience with any of the selected ventilators. Four modern mechanical ventilators were selected: Dräger Evita XL, Maquet Servo-i, Newport e500, and Puritan Bennett 840. Each subject was requested to perform 8 tasks on each ventilator. Two objective variables (the number of successfully completed tasks without operational failures and the operational time) and the overall subjective rating of the ease of use, measured with a 100-mm visual analog scale, were recorded. The total percentage of operational failures made by all subjects, for all tasks, was 23%. There were significant differences in the rates of operational failures and operational time among the 4 ventilators. Subjects made more operational failures in setting up the ventilators and in making ventilator-setting changes than in reacting to alarms. The subjective feeling of the ease of use was also significantly different among the ventilators. The design of the user interface is relevant to the occurrence of operational failures. Our data indicate that ventilator designers could optimize the user-interface design to reduce operational failures; therefore, the basic user interface should be standardized among clinically used mechanical ventilators.

  19. MuSim, a Graphical User Interface for Multiple Simulation Programs

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Thomas [MUONS Inc., Batavia]; Cummings, Mary Anne [MUONS Inc., Batavia]; Johnson, Rolland [MUONS Inc., Batavia]; Neuffer, David [Fermilab]

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Currently supported simulation codes are G4beamline, MAD-X, and MCNP, with more to come. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  20. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    Science.gov (United States)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Due to this limitation, the CRS-Stack became an unpopular method, although applying it is a promising way to obtain better seismic sections with better reflector continuity and S/N ratio. After successful results obtained by testing the method on several seismic datasets belonging to oil companies in Indonesia, the idea arose to develop user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way. Rather than typing commands and module parameters, a GUI allows users to operate computer programs in a much simpler and easier manner, transforming the text-based interface into graphical icons and visual indicators and avoiding complicated seismic unix shell scripts. The Java Swing GUI library is used to develop this CRS-Stack GUI, and every shell script that represents a seismic processing step is invoked from the Java environment (a minimal sketch of this pattern is given below). Besides providing an interactive GUI to perform CRS-Stack processing, the CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is organized around input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system to guide the user in inputting the
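    The following Python sketch illustrates the general pattern of a wrapper invoking one shell script per processing step in sequence; the script names and working directory are hypothetical, and the real CRS-Stack GUI does this from Java Swing rather than Python.

      import subprocess
      from pathlib import Path

      # Hypothetical per-step shell scripts for the four-step CRS-Stack workflow
      WORKFLOW = [
          "01_automatic_cmp_stack.sh",
          "02_initial_crs_stack.sh",
          "03_optimized_crs_stack.sh",
          "04_crs_stack_supergather.sh",
      ]

      def run_workflow(project_dir: str) -> None:
          """Run each processing step in order, stopping on the first failure."""
          for script in WORKFLOW:
              script_path = Path(project_dir) / script
              print(f"running {script_path} ...")
              subprocess.run(["sh", str(script_path)], cwd=project_dir, check=True)

      # run_workflow("/data/seismic/project_x")   # example invocation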

  1. User interface on networked workstations for MFTF plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Renbarger, V.L.; Balch, T.R.

    1985-01-01

    A network of Sun-2/170 workstations is used to provide an interface to the MFTF-B Plasma Diagnostics System at Lawrence Livermore National Laboratory. The Plasma Diagnostics System (PDS) is responsible for control of MFTF-B plasma diagnostic instrumentation. An EtherNet Local Area Network links the workstations to a central multiprocessing system which furnishes data processing, data storage and control services for PDS. These workstations permit a physicist to command data acquisition, data processing, instrument control, and display of results. The interface is implemented as a metaphorical desktop, which helps the operator form a mental model of how the system works. As on a real desktop, functions are provided by sheets of paper (windows on a CRT screen) called worksheets. The worksheets may be invoked by pop-up menus and may be manipulated with a mouse. These worksheets are actually tasks that communicate with other tasks running in the central computer system. By making entries in the appropriate worksheet, a physicist may specify data acquisition or processing, control a diagnostic, or view a result

  2. Study on user interface of pathology picture archiving and communication system.

    Science.gov (United States)

    Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom

    2014-01-01

    It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS that takes user experience into account. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the obtained results, a user interface for the Pathology PACS was proposed. The hierarchical task analysis of the Pathology PACS identified 17 tasks: 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were identified and schematized. A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.

  3. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    Science.gov (United States)

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) seconds. Among usability survey respondents (24%, 12/50), our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools.
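    To make the automated-calculation idea concrete, here is a minimal sketch of the aggregation step only: it sums per-organ sub-scores already extracted from the EMR and flags an acute rise of two or more points, as in the Sepsis-3 criteria. The field names are hypothetical, and the clinical mapping from raw values to sub-scores (0-4 per organ system) is deliberately left out.

      # Hypothetical per-organ sub-scores (0-4 each) pulled from the EMR
      ORGAN_SYSTEMS = ["respiration", "coagulation", "liver", "cardiovascular", "cns", "renal"]

      def total_sofa(subscores: dict) -> int:
          """Sum the six organ-system sub-scores into a total SOFA score."""
          return sum(subscores.get(organ, 0) for organ in ORGAN_SYSTEMS)

      def sofa_increase_flag(baseline: dict, current: dict, threshold: int = 2) -> bool:
          """Flag an acute change in total SOFA of >= threshold points."""
          return total_sofa(current) - total_sofa(baseline) >= threshold

      baseline = {"respiration": 1, "cardiovascular": 1}
      current = {"respiration": 2, "coagulation": 1, "cardiovascular": 2, "renal": 1}
      print(total_sofa(current), sofa_increase_flag(baseline, current))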

  4. Non-visual Interfaces and Network Games for Blind Users

    OpenAIRE

    Ina, Satoshi

    2002-01-01

    Visually impaired people have difficulty with the communication of graphical information. It is even more difficult for them to work or play in cooperation with sighted people at a distance. We developed a non-visual access method to a graphical screen through the tactile and auditory senses, and applied it to network board/card games as a joint workspace for blind and sighted users via communication of image, sound, and voice. We took an "IGO" type boardgame and a card game "SEVENS" as sample subject...

  5. Java-based Graphical User Interface for MAVERIC-II

    Science.gov (United States)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II, (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control, 2) the environment models such as atmosphere and gravity, and 3) a simulation framework which is responsible for executing the vehicle and environment models and propagating the vehicle's states forward in time and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store in

  6. Towards a Playful User Interface for Home Entertainment Systems

    OpenAIRE

    Block, Florian; Schmidt, Albrecht; Villar, Nicolas; Gellersen, Hans

    2004-01-01

    In this paper we propose a tangible cube as an input device for playfully changing between different TV channels. First we consider several design approaches and compare them. Based on a cube that has embedded gravity sensing and wireless communication capabilities, a prototype is implemented. A 3D graphical representation of the cube is shown on the television screen. On each face of the cube a TV stream is rendered. The motion of the cube on the screen is connected to the rotation the user p...

  7. When soft controls get slippery: User interfaces and human error

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1998-01-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety

  8. Tangible User Interface and Mu Rhythm Suppression: The Effect of User Interface on the Brain Activity in Its Operator and Observer

    Directory of Open Access Journals (Sweden)

    Kazuo Isoda

    2017-03-01

    The intuitiveness of a tangible user interface (TUI) is not only for its operator. It is quite possible that this type of user interface (UI) can also have an effect on the experience and learning of observers who are just watching the operator using it. To understand the possible effect of TUI, the present study focused on the mu rhythm suppression in the sensorimotor area reflecting execution and observation of action, and investigated the brain activity both in its operator and observer. In the observer experiment, the effect of TUI on its observers was demonstrated through the brain activity. Although the effect of the grasping action itself was uncertain, the unpredictability of the result of the action seemed to have some effect on the mirror neuron system (MNS)-related brain activity. In the operator experiment, in spite of the same grasping action, the brain activity was activated in the sensorimotor area when UI functions were included (TUI). Such activation of the brain activity was not found with a graphical user interface (GUI) that has UI functions without grasping action. These results suggest that the MNS-related brain activity is involved in the effect of TUI, indicating the possibility of UI evaluation based on brain activity.

  9. Control system user interface for accelerator commissioning and operation

    International Nuclear Information System (INIS)

    Dobrott, D.; Keeley, D.; Kolte, G.; Mikic, Z.; Lee, M.; Corbett, J.; Howry, S.; King, A.

    1991-01-01

    An Interactive Accelerator Interface Module (AIM) has been developed in a workstation environment for the purposes of assisting in the commissioning and operation of any storage ring/collider system. The function of AIM is to integrate modeling and simulation codes into accelerator and beamline control systems for the purpose of rapid on-line data analysis and error-correction, resulting in significant time-saving. A system dependent module provides for the translation of specific control system data files to appropriate input format for application programs within AIM. Interactive screen graphics, including system function diagrams, menus, beamline element status and update information are standard in AIM. AIM is currently connected to the Stanford Linear Collider (SLC) control system, but is easily transportable to other facilities. This paper describes the development of AIM and its applications on SLC

  10. New and Old User Interface Metaphors in Music Production

    DEFF Research Database (Denmark)

    Walther-Hansen, Mads

    2017-01-01

    This paper outlines a theoretical framework for interaction with sound in music mixing. Using cognitive linguistic theory and studies exploring the spatiality of recorded music, it is argued that the logic of music mixing builds on three master metaphors—the signal flow metaphor, the sound stage metaphor and the container metaphor. I show how the metaphorical basis for interacting with sound in music mixing has changed with the development of recording technology, new aesthetic ideals and changing terminology. These changes are studied as expressions of underlying thought patterns that govern how music producers and engineers make sense of their actions. In conclusion, this leads to suggestions for a theoretical framework through which more intuitive music mixing interfaces may be developed in the future.

  11. User Interface Developed for Controls/CFD Interdisciplinary Research

    Science.gov (United States)

    1996-01-01

    The NASA Lewis Research Center, in conjunction with the University of Akron, is developing analytical methods and software tools to create a cross-discipline "bridge" between controls and computational fluid dynamics (CFD) technologies. Traditionally, the controls analyst has used simulations based on large lumping techniques to generate low-order linear models convenient for designing propulsion system controls. For complex, high-speed vehicles such as the High Speed Civil Transport (HSCT), simulations based on CFD methods are required to capture the relevant flow physics. The use of CFD should also help reduce the development time and costs associated with experimentally tuning the control system. The initial application for this research is the High Speed Civil Transport inlet control problem. A major aspect of this research is the development of a controls/CFD interface for non-CFD experts, to facilitate the interactive operation of CFD simulations and the extraction of reduced-order, time-accurate models from CFD results. A distributed computing approach for implementing the interface is being explored. Software being developed as part of the Integrated CFD and Experiments (ICE) project provides the basis for the operating environment, including run-time displays and information (data base) management. Message-passing software is used to communicate between the ICE system and the CFD simulation, which can reside on distributed, parallel computing systems. Initially, the one-dimensional Large-Perturbation Inlet (LAPIN) code is being used to simulate a High Speed Civil Transport type inlet. LAPIN can model real supersonic inlet features, including bleeds, bypasses, and variable geometry, such as translating or variable-ramp-angle centerbodies. Work is in progress to use parallel versions of the multidimensional NPARC code.

  12. A Mobile Phone User Interface for Image-Based Dietary Assessment.

    Science.gov (United States)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A; Boushey, Carol J; Delp, Edward J

    2014-02-02

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  13. A user-friendly, graphical interface for the Monte Carlo neutron optics code MCLIB

    International Nuclear Information System (INIS)

    Thelliez, T.; Daemen, L.; Hjelm, R.P.; Seeger, P.A.

    1995-01-01

    The authors describe a prototype of a new user interface for the Monte Carlo neutron optics simulation program MCLIB. At this point in its development the interface allows the user to define an instrument as a set of predefined instrument elements. The user can specify the intrinsic parameters of each element, its position and orientation. The interface then writes output to the MCLIB package and starts the simulation. The present prototype is an early development stage of a comprehensive Monte Carlo simulations package that will serve as a tool for the design, optimization and assessment of performance of new neutron scattering instruments. It will be an important tool for understanding the efficacy of new source designs in meeting the needs of these instruments
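    As an illustration of the "instrument as a set of predefined elements" idea described above, the sketch below models elements with intrinsic parameters, a position, and an orientation; the class names and fields are hypothetical and do not reflect the actual MCLIB data structures.

      from dataclasses import dataclass, field

      @dataclass
      class InstrumentElement:
          """One predefined element of a simulated neutron instrument (hypothetical schema)."""
          kind: str                                         # e.g. "guide", "chopper", "detector"
          parameters: dict = field(default_factory=dict)    # intrinsic parameters
          position: tuple = (0.0, 0.0, 0.0)                 # metres, in the instrument frame
          orientation: tuple = (0.0, 0.0, 0.0)              # Euler angles in degrees

      @dataclass
      class Instrument:
          name: str
          elements: list = field(default_factory=list)

      sans = Instrument("example_SANS", [
          InstrumentElement("guide", {"length_m": 10.0, "m_value": 2.0}),
          InstrumentElement("detector", {"width_m": 1.0}, position=(0.0, 0.0, 18.0)),
      ])
      print(len(sans.elements))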

  14. A mobile phone user interface for image-based dietary assessment

    Science.gov (United States)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A.; Boushey, Carol J.; Delp, Edward J.

    2014-02-01

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  15. Transportable Applications Environment (TAE) Plus - A NASA productivity tool used to develop graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for the application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.

  16. Evaluating user experience with respect to user expectations in brain-computer interface games

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, G.; Poel, Mannes; Müller-Putz, G.R.; Scherer, R.; Billinger, M.; Kreilinger, A.; Kaiser, V.; Neuper, C.

    Evaluating user experience (UX) with respect to previous experiences can provide insight into whether a product can positively affect a user's opinion about a technology. If it can, then we can say that the product provides a positive UX. In this paper we propose a method to assess the UX in BCI

  17. The Philosophy of User Interfaces in HELIO and the Importance of CASSIS

    Science.gov (United States)

    Bonnin, X.; Aboudarham, J.; Renié, C.; Csillaghy, A.; Messerotti, M.; Bentley, R. D.

    2012-09-01

    HELIO is a European project funded under FP7 (Project No. 238969). One of its goals as a Heliospheric Virtual Observatory is to provide easy access to many datasets scattered all over the world, in the fields of Solar physics, Heliophysics, and Planetary magnetospheres. The efficiency of such a tool is very much related to the quality of the user interface. The HELIO infrastructure is based on a Service Oriented Architecture (SOA), a network of standalone components, which allows four main types of interfaces: - The HELIO Front End (HFE) is a browser-based user interface, which offers centralized access to the main HELIO functionalities. In particular, it provides the possibility to reach data directly, or to refine the selection by determining observing characteristics, such as which instrument was observing at that time, which instrument was at this location, etc. - Many services/components provide their own standalone graphical user interface. While each of these interfaces can be accessed individually, they can also be connected together. - Most services also provide direct access for any tool through a public interface. A small Java library, called the Java API, simplifies this access by providing client stubs for services and shields the user from security, discovery and failover issues. - Workflow capabilities are available in HELIO, allowing complex combinations of queries over several services. We want the user to be able to navigate easily, according to their needs, through the various interfaces, and possibly use a specific one in order to make more dedicated queries. We will also emphasize the importance of the CASSIS project (Coordination Action for the integration of Solar System Infrastructure and Science) in encouraging the interoperability necessary to undertake scientific studies that span disciplinary boundaries. If related projects follow the guidelines being developed by CASSIS then using external resources with HELIO will be greatly simplified.

  18. Graphical user interfaces for McClellan Nuclear Radiation Center (MNRC)

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1998-01-01

    McClellan's Nuclear Radiation Center (MNRC) control console is in the process of being replaced due to spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and incorporating human factors during all stages of the graphical user interface (GUI) development and control console design

  19. Profex: a graphical user interface for the Rietveld refinement program BGMN

    OpenAIRE

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-01-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal str...

  20. Design of Flight Control Panel Layout using Graphical User Interface in MATLAB

    Science.gov (United States)

    Wirawan, A.; Indriyanto, T.

    2018-04-01

    This paper introduces the design of a Flight Control Panel (FCP) layout using the Graphical User Interface (GUI) capabilities of MATLAB. The FCP is the interface used to give commands to the simulation and to monitor model variables while the simulation is running. The commands accommodated by the FCP are the altitude command, the angle-of-sideslip command, the heading command, and the setting command for the turbulence model. The FCP was also designed to monitor the flight parameters while the simulation is running.

  1. The effect of egocentric body movements on users' navigation performance and spatial memory in zoomable user interfaces

    OpenAIRE

    Rädle, Roman; Jetter, Hans-Christian; Butscher, Simon; Reiterer, Harald

    2013-01-01

    We present two experiments examining the impact of navigation techniques on users’ navigation performance and spatial memory in a zoomable user interface (ZUI). The first experiment with 24 participants compared the effect of egocentric body movements with traditional multi-touch navigation. The results indicate a 47% decrease in path lengths and a 34% decrease in task time in favor of egocentric navigation, but no significant effect on users’ spatial memory immediately after a navigation tas...

  2. Mobile health IT: The effect of user interface and form factor on doctor-patient communication

    DEFF Research Database (Denmark)

    Alsos, Ole Andreas; Das, Anita; Svanæs, Dag

    2012-01-01

    ...re-establishment of eye contact, better verbal and non-verbal contact, more gesturing, good visibility of actions, and quick information retrieval. The digital information devices lacked many of these affordances; physicians’ actions were not visible for the patients, the user interfaces required much attention, gesturing was harder, and re-establishment of eye contact took more time. Physicians used the devices to display their actions to the patients. The analysis revealed that the findings were related to the user interface and form factor of the information devices, as well as the personal characteristics...

  3. US NDC Modernization Iteration E2 Prototyping Report: User Interface Framework

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Jennifer E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Palmer, Melanie A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Vickers, James Wallace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Voegtli, Ellen M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2014-12-01

    During the second iteration of the US NDC Modernization Elaboration phase (E2), the SNL US NDC Modernization project team completed follow-on Rich Client Platform (RCP) exploratory prototyping related to the User Interface Framework (UIF). The team also developed a survey of browser-based User Interface solutions and completed exploratory prototyping for selected solutions. This report presents the results of the browser-based UI survey, summarizes the E2 browser-based UI and RCP prototyping work, and outlines a path forward for the third iteration of the Elaboration phase (E3).

  4. Transportable Applications Environment (TAE) Plus: A NASA tool for building and managing graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1993-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's). TAE Plus supports the rapid prototyping of GUI's and allows applications to be ported easily between different platforms. This paper will discuss the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUI's easier for application developers. TAE Plus is being applied to many types of applications, and this paper discusses how it has been used both within and outside NASA.

  5. Examination of Color-Lighting Control System Using Colored Paper User Interface

    Directory of Open Access Journals (Sweden)

    Aida Hiroto

    2016-01-01

    In recent years, full-color LED lighting whose color can be changed among various colors such as red, green, and blue has appeared with the development of LED lighting. Color-lighting control can affect users, for example by helping them concentrate or relax, and is therefore expected to spread to various places such as homes, offices, and stations. However, color-lighting control is affected by disturbances such as daylight and displays when full-color LEDs are controlled indoors, and control through information devices becomes more difficult as information technology develops. I propose a Color-Lighting Control System using a Colored Paper User Interface (CLC/CPUI). The purpose of CLC/CPUI is that anyone can intuitively control full-color LED lighting. CLC/CPUI uses colored paper as the user interface by sensing the paper, and realizes the lighting color that the user demands by performing feedback control. I conducted an accuracy verification experiment of CLC/CPUI.
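    The feedback-control idea can be sketched as a loop that nudges the LED output toward the color sensed from the paper. The proportional update below is a minimal, hypothetical illustration; the sensor reading, gain, and RGB representation are assumptions, not details from the paper.

      def read_paper_color():
          """Placeholder for the color sensor reading of the presented paper (RGB, 0-255)."""
          return (200, 120, 40)

      def step_toward(current, target, gain=0.3):
          """One proportional control step: move each RGB channel a fraction of the remaining error."""
          return tuple(
              int(round(c + gain * (t - c)))
              for c, t in zip(current, target)
          )

      led = (0, 0, 0)
      for _ in range(10):            # iterate until the LED output settles near the target
          led = step_toward(led, read_paper_color())
      print(led)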

  6. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    Science.gov (United States)

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than 1 monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection between the new user interface and the user interface of a commercially available infusion pump. With the new user interface, we observed the number of programming errors reduced by 81%, the number of keystrokes per task reduced from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task reduced from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds and significantly less perceived workload. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite no experience with the new user interface and extensive experience with the existing interface. The number of programming errors and workload were reduced partly because it took less time and fewer keystrokes to program the pump when using the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  7. Multi-Touch Collaborative Gesture Recognition Based User Interfaces as Behavioral Interventions for Children with Autistic Spectrum Disorder: A Review

    Directory of Open Access Journals (Sweden)

    AHMED HASSAN

    2016-10-01

    This paper addresses UI (User Interface) designing based on multi-touch collaborative gesture recognition meant for ASD (Autism Spectrum Disorder) - affected children. The present user interfaces (in the context of behavioral interventions for Autism Spectrum disorder) are investigated in detail. Thorough comparison has been made among various groups of these UIs. Advantages and limitations of these interfaces are discussed and future directions for the design of such interfaces are suggested.

  8. Multi-Touch Collaborative Gesture Recognition Based User Interfaces as Behavioral Interventions for Children with Autistic Spectrum Disorder: A Review

    International Nuclear Information System (INIS)

    Hassan, A.; Shafi, M.; Khattak, M.

    2016-01-01

    This paper addresses UI (User Interface) designing based on multi-touch collaborative gesture recognition meant for ASD (Autism Spectrum Disorder) - affected children. The present user interfaces (in the context of behavioral interventions for Autism Spectrum disorder) are investigated in detail. Thorough comparison has been made among various groups of these UIs. Advantages and limitations of these interfaces are discussed and future directions for the design of such interfaces are suggested. (author)

  9. A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation.

    Science.gov (United States)

    Hussain, Jamil; Khan, Wajahat Ali; Hur, Taeho; Bilal, Hafiz Syed Muhammad; Bang, Jaehun; Hassan, Anees Ul; Afzal, Muhammad; Lee, Sungyoung

    2018-05-18

    The user experience (UX) is an emerging field in user research and design, and the development of UX evaluation methods presents a challenge for both researchers and practitioners. Different UX evaluation methods have been developed to extract accurate UX data. Among UX evaluation methods, the mixed-method approach of triangulation has gained importance. It provides more accurate and precise information about the user while interacting with the product. However, this approach requires skilled UX researchers and developers to integrate multiple devices, synchronize them, analyze the data, and ultimately produce an informed decision. In this paper, a method and system for measuring the overall UX over time using a triangulation method are proposed. The proposed platform incorporates observational and physiological measurements in addition to traditional ones. The platform reduces subjective bias and validates the user's perceptions, which are measured by different sensors, by objectifying the subjective nature of the user in the UX assessment. The platform additionally offers plug-and-play support for different devices and powerful analytics for obtaining insight into the UX across multiple participants.
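    One common way to get plug-and-play device support is a small plugin interface that each sensor adapter implements and registers against. The sketch below is a generic illustration of that pattern, not the architecture of the platform described above; all class and method names are hypothetical.

      from abc import ABC, abstractmethod

      class SensorAdapter(ABC):
          """Minimal contract every pluggable device adapter must satisfy."""

          name: str = "unnamed"

          @abstractmethod
          def read(self) -> dict:
              """Return one timestamped sample as a plain dictionary."""

      REGISTRY: dict = {}

      def register(adapter: SensorAdapter) -> None:
          REGISTRY[adapter.name] = adapter

      class HeartRateAdapter(SensorAdapter):
          name = "heart_rate"
          def read(self) -> dict:
              return {"t": 0.0, "bpm": 72}      # placeholder sample

      register(HeartRateAdapter())
      print({name: adapter.read() for name, adapter in REGISTRY.items()})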

  10. AutoNUI: 2nd Workshop on Automotive Natural User Interfaces

    OpenAIRE

    Pflegling, Bastian; Döring, Tanja; Alvarez, Ignacio; Kranz, Matthias; Weinberg, Garrett; Healey, Jennifer

    2012-01-01

    Natural user interfaces—generally based on gesture and speech interaction—are an increasingly hot topic in research and are already being applied in a multitude of commercial products. Most use cases currently involve consumer electronics devices like smart phones, tablets, TV sets, game consoles, or large-screen tabletop computers. Motivated by the latest results in those areas, our vision is to apply natural user interfaces, for example gesture and conversational speech interaction, to the a...

  11. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    Cloud virtualization technologies are becoming more and more prevalent, and cloud users usually encounter the problem of how to access the virtualized remote desktops easily over the web without requiring the installation of special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access a terminal running in our cloud platform from anywhere. We implemented a sketch of web interfac...

  12. Design of a Graphical User Interface for Virtual Reality with Oculus Rift

    OpenAIRE

    Silverhav, Robin

    2015-01-01

    Virtual reality is a concept that has existed for some time, but recent advances in the performance of commercial computers have led to the development of different commercial head-mounted displays, for example the Oculus Rift. With this growing interest in virtual reality, it is important to evaluate existing techniques used when designing user interfaces. In addition, it is also important to develop new techniques to be able to give the user the best experience when using virtual reality app...

  13. A Prototype User Interface for a Mobile Electronic Clinical Note Data Entry System

    OpenAIRE

    Zafar, Atif; Lehto, Mark; Kim, Jongseo

    2005-01-01

    Recent advances in mobile computing technologies have made electronic medical records (EMRs) on handheld devices an attractive possibility. However, data entry paradigms popular on desktop machines do not translate well to mobile devices [1,2]. Based on a review of the literature on mobile device usability [1–4], we built a prototype user interface for mobile EMRs and held focus groups with clinician users whose feedback provided useful insight about design choices, functionality and...

  14. pmx Webserver: A User Friendly Interface for Alchemistry.

    Science.gov (United States)

    Gapsys, Vytautas; de Groot, Bert L

    2017-02-27

    With the increase of available computational power and improvements in simulation algorithms, alchemical molecular dynamics based free energy calculations have developed into routine use. To further facilitate the usability of alchemical methods for amino acid mutations, we have developed a web based infrastructure for obtaining hybrid protein structures and topologies. The presented webserver allows amino acid mutation selection in five contemporary molecular mechanics force fields. In addition, a complete mutation scan with a user defined amino acid is supported. The output generated by the webserver is directly compatible with the Gromacs molecular dynamics engine and can be used with any alchemical free energy calculation setup. Furthermore, we present a database of input files and precalculated free energy differences for tripeptides approximating a disordered state of a protein, of particular use for protein stability studies. Finally, the usage of the webserver and its output is exemplified by performing an alanine scan and investigating the thermodynamic stability of the Trp cage mini protein. The webserver is accessible at http://pmx.mpibpc.mpg.de.

  15. OpenDolphin: presentation models for compelling user interfaces

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Shared applications run on the server. They still need a display, though, be it on the web or on the desktop. OpenDolphin introduces a shared presentation model to clearly differentiate between "what" to display and "how" to display. The "what" is managed on the server and is independent of the UI technology whereas the "how" can fully exploit the UI capabilities like the ubiquity of the web or the power of the desktop in terms of interactivity, animations, effects, 3D worlds, and local devices. If you run a server-centric architecture and still seek to provide the best possible user experience, then this talk is for you. About the speaker Dierk König (JavaOne Rock Star) works as a fellow for Canoo Engineering AG, Basel, Switzerland. He is a committer to many open-source projects including OpenDolphin, Groovy, Grails, GPars and GroovyFX. He is lead author of the "Groovy in Action" book, which is among ...

  16. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    Science.gov (United States)

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.
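    As an illustration of what the GUI automates, a speciation run can also be driven from a script: write an input file and invoke the PHREEQC executable on it. The sketch below assumes a command-line phreeqc binary that accepts input-file, output-file, and database arguments; the file paths and the minimal SOLUTION block are illustrative only.

      import subprocess
      from pathlib import Path

      # Minimal illustrative PHREEQC input: speciation of a simple NaCl solution
      input_text = """SOLUTION 1
          temp      25
          pH        7.0
          units     mmol/kgw
          Na        10
          Cl        10
      END
      """

      Path("example.pqi").write_text(input_text)

      # Assumed CLI convention: phreeqc <input> <output> <database>
      subprocess.run(
          ["phreeqc", "example.pqi", "example.pqo", "phreeqc.dat"],
          check=True,
      )
      print(Path("example.pqo").read_text()[:200])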

  17. User-customized brain computer interfaces using Bayesian optimization

    Science.gov (United States)

    Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali

    2016-04-01

    Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject’s brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
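    A minimal sketch of the hyper-parameter search described above, using scikit-optimize's Gaussian-process optimizer (assuming the skopt package is available); the search space, the objective, and the underlying classifier are placeholders, not the authors' actual pipeline.

      from skopt import gp_minimize
      from skopt.space import Integer, Real

      def cv_error(params):
          """Placeholder: train a motor-imagery classifier with these hyper-parameters
          and return its cross-validated error on one subject's EEG data."""
          band_low, band_high, t_start, t_len = params
          # Synthetic stand-in for a real cross-validation score
          return abs(band_low - 8) * 0.01 + abs(band_high - 30) * 0.005 + 0.1 * t_start + 0.05 * t_len

      search_space = [
          Integer(4, 14, name="band_low_hz"),     # lower edge of the EEG frequency band
          Integer(16, 40, name="band_high_hz"),   # upper edge of the EEG frequency band
          Real(0.0, 2.0, name="t_start_s"),       # start of the feature time interval
          Real(0.5, 3.0, name="t_len_s"),         # length of the feature time interval
      ]

      result = gp_minimize(cv_error, search_space, n_calls=30, random_state=0)
      print("best hyper-parameters:", result.x, "error:", round(result.fun, 3))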

  18. The computerised procedure system COPMA and its user interface

    International Nuclear Information System (INIS)

    Krogsaeter, M.; Larsen, J.; Nilsen, S.; Oewre, F.

    1990-01-01

    At the OECD Halden Reactor Project, the COPMA system has been developed in order to investigate whether procedures can be executed more safely and efficiently if they are computerised, i.e. if the operator uses a CRT-based system instead of written manuals. Procedures are entered in a procedure data base using PED, a procedure editor. Each procedure is given a textual as well as a graphical representation. For the textual representation, the language PROLA is used, a language which has been designed for simple procedure specification. The COPMA online system lets the operator execute procedures that are stored in the procedure data base. The operator interface is a screen divided into non-overlapping windows, each serving a different purpose. All commands to the system are given by moving a mouse device around and clicking buttons on top of the mouse. A procedure consists of steps, each step containing a number of instructions. The operator works on one activity at a time, an activity being a procedure instance. A graph shows the overall procedure (or activity) structure in a window and activity execution is traced in the graph. Another window shows the instructions of the step currently being executed. The operator steps through the activity by selecting whether and how to execute the listed instructions. COPMA can maintain the status of several activities in parallel, so that the operator can easily switch between different activities. COPMA is linked to a PWR nuclear simulator over Ethernet using the TCP/IP protocol. This gives a number of advantages as compared to conventional written procedures, especially the fact that COPMA can help collect data from the procedure data base automatically.
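
    The procedure/step/instruction structure and the parallel activities mentioned in this abstract can be pictured with a small, invented data model. The sketch below is only illustrative Python, not COPMA's PROLA representation; all class and field names are assumptions.

        # Illustrative data model for computerised procedures: a procedure is a
        # list of steps, each step a list of instructions, and several activities
        # (procedure instances) can be tracked in parallel. Names are invented.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Instruction:
            text: str
            done: bool = False

        @dataclass
        class Step:
            title: str
            instructions: List[Instruction]

        @dataclass
        class Activity:
            """One executing instance of a procedure."""
            name: str
            steps: List[Step]
            current: int = 0

            def current_step(self) -> Step:
                return self.steps[self.current]

            def complete_instruction(self, index: int) -> None:
                self.current_step().instructions[index].done = True
                # Advance once every instruction in the current step is confirmed.
                if all(i.done for i in self.current_step().instructions):
                    self.current = min(self.current + 1, len(self.steps) - 1)

        # Two activities maintained in parallel; the operator can switch freely.
        cooldown = Activity("cooldown", [Step("Reduce power", [Instruction("Lower rods")])])
        startup = Activity("startup", [Step("Checks", [Instruction("Verify valves"),
                                                       Instruction("Verify pumps")])])
        startup.complete_instruction(0)
        print(cooldown.current, startup.current)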

  19. User-customized brain computer interfaces using Bayesian optimization.

    Science.gov (United States)

    Bashashati, Hossein; Ward, Rabab K; Bashashati, Ali

    2016-04-01

    The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject's brain characteristics. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.

  20. A flexible user-interface for audiovisual presentation and interactive control in neurobehavioral experiments [v1; ref status: indexed, http://f1000r.es/wt

    Directory of Open Access Journals (Sweden)

    Christopher T Noto

    2013-01-01

    Full Text Available A major problem facing behavioral neuroscientists is a lack of unified, vendor-distributed data acquisition systems that allow stimulus presentation and behavioral monitoring while recording neural activity. Numerous systems perform one of these tasks well independently, but to our knowledge, a useful package with a straightforward user interface does not exist. Here we describe the development of a flexible, script-based user interface that enables customization for real-time stimulus presentation, behavioral monitoring and data acquisition. The experimental design can also incorporate neural microstimulation paradigms. We used this interface to deliver multimodal, auditory and visual (images or video) stimuli to a nonhuman primate and acquire single-unit data. Our design is cost-effective and works well with commercially available hardware and software. Our design incorporates a script, providing high-level control of data acquisition via a sequencer running on a digital signal processor to enable behaviorally triggered control of the presentation of visual and auditory stimuli. Our experiments were conducted in combination with eye-tracking hardware. The script, however, is designed to be broadly useful to neuroscientists who may want to deliver stimuli of different modalities using any animal model.
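
    As an illustration of the behaviorally triggered sequencing described above, the sketch below runs a trial list and presents each stimulus only once a behavioural condition (here a fixation check) is satisfied. The callback names and the timeout value are hypothetical placeholders, not the authors' DSP sequencer API.

        # Conceptual sketch of a script-driven trial sequencer: stimuli are
        # presented only when a behavioural condition (e.g. fixation) is met.
        # The callbacks are stand-ins, not the authors' sequencer interface.
        import time
        from typing import Callable, List, Tuple

        Trial = Tuple[str, str]  # (modality, stimulus id)

        def run_sequence(trials: List[Trial],
                         fixation_ok: Callable[[], bool],
                         present: Callable[[str, str], None],
                         timeout_s: float = 2.0) -> None:
            for modality, stim in trials:
                t0 = time.monotonic()
                # Wait for the behavioural trigger (e.g. gaze on fixation point).
                while not fixation_ok():
                    if time.monotonic() - t0 > timeout_s:
                        break          # abort this trial and move on
                    time.sleep(0.005)
                else:
                    present(modality, stim)

        # Toy usage with stand-in callbacks.
        run_sequence([("visual", "face01"), ("auditory", "tone440")],
                     fixation_ok=lambda: True,
                     present=lambda m, s: print(f"present {m} stimulus {s}"))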

  1. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    Science.gov (United States)

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to run RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.

  2. Multimodal Information Presentation for High-Load Human Computer Interaction

    NARCIS (Netherlands)

    Cao, Y.

    2011-01-01

    This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

  3. Enhancing the Gaming Experience Using 3D Spatial User Interface Technologies.

    Science.gov (United States)

    Kulshreshth, Arun; Pfeil, Kevin; LaViola, Joseph J

    2017-01-01

    Three-dimensional (3D) spatial user interface technologies have the potential to make games more immersive and engaging and thus provide a better user experience. Although technologies such as stereoscopic 3D display, head tracking, and gesture-based control are available for games, it is still unclear how their use affects gameplay and if there are any user performance benefits. The authors have conducted several experiments on these technologies in game environments to understand how they affect gameplay and how we can use them to optimize the gameplay experience.

  4. A Tabletop Board Game Interface for Multi-User Interaction with a Storytelling System

    NARCIS (Netherlands)

    Alofs, T.; Theune, Mariet; Swartjes, I.M.T.; Camurri, A.; Costa, C.

    2011-01-01

    The Interactive Storyteller is an interactive storytelling system with a multi-user tabletop interface. Our goal was to design a generic framework combining emergent narrative, where stories emerge from the actions of autonomous intelligent agents, with the social aspects of traditional board games.

  5. Experimental setup for evaluating an adaptive user interface for teleoperation control

    Science.gov (United States)

    Wijayasinghe, Indika B.; Peetha, Srikanth; Abubakar, Shamsudeen; Saadatzi, Mohammad Nasser; Cremer, Sven; Popa, Dan O.

    2017-05-01

    A vital part of human interactions with a machine is the control interface, which single-handedly could define the user satisfaction and the efficiency of performing a task. This paper elaborates on the implementation of an experimental setup to study an adaptive algorithm that can help the user better tele-operate the robot. The formulation of the adaptive interface and the associated learning algorithms are general enough to apply when the mapping between the user controls and the robot actuators is complex and/or ambiguous. The method uses a genetic algorithm to find the optimal parameters that produce the input-output mapping for teleoperation control. In this paper, we describe the experimental setup and associated results that were used to validate the adaptive interface on a differential drive robot with two different input devices: a joystick and a Myo gesture control armband. Results show that after the learning phase, the interface converges to an intuitive mapping that can help even inexperienced users drive the system to a goal location.
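
    As a hedged illustration of the genetic-algorithm idea (not the authors' code), the toy sketch below evolves a 2x2 linear mapping from user-input axes to robot velocity commands so that a simulated point robot reaches a goal. The encoding, fitness function, and all constants are invented for the example.

        # Toy genetic algorithm evolving a 2x2 linear mapping from user inputs
        # (e.g. joystick axes) to robot velocity commands. Encoding and fitness
        # are invented for illustration only.
        import numpy as np

        rng = np.random.default_rng(1)
        GOAL = np.array([1.0, 0.5])

        def simulate(mapping):
            """Drive a point robot for a few steps with a fixed 'user input'
            and return the final distance to the goal (lower is better)."""
            pos = np.zeros(2)
            user_input = np.array([0.8, 0.3])          # constant stick deflection
            for _ in range(20):
                cmd = mapping @ user_input              # mapped velocity command
                pos = pos + 0.05 * cmd
            return np.linalg.norm(pos - GOAL)

        def fitness(genome):
            return -simulate(genome.reshape(2, 2))      # higher is better

        pop = rng.normal(size=(30, 4))
        for gen in range(40):
            scores = np.array([fitness(g) for g in pop])
            parents = pop[np.argsort(scores)[::-1][:10]]        # truncation selection
            children = []
            while len(children) < len(pop):
                a, b = parents[rng.integers(10, size=2)]
                cut = rng.integers(1, 4)
                child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
                child += rng.normal(scale=0.1, size=4)          # Gaussian mutation
                children.append(child)
            pop = np.array(children)

        best = pop[np.argmax([fitness(g) for g in pop])]
        print("evolved mapping:\n", best.reshape(2, 2))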

  6. Social Benefits of a Tangible User Interface for Children with Autistic Spectrum Conditions

    Science.gov (United States)

    Farr, William; Yuill, Nicola; Raffle, Hayes

    2010-01-01

    Tangible user interfaces (TUIs) embed computer technology in graspable objects. This study assessed the potential of Topobo, a construction toy with programmable movement, to support social interaction in children with Autistic Spectrum Conditions (ASC). Groups of either typically developing (TD) children or those with ASC had group play sessions…

  7. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    Science.gov (United States)

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  8. Toward User Interfaces and Data Visualization Criteria for Learning Design of Digital Textbooks

    Science.gov (United States)

    Railean, Elena

    2014-01-01

    User interface and data visualisation criteria are central issues in digital textbook design. However, when applying mathematical modelling of the learning process to the analysis of the possible solutions, it can be observed that results differ. Mathematical learning views cognition on the basis of statistics and probability theory, graph…

  9. Information Practices and User Interfaces: Student Use of an iOS Application in Special Education

    Science.gov (United States)

    Demmans Epp, Carrie; McEwen, Rhonda; Campigotto, Rachelle; Moffatt, Karyn

    2016-01-01

    A framework connecting concepts from user interface design with those from information studies is applied in a study that integrated a location-aware mobile application into two special education classes at different schools; this application had two support modes (one general and one location specific). The five-month study revealed several…

  10. Moving towards the Assessment of Collaborative Problem Solving Skills with a Tangible User Interface

    Science.gov (United States)

    Ras, Eric; Krkovic, Katarina; Greiff, Samuel; Tobias, Eric; Maquil, Valérie

    2014-01-01

    The research on the assessment of collaborative problem solving (ColPS), as one crucial 21st Century Skill, is still in its beginnings. Using Tangible User Interfaces (TUI) for this purpose has only been marginally investigated in technology-based assessment. Our first empirical studies focused on light-weight performance measurements, usability,…

  11. AOP-DB Frontend: A user interface for the Adverse Outcome Pathways Database

    Science.gov (United States)

    The EPA Adverse Outcome Pathway Database (AOP-DB) is a database resource that aggregates association relationships between AOPs, genes, chemicals, diseases, pathways, species orthology information, and ontologies. The AOP-DB frontend is a simple yet powerful user interface in the for...

  12. User Interfaces for Patient-Centered Communication of Health Status and Care Progress

    Science.gov (United States)

    Wilcox-Patterson, Lauren

    2013-01-01

    The recent trend toward patients participating in their own healthcare has opened up numerous opportunities for computing research. This dissertation focuses on how technology can foster this participation, through user interfaces to effectively communicate personal health status and care progress to hospital patients. I first characterize the…

  13. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    Science.gov (United States)

    Jagodzinski, Piotr; Wolski, Robert

    2015-01-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology in the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar…

  14. User Interface Preferences in the Design of a Camera-Based Navigation and Wayfinding Aid

    Science.gov (United States)

    Arditi, Aries; Tian, YingLi

    2013-01-01

    Introduction: Development of a sensing device that can provide a sufficient perceptual substrate for persons with visual impairments to orient themselves and travel confidently has been a persistent rehabilitation technology goal, with the user interface posing a significant challenge. In the study presented here, we enlist the advice and ideas of…

  15. 78 FR 36478 - Accessibility of User Interfaces, and Video Programming Guides and Menus

    Science.gov (United States)

    2013-06-18

    ... the functional requirements needed to carry out those sections. Among other things, the VPAAC Second... of the associated functional requirement.'' The VPAAC Second Report: User Interfaces also lists... operable in accordance with each of the following, assessed independently: Operable without vision. Provide...

  16. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.
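
    The layering described above (user-interface client, machine controller, hardware subsystems, each behind a narrow interface) can be sketched in a few lines. The classes below are invented for illustration and are not the Acapella code base; the real system would place a network boundary (Java applet client, QNX controller) between the first two layers.

        # Layered sketch: UI client -> machine controller -> hardware subsystems,
        # each layer talking only to the one below through a narrow interface.
        # Class names are invented for illustration.
        class PumpSubsystem:
            def dispense(self, microlitres: float) -> str:
                return f"dispensed {microlitres} uL"

        class StageSubsystem:
            def move_to(self, x: float, y: float) -> str:
                return f"stage at ({x}, {y})"

        class MachineController:
            """Server-side layer: validates and routes commands to hardware."""
            def __init__(self):
                self._subsystems = {"pump": PumpSubsystem(), "stage": StageSubsystem()}

            def execute(self, command: dict) -> str:
                target = self._subsystems[command["subsystem"]]
                method = getattr(target, command["action"])
                return method(*command.get("args", []))

        class UserInterfaceClient:
            """Client-side layer: what a browser-hosted UI would send to the server."""
            def __init__(self, controller: MachineController):
                self._controller = controller

            def run_protocol(self):
                print(self._controller.execute({"subsystem": "stage", "action": "move_to", "args": [10, 5]}))
                print(self._controller.execute({"subsystem": "pump", "action": "dispense", "args": [25.0]}))

        UserInterfaceClient(MachineController()).run_protocol()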

  17. Guidance from the Graphical User Interface (GUI) Experience: What GUI Teaches about Technology Access.

    Science.gov (United States)

    National Council on Disability, Washington, DC.

    This report investigates the use of the graphical user interface (GUI) in computer programs, the problems it creates for individuals with visual impairments or blindness, and advocacy efforts concerning this issue, which have been targeted primarily at Microsoft, producer of Windows. The report highlights the concerns of individuals with visual…

  18. Usability evaluation of augmented reality user interfaces in games and personal navigation assistants

    OpenAIRE

    Ivanec, Zorian

    2017-01-01

    This paper studies heuristic usability evaluation of user interfaces. The literature review distinguishes characteristics of augmented reality and smartphones, and presents heuristic evaluation, the PACMAD usability model, personal navigation assistants, video games, augmented reality and J. Nielsen's usability heuristics. In the research part, usability heuristics of related subject fields and design guidelines are combined. The created heuristics and a control questionnaire are used for the evaluation of augmented...

  19. Integration of data validation and user interface concerns in a DSL for web applications

    NARCIS (Netherlands)

    Groenewegen, D.M.; Visser, E.

    2009-01-01

    This paper is a pre-print of: Danny M. Groenewegen, Eelco Visser. Integration of Data Validation and User Interface Concerns in a DSL for Web Applications. In Mark G. J. van den Brand, Jeff Gray, editors, Software Language Engineering, Second International Conference, SLE 2009, Denver, USA, October,

  20. US NDC Modernization Iteration E1 Prototyping Report: User Interface Framework

    Energy Technology Data Exchange (ETDEWEB)

    Lober, Randall R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    During the first iteration of the US NDC Modernization Elaboration phase (E1), the SNL US NDC modernization project team completed an initial survey of applicable COTS solutions, and established exploratory prototyping related to the User Interface Framework (UIF) in support of system architecture definition. This report summarizes these activities and discusses planned follow-on work.

  1. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    Science.gov (United States)

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  2. TESTAR : Tool Support for Test Automation at the User Interface Level

    NARCIS (Netherlands)

    Vos, Tanja E.J.; Kruse, Peter M.; Condori Fernandez, Nelly; Bauersfeld, Sebastian; Wegener, Joachim

    2015-01-01

    Testing applications with a graphical user interface (GUI) is an important, though challenging and time consuming task. The state of the art in the industry are still capture and replay tools, which may simplify the recording and execution of input sequences, but do not support the tester in finding

  3. Interface support of fault diagnosis strategies : a user-driven approach

    NARCIS (Netherlands)

    Schaaf, van der T.W.; Brinkman, J.A.

    1993-01-01

    In this paper an overview is presented of a research program to provide interface designers with guidelines to support process control- and fault diagnosis tasks by control-room operators. Different methods of eliciting information needs of end-users had to be developed, as explained by an

  4. What you see is what you feel : on the simulation of touch in graphical user interfaces

    NARCIS (Netherlands)

    Mensvoort, van K.M.

    2009-01-01

    This study introduces a novel method of simulating touch with merely visual means. Interactive animations are used to create an optical illusion that evokes haptic percepts like stickiness, stiffness and mass, within a standard graphical user interface. The technique, called optically simulated

  5. AOP-DB Frontend: A user interface for the Adverse Outcome Pathways Database.

    Science.gov (United States)

    The EPA Adverse Outcome Pathway Database (AOP-DB) is a database resource that aggregates association relationships between AOPs, genes, chemicals, diseases, pathways, species orthology information, and ontologies. The AOP-DB frontend is a simple yet powerful AOP-DB user interface in...

  6. The Design of a Graphical User Interface for an Electronic Classroom.

    Science.gov (United States)

    Cahalan, Kathleen J.; Levin, Jacques

    2000-01-01

    Describes the design of a prototype for the graphical user interface component of an electronic classroom (ECR) application that supports real-time lectures and question-and-answer sessions between an instructor and students. Based on requirements analysis and an analysis of competing products, a Web-based ECR prototype was produced. Findings show…

  7. Graphical User Interface Development and Design to Support Airport Runway Configuration Management

    Science.gov (United States)

    Jones, Debra G.; Lenox, Michelle; Onal, Emrah; Latorella, Kara A.; Lohr, Gary W.; Le Vie, Lisa

    2015-01-01

    The objective of this effort was to develop a graphical user interface (GUI) for the National Aeronautics and Space Administration's (NASA) System Oriented Runway Management (SORM) decision support tool to support runway management. This tool is expected to be used by traffic flow managers and supervisors in the Airport Traffic Control Tower (ATCT) and Terminal Radar Approach Control (TRACON) facilities.

  8. A Monthly Water-Balance Model Driven By a Graphical User Interface

    Science.gov (United States)

    McCabe, Gregory J.; Markstrom, Steven L.

    2007-01-01

    This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
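
    The monthly computations can be illustrated with a compact sketch combining Thornthwaite potential evapotranspiration with a simple soil-moisture bucket. This is a generic textbook formulation, not the USGS program itself: the day-length correction is omitted, and the 100 mm soil capacity and initial storage are assumed values.

        # Simplified monthly water balance: Thornthwaite PET + a soil-moisture
        # bucket. Generic textbook formulation for illustration only; the
        # day-length correction is omitted and the soil capacity is assumed.
        def thornthwaite_pet(monthly_temp_c):
            """Potential evapotranspiration (mm/month) from mean monthly temperature."""
            I = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0)
            a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.7912e-2 * I + 0.49239
            return [16.0 * (10.0 * t / I) ** a if t > 0 else 0.0 for t in monthly_temp_c]

        def water_balance(precip_mm, temp_c, soil_capacity_mm=100.0):
            """Return monthly PET, actual ET, runoff, and soil storage."""
            pet = thornthwaite_pet(temp_c)
            storage, results = soil_capacity_mm, []
            for p, e in zip(precip_mm, pet):
                supply = p + storage
                aet = min(e, supply)                           # evaporate what is available
                storage = min(soil_capacity_mm, supply - aet)  # refill the bucket
                runoff = max(0.0, supply - aet - soil_capacity_mm)
                results.append({"PET": e, "AET": aet, "runoff": runoff, "storage": storage})
            return results

        temps = [-2, 0, 5, 10, 15, 20, 22, 21, 16, 10, 4, 0]       # degrees C
        precip = [60, 50, 55, 45, 40, 35, 30, 35, 50, 60, 65, 70]  # mm
        for month, row in enumerate(water_balance(precip, temps), start=1):
            print(month, {k: round(v, 1) for k, v in row.items()})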

  9. Development of web-based user interface for evaluated covariance data files

    International Nuclear Information System (INIS)

    Togashi, Tomoaki; Kato, Kiyoshi; Suzuki, Ryusuke; Otuka, Naohiko

    2010-01-01

    We develop a web-based interface which visualizes cross sections with their covariance compiled in the ENDF format, in order to support evaluated covariance data users who do not have experience with NJOY calculations. A package of programs has been constructed without the aid of any existing program libraries. (author)

  10. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  11. A Graphical User Interface (GUI) for Automated Classification of Bradley Fighting Vehicle Shock Absorbers

    National Research Council Canada - National Science Library

    Sincebaugh, Patrick

    1998-01-01

    .... We then explain the design and capabilities of the SSATS graphical user interface (GUI), which includes the integration of a neural network classification scheme. We finish by discussing recent results of utilizing the system to test and evaluate Bradley armored vehicle shock absorbers.

  12. Version I of the users manual for the Tuff Data Base Interface

    International Nuclear Information System (INIS)

    Langkopf, B.S.; Satter, B.J.; Welch, E.P.

    1985-04-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) project, managed by the Nevada Operations Office of the US Department of Energy, is investigating the feasibility of locating a repository at Yucca Mountain on and adjacent to the Nevada Test Site (NTS) in southern Nevada. A part of this investigation includes obtaining physical properties from laboratory tests on samples from Yucca Mountain and field tests of the in situ tuffs at Yucca Mountain. A computerized data base has been developed to store this data in a centralized location. The data base is stored on the Cyber 170/855 computer at Sandia using the System 2000 Data Base Management software. A user-friendly interface, the Tuff Data Base Interface, is being developed to allow NNWSI participants to retrieve information from the Tuff Data Base directly. The Interface gives NNWSI users a great deal of flexibility in retrieving portions of the Data Base. This report is an interim users manual for the Tuff Data Base Interface, as of August 1984. It gives basic instructions on accessing the Sandia computing system and explains the Interface on a question-by-question basis

  13. Exploring the requirements for multimodal interaction for mobile devices in an end-to-end journey context.

    Science.gov (United States)

    Krehl, Claudia; Sharples, Sarah

    2012-01-01

    The paper investigates the requirements for multimodal interaction on mobile devices in an end-to-end journey context. Traditional interfaces are deemed cumbersome and inefficient for exchanging information with the user. Multimodal interaction provides a different user-centred approach allowing for more natural and intuitive interaction between humans and computers. It is especially suitable for mobile interaction as it can overcome additional constraints including small screens, awkward keypads, and continuously changing settings - an inherent property of mobility. This paper is based on end-to-end journeys where users encounter several contexts during their journeys. Interviews and focus groups explore the requirements for multimodal interaction design for mobile devices by examining journey stages and identifying the users' information needs and sources. Findings suggest that multimodal communication is crucial when users multitask. Choosing suitable modalities depends on user context, characteristics and tasks.

  14. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711
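
    The kind of abstraction layer described here can be pictured as a device-independent interaction declared once and rendered by whichever modality adapters happen to be present. The class names below are invented for illustration and are not the authors' framework.

        # Sketch of a device-independent UI abstraction: the application declares
        # an abstract interaction once, and whichever modality adapters exist in
        # the environment render it. Names are invented for illustration.
        from abc import ABC, abstractmethod

        class OutputAdapter(ABC):
            @abstractmethod
            def render(self, prompt: str) -> None: ...

        class ScreenAdapter(OutputAdapter):
            def render(self, prompt: str) -> None:
                print(f"[screen] {prompt}")

        class SpeechAdapter(OutputAdapter):
            def render(self, prompt: str) -> None:
                print(f"[speech] saying: {prompt}")

        class AbstractUI:
            """Single UI definition, deployed with whatever adapters exist at runtime."""
            def __init__(self, adapters):
                self.adapters = adapters

            def ask(self, prompt: str) -> None:
                for adapter in self.adapters:          # fan out to every modality
                    adapter.render(prompt)

        # A living room might offer screen + speech; a car, speech only --
        # the application code does not change.
        AbstractUI([ScreenAdapter(), SpeechAdapter()]).ask("Turn on the lights?")
        AbstractUI([SpeechAdapter()]).ask("Turn on the lights?")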

  15. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  16. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  17. U-Net/SLE: A Java-Based User-Customizable Virtual Network Interface

    Directory of Open Access Journals (Sweden)

    Matt Welsh

    1999-01-01

    Full Text Available We describe U-Net/SLE (Safe Language Extensions), a user-level network interface architecture which enables per-application customization of communication semantics through downloading of user extension applets, implemented as Java classfiles, to the network interface. This architecture permits applications to safely specify code to be executed within the NI on message transmission and reception. By leveraging the existing U-Net model, applications may implement protocol code at the user level, within the NI, or using some combination of the two. Our current implementation, using the Myricom Myrinet interface and a small Java Virtual Machine subset, allows host communication overhead to be reduced and improves the overlap of communication and computation during protocol processing.
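
    A conceptual sketch of the idea of per-application extension code executed by the network interface on transmit and receive is given below, in Python rather than Java and with invented names. It is not the U-Net/SLE API, only an illustration of installing user hooks "inside" a simulated NI.

        # Conceptual sketch of per-application extension handlers executed by a
        # (simulated) network interface on send and receive. Interface names are
        # invented; this is not the U-Net/SLE API.
        from typing import Callable, Dict

        class VirtualNIC:
            """Holds user-supplied extension code keyed by application endpoint."""
            def __init__(self):
                self.tx_hooks: Dict[str, Callable[[bytes], bytes]] = {}
                self.rx_hooks: Dict[str, Callable[[bytes], bytes]] = {}

            def install(self, endpoint: str, on_tx=None, on_rx=None):
                if on_tx: self.tx_hooks[endpoint] = on_tx
                if on_rx: self.rx_hooks[endpoint] = on_rx

            def send(self, endpoint: str, payload: bytes) -> bytes:
                hook = self.tx_hooks.get(endpoint, lambda p: p)
                return hook(payload)                 # e.g. add framing in the "NI"

            def receive(self, endpoint: str, frame: bytes) -> bytes:
                hook = self.rx_hooks.get(endpoint, lambda p: p)
                return hook(frame)                   # e.g. strip/verify in the "NI"

        nic = VirtualNIC()
        # An application installs a trivial framing protocol to run "inside" the NIC.
        nic.install("app1",
                    on_tx=lambda p: len(p).to_bytes(2, "big") + p,
                    on_rx=lambda f: f[2:2 + int.from_bytes(f[:2], "big")])
        frame = nic.send("app1", b"hello")
        print(nic.receive("app1", frame))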

  18. Design and validation of an improved graphical user interface with the 'Tool ball'.

    Science.gov (United States)

    Lee, Kuo-Wei; Lee, Ying-Chu

    2012-01-01

    The purpose of this research is to introduce the design of an improved graphical user interface (GUI) and to verify the operational efficiency of the proposed interface. Until now, clicking the toolbar with the mouse has been the usual way to operate software functions. In our research, we designed an improved graphical user interface - a tool ball that is operated by a mouse wheel to perform software functions. Several experiments are conducted to measure the time needed to operate certain software functions with the traditional combination of "mouse click + tool button" and the proposed integration of "mouse wheel + tool ball". The results indicate that the tool ball design can accelerate the speed of operating software functions, decrease the number of icons on the screen, and enlarge the applications of the mouse wheel. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
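
    One way to picture the "mouse wheel + tool ball" interaction is as a circular list of tools that wheel events rotate through, replacing toolbar clicks. The sketch below is an invented illustration, not the authors' implementation.

        # Toy model of a wheel-driven "tool ball": scroll events rotate through a
        # circular list of tools instead of clicking toolbar icons. Invented example.
        class ToolBall:
            def __init__(self, tools):
                self.tools = tools
                self.index = 0

            def on_wheel(self, delta: int) -> str:
                """delta > 0 scrolls forward, delta < 0 backward; returns the active tool."""
                self.index = (self.index + delta) % len(self.tools)
                return self.tools[self.index]

        ball = ToolBall(["select", "zoom", "pan", "rotate", "measure"])
        for delta in (1, 1, -1, 3):
            print(ball.on_wheel(delta))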

  19. Theorema 2.0: A Graphical User Interface for a Mathematical Assistant System

    Directory of Open Access Journals (Sweden)

    Wolfgang Windsteiger

    2013-07-01

    Full Text Available Theorema 2.0 stands for a re-design including a complete re-implementation of the Theorema system, which was originally designed, developed, and implemented by Bruno Buchberger and his Theorema group at RISC. In this paper, we present the first prototype of a graphical user interface (GUI) for the new system. It heavily relies on powerful interactive capabilities introduced in recent releases of the underlying Mathematica system, most importantly the possibility of having dynamic objects connected to interface elements like sliders, menus, check-boxes, radio-buttons and the like. All these features are fully integrated into the Mathematica programming environment and allow the implementation of a modern user interface.

  20. Comparative performance analysis of M-IMU/EMG and voice user interfaces for assistive robots.

    Science.gov (United States)

    Laureiti, Clemente; Cordella, Francesca; di Luzio, Francesco Scotto; Saccucci, Stefano; Davalli, Angelo; Sacchetti, Rinaldo; Zollo, Loredana

    2017-07-01

    People with a high level of disability experience great difficulties in performing activities of daily living and resort to their residual motor functions in order to operate assistive devices. The commercially available interfaces used to control assistive manipulators are typically based on joysticks and can be used only by subjects with upper-limb residual mobilities. Many other solutions can be found in the literature, based on the use of multiple sensory systems for detecting the human motion intention and state. Some of them require a high cognitive workload for the user. Some others are more intuitive and easy to use but have not been widely investigated in terms of usability and user acceptance. The objective of this work is to propose an intuitive and robust user interface for assistive robots, not obtrusive for the user and easily adaptable for subjects with different levels of disability. The proposed user interface is based on the combination of M-IMUs and EMG for the continuous control of an arm-hand robotic system. The system has been experimentally validated and compared to a standard voice interface. Sixteen healthy subjects volunteered to participate in the study: 8 subjects used the combined M-IMU/EMG robot control, and 8 subjects used the voice control. The arm-hand robotic system, made of the KUKA LWR 4+ and the IH2 Azzurra hand, was controlled to accomplish the daily living task of drinking. Performance indices and evaluation scales were adopted to assess the performance of the two interfaces.
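
    A hedged sketch of the general idea of combining inertial orientation with an EMG trigger to drive an arm-hand system is given below: forearm tilt sets a planar end-effector velocity and an EMG envelope crossing a threshold closes the hand. Gains, thresholds, and function names are invented and are not taken from the paper.

        # Conceptual mapping from M-IMU orientation and an EMG envelope to robot
        # commands: orientation sets end-effector velocity, an EMG threshold
        # crossing toggles the hand. Gains and thresholds are invented.
        def imu_to_velocity(roll_deg: float, pitch_deg: float, gain: float = 0.002):
            """Map forearm tilt to a planar end-effector velocity (m/s)."""
            vx = gain * pitch_deg
            vy = gain * roll_deg
            return vx, vy

        def emg_grasp_command(emg_envelope: float, threshold: float = 0.3) -> str:
            """Close the hand while the rectified/smoothed EMG exceeds the threshold."""
            return "close" if emg_envelope > threshold else "open"

        # One control cycle of a drinking-style task, with made-up sensor samples.
        samples = [(-5.0, 12.0, 0.1), (2.0, 20.0, 0.45), (0.0, 0.0, 0.5)]
        for roll, pitch, emg in samples:
            vx, vy = imu_to_velocity(roll, pitch)
            print(f"v=({vx:.3f},{vy:.3f}) m/s, hand {emg_grasp_command(emg)}")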