WorldWideScience

Sample records for haptic user interfaces

  1. User Acceptance of a Haptic Interface for Learning Anatomy

    Science.gov (United States)

    Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur

    2013-01-01

    Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…

  2. Force Control and Nonlinear Master-Slave Force Profile to Manage an Admittance Type Multi-Fingered Haptic User Interface

    Energy Technology Data Exchange (ETDEWEB)

    Anthony L. Crawford

    2012-08-01

Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in remote and/or hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space, to name a few. To this end, the research presented in this paper has developed an admittance-type, exoskeleton-like, multi-fingered haptic hand user interface that secures the user’s palm and provides 3-dimensional force feedback to the user’s fingertips. Unlike conventional haptic hand user interfaces, which integrate the human hand’s characteristics only into the system’s mechanical design, this system also carries that inspiration into the user interface’s controller. This is achieved by manifesting the property differences between manipulation and grasping activities, as they pertain to the human hand, in a nonlinear master-slave force relationship. The results presented in this paper show that the admittance-type system has sufficient bandwidth to appear nearly transparent to the user in free motion, and that, when the system is subjected to a manipulation task, increased performance is achieved using the nonlinear force relationship compared to the traditional linear scaling techniques implemented in the vast majority of systems.
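The contrast between linear scaling and a nonlinear master-slave force profile can be illustrated with a toy sketch. All gains and the manipulation/grasping transition threshold below are hypothetical, not values from the paper:

```python
def linear_slave_force(user_force, gain=4.0):
    """Conventional linear scaling: robotic force is a constant
    multiple of the user's force."""
    return gain * user_force

def nonlinear_slave_force(user_force, transition=2.0,
                          fine_gain=1.5, gross_gain=6.0):
    """Hypothetical nonlinear master-slave profile: a low gain below
    the manipulation/grasping transition preserves fine dexterity,
    while a higher gain above it supports forceful grasping.
    The profile is kept continuous at the transition point."""
    if user_force <= transition:
        return fine_gain * user_force
    return fine_gain * transition + gross_gain * (user_force - transition)
```

For a 3 N user input the linear law commands 12 N while the nonlinear law commands 9 N; the point is that the effective gain now depends on whether the hand is in a manipulation or a grasping regime.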

  3. Video Game Device Haptic Interface for Robotic Arc Welding

    Energy Technology Data Exchange (ETDEWEB)

    Corrie I. Nichol; Milos Manic

    2009-05-01

Recent advances in technology for video games have made a broad array of haptic feedback devices available at low cost. This paper presents a bi-manual haptic system that enables an operator to weld remotely, using a commercially available haptic feedback video game device for the user interface. The system showed good performance in initial tests, demonstrating the utility of low-cost input devices for remote haptic operations.

  4. Haptic interfaces: Hardware, software and human performance

    Science.gov (United States)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.

  5. Improved haptic interface for colonoscopy simulation.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2007-01-01

This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force and torque profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computation to render accurate graphic images corresponding to the angle knob rotation. Tack switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction, and the corresponding deformation of the colon.

  6. NONLINEAR FORCE PROFILE USED TO INCREASE THE PERFORMANCE OF A HAPTIC USER INTERFACE FOR TELEOPERATING A ROBOTIC HAND

    Energy Technology Data Exchange (ETDEWEB)

    Anthony L. Crawford

    2012-07-01

Modified paper title and abstract due to slightly modified scope. Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space. The research associated with this paper hypothesizes that a user interface and a complementary radiation-compatible robotic hand that integrate the human hand’s anthropometric properties, speed capability, nonlinear strength profile, reduction of active degrees of freedom during the transition from manipulation to grasping, and just-noticeable-difference force sensation characteristics will enhance a user’s teleoperation performance. The main contribution of this research is that a system concisely integrating all these factors has yet to be developed, and furthermore has yet to be applied to hazardous environments such as those referenced above. In fact, the most prominent slave manipulator teleoperation technology in use today is based on a design patented in 1945 (Patent 2632574) [1]. Robotic hand/user interface systems of similar function to the one being developed in this research limit their design input requirements, in the best case, to complementing only the hand’s anthropometric properties, speed capability, and a linearly scaled force application relationship (e.g., robotic force is a constant 4 times that of the user). In this paper, a nonlinear relationship between the forces experienced at the user interface and at the robotic hand was devised based on the property differences between manipulation and grasping activities as they pertain to the human hand. The results show that such a relationship, when subjected to a manipulation task and a grasping task, produces increased performance compared to the…

  7. Development of a wearable haptic game interface

    Directory of Open Access Journals (Sweden)

    J. Foottit

    2016-04-01

This paper outlines the ongoing development of a wearable haptic game interface, in this case for controlling a flight simulator. The device differs from many traditional haptic feedback implementations in that it combines vibrotactile feedback with gesture-based input, thus becoming a two-way conduit between the user and the virtual environment. The device is intended to challenge what is considered an “interface” and sets out to purposefully blur the boundary between man and machine. This allows for a more immersive experience, and a user evaluation shows that the intuitive interface allows the user to 'become' the aircraft, controlling it through movements of the hand.

  8. Human-computer interface including haptically controlled interactions

    Science.gov (United States)

    Anderson, Thomas G.

    2005-10-11

    The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
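The described force-to-scroll-rate mapping can be sketched in a few lines. The threshold, gain, and rate cap below are hypothetical values, not taken from the patent:

```python
def scroll_rate(applied_force, threshold=0.5, gain=120.0, max_rate=600.0):
    """Map the force pressed against a haptic boundary to a scroll
    rate (e.g. pixels/s).  Below the threshold the boundary is only
    felt, not acted on; beyond it the rate grows with the applied
    force, capped at max_rate."""
    excess = applied_force - threshold
    if excess <= 0.0:
        return 0.0
    return min(gain * excess, max_rate)
```

Because the rate grows with force, the user gets proportional, non-visual control: pressing lightly against the boundary scrolls slowly, pressing harder scrolls faster up to the cap.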

  9. Haptic interface of the KAIST-Ewha colonoscopy simulator II.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2008-11-01

This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force and torque profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the results show that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.

  10. Haptic and Visual feedback in 3D Audio Mixing Interfaces

    DEFF Research Database (Denmark)

    Gelineck, Steven; Overholt, Daniel

    2015-01-01

This paper describes the implementation and informal evaluation of a user interface that explores haptic feedback for 3D audio mixing. The implementation compares different approaches using either the LEAP Motion for mid-air hand gesture control, or the Novint Falcon for active haptic feedback… in order to augment the perception of the 3D space. We compare different interaction paradigms implemented using these interfaces, aiming to increase speed and accuracy and reduce the need for constant visual feedback. While the LEAP Motion relies upon visual perception and proprioception, users can forego…

  11. Personalized Learning in Medical Education: Designing a User Interface for a Dynamic Haptic Robotic Trainer for Central Venous Catheterization.

    Science.gov (United States)

    Yovanoff, Mary; Pepley, David; Mirkin, Katelin; Moore, Jason; Han, David; Miller, Scarlett

    2017-09-01

While Virtual Reality (VR) has emerged as a viable method for training new medical residents, it has not yet reached all areas of training. One area lacking such development is surgical residency programs, where there are large learning curves associated with skill development. In order to address this gap, a Dynamic Haptic Robotic Trainer (DHRT) was developed to help train surgical residents in the placement of ultrasound-guided internal jugular central venous catheters and to incorporate personalized learning. To accomplish this, a two-part study was conducted to: (1) systematically analyze the feedback given to 18 third-year medical students by trained professionals to identify the items necessary for a personalized learning system and (2) develop and experimentally test the usability of the personalized learning interface within the DHRT system. The results can be used to inform the design of VR and personalized learning systems within the medical community.

  12. UPPER LIMB FUNCTIONAL ASSESSMENT USING HAPTIC INTERFACE

    Directory of Open Access Journals (Sweden)

    Aleš Bardorfer

    2004-12-01

A new method for the assessment of the upper limb (UL) functional state, using a haptic interface, is presented. A haptic interface is used as a measuring device, capable of providing objective, repeatable and quantitative data on UL motion. A patient is presented with a virtual environment, both graphically via a computer screen and haptically via the Phantom Premium 1.5 haptic interface. The setup allows the patient to explore and feel the virtual environment with three of his/her senses: sight, hearing, and, most important, touch. Specially designed virtual environments are used to assess the patient’s UL movement capabilities. The tests range from tracking tasks (to assess the accuracy of movement), tracking tasks with added disturbances in the form of random forces (to assess the patient’s control abilities), and a labyrinth test (to assess both speed and accuracy), to a final test measuring the maximal force capacity of the UL. A comprehensive study, using the developed measurement setup within the…

  13. fMRI-Compatible Electromagnetic Haptic Interface.

    Science.gov (United States)

    Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S

    2005-01-01

A new haptic interface device is suggested, which can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque and not to affect image quality by the currents applied in the device. The haptic device was tested in a 3 T MR scanner. With a current of up to 1 A and at a distance of 1 m from the focal point of the MR scanner, it was possible to generate torques of up to 4 Nm. Within these boundaries, image quality was not affected.
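The actuation principle reduces to a Lorentz-force torque calculation. Only the F = B·I·L relationship is basic physics; the coil geometry below is illustrative and ignores the field falloff away from the scanner's isocenter:

```python
def lorentz_torque(b_field, current, wire_length, lever_arm, n_turns=1):
    """Torque produced by the Lorentz force F = B * I * L on a coil
    segment acting at lever arm r: tau = n * B * I * L * r.
    b_field in tesla, current in amperes, lengths in meters;
    the geometry values are hypothetical, not the device's."""
    return n_turns * b_field * current * wire_length * lever_arm
```

For example, 200 turns with B = 3 T, I = 1 A, 0.1 m of active wire at a 0.05 m lever arm gives 3 Nm, the same order of magnitude as the reported 4 Nm.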

  14. Multimodal Sensing Interface for Haptic Interaction

    Directory of Open Access Journals (Sweden)

    Carlos Diaz

    2017-01-01

This paper investigates the integration of a multimodal sensing system for exploring the limits of vibrotactile haptic feedback when interacting with 3D representations of real objects. In this study, the spatial locations of the objects are mapped to the work volume of the user using a Kinect sensor. The position of the user’s hand is obtained using marker-based visual processing. The depth information is used to build a vibrotactile map on a haptic glove enhanced with vibration motors. The users can perceive the location and dimension of remote objects by moving their hand inside a scanning region. A marker detection camera provides the location and orientation of the user’s hand (glove) to map the corresponding tactile message. A preliminary study was conducted to explore how different users perceive such haptic experiences. Factors such as the total number of objects detected, object separation resolution, and dimension-based and shape-based discrimination were evaluated. The preliminary results showed that the localization and counting of objects can be attained with a high degree of success. The users were able to classify groups of objects of different dimensions based on the perceived haptic feedback.
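A sketch of how depth information might drive such a vibrotactile map. The linear distance-to-intensity mapping and the 0.3 m sensing band are assumptions, not the paper's calibration:

```python
def vibration_intensity(hand_depth, object_depth, max_range=0.3):
    """Map the depth difference between the user's hand and a scanned
    object to a motor intensity in [0, 1]: the closer the object, the
    stronger the vibration.  Depths in meters; the linear mapping and
    the max_range band are hypothetical."""
    distance = abs(hand_depth - object_depth)
    if distance >= max_range:
        return 0.0
    return 1.0 - distance / max_range
```

Running this per motor over the glove's grid yields the vibrotactile map: motors aligned with nearby surfaces vibrate strongly, and the sensation fades as the hand moves away.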

  15. An investigation of a passively controlled haptic interface

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.T. [Oak Ridge National Lab., TN (United States); Book, W.J. [Georgia Inst. of Tech., Atlanta, GA (United States). School of Mechanical Engineering

    1997-03-01

Haptic interfaces enhance cooperation between humans and robotic manipulators by providing force and tactile feedback to the human user during the execution of arbitrary tasks. The use of active actuators in haptic displays presents a certain amount of risk, since they are capable of providing unacceptable levels of energy to the systems upon which they operate. An alternative to providing numerous safeguards is to remove the sources of risk altogether. This research investigates the feasibility of trajectory control using passive devices, that is, devices that cannot add energy to the system. Passive actuators are capable only of removing energy from the system or transferring energy within the system. It is proposed that the utility of passive devices is greatly enhanced by the use of redundant actuators. In a passive system, once motion is provided to the system, presumably by a human user, passive devices may be able to modify this motion to achieve a desired resultant trajectory. A mechanically passive, 2-Degree-of-Freedom (D.O.F.) manipulator has been designed and built. It is equipped with four passive actuators: two electromagnetic brakes and two electromagnetic clutches. This paper gives a review of the literature on passive robotics and describes the experimental test bed used in this research. Several control algorithms are investigated, resulting in the formulation of a passive control law.

  16. Haptic Addition to a Visual Menu Selection Interface Controlled by an In-Vehicle Rotary Device

    Directory of Open Access Journals (Sweden)

    Camilla Grane

    2012-01-01

Today, several vehicles are equipped with a visual display combined with a haptic rotary device for handling in-vehicle information system tasks while driving. This experimental study investigates whether a haptic addition to a visual interface interferes with or supports secondary task performance, and whether haptic information could be used without taking the eyes off the road. Four interfaces were compared during simulated driving: visual only, partly corresponding visual-haptic, fully corresponding visual-haptic, and haptic only. Secondary task performance and subjective mental workload were measured. Additionally, the participants were interviewed. It was found that some haptic support improved performance. However, when more haptic information was used, the results diverged in terms of task completion time and interface comprehension. Some participants did not sense all the haptics provided, some did not comprehend the correspondence between the haptic and visual interfaces, and some did. Interestingly, the participants managed to complete the tasks when using haptic-only information.

  17. Haptic interfaces using dielectric electroactive polymers

    Science.gov (United States)

    Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos

    2010-04-01

Quality, amplitude and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously with quiet operation, low weight, high power density and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heartbeat measurement application. In this testing, heart signals were acquired from a remote location using a wireless heart rate sensor and sent through a network, and the DEA was used to haptically reproduce the heartbeats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DEA-based haptic feedback can be used in heartbeat measurement tests, and b) through subjective testing, the stiffness and actuator properties of the EAP can be tuned for a variety of applications.

  18. Haptic Addition to a Visual Menu Selection Interface Controlled by an In-Vehicle Rotary Device

    OpenAIRE

    Camilla Grane; Peter Bengtsson

    2012-01-01

Today, several vehicles are equipped with a visual display combined with a haptic rotary device for handling in-vehicle information system tasks while driving. This experimental study investigates whether a haptic addition to a visual interface interferes with or supports secondary task performance and whether haptic information could be used without taking the eyes off the road. Four interfaces were compared during simulated driving: visual only, partly corresponding visual-haptic, fully corresponding…

  19. Vibrotactile perception assessment for a haptic interface on an antigravity suit.

    Science.gov (United States)

    Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu

    2017-01-01

Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit.

  20. Haptic interface for vehicular touch screens.

    Science.gov (United States)

    2013-02-01

Once the domain of purely physical controls such as knobs, levers, buttons, and sliders, the vehicle dash is rapidly transforming into a computer interface. This presents a challenge for drivers, because the physics-based cues which make trad…

  1. Design and Evaluation of Shape-Changing Haptic Interfaces for Pedestrian Navigation Assistance.

    Science.gov (United States)

    Spiers, Adam J; Dollar, Aaron M

    2017-01-01

Shape-changing interfaces are a category of device capable of altering their form in order to facilitate communication of information. In this work, we present a shape-changing device that has been designed for navigation assistance. 'The Animotus' (previously 'The Haptic Sandwich') resembles a cube with an articulated upper half that is able to rotate and extend (translate) relative to the bottom half, which is fixed in the user's grasp. This rotation and extension, generally felt via the user's fingers, is used to represent heading and proximity to navigational targets. The device is intended to provide an alternative to screen- or audio-based interfaces for visually impaired, hearing impaired, deafblind, and sighted pedestrians. The motivation and design of the haptic device is presented, followed by the results of a navigation experiment that aimed to determine the role of each device DOF in facilitating guidance. An additional device, 'The Haptic Taco', which modulated its volume in response to target proximity (negating directional feedback), was also compared. Results indicate that while the heading (rotational) DOF benefited motion efficiency, the proximity (translational) DOF benefited velocity. Combining the two DOF improved overall performance. The volumetric Taco performed comparably to the Animotus' extension DOF.
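The two-DOF encoding described above might be sketched as follows. The mechanical ranges and the linear/clamped mappings are hypothetical; the paper does not give its exact mapping:

```python
def animotus_pose(heading_error_deg, distance_m,
                  max_rotation_deg=30.0, max_extension_mm=20.0,
                  max_distance_m=50.0):
    """Encode navigation state in the device's two DOF: the top half
    rotates toward the heading error (clamped to the mechanism's
    range) and extends in proportion to target distance.  All ranges
    here are assumed, not the device's actual specifications."""
    rotation = max(-max_rotation_deg, min(max_rotation_deg, heading_error_deg))
    extension = max_extension_mm * min(distance_m, max_distance_m) / max_distance_m
    return rotation, extension
```

The user's fingers read both channels at once: rotation says "turn this way", extension shrinking toward zero says "you are almost there".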

  2. Contribution to the modeling and the identification of haptic interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Janot, A

    2007-12-15

This thesis focuses on the modeling and the identification of haptic interfaces using cable drives. A haptic interface is a force-feedback device which enables its user to interact with a virtual world or a remote environment explored by a slave system. It aims at matching the forces and displacements given by the user with those applied to the virtual world. Usually, haptic interfaces make use of an actuated mechanical structure whose distal link is equipped with a handle. When manipulating this handle to interact with the explored world, the user feels the apparent mass, compliance and friction of the interface. This distortion introduced between the operator and the virtual world must be modeled and identified to enhance the design of the interface and to develop appropriate control laws. The first approach has been to adapt the modeling and identification methods of rigid robots with localized flexibilities to haptic interfaces. The identification technique makes use of the inverse dynamic model and linear least squares with measurements of joint torques and positions. This approach is validated on a one-degree-of-freedom and a three-degree-of-freedom haptic device. A new identification method needing only torque data is proposed. It is based on a closed-loop simulation using the direct dynamic model. The optimal parameters minimize the 2-norm of the error between the actual torque and the simulated torque, assuming the same control law and the same tracking trajectory. This nonlinear least-squares problem is dramatically simplified by using the inverse model to calculate the simulated torque. This method is validated on the single-degree-of-freedom haptic device and a SCARA robot. (author)
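The first identification approach — a linear least-squares fit of the inverse dynamic model to measured joint torques and positions — can be sketched for a hypothetical 1-DOF device whose model is linear in the parameters (the dynamics, trajectory, and parameter values below are invented for the demo, not the thesis's):

```python
import numpy as np

# Hypothetical 1-DOF device: the inverse dynamic model is linear in
# the parameters theta = [inertia, viscous friction, Coulomb friction]:
#   tau = I*qdd + Fv*qd + Fc*sign(qd) = W(qd, qdd) @ theta
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 500)
q = np.sin(3.0 * t)                 # exciting joint trajectory
qd = 3.0 * np.cos(3.0 * t)          # joint velocity
qdd = -9.0 * np.sin(3.0 * t)        # joint acceleration

theta_true = np.array([0.05, 0.2, 0.4])       # ground truth for the demo
W = np.column_stack([qdd, qd, np.sign(qd)])   # observation matrix
tau = W @ theta_true + 0.01 * rng.standard_normal(t.size)  # noisy "measured" torque

# Linear least squares recovers the dynamic parameters
theta_hat, *_ = np.linalg.lstsq(W, tau, rcond=None)
print(theta_hat)  # close to theta_true
```

The thesis's second method replaces the measured-position regressors with a closed-loop simulation so that only torque data is needed, which turns this into a nonlinear least-squares problem.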

  3. User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms; Myers, Brad A

    2008-01-01

User Interfaces have been around as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, some papers on the history of Human-Computer Interaction and User Interfaces have appeared, primarily focusing on the graphical interface era… and early visionaries such as Bush, Engelbart and Kay. With the User Interface being a decisive factor in the proliferation of computers in society, and since it has become a cultural phenomenon, it is time to paint a more comprehensive picture of its history. This SIG will investigate the possibilities… of launching a concerted effort towards creating a History of User Interfaces…

  4. User interface design considerations

    DEFF Research Database (Denmark)

    Andersen, Simon Engedal; Jakobsen, Arne; Rasmussen, Bjarne D.

    1999-01-01

When designing a user interface for a simulation model, there are several important issues to consider: who is the target user group, and what a priori information can be expected? What questions do the users want answered, and what questions are answered using a specific model? When developing… and output variables. This feature requires special attention when designing the user interface, and a special approach for controlling the user's selection of input and output variables was developed. To obtain a consistent system description, the different input variables are grouped corresponding… the consequence that the user does not have to specify any start guesses, etc. The design approach developed has resulted in a number of simulation tools which allow users with limited theoretical knowledge about refrigeration systems, mathematical models and simulation to use them, while expert users still…

  5. Design of a New MR Compatible Haptic Interface with Six Actuated Degrees of Freedom

    DEFF Research Database (Denmark)

    Ergin, Mehmet Alper; Kühne, Markus; Thielscher, Axel

    2014-01-01

… Existing MR-compatible haptic interfaces are restricted to a maximum of three actuated degrees of freedom. We propose an MR-compatible haptic interface with six actuated degrees of freedom to be able to study human brain mechanisms of natural pick-and-place movements, including arm transport. In this work, we…

  6. User interface development

    Science.gov (United States)

    Aggrawal, Bharat

    1994-01-01

    This viewgraph presentation describes the development of user interfaces for OS/2 versions of computer codes for the analysis of seals. Current status, new features, work in progress, and future plans are discussed.

  7. Opportunistic tangible user interfaces for augmented reality.

    Science.gov (United States)

    Henderson, Steven; Feiner, Steven

    2010-01-01

    Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition, and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed.

  8. User Interface Technology Survey.

    Science.gov (United States)

    1987-04-01

…interface can be manufactured. The user interface builder may be provided with tools to enhance the building block set, e.g., an icon and font editor to add… …ity and easy extensibility of the command set. It supports command history, execution of previous commands, and editing of commands. Through the…

  9. A real-time haptic interface for interventional radiology procedures.

    Science.gov (United States)

    Moix, Thomas; Ilic, Dejan; Fracheboud, Blaise; Zoethout, Jurjen; Bleuler, Hannes

    2005-01-01

Interventional Radiology (IR) is a minimally-invasive surgery (MIS) technique in which guidewires and catheters are steered in the vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be correctly trained to master hand-eye coordination, instrument manipulation and procedure protocols. This paper proposes a computer-assisted training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the anatomy of the patient linked to a robotic interface providing haptic force feedback. The paper focuses on the requirements, design and prototyping of a specific part of the haptic interface dedicated to catheters. Translational tracking and force feedback on the catheter is provided by two cylinders forming a friction drive arrangement. The whole friction drive can be set in rotation by an additional motor providing torque feedback. A force and a torque sensor are integrated in the cylinders for direct measurement on the catheter, enabling disturbance cancellation with a closed-loop force control strategy.
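The closed-loop force control idea in the abstract above can be sketched as a simple PI loop: the force measured directly on the catheter is compared with the force commanded by the VR simulation, and the controller drives the friction-drive actuator so that friction and inertia disturbances are cancelled. All gains, the plant model, and the time step below are invented for illustration, not values from the paper:

```python
# Sketch of closed-loop force control with disturbance cancellation.
# Hypothetical parameters throughout; not from the cited work.

def make_pi_force_controller(kp, ki, dt):
    """Return a PI controller mapping force error -> actuator command."""
    state = {"integral": 0.0}

    def step(force_desired, force_measured):
        error = force_desired - force_measured
        state["integral"] += error * dt
        return kp * error + ki * state["integral"]

    return step

# Track a constant 1.0 N haptic force against an illustrative
# first-order actuator (force approaches the command with a lag).
controller = make_pi_force_controller(kp=2.0, ki=40.0, dt=0.001)
force = 0.0
for _ in range(5000):
    command = controller(1.0, force)
    force += 0.05 * (command - force)  # first-order actuator lag

print(round(force, 3))  # converges to the 1.0 N setpoint
```

The integral term is what removes the steady-state error a pure proportional loop would leave when constant friction acts on the catheter.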

  10. UIL -User Interface Language

    CERN Document Server

    Lewis, J; CERN. Geneva

    1990-01-01

Some widget examples, widget categories, the push button widget, menus, the FORM widget, using UIL for an application program, the MOTIF Resource Manager (MRM), execution thread of an application using UIL and MRM, opening hierarchies, binding UIL names to application addresses, fetching widget hierarchies and managing them, changing widget resources using UIL and MRM, fetching literal values from the UID file. Introduction to the User Interface Language, defining a user interface, advantages of using UIL, accessing UID files from the application, UIL syntax, the UIL module structure, defining a widget instance hierarchy, declaration of literals: colors, icons, fonts.

  11. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    Directory of Open Access Journals (Sweden)

    Jacopo Aleotti

    2017-09-01

    Full Text Available A visuo-haptic augmented reality (VHAR interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
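The attractive force feedback described above can be sketched as a saturated virtual spring that pulls the 3DOF haptic device toward the location of the most intense detected source. The gain, force cap, and source data below are invented for illustration:

```python
# Illustrative attractive force feedback toward the strongest source.
# Gain k, saturation f_max, and the source list are hypothetical.
import math

def attractive_force(device_pos, sources, k=0.8, f_max=3.0):
    """Spring-like pull toward the strongest source, saturated at f_max (N)."""
    # Pick the source with the highest measured intensity.
    target = max(sources, key=lambda s: s["intensity"])
    dx = [t - p for p, t in zip(device_pos, target["pos"])]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist == 0.0:
        return [0.0, 0.0, 0.0]
    magnitude = min(k * dist, f_max)
    return [magnitude * d / dist for d in dx]

sources = [
    {"pos": (2.0, 0.0, 0.0), "intensity": 120.0},
    {"pos": (0.0, 5.0, 0.0), "intensity": 900.0},  # strongest source
]
print(attractive_force((0.0, 1.0, 0.0), sources))  # -> [0.0, 3.0, 0.0]
```

Saturating the spring force keeps the pull informative at a distance without overpowering the operator's own motion near the source.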

  12. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    Science.gov (United States)

    Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele

    2017-01-01

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source. PMID:28961198

  13. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.

    Science.gov (United States)

    Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea

    2017-09-29

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.

  14. Soft Robotic Haptic Interface with Variable Stiffness for Rehabilitation of Neurologically Impaired Hand Function

    Directory of Open Access Journals (Sweden)

    Frederick Sebastian

    2017-12-01

Full Text Available The human hand comprises complex sensorimotor functions that can be impaired by neurological diseases and traumatic injuries. Effective rehabilitation can bring the impaired hand back to a functional state because of the plasticity of the central nervous system, which can relearn and remodel the lost synapses in the brain. Current rehabilitation therapies focus on strengthening motor skills, such as grasping, and employ multiple objects of varying stiffness so that affected persons can experience a wide range of strength training. These devices have a limited range of stiffness due to the rigid mechanisms employed in their variable stiffness actuators. This paper presents a novel soft robotic haptic device for neuromuscular rehabilitation of the hand, which is designed to offer adjustable stiffness and can be utilized in both clinical and home settings. The device eliminates the need for multiple objects by employing a pneumatic soft structure made with highly compliant materials that acts as the actuator of the haptic interface. It is made with interchangeable sleeves that can be customized to include materials of varying stiffness to increase the upper limit of the stiffness range. The device is fabricated using existing 3D printing technologies, and polymer molding and casting techniques, thus keeping the cost low and throughput high. The haptic interface is linked either to an open-loop system that allows for increased pressure during usage or to a closed-loop system that regulates pressure in accordance with the stiffness the user specifies. Preliminary evaluation was performed to characterize the effective controllable region of variance in stiffness. It was found that the region of controllable stiffness was between points 3 and 7, where the stiffness appeared to plateau with each increase in pressure. The two control systems are tested to derive relationships between internal pressure, grasping force exertion on the surface, and displacement using
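The closed-loop mode described above can be sketched as follows: the user specifies a target stiffness, a calibration table maps stiffness to internal pressure, and a proportional valve command regulates measured pressure toward that setpoint. The calibration pairs and gain below are hypothetical, not data from the paper:

```python
# Sketch of stiffness-to-pressure regulation for a pneumatic soft actuator.
# Calibration pairs (stiffness N/mm, pressure kPa) and gain are invented.
CALIBRATION = [(0.5, 20.0), (1.0, 45.0), (1.5, 80.0), (2.0, 120.0)]

def pressure_setpoint(stiffness):
    """Linear interpolation over the calibration table."""
    pts = sorted(CALIBRATION)
    if stiffness <= pts[0][0]:
        return pts[0][1]
    for (s0, p0), (s1, p1) in zip(pts, pts[1:]):
        if stiffness <= s1:
            return p0 + (p1 - p0) * (stiffness - s0) / (s1 - s0)
    return pts[-1][1]

def valve_command(p_target, p_measured, gain=0.1):
    """Proportional command: positive inflates, negative vents."""
    return gain * (p_target - p_measured)

print(pressure_setpoint(1.25))  # midway between 45 and 80 kPa -> 62.5
```

In practice the table would be measured per sleeve, since the interchangeable sleeves change the stiffness reached at a given pressure.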

  15. Portraying User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2008-01-01

The user interface is coming of age. Papers addressing UI history have appeared in fair amounts in the last 25 years. Most of them address particular aspects such as an innovative interface paradigm or the contribution of a visionary or a research lab. Contrasting this, papers addressing UI...... history at large have been sparse. However, a small spate of publications appeared recently, so a reasonable number of papers are available. Hence this work-in-progress paints a portrait of the current history of user interfaces at large. The paper first describes a theoretical framework recruited from...... in that they largely address prevailing UI technologies, and thirdly history from above in that they focus on the great deeds of the visionaries. The paper then compares this state-of-art in UI history to the much more mature fields of history of computing and history of technology. Based hereon, some speculations...

  16. Haptic fMRI: Reliability and performance of electromagnetic haptic interfaces for motion and force neuroimaging experiments.

    Science.gov (United States)

    Menon, Samir; Zhu, Jack; Goyal, Deeksha; Khatib, Oussama

    2017-07-01

Haptic interfaces compatible with functional magnetic resonance imaging (Haptic fMRI) promise to enable rich motor neuroscience experiments that study how humans perform complex manipulation tasks. Here, we present a large-scale study (176 scan runs, 33 scan sessions) that characterizes the reliability and performance of one such electromagnetically actuated device, Haptic fMRI Interface 3 (HFI-3). We outline engineering advances that ensured HFI-3 did not interfere with fMRI measurements. Observed fMRI temporal noise levels with HFI-3 operating were at the fMRI baseline (0.8% noise-to-signal). We also present results from HFI-3 experiments demonstrating that high-resolution fMRI can be used to study spatio-temporal patterns of blood oxygenation level dependent (BOLD) activation. These experiments include motor planning, goal-directed reaching, and visually-guided force control. Observed fMRI responses are consistent with existing literature, which supports Haptic fMRI's effectiveness at studying the brain's motor regions.
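A temporal noise-to-signal figure like the 0.8% quoted above is commonly computed per voxel as the temporal standard deviation of the signal divided by its temporal mean. A minimal sketch on a synthetic time course (the time series below is invented, not fMRI data):

```python
# Noise-to-signal ratio of a voxel time series: temporal std / temporal mean.

def noise_to_signal(timeseries):
    n = len(timeseries)
    mean = sum(timeseries) / n
    var = sum((x - mean) ** 2 for x in timeseries) / (n - 1)
    return (var ** 0.5) / mean

# Synthetic voxel time course: baseline 1000 with a small oscillation.
signal = [1000.0 + 8.0 * (-1) ** i for i in range(200)]
print(round(100 * noise_to_signal(signal), 2), "%")  # -> 0.8 %
```

Comparing this ratio with the device powered on versus off is one straightforward way to check that an electromagnetic actuator is not adding scanner noise.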

  17. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  18. Power User Interface

    Science.gov (United States)

    Pfister, Robin; McMahon, Joe

    2006-01-01

Power User Interface 5.0 (PUI) is a system of middleware written for expert users in the Earth-science community. PUI enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry process service. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  19. A Preliminary Study on the Use of Haptic Feedback to Assist Users with Impaired Arm Coordination During Mouse Interactions

    Directory of Open Access Journals (Sweden)

    N. G. Tsagarakis

    2011-01-01

Full Text Available Physical movement impairments caused by central nervous system dysfunction or by muscle spasms generated from other neurological damage or dysfunction can often make it difficult or impossible for affected individuals to interact with computer-generated environments using conventional mouse interfaces. This work investigates the use of a 2-dimensional haptic device as an assistive robotic aid to minimize the effects of the pathological absence of motor control of the upper limb in impaired users while using a mouse interface. The haptic system used in this research is a two-degree-of-freedom (DOF) Pantograph planar device. To detect the intended user motion, the device is equipped with force sensing, allowing the monitoring of the user-applied loads. Impedance-based techniques are used to develop a “clumsy” motion suppression control system. The erratic motion suppression techniques and the experimental system setup are evaluated in two-dimensional tracking tasks with a human subject with failure of the gross coordination of the upper limb muscle movements resulting from a disorder called ‘Muscle Ataxia’. The results presented demonstrate the ability of the system to improve the tracking performance of the impaired user while interacting with a simple computer-generated 2D space.
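An impedance-style filter in the spirit of the "clumsy" motion suppression described above can be sketched by feeding the sensed user force through a virtual mass-damper, so high-frequency tremor-like force components are attenuated before they move the cursor. The mass, damping, and time step below are illustrative assumptions:

```python
# Virtual mass-damper filter: sensed force -> smoothed cursor velocity.
# Parameters are hypothetical, not from the cited work.

def make_admittance_filter(mass=1.0, damping=20.0, dt=0.01):
    """Return a filter mapping sensed force (N) to cursor velocity."""
    state = {"vel": 0.0}

    def step(force):
        # m * dv/dt = f - b * v  (virtual dynamics, explicit Euler step)
        state["vel"] += dt * (force - damping * state["vel"]) / mass
        return state["vel"]

    return step

filt = make_admittance_filter()
for _ in range(1000):
    vel = filt(2.0)
print(round(vel, 3))  # steady 2 N push -> bounded velocity f/b = 0.1
```

Raising the virtual damping trades responsiveness for stronger suppression of erratic force spikes, which is the tuning knob such a system exposes per user.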

  20. AcuTable: A Touch-enabled, Actuated Tangible User Interface

    DEFF Research Database (Denmark)

    Dibbern, Simon; Rasmussen, Kasper Vestergaard; Ortiz-Arroyo, Daniel

    2017-01-01

In this paper we describe AcuTable, a new tangible user interface. AcuTable is a shapeable surface that employs capacitive touch sensors. The goal of AcuTable was to enable exploration of the capabilities of such a haptic interface and its applications. We describe its design and implementation...

  1. Benefits of the use of natural user interfaces in water simulations

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Van Dam, A.; Jagers, B.

    2014-01-01

    The use of natural user interfaces instead of conventional ones has become a reality with the emergence of 3D motion sensing technologies. However, some problems are still unsolved (for example, no haptic or tactile feedback); so this technology requires careful evaluation before the users can

  2. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    Science.gov (United States)

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  3. Spatial user interfaces for large-scale projector-based augmented reality.

    Science.gov (United States)

    Marner, Michael R; Smith, Ross T; Walsh, James A; Thomas, Bruce H

    2014-01-01

    Spatial augmented reality applies the concepts of spatial user interfaces to large-scale, projector-based augmented reality. Such virtual environments have interesting characteristics. They deal with large physical objects, the projection surfaces are nonplanar, the physical objects provide natural passive haptic feedback, and the systems naturally support collaboration between users. The article describes how these features affect the design of spatial user interfaces for these environments and explores promising research directions and application domains.

  4. User interface for personal accounting

    OpenAIRE

    Femec, Vasilij

    2008-01-01

This diploma work describes a method for user interface development for an application, Bilanca, that is intended for reviewing personal financial flows. It is a simple application that subtracts expenses from income and shows the current financial state. The work begins with a detailed analysis of the best possible user interface options that give the most comfortable user experience. This is followed by the implementation in a Delphi environment. The results show that even a simple applicati...

  5. A brain-computer interface with vibrotactile biofeedback for haptic information

    Directory of Open Access Journals (Sweden)

    Acharya Soumyadipta

    2007-10-01

Full Text Available Abstract Background It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of a BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. Methods A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Results and Conclusion Subjects are able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subject had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
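The feedback mapping described above, conveying cursor position by modulating vibration intensity, can be sketched as a linear map from cursor position to a stimulator drive level. The ranges and the nonzero floor below are hypothetical choices, not parameters from the study:

```python
# Map cursor position to a vibrotactile drive level (0-1 duty cycle).
# cursor range, duty floor, and duty ceiling are invented for illustration.

def vibration_intensity(cursor, cursor_min=-1.0, cursor_max=1.0,
                        duty_min=0.2, duty_max=1.0):
    """Linearly map cursor position to a vibration duty cycle.

    duty_min > 0 keeps the stimulus perceptible at the low end of the range.
    """
    span = cursor_max - cursor_min
    frac = (cursor - cursor_min) / span
    frac = min(max(frac, 0.0), 1.0)  # clamp out-of-range cursor values
    return duty_min + frac * (duty_max - duty_min)

print(round(vibration_intensity(0.0), 3))  # centre of range -> 0.6
```

A nonzero floor matters because an intensity of zero is indistinguishable from the stimulator being off, which would leave the user without feedback at one end of the cursor range.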

  6. Usability of Nomadic User Interfaces

    NARCIS (Netherlands)

    Dees, W.

    2011-01-01

    During the last decade, a number of research activities have been performed to enable user interfaces and the underlying user activities to be migrated from one device to another. We call this “Nomadic User Interfaces”. The primary goal of these research activities has been to develop the

  7. Search-User Interface Design

    CERN Document Server

    Wilson, Max

    2011-01-01

    Search User Interfaces (SUIs) represent the gateway between people who have a task to complete, and the repositories of information and data stored around the world. Not surprisingly, therefore, there are many communities who have a vested interest in the way SUIs are designed. There are people who study how humans search for information, and people who study how humans use computers. There are people who study good user interface design, and people who design aesthetically pleasing user interfaces. There are also people who curate and manage valuable information resources, and people who desi

  8. Practical speech user interface design

    CERN Document Server

    Lewis, James R

    2010-01-01

    Although speech is the most natural form of communication between humans, most people find using speech to communicate with machines anything but natural. Drawing from psychology, human-computer interaction, linguistics, and communication theory, Practical Speech User Interface Design provides a comprehensive yet concise survey of practical speech user interface (SUI) design. It offers practice-based and research-based guidance on how to design effective, efficient, and pleasant speech applications that people can really use. Focusing on the design of speech user interfaces for IVR application

  9. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    Directory of Open Access Journals (Sweden)

    Fabio Salsedo

    2012-10-01

    Full Text Available In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.
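The "characteristic matrix" mentioned above captures how a multi-axis sensor's raw outputs map to Cartesian force components: with output voltages stacked in a vector v, the force is F = C·v, where off-diagonal entries of C model cross-talk between axes. The matrix entries and voltages below are invented for illustration:

```python
# Apply a 3x3 characteristic (calibration) matrix: F = C @ v.
# All numbers are illustrative, not from the cited sensor.

def apply_characteristic_matrix(C, v):
    """Map raw sensor outputs v (V) to force components (N)."""
    return [sum(cij * vj for cij, vj in zip(row, v)) for row in C]

C = [
    [12.0,  0.3, -0.1],   # N per volt; off-diagonal terms model cross-talk
    [ 0.2, 11.5,  0.4],
    [-0.3,  0.1, 25.0],
]
voltages = [0.10, -0.05, 0.02]
print([round(f, 3) for f in apply_characteristic_matrix(C, voltages)])
# -> [1.183, -0.547, 0.465]
```

In practice C is identified experimentally by applying known loads along each axis and fitting the matrix, which is what a sensor characterization procedure produces.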

  10. Demonstrator 1: User Interface and User Functions

    DEFF Research Database (Denmark)

    Gram, Christian

    1999-01-01

    Describes the user interface and its functionality in a prototype system used for a virtual seminar session. The functionality is restricted to what is needed for a distributed seminar discussion among not too many people. The system is designed to work with the participants distributed at several...

  11. Preface (to Playful User Interfaces)

    NARCIS (Netherlands)

Nijholt, A.; Nijholt, Antinus

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  12. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI) that reviews the design of end-user interfaces.This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI that range from basic definitions of the problem, evaluation of how to look at the problem domain, and fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics-definition of the problem, psychological and social factors, principles of interface design, computer intelligenc

  13. User acquaintance with mobile interfaces.

    Science.gov (United States)

    Ehrler, Frederic; Walesa, Magali; Sarrey, Evelyne; Wipfli, Rolf; Lovis, Christian

    2013-01-01

Handheld technology is slowly finding its place in the healthcare world. Some clinicians already make intensive use of dedicated mobile applications to consult clinical references. However, handheld technology still hasn't been broadly embraced at the core of the healthcare business, the hospitals. The weak penetration of handheld technology in hospitals can be partly explained by the caution of stakeholders, who must be convinced of the efficiency of these tools before going forward. In a domain where temporal constraints are increasingly strong, caregivers cannot lose time playing with gadgets. Not all users are comfortable with tactile manipulations, and the lack of a dedicated peripheral complicates entering data for novices. Stakeholders must be convinced that caregivers will be able to master handheld devices. In this paper, we make the assumption that the proper design of an interface may influence users' performance in recording information. We are also interested in finding out whether users increase their efficiency when using handheld tools repeatedly. To answer these questions, we set up a field study to compare users' performance on three different user interfaces while recording vital signs. Some user interfaces were familiar to users, and others were totally innovative. Results showed that users' familiarity with smartphones influences their performance and that users improve their performance by repeating a task.

  14. The use of haptic interfaces and web services in crystallography: an application for a 'screen to beam' interface.

    Science.gov (United States)

    Bruno, Andrew E; Soares, Alexei S; Owen, Robin L; Snell, Edward H

    2016-12-01

    Haptic interfaces have become common in consumer electronics. They enable easy interaction and information entry without the use of a mouse or keyboard. The work presented here illustrates the application of a haptic interface to crystallization screening in order to provide a natural means for visualizing and selecting results. By linking this to a cloud-based database and web-based application program interface, the same application shifts the approach from 'point and click' to 'touch and share', where results can be selected, annotated and discussed collaboratively. In the crystallographic application, given a suitable crystallization plate, beamline and robotic end effector, the resulting information can be used to close the loop between screening and X-ray analysis, allowing a direct and efficient 'screen to beam' approach. The application is not limited to the area of crystallization screening; 'touch and share' can be used by any information-rich scientific analysis and geographically distributed collaboration.

  15. User interface user's guide for HYPGEN

    Science.gov (United States)

    Chiu, Ing-Tsau

    1992-01-01

    The user interface (UI) of HYPGEN is developed using Panel Library to shorten the learning curve for new users and provide easier ways to run HYPGEN for casual users as well as for advanced users. Menus, buttons, sliders, and type-in fields are used extensively in UI to allow users to point and click with a mouse to choose various available options or to change values of parameters. On-line help is provided to give users information on using UI without consulting the manual. Default values are set for most parameters and boundary conditions are determined by UI to further reduce the effort needed to run HYPGEN; however, users are free to make any changes and save it in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by UI and saved in a PLOT3D journal file. For large grids which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while UI stays at the workstation.

  16. Co-located haptic and 3D graphic interface for medical simulations.

    Science.gov (United States)

    Berkelman, Peter; Miyasaka, Muneaki; Bozlee, Sebastian

    2013-01-01

    We describe a system which provides high-fidelity haptic feedback in the same physical location as a 3D graphical display, in order to enable realistic physical interaction with virtual anatomical tissue during modelled procedures such as needle driving, palpation, and other interventions performed using handheld instruments. The haptic feedback is produced by the interaction between an array of coils located behind a thin flat LCD screen, and permanent magnets embedded in the instrument held by the user. The coil and magnet configuration permits arbitrary forces and torques to be generated on the instrument in real time according to the dynamics of the simulated tissue by activating the coils in combination. A rigid-body motion tracker provides position and orientation feedback of the handheld instrument to the computer simulation, and the 3D display is produced using LCD shutter glasses and a head-tracking system for the user.
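The actuation principle described above relies on superposition: each coil contributes a force on the instrument's magnets roughly proportional to its current, so a desired force is produced by combining coil currents. The per-coil contribution vectors and currents below are invented for illustration:

```python
# Superpose per-coil force contributions on a magnet-bearing instrument.
# Contribution vectors (N/A) and currents (A) are hypothetical.

def net_force(contributions, currents):
    """Sum per-ampere force contributions weighted by coil currents."""
    fx = fy = fz = 0.0
    for (cx, cy, cz), i in zip(contributions, currents):
        fx += cx * i
        fy += cy * i
        fz += cz * i
    return (fx, fy, fz)

# Per-ampere force each coil exerts on the magnet at its current pose.
coil_gains = [(0.5, 0.0, 0.2), (0.0, 0.5, 0.2),
              (-0.5, 0.0, 0.2), (0.0, -0.5, 0.2)]
currents = [1.0, 0.0, -1.0, 0.0]   # net pull along +x; z components cancel
print(net_force(coil_gains, currents))  # -> (1.0, 0.0, 0.0)
```

Because the contribution vectors change as the instrument moves, a real controller would recompute them from the tracked pose each cycle before solving for currents.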

  17. Spelling Correction in User Interfaces.

    Science.gov (United States)

    1982-12-20

we have concluded that there are considerable benefits and few obstacles to providing a spelling corrector in almost any interactive user interface. Key...the CACM 23, 12 (December 1980), 676-687. 8. John F. Reiser (ed.). SAIL Manual. Stanford University Computer Science Department, 1976. 9. Warren

  18. The HEASARC graphical user interface

    Science.gov (United States)

    White, N.; Barrett, P.; Jacobs, P.; Oneel, B.

    1992-01-01

    An OSF/Motif-based graphical user interface has been developed to facilitate the use of the database and data analysis software packages available from the High Energy Astrophysics Science Archive Research Center (HEASARC). It can also be used as an interface to other, similar, routines. A small number of tables are constructed to specify the possible commands and command parameters for a given set of analysis routines. These tables can be modified by a designer to affect the appearance of the interface screens. They can also be dynamically changed in response to parameter adjustments made while the underlying program is running. Additionally, a communication protocol has been designed so that the interface can operate locally or across a network. It is intended that this software be able to run on a variety of workstations and X terminals.

  19. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there has been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses.

  20. User interface inspection methods a user-centered design method

    CERN Document Server

    Wilson, Chauncey

    2014-01-01

    User Interface Inspection Methods succinctly covers five inspection methods: heuristic evaluation, perspective-based user interface inspection, cognitive walkthrough, pluralistic walkthrough, and formal usability inspections. Heuristic evaluation is perhaps the best-known inspection method, requiring a group of evaluators to review a product against a set of general principles. The perspective-based user interface inspection is based on the principle that different perspectives will find different problems in a user interface. In the related persona-based inspection, colleagues assume the

  1. Support for Conference Entitled The Fifth PHANTOM Users Group Workshop

    National Research Council Canada - National Science Library

    Reinig, Karl

    2001-01-01

    The Fifth PHANToM Users Group (PUG2000) brought together, in an intimate setting, participants who are actively engaged in making computer haptics practical and useful through the use of the PHANTOM Haptic Interface...

  2. Some Economics of User Interfaces

    OpenAIRE

    Hal R. Varian

    1994-01-01

    I examine the incentives for software providers to design appropriate user interfaces. There are two sorts of costs involved when one uses software: the fixed cost of learning to use a piece of software and the variable cost of operating the software. For example, menu-driven software is easy to learn but tedious to operate. I show that a monopoly provider of software generally invests the ``right'' amount of resources in making the software easy to learn, but too little in making it easy...

  3. User Interface Cultures of Mobile Knowledge Workers

    Directory of Open Access Journals (Sweden)

    Petri Mannonen

    2008-10-01

    Information and communication tools (ICTs) have become a major influence on how modern work is carried out. Methods of user-centered design do not, however, take into account the full complexity of technology and the user interface context the users live in. User interface culture analysis aims at providing designers with new ways and strategies to take the current user interface environment into account when designing new products. This paper describes the reasons behind user interface culture analysis and shows examples of its usage in studying mobile and distributed knowledge workers.

  4. Designing a flexible user interface for both users and programmers.

    Science.gov (United States)

    Kishore, S; Feingold, E

    1989-01-01

    The design of a user interface for computers is examined from both the end user's and the programmer's point of view. Different methods of menu selection and user feedback are discussed. A graphics interface using pull down menus and dialog boxes is ideal for simplifying user interaction and program organization. This style of interface also provides for a modular program development environment, reduced program development time, program portability, and reduced maintenance. Software tools for programming the user interface are explored and pseudo-code examples are given.

  5. An arm wearable haptic interface for impact sensing on unmanned aerial vehicles

    Science.gov (United States)

    Choi, Yunshil; Hong, Seung-Chan; Lee, Jung-Ryul

    2017-04-01

    In this paper, an impact monitoring system using fiber Bragg grating (FBG) sensors and vibro-haptic actuators is introduced. The system is proposed for structural health monitoring (SHM) of unmanned aerial vehicles (UAVs), with decisions made through human-robot interaction. The system is composed of two major subsystems: an on-board system equipped on the UAV and an arm-wearable interface for the ground pilot. The on-board system acquires impact-induced wavelength changes and performs a localization process developed on the basis of arrival-time calculation. The arm-wearable interface helps ground pilots make decisions about the impact location themselves by stimulating their tactile sense with motor vibration.
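    Arrival-time impact localization of the kind this record describes can be sketched as a grid search: pick the candidate point whose predicted inter-sensor arrival-time differences best match the measured ones. The sensor layout, wave speed, and grid below are illustrative assumptions, not the paper's implementation:

```python
import itertools
import numpy as np

def localize(sensors, arrival_times, speed, grid):
    """Pick the grid point whose predicted arrival-time differences
    best match the measured ones (least squares over sensor pairs)."""
    best, best_err = None, float("inf")
    for p in grid:
        d = np.linalg.norm(sensors - p, axis=1) / speed  # predicted times
        err = 0.0
        for i, j in itertools.combinations(range(len(sensors)), 2):
            err += ((arrival_times[i] - arrival_times[j]) - (d[i] - d[j])) ** 2
        if err < best_err:
            best, best_err = p, err
    return best

# Four sensors at the corners of a 1 m x 1 m panel; impact at (0.3, 0.7).
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
impact = np.array([0.3, 0.7])
speed = 1500.0  # assumed wave speed, m/s
times = np.linalg.norm(sensors - impact, axis=1) / speed
grid = [np.array([x, y]) for x in np.arange(0, 1.01, 0.1)
                         for y in np.arange(0, 1.01, 0.1)]
print(localize(sensors, times, speed, grid))  # close to [0.3, 0.7]
```

    Using time differences rather than absolute times means the (unknown) impact instant cancels out, which is why pairs of sensors are compared.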

  6. User interface adaptability for all users | Akazue | International ...

    African Journals Online (AJOL)

    Also, the Bean Development Kit (BDK) 1.0 tool, an abandoned Sun-Java project, was modified by adding more applets to make it robust and by making applications run on their own instead of in a web browser, which was its original design. Keywords: User interface, Bean Development Kit, graphical user interface, applets

  7. Vision as a user interface

    Science.gov (United States)

    Koenderink, Jan

    2011-03-01

    The egg-rolling behavior of the graylag goose is an often-quoted example of a fixed-action pattern. The bird will even attempt to roll a brick back to its nest! Despite excellent visual acuity, it apparently takes a brick for an egg. Evolution optimizes utility, not veridicality. Yet textbooks take it for a fact that human vision evolved so as to approach veridical perception. How do humans manage to dodge the laws of evolution? I will show that they don't, but that human vision is an idiosyncratic user interface. By way of an example I consider the case of pictorial perception. Gleaning information from still images is an important human ability and is likely to remain so for the foreseeable future. I will discuss a number of instances of extreme non-veridicality and huge inter-observer variability. Despite their importance in applications (information dissemination, personnel selection, ...) such huge effects have remained undocumented in the literature, although they can be traced to artistic conventions. The reason appears to be that conventional psychophysics, by design, fails to address the qualitative, that is the meaningful, aspects of visual awareness, whereas this is the very target of the visual arts.

  8. User Interface Design for Dynamic Geometry Software

    Science.gov (United States)

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  9. Learning Analytics for Natural User Interfaces

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Shum, Simon Buckingham; Schneider, Bertrand; Charleer, Sven; Klerkx, Joris; Duval, Erik

    2017-01-01

    The continuous advancement of natural user interfaces (NUIs) allows for the development of novel and creative ways to support collocated collaborative work in a wide range of areas, including teaching and learning. The use of NUIs, such as those based on interactive multi-touch surfaces and tangible user interfaces (TUIs), can offer unique…

  10. Gestures in an Intelligent User Interface

    NARCIS (Netherlands)

    Fikkert, F.W.; van der Vet, P.E.; Nijholt, Antinus; Shao, Ling; Shan, Caifeng; Luo, Jiebo; Etoh, Minoru

    2010-01-01

    In this chapter we investigated which hand gestures are intuitive to control a large display multimedia interface from a user's perspective. Over the course of two sequential user evaluations we defined a simple gesture set that allows users to fully control a large display multimedia interface, intuitively.

  11. Distributed user interfaces usability and collaboration

    CERN Document Server

    Lozano, María D; Tesoriero, Ricardo; Penichet, Victor MR

    2013-01-01

    Written by international researchers in the field of Distributed User Interfaces (DUIs), this book brings together important contributions regarding collaboration and usability in Distributed User Interface settings. Throughout the thirteen chapters, authors address key questions concerning how collaboration can be improved by using DUIs, including in which situations a DUI is suitable to ease the collaboration among users, and how usability standards can be used to evaluate the usability of systems based on DUIs. The chapters also describe case studies and prototypes implementing these concerns.

  12. Evaluating Author and User Experience for an Audio-Haptic System for Annotation of Physical Models.

    Science.gov (United States)

    Coughlan, James M; Miele, Joshua

    2017-01-01

    We describe three usability studies involving a prototype system for creation and haptic exploration of labeled locations on 3D objects. The system uses a computer, webcam, and fiducial markers to associate a physical 3D object in the camera's view with a predefined digital map of labeled locations ("hotspots"), and to do real-time finger tracking, allowing a blind or visually impaired user to explore the object and hear individual labels spoken as each hotspot is touched. This paper describes: (a) a formative study with blind users exploring pre-annotated objects to assess system usability and accuracy; (b) a focus group of blind participants who used the system and, through structured and unstructured discussion, provided feedback on its practicality, possible applications, and real-world potential; and (c) a formative study in which a sighted adult used the system to add labels to on-screen images of objects, demonstrating the practicality of remote annotation of 3D models. These studies and related literature suggest potential for future iterations of the system to benefit blind and visually impaired users in educational, professional, and recreational contexts.
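    The hotspot behaviour this record describes reduces, once the fingertip is tracked, to a proximity test between the finger position and a table of labeled 3D locations. A minimal sketch; the hotspot table, labels, and radius are hypothetical illustrations, not the authors' data:

```python
import math

# Hypothetical hotspot map for a 3D model: label -> (x, y, z) in model space.
HOTSPOTS = {
    "left ventricle": (0.02, -0.01, 0.05),
    "aortic arch": (0.00, 0.04, 0.08),
    "apex": (0.03, -0.05, 0.00),
}

def touched_label(finger, radius=0.01):
    """Return the label of the hotspot within `radius` of the tracked
    fingertip position, or None if no hotspot is close enough."""
    for label, p in HOTSPOTS.items():
        if math.dist(finger, p) <= radius:
            return label
    return None

print(touched_label((0.021, -0.011, 0.049)))  # left ventricle
print(touched_label((0.5, 0.5, 0.5)))         # None
```

    In the actual system the returned label would be handed to a text-to-speech engine; the radius trades off false touches against hotspots that are hard to hit.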

  13. Gestures in an Intelligent User Interface

    Science.gov (United States)

    Fikkert, Wim; van der Vet, Paul; Nijholt, Anton

    In this chapter we investigated which hand gestures are intuitive to control a large display multimedia interface from a user's perspective. Over the course of two sequential user evaluations, we defined a simple gesture set that allows users to fully control a large display multimedia interface, intuitively. First, we evaluated numerous gesture possibilities for a set of commands that can be issued to the interface. These gestures were selected from literature, science fiction movies, and a previous exploratory study. Second, we implemented a working prototype with which the users could interact with both hands, using the preferred hand gestures, with 2D and 3D visualizations of biochemical structures. We found that the gestures are influenced to a significant extent by the fast-paced developments in multimedia interfaces such as the Apple iPhone and the Nintendo Wii, and to no lesser degree by decades of experience with the more traditional WIMP-based interfaces.

  14. Applying Cognitive Psychology to User Interfaces

    Science.gov (United States)

    Durrani, Sabeen; Durrani, Qaiser S.

    This paper explores some key aspects of cognitive psychology that may be mapped onto user interfaces. The major focus in existing user interface guidelines is on consistency, simplicity, feedback, system messages, display issues, navigation, colors, graphics, visibility and error prevention [8-10]. These guidelines are effective in designing user interfaces. However, they do not handle the issues that may arise due to the innate structure of the human brain and human limitations: for example, where to place graphics on the screen so that users can easily process them, and what kind of background should be given on the screen given the limitations of the human motor system. In this paper we have collected some available guidelines from the area of cognitive psychology [1, 5, 7]. In addition, we have extracted a few guidelines from theories and studies of cognitive psychology [3, 11] which may be mapped to user interfaces.

  15. Improving Challenge/Skill Ratio in a Multimodal Interface by Simultaneously Adapting Game Difficulty and Haptic Assistance through Psychophysiological and Performance Feedback

    Directory of Open Access Journals (Sweden)

    Carlos Rodriguez-Guerrero

    2017-05-01

    In order to harmonize robotic devices with human beings, the robots should be able to perceive important psychosomatic impact triggered by emotional states such as frustration or boredom. This paper presents a new type of biocooperative control architecture, which acts toward improving the challenge/skill relation perceived by the user when interacting with a robotic multimodal interface in a cooperative scenario. In the first part of the paper, open-loop experiments revealed which physiological signals were optimal for inclusion in the feedback loop. These were heart rate, skin conductance level, and skin conductance response frequency. In the second part of the paper, the proposed controller, consisting of a biocooperative architecture with two degrees of freedom, simultaneously modulating game difficulty and haptic assistance through performance and psychophysiological feedback, is presented. With this setup, the perceived challenge can be modulated by means of the game difficulty and the perceived skill by means of the haptic assistance. A new metric (FlowIndex) is proposed to numerically quantify and visualize the challenge/skill relation. The results are contrasted with comparable previously published work and show that the new method afforded a higher FlowIndex (i.e., a superior challenge/skill relation) and an improved balance between augmented performance and user satisfaction (higher level of valence, i.e., a more enjoyable and satisfactory experience).

  16. Human Perception Test of Discontinuous Force and a Trial of Skill Transfer Using a Five-Fingered Haptic Interface

    Directory of Open Access Journals (Sweden)

    Takahiro Endo

    2010-01-01

    In the transfer of expert skills, it takes a great deal of time and effort for beginners to acquire new skills, and it is difficult to teach the skills using words alone. For those reasons, a skill transfer system that uses virtual reality (VR) and a haptic interface technique is very attractive. In this study, we investigated the human perception of fingertip force with respect to the following changes: (1) the spatial change of the presented force, and (2) the change of the time to present the force. Based on the results of the perception experiments, we considered skill transfer to a person's five fingers by using a five-fingered haptic interface robot.

  17. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  18. On user behaviour adaptation under interface change

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2014-02-01

    Different interfaces allow a user to achieve the same end goal through different action sequences, e.g., command lines vs. drop-down menus...

  19. Playful User Interfaces. Interfaces that Invite Social and Physical Interaction.

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  20. Model driven development of user interface prototypes

    DEFF Research Database (Denmark)

    Störrle, Harald

    2010-01-01

    Many approaches to interface development apply only to isolated aspects of the development of user interfaces (UIs), e.g., exploration during the early phases, design of visual appearance, or implementation in some technology. In this paper we explore an _integrated_ approach to incorporate the w...

  1. Playful user interfaces interfaces that invite social and physical interaction

    CERN Document Server

    2014-01-01

    The book is about user interfaces to applications that have been designed for social and physical interaction. The interfaces are ‘playful’, that is, users feel challenged to engage in social and physical interaction because that will be fun. The topics that will be present in this book are interactive playgrounds, urban games using mobiles, sensor-equipped environments for playing, child-computer interaction, tangible game interfaces, interactive tabletop technology and applications, full-body interaction, exertion games, persuasion, engagement, evaluation, and user experience. Readers of the book will not only get a survey of state-of-the-art research in these areas, but the chapters in this book will also provide a vision of the future where playful interfaces will be ubiquitous, that is, present and integrated in home, office, recreational, sports and urban environments, emphasizing that in the future in these environments game elements will be integrated and welcomed.

  2. A human activity approach to User Interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    1989-01-01

    the work situations in which computer-based artifacts are used: The framework deals with the role of the user interface in purposeful human work. Human activity theory is used in this analysis. The purpose of this article is to make the reader curious and hopefully open his or her eyes to a somewhat different way of thinking about the user interface. The article applies examples of real-life interfaces to support this process, but it does not include a systematic presentation of empirical results. I focus on the role of the computer application in use. Thus, it is necessary to consider human-computer interaction and other related work conditions. I deal with human experience and competence as being rooted in the practice of the group that conducts the specific work activity. The main conclusions are: The user interface cannot be seen independently of the use activity (i.e., the professional, socially...

  3. Language workbench user interfaces for data analysis

    Science.gov (United States)

    Benson, Victoria M.

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929

  4. Through the Interface - a human activity approach to user interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    In providing a theoretical framework for understanding human-computer interaction as well as design of user interfaces, this book combines elements of anthropology, psychology, cognitive science, software engineering, and computer science. The framework examines the everyday work practices of users when analyzing and designing computer applications. The text advocates the unique theory that computer application design is fundamentally a collective activity in which the various practices of the participants meet in a process of mutual learning.

  5. Liferay 6.2 user interface development

    CERN Document Server

    Chen, Xinsheng

    2013-01-01

    A step-by-step tutorial, targeting the Liferay 6.2 version. This book takes a step-by-step approach to customizing the look and feel of your website, and shows you how to build a great-looking user interface as well. "Liferay 6.2 User Interface Development" is for anyone who is interested in the Liferay Portal. It contains text that explicitly introduces you to the Liferay Portal. You will benefit most from this book if you have Java programming experience and have coded servlets or JavaServer Pages before. Experienced Liferay portal developers will also find this book useful because it expla

  6. Programming Graphical User Interfaces in R

    CERN Document Server

    Verzani, John

    2012-01-01

    Programming Graphical User Interfaces with R introduces each of the major R packages for GUI programming: RGtk2, qtbase, Tcl/Tk, and gWidgets. With examples woven through the text as well as stand-alone demonstrations of simple yet reasonably complete applications, the book features topics especially relevant to statisticians who aim to provide a practical interface to functionality implemented in R. The book offers: A how-to guide for developing GUIs within R The fundamentals for users with limited knowledge of programming within R and other languages GUI design for specific functions or as l

  7. Sensators : Active multisensory tangible user interfaces

    NARCIS (Netherlands)

    Erp, J.B.F. van; Willemse, C.J.A.M.; Janssen, J.B.; Toet, A.

    2014-01-01

    Although Tangible User Interfaces are considered an intuitive means of human-computer interaction, they oftentimes lack the option to provide active feedback. We developed ‘Sensators’: generic shaped active tangibles to be used on a multi-touch table. Sensators can represent digital information by

  8. Flash Builder customizing the user interface

    CERN Document Server

    Rocchi, Cesare

    2010-01-01

    Personalize user interface components of your projects. Example projects are grouped together in an AIR application and the appearance is totally customized. Learn how to change visual properties by means of style directives or create brand new skins by knowing and exploiting their internal architecture.

  9. The Promise of Zoomable User Interfaces

    Science.gov (United States)

    Bederson, Benjamin B.

    2011-01-01

    Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…

  10. More playful user interfaces: an introduction

    NARCIS (Netherlands)

    Unknown, [Unknown; Nijholt, A.; Nijholt, Antinus

    2015-01-01

    In this chapter we embed recent research advances in creating playful user interfaces in a historical context. We offer observations on how leisure time is spent, in particular predictions from previous decades and views expressed in science fiction novels. We confront these views and predictions with

  11. Embodied Conversational Interfaces for the Elderly User

    NARCIS (Netherlands)

    Mehrotra, S.; Motti, V. G.; Frijns, H.; Akkoc, T.; Yengeç, S. B.; Calik, O.; Peeters, M.M.M.; Neerincx, M. A.

    2016-01-01

    This paper describes the design and development of an embodied conversational agent (ECA) that provides a social interface for older adults. Following a user-centred design approach, we implemented a multimodal agent consisting of a virtual character and a robot. This so-called "bi-bodied

  12. Rehabilitation of activities of daily living in virtual environments with intuitive user interface and force feedback.

    Science.gov (United States)

    Chiang, Vico Chung-Lim; Lo, King-Hung; Choi, Kup-Sze

    2017-10-01

    To investigate the feasibility of using a virtual rehabilitation system with intuitive user interface and force feedback to improve skills in activities of daily living (ADL). A virtual training system equipped with haptic devices was developed for the rehabilitation of three ADL tasks: door unlocking, water pouring and meat cutting. Twenty subjects with upper limb disabilities, supervised by two occupational therapists, received four training sessions using the system. The task completion time and the amount of water poured into a virtual glass were recorded. Performance on the three tasks in reality was assessed before and after the virtual training, and feedback from the participants was collected with questionnaires after the study. The completion time of the virtual tasks decreased over the course of the training. The system's intuitive user interface and force feedback were designed to improve the learning of the manual skills, and the study shows that the system could be used as a training tool to complement conventional rehabilitation approaches.

  13. Towards Essential Visual Variables in User Interface Design

    OpenAIRE

    Silvennoinen, Johanna

    2014-01-01

    This paper focuses on visual variables in user interface design from the user perspective. The visual design of user interfaces is essential to users interacting with different software. The study is conducted with 3E-templates, with which users express their impressions of visual website design by writing and drawing. The data is analyzed with qualitative content analysis through an interpretation framework. The results of this study provide new insights into user-centered visual user interface d...

  14. Factors Influencing Undergraduate Students' Acceptance of a Haptic Interface for Learning Gross Anatomy

    Science.gov (United States)

    Yeom, Soonja; Choi-Lundberg, Derek L.; Fluck, Andrew Edward; Sale, Arthur

    2017-01-01

    Purpose: This study aims to evaluate factors influencing undergraduate students' acceptance of a computer-aided learning resource using the Phantom Omni haptic stylus to enable rotation, touch and kinaesthetic feedback and display of names of three-dimensional (3D) human anatomical structures on a visual display. Design/methodology/approach: The…

  15. Pervasive haptics science, design, and application

    CERN Document Server

    Saga, Satoshi; Konyo, Masashi

    2016-01-01

    This book examines the state of the art in diverse areas of haptics (touch)-related research, including the psychophysics and neurophysiology of haptics, development of haptics displays and sensors, and applications to a wide variety of fields such as industry, education, therapy, medicine, and welfare for the visually impaired. It also discusses the potential of future haptics interaction, such as haptics for emotional control and remote haptics communication. The book offers a valuable resource not only for haptics and human interface researchers, but also for developers and designers at manufacturing corporations and in the entertainment industries.

  16. Review of surgical robotics user interface: what is the best way to control robotic surgery?

    Science.gov (United States)

    Simorov, Anton; Otte, R Stephen; Kopietz, Courtni M; Oleynikov, Dmitry

    2012-08-01

    As surgical robots begin to occupy a larger place in operating rooms around the world, continued innovation is necessary to improve our outcomes. A comprehensive review of current surgical robotic user interfaces was performed to describe the modern surgical platforms, identify the benefits, and address the issues of feedback and limitations of visualization. Most robots currently used in surgery employ a master/slave relationship, with the surgeon seated at a work-console, manipulating the master system and visualizing the operation on a video screen. Although enormous strides have been made to advance current technology to the point of clinical use, limitations still exist. A lack of haptic feedback to the surgeon and the inability of the surgeon to be stationed at the operating table are the most notable examples. The future of robotic surgery sees a marked increase in the visualization technologies used in the operating room, as well as in the robots' abilities to convey haptic feedback to the surgeon. This will allow unparalleled sensation for the surgeon and almost eliminate inadvertent tissue contact and injury. A novel design for a user interface will allow the surgeon to have access to the patient bedside, remaining sterile throughout the procedure, employ a head-mounted three-dimensional visualization system, and allow the most intuitive master manipulation of the slave robot to date.

  17. User interface design of electronic appliances

    CERN Document Server

    Baumann, Konrad

    2002-01-01

    Foreword by Brenda Laurel. Part One: Introduction 1. Background, Bruce Thomas 2. Introduction, Konrad Baumann 3. The Interaction Design Process, Georg Rakers Part Two: User Interface Design 4. Creativity Techniques, Irene Mavrommati 5. Design Principles, Irene Mavrommati and Adrian Martel 6. Design of On-Screen Interfaces, Irene Mavrommati Part Three: Input Devices 7. Controls, Konrad Baumann 8. Keyboards, Konrad Baumann 9. Advanced Interaction Techniques, Christopher Baber and Konrad Baumann 10. Speech Control, Christopher Baber and Jan Noyes 11. Wearable Computers, Christopher Baber Part Fou

  18. Occupational therapists' evaluation of haptic motor rehabilitation.

    Science.gov (United States)

    Kayyali, Ruba; Alamri, Atif; Eid, Mohamad; Iglesias, Rosa; Shirmohammadi, Shervin; El Saddik, Abdulmotaleb; Lemaire, Edward

    2007-01-01

    Haptic-based virtual rehabilitation systems have recently become a subject of interest. In addition to the benefits provided by virtual rehabilitation, haptic-based systems offer force and tactile feedback, which can be used for upper and lower extremity rehabilitation. In this paper, we present a system that uses haptics, in conjunction with virtual environments, to provide a rich media environment for motor rehabilitation of stroke patients. The system also provides Occupational Therapists (OTs) with a Graphical User Interface (GUI) that enables them to configure the hardware and virtual exercises and to monitor patients' performance. We also present an analysis of the system by a group of OTs from the Ottawa General Hospital Rehabilitation Center. The OTs' feedback, both positive and negative, and the results of the assessment test are also presented.

  19. Prototyping of user interfaces for mobile applications

    CERN Document Server

    Bähr, Benjamin

    2017-01-01

    This book investigates processes for the prototyping of user interfaces for mobile apps, and describes the development of new concepts and tools that can improve prototype-driven app development in the early stages. It presents the development and evaluation of a new requirements catalogue for mobile app prototyping tools that identifies the most important criteria such tools should meet at different prototype-development stages. This catalogue is not just a good point of orientation for designing new prototyping approaches; it also provides a set of metrics for comparing the performance of alternative prototyping tools. In addition, the book discusses the development of Blended Prototyping, a new approach for prototyping user interfaces for mobile applications in the early and middle development stages, and presents the results of an evaluation of its performance, showing that it provides a tool for teamwork-oriented, creative prototyping of mobile apps in the early design stages.

  20. Detecting users handedness for ergonomic adaptation of mobile user interfaces

    DEFF Research Database (Denmark)

    Löchtefeld, Markus; Schardt, Phillip; Krüger, Antonio

    2015-01-01

    Often, we operate mobile devices using only one hand. The hand thereby serves two purposes: holding the device and operating the touch screen with the thumb. The current trend of increasing screen sizes, however, makes it close to impossible to reach all parts of the screen (especially the top area) for users with average hand sizes. One solution is to offer adaptive user interfaces for such one-handed interactions. These modes have to be triggered manually and thus induce a critical overhead. They are further designed to bring all content closer, regardless of whether the phone is operated with the left or right hand. In this paper, we present an algorithm that allows determining the users' interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases. This is achieved through a k...
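
The unlock-based detection described in this record lends itself to a small feature-plus-nearest-neighbour sketch. The following is purely illustrative: the abstract is truncated before naming the exact classifier, and every feature choice and name below is our own assumption, not the authors' code.

```python
import math

# Illustrative sketch only: infer the interacting hand from an unlock swipe.
# A one-handed thumb swipe tends to bow toward the holding hand, so we use the
# midpoint's horizontal deviation from the start-end chord plus the mean x position.
def swipe_features(trace):
    """trace: list of (x, y) touch points recorded during the unlock gesture."""
    xs = [p[0] for p in trace]
    mean_x = sum(xs) / len(xs)
    (x0, _), (xm, _), (x1, _) = trace[0], trace[len(trace) // 2], trace[-1]
    bow = xm - (x0 + x1) / 2  # signed horizontal bow of the swipe
    return (mean_x, bow)

def classify_hand(trace, labelled):
    """labelled: list of (trace, 'left' | 'right') training examples.
    Returns the label of the nearest training example in feature space."""
    f = swipe_features(trace)
    def dist(example):
        g = swipe_features(example[0])
        return math.hypot(f[0] - g[0], f[1] - g[1])
    return min(labelled, key=dist)[1]
```

In practice the published approach reports 98.51% accuracy; a toy classifier like this only illustrates the kind of signal (swipe geometry) such an algorithm can exploit.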

  1. Designing the user interface for Wizard Wars

    OpenAIRE

    Yli-Kiikka, Daniela

    2015-01-01

    The goal of this thesis was to design a high-quality, functioning and implementation-ready user interface (UI) for the tablet strategy game Wizard Wars. UI design is traditionally a discipline where visuals take a backseat in favor of usability and functionality. In games, however, visual impressiveness is valued highly, and it is the only medium where it is deemed acceptable to sacrifice some amount of usability to create a more elaborate design to support the theme and atmosphere. Therefore...

  2. Language workbench user interfaces for data analysis

    OpenAIRE

    Victoria M. Benson; Fabien Campagne

    2015-01-01

    Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command line analysis so...

  3. Design Patterns for User Interfaces on Mobile Equipment

    Science.gov (United States)

    Nilsson, Erik G.

    The objective of this tutorial is to enhance the participants' skills in designing user interfaces for mobile equipment, including adaptive and context-sensitive user interfaces and multimodal interaction. Through a combination of lectures and practical exercises, a collection of patterns addressing issues in designing user interfaces for mobile devices is presented. The patterns address typical challenges and opportunities when designing user interfaces to run on PDAs and SmartPhones, covering both challenges arising from the characteristics of the equipment and tasks for which designing suitable user interfaces is difficult. The tutorial is intended for user interface designers, systems developers, and project leaders who work with or plan to work on the development of applications for mobile devices. It requires basic knowledge of user interface design in general and a basic understanding of the challenges of designing user interfaces for mobile devices.

  4. User Interface Improvements in Computer-Assisted Instruction, the Challenge.

    Science.gov (United States)

    Chalmers, P. A.

    2000-01-01

    Identifies user interface problems as they relate to computer-assisted instruction (CAI); reviews the learning theories and instructional theories related to CAI user interface; and presents potential CAI user interface improvements for research and development based on learning and instructional theory. Focuses on screen design improvements.…

  5. Consistency in use through model based user interface development

    OpenAIRE

    Trapp, M.; Schmettow, M.

    2006-01-01

    In dynamic environments envisioned under the concept of Ambient Intelligence, the consistency of user interfaces is of particular importance. To address this, the variability of the environment has to be transformed into a coherent user experience. In this paper we explain several dimensions of consistency and present our ideas and recent results on achieving adaptive and consistent user interfaces by exploiting the technology of model-driven user interface development.

  6. A Model-Based Approach for Distributed User Interfaces

    OpenAIRE

    Melchior, Jérémie; Vanderdonckt, Jean; Van Roy, Peter; 3rd ACM Symposium on Engineering Interactive Computing Systems EICS’2011

    2011-01-01

    This paper describes a model-based approach for designing Distributed User Interfaces (DUIs), i.e., graphical user interfaces that are distributed along the following dimensions: end user, display device, computing platform, and physical environment. The three pillars of this model-based approach are: (i) a Concrete User Interface model for DUIs incorporating the distribution dimensions and expressing any DUI element in an XML-compliant format down to the granularity of an individual DUI element...

  7. How to design software user interfaces to prevent musculoskeletal symptoms

    NARCIS (Netherlands)

    Lingen P. van

    2006-01-01

    Static postures, repetitive movements and precision demands are causes for musculoskeletal disorders. Features in the design of user interfaces of software contribute to these risks. The user interface can increase risks of musculoskeletal disorders by forcing the user to repeat movements, to

  8. A user experience model for tangible interfaces for children

    NARCIS (Netherlands)

    Reidsma, Dennis; van Dijk, Elisabeth M.A.G.; van der Sluis, Frans; Volpe, G; Camurri, A.; Perloy, L.M.; Nijholt, Antinus

    2015-01-01

    Tangible user interfaces allow children to take advantage of their experience in the real world when interacting with digital information. In this paper we describe a model for tangible user interfaces specifically for children that focuses mainly on the user experience during interaction and on how

  9. User interface issues in supporting human-computer integrated scheduling

    Science.gov (United States)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  10. Development of a haptic interface for motor rehabilitation therapy using augmented reality.

    Science.gov (United States)

    Vidrios-Serrano, Carlos; Bonilla, Isela; Vigueras-Gomez, Flavio; Mendoza, Marco

    2015-08-01

    In this paper, a robot-assisted therapy system is presented, mainly focused on the improvement of fine movements of patients with motor deficits of the upper limbs. The system combines the use of a haptic device with an augmented reality environment in which a set of occupational therapy exercises is implemented. The main goal of the system is to provide extra motivation to patients, who are stimulated visually and tactilely in a scene that mixes elements of the real and virtual worlds. Additionally, using the norm of the tracking error, it is possible to quantitatively measure the performance of the patient during a therapy session; likewise, it is possible to obtain information such as runtime and the followed path.
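
The performance metric mentioned above, the norm of the tracking error, can be made concrete as an RMS error over matched samples of the prescribed and recorded trajectories. This is a minimal sketch of our own, not the authors' implementation:

```python
import math

# Minimal sketch: summarize a therapy session by the RMS norm of the tracking
# error between the prescribed path and the patient's recorded trajectory.
def tracking_error_norm(reference, actual):
    """reference, actual: equal-length lists of (x, y, z) samples taken at
    matching instants. Returns the RMS tracking error for the session."""
    assert len(reference) == len(actual) and reference
    sq = sum((rx - ax) ** 2 + (ry - ay) ** 2 + (rz - az) ** 2
             for (rx, ry, rz), (ax, ay, az) in zip(reference, actual))
    return math.sqrt(sq / len(reference))
```

A perfect trace yields 0; larger values indicate poorer fine-motor tracking, which is what makes the norm usable as a session-level performance score.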

  11. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components; this ensures that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version, and I developed code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another project assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  12. Frictional Compliant Haptic Contact and Deformation of Soft Objects

    Directory of Open Access Journals (Sweden)

    Naci Zafer

    2016-05-01

    This paper is concerned with compliant haptic contact and deformation of soft objects. A human soft-fingertip model acts as the haptic interface and is brought into contact with, and deforms, a discrete surface. A nonlinear constitutive law is developed for predicting normal forces and, for the haptic display of surface texture, motions along the surface are also resisted at various rates by accounting for dynamic Lund-Grenoble (LuGre) frictional forces. For the soft fingertip to apply forces over an area larger than a point, normal and frictional forces are distributed around the soft-fingertip contact location on the deforming surface. The distribution is realized with a kernel smoothing function and a nonlinear spring-damper net around the contact point. Experiments conducted demonstrate the accuracy and effectiveness of our approach in real-time haptic rendering of a kidney surface. The resistive (interaction) forces are applied at the user's fingertip bone edge. A 3-DoF parallel robotic manipulator equipped with a constraint-based controller is used for the implementation. By rendering forces in both lateral and normal directions, the designed haptic interface system allows the user to realistically feel both the geometrical and nonlinear mechanical properties of the deforming kidney.
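
The LuGre friction law used here for texture display can be sketched as a one-state model integrated once per haptic servo tick. This is a generic LuGre implementation; the parameter values below are illustrative choices of ours, not the values identified in the paper:

```python
import math

# Generic LuGre friction model (illustrative parameters, not the paper's values).
SIGMA0, SIGMA1, SIGMA2 = 1.0e4, 100.0, 0.4  # bristle stiffness, bristle damping, viscous term
F_C, F_S, V_S = 1.0, 1.5, 0.01              # Coulomb force, static force, Stribeck velocity

def lugre_step(z, v, dt):
    """Advance the internal bristle deflection z one time step dt at sliding
    velocity v; return (new_z, friction_force)."""
    g = F_C + (F_S - F_C) * math.exp(-(v / V_S) ** 2)  # Stribeck curve g(v)
    zdot = v - SIGMA0 * abs(v) / g * z                 # bristle state dynamics
    z = z + zdot * dt                                  # explicit Euler update
    return z, SIGMA0 * z + SIGMA1 * zdot + SIGMA2 * v
```

At each tick the tangential fingertip velocity drives `lugre_step`, and the returned force would then be distributed over the contact neighbourhood (e.g. by the kernel smoothing the paper describes) before being sent to the device.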

  13. How to Develop a User Interface That Your Real Users Will Love

    Science.gov (United States)

    Phillips, Donald

    2012-01-01

    A "user interface" is the part of an interactive system that bridges the user and the underlying functionality of the system. But people sometimes forget that the best interfaces will provide a platform to optimize the users' interactions so that they support and extend the users' activities in effective, useful, and usable ways. To look at it…

  14. Graphical user interface for intraoperative neuroimage updating

    Science.gov (United States)

    Rick, Kyle R.; Hartov, Alex; Roberts, David W.; Lunn, Karen E.; Sun, Hai; Paulsen, Keith D.

    2003-05-01

    Image-guided neurosurgery typically relies on preoperative imaging information that is subject to errors resulting from brain shift and deformation in the OR. A graphical user interface (GUI) has been developed to facilitate the flow of data from OR to image volume in order to provide the neurosurgeon with updated views concurrent with surgery. Upon acquisition of registration data for patient position in the OR (using fiducial markers), the Matlab GUI displays ultrasound image overlays on patient specific, preoperative MR images. Registration matrices are also applied to patient-specific anatomical models used for image updating. After displaying the re-oriented brain model in OR coordinates and digitizing the edge of the craniotomy, gravitational sagging of the brain is simulated using the finite element method. Based on this model, interpolation to the resolution of the preoperative images is performed and re-displayed to the surgeon during the procedure. These steps were completed within reasonable time limits and the interface was relatively easy to use after a brief training period. The techniques described have been developed and used retrospectively prior to this study. Based on the work described here, these steps can now be accomplished in the operating room and provide near real-time feedback to the surgeon.
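
The core geometric step such a GUI performs, mapping OR-space coordinates into the preoperative MR volume via the fiducial registration, can be sketched with a homogeneous transform. This is our illustration; the function and variable names are assumptions, not the paper's code:

```python
import numpy as np

def apply_registration(T, points):
    """T: 4x4 homogeneous matrix mapping OR coordinates to MR image coordinates
    (e.g. estimated from fiducial markers). points: (N, 3) array of OR-space
    points. Returns the corresponding (N, 3) MR-space points."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # append w = 1
    return (homog @ T.T)[:, :3]
```

The same matrix can be applied to ultrasound image corners for overlay display and to the vertices of the brain model before the finite-element deformation step.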

  15. User Interface for Volume Rendering in Virtual Reality Environments

    OpenAIRE

    Klein, Jonathan; Reuling, Dennis; Grimm, Jan; Pfau, Andreas; Lefloch, Damien; Lambers, Martin; Kolb, Andreas

    2013-01-01

    Volume Rendering applications require sophisticated user interaction for the definition and refinement of transfer functions. Traditional 2D desktop user interface elements have been developed to solve this task, but such concepts do not map well to the interaction devices available in Virtual Reality environments. In this paper, we propose an intuitive user interface for Volume Rendering specifically designed for Virtual Reality environments. The proposed interface allows transfer function d...

  16. Multiresolution representation of data in a haptic environment

    Science.gov (United States)

    Asghar, Mohammad W.; Barner, Kenneth E.

    1998-12-01

    A number of methods are available for the visualization of scientific data. Most use computer graphics for the visual representation of data, and such visual methods cannot be used by a blind person. Haptic interface technology makes it possible for a user to explore haptically rendered data; a haptic interface can therefore be used to present data effectively to a blind person. However, large and complex datasets, if rendered without processing, are often confusing to the user. Additionally, haptic devices are often point-interaction devices, so the amount of information conveyed through the device is far less than that obtained through a visual device, making exploration difficult. Multiresolution methods provide a solution to problems that arise due to the low information capacity of these devices: the user can feel the data at low resolution and then add detail by increasing the resolution. These techniques are particularly useful for the visually impaired because complex local details of the data often prevent the user from obtaining an overall view of the haptic plot. Wavelets are a common technique for generating multiresolution data. However, the wavelet decomposition uses linear filters, resulting in edges that are smoothed. Since nonlinear filters are known to preserve edges, we have used the affine median filter in a structure similar to that used for the evaluation of wavelet coefficients. The affine median filter is a hybrid filter because its characteristics can be varied from the nonlinear median filter to a linear filter, yielding a flexible multiresolution technique with controllable characteristics. The technique is used to haptically render 2D evenly sampled data at different resolutions. The standard wavelet multiresolution technique is also applied to the same data sets and compared to the hybrid multiresolution technique. The advantage of the hybrid method is that with the same multiresolution
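
The edge-preserving multiresolution idea can be illustrated with a plain median filter standing in for the affine median filter described above (a simplification of ours; the affine median's tunable median-to-linear behaviour is not reproduced here):

```python
import numpy as np

def median_pyramid(signal, levels=3, width=3):
    """Build a multiresolution pyramid by repeated median filtering and dyadic
    downsampling. Unlike linear (wavelet) smoothing, the median step keeps
    sharp edges sharp at every resolution level."""
    out = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        x = out[-1]
        padded = np.pad(x, width // 2, mode="edge")
        smoothed = np.array([np.median(padded[i:i + width]) for i in range(len(x))])
        out.append(smoothed[::2])  # halve the resolution
    return out
```

Run on a step signal, every level of the pyramid keeps the step at full height instead of blurring it, which is exactly the property that makes nonlinear filtering attractive for haptic exploration.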

  17. Development of a New Backdrivable Actuator for Haptic Interfaces and Collaborative Robots

    Directory of Open Access Journals (Sweden)

    Florian Gosselin

    2016-06-01

    Industrial robots are most often position controlled and insensitive to external forces. In many robotic applications, however, such as teleoperation, haptics for virtual reality, and collaborative robotics, a close cooperation between humans and robots is required. For such applications, force sensing and control capabilities are required for stable interactions with the operator and environment. The robots must also be backdrivable, i.e., the robot must be able to follow the user's induced movements with the least possible resistance. High force efficiency is also desirable. These requirements are different from the design drivers of traditional industrial robots and call for specific actuators and reducers. Many such devices have been proposed in the literature; however, they suffer from several drawbacks, offering either a limited reduction ratio or being complex and bulky. This paper introduces a novel solution to this problem: a new differential cable drive reducer. It is backdrivable, has a high efficiency, and offers a potentially infinite reduction ratio. A prototype actuator using this reducer has been developed and integrated on a test bench. The experimental characterization of its performance confirms its theoretical advantages.

  18. Reflections on Andes' Goal-Free User Interface

    Science.gov (United States)

    VanLehn, Kurt

    2016-01-01

    Although the Andes project produced many results over its 18 years of activity, this commentary focuses on its contributions to understanding how a goal-free user interface impacts the overall design and performance of a step-based tutoring system. Whereas a goal-aligned user interface displays relevant goals as blank boxes or empty locations that…

  19. Direct manipulation and the design of user interfaces

    NARCIS (Netherlands)

    Desain, P.

    1988-01-01

    An approach to user interfaces is made from a cognitive engineering viewpoint. A model of task representations within the user is given, together with complexity measures of the translations between the representations. Two approaches to interface design are compared: the conversational method and

  20. A framework of interface improvements for designing new user interfaces for the MANUS robot arm

    NARCIS (Netherlands)

    Tijsma, H.A.; Liefhebber, F.; Herder, J.L.

    2005-01-01

    Users of the MANUS robot arm experience a high cognitive and physical load when performing activities of daily living with the arm. These high loads originate from user interface problems and limitations. To reduce these high loads the user interface of the MANUS needs to be improved. Because large

  1. Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.

    Science.gov (United States)

    Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico

    2017-01-01

    Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all three tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.

  2. User interface for a partially incompatible system software environment with non-ADP users

    Energy Technology Data Exchange (ETDEWEB)

    Loffman, R.S.

    1987-08-01

    Good user interfaces to computer systems and software applications are the result of combining an analysis of user needs with knowledge of interface design principles and techniques. This thesis reports on an interface for an environment (a) whose users are not computer science or data processing professionals and (b) which is bound by predetermined software and hardware. An interface was designed that combined these considerations with user interface design principles. Current literature was investigated to establish a baseline of knowledge about user interface design. There are many techniques which can be used to implement a user interface, but all should have the same basic goal: to assist the user in the performance of a task. This can be accomplished by providing the user with consistent, well-structured interfaces that also provide the flexibility to adapt to differences among users. The resulting interface used menu-selection and command-language techniques to make two different operating system environments appear similar. Additional features helped address the needs of different users. The original goal was also to make the transition between the two systems transparent; this was not fully accomplished due to software and hardware limitations.

  3. User Interface Design for E-Learning System

    OpenAIRE

    Suteja, Bernard Renaldy; Harjoko, Agus

    2008-01-01

    With the demand for e-Learning steadily growing and the ongoing struggle to convince the skeptics of the potential of e-Learning and online virtual classrooms, quality design is the foundation for a successful DE program. The design of the instruction and the design of the user interface are critical elements in providing quality education with a virtual, e-Learning model. This White Paper will focus on the design of the e-Learning user interface (UI). This paper provides examples of user interfac...

  4. Psychological Dimensions of User-Computer Interfaces. ERIC Digest.

    Science.gov (United States)

    Marchionini, Gary

    This digest highlights several psychological dimensions of user-computer interfaces. First, the psychological theory behind interface design and the field of human-computer interaction (HCI) are discussed. Two psychological models, the information processing model of cognition and the mental model--both of which contribute to interface design--are…

  5. User Interface Aspects of a Human-Hand Simulation System

    Directory of Open Access Journals (Sweden)

    Beifang Yi

    2005-10-01

    This paper describes the user interface design for a human-hand simulation system, a virtual environment that produces ground-truth data (life-like human hand gestures and animations) and provides visualization support for experiments on computer-vision-based hand pose estimation and tracking. The system allows users to save time in data generation and easily create any hand gesture. We have designed and implemented this user interface with consideration of usability goals and software engineering issues.

  6. 1st International AsiaHaptics conference

    CERN Document Server

    Ando, Hideyuki; Kyung, Ki-Uk

    2015-01-01

    This book is aimed not only at haptics and human interface researchers, but also at developers and designers from manufacturing corporations and the entertainment industry who are working to change our lives. This publication comprises the proceedings of the first International AsiaHaptics conference, held in Tsukuba, Japan, in 2014. The book describes the state of the art of the diverse haptics- (touch-) related research, including scientific research into haptics perception and illusion, development of haptics devices, and applications for a wide variety of fields such as education, medicine, telecommunication, navigation, and entertainment.

  7. Ecological user interface for emergency management decision support systems

    DEFF Research Database (Denmark)

    Andersen, V.

    2003-01-01

    The user interface for a decision support system is normally structured to present relevant data to the skilled user, allowing fast assessment of and action on the hazardous situation, or, for more complex situations, to present the relevant rules and procedures to be followed in order to deal most efficiently with the situation. For situations not foreseen, however, no rules exist, and no support can be given to the user in the form of suggested actions. The idea of the ecological user interface is to present the complete situation to the user at various interrelated levels of abstraction, supporting situation assessment and remedial actions based on the domain knowledge of the user. The concept of the ecological user interface has been tested and appreciated in a variety of other domains using prototypes designed to be representative of industrial processes. The purpose

  8. User-interface aspects in recognizing connected-cursive handwriting

    NARCIS (Netherlands)

    Schomaker, L

    1994-01-01

    There are at least two major stumbling blocks for user acceptance of pen-based computers: the recognition performance is not good enough, especially on cursive handwriting; and the user interface technology has not reached a mature stage. The initial reaction of product reviewers and potential user

  9. Haptic object individuation

    NARCIS (Netherlands)

    Plaisier, M.A.; Bergmann Tiest, W.M.; Kappers, A.M.L.

    2010-01-01

    Item individuation, i.e., how we decide which parts belong to one object and which to another, is an important aspect of haptic perception and may be important for design of interfaces in which different buttons have to be distinguished. We daily hold several objects in our hand. Somehow, we decide

  10. Experiment on a novel user input for computer interface utilizing tongue input for the severely disabled.

    Science.gov (United States)

    Kencana, Andy Prima; Heng, John

    2008-11-01

    This paper introduces a novel passive tongue control and tracking device, intended for use by severely disabled or quadriplegic persons. The main distinction of this device from other existing tongue tracking devices is that the sensor employed is passive: no powered electrical sensor needs to be inserted into the user's mouth, and hence there are no trailing wires. This haptic interface device employs inductive sensors to track the position of the user's tongue. The device is able to perform two main PC functions, those of the keyboard and the mouse. The results show that this device allows a severely disabled person to have some control over his environment, such as turning daily electrical devices or appliances on and off, or serving as a viable PC Human Computer Interface (HCI) through tongue control. The operating principle and set-up of such a novel passive tongue HCI have been established with successful laboratory trials and experiments. Further clinical trials will be required to test the device on disabled persons before it is ready for future commercial development.

  11. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement a user interface with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipses, etc.) to visualize database contents. In VisTool, visual properties (e.g. color, position, etc.) can be formulas over user interface objects and properties. We built visualizations such as Lifelines, Parallel Coordinates, Heatmap, etc. to show that the formula-based approach is powerful enough for building customized visualizations. The evaluation with Cognitive Dimensions shows that the formula-based approach is cognitively

  12. Users expect interfaces to behave like the physical world

    DEFF Research Database (Denmark)

    Nørager, Rune

    2006-01-01

    Navigation in folder structures is an essential part of most window-based user interfaces. Basic human navigation strategies rely on stable properties of the physical world, which are not by default present in windows-style user interfaces. According to the theoretical framework of Ecological Cognitive Ergonomics, user interfaces that mimic the dynamics of the physical world should be more intuitive and easier to use. To test this hypothesis, 69 subjects solved a number of tasks involving navigation in folders in two different windows environments that varied in their degree of physical-world resemblance. Results showed that users had very strong physical-world biases in their use of the windows interfaces. The more ecological version was thus significantly faster to use and was preferred by the majority of users. These results seem to confirm the hypothesis and are discussed in light...

  13. High Fidelity Haptic Rendering

    CERN Document Server

    Otaduy, Miguel A

    2006-01-01

    The human haptic system, among all senses, provides unique and bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force feedback devices, have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance the...

  14. Perancangan User Interface E-learning Berbasis Web

    OpenAIRE

    Suteja, Bernard Renaldy; Harjoko, Agus

    2008-01-01

    With e-Learning steadily growing and the ongoing struggle to convince skeptics of the potential of e-Learning and online virtual classrooms, quality design is the foundation of a successful distance learning program. The design of the instruction and the design of the user interface are critical elements in providing quality education with a virtual, e-Learning model. This white paper will focus on the design of the e-Learning user interface (UI). This paper provides examples of user interfac...

  15. Mobile interface for neuroprosthesis control aiming tetraplegic users.

    Science.gov (United States)

    Barelli, Renato G; Aquino Junior, Plinio T; Ferrari de Castro, Maria Claudia

    2016-08-01

    This article proposes the development of a mobile interface for controlling a neuroprosthesis designed to restore grasp patterns, aimed at tetraplegic users at the C5 and C6 levels. Human Computer Interface paradigms and usability concepts guide its planning and development to guarantee the quality of the user's interaction with the system and thus the success and controllability of the neuroprosthesis. The number of screens and menus was optimized so that the user may find the interface more intuitive, leading to fast learning and increasing trust in it.

  16. The phenomenological experience of dementia and user interface development

    DEFF Research Database (Denmark)

    Peterson, Carrie Beth; Mitseva, Anelia; Mihovska, Albena D.

    2009-01-01

    This study follows the project ISISEMD through a phenomenological approach, investigating the experience of Human Computer Interaction (HCI) for someone with dementia. The aim is to accentuate the Assistive Technology (AT) from the end-user perspective. It proposes that older adults and those with dementia should no longer be an overlooked population, and considers how the HCI community can learn from their experiences to develop methods and design interfaces which truly benefit these individuals. Guidelines from previous research are incorporated along with eclectic, user-centered strategies as the interface design considerations for the adaptation of user interfaces.

  17. Designing for User Engagement: Aesthetic and Attractive User Interfaces

    CERN Document Server

    Sutcliffe, Alistair

    2009-01-01

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The chapter starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expanded...

  18. iPhone User Interface Cookbook

    CERN Document Server

    Banga, Cameron

    2011-01-01

    Written in a cookbook style, this book offers solutions using a recipe-based approach. Each recipe contains step-by-step instructions followed by an analysis of what was done in each task and other useful information. The cookbook approach means you can dive into whatever recipes you want in no particular order. The iPhone Interface Cookbook is written from the ground up for people who are new to iOS or application interface design in general. Each chapter discusses the reasoning and design strategy behind critical interface components, as well as how to best integrate each into any iPhone or...

  19. More playful user interfaces: interfaces that invite social and physical interaction

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2015-01-01

    This book covers the latest advances in playful user interfaces: interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback on, and anticipate the behavior of human users. The decreasing...

  20. Using Vim as User Interface for Your Applications

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The Vim editor offers one of the cleverest user interfaces. It's why many developers write programs with vi keyboard bindings. Now, imagine how powerful it gets to build applications literally on top of Vim itself.

  1. Task Models in the Context of User Interface Development

    Science.gov (United States)

    Szwillus, Gerd

    Task models are widely used in the field of user interface development. They represent a human actor's performance or the co-operation of a group of people on or together with a system. For considerable time, it was an open problem in the field how to switch from the analyzing step of task analysis and modeling to the synthesizing step of user interface design. In the meantime, interesting approaches have shown up dealing with this problem and helping to bridge the gap between task modeling and user interface development. In this chapter, some of these approaches are discussed, together with recent concepts used to improve the usability of user interfaces based upon underlying task models.

  2. The Value of Constraints for 3D User Interfaces

    Science.gov (United States)

    Stuerzlinger, Wolfgang; Wingrave, Chadwick A.

    User interfaces to three-dimensional environments are becoming more and more popular. Today this trend is fuelled through the introduction of social communication via virtual worlds, console and computer games, as well as 3D televisions.

  3. Open|SpeedShop Graphical User Interface Technology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to create a new graphical user interface (GUI) for an existing parallel application performance and profiling tool, Open|SpeedShop. The current GUI has...

  4. The intelligent user interface for NASA's advanced information management systems

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.

  5. Activity Walkthrough - A Quick User Interface Evaluation without Users

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege

    2004-01-01

    Based on activity theory an expert review method, the activity walkthrough, is introduced. The method is a modified version of the cognitive walkthrough, addressing some of the practical issues arising when non-experts apply the cognitive walkthrough to non-trivial interfaces. The presented version...

  6. Prop-Based Haptic Interaction with Co-location and Immersion: an Automotive Application

    OpenAIRE

    Ortega, Michael; Coquillart, Sabine

    2006-01-01

    Most research on 3D user interfaces aims at providing only a single sensory modality. One challenge is to integrate several sensory modalities into a seamless system while preserving each modality's immersion and performance factors. This paper concerns manipulation tasks and proposes a visuo-haptic system integrating immersive visualization, force and tactile feedback with co-location. An industrial application is presented.

  7. A Functional Programming Technique for Forms in Graphical User Interfaces

    NARCIS (Netherlands)

    Evers, S.; Kuper, Jan; Achten, P.M.; Grelck, G.; Huch, F.; Michaelson, G.; Trinder, Ph.W.

    2005-01-01

    This paper presents FunctionalForms, a new combinator library for constructing fully functioning forms in a concise and flexible way. A form is a part of a graphical user interface (GUI) restricted to displaying a value and allowing the user to modify it. The library is built on top of the...

  8. Advanced Displays and Natural User Interfaces to Support Learning

    Science.gov (United States)

    Martin-SanJose, Juan-Fernando; Juan, M. -Carmen; Mollá, Ramón; Vivó, Roberto

    2017-01-01

    Advanced displays and natural user interfaces (NUI) are a very suitable combination for developing systems to provide an enhanced and richer user experience. This combination can be appropriate in several fields and has not been extensively exploited. One of the fields that this combination is especially suitable for is education. Nowadays,…

  9. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  10. Human Computer Interface Design Criteria. Volume 1. User Interface Requirements

    Science.gov (United States)

    2010-03-19

    ...function may require modification when converted to certain European languages. For example, French and Italian replace the terminal vowel in an article... (The remainder of the record is table-of-contents fragments: 8.3.3 Types of Message Windows; 9. User Support; 11.4.5 Support for Printing.)

  11. A Mobile User Interface For Low-Literacy Users In Rural South Africa ...

    African Journals Online (AJOL)

    This study was conducted to design a mobile user interface to enable low-literacy users in the Dwesa community in South Africa to have access to mobile commerce services. We applied different ethnographic research methods through a user-centred design approach to actively involve the target users in the design process.

  12. An augmented reality haptic training simulator for spinal needle procedures.

    Science.gov (United States)

    Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin

    2013-11-01

    This paper presents the prototype for an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface to provide visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A tissue model based on a finite-element model provides force during the insertion. The system allows for training without the need for the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.
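
As a hedged illustration of how such a simulator can generate insertion forces: the record's system uses a finite-element tissue model, but a common simplified stand-in is a two-phase, one-dimensional model (elastic ramp before the tissue membrane punctures, then cutting force plus shaft friction). All names and parameter values below are invented:

```python
def insertion_force(depth_mm, puncture_depth_mm=10.0,
                    stiffness=0.4, friction_per_mm=0.05, cutting=1.5):
    """Simplified needle force (N) at a given insertion depth (mm)."""
    if depth_mm <= 0:
        return 0.0
    if depth_mm < puncture_depth_mm:
        # Pre-puncture: tissue deforms elastically, force ramps up linearly.
        return stiffness * depth_mm
    # Post-puncture: constant cutting force plus friction along the shaft;
    # note the characteristic force drop right after the membrane gives way.
    return cutting + friction_per_mm * (depth_mm - puncture_depth_mm)

forces = [insertion_force(d) for d in (0.0, 5.0, 10.0, 20.0)]
print(forces)
```

In a haptic loop, a function like this would be evaluated at ~1 kHz and sent to the PHANToM as the resistive force along the needle axis; the FEM model in the paper plays the role this toy function sketches.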

  13. A non-expert-user interface for posing signing avatars.

    Science.gov (United States)

    Adamo-Villani, Nicoletta; Popescu, Voicu; Lestina, Jason

    2013-05-01

    We describe a graphical user interface designed to allow non-expert users to pose 3D characters to create American Sign Language (ASL) computer animation. The interface is an important component of a software system that allows educators of the Deaf to add sign language translation, in the form of 3D character animations, to digital learning materials, thus making them accessible to deaf learners. A study indicates that users with no computer animation expertise can create animated ASL signs quickly and accurately.

  14. On user-friendly interface construction for CACSD packages

    DEFF Research Database (Denmark)

    Ravn, Ole

    1989-01-01

    Some ideas that are used in the development of a user-friendly interface for a computer-aided control system design (CACSD) package are presented. The concepts presented are integration and extensibility through the use of object-oriented programming, man-machine interface and user support using direct manipulation, and multiple views and multiple actions on objects in different domains. The use of multiple views and actions in combination with graphics enhances the user's ability to get an overview of the system to be designed. Good support for iteration is provided, and the short time between...

  15. Designing a Facebook Interface for Senior Users

    Directory of Open Access Journals (Sweden)

    Gonçalo Gomes

    2014-01-01

    The adoption of social networks by older adults has increased in recent years. However, many still cannot make use of social networks as these are simply not adapted to them. Through a series of direct observations, interviews, and focus groups, we identified recommendations for the design of social networks targeting seniors. Based on these, we developed a prototype for tablet devices, supporting sharing and viewing Facebook content. We then conducted a user study comparing our prototype with Facebook's native mobile application. We found that Facebook's native application does not meet senior users' concerns, like privacy and family focus, while our prototype, designed in accordance with the collected recommendations, supported relevant use cases in a usable and accessible manner.

  16. Virtual Reality and Haptics for Product Assembly

    Directory of Open Access Journals (Sweden)

    Maria Teresa Restivo

    2012-01-01

    Haptics can significantly enhance the user's sense of immersion and interactivity. An industrial application of virtual reality and haptics for product assembly is described in this paper, which provides a new and low-cost approach for product assembly design, assembly task planning and assembly operation training. A demonstration of the system with haptic device interaction was available at the session of exp.at'11.

  17. Reservation system with graphical user interface

    KAUST Repository

    Mohamed, Mahmoud A. Abdelhamid

    2012-01-05

    Techniques for providing a reservation system are provided. The techniques include displaying a scalable visualization object, wherein the scalable visualization object comprises an expanded view element of the reservation system depicting information in connection with a selected interval of time and a compressed view element of the reservation system depicting information in connection with one or more additional intervals of time, maintaining a visual context between the expanded view and the compressed view within the visualization object, and enabling a user to switch between the expanded view and the compressed view to facilitate use of the reservation system.

  18. AutoCAD platform customization user interface and beyond

    CERN Document Server

    Ambrosius, Lee

    2014-01-01

    Make AutoCAD your own with powerful personalization options Options for AutoCAD customization are typically the domain of administrators, but savvy users can perform their own customizations to personalize AutoCAD. Until recently, most users never thought to customize the AutoCAD platform to meet their specific needs, instead leaving it to administrators. If you are an AutoCAD user who wants to ramp up personalization options in your favorite software, AutoCAD Platform Customization: User Interface and Beyond is the perfect resource for you. Author Lee Ambrosius is recognized as a leader in Au...

  19. Teleoperation in cluttered environments using wearable haptic feedback

    OpenAIRE

    Bimbo, Joao; Pacchierotti, Claudio; Aggravi, Marco; Tsagarakis, Nikos; Prattichizzo, Domenico

    2017-01-01

    Robotic teleoperation in cluttered environments is attracting increasing attention for its potential in hazardous scenarios, disaster response, and telemaintenance. Although haptic feedback has been proven effective in such applications, commercially available grounded haptic interfaces still show significant limitations in terms of workspace, safety, transparency, and encumbrance. For this reason, we present a novel robotic teleoperation system with wearable haptic feedback...

  20. X window system based user interface in radiology.

    Science.gov (United States)

    Heinilä, J; Yliaho, J; Ahonen, J; Viitanen, J; Kormano, M

    1994-05-01

    A teleradiology system was designed for image transfer between two hospitals. One of the main challenges of the work was the user interface, which was to be easy to operate and to learn, and was equipped with useful functions for image manipulation and diagnosing. The software tools used were the Unix operating system (HP-UX v.7.0), the C programming language and the X Window System (or simply X). The graphical user interface (GUI) was based on the OSF/Motif standard, and it was developed using the HP Interface Architect. Both OSF/Motif and HP Interface Architect are based on X. The results of the development project were installed for clinical use in the Turku University Central Hospital. The work demonstrates that the X Window System has useful and advantageous features for a radiology department's computer network environment.

  1. Earthdata User Interface Patterns: Building Usable Web Interfaces Through a Shared UI Pattern Library

    Science.gov (United States)

    Siarto, J.

    2014-12-01

    As more Earth science software tools and services move to the web, the design and usability of those tools become ever more important. A good user interface is now expected, and users are becoming increasingly intolerant of websites and web applications that work against them. The Earthdata UI Pattern Library attempts to give these scientists and developers the design tools they need to make usable, compelling user interfaces without the associated overhead of using a full design team. Patterns are tested and functional user interface elements targeted specifically at the Earth science community, and will include web layouts, buttons, tables, typography, iconography, mapping and visualization/graphing widgets. These UI elements have emerged as the result of extensive user testing, research and software development within the NASA Earthdata team over the past year.

  2. INTERNET CONNECTIVITY FOR MASS PRODUCED UNITS WITHOUT USER INTERFACE

    DEFF Research Database (Denmark)

    2000-01-01

    To the manufacturer of mass-produced units without a user interface, typically field-level units, connection of these units to a communications network for enabling servicing, control and trackability is of interest. To provide this connection, a solution is described in which an interface comprising an ASIC is built into a mass-produced unit, whereby the ASIC incorporates selected portions of selected layers of the Internet Protocol. The mass-produced unit is then allocated a unit address.

  3. User Interface Technology Transfer to NASA's Virtual Wind Tunnel System

    Science.gov (United States)

    vanDam, Andries

    1998-01-01

    Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.

  4. Feedback from Usability Evaluation to User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2005-01-01

    This paper reports from an exploratory study of means for providing feedback from a usability evaluation to the user interface designers. In this study, we conducted a usability evaluation of a mobile system that is used by craftsmen to register use of time and materials. The results of this evaluation...

  5. User Interface for the SMAC Traffic Accident Reconstruction Program

    OpenAIRE

    Rok Krulec; Milan Batista

    2003-01-01

    This paper describes the development of the user interface for the traffic accident reconstruction program SMAC. Three basic modules of software will be presented. Initial parameters input and visualization, using a graphics library for the simulation of 3D space, which form a graphical user interface, will be explained in more detail. The modules have been developed using different technologies and programming approaches to increase flexibility in further development and to take maximum advantage of the currently accessible computer hardware, so that module-to-module communication is also mentioned.

  6. User Interface for the SMAC Traffic Accident Reconstruction Program

    Directory of Open Access Journals (Sweden)

    Rok Krulec

    2003-11-01

    This paper describes the development of the user interface for the traffic accident reconstruction program SMAC. Three basic modules of software will be presented. Initial parameters input and visualization, using a graphics library for the simulation of 3D space, which form a graphical user interface, will be explained in more detail. The modules have been developed using different technologies and programming approaches to increase flexibility in further development and to take maximum advantage of the currently accessible computer hardware, so that module-to-module communication is also mentioned.

  7. DEBUGGER: Developing a graphical user interface to debug FPGAs

    CERN Document Server

    AUTHOR|(SzGeCERN)773309

    2015-01-01

    As part of the summer student projects, an FPGA debugger was designed using the Qt framework. The aim of this project is to help Data Acquisition System (DAQ) experts of the COMPASS experiment to easily monitor the state of each FPGA being used, since that state must be monitored continually. A Graphical User Interface (GUI) has been designed to aid experts in doing so. Via IP-Bus, the content of the FPGA under investigation is displayed to the user.

  8. Natural user interface based on gestures recognition using Leap Motion sensor

    National Research Council Canada - National Science Library

    L. Sousa; J. Monteiro; P.J.S. Cardoso; J.M.F. Rodrigues

    2015-01-01

    Natural User Interface (NUI) is a term used for human-computer interfaces where the interface is invisible or becomes invisible after successive user-immersion levels; it is typically based on human nature or natural human elements...

  9. User-centered design with illiterate persons : The case of the ATM user interface

    NARCIS (Netherlands)

    Cremers, A.H.M.; Jong, J.G.M. de; Balken, J.S. van

    2008-01-01

    One of the major challenges in current user interface research and development is the accommodation of diversity in users and contexts of use in order to improve the self-efficacy of citizens. A common banking service, which should be designed for diversity, is the Automated Teller Machine (ATM).

  10. Impact of English Regional Accents on User Acceptance of Voice User Interfaces

    NARCIS (Netherlands)

    Niculescu, A.I.; White, G.M.; Lan, S.S.; Waloejo, R.U.; Kawaguchi, Y.

    2008-01-01

    In this paper we present an experiment addressing a critical issue in Voice User Interface (VUI) design, namely whether user acceptance can be improved by having recorded voice prompts imitate the user's regional dialect. The claim was tested within a project aiming to develop voice-animated...

  11. Haptic stylus and empirical studies on braille, button, and texture display.

    Science.gov (United States)

    Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module as well as empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance indicating that it can satisfactorily represent Braille numbers for both the normal and the blind. In order to prove haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides combination of force, tactile and impact feedback, three haptic representation methods for texture display have been compared on surface with 3 texture groups which differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor.

  12. Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display

    Directory of Open Access Journals (Sweden)

    Ki-Uk Kyung

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module as well as empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance indicating that it can satisfactorily represent Braille numbers for both the normal and the blind. In order to prove haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides combination of force, tactile and impact feedback, three haptic representation methods for texture display have been compared on surface with 3 texture groups which differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor.
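
For context on how a grooved texture can be displayed through a force-feedback stylus: one standard method (the papers above compare three) derives a lateral force from the slope of a surface height field under the stylus tip. The sketch below is an illustration under assumed parameters, not the authors' implementation:

```python
import math

def groove_height(x_mm, groove_width_mm=2.0, depth_mm=0.5):
    # Sinusoidal approximation of a grooved surface profile (height <= 0).
    return -0.5 * depth_mm * (1 + math.cos(2 * math.pi * x_mm / groove_width_mm))

def lateral_force(x_mm, k=2.0, eps=1e-4):
    # Central-difference slope of the height field under the stylus tip;
    # the rendered force opposes climbing the local slope.
    slope = (groove_height(x_mm + eps) - groove_height(x_mm - eps)) / (2 * eps)
    return -k * slope

# Zero lateral force at the bottom of a groove, maximal on the groove wall.
print(lateral_force(0.0), lateral_force(0.5))
```

Varying groove width, depth, or the profile shape in such a model is how surfaces "differing in direction, groove width, and shape", as in the study above, can be parameterized.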

  13. Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli

    Science.gov (United States)

    Tsetserukou, D.; Neviarouskaya, A.

    2012-03-01

    The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to interact haptically with the avatar and thus influence its mood and the partner's emotion. A finger-gesture language will be designed for communication with the avatar. This will bring a new level of immersion to on-line communication.

  14. States and Sound: Modelling User Interactions with Musical Interfaces

    DEFF Research Database (Denmark)

    Larsen, Jeppe Veirum; Knoche, Hendrik

    2017-01-01

    Musical instruments and musical user interfaces provide rich input and feedback through mostly tangible interactions, resulting in complex behavior. However, publications of novel interfaces often lack the required detail due to the complexity or the focus on a specific part of the interfaces, and the absence of a specific template or structure to describe these interactions. Drawing on and synthesizing models from interaction design and music making, we propose a way of modeling musical interfaces by providing a scheme and visual language to describe, design, analyze, and compare interfaces for music making. To illustrate its capabilities we apply the proposed model to a range of assistive musical instruments, which often draw on multi-modal in- and output, resulting in complex designs and descriptions thereof.

  15. A virtual universe utilizing haptic display

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, T. [Sandia National Labs., Albuquerque, NM (United States)][New Mexico Univ., Albuquerque, NM (United States). Dept. of Electrical and Computer Engineering

    1996-12-31

    This paper summarizes a virtual reality universe application in which a user can travel between four virtual worlds through the use of haptic buttons. Each of the worlds demonstrates different aspects of haptic rendering which together create a wide base for force feedback effects. Specifics of the rendering algorithms will be discussed along with possible uses and modifications for other real-life applications.
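
The most basic building block of the force-feedback effects such worlds rely on is penalty-based rendering: when the device tip penetrates a virtual surface, push back with a spring force proportional to penetration depth. A minimal sketch against a horizontal plane (the paper's rendering algorithms are more elaborate; the stiffness value here is an assumption):

```python
def render_force(tip_z, surface_z=0.0, k=500.0):
    """Upward force (N) on the device tip from a horizontal virtual plane.

    tip_z and surface_z are in metres; k is the virtual wall stiffness (N/m).
    """
    penetration = surface_z - tip_z
    if penetration <= 0:
        return 0.0          # tip above the surface: free space, no force
    return k * penetration  # spring pushes the tip back out of the wall

print(render_force(-0.002))  # 2 mm penetration -> 1.0 N
print(render_force(0.01))    # tip above the surface -> 0.0
```

Haptic buttons like those in the paper can be built from the same primitive by letting the "surface" yield once the contact force crosses a click threshold.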

  16. Toward visual user interfaces supporting collaborative multimedia content management

    Science.gov (United States)

    Husein, Fathi; Leissler, Martin; Hemmje, Matthias

    2000-12-01

    Supporting collaborative multimedia content management activities, e.g. image and video acquisition, exploration, and access dialogues between naive users and multimedia information systems, is a non-trivial task. Although a wide variety of experimental and prototypical multimedia storage technologies, as well as corresponding indexing and retrieval engines, are available, most of them lack appropriate support for collaborative, end-user-oriented user interface front ends. The development of advanced user-adaptable interfaces is necessary for building collaborative multimedia information-space presentations based upon advanced tools for information browsing, searching, filtering, and brokering, to be applied to potentially very large and highly dynamic multimedia collections with large numbers of users and user groups. Therefore, the development of advanced, adaptable and collaborative computer-graphical information presentation schemes that make it easy to apply adequate visual metaphors for defined target user stereotypes has to become a key focus of ongoing research activities aiming to support collaborative information work with multimedia collections.

  17. Semiotic user interface analysis of building information model systems

    NARCIS (Netherlands)

    Hartmann, Timo

    2013-01-01

    To promote understanding of how to use building information (BI) systems to support communication, this paper uses computer semiotic theory to study how user interfaces of BI systems operate as a carrier of meaning. More specifically, the paper introduces a semiotic framework for the analysis of BI

  18. Easier Said than Done: Practical Considerations in User Interface Design.

    Science.gov (United States)

    Crow, Raymond W., Jr.; Starbird, Robert F.

    1992-01-01

    Describes the redesign of a CD-ROM database interface by the Congressional Information Service (CIS) that addressed the needs of novice, casual, and expert searchers in academic libraries. Topics discussed include the user profile, the task profile, redesign goals, interaction style, menu design and implementation, system structure and the search…

  19. Towards Linking User Interface Translation Needs to Lexicographic ...

    African Journals Online (AJOL)

    In a time of proliferating electronic devices such as smartphones, translators of user interfaces are faced with new challenges, such as the use of existing words in new contexts or in their obtaining new meanings. In this article, three lexicographic reference works available to translators in this field are compared: the ...

  20. Towards linking user interface translation needs to lexicographic ...

    African Journals Online (AJOL)

    Abstract: In a time of proliferating electronic devices such as smartphones, translators of user interfaces are faced with new challenges, such as the use of existing words in new contexts or in their obtaining new meanings. In this article, three lexicographic reference works available to translators in this field are compared: ...

  1. Design of a user interface for intuitive colonoscope control

    NARCIS (Netherlands)

    Kuperij, Nicole; Reilink, Rob; Schwartz, Matthijs P.; Stramigioli, Stefano; Misra, Sarthak; Broeders, Ivo Adriaan Maria Johannes

    2011-01-01

    The goal of this study is to improve the efficiency and efficacy of the standard colonoscopy procedure. This is done by addressing the intuitiveness of colonoscope control. For this purpose an interface in the form of a grip was designed that allows the user to intuitively steer and drive the

  2. Circumventing Graphical User Interfaces in Chemical Engineering Plant Design

    Science.gov (United States)

    Romey, Noel; Schwartz, Rachel M.; Behrend, Douglas; Miao, Peter; Cheung, H. Michael; Beitle, Robert

    2007-01-01

    Graphical User Interfaces (GUIs) are pervasive elements of most modern technical software and represent a convenient tool for student instruction. For example, GUIs are used for [chemical] process design software (e.g., CHEMCAD, PRO/II and ASPEN) typically encountered in the senior capstone course. Drag and drop aspects of GUIs are challenging for…

  3. CATO--A General User Interface for CAS

    Science.gov (United States)

    Janetzko, Hans-Dieter

    2015-01-01

    CATO is a new user interface, developed by the author as a response to the significant difficulties faced by scientists, engineers, and students in their usage of computer algebra (CA) systems. Their tendency to use CA systems only occasionally means that they are unfamiliar with the requisite grammar and syntax these systems require. The author…

  4. The Impact of User Interface on Young Children's Computational Thinking

    Science.gov (United States)

    Pugnali, Alex; Sullivan, Amanda; Bers, Marina Umaschi

    2017-01-01

    Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have on children's mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is a growing pressure to begin teaching…

  5. Helping Students Test Programs That Have Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Matthew Thornton

    2008-08-01

    Full Text Available Within computer science education, many educators are incorporating software testing activities into regular programming assignments. Tools like JUnit and its relatives make software testing tasks much easier, bringing them into the realm of even introductory students. At the same time, many introductory programming courses are now including graphical interfaces as part of student assignments to improve student interest and engagement. Unfortunately, writing software tests for programs that have significant graphical user interfaces is beyond the skills of typical students (and many educators). This paper presents initial work at combining educationally oriented and open-source tools to create an infrastructure for writing tests for Java programs that have graphical user interfaces. Critically, these tools are intended to be appropriate for introductory (CS1/CS2) student use, and to dovetail with current teaching approaches that incorporate software testing in programming assignments. We also include in our findings our proposed approach to evaluating our techniques.
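
The paper's infrastructure targets Java and JUnit, but the underlying idea, asserting on program state driven through the handlers a GUI would invoke rather than through manual clicks, can be sketched in any language. The sketch below (Python's stdlib `unittest`; class and method names are illustrative, not from the paper's tools) tests a button handler without creating any widgets:

```python
import unittest

class CounterModel:
    """GUI-independent application logic: what the button handler does."""
    def __init__(self):
        self.count = 0

    def on_increment_clicked(self):
        # In the real GUI this method would be bound to a button.
        self.count += 1
        return f"Count: {self.count}"  # text the view would display

class TestCounterGui(unittest.TestCase):
    def test_click_updates_label(self):
        model = CounterModel()
        # Simulate two button presses without instantiating a window.
        model.on_increment_clicked()
        label_text = model.on_increment_clicked()
        self.assertEqual(label_text, "Count: 2")

if __name__ == "__main__":
    unittest.main()
```

Separating the handler logic from the widget toolkit is what makes such tests feasible for introductory students.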

  6. USER INTERFACE IN THE DESIGN OF AN EMOTION RESPONSE ESTIMATOR MODEL

    Directory of Open Access Journals (Sweden)

    Umi Rosyidah

    2016-10-01

    Full Text Available An interface is the means of interaction between a user and a system, and a good interface strongly influences the performance of the system's users. The Emotion Estimator Model is a model used to assess the effect of colour in an interface design in terms of a particular emotional response. The author analyzes this application using the GOMS and KLM models, observing and estimating the thoughts and reactions of the system's users. This shows how users will interact with the interface and how the interface affects user performance, and it describes how a user operates the Emotion Response Estimator Model application. The KLM analysis indicates the estimated time required to reach a goal; for example, for the sub-goal of running the testing process there are two ways to reach the goal, each procedure requiring a different amount of time. The choice of method can be adapted by the user; expert users will differ from novices. The Emotion Estimator Model application is fairly simple, with clear goals, so it can be used easily by its users. Keywords: application, design, emotion, model, GOMS, KLM, response, user interface, colour

  7. Effects of 3D virtual haptics force feedback on brand personality perception: the mediating role of physical presence in advergames.

    Science.gov (United States)

    Jin, Seung-A Annie

    2010-06-01

    This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.
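
The reported ANOVAs compare two conditions, and for two groups a one-way ANOVA reduces to a single F ratio (the square of the two-sample t statistic). A minimal sketch of that computation (data and names are illustrative, not the study's ratings):

```python
from statistics import mean

def one_way_anova_f(group_a, group_b):
    """F statistic for a two-group one-way ANOVA.

    Illustrates the kind of comparison used for the force-feedback
    vs. no-force-feedback conditions; real input would be, e.g.,
    the physical-presence ratings per participant.
    """
    grand = mean(group_a + group_b)
    n_a, n_b = len(group_a), len(group_b)
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = (n_a * (mean(group_a) - grand) ** 2
                  + n_b * (mean(group_b) - grand) ** 2)
    # Within-group sum of squares: spread around each group's own mean.
    ss_within = (sum((x - mean(group_a)) ** 2 for x in group_a)
                 + sum((x - mean(group_b)) ** 2 for x in group_b))
    df_between, df_within = 1, n_a + n_b - 2
    return (ss_between / df_between) / (ss_within / df_within)
```

A large F relative to the F(1, n−2) distribution corresponds to the significant condition effects the study reports.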

  8. The design of Jemboss: a graphical user interface to EMBOSS.

    Science.gov (United States)

    Carver, Tim; Bleasby, Alan

    2003-09-22

    Jemboss is a graphical user interface (GUI) for the European Molecular Biology Open Software Suite (EMBOSS). It is being developed at the MRC UK HGMP-RC as part of the EMBOSS project. This paper explains the technical aspects of the Jemboss client-server design. The client-server model optionally allows a Jemboss user to have an account on the remote server. The Jemboss client is written in Java and is downloaded automatically to a user's workstation via Java Web Start using the HTTP protocol. The client then communicates with the remote server using SOAP (Simple Object Access Protocol). A Tomcat server listens on the remote machine and passes the SOAP requests to a Jemboss server, again written in Java. This Java server interprets the client requests and executes them through Java Native Interface (JNI) code written in the C language. Another C program with setuid privilege, jembossctl, is called by the JNI code to perform the client requests under the user's account on the server. The commands include execution of EMBOSS applications, file management and project management tasks. Jemboss allows the use of JSSE for encryption of communication between the client and server. The GUI parses the EMBOSS Ajax Command Definition language for form generation and maximum input flexibility. Jemboss interacts directly with the EMBOSS libraries to allow dynamic generation of application default settings. This interface is part of the EMBOSS distribution and has attracted much interest. It has been set up at many other sites globally as well as being used at the HGMP-RC for registered users. The software, EMBOSS and Jemboss, is freely available to academics and commercial users under the GPL licence. It can be downloaded from the EMBOSS ftp server: http://www.uk.embnet.org/Software/EMBOSS/, ftp://ftp.uk.embnet.org/pub/EMBOSS/. Registered HGMP-RC users can access an installed server from: http://www.uk.embnet.org/Software/EMBOSS/Jemboss/

  9. Evaluation of new user interface features for the MANUS robot arm

    NARCIS (Netherlands)

    Tijsma, H.A.; Liefhebber, F.; Herder, J.L.

    2005-01-01

    New user interface features and a new user interface for the MANUS robot arm, were designed in order to reduce the high cognitive and physical load that users experience when controlling the MANUS. These interface features, and the new interface, were evaluated for their performance. The following

  10. Design of natural user interface of indoor surveillance system

    Science.gov (United States)

    Jia, Lili; Liu, Dan; Jiang, Mu-Jin; Cao, Ning

    2015-03-01

    Conventional optical video surveillance systems usually just record what they view; they cannot make sense of what they are viewing. Storing and transmitting large amounts of useless video wastes memory space and bandwidth every day. In order to reduce the overall cost of the system and improve the application value of the monitoring system, we use the Kinect sensor, with its CMOS infrared sensor, as a supplement to the traditional video surveillance system to establish a natural user interface system for indoor surveillance. In this paper, the architecture of the natural user interface system, the separation of monitored objects from complex backgrounds, and user behavior analysis algorithms are discussed. By analyzing the monitored object rather than relying on a command-language grammar, the system with the natural user interface sends help information when the monitored object needs instant help. We introduce a method of combining the new system with the traditional monitoring system. In conclusion, theoretical analysis and experimental results in this paper show that the proposed system is reasonable and efficient. It satisfies the system requirements of non-contact, online, real-time operation with high precision and rapid speed to control the state of affairs at the scene.

  11. Dynamic User Interfaces for Service Oriented Architectures in Healthcare.

    Science.gov (United States)

    Schweitzer, Marco; Hoerbst, Alexander

    2016-01-01

    Electronic Health Records (EHRs) play a crucial role in healthcare today. From a data-centric view, EHRs are very advanced: they provide and share healthcare data in a cross-institutional, patient-centered way with high syntactic and semantic interoperability. However, the EHR functionalities available to end users are rare and hence often limited to basic document query functions. Future EHR use necessitates letting users define the data they need in a given situation and how this data should be processed. Workflow and semantic modelling approaches as well as Web services provide means to fulfil such a goal. This thesis develops concepts for dynamic interfaces between EHR end users and a service-oriented eHealth infrastructure, which allow users to model their flexible EHR needs in a dynamic and formal way. These models are then used to discover, compose and execute the right Semantic Web services.

  12. Bed occupancy monitoring: data processing and clinician user interface design.

    Science.gov (United States)

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5-10 weeks. This data was analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the bed exit weekly average (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate and not cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
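
One of the extracted parameters, bed exits per night, can be sketched as a simple threshold-and-debounce pass over pressure-mat samples. The threshold and debounce values below are assumptions for illustration, not the paper's algorithm or clinical settings:

```python
def bed_exits(samples, threshold=5.0, min_gap=3):
    """Count bed exits from a sequence of pressure-mat samples.

    A transition from occupied (pressure >= threshold) to unoccupied
    that persists for at least `min_gap` consecutive samples counts as
    one exit; shorter dips are treated as movement, not an exit.
    Threshold and gap are illustrative values.
    """
    exits = 0
    occupied = samples[0] >= threshold if samples else False
    run = 0  # consecutive below-threshold samples while occupied
    for p in samples[1:]:
        if occupied:
            if p < threshold:
                run += 1
                if run >= min_gap:
                    exits += 1
                    occupied = False
                    run = 0
            else:
                run = 0  # brief dip: still in bed
        elif p >= threshold:
            occupied = True  # re-entry
            run = 0
    return exits
```

Weekly averages, minima and maxima then follow by aggregating this count over nights.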

  13. Graphical User Interface (GUI) Programming with MATLAB for Designing a Mathematics Operations Aid

    OpenAIRE

    Butar Butar, Ronisah Putra

    2011-01-01

    A Graphical User Interface (GUI) is a visually oriented application program built from graphical objects in place of text commands for user interaction. In MATLAB, the Graphical User Interface (GUI) is provided through the GUIDE application (Graphical User Interface Development Environment). This paper discusses how to design a mathematics-operations aid with MATLAB's Graphical User Interface (GUI), with the aim of providing an alternative tool to assist...

  14. Structural health monitoring for bolt loosening via a non-invasive vibro-haptics human-machine cooperative interface

    Science.gov (United States)

    Pekedis, Mahmut; Mascerañas, David; Turan, Gursoy; Ercan, Emre; Farrar, Charles R.; Yildiz, Hasan

    2015-08-01

    For the last two decades, developments in damage detection algorithms have greatly increased the potential for autonomous decisions about structural health. However, we are still struggling to build autonomous tools that can match the ability of a human to detect and localize the quantity of damage in structures. Therefore, there is growing interest in merging computational and cognitive concepts to improve the solution of structural health monitoring (SHM). The main objective of this research is to apply a human-machine cooperative approach to a tower structure to detect damage. The cooperative approach uses haptic tools to create an appropriate collaboration between SHM sensor networks, statistical compression techniques and humans. Damage is simulated in the structure by releasing some of the bolt loads. Accelerometers are bonded to various locations on the tower members to acquire the dynamic response of the structure. The accelerometer signals are encoded in three different ways to represent them as haptic stimuli for the human subjects. The participants are then exposed to each of these stimuli to detect the loosened-bolt damage in the tower. Results obtained from the human-machine cooperation demonstrate that the human subjects were able to recognize the damage with an accuracy of 88 ± 20.21% and a response time of 5.87 ± 2.33 s. It is concluded that the human-machine cooperative SHM developed here may provide a useful framework for interacting with abstract entities such as data from a sensor network.
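
The abstract does not detail its three encodings, but the basic step they share, turning an accelerometer channel into a vibro-tactile stimulus, can be sketched as a linear amplitude mapping. This is purely an assumed example, not one of the paper's encodings:

```python
def encode_vibration(accel, vib_min=0.0, vib_max=1.0):
    """Map accelerometer magnitudes linearly onto a vibration amplitude range.

    `accel` is a sequence of acceleration magnitudes; the output is a
    per-sample amplitude in [vib_min, vib_max] that a haptic actuator
    could play back. Range and mapping are illustrative assumptions.
    """
    lo, hi = min(accel), max(accel)
    span = (hi - lo) or 1.0  # avoid division by zero for constant input
    return [vib_min + (a - lo) / span * (vib_max - vib_min) for a in accel]
```

Under such an encoding, larger structural responses feel stronger to the participant, which is the cue a human subject would use to flag damage.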

  15. Haptic fMRI: combining functional neuroimaging with haptics for studying the brain's motor control representation.

    Science.gov (United States)

    Menon, Samir; Brantner, Gerald; Aholt, Chris; Kay, Kendrick; Khatib, Oussama

    2013-01-01

    A challenging problem in motor control neuroimaging studies is the inability to perform complex human motor tasks given the Magnetic Resonance Imaging (MRI) scanner's disruptive magnetic fields and confined workspace. In this paper, we propose a novel experimental platform that combines functional MRI (fMRI) neuroimaging, haptic virtual simulation environments, and an fMRI-compatible haptic device for real-time haptic interaction across the scanner workspace (above the torso, ∼.65×.40×.20 m³). We implement this Haptic fMRI platform with a novel haptic device, the Haptic fMRI Interface (HFI), and demonstrate its suitability for motor neuroimaging studies. HFI has three degrees of freedom (DOF), uses electromagnetic motors to enable high-fidelity haptic rendering (>350 Hz), integrates radio frequency (RF) shields to prevent electromagnetic interference with fMRI (temporal SNR >100), and is kinematically designed to minimize currents induced during motor displacement by the MRI scanner's magnetic field, keeping them within the fMRI scanner's baseline noise variation (∼.85±.1%). Finally, HFI is haptically transparent and does not interfere with human motor tasks (tested for .4 m reaches). By allowing fMRI experiments involving complex three-dimensional manipulation with haptic interaction, Haptic fMRI enables, for the first time, non-invasive neuroscience experiments involving interactive motor tasks, object manipulation, tactile perception, and visuo-motor integration.

  16. User Interface Design in Medical Distributed Web Applications.

    Science.gov (United States)

    Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara

    2016-01-01

    User interfaces are important to facilitate easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation, and the technology in the background is an important tool to accomplish this. The present work aims to create a web interface using specific technology (HTML table design combined with CSS3) to provide an optimized, responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed and its structure is designed using specific tools on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed (using HTML table design (TD) and the CSS3 method) that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes, applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at a time. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared with the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, here HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.

  17. Social Circles: A 3D User Interface for Facebook

    Science.gov (United States)

    Rodrigues, Diego; Oakley, Ian

    Online social network services are increasingly popular web applications which display large amounts of rich multimedia content: contacts, status updates, photos and event information. Arguing that this quantity of information overwhelms conventional user interfaces, this paper presents Social Circles, a rich interactive visualization designed to support real world users of social network services in everyday tasks such as keeping up with friends and organizing their network. It achieves this by using 3D UIs, fluid animations and a spatial metaphor to enable direct manipulation of a social network.

  18. A Graphical User Interface to Generalized Linear Models in MATLAB

    Directory of Open Access Journals (Sweden)

    Peter Dunn

    1999-07-01

    Full Text Available Generalized linear models unite a wide variety of statistical models in a common theoretical framework. This paper discusses GLMLAB, software that enables such models to be fitted in the popular mathematical package MATLAB. It provides a graphical user interface to the powerful MATLAB computational engine to produce a program that is easy to use but with many features, including offsets, prior weights and user-defined distributions and link functions. MATLAB's graphical capacities are also utilized in providing a number of simple residual diagnostic plots.
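
GLMLAB is MATLAB software, but the iteratively reweighted least squares (IRLS) procedure at the heart of any GLM fitter can be sketched independently. The sketch below (pure Python, not GLMLAB code) fits a single-predictor Poisson model with a log link, solving the 2×2 weighted normal equations in closed form:

```python
import math

def poisson_glm_irls(x, y, iters=25):
    """Fit y ~ Poisson(exp(b0 + b1*x)) by IRLS, returning (b0, b1).

    Each iteration forms the working response z and weights w implied
    by the log link, then solves a weighted least squares for z ~ 1 + x.
    """
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]       # inverse link
        # Working response: linear predictor plus scaled residual.
        z = [(b0 + b1 * xi) + (yi - mi) / mi
             for xi, yi, mi in zip(x, y, mu)]
        w = mu                                          # IRLS weights for Poisson/log
        # Weighted normal equations for intercept and slope (2x2 system).
        sw = sum(w)
        swx = sum(wi * xi for wi, xi in zip(w, x))
        swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        swz = sum(wi * zi for wi, zi in zip(w, z))
        swxz = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = sw * swxx - swx * swx
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1
```

Swapping the inverse link, working response, and weights generalizes the same loop to other distributions and link functions, which is exactly the flexibility GLMLAB exposes through its interface.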

  19. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  20. Mashup - Based End User Interface for Fleet Monitoring

    Science.gov (United States)

    Popa, M.; Popa, A. S.; Slavici, T.; Darvasi, D.

    Fleet monitoring of commercial vehicles has received major attention recently. A good monitoring solution increases fleet efficiency by reducing transportation durations, by optimizing the planned routes, and by providing determinism at the intermediate and final destinations. This paper presents a fleet monitoring system for commercial vehicles using the Internet as its data infrastructure. The mashup concept was implemented for creating the user interface.

  1. USING NATURAL USER INTERFACES TO SUPPORT LEARNING ENVIRONMENTS

    OpenAIRE

    Martín San José, Juan Fernando

    2015-01-01

    [EN] Considering the importance of games and new technologies for learning, in this thesis, two different systems that use Natural User Interfaces (NUI) for learning about a period of history were designed and developed. One of these systems uses autostereoscopic visualization, which lets the children see themselves as a background in the game, and that renders the elements with 3D sensation without the need for wearing special glasses or other devices. The other system uses frontal projectio...

  2. The Distributed Common Ground System-Army User Interface

    Science.gov (United States)

    2015-06-12

    analysts' ability to gather, analyze and share significant amounts of information pulled into a common environment, and enhances Soldier situational ... and lessons learned. The AARs and lessons learned present clear issues to Army leadership about the problems facing DCGS-A. Recommendations to have ... chapter can be applied to the development of new systems and user interfaces. The case studies of software system failures at the IRS, Hershey

  3. Haptic medicine.

    Science.gov (United States)

    Mason, Cindy; Mason, Earl

    2009-01-01

    The paper introduces haptic medicine: healthcare based on loving touch for healing and preventing disease. We describe the effects of loving touch (a square inch of our skin has over 1000 nerves) on the body, brain and mind. We describe two web-based health education and media projects. The first, www.21stcenturymed.org, is a place for health practitioners to start learning about touch and resources. The second project, Humans Without Borders, is a multi-lingual self-help education website for everyday people. Teaching materials for these projects are based on our previous work with a form of haptic medicine known as psychophysiophilosophy with patients at Stanford Hospital, Kaiser Permanente and Lucile Packard Children's Hospital. We describe psychophysiophilosophy, relate motherly love to recent discoveries in the neurosciences and give hints on ways to increase motherly love in each of us. We present a plan for moving into the future by re-introducing haptic medicine into our daily lives through self-help and as an adjunct to current physician practice. There is a self-help exercise for the reader and an appendix of recent clinical research showing profound benefits from the use of human touch for over 40 conditions.

  4. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  5. Top ten list of user-hostile interface design

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D.P.

    1994-10-01

    This report describes ten of the most frequent ergonomic problems found in human-computer interfaces (HCIs) associated with complex industrial machines. In contrast with being thought of as "user friendly," many of these machines are seen as exhibiting "user-hostile" attributes by the author. The historical lack of consistent application of ergonomic principles in the HCIs has led to a breed of very sophisticated, complex manufacturing equipment that few people can operate without extensive orientation, training, or experience. This design oversight has produced the need for extensive training programs and help documentation, unnecessary machine downtime, and reduced productivity resulting from operator stress and confusion. Ergonomic considerations affect industrial machines in at least three important areas: (1) the physical package including CRT and keyboard, maintenance access areas, and dedicated hardware selection, layout, and labeling; (2) the software by which the user interacts with the computer that controls the equipment; and (3) the supporting documentation.

  6. Description of the PRISM system architecture and user interface

    Science.gov (United States)

    Constanza, P.; Larsson, C.; Thiemann, H.; Wedi, N.

    2003-04-01

    The PRISM system architecture enables the user to perform numerical experiments, coupling interchangeable model components, e.g. atmosphere, ocean, biosphere, chemistry, via standardised interfaces. The coupler is based on standard interfaces implemented in the different model components. The exchange of data between the components occurs either directly between components or through the coupler. The general architecture provides the infrastructure to configure, submit, monitor and subsequently post-process, archive and diagnose the results of these coupled model experiments. There is an emphasis on choosing an architectural design that allows these activities to be done remotely, i.e. without the user physically being in the same place where the numerical computations take place. The PRISM general architecture gives the user the choice either to work locally or to work through a central PRISM site where the user will be registered. Locally or via the Internet, the user will be able to use the same graphical user interface; the choice will depend on the local availability of the required resources. In addition, a supervisor monitor program (SMS) gives full control over model simulations during run-time. The technology that realises the proposed architecture is known as "Web services". This includes the use of web servers, application servers, resource directories, discovery mechanisms and message services. Currently there is no client software usable with browsers that does not build on Java technology. Java supports all the mechanisms needed for implementing web services using available standards. From a system maintenance point of view, using one technology, Java, is the preferred way, as this simplifies the task of adhering to multiple standards. The issue of standardisation of interfaces is important for complex and configurable systems such as PRISM. For example, the extensible Markup Language (XML) allows for standardisation of

  7. Development of the User Interface for AIR-Spec

    Science.gov (United States)

    Cervantes Alcala, E.; Guth, G.; Fedeler, S.; Samra, J.; Cheimets, P.; DeLuca, E.; Golub, L.

    2016-12-01

    The airborne infrared spectrometer (AIR-Spec) is an imaging spectrometer that will observe the solar corona during the 2017 total solar eclipse. This eclipse will provide a unique opportunity to observe infrared emission lines in the corona. Five spectral lines are of particular interest because they may eventually be used to measure the coronal magnetic field. To avoid infrared absorption from atmospheric water vapor, AIR-Spec will be placed on an NSF Gulfstream aircraft flying above 14.9 km. AIR-Spec must be capable of taking stable images while the plane moves. The instrument includes an image stabilization system, which uses fiber-optic gyroscopes to determine platform rotation, GPS to calculate the ephemeris of the sun, and a voltage-driven mirror to correct the line of sight. An operator monitors a white light image of the eclipse and manually corrects for residual drift. The image stabilization calculation is performed by a programmable automatic controller (PAC), which interfaces with the gyroscopes and the mirror controller. The operator interfaces with a separate computer, which acquires images and computes the solar ephemeris. To ensure that image stabilization is successful, a human-machine interface (HMI) was developed to allow connection between the client and the PAC. In order to make control of the instruments user-friendly during the short eclipse observation, a graphical user interface (GUI) was also created. The GUI's functionality includes turning image stabilization on and off, allowing the user to input information about the geometric setup, calculating the solar ephemeris, refining estimates of the initial aircraft attitude, and storing data from the PAC on the operator's computer. It also displays time, location, attitude, ephemeris, gyro rates and mirror angles.
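The stabilization loop described above (gyro-measured platform rotation cancelled by a steering mirror) can be sketched for a single axis. This is an illustrative toy, not AIR-Spec code: the function name, units and one-axis treatment are all assumptions; it relies only on the optical fact that rotating a mirror deflects a reflected beam by twice the mirror angle.

```python
def mirror_correction(gyro_rate_deg_s, dt_s, prev_los_error_deg):
    """Integrate platform rotation from a gyro and return the mirror
    command that cancels the accumulated line-of-sight (LOS) drift.

    A steering mirror deflects the beam by twice its own rotation, so
    the commanded mirror angle is half the LOS error with the sign
    flipped.  All names and units here are illustrative.
    """
    los_error = prev_los_error_deg + gyro_rate_deg_s * dt_s  # accumulated drift
    mirror_cmd = -los_error / 2.0                            # cancel via mirror
    return los_error, mirror_cmd
```

A real implementation would run per axis at the PAC's control rate and blend in the operator's manual corrections for residual drift.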

  8. More playful user interfaces: interfaces that invite social and physical interaction

    CERN Document Server

    2015-01-01

    This book covers the latest advances in playful user interfaces: interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback and anticipate the behavior of human users. The decreasing cost of sensor and actuator technology makes it possible to integrate physical behavior information in human-computer interactions. This leads to many new entertainment and game applications that allow or require social and physical interaction in sensor- and actuator-equipped smart environments. The topics discussed include: human-nature interaction, human-animal interaction and the interaction with tangibles that are naturally integrated in our smart environments. Digitally supported remote audience participation in artistic or sport events is also discussed. One important theme that emerges throughout the book is the involvement of users in the digital-entertainment design process or even design and implement...

  9. An Accessible User Interface for Geoscience and Programming

    Science.gov (United States)

    Sevre, E. O.; Lee, S.

    2012-12-01

    The goal of this research is to develop an interface that simplifies user interaction with software for scientists. The motivation is to build tools that assist scientists with limited motor skills in efficiently generating and using software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program that helps geophysicists write programs using a simple interface requiring only single mouse clicks to input code. My goal is to minimize the amount of typing necessary to create simple programs and scripts, increasing accessibility for people with disabilities that limit fine motor skills. This interface can be adapted to various programming and scripting languages. Using this interface simplifies development of code for C/C++, Java, and GMT, and it can be expanded to support any other text-based programming language. The interface is designed around the concept of maximizing the amount of code that can be written with a minimum of clicks and typing. The screen is split into two sections: a list of click-commands on the left-hand side and a text area on the right-hand side. When the user clicks a command on the left-hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently, the C/C++ interface provides commands for commonly used code segments such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that works across many devices for developing code. A simple prototype has been developed for the iPad. Because an iOS application runs on only a limited range of devices, the code has been rewritten in Java to run on a wider range of devices.
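The click-to-insert mechanism described in this record can be sketched in a few lines: each on-screen command maps to a code template that is spliced into the text buffer at the insertion point. The snippet table, templates and function names below are hypothetical, not taken from the prototype.

```python
# Minimal sketch of a click-to-insert code editor backend.  Each command a
# user can click maps to a template string (here, C-flavored examples).
SNIPPETS = {
    "for loop": "for (int i = 0; i < n; i++) {\n    \n}\n",
    "comment": "/*  */\n",
    "print": 'printf("");\n',
}

def insert_snippet(buffer: str, cursor: int, command: str) -> str:
    """Insert the template for *command* at index *cursor* and return
    the updated text buffer."""
    code = SNIPPETS[command]
    return buffer[:cursor] + code + buffer[cursor:]
```

In a full prototype the same table-driven approach extends to other languages (Java, GMT scripts) simply by swapping the snippet dictionary.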

  10. Astro-WISE interfaces : Scientific information system brought to the user

    NARCIS (Netherlands)

    Belikov, Andrey N.; Vriend, Willem-Jan; Sikkema, Gert

    From a simple text interface to a graphical user interface, Astro-WISE provides the user with a wide range of possibilities to interact with the information system according to the user's tasks and use cases. We describe a general approach to the interfacing of a scientific information system. We

  11. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.
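The bootstrapped group averages mentioned at the end of this abstract follow the standard resampling recipe: resample the per-subject values with replacement and collect the mean of each resample. Below is a generic, stdlib-only sketch of that technique, not code from the interface itself (which builds on MATLAB and EEGLAB).

```python
import random

def bootstrap_mean(values, n_resamples=1000, seed=0):
    """Bootstrapped estimate of a group average: draw *n_resamples*
    resamples (with replacement) from *values* and return the mean of
    each resample.  Percentiles of the returned list give a confidence
    band for the group average.
    """
    rng = random.Random(seed)
    n = len(values)
    means = []
    for _ in range(n_resamples):
        sample = [values[rng.randrange(n)] for _ in range(n)]
        means.append(sum(sample) / n)
    return means
```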

  12. Development of Graphical User Interface Student Electoral System

    Directory of Open Access Journals (Sweden)

    Challiz Delima- Omorog

    2016-08-01

    Full Text Available The study was conducted to design and obtain evidence concerning the software quality and acceptance of a graphical user interface (GUI) student electoral voting system. The intention of this research is three-fold: firstly, a system based on the ISO 9126 software quality characteristics; secondly, a system that conforms to current hardware and software standards; and lastly, improved student participation in decision-making. Designing a usable system in the context of the users' perceptions (needs) and letting these perceptions dictate the design is therefore a great challenge. This study used the descriptive-development research method. Data were collected through guided interviews and survey questionnaires from the respondents. The researcher adopted the Princeton Development Methodology through the entire life cycle of the software development process. A very substantial majority of the respondents stated that the new voting system is highly acceptable compared to the old system, both in terms of the development (maintainability and portability) and implementation (efficiency, functionality, reliability and usability) requirements of ISO 9126. The researcher concluded that usability is tied to the four software characteristics. Users' perception of the software quality implementation requirement is correlated specifically with usability. Based on the data and the problems encountered, respondents placed low importance on metrics that are not well represented in the interface. When the interface fails, users are more likely to take longer to vote, failing efficiency targets, and to be less reliable, weakening functionality.

  13. Exploring Interaction Space as Abstraction Mechanism for Task-Based User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2007-01-01

    Designing a user interface is often a complex undertaking. Model-based user interface design is an approach where models and mappings between them form the basis for creating and specifying the design of a user interface. Such models usually include descriptions of the tasks of the prospective user…, but there is considerable variation in the other models that are employed. This paper explores the extent to which the notion of interaction space is useful as an abstraction mechanism to reduce the complexity of creating and specifying a user interface design. We present how we designed a specific user interface through… mechanism that can help user interface designers exploit object-oriented analysis results and reduce the complexity of designing a user interface…

  14. Database Graphic User Interface correspondence with Ellis Information Seeking behavior Model

    Directory of Open Access Journals (Sweden)

    Muhammad Azami

    2010-03-01

    Full Text Available A graphical user interface serves as a bridge between humans and databases. Its primary purpose is to assist users by establishing interaction with computer systems. Database user interface designers have seldom focused on the impact of users' information seeking behaviors on database user interface structures. It is therefore crucial to incorporate user information seeking behavior in database software design and to analyze its impact on the upgrade and optimization of the user interface environment. The present study intends to determine the degree of correspondence of database interfaces with the information seeking behavioral components of Ellis' model. The components studied are starting, chaining, browsing, differentiating, monitoring and extracting. The investigators employed the direct observation method, using a checklist, to see how well the database interfaces support these components. Results indicated that the information seeking behavior components outlined by Ellis' model are not fully considered in database user interface design. Some of the components, such as starting, chaining and differentiating, were to some extent supported by some of the database user interfaces studied. However, elements such as browsing, monitoring and extracting have not been incorporated into the user interface structures of these databases. On the whole, the degree of correspondence of database user interfaces with Ellis' information seeking components is about average. Therefore, incorporating these elements in the design and evaluation of the user interface environment could have a high impact on better optimization of the database interface environment and consequently on the search and retrieval process itself.

  15. A REVIEW ON USER INTERFACE DESIGN PRINCIPLES TO INCREASE SOFTWARE USABILITY FOR USERS WITH LESS COMPUTER LITERACY

    OpenAIRE

    Ali Darejeh; Dalbir Singh

    2013-01-01

    This article presents a review on how software usability could be increased for users with less computer literacy. The literature was reviewed to extract user interface design principles by identifying the common problems of this group of users. There are different groups of users with less computer literacy. However, based on the literature, three groups of them need special attention from software designers. The first group is elderly users, as users who lack a computer background. The se...

  16. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    Directory of Open Access Journals (Sweden)

    Umberto Cugini

    2013-10-01

    Full Text Available In this article, we present an approach that uses two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface, retrieving the six degrees of freedom required for both the manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.
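One plausible way two force-sensing handles can yield object motion is to drive translation with the sum of the two handle force vectors and rotation with their difference. The mapping, gains, and function names below are purely illustrative assumptions for the sake of concreteness, not the published FSH design.

```python
def handles_to_motion(force_left, force_right, gain_t=0.001, gain_r=0.002):
    """Turn the readings of two force-sensitive handles (3-component
    force vectors, in newtons) into object motion: the mean of the two
    forces drives translation, their difference drives rotation.
    Gains and the mapping itself are illustrative assumptions.
    """
    translation = tuple(gain_t * (l + r) / 2.0
                        for l, r in zip(force_left, force_right))
    rotation = tuple(gain_r * (l - r) / 2.0
                     for l, r in zip(force_left, force_right))
    return translation, rotation
```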

  17. Graphical user interface for wireless sensor networks simulator

    Science.gov (United States)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They can be suited to many applications, from military through environment monitoring, healthcare, home automation and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator dedicated to WSN studies, especially the evaluation of routing and data link protocols.

  18. The homes of tomorrow: service composition and advanced user interfaces

    Directory of Open Access Journals (Sweden)

    Claudio Di Ciccio

    2011-12-01

    Full Text Available Home automation represents a growing market in the industrialized world. Today’s systems are mainly based on ad hoc and proprietary solutions, with little to no interoperability and smart integration. However, in a not so distant future, our homes will be equipped with many sensors, actuators and devices, which will collectively expose services, able to smartly interact and integrate, in order to offer complex services providing even richer functionalities. In this paper we present the approach and results of SM4ALL- Smart hoMes for All, a project investigating automatic service composition and advanced user interfaces applied to domotics.

  19. Graphical user interface prototyping for distributed requirements engineering

    CERN Document Server

    Scheibmayr, Sven

    2014-01-01

    Finding and understanding the right requirements is essential for every software project. This book deals with the challenge to improve requirements engineering in distributed software projects. The use of graphical user interface (GUI) prototypes can help stakeholders in such projects to elicit and specify high quality requirements. The research objective of this study is to develop a method and a software artifact to support the activities in the early requirements engineering phase in order to overcome some of the difficulties and improve the quality of the requirements, which should eventu

  20. Improving the Interplay between Usability Evaluation and User Interface Design

    DEFF Research Database (Denmark)

    Hornbæk, K.; Stage, Jan

    2004-01-01

    This paper provides an overview of a full-day workshop that was held on October 23, 2004 in connection with the Third Nordic Conference on Human-Computer Interaction (NordiCHI 2004). The proceedings from the workshop are available from http://www.cs.aau.dk/~jans/events.html. The ideas and theme… of the workshop are motivated and an outline of the contents of the papers that were presented in the workshop is given. In addition we summarize some challenges to the interplay between usability evaluation and user interface design agreed upon at the workshop, as well as some solutions that were debated.

  1. Hydraulophones: Acoustic musical instruments and expressive user interfaces

    Science.gov (United States)

    Janzen, Ryan E.

    Fluid flow creates an expansive range of acoustic possibilities, particularly in the case of water, which has unique turbulence and vortex shedding properties as compared with the air of ordinary wind instruments. Sound from water flow is explained with reference to a new class of musical instruments, hydraulophones, in which oscillation originates directly from matter in its liquid state. Several hydraulophones which were realized in practical form are described. A unique user interface consisting of a row of water jets is presented, in terms of its expressiveness, tactility, responsiveness to derivatives and integrals of displacement, and in terms of the direct physical interaction between a user and the physical process of sound production. Signal processing algorithms are introduced, which extract further information from turbulent water flow, for industrial applications as well as musical applications.

  2. Object-oriented user interfaces for personalized mobile learning

    CERN Document Server

    Alepis, Efthimios

    2014-01-01

    This book presents recent research in mobile learning and advanced user interfaces. It shows how the combination of these fields can result in personalized educational software that meets the requirements of state-of-the-art mobile learning software. The book provides a framework that is capable of incorporating the relevant software technologies, exploiting a wide range of their current advances and additionally investigating ways to go even further by providing potential solutions to future challenges. The presented approach uses the well-known Object-Oriented method in order to address these challenges. Throughout this book, a general model is constructed using Object-Oriented Architecture. Each chapter focuses on the construction of a specific part of this model, while in the conclusion these parts are unified. This book will help software engineers build more sophisticated personalized software that targets mobile education, while at the same time retaining a high level of adaptivity and user-friendliness w...

  3. User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface

    Science.gov (United States)

    Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.

    1993-01-01

    The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and the output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program or it can be called from within the graphical user interface HGUI that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and type-in fields in HGUI for users to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.
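For orientation, the two-dimensional analogue of such hyperbolic grid generation systems pairs one orthogonality relation with a cell-area constraint; the 3-D system solved by HYPGEN adds a second orthogonality relation and replaces area with volume. Writing $\xi$ for the marching-surface coordinate and $\eta$ for the marching direction, the 2-D system reads:

```latex
\underbrace{x_\xi x_\eta + y_\xi y_\eta = 0}_{\text{orthogonality}},
\qquad
\underbrace{x_\xi y_\eta - y_\xi x_\eta = \Delta V}_{\text{specified cell area}}
```

Here $\Delta V$ is the user-controlled cell size, which is where the input parameters described in the manual enter the solution.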

  4. A polygonal method for haptic force generation

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, T. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Electrical and Computer Engineering; Sandia National Labs., Albuquerque, NM (United States)]

    1996-12-31

    Algorithms for computing forces and associated surface deformations (graphical and physical) are given, which, together with a force feedback device can be used to haptically display virtual objects. The Bendable Polygon algorithm, created at Sandia National Labs and the University of New Mexico, for visual rendering of computer generated surfaces is also presented. An implementation using the EIGEN virtual reality environment, and the PHANToM (Trademark) haptic interface, is reported together with suggestions for future research.
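Haptic displays of this kind commonly use penalty-based force rendering: when the probe penetrates a surface, a restoring spring force proportional to the penetration depth is applied along the surface normal. The sketch below illustrates that general technique only; the stiffness value and function signature are assumptions, and the paper's Bendable Polygon algorithm additionally deforms the surface, which this omits.

```python
def contact_force(penetration_depth, normal, stiffness=800.0):
    """Penalty-based haptic rendering: if the probe penetrates the
    polygon by *penetration_depth* metres, push back along the unit
    surface *normal* with spring force F = k * d (newtons).  Returns
    a zero force vector when there is no contact.
    """
    if penetration_depth <= 0.0:
        return (0.0, 0.0, 0.0)
    f = stiffness * penetration_depth
    return tuple(f * n for n in normal)
```

In practice the stiffness is limited by the force-feedback device's stable update rate (commonly around 1 kHz for devices like the PHANToM).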

  5. Bringing Control System User Interfaces to the Web

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xihui [ORNL; Kasemir, Kay [ORNL

    2013-01-01

    With the evolution of web-based technologies, especially HTML5 [1], it has become possible to create web-based control system user interfaces (UIs) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first is WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and the web server; it is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control-system independent, potentially supporting any type of control system.
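To make the idea of a WebSocket-based data-access protocol concrete, the sketch below shows JSON messages of the general shape such a protocol might exchange: a subscription request for a process variable and a value update pushed back to the client. All field names here are invented for illustration and are not the actual WebPDA wire format.

```python
import json

def subscribe_message(pv_name: str, msg_id: int) -> str:
    """Build a hypothetical JSON subscription request for one process
    variable (PV), to be sent over an open WebSocket connection."""
    return json.dumps({"type": "subscribe", "id": msg_id, "pv": pv_name})

def parse_update(raw: str):
    """Decode a hypothetical JSON value update pushed by the server,
    returning the PV name and its new value."""
    msg = json.loads(raw)
    return msg["pv"], msg["value"]
```

Because both sides speak plain JSON over WebSocket, the same messages can be produced and consumed from browser-side JavaScript with no plugin.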

  6. Graphical user interfaces for teaching and research in optical communications

    Science.gov (United States)

    Almeida, Telmo; Nogueira, Rogerio; André, Paulo

    2014-07-01

    This paper highlights the use of graphical user interfaces (GUIs) developed with the GUIDE tool from Matlab® for university-level optical communications courses and research activities. Graphical user interfaces programmed with Matlab® not only improve the learning experience, making models easier to understand, but can also be tweaked and improved by the students themselves. As Matlab® is already taught in many universities, this eases the process. An example of a model for a stationary EDFA is given to demonstrate the ease of use and the role of all the different parameters of the model, so students can get a truly interactive experience. Another potential application considered is in research. With GUIs, researchers can perform real-time parameter optimization, quick assessments and calculations, or simply showcase their work to broader audiences who may not be so familiar with the topic. A practical example of a research application is given for the parameter optimization of a model of non-linear phenomena in uncompensated long-haul transmission links. Besides the emphasis given to practical applications and potential situations for its use, the paper also covers the basic notions of the critical steps in making a successful Matlab® GUI. Ease of use, visual appearance and computation time are the key features of a successfully implemented GUI.

  7. Wearable wireless User Interface Cursor-Controller (UIC-C).

    Science.gov (United States)

    Marjanovic, Nicholas; Kerr, Kevin; Aranda, Ricardo; Hickey, Richard; Esmailbeigi, Hananeh

    2017-07-01

    Controlling a computer or a smartphone's cursor allows the user to access a world full of information. For millions of people with limited upper-extremity motor function, controlling the cursor becomes profoundly difficult. Our team has developed the User Interface Cursor-Controller (UIC-C) to assist impaired individuals in regaining control over the cursor. The UIC-C is a hands-free device that utilizes the tongue muscle to control cursor movements. The entire device is housed inside a subject-specific retainer. The user maneuvers the cursor by manipulating a joystick embedded inside the retainer with the tongue. The joystick movement commands are sent to an electronic device via a Bluetooth connection, and the device is readily recognizable as a cursor controller by any Bluetooth-enabled electronic device. Testing has shown that the time it takes the user to control the cursor accurately via the UIC-C is about three times longer than with a standard hand-controlled computer mouse. The device does not require any permanent modifications to the body; therefore, it could be used during a period of acute hand rehabilitation. With the development of modern smart homes and computer-controlled electronics, the UIC-C could be integrated into a system that enables individuals with permanent impairment to control the cursor. In conclusion, the UIC-C device is designed with the goal of allowing the user to accurately control a cursor during periods of either acute or permanent upper-extremity impairment.
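A joystick of this kind is typically mapped to cursor velocity with a dead zone around the rest position, so small involuntary tongue movements do not move the cursor. The sketch below shows that generic mapping; the dead-zone size, gain, and function name are assumptions for illustration, not UIC-C firmware parameters.

```python
def joystick_to_cursor(x, y, dead_zone=0.1, gain=8.0):
    """Map a normalised joystick deflection (-1..1 on each axis) to a
    cursor velocity in pixels per update.  Deflections inside the dead
    zone yield no motion; outside it, velocity scales linearly up to
    *gain* pixels per update at full deflection.
    """
    def axis(v):
        if abs(v) < dead_zone:
            return 0.0
        sign = 1.0 if v > 0 else -1.0
        return gain * (v - dead_zone * sign) / (1.0 - dead_zone)
    return axis(x), axis(y)
```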

  8. Towards an Improved Pilot-Vehicle Interface for Highly Automated Aircraft: Evaluation of the Haptic Flight Control System

    Science.gov (United States)

    Schutte, Paul; Goodrich, Kenneth; Williams, Ralph

    2012-01-01

    The control automation and interaction paradigm (e.g., manual, autopilot, flight management system) used on virtually all large, highly automated aircraft has long been an exemplar of breakdowns in human factors and human-centered design. An alternative paradigm is the Haptic Flight Control System (HFCS), part of NASA Langley Research Center's Naturalistic Flight Deck Concept. The HFCS uses only stick and throttle for easily and intuitively controlling the actual flight of the aircraft without losing any of the efficiency and operational benefits of the current paradigm. Initial prototypes of the HFCS are being evaluated, and this paper describes one such evaluation. In this evaluation we examined claims regarding improved situation awareness, appropriate workload, graceful degradation, and improved pilot acceptance. Twenty-four instrument-rated pilots were instructed to plan and fly four different flights in a fictitious airspace using a moderate-fidelity desktop simulation. Three different flight control paradigms were tested: manual control, full automation control, and a simplified version of the HFCS. Dependent variables included both subjective (questionnaire) and objective (SAGAT) measures of situation awareness, workload (NASA-TLX), secondary task performance, time to recognize automation failures, and pilot preference (questionnaire). The results showed a statistically significant advantage for the HFCS in a number of measures. Results that were not statistically significant still favored the HFCS. The results suggest that the HFCS offers an attractive and viable alternative to the tactical components of today's FMS/autopilot control system. The paper describes further studies that are planned to continue evaluating the HFCS.

  9. CTG Analyzer: A graphical user interface for cardiotocography.

    Science.gov (United States)

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations and decelerations. The UC signal, instead, is characterized by the presence of contractions and the contraction period. These parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has well-demonstrated poor reproducibility, due to the complexity of the physiological phenomena affecting fetal heart rhythm and its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving correctness in CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed in MATLAB®; it is a very intuitive and user-friendly graphical user interface. The FHR time series and the UC signal are represented one under the other, on a grid with reference lines, as usually done for CTG reports printed on paper. Colors help identification of FHR and UC features. Automatic analysis is based on fixed feature definitions provided by the FIGO guidelines, plus other arbitrary settings whose default values can be changed by the user. Eventually, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
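The FIGO guidelines referenced above define a normal FHR baseline as 110-160 bpm, with tachycardia above and bradycardia below that range. A minimal sketch of such a classification is shown below; estimating the baseline as the median of the trace is a deliberate simplification for illustration, not CTG Analyzer's actual baseline algorithm.

```python
import statistics

def classify_baseline(fhr_bpm):
    """Classify an FHR baseline per the FIGO ranges: normal 110-160 bpm,
    tachycardia above 160, bradycardia below 110.  The baseline is
    estimated here simply as the median of the FHR samples.
    """
    baseline = statistics.median(fhr_bpm)
    if baseline > 160:
        return baseline, "tachycardia"
    if baseline < 110:
        return baseline, "bradycardia"
    return baseline, "normal"
```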

  10. Graphical user interface concepts for tactical augmented reality

    Science.gov (United States)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieving the objectives of ULTRA-Vis, called iLeader, incorporates a full-color 40° field of view (FOV) see-through holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark up the 3D battle-space with symbology identifying graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, non-disruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  11. Creating interactive User Feedback in DGS using Scripting Interfaces

    Directory of Open Access Journals (Sweden)

    Andreas Fest

    2010-06-01

    Full Text Available Feedback is an important component of interactive learning software. A conclusion from cognitive learning theory is that good software must give learners more information about what they did. Following the ideas of constructivist learning theory, the user should be in control of both the time and the level of feedback he or she receives. At the same time, the feedback system must identify and review different possible solution strategies in an open learning environment. The interactive geometry software Cinderella offers an easy-to-use programming interface. It can be used by authors of learning units to implement application-specific feedback. In this paper we present two exemplary learning units implementing two kinds of interactive feedback: feedback on demand and immediate feedback. The presented units come from discrete mathematics and from the theory of line reflections and congruences in geometry. The units are implemented in a process-oriented design. Various directly given or hidden hints help the students to understand the mathematical principles behind the given problems. Our tool analyses the students' solution processes automatically and generates additional feedback on demand. The second learning environment can also be used in conjunction with recording of user actions. This allows additional feedback to be given later by the teacher whenever the automatic feedback system fails in analyzing the users' learning processes. First experiences using the units in teaching are presented.

  12. Presentation of dynamically overlapping auditory messages in user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Papp, III, Albert Louis [Univ. of California, Davis, CA (United States)

    1997-09-01

    This dissertation describes a methodology and example implementation for the dynamic regulation of temporally overlapping auditory messages in computer-user interfaces. The regulation mechanism exists to schedule numerous overlapping auditory messages in such a way that each individual message remains perceptually distinct from all others. The method is based on the research conducted in the area of auditory scene analysis. While numerous applications have been engineered to present the user with temporally overlapped auditory output, they have generally been designed without any structured method of controlling the perceptual aspects of the sound. The method of scheduling temporally overlapping sounds has been extended to function in an environment where numerous applications can present sound independently of each other. The Centralized Audio Presentation System is a global regulation mechanism that controls all audio output requests made from all currently running applications. The notion of multimodal objects is explored in this system as well. Each audio request that represents a particular message can include numerous auditory representations, such as musical motives and voice. The Presentation System scheduling algorithm selects the best representation according to the current global auditory system state, and presents it to the user within the request constraints of priority and maximum acceptable latency. The perceptual conflicts between temporally overlapping audio messages are examined in depth through the Computational Auditory Scene Synthesizer. At the heart of this system is a heuristic-based auditory scene synthesis scheduling method. Different schedules of overlapped sounds are evaluated and assigned penalty scores. High scores represent presentations that include perceptual conflicts between overlapping sounds. Low scores indicate fewer and less serious conflicts.
A user study was conducted to validate that the perceptual difficulties predicted by
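    The penalty-scoring idea described above can be sketched in a few lines. This is a hypothetical illustration, not the dissertation's actual algorithm: candidate schedules are scored by how much their messages overlap in time, and the lowest-penalty schedule wins. Real penalties would also weigh perceptual factors such as timbre and pitch proximity.

```python
# Score a schedule by total pairwise temporal overlap, then pick the best one.
def penalty(schedule):
    """schedule: list of (start, duration); penalty = summed pairwise overlap."""
    total = 0.0
    for i, (s1, d1) in enumerate(schedule):
        for s2, d2 in schedule[i + 1:]:
            overlap = min(s1 + d1, s2 + d2) - max(s1, s2)
            total += max(0.0, overlap)
    return total

def best_schedule(candidates):
    """Return the candidate schedule with the lowest penalty score."""
    return min(candidates, key=penalty)
```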

  13. A survey on haptic technologies for mobile augmented reality

    OpenAIRE

    Bermejo, Carlos; Hui, Pan

    2017-01-01

    Augmented Reality (AR) and Mobile Augmented Reality (MAR) applications have gained much research and industry attention these days. The mobile nature of MAR applications limits users' interaction capabilities such as inputs, and haptic feedbacks. This survey reviews current research issues in the area of human computer interaction for MAR and haptic devices. The survey first presents human sensing capabilities and their applicability in AR applications. We classify haptic devices into two gro...

  14. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    Science.gov (United States)

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint angle sensors. In this article, we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher-fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
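    The core of any such calibration is fitting a correction model that maps device-reported positions onto externally tracked reference positions. As a hedged, much-simplified sketch of that idea (the paper's procedure is a full 3-D position and gimbal-angle calibration), a one-dimensional linear correction can be fitted by least squares:

```python
# Least-squares fit of tracked ≈ a * reported + b (1-D illustrative stand-in).
def fit_linear_correction(reported, tracked):
    """Return (a, b) minimizing the squared error of tracked - (a*reported + b)."""
    n = len(reported)
    mx = sum(reported) / n
    my = sum(tracked) / n
    sxx = sum((x - mx) ** 2 for x in reported)
    sxy = sum((x - mx) * (y - my) for x, y in zip(reported, tracked))
    a = sxy / sxx
    return a, my - a * mx
```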

  15. Incorporating Speech Recognition into a Natural User Interface

    Science.gov (United States)

    Chapa, Nicholas

    2017-01-01

    The Augmented/Virtual Reality (AVR) Lab has been working to study the applicability of recent virtual and augmented reality hardware and software to KSC operations. This includes the Oculus Rift, HTC Vive, Microsoft HoloLens, and Unity game engine. My project in this lab is to integrate voice recognition and voice commands into an easy-to-modify system that can be added to an existing portion of a Natural User Interface (NUI). A NUI is an intuitive and simple-to-use interface incorporating visual, touch, and speech recognition. The inclusion of speech recognition capability will allow users to perform actions or make inquiries using only their voice. The simplicity of needing only to speak to control an on-screen object or enact some digital action means that any user can quickly become accustomed to using this system. Multiple programs were tested for use in a speech command and recognition system. Sphinx4 translates speech to text using a Hidden Markov Model (HMM) based language model, an acoustic model, and a word dictionary, running on Java. PocketSphinx has similar functionality to Sphinx4 but runs on C. However, neither of these programs was ideal, as building a Java or C wrapper slowed performance. The most suitable speech recognition system tested was the Unity Engine Grammar Recognizer. A Context-Free Grammar (CFG) structure is written in an XML file to specify the structure of the phrases and words that will be recognized by the Unity Grammar Recognizer. Using Speech Recognition Grammar Specification (SRGS) 1.0 makes modifying the recognized combinations of words and phrases very simple and quick. With SRGS 1.0, semantic information can also be added to the XML file, which allows for even more control over how spoken words and phrases are interpreted by Unity. Additionally, using a CFG with SRGS 1.0 produces Finite State Machine (FSM) functionality, limiting the potential for incorrectly heard words or phrases. The purpose of my project was to
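    The way a CFG constrains recognition can be illustrated with a toy matcher. This sketch is an assumption for illustration only (it is not Unity's or SRGS 1.0's implementation): the grammar admits only "verb object" commands over a small vocabulary, so any other word sequence is rejected, which is exactly how a grammar limits misrecognized phrases.

```python
# Toy context-free grammar: a command is a verb followed by an object.
GRAMMAR = {
    "command": [["verb", "object"]],
    "verb":   [["open"], ["close"], ["rotate"]],
    "object": [["panel"], ["hatch"], ["valve"]],
}

def matches(rule, words):
    """True if `words` can be derived from `rule` in GRAMMAR."""
    if rule not in GRAMMAR:                      # terminal symbol
        return len(words) == 1 and words[0] == rule
    for production in GRAMMAR[rule]:
        if len(production) == len(words) and all(
            matches(sym, [w]) for sym, w in zip(production, words)
        ):
            return True
    return False
```

    Here each non-terminal expands to exactly one word, which keeps the matcher trivial; a general CFG parser would try all splits of the word sequence.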

  16. siGnum: graphical user interface for EMG signal analysis.

    Science.gov (United States)

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals, which represent the electrical activity of muscles, can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes a novel Graphical User Interface (GUI), siGnum, developed in MATLAB, that applies efficient and effective techniques to the processing of raw EMG signals and decomposes them in a simpler manner. It can be used independently of the MATLAB software by employing a deploy tool. This would enable researchers to gain a good understanding of the EMG signal and its analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.
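    Two standard steps such a tool applies to a raw EMG trace are full-wave rectification and envelope extraction. The sketch below is illustrative (it is not siGnum's code): rectification takes absolute values, and a moving-RMS window tracks the muscle activation level.

```python
# Rectify a raw EMG signal and compute its moving-RMS envelope.
def rectify(signal):
    """Full-wave rectification: absolute value of each sample."""
    return [abs(v) for v in signal]

def moving_rms(signal, window):
    """Sliding-window RMS envelope over the signal."""
    out = []
    for i in range(len(signal) - window + 1):
        chunk = signal[i:i + window]
        out.append((sum(v * v for v in chunk) / window) ** 0.5)
    return out
```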

  17. A user friendly interface for microwave tomography enhanced GPR surveys

    Science.gov (United States)

    Catapano, Ilaria; Affinito, Antonio; Soldovieri, Francesco

    2013-04-01

    Ground Penetrating Radar (GPR) systems are nowadays widely used in civil applications, among which structural monitoring is one of the most critical issues due to its importance in terms of risk prevention and cost-effective management of the structure itself. Although GPR systems are well-established devices, there is continuous interest in their optimization, which involves both hardware and software aspects, with the common final goal of achieving accurate and highly informative images while keeping the difficulties and times involved in on-field surveys as low as possible. As far as data processing is concerned, one of the key aims is the development of imaging approaches capable of providing images easily interpretable by non-expert users while keeping the requirements in terms of computational resources feasible. To satisfy this request, or at least to improve the reconstruction capabilities of the data processing tools currently available in commercial GPR systems, microwave tomographic approaches based on the Born approximation have been developed and tested in several practical conditions, such as civil and archaeological investigations, sub-service monitoring, security surveys and so on [1-3]. However, the adoption of these approaches requires the involvement of expert workers, who have to be capable of properly managing the gathered data and their processing, which involves the solution of a linear inverse scattering problem. In order to overcome this drawback, the aim of this contribution is to present an end-user-friendly software interface that makes simple management of the microwave tomographic approaches possible. In particular, the proposed interface allows the user to upload both synthetic and experimental data sets saved in .txt, .dt and .dt1 formats, to perform all the steps needed to obtain tomographic images, and to display raw radargrams, intermediate and final results.
By means of the interface, the users can apply time gating, background removal or both to
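    The background-removal step mentioned above is commonly implemented by subtracting the mean trace across all scan positions from each individual trace, which suppresses horizontal banding such as the air/ground interface reflection. A minimal pure-Python stand-in (rows of the radargram are traces):

```python
# Subtract the mean trace from every trace of a radargram (list of rows).
def background_removal(radargram):
    n_traces = len(radargram)
    mean_trace = [sum(col) / n_traces for col in zip(*radargram)]
    return [[v - m for v, m in zip(trace, mean_trace)]
            for trace in radargram]
```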

  18. Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces.

    Science.gov (United States)

    Culbertson, Heather; Kuchenbecker, Katherine J

    2017-01-01

    Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.

  19. Provide a Model to Determine of Importance the Characteristics of Iranian Digital Libraries User Interface

    Directory of Open Access Journals (Sweden)

    Yaghoub Norouzi

    2012-03-01

    Full Text Available User interfaces in digital libraries are responsible for the interaction between users and the information environment and for creating feedback. Therefore, the aim of this study was to offer a model for determining the importance of the characteristics of Iranian digital library user interfaces, based on the Delphi method. After a study of the related literature and the preparation of a researcher-constructed checklist for the Delphi panel, ten key criteria (search, consistency, guidance, navigation, design, error correction, presentation, user control, interface language, and simplicity) comprising 114 features and sub-components were selected in order to determine their importance in the design of Iranian digital library user interfaces. Then, based on the scores obtained from the Delphi panel, the importance of each sub-feature in the design of Iranian digital library user interfaces was determined. Finally, the interface language and error correction criteria obtained the highest and the lowest scores, respectively. The differences among the criteria and sub-components were nevertheless small, and according to the Delphi panel all of them should be used in the design of Iranian digital library user interfaces. It is hoped that the results of this study can identify the features needed in the design of digital library user interfaces and their importance. They can also be used as a tool for assessing the capabilities of Iranian digital library user interfaces from the viewpoint of experts and users.

  20. Model for Educational Game Using Natural User Interface

    Directory of Open Access Journals (Sweden)

    Azrulhizam Shapi’i

    2016-01-01

    Full Text Available Natural User Interface (NUI) is a new approach that has become increasingly popular in Human-Computer Interaction (HCI). The use of this technology is widespread in almost all sectors, including the field of education. In recent years, many educational games using NUI technology, such as Kinect games, have appeared on the market. Kinect is a sensor that can recognize body movements, postures, and voices in three dimensions. It enables users to control and interact with a game without the need for a game controller. However, the contents of most existing Kinect games do not follow the standard classroom curriculum, so they do not fully achieve the learning objectives. Hence, this research proposes a design model as a guideline for designing educational games using NUI. A prototype has been developed as one of the objectives of this study. The prototype is based on the proposed model in order to ensure and assess the effectiveness of the model. The outcomes of this study conclude that the proposed model contributes a design method for the development of educational games using NUI. Furthermore, evaluation results of the prototype show a good response from participants, in line with the standard curriculum.

  1. Risk Issues in Developing Novel User Interfaces for Human-Computer Interaction

    KAUST Repository

    Klinker, Gudrun

    2014-01-01

    © 2014 Springer International Publishing Switzerland. All rights reserved. When new user interfaces or information visualization schemes are developed for complex information processing systems, it is not readily clear how much they do, in fact, support and improve users' understanding and use of such systems. Is a new interface better than an older one? In what respect, and in which situations? To provide answers to such questions, user testing schemes are employed. This chapter reports on a range of risks pertaining to the design and implementation of user interfaces in general, and to newly emerging interfaces (3-dimensional, immersive, mobile) in particular.

  2. A user interface development tool for space science systems Transportable Applications Environment (TAE) Plus

    Science.gov (United States)

    Szczur, Martha R.

    1990-01-01

    The Transportable Applications Environment Plus (TAE Plus), developed at NASA's Goddard Space Flight Center, is a portable What You See Is What You Get (WYSIWYG) user interface development and management system. Its primary objective is to provide an integrated software environment that allows interactive prototyping and development of user interfaces, as well as management of the user interface within the operational domain. Although TAE Plus is applicable to many types of applications, its focus is on supporting user interfaces for space applications. This paper discusses what TAE Plus provides and how the implementation has utilized state-of-the-art technologies within graphic workstations, windowing systems and object-oriented programming languages.

  3. Glenn Reconfigurable User-interface and Virtual reality Exploration (GRUVE) Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The GRUVE (Glenn Reconfigurable User-interface and Virtual reality Exploration) Lab is a reconfigurable, large-screen display facility at NASA Glenn Research Center....

  4. Knowledge-Based User-Computer Interface Design, Prototyping and Evaluation - the Design Pro Advisory System

    National Research Council Canada - National Science Library

    Andriole, Stephen

    1998-01-01

    ...) design, prototyping and evaluation. DesignPro permits designers of user computer interfaces to represent requirements, to build prototypes, and to evaluate their impact -- all via a "workbench" of user accessible functions...

  5. Haptic-assistive technologies for audition and vision sensory disabilities.

    Science.gov (United States)

    Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria

    2017-10-10

    The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results have been classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses have also been carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory-disabled persons. The reviewed literature provides evidence that sensory substitution aids are able to partly mitigate the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can be a means of communication to provide some kinds of information to sensory-disabled individuals. A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces, and towards integration with personal devices such as smartphones, for a wider diffusion of sensory aids among disabled users. Implications for rehabilitation: Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information for patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin. Their effectiveness and ease of use make haptic sensory substitution

  6. Probabilistic rainfall warning system with an interactive user interface

    Science.gov (United States)

    Koistinen, Jarmo; Hohti, Harri; Kauhanen, Janne; Kilpinen, Juha; Kurki, Vesa; Lauri, Tuomo; Nurmi, Pertti; Rossi, Pekka; Jokelainen, Miikka; Heinonen, Mari; Fred, Tommi; Moisseev, Dmitri; Mäkelä, Antti

    2013-04-01

    citizens and professional end users applies SMS messages and, in the near future, smartphone maps. The present interactive user interface facilitates free selection of alert sites and two warning thresholds (any rain, heavy rain) at any location in Finland. The pilot service was tested by 1000-3000 users during the summers of 2010 and 2012. As an example of dedicated end-user services, gridded exceedance scenarios (with probabilities of 5%, 50% and 90%) of hourly rainfall accumulations for the next 3 hours have been utilized as online input data for the influent model at the Greater Helsinki Wastewater Treatment Plant.

  7. User-centered design of a patient’s work station for haptic robot-based telerehabilitation after stroke

    Directory of Open Access Journals (Sweden)

    Ivanova Ekaterina

    2017-03-01

    Full Text Available Robotic therapy devices have been an important part of clinical neurological rehabilitation for several years. Until now, such devices have only been available to patients receiving therapy inside rehabilitation hospitals. Since patients should continue rehabilitation training at home after hospital discharge, intelligent robotic rehab devices could help to achieve this goal. This paper presents the therapeutic requirements and the early phases of the user-centered design process of the patient's work station as part of a novel robot-based system for motor telerehabilitation.

  8. Haptic spatial configuration learning in deaf and hearing individuals

    NARCIS (Netherlands)

    van Dijk, R; Kappers, A.M.L.; Postma, A.

    2013-01-01

    The present study investigated haptic spatial configuration learning in deaf individuals, hearing sign language interpreters and hearing controls. In three trials, participants had to match ten shapes haptically to the cut-outs in a board as fast as possible. Deaf and hearing sign language users

  9. CATE 2016 Indonesia: Camera, Software, and User Interface

    Science.gov (United States)

    Kovac, S. A.; Jensen, L.; Hare, H. S.; Mitchell, A. M.; McKay, M. A.; Bosh, R.; Watson, Z.; Penn, M.

    2016-12-01

    The Citizen Continental-America Telescopic Eclipse (Citizen CATE) Experiment will use a fleet of 60 identical telescopes across the United States to image the inner solar corona during the 2017 total solar eclipse. As a proof of concept, five sites were hosted along the path of totality during the 2016 total solar eclipse in Indonesia. Tanjung Pandan, Belitung, Indonesia was the first site to experience totality. This site had the best seeing conditions and focus, resulting in the highest-quality images, and proved that the equipment to be used is capable of recording high-quality images of the solar corona. Because 60 sites will be funded, each setup needs to be cost-effective. This requires us to use an inexpensive camera, which consequently has a small dynamic range. To compensate for the corona's intensity drop-off factor of 1,000, images are taken at seven frames per second, at exposures of 0.4 ms, 1.3 ms, 4.0 ms, 13 ms, 40 ms, 130 ms, and 400 ms. Using MATLAB software, we are able to capture a high dynamic range with an Arduino that controls the 2448 x 2048 CMOS camera. A major component of this project is to train average citizens to use the software, meaning it needs to be as user-friendly as possible. The CATE team is currently working with MathWorks to create a graphical user interface (GUI) that will make data collection run smoothly. This interface will include tabs for alignment, focus, calibration data, drift data, GPS, totality, and a quick-look function. This work was made possible through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The NSO Training for 2017 Citizen CATE Experiment, funded by NASA (NASA NNX16AB92A), also provided support for this project. The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the NSF.
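    The exposure-bracketing scheme can be sketched as follows. This is a hedged illustration of the general HDR idea, not the CATE pipeline: each pixel's radiance is estimated from the longest exposure that is not saturated, normalized by that exposure time. The exposure times come from the abstract; the saturation threshold is an assumed placeholder value.

```python
# Estimate per-pixel radiance (counts per ms) from a bracketed exposure stack.
EXPOSURES_MS = [0.4, 1.3, 4.0, 13, 40, 130, 400]

def hdr_pixel(values, exposures=EXPOSURES_MS, saturation=4000):
    """values[i] = pixel count at exposures[i]; pick longest unsaturated frame."""
    for v, t in sorted(zip(values, exposures), key=lambda p: -p[1]):
        if v < saturation:
            return v / t
    return values[0] / exposures[0]        # all frames saturated: best we can do
```

    Choosing the longest unsaturated exposure maximizes signal-to-noise for faint outer-corona pixels while the short exposures preserve the bright inner corona.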

  10. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces.

    Science.gov (United States)

    Mortimer, Michael; Horan, Ben; Seyedmahmoudian, Mehdi

    2017-03-14

    The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future. In fact, by using the
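    The grouping idea at the heart of this approach can be sketched in a few lines. This toy is an assumption for illustration (the characteristic names are invented and the paper's toolbox is MATLAB-based): robots are keyed by the characteristics that the relationship rules care about, so robots sharing those characteristics share one teleoperation UI configuration.

```python
# Group robots so that each group can share a single UI configuration.
def group_by_ui(robots, relevant=("drive", "camera", "arm")):
    """robots: dict name -> characteristics dict; returns lists of robot names."""
    groups = {}
    for name, chars in robots.items():
        key = tuple(chars.get(k) for k in relevant)   # rule-relevant signature
        groups.setdefault(key, []).append(name)
    return list(groups.values())
```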

  11. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces

    Directory of Open Access Journals (Sweden)

    Michael Mortimer

    2017-03-01

    Full Text Available The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future.
In fact, by

  12. Optoelectronic polarimeter controlled by a graphical user interface of Matlab

    Science.gov (United States)

    Vilardy, J. M.; Jimenez, C. J.; Torres, R.

    2017-01-01

    We show the design and implementation of an optical polarimeter using electronic control. The polarimeter has software with a graphical user interface (GUI) that controls the optoelectronic setup and captures the optical intensity measurements; finally, this software evaluates the Stokes vector of a state of polarization (SOP) by means of the synchronous detection of optical waves. The proposed optoelectronic polarimeter can determine the Stokes vector of a SOP in a rapid and efficient way. Using the polarimeter proposed in this paper, students will be able to observe (on an optical bench) and understand the different interactions of the SOP when the optical waves pass through linear polarizers and retarder wave plates. The polarimeter prototype could be used as a main tool for students to learn the theoretical and experimental aspects of the SOP for optical waves via the Stokes vector measurement. The proposed polarimeter, controlled by a GUI in Matlab, is more attractive and suitable for teaching and learning the polarization of optical waves.
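    The Stokes-vector computation such a polarimeter performs can be written down directly from the six standard intensity measurements: a linear analyzer at 0°, 90°, 45° and 135°, plus right- and left-circular analyzers. The formulas below are the textbook definitions; how the paper's setup acquires each intensity is not shown here.

```python
# Stokes parameters from six analyzer intensity measurements.
def stokes(i0, i90, i45, i135, ircp, ilcp):
    s0 = i0 + i90          # total intensity
    s1 = i0 - i90          # horizontal vs vertical linear component
    s2 = i45 - i135        # +45° vs -45° linear component
    s3 = ircp - ilcp       # right vs left circular component
    return [s0, s1, s2, s3]
```

    For horizontally polarized light, for example, the measurements (1, 0, 0.5, 0.5, 0.5, 0.5) give the expected Stokes vector [1, 1, 0, 0].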

  13. GCL – An Easy Way for Creating Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Mariusz Trzaska

    2011-02-01

    Full Text Available Graphical User Interfaces (GUI) can be created using several approaches. Besides using visual editors or manually written source code, it is possible to employ a declarative method. Such a solution usually allows working at a higher abstraction level, which saves the developers' time and reduces errors. The approach can follow many ideas. One of them is based on utilizing a Domain Specific Language (DSL). In this paper we present the results of our research concerning a DSL called GCL (GUI Creating Language). The prototype is implemented as a library for Java with an API emulating the syntax and semantics of a DSL. A programmer, using a few keywords, is able to create different types of GUIs, including forms, panels, dialogs, etc. The widgets of the GUI are built automatically during the run-time phase based on a given data instance (an ordinary Java object) and can optionally be customized by the programmer. The main contribution of our work is delivering a working library for a popular platform. The library could easily be ported to other programming languages such as MS C#.
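    The data-driven widget generation described above can be sketched in Python (GCL itself is a Java library; the widget-type mapping and class below are invented for illustration): form fields are derived automatically from an ordinary object's attributes instead of being hand-coded.

```python
# Derive a (label, widget) form description from an object's public attributes.
def form_spec(obj):
    widget_for = {bool: "checkbox", int: "spinner", float: "spinner",
                  str: "textfield"}
    spec = []
    for name, value in vars(obj).items():
        if not name.startswith("_"):
            spec.append((name, widget_for.get(type(value), "textfield")))
    return spec

class Person:                          # hypothetical data instance
    def __init__(self):
        self.name = "Ada"
        self.age = 36
        self.active = True
```

    A renderer would then turn each (label, widget) pair into a concrete control, which is the kind of higher-abstraction workflow the declarative approach enables.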

  14. Motor dysfunction and touch-slang in user interface data.

    Science.gov (United States)

    Klein, Yoni; Djaldetti, Ruth; Keller, Yosi; Bachelet, Ido

    2017-07-05

    The recent proliferation of mobile touch-based devices paves the way for increasingly efficient, easy-to-use natural user interfaces (NUIs). Unfortunately, touch-based NUIs can prove difficult, or even impossible, to operate under certain conditions, e.g., for users with motor dysfunction such as Parkinson's disease (PD). Yet the prevalence of such devices makes them particularly suitable for acquiring motor-function data and enabling the early detection of PD symptoms and other conditions. In this work we acquired a unique database of more than 12,500 annotated NUI multi-touch gestures, collected from PD patients and healthy volunteers, which we analyzed by applying advanced shape analysis and statistical inference schemes. The proposed analysis leads to a novel detection scheme for early stages of PD. Moreover, our computational analysis revealed that young subjects may be using a 'slang' form of gesture-making to reduce effort and attention cost while maintaining meaning, whereas older subjects put an emphasis on content and precise performance.

  15. Developing adaptive user interfaces using a game-based simulation environment

    NARCIS (Netherlands)

    Brake, G.M. te; Greef, T.E. de; Lindenberg, J.; Rypkema, J.A.; Smets-Noor, N.J.J.M.

    2006-01-01

    In dynamic settings, user interfaces can provide more optimal support if they adapt to the context of use. Providing adaptive user interfaces to first responders may therefore be fruitful. A cognitive engineering method that incorporates development iterations in both a simulated and a real-world…

  16. Integrating User Interface and Personal Innovativeness into the TAM for Mobile Learning in Cyber University

    Science.gov (United States)

    Joo, Young Ju; Lee, Hyeon Woo; Ham, Yookyoung

    2014-01-01

    This study aims to add new variables, namely user interface, personal innovativeness, and satisfaction in learning, to Davis's technology acceptance model and also examine whether learners are willing to adopt mobile learning. Thus, this study attempted to explain the structural causal relationships among user interface, personal…

  17. SWATMOD-PREP: Graphical user interface for preparing coupled SWAT-modflow simulations

    Science.gov (United States)

    This paper presents SWATMOD-Prep, a graphical user interface that couples a SWAT watershed model with a MODFLOW groundwater flow model. The interface is based on a recently published SWAT-MODFLOW code that couples the models via mapping schemes. The spatial layout of SWATMOD-Prep guides the user t...

  18. Influence of Learning Styles on Graphical User Interface Preferences for e-Learners

    Science.gov (United States)

    Dedic, Velimir; Markovic, Suzana

    2012-01-01

    Implementing Web-based educational environment requires not only developing appropriate architectures, but also incorporating human factors considerations. User interface becomes the major channel to convey information in e-learning context: a well-designed and friendly enough interface is thus the key element in helping users to get the best…

  19. User interface design principles for the SSM/PMAD automated power system

    Science.gov (United States)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  20. Guidelines for the integration of audio cues into computer user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.

    1985-06-01

    Throughout the history of computers, vision has been the main channel through which information is conveyed to the computer user. As the complexities of man-machine interactions increase, more and more information must be transferred from the computer to the user and then successfully interpreted by the user. A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby engaging the sense of hearing in the computer experience. This allows our visual and auditory capabilities to work naturally together in unison, leading to more effective and efficient interpretation of all information the user receives from the computer. This thesis presents an initial set of guidelines to assist interface developers in designing an effective sight-and-sound user interface. This study is a synthesis of various aspects of sound, human communication, computer-user interfaces, and psychoacoustics. We introduce the notion of an earcon: earcons are audio cues used in the computer-user interface to provide information and feedback to the user about some computer object, operation, or interaction. A possible construction technique for earcons, the use of earcons in the interface, how earcons are learned and remembered, and the effects of earcons on their users are investigated. This study takes the point of view that earcons are a language and a human/computer communication issue, and they are therefore analyzed along the three dimensions of linguistics: syntactics, semantics, and pragmatics.

  1. Towards a modified consumer haptic device for robotic-assisted fine-motor repetitive motion training.

    Science.gov (United States)

    Palsbo, Susan E; Marr, Deborah; Streng, Taylor; Bay, Brian K; Norblad, A Walter

    2011-01-01

    To develop, test, and evaluate affordable haptic technology for robotic-assisted repetitive-motion fine-motor training. A haptic computer/user interface was modified by adding a pantograph to hold a pen and to increase the haptic workspace. Custom software moves a pen attached to the device through prescribed three-dimensional (3D) stroke sequences to create two-dimensional glyphs. The pen's position is recorded in 3D coordinates at 1 kHz. Twenty-one healthy child volunteers were taught a standard handwriting curriculum in a group setting, twice per week for 45-60 min per session over 8 weeks. The curriculum was supplemented by the device under the supervision of occupational therapy students. Outcomes were measured using the Evaluation Tool of Children's Handwriting (ETCH) and the Beery-Buktenica Developmental Test of Visual-Motor Integration. Word legibility improved significantly on the near-point copying task (p=0.04; effect size=0.95); letter legibility showed no significant improvement. One healthy volunteer with illegible handwriting improved significantly on 8 of 14 ETCH measures. The children found the device engaging, but made several recommendations to redesign the pantograph and scribing movements. A consumer haptic device can be modified for robotic-assisted repetitive-motion training for children. The device is affordable, portable, and engaging, and it is safe for healthy volunteers. Objective time-stamped data offer the potential for telerehabilitation between a remote therapist and the school or home. © 2011 Informa UK, Ltd.

  2. Development and evaluation of nursing user interface screens using multiple methods.

    Science.gov (United States)

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  3. Tactile Feedback for Above-Device Gesture Interfaces

    OpenAIRE

    Freeman, Euan; Brewster, Stephen; Lantz, Vuokko

    2014-01-01

    Above-device gesture interfaces let people interact in the space above mobile devices using hand and finger movements. For example, users could gesture over a mobile phone or wearable without having to use the touchscreen. We look at how above-device interfaces can also give feedback in the space over the device. Recent haptic and wearable technologies give new ways to provide tactile feedback while gesturing, letting touchless gesture interfaces give touch feedback. In this paper we take a f...

  4. Effect of EHR user interface changes on internal prescription discrepancies.

    Science.gov (United States)

    Turchin, A; Sawarkar, A; Dementieva, Y A; Breydo, E; Ramelson, H

    2014-01-01

    To determine whether specific design interventions (changes in the user interface (UI)) of an electronic health record (EHR) medication module are associated with an increase or decrease in the incidence of contradictions between the structured and narrative components of electronic prescriptions (internal prescription discrepancies). We performed a retrospective analysis of 960,000 randomly selected electronic prescriptions generated in a single EHR between 01/2004 and 12/2011. Internal prescription discrepancies were identified using a validated natural language processing tool with a recall of 76% and a precision of 84%. A multivariable autoregressive integrated moving average (ARIMA) model was used to evaluate the effect of five UI changes in the EHR medication module on the incidence of internal prescription discrepancies. Over the study period, 175,725 (18.4%) prescriptions were found to have internal discrepancies. The highest rate of prescription discrepancies was observed in March 2006 (22.5%) and the lowest in March 2009 (15.0%). Addition of an "as directed" option to the dropdown decreased prescription discrepancies by 195/month (p = 0.0004). A non-interruptive alert that reminded providers to ensure that structured and narrative components did not contradict each other decreased prescription discrepancies by 145/month (p = 0.03). Addition of a "Renew / Sign" button to the medication module (a negative control) had no effect on prescription discrepancies. Several UI changes in the electronic medication module were effective in reducing the incidence of internal prescription discrepancies. Further research is needed to identify interventions that can completely eliminate this type of prescription error, and to determine their effects on patient outcomes.
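    The intervention analysis described here can be illustrated with a deliberately simplified model. The sketch below estimates a level shift on synthetic monthly counts using a naive before/after mean comparison, not the multivariable ARIMA model the study actually fits; all numbers are invented for illustration.

```python
import random

random.seed(0)
# Synthetic monthly discrepancy counts: a baseline around 2000/month,
# dropping by ~200/month after a hypothetical UI change at month 48
# (both numbers are illustrative, not the paper's data).
months = 96
change_point = 48
series = [2000 + random.gauss(0, 40) - (200 if t >= change_point else 0)
          for t in range(months)]

def level_shift(series, change_point):
    """Naive interrupted time-series estimate: difference of mean
    levels before and after the intervention. The paper's ARIMA
    model additionally handles trend and autocorrelation."""
    before = series[:change_point]
    after = series[change_point:]
    return sum(after) / len(after) - sum(before) / len(before)

print(round(level_shift(series, change_point)))  # a value near -200
```

    A negative shift of roughly the injected magnitude is recovered; a real analysis would also report a p-value and control for seasonality.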

  5. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for its liquid hydrogen and liquid oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that, in all reasonable conditions, the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB and runs the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, from the output values of each test of the simulation so that they may be graphed and compared to other values.

  6. Development of a Mobile User Interface for Image-based Dietary Assessment.

    Science.gov (United States)

    Kim, Sungye; Schap, Tusarebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2010-12-31

    In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to a client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts, through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application to allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.

  7. Developing a User-process Model for Designing Menu-based Interfaces: An Exploratory Study.

    Science.gov (United States)

    Ju, Boryung; Gluck, Myke

    2003-01-01

    The purpose of this study was to organize menu items based on a user-process model and implement a new version of current software for enhancing usability of interfaces. A user-process model was developed, drawn from actual users' understanding of their goals and strategies to solve their information needs by using Dervin's Sense-Making Theory…

  8. Use of Design Patterns According to Hand Dominance in a Mobile User Interface

    Science.gov (United States)

    Al-Samarraie, Hosam; Ahmad, Yusof

    2016-01-01

    User interface (UI) design patterns for mobile applications provide a solution to design problems and can improve the usage experience for users. However, there is a lack of research categorizing the uses of design patterns according to users' hand dominance in a learning-based mobile UI. We classified the main design patterns for mobile…

  9. Comparing two anesthesia information management system user interfaces: a usability evaluation.

    Science.gov (United States)

    Wanderer, Jonathan P; Rao, Anoop V; Rothwell, Sarah H; Ehrenfeld, Jesse M

    2012-11-01

    Anesthesia information management systems (AIMS) have been developed by multiple vendors and are deployed in thousands of operating rooms around the world, yet not much is known about measuring and improving AIMS usability. We developed a methodology for evaluating AIMS usability in a low-fidelity simulated clinical environment and used it to compare an existing user interface with a revised version. We hypothesized that the revised user interface would be more usable. In a low-fidelity simulated clinical environment, twenty anesthesia providers documented essential anesthetic information for the start of the case using both an existing and a revised user interface. Participants had not used the revised user interface previously and completed a brief training exercise prior to the study task. All participants completed a workload assessment and a satisfaction survey, and all sessions were recorded. Multiple usability metrics were measured. The primary outcome was documentation accuracy; secondary outcomes were perceived workload, number of documentation steps, number of user interactions, and documentation time. The interfaces were compared and design problems were identified by analyzing recorded sessions and survey results. Use of the revised user interface was shown to improve documentation accuracy from 85.1% to 92.4%, a difference of 7.3% (95% confidence interval [CI] for the difference, 1.8 to 12.7). The revised user interface decreased the number of user interactions by 6.5 for intravenous documentation (95% CI 2.9 to 10.1) and by 16.1 for airway documentation (95% CI 11.1 to 21.1). The revised user interface required 3.8 fewer documentation steps (95% CI 2.3 to 5.4). Airway documentation time was reduced by 30.5 seconds with the revised workflow (95% CI 8.5 to 52.4). There were no significant time differences noted in intravenous documentation or in total task time, and no difference in perceived workload was found between the user interfaces. Two user interface…
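    The reported accuracy comparison follows the standard analysis of a difference between two proportions. A hedged sketch: the per-group counts `n1` and `n2` below are hypothetical stand-ins, so the interval will not reproduce the paper's 1.8-to-12.7 CI, which also reflects the paired study design.

```python
import math

def diff_proportion_ci(p1, n1, p2, n2, z=1.96):
    """Wald 95% confidence interval for the difference between two
    independent proportions (documentation accuracy rates here)."""
    diff = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Accuracy rates from the study; the item counts (400 per arm) are
# invented for illustration.
diff, (lo, hi) = diff_proportion_ci(0.851, 400, 0.924, 400)
print(f"{diff:.3f} ({lo:.3f} to {hi:.3f})")
```

    If the interval excludes zero, the accuracy difference is significant at the 5% level under this (unpaired) approximation.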

  10. The web-based user interface for EAST plasma control system

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, R.R., E-mail: rrzhang@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Xiao, B.J. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); School of Nuclear Science and Technology, University of Science and Technology of China, Anhui (China); Yuan, Q.P. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Yang, F. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Department of Computer Science, Anhui Medical University, Anhui (China); Zhang, Y. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Johnson, R.D.; Penaflor, B.G. [General Atomics, DIII-D National Fusion Facility, San Diego, CA (United States)

    2014-05-15

    The plasma control system (PCS) plays a vital role at EAST for fusion science experiments. Its software application consists of two main parts: an IDL graphical user interface for setting a large number of plasma parameters to specify each discharge, and several programs for performing the real-time feedback control and managing the whole control system. The PCS user interface can be used from any X11 Windows client with privileged access to the PCS computer system. However, remote access to the PCS system via the IDL user interface is extremely inconvenient because of the high network latency involved in drawing and operating the interfaces. In order to achieve lower latency for remote access to the PCS system, a web-based system has recently been developed for EAST. The setup data are retrieved from the PCS system, and client-side JavaScript draws the interfaces in the user's browser; the user settings are then sent back to the PCS system for controlling discharges. These technologies allow the web-based user interface to be viewed by authorized users with a web browser and to communicate with PCS server processes directly. It works together with the IDL interface and provides a new way to aid remote participation.

  11. VISTA (Vertical Integration of Science, Technology, and Applications) user interface software study

    Energy Technology Data Exchange (ETDEWEB)

    Chin, G.

    1990-04-01

    The Vertical Integration of Science, Technology, and Applications (VISTA) project is an initiative to employ modern information and communications technology for rapid and effective application of basic research results by end users. Developed by the Pacific Northwest Laboratory, VISTA's purpose is to develop and deploy information systems (software or software/hardware products) to broad segments of various markets. Inherent in these products would be mechanisms for accessing PNL-resident information about the problem. A goal of VISTA is to incorporate existing, commercially available user interface technology into the VISTA UIMS, since commercial systems are generally more complete, reliable, and cost-effective than software developed in-house. The objective of this report is to examine the current state of commercial user interface software and discuss the implications of selecting among these systems. This report begins by describing the functionality of the user interface as it applies to users and application developers. Next, a reference model is presented defining the various operational software layers of a graphical user interface. The main body then examines current user interface technology by sampling a number of commercial systems; both the window system and user interface toolkit markets are surveyed. A summary of the current technology concludes this report. 15 refs., 3 figs., 1 tab.

  12. Bilinear modeling of EMG signals to extract user-independent features for multiuser myoelectric interface.

    Science.gov (United States)

    Matsubara, Takamitsu; Morimoto, Jun

    2013-08-01

    In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured; when different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose a bilinear model for EMG signals that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features, and a motion classifier can be constructed on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can then extract the user-independent features from the novel user's data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subject forearms. Our method achieved 73% accuracy, which was statistically significantly different from the accuracy of standard non-multiuser interfaces (two-sample t-test at a significance level of 1%).
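    The bilinear decomposition described above can be sketched with NumPy on synthetic data: motion-dependent factors are shared across all users, and a novel user's user-dependent factor is estimated from a few labeled calibration trials. The dimensions, noise levels, and the least-squares calibration step are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
channels, k, motions = 4, 2, 3

# Motion-dependent factors shared by all users (k x motions).
M = rng.normal(size=(k, motions))

def observe(A, motion):
    """Bilinear model: EMG features = user factor @ motion factor + noise."""
    return A @ M[:, motion] + rng.normal(scale=0.05, size=channels)

# A novel user with an unknown mixing matrix (channels x k): estimate
# the user-dependent factor from one labeled calibration trial per
# motion by least squares (A_hat @ M ~= X_calib).
A_true = rng.normal(size=(channels, k))
X_calib = np.column_stack([observe(A_true, j) for j in range(motions)])
A_hat = X_calib @ np.linalg.pinv(M)

def classify(x):
    """Pick the motion whose predicted pattern is nearest to x."""
    preds = A_hat @ M                      # channels x motions
    return int(np.argmin(np.linalg.norm(preds - x[:, None], axis=0)))

hits = sum(classify(observe(A_true, j)) == j for j in range(motions))
print(hits, "/", motions)
```

    With low noise, a single calibration trial per motion is enough to recover a usable user-dependent factor; real EMG would need more trials and richer features.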

  13. User interface development and metadata considerations for the Atmospheric Radiation Measurement (ARM) archive

    Science.gov (United States)

    Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.

    1993-01-01

    This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.

  14. Cognitive Awareness Prototype Development on User Interface Design

    Science.gov (United States)

    Rosli, D'oria Islamiah

    2015-01-01

    Human error is a crucial problem in manufacturing industries. Due to the misinterpretation of information on interface system design, accidents or death may occur at workplace. Lack of human cognition criteria in interface system design is also one of the contributions to the failure in using the system effectively. Therefore, this paper describes…

  15. Age Based User Interface in Mobile Operating System

    OpenAIRE

    Sharma, Sumit; Sharma, Rohitt; Singh, Paramjit; Mahajan, Aditya

    2012-01-01

    This paper proposes the creation of different interfaces in the mobile operating system for different age groups. The different age groups identified are kids, elderly people and all others. The motive behind creating different interfaces is to make the smartphones of today's world usable to all age groups.

  16. Finding and Exploring Health Information with a Slider-Based User Interface.

    Science.gov (United States)

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon; Chang, Shanton

    2016-01-01

    Despite the fact that search engines are the primary channel to access online health information, there are better ways to find and explore health information on the web. Search engines are prone to problems when they are used to find health information. For instance, users have difficulties in expressing health scenarios with appropriate search keywords, search results are not optimised for medical queries, and the search process does not account for users' literacy levels and reading preferences. In this paper, we describe our approach to addressing these problems by introducing a novel design using a slider-based user interface for discovering health information without the need for precise search keywords. The user evaluation suggests that the interface is easy to use and able to assist users in the process of discovering new information. This study demonstrates the potential value of adopting slider controls in the user interface of health websites for navigation and information discovery.

  17. Natural user interface based on gestures recognition using Leap Motion sensor

    Directory of Open Access Journals (Sweden)

    L. Sousa

    2015-11-01

    Full Text Available Natural User Interface (NUI) is a term used for human-computer interfaces where the interface is invisible, or becomes invisible after successive user-immersion levels; it is typically based on human nature or natural human elements. Currently, several three-dimensional (3D) sensors and systems can be used to interpret specific human gestures, enabling completely hands-free control of electronic devices, manipulating objects in a virtual world, or interacting with augmented reality applications. This paper presents a set of methods to recognize 3D gestures, and some human-computer interface applications using a Leap Motion sensor.

  18. Continuum of haptic space

    NARCIS (Netherlands)

    Kappers, A.M.L.; Koenderink, J.J.

    2002-01-01

    The structure of haptic space first received serious attention in 1937 by Blumenfeld. Haptic space, as used in this chapter and indeed also by Blumenfeld, involves the space around us which we can reach by touch from a fixed position. How this space is related to the space through which we…

  19. Single-Switch User Interface for Robot Arm to Help Disabled People Using RT-Middleware

    Directory of Open Access Journals (Sweden)

    Yujin Wakita

    2011-01-01

    Full Text Available We are developing a manipulator system to support disabled people with reduced muscle strength, such as muscular dystrophy patients. Such a manipulator should have an easy user interface for users to control it. However, assistive manipulators for disabled people cannot sustain a large industry, so an inexpensive manufacturing approach is needed; products of this type are called "orphan products." We report on the construction of the user interface system using RT-Middleware, an open software platform for robot systems. Because of this, other user interface components, or robot components adapted to other symptoms, can replace parts of the system without any change to the rest. A single switch and a scanning menu panel are introduced as the input device for manual control of the robot arm. The scanning menu panel is designed to perform various actions of the robot arm with the single switch. A manipulator simulation system was constructed to evaluate input performance. Two muscular dystrophy patients tried our user interface to control the robot simulator and commented on it; based on their comments, we made several improvements to the user interface. This development approach illustrates an inexpensive way to produce orphan products.
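    The single-switch scanning interaction is easy to state precisely: a highlight advances through the menu items at a fixed dwell interval, and one switch press selects whichever item is highlighted at that moment. A minimal sketch follows; the action names and the 1-second dwell are invented for illustration.

```python
ACTIONS = ["move up", "move down", "move left", "move right", "grip", "release"]

def scan_menu(actions, pressed_at, dwell=1.0):
    """Simulate a scanning menu: the highlight advances one item per
    dwell interval, wrapping around, and a single switch press selects
    whichever item is highlighted at that moment."""
    index = int(pressed_at // dwell) % len(actions)
    return actions[index]

# The user presses the switch 4.3 s after scanning starts.
print(scan_menu(ACTIONS, pressed_at=4.3))  # "grip"
```

    The dwell time is the key accessibility parameter: a longer dwell lowers the required reaction speed at the cost of slower menu traversal.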

  20. Animations Effect on Reading Comprehension in Web-based User Interfaces

    OpenAIRE

    Nordahl, Sanna

    2016-01-01

    When it comes to web-based user interfaces and web design, one of today's trends is to use informative and storytelling animations. They can be used as tools for communication, for simplifying the interaction, or for guiding the user's attention. However, animations used in a web-based user interface can slow down the interaction and the user flow and become a distraction for the user. Three popular informative and storytelling animations that are used in web design are: background video, anim...

  1. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  2. Haptic Interface for UAV Teleoperation

    NARCIS (Netherlands)

    Lam, T.M.

    2009-01-01

    In the teleoperation of an uninhabited aerial vehicle (UAV), the human operator is physically separated from the vehicle and lacks various multiple-sensory information such as sound, motions, and vibrations of the airframe. The operator is usually only provided with visual information, e.g., from

  3. effects of user behaviour on gsm air interface performance

    African Journals Online (AJOL)

    User

    rated software objects, called agents (Wooldridge, 2002). The agent representing the 'individual' is … to using voice when the network is congested. Figure 5 compares the blocking probabilities from the basic model with those obtained from the situation where users use SMS during congestion. Modified User Behaviour.

  4. Interface Prostheses With Classifier-Feedback-Based User Training.

    Science.gov (United States)

    Fang, Yinfeng; Zhou, Dalin; Li, Kairu; Liu, Honghai

It is evident that user training significantly affects the performance of pattern-recognition-based myoelectric prosthetic device control. Despite plausible classification accuracy on offline datasets, online accuracy usually suffers from changes in physiological conditions and electrode displacement. The user's ability to generate consistent electromyographic (EMG) patterns can be enhanced via proper user training strategies in order to improve online performance. This study proposes a clustering-feedback strategy that provides real-time feedback to users by means of a visualized online EMG signal input as well as the centroids of the training samples, whose dimensionality is reduced to a minimal number by dimension reduction. Clustering feedback provides a criterion that guides users to intentionally adjust motion gestures and muscle contraction forces. The experimental results demonstrate that hand motion recognition accuracy increases steadily over the course of clustering-feedback-based user training, while conventional classifier-feedback methods, i.e., label feedback, hardly achieve any improvement. The results indicate that proper classifier feedback can accelerate the process of user training, and suggest a promising future for amputees with limited or no experience in pattern-recognition-based prosthetic device manipulation.
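The clustering-feedback idea above — showing users their live EMG sample next to per-class centroids in a low-dimensional space — can be sketched as follows. This is a minimal illustration under our own assumptions (generic feature vectors, a plain 2-D PCA projection, nearest-centroid feedback); it is not the authors' implementation.

```python
import numpy as np

def pca_2d(X):
    """Project rows of X onto their first two principal components."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data gives the principal axes directly.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T, mean, Vt[:2]

def clustering_feedback(train_X, train_y, live_sample):
    """Return the 2-D centroid of each gesture class, the projected live
    EMG sample, and the class whose centroid the live sample is nearest
    to, so a UI can draw all of them as on-screen feedback."""
    proj, mean, axes = pca_2d(np.asarray(train_X, dtype=float))
    labels = np.asarray(train_y)
    centroids = {c: proj[labels == c].mean(axis=0) for c in sorted(set(train_y))}
    live = (np.asarray(live_sample, dtype=float) - mean) @ axes.T
    nearest = min(centroids, key=lambda c: np.linalg.norm(live - centroids[c]))
    return centroids, live, nearest
```

A training UI would redraw `live` against the fixed `centroids` on every EMG window, letting the user steer their muscle contraction toward the intended class centroid.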

  5. Developing A Web-based User Interface for Semantic Information Retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Keller, Richard M.

    2003-01-01

While there are now a number of languages and frameworks that enable computer-based systems to search stored data semantically, the optimal design for effective user interfaces for such systems is still unclear. Such interfaces should mask unnecessary query detail from users, yet still allow them to build queries of arbitrary complexity without significant restrictions. We developed a user interface supporting semantic query generation for SemanticOrganizer, a tool used by scientists and engineers at NASA to construct networks of knowledge and data. Through this interface users can select node types, node attributes and node links to build ad-hoc semantic queries for searching the SemanticOrganizer network.
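The node-type/attribute/link style of query building described above can be sketched as a small builder object. The class, its method names, and the dict-based node representation are illustrative assumptions, not SemanticOrganizer's actual API or schema.

```python
from dataclasses import dataclass, field

@dataclass
class SemanticQuery:
    """An ad-hoc query assembled from UI selections: a node type,
    attribute constraints, and required links to other node types."""
    node_type: str
    attributes: dict = field(default_factory=dict)
    links: list = field(default_factory=list)

    def add_attribute(self, name, value):
        self.attributes[name] = value
        return self  # chaining mirrors incremental building in the UI

    def add_link(self, relation, target_type):
        self.links.append((relation, target_type))
        return self

    def matches(self, node):
        """Test a node (a plain dict here) against all constraints."""
        if node.get("type") != self.node_type:
            return False
        if any(node.get(k) != v for k, v in self.attributes.items()):
            return False
        return all(link in node.get("links", []) for link in self.links)
```

A query for completed experiments linked to an instrument would then be assembled step by step, as a user would in the interface: `SemanticQuery("Experiment").add_attribute("status", "done").add_link("producedBy", "Instrument")`.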

  6. A Formal Approach to User Interface Design using Hybrid System Theory Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Optimal Synthesis Inc.(OSI) proposes to develop an aiding tool for user interface design that is based on mathematical formalism of hybrid system theory. The...

  7. An Efficient User Interface Design for Nursing Information System Based on Integrated Patient Order Information.

    Science.gov (United States)

    Chu, Chia-Hui; Kuo, Ming-Chuan; Weng, Shu-Hui; Lee, Ting-Ting

    2016-01-01

A user-friendly interface can enhance the efficiency of data entry, which is crucial for building a complete database. In this study, two user interfaces (a traditional pull-down menu vs. check boxes) are proposed and evaluated on medical records with fever medication orders by measuring the time for data entry, the steps per data entry record, and the completeness rate of each medical record. The results revealed that data entry time was reduced from 22.8 sec/record to 3.2 sec/record, and the data entry procedure from 9 steps in the traditional interface to 3 steps in the new one. In addition, the completeness of medical records increased from 20.2% to 98%. All these results indicate that the new user interface provides a more user-friendly and efficient approach to data entry than the traditional interface.
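The reported gains reduce to simple ratios; a sketch of the arithmetic using the study's own figures:

```python
def improvement(old, new):
    """Factor by which the new interface improves on the old one."""
    return old / new

# Figures reported in the study above
time_factor = improvement(22.8, 3.2)   # seconds per record: ~7.1x faster
steps_factor = improvement(9, 3)       # entry steps per record: 3x fewer
```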

  8. Advanced User Interface Design and Advanced Internetting for Tactical Security Systems

    National Research Council Canada - National Science Library

    Murray, S

    1998-01-01

    ...), at the request of the U.S. Army Product Manager - Physical Security Equipment, initiated two exploratory development projects at SPAWAR Systems Center, San Diego, to develop an Advanced User Interface for Tactical Security (AITS...

  9. The design and evaluation of an activity monitoring user interface for people with stroke.

    Science.gov (United States)

    Hart, Phil; Bierwirth, Rebekah; Fulk, George; Sazonov, Edward

    2014-01-01

Usability is an important topic in the field of telerehabilitation research. Older users with disabilities, in particular, present age-related and disability-related challenges that should be accommodated in the design of a user interface for a telerehabilitation system. This paper describes the design, implementation, and assessment of a telerehabilitation system user interface that aims to maximize usability for an elderly user who has experienced a stroke. An Internet-connected Nintendo(®) Wii™ gaming system is selected as the hardware platform, and a server and website are implemented to process and display the feedback information. The usability of the interface is assessed in a trial of 18 subjects: 10 healthy Doctor of Physical Therapy students and 8 people with a stroke. Results show similar levels of usability and high satisfaction with the gaming system interface from both groups of subjects.

  10. Responsive Graphical User Interface (ReGUI) and its Implementation in MATLAB

    OpenAIRE

    Mikulszky, Matej; Pocsova, Jana; Mojzisova, Andrea; Podlubny, Igor

    2017-01-01

    In this paper we introduce the responsive graphical user interface (ReGUI) approach to creating applications, and demonstrate how this approach can be implemented in MATLAB. The same general technique can be used in other programming languages.

  11. Teleoperación de un vehículo remoto en un medio de acceso inalámbrico mediante el uso de una interfaz háptica Remote vehicle teleoperation through a haptic interface

    Directory of Open Access Journals (Sweden)

    Arys Carrasquilla Batista

    2012-11-01

Full Text Available Teleoperation allows human beings to carry out tasks in places that are far away, inaccessible, or hostile to the presence of an operator. In this project a remote vehicle was teleoperated over a Bluetooth wireless connection through a haptic interface (Novint Falcon). From the interface, the operator can send movement commands and also receive sensations according to the information from the sensors connected to the system. The remote vehicle was a Lego Mindstorms with Bluetooth communication capability, fitted with a touch sensor and an ultrasonic sensor to enable force feedback through the haptic interface. A standard computer was given Bluetooth connectivity by means of a USB adapter; a program written in C++ runs on this computer to control the actions of the haptic interface, send movement commands to the vehicle, and receive the sensor information, which is rendered to the operator as force sensations.
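Force reflection of the kind described — touch and ultrasonic readings on the vehicle rendered as forces on the Novint Falcon — can be sketched with a simple distance-to-force mapping. The linear profile and thresholds below are illustrative assumptions, not the project's actual control law.

```python
def reflected_force(distance_cm, contact, max_force=1.0, range_cm=50.0):
    """Force to render on the haptic master, normalised to [0, max_force].
    A touch-sensor hit saturates the force; otherwise the ultrasonic
    distance reading produces a force that grows as an obstacle nears."""
    if contact:
        return max_force
    if distance_cm >= range_cm:
        return 0.0
    return max_force * (1.0 - distance_cm / range_cm)
```

Each control cycle, the teleoperation loop would read the two sensors over Bluetooth and command this force on the haptic device, opposing the operator's forward motion as the vehicle approaches an obstacle.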

  12. A Framework for Effective User Interface Design for Web-Based Electronic Commerce Applications

    Directory of Open Access Journals (Sweden)

    Justyna Burns

    2001-01-01

    Full Text Available Efficient delivery of relevant product information is increasingly becoming the central basis of competition between firms. The interface design represents the central component for successful information delivery to consumers. However, interface design for web-based information systems is probably more an art than a science at this point in time. Much research is needed to understand properties of an effective interface for electronic commerce. This paper develops a framework identifying the relationship between user factors, the role of the user interface and overall system success for web-based electronic commerce. The paper argues that web-based systems for electronic commerce have some similar properties to decision support systems (DSS and adapts an established DSS framework to the electronic commerce domain. Based on a limited amount of research studying web browser interface design, the framework identifies areas of research needed and outlines possible relationships between consumer characteristics, interface design attributes and measures of overall system success.

  13. Haptics using a smart material for eyes-free interaction in personal devices

    Science.gov (United States)

    Wang, Huihui; Lane, William Brian; Pappas, Devin; Duque, Bryam; Leong, John

    2014-03-01

In this paper we present a prototype using a dry ionic polymer metal composite (IPMC) in interactive personal devices, such as bracelets, necklaces, pocket key chains or mobile devices, for haptic interaction when audio or visual feedback is not possible or practical. This prototype interface is an electro-mechanical system that realizes a shape-changing haptic display for information communication. A dry IPMC changes its dimensions due to the electrostatic effect when an electrical potential is applied to it. The IPMC can operate at a low voltage (less than 2.5 V), which is compatible with the requirements of personal electronic and mobile devices. The prototype consists of addressable arrays of IPMCs with different dimensions, which are deformable into different shapes with proper handling or customization. 3D printing technology is used to form the supporting parts. Microcontrollers (about 3 cm square) from DigiKey are embedded into the personal device. An Android-based mobile app communicates with the microcontrollers to control the IPMCs. When the personal device receives an information signal, the original shape of the prototype changes to another shape related to the specific sender or type of information source. This interactive prototype can simultaneously realize multiple methods of conveying haptic information, such as dimension, force, and texture, thanks to the flexible array design. We conducted several user experience studies to explore how users respond to shape-change information.
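The mapping from information source to shape, on an addressable IPMC array, can be sketched as a lookup table of drive patterns. The array size, source names, and patterns below are hypothetical, chosen only to illustrate the addressing scheme.

```python
# Hypothetical mapping from notification source to which elements of a
# 2x2 addressable IPMC array are driven (True = apply the <2.5 V drive
# signal so that element deforms). Sources and patterns are our own.
PATTERNS = {
    "message":  [[True, False], [False, True]],
    "call":     [[True, True],  [True, True]],
    "reminder": [[False, True], [True, False]],
}

REST = [[False, False], [False, False]]

def actuation_pattern(source):
    """Drive pattern for a notification source; unknown sources leave
    the array in its resting shape."""
    return PATTERNS.get(source, REST)
```

The microcontroller would scan this pattern and switch the drive voltage onto the corresponding array elements, so each sender produces a distinct shape under the user's fingers.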

  14. Introducing a new open source GIS user interface for the SWAT model

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  15. A Framework and Implementation of User Interface and Human-Computer Interaction Instruction

    Science.gov (United States)

    Peslak, Alan

    2005-01-01

    Researchers have suggested that up to 50 % of the effort in development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…

  16. A graphical user interface (gui) matlab program Synthetic_Ves For ...

    African Journals Online (AJOL)

    An interactive and robust computer program for 1D forward modeling of Schlumberger Vertical Electrical Sounding (VES) curves for multilayered earth models is presented. The Graphical User Interface (GUI) enabled software, written in MATLAB v.7.12.0.635 (R2011a), accepts user-defined geologic model parameters (i.e. ...

  17. Designing personal attentive user interfaces in the mobile public safety domain

    NARCIS (Netherlands)

    Streefkerk, J.W.; Esch van-Bussemakers, M.P.; Neerincx, M.A.

    2006-01-01

    In the mobile computing environment, there is a need to adapt the information and service provision to the momentary attentive state of the user, operational requirements and usage context. This paper proposes to design personal attentive user interfaces (PAUI) for which the content and style of

  18. Interfacing Media: User-Centered Design for Media-Rich Web Sites.

    Science.gov (United States)

    Horton, Sarah

    2000-01-01

    Discusses multimedia Web site design that may include images, animations, audio, and video. Highlights include interfaces that stress user-centered design; using only relevant media; placing high-demand content on secondary pages and keeping the home page simpler; providing information about the media; considering users with disabilities; and user…

  19. Exploratory Usability Testing of User Interface Options in LibGuides 2

    Science.gov (United States)

    Thorngate, Sarah; Hoden, Allison

    2017-01-01

    Online research guides offer librarians a way to provide digital researchers with point-of-need support. If these guides are to support student learning well, it is critical that they provide an effective user experience. This article details the results of an exploratory comparison study that tested three key user interface options in LibGuides…

  20. Preparing for Future Learning with a Tangible User Interface: The Case of Neuroscience

    Science.gov (United States)

    Schneider, B.; Wallace, J.; Blikstein, P.; Pea, R.

    2013-01-01

    In this paper, we describe the development and evaluation of a microworld-based learning environment for neuroscience. Our system, BrainExplorer, allows students to discover the way neural pathways work by interacting with a tangible user interface. By severing and reconfiguring connections, users can observe how the visual field is impaired and,…

  1. Continuous affect state annotation using a joystick-based user interface

    NARCIS (Netherlands)

    Antony, J.; Sharma, K.; van den Broek, Egon L.; Castellini, C.; Borst, C.

    2014-01-01

    Ongoing research at the DLR (German Aerospace Center) aims to employ affective computing techniques to ascertain the emotional states of users in motion simulators. In this work, a novel user feedback interface employing a joystick to acquire subjective evaluation of the affective experience is

  2. Research and Development for an Operational Information Ecology: The User-System Interface Agent Project

    Science.gov (United States)

    Srivastava, Sadanand; deLamadrid, James

    1998-01-01

The User-System Interface Agent (USIA) is a special type of software agent which acts as the "middle man" between a human user and an information processing environment. USIA consists of a group of cooperating agents which are responsible for assisting users in obtaining information processing services intuitively and efficiently. Some of the main features of USIA include: (1) multiple interaction modes and (2) user-specific and stereotype modeling and adaptation. This prototype system provides us with a development platform towards the realization of an operational information ecology. In the first phase of this project we focused on the design and implementation of a prototype of the User-System Interface Agent (USIA). The second phase of USIA allowed user interaction via a restricted query language as well as through a taxonomy of windows. In the third phase, the USIA system architecture was revised.

  3. Brain-Computer Interfaces and User Experience Evaluation

    NARCIS (Netherlands)

    van de Laar, B.L.A.; Gürkök, Hayrettin; Plass - Oude Bos, D.; Nijboer, Femke; Allison, Brendan Z.; Dunne, Stephen; Leeb, Robert; del R. Millán, José; Nijholt, Antinus

    2012-01-01

    The research on brain–computer interfaces (BCIs) is pushing hard to bring technologies out of the lab, into society and onto the market. The newly developing merge of the field of BCI with human–computer interaction (HCI) is paving the way for new applications such as BCI-controlled games. The

  4. Perspectives on User Experience Evaluation of Brain-Computer Interfaces

    NARCIS (Netherlands)

    van de Laar, B.L.A.; Gürkök, Hayrettin; Plass - Oude Bos, D.; Nijboer, Femke; Nijholt, Antinus; Stephanidis, Constantine

    2011-01-01

    The research on brain-computer interfaces (BCIs) is pushing hard to bring technologies out of the lab and into society and onto the market. The nascent merge between the field of BCI and human-computer interaction (HCI) is paving the way for new applications such as BCI-controlled gaming. The

  5. Image as Interface : Consequences for Users of Museum Knowledge

    NARCIS (Netherlands)

    de Rijcke, Sarah; Beaulieu, Anne

    2011-01-01

    Photographs of objects are ubiquitous in the work and presentation of museums, whether in collection-management infrastructure or in Web-based communication. This article examines the use of images in these settings and traces how they function as interfaces and tools in the production of museum

  6. A case study on better iconographic design in electronic medical records' user interface.

    Science.gov (United States)

    Tasa, Umut Burcu; Ozcan, Oguzhan; Yantac, Asim Evren; Unluer, Ayca

    2008-06-01

It is a known fact that there is a conflict between what users expect and what user interface designers create in the field of medical informatics, as in other fields of interface design. The objective of the study is to suggest, from the 'design art' perspective, a method for improving the usability of an electronic medical record (EMR) interface. The suggestion is based on the hypothesis that the user interface of an EMR should be iconographic. The proposed three-step method starts with a questionnaire survey on how hospital users perceive the concepts/terms that are to be used in the EMR user interface. Icons associated with the terms are then designed by a designer, following a guideline prepared from the results of the first questionnaire. Finally, the icons are presented back to the target group in a second questionnaire for validation. A case study was conducted with 64 medical staff and 30 professional designers for the first questionnaire, and with 30 medical staff for the second. In the second questionnaire, an average of 7.53 out of 10 icons were matched correctly, with a standard deviation of 0.98. Moreover, all icons except three were matched correctly in at least 83.3% of the forms. The proposed method differs from the majority of previous studies, which are based on user requirements, by leaning on user experiments instead. The study demonstrated that the user interface of EMRs should be designed according to a guideline that results from a survey of users' experiences of the metaphoric perception of the terms.

  7. User Interfaces and HCI for Ambient Intelligence and Smart Environments

    Science.gov (United States)

    Butz, Andreas

As this book clearly demonstrates, there are many ways to create smart environments and to realize the vision of ambient intelligence. But whatever constitutes this smartness or intelligence has to manifest itself to the human user through the human senses. Interaction with the environment can only take place through phenomena which can be perceived through these senses and through physical actions executed by the human. Therefore, the devices which create these phenomena (e.g., light, sound, force, …) or sense these actions are the user's contact point with the underlying smartness or intelligence.

  8. An Approach to User Interface Design with Two Indigenous Groups in Namibia

    DEFF Research Database (Denmark)

    Rodil, Kasper; Winschiers-Theophilus, Heike; Stanley, Colin

    2014-01-01

It has been widely reported that interactions with and expectations of technology differ across cultural contexts. Concepts such as ‘usability’ have been shown to be context-dependent; thus user interfaces intuitive to one group of users appear counter-intuitive to others. In an attempt to localise... a user interface of a tablet-based system aimed at preserving Indigenous Knowledge for rural Herero communities, we present findings from two sites in Namibia, complementing prior research. Participants who had little or no previous experience with technologies informed our endeavour of aligning local...

  9. Workshop AccessibleTV "Accessible User Interfaces for Future TV Applications"

    Science.gov (United States)

    Hahn, Volker; Hamisu, Pascal; Jung, Christopher; Heinrich, Gregor; Duarte, Carlos; Langdon, Pat

Approximately half of the people over 55 suffer from some type of typically mild visual, auditory, motor or cognitive impairment. For them, interaction, especially with PCs and other complex devices, is sometimes challenging, although accessible ICT applications could make a great difference to their quality of life. Basically, such applications have the potential to enable or simplify participation and inclusion in the surrounding private and professional communities. However, the availability of accessible user interfaces capable of adapting to the specific needs and requirements of users with individual impairments is very limited. Although there are a number of APIs [1, 2, 3, 4] available for various platforms that allow developers to provide accessibility features within their applications, today none of them provides features for the automatic adaptation of multimodal interfaces capable of automatically fitting the individual requirements of users with different kinds of impairments. Moreover, the provision of accessible user interfaces is still expensive and risky for application developers, as they need special experience and effort for user tests. Today many implementations simply neglect the needs of elderly people, thus locking out a large portion of their potential users. The workshop is organized as part of the dissemination activity for the European-funded project GUIDE "Gentle user interfaces for elderly people", which aims to address this situation with a comprehensive approach to the realization of multimodal user interfaces capable of adapting to the needs of users with different kinds of mild impairments. As an application platform, GUIDE will mainly target TVs and set-top boxes, such as the emerging Connected-TV or WebTV platforms, as they have the potential to address the needs of elderly users with applications such as home automation, communication or continuing education.

  10. User productivity as a function of AutoCAD interface design.

    Science.gov (United States)

    Mitta, D A; Flores, P L

    1995-12-01

    Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, where user productivity under display tasks was higher than productivity under the draw and dimension tasks. Implications of these results are presented.
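The productivity measure defined above reduces to a single ratio; a minimal sketch of it, where the guard clause and unit names are our additions:

```python
def productivity(percent_correctly_completed, session_minutes):
    """Operator productivity as defined in the study: the percentage of
    a drawing session correctly completed per unit time."""
    if session_minutes <= 0:
        raise ValueError("session time must be positive")
    return percent_correctly_completed / session_minutes
```

For example, an operator who correctly completes 80% of a drawing session in 40 minutes scores 2.0 percent per minute on this measure.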

  11. User interface considerations to prevent self-driving carsickness

    NARCIS (Netherlands)

    Diels, Cyriel; Bos, Jelte E.

    2015-01-01

    Self-driving cars have the potential to bring significant benefits to drivers and society at large. However, all envisaged scenarios are predicted to increase the risk of motion sickness. This will negatively affect user acceptance and uptake and hence negate the benefits of this technology. Here we

  12. NFC-Based User Interface for Smart Environments

    Directory of Open Access Journals (Sweden)

    Susanna Spinsante

    2015-01-01

Full Text Available The physical support of a home automation system, joined with a simplified user-system interaction modality, may allow people affected by motor impairments or limitations, such as elderly and disabled people, to live safely and comfortably at home, by improving their autonomy and facilitating the execution of daily life tasks. The proposed solution takes advantage of Near Field Communication technology, which is simple and intuitive to use, to enable advanced user interaction. The user can perform normal daily activities, such as lifting a gate or closing a window, through a device enabled to read NFC tags containing the commands for the home automation system. A passive Smart Panel is implemented, composed of multiple properly programmed Near Field Communication tags, to enable the execution of both individual commands and so-called scenarios. The work compares several versions of the proposed Smart Panel, differing in the interrogation and composition of single commands, the number of tags, and the dynamic user interaction model, for the same number of commands to issue. Main conclusions about the effective adoption of Near Field Communication in smart assistive environments are drawn from the experimental results.
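The distinction above between individual commands and scenarios can be sketched as a small tag-payload dispatcher. The payload format, command names, and scenario table are hypothetical, chosen only to illustrate the expansion step.

```python
# Hypothetical tag payloads: a tag stores either a single home-automation
# command or a named scenario that expands to several commands.
SCENARIOS = {
    "leaving_home": ["close_windows", "lower_gate", "lights_off"],
}

def commands_for_tag(payload):
    """Expand the payload read from an NFC tag into the command list
    the home automation system should execute."""
    if payload.startswith("scenario:"):
        return SCENARIOS.get(payload.split(":", 1)[1], [])
    return [payload]
```

Scanning a command tag issues just that command, while scanning a scenario tag issues the whole sequence, which is what lets one touch of the Smart Panel trigger a multi-step routine.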

  13. Design of electronic medical record user interfaces: a matrix-based method for improving usability.

    Science.gov (United States)

    Kuqi, Kushtrim; Eveleigh, Tim; Holzer, Thomas; Sarkani, Shahryar; Levin, James E; Crowley, Rebecca S

    2013-01-01

    This study examines a new approach of using the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians' time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
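The DSM-based grouping described above — clustering interface elements whose pairwise interactions are strong — can be sketched as a union-find pass over a symmetric interaction matrix. The single combined score per pair and the fixed threshold are simplifying assumptions; the study's actual DSM analysis is richer.

```python
def dsm_cluster(dsm, threshold=0.5):
    """Greedy grouping over a symmetric DSM: elements i and j land in
    the same group when their combined interaction score (information
    exchange, spatial adjacency, and similarity summed into dsm[i][j])
    exceeds the threshold. A toy stand-in for the study's clustering."""
    n = len(dsm)
    parent = list(range(n))  # union-find parent array

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if dsm[i][j] > threshold:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return sorted(clusters.values())
```

Elements grouped together would then be placed on the same screen region, which is how this kind of analysis improves screen density and time-on-task.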

  14. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM modeling technique to improve the design of Electronic Medical Record (EMR user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.

  15. WIFIP: a web-based user interface for automated synchrotron beamlines.

    Science.gov (United States)

    Sallaz-Damaz, Yoann; Ferrer, Jean Luc

    2017-09-01

The beamline control software, through the associated graphical user interface (GUI), is the user's access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with limited bandwidth and from a large variety of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, lightweight, platform-independent control software and an associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.

  16. Four Principles for User Interface Design of Computerised Clinical Decision Support Systems

    DEFF Research Database (Denmark)

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

The paper presents results from the design of a user interface for a Computerised Clinical Decision Support System (CDSS). The ambition has been to design human-computer interaction that can minimise medication errors. Through an iterative design process, a digital prototype for prescription... emphasises a focus on how users interact with the system, a focus on how information is provided by the system, and four principles of interaction. The four principles for the design of user interfaces for CDSS are summarised as four A's: All in one, At a glance, At hand and Attention. It is recommended that all... four interaction principles are integrated in the design of user interfaces for CDSS, i.e. the model is an integrated model which we suggest as a guide for interaction design when working to prevent medication errors....

  17. Observing cassette culture: user interface implications for digital music libraries

    OpenAIRE

    Toal, Jason

    2007-01-01

    Many people keep their collections of music on cassette tape even if they rarely listen to them. Images of these collections can be found online on photo sharing websites. What can we learn from such collections and what might they tell us about designing interfaces for new digital music libraries? The author conducts an online ethnographic study of over two hundred cassette tape collections, and over sixty participants with the aim of guiding future design of music collections. The author pr...

  18. Hypertext/Prolog user interface for a flexible inspection cell

    Science.gov (United States)

    Griffiths, Eric C.; Batchelor, Bruce G.; Daley, Michael W.; Jones, Andrew C.

    1995-10-01

    An inexpensive but versatile human-computer interface (HCI) for a machine vision system is described. Widely available hardware and computing components are controlled by software based on HyperCard and Prolog. While considerable benefit is obtained using just one of these programming tools, it has been found that the combination provides many advantages, including ease of use and great flexibility. Details of what is possible using HyperCard and Prolog individually and both working in harmony are discussed.

  19. Haptic communication for manipulator tooling operations in hazardous environments

    Science.gov (United States)

    Counsell, Michael S.; Barnes, David P.

    1999-11-01

    This paper presents a summary of the design and integration of a haptic interface with a nuclear industry accepted control system and manipulator. The control system is a UK Robotics Advanced Teleoperation Controller and the manipulator is a Schilling Titan II hydraulic arm. Operator performance has been studied for peg in the hole, grinding and drilling tasks, both with and without haptic communication. The results of these experiments are presented.

  20. Spatial issues in user interface design from a graphic design perspective

    Science.gov (United States)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  1. Towards a Teleoperated Needle Driver Robot with Haptic Feedback for RFA of Breast Tumors under Continuous MRI1

    Science.gov (United States)

    Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P.

    2009-01-01

    Objective The purpose of this paper is to explore the feasibility of developing an MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MRI imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested for both tumor targeting and RFA. Methods The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI compatibility and position-control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM's ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI scanner and receiving force feedback from the RFA tool. Results Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to distinguish healthy from cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. Conclusion The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing an MRI-compatible robot for RFA of breast tumors with haptic feedback capability. PMID:19303805

  2. Towards a teleoperated needle driver robot with haptic feedback for RFA of breast tumors under continuous MRI.

    Science.gov (United States)

    Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P

    2009-06-01

    The purpose of this paper is to explore the feasibility of developing an MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MRI imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested for both tumor targeting and RFA. The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI compatibility and position-control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM's ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI scanner and receiving force feedback from the RFA tool. Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to distinguish healthy from cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing an MRI-compatible robot for RFA of breast tumors with haptic feedback capability.
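
    The record above describes a force-reflecting teleoperation scheme: master stylus positions are mapped to slave needle commands while sensed tissue forces are reflected back to the haptic device. A minimal sketch of one such control step follows; the gains, limits, and names are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical position-forward / force-back teleoperation step for a
# 1-DOF needle driver. Gains, limits, and names are illustrative only.

def teleop_step(master_pos_mm, sensed_force_n,
                motion_scale=0.5, force_scale=1.0, force_limit_n=5.0):
    """Map a haptic-stylus position to a slave needle command and
    reflect the measured insertion force back to the operator."""
    # Scale master motion down for finer needle positioning.
    slave_cmd_mm = motion_scale * master_pos_mm
    # Reflect the measured tissue force, clamped to the device limit.
    feedback_n = max(-force_limit_n,
                     min(force_limit_n, force_scale * sensed_force_n))
    return slave_cmd_mm, feedback_n

cmd_mm, feedback_n = teleop_step(master_pos_mm=20.0, sensed_force_n=1.6)
```

    In a real system a step like this would run inside the haptic servo loop (commonly around 1 kHz), with the force coming from the sensor on the RFA tool.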

  3. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    Science.gov (United States)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Because of this limitation the CRS-Stack has remained an unpopular method, although applying it is a promising way to obtain better seismic sections with improved reflector continuity and S/N ratio. After obtaining successful results, tested on several seismic datasets belonging to oil companies in Indonesia, we decided to develop a user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users work with graphical icons and visual indicators, so the use of complicated seismic unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI for CRS-Stack processing, the CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system to guide the user inputting the

  4. Draft User Functionalities and Interfaces of PN Services (Low-Fi Prototyping)

    DEFF Research Database (Denmark)

    Karamolegkos, P.; Larsen, J. E.; Larsen, Lars Bo

    2006-01-01

    Internal report of WP1 Task 4 activities from January 2006 to August 2006. This report describes the draft user functionalities and coming user interfaces for PN services. It is a working document to be handed over to WP1 Task 1 and Task 3 for guidelines on specification. State-of-the-art usability and user experience, and conceptual design work on the two pilot services, MAGNET.CARE and Nomadic@Work, are described.

  5. Wearable Haptic Systems for the Fingertip and the Hand: Taxonomy, Review, and Perspectives

    OpenAIRE

    Pacchierotti, Claudio; Sinclair, Stephen; Solazzi, Massimiliano; Frisoli, Antonio; Hayward, Vincent; Prattichizzo, Domenico

    2017-01-01

    In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy and grounded machines to lightweight devices that naturally fit our bodies. However, only recently have haptic systems started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces are capable of communicating...

  6. An Exploration of User Interface Designs for Real-Time Panoramic

    Directory of Open Access Journals (Sweden)

    Patrick Baudisch

    2006-05-01

    Image stitching allows users to combine multiple regular-sized photographs into a single wide-angle picture, often referred to as a panoramic picture. To create such a panoramic picture, users traditionally first take all the photographs, then upload them to a PC and stitch them. During stitching, however, users often discover that the produced panorama contains artifacts or is incomplete. Fixing these flaws requires retaking individual images, which is often difficult by this time. In this paper, we present Panoramic Viewfinder, an interactive system for panorama construction that offers a real-time preview of the panorama while shooting. As the user swipes the camera across the scene, each photo is immediately added to the preview. By making ghosting and stitching failures apparent, the system allows users to immediately retake the necessary images. The system also provides a preview of the cropped panorama. When this preview includes all desired scene elements, users know that the panorama will be complete. Unlike earlier work in the field of real-time stitching, this paper focuses on the user interface aspects of real-time stitching. We describe our prototype and individual shooting modes, and provide an overview of our implementation. Building on our experiences with Panoramic Viewfinder, we discuss a separate design that relaxes the level of synchrony between user and camera required by the current system and provides usage flexibility that we believe might further improve the user experience. Keywords: Panorama, Panoramic Viewfinder, user interface, interactive, stitching, real-time, preview.

  7. MuSim, a Graphical User Interface for Multiple Simulation Programs

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Thomas [MUONS Inc., Batavia; Cummings, Mary Anne [MUONS Inc., Batavia; Johnson, Rolland [MUONS Inc., Batavia; Neuffer, David [Fermilab

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Supported simulation codes currently include G4beamline, MAD-X, and MCNP, with more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  8. A Hybrid 2D/3D User Interface for Radiological Diagnosis.

    Science.gov (United States)

    Mandalika, Veera Bhadra Harish; Chernoglazov, Alexander I; Billinghurst, Mark; Bartneck, Christoph; Hurrell, Michael A; Ruiter, Niels de; Butler, Anthony P H; Butler, Philip H

    2017-08-01

    This paper presents a novel 2D/3D desktop virtual reality hybrid user interface for radiology that focuses on improving the 3D manipulation required in some diagnostic tasks. An evaluation of our system revealed that the hybrid interface is more efficient for novice users and more accurate for both novice and experienced users when compared to traditional 2D-only interfaces. This is a significant finding because it indicates that, as the techniques mature, hybrid interfaces can provide significant benefits for image evaluation. Our hybrid system combines a zSpace stereoscopic display with 2D displays, and mouse and keyboard input. It allows the use of 2D and 3D components interchangeably, or simultaneously. The system was evaluated against a 2D-only interface in a user study that involved performing a scoliosis diagnosis task. There were two user groups: medical students and radiology residents. We found improvements in completion time for medical students, and in accuracy for both groups. In particular, the accuracy of medical students improved to match that of the residents.

  9. Integrating mid-air haptics into movie experiences

    OpenAIRE

    Ablart, Damien; Velasco, Carlos; Obrist, Marianna

    2017-01-01

    "Seeing is believing, but feeling is the truth." This idiom from the seventeenth-century English clergyman Thomas Fuller gains new momentum in light of an increased proliferation of haptic technologies that allow people to have various kinds of 'touch' and 'touchless' interactions. Here, we report on the process of creating and integrating touchless feedback (i.e. mid-air haptic stimuli) into short movie experiences (i.e. a one-minute movie format). Based on a systematic evaluation of user's e...

  10. Evaluating user experience with respect to user expectations in brain-computer interface games

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, G.; Poel, Mannes; Müller-Putz, G.R.; Scherer, R.; Billinger, M.; Kreilinger, A.; Kaiser, V.; Neuper, C.

    Evaluating user experience (UX) with respect to previous experiences can provide insight into whether a product can positively affect a user's opinion about a technology. If it can, then we can say that the product provides a positive UX. In this paper we propose a method to assess the UX in BCI

  11. Non-visual Interfaces and Network Games for Blind Users

    OpenAIRE

    Ina, Satoshi

    2002-01-01

    Visually impaired people have difficulty communicating graphical information, and it is even more difficult for them to work or play in cooperation with sighted people at a distance. We developed a non-visual access method to a graphical screen through the tactile and auditory senses, and applied it to network board and card games as a joint workspace for blind and sighted users via communication of image, sound, and voice. We took an "IGO"-type board game and the card game "SEVENS" as sample subject...

  12. When soft controls get slippery: User interfaces and human error

    Energy Technology Data Exchange (ETDEWEB)

    Stubler, W.F.; O'Hara, J.M.

    1998-12-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety.

  13. Java-based Graphical User Interface for MAVERIC-II

    Science.gov (United States)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control, 2) the environment models such as atmosphere and gravity, and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store in

  14. Study on user interface of pathology picture archiving and communication system.

    Science.gov (United States)

    Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom

    2014-01-01

    It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS considering user experience. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the obtained results, a user interface for the Pathology PACS was proposed. Hierarchical task analysis of the Pathology PACS was classified into 17 tasks: 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were also identified and schematized. A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.

  15. Study on User Interface of Pathology Picture Archiving and Communication System

    Science.gov (United States)

    Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook

    2014-01-01

    Objectives It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS considering user experience. Methods An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the obtained results, a user interface for the Pathology PACS was proposed. Results Hierarchical task analysis of the Pathology PACS was classified into 17 tasks: 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were also identified and schematized. Conclusions A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability. PMID:24627818

  16. Virtual haptic system for intuitive planning of bone fixation plate placement

    Directory of Open Access Journals (Sweden)

    Kup-Sze Choi

    2017-01-01

    Placement of a pre-contoured fixation plate is a common treatment for bone fracture. Fitting of fixation plates on fractured bone can be preoperatively planned and evaluated in a 3D virtual environment using virtual reality technology. However, conventional systems usually employ a 2D mouse and virtual trackball as the user interface, which makes the process inconvenient and inefficient. In this paper, a preoperative planning system equipped with a 3D haptic user interface is proposed to allow users to manipulate the virtual fixation plate intuitively and determine the optimal position for placement on the distal medial tibia. The system provides interactive feedback forces and visual guidance based on the geometric requirements. Creation of 3D models from medical imaging data, collision detection, dynamics simulation and haptic rendering are discussed. The system was evaluated by 22 subjects. Results show that the time to achieve optimal placement using the proposed system was shorter than with a 2D mouse and virtual trackball, and the satisfaction rating was also higher. The system shows potential to facilitate the process of fitting fixation plates on fractured bones as well as interactive fixation plate design.
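
    Interactive feedback forces of the kind described above are commonly rendered with a clamped spring-damper law that pulls the manipulated object toward its planned pose. The following one-axis sketch shows the idea; the gains, force limit, and names are invented for illustration and are not the paper's actual controller.

```python
# Spring-damper guidance force toward a target position (one axis).
# Gains k, b and the force limit f_max are assumed values.

def guidance_force(pos_m, target_m, vel_m_s, k=50.0, b=2.0, f_max=3.0):
    """Return a guidance force (N) pulling the plate toward its
    planned position, damped by velocity and clamped to f_max."""
    f = k * (target_m - pos_m) - b * vel_m_s  # spring toward target, damped
    return max(-f_max, min(f_max, f))         # respect the device force limit

f_n = guidance_force(pos_m=0.02, target_m=0.0, vel_m_s=0.0)  # -1.0 N
```

    The damping term suppresses oscillation as the plate approaches the planned pose, and the clamp keeps the command within what a desktop haptic device can safely exert.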

  17. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    Science.gov (United States)

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) seconds. Among usability survey respondents (24%, 12/50), our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during the creation of these tools.
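
    The rule-based mapping such a calculator automates can be illustrated with two of the six SOFA organ sub-scores. The thresholds below follow the standard SOFA definition, but this fragment is a simplified sketch for illustration, not the study's implementation (the full score sums six organ sub-scores).

```python
# Two SOFA sub-scores (coagulation and CNS) as simple threshold rules.

def sofa_coagulation(platelets_k_per_ul):
    """Platelet count in 10^3/uL mapped to a 0-4 sub-score."""
    for limit, score in ((20, 4), (50, 3), (100, 2), (150, 1)):
        if platelets_k_per_ul < limit:
            return score
    return 0  # platelets >= 150

def sofa_cns(gcs):
    """Glasgow Coma Scale (3-15) mapped to a 0-4 sub-score."""
    if gcs < 6:
        return 4
    if gcs <= 9:
        return 3
    if gcs <= 12:
        return 2
    if gcs <= 14:
        return 1
    return 0  # GCS 15

print(sofa_coagulation(85), sofa_cns(13))  # -> 2 1
```

    An EMR-integrated version would retrieve these inputs automatically, which is exactly the clerical retrieval burden the time-motion analysis quantified.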

  18. New and Old User Interface Metaphors in Music Production

    DEFF Research Database (Denmark)

    Walther-Hansen, Mads

    2017-01-01

    This paper outlines a theoretical framework for interaction with sound in music mixing. Using cognitive linguistic theory and studies exploring the spatiality of recorded music, it is argued that the logic of music mixing builds on three master metaphors: the signal flow metaphor, the sound stage metaphor and the container metaphor. I show how the metaphorical basis for interacting with sound in music mixing has changed with the development of recording technology, new aesthetic ideals and changing terminology. These changes are studied as expressions of underlying thought patterns that govern how music producers and engineers make sense of their actions. In conclusion, this leads to suggestions for a theoretical framework through which more intuitive music mixing interfaces may be developed in the future.

  19. Tangible User Interface and Mu Rhythm Suppression: The Effect of User Interface on the Brain Activity in Its Operator and Observer

    Directory of Open Access Journals (Sweden)

    Kazuo Isoda

    2017-03-01

    The intuitiveness of a tangible user interface (TUI) matters not only to its operator. It is quite possible that this type of user interface (UI) also affects the experience and learning of observers who are just watching the operator use it. To understand the possible effect of TUI, the present study focused on mu rhythm suppression in the sensorimotor area, which reflects execution and observation of action, and investigated brain activity in both the operator and the observer. In the observer experiment, the effect of TUI on observers was demonstrated through brain activity. Although the effect of the grasping action itself was uncertain, the unpredictability of the result of the action seemed to affect mirror neuron system (MNS)-related brain activity. In the operator experiment, despite the same grasping action, activity in the sensorimotor area increased when UI functions were included (TUI). Such activation was not found with a graphical user interface (GUI) that has UI functions without grasping action. These results suggest that MNS-related brain activity is involved in the effect of TUI, indicating the possibility of UI evaluation based on brain activity.

  20. User Interface Developed for Controls/CFD Interdisciplinary Research

    Science.gov (United States)

    1996-01-01

    The NASA Lewis Research Center, in conjunction with the University of Akron, is developing analytical methods and software tools to create a cross-discipline "bridge" between controls and computational fluid dynamics (CFD) technologies. Traditionally, the controls analyst has used simulations based on large lumping techniques to generate low-order linear models convenient for designing propulsion system controls. For complex, high-speed vehicles such as the High Speed Civil Transport (HSCT), simulations based on CFD methods are required to capture the relevant flow physics. The use of CFD should also help reduce the development time and costs associated with experimentally tuning the control system. The initial application for this research is the High Speed Civil Transport inlet control problem. A major aspect of this research is the development of a controls/CFD interface for non-CFD experts, to facilitate the interactive operation of CFD simulations and the extraction of reduced-order, time-accurate models from CFD results. A distributed computing approach for implementing the interface is being explored. Software being developed as part of the Integrated CFD and Experiments (ICE) project provides the basis for the operating environment, including run-time displays and information (data base) management. Message-passing software is used to communicate between the ICE system and the CFD simulation, which can reside on distributed, parallel computing systems. Initially, the one-dimensional Large-Perturbation Inlet (LAPIN) code is being used to simulate a High Speed Civil Transport type inlet. LAPIN can model real supersonic inlet features, including bleeds, bypasses, and variable geometry, such as translating or variable-ramp-angle centerbodies. Work is in progress to use parallel versions of the multidimensional NPARC code.

  1. A Mobile Phone User Interface for Image-Based Dietary Assessment.

    Science.gov (United States)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A; Boushey, Carol J; Delp, Edward J

    2014-02-02

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intake. Existing dietary assessment methods, such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  2. Extending a User Interface Prototyping Tool with Automatic MISRA C Code Generation

    Directory of Open Access Journals (Sweden)

    Gioacchino Mauro

    2017-01-01

    We are concerned with systems, particularly safety-critical systems, that involve interaction between users and devices, such as the user interface of medical devices. We therefore developed a MISRA C code generator for formal models expressed in the PVSio-web prototyping toolkit. PVSio-web allows developers to rapidly generate realistic interactive prototypes for verifying usability and safety requirements in human-machine interfaces. The visual appearance of the prototypes is based on a picture of a physical device, and the behaviour of the prototype is defined by an executable formal model. Our approach transforms the PVSio-web prototyping tool into a model-based engineering toolkit that, starting from a formally verified user interface design model, will produce MISRA C code that can be compiled and linked into a final product. An initial validation of our tool is presented for the data entry system of an actual medical device.

  3. Integrating Cadaver Needle Forces Into a Haptic Robotic Simulator.

    Science.gov (United States)

    Pepley, David F; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z

    2018-03-01

    Accurate force simulation is essential to haptic simulators for surgical training. Factors such as tissue inhomogeneity pose unique challenges for simulating needle forces. To aid in the development of haptic needle insertion simulators, a handheld force-sensing syringe was created to measure the motion and forces of needle insertions. Five needle insertions were performed into the neck of a cadaver using the force-sensing syringe. Based on these measurements, a piecewise exponential needle force characterization was implemented into a haptic central venous catheterization (CVC) simulator. The haptic simulator was evaluated through a survey of expert surgeons, fellows, and residents. The maximum needle insertion forces measured ranged from 1.20 N to 2.02 N. With this information, four characterizations were created representing average, muscular, obese, and thin patients. The median survey results showed that users statistically agreed that "the robotic system made me sensitive to how patient anatomy impacts the force required to advance needles in the human body." The force-sensing syringe captured force and position information, and this information could be implemented into a haptic simulator for CVC insertions, showing its utility. Survey results showed that experts, fellows, and residents had an overall positive outlook on the haptic simulator's ability to teach haptic skills.
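
    A piecewise exponential characterization like the one mentioned above can be sketched as a sum of per-layer terms, each growing exponentially with penetration beyond that layer's boundary. The layer depths and coefficients below are invented for illustration; the simulator's actual parameters came from the cadaver measurements.

```python
import math

# (layer start depth in mm, scale a in N, exponent b in 1/mm) - assumed values
LAYERS = [(0.0, 0.05, 0.20),   # skin
          (5.0, 0.02, 0.25),   # muscle
          (15.0, 0.01, 0.30)]  # deeper tissue

def needle_force(depth_mm):
    """Total resistive force (N) at a given insertion depth."""
    force_n = 0.0
    for start, a, b in LAYERS:
        if depth_mm > start:  # a layer contributes once the tip passes it
            force_n += a * (math.exp(b * (depth_mm - start)) - 1.0)
    return force_n
```

    Rendering the haptic feedback then reduces to evaluating this model at the simulated tip depth on each servo cycle, with a different parameter set per patient body type.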

  4. Designing Better Radiology Workstations: Impact of Two User Interfaces on Interpretation Errors and User Satisfaction

    National Research Council Canada - National Science Library

    Moise, Adrian; Atkins, M Stella

    2005-01-01

    We demonstrated the benefits of staging in a user experiment with 20 lay subjects involved in a comparative visual search for targets, similar to the radiology task of identifying anatomical abnormalities...

  5. US NDC Modernization Iteration E2 Prototyping Report: User Interface Framework

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Jennifer E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Palmer, Melanie A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vickers, James Wallace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Voegtli, Ellen M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    During the second iteration of the US NDC Modernization Elaboration phase (E2), the SNL US NDC Modernization project team completed follow-on Rich Client Platform (RCP) exploratory prototyping related to the User Interface Framework (UIF). The team also developed a survey of browser-based User Interface solutions and completed exploratory prototyping for selected solutions. This report presents the results of the browser-based UI survey, summarizes the E2 browser-based UI and RCP prototyping work, and outlines a path forward for the third iteration of the Elaboration phase (E3).

  6. Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device

    Science.gov (United States)

    Färber, Matthias; Heller, Julika; Handels, Heinz

    2007-03-01

    The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medication or to extract cerebrospinal fluid. Training for this procedure is usually carried out on patients under the guidance of experienced supervisors. A virtual reality lumbar puncture simulator has been developed in order to minimize training costs and patient risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures such as skin, bone, muscle, or fat, together with the original CT data, which contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices provide a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation, the Visible Human male dataset was used to generate a virtual training body. Several users with different levels of medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with the rotation constraints enabled by the 6DOF device facilitates a realistic puncture simulation.
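    The core idea of using label data to drive haptic feedback can be sketched as a lookup of per-tissue stiffness along the needle path. This is a deliberately simplified illustration, not the authors' haptic volume rendering algorithm: the label codes, stiffness values, and the simple summation rule are all assumptions for the example.

```python
# Hypothetical per-label stiffness contributions (N per voxel traversed):
# 0 = background, 1 = skin, 2 = fat, 3 = muscle, 4 = bone
STIFFNESS = {0: 0.0, 1: 0.08, 2: 0.02, 3: 0.12, 4: 1.5}

def resistance_force(label_volume, path_voxels):
    """Accumulate a resisting force along the needle path.

    label_volume: nested list indexed as [z][y][x] holding tissue labels.
    path_voxels:  (z, y, x) voxel indices traversed by the needle so far.
    Returns a scalar force magnitude opposing further insertion.
    """
    return sum(STIFFNESS[label_volume[z][y][x]] for z, y, x in path_voxels)
```

    A production simulator would additionally blend in the original CT intensities for unsegmented structures and resolve the force direction against the 6DOF device, as the abstract describes.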

  7. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    Science.gov (United States)

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than one monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection with the new user interface and with the user interface of a commercially available infusion pump. With the new user interface, the number of programming errors was reduced by 81%, the number of keystrokes per task fell from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task fell from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds, and perceived workload was significantly lower. Residents detected 38 of 70 events (54%) with the new user interface and 37 of 70 (53%) with the existing user interface, despite having no experience with the new user interface and extensive experience with the existing one. The number of programming errors and the workload were reduced partly because programming the pump took less time and fewer keystrokes with the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  8. OpenDolphin: presentation models for compelling user interfaces

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Shared applications run on the server. They still need a display, though, be it on the web or on the desktop. OpenDolphin introduces a shared presentation model to clearly differentiate between "what" to display and "how" to display. The "what" is managed on the server and is independent of the UI technology whereas the "how" can fully exploit the UI capabilities like the ubiquity of the web or the power of the desktop in terms of interactivity, animations, effects, 3D worlds, and local devices. If you run a server-centric architecture and still seek to provide the best possible user experience, then this talk is for you. About the speaker Dierk König (JavaOne Rock Star) works as a fellow for Canoo Engineering AG, Basel, Switzerland. He is a committer to many open-source projects including OpenDolphin, Groovy, Grails, GPars and GroovyFX. He is lead author of the "Groovy in Action" book, which is among ...

  9. pmx Webserver: A User Friendly Interface for Alchemistry.

    Science.gov (United States)

    Gapsys, Vytautas; de Groot, Bert L

    2017-02-27

    With the increase of available computational power and improvements in simulation algorithms, alchemical molecular dynamics-based free energy calculations have developed into routine usage. To further facilitate the usability of alchemical methods for amino acid mutations, we have developed a web-based infrastructure for obtaining hybrid protein structures and topologies. The presented webserver allows amino acid mutation selection in five contemporary molecular mechanics force fields. In addition, a complete mutation scan with a user-defined amino acid is supported. The output generated by the webserver is directly compatible with the Gromacs molecular dynamics engine and can be used with any alchemical free energy calculation setup. Furthermore, we present a database of input files and precalculated free energy differences for tripeptides approximating a disordered state of a protein, of particular use for protein stability studies. Finally, the usage of the webserver and its output is exemplified by performing an alanine scan and investigating the thermodynamic stability of the Trp cage mini protein. The webserver is accessible at http://pmx.mpibpc.mpg.de.

  10. Impact of Spatial Reference Frames on Human Performance in Virtual Reality User Interfaces

    OpenAIRE

    Marc Bernatchez; Jean-Marc Robert

    2008-01-01

    The design of virtual reality user interfaces (VRUI) is still an open field of research and development. One category of VRUI is the 3D floating menus that can be manipulated by users in free space. These menus can contain various controls such as buttons, sliders, and text. This article presents an experimental study that aims at testing the impact of five spatial reference frames on human performance with VRUI. Fifteen subjects participated in the study. Wearing a head-mounted display (HMD)...

  11. Saving and Restoring Mechanisms for Tangible User Interfaces through Tangible Active Objects

    OpenAIRE

    Riedenklau, Eckard; Hermann, Thomas; Ritter, Helge; Jacko, Julie A.

    2011-01-01

    In this paper we present a proof of concept for saving and restoring mechanisms for Tangible User Interfaces (TUIs). We describe our actuated Tangible Active Objects (TAOs) and explain the design which allows equal user access to a dial-based fully tangible actuated menu metaphor. We present a new application extending an existing TUI for interactive sonification of process data with saving and restoring mechanisms and we outline another application proposal for family therapists.

  12. Mapa - an object-oriented code with a graphical user interface for accelerator design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shasharina, S.G.; Cary, J.R. [Tech-X Corporation 4588 Pussy Willow Court, Boulder, Colorado 80301 (United States)

    1997-02-01

    We developed a code for accelerator modeling that allows users to create and analyze accelerators through a graphical user interface (GUI). The GUI can read an accelerator from files or create one by adding, removing, and changing elements. It also creates 4D orbit and lifetime plots. The code includes a set of accelerator element classes and C++ utility and GUI libraries. Thanks to the GUI, the code is easy to use and extend. © 1997 American Institute of Physics.

  13. How individual should digital AT user interfaces be for people with dementia

    OpenAIRE

    Cudd, P.; Greasley, P; Gallant, Z.; Bolton, E; Mountain, G.

    2013-01-01

    A literature review of papers exploring digital technology user interface design for people with dementia is reported. Only papers that drew on target user input, either directly or from other works, were included. Twenty-four were analysed. Improvements in the reporting of studies are recommended. A case is made that the population of people with dementia is so heterogeneous that one design does not suit all; this is illustrated through some case study reports from people w...

  14. User-customized brain computer interfaces using Bayesian optimization.

    Science.gov (United States)

    Bashashati, Hossein; Ward, Rabab K; Bashashati, Ali

    2016-04-01

    The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters), including the EEG frequency bands, the channels, and the time intervals from which the features are extracted, should be pre-determined based on each subject's brain characteristics. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
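    The outer loop of such a hyper-parameter search can be sketched briefly. This is not the authors' implementation: the search space below is hypothetical, and plain random sampling stands in for the Bayesian acquisition step, which would instead propose the next configuration from a surrogate model fitted to the evaluation history.

```python
import random

# Hypothetical hyper-parameter grid for a motor-imagery BCI
SPACE = {
    "band_low_hz":  [4, 6, 8, 10],
    "band_high_hz": [24, 28, 30, 32],
    "t_start_s":    [0.0, 0.5, 1.0],
    "channels":     [8, 16, 32],
}

def tune(objective, n_iter=60, seed=0):
    """Hyper-parameter search loop.

    `objective` scores one configuration, e.g. cross-validated BCI
    accuracy. Random sampling here replaces the Bayesian acquisition
    step that would model `history` with a surrogate.
    """
    rng = random.Random(seed)
    history = []  # (config, score) pairs a surrogate model would fit
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_iter):
        cfg = {k: rng.choice(v) for k, v in SPACE.items()}
        score = objective(cfg)
        history.append((cfg, score))
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

    The aggregation step the abstract describes would then combine classifiers trained on the several top-scoring configurations from `history`.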

  15. Multi-Touch Collaborative Gesture Recognition Based User Interfaces as Behavioral Interventions for Children with Autistic Spectrum Disorder: A Review

    Directory of Open Access Journals (Sweden)

    AHMED HASSAN

    2016-10-01

    This paper addresses the design of user interfaces (UIs) based on multi-touch collaborative gesture recognition for children affected by ASD (Autism Spectrum Disorder). Current user interfaces used in behavioral interventions for Autism Spectrum Disorder are investigated in detail. A thorough comparison is made among various groups of these UIs. The advantages and limitations of these interfaces are discussed, and future directions for the design of such interfaces are suggested.

  16. Usability of JACO Arm Interfaces Designed with a User-Centred Design Method.

    Science.gov (United States)

    Sauzin, Damien; Vigouroux, Nadine; Vella, Frédéric

    2017-01-01

    The utility, usability, and acceptability of robotic arms for helping people with motor impairments (quadriplegia, muscular dystrophy, amyotrophic lateral sclerosis) must be improved. The JACO robotic arm from Kinova is controlled with a joystick, which is sometimes unusable for patients. The IRIT laboratory has designed three types of virtual interfaces: one based on virtual keyboards and two others on pie menu concepts. These interfaces were designed by means of a user-centred design approach (UCDA) including brainstorming, focus groups, iterative prototyping, and trials. An experiment with two patients (spinal muscular atrophy and cerebral palsy) is then described. This experiment shows that the three interfaces designed with a UCDA are usable by them.

  17. Expanding the user base beyond HEP for the Ganga distributed analysis user interface

    Science.gov (United States)

    Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.

    2017-10-01

    This document presents the results of recent developments within the Ganga [1] project to support users from new communities outside of HEP. In particular, I will examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP [2][3] DIRAC [4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy shape identification analyses. This work highlighted some LSST-specific challenges that could be well solved by common tools within the HEP community. As a result of this work, the LSST community was able to take advantage of GridPP [2][3] resources to perform large computing tasks within the UK.

  18. Visual interfaces as an approach for providing mobile services and mobile content to low literate users in South Africa

    CSIR Research Space (South Africa)

    Matyila, M

    2014-05-01

    ...in the related mobile applications. Exploring typical challenges experienced by low literate users and adapting these mobile applications using visual interfaces can provide low literate users with usable access to mobile services and mobile content....

  19. Quantitative Analysis Of User Interfaces For Large Electronic Home Appliances And Mobile Devices Based On Lifestyle Categorization Of Older Users.

    Science.gov (United States)

    Shin, Wonkyoung; Park, Minyong

    2017-01-01

    Background/Study Context: The increasing longevity and health of older users, as well as aging populations, have created the need to develop senior-oriented product interfaces. This study aims to find user interface (UI) priorities according to older user groups based on their lifestyle and develop quality of UI (QUI) models for large electronic home appliances and mobile products. A segmentation table designed to show how older users can be categorized was created through a review of the literature to survey 252 subjects with a questionnaire. Factor analysis was performed to extract six preliminary lifestyle factors, which were then used for subsequent cluster analysis. The analysis resulted in four groups. Cross-analysis was carried out to investigate which characteristics were included in the groups. Analysis of variance was then applied to investigate the differences in the UI priorities among the user groups for various electronic devices. Finally, QUI models were developed and applied to those electronic devices. Differences in UI priorities were found according to the four lifestyles ("money-oriented," "innovation-oriented," "stability- and simplicity-oriented," and "innovation- and intellectual-oriented"). Twelve QUI models were developed for four different lifestyle groups associated with different products. Three washers and three smartphones were used as an example for testing the QUI models. The segmentation of older user groups in this study, based on several key (i.e., demographic, socioeconomic, and physical-cognitive) variables, distinguishes it from earlier studies based on a single variable. The differences in responses clearly indicate the benefits of integrating various factors of older users, rather than a single variable, in order to design and develop more innovative and better consumer products in the future. The results of this study showed that older users with a potentially high buying power in the future are likely to have

  20. Enhancing the Gaming Experience Using 3D Spatial User Interface Technologies.

    Science.gov (United States)

    Kulshreshth, Arun; Pfeil, Kevin; LaViola, Joseph J

    2017-01-01

    Three-dimensional (3D) spatial user interface technologies have the potential to make games more immersive and engaging and thus provide a better user experience. Although technologies such as stereoscopic 3D display, head tracking, and gesture-based control are available for games, it is still unclear how their use affects gameplay and if there are any user performance benefits. The authors have conducted several experiments on these technologies in game environments to understand how they affect gameplay and how we can use them to optimize the gameplay experience.

  1. AdaM: Adapting Multi-User Interfaces for Collaborative Environments in Real-Time

    DEFF Research Database (Denmark)

    Park, Seonwook; Gebhardt, Christoph; Rädle, Roman

    2018-01-01

    Developing cross-device multi-user interfaces (UIs) is a challenging problem. There are numerous ways in which content and interactivity can be distributed. However, good solutions must consider multiple users, their roles, their preferences and access rights, as well as device capabilities. Manual... and rule-based solutions are tedious to create and do not scale to larger problems nor do they adapt to dynamic changes, such as users leaving or joining an activity. In this paper, we cast the problem of UI distribution as an assignment problem and propose to solve it using combinatorial optimization. We...
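    The idea of casting UI distribution as an assignment problem can be illustrated compactly. This is a minimal brute-force sketch, not the AdaM system's optimizer: the cost values are hypothetical stand-ins for the combined user-role, preference, and device-capability penalties, and a real solver would use an efficient method (e.g. the Hungarian algorithm) rather than enumerating permutations.

```python
from itertools import permutations

def assign_ui_elements(cost):
    """Brute-force optimal assignment of UI elements to devices.

    cost[i][j] is the (hypothetical) penalty of placing UI element i on
    device j. Returns (assignment, total_cost), where assignment[i] is
    the device index chosen for element i. O(n!) - only for tiny n.
    """
    n = len(cost)
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if c < best_cost:
            best, best_cost = perm, c
    return best, best_cost
```

    Re-running the optimization whenever a user joins or leaves gives the dynamic adaptation the abstract describes.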

  2. Information Practices and User Interfaces: Student Use of an iOS Application in Special Education

    Science.gov (United States)

    Demmans Epp, Carrie; McEwen, Rhonda; Campigotto, Rachelle; Moffatt, Karyn

    2016-01-01

    A framework connecting concepts from user interface design with those from information studies is applied in a study that integrated a location-aware mobile application into two special education classes at different schools; this application had two support modes (one general and one location specific). The five-month study revealed several…

  3. TESTAR : Tool Support for Test Automation at the User Interface Level

    NARCIS (Netherlands)

    Vos, Tanja E.J.; Kruse, Peter M.; Condori Fernandez, Nelly; Bauersfeld, Sebastian; Wegener, Joachim

    2015-01-01

    Testing applications with a graphical user interface (GUI) is an important, though challenging and time consuming task. The state of the art in the industry are still capture and replay tools, which may simplify the recording and execution of input sequences, but do not support the tester in finding

  4. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  5. 78 FR 36478 - Accessibility of User Interfaces, and Video Programming Guides and Menus

    Science.gov (United States)

    2013-06-18

    ... MVPD's IP network or through a different Internet Service Provider? If we interpret the term... V-Chip and other parental controls, that may provide additional guidance to manufacturers. If any... included in the 11 listed in the VPAAC Second Report: User Interfaces, such as V-Chip and other parental...

  6. A Tabletop Board Game Interface for Multi-User Interaction with a Storytelling System

    NARCIS (Netherlands)

    Alofs, T.; Theune, Mariet; Swartjes, I.M.T.; Camurri, A.; Costa, C.

    2011-01-01

    The Interactive Storyteller is an interactive storytelling system with a multi-user tabletop interface. Our goal was to design a generic framework combining emergent narrative, where stories emerge from the actions of autonomous intelligent agents, with the social aspects of traditional board games.

  7. Experimental setup for evaluating an adaptive user interface for teleoperation control

    Science.gov (United States)

    Wijayasinghe, Indika B.; Peetha, Srikanth; Abubakar, Shamsudeen; Saadatzi, Mohammad Nasser; Cremer, Sven; Popa, Dan O.

    2017-05-01

    A vital part of human interaction with a machine is the control interface, which single-handedly can define user satisfaction and the efficiency of performing a task. This paper elaborates on the implementation of an experimental setup to study an adaptive algorithm that can help the user better tele-operate the robot. The formulation of the adaptive interface and the associated learning algorithms is general enough to apply when the mapping between the user controls and the robot actuators is complex and/or ambiguous. The method uses a genetic algorithm to find the optimal parameters that produce the input-output mapping for teleoperation control. In this paper, we describe the experimental setup and the associated results that were used to validate the adaptive interface on a differential drive robot with two different input devices: a joystick and a Myo gesture control armband. Results show that after the learning phase, the interface converges to an intuitive mapping that can help even inexperienced users drive the system to a goal location.
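    A genetic algorithm over input-output mapping parameters, as described above, can be sketched in miniature. This is an illustrative toy, not the authors' algorithm: the gene encoding (a flat vector of gains from input axes to wheel commands), population size, and operators are all assumptions.

```python
import random

def evolve_mapping(fitness, n_genes=4, pop=30, gens=40, seed=1):
    """Tiny genetic algorithm over real-valued gain vectors.

    Each individual is a list of gains mapping input axes to robot
    actuators (e.g. joystick x/y -> left/right wheel speed). `fitness`
    scores an individual; higher is better. Uses truncation selection,
    one-point crossover, and Gaussian point mutation.
    """
    rng = random.Random(seed)
    popn = [[rng.uniform(-1, 1) for _ in range(n_genes)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        elite = popn[: pop // 2]          # keep the better half unchanged
        children = []
        for _ in range(pop - len(elite)):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_genes)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_genes)           # Gaussian point mutation
            child[i] += rng.gauss(0, 0.1)
            children.append(child)
        popn = elite + children
    return max(popn, key=fitness)
```

    In the paper's setting, the fitness would come from how well the evolved mapping lets a user drive the robot; here any scoring function can be plugged in.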

  8. Talk and Tools : The best of both worlds in mobile user interfaces for E-coaching

    NARCIS (Netherlands)

    Beun, RJ; Fitrianie, S.; Griffioen-Both, Fiemke; Spruit, Sandor; Horsch, C.H.G.; Lancee, J; Brinkman, W.P.

    2017-01-01

    In this paper, a user interface paradigm, called Talk-and-Tools, is presented for automated e-coaching. The paradigm is based on the idea that people interact in two ways with their environment: symbolically and physically. The main goal is to show how the paradigm can be applied in the design of

  9. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  10. Novice Use of a Dimensional Scale for the Evaluation of the Hypermedia User Interface: Caveat Emptor.

    Science.gov (United States)

    Harmon, Stephen W.

    1995-01-01

    Discusses a dimensional scale for the evaluation of the multimedia user interface. Reports on a study of the use of the scale by novice graduate students at the University of Houston Clear Lake. Discusses hypermedia as a subset of multimedia, and investigates dependent measures including navigation. (LRW)

  11. Toward User Interfaces and Data Visualization Criteria for Learning Design of Digital Textbooks

    Science.gov (United States)

    Railean, Elena

    2014-01-01

    User interface and data visualisation criteria are central issues in digital textbook design. However, when applying mathematical modelling of the learning process to the analysis of the possible solutions, it can be observed that results differ. Mathematical learning theory views cognition on the basis of statistics and probability theory, graph…

  12. Utilising cognitive work analysis for the design and evaluation of command and control user interfaces

    CSIR Research Space (South Africa)

    Gous, E

    2013-11-01

    This paper reports on the design and evaluation of distributed net-centric command and control user interfaces for future air defence operations. The design was based on the Cognitive Work Analysis framework to identify the required capabilities...

  13. 78 FR 77074 - Accessibility of User Interfaces, and Video Programming Guides and Menus; Accessible Emergency...

    Science.gov (United States)

    2013-12-20

    ... COMMISSION 47 CFR Part 79 Accessibility of User Interfaces, and Video Programming Guides and Menus... authority for requiring MVPDs to ensure that video programming guides and menus that provide channel and... the instructions for submitting comments. Federal Communications Commission's Web site: http...

  14. User Interface Preferences in the Design of a Camera-Based Navigation and Wayfinding Aid

    Science.gov (United States)

    Arditi, Aries; Tian, YingLi

    2013-01-01

    Introduction: Development of a sensing device that can provide a sufficient perceptual substrate for persons with visual impairments to orient themselves and travel confidently has been a persistent rehabilitation technology goal, with the user interface posing a significant challenge. In the study presented here, we enlist the advice and ideas of…

  15. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    Science.gov (United States)

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  16. User Interfaces for Patient-Centered Communication of Health Status and Care Progress

    Science.gov (United States)

    Wilcox-Patterson, Lauren

    2013-01-01

    The recent trend toward patients participating in their own healthcare has opened up numerous opportunities for computing research. This dissertation focuses on how technology can foster this participation, through user interfaces to effectively communicate personal health status and care progress to hospital patients. I first characterize the…

  17. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    Science.gov (United States)

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  18. Social Benefits of a Tangible User Interface for Children with Autistic Spectrum Conditions

    Science.gov (United States)

    Farr, William; Yuill, Nicola; Raffle, Hayes

    2010-01-01

    Tangible user interfaces (TUIs) embed computer technology in graspable objects. This study assessed the potential of Topobo, a construction toy with programmable movement, to support social interaction in children with Autistic Spectrum Conditions (ASC). Groups of either typically developing (TD) children or those with ASC had group play sessions…

  19. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    Science.gov (United States)

    Jagodzinski, Piotr; Wolski, Robert

    2015-01-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology in the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar…

  20. AOP-DB Frontend: A user interface for the Adverse Outcome Pathways Database

    Science.gov (United States)

    The EPA Adverse Outcome Pathway Database (AOP-DB) is a database resource that aggregates association relationships between AOPs, genes, chemicals, diseases, pathways, species orthology information, and ontologies. The AOP-DB frontend is a simple yet powerful user interface in the for...

  1. Graphical User Interface Development and Design to Support Airport Runway Configuration Management

    Science.gov (United States)

    Jones, Debra G.; Lenox, Michelle; Onal, Emrah; Latorella, Kara A.; Lohr, Gary W.; Le Vie, Lisa

    2015-01-01

    The objective of this effort was to develop a graphical user interface (GUI) for the National Aeronautics and Space Administration's (NASA) System Oriented Runway Management (SORM) decision support tool to support runway management. This tool is expected to be used by traffic flow managers and supervisors in the Airport Traffic Control Tower (ATCT) and Terminal Radar Approach Control (TRACON) facilities.

  2. Integration of data validation and user interface concerns in a DSL for web applications

    NARCIS (Netherlands)

    Groenewegen, D.M.; Visser, E.

    2009-01-01

    This paper is a pre-print of: Danny M. Groenewegen, Eelco Visser. Integration of Data Validation and User Interface Concerns in a DSL for Web Applications. In Mark G. J. van den Brand, Jeff Gray, editors, Software Language Engineering, Second International Conference, SLE 2009, Denver, USA, October,

  3. US NDC Modernization Iteration E1 Prototyping Report: User Interface Framework

    Energy Technology Data Exchange (ETDEWEB)

    Lober, Randall R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    During the first iteration of the US NDC Modernization Elaboration phase (E1), the SNL US NDC modernization project team completed an initial survey of applicable COTS solutions, and established exploratory prototyping related to the User Interface Framework (UIF) in support of system architecture definition. This report summarizes these activities and discusses planned follow-on work.

  4. Moving towards the Assessment of Collaborative Problem Solving Skills with a Tangible User Interface

    Science.gov (United States)

    Ras, Eric; Krkovic, Katarina; Greiff, Samuel; Tobias, Eric; Maquil, Valérie

    2014-01-01

    The research on the assessment of collaborative problem solving (ColPS), as one crucial 21st Century Skill, is still in its beginnings. Using Tangible User Interfaces (TUI) for this purpose has only been marginally investigated in technology-based assessment. Our first empirical studies focused on light-weight performance measurements, usability,…

  5. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  6. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  7. Topological Galleries: A High Level User Interface for Topology Controlled Volume Rendering

    Energy Technology Data Exchange (ETDEWEB)

    MacCarthy, Brian; Carr, Hamish; Weber, Gunther H.

    2011-06-30

    Existing topological interfaces to volume rendering are limited by their reliance on sophisticated knowledge of topology by the user. We extend previous work by describing topological galleries, an interface for novice users that is based on the design galleries approach. We report three contributions: an interface based on hierarchical thumbnail galleries to display the containment relationships between topologically identifiable features, the use of the pruning hierarchy instead of branch decomposition for contour tree simplification, and drag-and-drop transfer function assignment for individual components. Initial results suggest that this approach suffers from limitations due to rapid drop-off of feature size in the pruning hierarchy. We explore these limitations by providing statistics of feature size as a function of depth in the pruning hierarchy of the contour tree.

  8. Eye-gaze determination of user intent at the computer interface

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, J.H. [Pennsylvania State Univ., University Park, PA (United States). Dept. of Industrial Engineering; Schryver, J.C. [Oak Ridge National Lab., TN (United States)

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming-in or out. This methodology first collects samples of eye-gaze location, at 30 Hz, while looking at controlled stimuli, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics, both within and between data frames, to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
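
    The clustering step described in this abstract (a minimum spanning tree over gaze samples, cut by a user-defined distance parameter) can be sketched as follows; the threshold and point data here are illustrative, not the study's actual parameters:

```python
import math

def mst_edges(points):
    """Prim's algorithm: edges of a minimum spanning tree over 2-D
    gaze samples, using Euclidean distance."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree:
                    d = math.dist(points[i], points[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

def cluster_by_threshold(points, max_edge):
    """Cut MST edges longer than a user-defined threshold; the
    surviving connected components are the gaze clusters."""
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for d, i, j in mst_edges(points):
        if d <= max_edge:
            parent[find(i)] = find(j)
    groups = {}
    for idx in range(len(points)):
        groups.setdefault(find(idx), []).append(idx)
    return list(groups.values())

# Two fixation-like groups of gaze samples (arbitrary screen units)
samples = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)]
clusters = cluster_by_threshold(samples, max_edge=3.0)
```

    Per-cluster statistics (size, centroid position, mean pupil size) would then be computed from each component and fed to the discriminant analysis.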

  9. Renewable Electric Plant Information System user interface manual: Paradox 7 Runtime for Windows

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    The Renewable Electric Plant Information System (REPiS) is a comprehensive database with detailed information on grid-connected renewable electric plants in the US. The current version, REPiS3 beta, was developed in Paradox for Windows. The user interface (UI) was developed to facilitate easy access to information in the database, without the need to have, or know how to use, Paradox for Windows. The UI is designed to provide quick responses to commonly requested sorts of the database. A quick perusal of this manual will familiarize one with the functions of the UI and will make using the system easier. There are six parts to this manual: (1) Quick Start: Instructions for Users Familiar with Database Applications; (2) Getting Started: The Installation Process; (3) Choosing the Appropriate Report; (4) Using the User Interface; (5) Troubleshooting; (6) Appendices A and B.

  10. Photo-Based User Interfaces: Picture It, Tag It, Use It

    Science.gov (United States)

    Vanderhulst, Geert; Luyten, Kris; Coninx, Karin

    Pervasive environments can be hard to configure and interact with using handheld computing devices, due to the mismatch between physical and digital worlds. Usually, smart resources in the user's vicinity are discovered and presented in a menu on the user's device from where they can be accessed. However, in environments with many embedded resources it becomes hard to identify resources by means of a textual description and to become aware of the tasks they support. As an alternative to menu-driven interfaces, we demonstrate annotated photos as a means for controlling a pervasive environment. We present as part of our approach a tool that enables people to picture their own environment and use photos as building blocks to create an interactive digital view on their surroundings. To demonstrate and evaluate our approach, we engineered a pervasive prototype application that is operated through a photo-based user interface and assembled using ontologies.

  11. Comparative performance analysis of M-IMU/EMG and voice user interfaces for assistive robots.

    Science.gov (United States)

    Laureiti, Clemente; Cordella, Francesca; di Luzio, Francesco Scotto; Saccucci, Stefano; Davalli, Angelo; Sacchetti, Rinaldo; Zollo, Loredana

    2017-07-01

    People with a high level of disability experience great difficulties in performing activities of daily living and resort to their residual motor functions to operate assistive devices. The commercially available interfaces used to control assistive manipulators are typically based on joysticks and can be used only by subjects with upper-limb residual mobility. Many other solutions can be found in the literature, based on the use of multiple sensory systems for detecting human motion intention and state. Some of them require a high cognitive workload for the user. Others are more intuitive and easy to use, but have not been widely investigated in terms of usability and user acceptance. The objective of this work is to propose an intuitive and robust user interface for assistive robots that is not obtrusive for the user and is easily adaptable to subjects with different levels of disability. The proposed user interface is based on the combination of M-IMU and EMG signals for the continuous control of an arm-hand robotic system. The system has been experimentally validated and compared to a standard voice interface. Sixteen healthy subjects volunteered to participate in the study: 8 subjects used the combined M-IMU/EMG robot control, and 8 subjects used the voice control. The arm-hand robotic system, made of the KUKA LWR 4+ and the IH2 Azzurra hand, was controlled to accomplish the daily living task of drinking. Performance indices and evaluation scales were adopted to assess the performance of the two interfaces.

  12. Human-system interface design review guideline -- Review software and user's guide: Final report. Revision 1, Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 3 contains an interactive software application of the NUREG-0700, Revision 1 guidance and a user's guide for this software. The software supports reviewers during review preparation, evaluation design using the human factors engineering guidelines, and report preparation. The user's guide provides system requirements and installation instructions, detailed explanations of the software's functions and features, and a tutorial on using the software.

  13. What Do IT-People Know About the (Nordic) History of Computers and User Interfaces?

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2009-01-01

    This paper reports a preliminary, empirical exploration of what IT-people know about the history of computers and user interfaces. The principal motivation for the study is that the younger generations, such as students in IT, seem to know very little about these topics. The study employed… and researchers seems heavily based on personal experience, so that the researchers know much more about the earlier days of computing and interfaces. Thirdly, there is a tendency amongst the students to conceptualize the history of computers in interface features and concepts. Hence, the interface seems… to become the designation or even the icon for the computer. In other words, one of the key focal points in the area of human-computer interaction, to make the computer as such invisible, seems to have been successful…

  14. Monitoring and controlling ATLAS data management: The Rucio web user interface

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vigne, Ralph; Barisits, Martin-Stefan; Garonne, Vincent; Serfon, Cedric

    2015-01-01

    The monitoring and controlling interfaces of the previous data management system DQ2 followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2, and the increased volume of managed information. This interface encompasses both a monitoring and controlling component, and allows easy integration for user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency. This includes the use of technologies like ActiveMQ or Nagios. Second, analysis of the data into information is done massively parallel due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human or programmatic access, making it easy to access selective parts of the information both in constrained...

  15. Monitoring and controlling ATLAS data management: The Rucio web user interface

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Barisits, Martin-Stefan; Serfon, Cedric; Vigne, Ralph; Garonne, Vincent

    2015-01-01

    The monitoring and controlling interfaces of the previous data management system DQ2 followed the evolutionary requirements and needs of the ATLAS collaboration. The new system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2, and the increased volume of managed information. This interface encompasses both a monitoring and controlling component, and allows easy integration for user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency. This includes the use of technologies like ActiveMQ or Nagios. Second, analysis of the data into information is done massively parallel due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human or programmatic access, making it easy to access selective parts of the information both in constrained frontends like ...

  16. UIMX: A User Interface Management System For Scientific Computing With X Windows

    Science.gov (United States)

    Foody, Michael

    1989-09-01

    Applications with iconic user interfaces (for example, interfaces with pulldown menus, radio buttons, and scroll bars), such as those found on Apple's Macintosh computer and the IBM PC under Microsoft's Presentation Manager, have become very popular, and for good reason. They are much easier to use than applications with traditional keyboard-oriented interfaces, so training costs are much lower and just about anyone can use them. They are standardized between applications, so once you learn one application you are well along the way to learning another. The use of one reinforces the common elements of the interface between applications and, as a result, you remember how to use them longer. Finally, for the developer, support costs can be much lower because of their ease of use.

  17. The Role of Perceived User-Interface Design in Continued Usage Intention of Self-Paced E-Learning Tools

    Science.gov (United States)

    Cho, Vincent; Cheng, T. C. Edwin; Lai, W. M. Jennifer

    2009-01-01

    While past studies on user-interface design focused on a particular system or application using the experimental approach, we propose a theoretical model to assess the impact of perceived user-interface design (PUID) on continued usage intention (CUI) of self-paced e-learning tools in general. We argue that the impact of PUID is mediated by two…

  18. A Systematic Review of User Interface Issues Related to PDA-based Decision Support Systems in Health Care

    OpenAIRE

    Lee, Nam-Ju; Starren, Justin; Bakken, Suzanne

    2005-01-01

    This paper explores user interface issues in the design and implementation of a personal digital assistant-based decision support system (PDA-DSS) in health care. An automated literature search found 15 studies addressing the main PDA user interface issues, which can be categorized as display, security, memory, Web browser, and communication.

  19. Prototype of haptic device for sole of foot using magnetic field sensitive elastomer

    Science.gov (United States)

    Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.

    2013-02-01

    Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, for people with physical and/or cognitive disabilities, it can be challenging to go somewhere they don't know well. The final goal of this study is to develop a virtual reality walking system that allows users to walk in virtual worlds fabricated with computer graphics. We focus on a haptic device that can apply various plantar pressures to the user's sole as an additional sense in virtual reality walking. In this study, we discuss the use of a magnetic field sensitive elastomer (MSE) as a working material for the haptic interface on the sole. The first prototype with MSE was developed and evaluated in this work. Measurements of plantar pressure showed that the device can apply different pressures to the sole of a light-weight user by applying a magnetic field to the MSE. The results also implied the need to improve the magnetic circuit and the basic structure of the device's mechanism.

  20. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Directory of Open Access Journals (Sweden)

    Ling-Hong Hung

    Full Text Available Reproducibility is vital in science. For complex computational methods, it is often necessary to recreate not just the code, but also the software and hardware environment, to reproduce results. Virtual machines and container software such as Docker make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems, as there is no high-level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics-based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies, and configures a common X Windows (X11) graphical interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a Java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
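
    The X11 plumbing this kind of containerized GUI launch relies on can be illustrated with a sketch of the implied `docker run` invocation on Linux; the image tag used here is hypothetical, not GUIdock's actual published image:

```python
# Sketch of the docker invocation a GUIdock-style X11 container launch
# implies on Linux. The image tag "guidock/networkbma" is hypothetical.

def x11_docker_command(image, display=":0"):
    """Build a `docker run` argument list that shares the host X11
    socket and DISPLAY so a GUI app inside the container can render."""
    return [
        "docker", "run", "--rm",
        "-e", f"DISPLAY={display}",            # point X clients at the host display
        "-v", "/tmp/.X11-unix:/tmp/.X11-unix", # bind-mount the X server socket
        image,
    ]

cmd = x11_docker_command("guidock/networkbma")
```

    On macOS and Windows the same idea requires an X server such as XQuartz or Xming on the host, which is the cross-platform configuration work GUIdock packages up.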

  1. Four principles for user interface design of computerised clinical decision support systems.

    Science.gov (United States)

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

    The paper presents results from a design research project on a user interface (UI) for a Computerised Clinical Decision Support System (CDSS). The ambition has been to design Human-Computer Interaction (HCI) that can minimise medication errors. Through an iterative design process, a digital prototype for the prescription of medicine has been developed. This paper presents results from the formative evaluation of the prototype, conducted in a simulation laboratory with ten participating physicians. Data from the simulation are analysed using theory on how users perceive information. The conclusion is a model that sums up four principles of interaction for the design of CDSS. The four principles for the design of user interfaces for CDSS are summarised as four A's: All in one, At a glance, At hand and Attention. The model emphasises the integration of all four interaction principles in the design of user interfaces for CDSS, i.e., it is an integrated model that we suggest as a guide for interaction design when working to prevent medication errors.

  2. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    Science.gov (United States)

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  3. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    Directory of Open Access Journals (Sweden)

    Carter Kim W

    2008-12-01

    Full Text Available Abstract Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to effectively utilise the tool. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  4. Classification of user interfaces for graph-based online analytical processing

    Science.gov (United States)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements, namely on the effective pairing of interface designs to different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing user interface designs to Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces according to their ability, as perceived by Subject Matter Experts, to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided of both existing variations of Graph OLAP and existing interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable toward SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.
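
    The AHP machinery behind such pairwise interface comparisons can be sketched as follows; the judgement values are invented for illustration, and the priority computation is the standard principal-eigenvector approximation, not the paper's specific ontology encoding:

```python
def ahp_priorities(pairwise, iters=200):
    """Approximate the principal eigenvector of an AHP pairwise
    comparison matrix by power iteration; normalised, it gives the
    relative priority of each alternative."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Invented Saaty-scale judgements for three hypothetical interface
# designs: A is moderately preferred (3) over B, strongly (5) over C.
judgements = [[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]]
weights = ahp_priorities(judgements)
```

    Each weight expresses how strongly the experts' judgements favour one interface design for a given Graph OLAP dimension or measure.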

  5. The Impact of User Interface on Young Children’s Computational Thinking

    Directory of Open Access Journals (Sweden)

    Amanda Sullivan

    2017-07-01

    Full Text Available Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have on children's mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is growing pressure to begin teaching computational thinking at a young age. This study explores the affordances of two very different programming interfaces for teaching computational thinking: a graphical coding application on the iPad (ScratchJr) and a tangible programmable robotics kit (KIBO). Methodology: This study used a mixed-method approach to explore the learning experiences that young children have with tangible and graphical coding interfaces. A sample of children ages four to seven (N = 28) participated. Findings: Results suggest that the type of user interface does have an impact on children's learning, but it is only one of many factors that affect positive academic and socio-emotional experiences. Tangible and graphical interfaces each have qualities that foster different types of learning.

  6. Providing QoS for Networked Peers in Distributed Haptic Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alan Marshall

    2008-01-01

    Full Text Available Haptic information originates from a different human sense (touch), therefore the quality of service (QoS) required to support haptic traffic is significantly different from that used to support conventional real-time traffic such as voice or video. Each type of network impairment has different (and severe) impacts on the user's haptic experience. There has been no specific provision of QoS parameters for haptic interaction. Previous research into distributed haptic virtual environments (DHVEs) has concentrated on synchronization of positions (of the haptic device or virtual objects), and is based on client-server architectures. We present a new peer-to-peer DHVE architecture that further extends this to enable force interactions between two users, whereby force data are sent to the remote peer in addition to positional information. The work presented involves both simulation and practical experimentation, where multimodal data are transmitted over a QoS-enabled IP network. Both forms of experiment produce consistent results, which show that the use of specific QoS classes for haptic traffic will reduce network delay and jitter, leading to improvements in users' haptic experiences with these types of applications.
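
    Assigning haptic traffic to a dedicated QoS class in an IP network typically means marking packets with a DiffServ code point. A minimal sketch, assuming UDP transport and the Expedited Forwarding class; the paper argues for haptic-specific classes but the exact code point chosen here is illustrative:

```python
import socket
import struct

# Mark haptic UDP packets with a DiffServ code point so routers can place
# them in a low-delay, low-jitter class. EF (46) is an illustrative choice.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2   # DSCP occupies the upper 6 bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# One position + force sample for the remote peer (x, y, z, fx, fy, fz)
payload = struct.pack("!6f", 0.1, 0.2, 0.3, 0.0, -1.5, 0.0)
# sock.sendto(payload, (peer_host, peer_port))  # peer address omitted here
```

    In a peer-to-peer DHVE both endpoints would mark their outgoing position/force stream this way, so that queuing delay and jitter stay within the tight bounds haptic rendering requires.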

  7. DESIGN AND IMPLEMENTATION OF A USER-ORIENTED SPEECH RECOGNITION INTERFACE - THE SYNERGY OF TECHNOLOGY AND HUMAN-FACTORS

    NARCIS (Netherlands)

    KLOOSTERMAN, SH

    The design and implementation of a user-oriented speech recognition interface are described. The interface enables the use of speech recognition in so-called interactive voice response systems which can be accessed via a telephone connection. In the design of the interface a synergy of technology

  8. Design and implementation of a user-oriented speech recognition interface: the synergy of technology and human factors

    NARCIS (Netherlands)

    Kloosterman, Sietse H.

    1994-01-01

    The design and implementation of a user-oriented speech recognition interface are described. The interface enables the use of speech recognition in so-called interactive voice response systems which can be accessed via a telephone connection. In the design of the interface a synergy of technology

  9. Brave NUI World Designing Natural User Interfaces for Touch and Gesture

    CERN Document Server

    Wigdor, Daniel

    2011-01-01

    Touch and gestural devices have been hailed as the next evolutionary step in human-computer interaction. As software companies struggle to catch up with one another in developing the next great touch-based interface, designers are charged with the daunting task of keeping up with advances in new technology and with this new aspect of user experience design. Product and interaction designers, developers and managers are already well versed in UI design, but touch-based interfaces have added a new level of complexity.

  10. User participation in the development of the human/computer interface for control centers

    Science.gov (United States)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirement to reduce operations staffing costs led to the demand for efficient, technologically sophisticated mission operations control centers. The control center under development for the Earth Observing System (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation is reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  11. Novel user interface design for medication reconciliation: an evaluation of Twinlist.

    Science.gov (United States)

    Plaisant, Catherine; Wu, Johnny; Hettinger, A Zach; Powsner, Seth; Shneiderman, Ben

    2015-03-01

    The primary objective was to evaluate time, number of interface actions, and accuracy on medication reconciliation tasks using a novel user interface (Twinlist, which lays out the medications in five columns based on similarity and uses animation to introduce the grouping - www.cs.umd.edu/hcil/sharp/twinlist) compared to a Control interface (where medications are presented side by side in two columns). A secondary objective was to assess participant agreement with statements regarding clarity and utility and to elicit comparisons. A 1 × 2 within-subjects experimental design was used with interface (Twinlist or Control) as an independent variable; time, number of clicks, scrolls, and errors were used as dependent variables. Participants were practicing medical providers with experience performing medication reconciliation but no experience with Twinlist. They reconciled two cases in each interface (in a counterbalanced order), then provided feedback on the design of the interface. Twenty medical providers participated in the study for a total of 80 trials. The trials using Twinlist were statistically significantly faster (18%), with fewer clicks (40%) and scrolls (60%). Serious errors were noted 12 and 31 times in Twinlist and Control trials, respectively. Trials using Twinlist were faster and more accurate. Subjectively, participants rated Twinlist more favorably than Control. They valued the novel layout of the drugs and indicated that the included animation would be valuable for novices, though not necessarily for advanced users. Additional feedback from participants provides guidance for further development and clinical implementations. Cognitive support of medication reconciliation through interface design can significantly improve performance and safety.
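    The similarity-based grouping behind a Twinlist-style layout can be illustrated in a few lines. The sketch below is a hypothetical reconstruction, not Twinlist's actual matching logic: it uses `difflib` string similarity (the `threshold` value and helper name `group_medications` are illustrative) to partition two medication lists into identical, similar, and unique groups, one group per display column.

```python
from difflib import SequenceMatcher

def group_medications(list_a, list_b, threshold=0.6):
    """Partition two medication lists into Twinlist-style groups:
    identical entries, near-match pairs, and entries unique to each list."""
    identical = sorted(set(list_a) & set(list_b))
    rest_a = [m for m in list_a if m not in identical]
    rest_b = [m for m in list_b if m not in identical]
    similar, matched_b = [], set()
    for a in rest_a:
        best, best_score = None, threshold
        for b in rest_b:
            if b in matched_b:
                continue
            score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if score > best_score:
                best, best_score = b, score
        if best is not None:
            similar.append((a, best))
            matched_b.add(best)
    unique_a = [a for a in rest_a if a not in {p[0] for p in similar}]
    unique_b = [b for b in rest_b if b not in matched_b]
    return {"unique_a": unique_a, "similar": similar,
            "identical": identical, "unique_b": unique_b}

groups = group_medications(
    ["Lisinopril 10mg", "Metformin 500mg", "Aspirin 81mg"],
    ["Lisinopril 20mg", "Metformin 500mg", "Warfarin 5mg"])
print(groups)
```

    A real system would match on coded drug identifiers and dose fields rather than raw strings, but the column partition is the same idea.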

  12. An Evaluation and Redesign of the Conflict Prediction and Trial Planning Planview Graphical User Interface

    Science.gov (United States)

    Laudeman, Irene V.; Brasil, Connie L.; Stassart, Philippe

    1998-01-01

    The Planview Graphical User Interface (PGUI) is the primary display of air traffic for the Conflict Prediction and Trial Planning function of the Center TRACON Automation System. The PGUI displays air traffic information that assists the user in making decisions related to conflict detection, conflict resolution, and traffic flow management. The intent of this document is to outline the human factors issues related to the design of the conflict prediction and trial planning portions of the PGUI, document all human factors related design changes made to the PGUI from December 1996 to September 1997, and outline future plans for the ongoing PGUI design.

  13. Marine Web Portal as an Interface between Users and Marine Data and Information Sources

    Science.gov (United States)

    Palazov, A.; Stefanov, A.; Marinova, V.; Slabakova, V.

    2012-04-01

    Fundamental to the success of a marine data and information management system, and to effective support of marine and maritime economic activities, are the speed and ease with which users can identify, locate, access, exchange, and use oceanographic and marine data and information. Many activities and bodies have been identified as marine data and information users, such as: science, government and local authorities, port authorities, shipping, marine industry, fishery and aquaculture, the tourist industry, environmental protection, coast protection, oil-spill response, search and rescue, national security, civil protection, and the general public. On the other hand, diverse sources of real-time and historical marine data and information exist, and they are generally fragmented, distributed across different places, and sometimes unknown to users. The marine web portal concept is to build a common web-based interface that provides users fast and easy access to all available marine data and information sources, both historical and real-time, such as marine databases, observing systems, forecasting systems, and atlases. The service is regionally oriented to meet user needs. The main advantage of the portal is that it provides a general look "at a glance" at all available marine data and information, and directs the user to easily discover data and information of interest. It is planned to provide a personalization capability, which will give users an instrument to tailor visualization to their personal needs.

  14. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
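    The color-based hand segmentation step described above can be sketched as a simple HSV threshold. The bounds and helper name below are illustrative assumptions, not the authors' values; the hue computation only handles the red/yellow sector that skin tones occupy.

```python
import numpy as np

def segment_hand(rgb, lower=(0.0, 0.23, 0.35), upper=(50.0, 0.68, 1.0)):
    """Minimal color-based segmentation sketch: convert RGB pixels to HSV
    and keep those inside an (illustrative) skin-tone range."""
    rgb = rgb.astype(np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    v = rgb.max(axis=-1)
    c = v - rgb.min(axis=-1)                      # chroma
    s = np.where(v > 0, c / np.maximum(v, 1e-12), 0.0)
    # hue in degrees; only the sector where red dominates is computed,
    # which is the one relevant for skin tones
    h = np.zeros_like(v)
    mask_r = (v == r) & (c > 0)
    h[mask_r] = (60.0 * ((g - b)[mask_r] / c[mask_r])) % 360.0
    return ((h >= lower[0]) & (h <= upper[0]) &
            (s >= lower[1]) & (s <= upper[1]) &
            (v >= lower[2]) & (v <= upper[2]))

# one skin-toned pixel and one blue pixel
mask = segment_hand(np.array([[[220, 170, 140], [0, 0, 255]]], dtype=np.uint8))
print(mask)
```

    Production systems typically learn the color model per user and per lighting condition rather than using fixed bounds.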

  15. Rating User Interface and Universal Instructional Design in MOOC Course Design

    Directory of Open Access Journals (Sweden)

    Richard Meyer

    2015-01-01

    This study examines how college students rate Massive Open Online Courses (MOOCs) in terms of User Interface Design and Universal Instructional Design. The research participants were 115 undergraduate students from a public midwestern university in the United States. Each participant evaluated three randomly chosen MOOCs, all of which were developed on the Coursera platform, using rubrics for User Interface Design and Universal Instructional Design. The results indicated that students had an overall positive impression of each MOOC’s course design. This study concludes that overall course design strategies are not associated with the massive dropout rates currently documented in MOOC learning environments. The authors suggest that the use of appropriate instructional design principles be further explored.

  16. Microcomputer spacecraft thermal analysis routines (MSTAR) Phase I: The user interface

    Science.gov (United States)

    Teti, Nicholas M.

    1993-01-01

    The Microcomputer Spacecraft Thermal Analysis Routines (MSTAR) software package is being developed for NASA/Goddard Space Flight Center by Swales and Associates, Inc. (S&AI). In December 1992, S&AI was awarded a Phase I Small Business Innovative Research contract from NASA to develop a microcomputer-based thermal analysis program to replace the current SSPTA and TRASYS programs. Phase I consists of a six-month effort focused on developing geometric model generation and visualization capabilities using a graphical user interface (GUI). The information contained in this paper encompasses the work performed during the Phase I development cycle, with emphasis on the development of the GUI. This includes both the theory behind and specific examples of how the MSTAR GUI was implemented. Furthermore, this report discusses new applications and enhancements which will improve the capabilities and commercialization of the MSTAR program.

  17. From User Interface Usability to the Overall Usability of Interactive Systems: Adding Usability in System Architecture

    Science.gov (United States)

    Taleb, Mohamed; Seffah, Ahmed; Engleberg, Daniel

    Traditional interactive system architectures such as MVC and PAC decompose the system into subsystems that are relatively independent, thereby allowing the design work to be partitioned between the user interfaces and underlying functionalities. Such architectures extend the independence assumption to usability, approaching the design of the user interface as a subsystem that can be designed and tested independently from the underlying functionality. This Cartesian dichotomy can be fallacious, as functionalities buried in the application’s logic can sometimes affect the usability of the system. Our investigations model the relationships between internal software attributes and externally visible usability factors. We propose a pattern-based approach for dealing with these relationships. We conclude by discussing how these patterns can lead to a methodological framework for improving interactive system architectures, and how these patterns can support the integration of usability in the software design process.

  18. QE::GUI – A Graphical User Interface for Quality Estimation

    Directory of Open Access Journals (Sweden)

    Avramidis Eleftherios

    2017-10-01

    Despite its wide applicability, Quality Estimation (QE) of Machine Translation (MT) poses a difficult entry barrier since there are no open-source tools with a graphical user interface (GUI). Here we present a tool in this direction by connecting the back-end of the QE decision-making mechanism with a web-based GUI. The interface allows the user to post requests to the QE engine and get a visual response with the results. Additionally, we provide pre-trained QE models for easier launching of the app. The tool is written in Python so that it can leverage the rich natural language processing capabilities of the popular dynamic programming language, which is at the same time supported by top web-server environments.

  19. Development of educational software for beam loading analysis using pen-based user interfaces

    Directory of Open Access Journals (Sweden)

    Yong S. Suh

    2014-01-01

    Most engineering software tools use typical menu-based user interfaces, which may not be suitable for learning tools because the solution processes are hidden and students can only see the results. An educational tool for simple beam analyses is developed using a pen-based user interface with a computer, so students can write and sketch by hand. The geometry of a beam section is sketched, and a shape-matching technique is used to recognize the sketch. Various beam loads are added by sketching gestures or writing singularity functions. Students sketch the distributions of the loadings as graphs, which are automatically checked, and the system provides aids in grading the graphs. Students receive interactive graphical feedback for a better learning experience while they work on solving the problems.

  20. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    Science.gov (United States)

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.

  1. XUIMS the X-Window User Interface Management System at CERN

    CERN Document Server

    Van den Eynden, M

    1995-01-01

    The CERN X-Window User Interface Management System (XUIMS) is a modular and highly configurable software development environment allowing the interactive design, prototyping, and production of OSF/Motif Human Computer Interfaces (HCI). Fully compliant with the X11R5 and OSF/Motif industry standards, XUIMS covers complex software areas like the development of schematics, the visualization and on-line interactions with 2D and 3D scientific data, the display of relational database data, and the direct access to CERN SPS and LEP accelerators equipment. The guarantee of consistency across the applications and the encapsulation of complex functionality in re-usable and user-friendly components has also been implemented through the development of home-made graphical objects (widgets) and templates. The XUIMS environment is built with commercial software products integrated in the CERN SPS and LEP controls infrastructure with a very limited home-made effort. Productivity and quality have been improved through less co...

  2. TOOKUIL: A case study in user interface development for safety code application

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G. [and others

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error-prone, and time-consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post-processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  3. From Documents to User Interfaces Universal Design and the Emergence of Abstraction

    Directory of Open Access Journals (Sweden)

    Jason White

    2004-05-01

    Abstract representations of content, which allow it to be automatically adapted to suit the delivery context, have emerged historically with the development of markup languages intended to facilitate the storage and processing of electronic documents. This technological tradition is reviewed in the first part of the paper, focusing predominantly on the nature and advantages of a ‘single authoring’ approach to the creation of content. Some of the lessons to be derived from the evolution and deployment of markup systems are also discussed, and then applied, in the second part of the paper, to the question of how such abstractions can be extended to the design of user interfaces. Innovative work related to the generic specification of user interfaces is reviewed. It is argued that the advantages of an abstract approach depend for their realization on the development of more expressive style languages and more sophisticated adaptation mechanisms, as well as continued refinement of the semantics of markup languages themselves.

  4. Graphical user interface for input output characterization of single variable and multivariable highly nonlinear systems

    Directory of Open Access Journals (Sweden)

    Shahrukh Adnan Khan M. D.

    2017-01-01

    This paper presents a Graphical User Interface (GUI) software utility for the input/output characterization of single-variable and multivariable nonlinear systems by obtaining the sinusoidal-input describing function (SIDF) of the plant. The software utility is developed in the MATLAB R2011a environment. The developed GUI places no restriction on the nonlinearity type, arrangement, or system order, provided that the output(s) of the system can be obtained either through simulation or experiments. An insight into the GUI and its features is presented in this paper, and example problems from both single-variable and multivariable cases are demonstrated. The formulation of the input/output behavior of the system is discussed, and the nucleus of the MATLAB commands underlying the user interface is outlined. The industries that would benefit from this software utility include, but are not limited to, aerospace, defense technology, robotics, and automotive.
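    The core SIDF computation such a tool performs can be estimated numerically: drive the nonlinearity with a sinusoid and take the ratio of the output's first harmonic to the input amplitude. The sketch below (Python standing in for the paper's MATLAB code; function names are illustrative) checks itself against the classical closed-form describing function of a saturation element.

```python
import numpy as np

def sidf(nonlinearity, amplitude, n_samples=4096):
    """Numerically estimate the sinusoidal-input describing function:
    the complex ratio of the output's first harmonic to the input sinusoid."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    x = amplitude * np.sin(theta)
    y = nonlinearity(x)
    # first-harmonic Fourier coefficients of the output
    b1 = (2.0 / n_samples) * np.sum(y * np.sin(theta))   # in-phase
    a1 = (2.0 / n_samples) * np.sum(y * np.cos(theta))   # quadrature
    return (b1 + 1j * a1) / amplitude

# Unit-slope saturation with limit L=1; for A > L the classical result is
# (2/pi) * (asin(L/A) + (L/A) * sqrt(1 - (L/A)**2)).
sat = lambda x: np.clip(x, -1.0, 1.0)
print(abs(sidf(sat, 5.0)))
```

    For a static odd nonlinearity the quadrature term vanishes; a dynamic or hysteretic nonlinearity would make the SIDF complex-valued.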

  5. Development of a user-friendly interface version of the Salmonella source-attribution model

    DEFF Research Database (Denmark)

    Hald, Tine; Lund, Jan

    allow for the identification of the most important animal reservoirs of the zoonotic agent, assisting risk managers to prioritize interventions and focus control strategies at the animal production level. The model can provide estimates for the effect on the number of human cases originating from...... of questions, where the use of a classical quantitative risk assessment model (i.e. transmission models) would be impaired due to a lack of data and time limitations. As these models require specialist knowledge, EFSA requested the development of a flexible, user-friendly source-attribution model for use...... in this report called the EFSA Source Attribution Model (EFSA_SAM). The programming language (development environment) used for developing the user-friendly interface is Embarcadero's Delphi XE2 Enterprise. The interface generates a WinBUGS code based on the user's imported data and model selections...

  6. Haptic feedback for virtual assembly

    Science.gov (United States)

    Luecke, Greg R.; Zafer, Naci

    1998-12-01

    Assembly operations require high speed and precision at low cost. The manufacturing industry has recently turned attention to the possibility of investigating assembly procedures using graphical display of CAD parts. For these tasks, some sort of feedback to the person is invaluable in providing a real sense of interaction with virtual parts. This research develops the use of a commercial assembly robot as the haptic display in such tasks. For demonstration, a peg-hole insertion task is studied. Kane's Method is employed to derive the dynamics of the peg and the contact motions between the peg and the hole. A handle modeled as a cylindrical peg is attached to the end effector of a PUMA 560 robotic arm, which is equipped with a six-axis force/torque transducer. The user grabs the handle and the user-applied forces are recorded. A 300 MHz Pentium computer is used to simulate the dynamics of the virtual peg and its interactions as it is inserted into the virtual hole. Computed torque control is then employed to render the full dynamics of the task to the user's hand. Visual feedback is also incorporated to help the user in the process of inserting the peg into the hole. Experimental results are presented to show several contact configurations for this virtually simulated task.
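    The computed-torque scheme mentioned above cancels the manipulator's nonlinear dynamics and imposes linear error dynamics. A minimal 1-DOF sketch (all gains, masses, and the gravity term are illustrative, not the paper's PUMA 560 parameters):

```python
import math

# Computed-torque control for a 1-DOF gravity-loaded joint:
#   tau = I*(qdd_d + Kd*(qd_d - qd) + Kp*(q_d - q)) + m*g*l*sin(q)
# Substituting into I*qdd = tau - m*g*l*sin(q) leaves the linear
# error dynamics e'' + Kd*e' + Kp*e = 0, so tracking error decays.
I, m, g, l = 0.5, 1.0, 9.81, 0.3         # inertia, mass, gravity, arm length
Kp, Kd, dt = 100.0, 20.0, 1e-3           # critically damped: Kd**2 == 4*Kp
q, qd = 0.0, 0.0                         # joint state
q_d, qd_d, qdd_d = 1.0, 0.0, 0.0         # constant desired position

for _ in range(5000):                    # 5 s of explicit-Euler simulation
    e, ed = q_d - q, qd_d - qd
    tau = I * (qdd_d + Kd * ed + Kp * e) + m * g * l * math.sin(q)
    qdd = (tau - m * g * l * math.sin(q)) / I
    qd += qdd * dt
    q += qd * dt

print(round(q, 3))   # converges to the 1.0 rad target
```

    In the haptic setting, the same cancellation idea lets the controller replace the robot's own dynamics with the simulated dynamics of the virtual peg, so the user's hand feels the task rather than the machine.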

  7. Make E-Learning Effortless! Impact of a Redesigned User Interface on Usability through the Application of an Affordance Design Approach

    Science.gov (United States)

    Park, Hyungjoo; Song, Hae-Deok

    2015-01-01

    Given that a user interface interacts with users, a critical factor to be considered in improving the usability of an e-learning user interface is user-friendliness. Affordances enable users to more easily approach and engage in learning tasks because they strengthen positive, activating emotions. However, most studies on affordances limit…

  8. User and system interface issues in the purchase of imaging and information systems.

    Science.gov (United States)

    Langer, S; Wang, J

    1996-08-01

    We introduce a set of worksheets to facilitate standardized comparisons among scheduling systems, hospital information systems, radiology information systems, and picture archiving and communication systems vendors. For each system category, we provide worksheets to evaluate the features, performance, installation requirements and costs of the included components. These worksheets will help to assure that critical user and systems interface issues are not overlooked and aid potential purchasers to make informed and objective purchasing decisions.

  9. User Interface Concepts for Mechanism Modelling in the RaMMS KBE System

    OpenAIRE

    Coward, Thor Christian

    2017-01-01

    By combining the principles of knowledge-based engineering (KBE) and concurrent engineering in the design process, repetitive tasks are reduced and design tasks are conducted simultaneously, enabling the engineer to explore a large design space early in the design process, when the committed production costs are low. This thesis investigates the principles of KBE, concurrent engineering, mechanisms, the mechanism design process, and user interface development. Rapid Mechanism Modelling System (...

  10. On the control of brain-computer interfaces by users with cerebral palsy

    OpenAIRE

    Daly I.; Billinger M.; Laparra-Hernandez J.; Aloise F.; Garcia M.L.; Faller J.; Scherer R.; Muller-Putz G.

    2013-01-01

    Brain-computer interfaces (BCIs) have been proposed as a potential assistive device for individuals with cerebral palsy (CP) to assist with their communication needs. However, it is unclear how well-suited BCIs are to individuals with CP. Therefore, this study aims to investigate to what extent these users are able to gain control of BCIs. FP7 Framework EU Research Project ABC 287774.

  11. Teaching Task Analysis for User Interface Design: Lessons Learned from Three Pilot Studies

    OpenAIRE

    Marçal de Oliveira, Káthia; Girard, Patrick; Guidini Gonçalves, Taisa; Lepreux, Sophie; Kolski, Christophe

    2015-01-01

    Task analysis is recognized by the Human-Computer Interaction community as good practice to improve the understanding of how a user may interact with software interfaces to reach a given goal. For more than a decade, we have taught task analysis in undergraduate and graduate HCI programs for the design of better interactive systems. In this paper, we describe three ways of teaching task analysis and the lessons learned from those practices. We consider this the fi...

  12. A Graphical User Interface for Scattering Analysis of Electromagnetic Waves Incident on Planar Layered Media

    Directory of Open Access Journals (Sweden)

    A. Mirala

    2016-09-01

    This paper introduces a MATLAB-based Graphical User Interface (GUI) intended to help electromagnetics engineers and researchers interested in designing layered media for various applications. The paper begins by presenting the analysis method the program employs, continues with specific implementation considerations and techniques, and ends by providing several numerical examples. These examples show the program's efficiency for the analysis of diverse problems.
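    The standard workhorse for planar layered media at normal incidence is the 2x2 transfer-matrix (characteristic-matrix) method; whether the paper uses exactly this formulation is an assumption here. The sketch below computes the reflection coefficient of a stack and checks it on a quarter-wave anti-reflection layer.

```python
import numpy as np

def reflection(n_list, d_list, wavelength):
    """Normal-incidence reflection coefficient of a planar layered stack
    via the 2x2 transfer-matrix method. n_list includes the incident and
    exit half-spaces; d_list holds the finite-layer thicknesses."""
    k0 = 2.0 * np.pi / wavelength
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_list[1:-1], d_list):
        delta = k0 * n * d                      # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    n_in, n_out = n_list[0], n_list[-1]
    num = (M[0, 0] + M[0, 1] * n_out) * n_in - (M[1, 0] + M[1, 1] * n_out)
    den = (M[0, 0] + M[0, 1] * n_out) * n_in + (M[1, 0] + M[1, 1] * n_out)
    return num / den

# Quarter-wave MgF2-like layer (n=1.38) on glass (n=1.52) at 550 nm:
# reflectance drops well below the ~4.3% of bare glass.
r = reflection([1.0, 1.38, 1.52], [550e-9 / (4 * 1.38)], 550e-9)
print(abs(r) ** 2)
```

    Oblique incidence and lossy layers only change the per-layer admittance and phase terms; the matrix cascade is unchanged.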

  13. A Prototype Graphical User Interface for Co-op: A Group Decision Support System.

    Science.gov (United States)

    1992-03-01

    The purpose of this research is to design a prototype Graphical User Interface (GUI) for Co-oP...which each participant of the group has his own DSS whose model base is based on multiple criteria decision methods (MCDM) along with additional...

  14. Teaching Photovoltaic Array Modelling and Characterization Using a Graphical User Interface and a Flash Solar Simulator

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas

    2012-01-01

    This paper presents a set of laboratory tools aimed at supporting students with various backgrounds (no programming) in understanding photovoltaic array modelling and characterization techniques. A graphical user interface (GUI) has been developed in Matlab for modelling PV arrays and characterizing the effect of different types of parameters and operating conditions on the current-voltage and power-voltage curves. The GUI is supported by experimental investigation and validation at the PV module level, with the help of an indoor flash solar simulator.
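    PV curve modelling of this kind usually rests on the single-diode equation, which is implicit in the current. A small sketch (all parameter values are illustrative, not from the paper) solves it by fixed-point iteration to trace points on the I-V curve:

```python
import math

def pv_current(v, i_ph=8.0, i_0=1e-9, n=1.3, t=298.15,
               r_s=0.01, r_sh=200.0, cells=60):
    """Single-diode PV model (illustrative parameters): solve the implicit
    equation  I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
    for the terminal current by simple fixed-point iteration."""
    vt = 1.380649e-23 * t / 1.602176634e-19     # thermal voltage kT/q
    i = i_ph                                     # start from the photocurrent
    for _ in range(200):
        vd = v + i * r_s                         # voltage across the diode
        i = i_ph - i_0 * (math.exp(vd / (n * cells * vt)) - 1.0) - vd / r_sh
    return i

# current falls from short-circuit toward open-circuit as voltage rises
for v in (0.0, 15.0, 30.0):
    print(v, round(pv_current(v), 3))
```

    Sweeping `v` and plotting `i` and `v * i` reproduces the current-voltage and power-voltage curves the GUI visualizes; shading and temperature effects enter through `i_ph` and `t`.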

  15. User Interface on the World Wide Web: How to Implement a Multi-Level Program Online

    Science.gov (United States)

    Cranford, Jonathan W.

    1995-01-01

    The objective of this Langley Aerospace Research Summer Scholars (LARSS) research project was to write a user interface that utilizes current World Wide Web (WWW) technologies for an existing computer program written in C, entitled LaRCRisk. The project entailed researching data presentation and script execution on the WWW and then writing input/output procedures for the database management portion of LaRCRisk.

  16. Toward User Interfaces and Data Visualization Criteria for Learning Design of Digital Textbooks

    OpenAIRE

    Elena RAILEAN

    2014-01-01

    User interface and data visualisation criteria are central issues in digital textbook design. However, when applying mathematical modelling of the learning process to the analysis of possible solutions, it can be observed that results differ. Mathematical learning theory views cognition on the basis of statistics and probability theory, graph theory, game theory, cellular automata, neural networks, etc. Instead, research methodologies in learning design are diversified into behaviourism, c...

  17. The data array, a tool to interface the user to a large data base

    Science.gov (United States)

    Foster, G. H.

    1974-01-01

    Aspects of the processing of spacecraft data are considered. Use of the data array in a large address space as an intermediate form in data processing for a large scientific data base is advocated. Techniques for efficient indexing in data arrays are reviewed, and the data array method for mapping an arbitrary structure onto a linear address space is shown. A compromise between the two forms is given. The impact of the data array on the user interface is considered, along with implementation.

  18. Graphical User Interface Aided Online Fault Diagnosis of Electric Motor - DC motor case study

    OpenAIRE

    POSTALCIOGLU OZGEN, S.

    2009-01-01

    This paper presents graphical user interface (GUI)-aided online fault diagnosis for a DC motor. The aim of the research is to prevent system faults. Online fault diagnosis has been studied. The design of fault diagnosis has two main levels: Level 1 comprises a traditional control loop; Level 2 contains knowledge-based fault diagnosis. The fault diagnosis technique contains a feature extraction module, a feature clustering module, and a fault decision module. Wavelet analysis has been used for the feature extract...
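    Wavelet-based feature extraction for fault detection typically reduces a signal to the energies of its detail bands, since faults redistribute energy across frequency bands. A minimal Haar-wavelet sketch (the wavelet choice, signal, and fault model are illustrative, not the paper's):

```python
import math

def haar_dwt(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise differences (detail)."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def band_energies(signal, levels=3):
    """Feature vector of detail-band energies: a common wavelet feature
    for fault detection, fed to a clustering or decision module."""
    features = []
    for _ in range(levels):
        signal, detail = haar_dwt(signal)
        features.append(sum(d * d for d in detail))
    return features

healthy = [math.sin(2 * math.pi * k / 16) for k in range(64)]
# a fault adds a high-frequency ripple, inflating the first detail band
faulty = [s + 0.5 * (-1) ** k for k, s in enumerate(healthy)]
print(band_energies(healthy), band_energies(faulty))
```

    The fault decision module then only has to compare such feature vectors against those of known healthy operation.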

  19. User-Centered Design, Experience, and Usability of an Electronic Consent User Interface to Facilitate Informed Decision-Making in an HIV Clinic.

    Science.gov (United States)

    Ramos, S Raquel

    2017-11-01

    Health information exchange is the electronic accessibility and transferability of patient medical records across various healthcare settings and providers. In some states, patients have to formally give consent to allow their medical records to be electronically shared. The purpose of this study was to apply a novel user-centered, multistep, multiframework approach to design and test an electronic consent user interface, so patients with HIV can make more informed decisions about electronically sharing their health information. This study consisted of two steps. Step 1 was a cross-sectional, descriptive, qualitative study that used user-centric design interviews to create the user interface. This informed Step 2. Step 2 consisted of a one-group posttest to examine perceptions of usefulness, ease of use, preference, and comprehension of a health information exchange electronic consent user interface. More than half of the study population had college experience, but challenges remained with overall comprehension regarding consent. The user interface was not independently successful, suggesting that in addition to an electronic consent user interface, human interaction may also be necessary to address the complexities associated with consenting to electronically share health information. Comprehension is a key factor in the ability to make informed decisions.

  20. A mobile user-interface for elderly care from the perspective of relatives.

    Science.gov (United States)

    Warpenius, Erika; Alasaarela, Esko; Sorvoja, Hannu; Kinnunen, Matti

    2015-03-01

    As the number of elderly people rises, relatives' care-taking responsibilities increase accordingly. This creates a need for new systems that enable relatives to keep track of aged family members. To develop new mobile services for elderly healthcare, we tried to identify the most wanted features of a mobile user-interface from the perspective of relatives. Feature mapping was based on two online surveys: one administered to the relatives (N = 32) and nurses (N = 3) of senior citizens, and the other to nursing students (N = 18). Results of the surveys, confirmed by face-to-face interviews with relatives (N = 8), indicated that the most valued features of the mobile user-interface are Accident Reporting (e.g. falling), Alarms (e.g. fire alarm), Doctor Visits, and evaluation of the General Condition of the Senior. The averaged importance ratings of these features were 9.2, 9.0, 8.6 and 8.5, respectively (on a scale from 0 to 10). Other important considerations for user-interface development are simplicity and ease of use. We recommend that these results be taken into account when designing and implementing mobile services for elderly healthcare.

  1. User interface considerations for telerobotics: the case of an agricultural robot sprayer

    Science.gov (United States)

    Adamides, George; Katsanos, Christos; Christou, Georgios; Xenos, Michalis; Papadavid, Giorgos; Hadzilacos, Thanasis

    2014-08-01

    Agricultural robots can tackle harsh working conditions and hardness of work, as well as the shortage of laborers that is a bottleneck to agricultural production. Such robots exist, but they are not yet widespread. We believe that the limited usage of robotics in agriculture could be related to the fact that the mainstream direction for robotics in agriculture is full automation. The teleoperation of an agricultural robotic system can enable improved performance overcoming the complexity that current autonomous robots face due to the dynamic and unstructured agricultural environment. A field study was conducted to evaluate eight different user interfaces aiming to determine the factors that should be taken into consideration by designers while developing user interfaces for robot teleoperation in agriculture. Thirty participants, including farmers and agricultural engineers, were asked to use different teleoperation interaction modes in order to navigate the robot along vineyard rows and spray grape clusters. Based on our findings, additional views for target identification and peripheral vision improved both robot navigation (fewer collisions) and target identification (sprayed grape clusters). In this paper, we discuss aspects of user interface design related to remote operation of an agricultural robot.

  2. Streamflow forecasting using the modular modeling system and an object-user interface

    Science.gov (United States)

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  3. Integrating Virtual Worlds with Tangible User Interfaces for Teaching Mathematics: A Pilot Study

    Science.gov (United States)

    Guerrero, Graciela; Ayala, Andrés; Mateu, Juan; Casades, Laura; Alamán, Xavier

    2016-01-01

    This article presents a pilot study of the use of two new tangible interfaces and virtual worlds for teaching geometry in a secondary school. The first tangible device allows the user to control a virtual object in six degrees of freedom. The second tangible device is used to modify virtual objects, changing attributes such as position, size, rotation and color. A pilot study on using these devices was carried out at the “Florida Secundaria” high school. A virtual world was built where students used the tangible interfaces to manipulate geometrical figures in order to learn different geometrical concepts. The pilot experiment results suggest that the use of tangible interfaces and virtual worlds enabled more meaningful learning (concepts learned were more durable). PMID:27792132

  4. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel.

    Science.gov (United States)

    Grapov, Dmitry; Newman, John W

    2012-09-01

    Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data sets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. The tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, and partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010).
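
    The kind of multivariate step imDEV wraps can be illustrated with a short, library-agnostic sketch (not imDEV's own code; the function and data below are invented for illustration): principal component analysis via the singular value decomposition of a centered data matrix.

```python
# Illustrative PCA sketch, the sort of multivariate analysis imDEV exposes
# from Excel via R. Names and data here are hypothetical.
import numpy as np

def pca(X, n_components=2):
    """Project the rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)              # center each variable (column)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T    # sample coordinates in PC space
    explained = (S ** 2) / np.sum(S ** 2)  # variance explained per component
    return scores, explained[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))             # 20 samples, 5 variables
scores, explained = pca(X)
print(scores.shape)                      # (20, 2)
```

A real analysis would follow this with the false-discovery-corrected comparisons and clustering the abstract lists.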

  5. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.

    Science.gov (United States)

    Kim, K

    2016-08-01

    To examine psoriasis or atopic eczema, analyzing skin roughness by palpation is essential for precise diagnosis of skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual approach to designing a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that lets a user both see and touch digital skin surface roughness was developed, including a geometric roughness estimation method for a meshed surface. A psychophysical experiment was then designed and conducted with 12 human subjects to measure human perception of surface roughness through the developed visual and haptic interfaces. The experiment showed that touch is more sensitive at lower surface roughness, and vice versa. Perception through both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When both channels, the visual and haptic interfaces, are used together, the ability to detect roughness abnormalities is greatly improved by sensory integration within the developed visuohaptic rendering system. The results can serve as a guideline for designing a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. They also confirm that the developed visuohaptic rendering system can help dermatologists and skin care professionals examine skin conditions using vision and touch at the same time.
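
    The tension the abstract describes between removing spatial noise and preserving roughness can be illustrated with a rough, hypothetical sketch (not the paper's perceptual filter): compare an RMS roughness estimate before and after a naive moving-average spatial filter.

```python
# Hedged illustration only: RMS roughness (Rq) of a 1D height profile before
# and after a crude moving-average filter. The paper's filter is designed
# perceptually; this generic filter and the signal below are assumptions.
import numpy as np

def rms_roughness(heights):
    """Rq: root-mean-square deviation from the mean surface line."""
    h = np.asarray(heights, dtype=float)
    return np.sqrt(np.mean((h - h.mean()) ** 2))

def moving_average(heights, window=5):
    """Naive spatial noise filter; a real system must tune this perceptually."""
    kernel = np.ones(window) / window
    return np.convolve(heights, kernel, mode="same")

rng = np.random.default_rng(1)
surface = np.sin(np.linspace(0, 6 * np.pi, 200))    # underlying texture
noisy = surface + rng.normal(scale=0.3, size=200)   # conversion noise
print(rms_roughness(noisy) > rms_roughness(moving_average(noisy)))  # True
```

The filter lowers the measured Rq, which is exactly why an over-aggressive filter would also erase diagnostically relevant roughness.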

  6. Integrating macromolecular X-ray diffraction data with the graphical user interface iMosflm.

    Science.gov (United States)

    Powell, Harold R; Battye, T Geoff G; Kontogiannis, Luke; Johnson, Owen; Leslie, Andrew G W

    2017-07-01

    X-ray crystallography is the predominant source of structural information for biological macromolecules, providing fundamental insights into biological function. The availability of robust and user-friendly software to process the collected X-ray diffraction images makes the technique accessible to a wider range of scientists. iMosflm/MOSFLM (http://www.mrc-lmb.cam.ac.uk/harry/imosflm) is a software package designed to achieve this goal. The graphical user interface (GUI) version of MOSFLM (called iMosflm) is designed to guide inexperienced users through the steps of data integration, while retaining powerful features for more experienced users. Images from almost all commercially available X-ray detectors can be handled using this software. Although the program uses only 2D profile fitting, it can readily integrate data collected in the 'fine phi-slicing' mode (in which the rotation angle per image is less than the crystal mosaic spread by a factor of at least 2), which is commonly used with modern very fast readout detectors. The GUI provides real-time feedback on the success of the indexing step and the progress of data processing. This feedback includes the ability to monitor detector and crystal parameter refinement and to display the average spot shape in different regions of the detector. Data scaling and merging tasks can be initiated directly from the interface. Using this protocol, a data set of 360 images with ∼2,000 reflections per image can be processed in ∼4 min.

  7. Envisioning Advanced User Interfaces for E-Government Applications: A Case Study

    Science.gov (United States)

    Calvary, Gaëlle; Serna, Audrey; Coutaz, Joëlle; Scapin, Dominique; Pontico, Florence; Winckler, Marco

    The increasing use of the Web as a software platform together with the advance of technology has promoted Web applications as a starting point for improving communication between citizens and administration. Currently, several e-government Web portals propose applications for accessing information regarding healthcare, taxation, registration, housing, agriculture, education, and social services, which otherwise may be difficult to obtain. However, the adoption of services provided to citizens depends upon how such applications comply with the users' needs. Unfortunately, building an e-government website doesn't guarantee that all citizens who come to use it can access its contents. These services need to be accessible to all citizens/customers equally to ensure wider reach and subsequent adoption of the e-government services. User disabilities, computer or language illiteracy (e.g., foreign language), flexibility on information access (e.g., user remotely located in rural areas, homeless, mobile users), and ensuring user privacy on sensitive data are some of the barriers that must be taken into account when designing the User Interface (UI) of e-government applications.

  8. Practical experience with graphical user interfaces and object-oriented design in the clinical laboratory.

    Science.gov (United States)

    Wells, I G; Cartwright, R Y; Farnan, L P

    1993-12-15

    The computing strategy in our laboratories evolved from research in Artificial Intelligence, and is based on powerful software tools running on high performance desktop computers with a graphical user interface. This allows most tasks to be regarded as design problems rather than implementation projects, and both rapid prototyping and an object-oriented approach to be employed during the in-house development and enhancement of the laboratory information systems. The practical application of this strategy is discussed, with particular reference to the system designer, the laboratory user and the laboratory customer. Routine operation covers five departments, and the systems are stable, flexible and well accepted by the users. Client-server computing, currently undergoing final trials, is seen as the key to further development, and this approach to Pathology computing has considerable potential for the future.

  9. Virtual Exertions: a user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation.

    Science.gov (United States)

    Ponto, Kevin; Kimmel, Ryan; Kohlmann, Joe; Bartholomew, Aaron; Radwin, Robert G

    2012-01-01

    Virtual Reality environments have the ability to present users with rich visual representations of simulated environments. However, means to interact with these types of illusions are generally unnatural in the sense that they do not match the methods humans use to grasp and move objects in the physical world. We demonstrate a system that enables users to interact with virtual objects with natural body movements by combining visual information, kinesthetics and biofeedback from electromyograms (EMG). Our method allows virtual objects to be grasped, moved and dropped through muscle exertion classification based on physical world masses. We show that users can consistently reproduce these calibrated exertions, allowing them to interface with objects in a novel way.

  10. affylmGUI: a graphical user interface for linear modeling of single channel microarray data.

    Science.gov (United States)

    Wettenhall, James M; Simpson, Ken M; Satterley, Keith; Smyth, Gordon K

    2006-04-01

    affylmGUI is a graphical user interface (GUI) to an integrated workflow for Affymetrix microarray data. The user is able to proceed from raw data (CEL files) to QC and pre-processing, and eventually to analysis of differential expression using linear models with empirical Bayes smoothing. Output of the analysis (tables and figures) can be exported to an HTML report. The GUI provides user-friendly access to state-of-the-art methods embodied in the Bioconductor software repository. affylmGUI is an R package freely available from http://www.bioconductor.org. It requires R version 1.9.0 or later and tcl/tk 8.3 or later, and has been successfully tested on Windows 2000, Windows XP, Linux (RedHat and Fedora distributions) and Mac OS X with X11. Further documentation is available at http://bioinf.wehi.edu.au/affylmGUI. Contact: keith@wehi.edu.au.
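
    The "linear models with empirical Bayes smoothing" that affylmGUI exposes follow the limma methodology; the sketch below shows only the variance-shrinkage idea behind the moderated t-statistic, with illustrative prior constants rather than values estimated from the data as limma does.

```python
# Schematic (non-limma) sketch of empirical Bayes variance moderation:
# per-gene sample variances are shrunk toward a prior, so genes with tiny,
# poorly estimated variances no longer yield explosive t-statistics.
# s2_prior and d_prior are invented constants for illustration.
import numpy as np

def moderated_t(mean_diff, s2, n, s2_prior=0.05, d_prior=4.0):
    """t-statistic with the sample variance shrunk toward s2_prior."""
    d = n - 1                                          # residual df per gene
    s2_mod = (d_prior * s2_prior + d * s2) / (d_prior + d)
    return mean_diff / np.sqrt(s2_mod / n)

# A gene whose sample variance happens to be tiny is tamed by moderation:
print(moderated_t(mean_diff=1.0, s2=0.001, n=3))   # moderated statistic
print(1.0 / np.sqrt(0.001 / 3))                    # ordinary t, much larger
```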

  11. Exploring the role of haptic feedback in enabling implicit HCI-based bookmarking.

    Science.gov (United States)

    Pan, Matthew K X J; McGrenere, Joanna; Croft, Elizabeth A; MacLean, Karon E

    2014-03-01

    We examine how haptic feedback could enable an implicit human-computer interaction, in the context of an audio stream listening use case where a device monitors a user's electrodermal activity for orienting responses to external interruptions. When such a response is detected, our previously developed system automatically places a bookmark in the audio stream for later resumption of listening. Here, we investigate two uses of haptic feedback to support this implicit interaction and mitigate effects of noisy (false-positive) bookmarking: (a) low-attention notification when a bookmark is placed, and (b) focused-attention display of bookmarks during resumptive navigation. Results show that haptic notification of bookmark placement, when paired with visual display of bookmark location, significantly improves navigation time. Solely visual or haptic display of bookmarks elicited equivalent navigation times; however, only the inclusion of haptic display significantly increased accuracy. Participants preferred haptic notification over no notification at interruption time, and combined haptic and visual display of bookmarks to support navigation to their interrupted location at resumption time. Our contributions include an approach to handling noisy data in implicit HCI, an implementation of haptic notifications that signal implicit system behavior, and a discussion of user mental models that may be active in this context.

  12. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services.

    Science.gov (United States)

    Correa, Miria C; Deus, Helena F; Vasconcelos, Ana T; Hayashi, Yuki; Ajani, Jaffer A; Patnana, Srikrishna V; Almeida, Jonas S

    2010-10-26

    AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general purpose solution to the challenge of having interfaces

  13. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services

    Directory of Open Access Journals (Sweden)

    Hayashi Yuki

    2010-10-01

    Full Text Available Abstract Background AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. Methods The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. Results We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. Conclusions The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general

  14. The effects of voluntary movements on auditory-haptic and haptic-haptic temporal order judgments.

    Science.gov (United States)

    Frissen, Ilja; Ziat, Mounia; Campion, Gianni; Hayward, Vincent; Guastavino, Catherine

    2012-10-01

    In two experiments we investigated the effects of voluntary movements on temporal haptic perception. Measures of sensitivity (JND) and temporal alignment (PSS) were obtained from temporal order judgments made on intermodal auditory-haptic (Experiment 1) or intramodal haptic (Experiment 2) stimulus pairs under three movement conditions. In the baseline, static condition, the arm of the participants remained stationary. In the passive condition, the arm was displaced by a servo-controlled motorized device. In the active condition, the participants moved voluntarily. The auditory stimulus was a short, 500 Hz tone presented over headphones, and the haptic stimulus was a brief suprathreshold force pulse applied to the tip of the index finger orthogonally to the finger movement. Active movement did not significantly affect discrimination sensitivity for the auditory-haptic stimulus pairs, whereas it significantly improved sensitivity for the haptic stimulus pair, demonstrating a key role for motor command information in temporal sensitivity in the haptic system. Points of subjective simultaneity were by and large coincident with physical simultaneity, with one striking exception in the passive condition with the auditory-haptic stimulus pair: there, the haptic stimulus had to be presented 45 ms before the auditory stimulus in order to obtain subjective simultaneity. A model is proposed to explain the discrimination performance.
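
    The JND and PSS measures reported here are conventionally read off a psychometric function fitted to the temporal order judgments. The sketch below is not the authors' analysis code: it fits a cumulative Gaussian by a crude grid search to invented response proportions, taking the mean as the PSS and 0.675σ as the JND (the half-distance between the 25% and 75% points).

```python
# Hypothetical TOJ analysis sketch: least-squares grid fit of a cumulative
# Gaussian to the proportion of "audio first" responses per SOA (ms).
# The SOAs and proportions are made up for illustration.
import math

def cdf(x, mu, sigma):
    """Cumulative Gaussian evaluated at x."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def fit_toj(soas, p_first):
    """Return (PSS, JND) in ms from a coarse grid search over mu, sigma."""
    best = None
    for mu in range(-100, 101):            # candidate PSS values, ms
        for sigma in range(5, 201):        # candidate slopes, ms
            err = sum((cdf(s, mu, sigma) - p) ** 2
                      for s, p in zip(soas, p_first))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    _, pss, sigma = best
    return pss, 0.675 * sigma              # JND = half the 25%-75% spread

soas = [-120, -80, -40, 0, 40, 80, 120]
p_first = [0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.99]
pss, jnd = fit_toj(soas, p_first)
print(pss, jnd)
```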

  15. Haptic Distal Spatial Perception Mediated by Strings: Haptic "Looming"

    Science.gov (United States)

    Cabe, Patrick A.

    2011-01-01

    Five experiments tested a haptic analog of optical looming, demonstrating string-mediated haptic distal spatial perception. Horizontally collinear hooks supported a weighted string held taut by a blindfolded participant's finger midway between the hooks. At the finger, the angle between string segments increased as the finger approached…

  16. Using GOMS to predict the usability of user interfaces of small off-the-shelf software products

    OpenAIRE

    O'Neill, Aine P

    1990-01-01

    The design of user interfaces and their usability are both important research topics in computer science. This thesis is a research effort aimed at exploring the concept of usability and measuring the quality of a user interface in terms of how usable it is. Usability means how easily a system can be learned and used. For products to be usable, they must be designed with usability in mind from the start. A survey of methods for designing user interfaces which incorporate usability ...
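
    As a concrete example of the GOMS family the thesis draws on, a keystroke-level model (KLM) prediction simply sums standard operator times; the operator values below are the commonly cited Card, Moran and Newell averages, and the task sequence is a made-up example.

```python
# Hedged KLM sketch: predicted execution time for a sequence of operators.
# Operator times are the canonical published averages; a real GOMS analysis
# would calibrate them for the user population under study.
OPERATOR_SECONDS = {
    "K": 0.28,   # keystroke (average typist)
    "P": 1.10,   # point with the mouse
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
    "B": 0.10,   # mouse button press or release
}

def klm_time(sequence):
    """Predicted time for a string of KLM operators, e.g. 'MPBB'."""
    return sum(OPERATOR_SECONDS[op] for op in sequence)

# Think, point at a target, click (press + release): M P B B
print(round(klm_time("MPBB"), 2))   # 2.65 seconds
```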

  17. Haptic force-feedback devices for the office computer: performance and musculoskeletal loading issues.

    Science.gov (United States)

    Dennerlein, J T; Yang, M C

    2001-01-01

    Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. Hypothesizing that the time to perform a task and the self-reported pain and discomfort of the task improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests, p < 0.001). Perceived user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback (p < 0.001). However, this difference decreased as additional distracting force fields were added to the task environment, simulating a more realistic work situation. These results suggest that for a given task, use of a force-feedback device improves performance, and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.
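
    The paper's headline comparison, paired per-subject task times with and without force feedback, can be sketched as a paired t-test; the numbers below are invented for illustration and are not the study's measurements.

```python
# Hedged sketch of a paired t-test on per-subject point-and-click times.
# The data are hypothetical, chosen to mimic the reported ~25% speedup.
import math
import statistics

def paired_t(a, b):
    """Return (t-statistic, degrees of freedom) for paired samples a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
    return t, n - 1

no_feedback = [1.10, 0.95, 1.20, 1.05, 1.15, 0.98]   # seconds per click
feedback    = [0.85, 0.78, 0.90, 0.80, 0.88, 0.75]   # with attractive field
t, df = paired_t(no_feedback, feedback)
print(df)   # 5
```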

  18. EMG-based visual-haptic biofeedback: a tool to improve motor control in children with primary dystonia.

    Science.gov (United States)

    Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo

    2013-05-01

    New insights suggest that dystonic motor impairments could also involve a deficit of sensory processing. In this framework, biofeedback, which makes covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup which provides the user with electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test whether augmented sensory feedback can induce motor control improvement in patients with primary dystonia. The purpose-built real-time control algorithm synchronizes the haptic loop with the EMG reading; the brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher the EMG level, the higher the virtual table friction, and the background color moved proportionally from green to red. From recordings on dystonic and healthy subjects, statistical results showed that biofeedback has a significant impact, correlated with the local impairment, on dystonic muscular control. These tests demonstrated the effectiveness of biofeedback paradigms in achieving better specific-muscle voluntary motor control. The flexible tool developed here shows promising prospects for clinical application and sensorimotor rehabilitation.
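
    The EMG-to-feedback mapping described above (higher EMG raises table friction and shifts the background from green toward red) might be sketched as follows; the gains, ranges and function names are illustrative assumptions, not the published calibration.

```python
# Hedged sketch of the proportional EMG-to-feedback mapping: a normalized
# EMG envelope drives both haptic friction and a green-to-red background.
# emg_max and friction_max are invented placeholder constants.
def emg_feedback(emg, emg_max=1.0, friction_max=5.0):
    """Return (friction, (r, g, b)) for an EMG envelope value."""
    level = max(0.0, min(emg / emg_max, 1.0))            # clamp to [0, 1]
    friction = friction_max * level                      # haptic friction
    color = (int(255 * level), int(255 * (1 - level)), 0)  # green -> red
    return friction, color

print(emg_feedback(0.0))   # (0.0, (0, 255, 0))  relaxed: green, no friction
print(emg_feedback(1.0))   # (5.0, (255, 0, 0))  full activation: red, max friction
```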

  19. Users Engage More with Interface than Materials at Welsh Newspapers Online Website

    Directory of Open Access Journals (Sweden)

    Kathleen Reed

    2016-09-01

    frequently accessed newspapers from the 1840s and 1850s. They viewed the title page much more frequently than any other page in the newspapers, likely reflecting that the title page is the default when users access a paper via browsing. A correlation between time spent on the site and searching versus engaging with content was found: the longer a visitor was on WNO, the less time they spent searching, and the more time spent engaging with content. Still, as Gooding reports, “over half of all pageviews are dedicated to interacting with the web interface rather than the historical sources” (p. 240). Conclusion – WNO visitors spend more of their time interacting with the site’s interface than with digitized content, making it important that interface design be a high priority when designing online archives. Gooding concludes that despite a focus on interface, visitors are still engaged in a research process similar to that found in an offline archive and that “a differently remediated experience is not necessarily any less rich” (p. 242).

  20. Robot services for elderly with cognitive impairment: testing usability of graphical user interfaces.

    Science.gov (United States)

    Granata, C; Pino, M; Legouverneur, G; Vidal, J-S; Bidaud, P; Rigaud, A-S

    2013-01-01

    Socially assistive robotics for elderly care is a growing field. However, although robotics has the potential to support the elderly in daily tasks by offering specific services, the development of usable interfaces is still a challenge. Since several factors, such as age- or disease-related changes in perceptual or cognitive abilities and familiarity with computer technologies, influence technology use, they must be considered when designing interfaces for these users. This paper presents findings from usability testing of two different services provided by a socially assistive robot intended for elderly people with cognitive impairment: a grocery shopping list and an agenda application. The main goal of this study is to identify the usability problems of the robot interface for target end-users as well as to isolate the human factors that affect the use of the technology by the elderly. Socio-demographic characteristics and computer experience were examined as factors that could influence task performance. A group of 11 elderly persons with Mild Cognitive Impairment and a group of 11 cognitively healthy elderly individuals took part in this study. Performance measures (task completion time and number of errors) were collected. Cognitive profile, age and computer experience were found to impact task performance. Participants with cognitive impairment completed the tasks with more errors than cognitively healthy elderly participants, whereas younger participants and those with previous computer experience were faster at completing the tasks, confirming previous findings in the literature. The overall results suggested that the interfaces and contents of the services assessed were usable by older adults with cognitive impairment. However, some usability problems were identified and should be addressed to better meet the needs and capacities of target end-users.

  1. Interactive multi-objective path planning through a palette-based user interface

    Science.gov (United States)

    Shaikh, Meher T.; Goodrich, Michael A.; Yi, Daqing; Hoehne, Joseph

    2016-05-01

    In a problem where a human uses supervisory control to manage robot path planning, the human at times does the path planning and, if satisfied, commits those paths for the robot to execute. In planning a path, the robot often uses an optimization algorithm that maximizes or minimizes an objective. When a human is assigned the task of path planning for a robot, the human may care about multiple objectives. This work proposes a graphical user interface (GUI) designed for interactive robot path planning when an operator may prefer one objective over others or care about how multiple objectives are traded off. The GUI represents multiple objectives using the metaphor of an artist's palette. A distinct color represents each objective, and tradeoffs among objectives are balanced the way an artist mixes colors to get a desired shade; human intent is thus analogous to the artist's shade of color. We call the GUI an "Adverb Palette", where "Adverb" denotes a specific type of objective for the path, such as "quickly" and "safely" in the commands "travel the path quickly" and "make the journey safely". The novel interactive interface lets the user evaluate alternatives that trade off different objectives by visualizing the instantaneous outcomes of her actions on the interface. Besides assisting analysis of the solutions given by an optimization algorithm, the palette also allows the user to define and visualize her own paths by means of waypoints (guiding locations), adding variety to the planning. The goal of the Adverb Palette is thus to provide a way for the user and robot to find an acceptable solution even though they use very different representations of the problem.
Subjective evaluations suggest that even non-experts in robotics can carry out the planning tasks with a
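
    The palette metaphor amounts to blending per-objective path costs with operator-chosen weights; the sketch below is a hypothetical weighted-sum reading of the idea, with toy cost values, not the authors' implementation.

```python
# Hedged sketch: each objective ("adverb") contributes a cost, and the
# operator's palette weights blend them into one path score, the way
# pigments blend into a single shade. All names and numbers are invented.
def blended_cost(path_costs, weights):
    """Normalized weighted sum of per-objective path costs."""
    total_w = sum(weights.values())
    return sum(weights[k] * path_costs[k] for k in weights) / total_w

# Two candidate paths scored on "quickly" (time) and "safely" (risk).
path_a = {"quickly": 10.0, "safely": 2.0}   # fast but risky
path_b = {"quickly": 14.0, "safely": 0.5}   # slower but safe

cautious = {"quickly": 1.0, "safely": 9.0}  # operator mixes mostly "safely"
print(blended_cost(path_a, cautious) > blended_cost(path_b, cautious))  # True
```

Under a cautious mix the safer path wins; shifting weight toward "quickly" would flip the preference, which is the tradeoff the palette visualizes.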

  2. Interactive education based on haptic technologies and educational testing of an innovative system

    Directory of Open Access Journals (Sweden)

    Theodore S. Papatheodorou

    2008-05-01

    Full Text Available This work presents, on the one hand, the specifications and design of an educational haptic device and an educational platform and, on the other, an educational trial of applications specially constructed to use this advanced virtual reality system. A new haptic device designed especially for educational purposes was specified and a prototype implemented within the framework of an IST European program called MUVII. This device is called the Haptic 3D Interface (H3DI). Its novelty is the tactile feedback, which provides minutely detailed information about the nature of the virtual objects handled, in addition to force and torque feedback. The device was integrated into an innovative platform called the Interactive Kiosk Demonstrator (IKD), whose aim was to demonstrate new interactive paradigms forming a novel integration of the following modalities: 3D vision, 3D audio and haptics (force, torque and tactile feedback). In addition, interactive educational software especially designed for the IKD platform was developed. The educational trial of the IKD system and the educational software then took place. All schools that participated in the trial were randomly selected. A total of 163 students participated, of whom 64 were primary school students, 74 lower-secondary school students and 25 upper-secondary school students. The trial followed internationally accepted practices for research in education, and the “exercises” for each group of students were chosen according to their age. The educational results of this teaching approach, as well as the feedback derived from the users, are presented in this work, together with some interesting results concerning important requirements in the specifications of haptic devices. Overall, we can state that the opportunity of having a natural “look and feel

  3. Haptic Feedback Compared with Visual Feedback for BCI

    OpenAIRE

    Kauhanen, L.; Palomäki, T; Jylänki, P.; Aloise, F; Nuttin, Marnix; Millán, José del R.

    2006-01-01

    Feedback plays an important role when learning to use a Brain-Computer Interface (BCI). Here we compare visual and haptic feedback in a short experiment. By imagining left and right hand movements, six subjects tried to control a BCI with the help of either visual or haptic feedback every 1s. Alpha band EEG signals from C3 and C4 were classified. The classifier was updated after each prediction using correct class information. Thus feedback could be given throughout the experiment. Subjects g...
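The per-prediction classifier update described above ("updated after each prediction using correct class information") can be sketched as an online logistic-regression step on two band-power features. The feature layout, learning rate and toy data below are illustrative assumptions, not the authors' actual classifier:

```python
import math

def predict(w, x):
    """Logistic score for one 2-feature sample (e.g. alpha-band log power at C3, C4)."""
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-z))

def update(w, x, y, lr=0.5):
    """One stochastic-gradient step once the true class label is revealed."""
    g = predict(w, x) - y          # prediction error
    return [w[0] - lr * g,
            w[1] - lr * g * x[0],
            w[2] - lr * g * x[1]]

# toy feature stream: left-hand imagery (class 1) vs right-hand imagery (class 0)
w = [0.0, 0.0, 0.0]
for x, y in [((1.0, -1.0), 1), ((-1.0, 1.0), 0)] * 50:
    w = update(w, x, y)
```

Because the update runs after every trial, feedback can be given from the first prediction onward, which is the property the study relied on.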

  4. Human factors approach to evaluate the user interface of physiologic monitoring.

    Science.gov (United States)

    Fidler, Richard; Bond, Raymond; Finlay, Dewar; Guldenring, Daniel; Gallagher, Anthony; Pelter, Michele; Drew, Barbara; Hu, Xiao

    2015-01-01

    As technology infiltrates more of our personal and professional lives, user expectations for intuitive design have driven many consumer products, while medical equipment continues to have high training requirements. Not much is known about the usability and user experience associated with hospital monitoring equipment. This pilot project aimed to better understand and describe the user interface interaction and user experience with physiologic monitoring technology. This was a prospective, descriptive, mixed-methods quality improvement project to analyze perceptions and task analyses of physiologic monitors. Following a survey of practice patterns and perceived abilities to accomplish key tasks, 10 volunteer experienced physician and nurse subjects were asked to perform a series of tasks in 7 domains of monitor operations on GE monitoring equipment in a single institution. For each task analysis, data were collected on time to complete the task, the number of button pushes or clicks required to accomplish the task, economy of motion, and observed errors. Although 60% of the participants reported incorporating monitoring data into patient care, 80% of participants preferred to receive monitoring data at the point of care (bedside). Average perceived central station usability was 5.3 out of 10 (10 being easiest). High variability existed in monitoring station interaction performance among those participating in this project. Alarms were almost universally silenced without cognitive recognition of the alarm state. Education related to monitoring operations appeared largely absent in this sample. Most users perceived the interface as not intuitive, complaining of multiple layers and steps for data retrieval. These clinicians reported real-time monitoring to be helpful for abrupt changes in condition such as arrhythmias; however, reviewing alarms was not prioritized as valuable due to frequent false alarms. Participants requested exporting monitoring data to electronic medical

  5. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    Science.gov (United States)

    Grapov, Dmitry; Newman, John W.

    2012-01-01

    Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data sets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Availability and implementation: Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010). Contact: John.Newman@ars.usda.gov Supplementary Information: Installation instructions, tutorials and a users manual are available at http://sourceforge.net/projects/imdev/. PMID:22815358
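As an illustration of the kind of multivariate step imDEV wraps, principal component analysis of 2-D data reduces to a closed-form eigendecomposition of the 2x2 sample covariance matrix. This is a minimal stand-alone sketch, not imDEV's R implementation:

```python
import math

def pca_2d(points):
    """First/second principal-component variances of 2-D points via the
    closed-form eigenvalues of the 2x2 sample covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy   # trace and determinant
    root = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + root, tr / 2.0 - root    # eigenvalues, l1 >= l2
    return l1, l2, l1 / (l1 + l2)                # variance explained by PC1
```

For perfectly collinear points the first component explains all of the variance; real omics data would of course need the full d-dimensional decomposition that R provides.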

  6. Secure Web-based Ground System User Interfaces over the Open Internet

    Science.gov (United States)

    Langston, James H.; Murray, Henry L.; Hunt, Gary R.

    1998-01-01

    A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports successful demonstration of: (1) Security over the Internet, including encryption and certification; (2) Integration of Java applets with a COTS command and control product; (3) Remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with the current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.

  7. User-tailored Inter-Widget Communication - Extending the Shared Data Interface for the Apache Wookie Engine

    NARCIS (Netherlands)

    Hoisl, Bernhard; Drachsler, Hendrik; Waglecher, Christoph

    2010-01-01

    Hoisl, B., Drachsler, H., & Waglecher, C. (2010). User-tailored Inter-Widget Communication. Extending the Shared Data Interface for the Apache Wookie Engine, International Conference on Interactive Computer Aided Learning 2010, Hasselt, Belgium.

  8. SeaView Version 4: A Multiplatform Graphical User Interface for Sequence Alignment and Phylogenetic Tree Building

    National Research Council Canada - National Science Library

    Gouy, Manolo; Guindon, Stéphane; Gascuel, Olivier

    We present SeaView version 4, a multiplatform program designed to facilitate multiple alignment and phylogenetic tree building from molecular sequence data through the use of a graphical user interface...

  9. A MATLAB Graphical User Interface Dedicated to the Optimal Design of the High Power Induction Motor with Heavy Starting Conditions

    Directory of Open Access Journals (Sweden)

    Maria Brojboiu

    2014-09-01

    Full Text Available In this paper, a Matlab graphical user interface dedicated to the optimal design of high-power induction motors with heavy starting conditions is presented. This graphical user interface allows the user to input the rated parameters and to select the induction motor type as well as the optimization criterion of the induction motor design. For the squirrel cage induction motor, the graphical user interface allows selection of the rotor bar geometry, the rotor bar material, and the fastening technology of the shorting ring on the rotor bar. The Matlab graphical user interface is developed and applied to the general optimal design program of the induction motor described in [1], [2].

  10. Designing the user interfaces of a behavior modification intervention for obesity & eating disorders prevention.

    Science.gov (United States)

    Moulos, Ioannis; Maramis, Christos; Mourouzis, Alexandros; Maglaveras, Nicos

    2015-01-01

    The recent immense diffusion of smartphones has significantly upgraded the role of mobile user interfaces in interventions that build and/or maintain healthier lifestyles. Indeed, high-quality, user-centered smartphone applications are able to serve as advanced front-ends to such interventions. These smartphone applications, coupled with portable or wearable sensors, are being employed for monitoring day-to-day health-related behaviors, including eating and physical activity. Some of them go one step further by identifying unhealthy behaviors and contributing towards their modification. This work presents the design as well as the preliminary implementation of the mobile user interface of SPLENDID, a novel, sensor-oriented intervention for preventing obesity and eating disorders in young populations. This is implemented by means of an Android application, which is able to monitor the eating and physical activity behaviors of young individuals at risk for obesity and/or eating disorders, subsequently guiding them towards the modification of those behaviors that put them at risk. Behavior monitoring is based on multiple data provided by a set of communicating sensors and self-reported information, while guidance is facilitated through a feedback/encouragement provision and goal-setting mechanism.

  11. ON THE EFFECT OF ADAPTIVE USER INTERFACES ON RELIABILITY AND EFFICIENCY OF THE AUTOMATED SYSTEMS

    Directory of Open Access Journals (Sweden)

    Yu. O. Furtat

    2014-01-01

    Full Text Available In modern automated systems, users often face the information overload problem because of ever increasing volumes of information combined with short processing-time requirements. Working in such conditions affects the quality of the system operator’s work and the system’s reliability. One possible approach to solving the information overload problem is to create personalized interfaces that take into account the particularities of the user’s information management. The operator’s features that determine the preferred form and pace of information representation form the user’s cognitive portrait. To determine the values of the portrait characteristics, either professional testing with the assistance of psychologists or operational testing at the user’s workplace is performed. The second option is preferable for use in automated systems, since it does not depend on the availability of psychologists. The cognitive portrait is then built as a result of user interaction with software diagnostic tools based on cognitive psychology methods. The effect of applying a personalized user interface in an automated system can be estimated by quantifying how the reduction in the user’s response time to critical events affects the system’s reliability and efficiency. For this purpose, formulae from reliability theory for complex automated systems are used, showing the dependence between system reliability and the user’s response time to a critical event.
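The dependence between response time and reliability can be illustrated with a deliberately simple model: if the operator's response time is exponentially distributed, the probability of reacting within the allowed time window follows in one line. The exponential assumption and the numbers below are this sketch's own, not the paper's actual formulae:

```python
import math

def timely_response_prob(t_allow, mean_response):
    """P(operator reacts within t_allow) under an exponential
    response-time model: 1 - exp(-t_allow / mean_response)."""
    return 1.0 - math.exp(-t_allow / mean_response)

# hypothetical numbers: an adaptive interface cuts mean response from 8 s to 5 s
base = timely_response_prob(10.0, 8.0)       # ~0.71
adapted = timely_response_prob(10.0, 5.0)    # ~0.86
```

Any reduction in mean response time raises the probability of a timely reaction, which is the quantity the reliability formulae feed on.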

  12. PIA: An Intuitive Protein Inference Engine with a Web-Based User Interface.

    Science.gov (United States)

    Uszkoreit, Julian; Maerkens, Alexandra; Perez-Riverol, Yasset; Meyer, Helmut E; Marcus, Katrin; Stephan, Christian; Kohlbacher, Oliver; Eisenacher, Martin

    2015-07-02

    Protein inference connects the peptide spectrum matches (PSMs) obtained from database search engines back to proteins, which are typically at the heart of most proteomics studies. Different search engines yield different PSMs and thus different protein lists. Analysis of results from one or multiple search engines is often hampered by different data exchange formats and a lack of convenient and intuitive user interfaces. We present PIA, a flexible software suite for combining PSMs from different search engine runs and turning these into consistent results. PIA can be integrated into proteomics data analysis workflows in several ways. A user-friendly graphical user interface can be run either locally or (e.g., for larger core facilities) from a central server. For automated data processing, stand-alone tools are available. PIA implements several established protein inference algorithms and can combine results from different search engines seamlessly. On several benchmark data sets, we show that PIA can identify a larger number of proteins at the same protein FDR compared to inference based on a single search engine. PIA supports the majority of established search engines and data in the mzIdentML standard format. It is implemented in Java and freely available at https://github.com/mpc-bioinformatics/pia.
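The protein-level FDR comparison mentioned above rests on the standard target-decoy estimate: the FDR at a score cutoff is approximated by the number of decoy proteins over the number of target proteins above it, converted to q-values by a running minimum. This is a generic illustration of that estimate, not PIA's actual inference algorithms:

```python
def protein_fdr(proteins):
    """Target-decoy protein FDR sketch. 'proteins' is a list of
    (best_score, is_decoy) pairs, one per inferred protein. Returns
    (score, is_decoy, q_value) tuples sorted by descending score."""
    ranked = sorted(proteins, key=lambda p: -p[0])
    fdrs, targets, decoys = [], 0, 0
    for _, is_decoy in ranked:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        fdrs.append(decoys / max(targets, 1))   # FDR estimate at this cutoff
    # q-value: smallest FDR at which the protein is still accepted
    out, q = [], float("inf")
    for (score, is_decoy), f in zip(reversed(ranked), reversed(fdrs)):
        q = min(q, f)
        out.append((score, is_decoy, q))
    return list(reversed(out))
```

Combining PSMs from several engines before this step tends to raise target scores relative to decoys, which is how more proteins can pass the same FDR threshold.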

  13. Usability Studies on Mobile User Interface Design Patterns: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Lumpapun Punchoojit

    2017-01-01

    Full Text Available Mobile platforms have called for attention from HCI practitioners, and, ever since 2007, touchscreens have completely changed mobile user interface and interaction design. Some notable differences between mobile devices and desktops include the lack of tactile feedback, ubiquity, limited screen size, small virtual keys, and high demand on visual attention. These differences have posed unprecedented challenges to users. Most mobile user interface designs are based on the desktop paradigm, but desktop designs do not fully fit the mobile context. Although mobile devices are becoming an indispensable part of daily life, true standards for mobile UI design patterns do not exist. This article provides a systematic literature review of the existing studies on mobile UI design patterns. The first objective is to give an overview of recent studies on mobile designs. The second objective is to provide an analysis of which topics or areas have insufficient information and which factors are concentrated upon. This article will benefit the HCI community by providing an overview of present work and helping to shape future research directions.

  14. Computer-aided fit testing: an approach for examining the user/equipment interface

    Science.gov (United States)

    Corner, Brian D.; Beecher, Robert M.; Paquette, Steven

    1997-03-01

    Developments in laser digitizing technology now make it possible to capture very accurate 3D images of the surface of the human body in less than 20 seconds. Applications for the images range from animation of movie characters to the design and visualization of clothing and individual equipment (CIE). In this paper we focus on modeling the user/equipment interface. Defining the relative geometry between user and equipment provides a better understanding of equipment performance, and can make the design cycle more efficient. Computer-aided fit testing (CAFT) is the application of graphical and statistical techniques to visualize and quantify the human/equipment interface in virtual space. In short, CAFT looks to measure the relative geometry between a user and his or her equipment. The design cycle changes with the introduction of CAFT; now some evaluation may be done in the CAD environment prior to prototyping. CAFT may be applied in two general ways: (1) to aid in the creation of new equipment designs and (2) to evaluate current designs for compliance to performance specifications. We demonstrate the application of CAFT with two examples. First, we show how a prototype helmet may be evaluated for fit, and second, we demonstrate how CAFT may be used to measure body armor coverage.

  15. Recognizing the Operating Hand and the Hand-Changing Process for User Interface Adjustment on Smartphones

    Directory of Open Access Journals (Sweden)

    Hansong Guo

    2016-08-01

    Full Text Available As the size of smartphone touchscreens has become larger and larger in recent years, operability with a single hand is getting worse, especially for female users. We envision that user experience can be significantly improved if smartphones are able to recognize the current operating hand, detect the hand-changing process and then adjust the user interfaces accordingly. In this paper, we proposed, implemented and evaluated two novel systems. The first one leverages the user-generated touchscreen traces to recognize the current operating hand, and the second one utilizes the accelerometer and gyroscope data of all kinds of activities in the user’s daily life to detect the hand-changing process. These two systems are based on two supervised classifiers constructed from a series of refined touchscreen trace, accelerometer and gyroscope features. As opposed to existing solutions that all require users to select the current operating hand or confirm the hand-changing process manually, our systems follow much more convenient and practical methods and allow users to change the operating hand frequently without any harm to the user experience. We conduct extensive experiments on Samsung Galaxy S4 smartphones, and the evaluation results demonstrate that our proposed systems can recognize the current operating hand and detect the hand-changing process with 94.1% and 93.9% precision and 94.1% and 93.7% True Positive Rates (TPR), respectively, when deciding with a single touchscreen trace or accelerometer-gyroscope data segment, and the False Positive Rates (FPR) are as low as 2.6% and 0.7%, respectively. These two systems can either work completely independently and achieve high accuracies or work jointly to further improve the recognition accuracy.
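The precision, TPR and FPR figures quoted above follow from the standard confusion-matrix definitions; the counts in the example below are made up for illustration and are not the study's data:

```python
def detection_rates(tp, fp, tn, fn):
    """Precision, true positive rate (recall) and false positive rate
    from raw confusion-matrix counts."""
    precision = tp / (tp + fp)
    tpr = tp / (tp + fn)      # sensitivity / recall
    fpr = fp / (fp + tn)      # fraction of negatives mis-flagged
    return precision, tpr, fpr

# hypothetical counts for a hand-changing detector
precision, tpr, fpr = detection_rates(tp=90, fp=10, tn=95, fn=5)
```

Reporting both TPR and FPR matters here because a hand-changing detector sees far more "no change" segments than changes, so precision alone would hide false alarms.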

  16. Haptic rendering foundations, algorithms, and applications

    CERN Document Server

    Lin, Ming C

    2008-01-01

    For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments. This book provides an authoritative overview of state-of-the-art haptic rendering algorithms and their applications. The authors examine various approaches and techniques for designing touch-enabled interfaces for a number of applications, including medical training, model design, and maintainability analysis for virtual prototyping, scienti

  17. Using a Log Analyser to Assist Research into Haptic Technology

    Science.gov (United States)

    Jónsson, Fannar Freyr; Hvannberg, Ebba Þóra

    Usability evaluations collect subjective and objective measures; examples of the latter include time to complete a task. The paper describes use cases of a log analyser for haptic feedback. The log analyser reads a log file and extracts information such as the time of each practice and assessment session, analyses whether the user goes off curve, and measures the force applied. A case study using the analyser was performed with a PHANToM haptic learning environment application used to teach young visually impaired students the subject of polynomials. The paper answers six questions to illustrate further use cases of the log analyser.
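The analyser's core job, reading a log and extracting session timings, applied force and off-curve events, can be sketched as follows. The line format here is a hypothetical one invented for the sketch; the actual PHANToM log format is not specified in the abstract:

```python
# hypothetical line format: "<session_kind> <start_s> <end_s> <max_force_N> <off_curve_flag>"
def analyse(lines):
    """Parse log lines into per-session records with derived durations."""
    sessions = []
    for line in lines:
        kind, start, end, force, off = line.split()
        sessions.append({
            "kind": kind,                              # practice or assessment
            "duration": float(end) - float(start),     # objective time measure
            "max_force": float(force),                 # peak force applied
            "off_curve": off == "1",                   # did the user leave the curve?
        })
    return sessions
```

From such records the evaluator can compare practice against assessment sessions without re-running any trials.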

  18. Determinants of user acceptance of a specific social platform for older adults: An empirical examination of user interface characteristics and behavioral intention

    Science.gov (United States)

    Tsai, Tsai-Hsuan; Chang, Hsien-Tsung; Chen, Yan-Jiun; Chang, Yung-Sheng

    2017-01-01

    The use of the Internet and social applications has many benefits for the elderly, but numerous investigations have shown that the elderly do not perceive online social networks as a friendly social environment. Therefore, TreeIt, a social application specifically designed for the elderly, was developed for this study. In the TreeIt application, seven mechanisms promoting social interaction were designed to allow older adults to use social networking sites (SNSs) to increase social connection, maintain the intensity of social connections and strengthen social experience. This study’s main objective was to investigate how user interface design affects older people’s intention and attitude related to using SNSs. Fourteen user interface evaluation heuristics proposed by Zhang et al. were adopted as the criteria to assess user interface usability and further grouped into three categories: system support, user interface design and navigation. The technology acceptance model was adopted to assess older people’s intention and attitude related to using SNSs. One hundred and one elderly persons were enrolled in this study as subjects, and the results showed that all of the hypotheses proposed in this study were valid: system support and perceived usefulness had a significant effect on behavioral intention; user interface design and perceived ease of use were positively correlated with perceived usefulness; and navigation exerted an influence on perceived ease of use. The results of this study are valuable for the future development of social applications for the elderly. PMID:28837566

  1. Design and usability study of an iconic user interface to ease information retrieval of medical guidelines.

    Science.gov (United States)

    Griffon, Nicolas; Kerdelhué, Gaétan; Hamek, Saliha; Hassler, Sylvain; Boog, César; Lamy, Jean-Baptiste; Duclos, Catherine; Venot, Alain; Darmoni, Stéfan J

    2014-10-01

    Doc'CISMeF (DC) is a semantic search engine used to find resources in CISMeF-BP, a quality-controlled health gateway that gathers guidelines available on the internet in French. Visualization of Concepts in Medicine (VCM) is an iconic language that may ease information retrieval tasks. This study aimed to describe the creation and evaluation of an interface integrating VCM in DC in order to make this search engine easier to use. Focus groups were organized to suggest ways to enhance information retrieval tasks using VCM in DC. A VCM interface was created and improved using the ergonomic evaluation approach. 20 physicians were recruited to compare the VCM interface with the non-VCM one. Each evaluator answered two different clinical scenarios in each interface. The ability and time taken to select a relevant resource were recorded and compared. A usability analysis was performed using the System Usability Scale (SUS). The VCM interface contains a filter based on icons, along with icons describing each resource, in line with focus group recommendations. Some ergonomic issues were resolved before evaluation. Use of VCM significantly increased the success of information retrieval tasks (OR=11; 95% CI 1.4 to 507). Nonetheless, it took significantly more time to find a relevant resource with the VCM interface (101 vs 65 s; p=0.02). SUS revealed 'good' usability with an average score of 74/100. VCM was successfully implemented in DC as an option. It increased the success rate of information retrieval tasks, despite requiring slightly more time, and was well accepted by end-users. Published by the BMJ Publishing Group Limited.
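The reported effect (OR=11; 95% CI 1.4 to 507) is an odds ratio whose very wide, asymmetric interval is characteristic of the logit-scale computation. The calculation can be sketched as below; the 2x2 counts are invented for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 outcome table (a, b = success/failure with one
    interface; c, d = success/failure with the other) plus a Woolf logit-scale
    confidence interval, which is symmetric in log(OR) but not in OR."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# invented counts: 20/5 success/failure with VCM, 10/15 without
or_, lo, hi = odds_ratio_ci(20, 5, 10, 15)
```

With small cell counts the standard error of log(OR) is large, which is why an interval like 1.4 to 507 can still exclude 1 and remain significant.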

  2. A data mining technique for discovering distinct patterns of hand signs: implications in user training and computer interface design.

    Science.gov (United States)

    Ye, Nong; Li, Xiangyang; Farley, Toni

    2003-01-15

    Hand signs are considered one of the important ways to enter information into computers for certain tasks. Computers receive sensor data of hand signs for recognition. When using hand signs as computer inputs, we need to (1) train computer users in the sign language so that their hand signs can be easily recognized by computers, and (2) design the computer interface to avoid the use of confusing signs, thereby improving user input performance and user satisfaction. For user training and computer interface design, it is important to know which signs can be easily recognized by computers and which signs are not distinguishable by computers. This paper presents a data mining technique to discover distinct patterns of hand signs from sensor data. Based on these patterns, we derive groups of signs that are indistinguishable by computers. Such information can in turn assist in user training and computer interface design.
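One way to derive such groups of indistinguishable signs, given pairwise confusion rates between signs, is to merge any pair whose confusion exceeds a threshold with a small union-find. This is an illustrative stand-in for the paper's pattern-mining step, with made-up sign names and rates:

```python
def indistinguishable_groups(signs, confusion, threshold=0.2):
    """Group signs whose pairwise confusion rate exceeds a threshold.
    'confusion[(a, b)]' is the fraction of times sign a is recognized
    as sign b. Uses a tiny union-find with path halving."""
    parent = {s: s for s in signs}

    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]   # path halving
            s = parent[s]
        return s

    for (a, b), rate in confusion.items():
        if rate > threshold:
            parent[find(a)] = find(b)       # merge confusable signs

    groups = {}
    for s in signs:
        groups.setdefault(find(s), []).append(s)
    return [sorted(g) for g in groups.values()]
```

Signs that end up in a multi-member group are candidates for extra user training or for exclusion from the interface vocabulary.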

  3. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods that are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken: that of user-centered design, the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and, if necessary, adapted until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered an ischemic brain stem stroke, leading to severe motor and communication deficits. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in the literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  4. Monitoring and controlling ATLAS data management: The Rucio web user interface

    Science.gov (United States)

    Lassnig, M.; Beermann, T.; Vigne, R.; Barisits, M.; Garonne, V.; Serfon, C.

    2015-12-01

    The monitoring and controlling interfaces of the previous data management system, DQ2, followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2 and the increased volume of managed information. This interface encompasses both a monitoring and a controlling component, and allows easy integration of user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency. This includes the use of technologies like ActiveMQ or Nagios. Second, analysis of the data into information is done massively parallel due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human or programmatic access, making it easy to access selective parts of the information both from constrained front-ends such as web browsers and from remote services. This contribution will detail the reasons for these principles and the design choices taken. Additionally, the implementation, the interactions with external systems, and an evaluation of the system in production, both from a technological and a user perspective, conclude this contribution.

  5. Phast4Windows: A 3D graphical user interface for the reactive-transport simulator PHAST

    Science.gov (United States)

    Charlton, Scott R.; Parkhurst, David L.

    2013-01-01

    Phast4Windows is a Windows® program for developing and running groundwater-flow and reactive-transport models with the PHAST simulator. This graphical user interface allows definition of grid-independent spatial distributions of model properties—the porous media properties, the initial head and chemistry conditions, boundary conditions, and locations of wells, rivers, drains, and accounting zones—and other parameters necessary for a simulation. Spatial data can be defined without reference to a grid by drawing, by point-by-point definitions, or by importing files, including ArcInfo® shape and raster files. All definitions can be inspected, edited, deleted, moved, copied, and switched from hidden to visible through the data tree of the interface. Model features are visualized in the main panel of the interface, so that it is possible to zoom, pan, and rotate features in three dimensions (3D). PHAST simulates single phase, constant density, saturated groundwater flow under confined or unconfined conditions. Reactions among multiple solutes include mineral equilibria, cation exchange, surface complexation, solid solutions, and general kinetic reactions. The interface can be used to develop and run simple or complex models, and is ideal for use in the classroom, for analysis of laboratory column experiments, and for development of field-scale simulations of geochemical processes and contaminant transport.

  6. Graphic User Interface for Monte Carlo Simulation of Ferromagnetic/Antiferromagnetic Manganite Bilayers

    Directory of Open Access Journals (Sweden)

    Hector Barco-Ríos

    2011-06-01

    Full Text Available Manganites have been widely studied because of important properties such as colossal magnetoresistance and exchange bias, phenomena used in many technological applications. For this reason, this work presents a study of the exchange bias effect in La2/3Ca1/3MnO3/La1/3Ca2/3MnO3 bilayers. The study was carried out using the Monte Carlo method and the Metropolis algorithm. To facilitate the study, a graphical user interface was built, allowing friendly interaction. The interface permits control of the thicknesses of the ferromagnetic and antiferromagnetic layers, the temperature, the magnetic field, the number of Monte Carlo steps, and the exchange parameters. The results obtained reflect the influence of all of these parameters on the exchange bias and coercive fields.
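The Metropolis procedure behind such a simulation can be sketched on a toy Ising-spin bilayer. The lattice size, exchange constants, and temperature below are illustrative assumptions, not the parameters of the paper:

```python
import math
import random

random.seed(1)

# Toy Ising-spin bilayer: rows 0-1 form the ferromagnetic (FM) layer (J > 0),
# rows 2-3 the antiferromagnetic (AFM) layer (J < 0). All values illustrative.
L, J_FM, J_AFM, J_INT = 8, 1.0, -1.0, 0.5
spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(4)]

def coupling(y1, y2):
    # Exchange constant for a bond between rows y1 <= y2.
    if y2 <= 1:
        return J_FM      # bond inside the FM layer
    if y1 >= 2:
        return J_AFM     # bond inside the AFM layer
    return J_INT         # bond across the FM/AFM interface

def local_field(x, y):
    # Sum of J * neighbour spins felt by site (x, y); periodic in x only.
    h = 0.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = (x + dx) % L, y + dy
        if 0 <= ny < 4:
            h += coupling(min(y, ny), max(y, ny)) * spins[ny][nx]
    return h

def total_energy(H=0.0):
    # E = -sum_bonds J s_i s_j - H sum_i s_i (each bond counted once).
    E = 0.0
    for y in range(4):
        for x in range(L):
            E -= coupling(y, y) * spins[y][x] * spins[y][(x + 1) % L]
            if y + 1 < 4:
                E -= coupling(y, y + 1) * spins[y][x] * spins[y + 1][x]
            E -= H * spins[y][x]
    return E

def metropolis_sweep(T, H=0.0):
    # One Monte Carlo sweep: 4*L single-spin-flip Metropolis trials.
    for _ in range(4 * L):
        x, y = random.randrange(L), random.randrange(4)
        dE = 2.0 * spins[y][x] * (local_field(x, y) + H)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[y][x] *= -1  # accept the flip

E0 = total_energy()
for _ in range(200):
    metropolis_sweep(T=0.5)
E1 = total_energy()
print(E1 < E0)  # True: the sampler relaxes toward low-energy states
```

Sweeping the external field H up and down while measuring the FM-layer magnetization is what would trace out the hysteresis loops from which exchange bias and coercive fields are read off.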

  7. Profex: a graphical user interface for the Rietveld refinement program BGMN.

    Science.gov (United States)

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-10-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.

  8. GRAPHICAL USER INTERFACE WITH APPLICATIONS IN SUSCEPTIBLE-INFECTIOUS-SUSCEPTIBLE MODELS.

    Science.gov (United States)

    Ilea, M; Turnea, M; Arotăriţei, D; Rotariu, Mariana; Popescu, Marilena

    2015-01-01

    Practical significance of understanding the dynamics and evolution of infectious diseases increases continuously in the contemporary world. The mathematical study of the dynamics of infectious diseases has a long history. By incorporating statistical methods and computer-based simulations in dynamic epidemiological models, modeling methods and theoretical analyses can become more realistic and reliable, allowing a more detailed understanding of the rules governing epidemic spreading. To provide the basis for modeling disease transmission, the population of a region is often divided into various compartments, and the model governing their relations is called a compartmental model. To present all of the information available, a graphical user interface provides icons and visual indicators. The graphical interface shown in this paper is implemented using MATLAB software ver. 7.6.0. MATLAB offers a wide range of techniques by which data can be displayed graphically. The process of data viewing involves a series of operations; to achieve it, I had to make three separate files, one defining the mathematical model and two for the interface itself. Considering a fixed population, it is observed that the number of susceptible individuals diminishes as the number of infectious individuals increases, so that in about ten days the numbers of infected and susceptible individuals are equal. If the epidemic is not controlled, it will continue for an indefinite period of time. By changing the global parameters specific to the SIS model, a more rapid increase in infectious individuals is noted. Using the graphical user interface shown in this paper helps achieve much easier interaction with the computer, simplifying the structure of complex instructions by using icons and menus; in particular, programs and files are much easier to organize. Some numerical simulations have been presented to illustrate theoretical
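The SIS dynamics described above can be reproduced with a few lines of forward-Euler integration. The transmission and recovery rates below are illustrative values, not the parameters used in the paper's MATLAB interface:

```python
# Forward-Euler integration of the SIS compartmental model:
#   dS/dt = -beta*S*I/N + gamma*I
#   dI/dt =  beta*S*I/N - gamma*I
# Recovered individuals return to the susceptible compartment (no immunity).
def simulate_sis(S0, I0, beta, gamma, days, dt=0.01):
    N = S0 + I0                      # fixed total population
    S, I = float(S0), float(I0)
    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N   # S -> I flow
        new_rec = gamma * I          # I -> S flow
        S += (-new_inf + new_rec) * dt
        I += (new_inf - new_rec) * dt
    return S, I

S, I = simulate_sis(S0=990, I0=10, beta=0.5, gamma=0.1, days=100)
# Endemic equilibrium: I* = N * (1 - gamma/beta) = 1000 * 0.8 = 800
print(round(I))  # 800
```

Because the two flows cancel exactly, S + I stays constant, matching the fixed-population assumption of the abstract; since beta > gamma here, the infection settles at the endemic equilibrium instead of dying out.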

  9. Improving the Usability of the User Interface for a Digital Textbook Platform for Elementary-School Students

    Science.gov (United States)

    Lim, Cheolil; Song, Hae-Deok; Lee, Yekyung

    2012-01-01

    Usability is critical to the development of a user-friendly digital textbook platform interface, yet thorough research on interface development based on usability principles is in short supply. This study addresses that need by looking at usability attributes and corresponding design elements from a learning perspective. The researchers used a…

  10. "I Want My Robot to Look for Food": Comparing Kindergartner's Programming Comprehension Using Tangible, Graphic, and Hybrid User Interfaces

    Science.gov (United States)

    Strawhacker, Amanda; Bers, Marina U.

    2015-01-01

    In recent years, educational robotics has become an increasingly popular research area. However, limited studies have focused on differentiated learning outcomes based on type of programming interface. This study aims to explore how successfully young children master foundational programming concepts based on the robotics user interface (tangible,…

  11. Designing with the mind in mind simple guide to understanding user interface design guidelines

    CERN Document Server

    Johnson, Jeff

    2014-01-01

    In this completely updated and revised edition of Designing with the Mind in Mind, Jeff Johnson provides you with just enough background in perceptual and cognitive psychology that user interface (UI) design guidelines make intuitive sense rather than being just a list of rules to follow. Early UI practitioners were trained in cognitive psychology, and developed UI design rules based on it. But as the field has evolved since the first edition of this book, designers enter the field from many disciplines. Practitioners today have enough experience in UI design that they have been exposed to

  12. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    Full Text Available The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  13. The Wiimote and beyond: spatially convenient devices for 3D user interfaces.

    Science.gov (United States)

    Wingrave, Chadwick A; Williamson, Brian; Varcholik, Paul D; Rose, Jeremy; Miller, Andrew; Charbonneau, Emiko; Bott, Jared; LaViola, Joseph J

    2010-01-01

    The Nintendo Wii Remote (Wiimote) has served as an input device in 3D user interfaces (3DUIs) but differs from the general-purpose input hardware typically found in research labs and commercial applications. Despite this, no one has systematically evaluated the device in terms of what it offers 3DUI designers. Experience with the Wiimote indicates that it's an imperfect harbinger of a new class of spatially convenient devices, classified in terms of spatial data, functionality, and commodity design. This tutorial presents techniques for using the Wiimote in 3DUIs. It discusses the device's strengths and how to compensate for its limitations, with implications for future spatially convenient devices.

  14. Advanced graphical user interface for multi-physics simulations using AMST

    Science.gov (United States)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code has been the lack of a graphical user interface (GUI), meaning that all pre-processing had to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.
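The pre-processor's job is essentially to collect the coupled-solver settings a user enters in the GUI and serialize them into a case file. The parameter names below are hypothetical, and JSON is used here as a standard-library stand-in for the hierarchical HDF5 layout AMST actually reads:

```python
import json
import os
import tempfile

# Hypothetical settings a GUI might gather for a coupled XDEM/CFD/FEA run.
# The real AMST input is an HDF5 file; this JSON round-trip just illustrates
# the save/load cycle with a stand-in for HDF5 groups and datasets.
case = {
    "xdem": {"particle_count": 5000, "particle_diameter_m": 0.002},
    "cfd":  {"solver": "simple", "time_step_s": 1e-4},
    "fea":  {"material": "steel", "youngs_modulus_Pa": 2.1e11},
}

path = os.path.join(tempfile.mkdtemp(), "case_setup.json")
with open(path, "w") as f:
    json.dump(case, f, indent=2)       # the GUI's "Save case" action

with open(path) as f:                  # solver-side read-back
    loaded = json.load(f)
print(loaded["xdem"]["particle_count"])  # 5000
```

With h5py, each top-level key would instead become an HDF5 group and each scalar a dataset or attribute, but the collect-validate-write flow is the same.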

  15. MixPlore: A Cocktail-Based Media Performance Using Tangible User Interfaces

    Science.gov (United States)

    Lee, Zune; Chang, Sungkyun; Lim, Chang Young

    This paper presents MixPlore, a framework for a cocktail-based live media performance. It aims to maximize the pleasure of mixology, presenting the cocktail as a rich artistic medium in which people can fully enjoy new synesthetic content through the integration of bartending and musical creation. For this, we fabricated Tangible User Interfaces (TUIs): tin, glass, muddler, costume display, etc. The basic idea of the performance and music composition is to follow the process of making cocktails. At the end of every repertoire, the performer serves the resulting 'sonic cocktails' to the audience.

  16. Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array

    Science.gov (United States)

    Sadeh, I.; Oya, I.; Schwarz, J.; Pietriga, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes; consequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction (HCI). The prototype is based on Web technology; it incorporates a Python web server, Web Sockets, and graphics generated with the d3.js Javascript library.
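The request/response half of such an architecture, a Python web server handing array status to a browser frontend, can be sketched with the standard library alone. The endpoint and status fields are hypothetical, and the prototype's Web Socket push channel is not reproduced here:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical status payload a d3.js frontend might poll and render.
STATUS = {"telescopes_online": 99, "array": "south"}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the current status as JSON for any requested path.
        body = json.dumps(STATUS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass                      # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), StatusHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/status" % server.server_port
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data["telescopes_online"])  # 99
```

In the actual prototype a Web Socket would push state changes to the browser instead of the browser polling, which matters at CTA's scale of roughly a hundred telescopes reporting concurrently.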

  17. Mobile health IT: The effect of user interface and form factor on doctor-patient communication

    DEFF Research Database (Denmark)

    Alsos, Ole Andreas; Das, Anita; Svanæs, Dag

    2012-01-01

    a paper chart, a PDA, and a laptop mounted on a trolley. Video recordings from the simulations were analyzed qualitatively. Interviews with clinicians and patients were used to triangulate the findings and to verify the realism and results of the simulations. Result The paper chart afforded smooth re-establishment of eye contact, better verbal and non-verbal contact, more gesturing, good visibility of actions, and quick information retrieval. The digital information devices lacked many of these affordances; physicians' actions were not visible to the patients, and the user interfaces required much attention...

  18. jQuery UI 1.7 the user interface library for jQuery

    CERN Document Server

    Wellman, Dan

    2009-01-01

    An example-based approach leads you step-by-step through the implementation and customization of each library component and its associated resources in turn. To emphasize the way that jQuery UI takes the difficulty out of user interface design and implementation, each chapter ends with a 'fun with' section that puts together what you've learned throughout the chapter to make a usable and fun page. In these sections you'll often get to experiment with the latest associated technologies like AJAX and JSON. This book is for front-end designers and developers who need to quickly learn how to use t

  19. Designing Social Interfaces Principles, Patterns, and Practices for Improving the User Experience

    CERN Document Server

    Crumlish, Christian

    2009-01-01

    From the creators of Yahoo!'s Design Pattern Library, Designing Social Interfaces provides you with more than 100 patterns, principles, and best practices, along with salient advice for many of the common challenges you'll face when starting a social website. Designing sites that foster user interaction and community-building is a valuable skill for web developers and designers today, but it's not that easy to understand the nuances of the social web. Now you have help. Christian Crumlish and Erin Malone share hard-won insights into what works, what doesn't, and why. You'll learn how to bala

  20. Law of the cloud: on the supremacy of the user interface over copyright law

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2013-07-01

    Full Text Available Cloud computing technologies are commonly used for delivering content or information to users who no longer need to store this data on their own devices. This is likely to have an important impact on the effectiveness of copyright law in the context of online applications, insofar as the underlying infrastructure of the cloud allows cloud operators to control the manner in which, and the extent to which, users can exploit such content - regardless of whether it is protected by copyright law or has already fallen into the public domain. This article analyses the extent to which the provisions of copyright law can be bypassed by cloud computing applications whose interfaces are designed to regulate the access, use and reuse of online content, and how these online applications can be used to establish private regimes of regulation that often go beyond the scope of the traditional copyright regime.