WorldWideScience

Sample records for haptic user interfaces

  1. FGB: A Graphical and Haptic User Interface for Creating Graphical, Haptic User Interfaces

    International Nuclear Information System (INIS)

    ANDERSON, THOMAS G.; BRECKENRIDGE, ARTHURINE; DAVIDSON, GEORGE S.

    1999-01-01

    The emerging field of haptics represents a fundamental change in human-computer interaction (HCI), and presents solutions to problems that are difficult or impossible to solve with a two-dimensional, mouse-based interface. To take advantage of the potential of haptics, however, innovative interaction techniques and programming environments are needed. This paper describes FGB (FLIGHT GHUI Builder), a programming tool that can be used to create an application-specific graphical and haptic user interface (GHUI). FGB is itself a graphical and haptic user interface with which a programmer can intuitively create and manipulate components of a GHUI in real time in a graphical environment through the use of a haptic device. The programmer can create a GHUI without writing any programming code. After a user interface is created, FGB writes the appropriate programming code to a file, using the FLIGHT API, to recreate what the programmer created in the FGB interface. FGB saves programming time and increases productivity, because a programmer can see the end result as it is created, and FGB does much of the programming itself. Interestingly, as FGB was created, it was used to help build itself. The further FGB was in its development, the more easily and quickly it could be used to create additional functionality and improve its own design. As a finished product, FGB can be used to recreate itself in much less time than it originally required, and with much less programming. This paper describes FGB's GHUI components, the techniques used in the interface, how the output code is created, where programming additions and modifications should be placed, and how it can be compared to and integrated with existing APIs such as MFC and Visual C++, OpenGL, and GHOST.

  2. Development of a wearable haptic game interface

    Directory of Open Access Journals (Sweden)

    J. Foottit

    2016-04-01

    Full Text Available This paper outlines the ongoing development of a wearable haptic game interface, in this case for controlling a flight simulator. The device differs from many traditional haptic feedback implementations in that it combines vibrotactile feedback with gesture-based input, thus becoming a two-way conduit between the user and the virtual environment. The device is intended to challenge what is considered an “interface” and sets out to purposefully blur the boundary between man and machine. This allows for a more immersive experience, and a user evaluation shows that the intuitive interface lets the user feel as though they “become” the aircraft, which is controlled by the movements of their hand.

  3. Contribution to the modeling and the identification of haptic interfaces

    International Nuclear Information System (INIS)

    Janot, A.

    2007-12-01

    This thesis focuses on the modeling and the identification of haptic interfaces using cable drives. A haptic interface is a force feedback device that enables its user to interact with a virtual world or a remote environment explored by a slave system. It aims at matching the forces and displacements given by the user with those applied to the virtual world. Usually, haptic interfaces make use of an actuated mechanical structure whose distal link is equipped with a handle. When manipulating this handle to interact with the explored world, the user feels the apparent mass, compliance and friction of the interface. This distortion introduced between the operator and the virtual world must be modeled and identified to enhance the design of the interface and to develop appropriate control laws. The first approach has been to adapt the modeling and identification methods of rigid robots with localized flexibilities to haptic interfaces. The identification technique makes use of the inverse dynamic model and linear least squares with measurements of joint torques and positions. This approach is validated on a single-degree-of-freedom and a three-degree-of-freedom haptic device. A new identification method needing only torque data is proposed. It is based on a closed-loop simulation using the direct dynamic model. The optimal parameters minimize the 2-norm of the error between the actual torque and the simulated torque, assuming the same control law and the same tracking trajectory. This nonlinear least squares problem is dramatically simplified by using the inverse model to calculate the simulated torque. This method is validated on the single-degree-of-freedom haptic device and a SCARA robot. (author)
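
    The identification scheme described above, an inverse dynamic model linear in the parameters fitted by linear least squares, can be sketched for a hypothetical single-DOF device with inertia, viscous friction, and Coulomb friction (the model terms and all values below are illustrative, not taken from the thesis):

```python
import numpy as np

def identify_single_dof(qd, qdd, tau):
    """Least-squares identification of a 1-DOF haptic device.

    Assumes an inverse dynamic model linear in the parameters
    (no gravity term, e.g. a counterbalanced axis):
        tau = I*qdd + Fv*qd + Fc*sign(qd)
    with inertia I, viscous friction Fv, and Coulomb friction Fc.
    """
    # Regressor matrix: one column per unknown parameter.
    W = np.column_stack([qdd, qd, np.sign(qd)])
    theta, *_ = np.linalg.lstsq(W, tau, rcond=None)
    return theta  # [I, Fv, Fc]
```

    With exact synthetic data the recovered parameters match the generating ones; on real measurements, joint positions are typically filtered and differentiated to obtain the velocities and accelerations before building the regressor.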

  4. Improved haptic interface for colonoscopy simulation.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2007-01-01

    This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors, which trigger computation to render accurate graphic images corresponding to the rotation of the angle knob. Tack switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction, and the corresponding deformation of the colon.

  5. Video Game Device Haptic Interface for Robotic Arc Welding

    Energy Technology Data Exchange (ETDEWEB)

    Corrie I. Nichol; Milos Manic

    2009-05-01

    Recent advances in technology for video games have made a broad array of haptic feedback devices available at low cost. This paper presents a bi-manual haptic system to enable an operator to weld remotely, using a commercially available haptic feedback video game device as the user interface. The system showed good performance in initial tests, demonstrating the utility of low cost input devices for remote haptic operations.

  6. Haptic interface of the KAIST-Ewha colonoscopy simulator II.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2008-11-01

    This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the result shows that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.

  7. Haptic and Visual feedback in 3D Audio Mixing Interfaces

    DEFF Research Database (Denmark)

    Gelineck, Steven; Overholt, Daniel

    2015-01-01

    This paper describes the implementation and informal evaluation of a user interface that explores haptic feedback for 3D audio mixing. The implementation compares different approaches using either the LEAP Motion for mid-air hand gesture control, or the Novint Falcon for active haptic feedback...

  8. Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.

    Science.gov (United States)

    Perdigão, Luís M A; Saywell, Alex

    2011-07-01

    The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room temperature STM imaging of C(60) molecules adsorbed on an Au(111) surface in ultra-high vacuum.
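
    The current-to-force conversion described above can be sketched as follows. The logarithmic mapping (natural for a tunneling current that varies exponentially with tip-surface distance), the setpoint, gain, and force limit are all assumptions for illustration, not the paper's implementation:

```python
import math

def current_to_force(i_tunnel, i_setpoint=1e-9, gain=0.5, f_max=3.0):
    """Map an STM tunneling current (A) to a vertical haptic force (N).

    A logarithmic mapping is assumed because the tunneling current varies
    exponentially with tip-surface distance; the force is clipped to the
    device's maximum. All constants are illustrative.
    """
    f = gain * math.log10(i_tunnel / i_setpoint)
    return max(-f_max, min(f_max, f))
```

    At the setpoint current the stylus feels no force; currents above the setpoint push back on the user, closing the human-in-the-loop feedback on the tip-surface distance.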

  9. Dynamics modeling for parallel haptic interfaces with force sensing and control.

    Science.gov (United States)

    Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy

    2013-01-01

    Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications on force and impedance control for parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
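
    The idea of building a multidimensional force vector from 1-D rod sensors can be sketched as follows: each sensor reads a scalar along its rod axis, and stacking the rod direction vectors and attachment points yields the net force and moment on the manipulandum. The function below is a generic illustration, not the CU device's actual geometry:

```python
import numpy as np

def resultant_wrench(rod_dirs, rod_forces, attach_pts):
    """Combine scalar rod-sensor readings into a net force and moment.

    rod_dirs:    (n, 3) unit vectors along each actuating rod.
    rod_forces:  (n,)   scalar readings of the 1-D force sensors.
    attach_pts:  (n, 3) rod attachment points w.r.t. the handle origin.
    The geometry here is generic, not the CU interface's.
    """
    f_vecs = rod_dirs * rod_forces[:, None]            # per-rod force vectors
    force = f_vecs.sum(axis=0)                         # net translational force
    moment = np.cross(attach_pts, f_vecs).sum(axis=0)  # net moment about origin
    return force, moment
```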

  10. Force Control and Nonlinear Master-Slave Force Profile to Manage an Admittance Type Multi-Fingered Haptic User Interface

    Energy Technology Data Exchange (ETDEWEB)

    Anthony L. Crawford

    2012-08-01

    Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in remote and/or hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space, to name a few. To this end, the research presented in this paper has developed an admittance-type, exoskeleton-like, multi-fingered haptic hand user interface that secures the user’s palm and provides 3-dimensional force feedback to the user’s fingertips. Unlike conventional haptic hand user interfaces, which integrate the human hand’s characteristics only into the system’s mechanical design, this system also carries that inspiration into the user interface’s controller. This is achieved by expressing the property differences between manipulation and grasping activities, as they pertain to the human hand, as a nonlinear master-slave force relationship. The results presented in this paper show that the admittance-type system has sufficient bandwidth to appear nearly transparent to the user in free motion, and that, when the system is subjected to a manipulation task, increased performance is achieved using the nonlinear force relationship compared to the traditional linear scaling techniques implemented in the vast majority of systems.
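
    A minimal sketch of such a nonlinear master-slave force relationship, with a near-unity gain in the light-force (manipulation) regime and a larger gain beyond a transition force (grasping regime), might look like the following; the transition point and gains are invented for illustration, not the paper's values:

```python
def slave_force(master_force, f_transition=5.0, k_lin=1.0, k_grasp=4.0):
    """Nonlinear master-slave force mapping (illustrative).

    Below a transition force, typical of fine manipulation, the mapping is
    near-unity for transparency; above it, typical of power grasping, the
    incremental gain increases toward a larger scale factor. The mapping is
    continuous at the transition point.
    """
    if master_force <= f_transition:
        return k_lin * master_force
    # Beyond the transition, extra master force is amplified more strongly.
    excess = master_force - f_transition
    return k_lin * f_transition + k_grasp * excess
```

    A piecewise-linear blend like this keeps fine manipulation transparent while letting modest finger forces command large grasp forces on the slave, in contrast to a single constant scale factor.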

  11. Contribution to the modeling and the identification of haptic interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Janot, A

    2007-12-15

    This thesis focuses on the modeling and the identification of haptic interfaces using cable drives. A haptic interface is a force feedback device that enables its user to interact with a virtual world or a remote environment explored by a slave system. It aims at matching the forces and displacements given by the user with those applied to the virtual world. Usually, haptic interfaces make use of an actuated mechanical structure whose distal link is equipped with a handle. When manipulating this handle to interact with the explored world, the user feels the apparent mass, compliance and friction of the interface. This distortion introduced between the operator and the virtual world must be modeled and identified to enhance the design of the interface and to develop appropriate control laws. The first approach has been to adapt the modeling and identification methods of rigid robots with localized flexibilities to haptic interfaces. The identification technique makes use of the inverse dynamic model and linear least squares with measurements of joint torques and positions. This approach is validated on a single-degree-of-freedom and a three-degree-of-freedom haptic device. A new identification method needing only torque data is proposed. It is based on a closed-loop simulation using the direct dynamic model. The optimal parameters minimize the 2-norm of the error between the actual torque and the simulated torque, assuming the same control law and the same tracking trajectory. This nonlinear least squares problem is dramatically simplified by using the inverse model to calculate the simulated torque. This method is validated on the single-degree-of-freedom haptic device and a SCARA robot. (author)

  12. Structural impact detection with vibro-haptic interfaces

    Science.gov (United States)

    Jung, Hwee-Kwon; Park, Gyuhae; Todd, Michael D.

    2016-07-01

    This paper presents a new sensing paradigm for structural impact detection using vibro-haptic interfaces. The goal of this study is to allow humans to ‘feel’ structural responses (impact, shape changes, and damage) and eventually determine the health conditions of a structure. The target applications for this study are aerospace structures, in particular airplane wings. Both hardware and software components are developed to realize the vibro-haptic-based impact detection system. First, L-shaped piezoelectric sensor arrays are deployed to measure the acoustic emission data generated by impacts on a wing. Unique haptic signals are then generated by processing the measured acoustic emission data. These haptic signals are wirelessly transmitted to human arms, and with the vibro-haptic interface, human pilots could identify impact location, intensity, and the possibility of subsequent damage initiation. The experimental results demonstrate that, with the haptic interface, humans could correctly identify such events while reducing false indications of structural condition by capitalizing on the human’s classification capability. Several important aspects of this study, including the development of haptic interfaces, the design of optimal human training strategies, and the extension of the haptic capability to structural impact detection, are summarized in this paper.
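
    The processing chain, from measured acoustic-emission waveforms to a vibrotactile cue, might be sketched as below: the haptic channel is chosen from the earliest arrival across the sensor array, and the cue amplitude from the peak AE level. The thresholds and the sensor-to-motor assignment are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

def impact_to_haptic(ae_signals, motor_count=4):
    """Map acoustic-emission records of one impact to a vibrotactile cue.

    ae_signals: (n_sensors, n_samples) AE waveforms from the sensor array.
    Returns (motor_index, amplitude in 0..1): the motor associated with the
    sensor that saw the earliest arrival, driven by the peak AE level.
    """
    peaks = np.abs(ae_signals).max(axis=1)
    # Arrival: first sample exceeding 10% of that sensor's own peak.
    arrivals = [np.argmax(np.abs(s) > 0.1 * p) for s, p in zip(ae_signals, peaks)]
    motor = int(np.argmin(arrivals)) % motor_count
    amplitude = float(min(1.0, peaks.max()))
    return motor, amplitude
```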

  14. Active skin as new haptic interface

    Science.gov (United States)

    Vuong, Nguyen Huu Lam; Kwon, Hyeok Yong; Chuc, Nguyen Huu; Kim, Duksang; An, Kuangjun; Phuc, Vuong Hong; Moon, Hyungpil; Koo, Jachoon; Lee, Youngkwan; Nam, Jae-Do; Choi, Hyouk Ryeol

    2010-04-01

    In this paper, we present a new haptic interface, called "active skin", in which a tactile sensor and a tactile stimulator are combined in a single haptic cell, and multiple haptic cells are embedded in a dielectric elastomer. The active skin generates a wide variety of haptic sensations in response to touch by synchronizing the sensor and the stimulator. In this paper, the design of the haptic cell is derived via iterative analysis and design procedures. A fabrication method dedicated to the proposed device is investigated, and a controller to drive multiple haptic cells is developed. In addition, several experiments are performed to evaluate the performance of the active skin.

  15. UPPER LIMB FUNCTIONAL ASSESSMENT USING HAPTIC INTERFACE

    Directory of Open Access Journals (Sweden)

    Aleš Bardorfer

    2004-12-01

    Full Text Available A new method for the assessment of the upper limb (UL) functional state, using a haptic interface, is presented. A haptic interface is used as a measuring device, capable of providing objective, repeatable and quantitative data on UL motion. A patient is presented with a virtual environment, both graphically via a computer screen and haptically via the Phantom Premium 1.5 haptic interface. The setup allows the patient to explore and feel the virtual environment with three senses: sight, hearing, and, most important, touch. Specially designed virtual environments are used to assess the patient’s UL movement capabilities. The tests range from tracking tasks (to assess the accuracy of movement), through tracking tasks with added disturbances in the form of random forces (to assess the patient’s control abilities) and a labyrinth test (to assess both speed and accuracy), to a final test measuring the maximal force capacity of the UL. A comprehensive study, using the developed measurement setup within the

  16. Multimodal Sensing Interface for Haptic Interaction

    Directory of Open Access Journals (Sweden)

    Carlos Diaz

    2017-01-01

    Full Text Available This paper investigates the integration of a multimodal sensing system for exploring the limits of vibrotactile haptic feedback when interacting with 3D representations of real objects. In this study, the spatial locations of the objects are mapped to the work volume of the user using a Kinect sensor. The position of the user’s hand is obtained using marker-based visual processing. The depth information is used to build a vibrotactile map on a haptic glove enhanced with vibration motors. The users can perceive the location and dimension of remote objects by moving their hand inside a scanning region. A marker detection camera provides the location and orientation of the user’s hand (glove) to map the corresponding tactile message. A preliminary study was conducted to explore how different users can perceive such haptic experiences. Factors such as total number of objects detected, object separation resolution, and dimension-based and shape-based discrimination were evaluated. The preliminary results showed that the localization and counting of objects can be attained with a high degree of success. The users were able to classify groups of objects of different dimensions based on the perceived haptic feedback.
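
    A minimal sketch of the depth-to-vibrotactile mapping, with near/far range limits and a linear intensity ramp (all constants assumed, not taken from the paper):

```python
def depth_to_vibration(depth_m, near=0.4, far=1.2):
    """Map a depth reading (m) to a vibration intensity in 0..1.

    Objects at or closer than `near` give full intensity; objects beyond
    `far` give none; intensity ramps linearly in between. The range limits
    and the linear ramp are illustrative assumptions.
    """
    if depth_m <= near:
        return 1.0
    if depth_m >= far:
        return 0.0
    return (far - depth_m) / (far - near)
```

    In a glove with several motors, one such mapping would run per motor, each fed by the depth sampled in that motor's sector of the scanning region.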

  17. Design and Evaluation of Shape-Changing Haptic Interfaces for Pedestrian Navigation Assistance.

    Science.gov (United States)

    Spiers, Adam J; Dollar, Aaron M

    2017-01-01

    Shape-changing interfaces are a category of device capable of altering their form in order to facilitate communication of information. In this work, we present a shape-changing device that has been designed for navigation assistance. 'The Animotus' (previously 'The Haptic Sandwich') resembles a cube with an articulated upper half that is able to rotate and extend (translate) relative to the bottom half, which is fixed in the user's grasp. This rotation and extension, generally felt via the user's fingers, is used to represent heading and proximity to navigational targets. The device is intended to provide an alternative to screen or audio based interfaces for visually impaired, hearing impaired, deafblind, and sighted pedestrians. The motivation and design of the haptic device is presented, followed by the results of a navigation experiment that aimed to determine the role of each device DOF in terms of facilitating guidance. An additional device, 'The Haptic Taco', which modulated its volume in response to target proximity (negating directional feedback), was also compared. Results indicate that while the heading (rotational) DOF benefited motion efficiency, the proximity (translational) DOF benefited velocity. Combination of the two DOF improved overall performance. The volumetric Taco performed comparably to the Animotus' extension DOF.
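
    The two-DOF mapping described, heading error to rotation and target proximity to extension, can be sketched as follows; the travel limits and distance scale are illustrative assumptions, not the device's specifications:

```python
def shape_change_pose(heading_err_deg, distance_m, max_rot_deg=30.0,
                      max_ext_mm=10.0, full_scale_m=20.0):
    """Map navigation state to shape-change commands (illustrative constants).

    Heading error (degrees, +right / -left) drives the rotational DOF,
    clipped to the device's travel; distance to the target drives the
    extension DOF, fully extended when far and flush on arrival.
    """
    rot = max(-max_rot_deg, min(max_rot_deg, heading_err_deg))
    ext = max_ext_mm * min(1.0, distance_m / full_scale_m)
    return rot, ext
```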

  18. A Study of an Assistance System Using a Haptic Interface

    OpenAIRE

    浅川, 貴史

    2011-01-01

    We propose a music baton system for visually handicapped persons. This system consists of an acceleration sensor, a radio module, and a haptic interface device. The acceleration sensor is built into the music baton grip and the data are transmitted by the radio module. A performer has a receiver with the haptic interface device. The receiver's CPU extracts the rhythm from the data and vibrates the haptic interface device. This paper describes an experiment comparing the ...

  19. Wide-Area Haptic Guidance: Taking the User by the Hand

    OpenAIRE

    Pérez Arias, Antonia; Hanebeck, Uwe D.

    2010-01-01

    In this paper, we present a novel use of haptic information in extended range telepresence, the wide-area haptic guidance. It consists of force and position signals applied to the user's hand in order to improve safety, accuracy, and speed in some telepresent tasks. Wide-area haptic guidance assists the user in reaching a desired position in a remote environment of arbitrary size without degrading the feeling of presence. Several methods for haptic guidance are analyzed. With active haptic gu...

  20. fMRI-Compatible Electromagnetic Haptic Interface.

    Science.gov (United States)

    Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S

    2005-01-01

    A new haptic interface device is suggested, which can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque while not affecting image quality with the currents applied in the device. The haptic device was tested in a 3 T MR scanner. With a current of up to 1 A and a distance of 1 m to the focal point of the MR scanner, it was possible to generate torques of up to 4 Nm. Within these boundaries, image quality was not affected.
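
    The peak torque on a current-carrying coil in a magnetic field follows tau = N·I·A·B; a one-line helper makes the dependence explicit. Note that the relevant B is the fringe field at the device's location, not the scanner's nominal field, and the parameter values in the test below are purely illustrative:

```python
def lorentz_torque(n_turns, current_a, coil_area_m2, b_field_t):
    """Peak torque (N*m) on a planar coil in a magnetic field: tau = N*I*A*B.

    The field B must be the static field actually present at the coil
    (the fringe field at the device's mounting position, measured in situ).
    """
    return n_turns * current_a * coil_area_m2 * b_field_t
```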

  1. Assignment about providing of substitute haptic interface for visually disabled persons

    OpenAIRE

    浅川, 貴史

    2013-01-01

    [Abstract] This paper describes open issues with a haptic interface. We have proposed a music baton system for visually disabled persons. The system consists of an acceleration sensor, a radio module, and a haptic interface device. We carried out an experiment comparing the visual and the haptic interface. The results reveal issues concerning the rise time of a motor and pre-motion. In this paper, we propose a new method of the voltage cont...

  2. Evaluating User Response to In-Car Haptic Feedback Touchscreens Using the Lane Change Test

    Directory of Open Access Journals (Sweden)

    Matthew J. Pitts

    2012-01-01

    Full Text Available Touchscreen interfaces are widely used in modern technology, from mobile devices to in-car infotainment systems. However, touchscreens impose significant visual workload demands on the user, which have safety implications for use in cars. Previous studies indicate that the application of haptic feedback can improve both performance of and affective response to user interfaces. This paper reports on and extends the findings of a 2009 study conducted to evaluate the effects of different combinations of touchscreen visual, audible, and haptic feedback on driving and task performance, affective response, and subjective workload; the initial findings were originally published in (M. J. Pitts et al., 2009). A total of 48 non-expert users completed the study. A dual-task approach was applied, using the Lane Change Test as the driving task and realistic automotive use case touchscreen tasks. Results indicated that, while feedback type had no effect on driving or task performance, preference was expressed for multimodal feedback over visual alone. Issues relating to workload and cross-modal interaction were also identified.

  3. NONLINEAR FORCE PROFILE USED TO INCREASE THE PERFORMANCE OF A HAPTIC USER INTERFACE FOR TELEOPERATING A ROBOTIC HAND

    Energy Technology Data Exchange (ETDEWEB)

    Anthony L. Crawford

    2012-07-01

    Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space. The research associated with this paper hypothesizes that a user interface and complementary radiation-compatible robotic hand that integrate the human hand’s anthropometric properties, speed capability, nonlinear strength profile, reduction of active degrees of freedom during the transition from manipulation to grasping, and just-noticeable-difference force sensation characteristics will enhance a user’s teleoperation performance. The main contribution of this research is that a system concisely integrating all these factors has yet to be developed, much less applied to a hazardous environment such as those referenced above. In fact, the most prominent slave manipulator teleoperation technology in use today is based on a design patented in 1945 (Patent 2632574) [1]. Robotic hand/user interface systems of similar function to the one developed in this research limit their design input requirements, in the best case, to complementing the hand’s anthropometric properties, speed capability, and a linearly scaled force application relationship (e.g. the robotic force is a constant 4 times that of the user). In this paper a nonlinear relationship between the forces experienced by the user interface and the robotic hand was devised, based on the property differences of manipulation and grasping activities as they pertain to the human hand. The results show that such a relationship, when subjected to a manipulation task and a grasping task, produces increased performance compared to the

  4. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.

    Science.gov (United States)

    Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea

    2017-09-29

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
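
    The attractive force feedback around the strongest detected source can be sketched as a saturated spring law pulling the haptic stylus toward the source estimate; the stiffness and force cap below are assumed values, not those of the 3DOF device used in the paper:

```python
import numpy as np

def attractive_force(uav_pos, source_pos, k=2.0, f_max=3.0):
    """Spring-like attractive force (N) toward the estimated source location.

    The force grows linearly with distance from the source estimate and is
    saturated at f_max so the haptic device's limits are respected.
    """
    err = np.asarray(source_pos, float) - np.asarray(uav_pos, float)
    f = k * err
    norm = np.linalg.norm(f)
    if norm > f_max:
        f *= f_max / norm  # saturate while preserving direction
    return f
```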

  5. An investigation of a passively controlled haptic interface

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.T. [Oak Ridge National Lab., TN (United States); Book, W.J. [Georgia Inst. of Tech., Atlanta, GA (United States). School of Mechanical Engineering

    1997-03-01

    Haptic interfaces enhance cooperation between humans and robotic manipulators by providing force and tactile feedback to the human user during the execution of arbitrary tasks. The use of active actuators in haptic displays presents a certain amount of risk, since they are capable of providing unacceptable levels of energy to the systems upon which they operate. An alternative to providing numerous safeguards is to remove the sources of risk altogether. This research investigates the feasibility of trajectory control using passive devices, that is, devices that cannot add energy to the system. Passive actuators are capable only of removing energy from the system or transferring energy within the system. It is proposed that the utility of passive devices is greatly enhanced by the use of redundant actuators. In a passive system, once motion is provided to the system, presumably by a human user, passive devices may be able to modify this motion to achieve a desired resultant trajectory. A mechanically passive, 2-Degree-of-Freedom (D.O.F.) manipulator has been designed and built. It is equipped with four passive actuators: two electromagnetic brakes and two electromagnetic clutches. This paper gives a review of the literature on passive robotics and describes the experimental test bed used in this research. Several control algorithms are investigated, resulting in the formulation of a passive control law.
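
    The defining constraint of passive actuation, that the device may only remove energy from the system, can be expressed as a clipping rule on the commanded torque. The sketch below is a generic illustration of the passivity condition tau·omega ≤ 0, not the control law formulated in the paper:

```python
def passive_torque(tau_desired, omega, eps=1e-9):
    """Clip a desired joint torque so a passive actuator never adds energy.

    A brake can only oppose motion, so the applied torque must satisfy
    tau * omega <= 0 (non-positive mechanical power into the system).
    When the command would inject energy, the output is forced to zero.
    """
    if abs(omega) < eps:          # joint at rest: any holding torque does no work
        return tau_desired
    if tau_desired * omega > 0:   # command would add energy -> disallowed
        return 0.0
    return tau_desired            # dissipative command passes through
```

    A trajectory controller built on such actuators can therefore only redirect or slow motion supplied by the human user, which is exactly why the paper argues redundant passive actuators are needed.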

  6. An investigation of a passively controlled haptic interface

    International Nuclear Information System (INIS)

    Davis, J.T.; Book, W.J.

    1997-01-01

    Haptic interfaces enhance cooperation between humans and robotic manipulators by providing force and tactile feedback to the human user during the execution of arbitrary tasks. The use of active actuators in haptic displays presents a certain amount of risk since they are capable of providing unacceptable levels of energy to the systems upon which they operate. An alternative to providing numerous safeguards is to remove the sources of risk altogether. This research investigates the feasibility of trajectory control using passive devices, that is, devices that cannot add energy to the system. Passive actuators are capable only of removing energy from the system or transferring energy within the system. It is proposed that the utility of passive devices is greatly enhanced by the use of redundant actuators. In a passive system, once motion is provided to the system, presumably by a human user, passive devices may be able to modify this motion to achieve a desired resultant trajectory. A mechanically passive, 2-Degree-of-Freedom (D.O.F.) manipulator has been designed and built. It is equipped with four passive actuators: two electromagnetic brakes and two electromagnetic clutches. This paper gives a review of the literature on passive devices in robotics and describes the experimental test bed used in this research. Several control algorithms are investigated, resulting in the formulation of a passive control law.
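
The defining property of the passive actuators described in these two records, that a brake or clutch can only remove or redirect energy, can be expressed as a simple power constraint on the commanded force. The sketch below is illustrative only; the function and values are hypothetical and do not come from the paper:

```python
def passive_force(f_desired, velocity):
    """Clip a desired force so a passive element never injects energy.

    A brake or clutch can only dissipate or redirect energy, so the
    applied force must satisfy f * v <= 0, i.e. power flows out of the
    mechanism. Hypothetical helper, not the authors' control law.
    """
    if f_desired * velocity <= 0.0:  # force opposes motion: allowed
        return f_desired
    return 0.0  # force would add energy: a passive device cannot apply it
```

A controller built on this rule can only shape motion supplied by the human user, which is why the records argue that redundant passive actuators greatly extend the set of achievable trajectories.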

  7. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    Science.gov (United States)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved with the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment in which mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation, and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system is encouraging.

  8. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    OpenAIRE

    Jacopo Aleotti; Giorgio Micconi; Stefano Caselli; Giacomo Benassi; Nicola Zambelli; Manuele Bettelli; Andrea Zappettini

    2017-01-01

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the sit...

  9. Benefits of the use of natural user interfaces in water simulations

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Van Dam, A.; Jagers, B.

    2014-01-01

    The use of natural user interfaces instead of conventional ones has become a reality with the emergence of 3D motion sensing technologies. However, some problems are still unsolved (for example, no haptic or tactile feedback); so this technology requires careful evaluation before the users can

  10. Study of Electric Music Baton using Haptic Interface for Assistance of Visually Disabled Persons

    OpenAIRE

    浅川, 貴史

    2012-01-01

    [Abstract] We have made a proposal for a music baton system for visually disabled persons. The system consists of an acceleration sensor, a radio module, and a haptic interface device. When the conductor moves the baton, players are able to acknowledge the action using the haptic interface device. We have carried out an experiment comparing the visual and the haptic interface. The results showed that a pre-motion is important for the visual interface. In the paper, we make a proposal fo...

  11. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    Directory of Open Access Journals (Sweden)

    Jacopo Aleotti

    2017-09-01

    Full Text Available A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.

  12. Vibrotactile perception assessment for a haptic interface on an antigravity suit.

    Science.gov (United States)

    Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu

    2017-01-01

    Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit. Copyright © 2016 Elsevier Ltd. All rights reserved.
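
The recommended moderate-intensity settings reported in this record can be collected into a small lookup table, for example when driving the ten ERM motors from a controller. The frequency and level values come from the abstract; the region keys and function name are descriptive labels, not identifiers from the study:

```python
# Moderate-intensity vibrotactile settings from the study, keyed by
# actuator region. Values are (frequency in Hz, level in dB).
MODERATE_INTENSITY = {
    "lower_back":  (74.36, 126.98),
    "outer_thigh": (58.65, 122.37),
    "inner_thigh": (58.65, 122.37),
    "outer_calf":  (58.65, 122.37),
    "inner_calf":  (58.65, 122.37),
}

def motor_setting(region):
    """Return the (frequency_hz, level_db) pair for an ERM motor."""
    return MODERATE_INTENSITY[region]
```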

  13. Modeling Auditory-Haptic Interface Cues from an Analog Multi-line Telephone

    Science.gov (United States)

    Begault, Durand R.; Anderson, Mark R.; Bittner, Rachael M.

    2012-01-01

    The Western Electric Company produced a multi-line telephone during the 1940s-1970s using a six-button interface design that provided robust tactile, haptic and auditory cues regarding the "state" of the communication system. This multi-line telephone was used as a model for a trade study comparison of two interfaces: a touchscreen interface (iPad) versus a pressure-sensitive strain gauge button interface (Phidget USB interface controllers). The experiment and its results are detailed in the authors' AES 133rd convention paper "Multimodal Information Management: Evaluation of Auditory and Haptic Cues for NextGen Communication Displays". This Engineering Brief describes how the interface logic, visual indications, and auditory cues of the original telephone were synthesized using MAX/MSP, including the logic for line selection, line hold, and priority line activation.
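
The line-selection and hold logic of such a key telephone can be sketched as a small state machine. The behavior below is an assumption modeled loosely on classic key-system conventions, not a reconstruction of the authors' MAX/MSP patch:

```python
class MultiLinePhone:
    """Toy sketch of six-button line-selection/hold logic (assumed
    behavior, not the authors' implementation): selecting a new line
    drops the currently active line unless HOLD was pressed first."""

    def __init__(self, n_lines=6):
        self.state = {i: "idle" for i in range(n_lines)}  # idle/active/hold
        self.active = None

    def press_line(self, i):
        if self.active is not None and self.state[self.active] == "active":
            self.state[self.active] = "idle"  # dropped, because not held
        self.state[i] = "active"
        self.active = i

    def press_hold(self):
        if self.active is not None:
            self.state[self.active] = "hold"  # line lamp would wink
            self.active = None
```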

  14. Touch Is Everywhere: Floor Surfaces as Ambient Haptic Interfaces.

    Science.gov (United States)

    Visell, Y; Law, A; Cooperstock, J R

    2009-01-01

    Floor surfaces are notable for the diverse roles that they play in our negotiation of everyday environments. Haptic communication via floor surfaces could enhance or enable many computer-supported activities that involve movement on foot. In this paper, we discuss potential applications of such interfaces in everyday environments and present a haptically augmented floor component through which several interaction methods are being evaluated. We describe two approaches to the design of structured vibrotactile signals for this device. The first is centered on a musical phrase metaphor, as employed in prior work on tactile display. The second is based upon the synthesis of rhythmic patterns of virtual physical impact transients. We report on an experiment in which participants were able to identify communication units that were constructed from these signals and displayed via a floor interface at well above chance levels. The results support the feasibility of tactile information display via such interfaces and provide further indications as to how to effectively design vibrotactile signals for them.
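
The second design approach, synthesizing rhythmic patterns of virtual impact transients, can be approximated by summing exponentially decaying sinusoids at chosen onset times. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def impact_transient(t, f=150.0, decay=60.0):
    """Exponentially decaying sinusoid approximating a physical impact
    transient (frequency and decay rate are illustrative choices)."""
    return np.exp(-decay * t) * np.sin(2 * np.pi * f * t) * (t >= 0)

def rhythmic_pattern(onsets, duration=1.0, fs=8000):
    """Sum impact transients at the given onset times (in seconds),
    producing one vibrotactile signal sampled at rate fs."""
    t = np.arange(int(duration * fs)) / fs
    return sum(impact_transient(t - t0) for t0 in onsets)
```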

  15. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    OpenAIRE

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-01-01

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force an...

  16. Design of a New MR Compatible Haptic Interface with Six Actuated Degrees of Freedom

    DEFF Research Database (Denmark)

    Ergin, Mehmet Alper; Kühne, Markus; Thielscher, Axel

    2014-01-01

    Functional magnetic resonance imaging is an often adopted tool to study human motor control mechanisms. Highly controlled experiments as required by this form of analysis can be realized with haptic interfaces. Their design is challenging because of strong safety and MR compatibility requirements. Existing MR-compatible haptic interfaces are restricted to a maximum of three actuated degrees of freedom. We propose an MR-compatible haptic interface with six actuated degrees of freedom to be able to study human brain mechanisms of natural pick-and-place movements including arm transport. In this work, we present its mechanical design, kinematic and dynamic model, as well as report on its model-based characterization. A novel hybrid control scheme for the employed ultrasonic motors is introduced. Preliminary MR compatibility tests based on one complete actuator-sensor module are performed. No measurable...

  17. Frictional Compliant Haptic Contact and Deformation of Soft Objects

    Directory of Open Access Journals (Sweden)

    Naci Zafer

    2016-05-01

    Full Text Available This paper is concerned with compliant haptic contact and deformation of soft objects. A human soft fingertip model is considered to act as the haptic interface and is brought into contact with and deforms a discrete surface. A nonlinear constitutive law is developed in predicting normal forces and, for the haptic display of surface texture, motions along the surface are also resisted at various rates by accounting for dynamic Lund-Grenoble (LuGre) frictional forces. For the soft fingertip to apply forces over an area larger than a point, normal and frictional forces are distributed around the soft fingertip contact location on the deforming surface. The distribution is realized based on a kernel smoothing function and by a nonlinear spring-damper net around the contact point. Experiments conducted demonstrate the accuracy and effectiveness of our approach in real-time haptic rendering of a kidney surface. The resistive (interaction) forces are applied at the user fingertip bone edge. A 3-DoF parallel robotic manipulator equipped with a constraint based controller is used for the implementation. By rendering forces both in lateral and normal directions, the designed haptic interface system allows the user to realistically feel both the geometrical and mechanical (nonlinear) properties of the deforming kidney.
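
The LuGre model mentioned in this abstract augments Coulomb and Stribeck friction with an internal bristle-deflection state z. A minimal Euler-integration sketch follows; the parameter values are illustrative, not those used in the paper:

```python
import math

def lugre_step(z, v, dt, sigma0=1e3, sigma1=2.0, sigma2=0.4,
               Fc=1.0, Fs=1.5, vs=0.1):
    """One Euler step of the Lund-Grenoble (LuGre) friction model.

    z is the internal bristle-deflection state, v the sliding velocity.
    Returns the updated state and the friction force. Parameters are
    illustrative: Fc/Fs are Coulomb/stiction levels, vs the Stribeck
    velocity, sigma0-2 bristle stiffness, damping, and viscous terms.
    """
    g = (Fc + (Fs - Fc) * math.exp(-(v / vs) ** 2)) / sigma0
    zdot = v - abs(v) * z / g
    F = sigma0 * z + sigma1 * zdot + sigma2 * v
    return z + zdot * dt, F
```

Sliding at a constant velocity v converges to the steady-state curve F = sign(v)*(Fc + (Fs - Fc)*exp(-(v/vs)^2)) + sigma2*v, which is what makes the model useful for rendering both stiction and velocity-dependent surface drag.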

  18. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    Science.gov (United States)

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  19. Haptics for Virtual Reality and Teleoperation

    CERN Document Server

    Mihelj, Matjaž

    2012-01-01

    This book covers all topics relevant for the design of haptic interfaces and teleoperation systems. The book provides the basic knowledge required for understanding more complex approaches and more importantly it introduces all issues that must be considered for designing efficient and safe haptic interfaces. Topics covered in this book provide insight into all relevant components of a haptic system. The reader is guided from understanding the virtual reality concept to the final goal of being able to design haptic interfaces for specific tasks such as nanomanipulation.  The introduction chapter positions the haptic interfaces within the virtual reality context. In order to design haptic interfaces that will comply with human capabilities at least basic understanding of human sensors-motor system is required. An overview of this topic is provided in the chapter related to human haptics. The book does not try to introduce the state-of-the-art haptic interface solutions because these tend to change quickly. On...

  20. A real-time haptic interface for interventional radiology procedures.

    Science.gov (United States)

    Moix, Thomas; Ilic, Dejan; Fracheboud, Blaise; Zoethout, Jurjen; Bleuler, Hannes

    2005-01-01

    Interventional Radiology (IR) is a minimally-invasive surgery technique (MIS) where guidewires and catheters are steered in the vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be correctly trained to master hand-eye coordination, instrument manipulation and procedure protocols. This paper proposes a computer-assisted training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the anatomy of the patient linked to a robotic interface providing haptic force feedback.The paper focuses on the requirements, design and prototyping of a specific part of the haptic interface dedicated to catheters. Translational tracking and force feedback on the catheter is provided by two cylinders forming a friction drive arrangement. The whole friction can be set in rotation with an additional motor providing torque feedback. A force and a torque sensor are integrated in the cylinders for direct measurement on the catheter enabling disturbance cancellation with a close-loop force control strategy.
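
The closed-loop force control strategy described for the friction drive, where the directly measured catheter force is fed back so that device friction and other disturbances can be cancelled, can be sketched as a PI loop. The gains and structure here are illustrative assumptions, not the authors' controller:

```python
def pi_force_controller(kp=5.0, ki=50.0, dt=0.001):
    """Closed-loop force controller for a friction-drive haptic axis
    (sketch with hypothetical gains). Feeding back the force measured
    directly on the catheter lets the loop compensate disturbances
    between the motor and the instrument."""
    integral = 0.0

    def step(f_desired, f_measured):
        nonlocal integral
        error = f_desired - f_measured
        integral += error * dt
        return kp * error + ki * integral  # motor force command

    return step
```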

  1. Soft Robotic Haptic Interface with Variable Stiffness for Rehabilitation of Neurologically Impaired Hand Function

    Directory of Open Access Journals (Sweden)

    Frederick Sebastian

    2017-12-01

    Full Text Available The human hand comprises complex sensorimotor functions that can be impaired by neurological diseases and traumatic injuries. Effective rehabilitation can bring the impaired hand back to a functional state because of the plasticity of the central nervous system to relearn and remodel the lost synapses in the brain. Current rehabilitation therapies focus on strengthening motor skills, such as grasping, and employ multiple objects of varying stiffness so that affected persons can experience a wide range of strength training. These devices have a limited range of stiffness due to the rigid mechanisms employed in their variable stiffness actuators. This paper presents a novel soft robotic haptic device for neuromuscular rehabilitation of the hand, which is designed to offer adjustable stiffness and can be utilized in both clinical and home settings. The device eliminates the need for multiple objects by employing a pneumatic soft structure made with highly compliant materials that act as the actuator of the haptic interface. It is made with interchangeable sleeves that can be customized to include materials of varying stiffness to increase the upper limit of the stiffness range. The device is fabricated using existing 3D printing technologies, and polymer molding and casting techniques, thus keeping the cost low and throughput high. The haptic interface is linked to either an open-loop system that allows for increased pressure during usage or a closed-loop system that provides pressure regulation in accordance with the stiffness the user specifies. Preliminary evaluation is performed to characterize the effective controllable region of variance in stiffness. It was found that the region of controllable stiffness was between points 3 and 7, where the stiffness appeared to plateau with each increase in pressure. The two control systems are tested to derive relationships between internal pressure, grasping force exertion on the surface, and displacement using...
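
The closed-loop mode described above, regulating chamber pressure to match a user-specified stiffness, can be sketched in two steps: a calibration map from stiffness to target pressure, and a valve command law. Both the linear calibration and all constants below are hypothetical placeholders, not values from the paper:

```python
def pressure_setpoint(stiffness, k0=2.0, k1=0.5):
    """Map a user-specified stiffness (N/mm) to a chamber pressure
    setpoint (kPa) via an assumed linear calibration p = k0 + k1*s.
    Constants are hypothetical placeholders."""
    return k0 + k1 * stiffness

def regulate(p_measured, p_target, gain=0.8):
    """Proportional valve command for closed-loop pressure regulation
    (positive inflates, negative vents); illustrative sketch only."""
    return gain * (p_target - p_measured)
```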

  2. An MR-Compatible Haptic Interface With Seven Degrees of Freedom

    DEFF Research Database (Denmark)

    Kuhne, Markus; Eschelbach, Martin; Aghaeifar, Ali

    2018-01-01

    Functional magnetic resonance imaging (fMRI) is a powerful tool for neuroscience. It allows the visualization of active areas in the human brain. Combining this method with haptic interfaces allows one to conduct human motor control studies with an opportunity for standardized experimental...

  3. What you see is what you feel : on the simulation of touch in graphical user interfaces

    NARCIS (Netherlands)

    Mensvoort, van K.M.

    2009-01-01

    This study introduces a novel method of simulating touch with merely visual means. Interactive animations are used to create an optical illusion that evokes haptic percepts like stickiness, stiffness and mass, within a standard graphical user interface. The technique, called optically simulated

  4. A Taxonomy and Comparison of Haptic Actions for Disassembly Tasks

    National Research Council Canada - National Science Library

    Bloomfield, Aaron; Deng, Yu; Wampler, Jeff; Rondot, Pascale; Harth, Dina; McManus, Mary; Badler, Norman

    2003-01-01

    .... We conducted a series of human subject experiments to compare user performance and preference on a disassembly task with and without haptic feedback using CyberGlove, Phantom, and SpaceMouse interfaces...

  5. Portable haptic interface with omni-directional movement and force capability.

    Science.gov (United States)

    Avizzano, Carlo Alberto; Satler, Massimo; Ruffaldi, Emanuele

    2014-01-01

    We describe the design of a new mobile haptic interface that employs wheels for force rendering. The interface, consisting of an omni-directional Killough type platform, provides 2DOF force feedback with different control modalities. The system autonomously performs sensor fusion for localization and force rendering. This paper explains the relevant choices concerning the functional aspects, the control design, the mechanical and electronic solution. Experimental results for force feedback characterization are reported.

  6. A three-axis force sensor for dual finger haptic interfaces.

    Science.gov (United States)

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-10-10

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.
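
The characteristic matrix mentioned in this abstract maps the sensor's gauge readings to a force vector, f = C v. A generic way to estimate C from known applied loads, not necessarily the authors' specific procedure, is ordinary least squares over a set of calibration measurements:

```python
import numpy as np

def calibrate_characteristic_matrix(V, F):
    """Estimate the 3 x n characteristic matrix C such that f = C v.

    V is an (m, n) stack of gauge readings, F an (m, 3) stack of the
    known applied loads. Solves V @ C.T ~= F in the least-squares
    sense. Generic calibration sketch, not the paper's exact method.
    """
    C_T, *_ = np.linalg.lstsq(V, F, rcond=None)
    return C_T.T
```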

  7. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    Directory of Open Access Journals (Sweden)

    Fabio Salsedo

    2012-10-01

    Full Text Available In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.

  8. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    Science.gov (United States)

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
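
Software-based temporal alignment of two sensor streams, as mentioned in this abstract, is commonly done by locating the peak of their cross-correlation. The sketch below shows that generic technique; the paper's own alignment method may differ in detail:

```python
import numpy as np

def estimate_lag(ref, probe):
    """Estimate the integer-sample delay of `probe` relative to `ref`
    by maximizing the cross-correlation of the zero-mean signals.
    Generic latency-estimation technique, not the paper's exact one."""
    ref = ref - ref.mean()
    probe = probe - probe.mean()
    corr = np.correlate(probe, ref, mode="full")
    return int(np.argmax(corr)) - (len(ref) - 1)
```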

  9. Sound Descriptions of Haptic Experiences of Art Work by Deafblind Cochlear Implant Users

    Directory of Open Access Journals (Sweden)

    Riitta Lahtinen

    2018-05-01

    Full Text Available Deafblind persons’ perception and experiences are based on their residual auditory and visual senses, and touch. Their haptic exploration, through movement and orientation towards objects, gives blind persons direct, independent experience. Few studies explore the aesthetic experiences and appreciation of artefacts by deafblind people using cochlear implant (CI) technology, and how they interpret and express their perceived aesthetic experience through another sensory modality. While speech recognition is studied extensively in this area, auditory descriptions made by CI users are a less-studied domain. This research intervention describes and analyses how five deafblind people shared their interpretations of five statues vocally, using sounds and written descriptions based on their haptic explorations. The participants found new and multimodal ways of expressing their experiences, as well as re-experiencing them through technological aids. We also found that the CI users modify technology to better suit their personal needs. We conclude that CI technology in combination with self-made sound descriptions enhances memorization of haptic art experiences, which can be recalled by playing back the recorded sound descriptions. This research expands the idea of auditory descriptions, and encourages user-produced descriptions as artistic supports to traditional linguistic audio descriptions. These can be used to create personal auditory–haptic memory collections, similar to how sighted people create photo albums.

  10. Force sensitive handles and capacitive touch sensor for driving a flexible haptic-based immersive system.

    Science.gov (United States)

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-10-09

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  11. Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device.

    Science.gov (United States)

    Katzschmann, Robert K; Araki, Brandon; Rus, Daniela

    2018-03-01

    This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user's waist, and the pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user's upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases.
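
A common way to drive such a vibrotactile array is to map each time-of-flight distance reading to a motor intensity, with nearer obstacles vibrating harder. The linear ramp and threshold values below are illustrative assumptions, not the device's actual mapping:

```python
def vibration_intensity(distance_m, d_min=0.2, d_max=3.0):
    """Map a time-of-flight distance reading (meters) to a motor
    intensity in [0, 1]: full vibration at or inside d_min, silence
    beyond d_max, linear ramp in between. The ramp shape and the
    thresholds are hypothetical, not ALVU's actual law."""
    if distance_m >= d_max:
        return 0.0
    if distance_m <= d_min:
        return 1.0
    return (d_max - distance_m) / (d_max - d_min)
```

Each distance sensor in the belt would feed the motor at the matching position on the strap, so the spatial pattern of vibration mirrors the layout of surrounding obstacles.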

  12. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart.

    Science.gov (United States)

    Kesner, Samuel B; Howe, Robert D

    2011-01-01

    Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body due to tissue motion and the transmission limitations of the catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation procedures. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system.

  13. A Fabric-Based Approach for Wearable Haptics

    Directory of Open Access Journals (Sweden)

    Matteo Bianchi

    2016-07-01

    Full Text Available In recent years, wearable haptic systems (WHS have gained increasing attention as a novel and exciting paradigm for human–robot interaction (HRI. These systems can be worn by users, carried around, and integrated in their everyday lives, thus enabling a more natural manner to deliver tactile cues. At the same time, the design of these types of devices presents new issues: the challenge is the correct identification of design guidelines, with the two-fold goal of minimizing system encumbrance and increasing the effectiveness and naturalness of stimulus delivery. Fabrics can represent a viable solution to tackle these issues. They are specifically thought “to be worn”, and could be the key ingredient to develop wearable haptic interfaces conceived for a more natural HRI. In this paper, the author will review some examples of fabric-based WHS that can be applied to different body locations, and elicit different haptic perceptions for different application fields. Perspective and future developments of this approach will be discussed.

  14. Pervasive haptics science, design, and application

    CERN Document Server

    Saga, Satoshi; Konyo, Masashi

    2016-01-01

    This book examines the state of the art in diverse areas of haptics (touch)-related research, including the psychophysics and neurophysiology of haptics, development of haptics displays and sensors, and applications to a wide variety of fields such as industry, education, therapy, medicine, and welfare for the visually impaired. It also discusses the potential of future haptics interaction, such as haptics for emotional control and remote haptics communication. The book offers a valuable resource not only for haptics and human interface researchers, but also for developers and designers at manufacturing corporations and in the entertainment industries.

  15. Design and Control of a Haptic Enabled Robotic Manipulator

    Directory of Open Access Journals (Sweden)

    Muhammad Yaqoob

    2015-07-01

    Full Text Available Robotic surgery offers various advantages over conventional surgery that includes less bleeding, less trauma, and more precise tissue cutting. However, even surgeons who use the best commercially available surgical robotic systems complain about the absence of haptic feedback in such systems. In this paper, we present the findings of our project to overcome this shortcoming of surgical robotic systems, in which a haptic-enabled robotic system based on master and slave topology is designed and developed. To detect real-time intrusion at the slave end, haptic feedback is implemented along with a programmable system on chip, functioning as an embedded system for processing information. In order to obtain real-time haptic feedback, force and motion sensors are mounted on each joint of the master and slave units. At the master end, results are displayed through a graphical user interface, along with the physical feeling of intrusion at the slave part. Apart from the obvious applications of the current system in robotic surgery, it could also be used in designing more intuitive video games with further precise haptic feedback mechanisms. Moreover, the results presented in our work should pave the way for further scientific investigation, to provide even better haptic mechanisms.

  16. Haptic interfaces using dielectric electroactive polymers

    Science.gov (United States)

    Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos

    2010-04-01

    Quality, amplitude, and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously, with quiet operation, low weight, high power density, and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heart-beat measurement application. In this testing, heart signals were acquired from a remote location using a wireless heart rate sensor and sent through a network, and the DEA was used to haptically reproduce the heart beats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DEA-based haptic feedback can be used in heart-beat measurement tests, and b) through subjective testing, the stiffness and actuator properties of the EAP can be tuned for a variety of applications.

  17. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    Directory of Open Access Journals (Sweden)

    Umberto Cugini

    2013-10-01

    Full Text Available In this article, we present an approach that uses two force-sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user’s fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  18. Virtual haptic system for intuitive planning of bone fixation plate placement

    Directory of Open Access Journals (Sweden)

    Kup-Sze Choi

    2017-01-01

    Full Text Available Placement of a pre-contoured fixation plate is a common treatment for bone fracture. Fitting of fixation plates on fractured bones can be preoperatively planned and evaluated in a 3D virtual environment using virtual reality technology. However, conventional systems usually employ a 2D mouse and a virtual trackball as the user interface, which makes the process inconvenient and inefficient. In this paper, a preoperative planning system equipped with a 3D haptic user interface is proposed to allow users to manipulate the virtual fixation plate intuitively to determine the optimal position for placement on the distal medial tibia. The system provides interactive feedback forces and visual guidance based on the geometric requirements. Creation of 3D models from medical imaging data, collision detection, dynamics simulation, and haptic rendering are discussed. The system was evaluated by 22 subjects. Results show that the time to achieve optimal placement using the proposed system was shorter than that using a 2D mouse and virtual trackball, and the satisfaction rating was also higher. The system shows potential to facilitate the process of fitting fixation plates on fractured bones, as well as interactive fixation plate design.

  19. Archaeologies of touch interfacing with haptics from electricity to computing

    CERN Document Server

    Parisi, David

    2018-01-01

    David Parisi offers the first full history of new computing technologies known as haptic interfaces--which use electricity, vibration, and force feedback to stimulate the sense of touch--showing how the efforts of scientists and engineers over the past 300 years have gradually remade and redefined our sense of touch. Archaeologies of Touch offers a timely and provocative engagement with the long history of touch technology that helps us confront and question the power relations underpinning the project of giving touch its own set of technical media.

  20. 1st International AsiaHaptics conference

    CERN Document Server

    Ando, Hideyuki; Kyung, Ki-Uk

    2015-01-01

    This book is aimed not only at haptics and human interface researchers, but also at developers and designers from manufacturing corporations and the entertainment industry who are working to change our lives. This publication comprises the proceedings of the first International AsiaHaptics conference, held in Tsukuba, Japan, in 2014. The book describes the state of the art of diverse haptics (touch)-related research, including scientific research into haptic perception and illusion, the development of haptics devices, and applications for a wide variety of fields such as education, medicine, telecommunication, navigation, and entertainment.

  1. A novel shape-changing haptic table-top display

    Science.gov (United States)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulatory haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit was proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit was capable of extracting texture data from a 2D picture for rendering the shape-changing surface and realizing 3D modeling. The preliminary evaluation result proved the feasibility of the proposed system.

  2. Review of surgical robotics user interface: what is the best way to control robotic surgery?

    Science.gov (United States)

    Simorov, Anton; Otte, R Stephen; Kopietz, Courtni M; Oleynikov, Dmitry

    2012-08-01

    As surgical robots begin to occupy a larger place in operating rooms around the world, continued innovation is necessary to improve our outcomes. A comprehensive review of current surgical robotic user interfaces was performed to describe the modern surgical platforms, identify the benefits, and address the issues of feedback and limitations of visualization. Most robots currently used in surgery employ a master/slave relationship, with the surgeon seated at a work-console, manipulating the master system and visualizing the operation on a video screen. Although enormous strides have been made to advance current technology to the point of clinical use, limitations still exist. A lack of haptic feedback to the surgeon and the inability of the surgeon to be stationed at the operating table are the most notable examples. The future of robotic surgery sees a marked increase in the visualization technologies used in the operating room, as well as in the robots' abilities to convey haptic feedback to the surgeon. This will allow unparalleled sensation for the surgeon and almost eliminate inadvertent tissue contact and injury. A novel design for a user interface will allow the surgeon to have access to the patient bedside, remaining sterile throughout the procedure, employ a head-mounted three-dimensional visualization system, and allow the most intuitive master manipulation of the slave robot to date.

  3. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
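
    The four compositing steps above can be sketched at the pixel level. The following is a synthetic illustration, not the paper's implementation: the masks, background model, and pixel values are all assumptions, and tiny grayscale arrays stand in for camera frames.

```python
import numpy as np

# Synthetic 4x4 grayscale "frames" (assumed values, for illustration only).
frame      = np.full((4, 4), 50)   # camera image: scene + hand + haptic device
background = np.full((4, 4), 10)   # image-based model of the scene without devices
virtual    = np.full((4, 4), -1)   # -1 = nothing rendered at this pixel
virtual[1:3, 1:3] = 200            # a virtual object in the middle

# Step 1: color-based segmentation of the hand (here: a fixed mask).
hand_mask = np.zeros((4, 4), bool)
hand_mask[0, :] = True
# Step 2: tracking-based segmentation of the haptic device (fixed mask).
device_mask = np.zeros((4, 4), bool)
device_mask[3, :] = True

out = frame.copy()
# Step 3: repaint the device region from the background model.
out[device_mask] = background[device_mask]
# Render the virtual tool/objects into the scene.
rendered = virtual >= 0
out[rendered] = virtual[rendered]
# Step 4: composite the real hand on top, so it is never occluded.
out[hand_mask] = frame[hand_mask]

print(out)
```

    The key design point is the ordering: the device is erased first, virtual content is drawn next, and the user's hand is composited last so it stays visually consistent with the virtual tool.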

  4. User interface support

    Science.gov (United States)

    Lewis, Clayton; Wilde, Nick

    1989-01-01

    Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.

  5. Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

    Science.gov (United States)

    Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M

    2015-01-01

    This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation, such as art galleries and museums, by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of these data in the form of a tangible haptic experience has not been sufficiently explored, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in our experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience in which they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.

  6. An arm wearable haptic interface for impact sensing on unmanned aerial vehicles

    Science.gov (United States)

    Choi, Yunshil; Hong, Seung-Chan; Lee, Jung-Ryul

    2017-04-01

    In this paper, an impact monitoring system using fiber Bragg grating (FBG) sensors and vibro-haptic actuators is introduced. The system is proposed for structural health monitoring (SHM) of unmanned aerial vehicles (UAVs), with decisions made through human-robot interaction. The system is composed of two major subsystems: an on-board system mounted on the UAV and an arm-wearable interface for the ground pilot. The on-board system acquires impact-induced wavelength changes and performs a localization process, developed based on arrival-time calculation. The arm-wearable interface helps ground pilots make decisions about the impact location themselves by stimulating their tactile sense with motor vibration.

  7. Input and output for surgical simulation: devices to measure tissue properties in vivo and a haptic interface for laparoscopy simulators.

    Science.gov (United States)

    Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K

    2000-01-01

    Current efforts in surgical simulation very often focus on creating realistic graphical feedback, but neglect some or all of the tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and, optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw, and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy tool handles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties, and change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small-displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry will be measured. This will allow us to validate the performance of the model on large tissue
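
    The linear compliance (stiffness and damping) the 1-D indenter measures corresponds to a Kelvin-Voigt-style tissue model. The sketch below is an assumed illustration of that model, not the authors' formulation, and the parameter values are invented.

```python
# Kelvin-Voigt-style linear tissue model (illustrative assumption):
# reaction force = stiffness * displacement + damping * velocity.

def indentation_force(k: float, b: float, x: float, v: float) -> float:
    """Linear tissue reaction force for stiffness k (N/m), damping b (N*s/m),
    indentation depth x (m), and indentation velocity v (m/s)."""
    return k * x + b * v

# Assumed parameters: k = 200 N/m, b = 5 N*s/m, 10 mm indentation at 0.1 m/s.
print(indentation_force(200.0, 5.0, 0.01, 0.1))  # 2.5
```

    Fitting k and b to indenter data across frequencies would give the frequency-dependent compliance the abstract refers to; non-linear effects require the second, large-deflection device.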

  8. Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.

    Science.gov (United States)

    Gibson, Alison; Artemiadis, Panagiotis

    2014-01-01

    As the field of brain-machine interfaces and neuro-prosthetics continues to grow, there is a high need for sensor and actuation mechanisms that can provide haptic feedback to the user. Current technologies employ expensive, invasive and often inefficient force feedback methods, resulting in an unrealistic solution for individuals who rely on these devices. This paper responds through the development, integration and analysis of a novel feedback architecture where haptic information during the neural control of a prosthetic hand is perceived through multi-frequency auditory signals. Through representing force magnitude with volume and force location with frequency, the feedback architecture can translate the haptic experiences of a robotic end effector into the alternative sensory modality of sound. Previous research with the proposed cross-modal feedback method confirmed its learnability, so the current work aimed to investigate which frequency map (i.e. frequency-specific locations on the hand) is optimal in helping users distinguish between hand-held objects and tasks associated with them. After short use with the cross-modal feedback during the electromyographic (EMG) control of a prosthetic hand, testing results show that users are able to use auditory feedback alone to discriminate between everyday objects. While users showed adaptation to three different frequency maps, the simplest map containing only two frequencies was found to be the most useful in discriminating between objects. This outcome provides support for the feasibility and practicality of the cross-modal feedback method during the neural control of prosthetics.
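
    The cross-modal mapping described above (force location selects frequency, force magnitude sets volume) can be sketched as follows. The two-entry frequency map mirrors the simplest map in the study, but the specific frequencies, locations, and force range are assumptions for illustration.

```python
# Hypothetical two-frequency map: contact location -> tone frequency.
FREQ_MAP_HZ = {"thumb": 440.0, "fingers": 880.0}  # assumed values
MAX_FORCE_N = 10.0                                # assumed saturation force

def haptic_to_audio(location: str, force_n: float):
    """Translate one contact reading into an audio cue.

    Returns (frequency_hz, volume): location picks the tone,
    force magnitude sets the volume in [0, 1]."""
    volume = min(max(force_n, 0.0), MAX_FORCE_N) / MAX_FORCE_N
    return FREQ_MAP_HZ[location], volume

print(haptic_to_audio("thumb", 5.0))  # (440.0, 0.5)
```

    A real system would synthesize and mix these tones continuously as grip forces change, rather than emit discrete readings.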

  9. Review of Designs for Haptic Data Visualization.

    Science.gov (United States)

    Paneels, Sabrina; Roberts, Jonathan C

    2010-01-01

    There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.
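
    One recurring design the review surveys is the force-feedback line graph. A minimal sketch of that idea, under assumed values (the data series, stiffness gain, and nearest-sample lookup are illustrative, not from the paper): the plotted series acts as a groove that attracts the haptic cursor, letting a user trace the data by feel.

```python
# Hypothetical force-feedback line graph: a vertical spring pulls the
# haptic cursor toward the plotted value at the nearest x sample.

DATA = [3.0, 5.0, 4.0, 6.0]  # y-values at integer x positions (assumed)

def graph_force(cursor_x: float, cursor_y: float, stiffness: float = 0.8) -> float:
    """Return the vertical feedback force on the haptic cursor."""
    # Clamp to the nearest valid sample index.
    i = min(max(int(round(cursor_x)), 0), len(DATA) - 1)
    return stiffness * (DATA[i] - cursor_y)

print(graph_force(1.2, 4.0))  # 0.8 (pulled upward toward y = 5.0)
```

    Smoother variants interpolate between samples so the groove has no steps between x positions.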

  10. User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms; Myers, Brad A

    2008-01-01

    User Interfaces have been around as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, some papers on the history of Human-Computer Interaction and User Interfaces have appeared, primarily focusing on the graphical interface e...

  11. A user interface for mobile robotized tele-echography

    Energy Technology Data Exchange (ETDEWEB)

    Triantafyllidis, G.A. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece)]. E-mail: gatrian@iti.gr; Thomos, N. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece); Canero, C. [Computer Vision Center, UAB, Barcelona (Spain); Vieyres, P. [Laboratoire Vision and Robotique Universite d' Orleans, Bourges (France); Strintzis, M.G. [Informatics and Telematics Institute ITI-CERTH, Thessaloniki (Greece)

    2006-12-20

    Ultrasound imaging allows the evaluation of the degree of emergency of a patient. However, in many situations no experienced sonographer is available to perform such echography. To cope with this issue, the OTELO project 'mObile Tele-Echography using an ultra-Light rObot' (OTELO) aims to develop a fully integrated end-to-end mobile tele-echography system using an ultralight, remotely controlled six degree-of-freedom (DOF) robot. In this context, this paper deals with the user interface environment of the OTELO system, composed of the following parts: an ultrasound video transmission system providing real-time images of the scanned area at each moment, an audio/video conference to communicate with the paramedical assistant and the patient, and finally a virtual reality environment, providing visual and haptic feedback to the expert, while capturing the expert's hand movements with a one-DOF hand free input device.

  12. A user interface for mobile robotized tele-echography

    International Nuclear Information System (INIS)

    Triantafyllidis, G.A.; Thomos, N.; Canero, C.; Vieyres, P.; Strintzis, M.G.

    2006-01-01

    Ultrasound imaging allows the evaluation of the degree of emergency of a patient. However, in many situations no experienced sonographer is available to perform such echography. To cope with this issue, the OTELO project 'mObile Tele-Echography using an ultra-Light rObot' (OTELO) aims to develop a fully integrated end-to-end mobile tele-echography system using an ultralight, remotely controlled six degree-of-freedom (DOF) robot. In this context, this paper deals with the user interface environment of the OTELO system, composed of the following parts: an ultrasound video transmission system providing real-time images of the scanned area at each moment, an audio/video conference to communicate with the paramedical assistant and the patient, and finally a virtual reality environment, providing visual and haptic feedback to the expert, while capturing the expert's hand movements with a one-DOF hand free input device

  13. Effects of 3D virtual haptics force feedback on brand personality perception: the mediating role of physical presence in advergames.

    Science.gov (United States)

    Jin, Seung-A Annie

    2010-06-01

    This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.

  14. Virtual Reality and Haptics for Product Assembly

    Directory of Open Access Journals (Sweden)

    Maria Teresa Restivo

    2012-01-01

    Full Text Available Haptics can significantly enhance the user's sense of immersion and interactivity. An industrial application of virtual reality and haptics for product assembly is described in this paper, which provides a new and low-cost approach for product assembly design, assembly task planning and assembly operation training. A demonstration of the system with haptics device interaction was available at the session of exp.at'11.

  15. Mechatronic design of haptic forceps for robotic surgery.

    Science.gov (United States)

    Rizun, P; Gunn, D; Cox, B; Sutherland, G

    2006-12-01

    Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements to engineering specification. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrate with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 microm with 1.3 microm RMS noise. The forceps can simulate stiffness greater than 20N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.

  16. High Fidelity Haptic Rendering

    CERN Document Server

    Otaduy, Miguel A

    2006-01-01

    The human haptic system, among all senses, provides unique and bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force feedback devices, have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance t

  17. Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.

    Directory of Open Access Journals (Sweden)

    Min Li

    Full Text Available This paper proposes a pseudo-haptic feedback method conveying simulated soft-surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information of a simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. A comparison with a tablet computer incorporating vibration feedback was conducted with regard to (a) nodule detection sensitivity and (b) elapsed time as performance indicators in hard-nodule detection experiments. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction. It is noted that multi-point pseudo-haptic feedback performs similarly well when compared to a vibration-based feedback method on both performance measures, elapsed time and nodule detection sensitivity. This proves that the proposed method can be used to convey detailed haptic information for virtual environmental tasks, even subtle ones, using either a computer mouse or a pressure-sensitive device as an input device. This pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as, for example, occurring in ever more realistic video games with increasing emphasis on interaction with the physical environment and minimally invasive surgery in the form of soft tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, either using desktop computers or portable devices, showing
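
    The avatar-speed technique above is a control/display-ratio adjustment: while the indenter presses into the surface, stiffer material yields less on-screen displacement per unit of input motion, which users perceive as resistance. The mapping and gain below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical pseudo-haptic C/D mapping: the avatar's indentation per
# input displacement shrinks as the (dimensionless) stiffness gain grows.

def avatar_displacement(input_disp: float, stiffness: float) -> float:
    """Return the on-screen indenter displacement for a given input
    displacement; stiffness >= 0 is an assumed per-region gain."""
    return input_disp / (1.0 + stiffness)

# Over a hard nodule (high stiffness) the avatar barely indents:
print(avatar_displacement(10.0, 4.0))  # 2.0
# Over surrounding soft tissue it indents much further:
print(avatar_displacement(10.0, 0.5))  # about 6.67
```

    Combined with the visual surface-deformation cue, this speed difference alone lets users localize hard inclusions without any force-feedback hardware.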

  18. Prototype of haptic device for sole of foot using magnetic field sensitive elastomer

    Science.gov (United States)

    Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.

    2013-02-01

    Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, for those with physical and/or cognitive disabilities, travelling somewhere unfamiliar can be challenging. The final goal of this study is to develop a virtual reality walking system that allows users to walk in virtual worlds fabricated with computer graphics. We focus on a haptic device that can exert various plantar pressures on the soles of users' feet as an additional sense in virtual reality walking. In this study, we discuss the use of a magnetic field sensitive elastomer (MSE) as a working material for the haptic interface on the sole. The first prototype with MSE was developed and evaluated in this work. Measurements of plantar pressure showed that this device can exert different pressures on the sole of a light-weight user by applying a magnetic field to the MSE. The results also indicated the need to improve the magnetic circuit and the basic structure of the device's mechanism.

  19. KinoHaptics: An Automated, Wearable, Haptic Assisted, Physio-therapeutic System for Post-surgery Rehabilitation and Self-care.

    Science.gov (United States)

    Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy

    2016-03-01

    A carefully planned, structured, and supervised physiotherapy program following a surgery is crucial for successful recovery from physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy. The demand for a physiotherapist over an extended period is expensive to afford, and sometimes inaccessible. Researchers have tried to leverage the advancements in wearable sensors and motion tracking by building affordable, automated, physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. Many aspects of an automated physiotherapy program are yet to be addressed by existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple demographics of patients (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics, for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions of the patients. The system provides rich and accurate vibro-haptic feedback that can be felt by the user, irrespective of physiological limitations. KinoHaptics is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and, most importantly, life-style compatibility. The system was evaluated under laboratory conditions, involving 14 users. Results show that KinoHaptics is highly convenient to use, and the vibro-haptic feedback is intuitive, accurate, and has been shown to prevent accidental injuries. Also, results show that KinoHaptics is persuasive in nature, as it supports behavior change and habit building.

  20. Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.

    Science.gov (United States)

    Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka

    2018-04-01

    This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. Although it is necessary to install small protrusions in the determined direction, by
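
    The viewpoint manipulation paired with the bumps above can be pictured as interpolating the camera's vertical offset across each virtual step. A simplified sketch follows; the riser height and linear interpolation are illustrative assumptions, whereas the paper derives its manipulation from actual stair-walking data.

```python
def viewpoint_height(progress, riser_height=0.17, ascending=True):
    """Interpolate the virtual camera's vertical offset across one step:
    as the foot crosses the bump (progress 0..1), raise (or lower) the
    viewpoint by one riser so the flat floor is perceived as a stair."""
    t = min(1.0, max(0.0, progress))  # clamp step progress to [0, 1]
    return riser_height * t if ascending else -riser_height * t
```

    Summing this offset over successive steps yields the continuous camera trajectory for a full flight of virtual stairs.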

  1. User interface inspection methods a user-centered design method

    CERN Document Server

    Wilson, Chauncey

    2014-01-01

    User Interface Inspection Methods succinctly covers five inspection methods: heuristic evaluation, perspective-based user interface inspection, cognitive walkthrough, pluralistic walkthrough, and formal usability inspections. Heuristic evaluation is perhaps the best-known inspection method, requiring a group of evaluators to review a product against a set of general principles. The perspective-based user interface inspection is based on the principle that different perspectives will find different problems in a user interface. In the related persona-based inspection, colleagues assume the

  2. User interface design considerations

    DEFF Research Database (Denmark)

    Andersen, Simon Engedal; Jakobsen, Arne; Rasmussen, Bjarne D.

    1999-01-01

    When designing a user interface for a simulation model there are several important issues to consider: Who is the target user group, and what a priori information can be expected? What questions do the users want answered, and what questions are answered using a specific model? When developing the user interface of EESCoolTools these issues led to a series of simulation tools, each with a specific purpose and a carefully selected set of input and output variables. To allow a wider range of questions to be answered by the same model, the user can change between different sets of input and output variables. This feature requires special attention when designing the user interface, and a special approach for controlling the user selection of input and output variables was developed. To obtain a consistent system description, the different input variables are grouped corresponding...

  3. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    Science.gov (United States)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has been increasingly utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications to which they can be applied. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on its surface. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, it describes the sensory receptors within the skin for sensing tactile stimuli and electric currents, and explains several factors that influence the transmission of electric signals to the brain via human skin.

  4. Search-User Interface Design

    CERN Document Server

    Wilson, Max

    2011-01-01

    Search User Interfaces (SUIs) represent the gateway between people who have a task to complete, and the repositories of information and data stored around the world. Not surprisingly, therefore, there are many communities who have a vested interest in the way SUIs are designed. There are people who study how humans search for information, and people who study how humans use computers. There are people who study good user interface design, and people who design aesthetically pleasing user interfaces. There are also people who curate and manage valuable information resources, and people who desi

  5. User interfaces of information retrieval systems and user friendliness

    Directory of Open Access Journals (Sweden)

    Polona Vilar

    2008-01-01

    Full Text Available The paper deals with the characteristics of user interfaces of information retrieval systems with the emphasis on design and evaluation. It presents users’ information retrieval tasks and the functions which are offered through interfaces. Design rules, guidelines and standards are presented, as well as criteria and methods for evaluation. Special emphasis is placed on the concept of user friendliness as one of the most important characteristics of user interfaces. Various definitions of user friendliness are presented and their elements are also discussed. In the end, the paper shows how user interfaces should be designed, taking all these criteria into consideration.

  6. The virtual haptic back: A simulation for training in palpatory diagnosis

    Directory of Open Access Journals (Sweden)

    Eland David C

    2008-04-01

    Full Text Available Abstract Background Models and simulations are finding increased roles in medical education. The Virtual Haptic Back (VHB) is a virtual reality simulation of the mechanical properties of the human back designed as an aid to teaching clinical palpatory diagnosis. Methods Eighty-nine first year medical students of the Ohio University College of Osteopathic Medicine carried out six 15-minute practice sessions with the VHB, plus tests before and after the sessions, in order to monitor progress in identifying regions of simulated abnormal tissue compliance. Students palpated with two digits, fingers or thumbs, by placing them in gimbaled thimbles at the ends of PHANToM 3.0® haptic interface arms. The interface simulated the contours and compliance of the back surface by the action of electric motors. The motors limited the compression of the virtual tissues induced by the palpating fingers by generating counterforces. Users could see the position of their fingers with respect to the back on a video monitor just behind the plane of the haptic back. The abnormal region varied randomly among 12 locations between trials. During the practice sessions student users received immediate feedback following each trial, indicating either a correct choice or the actual location of the abnormality if an incorrect choice had been made. This allowed the user to feel the actual abnormality before going on to the next trial. Changes in accuracy, speed and Weber fraction across practice sessions were analyzed using a repeated measures analysis of variance. Results Students improved in accuracy and speed of diagnosis with practice. The smallest difference in simulated tissue compliance users were able to detect improved from 28% (SD = 9.5%) to 14% (SD = 4.4%) during the practice sessions, while average detection time decreased from 39 (SD = 19.8) to 17 (SD = 11.7) seconds. When asked in anonymous evaluation questionnaires if they judged the VHB practice to be helpful to
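
    The compliance thresholds reported above (28% improving to 14%) are Weber fractions: the smallest detectable difference expressed relative to the baseline compliance. Such thresholds are commonly estimated with an adaptive staircase; the following is a minimal sketch of a simple 1-up/1-down variant. The step sizes and starting level are illustrative, and the paper does not state which procedure was used.

```python
def staircase(responses, start=0.28, step=0.02, floor=0.02):
    """1-up/1-down staircase over the compliance difference (as a fraction
    of baseline): decrease the difference after a correct detection,
    increase it after a miss. `responses` is a sequence of booleans."""
    level = start
    levels = []
    for correct in responses:
        levels.append(level)
        level = max(floor, level - step) if correct else level + step
    return levels
```

    The threshold estimate is typically taken as the average of the levels at the last several reversals of the staircase.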

  7. Audio-haptic interaction in simulated walking experiences

    DEFF Research Database (Denmark)

    Serafin, Stefania

    2011-01-01

    In this paper an overview of the work conducted on audio-haptic physically based simulation and evaluation of walking is provided. This work has been performed in the context of the Natural Interactive Walking (NIW) project, whose goal is to investigate possibilities for the integrated and interchangeable use of the haptic and auditory modality in floor interfaces, and for the synergy of perception and action in capturing and guiding human walking. We describe the technology developed in the context of this project, together with some experiments performed to evaluate the role of auditory and haptic feedback in walking tasks.

  8. Modeling and Design of an Electro-Rheological Fluid Based Haptic System for Tele-Operation of Space Robots

    Science.gov (United States)

    Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph

    2000-01-01

    For many years, the robotic community sought to develop robots that can eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks that humans can perform significantly better but, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and usually allow only a limited operator workspace. In this paper a novel haptic interface is presented to enable human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained ElectroRheological Fluids (ERF), which change their viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user using this ERF device, in which a change in the system viscosity will occur in proportion to the force to be transmitted. In this paper, we will present the results of our modeling, simulation, and initial testing of such an
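
    The force-reflection loop described above reduces to a single mapping: the larger the force sensed at the robot end-effector, the stronger the field applied to the ERF, and hence the higher the resistive viscosity felt at the hand master. A hedged sketch of that mapping follows; the function name, gain, and voltage ceiling are illustrative assumptions, not the paper's values.

```python
def erf_voltage(force_n, v_max=2000.0, k=400.0):
    """Map the force measured at the robot end-effector (newtons) to an
    excitation voltage for the ERF element: command a voltage proportional
    to the force to be reflected, clipped to a safe maximum."""
    return min(v_max, max(0.0, k * force_n))
```

    In a real controller this command would run inside the haptic servo loop, with the proportionality constant calibrated against the measured viscosity-versus-field curve of the fluid.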

  9. Generating User Interfaces with the FUSE-System

    OpenAIRE

    Frank Lonczewski; Siegfried Schreiber

    2017-01-01

    With the FUSE(Formal User interface Specification Environment)-System we present a methodology and a set of integrated tools for the automatic generation of graphical user interfaces. FUSE provides tool-based support for all phases (task-, user-, problem domain analysis, design of the logical user interface, design of user interface in a particular layout style) of the user interface development process. Based on a formal specification of dialogue- and layout guidelines, FUSE allows the autom...

  10. Natural user interfaces for multitouch devices

    OpenAIRE

    Bukovinski, Matej

    2010-01-01

    This thesis presents a new class of user interfaces, which are commonly referred to as natural user interfaces. It discusses their main characteristics, evolution and advantages over currently dominant graphical user interfaces. Special attention is devoted to the subgroup of natural user interfaces for multitouch devices. Multitouch technology is first presented from a technical point of view, and afterwards also in practice, in the form of a comparative study of six popular multitouch platfo...

  11. DIRAC: Secure web user interface

    International Nuclear Information System (INIS)

    Casajus Ramo, A; Sapunov, M

    2010-01-01

    Traditionally the interaction between users and the Grid is done with command line tools. However, these tools are difficult for non-expert users, providing minimal help and generating outputs that are not always easy to understand, especially in case of errors. Graphical User Interfaces are typically limited to providing access to monitoring or accounting information, and concentrate on some particular aspects, failing to cover the full spectrum of grid control tasks. To make the Grid more user friendly, more complete graphical interfaces are needed. Within the DIRAC project we have attempted to construct a Web based User Interface that provides means not only for monitoring the system behavior but also for steering the main user activities on the grid. Using DIRAC's web interface a user can easily track jobs and data. It provides access to job information and allows performing actions on jobs, such as killing or deleting them. Data managers can define and monitor file transfer activity as well as check requests set by jobs. Production managers can define and follow large data productions and react if necessary by stopping or starting them. The Web Portal is built following all the grid security standards and using modern Web 2.0 technologies, which allow achieving a user experience similar to that of desktop applications. Details of the DIRAC Web Portal architecture and User Interface will be presented and discussed.

  12. Tactile Feedback for Above-Device Gesture Interfaces

    OpenAIRE

    Freeman, Euan; Brewster, Stephen; Lantz, Vuokko

    2014-01-01

    Above-device gesture interfaces let people interact in the space above mobile devices using hand and finger movements. For example, users could gesture over a mobile phone or wearable without having to use the touchscreen. We look at how above-device interfaces can also give feedback in the space over the device. Recent haptic and wearable technologies give new ways to provide tactile feedback while gesturing, letting touchless gesture interfaces give touch feedback. In this paper we take a f...

  13. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    Science.gov (United States)

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality.
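
    The per-fingertip actuation described above amounts to addressing one of four actuators (two hands, thumb and index finger each) with vibration and heat intensities. A sketch of what the host-side command framing might look like follows; the 4-byte frame layout is a made-up illustration, as the paper does not specify its Arduino protocol.

```python
def haptic_packet(hand, finger, vib_duty, heat_duty):
    """Build a 4-byte command frame for the wristband microcontroller:
    [hand (0=left, 1=right), finger (0=thumb, 1=index),
     vibration PWM duty 0-255, heater PWM duty 0-255].
    The frame layout is an assumption for illustration only."""
    for v in (vib_duty, heat_duty):
        if not 0 <= v <= 255:
            raise ValueError("PWM duty must be 0-255")
    return bytes([hand & 1, finger & 1, vib_duty, heat_duty])
```

    A host application would write such frames over a serial link whenever the hand-tracking layer reports contact between a fingertip and a virtual object.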

  14. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    Science.gov (United States)

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545

  15. Haptic force-feedback devices for the office computer: performance and musculoskeletal loading issues.

    Science.gov (United States)

    Dennerlein, J T; Yang, M C

    2001-01-01

    Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. Hypothesizing that the time to perform a task and the self-reported pain and discomfort of the task improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests). Self-reported user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback. Use of a force-feedback device thus improves performance, and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.
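
    The experimental manipulation above, an attractive force field surrounding the target, can be modelled as a spring that engages once the cursor enters a capture radius around the target centre. A minimal sketch follows; the radius and stiffness values are illustrative assumptions.

```python
def attractive_force(cursor, target, radius=40.0, k=0.15):
    """Spring-like force (fx, fy) pulling the cursor toward the target
    centre once it enters the attraction radius; zero elsewhere."""
    dx, dy = target[0] - cursor[0], target[1] - cursor[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0 or dist > radius:
        return (0.0, 0.0)
    return (k * dx, k * dy)
```

    Rendered through the haptic device each servo tick, this field gently "snaps" the cursor onto the target during the final approach, which is what shortens the pointing movement.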

  16. Planning and User Interface Affordances

    National Research Council Canada - National Science Library

    St. Amant, Robert

    1999-01-01

    .... We identify a number of similarities between executing plans and interacting with a graphical user interface, and argue that affordances for planning environments apply equally well to user interface environments...

  17. Natural User Interfaces

    OpenAIRE

    Câmara , António

    2011-01-01

    Master's dissertation in Informatics Engineering presented to the Faculdade de Ciências e Tecnologia da Universidade de Coimbra. This project's main subject is Natural User Interfaces. These interfaces' main purpose is to allow the user to interact with computer systems in a more direct and natural way. The popularization of touch and gesture devices in the last few years has allowed them to become increasingly common, and today we are experiencing a transition of interface p...

  18. A haptic interface for virtual simulation of endoscopic surgery.

    Science.gov (United States)

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  19. User interfaces of information retrieval systems and user friendliness

    OpenAIRE

    Polona Vilar; Maja Žumer

    2008-01-01

    The paper deals with the characteristics of user interfaces of information retrieval systems with the emphasis on design and evaluation. It presents users’ information retrieval tasks and the functions which are offered through interfaces. Design rules, guidelines and standards are presented, as well as criteria and methods for evaluation. Special emphasis is placed on the concept of user friendliness as one of the most important characteristic of the user interfaces. Various definitions of u...

  20. Haptics using a smart material for eyes-free interaction in personal devices

    Science.gov (United States)

    Wang, Huihui; Lane, William Brian; Pappas, Devin; Duque, Bryam; Leong, John

    2014-03-01

    In this paper we present a prototype using a dry ionic polymer metal composite (IPMC) in interactive personal devices, such as a bracelet, necklace, pocket key chain or mobile device, for haptic interaction when audio or visual feedback is not possible or practical. This prototype interface is an electro-mechanical system that realizes a shape-changing haptic display for information communication. A dry IPMC changes its dimensions due to the electrostatic effect when an electrical potential is applied to it. The IPMC can operate at a low voltage (less than 2.5 V), which is compatible with the requirements of personal electrical devices or mobile devices. The prototype consists of addressable arrays of IPMCs with different dimensions, which are deformable into different shapes with proper handling or customization. 3D printing technology is used to form the supporting parts. Microcontrollers (about 3 cm square) from DigiKey are embedded into the personal device. An Android-based mobile app was developed to talk with the microcontrollers to control the IPMCs. When the personal device receives an information signal, the original shape of the prototype changes to another shape related to the specific sender or type of information source. This interactive prototype can simultaneously realize multiple methods for conveying haptic information, such as dimension, force, and texture, due to the flexible array design. We conduct several studies of user experience to explore how users respond to shape-change information.
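
    Mapping an information source to a distinct shape, as described above, is essentially a lookup from sender or notification type to an activation pattern over the addressable IPMC array. A hedged sketch follows; the source names, cell count, and patterns are invented for illustration and not taken from the paper.

```python
def shape_pattern(sender, num_cells=4):
    """Map an information source to an activation pattern over the IPMC
    array (True = cell actuated). The source-to-pattern table below is a
    made-up example; a real device would expose it as user configuration."""
    table = {
        "call":    [True, False, True, False],
        "message": [True, True, False, False],
        "alarm":   [True, True, True, True],
    }
    return table.get(sender, [False] * num_cells)
```

    The microcontroller would then drive the corresponding IPMC cells at their actuation voltage, letting the wearer identify the sender by touch alone.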

  1. Practical speech user interface design

    CERN Document Server

    Lewis, James R

    2010-01-01

    Although speech is the most natural form of communication between humans, most people find using speech to communicate with machines anything but natural. Drawing from psychology, human-computer interaction, linguistics, and communication theory, Practical Speech User Interface Design provides a comprehensive yet concise survey of practical speech user interface (SUI) design. It offers practice-based and research-based guidance on how to design effective, efficient, and pleasant speech applications that people can really use. Focusing on the design of speech user interfaces for IVR application

  2. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    Science.gov (United States)

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  3. Adaptive user interfaces

    CERN Document Server

    1990-01-01

    This book describes techniques for designing and building adaptive user interfaces developed in the large AID project undertaken by the contributors.Key Features* Describes one of the few large-scale adaptive interface projects in the world* Outlines the principles of adaptivity in human-computer interaction

  4. Experiment on a novel user input for computer interface utilizing tongue input for the severely disabled.

    Science.gov (United States)

    Kencana, Andy Prima; Heng, John

    2008-11-01

    This paper introduces a novel passive tongue control and tracking device. The device is intended to be used by a severely disabled or quadriplegic person. The main difference between this device and other existing tongue tracking devices is that the sensor employed is passive, which means it requires no powered electrical sensor to be inserted into the user's mouth, and hence no trailing wires. This haptic interface device employs inductive sensors to track the position of the user's tongue. The device is able to perform the two main PC input functions, those of the keyboard and the mouse. The results show that this device allows a severely disabled person to have some control over his environment, such as turning on and off or controlling daily electrical devices or appliances, or to be used as a viable PC Human Computer Interface (HCI) by tongue control. The operating principle and set-up of such a novel passive tongue HCI has been established with successful laboratory trials and experiments. Further clinical trials will be required to test the device on disabled persons before it is ready for future commercial development.
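
    The mouse function described above requires turning the inductively sensed tongue position into a cursor delta. A hedged sketch of one plausible mapping follows; the four-sensor layout, deadzone, and gain are assumptions, as the paper does not detail its mapping.

```python
def tongue_to_cursor(up, down, left, right, deadzone=0.1, gain=20.0):
    """Convert four normalised inductive-sensor readings (0..1, one per
    direction) into a cursor delta (dx, dy). Readings below the deadzone
    are ignored so resting tongue positions do not move the pointer."""
    def clip(v):
        return v if v >= deadzone else 0.0
    dx = gain * (clip(right) - clip(left))
    dy = gain * (clip(down) - clip(up))
    return dx, dy
```

    Keyboard operation could reuse the same readings, treating a sustained deflection toward a sensor as a discrete key-selection event instead of continuous motion.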

  5. User Interface Cultures of Mobile Knowledge Workers

    Directory of Open Access Journals (Sweden)

    Petri Mannonen

    2008-10-01

    Full Text Available Information and communication tools (ICTs) have become a major influence on how modern work is carried out. However, methods of user-centered design do not take into account the full complexity of technology and the user interface context in which users live. User interface culture analysis aims at providing designers with new ways and strategies to better take the current user interface environment into account when designing new products. This paper describes the rationale behind user interface culture analysis and shows examples of its usage in studying mobile and distributed knowledge workers.

  6. Overview of Graphical User Interfaces.

    Science.gov (United States)

    Hulser, Richard P.

    1993-01-01

    Discussion of graphical user interfaces for online public access catalogs (OPACs) covers the history of OPACs; OPAC front-end design, including examples from Indiana University and the University of Illinois; and planning and implementation of a user interface. (10 references) (EA)

  7. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle, based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns, augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  8. Performance evaluation of a robot-assisted catheter operating system with haptic feedback.

    Science.gov (United States)

    Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi

    2018-06-20

    In this paper, a novel robot-assisted catheter operating system (RCOS) is proposed as a means of reducing physical stress and X-ray exposure time for physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat and rotate) to an input catheter placed at the master side, which controls another catheter placed in the patient at the slave side. For this purpose, a magnetorheological (MR) fluids-based master haptic interface has been developed to measure the axial and radial motions of the input catheter, as well as to provide haptic feedback to the physician during the operation. To achieve a quick response of the haptic force in the master haptic interface, a Hall sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator delivers the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system are measured by the force sensor unit designed into the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for precision positioning, and the time lag from sensed motion to replicated motion was tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments were carried out. The results demonstrate that the proposed system can decrease the contact forces between the catheter and the vasculature.
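
    The quantization of contact force into four haptic levels described above can be sketched as a simple threshold mapping. The thresholds below are illustrative assumptions, not values from the paper.

```python
# Sketch: quantizing a measured catheter contact force into the four haptic
# feedback levels the abstract mentions. The thresholds (in newtons) are
# illustrative assumptions, not values from the paper.

LEVEL_THRESHOLDS_N = [0.2, 0.5, 1.0]  # boundaries between levels 1..4

def haptic_level(contact_force_n):
    """Map a measured contact force (N) to a discrete feedback level 1-4."""
    level = 1
    for threshold in LEVEL_THRESHOLDS_N:
        if contact_force_n > threshold:
            level += 1
    return level
```

    A coarse discrete mapping like this keeps the rendered force stable even when the measured signal is noisy.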

  9. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI) that reviews the design of end-user interfaces.This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI that range from basic definitions of the problem, evaluation of how to look at the problem domain, and fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics-definition of the problem, psychological and social factors, principles of interface design, computer intelligenc

  10. User interface for a partially incompatible system software environment with non-ADP users

    Energy Technology Data Exchange (ETDEWEB)

    Loffman, R.S.

    1987-08-01

    Good user interfaces to computer systems and software applications are the result of combining an analysis of user needs with knowledge of interface design principles and techniques. This thesis reports on an interface for an environment (a) whose users are not computer science or data processing professionals, and (b) which is bound by predetermined software and hardware. An interface was designed that combined these considerations with user interface design principles. Current literature was investigated to establish a baseline of knowledge about user interface design. There are many techniques that can be used to implement a user interface, but all should have the same basic goal: to assist the user in the performance of a task. This can be accomplished by providing the user with consistent, well-structured interfaces that also provide the flexibility to adapt to differences among users. The interface produced used menu selection and command language techniques to make two different operating system environments appear similar. Additional features helped to address the needs of different users. The original goal was also to make the transition between the two systems transparent; this was not fully accomplished due to software and hardware limitations.

  11. End User Development Toolkit for Developing Physical User Interface Applications

    OpenAIRE

    Abrahamsen, Daniel T; Palfi, Anders; Svendsen, Haakon Sønsteby

    2014-01-01

    BACKGROUND: Tangible user interfaces and end user development are two growing research areas in software technology. Physical representation promotes opportunities to ease the use of technology and reinforce personality traits such as creativeness, collaboration and intuitive actions. However, designing tangible user interfaces is both cumbersome and requires several layers of architecture. End user development allows users with no programming experience to create or customize their own applications. ...

  12. Spectrometer user interface to computer systems

    International Nuclear Information System (INIS)

    Salmon, L.; Davies, M.; Fry, F.A.; Venn, J.B.

    1979-01-01

    A computer system for use in radiation spectrometry should be designed around the needs and comprehension of the user and his operating environment. To this end, the functions of the system should be built in a modular and independent fashion such that they can be joined to the back end of an appropriate user interface. The point that this interface should be designed, rather than just allowed to evolve, is illustrated by reference to four related computer systems of differing complexity and function. The physical user interfaces in all cases are keyboard terminals, and the virtues and shortcomings of these devices are discussed and compared with others. The language interface needs to satisfy a number of requirements, often conflicting. Among these, simplicity and speed of operation compete with flexibility and scope. Both experienced and novice users need to be considered, and any individual's needs may vary from naive to complex. To be efficient and resilient, the implementation must use an operating system, but the user needs to be protected from its complex and unfamiliar syntax. At the same time, the interface must allow the user access to all services appropriate to his needs. The user must also receive a sense of privacy in a multi-user system. The interface itself must be stable and exhibit continuity between implementations. Some of these conflicting needs have been met by the SABRE interface, with languages operating at several levels. The foundation is a simple semi-mnemonic command language that activates individual, independent functions. The commands can be used with positional parameters or in an interactive dialogue, the precise nature of which depends upon the operating environment and the user's experience. A command procedure or macro language allows combinations of commands with conditional branching and arithmetic features. Thus complex but repetitive operations are easily performed.

  13. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

    Full Text Available From the human-computer interface perspective, the challenges to be faced relate to the consideration of new, multiple interactions and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning, ...) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to the human-computer interface, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  14. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers to end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  15. User acquaintance with mobile interfaces.

    Science.gov (United States)

    Ehrler, Frederic; Walesa, Magali; Sarrey, Evelyne; Wipfli, Rolf; Lovis, Christian

    2013-01-01

    Handheld technology is slowly finding its place in the healthcare world. Some clinicians already make intensive use of dedicated mobile applications to consult clinical references. However, handheld technology has not yet been broadly embraced at the core of the healthcare business, the hospitals. The weak penetration of handheld technology in hospitals can be partly explained by the caution of stakeholders, who must be convinced of the efficiency of these tools before going forward. In a domain where temporal constraints are increasingly strong, caregivers cannot lose time playing with gadgets. Not all users are comfortable with tactile manipulation, and the lack of dedicated peripherals complicates data entry for novices. Stakeholders must be convinced that caregivers will be able to master handheld devices. In this paper, we make the assumption that the design of an interface may influence users' performance in recording information. We are also interested in finding out whether users increase their efficiency when using handheld tools repeatedly. To answer these questions, we set up a field study comparing users' performance on three different user interfaces while recording vital signs. Some of the user interfaces were familiar to users; others were totally new. Results showed that users' familiarity with smartphones influences their performance and that users improve their performance by repeating a task.

  16. Flippable User Interfaces for Internationalization

    OpenAIRE

    Khaddam, Iyad; Vanderdonckt, Jean; 3rd ACM Symposium on Engineering Interactive Computing Systems EICS’2011

    2011-01-01

    The language reading direction is probably one of the most determinant factors influencing the successful internationalization of graphical user interfaces, beyond their mere translation. Western languages are read from left to right and top to bottom, while Arabic languages and Hebrew are read from right to left and top to bottom, and Oriental languages are read from top to bottom. In order to address this challenge, we introduce flippable user interfaces that enable the end user to change t...

  17. Encountered-Type Haptic Interface for Representation of Shape and Rigidity of 3D Virtual Objects.

    Science.gov (United States)

    Takizawa, Naoki; Yano, Hiroaki; Iwata, Hiroo; Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-01-01

    This paper describes the development of an encountered-type haptic interface that can generate the physical characteristics, such as shape and rigidity, of three-dimensional (3D) virtual objects using an array of newly developed non-expandable balloons. To alter the rigidity of each non-expandable balloon, the volume of air in it is controlled through a linear actuator and a pressure sensor based on Hooke's law. Furthermore, to change the volume of each balloon, its exposed surface area is controlled by using another linear actuator with a trumpet-shaped tube. A position control mechanism is constructed to display virtual objects using the balloons. The 3D position of each balloon is controlled using a flexible tube and a string. The performance of the system is tested and the results confirm the effectiveness of the proposed principle and interface.
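
    The Hooke's-law rigidity control described above can be sketched as a simple force law plus a proportional volume correction; the function names and the gain below are illustrative assumptions, not the authors' controller.

```python
# Sketch: Hooke's-law stiffness rendering for one balloon. Given a target
# rigidity k and the measured indentation x of the balloon surface, the
# controller aims for the reaction force F = k * x; the air volume is then
# nudged toward that force. The names and gain are illustrative assumptions.

def target_force(stiffness_n_per_m, indentation_m):
    """Hooke's law: reaction force F = k * x the balloon should exert."""
    return stiffness_n_per_m * indentation_m

def volume_correction(measured_force_n, desired_force_n, gain=1e-6):
    """Proportional air-volume correction (m^3) applied by the actuator."""
    return gain * (desired_force_n - measured_force_n)
```

    In a control loop, the pressure sensor supplies the measured force and the linear actuator applies the volume correction each cycle.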

  18. Applying Cognitive Psychology to User Interfaces

    Science.gov (United States)

    Durrani, Sabeen; Durrani, Qaiser S.

    This paper explores some key aspects of cognitive psychology that may be mapped onto user interfaces. The major focus in existing user interface guidelines is on consistency, simplicity, feedback, system messages, display issues, navigation, colors, graphics, visibility and error prevention [8-10]. These guidelines are effective in designing user interfaces. However, they do not handle issues that may arise from the innate structure of the human brain and human limitations, for example, where to place graphics on the screen so that the user can easily process them, or what kind of background should be used on the screen given the limitations of the human motor system. In this paper we have collected some available guidelines from the area of cognitive psychology [1, 5, 7]. In addition, we have extracted a few guidelines from theories and studies of cognitive psychology [3, 11] which may be mapped to user interfaces.

  19. Expanding the Scope of Instant Messaging with Bidirectional Haptic Communication

    OpenAIRE

    Kim, Youngjae; Hahn, Minsoo

    2010-01-01

    This work was conducted at the intersection of two fields: haptics and social messaging. Haptics is one of the most attention-drawing fields and among the biggest buzzwords for next-generation users, and is being applied to conventional devices such as the cellular phone and even the door lock. Diverse forms of media such as blogs, social network services, and instant messengers are used to send and receive messages. That is mainly why we focus on the messaging experience, the most frequent ...

  20. Experimental evaluation of magnified haptic feedback for robot-assisted needle insertion and palpation.

    Science.gov (United States)

    Meli, Leonardo; Pacchierotti, Claudio; Prattichizzo, Domenico

    2017-12-01

    Haptic feedback has been proven to play a key role in enhancing the performance of teleoperated medical procedures. However, due to safety issues, commercially available medical robots do not currently provide the clinician with haptic feedback. This work presents the experimental evaluation of a teleoperation system for robot-assisted medical procedures able to provide magnified haptic feedback to the clinician. Forces registered at the operating table are magnified and provided to the clinician through a 7-DoF haptic interface. The same interface is also used to control the motion of a 6-DoF slave robotic manipulator. The safety of the system is guaranteed by a time-domain passivity-based control algorithm. Two experiments were carried out on stiffness discrimination (during palpation and needle insertion) and one experiment on needle guidance. Our haptic-enabled teleoperation system improved performance with respect to direct hand interaction by 80%, 306%, and 27% in stiffness discrimination through palpation, stiffness discrimination during needle insertion, and guidance, respectively. Copyright © 2017 John Wiley & Sons, Ltd.
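
    The force magnification described above can be sketched as a scale-and-clip mapping from the slave-side sensed force to the master-side rendered force; the magnification factor and rendering limit below are illustrative assumptions, not the system's actual parameters.

```python
# Sketch: magnified force feedback in a master-slave teleoperation loop.
# Forces sensed at the slave side are scaled before being rendered on the
# master haptic interface; the magnification factor and the clipping limit
# are illustrative assumptions, not the system's actual parameters.

def magnified_feedback(slave_force_n, magnification=3.0, max_render_n=10.0):
    """Scale the sensed force and clip it to the device's renderable range."""
    scaled = magnification * slave_force_n
    return max(-max_render_n, min(max_render_n, scaled))
```

    Clipping to the device's renderable range matters in practice, since magnification can otherwise command forces the haptic interface cannot produce.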

  1. Gestures in an Intelligent User Interface

    Science.gov (United States)

    Fikkert, Wim; van der Vet, Paul; Nijholt, Anton

    In this chapter we investigated which hand gestures are intuitive for controlling a large-display multimedia interface from a user's perspective. Over the course of two sequential user evaluations, we defined a simple gesture set that allows users to fully and intuitively control a large-display multimedia interface. First, we evaluated numerous gesture possibilities for a set of commands that can be issued to the interface. These gestures were selected from the literature, science fiction movies, and a previous exploratory study. Second, we implemented a working prototype with which users could interact with both hands, using the preferred hand gestures, with 2D and 3D visualizations of biochemical structures. We found that the gestures are influenced to a significant extent by the fast-paced developments in multimedia interfaces such as the Apple iPhone and the Nintendo Wii, and to no lesser degree by decades of experience with more traditional WIMP-based interfaces.

  2. Customization of user interfaces to reduce errors and enhance user acceptance.

    Science.gov (United States)

    Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram

    2014-03-01

    Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance of the RG compared to the CG while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling

    Science.gov (United States)

    Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.

    This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe different materials inside a deformable object as well as interact with it by touching the deformable object using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.

  4. Adapting haptic guidance authority based on user grip

    NARCIS (Netherlands)

    Smisek, J.; Mugge, W.; Smeets, J.B.J.; Van Paassen, M.M.; Schiele, A

    2014-01-01

    Haptic guidance systems support the operator in task execution using additional forces on the input device. Scaling of the guidance forces determines the control authority of the support system. As task complexity may vary, one level of the guidance scaling may be insufficient, and adaptation of the

  5. User interface and patient involvement.

    Science.gov (United States)

    Andreassen, Hege Kristin; Lundvoll Nilsen, Line

    2013-01-01

    Increased patient involvement is a goal in contemporary health care and is of importance to the development of patient-oriented ICT. In this paper we discuss how the design of patient-user interfaces can affect patient involvement. Our discussion is based on 12 semi-structured interviews with patient users of a web-based solution for patient--doctor communication piloted in Norway. We argue that ICT solutions offering a choice of user interfaces on the patient side are preferable, to ensure individual accommodation and a high degree of patient involvement. When introducing web-based tools for patient--health professional communication, a free-text option should be provided to the patient users.

  6. User interface for a tele-operated robotic hand system

    Science.gov (United States)

    Crawford, Anthony L

    2015-03-24

    Disclosed here is a user interface for a robotic hand. The user interface anchors a user's palm in a relatively stationary position and determines various angles of interest necessary for a user's finger to achieve a specific fingertip location. The user interface additionally conducts a calibration procedure to determine the user's applicable physiological dimensions. The user interface uses the applicable physiological dimensions and the specific fingertip location, and treats the user's finger as a two link three degree-of-freedom serial linkage in order to determine the angles of interest. The user interface communicates the angles of interest to a gripping-type end effector which closely mimics the range of motion and proportions of a human hand. The user interface requires minimal contact with the operator and provides distinct advantages in terms of available dexterity, work space flexibility, and adaptability to different users.
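
    The angle computation described above can be approximated by standard two-link inverse kinematics. The sketch below simplifies the patent's two-link, three degree-of-freedom formulation to a planar two-link case; the link lengths are illustrative assumptions.

```python
# Sketch: treating a finger as a planar two-link serial linkage and solving
# for the joint angles that place the fingertip at (x, y). This is standard
# two-link inverse kinematics, simplified from the patent's three
# degree-of-freedom formulation; the link lengths (metres) are illustrative.
import math

def finger_angles(x, y, l1=0.05, l2=0.03):
    """Return (proximal, distal) joint angles in radians for a tip at (x, y)."""
    d = math.hypot(x, y)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("fingertip target out of reach")
    cos_q2 = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    q2 = math.acos(max(-1.0, min(1.0, cos_q2)))  # clamp for float safety
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2

# Forward kinematics should recover the commanded fingertip position.
q1, q2 = finger_angles(0.06, 0.02)
fx = 0.05 * math.cos(q1) + 0.03 * math.cos(q1 + q2)
fy = 0.05 * math.sin(q1) + 0.03 * math.sin(q1 + q2)
```

    The calibration procedure mentioned in the abstract would supply the per-user link lengths that are hard-coded here.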

  7. Distributed user interfaces for clinical ubiquitous computing applications.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik

    2005-08-01

    Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel, to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices, such as digital pens, an active desk, and walk-up displays, that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model, which allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs--and similar network-based user interfaces--will be a prerequisite of future mobile user interfaces and essential to developing clinical multi-device environments.

  8. User interface for a tele-operated robotic hand system

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, Anthony L

    2015-03-24

    Disclosed here is a user interface for a robotic hand. The user interface anchors a user's palm in a relatively stationary position and determines various angles of interest necessary for a user's finger to achieve a specific fingertip location. The user interface additionally conducts a calibration procedure to determine the user's applicable physiological dimensions. The user interface uses the applicable physiological dimensions and the specific fingertip location, and treats the user's finger as a two link three degree-of-freedom serial linkage in order to determine the angles of interest. The user interface communicates the angles of interest to a gripping-type end effector which closely mimics the range of motion and proportions of a human hand. The user interface requires minimal contact with the operator and provides distinct advantages in terms of available dexterity, work space flexibility, and adaptability to different users.

  9. Design and Calibration of a New 6 DOF Haptic Device

    Directory of Open Access Journals (Sweden)

    Huanhuan Qin

    2015-12-01

    Full Text Available For many applications, such as tele-operated robots and interactions with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces; they can also track human hand positions simultaneously. A new 6 DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly contains a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability, and is therefore capable of multi-finger interactions. Moreover, with an adjustable base, operators can adopt different postures without interrupting haptic tasks. To investigate the performance regarding position tracking accuracy and static output forces, we conducted experiments on a three-dimensional electric sliding platform and with a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed.
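
    The displacement-error analysis mentioned above amounts to comparing commanded positions against measured ones; a minimal sketch, with illustrative sample data, follows.

```python
# Sketch: the kind of error analysis the calibration describes, comparing
# commanded positions against positions measured on the sliding platform.
# The sample data (millimetres) are illustrative, not the paper's results.

def position_errors(commanded, measured):
    """Return per-sample displacement errors and their RMS."""
    diffs = [m - c for c, m in zip(commanded, measured)]
    rms = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    return diffs, rms

diffs, rms = position_errors([10.0, 20.0, 30.0], [10.2, 19.9, 30.1])
```

    The same computation applies to the static force errors, with the force gauge readings in place of the platform positions.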

  10. Design and Calibration of a New 6 DOF Haptic Device

    Science.gov (United States)

    Qin, Huanhuan; Song, Aiguo; Liu, Yuqing; Jiang, Guohua; Zhou, Bohe

    2015-01-01

    For many applications, such as tele-operated robots and interactions with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces; they can also track human hand positions simultaneously. A new 6 DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly contains a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability, and is therefore capable of multi-finger interactions. Moreover, with an adjustable base, operators can adopt different postures without interrupting haptic tasks. To investigate the performance regarding position tracking accuracy and static output forces, we conducted experiments on a three-dimensional electric sliding platform and with a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed. PMID:26690449

  11. User interface development

    Science.gov (United States)

    Aggrawal, Bharat

    1994-01-01

    This viewgraph presentation describes the development of user interfaces for OS/2 versions of computer codes for the analysis of seals. Current status, new features, work in progress, and future plans are discussed.

  12. Towards open-source, low-cost haptics for surgery simulation.

    Science.gov (United States)

    Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie

    2014-01-01

    In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising educational tool. However, the adoption of these systems in research and clinical settings is still limited by the high cost of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We describe the basic mechanical design of the device, the sensor setup, and its software integration.

  13. Graphical User Interface Programming in Introductory Computer Science.

    Science.gov (United States)

    Skolnick, Michael M.; Spooner, David L.

    Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…

  14. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  15. How to Develop a User Interface That Your Real Users Will Love

    Science.gov (United States)

    Phillips, Donald

    2012-01-01

    A "user interface" is the part of an interactive system that bridges the user and the underlying functionality of the system. But people sometimes forget that the best interfaces will provide a platform to optimize the users' interactions so that they support and extend the users' activities in effective, useful, and usable ways. To look at it…

  16. Standards for the user interface - Developing a user consensus. [for Space Station Information System

    Science.gov (United States)

    Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.

    1987-01-01

    The user support environment (USE), a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads, is described in detail. Included in the USE concept are a user interface language, a run-time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed, as well as a methodology, based on prototype demonstrations, for involving users in the process of validating the USE concepts. Prototyping the key concepts and salient features of the proposed user interface standards greatly enhances the users' ability to respond.

  17. Command Recognition of Robot with Low Dimension Whole-Body Haptic Sensor

    Science.gov (United States)

    Ito, Tatsuya; Tsuji, Toshiaki

    The authors have developed “haptic armor”, a whole-body haptic sensor that can estimate contact position. Although it was developed for safety assurance of robots in human environments, it can also be used as an interface. This paper proposes a command recognition method based on finger-trace information and discusses some technical issues for improving the recognition accuracy of the system.
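A command recognizer based on finger-trace information could, in its simplest form, quantize the trace into stroke directions and match them against command templates. The sketch below is a hypothetical illustration of the idea, not the paper's actual algorithm; the command names are made up:

```python
# Quantize successive contact positions into 4 directions and match the
# resulting direction string against command templates.

def directions(trace):
    """Map a sequence of (x, y) contact points to 'U'/'D'/'L'/'R' moves."""
    moves = []
    for (x0, y0), (x1, y1) in zip(trace, trace[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):          # ties resolve to horizontal
            moves.append('R' if dx > 0 else 'L')
        else:
            moves.append('U' if dy > 0 else 'D')
    return ''.join(moves)

# Hypothetical direction-string templates for robot commands.
COMMANDS = {'RR': 'move_forward', 'LL': 'move_back', 'UD': 'stop'}

def recognize(trace):
    """Return the command matching the finger trace, or 'unknown'."""
    return COMMANDS.get(directions(trace), 'unknown')
```

A trace swiped rightward across the sensor twice, for example, yields the direction string 'RR' and is recognized as 'move_forward'.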

  18. Learning Analytics for Natural User Interfaces

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Shum, Simon Buckingham; Schneider, Bertrand; Charleer, Sven; Klerkx, Joris; Duval, Erik

    2017-01-01

    The continuous advancement of natural user interfaces (NUIs) allows for the development of novel and creative ways to support collocated collaborative work in a wide range of areas, including teaching and learning. The use of NUIs, such as those based on interactive multi-touch surfaces and tangible user interfaces (TUIs), can offer unique…

  19. Playful user interfaces interfaces that invite social and physical interaction

    CERN Document Server

    2014-01-01

    The book is about user interfaces to applications that have been designed for social and physical interaction. The interfaces are ‘playful’: users feel challenged to engage in social and physical interaction because it is fun. The topics presented in this book include interactive playgrounds, urban games using mobiles, sensor-equipped environments for playing, child-computer interaction, tangible game interfaces, interactive tabletop technology and applications, full-body interaction, exertion games, persuasion, engagement, evaluation, and user experience. Readers will not only get a survey of state-of-the-art research in these areas; the chapters also provide a vision of a future where playful interfaces are ubiquitous, that is, present and integrated in home, office, recreational, sports and urban environments, and where game elements will be integrated and welcomed.

  20. More playful user interfaces: interfaces that invite social and physical interaction

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2015-01-01

    This book covers the latest advances in playful user interfaces: interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback and anticipate the behavior of human users. The decreasing

  1. Eliciting user-sourced interaction mappings for body-based interfaces

    OpenAIRE

    May, Aaron

    2015-01-01

    Thanks to technological advancements, whole-body natural user interfaces are becoming increasingly common in modern homes and public spaces. However, because whole-body natural user interfaces lack obvious affordances, users can be unsure how to control the interface. In this thesis, I report the findings of a study of novice and expert users mock controlling a balance-based whole-body natural user interface during a Think Aloud task. I compare the strategies demonstrated by participants whi...

  2. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    OpenAIRE

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and hea...

  3. Sensorimotor Interactions in the Haptic Perception of Virtual Objects

    Science.gov (United States)

    1997-01-01

    the human user. Compared to our understanding of vision and audition, our knowledge of human haptic perception is very limited. Many basic...modalities such as vision and audition on haptic perception of viscosity or mass, for example...Some preliminary work has already been done in this...

  4. Towards personalized adaptive user interfaces

    International Nuclear Information System (INIS)

    Kostov, Vlaho; Fukuda, Shuchi; Yanagisawa, Hideyoshi

    2002-01-01

    An approach towards standardization of general rules for the synthesis and design of man-machine interfaces that include dynamic adaptive behavior is presented. The link between personality type (Myers-Briggs or Keirsey Temperament Sorter) and the personal preferences of users (Kansei) for the purpose of building a graphical user interface (GUI) was investigated. Rules for a personalized emotional GUI based on the subjective preferences of users were defined. The results were tested on a modified TETRIS game that displayed background characters capable of emotional response. When the system responded to a user in a manner customized to his or her preferences, the reaction time was shorter and information transfer was faster. Usability testing methods were used, and it was shown that development of a pleasant cartoon-face GUI based on the user's inborn personality tendencies is feasible. (Author)

  5. Language workbench user interfaces for data analysis

    Directory of Open Access Journals (Sweden)

    Victoria M. Benson

    2015-02-01

    Full Text Available Biological data analysis is frequently performed with command-line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command-line analysis software. These solutions are useful for problems that can be solved with workflows and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and to view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with an LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/).

  6. Language workbench user interfaces for data analysis

    Science.gov (United States)

    Benson, Victoria M.

    2015-01-01

    Biological data analysis is frequently performed with command-line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command-line analysis software. These solutions are useful for problems that can be solved with workflows and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and to view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with an LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929
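The high-level concepts the language models (Dataset, Endpoint, Feature Selection Method, Classifier) can be pictured roughly as a declarative configuration. The sketch below uses Python dataclasses purely for illustration; in the actual work these are MPS language constructs with projectional editors, and the field names and defaults here are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    path: str                 # location of the high-throughput data

@dataclass
class Endpoint:
    name: str                 # e.g. a clinical outcome to predict
    positive_label: str

@dataclass
class BiomarkerProject:
    """A biomarker development project assembled from the concepts above."""
    dataset: Dataset
    endpoint: Endpoint
    feature_selection: str = "t-test"   # assumed default method
    classifier: str = "SVM"             # assumed default classifier
    features: list = field(default_factory=list)
```

Configuring a project then amounts to filling in these slots, which is the kind of task a specialized user interface can make succinct.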

  7. Robot-assisted microsurgical forceps with haptic feedback for transoral laser microsurgery.

    Science.gov (United States)

    Deshpande, Nikhil; Chauhan, Manish; Pacchierotti, Claudio; Prattichizzo, Domenico; Caldwell, Darwin G; Mattos, Leonardo S

    2016-08-01

    In this paper, a novel, motorized, multi-degree-of-freedom (DoF) microsurgical forceps tool is presented, based on a master-slave teleoperation architecture. The slave device is a 7-DoF manipulator with: (i) 6 DoF for positioning and orientation; (ii) 1 open/close gripper DoF; and (iii) an integrated force/torque sensor for tissue grip-force measurement. The master device is a 7-DoF haptic interface which teleoperates the slave device and provides haptic feedback through its gripper interface. The combination of the device and the surgeon interface replaces the manual, hand-held device, providing easy-to-use and ergonomic tissue control and simplifying surgical tasks. This makes the system suitable for real surgical scenarios in the operating room (OR). The performance of the system was analysed through the evaluation of teleoperation control and characterization of gripping force. The new system offers an overall positioning error of less than 400 μm, demonstrating its safety and accuracy. Improved system precision, usability, and ergonomics point to the potential suitability of the device for the OR and its ability to advance haptic-feedback-enhanced transoral laser microsurgery.

  8. On user behaviour adaptation under interface change

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2014-02-01

    Full Text Available International Conference on Intelligent User Interfaces, Haifa, Israel, 24-27 February 2014. On User Behaviour Adaptation Under Interface Change. Benjamin Rosman, Subramanian Ramamoorthy, M. M. Hassan Mahmud, School of Informatics, University of Edinburgh...

  9. Distributed user interfaces usability and collaboration

    CERN Document Server

    Lozano, María D; Tesoriero, Ricardo; Penichet, Victor MR

    2013-01-01

    Written by international researchers in the field of Distributed User Interfaces (DUIs), this book brings together important contributions regarding collaboration and usability in Distributed User Interface settings. Throughout the thirteen chapters, authors address key questions concerning how collaboration can be improved by using DUIs, including: in which situations a DUI is suitable for easing collaboration among users, and how usability standards can be used to evaluate the usability of systems based on DUIs. The chapters also describe case studies and prototypes that address these concerns.

  10. OzBot and haptics: remote surveillance to physical presence

    Science.gov (United States)

    Mullins, James; Fielding, Mick; Nahavandi, Saeid

    2009-05-01

    This paper reports on robotic and haptic technologies and capabilities developed for the law enforcement and defence community within Australia by the Centre for Intelligent Systems Research (CISR). The OzBot series of small and medium surveillance robots have been designed in Australia and evaluated by law enforcement and defence personnel to determine suitability and ruggedness in a variety of environments. Using custom-developed digital electronics and featuring expandable data buses including RS485, I2C, RS232, video and Ethernet, the robots can be directly connected to many off-the-shelf payloads such as gas sensors, x-ray sources and camera systems, including thermal and night vision. Differentiating the OzBot platform from its peers is its ability to be integrated directly with haptic technology, or the 'haptic bubble', developed by CISR. Haptic interfaces allow an operator to physically 'feel' remote environments through position-force control and to experience realistic force feedback. By adding the capability to remotely grasp an object and feel its weight, texture and other physical properties in real time from the remote ground control unit, an operator's situational awareness is greatly improved through haptic augmentation in an environment where remote-system feedback is often limited.

  11. Reasoning about Users' Actions in a Graphical User Interface.

    Science.gov (United States)

    Virvou, Maria; Kabassi, Katerina

    2002-01-01

    Describes a graphical user interface called IFM (Intelligent File Manipulator) that provides intelligent help to users. Explains two underlying reasoning mechanisms, one an adaptation of human plausible reasoning and one that performs goal recognition based on the effects of users' commands; and presents results of an empirical study that…

  12. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality.

    Science.gov (United States)

    Zenner, Andre; Kruger, Antonio

    2017-04-01

    We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics with physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate, in two experiments, how Shifty can enhance the user's perception of virtual objects by automatically changing its internal weight distribution. In the first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness; here, Shifty significantly increased the user's fun and perceived realism compared to an equivalent passive haptic proxy. In the second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight, and thus the perceived realism, by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help compensate for the visual-haptic mismatch perceived during the shifting process.
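The weight-shifting principle reduces to simple statics: moving an internal mass along the prop changes the torque felt at the grip. A back-of-the-envelope sketch of that relationship (made-up masses and dimensions, not Shifty's actual specifications):

```python
G = 9.81  # gravitational acceleration, m/s^2

def grip_torque(shell_mass, shell_com, moving_mass, mass_pos):
    """Static torque (N*m) about the grip of a horizontally held prop.

    shell_com and mass_pos are distances (m) from the grip to the fixed
    shell's center of mass and to the movable internal mass.
    """
    return G * (shell_mass * shell_com + moving_mass * mass_pos)

def mass_position_for(target_torque, shell_mass, shell_com, moving_mass):
    """Where to move the internal mass to render target_torque at the grip."""
    return (target_torque / G - shell_mass * shell_com) / moving_mass
```

Rendering a heavier virtual object then corresponds to sliding the internal mass further from the grip, which increases the kinesthetic torque the user feels.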

  13. Business Performer-Centered Design of User Interfaces

    Science.gov (United States)

    Sousa, Kênia; Vanderdonckt, Jean

    Business Performer-Centered Design of User Interfaces is a new design methodology that adopts business process (BP) definition and a business performer perspective for managing the life cycle of user interfaces of enterprise systems. In this methodology, when the organization has a business process culture, the business processes of an organization are first defined according to a traditional methodology for this kind of artifact. These business processes are then transformed into a series of task models that represent the interactive parts of the business processes that will ultimately lead to interactive systems. When the organization has its enterprise systems, but not yet its business processes modeled, the user interfaces of the systems help derive task models, which are then used to derive the business processes. The double linking between a business process and a task model, and between a task model and a user interface model, makes it possible to ensure traceability of the artifacts along multiple paths and enables more active participation of business performers in analyzing the resulting user interfaces. In this paper, we outline how a human perspective is tied to a model-driven perspective.

  14. Projection Mapping User Interface for Disabled People.

    Science.gov (United States)

    Gelšvartas, Julius; Simutis, Rimvydas; Maskeliūnas, Rytis

    2018-01-01

    Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and personal independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented-reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities.
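The geometric core of such a camera-projector system, once calibrated for a planar tabletop, is a homography mapping camera coordinates of a detected object to projector pixels so the object can be highlighted. A minimal direct-linear-transform (DLT) estimate of that homography might look as follows (a simplified stand-in for the full calibration procedure described in the paper; the point values are hypothetical):

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 matrix H with dst ~ H @ src from >= 4 point pairs (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)        # null-space vector, defined up to scale

def project(H, pt):
    """Map a camera point to projector pixels through H (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Four tabletop points seen by the camera and the projector pixels that
# should light them up (hypothetical calibration targets).
cam = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
proj = [(5.0, 7.0), (6.0, 7.0), (5.0, 8.0), (6.0, 8.0)]
H = homography(cam, proj)
```

With H estimated, any object position detected in the camera image can be warped into projector coordinates and highlighted.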

  15. Projection Mapping User Interface for Disabled People

    Science.gov (United States)

    Simutis, Rimvydas; Maskeliūnas, Rytis

    2018-01-01

    Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and personal independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented-reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities. PMID:29686827

  16. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget-based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network-connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  17. New User Interface Architecture for NetWiser Product with AngularJS

    OpenAIRE

    Johansson, Janne

    2015-01-01

    The topic of this thesis was to develop a new user interface for a product called NetWiser. The need for a new user interface originated from problems experienced with the technology of the old user interface. A decision was made to replace the old Java-applet-based user interface with a new JavaScript-based user interface. In practice, the technology switch meant a complete redevelopment of the user interface application component. A complete redesign of the user interface layout was exc...

  18. Visual Design of User Interfaces by (De)composition

    OpenAIRE

    Lepreux, Sophie; Michotte, Benjamin; Vanderdonckt, Jean; 13th Int. Workshop on Design, Specification, and Verification of Interactive Systems DSV-IS

    2006-01-01

    Most existing graphical user interfaces are designed for a fixed context of use, making them rather difficult to modify for other contexts of use, such as other users, other platforms, and other environments. This paper addresses this problem by introducing a new visual design method for graphical user interfaces referred to as “visual design by (de)composition”. In this method, any individual or composite component of a graphical user interface is submitted to a series of o...

  19. Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment

    Science.gov (United States)

    Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej

    2012-07-01

    This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope (SEM) images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen, since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation experiment involving melamine microspheres with a diameter of less than 2 μm, between Paris, France and Oldenburg, Germany.
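The core of correlation-based object detection in SEM images is normalized cross-correlation (NCC) between a template and image windows. A brute-force NumPy sketch of plain NCC (an illustration of the general technique, not the paper's modified correlation matching with generated templates):

```python
import numpy as np

def ncc_match(image, template):
    """Locate `template` in `image` by normalized cross-correlation.

    Returns the (row, col) of the best-matching top-left corner.
    Brute-force for clarity; real implementations use FFTs or an
    optimized library routine.
    """
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.linalg.norm(wz) * tn
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

The zero-mean normalization makes the score invariant to brightness and contrast changes, which matters for noisy SEM imagery; a perfect match scores 1.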

  20. Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device

    Science.gov (United States)

    Färber, Matthias; Heller, Julika; Handels, Heinz

    2007-03-01

    The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medication or to extract cerebrospinal fluid. Training of this procedure is usually done on patients under the guidance of experienced supervisors. A virtual-reality lumbar puncture simulator has been developed in order to minimize training costs and patient risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume-rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures like skin, bone, muscles or fat, and of the original CT data, which contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices enable a detailed impression of the anatomical context. The input data, consisting of CT data, label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation, the Visible Human male dataset was used to generate a virtual training body. Several users with different levels of medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume-rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with the rotation constraints enabled by the 6DOF device facilitates a realistic puncture simulation.
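The idea of deriving resisting forces from label data can be illustrated with a toy one-dimensional penalty model, where each tissue label along the needle track contributes a made-up per-label stiffness. This is a deliberate simplification for illustration, not the paper's haptic volume-rendering algorithm:

```python
import numpy as np

# Toy 1-D "needle track" through a labeled volume:
# 0 = air, 1 = skin, 2 = fat, 3 = muscle, 4 = bone.
labels = np.array([0, 0, 1, 2, 2, 3, 3, 4])

# Per-label stiffness (N/mm) -- invented values for the sketch.
stiffness = {0: 0.0, 1: 0.4, 2: 0.1, 3: 0.8, 4: 5.0}

def resist_force(depth_mm, voxel_mm=1.0):
    """Axial resisting force after inserting the needle depth_mm deep:
    sum of stiffness * path length over the tissue voxels traversed."""
    force = 0.0
    for i, lab in enumerate(labels):
        entry = i * voxel_mm
        if entry >= depth_mm:
            break
        seg = min(depth_mm - entry, voxel_mm)   # length inside this voxel
        force += stiffness[lab] * seg
    return force
```

The force grows as the needle crosses skin and muscle and jumps sharply at bone, mirroring the resistance a trainee should feel; the real renderer additionally blends in unsegmented CT intensity and applies lateral and rotational constraints.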

  1. User Interface Design for Dynamic Geometry Software

    Science.gov (United States)

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  2. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    ...programming. However, in Software Engineering, software engineers who develop user interfaces do not follow it. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms and can improve task efficiency and user satisfaction. However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement a user interface with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming: with VisTool a designer assembles visual objects (e.g. textboxes, ellipses) to visualize database contents, and visual properties (e.g. color, position) can be formulas.
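The idea that visual properties can be formulas over database contents can be pictured with a tiny hypothetical model (not VisTool's actual engine): each property of a visual object is either a constant or a formula evaluated per database row.

```python
# Sample database rows (hypothetical data).
rows = [{"name": "A", "value": 10}, {"name": "B", "value": 25}]

# A visual object whose properties are constants or per-row formulas.
visual = {
    "shape": "ellipse",                         # constant property
    "x": lambda row, i: 40 * i + 20,            # position formula
    "height": lambda row, i: 2 * row["value"],  # data-driven size
    "label": lambda row, i: row["name"],
}

def render(rows, visual):
    """Evaluate each property for each row, yielding concrete graphics."""
    out = []
    for i, row in enumerate(rows):
        props = {k: (v(row, i) if callable(v) else v)
                 for k, v in visual.items()}
        out.append(props)
    return out
```

A designer adjusting the formulas sees the resulting graphics immediately, which is the kind of no-programming development loop the abstract describes.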

  3. Projection Mapping User Interface for Disabled People

    Directory of Open Access Journals (Sweden)

    Julius Gelšvartas

    2018-01-01

    Full Text Available Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and personal independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented-reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities.

  4. New ROOT Graphical User Interfaces for fitting

    International Nuclear Information System (INIS)

    Maline, D Gonzalez; Moneta, L; Antcheva, I

    2010-01-01

    ROOT, as a scientific data analysis framework, provides extensive capabilities via Graphical User Interfaces (GUIs) for performing interactive analysis and visualizing data objects such as histograms and graphs. A new interface for fitting has been developed for performing, exploring and comparing fits on data point sets such as histograms, multi-dimensional graphs or trees. With this new interface, users can interactively build the fit model function, set parameter values and constraints, and select fit and minimization methods with their options. Functionality for visualizing the fit results is provided as well, with the possibility of drawing residuals or confidence intervals. Furthermore, the new fit panel acts as a standalone application and does not prevent users from interacting with other windows. We describe the functionality of this user interface in detail, as well as the new capabilities provided by the fitting and minimization tools recently introduced in the ROOT framework.
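The fit-panel workflow the record describes (choose a model, fit binned data, inspect residuals and a goodness-of-fit measure) can be illustrated outside ROOT. A minimal numpy-only sketch of a weighted least-squares fit on hypothetical bin data — not ROOT's actual minimizers (Minuit etc.), just the underlying idea:

```python
import numpy as np

def weighted_linear_fit(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x.

    Returns (a, b, residuals, chi2); sigma holds per-point uncertainties,
    e.g. Poisson errors on histogram bin contents.
    """
    w = 1.0 / sigma ** 2
    A = np.vstack([np.ones_like(x), x]).T          # design matrix
    W = np.diag(w)
    # Normal equations: (A^T W A) theta = A^T W y
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    resid = y - A @ theta                          # what a residual plot shows
    chi2 = np.sum(w * resid ** 2)
    return theta[0], theta[1], resid, chi2
```

A fit panel essentially wires GUI controls to the model choice, the parameter start values, and plots of `resid` and confidence bands.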

  5. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  6. Development of INFRA graphic user interface

    International Nuclear Information System (INIS)

    Yang, Y. S.; Lee, C. B.; Kim, Y. M.; Kim, D. H.; Kim, S. K.

    2004-01-01

    A GUI (Graphic User Interface) has been developed for the high-burnup fuel performance code INFRA. Based upon the FORTRAN program language, INFRA was developed with COMPAQ Visual FORTRAN 6.5. Graphical input and output interfaces have been developed using Visual Basic and MDB, which are among the most widely used programming languages and databases for Windows application development. The various input parameters required for an INFRA calculation can be entered more conveniently through the newly developed input interface. Without any additional data handling, INFRA calculation results can be investigated intuitively through 2D or 3D graphs on screen and an animation function.

  7. Pen-and-Paper User Interfaces

    CERN Document Server

    Steimle, Jurgen

    2012-01-01

    Even at the beginning of the 21st century, we are far from becoming paperless. Pen and paper is still the only truly ubiquitous information processing technology. Pen-and-paper user interfaces bridge the gap between paper and the digital world. Rather than replacing paper with electronic media, they seamlessly integrate both worlds in a hybrid user interface. Classical paper documents become interactive. This opens up a huge field of novel computer applications at our workplaces and in our homes. This book provides readers with a broad and extensive overview of the field, so as to provide a fu

  8. The web based user interface of RODOS

    International Nuclear Information System (INIS)

    Raskob, W.; Mueller, A.; Munz, E.; Rafat, M.

    2003-01-01

    Full text: The interaction between the RODOS system and its users has three main objectives: (1) operation of the system in its automatic and interactive modes, including the processing of meteorological and radiological on-line data and the choice of module chains for performing the necessary calculations; (2) input of data defining the accident situation, such as source term information, intervention criteria and timing of emergency actions; (3) selection and presentation of results in the form of spatial and temporal distributions of activity concentrations, areas affected by emergency actions and countermeasures, and their radiological and economic consequences. Users of category A have direct access to the RODOS system via local or wide area networks through the client/server protocol Internet/X. Any internet-connected X desktop machine, such as Unix workstations from different vendors, X-terminals, Linux PCs, and PCs with X-emulation, can be used. A number of X-Windows-based graphical user interfaces (GUIs) provide direct access to all functionalities of the RODOS system and allow for handling the various user interactions with the RODOS system described above. Among others, the user can trigger or interrupt the automatic processing mode, execute application programs simultaneously, modify and delete data, import data sets from databases, and change configuration files. As the user interacts directly with in-memory active processes, the system responds immediately after having performed the necessary calculations. For obtaining the requested results, users must know which chain of application software has to be selected, how to interact with its interfaces, which sort of initialization data have to be assigned, etc. This flexible interaction with RODOS implies that only experienced and well-trained users are able to operate the system and to obtain correct and sensible information. A new interface has been developed which is based on the commonly used

  9. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    OpenAIRE

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object...

  10. Hit me baby one more time : a haptic rating interface

    NARCIS (Netherlands)

    Bartneck, C.; Athanasiadou, P.; Kanda, T.; Jacko, J.A.

    2007-01-01

    As the importance of recommender systems increases, in combination with the explosion in data available over the internet and in our own digital libraries, we suggest an alternative method of providing explicit user feedback. We create a tangible interface, which will not only facilitate

  11. Preface (to Playful User Interfaces)

    NARCIS (Netherlands)

    Unknown, [Unknown; Nijholt, A.; Nijholt, Antinus

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  12. User interface user's guide for HYPGEN

    Science.gov (United States)

    Chiu, Ing-Tsau

    1992-01-01

    The user interface (UI) of HYPGEN is developed using Panel Library to shorten the learning curve for new users and provide easier ways to run HYPGEN for casual users as well as for advanced users. Menus, buttons, sliders, and type-in fields are used extensively in UI to allow users to point and click with a mouse to choose various available options or to change values of parameters. On-line help is provided to give users information on using UI without consulting the manual. Default values are set for most parameters and boundary conditions are determined by UI to further reduce the effort needed to run HYPGEN; however, users are free to make any changes and save it in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by UI and saved in a PLOT3D journal file. For large grids which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while UI stays at the workstation.

  13. Design and implementation of visual-haptic assistive control system for virtual rehabilitation exercise and teleoperation manipulation.

    Science.gov (United States)

    Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv

    2008-01-01

    This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback through a haptic controller (Omni Phantom) to the user. A sensor-based assistive function and velocity-scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation; a camera and a laser range finder, controlled in real time by a PC, were integrated into the system to help the user define the intended path to the selected target. The real-time force feedback from the remote robot to the haptic controller is made possible by effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistive function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, one can use the haptic device for rehabilitation purposes in cases where assistance is needed to perform tasks (e.g., stroke rehab) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.
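The record does not specify the assistive function's form. One common realization of path-following assistance is a spring-like force that pulls the haptic device toward the nearest point on the defined path; a minimal sketch under that assumption (the gain `k` and the polyline path representation are invented for illustration, not taken from the paper):

```python
import numpy as np

def nearest_point_on_segment(p, a, b):
    """Closest point to p on the segment from a to b."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

def assistive_force(p, path, k=5.0):
    """Spring-like force pulling the device toward the nearest point on a
    polyline path (list of 2D waypoints); zero when the device is on the path."""
    candidates = [nearest_point_on_segment(p, path[i], path[i + 1])
                  for i in range(len(path) - 1)]
    nearest = min(candidates, key=lambda q: np.linalg.norm(p - q))
    return k * (nearest - p)
```

In a shared-control loop this force would be rendered on the haptic device each cycle, with the gain (or a velocity-scaling factor) tuned to the exercise.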

  14. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in databases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. For example, the authors

  15. Reflections on Andes' Goal-Free User Interface

    Science.gov (United States)

    VanLehn, Kurt

    2016-01-01

    Although the Andes project produced many results over its 18 years of activity, this commentary focuses on its contributions to understanding how a goal-free user interface impacts the overall design and performance of a step-based tutoring system. Whereas a goal-aligned user interface displays relevant goals as blank boxes or empty locations that…

  16. Real-time dual-band haptic music player for mobile devices.

    Science.gov (United States)

    Hwang, Inwook; Lee, Hyeseon; Choi, Seungmoon

    2013-01-01

    We introduce a novel dual-band haptic music player for real-time simultaneous vibrotactile playback with music in mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies and a real-time vibration generation algorithm that can extract vibration commands from a music file for dual-band playback (bass and treble). The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms the bass-only rendering, also providing several insights for further improvements. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
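The paper's vibration-generation algorithm is perceptually tuned via a "haptic equalizer"; the underlying dual-band idea — split each audio frame's spectral energy at a crossover and drive the actuator's two resonances from the two bands — can be sketched simply. The crossover frequency and normalization below are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def dual_band_commands(frame, fs, split_hz=1000.0):
    """Return (bass, treble) vibration amplitudes in [0, 1] for one audio frame.

    Energy below/above split_hz drives the actuator's low/high resonance mode.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / fs)
    bass = np.sqrt(np.mean(spectrum[freqs < split_hz] ** 2))    # RMS per band
    treble = np.sqrt(np.mean(spectrum[freqs >= split_hz] ** 2))
    # Normalize so the stronger band maps to full actuator drive.
    peak = max(bass, treble, 1e-12)
    return bass / peak, treble / peak
```

A real-time player would run this per frame and stream the two amplitudes to the dual-mode actuator, optionally weighting the bands by perceptual sensitivity data.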

  17. End-User Control over Physical User-Interfaces: From Digital Fabrication to Real-Time Adaptability

    OpenAIRE

    Ramakers, Raf

    2016-01-01

    Graphical user interfaces are at the core of the majority of computing devices, including WIMP (windows, icons, menus, pointer) interaction styles and touch interactions. The popularity of graphical user interfaces stems from their ability to adapt to a multitude of tasks, such as document editing, messaging, browsing, etc. New tools and technologies have also enabled users without a technical background to author these kinds of digital interfaces, for example, filters for quick photo editing, inte...

  18. User Interface Framework for the National Ignition Facility (NIF)

    International Nuclear Information System (INIS)

    Fisher, J M; Bowers, G A; Carey, R W; Daveler, S A; Herndon Ford, K B; Ho, J C; Lagin, L J; Lambert, C J; Mauvais, J; Stout, E A; West, S L

    2007-01-01

    A user interface (UI) framework supports the development of user interfaces to operate the National Ignition Facility (NIF) using the Integrated Computer Control System (ICCS). [1] This framework simplifies UI development and ensures consistency for NIF operators. A comprehensive, layered collection of UIs in ICCS provides interaction with system-level processes, shot automation, and subsystem-specific devices. All user interfaces are written in Java, employing CORBA to interact with other ICCS components. ICCS developers use these frameworks to compose two major types of user interfaces: broadviews and control panels. Broadviews provide a visual representation of the NIF beamlines through interactive schematic drawings. Control panels provide status and control at a device level. The UI framework includes a suite of display components to standardize user interaction through data entry behaviors, common connection and threading mechanisms, and a common appearance. With these components, ICCS developers can more efficiently address usability issues in the facility when needed. The ICCS UI framework helps developers create consistent and easy-to-understand user interfaces for NIF operators

  19. Advances in the development of a cognitive user interface

    Directory of Open Access Journals (Sweden)

    Jokisch Oliver

    2018-01-01

    Full Text Available In this contribution, we summarize recent development steps of the embedded cognitive user interface UCUI, which enables a user-adaptive scenario in human-machine and even human-robot interaction by means of sophisticated cognitive and semantic modelling. The interface prototype is developed by different German institutes and companies, with steering teams at Fraunhofer IKTS and Brandenburg University of Technology. The prototype can communicate with users via speech and gesture recognition, speech synthesis and a touch display. The device includes autarkic semantic processing and, beyond that, a cognitive behavior control that supports intuitive interaction to control different kinds of electronic devices, e.g. in a smart home environment or in interactive and collaborative robotics. Contrary to available speech assistance systems such as Amazon Echo or Google Home, the cognitive user interface UCUI ensures user privacy by processing all necessary information without any network access by the interface device.

  20. Graphical User Interfaces and Library Systems: End-User Reactions.

    Science.gov (United States)

    Zorn, Margaret; Marshall, Lucy

    1995-01-01

    Describes a study by the Parke-Davis Pharmaceutical Research Library to determine user satisfaction with the graphical user interface-based (GUI) Dynix Marquis compared with the text-based Dynix Classic Online Public Access Catalog (OPAC). Results show that the GUI-based OPAC was preferred by end users over the text-based OPAC. (eight references) (DGM)

  1. An Object-Oriented Architecture for User Interface Management in Distributed Applications

    OpenAIRE

    Denzer, Ralf

    2017-01-01

    User interfaces for large distributed applications have to handle specific problems: the complexity of the application itself and the integration of online-data into the user interface. A main task of the user interface architecture is to provide powerful tools to design and augment the end-user system easily, hence giving the designer more time to focus on user requirements. Our experiences developing a user interface system for a process control room showed that a lot of time during the dev...

  2. User interface issues in supporting human-computer integrated scheduling

    Science.gov (United States)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  3. Exploring Interaction Space as Abstraction Mechanism for Task-Based User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2007-01-01

    Designing a user interface is often a complex undertaking. Model-based user interface design is an approach where models and mappings between them form the basis for creating and specifying the design of a user interface. Such models usually include descriptions of the tasks of the prospective user, but there is considerable variation in the other models that are employed. This paper explores the extent to which the notion of interaction space is useful as an abstraction mechanism to reduce the complexity of creating and specifying a user interface design. We present how we designed a specific user interface through... mechanism that can help user interface designers exploit object-oriented analysis results and reduce the complexity of designing a user interface.

  4. The Graphical User Interface: Crisis, Danger, and Opportunity.

    Science.gov (United States)

    Boyd, L. H.; And Others

    1990-01-01

    This article describes differences between the graphical user interface and traditional character-based interface systems, identifies potential problems posed by graphic computing environments for blind computer users, and describes some programs and strategies that are being developed to provide access to those environments. (Author/JDD)

  5. Demonstrator 1: User Interface and User Functions

    DEFF Research Database (Denmark)

    Gram, Christian

    1999-01-01

    Describes the user interface and its functionality in a prototype system used for a virtual seminar session. The functionality is restricted to what is needed for a distributed seminar discussion among not too many people. The system is designed to work with the participants distributed at several...

  6. Representing Graphical User Interfaces with Sound: A Review of Approaches

    Science.gov (United States)

    Ratanasit, Dan; Moore, Melody M.

    2005-01-01

    The inability of computer users who are visually impaired to access graphical user interfaces (GUIs) has led researchers to propose approaches for adapting GUIs to auditory interfaces, with the goal of providing access for visually impaired people. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews…

  7. Squidy : a Zoomable Design Environment for Natural User Interfaces

    OpenAIRE

    König, Werner A.; Rädle, Roman; Reiterer, Harald

    2009-01-01

    We introduce the interaction library Squidy, which eases the design of natural user interfaces by unifying relevant frameworks and toolkits in a common library. Squidy provides a central design environment based on high-level visual data flow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. The concept of semantic zooming nevertheless enables access to mo...

  8. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.

    Science.gov (United States)

    Kim, K

    2016-08-01

    To examine psoriasis or atopic eczema, analyzing skin roughness by palpation is essential to precisely diagnose skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual way to design a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that lets a user see and touch digital skin surface roughness has been developed, including a method for estimating geometric roughness from a meshed surface. Following this, a psychophysical experiment was designed and conducted with 12 human subjects to measure human perception of surface roughness with the developed visual and haptic interfaces. From the psychophysical experiment, it was found that touch is more sensitive at lower surface roughness, and vice versa. Human perception with both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When interacting through both channels, visual and haptic, the ability to detect abnormalities in roughness is greatly improved by sensory integration with the developed visuohaptic rendering system. The result can be used as a guideline to design a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. The result also confirms that the developed visuohaptic rendering system can help dermatologists or skin care professionals examine skin conditions using vision and touch at the same time.
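The trade-off the record studies — a spatial filter must remove noise without flattening genuine roughness — can be made concrete with the standard profile-roughness measures Ra (arithmetic mean deviation) and Rq (RMS deviation). This is a generic 1D illustration, not the paper's mesh-based estimator or its perceptually tuned filter:

```python
import numpy as np

def roughness(profile):
    """Arithmetic (Ra) and root-mean-square (Rq) roughness of a height profile."""
    z = profile - np.mean(profile)            # deviations from the mean line
    return np.mean(np.abs(z)), np.sqrt(np.mean(z ** 2))

def smooth(profile, window=5):
    """Moving-average spatial filter; larger windows remove more noise but
    also flatten genuine roughness -- the trade-off studied perceptually above."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")
```

Running `roughness` before and after `smooth` quantifies how much surface detail a given filter strength destroys, which is what the perceptual sensitivity data would be used to bound.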

  9. User interaction in smart ambient environment targeted for senior citizen.

    Science.gov (United States)

    Pulli, Petri; Hyry, Jaakko; Pouke, Matti; Yamamoto, Goshiro

    2012-11-01

    Many countries face a problem as the age structure of society changes. The number of senior citizens is rising rapidly, and caretaking personnel numbers cannot match the problems and needs of these citizens. Smart, ubiquitous technologies can offer ways of coping with the need for more nursing staff and the rising cost to society of caring for senior citizens. Helping senior citizens with a novel, easy-to-use interface that guides and helps could improve their quality of living and let them participate more in daily activities. This paper presents a projection-based display system for elderly people with memory impairments and the proposed user interface for the system. The recognition of the user's process, based on a sensor network, is also described. Elderly people wearing the system can interact with the projected user interface by tapping physical surfaces (such as walls, tables, or doors), using them as a natural, haptic-feedback input surface.

  10. The web-based user interface for EAST plasma control system

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, R.R., E-mail: rrzhang@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Xiao, B.J. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); School of Nuclear Science and Technology, University of Science and Technology of China, Anhui (China); Yuan, Q.P. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Yang, F. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Department of Computer Science, Anhui Medical University, Anhui (China); Zhang, Y. [Institute of Plasma Physics, Chinese Academy of Sciences, Anhui (China); Johnson, R.D.; Penaflor, B.G. [General Atomics, DIII-D National Fusion Facility, San Diego, CA (United States)

    2014-05-15

    The plasma control system (PCS) plays a vital role at EAST for fusion science experiments. Its software application consists of two main parts: an IDL graphical user interface for setting the large number of plasma parameters that specify each discharge, and several programs for performing the real-time feedback control and managing the whole control system. The PCS user interface can be used from any X11 Windows client with privileged access to the PCS computer system. However, remote access to the PCS system via the IDL user interface is extremely inconvenient due to the high network latency when drawing or operating the interfaces. In order to achieve lower latency for remote access to the PCS system, a web-based system has recently been developed for EAST. The setup data are retrieved from the PCS system and client-side JavaScript draws the interfaces in the user's browser. The user settings are also sent back to the PCS system for controlling discharges. These technologies allow the web-based user interface to be viewed by authorized users with a web browser and to communicate with PCS server processes directly. It works together with the IDL interface and provides a new way to aid remote participation.

  11. The web-based user interface for EAST plasma control system

    International Nuclear Information System (INIS)

    Zhang, R.R.; Xiao, B.J.; Yuan, Q.P.; Yang, F.; Zhang, Y.; Johnson, R.D.; Penaflor, B.G.

    2014-01-01

    The plasma control system (PCS) plays a vital role at EAST for fusion science experiments. Its software application consists of two main parts: an IDL graphical user interface for setting the large number of plasma parameters that specify each discharge, and several programs for performing the real-time feedback control and managing the whole control system. The PCS user interface can be used from any X11 Windows client with privileged access to the PCS computer system. However, remote access to the PCS system via the IDL user interface is extremely inconvenient due to the high network latency when drawing or operating the interfaces. In order to achieve lower latency for remote access to the PCS system, a web-based system has recently been developed for EAST. The setup data are retrieved from the PCS system and client-side JavaScript draws the interfaces in the user's browser. The user settings are also sent back to the PCS system for controlling discharges. These technologies allow the web-based user interface to be viewed by authorized users with a web browser and to communicate with PCS server processes directly. It works together with the IDL interface and provides a new way to aid remote participation.

  12. A Model-Driven Approach to Graphical User Interface Runtime Adaptation

    OpenAIRE

    Criado, Javier; Vicente Chicote, Cristina; Iribarne, Luis; Padilla, Nicolás

    2010-01-01

    Graphical user interfaces play a key role in human-computer interaction, as they link the system with its end-users, allowing information exchange and improving communication. Nowadays, users increasingly demand applications with adaptive interfaces that dynamically evolve in response to their specific needs. Thus, providing graphical user interfaces with runtime adaptation capabilities is becoming more and more an important issue. To address this problem, this paper proposes a componen...

  13. Earthdata User Interface Patterns: Building Usable Web Interfaces Through a Shared UI Pattern Library

    Science.gov (United States)

    Siarto, J.

    2014-12-01

    As more Earth science software tools and services move to the web, the design and usability of those tools become ever more important. A good user interface is now expected, and users are increasingly intolerant of websites and web applications that work against them. The Earthdata UI Pattern Library attempts to give these scientists and developers the design tools they need to make usable, compelling user interfaces without the overhead of a full design team. Patterns are tested, functional user interface elements targeted specifically at the Earth science community, and will include web layouts, buttons, tables, typography, iconography, mapping, and visualization/graphing widgets. These UI elements have emerged as the result of extensive user testing, research, and software development within the NASA Earthdata team over the past year.

  14. Playful User Interfaces. Interfaces that Invite Social and Physical Interaction.

    NARCIS (Netherlands)

    Nijholt, Antinus

    2014-01-01

    This book is about user interfaces to applications that can be considered as ‘playful’. The interfaces to such applications should be ‘playful’ as well. The application should be fun, and interacting with such an application should, of course, be fun as well. Maybe more. Why not expect that the

  15. Exploring distributed user interfaces in ambient intelligent environments

    NARCIS (Netherlands)

    Dadlani Mahtani, P.M.; Peregrin Emparanza, J.; Markopoulos, P.; Gallud, J.A.; Tesoriero, R.; Penichet, V.M.R.

    2011-01-01

    In this paper we explore the use of Distributed User Interfaces (DUIs) in the field of Ambient Intelligence (AmI). We first introduce the emerging area of AmI, followed by describing three case studies where user interfaces or ambient displays are distributed and blending in the user’s environments.

  16. Communicating Emotion through Haptic Design: A Study Using Physical Keys

    DEFF Research Database (Denmark)

    Kjellerup, Marie Kjær; Larsen, Anne Cathrine; Maier, Anja

    2014-01-01

    This paper explores how designers may communicate with the users of their products through haptic design; more specifically, how tactile properties of materials evoke emotions such as satisfaction, joy, or disgust. A research-through-design approach has been followed; mood- and material boards...... and prototypes of four ‘haptically enhanced’ (physical) keys were created. Types of keys selected include home, bicycle, hobby, and basement. An experiment with ten participants was conducted, using word association and software to elicit product emotions (PrEmo). Results show a mapping between the designer...

  17. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    Science.gov (United States)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-06-01

    In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integrative-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The image processing and haptic feedback are realized on surgery performance. In this work, tumor-cutting experiments are conducted under four different operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experimental realization shows that the performance index, which is a function of pixels, is different in the four operating conditions.
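
The hue-based segmentation described above can be sketched in a few lines. The hue bands below are illustrative assumptions, not values from the paper, and real implementations would work on camera frames rather than individual pixels:

```python
import colorsys

# Hypothetical hue bands (assumptions, not from the paper): reddish pixels
# are treated as "organ", magenta-ish pixels as "tumor".
def classify_pixel(r, g, b):
    """Classify an RGB pixel (0-255 channels) by its hue in HSV space."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2 or v < 0.2:        # too grey or too dark to classify reliably
        return "background"
    if h < 0.05 or h > 0.95:      # hue wraps around red at 0/1
        return "organ"
    if 0.8 <= h <= 0.95:          # magenta-ish band
        return "tumor"
    return "background"

# The paper's performance index is a function of pixels; counting pixels
# per class serves as a simple stand-in here.
def performance_index(pixels):
    counts = {"organ": 0, "tumor": 0, "background": 0}
    for r, g, b in pixels:
        counts[classify_pixel(r, g, b)] += 1
    return counts
```

Thresholding on hue rather than raw RGB is what makes HSV "more intuitive" here: lighting changes mostly affect the value channel, leaving the hue band stable.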

  18. Haptic feedback for enhancing realism of walking simulations.

    Science.gov (United States)

    Turchet, Luca; Burelli, Paolo; Serafin, Stefania

    2013-01-01

    In this paper, we describe several experiments whose goal is to evaluate the role of plantar vibrotactile feedback in enhancing the realism of walking experiences in multimodal virtual environments. To achieve this goal we built an interactive and a noninteractive multimodal feedback system. While during the use of the interactive system subjects physically walked, during the use of the noninteractive system the locomotion was simulated while subjects were sitting on a chair. In both configurations subjects were exposed to auditory and audio-visual stimuli presented with and without the haptic feedback. Results of the experiments provide a clear preference toward the simulations enhanced with haptic feedback, showing that the haptic channel can lead to more realistic experiences in both interactive and noninteractive configurations. The majority of subjects clearly appreciated the added feedback. However, some subjects found the added feedback unpleasant. This might be due, on one hand, to the limits of the haptic simulation and, on the other hand, to the different individual desire to be involved in the simulations. Our findings can be applied to the context of physical navigation in multimodal virtual environments as well as to enhance the user experience of watching a movie or playing a video game.

  19. Interfaces for End-User Information Seeking.

    Science.gov (United States)

    Marchionini, Gary

    1992-01-01

    Discusses essential features of interfaces to support end-user information seeking. Highlights include cognitive engineering; task models and task analysis; the problem-solving nature of information seeking; examples of systems for end-users, including online public access catalogs (OPACs), hypertext, and help systems; and suggested research…

  20. Liferay 6.2 user interface development

    CERN Document Server

    Chen, Xinsheng

    2013-01-01

    A step-by-step tutorial, targeting the Liferay 6.2 version. This book takes a step-by-step approach to customizing the look and feel of your website, and shows you how to build a great looking user interface as well.""Liferay 6.2 User Interface Development"" is for anyone who is interested in the Liferay Portal. It contains text that explicitly introduces you to the Liferay Portal. You will benefit most from this book if you have Java programming experience and have coded servlets or JavaServer Pages before. Experienced Liferay portal developers will also find this book useful because it expla

  1. Programming Graphical User Interfaces in R

    CERN Document Server

    Verzani, John

    2012-01-01

    Programming Graphical User Interfaces with R introduces each of the major R packages for GUI programming: RGtk2, qtbase, Tcl/Tk, and gWidgets. With examples woven through the text as well as stand-alone demonstrations of simple yet reasonably complete applications, the book features topics especially relevant to statisticians who aim to provide a practical interface to functionality implemented in R. The book offers: A how-to guide for developing GUIs within R The fundamentals for users with limited knowledge of programming within R and other languages GUI design for specific functions or as l

  2. Comparing two anesthesia information management system user interfaces: a usability evaluation.

    Science.gov (United States)

    Wanderer, Jonathan P; Rao, Anoop V; Rothwell, Sarah H; Ehrenfeld, Jesse M

    2012-11-01

    Anesthesia information management systems (AIMS) have been developed by multiple vendors and are deployed in thousands of operating rooms around the world, yet not much is known about measuring and improving AIMS usability. We developed a methodology for evaluating AIMS usability in a low-fidelity simulated clinical environment and used it to compare an existing user interface with a revised version. We hypothesized that the revised user interface would be more usable. In a low-fidelity simulated clinical environment, twenty anesthesia providers documented essential anesthetic information for the start of the case using both an existing and a revised user interface. Participants had not used the revised user interface previously and completed a brief training exercise prior to the study task. All participants completed a workload assessment and a satisfaction survey. All sessions were recorded. Multiple usability metrics were measured. The primary outcome was documentation accuracy. Secondary outcomes were perceived workload, number of documentation steps, number of user interactions, and documentation time. The interfaces were compared and design problems were identified by analyzing recorded sessions and survey results. Use of the revised user interface was shown to improve documentation accuracy from 85.1% to 92.4%, a difference of 7.3% (95% confidence interval [CI] for the difference, 1.8 to 12.7). The revised user interface decreased the number of user interactions by 6.5 for intravenous documentation (95% CI 2.9 to 10.1) and by 16.1 for airway documentation (95% CI 11.1 to 21.1). The revised user interface required 3.8 fewer documentation steps (95% CI 2.3 to 5.4). Airway documentation time was reduced by 30.5 seconds with the revised workflow (95% CI 8.5 to 52.4). There were no significant time differences noted in intravenous documentation or in total task time. No difference in perceived workload was found between the user interfaces. Two user interface

  3. Haptic teleoperation systems signal processing perspective

    CERN Document Server

    Lee, Jae-young

    2015-01-01

    This book examines haptic teleoperation systems from a signal processing perspective. It covers prediction, estimation, architecture, data compression, and error correction as applied to haptic teleoperation systems. The authors begin with an overview of haptic teleoperation systems, then look at a Bayesian approach to such systems, and move on to haptic data compression, haptic data digitization, and forward error correction.
    - Presents haptic data prediction/estimation methods that compensate for unreliable networks
    - Discusses haptic data compression, which reduces haptic data size over limited network bandwidth, and haptic data error correction, which compensates for packet loss
    - Provides signal processing techniques used with existing control architectures.
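
A widely cited technique in the haptic data reduction literature (not necessarily the one this book uses) is perceptual deadband transmission: a force sample is sent only when it differs from the last transmitted sample by more than a perceptual threshold. A minimal sketch, with an illustrative 10% deadband:

```python
def deadband_stream(samples, k=0.1):
    """Transmit a force sample only when it deviates from the last
    transmitted value by more than a relative deadband k.
    Returns the list of (index, value) pairs actually sent."""
    sent = []
    last = None
    for i, f in enumerate(samples):
        if last is None or abs(f - last) > k * abs(last):
            sent.append((i, f))
            last = f
    return sent

# A slowly drifting force signal compresses well:
signal = [1.00, 1.02, 1.05, 1.50, 1.52, 1.53, 2.00]
print(deadband_stream(signal))  # → [(0, 1.0), (3, 1.5), (6, 2.0)]
```

The receiver holds (or extrapolates) the last received value between updates; larger deadbands trade perceptual fidelity for bandwidth.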

  4. Disjoint forms in graphical user interfaces

    NARCIS (Netherlands)

    Evers, S.; Achten, P.M.; Plasmeijer, M.J.; Loidl, H.W.

    Forms are parts of a graphical user interface (GUI) that show a set of values and allow the user to update them. The declarative form construction library FunctionalForms is extended with disjoint form combinators to capture some common patterns in which the form structure expresses a choice. We

  5. Nurses perceptions of a user friendly interface

    OpenAIRE

    Alshafai, Fatimah

    2017-01-01

    Introduction: The successful implementation of clinical information systems depends to a large extent on its usability. Usability can be achieved by a strong focus on interface quality. With a focus on improving the quality of patient care, growing numbers of clinical information systems have been advertised as being "user-friendly". However, the term "user-friendly" may not be quite accurate and in some circumstances could be misleading. Within a clinical setting, an interface designed as ea...

  6. Haptic perception

    NARCIS (Netherlands)

    Kappers, A.M.L.; Bergmann Tiest, W.M.

    2013-01-01

    Fueled by novel applications, interest in haptic perception is growing. This paper provides an overview of the state of the art of a number of important aspects of haptic perception. By means of touch we can not only perceive quite different material properties, such as roughness, compliance,

  7. Power User Interface

    Science.gov (United States)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Power User Interface 5.0 (PUI) is a system of middleware written for expert users in the Earth-science community. PUI enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has a command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry process service. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  8. Use of force feedback to enhance graphical user interfaces

    Science.gov (United States)

    Rosenberg, Louis B.; Brave, Scott

    1996-04-01

    This project focuses on the use of force feedback sensations to enhance user interaction with standard graphical user interface paradigms. While typical joystick and mouse devices are input-only, force feedback controllers allow physical sensations to be reflected to the user. Tasks that require users to position a cursor on a given target can be enhanced by applying physical forces that aid in targeting. For example, an attractive force field implemented at the location of a graphical icon can greatly facilitate target acquisition and selection of the icon. It has been shown that force feedback can enhance a user's ability to perform basic functions within graphical user interfaces.
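
An attractive force field like the one described can be sketched as a spring-like pull toward the icon center that is active only within a snap radius. All constants and names here are illustrative, not from the paper:

```python
import math

def attractive_force(cursor, icon_center, radius=60.0, k=0.02):
    """Spring-like pull toward an icon center, active only inside a
    snap radius. Returns the (fx, fy) force to render on the device."""
    dx = icon_center[0] - cursor[0]
    dy = icon_center[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > radius:
        return (0.0, 0.0)          # outside the field: no force
    return (k * dx, k * dy)        # force grows with displacement
```

Each frame, the host computes this force from the cursor and icon positions and sends it to the force feedback device, so the cursor is gently drawn onto the target once it enters the field.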

  9. Developing A Web-based User Interface for Semantic Information Retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Keller, Richard M.

    2003-01-01

    While there are now a number of languages and frameworks that enable computer-based systems to search stored data semantically, the optimal design of effective user interfaces for such systems is still unclear. Such interfaces should mask unnecessary query detail from users, yet still allow them to build queries of arbitrary complexity without significant restrictions. We developed a user interface supporting semantic query generation for SemanticOrganizer, a tool used by scientists and engineers at NASA to construct networks of knowledge and data. Through this interface users can select node types, node attributes and node links to build ad hoc semantic queries for searching the SemanticOrganizer network.

  10. Mobile Phone User Interfaces in Multiplayer Games

    OpenAIRE

    NURMINEN, MINNA

    2007-01-01

    This study focuses on the user interface elements of mobile phones and their qualities in multiplayer games. The mobile phone is not intended as a gaming device, and its technology therefore has many shortcomings when it comes to playing mobile games on the device. One of these is the non-standardized user interface design. However, it also has some strengths, such as portability and its networked nature. In addition, many mobile phone models today have a camera, a feature only few gaming devices hav...

  11. User interface design principles for the SSM/PMAD automated power system

    International Nuclear Information System (INIS)

    Jakstas, L.M.; Myers, C.J.

    1991-01-01

    Computer-human interfaces are an integral part of developing software for spacecraft power systems. A well designed and efficient user interface enables an engineer to operate the system effectively, while concurrently preventing the user from entering data that is beyond boundary conditions or performing operations that are out of context. A user interface should also be designed to ensure that the engineer easily obtains all useful and critical data for operating the system and is aware of all faults and states in the system. Martin Marietta, under contract to NASA George C. Marshall Space Flight Center, has developed a user interface for the Space Station Module Power Management and Distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined in this paper. An engineer's interactions with the system are also described.

  12. A Mobile User Interface for Low-Literacy Users in Rural South Africa

    African Journals Online (AJOL)

    Information and Communication Technology services for socio-economic ... was conducted to design a mobile user interface to enable low-literacy users in Dwesa ..... common social and religious groups ... layout, buttons and menu.

  13. Virtual reality haptic dissection.

    Science.gov (United States)

    Erolin, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-12-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before they gain experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively and qualitatively.

  14. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    International Nuclear Information System (INIS)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-01-01

    In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integrative-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The image processing and haptic feedback are realized on surgery performance. In this work, tumor-cutting experiments are conducted under four different operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experimental realization shows that the performance index, which is a function of pixels, is different in the four operating conditions. (paper)

  15. Users expect interfaces to behave like the physical world

    DEFF Research Database (Denmark)

    Nørager, Rune

    2006-01-01

    Navigation in folder structures is an essential part of most window based user interfaces. Basic human navigation strategies rely on stable properties of the physical world, which are not by default present in windows style user interfaces. According to the theoretical framework Ecological Cognit...

  16. The Rise of the Graphical User Interface.

    Science.gov (United States)

    Edwards, Alastair D. N.

    1996-01-01

    Discusses the history of the graphical user interface (GUI) and the growing realization that adaptations must be made to it lest its visual nature discriminate against nonsighted or sight-impaired users. One of the most popular commercially developed adaptations is to develop sounds that signal the location of icons or menus to mouse users.…

  17. Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.

    Science.gov (United States)

    Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo

    2017-03-01

    Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human feeling of pressing buttons mainly relies on kinaesthetic sensations (rather than vibrotactile sensations), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on Magneto-Rheological (MR) fluid that can convey various button-clicking sensations and to experimentally evaluate its haptic performance. The design foci of the proposed actuator were to produce sufficiently large actuation forces (resistive forces) for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations to users. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid in conjunction with finite element electromagnetism analysis. After selecting design parameters based on the parametric studies, a prototype actuator was constructed, and its performance was evaluated using a dynamic mechanical analyzer, which measured the actuator's resistive force over a varying stroke (pressed depth) up to 1 mm and a varying input current from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces, from around 2 N (off-state) to over 9.5 N at 200 mA. To assess the prototype's performance from a haptic application perspective, a maximum force rate was calculated to determine the just-noticeable difference in force changes over the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensations, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.
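
A back-of-the-envelope way to see why a 2 N to 9.5 N range is "wide" in perceptual terms is to count just-noticeable-difference steps. Assuming a Weber fraction of roughly 10% for force perception (an assumption for illustration; the paper computes a force-rate metric instead):

```python
import math

def jnd_levels(f_min, f_max, weber=0.10):
    """Number of discriminable force levels between f_min and f_max,
    assuming each step must exceed the previous level by the Weber
    fraction: f_min * (1 + weber)^n <= f_max."""
    return math.floor(math.log(f_max / f_min) / math.log(1.0 + weber))

print(jnd_levels(2.0, 9.5))  # → 16
```

Around sixteen discriminable steps is comfortably more than the handful of distinct button feels (soft, medium, stiff click, etc.) the actuator needs to render.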

  18. Switching-based Mapping and Control for Haptic Teleoperation of Aerial Robots

    NARCIS (Netherlands)

    Mersha, A.Y.; Stramigioli, Stefano; Carloni, Raffaella

    2012-01-01

    This paper deals with the bilateral teleoperation of underactuated aerial robots by means of a haptic interface. In particular, we propose a switching-based state mapping and control algorithm between a rate-based passive controller, which addresses the workspace incompatibility between the master

  19. BlindSense: An Accessibility-inclusive Universal User Interface for Blind People

    Directory of Open Access Journals (Sweden)

    A. Khan

    2018-04-01

    A large number of blind people use smartphone-based assistive technology to perform their common activities. In order to provide a better user experience, the existing user interface paradigm needs to be revisited. A new user interface model is proposed in this paper: a simplified, semantically consistent, and blind-friendly adaptive user interface. The proposed solution is evaluated through an empirical study with 63 blind people, showing an improved user experience in performing common activities on a smartphone.

  20. Guidelines for the integration of audio cues into computer user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.

    1985-06-01

    Throughout the history of computers, vision has been the main channel through which information is conveyed to the computer user. As the complexity of man-machine interactions increases, more and more information must be transferred from the computer to the user and then successfully interpreted by the user. A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby bringing the sense of hearing into the computer experience. This allows our visual and auditory capabilities to work naturally together in unison, leading to more effective and efficient interpretation of all information received by the user from the computer. This thesis presents an initial set of guidelines to assist interface developers in designing an effective sight-and-sound user interface. The study is a synthesis of various aspects of sound, human communication, computer-user interfaces, and psychoacoustics. We introduce the notion of an earcon: earcons are audio cues used in the computer-user interface to provide information and feedback to the user about some computer object, operation, or interaction. A possible construction technique for earcons, the use of earcons in the interface, how earcons are learned and remembered, and the effects of earcons on their users are investigated. This study takes the point of view that earcons are a language and human/computer communication issue and are therefore analyzed according to the three dimensions of linguistics: syntactics, semantics, and pragmatics.
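
The syntactic dimension mentioned above is often realized by composing short motives into compound earcons. A minimal sketch of that idea, where a motive is a sequence of (pitch, duration) pairs; the motive names and note choices are illustrative, not from the thesis:

```python
# Hypothetical component motives: one for the "file" object, one for
# the "delete" operation.
FILE_MOTIVE = [("C4", 0.1), ("E4", 0.1)]
DELETE_MOTIVE = [("G3", 0.2)]

def compound_earcon(*motives):
    """Build a compound earcon by concatenating component motives,
    so 'delete file' sounds like the file motive followed by the
    delete motive."""
    seq = []
    for m in motives:
        seq.extend(m)
    return seq

delete_file = compound_earcon(FILE_MOTIVE, DELETE_MOTIVE)
print(delete_file)  # → [('C4', 0.1), ('E4', 0.1), ('G3', 0.2)]
```

Because compound earcons are built compositionally, a user who has learned the component motives can decode event combinations they have never heard before.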

  1. Risk Issues in Developing Novel User Interfaces for Human-Computer Interaction

    KAUST Repository

    Klinker, Gudrun; Huber, Manuel; Tönnis, Marcus

    2014-01-01

    © 2014 Springer International Publishing Switzerland. All rights are reserved. When new user interfaces or information visualization schemes are developed for complex information processing systems, it is not readily clear how much they do, in fact, support and improve users' understanding and use of such systems. Is a new interface better than an older one? In what respect, and in which situations? To provide answers to such questions, user testing schemes are employed. This chapter reports on a range of risks pertaining to the design and implementation of user interfaces in general, and to newly emerging interfaces (3-dimensionally, immersive, mobile) in particular.

  2. Risk Issues in Developing Novel User Interfaces for Human-Computer Interaction

    KAUST Repository

    Klinker, Gudrun

    2014-01-01

    © 2014 Springer International Publishing Switzerland. All rights are reserved. When new user interfaces or information visualization schemes are developed for complex information processing systems, it is not readily clear how much they do, in fact, support and improve users' understanding and use of such systems. Is a new interface better than an older one? In what respect, and in which situations? To provide answers to such questions, user testing schemes are employed. This chapter reports on a range of risks pertaining to the design and implementation of user interfaces in general, and to newly emerging interfaces (3-dimensionally, immersive, mobile) in particular.

  3. Developing a Graphical User Interface for the ALSS Crop Planning Tool

    Science.gov (United States)

    Koehlert, Erik

    1997-01-01

    The goal of my project was to create a graphical user interface for a prototype crop scheduler. The crop scheduler was developed by Dr. Jorge Leon and Laura Whitaker for the ALSS (Advanced Life Support System) program. The addition of a system-independent graphical user interface to the crop planning tool will make the application more accessible to a wider range of users and enhance its value as an analysis, design, and planning tool. My presentation will demonstrate the form and functionality of this interface. This graphical user interface allows users to edit system parameters stored in the file system. Data on the interaction of the crew, crops, and waste processing system with the available system resources is organized and labeled. Program output, which is stored in the file system, is also presented to the user in performance-time plots and organized charts. The menu system is designed to guide the user through analysis and decision making tasks, providing some help if necessary. The Java programming language was used to develop this interface in hopes of providing portability and remote operation.

  4. User interface design principles for the SSM/PMAD automated power system

    Science.gov (United States)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  5. On user-friendly interface construction for CACSD packages

    DEFF Research Database (Denmark)

    Ravn, Ole

    1989-01-01

    Some ideas that are used in the development of user-friendly interface for a computer-aided control system design (CACSD) package are presented. The concepts presented are integration and extensibility through the use of object-oriented programming, man-machine interface and user support using...... direct manipulation, and multiple views and multiple actions on objects in different domains. The use of multiple views and actions in combination with graphics enhances the user's ability to get an overview of the system to be designed. Good support for iteration is provided, and the short time between...... action and presentation allows the user to evaluate actions quickly. Object-oriented programming has been used to provide modularity and encapsulation...

  6. Rehabilitation of activities of daily living in virtual environments with intuitive user interface and force feedback.

    Science.gov (United States)

    Chiang, Vico Chung-Lim; Lo, King-Hung; Choi, Kup-Sze

    2017-10-01

    To investigate the feasibility of using a virtual rehabilitation system with an intuitive user interface and force feedback to improve skills in activities of daily living (ADL). A virtual training system equipped with haptic devices was developed for the rehabilitation of three ADL tasks: door unlocking, water pouring and meat cutting. Twenty subjects with upper limb disabilities, supervised by two occupational therapists, received a four-session training using the system. The task completion time and the amount of water poured into a virtual glass were recorded. The performance of the three tasks in reality was assessed before and after the virtual training. Feedback from the participants was collected with questionnaires after the study. The completion time of the virtual tasks decreased during the training, and the amount of water successfully poured increased (p = 0.051). The score of the Borg scale of perceived exertion was 1.05 (SD = 1.85; 95% CI = 0.18-1.92) and that of the task-specific feedback questionnaire was 31 (SD = 4.85; 95% CI = 28.66-33.34). The feedback of the therapists suggested a positive rehabilitation effect, and the participants had a positive perception of the system. The system can potentially be used as a tool to complement conventional ADL rehabilitation approaches. Implications for rehabilitation: Rehabilitation of activities of daily living can be facilitated using computer-assisted approaches. The existing approaches focus on cognitive training rather than manual skills. A virtual training system with an intuitive user interface and force feedback was designed to improve the learning of these manual skills. The study shows that the system could be used as a training tool to complement conventional rehabilitation approaches.
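
The confidence intervals reported in this abstract can be reproduced from the summary statistics alone. For the Borg score (mean 1.05, SD 1.85, n = 20), using the t critical value for 19 degrees of freedom (2.093, hardcoded here rather than pulled from a stats library):

```python
import math

# 95% CI for a mean: mean ± t * SD / sqrt(n)
mean, sd, n = 1.05, 1.85, 20
t_crit = 2.093                     # t(0.975, df=19)
half_width = t_crit * sd / math.sqrt(n)
lo, hi = mean - half_width, mean + half_width
print(round(lo, 2), round(hi, 2))  # → 0.18 1.92, matching the reported CI
```

The match confirms the authors used a t-based interval rather than the normal approximation (which would give roughly 0.24 to 1.86).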

  7. Generating Graphical User Interfaces from Precise Domain Specifications

    OpenAIRE

    Kamil Rybiński; Norbert Jarzębowski; Michał Śmiałek; Wiktor Nowakowski; Lucyna Skrzypek; Piotr Łabęcki

    2014-01-01

    Turning requirements into working systems is the essence of software engineering. This paper proposes automation of one of the aspects of this vast problem: generating user interfaces directly from requirements models. It presents syntax and semantics of a comprehensible yet precise domain specification language. For this language, the paper presents the process of generating code for the user interface elements. This includes model transformation procedures to generate window initiation code...

  8. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    Science.gov (United States)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

Developments in computer technology over the last decade have changed the ways computers are used. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach, which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language that defines the fundamentals and constraints of the user interface, a runtime architecture exploits this description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting applications even to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  9. A VR-User Interface for Design by Features

    NARCIS (Netherlands)

    Coomans, M.K.D.; Timmermans, H.J.P.

    1998-01-01

We present the design of a Virtual Reality based user interface (VR-UI). It is the interface for the VR-DIS system, a design application for the Building and Construction industry (VR-DIS stands for Virtual Reality - Design Information System). The interface is characterised by a mixed representation

  10. The HEASARC graphical user interface

    Science.gov (United States)

    White, N.; Barrett, P.; Jacobs, P.; Oneel, B.

    1992-01-01

    An OSF/Motif-based graphical user interface has been developed to facilitate the use of the database and data analysis software packages available from the High Energy Astrophysics Science Archive Research Center (HEASARC). It can also be used as an interface to other, similar, routines. A small number of tables are constructed to specify the possible commands and command parameters for a given set of analysis routines. These tables can be modified by a designer to affect the appearance of the interface screens. They can also be dynamically changed in response to parameter adjustments made while the underlying program is running. Additionally, a communication protocol has been designed so that the interface can operate locally or across a network. It is intended that this software be able to run on a variety of workstations and X terminals.
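
    The table-driven design described above (small tables specifying commands and their parameters, from which interface screens are built) can be sketched as follows. The command and parameter names are hypothetical, not the actual HEASARC tables:

```python
# Sketch of a table-driven interface: each command is described by a small
# table of parameters, and the UI "screen" is generated from that table.
# Command and parameter names below are invented for illustration.
COMMANDS = {
    "extract_spectrum": [
        {"name": "infile", "type": "path", "default": ""},
        {"name": "binsize", "type": "int", "default": 16},
    ],
}

def build_form(command):
    """Render a crude text form from the command table."""
    rows = [f"[{command}]"]
    for p in COMMANDS[command]:
        rows.append(f"  {p['name']:<8} ({p['type']}) = {p.get('default', '')}")
    return "\n".join(rows)

print(build_form("extract_spectrum"))
```

    Because the screens are derived from data rather than hard-coded, a designer can change the tables (even at runtime) to change the interface, which is the property the abstract highlights.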

  11. Training haptic stiffness discrimination: time course of learning with or without visual information and knowledge of results.

    Science.gov (United States)

    Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria

    2013-08-01

In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors, the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks, the visual field is typically occluded, forcing stiffness perception to depend exclusively on haptic information. No studies to date have addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy improved slightly, but decision time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinctive phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgical procedures, when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.

  12. MuSim, a Graphical User Interface for Multiple Simulation Programs

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Thomas [MUONS Inc., Batavia; Cummings, Mary Anne [MUONS Inc., Batavia; Johnson, Rolland [MUONS Inc., Batavia; Neuffer, David [Fermilab

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  13. Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.

    Science.gov (United States)

    Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R

    2014-09-01

    This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).
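
    The calibration step described above (mapping FBG strain readings to 3-D tip forces) is commonly posed as a linear least-squares fit. The sketch below assumes such a linear model; the sensitivity matrix and data are invented for illustration, not the paper's actual calibration:

```python
import numpy as np

# Hypothetical linear sensor model: wavelength shift = C @ force (nm per N).
# C_true is an invented sensitivity matrix standing in for the real needle.
C_true = np.array([[2.0, 0.1, 0.0],
                   [0.1, 1.8, 0.2],
                   [0.0, 0.2, 2.2]])

rng = np.random.default_rng(0)
forces = rng.uniform(-1.0, 1.0, size=(100, 3))  # known applied tip forces (N)
shifts = forces @ C_true.T                      # simulated FBG responses (nm)

# Calibration: solve force ≈ shift @ K in the least-squares sense.
K, *_ = np.linalg.lstsq(shifts, forces, rcond=None)

# Estimate the 3-D tip force from a new wavelength-shift reading;
# in the paper this estimate would drive the voice-coil haptic display.
new_shift = np.array([0.5, -0.2, 0.1])
est_force = new_shift @ K
```

    With a noiseless linear model the fitted matrix simply inverts the sensitivity; with real interrogator data the least-squares fit averages out measurement noise across calibration samples.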

  14. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  15. More playful user interfaces interfaces that invite social and physical interaction

    CERN Document Server

    2015-01-01

This book covers the latest advances in playful user interfaces: interfaces that invite social and physical interaction. These new developments include the use of audio, visual, tactile and physiological sensors to monitor, provide feedback and anticipate the behavior of human users. The decreasing cost of sensor and actuator technology makes it possible to integrate physical behavior information in human-computer interactions. This leads to many new entertainment and game applications that allow or require social and physical interaction in sensor- and actuator-equipped smart environments. The topics discussed include: human-nature interaction, human-animal interaction and the interaction with tangibles that are naturally integrated in our smart environments. Digitally supported remote audience participation in artistic or sport events is also discussed. One important theme that emerges throughout the book is the involvement of users in the digital-entertainment design process or even design and implement...

  16. User-interface aspects in recognizing connected-cursive handwriting

    NARCIS (Netherlands)

    Schomaker, L

    1994-01-01

    There are at least two major stumbling blocks for user acceptance of pen-based computers: the recognition performance is not good enough, especially on cursive handwriting; and the user interface technology has not reached a mature stage. The initial reaction of product reviewers and potential user

  17. Pemrograman Graphical User Interface (GUI) Dengan Matlab Untuk Mendesain Alat Bantu Opersai Matematika

    OpenAIRE

    Butar Butar, Ronisah Putra

    2011-01-01

A graphical user interface (GUI) is a visually oriented application program built from graphical objects, in place of text commands, for user interaction. In MATLAB, graphical user interfaces (GUIs) are supported by the GUIDE application (Graphical User Interface Builder). This paper discusses how to design an aid for mathematical operations with a graphical user interface (GUI) in MATLAB, with the aim of providing an alternative tool to assist...

  18. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    Science.gov (United States)

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic three-dimensional stereoscopic visualization and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  19. User Interface Aspects of a Human-Hand Simulation System

    Directory of Open Access Journals (Sweden)

    Beifang Yi

    2005-10-01

This paper describes the user interface design for a human-hand simulation system, a virtual environment that produces ground truth data (life-like human hand gestures and animations) and provides visualization support for experiments on computer vision-based hand pose estimation and tracking. The system allows users to save time in data generation and easily create any hand gestures. We have designed and implemented this user interface with consideration of usability goals and software engineering issues.

  20. A function-behavior-structure framework for quantification and reproduction of emotional haptic experience in using an electronic device

    International Nuclear Information System (INIS)

    Bae, Il Ju; Lee, Soo Hong; Ok, Hyung Seok; Lee, Jae In

    2013-01-01

A user's haptic experience in using an electronic device is related to the continuous and dynamic variations of the structural state of the device. Since changes of the structural components cause complex changes of the dynamics, it is difficult to predict the user's experience. We propose a function-behavior-structure framework to predict and improve the user's experience. The framework consists of the function layer model, the behavior layer model, and the structure layer model. In particular, the behavior model, which is independent of the device, is based on a physical phenomenon. Finally, an optimized structure which produces an ideal haptic experience for a cell phone is suggested.

  1. Workshop AccessibleTV "Accessible User Interfaces for Future TV Applications"

    Science.gov (United States)

    Hahn, Volker; Hamisu, Pascal; Jung, Christopher; Heinrich, Gregor; Duarte, Carlos; Langdon, Pat

Approximately half of the elderly people over 55 suffer from some type of typically mild visual, auditory, motor or cognitive impairment. For them, interaction, especially with PCs and other complex devices, is sometimes challenging, although accessible ICT applications could make much of a difference for their quality of life. Basically they have the potential to enable or simplify participation and inclusion in their surrounding private and professional communities. However, the availability of accessible user interfaces capable of adapting to the specific needs and requirements of users with individual impairments is very limited. Although there are a number of APIs [1, 2, 3, 4] available for various platforms that allow developers to provide accessibility features within their applications, today none of them provides features for the automatic adaptation of multimodal interfaces capable of automatically fitting the individual requirements of users with different kinds of impairments. Moreover, the provision of accessible user interfaces is still expensive and risky for application developers, as they need special experience and effort for user tests. Today many implementations simply neglect the needs of elderly people, thus locking out a large portion of their potential users. The workshop is organized as part of the dissemination activity for the European-funded project GUIDE "Gentle user interfaces for elderly people", which aims to address this situation with a comprehensive approach for the realization of multimodal user interfaces capable of adapting to the needs of users with different kinds of mild impairments. As application platform, GUIDE will mainly target TVs and Set-Top Boxes, such as the emerging Connected-TV or WebTV platforms, as they have the potential to address the needs of elderly users with applications such as for home automation, communication or continuing education.

  2. Pseudo-Haptic Feedback in Teleoperation.

    Science.gov (United States)

    Neupert, Carsten; Matich, Sebastian; Scherping, Nick; Kupnik, Mario; Werthschutzky, Roland; Hatzfeld, Christian

    2016-01-01

    In this paper, we develop possible realizations of pseudo-haptic feedback in teleoperation systems based on existing works for pseudo-haptic feedback in virtual reality and the intended applications. We derive four potential factors affecting the performance of haptic feedback (calculation operator, maximum displacement, offset force, and scaling factor), which are analyzed in three compliance identification experiments. First, we analyze the principle usability of pseudo-haptic feedback by comparing information transfer measures for teleoperation and direct interaction. Pseudo-haptic interaction yields well above-chance performance, while direct interaction performs almost perfectly. In order to optimize pseudo-haptic feedback, in the second study we perform a full-factorial experimental design with 36 subjects performing 6,480 trials with 36 different treatments. Information transfer ranges from 0.68 bit to 1.72 bit in a task with a theoretical maximum of 2.6 bit, with a predominant effect of the calculation operator and a minor effect of the maximum displacement. In a third study, short- and long-term learning effects are analyzed. Learning effects regarding the performance of pseudo-haptic feedback cannot be observed for single-day experiments. Tests over 10 days show a maximum increase in information transfer of 0.8 bit. The results show the feasibility of pseudo-haptic feedback for teleoperation and can be used as design basis for task-specific systems.
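
    The information-transfer figures quoted above in bits correspond to the mutual information between presented and identified stimuli, estimated from a stimulus-response confusion matrix. A minimal sketch; the perfect-identification matrix is made up for illustration, and its 6-level ceiling of log2(6) matches the abstract's theoretical maximum of about 2.6 bit:

```python
import numpy as np

def information_transfer(confusion):
    """Mutual information (bits) between stimulus rows and response columns."""
    n = confusion.sum()
    p_ij = confusion / n
    p_i = p_ij.sum(axis=1, keepdims=True)  # stimulus marginals
    p_j = p_ij.sum(axis=0, keepdims=True)  # response marginals
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_ij * np.log2(p_ij / (p_i * p_j))
    # Empty cells contribute 0 bits; nansum drops the resulting NaNs.
    return float(np.nansum(terms))

# Perfect identification of 6 equiprobable compliances, 30 trials each.
perfect = np.eye(6) * 30.0
print(round(information_transfer(perfect), 2))  # prints 2.58, i.e. log2(6)
```

    Measured values such as the 0.68 to 1.72 bit reported in the second study fall between 0 (chance responding) and this ceiling.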

  3. Automatic User Interface Generation for Visualizing Big Geoscience Data

    Science.gov (United States)

    Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.

    2016-12-01

Along with advanced computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization becomes an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages the interface automata theory to automatically generate the user interface (UI). Our study has the following three main contributions. First, geoscience data has a unique hierarchical structure and complex formats, and therefore it is relatively easy for users to get lost or confused during their exploration of the data. By applying the interface automata model to the UI design, users can be clearly guided to find the exact visualization and analysis that they want. In addition, from a development perspective, an interface automaton is also easier to understand than conditional statements, which can simplify the development process. Second, it is common that geoscience data has discontinuity in its hierarchy structure. The application of interface automata can prevent users from suffering automation surprises, and enhance user experience. Third, for supporting a variety of different data visualizations and analyses, our design with interface automata also makes applications extensible, in that a new visualization function or a new data group can be easily added to an existing application, which reduces the overhead of maintenance significantly. We demonstrate the effectiveness of our framework using real-world applications.
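
    The interface-automaton idea described above, where each UI state offers only its defined transitions, can be sketched as a transition table. The screen and action names here are hypothetical, not the framework's actual states:

```python
# Minimal interface automaton for UI navigation: states are screens,
# and only the listed (state, action) pairs are offered to the user,
# so undefined parts of the data hierarchy are unreachable.
TRANSITIONS = {
    ("dataset_list", "select_dataset"): "variable_list",
    ("variable_list", "select_variable"): "plot_view",
    ("variable_list", "back"): "dataset_list",
    ("plot_view", "back"): "variable_list",
}

def available_actions(state):
    """Actions the UI should display in the given state."""
    return sorted(a for (s, a) in TRANSITIONS if s == state)

def step(state, action):
    # Undefined actions leave the state unchanged, avoiding
    # the "automation surprises" the abstract mentions.
    return TRANSITIONS.get((state, action), state)

state = "dataset_list"
state = step(state, "select_dataset")  # now at "variable_list"
```

    Compared with nested conditionals, the whole navigation policy lives in one declarative table, which is also why the abstract argues automata simplify development and extension: adding a data group is adding rows.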

  4. Development and evaluation of nursing user interface screens using multiple methods.

    Science.gov (United States)

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  5. Pantomime-grasping: Advance knowledge of haptic feedback availability supports an absolute visuo-haptic calibration

    Directory of Open Access Journals (Sweden)

Shirin Davarpanah Jazi

    2016-05-01

An emerging issue in movement neurosciences is whether haptic feedback influences the nature of the information supporting a simulated grasping response (i.e., pantomime-grasping). In particular, recent work by our group contrasted pantomime-grasping responses performed with (i.e., PH+ trials) and without (i.e., PH- trials) terminal haptic feedback in separate blocks of trials. Results showed that PH- trials were mediated via relative visual information. In contrast, PH+ trials showed evidence of an absolute visuo-haptic calibration, a finding attributed to an error signal derived from a comparison between expected and actual haptic feedback (i.e., an internal forward model). The present study examined whether advance knowledge of haptic feedback availability influences the aforementioned calibration process. To that end, PH- and PH+ trials were completed in separate blocks (i.e., the feedback schedule used in our group's previous study) and in a block wherein PH- and PH+ trials were randomly interleaved on a trial-by-trial basis (i.e., a random feedback schedule). In other words, the random feedback schedule precluded participants from predicting whether haptic feedback would be available at the movement goal location. We computed just-noticeable-difference (JND) values to determine whether responses adhered to, or violated, the relative psychophysical principles of Weber's law. Results for the blocked feedback schedule replicated our group's previous work, whereas in the random feedback schedule PH- and PH+ trials were supported via relative visual information. Accordingly, we propose that a priori knowledge of haptic feedback is necessary to support an absolute visuo-haptic calibration. Moreover, our results demonstrate that the presence and expectancy of haptic feedback is an important consideration in contrasting the behavioral and neural properties of natural and simulated (i.e., pantomime) grasping.
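
    The JND-based test of Weber's law used above can be illustrated with a small sketch: under relative (Weber) scaling the JND grows in proportion to object size, whereas an absolutely calibrated response breaks that proportionality. The 4% Weber fraction and the object sizes are invented for illustration, not the study's data:

```python
# Weber's law: just-noticeable difference (JND) = k * reference magnitude,
# where k is the Weber fraction (the 0.04 here is a hypothetical value).
def weber_jnd(reference_mm, weber_fraction=0.04):
    return weber_fraction * reference_mm

sizes = (20.0, 40.0, 60.0)                      # hypothetical object sizes (mm)
jnds = [weber_jnd(s) for s in sizes]            # 0.8, 1.6, 2.4 mm
ratios = [j / s for j, s in zip(jnds, sizes)]   # constant under Weber's law
```

    In the study's logic, a constant JND/size ratio indicates responses mediated by relative visual information, while a roughly size-independent JND indicates an absolute visuo-haptic calibration.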

  6. A Distributed Tactile Sensor for Intuitive Human-Robot Interfacing

    Directory of Open Access Journals (Sweden)

    Andrea Cirillo

    2017-01-01

Safety of human-robot physical interaction is enabled not only by suitable robot control strategies but also by suitable sensing technologies. For example, if distributed tactile sensors were available on the robot, they could be used not only to detect unintentional collisions, but also as a human-machine interface by enabling a new mode of social interaction with the machine. Starting from their previous works, the authors developed a conformable distributed tactile sensor that can be easily conformed to the different parts of the robot body. Its ability to estimate contact force components and to provide a tactile map with an accurate spatial resolution enables the robot to handle both unintentional collisions in safe human-robot collaboration tasks and intentional touches where the sensor is used as a human-machine interface. In this paper, the authors present the characterization of the proposed tactile sensor and they show how it can also be exploited to recognize haptic tactile gestures, by tailoring recognition algorithms, well known in the image processing field, to the case of tactile images. In particular, a set of haptic gestures has been defined to test three recognition algorithms on a group of 20 users. The paper demonstrates how the same sensor originally designed to manage unintentional collisions can be successfully used also as a human-machine interface.

  7. Audio effects on haptics perception during drilling simulation

    Directory of Open Access Journals (Sweden)

    Yair Valbuena

    2017-06-01

Virtual reality has provided immersion and interactions through computer-generated environments attempting to reproduce real-life experiences through sensory stimuli. Realism can be achieved through multimodal interactions which can enhance the user's presence within the computer-generated world. The most notable advances in virtual reality can be seen in computer graphics visuals, where photorealism is the norm, thriving to overcome the uncanny valley. Other advances have followed related to sound, haptics, and, to a lesser extent, smell and taste feedback. Currently, virtual reality systems (multimodal immersion and interaction through visuals, haptics, and sound) are being massively used in entertainment (e.g., cinema, video games, art) and in non-entertainment scenarios (e.g., social inclusion, education, training, therapy, and tourism). Moreover, the cost reduction of virtual reality technologies has resulted in the consumer-level availability of various haptic, headset, and motion-tracking devices. Current consumer-level devices offer low-fidelity experiences due to the properties of the sensors, displays, and other electro-mechanical devices, which may not be suitable for high-precision or realistic experiences requiring dexterity. However, research has been conducted on how to overcome or compensate for the lack of high fidelity to provide an engaging user experience using storytelling, multimodal interactions and gaming elements. Our work focuses on analyzing the possible effects of auditory perception on haptic feedback within a drilling scenario. Drilling involves multimodal interactions and it is a task with multiple applications in medicine, crafting, and construction. We compare two drilling scenarios where two groups of participants had to drill through wood while listening to contextual and non-contextual audio. We gathered their perception using a survey after the task completion. From the results, we believe that sound does

  8. Developing adaptive user interfaces using a game-based simulation environment

    NARCIS (Netherlands)

    Brake, G.M. te; Greef, T.E. de; Lindenberg, J.; Rypkema, J.A.; Smets-Noor, N.J.J.M.

    2006-01-01

    In dynamic settings, user interfaces can provide more optimal support if they adapt to the context of use. Providing adaptive user interfaces to first responders may therefore be fruitful. A cognitive engineering method that incorporates development iterations in both a simulated and a real-world

  9. TaskMaster: a prototype graphical user interface to a schedule optimization model

    OpenAIRE

    Banham, Stephen R.

    1990-01-01

Approved for public release; distribution is unlimited. This thesis investigates the use of current graphical interface techniques to build more effective computer-user interfaces to Operations Research (OR) schedule optimization models. The design is directed at the scheduling decision maker who possesses limited OR experience. The feasibility and validity of building an interface for this kind of user is demonstrated in the development of a prototype graphical user interface called TaskMa...

  10. Concepts of analytical user interface evaluation method for continuous work in NPP main control room

    International Nuclear Information System (INIS)

    Lee, S. J.; Heo, G. Y.; Jang, S. H.

    2003-01-01

This paper describes a conceptual study of an analytical evaluation method for computer-based user interfaces in the main control room of advanced nuclear power plants. User interfaces can be classified into two groups: static interfaces and dynamic interfaces. Existing user interface evaluation and design methods have mainly addressed static user interfaces. Dynamic user interfaces, however, are useful for controlling complex systems, and proper evaluation methods for them are rare. We therefore propose an evaluation method for dynamic user interfaces suited to continuous work, based on measures of cognitive load and interface similarity.

  11. Spatial issues in user interface design from a graphic design perspective

    Science.gov (United States)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  12. Impact of English Regional Accents on User Acceptance of Voice User Interfaces

    NARCIS (Netherlands)

    Niculescu, A.I.; White, G.M.; Lan, S.S.; Waloejo, R.U.; Kawaguchi, Y.

    2008-01-01

In this paper we present an experiment addressing a critical issue in Voice User Interface (VUI) design, namely whether user acceptance can be improved by having recorded voice prompts imitate the user's regional dialect. The claim was tested within a project aiming to develop voice animated

  13. Virtual reality haptic human dissection.

    Science.gov (United States)

    Needham, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-01-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  14. Gromita: a fully integrated graphical user interface to gromacs 4.

    Science.gov (United States)

    Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia

    2009-09-07

    Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.

  15. Save medical personnel's time by improved user interfaces.

    Science.gov (United States)

    Kindler, H

    1997-01-01

    Common objectives in the industrialized countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case-mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increased clinical effectiveness, and for legal reasons. At first glance, this documentation effort appears to contradict cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort of documentation should be decreased by providing a cooperative working environment for healthcare professionals, applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical workflow forms an essential prerequisite for integrated resource-management services and for cooperative user interfaces. A user interface aware of the workflow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend toward client/server systems with relational databases or object-oriented databases as the repository. The workflows used for controlling purposes and for steering the user interfaces must be represented in the repository.

  16. NLEdit: A generic graphical user interface for Fortran programs

    Science.gov (United States)

    Curlett, Brian P.

    1994-01-01

    NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
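
    Since NLEdit is built around Fortran namelist input, a small illustration of the format may help. The following toy reader is a sketch only: the group and variable names are invented, and real namelist syntax supports arrays, repetition counts, and other features this parser ignores.

    ```python
    # Toy reader for one Fortran namelist group of the kind NLEdit generates
    # forms for. Group/variable names ("flow", "mach", ...) are invented.
    import re

    def parse_namelist(text):
        """Parse one '&group ... /' block into {group: {var: value}}."""
        match = re.search(r"&(\w+)(.*?)/", text, re.DOTALL)
        group, body = match.group(1), match.group(2)
        values = {}
        for var, raw in re.findall(r"(\w+)\s*=\s*('[^']*'|[^,\n/]+)", body):
            raw = raw.strip()
            if raw.startswith("'"):                 # character value
                values[var] = raw.strip("'")
            elif re.fullmatch(r"[-+]?\d+", raw):    # integer value
                values[var] = int(raw)
            else:                                   # real value
                values[var] = float(raw)
        return {group: values}

    sample = """
    &flow
      mach = 0.8,
      alpha = 2.5,
      title = 'cruise case'
    /
    """
    print(parse_namelist(sample))
    # → {'flow': {'mach': 0.8, 'alpha': 2.5, 'title': 'cruise case'}}
    ```

    An NLEdit-style tool would pair each such variable with the one-line description, default, and min/max bounds read from its companion ASCII file.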

  17. A Hybrid 2D/3D User Interface for Radiological Diagnosis.

    Science.gov (United States)

    Mandalika, Veera Bhadra Harish; Chernoglazov, Alexander I; Billinghurst, Mark; Bartneck, Christoph; Hurrell, Michael A; Ruiter, Niels de; Butler, Anthony P H; Butler, Philip H

    2018-02-01

    This paper presents a novel 2D/3D desktop virtual reality hybrid user interface for radiology that focuses on improving 3D manipulation required in some diagnostic tasks. An evaluation of our system revealed that our hybrid interface is more efficient for novice users and more accurate for both novice and experienced users when compared to traditional 2D only interfaces. This is a significant finding because it indicates, as the techniques mature, that hybrid interfaces can provide significant benefit to image evaluation. Our hybrid system combines a zSpace stereoscopic display with 2D displays, and mouse and keyboard input. It allows the use of 2D and 3D components interchangeably, or simultaneously. The system was evaluated against a 2D only interface with a user study that involved performing a scoliosis diagnosis task. There were two user groups: medical students and radiology residents. We found improvements in completion time for medical students, and in accuracy for both groups. In particular, the accuracy of medical students improved to match that of the residents.

  18. Nonspeech audio in user interfaces for TV

    NARCIS (Netherlands)

    Sluis, van de Richard; Eggen, J.H.; Rypkema, J.A.

    1997-01-01

    This study explores the end-user benefits of using nonspeech audio in television user interfaces. A prototype of an Electronic Programme Guide (EPG) served as a carrier for the research. One of the features of this EPG is the possibility to search for TV programmes in a category-based way. The EPG

  19. Detecting users handedness for ergonomic adaptation of mobile user interfaces

    DEFF Research Database (Denmark)

    Löchtefeld, Markus; Schardt, Phillip; Krüger, Antonio

    2015-01-01

    … for users with average hand sizes. One solution is to offer adaptive user interfaces for such one-handed interactions. These modes have to be triggered manually and thus induce a critical overhead. They are further designed to bring all content closer, regardless of whether the phone is operated...... with the left or right hand. In this paper, we present an algorithm that allows determining the users' interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases. This is achieved through a k...
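
    The truncated abstract hints that the classification is achieved through a k-nearest-neighbor approach. As a purely illustrative sketch (the features, thresholds, and data below are invented, not taken from the paper), unlock-swipe features could be classified like this:

    ```python
    # Hypothetical k-nearest-neighbor classification of handedness from
    # unlock-swipe features. Features and training data are invented.
    import math

    def knn_classify(train, query, k=3):
        """Classify `query` by majority vote among its k nearest neighbors.
        `train` is a list of (feature_vector, label) pairs."""
        dists = sorted((math.dist(vec, query), label) for vec, label in train)
        votes = [label for _, label in dists[:k]]
        return max(set(votes), key=votes.count)

    # Invented features: (mean horizontal offset of the swipe from screen
    # center, swipe curvature). A right thumb tends to curve in from the
    # right edge, a left thumb from the left; two-handed use is centered.
    train = [
        (( 0.30, 0.20), "right"),
        (( 0.25, 0.15), "right"),
        ((-0.28, 0.18), "left"),
        ((-0.31, 0.22), "left"),
        (( 0.02, 0.01), "two-handed"),
        ((-0.01, 0.02), "two-handed"),
    ]

    print(knn_classify(train, (0.27, 0.17)))  # prints 'right'
    ```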

  20. Engineering haptic devices a beginner's guide

    CERN Document Server

    Hatzfeld, Christian

    2014-01-01

    In this greatly reworked second edition of Engineering Haptic Devices the psychophysical content has been thoroughly revised and updated. Chapters on haptic interaction, system structures and design methodology were rewritten from scratch to include further basic principles and recent findings. New chapters on the evaluation of haptic systems and the design of three exemplary haptic systems from science and industry have been added. This book was written for students and engineers who are faced with the development of a task-specific haptic system. It is a reference book for the basics of hap

  1. A Haptic Feedback Scheme to Accurately Position a Virtual Wrist Prosthesis Using a Three-Node Tactor Array.

    Directory of Open Access Journals (Sweden)

    Andrew Erwin

    Full Text Available In this paper, a novel haptic feedback scheme, used for accurately positioning a 1DOF virtual wrist prosthesis through sensory substitution, is presented. The scheme employs a three-node tactor array and discretely and selectively modulates the stimulation frequency of each tactor to relay 11 discrete haptic stimuli to the user. Able-bodied participants moved the virtual wrist prosthesis via a surface electromyography based controller. The participants evaluated the feedback scheme without visual or audio feedback, relying solely on the haptic feedback to correctly position the hand. The scheme was evaluated through both normal (perpendicular) and shear (lateral) stimulations applied on the forearm. Normal stimulations were applied through a prototype device previously developed by the authors, while shear stimulations were generated using a ubiquitous coin motor vibrotactor. Trials with no feedback served as a baseline for comparison within the study and to the literature. The results indicated that both normal and shear stimulations allowed accurate positioning of the virtual wrist, with no significant difference between them. Using haptic feedback was substantially better than no feedback. The results found in this study are significant since the feedback scheme allows relatively few tactors to relay rich haptic information to the user and can be learned easily despite a relatively short amount of training. Additionally, the results are important for the haptic community since they contradict the common conception in the literature that normal stimulation is inferior to shear. From an ergonomic perspective, normal stimulation has the potential to benefit upper limb amputees since it can operate at lower frequencies than shear-based vibrotactors while also generating less noise. Through further tuning of the novel haptic feedback scheme and normal stimulation device, a compact and comfortable sensory substitution
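
    The abstract does not specify how the 11 stimuli are distributed across the three tactors. The following sketch shows one plausible encoding, in which each discrete wrist position maps to a distinct (tactor, frequency) pair; the tactor indexing, frequency values, and banding scheme are all assumptions for illustration.

    ```python
    # Hypothetical encoding of 11 discrete wrist positions onto a
    # three-tactor array via per-tactor frequency modulation.
    # Tactor indices and stimulation frequencies are invented.

    TACTORS = 3
    FREQ_LEVELS_HZ = [60, 120, 180, 240]  # assumed stimulation frequencies

    def encode_position(pos, n_positions=11):
        """Map a discrete wrist position (0..10) to (tactor, frequency Hz).
        Positions are split into per-tactor bands; the offset inside a
        band selects the frequency level."""
        if not 0 <= pos < n_positions:
            raise ValueError("position out of range")
        band = n_positions / TACTORS             # positions per tactor band
        tactor = min(int(pos // band), TACTORS - 1)
        within = pos - tactor * band             # offset inside the band
        level = min(int(within / band * len(FREQ_LEVELS_HZ)),
                    len(FREQ_LEVELS_HZ) - 1)
        return tactor, FREQ_LEVELS_HZ[level]

    # Every one of the 11 positions gets a distinct (tactor, frequency) pair:
    codes = [encode_position(p) for p in range(11)]
    assert len(set(codes)) == 11
    print(codes[0], codes[10])
    ```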

  2. Through the Interface - a human activity approach to user interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    In providing a theoretical framework for understanding human- computer interaction as well as design of user interfaces, this book combines elements of anthropology, psychology, cognitive science, software engineering, and computer science. The framework examines the everyday work practices of us...

  3. Clinical and optical intraocular performance of rotationally asymmetric multifocal IOL plate-haptic design versus C-loop haptic design.

    Science.gov (United States)

    Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo

    2013-04-01

    To compare the visual and intraocular optical quality outcomes with different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction for the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values from the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems to be a better design to support rotational asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.

  4. Perceiving haptic feedback in virtual reality simulators.

    Science.gov (United States)

    Våpenstad, Cecilie; Hofstad, Erlend Fagertun; Langø, Thomas; Mårvik, Ronald; Chmarra, Magdalena Karolina

    2013-07-01

    To improve patient safety, training of psychomotor laparoscopic skills is often done on virtual reality (VR) simulators outside the operating room. Haptic sensations have been found to influence psychomotor performance in laparoscopy. The emulation of haptic feedback is thus an important aspect of VR simulation. Some VR simulators try to simulate these sensations with handles equipped with haptic feedback. We conducted a survey on how laparoscopic surgeons perceive handles with and without haptic feedback. Surgeons with different levels of experience in laparoscopy were asked to test two handles: Xitact IHP with haptic feedback and Xitact ITP without haptic feedback (Mentice AB, Gothenburg, Sweden), connected to the LapSim (Surgical Science AB, Sweden) VR simulator. They performed two tasks on the simulator before answering 12 questions regarding the two handles. The surgeons were not informed about the differences in the handles. A total of 85 % of the 20 surgeons who participated in the survey claimed that it is important that handles with haptic feedback feel realistic. Ninety percent of the surgeons preferred the handles without haptic feedback. The friction in the handles with haptic feedback was perceived to be as in reality (5 %) or too high (95 %). Regarding the handles without haptic feedback, the friction was perceived as in reality (45 %), too low (50 %), or too high (5 %). A total of 85 % of the surgeons thought that the handle with haptic feedback attempts to simulate the resistance offered by tissue to deformation. Ten percent thought that the handle succeeds in doing so. The surveyed surgeons believe that haptic feedback is an important feature on VR simulators; however, they preferred the handles without haptic feedback because they perceived the handles with haptic feedback to add additional friction, making them unrealistic and not mechanically transparent.

  5. Haptic rendering foundations, algorithms, and applications

    CERN Document Server

    Lin, Ming C

    2008-01-01

    For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments. This book provides an authoritative overview of state-of-the-art haptic rendering algorithms and their applications. The authors examine various approaches and techniques for designing touch-enabled interfaces for a number of applications, including medical training, model design, and maintainability analysis for virtual prototyping, scienti

  6. The crustal dynamics intelligent user interface anthology

    Science.gov (United States)

    Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M.1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.

  7. A function-behavior-structure framework for quantification and reproduction of emotional haptic experience in using an electronic device

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Il Ju; Lee, Soo Hong [Yonsei University, Seoul (Korea, Republic of); Ok, Hyung Seok; Lee, Jae In [LG Electronics Inc, Seoul (Korea, Republic of)

    2013-08-15

    A user's haptic experience in using an electronic device is related to the continuous and dynamic variance of the structural state of the device. Since changes in the structural components cause complex changes in the dynamics, it is difficult to predict the user's experience. We propose a function-behavior-structure framework to predict and improve the user's experience. The framework consists of the function layer model, the behavior layer model, and the structure layer model. In particular, the behavior model, which is independent of the device, is based on physical phenomena. Finally, an optimized structure which produces an ideal haptic experience for a cell phone is suggested.

  8. Information visualization to user-friendly interface construction for information retrieval systems

    Directory of Open Access Journals (Sweden)

    Jessica Monique de Lira Vieira

    2011-10-01

    Full Text Available The information presented through visualization helps the Information Retrieval System (IRS) reach its main goal: to retrieve relevant information that meets the informational needs of its users. The objective of this article is to describe and analyze techniques proposed by the Information Visualization area and interface models discussed in the Information Science literature, which, applied to graphical interface construction, would facilitate the appropriation of information by the users of an IRS and would help them to search, browse and retrieve information. The methodology consists of a literature review focusing on the potential contribution of the visual representation of information to the development of user-friendly interfaces for IRSs, as well as identification and analysis of visualizations used as interfaces by IRSs. The use of visualizations is of great importance in the communication between the IRS and its users, because information presented through visual representation is better understood by users and allows the discovery of new knowledge.

  9. Implementation of graphical user interfaces in nuclear applications

    International Nuclear Information System (INIS)

    Barmsnes, K.A.; Johnsen, T.; Sundling, C.-V.

    1997-01-01

    During recent years a demand has formed for systems that support design and implementation of graphical user interfaces (GUIs) in the control rooms of nuclear power plants. Picasso-3 is a user interface management system supporting object oriented definition of GUIs in a distributed computing environment. The system is currently being used in a number of different application areas within the nuclear industry, such as retrofitting of display systems in simulators and control rooms, education and training applications, etc. Some examples are given of nuclear applications where the Picasso-3 system has been used

  10. Massage Therapy of the Back Using a Real-Time Haptic-Enhanced Telerehabilitation System

    Directory of Open Access Journals (Sweden)

    Cristina Ramírez-Fernández

    2017-01-01

    Full Text Available We present the usability evaluation of a haptic-enhanced telerehabilitation system for massage therapy of the back using the Vybe haptic gaming pad and the LEAP Motion gesture sensor. The evaluated system includes features that allow for (i) administering online therapy programs, (ii) providing self-adjustable and safe treatment of back massages using a virtual environment, and (iii) saving and replaying massage sessions according to a patient's therapy program. The usability evaluation with 25 older adults and 10 specialists suggests that the haptic telerehabilitation system is perceived as highly usable and offering a pleasurable user experience, while providing personalized intensity of haptic therapy in a supervised, real-time, and secure way to treat the patient. Moreover, the specialists fully agreed that the system design features, such as save and play and delimiting therapy zones, are the most important for back massage therapy, while the features of regulating feedback intensity and providing/receiving a massage remotely are also important. Finally, based on their comments, five design insights aimed at improving the current version of the system were generated.

  11. Towards a taxonomy of virtual reality user interfaces

    NARCIS (Netherlands)

    Coomans, M.K.D.; Timmermans, H.J.P.

    1997-01-01

    Virtual Reality-based user interfaces (VRUIs) are expected to bring about a revolution in computing. VR can potentially communicate large amounts of data in an easily understandable format. VR looks very promising, but it is still a very new interface technology for which very little

  12. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is an open-source and powerful numerical software package, but leaves much to be desired in the field of user friendliness. In this thesis the basic operation of OpenFOAM will be introduced, culminating in a graphical user interface written in PyQt. The graphical user interface will make the use of OpenFOAM simpler, and hopefully make this powerful tool more available for the gene...

  13. The Visual Web User Interface Design in Augmented Reality Technology

    OpenAIRE

    Chouyin Hsu; Haui-Chih Shiau

    2013-01-01

    With the popularity of 3C devices, visual content is all around us, in online games, touch pads, video, and animation. The text-based web page will therefore no longer satisfy users. With the popularity of webcams, digital cameras, stereoscopic glasses, and head-mounted displays, the user interface is becoming more visual and multi-dimensional. For the consideration of 3D and visual display in research on web user interface design, Augmented Reality technology providing the convenient ...

  14. User productivity as a function of AutoCAD interface design.

    Science.gov (United States)

    Mitta, D A; Flores, P L

    1995-12-01

    Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, with user productivity under display tasks higher than productivity under the draw and dimension tasks. Implications of these results are presented.
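
    The productivity measure defined in the abstract (percentage of a drawing session correctly completed per unit time) can be written out directly; the choice of minutes as the time unit and the example numbers below are assumptions, not taken from the study.

    ```python
    # Sketch of the productivity measure described above:
    # percent of the session correctly completed, per minute.

    def productivity(correct_steps, total_steps, minutes):
        """Return percent of the session completed correctly per minute."""
        if total_steps <= 0 or minutes <= 0:
            raise ValueError("total_steps and minutes must be positive")
        percent_complete = 100.0 * correct_steps / total_steps
        return percent_complete / minutes

    # e.g. 18 of 20 subtasks completed correctly in a 30-minute session:
    print(productivity(18, 20, 30))  # → 3.0 (percent per minute)
    ```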

  15. Computer-aided trauma simulation system with haptic feedback is easy and fast for oral-maxillofacial surgeons to learn and use.

    Science.gov (United States)

    Schvartzman, Sara C; Silva, Rebeka; Salisbury, Ken; Gaudilliere, Dyani; Girod, Sabine

    2014-10-01

    Computer-assisted surgical (CAS) planning tools have become widely available in craniomaxillofacial surgery, but are time consuming and often require professional technical assistance to simulate a case. An initial oral and maxillofacial (OM) surgical user experience was evaluated with a newly developed CAS system featuring a bimanual sense of touch (haptic). Three volunteer OM surgeons received a 5-minute verbal introduction to the use of a newly developed haptic-enabled planning system. The surgeons were instructed to simulate mandibular fracture reductions of 3 clinical cases, within a 15-minute time limit and without a time limit, and complete a questionnaire to assess their subjective experience with the system. Standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome were compared. After the 5-minute instruction, all 3 surgeons were able to use the system independently. The analysis of standardized anatomic measurements showed that the simulation results within a 15-minute time limit were not significantly different from those without a time limit. Mean differences between measurements of surgical and simulated fracture reductions were within current resolution limitations in collision detection, segmentation of computed tomographic scans, and haptic devices. All 3 surgeons reported that the system was easy to learn and use and that they would be comfortable integrating it into their daily clinical practice for trauma cases. A CAS system with a haptic interface that capitalizes on touch and force feedback experience similar to operative procedures is fast and easy for OM surgeons to learn and use. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. All rights reserved.

  16. Development of a Mobile User Interface for Image-based Dietary Assessment.

    Science.gov (United States)

    Kim, Sungye; Schap, Tusarebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2010-12-31

    In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to a client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts, through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application to allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.

  17. A Design Approach for Tangible User Interfaces

    Directory of Open Access Journals (Sweden)

    Bernard Champoux

    2004-05-01

    Full Text Available This paper proposes a mechanism to design Tangible User Interfaces (TUIs) based on Alexander's (1964) design approach, i.e., achieving fitness between the form and its context. Adapted to the design of TUIs, the fitness-of-use mechanism now takes into consideration the potential conflicts between the hardware of the artifact (electro-mechanical components) and the form of the user's control (physical ergonomics). The design problem is a search for an effortless co-existence (fitness-of-use) between these two aspects. Tangible interface design differs from traditional graphical interface design in that unsolved conflicts between hardware and ergonomics can deeply affect the desired interaction. Here we propose a mechanism (in the form of eight questions) that supports the design by defining the boundaries of the task, orienting the hardware (electro-mechanics) and ergonomics of the design space for various sub-tasks, and finally fitting the different components of the hardware and physical ergonomics of the artifact to provide a component-level fitness which delineates the final tangible interface. We further evaluate the effectiveness and efficiency of our approach by quantitative user evaluation.

  18. A Mobile Phone User Interface for Image-Based Dietary Assessment.

    Science.gov (United States)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A; Boushey, Carol J; Delp, Edward J

    2014-02-02

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods, such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  19. A mobile phone user interface for image-based dietary assessment

    Science.gov (United States)

    Ahmad, Ziad; Khanna, Nitin; Kerr, Deborah A.; Boushey, Carol J.; Delp, Edward J.

    2014-02-01

    Many chronic diseases, including obesity and cancer, are related to diet. Such diseases may be prevented and/or successfully treated by accurately monitoring and assessing food and beverage intakes. Existing dietary assessment methods, such as the 24-hour dietary recall and the food frequency questionnaire, are burdensome and not generally accurate. In this paper, we present a user interface for a mobile telephone food record that relies on taking images, using the built-in camera, as the primary method of recording. We describe the design and implementation of this user interface while stressing the solutions we devised to meet the requirements imposed by the image analysis process, yet keeping the user interface easy to use.

  20. EPICS system: system structure and user interface

    International Nuclear Information System (INIS)

    West, R.E.; Bartlett, J.F.; Bobbitt, J.S.; Lahey, T.E.; Kramper, B.J.; MacKinnon, B.A.

    1984-02-01

    This paper presents the user's view and the general organization of the EPICS control system at Fermilab. Various subsystems of the EPICS control system are discussed. These include the user command language, software protection, the device database, remote computer interfaces, and several application utilities. This paper is related to two other papers on EPICS: an overview paper and a detailed implementation paper

  1. Study on user interface of pathology picture archiving and communication system.

    Science.gov (United States)

    Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom

    2014-01-01

    It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the pathology PACS that considers user experience. An interface analysis of the pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the obtained results, a user interface for the pathology PACS was proposed. Hierarchical task analysis of the pathology PACS was classified into 17 tasks, including 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were also identified and schematized. A user interface for the pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.

  2. User interface design of electronic appliances

    CERN Document Server

    Baumann, Konrad

    2002-01-01

    Foreword by Brenda Laurel. Part One: Introduction 1. Background, Bruce Thomas 2. Introduction, Konrad Baumann 3. The Interaction Design Process, Georg Rakers Part Two: User Interface Design 4. Creativity Techniques, Irene Mavrommati 5. Design Principles, Irene Mavrommati and Adrian Martel 6. Design of On-Screen Interfaces, Irene Mavrommati Part Three: Input Devices 7. Controls, Konrad Baumann 8. Keyboards, Konrad Baumann 9. Advanced Interaction Techniques, Christopher Baber and Konrad Baumann 10. Speech Control, Christopher Baber and Jan Noyes 11. Wearable Computers, Christopher Baber Part Fou

  3. Customizing graphical user interface technology for spacecraft control centers

    Science.gov (United States)

    Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald

    1993-01-01

    The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.

  4. Visual and Haptic Mental Rotation

    Directory of Open Access Journals (Sweden)

    Satoshi Shioiri

    2011-10-01

    Full Text Available. It is well known that visual information can be retained in several types of memory systems. Haptic information can also be retained in memory, because we can repeat a hand movement. There may be a common memory system for vision and action: on the one hand, it may be convenient to have a common system for acting on visual information; on the other hand, different modalities may have their own memory and use retained information without transforming it out of its modality-specific form. We compared the memory properties of visual and haptic information. Mental rotation is a phenomenon, possibly unique to visual representation, in which reaction time increases with the angle of a visual target (e.g., a letter) to be identified; it is explained by the time needed to rotate the representation of the target in the visual system. In this study, we compared the effect of stimulus angle on visual and haptic shape identification (two-line shapes were used). We found a typical mental rotation effect for the visual stimulus, but no such effect for the haptic stimulus. This difference cannot be explained by modality differences in response, because a similar difference was found even when a haptic response was used for the visual representation and a visual response for the haptic representation. These results indicate that there are independent systems for visual and haptic representations.

  5. Flash Builder customizing the user interface

    CERN Document Server

    Rocchi, Cesare

    2010-01-01

    Personalize user interface components of your projects. Example projects are grouped together in an AIR application and the appearance is totally customized. Learn how to change visual properties by means of style directives or create brand new skins by knowing and exploiting their internal architecture.

  6. More playful user interfaces: an introduction

    NARCIS (Netherlands)

    Unknown, [Unknown; Nijholt, A.; Nijholt, Antinus

    2015-01-01

    In this chapter we embed recent research advances in creating playful user interfaces in a historical context. We offer observations on spending leisure time, in particular predictions from previous decades and views expressed in science fiction novels. We confront these views and predictions with

  7. WIFIP: a web-based user interface for automated synchrotron beamlines.

    Science.gov (United States)

    Sallaz-Damaz, Yoann; Ferrer, Jean Luc

    2017-09-01

    The beamline control software, through the associated graphical user interface (GUI), is the user's access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with limited bandwidth and from a large choice of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, a light, platform-independent control software package and associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.

  8. User-centered design with illiterate persons : The case of the ATM user interface

    NARCIS (Netherlands)

    Cremers, A.H.M.; Jong, J.G.M. de; Balken, J.S. van

    2008-01-01

    One of the major challenges in current user interface research and development is the accommodation of diversity in users and contexts of use in order to improve the self-efficacy of citizens. A common banking service, which should be designed for diversity, is the Automated Teller Machine (ATM).

  9. Absence of modulatory action on haptic height perception with musical pitch

    Directory of Open Access Journals (Sweden)

    Michele eGeronazzo

    2015-09-01

    Full Text Available. Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (i.e., high or low in pitch). Pitch height is known to modulate (and interact with) participants' responses when they are asked to judge spatial properties of non-auditory (e.g., visual) stimuli in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch height extends to the haptic estimation of the height of a virtual step. We implemented a HW/SW setup able to render virtual 3D objects (stair-steps) haptically through a PHANTOM device, and to provide real-time continuous auditory feedback depending on the user's interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point's height within (i) a narrower and (ii) a wider pitch range, or (iii) a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely and to communicate their height estimate by opening their thumb and index finger to mimic the step riser height, or verbally by reporting the height of the step riser in centimeters. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effect of musical pitch on highly realistic haptic feedback, and overall no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different haptic response distribution between musicians and non-musicians when estimates in the auditory conditions were matched with estimates in the no-sound condition.

  10. Finding and Exploring Health Information with a Slider-Based User Interface.

    Science.gov (United States)

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon; Chang, Shanton

    2016-01-01

    Despite the fact that search engines are the primary channel to access online health information, there are better ways to find and explore health information on the web. Search engines are prone to problems when they are used to find health information. For instance, users have difficulties in expressing health scenarios with appropriate search keywords, search results are not optimised for medical queries, and the search process does not account for users' literacy levels and reading preferences. In this paper, we describe our approach to addressing these problems by introducing a novel design using a slider-based user interface for discovering health information without the need for precise search keywords. The user evaluation suggests that the interface is easy to use and able to assist users in the process of discovering new information. This study demonstrates the potential value of adopting slider controls in the user interface of health websites for navigation and information discovery.

  11. Graphic user interface for COSMOS code

    International Nuclear Information System (INIS)

    Oh, Je Yong; Koo, Yang Hyun; Lee, Byung Ho; Cheon, Jin Sik; Sohn, Dong Seong

    2003-06-01

    The Graphical User Interface (GUI) - consisting of graphical elements such as windows, menus, buttons, and icons - made it possible for common users to operate computers easily. Hence, a GUI was introduced to improve the efficiency of inputting parameters in the COSMOS code. Functions to output graphs on the screen and to PostScript files were also added, and the graph library can be applied to other codes. The details of the principles of the GUI and the graphics library are described in the report.

  12. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    Directory of Open Access Journals (Sweden)

    Katharine M. Mullen

    2012-06-01

    Full Text Available. In this work the software application Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem-solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for interaction with data, specification of models and viewing of analysis results. Glotaran instead provides a graphical user interface featuring interactive and dynamic data inspection, easier model specification (assisted by the user interface) and interactive viewing of results. The interactivity is especially helpful when working with the large, multi-dimensional datasets that often result from time-resolved spectroscopy measurements, allowing the user to easily pre-select and manipulate data before analysis and to quickly zoom in to regions of interest in the analysis results. Glotaran has been developed on top of the NetBeans rich client platform and communicates with R through the Java-to-R interface Rserve. The background and functionality of the application are described here, and the design, development and implementation process of Glotaran is documented in a generic way.

  13. Realism is not all! User engagement with task-related interface characters

    NARCIS (Netherlands)

    van Vugt, H.C.; Konijn, E.A.; Hoorn, J.F.; Eliëns, A.P.W.; Keur, I.

    2007-01-01

    Human-like characters in the interface may evoke social responses in users, and literature suggests that realism is the most important factor herein. However, the effects of interface characters on the user are not well understood. We developed an integrative framework, called I-PEFiC, to explain

  14. Experimental setup for evaluating an adaptive user interface for teleoperation control

    Science.gov (United States)

    Wijayasinghe, Indika B.; Peetha, Srikanth; Abubakar, Shamsudeen; Saadatzi, Mohammad Nasser; Cremer, Sven; Popa, Dan O.

    2017-05-01

    A vital part of human interaction with a machine is the control interface, which single-handedly can define user satisfaction and the efficiency of performing a task. This paper describes the implementation of an experimental setup to study an adaptive algorithm that can help the user better teleoperate the robot. The formulation of the adaptive interface and the associated learning algorithms is general enough to apply when the mapping between the user controls and the robot actuators is complex and/or ambiguous. The method uses a genetic algorithm to find the optimal parameters that produce the input-output mapping for teleoperation control. In this paper, we describe the experimental setup and the associated results that were used to validate the adaptive interface on a differential-drive robot with two different input devices: a joystick and a Myo gesture control armband. Results show that after the learning phase, the interface converges to an intuitive mapping that can help even inexperienced users drive the system to a goal location.
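
    The abstract does not give the algorithm's internals, so the following is a minimal, hypothetical sketch of the core idea only: a genetic algorithm evolving the parameters of a linear command-to-actuator mapping. Here a 2x2 matrix maps (forward, turn) commands to differential-drive wheel speeds, and fitness is mean squared error against a hand-picked "intuitive" target mapping; the target, samples, and GA settings are all illustrative assumptions, not the authors' formulation.

```python
import random

SAMPLES = [(1.0, 0.0), (0.0, 1.0), (0.5, -0.5), (1.0, 1.0)]  # (forward, turn) commands
TARGET = [1.0, -0.5, 1.0, 0.5]  # desired [a, b, c, d]: left = a*f + b*t, right = c*f + d*t

def fitness(genome):
    """Mean squared error between the genome's wheel speeds and the target's (lower is better)."""
    err = 0.0
    for f, t in SAMPLES:
        gl, gr = genome[0] * f + genome[1] * t, genome[2] * f + genome[3] * t
        tl, tr = TARGET[0] * f + TARGET[1] * t, TARGET[2] * f + TARGET[3] * t
        err += (gl - tl) ** 2 + (gr - tr) ** 2
    return err / len(SAMPLES)

def evolve(pop_size=40, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2.0, 2.0) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)               # lower error = fitter
        elite = pop[: pop_size // 4]        # elitism: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, 4)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.9:          # mutate one gene with small Gaussian noise
                i = rng.randrange(4)
                child[i] += rng.gauss(0.0, 0.15)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print("best mapping:", [round(g, 2) for g in best], "error:", round(fitness(best), 4))
```

    In the paper's setup the fitness would instead be derived from the user's task performance (e.g., time or accuracy in driving to the goal), evaluated online rather than against a known target.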

  15. TOUCH, TOUCH, TOUCH, SENSORIAL COGNITIVE SKILLS SENSITIZED THROUGH TACTILITY AND TANGIBILITY

    NARCIS (Netherlands)

    Wendrich, Robert E.

    2018-01-01

    This paper presents the development and testing of an original user interface (UI) based on the sense of touch. A tangible user interface (TUI) project that includes the exploration of haptics, design processes, hybrid design tools and unconventional user interfaces (NUI) that focus essentially on

  16. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  17. SWATMOD-PREP: Graphical user interface for preparing coupled SWAT-MODFLOW simulations

    Science.gov (United States)

    This paper presents SWATMOD-Prep, a graphical user interface that couples a SWAT watershed model with a MODFLOW groundwater flow model. The interface is based on a recently published SWAT-MODFLOW code that couples the models via mapping schemes. The spatial layout of SWATMOD-Prep guides the user t...

  18. MOO in Your Face: Researching, Designing, and Programming a User-Friendly Interface.

    Science.gov (United States)

    Haas, Mark; Gardner, Clinton

    1999-01-01

    Suggests that the learning curve of a multi-user, object-oriented domain (MOO) impedes effective use. Discusses the use of an IBM/PC-compatible interface that allows developers to modify the interface to provide a sense of presence for the user. Concludes that work in programming a variety of interfaces has led to a more intuitive environment for…

  19. The Haptic Bracelets: Learning Multi-Limb Rhythm Skills from Haptic Stimuli While Reading

    NARCIS (Netherlands)

    Bouwer, A.; Holland, S.; Dalgleish, M.; Holland, S.; Wilkie, K.; Mulholland, P.; Seago, A.

    2013-01-01

    The Haptic Bracelets are a system designed to help people learn multi-limbed rhythms (which involve multiple simultaneous rhythmic patterns) while they carry out other tasks. The Haptic Bracelets consist of vibrotactiles attached to each wrist and ankle, together with a computer system to control

  20. Zoomable User Interfaces for the Semantic WEB

    National Research Council Canada - National Science Library

    Gorniak, Mark

    2004-01-01

    .... The University of Maryland, College Park (UMCP) developed an interface to visualize the taxonomic hierarchy of data, and applied integrated searching and browsing so that users need not have complete knowledge either of appropriate keyword...

  1. Short report on the evaluation of a graphical user interface for radiation therapy planning systems

    International Nuclear Information System (INIS)

    Martin, M.B.

    1993-01-01

    Since their introduction, graphical user interfaces for computing applications have generally appealed more to users than command-line or menu interfaces. Benefits of using a graphical interface include ease of use, ease of understanding and increased productivity. For a radiation therapy planning application, an additional potential benefit is that the user regards the planning activity as a closer simulation of the real-world situation. A prototype radiation therapy planning system incorporating a graphical user interface was developed on an Apple Macintosh microcomputer. Its graphical interface was then evaluated by twenty-six participants. The results showed markedly that the features associated with a graphical user interface were preferred. 6 refs., 3 figs., 1 tab

  2. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Full Text Available. Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the 'raw' visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.
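
    The MLE cue-combination rule cited above (Ernst and Banks, 2002) predicts that each cue is weighted in inverse proportion to its variance. A small sketch under assumed values: the threshold numbers, and the simplification that a tool with object-size:hand-opening ratio g scales haptic noise (in object-size units) by g, are illustrative assumptions, not the study's data.

```python
def mle_weights(sigma_v, sigma_h):
    """Return (w_visual, w_haptic): weights inversely proportional to each cue's variance."""
    rv, rh = 1.0 / sigma_v**2, 1.0 / sigma_h**2
    return rv / (rv + rh), rh / (rv + rh)

sigma_v = 0.04     # visual discrimination threshold (assumed, arbitrary units)
sigma_hand = 0.06  # haptic threshold in hand-opening units (assumed)

for ratio in (0.7, 1.0, 1.4):          # object-size:hand-opening ratios from the study
    sigma_h = sigma_hand * ratio       # haptic noise expressed in object-size units
    wv, wh = mle_weights(sigma_v, sigma_h)
    combined = (sigma_v**-2 + sigma_h**-2) ** -0.5  # predicted sigma of the fused estimate
    print(f"ratio {ratio}: w_vision={wv:.2f}, w_haptic={wh:.2f}, sigma_combined={combined:.4f}")
```

    The qualitative prediction tested in the study follows directly: as the ratio grows (hand opening changes less per unit of object size), the haptic weight should fall, and the fused estimate should always be at least as precise as the better single cue.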

  3. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    Science.gov (United States)

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than one monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection with the new user interface versus the user interface of a commercially available infusion pump. With the new user interface, programming errors were reduced by 81%, keystrokes per task fell from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), time required per task fell from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds, and perceived workload was significantly lower. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite having no experience with the new interface and extensive experience with the existing one. Programming errors and workload were reduced partly because the new user interface took less time and fewer keystrokes to program. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  4. The Hedonic Haptic Player

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Cahill, Ben

    2017-01-01

    In this design case we present the Hedonic Haptic Player—a wearable device that plays different patterns of vibrations on the body as a form of music for the skin. With this we begin to explore the enjoyability of vibrations in a wearable set-up. Instead of implementing vibrations as a haptic output for some form of communication, we want to explore their hedonistic value. The process leading up to the Hedonic Haptic Player served as a first step in getting a grasp of the design space of vibrotactile stimuli in a broader sense. This is reported as seven episodes of explorations. The Hedonic...

  5. A case study on better iconographic design in electronic medical records' user interface.

    Science.gov (United States)

    Tasa, Umut Burcu; Ozcan, Oguzhan; Yantac, Asim Evren; Unluer, Ayca

    2008-06-01

    It is a known fact that there is a conflict between what users expect and what user interface designers create in the field of medical informatics, as in other fields of interface design. The objective of the study is to suggest, from the 'design art' perspective, a method for improving the usability of an electronic medical record (EMR) interface. The suggestion is based on the hypothesis that the user interface of an EMR should be iconographic. The proposed three-step method begins with a questionnaire survey on how hospital users perceive the concepts/terms that are going to be used in the EMR user interface. Icons associated with the terms are then designed by a designer, following a guideline prepared according to the results of the first questionnaire. Finally, the icons are presented back to the target group for verification. A case study was conducted with 64 medical staff and 30 professional designers for the first questionnaire, and with 30 medical staff for the second. In the second questionnaire, 7.53 icons out of 10 were matched correctly with a standard deviation of 0.98, and all icons except three were matched correctly in at least 83.3% of the forms. The proposed method differs from the majority of previous studies, which are based on user requirements, by leaning on user experiments instead. The study demonstrated that the user interface of EMRs should be designed according to a guideline that results from a survey of users' experiences of metaphoric perception of the terms.

  6. Development of graphical user interface for EGS

    International Nuclear Information System (INIS)

    Jin Gang; Liu Liye; Li Junli; Cheng Jianping

    2002-01-01

    In order to make EGS more convenient for engineers to use, a new program named EGS Win was developed with VC++; it runs under the Windows system. The program provides a graphical user interface through which the user inputs simple, intuitive geometric entities to define the regions. It greatly improves the efficiency of using EGS.

  7. Portraying User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2008-01-01

    The user interface is coming of age. Papers addressing UI history have appeared in fair amounts in the last 25 years. Most of them address particular aspects such as an innovative interface paradigm or the contribution of a visionary or a research lab. Contrasting this, papers addressing UI history… Next the paper analyses a selected sample of papers on UI history at large. The analysis shows that the current state of the art is featured by three aspects: firstly internalism, in that the papers address the technologies in their own right with little contextualization; secondly whiggism, in that they largely address prevailing UI technologies; and thirdly history from above, in that they focus on the great deeds of the visionaries. The paper then compares this state of the art in UI history to the much more mature fields of history of computing and history of technology. Based hereon, some speculations…

  8. The graphical user interface for CRISTAL V1

    International Nuclear Information System (INIS)

    Heulers, L.; Courtois, G.; Fernex, F.; Gomit, J.M.; Letang, E.

    2003-01-01

    This paper deals with the new Graphical User Interface (GUI) of the CRISTAL V1 package devoted to criticality studies including burn up calculations. The aim of this GUI is to offer users a high level of user-friendliness and flexibility in the data description and the results analysis of codes of the package. The three main components of the GUI (CIGAIES, EJM and OPOSSUM) are presented. The different functionalities of the tools are explained through some applications. (author)

  9. INTERNET CONNECTIVITY FOR MASS PRODUCED UNITS WITHOUT USER INTERFACE

    DEFF Research Database (Denmark)

    2000-01-01

    To the manufacturer of mass produced units without a user interface, typically field level units, connection of these units to a communications network for enabling servicing, control and trackability is of interest. To provide this connection, a solution is described in which an interface...

  10. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    Science.gov (United States)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software package built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Because of this limitation, the CRS-Stack became an unpopular method, although applying it is actually a promising way to obtain better seismic sections, with better reflector continuity and S/N ratio. After obtaining successful results, tested with several seismic datasets belonging to oil companies in Indonesia, the idea arose to develop user-friendly software in our own laboratory. A Graphical User Interface (GUI) allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users can operate programs much more simply and easily, since the GUI transforms the text-based interface into graphical icons and visual indicators and the use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI to perform CRS-Stack processing, this CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which are defined as a seismic data processing workflow. The CRS-Stack processing workflow involves four steps, i.e. automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. Those operations are visualized in an informative flowchart with a self-explanatory system to guide the user inputting the
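
    The four-step workflow described in the abstract can be sketched as a simple sequential runner. In the real system each step wraps a Seismic Unix shell script invoked from a Java GUI; here the step names follow the abstract, but the commands are placeholder `echo` calls (hypothetical, not the actual scripts) so the sketch stays self-contained and runnable.

```python
import subprocess

# Each (name, command) pair stands in for one seismic processing step.
STEPS = [
    ("automatic CMP stack", "echo cmp-stack done"),
    ("initial CRS-Stack", "echo initial-crs done"),
    ("optimized CRS-Stack", "echo optimized-crs done"),
    ("CRS-Stack Supergather", "echo supergather done"),
]

def run_workflow(steps):
    """Run each step in order, stopping on the first failure."""
    results = []
    for name, cmd in steps:
        proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if proc.returncode != 0:
            raise RuntimeError(f"step '{name}' failed: {proc.stderr}")
        results.append((name, proc.stdout.strip()))
    return results

for name, out in run_workflow(STEPS):
    print(f"{name}: {out}")
```

    A GUI layered on such a runner mainly adds parameter forms per step and a flowchart view of the workflow state, which matches the design the abstract describes.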

  11. On the design of a miniature haptic ring for cutaneous force feedback using shape memory alloy actuators

    Science.gov (United States)

    Hwang, Donghyun; Lee, Jaemin; Kim, Keehoon

    2017-10-01

    This paper proposes a miniature haptic ring that can display touch/pressure and shearing force to the user's fingerpad. For practical use and wider application of the device, it is developed with the aim of achieving high wearability and mobility/portability as well as cutaneous force feedback functionality. The main body of the device is designed as a ring-shaped lightweight structure with a simple driving mechanism, and thin shape memory alloy (SMA) wires with high energy density are applied as actuating elements. In addition, with a band-type wireless control unit including a wireless data communication module, the whole device can be realized as a wearable mobile haptic device system. These features give the device diverse performance advantages and significant usability. In this work, the proposed miniature haptic ring is systematically designed, and its performance is experimentally evaluated with a fabricated functional prototype. The experimental results clearly demonstrate that the proposed device exhibits a higher force-to-weight ratio than conventional finger-wearable haptic devices for cutaneous force feedback. They also show that the operational performance of the device is strongly influenced by the electro-thermomechanical behavior of the SMA actuator. In addition to the performance evaluation experiments, we conduct a preliminary user test to assess practical feasibility and usability based on users' qualitative feedback.

  12. EPICS-QT based graphical user interface for accelerator control

    International Nuclear Information System (INIS)

    Basu, A.; Singh, S.K.; Rosily, Sherry; Bhagwat, P.V.

    2016-01-01

    Particle accelerators, like many complex industrial systems, require robust and efficient control for proper operation, to achieve the required beam quality and to ensure the safety of subcomponents and all working personnel. This control is executed via a graphical user interface through which an operator interacts with the accelerator to achieve the desired state of the machine and its output. Experimental Physics and Industrial Control System (EPICS) is a widely used control system framework in the field of accelerator control. It acts as a middle layer between field devices and the graphical user interface used by the operator; field devices can also be made EPICS compliant by running EPICS-based software on them. Qt, on the other hand, is a C++ framework widely used for creating professional-looking and user-friendly graphical components. For the Low Energy High Intensity Proton Accelerator (LEHIPA), the first stage of the three-stage Accelerator Driven System (ADS) program undertaken by Bhabha Atomic Research Centre (BARC), it was decided that EPICS will be used for controlling the accelerator and Qt for developing the various Graphical User Interfaces (GUIs) for operation and diagnostics. This paper discusses the work carried out to achieve this goal in LEHIPA.

  13. User-Centered Design, Experience, and Usability of an Electronic Consent User Interface to Facilitate Informed Decision-Making in an HIV Clinic.

    Science.gov (United States)

    Ramos, S Raquel

    2017-11-01

    Health information exchange is the electronic accessibility and transferability of patient medical records across various healthcare settings and providers. In some states, patients have to formally give consent to allow their medical records to be electronically shared. The purpose of this study was to apply a novel user-centered, multistep, multiframework approach to design and test an electronic consent user interface, so patients with HIV can make more informed decisions about electronically sharing their health information. This study consisted of two steps. Step 1 was a cross-sectional, descriptive, qualitative study that used user-centric design interviews to create the user interface. This informed Step 2, which consisted of a one-group posttest to examine perceptions of usefulness, ease of use, preference, and comprehension of a health information exchange electronic consent user interface. More than half of the study population had college experience, but challenges remained with overall comprehension regarding consent. The user interface was not independently successful, suggesting that in addition to an electronic consent user interface, human interaction may also be necessary to address the complexities associated with consenting to electronically share health information. Comprehension is a key factor in the ability to make informed decisions.

  14. Mobile Haptic Technology Development through Artistic Exploration

    DEFF Research Database (Denmark)

    Cuartielles, David; Göransson, Andreas; Olsson, Tony

    2012-01-01

    This paper investigates how artistic explorations can be useful for the development of mobile haptic technology. It presents an alternative framework of design for wearable haptics that contributes to the building of haptic communities outside specialized research contexts. The paper also present...

  15. Model-driven Instrumentation of graphical user interfaces.

    OpenAIRE

    Funk, M.; Hoyer, P.; Link, S.

    2009-01-01

    In today's continuously changing markets newly developed products often do not meet the demands and expectations of customers. Research on this problem identified a large gap between developer and user expectations. Approaches to bridge this gap are to provide the developers with better information on product usage and to create a fast feedback cycle that helps tackle usage problems. Therefore, the user interface of the product, the central point of human-computer interaction, has to be ins...

  16. A novel graphical user interface for ultrasound-guided shoulder arthroscopic surgery

    Science.gov (United States)

    Tyryshkin, K.; Mousavi, P.; Beek, M.; Pichora, D.; Abolmaesumi, P.

    2007-03-01

    This paper presents a novel graphical user interface developed for a navigation system for ultrasound-guided computer-assisted shoulder arthroscopic surgery. The envisioned purpose of the interface is to assist the surgeon in determining the position and orientation of the arthroscopic camera and other surgical tools within the anatomy of the patient. The user interface features real time position tracking of the arthroscopic instruments with an optical tracking system, and visualization of their graphical representations relative to a three-dimensional shoulder surface model of the patient, created from computed tomography images. In addition, the developed graphical interface facilitates fast and user-friendly intra-operative calibration of the arthroscope and the arthroscopic burr, capture and segmentation of ultrasound images, and intra-operative registration. A pilot study simulating the computer-aided shoulder arthroscopic procedure on a shoulder phantom demonstrated the speed, efficiency and ease-of-use of the system.

  17. AutoCAD platform customization user interface and beyond

    CERN Document Server

    Ambrosius, Lee

    2014-01-01

    Make AutoCAD your own with powerful personalization options. Options for AutoCAD customization are typically the domain of administrators, but savvy users can perform their own customizations to personalize AutoCAD. Until recently, most users never thought to customize the AutoCAD platform to meet their specific needs, instead leaving it to administrators. If you are an AutoCAD user who wants to ramp up personalization options in your favorite software, AutoCAD Platform Customization: User Interface and Beyond is the perfect resource for you. Author Lee Ambrosius is recognized as a leader in Au

  18. Use of natural user interfaces in water simulations

    Science.gov (United States)

    Donchyts, G.; Baart, F.; van Dam, A.; Jagers, B.

    2013-12-01

    Conventional graphical user interfaces, used to edit input and present results of earth science models, have seen little innovation over the past two decades. In most cases model data are presented and edited using 2D projections, even when working with 3D data. The emergence of 3D motion-sensing technologies, such as Microsoft Kinect and LEAP Motion, opens new possibilities for user interaction by adding more degrees of freedom compared with the classical mouse-and-keyboard approach. Here we investigate how interaction with hydrodynamic numerical models can be improved using these new technologies. Our research hypothesis (H1) states that a properly designed 3D graphical user interface paired with a 3D motion sensor can significantly reduce the time required to set up and use numerical models. In this work we have used a LEAP Motion controller combined with the shallow water flow model engine D-Flow Flexible Mesh. Interacting with numerical model using hands

  19. Tangible User Interface and Mu Rhythm Suppression: The Effect of User Interface on the Brain Activity in Its Operator and Observer

    Directory of Open Access Journals (Sweden)

    Kazuo Isoda

    2017-03-01

    Full Text Available The intuitiveness of a tangible user interface (TUI) matters not only to its operator. It is quite possible that this type of user interface (UI) also affects the experience and learning of observers who are merely watching the operator use it. To understand this possible effect of TUI, the present study focused on mu rhythm suppression in the sensorimotor area, which reflects the execution and observation of action, and investigated brain activity in both the operator and the observer. In the observer experiment, the effect of TUI on its observers was demonstrated through brain activity. Although the effect of the grasping action itself was uncertain, the unpredictability of the result of the action seemed to have some effect on the mirror neuron system (MNS)-related brain activity. In the operator experiment, despite the same grasping action, activity in the sensorimotor area increased when UI functions were included (TUI). Such activation was not found with a graphical user interface (GUI) that has UI functions without a grasping action. These results suggest that MNS-related brain activity is involved in the effect of TUI, indicating the possibility of UI evaluation based on brain activity.

  20. The Promise of Zoomable User Interfaces

    Science.gov (United States)

    Bederson, Benjamin B.

    2011-01-01

    Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…

  1. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    NARCIS (Netherlands)

    Snellenburg, J.J.; Laptenok, S.; Seger, R.; Mullen, K.M.; van Stokkum, I.H.M.

    2012-01-01

    In this work the software application called Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for the interaction with data, the

  2. Interfacing ANSYS to user's programs using UNIX shell program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, In Yong; Kim, Beom Shig [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

    It has been considered impossible to interface ANSYS, a commercial finite element code whose source is not open to the public, to other user programs. When an analysis needs to be iterated, the user must wait until the analysis is finished and read the ANSYS results to prepare the input data for every iteration. In this report, direct interfacing techniques between ANSYS and other programs using UNIX shell programming are proposed. Detailed program listings and an application example are also provided. (Author) 19 refs., 6 figs., 7 tabs.
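
    The iterate-and-read-results loop that such interfacing automates can be sketched as follows. This is a minimal illustrative sketch, not the report's actual shell scripts: run_solver() is a stub standing in for the ANSYS batch invocation, and the halving "analysis" and stopping tolerance are invented for the example.

```python
# Minimal sketch of an automated iteration loop in the spirit of the
# shell-based ANSYS interfacing described above. run_solver() is a stub:
# a real driver would launch the solver in batch mode (e.g. with
# subprocess.run) and parse its result file. The halving "analysis" and
# the stopping tolerance are invented purely for illustration.

def run_solver(value: float) -> float:
    """Stand-in for one batch analysis pass; here it just halves the input."""
    return value / 2.0

def iterate(x0: float, tol: float = 1.5, max_iter: int = 10) -> float:
    x = x0
    for _ in range(max_iter):
        x = run_solver(x)   # run the analysis (an ANSYS batch run in reality)
        if x < tol:         # read the result and decide whether to continue
            break
    return x                # final result, ready for the user's program

print(iterate(16.0))  # 16 -> 8 -> 4 -> 2 -> 1; prints 1.0
```

    The point of the report is that this driving loop runs unattended, instead of the user waiting for each analysis and preparing the next input by hand.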

  3. The phenomenological experience of dementia and user interface development

    DEFF Research Database (Denmark)

    Peterson, Carrie Beth; Mitseva, Anelia; Mihovska, Albena D.

    2009-01-01

    This study follows the project ISISEMD through a phenomenological approach of investigating the experience of the Human Computer Interaction (HCI) for someone with dementia. The aim is to accentuate the Assistive Technology (AT) from the end user perspective. It proposes that older adults and those...... with dementia should no longer be an overlooked population, and how the HCI community can learn from their experiences to develop methods and design interfaces which truly benefit these individuals. Guidelines from previous research are incorporated along with eclectic, user-centered strategies as the interface...... designers for ISISEMD develop an appropriate and effective modality. The paper outlines the interconnected difficulties associated with the characteristics of older adults with mild dementia, which are important to be considered when introducing AT to that group of end users. It further presents clear...

  4. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.
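
    The client-server layering described above can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the command string, the reply format, and the use of plain Python sockets (the actual system used Java applet clients, a QNX-based machine controller, and open Internet standards).

```python
# Toy sketch of a UI client talking to a machine-controller layer over a
# socket, illustrating the client-server layering described above. The
# "START_PUMP" command and "ACK" reply are invented for this example.
import socket
import threading

def controller(server: socket.socket) -> None:
    """Machine-controller layer: accept one command and acknowledge it."""
    conn, _ = server.accept()
    with conn:
        cmd = conn.recv(1024).decode()
        conn.sendall(f"ACK {cmd}".encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))        # ephemeral port on localhost
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=controller, args=(server,))
t.start()

# User-interface layer: send a command and read the controller's reply.
with socket.create_connection(("127.0.0.1", port)) as ui:
    ui.sendall(b"START_PUMP")
    reply = ui.recv(1024).decode()
t.join()
server.close()
print(reply)  # ACK START_PUMP
```

    Because each layer only sees the socket protocol, either side can be swapped out (a Web-browser applet for the client, different hardware subsystems behind the controller) without touching the other, which is the flexibility the paper reports.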

  5. A Vehicle Haptic Steering by Wire System Based on High Gain GPI Observers

    Directory of Open Access Journals (Sweden)

    A. Rodriguez-Angeles

    2014-01-01

    Full Text Available A vehicle steering-by-wire (SBW) haptic system based on high-gain generalized proportional integral (GPI) observers is introduced. The observers are used to estimate the dynamic perturbations that are present at the tire and the steering wheel. To ensure efficient tracking between the commanded steering wheel angle and the tire orientation angle, the estimated perturbations are canceled online. To provide a haptic interface with the driver, the estimated dynamic effects at the steering rack are fed back to the steering wheel, yielding a master-slave haptic system with bilateral communication. For implementation purposes only a few sensors and minimal knowledge of the dynamic model are required, which is a major advantage compared with other approaches. Only position tracking errors are fed back, while all other signals are estimated by the high-gain GPI observers. The scheme is robust to uncertainty in the input gain and cancels dynamic perturbation effects such as friction and aligning forces on the tire. Experimental results are presented on a prototype platform.

  6. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies has opened the way to the design of novel attention-based intelligent user interfaces, and has highlighted the importance of a better understanding of eye gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation. Eye Gaze in Intelligent User Interfac

  7. A user interface development tool for space science systems Transportable Applications Environment (TAE) Plus

    Science.gov (United States)

    Szczur, Martha R.

    1990-01-01

    The Transportable Applications Environment Plus (TAE Plus), developed at NASA's Goddard Space Flight Center, is a portable What You See Is What You Get (WYSIWYG) user interface development and management system. Its primary objective is to provide an integrated software environment that allows interactive prototyping and development of user interfaces, as well as management of the user interface within the operational domain. Although TAE Plus is applicable to many types of applications, its focus is supporting user interfaces for space applications. This paper discusses what TAE Plus provides and how the implementation has utilized state-of-the-art technologies within graphic workstations, windowing systems and object-oriented programming languages.

  8. User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface

    Science.gov (United States)

    Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.

    1993-01-01

    The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations, consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files, parameters, and output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program, or it can be called from within a graphical user interface, HGUI, that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and type-in fields in HGUI for users to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.

  9. Influence of Learning Styles on Graphical User Interface Preferences for e-Learners

    Science.gov (United States)

    Dedic, Velimir; Markovic, Suzana

    2012-01-01

    Implementing Web-based educational environment requires not only developing appropriate architectures, but also incorporating human factors considerations. User interface becomes the major channel to convey information in e-learning context: a well-designed and friendly enough interface is thus the key element in helping users to get the best…

  10. Spelling Correction in User Interfaces.

    Science.gov (United States)

    1982-12-20

    ...a conventional typescript-oriented command language, where most commands consist of a verb followed by a sequence of arguments. Most user terminals are ... and explanations, not part of the typescripts. 2. Design Issues: We were prompted to look for a new correction ... The remaining 73% led us to wonder what other mechanisms might permit further corrections while retaining the typescript-style interface. Most of the other

  11. The intelligent user interface for NASA's advanced information management systems

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.

  12. Multi-arm multilateral haptics-based immersive tele-robotic system (HITS) for improvised explosive device disposal

    Science.gov (United States)

    Erickson, David; Lacheray, Hervé; Lai, Gilbert; Haddadi, Amir

    2014-06-01

    This paper presents the latest advancements of the Haptics-based Immersive Tele-robotic System (HITS) project, a next generation Improvised Explosive Device (IED) disposal (IEDD) robotic interface containing an immersive telepresence environment for a remotely-controlled three-articulated-robotic-arm system. While the haptic feedback enhances the operator's perception of the remote environment, a third teleoperated dexterous arm, equipped with multiple vision sensors and cameras, provides stereo vision with proper visual cues, and a 3D photo-realistic model of the potential IED. This decentralized system combines various capabilities including stable and scaled motion, singularity avoidance, cross-coupled hybrid control, active collision detection and avoidance, compliance control and constrained motion to provide a safe and intuitive control environment for the operators. Experimental results and validation of the current system are presented through various essential IEDD tasks. This project demonstrates that a two-armed anthropomorphic Explosive Ordnance Disposal (EOD) robot interface can perform complex neutralization techniques against realistic IEDs without the operator approaching the device at any time.

  13. X-windows-based user interface for data acquisition and display

    International Nuclear Information System (INIS)

    Fredian, T.W.; Stillerman, J.A.

    1990-01-01

    A Macintosh-like user interface for the MDS-Plus data acquisition system is being implemented using the DECwindows MIT/X interface. MDS-Plus is a model driven general purpose data acquisition system being developed collaboratively by the CMOD group at MIT Plasma Fusion Center, the RFX group at IGI-Padua, and the ZTH group at Los Alamos National Laboratory. The model is a hierarchical description of an experiment, including all of the tasks to be performed and the results of having performed them. The inherent complexity of this experimental model requires the users to specify fairly complicated descriptions of what they want the system to do. A ''Point and Click'' interface simplifies this by presenting to the user a coherent set of choices which are valid in the current context. We are implementing a set of tools for data acquisition and data analysis which use DECwindows to this end. They include a data displayer (Scope Replacement), an experiment model editor (Tree Editor), a timing system, and a waveform editor. These tools provide an easy to use interface to the MDS-Plus data acquisition system
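
    The hierarchical experiment model at the heart of MDS-Plus can be pictured as a tree whose nodes hold sub-nodes and results. The sketch below is a toy illustration only: the node names and this Python representation are invented, not the actual MDS-Plus API. It shows how a "Point and Click" interface could offer the current node's children as the coherent set of valid choices.

```python
# Toy tree of an experiment description: each node may carry a result and
# child nodes. Names (TIMING, TRIGGER, ...) are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    result: object = None
    children: dict = field(default_factory=dict)

    def add(self, name: str) -> "Node":
        child = Node(name)
        self.children[name] = child
        return child

shot = Node("EXPERIMENT")
timing = shot.add("TIMING")
timing.add("TRIGGER").result = 0.0            # result of a performed task
shot.add("DIAGNOSTICS").add("INTERFEROMETER")

# A "Point and Click" interface would present only choices valid in the
# current context, i.e. the children of the node being browsed:
print(sorted(shot.children))  # ['DIAGNOSTICS', 'TIMING']
```

    Storing both the tasks to perform and their results in one tree is what lets tools like the Tree Editor and the data displayer navigate the same model.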

  14. A visual Fortran user interface for CITATION code

    International Nuclear Information System (INIS)

    Albarhoum, M.; Zaidan, N.

    2006-11-01

    A user interface is designed to enable running the CITATION code under Windows. Four sections of the CITATION input file are arranged in the form of four interfaces, in which all the parameters of a section can be modified dynamically. Help for each parameter (item) can be read from a general help text for the section, which, in turn, can be displayed by selecting the section from the program's main menu. (author)

  15. User Interface Design in Medical Distributed Web Applications.

    Science.gov (United States)

    Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara

    2016-01-01

    User interfaces are important for easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation; the technology in the background is an important tool to accomplish this. The present work aims to create a web interface using a specific technology (HTML table design combined with CSS3) to provide an optimized responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed, and its structure is designed using specific tools, on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed, using HTML table design (TD) and the CSS3 method, that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at once. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared with the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, using HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.

  16. Bilateral intraocular lens subluxation secondary to haptic angulation.

    Science.gov (United States)

    Moreno-Montañés, Javier; Fernández-Hortelano, Ana; Caire, Josemaría

    2008-04-01

    An 82-year-old man had uneventful phacoemulsification with bilateral implantation of a hydrophilic acrylic, single-piece intraocular lens (IOL) (ACR6D SE, Laboratoires Cornéal). Five years later, simultaneous and bilateral IOL subluxations occurred. In both eyes, the subluxation was situated on the side of one haptic that had moved forward (temporal area in the right eye and superior area in the left eye). In the right eye, the haptic-capsular bag was entrapped by the pupil and produced endothelial damage. A transscleral suture was placed over and under the subluxated haptic through the anterior and posterior capsules to capture the haptic. The haptic was then sutured to the sclera. No postoperative complications developed. We hypothesize that 10-degree angulated and broad haptic junctions can lead to zonular damage and IOL subluxation.

  17. Feedback from Usability Evaluation to User Interface Design

    DEFF Research Database (Denmark)

    Nielsen, C. M.; Overgaard, M.; Pedersen, M. B.

    2005-01-01

    This paper reports from an exploratory study of means for providing feedback from a usability evaluation to the user interface designers. In this study, we conducted a usability evaluation of a mobile system that is used by craftsmen to register use of time and materials. The results...... and weaknesses of the system. The findings indicate that detailed descriptions of problems and log descriptions of the user's interaction with the system and of system interaction are useful for the designers when trying to understand the usability problems that the users have encountered....

  18. A Prototype Graphical User Interface for Co-op: A Group Decision Support System.

    Science.gov (United States)

    1992-03-01

    achieve their potential to communicate. Information-oriented, systematic graphic design is the use of typography, symbols, color, and other static and ... application by reducing user effort and enhancing interaction. This thesis designs and develops a prototype Graphical User Interface (GUI) for Co-op ...

  19. A graphical user-interface control system at SRRC

    International Nuclear Information System (INIS)

    Chen, J.S.; Wang, C.J.; Chen, S.J.; Jan, G.J.

    1993-01-01

    A graphical user interface control system for the 1.3 GeV synchrotron radiation light source was designed and implemented for the beam transport line (BTL) and storage ring (SR). Modern control techniques have been used to implement and control this third-generation synchrotron light source. A two-level computer hardware configuration, with process and console computers at the top level and VME-based intelligent local controllers at the bottom level, was set up and tested. The two levels are linked by a high-speed Ethernet data communication network. A database, comprising static and dynamic parts, was developed along with its access routines. To make machine commissioning and operation user friendly, a graphical man-machine interface was designed and coded. The graphical user interface (GUI) software was installed on VAX workstations for the BTL and SR at the Synchrotron Radiation Research Center (SRRC). The overall performance has been evaluated at a 10 Hz update rate. The results showed that the graphical operator interface control system is versatile and can be incorporated into the control system of the accelerator. It will provide the tools to control and monitor the equipment of the radiation light source, especially for machine commissioning and operation

  20. Prototyping of user interfaces for mobile applications

    CERN Document Server

    Bähr, Benjamin

    2017-01-01

    This book investigates processes for the prototyping of user interfaces for mobile apps, and describes the development of new concepts and tools that can improve the prototype driven app development in the early stages. It presents the development and evaluation of a new requirements catalogue for prototyping mobile app tools that identifies the most important criteria such tools should meet at different prototype-development stages. This catalogue is not just a good point of orientation for designing new prototyping approaches, but also provides a set of metrics for comparing the performance of alternative prototyping tools. In addition, the book discusses the development of Blended Prototyping, a new approach for prototyping user interfaces for mobile applications in the early and middle development stages, and presents the results of an evaluation of its performance, showing that it provides a tool for teamwork-oriented, creative prototyping of mobile apps in the early design stages.

  1. Advanced Displays and Natural User Interfaces to Support Learning

    Science.gov (United States)

    Martin-SanJose, Juan-Fernando; Juan, M. -Carmen; Mollá, Ramón; Vivó, Roberto

    2017-01-01

    Advanced displays and natural user interfaces (NUI) are a very suitable combination for developing systems to provide an enhanced and richer user experience. This combination can be appropriate in several fields and has not been extensively exploited. One of the fields that this combination is especially suitable for is education. Nowadays,…

  2. The Morphing Waldo: An Adaptive User Interface.

    Science.gov (United States)

    Howell, Colby Chambers

    2001-01-01

    Performance Centered Design (PCD) offers an alternative to the old methodology of software development. The author suggests one design possibility that could be used to create a more universally satisfying interface for existing applications, an adaptive mini-program that "sits" between the larger application and the user. Potential…

  3. ADELA - user interface for fuel charge design

    International Nuclear Information System (INIS)

    Havluj, Frantisek

    2010-01-01

    ADELA is a supporting computer code - an add-on to the ANDREA code - for fuel batch design and optimization. It facilitates fuel batch planning, evaluation and archiving through a graphical user interface. ADELA simplifies and automates the design process and is closely linked to the QUADRIGA system for data library creation. (author)

  4. Using Vim as User Interface for Your Applications

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The Vim editor offers one of the cleverest user interfaces. It's why many developers write programs with vi keyboard bindings. Now, imagine how powerful it gets to build applications literally on top of Vim itself.

  5. The Philosophy of User Interfaces in HELIO and the Importance of CASSIS

    Science.gov (United States)

    Bonnin, X.; Aboudarham, J.; Renié, C.; Csillaghy, A.; Messerotti, M.; Bentley, R. D.

    2012-09-01

    HELIO is a European project funded under FP7 (Project No. 238969). One of its goals as a Heliospheric Virtual Observatory is to provide easy access to many datasets scattered all over the world, in the fields of solar physics, heliophysics, and planetary magnetospheres. The efficiency of such a tool depends very much on the quality of the user interface. The HELIO infrastructure is based on a Service Oriented Architecture (SOA), grouping a network of standalone components, which allows four main types of interfaces: - The HELIO Front End (HFE) is a browser-based user interface, which offers centralized access to HELIO's main functionalities. In particular, it provides the possibility to reach data directly, or to refine a selection by determining observing characteristics, such as which instrument was observing at that time, which instrument was at this location, etc. - Many services/components provide their own standalone graphical user interface. While one can directly access each of these interfaces individually, they can also be connected together. - Most services also provide direct access for any tools through a public interface. A small Java library, called the Java API, simplifies this access by providing client stubs for services and shields the user from security, discovery and failover issues. - Workflow capabilities are available in HELIO, allowing complex combinations of queries over several services. We want the user to be able to navigate easily, as needed, through the various interfaces, and possibly use a specific one in order to make highly dedicated queries. We will also emphasize the importance of the CASSIS project (Coordination Action for the integration of Solar System Infrastructure and Science) in encouraging the interoperability necessary to undertake scientific studies that span disciplinary boundaries. 
If related projects follow the guidelines being developed by CASSIS then using external resources with HELIO will be greatly simplified.

  6. User Interface Technology Transfer to NASA's Virtual Wind Tunnel System

    Science.gov (United States)

    vanDam, Andries

    1998-01-01

    Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.

  7. UsiGesture: Test and Evaluation of an Environment for Integrating Gestures in User Interfaces

    OpenAIRE

    Beuvens, François; Vanderdonckt, Jean

    2014-01-01

    User interfaces allowing gesture recognition and manipulation have become more and more popular in recent years. It however remains a hard task for programmers to develop such interfaces: some knowledge of recognition systems is required, along with user experience and user interface management knowledge. It is often difficult for a single developer to handle all this knowledge alone, which is why a team gathering different skills is most of the time needed. We previously presented a...

  8. Designing distributed user interfaces for ambient intelligent environments using models and simulations

    OpenAIRE

    LUYTEN, Kris; VAN DEN BERGH, Jan; VANDERVELPEN, Chris; CONINX, Karin

    2006-01-01

    There is a growing demand for design support to create interactive systems that are deployed in ambient intelligent environments. Unlike traditional interactive systems, the wide diversity of situations in which these types of user interfaces need to work requires tool support that is close to the environment of the end-user on the one hand and provides a smooth integration with the application logic on the other hand. This paper shows how the model-based user interface development methodology can be ...

  9. Faster simulated laparoscopic cholecystectomy with haptic feedback technology

    Directory of Open Access Journals (Sweden)

    Yiasemidou M

    2011-10-01

    Full Text Available Marina Yiasemidou, Daniel Glassman, Peter Vasas, Sarit Badiani, Bijendra Patel Barts and the London School of Medicine and Dentistry, Department of Upper GI Surgery, Barts and The Royal London Hospital, London, UK Background: Virtual reality simulators have been gradually introduced into surgical training. One of the enhanced features of the latest virtual simulators is haptic feedback. The usefulness of haptic feedback technology has been a matter of controversy in recent years. Previous studies have assessed the importance of haptic feedback in executing parts of a procedure or basic tasks, such as tissue grasping. The aim of this study was to assess the role of haptic feedback within a structured educational environment, based on the performance of junior surgical trainees after undergoing substantial simulation training. Methods: Novices, whose performance was assessed after several repetitions of a task, were recruited for this study. The performance of senior house officers at the last stage of a validated laparoscopic cholecystectomy curriculum was assessed. Nine senior house officers completed a validated laparoscopic cholecystectomy curriculum on a haptic simulator and nine on a nonhaptic simulator. Performance in terms of mean total time, mean total number of movements, and mean total path length at the last level of the validated curriculum (full procedure of laparoscopic cholecystectomy was compared between the two groups. Results: Haptic feedback significantly reduced the time required to complete the full procedure of laparoscopic cholecystectomy (mean total time for nonhaptic machine 608.83 seconds, mean total time for haptic machine 553.27 seconds; P = 0.019 while maintaining safety standards similar to those of the nonhaptic machine (mean total number of movements: nonhaptic machine 583.74, haptic machine 603.93, P = 0.145, mean total path length: for nonhaptic machine 1207.37 cm, for haptic machine 1262.36 cm, P = 0

  10. Developing Visual Editors for High-Resolution Haptic Patterns

    DEFF Research Database (Denmark)

    Cuartielles, David; Göransson, Andreas; Olsson, Tony

    2012-01-01

    In this article we give an overview of our iterative work in developing visual editors for creating high-resolution haptic patterns to be used in wearable, haptic feedback devices. During the past four years we have found the need to address the question of how to represent, construct and edit high-resolution haptic patterns so that they translate naturally to the user’s haptic experience. To solve this question we have developed and tested several visual editors...

  11. User Interface of MUDR Electronic Health Record

    Czech Academy of Sciences Publication Activity Database

    Hanzlíček, Petr; Špidlen, Josef; Heroutová, Helena; Nagy, Miroslav

    2005-01-01

    Roč. 74, - (2005), s. 221-227 ISSN 1386-5056 R&D Projects: GA MŠk LN00B107 Institutional research plan: CEZ:AV0Z10300504 Keywords : electronic health record * user interface * data entry * knowledge base Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.374, year: 2005

  12. Different haptic tools reduce trunk velocity in the frontal plane during walking, but haptic anchors have advantages over lightly touching a railing.

    Science.gov (United States)

    Hedayat, Isabel; Moraes, Renato; Lanovaz, Joel L; Oates, Alison R

    2017-06-01

    There are different ways to add haptic input during walking, which may affect walking balance. This study compared the use of two different haptic tools (a rigid railing and haptic anchors) and investigated whether any effects on walking were the result of the added sensory input and/or the posture generated when using those tools. Data from 28 young healthy adults were collected using the Mobility Lab inertial sensor system (APDM, Oregon, USA). Participants walked with and without both haptic tools, and while pretending to use both haptic tools (placebo trials), with eyes open and with eyes closed. Using the tools or pretending to use both tools decreased normalized stride velocity (p  .999). These findings highlight a difference in the type of tool used to add haptic input and suggest that the changes in balance control strategy from using the railing stem from arm placement, whereas with the haptic anchors it is the posture combined with the added sensory input that affects balance control strategies. These findings provide a strong framework for additional research on the effects of haptic input on walking in populations known to have decreased walking balance.

  13. User interface development and metadata considerations for the Atmospheric Radiation Measurement (ARM) archive

    Science.gov (United States)

    Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.

    1993-01-01

    This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.

  14. Bilinear modeling of EMG signals to extract user-independent features for multiuser myoelectric interface.

    Science.gov (United States)

    Matsubara, Takamitsu; Morimoto, Jun

    2013-08-01

    In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured. When different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose for EMG signals a bilinear model that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features. We can construct a motion classifier on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can extract the user-independent features from the novel user data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subject forearms. Our method resulted in 73% accuracy, which was statistically significantly different from the accuracy of standard nonmultiuser interfaces, as the result of a two-sample t -test at a significance level of 1%.
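
    The bilinear decomposition described above can be illustrated with a deliberately small sketch. Everything below is invented for illustration (the channel templates, the scalar user gain, and the calibration protocol are not from the paper, which uses vector-valued factors and four-channel EMG): a user-dependent gain multiplies a motion-dependent template, one calibration motion estimates the gain by least squares, and classification then runs on the user-normalized features.

```python
import math

# Hypothetical motion-dependent factors: per-channel EMG templates (assumed values).
MOTION_TEMPLATES = {
    "grasp": [1.0, 0.2, 0.6],
    "pinch": [0.3, 1.0, 0.4],
}

def synth_emg(user_gain, motion):
    """Simulate EMG features as (user-dependent gain) x (motion-dependent template)."""
    return [user_gain * v for v in MOTION_TEMPLATES[motion]]

def estimate_user_gain(calib_signal, calib_motion):
    """Least-squares estimate of the scalar user factor from one calibration motion."""
    t = MOTION_TEMPLATES[calib_motion]
    return sum(s * v for s, v in zip(calib_signal, t)) / sum(v * v for v in t)

def classify(signal, user_gain):
    """Divide out the user factor, then nearest-template classification."""
    feats = [s / user_gain for s in signal]
    return min(MOTION_TEMPLATES, key=lambda m: math.dist(feats, MOTION_TEMPLATES[m]))

# A novel user (gain 2.5) calibrates with one "grasp"; a "pinch" is then recognized.
gain = estimate_user_gain(synth_emg(2.5, "grasp"), "grasp")
print(classify(synth_emg(2.5, "pinch"), gain))  # -> pinch
```

    In this toy model the user factor is recovered exactly, so the normalized features match the motion templates regardless of the user's gain, which is the user-independence property the abstract describes.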

  15. Comparison of Walking and Traveling-Wave Piezoelectric Motors as Actuators in Kinesthetic Haptic Devices.

    Science.gov (United States)

    Olsson, Pontus; Nysjo, Fredrik; Carlbom, Ingrid B; Johansson, Stefan

    2016-01-01

    Piezoelectric motors offer an attractive alternative to electromagnetic actuators in portable haptic interfaces: they are compact, have a high force-to-volume ratio, and can operate with limited or no gearing. However, the choice of a piezoelectric motor type is not obvious due to differences in performance characteristics. We present our evaluation of two commercial, operationally different, piezoelectric motors acting as actuators in two kinesthetic haptic grippers, a walking quasi-static motor and a traveling wave ultrasonic motor. We evaluate each gripper's ability to display common virtual objects including springs, dampers, and rigid walls, and conclude that the walking quasi-static motor is superior at low velocities. However, for applications where high velocity is required, traveling wave ultrasonic motors are a better option.

  16. A graphical user-interface for propulsion system analysis

    Science.gov (United States)

    Curlett, Brian P.; Ryall, Kathleen

    1993-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  17. A human activity approach to User Interfaces

    DEFF Research Database (Denmark)

    Bødker, Susanne

    1989-01-01

    the work situations in which computer-based artifacts are used: The framework deals with the role of the user interface in purposeful human work. Human activity theory is used in this analysis. The purpose of this article is to make the reader curious and hopefully open his or her eyes to a somewhat...

  18. Layout design of user interface components with multiple objectives

    Directory of Open Access Journals (Sweden)

    Peer S.K.

    2004-01-01

    Full Text Available A multi-goal layout problem may be formulated as a Quadratic Assignment model, considering multiple goals (or factors), both qualitative and quantitative, in the objective function. The facilities layout problem in general ranges from the location and layout of facilities in a manufacturing plant to the location and layout of textual and graphical user interface components in the human–computer interface. In this paper, we propose two alternative mathematical approaches to the single-objective layout model. The first presents a multi-goal user interface component layout problem considering the distance-weighted sum of congruent objectives of closeness relationships and interactions. The second considers the distance-weighted sum of congruent objectives of normalized weighted closeness relationships and normalized weighted interactions. The results of the first approach are compared with those of an existing single-objective model for the example task under consideration. Then the results of the first and second approaches of the proposed model are compared for the example task under consideration.
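
    The distance-weighted objective described above is the classic Quadratic Assignment form. The following sketch uses invented interaction and distance matrices (not the paper's data) to show how a layout's cost is evaluated and how, for a handful of components, the best assignment of components to screen slots can be found by exhaustive search.

```python
from itertools import permutations

# interaction[i][j]: assumed interaction frequency between UI components i and j.
interaction = [
    [0, 5, 2],
    [5, 0, 1],
    [2, 1, 0],
]
# dist[a][b]: assumed distance between screen slots a and b.
dist = [
    [0, 1, 2],
    [1, 0, 1],
    [2, 1, 0],
]

def layout_cost(assign):
    """Quadratic-assignment objective: sum over component pairs of interaction x distance."""
    n = len(assign)
    return sum(
        interaction[i][j] * dist[assign[i]][assign[j]]
        for i in range(n) for j in range(n)
    )

# Exhaustive search over all assignments is feasible for a handful of components.
best = min(permutations(range(3)), key=layout_cost)
print(best, layout_cost(best))
```

    Real layouts need heuristics rather than brute force, since the number of assignments grows factorially, but the objective being minimized is the same.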

  19. Helping Students Test Programs That Have Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Matthew Thornton

    2008-08-01

    Full Text Available Within computer science education, many educators are incorporating software testing activities into regular programming assignments. Tools like JUnit and its relatives make software testing tasks much easier, bringing them into the realm of even introductory students. At the same time, many introductory programming courses are now including graphical interfaces as part of student assignments to improve student interest and engagement. Unfortunately, writing software tests for programs that have significant graphical user interfaces is beyond the skills of typical students (and many educators). This paper presents initial work at combining educationally oriented and open-source tools to create an infrastructure for writing tests for Java programs that have graphical user interfaces. Critically, these tools are intended to be appropriate for introductory (CS1/CS2) student use, and to dovetail with current teaching approaches that incorporate software testing in programming assignments. We also present our proposed approach to evaluating these techniques.

  20. One-Dimensional Haptic Rendering Using Audio Speaker with Displacement Determined by Inductance

    Directory of Open Access Journals (Sweden)

    Avin Khera

    2016-03-01

    Full Text Available We report overall design considerations and preliminary results for a new haptic rendering device based on an audio loudspeaker. Our application models tissue properties during microsurgery. For example, the device could respond to the tip of a tool by simulating a particular tissue, displaying a desired compressibility and viscosity, giving way as the tissue is disrupted, or exhibiting independent motion, such as that caused by pulsations in blood pressure. Although limited to one degree of freedom and with a relatively small range of displacement compared to other available haptic rendering devices, our design exhibits high bandwidth, low friction, low hysteresis, and low mass. These features are consistent with modeling interactions with delicate tissues during microsurgery. In addition, our haptic rendering device is designed to be simple and inexpensive to manufacture, in part through an innovative method of measuring displacement by existing variations in the speaker’s inductance as the voice coil moves over the permanent magnet. Low latency and jitter are achieved by running the real-time simulation models on a dedicated microprocessor, while maintaining bidirectional communication with a standard laptop computer for user controls and data logging.
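
    The displacement-from-inductance idea above can be sketched as a calibration lookup. The calibration numbers below are invented placeholders, not measurements from the paper; the point is only the mechanism: measure the voice coil's inductance at known displacements once, then invert the relationship by piecewise-linear interpolation at run time.

```python
import bisect

# Hypothetical calibration table: inductance (microhenries) measured at known
# displacements (millimetres). Values are assumptions, monotonic by construction.
CAL_L_UH = [310.0, 330.0, 360.0, 400.0]
CAL_X_MM = [0.0, 1.0, 2.0, 3.0]

def displacement_mm(l_uh):
    """Piecewise-linear interpolation of displacement from a measured inductance."""
    if not CAL_L_UH[0] <= l_uh <= CAL_L_UH[-1]:
        raise ValueError("inductance outside calibrated range")
    # Locate the calibration segment containing l_uh.
    i = bisect.bisect_right(CAL_L_UH, l_uh)
    i = min(max(i, 1), len(CAL_L_UH) - 1)
    l0, l1 = CAL_L_UH[i - 1], CAL_L_UH[i]
    x0, x1 = CAL_X_MM[i - 1], CAL_X_MM[i]
    return x0 + (x1 - x0) * (l_uh - l0) / (l1 - l0)

print(displacement_mm(345.0))  # midway between 330 and 360 uH -> 1.5 mm
```

    A finer calibration table (or a fitted model) would be used in practice, since the true inductance-displacement curve of a speaker is not exactly piecewise linear.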

  1. Classification of user interfaces for graph-based online analytical processing

    Science.gov (United States)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements, namely on the effective pairing of interface designs with different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing user interface designs with Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces, specifically according to their ability, as perceived by Subject Matter Experts (SMEs), to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided both of existing variations of Graph OLAP and of existing interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable to SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.
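
    The AHP machinery referenced above reduces, at its core, to deriving a priority vector from a pairwise-comparison matrix. The matrix entries below are invented SME judgments on Saaty's 1-9 scale, and the row geometric-mean method is one standard approximation of the AHP eigenvector; the paper's ontology-based encoding is not modeled here.

```python
import math

# Hypothetical pairwise-comparison matrix for three candidate interface designs:
# A[i][j] says how strongly design i is preferred over design j (Saaty scale),
# with A[j][i] = 1 / A[i][j] by reciprocity.
A = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_priorities(a):
    """Approximate the AHP priority vector via the row geometric-mean method."""
    n = len(a)
    gm = [math.prod(row) ** (1.0 / n) for row in a]   # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]                    # normalize to sum to 1

w = ahp_priorities(A)
print([round(x, 3) for x in w])  # design 0 receives the largest weight
```

    With these judgments, the first design dominates both pairwise comparisons and so ends up with the highest priority; a consistency-ratio check would normally accompany this step before trusting the weights.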

  2. Incorporating Haptic Feedback in Simulation for Learning Physics

    Science.gov (United States)

    Han, Insook; Black, John B.

    2011-01-01

    The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…

  3. A magnetorheological haptic cue accelerator for manual transmission vehicles

    International Nuclear Information System (INIS)

    Han, Young-Min; Noh, Kyung-Wook; Choi, Seung-Bok; Lee, Yang-Sub

    2010-01-01

    This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed haptic cue device can transmit the optimal moment of gear shifting for manual transmission to a driver without requiring the driver's visual attention. As a first step to achieve this goal, a MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking into account spatial limitations, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured and its field-dependent torque and time response are experimentally evaluated. Then the manufactured MR haptic cue device is integrated with the accelerator pedal. A simple virtual vehicle emulating the operation of the engine of a passenger vehicle is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated and control performances are experimentally evaluated and presented in the time domain

  4. WASAT. A graphical user interface for visualization of wave spectrograms

    Energy Technology Data Exchange (ETDEWEB)

    Joergensen, R

    1996-12-01

    The report describes a technique for the decoding and visualization of sounding rocket data sets. A specific application for the visualization of three dimensional wave HF FFT spectra obtained from the SCIFER sounding rocket launched January 25, 1995, is made. The data set was decoded from its original data format which was the NASA DITES I/II format. A graphical user interface, WASAT (WAve Spectrogram Analysis Tool), using the Interactive Data Language was created. The data set was visualized using IDL image tools overlayed with contour routines. The user interface was based on the IDL widget concept. 9 refs., 7 figs.

  5. WASAT. A graphical user interface for visualization of wave spectrograms

    International Nuclear Information System (INIS)

    Joergensen, R.

    1996-12-01

    The report describes a technique for the decoding and visualization of sounding rocket data sets. A specific application for the visualization of three dimensional wave HF FFT spectra obtained from the SCIFER sounding rocket launched January 25, 1995, is made. The data set was decoded from its original data format which was the NASA DITES I/II format. A graphical user interface, WASAT (WAve Spectrogram Analysis Tool), using the Interactive Data Language was created. The data set was visualized using IDL image tools overlayed with contour routines. The user interface was based on the IDL widget concept. 9 refs., 7 figs

  6. Four Principles for User Interface Design of Computerised Clinical Decision Support Systems

    DEFF Research Database (Denmark)

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

    emphasises a focus on how users interact with the system, a focus on how information is provided by the system, and four principles of interaction. The four principles for design of user interfaces for CDSS are summarised as four A’s: All in one, At a glance, At hand and Attention. It is recommended that all four interaction principles are integrated in the design of user interfaces for CDSS, i.e. the model is an integrated model which we suggest as a guide for interaction design when working with preventing medication errors....

  7. CATO--A General User Interface for CAS

    Science.gov (United States)

    Janetzko, Hans-Dieter

    2015-01-01

    CATO is a new user interface, developed by the author as a response to the significant difficulties faced by scientists, engineers, and students in their use of computer algebra (CA) systems. Their tendency to use CA systems only occasionally means that they are unfamiliar with the grammar and syntax these systems require. The author…

  8. TmoleX--a graphical user interface for TURBOMOLE.

    Science.gov (United States)

    Steffen, Claudia; Thomas, Klaus; Huniar, Uwe; Hellweg, Arnim; Rubner, Oliver; Schroer, Alexander

    2010-12-01

    We herein present the graphical user interface (GUI) TmoleX for the quantum chemical program package TURBOMOLE. TmoleX allows users to execute the complete workflow of a quantum chemical investigation from the initial building of a structure to the visualization of the results in a user friendly graphical front end. The purpose of TmoleX is to make TURBOMOLE easy to use and to provide a high degree of flexibility. Hence, it should be a valuable tool for most users from beginners to experts. The program is developed in Java and runs on Linux, Windows, and Mac platforms. It can be used to run calculations on local desktops as well as on remote computers. © 2010 Wiley Periodicals, Inc.

  9. The design and evaluation of an activity monitoring user interface for people with stroke.

    Science.gov (United States)

    Hart, Phil; Bierwirth, Rebekah; Fulk, George; Sazonov, Edward

    2014-01-01

    Usability is an important topic in the field of telerehabilitation research. Older users with disabilities in particular, present age-related and disability-related challenges that should be accommodated for in the design of a user interface for a telerehabilitation system. This paper describes the design, implementation, and assessment of a telerehabilitation system user interface that tries to maximize usability for an elderly user who has experienced a stroke. An Internet-connected Nintendo® Wii™ gaming system is selected as a hardware platform, and a server and website are implemented to process and display the feedback information. The usability of the interface is assessed with a trial consisting of 18 subjects: 10 healthy Doctor of Physical Therapy students and 8 people with a stroke. Results show similar levels of usability and high satisfaction with the gaming system interface from both groups of subjects.

  10. Development of a Virtual Guitar using Haptic Device

    OpenAIRE

    田村,真晴; 山下,英生

    2009-01-01

    In recent years, haptic devices that output force have been developed as one kind of computer output device. Through the sensor of a haptic device, we can get the feeling of really touching a material simulated in a computer. In this research, a virtual guitar was developed in which the feeling of playing the guitar and the sound volume change according to the force input with a haptic device. With the haptic device we feel as if we are playing a genuine guitar. Moreover, it see...

  11. Graphic and haptic simulation system for virtual laparoscopic rectum surgery.

    Science.gov (United States)

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2011-09-01

    Medical simulators with vision and haptic feedback techniques offer a cost-effective and efficient alternative to the traditional medical trainings. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system which will help to influence surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and real operation videos are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of intestines. Finally, a real-time cutting technique based on mesh is employed to represent the incision operation. Despite numerous research efforts, fast and realistic solutions of soft tissues with large deformation, such as intestines, prove extremely challenging. This paper introduces our latest contribution to this endeavour. With this system, the user can haptically operate with the virtual rectum and simultaneously watch the soft tissue deformation. Our system has been tested by colorectal surgeons who believe that the simulated tactile and visual feedbacks are realistic. It could replace the traditional training process and effectively transfer surgical skills to novices. Copyright © 2011 John Wiley & Sons, Ltd.
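
    The abstract mentions a haptic force filter based on radial basis functions without giving its formulation, so the following is only a generic 1D RBF interpolation sketch with invented sample positions, force readings, and kernel width: fit weights so that the radial-basis expansion passes through the raw force samples, then evaluate it to obtain a smooth force response between them.

```python
import math

# Assumed probe positions and raw force samples (placeholders, not the paper's data).
positions = [0.0, 0.5, 1.0]
raw_force = [0.0, 0.8, 0.3]
EPS = 1.0  # assumed Gaussian kernel width

def kernel(r):
    """Gaussian radial basis function."""
    return math.exp(-(r / EPS) ** 2)

def solve(a, b):
    """Tiny Gaussian elimination with partial pivoting for the RBF weights."""
    n = len(b)
    a = [row[:] + [bi] for row, bi in zip(a, b)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (a[r][n] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

# Interpolation system: kernel matrix over sample positions, one weight per sample.
A = [[kernel(abs(xi - xj)) for xj in positions] for xi in positions]
weights = solve(A, raw_force)

def smooth_force(x):
    """Evaluate the RBF expansion: smooth force response at any position x."""
    return sum(w * kernel(abs(x - xi)) for w, xi in zip(weights, positions))

# The reconstruction passes through the samples and is smooth in between.
print(round(smooth_force(0.5), 3))
```

    A production force filter would likely add regularization so the fit smooths rather than exactly interpolates noisy readings, but the weight-solving and evaluation steps are the same.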

  12. U-Net/SLE: A Java-Based User-Customizable Virtual Network Interface

    Directory of Open Access Journals (Sweden)

    Matt Welsh

    1999-01-01

    Full Text Available We describe U‐Net/SLE (Safe Language Extensions), a user‐level network interface architecture which enables per‐application customization of communication semantics through downloading of user extension applets, implemented as Java classfiles, to the network interface. This architecture permits applications to safely specify code to be executed within the NI on message transmission and reception. By leveraging the existing U‐Net model, applications may implement protocol code at the user level, within the NI, or using some combination of the two. Our current implementation, using the Myricom Myrinet interface and a small Java Virtual Machine subset, allows host communication overhead to be reduced and improves the overlap of communication and computation during protocol processing.

  13. Knowledge-based critiquing of graphical user interfaces with CHIMES

    Science.gov (United States)

    Jiang, Jianping; Murphy, Elizabeth D.; Carter, Leslie E.; Truszkowski, Walter F.

    1994-01-01

    CHIMES is a critiquing tool that automates the process of checking graphical user interface (GUI) designs for compliance with human factors design guidelines and toolkit style guides. The current prototype identifies instances of non-compliance and presents problem statements, advice, and tips to the GUI designer. Changes requested by the designer are made automatically, and the revised GUI is re-evaluated. A case study conducted at NASA-Goddard showed that CHIMES has the potential for dramatically reducing the time formerly spent in hands-on consistency checking. Capabilities recently added to CHIMES include exception handling and rule building. CHIMES is intended for use prior to usability testing as a means, for example, of catching and correcting syntactic inconsistencies in a larger user interface.

  14. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  15. An user-interface for retrieval of nuclear data

    International Nuclear Information System (INIS)

    Utsumi, Misako; Fujita, Mitsutane; Noda, Tetsuji

    1996-01-01

    A database storing data on nuclear reactions was built to simulate the transmutation behavior of materials. To retrieve and maintain the database, a user interface for data retrieval was developed that requires no special knowledge of database handling or of the machine structure from the end-user. Using the database, the possibility of He formation and radioactivity in a material can easily be retrieved, although the evaluation is qualitative. (author)

  16. PAMLX: a graphical user interface for PAML.

    Science.gov (United States)

    Xu, Bo; Yang, Ziheng

    2013-12-01

    This note announces pamlX, a graphical user interface/front end for the paml (for Phylogenetic Analysis by Maximum Likelihood) program package (Yang Z. 1997. PAML: a program package for phylogenetic analysis by maximum likelihood. Comput Appl Biosci. 13:555-556; Yang Z. 2007. PAML 4: Phylogenetic analysis by maximum likelihood. Mol Biol Evol. 24:1586-1591). pamlX is written in C++ using the Qt library and communicates with paml programs through files. It can be used to create, edit, and print control files for paml programs and to launch paml runs. The interface is available for free download at http://abacus.gene.ucl.ac.uk/software/paml.html.

  17. Version I of the users manual for the Tuff Data Base Interface

    International Nuclear Information System (INIS)

    Langkopf, B.S.; Satter, B.J.; Welch, E.P.

    1985-04-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) project, managed by the Nevada Operations Office of the US Department of Energy, is investigating the feasibility of locating a repository at Yucca Mountain on and adjacent to the Nevada Test Site (NTS) in southern Nevada. A part of this investigation includes obtaining physical properties from laboratory tests on samples from Yucca Mountain and field tests of the in situ tuffs at Yucca Mountain. A computerized data base has been developed to store this data in a centralized location. The data base is stored on the Cyber 170/855 computer at Sandia using the System 2000 Data Base Management software. A user-friendly interface, the Tuff Data Base Interface, is being developed to allow NNWSI participants to retrieve information from the Tuff Data Base directly. The Interface gives NNWSI users a great deal of flexibility in retrieving portions of the Data Base. This report is an interim users manual for the Tuff Data Base Interface, as of August 1984. It gives basic instructions on accessing the Sandia computing system and explains the Interface on a question-by-question basis

  18. A graphical user interface (gui) matlab program Synthetic_Ves For ...

    African Journals Online (AJOL)

    An interactive and robust computer program for 1D forward modeling of Schlumberger Vertical Electrical Sounding (VES) curves for multilayered earth models is presented. The Graphical User Interface (GUI) enabled software, written in MATLAB v.7.12.0.635 (R2011a), accepts user-defined geologic model parameters (i.e. ...

  19. A Functional Programming Technique for Forms in Graphical User Interfaces

    NARCIS (Netherlands)

    Evers, S.; Kuper, Jan; Achten, P.M.; Grelck, G.; Huch, F.; Michaelson, G.; Trinder, Ph.W.

    2005-01-01

    This paper presents FunctionalForms, a new combinator library for constructing fully functioning forms in a concise and flexible way. A form is a part of a graphical user interface (GUI) restricted to displaying a value and allowing the user to modify it. The library is built on top of the
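    The combinator idea can be sketched in Python (the paper's library is for a functional language; the class and function names here are invented, not FunctionalForms' API): a form holds a value with get/set access, and a combinator such as `pair` builds a larger form from two smaller ones without manual wiring.

```python
# Invented sketch of form combinators: a Form displays a value and lets the
# user modify it; 'pair' composes two forms into one whose value is a tuple,
# propagating edits back down to the component forms.
class Form:
    def __init__(self, value):
        self.value = value
    def get(self):
        return self.value
    def set(self, value):
        self.value = value

def pair(left, right):
    """Combine two forms into one form over a tuple of their values."""
    combined = Form((left.get(), right.get()))
    original_set = combined.set
    def set_both(value):
        l, r = value
        left.set(l)          # edits flow back into the component forms
        right.set(r)
        original_set(value)
    combined.set = set_both
    return combined

name = Form("Ada")
age = Form(36)
person = pair(name, age)
person.set(("Grace", 45))    # one edit updates all three forms consistently
```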

  20. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

    Full Text Available Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluating on a collection of 202 full-text articles in which the authors had ranked the figures by importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers preferred, as their top two choices, the user interfaces in which the most important figures are enlarged. With our automatic figure ranking NLP system, bioscience researchers preferred the UIs in which the most important figures were predicted by our NLP system over the UIs in which the most important figures were randomly assigned. In addition, our results show that there was no statistical difference in bioscience researchers' preference between the UIs generated by automatic figure ranking and the UIs generated by human ranking annotation. The evaluation results conclude that automatic figure ranking and user
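    As a toy illustration of the unsupervised ranking idea (not the authors' system), one weak but entirely unsupervised signal is how often the running text discusses each figure: frequently discussed figures tend to carry more of the paper's contribution.

```python
# Toy unsupervised figure ranking: score each figure by the number of times
# the full text mentions it, and rank most-discussed first.
import re
from collections import Counter

def rank_figures(fulltext):
    mentions = Counter(m.lower() for m in re.findall(r"[Ff]igure\s+\d+", fulltext))
    return [fig for fig, _ in mentions.most_common()]

text = ("Figure 1 shows the pipeline. As Figure 2 confirms the main result, "
        "we discuss Figure 2 throughout; Figure 2 is central. "
        "Figure 3 lists the controls.")
ranking = rank_figures(text)
```

    A real system would combine several such signals (section of first mention, caption content, cue phrases), but the mention-count baseline already induces a total order over figures with no supervision.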

  1. Monitoring and controlling ATLAS data management: The Rucio web user interface

    OpenAIRE

    Lassnig, Mario; Beermann, Thomas Alfons; Vigne, Ralph; Barisits, Martin-Stefan; Garonne, Vincent; Serfon, Cedric

    2015-01-01

    The monitoring and controlling interfaces of the previous data management system DQ2 followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2, and the increased volume of managed information. This interface encompasses both a monitoring and controlling component, and allows easy integration for user-generated views. The interface follows three des...

  2. Examination of Color-Lighting Control System Using Colored Paper User Interface

    Directory of Open Access Journals (Sweden)

    Aida Hiroto

    2016-01-01

    Full Text Available In recent years, full-color LED lighting that can be changed to various colors such as red, green, and blue has appeared alongside the development of LED lighting. Color-lighting control affects users, for example by helping them concentrate or relax, and is therefore expected to spread to various places such as homes, offices, and stations. However, color-lighting control is affected by disturbances such as daylight and displays when full-color LEDs are controlled indoors. In addition, controlling information devices becomes more difficult as information technology develops. I propose a Color-Lighting Control System using a Colored Paper User Interface (CLC/CPUI). The purpose of CLC/CPUI is that anyone can intuitively control full-color LED lighting. CLC/CPUI uses colored paper as the user interface by sensing the paper, and applies feedback control to realize the lighting color the user demands. I conduct an accuracy verification experiment of CLC/CPUI.
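    The feedback-control step the abstract mentions can be sketched as a simple proportional loop (a hedged illustration, not the paper's controller; the gain and colour values are made up): each cycle, the LED output is nudged a fraction of the way toward the colour sensed from the paper, so disturbances such as daylight are compensated over time.

```python
# Proportional feedback toward a sensed target colour, one RGB update per
# control cycle. Gain and colour values are illustrative only.
def step(output, target, gain=0.5):
    """Correct a fixed fraction of the per-channel error each cycle."""
    return tuple(o + gain * (t - o) for o, t in zip(output, target))

led = (0.0, 0.0, 0.0)          # LED starts dark
paper = (1.0, 0.2, 0.2)        # colour sensed from a red paper (invented)
for _ in range(20):            # iterate the control loop
    led = step(led, paper)
```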

  3. Open|SpeedShop Graphical User Interface Technology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to create a new graphical user interface (GUI) for an existing parallel application performance and profiling tool, Open|SpeedShop. The current GUI has...

  4. An Object-Oriented Graphical User Interface for a Reusable Rocket Engine Intelligent Control System

    Science.gov (United States)

    Litt, Jonathan S.; Musgrave, Jeffrey L.; Guo, Ten-Huei; Paxson, Daniel E.; Wong, Edmond; Saus, Joseph R.; Merrill, Walter C.

    1994-01-01

    An intelligent control system for reusable rocket engines under development at NASA Lewis Research Center requires a graphical user interface to allow observation of the closed-loop system in operation. The simulation testbed consists of a real-time engine simulation computer, a controls computer, and several auxiliary computers for diagnostics and coordination. The system is set up so that the simulation computer could be replaced by the real engine and the change would be transparent to the control system. Because of the hard real-time requirement of the control computer, putting a graphical user interface on it was not an option. Thus, a separate computer used strictly for the graphical user interface was warranted. An object-oriented LISP-based graphical user interface has been developed on a Texas Instruments Explorer 2+ to indicate the condition of the engine to the observer through plots, animation, interactive graphics, and text.

  5. Empowering Persons with Deafblindness: Designing an Intelligent Assistive Wearable in the SUITCEYES Project

    NARCIS (Netherlands)

    Korn, Oliver; Holt, Raymond; Kontopoulos, Efstratios; Kappers, Astrid M.L.; Persson, Nils-Krister; Olson, Nasrine

    2018-01-01

    Deafblindness is a condition that limits communication capabilities primarily to the haptic channel. In the EU-funded project SUITCEYES we design a system which allows haptic and thermal communication via soft interfaces and textiles. Based on user needs and informed by disability studies, we

  6. Evaluation of Subjective and Objective Performance Metrics for Haptically Controlled Robotic Systems

    Directory of Open Access Journals (Sweden)

    Cong Dung Pham

    2014-07-01

    Full Text Available This paper studies in detail how different evaluation methods perform when it comes to describing the performance of haptically controlled mobile manipulators. Particularly, we investigate how well subjective metrics perform compared to objective metrics. To find the best metrics to describe the performance of a control scheme is challenging when human operators are involved; how the user perceives the performance of the controller does not necessarily correspond to the directly measurable metrics normally used in controller evaluation. It is therefore important to study whether there is any correspondence between how the user perceives the performance of a controller, and how it performs in terms of directly measurable metrics such as the time used to perform a task, number of errors, accuracy, and so on. To perform these tests we choose a system that consists of a mobile manipulator that is controlled by an operator through a haptic device. This is a good system for studying different performance metrics as the performance can be determined by subjective metrics based on feedback from the users, and also as objective and directly measurable metrics. The system consists of a robotic arm which provides for interaction and manipulation, which is mounted on a mobile base which extends the workspace of the arm. The operator thus needs to perform both interaction and locomotion using a single haptic device. While the position of the on-board camera is determined by the base motion, the principal control objective is the motion of the manipulator arm. This calls for intelligent control allocation between the base and the manipulator arm in order to obtain intuitive control of both the camera and the arm. We implement three different approaches to the control allocation problem, i.e., whether the vehicle or manipulator arm actuation is applied to generate the desired motion. The performance of the different control schemes is evaluated, and our
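    One simple form of the control allocation problem described above can be sketched as follows. This is an illustrative policy, not one of the paper's three implemented schemes: the operator's single commanded velocity is split between arm and base by a weight that shifts authority to the base as the arm nears its workspace limit.

```python
# Invented control-allocation sketch: divide one haptic velocity command
# between the manipulator arm and the mobile base. Small motions go to the
# arm; the base takes over as the arm approaches full extension.
def allocate(command, arm_extension, limit=0.8):
    """Return (arm_velocity, base_velocity) for one planar command."""
    base_share = min(1.0, max(0.0, arm_extension / limit))
    base_v = tuple(base_share * c for c in command)
    arm_v = tuple((1.0 - base_share) * c for c in command)
    return arm_v, base_v

# Mid-extension: authority is shared equally between arm and base.
arm_v, base_v = allocate((0.2, 0.0), arm_extension=0.4)
```

    Because the camera rides on the base, any such allocation implicitly trades off camera motion against arm motion, which is exactly why the paper compares allocation schemes both subjectively and objectively.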

  7. Evaluation of the user interface simplicity in the modern generation of mechanical ventilators.

    Science.gov (United States)

    Uzawa, Yoshihiro; Yamada, Yoshitsugu; Suzukawa, Masayuki

    2008-03-01

    We designed this study to evaluate the simplicity of the user interface in modern-generation mechanical ventilators. We hypothesized that different designs in the user interface could result in different rates of operational failures. A laboratory in a tertiary teaching hospital. Crossover design. Twenty-one medical resident physicians who did not possess operating experience with any of the selected ventilators. Four modern mechanical ventilators were selected: Dräger Evita XL, Maquet Servo-i, Newport e500, and Puritan Bennett 840. Each subject was requested to perform 8 tasks on each ventilator. Two objective variables (the number of successfully completed tasks without operational failures and the operational time) and the overall subjective rating of the ease of use, measured with a 100-mm visual analog scale were recorded. The total percentage of operational failures made for all subjects, for all tasks, was 23%. There were significant differences in the rates of operational failures and operational time among the 4 ventilators. Subjects made more operational failures in setting up the ventilators and in making ventilator-setting changes than in reacting to alarms. The subjective feeling of the ease of use was also significantly different among the ventilators. The design of the user interface is relevant to the occurrence of operational failures. Our data indicate that ventilator designers could optimize the user-interface design to reduce the operational failures; therefore, basic user interface should be standardized among the clinically used mechanical ventilators.

  8. What you can't feel won't hurt you: Evaluating haptic hardware using a haptic contrast sensitivity function.

    Science.gov (United States)

    Salisbury, C M; Gillespie, R B; Tan, H Z; Barbagli, F; Salisbury, J K

    2011-01-01

    In this paper, we extend the concept of the contrast sensitivity function - used to evaluate video projectors - to the evaluation of haptic devices. We propose using human observers to determine if vibrations rendered using a given haptic device are accompanied by artifacts detectable to humans. This determination produces a performance measure that carries particular relevance to applications involving texture rendering. For cases in which a device produces detectable artifacts, we have developed a protocol that localizes deficiencies in device design and/or hardware implementation. In this paper, we present results from human vibration detection experiments carried out using three commercial haptic devices and one high performance voice coil motor. We found that all three commercial devices produced perceptible artifacts when rendering vibrations near human detection thresholds. Our protocol allowed us to pinpoint the deficiencies, however, and we were able to show that minor modifications to the haptic hardware were sufficient to make these devices well suited for rendering vibrations, and by extension, the vibratory components of textures. We generalize our findings to provide quantitative design guidelines that ensure the ability of haptic devices to proficiently render the vibratory components of textures.
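    Detection thresholds of the kind used in such experiments are commonly estimated with an adaptive staircase. The sketch below is a generic 1-up/1-down procedure, not the authors' protocol: amplitude is lowered after each detection and raised after each miss, so the tested level converges near the observer's threshold.

```python
# Generic 1-up/1-down staircase for estimating a vibration detection
# threshold. The observer model and units are invented for illustration.
def staircase(detects, start=1.0, step=0.1, trials=50):
    """detects(amplitude) -> bool; returns the final tested amplitude."""
    amp = start
    for _ in range(trials):
        amp = amp - step if detects(amp) else amp + step
        amp = max(amp, 0.0)            # amplitude cannot go negative
    return amp

# Idealised observer with a true threshold of 0.35 (made-up units):
final = staircase(lambda a: a >= 0.35)
```

    In practice the step size is shrunk over trials and reversal points are averaged; the fixed-step version shown here simply oscillates around the threshold.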

  9. User Interface Program for secure electronic tags

    International Nuclear Information System (INIS)

    Cai, Y.; Koehl, E.R.; Carlson, R.D.; Raptis, A.C.

    1995-05-01

    This report summarizes and documents the efforts of Argonne National Laboratory (ANL) in developing a secure tag communication user interface program comprising a tag monitor and a communication tool. This program can perform the same functions as the software that was developed at the Lawrence Livermore National Laboratory (LLNL), but it is enhanced with a user-friendly screen. It represents the first step in updating the TRANSCOM Tracking System (TRANSCOM) by incorporating a tag communication screen menu into the main menu of the TRANSCOM user program. A working version of TRANSCOM, enhanced with ANL secure-tag graphics, will strongly support the Department of Energy Warhead Dismantlement/Special Nuclear Materials Control initiatives. It will allow commercial satellite tracking of the movements and operational activities of treaty-limited items and transportation vehicles throughout Europe and the former USSR, as well as the continental US

  10. Designing Tangible User Interfaces for NFC Phones

    Directory of Open Access Journals (Sweden)

    Mikko Pyykkönen

    2012-01-01

    Full Text Available The increasing number of NFC phones is attracting application developers to utilize NFC functionality. We can hence soon expect a large number of mobile applications that users command by touching NFC tags in their environment with their NFC phones. The communication technology and the data formats have been standardized by the NFC Forum, but there are no conventions for advertising NFC tags to users, or the functionality that touching a tag triggers. Only individual graphical symbols have been suggested, whereas guidelines for advertising a rich variety of functionality are called for. In this paper, we identify the main challenges and present our proposal, a set of design guidelines based on more than twenty application prototypes we have built. We hope to initiate discussion and research resulting in uniform user interfaces for NFC-based services.

  11. US NDC Modernization Iteration E2 Prototyping Report: User Interface Framework

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Jennifer E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Palmer, Melanie A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vickers, James Wallace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Voegtli, Ellen M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    During the second iteration of the US NDC Modernization Elaboration phase (E2), the SNL US NDC Modernization project team completed follow-on Rich Client Platform (RCP) exploratory prototyping related to the User Interface Framework (UIF). The team also developed a survey of browser-based User Interface solutions and completed exploratory prototyping for selected solutions. This report presents the results of the browser-based UI survey, summarizes the E2 browser-based UI and RCP prototyping work, and outlines a path forward for the third iteration of the Elaboration phase (E3).

  12. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.
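    One representative preprocessing step such an interface wraps is artifact rejection by amplitude. The sketch below is a minimal, generic version (the threshold and data are invented, and the real pipeline delegates to EEGLAB functions): epochs whose peak-to-peak amplitude exceeds a criterion are dropped before averaging.

```python
# Minimal epoch-rejection sketch: keep only epochs (lists of microvolt
# samples) whose peak-to-peak amplitude stays under an artifact criterion.
def clean_epochs(epochs, max_peak_to_peak=200.0):
    return [e for e in epochs if max(e) - min(e) <= max_peak_to_peak]

epochs = [[-10, 5, 12, -3],        # quiet epoch: kept
          [-150, 180, -90, 40],    # movement artifact (330 uV p-p): rejected
          [0, 30, -25, 10]]        # kept
kept = clean_epochs(epochs)
```

    Infant data motivate the two-stage design precisely because amplitude criteria alone miss behavioral non-compliance, which is why the first stage reviews synchronized video trial by trial.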

  13. Remote vehicle teleoperation over a wireless link through a haptic interface

    Directory of Open Access Journals (Sweden)

    Arys Carrasquilla Batista

    2012-11-01

    Full Text Available Teleoperation allows human beings to carry out certain tasks in places that are far away, difficult to access, or hostile to the presence of an operator. In this project, a remote vehicle was teleoperated over a Bluetooth wireless link by means of a haptic interface (Novint Falcon). Through the interface, the operator can send motion commands and also receive sensations derived from the information reported by the sensors attached to the system. The remote vehicle was a Lego Mindstorms unit with Bluetooth communication capability, fitted with a touch sensor and an ultrasonic sensor so that force feedback could be perceived through the haptic interface. A standard computer was given Bluetooth capability through a USB adapter and runs a program written in C++ that controls the haptic interface, sends movement commands to the vehicle, and receives the sensor information, which is rendered as sensations to the operator.

  14. The Impact of User Interface on Young Children’s Computational Thinking

    Directory of Open Access Journals (Sweden)

    Amanda Sullivan

    2017-07-01

    Full Text Available Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have in children's mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is growing pressure to begin teaching computational thinking at a young age. This study explores the affordances of two very different programming interfaces for teaching computational thinking: a graphical coding application on the iPad (ScratchJr) and a tangible programmable robotics kit (KIBO). Methodology: This study used a mixed-method approach to explore the learning experiences that young children have with tangible and graphical coding interfaces. A sample of children ages four to seven (N = 28) participated. Findings: Results suggest that the type of user interface does have an impact on children's learning, but it is only one of many factors that affect positive academic and socio-emotional experiences. Tangible and graphical interfaces each have qualities that foster different types of learning

  15. User interface for MAWST limit of error program

    International Nuclear Information System (INIS)

    Crain, B. Jr.

    1991-01-01

    This paper reports on a user-friendly interface which is being developed to aid in preparation of input data for the Los Alamos National Laboratory software module MAWST (Materials Accounting With Sequential Testing) used at Savannah River Site to propagate limits of error for facility material balances. The forms-based interface is being designed using traditional software project management tools and using the Ingres family of database management and application development products (products of Relational Technology, Inc.). The software will run on VAX computers (products of Digital Equipment Corporation) on which the VMS operating system and Ingres database management software are installed. Use of the interface software will reduce time required to prepare input data for calculations and also reduce errors associated with data preparation

  16. Visual design for the user interface, Part 1: Design fundamentals.

    Science.gov (United States)

    Lynch, P J

    1994-01-01

    Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.

  17. INTERFACING GOOGLE SEARCH ENGINE TO CAPTURE USER WEB SEARCH BEHAVIOR

    OpenAIRE

    Fadhilah Mat Yamin; T. Ramayah

    2013-01-01

    The behaviour of the searcher when using a search engine, especially during query formulation, is crucial. Search engines capture users' activities in the search log, which is stored at the search engine server. Because of the difficulty of obtaining this search log, this paper proposes and develops a framework for interfacing with the Google search engine. The interface captures users' queries before redirecting them to Google. The analysis of the search log will show that users are utili...
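    The interposition idea can be sketched in a few lines (a hedged illustration, not the authors' framework; the log format is invented, while the redirect target follows Google's public search URL): record the raw query locally, then hand the user off to Google.

```python
# Sketch of a query-capturing interface: log the user's query to a local
# store, then build the Google redirect URL for it.
from urllib.parse import urlencode

def capture_and_redirect(query, log):
    """Append the raw query to the local search log; return the redirect URL."""
    log.append(query)
    return "https://www.google.com/search?" + urlencode({"q": query})

log = []
url = capture_and_redirect("haptic user interfaces", log)
```

    Wrapping this function in a small web endpoint gives researchers their own copy of the search log without any cooperation from the search engine.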

  18. Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.

    Science.gov (United States)

    Feder, J M; Rosenberg, M A; Farber, M D

    1989-09-01

    Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium: YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.

  19. Lessons learned from the design and implementation of distributed post-WIMP user interfaces

    OpenAIRE

    Seifried, Thomas; Jetter, Hans-Christian; Haller, Michael; Reiterer, Harald

    2011-01-01

    Creating novel user interfaces that are “natural” and distributed is challenging for designers and developers. “Natural” interaction techniques are barely standardized and in combination with distributed UIs additional technical difficulties arise. In this paper we present the lessons we have learned in developing several natural and distributed user interfaces and propose design patterns to support development of such applications.

  20. A user-friendly, graphical interface for the Monte Carlo neutron optics code MCLIB

    International Nuclear Information System (INIS)

    Thelliez, T.; Daemen, L.; Hjelm, R.P.; Seeger, P.A.

    1995-01-01

    The authors describe a prototype of a new user interface for the Monte Carlo neutron optics simulation program MCLIB. At this point in its development the interface allows the user to define an instrument as a set of predefined instrument elements. The user can specify the intrinsic parameters of each element, its position and orientation. The interface then writes output to the MCLIB package and starts the simulation. The present prototype is an early development stage of a comprehensive Monte Carlo simulations package that will serve as a tool for the design, optimization and assessment of performance of new neutron scattering instruments. It will be an important tool for understanding the efficacy of new source designs in meeting the needs of these instruments
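    The interface's core job — composing an instrument from predefined elements, each with intrinsic parameters, a position, and an orientation, then handing a description to the simulation package — can be sketched as below. The element names and the serialized format are invented for illustration; they are not MCLIB's actual input format.

```python
# Invented sketch: an instrument is an ordered list of elements, each with
# intrinsic parameters plus position and orientation, serialized to a text
# description for the simulation back end.
class Element:
    def __init__(self, kind, position, orientation, **params):
        self.kind = kind
        self.position = position        # metres along the beamline, say
        self.orientation = orientation  # rotation about the beam axis
        self.params = params            # intrinsic parameters of the element

    def serialize(self):
        extras = " ".join(f"{k}={v}" for k, v in sorted(self.params.items()))
        return f"{self.kind} pos={self.position} rot={self.orientation} {extras}".strip()

instrument = [Element("guide", (0, 0, 1.0), 0.0, length=10.0),
              Element("chopper", (0, 0, 12.0), 0.0, frequency=60.0)]
deck = "\n".join(e.serialize() for e in instrument)
```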

  1. LTCP 2D Graphical User Interface. Application Description and User's Guide

    Science.gov (United States)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) 2-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  2. Graphics server and action language interpreter greatly simplify the composition of a graphical user interface

    International Nuclear Information System (INIS)

    Mueller, R.

    1992-01-01

    A new control system based on a distributed computing environment is gradually being installed at BESSY, an 800 MeV storage ring dedicated to the generation of synchrotron light in the VUV and soft X-ray region. The new operator consoles are large high-resolution, bitmap-oriented color graphic screens with mouse and keyboard. A new graphical user interface has been developed with a user interface management system. A graphics server completely encapsulates representational aspects, mediates between user interactions and application variables, and takes care of a consistent state of graphical and applicational objects. Graphical representations, the semantics of user interactions, and interpreter instructions are defined in a database written in a simple and comprehensible user interface definition language. (R.P.) 7 refs.; 5 figs
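    The definition-language idea can be illustrated with a toy interpreter (the real BESSY definition language and graphics server are not public in this record; the syntax below is invented): each line of the definition binds a named graphical object to an application variable, and a refresh step keeps the two consistent.

```python
# Toy UI-definition interpreter: 'widget -> variable' lines become a binding
# map, and refresh() mirrors application variables into widget values, the
# consistency job the graphics server performs.
def parse_ui(definition):
    bindings = {}
    for line in definition.strip().splitlines():
        widget, variable = (part.strip() for part in line.split("->"))
        bindings[widget] = variable
    return bindings

def refresh(bindings, app_state):
    """Produce the widget values implied by the current application state."""
    return {widget: app_state[var] for widget, var in bindings.items()}

ui = parse_ui("""
dial current -> beam_current
label energy -> beam_energy
""")
screen = refresh(ui, {"beam_current": 180.0, "beam_energy": 800.0})
```

    Keeping the bindings in data rather than code is what makes such an interface easy to recompose: changing the console means editing the database, not the server.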

  3. Shortening User Interface Design Iterations through Realtime Visualisation of Design Actions on the Target Device

    OpenAIRE

    MESKENS, Jan; LUYTEN, Kris; CONINX, Karin

    2009-01-01

    In current mobile user interface design tools, it is time consuming to export a design to the target device. This makes it hard for designers to iterate over the user interfaces they are creating. We propose Gummy-live, a GUI builder for mobile devices allowing designers to test and observe immediately on the target device each step they take in the GUI builder. This way, designers are stimulated to iteratively test and refine user interface prototypes in order to take the target device charac...

  4. Haptic Feedback for the GPU-based Surgical Simulator

    DEFF Research Database (Denmark)

    Sørensen, Thomas Sangild; Mosegaard, Jesper

    2006-01-01

    The GPU has proven to be a powerful processor on which to compute spring-mass based surgical simulations. It has not previously been shown, however, how to effectively implement haptic interaction with a simulation running entirely on the GPU. This paper describes a method to calculate haptic feedback...... with limited performance cost. It allows easy balancing of the GPU workload between calculations of simulation, visualisation, and the haptic feedback....
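    The underlying force calculation can be sketched on the CPU (the paper's implementation runs on the GPU; stiffness, rest length, and geometry below are invented): in a spring-mass simulation, the feedback force on a grasped node is the sum of Hooke spring forces from its mesh neighbours.

```python
# CPU sketch of spring-mass haptic feedback: sum Hooke forces on the grasped
# node from its neighbours. Assumes neighbours are not coincident with the
# node (dist > 0). Parameters are illustrative only.
def spring_force(p, q, rest_length, stiffness):
    """Force on p from the spring p-q (3-component vector)."""
    d = [qi - pi for pi, qi in zip(p, q)]
    dist = sum(c * c for c in d) ** 0.5
    scale = stiffness * (dist - rest_length) / dist
    return [scale * c for c in d]

def feedback(grasped, neighbours, rest_length=1.0, stiffness=50.0):
    total = [0.0, 0.0, 0.0]
    for n in neighbours:
        f = spring_force(grasped, n, rest_length, stiffness)
        total = [t + c for t, c in zip(total, f)]
    return total

# Node pulled 0.5 beyond rest length along x from a single neighbour:
force = feedback((0.0, 0.0, 0.0), [(1.5, 0.0, 0.0)])
```

    The haptic loop needs this force at around 1 kHz, far faster than the visual frame rate, which is why reading it back cheaply from a GPU-resident simulation is the hard part the paper addresses.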

  5. Conceptual Design of Haptic-Feedback Navigation Device for Individuals with Alzheimer's Disease.

    Science.gov (United States)

    Che Me, Rosalam; Biamonti, Alessandro; Mohd Saad, Mohd Rashid

    2015-01-01

    Wayfinding ability in older adults with Alzheimer's disease (AD) is progressively impaired owing to ageing and the deterioration of cognitive domains. The sense of direction commonly deteriorates, since visuospatial and spatial cognition are associated with sensory acuity. Navigation systems that support only visual interaction may therefore be inappropriate in the case of AD. This paper presents a concept for a wearable navigation device that integrates haptic-feedback technology to facilitate wayfinding by individuals with AD. The system provides the simplest possible instructions (left/right) using haptic signals, so as to avoid distracting users during navigation. The advantages of the haptic/tactile modality for wayfinding, based on several significant studies, are presented. As a preliminary assessment, a survey was conducted to gauge the potential of this design concept in terms of (1) acceptability, (2) practicality, (3) wearability, and (4) environmental settings. Results indicate that the concept is highly acceptable and commercially implementable. A working prototype will be developed based on the results of the preliminary assessment. Introducing a new method of navigation should be followed by continuous practice for familiarization. Improved navigability supports good performance of activities of daily living (ADLs) and hence helps maintain a good quality of life in older adults with AD.

  6. fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages

    Science.gov (United States)

    Hoffmann, Thomas J.; Laird, Nan M.

    2009-01-01

    The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further facilitates the developer by loading in the help files from the command line functions to provide context sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
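    The package itself is R; the Python sketch below only illustrates the underlying technique of deriving an interface automatically from a function's signature, one input field per argument with its default pre-filled. The function and field names are invented.

```python
# Signature-introspection sketch: derive GUI field specs (name, default)
# from a function's parameters, the way fgui builds a dialog from an R
# function. A real GUI layer would turn each spec into an input widget.
import inspect

def gui_spec(func):
    """One (name, default) field per parameter; None when no default."""
    fields = []
    for name, param in inspect.signature(func).parameters.items():
        default = None if param.default is inspect.Parameter.empty else param.default
        fields.append((name, default))
    return fields

def run_analysis(datafile, alpha=0.05, permutations=1000):
    pass

spec = gui_spec(run_analysis)
```

    The same introspective trick is what lets such tools also pull in per-argument help text with no extra effort from the package developer.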

  7. Ecological user interface for emergency management decision support systems

    DEFF Research Database (Denmark)

    Andersen, V.

    2003-01-01

    The user interface for decision support systems is normally structured for presenting relevant data for the skilled user in order to allow fast assessment and action of the hazardous situation, or for more complex situations to present the relevant rules and procedures to be followed in order to ...... of this paper is to discuss the possibility of using the same principles for emergency management with the aim of improved performance in complex and unanticipated situations....

  8. New geometric data structures for collision detection and haptics

    CERN Document Server

    Weller, René

    2013-01-01

    Starting with novel algorithms for optimally updating bounding volume hierarchies of objects undergoing arbitrary deformations, the author presents a new data structure that allows, for the first time, the computation of the penetration volume. The penetration volume is related to the water displacement of the overlapping region, and thus corresponds to a physically motivated and continuous force. The practicability of the approaches used is shown by realizing new applications in the field of robotics and haptics, including a user study that evaluates the influence of the degrees of freedom in

  9. Haptic perception accuracy depending on self-produced movement.

    Science.gov (United States)

    Park, Chulwook; Kim, Seonjin

    2014-01-01

    This study measured whether self-produced movement influences haptic perception ability (experiment 1) as well as the factors associated with levels of influence (experiment 2) in racket sports. For experiment 1, the haptic perception accuracy of five male table tennis experts and five male novices was examined under two conditions (no movement vs. movement). For experiment 2, the haptic afferent subsystems of five male table tennis experts and five male novices were investigated in only the self-produced, movement-coupled condition. Inferential statistics (ANOVA, t-tests) and custom-made devices (shock and vibration sensor, Qualisys Track Manager) were used to determine haptic perception accuracy (experiments 1 and 2) and its association with expertise. The results show that expert players achieve higher accuracy with less variability (racket vibration and angle) than novices, especially in self-produced, movement-coupled performance. The important finding is that, in terms of accuracy, the skill-associated differences were enlarged during self-produced movement. To explain the origin of this difference between experts and novices, the functional variability of haptic afferent subsystems can serve as a reference. The two factors investigated in this study (self-produced accuracy and the variability of haptic features) would be useful criteria for educators in racket sports and suggest a broader hypothesis for further research into the effects of haptic accuracy related to variability.

  10. The missing graphical user interface for genomics.

    Science.gov (United States)

    Schatz, Michael C

    2010-01-01

    The Galaxy package empowers regular users to perform rich DNA sequence analysis through a much-needed and user-friendly graphical web interface. See research article http://genomebiology.com/2010/11/8/R86 RESEARCH HIGHLIGHT: With the advent of affordable and high-throughput DNA sequencing, sequencing is becoming an essential component in nearly every genetics lab. These data are being generated to probe sequence variations, to understand transcribed, regulated or methylated DNA elements, and to explore a host of other biological features across the tree of life and across a range of environments and conditions. Given this deluge of data, novices and experts alike are facing the daunting challenge of trying to analyze the raw sequence data computationally. With so many tools available and so many assays to analyze, how can one be expected to stay current with the state of the art? How can one be expected to learn to use each tool and construct robust end-to-end analysis pipelines, all while ensuring that input formats, command-line options, sequence databases and program libraries are set correctly? Finally, once the analysis is complete, how does one ensure the results are reproducible and transparent for others to scrutinize and study? In an article published in Genome Biology, Jeremy Goecks, Anton Nekrutenko, James Taylor and the rest of the Galaxy Team (Goecks et al. 1) make a great advance towards resolving these critical questions with the latest update to their Galaxy Project. The ambitious goal of Galaxy is to empower regular users to carry out their own computational analysis without having to be an expert in computational biology or computer science. Galaxy adds a desperately needed graphical user interface to genomics research, making data analysis universally accessible in a web browser, and freeing users from the minutiae of archaic command-line parameters, data formats and scripting languages. Data inputs and computational steps are selected from

  11. Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback

    Science.gov (United States)

    Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios

    2013-08-01

    The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for visually impaired and blind people because of their inability to interpret graphical information. Thus, alternative ways of presenting a map have to be explored in order to make maps accessible. Multiple types of sensory perception, such as touch and hearing, may work as a substitute for vision in the exploration of maps. The use of multimodal virtual environments seems to be a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during haptic exploration of the map.

  12. Development of a user interface style guide for the reactor protection system cabinet operator module

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Lee, Dong-Young; Lee, Jung-Woon

    2004-01-01

    The reactor protection system (RPS) plays the role of generating the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach predefined limits. A Korean project group is developing a new digitalized RPS, and the Cabinet Operator Module (COM) of the RPS is used by an equipment operator for RPS integrity testing and monitoring. A flat panel display (FPD) with touch screen capability is provided as the main user interface for RPS operation. To support the RPS COM user interface design, namely the FPD screen design, we developed a user interface style guide, because the system designer could not properly deal with the many general human factors design guidelines. To develop the user interface style guide, we gathered various design guidelines, performed a walk-through with a video recorder, selected guidelines with respect to user interface design elements, determined the properties of the design elements, discussed them with system designers, and converted the properties into the screen design. This paper describes the process details and the findings in the course of the style guide development. (Author)

  13. Haptic interface for vehicular touch screens.

    Science.gov (United States)

    2013-02-01

    Once the domain of purely physical controls such as knobs, levers, buttons, and sliders, the vehicle dash is rapidly transforming into a computer interface. This presents a challenge for drivers, because the physics-based cues which make trad...

  14. The Graphical User Interface Crisis: Danger and Opportunity.

    Science.gov (United States)

    Boyd, Lawrence H.; And Others

    This paper examines graphic computing environments, identifies potential problems in providing access to blind people, and describes programs and strategies being developed to provide this access. The paper begins with an explanation of how graphic user interfaces differ from character-based systems in their use of pixels, visual metaphors such as…

  15. Visual-haptic integration with pliers and tongs: signal ‘weights’ take account of changes in haptic sensitivity caused by different tools

    Directory of Open Access Journals (Sweden)

    Chie eTakahashi

    2014-02-01

    Full Text Available When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the ‘weight’ given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different ‘gains’ between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber’s law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modelled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimising the
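
    The "statistically optimal fashion" referred to here is the standard minimum-variance cue-combination rule, in which each cue's weight is its normalised inverse variance. A small sketch with made-up illustrative numbers (not data from the study):

```python
def combine_cues(est_v, var_v, est_h, var_h):
    """Reliability-weighted (minimum-variance) combination of a visual
    and a haptic size estimate; weights are normalised inverse variances."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    w_h = 1 - w_v
    combined = w_v * est_v + w_h * est_h
    # The combined variance is never worse than the better single cue.
    combined_var = 1 / (1 / var_v + 1 / var_h)
    return combined, combined_var

# Illustrative numbers: a high tool gain makes the haptic estimate noisier,
# so the visual cue should receive more weight.
size, var = combine_cues(est_v=50.0, var_v=4.0, est_h=54.0, var_h=16.0)
# w_v = 0.8, so size is ~50.8 and var = 3.2 (better than either cue alone)
```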

  16. Glenn Reconfigurable User-interface and Virtual reality Exploration (GRUVE) Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The GRUVE (Glenn Reconfigurable User-interface and Virtual reality Exploration) Lab is a reconfigurable, large screen display facility at NASA Glenn Research Center....

  17. Development of haptic system for surgical robot

    Science.gov (United States)

    Gang, Han Gyeol; Park, Jiong Min; Choi, Seung-Bok; Sohn, Jung Woo

    2017-04-01

    In this paper, a new type of haptic system for surgical robot application is proposed and its performance is evaluated experimentally. The proposed haptic system consists of an effective master device and a precision slave robot. The master device has 3-DOF rotational motion, matching human wrist motion. It has a lightweight structure with a gyro sensor for position measurement and three small MR brakes for repulsive torque generation. The slave robot achieves 3-DOF rotational motion using servomotors and a five-bar linkage, and a torque sensor is used to measure resistive torque. It has been experimentally demonstrated that the proposed haptic system performs well in tracking control of desired position and repulsive torque. It can be concluded that the proposed haptic system can be effectively applied to surgical robot systems in the field.

  18. Haptic and Audio-visual Stimuli: Enhancing Experiences and Interaction

    NARCIS (Netherlands)

    Nijholt, Antinus; Dijk, Esko O.; Lemmens, Paul M.C.; Luitjens, S.B.

    2010-01-01

    The intention of the symposium on Haptic and Audio-visual stimuli at the EuroHaptics 2010 conference is to deepen the understanding of the effect of combined Haptic and Audio-visual stimuli. The knowledge gained will be used to enhance experiences and interactions in daily life. To this end, a

  19. Designing personal attentive user interfaces in the mobile public safety domain

    NARCIS (Netherlands)

    Streefkerk, J.W.; Esch van-Bussemakers, M.P.; Neerincx, M.A.

    2006-01-01

    In the mobile computing environment, there is a need to adapt the information and service provision to the momentary attentive state of the user, operational requirements and usage context. This paper proposes to design personal attentive user interfaces (PAUI) for which the content and style of

  20. Version II of the users manual for the Tuff Data Base Interface

    International Nuclear Information System (INIS)

    Welch, E.P.; Satter, B.J.; Langkopf, B.S.; Zeuch, D.H.

    1987-05-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) Project, managed by the Nevada Operations Office of the US Department of Energy, is investigating the feasibility of locating a repository at Yucca Mountain on and adjacent to the Nevada Test Site (NTS) in southern Nevada. A part of this investigation includes obtaining physical properties from laboratory tests on samples from Yucca Mountain and from field tests at Yucca Mountain. A computerized data base has been developed to store this data in a centralized location. The data base is stored on the Cyber 170/855 computer at Sandia using the System 2000 Data Base Management software. A user-friendly interface, the Tuff Data Base Interface (the Interface), allows NNWSI participants to retrieve data from the Tuff Data Base. The Interface gives users flexibility to retrieve portions of the Data Base related to their interests. This report gives basic instructions on accessing the Sandia computing system and explains how to use the Interface. 18 figs., 5 tabs

  1. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...

  2. Memory for curvature of objects: haptic touch vs. vision.

    Science.gov (United States)

    Ittyerah, Miriam; Marks, Lawrence E

    2007-11-01

    The present study examined the role of vision and haptics in memory for stimulus objects that vary along the dimension of curvature. Experiment 1 measured haptic-haptic (T-T) and haptic-visual (T-V) discrimination of curvature in a short-term memory paradigm, using 30-second retention intervals containing five different interpolated tasks. Results showed poorest performance when the interpolated tasks required spatial processing or movement, thereby suggesting that haptic information about shape is encoded in a spatial-motor representation. Experiment 2 compared visual-visual (V-V) and visual-haptic (V-T) short-term memory, again using 30-second delay intervals. The results of the ANOVA failed to show a significant effect of intervening activity. Intra-modal visual performance and cross-modal performance were similar. Comparing the four modality conditions (inter-modal V-T, T-V; intra-modal V-V, T-T, by combining the data of Experiments 1 and 2), in a global analysis, showed a reliable interaction between intervening activity and experiment (modality). Although there appears to be a general tendency for spatial and movement activities to exert the most deleterious effects overall, the patterns are not identical when the initial stimulus is encoded haptically (Experiment 1) and visually (Experiment 2).

  3. Mechanisms for collaboration: a design and evaluation framework for multi-user interfaces

    OpenAIRE

    Yuill, Nicola; Rogers, Yvonne

    2012-01-01

    Multi-user interfaces are said to provide “natural” interaction in supporting collaboration, compared to individual and noncolocated technologies. We identify three mechanisms accounting for the success of such interfaces: high awareness of others' actions and intentions, high control over the interface, and high availability of background information. We challenge the idea that interaction over such interfaces is necessarily “natural” and argue that everyday interaction involves constraints ...

  4. Integration of Haptics in Agricultural Robotics

    Science.gov (United States)

    Kannan Megalingam, Rajesh; Sreekanth, M. M.; Sivanantham, Vinu; Sai Kumar, K.; Ghanta, Sriharsha; Surya Teja, P.; Reddy, Rajesh G.

    2017-08-01

    Robots can be classified as open-loop or closed-loop systems, and many problems arise when there is no feedback from the robot. In this research paper, we discuss the possibilities for achieving a complete closed-loop system for a multiple-DOF robotic arm, used in a coconut tree climbing and cutting robot, by introducing a haptic device. We are working with various sensors, such as tactile, vibration, force and proximity sensors, to obtain feedback. Monitoring of the robotic arm is achieved by graphical user interface software, which simulates the working of the robotic arm, relays the real-time analog values produced by the various sensors, and provides real-time graphs for estimating the efficiency of the robot.

  5. Towards Linking User Interface Translation Needs to Lexicographic ...

    African Journals Online (AJOL)

    In a time of proliferating electronic devices such as smartphones, translators of user interfaces are faced with new challenges, such as the use of existing words in new contexts or in their obtaining new meanings. In this article, three lexicographic reference works available to translators in this field are compared: the ...

  6. Theorema 2.0: A Graphical User Interface for a Mathematical Assistant System

    Directory of Open Access Journals (Sweden)

    Wolfgang Windsteiger

    2013-07-01

    Full Text Available Theorema 2.0 stands for a re-design, including a complete re-implementation, of the Theorema system, which was originally designed, developed, and implemented by Bruno Buchberger and his Theorema group at RISC. In this paper, we present the first prototype of a graphical user interface (GUI) for the new system. It heavily relies on powerful interactive capabilities introduced in recent releases of the underlying Mathematica system, most importantly the possibility of having dynamic objects connected to interface elements like sliders, menus, check-boxes, radio-buttons and the like. All these features are fully integrated into the Mathematica programming environment and allow the implementation of a modern user interface.

  7. DEBUGGER: Developing a graphical user interface to debug FPGAs

    CERN Document Server

    AUTHOR|(SzGeCERN)773309

    2015-01-01

    As part of the summer student projects, an FPGA debugger was designed using the Qt framework. The aim of this project is to help Data Acquisition System (DAQ) experts of the COMPASS experiment easily and continually monitor the state of each FPGA in use. A Graphical User Interface (GUI) has been designed to aid experts in doing so. Via IP-Bus, the content of the FPGA under investigation is displayed to the user.

  8. Graphical user interfaces for McClellan Nuclear Radiation Center (MNRC)

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1998-01-01

    McClellan's Nuclear Radiation Center (MNRC) control console is in the process of being replaced due to spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and incorporating human factors during all stages of the graphical user interface (GUI) development and control console design

  9. Haptic Discrimination of Distance

    Science.gov (United States)

    van Beek, Femke E.; Bergmann Tiest, Wouter M.; Kappers, Astrid M. L.

    2014-01-01

    While quite some research has focussed on the accuracy of haptic perception of distance, information on the precision of haptic perception of distance is still scarce, particularly regarding distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions, which are: what is the influence of reference distance, movement axis, perceptual mode (active or passive) and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus. Participants compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. The precision was influenced by reference distance. No effect of movement axis was found. The precision was higher for active than for passive movements and it was a bit lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to be able to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices. PMID:25116638
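
    A Weber fraction of about 11% translates directly into discrimination thresholds for the two reference distances used. A quick check, based only on the numbers reported in the abstract:

```python
WEBER_FRACTION = 0.11  # reported for active perception, all cardinal axes

def discrimination_threshold(reference_cm, weber=WEBER_FRACTION):
    """Smallest reliably noticeable difference at a given reference
    distance, assuming Weber's law (threshold proportional to reference)."""
    return weber * reference_cm

# For the study's reference distances of 25 and 35 cm:
assert abs(discrimination_threshold(25.0) - 2.75) < 1e-9  # ~2.8 cm
assert abs(discrimination_threshold(35.0) - 3.85) < 1e-9  # ~3.9 cm
```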

  10. Haptic discrimination of distance.

    Directory of Open Access Journals (Sweden)

    Femke E van Beek

    Full Text Available While quite some research has focussed on the accuracy of haptic perception of distance, information on the precision of haptic perception of distance is still scarce, particularly regarding distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions, which are: what is the influence of reference distance, movement axis, perceptual mode (active or passive) and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus. Participants compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. The precision was influenced by reference distance. No effect of movement axis was found. The precision was higher for active than for passive movements and it was a bit lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to be able to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices.

  11. Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.

    Science.gov (United States)

    Pinzon, David; Byrns, Simon; Zheng, Bin

    2016-08-01

    Background: The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy: An in-depth review of the literature was performed using PubMed. Results: A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified: advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions: Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance, safety, and training efficiency. © The Author(s) 2016.

  12. An Efficient User Interface Design for Nursing Information System Based on Integrated Patient Order Information.

    Science.gov (United States)

    Chu, Chia-Hui; Kuo, Ming-Chuan; Weng, Shu-Hui; Lee, Ting-Ting

    2016-01-01

    A user-friendly interface can enhance the efficiency of data entry, which is crucial for building a complete database. In this study, two user interfaces (a traditional pull-down menu vs. check boxes) were proposed and evaluated on medical records with fever medication orders, by measuring the time for data entry, the steps for each data entry record, and the completion rate of each medical record. The results revealed that the time for data entry was reduced from 22.8 sec/record to 3.2 sec/record. The data entry procedure was also reduced from 9 steps in the traditional interface to 3 steps in the new one. In addition, the completeness of medical records increased from 20.2% to 98%. All these results indicate that the new user interface provides a more user-friendly and efficient approach for data entry than the traditional interface.
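
    The reported gains reduce to simple arithmetic (figures taken from the abstract):

```python
# Traditional pull-down interface vs. new check-box interface
old_time, new_time = 22.8, 3.2            # seconds per record
old_steps, new_steps = 9, 3               # data-entry steps per record
old_complete, new_complete = 0.202, 0.98  # fraction of complete records

speedup = old_time / new_time                               # ~7.1x faster
seconds_saved_per_100 = round((old_time - new_time) * 100)  # 1960 s, ~33 min
steps_removed = old_steps - new_steps                       # 6 fewer steps
completeness_gain = new_complete - old_complete             # ~0.78 gain
```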

  13. A visual user interface program, EGSWIN, for EGS4

    International Nuclear Information System (INIS)

    Qiu Rui; Li Junli; Wu Zhen

    2005-01-01

    To overcome the inconvenience and difficulty novice users face in using the EGS4 code, a visual user interface program, called the EGSWIN system, has been developed by the Monte Carlo Research Center of Tsinghua University in China. EGSWIN allows users to run EGS4 for many applications without any user coding. A mixed-language programming technique with Visual C++ and Visual Fortran is used in order to embed both EGS4 and PEGS4 into EGSWIN. The system features visual geometry input, geometry processing, visual definition of source, scoring and computing parameters, and particle trajectory display. Comparisons between results calculated with EGS4 and EGSWIN, as well as with FLUKA and GEANT, have been made to validate EGSWIN. (author)

  14. Bed occupancy monitoring: data processing and clinician user interface design.

    Science.gov (United States)

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5-10 weeks. This data was analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the bed exit weekly average (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate and not cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
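
    The occupancy parameters described (bed exits per night, longest uninterrupted occupancy) can be derived from a binary time series produced by the pressure mat. A minimal sketch; the per-minute data format and the `bed_exit_stats` helper are assumptions for illustration, not the paper's implementation:

```python
def bed_exit_stats(occupied, samples_per_hour=60):
    """Count bed exits and the longest uninterrupted occupancy stretch
    in a per-minute binary series (1 = in bed, 0 = out of bed)."""
    exits = 0
    longest = current = 0
    prev = 0
    for sample in occupied:
        if prev == 1 and sample == 0:
            exits += 1  # transition in-bed -> out-of-bed
        if sample == 1:
            current += 1
            longest = max(longest, current)
        else:
            current = 0
        prev = sample
    return {"bed_exits": exits,
            "longest_occupancy_h": longest / samples_per_hour}

# Example night: 3 h in bed, up once for 10 min, back for 2 h
night = [1] * 180 + [0] * 10 + [1] * 120
stats = bed_exit_stats(night)
# stats == {"bed_exits": 1, "longest_occupancy_h": 3.0}
```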

  15. CATO--A Guided User Interface for Different CAS

    Science.gov (United States)

    Janetzko, Hans-Dieter

    2017-01-01

    CATO is a new user interface, written in Java and developed by the author as a response to the significant difficulties faced by students who only sporadically use computer algebra systems (CAS). The usage of CAS in mathematical lectures should be an integral part of mathematical instruction. However, difficulties arise for those students who have…

  16. Software Graphical User Interface For Analysis Of Images

    Science.gov (United States)

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.

  17. QE::GUI – A Graphical User Interface for Quality Estimation

    Directory of Open Access Journals (Sweden)

    Avramidis Eleftherios

    2017-10-01

    Full Text Available Despite its wide applicability, Quality Estimation (QE) of Machine Translation (MT) poses a difficult entry barrier since there are no open source tools with a graphical user interface (GUI). Here we present a tool in this direction by connecting the back-end of the QE decision-making mechanism with a web-based GUI. The interface allows the user to post requests to the QE engine and get a visual response with the results. Additionally, we provide pre-trained QE models for easier launching of the app. The tool is written in Python so that it can leverage the rich natural language processing capabilities of the popular dynamic programming language, which is at the same time supported by top web-server environments.

  18. An Accessible User Interface for Geoscience and Programming

    Science.gov (United States)

    Sevre, E. O.; Lee, S.

    2012-12-01

    The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills in the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only single mouse clicks to input code. My goal is to minimize the amount of typing necessary to create simple programs and scripts, increasing accessibility for people with disabilities that limit fine motor skills. This interface can be adapted for various programming and scripting languages. Using this interface will simplify development of code for C/C++, Java, and GMT, and the interface can be expanded to support any other text-based programming language. The interface is designed around the concept of maximizing the amount of code that can be written with a minimum of clicks and typing. The screen is split into two sections: a list of click-commands on the left hand side and a text area on the right hand side. When the user clicks on a command on the left hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently, the C/C++ interface has commands for commonly used code segments such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype has been developed for the iPad. Due to the limited number of devices that an iOS application can run on, the code has been re-written in Java to run on a wider range of devices.
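The core click-command mechanism the abstract describes can be sketched as a template table plus a splice at the caret position. The template names and contents below are illustrative, not taken from the actual prototype:

```python
# Each click-command maps to a C/C++ code template that is spliced
# into the editing buffer at the caret (insertion point).
TEMPLATES = {
    "for loop": "for (int i = 0; i < n; i++) {\n    \n}\n",
    "comment":  "/*  */\n",
    "print":    'printf("\\n");\n',
}

def insert_template(buffer: str, caret: int, command: str) -> str:
    """Return the buffer with the chosen template inserted at the caret."""
    snippet = TEMPLATES[command]
    return buffer[:caret] + snippet + buffer[caret:]

code = "int main() {\n\n}\n"
code = insert_template(code, code.index("\n\n") + 1, "for loop")
print(code)
```

A GUI front end (iOS, Java/Swing, or a web text area) only needs to call `insert_template` with the current caret offset whenever a command button is clicked, which is what keeps the typing requirement near zero.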

  19. Visibility Aspects Importance of User Interface Reception in Cloud Computing Applications with Increased Automation

    OpenAIRE

    Haxhixhemajli, Denis

    2012-01-01

    Visibility aspects of User Interfaces are important; they deal with the crucial phase of human-computer interaction. They allow users to perform tasks and at the same time hide the complexity of the system. Acceptance of new systems depends on how the visibility aspects of the User Interfaces are presented. Human eyes make the first contact with the appearance of any system, and that contact marks the very beginning of the human–application interaction. This study emphasizes that visibility aspects...

  20. A Framework for Effective User Interface Design for Web-Based Electronic Commerce Applications

    Directory of Open Access Journals (Sweden)

    Justyna Burns

    2001-01-01

    Full Text Available Efficient delivery of relevant product information is increasingly becoming the central basis of competition between firms. The interface design represents the central component for successful information delivery to consumers. However, interface design for web-based information systems is probably more an art than a science at this point in time. Much research is needed to understand properties of an effective interface for electronic commerce. This paper develops a framework identifying the relationship between user factors, the role of the user interface and overall system success for web-based electronic commerce. The paper argues that web-based systems for electronic commerce have some similar properties to decision support systems (DSS) and adapts an established DSS framework to the electronic commerce domain. Based on a limited amount of research studying web browser interface design, the framework identifies areas of research needed and outlines possible relationships between consumer characteristics, interface design attributes and measures of overall system success.

  1. Haptic sense and the politicization of contemporary image

    Directory of Open Access Journals (Sweden)

    Tarcisio Torres Silva

    2017-08-01

    Full Text Available This paper proposes a theoretical approach to the political effects of the sense of touch (the haptic), in order to understand to what extent the intensification of contemporary haptic experience contributes to creating proximity and engagement among individuals overloaded by the excess of visual information offered by multiple media. It closes with the work of the Brazilian artist Rodrigo Braga, which exemplifies the contemporary political use of the haptic sense.

  2. User Interface for the SMAC Traffic Accident Reconstruction Program

    Directory of Open Access Journals (Sweden)

    Rok Krulec

    2003-11-01

    Full Text Available This paper describes the development of the user interface for the traffic accident reconstruction program SMAC. Three basic modules of the software are presented. Initial parameter input and visualization, which use a graphics library to simulate 3D space and together form the graphical user interface, are explained in more detail. The modules have been developed using different technologies and programming approaches to increase flexibility in further development and to take maximum advantage of currently accessible computer hardware; module-to-module communication is also covered.

  3. VOILA 2015 Visualizations and User Interfaces for Ontologies and Linked Data : Proceedings of the International Workshop on Visualizations and User Interfaces for Ontologies and Linked Data

    OpenAIRE

    2015-01-01

    A picture is worth a thousand words, we often say, yet many areas demand sophisticated visualization techniques, and the Semantic Web is no exception. The size and complexity of ontologies and Linked Data in the Semantic Web constantly grow, and the diverse backgrounds of the users and application areas multiply at the same time. Providing users with visual representations and intuitive user interfaces can significantly aid the understanding of the domains and knowledge represent...

  4. Conflicting audio-haptic feedback in physically based simulation of walking sounds

    DEFF Research Database (Denmark)

    Turchet, Luca; Serafin, Stefania; Dimitrov, Smilen

    2010-01-01

    We describe an audio-haptic experiment conducted using a system which simulates in real-time the auditory and haptic sensation of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors and actuators. The experiment was run to examine the ability of subjects to recognize the different surfaces under both coherent and incoherent audio-haptic stimuli. Results show that in this kind of task the auditory modality is dominant over the haptic one.

  5. Graphical User Interface in Art

    Science.gov (United States)

    Gwilt, Ian

    This essay discusses the use of the Graphical User Interface (GUI) as a site of creative practice. By creatively repositioning the GUI as a work of art it is possible to challenge our understanding and expectations of the conventional computer interface, wherein the icons and navigational architecture of the GUI no longer function as a technological tool. These artistic recontextualizations are often used to question our engagement with technology and to highlight the pivotal place that the domestic computer has taken in our everyday social, cultural and, increasingly, creative domains. Through these works the media specificity of the screen-based GUI can be broken by dramatic changes in scale, form and configuration. This can be seen in the work of new media artists who have re-imagined the GUI in a number of creative forms, both within the digital, as image, animation, net and interactive art, and in the analogue, as print, painting, sculpture, installation and performative event. Furthermore, as a creative work the GUI can also be utilized as a visual way-finder to explore the relationship between the dynamic potentials of the digital and the concretized qualities of the material artifact.

  6. Flair: A powerful but user friendly graphical interface for FLUKA

    International Nuclear Information System (INIS)

    Vlachoudis, V.

    2009-01-01

    Flair is an advanced graphical user interface for FLUKA that enables the user to start and control FLUKA jobs completely from a GUI environment, without the need for command-line interactions. It is written entirely in Python with Tkinter, allowing easier portability across various operating systems and great programming flexibility, with a focus on use as an Application Programming Interface (API) for FLUKA. Flair is an integrated development environment (IDE) for FLUKA: it not only provides means for post-processing the output, but a big emphasis has also been placed on the creation and checking of error-free input files. It contains a fully featured editor for editing the input files in a human-readable way with syntax highlighting, without hiding the inner functionality of FLUKA from the users. It also provides means for building the executable, debugging the geometry, running the code, monitoring the status of one or many runs, inspecting the output files, post-processing the binary files (data merging) and interfacing to plotting utilities like gnuplot and PovRay for high-quality plots or photo-realistic images. The program also includes a database of selected properties of all known nuclides and their known isotopic composition, as well as a reference database of ~300 predefined materials together with their Sternheimer parameters. (authors)

  7. Design of Electronic Medical Record User Interfaces: A Matrix-Based Method for Improving Usability

    Directory of Open Access Journals (Sweden)

    Kushtrim Kuqi

    2013-01-01

    Full Text Available This study examines a new approach of using the Design Structure Matrix (DSM) modeling technique to improve the design of Electronic Medical Record (EMR) user interfaces. The usability of an EMR medication dosage calculator used for placing orders in an academic hospital setting was investigated. The proposed method captures and analyzes the interactions between user interface elements of the EMR system and groups elements based on information exchange, spatial adjacency, and similarity to improve screen density and time-on-task. Medication dose adjustment task time was recorded for the existing and new designs using a cognitive simulation model that predicts user performance. We estimate that the design improvement could reduce time-on-task by saving an average of 21 hours of hospital physicians’ time over the course of a month. The study suggests that the application of DSM can improve the usability of an EMR user interface.
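The grouping step described above can be illustrated with a small sketch: a binary DSM over some invented calculator widgets, clustered by connectivity. The element names and matrix entries are hypothetical, not taken from the study:

```python
# Rows/columns are UI elements; a 1 marks an interaction (information
# exchange, spatial adjacency, or similarity). Connected elements are
# clustered into candidate on-screen groups.
elements = ["dose field", "weight field", "unit picker", "submit", "history"]
dsm = [
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
]

def cluster(matrix):
    """Group element indices connected through nonzero DSM entries."""
    n, seen, groups = len(matrix), set(), []
    for start in range(n):
        if start in seen:
            continue
        group, queue = [], [start]
        while queue:                      # breadth-first flood fill
            i = queue.pop()
            if i in seen:
                continue
            seen.add(i)
            group.append(i)
            queue += [j for j in range(n) if matrix[i][j]]
        groups.append(sorted(group))
    return groups

for g in cluster(dsm):
    print([elements[i] for i in g])
```

Real DSM clustering algorithms also weight interactions and penalize large groups; plain connectivity is the simplest special case and enough to show how interacting widgets end up co-located on screen.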

  8. Draft User Functionalities and Interfaces of PN Services (Low-Fi Prototyping)

    DEFF Research Database (Denmark)

    Karamolegkos, P.; Larsen, J. E.; Larsen, Lars Bo

    2006-01-01

    Internal report of WP1 Task 4 activities from January 2006 to August 2006. This report describes the draft user functionalities and coming user interfaces for PN services. It is a working document to be handed over to WP1 Task1 and Task3 for guidelines on specification. State of the art usability and user experience, conceptual design work on the two pilot services, MAGNET.CARE and Nomadic@Work, is described.

  9. Evaluating The Role Of Empathy In Crowdsourcing User Interfaces

    NARCIS (Netherlands)

    Khan, J.V.; Dey, D.; Buchina, N.

    2016-01-01

    Empathy induced altruism is believed to motivate people in a crowdsourcing environment to produce better quality work. However, there hasn’t been any considerable investigation regarding how empathy can be effectively conveyed through user interfaces (UI). We conducted a study to find the effects of

  10. Determinants of user acceptance of a specific social platform for older adults: An empirical examination of user interface characteristics and behavioral intention.

    Directory of Open Access Journals (Sweden)

    Tsai-Hsuan Tsai

    Full Text Available The use of the Internet and social applications has many benefits for the elderly, but numerous investigations have shown that the elderly do not perceive online social networks as a friendly social environment. Therefore, TreeIt, a social application specifically designed for the elderly, was developed for this study. In the TreeIt application, seven mechanisms promoting social interaction were designed to allow older adults to use social networking sites (SNSs) to increase social connection, maintain the intensity of social connections and strengthen social experience. This study's main objective was to investigate how user interface design affects older people's intention and attitude related to using SNSs. Fourteen user interface evaluation heuristics proposed by Zhang et al. were adopted as the criteria to assess user interface usability and further grouped into three categories: system support, user interface design and navigation. The technology acceptance model was adopted to assess older people's intention and attitude related to using SNSs. One hundred and one elderly persons were enrolled in this study as subjects, and the results showed that all of the hypotheses proposed in this study were valid: system support and perceived usefulness had a significant effect on behavioral intention; user interface design and perceived ease of use were positively correlated with perceived usefulness; and navigation exerted an influence on perceived ease of use. The results of this study are valuable for the future development of social applications for the elderly.

  11. Determinants of user acceptance of a specific social platform for older adults: An empirical examination of user interface characteristics and behavioral intention.

    Science.gov (United States)

    Tsai, Tsai-Hsuan; Chang, Hsien-Tsung; Chen, Yan-Jiun; Chang, Yung-Sheng

    2017-01-01

    The use of the Internet and social applications has many benefits for the elderly, but numerous investigations have shown that the elderly do not perceive online social networks as a friendly social environment. Therefore, TreeIt, a social application specifically designed for the elderly, was developed for this study. In the TreeIt application, seven mechanisms promoting social interaction were designed to allow older adults to use social networking sites (SNSs) to increase social connection, maintain the intensity of social connections and strengthen social experience. This study's main objective was to investigate how user interface design affects older people's intention and attitude related to using SNSs. Fourteen user interface evaluation heuristics proposed by Zhang et al. were adopted as the criteria to assess user interface usability and further grouped into three categories: system support, user interface design and navigation. The technology acceptance model was adopted to assess older people's intention and attitude related to using SNSs. One hundred and one elderly persons were enrolled in this study as subjects, and the results showed that all of the hypotheses proposed in this study were valid: system support and perceived usefulness had a significant effect on behavioral intention; user interface design and perceived ease of use were positively correlated with perceived usefulness; and navigation exerted an influence on perceived ease of use. The results of this study are valuable for the future development of social applications for the elderly.

  12. Determinants of user acceptance of a specific social platform for older adults: An empirical examination of user interface characteristics and behavioral intention

    Science.gov (United States)

    Chang, Hsien-Tsung; Chen, Yan-Jiun; Chang, Yung-Sheng

    2017-01-01

    The use of the Internet and social applications has many benefits for the elderly, but numerous investigations have shown that the elderly do not perceive online social networks as a friendly social environment. Therefore, TreeIt, a social application specifically designed for the elderly, was developed for this study. In the TreeIt application, seven mechanisms promoting social interaction were designed to allow older adults to use social networking sites (SNSs) to increase social connection, maintain the intensity of social connections and strengthen social experience. This study’s main objective was to investigate how user interface design affects older people’s intention and attitude related to using SNSs. Fourteen user interface evaluation heuristics proposed by Zhang et al. were adopted as the criteria to assess user interface usability and further grouped into three categories: system support, user interface design and navigation. The technology acceptance model was adopted to assess older people’s intention and attitude related to using SNSs. One hundred and one elderly persons were enrolled in this study as subjects, and the results showed that all of the hypotheses proposed in this study were valid: system support and perceived usefulness had a significant effect on behavioral intention; user interface design and perceived ease of use were positively correlated with perceived usefulness; and navigation exerted an influence on perceived ease of use. The results of this study are valuable for the future development of social applications for the elderly. PMID:28837566

  13. Training leads to increased auditory brain-computer interface performance of end-users with motor impairments.

    Science.gov (United States)

    Halder, S; Käthner, I; Kübler, A

    2016-02-01

    Auditory brain-computer interfaces are an assistive technology that can restore communication for motor impaired end-users. Such non-visual brain-computer interface paradigms are of particular importance for end-users that may lose or have lost gaze control. We attempted to show that motor impaired end-users can learn to control an auditory speller on the basis of event-related potentials. Five end-users with motor impairments, two of whom with additional visual impairments, participated in five sessions. We applied a newly developed auditory brain-computer interface paradigm with natural sounds and directional cues. Three of five end-users learned to select symbols using this method. Averaged over all five end-users, the information transfer rate increased by more than 1800% from the first session (0.17 bits/min) to the last session (3.08 bits/min). The two best end-users achieved information transfer rates of 5.78 bits/min and accuracies of 92%. Our results show that an auditory BCI with a combination of natural sounds and directional cues can be controlled by end-users with motor impairments. Training improves the performance of end-users to the level of healthy controls. To our knowledge, this is the first time end-users with motor impairments controlled an auditory brain-computer interface speller with such high accuracy and information transfer rates. Further, our results demonstrate that operating a BCI with event-related potentials benefits from training; specifically, end-users may require more than one session to develop their full potential. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
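Bits-per-minute figures like those quoted above are commonly computed with the Wolpaw information transfer rate formula. The sketch below uses an illustrative class count and selection time, not the study's actual paradigm parameters:

```python
import math

def wolpaw_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Wolpaw ITR in bits per selection for an n-class speller with the
    given selection accuracy (chance-level or worse yields 0 bits)."""
    p = accuracy
    if p >= 1.0:
        return math.log2(n_classes)
    if p <= 1.0 / n_classes:
        return 0.0
    return (math.log2(n_classes)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_classes - 1)))

def itr_bits_per_min(n_classes: int, accuracy: float,
                     seconds_per_selection: float) -> float:
    """Scale bits/selection by the selection rate to get bits/minute."""
    return wolpaw_bits_per_selection(n_classes, accuracy) * 60 / seconds_per_selection

# Illustrative values: a 25-symbol matrix speller at 92% accuracy with
# 30 s per selection.
print(round(itr_bits_per_min(25, 0.92, 30.0), 2))
```

The formula makes the training effect easy to interpret: ITR grows both with accuracy and with shorter selection times, so a session-over-session ITR increase can reflect improvement on either axis.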

  14. Integrating User Interface and Personal Innovativeness into the TAM for Mobile Learning in Cyber University

    Science.gov (United States)

    Joo, Young Ju; Lee, Hyeon Woo; Ham, Yookyoung

    2014-01-01

    This study aims to add new variables, namely user interface, personal innovativeness, and satisfaction in learning, to Davis's technology acceptance model and also examine whether learners are willing to adopt mobile learning. Thus, this study attempted to explain the structural causal relationships among user interface, personal…

  15. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    Science.gov (United States)

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  16. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    Science.gov (United States)

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (standard deviation, SD) time to complete a manual SOFA score calculation was 61.6 (33) s. Among the 24% (12/50) usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.
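At its core, an automated calculator like the one described encodes the published SOFA component tables as threshold rules over EMR values. A toy sketch of two of the six components is below; the cut-offs follow commonly published SOFA tables but are shown only for illustration, and any clinical tool must be validated against the full Sepsis-3 definition:

```python
def sofa_platelets(platelets_k_per_ul: float) -> int:
    """Coagulation sub-score from platelet count (x10^3/uL):
    >=150 -> 0, <150 -> 1, <100 -> 2, <50 -> 3, <20 -> 4."""
    for score, limit in ((4, 20), (3, 50), (2, 100), (1, 150)):
        if platelets_k_per_ul < limit:
            return score
    return 0

def sofa_creatinine(creatinine_mg_dl: float) -> int:
    """Renal sub-score from serum creatinine (mg/dL):
    <1.2 -> 0, 1.2-1.9 -> 1, 2.0-3.4 -> 2, 3.5-4.9 -> 3, >=5.0 -> 4."""
    for score, limit in ((0, 1.2), (1, 2.0), (2, 3.5), (3, 5.0)):
        if creatinine_mg_dl < limit:
            return score
    return 4

# Example: platelets 95 x10^3/uL and creatinine 2.4 mg/dL
print(sofa_platelets(95), sofa_creatinine(2.4))  # → 2 2
```

The clerical burden measured in the study comes from looking up six such values manually; once the EMR feeds them in, the scoring itself is a trivial table lookup like this.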

  17. A novel remote center of motion mechanism for the force-reflective master robot of haptic tele-surgery systems.

    Science.gov (United States)

    Hadavand, Mostafa; Mirbagheri, Alireza; Behzadipour, Saeed; Farahmand, Farzam

    2014-06-01

    An effective master robot for haptic tele-surgery applications needs to provide a solution for the inversed movements of the surgical tool, in addition to sufficient workspace and manipulability, with minimal moving inertia. A novel 4 + 1-DOF mechanism was proposed, based on a triple parallelogram linkage, which provided a Remote Center of Motion (RCM) at the back of the user's hand. The kinematics of the robot was analyzed and a prototype was fabricated and evaluated by experimental tests. With a RCM at the back of the user's hand and the actuators far from the end effector, the robot could produce the sensation of hand-inside surgery with minimal moving inertia. The target workspace was achieved with an acceptable manipulability. The trajectory tracking experiments revealed small errors, due to backlash at the joints. The proposed mechanism meets the basic requirements of an effective master robot for haptic tele-surgery applications. Copyright © 2013 John Wiley & Sons, Ltd.

  18. Finite Element Methods for real-time Haptic Feedback of Soft-Tissue Models in Virtual Reality Simulators

    Science.gov (United States)

    Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)

    2001-01-01

    We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
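The Jacobi-preconditioned conjugate gradient iteration proposed above can be sketched as follows. A tiny dense system stands in for the reduced stiffness equations K u = f; a real FEM solver would use sparse storage and run the matrix-vector products in parallel (e.g. on the FPGA fabric the abstract mentions):

```python
def pcg_jacobi(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A using conjugate
    gradients preconditioned with the Jacobi (diagonal) preconditioner."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                                   # residual r = b - A x, x = 0
    M_inv = [1.0 / A[i][i] for i in range(n)]  # Jacobi preconditioner M^-1
    z = [M_inv[i] * r[i] for i in range(n)]
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

# 2x2 SPD stiffness-like system; the exact solution is [1, 1].
A = [[4.0, 1.0], [1.0, 3.0]]
b = [5.0, 4.0]
print([round(v, 6) for v in pcg_jacobi(A, b)])  # → [1.0, 1.0]
```

The preconditioner costs one division per node per iteration, which is why Jacobi is attractive at the >500 Hz haptic rates discussed: each refresh can afford only a few cheap iterations on the reduced surface system.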

  19. Comparative performance analysis of M-IMU/EMG and voice user interfaces for assistive robots.

    Science.gov (United States)

    Laureiti, Clemente; Cordella, Francesca; di Luzio, Francesco Scotto; Saccucci, Stefano; Davalli, Angelo; Sacchetti, Rinaldo; Zollo, Loredana

    2017-07-01

    People with a high level of disability experience great difficulties performing activities of daily living and resort to their residual motor functions in order to operate assistive devices. The commercially available interfaces used to control assistive manipulators are typically based on joysticks and can be used only by subjects with upper-limb residual mobility. Many other solutions can be found in the literature, based on the use of multiple sensory systems for detecting the human motion intention and state. Some of them require a high cognitive workload for the user. Some others are more intuitive and easy to use but have not been widely investigated in terms of usability and user acceptance. The objective of this work is to propose an intuitive and robust user interface for assistive robots, not obtrusive for the user and easily adaptable for subjects with different levels of disability. The proposed user interface is based on the combination of M-IMU and EMG for the continuous control of an arm-hand robotic system. The system has been experimentally validated and compared to a standard voice interface. Sixteen healthy subjects volunteered to participate in the study: 8 subjects used the combined M-IMU/EMG robot control, and 8 subjects used the voice control. The arm-hand robotic system made of the KUKA LWR 4+ and the IH2 Azzurra hand was controlled to accomplish the daily living task of drinking. Performance indices and evaluation scales were adopted to assess the performance of the two interfaces.

  20. Innovative User Interfaces in the Industrial Domain

    OpenAIRE

    Jutterström, Jenny

    2010-01-01

    The goal of this thesis is to explore how the HMI of a process control system can be improved by applying modern interaction technologies. Many new interaction possibilities are arising on the market, while interaction in the industrial domain is still quite conservative, with computer mouse and keyboard as the central method of interaction. It is believed that by making use of technology available today, the user interface can provide further assistance to the process control operators a...

  1. Research and Development for an Operational Information Ecology: The User-System Interface Agent Project

    Science.gov (United States)

    Srivastava, Sadanand; deLamadrid, James

    1998-01-01

    The User System Interface Agent (USIA) is a special type of software agent which acts as the "middle man" between a human user and an information processing environment. USIA consists of a group of cooperating agents which are responsible for assisting users in obtaining information processing services intuitively and efficiently. Some of the main features of USIA include: (1) multiple interaction modes and (2) user-specific and stereotype modeling and adaptation. This prototype system provides us with a development platform towards the realization of an operational information ecology. In the first phase of this project we focused on the design and implementation of a prototype of the User-System Interface Agent (USIA). The second phase of USIA allowed user interaction via a restricted query language as well as through a taxonomy of windows. In the third phase the USIA system architecture was revised.

  2. A development of user-friendly graphical interface for a blanket simulator

    International Nuclear Information System (INIS)

    Lee, Young-Seok; Yoon, Seok-Heun; Han, Jung-Hoon

    2010-01-01

    A web-based user-friendly graphical interface (GUI) system, named GUMBIS (Graphical User-friendly Monte-Carlo-Application Blanket-Design Interface System), was developed to reduce the effort of researchers and practitioners who study tokamak blanket designs with the Monte Carlo MCNP/MCNPX codes. GUMBIS also aims to support them in using the codes for their studies without a thorough understanding of the codes' complex menus and commands. Developed in a web-based environment, GUMBIS provides task-sharing capability on a network. GUMBIS, applicable to both blanket design and neutronics analysis, could facilitate not only advanced blanket R and D but also the education and training of researchers in the field.

  3. Introduction to haptics for neurosurgeons.

    Science.gov (United States)

    L'Orsa, Rachael; Macnab, Chris J B; Tavakoli, Mahdi

    2013-01-01

    Robots are becoming increasingly relevant to neurosurgeons, extending a neurosurgeon's physical capabilities, improving navigation within the surgical landscape when combined with advanced imaging, and propelling the movement toward minimally invasive surgery. Most surgical robots, however, isolate surgeons from the full range of human senses during a procedure. This forces surgeons to rely on vision alone for guidance through the surgical corridor, which limits the capabilities of the system, requires significant operator training, and increases the surgeon's workload. Incorporating haptics into these systems, ie, enabling the surgeon to "feel" forces experienced by the tool tip of the robot, could render these limitations obsolete by making the robot feel more like an extension of the surgeon's own body. Although the use of haptics in neurosurgical robots is still mostly the domain of research, neurosurgeons who keep abreast of this emerging field will be more prepared to take advantage of it as it becomes more prevalent in operating theaters. Thus, this article serves as an introduction to the field of haptics for neurosurgeons. We not only outline the current and future benefits of haptics but also introduce concepts in the fields of robotic technology and computer control. This knowledge will allow readers to be better aware of limitations in the technology that can affect performance and surgical outcomes, and "knowing the right questions to ask" will be invaluable for surgeons who have purchasing power within their departments.

  4. The high level programmer and user interface of the NSLS control system

    International Nuclear Information System (INIS)

    Tang, Y.N.; Smith, J.D.; Sathe, S.

    1993-01-01

    This paper presents the major components of the high level software in the NSLS upgraded control system. Both programmer and user interfaces are discussed. The use of high-speed workstations, fast network communications, the UNIX system, X Window and Motif has greatly changed and improved these interfaces.

  5. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    Science.gov (United States)

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes easy executions of RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .

  6. Haptic Feedback for Enhancing Realism of Walking Simulations

    DEFF Research Database (Denmark)

    Turchet, Luca; Burelli, Paolo; Serafin, Stefania

    2013-01-01

    system. While during the use of the interactive system subjects physically walked, during the use of the non-interactive system the locomotion was simulated while subjects were sitting on a chair. In both the configurations subjects were exposed to auditory and audio-visual stimuli presented...... with and without the haptic feedback. Results of the experiments provide a clear preference towards the simulations enhanced with haptic feedback showing that the haptic channel can lead to more realistic experiences in both interactive and non-interactive configurations. The majority of subjects clearly...... appreciated the added feedback. However, some subjects found the added feedback disturbing and annoying. This might be due on one hand to the limits of the haptic simulation and on the other hand to the different individual desire to be involved in the simulations. Our findings can be applied to the context...

  7. Social Circles: A 3D User Interface for Facebook

    Science.gov (United States)

    Rodrigues, Diego; Oakley, Ian

    Online social network services are increasingly popular web applications which display large amounts of rich multimedia content: contacts, status updates, photos and event information. Arguing that this quantity of information overwhelms conventional user interfaces, this paper presents Social Circles, a rich interactive visualization designed to support real world users of social network services in everyday tasks such as keeping up with friends and organizing their network. It achieves this by using 3D UIs, fluid animations and a spatial metaphor to enable direct manipulation of a social network.

  8. AutoNUI:2nd Workshop on Automotive Natural User Interfaces

    OpenAIRE

    Pfleging, Bastian; Döring, Tanja; Alvarez, Ignacio; Kranz, Matthias; Weinberg, Garrett; Healey, Jennifer

    2012-01-01

    Natural user interfaces—generally based on gesture and speech interaction—are an increasingly hot topic in research and are already being applied in a multitude of commercial products. Most use cases currently involve consumer electronics devices like smart phones, tablets, TV sets, game consoles, or large-screen tabletop computers. Motivated by the latest results in those areas, our vision is to apply natural user interfaces, for example gesture and conversational speech interaction, to the a...

  9. Vertical illusory self-motion through haptic stimulation of the feet

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Nilsson, Niels Christian; Turchet, Luca

    2012-01-01

    Circular and linear self-motion illusions induced through visual and auditory stimuli have been studied rather extensively. While the ability of haptic stimuli to augment such illusions has been investigated, the self-motion illusions which primarily are induced by stimulation of the haptic...... to generate the haptic feedback while the final condition included no haptic feedback. Analysis of self-reports were used to assess the participants' experience of illusory self-motion. The results indicate that such illusions are indeed possible. Significant differences were found between the condition...... modality remain relatively unexplored. In this paper, we present an experiment performed with the intention of investigating whether it is possible to use haptic stimulation of the main supporting areas of the feet to induce vertical illusory self-motion on behalf of unrestrained participants during...

  10. Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code

    Science.gov (United States)

    Lavelle, Thomas M.; Curlett, Brian P.

    1994-01-01

    XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.

  11. Haptic spatial matching in near peripersonal space.

    Science.gov (United States)

    Kaas, Amanda L; Mier, Hanneke I van

    2006-04-01

    Research has shown that haptic spatial matching at intermanual distances over 60 cm is prone to large systematic errors. The error pattern has been explained by the use of reference frames intermediate between egocentric and allocentric coding. This study investigated haptic performance in near peripersonal space, i.e. at intermanual distances of 60 cm and less. Twelve blindfolded participants (six males and six females) were presented with two turn bars at equal distances from the midsagittal plane, 30 or 60 cm apart. Different orientations (vertical/horizontal or oblique) of the left bar had to be matched by adjusting the right bar to either a mirror symmetric (/ \\) or parallel (/ /) position. The mirror symmetry task can in principle be performed accurately in both an egocentric and an allocentric reference frame, whereas the parallel task requires an allocentric representation. Results showed that parallel matching induced large systematic errors which increased with distance. Overall error was significantly smaller in the mirror task. The task difference also held for the vertical orientation at 60 cm distance, even though this orientation required the same response in both tasks, showing a marked effect of task instruction. In addition, men outperformed women on the parallel task. Finally, contrary to our expectations, systematic errors were found in the mirror task, predominantly at 30 cm distance. Based on these findings, we suggest that haptic performance in near peripersonal space might be dominated by different mechanisms than those which come into play at distances over 60 cm. Moreover, our results indicate that both inter-individual differences and task demands affect task performance in haptic spatial matching. Therefore, we conclude that the study of haptic spatial matching in near peripersonal space might reveal important additional constraints for the specification of adequate models of haptic spatial performance.

  12. a New ER Fluid Based Haptic Actuator System for Virtual Reality

    Science.gov (United States)

    Böse, H.; Baumann, M.; Monkman, G. J.; Egersdörfer, S.; Tunayar, A.; Freimuth, H.; Ermert, H.; Khaled, W.

    The concept and some steps in the development of a new actuator system which enables the haptic perception of mechanically inhomogeneous virtual objects are introduced. The system consists of a two-dimensional planar array of actuator elements containing an electrorheological (ER) fluid. When a user presses his fingers onto the surface of the actuator array, he perceives locally variable resistance forces generated by vertical pistons which slide in the ER fluid through the gaps between electrode pairs. The voltage in each actuator element can be individually controlled by a novel sophisticated switching technology based on optoelectric gallium arsenide elements. The haptic information which is represented at the actuator array can be transferred from a corresponding sensor system based on ultrasonic elastography. The combined sensor-actuator system may serve as a technology platform for various applications in virtual reality, like telemedicine where the information on the consistency of tissue of a real patient is detected by the sensor part and recorded by the actuator part at a remote location.

  13. StarTrax --- The Next Generation User Interface

    Science.gov (United States)

    Richmond, Alan; White, Nick

    StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e. bulletins, catalogs, proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System) and later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method of accessing StarTrax. Use it if you prefer the point-and-click metaphor of modern GUI technology to classical command-line interfaces (CLIs). Notable strengths include: ease of use; excellent portability; very robust server support; a feedback button on every dialog; and a painstakingly crafted User Guide. It is designed to support a large number of input devices, including terminals, workstations, and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), Macintosh, MS-Windows (DOS), or character-based systems.

  14. Transportable Applications Environment (TAE) Plus - A NASA productivity tool used to develop graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.

  15. Haptic sensitivity in needle insertion: the effects of training and visual aid

    Directory of Open Access Journals (Sweden)

    Dumas Cedric

    2011-12-01

    Full Text Available This paper describes an experiment conducted to measure haptic sensitivity and the effects of haptic training with and without visual aid. The protocol for haptic training consisted of a needle insertion task using dual-layer silicon samples. A visual aid was provided as a multimodal cue for the haptic perception task. Results showed that for a group of novices (subjects with no previous experience in needle insertion), training with a visual aid resulted in a longer time to task completion, and a greater applied force, during post-training tests. This suggests that haptic perception is easily overshadowed, and may be completely replaced, by visual feedback. Therefore, haptic skills must be trained differently from visuomotor skills.

  16. Vision holds a greater share in visuo-haptic object recognition than touch

    DEFF Research Database (Denmark)

    Kassuba, Tanja; Klinge, Corinna; Hölig, Cordula

    2013-01-01

    approach of multisensory integration would predict that haptics as the less efficient sense for object recognition gains more from integrating additional visual information than vice versa. To test for asymmetries between vision and touch in visuo-haptic interactions, we measured regional changes in brain...... processed the target object, being more pronounced for haptic than visual targets. This preferential response of visuo-haptic regions indicates a modality-specific asymmetry in crossmodal matching of visual and haptic object features, suggesting a functional primacy of vision over touch in visuo...

  17. Robotic and user interface solutions for hazardous and remote applications

    International Nuclear Information System (INIS)

    Schempf, H.

    1997-01-01

    Carnegie Mellon University (CMU) is developing novel robotic and user interface systems to assist in the cleanup activities undertaken by the U.S. Department of Energy (DOE). Under DOE's EM-50 funding and administered by the Federal Energy Technology Center (FETC), CMU has developed a novel asbestos pipe-insulation abatement robot system, called BOA, and a novel generic user interface control and training console, dubbed RoboCon. The use of BOA will allow the speedier abatement of the vast DOE piping networks clad with hazardous and contaminated asbestos insulation by which overall job costs can be reduced by as much as 50%. RoboCon will allow the DOE to evaluate different remote and robotic system technologies from the overall man-machine performance standpoint, as well as provide a standardized training platform for training site operators in the operation of remote and robotic equipment

  18. Graphical user interface development for the MARS code

    International Nuclear Information System (INIS)

    Jeong, J.-J.; Hwang, M.; Lee, Y.J.; Kim, K.D.; Chung, B.D.

    2003-01-01

    KAERI has developed the best-estimate thermal-hydraulic system code MARS using the RELAP5/MOD3 and COBRA-TF codes. To exploit the excellent features of the two codes, we consolidated the two codes. Then, to improve the readability, maintainability, and portability of the consolidated code, all the subroutines were completely restructured by employing a modular data structure. At present, a major part of the MARS code development program is underway to improve the existing capabilities. The code couplings with three-dimensional neutron kinetics, containment analysis, and transient critical heat flux calculations have also been carried out. At the same time, graphical user interface (GUI) tools have been developed for user friendliness. This paper presents the main features of the MARS GUI. The primary objective of the GUI development was to provide a valuable aid for all levels of MARS users in their output interpretation and interactive controls. Especially, an interactive control function was designed to allow operator actions during simulation so that users can utilize the MARS code like conventional nuclear plant analyzers (NPAs). (author)

  19. Use of Design Patterns According to Hand Dominance in a Mobile User Interface

    Science.gov (United States)

    Al-Samarraie, Hosam; Ahmad, Yusof

    2016-01-01

    User interface (UI) design patterns for mobile applications provide a solution to design problems and can improve the usage experience for users. However, there is a lack of research categorizing the uses of design patterns according to users' hand dominance in a learning-based mobile UI. We classified the main design patterns for mobile…

  20. Graphical user interfaces for McClellan Nuclear Radiation Center

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S.A.; Power, M.; Forsmann, H.

    1998-01-01

    The control console of the TRIGA reactor at McClellan's Nuclear Radiation Center (MNRC) is in the process of being replaced because of spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and by incorporating human factors during all stages of the graphical user interface (GUI) development and control console design. This paper gives a brief description of some of the guidelines used in developing the MNRC's GUIs as continuous, real-time displays

  1. [Haptic tracking control for minimally invasive robotic surgery].

    Science.gov (United States)

    Xu, Zhaohong; Song, Chengli; Wu, Wenwu

    2012-06-01

    Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS is the lack of haptic perception for the surgeon, including in the commercially available da Vinci surgical system. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because solving the exact dynamics takes considerable time, we used a digital PID algorithm informed by the robot dynamics to ensure real-time bilateral control, improving tracking precision and real-time control efficiency. To validate the proposed method, an experimental system was developed in which two Novint Falcon haptic devices act as a master-slave pair. Simulations and experiments showed that the proposed methods can feed instrument forces back to the operator, and that bilateral control is an effective strategy for master-slave MIRS. The proposed methods could also be applied to tele-robotic systems.
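The digital-PID position-tracking loop described above can be sketched in a few lines. This is a minimal illustration, not the controller from the paper: the gains, the time step, and the unit-mass, viscous-friction slave model are all illustrative assumptions.

```python
# Minimal sketch of a discrete PID loop driving a 1-DOF slave to follow a
# master trajectory. Gains and plant model are illustrative assumptions,
# not parameters of the cited system.

class DiscretePID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def track(master_positions, dt=0.001):
    """Drive an assumed slave model (unit mass, viscous friction b = 5)
    toward the master trajectory; returns the slave trajectory."""
    pid = DiscretePID(kp=400.0, ki=50.0, kd=20.0, dt=dt)
    pos, vel = 0.0, 0.0
    trajectory = []
    for target in master_positions:
        force = pid.update(target - pos)
        vel += (force - 5.0 * vel) * dt  # F = ma with unit mass and damping
        pos += vel * dt
        trajectory.append(pos)
    return trajectory
```

Fed a 5-second step input at a 1 kHz servo rate, the sketched slave settles at the master position; a real bilateral controller would additionally reflect the slave-side contact force to the master.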

  2. Graphical User Interfaces for Volume Rendering Applications in Medical Imaging

    OpenAIRE

    Lindfors, Lisa; Lindmark, Hanna

    2002-01-01

    Volume rendering applications are used in medical imaging in order to facilitate the analysis of three-dimensional image data. This study focuses on how to improve the usability of graphical user interfaces of these systems, by gathering user requirements. This is achieved by evaluations of existing systems, together with interviews and observations at clinics in Sweden that use volume rendering to some extent. The usability of the applications of today is not sufficient, according to the use...

  3. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    Science.gov (United States)

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  4. Introducing a new open source GIS user interface for the SWAT model

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  5. Circumventing Graphical User Interfaces in Chemical Engineering Plant Design

    Science.gov (United States)

    Romey, Noel; Schwartz, Rachel M.; Behrend, Douglas; Miao, Peter; Cheung, H. Michael; Beitle, Robert

    2007-01-01

    Graphical User Interfaces (GUIs) are pervasive elements of most modern technical software and represent a convenient tool for student instruction. For example, GUIs are used for [chemical] process design software (e.g., CHEMCAD, PRO/II and ASPEN) typically encountered in the senior capstone course. Drag and drop aspects of GUIs are challenging for…

  6. Wearable Vibrotactile Haptic Device for Stiffness Discrimination during Virtual Interactions

    Directory of Open Access Journals (Sweden)

    Andualem Tadesse Maereg

    2017-09-01

    Full Text Available In this paper, we discuss the development of a cost-effective, wireless, and wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects. Our experimental setup consists of a haptic device with five vibrotactile actuators and a virtual reality environment tailored in Unity 3D, integrating the Oculus Rift Head Mounted Display (HMD) and the Leap Motion controller. The virtual environment is able to capture touch inputs from users. Interaction forces are then rendered at 500 Hz and fed back to the wearable setup, stimulating fingertips with ERM vibrotactile actuators. Amplitude and frequency of vibrations are modulated proportionally to the interaction force to simulate the stiffness of a virtual object. A quantitative and qualitative study was done to compare the discrimination of stiffness of a virtual linear spring in three sensory modalities: visual only feedback, tactile only feedback, and their combination. A common psychophysics method, the Two Alternative Forced Choice (2AFC) approach, was used for quantitative analysis using Just Noticeable Difference (JND) and Weber Fractions (WF). According to the psychometric experiment results, the average Weber fraction value of 0.39 for visual only feedback improved to 0.25 with the addition of tactile feedback.
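The two computations the abstract describes, proportional amplitude/frequency modulation and a Weber fraction derived from 2AFC data, can be sketched as below. The force ceiling, the 60-250 Hz band, and the 75%-correct JND criterion are illustrative assumptions, not values taken from the paper.

```python
def modulate_vibration(force, f_max=10.0, amp_max=1.0, freq_range=(60.0, 250.0)):
    """Scale ERM drive amplitude and frequency proportionally to the
    interaction force. All constants are illustrative assumptions."""
    ratio = max(0.0, min(force / f_max, 1.0))   # clamp to [0, 1]
    amplitude = amp_max * ratio
    frequency = freq_range[0] + (freq_range[1] - freq_range[0]) * ratio
    return amplitude, frequency


def weber_fraction(reference, deltas, p_correct, threshold=0.75):
    """Estimate the JND as the stiffness increment at the 75%-correct point
    of a 2AFC psychometric curve (linear interpolation between measured
    points), then return WF = JND / reference."""
    for (d0, p0), (d1, p1) in zip(zip(deltas, p_correct),
                                  zip(deltas[1:], p_correct[1:])):
        if p0 <= threshold <= p1:
            jnd = d0 + (threshold - p0) * (d1 - d0) / (p1 - p0)
            return jnd / reference
    raise ValueError("threshold not bracketed by the data")
```

For example, a reference stiffness of 100 N/m with 75%-correct performance reached at a 25 N/m increment yields WF = 0.25, the combined-feedback value reported above.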

  7. A Semi-automated Approach to Improve the Efficiency of Medical Imaging Segmentation for Haptic Rendering.

    Science.gov (United States)

    Banerjee, Pat; Hu, Mengqi; Kannan, Rahul; Krishnaswamy, Srinivasan

    2017-08-01

    The Sensimmer platform represents our ongoing research on simultaneous haptics and graphics rendering of 3D models. For simulation of medical and surgical procedures using Sensimmer, 3D models must be obtained from medical imaging data, such as magnetic resonance imaging (MRI) or computed tomography (CT). Image segmentation techniques are used to determine the anatomies of interest from the images. 3D models are obtained from segmentation, and their triangle count must be reduced for graphics and haptics rendering. This paper focuses on creating 3D models by automating the segmentation of CT images based on pixel contrast, in order to integrate the interface between Sensimmer and medical imaging devices, using a volumetric approach, the Hough transform method, and a manual centering method. Automating the process reduced the segmentation time by 56.35% while maintaining the same output accuracy of ±2 voxels.
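As a toy illustration of contrast-based segmentation, the sketch below thresholds a slice on an intensity window and keeps the 4-connected region around a seed pixel. It is a crude stand-in for the paper's pipeline; the volumetric approach, Hough transform, and centering steps are not reproduced, and the intensity window is an assumption.

```python
from collections import deque

def segment_by_contrast(slice_, seed, lo, hi):
    """Return the set of pixels 4-connected to `seed` whose intensity lies
    in [lo, hi] -- a toy stand-in for contrast-based CT segmentation."""
    rows, cols = len(slice_), len(slice_[0])
    r0, c0 = seed
    if not (lo <= slice_[r0][c0] <= hi):
        return set()
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and lo <= slice_[nr][nc] <= hi):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region
```

On a 3x3 slice with a bright L-shaped structure, seeding inside the structure with a 150-255 window recovers exactly its three pixels; a real pipeline would run this per slice and stack the regions into a volume before triangle reduction.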

  8. Evaluation of flexible endoscope steering using haptic guidance

    NARCIS (Netherlands)

    Reilink, Rob; Stramigioli, Stefano; Kappers, Astrid M L; Misra, Sarthak

    Background: Steering the tip of a flexible endoscope relies on the physician's dexterity and experience. For complex flexible endoscopes, conventional controls may be inadequate. Methods: A steering method based on a multi-degree-of-freedom haptic device is presented. Haptic cues are generated based

  9. Evaluation of flexible endoscope steering using haptic guidance

    NARCIS (Netherlands)

    Reilink, Rob; Stramigioli, Stefano; Kappers, Astrid M.L.; Misra, Sarthak

    2011-01-01

    Background - Steering the tip of a flexible endoscope relies on the physician’s dexterity and experience. For complex flexible endoscopes, conventional controls may be inadequate. Methods - A steering method based on a multi-degree-of-freedom haptic device is presented. Haptic cues are generated

  10. A mobile user-interface for elderly care from the perspective of relatives.

    Science.gov (United States)

    Warpenius, Erika; Alasaarela, Esko; Sorvoja, Hannu; Kinnunen, Matti

    2015-03-01

    As the number of elderly people rises, relatives' care-taking responsibilities increase accordingly. This creates a need for developing new systems that enable relatives to keep track of aged family members. To develop new mobile services for elderly healthcare we tried to identify the most wanted features of a mobile user-interface from the perspective of relatives. Feature mapping was based on two online surveys: one administered to the relatives (N = 32) and nurses (N = 3) of senior citizens and the other to nursing students (N = 18). Results of the surveys, confirmed by face-to-face interviews of the relatives (N = 8), indicated that the most valued features of the mobile user-interface are Accident Reporting (e.g. falling), Alarms (e.g. fire-alarm), Doctor Visits and evaluation of the General Condition of the Senior. The averaged importance ratings of these features were 9.2, 9.0, 8.6 and 8.5, respectively (on a scale from 0 to 10). Other important considerations for the user-interface development are aspiration to simplicity and ease-of-use. We recommend that the results are taken into account, when designing and implementing mobile services for elderly healthcare.

  11. Graphical User Interface Tool Kit for Path-Based Network Policy Language

    National Research Council Canada - National Science Library

    Ekin, Tufan

    2002-01-01

    .... Two of the changes are related to the semantics of the language. A graphical user interface tool kit for creating, validating, archiving and compiling policies represented in PPL has been developed...

  12. A Virtual Reality System for PTCD Simulation Using Direct Visuo-Haptic Rendering of Partially Segmented Image Data.

    Science.gov (United States)

    Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz

    2016-01-01

    This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.

  13. Comparison and Evaluation of End-User Interfaces for Online Public Access Catalogs.

    Science.gov (United States)

    Zumer, Maja

    End-user interfaces for the online public access catalogs (OPACs) of OhioLINK, a system linking major university and research libraries in Ohio, and its 16 member libraries, accessible through the Internet, are compared and evaluated from the user-oriented perspective. A common, systematic framework was used for the scientific observation of the…

  14. Teaching Classical Mechanics Concepts Using Visuo-Haptic Simulators

    Science.gov (United States)

    Neri, Luis; Noguez, Julieta; Robledo-Rella, Victor; Escobar-Castillejos, David; Gonzalez-Nucamendi, Andres

    2018-01-01

    In this work, the design and implementation of several physics scenarios using haptic devices are presented and discussed. Four visuo-haptic applications were developed for an undergraduate engineering physics course. Experiments with experimental and control groups were designed and implemented. Activities and exercises related to classical…

  15. The role of haptic feedback in laparoscopic simulation training.

    Science.gov (United States)

    Panait, Lucian; Akkary, Ehab; Bell, Robert L; Roberts, Kurt E; Dudrick, Stanley J; Duffy, Andrew J

    2009-10-01

    Laparoscopic virtual reality simulators are becoming a ubiquitous tool in resident training and assessment. These devices provide the operator with various levels of realism, including haptic (or force) feedback. However, this feature adds significantly to the cost of the devices, and limited data exist assessing the value of haptics in skill acquisition and development. Utilizing the Laparoscopy VR (Immersion Medical, Gaithersburg, MD), we hypothesized that the incorporation of force feedback in the simulated operative environment would allow superior trainee performance compared with performance of the same basic skills tasks in a non-haptic model. Ten medical students with minimal laparoscopic experience and similar baseline skill levels as proven by performance of two fundamentals of laparoscopic surgery (FLS) tasks (peg transfer and cutting drills) voluntarily participated in the study. Each performed two tasks, analogous to the FLS drills, on the Laparoscopy VR at 3 levels of difficulty, based on the established settings of the manufacturer. After achieving familiarity with the device and tasks, the students completed the drills both with and without force feedback. Data on completion time, instrument path length, right and left hand errors, and grasping tension were analyzed. The scores in the haptic-enhanced simulation environment were compared with the scores in the non-haptic model and analyzed utilizing Student's t-test. The peg transfer drill showed no difference in performance between the haptic and non-haptic simulations for all metrics at all three levels of difficulty. For the more complex cutting exercise, the time to complete the tasks was significantly shorter when force feedback was provided, at all levels of difficulty (158+/-56 versus 187+/-51 s, 176+/-49 versus 222+/-68 s, and 275+/-76 versus 422+/-220 s, at levels 1, 2, and 3, respectively, P… The simulation did not demonstrate an appreciable performance improvement among our trainees. These data…

  16. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    Science.gov (United States)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
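One plausible shape for such a sender-side controller is sketched below, assuming an AIMD-style policy keyed to the one-way delay observed on the haptic packets; the thresholds, step sizes, and class name are illustrative assumptions, not the algorithm from the paper.

```python
class VisualRateController:
    """Adapt the visual-stream sending rate from delays measured on the
    real-time haptic packets. AIMD-style sketch; all constants are
    illustrative assumptions, not values from the cited work."""

    def __init__(self, rate_bps=2_000_000, floor=100_000, ceil=10_000_000):
        self.rate = rate_bps
        self.floor, self.ceil = floor, ceil

    def on_haptic_delay(self, delay_ms, base_delay_ms=5.0):
        if delay_ms > 2.0 * base_delay_ms:
            # Delay growth signals a queue building up: back off
            # multiplicatively to drain it quickly.
            self.rate = max(self.floor, int(self.rate * 0.5))
        elif delay_ms < 1.2 * base_delay_ms:
            # Delay near the propagation baseline: probe for spare
            # bandwidth with a small additive increase.
            self.rate = min(self.ceil, self.rate + 100_000)
        # Delays between the two thresholds leave the rate unchanged.
        return self.rate
```

Because the haptic packets are small and sent at a fixed rate, their delay serves as a continuous probe of the shared bottleneck, letting the visual stream yield before its own buffer overflows.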

  17. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services.

    Science.gov (United States)

    Correa, Miria C; Deus, Helena F; Vasconcelos, Ana T; Hayashi, Yuki; Ajani, Jaffer A; Patnana, Srikrishna V; Almeida, Jonas S

  18. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services

    Directory of Open Access Journals (Sweden)

    Hayashi Yuki

    2010-10-01

    Full Text Available Abstract Background AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. Methods The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. Results We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. Conclusions The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general
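    The abstract's central idea, UI widgets described as RDF resources and assembled automatically on the client, can be sketched in miniature. The triple vocabulary below (gbox:label, gbox:option, and so on) is hypothetical and stands in for the actual GBox schema, which the abstract does not spell out:

    ```python
    # Illustrative sketch only: RDF-style (subject, predicate, object) triples
    # describe UI widgets, and the client assembles a form configuration from
    # them. All predicate and class names are hypothetical, not the real GBox.

    from collections import defaultdict

    TRIPLES = [
        ("ui:patientAge", "rdf:type",    "gbox:NumericField"),
        ("ui:patientAge", "gbox:label",  "Age (years)"),
        ("ui:trialArm",   "rdf:type",    "gbox:Dropdown"),
        ("ui:trialArm",   "gbox:label",  "Trial arm"),
        ("ui:trialArm",   "gbox:option", "control"),
        ("ui:trialArm",   "gbox:option", "treatment"),
    ]

    def assemble_interface(triples):
        """Group triples by subject and emit one widget spec per UI resource."""
        by_subject = defaultdict(lambda: defaultdict(list))
        for s, p, o in triples:
            by_subject[s][p].append(o)
        widgets = []
        for subject, props in by_subject.items():
            widgets.append({
                "id": subject,
                "widget": props["rdf:type"][0],
                "label": props["gbox:label"][0],
                "options": props.get("gbox:option", []),
            })
        return widgets

    for widget in assemble_interface(TRIPLES):
        print(widget)
    ```

    Because the interface is derived from the triples at render time, editing a descriptor (for example, adding another gbox:option) changes the generated form without touching client code, which mirrors the "changes in the graphic descriptors are automatically and immediately reflected" behavior the abstract claims.
    
    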

  19. Enhancing the Gaming Experience Using 3D Spatial User Interface Technologies.

    Science.gov (United States)

    Kulshreshth, Arun; Pfeil, Kevin; LaViola, Joseph J

    2017-01-01

    Three-dimensional (3D) spatial user interface technologies have the potential to make games more immersive and engaging and thus provide a better user experience. Although technologies such as stereoscopic 3D display, head tracking, and gesture-based control are available for games, it is still unclear how their use affects gameplay and whether they offer any user performance benefits. The authors have conducted several experiments on these technologies in game environments to understand how they affect gameplay and how they can be used to optimize the gameplay experience.

  20. Fusion interfaces for tactical environments: An application of virtual reality technology

    Science.gov (United States)

    Haas, Michael W.

    1994-01-01

    The term Fusion Interface is defined as a class of interface that integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory, virtually augmented synthetic environment. A new facility dedicated to exploratory development of fusion interface concepts has been developed within the Human Engineering Division of the Armstrong Laboratory. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, and localized auditory presentations; drive haptic displays on the stick and rudder pedals; and execute the weapons, aerodynamic, and threat models.
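    As a rough illustration of the kind of frame-locked, multi-channel update the FITE description implies, the following sketch steps several simulated output channels at a fixed rate. The channel list echoes the abstract, but the loop itself is an assumption for illustration, not the facility's actual software:

    ```python
    # Hedged sketch: a fixed-timestep loop that updates every output "channel"
    # once per frame, sleeping to stay locked to the frame clock. Channel names
    # are taken from the abstract; the loop design is illustrative only.

    import time

    CHANNELS = ["out-the-window visuals", "head-mounted visuals",
                "localized audio", "stick/pedal haptics",
                "aerodynamic model", "threat model"]

    def run(frames, hz=60.0, sleep=time.sleep, now=time.monotonic):
        """Step every channel at a fixed rate; return per-channel step counts."""
        dt = 1.0 / hz
        counts = {c: 0 for c in CHANNELS}
        next_tick = now()
        for _ in range(frames):
            for channel in CHANNELS:
                counts[channel] += 1          # stand-in for the real update
            next_tick += dt
            remaining = next_tick - now()
            if remaining > 0:                 # stay locked to the frame clock
                sleep(remaining)
        return counts

    print(run(frames=3))
    ```

    In the real facility each channel ran on its own 80486 node rather than in one loop, so a closer analogue would distribute these updates across processes synchronized to a shared clock; the single-loop version above only conveys the fixed-rate scheduling idea.
    
    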