WorldWideScience

Sample records for haptically rendered virtual

  1. Haptic rendering foundations, algorithms, and applications

    CERN Document Server

    Lin, Ming C

    2008-01-01

    For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments. This book provides an authoritative overview of state-of-the-art haptic rendering algorithms and their applications. The authors examine various approaches and techniques for designing touch-enabled interfaces for a number of applications, including medical training, model design, and maintainability analysis for virtual prototyping, scienti

  2. A Virtual Reality System for PTCD Simulation Using Direct Visuo-Haptic Rendering of Partially Segmented Image Data.

    Science.gov (United States)

    Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz

    2016-01-01

    This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.

  3. Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device

    Science.gov (United States)

    Färber, Matthias; Heller, Julika; Handels, Heinz

    2007-03-01

    The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medicaments or to extract liquor. The training of this procedure is usually done on the patient guided by experienced supervisors. A virtual reality lumbar puncture simulator has been developed in order to minimize the training costs and the patient's risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures like skin, bone, muscles or fat and original CT data that contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices enable a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation the Visible Human male dataset has been used to generate a virtual training body. Several users with different medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with rotation constraints enabled by the 6DOF device facilitates a realistic puncture simulation.
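
    As a purely illustrative aside, the force model sketched in this abstract (per-label stiffness for segmented structures plus a contribution from the raw CT intensities for unsegmented tissue) can be captured in a few lines. The snippet below is not the authors' implementation; the label-to-stiffness table, the CT weighting and the spring-like resistance law are all assumptions made for the example.

```python
import numpy as np

# Hypothetical per-label stiffness [N/mm]; labels and values are illustrative only.
LABEL_STIFFNESS = {0: 0.0, 1: 0.2, 2: 0.5, 3: 5.0}  # 0=air, 1=fat, 2=muscle, 3=bone

def needle_resistance(labels, ct, tip_idx, penetration_mm, ct_scale=1e-4):
    """Return a scalar resistance force opposing needle insertion.

    labels, ct     : 3D numpy arrays (label volume and CT volume on the same grid)
    tip_idx        : (i, j, k) voxel index of the virtual needle tip
    penetration_mm : depth by which the tip has penetrated the current tissue
    ct_scale       : weight of the unsegmented CT contribution (assumption)
    """
    i, j, k = tip_idx
    k_label = LABEL_STIFFNESS.get(int(labels[i, j, k]), 0.1)
    # Segmented structures behave like springs; raw CT intensity adds a small
    # extra term so unsegmented image structures are still felt.
    return k_label * penetration_mm + ct_scale * max(float(ct[i, j, k]), 0.0)

# Toy volumes: a 'bone' slab inside 'muscle'.
labels = np.full((32, 32, 32), 2, dtype=np.uint8)
labels[:, :, 20:] = 3
ct = np.where(labels == 3, 1200.0, 40.0)
print(needle_resistance(labels, ct, (16, 16, 21), penetration_mm=1.5))
```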

  4. High Fidelity Haptic Rendering

    CERN Document Server

    Otaduy, Miguel A

    2006-01-01

    The human haptic system, among all senses, provides unique and bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force feedback devices, have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance t

  5. Haptic rendering for simulation of fine manipulation

    CERN Document Server

    Wang, Dangxiao; Zhang, Yuru

    2014-01-01

    This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in man

  6. Haptics for Virtual Reality and Teleoperation

    CERN Document Server

    Mihelj, Matjaž

    2012-01-01

    This book covers all topics relevant for the design of haptic interfaces and teleoperation systems. The book provides the basic knowledge required for understanding more complex approaches and more importantly it introduces all issues that must be considered for designing efficient and safe haptic interfaces. Topics covered in this book provide insight into all relevant components of a haptic system. The reader is guided from understanding the virtual reality concept to the final goal of being able to design haptic interfaces for specific tasks such as nanomanipulation.  The introduction chapter positions the haptic interfaces within the virtual reality context. In order to design haptic interfaces that will comply with human capabilities at least basic understanding of human sensors-motor system is required. An overview of this topic is provided in the chapter related to human haptics. The book does not try to introduce the state-of-the-art haptic interface solutions because these tend to change quickly. On...

  7. Wearable Vibrotactile Haptic Device for Stiffness Discrimination during Virtual Interactions

    Directory of Open Access Journals (Sweden)

    Andualem Tadesse Maereg

    2017-09-01

    In this paper, we discuss the development of a cost-effective, wireless, and wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects. Our experimental setup consists of a haptic device with five vibrotactile actuators and a virtual reality environment built in Unity 3D, integrating the Oculus Rift head-mounted display (HMD) and the Leap Motion controller. The virtual environment is able to capture touch inputs from users. Interaction forces are then rendered at 500 Hz and fed back to the wearable setup, stimulating the fingertips with ERM vibrotactile actuators. Amplitude and frequency of vibrations are modulated proportionally to the interaction force to simulate the stiffness of a virtual object. A quantitative and qualitative study was done to compare the discrimination of stiffness of a virtual linear spring in three sensory modalities: visual-only feedback, tactile-only feedback, and their combination. A common psychophysics method, the two-alternative forced choice (2AFC) approach, is used for quantitative analysis based on the just noticeable difference (JND) and Weber fraction (WF). According to the psychometric experiment results, the average Weber fraction of 0.39 for visual-only feedback improved to 0.25 when tactile feedback was added.
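
    For readers unfamiliar with the psychophysics terms used here, the short sketch below shows how a Weber fraction relates a JND to a reference stimulus, and how an interaction force might be mapped proportionally to vibration amplitude and frequency, as the abstract describes. The numeric ranges and the linear mapping are assumptions for illustration, not the authors' parameters.

```python
# Minimal sketch (not the authors' code): estimating a Weber fraction from a
# JND estimate and modulating vibration output with interaction force.
import numpy as np

def weber_fraction(reference, jnd):
    """WF = JND / reference stimulus intensity."""
    return jnd / reference

def vibration_command(force_n, f_max=5.0, amp_max=1.0, freq_range=(60.0, 250.0)):
    """Map an interaction force [N] to ERM amplitude and frequency.

    The proportional mapping and the numeric ranges are assumptions made for
    illustration; the paper only states that amplitude and frequency are
    modulated proportionally to the rendered force.
    """
    x = np.clip(force_n / f_max, 0.0, 1.0)
    amplitude = amp_max * x
    frequency = freq_range[0] + (freq_range[1] - freq_range[0]) * x
    return amplitude, frequency

# Example: reference stiffness 300 N/m with a JND of 75 N/m gives WF = 0.25.
print(weber_fraction(300.0, 75.0))
print(vibration_command(force_n=2.5))
```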

  8. A "virtually minimal" visuo-haptic training of attention in severe traumatic brain injury.

    Science.gov (United States)

    Dvorkin, Assaf Y; Ramaiya, Milan; Larson, Eric B; Zollman, Felise S; Hsu, Nancy; Pacini, Sonia; Shah, Amit; Patton, James L

    2013-08-09

    Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. We designed a "virtually minimal" approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. Twenty-one inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing.

  9. Virtual haptic system for intuitive planning of bone fixation plate placement

    Directory of Open Access Journals (Sweden)

    Kup-Sze Choi

    2017-01-01

    Placement of a pre-contoured fixation plate is a common treatment for bone fracture. Fitting of fixation plates on fractured bone can be preoperatively planned and evaluated in a 3D virtual environment using virtual reality technology. However, conventional systems usually employ a 2D mouse and a virtual trackball as the user interface, which makes the process inconvenient and inefficient. In this paper, a preoperative planning system equipped with a 3D haptic user interface is proposed to allow users to manipulate the virtual fixation plate intuitively to determine the optimal position for placement on the distal medial tibia. The system provides interactive feedback forces and visual guidance based on the geometric requirements. Creation of 3D models from medical imaging data, collision detection, dynamics simulation and haptic rendering are discussed. The system was evaluated by 22 subjects. Results show that the time to achieve optimal placement using the proposed system was shorter than with the 2D mouse and virtual trackball, and the satisfaction rating was also higher. The system shows potential to facilitate the process of fitting fixation plates on fractured bones as well as interactive fixation plate design.

  10. HapTip: Displaying Haptic Shear Forces at the Fingertips for Multi-Finger Interaction in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Adrien eGirard

    2016-04-01

    The fingertips are one of the most important and sensitive parts of our body. They are the first stimulated areas of the hand when we interact with our environment. Providing haptic feedback to the fingertips in virtual reality could thus drastically improve perception and interaction with virtual environments. In this paper, we present a modular approach called HapTip to display such haptic sensations at the level of the fingertips. This approach relies on a wearable and compact haptic device able to simulate two degree-of-freedom (DoF) shear forces on the fingertip with a displacement range of ±2 mm. Several modules can be added and used jointly in order to address multi-finger and/or bimanual scenarios in virtual environments. For that purpose, we introduce several haptic rendering techniques to cover different cases of 3D interaction, such as touching a rough virtual surface, or feeling the inertia or weight of a virtual object. In order to illustrate the possibilities offered by HapTip, we provide four use cases focused on touching or grasping virtual objects. To validate the efficiency of our approach, we also conducted experiments to assess the tactile perception obtained with HapTip. Our results show that participants can successfully discriminate the directions of the 2 DoF stimulation of our haptic device. We also found that participants could reliably perceive different weights of virtual objects simulated using two HapTip devices. We believe that HapTip could be used in numerous applications in virtual reality for which 3D manipulation and tactile sensations are often crucial, such as virtual prototyping or virtual training.
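
    A minimal sketch of the rendering idea follows: a desired tangential (shear) force at the fingertip is converted to a 2-DoF tactor displacement and saturated at the ±2 mm travel quoted above. The linear force-to-displacement gain is an assumption; the paper's own rendering techniques are more elaborate.

```python
# Illustrative sketch only: mapping a desired tangential force at the fingertip
# to a 2-DoF tactor displacement, clamped to the ±2 mm range quoted above.
# The linear force-to-displacement gain is an assumption, not HapTip's actual law.
import numpy as np

MAX_DISPLACEMENT_MM = 2.0

def shear_command(tangential_force_n, gain_mm_per_n=1.5):
    """Return the (x, y) tactor displacement in mm for a 2D shear force in N."""
    d = gain_mm_per_n * np.asarray(tangential_force_n, dtype=float)
    norm = np.linalg.norm(d)
    if norm > MAX_DISPLACEMENT_MM:          # saturate on the device's travel limit
        d *= MAX_DISPLACEMENT_MM / norm
    return d

# Simulated weight cue: gravity on a 200 g virtual object held between two fingers.
print(shear_command([0.0, -0.5 * 0.2 * 9.81]))
```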

  11. A “virtually minimal” visuo-haptic training of attention in severe traumatic brain injury

    Science.gov (United States)

    2013-01-01

    Background: Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. Methods: We designed a “virtually minimal” approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. Twenty-one inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. Results: The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Conclusions: Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing. PMID:23938101

  12. Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work

    Science.gov (United States)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-09-01

    This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can have a feedback function for a surgical robot. Due to the difficulty in utilizing real human organs in the experiment, the cyberspace that features the virtual object is constructed to evaluate the performance of the haptic master. In order to realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model and is suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. In order to validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that tracking control of the torque trajectories from the virtual slave can be successfully achieved.
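
    The PID torque-tracking loop mentioned above is a standard building block; a generic discrete-time sketch is given below. The gains, the 1 kHz sample rate and the first-order actuator model are placeholders, not values from the paper.

```python
# Minimal discrete PID sketch for tracking a desired torque trajectory, as a
# generic illustration of the controller type named in the abstract.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt = 0.001                      # 1 kHz control loop (assumed)
pid = PID(kp=2.0, ki=30.0, kd=0.01, dt=dt)
torque = 0.0                    # measured actuator torque [N*m]
for step in range(5000):
    t = step * dt
    desired = 0.3 if t > 0.5 else 0.0                  # step reference at t = 0.5 s
    command = pid.update(desired, torque)
    torque += dt * (command - torque) / 0.05           # crude first-order actuator model
print(round(torque, 3))
```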

  13. Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work

    International Nuclear Information System (INIS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-01-01

    This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can have a feedback function for a surgical robot. Due to the difficulty in utilizing real human organs in the experiment, the cyberspace that features the virtual object is constructed to evaluate the performance of the haptic master. In order to realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model and is suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. In order to validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that tracking control of the torque trajectories from the virtual slave can be successfully achieved. (paper)

  14. Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling

    Science.gov (United States)

    Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.

    This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe different materials inside a deformable object as well as interact with it by touching the deformable object using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.
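
    As a hedged illustration of heterogeneous force feedback, the sketch below computes a penalty-style contact force whose stiffness is taken from the materials of the model nodes near the haptic contact point. It is not the constrained local static integration method described in the paper; the material table, search radius and averaging rule are assumptions.

```python
# Hedged sketch: a penalty-style contact force whose stiffness comes from the
# material assigned to each node near the haptic contact point, illustrating
# heterogeneous force feedback in general terms only.
import numpy as np

MATERIAL_STIFFNESS = {"fat": 150.0, "muscle": 600.0, "tumor": 2500.0}  # N/m, assumed

def contact_force(tool_pos, node_pos, node_material, radius=0.01):
    """Average the stiffness of nodes within `radius` of the tool and apply a
    penalty force along the penetration direction."""
    d = node_pos - tool_pos
    dist = np.linalg.norm(d, axis=1)
    near = dist < radius
    if not near.any():
        return np.zeros(3)
    k = np.mean([MATERIAL_STIFFNESS[node_material[i]] for i in np.where(near)[0]])
    penetration = radius - dist[near].min()
    direction = -d[near][dist[near].argmin()] / max(dist[near].min(), 1e-9)
    return k * penetration * direction

nodes = np.random.rand(200, 3) * 0.05
materials = ["fat" if p[2] < 0.02 else "muscle" for p in nodes]
print(contact_force(np.array([0.02, 0.02, 0.015]), nodes, materials))
```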

  15. One-Dimensional Haptic Rendering Using Audio Speaker with Displacement Determined by Inductance

    Directory of Open Access Journals (Sweden)

    Avin Khera

    2016-03-01

    We report overall design considerations and preliminary results for a new haptic rendering device based on an audio loudspeaker. Our application models tissue properties during microsurgery. For example, the device could respond to the tip of a tool by simulating a particular tissue, displaying a desired compressibility and viscosity, giving way as the tissue is disrupted, or exhibiting independent motion, such as that caused by pulsations in blood pressure. Although limited to one degree of freedom and with a relatively small range of displacement compared to other available haptic rendering devices, our design exhibits high bandwidth, low friction, low hysteresis, and low mass. These features are consistent with modeling interactions with delicate tissues during microsurgery. In addition, our haptic rendering device is designed to be simple and inexpensive to manufacture, in part through an innovative method of measuring displacement from existing variations in the speaker's inductance as the voice coil moves over the permanent magnet. Low latency and jitter are achieved by running the real-time simulation models on a dedicated microprocessor, while maintaining bidirectional communication with a standard laptop computer for user controls and data logging.
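
    A minimal sketch of the rendering loop described above, under stated assumptions: displacement is recovered from the measured voice-coil inductance via a hypothetical linear calibration, and the rendered force follows a simple spring-damper tissue model with a rupture depth. Neither the calibration constants nor the tissue parameters come from the paper.

```python
# Illustrative 1-DOF sketch: displacement from an assumed inductance calibration,
# force from a spring-damper (compressibility k, viscosity b) tissue model that
# gives way past a rupture depth. All numeric values are placeholders.

def displacement_from_inductance(l_measured_mH, l0_mH=1.80, slope_mH_per_mm=-0.05):
    """Invert an assumed linear inductance-vs-position calibration."""
    return (l_measured_mH - l0_mH) / slope_mH_per_mm   # mm

def tissue_force(x_mm, v_mm_s, k=0.4, b=0.002, rupture_mm=3.0):
    """Spring-damper force [N]; the 'tissue' gives way past a rupture depth."""
    if x_mm > rupture_mm:
        return 0.05 * v_mm_s * b      # residual drag after the tissue is disrupted
    return k * x_mm + b * v_mm_s

x_prev, dt = 0.0, 0.001
for l in (1.80, 1.79, 1.77, 1.74):    # fake inductance samples as the tool presses in
    x = displacement_from_inductance(l)
    v = (x - x_prev) / dt
    print(round(tissue_force(x, v), 3))
    x_prev = x
```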

  16. Virtual Reality and Haptics for Product Assembly

    Directory of Open Access Journals (Sweden)

    Maria Teresa Restivo

    2012-01-01

    Haptics can significantly enhance the user's sense of immersion and interactivity. An industrial application of virtual reality and haptics for product assembly is described in this paper, which provides a new and low-cost approach for product assembly design, assembly task planning and assembly operation training. A demonstration of the system with haptic device interaction was available at the session of exp.at'11.

  17. A study on haptic collaborative game in shared virtual environment

    Science.gov (United States)

    Lu, Keke; Liu, Guanyang; Liu, Lingzhi

    2013-03-01

    A study on a collaborative game in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which the players, located at remote sites, played the game together. Unlike in traditional networked multiplayer games, the player receives both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only and visual-haptic feedback. The goal of the experiment is to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance enhancement in a collaborative game in a shared virtual environment. The outcomes of this research can have a powerful impact on networked computer games.

  18. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    Science.gov (United States)

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

    Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  19. Development of a Virtual Guitar using Haptic Device

    OpenAIRE

    田村,真晴; 山下,英生

    2009-01-01

    In recent years, haptic devices that output force as one type of computer output device have been developed. Through the sensor of a haptic device, we can get the feeling of really touching a material that is simulated in a computer. In this research, a virtual guitar was developed in which the feeling of playing the guitar and the sound volume change according to the force input through a haptic device. With the haptic device we feel as if we are playing a genuine guitar. Moreover, it see...

  20. High-fidelity haptic and visual rendering for patient-specific simulation of temporal bone surgery.

    Science.gov (United States)

    Chan, Sonny; Li, Peter; Locketz, Garrett; Salisbury, Kenneth; Blevins, Nikolas H

    2016-12-01

    Medical imaging techniques provide a wealth of information for surgical preparation, but it is still often the case that surgeons are examining three-dimensional pre-operative image data as a series of two-dimensional images. With recent advances in visual computing and interactive technologies, there is much opportunity to provide surgeons an ability to actively manipulate and interpret digital image data in a surgically meaningful way. This article describes the design and initial evaluation of a virtual surgical environment that supports patient-specific simulation of temporal bone surgery using pre-operative medical image data. Computational methods are presented that enable six degree-of-freedom haptic feedback during manipulation, and that simulate virtual dissection according to the mechanical principles of orthogonal cutting and abrasive wear. A highly efficient direct volume renderer simultaneously provides high-fidelity visual feedback during surgical manipulation of the virtual anatomy. The resulting virtual surgical environment was assessed by evaluating its ability to replicate findings in the operating room, using pre-operative imaging of the same patient. Correspondences between surgical exposure, anatomical features, and the locations of pathology were readily observed when comparing intra-operative video with the simulation, indicating the predictive ability of the virtual surgical environment.
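
    The abrasive-wear dissection mentioned above can be illustrated with an Archard-style sketch in which the removed volume grows with normal force and sliding distance, applied to a voxel density volume. The wear coefficient and the voxel update rule are assumptions, not the formulation used in this work.

```python
# Hedged sketch of abrasive (burr) material removal in a voxel volume, loosely
# following an Archard-type wear law: removed volume ~ wear_rate * normal force
# * sliding distance. Values and the density update are placeholders.
import numpy as np

def apply_burr(volume, center_ijk, burr_radius_vox, normal_force_n,
               sliding_speed_mm_s, dt_s, wear_rate=0.02):
    """Reduce voxel densities inside the burr footprint; return removed mass."""
    di, dj, dk = np.indices(volume.shape)
    inside = ((di - center_ijk[0]) ** 2 + (dj - center_ijk[1]) ** 2
              + (dk - center_ijk[2]) ** 2) <= burr_radius_vox ** 2
    removal = wear_rate * normal_force_n * sliding_speed_mm_s * dt_s
    removed = np.minimum(volume[inside], removal).sum()
    volume[inside] = np.maximum(volume[inside] - removal, 0.0)
    return removed

bone = np.ones((64, 64, 64), dtype=np.float32)          # toy density volume
print(apply_burr(bone, (32, 32, 32), burr_radius_vox=3,
                 normal_force_n=1.5, sliding_speed_mm_s=20.0, dt_s=0.001))
```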

  1. Parametric model of the scala tympani for haptic-rendered cochlear implantation.

    Science.gov (United States)

    Todd, Catherine; Naghdy, Fazel

    2005-01-01

    A parametric model of the human scala tympani has been designed for use in a haptic-rendered computer simulation of cochlear implant surgery. It will be the first surgical simulator of this kind. A geometric model of the scala tympani has been derived from measured data for this purpose. The model is compared with two existing descriptions of the cochlear spiral. A first approximation of the basilar membrane is also produced. The structures are imported into a force-rendering software application for system development.

  2. Virtual reality haptic dissection.

    Science.gov (United States)

    Erolin, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-12-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills, before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  3. Perceiving haptic feedback in virtual reality simulators.

    Science.gov (United States)

    Våpenstad, Cecilie; Hofstad, Erlend Fagertun; Langø, Thomas; Mårvik, Ronald; Chmarra, Magdalena Karolina

    2013-07-01

    To improve patient safety, training of psychomotor laparoscopic skills is often done on virtual reality (VR) simulators outside the operating room. Haptic sensations have been found to influence psychomotor performance in laparoscopy. The emulation of haptic feedback is thus an important aspect of VR simulation. Some VR simulators try to simulate these sensations with handles equipped with haptic feedback. We conducted a survey on how laparoscopic surgeons perceive handles with and without haptic feedback. Surgeons with different levels of experience in laparoscopy were asked to test two handles: Xitact IHP with haptic feedback and Xitact ITP without haptic feedback (Mentice AB, Gothenburg, Sweden), connected to the LapSim (Surgical Science AB, Sweden) VR simulator. They performed two tasks on the simulator before answering 12 questions regarding the two handles. The surgeons were not informed about the differences in the handles. A total of 85 % of the 20 surgeons who participated in the survey claimed that it is important that handles with haptic feedback feel realistic. Ninety percent of the surgeons preferred the handles without haptic feedback. The friction in the handles with haptic feedback was perceived to be as in reality (5 %) or too high (95 %). Regarding the handles without haptic feedback, the friction was perceived as in reality (45 %), too low (50 %), or too high (5 %). A total of 85 % of the surgeons thought that the handle with haptic feedback attempts to simulate the resistance offered by tissue to deformation. Ten percent thought that the handle succeeds in doing so. The surveyed surgeons believe that haptic feedback is an important feature on VR simulators; however, they preferred the handles without haptic feedback because they perceived the handles with haptic feedback to add additional friction, making them unrealistic and not mechanically transparent.

  4. Image Based Rendering and Virtual Reality

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    The presentation provides an overview of image-based rendering approaches and their use in virtual reality, including virtual photography and cinematography, and mobile robot navigation.

  5. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality.

    Science.gov (United States)

    Zenner, André; Krüger, Antonio

    2017-04-01

    We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics and physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate how Shifty can, by automatically changing its internal weight distribution, enhance the user's perception of virtual objects interacted with in two experiments. In a first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty was shown to increase the user's fun and perceived realism significantly, compared to an equivalent passive haptic proxy. In a second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight and thus the perceived realism by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for visual-haptic mismatch perceived during the shifting process.

  6. Virtual reality haptic human dissection.

    Science.gov (United States)

    Needham, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-01-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  7. Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.

    Directory of Open Access Journals (Sweden)

    Min Li

    This paper proposes a pseudo-haptic feedback method conveying simulated soft surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information of a simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. A comparison with regard to (a) nodule detection sensitivity and (b) elapsed time as performance indicators in hard nodule detection experiments was conducted against a tablet computer incorporating vibration feedback. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction. It is noted that multi-point pseudo-haptic feedback performs similarly well when compared to a vibration-based feedback method on both performance measures, elapsed time and nodule detection sensitivity. This proves that the proposed method can be used to convey detailed haptic information for virtual environmental tasks, even subtle ones, using either a computer mouse or a pressure-sensitive device as an input device. This pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as, for example, occurring in ever more realistic video games with increasing emphasis on interaction with the physical environment and minimally invasive surgery in the form of soft tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, either using desktop computers or portable devices, showing
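
    A minimal pseudo-haptic sketch, assuming a simple control/display-gain formulation: the on-screen indenter slows down and the displayed surface indentation shrinks as the simulated stiffness grows. The mapping functions are illustrative only and are not taken from the paper.

```python
# Illustrative pseudo-haptic mapping under stated assumptions; not the authors' formulas.

def cd_gain(stiffness, k_ref=1.0):
    """Control/display gain: input motion is scaled down as stiffness grows."""
    return 1.0 / (1.0 + stiffness / k_ref)

def visual_indentation(pressure, stiffness):
    """Displayed surface indentation for a given input pressure."""
    return pressure / max(stiffness, 1e-6)

for region, k in (("soft tissue", 0.5), ("hard nodule", 5.0)):
    print(region, round(cd_gain(k), 2), round(visual_indentation(1.0, k), 2))
```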

  8. Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.

    Science.gov (United States)

    Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H

    2012-01-01

    In current research, haptic feedback in robot assisted interventions plays an important role. However most approaches to haptic feedback only regard the mapping of the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated in our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operative generated medical images or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest as well as helps to prevent the risk of damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS-systems.

  9. Effect on High versus Low Fidelity Haptic Feedback in a Virtual Reality Baseball Simulation

    DEFF Research Database (Denmark)

    Ryge, Andreas Nicolaj; Thomsen, Lui Albæk; Berthelsen, Theis

    2017-01-01

    In this paper we present a within-subjects study (n=26) comparing participants' experience of three kinds of haptic feedback (no haptic feedback, low fidelity haptic feedback and high fidelity haptic feedback) simulating the impact between a virtual baseball bat and ball. We noticed some minor ef...

  10. A Semi-automated Approach to Improve the Efficiency of Medical Imaging Segmentation for Haptic Rendering.

    Science.gov (United States)

    Banerjee, Pat; Hu, Mengqi; Kannan, Rahul; Krishnaswamy, Srinivasan

    2017-08-01

    The Sensimmer platform represents our ongoing research on simultaneous haptics and graphics rendering of 3D models. For simulation of medical and surgical procedures using Sensimmer, 3D models must be obtained from medical imaging data, such as magnetic resonance imaging (MRI) or computed tomography (CT). Image segmentation techniques are used to determine the anatomies of interest from the images. 3D models are obtained from segmentation, and their triangle reduction is required for graphics and haptics rendering. This paper focuses on creating 3D models by automating the segmentation of CT images based on pixel contrast, in order to integrate the interface between Sensimmer and medical imaging devices, using a volumetric approach, a Hough transform method, and a manual centering method. Hence, automating the process has reduced the segmentation time by 56.35% while maintaining the same accuracy of the output at ±2 voxels.

  11. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.

    Science.gov (United States)

    Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T

    2015-03-01

    With the decrease in the number of cerebral aneurysms treated surgically and the increase in complexity of those that are treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. The objective was to develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

  12. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    Science.gov (United States)

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  13. Surgical virtual reality - highlights in developing a high performance surgical haptic device.

    Science.gov (United States)

    Custură-Crăciun, D; Cochior, D; Constantinoiu, S; Neagu, C

    2013-01-01

    Just as simulators are a standard in aviation and aerospace sciences, we expect surgical simulators to soon become a standard in medical applications. These will correctly instruct future doctors in surgical techniques without there being a need for hands-on patient instruction. Using virtual reality by digitally transposing surgical procedures changes surgery in a revolutionary manner by offering possibilities for implementing new, much more efficient learning methods, by allowing the practice of new surgical techniques and by improving surgeon abilities and skills. Perfecting haptic devices has opened the door to a series of opportunities in the fields of research, industry, nuclear science and medicine. Concepts purely theoretical at first, such as telerobotics, telepresence or telerepresentation, have become a practical reality as computing techniques, telecommunications and haptic devices evolved, with virtual reality taking a new leap. In the field of surgery, barriers and controversies still remain regarding the implementation and generalization of surgical virtual simulators. These obstacles remain connected to the high costs of this not yet sufficiently developed technology, especially in the domain of haptic devices.

  14. Graphic and haptic simulation system for virtual laparoscopic rectum surgery.

    Science.gov (United States)

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2011-09-01

    Medical simulators with vision and haptic feedback techniques offer a cost-effective and efficient alternative to the traditional medical trainings. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system which will help to influence surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and real operation videos are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of intestines. Finally, a real-time cutting technique based on mesh is employed to represent the incision operation. Despite numerous research efforts, fast and realistic solutions of soft tissues with large deformation, such as intestines, prove extremely challenging. This paper introduces our latest contribution to this endeavour. With this system, the user can haptically operate with the virtual rectum and simultaneously watch the soft tissue deformation. Our system has been tested by colorectal surgeons who believe that the simulated tactile and visual feedbacks are realistic. It could replace the traditional training process and effectively transfer surgical skills to novices. Copyright © 2011 John Wiley & Sons, Ltd.
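
    As a hedged illustration of the kind of smoothing a radial-basis-function force filter performs, the sketch below interpolates sparsely sampled force magnitudes with Gaussian RBFs. The kernel width and the sample data are assumptions, not the authors' filter design.

```python
# Hedged sketch: smoothing sparsely sampled interaction forces with Gaussian
# radial basis function (RBF) interpolation, one generic way to obtain smooth
# force feedback. Kernel width and sample data are assumed, not from the paper.
import numpy as np

def rbf_fit(sample_pos, sample_force, epsilon=30.0):
    """Solve for RBF weights so the interpolant reproduces the samples."""
    d = np.linalg.norm(sample_pos[:, None, :] - sample_pos[None, :, :], axis=-1)
    phi = np.exp(-(epsilon * d) ** 2)
    return np.linalg.solve(phi, sample_force)

def rbf_eval(query_pos, sample_pos, weights, epsilon=30.0):
    d = np.linalg.norm(sample_pos - query_pos, axis=-1)
    return np.exp(-(epsilon * d) ** 2) @ weights

pos = np.random.rand(30, 3) * 0.1                    # sampled tool positions [m]
force = np.random.rand(30) * 2.0                     # sampled force magnitudes [N]
w = rbf_fit(pos, force)
print(round(float(rbf_eval(np.array([0.05, 0.05, 0.05]), pos, w)), 3))
```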

  15. Haptic virtual reality for skill acquisition in endodontics.

    Science.gov (United States)

    Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Gajananan, Kugamoorthy

    2010-01-01

    Haptic virtual reality (VR) has revolutionized skill acquisition in dentistry. The strength of the haptic VR system is that it can automatically record the outcome and associated kinematic data on how each step of the task is performed, which are not available in conventional skill training environments. The aim of this study was to assess skill acquisition in endodontics and to identify process and outcome variables for the quantification of proficiency. Twenty novices engaged in the experimental study, which involved practicing the access opening task with the haptic VR system. Process (speed, force utilization, and bimanual coordination) and outcome variables were determined for assessing skill performance. These values were compared before and after training. Significant improvements were observed through training in all variables. A unique force utilization pattern and bimanual coordination were observed in each step of the access opening in the post-training session. The novices also performed the tasks considerably faster with greater outcome within the first two to three training sessions. The study objectively showed that the novices could learn to perform access opening tasks faster and with more consistency, better bimanual dexterity, and better force utilization. The variables examined showed great promise as objective indicators of proficiency and skill acquisition in haptic VR.

  16. Hybrid rendering of the chest and virtual bronchoscopy [corrected].

    Science.gov (United States)

    Seemann, M D; Seemann, O; Luboldt, W; Gebicke, K; Prime, G; Claussen, C D

    2000-10-30

    Thin-section spiral computed tomography was used to acquire the volume data sets of the thorax. The tracheobronchial system and pathological changes of the chest were visualized using a color-coded surface rendering method. The structures of interest were then superimposed on a volume rendering of the other thoracic structures, thus producing a hybrid rendering. The hybrid rendering technique exploits the advantages of both rendering methods and enables virtual bronchoscopic examinations using different representation models. Virtual bronchoscopic examination with a transparent color-coded shaded-surface model enables the simultaneous visualization of both the airways and the adjacent structures behind the tracheobronchial wall and therefore offers a practical alternative to fiberoptic bronchoscopy. Hybrid rendering and virtual endoscopy obviate the need for time-consuming detailed analysis and presentation of axial source images.

  17. A kinesthetic washout filter for force-feedback rendering.

    Science.gov (United States)

    Danieau, Fabien; Lecuyer, Anatole; Guillotel, Philippe; Fleureau, Julien; Mollet, Nicolas; Christie, Marc

    2015-01-01

    Today haptic feedback can be designed and associated to audiovisual content (haptic-audiovisuals or HAV). Although there are multiple means to create individual haptic effects, the issue of how to properly adapt such effects on force-feedback devices has not been addressed and is mostly a manual endeavor. We propose a new approach for the haptic rendering of HAV, based on a washout filter for force-feedback devices. A body model and an inverse kinematics algorithm simulate the user's kinesthetic perception. Then, the haptic rendering is adapted in order to handle transitions between haptic effects and to optimize the amplitude of effects regarding the device capabilities. Results of a user study show that this new haptic rendering can successfully improve the HAV experience.
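
    A minimal sketch of the washout idea only: a first-order high-pass filter removes the sustained component of a commanded force so the device returns toward its neutral state, and the output is clamped to the device's force capability. The cutoff and limit are placeholders; the paper's method additionally uses a body model and inverse kinematics, which are not reproduced here.

```python
# Illustrative washout-style filter for force-feedback rendering; parameters are assumed.
import math

class WashoutFilter:
    def __init__(self, cutoff_hz, dt, force_limit_n):
        # Discrete first-order high-pass coefficient.
        self.alpha = 1.0 / (1.0 + 2.0 * math.pi * cutoff_hz * dt)
        self.force_limit = force_limit_n
        self.prev_in = 0.0
        self.prev_out = 0.0

    def step(self, force_in):
        # Passes transient effects, washes out sustained offsets toward zero.
        out = self.alpha * (self.prev_out + force_in - self.prev_in)
        self.prev_in, self.prev_out = force_in, out
        # Clamp to respect the device's force capability.
        return max(-self.force_limit, min(self.force_limit, out))

wf = WashoutFilter(cutoff_hz=2.0, dt=0.01, force_limit_n=10.0)
# A sustained 5 N command decays toward zero instead of saturating the device.
print([round(wf.step(5.0), 2) for _ in range(5)])
```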

  18. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    Science.gov (United States)

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality.

  19. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    Science.gov (United States)

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545

  20. Characterization of a smartphone size haptic rendering system based on thin-film AlN actuators on glass substrates

    Science.gov (United States)

    Bernard, F.; Casset, F.; Danel, J. S.; Chappaz, C.; Basrour, S.

    2016-08-01

    This paper presents for the first time the characterization of a smartphone-size haptic rendering system based on the friction modulation effect. According to previous work and finite element modeling, homogeneous flexural modes are needed to obtain the haptic feedback effect. The device studied consists of thin-film AlN transducers deposited on a 110 × 65 mm² glass substrate. The transducers' localization on the glass plate allows a transparent central area of 90 × 49 mm². Electrical and mechanical parameters of the system are extracted from measurement. From this extraction, electrical impedance matching reduced the applied voltage to 17.5 V AC and the power consumption to 1.53 W at the resonance frequency of the vibrating system to reach the haptic rendering specification. Transient characterizations of the actuation highlight a delay below the dynamic tactile detection threshold. The characterization of the AlN transducers used as sensors, including noise rejection, delay and output charge amplitude, allows high-accuracy detection of any variation due to external influences. These specifications are the first step toward a low-power-consumption feedback-looped system.

  1. Characterization of a smartphone size haptic rendering system based on thin-film AlN actuators on glass substrates

    International Nuclear Information System (INIS)

    Bernard, F; Basrour, S; Casset, F; Danel, J S; Chappaz, C

    2016-01-01

    This paper presents for the first time the characterization of a smartphone-size haptic rendering system based on the friction modulation effect. According to previous work and finite element modeling, homogeneous flexural modes are needed to obtain the haptic feedback effect. The device studied consists of thin-film AlN transducers deposited on a 110 × 65 mm² glass substrate. The transducers' localization on the glass plate allows a transparent central area of 90 × 49 mm². Electrical and mechanical parameters of the system are extracted from measurement. From this extraction, electrical impedance matching reduced the applied voltage to 17.5 V AC and the power consumption to 1.53 W at the resonance frequency of the vibrating system to reach the haptic rendering specification. Transient characterizations of the actuation highlight a delay below the dynamic tactile detection threshold. The characterization of the AlN transducers used as sensors, including noise rejection, delay and output charge amplitude, allows high-accuracy detection of any variation due to external influences. These specifications are the first step toward a low-power-consumption feedback-looped system. (paper)

  2. What you can't feel won't hurt you: Evaluating haptic hardware using a haptic contrast sensitivity function.

    Science.gov (United States)

    Salisbury, C M; Gillespie, R B; Tan, H Z; Barbagli, F; Salisbury, J K

    2011-01-01

    In this paper, we extend the concept of the contrast sensitivity function - used to evaluate video projectors - to the evaluation of haptic devices. We propose using human observers to determine if vibrations rendered using a given haptic device are accompanied by artifacts detectable to humans. This determination produces a performance measure that carries particular relevance to applications involving texture rendering. For cases in which a device produces detectable artifacts, we have developed a protocol that localizes deficiencies in device design and/or hardware implementation. In this paper, we present results from human vibration detection experiments carried out using three commercial haptic devices and one high performance voice coil motor. We found that all three commercial devices produced perceptible artifacts when rendering vibrations near human detection thresholds. Our protocol allowed us to pinpoint the deficiencies, however, and we were able to show that minor modifications to the haptic hardware were sufficient to make these devices well suited for rendering vibrations, and by extension, the vibratory components of textures. We generalize our findings to provide quantitative design guidelines that ensure the ability of haptic devices to proficiently render the vibratory components of textures.
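
    Detection thresholds of the kind this evaluation depends on are commonly measured with adaptive staircases; the sketch below runs a 1-up/2-down staircase against a simulated observer. The step size, starting level and observer model are assumptions, not the authors' protocol.

```python
# Illustrative 1-up/2-down adaptive staircase for estimating a vibration
# detection threshold; all numeric values and the simulated observer are assumed.
import random

def simulated_observer(amplitude_um, true_threshold_um=0.8):
    """Pretend subject: detects the vibration if it exceeds a noisy threshold."""
    return amplitude_um > true_threshold_um * random.uniform(0.8, 1.2)

def staircase(start_um=4.0, step_db=2.0, reversals_needed=8):
    level, correct_in_row, direction = start_um, 0, -1
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        if simulated_observer(level):
            correct_in_row += 1
            if correct_in_row == 2:              # two detections -> decrease level
                correct_in_row = 0
                if direction != -1:
                    reversal_levels.append(level)
                direction = -1
                level /= 10 ** (step_db / 20)
        else:                                    # one miss -> increase level
            correct_in_row = 0
            if direction != +1:
                reversal_levels.append(level)
            direction = +1
            level *= 10 ** (step_db / 20)
    return sum(reversal_levels[-6:]) / 6         # threshold estimate

print(round(staircase(), 2), "micrometers (estimated threshold)")
```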

  3. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    OpenAIRE

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and hea...

  4. Haptic Feedback in Motor Hand Virtual Therapy Increases Precision and Generates Less Mental Workload

    Directory of Open Access Journals (Sweden)

    Cristina Ramírez-Fernández

    2015-10-01

    Full Text Available In this work we show that haptic feedback in upper limb motor therapy improves performance and generates a lower mental workload. To demonstrate this, two groups of participants (healthy adults and elders with hand motor problems) used a low-cost haptic device (Novint Falcon) and a non-robotic device (Leap Motion Controller). Participants conducted the same rehabilitation task using a non-immersive virtual environment. Results show significant differences in precision for all participants when using the haptic feedback device. Additionally, participants in the older adult group demonstrated a lower mental workload while using the haptic device (Novint Falcon). Finally, qualitative results show that participants preferred to conduct their therapy exercises using the haptic device, as they found it more useful, easier to use and easier to learn.

  5. Neurosurgical tactile discrimination training with haptic-based virtual reality simulation.

    Science.gov (United States)

    Patel, Achal; Koshy, Nick; Ortega-Barnett, Juan; Chan, Hoi C; Kuo, Yong-Fan; Luciano, Cristian; Rizzi, Silvio; Matulyauskas, Martin; Kania, Patrick; Banerjee, Pat; Gasco, Jaime

    2014-12-01

    To determine if a computer-based simulation with haptic technology can help surgical trainees improve tactile discrimination using surgical instruments. Twenty junior medical students participated in the study and were randomized into two groups. Subjects in Group A participated in virtual simulation training using the ImmersiveTouch simulator (ImmersiveTouch, Inc., Chicago, IL, USA) that required differentiating the firmness of virtual spheres using tactile and kinesthetic sensation via haptic technology. Subjects in Group B did not undergo any training. With their visual fields obscured, subjects in both groups were then evaluated on their ability to use the suction and bipolar instruments to find six elastothane objects with areas ranging from 1.5 to 3.5 cm² embedded in a urethane foam brain cavity model while relying on tactile and kinesthetic sensation only. A total of 73.3% of the subjects in Group A (simulation training) were able to find the brain cavity objects in comparison to 53.3% of the subjects in Group B (no training) (P = 0.0183). There was a statistically significant difference in the total number of Group A subjects able to find smaller brain cavity objects (size ≤ 2.5 cm²) compared to that in Group B (72.5 vs. 40%, P = 0.0032). On the other hand, no significant difference in the number of subjects able to detect larger objects (size ≥ 3 cm²) was found between Groups A and B (75 vs. 80%, P = 0.7747). Virtual computer-based simulators with integrated haptic technology may improve tactile discrimination required for microsurgical technique.

  6. Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.

    Science.gov (United States)

    Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka

    2018-04-01

    This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. Although it is necessary to install small protrusions in the determined direction, by

  7. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    Directory of Open Access Journals (Sweden)

    Umberto Cugini

    2013-10-01

    Full Text Available In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user’s fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  8. Force sensitive handles and capacitive touch sensor for driving a flexible haptic-based immersive system.

    Science.gov (United States)

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-10-09

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  9. Control of an ER haptic master in a virtual slave environment for minimally invasive surgery applications

    International Nuclear Information System (INIS)

    Han, Young-Min; Choi, Seung-Bok

    2008-01-01

    This paper presents the control performance of an electrorheological (ER) fluid-based haptic master device connected to a virtual slave environment that can be used for minimally invasive surgery (MIS). An already developed haptic joint featuring controllable ER fluid and a spherical joint mechanism is adopted for the master system. Medical forceps and an angular position measuring device are devised and integrated with the joint to establish the MIS master system. In order to embody a human organ in virtual space, a volumetric deformable object is used. The virtual object is then mathematically formulated by a shape-retaining chain-linked (S-chain) model. After evaluating the reflection force, computation time and compatibility with real-time control, the haptic architecture for MIS is established by incorporating the virtual slave with the master device so that the reflection force for the object of the virtual slave and the desired position for the master operator are transferred to each other. In order to achieve the desired force trajectories, a sliding mode controller is formulated and then experimentally realized. Tracking control performances for various force trajectories are evaluated and presented in the time domain
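
    As a loose illustration of the sliding mode force control mentioned above, the sketch below tracks a desired reflection-force trajectory for a simple first-order actuator model. The plant, gains and boundary layer are illustrative assumptions, not the identified dynamics of the ER haptic joint.

        import numpy as np

        tau, k_u, dt = 0.05, 2.0, 1e-3     # hypothetical first-order actuator: dF/dt = (-F + k_u*u)/tau
        K, lam, phi = 5.0, 20.0, 0.5       # switching gain, sliding-surface slope, boundary layer

        def sat(x):
            # Smooth substitute for sign() to limit chattering inside the boundary layer.
            return np.clip(x, -1.0, 1.0)

        t = np.arange(0.0, 2.0, dt)
        F_des = 1.0 + 0.5 * np.sin(2 * np.pi * t)          # desired force trajectory [N]
        dF_des = 0.5 * 2 * np.pi * np.cos(2 * np.pi * t)

        F, err = 0.0, []
        for Fd, dFd in zip(F_des, dF_des):
            e = Fd - F                                     # force tracking error
            s = e                                          # sliding variable (relative degree 1)
            u = (tau * (dFd + lam * e) + F) / k_u + K * sat(s / phi)   # equivalent control + switching term
            F += dt * (-F + k_u * u) / tau                 # integrate the plant one step
            err.append(e)

        print("final |tracking error| =", abs(err[-1]))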

  10. Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback

    Science.gov (United States)

    Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.

    2014-01-01

    Background With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200

  11. Haptics in periodontics

    Directory of Open Access Journals (Sweden)

    Savita Abdulpur Mallikarjun

    2014-01-01

    Full Text Available Throughout history, education has evolved, and new teaching/learning methods have been developed. These methods have helped us come a long way in understanding the pathogenesis, diagnosis, and treatment of diseases of the oral cavity. However, there is still no good way to give a student or clinician the tactile sense for detecting calculus or caries, placing incisions, or judging the smoothness of a restoration, or for any other treatment procedure, before entering the clinic. In dental education, the sense of touch and force feedback can offer great improvements to existing learning methods, thus enhancing the quality of training procedures. The concept of Haptics, which is extensively used and indispensable in other fields such as aviation and telecommunication, is now making its way into dentistry. Against this background, the following write-up intends to provide a glimpse of the coming wave of Haptics - a virtual reality system in dental education - and discusses the strengths and weaknesses of this system.

  12. A haptic floor for interaction and diagnostics with goal based tasks during virtual reality supported balance training

    Directory of Open Access Journals (Sweden)

    Andrej Krpič

    2014-03-01

    Full Text Available Background: Balance training of patients after stroke is one of the primary tasks of physiotherapy after hospitalization. It is based on intensive training consisting of simple, repetitive, goal-based tasks. The tasks are carried out by physiotherapists, who follow predefined protocols. Introduction of a standing frame and virtual reality decreases the physical load and the number of physiotherapists required. The patients benefit in terms of safety and increased motivation. Additional feedback - a haptic floor - can enhance the virtual reality experience, add an additional level of difficulty and could also be used for generating postural perturbations. The purpose of this article is to examine whether haptic information can be used to identify specific anomalies in dynamic posturography. Methods: The performance and stability of the closed-loop system of the haptic floor were tested using frequency analysis. A postural response normative was set up from data assessed in four healthy individuals who were exposed to unexpected movements of the haptic floor in eight directions. Postural responses of a patient after stroke participating in virtual reality supported balance training, where collisions resulted in floor movements, were assessed and contrasted to the normative. Results: The haptic floor system was stable and controllable up to a frequency of 1.1 Hz, sufficient for the generation of postural perturbations. Responses obtained after perturbations in two major directions for a patient after stroke demonstrated noticeable deviations from the normative. Conclusions: The haptic floor design, together with a standing frame and virtual reality used for balance training, enables an assessment of directionally specific postural responses. The system was designed to identify postural disorders during balance training and rehabilitation progress outside specialized clinics, e.g. at the patient's home.

  13. Absence of modulatory action on haptic height perception with musical pitch

    Directory of Open Access Journals (Sweden)

    Michele eGeronazzo

    2015-09-01

    Full Text Available Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (i.e., high in pitch or low in pitch). Pitch-height is known to modulate (and interact with) the response of participants when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch-height extended to the haptic estimation of the height of a virtual step. We implemented a HW/SW setup which is able to render virtual 3D objects (stair-steps) haptically through a PHANTOM device, and to provide real-time continuous auditory feedback depending on the user interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point’s height within (i) a narrower and (ii) a wider pitch range, or (iii) with a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely, and to communicate height estimation by opening their thumb and index finger to mimic the step riser height, or verbally by reporting the height in centimeters of the step riser. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effects of musical pitch on high-realistic haptic feedback. Overall there is no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different haptic response distribution between musicians and non-musicians when estimations in the auditory conditions are matched with estimations in the no-sound condition.
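
    A minimal sketch of the kind of height-to-pitch mapping described above, assuming a linear mapping of the interaction point's height onto a frequency range; the riser extent and frequency limits are placeholders, not the study's actual parameters.

        def height_to_pitch(h_mm, h_min=0.0, h_max=40.0, f_low=200.0, f_high=600.0):
            """Map the haptic interaction point's height to a sine-tone frequency [Hz]."""
            h = min(max(h_mm, h_min), h_max)          # clamp to the virtual step riser extent
            return f_low + (f_high - f_low) * (h - h_min) / (h_max - h_min)

        # Continuous auditory feedback while the finger slides up the virtual step
        for h in (0.0, 10.0, 20.0, 40.0):
            print(f"height {h:4.1f} mm -> pitch {height_to_pitch(h):5.1f} Hz")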

  14. A Haptic Feedback Scheme to Accurately Position a Virtual Wrist Prosthesis Using a Three-Node Tactor Array.

    Directory of Open Access Journals (Sweden)

    Andrew Erwin

    Full Text Available In this paper, a novel haptic feedback scheme, used for accurately positioning a 1DOF virtual wrist prosthesis through sensory substitution, is presented. The scheme employs a three-node tactor array and discretely and selectively modulates the stimulation frequency of each tactor to relay 11 discrete haptic stimuli to the user. Able-bodied participants were able to move the virtual wrist prosthesis via a surface electromyography based controller. The participants evaluated the feedback scheme without visual or audio feedback and relied solely on the haptic feedback to correctly position the hand. The scheme was evaluated through both normal (perpendicular) and shear (lateral) stimulations applied on the forearm. Normal stimulations were applied through a prototype device previously developed by the authors, while shear stimulations were generated using a ubiquitous coin motor vibrotactor. Trials with no feedback served as a baseline to compare results within the study and to the literature. The results indicated that using normal and shear stimulations both resulted in accurately positioning the virtual wrist, and the two were not significantly different. Using haptic feedback was substantially better than no feedback. The results found in this study are significant since the feedback scheme allows for using relatively few tactors to relay rich haptic information to the user and can be learned easily despite a relatively short amount of training. Additionally, the results are important for the haptic community since they contradict the common conception in the literature that normal stimulation is inferior to shear. From an ergonomic perspective normal stimulation has the potential to benefit upper limb amputees since it can operate at lower frequencies than shear-based vibrotactors while also generating less noise. Through further tuning of the novel haptic feedback scheme and normal stimulation device, a compact and comfortable sensory substitution
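
    A rough sketch of the kind of encoding described above: 11 discrete wrist positions relayed through a three-tactor array by selecting which tactor vibrates and at which stimulation frequency. The particular position-to-(tactor, frequency) table and the wrist range are illustrative assumptions, not the authors' scheme.

        # Hypothetical lookup: 11 discrete positions -> (tactor index, stimulation frequency [Hz]).
        ENCODING = {
            0: (0, 60),  1: (0, 120), 2: (0, 180), 3: (0, 250),
            4: (1, 60),  5: (1, 120), 6: (1, 180),
            7: (2, 60),  8: (2, 120), 9: (2, 180), 10: (2, 250),
        }

        def encode_wrist_angle(angle_deg, angle_min=-60.0, angle_max=60.0):
            """Quantize a wrist angle into one of 11 haptic stimuli."""
            frac = (angle_deg - angle_min) / (angle_max - angle_min)
            level = max(0, min(10, round(frac * 10)))
            tactor, freq = ENCODING[level]
            return level, tactor, freq

        print(encode_wrist_angle(15.0))   # -> (quantized level, tactor index, frequency)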

  15. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review.

    Science.gov (United States)

    van der Meijden, O A J; Schijven, M P

    2009-06-01

    Virtual reality (VR) as surgical training tool has become a state-of-the-art technique in training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic feedback in the different areas of surgical skills (training) need to be answered before adding costly haptic feedback in VR simulation for MIS training. This study was designed to review the current status and value of haptic feedback in conventional and robot-assisted MIS and training by using virtual reality simulation. A systematic review of the literature was undertaken using PubMed and MEDLINE. The following search terms were used: Haptic feedback OR Haptics OR Force feedback AND/OR Minimal Invasive Surgery AND/OR Minimal Access Surgery AND/OR Robotics AND/OR Robotic Surgery AND/OR Endoscopic Surgery AND/OR Virtual Reality AND/OR Simulation OR Surgical Training/Education. The results were assessed according to level of evidence as reflected by the Oxford Centre of Evidence-based Medicine Levels of Evidence. In the current literature, no firm consensus exists on the importance of haptic feedback in performing minimally invasive surgery. Although the majority of the results show positive assessment of the benefits of force feedback, results are ambivalent and not unanimous on the subject. Benefits are least disputed when related to surgery using robotics, because there is no haptic feedback in currently used robotics. The addition of haptics is believed to reduce surgical errors resulting from a lack of it, especially in knot tying. Little research has been performed in the area of robot-assisted endoscopic surgical training, but results seem promising. Concerning VR training, results indicate that haptic feedback is important during the early phase of psychomotor skill acquisition.

  16. AR Feels "Softer" than VR: Haptic Perception of Stiffness in Augmented versus Virtual Reality.

    Science.gov (United States)

    Gaffary, Yoren; Le Gouis, Benoit; Marchal, Maud; Argelaguet, Ferran; Arnaldi, Bruno; Lecuyer, Anatole

    2017-11-01

    Does it feel the same when you touch an object in Augmented Reality (AR) or in Virtual Reality (VR)? In this paper we study and compare the haptic perception of stiffness of a virtual object in two situations: (1) a purely virtual environment versus (2) a real and augmented environment. We have designed an experimental setup based on a Microsoft HoloLens and a haptic force-feedback device, enabling to press a virtual piston, and compare its stiffness successively in either Augmented Reality (the virtual piston is surrounded by several real objects all located inside a cardboard box) or in Virtual Reality (the same virtual piston is displayed in a fully virtual scene composed of the same other objects). We have conducted a psychophysical experiment with 12 participants. Our results show a surprising bias in perception between the two conditions. The virtual piston is on average perceived stiffer in the VR condition compared to the AR condition. For instance, when the piston had the same stiffness in AR and VR, participants would select the VR piston as the stiffer one in 60% of cases. This suggests a psychological effect as if objects in AR would feel "softer" than in pure VR. Taken together, our results open new perspectives on perception in AR versus VR, and pave the way to future studies aiming at characterizing potential perceptual biases.

  17. Control of repulsive force in a virtual environment using an electrorheological haptic master for a surgical robot application

    Science.gov (United States)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-01-01

    This paper presents control performances of a new type of four-degrees-of-freedom (4-DOF) haptic master that can be used for robot-assisted minimally invasive surgery (RMIS). By adopting a controllable electrorheological (ER) fluid, the function of the proposed master is realized as a haptic feedback as well as remote manipulation. In order to verify the efficacy of the proposed master and method, an experiment is conducted with deformable objects featuring human organs. Since the use of real human organs is difficult for control due to high cost and moral hazard, an excellent alternative method, the virtual reality environment, is used for control in this work. In order to embody a human organ in the virtual space, the experiment adopts a volumetric deformable object represented by a shape-retaining chain linked (S-chain) model which has salient properties such as fast and realistic deformation of elastic objects. In haptic architecture for RMIS, the desired torque/force and desired position originating from the object of the virtual slave and operator of the haptic master are transferred to each other. In order to achieve the desired torque/force trajectories, a sliding mode controller (SMC) which is known to be robust to uncertainties is designed and empirically implemented. Tracking control performances for various torque/force trajectories from the virtual slave are evaluated and presented in the time domain.

  18. Control of repulsive force in a virtual environment using an electrorheological haptic master for a surgical robot application

    International Nuclear Information System (INIS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-01-01

    This paper presents control performances of a new type of four-degrees-of-freedom (4-DOF) haptic master that can be used for robot-assisted minimally invasive surgery (RMIS). By adopting a controllable electrorheological (ER) fluid, the function of the proposed master is realized as a haptic feedback as well as remote manipulation. In order to verify the efficacy of the proposed master and method, an experiment is conducted with deformable objects featuring human organs. Since the use of real human organs is difficult for control due to high cost and moral hazard, an excellent alternative method, the virtual reality environment, is used for control in this work. In order to embody a human organ in the virtual space, the experiment adopts a volumetric deformable object represented by a shape-retaining chain linked (S-chain) model which has salient properties such as fast and realistic deformation of elastic objects. In haptic architecture for RMIS, the desired torque/force and desired position originating from the object of the virtual slave and operator of the haptic master are transferred to each other. In order to achieve the desired torque/force trajectories, a sliding mode controller (SMC) which is known to be robust to uncertainties is designed and empirically implemented. Tracking control performances for various torque/force trajectories from the virtual slave are evaluated and presented in the time domain. (paper)

  19. a New ER Fluid Based Haptic Actuator System for Virtual Reality

    Science.gov (United States)

    Böse, H.; Baumann, M.; Monkman, G. J.; Egersdörfer, S.; Tunayar, A.; Freimuth, H.; Ermert, H.; Khaled, W.

    The concept and some steps in the development of a new actuator system which enables the haptic perception of mechanically inhomogeneous virtual objects are introduced. The system consists of a two-dimensional planar array of actuator elements containing an electrorheological (ER) fluid. When a user presses his fingers onto the surface of the actuator array, he perceives locally variable resistance forces generated by vertical pistons which slide in the ER fluid through the gaps between electrode pairs. The voltage in each actuator element can be individually controlled by a novel sophisticated switching technology based on optoelectric gallium arsenide elements. The haptic information which is represented at the actuator array can be transferred from a corresponding sensor system based on ultrasonic elastography. The combined sensor-actuator system may serve as a technology platform for various applications in virtual reality, like telemedicine where the information on the consistency of tissue of a real patient is detected by the sensor part and recorded by the actuator part at a remote location.

  20. Mucosal detail at CT virtual reality: surface versus volume rendering.

    Science.gov (United States)

    Hopper, K D; Iyriboz, A T; Wise, S W; Neuman, J D; Mauger, D T; Kasales, C J

    2000-02-01

    To evaluate computed tomographic virtual reality with volumetric versus surface rendering. Virtual reality images were reconstructed for 27 normal or pathologic colonic, gastric, or bronchial structures in four ways: the transition zone (a) reconstructed separately from the wall by using volume rendering; (b) with attenuation equal to air; (c) with attenuation equal to wall (soft tissue); (d) with attenuation halfway between air and wall. The four reconstructed images were randomized. Four experienced imagers blinded to the reconstruction graded them from best to worst with predetermined criteria. All readers rated images with the transition zone reconstructed as a separate structure as overwhelmingly superior. Virtual reality is best with volume rendering, with the transition zone (mucosa) between the wall and air reconstructed as a separate structure.

  1. Haptic Systems for Post-Stroke Rehabilitation: from Virtual Reality to Remote Rehabilitation

    OpenAIRE

    Daud, Omar Andres

    2011-01-01

    Haptic devices are becoming a common and significant tool in robotic neurorehabilitation for motor learning, particularly in post-stroke patients. As a standard approach, these devices are used in a local environment, where the patient interacts with a virtual environment recreated on the computer's screen. In this sense, a general framework for virtual reality based rehabilitation was developed. All the features of the framework, such as the control loop and the ext...

  2. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
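
    The first of the four steps listed above, color-based segmentation of the hand, can be approximated with a simple chroma threshold. The sketch below uses OpenCV with an illustrative HSV skin range; the range and morphological clean-up are assumptions, not the calibration used by the authors.

        import cv2
        import numpy as np

        def segment_hand(frame_bgr):
            """Return a binary mask of skin-colored pixels (rough stand-in for the hand-segmentation step)."""
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            lower = np.array([0, 40, 60], dtype=np.uint8)      # illustrative HSV skin range
            upper = np.array([25, 180, 255], dtype=np.uint8)
            mask = cv2.inRange(hsv, lower, upper)
            kernel = np.ones((5, 5), np.uint8)                 # remove speckle before compositing
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
            mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
            return mask

        # mask = segment_hand(cv2.imread("camera_frame.png"))   # hypothetical input frame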

  3. Faster acquisition of laparoscopic skills in virtual reality with haptic feedback and 3D vision.

    Science.gov (United States)

    Hagelsteen, Kristine; Langegård, Anders; Lantz, Adam; Ekelund, Mikael; Anderberg, Magnus; Bergenfelz, Anders

    2017-10-01

    The study investigated whether 3D vision and haptic feedback in combination in a virtual reality environment leads to more efficient learning of laparoscopic skills in novices. Twenty novices were allocated to two groups. All completed a training course in the LapSim ® virtual reality trainer consisting of four tasks: 'instrument navigation', 'grasping', 'fine dissection' and 'suturing'. The study group performed with haptic feedback and 3D vision and the control group without. Before and after the LapSim ® course, the participants' metrics were recorded when tying a laparoscopic knot in the 2D video box trainer Simball ® Box. The study group completed the training course in 146 (100-291) minutes compared to 215 (175-489) minutes in the control group (p = .002). The number of attempts to reach proficiency was significantly lower. The study group had significantly faster learning of skills in three out of four individual tasks: instrument navigation, grasping and suturing. Using the Simball ® Box, no difference in laparoscopic knot tying after the LapSim ® course was noted when comparing the groups. Laparoscopic training in virtual reality with 3D vision and haptic feedback made training more time efficient and did not negatively affect later video box-performance in 2D.

  4. Finite Element Methods for real-time Haptic Feedback of Soft-Tissue Models in Virtual Reality Simulators

    Science.gov (United States)

    Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)

    2001-01-01

    We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
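
    The solver proposed above is a Jacobi-preconditioned conjugate gradient method applied to a reduced system. The sketch below is a generic dense-matrix version of that solver for a small symmetric positive definite system, meant only to illustrate the preconditioning step, not the surface-decomposed FEM formulation itself.

        import numpy as np

        def jacobi_pcg(A, b, tol=1e-8, max_iter=200):
            """Conjugate gradients with a Jacobi (diagonal) preconditioner for SPD A."""
            M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner: inverse of diag(A)
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # Tiny SPD test system standing in for the reduced stiffness system
        A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
        b = np.array([1.0, 2.0, 3.0])
        print(jacobi_pcg(A, b), np.linalg.solve(A, b))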

  5. Augmented kinematic feedback from haptic virtual reality for dental skill acquisition.

    Science.gov (United States)

    Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Jittimanee, Pannapa; Viratket, Piyanuch

    2010-12-01

    We have developed a haptic virtual reality system for dental skill training. In this study we examined whether several kinds of kinematic information about the movement, provided by the system to supplement knowledge of results (KR), aid dental skill acquisition. The kinematic variables examined involved force utilization (F) and mirror view (M). This created three experimental conditions that received augmented kinematic feedback (F, M, FM) and one control condition that did not (KR-only). Thirty-two dental students were randomly assigned to four groups. Their task was to perform access opening on the upper first molar with the haptic virtual reality system. An acquisition session consisted of two days of ten trials of practice in which augmented kinematic feedback was provided for the appropriate experimental conditions after each trial. One week later, a retention test consisting of two trials without augmented feedback was completed. The results showed that the augmented kinematic feedback groups had larger mean performance scores than the KR-only group on Day 1 of the acquisition and retention sessions (ANOVA, p < 0.05). The trends in the acquisition and retention sessions suggest that augmented kinematic feedback can enhance performance early in skill acquisition and retention.

  6. Effects of 3D virtual haptics force feedback on brand personality perception: the mediating role of physical presence in advergames.

    Science.gov (United States)

    Jin, Seung-A Annie

    2010-06-01

    This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.

  7. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review

    NARCIS (Netherlands)

    van der Meijden, O. A. J.; Schijven, M. P.

    2009-01-01

    BACKGROUND: Virtual reality (VR) as surgical training tool has become a state-of-the-art technique in training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic

  8. Study of Co-Located and Distant Collaboration with Symbolic Support via a Haptics-Enhanced Virtual Reality Task

    Science.gov (United States)

    Yeh, Shih-Ching; Hwang, Wu-Yuin; Wang, Jin-Liang; Zhan, Shi-Yi

    2013-01-01

    This study intends to investigate how multi-symbolic representations (text, digits, and colors) could effectively enhance the completion of co-located/distant collaborative work in a virtual reality context. Participants' perceptions and behaviors were also studied. A haptics-enhanced virtual reality task was developed to conduct…

  9. The development of a haptic virtual reality environment to study body image and affect.

    Science.gov (United States)

    Tremblay, Line; Bouchard, Stephane; Chebbi, Brahim; Wei, Lai; Monthuy-Blanc, Johana; Boulanger, Dominic

    2013-01-01

    We report the results of a preliminary study testing the effect of participants' mood on visual motor performance when using a haptic device to manipulate a cartoonish human body. Our results suggest that moods involving high arousal (e.g. happiness) produce larger movements, whereas moods involving low arousal (e.g. sadness) produce slower performance. Our results are used for the development of a new haptic virtual reality application that we briefly present here. This application is intended to create a more interactive and motivational environment to treat body image issues and for emotional communication.

  10. UPPER LIMB FUNCTIONAL ASSESSMENT USING HAPTIC INTERFACE

    Directory of Open Access Journals (Sweden)

    Aleš Bardorfer

    2004-12-01

    Full Text Available A new method for the assessment of the upper limb (UL) functional state, using a haptic interface, is presented. A haptic interface is used as a measuring device, capable of providing objective, repeatable and quantitative data on UL motion. A patient is presented with a virtual environment, both graphically via a computer screen and haptically via the Phantom Premium 1.5 haptic interface. The setup allows the patient to explore and feel the virtual environment with three of his/her senses: sight, hearing and, most important, touch. Specially designed virtual environments are used to assess the patient’s UL movement capabilities. The tests range from tracking tasks (to assess the accuracy of movement), tracking tasks with added disturbances in the form of random forces (to assess the patient’s control abilities), and a labyrinth test (to assess both speed and accuracy), to a final test measuring the maximal force capacity of the UL. A comprehensive study, using the developed measurement setup within the

  11. The virtual haptic back: A simulation for training in palpatory diagnosis

    Directory of Open Access Journals (Sweden)

    Eland David C

    2008-04-01

    Full Text Available Background: Models and simulations are finding increased roles in medical education. The Virtual Haptic Back (VHB) is a virtual reality simulation of the mechanical properties of the human back designed as an aid to teaching clinical palpatory diagnosis. Methods: Eighty-nine first year medical students of the Ohio University College of Osteopathic Medicine carried out six 15-minute practice sessions with the VHB, plus tests before and after the sessions, in order to monitor progress in identifying regions of simulated abnormal tissue compliance. Students palpated with two digits, fingers or thumbs, by placing them in gimbaled thimbles at the ends of PHANToM 3.0® haptic interface arms. The interface simulated the contours and compliance of the back surface by the action of electric motors. The motors limited the compression of the virtual tissues induced by the palpating fingers by generating counterforces. Users could see the position of their fingers with respect to the back on a video monitor just behind the plane of the haptic back. The abnormal region varied randomly among 12 locations between trials. During the practice sessions student users received immediate feedback following each trial, indicating either a correct choice or the actual location of the abnormality if an incorrect choice had been made. This allowed the user to feel the actual abnormality before going on to the next trial. Changes in accuracy, speed and Weber fraction across practice sessions were analyzed using a repeated measures analysis of variance. Results: Students improved in accuracy and speed of diagnosis with practice. The smallest difference in simulated tissue compliance users were able to detect improved from 28% (SD = 9.5%) to 14% (SD = 4.4%) during the practice sessions, while average detection time decreased from 39 (SD = 19.8) to 17 (SD = 11.7) seconds. When asked in anonymous evaluation questionnaires if they judged the VHB practice to be helpful to
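
    The Weber fraction quoted above (28% improving to 14%) is the just-noticeable compliance difference divided by the reference compliance; a minimal worked computation with illustrative numbers is sketched below.

        def weber_fraction(reference_compliance, smallest_detectable_difference):
            """Weber fraction = just-noticeable difference / reference value."""
            return smallest_detectable_difference / reference_compliance

        # Illustrative values only: a reference compliance of 1.0 (arbitrary units) and
        # detectable differences before and after the six practice sessions.
        before = weber_fraction(1.0, 0.28)
        after = weber_fraction(1.0, 0.14)
        print(f"Weber fraction: {before:.0%} before training, {after:.0%} after")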

  12. Investigation of Virtual Digital Human and Robotic Device Technology Merger Complimented by Haptics and Autostereoscopic Displays, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — As expected, the STTR Phase I investigation confirmed that the Digital Virtual Human (DVH) and Robonaut technologies can be merged, and that haptic and...

  13. Lack of transfer of skills after virtual reality simulator training with haptic feedback.

    Science.gov (United States)

    Våpenstad, Cecilie; Hofstad, Erlend Fagertun; Bø, Lars Eirik; Kuhry, Esther; Johnsen, Gjermund; Mårvik, Ronald; Langø, Thomas; Hernes, Toril Nagelhus

    2017-12-01

    Virtual reality (VR) simulators enrich surgical training and offer training possibilities outside of the operating room (OR). In this study, we created a criterion-based training program on a VR simulator with haptic feedback and tested it by comparing the performances of a simulator group against a control group. Medical students with no experience in laparoscopy were randomly assigned to a simulator group or a control group. In the simulator group, the candidates trained until they reached predefined criteria on the LapSim ® VR simulator (Surgical Science AB, Göteborg, Sweden) with haptic feedback (Xitact TM IHP, Mentice AB, Göteborg, Sweden). All candidates performed a cholecystectomy on a porcine organ model in a box trainer (the clinical setting). The performances were video rated by two surgeons blinded to subject training status. In total, 30 students performed the cholecystectomy and had their videos rated (N = 16 simulator group, N = 14 control group). The control group achieved better video rating scores than the simulator group. The criterion-based training program did not transfer skills to the clinical setting. Poor mechanical performance of the simulated haptic feedback is believed to have resulted in a negative training effect.

  14. Intelligent Avatar on E-Learning Using Facial Expression an Haptic

    Directory of Open Access Journals (Sweden)

    Ahmad Hoirul Basori

    2011-04-01

    Full Text Available The process of introducing emotion can be improved through a three-dimensional (3D) tutoring system. The problem that remains unsolved is how to provide a realistic tutor (avatar) in a virtual environment. This paper proposes an approach to teach children to understand emotion sensation through facial expression and the sense of touch (haptics). The algorithm calculates a constant factor (f) based on the maximum RGB value and the maximum magnitude force, and then associates the magnitude force range with a particular colour. The integration process starts by rendering the facial expression, followed by adjusting the vibration power to the emotion value. In the experiment, around 71% of students agreed with the classification of magnitude force into emotion representations. Respondents commented that a high magnitude force creates a sensation similar to what they feel when angry, while a low magnitude force is more relaxing. Respondents also said that the haptic and facial expression feedback is very interactive and realistic.
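
    A loose sketch of the mapping described above, in which a constant factor relates the maximum RGB value to the maximum magnitude force so that a given force level can be tied to a colour intensity and a coarse emotion label. The maximum force and the band boundaries are illustrative assumptions, not the values used in the paper.

        MAX_RGB = 255.0
        MAX_FORCE = 3.3                  # hypothetical maximum vibration magnitude force
        F_CONST = MAX_RGB / MAX_FORCE    # constant factor f relating the colour scale to the force scale

        def force_to_feedback(force):
            """Associate a magnitude force with a colour intensity and a coarse emotion label."""
            force = min(max(force, 0.0), MAX_FORCE)
            red = int(force * F_CONST)                   # stronger force -> more saturated red
            if force > 0.66 * MAX_FORCE:
                emotion = "anger"                        # high magnitude reads as anger
            elif force > 0.33 * MAX_FORCE:
                emotion = "neutral"
            else:
                emotion = "relaxed"                      # low magnitude reads as relaxing
            return (red, 0, 0), emotion

        print(force_to_feedback(3.0))
        print(force_to_feedback(0.5))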

  15. Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.

    Science.gov (United States)

    Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J

    2011-11-01

    To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Construct validity and expert benchmarking of the haptic virtual reality dental simulator.

    Science.gov (United States)

    Suebnukarn, Siriwan; Chaisombat, Monthalee; Kongpunwijit, Thanapohn; Rhienmora, Phattanapon

    2014-10-01

    The aim of this study was to demonstrate construct validation of the haptic virtual reality (VR) dental simulator and to define expert benchmarking criteria for skills assessment. Thirty-four self-selected participants (fourteen novices, fourteen intermediates, and six experts in endodontics) at one dental school performed ten repetitions of three mode tasks of endodontic cavity preparation: easy (mandibular premolar with one canal), medium (maxillary premolar with two canals), and hard (mandibular molar with three canals). The virtual instrument's path length was registered by the simulator. The outcomes were assessed by an expert. The error scores in easy and medium modes accurately distinguished the experts from novices and intermediates at the onset of training, when there was a significant difference between groups (ANOVA, p<0.05). The trend was consistent until trial 5. From trial 6 on, the three groups achieved similar scores. No significant difference was found between groups at the end of training. Error score analysis was not able to distinguish any group at the hard level of training. Instrument path length showed a difference in performance according to groups at the onset of training (ANOVA, p<0.05). This study established construct validity for the haptic VR dental simulator by demonstrating its discriminant capabilities between that of experts and non-experts. The experts' error scores and path length were used to define benchmarking criteria for optimal performance.
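
    The instrument path length used as a metric above can be computed directly from sampled tool-tip positions, as in the short sketch below; the trajectory shown is synthetic, not simulator log data.

        import numpy as np

        def path_length(positions):
            """Total distance travelled by the virtual instrument tip (positions: N x 3)."""
            steps = np.diff(np.asarray(positions, dtype=float), axis=0)
            return float(np.linalg.norm(steps, axis=1).sum())

        # Synthetic trajectory standing in for logged simulator data
        traj = [(0, 0, 0), (1, 0, 0), (1, 2, 0), (1, 2, 3)]
        print(path_length(traj))   # 1 + 2 + 3 = 6.0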

  17. Neurosurgery simulation using non-linear finite element modeling and haptic interaction

    Science.gov (United States)

    Lee, Huai-Ping; Audette, Michel; Joldes, Grand R.; Enquobahrie, Andinet

    2012-02-01

    Real-time surgical simulation is becoming an important component of surgical training. To meet the real-time requirement, however, the accuracy of the biomechanical modeling of soft tissue is often compromised due to computing resource constraints. Furthermore, haptic integration presents an additional challenge with its requirement for a high update rate. As a result, most real-time surgical simulation systems employ a linear elasticity model, simplified numerical methods such as the boundary element method or spring-particle systems, and coarse volumetric meshes. However, these systems are not clinically realistic. We present here an ongoing work aimed at developing an efficient and physically realistic neurosurgery simulator using a non-linear finite element method (FEM) with haptic interaction. Real-time finite element analysis is achieved by utilizing the total Lagrangian explicit dynamic (TLED) formulation and GPU acceleration of per-node and per-element operations. We employ a virtual coupling method for separating deformable body simulation and collision detection from haptic rendering, which needs to be updated at a much higher rate than the visual simulation. The system provides accurate biomechanical modeling of soft tissue while retaining real-time performance with haptic interaction. However, our experiments showed that the stability of the simulator depends heavily on the material properties of the tissue and the speed of colliding objects. Hence, additional efforts including dynamic relaxation are required to improve the stability of the system.
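
    The virtual coupling mentioned above connects the haptic handle to a simulation proxy through a spring-damper, so that the slow deformable-body update and the fast haptic loop can run at different rates. The sketch below shows that coupling force with illustrative stiffness and damping values, not the paper's tuning.

        import numpy as np

        K_C = 800.0    # coupling stiffness [N/m] (illustrative)
        B_C = 2.0      # coupling damping [N*s/m] (illustrative)

        def coupling_force(device_pos, device_vel, proxy_pos, proxy_vel):
            """Spring-damper virtual coupling between the haptic device and the simulation proxy.

            The same force with opposite sign is applied to the proxy in the slower
            deformable-body simulation, while the haptic loop renders it at ~1 kHz.
            """
            return K_C * (proxy_pos - device_pos) + B_C * (proxy_vel - device_vel)

        f = coupling_force(np.array([0.01, 0.0, 0.0]), np.zeros(3),
                           np.array([0.0, 0.0, 0.0]), np.zeros(3))
        print(f)    # force pulling the device toward the proxy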

  18. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    Science.gov (United States)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved with the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment capable of immersing the trainees into a virtual setting where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  19. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing

    Directory of Open Access Journals (Sweden)

    Sara Invitto

    2016-03-01

    Full Text Available In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user’s hand and fingers, which are reproduced on a computer screen by the proper software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning.

  20. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing.

    Science.gov (United States)

    Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T

    2016-03-18

    In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the proper software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning.

  1. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing

    Science.gov (United States)

    Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T.

    2016-01-01

    In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user’s hand and fingers, which are reproduced on a computer screen by the proper software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session, the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which are entrusted with visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning. PMID:26999151

  2. Modulation of visually evoked postural responses by contextual visual, haptic and auditory information: a 'virtual reality check'.

    Science.gov (United States)

    Meyer, Georg F; Shao, Fei; White, Mark D; Hopkins, Carl; Robotham, Antony J

    2013-01-01

    Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPR). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore what factors modulate VEPRs in a high quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points that can be haptic, visual and auditory reference signals; 2) real objects and their matching virtual reality representations as visual anchors have different effects on postural sway; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses for laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high fidelity virtual environments should mimic those seen in real situations we propose to use the observed effect as a robust objective test for presence and fidelity in VR.

  3. Frictional Compliant Haptic Contact and Deformation of Soft Objects

    Directory of Open Access Journals (Sweden)

    Naci Zafer

    2016-05-01

    Full Text Available This paper is concerned with compliant haptic contact and deformation of soft objects. A human soft fingertip model is considered to act as the haptic interface and is brought into contact with and deforms a discrete surface. A nonlinear constitutive law is developed in predicting normal forces and, for the haptic display of surface texture, motions along the surface are also resisted at various rates by accounting for dynamic Lund-Grenoble (LuGre) frictional forces. For the soft fingertip to apply forces over an area larger than a point, normal and frictional forces are distributed around the soft fingertip contact location on the deforming surface. The distribution is realized based on a kernel smoothing function and by a nonlinear spring-damper net around the contact point. Experiments conducted demonstrate the accuracy and effectiveness of our approach in real-time haptic rendering of a kidney surface. The resistive (interaction) forces are applied at the user fingertip bone edge. A 3-DoF parallel robotic manipulator equipped with a constraint based controller is used for the implementation. By rendering forces both in lateral and normal directions, the designed haptic interface system allows the user to realistically feel both the geometrical and mechanical (nonlinear) properties of the deforming kidney.
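
    A minimal Python sketch of the kind of LuGre (Lund-Grenoble) friction update referenced in this record; the parameter values are illustrative assumptions, and the fingertip contact model and kernel-smoothed force distribution described in the abstract are omitted.

    ```python
    import numpy as np

    def lugre_step(v, z, dt, sigma0=1e3, sigma1=30.0, sigma2=0.4,
                   Fc=0.5, Fs=0.8, vs=0.01):
        """One explicit-Euler step of the LuGre friction model.
        v: sliding velocity [m/s]; z: internal bristle deflection state [m].
        Returns (friction force [N], updated bristle state)."""
        g = Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)   # Stribeck curve
        dz = v - sigma0 * abs(v) / g * z              # bristle dynamics
        z = z + dz * dt
        return sigma0 * z + sigma1 * dz + sigma2 * v, z

    # toy usage: a fingertip proxy sliding back and forth at a 1 kHz haptic rate
    z, dt = 0.0, 1e-3
    for k in range(1000):
        v = 0.05 * np.sin(2 * np.pi * k * dt)         # prescribed tangential velocity
        F_friction, z = lugre_step(v, z, dt)
    ```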

  4. Haptic Manipulation of Deformable Objects in Hybrid Bilateral Teleoperation System

    Directory of Open Access Journals (Sweden)

    Juan Manuel Ibarra-Zannatha

    2007-01-01

    Full Text Available The aim of this work is the integration of a virtual environment containing a deformable object, manipulated by an open kinematic chain virtual slave robot, into a bilateral teleoperation scheme based on a real haptic device. The virtual environment of this hybrid bilateral teleoperation system combines collision detection algorithms, dynamical, kinematical and geometrical models with a position–position and/or force–position bilateral control algorithm, to produce on the operator side the reflected forces corresponding to the virtual mechanical interactions, through a haptic device. A contact teleoperation task over the virtual environment with a flexible object is implemented and analysed.

  5. A perspective on the role and utility of haptic feedback in laparoscopic skills training.

    Science.gov (United States)

    Singapogu, Ravikiran; Burg, Timothy; Burg, Karen J L; Smith, Dane E; Eckenrode, Amanda H

    2014-01-01

    Laparoscopic surgery is a minimally invasive surgical technique with significant potential benefits to the patient, including shorter recovery time, less scarring, and decreased costs. There is a growing need to teach surgical trainees this emerging surgical technique. Simulators, ranging from simple "box" trainers to complex virtual reality (VR) trainers, have emerged as the most promising method for teaching basic laparoscopic surgical skills. Current box trainers require oversight from an expert surgeon for both training and assessing skills. VR trainers decrease the dependence on expert teachers during training by providing objective, real-time feedback and automatic skills evaluation. However, current VR trainers generally have limited credibility as a means to prepare new surgeons and have often fallen short of educators' expectations. Several researchers have speculated that the missing component in modern VR trainers is haptic feedback, which refers to the range of touch sensations encountered during surgery. These force types and ranges need to be adequately rendered by simulators for a more complete training experience. This article presents a perspective of the role and utility of haptic feedback during laparoscopic surgery and laparoscopic skills training by detailing the ranges and types of haptic sensations felt by the operating surgeon, along with quantitative studies of how this feedback is used. Further, a number of research studies that have documented human performance effects as a result of the presence of haptic feedback are critically reviewed. Finally, key research directions in using haptic feedback for laparoscopy training simulators are identified.

  6. Cortical mechanisms underlying sensorimotor enhancement promoted by walking with haptic inputs in a virtual environment.

    Science.gov (United States)

    Sangani, Samir; Lamontagne, Anouk; Fung, Joyce

    2015-01-01

    Sensorimotor integration is a complex process in the central nervous system that produces task-specific motor output based on selective and rapid integration of sensory information from multiple sources. This chapter reviews briefly the role of haptic cues in postural control during tandem stance and locomotion, focusing on sensorimotor enhancement of locomotion post stroke. The use of mixed-reality systems incorporating both haptic cues and virtual reality technology in gait rehabilitation post stroke is discussed. Over the last decade, researchers and clinicians have shown evidence of cerebral reorganization that underlies functional recovery after stroke based on results from neuroimaging techniques such as positron emission tomography and functional magnetic resonance imaging. These imaging modalities are however limited in their capacity to measure cortical changes during extensive body motions in upright stance. Functional near-infrared spectroscopy (fNIRS) on the other hand provides a unique opportunity to measure cortical activity associated with postural control during locomotion. Evidence of cortical changes associated with sensorimotor enhancement induced by haptic touch during locomotion is revealed through fNIRS in a pilot study involving healthy individuals and a case study involving a chronic stroke patient. © 2015 Elsevier B.V. All rights reserved.

  7. Modulation of visually evoked postural responses by contextual visual, haptic and auditory information: a 'virtual reality check'.

    Directory of Open Access Journals (Sweden)

    Georg F Meyer

    Full Text Available Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPR). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore what factors modulate VEPRs in a high quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points that can be haptic, visual and auditory reference signals; 2) real objects and their matching virtual reality representations as visual anchors have different effects on postural sway; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses for laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high fidelity virtual environments should mimic those seen in real situations we propose to use the observed effect as a robust objective test for presence and fidelity in VR.

  8. Modulation of Visually Evoked Postural Responses by Contextual Visual, Haptic and Auditory Information: A ‘Virtual Reality Check’

    Science.gov (United States)

    Meyer, Georg F.; Shao, Fei; White, Mark D.; Hopkins, Carl; Robotham, Antony J.

    2013-01-01

    Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPR). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore what factors modulate VEPRs in a high quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points that can be haptic, visual and auditory reference signals; 2) real objects and their matching virtual reality representations as visual anchors have different effects on postural sway; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses for laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high fidelity virtual environments should mimic those seen in real situations we propose to use the observed effect as a robust objective test for presence and fidelity in VR. PMID:23840760

  9. Development of a virtual speaking simulator using Image Based Rendering.

    Science.gov (United States)

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and not controllable individually. To overcome these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and a chroma-key simultaneously. IBR enables the creation of realistic virtual environments in which photos taken with a digital camera are stitched into a panorama. The use of chroma-keying puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.
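
    A small, hypothetical Python/NumPy sketch of the chroma-key compositing step described above (placing captured subject or audience footage over an IBR panorama); the key colour, tolerance, image sizes, and function names are assumptions for illustration only.

    ```python
    import numpy as np

    def chroma_key_composite(frame, background, key_rgb=(0, 255, 0), tol=80):
        """Replace pixels close to the key colour with the panoramic background.
        frame, background: HxWx3 uint8 arrays of the same size."""
        diff = frame.astype(int) - np.array(key_rgb, dtype=int)
        mask = np.linalg.norm(diff, axis=-1) < tol        # True where the key colour is
        out = frame.copy()
        out[mask] = background[mask]
        return out

    # toy usage with synthetic images
    h, w = 240, 320
    frame = np.zeros((h, w, 3), dtype=np.uint8)
    frame[:, :] = (0, 255, 0)                              # all-green studio backdrop
    frame[100:140, 150:170] = (200, 180, 160)              # stand-in for a captured person
    background = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)  # panorama stand-in
    composite = chroma_key_composite(frame, background)
    ```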

  10. Identification of virtual grounds using virtual reality haptic shoes and sound synthesis

    DEFF Research Database (Denmark)

    Serafin, Stefania; Turchet, Luca; Nordahl, Rolf

    2010-01-01

    We describe a system which simulates in real time the auditory and haptic sensation of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors and actuators. In a discrimination experiment,...

  11. Development of a virtual reality haptic Veress needle insertion simulator for surgical skills training.

    Science.gov (United States)

    Okrainec, A; Farcas, M; Henao, O; Choy, I; Green, J; Fotoohi, M; Leslie, R; Wight, D; Karam, P; Gonzalez, N; Apkarian, J

    2009-01-01

    The Veress needle is the most commonly used technique for creating the pneumoperitoneum at the start of a laparoscopic surgical procedure. Inserting the Veress needle correctly is crucial since errors can cause significant harm to patients. Unfortunately, this technique can be difficult to teach since surgeons rely heavily on tactile feedback while advancing the needle through the various layers of the abdominal wall. This critical step in laparoscopy, therefore, can be challenging for novice trainees to learn without adequate opportunities to practice in a safe environment with no risk of injury to patients. To address this issue, we have successfully developed a prototype of a virtual reality haptic needle insertion simulator using the tactile feedback of 22 surgeons to set realistic haptic parameters. A survey of these surgeons concluded that our device appeared and felt realistic, and could potentially be a useful tool for teaching the proper technique of Veress needle insertion.
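
    The abstract does not give the force model, but a common way to approximate the layered resistance and the "pop" felt during Veress needle insertion is a piecewise force-versus-depth profile. The Python sketch below is a hypothetical illustration with invented layer depths and peak forces, not the simulator's tuned parameters.

    ```python
    def needle_force(depth_mm, layers=None):
        """Piecewise force-vs-depth profile: resistance ramps up inside each tissue
        layer and drops sharply at the layer boundary (the perceived 'pop')."""
        if layers is None:
            # (layer end depth [mm], peak resistance [N]) -- illustrative values only
            layers = [(4.0, 1.5), (18.0, 0.8), (22.0, 2.5), (26.0, 3.0)]
        start = 0.0
        for end, peak in layers:
            if depth_mm <= end:
                return peak * (depth_mm - start) / (end - start)  # ramp within layer
            start = end
        return 0.2   # nearly free motion once inside the abdominal cavity

    # toy usage: sample the profile every millimetre
    profile = [round(needle_force(d), 2) for d in range(0, 30)]
    ```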

  12. Pseudo-Haptic Feedback in Teleoperation.

    Science.gov (United States)

    Neupert, Carsten; Matich, Sebastian; Scherping, Nick; Kupnik, Mario; Werthschutzky, Roland; Hatzfeld, Christian

    2016-01-01

    In this paper, we develop possible realizations of pseudo-haptic feedback in teleoperation systems based on existing works for pseudo-haptic feedback in virtual reality and the intended applications. We derive four potential factors affecting the performance of haptic feedback (calculation operator, maximum displacement, offset force, and scaling factor), which are analyzed in three compliance identification experiments. First, we analyze the principle usability of pseudo-haptic feedback by comparing information transfer measures for teleoperation and direct interaction. Pseudo-haptic interaction yields well above-chance performance, while direct interaction performs almost perfectly. In order to optimize pseudo-haptic feedback, in the second study we perform a full-factorial experimental design with 36 subjects performing 6,480 trials with 36 different treatments. Information transfer ranges from 0.68 bit to 1.72 bit in a task with a theoretical maximum of 2.6 bit, with a predominant effect of the calculation operator and a minor effect of the maximum displacement. In a third study, short- and long-term learning effects are analyzed. Learning effects regarding the performance of pseudo-haptic feedback cannot be observed for single-day experiments. Tests over 10 days show a maximum increase in information transfer of 0.8 bit. The results show the feasibility of pseudo-haptic feedback for teleoperation and can be used as design basis for task-specific systems.
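
    As a rough illustration of the four factors named above (calculation operator, maximum displacement, offset force, scaling factor), the following hedged Python sketch maps a measured user force to a visual indentation of the virtual object; the function name, operators, and numbers are invented for illustration and are not the paper's implementation.

    ```python
    def pseudo_haptic_displacement(force, compliance, scaling=1.0, offset_force=0.0,
                                   max_displacement=0.02, operator="linear"):
        """Map the force applied on a rigid input device to a visual indentation [m]
        of the virtual object -- the core idea of pseudo-haptic compliance feedback."""
        f = max(force - offset_force, 0.0)          # subtract the offset force
        if operator == "linear":
            d = scaling * compliance * f            # proportional calculation operator
        else:
            d = scaling * compliance * f ** 0.5     # e.g. a square-root operator
        return min(d, max_displacement)             # clamp to the maximum displacement

    # toy usage: 2 N applied on a 5 mm/N compliant virtual object
    print(pseudo_haptic_displacement(force=2.0, compliance=0.005))
    ```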

  13. Providing haptic feedback in robot-assisted minimally invasive surgery: a direct optical force-sensing solution for haptic rendering of deformable bodies.

    Science.gov (United States)

    Ehrampoosh, Shervin; Dave, Mohit; Kia, Michael A; Rablau, Corneliu; Zadeh, Mehrdad H

    2013-01-01

    This paper presents an enhanced haptic-enabled master-slave teleoperation system which can be used to provide force feedback to surgeons in minimally invasive surgery (MIS). One of the research goals was to develop a combined-control architecture framework that included both direct force reflection (DFR) and position-error-based (PEB) control strategies. To achieve this goal, it was essential to measure accurately the direct contact forces between deformable bodies and a robotic tool tip. To measure the forces at a surgical tool tip and enhance the performance of the teleoperation system, an optical force sensor was designed, prototyped, and added to a robot manipulator. The enhanced teleoperation architecture was formulated by developing mathematical models for the optical force sensor, the extended slave robot manipulator, and the combined-control strategy. Human factor studies were also conducted to (a) examine experimentally the performance of the enhanced teleoperation system with the optical force sensor, and (b) study human haptic perception during the identification of remote object deformability. The first experiment was carried out to discriminate deformability of objects when human subjects were in direct contact with deformable objects by means of a laparoscopic tool. The control parameters were then tuned based on the results of this experiment using a gain-scheduling method. The second experiment was conducted to study the effectiveness of the force feedback provided through the enhanced teleoperation system. The results show that the force feedback increased the ability of subjects to correctly identify materials of different deformable types. In addition, the virtual force feedback provided by the teleoperation system comes close to the real force feedback experienced in direct MIS. The experimental results provide design guidelines for choosing and validating the control architecture and the optical force sensor.
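
    A compact Python sketch of how a combined direct-force-reflection (DFR) and position-error-based (PEB) master force command might be formed; the abstract does not give the gains or the exact blending, so the values and structure below are assumptions rather than the authors' control law.

    ```python
    def master_feedback_force(f_sensor, x_master, x_slave,
                              v_master=0.0, v_slave=0.0,
                              alpha=0.8, kp=300.0, kd=5.0):
        """Force displayed at the master handle [N]:
        a scaled reflection of the optically sensed tool-tip force (DFR term)
        plus a spring-damper on the master/slave tracking error (PEB term)."""
        dfr = alpha * f_sensor
        peb = kp * (x_slave - x_master) + kd * (v_slave - v_master)
        return dfr + peb

    # toy usage: 1.2 N sensed at the tool tip, 1 mm of tracking error
    print(master_feedback_force(f_sensor=1.2, x_master=0.010, x_slave=0.011))
    ```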

  14. Effects of kinesthetic haptic feedback on standing stability of young healthy subjects and stroke patients.

    Science.gov (United States)

    Afzal, Muhammad Raheel; Byun, Ha-Young; Oh, Min-Kyun; Yoon, Jungwon

    2015-03-13

    Haptic control is a useful therapeutic option in rehabilitation featuring virtual reality interaction. As with visual and vibrotactile biofeedback, kinesthetic haptic feedback may assist in postural control, and can achieve balance control. Kinesthetic haptic feedback in terms of body sway can be delivered via a commercially available haptic device and can enhance the balance stability of both young healthy subjects and stroke patients. Our system features a waist-attached smartphone, software running on a computer (PC), and a dedicated Phantom Omni® device. Young healthy participants performed balance tasks after assumption of each of four distinct postures for 30 s (one foot on the ground; the Tandem Romberg stance; one foot on foam; and the Tandem Romberg stance on foam) with eyes closed. Patient eyes were not closed and assumption of the Romberg stance (only) was tested during a balance task 25 s in duration. An Android application running continuously on the smartphone sent mediolateral (ML) and anteroposterior (AP) tilt angles to a PC, which generated kinesthetic haptic feedback via Phantom Omni®. A total of 16 subjects, 8 of whom were young healthy and 8 of whom had suffered stroke, participated in the study. Post-experiment data analysis was performed using MATLAB®. Mean Velocity Displacement (MVD), Planar Deviation (PD), Mediolateral Trajectory (MLT) and Anteroposterior Trajectory (APT) parameters were analyzed to measure reduction in body sway. Our kinesthetic haptic feedback system was effective to reduce postural sway in young healthy subjects regardless of posture and the condition of the substrate (the ground) and to improve MVD and PD in stroke patients who assumed the Romberg stance. Analysis of Variance (ANOVA) revealed that kinesthetic haptic feedback significantly reduced body sway in both categories of subjects. Kinesthetic haptic feedback can be implemented using a commercial haptic device and a smartphone. Intuitive balance cues were
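
    The exact definitions of the sway measures are not spelled out in the abstract; the Python sketch below shows one plausible way to compute MVD, PD, MLT, and APT from the mediolateral/anteroposterior tilt traces streamed by the smartphone, and should be read as an assumption rather than the study's analysis code.

    ```python
    import numpy as np

    def sway_metrics(ml, ap, fs=50.0):
        """Possible implementations of the body-sway measures named in the abstract.
        ml, ap: mediolateral / anteroposterior tilt-angle time series [deg]."""
        ml, ap = np.asarray(ml, dtype=float), np.asarray(ap, dtype=float)
        vel = np.hypot(np.diff(ml), np.diff(ap)) * fs            # planar angular velocity
        mvd = vel.mean()                                          # Mean Velocity Displacement
        pd = np.hypot(ml - ml.mean(), ap - ap.mean()).mean()      # Planar Deviation
        mlt = np.abs(np.diff(ml)).sum()                           # Mediolateral Trajectory length
        apt = np.abs(np.diff(ap)).sum()                           # Anteroposterior Trajectory length
        return mvd, pd, mlt, apt
    ```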

  15. Faster simulated laparoscopic cholecystectomy with haptic feedback technology

    Directory of Open Access Journals (Sweden)

    Yiasemidou M

    2011-10-01

    Full Text Available Marina Yiasemidou, Daniel Glassman, Peter Vasas, Sarit Badiani, Bijendra Patel Barts and the London School of Medicine and Dentistry, Department of Upper GI Surgery, Barts and The Royal London Hospital, London, UK Background: Virtual reality simulators have been gradually introduced into surgical training. One of the enhanced features of the latest virtual simulators is haptic feedback. The usefulness of haptic feedback technology has been a matter of controversy in recent years. Previous studies have assessed the importance of haptic feedback in executing parts of a procedure or basic tasks, such as tissue grasping. The aim of this study was to assess the role of haptic feedback within a structured educational environment, based on the performance of junior surgical trainees after undergoing substantial simulation training. Methods: Novices, whose performance was assessed after several repetitions of a task, were recruited for this study. The performance of senior house officers at the last stage of a validated laparoscopic cholecystectomy curriculum was assessed. Nine senior house officers completed a validated laparoscopic cholecystectomy curriculum on a haptic simulator and nine on a nonhaptic simulator. Performance in terms of mean total time, mean total number of movements, and mean total path length at the last level of the validated curriculum (full procedure of laparoscopic cholecystectomy) was compared between the two groups. Results: Haptic feedback significantly reduced the time required to complete the full procedure of laparoscopic cholecystectomy (mean total time for nonhaptic machine 608.83 seconds, mean total time for haptic machine 553.27 seconds; P = 0.019) while maintaining safety standards similar to those of the nonhaptic machine (mean total number of movements: nonhaptic machine 583.74, haptic machine 603.93, P = 0.145, mean total path length: for nonhaptic machine 1207.37 cm, for haptic machine 1262.36 cm, P = 0

  16. Audio effects on haptics perception during drilling simulation

    Directory of Open Access Journals (Sweden)

    Yair Valbuena

    2017-06-01

    Full Text Available Virtual reality has provided immersion and interactions through computer generated environments attempting to reproduce real life experiences through sensorial stimuli. Realism can be achieved through multimodal interactions which can enhance the user’s presence within the computer generated world. The most notable advances in virtual reality can be seen in computer graphics visuals, where photorealism is the norm thriving to overcome the uncanny valley. Other advances have followed related to sound, haptics, and to a lesser extent smell and taste feedback. Currently, virtual reality systems (multimodal immersion and interactions through visual-haptic-sound) are being massively used in entertainment (e.g., cinema, video games, art), and in non-entertainment scenarios (e.g., social inclusion, educational, training, therapy, and tourism). Moreover, the cost reduction of virtual reality technologies has resulted in the availability at a consumer-level of various haptic, headsets, and motion tracking devices. Current consumer-level devices offer low-fidelity experiences due to the properties of the sensors, displays, and other electro-mechanical devices, that may not be suitable for high-precision or realistic experiences requiring dexterity. However, research has been conducted on how to overcome or compensate for the lack of high fidelity to provide an engaging user experience using storytelling, multimodal interactions and gaming elements. Our work focuses on analyzing the possible effects of auditory perception on haptic feedback within a drilling scenario. Drilling involves multimodal interactions and it is a task with multiple applications in medicine, crafting, and construction. We compare two drilling scenarios where two groups of participants had to drill through wood while listening to contextual and non-contextual audios. We gathered their perception using a survey after the task completion. From the results, we believe that sound does

  17. Improved haptic interface for colonoscopy simulation.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2007-01-01

    This paper presents an improved haptic interface of the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors, and triggers computation to render accurate graphic images corresponding to the angle knob rotation. Tack switches are attached on the valve-actuation buttons of the colonoscope to simulate air-injection or suction, and the corresponding deformation of the colon.

  18. Comparative analysis of video processing and 3D rendering for cloud video games using different virtualization technologies

    Science.gov (United States)

    Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos

    2014-05-01

    This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmarks tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisors, and other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.

  19. The Use of Haptic Display Technology in Education

    Science.gov (United States)

    Barfield, Woodrow

    2009-01-01

    The experience of "virtual reality" can consist of head-tracked and stereoscopic virtual worlds, spatialized sound, haptic feedback, and to a lesser extent olfactory cues. Although virtual reality systems have been proposed for numerous applications, the field of education is one particular application that seems well-suited for virtual…

  20. Haptic interface of the KAIST-Ewha colonoscopy simulator II.

    Science.gov (United States)

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2008-11-01

    This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the result shows that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.

  1. Preliminary Experiment Combining Virtual Reality Haptic Shoes and Audio Synthesis

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Berrezag, Amir; Dimitrov, Smilen

    2010-01-01

    We describe a system that provides combined auditory and haptic sensations to simulate walking on different grounds. It uses a physical model that drives haptic transducers embedded in sandals and headphones. The model represents walking interactions with solid surfaces that can creak, or be cove...

  2. Review of Designs for Haptic Data Visualization.

    Science.gov (United States)

    Paneels, Sabrina; Roberts, Jonathan C

    2010-01-01

    There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.

  3. Haptic-based neurorehabilitation in poststroke patients: a feasibility prospective multicentre trial for robotics hand rehabilitation.

    Science.gov (United States)

    Turolla, Andrea; Daud Albasini, Omar A; Oboe, Roberto; Agostini, Michela; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Venneri, Annalena; Piron, Lamberto

    2013-01-01

    Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype, for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented into a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 [N] at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematics (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, fMRI results suggest that the proposed approach may promote a better cortical activation in the brain.
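
    One common formulation of the dimensionless (normalized) jerk smoothness metric mentioned among the kinematic outcomes is sketched below in Python; the study's exact normalisation may differ, so this is an illustrative assumption.

    ```python
    import numpy as np

    def normalized_jerk(position, fs):
        """Dimensionless jerk of a 1-D movement trace (smaller = smoother)."""
        position = np.asarray(position, dtype=float)
        dt = 1.0 / fs
        jerk = np.diff(position, n=3) / dt ** 3               # third derivative
        duration = len(position) * dt
        path_length = np.abs(np.diff(position)).sum()
        return np.sqrt(0.5 * np.trapz(jerk ** 2, dx=dt) * duration ** 5 / path_length ** 2)

    # toy usage: a minimum-jerk-like reach sampled at 100 Hz
    t = np.linspace(0.0, 1.0, 100)
    x = 10 * t ** 3 - 15 * t ** 4 + 6 * t ** 5                 # unit-amplitude reach profile
    print(normalized_jerk(x, fs=100.0))
    ```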

  4. Haptic-Based Neurorehabilitation in Poststroke Patients: A Feasibility Prospective Multicentre Trial for Robotics Hand Rehabilitation

    Directory of Open Access Journals (Sweden)

    Andrea Turolla

    2013-01-01

    Full Text Available Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype, for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented into a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 [N] at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematics (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, fMRI results suggest that the proposed approach may promote a better cortical activation in the brain.

  5. [Hybrid 3-D rendering of the thorax and surface-based virtual bronchoscopy in surgical and interventional therapy control].

    Science.gov (United States)

    Seemann, M D; Gebicke, K; Luboldt, W; Albes, J M; Vollmar, J; Schäfer, J F; Beinert, T; Englmeier, K H; Bitzer, M; Claussen, C D

    2001-07-01

    The aim of this study was to demonstrate the possibilities of a hybrid rendering method, the combination of a color-coded surface and volume rendering method, with the feasibility of performing surface-based virtual endoscopy with different representation models in the operative and interventional therapy control of the chest. In 6 consecutive patients with partial lung resection (n = 2) and lung transplantation (n = 4) a thin-section spiral computed tomography of the chest was performed. The tracheobronchial system and the introduced metallic stents were visualized using a color-coded surface rendering method. The remaining thoracic structures were visualized using a volume rendering method. For virtual bronchoscopy, the tracheobronchial system was visualized using a triangle surface model, a shaded-surface model and a transparent shaded-surface model. The hybrid 3D visualization uses the advantages of both the color-coded surface and volume rendering methods and facilitates a clear representation of the tracheobronchial system and the complex topographical relationship of morphological and pathological changes without loss of diagnostic information. Performing virtual bronchoscopy with the transparent shaded-surface model facilitates a reasonable to optimal, simultaneous visualization and assessment of the surface structure of the tracheobronchial system and the surrounding mediastinal structures and lesions. Hybrid rendering eases the morphological assessment of anatomical and pathological changes without the need for time-consuming detailed analysis and presentation of source images. Performing virtual bronchoscopy with a transparent shaded-surface model offers a promising alternative to flexible fiberoptic bronchoscopy.

  6. Virtual Cerebral Aneurysm Clipping with Real-Time Haptic Force Feedback in Neurosurgical Education.

    Science.gov (United States)

    Gmeiner, Matthias; Dirnberger, Johannes; Fenz, Wolfgang; Gollwitzer, Maria; Wurm, Gabriele; Trenkler, Johannes; Gruber, Andreas

    2018-04-01

    Realistic, safe, and efficient modalities for simulation-based training are highly warranted to enhance the quality of surgical education, and they should be incorporated in resident training. The aim of this study was to develop a patient-specific virtual cerebral aneurysm-clipping simulator with haptic force feedback and real-time deformation of the aneurysm and vessels. A prototype simulator was developed from 2012 to 2016. Evaluation of virtual clipping by blood flow simulation was integrated in this software, and the prototype was evaluated by 18 neurosurgeons. In 4 patients with different medial cerebral artery aneurysms, virtual clipping was performed after real-life surgery, and surgical results were compared regarding clip application, surgical trajectory, and blood flow. After head positioning and craniotomy, bimanual virtual aneurysm clipping with an original forceps was performed. Blood flow simulation demonstrated residual aneurysm filling or branch stenosis. The simulator improved anatomic understanding for 89% of neurosurgeons. Simulation of head positioning and craniotomy was considered realistic by 89% and 94% of users, respectively. Most participants agreed that this simulator should be integrated into neurosurgical education (94%). Our illustrative cases demonstrated that virtual aneurysm surgery was possible using the same trajectory as in real-life cases. Both virtual clipping and blood flow simulation were realistic in broad-based but not calcified aneurysms. Virtual clipping of a calcified aneurysm could be performed using the same surgical trajectory, but not the same clip type. We have successfully developed a virtual aneurysm-clipping simulator. Next, we will prospectively evaluate this device for surgical procedure planning and education. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Contribution to the modeling and the identification of haptic interfaces

    International Nuclear Information System (INIS)

    Janot, A.

    2007-12-01

    This thesis focuses on the modeling and the identification of haptic interfaces using cable drives. A haptic interface is a force feedback device, which enables its user to interact with a virtual world or a remote environment explored by a slave system. It aims at the matching between the forces and displacements given by the user and those applied to the virtual world. Usually, haptic interfaces make use of a mechanically actuated structure whose distal link is equipped with a handle. When manipulating this handle to interact with the explored world, the user feels the apparent mass, compliance and friction of the interface. This distortion introduced between the operator and the virtual world must be modeled and identified to enhance the design of the interface and develop appropriate control laws. The first approach has been to adapt the modeling and identification methods for rigid robots and robots with localized flexibilities to haptic interfaces. The identification technique makes use of the inverse dynamic model and linear least squares with the measurements of joint torques and positions. This approach is validated on a one-degree-of-freedom and a three-degree-of-freedom haptic device. A new identification method needing only torque data is proposed. It is based on a closed-loop simulation using the direct dynamic model. The optimal parameters minimize the 2-norm of the error between the actual torque and the simulated torque, assuming the same control law and the same tracking trajectory. This nonlinear least squares problem is dramatically simplified using the inverse model to calculate the simulated torque. This method is validated on the single degree of freedom haptic device and the SCARA robot. (author)
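
    A toy Python example of the linear-least-squares identification step described above (an inverse dynamic model that is linear in the parameters, fitted from joint torque and motion data); the 1-DOF regressor and parameter values below are invented for illustration, not taken from the thesis.

    ```python
    import numpy as np

    def identify_dynamic_parameters(W, tau):
        """Ordinary least-squares estimate of the base dynamic parameters phi from
        the linear-in-parameters inverse dynamic model  tau = W(q, qd, qdd) @ phi."""
        phi, *_ = np.linalg.lstsq(W, tau, rcond=None)
        return phi

    # toy 1-DOF example: tau = I*qdd + Fv*qd + Fc*sign(qd)
    rng = np.random.default_rng(0)
    qd, qdd = rng.normal(size=500), rng.normal(size=500)
    W = np.column_stack([qdd, qd, np.sign(qd)])              # regressor matrix
    true_phi = np.array([0.012, 0.05, 0.2])                  # inertia, viscous, Coulomb
    tau = W @ true_phi + 0.01 * rng.normal(size=500)         # simulated joint torque
    print(identify_dynamic_parameters(W, tau))
    ```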

  8. Sensorimotor Interactions in the Haptic Perception of Virtual Objects

    Science.gov (United States)

    1997-01-01

    Compared to our understanding of vision and audition, our knowledge of the human haptic perception is very limited. Many basic...

  9. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    KAUST Repository

    Margolis, Todd

    2011-01-23

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  10. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    Science.gov (United States)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  11. Testing haptic sensations for spinal anesthesia.

    LENUS (Irish Health Repository)

    2011-01-01

    Having identified key determinants of teaching and learning spinal anesthesia, it was necessary to characterize and render the haptic sensations (feeling of touch) associated with needle insertion in the lower back. The approach used is to match recreated sensations (e.g., "pop" through skin or dura mater) with experts' perceptions of the equivalent clinical events.

  12. Customization, control, and characterization of a commercial haptic device for high-fidelity rendering of weak forces.

    Science.gov (United States)

    Gurari, Netta; Baud-Bovy, Gabriel

    2014-09-30

    The emergence of commercial haptic devices offers new research opportunities to enhance our understanding of the human sensory-motor system. Yet, commercial device capabilities have limitations which need to be addressed. This paper describes the customization of a commercial force feedback device for displaying forces with a precision that exceeds the human force perception threshold. The device was outfitted with a multi-axis force sensor and closed-loop controlled to improve its transparency. Additionally, two force sensing resistors were attached to the device to measure grip force. Force errors were modeled in the frequency- and time-domain to identify contributions from the mass, viscous friction, and Coulomb friction during open- and closed-loop control. The effect of user interaction on system stability was assessed in the context of a user study which aimed to measure force perceptual thresholds. Findings based on 15 participants demonstrate that the system maintains stability when rendering forces ranging from 0-0.20 N, with an average maximum absolute force error of 0.041 ± 0.013 N. Modeling the force errors revealed that Coulomb friction and inertia were the main contributors to force distortions during respectively slow and fast motions. Existing commercial force feedback devices cannot render forces with the required precision for certain testing scenarios. Building on existing robotics work, this paper shows how a device can be customized to make it reliable for studying the perception of weak forces. The customized and closed-loop controlled device is suitable for measuring force perceptual thresholds. Copyright © 2014 Elsevier B.V. All rights reserved.
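
    A hedged Python sketch of the idea described above: compensate the modelled device dynamics (inertia, viscous and Coulomb friction) with a feed-forward term while closing the loop on the attached multi-axis force sensor. The gains and dynamic parameters below are illustrative, not the paper's identified values.

    ```python
    import numpy as np

    def force_command(f_desired, f_measured, vel, acc, integral, dt=1e-3,
                      kp=2.0, ki=50.0, mass=0.07, b_visc=0.3, f_coulomb=0.04):
        """PI force control around the force-sensor reading plus a feed-forward
        term cancelling the modelled mass, viscous and Coulomb friction errors.
        Returns (motor force command [N], updated integral state)."""
        error = f_desired - f_measured
        integral += error * dt
        feedforward = mass * acc + b_visc * vel + f_coulomb * np.sign(vel)
        return f_desired + kp * error + ki * integral + feedforward, integral

    # toy usage: render a weak 0.1 N target force during a slow motion
    cmd, i_state = force_command(0.10, 0.08, vel=0.01, acc=0.0, integral=0.0)
    ```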

  13. A magnetorheological haptic cue accelerator for manual transmission vehicles

    International Nuclear Information System (INIS)

    Han, Young-Min; Noh, Kyung-Wook; Choi, Seung-Bok; Lee, Yang-Sub

    2010-01-01

    This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed haptic cue device can transmit the optimal moment of gear shifting for manual transmission to a driver without requiring the driver's visual attention. As a first step to achieve this goal, a MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking into account spatial limitations, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured and its field-dependent torque and time response are experimentally evaluated. Then the manufactured MR haptic cue device is integrated with the accelerator pedal. A simple virtual vehicle emulating the operation of the engine of a passenger vehicle is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated and control performances are experimentally evaluated and presented in the time domain
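
    A speculative Python sketch of the two ingredients named in the abstract: a field-dependent MR brake torque opposing pedal motion and a feed-forward haptic cue that stiffens the accelerator pedal near the optimal shift point. The constitutive form and all constants are assumptions for illustration only.

    ```python
    import math

    def mr_brake_torque(coil_current, pedal_rate, k_field=2.4, n=1.3, c_visc=0.02):
        """Bingham-like brake torque [N*m]: a current-dependent yield term opposing
        pedal motion plus a small viscous term (coefficients are illustrative)."""
        yield_torque = k_field * coil_current ** n
        return -math.copysign(yield_torque, pedal_rate) - c_visc * pedal_rate

    def haptic_cue_current(engine_rpm, shift_rpm=2500.0, width=200.0, max_current=1.0):
        """Feed-forward cue: ramp the coil current up as engine speed approaches the
        optimal gear-shifting speed, so the driver feels resistance at the pedal."""
        ramp = (engine_rpm - (shift_rpm - width)) / width
        return max_current * min(max(ramp, 0.0), 1.0)

    # toy usage: pedal resistance just below the shift point
    i = haptic_cue_current(engine_rpm=2450.0)
    print(mr_brake_torque(i, pedal_rate=0.5))
    ```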

  14. Haptic feedback for enhancing realism of walking simulations.

    Science.gov (United States)

    Turchet, Luca; Burelli, Paolo; Serafin, Stefania

    2013-01-01

    In this paper, we describe several experiments whose goal is to evaluate the role of plantar vibrotactile feedback in enhancing the realism of walking experiences in multimodal virtual environments. To achieve this goal we built an interactive and a noninteractive multimodal feedback system. While during the use of the interactive system subjects physically walked, during the use of the noninteractive system the locomotion was simulated while subjects were sitting on a chair. In both the configurations subjects were exposed to auditory and audio-visual stimuli presented with and without the haptic feedback. Results of the experiments provide a clear preference toward the simulations enhanced with haptic feedback showing that the haptic channel can lead to more realistic experiences in both interactive and noninteractive configurations. The majority of subjects clearly appreciated the added feedback. However, some subjects found the added feedback unpleasant. This might be due, on one hand, to the limits of the haptic simulation and, on the other hand, to the different individual desire to be involved in the simulations. Our findings can be applied to the context of physical navigation in multimodal virtual environments as well as to enhance the user experience of watching a movie or playing a video game.

  15. Encountered-Type Haptic Interface for Representation of Shape and Rigidity of 3D Virtual Objects.

    Science.gov (United States)

    Takizawa, Naoki; Yano, Hiroaki; Iwata, Hiroo; Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-01-01

    This paper describes the development of an encountered-type haptic interface that can generate the physical characteristics, such as shape and rigidity, of three-dimensional (3D) virtual objects using an array of newly developed non-expandable balloons. To alter the rigidity of each non-expandable balloon, the volume of air in it is controlled through a linear actuator and a pressure sensor based on Hooke's law. Furthermore, to change the volume of each balloon, its exposed surface area is controlled by using another linear actuator with a trumpet-shaped tube. A position control mechanism is constructed to display virtual objects using the balloons. The 3D position of each balloon is controlled using a flexible tube and a string. The performance of the system is tested and the results confirm the effectiveness of the proposed principle and interface.
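
    The abstract states that balloon rigidity is regulated through the enclosed air volume using a pressure sensor and Hooke's law; the Python sketch below only illustrates the underlying air-spring intuition (stiffness roughly P0·A²/V under isothermal compression). It is a deliberate simplification of the paper's volume and surface-area control, and the contact area and target stiffness are assumed.

    ```python
    P_ATM = 101.3e3          # ambient pressure [Pa]

    def volume_for_stiffness(k_target, contact_area=4e-4):
        """Enclosed air volume [m^3] that makes the balloon behave approximately
        like a linear spring of stiffness k_target [N/m] at small indentations."""
        return P_ATM * contact_area ** 2 / k_target

    def contact_force(indentation, volume, contact_area=4e-4):
        """Restoring force [N] for an indentation x, from Boyle's law P*V = const."""
        p = P_ATM * volume / (volume - contact_area * indentation)
        return (p - P_ATM) * contact_area

    v = volume_for_stiffness(k_target=500.0)            # render a 500 N/m surface
    print(contact_force(indentation=0.002, volume=v))   # roughly 1 N for a 2 mm press
    ```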

  16. Control of 4-DOF MR haptic master for medical application

    Science.gov (United States)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-03-01

    In this work, a magnetorheological (MR) haptic master for robot-assisted minimally invasive surgery (RMIS) is proposed and analyzed. Using a controllable MR fluid, the master can generate a reflection force with the 4-DOF motion. The proposed master consists of two actuators: an MR clutch featuring a gimbal mechanism for 2-DOF rotational motion (X and Y axes) and an MR clutch attached at the gripper of the gimbal structure for 1-DOF rotational motion (Z axis) and 1-DOF translational motion. After analyzing the dynamic motion by integrating mechanical and physical properties of the actuators, a torque model of the proposed haptic master is derived. For realization of the master-slave system, an encoder which can measure position information is integrated with the MR haptic master. In the RMIS system, the measured position is converted into a command signal and sent to the slave robot. In this work, the slave and the patient's organ are modeled in virtual space. In order to embody a human organ into virtual space, a volumetric deformable object is mathematically formulated by a shape retaining chain linked (S-chain) model. Accordingly, the haptic architecture is established by incorporating the virtual slave with the master device, in which the reflection force and desired position originating from the object of the virtual slave and the operator of the master, respectively, are transferred to each other. In order to achieve the desired force trajectories, a proportional-integral-derivative (PID) controller is designed and implemented. It has been demonstrated that effective tracking control of the desired reflection force is achieved in the time domain.
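
    A bare-bones Python sketch of the PID force-tracking loop mentioned above; the gains, sample time, and the way the reflection force is measured are assumptions rather than the authors' implementation.

    ```python
    class PIDForceController:
        """Discrete PID controller tracking a desired reflection-force trajectory
        (gains and sample time are illustrative)."""
        def __init__(self, kp=5.0, ki=40.0, kd=0.05, dt=0.001):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, desired_force, measured_force):
            error = desired_force - measured_force
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # toy usage: one control step toward a 2 N reflection force
    controller = PIDForceController()
    command = controller.update(desired_force=2.0, measured_force=1.7)
    ```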

  17. Virtual Reality Robotic Operation Simulations Using MEMICA Haptic System

    Science.gov (United States)

    Bar-Cohen, Y.; Mavroidis, C.; Bouzit, M.; Dolgin, B.; Harm, D. L.; Kopchok, G. E.; White, R.

    2000-01-01

    There is an increasing realization that some tasks can be performed significantly better by humans than robots but, due to associated hazards, distance, etc., only a robot can be employed. Telemedicine is one area where remotely controlled robots can have a major impact by providing urgent care at remote sites. In recent years, remotely controlled robotics has been greatly advanced. The robotic astronaut, "Robonaut," at NASA Johnson Space Center is one such example. Unfortunately, due to the unavailability of force and tactile feedback capability the operator must determine the required action using only visual feedback from the remote site, which limits the tasks that Robonaut can perform. There is a great need for dexterous, fast, accurate teleoperated robots with the operator's ability to "feel" the environment at the robot's field. Recently, we conceived a haptic mechanism called MEMICA (Remote MEchanical MIrroring using Controlled stiffness and Actuators) that can enable the design of high dexterity, rapid response, and large workspace systems. Our team is developing novel MEMICA gloves and virtual reality models to allow the simulation of telesurgery and other applications. The MEMICA gloves are designed to have a high dexterity, rapid response, and large workspace and intuitively mirror the conditions at a virtual site where a robot is simulating the presence of the human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and Electrically Controlled Force and Stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.

  18. Haptic Discrimination of Distance

    Science.gov (United States)

    van Beek, Femke E.; Bergmann Tiest, Wouter M.; Kappers, Astrid M. L.

    2014-01-01

    While quite some research has focussed on the accuracy of haptic perception of distance, information on the precision of haptic perception of distance is still scarce, particularly regarding distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions, which are: what is the influence of reference distance, movement axis, perceptual mode (active or passive) and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus. Participants compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. The precision was influenced by reference distance. No effect of movement axis was found. The precision was higher for active than for passive movements and it was a bit lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to be able to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices. PMID:25116638
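
    For readers unfamiliar with how such discrimination thresholds and Weber fractions are typically extracted, the following Python sketch fits a cumulative-Gaussian psychometric function to toy response data; the procedure, data, and criterion below are illustrative assumptions, not the study's own analysis.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def weber_fraction(comparison_cm, p_longer, reference_cm):
        """Fit a cumulative Gaussian to the proportion of 'comparison felt longer'
        responses; take the fitted sigma as the discrimination threshold and
        return the Weber fraction sigma / reference."""
        def psychometric(x, mu, sigma):
            return norm.cdf(x, loc=mu, scale=sigma)
        (mu, sigma), _ = curve_fit(psychometric, comparison_cm, p_longer,
                                   p0=[reference_cm, 0.1 * reference_cm])
        return sigma / reference_cm

    # toy data around a 25 cm reference distance
    comparison = np.array([20, 22, 24, 25, 26, 28, 30], dtype=float)
    p_longer = np.array([0.05, 0.15, 0.40, 0.50, 0.65, 0.90, 0.97])
    print(weber_fraction(comparison, p_longer, reference_cm=25.0))
    ```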

  19. Haptic discrimination of distance.

    Directory of Open Access Journals (Sweden)

    Femke E van Beek

    Full Text Available While quite some research has focussed on the accuracy of haptic perception of distance, information on the precision of haptic perception of distance is still scarce, particularly regarding distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions, which are: what is the influence of reference distance, movement axis, perceptual mode (active or passive) and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus. Participants compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. The precision was influenced by reference distance. No effect of movement axis was found. The precision was higher for active than for passive movements and it was a bit lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to be able to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices.

  20. Evaluation of a haptics-based virtual reality temporal bone simulator for anatomy and surgery training.

    Science.gov (United States)

    Fang, Te-Yung; Wang, Pa-Chun; Liu, Chih-Hsien; Su, Mu-Chun; Yeh, Shih-Ching

    2014-02-01

    Virtual reality simulation training may improve knowledge of anatomy and surgical skills. We evaluated a 3-dimensional, haptic, virtual reality temporal bone simulator for dissection training. The subjects were 7 otolaryngology residents (3 training sessions each) and 7 medical students (1 training session each). The virtual reality temporal bone simulation station included a computer with software that was linked to a force-feedback hand stylus, and the system recorded performance and collisions with vital anatomic structures. Subjects performed virtual reality dissections and completed questionnaires after the training sessions. Residents and students had favorable responses to most questions of the technology acceptance model (TAM) questionnaire. The average TAM scores were above neutral for residents and medical students in all domains, and the average TAM score for residents was significantly higher for the usefulness domain and lower for the playful domain than students. The average satisfaction questionnaire for residents showed that residents had greater overall satisfaction with cadaver temporal bone dissection training than training with the virtual reality simulator or plastic temporal bone. For medical students, the average comprehension score was significantly increased from before to after training for all anatomic structures. Medical students had significantly more collisions with the dura than residents. The residents had similar mean performance scores after the first and third training sessions for all dissection procedures. The virtual reality temporal bone simulator provided satisfactory training for otolaryngology residents and medical students. Copyright © 2013. Published by Elsevier Ireland Ltd.

  1. Real-time dual-band haptic music player for mobile devices.

    Science.gov (United States)

    Hwang, Inwook; Lee, Hyeseon; Choi, Seungmoon

    2013-01-01

    We introduce a novel dual-band haptic music player for real-time simultaneous vibrotactile playback with music in mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies and a real-time vibration generation algorithm that can extract vibration commands from a music file for dual-band playback (bass and treble). The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms the bass-only rendering, also providing several insights for further improvements. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
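
    The dual-band idea described above can be pictured as splitting the audio into bass and treble bands and letting each band's short-time envelope drive one of the actuator's two operating frequencies. The sketch below is only a rough illustration of that principle, not the authors' algorithm or perceptual mapping; the split frequency, frame size and test signal are assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfilt

    def band_envelopes(audio, fs, split_hz=200.0, frame=512):
        """Split audio into bass/treble bands and return per-frame RMS envelopes
        that could drive the low- and high-frequency modes of a dual-mode actuator."""
        lo = sosfilt(butter(4, split_hz, 'lowpass', fs=fs, output='sos'), audio)
        hi = sosfilt(butter(4, split_hz, 'highpass', fs=fs, output='sos'), audio)
        n = len(audio) // frame
        rms = lambda x: np.sqrt(np.mean(x[:n * frame].reshape(n, frame) ** 2, axis=1))
        return rms(lo), rms(hi)   # normalize/compress before sending to the actuator

    # Example: 1 s of synthetic audio containing a 60 Hz bass line and a 1 kHz tone.
    fs = 8000
    t = np.arange(fs) / fs
    audio = 0.8 * np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 1000 * t)
    bass_env, treble_env = band_envelopes(audio, fs)
    ```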

  2. Introduction to haptics for neurosurgeons.

    Science.gov (United States)

    L'Orsa, Rachael; Macnab, Chris J B; Tavakoli, Mahdi

    2013-01-01

    Robots are becoming increasingly relevant to neurosurgeons, extending a neurosurgeon's physical capabilities, improving navigation within the surgical landscape when combined with advanced imaging, and propelling the movement toward minimally invasive surgery. Most surgical robots, however, isolate surgeons from the full range of human senses during a procedure. This forces surgeons to rely on vision alone for guidance through the surgical corridor, which limits the capabilities of the system, requires significant operator training, and increases the surgeon's workload. Incorporating haptics into these systems, ie, enabling the surgeon to "feel" forces experienced by the tool tip of the robot, could render these limitations obsolete by making the robot feel more like an extension of the surgeon's own body. Although the use of haptics in neurosurgical robots is still mostly the domain of research, neurosurgeons who keep abreast of this emerging field will be more prepared to take advantage of it as it becomes more prevalent in operating theaters. Thus, this article serves as an introduction to the field of haptics for neurosurgeons. We not only outline the current and future benefits of haptics but also introduce concepts in the fields of robotic technology and computer control. This knowledge will allow readers to be better aware of limitations in the technology that can affect performance and surgical outcomes, and "knowing the right questions to ask" will be invaluable for surgeons who have purchasing power within their departments.

  3. Mechatronic design of haptic forceps for robotic surgery.

    Science.gov (United States)

    Rizun, P; Gunn, D; Cox, B; Sutherland, G

    2006-12-01

    Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements to engineering specification. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrate with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2 N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 µm with 1.3 µm RMS noise. The forceps can simulate stiffness greater than 20 N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.

  4. Portable haptic interface with omni-directional movement and force capability.

    Science.gov (United States)

    Avizzano, Carlo Alberto; Satler, Massimo; Ruffaldi, Emanuele

    2014-01-01

    We describe the design of a new mobile haptic interface that employs wheels for force rendering. The interface, consisting of an omni-directional Killough type platform, provides 2DOF force feedback with different control modalities. The system autonomously performs sensor fusion for localization and force rendering. This paper explains the relevant choices concerning the functional aspects, the control design, the mechanical and electronic solution. Experimental results for force feedback characterization are reported.

  5. Haptic/graphic rehabilitation: integrating a robot into a virtual environment library and applying it to stroke therapy.

    Science.gov (United States)

    Sharp, Ian; Patton, James; Listenberger, Molly; Case, Emily

    2011-08-08

    Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary developmental overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that specific robot could not be traded for another robot without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. The result can lead researchers across the globe to perform similar experiments using shared code. Therefore, modular "switching out" of one robot for another would not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.
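
    The wrapper-class approach described above amounts to a device-abstraction layer: the virtual environment is written once against a common interface, and each robot is supported by implementing that interface. H3DAPI itself is a C++/X3D toolkit, so the following Python sketch (with invented class and method names) only illustrates the pattern, not the actual H3DAPI wrapper.

    ```python
    from abc import ABC, abstractmethod

    class HapticDeviceWrapper(ABC):
        """Common interface the virtual-environment code programs against."""

        @abstractmethod
        def read_position(self) -> tuple:
            """Return the current end-effector position (x, y, z) in metres."""

        @abstractmethod
        def send_force(self, fx: float, fy: float, fz: float) -> None:
            """Command a force (N) to be rendered at the end effector."""

    class PhantomWrapper(HapticDeviceWrapper):
        def read_position(self):
            return (0.0, 0.0, 0.0)      # would call the vendor SDK here

        def send_force(self, fx, fy, fz):
            pass                        # would call the vendor SDK here

    def render_spring(device: HapticDeviceWrapper, anchor=(0.0, 0.0, 0.0), k=200.0):
        """Simple haptic effect written once, reusable with any wrapped robot."""
        x, y, z = device.read_position()
        device.send_force(k * (anchor[0] - x), k * (anchor[1] - y), k * (anchor[2] - z))

    render_spring(PhantomWrapper())     # swapping robots only swaps the wrapper
    ```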

  6. Vertigo in virtual reality with haptics: case report.

    Science.gov (United States)

    Viirre, Erik; Ellisman, Mark

    2003-08-01

    A researcher was working with a desktop virtual environment system. The system was displaying vector fields of a cyclonic weather system, and the system incorporated a haptic display of the forces in the cyclonic field. As the subject viewed the rotating cyclone field, they would move a handle "through" the representation of the moving winds and "feel" the forces buffeting the handle as it moved. Stopping after using the system for about 10 min, the user experienced an immediate sensation of postural instability for several minutes. Several hours later, there was the onset of vertigo with head turns. This vertigo lasted several hours and was accompanied by nausea and motion illusions that were exacerbated by head movements. Symptoms persisted mildly the next day and were still present on the third and fourth days, but by then were only provoked by head movements. There were no accompanying symptoms or history to suggest an inner ear disorder. Physical examination of inner ear and associated neurologic function was normal. No other users of this system have reported similar symptoms. This case suggests that some individuals may be susceptible to the interaction of displays with motion and movement forces and as a result experience motion illusions. Operators of such systems should be aware of this potential and minimize exposure if vertigo occurs.

  7. Towards open-source, low-cost haptics for surgery simulation.

    Science.gov (United States)

    Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie

    2014-01-01

    In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We demonstrate the basic mechanical design of the device, the sensor setup as well as its software integration.

  8. Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment

    Science.gov (United States)

    Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej

    2012-07-01

    This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation, between Paris, France and Oldenburg, Germany, involving melamine microspheres with a diameter of less than 2 μm.
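
    Virtual guides of the kind mentioned above are commonly realized as repulsive spring forces that switch on when the tracked tool enters a safety margin around an object. A minimal sketch of such a guide follows; the radius, stiffness and function name are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def virtual_guide_force(tool_pos, obj_pos, safety_radius=5e-6, k=0.5):
        """Repulsive force (arbitrary haptic units) pushing the tool away from an
        object once it enters the safety radius; zero outside the margin."""
        d = np.asarray(tool_pos, float) - np.asarray(obj_pos, float)
        dist = np.linalg.norm(d)
        if dist >= safety_radius or dist == 0.0:
            return np.zeros(3)
        # Force grows linearly with penetration of the safety margin.
        return k * (safety_radius - dist) * (d / dist)

    print(virtual_guide_force([2e-6, 0, 0], [0, 0, 0]))
    ```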

  9. Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

    Science.gov (United States)

    Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M

    2015-01-01

    This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation, such as art galleries and museums, by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. The recent improvement of RGB-D sensors has enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of this data in the form of a tangible haptic experience has not been fully explored, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience in which they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.
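
    One standard way to make RGB-D data touchable, in the spirit of the system above, is to back-project the depth image into a point cloud and apply a penalty force that pushes the haptic cursor away from the nearest surface point. The sketch below assumes generic pinhole-camera intrinsics and an arbitrary stiffness; it is a simplification, not the authors' rendering pipeline.

    ```python
    import numpy as np

    def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
        """Back-project a depth image (metres) into a 3D point cloud."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]            # drop invalid (zero-depth) pixels

    def contact_force(cursor, points, radius=0.01, k=400.0):
        """Penalty force pushing the haptic cursor away from the nearest point
        when it comes closer than `radius` (a crude stand-in for full rendering)."""
        d = points - np.asarray(cursor, float)
        dist = np.linalg.norm(d, axis=1)
        i = np.argmin(dist)
        if dist[i] >= radius:
            return np.zeros(3)
        return -k * (radius - dist[i]) * (d[i] / dist[i])
    ```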

  10. Virtual Whipple: preoperative surgical planning with volume-rendered MDCT images to identify arterial variants relevant to the Whipple procedure.

    Science.gov (United States)

    Brennan, Darren D; Zamboni, Giulia; Sosna, Jacob; Callery, Mark P; Vollmer, Charles M V; Raptopoulos, Vassilios D; Kruskal, Jonathan B

    2007-05-01

    The purposes of this study were to combine a thorough understanding of the technical aspects of the Whipple procedure with advanced rendering techniques by introducing a virtual Whipple procedure and to evaluate the utility of this new rendering technique in prediction of the arterial variants that cross the anticipated surgical resection plane. The virtual Whipple is a novel technique that follows the complex surgical steps in a Whipple procedure. Three-dimensional reconstructed angiographic images are used to identify arterial variants for the surgeon as part of the preoperative radiologic assessment of pancreatic and ampullary tumors.

  11. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    Science.gov (United States)

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  12. Three dimensional volume rendering virtual endoscopy of the ossicles using a multi-row detector CT: applications and limitations

    International Nuclear Information System (INIS)

    Kim, Su Yeon; Choi, Sun Seob; Kang, Myung Jin; Shin, Tae Beom; Lee, Ki Nam; Kang, Myung Koo

    2005-01-01

    This study was conducted to determine the applications and limitations of three-dimensional volume rendering virtual endoscopy of the ossicles using a multi-row detector CT. This study examined 25 patients who underwent temporal bone CT using a 16-row detector CT as a result of hearing problems or trauma. The axial CT scan of the temporal bone was performed with a 0.6 mm collimation, and a reconstruction was carried out with a sharp kernel (U70u), a 1 mm thickness and 0.5-1.0 mm increments. After observing the ossicles in the axial and coronal images, virtual endoscopy was performed using a three-dimensional volume rendering technique with a threshold value of -500 HU. Intra-operative otoendoscopy was performed in 12 ears and was compared with the virtual endoscopy findings. Virtual endoscopy of the 29 ears without hearing problems demonstrated a hypoplastic or incomplete depiction of the stapes superstructures in 25 ears and a normal depiction in 4 ears. Virtual endoscopy of 21 ears with hearing problems demonstrated no ossicles in 1 ear, no malleus in 3 ears, a malleoincudal subluxation in 6 ears, a dysplastic incus in 5 ears, an incudostapedial subluxation in 9 ears, dysplastic stapes in 2 ears, a hypoplastic or incomplete depiction of the stapes in 16 ears and no stapes in 1 ear. In contrast to the intra-operative otoendoscopy, 8 out of 12 ears showed a hypoplastic or deformed stapes in the virtual endoscopy. Volume rendering virtual endoscopy using a multi-row detector CT is an excellent method for evaluating the ossicles in three dimensions, even though the partial volume effect for the stapes superstructures needs to be considered.
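
    The -500 HU threshold used for the volume-rendered virtual endoscopy corresponds, in its simplest reading, to a binary transfer function in which every voxel at or above the threshold is treated as surface-forming tissue and everything below it as transparent air. The toy sketch below only illustrates that classification step on synthetic data; it is not the scanner's rendering software.

    ```python
    import numpy as np

    def endoscopy_mask(ct_hu, threshold=-500):
        """Binary opacity mask: voxels at or above the HU threshold contribute to
        the rendered surface, while air below it stays transparent."""
        return ct_hu >= threshold

    volume = np.random.randint(-1000, 1500, size=(64, 64, 64))   # fake HU values
    print(endoscopy_mask(volume).mean())   # fraction of voxels treated as tissue
    ```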

  13. A media installation using HapticScreen: ANOMALOCARIS (Special Issue: Interactive Art)

    OpenAIRE

    Iwata, Hiroo; Yano, Hiroaki; Nakaizumi, Fumitaka

    2000-01-01

    "ANOMALOCARIS" is an interactive work, which represents virtual creature through visual and haptic sensation. Anomalocaris is a name of an animal that supposed to live during the Cambrian Era. Virtual anomalocaris is displayed on the HapticScreen. HapticScreen is a new configuration of a force feedback device. HapticScreen has an innovative mechanism that creates sense of touch on whole palm. The device deforms itself to present shapes of virtual object. Typical force feedback device uses a g...

  14. A magnetorheological fluid-based multifunctional haptic device for vehicular instrument controls

    International Nuclear Information System (INIS)

    Han, Young-Min; Kim, Chan-Jung; Choi, Seung-Bok

    2009-01-01

    This paper presents the control performance of a magnetorheological (MR) fluid-based multifunctional haptic device that is applicable to vehicular instrument controls. By combining in-vehicle functions into a single device, the proposed haptic device can transmit various reflection forces for each comfort function to the driver without requiring the driver's visual attention. As a multifunctional haptic device, an MR knob is proposed in this work and devised to be capable of both rotary and push motions with a single knob. Considering the spatial limitations of vehicle dashboards, design parameters are optimally determined by finite element analysis, with the objective function of maximizing the relative control torque. The proposed haptic device is then manufactured, and in-vehicle comfort functions are constructed in a virtual environment that allows the functions to communicate with the haptic device. Subsequently, a feed-forward controller using torque/force maps is formulated for force tracking control. Control performance, such as the reflection force of the haptic device, is experimentally evaluated via the torque/force-map-based feed-forward controller.
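
    A torque/force-map feed-forward controller of the kind mentioned can be viewed as inverting a measured current-to-torque characteristic by interpolation: given the reflection torque demanded by the selected in-vehicle function, look up the coil current that produced that torque during characterization. The numbers and names in the sketch below are invented for illustration only.

    ```python
    import numpy as np

    # Characterization data (invented numbers): torque produced by the MR knob
    # at several coil currents, measured at a representative rotation speed.
    currents_A = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    torques_Nm = np.array([0.02, 0.08, 0.18, 0.30, 0.40, 0.48])

    def feedforward_current(desired_torque_Nm):
        """Invert the torque map by linear interpolation to get the command current."""
        return float(np.interp(desired_torque_Nm, torques_Nm, currents_A))

    print(feedforward_current(0.25))   # current needed to reflect ~0.25 Nm
    ```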

  15. Feeling objects in Virtual Environments: Presence and Pseudo-Haptics in a Bowling Game

    DEFF Research Database (Denmark)

    Daniliauskaite, Kristina; Magnusdottir, Agusta; Bjørkå, Henrik Birke

    2007-01-01

    , by relying on visual cues, thereby taking advantage of sensory substitution (no haptic feedback device is actually present). The interdependency between presence and pseudo-haptic feedback is investigated by building a virtual bowling game. Results indicate that there is a significant correlation between...

  16. Haptic device development based on electro static force of cellulose electro active paper

    Science.gov (United States)

    Yun, Gyu-young; Kim, Sang-Youn; Jang, Sang-Dong; Kim, Dong-Gu; Kim, Jaehwan

    2011-04-01

    Haptics is widely regarded as well suited to demanding virtual reality applications such as medical equipment, mobile devices, online marketing, and so on. Many haptic device concepts have been proposed to meet the demands of industry. Cellulose has received much attention as an emerging smart material known as electro-active paper (EAPap). EAPap is attractive for mobile haptic devices due to its unique characteristics: low actuation power, suitability for thin devices, and transparency. In this paper, we propose a new haptic actuator concept based on cellulose EAPap. Its performance is evaluated under various actuation conditions. The resulting cellulose electrostatic force actuator shows a large output displacement and fast response, which makes it suitable for mobile haptic devices.

  17. Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?

    Directory of Open Access Journals (Sweden)

    Shujie Deng

    2014-10-01

    Full Text Available The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games and we provide a brief discussion of why cognitive processes involved in learning and training are enhanced under immersive virtual environments. We initially outline studies that have used eye tracking and haptic feedback independently in serious games, and then review some innovative applications that have already combined eye tracking and haptic devices in order to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding in multimodal serious game production as well as exploring possible areas for new applications.

  18. A novel shape-changing haptic table-top display

    Science.gov (United States)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are designed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI), consisting of an image processing unit and a transformation unit, is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit is capable of extracting texture data from a 2D picture for rendering the shape-changing surface and realizing 3D modeling. A preliminary evaluation demonstrated the feasibility of the proposed system.

  19. Body Image and Anti-Fat Attitudes: An Experimental Study Using a Haptic Virtual Reality Environment to Replicate Human Touch.

    Science.gov (United States)

    Tremblay, Line; Roy-Vaillancourt, Mélina; Chebbi, Brahim; Bouchard, Stéphane; Daoust, Michael; Dénommée, Jessica; Thorpe, Moriah

    2016-02-01

    It is well documented that anti-fat attitudes influence the interactions individuals have with overweight people. However, testing attitudes through self-report measures is challenging. In the present study, we explore the use of a haptic virtual reality environment to physically interact with an overweight virtual human (VH). We verify the hypothesis that the duration and strength of virtual touch vary according to the characteristics of the VH in ways similar to those encountered in interactions with real people in anti-fat attitude studies. A group of 61 participants were randomly assigned to one of the experimental conditions, which involved giving a virtual hug to a female or a male VH of either normal weight or overweight. We found significant associations between body image satisfaction and anti-fat attitudes, and sex differences on these measures. We also found a significant interaction effect of the sex of the participants, the sex of the VH, and the body size of the VH. Female participants hugged the overweight female VH longer than the overweight male VH. Male participants hugged the normal-weight VH longer than the overweight VH. We conclude that virtual touch is a promising method of measuring attitudes, emotions and social interactions.

  20. Stable haptic feedback based on a Dynamic Vision Sensor for Microrobotics.

    OpenAIRE

    Bolopion, Aude; Ni, Zhenjiang; Agnus, Joël; Benosman, Ryad; Régnier, Stéphane

    2012-01-01

    International audience; This work presents stable vision-based haptic feedback for micromanipulation using both an asynchronous Address Event Representation (AER) silicon retina and a conventional frame-based camera. At this scale, most of the grippers used to manipulate objects lack force sensing. High-frequency vision detection thus provides a sound solution for obtaining information about the positions of the object and the tool in order to provide virtual haptic guides. Artificial retinas present hig...

  1. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    OpenAIRE

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses two force-sensitive handles (FSH) together with a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape using touch, vision and hearing and, also, to interactively change the shape of the virtual object...

  2. The Efficacy of a Haptic-Enhanced Virtual Reality System for Precision Grasp Acquisition in Stroke Rehabilitation

    Directory of Open Access Journals (Sweden)

    Shih-Ching Yeh

    2017-01-01

    Full Text Available Stroke is a leading cause of long-term disability, and virtual reality (VR)-based stroke rehabilitation is effective in increasing motivation and functional performance. Although much of the functional reach and grasp capability of the upper extremities is regained, the pinch movement remains impaired following stroke. In this study, we developed a haptic-enhanced VR system to simulate haptic pinch tasks to assist the recovery of upper-extremity fine motor function. We recruited 16 adults with stroke to verify the efficacy of this new VR system. Each patient received 30 min VR training sessions 3 times per week for 8 weeks. Outcome measures, the Fugl-Meyer assessment (FMA), Test Evaluant les Membres superieurs des Personnes Agees (TEMPA), Wolf motor function test (WMFT), Box and Block test (BBT), and Jamar grip dynamometer, showed statistically significant progress from pretest to posttest and follow-up, indicating that the proposed system effectively promoted fine motor recovery of function. Additionally, our evidence suggests that this system was also effective under certain challenging conditions, such as being in the chronic stroke phase or having the lesion on the same side as the dominant hand (nondominant hand impaired). System usability assessment indicated that the participants strongly intended to continue using this VR-based system in rehabilitation.

  3. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.

    Science.gov (United States)

    Kim, K

    2016-08-01

    To examine psoriasis or atopic eczema, analyzing skin roughness by palpation is essential to precisely diagnose skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual way to design a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that allows a user to see and touch digital skin surface roughness has been developed, including a geometric roughness estimation method for a meshed surface. Subsequently, a psychophysical experiment was designed and conducted with 12 human subjects to measure human perception of surface roughness with the developed visual and haptic interfaces. From the psychophysical experiment, it was found that touch is more sensitive at lower surface roughness, and vice versa. Human perception with both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When interacting through both channels, the visual and haptic interfaces, the ability to detect roughness abnormalities is greatly improved by sensory integration with the developed visuohaptic rendering system. The result can be used as a guideline to design a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. In addition, the result also confirms that the developed visuohaptic rendering system can help dermatologists or skin care professionals examine skin conditions by using vision and touch at the same time. © 2015
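
    Roughness estimated from a surface, as discussed above, is conventionally summarized by the arithmetic (Ra) and root-mean-square (Rq) deviation of surface heights. The sketch below computes both on a 1-D height profile, with an optional moving-average step standing in for the perceptually designed noise filter; the filter width and the synthetic profile are assumptions, not the study's method.

    ```python
    import numpy as np

    def roughness(profile, filter_width=0):
        """Return (Ra, Rq) of a height profile after optional moving-average
        smoothing, a crude stand-in for a perceptually designed noise filter."""
        z = np.asarray(profile, float)
        if filter_width > 1:
            kernel = np.ones(filter_width) / filter_width
            z = np.convolve(z, kernel, mode='same')
        dev = z - z.mean()
        return np.mean(np.abs(dev)), np.sqrt(np.mean(dev ** 2))

    noisy_skin = 0.05 * np.sin(np.linspace(0, 20, 500)) + 0.01 * np.random.randn(500)
    print(roughness(noisy_skin, filter_width=5))
    ```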

  4. Development of a wearable haptic game interface

    Directory of Open Access Journals (Sweden)

    J. Foottit

    2016-04-01

    Full Text Available This paper outlines the ongoing development of a wearable haptic game interface, in this case for controlling a flight simulator. The device differs from many traditional haptic feedback implementations in that it combines vibrotactile feedback with gesture based input, thus becoming a two-way conduit between the user and the virtual environment. The device is intended to challenge what is considered an “interface” and sets out to purposefully blur the boundary between man and machine. This allows for a more immersive experience, and a user evaluation shows that the intuitive interface allows the user to become the aircraft that is controlled by the movements of the user's hand.

  5. Visual-haptic integration with pliers and tongs: signal ‘weights’ take account of changes in haptic sensitivity caused by different tools

    Directory of Open Access Journals (Sweden)

    Chie eTakahashi

    2014-02-01

    Full Text Available When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the ‘weight’ given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different ‘gains’ between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber’s law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modelled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimising the
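
    The statistically optimal weighting described above has a standard closed form: each cue's weight is its reliability (inverse variance) divided by the sum of the reliabilities, so the combined size estimate is S = w_v·S_v + w_h·S_h with w_v = (1/σ_v²)/(1/σ_v² + 1/σ_h²). The numerical sketch below uses invented σ values purely to illustrate how reduced haptic reliability (for example at a high tool gain) shifts weight towards vision.

    ```python
    def optimal_integration(s_vis, sigma_vis, s_hap, sigma_hap):
        """Reliability-weighted (maximum-likelihood) combination of two size cues."""
        r_vis, r_hap = 1.0 / sigma_vis ** 2, 1.0 / sigma_hap ** 2
        w_vis = r_vis / (r_vis + r_hap)
        w_hap = 1.0 - w_vis
        s_combined = w_vis * s_vis + w_hap * s_hap
        sigma_combined = (1.0 / (r_vis + r_hap)) ** 0.5
        return s_combined, sigma_combined, w_vis

    # A noisier haptic estimate (larger sigma) receives less weight than vision.
    print(optimal_integration(s_vis=50.0, sigma_vis=2.0, s_hap=54.0, sigma_hap=4.0))
    ```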

  6. Palpation imaging using a haptic system for virtual reality applications in medicine.

    Science.gov (United States)

    Khaled, W; Reichling, S; Bruhns, O T; Boese, H; Baumann, M; Monkman, G; Egersdoerfer, S; Klein, D; Tunayar, A; Freimuth, H; Lorenz, A; Pessavento, A; Ermert, H

    2004-01-01

    In the field of medical diagnosis, there is a strong need to determine the mechanical properties of biological tissue, which are of histological and pathological relevance. Malignant tumors are significantly stiffer than the surrounding healthy tissue. One of the established diagnostic procedures is the palpation of body organs and tissue. Palpation is used to measure swelling, detect bone fractures, find and measure the pulse, or to locate changes in the pathological state of tissue and organs. Current medical practice routinely uses sophisticated diagnostic tests through magnetic resonance imaging (MRI), computed tomography (CT) and ultrasound (US) imaging. However, these cannot provide a direct measure of tissue elasticity. Last year we presented the concept of the first haptic sensor actuator system to visualize and reconstruct the mechanical properties of tissue using ultrasonic elastography and a haptic display with electrorheological fluids. We developed a real-time strain imaging system for tumor diagnosis. It allows biopsies to be performed simultaneously with conventional ultrasound B-mode and strain imaging investigations. We deduce the relative mechanical properties by using finite element simulations and numerical solution models solving the inverse problem. Various modifications of the haptic sensor actuator system have been investigated. This haptic system has the potential to induce substantial forces in real time, using a compact, lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching and telecommunication.

  7. Patient adaptive control of end-effector based gait rehabilitation devices using a haptic control framework.

    Science.gov (United States)

    Hussein, Sami; Kruger, Jörg

    2011-01-01

    Robot-assisted training has proven beneficial as an extension of conventional therapy for improving rehabilitation outcome. Further facilitation of this positive impact is expected from the application of cooperative control algorithms that increase the patient's contribution to the training effort according to his or her level of ability. This paper presents an approach to cooperative training for end-effector-based gait rehabilitation devices, thereby providing the basis for establishing sophisticated cooperative control methods in this class of devices. It uses a haptic control framework to synthesize and render complex, task-specific training environments, which are composed of polygonal primitives. Training assistance is integrated as part of the environment into the haptic control framework. A compliant window is moved along a nominal training trajectory, compliantly guiding and supporting the foot motion. The level of assistance is adjusted via the stiffness of the moving window. Furthermore, an iterative learning algorithm is used to automatically adjust this assistance level. Stable haptic rendering of the dynamic training environments and adaptive movement assistance have been evaluated in two example training scenarios: treadmill walking and stair climbing. Data from preliminary trials with one healthy subject are provided in this paper. © 2011 IEEE
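
    The compliant moving window described above can be pictured as a spring that is inactive inside a tolerance band around the current point of the nominal trajectory and pulls the foot back once it leaves that band, with the spring stiffness setting the assistance level. The 2-D sketch below is a simplified illustration with invented names and values, not the authors' controller.

    ```python
    import numpy as np

    def window_guidance_force(foot_pos, window_center, window_radius=0.03, stiffness=300.0):
        """Zero force inside the moving window; outside it, a spring pulls the foot
        back toward the window boundary. Raising `stiffness` raises the assistance."""
        e = np.asarray(window_center, float) - np.asarray(foot_pos, float)
        dist = np.linalg.norm(e)
        if dist <= window_radius:
            return np.zeros(2)
        return stiffness * (dist - window_radius) * (e / dist)

    # The window centre advances along the nominal treadmill or stair path each cycle.
    print(window_guidance_force(foot_pos=[0.10, 0.02], window_center=[0.05, 0.0]))
    ```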

  8. Haptic perception of object curvature in Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Jürgen Konczak

    2008-07-01

    Full Text Available The haptic perception of the curvature of an object is essential for adequate object manipulation and critical for our guidance of actions. This study investigated how the ability to perceive the curvature of an object is altered by Parkinson's disease (PD). Eight healthy subjects and 11 patients with mild to moderate PD had to judge, without vision, the curvature of a virtual "box" created by a robotic manipulandum. Their hands were either moved passively along a defined curved path or they actively explored the curvature of a virtual wall. The curvature was either concave or convex (bulging to the left or right) and was judged in two locations of the hand workspace--a left workspace location, where the curved hand path was associated with curved shoulder and elbow joint paths, and a right workspace location in which these joint paths were nearly linear. After exploring the curvature of the virtual object, subjects had to judge whether the curvature was concave or convex. Based on these data, thresholds for curvature sensitivity were established. The main findings of the study are: First, 9 out of 11 PD patients (82%) showed elevated thresholds for detecting convex curvatures in at least one test condition. The respective median threshold for the PD group was increased by 343% when compared to the control group. Second, when distal hand paths became less associated with proximal joint paths (right workspace), haptic acuity was reduced substantially in both groups. Third, sensitivity to hand trajectory curvature was not improved during active exploration in either group. Our data demonstrate that PD is associated with a decreased acuity of the haptic sense, which may occur already at an early stage of the disease.

  9. Modeling and Design of an Electro-Rheological Fluid Based Haptic System for Tele-Operation of Space Robots

    Science.gov (United States)

    Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph

    2000-01-01

    For many years, the robotics community has sought to develop robots that can eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks that humans can perform significantly better but, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform these tasks. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome and usually allow only a limited operator workspace. In this paper, a novel haptic interface is presented to enable human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators with an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained ElectroRheological Fluids (ERF), which change viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user using this ERF device, where a change in the system viscosity will occur proportionally to the force to be transmitted. In this paper, we will present the results of our modeling, simulation, and initial testing of such an

  10. Learning in a virtual environment using haptic systems for movement re-education: can this medium be used for remodeling other behaviors and actions?

    Science.gov (United States)

    Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Lafond, Ian; Adamovich, Sergei V

    2011-03-01

    Robotic systems that are interfaced with virtual reality gaming and task simulations are increasingly being developed to provide repetitive intensive practice to promote increased compliance and facilitate better outcomes in rehabilitation post-stroke. A major development in the use of virtual environments (VEs) has been to incorporate tactile information and interaction forces into what was previously an essentially visual experience. Robots of varying complexity are being interfaced with more traditional virtual presentations to provide haptic feedback that enriches the sensory experience and adds physical task parameters. This provides forces that produce biomechanical and neuromuscular interactions with the VE that approximate real-world movement more accurately than visual-only VEs, simulating the weight and force found in upper extremity tasks. The purpose of this article is to present an overview of several systems that are commercially available for ambulation training and for training movement of the upper extremity. We will also report on the system that we have developed (NJIT-RAVR system) that incorporates motivating and challenging haptic feedback effects into VE simulations to facilitate motor recovery of the upper extremity post-stroke. The NJIT-RAVR system trains both the upper arm and the hand. The robotic arm acts as an interface between the participants and the VEs, enabling multiplanar movements against gravity in a three-dimensional workspace. The ultimate question is whether this medium can provide a motivating, challenging, gaming experience with dramatically decreased physical difficulty levels, which would allow for participation by an obese person and facilitate greater adherence to exercise regimes. © 2011 Diabetes Technology Society.

  11. Virtual-Reality Simulator System for Double Interventional Cardiac Catheterization Using Fractional-Order Vascular Access Tracker and Haptic Force Producer

    Directory of Open Access Journals (Sweden)

    Guan-Chun Chen

    2015-01-01

    Full Text Available This study proposes a virtual-reality (VR) simulator system for double interventional cardiac catheterization (ICC) using a fractional-order vascular access tracker and a haptic force producer. Endoscopes and catheters for the diagnosis and surgery of cardiovascular disease are commonly used in minimally invasive surgery. Operating a Berman catheter and a pigtail catheter inside the human body while avoiding damage to vessels requires specific skills and experience of young surgeons or postgraduate year (PGY) students. To improve training in inserting catheters, a double-catheter mechanism is designed for the ICC procedures. A fractional-order vascular access tracker is used to trace the senior surgeons' console trajectories and transmit the frictional feedback and visual feedback during the insertion of catheters. Based on the clinical feel of passing through the aortic arch, the vein into the ventricle, or tortuous blood vessels, the haptic force producer is used to mimic the elasticity of the vessel wall using voice coil motors (VCMs). The VR environment, built from the surgeons' console vessel trajectories and hand feel, is thus achieved, and the experimental results show its effectiveness for the double ICC procedures.

  12. Objective Assessment of Laparoscopic Force and Psychomotor Skills in a Novel Virtual Reality-Based Haptic Simulator.

    Science.gov (United States)

    Prasad, M S Raghu; Manivannan, Muniyandi; Manoharan, Govindan; Chandramohan, S M

    2016-01-01

    Most of the commercially available virtual reality-based laparoscopic simulators do not effectively evaluate combined psychomotor and force-based laparoscopic skills. Consequently, the lack of training on these critical skills leads to intraoperative errors. To assess the effectiveness of the novel virtual reality-based simulator, this study analyzed the combined psychomotor (i.e., motion or movement) and force skills of residents and expert surgeons. The study also examined the effectiveness of real-time visual force feedback and tool motion during training. Bimanual fundamental (i.e., probing, pulling, sweeping, grasping, and twisting) and complex tasks (i.e., tissue dissection) were evaluated. In both tasks, visual feedback on applied force and tool motion was provided. The skills of the participants while performing the early tasks were assessed with and without visual feedback. Participants performed 5 repetitions of fundamental and complex tasks. Reaction force and instrument acceleration were used as metrics. Setting: Surgical Gastroenterology, Government Stanley Medical College and Hospital; Institute of Surgical Gastroenterology, Madras Medical College and Rajiv Gandhi Government General Hospital. Participants: residents (N = 25; postgraduates and surgeons with 4 and ≤10 years of laparoscopic surgery). Residents applied larger forces compared with expert surgeons and performed abrupt tool movements (p < 0.001). However, visual + haptic feedback improved the performance of residents (p < 0.001). In complex tasks, visual + haptic feedback did not influence the applied force of expert surgeons, but influenced their tool motion (p < 0.001). Furthermore, in the complex tissue sweeping task, expert surgeons applied more force, but remained within the tissue damage limits. In both groups, exertion of large forces and abrupt tool motion were observed during grasping, probing or pulling, and tissue sweeping maneuvers (p < 0.001). Modern-day curriculum-based training should evaluate the skills

  13. Preliminary assessment of faculty and student perception of a haptic virtual reality simulator for training dental manual dexterity.

    Science.gov (United States)

    Gal, Gilad Ben; Weiss, Ervin I; Gafni, Naomi; Ziv, Amitai

    2011-04-01

    Virtual reality force feedback simulators provide a haptic (sense of touch) feedback through the device being held by the user. The simulator's goal is to provide a learning experience resembling reality. A newly developed haptic simulator (IDEA Dental, Las Vegas, NV, USA) was assessed in this study. Our objectives were to assess the simulator's ability to serve as a tool for dental instruction, self-practice, and student evaluation, as well as to evaluate the sensation it provides. A total of thirty-three evaluators were divided into two groups. The first group consisted of twenty-one experienced dental educators; the second consisted of twelve fifth-year dental students. Each participant performed drilling tasks using the simulator and filled out a questionnaire regarding the simulator and potential ways of using it in dental education. The results show that experienced dental faculty members as well as advanced dental students found that the simulator could provide significant potential benefits in the teaching and self-learning of manual dental skills. Development of the simulator's tactile sensation is needed to attune it to genuine sensation. Further studies relating to aspects of the simulator's structure and its predictive validity, its scoring system, and the nature of the performed tasks should be conducted.

  14. Radiofrequency ablation of hepatic tumors: simulation, planning, and contribution of virtual reality and haptics.

    Science.gov (United States)

    Villard, Caroline; Soler, Luc; Gangi, Afshin

    2005-08-01

    For radiofrequency ablation (RFA) of liver tumors, evaluation of vascular architecture, post-RFA necrosis prediction, and the choice of a suitable needle placement strategy using conventional radiological techniques remain difficult. In an attempt to enhance the safety of RFA, a 3D simulator, treatment planning, and training tool, that simulates the insertion of the needle, the necrosis of the treated area, and proposes an optimal needle placement, has been developed. The 3D scenes are automatically reconstructed from enhanced spiral CT scans. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making necrosis shapes more realistic. Optimal needle positioning can be automatically generated by the software to produce complete destruction of the tumor, with maximum respect of the healthy liver and of all major structures to avoid. We also studied how the use of virtual reality and haptic devices are valuable to make simulation and training realistic and effective.

  15. Evaluating progressive-rendering algorithms in appearance design tasks.

    Science.gov (United States)

    Jiawei Ou; Karlik, Ondrej; Křivánek, Jaroslav; Pellacini, Fabio

    2013-01-01

    Progressive rendering is becoming a popular alternative to precomputational approaches to appearance design. However, progressive algorithms create images exhibiting visual artifacts at early stages. A user study investigated these artifacts' effects on user performance in appearance design tasks. Novice and expert subjects performed lighting and material editing tasks with four algorithms: random path tracing, quasirandom path tracing, progressive photon mapping, and virtual-point-light rendering. Both the novices and experts strongly preferred path tracing to progressive photon mapping and virtual-point-light rendering. None of the participants preferred random path tracing to quasirandom path tracing or vice versa; the same situation held between progressive photon mapping and virtual-point-light rendering. The user workflow didn’t differ significantly with the four algorithms. The Web Extras include a video showing how four progressive-rendering algorithms converged (at http://youtu.be/ck-Gevl1e9s), the source code used, and other supplementary materials.

  16. The role of haptic feedback in laparoscopic simulation training.

    Science.gov (United States)

    Panait, Lucian; Akkary, Ehab; Bell, Robert L; Roberts, Kurt E; Dudrick, Stanley J; Duffy, Andrew J

    2009-10-01

    Laparoscopic virtual reality simulators are becoming a ubiquitous tool in resident training and assessment. These devices provide the operator with various levels of realism, including haptic (or force) feedback. However, this feature adds significantly to the cost of the devices, and limited data exist assessing the value of haptics in skill acquisition and development. Utilizing the Laparoscopy VR (Immersion Medical, Gaithersburg, MD), we hypothesized that the incorporation of force feedback in the simulated operative environment would allow superior trainee performance compared with performance of the same basic skills tasks in a non-haptic model. Ten medical students with minimal laparoscopic experience and similar baseline skill levels, as proven by performance of two fundamentals of laparoscopic surgery (FLS) tasks (peg transfer and cutting drills), voluntarily participated in the study. Each performed two tasks, analogous to the FLS drills, on the Laparoscopy VR at 3 levels of difficulty, based on the established settings of the manufacturer. After achieving familiarity with the device and tasks, the students completed the drills both with and without force feedback. Data on completion time, instrument path length, right and left hand errors, and grasping tension were analyzed. The scores in the haptic-enhanced simulation environment were compared with the scores in the non-haptic model and analyzed utilizing Student's t-test. The peg transfer drill showed no difference in performance between the haptic and non-haptic simulations for all metrics at all three levels of difficulty. For the more complex cutting exercise, the time to complete the tasks was significantly shorter when force feedback was provided, at all levels of difficulty (158±56 versus 187±51 s, 176±49 versus 222±68 s, and 275±76 versus 422±220 s, at levels 1, 2, and 3, respectively; P < 0.05). Otherwise, haptic simulation did not demonstrate an appreciable performance improvement among our trainees. These data

  17. A Surgical Robot Teleoperation Framework for Providing Haptic Feedback Incorporating Virtual Environment-Based Guidance

    Directory of Open Access Journals (Sweden)

    Adnan Munawar

    2016-08-01

    Full Text Available In robot-assisted tele-operated laparoscopic surgeries, the patient-side manipulators are controlled via the master manipulators operated by the surgeon. The current generation of robots approved for laparoscopic surgery lacks haptic feedback. In theory, haptic feedback would enhance surgical procedures by enabling better coordination of hand movements, aided by a tactile sense of the operating environment. This research presents an overall control framework for haptic feedback on existing robot platforms, demonstrated on the da Vinci Research Kit (dVRK) system. The paper discusses the implementation of a flexible framework that incorporates stiffness control with gravity compensation for the surgeon's manipulator and a sensing and collision detection algorithm for calculating the interaction between the patient-side manipulators and the surgical area.
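
    Stiffness control with gravity compensation on the master arm, as mentioned above, amounts to commanding a spring-damper force towards a desired pose plus a feed-forward term that cancels the arm's own weight. The sketch below is schematic; the gains and the gravity term are placeholders, not dVRK parameters.

    ```python
    import numpy as np

    def master_command_force(x, x_desired, v, k=150.0, b=5.0, gravity_comp=None):
        """Cartesian stiffness (spring-damper) command plus gravity compensation.
        `gravity_comp` would come from the manipulator's identified mass model."""
        x, x_desired, v = (np.asarray(a, float) for a in (x, x_desired, v))
        f = k * (x_desired - x) - b * v
        if gravity_comp is not None:
            f = f + np.asarray(gravity_comp, float)
        return f

    print(master_command_force(x=[0.0, 0.0, 0.10], x_desired=[0.0, 0.0, 0.12],
                               v=[0.0, 0.0, 0.01], gravity_comp=[0.0, 0.0, 4.9]))
    ```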

  18. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date, and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480
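
    The record describes three phases: sky colour from the sun position, shadow generation, and integration of both onto virtual objects. The toy sketch below only illustrates the first and third ideas with a simple horizon-to-zenith colour blend and a fixed shadow attenuation; the actual paper uses an analytic sky model and the Z-GaF shadow maps, so the colours, the blend, and the attenuation factor here are purely illustrative assumptions.

```python
import numpy as np

# Hypothetical horizon and zenith colours used only for this illustration.
HORIZON = np.array([0.95, 0.65, 0.40])
ZENITH  = np.array([0.25, 0.45, 0.85])

def sky_colour(sun_elevation_deg):
    """Blend between horizon and zenith colours as the sun rises."""
    t = np.clip(sun_elevation_deg / 90.0, 0.0, 1.0)
    return (1.0 - t) * HORIZON + t * ZENITH

def shade_virtual_object(albedo, sun_elevation_deg, in_shadow):
    """Tint a virtual object's albedo by the sky colour and darken it in shadow."""
    ambient = sky_colour(sun_elevation_deg)
    shadow_factor = 0.35 if in_shadow else 1.0   # illustrative attenuation
    return albedo * ambient * shadow_factor
```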

  19. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date, and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.

  20. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    Science.gov (United States)

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.

  1. Design and Calibration of a New 6 DOF Haptic Device

    Directory of Open Access Journals (Sweden)

    Huanhuan Qin

    2015-12-01

    Full Text Available For many applications, such as tele-operated robots and interactions with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces. They can also track human hand positions simultaneously. A new 6 DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly contains a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability. Therefore, it is capable of multi-finger interactions. Moreover, with an adjustable base, operators can adopt different postures without interrupting haptic tasks. To investigate the performance regarding position-tracking accuracy and static output forces, we conducted experiments on a three-dimensional electric sliding platform and a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed.

  2. Design and Calibration of a New 6 DOF Haptic Device

    Science.gov (United States)

    Qin, Huanhuan; Song, Aiguo; Liu, Yuqing; Jiang, Guohua; Zhou, Bohe

    2015-01-01

    For many applications, such as tele-operated robots and interactions with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces. They can also track human hand positions simultaneously. A new 6 DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly contains a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability. Therefore, it is capable of multi-finger interactions. Moreover, with an adjustable base, operators can adopt different postures without interrupting haptic tasks. To investigate the performance regarding position-tracking accuracy and static output forces, we conducted experiments on a three-dimensional electric sliding platform and a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed. PMID:26690449

  3. Virtual reality system for treatment of the fear of public speaking using image-based rendering and moving pictures.

    Science.gov (United States)

    Lee, Jae M; Ku, Jeong H; Jang, Dong P; Kim, Dong H; Choi, Young H; Kim, In Y; Kim, Sun I

    2002-06-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology enabled us to use virtual reality (VR) for the treatment of the fear of public speaking. There have been two techniques used to construct a virtual environment for the treatment of the fear of public speaking: model-based and movie-based. Virtual audiences and virtual environments made with the model-based technique are unrealistic and unnatural. The movie-based technique has the disadvantage that each virtual audience member cannot be controlled individually, because all virtual audiences are included in one moving-picture file. To address this disadvantage, this paper presents a virtual environment made by using image-based rendering (IBR) and chroma keying simultaneously. IBR enables us to make the virtual environment realistic because the images are stitched panoramically from photos taken with a digital camera. The use of chroma keying allows each virtual audience member to be controlled individually. In addition, a real-time capture technique was applied in constructing the virtual environment to give the subjects more interaction, in that they can talk with a therapist or another subject.
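
    The key implementation idea in this record is compositing individually controllable audience clips onto a panoramic image-based background via chroma keying. A minimal sketch of that compositing step is given below, assuming RGB frames of equal size and a green key colour; the threshold and the per-pixel test are illustrative and are not the authors' pipeline.

```python
import numpy as np

def chroma_key_composite(frame, background, key=(0, 255, 0), tol=60):
    """Overlay a virtual-audience video frame onto a panoramic IBR background.

    frame, background : HxWx3 uint8 RGB arrays of the same size
    key               : chroma key colour filmed behind the audience (assumed green)
    tol               : per-channel tolerance for treating a pixel as key colour
    """
    diff = np.abs(frame.astype(int) - np.array(key)).sum(axis=2)
    mask = diff > tol * 3                     # True where the audience is visible
    out = background.copy()
    out[mask] = frame[mask]                   # keep background where the key colour shows
    return out
```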

  4. Contribution to the modeling and the identification of haptic interfaces; Contribution a la modelisation et a l'identification des interfaces haptiques

    Energy Technology Data Exchange (ETDEWEB)

    Janot, A

    2007-12-15

    This thesis focuses on the modeling and identification of haptic interfaces using cable drives. A haptic interface is a force-feedback device that enables its user to interact with a virtual world or a remote environment explored by a slave system. It aims at matching the forces and displacements given by the user with those applied to the virtual world. Usually, haptic interfaces make use of an actuated mechanical structure whose distal link is equipped with a handle. When manipulating this handle to interact with the explored world, the user feels the apparent mass, compliance and friction of the interface. This distortion introduced between the operator and the virtual world must be modeled and identified to enhance the design of the interface and develop appropriate control laws. The first approach has been to adapt the modeling and identification methods of rigid robots with localized flexibilities to haptic interfaces. The identification technique makes use of the inverse dynamic model and linear least squares with measurements of joint torques and positions. This approach is validated on a single-degree-of-freedom and a three-degree-of-freedom haptic device. A new identification method needing only torque data is proposed. It is based on a closed-loop simulation using the direct dynamic model. The optimal parameters minimize the 2-norm of the error between the actual torque and the simulated torque, assuming the same control law and the same tracking trajectory. This nonlinear least-squares problem is dramatically simplified by using the inverse model to calculate the simulated torque. This method is validated on the single-degree-of-freedom haptic device and the SCARA robot. (author)
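
    The first identification approach in this record relies on the inverse dynamic model being linear in the parameters, so that joint torque and position measurements can be fed to linear least squares. The sketch below illustrates this for a hypothetical one-degree-of-freedom joint with inertia, viscous and Coulomb friction; the model structure and variable names are ours, not the thesis's exact parametrization.

```python
import numpy as np

def identify_1dof_parameters(q, qd, qdd, tau):
    """Identify inertia and friction of a 1-DOF haptic joint by linear least squares.

    The inverse dynamic model  tau = I*qdd + Fv*qd + Fc*sign(qd)  is linear in the
    parameters [I, Fv, Fc], so they follow directly from measured joint positions
    (numerically differentiated to qd, qdd) and measured joint torques tau.
    """
    W = np.column_stack([qdd, qd, np.sign(qd)])      # observation (regressor) matrix
    params, *_ = np.linalg.lstsq(W, tau, rcond=None)
    inertia, viscous, coulomb = params
    return inertia, viscous, coulomb
```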

  5. Haptic shared control improves hot cell remote handling despite controller inaccuracies

    International Nuclear Information System (INIS)

    Oosterhout, J. van; Abbink, D.A.; Koning, J.F.; Boessenkool, H.; Wildenbeest, J.G.W.; Heemskerk, C.J.M.

    2013-01-01

    Highlights: • Haptic shared control is generally based upon perfect environment information. • A realistic implementation contains model errors with respect to the environment. • Operators were aided with inaccurate guiding forces during a peg-in-hole task. • The results showed that small guiding inaccuracies still aid the operator. -- Abstract: A promising solution to improve task performance in ITER hot cell remote handling is the use of haptic shared control. Haptic shared control can assist the human operator along a safe and optimal path with continuous guiding forces from an intelligent autonomous controller. Previous research tested such controllers with accurate knowledge of the environment (giving flawless guiding forces), while in a practical implementation guidance forces will sometimes be flawed due to inaccurate models or sensor information. This research investigated the effect of zero and small (7.5 mm) errors on task performance compared to normal (unguided) operation. In a human factors experiment, subjects performed a three-dimensional virtual reality peg-in-hole type task (30 mm diameter; 0.1 mm clearance), with and without potentially flawed haptic shared control. The results showed that the presence of guiding forces, despite small guiding errors, still improved task performance with respect to unguided operation.
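
    A minimal sketch of the kind of guiding force studied here is given below: a saturated virtual spring that pulls the tool toward the reference path, with an optional offset standing in for the deliberate 7.5 mm model error. The gains, the saturation limit and the spring form are illustrative assumptions, not the controller used in the experiment.

```python
import numpy as np

def guidance_force(tool_pos, path_point, offset_error=np.zeros(3),
                   k_guide=150.0, f_max=4.0):
    """Continuous guiding force pulling the slave tool toward a reference path.

    path_point   : nearest point on the planned insertion path
    offset_error : deliberate model error added to mimic imperfect environment
                   knowledge (e.g. a 7.5 mm offset as in the experiment)
    """
    target = path_point + offset_error
    f = k_guide * (target - tool_pos)          # virtual spring toward the (possibly
    norm = np.linalg.norm(f)                   # slightly wrong) reference point
    if norm > f_max:
        f *= f_max / norm                      # saturate so flawed guidance stays gentle
    return f
```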

  6. Haptic shared control improves hot cell remote handling despite controller inaccuracies

    Energy Technology Data Exchange (ETDEWEB)

    Oosterhout, J. van, E-mail: J.vanOosterhout@differ.nl [Delft University of Technology, Faculty of 3mE, BioMechanical Engineering Department, Mekelweg 2, 2628 CD Delft (Netherlands); Abbink, D.A. [Delft University of Technology, Faculty of 3mE, BioMechanical Engineering Department, Mekelweg 2, 2628 CD Delft (Netherlands); Koning, J.F. [Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands); Boessenkool, H. [FOM Institute DIFFER (Dutch Institute for Fundamental Energy Research), Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, PO Box 1207, 3430 BE Nieuwegein (Netherlands); Wildenbeest, J.G.W. [Delft University of Technology, Faculty of 3mE, BioMechanical Engineering Department, Mekelweg 2, 2628 CD Delft (Netherlands); Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands); Heemskerk, C.J.M. [Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands)

    2013-10-15

    Highlights: • Haptic shared control is generally based upon perfect environment information. • A realistic implementation contains model errors with respect to the environment. • Operators were aided with inaccurate guiding forces during a peg-in-hole task. • The results showed that small guiding inaccuracies still aid the operator. -- Abstract: A promising solution to improve task performance in ITER hot cell remote handling is the use of haptic shared control. Haptic shared control can assist the human operator along a safe and optimal path with continuous guiding forces from an intelligent autonomous controller. Previous research tested such controllers with accurate knowledge of the environment (giving flawless guiding forces), while in a practical implementation guidance forces will sometimes be flawed due to inaccurate models or sensor information. This research investigated the effect of zero and small (7.5 mm) errors on task performance compared to normal (unguided) operation. In a human factors experiment, subjects performed a three-dimensional virtual reality peg-in-hole type task (30 mm diameter; 0.1 mm clearance), with and without potentially flawed haptic shared control. The results showed that the presence of guiding forces, despite small guiding errors, still improved task performance with respect to unguided operation.

  7. Prototype of haptic device for sole of foot using magnetic field sensitive elastomer

    Science.gov (United States)

    Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.

    2013-02-01

    Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, for those with physical and/or cognitive disabilities, it can be challenging to go somewhere unfamiliar. The final goal of this study is to develop a virtual reality walking system that allows users to walk in virtual worlds fabricated with computer graphics. We focus on a haptic device that can exert various plantar pressures on the soles of users' feet as an additional sensory channel during virtual reality walking. In this study, we discuss the use of a magnetic field sensitive elastomer (MSE) as a working material for the haptic interface on the sole. The first prototype with MSE was developed and evaluated in this work. According to measurements of plantar pressure, the device can exert different pressures on the sole of a lightweight user by applying a magnetic field to the MSE. The results also indicated that the magnetic circuit and the basic structure of the device's mechanism need improvement.

  9. Grasping trajectories in a virtual environment adhere to Weber's law.

    Science.gov (United States)

    Ozana, Aviad; Berman, Sigal; Ganel, Tzvi

    2018-06-01

    Virtual-reality and telerobotic devices simulate local motor control of virtual objects within computerized environments. Here, we explored grasping kinematics within a virtual environment and tested whether, as in normal 3D grasping, trajectories in the virtual environment are performed analytically, violating Weber's law with respect to object's size. Participants were asked to grasp a series of 2D objects using a haptic system, which projected their movements to a virtual space presented on a computer screen. The apparatus also provided object-specific haptic information upon "touching" the edges of the virtual targets. The results showed that grasping movements performed within the virtual environment did not produce the typical analytical trajectory pattern obtained during 3D grasping. Unlike as in 3D grasping, grasping trajectories in the virtual environment adhered to Weber's law, which indicates relative resolution in size processing. In addition, the trajectory patterns differed from typical trajectories obtained during 3D grasping, with longer times to complete the movement, and with maximum grip apertures appearing relatively early in the movement. The results suggest that grasping movements within a virtual environment could differ from those performed in real space, and are subjected to irrelevant effects of perceptual information. Such atypical pattern of visuomotor control may be mediated by the lack of complete transparency between the interface and the virtual environment in terms of the provided visual and haptic feedback. Possible implications of the findings to movement control within robotic and virtual environments are further discussed.

  10. Practice on an augmented reality/haptic simulator and library of virtual brains improves residents' ability to perform a ventriculostomy.

    Science.gov (United States)

    Yudkowsky, Rachel; Luciano, Cristian; Banerjee, Pat; Schwartz, Alan; Alaraj, Ali; Lemole, G Michael; Charbel, Fady; Smith, Kelly; Rizzi, Silvio; Byrne, Richard; Bendok, Bernard; Frim, David

    2013-02-01

    Ventriculostomy is a neurosurgical procedure for providing therapeutic cerebrospinal fluid drainage. Complications may arise during repeated attempts at placing the catheter in the ventricle. We studied the impact of simulation-based practice with a library of virtual brains on neurosurgery residents' performance in simulated and live surgical ventriculostomies. Using computed tomographic scans of actual patients, we developed a library of 15 virtual brains for the ImmersiveTouch system, a head- and hand-tracked augmented reality and haptic simulator. The virtual brains represent a range of anatomies including normal, shifted, and compressed ventricles. Neurosurgery residents participated in individual simulator practice on the library of brains including visualizing the 3-dimensional location of the catheter within the brain immediately after each insertion. Performance of participants on novel brains in the simulator and during actual surgery before and after intervention was analyzed using generalized linear mixed models. Simulator cannulation success rates increased after intervention, and live procedure outcomes showed improvement in the rate of successful cannulation on the first pass. However, the incidence of deeper, contralateral (simulator) and third-ventricle (live) placements increased after intervention. Residents reported that simulations were realistic and helpful in improving procedural skills such as aiming the probe, sensing the pressure change when entering the ventricle, and estimating how far the catheter should be advanced within the ventricle. Simulator practice with a library of virtual brains representing a range of anatomies and difficulty levels may improve performance, potentially decreasing complications due to inexpert technique.

  11. Training haptic stiffness discrimination: time course of learning with or without visual information and knowledge of results.

    Science.gov (United States)

    Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria

    2013-08-01

    In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors, the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks, the visual field is typically occluded, forcing stiffness perception to be dependent exclusively on haptic information. No studies to date have addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy was slightly improved, but decision time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinctive phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in a VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgical procedures, when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.

  12. Massage Therapy of the Back Using a Real-Time Haptic-Enhanced Telerehabilitation System

    Directory of Open Access Journals (Sweden)

    Cristina Ramírez-Fernández

    2017-01-01

    Full Text Available We present the usability evaluation of a haptic-enhanced telerehabilitation system for massage therapy of the back using the Vybe haptic gaming pad and the gesture sensor LEAP Motion controller. The evaluated system includes features that allow for (i) administering online therapy programs, (ii) providing self-adjustable and safe treatment of back massages using a virtual environment, and (iii) saving and replaying massage sessions according to a patient's therapy program. The usability evaluation with 25 older adults and 10 specialists suggests that the haptic telerehabilitation system is perceived with high usability and a pleasurable user experience, while providing personalized intensity of haptic therapy in a supervised, real-time, and secure way to treat the patient. Moreover, the specialists fully agree that the system design features, such as save and play and delimiting therapy zones, are the most important for back massage therapy, while the features of regulating feedback intensity and providing/receiving a massage remotely are also important. Finally, based on their comments, five design insights aimed at improving the current version of the system were generated.

  13. Design and implementation of visual-haptic assistive control system for virtual rehabilitation exercise and teleoperation manipulation.

    Science.gov (United States)

    Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv

    2008-01-01

    This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback through a haptic controller (Omni Phantom) to the user. A sensor-based assistive function and velocity-scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation; a camera and a laser range finder, controlled in real time by a PC, help the user define the intended path to the selected target. Real-time force feedback from the remote robot to the haptic controller is made possible by effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistive function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-based defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, one can use the haptic device for rehabilitation purposes in cases where assistance is needed to perform tasks (e.g., stroke rehab) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.
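
    One simple way to realize a sensor-based assistive function with velocity scaling, as described above, is to attenuate the operator's commanded velocity components that deviate from the sensor-defined path while passing the along-path component through. The sketch below is such an illustration; the scaling factors and the decomposition are assumptions, not the paper's implementation.

```python
import numpy as np

def assist_and_scale(raw_velocity, path_direction, scale_along=1.0, scale_across=0.3):
    """Sensor-based assistive function: scale operator velocity relative to a desired path.

    raw_velocity   : commanded Cartesian velocity from the haptic master
    path_direction : vector along the sensor-defined path to the target
    Components along the path pass (almost) unchanged; components across the path
    are attenuated, which both guides the motion and generates the resisting force
    the user perceives as assistance.
    """
    d = path_direction / np.linalg.norm(path_direction)
    v_along = np.dot(raw_velocity, d) * d
    v_across = raw_velocity - v_along
    return scale_along * v_along + scale_across * v_across
```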

  14. Audio Haptic Videogaming for Developing Wayfinding Skills in Learners Who are Blind.

    Science.gov (United States)

    Sánchez, Jaime; de Borba Campos, Marcia; Espinoza, Matías; Merabet, Lotfi B

    2014-01-01

    Interactive digital technologies are currently being developed as a novel tool for education and skill development. Audiopolis is an audio and haptic based videogame designed for developing orientation and mobility (O&M) skills in people who are blind. We have evaluated the cognitive impact of videogame play on O&M skills by assessing performance on a series of behavioral tasks carried out in both indoor and outdoor virtual spaces. Our results demonstrate that the use of Audiopolis had a positive impact on the development and use of O&M skills in school-aged learners who are blind. The impact of audio and haptic information on learning is also discussed.

  15. Ambient visual information confers a context-specific, long-term benefit on memory for haptic scenes.

    Science.gov (United States)

    Pasqualotto, Achille; Finucane, Ciara M; Newell, Fiona N

    2013-09-01

    We investigated the effects of indirect, ambient visual information on haptic spatial memory. Using touch only, participants first learned an array of objects arranged in a scene and were subsequently tested on their recognition of that scene which was always hidden from view. During haptic scene exploration, participants could either see the surrounding room or were blindfolded. We found a benefit in haptic memory performance only when ambient visual information was available in the early stages of the task but not when participants were initially blindfolded. Specifically, when ambient visual information was available a benefit on performance was found in a subsequent block of trials during which the participant was blindfolded (Experiment 1), and persisted over a delay of one week (Experiment 2). However, we found that the benefit for ambient visual information did not transfer to a novel environment (Experiment 3). In Experiment 4 we further investigated the nature of the visual information that improved haptic memory and found that geometric information about a surrounding (virtual) room rather than isolated object landmarks, facilitated haptic scene memory. Our results suggest that vision improves haptic memory for scenes by providing an environment-centred, allocentric reference frame for representing object location through touch. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Haptic teleoperation systems signal processing perspective

    CERN Document Server

    Lee, Jae-young

    2015-01-01

    This book examines the signal processing perspective in haptic teleoperation systems. It covers the topics of prediction, estimation, architecture, data compression, and error correction that can be applied to haptic teleoperation systems. The authors begin with an overview of haptic teleoperation systems, then look at a Bayesian approach to haptic teleoperation systems. They move on to a discussion of haptic data compression, haptic data digitization and forward error correction. The book:
    · Presents haptic data prediction/estimation methods that compensate for unreliable networks
    · Discusses haptic data compression that reduces haptic data size over limited network bandwidth, and haptic data error correction that compensates for the packet loss problem
    · Provides signal processing techniques used with existing control architectures.
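
    As a concrete example of the haptic data compression the book discusses, the sketch below applies a perceptual deadband: a sample is transmitted only if it differs from the last transmitted value by more than a fixed fraction. The deadband value and the hold-last-sample receiver behaviour are illustrative assumptions rather than the book's specific algorithms.

```python
def deadband_compress(samples, deadband=0.10):
    """Perceptual deadband compression of a haptic (e.g. force) signal.

    A new sample is transmitted only when it differs from the last transmitted
    value by more than the deadband fraction; the receiver holds the previous
    value otherwise. This is one standard way to reduce the haptic data rate
    over a limited network (parameter values here are illustrative).
    """
    sent = []
    last = None
    for i, x in enumerate(samples):
        if last is None or abs(x - last) > deadband * max(abs(last), 1e-9):
            sent.append((i, x))    # transmit (sample index, value) pair
            last = x
    return sent
```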

  17. Enhancing Mediated Interpersonal Communication through Affective Haptics

    Science.gov (United States)

    Tsetserukou, Dzmitry; Neviarouskaya, Alena; Prendinger, Helmut; Kawakami, Naoki; Ishizuka, Mitsuru; Tachi, Susumu

    Driven by the motivation to enhance the emotionally immersive experience of real-time messaging in the 3D virtual world Second Life, we propose a conceptually novel approach to reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by the partner through a specially designed system, iFeel_IM!. In the paper we describe the development of the novel haptic devices (HaptiHeart, HaptiHug, HaptiTickler, HaptiCooler, and HaptiWarmer) integrated into the iFeel_IM! system, whose architecture is presented in detail.

  18. Haptic perception

    NARCIS (Netherlands)

    Kappers, A.M.L.; Bergmann Tiest, W.M.

    2013-01-01

    Fueled by novel applications, interest in haptic perception is growing. This paper provides an overview of the state of the art of a number of important aspects of haptic perception. By means of touch we can not only perceive quite different material properties, such as roughness, compliance,

  19. IMPROVING MEDICAL EDUCATION: SIMULATING CHANGES IN PATIENT ANATOMY USING DYNAMIC HAPTIC FEEDBACK.

    Science.gov (United States)

    Yovanoff, Mary; Pepley, David; Mirkin, Katelin; Moore, Jason; Han, David; Miller, Scarlett

    2016-09-01

    Virtual simulation is an emerging field in medical education. Research suggests that simulation reduces complication rates and improves learning gains for medical residents. One benefit of simulators is their allowance for more realistic and dynamic patient anatomies. While potentially useful throughout medical education, few studies have explored the impact of dynamic haptic simulators on medical training. In light of this research void, this study was developed to examine how a Dynamic-Haptic Robotic Trainer (DHRT) impacts medical student self-efficacy and skill gains compared to traditional simulators developed to train students in Internal Jugular Central Venous Catheter (IJ CVC) placement. The study was conducted with 18 third year medical students with no prior CVC insertion experience who underwent a pre-test, simulator training (manikin, robotic, or mixed) and post-test. The results revealed the DHRT as a useful method for training CVC skills and supports further research on dynamic haptic trainers in medical education.

  20. Pervasive haptics science, design, and application

    CERN Document Server

    Saga, Satoshi; Konyo, Masashi

    2016-01-01

    This book examines the state of the art in diverse areas of haptics (touch)-related research, including the psychophysics and neurophysiology of haptics, development of haptics displays and sensors, and applications to a wide variety of fields such as industry, education, therapy, medicine, and welfare for the visually impaired. It also discusses the potential of future haptics interaction, such as haptics for emotional control and remote haptics communication. The book offers a valuable resource not only for haptics and human interface researchers, but also for developers and designers at manufacturing corporations and in the entertainment industries.

  1. A virtual reality based simulator for learning nasogastric tube placement.

    Science.gov (United States)

    Choi, Kup-Sze; He, Xuejian; Chiang, Vico Chung-Lim; Deng, Zhaohong

    2015-02-01

    Nasogastric tube (NGT) placement is a common clinical procedure where a plastic tube is inserted into the stomach through the nostril for feeding or drainage. However, the placement is a blind process in which the tube may be mistakenly inserted into other locations, leading to unexpected complications or fatal incidents. The placement techniques are conventionally acquired by practising on unrealistic rubber mannequins or on humans. In this paper, a virtual reality based training simulation system is proposed to facilitate the training of NGT placement. It focuses on the simulation of tube insertion and the rendering of the feedback forces with a haptic device. A hybrid force model is developed to compute the forces analytically or numerically under different conditions, including the situations when the patient is swallowing or when the tube is buckled at the nostril. To ensure real-time interactive simulations, an offline simulation approach is adopted to obtain the relationship between the insertion depth and insertion force using a non-linear finite element method. The offline dataset is then used to generate real-time feedback forces by interpolation. The virtual training process is logged quantitatively with metrics that can be used for assessing objective performance and tracking progress. The system has been evaluated by nursing professionals. They found that the haptic feeling produced by the simulated forces is similar to their experience during real NGT insertion. The proposed system provides a new educational tool to enhance conventional training in NGT placement. Copyright © 2014 Elsevier Ltd. All rights reserved.
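
    The real-time strategy described above, precomputing the depth-force relationship offline with a finite element model and interpolating it during simulation, can be sketched as below. The depth-force table, the swallowing adjustment and the linear interpolation are placeholders; the actual simulator uses its own hybrid force model and offline FEM dataset.

```python
import numpy as np

# Hypothetical offline dataset: insertion depth [mm] vs. resistance force [N],
# standing in for the relationship precomputed with the nonlinear FE model.
DEPTHS = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
FORCES = np.array([0.0, 0.4, 0.9, 1.1, 1.8, 2.3])

def realtime_feedback_force(depth_mm, swallowing=False):
    """Look up the haptic feedback force for the current insertion depth.

    The force-depth curve is precomputed offline; at run time a simple
    interpolation keeps the haptic loop fast. A swallowing event is modelled
    here as a crude reduction of resistance (purely illustrative).
    """
    f = np.interp(depth_mm, DEPTHS, FORCES)
    return 0.5 * f if swallowing else f
```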

  2. The effect of haptic guidance and visual feedback on learning a complex tennis task.

    Science.gov (United States)

    Marchal-Crespo, Laura; van Raai, Mark; Rauter, Georg; Wolf, Peter; Riener, Robert

    2013-11-01

    While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that the haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback conditions-visual or haptic guidance-optimize learning of the discrete and continuous elements of a timing task. The experiment consisted in performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task-learning to start a tennis stroke and (2) a tracking task-learning to follow a velocity profile. The effect that the task difficulty and subject's initial skill level have on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and in especially difficult discrete tasks, while visual feedback seems to benefit more skilled subjects. Additionally, haptic guidance seemed to promote learning in a time-critical tracking task, while visual feedback tended to deteriorate the performance independently of the task difficulty and subjects' initial skill level. Haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on

  3. Mechanical model of orthopaedic drilling for augmented-haptics-based training.

    Science.gov (United States)

    Pourkand, Ashkan; Zamani, Naghmeh; Grow, David

    2017-10-01

    In this study, augmented-haptic feedback is used to combine a physical object with virtual elements in order to simulate anatomic variability in bone. This requires generating levels of force/torque consistent with clinical bone drilling, which exceed the capabilities of commercially available haptic devices. Accurate total force generation is facilitated by a predictive model of axial force during simulated orthopaedic drilling. This model is informed by kinematic data collected while drilling into synthetic bone samples using an instrumented linkage attached to the orthopaedic drill. Axial force is measured using a force sensor incorporated into the bone fixture. A nonlinear function, relating force to axial position and velocity, was used to fit the data. The normalized root-mean-square error (RMSE) of forces predicted by the model compared to those measured experimentally was 0.11 N across various bones with significant differences in geometry and density. This suggests that a predictive model can be used to capture relevant variations in the thickness and hardness of cortical and cancellous bone. The practical performance of this approach is measured using the Phantom Premium haptic device, with some required customizations. Copyright © 2017 Elsevier Ltd. All rights reserved.
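
    The modelling step above, fitting a nonlinear function of axial position and velocity to measured drilling forces and reporting a normalized RMSE, might look like the following sketch. The specific model form (a power law in position plus a linear velocity term), the initial guess and the normalization are assumptions for illustration; the paper's fitted function is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def drilling_force(xv, k1, k2, n):
    """Hypothetical nonlinear model: axial force as a function of position and velocity."""
    x, v = xv
    return k1 * np.maximum(x, 0.0) ** n + k2 * v

def fit_force_model(x, v, f_measured):
    """Fit the model to recorded drilling data and report the normalized RMSE."""
    params, _ = curve_fit(drilling_force, (x, v), f_measured, p0=(1.0, 0.1, 1.0))
    f_pred = drilling_force((x, v), *params)
    rmse = np.sqrt(np.mean((f_pred - f_measured) ** 2))
    nrmse = rmse / (f_measured.max() - f_measured.min())
    return params, nrmse
```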

  4. Real-time solution of the forward kinematics for a parallel haptic device using a numerical approach based on neural networks

    International Nuclear Information System (INIS)

    Liu, Guan Yang; Zhang, Yuru; Wang, Yan; Xie, Zheng

    2015-01-01

    This paper proposes a neural network (NN)-based approach to solve the forward kinematics of a 3-RRR spherical parallel mechanism designed for a haptic device. The proposed algorithm aims to remarkably speed up computation to meet the requirement of high frequency rendering for haptic display. To achieve high accuracy, the workspace of the haptic device is divided into smaller subspaces. The proposed algorithm contains NNs of two different precision levels: a rough estimation NN to identify the index of the subspace and several precise estimation networks with expected accuracy to calculate the forward kinematics. For continuous motion, the algorithm structure is further simplified to save internal memory and increase computing speed, which are critical for a haptic device control system running on an embedded platform. Compared with the mostly used Newton-Raphson method, the proposed algorithm and its simplified version greatly increase the calculation speed by about four times and 10 times, respectively, while achieving the same accuracy level. The proposed approach is of great significance for solving the forward kinematics of parallel mechanism used as haptic devices when high update frequency is needed but hardware resources are limited.
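
    The two-level structure described above, a rough network that picks the workspace subspace followed by a precise per-subspace network, can be organized as in the sketch below. The networks are stand-in callables and the dummy example at the end is purely illustrative; the paper's actual network architectures, subspace partition and training procedure are not reproduced.

```python
import numpy as np

class SubspaceFKEstimator:
    """Two-level estimator in the spirit of the paper's approach: a rough network
    picks the workspace subspace, and a per-subspace precise network returns the
    forward kinematics. The networks here are placeholder callables."""

    def __init__(self, rough_net, precise_nets):
        self.rough_net = rough_net          # joint angles -> subspace index
        self.precise_nets = precise_nets    # list of: joint angles -> platform pose

    def forward_kinematics(self, joint_angles):
        idx = int(self.rough_net(joint_angles))
        return self.precise_nets[idx](joint_angles)

# Minimal usage with dummy "networks" for two subspaces:
rough = lambda q: 0 if q[0] < 0.5 else 1
precise = [lambda q: np.array([q[0], q[1], 0.1]),
           lambda q: np.array([q[0], q[1], 0.2])]
fk = SubspaceFKEstimator(rough, precise)
print(fk.forward_kinematics(np.array([0.3, 0.4, 0.2])))
```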

  5. The Effect of Dopaminergic Medication on Joint Kinematics during Haptic Movements in Individuals with Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Kuan-yi Li

    2017-01-01

    Full Text Available This study examined whether altered joint angular motion during haptic exploration could account for a decline in haptic sensitivity in individuals with PD by analyzing joint position data during haptic exploration of a curved contour. Each participant's hand was passively moved by a robotic arm along the edges of a virtual box (5 cm × 15 cm) with a curved left wall. After each trial, participants indicated whether the contour was curved or straight. Visual, auditory, and tactile cues were occluded, and an electrogoniometer recorded shoulder and elbow joint angles during each trial. The PD group in the OFF state had a higher mean detection threshold (4.67 m−1) than the control group (3.06 m−1). Individuals with PD in the OFF state also had a significantly greater magnitude of shoulder abduction than those in the ON state (p = 0.003) and a smaller magnitude of elbow flexion than those in the ON state or compared to the control group (both p < 0.001). These findings suggest that individuals with PD employ joint configurations that may contribute to haptic insensitivity. Dopamine replacement therapy improved joint configurations during haptic exploration in patients with PD, suggesting a role for dopaminergic dysfunction in PD-related haptic insensitivity.

  6. VIRGY: a virtual reality and force feedback based endoscopic surgery simulator.

    Science.gov (United States)

    Baur, C; Guzzoni, D; Georg, O

    1998-01-01

    This paper describes the VIRGY project at the VRAI Group (Virtual Reality and Active Interface), Swiss Federal Institute of Technology (Lausanne, Switzerland). Since 1994, we have been investigating a variety of virtual-reality based methods for simulating laparoscopic surgery procedures. Our goal is to develop an endoscopic surgical training tool which realistically simulates the interactions between one or more surgical instruments and gastrointestinal organs. To support real-time interaction and manipulation between instruments and organs, we have developed several novel graphic simulation techniques. In particular, we are using live video texturing to achieve dynamic effects such as bleeding or vaporization of fatty tissues. Special texture manipulations allow us to generate pulsing objects while minimizing processor load. Additionally, we have created a new surface deformation algorithm which enables real-time deformations under external constraints. Lastly, we have developed a new 3D object definition which allows us to perform operations such as total or partial object cutting, as well as to selectively render objects with different levels of detail. To provide realistic physical simulation of the forces and torques on surgical instruments encountered during an operation, we have also designed a new haptic device dedicated to endoscopic surgery constraints. We are using special interpolation and extrapolation techniques to integrate our 25 Hz visual simulation with the 300 Hz feedback required for realistic tactile interaction. The full VIRGY simulator has been tested by surgeons, and the quality of both our visual and haptic simulation has been judged sufficient for training basic surgery gestures.
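
    The interpolation/extrapolation idea mentioned at the end of this record, bridging a 25 Hz visual simulation and a 300 Hz haptic loop, can be sketched as follows. Linear interpolation within a visual frame and velocity extrapolation beyond it are one common choice; the exact scheme used in VIRGY is not specified here, so treat this as an illustration.

```python
import numpy as np

VISUAL_HZ = 25.0     # physics / rendering update rate (as in VIRGY)
HAPTIC_HZ = 300.0    # force feedback rate required for stable tactile interaction

def interpolated_state(prev_state, next_state, t_since_visual_update):
    """Estimate the organ/instrument state between two visual simulation frames.

    The haptic loop runs ~12x faster than the visual simulation, so intermediate
    states are interpolated (or extrapolated past the latest frame) to compute
    smooth feedback forces. Linear interpolation is used here as an illustration.
    """
    alpha = t_since_visual_update * VISUAL_HZ      # 0..1 within a visual frame
    if alpha <= 1.0:
        return (1.0 - alpha) * prev_state + alpha * next_state
    # Past the newest frame: extrapolate along the last known velocity.
    velocity = (next_state - prev_state) * VISUAL_HZ
    return next_state + velocity * (t_since_visual_update - 1.0 / VISUAL_HZ)
```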

  7. Active skin as new haptic interface

    Science.gov (United States)

    Vuong, Nguyen Huu Lam; Kwon, Hyeok Yong; Chuc, Nguyen Huu; Kim, Duksang; An, Kuangjun; Phuc, Vuong Hong; Moon, Hyungpil; Koo, Jachoon; Lee, Youngkwan; Nam, Jae-Do; Choi, Hyouk Ryeol

    2010-04-01

    In this paper, we present a new haptic interface, called "active skin", which is configured with a tactile sensor and a tactile stimulator in a single haptic cell; multiple haptic cells are embedded in a dielectric elastomer. The active skin generates a wide variety of haptic sensations in response to touch by synchronizing the sensor and the stimulator. In this paper, the design of the haptic cell is derived via iterative analysis and design procedures. A fabrication method dedicated to the proposed device is investigated and a controller to drive multiple haptic cells is developed. In addition, several experiments are performed to evaluate the performance of the active skin.

  8. Design of high-fidelity haptic display for one-dimensional force reflection applications

    Science.gov (United States)

    Gillespie, Brent; Rosenberg, Louis B.

    1995-12-01

    This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. As a simplifying aspect for the simulator design, all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
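
    A perceptual model of epidural insertion along a fixed line can be as simple as a table of tissue layers with depth ranges and resistance levels, including the characteristic loss of resistance on entering the epidural space. The sketch below illustrates this; the layer depths and forces are invented for illustration and are not the calibrated values of the simulator.

```python
# Hypothetical one-dimensional perceptual model of epidural needle insertion:
# each tissue layer has a depth range and a resistance level, with a sharp
# "loss of resistance" after the ligamentum flavum. Values are illustrative only.
LAYERS = [
    (0.0, 10.0, 0.5),    # (start depth mm, end depth mm, resistance N): skin / fat
    (10.0, 30.0, 1.5),   # supraspinous / interspinous ligaments
    (30.0, 40.0, 3.0),   # ligamentum flavum: highest resistance
    (40.0, 60.0, 0.2),   # epidural space: loss of resistance
]

def needle_force(depth_mm):
    """Return the resisting force felt along the fixed insertion line."""
    for start, end, resistance in LAYERS:
        if start <= depth_mm < end:
            return resistance
    return 0.0
```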

  9. Human haptic perception is interrupted by explorative stops of milliseconds

    Directory of Open Access Journals (Sweden)

    Martin Grunwald

    2014-04-01

    Full Text Available Introduction: The explorative scanning movements of the hands have been compared to those of the eyes. The visual process is known to be composed of alternating phases of saccadic eye movements and fixation pauses. Descriptive results suggest that during the haptic exploration of objects short movement pauses occur as well. The goal of the present study was to detect these explorative stops (ES) during one-handed and two-handed haptic explorations of various objects and patterns, and to measure their duration. Additionally, the associations between the following variables were analyzed: (a) between mean exploration time and duration of ES, (b) between certain stimulus features and ES frequency, and (c) the duration of ES during the course of exploration. Methods: Five different experiments were used. The first two experiments were classical recognition tasks of unknown haptic stimuli (A) and of common objects (B). In experiment C, space-position information of angle legs had to be perceived and reproduced. For experiments D and E the PHANToM haptic device was used for the exploration of virtual (D) and real (E) sunken reliefs. Results: In each experiment we observed explorative stops of different average durations. For experiment A: 329.50 ms, experiment B: 67.47 ms, experiment C: 189.92 ms, experiment D: 186.17 ms and experiment E: 140.02 ms. Significant correlations were observed between exploration time and the duration of ES. Also, ES occurred more frequently, but not exclusively, at defined stimulus features like corners, curves and the endpoints of lines. However, explorative stops do not occur every time a stimulus feature is explored. Conclusions: We assume that ES are a general aspect of human haptic exploration processes. We have tried to interpret the occurrence and duration of ES with respect to the Hypotheses-Rebuild-Model and the Limited Capacity Control System theory.
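
    Detecting explorative stops in recorded hand trajectories essentially amounts to finding intervals where movement speed stays below a threshold for at least a minimum duration. The sketch below shows such a detector; the speed threshold, minimum duration and sampling assumptions are illustrative and are not the criteria used in the study.

```python
import numpy as np

def detect_explorative_stops(positions, dt, speed_threshold=5.0, min_duration=0.04):
    """Detect movement pauses (explorative stops) in a recorded hand trajectory.

    positions       : (N, 3) array of hand/stylus positions [mm], sampled every dt seconds
    speed_threshold : speed below which the hand counts as 'stopped' [mm/s]
    min_duration    : minimum pause length to report [s]
    """
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    stopped = speeds < speed_threshold
    stops, start = [], None
    for i, is_stopped in enumerate(stopped):
        if is_stopped and start is None:
            start = i
        elif not is_stopped and start is not None:
            if (i - start) * dt >= min_duration:
                stops.append((start * dt, (i - start) * dt))   # (onset time, duration)
            start = None
    if start is not None and (len(stopped) - start) * dt >= min_duration:
        stops.append((start * dt, (len(stopped) - start) * dt))
    return stops
```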

  10. Pantomime-grasping: Advance knowledge of haptic feedback availability supports an absolute visuo-haptic calibration

    Directory of Open Access Journals (Sweden)

    Shirin Davarpanah Jazi

    2016-05-01

    Full Text Available An emerging issue in movement neurosciences is whether haptic feedback influences the nature of the information supporting a simulated grasping response (i.e., pantomime-grasping). In particular, recent work by our group contrasted pantomime-grasping responses performed with (i.e., PH+ trials) and without (i.e., PH− trials) terminal haptic feedback in separate blocks of trials. Results showed that PH− trials were mediated via relative visual information. In contrast, PH+ trials showed evidence of an absolute visuo-haptic calibration – a finding attributed to an error signal derived from a comparison between expected and actual haptic feedback (i.e., an internal forward model). The present study examined whether advance knowledge of haptic feedback availability influences the aforementioned calibration process. To that end, PH− and PH+ trials were completed in separate blocks (i.e., the feedback schedule used in our group's previous study) and in a block wherein PH− and PH+ trials were randomly interleaved on a trial-by-trial basis (i.e., a random feedback schedule). In other words, the random feedback schedule precluded participants from predicting whether haptic feedback would be available at the movement goal location. We computed just-noticeable-difference (JND) values to determine whether responses adhered to, or violated, the relative psychophysical principles of Weber's law. Results for the blocked feedback schedule replicated our group's previous work, whereas in the random feedback schedule both PH− and PH+ trials were supported via relative visual information. Accordingly, we propose that a priori knowledge of haptic feedback is necessary to support an absolute visuo-haptic calibration. Moreover, our results demonstrate that the presence and expectancy of haptic feedback are important considerations in contrasting the behavioral and neural properties of natural and simulated (i.e., pantomime-grasping) responses.

  11. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, which was a significantly higher rate than the 73% concordance rate (concordance in 19 of 26 patients) obtained by review of 2D images only (p < 0.05). Interactive virtual simulation using the 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.

  12. Touch Is Everywhere: Floor Surfaces as Ambient Haptic Interfaces.

    Science.gov (United States)

    Visell, Y; Law, A; Cooperstock, J R

    2009-01-01

    Floor surfaces are notable for the diverse roles that they play in our negotiation of everyday environments. Haptic communication via floor surfaces could enhance or enable many computer-supported activities that involve movement on foot. In this paper, we discuss potential applications of such interfaces in everyday environments and present a haptically augmented floor component through which several interaction methods are being evaluated. We describe two approaches to the design of structured vibrotactile signals for this device. The first is centered on a musical phrase metaphor, as employed in prior work on tactile display. The second is based upon the synthesis of rhythmic patterns of virtual physical impact transients. We report on an experiment in which participants were able to identify communication units that were constructed from these signals and displayed via a floor interface at well above chance levels. The results support the feasibility of tactile information display via such interfaces and provide further indications as to how to effectively design vibrotactile signals for them.
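
    The second signal-design approach mentioned above, rhythmic patterns of virtual impact transients, can be sketched by concatenating exponentially decaying sinusoids separated by silent gaps. The sampling rate, frequency, decay and pattern structure below are illustrative choices, not the parameters used for the floor interface.

```python
import numpy as np

def impact_transient(fs=2000, freq=80.0, decay=30.0, amplitude=1.0, duration=0.15):
    """Synthesize one virtual impact transient: an exponentially decaying sinusoid."""
    t = np.arange(0.0, duration, 1.0 / fs)
    return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def rhythmic_pattern(intervals, **kwargs):
    """Concatenate impact transients separated by the given silent gaps [s],
    producing a rhythmic vibrotactile message of the kind described above."""
    fs = kwargs.get("fs", 2000)
    out = []
    for gap in intervals:
        out.append(impact_transient(**kwargs))
        out.append(np.zeros(int(gap * fs)))
    return np.concatenate(out)
```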

  13. A real-time haptic interface for interventional radiology procedures.

    Science.gov (United States)

    Moix, Thomas; Ilic, Dejan; Fracheboud, Blaise; Zoethout, Jurjen; Bleuler, Hannes

    2005-01-01

    Interventional Radiology (IR) is a minimally invasive surgery (MIS) technique in which guidewires and catheters are steered in the vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be correctly trained to master hand-eye coordination, instrument manipulation and procedure protocols. This paper proposes a computer-assisted training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the anatomy of the patient linked to a robotic interface providing haptic force feedback. The paper focuses on the requirements, design and prototyping of a specific part of the haptic interface dedicated to catheters. Translational tracking and force feedback on the catheter are provided by two cylinders forming a friction drive arrangement. The whole friction drive can be set in rotation by an additional motor providing torque feedback. A force sensor and a torque sensor are integrated in the cylinders for direct measurement on the catheter, enabling disturbance cancellation with a closed-loop force control strategy.
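
    The closed-loop force control mentioned at the end of the record, using direct force measurement on the catheter to cancel drive friction and other disturbances, can be sketched as a simple PI loop running at the haptic servo rate. The gains, the loop rate and the PI structure are assumptions for illustration, not the controller implemented in the device.

```python
def force_control_step(f_desired, f_measured, state, kp=0.8, ki=20.0, dt=1.0 / 1000.0):
    """One step of a closed-loop force controller for the catheter friction drive.

    The directly measured catheter force is compared with the force commanded by
    the VR simulation; a PI law computes the motor effort so that friction and
    other disturbances in the drive are cancelled. Gains are illustrative.
    """
    error = f_desired - f_measured
    state["integral"] += error * dt
    command = kp * error + ki * state["integral"]
    return command, state

# Usage: state = {"integral": 0.0}; call at the haptic servo rate (e.g. 1 kHz).
```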

  14. Engineering haptic devices a beginner's guide

    CERN Document Server

    Hatzfeld, Christian

    2014-01-01

    In this greatly reworked second edition of Engineering Haptic Devices, the psychophysics content has been thoroughly revised and updated. Chapters on haptic interaction, system structures and design methodology were rewritten from scratch to include further basic principles and recent findings. New chapters on the evaluation of haptic systems and the design of three exemplary haptic systems from science and industry have been added. This book was written for students and engineers who are faced with the development of a task-specific haptic system. It is a reference book for the basics of hap

  15. 1st International AsiaHaptics conference

    CERN Document Server

    Ando, Hideyuki; Kyung, Ki-Uk

    2015-01-01

    This book is aimed not only at haptics and human interface researchers, but also at developers and designers from manufacturing corporations and the entertainment industry who are working to change our lives. This publication comprises the proceedings of the first International AsiaHaptics conference, held in Tsukuba, Japan, in 2014. The book describes the state of the art of the diverse haptics- (touch-) related research, including scientific research into haptics perception and illusion, development of haptics devices, and applications for a wide variety of fields such as education, medicine, telecommunication, navigation, and entertainment.

  16. Clinical and optical intraocular performance of rotationally asymmetric multifocal IOL plate-haptic design versus C-loop haptic design.

    Science.gov (United States)

    Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo

    2013-04-01

    To compare the visual and intraocular optical quality outcomes with different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction for the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values from the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems to be a better design to support rotational asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.

  17. Real-time haptic cutting of high-resolution soft tissues.

    Science.gov (United States)

    Wu, Jun; Westermann, Rüdiger; Dick, Christian

    2014-01-01

    We present our systematic efforts in advancing the computational performance of physically accurate soft tissue cutting simulation, which is at the core of surgery simulators in general. We demonstrate a real-time performance of 15 simulation frames per second for haptic soft tissue cutting of a deformable body at an effective resolution of 170,000 finite elements. This is achieved by the following innovative components: (1) a linked octree discretization of the deformable body, which allows for fast and robust topological modifications of the simulation domain, (2) a composite finite element formulation, which substantially reduces the number of simulation degrees of freedom and thus enables a careful balance between simulation performance and accuracy, (3) a highly efficient geometric multigrid solver for solving the linear systems of equations arising from implicit time integration, (4) an efficient collision detection algorithm that effectively exploits the composition structure, and (5) a stable haptic rendering algorithm for computing the feedback forces. Considering that our method increases the finite element resolution for physically accurate real-time soft tissue cutting simulation by an order of magnitude, our technique has a high potential to significantly advance the realism of surgery simulators.
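
    The geometric multigrid solver mentioned in item (3) is the workhorse for the linear systems arising from implicit time integration. As a purely illustrative sketch of that class of solver (not the paper's corotational-elasticity, octree-based implementation), the following V-cycle solves a 1D Poisson problem with Gauss-Seidel smoothing, residual restriction by injection and linear-interpolation prolongation.

```python
import numpy as np

def smooth(u, f, h, iters=3):
    # Gauss-Seidel smoothing for -u'' = f with zero Dirichlet boundaries.
    for _ in range(iters):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def restrict(r):
    return r[::2].copy()                      # injection onto the coarse grid

def prolong(e):
    fine = np.zeros(2 * len(e) - 1)
    fine[::2] = e
    fine[1::2] = 0.5 * (e[:-1] + e[1:])       # linear interpolation
    return fine

def v_cycle(u, f, h, levels):
    u = smooth(u, f, h)                       # pre-smoothing
    if levels > 1:
        r = restrict(residual(u, f, h))
        e = v_cycle(np.zeros_like(r), r, 2 * h, levels - 1)
        u += prolong(e)                       # coarse-grid correction
        u = smooth(u, f, h)                   # post-smoothing
    return u

n = 65                                        # 2^k + 1 points so the grids nest
h = 1.0 / (n - 1)
f = np.ones(n)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h, levels=4)
print(np.linalg.norm(residual(u, f, h)))      # residual shrinks with each cycle
```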

  18. Visual and Haptic Mental Rotation

    Directory of Open Access Journals (Sweden)

    Satoshi Shioiri

    2011-10-01

    Full Text Available It is well known that visual information can be retained in several types of memory systems. Haptic information can also be retained in a memory because we can repeat a hand movement. There may be a common memory system for vision and action. On the one hand, it may be convenient to have a common system for acting on visual information. On the other hand, different modalities may have their own memory and use the retained information in a modality-specific form without transformation. We compared the memory properties of visual and haptic information. There is a phenomenon known as mental rotation, which is possibly unique to visual representation. Mental rotation is a phenomenon in which reaction time increases with the angle of a visual target (e.g., a letter to be identified). The phenomenon is explained by the time needed to rotate the representation of the target in the visual system. In this study, we compared the effect of stimulus angle on visual and haptic shape identification (two-line shapes were used). We found a typical mental rotation effect for the visual stimulus. However, no such effect was found for the haptic stimulus. This difference cannot be explained by modality differences in response, because a similar difference was found even when a haptic response was used for the visual stimulus and a visual response was used for the haptic stimulus. These results indicate that there are independent systems for visual and haptic representations.

  19. Interacting with the biomolecular solvent accessible surface via a haptic feedback device

    Directory of Open Access Journals (Sweden)

    Hayward Steven

    2009-10-01

    Full Text Available Abstract Background Since the 1950s, computer-based renderings of molecules have been produced to aid researchers in their understanding of biomolecular structure and function. A major consideration for any molecular graphics software is the ability to visualise the three dimensional structure of the molecule. Traditionally, this was accomplished via stereoscopic pairs of images and later realised with three dimensional display technologies. Using a haptic feedback device in combination with molecular graphics has the potential to enhance three dimensional visualisation. Although haptic feedback devices have been used to feel the interaction forces during molecular docking, they have not been used explicitly as an aid to visualisation. Results A haptic rendering application for biomolecular visualisation has been developed that allows the user to gain three-dimensional awareness of the shape of a biomolecule. By using a water molecule as the probe, modelled as an oxygen atom having hard-sphere interactions with the biomolecule, the process of exploration has the further benefit of being able to determine regions on the molecular surface that are accessible to the solvent. This gives insight into how awkward it is for a water molecule to gain access to or escape from channels and cavities, indicating possible entropic bottlenecks. In the case of liver alcohol dehydrogenase bound to the inhibitor SAD, it was found that there is a channel just wide enough for a single water molecule to pass through. Placing the probe coincident with crystallographic water molecules suggests that they are sometimes located within small pockets that provide a sterically stable environment irrespective of hydrogen bonding considerations. Conclusion By using the software, named HaptiMol ISAS (available from http://www.haptimol.co.uk, one can explore the accessible surface of biomolecules using a three-dimensional input device to gain insights into the shape and water

  20. Distributed rendering for multiview parallax displays

    Science.gov (United States)

    Annen, T.; Matusik, W.; Pfister, H.; Seidel, H.-P.; Zwicker, M.

    2006-02-01

    3D display technology holds great promise for the future of television, virtual reality, entertainment, and visualization. Multiview parallax displays deliver stereoscopic views without glasses to arbitrary positions within the viewing zone. These systems must include a high-performance and scalable 3D rendering subsystem in order to generate multiple views at real-time frame rates. This paper describes a distributed rendering system for large-scale multiview parallax displays built with a network of PCs, commodity graphics accelerators, multiple projectors, and multiview screens. The main challenge is to render various perspective views of the scene and assign rendering tasks effectively. In this paper we investigate two different approaches: Optical multiplexing for lenticular screens and software multiplexing for parallax-barrier displays. We describe the construction of large-scale multi-projector 3D display systems using lenticular and parallax-barrier technology. We have developed different distributed rendering algorithms using the Chromium stream-processing framework and evaluate the trade-offs and performance bottlenecks. Our results show that Chromium is well suited for interactive rendering on multiview parallax displays.

  1. Age, Health and Attractiveness Perception of Virtual (Rendered) Human Hair.

    Science.gov (United States)

    Fink, Bernhard; Hufschmidt, Carla; Hirn, Thomas; Will, Susanne; McKelvey, Graham; Lankhof, John

    2016-01-01

    The social significance of physical appearance and beauty has been documented in many studies. It is known that even subtle manipulations of facial morphology and skin condition can alter people's perception of a person's age, health and attractiveness. While the variation in facial morphology and skin condition cues has been studied quite extensively, comparably little is known on the effect of hair on social perception. This has been partly caused by the technical difficulty of creating appropriate stimuli for investigations of people's response to systematic variation of certain hair characteristics, such as color and style, while keeping other features constant. Here, we present a modeling approach to the investigation of human hair perception using computer-generated, virtual (rendered) human hair. In three experiments, we manipulated hair diameter (Experiment 1), hair density (Experiment 2), and hair style (Experiment 3) of human (female) head hair and studied perceptions of age, health and attractiveness. Our results show that even subtle changes in these features have an impact on hair perception. We discuss our findings with reference to previous studies on condition-dependent quality cues in women that influence human social perception, thereby suggesting that hair is a salient feature of human physical appearance, which contributes to the perception of beauty.

  2. Comparison of Walking and Traveling-Wave Piezoelectric Motors as Actuators in Kinesthetic Haptic Devices.

    Science.gov (United States)

    Olsson, Pontus; Nysjo, Fredrik; Carlbom, Ingrid B; Johansson, Stefan

    2016-01-01

    Piezoelectric motors offer an attractive alternative to electromagnetic actuators in portable haptic interfaces: they are compact, have a high force-to-volume ratio, and can operate with limited or no gearing. However, the choice of a piezoelectric motor type is not obvious due to differences in performance characteristics. We present our evaluation of two commercial, operationally different, piezoelectric motors acting as actuators in two kinesthetic haptic grippers, a walking quasi-static motor and a traveling wave ultrasonic motor. We evaluate each gripper's ability to display common virtual objects including springs, dampers, and rigid walls, and conclude that the walking quasi-static motor is superior at low velocities. However, for applications where high velocity is required, traveling wave ultrasonic motors are a better option.
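
    The ability of each gripper to display springs, dampers, and rigid walls can be illustrated with the standard penalty-based haptic rendering loop: at each control cycle the gripper position and velocity are mapped to a force command. The sketch below is a generic illustration with made-up stiffness and damping values, not the evaluation code used in the study.

```python
def virtual_object_force(x, v, wall_pos=0.0, k_spring=500.0, b_damper=2.0, k_wall=3000.0):
    """Penalty-based rendering of the three canonical test objects.
    x, v: gripper position [m] and velocity [m/s] along one axis.
    Returns the force [N] the actuator should display for each object."""
    spring = -k_spring * x                                  # linear spring anchored at the origin
    damper = -b_damper * v                                  # pure viscous damper
    penetration = wall_pos - x                              # wall occupies x < wall_pos
    wall = k_wall * penetration if x < wall_pos else 0.0    # stiff unilateral spring
    return {"spring": spring, "damper": damper, "rigid_wall": wall}

print(virtual_object_force(x=-0.002, v=0.05))               # 2 mm inside the wall
```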

  3. Structural impact detection with vibro-haptic interfaces

    Science.gov (United States)

    Jung, Hwee-Kwon; Park, Gyuhae; Todd, Michael D.

    2016-07-01

    This paper presents a new sensing paradigm for structural impact detection using vibro-haptic interfaces. The goal of this study is to allow humans to ‘feel’ structural responses (impact, shape changes, and damage) and eventually determine health conditions of a structure. The target applications for this study are aerospace structures, in particular, airplane wings. Both hardware and software components are developed to realize the vibro-haptic-based impact detection system. First, L-shape piezoelectric sensor arrays are deployed to measure the acoustic emission data generated by impacts on a wing. Unique haptic signals are then generated by processing the measured acoustic emission data. These haptic signals are wirelessly transmitted to human arms, and with the vibro-haptic interface, human pilots could identify impact location, intensity and the possibility of subsequent damage initiation. With the haptic interface, the experimental results demonstrate that humans could correctly identify such events, while reducing false indications on structural conditions by capitalizing on humans' classification capability. Several important aspects of this study, including development of haptic interfaces, design of optimal human training strategies, and extension of the haptic capability into structural impact detection are summarized in this paper.

  4. The Haptic Bracelets: Learning Multi-Limb Rhythm Skills from Haptic Stimuli While Reading

    NARCIS (Netherlands)

    Bouwer, A.; Holland, S.; Dalgleish, M.; Holland, S.; Wilkie, K.; Mulholland, P.; Seago, A.

    2013-01-01

    The Haptic Bracelets are a system designed to help people learn multi-limbed rhythms (which involve multiple simultaneous rhythmic patterns) while they carry out other tasks. The Haptic Bracelets consist of vibrotactiles attached to each wrist and ankle, together with a computer system to control

  5. A hitchhiker's guide to virtual reality

    CERN Document Server

    McMenemy, Karen

    2007-01-01

    A Hitchhiker's Guide to Virtual Reality brings together under one cover all the aspects of graphics, video, audio, and haptics that have to work together to make virtual reality a reality. Like any good guide, it reveals the practical things you need to know, from the viewpoint of authors who have been there. This two-part guide covers the science, technology, and mathematics of virtual reality and then details its practical implementation. The first part looks at how the interface between human senses and technology works to create virtual reality, with a focus on vision, the most important s

  6. KinoHaptics: An Automated, Wearable, Haptic Assisted, Physio-therapeutic System for Post-surgery Rehabilitation and Self-care.

    Science.gov (United States)

    Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy

    2016-03-01

    A carefully planned, structured, and supervised physiotherapy program, following a surgery, is crucial for the successful diagnosis of physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy. Retaining a physiotherapist for an extended period is expensive and sometimes inaccessible. Researchers have tried to leverage the advancements in wearable sensors and motion tracking by building affordable, automated, physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. There are many aspects of an automated physiotherapy program which are yet to be addressed by the existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple demographics of the patients (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics, for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions of the patients. The system provides rich and accurate vibro-haptic feedback that can be felt by the user, irrespective of the physiological limitations. KinoHaptics is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and most importantly life-style compatibility. The system was evaluated under laboratory conditions, involving 14 users. Results show that KinoHaptics is highly convenient to use, and the vibro-haptic feedback is intuitive, accurate, and has been shown to prevent accidental injuries. Also, results show that KinoHaptics is persuasive in nature as it supports behavior change and habit building

  7. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Full Text Available Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.
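
    The MLE prediction referenced above (Ernst and Banks, 2002) weights each cue by its reliability, i.e. the inverse of its noise variance. The sketch below computes the predicted haptic weight from single-cue discrimination thresholds; the threshold values are hypothetical placeholders, not the study's data.

```python
def mle_weights(sigma_vision, sigma_haptic):
    """Reliability (inverse-variance) weighting predicted by the MLE model:
    w_i = (1 / sigma_i^2) / sum_j (1 / sigma_j^2)."""
    r_v, r_h = 1.0 / sigma_vision**2, 1.0 / sigma_haptic**2
    return r_v / (r_v + r_h), r_h / (r_v + r_h)

# Hypothetical single-cue size-discrimination thresholds (mm) for each tool ratio.
for ratio, sigma_h in [("1:1", 4.0), ("0.7:1", 5.5), ("1.4:1", 3.2)]:
    w_v, w_h = mle_weights(sigma_vision=3.0, sigma_haptic=sigma_h)
    print(f"tool {ratio}: predicted haptic weight = {w_h:.2f}")
```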

  8. Effects of a Haptic Augmented Simulation on K-12 Students' Achievement and Their Attitudes Towards Physics

    Science.gov (United States)

    Civelek, Turhan; Ucar, Erdem; Ustunel, Hakan; Aydin, Mehmet Kemal

    2014-01-01

    The current research aims to explore the effects of a haptic augmented simulation on students' achievement and their attitudes towards Physics in an immersive virtual reality environment (VRE). A quasi-experimental post-test design was employed utilizing experiment and control groups. The participants were 215 students from a K-12 school in…

  9. Freely-available, true-color volume rendering software and cryohistology data sets for virtual exploration of the temporal bone anatomy.

    Science.gov (United States)

    Kahrs, Lüder Alexander; Labadie, Robert Frederick

    2013-01-01

    Cadaveric dissection of temporal bone anatomy is not always possible or feasible in certain educational environments. Volume rendering using CT and/or MRI helps in understanding spatial relationships, but it suffers from nonrealistic depiction, especially regarding the color of anatomical structures. Freely available, nonstained histological data sets, together with software able to render such data sets in realistic color, could overcome this limitation and be a very effective teaching tool. With the recent availability of specialized public-domain software, volume rendering of true-color, histological data sets is now possible. We present both the feasibility of this approach and step-by-step instructions to allow processing of publicly available data sets (Visible Female Human and Visible Ear) into easily navigable 3-dimensional models using free software. Example renderings are shown to demonstrate the utility of these free methods in virtual exploration of the complex anatomy of the temporal bone. After exploring the data sets, the Visible Ear appears more natural than the Visible Human. We provide directions for easy-to-use, open-source software in conjunction with freely available histological data sets. This work facilitates self-education of the spatial relationships of anatomical structures inside the human temporal bone and allows exploration of surgical approaches prior to cadaveric testing and/or clinical implementation. Copyright © 2013 S. Karger AG, Basel.
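
    At the core of true-color volume rendering of such histological data is per-ray compositing of RGB samples with opacities. The following sketch shows the standard front-to-back "over" compositing step for a single ray; it is a generic illustration of the operation, independent of the specific public-domain software the abstract refers to.

```python
import numpy as np

def composite_ray(rgb_samples, alpha_samples):
    """Front-to-back compositing of true-colour (e.g. cryohistology) samples
    along one viewing ray, the core operation of colour volume rendering."""
    colour = np.zeros(3)
    remaining = 1.0                          # transmittance accumulated so far
    for rgb, a in zip(rgb_samples, alpha_samples):
        colour += remaining * a * rgb
        remaining *= (1.0 - a)
        if remaining < 1e-3:                 # early ray termination
            break
    return colour

ray_rgb = np.array([[0.8, 0.6, 0.5], [0.7, 0.3, 0.3], [0.9, 0.8, 0.7]])
ray_alpha = np.array([0.2, 0.5, 0.9])
print(composite_ray(ray_rgb, ray_alpha))
```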

  10. The Hedonic Haptic Player

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Cahill, Ben

    2017-01-01

    In this design case we present the Hedonic Haptic Player—a wearable device that plays different patterns of vibrations on the body as a form of music for the skin. With this we begin to explore the enjoyability of vibrations in a wearable set-up. Instead of implementing vibrations as a haptic...... output for some form of communication we want to explore their hedonistic value. The process leading up to the Hedonic Haptic player served as a first step in getting a grasp of the design space of vibrotactile stimuli in a broader sense. This is reported as seven episodes of explorations. The Hedonic...

  11. Design and control of MR haptic master/slave robot system for minimally invasive surgery

    Science.gov (United States)

    Uhm, Chang-Ho; Nguyen, Phoung Bac; Choi, Seung-Bok

    2013-04-01

    In this work, a magnetorheological (MR) haptic master and a slave robot for minimally invasive surgery (MIS) have been designed and tested. The proposed haptic master consists of four actuators: three MR brakes in a gimbal structure for 3-DOF rotational motion (X, Y and Z axes) and one MR linear actuator for 1-DOF translational motion. The proposed slave robot, which is connected to the haptic master, has multiple vertical joints and consists of four DC servomotors: three for positioning the endoscope and one for its spinning motion. We added a fixed bar with a ball joint on the base of the slave, for the endoscope position at the patient's abdomen, to maintain safety. A gimbal structure at the end of the slave robotic arm for the last joint rotates freely with respect to the pivot point of the fixed bar. This master-slave system runs as a teleoperation system over a TCP/IP connection, programmed in LabVIEW. In order to achieve the desired position trajectory, a proportional-integral-derivative (PID) controller is designed and implemented. It is demonstrated that effective tracking of the desired motion is achieved; results are presented in the time domain. Finally, an experiment in a virtual environment is undertaken to investigate the effectiveness of the MR haptic master device for the MIS system.
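
    The PID position controller mentioned above is the standard building block for making each slave joint track the trajectory commanded by the master. A minimal discrete-time sketch follows; the gains, the 1 kHz rate and the joint values are illustrative assumptions, not the values used in the paper.

```python
class PID:
    """Discrete PID controller used to make a slave joint track the
    master's commanded trajectory (illustrative gains, not the paper's)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# one joint of the slave arm, commanded at an assumed 1 kHz over the TCP/IP link
pid = PID(kp=8.0, ki=1.5, kd=0.2, dt=0.001)
command = pid.update(desired=0.35, measured=0.30)   # joint angles in radians
```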

  12. A virtual trainer concept for robot-assisted human motor learning in rowing

    Directory of Open Access Journals (Sweden)

    Baumgartner L.

    2011-12-01

    Full Text Available Maintaining a high attention level and observing multiple physiological and biomechanical variables at the same time with high precision is very challenging for human trainers. Concurrent augmented feedback, which is suggested to enhance motor learning in complex motor tasks, can also hardly be provided by a human trainer. Thus, in this paper, a concept for a virtual trainer is presented that may overcome the limits of a human trainer. The intended virtual trainer will be implemented in a CAVE providing auditory, visual and haptic cues. As a first application, the virtual trainer will be used in a realistic scenario for sweep rowing. To provide individual feedback to each rower, the virtual trainer quantifies errors and provides concurrent auditory, visual, and haptic feedback. The concurrent feedback will be adapted according to the actual performance, the individual maximal rowing velocity, and the athlete's individual perception.

  13. IMPROVING MEDICAL EDUCATION: SIMULATING CHANGES IN PATIENT ANATOMY USING DYNAMIC HAPTIC FEEDBACK

    OpenAIRE

    Yovanoff, Mary; Pepley, David; Mirkin, Katelin; Moore, Jason; Han, David; Miller, Scarlett

    2016-01-01

    Virtual simulation is an emerging field in medical education. Research suggests that simulation reduces complication rates and improves learning gains for medical residents. One benefit of simulators is their allowance for more realistic and dynamic patient anatomies. While potentially useful throughout medical education, few studies have explored the impact of dynamic haptic simulators on medical training. In light of this research void, this study was developed to examine how a Dynamic-Hapt...

  14. Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback

    Science.gov (United States)

    Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios

    2013-08-01

    The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for visually impaired and blind people, due to their inability to interpret graphical information. Thus, alternative ways of presenting maps have to be explored in order to improve their accessibility. Multiple types of sensory perception, like touch and hearing, may work as a substitute for vision in the exploration of maps. The use of multimodal virtual environments seems to be a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback based on OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during the haptic exploration of the map.
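
    The pipeline described above (street data in, a touch-explorable elevation map plus audio out) can be sketched in a few lines. The example below rasterises street segments into a coarse height grid and maps the height under the cursor to a pitch; the coordinates, grid size and frequency range are illustrative assumptions, and the real tool works directly on OpenStreetMap data and a haptic device.

```python
import numpy as np

def streets_to_elevation(segments, size=256, street_height=1.0):
    """Rasterise street segments (given in normalised 0..1 map coordinates)
    into a coarse elevation grid: street cells are raised, the rest is flat."""
    grid = np.zeros((size, size))
    for (x0, y0), (x1, y1) in segments:
        for t in np.linspace(0.0, 1.0, size * 2):
            x = int((x0 + t * (x1 - x0)) * (size - 1))
            y = int((y0 + t * (y1 - y0)) * (size - 1))
            grid[y, x] = street_height
    return grid

def sonify(height, f_min=220.0, f_max=880.0, h_max=1.0):
    """Map the elevation under the haptic cursor to a pitch in Hz."""
    return f_min + (f_max - f_min) * min(height, h_max) / h_max

elevation = streets_to_elevation([((0.1, 0.5), (0.9, 0.5)), ((0.5, 0.1), (0.5, 0.9))])
print(sonify(elevation[128, 128]))   # pitch heard over the street crossing
```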

  15. Mobile Haptic Technology Development through Artistic Exploration

    DEFF Research Database (Denmark)

    Cuartielles, David; Göransson, Andreas; Olsson, Tony

    2012-01-01

    This paper investigates how artistic explorations can be useful for the development of mobile haptic technology. It presents an alternative framework of design for wearable haptics that contributes to the building of haptic communities outside specialized research contexts. The paper also present...

  16. Resident simulation training in endoscopic endonasal surgery utilizing haptic feedback technology.

    Science.gov (United States)

    Thawani, Jayesh P; Ramayya, Ashwin G; Abdullah, Kalil G; Hudgins, Eric; Vaughan, Kerry; Piazza, Matthew; Madsen, Peter J; Buch, Vivek; Sean Grady, M

    2016-12-01

    Simulated practice may improve resident performance in endoscopic endonasal surgery. Using the NeuroTouch haptic simulation platform, we evaluated resident performance and assessed the effect of simulation training on performance in the operating room. First- (N=3) and second- (N=3) year residents were assessed using six measures of proficiency. Using a visual analog scale, the senior author scored subjects. After the first session, subjects with lower scores were provided with simulation training. A second simulation served as a task-learning control. Residents were evaluated in the operating room over six months by the senior author, who was blinded to the trained/untrained identities, using the same parameters. A nonparametric bootstrap testing method was used for the analysis (Matlab v. 2014a). Simulation training was associated with an increase in performance scores in the operating room averaged over all measures (p=0.0045). This is the first study to evaluate the training utility of an endoscopic endonasal surgical task using a virtual reality haptic simulator. The data suggest that haptic simulation training in endoscopic neurosurgery may contribute to improvements in operative performance. Limitations include a small number of subjects and adjudication bias, although the trained/untrained identity of subjects was blinded. Further study using the proposed methods may better describe the relationship between simulated training and operative performance in endoscopic neurosurgery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Construction and Evaluation of an Ultra Low Latency Frameless Renderer for VR.

    Science.gov (United States)

    Friston, Sebastian; Steed, Anthony; Tilbury, Simon; Gaydadjiev, Georgi

    2016-04-01

    Latency - the delay between a user's action and the response to this action - is known to be detrimental to virtual reality. Latency is typically considered to be a discrete value characterising a delay, constant in time and space - but this characterisation is incomplete. Latency changes across the display during scan-out, and how it does so is dependent on the rendering approach used. In this study, we present an ultra-low latency real-time ray-casting renderer for virtual reality, implemented on an FPGA. Our renderer has a latency of ~1 ms from 'tracker to pixel'. Its frameless nature means that the region of the display with the lowest latency immediately follows the scan-beam. This is in contrast to frame-based systems such as those using typical GPUs, for which the latency increases as scan-out proceeds. Using a series of high and low speed videos of our system in use, we confirm its latency of ~1 ms. We examine how the renderer performs when driving a traditional sequential scan-out display on a readily available HMD, the Oculus Rift DK2. We contrast this with an equivalent apparatus built using a GPU. Using captured human head motion and a set of image quality measures, we assess the ability of these systems to faithfully recreate the stimuli of an ideal virtual reality system - one with a zero latency tracker, renderer and display running at 1 kHz. Finally, we examine the results of these quality measures, and how each rendering approach is affected by velocity of movement and display persistence. We find that our system, with a lower average latency, can more faithfully draw what the ideal virtual reality system would. Further, we find that with low display persistence, the sensitivity to velocity of both systems is lowered, but that it is much lower for ours.
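
    The claim that latency varies across the display during scan-out can be made concrete with a small back-of-the-envelope calculation. Under simplifying assumptions (60 Hz scan-out, pose sampled once per frame for the frame-based renderer, roughly 1 ms tracker-to-pixel for the frameless renderer), the per-row latency differs as follows; the numbers are illustrative, not measurements from the paper.

```python
# Compare per-row latency for a frame-based renderer versus a frameless
# (beam-chasing) renderer, under the simplifying assumptions stated above.
refresh = 1.0 / 60.0          # 16.7 ms to scan the whole display
rows = 1080

for row in (0, 540, 1079):
    scan_time = refresh * row / rows              # when this row is drawn within the frame
    frame_based = refresh + scan_time             # pose sampled at frame start, then wait for scan-out
    frameless = 0.001                             # pose sampled just before the beam reaches the row
    print(f"row {row:4d}: frame-based ~ {frame_based * 1000:5.1f} ms, frameless ~ {frameless * 1000:.1f} ms")
```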

  18. Fast rendering of scanned room geometries

    DEFF Research Database (Denmark)

    Olesen, Søren Krarup; Markovic, Milos; Hammershøi, Dorte

    2014-01-01

    Room acoustics are rendered in Virtual Realities based on models of the real world. These are typically rather coarse representations of the true geometry resulting in room impulse responses with a lack of natural detail. This problem can be overcome by using data scanned by sensors, such as e...

  19. Getting to the Root of Fine Motor Skill Performance in Dentistry: Brain Activity During Dental Tasks in a Virtual Reality Haptic Simulation.

    Science.gov (United States)

    Perry, Suzanne; Bridges, Susan M; Zhu, Frank; Leung, W Keung; Burrow, Michael F; Poolton, Jamie; Masters, Rich Sw

    2017-12-12

    There is little evidence considering the relationship between movement-specific reinvestment (a dimension of personality which refers to the propensity for individuals to consciously monitor and control their movements) and working memory during motor skill performance. Functional near-infrared spectroscopy (fNIRS) measuring oxyhemoglobin demands in the frontal cortex during performance of virtual reality (VR) psychomotor tasks can be used to examine this research gap. The aim of this study was to determine the potential relationship between the propensity to reinvest and blood flow to the dorsolateral prefrontal cortices of the brain. A secondary aim was to determine the propensity to reinvest and performance during 2 dental tasks carried out using haptic VR simulators. We used fNIRS to assess oxygen demands in 24 undergraduate dental students during 2 dental tasks (clinical, nonclinical) on a VR haptic simulator. We used the Movement-Specific Reinvestment Scale questionnaire to assess the students' propensity to reinvest. Students with a high propensity for movement-specific reinvestment displayed significantly greater oxyhemoglobin demands in an area associated with working memory during the nonclinical task (Spearman correlation, r_s = .49, P = .03). This small-scale study suggests that neurophysiological differences are evident between high and low reinvesters during a dental VR task in terms of oxyhemoglobin demands in an area associated with working memory. ©Suzanne Perry, Susan M Bridges, Frank Zhu, W Keung Leung, Michael F Burrow, Jamie Poolton, Rich SW Masters. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 12.12.2017.

  20. Polymer-based actuators for virtual reality devices

    Science.gov (United States)

    Bolzmacher, Christian; Hafez, Moustapha; Benali Khoudja, Mohamed; Bernardoni, Paul; Dubowsky, Steven

    2004-07-01

    Virtual Reality (VR) is gaining more importance in our society. For many years, VR has been limited to entertainment applications. Today, practical applications such as training and prototyping find a promising future in VR. Therefore there is an increasing demand for low-cost, lightweight haptic devices in virtual reality environments. Electroactive polymers seem to be a potential actuation technology that could satisfy these requirements. Dielectric polymers developed over the past few years have shown large displacements (more than 300%). This feature makes them quite interesting for integration in haptic devices due to their muscle-like behaviour. Polymer actuators are flexible and lightweight compared to traditional actuators. Using stacks with several layers of elastomeric film increases the force without limiting the output displacement. The paper discusses some design methods for a linear dielectric polymer actuator for VR devices. Experimental results on the actuator's performance are presented.

  1. Real-time photorealistic stereoscopic rendering of fire

    Science.gov (United States)

    Rose, Benjamin M.; McAllister, David F.

    2007-02-01

    We propose a method for real-time photorealistic stereo rendering of the natural phenomenon of fire. Applications include the use of virtual reality in fire fighting, military training, and entertainment. Rendering fire in real time presents a challenge because of the transparency and non-static fluid-like behavior of fire. It is well known that, in general, methods that are effective for monoscopic rendering are not necessarily easily extended to stereo rendering, because monoscopic methods often do not provide the depth information necessary to produce the parallax required for binocular disparity in stereoscopic rendering. We investigate the existing techniques used for monoscopic rendering of fire and discuss their suitability for extension to real-time stereo rendering. Methods include the use of precomputed textures, dynamic generation of textures, and rendering models resulting from the approximation of solutions of fluid dynamics equations through the use of ray-tracing algorithms. We have found that, in order to attain real-time frame rates, our method based on billboarding is effective. Slicing is used to simulate depth. Textures or 2D images are mapped onto polygons, and alpha blending is used to treat transparency. We can use video recordings or prerendered high-quality images of fire as textures to attain photorealistic stereo.
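
    Billboarding keeps each textured fire quad facing the camera, and alpha blending composites its transparent texels over the background. The sketch below shows the two core operations in a generic form; the vectors and blend equation are standard, not code from the paper.

```python
import numpy as np

def billboard_basis(camera_pos, billboard_pos, world_up=np.array([0.0, 1.0, 0.0])):
    """Right/up/normal axes of a quad that always faces the camera, onto which
    a fire texture (video frame or prerendered image) is mapped. Assumes the
    view direction is not (anti)parallel to world_up."""
    normal = camera_pos - billboard_pos
    normal = normal / np.linalg.norm(normal)
    right = np.cross(world_up, normal)
    right = right / np.linalg.norm(right)
    up = np.cross(normal, right)
    return right, up, normal

def alpha_over(src_rgb, src_a, dst_rgb):
    """Standard 'over' alpha blending used to treat the fire's transparency."""
    return src_rgb * src_a + dst_rgb * (1.0 - src_a)

right, up, normal = billboard_basis(np.array([0.0, 1.7, 5.0]), np.array([0.0, 1.0, 0.0]))
print(alpha_over(np.array([1.0, 0.5, 0.1]), 0.6, np.array([0.0, 0.0, 0.05])))
```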

  2. Bilateral intraocular lens subluxation secondary to haptic angulation.

    Science.gov (United States)

    Moreno-Montañés, Javier; Fernández-Hortelano, Ana; Caire, Josemaría

    2008-04-01

    An 82-year-old man had uneventful phacoemulsification with bilateral implantation of a hydrophilic acrylic, single-piece intraocular lens (IOL) (ACR6D SE, Laboratoires Cornéal). Five years later, simultaneous and bilateral IOL subluxations occurred. In both eyes, the subluxation was situated on the side of one haptic that had moved forward (temporal area in the right eye and superior area in the left eye). In the right eye, the haptic-capsular bag was entrapped by the pupil and produced endothelial damage. A transscleral suture was placed over and under the subluxated haptic through the anterior and posterior capsules to capture the haptic. The haptic was then sutured to the sclera. No postoperative complications developed. We hypothesize that 10-degree angulated and broad haptic junctions can lead to zonular damage and IOL subluxation.

  3. A haptic unit designed for magnetic-resonance-guided biopsy.

    Science.gov (United States)

    Tse, Z T H; Elhawary, H; Rea, M; Young, I; Davis, B L; Lamperth, M

    2009-02-01

    The magnetic fields present in the magnetic resonance (MR) environment impose severe constraints on any mechatronic device present in its midst, requiring alternative actuators, sensors, and materials to those conventionally used in traditional system engineering. In addition, the spatial constraints of closed-bore scanners require a physical separation between the radiologist and the imaged region of the patient. This configuration deprives the clinician of the sense of touch from the target anatomy, which often provides useful information. To recover the force feedback from the tissue, an MR-compatible haptic unit, designed to be integrated with a five-degrees-of-freedom mechatronic system for MR-guided prostate biopsy, has been developed which incorporates position control and force feedback to the operator. The haptic unit is designed to be located inside the scanner isocentre with the master console in the control room. MR compatibility of the device has been demonstrated, showing a negligible degradation of the signal-to-noise ratio and virtually no geometric distortion. By combining information from the position encoder and force sensor, tissue stiffness measurement along the needle trajectory is demonstrated in a lamb liver to aid diagnosis of suspected cancerous tissue.
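
    Combining the encoder and force-sensor streams into a stiffness estimate can be done, in its simplest form, by fitting the local slope of force against insertion depth. The sketch below does exactly that with a sliding least-squares window on synthetic data; it illustrates the idea only and is not the authors' algorithm.

```python
import numpy as np

def stiffness_profile(position, force, window=20):
    """Estimate local tissue stiffness (N/mm) along the needle path as the
    least-squares slope of force vs. insertion depth in a sliding window."""
    k = []
    for i in range(len(position) - window):
        p = position[i:i + window]
        f = force[i:i + window]
        slope, _ = np.polyfit(p, f, 1)
        k.append(slope)
    return np.array(k)

# synthetic insertion: a stiffer region begins 15 mm along the trajectory
depth = np.linspace(0.0, 30.0, 300)                       # mm, from the encoder
force = 0.05 * depth + 0.4 * np.maximum(depth - 15, 0)    # N, from the force sensor
print(stiffness_profile(depth, force)[::50])              # slope jumps at the stiff region
```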

  4. Improving the visual realism of virtual surgery.

    Science.gov (United States)

    Jin, Wei; Lim, Yi-Je; Xu, Xie George; Singh, Tejinder P; De, Suvranu

    2005-01-01

    In this work we focus our attention on improving the visual realism of virtual surgery. A solution based on the innovative use of various image-based rendering methods is presented for realistic rendering of virtual surgery scenes. We have, for the first time, developed a methodology for generating virtual surgery scenes with realistic glistening effects by combining various image-based rendering techniques, including image mosaicing and view-dependent texture mapping. Realistic examples are presented to showcase the results.

  5. The Effect of Haptic Guidance on Learning a Hybrid Rhythmic-Discrete Motor Task.

    Science.gov (United States)

    Marchal-Crespo, Laura; Bannwart, Mathias; Riener, Robert; Vallery, Heike

    2015-01-01

    Bouncing a ball with a racket is a hybrid rhythmic-discrete motor task, combining continuous rhythmic racket movements with discrete impact events. Rhythmicity is exceptionally important in motor learning, because it underlies fundamental movements such as walking. Studies suggested that rhythmic and discrete movements are governed by different control mechanisms at different levels of the Central Nervous System. The aim of this study is to evaluate the effect of fixed/fading haptic guidance on learning to bounce a ball to a desired apex in virtual reality with varying gravity. Changing gravity changes dominance of rhythmic versus discrete control: The higher the value of gravity, the more rhythmic the task; lower values reduce the bouncing frequency and increase dwell times, eventually leading to a repetitive discrete task that requires initiation and termination, resembling target-oriented reaching. Although motor learning in the ball-bouncing task with varying gravity has been studied, the effect of haptic guidance on learning such a hybrid rhythmic-discrete motor task has not been addressed. We performed an experiment with thirty healthy subjects and found that the most effective training condition depended on the degree of rhythmicity: Haptic guidance seems to hamper learning of continuous rhythmic tasks, but it seems to promote learning for repetitive tasks that resemble discrete movements.

  6. Tactile display for virtual 3D shape rendering

    CERN Document Server

    Mansutti, Alessandro; Bordegoni, Monica; Cugini, Umberto

    2017-01-01

    This book describes a novel system for the simultaneous visual and tactile rendering of product shapes which allows designers to simultaneously touch and see new product shapes during the conceptual phase of product development. This system offers important advantages, including potential cost and time savings, compared with the standard product design process in which digital 3D models and physical prototypes are often repeatedly modified until an optimal design is achieved. The system consists of a tactile display that is able to represent, within a real environment, the shape of a product. Designers can explore the rendered surface by touching curves lying on the product shape, selecting those curves that can be considered style features and evaluating their aesthetic quality. In order to physically represent these selected curves, a flexible surface is modeled by means of servo-actuated modules controlling a physical deforming strip. The tactile display is designed so as to be portable, low cost, modular,...

  7. Graphic and haptic simulation for transvaginal cholecystectomy training in NOTES.

    Science.gov (United States)

    Pan, Jun J; Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Li, Bai C; Sankaranarayanan, Ganesh; Roberts, Kurt; Schwaitzberg, Steven; De, Suvranu

    2016-04-01

    Natural Orifice Transluminal Endoscopic Surgery (NOTES) is an emerging surgical technique that usually requires a long learning curve for surgeons. Virtual reality (VR) medical simulators with visual and haptic feedback can offer an efficient, cost-effective alternative to traditional training approaches without risk. With this motivation, we developed the first virtual reality simulator for transvaginal cholecystectomy in NOTES (VTEST™). This VR-based surgical simulator aims to simulate the hybrid NOTES approach to cholecystectomy. We use a 6DOF haptic device and a tracking sensor to construct the core hardware component of the simulator. For software, an innovative approach based on inner spheres is presented to deform the organs in real time. To handle the frequent collisions between soft tissue and surgical instruments, a GPU-based adaptive collision detection method is designed and implemented. To give a realistic visual rendering of gallbladder fat tissue removal by the cautery hook, a multi-layer hexahedral model is presented to simulate the electric dissection of fat tissue. From the experimental results, trainees can operate in real time with a high degree of stability and fidelity. A preliminary study was also performed to evaluate the realism and usefulness of this hybrid NOTES simulator. The prototype simulation system has been verified by surgeons through a pilot study. Several aspects of its visual performance and utility were rated fairly highly by the participants during testing. It exhibits the potential to improve the surgical skills of trainees and effectively shorten their learning curve. Copyright © 2016 Elsevier Inc. All rights reserved.
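
    Collision detection against sphere-approximated tissue is the simplest version of the kind of test the simulator accelerates on the GPU. The sketch below performs the basic sphere-overlap test on the CPU with NumPy; the sphere counts, radii and positions are made-up, and the paper's adaptive, GPU-based method is considerably more elaborate.

```python
import numpy as np

def sphere_collisions(tissue_centres, tissue_radii, tool_centre, tool_radius):
    """Test an instrument sphere against the spheres approximating the soft
    tissue; returns indices of colliding spheres and their penetration depths."""
    d = np.linalg.norm(tissue_centres - tool_centre, axis=1)
    overlap = (tissue_radii + tool_radius) - d
    hits = np.where(overlap > 0.0)[0]
    return hits, overlap[hits]

centres = np.random.rand(1000, 3) * 0.1          # 10 cm cube of tissue spheres (made-up)
radii = np.full(1000, 0.002)                     # 2 mm tissue spheres
hits, depth = sphere_collisions(centres, radii,
                                tool_centre=np.array([0.05, 0.05, 0.05]),
                                tool_radius=0.004)
print(len(hits), "colliding spheres")
```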

  8. Developing Visual Editors for High-Resolution Haptic Patterns

    DEFF Research Database (Denmark)

    Cuartielles, David; Göransson, Andreas; Olsson, Tony

    2012-01-01

    In this article we give an overview of our iterative work in developing visual editors for creating high resolution haptic patterns to be used in wearable, haptic feedback devices. During the past four years we have found the need to address the question of how to represent, construct and edit high...... resolution haptic patterns so that they translate naturally to the user’s haptic experience. To solve this question we have developed and tested several visual editors...

  9. Different haptic tools reduce trunk velocity in the frontal plane during walking, but haptic anchors have advantages over lightly touching a railing.

    Science.gov (United States)

    Hedayat, Isabel; Moraes, Renato; Lanovaz, Joel L; Oates, Alison R

    2017-06-01

    There are different ways to add haptic input during walking which may affect walking balance. This study compared the use of two different haptic tools (rigid railing and haptic anchors) and investigated whether any effects on walking were the result of the added sensory input and/or the posture generated when using those tools. Data from 28 young healthy adults were collected using the Mobility Lab inertial sensor system (APDM, Oregon, USA). Participants walked with and without both haptic tools and while pretending to use both haptic tools (placebo trials), with eyes opened and eyes closed. Using the tools or pretending to use both tools decreased normalized stride velocity (p  .999). These findings highlight a difference in the type of tool used to add haptic input and suggest that changes in balance control strategy resulting from using the railing are based on arm placement, where it is the posture combined with added sensory input that affects balance control strategies with the haptic anchors. These findings provide a strong framework for additional research to be conducted on the effects of haptic input on walking in populations known to have decreased walking balance.

  10. Virtual endoscopy and 3D volume rendering in the management of frontal sinus fractures.

    Science.gov (United States)

    Belina, Stanko; Cuk, Viseslav; Klapan, Ivica

    2009-12-01

    Frontal sinus fractures (FSF) are commonly caused by traffic accidents, assaults, industrial accidents and gunshot wounds. Classical roentgenography has a high proportion of false negative findings in cases of FSF and is not particularly useful in examining the severity of damage to the frontal sinus posterior table and the nasofrontal duct region. High resolution computed tomography was unavoidable during the management of such patients, but it may produce a large quantity of 2D images. Postprocessing of datasets acquired by high resolution computed tomography from patients with severe head trauma may offer valuable additional help in diagnostics and surgery planning. We performed virtual endoscopy (VE) and 3D volume rendering (3DVR) on high resolution CT data acquired from a 54-year-old man with both anterior and posterior frontal sinus wall fractures in order to demonstrate the advantages and disadvantages of these methods. Data acquisition was done with a Siemens Somatom Emotion scanner and postprocessing was performed with Syngo 2006G software. VE and 3DVR were performed in a man who suffered blunt trauma to his forehead and nose in a traffic accident. A left frontal sinus anterior wall fracture without dislocation and a fracture of the tabula interna with dislocation were found. The 3D position and orientation of the fracture lines were shown by the 3D rendering software. We concluded that VE and 3DVR can clearly display the anatomic structure of the paranasal sinuses and nasopharyngeal cavity, revealing damage to the sinus wall caused by a fracture and its relationship to surrounding anatomical structures.

  11. SeaTouch: A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors

    Science.gov (United States)

    Simonnet, Mathieu; Jacobson, Dan; Vieilledent, Stephane; Tisseau, Jacques

    Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have provided researchers in the spatial community with tools to investigate the learning of space. The issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating spatial knowledge transfer to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks, we measured systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric “haptic” view of the virtual environment to improve performance in the real environment.

  12. fMRI-Compatible Electromagnetic Haptic Interface.

    Science.gov (United States)

    Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S

    2005-01-01

    A new haptic interface device is suggested, which can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque while not affecting image quality with the currents applied in the device. The haptic device was tested in a 3T MR scanner. With a current of up to 1 A and at a distance of 1 m from the focal point of the MR scanner, it was possible to generate torques of up to 4 Nm. Within these boundaries image quality was not affected.
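
    The reported torque can be sanity-checked with the basic Lorentz-force relation F = I·L×B. The numbers below (field magnitude at the coil, total wire length, lever arm) are assumed purely for illustration; they merely show that roughly 4 Nm at 1 A is plausible for a multi-turn coil in the scanner's stray field.

```python
# Order-of-magnitude check of the Lorentz actuation principle (all parameters assumed).
B = 1.0          # T, assumed field magnitude at the coil location, 1 m from the focal point
I = 1.0          # A, drive current (the paper's upper limit)
L = 20.0         # m, assumed total wire length inside the field (multi-turn coil)
r = 0.2          # m, assumed lever arm from the rotation axis

force = I * L * B            # |F| = I * L * B for wire perpendicular to the field
torque = force * r
print(f"force ~ {force:.0f} N, torque ~ {torque:.0f} N*m")   # ~4 N*m, consistent with the abstract
```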

  13. Incorporating Haptic Feedback in Simulation for Learning Physics

    Science.gov (United States)

    Han, Insook; Black, John B.

    2011-01-01

    The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…

  14. A haptic interface for virtual simulation of endoscopic surgery.

    Science.gov (United States)

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  15. The Role of Virtual Articulator in Prosthetic and Restorative Dentistry

    Science.gov (United States)

    Aljanakh, Mohammad

    2014-01-01

    Virtual reality is a computer-based technology linked with the future of dentistry and dental practice. The virtual articulator is one such application in prosthetic and restorative dentistry based on virtual reality that will significantly reduce the limitations of the mechanical articulator and, by simulation of real patient data, allow analyses with regard to static and dynamic occlusion as well as to jaw relation. It is the purpose of this article to present the concepts and strategies for a future replacement of the mechanical articulator by a virtual one. Also, a brief note on virtual reality haptic systems is included, along with the newly developed touch-enabled virtual articulator. PMID:25177664

  16. Adaptation of a haptic robot in a 3T fMRI.

    Science.gov (United States)

    Snider, Joseph; Plank, Markus; May, Larry; Liu, Thomas T; Poizner, Howard

    2011-10-04

    Functional magnetic resonance imaging (fMRI) provides excellent functional brain imaging via the BOLD signal with advantages including non-ionizing radiation, millimeter spatial accuracy of anatomical and functional data, and nearly real-time analyses. Haptic robots provide precise measurement and control of position and force of a cursor in a reasonably confined space. Here we combine these two technologies to allow precision experiments involving motor control with haptic/tactile environment interaction such as reaching or grasping. The basic idea is to attach an 8-foot end effector, supported in the center, to the robot, allowing the subject to use the robot while shielding it and keeping it out of the most extreme part of the magnetic field from the fMRI machine (Figure 1). The Phantom Premium 3.0, 6DoF, high-force robot (SensAble Technologies, Inc.) is an excellent choice for providing force-feedback in virtual reality experiments, but it is inherently non-MR safe, introduces significant noise to the sensitive fMRI equipment, and its electric motors may be affected by the fMRI's strongly varying magnetic field. We have constructed a table and shielding system that allows the robot to be safely introduced into the fMRI environment and limits both the degradation of the fMRI signal by the electrically noisy motors and the degradation of the electric motor performance by the strongly varying magnetic field of the fMRI. With the shield, the signal to noise ratio (SNR: mean signal/noise standard deviation) of the fMRI goes from a baseline of ~380 to ~330, and ~250 without the shielding. The remaining noise appears to be uncorrelated and does not add artifacts to the fMRI of a test sphere (Figure 2). The long, stiff handle allows placement of the robot out of range of the most strongly varying parts of the magnetic field so there is no significant effect of the fMRI on the robot. The effect of the handle on the robot's kinematics is minimal since it is lightweight (~2
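
    The SNR figure quoted above (mean signal divided by the noise standard deviation) is straightforward to compute from a phantom time series. The sketch below does this on synthetic data whose baseline is chosen to resemble the reported ~380; it is an illustration of the metric, not the authors' analysis script.

```python
import numpy as np

def temporal_snr(volume_timeseries):
    """SNR in the sense used in the abstract: mean signal divided by the
    standard deviation of the noise, taken voxel-wise over time in a
    uniform test-sphere region and then averaged."""
    mean_signal = volume_timeseries.mean(axis=0)
    noise_std = volume_timeseries.std(axis=0)
    return (mean_signal / noise_std).mean()

# synthetic phantom: baseline of 380 signal units plus unit-variance noise
data = 380.0 + np.random.randn(100, 16, 16, 16)
print(round(temporal_snr(data)))   # ~380, by construction
```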

  17. Video Game Device Haptic Interface for Robotic Arc Welding

    Energy Technology Data Exchange (ETDEWEB)

    Corrie I. Nichol; Milos Manic

    2009-05-01

    Recent advances in technology for video games have made a broad array of haptic feedback devices available at low cost. This paper presents a bi-manual haptic system that enables an operator to weld remotely, using a commercially available haptic feedback video game device as the user interface. The system showed good performance in initial tests, demonstrating the utility of low-cost input devices for remote haptic operations.

  18. Enhancing realism in virtual environments by simulating the audio-haptic sensation of walking on ground surfaces

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Serafin, Stefania; Nilsson, Niels Christian

    2012-01-01

    overlooking a canyon. Subjects were asked to visit the environment wearing a head-mounted display and a custom-made pair of sandals enhanced with sensors and actuators. A 12-channel surround sound system delivered a soundscape which was consistent with the visual environment. Passive haptics was provided...

  19. FGB: A Graphical and Haptic User Interface for Creating Graphical, Haptic User Interfaces

    International Nuclear Information System (INIS)

    ANDERSON, THOMAS G.; BRECKENRIDGE, ARTHURINE; DAVIDSON, GEORGE S.

    1999-01-01

    The emerging field of haptics represents a fundamental change in human-computer interaction (HCI), and presents solutions to problems that are difficult or impossible to solve with a two-dimensional, mouse-based interface. To take advantage of the potential of haptics, however, innovative interaction techniques and programming environments are needed. This paper describes FGB (FLIGHT GHUI Builder), a programming tool that can be used to create an application specific graphical and haptic user interface (GHUI). FGB is itself a graphical and haptic user interface with which a programmer can intuitively create and manipulate components of a GHUI in real time in a graphical environment through the use of a haptic device. The programmer can create a GHUI without writing any programming code. After a user interface is created, FGB writes the appropriate programming code to a file, using the FLIGHT API, to recreate what the programmer created in the FGB interface. FGB saves programming time and increases productivity, because a programmer can see the end result as it is created, and FGB does much of the programming itself. Interestingly, as FGB was created, it was used to help build itself. The further FGB was in its development, the more easily and quickly it could be used to create additional functionality and improve its own design. As a finished product, FGB can be used to recreate itself in much less time than it originally required, and with much less programming. This paper describes FGB's GHUI components, the techniques used in the interface, how the output code is created, where programming additions and modifications should be placed, and how it can be compared to and integrated with existing API's such as MFC and Visual C++, OpenGL, and GHOST

  20. Capturing differences in dental training using a virtual reality simulator.

    Science.gov (United States)

    Mirghani, I; Mushtaq, F; Allsop, M J; Al-Saud, L M; Tickhill, N; Potter, C; Keeling, A; Mon-Williams, M A; Manogue, M

    2018-02-01

    Virtual reality simulators are becoming increasingly popular in dental schools across the world. But to what extent do these systems reflect actual dental ability? Addressing this question of construct validity is a fundamental step that is necessary before these systems can be fully integrated into a dental school's curriculum. In this study, we examined the sensitivity of the Simodont (a haptic virtual reality dental simulator) to differences in dental training experience. Two hundred and eighty-nine participants, with 1 (n = 92), 3 (n = 79), 4 (n = 57) and 5 (n = 61) years of dental training, performed a series of tasks upon their first exposure to the simulator. We found statistically significant differences between novice (Year 1) and experienced dental trainees (operationalised as 3 or more years of training), but no differences between performance of experienced trainees with varying levels of experience. This work represents a crucial first step in understanding the value of haptic virtual reality simulators in dental education. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Evaluation of wearable haptic systems for the fingers in Augmented Reality applications

    DEFF Research Database (Denmark)

    Chinello, Francesco

    2017-01-01

    Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games “Pokémon GO” and “Ingress” or the Google Translate...... real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects...

  2. Wide-Area Haptic Guidance: Taking the User by the Hand

    OpenAIRE

    Pérez Arias, Antonia; Hanebeck, Uwe D.

    2010-01-01

    In this paper, we present a novel use of haptic information in extended range telepresence, the wide-area haptic guidance. It consists of force and position signals applied to the user's hand in order to improve safety, accuracy, and speed in some telepresent tasks. Wide-area haptic guidance assists the user in reaching a desired position in a remote environment of arbitrary size without degrading the feeling of presence. Several methods for haptic guidance are analyzed. With active haptic gu...

  3. Virtual reality technology and applications

    CERN Document Server

    Mihelj, Matjaž; Beguš, Samo

    2014-01-01

    As virtual reality expands from the imaginary worlds of science fiction and pervades every corner of everyday life, it is becoming increasingly important for students and professionals alike to understand the diverse aspects of this technology. This book aims to provide a comprehensive guide to the theoretical and practical elements of virtual reality, from the mathematical and technological foundations of virtual worlds to the human factors and the applications that enrich our lives: in the fields of medicine, entertainment, education and others. After providing a brief introduction to the topic, the book describes the kinematic and dynamic mathematical models of virtual worlds. It explores the many ways a computer can track and interpret human movement, then progresses through the modalities that make up a virtual world: visual, acoustic and haptic. It explores the interaction between the actual and virtual environments, as well as design principles of the latter. The book closes with an examination of diff...

  4. Democratizing rendering for multiple viewers in surround VR systems

    KAUST Repository

    Schulze, Jürgen P.; Acevedo-Feliz, Daniel; Mangan, John; Prudhomme, Andrew; Nguyen, Phi Khanh; Weber, Philip P.

    2012-01-01

    We present a new approach for rendering multiple users' views in a surround virtual environment without using special multi-view hardware. It is based on the idea that different parts of the screen are often viewed by different users, so that they can be rendered from their own viewpoint, or at least from a point closer to their viewpoint than traditionally expected. The vast majority of 3D virtual reality systems are designed for one head-tracked user and a number of passive viewers. Only the head-tracked user gets to see the correct view of the scene; everybody else sees a distorted image. We reduce this problem by algorithmically democratizing the rendering viewpoint among all tracked users. Researchers have proposed solutions for multiple tracked users, but most of them require major changes to the display hardware of the VR system, such as additional projectors or custom VR glasses. Our approach does not require additional hardware, except the ability to track each participating user. We propose three versions of our multi-viewer algorithm. Each of them balances image distortion and frame rate in different ways, making them more or less suitable for certain application scenarios. Our most sophisticated algorithm renders each pixel from its own, optimized camera perspective, which depends on all tracked users' head positions and orientations. © 2012 IEEE.
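
    The abstract above describes "algorithmically democratizing the rendering viewpoint" so that each screen region is drawn from a point close to the users actually looking at it, but it does not spell out the algorithm. The sketch below is only one plausible reading, offered as an assumption: the tracked head positions are blended with weights that fall off with each user's distance to the screen region being rendered (function name, weighting law and all numeric values are illustrative).

    ```python
    import numpy as np

    def blended_eye_position(region_center, head_positions, falloff=1.0):
        """Return a single rendering eye point for one screen region.

        region_center  -- 3D center of the screen region (world coordinates)
        head_positions -- (N, 3) array of tracked users' head positions
        falloff        -- softening term; smaller values let nearby users dominate
        """
        heads = np.asarray(head_positions, dtype=float)
        dists = np.linalg.norm(heads - region_center, axis=1)
        weights = 1.0 / (dists ** 2 + falloff)   # closer users get larger weights
        weights /= weights.sum()
        return (weights[:, None] * heads).sum(axis=0)

    # Two tracked users; the left screen region is rendered mostly for the nearer user
    users = [(-1.0, 1.7, 2.0), (1.5, 1.6, 3.5)]
    print(blended_eye_position(np.array([-1.2, 1.5, 0.0]), users))
    ```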

  5. Democratizing rendering for multiple viewers in surround VR systems

    KAUST Repository

    Schulze, Jürgen P.

    2012-03-01

    We present a new approach for rendering multiple users' views in a surround virtual environment without using special multi-view hardware. It is based on the idea that different parts of the screen are often viewed by different users, so that they can be rendered from their own viewpoint, or at least from a point closer to their viewpoint than traditionally expected. The vast majority of 3D virtual reality systems are designed for one head-tracked user and a number of passive viewers. Only the head-tracked user gets to see the correct view of the scene; everybody else sees a distorted image. We reduce this problem by algorithmically democratizing the rendering viewpoint among all tracked users. Researchers have proposed solutions for multiple tracked users, but most of them require major changes to the display hardware of the VR system, such as additional projectors or custom VR glasses. Our approach does not require additional hardware, except the ability to track each participating user. We propose three versions of our multi-viewer algorithm. Each of them balances image distortion and frame rate in different ways, making them more or less suitable for certain application scenarios. Our most sophisticated algorithm renders each pixel from its own, optimized camera perspective, which depends on all tracked users' head positions and orientations. © 2012 IEEE.

  6. GPU-based real-time soft tissue deformation with cutting and haptic feedback.

    Science.gov (United States)

    Courtecuisse, Hadrien; Jung, Hoeryong; Allard, Jérémie; Duriez, Christian; Lee, Doo Yong; Cotin, Stéphane

    2010-12-01

    This article describes a series of contributions in the field of real-time simulation of soft tissue biomechanics. These contributions address various requirements for interactive simulation of complex surgical procedures. In particular, this article presents results in the areas of soft tissue deformation, contact modelling, simulation of cutting, and haptic rendering, which are all relevant to a variety of medical interventions. The contributions described in this article share a common underlying model of deformation and rely on GPU implementations to significantly improve computation times. This consistency in the modelling technique and computational approach ensures coherent results as well as efficient, robust and flexible solutions. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Small-scale tactile graphics for virtual reality systems

    Science.gov (United States)

    Roberts, John W.; Slattery, Oliver T.; Swope, Brett; Min, Volker; Comstock, Tracy

    2002-05-01

    As virtual reality technology moves forward, there is a need to provide the user with options for greater realism and closer engagement of the human senses. Haptic systems use force feedback to create a large-scale sensation of physical interaction in a virtual environment. Further refinement can be created by using tactile graphics to reproduce a detailed sense of touch. For example, a haptic system might create the sensation of the weight of a virtual orange that the user picks up, and the sensation of pressure on the fingers as the user squeezes the orange. A tactile graphic system could create the texture of the orange on the user's fingertips. In the real world, a detailed sense of touch plays a large part in picking up and manipulating small objects. Our team is working to develop technology that can drive a high-density fingertip array of tactile stimulators at a rapid refresh rate, sufficient to produce a realistic sense of touch. To meet the project criteria, the mechanism must be much lower cost than existing technologies, and must be sufficiently lightweight and compact to permit portable use and to enable installation of the stimulator array in the fingertip of a tactile glove. The primary intended applications for this technology are accessibility for the blind and visually impaired, teleoperation, and virtual reality systems.

  8. Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.

    Science.gov (United States)

    Feder, J M; Rosenberg, M A; Farber, M D

    1989-09-01

    Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium: YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.

  9. Haptic Feedback for the GPU-based Surgical Simulator

    DEFF Research Database (Denmark)

    Sørensen, Thomas Sangild; Mosegaard, Jesper

    2006-01-01

    The GPU has proven to be a powerful processor to compute spring-mass based surgical simulations. It has not previously been shown however, how to effectively implement haptic interaction with a simulation running entirely on the GPU. This paper describes a method to calculate haptic feedback...... with limited performance cost. It allows easy balancing of the GPU workload between calculations of simulation, visualisation, and the haptic feedback....

  10. Haptic perception accuracy depending on self-produced movement.

    Science.gov (United States)

    Park, Chulwook; Kim, Seonjin

    2014-01-01

    This study measured whether self-produced movement influences haptic perception ability (experiment 1) as well as the factors associated with the level of influence (experiment 2) in racket sports. For experiment 1, the haptic perception accuracy of five male table tennis experts and five male novices was examined under two different conditions (no movement vs. movement). For experiment 2, the haptic afferent subsystems of five male table tennis experts and five male novices were investigated in only the self-produced, movement-coupled condition. Inferential statistics (ANOVA, t-test) and custom-made devices (shock and vibration sensor, Qualisys Track Manager) were used to determine haptic perception accuracy (experiment 1, experiment 2) and its association with expertise. The results of this research show that expert-level players achieve higher accuracy with less variability (racket vibration and angle) than novice-level players, especially in their self-produced, movement-coupled performances. The important finding from this result is that, in terms of accuracy, the skill-associated differences were enlarged during self-produced movement. To explain the origin of this difference between experts and novices, the functional variability of the haptic afferent subsystems can serve as a reference. These two factors (self-produced accuracy and the variability of haptic features) as investigated in this study would be useful criteria for educators in racket sports and suggest a broader hypothesis for further research into the effects of haptic accuracy in relation to variability.

  11. Augmented versus virtual reality laparoscopic simulation: what is the difference? A comparison of the ProMIS augmented reality laparoscopic simulator versus LapSim virtual reality laparoscopic simulator

    NARCIS (Netherlands)

    Botden, Sanne M. B. I.; Buzink, Sonja N.; Schijven, Marlies P.; Jakimowicz, Jack J.

    2007-01-01

    BACKGROUND: Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic

  12. Development of haptic system for surgical robot

    Science.gov (United States)

    Gang, Han Gyeol; Park, Jiong Min; Choi, Seung-Bok; Sohn, Jung Woo

    2017-04-01

    In this paper, a new type of haptic system for surgical robot applications is proposed and its performance is evaluated experimentally. The proposed haptic system consists of an effective master device and a precision slave robot. The master device has 3-DOF rotational motion, the same as human wrist motion. It has a lightweight structure with a gyro sensor and three small-sized MR brakes for position measurement and repulsive torque generation, respectively. The slave robot achieves 3-DOF rotational motion using servomotors and a five-bar linkage, and a torque sensor is used to measure resistive torque. It has been experimentally demonstrated that the proposed haptic system performs well in tracking control of the desired position and repulsive torque. It can be concluded that the proposed haptic system can be effectively applied to surgical robot systems in the field.

  13. Mixed reality temporal bone surgical dissector: mechanical design.

    Science.gov (United States)

    Hochman, Jordan Brent; Sepehri, Nariman; Rampersad, Vivek; Kraut, Jay; Khazraee, Milad; Pisa, Justyn; Unger, Bertram

    2014-08-08

    The Development of a Novel Mixed Reality (MR) Simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces, and while 3D-printed models convincingly represent the vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model in which the effective elements of both simulations are combined; haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancellation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill and the virtual contact forces need to be repositioned from the mid-wand to the drill tip. Previous publications detail the generation of both the requisite printed and haptic simulations. Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting to hold the otic drill was developed and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free space. Within the free space, a linear virtual force model is applied to simulate drill contact with soft tissue. Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between the real and virtual components of the model. No issues with registration at model boundaries were encountered. These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator.
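
    The record mentions two haptic ingredients: a linear virtual force model applied while the drill travels through the free space of the model, and gravity cancellation of the drill fitting. A minimal sketch of both, under the assumption that the linear model is a penalty force proportional to penetration depth and that gravity compensation is a constant force equal to the fitting's weight (stiffness, mass and coordinate conventions are illustrative, not values from the paper):

    ```python
    import numpy as np

    GRAVITY = np.array([0.0, -9.81, 0.0])   # m/s^2, y-up world frame (assumption)
    TOOL_MASS = 0.12                         # kg, mass of the drill fitting (illustrative)
    STIFFNESS = 400.0                        # N/m, linear soft-tissue model (illustrative)

    def haptic_force(drill_tip, tissue_boundary_y):
        """Force commanded to the haptic device for one servo tick.

        drill_tip         -- 3D position of the drill tip (haptic interaction point)
        tissue_boundary_y -- height of the virtual soft-tissue surface in the free space
        """
        force = -TOOL_MASS * GRAVITY                 # gravity cancellation of the fitting
        penetration = tissue_boundary_y - drill_tip[1]
        if penetration > 0.0:                        # tip is below the virtual tissue surface
            force = force + np.array([0.0, STIFFNESS * penetration, 0.0])  # linear penalty force
        return force

    print(haptic_force(np.array([0.0, -0.002, 0.0]), 0.0))
    ```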

  14. Haptic and Audio-visual Stimuli: Enhancing Experiences and Interaction

    NARCIS (Netherlands)

    Nijholt, Antinus; Dijk, Esko O.; Lemmens, Paul M.C.; Luitjens, S.B.

    2010-01-01

    The intention of the symposium on Haptic and Audio-visual stimuli at the EuroHaptics 2010 conference is to deepen the understanding of the effect of combined Haptic and Audio-visual stimuli. The knowledge gained will be used to enhance experiences and interactions in daily life. To this end, a

  15. Virtual Acoustics: Evaluation of Psychoacoustic Parameters

    Science.gov (United States)

    Begault, Durand R.; Null, Cynthia H. (Technical Monitor)

    1997-01-01

    Current virtual acoustic displays for teleconferencing and virtual reality are usually limited to very simple or non-existent renderings of reverberation, a fundamental part of the acoustic environmental context encountered in day-to-day hearing. Several research efforts have produced results suggesting that environmental cues dramatically improve perceptual performance within virtual acoustic displays, and that it is possible to manipulate signal processing parameters to effectively reproduce important aspects of virtual acoustic perception in real time. However, the computational resources for rendering reverberation remain formidable. Our efforts at NASA Ames have focused on using several perceptual threshold metrics to determine how various "trade-offs" might be made in real-time acoustic rendering. This includes both original work and confirmation of existing data that was obtained in real rather than virtual environments. The talk will consider the importance of using individualized versus generalized pinnae cues (the "Head-Related Transfer Function"); the use of head movement cues; threshold data for early reflections and late reverberation; and consideration of the necessary accuracy for measuring and rendering octave-band absorption characteristics of various wall surfaces. In addition, a consideration of the analysis-synthesis of the reverberation within "everyday spaces" (offices, conference rooms) will be contrasted with the commonly used paradigm of concert hall spaces.

  16. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...

  17. The effect of depth compression on multiview rendering quality

    NARCIS (Netherlands)

    Merkle, P.; Morvan, Y.; Smolic, A.; Farin, D.S.; Mueller, K..; With, de P.H.N.; Wiegand, T.

    2010-01-01

    This paper presents a comparative study of different techniques for depth-image compression and their implications for the quality of multiview video plus depth virtual view rendering. A novel coding algorithm for depth images that concentrates on their special characteristics, namely smooth regions

  18. Memory for curvature of objects: haptic touch vs. vision.

    Science.gov (United States)

    Ittyerah, Miriam; Marks, Lawrence E

    2007-11-01

    The present study examined the role of vision and haptics in memory for stimulus objects that vary along the dimension of curvature. Experiment 1 measured haptic-haptic (T-T) and haptic-visual (T-V) discrimination of curvature in a short-term memory paradigm, using 30-second retention intervals containing five different interpolated tasks. Results showed poorest performance when the interpolated tasks required spatial processing or movement, thereby suggesting that haptic information about shape is encoded in a spatial-motor representation. Experiment 2 compared visual-visual (V-V) and visual-haptic (V-T) short-term memory, again using 30-second delay intervals. The results of the ANOVA failed to show a significant effect of intervening activity. Intra-modal visual performance and cross-modal performance were similar. Comparing the four modality conditions (inter-modal V-T, T-V; intra-modal V-V, T-T) by combining the data of Experiments 1 and 2 in a global analysis showed a reliable interaction between intervening activity and experiment (modality). Although there appears to be a general tendency for spatial and movement activities to exert the most deleterious effects overall, the patterns are not identical when the initial stimulus is encoded haptically (Experiment 1) versus visually (Experiment 2).

  19. A hardware and software architecture to deal with multimodal and collaborative interactions in multiuser virtual reality environments

    Science.gov (United States)

    Martin, P.; Tseu, A.; Férey, N.; Touraine, D.; Bourdot, P.

    2014-02-01

    Most advanced immersive devices provide a collaborative environment within which several users have their own distinct head-tracked stereoscopic point of view. Combined with commonly used interactive features such as voice and gesture recognition, 3D mice, haptic feedback, and spatialized audio rendering, these environments should faithfully reproduce a real context. However, even though many studies have been carried out on multimodal systems, we are far from definitively solving the issue of multimodal fusion, which consists in merging multimodal events coming from users and devices into interpretable commands performed by the application. Multimodality and collaboration have often been studied separately, despite the fact that these two aspects share interesting similarities. We discuss how we address this problem, through the design and implementation of a supervisor that is able to deal with both multimodal fusion and collaborative aspects. The aim of this supervisor is to merge users' inputs from virtual reality devices in order to control immersive multi-user applications. We approach this problem from a practical point of view, because the main requirements of this supervisor were defined according to an industrial task proposed by our automotive partner that has to be performed with multimodal and collaborative interactions in a co-located multi-user environment. In this task, two co-located workers on a virtual assembly chain have to cooperate to insert a seat into the bodywork of a car, using haptic devices to feel collisions and manipulate objects, and combining speech recognition and two-handed gesture recognition as multimodal instructions. Besides the architectural aspect of this supervisor, we describe how we ensure the modularity of our solution, which can be applied to different virtual reality platforms, interactive contexts and virtual contents. A virtual context observer included in this supervisor was especially designed to be independent of the

  20. Continuous Surface Rendering, Passing from CAD to Physical Representation

    Directory of Open Access Journals (Sweden)

    Mario Covarrubias

    2013-06-01

    Full Text Available This paper describes a desktop mechatronic interface that has been conceived to support designers in the evaluation of aesthetic virtual shapes. This device allows continuous and smooth free-hand contact interaction with a real, developable plastic tape actuated by a servo-controlled mechanism. The objective in designing this device is to reproduce a virtual surface with a consistent physical rendering well adapted to designers' needs. The desktop mechatronic interface consists of a servo-actuated plastic strip that has been devised and implemented using seven interpolation points. In fact, by using the MEC (Minimal Energy Curve) spline approach, a developable real surface is rendered taking into account the CAD geometry of the virtual shapes. In this paper, we describe the working principles of the interface using both absolute and relative approaches to control the position of each control point on the MEC spline. We then describe the methodology that has been implemented, passing from the CAD geometry, linked to VisualNastran in order to maintain the parametric properties of the virtual shape. We then present the co-simulation between VisualNastran and MATLAB/Simulink used for achieving this goal and controlling the system, and finally we present the results of the subsequent testing session specifically carried out to evaluate the accuracy and effectiveness of the mechatronic device.
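
    The record states that the strip is driven through seven interpolation points lying on a minimal energy curve (MEC) spline derived from the CAD geometry, but the abstract does not give the spline formulation. As a rough, hedged stand-in, the sketch below interpolates seven points sampled from a hypothetical cross-section with a natural cubic spline, which minimizes bending energy in the linearized sense, and evaluates the continuous profile the strip would have to render (the sample profile, strip length and use of SciPy are assumptions for illustration only).

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Seven interpolation points sampled from a hypothetical CAD cross-section profile
    x_ctrl = np.linspace(0.0, 0.6, 7)                   # strip length of 0.6 m (assumption)
    z_ctrl = 0.05 * np.sin(2.0 * np.pi * x_ctrl / 0.6)  # illustrative target heights

    # Natural cubic spline: a common linearized approximation of a minimal energy curve
    strip = CubicSpline(x_ctrl, z_ctrl, bc_type="natural")

    # Continuous profile between the servo-controlled points, as the strip would render it
    dense_x = np.linspace(0.0, 0.6, 13)
    print(np.round(strip(dense_x), 4))
    ```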

  1. Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.

    Science.gov (United States)

    Pinzon, David; Byrns, Simon; Zheng, Bin

    2016-08-01

    Background The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy An in-depth review of the literature was performed using PubMed. Results A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified, including advancements in 1 of the 4 components of haptic systems, evaluating the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back, and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance and safety, and improve training efficiency. © The Author(s) 2016.

  2. Virtual reality for spherical images

    Science.gov (United States)

    Pilarczyk, Rafal; Skarbek, Władysław

    2017-08-01

    This paper presents a virtual reality application framework and application concept for mobile devices. The framework uses the Google Cardboard library for the Android operating system and allows the creation of a virtual reality 360° video player using standard OpenGL ES rendering methods. It provides network methods to connect to a web server acting as an application resource provider; resources are delivered as JSON responses to HTTP requests. The web server also uses the Socket.IO library for synchronous communication between the application and the server. The framework implements methods to create an event-driven process for rendering additional content based on the video timestamp and the virtual reality head point of view.
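
    According to the record, application resources arrive as JSON responses to HTTP requests and extra content is rendered in an event-driven way keyed to the video timestamp. The original framework is an Android application built on the Google Cardboard library; the Python sketch below only illustrates the resource-fetching and timestamp-selection logic, and the endpoint URL and JSON schema are hypothetical assumptions.

    ```python
    import json
    from urllib.request import urlopen

    RESOURCE_URL = "http://example.com/api/overlays"   # hypothetical endpoint

    def fetch_overlays(url=RESOURCE_URL):
        """Download the overlay list; each item is assumed to carry a 'timestamp' key."""
        with urlopen(url) as response:
            return json.loads(response.read().decode("utf-8"))

    def overlays_due(overlays, video_timestamp, window=0.5):
        """Return overlays whose trigger time falls inside the current playback window."""
        return [o for o in overlays
                if 0.0 <= video_timestamp - o["timestamp"] < window]

    # Offline illustration with a hand-written resource list
    sample = [{"timestamp": 3.0, "text": "scene A"}, {"timestamp": 8.5, "text": "scene B"}]
    print(overlays_due(sample, video_timestamp=3.2))
    ```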

  3. Bilateral teleoperation of underactuated unmanned aerial vehicles: The virtual slave concept

    NARCIS (Netherlands)

    Mersha, A.Y.; Stramigioli, Stefano; Carloni, Raffaella

    In this paper, we present haptic teleoperation of underactuated unmanned aerial vehicles by providing a multidimensional generalization of the virtual slave concept. The proposed control architecture is composed of high-level and low-level controllers. The high-level controller commands the vehicle

  4. Audio-haptic interaction in simulated walking experiences

    DEFF Research Database (Denmark)

    Serafin, Stefania

    2011-01-01

    and interchangeable use of the haptic and auditory modality in floor interfaces, and for the synergy of perception and action in capturing and guiding human walking. We describe the technology developed in the context of this project, together with some experiments performed to evaluate the role of auditory......In this paper an overview of the work conducted on audio-haptic physically based simulation and evaluation of walking is provided. This work has been performed in the context of the Natural Interactive Walking (NIW) project, whose goal is to investigate possibilities for the integrated...... and haptic feedback in walking tasks....

  5. Haptic sense and the politicization of contemporary image

    Directory of Open Access Journals (Sweden)

    Tarcisio Torres Silva

    2017-08-01

    Full Text Available In this paper, we propose a theoretical approach to the political effects of the sense of touch/haptics, in order to understand to what extent the intensification of contemporary haptic experience contributes to creating proximity and engagement among individuals overloaded by the excess of visual information offered by multiple media. At the end, the work of the Brazilian artist Rodrigo Braga is mentioned to exemplify the contemporary political use of the haptic sense.

  6. Toward a comprehensive hybrid physical-virtual reality simulator of peripheral anesthesia with ultrasound and neurostimulator guidance.

    Science.gov (United States)

    Samosky, Joseph T; Allen, Pete; Boronyak, Steve; Branstetter, Barton; Hein, Steven; Juhas, Mark; Nelson, Douglas A; Orebaugh, Steven; Pinto, Rohan; Smelko, Adam; Thompson, Mitch; Weaver, Robert A

    2011-01-01

    We are developing a simulator of peripheral nerve block utilizing a mixed-reality approach: the combination of a physical model, an MRI-derived virtual model, mechatronics and spatial tracking. Our design uses tangible (physical) interfaces to simulate surface anatomy, haptic feedback during needle insertion, mechatronic display of muscle twitch corresponding to the specific nerve stimulated, and visual and haptic feedback for the injection syringe. The twitch response is calculated incorporating the sensed output of a real neurostimulator. The virtual model is isomorphic with the physical model and is derived from segmented MRI data. This model provides the subsurface anatomy and, combined with electromagnetic tracking of a sham ultrasound probe and a standard nerve block needle, supports simulated ultrasound display and measurement of needle location and proximity to nerves and vessels. The needle tracking and virtual model also support objective performance metrics of needle targeting technique.

  7. Conflicting audio-haptic feedback in physically based simulation of walking sounds

    DEFF Research Database (Denmark)

    Turchet, Luca; Serafin, Stefania; Dimitrov, Smilen

    2010-01-01

    We describe an audio-haptic experiment conducted using a system which simulates in real time the auditory and haptic sensations of walking on different surfaces. The system is based on physical models, which drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors...... and actuators. The experiment was run to examine the ability of subjects to recognize the different surfaces with both coherent and incoherent audio-haptic stimuli. Results show that in this kind of task the auditory modality is dominant over the haptic one....

  8. Multimodal Sensing Interface for Haptic Interaction

    Directory of Open Access Journals (Sweden)

    Carlos Diaz

    2017-01-01

    Full Text Available This paper investigates the integration of a multimodal sensing system for exploring the limits of vibrotactile haptic feedback when interacting with 3D representations of real objects. In this study, the spatial locations of the objects are mapped to the work volume of the user using a Kinect sensor. The position of the user's hand is obtained using marker-based visual processing. The depth information is used to build a vibrotactile map on a haptic glove enhanced with vibration motors. The users can perceive the location and dimension of remote objects by moving their hand inside a scanning region. A marker detection camera provides the location and orientation of the user's hand (glove) to map the corresponding tactile message. A preliminary study was conducted to explore how different users perceive such haptic experiences. Factors such as the total number of objects detected, object separation resolution, and dimension-based and shape-based discrimination were evaluated. The preliminary results showed that the localization and counting of objects can be attained with a high degree of success. The users were able to classify groups of objects of different dimensions based on the perceived haptic feedback.
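
    The record describes building a vibrotactile map from Kinect depth data so that objects in front of the tracked hand drive the glove's vibration motors. The sketch below shows one simple way such a mapping could work, assuming a linear intensity law between a near and a far distance and a fixed pixel offset per finger motor (ranges, offsets and the synthetic depth image are illustrative assumptions).

    ```python
    import numpy as np

    def vibration_intensity(depth_m, near=0.5, far=2.0):
        """Map a distance in metres to a motor duty cycle in [0, 1]."""
        d = np.clip(depth_m, near, far)
        return float((far - d) / (far - near))

    def glove_pattern(depth_image, hand_px, motor_offsets_px):
        """Sample the depth image around the tracked hand pixel for each finger motor."""
        h, w = depth_image.shape
        pattern = []
        for dx, dy in motor_offsets_px:
            x = int(np.clip(hand_px[0] + dx, 0, w - 1))
            y = int(np.clip(hand_px[1] + dy, 0, h - 1))
            pattern.append(vibration_intensity(depth_image[y, x]))
        return pattern

    depth = np.full((480, 640), 2.5)          # empty scene, beyond the far limit
    depth[200:280, 300:380] = 0.8             # a virtual object 0.8 m in front of the hand
    print(glove_pattern(depth, hand_px=(340, 240),
                        motor_offsets_px=[(-20, 0), (0, 0), (60, 0)]))
    ```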

  9. Virtual reality training improves balance function.

    Science.gov (United States)

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-09-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.

  10. Virtual reality training improves balance function

    Science.gov (United States)

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function. PMID:25368651

  11. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.

    Science.gov (United States)

    Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea

    2017-09-29

    A visuo-haptic augmented reality (VHAR) interface is presented that enables an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose locations are unknown to the user, without close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3-DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV, and geographical information such as buildings. Experiments performed in a real environment using an intense nuclear source are reported.
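
    The abstract above says the 3-DOF haptic device renders an attractive force around the location of the most intense detected radiation source, without giving the force law. The sketch below assumes a simple spring-like pull toward the estimated source position, saturated to a maximum magnitude so that it stays within the device's capability (gain and limit values are illustrative assumptions, not taken from the paper).

    ```python
    import numpy as np

    K_ATTRACT = 2.0   # N/m, spring gain toward the source estimate (illustrative)
    F_MAX = 3.0       # N, saturation to respect the device's force limit (illustrative)

    def guidance_force(haptic_pos, source_estimate):
        """Attractive force pulling the operator's hand toward the mapped source location."""
        error = np.asarray(source_estimate, dtype=float) - np.asarray(haptic_pos, dtype=float)
        force = K_ATTRACT * error
        norm = np.linalg.norm(force)
        if norm > F_MAX:
            force *= F_MAX / norm
        return force

    print(guidance_force(haptic_pos=[0.0, 0.0, 0.0], source_estimate=[0.4, 0.0, 0.1]))
    ```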

  12. Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block

    Directory of Open Access Journals (Sweden)

    Cléber Gimenez CORRÊA

    Full Text Available Abstract Objectives This study shows the development and validation of a dental anesthesia-training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle in a human patient, using Virtual Reality (VR) techniques and a haptic device that can provide a perceived force feedback in the needle insertion task during the anesthesia procedure. Material and Methods To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to a haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; Repeated Measures Two-Way ANOVA (Analysis of Variance), Tukey post-hoc tests and averages were used for the analysis of results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire included profile, preferences (number of viewpoints, texture of the objects, and haptic device handler), as well as visual (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). Results The visual aspect was considered appropriate and the haptic feedback must be improved, which the users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, with p<0.01). The user preferences were the simulator with two viewpoints, objects with texture based on images, and the device with a syringe coupled to it. Conclusion The simulation was considered thoroughly satisfactory for anesthesia training, considering the needle insertion task, which includes the correct insertion point and depth, as well as the perception of tissue resistance during the insertion.

  13. Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block.

    Science.gov (United States)

    Corrêa, Cléber Gimenez; Machado, Maria Aparecida de Andrade Moreira; Ranzini, Edith; Tori, Romero; Nunes, Fátima de Lourdes Santos

    2017-01-01

    This study shows the development and validation of a dental anesthesia-training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle in a human patient, using Virtual Reality (VR) techniques and a haptic device that can provide a perceived force feedback in the needle insertion task during the anesthesia procedure. To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to a haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; Repeated Measures Two-Way ANOVA (Analysis of Variance), Tukey post-hoc tests and averages were used for the analysis of results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire included profile, preferences (number of viewpoints, texture of the objects, and haptic device handler), as well as visual (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). The visual aspect was considered appropriate and the haptic feedback must be improved, which the users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, with p<0.01). The user preferences were the simulator with two viewpoints, objects with texture based on images, and the device with a syringe coupled to it. The simulation was considered thoroughly satisfactory for anesthesia training, considering the needle insertion task, which includes the correct insertion point and depth, as well as the perception of tissue resistance during the insertion.

  14. Mathematical model of bone drilling for virtual surgery system

    Science.gov (United States)

    Alaytsev, Innokentiy K.; Danilova, Tatyana V.; Manturov, Alexey O.; Mareev, Gleb O.; Mareev, Oleg V.

    2018-04-01

    Bone drilling is an essential part of surgeries in ENT and dentistry. Proper training of drill-handling skills is impossible without proper modelling of the drilling process. The use of high-precision methods like FEM is limited by the 1000 Hz update rate required for haptic feedback. The study presents a mathematical model of the drilling process that accounts for the material properties and the geometry and rotation rate of the burr to compute the removed material volume. The simplicity of the model allows it to be integrated into the high-frequency haptic thread. The precision of the model is sufficient for a virtual surgery system targeted at the training of basic surgical skills.
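
    The record summarizes a removal model that depends on the material properties, burr geometry and rotation rate, and that is cheap enough to run inside the 1000 Hz haptic thread; the actual equations are not reproduced in the abstract. The sketch below is only a plausible stand-in in which the volume removed per tick scales with the spherical-cap overlap between a ball burr and the bone, the rotation rate and a material coefficient (all constants and the functional form are illustrative assumptions).

    ```python
    import math

    def removed_volume_per_tick(penetration, burr_radius, rpm, material_coeff, dt=0.001):
        """Bone volume removed during one 1 ms haptic tick (cubic millimetres).

        penetration    -- depth of the spherical burr inside the bone surface (mm)
        burr_radius    -- burr radius (mm)
        rpm            -- burr rotation rate (revolutions per minute)
        material_coeff -- removal efficiency of the bone material (illustrative units)
        """
        if penetration <= 0.0:
            return 0.0
        h = min(penetration, 2.0 * burr_radius)
        # Volume of the spherical cap of the burr currently overlapping the bone
        cap_volume = math.pi * h * h * (3.0 * burr_radius - h) / 3.0
        # Removal grows with rotation rate and with softer material
        return material_coeff * (rpm / 60.0) * cap_volume * dt

    print(removed_volume_per_tick(penetration=0.3, burr_radius=2.0, rpm=20000, material_coeff=0.05))
    ```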

  15. Haptic Feedback for Enhancing Realism of Walking Simulations

    DEFF Research Database (Denmark)

    Turchet, Luca; Burelli, Paolo; Serafin, Stefania

    2013-01-01

    system. While during the use of the interactive system subjects physically walked, during the use of the non-interactive system the locomotion was simulated while subjects were sitting on a chair. In both configurations subjects were exposed to auditory and audio-visual stimuli presented...... with and without haptic feedback. Results of the experiments show a clear preference for the simulations enhanced with haptic feedback, showing that the haptic channel can lead to more realistic experiences in both interactive and non-interactive configurations. The majority of subjects clearly...... appreciated the added feedback. However, some subjects found the added feedback disturbing and annoying. This might be due on the one hand to the limits of the haptic simulation and on the other hand to differences in the individual desire to be involved in the simulations. Our findings can be applied to the context...

  16. Vertical illusory self-motion through haptic stimulation of the feet

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Nilsson, Niels Christian; Turchet, Luca

    2012-01-01

    Circular and linear self-motion illusions induced through visual and auditory stimuli have been studied rather extensively. While the ability of haptic stimuli to augment such illusions has been investigated, the self-motion illusions which primarily are induced by stimulation of the haptic...... to generate the haptic feedback while the final condition included no haptic feedback. Analysis of self-reports were used to assess the participants' experience of illusory self-motion. The results indicate that such illusions are indeed possible. Significant differences were found between the condition...... modality remain relatively unexplored. In this paper, we present an experiment performed with the intention of investigating whether it is possible to use haptic stimulation of the main supporting areas of the feet to induce vertical illusory self-motion on behalf of unrestrained participants during...

  17. Matching rendered and real world images by digital image processing

    Science.gov (United States)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated images (CGI) have been used in commercial and industrial photography, providing broad scope for product advertising. Mixing real-world images with those rendered by virtual-space software reveals a more or less visible mismatch between the corresponding image quality performances. Rendered images are produced by software whose quality is limited only by the output resolution. Real-world images are taken with cameras subject to a number of image degradation factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color pattern demosaicing, etc. The effect of all these image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object with the system PSF, its characterization shows the amount of image degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real-world image qualities. The system MTF is determined by the slanted-edge method both under laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered with a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different regions of the final image.
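
    The abstract explains that rendered images are blurred with a Gaussian filter whose width is derived from the measured PSF/MTF of the camera system. A minimal sketch of that degradation step is shown below, assuming the PSF has already been reduced to a single Gaussian sigma expressed in pixels (the sigma value, the synthetic edge target and the use of scipy.ndimage are illustrative assumptions).

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def degrade_render(rendered, sigma_px):
        """Blur a rendered image with a Gaussian approximation of the camera PSF.

        rendered -- 2D (grayscale) or 3D (H, W, channels) float array
        sigma_px -- Gaussian sigma in pixels, derived from the measured system MTF/PSF
        """
        if rendered.ndim == 3:
            return gaussian_filter(rendered, sigma=(sigma_px, sigma_px, 0))
        return gaussian_filter(rendered, sigma=sigma_px)

    # Synthetic edge target, illustrating the loss of edge contrast after filtering
    edge = np.zeros((64, 64))
    edge[:, 32:] = 1.0
    blurred = degrade_render(edge, sigma_px=1.4)
    print(np.round(blurred[0, 29:36], 2))   # smoothed transition around the former hard edge
    ```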

  18. Design and Control of a Haptic Enabled Robotic Manipulator

    Directory of Open Access Journals (Sweden)

    Muhammad Yaqoob

    2015-07-01

    Full Text Available Robotic surgery offers various advantages over conventional surgery, including less bleeding, less trauma, and more precise tissue cutting. However, even surgeons who use the best commercially available surgical robotic systems complain about the absence of haptic feedback in such systems. In this paper, we present the findings of our project to overcome this shortcoming of surgical robotic systems, in which a haptic-enabled robotic system based on a master-slave topology is designed and developed. To detect real-time intrusion at the slave end, haptic feedback is implemented along with a programmable system-on-chip functioning as an embedded system for processing information. In order to obtain real-time haptic feedback, force and motion sensors are mounted on each joint of the master and slave units. At the master end, results are displayed through a graphical user interface, along with the physical feeling of intrusion at the slave end. Apart from the obvious applications of the current system in robotic surgery, it could also be used in designing more intuitive video games with more precise haptic feedback mechanisms. Moreover, the results presented in our work should pave the way for further scientific investigation to provide even better haptic mechanisms.

  19. Haptic spatial matching in near peripersonal space.

    Science.gov (United States)

    Kaas, Amanda L; Mier, Hanneke I van

    2006-04-01

    Research has shown that haptic spatial matching at intermanual distances over 60 cm is prone to large systematic errors. The error pattern has been explained by the use of reference frames intermediate between egocentric and allocentric coding. This study investigated haptic performance in near peripersonal space, i.e. at intermanual distances of 60 cm and less. Twelve blindfolded participants (six males and six females) were presented with two turn bars at equal distances from the midsagittal plane, 30 or 60 cm apart. Different orientations (vertical/horizontal or oblique) of the left bar had to be matched by adjusting the right bar to either a mirror symmetric (/ \\) or parallel (/ /) position. The mirror symmetry task can in principle be performed accurately in both an egocentric and an allocentric reference frame, whereas the parallel task requires an allocentric representation. Results showed that parallel matching induced large systematic errors which increased with distance. Overall error was significantly smaller in the mirror task. The task difference also held for the vertical orientation at 60 cm distance, even though this orientation required the same response in both tasks, showing a marked effect of task instruction. In addition, men outperformed women on the parallel task. Finally, contrary to our expectations, systematic errors were found in the mirror task, predominantly at 30 cm distance. Based on these findings, we suggest that haptic performance in near peripersonal space might be dominated by different mechanisms than those which come into play at distances over 60 cm. Moreover, our results indicate that both inter-individual differences and task demands affect task performance in haptic spatial matching. Therefore, we conclude that the study of haptic spatial matching in near peripersonal space might reveal important additional constraints for the specification of adequate models of haptic spatial performance.

  20. Single minimum incision endoscopic radical nephrectomy for renal tumors with preoperative virtual navigation using 3D-CT volume-rendering

    Directory of Open Access Journals (Sweden)

    Shioyama Yasukazu

    2010-04-01

    Full Text Available Abstract Background Single minimum incision endoscopic surgery (MIES) involves the use of a flexible high-definition laparoscope to facilitate open surgery. We reviewed our method of radical nephrectomy for renal tumors, which is single MIES combined with preoperative virtual surgery employing three-dimensional CT images reconstructed by the volume rendering method (3D-CT images) in order to safely and appropriately approach the renal hilar vessels. We also assessed the usefulness of 3D-CT images. Methods Radical nephrectomy was done by single MIES via the translumbar approach in 80 consecutive patients. We performed the initial 20 MIES nephrectomies without preoperative 3D-CT images and the subsequent 60 MIES nephrectomies with preoperative 3D-CT images for evaluation of the renal hilar vessels and the relation of each tumor to the surrounding structures. On the basis of the 3D information, preoperative virtual surgery was performed with a computer. Results Single MIES nephrectomy was successful in all patients. In the 60 patients who underwent 3D-CT, the number of renal arteries and veins corresponded exactly with the preoperative 3D-CT data (100% sensitivity and 100% specificity). These 60 nephrectomies were completed with a shorter operating time and smaller blood loss than the initial 20 nephrectomies. Conclusions Single MIES radical nephrectomy combined with 3D-CT and virtual surgery achieved a shorter operating time and less blood loss, possibly due to safer and easier handling of the renal hilar vessels.

  1. Haptic sensitivity in needle insertion: the effects of training and visual aid

    Directory of Open Access Journals (Sweden)

    Dumas Cedric

    2011-12-01

    Full Text Available This paper describes an experiment conducted to measure haptic sensitivity and the effects of haptic training with and without a visual aid. The protocol for haptic training consisted of a needle insertion task using dual-layer silicone samples. A visual aid was provided as a multimodal cue for the haptic perception task. Results showed that for a group of novices (subjects with no previous experience in needle insertion), training with a visual aid resulted in a longer time to task completion and a greater applied force during post-training tests. This suggests that haptic perception is easily overshadowed, and may be completely replaced, by visual feedback. Therefore, haptic skills must be trained differently from visuomotor skills.

  2. Vision holds a greater share in visuo-haptic object recognition than touch

    DEFF Research Database (Denmark)

    Kassuba, Tanja; Klinge, Corinna; Hölig, Cordula

    2013-01-01

    approach of multisensory integration would predict that haptics as the less efficient sense for object recognition gains more from integrating additional visual information than vice versa. To test for asymmetries between vision and touch in visuo-haptic interactions, we measured regional changes in brain...... processed the target object, being more pronounced for haptic than visual targets. This preferential response of visuo-haptic regions indicates a modality-specific asymmetry in crossmodal matching of visual and haptic object features, suggesting a functional primacy of vision over touch in visuo...

  3. Virtual Reality Simulators in the Process Industry: A Review of Existing Systems and the Way Towards ETS

    OpenAIRE

    Cibulka, Jaroslav; Komulainen, Tiina M.; Mirtaheri, Peyman; Nazir, Salman; Manca, Davide

    2016-01-01

    Simulator training with Virtual Reality Simulators deeply engages the operators and improves the learning outcome. The available commercial 3D and Virtual Reality Simulator products range from generic models for laptops to specialized projection rooms with a great variety of different audiovisual, haptic, and sensory effects. However, current virtual reality simulators do not take into account the physical and psychological strain involved in field operators’ work in real process plants. Coll...

  4. [Haptic tracking control for minimally invasive robotic surgery].

    Science.gov (United States)

    Xu, Zhaohong; Song, Chengli; Wu, Wenwu

    2012-06-01

    Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS is the lack of haptic perception for the surgeon, including in the commercially available da Vinci surgical robot system. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because solving the exact dynamics takes a certain period of time, we used a digital PID algorithm, informed by the robot dynamics, to ensure real-time bilateral control, which improves tracking precision and real-time control efficiency. To validate the proposed method, an experimental system in which two Novint Falcon haptic devices act as a master-slave system has been developed. Simulations and experiments showed that the proposed methods could provide instrument force feedback to the operator, and that the bilateral control strategy is an effective method for master-slave MIRS. The proposed methods could be applied to tele-robotic systems.
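
    The record reports replacing the full inverse-dynamics solution with a digital PID law so that bilateral master-slave control stays real time. A minimal sketch of a discrete position-tracking PID on one axis is given below; the gains, sample time and the toy slave dynamics are illustrative assumptions, not values from the paper.

    ```python
    class DigitalPID:
        """Discrete PID controller: the slave tracks the master position on one axis."""

        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, master_pos, slave_pos):
            error = master_pos - slave_pos
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Illustrative 1 kHz servo loop in which a 0.5 kg slave tracks a fixed master position
    pid = DigitalPID(kp=80.0, ki=20.0, kd=8.0, dt=0.001)
    slave_pos, slave_vel = 0.0, 0.0
    for _ in range(1000):
        force = pid.update(master_pos=0.05, slave_pos=slave_pos)
        slave_vel += force * 0.001 / 0.5      # F = m * a, integrated over one tick
        slave_pos += slave_vel * 0.001
    print(f"slave position after 1 s: {slave_pos:.4f} m")
    ```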

  5. Evaluation of flexible endoscope steering using haptic guidance

    NARCIS (Netherlands)

    Reilink, Rob; Stramigioli, Stefano; Kappers, Astrid M L; Misra, Sarthak

    Background: Steering the tip of a flexible endoscope relies on the physician's dexterity and experience. For complex flexible endoscopes, conventional controls may be inadequate. Methods: A steering method based on a multi-degree-of-freedom haptic device is presented. Haptic cues are generated based

  6. Evaluation of flexible endoscope steering using haptic guidance

    NARCIS (Netherlands)

    Reilink, Rob; Stramigioli, Stefano; Kappers, Astrid M.L.; Misra, Sarthak

    2011-01-01

    Background - Steering the tip of a flexible endoscope relies on the physician’s dexterity and experience. For complex flexible endoscopes, conventional controls may be inadequate. Methods - A steering method based on a multi-degree-of-freedom haptic device is presented. Haptic cues are generated

  7. Cranial implant design using augmented reality immersive system.

    Science.gov (United States)

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  8. Haptic interfaces using dielectric electroactive polymers

    Science.gov (United States)

    Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos

    2010-04-01

    Quality, amplitude and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously with quiet operation, low weight, high power density and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heart beat measurement application. In this testing, heart signals were acquired from a remote location using a wireless heart rate sensor, sent through a network, and the DEA was used to haptically reproduce the heart beats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DE-based haptic feeling can be used in heart beat measurement tests and b) through subjective testing, the stiffness and actuator properties of the EAP can be tuned for a variety of applications.

  9. Teaching Classical Mechanics Concepts Using Visuo-Haptic Simulators

    Science.gov (United States)

    Neri, Luis; Noguez, Julieta; Robledo-Rella, Victor; Escobar-Castillejos, David; Gonzalez-Nucamendi, Andres

    2018-01-01

    In this work, the design and implementation of several physics scenarios using haptic devices are presented and discussed. Four visuo-haptic applications were developed for an undergraduate engineering physics course. Experiments with experimental and control groups were designed and implemented. Activities and exercises related to classical…

  10. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    Science.gov (United States)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
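    The abstract gives only the idea of the controller (derive the visual transmission rate from traffic conditions estimated via the real-time haptic packets); the sketch below is a minimal, hypothetical rate adapter in that spirit, not the paper's algorithm. The back-off rule, target delay and rate limits are all invented for illustration.

```python
def adapt_visual_rate(current_rate_bps, haptic_rtt_s,
                      target_rtt_s=0.010, step=0.1,
                      min_rate_bps=50_000, max_rate_bps=5_000_000):
    """Adapt the visual transmission rate from the round-trip delay measured
    on the small haptic packets.

    When the haptic delay exceeds the target (a sign of queue build-up or
    shrinking bandwidth), the visual rate is reduced multiplicatively to
    avoid buffer overflow; otherwise it probes upward for spare bandwidth.
    """
    if haptic_rtt_s > target_rtt_s:
        new_rate = current_rate_bps * (1.0 - step)   # back off: queues are growing
    else:
        new_rate = current_rate_bps * (1.0 + step)   # probe: headroom available
    return max(min_rate_bps, min(max_rate_bps, new_rate))
```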

  11. Haptic Routes and digestive destinations in cooking series

    DEFF Research Database (Denmark)

    Waade, Anne Marit; Jørgensen, Ulla Angkjær

    2010-01-01

    and the media in which aesthetical, cultural and symbolic values are related to the way food is mediatised. The main argument is that cooking television series produce haptic images of place and food that include a specific sensuous and emotional relation between screen and viewer. The haptic imagery...

  12. Semantic congruence in audio-haptic simulation of footsteps

    DEFF Research Database (Denmark)

    Turchet, Luca; Serafin, Stefania

    2014-01-01

    of semantic congruence for those audio–haptic pairs of materials which belonged to the same typology. Furthermore, better matching ability was found for the passive case compared to the interactive one, although this may be due to the limits of the technology used for the interactive haptic simulations....

  13. Haptograph Representation of Real-World Haptic Information by Wideband Force Control

    Science.gov (United States)

    Katsura, Seiichiro; Irie, Kouhei; Ohishi, Kiyoshi

    Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means. Furthermore, a video camera and a television make it possible to transmit visual sensation by broadcasting. By contrast, since tactile or haptic information is subject to Newton's law of action and reaction in the real world, a device that acquires, transmits, and reproduces this information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method for haptic information named the "haptograph". The haptograph visualizes haptic information much like a photograph. The proposed haptograph is applied to haptic recognition of the contact environment. A linear motor contacts the surface of the environment and its reaction force is used to make a haptograph. Robust contact motion and sensorless sensing of the reaction force are attained by using a disturbance observer. As a result, an encyclopedia of contact environments is obtained. Since temporal and spatial analyses are conducted to represent haptic information as the haptograph, it can be recognized and evaluated intuitively.
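    Sensorless reaction-force estimation with a disturbance observer, as mentioned in the abstract, is commonly built from the nominal motor model plus a low-pass filter. A minimal first-order version is sketched below; the nominal mass, observer cutoff and discretization are placeholders rather than the paper's parameters.

```python
import numpy as np

def estimate_reaction_force(force_ref, velocity, dt, mass_nominal=1.0, g=200.0):
    """First-order disturbance observer: estimate the external (reaction) force
    acting on a linear motor from its thrust command and measured velocity.

    force_ref    : commanded thrust per sample [N]
    velocity     : measured velocity per sample [m/s]
    g            : observer cutoff [rad/s]  (placeholder value)
    mass_nominal : nominal mover mass [kg]  (placeholder value)
    """
    z = 0.0
    d_hat = np.zeros(len(force_ref))
    for k in range(len(force_ref)):
        u = force_ref[k] + g * mass_nominal * velocity[k]
        z += dt * g * (u - z)                        # low-pass filter g/(s + g)
        d_hat[k] = z - g * mass_nominal * velocity[k]
    return d_hat
```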

  14. Immersive virtual walk-through development for tokamak using active head mounted display

    International Nuclear Information System (INIS)

    Dutta, Pramit

    2015-01-01

    A fully immersive virtual walk-through of the SST-1 tokamak has been developed. The virtual walkthrough renders the virtual model of the SST-1 tokamak through an active stereoscopic head-mounted display to visualize the virtual environment. All locations inside and outside of the reactor can be accessed and reviewed. Such a virtual walkthrough provides a 1:1 scale visualization of all components of the tokamak. To achieve such a virtual model, the graphical details of the tokamak CAD model are enhanced. These enhancements improve lighting conditions at various locations, add texturing of components for a realistic visual effect, and provide 360° rendering for ease of access. The graphical enhancements also include redefinition of the facets to optimize the surface triangles and remove lags during visual rendering. Two separate algorithms were developed to interact with the virtual model. A fly-by algorithm, developed in C#, uses inputs from a commercial joystick to navigate within the virtual environment. The second algorithm uses the IR and gyroscopic tracking system of the head-mounted display to render the view according to the user's current pose within the virtual environment and the direction of view. Such a virtual walk-through can be used extensively for design review and integration, review of new components, operator training for remote handling, operations, upgrades of the tokamak, etc. (author)

  15. OzBot and haptics: remote surveillance to physical presence

    Science.gov (United States)

    Mullins, James; Fielding, Mick; Nahavandi, Saeid

    2009-05-01

    This paper reports on robotic and haptic technologies and capabilities developed for the law enforcement and defence community within Australia by the Centre for Intelligent Systems Research (CISR). The OzBot series of small and medium surveillance robots have been designed in Australia and evaluated by law enforcement and defence personnel to determine suitability and ruggedness in a variety of environments. Using custom-developed digital electronics and featuring expandable data busses including RS485, I2C, RS232, video and Ethernet, the robots can be directly connected to many off-the-shelf payloads such as gas sensors, x-ray sources and camera systems including thermal and night vision. Differentiating the OzBot platform from its peers is its ability to be integrated directly with haptic technology or the 'haptic bubble' developed by CISR. Haptic interfaces allow an operator to physically 'feel' remote environments through position-force control and experience realistic force feedback. By adding the capability to remotely grasp an object and feel its weight, texture and other physical properties in real time from the remote ground control unit, an operator's situational awareness is greatly improved through haptic augmentation in an environment where remote-system feedback is often limited.

  16. Evaluating Remapped Physical Reach for Hand Interactions with Passive Haptics in Virtual Reality.

    Science.gov (United States)

    Han, Dustin T; Suhail, Mohamed; Ragan, Eric D

    2018-04-01

    Virtual reality often uses motion tracking to incorporate physical hand movements into interaction techniques for selection and manipulation of virtual objects. To increase realism and allow direct hand interaction, real-world physical objects can be aligned with virtual objects to provide tactile feedback and physical grasping. However, unless a physical space is custom configured to match a specific virtual reality experience, the ability to perfectly match the physical and virtual objects is limited. Our research addresses this challenge by studying methods that allow one physical object to be mapped to multiple virtual objects that can exist at different virtual locations in an egocentric reference frame. We study two such techniques: one that introduces a static translational offset between the virtual and physical hand before a reaching action, and one that dynamically interpolates the position of the virtual hand during a reaching motion. We conducted two experiments to assess how the two methods affect reaching effectiveness, comfort, and ability to adapt to the remapping techniques when reaching for objects with different types of mismatches between physical and virtual locations. We also present a case study to demonstrate how the hand remapping techniques could be used in an immersive game application to support realistic hand interaction while optimizing usability. Overall, the translational technique performed better than the interpolated reach technique and was more robust for situations with larger mismatches between virtual and physical objects.
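    The two remapping techniques can be expressed as simple functions of the tracked physical hand position: a constant offset applied for the whole reach versus an offset blended in as the reach progresses. The sketch below is one plausible reading of that description; variable names and the linear interpolation weight are assumptions, not the authors' exact formulation.

```python
import numpy as np

def translational_offset(physical_hand, physical_target, virtual_target):
    """Static remapping: shift the virtual hand by the constant target mismatch."""
    offset = np.asarray(virtual_target, dtype=float) - np.asarray(physical_target, dtype=float)
    return np.asarray(physical_hand, dtype=float) + offset

def interpolated_reach(physical_hand, start, physical_target, virtual_target):
    """Dynamic remapping: blend in the mismatch as the reach progresses (0 -> 1)."""
    physical_hand = np.asarray(physical_hand, dtype=float)
    total = np.linalg.norm(np.asarray(physical_target) - np.asarray(start))
    traveled = np.linalg.norm(physical_hand - np.asarray(start))
    alpha = np.clip(traveled / total, 0.0, 1.0) if total > 0 else 0.0
    offset = np.asarray(virtual_target, dtype=float) - np.asarray(physical_target, dtype=float)
    return physical_hand + alpha * offset
```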

  17. Control of a Robot Dancer for Enhancing Haptic Human-Robot Interaction in Waltz.

    Science.gov (United States)

    Hongbo Wang; Kosuge, K

    2012-01-01

    Haptic interaction between a human leader and a robot follower in waltz is studied in this paper. An inverted pendulum model is used to approximate the human's body dynamics. With feedback from the force sensor and laser range finders, the robot is able to estimate the human leader's state using an extended Kalman filter (EKF). To reduce the interaction force, two robot controllers, namely an admittance-with-virtual-force controller and an inverted pendulum controller, are proposed and evaluated in experiments. The former controller failed the experiment; reasons for the failure are explained. The use of the latter controller is validated by the experimental results.
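    The abstract does not state the pendulum or sensor models, so the sketch below only shows the generic extended Kalman filter predict/update cycle that such an estimator would run each control step, with the motion and measurement models left as user-supplied functions. Everything here is illustrative.

```python
import numpy as np

def ekf_step(x, P, u, z, f, F_jac, h, H_jac, Q, R):
    """One generic EKF cycle: predict with the motion model f, correct with
    the measurement model h (e.g. force-sensor and laser-range readings).

    x, P          : prior state estimate and covariance
    u, z          : control input and measurement vector
    f, h          : nonlinear motion / measurement functions
    F_jac, H_jac  : their Jacobians evaluated at the current estimate
    Q, R          : process and measurement noise covariances
    """
    # Predict
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    y = z - h(x_pred)                        # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```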

  18. Investigating Students' Ideas About Buoyancy and the Influence of Haptic Feedback

    Science.gov (United States)

    Minogue, James; Borland, David

    2016-04-01

    While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of everyday experiences, a scientifically sound explanation of buoyancy remains difficult to construct for many. It requires the integration of domain-specific knowledge regarding density, fluid, force, gravity, mass, weight, and buoyancy. Prior studies suggest that novices often focus on only one dimension of the sinking and floating phenomenon. Our HES was designed to promote the integration of the subconcepts of density and buoyant forces and stresses the relationship between the object itself and the surrounding fluid. The study employed a randomized pretest-posttest control group research design and a suite of measures including an open-ended prompt and objective content questions to provide insights into the influence of haptic feedback on undergraduate students' thinking about buoyancy. A convenience sample (n = 40) was drawn from a university's population of undergraduate elementary education majors. Two groups were formed: haptic feedback (n = 22) and no haptic feedback (n = 18). Through content analysis, discernible differences were seen in the posttest explanations of sinking and floating across treatment groups. Learners who experienced the haptic feedback made more frequent use of "haptically grounded" terms (e.g., mass, gravity, buoyant force, pushing), leading us to begin to build a local theory of language-mediated haptic cognition.

  19. Force modeling for incisions into various tissues with MRF haptic master

    Science.gov (United States)

    Kim, Pyunghwa; Kim, Soomin; Park, Young-Dai; Choi, Seung-Bok

    2016-03-01

    This study proposes a new model to predict the reaction force that occurs in incisions during robot-assisted minimally invasive surgery. The reaction force is fed back to the manipulator by a magneto-rheological fluid (MRF) haptic master, which is featured by a bi-directional clutch actuator. The reaction force feedback provides similar sensations to laparotomy that cannot be provided by a conventional master for surgery. This advantage shortens the training period for robot-assisted minimally invasive surgery and can improve the accuracy of operations. The reaction force modeling of incisions can be utilized in a surgical simulator that provides a virtual reaction force. In this work, in order to model the reaction force during incisions, the energy aspect of the incision process is adopted and analyzed. Each mode of the incision process is classified by the tendency of the energy change, and modeled for realistic real-time application. The reaction force model uses actual reaction force information with three types of actual tissues: hard tissue, medium tissue, and soft tissue. This modeled force is realized by the MRF haptic master through an algorithm based on the position and velocity of a scalpel using two different control methods: an open-loop algorithm and a closed-loop algorithm. The reaction forces obtained from the proposed model are compared with a desired force in time domain.
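    The paper's model is energy-based and fitted to measured data for the three tissue types; as a purely illustrative stand-in, a simulator often realizes such a model as a piecewise force law over the incision phases driven by scalpel position and velocity, as sketched below. The phase logic and coefficients are invented, not the paper's fitted values.

```python
def incision_force(depth_mm, velocity_mm_s, tissue="medium"):
    """Illustrative piecewise incision force: elastic loading before puncture,
    then friction-like cutting resistance afterwards.
    Coefficients are placeholders, not the paper's fitted values."""
    params = {                     # (stiffness N/mm, puncture depth mm, cutting coeff)
        "hard":   (4.0, 2.0, 1.5),
        "medium": (2.0, 3.0, 1.0),
        "soft":   (0.8, 4.0, 0.5),
    }
    k, d_punct, mu = params[tissue]
    if depth_mm < d_punct:         # phase 1: tissue deforms elastically
        return k * depth_mm
    # phase 2: after puncture, resistance is dominated by cutting friction
    return mu * abs(velocity_mm_s) + 0.2 * k * (depth_mm - d_punct)
```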

  20. Force modeling for incisions into various tissues with MRF haptic master

    International Nuclear Information System (INIS)

    Kim, Pyunghwa; Kim, Soomin; Park, Young-Dai; Choi, Seung-Bok

    2016-01-01

    This study proposes a new model to predict the reaction force that occurs in incisions during robot-assisted minimally invasive surgery. The reaction force is fed back to the manipulator by a magneto-rheological fluid (MRF) haptic master, which is featured by a bi-directional clutch actuator. The reaction force feedback provides similar sensations to laparotomy that cannot be provided by a conventional master for surgery. This advantage shortens the training period for robot-assisted minimally invasive surgery and can improve the accuracy of operations. The reaction force modeling of incisions can be utilized in a surgical simulator that provides a virtual reaction force. In this work, in order to model the reaction force during incisions, the energy aspect of the incision process is adopted and analyzed. Each mode of the incision process is classified by the tendency of the energy change, and modeled for realistic real-time application. The reaction force model uses actual reaction force information with three types of actual tissues: hard tissue, medium tissue, and soft tissue. This modeled force is realized by the MRF haptic master through an algorithm based on the position and velocity of a scalpel using two different control methods: an open-loop algorithm and a closed-loop algorithm. The reaction forces obtained from the proposed model are compared with a desired force in time domain. (paper)

  1. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    Directory of Open Access Journals (Sweden)

    Jacopo Aleotti

    2017-09-01

    Full Text Available A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
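    An attractive force cue of the kind described can be rendered as a saturated spring pulling the haptic handle toward the detected source, scaled by the measured intensity. The sketch below is a hypothetical implementation of that idea; the gain, saturation limit and scaling are not from the paper.

```python
import numpy as np

def attraction_force(uav_pos, source_pos, intensity, k=0.5, f_max=3.0):
    """Spring-like pull toward the most intense detected source, scaled by its
    measured intensity and saturated at the device force limit.
    Gains and the saturation value are illustrative, not from the paper."""
    direction = np.asarray(source_pos, dtype=float) - np.asarray(uav_pos, dtype=float)
    distance = np.linalg.norm(direction)
    if distance < 1e-6:
        return np.zeros(3)
    magnitude = min(k * intensity * distance, f_max)
    return magnitude * direction / distance
```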

  2. The contributions of vision and haptics to reaching and grasping

    Directory of Open Access Journals (Sweden)

    Kayla Dawn Stone

    2015-09-01

    Full Text Available This review aims to provide a comprehensive outlook on the sensory (visual and haptic) contributions to reaching and grasping. The focus is on studies in developing children, normal and neuropsychological populations, and in sensory-deprived individuals. Studies have suggested a right-hand/left-hemisphere specialization for visually-guided grasping and a left-hand/right-hemisphere specialization for haptically-guided object recognition. This poses the interesting possibility that when vision is not available and grasping relies heavily on the haptic system, there is an advantage to use the left hand. We review the evidence for this possibility and dissect the unique contributions of the visual and haptic systems to grasping. We ultimately discuss how the integration of these two sensory modalities shapes hand preference.

  3. Self-Control of Haptic Assistance for Motor Learning: Influences of Frequency and Opinion of Utility

    Science.gov (United States)

    Williams, Camille K.; Tseung, Victrine; Carnahan, Heather

    2017-01-01

    Studies of self-controlled practice have shown benefits when learners controlled the feedback schedule, the use of assistive devices, and task difficulty, with benefits attributed to the information-processing and motivational advantages of self-control. Although haptic assistance serves as feedback, aids task performance and modifies task difficulty, researchers have yet to explore whether self-control over haptic assistance could be beneficial for learning. We explored whether self-control of haptic assistance would be beneficial for learning a tracing task. Self-controlled participants selected practice blocks on which they would receive haptic assistance, while participants in a yoked group received haptic assistance on blocks determined by a matched self-controlled participant. We inferred learning from performance on retention tests without haptic assistance. From qualitative analysis of open-ended questions related to rationales for/experiences of the haptic assistance that was chosen/provided, themes emerged regarding participants' views of the utility of haptic assistance for performance and learning. Results showed that learning was directly impacted by the frequency of haptic assistance (for self-controlled participants only) and by participants' views of haptic assistance. Furthermore, self-controlled participants' views were significantly associated with their requested haptic assistance frequency. We discuss these findings as further support for the beneficial role of self-controlled practice for motor learning. PMID:29255438

  4. Cognitive and tactile factors affecting human haptic performance in later life.

    Directory of Open Access Journals (Sweden)

    Tobias Kalisch

    Full Text Available BACKGROUND: Vision and haptics are the key modalities by which humans perceive objects and interact with their environment in a target-oriented manner. Both modalities share higher-order neural resources and the mechanisms required for object exploration. Compared to vision, the understanding of haptic information processing is still rudimentary. Although it is known that haptic performance, similar to many other skills, decreases in old age, the underlying mechanisms are not clear. It is yet to be determined to what extent this decrease is related to the age-related loss of tactile acuity or cognitive capacity. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the haptic performance of 81 older adults by means of a cross-modal object recognition test. Additionally, we assessed the subjects' tactile acuity with an apparatus-based two-point discrimination paradigm, and their cognitive performance by means of the non-verbal Raven-Standard-Progressive matrices test. As expected, there was a significant age-related decline in performance on all 3 tests. With the exception of tactile acuity, this decline was found to be more distinct in female subjects. Correlation analyses revealed a strong relationship between haptic and cognitive performance for all subjects. Tactile performance, on the contrary, was only significantly correlated with male subjects' haptic performance. CONCLUSIONS: Haptic object recognition is a demanding task in old age, especially when it comes to the exploration of complex, unfamiliar objects. Our data support a disproportionately higher impact of cognition on haptic performance as compared to the impact of tactile acuity. Our findings are in agreement with studies reporting an increase in co-variation between individual sensory performance and general cognitive functioning in old age.

  5. A point-based rendering approach for real-time interaction on mobile devices

    Institute of Scientific and Technical Information of China (English)

    LIANG XiaoHui; ZHAO QinPing; HE ZhiYing; XIE Ke; LIU YuBo

    2009-01-01

    Mobile devices are an important interactive platform. Due to limitations in computation, memory, display area and energy, realizing efficient, real-time interaction with 3D models on mobile devices is an important research topic. Considering the features of mobile devices, this paper adopts a remote rendering mode and point models, and proposes a transmission and rendering approach that supports real-time interaction. First, an improved simplification algorithm based on MLS and the display resolution of mobile devices is proposed. Then, a hierarchy selection of point models and a QoS transmission control strategy are given, based on the operator's area of interest, the interest degree of each object in the virtual environment, and the rendering error; these measures reduce energy consumption. Finally, the rendering and interaction of point models are completed on mobile devices. The experiments show that our method is efficient.
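    A hierarchy (level-of-detail) selection driven by the operator's interest area, object importance and rendering error, as described above, can be sketched roughly as follows. The data layout, field names and thresholds are assumptions made for illustration only.

```python
def select_lod(levels, in_interest_area, interest_degree, error_budget_px):
    """Pick a hierarchy level for a point-model object.

    levels           : list of levels, coarse to fine, each a dict with a
                       projected screen-space error estimate (hypothetical field)
    in_interest_area : whether the object lies in the operator's area of interest
    interest_degree  : 0..1 importance of the object in the virtual environment
    error_budget_px  : maximum tolerated rendering error on the mobile display
    """
    # Outside the interest area, or for unimportant objects, tolerate more error
    # so that fewer points are transmitted and rendered (saving energy).
    budget = error_budget_px if in_interest_area else error_budget_px * 4
    budget /= max(interest_degree, 0.25)
    for level in levels:                        # coarse -> fine
        if level["screen_error_px"] <= budget:
            return level
    return levels[-1]                           # fall back to the finest level
```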

  6. A Study of an Assistance System Using a Haptic Interface

    OpenAIRE

    浅川, 貴史

    2011-01-01

    We propose a music baton system for visually handicapped persons. The system consists of an acceleration sensor, a radio module, and a haptic interface device. The acceleration sensor is built into the music baton grip and its data are transmitted by the radio module. A performer has a receiver with the haptic interface device. The receiver's CPU extracts the rhythm from the data and vibrates the haptic interface device. This paper describes an experiment comparing the ...

  7. Expanding the Scope of Instant Messaging with Bidirectional Haptic Communication

    OpenAIRE

    Kim, Youngjae; Hahn, Minsoo

    2010-01-01

    This work was conducted on the combination of two fields, i.e., haptics and social messaging. Haptics is one of the most attention-drawing fields and one of the biggest buzzwords among next-generation users. Haptics is being applied to conventional devices such as the cellular phone and even the door lock. Diverse forms of media such as blogs, social network services, and instant messengers are used to send and receive messages. That is mainly why we focus on the messaging experience, the most frequent ...

  8. The haptic and the visual flash-lag effect and the role of flash characteristics.

    Directory of Open Access Journals (Sweden)

    Knut Drewing

    Full Text Available When a short flash occurs in spatial alignment with a moving object, the moving object is seen ahead of the stationary one. Similar to this visual "flash-lag effect" (FLE), it has recently been observed for the haptic sense that participants judge a moving hand to be ahead of a stationary hand when judged at the moment of a short vibration ("haptic flash") that is applied when the two hands are spatially aligned. We further investigated the haptic FLE. First, we compared participants' performance in two isosensory visual or haptic conditions, in which the moving object and the flash were presented in a single modality (visual: sphere and short color change; haptic: hand and vibration), and two bisensory conditions, in which the moving object was presented in both modalities (hand aligned with visible sphere), but the flash was presented only visually or only haptically. The experiment aimed to disentangle contributions of the flash's and the object's modalities to the FLEs in haptics versus vision. We observed a FLE when the flash was displayed visually, both when the moving object was visual and when it was visuo-haptic. Because the position of a visual flash, but not of an analogous haptic flash, is misjudged relative to the same visuo-haptic moving object, the difference between visual and haptic conditions can be fully attributed to characteristics of the flash. The second experiment confirmed that a haptic FLE can be observed depending on flash characteristics: the FLE increases with decreasing intensity of the flash (slightly modulated by flash duration), which had been previously observed for vision. These findings underline the high relevance of flash characteristics in different senses, and thus fit well with the temporal-sampling framework, where the flash triggers a high-level, supra-modal process of position judgement, the time point of which further depends on the processing time of the flash.

  9. A haptic sensor-actor-system based on ultrasound elastography and electrorheological fluids for virtual reality applications in medicine.

    Science.gov (United States)

    Khaled, W; Ermert, H; Bruhns, O; Boese, H; Baumann, M; Monkman, G J; Egersdoerfer, S; Meier, A; Klein, D; Freimuth, H

    2003-01-01

    Mechanical properties of biological tissue represent important diagnostic information and are of histological relevance (hard lesions, "nodes" in organs: tumors; calcifications in vessels: arteriosclerosis). The problem is that such information is usually obtained by digital palpation only, which is limited with respect to sensitivity. It requires intuitive assessment and does not allow quantitative documentation. A suitable sensor is required for quantitative detection of mechanical tissue properties. On the other hand, there is also some need for a realistic mechanical display of those tissue properties. Suitable actuator arrays with high spatial resolution and real-time capabilities are required, operating in a haptic sensor-actuator system with different applications. The sensor system uses real-time ultrasonic elastography, whereas the tactile actuator is based on electrorheological fluids. Due to their small size, the actuator array elements have to be manufactured by micro-mechanical production methods. In order to supply the actuator elements with individual high voltages, a sophisticated switching and control concept has been designed. This haptic system has the potential of producing substantial forces in real time, using a compact lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching, space and telecommunication.

  10. Haptic Data Processing for Teleoperation Systems: Prediction, Compression and Error Correction

    OpenAIRE

    Lee, Jae-young

    2013-01-01

    This thesis explores haptic data processing methods for teleoperation systems, including prediction, compression, and error correction. In the proposed haptic data prediction method, unreliable network conditions, such as time-varying delay and packet loss, are detected by a transport layer protocol. Given the information from the transport layer, a Bayesian approach is introduced to predict position and force data in haptic teleoperation systems. Stability of the proposed method within stoch...

  11. Input and output for surgical simulation: devices to measure tissue properties in vivo and a haptic interface for laparoscopy simulators.

    Science.gov (United States)

    Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K

    2000-01-01

    Current efforts in surgical simulation very often focus on creating realistic graphical feedback, but neglect some or all tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy toolhandles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties, and change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry will be measured. This will allow us to validate the performance of the model on large tissue

  12. Superior haptic-to-visual shape matching in autism spectrum disorders.

    Science.gov (United States)

    Nakano, Tamami; Kato, Nobumasa; Kitazawa, Shigeru

    2012-04-01

    A weak central coherence theory in autism spectrum disorder (ASD) proposes that a cognitive bias toward local processing in ASD derives from a weakness in integrating local elements into a coherent whole. Using this theory, we hypothesized that shape perception through active touch, which requires sequential integration of sensorimotor traces of exploratory finger movements into a shape representation, would be impaired in ASD. Contrary to our expectation, adults with ASD showed superior performance in a haptic-to-visual delayed shape-matching task compared to adults without ASD. Accuracy in discriminating haptic lengths or haptic orientations, which lies within the somatosensory modality, did not differ between adults with ASD and adults without ASD. Moreover, this superior ability in inter-modal haptic-to-visual shape matching was not explained by the score in a unimodal visuospatial rotation task. These results suggest that individuals with ASD are not impaired in integrating sensorimotor traces into a global visual shape and that their multimodal shape representations and haptic-to-visual information transfer are more accurate than those of individuals without ASD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Real-Time 3D Motion capture by monocular vision and virtual rendering

    OpenAIRE

    Gomez Jauregui , David Antonio; Horain , Patrick

    2012-01-01

    Avatars in networked 3D virtual environments allow users to interact over the Internet and to get some feeling of virtual telepresence. However, avatar control may be tedious. Motion capture systems based on 3D sensors have recently reached the consumer market, but webcams and camera-phones are more widespread and cheaper. The proposed demonstration aims at animating a user's avatar from real time 3D motion capture by monoscopic computer vision, thus allowing virtual t...

  14. Realistic soft tissue deformation strategies for real time surgery simulation.

    Science.gov (United States)

    Shen, Yunhe; Zhou, Xiangmin; Zhang, Nan; Tamma, Kumar; Sweet, Robert

    2008-01-01

    A volume-preserving deformation method (VPDM) is developed to complement the mass-spring method (MSM) and improve the deformation quality of the MSM when modeling soft tissue in surgical simulation. This method can also be implemented as a stand-alone model. The proposed VPDM satisfies Newton's laws of motion by obtaining the resultant vectors from an equilibrium condition. The proposed method has been tested in virtual surgery systems with haptic rendering demands.
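    One common way to layer volume preservation on top of a mass-spring model is to add a pressure-like force along vertex normals, proportional to the deviation of the enclosed volume from its rest value. The sketch below shows only that generic structure under stated assumptions; it is not the paper's VPDM formulation, and all parameters are placeholders.

```python
import numpy as np

def volume_preserving_step(pos, vel, forces_spring, normals,
                           rest_volume, current_volume,
                           mass=1.0, k_volume=50.0, dt=1e-3):
    """One explicit integration step of a mass-spring mesh with a simple
    volume-preservation force (illustrative only).

    pos, vel      : (N, 3) vertex positions and velocities
    forces_spring : (N, 3) accumulated spring/damping forces
    normals       : (N, 3) outward vertex normals
    """
    # Pressure-like force: pushes vertices outward if the volume shrank,
    # inward if it grew, restoring the rest volume over time.
    f_volume = k_volume * (rest_volume - current_volume) * normals
    acc = (forces_spring + f_volume) / mass
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel
```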

  15. HSP v2: Haptic Signal Processing with Extensions for Physical Modeling

    DEFF Research Database (Denmark)

    Overholt, Daniel; Kontogeorgakopoulos, Alexandros; Berdahl, Edgar

    2010-01-01

    The Haptic Signal Processing (HSP) platform aims to enable musicians to easily design and perform with digital haptic musical instruments [1]. In this paper, we present some new objects introduced in version v2 for modeling of musical dynamical systems such as resonators and vibrating strings. To....... To our knowledge, this is the first time that these diverse physical modeling elements have all been made available for a modular, real-time haptics platform....

  16. Haptic Glove Technology: Skill Development through Video Game Play

    Science.gov (United States)

    Bargerhuff, Mary Ellen; Cowan, Heidi; Oliveira, Francisco; Quek, Francis; Fang, Bing

    2010-01-01

    This article introduces a recently developed haptic glove system and describes how the participants used a video game that was purposely designed to train them in skills that are needed for the efficient use of the haptic glove. Assessed skills included speed, efficiency, embodied skill, and engagement. The findings and implications for future…

  17. Web-based three-dimensional Virtual Body Structures: W3D-VBS.

    Science.gov (United States)

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it.

  18. A Fabric-Based Approach for Wearable Haptics

    Directory of Open Access Journals (Sweden)

    Matteo Bianchi

    2016-07-01

    Full Text Available In recent years, wearable haptic systems (WHS) have gained increasing attention as a novel and exciting paradigm for human–robot interaction (HRI). These systems can be worn by users, carried around, and integrated in their everyday lives, thus enabling a more natural manner to deliver tactile cues. At the same time, the design of these types of devices presents new issues: the challenge is the correct identification of design guidelines, with the two-fold goal of minimizing system encumbrance and increasing the effectiveness and naturalness of stimulus delivery. Fabrics can represent a viable solution to tackle these issues. They are specifically thought "to be worn", and could be the key ingredient to develop wearable haptic interfaces conceived for a more natural HRI. In this paper, the author will review some examples of fabric-based WHS that can be applied to different body locations, and elicit different haptic perceptions for different application fields. Perspective and future developments of this approach will be discussed.

  19. Visualizing dynamic geosciences phenomena using an octree-based view-dependent LOD strategy within virtual globes

    Science.gov (United States)

    Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo

    2011-09-01

    Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional data (4D), 3D in space plus time, is huge for rendering. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphic cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
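    The view-dependent refinement described above can be sketched as a recursive descent of the octree that stops as soon as a node's projected screen-space error falls within budget, emitting that node's data brick for the current time step. Method and field names below are assumptions for illustration, not the framework's actual API.

```python
def refine(node, view, error_threshold_px, time_step, visible_bricks):
    """Recursive view-dependent refinement of an octree holding time-varying
    volume bricks (schematic sketch under assumed node/view interfaces).

    A node is refined when its projected screen-space error exceeds the
    threshold; otherwise its brick for the current time step is rendered.
    """
    if node.is_leaf() or node.screen_space_error(view) <= error_threshold_px:
        visible_bricks.append(node.brick(time_step))   # coarse enough: draw this brick
        return
    for child in node.children:
        if child.intersects_frustum(view):             # cull subtrees outside the view
            refine(child, view, error_threshold_px, time_step, visible_bricks)
```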

  20. The Hedonic Haptics Player: A Wearable Device to Experience Vibrotactile Compositions

    OpenAIRE

    Boer, Laurens; Vallgårda, Anna; Cahill, Ben

    2017-01-01

    The Hedonic Haptics player is a portable wearable device that plays back vibrotactile compositions. It consists of three domes each of which houses a vibration motor providing vibrotactile sensations to the wearer. The domes are connected to a control unit the size of a small Walkman. The Hedonic Haptics player can store up to ten different compositions made up of haptic signals varying in amplitude, waveform and length. We use these different compositions to explore the aesthetic potential o...

  1. Haptic perception of wetness

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kosters, N.D.; Kappers, Astrid M.L.; Daanen, H.A.M.

    2012-01-01

    In daily life, people interact with textiles of different degrees of wetness, but little is known about the mechanics of wetness perception. This paper describes an experiment with six conditions regarding haptic discrimination of the wetness of fabrics. Three materials were used: cotton wool,

  2. Enhanced operator perception through 3D vision and haptic feedback

    Science.gov (United States)

    Edmondson, Richard; Light, Kenneth; Bodenhamer, Andrew; Bosscher, Paul; Wilkinson, Loren

    2012-06-01

    Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems comprised of a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems comprised of a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. In multiple studies done at Fort Leonard Wood, Missouri it has been shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.

  3. Parallel rendering

    Science.gov (United States)

    Crockett, Thomas W.

    1995-01-01

    This article provides a broad introduction to the subject of parallel rendering, encompassing both hardware and software systems. The focus is on the underlying concepts and the issues which arise in the design of parallel rendering algorithms and systems. We examine the different types of parallelism and how they can be applied in rendering applications. Concepts from parallel computing, such as data decomposition, task granularity, scalability, and load balancing, are considered in relation to the rendering problem. We also explore concepts from computer graphics, such as coherence and projection, which have a significant impact on the structure of parallel rendering algorithms. Our survey covers a number of practical considerations as well, including the choice of architectural platform, communication and memory requirements, and the problem of image assembly and display. We illustrate the discussion with numerous examples from the parallel rendering literature, representing most of the principal rendering methods currently used in computer graphics.

  4. Three-dimensional rendering of segmented object using matlab - biomed 2010.

    Science.gov (United States)

    Anderson, Jeffrey R; Barrett, Steven F

    2010-01-01

    The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work has described a semi-automatic segmentation process for fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux. This limits the usefulness of the segmentation and rendering applications to those who have both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and merge these capabilities with the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.
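    The original tool is written in Matlab; a rough Python analogue of the same pipeline (stack the binary slice masks into a volume, then extract a surface mesh for display) is sketched below, with marching cubes standing in for whatever surfacing the Matlab GUI performs.

```python
import numpy as np
from skimage import measure   # marching cubes for surface extraction

def surface_from_slices(slice_masks, spacing=(1.0, 1.0, 1.0)):
    """Stack 2D binary slice masks into a 3D volume and extract a surface mesh.

    slice_masks : list of 2D boolean arrays, one per confocal slice
    spacing     : voxel size (z, y, x), so the mesh is metrically correct
    Returns vertices and faces that can be handed to any mesh viewer.
    """
    volume = np.stack(slice_masks, axis=0).astype(np.uint8)
    verts, faces, _, _ = measure.marching_cubes(volume, level=0.5, spacing=spacing)
    return verts, faces
```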

  5. Rapid processing of haptic cues for postural control in blind subjects.

    Science.gov (United States)

    Schieppati, Marco; Schmid, Monica; Sozzi, Stefania

    2014-07-01

    Vision and touch rapidly lead to postural stabilization in sighted subjects. Is touch-induced stabilization more rapid in blind than in sighted subjects, owing to cross-modal reorganization of function in the blind? We estimated the time-period elapsing from onset of availability of haptic support to onset of lateral stabilization in a group of early- and late-onset blinds. Eleven blind (age 39.4 years±11.7SD) and eleven sighted subjects (age 30.0 years±10.0SD), standing eyes closed with feet in tandem position, touched a pad with their index finger and withdrew the finger from the pad in sequence. EMG of postural muscles and displacement of centre of foot pressure were recorded. The task was repeated fifty times, to allow statistical evaluation of the latency of EMG and sway changes following the haptic shift. Steady-state sway (with or without contact with pad, no haptic shift) did not differ between blind and sighted. On adding the haptic stimulus, EMG and sway diminished in both groups, but at an earlier latency (by about 0.5 s) in the blinds (p blinds. When the haptic stimulus was withdrawn, both groups increased EMG and sway at equally short delays. Blinds are rapid in implementing adaptive postural modifications when granted an external haptic reference. Fast processing of the stabilizing haptic spatial-orientation cues may be favoured by cortical plasticity in blinds. These findings add new information to the field of sensory-guided dynamic control of equilibrium in man. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Effect of haptic assistance on learning vehicle reverse parking skills.

    Science.gov (United States)

    Hirokawa, Masakazu; Uesugi, Naohisa; Furugori, Satoru; Kitagawa, Tomoko; Suzuki, Kenji

    2014-01-01

    Compared to conventional visual- and auditory-based assisted driving technologies, haptic modality promises to be more effective and less disturbing assistance to the driver. However, in most previous studies, haptic assistance systems were evaluated from safety and stability viewpoints. Moreover, the effect of haptic assistance on human driving behavior has not been sufficiently discussed. In this paper, we introduce an assisted driving method based on haptic assistance for driver training in reverse parking, which is considered as an uncertain factor in conventional assisted driving systems. The proposed system assists the driver by applying a torque on the steering wheel to guide proper and well-timed steering. To design the appropriate assistance method, we conducted a measurement experiment to determine the qualitative reverse parking driver characteristics. Based on the determined characteristics, we propose a haptic assistance calculation method that utilizes the receding horizon control algorithm. For a simulation environment to assess the proposed assistance method, we also developed a scaled car simulator comprising a 1/10 scaled robot car and an omnidirectional camera. We used the scaled car simulator to conduct comparative experiments on subjects, and observed that the driving skills of the assisted subjects were significantly better than those of the control subjects.

  7. Haptic perception of wetness

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kosters, N.D.; Daanen, H.A.M.; Kappers, A.M.L.

    2011-01-01

    The sensation of wetness is well-known but barely investigated. There are no specific wetness receptors in the skin, but the sensation is mediated by temperature and pressure perception. In our study, we have measured discrimination thresholds for the haptic perception of wetness of three different

  8. Haptic perception of wetness

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Dolfine Kosters, N.; Daanen, h.a.m.; Kappers, A.M.L.

    2012-01-01

    In daily life, people interact with textiles of different degrees of wetness, but little is known about the mechanics of wetness perception. This paper describes an experiment with six conditions regarding haptic discrimination of the wetness of fabrics. Three materials were used: cotton wool,

  9. INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF

    Science.gov (United States)

    HERSHFIELD, HAL E.; GOLDSTEIN, DANIEL G.; SHARPE, WILLIAM F.; FOX, JESSE; YEYKELIS, LEO; CARSTENSEN, LAURA L.; BAILENSON, JEREMY N.

    2014-01-01

    Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones. PMID:24634544

  10. INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF.

    Science.gov (United States)

    Hershfield, Hal E; Goldstein, Daniel G; Sharpe, William F; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L; Bailenson, Jeremy N

    2011-11-01

    Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones.

  11. Shared virtual environments for telerehabilitation.

    Science.gov (United States)

    Popescu, George V; Burdea, Grigore; Boian, Rares

    2002-01-01

    Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving patient therapist direct interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a videocamera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact hapticly with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring--the monitoring portal for hand telerehabilitation--was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal gives real-time performance on off-the-shelf desktop workstations.

  12. Haptic and Visual feedback in 3D Audio Mixing Interfaces

    DEFF Research Database (Denmark)

    Gelineck, Steven; Overholt, Daniel

    2015-01-01

    This paper describes the implementation and informal evaluation of a user interface that explores haptic feedback for 3D audio mixing. The implementation compares different approaches using either the LEAP Motion for mid-air hand gesture control, or the Novint Falcon for active haptic feedback...

  13. Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics

    Science.gov (United States)

    Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III

    2014-01-01

    The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…

  14. Haptic Cues for Balance: Use of a Cane Provides Immediate Body Stabilization

    Directory of Open Access Journals (Sweden)

    Stefania Sozzi

    2017-12-01

    Full Text Available Haptic cues are important for balance. Knowledge of the temporal features of their effect may be crucial for the design of neural prostheses. Touching a stable surface with a fingertip reduces body sway in standing subjects with eyes closed (EC), and removal of the haptic cue reinstates a large sway pattern. Changes in sway occur rapidly on changing haptic conditions. Here, we describe the effects and time course of stabilization produced by a haptic cue derived from a walking cane. We intended to confirm that cane use reduces body sway, to evaluate the effect of vision on stabilization by a cane, and to estimate the delay of the changes in body sway after addition and withdrawal of haptic input. Seventeen healthy young subjects stood in tandem position on a force platform, with eyes closed or open (EO). They gently lowered the cane onto and lifted it from a second force platform. Sixty trials per direction of haptic shift (Touch → NoTouch, T-NT; NoTouch → Touch, NT-T) and visual condition (EC-EO) were acquired. Traces of the Center of foot Pressure (CoP) and the force exerted by the cane were filtered, rectified, and averaged. The position in space of a reflective marker positioned on the cane tip was also acquired by an optoelectronic device. Cross-correlation (CC) analysis was performed between traces of cane-tip and CoP displacement. Latencies of changes in CoP oscillation in the frontal plane under EC following the T-NT and NT-T haptic shifts were statistically estimated. The CoP oscillations were larger in EC than EO under both T and NT (p < 0.001) and larger during NT than T conditions (p < 0.001). The haptic-induced effect under EC (Romberg quotient NT/T ~ 1.2) was less effective than that of vision under the NT condition (EC/EO ~ 1.5) (p < 0.001). With EO, the cane had little effect. Cane displacement lagged CoP displacement under both EC and EO. Latencies of changes in CoP oscillations were longer after addition (NT-T, about 1.6 s) than withdrawal (T-NT, about 0.9 s) of haptic
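
    The abstract reports a Romberg quotient of about 1.2 and a cane-tip trace that lags the CoP trace. A minimal sketch of how such quantities could be computed from recorded traces is shown below; the signal names, sampling rate, and synthetic data are illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np

def romberg_quotient(sway_no_touch, sway_touch):
    """Ratio of mean sway amplitude without vs. with the haptic cue (NT/T)."""
    return np.mean(sway_no_touch) / np.mean(sway_touch)

def lag_between(cane_tip, cop, fs):
    """Lag (s) at which the cane-tip trace best matches the CoP trace.

    A positive lag means the cane-tip displacement follows the CoP
    displacement, as reported in the abstract.
    """
    cane = cane_tip - np.mean(cane_tip)
    cop_c = cop - np.mean(cop)
    xcorr = np.correlate(cane, cop_c, mode="full")
    lags = np.arange(-len(cop_c) + 1, len(cane))
    return lags[np.argmax(xcorr)] / fs

# Example with synthetic 100 Hz traces (illustrative values only)
fs = 100.0
t = np.arange(0, 10, 1 / fs)
cop = np.sin(2 * np.pi * 0.4 * t)           # CoP oscillation in the frontal plane
cane = np.sin(2 * np.pi * 0.4 * (t - 0.3))  # cane tip lagging by 0.3 s
print(romberg_quotient(np.abs(cop) * 1.2, np.abs(cop)))  # ~1.2, as in the abstract
print(lag_between(cane, cop, fs))                        # roughly 0.3 s
```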

  15. Fusion interfaces for tactical environments: An application of virtual reality technology

    Science.gov (United States)

    Haas, Michael W.

    1994-01-01

    The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion interface concepts. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, localized auditory presentations, and haptic displays on the stick and rudder pedals, as well as executing weapons models, aerodynamic models, and threat models.

  16. Assignment about providing of substitute haptic interface for visually disabled persons

    OpenAIRE

    浅川, 貴史

    2013-01-01

    [Abstract] This paper describes open issues concerning a haptic interface. We propose a music baton system for visually disabled persons. The system consists of an acceleration sensor, a radio module, and a haptic interface device. We carried out an experiment comparing the visual and the haptic interface. The results identify two issues: the rise time of the motor and pre-motion. In the paper, we propose a new method of voltage cont...

  17. Design of a lightweight, cost effective thimble-like sensor for haptic applications based on contact force sensors.

    Science.gov (United States)

    Ferre, Manuel; Galiana, Ignacio; Aracil, Rafael

    2011-01-01

    This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation.

  18. Design of a Lightweight, Cost Effective Thimble-Like Sensor for Haptic Applications Based on Contact Force Sensors

    Directory of Open Access Journals (Sweden)

    Ignacio Galiana

    2011-12-01

    Full Text Available This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation.

  19. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    Science.gov (United States)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-06-01

    In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgical performance are then evaluated. In this work, tumor-cutting experiments are conducted under four different operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experimental realization shows that the performance index, which is a function of pixels, is different in the four operating conditions.
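
    The abstract states that the hue-saturation-value color space is used to separate the tumor from the organ and that the performance index is a function of pixels. The sketch below shows one generic way to do such a segmentation with OpenCV; the hue window and the saturation/value floors are assumptions, not the thresholds used in the paper.

```python
import cv2
import numpy as np

def segment_tumor(frame_bgr, hue_lo=35, hue_hi=85):
    """Isolate a colour-marked region (e.g., a mock tumor) in an endoscopic frame.

    The hue window (a green-ish band here) is purely illustrative; the paper
    only states that the HSV colour space is used, not the exact thresholds.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([hue_lo, 60, 60], dtype=np.uint8)
    upper = np.array([hue_hi, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    pixel_count = int(cv2.countNonZero(mask))  # a pixel-based performance index
    return mask, pixel_count
```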

  20. Haptic identification of objects and their depictions.

    Science.gov (United States)

    Klatzky, R L; Loomis, J M; Lederman, S J; Wake, H; Fujita, N

    1993-08-01

    Haptic identification of real objects is superior to that of raised two-dimensional (2-D) depictions. Three explanations of real-object superiority were investigated: contribution of material information, contribution of 3-D shape and size, and greater potential for integration across the fingers. In Experiment 1, subjects, while wearing gloves that gently attenuated material information, haptically identified real objects that provided reduced cues to compliance, mass, and part motion. The gloves permitted exploration with free hand movement, a single outstretched finger, or five outstretched fingers. Performance decreased over these three conditions but was superior to identification of pictures of the same objects in all cases, indicating the contribution of 3-D structure and integration across the fingers. Picture performance was also better with five fingers than with one. In Experiment 2, the subjects wore open-fingered gloves, which provided them with material information. Consequently, the effect of type of exploration was substantially reduced but not eliminated. Material compensates somewhat for limited access to object structure but is not the primary basis for haptic object identification.

  1. The mere exposure effect in the domain of haptics.

    Science.gov (United States)

    Jakesch, Martina; Carbon, Claus-Christian

    2012-01-01

    Zajonc showed that the attitude towards stimuli that one had been previously exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for highly complex stimuli with significantly increasing liking from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with high need for touch, which suggests different sensitivity or saturation levels of MEE. These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.

  2. The mere exposure effect in the domain of haptics.

    Directory of Open Access Journals (Sweden)

    Martina Jakesch

    Full Text Available Zajonc showed that the attitude towards stimuli that one had been previously exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for highly complex stimuli with significantly increasing liking from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with high need for touch, which suggests different sensitivity or saturation levels of MEE. These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.

  3. Dynamics modeling for parallel haptic interfaces with force sensing and control.

    Science.gov (United States)

    Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy

    2013-01-01

    Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications on force and impedance control for parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
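
    As a rough illustration of how five scalar rod-axis readings can be combined into a resultant force/torque at the manipulandum, the sketch below applies ordinary statics; the rod directions, attachment points, and function name are placeholders, and this is not the CU group's exact formulation or their "squaring down" procedure.

```python
import numpy as np

def assemble_wrench(rod_dirs, rod_points, rod_forces):
    """Combine scalar rod-axis force readings into a resultant force/torque.

    rod_dirs   : (5, 3) unit vectors along each actuating rod (assumed)
    rod_points : (5, 3) attachment points on the manipulandum, body frame (assumed)
    rod_forces : (5,)   scalar readings from the 1D force sensors
    """
    rod_dirs = np.asarray(rod_dirs, dtype=float)
    rod_points = np.asarray(rod_points, dtype=float)
    rod_forces = np.asarray(rod_forces, dtype=float)

    force = (rod_forces[:, None] * rod_dirs).sum(axis=0)
    torque = np.cross(rod_points, rod_forces[:, None] * rod_dirs).sum(axis=0)
    # Full 6-component wrench; a 5-DOF device would use a reduced set.
    return np.concatenate([force, torque])
```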

  4. Dynamic shared state maintenance in distributed virtual environments

    Science.gov (United States)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for
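
    The dissertation mentions an objective assessment based on quaternion calculus. A minimal sketch of one such metric, the angular discrepancy between the orientations of a shared entity at two sites, is given below; this is a generic formula and not necessarily the exact measure used in the work.

```python
import numpy as np

def quat_angle(q1, q2):
    """Angular discrepancy (radians) between two unit quaternions.

    Used here as a simple consistency metric between the orientation of a
    shared virtual entity at two remote sites.
    """
    q1 = np.asarray(q1, dtype=float); q1 /= np.linalg.norm(q1)
    q2 = np.asarray(q2, dtype=float); q2 /= np.linalg.norm(q2)
    dot = abs(np.dot(q1, q2))  # |cos(theta/2)|, invariant to quaternion sign
    return 2.0 * np.arccos(np.clip(dot, -1.0, 1.0))

def mean_drift(site_a_quats, site_b_quats):
    """Average orientation drift between two sites over a recorded session."""
    return float(np.mean([quat_angle(a, b) for a, b in zip(site_a_quats, site_b_quats)]))
```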

  5. Virtual MR endoscopy of the ventricles prior to neurosurgical interventional endoscopy - evaluation of different presentation techniques

    International Nuclear Information System (INIS)

    Lemke, A.J.; Schurig-Urbaniak, A.M.; Niehues, S.M.; Felix, R.; Liebig, T.

    2004-01-01

    Purpose: In the past, virtual endoscopies have been performed for planning of endoscopic interventions or for diagnostic purposes in various organ systems with increasing frequency. This study evaluates the ability of virtual ventricular endoscopy to depict anatomical structures and the use for planning of real endoscopy. Materials and Methods: In a prospective study, 4 volunteers and 8 patients were examined with MRI. In 3 of the patients endoscopy was performed by our neurosurgeons thereafter. The calculation of the virtual endoscopy was based on 1 mm sagittal T2-weighted images. Comparison of surface rendering and volume rendering was made by means of video sequencing of individual views, and these were compared with the intraoperative endoscopic videos concerning the depictability of anatomical landmarks. Results: The reconstructions using volume rendering were more significant and easier to calculate than those based on surface rendering. Virtual endoscopy in the transparent mode allowed visualization of hazardous structures outside the ventricular system such as the basilar artery tip. Transparent 3D images of the ventricles gave a good overview on the depicted structures and enabled a better orientation during the virtual camera flight than surface rendered views. Conclusion: MR-based virtual endoscopy of the ventricular system can be obtained on the basis of surface- and volume-rendered views of sagittal T2-weighted thin sections. Preoperative utilization of this method simplifies the planning of endoscopy by visualization of anatomical structures. (orig.)

  6. Virtual reality simulators for gastrointestinal endoscopy training.

    Science.gov (United States)

    Triantafyllou, Konstantinos; Lazaridis, Lazaros Dimitrios; Dimitriadis, George D

    2014-01-16

    The use of simulators as educational tools for medical procedures is spreading rapidly and many efforts have been made for their implementation in gastrointestinal endoscopy training. Endoscopy simulation training has been suggested for ascertaining patient safety while positively influencing the trainees' learning curve. Virtual simulators are the most promising tool among all available types of simulators. These integrated modalities offer a human-like endoscopy experience by combining virtual images of the gastrointestinal tract and haptic realism with using a customized endoscope. From their first steps in the 1980s until today, research involving virtual endoscopic simulators can be divided in two categories: investigation of the impact of virtual simulator training in acquiring endoscopy skills and measuring competence. Emphasis should also be given to the financial impact of their implementation in endoscopy, including the cost of these state-of-the-art simulators and the potential economic benefits from their usage. Advances in technology will contribute to the upgrade of existing models and the development of new ones; while further research should be carried out to discover new fields of application.

  7. Haptic feedback designs in teleoperation systems for minimal invasive surgery

    NARCIS (Netherlands)

    Font, I.; Weiland, S.; Franken, M.; Steinbuch, M.; Rovers, A.F.

    2004-01-01

    One of the major shortcomings of state-of-the-art robotic systems for minimal invasive surgery is the lack of haptic feedback for the surgeon. In order to provide haptic information, sensors and actuators have to be added to the master and slave device. A control system should process the data and

  8. Semi-Immersive Virtual Turbine Engine Simulation System

    Science.gov (United States)

    Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea

    2018-05-01

    The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. In this paper, we present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sounds, and ample and intuitive interaction with developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototype and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, as well as force feedback. The system is shown to be effective and efficient for validating the design of assembly, part design, and operations planning.

  9. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    International Nuclear Information System (INIS)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-01-01

    In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgical performance are then evaluated. In this work, tumor-cutting experiments are conducted under four different operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experimental realization shows that the performance index, which is a function of pixels, is different in the four operating conditions. (paper)

  10. A Feasibility Study with Image-Based Rendered Virtual Reality in Patients with Mild Cognitive Impairment and Dementia.

    Directory of Open Access Journals (Sweden)

    Valeria Manera

    Full Text Available Virtual Reality (VR) has emerged as a promising tool in many domains of therapy and rehabilitation, and has recently attracted the attention of researchers and clinicians working with elderly people with MCI, Alzheimer's disease and related disorders. Here we present a study testing the feasibility of using highly realistic image-based rendered VR with patients with MCI and dementia. We designed an attentional task to train selective and sustained attention, and we tested a VR and a paper version of this task in a single-session within-subjects design. Results showed that participants with MCI and dementia reported to be highly satisfied and interested in the task, and they reported high feelings of security, low discomfort, anxiety and fatigue. In addition, participants reported a preference for the VR condition compared to the paper condition, even if the task was more difficult. Interestingly, apathetic participants showed a preference for the VR condition stronger than that of non-apathetic participants. These findings suggest that VR-based training can be considered as an interesting tool to improve adherence to cognitive training in elderly people with cognitive impairment.

  11. Extending Virtual Reality simulation of ITER maintenance operations with dynamic effects

    International Nuclear Information System (INIS)

    Heemskerk, C.J.M.; Baar, M.R. de; Boessenkool, H.; Graafland, B.; Haye, M.J.; Koning, J.F.; Vahedi, M.; Visser, M.

    2011-01-01

    Virtual Reality (VR) simulation can be used to study, improve and verify ITER maintenance operations during preparation. VR can also improve the situational awareness of human operators during actual Remote Handling (RH) operations. Until now, VR systems use geometric models of the environment and the objects being handled and kinematic models of the manipulation systems. The addition of dynamic effects into the VR simulation was investigated. Important dynamic effects are forces due to contact transitions and the bending of beams under heavy loads. A novel dynamics simulation module was developed and introduced as an add-on to the VR4Robots VR software. Tests were performed under simplified test conditions and in the context of realistic ITER maintenance tasks on a benchmark product and on the ECRH Upper Port Launcher Plug (UPL). The introduction of dynamic effects into VR simulations was found to add realism and provide new insights in procedure development. The quality of the haptic feedback depends strongly on the haptic device used to 'display' haptic feedback to the operator. Dynamic effect simulation can also form the basis for real-time guidance support to operators during the execution of maintenance tasks (augmented reality).

  12. Upper limb assessment using a Virtual Peg Insertion Test.

    Science.gov (United States)

    Fluet, Marie-Christine; Lambercy, Olivier; Gassert, Roger

    2011-01-01

    This paper presents the initial evaluation of a Virtual Peg Insertion Test developed to assess sensorimotor functions of arm and hand using an instrumented tool, virtual reality and haptic feedback. Nine performance parameters derived from kinematic and kinetic data were selected and compared between two groups of healthy subjects performing the task with the dominant and non-dominant hand, as well as with a group of chronic stroke subjects suffering from different levels of upper limb impairment. Results showed significantly smaller grasping forces applied by the stroke subjects compared to the healthy subjects. The grasping force profiles suggest a poor coordination between position and grasping for the stroke subjects, and the collision forces with the virtual board were found to be indicative of sensory deficits. These preliminary results suggest that the analyzed parameters could be valid indicators of impairment. © 2011 IEEE

  13. Performance evaluation of a robot-assisted catheter operating system with haptic feedback.

    Science.gov (United States)

    Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi

    2018-06-20

    In this paper, a novel robot-assisted catheter operating system (RCOS) has been proposed as a method to reduce physical stress and X-ray exposure time to physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat and rotate) to an input catheter, which is placed at the master side to control another patient catheter placed at the slave side. For this purpose, a magnetorheological (MR) fluids-based master haptic interface has been developed to measure the axial and radial motions of an input catheter, as well as to provide the haptic feedback to the physician during the operation. In order to achieve a quick response of the haptic force in the master haptic interface, a Hall sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator is presented to deliver the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system can be measured by the force sensor unit designed into the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for precision positioning. The time lag from the sensed motion to the replicated motion is tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments are carried out. The results demonstrate that the proposed system can help decrease the contact forces between the catheter and the vasculature.

  14. Design of a New 4-DOF Haptic Master Featuring Magnetorheological Fluid

    Directory of Open Access Journals (Sweden)

    Byung-Keun Song

    2014-08-01

    Full Text Available This work presents a novel 4-degree-of-freedom (4-DOF) haptic master using magnetorheological (MR) fluid, which is applicable to a robot-assisted minimally invasive surgery (RMIS) system. By using MR fluid, the proposed haptic device can easily generate bidirectional repulsive torque along the directions of the required motions. The proposed master consists of two actuators: an MR bidirectional clutch associated with a planetary gear system and an MR clutch with a bevel gear system. After demonstrating the configuration, the torque models of the MR actuators are mathematically derived based on the field-dependent Bingham model. An optimal design that accounts for the spatial limitation and the desired torque constraint is then undertaken. An optimization procedure based on finite element analysis is proposed to determine optimal geometric dimensions. Based on the design procedure, the MR haptic master with the optimal parameters has been manufactured. In order to demonstrate the practical feasibility of the proposed haptic master, the field-dependent repulsive force generation is measured. In addition, a proportional-integral-derivative (PID) controller is empirically implemented to accomplish the desired torque trajectories. It has been shown that the proposed haptic master can track the desired torque trajectory without a significant error.
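
    The torque models in this record are derived from the field-dependent Bingham model. As a hedged illustration, the sketch below gives the classic Bingham-plastic torque of a simple disc-type MR clutch; the geometry, fluid parameters, and the disc-clutch assumption itself are placeholders and do not reproduce the paper's planetary/bevel-gear actuators.

```python
import numpy as np

def bingham_clutch_torque(omega, tau_y, eta=0.2, r_i=0.01, r_o=0.03, gap=0.001):
    """Transmissible torque of a single-disc MR clutch under the Bingham model.

    T = (2*pi/3) * tau_y * (r_o**3 - r_i**3)              # field-dependent term
      + (pi * eta / (2*gap)) * omega * (r_o**4 - r_i**4)  # viscous term

    tau_y is the field-dependent yield stress (Pa); all geometry and fluid
    values above are placeholders, not those of the paper's device.
    """
    field_term = (2.0 * np.pi / 3.0) * tau_y * (r_o**3 - r_i**3)
    viscous_term = (np.pi * eta / (2.0 * gap)) * omega * (r_o**4 - r_i**4)
    return field_term + viscous_term

# The repulsive torque grows with the commanded yield stress (i.e., the coil current).
for tau_y in (0.0, 10e3, 30e3):
    print(tau_y, bingham_clutch_torque(omega=1.0, tau_y=tau_y))
```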

  15. Operator dynamics for stability condition in haptic and teleoperation system: A survey.

    Science.gov (United States)

    Li, Hongbing; Zhang, Lei; Kawashima, Kenji

    2018-04-01

    Currently, haptic systems ignore the varying impedance of the human hand with its countless configurations and thus cannot recreate the complex haptic interactions. The literature does not reveal a comprehensive survey on the methods proposed and this study is an attempt to bridge this gap. The paper includes an extensive review of human arm impedance modeling and control deployed to address inherent stability and transparency issues in haptic interaction and teleoperation systems. Detailed classification and comparative study of various contributions in human arm modeling are presented and summarized in tables and diagrams. The main challenges in modeling human arm impedance for haptic robotic applications are identified. The possible future research directions are outlined based on the gaps identified in the survey. Copyright © 2018 John Wiley & Sons, Ltd.

  16. Development of Virtual Reality Cycling Simulator

    OpenAIRE

    Schramka, Filip; Arisona, Stefan; Joos, Michael; Erath, Alexander

    2017-01-01

    This paper presents a cycling simulator implemented using consumer virtual reality hardware and additional off-the-shelf sensors. Challenges like real-time motion tracking within the performance requirements of state-of-the-art virtual reality are successfully mastered. Retrieved data from digital motion processors is sent over Bluetooth to a render machine running Unity3D. By processing this data a bicycle is mapped into virtual space. Physically correct behaviour is simulated and high quali...

  17. Haptic shared control improves hot cell remote handling despite controller inaccuracies

    NARCIS (Netherlands)

    van Oosterhout, J.; Abbink, D. A.; Koning, J. F.; Boessenkool, H.; Wildenbeest, J. G. W.; Heemskerk, C. J. M.

    2013-01-01

    A promising solution to improve task performance in ITER hot cell remote handling is the use of haptic shared control. Haptic shared control can assist the human operator along a safe and optimal path with continuous guiding forces from an intelligent autonomous controller. Previous research tested

  18. Comparison of grasping movements made by healthy subjects in a 3-dimensional immersive virtual versus physical environment.

    Science.gov (United States)

    Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F

    2011-09-01

    Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify if movements made in such environments are kinematically similar to those made in physical environments (PEs) and the effect of provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent physical environment (PE). We also compared movements in PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1±8.8 years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6 mm, cylindrical grasp; screwdriver, diameter 31.6 mm, power grasp; pen, diameter 7.5 mm, precision grasp) in PE and visually similar virtual objects in VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in VE compared to PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object and apertures were wider for the power and precision grip tasks. Overall, the differences in spatial and temporal kinematics of movements between environments were greater than

  19. A remote instruction system empowered by tightly shared haptic sensation

    Science.gov (United States)

    Nishino, Hiroaki; Yamaguchi, Akira; Kagawa, Tsuneo; Utsumiya, Kouichi

    2007-09-01

    We present a system that realizes an on-line instruction environment among physically separated participants based on a multi-modal communication strategy. In addition to visual and acoustic information, the communication modalities commonly used in network environments, our system provides a haptic channel to intuitively convey the partner's sense of touch. The human touch sensation, however, is very sensitive to delays and jitter in networked virtual reality (NVR) systems. Therefore, a method to compensate for such negative factors needs to be provided. We show an NVR architecture that implements a basic framework that can be shared by various applications and effectively deals with these problems. We take a hybrid approach that achieves data consistency through a client-server model and scalability through a peer-to-peer model. As an application built on the proposed architecture, a remote instruction system for teaching handwritten characters and line patterns over a Korea-Japan high-speed research network is also described.

  20. Obstacle Crossing Differences Between Blind and Blindfolded Subjects After Haptic Exploration

    NARCIS (Netherlands)

    Forner-Cordero, A.; Garcia, V.D.; Rodrigues, S.T.; Duysens, J.

    2016-01-01

    Little is known about the ability of blind people to cross obstacles after they have explored haptically their size and position. Long-term absence of vision may affect spatial cognition in the blind while their extensive experience with the use of haptic information for guidance may lead to

  1. A prototype percutaneous transhepatic cholangiography training simulator with real-time breathing motion.

    Science.gov (United States)

    Villard, P F; Vidal, F P; Hunt, C; Bello, F; John, N W; Johnson, S; Gould, D A

    2009-11-01

    We present here a simulator for interventional radiology focusing on percutaneous transhepatic cholangiography (PTC). This procedure consists of inserting a needle into the biliary tree using fluoroscopy for guidance. The requirements of the simulator have been driven by a task analysis. Three main components have been identified: the respiration, the real-time X-ray display (fluoroscopy) and the haptic rendering (sense of touch). The framework for modelling the respiratory motion is based on kinematic laws and on the Chainmail algorithm. The fluoroscopic simulation is performed on the graphics card and makes use of the Beer-Lambert law to compute the X-ray attenuation. Finally, the haptic rendering is integrated into the virtual environment and takes into account the soft-tissue reaction force feedback and maintenance of the initial direction of the needle during the insertion. Five training scenarios have been created using patient-specific data. Each of these provides the user with variable breathing behaviour, a fluoroscopic display tuneable to any device parameters and needle force feedback. A detailed task analysis has been used to design and build the PTC simulator described in this paper. The simulator includes real-time respiratory motion with two independent parameters (rib kinematics and diaphragm action), on-line fluoroscopy implemented on the Graphics Processing Unit and haptic feedback to feel the soft-tissue behaviour of the organs during the needle insertion.
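
    The fluoroscopy simulation described above relies on the Beer-Lambert law, I = I0 * exp(-sum(mu_i * ds)). A minimal CPU sketch of that attenuation integral along one ray is shown below; the nearest-neighbour sampling and the function signature are assumptions, whereas the simulator in the paper evaluates this per pixel on the GPU.

```python
import numpy as np

def xray_intensity(mu_volume, entry, direction, n_steps=256, step=1.0, i0=1.0):
    """Attenuate a single X-ray through a voxel grid with the Beer-Lambert law.

    mu_volume holds linear attenuation coefficients (in 1/voxel units); the ray
    is sampled with nearest-neighbour lookups along `direction` from `entry`.
    """
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(entry, dtype=float)
    path_integral = 0.0
    for _ in range(n_steps):
        idx = np.round(pos).astype(int)
        if np.any(idx < 0) or np.any(idx >= mu_volume.shape):
            break  # ray has left the volume
        path_integral += mu_volume[tuple(idx)] * step
        pos = pos + direction * step
    return i0 * np.exp(-path_integral)
```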

  2. Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor

    Science.gov (United States)

    Ponce Wong, Ruben D.; Hellman, Randall B.; Santos, Veronica J.

    2014-06-01

    Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by "haptic intelligence" that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic "exploratory procedures" on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
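
    The abstract describes a two-stage pipeline: a support vector classifier estimates the order of curvature, and curvature-specific support vector regression models then estimate footprint dimension. The sketch below mirrors that structure with scikit-learn; the feature matrix and labels are random placeholders, not the paper's tactile features.

```python
import numpy as np
from sklearn.svm import SVC, SVR

# Placeholder feature matrix: rows = exploratory strokes, columns = parameters
# extracted from the tactile signals (the paper's exact features are not used here).
X = np.random.rand(60, 8)
curvature_order = np.random.randint(0, 3, size=60)   # planar / conical / spherical
footprint_mm = np.random.uniform(1.25, 20.0, size=60)

# Stage 1: classify order of curvature from the stroke features.
clf = SVC(kernel="rbf", C=10.0).fit(X, curvature_order)

# Stage 2: one curvature-specific regressor per class estimates footprint size.
regressors = {
    c: SVR(kernel="rbf", C=10.0).fit(X[curvature_order == c],
                                     footprint_mm[curvature_order == c])
    for c in np.unique(curvature_order)
}

x_new = np.random.rand(1, 8)
c_hat = int(clf.predict(x_new)[0])
size_hat = float(regressors[c_hat].predict(x_new)[0])
print(c_hat, size_hat)
```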

  3. HAPTIC LOCATION IN PSEUDOPHAKIC EYES AND NONINFECTIOUS POSTOPERATIVE INFLAMMATION- A PROSPECTIVE STUDY

    Directory of Open Access Journals (Sweden)

    Vinod Kumar Baranwal

    2017-01-01

    Full Text Available BACKGROUND Postoperative noninfectious inflammation after cataract surgery, which can be persistent, remains an undesirable consequence despite many advances in surgical techniques. This ocular inflammation after cataract surgery presents ophthalmologists with a treatment dilemma. The aim of the study was to evaluate and correlate the IOL haptic location and the presence of noninfectious postoperative inflammation in pseudophakic eyes using Ultrasound Biomicroscopy (UBM). MATERIALS AND METHODS In this prospective study, 80 eyes of 80 cataract patients underwent SICS with 6 mm optic non-foldable PCIOL implantation. After surgery, an examination protocol was followed wherein the patients were assessed by slit-lamp examination on days 1, 2, 7, 14 and 30 for flare and cells. A UBM examination was performed on day 30 for locating the IOL haptic position. Finally, the postoperative inflammation was correlated with the IOL haptic position. RESULTS The results showed that an IOL haptic position outside the capsular bag significantly increased the amount and duration of postoperative inflammation. CONCLUSION Haptic position outside the bag increases the incidence and duration of postoperative inflammation significantly. In patients undergoing SICS, the aim should be a large continuous curvilinear capsulorhexis with in-the-bag implantation of the IOL. A UBM examination on day 30 after surgery to determine whether the IOL haptics lie outside the bag will be helpful in decreasing the operating surgeon's apprehension and in suggesting a prolonged need for steroids in cases with more than expected postoperative inflammation.

  4. Torque Measurement of 3-DOF Haptic Master Operated by Controllable Electrorheological Fluid

    Directory of Open Access Journals (Sweden)

    Oh Jong-Seok

    2015-02-01

    Full Text Available This work presents a torque measurement method for a 3-degree-of-freedom (3-DOF) haptic master featuring controllable electrorheological (ER) fluid. In order to reflect the sense of an organ for a surgeon, the ER haptic master, which can generate the repulsive torque of an organ, is utilized as a remote controller for a surgery robot. Since accurate representation of organ feeling is essential for the success of robot-assisted surgery, it is indispensable to develop a proper torque measurement method for the 3-DOF ER haptic master. After describing the structural configuration of the haptic master, the torque models of the ER spherical joint are mathematically derived based on the Bingham model of ER fluid. A new type of haptic device which has pitching, rolling, and yawing motions is then designed and manufactured using a spherical joint mechanism. Subsequently, the field-dependent parameters of the Bingham model are identified, and the repulsive torque generated according to the applied electric field is measured. In addition, in order to verify the effectiveness of the proposed torque model, a comparison between simulated and measured torques is undertaken.

  5. The workload implications of haptic displays in multi-display environments such as the cockpit: Dual-task interference of within-sense haptic inputs (tactile/proprioceptive) and between-sense inputs (tactile/proprioceptive/auditory/visual)

    OpenAIRE

    Castle, H

    2007-01-01

    Visual workload demand within the cockpit is reaching saturation, whereas the haptic sense (proprioceptive and tactile sensation) is relatively untapped, despite studies suggesting the benefits of haptic displays. MRT suggests that inputs from haptic displays will not interfere with inputs from visual or auditory displays. MRT is based on the premise that multisensory integration occurs only after unisensory processing. However, recent neuroscientific findings suggest that t...

  6. Six Degree-of-Freedom Haptic Simulation of a Stringed Musical Instrument for Triggering Sounds.

    Science.gov (United States)

    Dangxiao Wang; Xiaohan Zhao; Youjiao Shi; Yuru Zhang; Jing Xiao

    2017-01-01

    Six degree-of-freedom (DoF) haptic rendering of multi-region contacts between a moving hand avatar and variously shaped components of a musical instrument is fundamental to realizing interactive simulation of music playing. There are two aspects of computational challenges: first, some components have significantly small sizes in some dimensions, such as the strings on a seven-string plucked instrument (e.g., Guqin), which makes it challenging to avoid pop-through during multi-region contact scenarios. Second, deformable strings may produce high-frequency vibration, which requires simulating diversified and subtle force sensations when a hand interacts with strings in different ways. In this paper, we propose a constraint-based approach to haptic interaction and simulation between a moving hand avatar and various parts of a string instrument, using a cylinder model for the string that has a large length-radius ratio and a sphere-tree model for the other parts that have complex shapes. Collision response algorithms based on configuration-based optimization are adapted to solve for the contact configuration of the hand avatar interacting with thin strings without penetration. To simulate the deformation and vibration of a string, a cylindrical volume with variable diameters is defined in response to the interaction force applied by the operator. Experimental results have validated the stability and efficiency of the proposed approach. Subtle force feelings can be simulated to reflect varied interaction patterns, to differentiate collisions between the hand avatar and a static or vibrating string, and the effects of various colliding forces and touch locations on the strings.
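
    The rendering problem above hinges on contact with a very thin cylinder (the string). The sketch below shows only the per-contact geometry for a single point probe against such a cylinder, with a simple penalty force; the paper's actual method solves a constraint-based, configuration-space optimization for the whole hand avatar, which this does not reproduce.

```python
import numpy as np

def string_contact_force(probe, p0, p1, radius, stiffness=500.0):
    """Penalty-style contact between a point probe and a thin cylindrical string.

    The string is the segment p0-p1 swept by `radius`; the stiffness value is
    an arbitrary placeholder.
    """
    p0, p1, probe = (np.asarray(v, dtype=float) for v in (p0, p1, probe))
    axis = p1 - p0
    t = np.clip(np.dot(probe - p0, axis) / np.dot(axis, axis), 0.0, 1.0)
    closest = p0 + t * axis                  # nearest point on the string axis
    offset = probe - closest
    dist = np.linalg.norm(offset)
    if dist >= radius or dist == 0.0:
        return np.zeros(3)                   # no contact
    normal = offset / dist
    penetration = radius - dist
    return stiffness * penetration * normal  # force pushing the probe outward

print(string_contact_force(probe=[0.0, 0.0005, 0.0],
                           p0=[-0.3, 0.0, 0.0], p1=[0.3, 0.0, 0.0], radius=0.001))
```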

  7. Development of a Virtual Reality Simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES) Cholecystectomy Procedure.

    Science.gov (United States)

    Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Lee, Jason; Li, Baichun; Pan, Junjun; Sankaranarayanan, Ganesh; Roberts, Kurt; De, Suvranu

    2014-01-01

    The first virtual-reality-based simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES), called the Virtual Translumenal Endoscopic Surgery Trainer (VTEST™), has been developed. VTEST™ aims to simulate a hybrid NOTES cholecystectomy procedure using a rigid scope inserted through the vaginal port. The hardware interface is designed for accurate motion tracking of the scope and laparoscopic instruments to reproduce the unique hand-eye coordination. The haptic-enabled multimodal interactive simulation includes exposing Calot's triangle and detaching the gallbladder while performing electrosurgery. The developed VTEST™ was demonstrated and validated at NOSCAR 2013.

  8. Development of visuo-haptic transfer for object recognition in typical preschool and school-aged children.

    Science.gov (United States)

    Purpura, Giulia; Cioni, Giovanni; Tinelli, Francesca

    2018-07-01

    Object recognition is a long and complex adaptive process and its full maturation requires combination of many different sensory experiences as well as cognitive abilities to manipulate previous experiences in order to develop new percepts and subsequently to learn from the environment. It is well recognized that the transfer of visual and haptic information facilitates object recognition in adults, but less is known about development of this ability. In this study, we explored the developmental course of object recognition capacity in children using unimodal visual information, unimodal haptic information, and visuo-haptic information transfer in children from 4 years to 10 years and 11 months of age. Participants were tested through a clinical protocol, involving visual exploration of black-and-white photographs of common objects, haptic exploration of real objects, and visuo-haptic transfer of these two types of information. Results show an age-dependent development of object recognition abilities for visual, haptic, and visuo-haptic modalities. A significant effect of time on development of unimodal and crossmodal recognition skills was found. Moreover, our data suggest that multisensory processes for common object recognition are active at 4 years of age. They facilitate recognition of common objects, and, although not fully mature, are significant in adaptive behavior from the first years of age. The study of typical development of visuo-haptic processes in childhood is a starting point for future studies regarding object recognition in impaired populations.

  9. Haptic and Audio Interaction Design

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 5th International Workshop on Haptic and Audio Interaction Design, HAID 2010 held in Copenhagen, Denmark, in September 2010. The 21 revised full papers presented were carefully reviewed and selected for inclusion in the book. The papers are or...

  10. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    Science.gov (United States)

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.

  11. Experimental evaluation of magnified haptic feedback for robot-assisted needle insertion and palpation.

    Science.gov (United States)

    Meli, Leonardo; Pacchierotti, Claudio; Prattichizzo, Domenico

    2017-12-01

    Haptic feedback has been proven to play a key role in enhancing the performance of teleoperated medical procedures. However, due to safety issues, commercially available medical robots do not currently provide the clinician with haptic feedback. This work presents the experimental evaluation of a teleoperation system for robot-assisted medical procedures able to provide magnified haptic feedback to the clinician. Forces registered at the operating table are magnified and provided to the clinician through a 7-DoF haptic interface. The same interface is also used to control the motion of a 6-DoF slave robotic manipulator. The safety of the system is guaranteed by a time-domain passivity-based control algorithm. Two experiments were carried out on stiffness discrimination (during palpation and needle insertion) and one experiment on needle guidance. Our haptic-enabled teleoperation system improved performance with respect to direct hand interaction by 80%, 306%, and 27% in stiffness discrimination through palpation, stiffness discrimination during needle insertion, and guidance, respectively. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Haptic Cues Used for Outdoor Wayfinding by Individuals with Visual Impairments

    Science.gov (United States)

    Koutsoklenis, Athanasios; Papadopoulos, Konstantinos

    2014-01-01

    Introduction: The study presented here examines which haptic cues individuals with visual impairments use more frequently and determines which of these cues are deemed by these individuals to be the most important for way-finding in urban environments. It also investigates the ways in which these haptic cues are used by individuals with visual…

  13. Force control tasks with pure haptic feedback promote short-term focused attention.

    Science.gov (United States)

    Wang, Dangxiao; Zhang, Yuru; Yang, Xiaoxiao; Yang, Gaofeng; Yang, Yi

    2014-01-01

    Focused attention has great impact on our quality of life. Our learning, social skills and even happiness are closely intertwined with our capacity for focused attention. Attention promotion is replete with examples of training-induced increases in attention capability, most of which rely on visual and auditory stimulation. Pure haptic stimulation to increase attention capability is rarely found. We show that accurate force control tasks with pure haptic feedback enhance short-term focused attention. Participants were trained by a force control task in which information from visual and auditory channels was blocked, and only haptic feedback was provided. The trainees were asked to exert a target force within a pre-defined force tolerance for a specific duration. The tolerance was adaptively modified to different levels of difficulty to elicit full participant engagement. Three attention tests showed significant changes in different aspects of focused attention in participants who had been trained as compared with those who had not, thereby illustrating the role of haptic-based sensory-motor tasks in the promotion of short-term focused attention. The findings highlight the potential value of haptic stimuli in brain plasticity and serve as a new tool to extend existing computer games for cognitive enhancement.
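
    The training task above holds a target force within a tolerance band that is adaptively modified. One plausible (but assumed) adaptation rule is a simple staircase, sketched below; the step sizes and success criterion are illustrative, since the abstract does not specify them.

```python
def update_tolerance(tolerance, succeeded, shrink=0.9, grow=1.2,
                     min_tol=0.05, max_tol=2.0):
    """One-up/one-down staircase for the force-band width (newtons, assumed units)."""
    tolerance *= shrink if succeeded else grow
    return max(min_tol, min(max_tol, tolerance))

def trial_succeeded(force_samples, target, tolerance):
    """A trial succeeds when the exerted force stays inside the band throughout."""
    return all(abs(f - target) <= tolerance for f in force_samples)
```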

  14. Skin-Inspired Haptic Memory Arrays with an Electrically Reconfigurable Architecture.

    Science.gov (United States)

    Zhu, Bowen; Wang, Hong; Liu, Yaqing; Qi, Dianpeng; Liu, Zhiyuan; Wang, Hua; Yu, Jiancan; Sherburne, Matthew; Wang, Zhaohui; Chen, Xiaodong

    2016-02-24

    Skin-inspired haptic-memory devices, which can retain pressure information after the removal of external pressure by virtue of the nonvolatile nature of the memory devices, are achieved. The rise of haptic-memory devices will allow for mimicry of human sensory memory, opening new avenues for the design of next-generation high-performance sensing devices and systems. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Interactive navigation and bronchial tube tracking in virtual bronchoscopy.

    Science.gov (United States)

    Heng, P A; Fung, P F; Wong, T T; Siu, Y H; Sun, H

    1999-01-01

    An interactive virtual environment for the simulation of bronchoscopy is developed. Medical doctors can safely plan their surgical bronchoscopy using the virtual environment without any invasive diagnosis that may put the patient's health at risk. The 3D pen input device of the system allows the doctor to navigate and visualize the patient's bronchial tree naturally and interactively. To navigate the patient's bronchial tree, a vessel tracking process is required. While manual tracking is tedious and labor-intensive, fully automatic tracking may not be reliable. We propose a semi-automatic tracking technique, called the Intelligent Path Tracker, which provides automation and sufficient user control during vessel tracking. To support an interactive frame rate, we also introduce a new volume rendering acceleration technique, named IsoRegion Leaping. The volume rendering is further accelerated by distributed rendering on a TCP/IP-based network of low-cost PCs. With these approaches, a 256 x 256 x 256 volume data set of a human lung can be navigated and visualized at a frame rate of over 10 Hz in our virtual bronchoscopy system.

  16. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    Science.gov (United States)

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  17. Haptic biofeedback for improving compliance with lower-extremity partial weight bearing.

    Science.gov (United States)

    Fu, Michael C; DeLuke, Levi; Buerba, Rafael A; Fan, Richard E; Zheng, Ying Jean; Leslie, Michael P; Baumgaertner, Michael R; Grauer, Jonathan N

    2014-11-01

    After lower-extremity orthopedic trauma and surgery, patients are often advised to restrict weight bearing on the affected limb. Conventional training methods are not effective at enabling patients to comply with recommendations for partial weight bearing. The current study assessed a novel method of using real-time haptic (vibratory/vibrotactile) biofeedback to improve compliance with instructions for partial weight bearing. Thirty healthy, asymptomatic participants were randomized into 1 of 3 groups: verbal instruction, bathroom scale training, and haptic biofeedback. Participants were instructed to restrict lower-extremity weight bearing in a walking boot with crutches to 25 lb, with an acceptable range of 15 to 35 lb. A custom weight bearing sensor and biofeedback system was attached to all participants, but only those in the haptic biofeedback group were given a vibrotactile signal if they exceeded the acceptable range. Weight bearing in all groups was measured with a separate validated commercial system. The verbal instruction group bore an average of 60.3±30.5 lb (mean±standard deviation). The bathroom scale group averaged 43.8±17.2 lb, whereas the haptic biofeedback group averaged 22.4±9.1 lb. As a percentage of body weight, the haptic biofeedback group averaged 14.5±6.3%. In this study of haptic biofeedback to improve compliance with lower-extremity partial weight bearing, haptic biofeedback was superior to conventional physical therapy methods. Further studies in patients with clinical orthopedic trauma are warranted. Copyright 2014, SLACK Incorporated.
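
    The biofeedback rule described above reduces to a threshold check on the weight-bearing sensor. The sketch below uses hypothetical read_load() and vibrate() interfaces for the sensor and vibrotactile actuator; only the 15-35 lb window is taken from the study, and whether the cue fires above the upper limit only or anywhere outside the window is an assumption.

    ```python
    import time

    # Sketch of the weight-bearing biofeedback loop described above. The sensor
    # and actuator interfaces (read_load, vibrate) are hypothetical, and the
    # exact trigger condition is an assumption.

    LOWER_LB, UPPER_LB = 15.0, 35.0        # prescribed partial weight-bearing range

    def biofeedback_loop(read_load, vibrate, period_s=0.05):
        """read_load() returns the limb load in pounds; vibrate(on) drives the motor."""
        while True:
            load = read_load()
            out_of_range = load < LOWER_LB or load > UPPER_LB
            vibrate(out_of_range)          # vibrotactile cue only while non-compliant
            time.sleep(period_s)
    ```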

  18. Virtual endoscopy in neurosurgery: a review.

    Science.gov (United States)

    Neubauer, André; Wolfsberger, Stefan

    2013-01-01

    Virtual endoscopy is the computerized creation of images depicting the inside of patient anatomy reconstructed in a virtual reality environment. It permits interactive, noninvasive, 3-dimensional visual inspection of anatomical cavities or vessels. This can aid in diagnostics, potentially replacing an actual endoscopic procedure, and help in the preparation of a surgical intervention by bridging the gap between plain 2-dimensional radiologic images and the 3-dimensional depiction of anatomy during actual endoscopy. If not only the endoscopic vision but also endoscopic handling, including realistic haptic feedback, is simulated, virtual endoscopy can be an effective training tool for novice surgeons. In neurosurgery, the main fields of application of virtual endoscopy are third ventriculostomy, endonasal surgery, and the evaluation of pathologies in cerebral blood vessels. Progress in this very active field of research is achieved through cooperation between the technical and the medical communities. While the technology advances and new methods for modeling, reconstruction, and simulation are being developed, clinicians evaluate existing simulators, steer the development of new ones, and explore new fields of application. This review introduces some of the most interesting virtual reality systems for endoscopic neurosurgery developed in recent years and presents clinical studies conducted either on areas of application or on specific systems. In addition, the benefits and limitations of individual products and of simulated neuroendoscopy in general are pointed out.

  19. Topographic modelling of haptic properties of tissue products

    International Nuclear Information System (INIS)

    Rosen, B-G; Fall, A; Farbrot, A; Bergström, P; Rosen, S

    2014-01-01

    The way a product or material feels when touched (haptics) has been shown to play an important role when consumers judge the quality of products. For tissue products in constant contact with the skin, "softness" becomes a primary quality parameter. In the present work, the relationship between topography and the feel of the surface was investigated for commercial tissues with varying degrees of texture, from low-textured crepe tissue to highly textured embossed and air-dried tissue products. A trained sensory panel was used to grade perceived haptic "roughness". The topography was characterized with a digital light projection (DLP) technique. Using multivariate statistics, strong correlations between perceived roughness and topography were found, with a predictability above 90 percent even though highly textured products were included. Characterization was made using areal ISO 25178-2 topography parameters in combination with non-contacting topography measurement. The best prediction ability was obtained when combining the haptic properties with the topography parameters auto-correlation length (Sal), peak material volume (Vmp), core roughness depth (Sk), and the maximum height of the surface (Sz).

  20. Lean on Wii: physical rehabilitation with virtual reality Wii peripherals.

    Science.gov (United States)

    Anderson, Fraser; Annett, Michelle; Bischof, Walter F

    2010-01-01

    In recent years, a growing number of occupational therapists have integrated video game technologies, such as the Nintendo Wii, into rehabilitation programs. 'Wiihabilitation', or the use of the Wii in rehabilitation, has been successful in increasing patients' motivation and encouraging full body movement. The non-rehabilitative focus of Wii applications, however, presents a number of problems: games are too difficult for patients, they mainly target upper-body gross motor functions, and they lack support for task customization, grading, and quantitative measurements. To overcome these problems, we have designed a low-cost, virtual-reality based system. Our system, Virtual Wiihab, records performance and behavioral measurements, allows for activity customization, and uses auditory, visual, and haptic elements to provide extrinsic feedback and motivation to patients.

  1. 3D virtual rendering in thoracoscopic treatment of congenital malformation of the lung

    Directory of Open Access Journals (Sweden)

    Destro F.

    2013-10-01

    Full Text Available Introduction: Congenital malformations of the lung (CML) are rare but potentially dangerous. Their identification is important in order to define the most appropriate management. Materials and methods: We retrospectively reviewed data from 37 patients affected by CML treated in our Pediatric Surgery Unit in the last four years with minimally invasive surgery (MIS). Results: Prenatal diagnosis was possible in 26/37 patients. Surgery was performed in the first month of life in 3 symptomatic patients and between 6 and 12 months in the others. All patients underwent radiological evaluation prior to thoracoscopic surgery. The images collected were reconstructed using the VR render software. Discussion and conclusions: Volume rendering gives high anatomical resolution and can be useful to guide the surgical procedure. Thoracoscopy should be the technique of choice because it is safe, effective, and feasible. Furthermore, it has the benefit of a minimal access technique and can be easily performed in children.

  2. Command Recognition of Robot with Low Dimension Whole-Body Haptic Sensor

    Science.gov (United States)

    Ito, Tatsuya; Tsuji, Toshiaki

    The authors have developed “haptic armor”, a whole-body haptic sensor with the ability to estimate contact position. Although it was developed for the safety assurance of robots in human environments, it can also be used as an interface. This paper proposes a command recognition method based on finger trace information. It also discusses some technical issues for improving the recognition accuracy of the system.

  3. Use of VR Technology and Passive Haptics for MANPADS Training System

    Science.gov (United States)

    2017-09-01

    The system must reach satisfactory technical performance, such as latency and frame rate, while generating the sensory stimuli needed for this type of training (visual, …). Approved for public release; distribution is unlimited. Use of VR Technology and Passive Haptics for MANPADS Training System, by Faisal Rashid, September 2017.

  4. Design of a 7-DOF haptic master using a magneto-rheological devices for robot surgery

    Science.gov (United States)

    Kang, Seok-Rae; Choi, Seung-Bok; Hwang, Yong-Hoon; Cha, Seung-Woo

    2017-04-01

    This paper presents a 7 degrees-of-freedom (7-DOF) haptic master which is applicable to robot-assisted minimally invasive surgery (RMIS). By utilizing a controllable magneto-rheological (MR) fluid, the haptic master can provide force information to the surgeon during surgery. The proposed haptic master provides three translational motions (X, Y, Z) and four further motions (pitch, yaw, roll, and grasping), all with force feedback capability. The proposed haptic master can generate repulsive forces or torques by activating an MR clutch and an MR brake. Both the MR clutch and the MR brake are designed and manufactured with consideration of the size and output torque required for robotic surgery. A proportional-integral-derivative (PID) controller is then designed and implemented to achieve torque/force tracking trajectories. It is verified that the proposed haptic master can accurately track the desired torque and force arising at the surgical site by controlling the input current applied to the MR clutch and brake.
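
    Torque tracking of this kind is typically closed around the coil current of the MR clutch and brake. The sketch below is a generic discrete PID loop, not the controller reported in the paper; the gains, current limits, and sensor call are assumptions.

    ```python
    # Generic discrete-time PID loop for torque tracking (illustrative gains and
    # interfaces; not the controller reported in the paper). The control output
    # is the coil current applied to the MR clutch/brake.

    class PID:
        def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=2.0):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.out_min, self.out_max = out_min, out_max   # coil current limits [A]
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, desired_torque, measured_torque):
            error = desired_torque - measured_torque
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            u = self.kp * error + self.ki * self.integral + self.kd * derivative
            return min(self.out_max, max(self.out_min, u))   # clamp the command

    # Example: track a 0.5 N*m repulsive torque at 1 kHz.
    pid = PID(kp=1.2, ki=8.0, kd=0.01, dt=0.001)
    # current_cmd = pid.update(0.5, read_torque_sensor())   # hypothetical sensor call
    ```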

  5. Virtual Sensors for Advanced Controllers in Rehabilitation Robotics.

    Science.gov (United States)

    Mancisidor, Aitziber; Zubizarreta, Asier; Cabanes, Itziar; Portillo, Eva; Jung, Je Hyung

    2018-03-05

    In order to properly control rehabilitation robotic devices, the measurement of interaction force and motion between patient and robot is an essential part. Usually, however, this is a complex task that requires the use of accurate sensors which increase the cost and the complexity of the robotic device. In this work, we address the development of virtual sensors that can be used as an alternative to actual force and motion sensors for the Universal Haptic Pantograph (UHP) rehabilitation robot for upper limb training. These virtual sensors estimate the force and motion at the contact point where the patient interacts with the robot, using the mathematical model of the robotic device and measurements from low-cost position sensors. To demonstrate the performance of the proposed virtual sensors, they have been implemented in an advanced position/force controller of the UHP rehabilitation robot and experimentally evaluated. The experimental results reveal that the controller based on the virtual sensors has similar performance to the one using direct measurement (less than 0.005 m and 1.5 N difference in mean error). Hence, the developed virtual sensors for estimating interaction force and motion can be adopted to replace actual precise but normally high-priced sensors, which are fundamental components for advanced control of rehabilitation robotic devices.

  6. Virtual Sensors for Advanced Controllers in Rehabilitation Robotics

    Directory of Open Access Journals (Sweden)

    Aitziber Mancisidor

    2018-03-01

    Full Text Available In order to properly control rehabilitation robotic devices, the measurement of interaction force and motion between patient and robot is an essential part. Usually, however, this is a complex task that requires the use of accurate sensors which increase the cost and the complexity of the robotic device. In this work, we address the development of virtual sensors that can be used as an alternative to actual force and motion sensors for the Universal Haptic Pantograph (UHP) rehabilitation robot for upper limb training. These virtual sensors estimate the force and motion at the contact point where the patient interacts with the robot, using the mathematical model of the robotic device and measurements from low-cost position sensors. To demonstrate the performance of the proposed virtual sensors, they have been implemented in an advanced position/force controller of the UHP rehabilitation robot and experimentally evaluated. The experimental results reveal that the controller based on the virtual sensors has similar performance to the one using direct measurement (less than 0.005 m and 1.5 N difference in mean error). Hence, the developed virtual sensors for estimating interaction force and motion can be adopted to replace actual precise but normally high-priced sensors, which are fundamental components for advanced control of rehabilitation robotic devices.

  7. Immediate Memory for Haptically-Examined Braille Symbols by Blind and Sighted Subjects.

    Science.gov (United States)

    Newman, Slater E.; And Others

    The paper reports on two experiments in Braille learning which compared blind and sighted subjects on the immediate recall of haptically-examined Braille symbols. In the first study, sighted subjects (N=64) haptically examined each of a set of Braille symbols with their preferred or nonpreferred hand and immediately recalled the symbol by drawing…

  8. Differential effects of non-informative vision and visual interference on haptic spatial processing

    NARCIS (Netherlands)

    Volcic, Robert; Van Rheede, Joram J.; Postma, Albert; Kappers, Astrid M L

    The primary purpose of this study was to examine the effects of non-informative vision and visual interference upon haptic spatial processing, which supposedly derives from an interaction between an allocentric and egocentric reference frame. To this end, a haptic parallelity task served as baseline

  9. Improved Haptic Linear Lines for Better Movement Accuracy in Upper Limb Rehabilitation

    Directory of Open Access Journals (Sweden)

    Joan De Boeck

    2012-01-01

    Full Text Available Force feedback has proven to be beneficial in the domain of robot-assisted rehabilitation. According to the patients' personal needs, the generated forces may either be used to assist, support, or oppose their movements. In our current research project, which focuses on upper limb training for MS (multiple sclerosis) and CVA (cerebrovascular accident) patients, a basic building block with which many rehabilitation exercises can be implemented was identified. This building block is a haptic linear path: a second-order continuous path, defined by a list of points in space. Earlier, different attempts to realize haptic linear paths have been investigated. In order to ensure a good training quality, it is important that the haptic simulation is continuous up to the second derivative while the patient is compelled to follow the path tightly, even when low or no guiding forces are provided. In this paper, we describe our best solution to these haptic linear paths, discuss the weaknesses found in practice, and propose and validate an improvement.
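
    Path guidance of this kind can be approximated by projecting the device position onto the path and pulling it back with a capped spring-damper force. The sketch below uses a polyline approximation of the path for brevity; the gains, force cap, and polyline simplification are assumptions, and the published approach keeps the path continuous up to the second derivative rather than using straight segments.

    ```python
    import numpy as np

    # Sketch of a path-guidance force: project the device position onto a
    # (polyline-approximated) path and pull it back with a capped spring-damper.
    # Gains and the polyline simplification are assumptions.

    def closest_point_on_segment(p, a, b):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        return a + t * ab

    def guidance_force(pos, vel, path_pts, k=200.0, d=5.0, f_max=4.0):
        """pos, vel: device position [m] and velocity [m/s]; path_pts: Nx3 array."""
        # Find the nearest point over all segments of the path.
        candidates = [closest_point_on_segment(pos, path_pts[i], path_pts[i + 1])
                      for i in range(len(path_pts) - 1)]
        target = min(candidates, key=lambda c: np.linalg.norm(pos - c))
        force = k * (target - pos) - d * vel        # spring-damper toward the path
        norm = np.linalg.norm(force)
        return force if norm <= f_max else force * (f_max / norm)   # safety cap
    ```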

  10. PROJECT HEAVEN: Preoperative Training in Virtual Reality.

    Science.gov (United States)

    Iamsakul, Kiratipath; Pavlovcik, Alexander V; Calderon, Jesus I; Sanderson, Lance M

    2017-01-01

    A cephalosomatic anastomosis (CSA; also called HEAVEN: head anastomosis venture) has been proposed as an option for patients with neurological impairments, such as spinal cord injury (SCI), and terminal medical illnesses, for which medicine is currently powerless. Protocols to prepare a patient for life after CSA do not currently exist. However, methods used in conventional neurorehabilitation can be used as a reference for developing preparatory training. Studies on virtual reality (VR) technologies have documented VR's ability to enhance rehabilitation and improve the quality of recovery in patients with neurological disabilities. VR-augmented rehabilitation resulted in increased motivation towards performing functional training and improved the biopsychosocial state of patients. In addition, VR experiences coupled with haptic feedback promote neuroplasticity, resulting in the recovery of motor functions in neurologically-impaired individuals. To prepare the recipient psychologically for life after CSA, the development of VR experiences paired with haptic feedback is proposed. This proposal aims to innovate techniques in conventional neurorehabilitation to implement preoperative psychological training for the recipient of HEAVEN. The recipient's familiarity with body movements will prevent unexpected psychological reactions from occurring after the HEAVEN procedure.

  11. Study of Electric Music Baton using Haptic Interface for Assistance of Visually Disabled Persons

    OpenAIRE

    浅川, 貴史

    2012-01-01

    [Abstract] We propose a music baton system for visually disabled persons. The system consists of an acceleration sensor, a radio module, and a haptic interface device. When a conductor moves the baton, players are able to perceive the action through the haptic interface device. We carried out an experiment comparing the visual and the haptic interface. The results showed that a pre-motion is important for the visual interface. In the paper, we make a proposal fo...

  12. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    OpenAIRE

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-01-01

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force an...

  13. Enhanced visuo-haptic integration for the non-dominant hand.

    Science.gov (United States)

    Yalachkov, Yavor; Kaiser, Jochen; Doehrmann, Oliver; Naumer, Marcus J

    2015-07-21

    Visuo-haptic integration contributes essentially to object shape recognition. Although there has been a considerable advance in elucidating the neural underpinnings of multisensory perception, it is still unclear whether seeing an object and exploring it with the dominant hand elicits the same brain response as compared to the non-dominant hand. Using fMRI to measure brain activation in right-handed participants, we found that for both left- and right-hand stimulation the left lateral occipital complex (LOC) and anterior cerebellum (aCER) were involved in visuo-haptic integration of familiar objects. These two brain regions were then further investigated in another study, where unfamiliar, novel objects were presented to a different group of right-handers. Here the left LOC and aCER were more strongly activated by bimodal than unimodal stimuli only when the left but not the right hand was used. A direct comparison indicated that the multisensory gain of the fMRI activation was significantly higher for the left than the right hand. These findings are in line with the principle of "inverse effectiveness", implying that processing of bimodally presented stimuli is particularly enhanced when the unimodal stimuli are weak. This applies also when right-handed subjects see and simultaneously touch unfamiliar objects with their non-dominant left hand. Thus, the fMRI signal in the left LOC and aCER induced by visuo-haptic stimulation is dependent on which hand was employed for haptic exploration. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    Science.gov (United States)

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  15. Feasibility Study of Haptic Display for Rotation Tasks of Wrist Work

    OpenAIRE

    曽根, 順治; 岩井, 秀樹; 山田, 勝実; 陳, 軍; 徳山, 喜政; 今野, 晃市; Sone, Junji; Iwai, Hideki; Yamada, Katsumi; Chen, Jun; Tokuyama, Yoshimasa; Konno, Kouichi

    2011-01-01

    We have developed a haptic display for rotational tasks that involve functions of the human wrist. We represent the torque using a motor and a brake. Reference torque curves are obtained by measuring the torque required for each actual task using a torque sensor. The brake represents the stop condition. We have confirmed the effectiveness of the display by comparing the actual tasks with the haptic display experiment.

  16. Haptic interventions as visual anthropology’

    DEFF Research Database (Denmark)

    Kirstein Høgel, Arine

    2017-01-01

    This vignette arose in the course of a practice-led research project using “haptic interventions” to investigate contemporary consumption of cultural pasts and cultural difference. The vignette presents reworkings of unused and newly digitised archival material shot in the Persian Gulf in the 1950s...

  17. The role of haptic versus visual volume cues in the size-weight illusion.

    Science.gov (United States)

    Ellis, R R; Lederman, S J

    1993-03-01

    Three experiments establish the size-weight illusion as a primarily haptic phenomenon, despite its having been more traditionally considered an example of vision influencing haptic processing. Experiment 1 documents, across a broad range of stimulus weights and volumes, the existence of a purely haptic size-weight illusion, equal in strength to the traditional illusion. Experiment 2 demonstrates that haptic volume cues are both sufficient and necessary for a full-strength illusion. In contrast, visual volume cues are merely sufficient, and produce a relatively weaker effect. Experiment 3 establishes that congenitally blind subjects experience an effect as powerful as that of blindfolded sighted observers, thus demonstrating that visual imagery is also unnecessary for a robust size-weight illusion. The results are discussed in terms of their implications for both sensory and cognitive theories of the size-weight illusion. Applications of this work to human factors design and to sensor-based systems for robotic manipulation are also briefly considered.

  18. Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.

    Science.gov (United States)

    Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R

    2014-09-01

    This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).
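
    Tip-force sensing with FBGs of this kind typically reduces to a linear calibration: the wavelength shifts of the gratings are mapped to a 3-D tip force through a matrix identified by least squares. The sketch below shows that generic workflow; the number of gratings, units, and variable names are assumptions rather than details from the paper.

    ```python
    import numpy as np

    # Generic FBG-to-force calibration sketch (assumed shapes and units).
    # Calibration: apply known tip forces F (N x 3) while recording wavelength
    # shifts dL (N x m), then solve F ~= dL @ C for the calibration matrix C.

    def calibrate(wavelength_shifts, applied_forces):
        """wavelength_shifts: (N, m) array in nm; applied_forces: (N, 3) in newtons."""
        C, *_ = np.linalg.lstsq(wavelength_shifts, applied_forces, rcond=None)
        return C                                 # shape (m, 3)

    def estimate_tip_force(C, shift_sample):
        """Map one vector of wavelength shifts to an estimated 3-D tip force."""
        return shift_sample @ C

    # Interrogated at 2 kHz, estimate_tip_force() output would drive the
    # voice-coil haptic display at the master side.
    ```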

  19. Integration of serious games and wearable haptic interfaces for Neuro Rehabilitation of children with movement disorders: A feasibility study.

    Science.gov (United States)

    Bortone, Ilaria; Leonardis, Daniele; Solazzi, Massimiliano; Procopio, Caterina; Crecchi, Alessandra; Bonfiglio, Luca; Frisoli, Antonio

    2017-07-01

    The past decade has seen the emergence of rehabilitation treatments using virtual reality environments. One of the advantages of using this technology is the potential to create positive motivation, by means of engaging environments and tasks shaped in the form of serious games. In this work, we propose a novel Neuro Rehabilitation System for children with movement disorders, based on serious games in immersive virtual reality with haptic feedback. The system design aims to enhance the involvement and engagement of patients, to provide congruent multi-sensory afferent feedback during motor exercises, and to benefit from the flexibility of virtual reality in adapting exercises to the patient's needs. We present a feasibility study of the method conducted through an experimental rehabilitation session in a group of 4 children with Cerebral Palsy and Developmental Dyspraxia, 4 Typically Developing children, and 4 healthy adults. Subjects and patients were able to accomplish the proposed rehabilitation session, and the average performance of the motor exercises in patients was lower than, although comparable to, that of healthy subjects. Together with the positive comments reported by the children after the rehabilitation session, the results are encouraging for application of the method in a prolonged rehabilitation treatment.

  20. A three-axis force sensor for dual finger haptic interfaces.

    Science.gov (United States)

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-10-10

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.

  1. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    Directory of Open Access Journals (Sweden)

    Fabio Salsedo

    2012-10-01

    Full Text Available In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.

  2. Design of a 7-DOF slave robot integrated with a magneto-rheological haptic master

    Science.gov (United States)

    Hwang, Yong-Hoon; Cha, Seung-Woo; Kang, Seok-Rae; Choi, Seung-Bok

    2017-04-01

    In this study, a 7-DOF slave robot integrated with the haptic master is designed and its dynamic motion is controlled. The haptic master is built around a controllable magneto-rheological (MR) clutch and brake, and it provides the surgeon with a sense of touch by using both kinetic and kinesthetic information. Due to the size constraints of the slave robot, wire-driven actuation is adopted for the 3-DOF end-effector instead of conventional direct-drive motors, while the remaining 4-DOF link motions are driven directly by motors. For the overall system to work as a haptic device, the haptic master needs to receive information about the repulsive forces applied to the slave robot. Therefore, the repulsive forces on the end-effector are sensed using three uniaxial torque transducers inserted in the wire actuation system, and the repulsive forces applied to the link parts are sensed using a 6-axis force/torque transducer. Another 6-axis transducer is used to verify the reliability of the force information at the tip of the slave robot. Lastly, with the system integrated with the MR haptic master, a psychophysical test is conducted by different operators, who can feel the repulsive force or torque generated by the haptic master, equivalent to the force or torque occurring at the end-effector, to demonstrate the effectiveness of the proposed system.

  3. A Virtual Environment for People Who Are Blind - A Usability Study.

    Science.gov (United States)

    Lahav, O; Schloerb, D W; Kumar, S; Srinivasan, M A

    2012-01-01

    For most people who are blind, exploring an unknown environment can be unpleasant, uncomfortable, and unsafe. Over the past years, the use of virtual reality as a learning and rehabilitation tool for people with disabilities has been on the rise. This research is based on the hypothesis that the supply of appropriate perceptual and conceptual information through compensatory sensorial channels may assist people who are blind with anticipatory exploration. In this research we developed and tested the BlindAid system, which allows the user to explore a virtual environment. The two main goals of the research were: (a) evaluation of different modalities (haptic and audio) and navigation tools, and (b) evaluation of spatial cognitive mapping employed by people who are blind. Our research included four participants who are totally blind. The preliminary findings confirm that the system enabled participants to develop comprehensive cognitive maps by exploring the virtual environment.

  4. Effects of visual information regarding allocentric processing in haptic parallelity matching.

    Science.gov (United States)

    Van Mier, Hanneke I

    2013-10-01

    Research has revealed that haptic perception of parallelity deviates from physical reality. Large and systematic deviations have been found in haptic parallelity matching most likely due to the influence of the hand-centered egocentric reference frame. Providing information that increases the influence of allocentric processing has been shown to improve performance on haptic matching. In this study allocentric processing was stimulated by providing informative vision in haptic matching tasks that were performed using hand- and arm-centered reference frames. Twenty blindfolded participants (ten men, ten women) explored the orientation of a reference bar with the non-dominant hand and subsequently matched (task HP) or mirrored (task HM) its orientation on a test bar with the dominant hand. Visual information was provided by means of informative vision with participants having full view of the test bar, while the reference bar was blocked from their view (task VHP). To decrease the egocentric bias of the hands, participants also performed a visual haptic parallelity drawing task (task VHPD) using an arm-centered reference frame, by drawing the orientation of the reference bar. In all tasks, the distance between and orientation of the bars were manipulated. A significant effect of task was found; performance improved from task HP, to VHP to VHPD, and HM. Significant effects of distance were found in the first three tasks, whereas orientation and gender effects were only significant in tasks HP and VHP. The results showed that stimulating allocentric processing by means of informative vision and reducing the egocentric bias by using an arm-centered reference frame led to most accurate performance on parallelity matching. © 2013 Elsevier B.V. All rights reserved.

  5. Simulating video-assisted thoracoscopic lobectomy: a virtual reality cognitive task simulation.

    Science.gov (United States)

    Solomon, Brian; Bizekis, Costas; Dellis, Sophia L; Donington, Jessica S; Oliker, Aaron; Balsam, Leora B; Zervos, Michael; Galloway, Aubrey C; Pass, Harvey; Grossi, Eugene A

    2011-01-01

    Current video-assisted thoracoscopic surgery training models rely on animals or mannequins to teach procedural skills. These approaches lack inherent teaching/testing capability and are limited by cost, anatomic variations, and single use. In response, we hypothesized that video-assisted thoracoscopic surgery right upper lobe resection could be simulated in a virtual reality environment with commercial software. An anatomy explorer (Maya [Autodesk Inc, San Rafael, Calif] models of the chest and hilar structures) and simulation engine were adapted. Design goals included freedom of port placement, incorporation of well-known anatomic variants, teaching and testing modes, haptic feedback for the dissection, ability to perform the anatomic divisions, and a portable platform. Preexisting commercial models did not provide sufficient surgical detail, and extensive modeling modifications were required. Video-assisted thoracoscopic surgery right upper lobe resection simulation is initiated with a random vein and artery variation. The trainee proceeds in a teaching or testing mode. A knowledge database currently includes 13 anatomic identifications and 20 high-yield lung cancer learning points. The "patient" is presented in the left lateral decubitus position. After initial camera port placement, the endoscopic view is displayed and the thoracoscope is manipulated via the haptic device. The thoracoscope port can be relocated; additional ports are placed using an external "operating room" view. Unrestricted endoscopic exploration of the thorax is allowed. An endo-dissector tool allows for hilar dissection, and a virtual stapling device divides structures. The trainee's performance is reported. A virtual reality cognitive task simulation can overcome the deficiencies of existing training models. Performance scoring is being validated as we assess this simulator for cognitive and technical surgical education. Copyright © 2011. Published by Mosby, Inc.

  6. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm.

    Science.gov (United States)

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae; Kim, Tae Il; Yi, Byung Ju

    2017-01-01

    Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We aim to propose a master-slave robotic colonoscopy that is controllable from a remote site using a conventional colonoscope. The master and slave robots were developed to use conventional flexible colonoscopy. The robotic colonoscopic procedure was performed using a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. A slave robot was developed to hold the colonoscope and its knob, and to perform the insertion, rotation, and two tilting motions of the colonoscope. A master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with a reflection force and torque at the haptic device. The haptic sensation and feedback system was successful and helpful for feeling the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. This work proposed a robotic approach to colonoscopy using a haptic feedback algorithm; this robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients at a remote site.
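
    The one-tenth haptic reflection described above is a simple scaling of the measured interaction signals before they are commanded to the master. The sketch below illustrates this; the haptic device interface (set_force / set_torque) is hypothetical, and only the scale factor comes from the study.

    ```python
    # Sketch of the one-tenth haptic reflection described above. The haptic
    # device interface (set_force / set_torque) is hypothetical; only the scale
    # factor is taken from the study.

    SCALE = 0.1   # reflected force/torque = one tenth of the measured value

    def reflect_to_master(insertion_force_n, rotation_torque_nm, haptic_device):
        haptic_device.set_force(SCALE * insertion_force_n)
        haptic_device.set_torque(SCALE * rotation_torque_nm)
    ```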

  7. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm

    Science.gov (United States)

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae

    2017-01-01

    Purpose Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We aim to propose a master-slave robotic colonoscopy that is controllable from a remote site using a conventional colonoscope. Materials and Methods The master and slave robots were developed to use conventional flexible colonoscopy. The robotic colonoscopic procedure was performed using a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. Results A slave robot was developed to hold the colonoscope and its knob, and to perform the insertion, rotation, and two tilting motions of the colonoscope. A master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with a reflection force and torque at the haptic device. The haptic sensation and feedback system was successful and helpful for feeling the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. Conclusion This work proposed a robotic approach to colonoscopy using a haptic feedback algorithm; this robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients at a remote site. PMID:27873506

  8. Real-time surgery simulation of intracranial aneurysm clipping with patient-specific geometries and haptic feedback

    Science.gov (United States)

    Fenz, Wolfgang; Dirnberger, Johannes

    2015-03-01

    Providing suitable training for aspiring neurosurgeons is becoming more and more problematic. The increasing popularity of the endovascular treatment of intracranial aneurysms leads to a lack of simple surgical situations for clipping operations, leaving mainly the complex cases, which present even experienced surgeons with a challenge. To alleviate this situation, we have developed a training simulator with haptic interaction allowing trainees to practice virtual clipping surgeries on real patient-specific vessel geometries. By using specialized finite element (FEM) algorithms (fast finite element method, matrix condensation) combined with GPU acceleration, we can achieve the necessary frame rate for smooth real-time interaction with the detailed models needed for a realistic simulation of the vessel wall deformation caused by the clamping with surgical clips. Vessel wall geometries for typical training scenarios were obtained from 3D-reconstructed medical image data, while for the instruments (clipping forceps, various types of clips, suction tubes) we use models provided by manufacturer Aesculap AG. Collisions between vessel and instruments have to be continuously detected and transformed into corresponding boundary conditions and feedback forces, calculated using a contact plane method. After a training, the achieved result can be assessed based on various criteria, including a simulation of the residual blood flow into the aneurysm. Rigid models of the surgical access and surrounding brain tissue, plus coupling a real forceps to the haptic input device further increase the realism of the simulation.
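
    The feedback force in such simulators is commonly derived from how far the instrument has penetrated the detected contact plane. The snippet below is a minimal penalty-based sketch under assumed stiffness and damping values; it is a simplified stand-in for how penetration is turned into a force, not the contact plane method used in the simulator.

    ```python
    import numpy as np

    # Minimal penalty-contact sketch: once a collision is detected, the feedback
    # force acts along the contact-plane normal, proportional to penetration
    # depth, with damping along the normal. Stiffness/damping values are assumed.

    def contact_force(tool_pos, tool_vel, plane_point, plane_normal,
                      k=600.0, d=2.0):
        n = plane_normal / np.linalg.norm(plane_normal)
        penetration = np.dot(plane_point - tool_pos, n)   # >0 when below the plane
        if penetration <= 0.0:
            return np.zeros(3)                            # no contact, no force
        normal_vel = np.dot(tool_vel, n)
        magnitude = k * penetration - d * normal_vel
        return max(magnitude, 0.0) * n                    # push the tool back out
    ```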

  9. Inquiry style interactive virtual experiments: a case on circular motion

    Energy Technology Data Exchange (ETDEWEB)

    Zhou Shaona; Wang Xiaojun; Xiao Hua [School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou 510006 (China); Han Jing; Pelz, Nathaniel; Peng Liangyu; Bao Lei, E-mail: xiaoh@scnu.edu.cn, E-mail: lbao@mps.ohio-state.edu [Department of Physics, Ohio State University, Columbus, OH 43210 (United States)

    2011-11-15

    Interest in computer-based learning, especially in the use of virtual reality simulations is increasing rapidly. While there are good reasons to believe that technologies have the potential to improve teaching and learning, how to utilize the technology effectively in teaching specific content difficulties is challenging. To help students develop robust understandings of correct physics concepts, we have developed interactive virtual experiment simulations that have the unique feature of enabling students to experience force and motion via an analogue joystick, allowing them to feel the applied force and simultaneously see its effects. The simulations provide students learning experiences that integrate both scientific representations and low-level sensory cues such as haptic cues under a single setting. In this paper, we introduce a virtual experiment module on circular motion. A controlled study has been conducted to evaluate the impact of using this virtual experiment on students' learning of force and motion in the context of circular motion. The results show that the interactive virtual experiment method is preferred by students and is more effective in helping students grasp the physics concepts than the traditional education method such as problem-solving practices. Our research suggests that well-developed interactive virtual experiments can be useful tools in teaching difficult concepts in science.

  10. Inquiry style interactive virtual experiments: a case on circular motion

    International Nuclear Information System (INIS)

    Zhou Shaona; Wang Xiaojun; Xiao Hua; Han Jing; Pelz, Nathaniel; Peng Liangyu; Bao Lei

    2011-01-01

    Interest in computer-based learning, especially in the use of virtual reality simulations is increasing rapidly. While there are good reasons to believe that technologies have the potential to improve teaching and learning, how to utilize the technology effectively in teaching specific content difficulties is challenging. To help students develop robust understandings of correct physics concepts, we have developed interactive virtual experiment simulations that have the unique feature of enabling students to experience force and motion via an analogue joystick, allowing them to feel the applied force and simultaneously see its effects. The simulations provide students learning experiences that integrate both scientific representations and low-level sensory cues such as haptic cues under a single setting. In this paper, we introduce a virtual experiment module on circular motion. A controlled study has been conducted to evaluate the impact of using this virtual experiment on students' learning of force and motion in the context of circular motion. The results show that the interactive virtual experiment method is preferred by students and is more effective in helping students grasp the physics concepts than the traditional education method such as problem-solving practices. Our research suggests that well-developed interactive virtual experiments can be useful tools in teaching difficult concepts in science.

  11. An image-based approach to the rendering of crowds in real-time

    OpenAIRE

    Tecchia, Franco

    2007-01-01

    The wide use of computer graphics in games, entertainment, medical, architectural, and cultural applications has led to it becoming a prevalent area of research. Games and entertainment in general have become one of the driving forces of the real-time computer graphics industry, bringing reasonably realistic, complex, and appealing virtual worlds to the mass market. At the current stage of technology, a user can interactively navigate through complex, polygon-based scenes rendered with sophis...

  12. Learning of Temporal and Spatial Movement Aspects: A Comparison of Four Types of Haptic Control and Concurrent Visual Feedback.

    Science.gov (United States)

    Rauter, Georg; Sigrist, Roland; Riener, Robert; Wolf, Peter

    2015-01-01

    In the literature, the effectiveness of haptics for motor learning is a matter of debate. Haptics is believed to be effective for motor learning in general; however, different types of haptic control enhance different movement aspects. Thus, depending on the movement aspects of interest, one type of haptic control may be effective whereas another is not. Therefore, in the current work, it was investigated whether and how different types of haptic controllers affect learning of spatial and temporal movement aspects. In particular, haptic controllers that enforce active participation of the participants were expected to improve spatial aspects. Only haptic controllers that provide feedback about the task's velocity profile were expected to improve temporal aspects. In a study on learning a complex trunk-arm rowing task, the effect of training with four different types of haptic control was investigated: position control, path control, adaptive path control, and reactive path control. A fifth group (control) trained with visual concurrent augmented feedback. As hypothesized, the position controller was most effective for learning temporal movement aspects, while the path controller was most effective in teaching spatial movement aspects of the rowing task. Visual feedback was also effective for learning temporal and spatial movement aspects.

  13. Communicating Emotion through Haptic Design: A Study Using Physical Keys

    DEFF Research Database (Denmark)

    Kjellerup, Marie Kjær; Larsen, Anne Cathrine; Maier, Anja

    2014-01-01

    This paper explores how designers may communicate with the users of their products through haptic design; more specifically, how tactile properties of materials evoke emotions such as satisfaction, joy, or disgust. A research-through-design approach has been followed: mood and material boards and prototypes of four 'haptically enhanced' (physical) keys were created. Types of keys selected include home, bicycle, hobby, and basement. An experiment with ten participants was conducted, using word association and software to elicit product emotions (PrEmo). Results show a mapping between the designer...

  14. Representations of space based on haptic input

    NARCIS (Netherlands)

    Zuidhoek, S.

    2005-01-01

    The present thesis focused on the representations of grasping space based on haptic input. We aimed at identifying their characteristics, and the underlying neurocognitive processes and mechanisms. To this end, we studied the systematic distortions in performance on several orientation perception

  15. A strategic map for high-impact virtual experience design

    Science.gov (United States)

    Faste, Haakon; Bergamasco, Massimo

    2009-02-01

    We have employed methodologies of human centered design to inspire and guide the engineering of a definitive low-cost aesthetic multimodal experience intended to stimulate cultural growth. Using a combination of design research, trend analysis and the programming of immersive virtual 3D worlds, over 250 innovative concepts have been brainstormed, prototyped, evaluated and refined. These concepts have been used to create a strategic map for the development of highimpact virtual art experiences, the most promising of which have been incorporated into a multimodal environment programmed in the online interactive 3D platform XVR. A group of test users have evaluated the experience as it has evolved, using a multimodal interface with stereo vision, 3D audio and haptic feedback. This paper discusses the process, content, results, and impact on our engineering laboratory that this research has produced.

  16. Immersive volume rendering of blood vessels

    Science.gov (United States)

    Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.

    2012-03-01

    In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice based 3D texture volume rendering. Due to the sparse structure of blood vessels, we utilize an octree to efficiently store the resampled data by discarding empty regions of the volume. We use animation to convey time series data, wireframe surface to give structure, and utilize the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians, by improving the understanding of blood flow simulations. Full immersion in the flow field allows for a more intuitive understanding of the flow phenomena, and can be a great help to medical experts for treatment planning.
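
    Storing a resampled vessel volume in an octree amounts to recursively subdividing the grid and discarding bricks whose voxels are all empty. The sketch below shows this idea on a NumPy array; the empty threshold, minimum brick size, and node layout are assumptions rather than details of the described system.

    ```python
    import numpy as np

    # Sketch of octree construction over a resampled volume: subdivide recursively
    # and drop children whose voxels are all below an "empty" threshold, so only
    # the sparse vessel structures are stored. Threshold and leaf size are assumed.

    def build_octree(vol, origin=(0, 0, 0), min_size=16, empty_thresh=0.0):
        if vol.size == 0 or vol.max() <= empty_thresh:
            return None                                   # empty region: discard
        if max(vol.shape) <= min_size:
            return {"origin": origin, "data": vol}        # leaf brick kept as-is
        hz, hy, hx = (s // 2 for s in vol.shape)
        children = []
        for dz, zsl in ((0, slice(0, hz)), (1, slice(hz, None))):
            for dy, ysl in ((0, slice(0, hy)), (1, slice(hy, None))):
                for dx, xsl in ((0, slice(0, hx)), (1, slice(hx, None))):
                    child = build_octree(
                        vol[zsl, ysl, xsl],
                        (origin[0] + dz * hz, origin[1] + dy * hy, origin[2] + dx * hx),
                        min_size, empty_thresh)
                    if child is not None:
                        children.append(child)
        return {"origin": origin, "children": children}
    ```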

  17. P1-17: Pseudo-Haptics Using Motion-in-Depth Stimulus and Second-Order Motion Stimulus

    Directory of Open Access Journals (Sweden)

    Shuichi Sato

    2012-10-01

    Full Text Available Modification of the motion of the computer cursor during manipulation by the observer evokes an illusory haptic sensation (Lecuyer et al., 2004 ACM SIGCHI '04 239–246). This study investigates pseudo-haptics using motion-in-depth and second-order motion. A stereoscopic display and a PHANTOM were used in the first experiment. A subject was asked to move a visual target at a constant speed in the horizontal, vertical, or front-back direction. During the manipulation, the speed was reduced to 50% for 500 msec. The haptic sensation was measured using the magnitude estimation method. The result indicates that the perceived haptic sensation from motion-in-depth was about 30% of that from horizontal or vertical motion. A 2D display and the PHANTOM were used in the second experiment. The motion cue was second order: in each frame, the dots in a square patch reverse in contrast (i.e., all black dots become white and all white dots become black). The patch was moved in a horizontal direction. The result indicates that the perceived haptic sensation from second-order motion was about 90% of that from first-order motion.

  18. Improving the performance of DTP2 bilateral teleoperation control system with haptic augmentation

    International Nuclear Information System (INIS)

    Viinikainen, Mikko; Tuominen, Janne; Alho, Pekka; Mattila, Jouni

    2014-01-01

    Highlights: •An experimental haptic shared control system, called CAT, developed at the DTP2. •We investigate how the system integrates with the ITER-compliant DTP2 RHCS. •The effect of CAT is experimentally assessed in an ITER-relevant maintenance scenario. -- Abstract: The remote maintenance of the ITER divertor is largely dependent on the usage of haptically teleoperated manipulators and man-in-the-loop operations. These maintenance operations are very demanding for the manipulator operators, yet vital for the success of the whole ITER experiment. Haptic shared control of the maintenance manipulators offers a promising solution for assisting the teleoperators in the maintenance tasks. A shared control system assists the operator by generating artificial guiding force effects and overlaying them on top of the haptic feedback from the teleoperation environment. An experimental haptic shared control system, called Computer Assisted Teleoperation (CAT), has been developed at the Divertor Test Platform 2 (DTP2). In this paper, we investigate the design of the system and how it integrates with the ITER-compliant DTP2 prototype Remote Handling Control System (RHCS). We also experimentally assess the effect of the guidance on operator performance in an ITER-relevant maintenance scenario using the Water Hydraulic MANipulator (WHMAN), which is specially designed for divertor maintenance. The result of the experiment gives a suggestive indication that the CAT system improves the performance of the operators of the system.

  19. Virtual environments simulation in research reactor

    Science.gov (United States)

    Muhamad, Shalina Bt. Sheik; Bahrin, Muhammad Hannan Bin

    2017-01-01

    Virtual reality based simulations are interactive and engaging, and have useful potential for improving safety training. Virtual reality technology can be used to train workers who are unfamiliar with the physical layout of an area. In this study, a simulation program based on the virtual environment of a research reactor was developed. The platform used for the virtual simulation is the 3DVia software, whose rendering capabilities, physics for movement and collision, and interactive navigation features have been exploited. A real research reactor was virtually modelled and simulated, with avatar models adopted to simulate walking. Collision detection algorithms were developed for various parts of the 3D building and the avatars, to restrain the avatars to certain regions of the virtual environment. A user can control an avatar to move around inside the virtual environment. Thus, this work can assist in the training of personnel, as in evaluating the radiological safety of the research reactor facility.

  20. Video-based rendering

    CERN Document Server

    Magnor, Marcus A

    2005-01-01

    Driven by consumer-market applications that enjoy steadily increasing economic importance, graphics hardware and rendering algorithms are a central focus of computer graphics research. Video-based rendering is an approach that aims to overcome the current bottleneck in the time-consuming modeling process and has applications in areas such as computer games, special effects, and interactive TV. This book offers an in-depth introduction to video-based rendering, a rapidly developing new interdisciplinary topic employing techniques from computer graphics, computer vision, and telecommunication en

  1. The contribution of cutaneous and kinesthetic sensory modalities in haptic perception of orientation.

    Science.gov (United States)

    Frisoli, Antonio; Solazzi, Massimiliano; Reiner, Miriam; Bergamasco, Massimo

    2011-06-30

    The aim of this study was to understand the integration of cutaneous and kinesthetic sensory modalities in haptic perception of shape orientation. A specific robotic apparatus was employed to simulate the exploration of virtual surfaces by active touch with two fingers, with kinesthetic-only, cutaneous-only, and combined sensory feedback. The cutaneous feedback was capable of displaying the local surface orientation at the contact point, through a small plate indenting the fingerpad at contact. A psychophysics test was conducted with SDT methodology on 6 subjects to assess the discrimination threshold of angle perception between two parallel surfaces, with three sensory modalities and two shape sizes. Results show that the cutaneous sensory modality is not affected by the size of the shape, whereas kinesthetic performance decreases with smaller sizes. Cutaneous and kinesthetic sensory cues are integrated according to a Bayesian model, so that the combined sensory stimulation always performs better than either single modality alone. Copyright © 2010 Elsevier Inc. All rights reserved.
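
    A common way to formalize this kind of Bayesian (maximum-likelihood) cue integration is sketched below: each cue is weighted by its reliability, and the fused estimate has lower variance than either cue alone. The numeric angle estimates and variances are invented for illustration, not taken from the study.

        def combine_cues(est_cut, var_cut, est_kin, var_kin):
            """Fuse two noisy estimates of surface orientation (degrees).

            Each cue is weighted by its inverse variance; the fused variance is
            never larger than either single-cue variance, which is why bimodal
            discrimination is predicted to be at least as good as unimodal.
            """
            w_cut = (1.0 / var_cut) / (1.0 / var_cut + 1.0 / var_kin)
            w_kin = 1.0 - w_cut
            fused = w_cut * est_cut + w_kin * est_kin
            fused_var = 1.0 / (1.0 / var_cut + 1.0 / var_kin)
            return fused, fused_var

        if __name__ == "__main__":
            # Hypothetical single-cue angle estimates and variances (deg, deg^2).
            angle, variance = combine_cues(est_cut=12.0, var_cut=4.0,
                                           est_kin=10.0, var_kin=9.0)
            print(f"fused angle = {angle:.2f} deg, fused variance = {variance:.2f}")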

  2. Differences between early-blind, late-blind, and blindfolded-sighted people in haptic spatial-configuration learning and resulting memory traces

    NARCIS (Netherlands)

    Postma, Albert; Zuidhoek, Sander; Noordzij, Matthijs L.; Kappers, Astrid M L

    2007-01-01

    The roles of visual and haptic experience in different aspects of haptic processing of objects in peripersonal space are examined. In three trials, early-blind, late-blind, and blindfolded-sighted individuals had to match ten shapes haptically to the cut-outs in a board as fast as possible. Both

  3. Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.

    Science.gov (United States)

    Perdigão, Luís M A; Saywell, Alex

    2011-07-01

    The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room temperature STM imaging of C(60) molecules adsorbed on an Au(111) surface in ultra-high vacuum.
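
    The following is an illustrative sketch, not the authors' implementation, of how a measured tunneling current could be mapped to a vertical force on the stylus: because the current rises roughly exponentially as the tip approaches the surface, a logarithmic mapping around a setpoint, with clamping, keeps the force within what a haptic device can render. The setpoint, gain, and force limit are assumptions.

        import math

        def current_to_force(i_tunnel_nA, i_setpoint_nA=1.0, gain_N_per_decade=0.5,
                             f_max_N=3.0):
            """Return an upward force (N) that grows as the current exceeds the
            setpoint, i.e., as the tip gets too close to the surface."""
            if i_tunnel_nA <= 0.0:
                return 0.0                        # no tunneling: no repulsive force
            decades = math.log10(i_tunnel_nA / i_setpoint_nA)
            force = gain_N_per_decade * decades   # positive when above the setpoint
            return max(0.0, min(force, f_max_N))  # clamp to the device's force range

        if __name__ == "__main__":
            for i in (0.1, 1.0, 10.0, 100.0):     # nA
                print(f"I = {i:6.1f} nA -> F = {current_to_force(i):.2f} N")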

  4. Comparative study on collaborative interaction in non-immersive and immersive systems

    Science.gov (United States)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction so that different people from different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where the object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) The effects of using haptics on a collaborative interaction, 2) The possibilities of collaboration between users from different environments. We conducted user tests on our system in several cases: 1) Comparison between non-haptics and haptics collaborative interaction over LAN, 2) Comparison between non-haptics and haptics collaborative interaction over Internet, and 3) Analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two cases: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe physics laws while constructing a dollhouse using existing building blocks, under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.

  5. Multi-viewpoint Image Array Virtual Viewpoint Rapid Generation Algorithm Based on Image Layering

    Science.gov (United States)

    Jiang, Lu; Piao, Yan

    2018-04-01

    The use of a multi-view image array combined with virtual viewpoint generation technology to record 3D scene information in large scenes has become one of the key technologies for the development of integrated imaging. This paper presents a virtual viewpoint rendering method based on an image layering algorithm. Firstly, the depth information of the reference viewpoint image is quickly obtained; during this process, SAD is chosen as the similarity measure function. The reference image is then layered and the parallax is calculated based on the depth information. According to the relative distance between the virtual viewpoint and the reference viewpoint, the image layers are weighted and panned. Finally, the virtual viewpoint image is rendered layer by layer according to the distance between the image layers and the viewer. This method avoids the disadvantages of the DIBR algorithm, such as the high-precision requirements on the depth map and complex mapping operations. Experiments show that this algorithm can achieve the synthesis of virtual viewpoints at any position within a 2×2 viewpoint range, and the rendering speed is also high. The averaged results show that this method achieves satisfactory image quality: the average SSIM value of the results relative to real viewpoint images reaches 0.9525, the PSNR value reaches 38.353, and the image histogram similarity reaches 93.77%.
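
    As a small illustration of the SAD similarity measure mentioned above, the sketch below performs basic block matching to pick the disparity that minimizes the sum of absolute differences for one pixel of the reference view; the window size, search range, and synthetic test images are illustrative only.

        import numpy as np

        def sad(block_a, block_b):
            return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

        def best_disparity(left, right, y, x, win=3, max_disp=16):
            """Return the disparity in [0, max_disp] that minimizes SAD for the
            window centred at (y, x) of the left (reference) image."""
            half = win // 2
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, None
            for d in range(max_disp + 1):
                if x - d - half < 0:
                    break
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                cost = sad(ref, cand)
                if best_cost is None or cost < best_cost:
                    best_d, best_cost = d, cost
            return best_d

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            right = rng.integers(0, 255, size=(64, 64), dtype=np.uint8)
            left = np.roll(right, 5, axis=1)               # synthetic 5-pixel shift
            print(best_disparity(left, right, y=32, x=40)) # expected disparity ~5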

  6. The force pyramid: a spatial analysis of force application during virtual reality brain tumor resection.

    Science.gov (United States)

    Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F

    2017-07-01

    OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force
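
    A hedged sketch of the force-pyramid aggregation described above: contact-force samples are binned by the xy position of the instrument tip and summed per cell, giving a 2D histogram whose heights can be displayed as the 3D "pyramid". The grid extent, bin count, and sample data are invented and do not reproduce the NeuroVR metric code.

        import numpy as np

        def force_pyramid(xy, forces, bins=20, extent=(-50.0, 50.0)):
            """xy: (N, 2) tip positions in mm; forces: (N,) contact force magnitudes.
            Returns a (bins, bins) array of summed force per xy cell."""
            pyramid, _, _ = np.histogram2d(
                xy[:, 0], xy[:, 1],
                bins=bins, range=[extent, extent],
                weights=forces)
            return pyramid

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            xy = rng.normal(0.0, 10.0, size=(5000, 2))   # simulated tip positions (mm)
            forces = rng.uniform(0.0, 0.6, size=5000)    # simulated contact forces (N)
            p = force_pyramid(xy, forces)
            print(p.shape, float(p.max()))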

  7. Drafting of the dismantling operations of the MAR 200 workshop with the help of virtual reality

    International Nuclear Information System (INIS)

    Chabal, C.; Soulabaille, Y.; Garnier, T.; Callixte, O.

    2014-01-01

    In order to optimize future dismantling operations of nuclear installations, virtual reality allows the validation of predefined scenarios and of their adequacy with the environment. CEA uses an immersive and interactive room to validate maintenance and dismantling operations. The equipment of this room is composed of a video wall that gives a 3-dimensional view of the virtual environment, and of a system for motion capture. For the simulation of handling operations, a haptic interface has been designed; it allows the user to receive tactile and force feedback. The immersion is completed by a phonic ambience that creates sounds for the virtual operations. The use of the immersion room for optimizing the dismantling of a spent fuel dissolver (MAR 200) used in a hot cell is presented. (A.C.)

  8. Get your virtual hands off me! - Developing threatening IVAs using haptic feedback

    NARCIS (Netherlands)

    Goedschalk, L.F.; Bosse, T.; Otte, M.; Verheij, B.; Wiering, M.

    2018-01-01

    Intelligent Virtual Agents (IVAs) are becoming widely used for numerous applications, ranging from healthcare decision support to communication training. In several such applications, it is useful if IVAs have the ability to take a negative stance towards the user, for instance for anti-bullying or

  9. Vibrotactile perception assessment for a haptic interface on an antigravity suit.

    Science.gov (United States)

    Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu

    2017-01-01

    Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Exploring laterality and memory effects in the haptic discrimination of verbal and non-verbal shapes.

    Science.gov (United States)

    Stoycheva, Polina; Tiippana, Kaisa

    2018-03-14

    The brain's left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain, due to contralateral innervation, this functional lateralization is reflected in a hand advantage for certain functions. Findings regarding the hand-hemisphere advantage for haptic information remain contradictory, however. This study addressed these laterality effects and their interaction with memory retention times in the haptic modality. Participants performed haptic discrimination of letters, geometric shapes and nonsense shapes at memory retention times of 5, 15 and 30 s with the left and right hand separately, and we measured the discriminability index d'. The d' values were significantly higher for letters and geometric shapes than for nonsense shapes. This might result from dual coding (naming + spatial) and/or from a low stimulus complexity. There was no stimulus-specific laterality effect. However, we found a time-dependent laterality effect, which revealed that the performance of the left hand-right hemisphere was sustained up to 15 s, while the performance of the right hand-left hemisphere decreased progressively throughout all retention times. This suggests that haptic memory traces are more robust to decay when they are processed by the left hand-right hemisphere.
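
    For reference, the discriminability index d' used above follows the standard signal-detection formula d' = z(hit rate) - z(false-alarm rate); the sketch below computes it with a log-linear correction for extreme rates, using made-up response counts.

        from statistics import NormalDist

        def d_prime(hits, misses, false_alarms, correct_rejections):
            # Log-linear correction avoids infinite z-scores for rates of 0 or 1.
            hit_rate = (hits + 0.5) / (hits + misses + 1.0)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            z = NormalDist().inv_cdf
            return z(hit_rate) - z(fa_rate)

        if __name__ == "__main__":
            print(round(d_prime(hits=22, misses=8, false_alarms=6,
                                correct_rejections=24), 2))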

  11. Optimization of Virtual Loudspeakers for Spatial Room Acoustics Reproduction with Headphones

    Directory of Open Access Journals (Sweden)

    Otto Puomio

    2017-12-01

    Full Text Available The use of headphones in reproducing spatial sound is becoming more and more popular. For instance, virtual reality applications often use head-tracking to keep the binaurally reproduced auditory environment stable and to improve externalization. Here, we study one spatial sound reproduction method over headphones, in particular the positioning of the virtual loudspeakers. The paper presents an algorithm that optimizes the positioning of virtual reproduction loudspeakers to reduce the computational cost in head-tracked real-time rendering. The listening test results suggest that listeners could discriminate the optimized loudspeaker arrays for renderings that reproduced relatively simple acoustic conditions, but the optimized array was not significantly different from an equally spaced array for the reproduction of a more complex case. Moreover, the optimization seems to change the perceived openness and timbre, according to the verbal feedback of the test subjects.

  12. Hybrid 3D visualization of the chest and virtual endoscopy of the tracheobronchial system: possibilities and limitations of clinical application.

    Science.gov (United States)

    Seemann, M D; Claussen, C D

    2001-06-01

    A hybrid rendering method is described that combines a color-coded surface rendering method and a volume rendering method, enabling virtual endoscopic examinations using different representation models. 14 patients with malignancies of the lung and mediastinum (n=11) and lung transplantation (n=3) underwent thin-section spiral computed tomography. The tracheobronchial system and anatomical and pathological features of the chest were segmented using an interactive threshold-interval volume-growing segmentation algorithm and visualized with a color-coded surface rendering method. The structures of interest were then superimposed on a volume rendering of the other thoracic structures. For the virtual endoscopy of the tracheobronchial system, a shaded-surface model without color coding, a transparent color-coded shaded-surface model and a triangle-surface model were tested and compared. The hybrid rendering technique exploits the advantages of both rendering methods, provides an excellent overview of the tracheobronchial system and allows a clear depiction of the complex spatial relationships of anatomical and pathological features. Virtual bronchoscopy with a transparent color-coded shaded-surface model allows both a simultaneous visualization of an airway, an airway lesion and mediastinal structures and a quantitative assessment of the spatial relationship between these structures, thus improving confidence in the diagnosis of endotracheal and endobronchial diseases. Hybrid rendering and virtual endoscopy obviate the need for time-consuming detailed analysis and presentation of axial source images. Virtual bronchoscopy with a transparent color-coded shaded-surface model offers a practical alternative to fiberoptic bronchoscopy and is particularly promising for patients in whom fiberoptic bronchoscopy is not feasible, contraindicated or refused. Furthermore, it can be used as a complementary procedure to fiberoptic bronchoscopy in evaluating airway stenosis and
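
    A minimal sketch of a threshold-interval volume-growing segmentation of the kind mentioned above: starting from a seed voxel, 6-connected neighbours are added while their intensity stays inside a user-chosen interval. The interval values and the synthetic CT volume are illustrative (roughly, air inside an airway), not the authors' parameters.

        import numpy as np
        from collections import deque

        def grow(volume, seed, lo, hi):
            """Return a boolean mask of the region grown from seed within [lo, hi]."""
            mask = np.zeros(volume.shape, dtype=bool)
            if not (lo <= volume[seed] <= hi):
                return mask
            queue = deque([seed])
            mask[seed] = True
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in offsets:
                    n = (z + dz, y + dy, x + dx)
                    if all(0 <= c < s for c, s in zip(n, volume.shape)) and not mask[n]:
                        if lo <= volume[n] <= hi:
                            mask[n] = True
                            queue.append(n)
            return mask

        if __name__ == "__main__":
            ct = np.full((40, 40, 40), 40, dtype=np.int16)   # soft tissue (~40 HU)
            ct[10:30, 18:22, 18:22] = -950                   # synthetic airway (air)
            airway = grow(ct, seed=(20, 20, 20), lo=-1024, hi=-800)
            print(int(airway.sum()), "voxels segmented")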

  13. The Use of Haptic and Tactile Information in the Car to Improve Driving Safety: A Review of Current Technologies

    Directory of Open Access Journals (Sweden)

    Yoren Gaffary

    2018-03-01

    Full Text Available This paper surveys the haptic technologies deployed in cars and their uses to enhance drivers’ safety during manual driving. These technologies make it possible to deliver haptic (tactile or kinesthetic) feedback at various areas of the car, such as the steering wheel, the seat, or the pedal. The paper explores two main uses of the haptic modality to fulfill the safety objective: to provide driving assistance and warnings. Driving assistance concerns the transmission of information usually conveyed with other modalities for controlling the car’s functions, maneuvering support, and guidance. Warning concerns the prevention of accidents using emergency warnings, increasing the awareness of surroundings, and preventing collisions, lane departures, and speeding. This paper discusses how haptic feedback has been introduced so far for these purposes and provides perspectives regarding the present and future of haptic cars meant to increase driver’s safety.

  14. How to Build an Embodiment Lab: Achieving Body Representation Illusions in Virtual Reality

    Directory of Open Access Journals (Sweden)

    Bernhard eSpanlang

    2014-11-01

    Full Text Available Advances in computer graphics algorithms and virtual reality (VR) systems, together with the reduction in cost of associated equipment, have led scientists to consider VR as a useful tool for conducting experimental studies in fields such as neuroscience and experimental psychology. In particular, virtual body ownership, where the feeling of ownership over a virtual body is elicited in the participant, has become a useful tool in the study of body representation in cognitive neuroscience and psychology, which is concerned with how the brain represents the body. Although VR has been shown to be a useful tool for exploring body ownership illusions, integrating the various technologies necessary for such a system can be daunting. In this paper we discuss the technical infrastructure necessary to achieve virtual embodiment. We describe a basic VR system and how it may be used for this purpose, and then extend this system with the introduction of real-time motion capture, a simple haptics system and the integration of physiological and brain electrical activity recordings.

  15. Performance improvement of haptic collision detection using subdivision surface and sphere clustering.

    Directory of Open Access Journals (Sweden)

    A Ram Choi

    Full Text Available Haptic applications such as surgery simulations require more precise collision detection than other applications. An efficient collision detection method based on the clustering of bounding spheres was proposed in our prior study. This paper analyzes and compares the effects of the five most common subdivision surface methods applied to 3D models for haptic collision detection. The five methods are Butterfly, Catmull-Clark, Mid-point, Loop, and LS3 (Least Squares Subdivision Surfaces). After performing a number of experiments, we have concluded that the LS3 method is the most appropriate for haptic simulations. The more we applied surface subdivision, the more precise the collision detection results became. However, it is observed that performance improves only up to a certain threshold and degrades afterward. In order to reduce the performance degradation, we adopted our prior work, a fast and precise collision detection method based on adaptive clustering. As a result, we obtained a notable improvement in the speed of collision detection.
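
    A hedged sketch of the bounding-sphere broad phase implied above: mesh vertices are grouped into clusters, each cluster gets a bounding sphere, and only cluster pairs whose spheres overlap are passed to a precise narrow-phase test. The trivial clustering and random point sets below stand in for the adaptive clustering and subdivided surfaces of the actual method.

        import numpy as np

        def bounding_sphere(points):
            centre = points.mean(axis=0)
            radius = np.linalg.norm(points - centre, axis=1).max()
            return centre, radius

        def spheres_overlap(s1, s2):
            (c1, r1), (c2, r2) = s1, s2
            return np.linalg.norm(c1 - c2) <= r1 + r2

        def candidate_pairs(clusters_a, clusters_b):
            """Broad phase: keep only cluster pairs whose bounding spheres touch."""
            spheres_a = [bounding_sphere(c) for c in clusters_a]
            spheres_b = [bounding_sphere(c) for c in clusters_b]
            return [(i, j)
                    for i, sa in enumerate(spheres_a)
                    for j, sb in enumerate(spheres_b)
                    if spheres_overlap(sa, sb)]

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            obj_a = [rng.normal([0, 0, 0], 0.2, (50, 3)),
                     rng.normal([2, 0, 0], 0.2, (50, 3))]
            obj_b = [rng.normal([2.2, 0, 0], 0.2, (50, 3))]
            print(candidate_pairs(obj_a, obj_b))   # only the nearby cluster pair remains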

  16. Multisensory Interactions between Auditory and Haptic Object Recognition

    DEFF Research Database (Denmark)

    Kassuba, Tanja; Menz, Mareike M; Röder, Brigitte

    2013-01-01

    and haptic object features activate cortical regions that host unified conceptual object representations. The left fusiform gyrus (FG) and posterior superior temporal sulcus (pSTS) showed increased activation during crossmodal matching of semantically congruent but not incongruent object stimuli. In the FG...

  17. Recruitment of Foveal Retinotopic Cortex During Haptic Exploration of Shapes and Actions in the Dark.

    Science.gov (United States)

    Monaco, Simona; Gallivan, Jason P; Figley, Teresa D; Singhal, Anthony; Culham, Jody C

    2017-11-29

    The role of the early visual cortex and higher-order occipitotemporal cortex has been studied extensively for visual recognition and to a lesser degree for haptic recognition and visually guided actions. Using a slow event-related fMRI experiment, we investigated whether tactile and visual exploration of objects recruit the same "visual" areas (and in the case of visual cortex, the same retinotopic zones) and if these areas show reactivation during delayed actions in the dark toward haptically explored objects (and if so, whether this reactivation might be due to imagery). We examined activation during visual or haptic exploration of objects and action execution (grasping or reaching) separated by an 18 s delay. Twenty-nine human volunteers (13 females) participated in this study. Participants had their eyes open and fixated on a point in the dark. The objects were placed below the fixation point and accordingly visual exploration activated the cuneus, which processes retinotopic locations in the lower visual field. Strikingly, the occipital pole (OP), representing foveal locations, showed higher activation for tactile than visual exploration, although the stimulus was unseen and location in the visual field was peripheral. Moreover, the lateral occipital tactile-visual area (LOtv) showed comparable activation for tactile and visual exploration. Psychophysiological interaction analysis indicated that the OP showed stronger functional connectivity with anterior intraparietal sulcus and LOtv during the haptic than visual exploration of shapes in the dark. After the delay, the cuneus, OP, and LOtv showed reactivation that was independent of the sensory modality used to explore the object. These results show that haptic actions not only activate "visual" areas during object touch, but also that this information appears to be used in guiding grasping actions toward targets after a delay. SIGNIFICANCE STATEMENT Visual presentation of an object activates shape

  18. Short-term plasticity of visuo-haptic object recognition

    DEFF Research Database (Denmark)

    Kassuba, Tanja; Klinge, Corinna; Hölig, Cordula

    2014-01-01

    Functional magnetic resonance imaging (fMRI) studies have provided ample evidence for the involvement of the lateral occipital cortex (LO), fusiform gyrus (FG), and intraparietal sulcus (IPS) in visuo-haptic object integration. Here we applied 30 min of sham (non-effective) or real offline 1 Hz ... the same stimulation gave rise to relative increases in activation during S2 processing in the right LO, left FG, bilateral IPS, and other regions previously associated with object recognition. Critically, the modality of S2 determined which regions were recruited after rTMS. Relative to sham rTMS, real rTMS induced increased activations during crossmodal congruent matching in the left FG for haptic S2 and the temporal pole for visual S2. In addition, we found stronger activations for incongruent than congruent matching in the right anterior parahippocampus and middle frontal gyrus for crossmodal matching ...

  19. Haptically facilitated bimanual training combined with augmented visual feedback in moderate to severe hemiplegia.

    Science.gov (United States)

    Boos, Amy; Qiu, Qinyin; Fluet, Gerard G; Adamovich, Sergei V

    2011-01-01

    This study describes the design and feasibility testing of a hand rehabilitation system that provides haptic assistance for hand opening in moderate to severe hemiplegia while subjects attempt to perform bilateral hand movements. A cable-actuated exoskeleton robot assists the subjects in performing impaired finger movements but is controlled by movement of the unimpaired hand. In an attempt to combine the neurophysiological stimuli of bilateral movement and action observation during training, visual feedback of the impaired hand is replaced by feedback of the unimpaired hand, either by using a sagittally oriented mirror or a virtual reality setup with a pair of virtual hands presented on a flat screen controlled by movement of the unimpaired hand, providing a visual image of the paretic hand moving normally. Joint angles for both hands are measured using data gloves. The system is programmed to maintain a symmetrical relationship between the two hands as they respond to commands to open and close simultaneously. Three persons with moderate to severe hemiplegia secondary to stroke trained with the system for eight 30- to 60-minute sessions without adverse events. Each demonstrated positive motor adaptations to training. The system was well tolerated by persons with moderate to severe upper extremity hemiplegia. Further testing of its effects on motor ability with a broader range of clinical presentations is indicated.
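
    The symmetry rule described above can be sketched as follows, with hypothetical device interfaces: joint angles measured on the unimpaired hand's data glove become the targets commanded to the exoskeleton on the impaired hand, optionally scaled and clamped to the impaired hand's available range of motion.

        def mirror_targets(unimpaired_angles_deg, scale=1.0, max_flexion_deg=80.0):
            """Map unimpaired-hand joint angles to impaired-hand targets."""
            return [min(max(a * scale, 0.0), max_flexion_deg)
                    for a in unimpaired_angles_deg]

        def control_step(read_glove, command_exoskeleton, scale=1.0):
            """One cycle of the bimanual controller; read_glove and
            command_exoskeleton stand in for the real device interfaces."""
            targets = mirror_targets(read_glove(), scale=scale)
            command_exoskeleton(targets)
            return targets

        if __name__ == "__main__":
            fake_glove = lambda: [10.0, 45.0, 95.0, 30.0]   # four finger joints, deg
            control_step(fake_glove, command_exoskeleton=print, scale=0.8)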

  20. Virtual reality in medicine-computer graphics and interaction techniques.

    Science.gov (United States)

    Haubner, M; Krapichler, C; Lösch, A; Englmeier, K H; van Eimeren, W

    1997-03-01

    This paper describes several new visualization and interaction techniques that enable the use of virtual environments for routine medical purposes. A new volume-rendering method supports shaded and transparent visualization of medical image sequences in real-time with an interactive threshold definition. Based on these rendering algorithms, two complementary segmentation approaches offer intuitive assistance for a wide range of requirements in diagnosis and therapy planning. In addition, a hierarchical data representation for geometric surface descriptions guarantees an optimal use of available hardware resources and prevents inaccurate visualization. The combination of the presented techniques empowers the improved human-machine interface of virtual reality to support every interactive task in medical three-dimensional (3-D) image processing, from visualization of unsegmented data volumes up to the simulation of surgical procedures.

  1. Design of a smart haptic system for repulsive force control under irregular manipulation environment

    International Nuclear Information System (INIS)

    Lee, Sang-Rock; Choi, Seung-Hyun; Choi, Seung-Bok; Cho, Myeong-Woo

    2014-01-01

    This paper describes how to make an operator feel the desired repulsive force in a haptic system. When an operator manipulates a haptic system, the repulsive force felt by the operator varies significantly, depending on many factors such as position, velocity and force. In order to reflect the desired repulsive force to the operator, it is commonly known that a haptic system must compensate for irregularly changing forces. The irregularity of the forces, however, has discouraged many researchers from establishing a clear principle on how to make the operator feel the desired repulsive force. To resolve this problem, we introduce a smart haptic framework that can reflect the desired repulsive force to the operator, regardless of the operator’s movement. A dummy governing equation technique is introduced and used to calculate the proper actuating force in real time. The actuating force is generated by a PID controller. To verify the proposed method, a mathematical proof is offered to show that the repulsive force converges to the desired repulsive force. Additionally, to demonstrate the performance of the proposed method, simulation and experimental tests are carried out. (paper)
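
    A minimal PID sketch of the idea above is given below: the controller drives the actuating force so that the repulsive force felt by the operator converges to the desired value. The gains, sample rate, and the first-order stand-in for the operator/device dynamics are illustrative, not the authors' model.

        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, desired, measured):
                error = desired - measured
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        if __name__ == "__main__":
            pid = PID(kp=2.0, ki=5.0, kd=0.01, dt=0.001)
            felt_force, desired = 0.0, 1.5                  # newtons
            for _ in range(5000):                           # 5 s at 1 kHz
                actuation = pid.update(desired, felt_force)
                # First-order stand-in for how the felt force responds to actuation.
                felt_force += (actuation - felt_force) * 0.05
            print(round(felt_force, 3))                     # converges toward 1.5 N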

  2. Rhythmic Haptic Stimuli Improve Short-Term Attention.

    Science.gov (United States)

    Zhang, Shusheng; Wang, Dangxiao; Afzal, Naqash; Zhang, Yuru; Wu, Ruilin

    2016-01-01

    Brainwave entrainment using rhythmic visual and/or auditory stimulation has shown its efficacy in modulating neural activities and cognitive ability. In the presented study, we aim to investigate whether rhythmic haptic stimulation can enhance short-term attention. An experiment with a sensorimotor rhythm (SMR)-increasing protocol was performed, in which participants were presented with a 15 Hz sinusoidal vibrotactile stimulus on the palm. The Test of Variables of Attention (T.O.V.A.) was performed before and after the stimulating session. Electroencephalograph (EEG) data were recorded across the stimulating session and the two attention test sessions. SMR band power manifested a significant increase after stimulation. Results of the T.O.V.A. tests indicated an improvement in the attention of participants who had received the stimulation compared to the control group who had not. The D prime score of the T.O.V.A. reveals that participants performed better in perceptual sensitivity and sustained attention compared to their baseline performance before the stimulating session. These findings highlight the potential value of using haptics-based brainwave entrainment for cognitive training.

  3. Optimal design of a new 3D haptic gripper for telemanipulation, featuring magnetorheological fluid brakes

    International Nuclear Information System (INIS)

    Nguyen, Q H; Choi, S B; Lee, Y S; Han, M S

    2013-01-01

    In this research work, a new configuration of a 3D haptic gripper for telemanipulation is proposed and optimally designed. The proposed haptic gripper, featuring three magnetorheological fluid brakes (MRBs), reflects the rolling torque, the grasping force and the approach force from the slave manipulator to the master operator. After describing the operational principle of the haptic gripper, an optimal design of the MRBs for the gripper is performed. The purpose of the optimization problem is to find the most compact MRB that can provide a required braking torque/force to the master operator while the off-state torque/force is kept as small as possible. In the optimal design, different types of MRBs and different MR fluids (MRFs) are considered. In order to obtain the optimal solution of the MRBs, an optimization approach based on finite element analysis (FEA) integrated with an optimization tool is used. The optimal solutions of the MRBs are then obtained and the optimized MRBs for the haptic gripper are identified. In addition, discussions on the optimal solutions and performance of the optimized MRBs are given. (paper)

  4. Virtual reality training and assessment in laparoscopic rectum surgery.

    Science.gov (United States)

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Liang, Hui; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2015-06-01

    Virtual-reality (VR) based simulation techniques offer an efficient and low-cost alternative to conventional surgery training. This article describes a VR training and assessment system for laparoscopic rectum surgery. To give a realistic visual rendering of the interaction between membrane tissue and surgical tools, a generalized-cylinder-based collision detection method and a multi-layer mass-spring model are presented. A dynamic assessment model is also designed for hierarchical training evaluation. With this simulator, trainees can operate on the virtual rectum with both visual and haptic sensation feedback simultaneously. The system also offers surgeons instructions in real time when improper manipulation occurs. The simulator has been tested and evaluated by ten subjects. This prototype system has been verified by colorectal surgeons through a pilot study. They believe the visual performance and the tactile feedback are realistic. It exhibits the potential to effectively improve the surgical skills of trainee surgeons and significantly shorten their learning curve. Copyright © 2014 John Wiley & Sons, Ltd.
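
    As a simplified illustration of the mass-spring approach, the sketch below advances a 1D chain of nodes joined by springs with one semi-implicit Euler step; the actual system uses a multi-layer membrane model, and the stiffness, damping, and mass values here are illustrative only.

        import numpy as np

        def step(pos, vel, rest_len, k=80.0, damping=2.0, mass=0.01, dt=1e-3, pinned=(0,)):
            """One semi-implicit Euler step for a chain of N nodes joined by springs.
            pos, vel: (N, 3) arrays; nodes listed in 'pinned' stay fixed."""
            force = np.zeros_like(pos)
            for i in range(len(pos) - 1):
                d = pos[i + 1] - pos[i]
                length = np.linalg.norm(d)
                if length > 1e-9:
                    f = k * (length - rest_len) * d / length   # Hooke's law along the spring
                    force[i] += f
                    force[i + 1] -= f
            force -= damping * vel
            force[:, 2] -= mass * 9.81                          # gravity on z
            new_vel = vel + dt * force / mass
            new_pos = pos + dt * new_vel
            new_vel[list(pinned)] = 0.0
            new_pos[list(pinned)] = pos[list(pinned)]
            return new_pos, new_vel

        if __name__ == "__main__":
            n = 10
            pos = np.column_stack([np.linspace(0, 0.09, n), np.zeros(n), np.zeros(n)])
            vel = np.zeros_like(pos)
            for _ in range(2000):                               # 2 s of simulated sag
                pos, vel = step(pos, vel, rest_len=0.01)
            print(round(float(pos[-1, 2]), 4))                  # free end has sagged below 0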

  5. Subliminal Cueing of Selection Behavior in a Virtual Environment

    OpenAIRE

    Aranyi, Gabor; Kouider, Sid; Lindsay, Alan; Prins, Hielke; Ahmed, Imtiaj; Jacucci, Giulio; Negri, Paolo; Gamberini, Luciano; Pizzi, David; Cavazza, Marc

    2014-01-01

    The performance of current graphics engines makes it possible to incorporate subliminal cues within virtual environments (VEs), providing an additional way of communication, fully integrated with the exploration of a virtual scene. In order to advance the application of subliminal information in this area, it is necessary to explore how techniques previously reported as rendering information subliminal in the psychological literature can be successfully implemented in VEs. Previous lite...

  6. Study on Collaborative Object Manipulation in Virtual Environment

    Science.gov (United States)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of network collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared based on several experiments. The user tests on our system include several cases: 1) Comparison between non-haptics and haptics collaborative interaction over LAN, 2) Comparison between non-haptics and haptics collaborative interaction over Internet, and 3) Analysis of collaborative interaction between non-immersive and immersive display environments.

  7. A haptic sensing upgrade for the current EOD robotic fleet

    Science.gov (United States)

    Rowe, Patrick

    2014-06-01

    The past decade and a half has seen a tremendous rise in the use of mobile manipulator robotic platforms for bomb inspection and disposal, explosive ordnance disposal, and other extremely hazardous tasks in both military and civilian settings. Skilled operators are able to control these robotic vehicles in amazing ways given the very limited situational awareness obtained from a few on-board camera views. Future generations of robotic platforms will, no doubt, provide some sort of additional force or haptic sensor feedback to further enhance the operator's interaction with the robot, especially when dealing with fragile, unstable, and explosive objects. Unfortunately, the robot operators need this capability today. This paper discusses an approach to provide existing (and future) robotic mobile manipulator platforms, with which trained operators are already familiar and highly proficient, this desired haptic and force feedback capability. The goals of this technology are to be rugged, reliable, and affordable. It should also be able to be applied to a wide range of existing robots with a wide variety of manipulator/gripper sizes and styles. Finally, the presentation of the haptic information to the operator is discussed, given the fact that control devices that physically interact with the operators are not widely available and still in the research stages.

  8. Kinematic/Dynamic Characteristics for Visual and Kinesthetic Virtual Environments

    Science.gov (United States)

    Bortolussi, Michael R. (Compiler); Adelstein, B. D.; Gold, Miriam

    1996-01-01

    Work was carried out on two topics of principal importance to current progress in virtual environment research at NASA Ames and elsewhere. The first topic was directed at maximizing the temporal dynamic response of visually presented Virtual Environments (VEs) through reorganization and optimization of system hardware and software. The final result of this portion of the work was a VE system in the Advanced Display and Spatial Perception Laboratory at NASA Ames capable of updating at 60 Hz (the maximum hardware refresh rate) with latencies approaching 30 msec. In the course of achieving this system performance, specialized hardware and software tools for measurement of VE latency and analytic models correlating update rate and latency for different system configurations were developed. The second area of activity was the preliminary development and analysis of a novel kinematic architecture for three Degree-Of-Freedom (DOF) haptic interfaces--devices that provide force feedback for manipulative interaction with virtual and remote environments. An invention disclosure was filed on this work and a patent application is being pursued by NASA Ames. Activities in these two areas are expanded upon below.

  9. Correlation between perception of color, shadows, and surface textures and the realism of a scene in virtual reality.

    Science.gov (United States)

    Pardo, Pedro J; Suero, María Isabel; Pérez, Ángel Luis

    2018-04-01

    Head-mounted displays allow us to go through immersive experiences in virtual reality and are expected to be present in more and more applications in both recreational and professional fields. In this context, recent years have witnessed significant advances in rendering techniques following physical models of lighting and shading. The aim of this paper is to check the fidelity of the visual appearance of real objects captured through a 3D scanner, rendered on a personal computer, and displayed in a virtual reality device. We have compared forward versus deferred rendering in real-time computing using two different illuminations and five artwork replicas. The survey contains seven items for each artwork (color, shading, texture, definition, geometry, chromatic aberration, and pixelation) and an extra item related to global realism. The results confirm recent advances in virtual reality, showing considerable visual fidelity of generated images to real-world images, with a rating close to 4 on a 5-step perceptual scale. They also show a high correlation of the realism sensation with the fidelity of color reproduction, material texture, and definition of the artwork replicas. Moreover, statistically significant differences between the two rendering modes are found, with a higher value of realism sensation in the deferred rendering mode.

  10. Design of a New MR Compatible Haptic Interface with Six Actuated Degrees of Freedom

    DEFF Research Database (Denmark)

    Ergin, Mehmet Alper; Kühne, Markus; Thielscher, Axel

    2014-01-01

    Functional magnetic resonance imaging is an often adopted tool to study human motor control mechanisms. Highly controlled experiments as required by this form of analysis can be realized with haptic interfaces. Their design is challenging because of strong safety and MR compatibility requirements. Existing MR-compatible haptic interfaces are restricted to a maximum of three actuated degrees of freedom. We propose an MR-compatible haptic interface with six actuated degrees of freedom to be able to study human brain mechanisms of natural pick-and-place movements including arm transport. In this work, we present its mechanical design, kinematic and dynamic model, as well as report on its model-based characterization. A novel hybrid control scheme for the employed ultrasonic motors is introduced. Preliminary MR compatibility tests based on one complete actuator-sensor module are performed. No measurable

  11. Six Degree-of-Freedom Haptic Simulation of Probing Dental Caries Within a Narrow Oral Cavity.

    Science.gov (United States)

    Wang, Dangxiao; Zhao, Xiaohan; Shi, Youjiao; Zhang, Yuru; Hou, Jianxia; Xiao, Jing

    2016-01-01

    Haptic simulation of handling pathological tissues is a crucial component in enhancing virtual surgical training systems. In this paper, we introduce a configuration-based optimization approach to simulate the exploration and diagnosis of carious tissues in dental operations. To simulate the six Degree-of-Freedom (6DoF) haptic interaction between the dental probe and the oral tissues, we introduce two interaction states, the sliding state and the penetration state, which simulate exploration on the surface of and inside the caries, respectively. Penetration criteria considering a contact force threshold are defined to trigger the switch between the two states. By utilizing a simplified friction model based on the optimization approach, various multi-region frictional contacts between the probe and carious tissues are simulated. To simulate exploration within the carious tissues for diagnosing the depth of the caries, a dynamic sphere tree is used to constrain the insertion/extraction of the probe within carious tissues along a fixed direction while enabling simulation of additional contacts of the probe with neighboring oral tissues during the insertion/extraction process. Experimental results show that decays with different levels of stiffness and friction coefficients can be stably simulated. Preliminary user studies show that users could easily identify the invisible boundary between the decayed and healthy tissues and correctly rank the depth of target decays within a required time limit. The proposed approach could be used for training the delicate motor skill of probing carious teeth in a narrow oral cavity, which requires coordinated control of tool posture and insertion/extraction force, while avoiding damage to adjacent healthy tissues of the tongue and gingiva.
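
    The two-state logic described above can be sketched as a small state machine: the probe stays in a sliding state on the tooth surface until the contact force exceeds a threshold, switches to a penetration state, and switches back when the force drops and the probe moves outward. The thresholds and sample data are illustrative, not the paper's criteria.

        SLIDING, PENETRATION = "sliding", "penetration"

        def next_state(state, contact_force, axial_velocity,
                       enter_threshold=1.2, release_threshold=0.2):
            """contact_force in newtons; axial_velocity > 0 means pushing inward."""
            if state == SLIDING and contact_force >= enter_threshold:
                return PENETRATION
            if (state == PENETRATION and contact_force <= release_threshold
                    and axial_velocity < 0):
                return SLIDING
            return state

        if __name__ == "__main__":
            state = SLIDING
            samples = [(0.3, 0.01), (1.5, 0.02), (0.8, 0.01), (0.1, -0.02), (0.05, -0.03)]
            for force, vel in samples:
                state = next_state(state, force, vel)
                print(f"F={force:4.2f} N, v={vel:+.2f} -> {state}")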

  12. Virtual reality in advanced medical immersive imaging: a workflow for introducing virtual reality as a supporting tool in medical imaging

    KAUST Repository

    Knodel, Markus M.

    2018-02-27

    Radiologic evaluation of images from computed tomography (CT) or magnetic resonance imaging for diagnostic purposes is based on the analysis of single slices, occasionally supplementing this information with 3D reconstructions as well as surface or volume rendered images. However, due to the complexity of anatomical or pathological structures in biomedical imaging, innovative visualization techniques are required to display morphological characteristics three dimensionally. Virtual reality is a modern tool for representing visual data. The observer has the impression of being “inside” a virtual surrounding, which is referred to as immersive imaging. Such techniques are currently being used in technical applications, e.g. in the automobile industry. Our aim is to introduce a workflow realized within one simple program which processes common image stacks from CT, produces 3D volume and surface reconstruction and rendering, and finally includes the data into a virtual reality device equipped with a motion head tracking cave automatic virtual environment system. Such techniques have the potential to augment the possibilities in non-invasive medical imaging, e.g. for surgical planning or educational purposes, to add another dimension for advanced understanding of complex anatomical and pathological structures. To this end, the reconstructions are based on advanced mathematical techniques, and the corresponding grids which we can export are intended to form the basis for simulations of mathematical models of the pathogenesis of different diseases.

  13. Reaching in reality and virtual reality: a comparison of movement kinematics in healthy subjects and in adults with hemiparesis

    Directory of Open Access Journals (Sweden)

    Feldman Anatol G

    2004-12-01

    Full Text Available Abstract Background Virtual reality (VR) is an innovative tool for sensorimotor rehabilitation increasingly being employed in clinical and community settings. Despite the growing interest in VR, few studies have determined the validity of movements made in VR environments with respect to real physical environments. The goal of this study was to compare movements done in physical and virtual environments in adults with motor deficits to those in healthy individuals. Methods The participants were 8 healthy adults and 7 adults with mild left hemiparesis due to stroke. Kinematics of functional arm movements involving reaching, grasping and releasing made in physical and virtual environments were analyzed in two phases: 1) reaching and grasping the ball and 2) ball transport and release. The virtual environment included interaction with an object on a 2D computer screen and haptic force feedback from a virtual ball. Temporal and spatial parameters of reaching and grasping were determined for each phase. Results Individuals in both groups were able to reach, grasp, transport, place and release the virtual and real ball using similar movement strategies. In healthy subjects, reaching and grasping movements in both environments were similar, but these subjects used less wrist extension and more elbow extension to place the ball on the virtual vertical surface. Participants with hemiparesis made slower movements in both environments compared to healthy subjects, and during transport and placing of the ball, trajectories were more curved and interjoint coordination was altered. Despite these differences, patients with hemiparesis also tended to use less wrist extension during the whole movement and more elbow extension at the end of the placing phase. Conclusion Differences in movements made by healthy subjects in the two environments may be explained by the use of a 2D instead of a 3D virtual environment and the absence of haptic feedback from the VR target

  14. Feeling the beat where it counts: fostering multi-limb rhythm skills with the haptic drum kit

    NARCIS (Netherlands)

    Holland, S.; Bouwer, A.J.; Dalgleish, M.; Hurtig, T.M.

    2010-01-01

    This paper introduces a tool known as the Haptic Drum Kit, which employs four computer-controlled vibrotactile devices, one attached to each wrist and ankle. In the applications discussed here, haptic pulses are used to guide the playing, on a drum kit, of rhythmic patterns that require multi-limb

  15. The C-Lever Project: Haptics for Automotive Applications

    NARCIS (Netherlands)

    Garcia Canseco, E.; Ayemlong Fokem, A.; Serrarens, A.F.A.; Steinbuch, M.; Stigter, H.

    2010-01-01

    The goal of this project is to research the effectiveness of a controlled haptic force feedback shift lever that can accurately reproduce the behavior of a manual gear shift during driving, and that can also be used to control interior and comfort functions in the car.

  16. Adapting haptic guidance authority based on user grip

    NARCIS (Netherlands)

    Smisek, J.; Mugge, W.; Smeets, J.B.J.; Van Paassen, M.M.; Schiele, A

    2014-01-01

    Haptic guidance systems support the operator in task execution using additional forces on the input device. Scaling of the guidance forces determines the control authority of the support system. As task complexity may vary, one level of the guidance scaling may be insufficient, and adaptation of the

  17. VIRTUAL ARCHAEOLOGICAL ENVIRONMENTS GENERATED IN AVAYALIVE ENGAGE

    Directory of Open Access Journals (Sweden)

    ANTHONY Rigby

    2014-09-01

    Full Text Available Realistically rendered and textured virtual spaces can be created in the AVAYALIVE ENGAGE platform by importing high-polygon models and accurately scaled, reproduced textures. In addition, MellaniuM has successfully developed an application for utilizing all the archaeological virtual assets developed in 3D Studio Max or generated over the past several years using photogrammetry and laser scanning. It is therefore possible to create interactive environments of archaeological significance that can be accessed through the Internet and are available to up to 40 participants.

  18. Conservation of old renderings - the consolidation of rendering with loss of cohesion

    Directory of Open Access Journals (Sweden)

    Martha Tavares

    2008-01-01

    Full Text Available The study of external renderings in the scope of conservation and restoration has seen great methodological, scientific and technical advances in recent years. These renderings are important elements of the built structure, for besides possessing a protective function, they often possess a decorative function of great relevance for the image of the monument. The maintenance of these renderings implies the conservation of traditional constructive techniques and the use of compatible materials, as similar to the originals as possible. The main objective of this study is to define a methodology of conservative restoration using strategies for the maintenance of renderings and traditional constructive techniques. The minimum intervention principle is maintained, as well as the use of materials compatible with the original ones. This paper describes the technique and products used for the consolidation of renders with loss of cohesion. The testing campaign was developed under controlled conditions, in the laboratory, and in situ in order to evaluate their efficacy for the consolidation of old renders. A set of tests is presented to evaluate the effectiveness of the process. The results are analysed and a reflection is added concerning the applicability of these techniques. Finally, the paper presents a proposal for further research.

  19. Robot-assisted microsurgical forceps with haptic feedback for transoral laser microsurgery.

    Science.gov (United States)

    Deshpande, Nikhil; Chauhan, Manish; Pacchierotti, Claudio; Prattichizzo, Domenico; Caldwell, Darwin G; Mattos, Leonardo S

    2016-08-01

    In this paper, a novel, motorized, multi-degrees-of-freedom (DoF) microsurgical forceps tool is presented, which is based on a master-slave teleoperation architecture. The slave device is a 7-DoF manipulator with: (i) 6-DoF positioning and orientation, (ii) 1 open/close gripper DoF; and (iii) an integrated force/torque sensor for tissue grip-force measurement. The master device is a 7-DoF haptic interface which teleoperates the slave device and provides haptic feedback through its gripper interface. The combination of the device and the surgeon interface replaces the manual, hand-held device, providing easy-to-use and ergonomic tissue control and simplifying the surgical tasks. This makes the system suitable for real surgical scenarios in the operating room (OR). The performance of the system was analysed through the evaluation of teleoperation control and characterization of gripping force. The new system offers an overall positioning error of less than 400 μm, demonstrating its safety and accuracy. Improved system precision, usability, and ergonomics point to the potential suitability of the device for the OR and its ability to advance haptic-feedback-enhanced transoral laser microsurgeries.
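
    An illustrative sketch (with hypothetical device interfaces) of the grip-force reflection such a master-slave architecture implies: the force/torque sensor on the slave forceps measures the tissue grip force, which is scaled and capped before being commanded to the gripper axis of the master haptic interface.

        def reflect_grip_force(read_slave_force_N, command_master_force_N,
                               scale=0.5, f_max_N=4.0):
            """One servo cycle of grip-force reflection from slave to master."""
            grip = read_slave_force_N()
            feedback = min(scale * grip, f_max_N)   # scale and cap for safety
            command_master_force_N(feedback)
            return feedback

        if __name__ == "__main__":
            measured = iter([0.0, 1.0, 3.0, 12.0])  # N, simulated sensor readings
            for _ in range(4):
                reflect_grip_force(lambda: next(measured), command_master_force_N=print)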

  20. Sound Descriptions of Haptic Experiences of Art Work by Deafblind Cochlear Implant Users

    Directory of Open Access Journals (Sweden)

    Riitta Lahtinen

    2018-05-01

    Full Text Available Deafblind persons’ perception and experiences are based on their residual auditory and visual senses, and touch. Their haptic exploration, through movements and orientation towards objects, gives blind persons direct, independent experience. Few studies explore the aesthetic experiences and appreciation of artefacts by deafblind people using cochlear implant (CI) technology, and how they interpret and express their perceived aesthetic experience through another sensory modality. While speech recognition is studied extensively in this area, auditive descriptions made by CI users are a less-studied domain. The present research intervention describes and analyses how five deafblind people shared their interpretations of five statues vocally, using sounds and written descriptions based on their haptic explorations. The participants found new and multimodal ways of expressing their experiences, as well as re-experiencing them through technological aids. We also found that the CI users modify technology to better suit their personal needs. We conclude that CI technology in combination with self-made sound descriptions enhances the memorization of haptic art experiences, which can be recalled by replaying the recorded sound descriptions. This research expands the idea of auditive descriptions, and encourages user-produced descriptions as artistic supports to traditional linguistic audio descriptions. These can be used to create personal auditive–haptic memory collections, similar to how sighted people create photo albums.

  1. A Study of Power and Individualism in Virtual Teams: Trends, Challenges, and Solutions

    Science.gov (United States)

    Jablonski, Deirdre

    2013-01-01

    This study investigated the relationship between cultural values and effectiveness of virtual team processes. In order to render an acceptable degree of comparison, four specific team outcomes of virtual team effectiveness were aligned on Hofstede's cultural dimensions of power distance and individualism. The lack of awareness of how power and…

  2. Enhancing Tele-robotics with Immersive Virtual Reality

    Science.gov (United States)

    2017-11-03

    The spheres displayed in the virtual environment represent the real-world readings from the robot in real-time from its LRF and sonar sensors. In...Inc., is comprised of an advanced graphics rendering engine, sound engine, and physics and animation engines. This game engine is capable of delivering

  3. Perceptual grouping affects haptic enumeration over the fingers

    NARCIS (Netherlands)

    Overvliet, K.E.; Plaisier, M.A.

    2016-01-01

    Spatial arrangement is known to influence enumeration times in vision. In haptic enumeration, it has been shown that dividing the total number of items over the two hands can speed up enumeration. Here we investigated how spatial arrangement of items and non-items presented to the individual fingers

  4. Virtual reality neurosurgery: a simulator blueprint.

    Science.gov (United States)

    Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J

    2004-04-01

    This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.

  5. A Taxonomy and Comparison of Haptic Actions for Disassembly Tasks

    National Research Council Canada - National Science Library

    Bloomfield, Aaron; Deng, Yu; Wampler, Jeff; Rondot, Pascale; Harth, Dina; McManus, Mary; Badler, Norman

    2003-01-01

    .... We conducted a series of human subject experiments to compare user performance and preference on a disassembly task with and without haptic feedback using CyberGlove, Phantom, and SpaceMouse interfaces...

  6. Walking in a Virtual Town using Nintendo Wiimote and Balance Board

    Directory of Open Access Journals (Sweden)

    Lucio Tommaso De Paolis

    2011-12-01

    Full Text Available The main goal of Human-Computer Interaction technology is to improve the interactions between users and computers by making computers more usable and receptive to the user's needs. The end point in interface design would then lead to a paradigm in which the interaction with computers becomes similar to the one between human beings. This paper focuses on an application of navigation and interaction in a virtual environment using the Nintendo Wiimote and the Balance Board. The idea is to have a system of navigation control in a virtual environment based on a locomotion interface in order to make the interaction easier for users without any experience of navigation in a virtual world and more efficient for trained users. The application has been developed for navigation and interaction in a virtual town. We chose Otranto as an example town; Otranto is located in the easternmost tip of the Italian peninsula and, due to its geographical position, the town was like a bridge between East and West. The virtual environment is a faithful representation of the town of Otranto in the Middle Ages. [Italian abstract, translated:] The main goal of Human-Computer Interaction technology is to improve the interactions between users and computers in order to make such systems more usable and better suited to the users' different needs. The end point in the design of an interface would therefore be to make the interaction with computers as similar as possible to that between human beings. This paper presents an application that allows navigation and interaction in a virtual environment using the Nintendo Wiimote and Balance Board. The idea is to have a navigation control system in a virtual environment based on a simple locomotion interface able to make the interaction simpler for users without any experience of navigation in a virtual world and more efficient

  7. Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?

    Science.gov (United States)

    Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.

    2007-01-01

    Background Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators scored on a 5-point Likert scale. The participants were allotted to 3 groups depending on their experience: experts, intermediates and novices. Significant differences were calculated with the paired t-test. Results There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 resp. 2.18, P < 0.000) as well as the suturing task (mean 4.15 resp. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 resp. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 resp. 2.94, P < 0.000). Conclusions In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356

  8. Virtual Reality and Augmented Reality in Plastic Surgery: A Review.

    Science.gov (United States)

    Kim, Youngjun; Kim, Hannah; Kim, Yong Oock

    2017-05-01

    Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.

  9. Virtual Reality and Augmented Reality in Plastic Surgery: A Review

    Directory of Open Access Journals (Sweden)

    Youngjun Kim

    2017-05-01

    Full Text Available Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.

  10. Preliminarily measurement and analysis of sawing forces in fresh cadaver mandible using reciprocating saw for reality-based haptic feedback.

    Science.gov (United States)

    Yu, Dedong; Zheng, Xiaohu; Chen, Ming; Shen, Steve G F

    2012-05-01

    The aim of the study was to preliminarily measure and analyze the cutting forces in fresh Chinese cadaver mandible using a clinically widely used reciprocating saw, for reality-based haptic feedback. Eight mandibles were taken from fresh Chinese cadavers, 4 females and 4 males, aged between 59 and 95 years. A set of sawing experiments, using a surgical Stryker micro-reciprocating saw and a Kistler piezoelectric dynamometer, was carried out on a CNC machining center. Under different saw vibration frequencies and feed rates measured from orthognathic surgery, sawing forces were recorded by a signal acquisition system. Remarkably different sawing forces were measured from different cadavers. Feed rate and vibration frequency of the reciprocating saw could determine the cutting forces only within a single body. To reduce the impact of bone thickness changes on the cutting force measurements, all the cutting force data should be converted to force per unit cutting length. The vibration frequency of the haptic feedback system is determined by the main cutting forces. The fast Fourier transform method can be used to calculate the frequency of this system. To simulate surgery with higher fidelity, all the sawing forces from the experiment should be amended by experienced surgeons before use in a virtual reality surgery simulator. Sawing force signals from donors of different ages were measured successfully for force feedback, and more factors related to bone mechanical properties, such as bone density, should be considered in the future.
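
    As a rough illustration of the two analysis steps mentioned above (normalizing force by cutting length and extracting the dominant frequency with the fast Fourier transform), the sketch below processes a synthetic force signal; the sampling rate, cut length, and signal are hypothetical and are not the measured cadaver data.

        import numpy as np

        fs = 10_000.0                 # hypothetical sampling rate [Hz]
        cut_length_mm = 12.0          # hypothetical cut length [mm]
        t = np.arange(0, 1.0, 1.0 / fs)
        # Synthetic sawing force: 250 Hz blade oscillation plus noise [N]
        force = 3.0 + 1.2 * np.sin(2 * np.pi * 250 * t) + 0.1 * np.random.randn(t.size)

        # Convert to force per unit cutting length to reduce the effect of bone thickness
        force_per_mm = force / cut_length_mm
        print(f"mean force per unit length: {force_per_mm.mean():.3f} N/mm")

        # Dominant frequency of the cutting force via FFT
        spectrum = np.abs(np.fft.rfft(force - force.mean()))
        freqs = np.fft.rfftfreq(force.size, d=1.0 / fs)
        print(f"dominant sawing frequency ~ {freqs[np.argmax(spectrum)]:.1f} Hz")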

  11. Salient features in 3-D haptic shape perception

    NARCIS (Netherlands)

    Plaisier, Myrthe A; Bergmann Tiest, Wouter M.; Kappers, Astrid M L

    2009-01-01

    Shape is an important cue for recognizing an object by touch. Several features, such as edges, curvature, surface area, and aspect ratio, are associated with 3-D shape. To investigate the saliency of 3-D shape features, we developed a haptic search task. The target and distractor items consisted of

  12. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments - Does the Character's Personality Play a Role?

    Science.gov (United States)

    Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel

    2018-04-01

    Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.

  13. The role of haptic feedback in laparoscopic training using the LapMentor II.

    Science.gov (United States)

    Salkini, Mohamad W; Doarn, Charles R; Kiehl, Nicholai; Broderick, Timothy J; Donovan, James F; Gaitonde, Krishnanath

    2010-01-01

    Laparoscopic surgery has become the standard of care for many surgical diseases. Haptic (tactile) feedback (HFB) is considered an important component of laparoscopic surgery. Virtual reality simulation (VRS) is an alternative method to teach surgical skills to surgeons in training. Newer VRS trainers such as the Simbionix Lap Mentor II provide significantly improved tactile feedback. However, VRSs are expensive and adding HFB software adds an estimated cost of $30,000 to the commercial price. The HFB provided by the Lap Mentor II has not been validated by an independent party. We used the Simbionix Lap Mentor II in this study to demonstrate the effect of adding an HFB mechanism in the VRS trainer. The study was approved by the University of Cincinnati Institutional Review Board. Twenty laparoscopically novice medical students were enrolled. Each student was asked to perform three different tasks on the Lap Mentor II and repeat each one five times. The chosen tasks demanded a significant amount of traction and counter-traction. The first task was to pull leaking tubes far enough to clip them. The second task was stretching a jelly plate enough to see its attachments to the floor and cut these attachments. In the third task, the trainee had to separate the gallbladder from its bed on the liver. The students were randomized into two groups to perform the tasks with and without HFB. We used accuracy, speed, and economy of movement as scales to compare the performance between the two groups. The participants also completed a simple questionnaire that highlighted age, sex, and experiences in videogame usage. The two groups were comparable in age, sex, and videogame playing. No differences in the accuracy, the economy, and the speed of hand movement were noticed. In fact, adding HFB to the Lap Mentor II simulator did not contribute to any improvement in the performance of the trainees. Interestingly, we found that videogame expert players tend to have faster and more economic

  14. Haptic Human-Human Interaction Through a Compliant Connection Does Not Improve Motor Learning in a Force Field

    NARCIS (Netherlands)

    Beckers, Niek; Keemink, Arvid; van Asseldonk, Edwin; van der Kooij, Herman; Prattichizzo, Domenico; Shinoda, Hiroyuki; Tan, Hong Z.; Ruffaldi, Emanuele; Frisoli, Antonio

    2018-01-01

    Humans have a natural ability to haptically interact with other humans, for instance during physically assisting a child to learn how to ride a bicycle. A recent study has shown that haptic human-human interaction can improve individual motor performance and motor learning rate while learning to

  15. The Effect of Trial-by-trial Adaptation on Conflicts in Haptic Shared Control for Free-Air Teleoperation Tasks

    NARCIS (Netherlands)

    de Jonge, A. W.; Wildenbeest, J. G. W.; Boessenkool, H.; Abbink, D. A.

    2016-01-01

    Haptic shared control can improve execution of teleoperation and driving tasks. However, shared control designs may suffer from conflicts between individual human operators and constant haptic assistance when their desired trajectories differ, leading to momentarily increased forces, discomfort or

  16. Design of a complex virtual reality simulation to train finger motion for persons with hemiparesis: a proof of concept study.

    Science.gov (United States)

    Adamovich, Sergei V; Fluet, Gerard G; Mathai, Abraham; Qiu, Qinyin; Lewis, Jeffrey; Merians, Alma S

    2009-07-17

    Current neuroscience has identified rehabilitation approaches with the potential to stimulate adaptive changes in the brains of persons with hemiparesis. These approaches include intensive task-oriented training, bimanual activities and balancing proximal and distal upper extremity interventions to reduce competition between these segments for neural territory. This paper describes the design and feasibility testing of a robotic/virtual environment system designed to train the hand and arm of persons with hemiparesis. The system employs a simulated piano that presents visual, auditory and tactile feedback comparable to an actual piano. Arm tracking allows patients to train both the arm and hand as a coordinated unit, emphasizing the integration of both transport and manipulation phases. The piano trainer includes songs and scales that can be performed with one or both hands. Adaptable haptic assistance is available for more involved subjects. An algorithm adjusts task difficulty in proportion to subject performance. A proof of concept study was performed on four subjects with upper extremity hemiparesis secondary to chronic stroke to establish: a) the safety and feasibility of this system and b) the concurrent validity of robotically measured kinematic and performance measures to behavioral measures of upper extremity function. None of the subjects experienced adverse events or responses during or after training. As a group, the subjects improved in both performance time and key press accuracy. Three of the four subjects demonstrated improvements in fractionation, the ability to move each finger individually. Two subjects improved their aggregate time on the Jebsen Test of Hand Function and three of the four subjects improved in Wolf Motor Function Test aggregate time. The system designed in this paper has proven to be safe and feasible for the training of hand function for persons with hemiparesis. It features a flexible design that allows for the use and further

  17. Design of a complex virtual reality simulation to train finger motion for persons with hemiparesis: a proof of concept study

    Directory of Open Access Journals (Sweden)

    Qiu Qinyin

    2009-07-01

    Full Text Available Abstract Background Current neuroscience has identified rehabilitation approaches with the potential to stimulate adaptive changes in the brains of persons with hemiparesis. These approaches include intensive task-oriented training, bimanual activities and balancing proximal and distal upper extremity interventions to reduce competition between these segments for neural territory. Methods This paper describes the design and feasibility testing of a robotic/virtual environment system designed to train the hand and arm of persons with hemiparesis. The system employs a simulated piano that presents visual, auditory and tactile feedback comparable to an actual piano. Arm tracking allows patients to train both the arm and hand as a coordinated unit, emphasizing the integration of both transport and manipulation phases. The piano trainer includes songs and scales that can be performed with one or both hands. Adaptable haptic assistance is available for more involved subjects. An algorithm adjusts task difficulty in proportion to subject performance. A proof of concept study was performed on four subjects with upper extremity hemiparesis secondary to chronic stroke to establish: a) the safety and feasibility of this system and b) the concurrent validity of robotically measured kinematic and performance measures to behavioral measures of upper extremity function. Results None of the subjects experienced adverse events or responses during or after training. As a group, the subjects improved in both performance time and key press accuracy. Three of the four subjects demonstrated improvements in fractionation, the ability to move each finger individually. Two subjects improved their aggregate time on the Jebsen Test of Hand Function and three of the four subjects improved in Wolf Motor Function Test aggregate time. Conclusion The system designed in this paper has proven to be safe and feasible for the training of hand function for persons with hemiparesis
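
    Entries 16 and 17 state that an algorithm adjusts task difficulty in proportion to subject performance, but the rule itself is not given in the record. The following is only a minimal sketch of such a proportional adaptation, with hypothetical parameter names and values:

        def adapt_difficulty(difficulty, accuracy, target=0.8, gain=0.5,
                             lo=0.1, hi=1.0):
            """Move difficulty up or down in proportion to the performance error.
            accuracy: fraction of correct key presses in the last block (0..1)
            target:   success rate the trainer tries to maintain"""
            difficulty += gain * (accuracy - target)
            return min(max(difficulty, lo), hi)

        d = 0.5
        for block_accuracy in [0.95, 0.90, 0.60, 0.70]:   # hypothetical blocks
            d = adapt_difficulty(d, block_accuracy)
            print(f"accuracy={block_accuracy:.2f} -> difficulty={d:.2f}")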

  18. Simulations and experimental evaluation of an active orthosis for interaction in virtual environments

    Directory of Open Access Journals (Sweden)

    Tsveov Mihail

    2018-01-01

    Full Text Available In this work, the development of an active orthosis for the human arm is presented. The orthosis is designed primarily for training and rehabilitation in virtual environments. The orthosis system is intended for embodiment in virtual reality, allowing the human to perceive forces at different body parts or the weight of lifted objects. The paper presents the choice of a mechanical structure equivalent to the structure of the human arm. A mechanical model of the orthosis arm as a haptic device is built, and its kinematic and dynamic parameters are evaluated. An impedance control scheme is selected as the most suitable for force reflection at the hand or arm, and an open-loop impedance controller is presented. Computer experiments were carried out using the dimensions of a real arm orthosis to provide force reflection in VR according to a virtual scenario. The simulations show the range of forces the orthosis can provide at the operator's hand. Results of additional measurements and experimental evaluations of physical quantities during interaction in a virtual environment are also reported.
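
    The record names an open-loop impedance controller as the force-reflection scheme but does not reproduce its equations. The generic impedance law F = K(x_ref - x) + B(v_ref - v), rendering a virtual spring-damper between the measured arm pose and a virtual contact point, is sketched below with hypothetical stiffness and damping values; the actual controller of the orthosis may differ.

        import numpy as np

        def impedance_force(x, v, x_ref, v_ref, K, B):
            """Virtual spring-damper between the orthosis end point (x, v)
            and the virtual contact point (x_ref, v_ref)."""
            return K @ (x_ref - x) + B @ (v_ref - v)

        K = np.diag([400.0, 400.0, 400.0])    # hypothetical stiffness [N/m]
        B = np.diag([5.0, 5.0, 5.0])          # hypothetical damping   [N*s/m]

        x = np.array([0.30, 0.00, 0.10])      # measured hand position [m]
        v = np.array([0.02, 0.00, 0.00])      # measured hand velocity [m/s]
        x_ref = np.array([0.28, 0.00, 0.10])  # virtual object surface
        v_ref = np.zeros(3)

        print(impedance_force(x, v, x_ref, v_ref, K, B))  # force to be reflected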

  19. Towards a standard on evaluation of tactile/haptic interactions

    NARCIS (Netherlands)

    Sinclair, I.; Carter, J.; Kassner, S.; Erp, J.B.F. van; Weber, G.; Elliott, L.; Andrew, I.

    2012-01-01

    Tactile and haptic interaction is becoming increasingly important; ergonomic standards can ensure that systems are designed with sufficient concern for ergonomics and interoperability. ISO (through working group TC159/SC4/WG9) is developing international standards in this subject area, dual-tracked

  20. Listening to white noise counteracts visual and haptic pseudoneglect.

    Science.gov (United States)

    Cattaneo, Zaira; Lega, Carlotta; Vecchi, Tomaso; Vallar, Giuseppe

    2012-01-01

    Neurologically intact individuals usually show a leftward bias in line bisection, a tendency known as "pseudoneglect", likely reflecting a right-hemisphere dominance in controlling the allocation of spatial attention. Studies in brain-damaged patients with left visuospatial neglect have reported that auditory stimulation may reduce the deficit, both in a spatially dependent and in a spatially independent way. Here we show for the first time that the concurrent binaural presentation of auditory white noise affects healthy individuals' performance in both visual and haptic bisection, reducing their leftward error. We suggest that this effect depends on the noise boosting alertness and restoring the hemispheric activation balance. Our data clearly show that task-irrelevant auditory noise crossmodally affects the allocation of spatial resources in both the haptic and the visual space; future research may clarify whether these effects are specific for the type of auditory stimulation.

  1. Selective attention modulates visual and haptic repetition priming: effects in aging and Alzheimer's disease.

    Science.gov (United States)

    Ballesteros, Soledad; Reales, José M; Mayas, Julia; Heller, Morton A

    2008-08-01

    In two experiments, we examined the effect of selective attention at encoding on repetition priming in normal aging and Alzheimer's disease (AD) patients for objects presented visually (experiment 1) or haptically (experiment 2). We used a repetition priming paradigm combined with a selective attention procedure at encoding. Reliable priming was found for both young adults and healthy older participants for visually presented pictures (experiment 1) as well as for haptically presented objects (experiment 2). However, this was only found for attended and not for unattended stimuli. The results suggest that independently of the perceptual modality, repetition priming requires attention at encoding and that perceptual facilitation is maintained in normal aging. However, AD patients did not show priming for attended stimuli, or for unattended visual or haptic objects. These findings suggest an early deficit of selective attention in AD. Results are discussed from a cognitive neuroscience approach.

  2. Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.

    Science.gov (United States)

    Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo

    2017-03-01

    Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human feeling of pressing buttons mainly relies on kinaesthetic sensations (rather than vibrotactile sensations), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on Magneto-Rheological (MR) fluid that can convey various button-clicking sensations and to experimentally evaluate its haptic performance. The design goals of the proposed actuator were to produce sufficiently large actuation forces (resistive forces) for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations to users. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid in conjunction with finite element electromagnetism analysis. After selecting design parameters based on the parametric studies, a prototype actuator was constructed, and its performance was evaluated using a dynamic mechanical analyzer. It measured the actuator's resistive force with a varying stroke (pressed depth) up to 1 mm and a varying input current from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces from around 2 N (off-state) to over 9.5 N at 200 mA. In order to assess the prototype's performance from a haptic-application perspective, a maximum force rate was calculated to determine the just-noticeable difference in force changes over the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensations, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.
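
    The record reports a resistive force range of roughly 2 N to 9.5 N and argues that the achievable force rate is sufficient for distinct button sensations. One simple back-of-the-envelope check (not the paper's analysis) is to count just-noticeable force steps over that range under an assumed constant Weber fraction for force perception:

        import math

        f_min, f_max = 2.0, 9.5   # resistive force range from the record [N]
        weber = 0.10              # assumed Weber fraction for force (illustrative only)

        # Each just-noticeable step multiplies the force by (1 + weber),
        # so the number of distinguishable levels grows logarithmically with range.
        levels = math.log(f_max / f_min) / math.log(1.0 + weber)
        print(f"~{levels:.0f} just-noticeable force steps between {f_min} N and {f_max} N")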

  3. Rendering of Gemstones

    OpenAIRE

    Krtek, Lukáš

    2012-01-01

    The distinctive appearance of gemstones is caused by the way light reflects and refracts multiple times inside of them. The goal of this thesis is to design and implement an application for photorealistic rendering of gems. The most important effects we aim for are realistic dispersion of light and refractive caustics. For rendering we use well-known algorithm of path tracing with an experimental modification for faster computation of caustic effects. In this thesis we also design and impleme...
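
    The dispersion effect the thesis aims for can be illustrated by sampling one wavelength per path, mapping it to a refractive index with a Cauchy-type approximation, and refracting the ray with Snell's law; the Cauchy coefficients, facet normal, and ray below are hypothetical, and the thesis' actual implementation may differ.

        import numpy as np

        def cauchy_index(wavelength_nm, A=1.72, B=0.013):
            """Cauchy approximation n(lambda) = A + B / lambda^2 (lambda in micrometres);
            A and B are made-up values of roughly gem-like magnitude."""
            lam_um = wavelength_nm / 1000.0
            return A + B / lam_um**2

        def refract(d, n, eta):
            """Refract unit direction d at a surface with unit normal n (pointing
            against d); eta = n_outside / n_inside. Returns None on total
            internal reflection."""
            cos_i = -np.dot(n, d)
            sin2_t = eta**2 * (1.0 - cos_i**2)
            if sin2_t > 1.0:
                return None            # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return eta * d + (eta * cos_i - cos_t) * n

        wavelength = np.random.uniform(380.0, 740.0)    # one wavelength per path
        eta = 1.0 / cauchy_index(wavelength)            # entering the gem from air
        d = np.array([0.0, -1.0, 0.0])                  # incoming ray direction
        n = np.array([0.0, np.cos(0.3), np.sin(0.3)])   # hypothetical facet normal
        print(wavelength, refract(d, n, eta))

    Because the refracted direction depends on the sampled wavelength, paths of different wavelengths exit the stone in different directions, which is what produces the coloured "fire" and the chromatic edges of caustics.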

  4. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart.

    Science.gov (United States)

    Kesner, Samuel B; Howe, Robert D

    2011-01-01

    Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, owing to tissue motion and to the transmission limitations of catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation procedures. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system.
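
    The record combines two ideas: servoing the catheter tip to follow the moving tissue, and reflecting the measured tip force to the user's hand. A deliberately simplified control-step sketch of that combination (hypothetical gains and signals, not the authors' controller) is:

        def catheter_step(x_tip, x_tissue, f_tip, x_user_offset,
                          kp=8.0, force_scale=3.0):
            """One control step of a motion-compensated haptic catheter.
            x_tip:         measured tip position along the insertion axis [m]
            x_tissue:      estimated position of the moving tissue surface [m]
            f_tip:         force measured by the distal tip sensor [N]
            x_user_offset: offset commanded by the user at the haptic interface [m]"""
            # Motion compensation: track the tissue plus the user's commanded offset
            x_target = x_tissue + x_user_offset
            tip_velocity_cmd = kp * (x_target - x_tip)
            # Haptic rendering: return the (scaled) tip force to display to the user
            f_user = force_scale * f_tip
            return tip_velocity_cmd, f_user

        cmd, f_user = catheter_step(x_tip=0.010, x_tissue=0.012,
                                    f_tip=0.4, x_user_offset=0.001)
        print(cmd, f_user)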

  5. Rehabilitation Program Integrating Virtual Environment to Improve Orientation and Mobility Skills for People Who Are Blind.

    Science.gov (United States)

    Lahav, Orly; Schloerb, David W; Srinivasan, Mandayam A

    2015-01-01

    This paper presents the integration of a virtual environment (BlindAid) into an orientation and mobility rehabilitation program as a training aid for people who are blind. BlindAid allows the users to interact with different virtual structures and objects through auditory and haptic feedback. This research explores if and how use of the BlindAid in conjunction with a rehabilitation program can help people who are blind train themselves in familiar and unfamiliar spaces. The study focused on nine participants, who were congenitally, adventitiously, or newly blind, during their orientation and mobility rehabilitation program at the Carroll Center for the Blind (Newton, Massachusetts, USA). The research was implemented using virtual environment (VE) exploration tasks and orientation tasks in virtual environments and real spaces. The methodology encompassed both qualitative and quantitative methods, including interviews, a questionnaire, videotape recording, and user computer logs. The results demonstrated, first, that the BlindAid training gave participants additional time to explore the virtual environment systematically, and second, that it helped elucidate several issues concerning the potential strengths of the BlindAid system as a training aid for orientation and mobility for both adults and teenagers who are congenitally, adventitiously, and newly blind.

  6. ISMCR 1994: Topical Workshop on Virtual Reality. Proceedings of the Fourth International Symposium on Measurement and Control in Robotics

    Science.gov (United States)

    1994-01-01

    This symposium on measurement and control in robotics included sessions on: (1) rendering, including tactile perception and applied virtual reality; (2) applications in simulated medical procedures and telerobotics; (3) tracking sensors in a virtual environment; (4) displays for virtual reality applications; (5) sensory feedback including a virtual environment application with partial gravity simulation; and (6) applications in education, entertainment, technical writing, and animation.

  7. Haptic Control with a Robotic Gripper

    OpenAIRE

    Rody, Morgan

    2011-01-01

    The Novint Falcon is a low-cost, 3-axis, haptic device primarily designed and built for the gaming industry. Meant to replace the conventional mouse, the Novint Falcon has sub-millimeter accuracy and is capable of real time updates. The device itself has the potential to be used in telerobotics applications when coupled with a robotic gripper for example. Recently, the Intelligent Control Lab at Örebro University in Sweden built such a robotic gripper. The robotic gripper has three fingers a...

  8. Latency in Distributed Acquisition and Rendering for Telepresence Systems.

    Science.gov (United States)

    Ohl, Stephan; Willert, Malte; Staadt, Oliver

    2015-12-01

    Telepresence systems use 3D techniques to create a more natural human-centered communication over long distances. This work concentrates on the analysis of latency in telepresence systems where acquisition and rendering are distributed. Keeping latency low is important to immerse users in the virtual environment. To better understand latency problems and to identify the source of such latency, we focus on the decomposition of system latency into sub-latencies. We contribute a model of latency and show how it can be used to estimate latencies in a complex telepresence dataflow network. To compare the estimates with real latencies in our prototype, we modify two common latency measurement methods. This presented methodology enables the developer to optimize the design, find implementation issues and gain deeper knowledge about specific sources of latency.
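
    The latency model itself is not reproduced in this record. A minimal way to use the decomposition into sub-latencies, stated here as an illustration only, is to sum the per-stage latencies along each acquisition-to-display path of the dataflow network and report the worst path; the stage names and numbers below are hypothetical.

        # Hypothetical per-stage latencies of a distributed telepresence pipeline [ms]
        stages = {
            "capture":     16.7,   # camera exposure + readout
            "reconstruct": 25.0,   # depth / 3D reconstruction
            "encode":       8.0,
            "network":     12.0,
            "decode":       6.0,
            "render":      11.0,
            "display":      8.3,   # display scan-out
        }

        # Hypothetical acquisition-to-display paths through the dataflow network
        paths = {
            "video path": ["capture", "encode", "network", "decode",
                           "render", "display"],
            "3d path":    ["capture", "reconstruct", "encode", "network",
                           "decode", "render", "display"],
        }

        for name, path in paths.items():
            print(f"{name}: {sum(stages[s] for s in path):.1f} ms end-to-end")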

  9. Do vision and haptics share common representations? Implicit and explicit memory within and between modalities.

    Science.gov (United States)

    Easton, R D; Srinivas, K; Greene, A J

    1997-01-01

    Previous assessments of verbal cross-modal priming have typically been conducted with the visual and auditory modalities. Within-modal priming is always found to be substantially larger than cross-modal priming, a finding that could reflect modality modularity, or alternatively, differences between the coding of visual and auditory verbal information (i.e., geometric vs. phonological). The present experiments assessed implicit and explicit memory within and between vision and haptics, where verbal information could be coded in geometric terms. Because haptic perception of words is sequential or letter-by-letter, experiments were also conducted to isolate the effects of simultaneous versus sequential processing from the manipulation of modality. Together, the results reveal no effects of modality change on implicit or explicit tests. The authors discuss representational similarities between vision and haptics as well as image mediation as possible explanations for the results.

  10. A Vehicle Haptic Steering by Wire System Based on High Gain GPI Observers

    Directory of Open Access Journals (Sweden)

    A. Rodriguez-Angeles

    2014-01-01

    Full Text Available A vehicle steering by wire (SBW) haptic system based on high-gain generalized proportional integral (GPI) observers is introduced. The observers are used to estimate the dynamic perturbations present at the tire and the steering wheel. To ensure efficient tracking between the commanded steering wheel angle and the tire orientation angle, the estimated perturbations are canceled online. To provide a haptic interface for the driver, the estimated dynamic effects at the steering rack are fed back to the steering wheel, yielding a master-slave haptic system with bilateral communication. For implementation, only a few sensors and minimal knowledge of the dynamic model are required, which is a major advantage compared with other approaches. Only position tracking errors are fed back, while all other signals are estimated by the high-gain GPI observers. The scheme is robust to uncertainty in the input gain and cancels dynamic perturbation effects such as friction and aligning forces on the tire. Experimental results are presented on a prototype platform.
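
    The observer equations are not included in the record. As a rough, simplified illustration of the idea, the sketch below simulates a high-gain extended-state observer of the GPI type for a perturbed double integrator; one extended state estimates the lumped disturbance, which is then canceled by the control, as in the scheme described above. The plant, gains, and disturbance are all hypothetical.

        import numpy as np

        dt, T = 1e-3, 2.0
        # Plant: x1' = x2, x2' = u + d(t), with d(t) an unknown perturbation
        x1 = x2 = 0.0
        # Observer states: estimates of x1, x2, the disturbance, and its rate
        z1 = z2 = z3 = z4 = 0.0
        # Gains from expanding (s + 20)^4, placing the observer error poles at s = -20
        l3, l2, l1, l0 = 80.0, 2.4e3, 3.2e4, 1.6e5

        for k in range(int(T / dt)):
            t = k * dt
            d = 2.0 * np.sin(3.0 * t)              # hypothetical perturbation
            u = -z3 - 20.0 * z1 - 9.0 * z2         # cancel disturbance + stabilize
            # Plant integration (Euler)
            x1 += dt * x2
            x2 += dt * (u + d)
            # GPI-style observer integration
            e = x1 - z1                            # output estimation error
            z1 += dt * (z2 + l3 * e)
            z2 += dt * (u + z3 + l2 * e)
            z3 += dt * (z4 + l1 * e)
            z4 += dt * (l0 * e)

        print(f"final disturbance estimation error: {abs(z3 - d):.3f}")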

  11. Transformative Rendering of Internet Resources

    Science.gov (United States)

    2012-10-01

    using either the Firefox or Google Chrome rendering engine. The rendering server then captures a screenshot of the page and creates code that positions...be compromised at web pages the hackers had built for that hacking competition to exploit that particular OS/browser configuration. During...of risk with no benefit. They include: - The rendering server is hosted on a Linux-based operating system (OS). The OS is much more secure than the

  12. Implementation of a virtual laryngoscope system using efficient reconstruction algorithms.

    Science.gov (United States)

    Luo, Shouhua; Yan, Yuling

    2009-08-01

    Conventional fiberoptic laryngoscopy may cause discomfort to the patient, and in some cases it can lead to side effects that include perforation, infection, and hemorrhage. Virtual laryngoscopy (VL) can overcome this problem, and it may further lower the risk of operation failures. Very few virtual endoscope (VE) based investigations of the larynx have been described in the literature. CT data sets from a healthy subject were used for the VL studies. A preprocessing and region-growing algorithm for 3-D image segmentation is developed. An octree-based approach is applied in our VL system, which facilitates rapid construction of iso-surfaces. Some locating techniques are used for fast rendering and navigation (fly-through). Our VL visualization system provides efficient, real-time 'fly-through' navigation. The virtual camera can be arranged so that it moves along the airway in either direction. Snapshots were taken during fly-throughs. The system can automatically adjust the direction of the virtual camera and prevent collisions between the camera and the airway wall. A virtual laryngoscope system using the OpenGL (Open Graphics Library) platform for interactive rendering and 3D visualization of the laryngeal framework and upper airway is established. OpenGL is supported on major operating systems and works with every major windowing system. The VL system runs on regular PC workstations and was successfully tested and evaluated using CT data from a normal subject.
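
    The preprocessing and region-growing segmentation step is not spelled out in the record; a generic intensity-threshold region grower over a 3-D CT array, with hypothetical seed and threshold values, looks roughly like this:

        from collections import deque
        import numpy as np

        def region_grow(volume, seed, low, high):
            """Grow a region from `seed` over 6-connected voxels whose intensity
            lies in [low, high]; returns a boolean mask of the segmented airway."""
            mask = np.zeros(volume.shape, dtype=bool)
            mask[seed] = True
            queue = deque([seed])
            neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                         (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in neighbors:
                    nz, ny, nx = z + dz, y + dy, x + dx
                    if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                            and 0 <= nx < volume.shape[2]
                            and not mask[nz, ny, nx]
                            and low <= volume[nz, ny, nx] <= high):
                        mask[nz, ny, nx] = True
                        queue.append((nz, ny, nx))
            return mask

        # Hypothetical use: air in CT is near -1000 HU, so grow from a seed
        # placed inside the airway lumen.
        volume = np.full((64, 64, 64), 40, dtype=np.int16)   # soft-tissue stand-in
        volume[20:44, 28:36, 28:36] = -990                   # synthetic airway
        airway = region_grow(volume, seed=(32, 32, 32), low=-1100, high=-500)
        print(airway.sum(), "voxels segmented")

    The resulting mask is the kind of input an octree-based iso-surface step could then turn into renderable geometry for the fly-through.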

  13. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    Science.gov (United States)

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end, attentional bias (faster RTs toward tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. Lack of short-tool group results suggested that hidden-tool group results were specific to haptic inputs. In conclusion, (1) allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting haptic input provides sufficient information for directing attention along the tool.

  14. Haptic perception disambiguates visual perception of 3D shape

    NARCIS (Netherlands)

    Wijntjes, Maarten W A; Volcic, Robert; Pont, Sylvia C.; Koenderink, Jan J.; Kappers, Astrid M L

    We studied the influence of haptics on visual perception of three-dimensional shape. Observers were shown pictures of an oblate spheroid in two different orientations. A gauge-figure task was used to measure their perception of the global shape. In the first two sessions only vision was used. The

  15. Using Modality Replacement to Facilitate Communication between Visually and Hearing-Impaired People

    DEFF Research Database (Denmark)

    Moustakas, K.; Tzovaras, D.; Dybkjaer, L.

    2011-01-01

    Using sign language, speech, and haptics as communication modalities, a virtual treasure-hunting game serves as an entertainment and educational tool for visually- and hearing-impaired users.

  16. Reducing the motor response in haptic parallel matching eliminates the typically observed gender difference.

    Science.gov (United States)

    van Mier, Hanneke I

    2016-01-01

    When making two bars haptically parallel to each other, large deviations have been observed, most likely caused by the bias of a hand-centered egocentric reference frame. A consistent finding is that women show significantly larger deviations than men when performing this task. It has been suggested that this difference might be due to the fact that women are more egocentrically oriented than men or are less efficient in overcoming the egocentric bias of the hand. If this is indeed the case, reducing the bias of the egocentric reference frame should eliminate the above-mentioned gender difference. This was investigated in the current study. Sixty participants (30 men, 30 women) were instructed to haptically match (task HP) the orientation of a test bar with the dominant hand to the orientation of a reference bar that was perceived with the non-dominant hand. In a haptic visual task (task HV), in which only the reference bar and exploring hand were out of view, no motor response was required, but participants had to "match" the perceived orientation by verbally naming the parallel orientation that was read out on a test protractor. Both females and males performed better in the HV task than in the HP task. Significant gender effects were only found in the haptic parallelity task (HP), corroborating the idea that women perform at the same level as men when the egocentric bias of the hand is reduced.

  17. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    Science.gov (United States)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen an impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a piece of virtual object is projected into the real world with which researchers could interact. There are several limitations to the purely VR or AR application when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) should be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real-time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information -i.e. depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique will blend the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real-time into the virtual environment. Notice the preservation of the object
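
    The record emphasizes preserving depth when the live stereoscopic feed is blended into the virtual environment. One standard ingredient for this, stated here as an assumption rather than the authors' pipeline, is recovering per-pixel depth from stereo disparity with the pinhole relation z = f * B / d, which can then control where the video pixels are placed in the virtual scene.

        import numpy as np

        def depth_from_disparity(disparity_px, focal_px, baseline_m):
            """Pinhole stereo relation: depth z = f * B / d (invalid where d <= 0)."""
            disparity = np.asarray(disparity_px, dtype=float)
            depth = np.full(disparity.shape, np.inf)
            valid = disparity > 0
            depth[valid] = focal_px * baseline_m / disparity[valid]
            return depth

        # Hypothetical rover stereo rig: 700 px focal length, 12 cm baseline
        disparity = np.array([[35.0, 14.0],
                              [ 7.0,  0.0]])
        print(depth_from_disparity(disparity, focal_px=700.0, baseline_m=0.12))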

  18. Frontoparietal white matter integrity predicts haptic performance in chronic stroke

    Directory of Open Access Journals (Sweden)

    Alexandra L. Borstad

    2016-01-01

    Full Text Available Frontoparietal white matter supports information transfer between brain areas involved in complex haptic tasks such as somatosensory discrimination. The purpose of this study was to gain an understanding of the relationship between microstructural integrity of frontoparietal network white matter and haptic performance in persons with chronic stroke and to compare frontoparietal network integrity in participants with stroke and age matched control participants. Nineteen individuals with stroke and 16 controls participated. Haptic performance was quantified using the Hand Active Sensation Test (HASTe), an 18-item match-to-sample test of weight and texture discrimination. Three tesla MRI was used to obtain diffusion-weighted and high-resolution anatomical images of the whole brain. Probabilistic tractography was used to define 10 frontoparietal tracts total; Four intrahemispheric tracts measured bilaterally 1) thalamus to primary somatosensory cortex (T–S1), 2) thalamus to primary motor cortex (T–M1), 3) primary to secondary somatosensory cortex (S1 to SII) and 4) primary somatosensory cortex to middle frontal gyrus (S1 to MFG) and, 2 interhemispheric tracts; S1–S1 and precuneus interhemispheric. A control tract outside the network, the cuneus interhemispheric tract, was also examined. The diffusion metrics fractional anisotropy (FA), mean diffusivity (MD), axial (AD) and radial diffusivity (RD) were quantified for each tract. Diminished FA and elevated MD values are associated with poorer white matter integrity in chronic stroke. Nine of 10 tracts quantified in the frontoparietal network had diminished structural integrity poststroke compared to the controls. The precuneus interhemispheric tract was not significantly different between groups. Principal component analysis across all frontoparietal white matter tract MD values indicated a single factor explained 47% and 57% of the variance in tract mean diffusivity in stroke and control groups respectively

  19. Frontoparietal white matter integrity predicts haptic performance in chronic stroke.

    Science.gov (United States)

    Borstad, Alexandra L; Choi, Seongjin; Schmalbrock, Petra; Nichols-Larsen, Deborah S

    2016-01-01

    Frontoparietal white matter supports information transfer between brain areas involved in complex haptic tasks such as somatosensory discrimination. The purpose of this study was to gain an understanding of the relationship between microstructural integrity of frontoparietal network white matter and haptic performance in persons with chronic stroke and to compare frontoparietal network integrity in participants with stroke and age matched control participants. Nineteen individuals with stroke and 16 controls participated. Haptic performance was quantified using the Hand Active Sensation Test (HASTe), an 18-item match-to-sample test of weight and texture discrimination. Three tesla MRI was used to obtain diffusion-weighted and high-resolution anatomical images of the whole brain. Probabilistic tractography was used to define 10 frontoparietal tracts total; Four intrahemispheric tracts measured bilaterally 1) thalamus to primary somatosensory cortex (T-S1), 2) thalamus to primary motor cortex (T-M1), 3) primary to secondary somatosensory cortex (S1 to SII) and 4) primary somatosensory cortex to middle frontal gyrus (S1 to MFG) and, 2 interhemispheric tracts; S1-S1 and precuneus interhemispheric. A control tract outside the network, the cuneus interhemispheric tract, was also examined. The diffusion metrics fractional anisotropy (FA), mean diffusivity (MD), axial (AD) and radial diffusivity (RD) were quantified for each tract. Diminished FA and elevated MD values are associated with poorer white matter integrity in chronic stroke. Nine of 10 tracts quantified in the frontoparietal network had diminished structural integrity poststroke compared to the controls. The precuneus interhemispheric tract was not significantly different between groups. Principal component analysis across all frontoparietal white matter tract MD values indicated a single factor explained 47% and 57% of the variance in tract mean diffusivity in stroke and control groups respectively. Age
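
    The diffusion metrics named in entries 18 and 19 have standard definitions in terms of the eigenvalues of the diffusion tensor; the sketch below computes them for a hypothetical set of eigenvalues and is not the study's processing pipeline.

        import numpy as np

        def diffusion_metrics(l1, l2, l3):
            """Standard DTI scalars from tensor eigenvalues (l1 >= l2 >= l3)."""
            md = (l1 + l2 + l3) / 3.0        # mean diffusivity
            ad = l1                          # axial diffusivity
            rd = (l2 + l3) / 2.0             # radial diffusivity
            fa = (np.sqrt(0.5)
                  * np.sqrt((l1 - l2)**2 + (l2 - l3)**2 + (l1 - l3)**2)
                  / np.sqrt(l1**2 + l2**2 + l3**2))   # fractional anisotropy
            return fa, md, ad, rd

        # Hypothetical eigenvalues in mm^2/s, a typical order of magnitude for white matter
        fa, md, ad, rd = diffusion_metrics(1.6e-3, 0.4e-3, 0.3e-3)
        print(f"FA={fa:.2f}  MD={md:.2e}  AD={ad:.2e}  RD={rd:.2e}")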

  20. Neodymium:YAG laser cutting of intraocular lens haptics.

    Science.gov (United States)

    Gorn, R A; Steinert, R F

    1985-11-01

    Neodymium:YAG laser cutting of polymethylmethacrylate and polypropylene anterior chamber and posterior chamber intraocular lens haptics was studied in terms of ease of transection and physical structure of the cut areas as seen by scanning electron microscopy. A marked difference was discovered, with the polymethylmethacrylate cutting easily along transverse planes, whereas the polypropylene resisted cutting along longitudinal fibers. Clinical guidelines are presented.