WorldWideScience

Sample records for robotic interaction hri

  1. iCub-HRI: A Software Framework for Complex Human–Robot Interaction Scenarios on the iCub Humanoid Robot

    Directory of Open Access Journals (Sweden)

    Tobias Fischer

    2018-03-01

Full Text Available Generating complex, human-like behavior in a humanoid robot like the iCub requires the integration of a wide range of open source components and a scalable cognitive architecture. Hence, we present the iCub-HRI library, which provides convenience wrappers for components related to perception (object recognition, agent tracking, speech recognition, and touch detection), object manipulation (basic and complex motor actions), and social interaction (speech synthesis and joint attention), exposed as a C++ library with bindings for Java (allowing iCub-HRI to be used within Matlab) and Python. In addition to previously integrated components, the library allows for simple extension to new components and rapid prototyping by adapting to changes in interfaces between components. We also provide a set of modules which make use of the library, such as a high-level knowledge acquisition module and an action recognition module. The proposed architecture has been successfully employed for a complex human–robot interaction scenario involving the acquisition of language capabilities, execution of goal-oriented behavior and expression of a verbal narrative of the robot’s experience in the world. Accompanying this paper is a tutorial which allows a subset of this interaction to be reproduced. The architecture is aimed at researchers familiarizing themselves with the iCub ecosystem, as well as expert users, and we expect the library to be widely used in the iCub community.

  2. Intelligent lead: a novel HRI sensor for guide robots.

    Science.gov (United States)

    Cho, Keum-Bae; Lee, Beom-Hee

    2012-01-01

This paper addresses the introduction of a new Human Robot Interaction (HRI) sensor for guide robots. Guide robots for geriatric patients or the visually impaired should follow the user's control commands while keeping a certain desired distance, allowing the user to move freely. Therefore, it is necessary to acquire control commands and the user's position on a real-time basis. We suggest a new sensor fusion system to achieve this objective, which we call the "intelligent lead". The objective of the intelligent lead is to acquire a stable user-to-robot distance, speed-control volume and turn-control volume, even when the robot platform carrying the intelligent lead is shaken on uneven ground. In this paper we explain a precise Extended Kalman Filter (EKF) procedure for this. The intelligent lead physically consists of a Kinect sensor, a serial linkage fitted with eight rotary encoders, and an IMU (Inertial Measurement Unit), and their measurements are fused by the EKF. A mobile robot was designed to test the performance of the proposed sensor system. After installing the intelligent lead on the mobile robot, several tests were conducted to verify that the mobile robot with the intelligent lead is capable of reaching its goal points while maintaining the appropriate distance between the robot and the user. The results show that the intelligent lead proposed in this paper can be used as a new HRI sensor that combines the functions of a joystick and a distance sensor in mobile environments where the robot and the user are moving at the same time.
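The core of the intelligent lead is sensor fusion by Kalman filtering. As a minimal illustration of the idea (not the paper's actual multi-state EKF), the sketch below fuses a motion-model prediction of the user-to-robot distance with a noisy distance measurement, as the encoders and Kinect would provide; all noise parameters are assumed for illustration.

```python
import numpy as np

# Minimal 1-D Kalman-filter sketch of the fusion behind the "intelligent lead":
# blend a predicted user-robot distance (motion model) with a noisy distance
# measurement (linkage encoders / Kinect). The real system is a full EKF over
# several coupled states; this scalar version only illustrates the mechanism.

def kalman_step(x, P, u, z, q=0.01, r=0.1):
    """One predict/update cycle for a scalar distance state.
    x: distance estimate [m], P: its variance,
    u: predicted change in distance this step (motion model),
    z: measured distance, q/r: process/measurement noise variances (assumed)."""
    # Predict
    x_pred = x + u
    P_pred = P + q
    # Update
    K = P_pred / (P_pred + r)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # blend prediction and measurement
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Track a user drifting from 1.0 m to 1.5 m behind the robot
x, P = 1.0, 1.0
true_d = 1.0
rng = np.random.default_rng(0)
for _ in range(50):
    true_d += 0.01
    z = true_d + rng.normal(0, 0.05)   # noisy encoder/Kinect reading
    x, P = kalman_step(x, P, u=0.01, z=z)
print(round(x, 2))  # estimate tracks the true distance near 1.5 m
```

The same predict/update structure generalizes to the vector case (distance plus control volumes), with the Jacobians of the motion and measurement models taking the place of the scalar gains.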

  3. 6-REXOS: Upper Limb Exoskeleton Robot with Improved pHRI

    Directory of Open Access Journals (Sweden)

    Malin Gunasekara

    2015-04-01

Full Text Available Close interaction can be observed between an exoskeleton robot and its wearer. Therefore, appropriate physical human-robot interaction (pHRI) should be considered when designing an exoskeleton robot to provide safe and comfortable motion assistance. Different features have been used in recent studies to enhance the pHRI in upper-limb exoskeleton robots. However, less attention has been given to integrating kinematic redundancy into upper-limb exoskeleton robots to improve the pHRI. In this context, this paper proposes a six-degrees-of-freedom (DoF) upper-limb exoskeleton robot (6-REXOS) for the motion assistance of physically weak individuals. The 6-REXOS uses a kinematically different structure to that of the human lower arm, where the exoskeleton robot is worn. The 6-REXOS has four active DoFs to generate the motion of the human lower arm. Furthermore, two flexible bellow couplings are attached to the wrist and elbow joints to generate two passive DoFs. These couplings not only allow translational motion in the wrist and elbow joints but also introduce redundancy into the robot. Moreover, the compliance of the flexible couplings helps to avoid misalignments between human and robot joint axes. The redundancy of the 6-REXOS is verified based on the manipulability index, minimum singular value, condition number and manipulability ellipsoids. The 6-REXOS and a four-DoF exoskeleton robot are compared to verify the manipulation advantage due to the redundancy. The four-DoF exoskeleton robot is designed by excluding the two passive DoFs of the 6-REXOS. In addition, a kinematic model is proposed for the human lower arm to validate the performance of the 6-REXOS. Kinematic analysis and simulations are carried out to validate both the 6-REXOS and the human-lower-arm model.
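The redundancy metrics named above all derive from the singular values of the robot Jacobian. The sketch below computes them for an arbitrary illustrative planar 3R arm (the 6-REXOS geometry is not reproduced here; the Jacobian and joint angles are assumptions for demonstration only).

```python
import numpy as np

# Redundancy/dexterity metrics from a Jacobian J (task vel. = J @ joint vel.)

def manipulability(J):
    """Yoshikawa manipulability index w = sqrt(det(J J^T))."""
    return np.sqrt(np.linalg.det(J @ J.T))

def min_singular_value(J):
    return np.linalg.svd(J, compute_uv=False).min()

def condition_number(J):
    s = np.linalg.svd(J, compute_uv=False)
    return s.max() / s.min()

# Illustrative planar 3R arm with unit link lengths (assumed example):
# 3 joints driving a 2-D task space, hence one degree of redundancy.
def planar_3r_jacobian(q):
    q1, q12, q123 = q[0], q[0] + q[1], q[0] + q[1] + q[2]
    s, c = np.sin, np.cos
    return np.array([
        [-s(q1) - s(q12) - s(q123), -s(q12) - s(q123), -s(q123)],
        [ c(q1) + c(q12) + c(q123),  c(q12) + c(q123),  c(q123)],
    ])

J = planar_3r_jacobian([0.3, 0.5, 0.4])
print(manipulability(J), min_singular_value(J), condition_number(J))
```

Larger manipulability and minimum singular value, and a condition number closer to 1, indicate a better-conditioned posture; the manipulability ellipsoid is simply the image of the unit joint-velocity sphere under J.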

  4. Influence of Attachment Pressure and Kinematic Configuration on pHRI with Wearable Robots

    Directory of Open Access Journals (Sweden)

    André Schiele

    2009-01-01

Full Text Available The goal of this paper is to show the influence of exoskeleton attachment, such as the pressure on the fixation cuffs and the alignment of the robot joint to the human joint, on subjective and objective performance metrics (i.e., comfort, mental load, interface forces, tracking error and available workspace) during a typical physical human-robot interaction (pHRI) experiment. A mathematical model of a single-degree-of-freedom interaction between a human and a wearable robot is presented and used to explain the causes and characteristics of the interface forces between the two. The pHRI model parameters (real joint offsets, attachment stiffness) are estimated from experimental interface force measurements acquired during tests with 14 subjects. Insights gained from the model allow optimisation of the exoskeleton kinematics. This paper shows that offsets of more than ±10 cm exist between human and robot axes of rotation, even if a well-designed exoskeleton is aligned properly before motion. Such offsets can create interface loads of up to 200 N and 1.5 Nm in the absence of actuation. The optimal attachment pressure is determined to be 20 mmHg and the attachment stiffness is about 300 N/m. Inclusion of passive compensation joints in the exoskeleton is shown to lower the interaction forces significantly, which enables a more ergonomic pHRI.
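A back-of-the-envelope calculation shows why axis offsets produce parasitic loads. The displacement model below (cuff slides by roughly the chord subtended by the offset over the joint excursion) is a deliberate simplification, far cruder than the paper's pHRI model; only the ~300 N/m stiffness figure is taken from the abstract.

```python
import numpy as np

# Toy model (an assumption, not the paper's): an offset d between the human
# and robot joint axes makes the cuff attachment point slide by roughly the
# chord 2*d*sin(theta/2) over a joint excursion theta; a cuff of stiffness k
# then resists with F = k * displacement.

def interface_force(d_offset, theta, k=300.0):
    """d_offset: axis offset [m], theta: joint excursion [rad],
    k: attachment stiffness [N/m] (~300 N/m per the study)."""
    displacement = 2.0 * d_offset * np.sin(theta / 2.0)
    return k * displacement

# A 10 cm offset over a 90-degree flexion:
F = interface_force(0.10, np.pi / 2)
print(round(F, 1))  # tens of newtons of parasitic cuff load in this toy model
```

Even this crude estimate lands in the tens of newtons for plausible offsets, consistent with the abstract's point that passive compensation joints (which let the cuff displacement go unresisted) substantially reduce interaction forces.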

  5. Interaction with Soft Robotic Tentacles

    DEFF Research Database (Denmark)

    Jørgensen, Jonas

    2018-01-01

    Soft robotics technology has been proposed for a number of applications that involve human-robot interaction. In this tabletop demonstration it is possible to interact with two soft robotic platforms that have been used in human-robot interaction experiments (also accepted to HRI'18 as a Late...

  6. HRI caught on film

    NARCIS (Netherlands)

    Bartneck, C.; Kanda, T.

    2007-01-01

    The Human Robot Interaction 2007 conference hosted a video session, in which movies of interesting, important, illustrative, or humorous HRI research moments are shown. This paper summarizes the abstracts of the presented videos. Robots and humans do not always behave as expected and the results can

  7. The Human-Robot Interaction Operating System

    Science.gov (United States)

    Fong, Terrence; Kunz, Clayton; Hiatt, Laura M.; Bugajska, Magda

    2006-01-01

    In order for humans and robots to work effectively together, they need to be able to converse about abilities, goals and achievements. Thus, we are developing an interaction infrastructure called the "Human-Robot Interaction Operating System" (HRI/OS). The HRI/OS provides a structured software framework for building human-robot teams, supports a variety of user interfaces, enables humans and robots to engage in task-oriented dialogue, and facilitates integration of robots through an extensible API.

  8. Socially intelligent robots: dimensions of human-robot interaction.

    Science.gov (United States)

    Dautenhahn, Kerstin

    2007-04-29

    Social intelligence in robots has a quite recent history in artificial intelligence and robotics. However, it has become increasingly apparent that social and interactive skills are necessary requirements in many application areas and contexts where robots need to interact and collaborate with other robots or humans. Research on human-robot interaction (HRI) poses many challenges regarding the nature of interactivity and 'social behaviour' in robot and humans. The first part of this paper addresses dimensions of HRI, discussing requirements on social skills for robots and introducing the conceptual space of HRI studies. In order to illustrate these concepts, two examples of HRI research are presented. First, research is surveyed which investigates the development of a cognitive robot companion. The aim of this work is to develop social rules for robot behaviour (a 'robotiquette') that is comfortable and acceptable to humans. Second, robots are discussed as possible educational or therapeutic toys for children with autism. The concept of interactive emergence in human-child interactions is highlighted. Different types of play among children are discussed in the light of their potential investigation in human-robot experiments. The paper concludes by examining different paradigms regarding 'social relationships' of robots and people interacting with them.

  9. The Impact of the Contingency of Robot Feedback for HRI

    DEFF Research Database (Denmark)

    Fischer, Kerstin; Lohan, Katrin Solveig; Saunders, Joe

    2013-01-01

robot iCub on a set of shapes and on a stacking task in two conditions, once with socially contingent, nonverbal feedback implemented in response to different gaze and looming behaviors of the human tutor, and once with non-contingent, saliency-based feedback. The results of the analysis of participants’ linguistic behaviors in the two conditions show that contingency has an impact on the complexity and the pre-structuring of the task for the robot, i.e. on the participants’ tutoring behaviors. Contingency thus plays a considerable role for learning by demonstration.

  10. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  11. Affect in Human-Robot Interaction

    Science.gov (United States)

    2014-01-01

(Abstract unavailable: the retrieved text contains only fragments of the report's reference list and its guiding research questions, e.g. "What approaches, theories, representations, and experimental methods inform affective HRI research?")

  12. Human-Robot Interaction: Status and Challenges.

    Science.gov (United States)

    Sheridan, Thomas B

    2016-06-01

The current status of human-robot interaction (HRI) is reviewed, and key current research challenges for the human factors community are described. Robots have evolved from continuous human-controlled master-slave servomechanisms for handling nuclear waste to a broad range of robots incorporating artificial intelligence for many applications and under human supervisory control. This mini-review describes HRI developments in four application areas and the corresponding challenges for human factors research. In addition to a plethora of research papers, evidence of success is manifest in live demonstrations of robot capability under various forms of human control. HRI is a rapidly evolving field. Specialized robots under human teleoperation have proven successful in hazardous environments and medical applications, as have specialized telerobots under human supervisory control for space and repetitive industrial tasks. Research in the areas of self-driving cars, intimate collaboration with humans in manipulation tasks, human control of humanoid robots for hazardous environments, and social interaction with robots is at an initial stage. The efficacy of humanoid general-purpose robots has yet to be proven. HRI is now applied in almost all robot tasks, including manufacturing, space, aviation, undersea, surgery, rehabilitation, agriculture, education, package fetch and delivery, policing, and military operations. © 2016, Human Factors and Ergonomics Society.

  13. Toward a framework for levels of robot autonomy in human-robot interaction.

    Science.gov (United States)

    Beer, Jenay M; Fisk, Arthur D; Rogers, Wendy A

    2014-07-01

    A critical construct related to human-robot interaction (HRI) is autonomy, which varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots may interact with one another. Thus, there is a need to understand HRI by identifying variables that influence - and are influenced by - robot autonomy. Our overarching goal is to develop a framework for levels of robot autonomy in HRI. To reach this goal, the framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, the framework proposes a process for determining a robot's autonomy level, by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as guidelines to determine autonomy, categorize the LORA along a qualitative taxonomy, and consider which HRI variables (e.g., acceptance, situation awareness, reliability) may be influenced by the LORA.

  14. Accelerating Robot Development through Integral Analysis of Human-Robot Interaction

    NARCIS (Netherlands)

    Kooijmans, T.; Kanda, T.; Bartneck, C.; Ishiguro, H.; Hagita, N.

    2007-01-01

    The development of interactive robots is a complicated process, involving a plethora of psychological, technical, and contextual influences. To design a robot capable of operating "intelligently" in everyday situations, one needs a profound understanding of human-robot interaction (HRI). We propose

  15. The Tactile Ethics of Soft Robotics: Designing Wisely for Human-Robot Interaction.

    Science.gov (United States)

    Arnold, Thomas; Scheutz, Matthias

    2017-06-01

    Soft robots promise an exciting design trajectory in the field of robotics and human-robot interaction (HRI), promising more adaptive, resilient movement within environments as well as a safer, more sensitive interface for the objects or agents the robot encounters. In particular, tactile HRI is a critical dimension for designers to consider, especially given the onrush of assistive and companion robots into our society. In this article, we propose to surface an important set of ethical challenges for the field of soft robotics to meet. Tactile HRI strongly suggests that soft-bodied robots balance tactile engagement against emotional manipulation, model intimacy on the bonding with a tool not with a person, and deflect users from personally and socially destructive behavior the soft bodies and surfaces could normally entice.

  16. Human-Robot Interaction

    Science.gov (United States)

    Sandor, Aniko; Cross, E. Vincent, II; Chang, Mai Lee

    2015-01-01

    Human-robot interaction (HRI) is a discipline investigating the factors affecting the interactions between humans and robots. It is important to evaluate how the design of interfaces affect the human's ability to perform tasks effectively and efficiently when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. For efficient and effective remote navigation of a rover, a human operator needs to be aware of the robot's environment. However, during teleoperation, operators may get information about the environment only through a robot's front-mounted camera causing a keyhole effect. The keyhole effect reduces situation awareness which may manifest in navigation issues such as higher number of collisions, missing critical aspects of the environment, or reduced speed. One way to compensate for the keyhole effect and the ambiguities operators experience when they teleoperate a robot is adding multiple cameras and including the robot chassis in the camera view. Augmented reality, such as overlays, can also enhance the way a person sees objects in the environment or in camera views by making them more visible. Scenes can be augmented with integrated telemetry, procedures, or map information. Furthermore, the addition of an exocentric (i.e., third-person) field of view from a camera placed in the robot's environment may provide operators with the additional information needed to gain spatial awareness of the robot. 
Two research studies investigated possible mitigation approaches to address the keyhole effect: 1) combining the inclusion of the robot chassis in the camera view with augmented reality overlays, and 2) modifying the camera

  17. Simplified Human-Robot Interaction: Modeling and Evaluation

    Directory of Open Access Journals (Sweden)

    Balazs Daniel

    2013-10-01

Full Text Available In this paper a novel concept of human-robot interaction (HRI) modeling is proposed. Incorporating factors like trust in automation, situational awareness, expertise and expectations, a new user experience framework is formed for industrial robots. Service Oriented Robot Operation, proposed in a previous paper, creates an abstraction level in HRI and is also included in the framework. The concept is evaluated with exhaustive tests. Results show that significant improvement in task execution can be achieved and that the new system is more usable for operators with little robotics experience, the typical personnel of small and medium-sized enterprises (SMEs).

  18. Toward Multimodal Human-Robot Interaction to Enhance Active Participation of Users in Gait Rehabilitation.

    Science.gov (United States)

    Gui, Kai; Liu, Honghai; Zhang, Dingguo

    2017-11-01

Robotic exoskeletons for physical rehabilitation have been utilized in recent years for retraining patients suffering from paraplegia and enhancing motor recovery. However, users are not voluntarily involved in most systems. This paper aims to develop a locomotion trainer with multiple gait patterns, which can be controlled by the active motion intention of users. A multimodal human-robot interaction (HRI) system is established to enhance the subject's active participation during gait rehabilitation, which includes cognitive HRI (cHRI) and physical HRI (pHRI). The cHRI adopts a brain-computer interface based on steady-state visual evoked potentials. The pHRI is realized via admittance control based on electromyography. A central pattern generator is utilized to produce rhythmic and continuous lower-limb joint trajectories, and its state variables are regulated by the cHRI and pHRI. A custom-made leg exoskeleton prototype with the proposed multimodal HRI is tested on healthy subjects and stroke patients. The results show that voluntary and active participation can be effectively elicited to achieve various assistive gait patterns.
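The pHRI channel described above rests on an admittance law: an interaction torque estimate (here, derived from EMG) is mapped to a velocity command, so stronger voluntary effort drives the pattern faster. The sketch below shows the discrete-time form; the mass/damping parameters and the constant-torque input are illustrative assumptions, not the paper's values.

```python
# Discrete-time admittance control sketch: M*dv/dt + B*v = tau, integrated
# with forward Euler. tau would come from an EMG-to-torque mapping in the
# actual system; here it is a constant stand-in effort.

def admittance_step(v, tau, M=2.0, B=5.0, dt=0.01):
    """One Euler step of the admittance dynamics; returns the new velocity
    command. M: virtual inertia, B: virtual damping (assumed values)."""
    dv = (tau - B * v) / M
    return v + dv * dt

v = 0.0
for _ in range(1000):            # constant 5 Nm voluntary effort for 10 s
    v = admittance_step(v, tau=5.0)
print(round(v, 2))  # settles near tau/B = 1.0 rad/s
```

Lowering B makes the exoskeleton feel lighter (more velocity per unit effort), which is exactly the knob such systems tune to balance assistance against active participation.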

  19. Warning Signals for Poor Performance Improve Human-Robot Interaction

    NARCIS (Netherlands)

    van den Brule, Rik; Bijlstra, Gijsbert; Dotsch, Ron; Haselager, Pim; Wigboldus, Daniel HJ

    2016-01-01

    The present research was aimed at investigating whether human-robot interaction (HRI) can be improved by a robot’s nonverbal warning signals. Ideally, when a robot signals that it cannot guarantee good performance, people could take preventive actions to ensure the successful completion of the

  20. The Influence of Social Interaction on the Perception of Emotional Expression: A Case Study with a Robot Head

    Science.gov (United States)

    Murray, John C.; Cañamero, Lola; Bard, Kim A.; Ross, Marina Davila; Thorsteinsson, Kate

In this paper we focus primarily on the influence that socio-emotional interaction has on the perception of emotional expression by a robot. We also investigate and discuss the importance of emotion expression in socially interactive situations involving human-robot interaction (HRI), and show the importance of utilising emotion expression when dealing with interactive robots that are to learn and develop in socially situated environments. We discuss early expressional development and the function of emotion in human communication, and how this can improve HRI communication. Finally, we provide experimental results showing how emotion-rich interaction via emotion expression can affect the HRI process by providing additional information.

  1. Advancing the Strategic Messages Affecting Robot Trust Effect: The Dynamic of User- and Robot-Generated Content on Human-Robot Trust and Interaction Outcomes.

    Science.gov (United States)

    Liang, Yuhua Jake; Lee, Seungcheol Austin

    2016-09-01

    Human-robot interaction (HRI) will soon transform and shift the communication landscape such that people exchange messages with robots. However, successful HRI requires people to trust robots, and, in turn, the trust affects the interaction. Although prior research has examined the determinants of human-robot trust (HRT) during HRI, no research has examined the messages that people received before interacting with robots and their effect on HRT. We conceptualize these messages as SMART (Strategic Messages Affecting Robot Trust). Moreover, we posit that SMART can ultimately affect actual HRI outcomes (i.e., robot evaluations, robot credibility, participant mood) by affording the persuasive influences from user-generated content (UGC) on participatory Web sites. In Study 1, participants were assigned to one of two conditions (UGC/control) in an original experiment of HRT. Compared with the control (descriptive information only), results showed that UGC moderated the correlation between HRT and interaction outcomes in a positive direction (average Δr = +0.39) for robots as media and robots as tools. In Study 2, we explored the effect of robot-generated content but did not find similar moderation effects. These findings point to an important empirical potential to employ SMART in future robot deployment.

  2. Human-Robot Interaction

    Science.gov (United States)

    Rochlis-Zumbado, Jennifer; Sandor, Aniko; Ezer, Neta

    2012-01-01

    Risk of Inadequate Design of Human and Automation/Robotic Integration (HARI) is a new Human Research Program (HRP) risk. HRI is a research area that seeks to understand the complex relationship among variables that affect the way humans and robots work together to accomplish goals. The DRP addresses three major HRI study areas that will provide appropriate information for navigation guidance to a teleoperator of a robot system, and contribute to the closure of currently identified HRP gaps: (1) Overlays -- Use of overlays for teleoperation to augment the information available on the video feed (2) Camera views -- Type and arrangement of camera views for better task performance and awareness of surroundings (3) Command modalities -- Development of gesture and voice command vocabularies

  3. Augmented Robotics Dialog System for Enhancing Human–Robot Interaction

    Science.gov (United States)

Alonso-Martín, Fernando; Castro-González, Álvaro; de Gorostiza Luengo, Francisco Javier Fernandez; Salichs, Miguel Ángel

    2015-01-01

Augmented reality, augmented television and second screen are cutting-edge technologies that provide end users with extra and enhanced information related to certain events in real time. This enriched information helps users better understand such events, while providing a more satisfactory experience. In the present paper, we apply this main idea to human–robot interaction (HRI), to how users and robots interchange information. The ultimate goal of this paper is to improve the quality of HRI, developing a new dialog manager system that incorporates enriched information from the semantic web. This work presents the augmented robotic dialog system (ARDS), which uses natural language understanding mechanisms to provide two features: (i) a non-grammar multimodal input (verbal and/or written) text; and (ii) a contextualization of the information conveyed in the interaction. This contextualization is achieved by information enrichment techniques that link the extracted information from the dialog with extra information about the world available in semantic knowledge bases. This enriched or contextualized information (information enrichment, semantic enhancement or contextualized information are used interchangeably in the rest of this paper) offers many possibilities in terms of HRI. For instance, it can enhance the robot's pro-activeness during a human–robot dialog (the enriched information can be used to propose new topics during the dialog, while ensuring a coherent interaction). Another possibility is to display additional multimedia content related to the enriched information on a visual device. This paper describes the ARDS and shows a proof of concept of its applications. PMID:26151202

  4. Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior

    OpenAIRE

    Fiore, Stephen M.; Wiltshire, Travis J.; Lobato, Emilio J. C.; Jentsch, Florian G.; Huang, Wesley H.; Axelrod, Benjamin

    2013-01-01

    As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relatio...

  5. Human-Robot Teams for Unknown and Uncertain Environments

    Science.gov (United States)

    Fong, Terry

    2015-01-01

Human-robot interaction, often referred to as HRI by researchers, is the study of interactions between humans and robots. It is a multidisciplinary field with contributions from human-computer interaction and artificial intelligence.

  6. Development of Methodologies, Metrics, and Tools for Investigating Human-Robot Interaction in Space Robotics

    Science.gov (United States)

    Ezer, Neta; Zumbado, Jennifer Rochlis; Sandor, Aniko; Boyer, Jennifer

    2011-01-01

Human-robot systems are expected to have a central role in future space exploration missions that extend beyond low-earth orbit [1]. As part of a directed research project funded by NASA's Human Research Program (HRP), researchers at the Johnson Space Center have started to use a variety of techniques, including literature reviews, case studies, knowledge capture, field studies, and experiments to understand critical human-robot interaction (HRI) variables for current and future systems. Activities accomplished to date include observations of the International Space Station's Special Purpose Dexterous Manipulator (SPDM), Robonaut, and Space Exploration Vehicle (SEV), as well as interviews with robotics trainers, robot operators, and developers of gesture interfaces. A survey of methods and metrics used in HRI was completed to identify those most applicable to space robotics. These methods and metrics included techniques and tools associated with task performance, the quantification of human-robot interactions and communication, usability, human workload, and situation awareness. The need for more research in areas such as natural interfaces, compensation for loss of signal and poor video quality, psycho-physiological feedback, and common HRI testbeds was identified. The initial findings from these activities and planned future research are discussed.

  7. Visual exploration and analysis of human-robot interaction rules

    Science.gov (United States)

    Zhang, Hui; Boyles, Michael J.

    2013-01-01

    We present a novel interaction paradigm for the visual exploration, manipulation and analysis of human-robot interaction (HRI) rules; our development is implemented using a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate the interaction design and knowledge discovery process. HRI is often concerned with manipulations of multi-modal signals, events, and commands that form various kinds of interaction rules. Depicting, manipulating and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle. This, in turn, makes design-level verification nearly impossible to perform in an earlier phase. In our work, we exploit a drag-and-drop user interface and visual languages to support depicting responsive behaviors from social participants when they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to provide still further improvement to our programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow the researchers to generate, trace and study the perception-action dynamics with a social interaction simulation to verify and refine their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena such as joint attention and sequential behavioral patterns from multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. As far as we are aware, this paper reports the first program manipulation paradigm that integrates visual programming

  8. Optimized Assistive Human-Robot Interaction Using Reinforcement Learning.

    Science.gov (United States)

    Modares, Hamidreza; Ranatunga, Isura; Lewis, Frank L; Popa, Dan O

    2016-03-01

An intelligent human-robot interaction (HRI) system with adjustable robot behavior is presented. The proposed HRI system assists the human operator to perform a given task with minimum workload demands and optimizes the overall human-robot system performance. Motivated by human factor studies, the presented control structure consists of two control loops. First, a robot-specific neuro-adaptive controller is designed in the inner loop to make the unknown nonlinear robot behave like a prescribed robot impedance model as perceived by a human operator. In contrast to existing neural network and adaptive impedance-based control methods, no information of the task performance or the prescribed robot impedance model parameters is required in the inner loop. Then, a task-specific outer-loop controller is designed to find the optimal parameters of the prescribed robot impedance model to adjust the robot's dynamics to the operator skills and minimize the tracking error. The outer loop includes the human operator, the robot, and the task performance details. The problem of finding the optimal parameters of the prescribed robot impedance model is transformed into a linear quadratic regulator (LQR) problem which minimizes the human effort and optimizes the closed-loop behavior of the HRI system for a given task. To obviate the requirement of the knowledge of the human model, integral reinforcement learning is used to solve the given LQR problem. Simulation results on an x-y table and a robot arm, and experimental implementation results on a PR2 robot confirm the suitability of the proposed method.
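The outer loop described above reduces impedance-parameter tuning to an LQR problem. As a hedged illustration only, the sketch below solves a toy discrete-time LQR by iterating the Riccati equation with a known model; the paper's contribution is precisely that integral reinforcement learning avoids needing this model, which the sketch does not reproduce. The matrices A, B, Q, R are hypothetical.

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Solve a discrete-time LQR by value iteration on the Riccati equation."""
    P = Q.copy()
    for _ in range(iters):
        BT_P = B.T @ P
        K = np.linalg.solve(R + BT_P @ B, BT_P @ A)  # feedback gain
        P = Q + A.T @ P @ (A - B @ K)                # Riccati update
    return K, P

A = np.array([[1.0, 0.1], [0.0, 1.0]])  # double-integrator-like dynamics
B = np.array([[0.0], [0.1]])
Q = np.eye(2)                           # penalize tracking error
R = np.array([[0.1]])                   # penalize control (human/robot) effort
K, P = dlqr(A, B, Q, R)

# The closed loop x+ = (A - B K) x should be stable: spectral radius < 1.
rho = max(abs(np.linalg.eigvals(A - B @ K)))
print(round(float(rho), 3))
```

In the paper's setting, the Q/R weights encode the trade-off between tracking error and human effort, and the learned gain maps back to impedance-model parameters.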

  9. Toward understanding social cues and signals in human-robot interaction: effects of robot gaze and proxemic behavior.

    Science.gov (United States)

    Fiore, Stephen M; Wiltshire, Travis J; Lobato, Emilio J C; Jentsch, Florian G; Huang, Wesley H; Axelrod, Benjamin

    2013-01-01

As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human-robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava™ mobile robotics platform in a hallway navigation scenario. Cues associated with the robot's proxemic behavior were found to significantly affect participant perceptions of the robot's social presence and emotional state while cues associated with the robot's gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot's mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.

  10. A meta-analysis of factors affecting trust in human-robot interaction.

    Science.gov (United States)

    Hancock, Peter A; Billings, Deborah R; Schaefer, Kristin E; Chen, Jessie Y C; de Visser, Ewart J; Parasuraman, Raja

    2011-10-01

We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI). To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice. Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes. The overall correlational effect size for trust was r = +0.26, with an experimental effect size of d = +0.71. The effects of human, robot, and environmental characteristics were examined with an especial evaluation of the robot dimensions of performance and attribute-based factors. The robot performance and attributes were the largest contributors to the development of trust in HRI. Environmental factors played only a moderate role. Factors related to the robot itself, specifically, its performance, had the greatest current association with trust, and environmental factors were moderately associated. There was little evidence for effects of human-related factors. The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
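As a side note for readers comparing the two summary statistics above: correlational (r) and standardized-mean-difference (d) effect sizes are related by the standard conversion d = 2r/√(1 − r²). The two estimates reported here come from different study sets, so they need not agree, but the conversion makes them roughly comparable:

```python
import math

def r_to_d(r):
    """Convert a correlation effect size r to Cohen's d (standard formula)."""
    return 2 * r / math.sqrt(1 - r ** 2)

def d_to_r(d):
    """Inverse conversion, assuming equal group sizes."""
    return d / math.sqrt(d ** 2 + 4)

print(round(r_to_d(0.26), 2))  # 0.54: smaller than the reported d = +0.71
print(round(d_to_r(0.71), 2))  # 0.33: larger than the reported r = +0.26
```

The gap between the converted and reported values reflects that the correlational and experimental subsets of the meta-analysis cover different studies and moderators.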

  11. Robot Choreography

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Heath, Damith

    2016-01-01

    We propose a robust framework for combining performance paradigms with human robot interaction (HRI) research. Following an analysis of several case studies that combine the performing arts with HRI experiments, we propose a methodology and “best practices” for implementing choreography and other...... performance paradigms in HRI experiments. Case studies include experiments conducted in laboratory settings, “in the wild”, and live performance settings. We consider the technical and artistic challenges of designing and staging robots alongside humans in these various settings, and discuss how to combine...

  12. ROILA : RObot Interaction LAnguage

    NARCIS (Netherlands)

    Mubin, O.

    2011-01-01

    The number of robots in our society is increasing rapidly. The number of service robots that interact with everyday people already outnumbers industrial robots. The easiest way to communicate with these service robots, such as Roomba or Nao, would be natural speech. However, the limitations

  13. Human-Robot Interaction Directed Research Project

    Science.gov (United States)

    Rochlis, Jennifer; Ezer, Neta; Sandor, Aniko

    2011-01-01

Human-robot interaction (HRI) is about understanding and shaping the interactions between humans and robots (Goodrich & Schultz, 2007). It is important to evaluate how the design of interfaces and command modalities affect the human's ability to perform tasks accurately, efficiently, and effectively (Crandall, Goodrich, Olsen Jr., & Nielsen, 2005). It is also critical to evaluate the effects of human-robot interfaces and command modalities on operator mental workload (Sheridan, 1992) and situation awareness (Endsley, Bolté, & Jones, 2003). By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed that support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for design. Because the factors associated with interfaces and command modalities in HRI are too numerous to address in 3 years of research, the proposed research concentrates on three manageable areas applicable to National Aeronautics and Space Administration (NASA) robot systems. These topic areas emerged from the Fiscal Year (FY) 2011 work that included extensive literature reviews and observations of NASA systems. The three topic areas are: 1) video overlays, 2) camera views, and 3) command modalities. Each area is described in detail below, along with relevance to existing NASA human-robot systems. In addition to studies in these three topic areas, a workshop is proposed for FY12. The workshop will bring together experts in human-robot interaction and robotics to discuss the state of the practice as applicable to research in space robotics. Studies proposed in the area of video overlays consider two factors in the implementation of augmented reality (AR) for operator displays during teleoperation. The first of these factors is the type of navigational guidance provided by AR symbology.
In the proposed

  14. Sustaining Emotional Communication when Interacting with an Android Robot

    DEFF Research Database (Denmark)

    Vlachos, Evgenios

    The more human-like a robot appears and acts, the more users will have the belief of communicating with a human partner rather than with an artificial entity. However, current robotic technology displays limitations on the design of the facial interface and on the design of believable Human...... of an android related to its actions, perception and intelligence, or failure to identify which robot type is qualified to perform a specific task, might lead to disruption of HRI. This study is concerned with the problem of sustaining emotional communication when interacting with an android social robot......-ended evaluation method pertaining to the interpretation of Android facial expressions (g) a study on how users’ perception and attitude can change after direct interaction with a robot, (h) a study on how androids can maintain the focus of attention during short-term dyadic interactions, and (i) a state...

  15. Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior

    Science.gov (United States)

    Fiore, Stephen M.; Wiltshire, Travis J.; Lobato, Emilio J. C.; Jentsch, Florian G.; Huang, Wesley H.; Axelrod, Benjamin

    2013-01-01

As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava™ mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior were found to significantly affect participant perceptions of the robot’s social presence and emotional state while cues associated with the robot’s gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals. PMID:24348434

  16. Towards understanding social cues and signals in human-robot interaction: Effects of robot gaze and proxemic behavior

    Directory of Open Access Journals (Sweden)

    Stephen M. Fiore

    2013-11-01

As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human-robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava™ mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior were found to significantly affect participant perceptions of the robot’s social presence and emotional state while cues associated with the robot’s gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.

  17. Human Robot Interaction for Hybrid Collision Avoidance System for Indoor Mobile Robots

    Directory of Open Access Journals (Sweden)

    Mazen Ghandour

    2017-06-01

In this paper, a novel approach for collision avoidance for indoor mobile robots based on human-robot interaction is realized. The main contribution of this work is a new technique for collision avoidance that engages the human and the robot in generating new collision-free paths. In mobile robotics, collision avoidance is critical for the success of robots in carrying out their tasks, especially when robots navigate in crowded and dynamic environments that include humans. Traditional collision avoidance methods treat the human as a dynamic obstacle, without taking into consideration that the human will also try to avoid the robot; this causes the person and the robot to get confused, especially in crowded social places such as restaurants, hospitals, and laboratories. To avoid such scenarios, a reactive-supervised collision avoidance system for mobile robots based on human-robot interaction is implemented. In this method, both the robot and the human collaborate in generating the collision avoidance via interaction. The person notifies the robot of the avoidance direction via interaction, and the robot searches for the optimal collision-free path in the selected direction. If no person interacts with the robot, it selects the navigation path autonomously, choosing the path closest to the goal location. Humans interact with the robot using gesture recognition and a Kinect sensor. To build the gesture recognition system, two models were used to classify the gestures: the first is a Back-Propagation Neural Network (BPNN) and the second is a Support Vector Machine (SVM). Furthermore, a novel collision avoidance system for avoiding obstacles is implemented and integrated with the HRI system.
The system is tested on an H20 robot from DrRobot Company (Canada), and a set of experiments was implemented to report the performance of the system in interacting with the human and avoiding
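As a rough, hedged sketch of the two classifier types named in this record (BPNN and SVM), the following trains both on synthetic stand-ins for Kinect skeleton features. The real feature extraction, gesture classes, and training settings are not specified in the abstract; scikit-learn's MLPClassifier stands in here for a back-propagation neural network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier  # a back-propagation NN
from sklearn.svm import SVC

# Synthetic "skeleton feature" vectors standing in for Kinect gesture data;
# three well-separated gesture classes with Gaussian noise.
rng = np.random.default_rng(0)

def make_gestures(n_per_class=60, n_features=6, n_classes=3):
    X, y = [], []
    for c in range(n_classes):
        center = rng.normal(loc=3.0 * c, scale=1.0, size=n_features)
        X.append(center + 0.5 * rng.standard_normal((n_per_class, n_features)))
        y += [c] * n_per_class
    return np.vstack(X), np.array(y)

X, y = make_gestures()
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
tr, te = idx[:split], idx[split:]

bpnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
svm = SVC(kernel="rbf", C=1.0, gamma="scale")
for name, clf in [("BPNN", bpnn), ("SVM", svm)]:
    clf.fit(X[tr], y[tr])
    print(name, round(clf.score(X[te], y[te]), 2))
```

On real Kinect data the two models would be compared on held-out gesture sequences, as the record describes.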

  18. Human-Robot Interaction Directed Research Project

    Science.gov (United States)

    Sandor, Aniko; Cross, Ernest V., II; Chang, Mai Lee

    2014-01-01

Human-robot interaction (HRI) is a discipline investigating the factors affecting the interactions between humans and robots. It is important to evaluate how the design of interfaces and command modalities affect the human's ability to perform tasks accurately, efficiently, and effectively when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. This DRP concentrates on three areas associated with interfaces and command modalities in HRI which are applicable to NASA robot systems: 1) Video Overlays, 2) Camera Views, and 3) Command Modalities. The first study focused on video overlays and investigated how Augmented Reality (AR) symbology can be added to the human-robot interface to improve teleoperation performance. Three types of AR symbology were explored in this study: command guidance (CG), situation guidance (SG), and both (SCG). CG symbology gives operators explicit instructions on what commands to input, whereas SG symbology gives operators implicit cues so that operators can infer the input commands. The combination of CG and SG provided operators with explicit and implicit cues, allowing the operator to choose which symbology to utilize. The objective of the study was to understand how AR symbology affects the human operator's ability to align a robot arm to a target using a flight stick and the ability to allocate attention between the symbology and external views of the world. The study evaluated the effects the type of symbology (CG and SG) has on operator task performance and attention allocation during teleoperation of a robot arm.
The second study expanded on the first study by evaluating the effects of the type of

  19. An Interactive Astronaut-Robot System with Gesture Control

    Directory of Open Access Journals (Sweden)

    Jinguo Liu

    2016-01-01

Human-robot interaction (HRI) plays an important role in future planetary exploration missions, where astronauts performing extravehicular activities (EVA) have to communicate with robot assistants through speech-type or gesture-type user interfaces embedded in their space suits. This paper presents an interactive astronaut-robot system integrating a data-glove with a space suit for the astronaut to use hand gestures to control a snake-like robot. A support vector machine (SVM) is employed to recognize hand gestures, and a particle swarm optimization (PSO) algorithm is used to optimize the parameters of the SVM to further improve its recognition accuracy. Various hand gestures from American Sign Language (ASL) have been selected and used to test and validate the performance of the proposed system.
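The PSO-tuned SVM described above can be sketched as follows. This is an illustrative toy, not the paper's implementation: the data are synthetic stand-ins for ASL gesture features, the swarm searches log10(C) and log10(gamma), and the PSO constants are common defaults rather than the authors' settings.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Synthetic stand-in for hand-gesture feature vectors (4 gesture classes).
X, y = make_blobs(n_samples=150, centers=4, cluster_std=2.0, random_state=1)

def fitness(pos):
    """Cross-validated accuracy of an RBF SVM at (log10 C, log10 gamma)."""
    C, gamma = 10.0 ** pos[0], 10.0 ** pos[1]
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(clf, X, y, cv=3).mean()

n_particles, n_iters = 8, 10
pos = rng.uniform(-2, 2, size=(n_particles, 2))   # particles in log-space
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    # Standard PSO update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -3, 3)
    fits = np.array([fitness(p) for p in pos])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best CV accuracy:", round(float(pbest_fit.max()), 2))
```

The swarm typically converges to a (C, gamma) pair near the grid-search optimum, which is the role PSO plays in the gesture-recognition pipeline above.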

  20. In our own image? Emotional and neural processing differences when observing human-human vs human-robot interactions.

    Science.gov (United States)

    Wang, Yin; Quadflieg, Susanne

    2015-11-01

    Notwithstanding the significant role that human-robot interactions (HRI) will play in the near future, limited research has explored the neural correlates of feeling eerie in response to social robots. To address this empirical lacuna, the current investigation examined brain activity using functional magnetic resonance imaging while a group of participants (n = 26) viewed a series of human-human interactions (HHI) and HRI. Although brain sites constituting the mentalizing network were found to respond to both types of interactions, systematic neural variation across sites signaled diverging social-cognitive strategies during HHI and HRI processing. Specifically, HHI elicited increased activity in the left temporal-parietal junction indicative of situation-specific mental state attributions, whereas HRI recruited the precuneus and the ventromedial prefrontal cortex (VMPFC) suggestive of script-based social reasoning. Activity in the VMPFC also tracked feelings of eeriness towards HRI in a parametric manner, revealing a potential neural correlate for a phenomenon known as the uncanny valley. By demonstrating how understanding social interactions depends on the kind of agents involved, this study highlights pivotal sub-routes of impression formation and identifies prominent challenges in the use of humanoid robots. © The Author (2015). Published by Oxford University Press.

  1. Human motion behavior while interacting with an industrial robot.

    Science.gov (United States)

    Bortot, Dino; Ding, Hao; Antonopolous, Alexandros; Bengler, Klaus

    2012-01-01

Human workers and industrial robots both have specific strengths within industrial production. Advantageously, they complement each other, which leads to the development of human-robot interaction (HRI) applications. Bringing humans and robots together in the same workspace, however, may lead to collisions. Avoiding such collisions is a central safety requirement. It can be realized with various sensor systems, all of which decelerate the robot when the distance to the human decreases alarmingly and trigger an emergency stop when the distance becomes too small. As a consequence, the efficiency of the overall system suffers, because the robot has high idle times. Optimized path planning algorithms have to be developed to avoid this. The following study investigates human motion behavior in the proximity of an industrial robot. Three different kinds of encounters between the two entities under three robot speed levels are prompted. A motion tracking system is used to capture the motions. Results show that humans keep an average distance of about 0.5 m to the robot when an encounter occurs. Approach to the workbenches is influenced by the robot in 10 of 15 cases. Furthermore, an increase in participants' walking velocity with higher robot velocities is observed.

  2. Social Moments: A Perspective on Interaction for Social Robotics

    Directory of Open Access Journals (Sweden)

    Gautier Durantin

    2017-06-01

During a social interaction, events that happen at different timescales can indicate social meanings. In order to socially engage with humans, robots will need to be able to comprehend and manipulate the social meanings that are associated with these events. We define social moments as events that occur within a social interaction and that can signify a pragmatic or semantic meaning. A challenge for social robots is recognizing social moments that occur on short timescales, which can be on the order of 10² ms. In this perspective, we propose that understanding the range and roles of social moments in a social interaction, and implementing social micro-abilities—the abilities required to engage in a timely manner through social moments—is a key challenge for the field of human-robot interaction (HRI) to enable effective social interactions and social robots. In particular, it is an open question how social moments acquire their associated meanings. Practically, the implementation of these social micro-abilities presents engineering challenges for the fields of HRI and social robotics, including processing sensor input and driving actuators at fast timescales. A key challenge posed by social moments is the integration of social stimuli across multiple timescales and modalities. We present the neural basis for human comprehension of social moments and review current literature related to social moments and social micro-abilities. We discuss the requirements for social micro-abilities, how these abilities can enable more natural social robots, and how to address the engineering challenges associated with social moments.

  3. Pragmatic Frames for Teaching and Learning in Human-Robot Interaction: Review and Challenges.

    Science.gov (United States)

    Vollmer, Anna-Lisa; Wrede, Britta; Rohlfing, Katharina J; Oudeyer, Pierre-Yves

    2016-01-01

    One of the big challenges in robotics today is to learn from human users that are inexperienced in interacting with robots but yet are often used to teach skills flexibly to other humans and to children in particular. A potential route toward natural and efficient learning and teaching in Human-Robot Interaction (HRI) is to leverage the social competences of humans and the underlying interactional mechanisms. In this perspective, this article discusses the importance of pragmatic frames as flexible interaction protocols that provide important contextual cues to enable learners to infer new action or language skills and teachers to convey these cues. After defining and discussing the concept of pragmatic frames, grounded in decades of research in developmental psychology, we study a selection of HRI work in the literature which has focused on learning-teaching interaction and analyze the interactional and learning mechanisms that were used in the light of pragmatic frames. This allows us to show that many of the works have already used in practice, but not always explicitly, basic elements of the pragmatic frames machinery. However, we also show that pragmatic frames have so far been used in a very restricted way as compared to how they are used in human-human interaction and argue that this has been an obstacle preventing robust natural multi-task learning and teaching in HRI. In particular, we explain that two central features of human pragmatic frames, mostly absent of existing HRI studies, are that (1) social peers use rich repertoires of frames, potentially combined together, to convey and infer multiple kinds of cues; (2) new frames can be learnt continually, building on existing ones, and guiding the interaction toward higher levels of complexity and expressivity. To conclude, we give an outlook on the future research direction describing the relevant key challenges that need to be solved for leveraging pragmatic frames for robot learning and teaching.

  4. Pragmatic Frames for Teaching and Learning in Human–Robot Interaction: Review and Challenges

    Science.gov (United States)

    Vollmer, Anna-Lisa; Wrede, Britta; Rohlfing, Katharina J.; Oudeyer, Pierre-Yves

    2016-01-01

    One of the big challenges in robotics today is to learn from human users that are inexperienced in interacting with robots but yet are often used to teach skills flexibly to other humans and to children in particular. A potential route toward natural and efficient learning and teaching in Human-Robot Interaction (HRI) is to leverage the social competences of humans and the underlying interactional mechanisms. In this perspective, this article discusses the importance of pragmatic frames as flexible interaction protocols that provide important contextual cues to enable learners to infer new action or language skills and teachers to convey these cues. After defining and discussing the concept of pragmatic frames, grounded in decades of research in developmental psychology, we study a selection of HRI work in the literature which has focused on learning–teaching interaction and analyze the interactional and learning mechanisms that were used in the light of pragmatic frames. This allows us to show that many of the works have already used in practice, but not always explicitly, basic elements of the pragmatic frames machinery. However, we also show that pragmatic frames have so far been used in a very restricted way as compared to how they are used in human–human interaction and argue that this has been an obstacle preventing robust natural multi-task learning and teaching in HRI. In particular, we explain that two central features of human pragmatic frames, mostly absent of existing HRI studies, are that (1) social peers use rich repertoires of frames, potentially combined together, to convey and infer multiple kinds of cues; (2) new frames can be learnt continually, building on existing ones, and guiding the interaction toward higher levels of complexity and expressivity. To conclude, we give an outlook on the future research direction describing the relevant key challenges that need to be solved for leveraging pragmatic frames for robot learning and teaching

  5. Classifying a Person's Degree of Accessibility From Natural Body Language During Social Human-Robot Interactions.

    Science.gov (United States)

    McColl, Derek; Jiang, Chuan; Nejat, Goldie

    2017-02-01

    For social robots to be successfully integrated and accepted within society, they need to be able to interpret human social cues that are displayed through natural modes of communication. In particular, a key challenge in the design of social robots is developing the robot's ability to recognize a person's affective states (emotions, moods, and attitudes) in order to respond appropriately during social human-robot interactions (HRIs). In this paper, we present and discuss social HRI experiments we have conducted to investigate the development of an accessibility-aware social robot able to autonomously determine a person's degree of accessibility (rapport, openness) toward the robot based on the person's natural static body language. In particular, we present two one-on-one HRI experiments to: 1) determine the performance of our automated system in being able to recognize and classify a person's accessibility levels and 2) investigate how people interact with an accessibility-aware robot which determines its own behaviors based on a person's speech and accessibility levels.

  6. Digital Emotion : How Audiences React to Robots on Screen

    OpenAIRE

Damian Schofield

    2018-01-01

    The experience of interacting with robots is becoming a more pervasive part of our day-to-day life. When considering the experience of interacting with other technologies and artefacts, interaction with robots presents a distinct and potentially unique component: physical connection. Robots share our physical space; this is a prominent part of the interaction experience. Robots offer a lifelike presence and the Human-Robot Interaction (HRI) issues go beyond the traditional interactions of mor...

  7. Effective Human-Robot Collaborative Work for Critical Missions

    Data.gov (United States)

National Aeronautics and Space Administration — The objective of this project is to improve human-robot interaction (HRI) in order to enhance the capability of NASA critical missions. This research will focus on two...

  8. A Preliminary Study of Peer-to-Peer Human-Robot Interaction

    Science.gov (United States)

    Fong, Terrence; Flueckiger, Lorenzo; Kunz, Clayton; Lees, David; Schreiner, John; Siegel, Michael; Hiatt, Laura M.; Nourbakhsh, Illah; Simmons, Reid; Ambrose, Robert

    2006-01-01

    The Peer-to-Peer Human-Robot Interaction (P2P-HRI) project is developing techniques to improve task coordination and collaboration between human and robot partners. Our work is motivated by the need to develop effective human-robot teams for space mission operations. A central element of our approach is creating dialogue and interaction tools that enable humans and robots to flexibly support one another. In order to understand how this approach can influence task performance, we recently conducted a series of tests simulating a lunar construction task with a human-robot team. In this paper, we describe the tests performed, discuss our initial results, and analyze the effect of intervention on task performance.

  9. Fable: Socially Interactive Modular Robot

    DEFF Research Database (Denmark)

    Magnússon, Arnþór; Pacheco, Moises; Moghadam, Mikael

    2013-01-01

Modular robots have a significant potential as user-reconfigurable robotic playware, but often lack sufficient sensing for social interaction. We address this issue with the Fable modular robotic system by exploring the use of smart sensor modules that have a better ability to sense the behavior...

  10. Mobile Mixed-Reality Interfaces That Enhance Human–Robot Interaction in Shared Spaces

    Directory of Open Access Journals (Sweden)

    Jared A. Frank

    2017-06-01

Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human–robot interactions (HRI), they are often implemented in remote applications on research-grade platforms requiring significant training and limiting operator mobility. This paper proposes a mobile mixed-reality interface approach to enhance HRI in shared spaces. As a user points a mobile device at the robot’s workspace, a mixed-reality environment is rendered providing a common frame of reference for the user and robot to effectively communicate spatial information for performing object manipulation tasks, improving the user’s situational awareness while interacting with augmented graphics to intuitively command the robot. An evaluation with participants is conducted to examine task performance and user experience associated with the proposed interface strategy in comparison to conventional approaches that utilize egocentric or exocentric views from cameras mounted on the robot or in the environment, respectively. Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot. Moreover, the proposed interface approach provides users the flexibility to direct robots from their own visual perspective (at the expense of some physical workload) and leverages the sensing capabilities of the tablet to expand the robot’s perceptual range.

  11. Understanding and Resolving Failures in Human-Robot Interaction: Literature Review and Model Development

    Directory of Open Access Journals (Sweden)

    Shanee Honig

    2018-06-01

While substantial effort has been invested in making robots more reliable, experience demonstrates that robots operating in unstructured environments are often challenged by frequent failures. Despite this, robots have not yet reached a level of design that allows effective management of faulty or unexpected behavior by untrained users. To understand why this may be the case, an in-depth literature review was done to explore when people perceive and resolve robot failures, how robots communicate failure, how failures influence people's perceptions and feelings toward robots, and how these effects can be mitigated. Fifty-two studies were identified relating to communicating failures and their causes, the influence of failures on human-robot interaction (HRI), and mitigating failures. Since little research has been done on these topics within the HRI community, insights from the fields of human-computer interaction (HCI), human factors engineering, cognitive engineering, and experimental psychology are presented and discussed. Based on the literature, we developed a model of information processing for robotic failures (Robot Failure Human Information Processing, RF-HIP) that guides the discussion of our findings. The model describes the way people perceive, process, and act on failures in human-robot interaction. It includes three main parts: (1) communicating failures, (2) perception and comprehension of failures, and (3) solving failures. Each part contains several stages, all influenced by contextual considerations and mitigation strategies. Several gaps in the literature have become evident as a result of this evaluation. More focus has been given to technical failures than interaction failures. Few studies focused on human errors, on communicating failures, or on the cognitive, psychological, and social determinants that impact the design of mitigation strategies. By providing the stages of human information processing, RF-HIP can be used as a

  12. Measuring the anthropomorphism, animacy, likeability, perceived intelligence and perceived safety of robots

    NARCIS (Netherlands)

    Bartneck, C.; Kulic, D.; Croft, E.

    2008-01-01

    This study emphasizes the need for standardized measurement tools for human robot interaction (HRI). If we are to make progress in this field then we must be able to compare the results from different studies. A literature review has been performed on the measurements of five key concepts in HRI:

  13. Intelligent Interaction for Human-Friendly Service Robot in Smart House Environment

    Directory of Open Access Journals (Sweden)

    Z. Zenn Bien

    2008-01-01

The smart house under consideration is a service-integrated complex system to assist older persons and/or people with disabilities. The primary goal of the system is to achieve independent living through various robotic devices and systems. Such a system is treated as a human-in-the-loop system in which human-robot interaction takes place intensely and frequently. Based on our experiences of having designed and implemented a smart house environment, called Intelligent Sweet Home (ISH), we present a framework for realizing a human-friendly HRI (human-robot interaction) module with various effective techniques of computational intelligence. More specifically, we partition the robotic tasks of the HRI module into three groups according to the level of specificity, fuzziness, or uncertainty of the context of the system, and present an effective interaction method for each case. We first show a task planning algorithm and its architecture to deal with well-structured tasks autonomously via a simplified set of user commands instead of inconvenient manual operations. To provide the capability of interacting in a human-friendly way in a fuzzy context, we propose that the robot make use of human bio-signals as input to the HRI module, as shown in a hand gesture recognition system called a soft remote control system. Finally, we discuss a probabilistic fuzzy rule-based life-long learning system, equipped with intention-reading capability by learning human behavioral patterns, which is introduced as a solution for uncertain and time-varying situations.

  14. Toward a unified method for analysing and teaching Human Robot Interaction

    DEFF Research Database (Denmark)

    Dinesen, Jens Vilhelm

This abstract aims to present key aspects of a future paper, which outlines the ongoing development of a unified method for analysing and teaching Human-Robot Interaction. The paper will propose a novel method for analysing both HRI and interaction with other forms of technologies and fellow humans, drawing on key theories and methods from both communications- and interaction-theory. The aim is to provide a single unified method for analysing interaction, through means of video analysis and then applying theories, with proven mutual compatibility, to reach a desired granularity of study.

  15. Exploiting Child-Robot Aesthetic Interaction for a Social Robot

    OpenAIRE

    Lee, Jae-Joon; Kim, Dae-Won; Kang, Bo-Yeong

    2012-01-01

    A social robot interacts and communicates with humans by using the embodied knowledge gained from interactions with its social environment. In recent years, emotion has emerged as a popular concept for designing social robots. Several studies on social robots reported an increase in robot sociability through emotional imitative interactions between the robot and humans. In this paper conventional emotional interactions are extended by exploiting the aesthetic theories that the sociability of ...

  16. Interactive robots in experimental biology.

    Science.gov (United States)

    Krause, Jens; Winfield, Alan F T; Deneubourg, Jean-Louis

    2011-07-01

    Interactive robots have the potential to revolutionise the study of social behaviour because they provide several methodological advances. In interactions with live animals, the behaviour of robots can be standardised, morphology and behaviour can be decoupled (so that different morphologies and behavioural strategies can be combined), behaviour can be manipulated in complex interaction sequences and models of behaviour can be embodied by the robot and thereby be tested. Furthermore, robots can be used as demonstrators in experiments on social learning. As we discuss here, the opportunities that robots create for new experimental approaches have far-reaching consequences for research in fields such as mate choice, cooperation, social learning, personality studies and collective behaviour. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Interactive Exploration Robots: Human-Robotic Collaboration and Interactions

    Science.gov (United States)

    Fong, Terry

    2017-01-01

    For decades, NASA has employed different operational approaches for human and robotic missions. Human spaceflight missions to the Moon and in low Earth orbit have relied upon near-continuous communication with minimal time delays. During these missions, astronauts and mission control communicate interactively to perform tasks and resolve problems in real-time. In contrast, deep-space robotic missions are designed for operations in the presence of significant communication delay - from tens of minutes to hours. Consequently, robotic missions typically employ meticulously scripted and validated command sequences that are intermittently uplinked to the robot for independent execution over long periods. Over the next few years, however, we will see increasing use of robots that blend these two operational approaches. These interactive exploration robots will be remotely operated by humans on Earth or from a spacecraft. These robots will be used to support astronauts on the International Space Station (ISS), to conduct new missions to the Moon, and potentially to enable remote exploration of planetary surfaces in real-time. In this talk, I will discuss the technical challenges associated with building and operating robots in this manner, along with lessons learned from research conducted with the ISS and in the field.

  18. Human-Robot Interaction: Intention Recognition and Mutual Entrainment

    Science.gov (United States)

    2012-08-18

    bility, but only the human arm is modeled, with linear, low-pass-filter type transfer functions [16]. The coupled dynamics in pHRI has been intensively ...be located inside or on the edge of the polygon. IV. DISCUSSION A. Issues in Implementing LIPM on a Mobile Robot Instead of focusing on kinesiology

  19. Technological Dangers and the Potential of Human-Robot Interaction

    DEFF Research Database (Denmark)

    Nørskov, Marco

    2016-01-01

    The ethical debate on social robotics has become one of the cutting edge topics of our time. When it comes to both academic and non-academic debates, the methodological framework is, with few exceptions, typically and tacitly grounded in an us-versus-them perspective. It is as though we were...... of positioning with regard to HRI. It is argued that the process itself is an artifact with moral significance, and consequently tantamount to discrimination. Furthermore, influenced by Heidegger’s warnings concerning technology, this chapter explores the possibilities of HRI with respect to the accompanying...

  20. Robot Futures

    DEFF Research Database (Denmark)

    Christoffersen, Anja; Grindsted Nielsen, Sally; Jochum, Elizabeth Ann

Robots are increasingly used in health care settings, e.g., as homecare assistants and personal companions. One challenge for personal robots in the home is acceptance. We describe an innovative approach to influencing the acceptance of care robots using theatrical performance. Live performance is a useful testbed for developing and evaluating what makes robots expressive; it is also a useful platform for designing robot behaviors and dialogue that result in believable characters. Therefore theatre is a valuable testbed for studying human-robot interaction (HRI). We investigate how audiences perceive social robots interacting with humans in a future care scenario through a scripted performance. We discuss our methods and initial findings, and outline future work.

  1. Studying Robots Outside the Lab

    DEFF Research Database (Denmark)

    Blond, Lasse

As more and more robots enter our social world there is a strong need for further field studies of human-robotic interaction. Based on a two-year ethnographic study of the implementation of the South Korean socially assistive robot in Danish elderly care, this paper argues that empirical and ethnographic studies will enhance understandings of the dynamics of HRI. Furthermore, the paper emphasizes how users and the context of use matter to the integration of robots, as it is shown how roboticists are unable to control how their designs are implemented in practice and that the sociality of social robots is inscribed by its users in social practice. This paper can be seen as a contribution to studies of long-term HRI. It presents the challenges of robot adaptation in practice and discusses the limitations of the present conceptual understanding of human-robotic relations. The ethnographic data...

  2. Two-Stage Hidden Markov Model in Gesture Recognition for Human Robot Interaction

    Directory of Open Access Journals (Sweden)

    Nhan Nguyen-Duc-Thanh

    2012-07-01

Hidden Markov Models (HMMs) are very rich in mathematical structure and hence can form the theoretical basis for use in a wide range of applications, including gesture representation. Most research in this field, however, uses HMMs only for recognizing simple gestures, while HMMs can equally be applied to recognizing the meaning of whole gestures. This is very effectively applicable in Human-Robot Interaction (HRI). In this paper, we introduce an approach for HRI in which not only can the human naturally control the robot by hand gesture, but the robot can also recognize what kind of task it is executing. The main idea behind this method is a two-stage Hidden Markov Model. The first-stage HMM recognizes the prime, command-like gestures. Based on the sequence of prime gestures recognized in the first stage, which represents the whole action, the second-stage HMM performs task recognition. Another contribution of this paper is the use of Gaussian-mixture output distributions in the HMMs to improve the recognition rate. In the experiments, we also compare different numbers of hidden states and mixture components to obtain the optimal configuration, and compare against other methods to evaluate performance.
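The abstract describes the two-stage pipeline without implementation detail. The sketch below is an invented miniature of the idea (discrete observations stand in for the paper's Gaussian-mixture outputs, and all gesture/task names and probabilities are made up): stage one classifies each segment against per-gesture HMMs, and stage two treats the recognized gesture sequence as the observation sequence of task-level HMMs.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    # Scaled forward algorithm: returns log P(obs | HMM)
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

def classify(obs, models):
    # Maximum-likelihood classification over a dict of {name: (pi, A, B)}
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

pi2 = np.array([0.5, 0.5])
A2 = np.array([[0.9, 0.1], [0.1, 0.9]])  # generic 2-state transitions

# Stage 1: prime-gesture HMMs over quantized hand features (0 = still, 1 = moving)
gesture_models = {
    "wave":  (pi2, A2, np.array([[0.2, 0.8], [0.1, 0.9]])),
    "point": (pi2, A2, np.array([[0.8, 0.2], [0.9, 0.1]])),
}

# Stage 2: task HMMs whose observations are indices of recognized prime gestures
gesture_index = {"wave": 0, "point": 1}
task_models = {
    "greet": (pi2, A2, np.array([[0.9, 0.1], [0.8, 0.2]])),
    "fetch": (pi2, A2, np.array([[0.1, 0.9], [0.2, 0.8]])),
}

def recognize_task(segments):
    gestures = [classify(seg, gesture_models) for seg in segments]
    task = classify([gesture_index[g] for g in gestures], task_models)
    return gestures, task

gestures, task = recognize_task([[1, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 0]])
```

The same scaled forward recursion serves both stages; only the meaning of the observation alphabet changes between them.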

  3. Polite Interactions with Robots

    DEFF Research Database (Denmark)

    Benotti, Luciana; Blackburn, Patrick Rowan

    2016-01-01

We sketch an inference architecture that permits linguistic aspects of politeness to be interpreted; we do so by applying the ideas of politeness theory to the SCARE corpus of task-oriented dialogues, a type of dialogue of particular relevance to robotics. The fragment of the SCARE corpus we analyzed contains 77 uses of politeness strategies: our inference architecture covers 58 of them using classical AI planning techniques; the remainder require other forms of means-ends inference. So by the end of the paper we will have discussed in some detail how to interpret automatically different forms...

  4. Interactions between Humans and Robots

    DEFF Research Database (Denmark)

    Vlachos, Evgenios; Schärfe, Henrik

    2013-01-01

…and explains the relationships and dependencies that exist between them. The four main factors that define the properties of a robot, and therefore the interaction, are distributed in two dimensions: (1) Intelligence (Control - Autonomy), and (2) Perspective (Tool - Medium). Based on these factors, we...

  5. Negative Affect in Human Robot Interaction

    DEFF Research Database (Denmark)

    Rehm, Matthias; Krogsager, Anders

    2013-01-01

    The vision of social robotics sees robots moving more and more into unrestricted social environments, where robots interact closely with users in their everyday activities, maybe even establishing relationships with the user over time. In this paper we present a field trial with a robot in a semi...

  6. Implications of the Google’s US 8,996,429 B1 Patent in Cloud Robotics Based Therapeutic Researches

    NARCIS (Netherlands)

    Fosch Villaronga, Eduard; Albo-Canals, Jordi; Neves, Antonio J.R.

    2018-01-01

Intended to be informative to both legal and engineering communities, this chapter raises awareness of the implications of recent patents in the field of human-robot interaction (HRI) studies. Google patented the use of cloud robotics to create robot personality(-ies). The broad claims of the

  7. Fundamentals of ergonomic exoskeleton robots

    OpenAIRE

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a new theoretical framework for analyzing physical human robot interaction (pHRI) with exoskeletons, and (2) a clear set of design rules of how to build wearable, portable exoskeletons to easily and...

  8. Toward a tactile language for human-robot interaction: two studies of tacton learning and performance.

    Science.gov (United States)

    Barber, Daniel J; Reinerman-Jones, Lauren E; Matthews, Gerald

    2015-05-01

    Two experiments were performed to investigate the feasibility for robot-to-human communication of a tactile language using a lexicon of standardized tactons (tactile icons) within a sentence. Improvements in autonomous systems technology and a growing demand within military operations are spurring interest in communication via vibrotactile displays. Tactile communication may become an important element of human-robot interaction (HRI), but it requires the development of messaging capabilities approaching the communication power of the speech and visual signals used in the military. In Experiment 1 (N = 38), we trained participants to identify sets of directional, dynamic, and static tactons and tested performance and workload following training. In Experiment 2 (N = 76), we introduced an extended training procedure and tested participants' ability to correctly identify two-tacton phrases. We also investigated the impact of multitasking on performance and workload. Individual difference factors were assessed. Experiment 1 showed that participants found dynamic and static tactons difficult to learn, but the enhanced training procedure in Experiment 2 produced competency in performance for all tacton categories. Participants in the latter study also performed well on two-tacton phrases and when multitasking. However, some deficits in performance and elevation of workload were observed. Spatial ability predicted some aspects of performance in both studies. Participants may be trained to identify both single tactons and tacton phrases, demonstrating the feasibility of developing a tactile language for HRI. Tactile communication may be incorporated into multi-modal communication systems for HRI. It also has potential for human-human communication in challenging environments. © 2014, Human Factors and Ergonomics Society.

  9. Pantomimic gestures for human-robot interaction

    CSIR Research Space (South Africa)

    Burke, Michael G

    2015-10-01

This work introduces a pantomimic gesture interface, which classifies human hand gestures using...

  10. Emotion based human-robot interaction

    Directory of Open Access Journals (Sweden)

    Berns Karsten

    2018-01-01

Human-machine interaction is a major challenge in the development of complex humanoid robots. In addition to verbal communication, the use of non-verbal cues such as hand, arm, and body gestures or facial expressions can improve understanding of the robot's intention. Conversely, by perceiving such mechanisms of a human in a typical interaction scenario, the humanoid robot can better adapt its interaction skills. In this work, the perception systems of two social robots, ROMAN and ROBIN of the RRLAB of the TU Kaiserslautern, are presented in the context of human-robot interaction.

  11. A Guide for Developing Human-Robot Interaction Experiments in the Robotic Interactive Visualization and Experimentation Technology (RIVET) Simulation

    Science.gov (United States)

    2016-05-01

A third, and most recent, type of mission used for HRI experimentation investigated a specific non-combat operation: driverless vehicles for passenger transit. One example is an experiment motivated by the Autonomous Robotics for

  12. Semiotics and Human-Robot Interaction

    OpenAIRE

    Sequeira, Joao; Ribeiro, M.Isabel

    2007-01-01

    The social barriers that still constrain the use of robots in modern societies will tend to vanish with the sophistication increase of interaction strategies. Communication and interaction between people and robots occurring in a friendly manner and being accessible to everyone, independent of their skills in robotics issues, will certainly foster the breaking of barriers. Socializing behaviors, such as following people, are relatively easy to obtain with current state of the art robotics. Ho...

  13. Self-alteration in HRI

    DEFF Research Database (Denmark)

    Yamazaki, Ryuji; Nørskov, Marco

Humanlike androids are being developed with the ambition to be immersed into our daily life and meet us on an equal level in social interaction. The possibilities and limitations of these types of robots can potentially change societies, and Human-Robot Interaction might affect the very way in which … the ways in which our subjectivity can be innerly transformed, decentred, in other words, self-altered. In our trials so far, we have been investigating the potential of teleoperated androids, which are embodied telecommunication media with humanlike appearances. By conducting pilot studies in Japan … human communications. As an example of the results, in group work activities in an elementary school, we found that Telenoid's limited capability led children to change their attitudes so that they could work together. Also, in a care facility, the elderly with dementia developed prosocial behaviors...

  14. HRI observations of the Pleiades

    Science.gov (United States)

    Harnden, F. R., Jr.; Caillault, J.-P.; Damiani, F.; Kashyap, V.; Micela, G.; Prosser, C.; Rosner, R.; Sciortino, S.; Stauffer, J.

    1996-01-01

The preliminary analysis of the data from the first four ROSAT High Resolution Imager (HRI) pointings provided many new faint Pleiades detections. The completion of the high-resolution survey of the most source-confused regions of this open cluster will lead to the construction of proper X-ray luminosity functions and will yield a definitive assessment of the coronal emission of Pleiades members.

  15. Contact Estimation in Robot Interaction

    Directory of Open Access Journals (Sweden)

    Filippo D'Ippolito

    2014-07-01

In this paper, safety issues are examined in a scenario in which a robot manipulator and a human perform the same task in the same workspace. During task execution, the human should be able to physically interact with the robot; for this case, an estimation algorithm for both the interaction forces and the contact point is proposed in order to guarantee safety conditions. The method, starting from residual joint torque estimation, allows both direct and adaptive computation of the contact point and force, based on a principle of equivalence of the contact forces. At the same time, all unintended contacts must be avoided, and a suitable post-collision strategy is considered to move the robot away from the collision area or else to reduce impact effects. Experimental tests have demonstrated the practical applicability of both the post-impact strategy and the estimation algorithms; furthermore, the experiments demonstrate the different behaviour resulting from adaptation of the contact point as opposed to direct calculation.
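The paper's direct/adaptive estimators are not reproduced in this summary. As a minimal numerical sketch of the residual idea (a planar 3-link arm with invented link lengths and configuration, not the authors' setup), the residual joint torques satisfy r = J_c(q)^T F at the true contact point, so scanning candidate points along the distal link and solving a small least-squares problem at each recovers both the contact location and the force:

```python
import numpy as np

L1, L2, L3 = 0.4, 0.3, 0.25  # link lengths (arbitrary for the example)

def point_jacobian(q, s):
    # 2x3 Jacobian of a point at fraction s along link 3 of a planar 3-link arm
    s1, s12, s123 = np.sin(q[0]), np.sin(q[0] + q[1]), np.sin(q.sum())
    c1, c12, c123 = np.cos(q[0]), np.cos(q[0] + q[1]), np.cos(q.sum())
    return np.array([
        [-(L1*s1 + L2*s12 + s*L3*s123), -(L2*s12 + s*L3*s123), -s*L3*s123],
        [  L1*c1 + L2*c12 + s*L3*c123,    L2*c12 + s*L3*c123,   s*L3*c123],
    ])

def estimate_contact(residual, q):
    # For each candidate contact point, solve residual ≈ J(s)^T F in the
    # least-squares sense; the true point gives (near-)zero reconstruction error.
    best = (np.inf, None, None)
    for s in np.linspace(0.05, 1.0, 20):
        Jt = point_jacobian(q, s).T                      # 3x2
        F, *_ = np.linalg.lstsq(Jt, residual, rcond=None)
        err = np.linalg.norm(Jt @ F - residual)
        if err < best[0]:
            best = (err, s, F)
    return best[1], best[2]

# Simulate a contact force applied 60% of the way along link 3
q = np.array([0.3, 0.8, -0.5])
F_true = np.array([2.0, -1.0])
residual = point_jacobian(q, 0.6).T @ F_true

s_hat, F_hat = estimate_contact(residual, q)
```

With more joints than force components the system is overdetermined, which is what makes the contact point identifiable from the reconstruction error.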

  16. Acceptance of an assistive robot in older adults: a mixed-method study of human–robot interaction over a 1-month period in the Living Lab setting

    Directory of Open Access Journals (Sweden)

    Wu YH

    2014-05-01

…/societal issues associated with robot use. Conclusion: It is important to destigmatize images of assistive robots to facilitate their acceptance. Universal design, aiming to increase the market for and production of products that are usable by everyone (to the greatest extent possible), might help to destigmatize assistive devices. Keywords: assistive robot, human–robot interaction, HRI, robot acceptance, technology acceptance

  17. Avoiding Local Optima with Interactive Evolutionary Robotics

    Science.gov (United States)

    2012-07-09

    the top of a flight of stairs selects for climbing ; suspending the robot and the target object above the ground and creating rungs between the two will...REPORT Avoiding Local Optimawith Interactive Evolutionary Robotics 14. ABSTRACT 16. SECURITY CLASSIFICATION OF: The main bottleneck in evolutionary... robotics has traditionally been the time required to evolve robot controllers. However with the continued acceleration in computational resources, the

  18. Understanding Soldier Robot Teams in Virtual Environments

    National Research Council Canada - National Science Library

    Barnes, Michael J; Jentsch, Florian; Cosenzo, Keryl A; Chen, Jessie Y; McDermott, Patricia

    2006-01-01

    ...) and renamed the Collaborative Robotics ATO. The purpose of the program is to understand HRI issues in order to develop and evaluate technologies to improve HRI battlefield performance for Future Combat Systems (FCS...

  19. Practical evaluation of robots for elderly in Denmark — an overview

    DEFF Research Database (Denmark)

    Hansen, Søren Tranberg; Andersen, Hans Jørgen; Bak, Thomas

    2010-01-01

Robots for the elderly have drawn a great deal of attention: it is a controversial topic being pushed forward by the fact that there will be a dramatic increase in the number of elderly people in most western countries. Within the field of HRI, much research has been conducted on robots interacting with the elderly, and a number of commercial products have been introduced to the market. Since 2006, a number of projects have been launched in Denmark in order to evaluate robot technology in elder-care practice. This paper gives a brief overview of a selected number of projects and outlines their characteristics and results. Finally, it is discussed how HRI can benefit from these.

  20. From Child-Robot Interaction to Child-Robot-Therapist Interaction: A Case Study in Autism

    Directory of Open Access Journals (Sweden)

    I. Giannopulu

    2012-01-01

Difficulties in social communication as well as deficits in the cognitive processing of emotions are considered a fundamental part of autism. We present a case study based on multimodal interaction between a mobile robot and a child with autism in spontaneous, free game play. This case study tells us that the robot mediates the interaction between the autistic child and the therapist once the robot-child interaction has been established. In addition, the child uses the robot as a mediator to express positive emotion while playing with the therapist. It is thought that the three-pronged interaction, i.e., child-robot-therapist, could better facilitate the transfer of social and emotional abilities to real-life settings. Robot therapy has a high potential to improve the condition of brain activity in autistic children.

  1. Quantifying the human-robot interaction forces between a lower limb exoskeleton and healthy users.

    Science.gov (United States)

    Rathore, Ashish; Wilcox, Matthew; Ramirez, Dafne Zuleima Morgado; Loureiro, Rui; Carlson, Tom

    2016-08-01

    To counter the many disadvantages of prolonged wheelchair use, patients with spinal cord injuries (SCI) are beginning to turn towards robotic exoskeletons. However, we are currently unaware of the magnitude and distribution of forces acting between the user and the exoskeleton. This is a critical issue, as SCI patients have an increased susceptibility to skin lesions and pressure ulcer development. Therefore, we developed a real-time force measuring apparatus, which was placed at the physical human-robot interface (pHRI) of a lower limb robotic exoskeleton. Experiments captured the dynamics of these interaction forces whilst the participants performed a range of typical stepping actions. Our results indicate that peak forces occurred at the anterior aspect of both the left and right legs, areas that are particularly prone to pressure ulcer development. A significant difference was also found between the average force experienced at the anterior and posterior sensors of the right thigh during the swing phase for different movement primitives. These results call for the integration of instrumented straps as standard in lower limb exoskeletons. They also highlight the potential of such straps to be used as an alternative/complementary interface for the high-level control of lower limb exoskeletons in some patient groups.

  2. Architecture for Multiple Interacting Robot Intelligences

    Science.gov (United States)

    Peters, Richard Alan, II (Inventor)

    2008-01-01

    An architecture for robot intelligence enables a robot to learn new behaviors and create new behavior sequences autonomously and interact with a dynamically changing environment. Sensory information is mapped onto a Sensory Ego-Sphere (SES) that rapidly identifies important changes in the environment and functions much like short term memory. Behaviors are stored in a database associative memory (DBAM) that creates an active map from the robot's current state to a goal state and functions much like long term memory. A dream state converts recent activities stored in the SES and creates or modifies behaviors in the DBAM.
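The record describes the SES and DBAM only at a high level. Purely as a loose illustration (the bucketing scheme, labels, and threshold below are invented, not from the patent), a direction-indexed short-term memory can act as the kind of change detector the SES is said to be:

```python
import math

class EgoSphere:
    """Toy short-term memory: latest sensory readings bucketed by direction."""

    def __init__(self, n_azimuth=36, n_elevation=18):
        self.n_az, self.n_el = n_azimuth, n_elevation
        self.cells = {}

    def _bucket(self, azimuth, elevation):
        # Discretize a direction into an (azimuth, elevation) cell
        az = int((azimuth % (2 * math.pi)) / (2 * math.pi) * self.n_az)
        el = int((elevation + math.pi / 2) / math.pi * (self.n_el - 1))
        return az, el

    def observe(self, azimuth, elevation, label, value, threshold=0.2):
        # Returns True when the reading in this direction changed notably,
        # i.e. when the observation is salient and worth acting on.
        key = (self._bucket(azimuth, elevation), label)
        old = self.cells.get(key)
        self.cells[key] = value
        return old is None or abs(value - old) > threshold

ses = EgoSphere()
first = ses.observe(0.10, 0.0, "distance", 2.00)  # new direction -> salient
same = ses.observe(0.12, 0.0, "distance", 2.05)   # same bucket, small change
moved = ses.observe(0.12, 0.0, "distance", 1.20)  # large change -> salient
```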

  3. Adaptive training algorithm for robot-assisted upper-arm rehabilitation, applicable to individualised and therapeutic human-robot interaction.

    Science.gov (United States)

    Chemuturi, Radhika; Amirabdollahian, Farshid; Dautenhahn, Kerstin

    2013-09-28

Rehabilitation robotics is progressing towards developing robots that can be used as advanced tools to augment the role of a therapist. These robots are capable of not only offering more frequent and more accessible therapies but also providing new insights into treatment effectiveness based on their ability to measure interaction parameters. A requirement for having more advanced therapies is to identify how robots can 'adapt' to each individual's needs at different stages of recovery. Hence, our research focused on developing an adaptive interface for the GENTLE/A rehabilitation system. The interface was based on a lead-lag performance model utilising the interaction between the human and the robot. The goal of the present study was to test the adaptability of the GENTLE/A system to the performance of the user. Point-to-point movements were executed using the HapticMaster (HM) robotic arm, the main component of the GENTLE/A rehabilitation system. The points were displayed as balls on the screen and some of the points also had a real object, providing a test-bed for the human-robot interaction (HRI) experiment. The HM was operated in various modes to test the adaptability of the GENTLE/A system based on the leading/lagging performance of the user. Thirty-two healthy participants took part in the experiment, comprising a training phase followed by the actual performance phase. The leading or lagging role of the participant could be used successfully to adjust the duration required by that participant to execute point-to-point movements, in various modes of robot operation and under various conditions. The adaptability of the GENTLE/A system was clearly evident from the durations recorded. The regression results showed that the participants required lower execution times with the help from a real object when compared to just a virtual object.
The 'reaching away' movements were longer to execute when compared to the 'returning towards' movements irrespective of the

  4. User localization during human-robot interaction.

    Science.gov (United States)

    Alonso-Martín, F; Gorostiza, Javi F; Malfaz, María; Salichs, Miguel A

    2012-01-01

    This paper presents a user localization system based on the fusion of visual information and sound source localization, implemented on a social robot called Maggie. One of the main requirements for natural interaction, whether human-human or human-robot, is an adequate spatial arrangement between the interlocutors, that is, being properly oriented and positioned at the right distance during the conversation in order to have a satisfactory communicative process. Our social robot uses a complete multimodal dialog system which manages the user-robot interaction during the communicative process. One of its main components is the user localization system presented here. To determine the most suitable position of the robot relative to the user, a proxemic study of human-robot interaction is required, which is described in this paper. The study has been made with two groups of users: children, aged between 8 and 17, and adults. Finally, at the end of the paper, experimental results with the proposed multimodal dialog system are presented.
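
    The record above does not spell out how the visual and auditory cues are combined; a common minimal scheme for fusing two noisy bearing estimates of the user's direction is inverse-variance weighting, sketched here (function and parameter names are illustrative, not taken from the Maggie system):

    ```python
    def fuse_bearings(theta_vision, var_vision, theta_audio, var_audio):
        """Combine two noisy bearing estimates (radians) of the user's
        direction by inverse-variance weighting; returns (theta, var).
        The fused variance is always smaller than either input variance."""
        w_v = 1.0 / var_vision
        w_a = 1.0 / var_audio
        theta = (w_v * theta_vision + w_a * theta_audio) / (w_v + w_a)
        var = 1.0 / (w_v + w_a)
        return theta, var
    ```

    For bearings near the ±π wrap-around the two angles should first be unwrapped into the same branch; this sketch assumes they already are.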

  5. A Robotic Coach Architecture for Elder Care (ROCARE) Based on Multi-User Engagement Models.

    Science.gov (United States)

    Fan, Jing; Bian, Dayi; Zheng, Zhi; Beuscher, Linda; Newhouse, Paul A; Mion, Lorraine C; Sarkar, Nilanjan

    2017-08-01

    The aging population with its concomitant medical conditions, physical and cognitive impairments, at a time of strained resources, establishes the urgent need to explore advanced technologies that may enhance function and quality of life. Recently, robotic technology, especially socially assistive robotics, has been investigated to address the physical, cognitive, and social needs of older adults. Most systems to date have predominantly focused on one-on-one human-robot interaction (HRI). In this paper, we present a multi-user engagement-based robotic coach system architecture (ROCARE). ROCARE is capable of administering both one-on-one and multi-user HRI, providing implicit and explicit channels of communication, and individualized activity management for long-term engagement. Two preliminary feasibility studies, a one-on-one interaction and a triadic interaction with two humans and a robot, were conducted, and the results indicated potential usefulness and acceptance by older adults, with and without cognitive impairment.

  6. A Multilayer Hidden Markov Models-Based Method for Human-Robot Interaction

    Directory of Open Access Journals (Sweden)

    Chongben Tao

    2013-01-01

    Full Text Available To achieve Human-Robot Interaction (HRI) by using gestures, a continuous gesture recognition approach based on Multilayer Hidden Markov Models (MHMMs) is proposed, which consists of two parts: a gesture spotting and segmentation module and a continuous gesture recognition module. Firstly, a Kinect sensor is used to capture 3D acceleration and 3D angular velocity data of hand gestures. Then, a Feed-forward Neural Network (FNN) and a threshold criterion are used for gesture spotting and segmentation, respectively. Afterwards, the segmented gesture signals are respectively preprocessed and vector symbolized by a sliding window and a K-means clustering method. Finally, symbolized data are sent into Lower Hidden Markov Models (LHMMs) to identify individual gestures, and then a Bayesian filter with sequential constraints among gestures in Upper Hidden Markov Models (UHMMs) is used to correct recognition errors created in LHMMs. Five predefined gestures are used to interact with a Kinect mobile robot in experiments. The experimental results show that the proposed method not only has good effectiveness and accuracy, but also has favorable real-time performance.
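
    The sliding-window and K-means symbolization front end named in this record can be sketched as follows. This is a minimal illustration of those two steps only; the window/cluster parameters are assumptions, and the FNN spotting stage and the HMM layers themselves are omitted:

    ```python
    import random

    def sliding_windows(signal, width, step):
        """Summarise each overlapping window of a multi-channel signal
        by its per-channel mean, giving one feature vector per window."""
        feats = []
        for start in range(0, len(signal) - width + 1, step):
            win = signal[start:start + width]
            n_ch = len(win[0])
            feats.append([sum(s[c] for s in win) / width for c in range(n_ch)])
        return feats

    def kmeans_symbolize(feats, k, iters=20, seed=0):
        """Naive k-means vector quantisation: returns one discrete symbol
        (cluster id) per window, ready to feed into the lower-layer HMMs."""
        rng = random.Random(seed)
        centers = [list(c) for c in rng.sample(feats, k)]

        def sqdist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))

        labels = [0] * len(feats)
        for _ in range(iters):
            # assign each window to its nearest codebook centre
            labels = [min(range(k), key=lambda j: sqdist(f, centers[j]))
                      for f in feats]
            # move each centre to the mean of its members
            for j in range(k):
                members = [f for f, l in zip(feats, labels) if l == j]
                if members:
                    centers[j] = [sum(col) / len(members) for col in zip(*members)]
        return labels
    ```

    The resulting symbol sequence is what a discrete-observation HMM would consume; in the paper's setup the input channels would be the 3D acceleration and angular velocity streams.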

  7. The relation between people's attitudes and anxiety towards robots in human-robot interaction

    NARCIS (Netherlands)

    de Graaf, M.M.A.; Ben Allouch, Soumaya

    2013-01-01

    This paper examines the relation between an interaction with a robot and people's attitudes and emotions towards robots. In our study, participants had an acquaintance talk with a social robot and both their general attitude and anxiety towards social robots were measured before and after the

  8. Interacting With Robots to Investigate the Bases of Social Interaction.

    Science.gov (United States)

    Sciutti, Alessandra; Sandini, Giulio

    2017-12-01

    Humans show a great natural ability at interacting with each other. Such efficiency in joint actions depends on a synergy between planned collaboration and emergent coordination, a subconscious mechanism based on a tight link between action execution and perception. This link supports phenomena such as mutual adaptation, synchronization, and anticipation, which drastically cut the delays in the interaction and the need for complex verbal instructions and result in the establishment of joint intentions, the backbone of social interaction. From a neurophysiological perspective, this is possible because the same neural system supporting action execution is responsible for the understanding and the anticipation of the observed actions of others. Defining which human motion features allow for such emergent coordination with another agent would be crucial to establish more natural and efficient interaction paradigms with artificial devices, ranging from assistive and rehabilitative technology to companion robots. However, investigating the behavioral and neural mechanisms supporting natural interaction poses substantial problems. In particular, the unconscious processes at the basis of emergent coordination (e.g., unintentional movements or gazing) are very difficult, if not impossible, to restrain or control in a quantitative way for a human agent. Moreover, during an interaction, participants influence each other continuously in a complex way, resulting in behaviors that go beyond experimental control. In this paper, we propose robotics technology as a potential solution to this methodological problem. Robots indeed can establish an interaction with a human partner, contingently reacting to their actions without losing the controllability of the experiment or the naturalness of the interactive scenario. A robot could represent an "interactive probe" to assess the sensory and motor mechanisms underlying human-human interaction. We discuss this proposal with examples from our

  9. Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction

    Directory of Open Access Journals (Sweden)

    Juan M. Gandarias

    2018-02-01

    Full Text Available The use of tactile perception can help first response robotic teams in disaster scenarios, where visibility conditions are often reduced due to the presence of dust, mud, or smoke, distinguishing human limbs from other objects with similar shapes. Here, the integration of the tactile sensor in adaptive grippers is evaluated, measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs) using a flexible sensor mounted in adaptive grippers. A total of 15 classes with 50 tactile images each were trained, including human body parts and common environment objects, in semi-rigid and flexible adaptive grippers based on the fin ray effect. The classifier was compared against the rigid configuration and a support vector machine classifier (SVM). Finally, a two-level output network has been proposed to provide both object-type recognition and human/non-human classification. Sensors in adaptive grippers have a higher number of non-null tactels (up to 37% more), with a lower mean of pressure values (up to 72% less) than when using a rigid sensor, with a softer grip, which is needed in physical human–robot interaction (pHRI). A semi-rigid implementation with 95.13% object recognition rate was chosen, even though the human/non-human classification had better results (98.78%) with a rigid sensor.

  10. Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human-Robot Interaction.

    Science.gov (United States)

    Gandarias, Juan M; Gómez-de-Gabriel, Jesús M; García-Cerezo, Alfonso J

    2018-02-26

    The use of tactile perception can help first response robotic teams in disaster scenarios, where visibility conditions are often reduced due to the presence of dust, mud, or smoke, distinguishing human limbs from other objects with similar shapes. Here, the integration of the tactile sensor in adaptive grippers is evaluated, measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs) using a flexible sensor mounted in adaptive grippers. A total of 15 classes with 50 tactile images each were trained, including human body parts and common environment objects, in semi-rigid and flexible adaptive grippers based on the fin ray effect. The classifier was compared against the rigid configuration and a support vector machine classifier (SVM). Finally, a two-level output network has been proposed to provide both object-type recognition and human/non-human classification. Sensors in adaptive grippers have a higher number of non-null tactels (up to 37% more), with a lower mean of pressure values (up to 72% less) than when using a rigid sensor, with a softer grip, which is needed in physical human-robot interaction (pHRI). A semi-rigid implementation with 95.13% object recognition rate was chosen, even though the human/non-human classification had better results (98.78%) with a rigid sensor.
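
    As a rough illustration of the two-level output idea described in this record, the second (human/non-human) level can be derived as a post-processing stage over the first-level class distribution. The class names below are hypothetical stand-ins, not the paper's actual 15-label set, and the DCNN that would produce the probabilities is omitted:

    ```python
    # Hypothetical label set; the paper's exact 15 classes are not listed here.
    CLASSES = ["arm", "hand", "leg", "foot", "head",
               "bottle", "can", "pen", "phone", "mouse",
               "cable", "ball", "cup", "tape", "scissors"]
    HUMAN_CLASSES = {"arm", "hand", "leg", "foot", "head"}

    def two_level_output(class_probs):
        """First level: most likely object class. Second level: human/non-human,
        decided from the total probability mass on human-body classes."""
        best = max(range(len(CLASSES)), key=lambda i: class_probs[i])
        p_human = sum(p for c, p in zip(CLASSES, class_probs)
                      if c in HUMAN_CLASSES)
        return CLASSES[best], p_human >= 0.5
    ```

    Aggregating probability mass rather than thresholding the single top class makes the binary decision more robust when the network's confidence is spread across several human-body classes.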

  11. Perceptions of a Soft Robotic Tentacle in Interaction

    DEFF Research Database (Denmark)

    Jørgensen, Jonas

    2018-01-01

    Soft robotics technology has been proposed for a number of applications that involve human-robot interaction. This video documents a platform created to explore human perceptions of soft robots in interaction. The video presents select footage from an interaction experiment conducted...

  12. Playful Interaction with Voice Sensing Modular Robots

    DEFF Research Database (Denmark)

    Heesche, Bjarke; MacDonald, Ewen; Fogh, Rune

    2013-01-01

    This paper describes a voice sensor, suitable for modular robotic systems, which estimates the energy and fundamental frequency, F0, of the user’s voice. Through a number of example applications and tests with children, we observe how the voice sensor facilitates playful interaction between children and two different robot configurations. In future work, we will investigate if such a system can motivate children to improve voice control and explore how to extend the sensor to detect emotions in the user’s voice.

  13. From Human-Computer Interaction to Human-Robot Social Interaction

    OpenAIRE

    Toumi, Tarek; Zidani, Abdelmadjid

    2014-01-01

    Human-Robot Social Interaction has become one of the active research fields in which researchers from different areas propose solutions and directives leading robots to improve their interactions with humans. In this paper we propose to introduce works in both human-robot interaction and human-computer interaction and to make a bridge between them, i.e., to integrate the robot's emotions and capabilities into the human-computer model so that it becomes adequate for human-robot interaction, and discuss chall...

  14. Child, Robot and Educational Material : A Triadic Interaction

    NARCIS (Netherlands)

    Davison, Daniel Patrick

    The process in which a child and a robot work together to solve a learning task can be characterised as a triadic interaction. Interactions between the child and robot; the child and learning materials; and the robot and learning materials will each shape the perception and appreciation the child

  15. Child, Robot and Educational Material: A Triadic Interaction

    NARCIS (Netherlands)

    Davison, Daniel Patrick

    The process in which a child and a robot work together to solve a learning task can be characterised as a triadic interaction. Interactions between the child and robot; the child and learning materials; and the robot and learning materials will each shape the perception and appreciation the child

  16. Human-Robot Interaction and Human Self-Realization

    DEFF Research Database (Denmark)

    Nørskov, Marco

    2014-01-01

    is to test the basis for this type of discrimination when it comes to human-robot interaction. Furthermore, the paper will take Heidegger's warning concerning technology as a vantage point and explore the possibility of human-robot interaction forming a praxis that might help humans to be with robots beyond...

  17. Effect of cognitive biases on human-robot interaction: a case study of robot's misattribution

    OpenAIRE

    Biswas, Mriganka; Murray, John

    2014-01-01

    This paper presents a model for developing long-term human-robot interactions and social relationships based on the principle of 'human' cognitive biases applied to a robot. The aim of this work is to study how a robot influenced with human ‘misattribution’ helps to build better human-robot interactions than unbiased robots. The results presented in this paper suggest that it is important to know the effect of cognitive biases in human characteristics and interactions in order to better u...

  18. A taxonomy for user-healthcare robot interaction.

    Science.gov (United States)

    Bzura, Conrad; Im, Hosung; Liu, Tammy; Malehorn, Kevin; Padir, Taskin; Tulu, Bengisu

    2012-01-01

    This paper evaluates existing taxonomies aimed at characterizing the interaction between robots and their users and modifies them for health care applications. The modifications are based on existing robot technologies and user acceptance of robotics. Characterization of the user, or in this case the patient, is a primary focus of the paper, as they present a unique new role as robot users. While therapeutic and monitoring-related applications for robots are still relatively uncommon, we believe they will begin to grow and thus it is important that the emerging relationship between robot and patient is well understood.

  19. Multimodal interaction for human-robot teams

    Science.gov (United States)

    Burke, Dustin; Schurr, Nathan; Ayers, Jeanine; Rousseau, Jeff; Fertitta, John; Carlin, Alan; Dumond, Danielle

    2013-05-01

    Unmanned ground vehicles have the potential for supporting small dismounted teams in mapping facilities, maintaining security in cleared buildings, and extending the team's reconnaissance and persistent surveillance capability. In order for such autonomous systems to integrate with the team, we must move beyond current interaction methods using heads-down teleoperation which require intensive human attention and affect the human operator's ability to maintain local situational awareness and ensure their own safety. This paper focuses on the design, development and demonstration of a multimodal interaction system that incorporates naturalistic human gestures, voice commands, and a tablet interface. By providing multiple, partially redundant interaction modes, our system degrades gracefully in complex environments and enables the human operator to robustly select the most suitable interaction method given the situational demands. For instance, the human can silently use arm and hand gestures for commanding a team of robots when it is important to maintain stealth. The tablet interface provides an overhead situational map allowing waypoint-based navigation for multiple ground robots in beyond-line-of-sight conditions. Using lightweight, wearable motion sensing hardware either worn comfortably beneath the operator's clothing or integrated within their uniform, our non-vision-based approach enables an accurate, continuous gesture recognition capability without line-of-sight constraints. To reduce the training necessary to operate the system, we designed the interactions around familiar arm and hand gestures.

  20. Interaction dynamics of multiple mobile robots with simple navigation strategies

    Science.gov (United States)

    Wang, P. K. C.

    1989-01-01

    The global dynamic behavior of multiple interacting autonomous mobile robots with simple navigation strategies is studied. Here, the effective spatial domain of each robot is taken to be a closed ball about its mass center. It is assumed that each robot has a specified cone of visibility such that interaction with other robots takes place only when they enter its visibility cone. Based on a particle model for the robots, various simple homing and collision-avoidance navigation strategies are derived. Then, an analysis of the dynamical behavior of the interacting robots in unbounded spatial domains is made. The article concludes with the results of computer simulation studies of two or more interacting robots.
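
    A minimal particle-model sketch of the homing and cone-of-visibility avoidance strategies described in this record; the speed, sensing radius, cone half-angle, and the inverse-square repulsion law are all illustrative assumptions, not the paper's derivations:

    ```python
    import math

    V_MAX = 0.2                      # homing speed (assumed units per step)
    SENSE_RADIUS = 1.5               # range of the visibility cone
    HALF_ANGLE = math.radians(60)    # cone half-angle

    def navigate(pos, heading, goal, others):
        """Velocity command for one point robot: home toward the goal, and
        add a repulsive term for every other robot inside the visibility cone."""
        gx, gy = goal[0] - pos[0], goal[1] - pos[1]
        dist = math.hypot(gx, gy)
        if dist < 1e-9:
            return (0.0, 0.0)        # already at the goal
        vx, vy = V_MAX * gx / dist, V_MAX * gy / dist
        for ox, oy in others:
            dx, dy = ox - pos[0], oy - pos[1]
            d = math.hypot(dx, dy)
            if 1e-9 < d < SENSE_RADIUS:
                # bearing of the neighbour relative to our heading, in [-pi, pi]
                diff = (math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi
                if abs(diff) <= HALF_ANGLE:
                    vx -= V_MAX * dx / d ** 2   # repulsion grows as range shrinks
                    vy -= V_MAX * dy / d ** 2
        return (vx, vy)
    ```

    Iterating `pos += navigate(...)` for each robot gives the kind of interaction dynamics the article analyses; a neighbour outside the cone is simply invisible and exerts no repulsion.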

  1. Exploring multimodal robotic interaction through storytelling for aphasics

    NARCIS (Netherlands)

    Mubin, O.; Al Mahmud, A.; Abuelma'atti, O.; England, D.

    2008-01-01

    In this poster, we propose the design of a multimodal robotic interaction mechanism that is intended to be used by Aphasics for storytelling. Through limited physical interaction, mild to moderate aphasic people can interact with a robot that may help them to be more active in their day to day

  2. Pose Estimation and Adaptive Robot Behaviour for Human-Robot Interaction

    DEFF Research Database (Denmark)

    Svenstrup, Mikael; Hansen, Søren Tranberg; Andersen, Hans Jørgen

    2009-01-01

    This paper introduces a new method to determine a person’s pose based on laser range measurements. Such estimates are typically a prerequisite for any human-aware robot navigation, which is the basis for effective and time-extended interaction between a mobile robot and a human. The robot......’s pose. The resulting pose estimates are used to identify humans who wish to be approached and interacted with. The interaction motion of the robot is based on adaptive potential functions centered around the person that respect the person's social spaces. The method is tested in experiments...
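
    The adaptive potential functions are not specified in this record; one minimal person-centered formulation combines a repulsive Gaussian over the person's personal space with an attraction to an approach pose in front of the person. All parameters below are assumptions for illustration, not the paper's values:

    ```python
    import math

    def social_cost(x, y, person, sigma=0.8, approach_dist=1.2, weight=4.0):
        """Cost surface around a person (px, py, facing angle ptheta):
        a repulsive Gaussian centred on the person (personal space) plus a
        quadratic pull toward a pose at a comfortable distance in front."""
        px, py, ptheta = person
        tx = px + approach_dist * math.cos(ptheta)   # target in front of person
        ty = py + approach_dist * math.sin(ptheta)
        d2_person = (x - px) ** 2 + (y - py) ** 2
        d2_target = (x - tx) ** 2 + (y - ty) ** 2
        return d2_target + weight * math.exp(-d2_person / (2 * sigma ** 2))

    def descend(start, person, step=0.05, iters=400, eps=1e-3):
        """Numerical gradient descent on the cost surface: the robot slides
        toward the frontal approach pose while the Gaussian keeps it from
        entering the person's intimate space."""
        x, y = start
        for _ in range(iters):
            gx = (social_cost(x + eps, y, person) - social_cost(x - eps, y, person)) / (2 * eps)
            gy = (social_cost(x, y + eps, person) - social_cost(x, y - eps, person)) / (2 * eps)
            x, y = x - step * gx, y - step * gy
        return x, y
    ```

    Because the repulsion pushes the minimum slightly outward, the robot settles a bit beyond the nominal approach distance, directly in front of the person.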

  3. Anthropomorphic Robot Design and User Interaction Associated with Motion

    Science.gov (United States)

    Ellis, Stephen R.

    2016-01-01

    Though in its original concept a robot was conceived to have some human-like shape, most robots now in use have specific industrial purposes and do not closely resemble humans. Nevertheless, robots that resemble human form in some way have continued to be introduced. They are called anthropomorphic robots. The fact that the user interface to all robots is now highly mediated means that the form of the user interface is not necessarily connected to the robot's form, human or otherwise. Consequently, the unique way the design of anthropomorphic robots affects their user interaction is through their general appearance and the way they move. These robots' human-like appearance acts as a kind of generalized predictor that gives their operators, and those with whom they may directly work, the expectation that they will behave to some extent like a human. This expectation is especially prominent for interactions with social robots, which are built to enhance it. Often interaction with them may be mainly cognitive because they are not necessarily kinematically intricate enough for complex physical interaction. Their body movement, for example, may be limited to simple wheeled locomotion. An anthropomorphic robot with human form, however, can be kinematically complex and designed, for example, to reproduce the details of human limb, torso, and head movement. Because of the mediated nature of robot control, there remains in general no necessary connection between the specific form of user interface and the anthropomorphic form of the robot. But their anthropomorphic kinematics and dynamics imply that the impact of their design shows up in the way the robot moves. The central finding of this report is that the control of this motion is a basic design element through which the anthropomorphic form can affect user interaction. 
In particular, designers of anthropomorphic robots can take advantage of the inherent human-like movement to 1) improve the user's direct manual control over

  4. Interaction debugging : an integral approach to analyze human-robot interaction

    NARCIS (Netherlands)

    Kooijmans, T.; Kanda, T.; Bartneck, C.; Ishiguro, H.; Hagita, N.

    2006-01-01

    Along with the development of interactive robots, controlled experiments and field trials are regularly conducted to stage human-robot interaction. Experience in this field has shown that analyzing human-robot interaction for evaluation purposes fosters the development of improved systems and the

  5. Human-robot interaction strategies for walker-assisted locomotion

    CERN Document Server

    Cifuentes, Carlos A

    2016-01-01

    This book presents the development of a new multimodal human-robot interface for testing and validating control strategies applied to robotic walkers for assisting human mobility and gait rehabilitation. The aim is to achieve a closer interaction between the robotic device and the individual, empowering the rehabilitation potential of such devices in clinical applications. Trends and opportunities for future advances in the field of assistive locomotion via the development of hybrid solutions based on the combination of smart walkers and biomechatronic exoskeletons are also discussed.

  6. A Meta-Analysis of Factors Influencing the Development of Human-Robot Trust

    Science.gov (United States)

    2011-12-01

    culture accounts for significant differences in trust ratings for robots; some collectivist cultures have higher trust ratings than individualistic ... Other potential factors impacting trust in HRI are directly related to the environment in which HRI occurs. For example, the cultural ... cultures (Li et al., 2010). Our SMEs also indicated that team collaboration issues (e.g., communication, shared mental models) and tasking

  7. Safe physical human robot interaction- past, present and future

    International Nuclear Information System (INIS)

    Pervez, Aslam; Ryu, Jeha

    2008-01-01

    When a robot physically interacts with a human user, the requirements change drastically. The most important requirement is the safety of the human user, in the sense that the robot should not harm the human in any situation. During the last few years, research has focused on various aspects of safe physical human robot interaction. This paper provides a review of the work on safe physical interaction of robotic systems sharing their workspace with human users (especially elderly people). Three distinct areas of research are identified: interaction safety assessment, interaction safety through design, and interaction safety through planning and control. The paper then highlights the current challenges and available technologies and points out future research directions for realization of a safe and dependable robotic system for human users

  8. A new approach of active compliance control via fuzzy logic control for multifingered robot hand

    Science.gov (United States)

    Jamil, M. F. A.; Jalani, J.; Ahmad, A.

    2016-07-01

    Safety is a vital issue in Human-Robot Interaction (HRI). In order to guarantee safety in HRI, a model reference impedance control can be a very useful approach introducing a compliant control. In particular, this paper establishes a fuzzy logic compliance control (i.e. active compliance control) to reduce impact and forces during physical interaction between humans/objects and robots. Exploiting a virtual mass-spring-damper system allows us to determine a desired compliant level by understanding the behavior of the model reference impedance control. The performance of fuzzy logic compliant control is tested in simulation for a robotic hand known as the RED Hand. The results show that the fuzzy logic is a feasible control approach, particularly to control position and to provide compliant control. In addition, the fuzzy logic control allows us to simplify the controller design process (i.e. avoid complex computation) when dealing with nonlinearities and uncertainties.
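
    The fuzzy rule base itself is not given in the abstract, but the virtual mass-spring-damper reference model it tunes can be sketched directly. M, B, and K below are illustrative values, not the RED Hand gains:

    ```python
    def impedance_step(x, v, f_ext, x_d, M=1.0, B=8.0, K=50.0, dt=0.001):
        """One explicit-Euler step of the reference impedance model
        M*x'' + B*x' + K*(x - x_d) = f_ext: the virtual mass-spring-damper
        that defines how compliantly the finger yields to a contact force."""
        a = (f_ext - B * v - K * (x - x_d)) / M
        return x + v * dt, v + a * dt

    def settle(f_ext, steps=20000):
        """Simulate to (approximate) steady state under a constant force;
        the displacement should approach f_ext / K."""
        x, v = 0.0, 0.0
        for _ in range(steps):
            x, v = impedance_step(x, v, f_ext, x_d=0.0)
        return x
    ```

    At steady state the displacement settles at f_ext / K, so a softer virtual spring (smaller K) yields a more compliant finger; in the paper's scheme the fuzzy logic would adjust such parameters online rather than keep them fixed.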

  9. Empathizing with Emotional Robot Based on Cognition Reappraisal

    Institute of Scientific and Technical Information of China (English)

    Xin Liu; Lun Xie; Zhiliang Wang

    2017-01-01

    This paper proposes a continuous cognitive emotional regulation model for a robot in the case of external emotional stimuli from the interacting person's expressions. It integrates a guiding cognitive reappraisal strategy into the HMM (Hidden Markov Model) emotional interactive model for empathizing between robot and person. The emotion is considered as a source in the 3D space (Arousal, Valence, and Stance). State transition and emotion intensity can be quantitatively analyzed in the continuous space. This cognition-emotion interactive model has been verified by the expression and behavior robot. Empathizing is the main distinguishing feature of our work, and it is realized by the emotional regulation which operates in a continuous 3D emotional space enabling a wide range of intermediate emotions. The experiment results provide evidence, in terms of acceptability, accuracy, richness, fluency, interestingness, friendliness and exaggeration, that the robot with cognition and emotional control ability could be better accepted in human-robot interaction (HRI).

  10. Velocity-curvature patterns limit human-robot physical interaction.

    Science.gov (United States)

    Maurice, Pauline; Huber, Meghan E; Hogan, Neville; Sternad, Dagmar

    2018-01-01

    Physical human-robot collaboration is becoming more common, both in industrial and service robotics. Cooperative execution of a task requires intuitive and efficient interaction between both actors. For humans, this means being able to predict and adapt to robot movements. Given that natural human movement exhibits several robust features, we examined whether human-robot physical interaction is facilitated when these features are considered in robot control. The present study investigated how humans adapt to biological and non-biological velocity patterns in robot movements. Participants held the end-effector of a robot that traced an elliptic path with either biological (two-thirds power law) or non-biological velocity profiles. Participants were instructed to minimize the force applied on the robot end-effector. Results showed that the applied force was significantly lower when the robot moved with a biological velocity pattern. With extensive practice and enhanced feedback, participants were able to decrease their force when following a non-biological velocity pattern, but never reached forces below those obtained with the 2/3 power law profile. These results suggest that some robust features observed in natural human movements are also a strong preference in guided movements. Therefore, such features should be considered in human-robot physical collaboration.
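
    The biological profile referred to in this record is the two-thirds power law, v = γ·κ^(−1/3), relating tangential speed to path curvature. This sketch evaluates it on an elliptic path; the axis lengths and gain γ are arbitrary, not the study's values:

    ```python
    import math

    def ellipse_speed(t, a=2.0, b=1.0, gamma=1.0):
        """Tangential speed prescribed by the two-thirds power law,
        v = gamma * kappa**(-1/3), on the ellipse (a*cos t, b*sin t).
        kappa is the curvature of the ellipse at parameter t."""
        kappa = a * b / (a ** 2 * math.sin(t) ** 2 + b ** 2 * math.cos(t) ** 2) ** 1.5
        return gamma * kappa ** (-1.0 / 3.0)
    ```

    Speed is lowest where curvature is highest (the ends of the major axis, t = 0 and t = π for a > b), which is exactly the velocity modulation a human partner anticipates naturally.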

  11. Effects of Robot Facial Characteristics and Gender in Persuasive Human-Robot Interaction

    Directory of Open Access Journals (Sweden)

    Aimi S. Ghazali

    2018-06-01

    Full Text Available The growing interest in social robotics makes it relevant to examine the potential of robots as persuasive agents and, more specifically, to examine how robot characteristics influence the way people experience such interactions and comply with the persuasive attempts by robots. The purpose of this research is to identify how the (ostensible) gender and the facial characteristics of a robot influence the extent to which people trust it and the psychological reactance they experience from its persuasive attempts. This paper reports a laboratory study where SociBot™, a robot capable of displaying different faces and dynamic social cues, delivered persuasive messages to participants while playing a game. In-game choice behavior was logged, and trust and reactance toward the advisor were measured using questionnaires. Results show that a robotic advisor with upturned eyebrows and lips (features that people tend to trust more in humans) is more persuasive, evokes more trust, and less psychological reactance compared to one displaying eyebrows pointing down and lips curled downwards at the edges (facial characteristics typically not trusted in humans). Gender of the robot did not affect trust, but participants experienced higher psychological reactance when interacting with a robot of the opposite gender. Remarkably, mediation analysis showed that liking of the robot fully mediates the influence of facial characteristics on trusting beliefs and psychological reactance. Also, psychological reactance was a strong and reliable predictor of trusting beliefs but not of trusting behavior. These results suggest robots that are intended to influence human behavior should be designed to have facial characteristics we trust in humans and could be personalized to have the same gender as the user. Furthermore, personalization and adaptation techniques designed to make people like the robot more may help ensure they will also trust the robot.

  12. Interactions With Robots: The Truths We Reveal About Ourselves.

    Science.gov (United States)

    Broadbent, Elizabeth

    2017-01-03

    In movies, robots are often extremely humanlike. Although these robots are not yet reality, robots are currently being used in healthcare, education, and business. Robots provide benefits such as relieving loneliness and enabling communication. Engineers are trying to build robots that look and behave like humans and thus need comprehensive knowledge not only of technology but also of human cognition, emotion, and behavior. This need is driving engineers to study human behavior toward other humans and toward robots, leading to greater understanding of how humans think, feel, and behave in these contexts, including our tendencies for mindless social behaviors, anthropomorphism, uncanny feelings toward robots, and the formation of emotional attachments. However, in considering the increased use of robots, many people have concerns about deception, privacy, job loss, safety, and the loss of human relationships. Human-robot interaction is a fascinating field and one in which psychologists have much to contribute, both to the development of robots and to the study of human behavior.

  13. Multi-function robots with speech interaction and emotion feedback

    Science.gov (United States)

    Wang, Hongyu; Lou, Guanting; Ma, Mengchao

    2018-03-01

    Nowadays, service robots have been applied in many public settings; however, most of them still lack speech interaction, especially speech-emotion interaction feedback. To make the robot more humanoid, an Arduino microcontroller was used in this study for the speech recognition module and the servo motor control module, to achieve the robot's speech interaction and emotion feedback functions. In addition, a W5100 was adopted for the network connection to transmit information via the Internet, providing broad application prospects for the robot in the area of the Internet of Things (IoT).

  14. Ocular interaction with robots: an aid to the disabled

    International Nuclear Information System (INIS)

    Azorin, J.M.; Ianez, E.; Fernandez Jover, E.; Sabater, J.M.

    2010-01-01

    This paper describes a technique to remotely control a robot arm using the user's eye movements. This method will help disabled people control a robot to aid them in performing tasks in their daily lives. The electrooculography (EOG) technique is used to detect eye movements: EOG registers the potential difference between the cornea and the retina using electrodes. The eye movements are used to control a remote robot arm with 6 degrees of freedom. First, the paper introduces several eye-movement techniques for interacting with devices, focusing on EOG. Then, it describes the system that allows interacting with a robot through eye movements. Finally, it presents experimental results for the robot controlled by the EOG-based interface. (Author).
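
    As a rough illustration of how an EOG interface of this kind can be turned into discrete control commands, the sketch below thresholds the mean deflection of a horizontal and a vertical EOG channel and maps the dominant axis to a direction. The channel layout, threshold value, and function name are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    # Illustrative sketch (not from the paper): map windows of horizontal and
    # vertical EOG samples to one of five discrete commands. A saccade appears
    # as a sustained deflection whose sign indicates gaze direction; the
    # threshold (in microvolts) is an assumed value.

    def detect_eye_movement(h_eog, v_eog, threshold=100.0):
        """Return 'left', 'right', 'up', 'down', or 'rest' for one window."""
        h, v = float(np.mean(h_eog)), float(np.mean(v_eog))
        if abs(h) < threshold and abs(v) < threshold:
            return "rest"
        if abs(h) >= abs(v):
            return "right" if h > 0 else "left"
        return "up" if v > 0 else "down"

    print(detect_eye_movement(np.full(50, 150.0), np.zeros(50)))  # -> right
    ```

    In a real system the window length, filtering, and per-user calibration of the threshold would matter far more than this toy logic suggests.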

  15. Control of a Robot Dancer for Enhancing Haptic Human-Robot Interaction in Waltz.

    Science.gov (United States)

    Hongbo Wang; Kosuge, K

    2012-01-01

    Haptic interaction between a human leader and a robot follower in waltz is studied in this paper. An inverted pendulum model is used to approximate the human's body dynamics. With feedback from the force sensor and laser range finders, the robot is able to estimate the human leader's state using an extended Kalman filter (EKF). To reduce the interaction force, two robot controllers, namely an admittance-with-virtual-force controller and an inverted-pendulum controller, are proposed and evaluated in experiments. The former controller failed the experiment, and the reasons for the failure are explained; the use of the latter controller is validated by the experimental results.
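
    The leader-state estimation described above relies on an extended Kalman filter over an inverted-pendulum model. As a minimal sketch of that idea (with illustrative model constants and noise levels that are not taken from the paper), one EKF predict/update step for a planar pendulum state x = [angle, angular velocity], measured through the angle only, might look like:

    ```python
    import numpy as np

    # Minimal EKF sketch for a planar inverted pendulum; g, l, dt, Q, R are
    # assumed values for illustration, not the paper's parameters.
    g, l, dt = 9.81, 1.0, 0.01

    def f(x):                      # discrete-time process model
        theta, omega = x
        return np.array([theta + omega * dt,
                         omega + (g / l) * np.sin(theta) * dt])

    def F(x):                      # Jacobian of f
        theta, _ = x
        return np.array([[1.0, dt],
                         [(g / l) * np.cos(theta) * dt, 1.0]])

    H = np.array([[1.0, 0.0]])     # we measure the angle only
    Q = np.eye(2) * 1e-5           # process noise covariance
    R = np.array([[1e-3]])         # measurement noise covariance

    def ekf_step(x, P, z):
        # predict
        x_pred = f(x)
        Fx = F(x)
        P_pred = Fx @ P @ Fx.T + Q
        # update with angle measurement z
        y = z - H @ x_pred
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + (K @ y).ravel()
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new
    ```

    Each call fuses one measurement; iterating it over a trajectory keeps the state estimate close to the true pendulum state.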

  16. Why Robots Should Be Social: Enhancing Machine Learning through Social Human-Robot Interaction.

    Science.gov (United States)

    de Greeff, Joachim; Belpaeme, Tony

    2015-01-01

    Social learning is a powerful method for cultural propagation of knowledge and skills relying on a complex interplay of learning strategies, social ecology and the human propensity for both learning and tutoring. Social learning has the potential to be an equally potent learning strategy for artificial systems and robots in specific. However, given the complexity and unstructured nature of social learning, implementing social machine learning proves to be a challenging problem. We study one particular aspect of social machine learning: that of offering social cues during the learning interaction. Specifically, we study whether people are sensitive to social cues offered by a learning robot, in a similar way to children's social bids for tutoring. We use a child-like social robot and a task in which the robot has to learn the meaning of words. For this a simple turn-based interaction is used, based on language games. Two conditions are tested: one in which the robot uses social means to invite a human teacher to provide information based on what the robot requires to fill gaps in its knowledge (i.e. expression of a learning preference); the other in which the robot does not provide social cues to communicate a learning preference. We observe that conveying a learning preference through the use of social cues results in better and faster learning by the robot. People also seem to form a "mental model" of the robot, tailoring the tutoring to the robot's performance as opposed to using simply random teaching. In addition, the social learning shows a clear gender effect with female participants being responsive to the robot's bids, while male teachers appear to be less receptive. This work shows how additional social cues in social machine learning can result in people offering better quality learning input to artificial systems, resulting in improved learning performance.

  17. Why Robots Should Be Social: Enhancing Machine Learning through Social Human-Robot Interaction.

    Directory of Open Access Journals (Sweden)

    Joachim de Greeff

    Full Text Available Social learning is a powerful method for cultural propagation of knowledge and skills relying on a complex interplay of learning strategies, social ecology and the human propensity for both learning and tutoring. Social learning has the potential to be an equally potent learning strategy for artificial systems and robots in specific. However, given the complexity and unstructured nature of social learning, implementing social machine learning proves to be a challenging problem. We study one particular aspect of social machine learning: that of offering social cues during the learning interaction. Specifically, we study whether people are sensitive to social cues offered by a learning robot, in a similar way to children's social bids for tutoring. We use a child-like social robot and a task in which the robot has to learn the meaning of words. For this a simple turn-based interaction is used, based on language games. Two conditions are tested: one in which the robot uses social means to invite a human teacher to provide information based on what the robot requires to fill gaps in its knowledge (i.e. expression of a learning preference); the other in which the robot does not provide social cues to communicate a learning preference. We observe that conveying a learning preference through the use of social cues results in better and faster learning by the robot. People also seem to form a "mental model" of the robot, tailoring the tutoring to the robot's performance as opposed to using simply random teaching. In addition, the social learning shows a clear gender effect with female participants being responsive to the robot's bids, while male teachers appear to be less receptive. This work shows how additional social cues in social machine learning can result in people offering better quality learning input to artificial systems, resulting in improved learning performance.

  18. Context-Augmented Robotic Interaction Layer (CARIL), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — CHI Systems and the Institute for Human Machine Cognition have teamed to create a human-robot interaction system that leverages cognitive representations of shared...

  19. HRI and the future of law

    NARCIS (Netherlands)

    Fosch Villaronga, Eduard; Heldeweg, Michiel A.

    2017-01-01

    Following Sinek’s what-how-why model, this project is about the creation of a dynamic regulatory instrument that co-evolves with robot technology development (what), using a robot and a regulatory impact assessments and evaluation settings (simulation and living labs) for empirical testing (how),

  20. Social Moments: A Perspective on Interaction for Social Robotics

    OpenAIRE

    Durantin, Gautier; Heath, Scott; Wiles, Janet

    2017-01-01

    During a social interaction, events that happen at different timescales can indicate social meanings. In order to socially engage with humans, robots will need to be able to comprehend and manipulate the social meanings that are associated with these events. We define social moments as events that occur within a social interaction and which can signify a pragmatic or semantic meaning. A challenge for social robots is recognizing social moments that occur on short timescales, which can be on t...

  1. Moving android: on social robots and body-in-interaction.

    Science.gov (United States)

    Alac, Morana

    2009-08-01

    Social robotics studies embodied technologies designed for social interaction. This paper examines the implied idea of embodiment using as data a sequence in which practitioners of social robotics are involved in designing a robot's movement. The moments of learning and work in the laboratory enact the social body as material, dynamic, and multiparty: the body-in-interaction. In describing subject-object reconfigurations, the paper explores how the well-known ideas of extending the body with instruments can be applied to a technology designed to function as our surrogate.

  2. Human-robot interaction assessment using dynamic engagement profiles

    DEFF Research Database (Denmark)

    Drimus, Alin; Poltorak, Nicole

    2017-01-01

    This paper addresses the use of convolutional neural networks for image analysis resulting in an engagement metric that can be used to assess the quality of human-robot interactions. We propose a method based on a pretrained convolutional network able to map emotions onto a continuous [0-1] interval, where 0 represents disengaged and 1 fully engaged. The network shows a good accuracy at recognizing the engagement state of humans given positive emotions. A time-based analysis of interaction experiments between small humanoid robots and humans provides time series of engagement estimates, which … and is applicable to humanoid robotics as well as other related contexts.
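
    A minimal sketch of the post-processing stage described above: per-frame emotion probabilities (here assumed to come from a pretrained CNN; the emotion labels and weights below are illustrative assumptions, not the authors' model) are mapped onto a continuous engagement score in [0, 1] and smoothed into a time series.

    ```python
    import numpy as np

    # Assumed emotion set and per-emotion engagement weights (illustrative).
    EMOTIONS = ["happy", "surprise", "neutral", "sad", "angry"]
    WEIGHTS = np.array([1.0, 0.9, 0.5, 0.2, 0.1])

    def engagement(frame_probs):
        """Weighted sum of emotion probabilities -> engagement in [0, 1]."""
        p = np.asarray(frame_probs, dtype=float)
        p = p / p.sum()                      # normalize defensively
        return float(p @ WEIGHTS)

    def engagement_series(all_probs, window=5):
        """Per-frame scores smoothed with a moving average."""
        scores = np.array([engagement(p) for p in all_probs])
        kernel = np.ones(window) / window
        return np.convolve(scores, kernel, mode="valid")
    ```

    The smoothing window trades responsiveness against noise in the per-frame classifier output.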

  3. A Video Game-Based Framework for Analyzing Human-Robot Interaction: Characterizing Interface Design in Real-Time Interactive Multimedia Applications

    Science.gov (United States)

    2006-01-01

    …segments video game interaction into domain-independent components which together form a framework that can be used to characterize real-time interactive multimedia applications in general and HRI in particular. We provide examples of using the components in both the video game and the Unmanned Aerial…

  4. Intelligence for Human-Assistant Planetary Surface Robots

    Science.gov (United States)

    Hirsh, Robert; Graham, Jeffrey; Tyree, Kimberly; Sierhuis, Maarten; Clancey, William J.

    2006-01-01

    The central premise in developing effective human-assistant planetary surface robots is that robotic intelligence is needed. The exact type, method, forms and/or quantity of intelligence is an open issue being explored on the ERA project, as well as others. In addition to field testing, theoretical research into this area can help provide answers on how to design future planetary robots. Many fundamental intelligence issues are discussed by Murphy [2], including (a) learning, (b) planning, (c) reasoning, (d) problem solving, (e) knowledge representation, and (f) computer vision (stereo tracking, gestures). The new "social interaction/emotional" form of intelligence that some consider critical to Human Robot Interaction (HRI) can also be addressed by human assistant planetary surface robots, as human operators feel more comfortable working with a robot when the robot is verbally (or even physically) interacting with them. Arkin [3] and Murphy are both proponents of the hybrid deliberative-reasoning/reactive-execution architecture as the best general architecture for fully realizing robot potential, and the robots discussed herein implement a design continuously progressing toward this hybrid philosophy. The remainder of this chapter will describe the challenges associated with robotic assistance to astronauts, our general research approach, the intelligence incorporated into our robots, and the results and lessons learned from over six years of testing human-assistant mobile robots in field settings relevant to planetary exploration. The chapter concludes with some key considerations for future work in this area.

  5. Human-Robot Interaction: Does Robotic Guidance Force Affect Gait-Related Brain Dynamics during Robot-Assisted Treadmill Walking?

    Directory of Open Access Journals (Sweden)

    Kristel Knaepen

    Full Text Available In order to determine optimal training parameters for robot-assisted treadmill walking, it is essential to understand how a robotic device interacts with its wearer, and thus, how parameter settings of the device affect locomotor control. The aim of this study was to assess the effect of different levels of guidance force during robot-assisted treadmill walking on cortical activity. Eighteen healthy subjects walked at 2 km.h-1 on a treadmill with and without assistance of the Lokomat robotic gait orthosis. Event-related spectral perturbations and changes in power spectral density were investigated during unassisted treadmill walking as well as during robot-assisted treadmill walking at 30%, 60% and 100% guidance force (with 0% body weight support). Clustering of independent components revealed three clusters of activity in the sensorimotor cortex during treadmill walking and robot-assisted treadmill walking in healthy subjects. These clusters demonstrated gait-related spectral modulations in the mu, beta and low gamma bands over the sensorimotor cortex related to specific phases of the gait cycle. Moreover, mu and beta rhythms were suppressed in the right primary sensory cortex during treadmill walking compared to robot-assisted treadmill walking with 100% guidance force, indicating significantly larger involvement of the sensorimotor area during treadmill walking compared to robot-assisted treadmill walking. Only marginal differences in the spectral power of the mu, beta and low gamma bands could be identified between robot-assisted treadmill walking with different levels of guidance force. From these results it can be concluded that a high level of guidance force (i.e., 100% guidance force) and thus a less active participation during locomotion should be avoided during robot-assisted treadmill walking. This will optimize the involvement of the sensorimotor cortex which is known to be crucial for motor learning.

  6. Human-Robot Interaction: Does Robotic Guidance Force Affect Gait-Related Brain Dynamics during Robot-Assisted Treadmill Walking?

    Science.gov (United States)

    Knaepen, Kristel; Mierau, Andreas; Swinnen, Eva; Fernandez Tellez, Helio; Michielsen, Marc; Kerckhofs, Eric; Lefeber, Dirk; Meeusen, Romain

    2015-01-01

    In order to determine optimal training parameters for robot-assisted treadmill walking, it is essential to understand how a robotic device interacts with its wearer, and thus, how parameter settings of the device affect locomotor control. The aim of this study was to assess the effect of different levels of guidance force during robot-assisted treadmill walking on cortical activity. Eighteen healthy subjects walked at 2 km.h-1 on a treadmill with and without assistance of the Lokomat robotic gait orthosis. Event-related spectral perturbations and changes in power spectral density were investigated during unassisted treadmill walking as well as during robot-assisted treadmill walking at 30%, 60% and 100% guidance force (with 0% body weight support). Clustering of independent components revealed three clusters of activity in the sensorimotor cortex during treadmill walking and robot-assisted treadmill walking in healthy subjects. These clusters demonstrated gait-related spectral modulations in the mu, beta and low gamma bands over the sensorimotor cortex related to specific phases of the gait cycle. Moreover, mu and beta rhythms were suppressed in the right primary sensory cortex during treadmill walking compared to robot-assisted treadmill walking with 100% guidance force, indicating significantly larger involvement of the sensorimotor area during treadmill walking compared to robot-assisted treadmill walking. Only marginal differences in the spectral power of the mu, beta and low gamma bands could be identified between robot-assisted treadmill walking with different levels of guidance force. From these results it can be concluded that a high level of guidance force (i.e., 100% guidance force) and thus a less active participation during locomotion should be avoided during robot-assisted treadmill walking. This will optimize the involvement of the sensorimotor cortex which is known to be crucial for motor learning.

  7. Implementation of Admittance Control on a Construction Robot using Load Cells

    DEFF Research Database (Denmark)

    Bekker, Misha; Pedersen, Rasmus; Bak, Thomas

    2018-01-01

    Physical human-robot interaction (pHRI) must be safe and should feel natural to the human operator. To this end, impedance or admittance control is often employed to relate the force applied by the human to the dynamic behavior of the robot. The robot under consideration in this work uses a load cell to sense the externally applied force. This paper presents a practical modeling procedure and implementation of admittance control that specifically deal with the undesired non-linearities caused by the use of a load cell. Experiments are performed on a 1-DoF testbed to validate the work done…
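
    A minimal 1-DoF sketch of the admittance idea described above: the measured external force is passed through a virtual mass-damper model to produce a velocity command, with a simple deadband standing in for load-cell noise handling. All parameter values are illustrative assumptions, not those of the paper.

    ```python
    # One integration step of the virtual dynamics M*a + B*v = F_ext for a
    # single axis. mass, damping, and deadband are assumed values.

    def admittance_step(force, velocity, dt, mass=10.0, damping=50.0, deadband=2.0):
        """Return the next velocity command given the measured force [N]."""
        # suppress load-cell noise/offset near zero force
        if abs(force) < deadband:
            force = 0.0
        accel = (force - damping * velocity) / mass
        return velocity + accel * dt
    ```

    Under a constant applied force F, the commanded velocity converges to F/B, so the damping term sets how "heavy" the robot feels to the operator.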

  8. Teaching Human Poses Interactively to a Social Robot

    Science.gov (United States)

    Gonzalez-Pacheco, Victor; Malfaz, Maria; Fernandez, Fernando; Salichs, Miguel A.

    2013-01-01

    The main activity of social robots is to interact with people. In order to do that, the robot must be able to understand what the user is saying or doing. Typically, this capability consists of pre-programmed behaviors or is acquired through controlled learning processes, which are executed before the social interaction begins. This paper presents a software architecture that enables a robot to learn poses in a similar way as people do. That is, hearing its teacher's explanations and acquiring new knowledge in real time. The architecture leans on two main components: an RGB-D (Red-, Green-, Blue- Depth) -based visual system, which gathers the user examples, and an Automatic Speech Recognition (ASR) system, which processes the speech describing those examples. The robot is able to naturally learn the poses the teacher is showing to it by maintaining a natural interaction with the teacher. We evaluate our system with 24 users who teach the robot a predetermined set of poses. The experimental results show that, with a few training examples, the system reaches high accuracy and robustness. This method shows how to combine data from the visual and auditory systems for the acquisition of new knowledge in a natural manner. Such a natural way of training enables robots to learn from users, even if they are not experts in robotics. PMID:24048336

  9. Interactive animated display of man-controlled and autonomous robots

    International Nuclear Information System (INIS)

    Crane, C.D. III; Duffy, J.

    1986-01-01

    An interactive computer graphics program has been developed which allows an operator to more readily control robot motions in two distinct modes; viz., man-controlled and autonomous. In man-controlled mode, the robot is guided by a joystick or similar device. As the robot moves, actual joint angle information is measured and supplied to a graphics system which accurately duplicates the robot motion. Obstacles are placed in the actual and animated workspace and the operator is warned of imminent collisions by sight and sound via the graphics system. Operation of the system in man-controlled mode is shown. In autonomous mode, a collision-free path between specified points is obtained by previewing robot motions on the graphics system. Once a satisfactory path is selected, the path characteristics are transmitted to the actual robot and the motion is executed. The telepresence system developed at the University of Florida has been successful in demonstrating that the concept of controlling a robot manipulator with the aid of an interactive computer graphics system is feasible and practical. The clarity of images coupled with real-time interaction and real-time determination of imminent collision with obstacles has resulted in improved operator performance. Furthermore, the ability for an operator to preview and supervise autonomous operations is a significant attribute when operating in a hazardous environment.

  10. Teaching Human Poses Interactively to a Social Robot

    Directory of Open Access Journals (Sweden)

    Miguel A. Salichs

    2013-09-01

    Full Text Available The main activity of social robots is to interact with people. In order to do that, the robot must be able to understand what the user is saying or doing. Typically, this capability consists of pre-programmed behaviors or is acquired through controlled learning processes, which are executed before the social interaction begins. This paper presents a software architecture that enables a robot to learn poses in a similar way as people do. That is, hearing its teacher's explanations and acquiring new knowledge in real time. The architecture leans on two main components: an RGB-D (Red-, Green-, Blue-Depth)-based visual system, which gathers the user examples, and an Automatic Speech Recognition (ASR) system, which processes the speech describing those examples. The robot is able to naturally learn the poses the teacher is showing to it by maintaining a natural interaction with the teacher. We evaluate our system with 24 users who teach the robot a predetermined set of poses. The experimental results show that, with a few training examples, the system reaches high accuracy and robustness. This method shows how to combine data from the visual and auditory systems for the acquisition of new knowledge in a natural manner. Such a natural way of training enables robots to learn from users, even if they are not experts in robotics.

  11. Extending NGOMSL Model for Human-Humanoid Robot Interaction in the Soccer Robotics Domain

    Directory of Open Access Journals (Sweden)

    Rajesh Elara Mohan

    2008-01-01

    Full Text Available In the field of human-computer interaction, the Natural Goals, Operators, Methods, and Selection rules Language (NGOMSL) model is one of the most popular methods for modelling knowledge and cognitive processes for rapid usability evaluation. The NGOMSL model is a description of the knowledge that a user must possess to operate the system, represented as elementary actions for effective usability evaluations. In the last few years, mobile robots have been exhibiting a stronger presence in commercial markets, yet very little work has been done with NGOMSL modelling for usability evaluations in the human-robot interaction discipline. This paper focuses on extending the NGOMSL model for usability evaluation of human-humanoid robot interaction in the soccer robotics domain. The NGOMSL-modelled human-humanoid interaction design of Robo-Erectus Junior was evaluated, and the results of the experiments showed that the interaction design was able to find faults in an average time of 23.84 s. Also, the interaction design was able to detect the fault within 60 s in 100% of the cases. The evaluated interaction design was adopted by our Robo-Erectus Junior version of humanoid robots in the RoboCup 2007 humanoid soccer league.

  12. Why Robots Should Be Social: Enhancing Machine Learning through Social Human-Robot Interaction

    Science.gov (United States)

    de Greeff, Joachim; Belpaeme, Tony

    2015-01-01

    Social learning is a powerful method for cultural propagation of knowledge and skills relying on a complex interplay of learning strategies, social ecology and the human propensity for both learning and tutoring. Social learning has the potential to be an equally potent learning strategy for artificial systems and robots in specific. However, given the complexity and unstructured nature of social learning, implementing social machine learning proves to be a challenging problem. We study one particular aspect of social machine learning: that of offering social cues during the learning interaction. Specifically, we study whether people are sensitive to social cues offered by a learning robot, in a similar way to children’s social bids for tutoring. We use a child-like social robot and a task in which the robot has to learn the meaning of words. For this a simple turn-based interaction is used, based on language games. Two conditions are tested: one in which the robot uses social means to invite a human teacher to provide information based on what the robot requires to fill gaps in its knowledge (i.e. expression of a learning preference); the other in which the robot does not provide social cues to communicate a learning preference. We observe that conveying a learning preference through the use of social cues results in better and faster learning by the robot. People also seem to form a “mental model” of the robot, tailoring the tutoring to the robot’s performance as opposed to using simply random teaching. In addition, the social learning shows a clear gender effect with female participants being responsive to the robot’s bids, while male teachers appear to be less receptive. This work shows how additional social cues in social machine learning can result in people offering better quality learning input to artificial systems, resulting in improved learning performance. PMID:26422143

  13. Feasibility of interactive gesture control of a robotic microscope

    Directory of Open Access Journals (Sweden)

    Antoni Sven-Thomas

    2015-09-01

    Full Text Available Robotic devices are becoming increasingly available in clinics. One example is motorized surgical microscopes. While there are different scenarios for how to use such devices for autonomous tasks, simple and reliable interaction with the device is key for acceptance by surgeons. We study how gesture tracking can be integrated within the setup of a robotic microscope. In our setup, a Leap Motion Controller is used to track hand motion and adjust the field of view accordingly. We demonstrate with a survey that moving the field of view over a specified course is possible even for untrained subjects. Our results indicate that touch-less interaction with robots carrying small, near-field gesture sensors is feasible and can be of use in clinical scenarios where robotic devices are used in direct proximity of patient and physicians.
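
    As an illustration of the kind of mapping such a setup might use, the sketch below converts a tracked hand offset from a neutral position into a clamped pan velocity for the field of view. The gains, bounds, and function name are assumptions for illustration, not the paper's implementation.

    ```python
    # Hypothetical hand-position-to-pan mapping for a gesture-controlled
    # field of view; gain and max_speed are assumed tuning values.

    def hand_to_pan(hand_xy, center=(0.0, 0.0), gain=0.5, max_speed=1.0):
        """Map hand offset from a neutral position to a clamped 2-D pan velocity."""
        def clamp(v):
            return max(-max_speed, min(max_speed, v))
        vx = clamp(gain * (hand_xy[0] - center[0]))
        vy = clamp(gain * (hand_xy[1] - center[1]))
        return vx, vy
    ```

    Clamping the output keeps large or jittery hand motions from producing unsafe microscope velocities, which matters more than the exact gain in a clinical setting.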

  14. Social Robots as Persuasive Agents

    DEFF Research Database (Denmark)

    Vlachos, Evgenios; Schärfe, Henrik

    2014-01-01

    Robots are increasingly used in a social context, and in this paper we try to formulate a research agenda concerning ethical issues around social HRI in order to be prepared for future scenarios where robots may be a naturally integrated part of human society. We outline different paradigms to describe the role of social robots in communication processes with humans, and connect HRI with the topic of persuasive technology in health care, to critically reflect on the potential benefits of using social robots as persuasive agents.

  15. Animal Robot Assisted-therapy for Rehabilitation of Patient with Post-Stroke Depression

    Science.gov (United States)

    Zikril Zulkifli, Winal; Shamsuddin, Syamimi; Hwee, Lim Thiam

    2017-06-01

    Recently, the use of therapeutic animal robots has expanded. This research aims to explore robotics applications for mental healthcare in Malaysia through human-robot interaction (HRI). The robotic seal PARO was developed to produce psychological effects on humans. Major Depressive Disorder (MDD) is a common but severe mood disorder, and this study focuses on the interaction protocol between PARO and patients with MDD. Initially, twelve rehabilitation patients gave a subjective evaluation of their first interaction with PARO. Next, a therapeutic interaction environment was set up in which PARO acts as an augmentation strategy alongside other psychological interventions for post-stroke depression; each patient was exposed to PARO for 20 minutes. The behavioural analysis, complemented with information from the HRI survey questions, showed that individual interactors engaged with the robot in diverse ways based on their needs, and the results show a positive reaction toward the acceptance of an animal robot. The intended outcome is to reduce stress levels among patients through facilitated therapy sessions with PARO.

  16. Abstract robots with an attitude : applying interpersonal relation models to human-robot interaction

    NARCIS (Netherlands)

    Hiah, J.L.; Beursgens, L.; Haex, R.; Perez Romero, L.M.; Teh, Y.; Bhomer, ten M.; Berkel, van R.E.A.; Barakova, E.I.

    2013-01-01

    This paper explores new possibilities for social interaction between a human user and a robot with an abstract shape. The social interaction takes place by simulating behaviors such as submissiveness and dominance and analyzing the corresponding human reactions. We used an object that has no

  17. Fish-robot interactions in a free-swimming environment: Effects of speed and configuration of robots on live fish

    Science.gov (United States)

    Butail, Sachit; Polverino, Giovanni; Phamduy, Paul; Del Sette, Fausto; Porfiri, Maurizio

    2014-03-01

    We explore fish-robot interactions in a comprehensive set of experiments designed to highlight the effects of the speed and configuration of bioinspired robots on live zebrafish. The robot design and movement are inspired by salient features of attraction in zebrafish and include enhanced coloration, the aspect ratio of a fertile female, and carangiform/subcarangiform locomotion. The robots are autonomously controlled to swim in circular trajectories in the presence of live fish. Our results indicate that robot configuration significantly affects both the fish's distance to the robots and the time spent near them.

  18. Can Robotic Interaction Improve Joint Attention Skills?

    Science.gov (United States)

    Warren, Zachary E.; Zheng, Zhi; Swanson, Amy R.; Bekele, Esubalew; Zhang, Lian; Crittendon, Julie A.; Weitlauf, Amy F.; Sarkar, Nilanjan

    2015-01-01

    Although it has often been argued that clinical applications of advanced technology may hold promise for addressing impairments associated with autism spectrum disorder (ASD), relatively few investigations have indexed the impact of intervention and feedback approaches. This pilot study investigated the application of a novel robotic interaction…

  19. Progress in EEG-Based Brain Robot Interaction Systems

    Directory of Open Access Journals (Sweden)

    Xiaoqian Mao

    2017-01-01

    Full Text Available The most popular noninvasive Brain Robot Interaction (BRI) technology uses the electroencephalogram (EEG)-based Brain Computer Interface (BCI) to serve as an additional communication channel for robot control via brainwaves. This technology is promising for elderly or disabled patient assistance with daily life. The key issue of a BRI system is to identify human mental activities by decoding brainwaves acquired with an EEG device. Compared with other BCI applications, such as word spellers, the development of these applications may be more challenging, since control of robot systems via brainwaves must consider surrounding environment feedback in real time, robot mechanical kinematics and dynamics, as well as robot control architecture and behavior. This article reviews the major techniques needed for developing BRI systems. In this review article, we first briefly introduce the background and development of mind-controlled robot technologies. Second, we discuss the EEG-based brain signal models with respect to generating principles, evoking mechanisms, and experimental paradigms. Subsequently, we review in detail commonly used methods for decoding brain signals, namely, preprocessing, feature extraction, and feature classification, and summarize several typical application examples. Next, we describe a few BRI applications, including wheelchairs, manipulators, drones, and humanoid robots with respect to synchronous and asynchronous BCI-based techniques. Finally, we address some existing problems and challenges with future BRI techniques.
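
    The decoding stages named above (feature extraction over preprocessed windows, then classification) can be sketched with band-power features and a simple classifier. The band limits, sampling rate, and nearest-centroid classifier are illustrative choices for this sketch, not the method of any specific system reviewed.

    ```python
    import numpy as np

    FS = 250  # sampling rate in Hz (assumed)

    def band_power(signal, lo, hi):
        """Average spectral power of one channel in the [lo, hi] Hz band."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
        mask = (freqs >= lo) & (freqs <= hi)
        return spectrum[mask].mean()

    def extract_features(trial):
        """Mu (8-12 Hz) and beta (13-30 Hz) power for each channel of a trial."""
        return np.array([[band_power(ch, 8, 12), band_power(ch, 13, 30)]
                         for ch in trial]).ravel()

    class NearestCentroid:
        """Toy classifier: assign each feature vector to the closest class mean."""
        def fit(self, X, y):
            self.classes_ = sorted(set(y))
            self.centroids_ = {c: np.mean([x for x, l in zip(X, y) if l == c], axis=0)
                               for c in self.classes_}
            return self

        def predict(self, x):
            return min(self.classes_,
                       key=lambda c: np.linalg.norm(x - self.centroids_[c]))
    ```

    Real BRI decoders replace each stage with something stronger (spatial filtering, CSP or Riemannian features, regularized classifiers), but the three-stage shape is the same.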

  20. 2nd Workshop on Evaluating Child Robot Interaction

    NARCIS (Netherlands)

    Zaga, Cristina; Lohse, M.; Charisi, Vasiliki; Evers, Vanessa; Neerincx, Marc; Kanda, Takayuki; Leite, Iolanda

    Many researchers have started to explore natural interaction scenarios for children. No matter if these children are normally developing or have special needs, evaluating Child-Robot Interaction (CRI) is a challenge. To find methods that work well and provide reliable data is difficult, for example

  1. Project InterActions: A Multigenerational Robotic Learning Environment

    Science.gov (United States)

    Bers, Marina U.

    2007-12-01

    This paper presents Project InterActions, a series of 5-week workshops in which very young learners (4- to 7-year-old children) and their parents come together to build and program a personally meaningful robotic project in the context of a multigenerational robotics-based community of practice. The goal of these family workshops is to teach both parents and children about the mechanical and programming aspects involved in robotics, as well as to initiate them in a learning trajectory with and about technology. Results from this project address different ways in which parents and children learn together and provide insights into how to develop educational interventions that would educate parents, as well as children, in new domains of knowledge and skills such as robotics and new technologies.

  2. Mobile app for human-interaction with sitter robots

    Science.gov (United States)

    Das, Sumit Kumar; Sahu, Ankita; Popa, Dan O.

    2017-05-01

    Human environments are often unstructured and unpredictable, making the autonomous operation of robots in such environments very difficult. Despite many remaining challenges in perception, learning, and manipulation, more and more studies involving assistive robots have been carried out in recent years. In hospital environments, and in particular in patient rooms, there are well-established practices with respect to the type of furniture, patient services, and schedule of interventions. As a result, adding a robot to semi-structured hospital environments is an easier problem to tackle, with results that could benefit the quality of patient care and the help that robots can offer to nursing staff. When working in a healthcare facility, robots need to interact with patients and nurses through Human-Machine Interfaces (HMIs) that are intuitive to use; they should maintain awareness of their surroundings and offer safety guarantees for humans. While fully autonomous operation of robots is not yet technically feasible, direct teleoperation of the robot would also be extremely cumbersome, as it requires expert user skills and levels of concentration not available to many patients. Therefore, in our current study we present a traded control scheme, in which the robot and human both perform expert tasks. The human-robot communication and control scheme is realized through a mobile tablet app that can be customized for robot sitters in hospital environments. The role of the mobile app is to augment the verbal commands given to a robot through natural speech, camera, and other native interfaces, while providing failure-mode recovery options for users. Our app can access video feeds and sensor data from robots, assist the user with decision making during pick-and-place operations, monitor the user's health over time, and provide conversational dialogue during sitting sessions. In this paper, we present the software and hardware framework that

  3. Muecas: A Multi-Sensor Robotic Head for Affective Human Robot Interaction and Imitation

    Directory of Open Access Journals (Sweden)

    Felipe Cid

    2014-04-01

    Full Text Available This paper presents a multi-sensor humanoid robotic head for human robot interaction. The design of the robotic head, Muecas, is based on ongoing research on the mechanisms of perception and imitation of human expressions and emotions. These mechanisms allow direct interaction between the robot and its human companion through the different natural language modalities: speech, body language and facial expressions. The robotic head has 12 degrees of freedom in a human-like configuration, including eyes, eyebrows, mouth and neck, and has been designed and built entirely by IADeX (Engineering, Automation and Design of Extremadura) and RoboLab. A detailed description of its kinematics is provided along with the design of the most complex controllers. Muecas can be directly controlled by FACS (Facial Action Coding System), the de facto standard for facial expression recognition and synthesis. This feature facilitates its use by third-party platforms and encourages the development of imitation and goal-based systems. Imitation systems learn from the user, while goal-based ones use planning techniques to drive the user towards a final desired state. To show the flexibility and reliability of the robotic head, the paper presents a software architecture that is able to detect, recognize, classify and generate facial expressions in real time using FACS. This system has been implemented using the RoboComp robotics framework, which provides hardware-independent access to the sensors in the head. Finally, the paper presents experimental results showing the real-time functioning of the whole system, including recognition and imitation of human facial expressions.

  4. A Human–Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    Directory of Open Access Journals (Sweden)

    Philipp Beckerle

    2017-05-01

    Full Text Available Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.

  5. A Human–Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    Science.gov (United States)

    Beckerle, Philipp; Salvietti, Gionata; Unal, Ramazan; Prattichizzo, Domenico; Rossi, Simone; Castellini, Claudio; Hirche, Sandra; Endo, Satoshi; Amor, Heni Ben; Ciocarlie, Matei; Mastrogiovanni, Fulvio; Argall, Brenna D.; Bianchi, Matteo

    2017-01-01

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions. PMID:28588473

  6. A Human-Robot Interaction Perspective on Assistive and Rehabilitation Robotics.

    Science.gov (United States)

    Beckerle, Philipp; Salvietti, Gionata; Unal, Ramazan; Prattichizzo, Domenico; Rossi, Simone; Castellini, Claudio; Hirche, Sandra; Endo, Satoshi; Amor, Heni Ben; Ciocarlie, Matei; Mastrogiovanni, Fulvio; Argall, Brenna D; Bianchi, Matteo

    2017-01-01

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human-robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.

  7. Simulation tools for robotics research and assessment

    Science.gov (United States)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging that is unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open-source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  8. A Social Cognitive Neuroscience Stance on Human-Robot Interactions

    Directory of Open Access Journals (Sweden)

    Chaminade Thierry

    2011-12-01

    Full Text Available Robotic devices, thanks to the controlled variations in their appearance and behaviors, provide useful tools to test hypotheses pertaining to social interactions. These agents were used to investigate one theoretical framework, resonance, which is defined, at the behavioral and neural levels, as an overlap between first- and third-person representations of mental states such as motor intentions or emotions. Behaviorally, we found a reduced, but significant, resonance towards a humanoid robot displaying biological motion, compared to a human. Using neuroimaging, we have reported that while perceptual processes in the human occipital and temporal lobes are more strongly engaged when perceiving a humanoid robot's action than a human's, activity in areas involved in motor resonance depends on attentional modulation more strongly for artificial agents than for human agents. Altogether, these studies using artificial agents offer valuable insights into the interaction of bottom-up and top-down processes in the perception of artificial agents.

  9. Model-based acquisition and analysis of multimodal interactions for improving human-robot interaction

    OpenAIRE

    Renner, Patrick; Pfeiffer, Thies

    2014-01-01

    For solving complex tasks cooperatively in close interaction with robots, they need to understand natural human communication. To achieve this, robots could benefit from a deeper understanding of the processes that humans use for successful communication. Such skills can be studied by investigating human face-to-face interactions in complex tasks. In our work the focus lies on shared-space interactions in a path planning task and thus 3D gaze directions and hand movements are of particular in...

  10. A Multimodal Emotion Detection System during Human-Robot Interaction

    Science.gov (United States)

    Alonso-Martín, Fernando; Malfaz, María; Sequeira, João; Gorostiza, Javier F.; Salichs, Miguel A.

    2013-01-01

    In this paper, a multimodal user-emotion detection system for social robots is presented. This system is intended to be used during human–robot interaction, and it is integrated as part of the overall interaction system of the robot: the Robotics Dialog System (RDS). Two modes are used to detect emotions: the voice and face expression analysis. In order to analyze the voice of the user, a new component has been developed: Gender and Emotion Voice Analysis (GEVA), which is written using the Chuck language. For emotion detection in facial expressions, the system, Gender and Emotion Facial Analysis (GEFA), has been also developed. This last system integrates two third-party solutions: Sophisticated High-speed Object Recognition Engine (SHORE) and Computer Expression Recognition Toolbox (CERT). Once these new components (GEVA and GEFA) give their results, a decision rule is applied in order to combine the information given by both of them. The result of this rule, the detected emotion, is integrated into the dialog system through communicative acts. Hence, each communicative act gives, among other things, the detected emotion of the user to the RDS so it can adapt its strategy in order to get a greater satisfaction degree during the human–robot dialog. Each of the new components, GEVA and GEFA, can also be used individually. Moreover, they are integrated with the robotic control platform ROS (Robot Operating System). Several experiments with real users were performed to determine the accuracy of each component and to set the final decision rule. The results obtained from applying this decision rule in these experiments show a high success rate in automatic user emotion recognition, improving the results given by the two information channels (audio and visual) separately. PMID:24240598
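
    The abstract does not disclose the exact decision rule used to merge the GEVA (voice) and GEFA (face) outputs; a generic confidence-weighted fusion of the two channels might look like the sketch below, where the channel weight and the dictionary format are assumptions:

    ```python
    def fuse_emotions(voice, face, voice_weight=0.4):
        """Combine two per-channel emotion estimates into one decision.

        voice, face: dicts mapping emotion label -> confidence in [0, 1].
        Returns the label with the highest weighted-sum score (hypothetical rule;
        the RDS decision rule was tuned experimentally and is not reproduced here).
        """
        emotions = set(voice) | set(face)
        scores = {e: voice_weight * voice.get(e, 0.0)
                     + (1.0 - voice_weight) * face.get(e, 0.0)
                  for e in emotions}
        return max(scores, key=scores.get)
    ```

    A label strong in only one channel can still win if the other channel is uncommitted, which is the behavior a fusion rule is meant to provide over either channel alone.
    
    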

  11. Developing novel extensions to support prototyping for interactive social robots

    NARCIS (Netherlands)

    Bhömer, ten M.; Bartneck, C.; Hu, J.; Ahn, R.M.C.; Tuyls, K.P.; Delbressine, F.L.M.; Feijs, L.M.G.

    2009-01-01

    Lego Mindstorms NXT is a platform highly suitable for prototyping in the field of interactive social robotics. During a technology masterclass at Eindhoven University of Technology students from the department of Industrial Design have developed five novel extensions (sensors and actuators) for the

  12. Playte, a tangible interface for engaging human-robot interaction

    DEFF Research Database (Denmark)

    Christensen, David Johan; Fogh, Rune; Lund, Henrik Hautop

    2014-01-01

    This paper describes a tangible interface, Playte, designed for children animating interactive robots. The system supports physical manipulation of behaviors represented by LEGO bricks and allows the user to record and train their own new behaviors. Our objective is to explore several modes of in...

  13. Dynamic perceptions of human-likeness while interacting with a social robot

    NARCIS (Netherlands)

    Ruijten, P.A.M.; Cuijpers, R.H.

    2017-01-01

    In human-robot interaction research, much attention is given to the development of socially assistive robots that can have natural interactions with their users. One crucial aspect of such natural interactions is that the robot is perceived as human-like. Much research already exists that

  14. Physical human-robot interaction of an active pelvis orthosis: toward ergonomic assessment of wearable robots.

    Science.gov (United States)

    d'Elia, Nicolò; Vanetti, Federica; Cempini, Marco; Pasquini, Guido; Parri, Andrea; Rabuffetti, Marco; Ferrarin, Maurizio; Molino Lova, Raffaele; Vitiello, Nicola

    2017-04-14

    In human-centered robotics, exoskeletons are becoming relevant for addressing needs in the healthcare and industrial domains. Owing to their close interaction with the user, the safety and ergonomics of these systems are critical design features that require systematic evaluation methodologies. Proper transfer of mechanical power requires optimal tuning of the kinematic coupling between the robotic and anatomical joint rotation axes. We present the methods and results of an experimental evaluation of the physical interaction with an active pelvis orthosis (APO). This device was designed to effectively assist in hip flexion-extension during locomotion with a minimum impact on the physiological human kinematics, owing to a set of passive degrees of freedom for self-alignment of the human and robotic hip flexion-extension axes. Five healthy volunteers walked on a treadmill at different speeds without and with the APO under different levels of assistance. The user-APO physical interaction was evaluated in terms of: (i) the deviation of human lower-limb joint kinematics when wearing the APO with respect to the physiological behavior (i.e., without the APO); (ii) relative displacements between the APO orthotic shells and the corresponding body segments; and (iii) the discrepancy between the kinematics of the APO and the wearer's hip joints. The results show: (i) negligible interference of the APO in human kinematics under all the experimented conditions; (ii) small (i.e., ergonomics assessment of wearable robots.

  15. Interactive Industrial Robot Programming for the Ceramic Industry

    Directory of Open Access Journals (Sweden)

    Germano Veiga

    2013-10-01

    Full Text Available This paper presents an interactive programming method for programming industrial robots in ceramic applications. The main purpose was to develop a simple but flexible programming system that empowers the user with product-driven programming without compromising flexibility. To achieve this flexibility, a two-step hybrid programming model was designed: first the user sketches the desired trajectory on a spatial augmented reality programming table using the final product, and then relies on an advanced 3D graphical system to tune the robot trajectory in the final workcell. The results, measured by end-user feedback, show that a new level of flexibility was reached for this type of application.

  16. A Kinect-Based Gesture Recognition Approach for a Natural Human Robot Interface

    Directory of Open Access Journals (Sweden)

    Grazia Cicirelli

    2015-03-01

    Full Text Available In this paper, we present a gesture recognition system for the development of a human-robot interaction (HRI) interface. Kinect cameras and the OpenNI framework are used to obtain real-time tracking of a human skeleton. Ten different gestures, performed by different persons, are defined. Quaternions of joint angles are first used as robust and significant features. Next, neural network (NN) classifiers are trained to recognize the different gestures. This work deals with different challenging tasks, such as the real-time implementation of a gesture recognition system and the temporal resolution of gestures. The HRI interface developed in this work includes three Kinect cameras placed at different locations in an indoor environment and an autonomous mobile robot that can be remotely controlled by one operator standing in front of one of the Kinects. Moreover, the system is supplied with a people re-identification module which guarantees that only one person at a time has control of the robot. The system's performance is first validated offline, and then online experiments are carried out, proving the real-time operation of the system as required by an HRI interface.
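
    As a rough illustration of quaternion joint-angle features, the sketch below builds a feature vector from per-joint axis-angle rotations and classifies it. Note that the paper trains neural network classifiers; this stand-in uses nearest-template matching for brevity, and the joint set and labels are hypothetical:

    ```python
    import numpy as np

    def axis_angle_to_quat(axis, angle):
        """Unit quaternion [w, x, y, z] for a rotation of `angle` about `axis`."""
        axis = np.asarray(axis, dtype=float)
        axis = axis / np.linalg.norm(axis)
        return np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])

    def gesture_features(joint_rotations):
        """joint_rotations: list of (axis, angle) per tracked joint.
        Returns one flat feature vector of concatenated quaternions."""
        return np.concatenate([axis_angle_to_quat(a, th) for a, th in joint_rotations])

    def classify_1nn(feature, templates):
        """templates: dict label -> feature vector. Returns the closest gesture label."""
        return min(templates, key=lambda label: np.linalg.norm(templates[label] - feature))
    ```

    Quaternion features avoid the discontinuities of raw Euler angles, which is one reason they are robust for skeleton-based gesture recognition.
    
    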

  17. Social HRI for people with dementia : one size fits all?

    NARCIS (Netherlands)

    Perugia, G.; Díaz Boladeras, M.; Barakova, E.I.; Catal Mallofré, A.; Rauterberg, G.W.M.

    2017-01-01

    Motivational and emotional disorders (i.e. apathy and depression) are very frequent in dementia and might greatly affect the positive psychological state experienced during social HRI. We conducted a six-weeks study in two nursing homes comparing the affective states that two playful activities,

  18. Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents

    Science.gov (United States)

    2016-07-27

    Brain Computer Interfaces (BCIs) show great potential in allowing humans to interact with computational environments in a

  19. Multi-Robot Remote Interaction with FS-MAS

    Directory of Open Access Journals (Sweden)

    Yunliang Jiang

    2013-02-01

    Full Text Available The need to reduce bandwidth and to improve productivity, autonomy and scalability in multi-robot teleoperation has been recognized for a long time. In this article we propose a novel finite state machine mobile agent based on a network interaction service model, namely FS-MAS. This model consists of three finite state machines: the Finite State Mobile Agent (FS-Agent), which is the basic service module; the Service Content Finite State Machine (Content-FS), which uses the XML language to define workflows and to describe service content and the service computation process; and the Mobile Agent Computation Model Finite State Machine (MACM-FS), used to describe the service implementation. Finally, we apply this service model to a multi-robot system, initially realizing the completion of complex tasks in the form of multi-robot scheduling. This demonstrates that the robot's intelligence is greatly improved, and provides a wide solution space for critical issues such as task division, rational and efficient use of resources, and multi-robot collaboration.
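
    The generic state/transition pattern underlying any of the three machines can be sketched as follows; the states and events are hypothetical, since the abstract does not detail FS-MAS internals:

    ```python
    class FiniteStateAgent:
        """Minimal finite-state mobile-agent sketch: a current state plus a
        transition table keyed by (state, event). Unknown events are ignored."""

        def __init__(self, initial, transitions):
            self.state = initial
            self.transitions = dict(transitions)  # {(state, event): next_state}

        def handle(self, event):
            self.state = self.transitions.get((self.state, event), self.state)
            return self.state

    # Hypothetical life cycle of a mobile agent dispatched to a robot.
    AGENT_TRANSITIONS = {
        ('idle', 'dispatch'): 'migrating',
        ('migrating', 'arrive'): 'executing',
        ('executing', 'done'): 'idle',
    }
    ```

    Encoding the workflow as data (here a dict, in FS-MAS an XML description) is what lets the service content be redefined without changing the agent code.
    
    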

  20. Intrinsic interactive reinforcement learning - Using error-related potentials for real world human-robot interaction.

    Science.gov (United States)

    Kim, Su Kyoung; Kirchner, Elsa Andrea; Stefes, Arne; Kirchner, Frank

    2017-12-14

    Reinforcement learning (RL) enables robots to learn their optimal behavioral strategy in dynamic environments based on feedback. Explicit human feedback during robot RL is advantageous, since an explicit reward function can be easily adapted. However, it is very demanding and tiresome for a human to continuously and explicitly generate feedback. Therefore, the development of implicit approaches is of high relevance. In this paper, we used an error-related potential (ErrP), an event-related activity in the human electroencephalogram (EEG), as intrinsically generated implicit feedback (reward) for RL. Initially, we validated our approach with seven subjects in a simulated robot learning scenario. ErrPs were detected online in single trials with a balanced accuracy (bACC) of 91%, which was sufficient to learn to recognize gestures and the correct mapping between human gestures and robot actions in parallel. Finally, we validated our approach in a real robot scenario, in which seven subjects freely chose gestures and the real robot correctly learned the mapping between gestures and actions (ErrP detection: 90% bACC). In this paper, we demonstrated that intrinsically generated EEG-based human feedback in RL can successfully be used to implicitly improve gesture-based robot control during human-robot interaction. We call our approach intrinsic interactive RL.
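
    The idea of using noisily detected ErrPs as negative rewards can be simulated in a few lines. The detector accuracy below mirrors the reported 91% bACC, while the gestures, actions, ground-truth mapping, and bandit-style learning rule are illustrative assumptions, not the paper's method:

    ```python
    import random

    def errp_feedback(robot_was_correct, bacc=0.91, rng=random):
        """Simulated single-trial ErrP detector: it observes whether the robot's
        action was correct with ~91% balanced accuracy; a detected ErrP
        (perceived error) yields a negative implicit reward."""
        observed_correct = robot_was_correct if rng.random() < bacc else not robot_was_correct
        return 0.0 if observed_correct else -1.0

    def learn_mapping(gestures, actions, episodes=3000, alpha=0.2, epsilon=0.2, seed=0):
        """Tabular learner that acquires a gesture -> action mapping from the
        implicit feedback alone (hypothetical gestures/actions)."""
        rng = random.Random(seed)
        q = {(g, a): 0.0 for g in gestures for a in actions}
        target = {g: actions[i % len(actions)] for i, g in enumerate(gestures)}
        for _ in range(episodes):
            g = rng.choice(gestures)
            if rng.random() < epsilon:                        # occasional exploration
                a = rng.choice(actions)
            else:                                             # otherwise act greedily
                a = max(actions, key=lambda act: q[(g, act)])
            r = errp_feedback(a == target[g], rng=rng)
            q[(g, a)] += alpha * (r - q[(g, a)])              # bandit-style update
        learned = {g: max(actions, key=lambda act: q[(g, act)]) for g in gestures}
        return learned, target
    ```

    Even with 9% of the feedback labels flipped, the averaged value estimates separate correct from incorrect actions, which is the core point of using ErrPs as an implicit reward channel.
    
    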

  1. A Decentralized Interactive Architecture for Aerial and Ground Mobile Robots Cooperation

    OpenAIRE

    Harik, El Houssein Chouaib; Guérin, François; Guinand, Frédéric; Brethé, Jean-François; Pelvillain, Hervé

    2014-01-01

    This paper presents a novel decentralized interactive architecture for aerial and ground mobile robot cooperation. The aerial mobile robot is used to provide global coverage during an area inspection, while the ground mobile robot is used to provide local coverage of ground features. We include a human-in-the-loop to provide waypoints for the ground mobile robot to progress safely in the inspected area. The aerial mobile robot continuously follows the ground mobi...

  2. Multi-Axis Force Sensor for Human-Robot Interaction Sensing in a Rehabilitation Robotic Device.

    Science.gov (United States)

    Grosu, Victor; Grosu, Svetlana; Vanderborght, Bram; Lefeber, Dirk; Rodriguez-Guerrero, Carlos

    2017-06-05

    Human-robot interaction sensing is a compulsory feature in modern robotic systems where direct contact or close collaboration is desired. Rehabilitation and assistive robotics are fields where interaction forces are required for both safety and increased control performance of the device, with a more comfortable experience for the user. In order to provide efficient interaction feedback between the user and the rehabilitation device, high-performance sensing units are demanded. This work introduces a novel design of a multi-axis force sensor dedicated to measuring pelvis interaction forces in a rehabilitation exoskeleton device. The sensor is conceived such that it has different sensitivity characteristics for the three axes of interest, and also has movable parts in order to allow free rotations and limit crosstalk errors. Integrated sensor electronics make it easy to acquire and process data for a real-time distributed system architecture. Two of the developed sensors were integrated and tested in a complex gait rehabilitation device for safe and compliant control.
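
    A standard way to compensate residual inter-axis crosstalk in a multi-axis force sensor, not necessarily the procedure used in the paper, is to fit a least-squares calibration matrix mapping raw channel readings to reference forces:

    ```python
    import numpy as np

    def calibrate(raw_readings, applied_forces):
        """Fit a calibration matrix C so that force ≈ C @ raw_reading.

        raw_readings:   (n_samples, 3) raw channel outputs, with crosstalk
        applied_forces: (n_samples, 3) known reference forces from a test bench
        """
        V = np.asarray(raw_readings, dtype=float)
        F = np.asarray(applied_forces, dtype=float)
        X, *_ = np.linalg.lstsq(V, F, rcond=None)  # solves V @ X ≈ F
        return X.T                                  # so that f = C @ v

    def to_force(C, raw_reading):
        """Apply the calibration to a single raw reading."""
        return C @ np.asarray(raw_reading, dtype=float)
    ```

    With enough independent loading samples, the off-diagonal entries of C absorb the crosstalk that the sensor's mechanical decoupling cannot fully eliminate.
    
    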

  3. Robotic Motion Learning Framework to Promote Social Engagement

    Directory of Open Access Journals (Sweden)

    Rachael Burns

    2018-02-01

    Full Text Available Imitation is a powerful component of communication between people, and it has important implications for improving the quality of interaction in the field of human–robot interaction (HRI). This paper discusses a novel framework designed to improve human–robot interaction through robotic imitation of a participant’s gestures. In our experiment, a humanoid robotic agent socializes with and plays games with a participant. For the experimental group, the robot additionally imitates one of the participant’s novel gestures during a play session. We hypothesize that the robot’s use of imitation will increase the participant’s openness towards engaging with the robot. Experimental results from a user study of 12 subjects show that post-imitation, experimental subjects displayed a more positive emotional state, had higher instances of mood contagion towards the robot, and interpreted the robot to have a higher level of autonomy than their control group counterparts did. These results point to an increased participant interest in engagement fueled by personalized imitation during interaction.

  4. Walking in the uncanny valley: importance of the attractiveness on the acceptance of a robot as a working partner

    Science.gov (United States)

    Destephe, Matthieu; Brandao, Martim; Kishi, Tatsuhiro; Zecca, Massimiliano; Hashimoto, Kenji; Takanishi, Atsuo

    2015-01-01

    The Uncanny valley hypothesis, which tells us that almost-human characteristics in a robot or a device could cause uneasiness in human observers, is an important research theme in the Human Robot Interaction (HRI) field. Yet, that phenomenon is still not well-understood. Many have investigated the external design of humanoid robot faces and bodies but only a few studies have focused on the influence of robot movements on our perception and feelings of the Uncanny valley. Moreover, no research has investigated the possible relation between our uneasiness feeling and whether or not we would accept robots having a job in an office, a hospital or elsewhere. To better understand the Uncanny valley, we explore several factors which might have an influence on our perception of robots, be it related to the subjects, such as culture or attitude toward robots, or related to the robot such as emotions and emotional intensity displayed in its motion. We asked 69 subjects (N = 69) to rate the motions of a humanoid robot (Perceived Humanity, Eeriness, and Attractiveness) and state where they would rather see the robot performing a task. Our results suggest that, among the factors we chose to test, the attitude toward robots is the main influence on the perception of the robot related to the Uncanny valley. Robot occupation acceptability was affected only by Attractiveness, mitigating any Uncanny valley effect. We discuss the implications of these findings for the Uncanny valley and the acceptability of a robotic worker in our society. PMID:25762967

  5. A Review of Verbal and Non-Verbal Human-Robot Interactive Communication

    OpenAIRE

    Mavridis, Nikolaos

    2014-01-01

    In this paper, an overview of human-robot interactive communication is presented, covering verbal as well as non-verbal aspects of human-robot interaction. Following a historical introduction and motivation towards fluid human-robot communication, ten desiderata are proposed, which provide an organizational axis both of recent and of future research on human-robot communication. Then, the ten desiderata are examined in detail, culminating in a unifying discussion and a forward-lookin...

  6. The Creation of a Multi-Human, Multi-Robot Interactive Jam Session

    OpenAIRE

    Weinberg, Gil; Blosser, Brian; Mallikarjuna, Trishul; Raman, Aparna

    2009-01-01

    This paper presents an interactive and improvisational jam session, including human players and two robotic musicians. The project was developed in an effort to create novel and inspiring music through human-robot collaboration. The jam session incorporates Shimon, a newly-developed socially-interactive robotic marimba player, and Haile, a perceptual robotic percussionist developed in previous work. The paper gives an overview of the musical perception modules, adaptive improvisation modes an...

  7. Cognitive Emotional Regulation Model in Human-Robot Interaction

    OpenAIRE

    Liu, Xin; Xie, Lun; Liu, Anqi; Li, Dan

    2015-01-01

    This paper integrated the Gross cognitive process into an HMM (hidden Markov model) emotional regulation method and implemented human-robot emotional interaction with facial expressions and behaviors. Here, energy was the psychological driving force of emotional transitions in the cognitive emotional model. The input facial expression was translated into external energy by an expression-emotion mapping. The robot’s next emotional state was determined by the cognitive energy (the stimulus after cognition...
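
    A toy version of an energy-driven emotional transition can be sketched as follows; the expression-to-energy mapping and the simple valence ladder are assumptions for illustration, and the actual Gross/HMM regulation model in the paper is considerably richer:

    ```python
    import random

    # Hypothetical expression -> energy mapping (the paper's values are not given).
    EXPRESSION_ENERGY = {'smile': 0.8, 'neutral': 0.1, 'frown': -0.7}
    EMOTIONS = ['sad', 'calm', 'happy']  # ordered by valence

    def next_emotion(current, expression, rng=random):
        """External energy from the observed expression biases the stochastic
        transition toward higher- or lower-valence emotional states."""
        i = EMOTIONS.index(current)
        energy = EXPRESSION_ENERGY[expression]
        p_up = min(1.0, max(0.0, 0.5 + 0.5 * energy))  # chance of moving up in valence
        if rng.random() < p_up:
            i = min(i + 1, len(EMOTIONS) - 1)
        else:
            i = max(i - 1, 0)
        return EMOTIONS[i]
    ```

    Positive-energy input (a smile) makes upward transitions likely; negative energy pulls the state down, mimicking how the stimulus drives the regulation model.
    
    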

  8. Interactive Industrial Robot Programming for the Ceramic Industry

    OpenAIRE

    Germano Veiga; Pedro Malaca; Rui Cancela

    2013-01-01

    This paper presents an interactive method for programming industrial robots in ceramic applications. The main purpose was to develop a simple but flexible programming system that empowers the user with product-driven programming without compromising flexibility. To achieve this flexibility, a two-step hybrid programming model was designed: first, the user sketches the desired trajectory on a spatial augmented reality programming table using the final product, and then relies on an a...

  9. Interactive Rhythm Learning System by Combining Tablet Computers and Robots

    Directory of Open Access Journals (Sweden)

    Chien-Hsing Chou

    2017-03-01

    Full Text Available. This study proposes a percussion learning device that combines tablet computers and robots. This device comprises two systems: a rhythm teaching system, in which users can compose and practice rhythms by using a tablet computer, and a robot performance system. First, teachers compose the rhythm training contents on the tablet computer. Then, the learners practice these percussion exercises by using the tablet computer and a small drum set. The teaching system provides a new and user-friendly score editing interface for composing a rhythm exercise. It also provides a rhythm rating function to facilitate percussion training for children and improve the stability of rhythmic beating. To encourage children to practice percussion exercises, a robotic performance system is used to interact with the children; this system can perform percussion exercises for students to listen to and then help them practice the exercise. This interaction enhances children’s interest and motivation to learn and practice rhythm exercises. The results of an experimental course and field trials reveal that the proposed system not only increases students’ interest and efficiency in learning but also helps them understand musical rhythms through interaction and compose simple rhythms.

  10. Presence of Life-Like Robot Expressions Influences Children’s Enjoyment of Human-Robot Interactions in the Field

    NARCIS (Netherlands)

    Cameron, David; Fernando, Samuel; Collins, Emily; Millings, Abigail; Moore, Roger; Sharkey, Amanda; Evers, Vanessa; Prescott, Tony

    Emotions, and emotional expression, have a broad influence on the interactions we have with others and are thus a key factor to consider in developing social robots. As part of a collaborative EU project, this study examined the impact of lifelike affective facial expressions, in the humanoid robot

  11. Robotics

    Science.gov (United States)

    Popov, E. P.; Iurevich, E. I.

    The history and the current status of robotics are reviewed, as are the design, operation, and principal applications of industrial robots. Attention is given to programmable robots, robots with adaptive control and elements of artificial intelligence, and remotely controlled robots. The applications of robots discussed include mechanical engineering, cargo handling during transportation and storage, mining, and metallurgy. The future prospects of robotics are briefly outlined.

  12. Towards culture-specific robot customization : a study on greeting interaction with Egyptians

    NARCIS (Netherlands)

    Trovato, G.; Zecca, M.; Sessa, S.; Jamone, L.; Ham, J.R.C.; Hashimoto, K.; Takanishi, A.

    2014-01-01

    A complex relationship exists between national cultural background and interaction with robots, and many earlier studies have investigated how people from different cultures perceive the inclusion of robots into society. Conversely, very few studies have investigated how robots, speaking and using

  13. A Motion System for Social and Animated Robots

    Directory of Open Access Journals (Sweden)

    Jelle Saldien

    2014-05-01

    Full Text Available. This paper presents an innovative motion system that is used to control the motions and animations of a social robot. The social robot Probo is used to study Human-Robot Interaction (HRI), with a special focus on Robot Assisted Therapy (RAT). When used for therapy, it is important that a social robot is able to create an “illusion of life” so as to become a believable character that can communicate with humans. The design of the motion system in this paper is based on insights from the animation industry. It combines operator-controlled animations with low-level autonomous reactions such as attention and emotional state. The motion system has a Combination Engine, which combines motion commands that are triggered by a human operator with motions that originate from different units of the cognitive control architecture of the robot. This results in an interactive robot that seems alive and has a certain degree of “likeability”. The Godspeed Questionnaire Series is used to evaluate the animacy and likeability of the robot in China, Romania and Belgium.
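
    The Combination Engine concept, merging operator-triggered motion commands with autonomous reactions, can be illustrated with a minimal per-joint merge. This is a hedged sketch: the joint names and the precedence rule (operator commands override autonomous targets) are assumptions for illustration, not Probo's actual architecture.

```python
def combine(operator_cmd, autonomous_cmd):
    """Merge two {joint: angle} dicts: operator-triggered animation commands
    override autonomous targets (attention, emotional idle motion) on the
    joints they use; all other joints keep their autonomous targets."""
    merged = dict(autonomous_cmd)
    merged.update(operator_cmd)
    return merged
```

    A real combination engine would also blend and time-interpolate targets, but the per-joint precedence shown here is the core of combining operator control with autonomous behavior.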

  14. An Experimental Study of Embodied Interaction and Human Perception of Social Presence for Interactive Robots in Public Settings

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Heath, Damith; Vlachos, Evgenios

    2018-01-01

    The human perception of cognitive robots as social depends on many factors, including those that do not necessarily pertain to a robot’s cognitive functioning. Experience Design offers a useful framework for evaluating when participants interact with robots as products or tools and when they regard them as social actors. This study describes a between-participants experiment conducted at a science museum, where visitors were invited to play a game of noughts and crosses with a Baxter robot. The goal is to foster meaningful interactions that promote engagement between the human and robot in a museum context. Using an Experience Design framework, we tested the robot in three different conditions to better understand which factors contribute to the perception of robots as social. The experiment also outlines best practices for conducting human-robot interaction research in museum exhibitions.

  15. Peer-to-Peer Human-Robot Interaction for Space Exploration

    Science.gov (United States)

    Fong, Terrence; Nourbakhsh, Illah

    2004-01-01

    NASA has embarked on a long-term program to develop human-robot systems for sustained, affordable space exploration. To support this mission, we are working to improve human-robot interaction and performance on planetary surfaces. Rather than building robots that function as glorified tools, our focus is to enable humans and robots to work as partners and peers. In this paper, we describe our approach, which includes contextual dialogue, cognitive modeling, and metrics-based field testing.

  16. Movement coordination in applied human-human and human-robot interaction

    DEFF Research Database (Denmark)

    Schubö, Anna; Vesper, Cordula; Wiesbeck, Mathey

    2007-01-01

    The present paper describes a scenario for examining mechanisms of movement coordination in humans and robots. It is assumed that coordination can best be achieved when behavioral rules that shape movement execution in humans are also considered for human-robot interaction. Investigating and describing human-human interaction in terms of goal-oriented movement coordination is considered an important and necessary step for designing and describing human-robot interaction. In the present scenario, trajectories of hand and finger movements were recorded while two human participants performed ... coordination were affected. Implications for human-robot interaction are discussed.

  17. Towards quantifying dynamic human-human physical interactions for robot assisted stroke therapy.

    Science.gov (United States)

    Mohan, Mayumi; Mendonca, Rochelle; Johnson, Michelle J

    2017-07-01

    Human-Robot Interaction is a prominent field of robotics today. Knowledge of human-human physical interaction can prove vital in creating dynamic physical interactions between humans and robots. Most of the current work in studying this interaction has been from a haptic perspective. In this paper, we present kinematics-based metrics that can be used to identify whether a physical interaction occurred between two people. We present a simple Activity of Daily Living (ADL) task which involves a simple interaction. We show that we can use these metrics to successfully identify interactions.
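
    One of the simplest kinematics-only measures the abstract alludes to is proximity between tracked body points over time. The sketch below is purely illustrative (the distance threshold and the choice of tracked points are assumptions, not the authors' metrics): it flags the frames in which two tracked points come within a threshold distance, a crude proxy for a physical interaction event.

```python
import math

def detect_contact_windows(traj_a, traj_b, dist_thresh=0.1):
    """Return frame indices where two tracked points (e.g., two wrists)
    are closer than dist_thresh metres. Trajectories are lists of
    (x, y, z) tuples sampled at the same rate."""
    windows = []
    for i, (pa, pb) in enumerate(zip(traj_a, traj_b)):
        if math.dist(pa, pb) < dist_thresh:  # Euclidean distance
            windows.append(i)
    return windows
```

    A real pipeline would add smoothing and velocity features, but frame-wise proximity is enough to show how interaction events can be read off kinematic data alone.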

  18. Systematic Analysis of Video Data from Different Human-Robot Interaction Studies: A Categorisation of Social Signals During Error Situations

    OpenAIRE

    Manuel Giuliani; Nicole Mirnig; Gerald Stollnberger; Susanne Stadler; Roland Buchner; Manfred Tscheligi

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows tha...

  19. A Quantitative Analysis of Dressing Dynamics for Robotic Dressing Assistance

    Directory of Open Access Journals (Sweden)

    Greg Chance

    2017-05-01

    Full Text Available. Assistive robots have great potential to address issues related to an aging population and an increased demand for caregiving. Successful deployment of robots working in close proximity with people requires consideration of both safety and human–robot interaction (HRI). One of the established activities of daily living where robots could play an assistive role is dressing. Using the correct force profile for robot control will be essential in this application of HRI, requiring careful exploration of factors related to the user’s pose and the type of garments involved. In this paper, a Baxter robot was used to dress a jacket onto a mannequin and human participants, considering several combinations of user pose and clothing type (base layers), while recording dynamic data from the robot, a load cell, and an IMU. We also report on the suitability of these sensors for identifying dressing errors, e.g., fabric snagging. Data were analyzed by comparing the overlap of confidence intervals to determine sensitivity to dressing. We expand the analysis to include classification techniques such as decision trees and support vector machines using k-fold cross-validation. The 6-axis load cell successfully discriminated between clothing types with predictive model accuracies between 72 and 97%. Used independently, the IMU and Baxter sensors were insufficient to discriminate garment types, with the IMU showing 40–72% accuracy, but when used in combination this pair of sensors achieved an accuracy similar to the more expensive load cell (98%). When observing dressing errors (snagging), Baxter’s sensors and the IMU data demonstrated poor sensitivity, but applying machine learning methods resulted in models with high predictive accuracy and low false negative rates (≤5%). The results show that the load cell could be used independently for this application with good accuracy but a combination of the lower cost sensors could also be used without a significant loss in
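
    The k-fold cross-validation protocol used to evaluate the classifiers can be sketched with a deliberately simple stand-in model. Everything here is illustrative: a nearest-centroid classifier on a single scalar feature replaces the paper's decision trees and SVMs, and the data are invented, so only the evaluation loop mirrors the described method.

```python
import statistics

def kfold_indices(n, k):
    """Split range(n) into k contiguous folds (a real pipeline would
    shuffle indices first)."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def nearest_centroid_accuracy(features, labels, k=5):
    """k-fold cross-validated accuracy of a one-feature nearest-centroid
    classifier, a toy stand-in for decision-tree/SVM models."""
    folds = kfold_indices(len(features), k)
    correct = total = 0
    for test in folds:
        train = [i for i in range(len(features)) if i not in test]
        # Fit: per-class mean of the training feature.
        centroids = {}
        for lab in set(labels[i] for i in train):
            centroids[lab] = statistics.mean(
                features[i] for i in train if labels[i] == lab)
        # Predict held-out fold by nearest centroid.
        for i in test:
            pred = min(centroids, key=lambda c: abs(features[i] - centroids[c]))
            correct += pred == labels[i]
            total += 1
    return correct / total
```

    In practice one would use a library implementation (e.g., scikit-learn's KFold with a tree or SVM estimator); the point of the sketch is the train/held-out rotation that produces the accuracy figures quoted in the abstract.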

  20. Intrinsically motivated reinforcement learning for human-robot interaction in the real-world.

    Science.gov (United States)

    Qureshi, Ahmed Hussain; Nakamura, Yutaka; Yoshikawa, Yuichiro; Ishiguro, Hiroshi

    2018-03-26

    For natural social human-robot interaction, it is essential for a robot to learn human-like social skills. However, learning such skills is notoriously hard due to the limited availability of direct instructions from people to teach a robot. In this paper, we propose an intrinsically motivated reinforcement learning framework in which an agent receives intrinsic motivation-based rewards through an action-conditional predictive model. Using the proposed method, the robot learned social skills from human-robot interaction experiences gathered in real, uncontrolled environments. The results indicate that the robot not only acquired human-like social skills but also made more human-like decisions, on a test dataset, than a robot which received direct rewards for task achievement. Copyright © 2018 Elsevier Ltd. All rights reserved.
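
    The intrinsic-reward idea, rewarding the agent by the prediction error of an action-conditional model, can be sketched in tabular form. This is a minimal illustration, not the authors' architecture: the states, actions, and learning rate are hypothetical, and a real system would use a learned neural predictive model.

```python
class PredictiveModel:
    """Tabular action-conditional predictor: maps (state, action) to a
    predicted scalar outcome, updated toward what was observed."""

    def __init__(self, lr=0.5):
        self.table = {}   # (state, action) -> predicted outcome
        self.lr = lr

    def predict(self, state, action):
        return self.table.get((state, action), 0.0)

    def update(self, state, action, observed):
        pred = self.predict(state, action)
        self.table[(state, action)] = pred + self.lr * (observed - pred)
        return abs(observed - pred)   # prediction error before the update

def intrinsic_reward(model, state, action, observed_next):
    """Intrinsic reward = the model's prediction error: the agent is
    rewarded for experiences its model predicts poorly (novelty), and
    the reward shrinks as the model learns."""
    return model.update(state, action, observed_next)
```

    Repeating the same experience makes the reward decay, which is the driving mechanism: the agent is pushed toward interactions it cannot yet predict.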

  1. Drum-mate: interaction dynamics and gestures in human-humanoid drumming experiments

    Science.gov (United States)

    Kose-Bagci, Hatice; Dautenhahn, Kerstin; Syrdal, Dag S.; Nehaniv, Chrystopher L.

    2010-06-01

    This article investigates the role of interaction kinesics in human-robot interaction (HRI). We adopted a bottom-up, synthetic approach towards interactive competencies in robots, using simple, minimal computational models underlying the robot's interaction dynamics. We present two empirical, exploratory studies investigating a drumming experience with a humanoid robot (KASPAR) and a human. In the first experiment, the turn-taking behaviour of the humanoid is deterministic and the non-verbal gestures of the robot accompany its drumming, allowing us to assess the impact of non-verbal gestures on the interaction. The second experiment studies a computational framework that facilitates emergent turn-taking dynamics, whereby the particular dynamics of turn-taking emerge from the social interaction between the human and the humanoid. The results from the HRI experiments are presented and analysed qualitatively (in terms of the participants' subjective experiences) and quantitatively (concerning the drumming performance of the human-robot pair). The results point to a trade-off between the subjective evaluation of the drumming experience from the perspective of the participants and the objective evaluation of the drumming performance. A certain number of gestures was preferred as a motivational factor in the interaction. The participants preferred the models underlying the robot's turn-taking that enabled the robot and human to interact more and provided turn-taking closer to 'natural' human-human conversations, despite differences in objective measures of drumming behaviour. The results are consistent with the temporal behaviour matching hypothesis previously proposed in the literature, which concerns the effect whereby participants adapt their own interaction dynamics to the robot's.

  2. Spatiotemporal Aspects of Engagement during Dialogic Storytelling Child–Robot Interaction

    Directory of Open Access Journals (Sweden)

    Scott Heath

    2017-06-01

    Full Text Available. The success of robotic agents in close proximity of humans depends on their capacity to engage in social interactions and maintain these interactions over periods of time that are suitable for learning. A critical requirement is the ability to modify the behavior of the robot contingently to the attentional and social cues signaled by the human. A benchmark challenge for an engaging social robot is that of storytelling. In this paper, we present an exploratory study to investigate dialogic storytelling—storytelling with contingent responses—using a child-friendly robot. The aim of the study was to develop an engaging storytelling robot and to develop metrics for evaluating engagement. Ten children listened to an illustrated story told by a social robot during a science fair. The responses of the robot were adapted during the interaction based on the children’s engagement and touches of the pictures displayed by the robot on a tablet embedded in its torso. During the interaction the robot responded contingently to the child, but only when the robot invited the child to interact. We describe the robot architecture used to implement dialogic storytelling and evaluate the quality of human–robot interaction based on temporal (patterns of touch, touch duration) and spatial (motions in the space surrounding the robot) metrics. We introduce a novel visualization that emphasizes the temporal dynamics of the interaction and analyze the motions of the children in the space surrounding the robot. The study demonstrates that the interaction through invited contingent responses succeeded in engaging children, although the robot missed some opportunities for contingent interaction and the children had to adapt to the task. We conclude that (i) the consideration of both temporal and spatial attributes is fundamental for establishing metrics to estimate levels of engagement in real-time, (ii) metrics for engagement are sensitive to both the group and

  3. Exploring cultural factors in human-robot interaction : A matter of personality?

    NARCIS (Netherlands)

    Weiss, Astrid; Evers, Vanessa

    2011-01-01

    This paper proposes an experimental study to investigate task-dependence and cultural-background dependence of the personality trait attribution on humanoid robots. In Human-Robot Interaction, as well as in Human-Agent Interaction research, the attribution of personality traits towards intelligent

  4. Physical Human Robot Interaction for a Wall Mounting Robot - External Force Estimation

    DEFF Research Database (Denmark)

    Alonso García, Alejandro; Villarmarzo Arruñada, Noelia; Pedersen, Rasmus

    2018-01-01

    The use of collaborative robots enhances human capabilities, leading to better working conditions and increased productivity. In building construction, such robots are needed, among other tasks, to install large glass panels, where the robot takes care of the heavy lifting part of the job while...

  5. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction.

    Science.gov (United States)

    Xu, Tian Linger; Zhang, Hui; Yu, Chen

    2016-05-01

    We focus on a fundamental looking behavior in human-robot interactions - gazing at each other's face. Eye contact and mutual gaze between two social partners are critical in smooth human-human interactions. Therefore, investigating at what moments and in what ways a robot should look at a human user's face as a response to the human's gaze behavior is an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user to control an interacting robot in real time. Using this system, we conducted an experiment in which human participants interacted with the robot in a joint attention task. In the experiment, we systematically manipulated the robot's gaze toward the human partner's face in real time and then analyzed the human's gaze behavior as a response to the robot's gaze behavior. We found that more face looks from the robot led to more look-backs (to the robot's face) from human participants and consequently created more mutual gaze and eye contact between the two. Moreover, participants demonstrated more coordinated and synchronized multimodal behaviors between speech and gaze when more eye contact was successfully established and maintained.

  6. Mobile Manipulation, Tool Use, and Intuitive Interaction for Cognitive Service Robot Cosero

    Directory of Open Access Journals (Sweden)

    Jörg Stückler

    2016-11-01

    Full Text Available. Cognitive service robots that shall assist persons in need in performing their activities of daily living have recently received much attention in robotics research. Such robots require a vast set of control and perception capabilities to provide useful assistance through mobile manipulation and human-robot interaction. In this article, we present hardware design, perception, and control methods for our cognitive service robot Cosero. We complement autonomous capabilities with handheld teleoperation interfaces on three levels of autonomy. The robot demonstrated various advanced skills, including the use of tools. With our robot we participated in the annual international RoboCup@Home competitions, winning them three times in a row.

  7. The host-encoded Heme Regulated Inhibitor (HRI) facilitates virulence-associated activities of bacterial pathogens.

    Directory of Open Access Journals (Sweden)

    Niraj Shrestha

    Full Text Available. Here we show that cells lacking the heme-regulated inhibitor (HRI) are highly resistant to infection by bacterial pathogens. By examining the infection process in wild-type and HRI null cells, we found that HRI is required for pathogens to execute their virulence-associated cellular activities. Specifically, unlike wild-type cells, HRI null cells infected with the Gram-negative bacterial pathogen Yersinia are essentially impervious to the cytoskeleton-damaging effects of the Yop virulence factors. This effect is due to reduced functioning of the Yersinia type 3 secretion (T3S) system, which injects virulence factors directly into the host cell cytosol. Reduced T3S activity is also observed in HRI null cells infected with the bacterial pathogen Chlamydia, which results in a dramatic reduction in its intracellular proliferation. We go on to show that an HRI-mediated process plays a central role in the cellular infection cycle of the Gram-positive pathogen Listeria. For this pathogen, HRI is required for the post-invasion trafficking of the bacterium to the infected host cytosol. Thus, by depriving Listeria of its intracellular niche, there is a highly reduced proliferation of Listeria in HRI null cells. We provide evidence that these infection-associated functions of HRI (an eIF2α kinase) are independent of its activity as a regulator of protein synthesis. This is the first report of a host factor whose absence interferes with the function of the T3S system and cytosolic access by pathogens, and makes HRI an excellent target for inhibitors due to its broad virulence-associated activities.

  8. Modeling and Simulation for Exploring Human-Robot Team Interaction Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dudenhoeffer, Donald Dean; Bruemmer, David Jonathon; Davis, Midge Lee

    2001-12-01

    Small-sized and micro-robots will soon be available for deployment in large-scale forces. Consequently, the ability of a human operator to coordinate and interact with large-scale robotic forces is of great interest. This paper describes the ways in which modeling and simulation have been used to explore new possibilities for human-robot interaction. The paper also discusses how these explorations have fed implementation of a unified set of command and control concepts for robotic force deployment. Modeling and simulation can play a major role in fielding robot teams in actual missions. While live testing is preferred, limitations in terms of technology, cost, and time often prohibit extensive experimentation with physical multi-robot systems. Simulation provides insight, focuses efforts, eliminates large areas of the possible solution space, and increases the quality of actual testing.

  9. Robot Acting on Moving Bodies (RAMBO): Interaction with tumbling objects

    Science.gov (United States)

    Davis, Larry S.; Dementhon, Daniel; Bestul, Thor; Ziavras, Sotirios; Srinivasan, H. V.; Siddalingaiah, Madhu; Harwood, David

    1989-01-01

    Interaction with tumbling objects will become more common as human activities in space expand. Attempting to interact with a large complex object translating and rotating in space, a human operator using only his visual and mental capacities may not be able to estimate the object motion, plan actions or control those actions. A robot system (RAMBO) equipped with a camera, which, given a sequence of simple tasks, can perform these tasks on a tumbling object, is being developed. RAMBO is given a complete geometric model of the object. A low level vision module extracts and groups characteristic features in images of the object. The positions of the object are determined in a sequence of images, and a motion estimate of the object is obtained. This motion estimate is used to plan trajectories of the robot tool to relative locations near the object sufficient for achieving the tasks. More specifically, low level vision uses parallel algorithms for image enhancement by symmetric nearest neighbor filtering, edge detection by local gradient operators, and corner extraction by sector filtering. The object pose estimation is a Hough transform method accumulating position hypotheses obtained by matching triples of image features (corners) to triples of model features. To maximize computing speed, the estimate of the position in space of a triple of features is obtained by decomposing its perspective view into a product of rotations and a scaled orthographic projection. This allows use of 2-D lookup tables at each stage of the decomposition. The position hypotheses for each possible match of model feature triples and image feature triples are calculated in parallel. Trajectory planning combines heuristic and dynamic programming techniques. Then, trajectories are created using dynamic interpolations between initial and goal trajectories. All the parallel algorithms run on a Connection Machine CM-2 with 16K processors.

  10. Considerations for designing robotic upper limb rehabilitation devices

    Science.gov (United States)

    Nadas, I.; Vaida, C.; Gherman, B.; Pisla, D.; Carbone, G.

    2017-12-01

    The present study highlights the advantages of robotic systems for post-stroke rehabilitation of the upper limb. The latest demographic studies illustrate a continuous increase of the average life span, which leads to a continuous increase of stroke incidents and patients requiring rehabilitation. Some studies estimate that by 2030 the number of physical therapists will be insufficient for the patients requiring physical rehabilitation, imposing a shift in the current methodologies. A viable option is the implementation of robotic systems that assist the patient in performing rehabilitation exercises, the physical therapist's role being to establish the therapeutic program for each patient and monitor their individual progress. Using a set of clinical measurements for the upper limb motions, the analysis of rehabilitation robotic systems provides a comparative study between the motions required by clinicians and the ones that robotic systems perform for different therapeutic exercises. A critical analysis of existing robots is performed using several classifications: mechanical design, assistance type, actuation and power transmission, control systems and human robot interaction (HRI) strategies. This classification will determine a set of pre-requirements for the definition of new concepts and efficient solutions for robotic assisted rehabilitation therapy.

  11. COMPARISON OF CLASSICAL AND INTERACTIVE MULTI-ROBOT EXPLORATION STRATEGIES IN POPULATED ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Nassim Kalde

    2015-06-01

    Full Text Available. Multi-robot exploration consists of coordinating robots to map an unknown environment. It raises several issues concerning task allocation, robot control, path planning and communication. We study exploration in populated environments, in which pedestrian flows can severely impact performance. However, humans have adaptive skills for taking advantage of these flows while moving. Therefore, in order to exploit these human abilities, we propose a novel exploration strategy that explicitly allows for human-robot interactions. Our model for exploration in populated environments combines the classical frontier-based strategy with our interactive approach. We implement interactions where robots can locally choose a human guide to follow and define a parametric heuristic to balance interaction and frontier assignments. Finally, we evaluate to what extent human presence impacts our exploration model in terms of coverage ratio, travelled distance and elapsed time to completion.
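
    The parametric heuristic balancing frontier assignments against following a human guide might look like the following sketch. The gain terms and the weighting scheme are assumptions for illustration, not the authors' formula; setting alpha = 1 recovers pure frontier-based exploration.

```python
def choose_assignment(frontier_gain, guide_gain, alpha):
    """Score the classical frontier assignment against following a human
    guide, weighted by a parameter alpha in [0, 1]. Gains are expected
    information/coverage utilities estimated by the robot."""
    frontier_score = alpha * frontier_gain
    guide_score = (1 - alpha) * guide_gain
    return "frontier" if frontier_score >= guide_score else "follow_guide"
```

    A robot near a strong pedestrian flow (high guide gain) with few reachable frontiers would thus switch to following a guide, while a robot with rich frontiers keeps exploring classically.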

  12. Robotics

    International Nuclear Information System (INIS)

    Scheide, A.W.

    1983-01-01

    This article reviews some of the technical areas and history associated with robotics, provides information relative to the formation of a Robotics Industry Committee within the Industry Applications Society (IAS), and describes how all activities relating to robotics will be coordinated within the IEEE. Industrial robots are being used for material handling, processes such as coating and arc welding, and some mechanical and electronics assembly. An industrial robot is defined as a programmable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for a variety of tasks. The initial focus of the Robotics Industry Committee will be on the application of robotics systems to the various industries that are represented within the IAS

  13. Vocal Interactivity in-and-between Humans, Animals and Robots

    Directory of Open Access Journals (Sweden)

    Roger K Moore

    2016-10-01

    Full Text Available. Almost all animals exploit vocal signals for a range of ecologically-motivated purposes: detecting predators and prey, marking territory, expressing emotions, establishing social relations and sharing information. Whether it is a bird raising an alarm, a whale calling to potential partners, a dog responding to human commands, a parent reading a story with a child, or a business-person accessing stock prices using Siri, vocalisation provides a valuable communication channel through which behaviour may be coordinated and controlled, and information may be distributed and acquired. Indeed, the ubiquity of vocal interaction has led to research across an extremely diverse array of fields, from assessing animal welfare, to understanding the precursors of human language, to developing voice-based human-machine interaction. Opportunities for cross-fertilisation between these fields abound; for example, using artificial cognitive agents to investigate contemporary theories of language grounding, using machine learning to analyse different habitats, or adding vocal expressivity to the next generation of language-enabled autonomous social agents. However, much of the research is conducted within well-defined disciplinary boundaries, and many fundamental issues remain. This paper attempts to redress the balance by presenting a comparative review of vocal interaction within and between humans, animals and artificial agents (such as robots), and it identifies a rich set of open research questions that may benefit from an inter-disciplinary analysis.

  14. A Novel Bioinspired Vision System: A Step toward Real-Time Human-Robot Interactions

    Directory of Open Access Journals (Sweden)

    Abdul Rahman Hafiz

    2011-01-01

    Full Text Available. Building a human-like robot that could be involved in our daily lives is a dream of many scientists. Achieving a sophisticated robot vision system that can enhance the robot's real-time interaction ability with humans is one of the main keys toward realizing such an autonomous robot. In this work, we suggest a bioinspired vision system that helps to develop advanced human-robot interaction in an autonomous humanoid robot. First, we enhance the robot's vision accuracy online by applying a novel dynamic edge detection algorithm abstracted from the role that the horizontal cells play in the mammalian retina. Second, in order to support the first algorithm, we improve the robot's tracking ability by designing a variant photoreceptor distribution corresponding to what exists in the human vision system. The experimental results verified the validity of the model. The robot could have a clear vision in real time and build a mental map that assisted it to be aware of the frontal users and to develop a positive interaction with them.

  15. Building a grid-semantic map for the navigation of service robots through human–robot interaction

    Directory of Open Access Journals (Sweden)

    Cheng Zhao

    2015-11-01

    Full Text Available This paper presents an interactive approach to the construction of a grid-semantic map for the navigation of service robots in an indoor environment. It is based on the Robot Operating System (ROS) framework and contains four modules, namely an Interactive Module, a Control Module, a Navigation Module and a Mapping Module. Three challenging issues were the focus during its development: (i) how human voice and robot visual information could be effectively deployed in the mapping and navigation process; (ii) how semantic names could be combined with coordinate data in an online grid-semantic map; and (iii) how a localization–evaluate–relocalization method could be used in global localization based on the modified maximum particle weight of the particle swarm. A number of experiments are carried out in both simulated and real environments, such as corridors and offices, to verify its feasibility and performance.
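Issue (ii) above — binding semantic names to coordinate data in an online grid map — can be sketched as an occupancy grid augmented with a name-to-cell dictionary, so that a spoken goal like "kitchen" resolves to navigable coordinates. All class and method names below are illustrative, not taken from the paper's implementation.

```python
class GridSemanticMap:
    def __init__(self, width, height):
        # Occupancy grid: 0 = free cell, 1 = occupied cell.
        self.grid = [[0] * width for _ in range(height)]
        self.labels = {}  # semantic place name -> (row, col)

    def mark_occupied(self, row, col):
        self.grid[row][col] = 1

    def add_label(self, name, row, col):
        # Bind a recognized place name to a grid cell (taught interactively).
        self.labels[name.lower()] = (row, col)

    def resolve_goal(self, name):
        # Translate a semantic goal ("office") into grid coordinates,
        # or None if that place was never taught.
        return self.labels.get(name.lower())

m = GridSemanticMap(10, 10)
m.add_label("Kitchen", 2, 7)
assert m.resolve_goal("kitchen") == (2, 7)
```

A navigation module would then hand the resolved cell to a path planner, while voice input only ever deals in semantic names.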

  16. Influences of a Socially Interactive Robot on the Affective Behavior of Young Children with Disabilities. Social Robots Research Reports, Number 3

    Science.gov (United States)

    Dunst, Carl J.; Prior, Jeremy; Hamby, Deborah W.; Trivette, Carol M.

    2013-01-01

    Findings from two studies of 11 young children with autism, Down syndrome, or attention deficit disorders investigating the effects of Popchilla, a socially interactive robot, on the children's affective behavior are reported. The children were observed under two conditions, child-toy interactions and child-robot interactions, and ratings of child…

  17. Interaction between Task Oriented and Affective Information Processing in Cognitive Robotics

    Science.gov (United States)

    Haazebroek, Pascal; van Dantzig, Saskia; Hommel, Bernhard

    There is an increasing interest in endowing robots with emotions. Robot control, however, is still often very task-oriented. We present a cognitive architecture that allows the combination of and interaction between task representations and affective information processing. Our model is validated by comparing simulation results with empirical data from experimental psychology.

  18. ReACT!: An Interactive Educational Tool for AI Planning for Robotics

    Science.gov (United States)

    Dogmus, Zeynep; Erdem, Esra; Patogulu, Volkan

    2015-01-01

    This paper presents ReAct!, an interactive educational tool for artificial intelligence (AI) planning for robotics. ReAct! enables students to describe robots' actions and change in dynamic domains without first having to know about the syntactic and semantic details of the underlying formalism, and to solve planning problems using…

  19. Effects of eye contact and iconic gestures on message retention in human-robot interaction

    NARCIS (Netherlands)

    Dijk, van E.T.; Torta, E.; Cuijpers, R.H.

    2013-01-01

    The effects of iconic gestures and eye contact on message retention in human-robot interaction were investigated in a series of experiments. A humanoid robot gave short verbal messages to participants, accompanied either by iconic gestures or no gestures while making eye contact with the participant

  20. Multimodal human-machine interaction for service robots in home-care environments

    OpenAIRE

    Goetze, Stefan; Fischer, S.; Moritz, Niko; Appell, Jens-E.; Wallhoff, Frank

    2012-01-01

    This contribution focuses on multimodal interaction techniques for a mobile communication and assistance system on a robot platform. The system comprises acoustic, visual and haptic input modalities. Feedback is given to the user by a graphical user interface and a speech synthesis system. By this means, multimodal and natural communication with the robot system is possible.

  1. Interacting with and via mobile devices and mobile robots in an assisted living setting

    Directory of Open Access Journals (Sweden)

    Maria Dagioglou

    2015-05-01

    Full Text Available Using robotic home assistants as a platform for remote health monitoring offers several advantages, but also presents considerable challenges related to both the technical immaturity of home robotics and to user acceptance issues. In this paper we explore tablets and similar mobile devices as the medium of communication between robots and their users, presenting relevant current and planned research in human-robot interaction that can help the telehealth community circumvent technical shortcomings, improve user acceptance, and maximize the quality of the data collected by robotic home assistants.

  2. Vocal Production of Young Children with Disabilities during Child-Robot Interactions. Social Robots Research Reports, Number 5

    Science.gov (United States)

    Dunst, Carl J.; Hamby, Deborah W.; Trivette, Carol M.; Prior, Jeremy; Derryberry, Graham

    2013-01-01

    The effects of a socially interactive robot on the vocalization production of five children with disabilities (4 with autism, 1 with a sensory processing disorder) were the focus of the intervention study described in this research report. The interventions with each child were conducted over 4 or 5 days in the children's homes and involved…

  3. Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Lorino, P; Altwegg, J M

    1985-05-01

    This article, which is aimed at the general reader, examines latest developments in, and the role of, modern robotics. The 7 main sections are sub-divided into 27 papers presented by 30 authors. The sections are as follows: 1) The role of robotics, 2) Robotics in the business world and what it can offer, 3) Study and development, 4) Utilisation, 5) Wages, 6) Conditions for success, and 7) Technological dynamics.

  4. Navigation Strategy by Contact Sensing Interaction for a Biped Humanoid Robot

    Directory of Open Access Journals (Sweden)

    Hanafiah Yussof

    2008-11-01

    Full Text Available This report presents a basic contact interaction-based navigation strategy for a biped humanoid robot to support current visual-based navigation. The robot's arms were equipped with force sensors to detect physical contact with objects. We proposed a motion algorithm consisting of searching tasks, self-localization tasks, correction of locomotion direction tasks and obstacle avoidance tasks. Priority was given to the right-side direction to navigate the robot's locomotion. Analysis of trajectory generation, the biped gait pattern, and biped walking characteristics was performed to define an efficient navigation strategy for a biped walking humanoid robot. The proposed algorithm is evaluated in an experiment with a 21-DOF humanoid robot operating in a room with walls and obstacles. The experimental results reveal good robot performance when recognizing objects by touching, grasping, and continuously generating suitable trajectories to correct direction and avoid collisions.

  5. The Snackbot: Documenting the Design of a Robot for Long-term Human-Robot Interaction

    Science.gov (United States)

    2009-03-01

    distributed robots. Proceedings of the Computer Supported Cooperative Work Conference '02. NY: ACM Press. [18] Kanda, T., Takayuki, H., Eaton, D., and... humanoid robots. Proceedings of HRI '06. New York, NY: ACM Press, 351-352. [23] Nabe, S., Kanda, T., Hiraki, K., Ishiguro, H., Kogure, K., and Hagita

  6. Human Factors Principles in Design of Computer-Mediated Visualization for Robot Missions

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; David J Bruemmer

    2008-12-01

    With increased use of robots as a resource in missions supporting countermine, improvised explosive devices (IEDs), and chemical, biological, radiological nuclear and conventional explosives (CBRNE), fully understanding the best means by which to complement the human operator’s underlying perceptual and cognitive processes could not be more important. Consistent with control and display integration practices in many other high technology computer-supported applications, current robotic design practices rely highly upon static guidelines and design heuristics that reflect the expertise and experience of the individual designer. In order to use what we know about human factors (HF) to drive human robot interaction (HRI) design, this paper reviews underlying human perception and cognition principles and shows how they were applied to a threat detection domain.

  7. Human-robot interaction tests on a novel robot for gait assistance.

    Science.gov (United States)

    Tagliamonte, Nevio Luigi; Sergi, Fabrizio; Carpino, Giorgio; Accoto, Dino; Guglielmelli, Eugenio

    2013-06-01

    This paper presents tests on a treadmill-based non-anthropomorphic wearable robot assisting hip and knee flexion/extension movements using compliant actuation. Validation experiments were performed on the actuators and on the robot, with specific focus on the evaluation of intrinsic backdrivability and of assistance capability. Tests were conducted on a young healthy subject. With the robot completely unpowered, maximum backdriving torques were found to be on the order of 10 Nm, owing to the robot's design features (reduced swinging masses; low intrinsic mechanical impedance and high-efficiency reduction gears for the actuators). Assistance tests demonstrated that the robot can deliver torques attracting the subject towards a predicted kinematic status.

  8. Acceptance of an assistive robot in older adults: a mixed-method study of human–robot interaction over a 1-month period in the Living Lab setting

    Science.gov (United States)

    Wu, Ya-Huei; Wrobel, Jérémy; Cornuet, Mélanie; Kerhervé, Hélène; Damnée, Souad; Rigaud, Anne-Sophie

    2014-01-01

    Background There is growing interest in investigating acceptance of robots, which are increasingly being proposed as one form of assistive technology to support older adults, maintain their independence, and enhance their well-being. In the present study, we aimed to observe robot-acceptance in older adults, particularly subsequent to a 1-month direct experience with a robot. Subjects and methods Six older adults with mild cognitive impairment (MCI) and five cognitively intact healthy (CIH) older adults were recruited. Participants interacted with an assistive robot in the Living Lab once a week for 4 weeks. After being shown how to use the robot, participants performed tasks to simulate robot use in everyday life. Mixed methods, comprising a robot-acceptance questionnaire, semistructured interviews, usability-performance measures, and a focus group, were used. Results Both CIH and MCI subjects were able to learn how to use the robot. However, MCI subjects needed more time to perform tasks after a 1-week period of not using the robot. Both groups rated similarly on the robot-acceptance questionnaire. They showed low intention to use the robot, as well as negative attitudes toward and negative images of this device. They did not perceive it as useful in their daily life. However, they found it easy to use, amusing, and not threatening. In addition, social influence was perceived as powerful on robot adoption. Direct experience with the robot did not change the way the participants rated robots in their acceptance questionnaire. We identified several barriers to robot-acceptance, including older adults’ uneasiness with technology, feeling of stigmatization, and ethical/societal issues associated with robot use. Conclusion It is important to destigmatize images of assistive robots to facilitate their acceptance. Universal design aiming to increase the market for and production of products that are usable by everyone (to the greatest extent possible) might help to destigmatize assistive devices.

  10. An Exploration of the Benefits of an Animallike Robot Companion with More Advanced Touch Interaction Capabilities for Dementia Care

    NARCIS (Netherlands)

    Jung, Merel Madeleine; van der Leij, Lisa; Kelders, Saskia Marion

    2017-01-01

    Animallike robot companions such as robotic seal Paro are increasingly used in dementia care due to the positive effects that interaction with these robots can have on the well-being of these patients. Touch is one of the most important interaction modalities for patients with dementia and can be a

  11. Compliance control based on PSO algorithm to improve the feeling during physical human-robot interaction.

    Science.gov (United States)

    Jiang, Zhongliang; Sun, Yu; Gao, Peng; Hu, Ying; Zhang, Jianwei

    2016-01-01

    Robots play increasingly important roles in daily life and bring us much convenience. But when people work with robots, significant differences remain between human-human interaction and human-robot interaction. Our goal is to make robots even more human-like. We design a controller that can sense a force acting on any point of the robot and ensure the robot moves according to that force. First, a spring-mass-dashpot system was used to describe the physical model; this second-order system is the kernel of the controller. Then, we established the state-space equations of the system. In addition, the particle swarm optimization algorithm was used to obtain the system parameters. To test the stability of the system, the root-locus diagram is shown in the paper. Ultimately, experiments were carried out on the robotic spinal surgery system developed by our team, and the results show that the new controller performs better during human-robot interaction.
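The two ingredients described above — a second-order spring-mass-dashpot admittance model and particle swarm optimization of its parameters — can be combined in a minimal sketch. The single degree of freedom, the overshoot-based cost function, and all parameter ranges below are invented for illustration; they are not the paper's values or its cost criterion.

```python
import random

def simulate(m, c, k, force=10.0, dt=0.01, steps=300):
    # Semi-implicit Euler integration of m*x'' + c*x' + k*x = force.
    x, v = 0.0, 0.0
    xs = []
    for _ in range(steps):
        a = (force - c * v - k * x) / m
        v += a * dt
        x += v * dt
        xs.append(x)
    return xs

def overshoot(xs, force=10.0, k=100.0):
    # Overshoot above the steady-state deflection force/k.
    target = force / k
    return max(max(xs) - target, 0.0)

def pso_damping(m=1.0, k=100.0, particles=15, iters=40):
    # Tiny PSO over the damping coefficient c, minimizing overshoot.
    random.seed(0)
    pos = [random.uniform(1.0, 50.0) for _ in range(particles)]
    vel = [0.0] * particles
    best = list(pos)
    best_cost = [overshoot(simulate(m, p, k)) for p in pos]
    g = best[best_cost.index(min(best_cost))]  # global best
    for _ in range(iters):
        for i in range(particles):
            r1, r2 = random.random(), random.random()
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i]) + 1.5 * r2 * (g - pos[i])
            pos[i] = min(max(pos[i] + vel[i], 1.0), 50.0)  # clamp to bounds
            cost = overshoot(simulate(m, pos[i], k))
            if cost < best_cost[i]:
                best[i], best_cost[i] = pos[i], cost
                if cost < overshoot(simulate(m, g, k)):
                    g = pos[i]
    return g

c_opt = pso_damping()
```

In a real admittance controller the simulated displacement would drive the robot's position loop; here the PSO simply finds a damping value that responds to the applied force without overshooting.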

  12. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    Science.gov (United States)

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation software available in most computer graphics systems is described, along with program features. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…
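The core of such simulation programs is chaining homogeneous transformation matrices, one per link, to find the end-effector pose. A planar two-link sketch (link lengths and joint angles are arbitrary example values, not from the paper):

```python
import math

def rot_trans(theta, length):
    # 3x3 planar homogeneous transform: rotate by theta,
    # then translate by `length` along the rotated x-axis.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, c * length],
            [s,  c, s * length],
            [0,  0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def end_effector(thetas, lengths):
    # Compose one transform per link; the last column of the product
    # is the tool position in the base frame.
    t = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for theta, length in zip(thetas, lengths):
        t = matmul(t, rot_trans(theta, length))
    return t[0][2], t[1][2]

# Elbow bent 90 degrees back: the tool ends up at (1, 1).
x, y = end_effector([math.pi / 2, -math.pi / 2], [1.0, 1.0])
```

The same composition generalizes to 3-D with 4x4 matrices, and external devices (tools, fixtures, conveyors) are just additional transforms in the chain.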

  13. Novelty Detection for Interactive Pose Recognition by a Social Robot

    Directory of Open Access Journals (Sweden)

    Victor Gonzalez-Pacheco

    2015-04-01

    Full Text Available Active robot learners take an active role in their own learning by making queries to their human teachers when they receive new data. However, not every received input is useful to the robot, and asking for non-informative inputs or asking too many questions might worsen the user's perception of the robot. We present a novelty detection system that enables a robot to ask for labels for new stimuli only when they seem both novel and interesting. Our system separates the decision process into two steps: first, it discriminates novel from known stimuli, and second, it estimates whether these stimuli are likely to happen again. Our approach uses the notion of curiosity, which controls the eagerness with which the robot asks questions of the user. We evaluate our approach in the domain of pose learning by training our robot with a set of pointing poses; the system detected up to 84%, 79%, and 78% of the observed novelties in three different experiments. Our approach enables robots to keep learning continuously, even after training is finished. The introduction of the curiosity parameter allows tuning for the conditions under which the robot should want to learn more.
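The two-step decision process above — novelty check first, recurrence check second, gated by a curiosity parameter — can be sketched as follows. The distance metric, the novelty radius, and the recurrence-count notion of curiosity are invented for illustration and are not the paper's actual models.

```python
class CuriousLearner:
    def __init__(self, novelty_radius=1.0, curiosity=2):
        self.known = {}          # label -> list of feature vectors
        self.pending = []        # unexplained stimuli seen so far
        self.novelty_radius = novelty_radius
        self.curiosity = curiosity  # recurrences required before querying

    def _dist(self, a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def _is_novel(self, x):
        # Step 1: novel iff far from every known example.
        return all(self._dist(x, e) > self.novelty_radius
                   for examples in self.known.values() for e in examples)

    def observe(self, x):
        # Returns True when the robot should query the teacher for a label.
        if not self._is_novel(x):
            return False
        # Step 2: query only once the novel stimulus has recurred enough.
        near = [p for p in self.pending
                if self._dist(x, p) <= self.novelty_radius]
        self.pending.append(x)
        return len(near) + 1 >= self.curiosity

    def teach(self, label, x):
        self.known.setdefault(label, []).append(x)

robot = CuriousLearner()
robot.teach("point_left", (0.0, 0.0))
assert robot.observe((0.1, 0.1)) is False   # known pose: no query
assert robot.observe((5.0, 5.0)) is False   # novel but seen only once
assert robot.observe((5.1, 5.0)) is True    # recurred: worth asking
```

Raising the `curiosity` threshold makes the robot more patient before interrupting the user; lowering it makes it eager to ask.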

  14. An Exploratory Investigation into the Effects of Adaptation in Child-Robot Interaction

    Science.gov (United States)

    Salter, Tamie; Michaud, François; Létourneau, Dominic

    The work presented in this paper describes an exploratory investigation into the potential effects of a robot exhibiting an adaptive behaviour in reaction to a child’s interaction. In our laboratory we develop robotic devices for a diverse range of children that differ in age, gender and ability, which includes children that are diagnosed with cognitive difficulties. As all children vary in their personalities and styles of interaction, it would follow that adaptation could bring many benefits. In this abstract we give our initial examination of a series of trials which explore the effects of a fully autonomous rolling robot exhibiting adaptation (through changes in motion and sound) compared to it exhibiting pre-programmed behaviours. We investigate sensor readings on-board the robot that record the level of ‘interaction’ that the robot receives when a child plays with it and also we discuss the results from analysing video footage looking at the social aspect of the trial.

  15. Interactive robot control system and method of use

    Science.gov (United States)

    Sanders, Adam M. (Inventor); Reiland, Matthew J. (Inventor); Abdallah, Muhammad E. (Inventor); Linn, Douglas Martin (Inventor); Platt, Robert (Inventor)

    2012-01-01

    A robotic system includes a robot having joints, actuators, and sensors, and a distributed controller. The controller includes a command-level controller, embedded joint-level controllers (each controlling a respective joint), and a joint coordination-level controller coordinating motion of the joints. A central data library (CDL) centralizes all control and feedback data, and a user interface displays the status of each joint, actuator, and sensor using the CDL. A parameterized action sequence has a hierarchy of linked events and allows the control data to be modified in real time. A method of controlling the robot includes transmitting control data through the various levels of the controller, routing all control and feedback data to the CDL, and displaying status and operation of the robot using the CDL. The parameterized action sequences are generated for execution by the robot, and a hierarchy of linked events is created within the sequence.
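The centralization idea — routing all command and feedback data through one central data library so a user interface can display every joint, actuator, and sensor from a single place — can be sketched minimally. The class and field names are illustrative, not taken from the patent.

```python
class CentralDataLibrary:
    def __init__(self):
        self.records = {}  # source name -> {field: latest value}

    def publish(self, source, field, value):
        # Both command data and sensor feedback are routed through here.
        self.records.setdefault(source, {})[field] = value

    def status(self, source):
        # A UI can render any component's latest state from this one call.
        return dict(self.records.get(source, {}))

cdl = CentralDataLibrary()
cdl.publish("joint_3", "commanded_angle", 0.52)  # from command level
cdl.publish("joint_3", "measured_angle", 0.50)   # from joint-level feedback
assert cdl.status("joint_3") == {"commanded_angle": 0.52,
                                 "measured_angle": 0.50}
```

Because every controller level writes through the same library, the display layer never needs direct access to the joint-level controllers.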

  16. Improving therapeutic outcomes in autism spectrum disorders: Enhancing social communication and sensory processing through the use of interactive robots.

    Science.gov (United States)

    Sartorato, Felippe; Przybylowski, Leon; Sarko, Diana K

    2017-07-01

    For children with autism spectrum disorders (ASDs), social robots are increasingly utilized as therapeutic tools in order to enhance social skills and communication. Robots have been shown to generate a number of social and behavioral benefits in children with ASD including heightened engagement, increased attention, and decreased social anxiety. Although social robots appear to be effective social reinforcement tools in assistive therapies, the perceptual mechanism underlying these benefits remains unknown. To date, social robot studies have primarily relied on expertise in fields such as engineering and clinical psychology, with measures of social robot efficacy principally limited to qualitative observational assessments of children's interactions with robots. In this review, we examine a range of socially interactive robots that currently have the most widespread use as well as the utility of these robots and their therapeutic effects. In addition, given that social interactions rely on audiovisual communication, we discuss how enhanced sensory processing and integration of robotic social cues may underlie the perceptual and behavioral benefits that social robots confer. Although overall multisensory processing (including audiovisual integration) is impaired in individuals with ASD, social robot interactions may provide therapeutic benefits by allowing audiovisual social cues to be experienced through a simplified version of a human interaction. By applying systems neuroscience tools to identify, analyze, and extend the multisensory perceptual substrates that may underlie the therapeutic benefits of social robots, future studies have the potential to strengthen the clinical utility of social robots for individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Social robotics to help children with autism in their interactions through imitation

    Directory of Open Access Journals (Sweden)

    Pennazio Valentina

    2017-06-01

    Full Text Available This article aims to reflect on the main variables that make social robotics efficient in an educational and rehabilitative intervention. Social robotics is based on imitation, and the study is designed for children affected by profound autism, aiming for the development of their social interactions. Existing research, at the national and international levels, shows how children with autism can interact more easily with a robotic companion rather than a human peer, considering its less complex and more predictable actions. This contribution also highlights how using robotic platforms helps in teaching children with autism basic social abilities, imitation, communication and interaction; this encourages them to transfer the learned abilities to human interactions with both adults and peers, through human–robot imitative modelling. The results of a pilot study conducted in a kindergarten school in the Liguria region are presented. The study included applying a robotic system, at first in a dyadic child–robot relation, then in a triadic one that also included another child, with the aim of eliciting social and imitative abilities in a child with profound autism.

  18. Applications of artificial intelligence in safe human-robot interactions.

    Science.gov (United States)

    Najmaei, Nima; Kermani, Mehrdad R

    2011-04-01

    The integration of industrial robots into the human workspace presents a set of unique challenges. This paper introduces a new sensory system for modeling, tracking, and predicting human motions within a robot workspace. A reactive control scheme to modify a robot's operations for accommodating the presence of the human within the robot workspace is also presented. To this end, a special class of artificial neural networks, namely, self-organizing maps (SOMs), is employed for obtaining a superquadric-based model of the human. The SOM network receives information of the human's footprints from the sensory system and infers necessary data for rendering the human model. The model is then used in order to assess the danger of the robot operations based on the measured as well as predicted human motions. This is followed by the introduction of a new reactive control scheme that results in the least interferences between the human and robot operations. The approach enables the robot to foresee an upcoming danger and take preventive actions before the danger becomes imminent. Simulation and experimental results are presented in order to validate the effectiveness of the proposed method.
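A self-organizing map, as used above to model the human from footprint data, maps input vectors onto a small grid of units whose weights are pulled toward each input and its grid neighbours. A tiny 1-D SOM over 2-D "footprint" points follows; the grid size, learning rates, and data are arbitrary illustration values, not the paper's configuration.

```python
import math
import random

def train_som(points, units=4, epochs=50, lr=0.5):
    random.seed(1)
    weights = [[random.random(), random.random()] for _ in range(units)]
    for epoch in range(epochs):
        rate = lr * (1.0 - epoch / epochs)               # decaying rate
        radius = max(0.5, units / 2 * (1.0 - epoch / epochs))
        for p in points:
            # Best-matching unit = nearest weight vector.
            bmu = min(range(units),
                      key=lambda i: (weights[i][0] - p[0]) ** 2
                                  + (weights[i][1] - p[1]) ** 2)
            for i in range(units):
                # Neighbourhood pull decays with grid distance to the BMU.
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i][0] += rate * h * (p[0] - weights[i][0])
                weights[i][1] += rate * h * (p[1] - weights[i][1])
    return weights

# Two clusters of footprint positions: near (0, 0) and near (5, 5).
pts = [(0.1, 0.0), (0.0, 0.2), (5.0, 5.1), (5.2, 5.0)]
w = train_som(pts)
```

After training, the unit weights summarize where the human has been; in the paper's setting such a compressed model feeds the superquadric human representation used for danger assessment.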

  19. Enabling Effective Human-Robot Interaction Using Perspective-Taking in Robots

    National Research Council Canada - National Science Library

    Trafton, J. G; Cassimatis, Nicholas L; Bugajska, Magdalena D; Brock, Derek P; Mintz, Farilee E; Schultz, Alan C

    2005-01-01

    ...) and present a cognitive architecture for performing perspective-taking called Polyscheme. Finally, we show a fully integrated system that instantiates our theoretical framework within a working robot system...

  20. A Deep ROSAT HRI Observation of NGC 1313

    Science.gov (United States)

    Schlegel, Eric M.; Petre, Robert; Colbert, E. J. M.; Miller, Scott

    2000-11-01

    We describe a series of observations of NGC 1313 using the ROSAT HRI with a combined exposure time of 183.5 ks. The observations span an interval between 1992 and 1998; the purpose of observations since 1994 was to monitor the X-ray flux of SN 1978K, one of several luminous sources in the galaxy. No diffuse emission is detected in the galaxy to a level of ~1-2×10^37 ergs s^-1 arcmin^-2. A total of eight sources are detected in the summed image within the D25 diameter of the galaxy. The luminosities of five of the eight range from ~6×10^37 to ~6×10^38 ergs s^-1; these sources are most likely accreting X-ray binaries, similar to sources observed in M31 and M33. The remaining three sources all emit above 10^39 ergs s^-1. We present light curves of the five brightest sources. Variability is detected at the 99.9% level in four of these. We identify one of the sources as an NGC 1313 counterpart of a Galactic X-ray source. The light curve, though crudely sampled, most closely resembles that of a Galactic black hole candidate such as GX 339-4, but with considerably higher peak X-ray luminosity. An additional seven sources lie outside the D25 diameter and are either foreground stars or background active galactic nuclei.
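Luminosities like those quoted above follow from measured fluxes via the inverse-square relation L = 4πd²F. The sketch below assumes a distance to NGC 1313 of roughly 4 Mpc, an approximate literature value used here only for illustration; the example flux is likewise invented.

```python
import math

MPC_IN_CM = 3.0857e24  # one megaparsec in centimetres

def luminosity(flux_cgs, distance_mpc):
    # flux in erg s^-1 cm^-2  ->  luminosity in erg s^-1
    d_cm = distance_mpc * MPC_IN_CM
    return 4.0 * math.pi * d_cm ** 2 * flux_cgs

# An illustrative flux of 5e-13 erg s^-1 cm^-2 at ~4 Mpc gives a
# luminosity near 1e39 erg s^-1, the regime of the three brightest
# sources described above.
L = luminosity(5e-13, 4.0)
```

This is why a modest HRI count rate from a galaxy a few megaparsecs away can still imply a source radiating well above typical X-ray-binary luminosities.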

  1. Natural Tasking of Robots Based on Human Interaction Cues

    Science.gov (United States)

    2005-06-01

    MIT. • Matthew Marjanovic, researcher, ITA Software. • Brian Scasselatti, Assistant Professor of Computer Science, Yale. • Matthew Williamson...2004. 25 [74] Charlie C. Kemp. Shoes as a platform for vision. 7th IEEE International Symposium on Wearable Computers, 2004. [75] Matthew Marjanovic, meso: Simulated muscles for a humanoid robot. Presentation for Humanoid Robotics Group, MIT AI Lab, August 2001. [76] Matthew J. Marjanovic. Teaching

  2. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction

    Science.gov (United States)

    XU, TIAN (LINGER); ZHANG, HUI; YU, CHEN

    2016-01-01

    We focus on a fundamental looking behavior in human-robot interactions – gazing at each other’s face. Eye contact and mutual gaze between two social partners are critical in smooth human-human interactions. Therefore, investigating at what moments and in what ways a robot should look at a human user’s face as a response to the human’s gaze behavior is an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user to control an interacting robot in real time. Using this system, we conducted an experiment in which human participants interacted with the robot in a joint attention task. In the experiment, we systematically manipulated the robot’s gaze toward the human partner’s face in real time and then analyzed the human’s gaze behavior as a response to the robot’s gaze behavior. We found that more face looks from the robot led to more look-backs (to the robot’s face) from human participants and consequently created more mutual gaze and eye contact between the two. Moreover, participants demonstrated more coordinated and synchronized multimodal behaviors between speech and gaze when more eye contact was successfully established and maintained. PMID:28966875

  3. Human-Robot Interaction in High Vulnerability Domains

    Science.gov (United States)

    Gore, Brian F.

    2016-01-01

    Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. The complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots; from direct manual control, shared human-robotic control, or no active human control (i.e. human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of the technologies on the system performance and the on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.

  4. Evaluation by Expert Dancers of a Robot That Performs Partnered Stepping via Haptic Interaction.

    Directory of Open Access Journals (Sweden)

    Tiffany L Chen

    Full Text Available Our long-term goal is to enable a robot to engage in partner dance for use in rehabilitation therapy, assessment, diagnosis, and scientific investigations of two-person whole-body motor coordination. Partner dance has been shown to improve balance and gait in people with Parkinson's disease and in older adults, which motivates our work. During partner dance, dance couples rely heavily on haptic interaction to convey motor intent such as speed and direction. In this paper, we investigate the potential for a wheeled mobile robot with a human-like upper-body to perform partnered stepping with people based on the forces applied to its end effectors. Blindfolded expert dancers (N=10) performed a forward/backward walking step to a recorded drum beat while holding the robot's end effectors. We varied the admittance gain of the robot's mobile base controller and the stiffness of the robot's arms. The robot followed the participants with low lag (M=224, SD=194 ms) across all trials. High admittance gain and high arm stiffness conditions resulted in significantly improved performance with respect to subjective and objective measures. Biomechanical measures such as the human hand to human sternum distance, center-of-mass of leader to center-of-mass of follower (CoM-CoM) distance, and interaction forces correlated with the expert dancers' subjective ratings of their interactions with the robot, which were internally consistent (Cronbach's α=0.92). In response to a final questionnaire, 1/10 expert dancers strongly agreed, 5/10 agreed, and 1/10 disagreed with the statement "The robot was a good follower." 2/10 strongly agreed, 3/10 agreed, and 2/10 disagreed with the statement "The robot was fun to dance with." The remaining participants were neutral with respect to these two questions.

  5. Evaluation by Expert Dancers of a Robot That Performs Partnered Stepping via Haptic Interaction

    Science.gov (United States)

    Chen, Tiffany L.; Bhattacharjee, Tapomayukh; McKay, J. Lucas; Borinski, Jacquelyn E.; Hackney, Madeleine E.; Ting, Lena H.; Kemp, Charles C.

    2015-01-01

    Our long-term goal is to enable a robot to engage in partner dance for use in rehabilitation therapy, assessment, diagnosis, and scientific investigations of two-person whole-body motor coordination. Partner dance has been shown to improve balance and gait in people with Parkinson's disease and in older adults, which motivates our work. During partner dance, dance couples rely heavily on haptic interaction to convey motor intent such as speed and direction. In this paper, we investigate the potential for a wheeled mobile robot with a human-like upper-body to perform partnered stepping with people based on the forces applied to its end effectors. Blindfolded expert dancers (N=10) performed a forward/backward walking step to a recorded drum beat while holding the robot's end effectors. We varied the admittance gain of the robot's mobile base controller and the stiffness of the robot's arms. The robot followed the participants with low lag (M=224, SD=194 ms) across all trials. High admittance gain and high arm stiffness conditions resulted in significantly improved performance with respect to subjective and objective measures. Biomechanical measures such as the human hand to human sternum distance, center-of-mass of leader to center-of-mass of follower (CoM-CoM) distance, and interaction forces correlated with the expert dancers' subjective ratings of their interactions with the robot, which were internally consistent (Cronbach's α=0.92). In response to a final questionnaire, 1/10 expert dancers strongly agreed, 5/10 agreed, and 1/10 disagreed with the statement "The robot was a good follower." 2/10 strongly agreed, 3/10 agreed, and 2/10 disagreed with the statement "The robot was fun to dance with." The remaining participants were neutral with respect to these two questions. PMID:25993099
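    The admittance-gain manipulation described in this record can be illustrated with a minimal sketch: an admittance controller maps the forces a partner applies at the robot's end effectors to a commanded base velocity. The gain, time constant, and force values below are illustrative assumptions, not parameters from the study.

```python
def admittance_step(force_n, v_prev, gain=0.01, tau=0.2, dt=0.01):
    """One Euler step of a first-order admittance law: tau*v_dot = gain*F - v.

    A higher `gain` makes the base respond more strongly to the same applied
    force, mirroring the high-admittance condition in the study.
    (All numeric values here are illustrative assumptions.)
    """
    return v_prev + dt * (gain * force_n - v_prev) / tau

# A constant 20 N push accelerates the base toward gain*F = 0.2 m/s.
v = 0.0
for _ in range(1000):
    v = admittance_step(20.0, v)
```

    Under this law the base velocity converges exponentially to `gain * force`, so the gain directly sets how "light" the robot feels to the human leader.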

  6. A Self-Organizing Interaction and Synchronization Method between a Wearable Device and Mobile Robot.

    Science.gov (United States)

    Kim, Min Su; Lee, Jae Geun; Kang, Soon Ju

    2016-06-08

    In the near future, we can expect to see robots naturally following or going ahead of humans, similar to pet behavior. We call this type of robots "Pet-Bot". To implement this function in a robot, in this paper we introduce a self-organizing interaction and synchronization method between wearable devices and Pet-Bots. First, the Pet-Bot opportunistically identifies its owner without any human intervention, which means that the robot self-identifies the owner's approach on its own. Second, Pet-Bot's activity is synchronized with the owner's behavior. Lastly, the robot frequently encounters uncertain situations (e.g., when the robot goes ahead of the owner but meets a situation where it cannot make a decision, or the owner wants to stop the Pet-Bot synchronization mode to relax). In this case, we have adopted a gesture recognition function that uses a 3-D accelerometer in the wearable device. In order to achieve the interaction and synchronization in real-time, we use two wireless communication protocols: 125 kHz low-frequency (LF) and 2.4 GHz Bluetooth low energy (BLE). We conducted experiments using a prototype Pet-Bot and wearable devices to verify their motion recognition of and synchronization with humans in real-time. The results showed a guaranteed level of accuracy of at least 94%. A trajectory test was also performed to demonstrate the robot's control performance when following or leading a human in real-time.
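    The gesture-recognition fallback described above (a 3-D accelerometer in the wearable device) can be approximated by thresholding the deviation of the acceleration magnitude from gravity. This is a simplified stand-in for the paper's recognizer; the threshold and sample data are illustrative assumptions.

```python
import math

def detect_shake(samples, g=9.81, threshold=2.5):
    """Flag a 'shake' gesture when the peak acceleration magnitude, minus
    gravity, exceeds `threshold` m/s^2. Each sample is an (x, y, z) tuple
    in m/s^2. The threshold is an illustrative assumption."""
    peaks = [abs(math.sqrt(x * x + y * y + z * z) - g) for x, y, z in samples]
    return max(peaks) > threshold

still = [(0.1, 0.0, 9.8)] * 50          # wrist at rest: magnitude ~ g
shake = still + [(6.0, 2.0, 12.0)] + still  # one vigorous movement
# detect_shake(still) → False; detect_shake(shake) → True
```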

  7. Portraits of self-organization in fish schools interacting with robots

    Science.gov (United States)

    Aureli, M.; Fiorilli, F.; Porfiri, M.

    2012-05-01

    In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
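    A typical global observable of the kind this record adapts from statistical mechanics is the polarization order parameter, the magnitude of the mean heading unit vector of the school. The sketch below shows that observable; the exact set of observables used in the paper may differ.

```python
import cmath

def polarization(headings):
    """Polarization order parameter of a group: 1.0 means all individuals
    share one heading (a polarized school), values near 0 mean disordered
    motion. Headings are angles in radians."""
    return abs(sum(cmath.exp(1j * h) for h in headings)) / len(headings)

aligned = [0.10, 0.12, 0.09, 0.11]          # near-identical headings
disordered = [0.0, 1.57, 3.14, 4.71]        # roughly uniform on the circle
```

    Tracking such a scalar over time gives a compact signature of the shoal's collective response to the biomimetic robot.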

  8. Affective and Engagement Issues in the Conception and Assessment of a Robot-Assisted Psychomotor Therapy for Persons with Dementia

    Directory of Open Access Journals (Sweden)

    Natacha Rouaix

    2017-06-01

    Full Text Available The interest in robot-assisted therapies (RAT for dementia care has grown steadily in recent years. However, RAT using humanoid robots is still a novel practice for which the adhesion mechanisms, indications and benefits remain unclear. Also, little is known about how the robot's behavioral and affective style might promote engagement of persons with dementia (PwD in RAT. The present study sought to investigate the use of a humanoid robot in a psychomotor therapy for PwD. We examined the robot's potential to engage participants in the intervention and its effect on their emotional state. A brief psychomotor therapy program involving the robot as the therapist's assistant was created. For this purpose, a corpus of social and physical behaviors for the robot and a “control software” for customizing the program and operating the robot were also designed. Particular attention was given to components of the RAT that could promote participant's engagement (robot's interaction style, personalization of contents. In the pilot assessment of the intervention nine PwD (7 women and 2 men, M age = 86 y/o hospitalized in a geriatrics unit participated in four individual therapy sessions: one classic therapy (CT session (patient- therapist and three RAT sessions (patient-therapist-robot. Outcome criteria for the evaluation of the intervention included: participant's engagement, emotional state and well-being; satisfaction of the intervention, appreciation of the robot, and empathy-related behaviors in human-robot interaction (HRI. Results showed a high constructive engagement in both CT and RAT sessions. More positive emotional responses in participants were observed in RAT compared to CT. RAT sessions were better appreciated than CT sessions. The use of a social robot as a mediating tool appeared to promote the involvement of PwD in the therapeutic intervention increasing their immediate wellbeing and satisfaction.

  9. Interaction Challenges in Human-Robot Space Exploration

    Science.gov (United States)

    Fong, Terrence; Nourbakhsh, Illah

    2005-01-01

    In January 2004, NASA established a new, long-term exploration program to fulfill the President's Vision for U.S. Space Exploration. The primary goal of this program is to establish a sustained human presence in space, beginning with robotic missions to the Moon in 2008, followed by extended human expeditions to the Moon as early as 2015. In addition, the program places significant emphasis on the development of joint human-robot systems. A key difference from previous exploration efforts is that future space exploration activities must be sustainable over the long-term. Experience with the space station has shown that cost pressures will keep astronaut teams small. Consequently, care must be taken to extend the effectiveness of these astronauts well beyond their individual human capacity. Thus, in order to reduce human workload, costs, and fatigue-driven error and risk, intelligent robots will have to be an integral part of mission design.

  10. Affective and behavioral responses to robot-initiated social touch : Towards understanding the opportunities and limitations of physical contact in human-robot interaction

    NARCIS (Netherlands)

    Willemse, C.J.A.M.; Toet, A.; Erp, J.B.F. van

    2017-01-01

    Social touch forms an important aspect of the human non-verbal communication repertoire, but is often overlooked in human–robot interaction. In this study, we investigated whether robot-initiated touches can induce physiological, emotional, and behavioral responses similar to those reported for…

  11. Parents' Appraisals of the Animacy and Likability of Socially Interactive Robots for Intervening with Young Children with Disabilities. Social Robots Research Reports, Number 2

    Science.gov (United States)

    Dunst, Carl J.; Trivette, Carol M.; Prior, Jeremy; Hamby, Deborah W.; Embler, Davon

    2013-01-01

    Findings from a survey of parents' ratings of seven different human-like qualities of four socially interactive robots are reported. The four robots were Popchilla, Keepon, Kaspar, and CosmoBot. The participants were 96 parents and other primary caregivers of young children with disabilities 1 to 12 years of age. Results showed that Popchilla, a…

  12. Parents' Judgments of the Acceptability and Importance of Socially Interactive Robots for Intervening with Young Children with Disabilities. Social Robots Research Reports, Number 1

    Science.gov (United States)

    Dunst, Carl J.; Trivette, Carol M.; Prior, Jeremy; Hamby, Deborah W.; Embler, Davon

    2013-01-01

    A number of different types of socially interactive robots are being used as part of interventions with young children with disabilities to promote their joint attention and language skills. Parents' judgments of two dimensions (acceptance and importance) of the social validity of four different social robots were the focus of the study described…

  13. Affective and Behavioral Responses to Robot-Initiated Social Touch : Toward Understanding the Opportunities and Limitations of Physical Contact in Human–Robot Interaction

    NARCIS (Netherlands)

    Willemse, Christian J. A. M.; Toet, Alexander; van Erp, Jan B. F.

    2017-01-01

    Social touch forms an important aspect of the human non-verbal communication repertoire, but is often overlooked in human-robot interaction. In this study, we investigated whether robot-initiated touches can induce physiological, emotional, and behavioral responses similar to those reported for…

  14. Ontology-based indirect interaction of mobile robots for joint task solving: a scenario for obstacle overcoming

    Directory of Open Access Journals (Sweden)

    Petrov Mikhail

    2017-01-01

    Full Text Available This paper describes an ontology-based approach to the interaction of users and mobile robots for joint task solving. The use of ontologies supports semantic interoperability between robots. The ontologies store knowledge about the tasks to be performed, knowledge about the functionality of the robots, and current situation factors such as a robot's location or busyness. Ontologies are published in a smart space, which allows indirect interaction between participants. On the basis of this knowledge, a robot can determine a task to be performed and get the current status of other robots. The paper presents a reference model of the approach to indirect interaction between mobile robots for joint task solving, an ontology model for the knowledge organization, and an application of the presented approach to a scenario for obstacle overcoming.
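    The indirect, blackboard-style coordination described in this record can be sketched with a minimal shared "smart space": robots publish status facts and claim tasks without messaging each other directly. All names and fields below are illustrative assumptions, not the paper's ontology.

```python
# A minimal shared knowledge store: robots write facts, others read them.
smart_space = {"tasks": [], "robots": {}}

def publish_status(robot_id, location, busy):
    """A robot publishes its current situation factors to the space."""
    smart_space["robots"][robot_id] = {"location": location, "busy": busy}

def post_task(name, location):
    """A user (or another robot) posts a task for anyone to pick up."""
    smart_space["tasks"].append({"name": name, "location": location,
                                 "claimed_by": None})

def claim_next_task(robot_id):
    """An idle robot claims the first unclaimed task it can see;
    busy or unknown robots claim nothing."""
    if smart_space["robots"].get(robot_id, {}).get("busy", True):
        return None
    for task in smart_space["tasks"]:
        if task["claimed_by"] is None:
            task["claimed_by"] = robot_id
            return task["name"]
    return None
```

    The key property, as in the record, is that participants never address each other: coordination emerges from reading and writing shared knowledge.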

  15. A receptionist robot for Brazilian people: study on interaction involving illiterates

    Directory of Open Access Journals (Sweden)

    Trovato Gabriele

    2017-04-01

    Full Text Available The receptionist job, consisting in providing useful indications to visitors in a public office, is one possible employment of social robots. The design and the behaviour of robots expected to be integrated in human societies are crucial issues, and they are dependent on the culture and society in which the robot should be deployed. We study the factors that could be used in the design of a receptionist robot in Brazil, a country with a mix of races and considerable gaps in economic and educational level. This inequality results in the presence of functionally illiterate people, unable to use reading, writing and numeracy skills. We invited Brazilian people, including a group of functionally illiterate subjects, to interact with two types of receptionists differing in physical appearance (agent vs. mechanical robot) and in the sound of the voice (human-like vs. mechanical). Results gathered during the interactions point out a preference for the agent, for the human-like voice, and a more intense reaction to stimuli by illiterates. These results provide useful indications that should be considered when designing a receptionist robot, as well as insights on the effect of illiteracy on the interaction.

  16. Dynamic Characterization and Interaction Control of the CBM-Motus Robot for Upper-Limb Rehabilitation

    Directory of Open Access Journals (Sweden)

    Loredana Zollo

    2013-10-01

    Full Text Available This paper presents dynamic characterization and control of an upper-limb rehabilitation machine aimed at improving robot performance in the interaction with the patient. An integrated approach between mechanics and control is the key issue of the paper for the development of a robotic machine with desirable dynamic properties. Robot inertial and acceleration properties are studied in the workspace via a graphical representation based on ellipses. Robot friction is experimentally retrieved by means of a parametric identification procedure. A current-based impedance control is developed in order to compensate for friction and enhance control performance in the interaction with the patient by means of force feedback, without increasing system inertia. To this end, servo-amplifier motor currents are monitored to provide force feedback in the interaction, thus avoiding the need for force sensors mounted at the robot end-effector. Current-based impedance control is implemented on the robot; experimental results in free space as well as in constrained space are provided.
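    The current-based force feedback at the heart of this controller can be sketched as follows: interaction torque is estimated from the servo-amplifier motor current (total motor torque minus a friction model), then fed into an impedance target. The torque constant, friction coefficients, and impedance gains below are illustrative assumptions, not the CBM-Motus parameters.

```python
def interaction_torque(current_a, velocity, kt=0.08, fc=0.15, fv=0.02):
    """Estimate human-interaction torque from motor current: motor torque
    (kt * i) minus a Coulomb + viscous friction model, removing the need
    for a force sensor at the end-effector. Constants are illustrative."""
    sign = (velocity > 0) - (velocity < 0)
    friction = fc * sign + fv * velocity
    return kt * current_a - friction

def impedance_ref_accel(tau_int, pos, vel, k=5.0, d=1.0, m=0.5):
    """Desired joint acceleration under a mass-spring-damper impedance
    target around pos = 0: m*a = tau_int - d*vel - k*pos."""
    return (tau_int - d * vel - k * pos) / m
```

    Because the force estimate comes from currents the robot already measures, compliance is obtained without adding sensor mass or inertia, which is the design point the record emphasizes.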

  17. Head Orientation Behavior of Users and Durations in Playful Open-Ended Interactions with an Android Robot

    DEFF Research Database (Denmark)

    Vlachos, Evgenios; Jochum, Elizabeth Ann; Schärfe, Henrik

    2016-01-01

    This paper presents the results of a field-experiment focused on the head orientation behavior of users in short-term dyadic interactions with an android (male) robot in a playful context, as well as on the duration of the interactions. The robotic trials took place in an art exhibition where...... subjects. The findings suggest that androids have the ability to maintain the focus of attention during short-term interactions within a playful context. This study provides an insight on how users communicate with an android robot, and on how to design meaningful human robot social interaction for real...

  18. Human Robotic Swarm Interaction Using an Artificial Physics Approach

    Science.gov (United States)

    2014-12-01

    ...physicomimetics-based framework in a four-robot auditory scene monitoring scenario [19]. In their experiments, Apker et al. use the four-wheeled Pioneer3-AT ground

  19. Interacting with Multi-Robot Systems Using BML

    Science.gov (United States)

    2013-06-01

    presented to the operator. 1. Introduction: There are many operations in which a multi-robot system (MRS) can be deployed to support the human forces...within the MRS easily. © Fraunhofer FKIE. Battle Management Language (BML): orders such as move, patrol, observe, distribute, guard, recce; imagery intelligence gathering

  20. Designing HRD Interventions for Employee-Robot Interaction

    Science.gov (United States)

    Heo, Se Jin

    2011-01-01

    The purpose of this study was to identify critical causes of work stress and job satisfaction of nurses, which can contribute to finding appropriate organizational supports to help nurses work effectively with a surgical robot. The Delphi method was employed to identify the critical stressors and the key causes of job satisfaction of nurses working with…

  1. Experiences developing socially acceptable interactions for a robotic trash barrel

    DEFF Research Database (Denmark)

    Yang, Stephen; Mok, Brian Ka Jun; Sirkin, David

    2015-01-01

    strategies that seemed to evoke clear engagement and responses, both positive and negative. Observations and interviews show that a) people most welcome the robot's presence when they need its services and it actively advertises its intent through movement; b) people create mental models of the trash barrel...

  2. Perception of Affective Body Movements in HRI Across Age Groups

    DEFF Research Database (Denmark)

    Rehm, Matthias; Krogsager, Anders; Segato, Nicolaj

    2016-01-01

    robots and the signals they produce. In this paper we focus on affective connotations of body movements and investigate how the perception of body movements of robots is related to age. Inspired by a study from Japan, we introduce culture as a variable in the experiment and discuss the difficulties...... of cross-cultural comparisons. The results show that there are certain age-related differences in the perception of affective body movements, but not as strong as in the original study. A follow up experiment puts the affective body movements into context and shows that recognition rates deteriorate...

  3. Collaboration by Design: Using Robotics to Foster Social Interaction in Kindergarten

    Science.gov (United States)

    Lee, Kenneth T. H.; Sullivan, Amanda; Bers, Marina U.

    2013-01-01

    Research shows the importance of social interaction between peers in child development. Although technology can foster peer interactions, teachers often struggle with teaching with technology. This study examined a sample of (n = 19) children participating in a kindergarten robotics summer workshop to determine the effect of teaching using a…

  4. Robotics

    Indian Academy of Sciences (India)

    magnetic induction to detect an object. The development of ... end effector, inclination of object, magnetic and electric fields, etc. The sensors described ... In the case of a robot, the various actuators and motors have to be modelled. The major ...

  5. Evolutionary robotics

    Indian Academy of Sciences (India)

    In evolutionary robotics, a suitable robot control system is developed automatically through evolution due to the interactions between the robot and its environment. It is a complicated task, as the robot and the environment constitute a highly dynamical system. Several methods have been tried by various investigators to ...

  6. An Interactive Control Algorithm Used for Equilateral Triangle Formation with Robotic Sensors

    Science.gov (United States)

    Li, Xiang; Chen, Hongcai

    2014-01-01

    This paper describes an interactive control algorithm, called the Triangle Formation Algorithm (TFA), used for three neighboring robotic sensors which are distributed randomly to self-organize into an equilateral triangle (E) formation. The algorithm is proposed based on triangular geometry and considering the actual sensors used in robotics. In particular, the stability of the TFA, which can be executed by robotic sensors independently and asynchronously for E formation, is analyzed in detail based on Lyapunov stability theory. Computer simulations are carried out for verifying the effectiveness of the TFA. The analytical results and simulation studies indicate that three neighboring robots employing conventional sensors can self-organize into E formations successfully, regardless of their initial distribution, using the same TFAs. PMID:24759118
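    The flavor of such a distance-based formation law can be conveyed with a simplified stand-in for the TFA (we do not reproduce the paper's exact algorithm): each robot moves toward or away from each neighbor in proportion to the error between their current distance and the target side length.

```python
import math

def tfa_step(positions, side=1.0, gain=0.1):
    """One synchronous update of a gradient-style formation law: each robot
    nudges itself along the line to each neighbor by gain * (distance error).
    `side` is the desired edge length of the equilateral triangle.
    Gains and iteration counts are illustrative assumptions."""
    new = []
    for i, (xi, yi) in enumerate(positions):
        dx = dy = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            d = math.hypot(xj - xi, yj - yi)
            err = d - side
            dx += gain * err * (xj - xi) / d
            dy += gain * err * (yj - yi) / d
        new.append((xi + dx, yi + dy))
    return new

pts = [(0.0, 0.0), (1.3, 0.1), (0.4, 0.9)]  # random-ish, non-collinear start
for _ in range(200):
    pts = tfa_step(pts)
# All three pairwise distances converge to the target side length.
```

    From a generic (non-collinear) start, all pairwise distances converge to `side`, i.e., the robots settle into an equilateral triangle of the desired size.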

  7. Empowering Older Patients to Engage in Self Care: Designing an Interactive Robotic Device

    Science.gov (United States)

    Tiwari, Priyadarshi; Warren, Jim; Day, Karen

    2011-01-01

    Objectives: To develop and test an interactive robot mounted computing device to support medication management as an example of a complex self-care task in older adults. Method: A Grounded Theory (GT), Participatory Design (PD) approach was used within three Action Research (AR) cycles to understand design requirements and test the design configuration addressing the unique task requirements. Results: At the end of the first cycle a conceptual framework was evolved. The second cycle informed architecture and interface design. By the end of third cycle residents successfully interacted with the dialogue system and were generally satisfied with the robot. The results informed further refinement of the prototype. Conclusion: An interactive, touch screen based, robot-mounted information tool can be developed to support healthcare needs of older people. Qualitative methods such as the hybrid GT-PD-AR approach may be particularly helpful for innovating and articulating design requirements in challenging situations. PMID:22195203

  8. I Show You How I Like You: Emotional Human-Robot Interaction through Facial Expression and Tactile Stimulation

    DEFF Research Database (Denmark)

    Canamero, Dolores; Fredslund, Jacob

    2001-01-01

    We report work on a LEGO robot that displays different emotional expressions in response to physical stimulation, for the purpose of social interaction with humans. This is a first step toward our longer-term goal of exploring believable emotional exchanges to achieve plausible interaction...... with a simple robot. Drawing inspiration from theories of human basic emotions, we implemented several prototypical expressions in the robot's caricatured face and conducted experiments to assess the recognizability of these expressions...

  9. Cutting Edge Research in Homeopathy: HRI's second international research conference in Rome.

    Science.gov (United States)

    Tournier, Alexander L; Roberts, E Rachel

    2016-02-01

    Rome, 3rd-5th June 2015, was the setting for the Homeopathy Research Institute's (HRI) second conference with the theme 'Cutting Edge Research in Homeopathy'. Attended by over 250 delegates from 39 countries, this event provided an intense two and a half day programme of presentations and a forum for the sharing of ideas and the creation of international scientific collaborations. With 35 oral presentations from leaders in the field, the scientific calibre of the programme was high and the content diverse. This report summarises the key themes underpinning the cutting edge data presented by the speakers, including six key-note presentations, covering advancements in both basic and clinical research. Given the clear commitment of the global homeopathic community to high quality research, the resounding success of both Barcelona 2013 and Rome 2015 HRI conferences, and the dedicated support of colleagues, the HRI moves confidently forward towards the next biennial conference.

  10. Rhythm Patterns Interaction - Synchronization Behavior for Human-Robot Joint Action

    Science.gov (United States)

    Mörtl, Alexander; Lorenz, Tamara; Hirche, Sandra

    2014-01-01

    Interactive behavior among humans is governed by the dynamics of movement synchronization in a variety of repetitive tasks. This requires the interaction partners to perform for example rhythmic limb swinging or even goal-directed arm movements. Inspired by that essential feature of human interaction, we present a novel concept and design methodology to synthesize goal-directed synchronization behavior for robotic agents in repetitive joint action tasks. The agents’ tasks are described by closed movement trajectories and interpreted as limit cycles, for which instantaneous phase variables are derived based on oscillator theory. Events segmenting the trajectories into multiple primitives are introduced as anchoring points for enhanced synchronization modes. Utilizing both continuous phases and discrete events in a unifying view, we design a continuous dynamical process synchronizing the derived modes. Inverse to the derivation of phases, we also address the generation of goal-directed movements from the behavioral dynamics. The developed concept is implemented to an anthropomorphic robot. For evaluation of the concept an experiment is designed and conducted in which the robot performs a prototypical pick-and-place task jointly with human partners. The effectiveness of the designed behavior is successfully evidenced by objective measures of phase and event synchronization. Feedback gathered from the participants of our exploratory study suggests a subjectively pleasant sense of interaction created by the interactive behavior. The results highlight potential applications of the synchronization concept both in motor coordination among robotic agents and in enhanced social interaction between humanoid agents and humans. PMID:24752212
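    The continuous-phase part of this synchronization concept follows classic oscillator theory; a Kuramoto-style phase coupling is the standard minimal model (the paper's dynamics additionally use discrete events, omitted here). Frequencies, the coupling gain, and the one-sided coupling are illustrative assumptions.

```python
import math

def couple_phases(phi_r, phi_h, omega_r, omega_h, k=1.5, dt=0.01):
    """One Euler step of one-sided Kuramoto coupling: only the robot phase
    is attracted to the human's; the human is a free oscillator.
    phi_* are phases in radians, omega_* natural frequencies in rad/s."""
    phi_r += dt * (omega_r + k * math.sin(phi_h - phi_r))
    phi_h += dt * omega_h
    return phi_r, phi_h

phi_r, phi_h = 0.0, 1.0
for _ in range(5000):  # 50 simulated seconds
    phi_r, phi_h = couple_phases(phi_r, phi_h, 5.8, 6.0)
# The phase difference locks near asin((omega_h - omega_r) / k).
```

    Phase locking of this kind is what the record's objective measures of phase synchronization detect: the robot entrains to the human's rhythm despite a frequency mismatch.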

  11. Interactive language learning by robots: the transition from babbling to word forms.

    Science.gov (United States)

    Lyon, Caroline; Nehaniv, Chrystopher L; Saunders, Joe

    2012-01-01

    The advent of humanoid robots has enabled a new approach to investigating the acquisition of language, and we report on the development of robots able to acquire rudimentary linguistic skills. Our work focuses on early stages analogous to some characteristics of a human child of about 6 to 14 months, the transition from babbling to first word forms. We investigate one mechanism among many that may contribute to this process, a key factor being the sensitivity of learners to the statistical distribution of linguistic elements. As well as being necessary for learning word meanings, the acquisition of anchor word forms facilitates the segmentation of an acoustic stream through other mechanisms. In our experiments some salient one-syllable word forms are learnt by a humanoid robot in real-time interactions with naive participants. Words emerge from random syllabic babble through a learning process based on a dialogue between the robot and the human participant, whose speech is perceived by the robot as a stream of phonemes. Numerous ways of representing the speech as syllabic segments are possible. Furthermore, the pronunciation of many words in spontaneous speech is variable. However, in line with research elsewhere, we observe that salient content words are more likely than function words to have consistent canonical representations; thus their relative frequency increases, as does their influence on the learner. Variable pronunciation may contribute to early word form acquisition. The importance of contingent interaction in real-time between teacher and learner is reflected by a reinforcement process, with variable success. The examination of individual cases may be more informative than group results. Nevertheless, word forms are usually produced by the robot after a few minutes of dialogue, employing a simple, real-time, frequency dependent mechanism. This work shows the potential of human-robot interaction systems in studies of the dynamics of early language
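    The frequency sensitivity described above can be conveyed by a toy version of word-form selection: count syllable forms in the perceived stream and keep those frequent enough to stand out, as content words with consistent canonical forms would. This is a deliberate simplification of the paper's real-time reinforcement mechanism; the threshold and example stream are illustrative assumptions.

```python
from collections import Counter

def salient_word_forms(syllable_stream, min_count=3):
    """Return the syllable forms whose frequency in the perceived stream
    reaches `min_count`, a crude proxy for the frequency-dependent
    emergence of word forms from babble."""
    counts = Counter(syllable_stream)
    return {form for form, c in counts.items() if c >= min_count}

# Content words ("ball", "red") recur consistently; babble fragments do not.
stream = ["ba", "ball", "da", "ball", "go", "ball", "da", "red", "red", "red"]
# → {"ball", "red"}
```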

  12. Speech-Based Human and Service Robot Interaction: An Application for Mexican Dysarthric People

    Directory of Open Access Journals (Sweden)

    Santiago Omar Caballero Morales

    2013-01-01

    Full Text Available Dysarthria is a motor speech disorder due to weakness or poor coordination of the speech muscles. This condition can be caused by a stroke, traumatic brain injury, or by a degenerative neurological disease. Commonly, people with this disorder also have muscular dystrophy, which restricts their use of switches or keyboards for communication or control of assistive devices (i.e., an electric wheelchair or a service robot). In this case, speech recognition is an attractive alternative for interaction and control of service robots, despite the difficulty of achieving robust recognition performance. In this paper we present a speech recognition system for human and service robot interaction for Mexican Spanish dysarthric speakers. The core of the system consisted of a Speaker Adaptive (SA) recognition system trained with normal speech. Features such as on-line control of the language model perplexity and the adding of vocabulary contribute to high recognition performance. Others, such as assessment and text-to-speech (TTS) synthesis, contribute to a more complete interaction with a service robot. Live tests were performed with two mild dysarthric speakers, achieving recognition accuracies of 90–95% for spontaneous speech and 95–100% of accomplished simulated service robot tasks.

  13. Systematic analysis of video data from different human–robot interaction studies: a categorization of social signals during error situations

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human–robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies. PMID:26217266

  14. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  15. Children with Autism Spectrum Disorders Make a Fruit Salad with Probo, the Social Robot: An Interaction Study

    Science.gov (United States)

    Simut, Ramona E.; Vanderfaeillie, Johan; Peca, Andreea; Van de Perre, Greet; Vanderborght, Bram

    2016-01-01

    Social robots are thought to be motivating tools in play tasks with children with autism spectrum disorders. Thirty children with autism were included in a repeated-measures design. It was investigated whether the children's interaction with a human differed from their interaction with a social robot during a play task. Also, it was examined if…

  16. Learning compliant manipulation through kinesthetic and tactile human-robot interaction.

    Science.gov (United States)

    Kronander, Klas; Billard, Aude

    2014-01-01

    Robot Learning from Demonstration (RLfD) has been identified as a key element for making robots useful in daily lives. A wide range of techniques has been proposed for deriving a task model from a set of demonstrations of the task. Most previous works use learning to model the kinematics of the task, and for autonomous execution the robot then relies on a stiff position controller. While many tasks can and have been learned this way, there are tasks in which controlling the position alone is insufficient to achieve the goals of the task. These are typically tasks that involve contact or require a specific response to physical perturbations. The question of how to adjust the compliance to suit the needs of the task has not yet been fully treated in Robot Learning from Demonstration. In this paper, we address this issue and present interfaces that allow a human teacher to indicate compliance variations by physically interacting with the robot during task execution. We validate our approach in two different experiments on the 7 DoF Barrett WAM and KUKA LWR robot manipulators. Furthermore, we conduct a user study to evaluate the usability of our approach from a non-roboticist's perspective.

  17. Estimation of Physical Human-Robot Interaction Using Cost-Effective Pneumatic Padding

    Directory of Open Access Journals (Sweden)

    André Wilkening

    2016-08-01

    Full Text Available The idea of using a cost-effective pneumatic padding for sensing the physical interaction between a user and a wearable rehabilitation robot is not new, but until now there has been no practically relevant realization. In this paper, we present a novel method to estimate physical human-robot interaction using a pneumatic padding based on artificial neural networks (ANNs). This estimation can serve as a rough indicator of the forces/torques applied by the user and can be used for visual feedback about the user's participation or as additional information for interaction controllers. Unlike common, mostly very expensive, 6-axis force/torque sensors (FTS), the proposed sensor system can be easily integrated into the design of physical human-robot interfaces of rehabilitation robots and adapts itself to the shape of the individual patient's extremity by changing the pressure in its pneumatic chambers, in order to provide safe physical interaction with a high level of user comfort. This paper describes a concept for using ANNs to estimate interaction forces/torques based on the pressure variations of eight customized air-pad chambers. The ANNs were trained once offline using the signals of a high-precision FTS, which is also used as the reference sensor for experimental validation. Experiments with three different subjects confirm the functionality of the concept and the estimation algorithm.
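
    The estimation pipeline described in the abstract (pressure readings in, forces/torques out, trained once offline against a reference force/torque sensor) can be sketched as a small regression network. The data below are synthetic stand-ins, and the layer sizes and library choice are assumptions for illustration, not details from the paper:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: 8 chamber pressures -> 6-axis force/torque (wrench).
# In the paper, the training targets come from a high-precision FTS instead.
pressures = rng.uniform(0.0, 1.0, size=(2000, 8))
mixing = rng.normal(size=(8, 6))
wrench = pressures @ mixing + 0.01 * rng.normal(size=(2000, 6))

# One-time offline training, as in the abstract; layer sizes are assumptions.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
ann.fit(pressures[:1500], wrench[:1500])

# Online use: estimate the interaction wrench from pressure variations alone.
estimate = ann.predict(pressures[1500:])
print(estimate.shape)  # (500, 6)
```

    Once trained, such an estimator needs only the cheap pressure signals at run time, which is the point of replacing the expensive FTS.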

  18. Specific and Class Object Recognition for Service Robots through Autonomous and Interactive Methods

    Science.gov (United States)

    Mansur, Al; Kuno, Yoshinori

    Service robots need to be able to recognize and identify objects located within complex backgrounds. Since no single method may work in every situation, several methods need to be combined and robots have to select the appropriate one automatically. In this paper we propose a scheme to classify situations depending on the characteristics of the object of interest and user demand. We classify situations into four groups and employ different techniques for each. We use the Scale-Invariant Feature Transform (SIFT), and Kernel Principal Component Analysis (KPCA) in conjunction with a Support Vector Machine (SVM), on intensity, color, and Gabor features, for five object categories. We show that using appropriate features is important for applying KPCA- and SVM-based techniques to different kinds of objects. Through experiments we show that by using our categorization scheme a service robot can select an appropriate feature and method, and considerably improve its recognition performance. Yet, recognition is not perfect. Thus, we propose to combine the autonomous method with an interactive method that allows the robot to recognize the user's request for a specific object and class when the robot fails to recognize the object. We also propose an interactive way to update the object model used for recognition upon failure, in conjunction with the user's feedback.
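
    The KPCA-plus-SVM branch of the scheme can be illustrated with a minimal sketch. Synthetic two-class vectors stand in for the intensity/color/Gabor features, and the component count and kernel choices are assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for 64-dim appearance features of two object classes.
X = np.vstack([rng.normal(0.0, 1.0, (100, 64)), rng.normal(1.0, 1.0, (100, 64))])
y = np.array([0] * 100 + [1] * 100)

# KPCA projects the features non-linearly; the SVM then classifies in the
# reduced space, as in the KPCA+SVM combination named in the abstract.
clf = make_pipeline(KernelPCA(n_components=10, kernel="rbf"), SVC(kernel="linear"))
clf.fit(X[::2], y[::2])
accuracy = clf.score(X[1::2], y[1::2])
print(accuracy > 0.7)
```

    In the categorization scheme described above, a block like this would be one of several recognizers the robot selects among depending on the situation.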

  19. Ghost-in-the-Machine reveals human social signals for human-robot interaction.

    Science.gov (United States)

    Loth, Sebastian; Jettka, Katharina; Giuliani, Manuel; de Ruiter, Jan P

    2015-01-01

    We used a new method called "Ghost-in-the-Machine" (GiM) to investigate social interactions with a robotic bartender taking orders for drinks and serving them. Using the GiM paradigm allowed us to identify how human participants recognize the intentions of customers on the basis of the output of the robotic recognizers. Specifically, we measured which recognizer modalities (e.g., speech, the distance to the bar) were relevant at different stages of the interaction. This provided insights into human social behavior necessary for the development of socially competent robots. When initiating the drink-order interaction, the most important recognizers were those based on computer vision. When drink orders were being placed, however, the most important information source was the speech recognition. Interestingly, the participants used only a subset of the available information, focussing only on a few relevant recognizers while ignoring others. This reduced the risk of acting on erroneous sensor data and enabled them to complete service interactions more swiftly than a robot using all available sensor data. We also investigated socially appropriate response strategies. In their responses, the participants preferred to use the same modality as the customer's requests, e.g., they tended to respond verbally to verbal requests. Also, they added redundancy to their responses, for instance by using echo questions. We argue that incorporating the social strategies discovered with the GiM paradigm in multimodal grammars of human-robot interactions improves the robustness and the ease-of-use of these interactions, and therefore provides a smoother user experience.

  20. Ghost-in-the-Machine reveals human social signals for human–robot interaction

    Science.gov (United States)

    Loth, Sebastian; Jettka, Katharina; Giuliani, Manuel; de Ruiter, Jan P.

    2015-01-01

    We used a new method called “Ghost-in-the-Machine” (GiM) to investigate social interactions with a robotic bartender taking orders for drinks and serving them. Using the GiM paradigm allowed us to identify how human participants recognize the intentions of customers on the basis of the output of the robotic recognizers. Specifically, we measured which recognizer modalities (e.g., speech, the distance to the bar) were relevant at different stages of the interaction. This provided insights into human social behavior necessary for the development of socially competent robots. When initiating the drink-order interaction, the most important recognizers were those based on computer vision. When drink orders were being placed, however, the most important information source was the speech recognition. Interestingly, the participants used only a subset of the available information, focussing only on a few relevant recognizers while ignoring others. This reduced the risk of acting on erroneous sensor data and enabled them to complete service interactions more swiftly than a robot using all available sensor data. We also investigated socially appropriate response strategies. In their responses, the participants preferred to use the same modality as the customer’s requests, e.g., they tended to respond verbally to verbal requests. Also, they added redundancy to their responses, for instance by using echo questions. We argue that incorporating the social strategies discovered with the GiM paradigm in multimodal grammars of human–robot interactions improves the robustness and the ease-of-use of these interactions, and therefore provides a smoother user experience. PMID:26582998

  1. Natural Tasking of Robots Based on Human Interaction Cues (CD-ROM)

    National Research Council Canada - National Science Library

    Brooks, Rodney A

    2005-01-01

    ...: 1 CD-ROM; 4 3/4 in.; 207 MB. ABSTRACT: We proposed developing the perceptual and intellectual abilities of robots so that in the field, war-fighters can interact with them in the same natural ways as they do with their human cohorts...

  2. Designing robot embodiments for social interaction: Affordances topple realism and aesthetics

    NARCIS (Netherlands)

    Paauwe, R.A.; Hoorn, J.F.; Konijn, E.A.; Keyson, D.V.

    2015-01-01

    In the near future, human-like social robots will become indispensable for providing support in various social tasks, in particular for healthcare (e.g., assistance, coaching). The perception of realism, in particular human-like features, can help facilitate mediated social interaction. The current

  4. Design challenges for long-term interaction with a robot in a science classroom

    NARCIS (Netherlands)

    Davison, Daniel Patrick; Charisi, Vasiliki; Wijnen, Frances Martine; Papenmeier, Andrea; van der Meij, Jan; Reidsma, Dennis; Evers, Vanessa

    This paper aims to present the main challenges that emerged during the process of the research design of a longitudinal study on child-robot interaction for science education and to discuss relevant suggestions in the context. The theoretical rationale is based on aspects of the theory of social

  5. Information theory and robotics meet to study predator-prey interactions

    Science.gov (United States)

    Neri, Daniele; Ruberto, Tommaso; Cord-Cruz, Gabrielle; Porfiri, Maurizio

    2017-07-01

    Transfer entropy holds promise to advance our understanding of animal behavior, by affording the identification of causal relationships that underlie animal interactions. A critical step toward the reliable implementation of this powerful information-theoretic concept entails the design of experiments in which causal relationships could be systematically controlled. Here, we put forward a robotics-based experimental approach to test the validity of transfer entropy in the study of predator-prey interactions. We investigate the behavioral response of zebrafish to a fear-evoking robotic stimulus, designed after the morpho-physiology of the red tiger oscar and actuated along preprogrammed trajectories. From the time series of the positions of the zebrafish and the robotic stimulus, we demonstrate that transfer entropy correctly identifies the influence of the stimulus on the focal subject. Building on this evidence, we apply transfer entropy to study the interactions between zebrafish and a live red tiger oscar. The analysis of transfer entropy reveals a change in the direction of the information flow, suggesting a mutual influence between the predator and the prey, where the predator adapts its strategy as a function of the movement of the prey, which, in turn, adjusts its escape as a function of the predator motion. Through the integration of information theory and robotics, this study posits a new approach to study predator-prey interactions in freshwater fish.
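
    Transfer entropy between two position time series can be estimated with a plug-in (histogram) estimator. The sketch below uses first-order histories, four bins, and synthetic data in which one series drives the other; all of these choices are assumptions for illustration, not the authors' estimator settings:

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Estimate T_{X->Y}: the information the past of x adds about the next
    value of y, beyond y's own past (first-order histories, plug-in entropies)."""
    xb = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yb = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    yf, yp, xp = yb[1:], yb[:-1], xb[:-1]  # y future, y past, x past

    def H(*cols):
        # Joint entropy of the binned variables, in bits.
        _, counts = np.unique(np.stack(cols), axis=1, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # T = H(y_t | y_{t-1}) - H(y_t | y_{t-1}, x_{t-1})
    return H(yf, yp) - H(yp) - H(yf, yp, xp) + H(yp, xp)

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.1 * rng.normal(size=5000)  # y is driven by the past of x
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True: flow is x -> y
```

    The asymmetry of the two estimates is what identifies the direction of influence, which is how the study distinguishes the stimulus driving the fish from a mutual predator-prey influence.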

  6. You Can Leave Your Head On: Attention Management and Turn-Taking in Multi-party Interaction with a Virtual Human/Robot Duo

    NARCIS (Netherlands)

    Linssen, Jeroen; Berkhoff, Meike; Bode, Max; Rens, Eduard; Theune, Mariet; Wiltenburg, Daan; Beskow, Jonas; Peters, Christopher; Castellano, Ginevra; O'Sullivan, Carol; Leite, Iolanda; Kopp, Stefan

    In two small studies, we investigated how a virtual human/ robot duo can complement each other in joint interaction with one or more users. The robot takes care of turn management while the virtual human draws attention to the robot. Our results show that having the virtual human address the robot,

  7. You Look Human, But Act Like a Machine: Agent Appearance and Behavior Modulate Different Aspects of Human–Robot Interaction

    Directory of Open Access Journals (Sweden)

    Abdulaziz Abubshait

    2017-08-01

    Full Text Available Gaze following occurs automatically in social interactions, but the degree to which gaze is followed depends on whether an agent is perceived to have a mind, making its behavior socially more relevant for the interaction. Mind perception also modulates the attitudes we have toward others, and determines the degree of empathy, prosociality, and morality invested in social interactions. Seeing mind in others is not exclusive to human agents, but mind can also be ascribed to non-human agents like robots, as long as their appearance and/or behavior allows them to be perceived as intentional beings. Previous studies have shown that human appearance and reliable behavior induce mind perception to robot agents, and positively affect attitudes and performance in human–robot interaction. What has not been investigated so far is whether different triggers of mind perception have an independent or interactive effect on attitudes and performance in human–robot interaction. We examine this question by manipulating agent appearance (human vs. robot) and behavior (reliable vs. random) within the same paradigm and examine how congruent (human/reliable vs. robot/random) versus incongruent (human/random vs. robot/reliable) combinations of these triggers affect performance (i.e., gaze following) and attitudes (i.e., agent ratings) in human–robot interaction. The results show that both appearance and behavior affect human–robot interaction but that the two triggers seem to operate in isolation, with appearance more strongly impacting attitudes, and behavior more strongly affecting performance. The implications of these findings for human–robot interaction are discussed.

  8. You Look Human, But Act Like a Machine: Agent Appearance and Behavior Modulate Different Aspects of Human-Robot Interaction.

    Science.gov (United States)

    Abubshait, Abdulaziz; Wiese, Eva

    2017-01-01

    Gaze following occurs automatically in social interactions, but the degree to which gaze is followed depends on whether an agent is perceived to have a mind, making its behavior socially more relevant for the interaction. Mind perception also modulates the attitudes we have toward others, and determines the degree of empathy, prosociality, and morality invested in social interactions. Seeing mind in others is not exclusive to human agents, but mind can also be ascribed to non-human agents like robots, as long as their appearance and/or behavior allows them to be perceived as intentional beings. Previous studies have shown that human appearance and reliable behavior induce mind perception to robot agents, and positively affect attitudes and performance in human-robot interaction. What has not been investigated so far is whether different triggers of mind perception have an independent or interactive effect on attitudes and performance in human-robot interaction. We examine this question by manipulating agent appearance (human vs. robot) and behavior (reliable vs. random) within the same paradigm and examine how congruent (human/reliable vs. robot/random) versus incongruent (human/random vs. robot/reliable) combinations of these triggers affect performance (i.e., gaze following) and attitudes (i.e., agent ratings) in human-robot interaction. The results show that both appearance and behavior affect human-robot interaction but that the two triggers seem to operate in isolation, with appearance more strongly impacting attitudes, and behavior more strongly affecting performance. The implications of these findings for human-robot interaction are discussed.

  9. When Humanoid Robots Become Human-Like Interaction Partners: Corepresentation of Robotic Actions

    Science.gov (United States)

    Stenzel, Anna; Chinellato, Eris; Bou, Maria A. Tirado; del Pobil, Angel P.; Lappe, Markus; Liepelt, Roman

    2012-01-01

    In human-human interactions, corepresenting a partner's actions is crucial to successfully adjust and coordinate actions with others. Current research suggests that action corepresentation is restricted to interactions between human agents facilitating social interaction with conspecifics. In this study, we investigated whether action…

  10. Pilot Study of Person Robot Interaction in a Public Transit Space

    DEFF Research Database (Denmark)

    Svenstrup, Mikael; Bak, Thomas; Maler, Ouri

    2009-01-01

    This paper describes a study of the effect of a human interactive robot placed in an urban transit space. The underlying hypothesis is that it is possible to create interesting new living spaces and induce value in terms of experiences, information or economics, by putting socially interactive...... showed harder than expected to start interaction with commuters due to their determination and speed towards their goal. Further it was demonstrated that it was possible to track and follow people, who were not beforehand informed about the experiment. The evaluation indicated that the distance...... to initiate interaction was shorter than would be expected for normal human to human interaction....

  11. Using Language Games as a Way to Investigate Interactional Engagement in Human-Robot Interaction

    DEFF Research Database (Denmark)

    Jensen, L. C.

    2016-01-01

    how students' engagement with a social robot can be systematically investigated and evaluated. For this purpose, I present a small user study in which a robot plays a word formation game with a human, in which engagement is determined by means of an analysis of the 'language games' played...

  12. Turn-Taking Based on Information Flow for Fluent Human-Robot Interaction

    OpenAIRE

    Thomaz, Andrea L.; Chao, Crystal

    2011-01-01

    Turn-taking is a fundamental part of human communication. Our goal is to devise a turn-taking framework for human-robot interaction that, like the human skill, represents something fundamental about interaction, generic to context or domain. We propose a model of turn-taking, and conduct an experiment with human subjects to inform this model. Our findings from this study suggest that information flow is an integral part of human floor-passing behavior. Following this, we implement autonomous ...

  13. Robot initiative in a team learning task increases the rhythm of interaction but not the perceived engagement

    Science.gov (United States)

    Ivaldi, Serena; Anzalone, Salvatore M.; Rousseau, Woody; Sigaud, Olivier; Chetouani, Mohamed

    2014-01-01

    We hypothesize that the initiative of a robot during a collaborative task with a human can influence the pace of interaction, the human response to attention cues, and the perceived engagement. We propose an object learning experiment where the human interacts in a natural way with the humanoid iCub. Through a two-phase scenario, the human teaches the robot about the properties of some objects. We compare the effect of the initiator of the task in the teaching phase (human or robot) on the rhythm of the interaction in the verification phase. We measure the reaction time of the human gaze when responding to attention utterances of the robot. Our experiments show that when the robot is the initiator of the learning task, the pace of interaction is higher and the reaction to attention cues faster. Subjective evaluations suggest that the initiating role of the robot, however, does not affect the perceived engagement. Moreover, subjective and third-person evaluations of the interaction task suggest that the attentive mechanism we implemented in the humanoid robot iCub is able to arouse engagement and make the robot's behavior readable. PMID:24596554

  14. Human-inspired feedback synergies for environmental interaction with a dexterous robotic hand.

    Science.gov (United States)

    Kent, Benjamin A; Engeberg, Erik D

    2014-11-07

    Effortless control of the human hand is mediated by the physical and neural couplings inherent in the structure of the hand. This concept was explored for environmental interaction tasks with the human hand, and a novel human-inspired feedback synergy (HFS) controller was developed for a robotic hand which synchronized position and force feedback signals to mimic observed human hand motions. This was achieved by first recording the finger joint motion profiles of human test subjects, where it was observed that the subjects would extend their fingers to maintain a natural hand posture when interacting with different surfaces. The resulting human joint angle data were used as inspiration to develop the HFS controller for the anthropomorphic robotic hand, which incorporated finger abduction and force feedback in the control laws for finger extension. Experimental results showed that by projecting a broader view of the tasks at hand to each specific joint, the HFS controller produced hand motion profiles that closely mimic the observed human responses and allowed the robotic manipulator to interact with the surfaces while maintaining a natural hand posture. Additionally, the HFS controller enabled the robotic hand to autonomously traverse vertical step discontinuities without prior knowledge of the environment, visual feedback, or traditional trajectory planning techniques.
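
    The control-law structure the abstract hints at, finger extension driven by a posture error but relaxed by force feedback at contact, can be sketched for a single joint. The specific form and gains below are assumptions for illustration, not the published HFS laws:

```python
def hfs_extension_torque(q, q_nat, f_tip, q_dot=0.0, kp=2.0, kd=0.1, kf=0.5):
    """Illustrative single-joint extension law: drive the joint toward a natural
    posture q_nat, damp its velocity, and yield in proportion to the measured
    fingertip contact force f_tip so the hand conforms to the surface."""
    return kp * (q_nat - q) - kd * q_dot - kf * f_tip

# Free space (no contact): the finger extends toward its natural posture.
print(hfs_extension_torque(q=0.2, q_nat=0.5, f_tip=0.0))
# In contact: the same posture error produces a smaller extension command,
# which is how the posture can stay natural while the hand tracks a surface.
print(hfs_extension_torque(q=0.2, q_nat=0.5, f_tip=1.0))
```

    Coupling posture and force terms in one law is what lets such a controller cross a step discontinuity without a planned trajectory: the force term vanishes at the edge and the posture term takes over.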

  15. Human-inspired feedback synergies for environmental interaction with a dexterous robotic hand

    International Nuclear Information System (INIS)

    Kent, Benjamin A; Engeberg, Erik D

    2014-01-01

    Effortless control of the human hand is mediated by the physical and neural couplings inherent in the structure of the hand. This concept was explored for environmental interaction tasks with the human hand, and a novel human-inspired feedback synergy (HFS) controller was developed for a robotic hand which synchronized position and force feedback signals to mimic observed human hand motions. This was achieved by first recording the finger joint motion profiles of human test subjects, where it was observed that the subjects would extend their fingers to maintain a natural hand posture when interacting with different surfaces. The resulting human joint angle data were used as inspiration to develop the HFS controller for the anthropomorphic robotic hand, which incorporated finger abduction and force feedback in the control laws for finger extension. Experimental results showed that by projecting a broader view of the tasks at hand to each specific joint, the HFS controller produced hand motion profiles that closely mimic the observed human responses and allowed the robotic manipulator to interact with the surfaces while maintaining a natural hand posture. Additionally, the HFS controller enabled the robotic hand to autonomously traverse vertical step discontinuities without prior knowledge of the environment, visual feedback, or traditional trajectory planning techniques. (paper)

  16. Interaction force and motion estimators facilitating impedance control of the upper limb rehabilitation robot.

    Science.gov (United States)

    Mancisidor, Aitziber; Zubizarreta, Asier; Cabanes, Itziar; Bengoa, Pablo; Jung, Je Hyung

    2017-07-01

    In order to enhance the performance of rehabilitation robots, it is imperative to know both the force and the motion caused by the interaction between user and robot. However, direct measurement of both signals through force and motion sensors not only increases the complexity of the system but also impedes its affordability. As an alternative to direct measurement, in this work we present new force and motion estimators for the proper control of the upper-limb rehabilitation Universal Haptic Pantograph (UHP) robot. The estimators are based on the kinematic and dynamic model of the UHP and the use of signals measured by means of common low-cost sensors. In order to demonstrate the effectiveness of the estimators, several experimental tests were carried out. The force and impedance control of the UHP was implemented first by directly measuring the interaction force using accurate extra sensors, and the robot performance was compared to the case where the proposed estimators replace the directly measured values. The experimental results reveal that the controller based on the estimators has similar performance to that using direct measurement (less than 1 N difference in root mean square error between the two cases), indicating that the proposed force and motion estimators can facilitate the implementation of interactive controllers for the UHP in robot-mediated rehabilitation trainings.

  17. Gestalt Processing in Human-Robot Interaction: A Novel Account for Autism Research

    Directory of Open Access Journals (Sweden)

    Maya Dimitrova

    2015-12-01

    Full Text Available The paper presents a novel analysis aimed at showing that education is possible through robotic enhancement of Gestalt processing in children with autism, in a way that is not achievable with alternative educational methods such as demonstration and instruction provided solely by human tutors. The paper outlines the conceptualization of the cognitive processing of holistic representations, traditionally called Gestalt structures in psychology, emerging in the process of human-robot interaction in educational settings. Two cognitive processes are proposed in the present study - bounding and unfolding - and their role in Gestalt emergence is described. The proposed theoretical approach explains novel findings on autistic perception and gives guidelines for the design of robot assistants for the rehabilitation process.

  18. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human--Robot Interaction

    Directory of Open Access Journals (Sweden)

    Tatsuro Yamada

    2016-07-01

    Full Text Available To work cooperatively with humans by using language, robots must not only acquire a mapping between language and their behavior but also autonomously utilize the mapping in appropriate contexts of interactive tasks online. To this end, we propose a novel learning method linking language to robot behavior by means of a recurrent neural network. In this method, the network learns from correct examples of the imposed task that are given not as explicitly separated sets of language and behavior but as sequential data constructed from the actual temporal flow of the task. By doing this, the internal dynamics of the network models both language-behavior relationships and the temporal patterns of interaction. Here, "internal dynamics" refers to the time development of the system defined on the fixed-dimensional space of the internal states of the context layer. Thus, in the execution phase, by constantly representing where in the interaction context it is as its current state, the network autonomously switches between recognition and generation phases without any explicit signs and utilizes the acquired mapping in appropriate contexts. To evaluate our method, we conducted an experiment in which a robot generates appropriate behavior responding to a human's linguistic instruction. After learning, the network actually formed the attractor structure representing both language-behavior relationships and the task's temporal pattern in its internal dynamics. In the dynamics, language-behavior mapping was achieved by the branching structure. Repetition of the human's instruction and the robot's behavioral response was represented as a cyclic structure, and waiting for a subsequent instruction was represented as a fixed-point attractor. Thanks to this structure, the robot was able to interact online with a human on the given task by autonomously switching phases.

  19. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human-Robot Interaction.

    Science.gov (United States)

    Yamada, Tatsuro; Murata, Shingo; Arie, Hiroaki; Ogata, Tetsuya

    2016-01-01

    To work cooperatively with humans by using language, robots must not only acquire a mapping between language and their behavior but also autonomously utilize the mapping in appropriate contexts of interactive tasks online. To this end, we propose a novel learning method linking language to robot behavior by means of a recurrent neural network. In this method, the network learns from correct examples of the imposed task that are given not as explicitly separated sets of language and behavior but as sequential data constructed from the actual temporal flow of the task. By doing this, the internal dynamics of the network models both language-behavior relationships and the temporal patterns of interaction. Here, "internal dynamics" refers to the time development of the system defined on the fixed-dimensional space of the internal states of the context layer. Thus, in the execution phase, by constantly representing where in the interaction context it is as its current state, the network autonomously switches between recognition and generation phases without any explicit signs and utilizes the acquired mapping in appropriate contexts. To evaluate our method, we conducted an experiment in which a robot generates appropriate behavior responding to a human's linguistic instruction. After learning, the network actually formed the attractor structure representing both language-behavior relationships and the task's temporal pattern in its internal dynamics. In the dynamics, language-behavior mapping was achieved by the branching structure. Repetition of the human's instruction and the robot's behavioral response was represented as a cyclic structure, and waiting for a subsequent instruction was represented as a fixed-point attractor. Thanks to this structure, the robot was able to interact online with a human on the given task by autonomously switching phases.
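
    The core mechanism, a context layer whose state carries the interaction history so that one network handles both recognition and generation without explicit phase signals, can be sketched as a forward pass through an Elman-style recurrent cell. The sizes and (random) weights here are placeholders; in the paper the weights are learned from sequential language-behavior data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 5, 16, 3  # token vocabulary, context layer, motor command
W_in = rng.normal(0.0, 0.5, (n_hid, n_in))
W_rec = rng.normal(0.0, 0.5, (n_hid, n_hid))
W_out = rng.normal(0.0, 0.5, (n_out, n_hid))

def run(sequence):
    """Map an input sequence to an output sequence through a recurrent state."""
    h = np.zeros(n_hid)            # context-layer state; evolves over the task
    outputs = []
    for x in sequence:
        h = np.tanh(W_in @ x + W_rec @ h)  # state folds in the new input
        outputs.append(W_out @ h)          # behavior command at this step
    return np.array(outputs)

instruction = [np.eye(n_in)[t] for t in (0, 2, 4)]  # one-hot word tokens
behavior = run(instruction)
print(behavior.shape)  # (3, 3)
```

    Because the state h persists across steps, the same forward pass can represent where in the interaction the system is, which is what the attractor analysis in the abstract examines after training.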

  20. Proxemics models for human-aware navigation in robotics: Grounding interaction and personal space models in experimental data from psychology

    OpenAIRE

    Barnaud , Marie-Lou; Morgado , Nicolas; Palluel-Germain , Richard; Diard , Julien; Spalanzani , Anne

    2014-01-01

    International audience; In order to navigate in a social environment, a robot must be aware of social spaces, which include proximity and interaction-based constraints. Previous models of interaction and personal spaces have been inspired by studies in social psychology but not systematically grounded and validated with respect to experimental data. We propose to implement personal and interaction space models in order to replicate a classical psychology experiment. Our robotic simulations ca...

  1. A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

    OpenAIRE

    Rueben, Matthew; Grimm, Cindy M.; Bernieri, Frank J.; Smart, William D.

    2017-01-01

    The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.

  2. Motor contagion during human-human and human-robot interaction.

    Science.gov (United States)

    Bisio, Ambra; Sciutti, Alessandra; Nori, Francesco; Metta, Giorgio; Fadiga, Luciano; Sandini, Giulio; Pozzo, Thierry

    2014-01-01

    Motor resonance mechanisms are known to affect humans' ability to interact with others, yielding the kind of "mutual understanding" that is the basis of social interaction. However, it remains unclear how the partner's action features combine or compete to promote or prevent motor resonance during interaction. To clarify this point, the present study tested whether and how the nature of the visual stimulus and the properties of the observed actions influence the observer's motor response, motor contagion being one of the behavioral manifestations of motor resonance. Participants observed a humanoid robot and a human agent move their hands into a pre-specified final position or put an object into a container at various velocities. Their movements, in both the object- and non-object-directed conditions, were characterized by either a smooth/curvilinear or a jerky/segmented trajectory. These trajectories were covered with biological or non-biological kinematics (the latter only by the humanoid robot). After action observation, participants were requested to either reach the indicated final position or to transport a similar object into another container. Results showed that motor contagion appeared for both interactive partners except when the humanoid robot violated the biological laws of motion. These findings suggest that the observer may transiently match his/her own motor repertoire to that of the observed agent. This matching might mediate the activation of motor resonance, and modulate the spontaneity and the pleasantness of the interaction, whatever the nature of the communication partner.

  3. Motor contagion during human-human and human-robot interaction.

    Directory of Open Access Journals (Sweden)

    Ambra Bisio

    Full Text Available Motor resonance mechanisms are known to affect humans' ability to interact with others, yielding the kind of "mutual understanding" that is the basis of social interaction. However, it remains unclear how the partner's action features combine or compete to promote or prevent motor resonance during interaction. To clarify this point, the present study tested whether and how the nature of the visual stimulus and the properties of the observed actions influence the observer's motor response, motor contagion being one of the behavioral manifestations of motor resonance. Participants observed a humanoid robot and a human agent move their hands into a pre-specified final position or put an object into a container at various velocities. Their movements, in both the object- and non-object-directed conditions, were characterized by either a smooth/curvilinear or a jerky/segmented trajectory. These trajectories were covered with biological or non-biological kinematics (the latter only by the humanoid robot). After action observation, participants were requested to either reach the indicated final position or to transport a similar object into another container. Results showed that motor contagion appeared for both interactive partners except when the humanoid robot violated the biological laws of motion. These findings suggest that the observer may transiently match his/her own motor repertoire to that of the observed agent. This matching might mediate the activation of motor resonance, and modulate the spontaneity and the pleasantness of the interaction, whatever the nature of the communication partner.

  4. Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction

    Directory of Open Access Journals (Sweden)

    Shishkin S. L.

    2017-09-01

    Full Text Available Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain single output channels with their typical constraints imposed by the motor system’s information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of “eye mouse” superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer (EBCI) interface will have a chance to enable natural, fluent, and the

  5. Social interaction in robotic agents emulating the mirror neuron function

    NARCIS (Netherlands)

    Barakova, E.I.; Mira, J.; Álvarez, J.R.

    2007-01-01

    Emergent interactions that are expressed by the movements of two agents are discussed in this paper. The common coding principle is used to show how the mirror neuron system may facilitate interaction behaviour. Synchronization between neuron groups in different structures of the mirror neuron

  6. Predicting the long-term effects of human-robot interaction: a reflection on responsibility in medical robotics.

    Science.gov (United States)

    Datteri, Edoardo

    2013-03-01

    This article addresses prospective and retrospective responsibility issues connected with medical robotics. It will be suggested that extant conceptual and legal frameworks are sufficient to address and properly settle most retrospective responsibility problems arising in connection with injuries caused by robot behaviours (which will be exemplified here by reference to harms occurred in surgical interventions supported by the Da Vinci robot, reported in the scientific literature and in the press). In addition, it will be pointed out that many prospective responsibility issues connected with medical robotics are nothing but well-known robotics engineering problems in disguise, which are routinely addressed by roboticists as part of their research and development activities: for this reason they do not raise particularly novel ethical issues. In contrast with this, it will be pointed out that novel and challenging prospective responsibility issues may emerge in connection with harmful events caused by normal robot behaviours. This point will be illustrated here in connection with the rehabilitation robot Lokomat.

  7. Instrumented Compliant Wrist with Proximity and Contact Sensing for Close Robot Interaction Control

    Directory of Open Access Journals (Sweden)

    Pascal Laferrière

    2017-06-01

    Full Text Available Compliance has been exploited in various forms in robotic systems to allow rigid mechanisms to come into contact with fragile objects, or with complex shapes that cannot be accurately modeled. Force feedback control has been the classical approach for providing compliance in robotic systems. However, by integrating other forms of instrumentation with compliance into a single device, it is possible to extend close monitoring of nearby objects before and after contact occurs. As a result, safer and smoother robot control can be achieved both while approaching and while touching surfaces. This paper presents the design and extensive experimental evaluation of a versatile, lightweight, and low-cost instrumented compliant wrist mechanism which can be mounted on any rigid robotic manipulator in order to introduce a layer of compliance while providing the controller with extra sensing signals during close interaction with an object’s surface. Arrays of embedded range sensors provide real-time measurements on the position and orientation of surfaces, either located in proximity or in contact with the robot’s end-effector, which permits close guidance of its operation. Calibration procedures are formulated to overcome inter-sensor variability and achieve the highest available resolution. A versatile solution is created by embedding all signal processing, while wireless transmission connects the device to any industrial robot’s controller to support path control. Experimental work demonstrates the device’s physical compliance as well as the stability and accuracy of the device outputs. Primary applications of the proposed instrumented compliant wrist include smooth surface following in manufacturing, inspection, and safe human-robot interaction.
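
    The inter-sensor calibration mentioned above can be approximated by a per-sensor linear least-squares fit of raw readings against reference distances. The readings and reference values below are invented placeholders, not data from the device:

```python
def calibrate(raw, truth):
    """Least-squares linear fit mapping one range sensor's raw readings to
    reference distances, correcting per-sensor gain/offset variability."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(truth) / n
    gain = sum((x - mx) * (y - my) for x, y in zip(raw, truth)) / \
           sum((x - mx) ** 2 for x in raw)
    offset = my - gain * mx
    return gain, offset

raw   = [102, 205, 298, 401]      # hypothetical raw sensor counts
truth = [10.0, 20.0, 30.0, 40.0]  # reference distances (mm)
g, b = calibrate(raw, truth)
corrected = [g * x + b for x in raw]  # calibrated distance estimates
```

    Fitting each sensor in the array separately is what removes the inter-sensor variability; the same gain/offset pair is then applied at run time.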

  8. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction

    Directory of Open Access Journals (Sweden)

    Bo Zhou

    2017-11-01

    Full Text Available In this paper, we developed a fully textile sensing fabric for tactile touch sensing as the robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm² area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best performing feature-classifier combination can recognize the gestures with 93.3% accuracy for a known group of participants, and 89.1% for strangers.

  9. Modelling engagement in dementia through behaviour. Contribution for socially interactive robotics.

    Science.gov (United States)

    Perugia, Giulia; Diaz Doladeras, Marta; Mallofre, Andreu Catala; Rauterberg, Matthias; Barakova, Emilia

    2017-07-01

    In this paper, we present a novel tool to measure engagement in people with dementia playing board games and interacting with a social robot, Pleo. We carried out two studies to reach a comprehensive inventory of behaviours accounting for engagement in dementia. The first one is an exploratory study aimed at modelling engagement in cognitive board games. The second one is a longitudinal study investigating how people with dementia express engagement in cognitive games and in interactions with social robots. We adopted a technique from ethology for modelling behaviour: the ethogram. An ethogram is founded on low-level behaviours and allows hierarchical structuring. Herein, we present preliminary results consisting of the description of two ethograms and of their structuring obtained through thematic analysis. These results show that an underlying structure of engagement exists across activities, and that different activities trigger different behavioural displays of engagement that adhere to such a structure.

  10. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction.

    Science.gov (United States)

    Zhou, Bo; Altamirano, Carlos Andres Velez; Zurian, Heber Cruz; Atefi, Seyed Reza; Billing, Erik; Martinez, Fernando Seoane; Lukowicz, Paul

    2017-11-09

    In this paper, we developed a fully textile sensing fabric for tactile touch sensing as the robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm² area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best performing feature-classifier combination can recognize the gestures with 93.3% accuracy for a known group of participants, and 89.1% for strangers.
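
    The two-stage pipeline described (spatial reduction of each frame to a few descriptors, then temporal statistics and wavelet analysis) can be sketched as follows. The frame size, activity threshold, and data below are invented placeholders:

```python
def frame_descriptors(frame):
    """Reduce one pressure frame (a list of rows) to a few scalars,
    cutting spatial complexity before any temporal analysis."""
    flat = [v for row in frame for v in row]
    return {
        "mean": sum(flat) / len(flat),
        "peak": max(flat),
        "active": sum(1 for v in flat if v > 0.1),  # hypothetical threshold
    }

def haar_level(signal):
    """One level of a Haar wavelet transform: per-pair averages
    (approximation) and differences (detail) over the time series."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Two toy 2x2 "frames" over time (the real sensor yields 20x20 at 50 Hz).
frames = [[[0.0, 0.2], [0.5, 0.0]], [[0.1, 0.6], [0.7, 0.2]]]
means = [frame_descriptors(f)["mean"] for f in frames]
approx, detail = haar_level(means)
print(approx, detail)
```

    Statistics and wavelet coefficients of such descriptor series then form the feature vector fed to the classifiers compared in the paper.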

  11. Bio-Inspired Interaction Control of Robotic Machines for Motor Therapy

    OpenAIRE

    Zollo, Loredana; Formica, Domenico; Guglielmelli, Eugenio

    2007-01-01

    In this chapter basic criteria for the design and implementation of interaction control of robotic machines for motor therapy have been briefly introduced and two bio-inspired compliance control laws developed by the authors to address requirements coming from this specific application field have been presented. The two control laws are named the coactivation-based compliance control in the joint space and the torque-dependent compliance control in the joint space, respectively. They try to o...

  12. Autonomous construction using scarce resources in unknown environments - Ingredients for an intelligent robotic interaction with the physical world

    OpenAIRE

    Magnenat, Stéphane; Philippsen, Roland; Mondada, Francesco

    2012-01-01

    The goal of creating machines that autonomously perform useful work in a safe, robust and intelligent manner continues to motivate robotics research. Achieving this autonomy requires capabilities for understanding the environment, physically interacting with it, predicting the outcomes of actions and reasoning with this knowledge. Such intelligent physical interaction was at the centre of early robotic investigations and remains an open topic. In this paper, we build on the fruit of decades ...

  13. Effects of a Socially Interactive Robot on the Conversational Turns between Parents and Their Young Children with Autism. Social Robots Research Reports, Number 6

    Science.gov (United States)

    Dunst, Carl J.; Hamby, Deborah W.; Trivette, Carol M.; Prior, Jeremy; Derryberry, Graham

    2013-01-01

    The effects of a socially interactive robot on the conversational turns between four young children with autism and their mothers were investigated as part of the intervention study described in this research report. The interventions with each child were conducted over 4 or 5 days in the children's homes where a practitioner facilitated…

  14. Effects of Child-Robot Interactions on the Vocalization Production of Young Children with Disabilities. Social Robots. Research Reports, Number 4

    Science.gov (United States)

    Dunst, Carl J.; Trivette, Carol M.; Hamby, Deborah W.; Prior, Jeremy; Derryberry, Graham

    2013-01-01

    Findings from two studies investigating the effects of a socially interactive robot on the vocalization production of young children with disabilities are reported. The two studies included seven children with autism, two children with Down syndrome, and two children with attention deficit disorders. The Language ENvironment Analysis (LENA)…

  15. The Planning of Straight Line Trajectory in Robotics Using Interactive Graphics

    Directory of Open Access Journals (Sweden)

    Kesheng Wang

    1987-07-01

    Full Text Available The planning of straight-line trajectories using interactive computer graphics simulation of robot manipulator movement is discussed. This new approach to straight-line motion planning improves the 'bounded deviation joint paths' developed by R. M. Taylor (1979). The new approach has three characteristics: (1) linear interpolation in joint space; (2) unequal intervals for interpolating knot points; (3) using interactive computer graphics to assure that the maximum deviation in the whole segment is less than the pre-specified values. The structure and mathematical basis of a computer program developed for this purpose are presented.
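
    The bounded-deviation idea, inserting joint-space knot points at unequal intervals until the Cartesian path stays within a tolerance, can be sketched for a planar two-link arm. Link lengths, tolerance, and configurations here are illustrative assumptions:

```python
import math

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-link arm (link lengths assumed)."""
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return x, y

def deviation(qa, qb):
    """Cartesian distance between the midpoint of a joint-space segment
    and the midpoint of the straight line between its endpoints."""
    qm = [(a + b) / 2 for a, b in zip(qa, qb)]
    xa, ya = fk(qa)
    xb, yb = fk(qb)
    xm, ym = fk(qm)
    return math.hypot(xm - (xa + xb) / 2, ym - (ya + yb) / 2)

def plan(qa, qb, tol=1e-3):
    """Insert knot points (at unequal intervals) until every joint-space
    segment stays within `tol` of the straight Cartesian path."""
    if deviation(qa, qb) <= tol:
        return [qa, qb]
    qm = [(a + b) / 2 for a, b in zip(qa, qb)]
    return plan(qa, qm, tol)[:-1] + plan(qm, qb, tol)

knots = plan([0.0, 0.5], [1.0, 1.5])
print(len(knots))
```

    Segments crossing strongly curved regions of the workspace get subdivided more, which is what produces the unequal knot intervals.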

  16. An Exploration of the Benefits of an Animallike Robot Companion with More Advanced Touch Interaction Capabilities for Dementia Care

    Directory of Open Access Journals (Sweden)

    Merel M. Jung

    2017-06-01

    Full Text Available Animallike robot companions such as robotic seal Paro are increasingly used in dementia care due to the positive effects that interaction with these robots can have on the well-being of these patients. Touch is one of the most important interaction modalities for patients with dementia and can be a natural way to interact with animallike robots. To advance the development of animallike robots, we explored in what ways people with dementia could benefit from interaction with an animallike robot with more advanced touch recognition capabilities and which touch gestures would be important in their interaction with Paro. In addition, we explored which other target groups might benefit from interaction with animallike robots with more advanced interaction capabilities. In this study, we administered a questionnaire and conducted interviews with two groups of health-care providers who all worked in a geriatric psychiatry department. One group used Paro in their work (i.e., the expert group; n = 5) while the other group had no experience with the use of animallike robots (i.e., the layman group; n = 4). The results showed that health-care providers perceived Paro as an effective intervention to improve the well-being of people with dementia. Examples of usages for Paro that were mentioned were providing distraction, interrupting problematic behaviors, and stimulating communication. Furthermore, the care providers indicated that people with dementia (would) use mostly positive forms of touch and speech to interact with Paro. Paro’s auditory responses were criticized because they can overstimulate the patients. In addition, the care providers argued that social interactions with Paro are currently limited and therefore the robot does not meet the needs of a broader audience such as healthy elderly people who still live in their own homes. The development of robot pets with more advanced social capabilities such as touch and speech recognition might

  17. Design of robotic leg and physiotherapy (ROLEP) assist with interactive game

    Science.gov (United States)

    Hasan, A. F.; Husin, M. F. Che; Hashim, M. N.; Rosli, K. A.; Roslim, F. R. A.; Abidin, A. F. Z.

    2017-09-01

    Injuries to certain parts of the feet can cause a person to have difficulty walking or running if not treated through physiotherapy. In Malaysia, therapy centers typically offer only basic services and tools, which are not efficient, because more sophisticated equipment comes at a high cost; exercises requiring close monitoring by a physiotherapist are likewise expensive. Robot-assisted therapy is a new technology that can provide an alternative way to solve this problem. This project has produced a robotic physiotherapy device with one degree of freedom that is portable and inexpensive and assists the movement of the patient's leg. It covers basic electrical circuits, mechanical components, and programming, combined with an interactive game as the main driver. ROLEP (Robotic-Leg-Physiotherapy) is able to help patients through the therapy process. It was built using a CT-UNO as its microprocessor connected to an MD10-C motor driver. The interactive game, produced with the Unity game engine, is a key element in relieving boredom and reducing pain. The resulting ROLEP design operates well within its rated range of patient weight. It has the advantage of being portable and easy for patients to use. ROLEP is expected to help patients undergo the therapy process more efficiently and make the recovery process more engaging.

  18. Detecting Biological Motion for Human–Robot Interaction: A Link between Perception and Action

    Directory of Open Access Journals (Sweden)

    Alessia Vignolo

    2017-06-01

    Full Text Available One of the fundamental skills supporting safe and comfortable interaction between humans is their capability to understand intuitively each other’s actions and intentions. At the basis of this ability is a special-purpose visual processing that human brain has developed to comprehend human motion. Among the first “building blocks” enabling the bootstrapping of such visual processing is the ability to detect movements performed by biological agents in the scene, a skill mastered by human babies in the first days of their life. In this paper, we present a computational model based on the assumption that such visual ability must be based on local low-level visual motion features, which are independent of shape, such as the configuration of the body and perspective. Moreover, we implement it on the humanoid robot iCub, embedding it into a software architecture that leverages the regularities of biological motion also to control robot attention and oculomotor behaviors. In essence, we put forth a model in which the regularities of biological motion link perception and action enabling a robotic agent to follow a human-inspired sensory-motor behavior. We posit that this choice facilitates mutual understanding and goal prediction during collaboration, increasing the pleasantness and safety of the interaction.

  19. What makes robots social? : A user’s perspective on characteristics for social human-robot interaction

    NARCIS (Netherlands)

    de Graaf, M.M.A.; Ben Allouch, Soumaya

    2015-01-01

    A common description of a social robot is for it to be capable of communicating in a humanlike manner. However, a description of what communicating in a ‘humanlike manner’ means often remains unspecified. This paper provides a set of social behaviors and certain specific features social robots

  20. Can We Talk to Robots? Ten-Month-Old Infants Expected Interactive Humanoid Robots to Be Talked to by Persons

    Science.gov (United States)

    Arita, A.; Hiraki, K.; Kanda, T.; Ishiguro, H.

    2005-01-01

    As technology advances, many human-like robots are being developed. Although these humanoid robots should be classified as objects, they share many properties with human beings. This raises the question of how infants classify them. Based on the looking-time paradigm used by [Legerstee, M., Barna, J., & DiAdamo, C., (2000). Precursors to the…

  1. Robust Control of a Cable-Driven Soft Exoskeleton Joint for Intrinsic Human-Robot Interaction.

    Science.gov (United States)

    Jarrett, C; McDaid, A J

    2017-07-01

    A novel, cable-driven soft joint is presented for use in robotic rehabilitation exoskeletons to provide intrinsic, comfortable human-robot interaction. The torque-displacement characteristics of the soft elastomeric core contained within the joint are modeled. This knowledge is used in conjunction with a dynamic system model to derive a sliding mode controller (SMC) to implement low-level torque control of the joint. The SMC controller is experimentally compared with a baseline feedback-linearised proportional-derivative controller across a range of conditions and shown to be robust to un-modeled disturbances. The torque controller is then tested with six healthy subjects while they performed a selection of activities of daily living, validating its range of performance. Finally, a case study with a participant with spastic cerebral palsy is presented to illustrate the potential of both the joint and controller to be used in a physiotherapy setting to assist clinical populations.
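
    As a sketch of the low-level torque control idea, a sliding mode controller can be reduced to a switching law on the tracking error, smoothed by a boundary layer to limit chattering. The first-order plant, gains, and boundary-layer width below are invented for illustration and are much simpler than the joint model in the paper:

```python
import math

def smc_torque_step(tau, tau_des, dt, K=5.0, phi=0.05):
    """One step of a sketch sliding mode controller driving measured
    torque `tau` toward `tau_des`. A tanh boundary layer (width `phi`)
    smooths the switching term to limit chattering; gains are illustrative."""
    s = tau_des - tau           # sliding variable (torque tracking error)
    u = K * math.tanh(s / phi)  # robust switching control, smoothed
    return tau + u * dt         # toy first-order actuator response

tau, tau_des, dt = 0.0, 1.0, 0.01
for _ in range(200):
    tau = smc_torque_step(tau, tau_des, dt)
print(round(tau, 2))
```

    Inside the boundary layer the controller behaves like a high-gain proportional law, which is what trades a small steady-state band for chatter-free actuation.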

  2. Identifying Factors Reinforcing Robotization: Interactive Forces of Employment, Working Hour and Wage

    OpenAIRE

    Joonmo Cho; Jinha Kim

    2018-01-01

    Unlike previous studies on robotization approaching the future based on the cutting-edge technologies and adopting a framework where robotization is considered as an exogenous variable, this study considers that robotization occurs endogenously and uses it as a dependent variable for an objective examination of the effect of robotization on the labor market. To this end, a robotization indicator is created based on the actual number of industrial robots currently deployed in workplaces, and a...

  3. A Framework for Interactive Teaching of Virtual Borders to Mobile Robots

    OpenAIRE

    Sprute, Dennis; Rasch, Robin; Tönnies, Klaus; König, Matthias

    2017-01-01

    The increasing number of robots in home environments leads to an emerging coexistence between humans and robots. Robots undertake common tasks and support the residents in their everyday life. People appreciate the presence of robots in their environment as long as they keep the control over them. One important aspect is the control of a robot's workspace. Therefore, we introduce virtual borders to precisely and flexibly define the workspace of mobile robots. First, we propose a novel framewo...

  4. Children with Autism Spectrum Disorders Make a Fruit Salad with Probo, the Social Robot: An Interaction Study.

    Science.gov (United States)

    Simut, Ramona E; Vanderfaeillie, Johan; Peca, Andreea; Van de Perre, Greet; Vanderborght, Bram

    2016-01-01

    Social robots are thought to be motivating tools in play tasks with children with autism spectrum disorders. Thirty children with autism were included using a repeated-measurements design. It was investigated whether the children's interaction with a human differed from their interaction with a social robot during a play task. It was also examined whether the two conditions differed in their ability to elicit interaction with a human accompanying the child during the task. Interaction of the children with the two partners did not differ apart from eye contact: participants made more eye contact with the social robot than with the human. The conditions did not differ regarding the interaction elicited with the human accompanying the child.

  5. PARO robot affects diverse interaction modalities in group sensory therapy for older adults with dementia.

    Science.gov (United States)

    Šabanović, Selma; Bennett, Casey C; Chang, Wan-Ling; Huber, Lesa

    2013-06-01

    We evaluated the seal-like robot PARO in the context of multi-sensory behavioral therapy in a local nursing home. Participants were 10 elderly nursing home residents with varying levels of dementia. We report three principal findings from our observations of interactions between the residents, PARO, and a therapist during seven weekly therapy sessions. Firstly, we show PARO provides indirect benefits for users by increasing their activity in particular modalities of social interaction, including visual, verbal, and physical interaction, which vary between primary and non-primary interactors. Secondly, PARO's positive effects on older adults' activity levels show steady growth over the duration of our study, suggesting they are not due to short-term "novelty effects." Finally, we show a variety of ways in which individual participants interacted with PARO and relate this to the "interpretive flexibility" of its design.

  6. I Show You How I Like You: Emotional Human-Robot Interaction through Facial Expression and Tactile Stimulation

    DEFF Research Database (Denmark)

    Fredslund, Jakob; Cañamero, Lola D.

    2001-01-01

    We report work on a LEGO robot capable of displaying several emotional expressions in response to physical contact. Our motivation has been to explore believable emotional exchanges to achieve plausible interaction with a simple robot. We have worked toward this goal in two ways. First, acknowledging the importance of physical manipulation in children's interactions, interaction with the robot is through tactile stimulation; the various kinds of stimulation that can elicit the robot's emotions are grounded in a model of emotion activation based on different stimulation patterns. Second, emotional states need to be clearly conveyed. We have drawn inspiration from theories of human basic emotions with associated universal facial expressions, which we have implemented in a caricaturized face. We have conducted experiments on both children and adults to assess the recognizability

  7. Human-robot interaction: kinematics and muscle activity inside a powered compliant knee exoskeleton.

    Science.gov (United States)

    Knaepen, Kristel; Beyl, Pieter; Duerinck, Saartje; Hagman, Friso; Lefeber, Dirk; Meeusen, Romain

    2014-11-01

    Until today it is not entirely clear how humans interact with automated gait rehabilitation devices and how we can, based on that interaction, maximize the effectiveness of these exoskeletons. The goal of this study was to gain knowledge on the human-robot interaction, in terms of kinematics and muscle activity, between a healthy human motor system and a powered knee exoskeleton (i.e., KNEXO). Therefore, temporal and spatial gait parameters, human joint kinematics, exoskeleton kinetics and muscle activity during four different walking trials in 10 healthy male subjects were studied. Healthy subjects can walk with KNEXO in patient-in-charge mode with some slight constraints in kinematics and muscle activity primarily due to inertia of the device. Yet, during robot-in-charge walking the muscular constraints are reversed by adding positive power to the leg swing, compensating in part for this inertia. In addition, KNEXO accurately records and replays the right knee kinematics, meaning that subject-specific trajectories can be implemented as a target trajectory during assisted walking. No significant differences in the human response to the interaction with KNEXO in low and high compliant assistance could be pointed out. This is in contradiction with our hypothesis that muscle activity would decrease with increasing assistance. It seems that the differences between the parameter settings of low and high compliant control might not be sufficient to observe clear effects in healthy subjects. Moreover, we should take into account that KNEXO is a unilateral, 1 degree-of-freedom device.

  8. Accompany: Acceptable robotiCs COMPanions for AgeiNG Years - Multidimensional aspects of human-system interactions

    OpenAIRE

    Amirabdollahian F.; Op Den Akker R.; Bedaf S.; Bormann R.; Draper H.; Evers V.; Gelderblom G.J.; Ruiz C.G.; Hewson D.; Hu N.

    2013-01-01

    With changes in life expectancy across the world, technologies enhancing well-being of individuals, specifically for older people, are subject to a new stream of research and development. In this paper we present the ACCOMPANY project, a pan-European project which focuses on home companion technologies. The project aims to progress beyond the state of the art in multiple areas such as empathic and social human-robot interaction, robot learning and memory visualisation, monitoring persons and...

  9. Exploring Host-Microbiome Interactions using an in Silico Model of Biomimetic Robots and Engineered Living Cells

    OpenAIRE

    Keith C. Heyde; Warren C. Ruder

    2015-01-01

    The microbiome's underlying dynamics play an important role in regulating the behavior and health of its host. In order to explore the details of these interactions, we created an in silico model of a living microbiome, engineered with synthetic biology, that interfaces with a biomimetic, robotic host. By analytically modeling and computationally simulating engineered gene networks in these commensal communities, we reproduced complex behaviors in the host. We observed that robot movements de...

  10. Hand Gesture Modeling and Recognition for Human and Robot Interactive Assembly Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Fei Chen

    2015-04-01

    Full Text Available Gesture recognition is essential for human and robot collaboration. Within an industrial hybrid assembly cell, the performance of such a system significantly affects the safety of human workers. This work presents an approach to recognizing hand gestures accurately during an assembly task while in collaboration with a robot co-worker. We have designed and developed a sensor system for measuring natural human-robot interactions. The position and rotation information of a human worker's hands and fingertips are tracked in 3D space while completing a task. A modified chain-code method is proposed to describe the motion trajectory of the measured hands and fingertips. The Hidden Markov Model (HMM) method is adopted to recognize patterns in the data streams and identify workers' gesture patterns and assembly intentions. The effectiveness of the proposed system is verified by experimental results. The outcome demonstrates that the proposed system is able to automatically segment the data streams and recognize the gesture patterns with reasonable accuracy.
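    The pipeline described in this record, trajectory symbols produced by a chain code and scored by discrete HMMs, can be sketched in miniature. The 8-direction chain code and the forward-algorithm scorer below are a minimal illustration; the transition and emission tables that a real system would train per gesture are assumptions for the example, not the authors' parameters.

```python
import math
import numpy as np

def chain_code(points):
    """Quantize a 2D trajectory into 8-direction chain-code symbols (0..7)."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(int(round(angle / (math.pi / 4))) % 8)
    return codes

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a symbol sequence under a discrete HMM,
    computed with the scaled forward algorithm."""
    alpha = start * emit[:, obs[0]]
    log_p = math.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        c = alpha.sum()
        log_p += math.log(c)
        alpha = alpha / c
    return log_p
```

    Classification would then pick the gesture whose trained HMM assigns the highest log-likelihood to the observed chain-code sequence.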

  11. Tema 2: The NAO robot as a Persuasive Educational and Entertainment Robot (PEER) – a case study on children’s articulation, categorization and interaction with a social robot for learning

    Directory of Open Access Journals (Sweden)

    Lykke Brogaard Bertel

    2016-01-01

    Full Text Available The application of social robots as motivational tools and companions in education is increasingly being explored from a theoretical and practical point of view. In this paper, we examine the social robot NAO as a Persuasive Educational and Entertainment Robot (PEER) and present findings from a case study on the use of NAO to support learning environments in Danish primary schools. In the case study we focus on the children’s practice of articulation and embodied interaction with NAO and investigate the role of NAO as a ‘tool’, ‘social actor’ or ‘simulating medium’ in the learning designs. We examine whether this categorization is static or dynamic, i.e., develops and changes over the course of the interaction, and explore how this relates to and affects the students’ motivation to engage in the NAO-supported learning activities.

  13. Exploring the acquisition and production of grammatical constructions through human-robot interaction with echo state networks.

    Science.gov (United States)

    Hinaut, Xavier; Petit, Maxime; Pointeau, Gregoire; Dominey, Peter Ford

    2014-01-01

    One of the principal functions of human language is to allow people to coordinate joint action. This includes the description of events, requests for action, and their organization in time. A crucial component of language acquisition is learning the grammatical structures that allow the expression of such complex meaning related to physical events. The current research investigates the learning of grammatical constructions and their temporal organization in the context of human-robot physical interaction with the embodied sensorimotor humanoid platform, the iCub. We demonstrate three noteworthy phenomena. First, a recurrent network model is used in conjunction with this robotic platform to learn the mappings between grammatical forms and predicate-argument representations of meanings related to events, and the robot's execution of these events in time. Second, this learning mechanism functions in the inverse sense, i.e., in a language production mode, where rather than executing commanded actions, the robot will describe the results of human generated actions. Finally, we collect data from naïve subjects who interact with the robot via spoken language, and demonstrate significant learning and generalization results. This allows us to conclude that such a neural language learning system not only helps to characterize and understand some aspects of human language acquisition, but also that it can be useful in adaptive human-robot interaction.
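    The recurrent model in this record is an echo state network: a fixed random reservoir whose only trained part is a linear readout. A minimal sketch follows; the reservoir size, input scaling, spectral-radius rescaling and the delay-recall task are illustrative assumptions, not the paper's grammatical-construction setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 50, 1

# Fixed random reservoir, rescaled to spectral radius < 1 (echo state property).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = 0.5 * rng.standard_normal((n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x n_in); return states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x)
    return np.asarray(states)

# Train only the linear readout with ridge regression (the reservoir stays fixed).
u = rng.standard_normal((200, n_in))
y = np.roll(u[:, 0], 1)          # toy target: recall the previous input
X = run_reservoir(u)[20:]        # discard a short washout period
Y = y[20:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)
pred = X @ W_out
```

    In the paper the readout instead maps reservoir states driven by word sequences to predicate-argument meaning representations, but the train-only-the-readout principle is the same.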

  14. Online Assessment of Human-Robot Interaction for Hybrid Control of Walking

    Directory of Open Access Journals (Sweden)

    Ana de-los-Reyes

    2011-12-01

    Full Text Available Restoration of the walking ability of Spinal Cord Injury subjects can be achieved by different approaches, such as the use of robotic exoskeletons or electrical stimulation of the user’s muscles. The combined (hybrid) approach has the potential to overcome the drawbacks of each individual approach. Specific challenges must be addressed with specific sensory systems and control strategies. In this paper we present a system and a procedure to estimate muscle fatigue from online physical interaction assessment in order to provide hybrid control of walking, based on the performance of the muscles under stimulation.

  15. Interaction learning for dynamic movement primitives used in cooperative robotic tasks

    DEFF Research Database (Denmark)

    Kulvicius, Tomas; Biehl, Martin; Aein, Mohamad Javad

    2013-01-01

    Abstract For several years, dynamic movement primitives (DMPs) have been attracting growing interest for flexible movement control in robotics. In this study we introduce sensory feedback together with a predictive learning mechanism which allows tightly coupled dual-agent systems...... to learn an adaptive, sensor-driven interaction based on DMPs. The coupled conventional (no-sensors, no-learning) DMP system automatically equilibrates and can still be solved analytically, allowing us to derive conditions for stability. When adding adaptive sensor control we can show that both agents learn
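    The DMP at the core of this record is a goal-directed spring-damper system modulated by a phase-dependent forcing term. A minimal single-DoF rollout is sketched below; the gains, phase constant and Euler integration are illustrative assumptions, and the sensory coupling that the paper adds is omitted.

```python
def dmp_rollout(x0, g, tau=1.0, K=100.0, D=20.0, f=None, dt=0.01, T=2.0):
    """Integrate one DMP transformation system:
    tau*dv = K*(g - x) - D*v + f(s),  tau*dx = v,  phase: tau*ds = -a*s.
    With D = 2*sqrt(K) the unforced system is critically damped."""
    a = 4.0                       # phase decay constant (illustrative)
    x, v, s = x0, 0.0, 1.0
    traj = []
    for _ in range(int(T / dt)):
        force = f(s) if f else 0.0
        dv = (K * (g - x) - D * v + force) / tau
        v += dv * dt
        x += (v / tau) * dt
        s += (-a * s / tau) * dt
        traj.append(x)
    return traj
```

    With `f=None` the rollout converges smoothly to the goal `g`; learning shapes `f(s)` to reproduce (and here, to adapt) a demonstrated trajectory on the way there.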

  16. Dynamic characterization of contact interactions of micro-robotic leg structures

    Science.gov (United States)

    Ryou, Jeong Hoon; Oldham, Kenn Richard

    2014-05-01

    Contact dynamics of microelectromechanical systems (MEMS) are typically complicated and it is consequently difficult to model all dynamic characteristics observed in time-domain responses involving impact. This issue becomes worse when a device, such as a mobile micro-robot, is not clamped to a substrate and has a complex mechanical structure. To characterize such a contact interaction situation, two walking micro-robot prototypes are tested having intentionally simple structures with different dimensions (21.2 mm × 16.3 mm × 0.75 mm and 32 mm × 25.4 mm × 4.1 mm) and weights (0.16 and 2.7 g). Contact interaction behaviors are characterized by analyzing experimental data under various excitation signals. A numerical approach was used to derive a novel contact model consisting of a coefficient of restitution matrix that uses modal vibration information. Experimental validation of the simulation model shows that it captures various dynamic features of the contact interaction when simulating leg behavior more accurately than previous contact models, such as single-point coefficient of restitution or compliant ground models. In addition, this paper shows that small-scale forces can be added to the simulation to improve model accuracy, resulting in average errors across driving conditions on the order of 2-6% for bounce frequency, maximum foot height, and average foot height, although there is substantial variation from case to case.
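    The single-point coefficient-of-restitution model that the paper improves upon reduces each impact to reversing the impact velocity scaled by a constant e. A minimal 1D sketch under that baseline model (parameters illustrative, not the paper's modal restitution matrix):

```python
def simulate_bounce(z0, v0, e=0.6, g=9.81, dt=1e-4, t_end=0.3):
    """Ballistic flight of a point mass with instantaneous restitution
    impacts at z = 0; returns the impact times."""
    z, v, t = z0, v0, 0.0
    impacts = []
    while t < t_end:
        v -= g * dt               # semi-implicit Euler: velocity first
        z += v * dt
        if z <= 0.0 and v < 0.0:
            z = 0.0
            impacts.append(t)
            v = -e * v            # coefficient of restitution: reverse and damp
        t += dt
    return impacts
```

    Successive flight intervals shrink geometrically by the factor e; the paper's contribution is replacing the single scalar e with a restitution matrix built from modal vibration information.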

  17. Dynamic characterization of contact interactions of micro-robotic leg structures

    International Nuclear Information System (INIS)

    Ryou, Jeong Hoon; Oldham, Kenn Richard

    2014-01-01

    Contact dynamics of microelectromechanical systems (MEMS) are typically complicated and it is consequently difficult to model all dynamic characteristics observed in time-domain responses involving impact. This issue becomes worse when a device, such as a mobile micro-robot, is not clamped to a substrate and has a complex mechanical structure. To characterize such a contact interaction situation, two walking micro-robot prototypes are tested having intentionally simple structures with different dimensions (21.2 mm × 16.3 mm × 0.75 mm and 32 mm × 25.4 mm × 4.1 mm) and weights (0.16 and 2.7 g). Contact interaction behaviors are characterized by analyzing experimental data under various excitation signals. A numerical approach was used to derive a novel contact model consisting of a coefficient of restitution matrix that uses modal vibration information. Experimental validation of the simulation model shows that it captures various dynamic features of the contact interaction when simulating leg behavior more accurately than previous contact models, such as single-point coefficient of restitution or compliant ground models. In addition, this paper shows that small-scale forces can be added to the simulation to improve model accuracy, resulting in average errors across driving conditions on the order of 2–6% for bounce frequency, maximum foot height, and average foot height, although there is substantial variation from case to case. (paper)

  18. Robot-assisted therapy for improving social interactions and activity participation among institutionalized older adults: a pilot study.

    Science.gov (United States)

    Sung, Huei-Chuan; Chang, Shu-Min; Chin, Mau-Yu; Lee, Wen-Li

    2015-03-01

    Animal-assisted therapy is gaining popularity as part of therapeutic activities for older adults in many long-term care facilities. However, concerns about dog bites, allergic responses to pets, disease, and insufficient available resources to care for a real pet have led many residential care facilities to ban this therapy. There are situations where a substitute artificial companion, such as a robotic pet, may serve as a better alternative. This pilot study used a one-group pre- and posttest design to evaluate the effect of a robot-assisted therapy for older adults. Sixteen eligible participants took part in the study and received a group robot-assisted therapy using a seal-like robot pet for 30 minutes twice a week for 4 weeks. All participants received assessments of their communication and interaction skills using the Assessment of Communication and Interaction Skills (ACIS-C) and activity participation using the Activity Participation Scale at baseline and at week 4. A total of 12 participants completed the study. Wilcoxon signed rank test showed that participants' communication and interaction skills (z = -2.94, P = 0.003) and activity participation (z = -2.66, P = 0.008) were significantly improved after receiving the 4-week robot-assisted therapy. By interacting with a robot pet, such as Paro, the communication, interaction skills, and activity participation of older adults can be improved. The robot-assisted therapy can be provided as a routine activity program and has the potential to improve the social health of older adults in residential care facilities. Copyright © 2014 Wiley Publishing Asia Pty Ltd.
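    The pre/post comparison in this record uses the Wilcoxon signed rank test. Its statistic W = min(W+, W-) can be computed by hand as below; the paired scores in the usage note are made up for illustration, not the study's data.

```python
def wilcoxon_w(pre, post):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired scores.
    Zero differences are dropped; tied |differences| get average ranks."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    ranked = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ranked):
        j = i
        while j + 1 < len(ranked) and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[ranked[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

    For example, hypothetical paired scores pre = [10, 12, 9, 11] and post = [14, 12, 13, 10] drop the tied pair, rank the remaining |differences|, and give W = 1. In practice one would report the test via a statistics package, which also supplies the z value and P value quoted in the abstract.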

  19. Touch versus In-Air Hand Gestures: Evaluating the Acceptance by Seniors of Human-Robot Interaction

    NARCIS (Netherlands)

    Znagui Hassani, Anouar; van Dijk, Elisabeth M.A.G.; Ludden, Geke Dina Simone; Eertink, Henk

    2011-01-01

    Do elderly people have a preference between performing in-air gestures or pressing screen buttons to interact with an assistive robot? This study attempts to provide answers to this question by measuring the level of acceptance, performance as well as knowledge of both interaction modalities during a

  20. Blank Field Sources in the ROSAT HRI Brera Multiscale Wavelet catalog

    OpenAIRE

    Chieregato, M.; Campana, S.; Treves, A.; Moretti, A.; Mignani, R. P.; Tagliaferri, G.

    2005-01-01

    The search for Blank Field Sources (BFS), i.e. X-ray sources without optical counterparts, paves the way to the identification of unusual objects in the X-ray sky. Here we present four BFS detected in the Brera Multiscale Wavelet catalog of ROSAT HRI observations. This sample has been selected on the basis of source brightness, distance from possible counterparts at other wavelengths, point-like shape and good estimate of the X-ray flux (f_X). The observed f_X and the limiting magnitude of th...

  1. Narratives with Robots: The Impact of Interaction Context and Individual Differences on Story Recall and Emotional Understanding

    Directory of Open Access Journals (Sweden)

    Iolanda Leite

    2017-07-01

    Full Text Available Role-play scenarios have been considered a successful learning space for children to develop their social and emotional abilities. In this paper, we investigate whether socially assistive robots in role-playing settings are as effective with small groups of children as they are with a single child and whether individual factors such as gender, grade level (first vs. second), perception of the robots (peer vs. adult), and empathy level (low vs. high) play a role in these two interaction contexts. We conducted a three-week repeated exposure experiment where 40 children interacted with socially assistive robotic characters that acted out interactive stories around words that contribute to expanding children’s emotional vocabulary. Our results showed that although participants who interacted alone with the robots recalled the stories better than participants in the group condition, no significant differences were found in children’s emotional interpretation of the narratives. With regard to individual differences, we found that a single child setting appeared more appropriate to first graders than a group setting, empathy level is an important predictor for emotional understanding of the narratives, and children’s performance varies depending on their perception of the robots (peer vs. adult) in the two conditions.

  2. Fabrication of robot head module using contact resistance force sensor for human robot interaction and its evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Ki; Kim, Jong Ho [Korea Reserch Institute of Standards and Science, Daejeon (Korea, Republic of); Kwon, Hyun Joon [Univ. of Maryland, Maryland (United States); Kwon, Young Ha [Kyung Hee Univ., Gyunggi Do (Korea, Republic of)

    2012-10-15

    This paper presents a design of a robot head module with touch sensing algorithms that can simultaneously detect contact force and location. The module is constructed with a hemisphere and three sensor units that are fabricated using contact resistance force sensors. The surface part is designed with the hemisphere that measures 300 mm in diameter and 150 mm in height. Placed at the bottom of the robot head module are three sensor units fabricated using a simple screen printing technique. The contact force and the location of the model are evaluated through the calibration setup. The experiment showed that the calculated contact positions almost coincided with the applied load points as the contact location changed, with a location error of about ±8.67 mm. The force responses of the module were evaluated at two points under loading and unloading conditions from 0 N to 5 N. The robot head module showed almost the same force responses at the two points.
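    One common way to recover a single contact point from three force sensors under a rigid surface is moment balance: the contact location is the force-weighted centroid of the sensor positions. The sketch below illustrates that idea only; it is an assumption for illustration, not necessarily the calibration method used in the paper.

```python
def contact_estimate(sensor_xy, forces):
    """Estimate total contact force and contact location as the
    force-weighted centroid of the sensor positions (moment balance
    for a rigid plate resting on three supports)."""
    total = sum(forces)
    x = sum(f * p[0] for f, p in zip(forces, sensor_xy)) / total
    y = sum(f * p[1] for f, p in zip(forces, sensor_xy)) / total
    return total, (x, y)
```

    For instance, with sensors at (0,0), (1,0) and (0,1) reading 1 N, 1 N and 2 N, the estimated contact is 4 N applied at (0.25, 0.5).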

  3. Views from Within a Narrative: Evaluating Long-Term Human-Robot Interaction in a Naturalistic Environment Using Open-Ended Scenarios.

    Science.gov (United States)

    Syrdal, Dag Sverre; Dautenhahn, Kerstin; Koay, Kheng Lee; Ho, Wan Ching

    2014-01-01

    This article describes the prototyping of human-robot interactions in the University of Hertfordshire (UH) Robot House. Twelve participants took part in a long-term study in which they interacted with robots in the UH Robot House once a week for a period of 10 weeks. A prototyping method using the narrative framing technique allowed participants to engage with the robots in episodic interactions that were framed using narrative to convey the impression of a continuous long-term interaction. The goal was to examine how participants responded to the scenarios and the robots, as well as to specific robot behaviours such as agent migration and expressive behaviours. Evaluations of the robots and the scenarios were elicited using several measures, including the standardised System Usability Scale, an ad hoc Scenario Acceptance Scale, as well as single-item Likert scales, open-ended questionnaire items and a debriefing interview. Results suggest that participants felt that the use of this prototyping technique allowed them insight into the use of the robot, and that they accepted the use of the robot within the scenario.

  4. How do walkers behave when crossing the way of a mobile robot that replicates human interaction rules?

    Science.gov (United States)

    Vassallo, Christian; Olivier, Anne-Hélène; Souères, Philippe; Crétual, Armel; Stasse, Olivier; Pettré, Julien

    2018-02-01

    Previous studies showed the existence of implicit interaction rules shared by human walkers when crossing each other. Especially, each walker contributes to the collision avoidance task and the crossing order, as set at the beginning, is preserved along the interaction. This order determines the adaptation strategy: the first arrived increases his/her advance by slightly accelerating and changing his/her heading, whereas the second one slows down and moves in the opposite direction. In this study, we analyzed the behavior of human walkers crossing the trajectory of a mobile robot that was programmed to reproduce this human avoidance strategy. In contrast with a previous study, which showed that humans mostly prefer to give the way to a non-reactive robot, we observed similar behaviors between human-human avoidance and human-robot avoidance when the robot replicates the human interaction rules. We discuss this result in relation with the importance of controlling robots in a human-like way in order to ease their cohabitation with humans. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Robosapien Robot used to Model Humanoid Interaction to Perform tasks in Dangerous Manufacturing Environments

    International Nuclear Information System (INIS)

    Stopforth, R; Bright, G

    2014-01-01

    Humans are involved in accidents in manufacturing environments. One possibility for protecting humans from these scenarios is to introduce humanoid robots within these industrial areas. This paper investigates the control scenario and environments required at a small-scale level, using the Robosapien robot. The Robosapien robot is modified and controlled to perform the task of removing a cylinder and inserting it into a hole. The performance of the Robosapien robot is analysed and related to that of a full humanoid robot. The paper concludes with a discussion, including suggestions, of the efficiency and profitability that would need to be considered for having a humanoid robot within the manufacturing environment

  6. Monocular SLAM for autonomous robots with enhanced features initialization.

    Science.gov (United States)

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-04-02

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced taking advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are considered a pseudo-calibrated stereo rig to produce depth estimations through parallax. These depth estimations are used to solve a related problem with DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided through results from real data, showing improvements in terms of more features correctly initialized with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion is provided in terms of how a real-time implementation could take advantage of this approach.
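    The pseudo-stereo depth estimation from parallax mentioned above follows the standard pinhole triangulation relation Z = f·B/d for rectified cameras. A small sketch, with illustrative (not the paper's) focal length, baseline and pixel coordinates:

```python
def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Triangulate depth for a feature matched across a pseudo-stereo pair:
    Z = f * B / d, with disparity d in pixels (rectified cameras assumed)."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

# A matched feature 20 px apart, 500 px focal length, 0.5 m baseline:
z = depth_from_disparity(500.0, 0.5, 320.0, 300.0)   # 12.5 m
```

    Such a depth value gives a matched SURF feature a metric initialization, which is exactly what removes the need for known artificial landmarks in the DI-D scheme.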

  7. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction

    Science.gov (United States)

    2011-10-01

    directly affects the willingness of people to accept robot-produced information, follow robots' suggestions, and thus benefit from the advantages inherent...perceived complexity of operation). Consequently, if the perceived risk of using the robot exceeds its perceived benefit, practical operators almost...necessary presence of a human caregiver (Graf, Hans, & Schraft, 2004). Other robotic devices, such as wheelchairs (Yanco, 2001) and exoskeletons (e.g

  8. A design-centred framework for social human-robot interaction

    NARCIS (Netherlands)

    Bartneck, C.; Forlizzi, J.

    2004-01-01

    Robots currently integrate into our everyday lives, but little is known about how they can act socially. In this paper, we propose a definition of social robots and describe a framework that classifies properties of social robots. The properties consist of form, modality, social norms, autonomy, and

  9. Robots for Elderly Care: Their Level of Social Interactions and the Targeted End User.

    Science.gov (United States)

    Bedaf, Sandra; de Witte, Luc

    2017-01-01

    Robots for older adults have a lot of potential. In order to create an overview of the developments in this area, a systematic review of robots for older adults living independently was conducted. Robots were categorized based on their market readiness, the type of provided support (i.e., physical, non-physical, non-specified), and the activity domain they claim to support. Additionally, the commercially available robots are placed in a proposed framework to help distinguish the different types of robots and their focus. During the presentation an updated version of the state of the art will be presented.

  10. Terrorism and Icelandic foreign policy. What effect did the terrorist attacks of 11 September 2001 have on Icelandic foreign policy?

    Directory of Open Access Journals (Sweden)

    Þorvarður Atli Þórsson

    2008-06-01

    Full Text Available This article advances the thesis that the effects of the terrorist attacks of 11 September on the stance of the Icelandic government towards terrorism were considerable and influenced how we approached terrorism and those who commit it. To examine these effects, the speeches of the Minister for Foreign Affairs and the Minister of Justice published on the websites of the respective ministries were reviewed, as both ministries consider terrorism to be on their agenda. All speeches and articles by the Minister for Foreign Affairs published in the period 1995-2008 and by the Minister of Justice published in the period 1999-2008 were examined, 495 in total, and it was investigated whether terrorism was placed in the context of international organized crime or of international threats and national security.

  11. Identifying Factors Reinforcing Robotization: Interactive Forces of Employment, Working Hour and Wage

    Directory of Open Access Journals (Sweden)

    Joonmo Cho

    2018-02-01

    Full Text Available Unlike previous studies on robotization, which approach the future from cutting-edge technologies and adopt a framework in which robotization is treated as an exogenous variable, this study considers that robotization occurs endogenously and uses it as a dependent variable for an objective examination of the effect of robotization on the labor market. To this end, a robotization indicator is created based on the actual number of industrial robots currently deployed in workplaces, and a multiple regression analysis is performed using the robotization indicator and labor variables such as employment, working hours, and wage. The results of the multiple regression, which considers the triangular relationship of employment, working hours and wages, show that job destruction due to robotization is not yet remarkable. Our results show a complementary relation between employment and robotization, but a substitutive relation between working hours and robotization. The results also demonstrate the effects of unionization, company size, and the proportions of production workers and simple-labor workers. These findings indicate that the degree of robotization may vary with many factors of the labor market. Limitations of this study and implications for future research are also discussed.

  12. Grounding action words in the sensorimotor interaction with the world: experiments with a simulated iCub humanoid robot

    Directory of Open Access Journals (Sweden)

    Davide Marocco

    2010-05-01

    Full Text Available This paper presents a cognitive robotics model for the study of the embodied representation of action words. The present research will show how an iCub humanoid robot can learn the meaning of action words (i.e., words that represent dynamical events that happen in time) by physically acting on the environment and linking the effects of its own actions with the behaviour observed on the objects before and after the action. The control system of the robot is an artificial neural network trained to manipulate an object through a Back-Propagation Through Time algorithm. We will show that in the presented model the grounding of action words relies directly on the way in which an agent interacts with the environment and manipulates it.

  13. Effect of human-robot interaction on muscular synergies on healthy people and post-stroke chronic patients.

    Science.gov (United States)

    Scano, A; Chiavenna, A; Caimmi, M; Malosio, M; Tosatti, L M; Molteni, F

    2017-07-01

    Robot-assisted training is a widely used technique to promote motor re-learning in post-stroke patients who suffer from motor impairment. While it is commonly accepted that robot-based therapies are potentially helpful, strong insights into their efficacy are still lacking. The motor re-learning process may act on muscular synergies, which are groups of co-activating muscles that, being controlled as a synergic group, simplify the problem of motor control. In fact, by coordinating a reduced number of neural signals, complex motor patterns can be elicited. This paper aims at analyzing the effects of robot assistance during 3D reaching movements in the framework of muscular synergies. 5 healthy people and 3 neurological patients performed free and robot-assisted reaching movements at 2 different speeds (slow and quasi-physiological). EMG recordings were used to extract muscular synergies. Results indicate that the interaction with the robot only very slightly alters the patterns of healthy people but, on the contrary, may promote the emergence of physiological-like synergies in neurological patients.

  14. INTERACT

    DEFF Research Database (Denmark)

    Jochum, Elizabeth; Borggreen, Gunhild; Murphey, TD

    This paper considers the impact of visual art and performance on robotics and human-computer interaction and outlines a research project that combines puppetry and live performance with robotics. Kinesics—communication through movement—is the foundation of many theatre and performance traditions ...

  15. Quantifying Age-Related Differences in Human Reaching while Interacting with a Rehabilitation Robotic Device

    Directory of Open Access Journals (Sweden)

    Vivek Yadav

    2010-01-01

    Full Text Available New movement assessment and data analysis methods are developed to quantify human arm motion patterns during physical interaction with robotic devices for rehabilitation. These methods provide metrics for future use in diagnosis, assessment and rehabilitation of subjects with affected arm movements. Specifically, the current study uses existing pattern recognition methods to evaluate the effect of age on performance of a specific motion, reaching to a target by moving the end-effector of a robot (an X-Y table). Differences in the arm motion patterns of younger and older subjects are evaluated using two measures: the principal component analysis similarity factor (SPCA) to compare path shape and the number of Fourier modes representing 98% of the path ‘energy’ to compare the smoothness of movement, a particularly important variable for assessment of pathologic movement. Both measures are less sensitive to noise than others previously reported in the literature and preserve information that is often lost through other analysis techniques. Data from the SPCA analysis indicate that age is a significant factor affecting the shapes of target reaching paths, followed by reaching movement type (crossing body midline/not crossing) and reaching side (left/right); hand dominance and trial repetition are not significant factors. Data from the Fourier-based analysis likewise indicate that age is a significant factor affecting smoothness of movement, and movements become smoother with increasing trial number in both younger and older subjects, although more rapidly so in younger subjects. These results using the proposed data analysis methods confirm current practice that age-matched subjects should be used for comparison to quantify recovery of arm movement during rehabilitation. The results also highlight the advantages that these methods offer relative to other reported measures.
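    The smoothness metric in this record, the number of Fourier modes representing 98% of the path energy, can be sketched with a discrete Fourier transform. The counting convention below (lowest frequencies first, cumulative spectral energy) and the synthetic paths in the usage note are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def modes_for_energy(path, frac=0.98):
    """Smallest number of Fourier modes (lowest frequencies first) whose
    cumulative spectral energy reaches `frac` of the total path energy."""
    spec = np.abs(np.fft.rfft(path - np.mean(path))) ** 2
    cum = np.cumsum(spec) / np.sum(spec)
    return int(np.searchsorted(cum, frac) + 1)

# Synthetic example: a smooth path versus the same path with a fast ripple.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
smooth = np.sin(2 * np.pi * 3 * t)
rough = smooth + 0.5 * np.sin(2 * np.pi * 50 * t)
# The smoother path needs fewer modes to reach 98% of its energy.
```

    A jerky movement spreads energy into high-frequency modes, so a larger mode count indicates less smooth motion, which is the intuition behind the paper's metric.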

  16. R3D3 in the Wild: Using A Robot for Turn Management in Multi-Party Interaction with a Virtual Human

    NARCIS (Netherlands)

    Theune, Mariet; Wiltenburg, Daan; Bode, Max; Linssen, Jeroen

    R3D3 is a combination of a virtual human with a non-speaking robot capable of head gestures and emotive gaze behaviour. We use the robot to implement various turn management functions for use in multi-party interaction with R3D3, and present the results of a field study investigating their effects

  17. Fundamentals of soft robot locomotion

    OpenAIRE

    Calisti, M.; Picardi, G.; Laschi, C.

    2017-01-01

    Soft robotics and its related technologies enable robot abilities in several robotics domains including, but not exclusively related to, manipulation, manufacturing, human–robot interaction and locomotion. Although field applications have emerged for soft manipulation and human–robot interaction, mobile soft robots appear to remain in the research stage, involving the somehow conflictual goals of having a deformable body and exerting forces on the environment to achieve locomotion. This p...

  18. ROSAPL: towards a heterogeneous multi‐robot system and Human interaction framework

    OpenAIRE

    Boronat Roselló, Emili

    2014-01-01

    The appearance of numerous robotic frameworks and middleware has provided researchers with reliable hardware and software units, avoiding the need to develop ad-hoc platforms and letting them focus on improving robots' high-level capabilities and behaviours. Despite this, none of these frameworks considers social capabilities as a factor in robot design. In a world that every day seems more and more connected, with the slow but steady advance of th...

  19. Modular Robotic Wearable

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2009-01-01

    In this concept paper we trace the contours and define a new approach to robotic systems, composed of interactive robotic modules which are somehow worn on the body. We label such a field as Modular Robotic Wearable (MRW). We describe how, by using modular robotics for creating wearable....... Finally, by focusing on the intersection of the combination modular robotic systems, wearability, and bodymind we attempt to explore the theoretical characteristics of such approach and exploit the possible playware application fields....

  20. Robot Teachers

    DEFF Research Database (Denmark)

    Nørgård, Rikke Toft; Ess, Charles Melvin; Bhroin, Niamh Ni

    The world's first robot teacher, Saya, was introduced to a classroom in Japan in 2009. Saya, had the appearance of a young female teacher. She could express six basic emotions, take the register and shout orders like 'be quiet' (The Guardian, 2009). Since 2009, humanoid robot technologies have...... developed. It is now suggested that robot teachers may become regular features in educational settings, and may even 'take over' from human teachers in ten to fifteen years (cf. Amundsen, 2017 online; Gohd, 2017 online). Designed to look and act like a particular kind of human; robot teachers mediate human...... existence and roles, while also aiming to support education through sophisticated, automated, human-like interaction. Our paper explores the design and existential implications of ARTIE, a robot teacher at Oxford Brookes University (2017, online). Drawing on an initial empirical exploration we propose...

  1. Evaluation of unimodal and multimodal communication cues for attracting attention in human–robot interaction

    NARCIS (Netherlands)

    Torta, E.; van Heumen, J.; Piunti, F.; Romeo, L.; Cuijpers, R.H.

    2015-01-01

    One of the most common tasks of a robot companion in the home is communication. In order to initiate an information exchange with its human partner, the robot needs to attract the attention of the human. This paper presents results of two user studies (N = 12) to evaluate the effectiveness of

  2. Using Haptic and Auditory Interaction Tools to Engage Students with Visual Impairments in Robot Programming Activities

    Science.gov (United States)

    Howard, A. M.; Park, Chung Hyuk; Remy, S.

    2012-01-01

    The robotics field represents the integration of multiple facets of computer science and engineering. Robotics-based activities have been shown to encourage K-12 students to consider careers in computing and have even been adopted as part of core computer-science curriculum at a number of universities. Unfortunately, for students with visual…

  3. Role of Gaze Cues in Interpersonal Motor Coordination: Towards Higher Affiliation in Human-Robot Interaction.

    Directory of Open Access Journals (Sweden)

    Mahdi Khoramshahi

    Full Text Available The ability to follow one another's gaze plays an important role in our social cognition, especially when we synchronously perform tasks together. We investigate how gaze cues can improve performance in a simple coordination task (i.e., the mirror game), whereby two players mirror each other's hand motions. In this game, each player is either a leader or a follower. To study the effect of gaze in a systematic manner, the leader's role is played by a robotic avatar. We contrast two conditions, in which the avatar does or does not provide explicit gaze cues that indicate the next location of its hand. Specifically, we investigated (a) whether participants are able to exploit these gaze cues to improve their coordination, (b) how gaze cues affect action prediction and temporal coordination, and (c) whether introducing active gaze behavior for avatars makes them more realistic and human-like from the user's point of view. 43 subjects participated in 8 trials of the mirror game. Each subject performed the game in the two conditions (with and without gaze cues). In this within-subject study, the order of the conditions was randomized across participants, and subjective assessment of the avatar's realism was assessed with a post-hoc questionnaire. When gaze cues were provided, a quantitative assessment of synchrony between participants and the avatar revealed a significant improvement in subject reaction time (RT). This confirms our hypothesis that gaze cues improve the follower's ability to predict the avatar's action. An analysis of the frequency pattern across the two players' hand movements reveals that the gaze cues improve the overall temporal coordination across the two players.
    Finally, analysis of the subjective evaluations from the questionnaires reveals that, in the presence of gaze cues, participants found the avatar not only more human-like/realistic, but also easier to interact with. This work confirms that people can exploit gaze cues to
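    The temporal-coordination measure behind the reaction-time result, how far the follower lags the leader, can be estimated with a plain cross-correlation scan over candidate delays. A minimal sketch assuming 1-D hand trajectories sampled at a common rate; the function name and the synthetic 7-sample delay are invented for illustration.

```python
import math

def lag_of_best_alignment(leader, follower, max_lag=20):
    """Estimate the follower's delay (in samples) as the non-negative shift
    that maximizes the mean product of the two movement traces."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        pairs = list(zip(leader, follower[lag:]))
        score = sum(a * b for a, b in pairs) / len(pairs)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic traces: the follower reproduces the leader 7 samples late.
leader = [math.sin(0.3 * t) for t in range(200)]
follower = [math.sin(0.3 * (t - 7)) for t in range(200)]
assert lag_of_best_alignment(leader, follower) == 7
```

    In a gaze-cue condition one would expect the recovered lag to shrink, which is exactly the RT improvement the study reports.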

  4. SafeNet: a methodology for integrating general-purpose unsafe devices in safe-robot rehabilitation systems.

    Science.gov (United States)

    Vicentini, Federico; Pedrocchi, Nicola; Malosio, Matteo; Molinari Tosatti, Lorenzo

    2014-09-01

    Robot-assisted neurorehabilitation often involves networked systems of sensors ("sensory rooms") and powerful devices in physical interaction with weak users. Safety is unquestionably a primary concern. Some purpose-designed lightweight robot platforms and devices include safety properties using redundant sensors or intrinsically safe design (e.g. compliance and backdrivability, limited exchange of energy). Nonetheless, the entire "sensory room" is required to be fail-safe and safely monitored as a system at large. Yet the sensor capabilities and control algorithms used in functional therapies require, in general, frequent updates or re-configurations, making a safety-grade release of such devices hardly sustainable in cost-effectiveness and development time. As a result, promising integrated platforms for human-in-the-loop therapies have not found clinical application and manufacturing support, because they fail to maintain global fail-safe properties. Within the general context of cross-machinery safety standards, the paper presents a methodology called SafeNet for helping to extend the safety level of Human Robot Interaction (HRI) systems that use unsafe components, including sensors and controllers. SafeNet considers the robotic system as a device at large and applies the principles of functional safety (as in ISO 13849-1) through a set of architectural procedures and implementation rules. The enabled capability of monitoring a network of unsafe devices through redundant computational nodes allows the use of any custom sensors and algorithms, usually planned and assembled at therapy planning-time rather than at platform design-time. A case study is presented with an actual implementation of the proposed methodology: a specific architectural solution is applied to an example of robot-assisted upper-limb rehabilitation with online motion tracking. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
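    The core fail-safe idea, monitoring unsafe devices and driving the system to a safe state as soon as any one of them stops responding, can be illustrated with a toy heartbeat watchdog. All names, timeouts and the device set below are hypothetical; SafeNet itself specifies architectural rules (redundant computational nodes, cross-checks) well beyond this sketch.

```python
class SafetyWatchdog:
    """Minimal heartbeat monitor in the spirit of SafeNet: each unsafe device
    must check in within `timeout` seconds or the system is driven to a safe
    state. (Illustrative only; names and thresholds are hypothetical.)"""

    def __init__(self, devices, timeout=0.2):
        self.timeout = timeout
        self.last_seen = {d: None for d in devices}
        self.safe_stop = False

    def heartbeat(self, device, now):
        self.last_seen[device] = now

    def check(self, now):
        """Return the first stale device (tripping the safe stop), or None."""
        for device, seen in self.last_seen.items():
            if seen is None or now - seen > self.timeout:
                self.safe_stop = True  # fail-safe: any stale device trips the stop
                return device
        return None

wd = SafetyWatchdog(["camera", "force_sensor"])
wd.heartbeat("camera", 0.00)
wd.heartbeat("force_sensor", 0.05)
assert wd.check(0.10) is None          # both devices fresh
assert wd.check(0.30) == "camera"      # camera stale -> safe stop latched
assert wd.safe_stop
```

    The key design choice mirrors the abstract's point: safety is enforced by the monitoring layer, so the monitored sensors and algorithms can be swapped at therapy planning-time without re-certifying each one.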

  5. Designing Emotionally Expressive Robots

    DEFF Research Database (Denmark)

    Tsiourti, Christiana; Weiss, Astrid; Wac, Katarzyna

    2017-01-01

    Socially assistive agents, be it virtual avatars or robots, need to engage in social interactions with humans and express their internal emotional states, goals, and desires. In this work, we conducted a comparative study to investigate how humans perceive emotional cues expressed by humanoid...... robots through five communication modalities (face, head, body, voice, locomotion) and examined whether the degree of a robot's human-like embodiment affects this perception. In an online survey, we asked people to identify emotions communicated by Pepper, a highly human-like robot, and Hobbit, a robot...... for robots....

  6. Distributed Robotics Education

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Distributed robotics takes many forms, for instance, multirobots, modular robots, and self-reconfigurable robots. The understanding and development of such advanced robotic systems demand extensive knowledge in engineering and computer science. In this paper, we describe the concept...... to be changed, related to multirobot control and human-robot interaction control from virtual to physical representation. The proposed system is valuable for bringing a vast number of issues into education – such as parallel programming, distribution, communication protocols, master dependency, connectivity...

  7. Robotic intelligence kernel

    Science.gov (United States)

    Bruemmer, David J [Idaho Falls, ID

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors, that incorporate robot attributes and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between an operator intervention and a robot initiative and may include multiple levels with at least a teleoperation mode configured to maximize the operator intervention and minimize the robot initiative and an autonomous mode configured to minimize the operator intervention and maximize the robot initiative. Within the RIK at least the cognitive level includes the dynamic autonomy structure.
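    The dynamic autonomy structure, a sliding balance between operator intervention and robot initiative, can be caricatured as a weighted blend of command sources. The four-level ladder and its numeric weights below are invented for illustration and are not the RIK's actual mode set.

```python
from enum import Enum

class AutonomyMode(Enum):
    """Hypothetical autonomy ladder; value = share of robot initiative."""
    TELEOPERATION = 0.0   # operator intervention maximized
    SAFE_MODE = 0.33
    SHARED = 0.66
    AUTONOMOUS = 1.0      # robot initiative maximized

def blended_command(operator_cmd, robot_cmd, mode):
    """Weight operator vs. robot velocity commands by the autonomy level."""
    w = mode.value
    return (1 - w) * operator_cmd + w * robot_cmd

# At the two extremes one side's command wins outright.
assert blended_command(1.0, 0.0, AutonomyMode.TELEOPERATION) == 1.0
assert blended_command(1.0, 0.0, AutonomyMode.AUTONOMOUS) == 0.0
```

    Moving between modes then only changes the "transaction capacity" between the two parties, as the abstract puts it, without touching the behavior or cognitive levels themselves.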

  8. Timing of Multimodal Robot Behaviors during Human-Robot Collaboration

    DEFF Research Database (Denmark)

    Jensen, Lars Christian; Fischer, Kerstin; Suvei, Stefan-Daniel

    2017-01-01

    In this paper, we address issues of timing between robot behaviors in multimodal human-robot interaction. In particular, we study what effects sequential order and simultaneity of robot arm and body movement and verbal behavior have on the fluency of interactions. In a study with the Care-O-bot, ...... output plays a special role because participants carry their expectations from human verbal interaction into the interactions with robots....

  9. MECHANICAL DESIGN OF AN AUTONOMOUS MARINE ROBOTIC SYSTEM FOR INTERACTION WITH DIVERS

    Directory of Open Access Journals (Sweden)

    Nikola Stilinović

    2016-09-01

    Full Text Available SCUBA diving, professional or recreational, remains one of the most hazardous activities known to man, mostly because human survival in the underwater environment requires the use of technical equipment such as breathing regulators. Loss of breathing gas supply, a burst eardrum, decompression sickness and nitrogen narcosis are just a few of the problems that can occur during an ordinary dive and result in injuries, long-term illnesses or even death. The most common way to reduce the risk of diving is to dive in pairs, allowing divers to cooperate with each other and react when an uncommon situation occurs. Having the ability to react before an unwanted situation happens would further improve diver safety. This paper describes an autonomous marine robotic system that replaces a human dive buddy. Such a robotic system, developed within the FP7 project “CADDY – Cognitive Autonomous Diving Buddy”, provides a symbiotic link between robots and human divers underwater. The proposed concept consists of a diver, an autonomous underwater vehicle (AUV) Buddy and an autonomous surface vehicle (ASV) PlaDyPos, acting within a cooperative network linked via an acoustic communication channel. This is the first time that an underwater human-robot system of such a scale has been developed. In this paper, the focus is on the mechanical characteristics of the robotic vehicles.

  10. An Interactive Human Interface Arm Robot with the Development of Food Aid

    Directory of Open Access Journals (Sweden)

    NASHWAN D. Zaki

    2012-03-01

    Full Text Available A robotic system for disabled people who need support at meals is proposed. One feature of this system is that the robotic aid can communicate with the operator using speech recognition and speech synthesis functions. Another feature is that the robotic aid uses image processing, so that the system can recognize the environmental situation of the dishes, cups and so on. Thanks to this image processing function, the operator does not need to specify the position and the posture of the dishes and target objects. Furthermore, combining speech and image processing enables friendly man-machine communication, since speech and visual information are essential in human communication.

  11. How does a surgeon’s brain buzz? An EEG coherence study on the interaction between humans and robot

    Science.gov (United States)

    2013-01-01

    Introduction In humans, both primary and non-primary motor areas are involved in the control of voluntary movements. However, the dynamics of functional coupling among different motor areas have not been fully clarified yet. There is to date no research looking at the functional dynamics in the brains of surgeons working in laparoscopy compared with those trained and working in robotic surgery. Experimental procedures We enrolled 16 right-handed trained surgeons and assessed changes in intra- and inter-hemispheric EEG coherence with a 32-channel device during the same motor task performed with either a robotic or a laparoscopic approach. Estimates of auto- and coherence spectra were calculated by a fast Fourier transform algorithm implemented in Matlab 5.3. Results We found increased coherence in surgeons performing laparoscopy, especially in theta and lower alpha activity, in all experimental conditions (M1 vs. SMA, S1 vs. SMA, S1 vs. pre-SMA and M1 vs. S1; p with different approaches. To the best of our knowledge, this is the first study that tried to assess semi-quantitative differences during the interaction between the normal human brain and robotic devices. PMID:23607324
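    The coherence estimate used here, auto- and cross-spectra from an FFT combined into a normalized magnitude-squared ratio, can be sketched with a bare-bones segment-averaged (Welch-style) estimator. This is an illustrative Python reconstruction rather than the authors' Matlab pipeline; the bin index, segment length and the synthetic shared oscillation are assumptions.

```python
import cmath
import math
import random

def dft_bin(segment, k):
    """Single DFT coefficient at bin k (naive sum; fine for short segments)."""
    n = len(segment)
    return sum(x * cmath.exp(-2j * math.pi * k * t / n) for t, x in enumerate(segment))

def coherence(x, y, k, seg_len=64):
    """Magnitude-squared coherence between x and y at DFT bin k, averaged
    over non-overlapping segments (a bare-bones Welch estimate)."""
    starts = range(0, len(x) - seg_len + 1, seg_len)
    Sxy = sum(dft_bin(x[s:s + seg_len], k) * dft_bin(y[s:s + seg_len], k).conjugate()
              for s in starts)
    Sxx = sum(abs(dft_bin(x[s:s + seg_len], k)) ** 2 for s in starts)
    Syy = sum(abs(dft_bin(y[s:s + seg_len], k)) ** 2 for s in starts)
    return abs(Sxy) ** 2 / (Sxx * Syy)

# Two channels sharing a common oscillation are highly coherent at its bin.
random.seed(0)
shared = [math.sin(2 * math.pi * 4 * t / 64) for t in range(640)]
a = [s + 0.1 * random.gauss(0, 1) for s in shared]
b = [0.8 * s + 0.1 * random.gauss(0, 1) for s in shared]
assert coherence(a, b, k=4) > 0.9
```

    By Cauchy-Schwarz the estimate is bounded by 1, which is what makes it comparable across electrode pairs and frequency bands.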

  12. Exploratorium: Robots.

    Science.gov (United States)

    Brand, Judith, Ed.

    2002-01-01

    This issue of Exploratorium Magazine focuses on the topic robotics. It explains how to make a vibrating robotic bug and features articles on robots. Contents include: (1) "Where Robot Mice and Robot Men Run Round in Robot Towns" (Ray Bradbury); (2) "Robots at Work" (Jake Widman); (3) "Make a Vibrating Robotic Bug" (Modesto Tamez); (4) "The Robot…

  13. Experiments with a First Prototype of a Spatial Model of Cultural Meaning through Natural-Language Human-Robot Interaction

    Directory of Open Access Journals (Sweden)

    Oliver Schürer

    2018-01-01

    Full Text Available When using assistive systems, the consideration of individual and cultural meaning is crucial for the utility and acceptance of technology. Orientation, communication and interaction are rooted in perception and therefore always happen in material space. We understand that a major problem lies in the difference between human and technical perception of space. Cultural policies are based on meanings including their spatial situation and their rich relationships. Therefore, we have developed an approach where the different perception systems share a hybrid spatial model that is generated by artificial intelligence—a joint effort by humans and assistive systems. The aim of our project is to create a spatial model of cultural meaning based on interaction between humans and robots. We define the role of humanoid robots as becoming our companions. This calls for technical systems to include still inconceivable human and cultural agendas for the perception of space. In two experiments, we tested a first prototype of the communication module that allows a humanoid to learn cultural meanings through a machine learning system. Interaction is achieved by non-verbal and natural-language communication between humanoids and test persons. This helps us to better understand how a spatial model of cultural meaning can be developed.

  14. The ultimatum game as measurement tool for anthropomorphism in human-robot interaction

    NARCIS (Netherlands)

    Torta, E.; Dijk, van E.T.; Ruijten, P.A.M.; Cuijpers, R.H.; Herrmann, G.; Pearson, M.J.; Lenz, A.; et al.

    2013-01-01

    Anthropomorphism is the tendency to attribute human characteristics to non–human entities. This paper presents exploratory work to evaluate how human responses during the ultimatum game vary according to the level of anthropomorphism of the opponent, which was either a human, a humanoid robot or a

  15. Child-Robot Interaction in the Wild : Field Testing Activities of the ALIZ-E Project

    NARCIS (Netherlands)

    Greeff, J. de; Blanson Henkemans, O.A.; Fraaije, A.; Solms, L.; Wigdor, N.; Bierman, B.

    2014-01-01

    A field study was conducted in which CRI activities developed by the ALIZ-E project were tested with the project's primary user group: children with diabetes. This field study resulted in new insights in the modalities and roles a robot aimed at CRI in a healthcare setting might utilise, while in

  16. Group Tasks, Activities, Dynamics, and Interactions in Collaborative Robotics Projects with Elementary and Middle School Children

    Science.gov (United States)

    Yuen, Timothy T.; Boecking, Melanie; Stone, Jennifer; Tiger, Erin Price; Gomez, Alvaro; Guillen, Adrienne; Arreguin, Analisa

    2014-01-01

    Robotics provide the opportunity for students to bring their individual interests, perspectives and areas of expertise together in order to work collaboratively on real-world science, technology, engineering and mathematics (STEM) problems. This paper examines the nature of collaboration that manifests in groups of elementary and middle school…

  17. Designing and Implementing an Interactive Social Robot from Off-the-shelf Components

    DEFF Research Database (Denmark)

    Tan, Zheng-Hua; Thomsen, Nicolai Bæk; Duan, Xiaodong

    2015-01-01

    people feel comfortable in its presence. All electrical components are standard off-the-shelf commercial products, making replication possible. Furthermore, the software is based on the Robot Operating System (ROS) and is made freely available. We present our experience with the design and discuss possible...

  18. A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

    NARCIS (Netherlands)

    Jung, Merel Madeleine; Poel, Mannes; Reidsma, Dennis; Heylen, Dirk K.J.

    2017-01-01

    Social robots should be able to automatically understand and respond to human touch. The meaning of touch does not only depend on the form of touch but also on the context in which the touch takes place. To gain more insight into the factors that are relevant to interpret the meaning of touch within

  19. Rehabilitation robotics.

    Science.gov (United States)

    Krebs, H I; Volpe, B T

    2013-01-01

    This chapter focuses on rehabilitation robotics which can be used to augment the clinician's toolbox in order to deliver meaningful restorative therapy for an aging population, as well as on advances in orthotics to augment an individual's functional abilities beyond neurorestoration potential. The interest in rehabilitation robotics and orthotics is increasing steadily with marked growth in the last 10 years. This growth is understandable in view of the increased demand for caregivers and rehabilitation services escalating apace with the graying of the population. We provide an overview on improving function in people with a weak limb due to a neurological disorder who cannot properly control it to interact with the environment (orthotics); we then focus on tools to assist the clinician in promoting rehabilitation of an individual so that s/he can interact with the environment unassisted (rehabilitation robotics). We present a few clinical results occurring immediately poststroke as well as during the chronic phase that demonstrate superior gains for the upper extremity when employing rehabilitation robotics instead of usual care. These include the landmark VA-ROBOTICS multisite, randomized clinical study which demonstrates clinical gains for chronic stroke that go beyond usual care at no additional cost. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Springer handbook of robotics

    CERN Document Server

    Khatib, Oussama

    2016-01-01

    The second edition of this handbook provides a state-of-the-art overview of the various aspects of the rapidly developing field of robotics. Reaching for the human frontier, robotics is vigorously engaged in the growing challenges of new emerging domains. Interacting, exploring, and working with humans, the new generation of robots will increasingly touch people and their lives. The credible prospect of practical robots among humans is the result of the scientific endeavour of half a century of robotic developments that established robotics as a modern scientific discipline. The ongoing vibrant expansion and strong growth of the field during the last decade has fueled this second edition of the Springer Handbook of Robotics. The first edition of the handbook soon became a landmark in robotics publishing and won the American Association of Publishers PROSE Award for Excellence in Physical Sciences & Mathematics as well as the organization’s Award for Engineering & Technology. The second edition o...

  1. Relational vs. group self-construal: Untangling the role of national culture in HRI

    NARCIS (Netherlands)

    Evers, V.; Maldonado, H.; Brodecki, T.; Hinds, P.; Fong, T.; Dautenhahn, K.

    2008-01-01

    As robots (and other technologies) increasingly make decisions on behalf of people, it is important to understand how people from diverse cultures respond to this capability. Thus far, much design of autonomous systems takes a Western view valuing individual preferences and choice. We challenge the

  2. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module which incorporates manufacturing, mechanical design planning, system control through microprocessor technology, and robot maneuverability. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. The research applied this development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest during the period 2009-2015 show that the developed modules confirm the fourth stage of the development method, dissemination. The developed modules guide students in producing an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.

  3. Robots as Confederates

    DEFF Research Database (Denmark)

    Fischer, Kerstin

    2016-01-01

    This paper addresses the use of robots in experimental research for the study of human language, human interaction, and human nature. It is argued that robots make excellent confederates that can be completely controlled, yet which engage human participants in interactions that allow us to study...... numerous linguistic and psychological variables in isolation in an ecologically valid way. Robots thus combine the advantages of observational studies and of controlled experimentation....

  4. Service Robots

    DEFF Research Database (Denmark)

    Clemmensen, Torkil; Nielsen, Jeppe Agger; Andersen, Kim Normann

    The position presented in this paper is that in order to understand how service robots shape, and are being shaped by, the physical and social contexts in which they are used, we need to consider both work/organizational analysis and interaction design. We illustrate this with qualitative data...... and personal experiences to generate discussion about how to link these two traditions. This paper presents selected results from a case study that investigated the implementation and use of robot vacuum cleaners in Danish eldercare. The study demonstrates interpretive flexibility with variation...

  5. Trust and Trustworthiness in Human-Robot Interaction: A Formal Conceptualization

    Science.gov (United States)

    2016-05-11

    results also show that a robot that fails by traveling a short distance and stopping does not have a significantly larger negative impact on both self...needed to get out”). Some likened it to getting the high score in a video game while others just wanted to “survive” the simulation. Participants who did...formally characterizing the concept of trust using tools from interdependence and game theory in complex and dynamic social environments. This effort

  6. A Comparison of Avatar, Video, and Robot-Mediated Interaction on Users' Trust in Expertise

    Directory of Open Access Journals (Sweden)

    Ye ePan

    2016-03-01

    Full Text Available Communication technologies are becoming increasingly diverse in form and functionality. A central concern is the ability to detect whether others are trustworthy. Judgments of trustworthiness rely, in part, on assessments of nonverbal cues, which are affected by media representations. In this research, we compared trust formation across three media representations. We presented 24 participants with advisors represented by two of three alternate formats: video, avatar, or robot. Unknown to the participants, one was an expert and the other was a non-expert. We observed participants' advice-seeking behaviour under risk as an indicator of their trust in the advisor. We found that most participants preferred seeking advice from the expert, but we also found a tendency for seeking robot or video advice. Avatar advice, in contrast, was sought more rarely. Users' self-reports support these findings. These results suggest that when users make trust assessments, the physical presence of the robot representation might compensate for the lack of identity cues.

  7. Closed loop interactions between spiking neural network and robotic simulators based on MUSIC and ROS

    Directory of Open Access Journals (Sweden)

    Philipp Weidel

    2016-08-01

    Full Text Available In order to properly assess the function and computational properties of simulated neural systems, it is necessary to account for the nature of the stimuli that drive the system. However, providing stimuli that are rich and yet both reproducible and amenable to experimental manipulations is technically challenging, and even more so if a closed-loop scenario is required. In this work, we present a novel approach to solve this problem, connecting robotics and neural network simulators. We implement a middleware solution that bridges the Robot Operating System (ROS) to the Multi-Simulator Coordinator (MUSIC). This enables any robotic and neural simulators that implement the corresponding interfaces to be efficiently coupled, allowing real-time performance for a wide range of configurations. This work extends the toolset available for researchers in both neurorobotics and computational neuroscience, and creates the opportunity to perform closed-loop experiments of arbitrary complexity to address questions in multiple areas, including embodiment, agency, and reinforcement learning.
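    The closed-loop coupling such middleware enables can be pictured with a toy stand-in: a "neural" controller and a robot plant exchanging messages through queues that play the role of the MUSIC-to-ROS bridge. Everything below is a deliberately simplified sketch in plain Python; no real MUSIC or ROS API is used, and the proportional controller merely stands in for a spiking network.

```python
from collections import deque

# Message queues standing in for the bridged communication channels.
sensor_q, motor_q = deque(), deque()

def robot_step(motor_cmd, position):
    """Integrate a 1-D robot a small step in the commanded direction."""
    return position + 0.1 * motor_cmd

def neural_step(sensed_position, target=1.0):
    """Stand-in for a spiking network: output drive proportional to error."""
    return target - sensed_position

position = 0.0
for _ in range(100):
    sensor_q.append(position)              # robot -> bridge -> network
    cmd = neural_step(sensor_q.popleft())  # network computes a motor command
    motor_q.append(cmd)                    # network -> bridge -> robot
    position = robot_step(motor_q.popleft(), position)

# The loop converges on the target, demonstrating closed-loop behavior.
assert abs(position - 1.0) < 1e-3
```

    The point of the real middleware is that each side of this loop can be an arbitrarily complex simulator, coupled at run time rather than compiled together.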

  8. Ambulatory movements, team dynamics and interactions during robot-assisted surgery.

    Science.gov (United States)

    Ahmad, Nabeeha; Hussein, Ahmed A; Cavuoto, Lora; Sharif, Mohamed; Allers, Jenna C; Hinata, Nobuyuki; Ahmad, Basel; Kozlowski, Justen D; Hashmi, Zishan; Bisantz, Ann; Guru, Khurshid A

    2016-07-01

    To analyse ambulatory movements and team dynamics during robot-assisted surgery (RAS), and to investigate whether congestion of the physical space associated with robotic technology led to workflow challenges or predisposed to errors and adverse events. With institutional review board approval, we retrospectively reviewed 10 recorded robot-assisted radical prostatectomies in a single operating room (OR). The OR was divided into eight zones, and all movements were tracked and described in terms of start and end zones, duration, personnel and purpose. Movements were further classified into avoidable (can be eliminated/improved) and unavoidable (necessary for completion of the procedure). The mean operating time was 166 min, of which ambulation constituted 27 min (16%). A total of 2,896 ambulatory movements were identified (mean: 290 ambulatory movements/procedure). Most of the movements were procedure-related (31%), and were performed by the circulating nurse. We identified 11 main pathways in the OR; the heaviest traffic was between the circulating nurse zone, transit zone and supply-1 zone. A total of 50% of ambulatory movements were found to be avoidable. More than half of the movements during RAS can be eliminated with an improved OR setting. More studies are needed to design an evidence-based OR layout that enhances access, workflow and patient safety. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.
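    The zone-transition bookkeeping described in this study reduces to counting (start zone, end zone) pairs in a movement log and tallying the avoidable share. A minimal sketch with an invented four-entry log; the zone names echo the abstract, but the data and the avoidable flags are hypothetical.

```python
from collections import Counter

# Hypothetical movement log: (start_zone, end_zone, seconds, avoidable?)
movements = [
    ("circulating_nurse", "supply_1", 12, True),
    ("circulating_nurse", "transit", 8, False),
    ("transit", "supply_1", 9, True),
    ("circulating_nurse", "transit", 7, True),
]

# Traffic per pathway: count how often each (start, end) pair occurs.
traffic = Counter((start, end) for start, end, _, _ in movements)
heaviest = traffic.most_common(1)[0]

# Share of movements flagged as avoidable (eliminable with a better layout).
avoidable_share = sum(1 for *_, avoidable in movements if avoidable) / len(movements)

assert heaviest[0] == ("circulating_nurse", "transit")
assert avoidable_share == 0.75
```

    Ranking the pathway counts is what surfaces the "heaviest traffic" corridors the study reports, and the avoidable share is the headline 50% figure in their data.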

  9. Robot Actors, Robot Dramaturgies

    DEFF Research Database (Denmark)

    Jochum, Elizabeth

    This paper considers the use of tele-operated robots in live performance. Robots and performance have long been linked, from the working androids and automata staged in popular exhibitions during the nineteenth century and the robots featured at Cybernetic Serendipity (1968) and the World Expo...

  10. The future African workplace: The use of collaborative robots in manufacturing

    Directory of Open Access Journals (Sweden)

    Andre P. Calitz

    2017-07-01

    Full Text Available Orientation: Industry 4.0 promotes technological innovations and human–robot collaboration (HRC). Human–robot interaction (HRI) and HRC on the manufacturing assembly line have been implemented in numerous advanced production environments worldwide. Collaborative robots (Cobots) are increasingly being used as collaborators with humans in factory production and assembly environments. Research purpose: The purpose of the research is to investigate the current use and future implementation of Cobots worldwide and their specific impact on the African workforce. Motivation for the study: Exploring the gap that exists between the international implementation of Cobots and the potential implementation and impact on the African manufacturing and assembly environment, and specifically on the African workforce. Research design, approach and method: The study features a qualitative research design. An open-ended question survey was conducted amongst leading manufacturing companies in South Africa in order to determine the status and future implementation of Cobot practices. Thematic analysis and content analysis were conducted using AtlasTi. Main findings: The findings indicate that the African businesses were aware of the international business trends regarding Cobot implementation and the possible impact of Cobots on the African workforce. Factors specifically highlighted in this study are fear of retrenchment, human–Cobot trust and the African culture. Practical implications and value-add: This study provides valuable background on the international status of Cobot implementation and the possible impact on the African workforce. The study highlights the importance of building employee trust, providing the relevant training and addressing the fear of retrenchment amongst employees.

  11. Social and Affective Robotics Tutorial

    NARCIS (Netherlands)

    Pantic, Maja; Evers, Vanessa; Deisenroth, Marc; Merino, Luis; Schuller, Björn

    2016-01-01

    Social and Affective Robotics is a growing multidisciplinary field encompassing computer science, engineering, psychology, education, and many other disciplines. It explores how social and affective factors influence interactions between humans and robots, and how affect and social signals can be

  12. Detection of movement intention using EEG in a human-robot interaction environment

    Directory of Open Access Journals (Sweden)

    Ernesto Pablo Lana

    Full Text Available Introduction: This paper presents a detection method for upper limb movement intention as part of a brain-machine interface using EEG signals, whose final goal is to assist disabled or vulnerable people with activities of daily living. Methods: EEG signals were recorded from six naïve healthy volunteers while performing a motor task. Every volunteer remained in an acoustically isolated recording room. The robot was placed in front of the volunteers such that it seemed to be a mirror of their right arm, emulating a brain-machine interface environment. The volunteers were seated in an armchair throughout the experiment, outside the reaching area of the robot to guarantee safety. Three conditions were studied: observation, execution, and imagery of right arm flexion and extension movements paced by an anthropomorphic manipulator robot. The detector of movement intention uses the spectral F test for discrimination of conditions, with the desynchronization patterns found in the volunteers as features. Using a detector provides an objective method to acknowledge the occurrence of movement intention. Results: When using four realizations of the task, detection rates ranging from 53 to 97% were found in five of the volunteers when the movement was executed, in three of them when the movement was imagined, and in two of them when the movement was observed. Conclusions: Detection rates for movement observation raise the question of how visual feedback may affect the performance of a working brain-machine interface, posing another challenge for the upcoming interface implementation. Future developments will focus on the improvement of feature extraction and detection accuracy for movement intention using EEG data.
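
    The desynchronization feature this detector relies on is conventionally quantified as event-related desynchronization (ERD): the relative drop in mu/beta band power from a resting baseline to the movement-related window. A minimal pure-Python sketch under that standard definition (the signal values are invented; this is not the paper's spectral F test):

    ```python
    # Minimal sketch of event-related desynchronization (ERD), the kind of
    # feature used by the detector above: the relative drop in band power from
    # a resting baseline to the movement window. Sample values are invented.

    def band_power(samples):
        """Mean squared amplitude of a (band-filtered) signal segment."""
        return sum(x * x for x in samples) / len(samples)

    def erd_percent(baseline_power: float, event_power: float) -> float:
        """ERD% = 100 * (baseline - event) / baseline; positive = desynchronization."""
        return 100.0 * (baseline_power - event_power) / baseline_power

    baseline = band_power([2.0, -2.0, 2.0, -2.0])   # power = 4.0
    movement = band_power([1.0, -1.0, 1.0, -1.0])   # power = 1.0
    print(erd_percent(baseline, movement))           # → 75.0
    ```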

  13. Robotic architectures

    CSIR Research Space (South Africa)

    Mtshali, M

    2010-01-01

    Full Text Available In the development of mobile robotic systems, a robotic architecture plays a crucial role in interconnecting all the sub-systems and controlling the system. The design of robotic architectures for mobile autonomous robots is a challenging...

  14. Fundamentals of soft robot locomotion.

    Science.gov (United States)

    Calisti, M; Picardi, G; Laschi, C

    2017-05-01

    Soft robotics and its related technologies enable robot abilities in several robotics domains including, but not limited to, manipulation, manufacturing, human-robot interaction and locomotion. Although field applications have emerged for soft manipulation and human-robot interaction, mobile soft robots appear to remain in the research stage, involving the somewhat conflicting goals of having a deformable body and exerting forces on the environment to achieve locomotion. This paper aims to provide a reference guide for researchers approaching mobile soft robotics, to describe the underlying principles of soft robot locomotion with its pros and cons, and to envisage applications and further developments for mobile soft robotics. © 2017 The Author(s).

  15. PARLOMA – A Novel Human-Robot Interaction System for Deaf-Blind Remote Communication

    Directory of Open Access Journals (Sweden)

    Ludovico Orlando Russo

    2015-05-01

    Full Text Available Deaf-blindness forces people to live in isolation. At present, there is no existing technological solution enabling two (or more) deaf-blind people to communicate remotely among themselves in tactile Sign Language (t-SL). When resorting to t-SL, deaf-blind people can communicate only with people physically present in the same place, because they are required to reciprocally explore their hands to exchange messages. We present a preliminary version of PARLOMA, a novel system to enable remote communication between deaf-blind persons. It is composed of a low-cost depth sensor as the only input device, paired with a robotic hand as the output device. Essentially, any user can perform hand-shapes in front of the depth sensor. The system is able to recognize a set of hand-shapes that are sent over the web and reproduced by an anthropomorphic robotic hand. PARLOMA can work as a “telephone” for deaf-blind people. Hence, it will dramatically improve the quality of life of deaf-blind persons. PARLOMA has been presented and supported by the main Italian deaf-blind association, Lega del Filo d'Oro. End users are involved in the design phase.
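
    The pipeline this record describes — a recognized hand-shape label serialized, sent over the web, and reproduced by a robotic hand — can be sketched as a simple encode/decode pair. The label set and joint angles below are invented for illustration; this is not the PARLOMA protocol.

    ```python
    # Toy sketch of a PARLOMA-style pipeline: a recognized hand-shape label is
    # serialized as JSON on the sender side and decoded into joint targets for
    # a robotic hand on the receiver side. Labels and angles are invented.

    import json

    # Hypothetical mapping from hand-shape label to finger joint angles (degrees).
    HANDSHAPE_POSES = {
        "fist":  {"thumb": 90, "index": 90, "middle": 90, "ring": 90, "pinky": 90},
        "point": {"thumb": 90, "index": 0,  "middle": 90, "ring": 90, "pinky": 90},
    }

    def encode(handshape: str) -> str:
        """Sender side: package the recognized label as a JSON message."""
        return json.dumps({"type": "handshape", "label": handshape})

    def decode(message: str) -> dict:
        """Receiver side: map the label back to a hand pose to reproduce."""
        label = json.loads(message)["label"]
        return HANDSHAPE_POSES[label]

    print(decode(encode("point"))["index"])   # → 0
    ```

    Transmitting a discrete label rather than raw depth frames keeps the bandwidth of such a "telephone" low, which is one reasonable motivation for recognizing a fixed hand-shape set on the sender side.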

  16. What you say is not what you get: arguing for artificial languages instead of natural languages in human robot speech interaction

    NARCIS (Netherlands)

    Mubin, O.; Bartneck, C.; Feijs, L.M.G.

    2009-01-01

    The project described hereunder focuses on the design and implementation of a "Artificial Robotic Interaction Language", where the research goal is to find a balance between the effort necessary from the user to learn a new language and the resulting benefit of optimized automatic speech recognition

  17. Audio-Visual Tibetan Speech Recognition Based on a Deep Dynamic Bayesian Network for Natural Human Robot Interaction

    Directory of Open Access Journals (Sweden)

    Yue Zhao

    2012-12-01

    Full Text Available Audio-visual speech recognition is a natural and robust approach to improving human-robot interaction in noisy environments. Although multi-stream Dynamic Bayesian Networks and coupled HMMs are widely used for audio-visual speech recognition, they fail to learn the shared features between modalities and ignore the dependency of features among the frames within each discrete state. In this paper, we propose a Deep Dynamic Bayesian Network (DDBN) to perform unsupervised extraction of spatial-temporal multimodal features from Tibetan audio-visual speech data and build an accurate audio-visual speech recognition model without a frame-independence assumption. The experimental results on Tibetan speech data from real-world environments show that the proposed DDBN outperforms state-of-the-art methods in word recognition accuracy.

  18. Effects of Interruptibility-Aware Robot Behavior

    OpenAIRE

    Banerjee, Siddhartha; Silva, Andrew; Feigh, Karen; Chernova, Sonia

    2018-01-01

    As robots become increasingly prevalent in human environments, there will inevitably be times when a robot needs to interrupt a human to initiate an interaction. Our work introduces the first interruptibility-aware mobile robot system, and evaluates the effects of interruptibility-awareness on human task performance, robot task performance, and on human interpretation of the robot's social aptitude. Our results show that our robot is effective at predicting interruptibility at high accuracy, ...

  19. CTIO, ROSAT HRI, and Chandra ACIS Observations of the Archetypical Mixed-morphology Supernova Remnant W28 (G6.4–0.1)

    International Nuclear Information System (INIS)

    Pannuti, Thomas G.; Kosakowski, Alekzander R.; Ernst, Sonny; Rho, Jeonghee; Kargaltsev, Oleg; Rangelov, Blagoy; Hare, Jeremy; Winkler, P. Frank; Keohane, Jonathan W.

    2017-01-01

    We present a joint analysis of optical emission-line and X-ray observations of the archetypical Galactic mixed-morphology supernova remnant (MMSNR) W28 (G6.4–0.1). MMSNRs comprise a class of sources whose shell-like radio morphology contrasts with a filled center in X-rays; the origin of these contrasting morphologies remains uncertain. Our CTIO images reveal enhanced [S ii] emission relative to H α along the northern and eastern rims of W28. Hydroxyl (OH) masers are detected along these same rims, supporting prior studies suggesting that W28 is interacting with molecular clouds at these locations, as observed for several other MMSNRs. Our ROSAT HRI mosaic of W28 provides almost complete coverage of the supernova remnant (SNR). The X-ray and radio emission is generally anti-correlated, except for the luminous northeastern rim, which is prominent in both bands. Our Chandra observation sampled the X-ray-luminous central diffuse emission. Spectra extracted from the bright central peak and from nearby annular regions are best fit with two overionized recombining plasma models. We also find that while the X-ray emission from the central peak is dominated by swept-up material, that from the surrounding regions shows evidence for oxygen-rich ejecta, suggesting that W28 was produced by a massive progenitor. We also analyze the X-ray properties of two X-ray sources (CXOU J175857.55−233400.3 and 3XMM J180058.5–232735) projected into the interior of W28 and conclude that neither is a neutron star associated with the SNR. The former is likely to be a foreground cataclysmic variable or a quiescent low-mass X-ray-binary, while the latter is likely to be a coronally active main-sequence star.

  20. CTIO, ROSAT HRI, and Chandra ACIS Observations of the Archetypical Mixed-morphology Supernova Remnant W28 (G6.4–0.1)

    Energy Technology Data Exchange (ETDEWEB)

    Pannuti, Thomas G.; Kosakowski, Alekzander R.; Ernst, Sonny [Space Science Center, Department of Earth and Space Sciences, Morehead State University, 235 Martindale Drive, Morehead, KY 40351 (United States); Rho, Jeonghee [SETI Institute, 189 Bernardo Avenue, Mountain View, CA 94043 (United States); Kargaltsev, Oleg; Rangelov, Blagoy; Hare, Jeremy [Department of Physics, 214 Samson Hall, George Washington University, Washington, D.C. 20052 (United States); Winkler, P. Frank [Department of Physics, Middlebury College, Middlebury, VT 05753 (United States); Keohane, Jonathan W., E-mail: t.pannuti@moreheadstate.edu, E-mail: jrho@seti.org, E-mail: jrho@sofia.usra.edu, E-mail: kargaltsev@gwu.edu, E-mail: alekzanderkos@ou.edu, E-mail: winkler@middlebury.edu, E-mail: jkeohane@hsc.edu [Department of Physics and Astronomy, Hampden-Sydney College, Hampden-Sydney, VA 23943 (United States)

    2017-04-10

    We present a joint analysis of optical emission-line and X-ray observations of the archetypical Galactic mixed-morphology supernova remnant (MMSNR) W28 (G6.4–0.1). MMSNRs comprise a class of sources whose shell-like radio morphology contrasts with a filled center in X-rays; the origin of these contrasting morphologies remains uncertain. Our CTIO images reveal enhanced [S ii] emission relative to H α along the northern and eastern rims of W28. Hydroxyl (OH) masers are detected along these same rims, supporting prior studies suggesting that W28 is interacting with molecular clouds at these locations, as observed for several other MMSNRs. Our ROSAT HRI mosaic of W28 provides almost complete coverage of the supernova remnant (SNR). The X-ray and radio emission is generally anti-correlated, except for the luminous northeastern rim, which is prominent in both bands. Our Chandra observation sampled the X-ray-luminous central diffuse emission. Spectra extracted from the bright central peak and from nearby annular regions are best fit with two overionized recombining plasma models. We also find that while the X-ray emission from the central peak is dominated by swept-up material, that from the surrounding regions shows evidence for oxygen-rich ejecta, suggesting that W28 was produced by a massive progenitor. We also analyze the X-ray properties of two X-ray sources (CXOU J175857.55−233400.3 and 3XMM J180058.5–232735) projected into the interior of W28 and conclude that neither is a neutron star associated with the SNR. The former is likely to be a foreground cataclysmic variable or a quiescent low-mass X-ray-binary, while the latter is likely to be a coronally active main-sequence star.

  1. Robot engineering

    International Nuclear Information System (INIS)

    Jung, Seul

    2006-02-01

    This book deals with robot engineering, covering the history of robots, current trends in the field, the work and characteristics of industrial robots, essentials of vectors and matrix operations, the Denavit-Hartenberg convention, robot kinematics (forward kinematics, inverse kinematics, MATLAB program examples, and motion kinematics), robot dynamics (moment of inertia, centrifugal and Coriolis forces, and the Euler-Lagrange equation), and SIMULINK position control of robots.
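
    The forward-kinematics material mentioned here rests on chaining the standard Denavit-Hartenberg transform for each joint and reading off the end-effector position. A self-contained sketch for a planar 2-link arm with unit link lengths (an illustrative example, not taken from the book):

    ```python
    # Sketch of Denavit-Hartenberg forward kinematics: build the standard DH
    # homogeneous transform per joint, chain them, and read the end-effector
    # position from the last column. Planar 2-link arm, unit link lengths.

    from math import cos, sin, pi

    def dh(theta, d, a, alpha):
        """Standard DH homogeneous transform for one joint."""
        ct, st, ca, sa = cos(theta), sin(theta), cos(alpha), sin(alpha)
        return [[ct, -st * ca,  st * sa, a * ct],
                [st,  ct * ca, -ct * sa, a * st],
                [0.0,      sa,       ca,      d],
                [0.0,     0.0,      0.0,    1.0]]

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    def forward(theta1, theta2, l1=1.0, l2=1.0):
        T = matmul(dh(theta1, 0, l1, 0), dh(theta2, 0, l2, 0))
        return T[0][3], T[1][3]          # end-effector (x, y)

    x, y = forward(pi / 2, -pi / 2)      # shoulder up, elbow bent back level
    print(round(x, 6), round(y, 6))      # → 1.0 1.0
    ```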

  2. Robot engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Seul

    2006-02-15

    This book deals with robot engineering, covering the history of robots, current trends in the field, the work and characteristics of industrial robots, essentials of vectors and matrix operations, the Denavit-Hartenberg convention, robot kinematics (forward kinematics, inverse kinematics, MATLAB program examples, and motion kinematics), robot dynamics (moment of inertia, centrifugal and Coriolis forces, and the Euler-Lagrange equation), and SIMULINK position control of robots.

  3. The Mobile Robot "Little Helper"

    DEFF Research Database (Denmark)

    Hvilshøj, Mads; Bøgh, Simon; Madsen, Ole

    2009-01-01

    Increased customer needs and intensified global competition require intelligent and flexible automation. The interaction technology mobile robotics addresses this, so it holds great potential within the industry. This paper presents the concepts, ideas and working principles of the mobile robot...... this show promising results regarding industrial integration, exploitation and maturation of mobile robotics....

  4. Cognitive Tools for Humanoid Robots in Space

    National Research Council Canada - National Science Library

    Sofge, Donald; Perzanowski, Dennis; Skubic, Marjorie; Bugajska, Magdalena; Trafton, J. G; Cassimatis, Nicholas; Brock, Derek; Adams, William; Schultz, Alan

    2004-01-01

    .... The key to achieving this interaction is to provide the robot with sufficient skills for natural communication with humans so that humans can interact with the robot almost as though it were another human...

  5. 2016 International Symposium on Experimental Robotics

    CERN Document Server

    Nakamura, Yoshihiko; Khatib, Oussama; Venture, Gentiane

    2017-01-01

    Experimental Robotics XV is the collection of papers presented at the International Symposium on Experimental Robotics, Roppongi, Tokyo, Japan on October 3-6, 2016. 73 scientific papers were selected and presented after peer review. The papers span a broad range of sub-fields in robotics including aerial robots, mobile robots, actuation, grasping, manipulation, planning and control and human-robot interaction, but shared cutting-edge approaches and paradigms to experimental robotics. The readers will find a breadth of new directions of experimental robotics. The International Symposium on Experimental Robotics is a series of bi-annual symposia sponsored by the International Foundation of Robotics Research, whose goal is to provide a forum dedicated to experimental robotics research. Robotics has been widening its scientific scope, deepening its methodologies and expanding its applications. However, the significance of experiments remains and will remain at the center of the discipline. The ISER gatherings are...

  6. Robot Games for Elderly

    DEFF Research Database (Denmark)

    Hansen, Søren Tranberg

    2011-01-01

    improve a person’s overall health, and this thesis investigates how games based on an autonomous, mobile robot platform, can be used to motivate elderly to move physically while playing. The focus of the investigation is on the development of games for an autonomous, mobile robot based on algorithms using...... spatio-temporal information about player behaviour - more specifically, I investigate three types of games each using a different control strategy. The first game is based on basic robot control which allows the robot to detect and follow a person. A field study in a rehabilitation centre and a nursing....... The robot facilitates interaction, and the study suggests that robot based games potentially can be used for training balance and orientation. The second game consists in an adaptive game algorithm which gradually adjusts the game challenge to the mobility skills of the player based on spatio...
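
    The adaptive game algorithm described above gradually adjusts the challenge to the player's mobility skills. A generic sketch of such gradual adaptation (the step sizes, bounds, and session history are invented; this is not the thesis's actual algorithm):

    ```python
    # Generic sketch of gradual challenge adaptation: nudge a difficulty
    # parameter (e.g., how far the robot moves from the player) up after
    # successes and down after failures, clamped to a fixed range.
    # Step size, bounds, and the outcome history are invented.

    def adapt(difficulty: float, player_succeeded: bool,
              step: float = 0.1, lo: float = 0.0, hi: float = 1.0) -> float:
        """Return the next difficulty level, clamped to [lo, hi]."""
        difficulty += step if player_succeeded else -step
        return max(lo, min(hi, difficulty))

    level = 0.5
    for outcome in [True, True, False, True]:   # invented session history
        level = adapt(level, outcome)
    print(round(level, 2))   # → 0.7
    ```

    Clamping matters in this setting: the challenge should never drift beyond what is safe for a player training balance and orientation.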

  7. Information Engineering in Autonomous Robot Software

    NARCIS (Netherlands)

    Ziafati, P.

    2015-01-01

    In order to engage and help in our daily life, autonomous robots are to operate in dynamic and unstructured environments and interact with people. As the robot's environment and its behaviour are getting more complex, so are the robot's software and the knowledge that the robot needs to carry out

  8. Beyond R2D2 - The design of nonverbal interaction behavior optimized for robot-specific morphologies

    NARCIS (Netherlands)

    Karreman, Daphne Eleonora

    2016-01-01

    It is likely that in the near future we will meet more and more robots that will perform tasks in social environments, such as shopping malls, airports or museums. However, design guidelines that inform the design of effective nonverbal behavior for robots are scarce. This is surprising since the

  9. Active Vision for Sociable Robots

    National Research Council Canada - National Science Library

    Breazeal, Cynthia; Edsinger, Aaron; Fitzpatrick, Paul; Scassellati, Brian

    2001-01-01

    .... In humanoid robotic systems, or in any animate vision system that interacts with people, social dynamics provide additional levels of constraint and provide additional opportunities for processing economy...

  10. Multimodal Interaction in Ambient Intelligence Environments Using Speech, Localization and Robotics

    Science.gov (United States)

    Galatas, Georgios

    2013-01-01

    An Ambient Intelligence Environment is meant to sense and respond to the presence of people, using its embedded technology. In order to effectively sense the activities and intentions of its inhabitants, such an environment needs to utilize information captured from multiple sensors and modalities. By doing so, the interaction becomes more natural…

  11. An Interactive Robotic Fish Exhibit for Designed Settings in Informal Science Learning

    Science.gov (United States)

    Phamduy, Paul; Leou, Mary; Milne, Catherine; Porfiri, Maurizio

    2017-01-01

    Informal science learning aims to improve public understanding of STEM. Free-choice learners can be engaged in a wide range of experiences, ranging from watching entertaining educational videos to actively participating in hands-on projects. Efforts in informal science learning are often gauged by their ability to elicit interaction, to foster…

  12. Dynamics Analysis of Fluid-Structure Interaction for a Biologically-Inspired Biped Robot Running on Water

    Directory of Open Access Journals (Sweden)

    Linsen Xu

    2013-10-01

    Full Text Available A kinematics analysis of a biologically-inspired biped robot is carried out, and the trajectory of the robot foot is understood. For calculating the pressure distribution across a robot foot before touching the surface of water, the compression flow of air and the depression motion of the water surface are considered. The pressure model after touching the water surface has been built according to the theory of rigid body planar motion. The multi-material ALE algorithm is applied to emulate the course of the foot slapping water. The simulation results indicate that the model of the bionic robot can satisfy the water-running function. The real prototype of the robot is manufactured to test its function of running on water. When the biped robot is running on water, the average force generated by the propulsion mechanism is about 1.3N. The experimental results show that the propulsion system can satisfy the requirement of biped robot running on water.

  13. The Olympic Games as a cultural event (Olympijské hry jako kulturní událost)

    Directory of Open Access Journals (Sweden)

    Zvezdan Savić

    2007-02-01

    Full Text Available The Olympic Games have become an event encompassing many sports, engaging not only athletes from different countries but also millions of spectators around the world. In terms of public interest, they therefore surpass any other sporting or cultural event. From the initiative of a few countries and a small number of athletes, a historic phenomenon of today's civilization has developed, taking place at a designated venue with its own competitors, programme and rules. The Olympic Games bring together the ideologies of different nations in a single place: ideologies of religion, customs, traditions, languages, and cultures in general. They involve mass communication between the competitors and the rest of the world. Social, scientific, sport-technical and political developments have opened wide horizons for sport as a socio-cultural phenomenon, and sport has become a public good. More than one hundred and ninety countries now take part in the Olympic Games. Athletes and officials come from diverse social backgrounds, something unimaginable at the time the Games originated. The social communication among young people from all over the world within this grand event is thus a significant element of today's Olympic Games, which combine masterful athletic performances with encounters between young people regardless of ideology, race and religion. This is precisely what makes the Games exceptionally beautiful and significant. The authors of this study sought to provide a more detailed explanation of the important cultural aspects of the Olympic Games and to show their social context.

  14. A Novel Interactive Exoskeletal Robot for Overground Locomotion Studies in Rats.

    Science.gov (United States)

    Song, Yun Seong; Hogan, Neville

    2015-07-01

    This paper introduces a newly developed apparatus, Iron Rat, for locomotion research in rodents. Its main purpose is to allow maximal freedom of voluntary overground movement of the animal while providing forceful interaction to the hindlimbs. Advantages and challenges of the proposed exoskeletal apparatus over other existing designs are discussed. Design and implementation challenges are presented and discussed, emphasizing their implications for free, voluntary movement of the animal. A live-animal experiment was conducted to assess the design. Unconstrained natural movement of the animal was compared with its movement with the exoskeletal module attached. The compact design and back-drivable implementation of this apparatus will allow novel experimental manipulations that may include forceful yet compliant dynamic interaction with the animal's overground locomotion.

  15. A review on humanoid robotics in healthcare

    OpenAIRE

    Joseph Azeta; Christian Bolu; Abiodun Abioye A.; Oyawale Festus

    2018-01-01

    Humanoid robots have evolved over the years and today it is in many different areas of applications, from homecare to social care and healthcare robotics. This paper deals with a brief overview of the current and potential applications of humanoid robotics in healthcare settings. We present a comprehensive contextualization of humanoid robots in healthcare by identifying and characterizing active research activities on humanoid robot that can work interactively and effectively with humans so ...

  16. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction

    Science.gov (United States)

    2014-07-01

    however, 27 of these articles had insufficient information to calculate effect size. Authors were contacted via email and were given 5 weeks to... [Moderating factors examined: Multitasking, Personality, Robot Personality, Communication Mode, States, Team Collaboration, Fatigue, Capability, In-group Membership, Stress]

  17. Robot Aesthetics

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance Jonathan

    This paper considers art-based research practice in robotics through a discussion of our course and relevant research projects in autonomous art. The undergraduate course integrates basic concepts of computer science, robotic art, live performance and aesthetic theory. Through practice...... in robotics research (such as aesthetics, culture and perception), we believe robot aesthetics is an important area for research in contemporary aesthetics....

  18. Filigree Robotics

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Clausen Nørgaard, Esben

    2016-01-01

    Filigree Robotics experiments with the combination of traditional ceramic craft with robotic fabrication in order to generate a new narrative of fine three-dimensional ceramic ornament for architecture.

  19. Cross-cultural study on human-robot greeting interaction : acceptance and discomfort by Egyptians and Japanese

    OpenAIRE

    Trovato, G.; Zecca, M.; Sessa, S.; Jamone, L.; Ham, J.R.C.; Hashimoto, K.; Takanishi, A.

    2013-01-01

    As witnessed in several behavioural studies, a complex relationship exists between people’s cultural background and their general acceptance of robots. However, very few studies have investigated whether a robot’s language and gestures, grounded in a particular culture, have an impact on people of different cultures. The purpose of this work is to provide experimental evidence which supports the idea that humans may more easily accept a robot that can adapt to their specific cultur...

  20. Motion control for a walking companion robot with a novel human–robot interface

    Directory of Open Access Journals (Sweden)

    Yunqi Lv

    2016-09-01

    Full Text Available A walking companion robot is presented for rehabilitation from dyskinesia of the lower limbs in this article. A new human–robot interface (HRI) is designed which adopts a one-axis force sensor and a potentiometer connector to detect the motion of the user. To track the displacement and angle between the user and the robot precisely in real time, common motions are classified into two elemental motion states. With a distinction method for these motion states, a classification scheme for motion control is adopted. A mathematical model-based control method is first introduced and the corresponding control systems are built. Due to the unavoidable deviation of the mathematical model-based control method, a force control method is then proposed and the corresponding control systems are built. The corresponding simulations demonstrate the efficiency of the two proposed control methods. The experimental data and robot paths verify the two control methods and indicate that the force control method can better satisfy the user’s requirements.
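
    The force control method above maps readings from the HRI (interaction force and connector angle) into robot velocity commands. A minimal sketch of that mapping with a dead zone, under invented gains and thresholds (not the controller from the article):

    ```python
    # Minimal sketch of force-based motion control for a walking companion
    # robot: map HRI force into linear velocity and connector angle into
    # angular velocity, with a dead zone so sensor noise does not move the
    # robot. Gains and thresholds are invented for illustration.

    def velocity_command(force_n: float, angle_rad: float,
                         k_v: float = 0.2, k_w: float = 2.0,
                         dead_zone_n: float = 2.0):
        """Return (linear m/s, angular rad/s) commands from HRI readings."""
        if abs(force_n) < dead_zone_n:
            return 0.0, 0.0                      # user is not pushing/pulling
        return k_v * force_n, k_w * angle_rad    # follow the user's lead

    print(velocity_command(1.0, 0.3))    # → (0.0, 0.0)   inside dead zone
    print(velocity_command(5.0, 0.25))   # → (1.0, 0.5)
    ```

    A dead zone of this kind is a common way to keep an assistive platform stationary while the user merely rests a hand on the interface.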

  1. Social Impact of Recharging Activity in Long-Term HRI and Verbal Strategies to Manage User Expectations During Recharge

    Directory of Open Access Journals (Sweden)

    Amol Deshmukh

    2018-04-01

    Full Text Available Social robots perform tasks to help humans in their daily activities. However, if they fail to fulfill expectations this may affect their acceptance. This work investigates the service degradation caused by recharging, during which the robot is socially inactive. We describe two studies conducted in an ecologically valid office environment. In the first long-term study (3 weeks), we investigated the service degradation caused by the recharging behavior of a social robot. In the second study, we explored the social strategies used to manage users’ expectations during recharge. Our findings suggest that the use of verbal strategies (transparency, apology, and politeness) can make robots more acceptable to users during recharge.

  2. On the role of exchange of power and information signals in control and stability of the human-robot interaction

    Science.gov (United States)

    Kazerooni, H.

    1991-01-01

    A human's ability to perform physical tasks is limited, not only by his intelligence, but by his physical strength. If, in an appropriate environment, a machine's mechanical power is closely integrated with a human arm's mechanical power under the control of the human intellect, the resulting system will be superior to a loosely integrated combination of a human and a fully automated robot. Therefore, we must develop a fundamental solution to the problem of 'extending' human mechanical power. The work presented here defines 'extenders' as a class of robot manipulators worn by humans to increase human mechanical strength, while the wearer's intellect remains the central control system for manipulating the extender. The human, in physical contact with the extender, exchanges power and information signals with the extender. The aim is to determine the fundamental building blocks of an intelligent controller, a controller which allows interaction between humans and a broad class of computer-controlled machines via simultaneous exchange of both power and information signals. The prevalent trend in automation has been to physically separate the human from the machine so the human must always send information signals via an intermediary device (e.g., joystick, pushbutton, light switch). Extenders, however are perfect examples of self-powered machines that are built and controlled for the optimal exchange of power and information signals with humans. The human wearing the extender is in physical contact with the machine, so power transfer is unavoidable and information signals from the human help to control the machine. Commands are transferred to the extender via the contact forces and the EMG signals between the wearer and the extender. The extender augments human motor ability without accepting any explicit commands: it accepts the EMG signals and the contact force between the person's arm and the extender, and the extender 'translates' them into a desired position. 
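A common way to realize this force-to-motion 'translation' is admittance control, in which the measured contact force drives a commanded velocity through a virtual mass-damper. The sketch below is illustrative only: the one-dimensional setup, class name, and gains are assumptions, not details from the paper.

```python
class AdmittanceController:
    """Virtual mass-damper: translates a contact force into a desired velocity."""

    def __init__(self, mass=2.0, damping=8.0):
        self.mass = mass        # virtual inertia the wearer feels [kg]
        self.damping = damping  # virtual damping [N*s/m]
        self.v = 0.0            # commanded velocity [m/s]

    def step(self, f_contact, dt):
        # integrate m*dv/dt = f - b*v with explicit Euler
        self.v += (f_contact - self.damping * self.v) / self.mass * dt
        return self.v

ctrl = AdmittanceController()
for _ in range(2000):          # 2 s at a 1 kHz control rate
    v = ctrl.step(4.0, dt=0.001)
# a steady 4 N push converges toward f/b = 0.5 m/s
```

A steady push of f newtons converges to v = f/b, so the virtual damping sets how 'heavy' the extender feels to the wearer.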

  3. Needle-tissue interactive mechanism and steering control in image-guided robot-assisted minimally invasive surgery: a review.

    Science.gov (United States)

    Li, Pan; Yang, Zhiyong; Jiang, Shan

    2018-06-01

    Image-guided robot-assisted minimally invasive surgery is an important medical procedure used for biopsy or local targeted therapy. In order to reach target regions not accessible with traditional techniques, long, thin flexible needles are inserted into soft tissue, which exhibits large deformation and nonlinear characteristics. The detection results and therapeutic effect, however, are directly influenced by the targeting accuracy of needle steering. For this reason, the needle-tissue interaction mechanism, path planning, and steering control are investigated in this review by surveying the literature of the last 10 years, resulting in a comprehensive overview of existing techniques with their main accomplishments, limitations, and recommendations. Through this analysis, surgical simulation of insertion into multi-layer inhomogeneous tissue, which accurately predicts nonlinear needle deflection and tissue deformation, is identified as a primary aspect to be explored. Path planning for flexible needles is recommended to target anatomical or deformable environments that capture tissue deformation. Nonholonomic modeling combined with duty-cycled spinning for needle steering, which tracks the tip position in real time and compensates for deviation errors, is recommended as a future research focus for steering control in anatomical and deformable environments. Graphical abstract: a Insertion force when the needle is inserted into soft tissue. b Needle deflection model when the needle is inserted into soft tissue [68]. c Path planning in anatomical environments [92]. d Duty-cycled spinning incorporated in nonholonomic needle steering [64].
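The duty-cycled spinning idea recommended by the review can be captured in a minimal 2-D kinematic sketch: a bevel-tip needle bends with a fixed natural curvature, continuous spinning cancels the bend, and so the effective curvature scales with the fraction of insertion time spent not spinning. All parameters below are illustrative, not values from the cited works.

```python
import math

def needle_tip_path(kappa_max, duty_cycle, step_len, n_steps):
    """2-D kinematic sketch of duty-cycled bevel-tip needle steering.

    During the fraction `duty_cycle` of each insertion step the needle
    spins and travels straight; for the rest it bends with its natural
    curvature kappa_max, giving an effective curvature
    (1 - duty_cycle) * kappa_max.
    """
    kappa_eff = (1.0 - duty_cycle) * kappa_max
    x, y, theta = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        theta += kappa_eff * step_len        # heading follows the arc
        x += step_len * math.cos(theta)
        y += step_len * math.sin(theta)
        path.append((x, y))
    return path
```

A duty cycle of 1 yields a straight insertion, a duty cycle of 0 the tightest possible arc, and intermediate values let the planner command any curvature in between.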

  4. Real-time face and gesture analysis for human-robot interaction

    Science.gov (United States)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different mechanisms, such as spoken language, facial expressions, and gestures. Facial expressions and gestures are among the main nonverbal communication channels and convey large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and of hand and head gestures are of great importance. We present a system that tackles these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Using this model, different kinds of information are extracted from the image data and then handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and face-related features, low-level image features of the human hand (optical flow, Hu moments) are stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed with different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, classical decision trees or more sophisticated support vector machines are used for classification. The results of the classification processes are again handed over to the RTDB, where other processes (such as a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
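The core of HMM-based gesture classification is the forward algorithm: each gesture model scores the observed feature sequence, and the model with the highest likelihood wins. The toy models below (two states, binary observation symbols, made-up probabilities) are a generic textbook sketch, not the models used in the paper.

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: returns log P(obs | model).

    obs: sequence of discrete observation symbols
    pi:  initial state distribution, shape (n,)
    A:   state transition matrix, shape (n, n)
    B:   emission matrix, shape (n, m)
    """
    alpha = pi * B[:, obs[0]]          # joint prob. of state and first symbol
    c = alpha.sum()
    log_p = np.log(c)
    alpha /= c                         # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
        c = alpha.sum()
        log_p += np.log(c)
        alpha /= c
    return log_p

# Two toy gesture models over a binary motion symbol (0 = "hand moving
# left", 1 = "hand moving right"): "wave" expects alternation, "point"
# expects a steady symbol.
B = np.array([[0.9, 0.1],
              [0.1, 0.9]])            # state 0 emits 0, state 1 emits 1
models = {
    "wave":  (np.array([0.5, 0.5]), np.array([[0.1, 0.9], [0.9, 0.1]]), B),
    "point": (np.array([0.5, 0.5]), np.array([[0.9, 0.1], [0.1, 0.9]]), B),
}

def classify(obs):
    return max(models, key=lambda name: log_likelihood(obs, *models[name]))
```

An alternating sequence such as [0, 1, 0, 1, ...] scores higher under the "wave" model, while a constant sequence favors "point"; a real system would run the same competition over feature symbols streamed from the RTDB.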

  5. Architecture for robot intelligence

    Science.gov (United States)

    Peters, II, Richard Alan (Inventor)

    2004-01-01

    An architecture for robot intelligence enables a robot to learn new behaviors, create new behavior sequences autonomously, and interact with a dynamically changing environment. Sensory information is mapped onto a Sensory Ego-Sphere (SES) that rapidly identifies important changes in the environment and functions much like short-term memory. Behaviors are stored in a DBAM that creates an active map from the robot's current state to a goal state and functions much like long-term memory. A dream state converts recent activities stored in the SES and creates or modifies behaviors in the DBAM.

  6. Towards Using a Generic Robot as Training Partner

    DEFF Research Database (Denmark)

    Sørensen, Anders Stengaard; Savarimuthu, Thiusius Rajeeth; Nielsen, Jacob

    2014-01-01

    In this paper, we demonstrate how a generic industrial robot can be used as a training partner, for upper limb training. The motion path and human/robot interaction of a non-generic upper-arm training robot is transferred to a generic industrial robot arm, and we demonstrate that the robot arm can...... implement the same type of interaction, but can expand the training regime to include both upper arm and shoulder training. We compare the generic robot to two affordable but custom-built training robots, and outline interesting directions for future work based on these training robots....

  7. 3D light robotics

    DEFF Research Database (Denmark)

    Glückstad, Jesper; Palima, Darwin; Villangca, Mark Jayson

    2016-01-01

    As celebrated by the Nobel Prize 2014 in Chemistry light-based technologies can now overcome the diffraction barrier for imaging with nanoscopic resolution by so-called super-resolution microscopy1. However, interactive investigations coupled with advanced imaging modalities at these small scale ...... research discipline that could potentially be able to offer the full packet needed for true "active nanoscopy" by use of so-called light-driven micro-robotics or Light Robotics in short....

  8. What makes a robot 'social'?

    Science.gov (United States)

    Jones, Raya A

    2017-08-01

    Rhetorical moves that construct humanoid robots as social agents disclose tensions at the intersection of science and technology studies (STS) and social robotics. The discourse of robotics often constructs robots that are like us (and therefore unlike dumb artefacts). In the discourse of STS, descriptions of how people assimilate robots into their activities are presented directly or indirectly against the backdrop of actor-network theory, which prompts attributing agency to mundane artefacts. In contradistinction to both social robotics and STS, it is suggested here that to view a capacity to partake in dialogical action (to have a 'voice') is necessary for regarding an artefact as authentically social. The theme is explored partly through a critical reinterpretation of an episode that Morana Alač reported and analysed towards demonstrating her bodies-in-interaction concept. This paper turns to 'body' with particular reference to Gibsonian affordances theory so as to identify the level of analysis at which dialogicality enters social interactions.

  9. Modelling Engagement in Multi-Party Conversations : Data-Driven Approaches to Understanding Human-Human Communication Patterns for Use in Human-Robot Interactions

    OpenAIRE

    Oertel, Catharine

    2016-01-01

    The aim of this thesis is to study human-human interaction in order to provide virtual agents and robots with the capability to engage into multi-party-conversations in a human-like-manner. The focus lies with the modelling of conversational dynamics and the appropriate realization of multi-modal feedback behaviour. For such an undertaking, it is important to understand how human-human communication unfolds in varying contexts and constellations over time. To this end, multi-modal human-human...

  10. Robotic environments

    NARCIS (Netherlands)

    Bier, H.H.

    2011-01-01

    Technological and conceptual advances in fields such as artificial intelligence, robotics, and material science have enabled robotic architectural environments to be implemented and tested in the last decade in virtual and physical prototypes. These prototypes are incorporating sensing-actuating

  11. Healthcare Robotics

    OpenAIRE

    Riek, Laurel D.

    2017-01-01

    Robots have the potential to be a game changer in healthcare: improving health and well-being, filling care gaps, supporting care givers, and aiding health care workers. However, before robots are able to be widely deployed, it is crucial that both the research and industrial communities work together to establish a strong evidence-base for healthcare robotics, and surmount likely adoption barriers. This article presents a broad contextualization of robots in healthcare by identifying key sta...

  12. Toward cognitive robotics

    Science.gov (United States)

    Laird, John E.

    2009-05-01

    Our long-term goal is to develop autonomous robotic systems that have the cognitive abilities of humans, including communication, coordination, adapting to novel situations, and learning through experience. Our approach rests on the recent integration of the Soar cognitive architecture with both virtual and physical robotic systems. Soar has been used to develop a wide variety of knowledge-rich agents for complex virtual environments, including distributed training environments and interactive computer games. For development and testing in robotic virtual environments, Soar interfaces to a variety of robotic simulators and a simple mobile robot. We have recently made significant extensions to Soar that add new memories and new non-symbolic reasoning to Soar's original symbolic processing, which should significantly improve Soar's abilities for the control of robots. These extensions include episodic memory, semantic memory, reinforcement learning, and mental imagery. Episodic memory and semantic memory support the learning and recall of prior events and situations as well as facts about the world. Reinforcement learning enables the system to tune its procedural knowledge - knowledge about how to do things. Mental imagery supports the use of diagrammatic and visual representations that are critical for spatial reasoning. We speculate on the future of unmanned systems and the need for cognitive robotics to support dynamic instruction and taskability.

  13. Industrial Robots.

    Science.gov (United States)

    Reed, Dean; Harden, Thomas K.

    Robots are mechanical devices that can be programmed to perform some task of manipulation or locomotion under automatic control. This paper discusses: (1) early developments of the robotics industry in the United States; (2) the present structure of the industry; (3) noneconomic factors related to the use of robots; (4) labor considerations…

  14. The ethics of human-robot relationships

    NARCIS (Netherlands)

    de Graaf, M.M.A.

    2015-01-01

    Currently, human-robot interactions are constructed according to the rules of human-human interactions inviting users to interact socially with robots. Is there something morally wrong with deceiving humans into thinking they can foster meaningful interactions with a technological object? Or is this

  15. Tactical mobile robots for urban search and rescue

    Science.gov (United States)

    Blitch, John; Sidki, Nahid; Durkin, Tim

    2000-07-01

    Few disasters inspire more compassion for victims and families than those involving structural collapse. Video clips of children's bodies pulled from earthquake-stricken cities and bombing sites tend to invoke tremendous grief and sorrow because of the totally unpredictable nature of the crisis and the lack of even the slightest degree of negligence (as opposed to, for example, those who choose to ignore storm warnings). Heartbreaking stories of people buried alive for days provide a visceral and horrific perspective on some of the greatest fears ever to be imagined by human beings. Current trends toward urban sprawl and increasing human discord dictate that structural collapse disasters will continue to present themselves at an alarming rate. The proliferation of domestic terrorism, HAZMAT, and biological contaminants complicates the matter further and presents a daunting problem set for Urban Search and Rescue (USAR) organizations around the world. This paper amplifies the case for robot-assisted search and rescue first presented during the KNOBSAR project initiated at the Colorado School of Mines in 1995. It anticipates increasing technical development in mobile robot technologies and promotes their use for a wide variety of humanitarian assistance missions. Focus is placed on the development of advanced robotic systems that are employed in a complementary, tool-like fashion, as opposed to traditional robotic approaches that purport to replace humans in hazardous tasks. Operational challenges for USAR are presented first, followed by a brief history of mobile robot development. The paper then presents conformal robotics as a new design paradigm, with emphasis on variable geometry and volumes. A section on robot perception follows, with an initial attempt to characterize sensing in a volumetric manner. Collaborative rescue is then briefly discussed, with an emphasis on marsupial operations and linked mobility. The paper concludes with an emphasis on the human-robot interface.

  16. Traveling Robots and Their Cultural Baggage

    DEFF Research Database (Denmark)

    Blond, Lasse

    When social robots are imported from Asia to Europe they bring along cultural baggage consisting of foreign sociotechnical imaginaries. The effort to adapt the robot Silbot to Nordic social services exposed unfamiliar and culture-dependent views of care, cognition, health and human ... nature. Studying Silbot in "the wild" highlighted these issues as well as the human-robot interaction and the adaptation of the robot to real-life praxis. The importance of comprehending robots as parts of sociotechnical ensembles is emphasized, as well as the observance of how robots are shaped ... by the cultural context in the recipient countries....

  17. 25th Conference on Robotics in Alpe-Adria-Danube Region

    CERN Document Server

    Borangiu, Theodor

    2017-01-01

    This book presents the proceedings of the 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 held in Belgrade, Serbia, on June 30th–July 2nd, 2016. In keeping with the tradition of the event, RAAD 2016 covered all the important areas of research and innovation in new robot designs and intelligent robot control, with papers including Intelligent robot motion control; Robot vision and sensory processing; Novel design of robot manipulators and grippers; Robot applications in manufacturing and services; Autonomous systems, humanoid and walking robots; Human–robot interaction and collaboration; Cognitive robots and emotional intelligence; Medical, human-assistive robots and prosthetic design; Robots in construction and arts, and Evolution, education, legal and social issues of robotics. For the first time in RAAD history, the themes cloud robots, legal and ethical issues in robotics as well as robots in arts were included in the technical program. The book is a valuable resource f...

  18. Modelling of industrial robot in LabView Robotics

    Science.gov (United States)

    Banas, W.; Cwikła, G.; Foit, K.; Gwiazda, A.; Monica, Z.; Sekala, A.

    2017-08-01

    One can currently find many models of industrial systems, including robots. These models differ from each other not only in the accuracy of the represented parameters but also in the range of what they represent. For example, CAD models describe the geometry of the robot, and some even determine mass parameters such as mass, center of gravity, and moments of inertia. These models are used in the design of robotic lines and cells. Systems for off-line programming also use such models, and many of them can be exported to CAD. It is important to note that models for off-line programming describe not only the geometry but also contain the information necessary to create a program for the robot; exporting from CAD to an off-line programming system therefore requires additional information. These models are used for static determination of the reachability of points and for collision testing. That is enough to generate a program for the robot and even to check the interaction of elements of the production line or robotic cell. Mathematical models allow one to study the kinematic and dynamic properties of robot movement. In these models the geometry is not as important, so only selected parameters are used, such as the length of the robot arm, the center of gravity, and the moment of inertia. These parameters are introduced into the equations of motion of the robot, and the motion parameters are determined.
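As a minimal illustration of such a mathematical model, the forward kinematics of a planar two-link arm computes the end-effector position from nothing more than the link lengths and joint angles. This is a generic textbook formulation, not the specific model built in LabView Robotics in the paper.

```python
import math

def fk_planar_2link(l1, l2, q1, q2):
    """End-effector position of a planar two-link arm.

    l1, l2: link lengths
    q1, q2: joint angles [rad], q2 measured relative to link 1
    """
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# a fully stretched arm (both joints at zero) reaches l1 + l2 along x
x, y = fk_planar_2link(0.5, 0.3, 0.0, 0.0)
```

Dynamic models add the mass parameters (center of gravity, moments of inertia) on top of exactly these geometric quantities to form the equations of motion.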

  19. Robot Mechanisms

    CERN Document Server

    Lenarcic, Jadran; Stanišić, Michael M

    2013-01-01

    This book provides a comprehensive introduction to the area of robot mechanisms, primarily considering industrial manipulators and humanoid arms. The book is intended for both teaching and self-study. Emphasis is given to the fundamentals of kinematic analysis and the design of robot mechanisms. The coverage of topics is untypical. The focus is on robot kinematics. The book creates a balance between theoretical and practical aspects in the development and application of robot mechanisms, and includes the latest achievements and trends in robot science and technology.

  20. Robots Spur Software That Lends a Hand

    Science.gov (United States)

    2014-01-01

    While building a robot to assist astronauts in space, Johnson Space Center worked with partners to develop robot reasoning and interaction technology. The partners created Robonaut 1, which led to Robonaut 2, and the work also led to patents now held by Universal Robotics in Nashville, Tennessee. The NASA-derived technology is available for use in warehousing, mining, and more.

  1. Design of robust robotic proxemic behaviour

    NARCIS (Netherlands)

    Torta, E.; Cuijpers, R.H.; Juola, J.F.; Pol, van der D.; Mutlu, B.; Bartneck, C.; Ham, J.R.C.; Evers, V.; Kanda, T.

    2011-01-01

    Personal robots that share the same space with humans need to be socially acceptable and effective as they interact with people. In this paper we focus our attention on the definition of a behaviour-based robotic architecture that (1) allows the robot to navigate safely in a cluttered and

  2. European regulatory framework for person carrier robots

    NARCIS (Netherlands)

    Fosch Villaronga, E.; Roig, A.

    The aim of this paper is to establish the grounds for a future regulatory framework for Person Carrier Robots, which includes legal and ethical aspects. Current industrial standards focus on physical human–robot interaction, i.e. on the prevention of harm. Current robot technology nonetheless

  3. Molecular Robots Obeying Asimov's Three Laws of Robotics.

    Science.gov (United States)

    Kaminka, Gal A; Spokoini-Stern, Rachel; Amir, Yaniv; Agmon, Noa; Bachelet, Ido

    2017-01-01

    Asimov's three laws of robotics, which were shaped in the literary work of Isaac Asimov (1920-1992) and others, define a crucial code of behavior that fictional autonomous robots must obey as a condition for their integration into human society. While general implementation of these laws in robots is widely considered impractical, limited-scope versions have been demonstrated and have proven useful in spurring scientific debate on aspects of safety and autonomy in robots and intelligent systems. In this work, we use Asimov's laws to examine these notions in molecular robots fabricated from DNA origami. We successfully programmed these robots to obey, by means of interactions between individual robots in a large population, an appropriately scoped variant of Asimov's laws, and even to emulate the key scenario from Asimov's story "Runaround," in which a fictional robot gets into trouble despite adhering to the laws. Our findings show that abstract, complex notions can be encoded and implemented at the molecular scale, when we understand robots on this scale on the basis of their interactions.

  4. Robotics education

    International Nuclear Information System (INIS)

    Benton, O.

    1984-01-01

    Robotics education courses are rapidly spreading throughout the nation's colleges and universities. Engineering schools offer robotics courses as part of their mechanical or manufacturing engineering degree programs. Two-year colleges are developing an Associate Degree in robotics. In addition to regular courses, colleges offer seminars in robotics and related fields. These seminars draw excellent participation at costs running up to $200 per day for each participant; the last one drew 275 people from Texas to Virginia. Seminars are also offered by trade associations, private consulting firms, and robot vendors. IBM, for example, has the Robotic Assembly Institute in Boca Raton and charges about $1,000 per week for the course, which is basically for owners of IBM robots. Education (and training) can be as short as one day or as long as two years. Here is the educational pattern that is developing now.

  5. How does a surgeon's brain buzz? An EEG coherence study on the interaction between humans and robot.

    Science.gov (United States)

    Bocci, Tommaso; Moretto, Carlo; Tognazzi, Silvia; Briscese, Lucia; Naraci, Megi; Leocani, Letizia; Mosca, Franco; Ferrari, Mauro; Sartucci, Ferdinando

    2013-04-22

    In humans, both primary and non-primary motor areas are involved in the control of voluntary movements. However, the dynamics of functional coupling among different motor areas have not yet been fully clarified. To date there has been no research examining the functional dynamics in the brains of surgeons working in laparoscopy compared with those trained and working in robotic surgery. We enrolled 16 right-handed trained surgeons and assessed changes in intra- and inter-hemispheric EEG coherence with a 32-channel device during the same motor task performed with either a robotic or a laparoscopic approach. Estimates of auto- and coherence spectra were calculated by a fast Fourier transform algorithm implemented in Matlab 5.3. We found increased intra-hemispheric coherence in surgeons performing laparoscopy, especially in theta and lower alpha activity, in all experimental conditions (M1 vs. SMA, S1 vs. SMA, S1 vs. pre-SMA and M1 vs. S1), as well as increased inter-hemispheric coherence (right vs. left M1, right vs. left S1, right pre-SMA vs. left M1, left pre-SMA vs. right M1), suggesting a different functional coupling between the surgeon's brain and robotic devices.
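Coherence spectra of the kind used in the study can be estimated with Welch-averaged FFTs; the snippet below reproduces the idea on synthetic data with `scipy.signal.coherence`. The sampling rate, frequencies, and noise levels are illustrative choices, not the study's parameters.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 256                       # sampling rate [Hz]
t = np.arange(0, 8, 1 / fs)    # 8 s of synthetic "EEG"

# two channels sharing a 10 Hz (alpha-band) component plus independent noise
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.5 * rng.standard_normal(t.size)
ch2 = 0.8 * shared + 0.5 * rng.standard_normal(t.size)

# Welch-averaged magnitude-squared coherence, 1 Hz frequency resolution
f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=256)
alpha_peak = Cxy[np.argmin(np.abs(f - 10.0))]   # coherence near 10 Hz
```

Channels driven by a shared 10 Hz rhythm show high coherence at that frequency and low coherence elsewhere; the study applies the same estimate between electrode pairs over the motor areas.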

  6. Experiments on co-operating robot arms

    International Nuclear Information System (INIS)

    Arthaya, B.; De Schutter, J.

    1994-01-01

    When two robots manipulate a common object or perform a single task together, a closed kinematic chain is formed. If both robots are controlled under position control only, the interaction forces may become unacceptably high at some phase of the manipulation. These interaction forces are caused by kinematic as well as dynamic errors in the robot position controllers. In order to avoid this problem, a synchronized motion between both robots has to be generated, not only by controlling the position (velocity) of the two end-effectors, but also by controlling the interaction forces between them. To generate a synchronized motion, the first robot's controller continuously modifies the task-frame velocity according to the velocity of the other robot; that is, the velocity of the other robot is used as feed-forward information in order to anticipate its motion. This approach results in better tracking behaviour.
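The benefit of using the other robot's velocity as feed-forward can be seen in a one-dimensional toy simulation: with proportional position correction alone the follower lags behind with a steady-state error of roughly leader_v / kp, while adding the leader's velocity as feed-forward drives that error toward zero. The gains, velocities, and time step are made up for illustration.

```python
def track(leader_v, dt, steps, kp, feedforward):
    """Follower velocity command: optional leader-velocity feed-forward
    plus a proportional correction on the position error."""
    xl = xf = 0.0          # leader and follower positions [m]
    max_err = 0.0
    for _ in range(steps):
        xl += leader_v * dt
        v_cmd = (leader_v if feedforward else 0.0) + kp * (xl - xf)
        xf += v_cmd * dt
        max_err = max(max_err, abs(xl - xf))
    return max_err

# 5 s of motion at 0.2 m/s, 1 kHz control rate
err_ff = track(0.2, 0.001, 5000, kp=5.0, feedforward=True)
err_p  = track(0.2, 0.001, 5000, kp=5.0, feedforward=False)
```

Without feed-forward the tracking error settles near leader_v / kp (here 0.04 m); with feed-forward only a tiny discretization error remains, which is what keeps the interaction forces in the closed chain low.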

  7. Robots show us how to teach them: feedback from robots shapes tutoring behavior during action learning.

    Science.gov (United States)

    Vollmer, Anna-Lisa; Mühlig, Manuel; Steil, Jochen J; Pitsch, Karola; Fritsch, Jannik; Rohlfing, Katharina J; Wrede, Britta

    2014-01-01

    Robot learning by imitation requires the detection of a tutor's action demonstration and its relevant parts. Current approaches implicitly assume a unidirectional transfer of knowledge from tutor to learner. The presented work challenges this predominant assumption based on an extensive user study with an autonomously interacting robot. We show that by providing feedback, a robot learner influences the human tutor's movement demonstrations in the process of action learning. We argue that the robot's feedback strongly shapes how tutors signal what is relevant to an action and thus advocate a paradigm shift in robot action learning research toward truly interactive systems learning in and benefiting from interaction.

  8. Robotic buildings(s)

    NARCIS (Netherlands)

    Bier, H.H.

    2014-01-01

    Technological and conceptual advances in fields such as artificial intelligence, robotics, and materials science have enabled robotic building to be prototypically implemented in the last decade. In this context, robotic building implies both physically built robotic environments and robotically

  9. Soft Robotics.

    Science.gov (United States)

    Whitesides, George M

    2018-04-09

    This description of "soft robotics" is not intended to be a conventional review, in the sense of a comprehensive technical summary of a developing field. Rather, its objective is to describe soft robotics as a new field-one that offers opportunities to chemists and materials scientists who like to make "things" and to work with macroscopic objects that move and exert force. It will give one (personal) view of what soft actuators and robots are, and how this class of soft devices fits into the more highly developed field of conventional "hard" robotics. It will also suggest how and why soft robotics is more than simply a minor technical "tweak" on hard robotics and propose a unique role for chemistry, and materials science, in this field. Soft robotics is, at its core, intellectually and technologically different from hard robotics, both because it has different objectives and uses and because it relies on the properties of materials to assume many of the roles played by sensors, actuators, and controllers in hard robotics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Evolutionary Developmental Robotics: Improving Morphology and Control of Physical Robots.

    Science.gov (United States)

    Vujovic, Vuk; Rosendo, Andre; Brodbeck, Luzius; Iida, Fumiya

    2017-01-01

    Evolutionary algorithms have previously been applied to the design of morphology and control of robots. The design space for such tasks can be very complex, which can prevent evolution from efficiently discovering fit solutions. In this article we introduce an evolutionary-developmental (evo-devo) experiment with real-world robots. It allows robots to grow their leg size to simulate ontogenetic morphological changes, and this is the first time that such an experiment has been performed in the physical world. To test diverse robot morphologies, robot legs of variable shapes were generated during the evolutionary process and autonomously built using additive fabrication. We present two cases with evo-devo experiments and one with evolution, and we hypothesize that the addition of a developmental stage can be used within robotics to improve performance. Moreover, our results show that a nonlinear system-environment interaction exists, which explains the nontrivial locomotion patterns observed. In the future, robots will be present in our daily lives, and this work introduces for the first time physical robots that evolve and grow while interacting with the environment.

  11. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, tasks scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  12. Energy in Robotics

    NARCIS (Netherlands)

    Folkertsma, Gerrit A.; Stramigioli, Stefano

    2017-01-01

    Energy and energy exchange govern interactions in the physical world. By explicitly considering the energy and power in a robotic system, many control and design problems become easier or more insightful than in a purely signal-based view. We show the application of these energy considerations to

  13. Robots for use in autism research.

    Science.gov (United States)

    Scassellati, Brian; Admoni, Henny; Matarić, Maja

    2012-01-01

    Autism spectrum disorders are a group of lifelong disabilities that affect people's ability to communicate and to understand social cues. Research into applying robots as therapy tools has shown that robots seem to improve engagement and elicit novel social behaviors from people (particularly children and teenagers) with autism. Robot therapy for autism has been explored as one of the first application domains in the field of socially assistive robotics (SAR), which aims to develop robots that assist people with special needs through social interactions. In this review, we discuss the past decade's work in SAR systems designed for autism therapy by analyzing robot design decisions, human-robot interactions, and system evaluations. We conclude by discussing challenges and future trends for this young but rapidly developing research area.

  14. Robotics 101

    Science.gov (United States)

    Sultan, Alan

    2011-01-01

    Robots are used in all kinds of industrial settings. They are used to rivet bolts to cars, to move items from one conveyor belt to another, to gather information from other planets, and even to perform some very delicate types of surgery. Anyone who has watched a robot perform its tasks cannot help but be impressed by how it works. This article…

  15. Vitruvian Robot

    DEFF Research Database (Denmark)

    Hasse, Cathrine

    2017-01-01

    future. A real version of Ava would not last long in a human world because she is basically a solipsist, who does not really care about humans. She cannot co-create the line humans walk along. The robots created as ‘perfect women’ (sex robots) today are very far from the ideal image of Ava...

  16. Socially Impaired Robots: Human Social Disorders and Robots' Socio-Emotional Intelligence

    OpenAIRE

    Vitale, Jonathan; Williams, Mary-Anne; Johnston, Benjamin

    2016-01-01

    Social robots need intelligence in order to safely coexist and interact with humans. Robots without functional abilities in understanding others and unable to empathise might be a societal risk and they may lead to a society of socially impaired robots. In this work we provide a survey of three relevant human social disorders, namely autism, psychopathy and schizophrenia, as a means to gain a better understanding of social robots' future capability requirements. We provide evidence supporting...

  17. Robot vision

    International Nuclear Information System (INIS)

    Hall, E.L.

    1984-01-01

    Almost all industrial robots use internal sensors such as shaft encoders which measure rotary position, or tachometers which measure velocity, to control their motions. Most controllers also provide interface capabilities so that signals from conveyors, machine tools, and the robot itself may be used to accomplish a task. However, advanced external sensors, such as visual sensors, can provide a much greater degree of adaptability for robot control as well as add automatic inspection capabilities to the industrial robot. Visual and other sensors are now being used in fundamental operations such as material processing with immediate inspection, material handling with adaption, arc welding, and complex assembly tasks. A new industry of robot vision has emerged. The application of these systems is an area of great potential

  18. Social Robots

    DEFF Research Database (Denmark)

    Social robotics is a cutting-edge research area gathering researchers and stakeholders from various disciplines and organizations. The transformational potential that these machines, in the form of, for example, caregiving, entertainment or partner robots, pose to our societies and to us as individuals seems to be limited by our technical limitations and phantasy alone. This collection contributes to the field of social robotics by exploring its boundaries from a philosophically informed standpoint. It constructively outlines central potentials and challenges and thereby also provides a stable...

  19. Robotic seeding

    DEFF Research Database (Denmark)

    Pedersen, Søren Marcus; Fountas, Spyros; Sørensen, Claus Aage Grøn

    2017-01-01

    Agricultural robotics has received attention for approximately 20 years, but today there are only a few examples of the application of robots in agricultural practice. The lack of uptake may be (at least partly) because in many cases there is either no compelling economic benefit, or there is a benefit but it is not recognized. The aim of this chapter is to quantify the economic benefits from the application of agricultural robots under a specific condition where such a benefit is assumed to exist, namely the case of early seeding and re-seeding in sugar beet. With some predefined assumptions with regard to speed, capacity and seed mapping, we found that both early seeding with a small robot and re-seeding using a robot for a smaller part of the field appear to be financially viable solutions in sugar beet production.

  20. Human - Robot Proximity

    DEFF Research Database (Denmark)

    Nickelsen, Niels Christian Mossfeldt

    The media and political/managerial levels focus on the opportunities to re-perform Denmark through digitization. Feeding assistive robotics is a welfare technology, relevant to citizens with low or no function in their arms. Despite national dissemination strategies, it proves difficult to recruit… The study took place as multi-sited ethnography at different locations in Denmark and Sweden. Based on desk research, observation of meals and interviews, I examine socio-technological imaginaries and their practical implications. Human-robot interaction demands engagement and understanding...

  1. Master-slave robotic system for needle indentation and insertion.

    Science.gov (United States)

    Shin, Jaehyun; Zhong, Yongmin; Gu, Chengfan

    2017-12-01

    Bilateral control of a master-slave robotic system is a challenging issue in robot-assisted minimally invasive surgery. It requires knowledge of the contact interaction between a surgical (slave) robot and soft tissues. This paper presents a master-slave robotic system for needle indentation and insertion, which is able to characterize the contact interaction between the robotic needle and soft tissues. A bilateral controller is implemented using a linear motor for robotic needle indentation and insertion. A new nonlinear state observer is developed to monitor the contact interaction with soft tissues online. Experimental results demonstrate the efficacy of the proposed master-slave robotic system for robotic needle indentation and insertion.

  2. Transferring human impedance regulation skills to robots

    CERN Document Server

    Ajoudani, Arash

    2016-01-01

    This book introduces novel thinking and techniques to the control of robotic manipulation. In particular, the concept of teleimpedance control as an alternative method to bilateral force-reflecting teleoperation control for robotic manipulation is introduced. In teleimpedance control, a compound reference command is sent to the slave robot including both the desired motion trajectory and impedance profile, which are then realized by the remote controller. This concept forms a basis for the development of the controllers for a robotic arm, a dual-arm setup, a synergy-driven robotic hand, and a compliant exoskeleton for improved interaction performance.

  3. A review on humanoid robotics in healthcare

    Directory of Open Access Journals (Sweden)

    Joseph Azeta

    2018-01-01

    Full Text Available Humanoid robots have evolved over the years, and today they are applied in many different areas, from homecare to social care and healthcare robotics. This paper gives a brief overview of the current and potential applications of humanoid robotics in healthcare settings. We present a comprehensive contextualization of humanoid robots in healthcare by identifying and characterizing active research on humanoid robots that can work interactively and effectively with humans, so as to address identified deficiencies in current healthcare.

  4. Applications of Chaotic Dynamics in Robotics

    Directory of Open Access Journals (Sweden)

    Xizhe Zang

    2016-03-01

    Full Text Available This article presents a summary of applications of chaos and fractals in robotics. Firstly, basic concepts of deterministic chaos and fractals are discussed. Then, fundamental tools of chaos theory used for identifying and quantifying chaotic dynamics will be shared. Principal applications of chaos and fractal structures in robotics research, such as chaotic mobile robots, chaotic behaviour exhibited by mobile robots interacting with the environment, chaotic optimization algorithms, chaotic dynamics in bipedal locomotion and fractal mechanisms in modular robots will be presented. A brief survey is reported and an analysis of the reviewed publications is also presented.
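    One of the standard tools mentioned above for identifying and quantifying chaotic dynamics is the largest Lyapunov exponent. As an illustrative sketch (not code from the article; the map choice, function name, and parameters are my own), the exponent of the one-dimensional logistic map can be estimated by averaging the logarithm of the map's local stretching factor along a trajectory:

    ```python
    import math

    def largest_lyapunov_logistic(r, x0=0.4, n_transient=500, n_iter=5000):
        """Estimate the largest Lyapunov exponent of the logistic map
        x_{k+1} = r * x_k * (1 - x_k) by averaging log|f'(x)| = log|r * (1 - 2x)|
        along a trajectory. A positive result indicates chaos."""
        x = x0
        for _ in range(n_transient):   # discard the transient
            x = r * x * (1 - x)
        total = 0.0
        for _ in range(n_iter):
            x = r * x * (1 - x)
            total += math.log(abs(r * (1 - 2 * x)))
        return total / n_iter

    # r = 4.0 is fully chaotic (exponent near ln 2 ≈ 0.693);
    # r = 2.5 has a stable fixed point (negative exponent).
    print(largest_lyapunov_logistic(4.0))
    print(largest_lyapunov_logistic(2.5))
    ```

    The same trajectory-averaging idea, extended to higher-dimensional state spaces via tangent-space methods, underlies the tools typically used to certify chaotic behaviour in mobile-robot trajectories.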

  5. Micro intelligence robot

    International Nuclear Information System (INIS)

    Jeon, Yon Ho

    1991-07-01

    This book describes micro robots: the concept of robots and micro robots, the competition rules for micro robot contests, maze-search methods, and the future and prospects of robots. It also explains the making and design of an 8-bit robot, including construction techniques, software, the sensor board circuit, and the stepping motor catalog, as well as Speedy 3, Mr. Black and Mr. White, and the making and design of 16-bit robots such as the micro robot artist, Jerry 2, and a distance-shortening algorithm for robot simulation.

  6. An Intelligent Robot Programing

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Seong Yong

    2012-01-15

    This book introduces intelligent robot programming, covering background and basics, an introduction to VPL and SPL, setting up the environment for the robot platform, getting started with robot programming, design of the simulation environment, robot autonomous drive control programming, and simulation graphics. Topics include SPL graphic programming (graphical images, graphical shapes, and graphical method application), application of procedures for robot control, robot multiprogramming, and programming the robot's bumper, LRF, and color sensors.

  7. An Intelligent Robot Programing

    International Nuclear Information System (INIS)

    Hong, Seong Yong

    2012-01-01

    This book introduces intelligent robot programming, covering background and basics, an introduction to VPL and SPL, setting up the environment for the robot platform, getting started with robot programming, design of the simulation environment, robot autonomous drive control programming, and simulation graphics. Topics include SPL graphic programming (graphical images, graphical shapes, and graphical method application), application of procedures for robot control, robot multiprogramming, and programming the robot's bumper, LRF, and color sensors.

  8. Il circolo tecnologico: dall’uomo al robot e ritorno

    Directory of Open Access Journals (Sweden)

    BONITO OLIVA, ROSSELLA

    2017-12-01

    Full Text Available The technological Circle: from Man to Robot and return. Robotics has raised new questions in the already complex relationship between technology and ethics. Robots, more than any other machine, come close to human abilities of acting and interacting. Robots are created by human intelligence, yet they are perceived through the collective imagery of post-humanistic culture. To reflect on the relation between robot and man means to investigate whether robots are a reflection of mankind, or whether technological ideology has slowly molded the subject: the man of the present is a robot.

  9. Lessons Learned in Designing User-configurable Modular Robotics

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop

    2013-01-01

    User-configurable robotics allows users to easily configure robotic systems to perform task-fulfilling behaviors as desired by the users. With a user-configurable robotic system, the user can easily modify the physical and functional aspects, in terms of hardware and software components, of a robotic system… with the semi-autonomous components of the user-configurable robotic system in interaction with the given environment. Components constituting such a user-configurable robotic system can be characterized as modules in a modular robotic system. Several factors in the definition and implementation...

  10. ISS Robotic Student Programming

    Science.gov (United States)

    Barlow, J.; Benavides, J.; Hanson, R.; Cortez, J.; Le Vasseur, D.; Soloway, D.; Oyadomari, K.

    2016-01-01

    The SPHERES facility is a set of three free-flying satellites launched in 2006. In addition to scientists and engineers, middle- and high-school students program the SPHERES during the annual Zero Robotics programming competition. Zero Robotics conducts virtual competitions via simulator and on SPHERES aboard the ISS, with students doing the programming. A web interface allows teams to submit code, receive results, collaborate, and compete in simulator-based initial and semi-final rounds. The final round of each competition is conducted on SPHERES aboard the ISS. At the end of 2017 a new robotic platform called Astrobee will launch, providing new game elements and new ground support for even more student interaction.

  11. Appeal and Perceived Naturalness of a Soft Robotic Tentacle

    DEFF Research Database (Denmark)

    Jørgensen, Jonas

    2018-01-01

    Soft robotics technology has been proposed for a number of applications that involve human-robot interaction. This study investigates how a silicone-based pneumatically actuated soft robotic tentacle is perceived in interaction. Quantitative and qualitative data was gathered from questionnaires (N...

  12. A Video Game-Based Framework for Analyzing Human-Robot Interaction: Characterizing Interface Design in Real-Time Interactive Multimedia Applications

    National Research Council Canada - National Science Library

    Richer, Justin; Drury, Jill L

    2006-01-01

    .... This paper segments video game interaction into domain-independent components which together form a framework that can be used to characterize real-time interactive multimedia applications in general...

  13. To Err Is Robot: How Humans Assess and Act toward an Erroneous Social Robot

    Directory of Open Access Journals (Sweden)

    Nicole Mirnig

    2017-05-01

    Full Text Available We conducted a user study for which we purposefully programmed faulty behavior into a robot's routine. It was our aim to explore whether participants rate the faulty robot differently from an error-free robot and which reactions people show in interaction with a faulty robot. The study was based on our previous research on robot errors, where we detected typical error situations and the resulting social signals of our participants during social human–robot interaction. In contrast to our previous work, where we studied video material in which robot errors occurred unintentionally, in the user study reported here we purposefully elicited robot errors to further explore the human interaction partners' social signals following a robot error. Our participants interacted with a human-like NAO robot, which either performed faultily or free from error. First, the robot asked the participants a set of predefined questions and then asked them to complete a couple of LEGO building tasks. After the interaction, we asked the participants to rate the robot's anthropomorphism, likability, and perceived intelligence. We also interviewed the participants on their opinion about the interaction. Additionally, we video-coded the social signals the participants showed during their interaction with the robot as well as the answers they provided the robot with. Our results show that participants liked the faulty robot significantly better than the robot that interacted flawlessly. We did not find significant differences in people's ratings of the robot's anthropomorphism and perceived intelligence. The qualitative data confirmed the questionnaire results, showing that although the participants recognized the robot's mistakes, they did not necessarily reject the erroneous robot. The annotations of the video data further showed that gaze shifts (e.g., from an object to the robot or vice versa) and laughter are typical reactions to unexpected robot behavior.

  14. AssistMe robot, an assistance robotic platform

    Directory of Open Access Journals (Sweden)

    A. I. Alexan

    2012-06-01

    Full Text Available This paper presents the design and implementation of a full-size assistance robot. Its main purpose is to assist a person and eventually help avoid a life-threatening situation. Its implementation revolves around a chipKIT Arduino board that interconnects a robotic base controller with a 7-inch tablet PC and various sensors. Due to the Android and Arduino combination, the robot can interact with the person and provides an easy development platform for future improvements and feature additions. The tablet PC is webcam, WiFi, and Bluetooth enabled, offering a versatile platform that is able to process data and at the same time provide the user a friendly interface.

  15. You Look Human, But Act Like a Machine: Agent Appearance and Behavior Modulate Different Aspects of Human–Robot Interaction

    OpenAIRE

    Abubshait, Abdulaziz; Wiese, Eva

    2017-01-01

    Gaze following occurs automatically in social interactions, but the degree to which gaze is followed depends on whether an agent is perceived to have a mind, making its behavior socially more relevant for the interaction. Mind perception also modulates the attitudes we have toward others, and determines the degree of empathy, prosociality, and morality invested in social interactions. Seeing mind in others is not exclusive to human agents, but mind can also be ascribed to non-human agents lik...

  16. A Novel Passive Path Following Controller for a Rehabilitation Robot

    National Research Council Canada - National Science Library

    Zhang, X; Behal, A; Dawson, D. M; Chen, J

    2004-01-01

    .... Motivated by a nonholonomic kinematic constraint, a dynamic path generator is designed to trace a desired contour in the robot's workspace when an interaction force is applied at the robot's end-effector...

  17. Integrated Control Strategies Supporting Autonomous Functionalities in Mobile Robots

    National Research Council Canada - National Science Library

    Sights, B; Everett, H. R; Pacis, E. B; Kogut, G; Thompson, M

    2005-01-01

    High-level intelligence allows a mobile robot to create and interpret complex world models, but without a precise control system, the accuracy of the world model and the robot's ability to interact...

  18. Polymer optical fiber strain gauge for human-robot interaction forces assessment on an active knee orthosis

    Science.gov (United States)

    Leal-Junior, Arnaldo G.; Frizera, Anselmo; Marques, Carlos; Sánchez, Manuel R. A.; Botelho, Thomaz R.; Segatto, Marcelo V.; Pontes, Maria José

    2018-03-01

    This paper presents the development of a polymer optical fiber (POF) strain gauge based on the light coupling principle, in which power attenuation is created by the misalignment between two POFs. The misalignment, in this case, is proportional to the strain on the structure to which the fibers are attached. This principle has the advantages of low cost, ease of implementation, temperature insensitivity, immunity to electromagnetic fields, and simplicity of sensor interrogation and signal processing. Such advantages make the proposed solution an interesting alternative to electronic strain gauges. For this reason, an analytical model for the POF strain gauge is proposed and validated. Furthermore, the proposed POF sensor is applied on an active orthosis for knee rehabilitation exercises through flexion/extension cycles. The controller of the orthosis provides 10 different levels of robotic assistance for the flexion/extension movement, and the POF strain gauge is tested at each of these levels. Results show good correlation between the optical and electronic strain gauges, with a root mean squared deviation (RMSD) of 1.87 Nm when all cycles are analyzed, which represents a deviation of less than 8%. In this application, the proposed sensor presented higher stability than the electronic one, which can provide advantages in rehabilitation exercises and in the inner controller of the device.
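    For reference, the RMSD figure quoted in the abstract is the usual root mean squared deviation between the two sensors' readings. A minimal sketch of the computation (the torque values below are hypothetical, not data from the paper):

    ```python
    import math

    def rmsd(optical, electronic):
        """Root mean squared deviation between two equally long series of
        torque readings (here in Nm), as used to compare an optical strain
        gauge against an electronic reference."""
        if len(optical) != len(electronic):
            raise ValueError("series must have equal length")
        sq_sum = sum((o - e) ** 2 for o, e in zip(optical, electronic))
        return math.sqrt(sq_sum / len(optical))

    # Hypothetical knee-torque readings (Nm) over part of a flexion/extension cycle
    optical = [10.2, 14.8, 19.5, 24.1, 20.3]
    electronic = [10.0, 15.0, 20.0, 24.0, 20.0]
    print(round(rmsd(optical, electronic), 3))
    ```

    Dividing the RMSD by the full-scale torque gives the percentage deviation reported in the paper.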

  19. Space Robotics Challenge

    Data.gov (United States)

    National Aeronautics and Space Administration — The Space Robotics Challenge seeks to infuse robot autonomy from the best and brightest research groups in the robotics community into NASA robots for future...

  20. Robotic arm

    International Nuclear Information System (INIS)

    Kwech, H.

    1989-01-01

    A robotic arm positionable within a nuclear vessel by access through a small diameter opening and having a mounting tube supported within the vessel and mounting a plurality of arm sections for movement lengthwise of the mounting tube as well as for movement out of a window provided in the wall of the mounting tube is disclosed. An end effector, such as a grinding head or welding element, at an operating end of the robotic arm, can be located and operated within the nuclear vessel through movement derived from six different axes of motion provided by mounting and drive connections between arm sections of the robotic arm. The movements are achieved by operation of remotely-controllable servo motors, all of which are mounted at a control end of the robotic arm to be outside the nuclear vessel. 23 figs

  1. Robotic surgery

    Science.gov (United States)

    ... with this type of surgery give it some advantages over standard endoscopic techniques. The surgeon can make ... Elsevier Saunders; 2015:chap 87. Muller CL, Fried GM. Emerging technology in surgery: Informatics, electronics, robotics. In: ...

  2. Robotic parathyroidectomy.

    Science.gov (United States)

    Okoh, Alexis Kofi; Sound, Sara; Berber, Eren

    2015-09-01

    Robotic parathyroidectomy has recently been described. Although the procedure eliminates the neck scar, it is technically more demanding than the conventional approaches. This report is a review of the patients' selection criteria, technique, and outcomes. © 2015 Wiley Periodicals, Inc.

  3. Light Robotics

    DEFF Research Database (Denmark)

    Glückstad, Jesper; Palima, Darwin

    Light Robotics - Structure-Mediated Nanobiophotonics covers the latest means of sculpting of both light and matter for achieving bioprobing and manipulation at the smallest scales. The synergy between photonics, nanotechnology and biotechnology spans the rapidly growing field of nanobiophotonics...

  4. Robotic arm

    Science.gov (United States)

    Kwech, Horst

    1989-04-18

    A robotic arm positionable within a nuclear vessel by access through a small diameter opening and having a mounting tube supported within the vessel and mounting a plurality of arm sections for movement lengthwise of the mounting tube as well as for movement out of a window provided in the wall of the mounting tube. An end effector, such as a grinding head or welding element, at an operating end of the robotic arm, can be located and operated within the nuclear vessel through movement derived from six different axes of motion provided by mounting and drive connections between arm sections of the robotic arm. The movements are achieved by operation of remotely-controllable servo motors, all of which are mounted at a control end of the robotic arm to be outside the nuclear vessel.

  5. Trust me, I am Robot!

    DEFF Research Database (Denmark)

    Stoyanova, Angelina; Drefeld, Jonas; Tanev, Stoyan

    The aim of this paper is to discuss some of the issues regarding the emergence of trust within the context of the interaction between human patients and medical rehabilitation technology based on robot system solutions. The starting assumption of the analysis is that the articulation of the emerging trust relationship is a key component of the use value of the robotic system and of the value proposition of the robotic system producers. The study is based on a qualitative research approach combining the phenomenological research paradigm with a grounded theory building approach based...

  6. Experientially guided robots. [for planet exploration

    Science.gov (United States)

    Merriam, E. W.; Becker, J. D.

    1974-01-01

    This paper argues that an experientially guided robot is necessary to successfully explore far-away planets. Such a robot is characterized as having sense organs which receive sensory information from its environment and motor systems which allow it to interact with that environment. The sensori-motor information which it receives is organized into an experiential knowledge structure, and this knowledge in turn is used to guide the robot's future actions. A summary is presented of a problem-solving system which is being used as a test bed for developing such a robot. The robot currently engages in the behaviors of visual tracking, focusing down, and looking around in a simulated Martian landscape. Finally, some unsolved problems are outlined whose solutions are necessary before an experientially guided robot can be produced. These problems center around organizing the motivational and memory structure of the robot and understanding its high-level control mechanisms.

  7. Easy Reconfiguration of Modular Industrial Collaborative Robots

    DEFF Research Database (Denmark)

    Schou, Casper

    2016-01-01

    …the production staff collaborating to perform common tasks. This change of environment imposes a much more dynamic lifecycle for the robot, which consequently requires new ways of interacting. This thesis investigates how the changeover to a new task on a collaborative robot can be performed by the shop floor operators already working alongside the robot. To effectively perform this changeover, the operator must both reconfigure the hardware of the robot and reprogram the robot to match the new task. To enable shop floor operators to quickly and intuitively program the robot, this thesis proposes the use of parametric, task-related robot skills with a manual parameterization method. Reconfiguring the hardware entails adding, removing, or modifying some of the robot's components. This thesis investigates how software configurator tools can aid the operator in selecting appropriate hardware modules, and how agent...

  8. Audio localization for mobile robots

    OpenAIRE

    de Guillebon, Thibaut; Grau Saldes, Antoni; Bolea Monte, Yolanda

    2009-01-01

    The department of the university for which I worked is developing a project based on interaction with robots in the environment. My work was to define an audio system for the robot. This audio system consists of a mobile head which is able to follow sound in its environment. The subject was treated as a research problem, with the liberty to find and develop different solutions and to make them evolve in the chosen direction.

  9. Attitudes and reactions to a healthcare robot.

    Science.gov (United States)

    Broadbent, Elizabeth; Kuo, I Han; Lee, Yong In; Rabindran, Joel; Kerse, Ngaire; Stafford, Rebecca; MacDonald, Bruce A

    2010-06-01

    The use of robots in healthcare is a new concept. The public's perception and acceptance is not well understood. The objective was to investigate the perceptions and emotions toward the utilization of healthcare robots among individuals over 40 years of age, investigate factors contributing to acceptance, and evaluate differences in blood pressure checks taken by a robot and a medical student. Fifty-seven (n = 57) adults aged over 40 years and recruited from local general practitioner or gerontology group lists participated in two cross-sectional studies. The first was an open-ended questionnaire assessing perceptions of robots. In the second study, participants had their blood pressure taken by a medical student and by a robot. Patient comfort with each encounter, perceived accuracy of each measurement, and the quality of the patient interaction were studied in each case. Readings were compared by independent t-tests and regression analyses were conducted to predict quality ratings. Participants' perceptions about robots were influenced by their prior exposure to robots in literature or entertainment media. Participants saw many benefits and applications for healthcare robots, including simple medical procedures and physical assistance, but had some concerns about reliability, safety, and the loss of personal care. Blood pressure readings did not differ between the medical student and robot, but participants felt more comfortable with the medical student and saw the robot as less accurate. Although age and sex were not significant predictors, individuals who held more positive initial attitudes and emotions toward robots rated the robot interaction more favorably. Many people see robots as having benefits and applications in healthcare but some have concerns. Individual attitudes and emotions regarding robots in general are likely to influence future acceptance of their introduction into healthcare processes.
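    The comparison of blood pressure readings described above relies on the independent two-sample t-test. A minimal sketch of the pooled-variance t statistic (the readings below are hypothetical, not the study's data):

    ```python
    import math

    def independent_t(sample_a, sample_b):
        """Student's independent two-sample t statistic with pooled variance,
        the kind of test used to compare readings taken by a robot and by a
        medical student. Returns (t, degrees of freedom)."""
        na, nb = len(sample_a), len(sample_b)
        ma = sum(sample_a) / na
        mb = sum(sample_b) / nb
        va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)   # sample variances
        vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
        pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
        t = (ma - mb) / math.sqrt(pooled * (1 / na + 1 / nb))
        return t, na + nb - 2

    # Hypothetical systolic readings (mmHg); a |t| well below the critical
    # value for df degrees of freedom means no detectable difference.
    student = [128, 135, 122, 140, 131, 126]
    robot = [130, 133, 125, 138, 129, 128]
    t, df = independent_t(student, robot)
    print(t, df)
    ```

    The "readings did not differ" finding corresponds to a t statistic whose magnitude falls below the critical value at the chosen significance level.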

  10. Recent advances in robotics

    International Nuclear Information System (INIS)

    Beni, G.; Hackwood, S.

    1984-01-01

    Featuring 10 contributions, this volume offers a state-of-the-art report on robotic science and technology. It covers robots in modern industry, robotic control to help the disabled, kinematics and dynamics, six-legged walking robots, a vector analysis of robot manipulators, tactile sensing in robots, and more

  11. Analyzing Robotic Kinematics Via Computed Simulations

    Science.gov (United States)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  12. Soft Robotics Week

    CERN Document Server

    Rossiter, Jonathan; Iida, Fumiya; Cianchetti, Matteo; Margheri, Laura

    2017-01-01

    This book offers a comprehensive, timely snapshot of current research, technologies and applications of soft robotics. The different chapters, written by international experts across multiple fields of soft robotics, cover innovative systems and technologies for soft robot legged locomotion, soft robot manipulation, underwater soft robotics, biomimetic soft robotic platforms, plant-inspired soft robots, flying soft robots, soft robotics in surgery, as well as methods for their modeling and control. Based on the results of the second edition of the Soft Robotics Week, held on April 25 – 30, 2016, in Livorno, Italy, the book reports on the major research lines and novel technologies presented and discussed during the event.

  13. Medical robotics.

    Science.gov (United States)

    Ferrigno, Giancarlo; Baroni, Guido; Casolo, Federico; De Momi, Elena; Gini, Giuseppina; Matteucci, Matteo; Pedrocchi, Alessandra

    2011-01-01

    Information and communication technology (ICT) and mechatronics play a basic role in medical robotics and computer-aided therapy. In the last three decades, in fact, ICT has strongly entered the health-care field, bringing in new techniques to support therapy and rehabilitation. In this frame, medical robotics is an expansion of service and professional robotics as well as other technologies, as surgical navigation has been introduced especially in minimally invasive surgery. Localization systems also provide treatments in radiotherapy and radiosurgery with high precision. Virtual or augmented reality plays a role both for surgical training and planning and for safe rehabilitation in the first stage of recovery from neurological diseases. Also, in the chronic phase of motor diseases, robotics helps with special assistive devices and prostheses. Although, in the past, the actual need for and advantage of navigation, localization, and robotics in surgery and therapy were in doubt, today the availability of better hardware (e.g., microrobots) and more sophisticated algorithms (e.g., machine learning and other cognitive approaches) has largely increased the field of applications of these technologies, making it more likely that, in the near future, their presence will be dramatically increased, taking advantage of the generational change of the end users and the increasing demand for quality in health-care delivery and management.

  14. Using a Humanoid Robot to Develop a Dialogue-Based Interactive Learning Environment for Elementary Foreign Language Classrooms

    Science.gov (United States)

    Chang, Chih-Wei; Chen, Gwo-Dong

    2010-01-01

    Elementary school is the critical stage during which the development of listening comprehension and oral abilities in language acquisition occur, especially with a foreign language. However, the current foreign language instructors often adopt one-way teaching, and the learning environment lacks any interactive instructional media with which to…

  15. Human-robot interaction modeling and simulation of supervisory control and situational awareness during field experimentation with military manned and unmanned ground vehicles

    Science.gov (United States)

    Johnson, Tony; Metcalfe, Jason; Brewster, Benjamin; Manteuffel, Christopher; Jaswa, Matthew; Tierney, Terrance

    2010-04-01

    The proliferation of intelligent systems in today's military demands increased focus on the optimization of human-robot interactions. Traditional studies in this domain involve large-scale field tests that require humans to operate semiautomated systems under varying conditions within military-relevant scenarios. However, provided that adequate constraints are employed, modeling and simulation can be a cost-effective alternative and supplement. The current presentation discusses a simulation effort that was executed in parallel with a field test with Soldiers operating military vehicles in an environment that represented key elements of the true operational context. In this study, "constructive" human operators were designed to represent average Soldiers executing supervisory control over an intelligent ground system. The constructive Soldiers were simulated performing the same tasks as those performed by real Soldiers during a directly analogous field test. Exercising the models in a high-fidelity virtual environment provided predictive results that represented actual performance in certain aspects, such as situational awareness, but diverged in others. These findings largely reflected the quality of modeling assumptions used to design behaviors and the quality of information available on which to articulate principles of operation. Ultimately, predictive analyses partially supported expectations, with deficiencies explicable via Soldier surveys, experimenter observations, and previously-identified knowledge gaps.

  16. Robotic and Survey Telescopes

    Science.gov (United States)

    Woźniak, Przemysław

    Robotic telescopes are revolutionizing the way astronomers collect their data and conduct sky surveys. This chapter begins with a discussion of principles that guide the process of designing, constructing, and operating telescopes and observatories that offer a varying degree of automation, from instruments remotely controlled by observers to fully autonomous systems requiring no human supervision during their normal operations. Emphasis is placed on design trade-offs involved in building end-to-end systems intended for a wide range of science applications. The second part of the chapter contains descriptions of several projects and instruments, both existing and currently under development. It is an attempt to provide a representative selection of actual systems that illustrates the state of the art in technology, as well as important ideas and milestones in the development of the field. The list of presented instruments spans the full range in size, starting from small all-sky monitors, through midrange robotic and survey telescopes, and finishing with large robotic instruments and surveys. Explosive growth of telescope networking is enabling entirely new modes of interaction between the survey and follow-up observing. The increasing importance of standardized communication protocols and software is stressed. These developments are driven by the fusion of robotic telescope hardware, massive storage and databases, real-time knowledge extraction, and data cross-correlation on a global scale. The chapter concludes with examples of major science results enabled by these new technologies and future prospects.

  17. Generic robot architecture

    Science.gov (United States)

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2010-09-21

    The present invention provides methods, computer readable media, and apparatuses for a generic robot architecture providing a framework that is easily portable to a variety of robot platforms and is configured to provide hardware abstractions, abstractions for generic robot attributes, environment abstractions, and robot behaviors. The generic robot architecture includes a hardware abstraction level and a robot abstraction level. The hardware abstraction level is configured for developing hardware abstractions that define, monitor, and control hardware modules available on a robot platform. The robot abstraction level is configured for defining robot attributes and provides a software framework for building robot behaviors from the robot attributes. Each of the robot attributes includes hardware information from at least one hardware abstraction. In addition, each robot attribute is configured to substantially isolate the robot behaviors from the at least one hardware abstraction.
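The two-level split described in the abstract can be sketched with abstract base classes: behaviors consume only robot attributes, which in turn wrap hardware abstractions. All class and method names below are illustrative, not taken from the patent.

```python
from abc import ABC, abstractmethod

class HardwareAbstraction(ABC):
    """Defines, monitors, and controls one hardware module."""
    @abstractmethod
    def read(self) -> float: ...

class RangeSensor(HardwareAbstraction):
    """Toy range sensor; a real implementation would query the device."""
    def __init__(self, raw_value: float):
        self._raw = raw_value
    def read(self) -> float:
        return self._raw

class RobotAttribute:
    """Aggregates hardware information and isolates behaviors from hardware."""
    def __init__(self, sensor: HardwareAbstraction):
        self._sensor = sensor
    @property
    def obstacle_distance(self) -> float:
        return self._sensor.read()

def avoid_obstacle(attr: RobotAttribute, threshold: float = 0.5) -> str:
    """A behavior built purely on attributes, never on raw hardware."""
    return "turn" if attr.obstacle_distance < threshold else "forward"

print(avoid_obstacle(RobotAttribute(RangeSensor(0.3))))  # -> turn
```

Porting to a new platform then only requires a new `HardwareAbstraction` subclass; the attributes and behaviors remain unchanged.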

  18. 'Filigree Robotics'

    DEFF Research Database (Denmark)

    2016-01-01

    -scale 3D printed ceramics accompanied by prints, videos and ceramic probes, which introduce the material and design processes of the project.'Filigree Robotics' experiments with a combination of the traditional ceramic technique of ‘Overforming’ with 3d Laserscan and Robotic extrusion technique...... application of reflectivity after an initial 3d print. The consideration and integration of this material practice into a digital workflow took place in an interdisciplinary collaboration of Ceramicist Flemming Tvede Hansen from KADK Superformlab and architectural researchers from CITA (Martin Tamke, Henrik...... to the creation of the form and invites for experimentation. In Filigree Robotics we combine the crafting of the mold with a parallel running generative algorithm, which is fed by a constant laserscan of the 3d surface. This algorithm, analyses the topology of the mold, identifies high and low points and uses...

  19. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

    Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory, and storage, whereas the cloud provides virtually unlimited computation power, memory, storage, and, especially, opportunities for collaboration. Cloud-enabled robots are divided into two categories: standalone and networked robots. This article surveys cloud robotic platforms and standalone and networked robotic works such as grasping, simultaneous localization and mapping (SLAM), and monitoring.

  20. Programming Robots with Associative Memories

    International Nuclear Information System (INIS)

    Touzet, C.

    1999-01-01

    Today, there are several drawbacks that impede the necessary and much needed use of robot learning techniques in real applications. First, the time needed to achieve the synthesis of any behavior is prohibitive. Second, the robot behavior during the learning phase is by definition bad; it may even be dangerous. Third, except within the lazy learning approach, a new behavior implies a new learning phase. We propose in this paper to use self-organizing maps to encode the non-explicit model of the robot-world interaction sampled by the lazy memory, and then generate a robot behavior by means of situations to be achieved, i.e., points on the self-organizing maps. Any behavior can instantaneously be synthesized by the definition of a goal situation. Its performance will be minimal (not evidently bad) and will improve by the mere repetition of the behavior.

  1. Mobile Robots in Human Environments

    DEFF Research Database (Denmark)

    Svenstrup, Mikael

    intelligent mobile robotic devices capable of being a more natural and sociable actor in a human environment. More specific the emphasis is on safe and natural motion and navigation issues. First part of the work focus on developing a robotic system, which estimates human interest in interacting......, lawn mowers, toy pets, or as assisting technologies for care giving. If we want robots to be an even larger and more integrated part of our every- day environments, they need to become more intelligent, and behave safe and natural to the humans in the environment. This thesis deals with making...... as being able to navigate safely around one person, the robots must also be able to navigate in environments with more people. This can be environments such as pedestrian streets, hospital corridors, train stations or airports. The developed human-aware navigation strategy is enhanced to formulate...

  2. 3D Printed Robotic Hand

    Science.gov (United States)

    Pizarro, Yaritzmar Rosario; Schuler, Jason M.; Lippitt, Thomas C.

    2013-01-01

    Dexterous robotic hands are changing the way robots and humans interact and use common tools. Unfortunately, the complexity of the joints and actuations drives up the manufacturing cost. Some cutting-edge and commercially available rapid prototyping machines now have the ability to print multiple materials and even combine these materials in the same job. A 3D model of a robotic hand was designed using Creo Parametric 2.0. Combining "hard" and "soft" materials, the model was printed on the Objet Connex350 3D printer with the purpose of resembling, as much as possible, the appearance and mobility of a real human hand while needing no assembly. After printing the prototype, strings were installed as actuators to test mobility. Based on printing materials, the manufacturing cost of the hand was $167, significantly lower than other robotic hands without the actuators, since they have more complex assembly processes.

  3. Programming Robots with Associative Memories

    Energy Technology Data Exchange (ETDEWEB)

    Touzet, C

    1999-07-10

    Today, there are several drawbacks that impede the necessary and much needed use of robot learning techniques in real applications. First, the time needed to achieve the synthesis of any behavior is prohibitive. Second, the robot behavior during the learning phase is "by definition" bad; it may even be dangerous. Third, except within the lazy learning approach, a new behavior implies a new learning phase. We propose in this paper to use self-organizing maps to encode the non-explicit model of the robot-world interaction sampled by the lazy memory, and then generate a robot behavior by means of situations to be achieved, i.e., points on the self-organizing maps. Any behavior can instantaneously be synthesized by the definition of a goal situation. Its performance will be minimal (not evidently bad) and will improve by the mere repetition of the behavior.
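The core idea, encoding sampled situations on a self-organizing map and then defining a behavior as a goal point on that map, can be sketched as follows. This is a toy 1-D map over 2-D "situations"; the map size, learning schedule, and data are invented for illustration and are much simpler than the paper's lazy-memory encoding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Train a small 1-D self-organizing map on sampled robot-world situations.
n_nodes, lr, sigma = 10, 0.5, 2.0
weights = rng.random((n_nodes, 2))      # map nodes, each a 2-D situation
samples = rng.random((200, 2))          # situations from the "lazy memory"

for t, x in enumerate(samples):
    # Best-matching unit: node whose weight vector is closest to the sample.
    bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    decay = np.exp(-t / len(samples))   # shrink learning rate and radius
    for i in range(n_nodes):
        h = np.exp(-((i - bmu) ** 2) / (2 * (sigma * decay) ** 2))
        weights[i] += lr * decay * h * (x - weights[i])

# A "behavior" is then just a goal situation: a point on the trained map.
goal = weights[7]
current = np.array([0.1, 0.9])
step = 0.2 * (goal - current)           # move the robot toward the goal
print(np.round(current + step, 3))
```

No behavior-specific learning phase is needed: picking a different map node instantly defines a different behavior, matching the abstract's claim.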

  4. Medical robotics

    CERN Document Server

    Troccaz, Jocelyne

    2013-01-01

    In this book, we present medical robotics, its evolution over the last 30 years in terms of architecture, design and control, and the main scientific and clinical contributions to the field. For more than two decades, robots have been part of hospitals and have progressively become a common tool for the clinician. Because this domain has now reached a certain level of maturity it seems important and useful to provide a state of the scientific, technological and clinical achievements and still open issues. This book describes the short history of the domain, its specificity and constraints, and

  5. Dynamic photogrammetric calibration of industrial robots

    Science.gov (United States)

    Maas, Hans-Gerd

    1997-07-01

    Today's developments in industrial robots focus on aims such as gains in flexibility, improved interaction between robots, and reduced down-times. An important method for achieving these goals is off-line programming. In contrast to conventional teach-in robot programming techniques, where sequences of actions are defined step-by-step via remote control on the real object, off-line programming techniques design complete robot (inter-)action programs in a CAD/CAM environment. This places high requirements on the geometric accuracy of a robot. While the repeatability of robot poses in teach-in mode is often better than 0.1 mm, the absolute pose accuracy of industrial robots is usually much worse due to tolerances, eccentricities, elasticities, play, wear-out, load, temperature, and insufficient knowledge of model parameters for the transformation from poses into robot axis angles. This fact necessitates robot calibration techniques, including the formulation of a robot model describing the kinematics and dynamics of the robot, and a measurement technique to provide reference data. Digital photogrammetry, as an accurate, economical technique with realtime potential, offers itself for this purpose. The paper analyzes the requirements posed to a measurement technique by industrial robot calibration tasks. After an overview of measurement techniques used for robot calibration purposes in the past, a photogrammetric robot calibration system based on off-the-shelf, low-cost hardware components is shown, and results of pilot studies are discussed. Besides aspects of accuracy, reliability, and self-calibration in a fully automatic dynamic photogrammetric system, realtime capabilities are discussed. In the pilot studies, standard deviations of 0.05 - 0.25 mm in the three coordinate directions could be achieved over a robot work range of 1.7 x 1.5 x 1.0 m³. The realtime capabilities of the technique allow to go beyond kinematic robot
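A basic building block of such calibration, fitting a rigid transform between commanded poses and photogrammetrically measured reference points, can be sketched in 2-D with the Kabsch algorithm. The point values below are synthetic and purely illustrative; the paper's actual model also covers kinematic and dynamic parameters.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) with R @ p + t ~ q (Kabsch)."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)           # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic data: measured points are the commanded points rotated by 10
# degrees and shifted by (0.05, -0.02).
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Q = P @ R_true.T + np.array([0.05, -0.02])

R, t = rigid_fit(P, Q)
print(np.round(t, 3))
```

In a real calibration the residuals after such a fit expose the systematic errors (elasticities, play, load) that the robot model must absorb.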

  6. A Review of Mobile Robotic Telepresence

    Directory of Open Access Journals (Sweden)

    Annica Kristoffersson

    2013-01-01

    Full Text Available Mobile robotic telepresence (MRP systems incorporate video conferencing equipment onto mobile robot devices which can be steered from remote locations. These systems, which are primarily used in the context of promoting social interaction between people, are becoming increasingly popular within certain application domains such as health care environments, independent living for the elderly, and office environments. In this paper, an overview of the various systems, application areas, and challenges found in the literature concerning mobile robotic telepresence is provided. The survey also proposes a set of terminology for the field, as there is currently a lack of standard terms for the different concepts related to MRP systems. Further, this paper provides an outlook on the various research directions for developing and enhancing mobile robotic telepresence systems per se, as well as evaluating the interaction in laboratory and field settings. Finally, the survey outlines a number of design implications for the future of mobile robotic telepresence systems for social interaction.

  7. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  8. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  9. Maintaining trust while fixated to a rehabilitative robot

    DEFF Research Database (Denmark)

    Jensen, Laura U.; Winther, Trine Straarup; Jørgensen, Rasmus

    2016-01-01

    This paper investigates the trust relationship between humans and a rehabilitation robot, the RoboTrainer. We present a study in which participants let the robot guide their arms through a series of preset coordinates in a 3D space. Each participant interacted with the robot twice, one time where...

  10. The AAAI 2006 Mobile Robot Competition and Exhibition

    OpenAIRE

    Rybski, Paul E.; Forbes, Jeffrey; Burhans, Debra; Dodds, Zach; Oh, Paul; Scheutz, Matthias; Avanzato, Bob

    2007-01-01

    The Fifteenth Annual AAAI Robot Competition and Exhibition was held at the Twenty-First National Conference on Artificial Intelligence in Boston, Massachusetts, in July 2006. This article describes the events that were held at the conference, including the Scavenger Hunt, Human Robot Interaction, and Robot Exhibition.

  11. Modelling and testing proxemic behaviour for humanoid robots

    NARCIS (Netherlands)

    Torta, E.; Cuijpers, R.H.; Juola, J.F.; Pol, van der D.

    2012-01-01

    Humanoid robots that share the same space with humans need to be socially acceptable and effective as they interact with people. In this paper we focus our attention on the definition of a behavior-based robotic architecture that (1) allows the robot to navigate safely in a cluttered and dynamically

  12. Cultural Robotics: The Culture of Robotics and Robotics in Culture

    Directory of Open Access Journals (Sweden)

    Hooman Samani

    2013-12-01

    Full Text Available In this paper, we have investigated the concept of “Cultural Robotics” with regard to the evolution of social into cultural robots in the 21st Century. By defining the concept of culture, the potential development of a culture between humans and robots is explored. Based on the cultural values of the robotics developers, and the learning ability of current robots, cultural attributes in this regard are in the process of being formed, which would define the new concept of cultural robotics. According to the importance of the embodiment of robots in the sense of presence, the influence of robots in communication culture is anticipated. The sustainability of robotics culture based on diversity for cultural communities for various acceptance modalities is explored in order to anticipate the creation of different attributes of culture between robots and humans in the future.

  13. Robots to assist daily activities: views of older adults with Alzheimer's disease and their caregivers.

    Science.gov (United States)

    Wang, Rosalie H; Sudhama, Aishwarya; Begum, Momotaz; Huq, Rajibul; Mihailidis, Alex

    2017-01-01

    Robots have the potential to both enable older adults with dementia to perform daily activities with greater independence, and provide support to caregivers. This study explored perspectives of older adults with Alzheimer's disease (AD) and their caregivers on robots that provide stepwise prompting to complete activities in the home. Ten dyads participated: Older adults with mild-to-moderate AD and difficulty completing activity steps, and their family caregivers. Older adults were prompted by a tele-operated robot to wash their hands in the bathroom and make a cup of tea in the kitchen. Caregivers observed interactions. Semi-structured interviews were conducted individually. Transcribed interviews were thematically analyzed. Three themes summarized responses to robot interactions: contemplating a future with assistive robots, considering opportunities with assistive robots, and reflecting on implications for social relationships. Older adults expressed opportunities for robots to help in daily activities, were open to the idea of robotic assistance, but did not want a robot. Caregivers identified numerous opportunities and were more open to robots. Several wanted a robot, if available. Positive consequences of robots in caregiving scenarios could include decreased frustration, stress, and relationship strain, and increased social interaction via the robot. A negative consequence could be decreased interaction with caregivers. Few studies have investigated in-depth perspectives of older adults with dementia and their caregivers following direct interaction with an assistive prompting robot. To fulfill the potential of robots, continued dialogue between users and developers, and consideration of robot design and caregiving relationship factors are necessary.

  14. Robotic vision system for random bin picking with dual-arm robots

    Directory of Open Access Journals (Sweden)

    Kang Sangseung

    2016-01-01

    Full Text Available Random bin picking is one of the most challenging industrial robotics applications available. It constitutes a complicated interaction between the vision system, robot, and control system. For a packaging operation requiring a pick-and-place task, the robot system utilized should be able to perform certain functions for recognizing the applicable target object from randomized objects in a bin. In this paper, we introduce a robotic vision system for bin picking using industrial dual-arm robots. The proposed system recognizes the best object from randomized target candidates based on stereo vision, and estimates the position and orientation of the object. It then sends the result to the robot control system. The system was developed for use in the packaging process of cell phone accessories using dual-arm robots.
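The geometric core of the stereo position estimate used in such a pipeline reduces to triangulation from disparity: depth is focal length times baseline over disparity. The camera parameters and pixel coordinates below are made-up numbers for illustration, not from the paper.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         x_left: float, x_right: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.

    focal_px   -- focal length in pixels
    baseline_m -- distance between the two cameras in meters
    x_left/x_right -- horizontal pixel coordinate of the same feature
    """
    d = x_left - x_right            # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity: point at or behind infinity")
    return focal_px * baseline_m / d

print(depth_from_disparity(700.0, 0.12, 330.0, 310.0))  # -> 4.2 (meters)
```

The full system of course adds feature matching, candidate ranking, and orientation estimation on top of this per-point depth.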

  15. Which Robot Features Can Stimulate Better Responses from Children with Autism in Robot-Assisted Therapy?

    Directory of Open Access Journals (Sweden)

    Jaeryoung Lee

    2012-09-01

    Full Text Available This study explores the response of autistic children to a few design features of the robots for autism therapy and provides suggestions on the robot features that have a stronger influence on the therapeutic process. First, we investigate the effect of selected robot features on the development of social communication skills in autistic children. The results indicate that the toy's “face” and “moving limb” usually draw the children's attention and improve children's facial expression skills, but do not contribute to the development of other social communication skills. Secondly, we study the response of children with low-functioning autism to robots with verbal communication functionalities. Test results show that children interacted with the verbal-featured robot more intensively than with the experimenter. We conclude that robots with faces and moving limbs can engage autistic children in a better way. Facial expression of the robots can elicit a greater response than prompting by humans.

  16. Towards safe robots approaching Asimov’s 1st law

    CERN Document Server

    Haddadin, Sami

    2014-01-01

    The vision of seamless human-robot interaction in our everyday life that allows for tight cooperation between human and robot has not become reality yet. However, the recent increase in technology maturity finally made it possible to realize systems of high integration, advanced sensorial capabilities and enhanced power to cross this barrier and merge living spaces of humans and robot workspaces to at least a certain extent. Together with the increasing industrial effort to realize first commercial service robotics products this makes it necessary to properly address one of the most fundamental questions of Human-Robot Interaction: How to ensure safety in human-robot coexistence? In this authoritative monograph, the essential question about the necessary requirements for a safe robot is addressed in depth and from various perspectives. The approach taken in this book focuses on the biomechanical level of injury assessment, addresses the physical evaluation of robot-human impacts, and isolates the major factor...

  17. Robot vision for nuclear advanced robot

    International Nuclear Information System (INIS)

    Nakayama, Ryoichi; Okano, Hideharu; Kuno, Yoshinori; Miyazawa, Tatsuo; Shimada, Hideo; Okada, Satoshi; Kawamura, Astuo

    1991-01-01

    This paper describes Robot Vision and Operation System for Nuclear Advanced Robot. This Robot Vision consists of robot position detection, obstacle detection, and object recognition. With these vision techniques, a mobile robot can plan a path and move autonomously along it. The authors implemented the above robot vision system on the 'Advanced Robot for Nuclear Power Plant' and tested it in an environment mocked up as nuclear power plant facilities. Since the operation system for this robot consists of an operator's console and a large stereo monitor, the system can be easily operated by one person. Experimental tests were made using the Advanced Robot (nuclear robot). Results indicate that the proposed operation system is very useful and can be operated by only one person. (author)

  18. Control algorithms for autonomous robot navigation

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1985-01-01

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced
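Navigation control in explored terrain commonly reduces to shortest-path search over a graph of the known environment. A minimal Dijkstra sketch follows; the node names and edge costs are invented for the example and do not come from the paper.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over graph = {node: [(neighbor, cost), ...]}.

    Returns (total_cost, path) or (inf, []) if the goal is unreachable.
    """
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, c in graph.get(node, []):
            if nbr not in visited:
                heapq.heappush(queue, (cost + c, nbr, path + [nbr]))
    return float("inf"), []

# Toy terrain graph: edge weights stand for traversal costs.
terrain = {
    "base": [("ridge", 4), ("valley", 1)],
    "valley": [("ridge", 1), ("site", 5)],
    "ridge": [("site", 1)],
}
print(shortest_path(terrain, "base", "site"))
# -> (3, ['base', 'valley', 'ridge', 'site'])
```

Handling the paper's other two aspects, sensor interaction and unanticipated situations, then amounts to updating the graph online and replanning when an edge turns out to be blocked.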

  19. Investigating the Effect of Social Robot Embodiment.

    Science.gov (United States)

    Whelan, Sally; Kouroupetroglou, Christos; Santorelli, Adam; Raciti, Massimiliano; Barrett, Eva; Casey, Dympna

    2017-01-01

    The experiment described in this paper is an early assessment to identify if the embodiment of a verbal and visual user interaction system in a robot is more effective in people with dementia than when using the same system in a simple laptop. This study provides input for the robot's design.

  20. Starting a Robotics Program in Your County

    Science.gov (United States)

    Habib, Maria A.

    2012-01-01

    The current mission mandates of the National 4-H Headquarters are Citizenship, Healthy Living, and Science. Robotics programs are excellent in fulfilling the Science mandate. Robotics engages students in STEM (Science, Engineering, Technology, and Mathematics) fields by providing interactive, hands-on, minds-on, cross-disciplinary learning…