Search results for "Human-Robot interaction"
Showing 10 of 37 documents
The design of interfaces for multi-robot path planning and control
2014
The field of human-robot interaction has evolved beyond issues concerning the design and development of one person controlling one robot to exploring HRI for groups of robots and teams. Our design research explores biologically-inspired motion that is initiated by a human operator, applied to a single robot or a small group of robots, and used to affect the motion and path planning of another subset of robots. This exploratory design study first created a taxonomy of individual robot motions, examining how they could be classified and used as building blocks. We then combined individual motions with time and velocity as design variables to guide our interaction design. This work led …
A Topic Recognition System for Real World Human-Robot Conversations
2013
One of the main features of social robots is the ability to communicate and interact with people as partners in a natural way. However, achieving good verbal interaction is a hard task due to errors in speech recognition systems and to the difficulty of understanding natural language itself. This paper tries to overcome such problems by presenting a system that enables social robots to get involved in a conversation by recognizing its topic. Through the use of a classical text mining approach, the presented system allows social robots to understand the topics of conversation between human partners, enabling the customization of their behaviours accordingly. The system has been evaluated …
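The abstract does not detail the text mining pipeline; as a rough, hypothetical sketch, topic recognition over transcribed utterances could be done with a TF-IDF bag-of-words representation and a Naive Bayes classifier (the topics and training sentences below are invented for illustration):

```python
# Hypothetical topic-recognition sketch: TF-IDF features + Naive Bayes.
# The paper only states that a "classical text mining approach" is used;
# the concrete pipeline, labels and training data here are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

utterances = [
    "the match last night went to penalties",
    "they scored twice in the second half",
    "I booked a table for dinner tomorrow",
    "my flight to Rome leaves at noon",
]
topics = ["sports", "sports", "food", "travel"]

classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
classifier.fit(utterances, topics)

# The robot classifies the human partners' transcribed speech and can then
# adapt its behaviour to the recognized topic.
print(classifier.predict(["who won the game yesterday?"]))
```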
Grounded Human-Robot Interaction
2009
Automation Inner Speech as an Anthropomorphic Feature Affecting Human Trust: Current Issues and Future Directions
2021
This paper aims to discuss the possible role of inner speech in influencing trust in human–automation interaction. Inner speech is an everyday covert inner monolog or dialog with oneself, which is essential for human psychological life and functioning as it is linked to self-regulation and self-awareness. Recently, in the field of machine consciousness, computational models using different forms of robot speech have been developed that make it possible to implement inner speech in robots. As is discussed, robot inner speech could be a new feature affecting human trust by increasing robot transparency and anthropomorphism.
Recognizing actions with the associative self-organizing map
2013
When artificial agents interact and cooperate with other agents, either human or artificial, they need to recognize others' actions and infer their hidden intentions from the sole observation of their surface-level movements. Indeed, action and intention understanding in humans is believed to facilitate a number of social interactions and is supported by a complex neural substrate (i.e. the mirror neuron system). Implementation of such mechanisms in artificial agents would pave the way toward the development of a vast range of advanced cognitive abilities, such as social interaction, adaptation, and learning by imitation, just to name a few. We present a first step towards a fully-fledged int…
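The map architecture is not specified in this excerpt; as an illustrative, simplified sketch (not the paper's associative coupling between maps), a basic self-organizing map can cluster movement feature vectors so that an observed action is recognized by its best-matching node. All sizes and parameters below are assumptions:

```python
import numpy as np

# Simplified self-organizing map (SOM) sketch for movement feature vectors.
# The associative coupling between maps described in the paper is omitted;
# dimensions and learning parameters are illustrative assumptions.
rng = np.random.default_rng(0)
n_nodes, dim = 16, 8                        # 16 map nodes, 8-D movement features
weights = rng.normal(size=(n_nodes, dim))

def som_step(weights, x, lr=0.1, sigma=2.0):
    """One SOM update: pull the best-matching node and its neighbours toward x."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))    # best-matching unit
    dist = np.abs(np.arange(len(weights)) - bmu)            # 1-D map topology
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))             # neighbourhood kernel
    return weights + lr * h[:, None] * (x - weights)

for _ in range(200):                                        # dummy observations stand
    weights = som_step(weights, rng.normal(size=dim))       # in for real movement data

# Recognition: an observed movement maps to its best-matching node.
observation = rng.normal(size=dim)
print(np.argmin(np.linalg.norm(weights - observation, axis=1)))
```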
A MOBILE ROBOT FOR TRANSPORT APPLICATIONS IN HOSPITAL DOMAIN WITH SAFE HUMAN DETECTION ALGORITHM
2009
We have been developing the MKR (Muratec Keio Robot), an autonomous omni-directional mobile transfer robot system for hospital applications. This robot has a wagon truck to transfer luggage, important specimens and other materials. This study proposes a safe obstacle collision avoidance technique, including a human detection algorithm, for omni-directional mobile robots that realizes safe movement. The robot can distinguish people from other obstacles with the human detection algorithm, and it evades people more safely by considering its relative position and velocity with respect to them. Some experiments in a hospital were carried out to verify the performance of the human…
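The avoidance rule is not given in this excerpt; one common way to use the relative position and velocity of a detected person is to estimate the distance and time of closest approach and to scale the robot's speed down when a near-collision is predicted. A minimal sketch under that assumption (all thresholds invented):

```python
import numpy as np

def closest_approach(rel_pos, rel_vel):
    """Time and distance of closest approach, from relative position/velocity."""
    speed_sq = float(np.dot(rel_vel, rel_vel))
    if speed_sq < 1e-9:                          # essentially no relative motion
        return 0.0, float(np.linalg.norm(rel_pos))
    t = max(0.0, -float(np.dot(rel_pos, rel_vel)) / speed_sq)
    return t, float(np.linalg.norm(rel_pos + t * rel_vel))

def speed_command(rel_pos, rel_vel, v_max=1.0, safe_dist=1.5, horizon=3.0):
    """Scale speed down when a detected person is on a near-collision course."""
    t, d = closest_approach(np.asarray(rel_pos, float), np.asarray(rel_vel, float))
    if t < horizon and d < safe_dist:
        return v_max * d / safe_dist             # slow down proportionally
    return v_max

# Person 2 m ahead, closing at 1 m/s: the commanded speed drops toward zero.
print(speed_command(rel_pos=[2.0, 0.0], rel_vel=[-1.0, 0.0]))
```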
Sensorimotor Communication for Humans and Robots: Improving Interactive Skills by Sending Coordination Signals
2018
During joint actions, humans continuously exchange coordination signals and use nonverbal, sensorimotor forms of communication. Here we discuss a specific example of sensorimotor communication, "signaling", which consists in the intentional modification of one's own action plan (e.g., a plan for reaching a glass of wine) to make it more predictable or discriminable from alternative action plans that are contextually plausible (e.g., a plan for reaching another glass on the same table). We first review the existing evidence on signaling in human-human interactions, discussing under which conditions humans use signaling. We then distill these insights into a computational theory of sig…
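The theory itself is truncated here; to make the core idea concrete, one illustrative (not the authors') formalization scores each candidate plan by a trade-off between its execution cost and its discriminability from the plan an observer would expect for the alternative goal. All trajectories, costs and weights below are invented:

```python
import numpy as np

# Illustrative "signaling" sketch: among candidate reaching plans toward one
# glass, prefer the plan that is cheap to execute yet easy to tell apart from
# a reach toward the other, nearby glass. Numbers are assumptions.

def reach_plan(goal, bend=0.0, n=20):
    """Straight reach from the origin to `goal`, optionally bent sideways."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    path = t * np.asarray(goal, float)
    path[:, 1] += bend * np.sin(np.pi * t[:, 0])          # lateral exaggeration
    return path

def cost(plan):
    return np.sum(np.linalg.norm(np.diff(plan, axis=0), axis=1))   # path length

def discriminability(plan, alternative):
    return np.mean(np.linalg.norm(plan - alternative, axis=1))     # mean separation

goal, other_goal = [1.0, 0.1], [1.0, -0.1]     # two nearby glasses on the table
alt_plan = reach_plan(other_goal)              # expected reach toward the other glass

bends = (0.0, 0.2, 0.4)
scores = [discriminability(reach_plan(goal, b), alt_plan) - 0.3 * cost(reach_plan(goal, b))
          for b in bends]
# The exaggerated (bent) reach wins: slightly costlier, but far easier for the
# partner to discriminate from a reach toward the other glass.
print("chosen exaggeration:", bends[int(np.argmax(scores))])
```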
Vision and emotional flow in a cognitive architecture for human-machine interaction
2011
The detection and recognition of a human face should meet the need for social interaction that drives a humanoid robot, and it should be consistent with the robot's cognitive model and the perceived scene. The paper describes the potential of a system of emotional contagion and proposes a simple implementation of it. An emotional index makes it possible to build a mechanism that tends to align the emotional states of the robot and the human when a specific object is detected in the scene. Pursuing the idea of social interaction based on affect recognition, a first practical application capable of managing the emotional flow is described, involving both conceptual spaces and an emo…
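The emotional index is not formalized in this excerpt; as a minimal, hypothetical sketch of the alignment mechanism, the robot's scalar emotional state can be nudged toward the valence perceived in the human whenever the trigger object is detected (rate and representation are assumptions):

```python
# Hypothetical emotional-contagion sketch: a scalar valence for the robot
# drifts toward the valence recognized in the human, but only while the
# specific trigger object is detected in the scene. The rate is an assumption.
def update_valence(robot_valence, human_valence, object_detected, rate=0.2):
    if not object_detected:
        return robot_valence                    # no contagion without the trigger
    return robot_valence + rate * (human_valence - robot_valence)

robot = 0.0                                     # neutral initial state
for _ in range(10):                             # repeated detections of the object
    robot = update_valence(robot, human_valence=0.8, object_detected=True)
print(round(robot, 3))                          # drifts toward the human's 0.8
```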
Toward Self-Aware Robots
2018
Despite major progress in Robotics and AI, robots are still basically “zombies” repeatedly achieving actions and tasks without understanding what they are doing. Deep-Learning AI programs classify tremendous amounts of data without grasping the meaning of their inputs or outputs. We still lack a genuine theory of the underlying principles and methods that would enable robots to understand their environment, to be cognizant of what they do, to take appropriate and timely initiatives, to learn from their own experience and to show that they know that they have learned and how. The rationale of this paper is that the understanding of its environment by an agent (the agent itself and its effect…
Sing with the Telenoid
2012
We introduce a novel research proposal project aimed at building a robotic setup in which the Telenoid learns to improvise jazz singing in a duet with a human singer. In the proposed application, the Telenoid acts in teleoperated mode during the learning phase, while it becomes more and more autonomous during the working phase. A goal of the research is to investigate the essence of human communication, which is based on gestures and prosody. We will employ an architecture for imitation learning that incrementally learns, from demonstrations, sequences of internal model activations, based on the idea of coupled forward-inverse internal models for representing musical phrases and the body sequenc…
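The architecture is only outlined in this proposal; a rough sketch of a coupled forward-inverse pair, here fit as two linear least-squares models over demonstration data (the linear form, feature dimensions and synthetic data are all assumptions), could look like this:

```python
import numpy as np

# Coupled forward/inverse internal models, sketched as two linear fits over
# demonstration data. Dimensions, the linear model class and the synthetic
# data are assumptions; the proposal does not fix any of them.
rng = np.random.default_rng(0)
states = rng.normal(size=(200, 6))              # e.g. body/prosody features
actions = rng.normal(size=(200, 3))             # e.g. motor/pitch commands
next_states = states + 0.1 * actions @ rng.normal(size=(3, 6))

# Forward model: (state, action) -> predicted next state.
W_fwd, *_ = np.linalg.lstsq(np.hstack([states, actions]), next_states, rcond=None)
# Inverse model: (state, desired next state) -> action.
W_inv, *_ = np.linalg.lstsq(np.hstack([states, next_states]), actions, rcond=None)

# At run time the inverse model proposes an action for a desired next state,
# and the forward model predicts its outcome before execution.
s, s_goal = states[0], next_states[0]
a = np.hstack([s, s_goal]) @ W_inv
s_pred = np.hstack([s, a]) @ W_fwd
print(round(float(np.linalg.norm(s_pred - s_goal)), 4))   # small prediction error
```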