Search results for "human-robot interaction"
Showing 10 of 37 documents
Agents and robots for collaborating and supporting physicians in healthcare scenarios
2020
What should I do next? Using shared representations to solve interaction problems
2011
Studies on how "the social mind" works reveal that cognitive agents engaged in joint actions actively estimate and influence one another's cognitive variables and form shared representations with them. (How) do shared representations enhance coordination? In this paper, we provide a probabilistic model of joint action that emphasizes how shared representations help solve interaction problems. We focus on two aspects of the model. First, we discuss how shared representations make it possible to coordinate at the level of cognitive variables (beliefs, intentions, and actions) and determine a coherent unfolding of action execution and predictive processes in the brains of two agents. Second, we discuss th…
Agents in dynamic contexts, a system for learning plans
2020
Reproducing the human ability to cooperate and collaborate in a dynamic environment is a significant challenge in the field of human-robot teaming. Generally, in this context, a robot has to adapt itself to handle unforeseen situations. The problem is runtime planning when some factors are not known before execution starts. This work aims to show and discuss a method for handling this kind of situation. Our idea is to use the Belief-Desire-Intention agent paradigm, the Jason reasoning cycle, and a Non-Axiomatic Reasoning System. The result is a novel method that gives the robot the ability to select the best plan.
Acceptability Study of A3-K3 Robotic Architecture for a Neurorobotics Painting
2019
In this paper, the authors present a novel architecture for controlling an industrial robot via a Brain Computer Interface. The robot used is a Series 2000 KR 210-2. The robotic arm was fitted with DI drawing devices that clamp, hold, and manipulate various artistic media such as brushes, pencils, and pens. The user selected a high-level task, for instance a shape or movement, using a human-machine interface, and its translation into robot movement was entirely delegated to the Robot Control Architecture, which defined a plan to accomplish the user's task. The architecture was composed of a Human Machine Interface based on a P300 Brain Computer Interface and a robotic architecture composed of a deliberative layer and a reac…
Resolving ambiguities in a grounded human-robot interaction
2009
In this paper we propose a trainable system that learns grounded language models from examples with a minimum of user intervention and without feedback. We have focused on the acquisition of grounded meanings of spatial and adjective/noun terms. The system has been used to understand, and subsequently to generate, appropriate natural-language descriptions of real objects and to engage in verbal interactions with a human partner. We have also addressed the problem of resolving possible ambiguities arising during verbal interaction through an information-theoretic approach.
Robot's Inner Speech Effects on Trust and Anthropomorphic Cues in Human-Robot Cooperation
2021
Inner speech is an essential but also elusive human psychological process: an everyday covert internal conversation with oneself. We argue that programming a robot with an overt self-talk system, which simulates human inner speech, might enhance human trust by improving robot transparency and anthropomorphism. For these reasons, this work aims to investigate whether a robot's inner speech, here intended as overt self-talk, affects human trust and anthropomorphism when human and robot cooperate. A group of participants was engaged in collaboration with the robot. During cooperation, the robot talked to itself. To evaluate whether the robot's inner speech influences human trust, two question…
A global workspace theory model for trust estimation in human-robot interaction
2019
Successful and genuine social connections between humans are based on trust, even more so when the people involved have to collaborate to reach a shared goal. With the advent of new findings and technologies in the field of robotics, it appears that this same key factor regulating relationships between humans also applies, with the same importance, to human-robot interaction (HRI). Previous studies have proven the usefulness of a robot able to estimate the trustworthiness of its human collaborators, and in this position paper we discuss a method to extend an existing state-of-the-art trust model with considerations based on social cues such as emotions. The proposed model follows the Global …
“It Is Not the Robot Who Learns, It Is Me.” Treating Severe Dysgraphia Using Child–Robot Interaction
2021
Writing disorders are frequent and impairing. However, social robots may help to improve children's motivation and to propose enjoyable and tailored activities. Here, we have used the CoWriter scenario, in which a child is asked to teach a robot how to write via demonstration on a tablet, combined with a series of games we developed specifically to train pressure, tilt, speed, and letter-liaison control. This setup was proposed to a 10-year-old boy with a complex neurodevelopmental disorder combining phonological disorder, attention deficit/hyperactivity disorder, dyslexia, and developmental coordination disorder with severe dysgraphia. Writing impairments were severe and limited his parti…
Developing Self-Awareness in Robots via Inner Speech
2019
The experience of inner speech is a common one. Such a dialogue accompanies the introspection of mental life and fulfills essential roles in human behavior, such as self-restructuring, self-regulation, and the refocusing of attentional resources. Although the underpinnings of inner speech are mostly investigated in psychological and philosophical fields, research in robotics generally does not address such a form of self-aware behavior. Existing models of inner speech inspire computational tools to provide a robot with this form of self-awareness. Here, the widespread psychological models of inner speech are reviewed, and a cognitive architecture for a robot implementing such a capability is…
Visually-Grounded Language Model for Human-Robot Interaction
2010
Visually grounded human-robot interaction is recognized to be an essential ingredient of socially intelligent robots, and the integration of vision and language increasingly attracts the attention of researchers in diverse fields. However, most systems lack the capability to adapt and expand themselves beyond a preprogrammed set of communicative behaviors. Their linguistic capabilities are still far from satisfactory, which makes them unsuitable for real-world applications. In this paper we present a system in which a robotic agent can learn a grounded language model by actively interacting with a human user. The model is grounded in the sense that the meaning of the words is linked to a…