Search results for "human-robot interaction"

Showing 10 of 37 documents

Agents and robots for collaborating and supporting physicians in healthcare scenarios

2020

Graphical abstract

Keywords: telemedicine; telehealth; remote patient monitoring; health informatics; medical informatics; artificial intelligence; multi-agent systems; agent architecture; robotics; robots in therapy; robots in emergency care for COVID-19; COVID-19; coronavirus infections; pandemics; viral pneumonia; triage; physiologic monitoring; nursing homes; geriatrics; emergency medicine; risk analysis; human-robot interaction. Journal: Journal of Biomedical Informatics.

What should I do next? Using shared representations to solve interaction problems

2011

Studies on how “the social mind” works reveal that cognitive agents engaged in joint actions actively estimate and influence one another’s cognitive variables and form shared representations with them. (How) do shared representations enhance coordination? In this paper, we provide a probabilistic model of joint action that emphasizes how shared representations help solve interaction problems. We focus on two aspects of the model. First, we discuss how shared representations permit coordination at the level of cognitive variables (beliefs, intentions, and actions) and determine a coherent unfolding of action execution and predictive processes in the brains of two agents. Second, we discuss th…
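
The full probabilistic treatment is in the paper; as a rough, hypothetical illustration of the underlying Bayesian idea, the Python sketch below shows one agent maintaining a belief over its partner's intention and updating it after observing an action. The intention labels, prior, and likelihoods are invented for the example and are not taken from the paper's model.

    # Hypothetical sketch: Bayesian estimation of a partner's intention in a joint action.
    # All intentions, priors, and likelihoods are invented for illustration.

    def update_belief(prior, likelihood, observation):
        """Posterior over the partner's intention after observing one action."""
        unnorm = {i: prior[i] * likelihood[i][observation] for i in prior}
        total = sum(unnorm.values())
        return {i: p / total for i, p in unnorm.items()}

    prior = {"lift_left": 0.5, "lift_right": 0.5}        # shared prior over intentions
    likelihood = {                                        # P(observed action | intention)
        "lift_left":  {"reach_left": 0.8, "reach_right": 0.2},
        "lift_right": {"reach_left": 0.1, "reach_right": 0.9},
    }

    belief = update_belief(prior, likelihood, "reach_left")
    print(belief)  # mass shifts toward "lift_left", guiding the observer's own action choice

Loosely, the shared representation corresponds to the common prior and predictive model that both agents rely on to anticipate each other.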

Keywords: joint action; shared representations; graphical models; Bayesian inference; statistical models; prediction; signaling; cognition; cooperative behavior; interpersonal relations; problem solving; cognitive science; human-robot interaction.

Agents in dynamic contexts, a system for learning plans

2020

Reproducing the human ability to cooperate and collaborate in a dynamic environment is a significant challenge in the field of human-robot teaming. Generally, in this context, a robot has to adapt itself to handle unforeseen situations. The problem is runtime planning when some factors are not known before execution starts. This work aims to show and discuss a method for handling this kind of situation. Our idea is to combine the Belief-Desire-Intention agent paradigm, the Jason reasoning cycle, and a Non-Axiomatic Reasoning System. The result is a novel method that gives the robot the ability to select the best plan.
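
As a very small, hypothetical illustration of the plan-selection step (not the actual Jason/Non-Axiomatic Reasoning System integration described above), the Python sketch below picks the applicable plan whose context condition holds in the current belief base and whose score is highest; plan names, contexts, and scores are invented.

    # Hypothetical sketch of BDI-style plan selection: choose the best applicable plan
    # whose context condition is satisfied by the current beliefs.

    beliefs = {"door_open", "battery_ok"}

    plans = [
        {"name": "deliver_through_door", "context": {"door_open", "battery_ok"}, "score": 0.9},
        {"name": "wait_and_recharge",    "context": {"battery_low"},             "score": 0.6},
        {"name": "ask_for_help",         "context": set(),                       "score": 0.1},
    ]

    applicable = [p for p in plans if p["context"] <= beliefs]   # context condition holds
    best = max(applicable, key=lambda p: p["score"])
    print(best["name"])   # -> deliver_through_door

In a BDI interpreter such as Jason, the context condition plays this filtering role; the contribution described above concerns how the best applicable plan is chosen at runtime.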

Keywords: BDI agents; Jason; reasoning systems; planning; robots; human-robot interaction; human-computer interaction.

Acceptability Study of A3-K3 Robotic Architecture for a Neurorobotics Painting

2019

In this paper, the authors present a novel architecture for controlling an industrial robot via a Brain Computer Interface. The robot used is a Series 2000 KR 210-2. The robotic arm was fitted with DI drawing devices that clamp, hold, and manipulate various artistic media such as brushes, pencils, and pens. The user selected a high-level task, for instance a shape or movement, using a human machine interface, and the translation into robot movements was entirely delegated to the Robot Control Architecture, which defined a plan to accomplish the user's task. The architecture was composed of a Human Machine Interface based on a P300 Brain Computer Interface and a robotic architecture composed of a deliberative layer and a reac…
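
To make the control flow easier to picture, here is a hypothetical Python sketch of the dispatch step: a symbol selected through the P300 interface is mapped by a deliberative layer to a sequence of arm waypoints. The symbols and waypoints are invented and bear no relation to the actual KUKA controller or the A3-K3 software.

    # Hypothetical sketch: dispatch a BCI-selected high-level task to a motion plan.

    SHAPE_PLANS = {
        "circle": [(0.0, 0.0), (0.1, 0.1), (0.2, 0.0), (0.1, -0.1), (0.0, 0.0)],
        "line":   [(0.0, 0.0), (0.3, 0.0)],
    }

    def plan_for_selection(p300_symbol):
        """Deliberative layer: turn the user's selected symbol into a waypoint sequence."""
        if p300_symbol not in SHAPE_PLANS:
            raise ValueError(f"unknown selection: {p300_symbol}")
        return SHAPE_PLANS[p300_symbol]

    for waypoint in plan_for_selection("circle"):
        print("move brush to", waypoint)   # a reactive layer would execute and monitor each step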

Keywords: brain-computer interface (BCI); event-related potential (ERP); industrial robot; robotic arm; robot architecture; neurorobotics; art; artificial intelligence; human-robot interaction (HRI). Journal: Frontiers in Neurorobotics.

Resolving ambiguities in a grounded human-robot interaction

2009

In this paper we propose a trainable system that learns grounded language models from examples with a minimum of user intervention and without feedback. We have focused on the acquisition of grounded meanings of spatial and adjective/noun terms. The system has been used to understand and subsequently to generate appropriate natural language descriptions of real objects and to engage in verbal interactions with a human partner. We have also addressed the problem of resolving ambiguities that may arise during verbal interaction through an information-theoretic approach.
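
One common way to read the information-theoretic step (the paper's own formulation may differ) is to score the candidate referents with the learned grounded model and ask a clarification question only when the entropy of that distribution is high. The hypothetical Python sketch below illustrates this with invented referents and probabilities.

    import math

    # Hypothetical sketch of entropy-based ambiguity resolution:
    # ask a clarifying question only when the referent distribution is too uncertain.

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs.values() if p > 0)

    def resolve(referent_probs, threshold=1.0):
        if entropy(referent_probs) > threshold:
            return "ask: which object do you mean?"      # too ambiguous, query the user
        return max(referent_probs, key=referent_probs.get)

    print(resolve({"red ball": 0.45, "red cube": 0.40, "blue ball": 0.15}))  # ambiguous -> ask
    print(resolve({"red ball": 0.90, "red cube": 0.07, "blue ball": 0.03}))  # confident -> red ball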

Keywords: language models; natural language; natural language processing; machine learning; information theory; artificial intelligence; human-robot interaction.

Robot's Inner Speech Effects on Trust and Anthropomorphic Cues in Human-Robot Cooperation

2021

Inner speech is an essential but also elusive human psychological process which refers to an everyday covert internal conversation with oneself. We argue that programming a robot with an overt self-talk system, which simulates human inner speech, might enhance human trust by improving robot transparency and anthropomorphism. For these reasons, this work aims to investigate whether a robot's inner speech, here intended as overt self-talk, affects human trust and anthropomorphism when human and robot cooperate. A group of participants was engaged in collaboration with the robot. During cooperation, the robot talked to itself. To evaluate whether the robot's inner speech influences human trust, two question…
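
As a hypothetical illustration of what an overt self-talk layer can look like in software (not the study's actual implementation), the Python sketch below has the robot narrate its goal and its reason for the next action before executing it; in the study, this kind of overt narration is the manipulation whose effect on trust is measured.

    # Hypothetical sketch of overt self-talk: the robot narrates its reasoning aloud
    # before each action (invented task steps; not the experiment's software).

    def say(text):
        print(f"[robot, aloud] {text}")        # stand-in for a text-to-speech call

    def execute_with_self_talk(step):
        say(f"I need to {step['goal']}.")
        say(f"I will {step['action']} because {step['reason']}.")
        # ... perform the action here ...
        say("Done. What should I do next?")

    execute_with_self_talk({
        "goal": "set the table with my partner",
        "action": "pick up the fork",
        "reason": "my partner is already holding the plate",
    })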

Keywords: robot inner speech; robot consciousness; robot trust; cognitive robotics; human-robot interaction.

A global workspace theory model for trust estimation in human-robot interaction

2019

Successful and genuine social connections between humans are based on trust, even more so when the people involved have to collaborate to reach a shared goal. With the advent of new findings and technologies in the field of robotics, it appears that this same key factor that regulates relationships between humans also applies, with the same importance, to human-robot interaction (HRI). Previous studies have proven the usefulness of a robot able to estimate the trustworthiness of its human collaborators, and in this position paper we discuss a method to extend an existing state-of-the-art trust model with considerations based on social cues such as emotions. The proposed model follows the Global …
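
As a toy, hypothetical illustration of what extending a trust estimate with social cues could look like (not the Global Workspace model proposed in the paper), the Python sketch below blends task outcomes with an emotion-valence cue using invented weights.

    # Hypothetical sketch: blend task-based trust with an emotion cue
    # (invented weights and cue values; not the proposed model).

    def update_trust(trust, task_success, emotion_valence, w_task=0.2, w_emotion=0.1):
        """Move the trust estimate toward task outcomes and perceived emotional state."""
        trust += w_task * ((1.0 if task_success else 0.0) - trust)
        trust += w_emotion * (emotion_valence - trust)   # valence assumed in [0, 1]
        return min(max(trust, 0.0), 1.0)

    trust = 0.5
    for success, valence in [(True, 0.8), (True, 0.7), (False, 0.3)]:
        trust = update_trust(trust, success, valence)
        print(round(trust, 3))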

Keywords: cognitive robotics; emotions; theory of mind; Global Workspace Theory; trust; human-robot interaction.

“It Is Not the Robot Who Learns, It Is Me.” Treating Severe Dysgraphia Using Child–Robot Interaction

2021

Writing disorders are frequent and impairing. However, social robots may help to improve children's motivation and to propose enjoyable and tailored activities. Here, we have used the Co-writer scenario, in which a child is asked to teach a robot how to write via demonstration on a tablet, combined with a series of games we developed to specifically train pressure, tilt, speed, and letter-liaison control. This setup was proposed to a 10-year-old boy with a complex neurodevelopmental disorder combining phonological disorder, attention deficit/hyperactivity disorder, dyslexia, and developmental coordination disorder with severe dysgraphia. Writing impairments were severe and limited his parti…

Keywords: dysgraphia; handwriting; learning by teaching; serious games; social robots; occupational therapy; dyslexia; phonological disorder; human-robot interaction. Journal: Frontiers in Psychiatry.

Developing Self-Awareness in Robots via Inner Speech

2019

The experience of inner speech is a common one. Such a dialogue accompanies the introspection of mental life and fulfills essential roles in human behavior, such as self-restructuring, self-regulation, and the re-focusing of attentional resources. Although the underpinnings of inner speech are mostly investigated in psychological and philosophical fields, research in robotics generally does not address this form of self-aware behavior. Existing models of inner speech inspire computational tools to provide a robot with this form of self-awareness. Here, the widespread psychological models of inner speech are reviewed, and a cognitive architecture for a robot implementing such a capability is…
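
As one hypothetical reading of a cognitive cycle with an inner-speech phase (not the architecture proposed in the paper), the Python sketch below feeds the robot's own self-generated utterance back into the next perception step, closing the inner-dialogue loop; the messages and structure are invented.

    # Hypothetical sketch of a cognitive cycle with an inner-speech phase:
    # the covert utterance re-enters the loop as input to the next cycle.

    def perceive(external_input, inner_voice):
        return {"world": external_input, "heard_self": inner_voice}

    def inner_speech(percept):
        if percept["heard_self"]:
            return (f"I told myself: '{percept['heard_self']}' "
                    f"Now I see {percept['world']}; should I keep that goal?")
        return f"I see {percept['world']}. My goal is to tidy it up."

    def act(utterance):
        print("[covert]", utterance)

    inner_voice = None
    for observation in ["a cluttered table", "a cluttered table", "a clean table"]:
        percept = perceive(observation, inner_voice)
        inner_voice = inner_speech(percept)   # fed back into the next perception step
        act(inner_voice)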

Keywords: inner speech; self-awareness; introspection; cognitive architecture; cognitive cycle; psychological models; robotics; artificial intelligence; human-robot interaction.

Visually-Grounded Language Model for Human-Robot Interaction

2010

Visually grounded human-robot interaction is recognized to be an essential ingredient of socially intelligent robots, and the integration of vision and language increasingly attracts the attention of researchers in diverse fields. However, most systems lack the capability to adapt and expand themselves beyond the preprogrammed set of communicative behaviors. Their linguistic capabilities are still far from satisfactory, which makes them unsuitable for real-world applications. In this paper we present a system in which a robotic agent can learn a grounded language model by actively interacting with a human user. The model is grounded in the sense that the meaning of the words is linked to a…
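
A minimal, hypothetical way to picture grounding word meanings in perception (not the learning algorithm of the paper) is to accumulate co-occurrence counts between words and perceptual features across interactions, as in the Python sketch below; the features and utterances are invented.

    from collections import defaultdict

    # Hypothetical sketch: ground words in perceptual features by co-occurrence.

    cooccurrence = defaultdict(lambda: defaultdict(int))

    def observe(utterance, perceived_features):
        """One interaction: the user describes an object the robot currently perceives."""
        for word in utterance.split():
            for feature in perceived_features:
                cooccurrence[word][feature] += 1

    observe("the red ball", {"color:red", "shape:round"})
    observe("the red cube", {"color:red", "shape:cubic"})
    observe("the blue ball", {"color:blue", "shape:round"})

    # "red" is now most strongly associated with color:red, "ball" with shape:round
    print(max(cooccurrence["red"], key=cooccurrence["red"].get))    # -> color:red
    print(max(cooccurrence["ball"], key=cooccurrence["ball"].get))  # -> shape:round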

Keywords: human-robot interaction; language learning; language grounding.