
RESEARCH PRODUCT

Path Following in Non-Visual Conditions.

Stefano Papetti, Alan Del Piccolo, Davide Rocchesso

subject

Adult; Male; Computer science; Information Systems: Information Interfaces and Presentation (e.g. HCI); Path following; 02 engineering and technology; 050105 experimental psychology; Task (project management); Haptic Interfaces; Position (vector); Feedback, Sensory; Physical Stimulation; 0202 electrical engineering, electronic engineering, information engineering; medicine; Humans; 0501 psychology and cognitive sciences; Computer vision; Human computer interaction; User interfaces; Audio user interfaces; Haptic interfaces; Audio User Interfaces; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Settore INF/01 - Informatica; business.industry; 05 social sciences; 020207 software engineering; Index finger; Human Computer Interaction; Computer Science Applications; Visualization; Human-Computer Interaction; medicine.anatomical_structure; Acoustic Stimulation; Touch Perception; Path (graph theory); Task analysis; Auditory Perception; Female; Artificial intelligence; Cues; business; Psychomotor Performance; Gesture; User Interfaces; Spatial Navigation

description

Path-following tasks have been investigated mostly under visual conditions, that is, when subjects are able to see both the path and the tool, or limb, used for navigation. Moreover, only basic path shapes are usually adopted. In the present experiment, participants had to rely exclusively on continuous, non-speech, ecological auditory and vibrotactile cues to follow a path on a flat surface. Two different, asymmetric path shapes were tested. Participants navigated by moving their index finger over a surface that senses position and force. Results show that the different non-visual feedback modes did not affect the accuracy of the task, yet they affected its speed, with vibrotactile feedback causing the slowest gestures. Vibrotactile feedback also caused participants to exert more force on the surface. Finally, the shape of the path was relevant to accuracy, and participants tended to prefer audio over vibrotactile and audio-tactile feedback.
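The abstract does not specify how the continuous cues were computed from the finger position, so the following is only an illustrative sketch of one plausible scheme: the finger's distance to the nearest point of a polyline path is mapped onto a feedback intensity that could drive the amplitude of an audio or vibrotactile signal. The corridor width, path coordinates, and mapping are hypothetical assumptions, not the authors' method.

```python
# Illustrative sketch only: maps finger-to-path distance to a feedback level.
# All numeric values and the linear mapping are assumptions for demonstration.
import math


def point_segment_distance(p, a, b):
    """Distance from 2D point p to the line segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment: a and b coincide
        return math.hypot(px - ax, py - ay)
    # Projection parameter clamped to the segment endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)


def distance_to_path(finger_pos, path_points):
    """Minimum distance from the finger position to a polyline path."""
    return min(
        point_segment_distance(finger_pos, path_points[i], path_points[i + 1])
        for i in range(len(path_points) - 1)
    )


def feedback_intensity(finger_pos, path_points, corridor=10.0):
    """Hypothetical mapping: 1.0 on the path, decaying linearly to 0.0
    at the edge of an assumed 10 mm corridor around it."""
    d = distance_to_path(finger_pos, path_points)
    return max(0.0, 1.0 - d / corridor)


if __name__ == "__main__":
    # Arbitrary asymmetric example path (coordinates in millimetres)
    path = [(0, 0), (30, 10), (50, 40), (90, 45)]
    print(feedback_intensity((32, 14), path))  # finger close to the path
    print(feedback_intensity((60, 80), path))  # finger far from the path
```

In such a scheme the intensity value would be updated continuously as the finger moves, and rendered either as the level of an audio cue or as vibration amplitude, depending on the feedback condition.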

10.1109/toh.2018.2861767
https://pubmed.ncbi.nlm.nih.gov/30072341