Search results for "Eye Tracking"
Showing 10 of 117 documents
Effort in Semi-Automatized Subtitling Processes
2020
The presented study investigates the impact of automatic speech recognition (ASR) and assisting scripts on effort during transcription and translation processes, two main subprocesses of interlingual subtitling. Applying keylogging and eye tracking, this study takes a first look at how the integration of ASR impacts these subprocesses. Twelve professional subtitlers and 13 translation students were recorded performing two intralingual transcription tasks and three translation tasks to evaluate the impact on temporal, technical, and cognitive effort, and on split attention. Measures include editing time, visit count and duration, insertions, and deletions. The main findings show that, in both tasks, ASR…
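As an illustration of the effort measures listed above, here is a minimal sketch (not the study's actual pipeline) of how technical and temporal effort indicators could be aggregated from keylogging output; the event format and field names are assumptions.

```python
from typing import Dict, Iterable, List


def summarize_keylog(events: Iterable[Dict]) -> Dict[str, float]:
    """Aggregate effort indicators from keylog events.

    Each event is assumed (hypothetically) to look like:
        {"t": 12.7, "kind": "insertion"}  # time in seconds; kind is "insertion" or "deletion"
    """
    evts: List[Dict] = list(events)
    insertions = sum(1 for e in evts if e["kind"] == "insertion")
    deletions = sum(1 for e in evts if e["kind"] == "deletion")
    # Editing time approximated as the span between the first and last keylog event.
    editing_time = max(e["t"] for e in evts) - min(e["t"] for e in evts) if evts else 0.0
    return {"insertions": insertions, "deletions": deletions, "editing_time_s": editing_time}


print(summarize_keylog([
    {"t": 0.0, "kind": "insertion"},
    {"t": 3.2, "kind": "deletion"},
    {"t": 9.8, "kind": "insertion"},
]))
```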
Driver Situation Awareness and Perceived Sleepiness during Truck Platoon Driving – Insights from Eye-tracking Data
2021
Truck platoon driving technology uses vehicle-to-vehicle communication to allow one truck to follow another in an automated fashion. The first vehicle is operated manually, the second vehicle is dr...
Image Content Enhancement Through Salient Regions Segmentation for People With Color Vision Deficiencies
2019
Color vision deficiencies affect the visual perception of colors and, more generally, of color images. Several sciences, such as genetics, biology, medicine, and computer vision, are involved in studying and analyzing vision deficiencies. As we know from visual saliency findings, the human visual system tends to fixate on specific points and regions of an image within the first seconds of observation, summing up the most important and meaningful parts of the scene. In this article, we present studies of the behavioral differences between normal and color vision-deficient visual systems. We eye-tracked human fixations during the first 3 seconds of observation of color images to build real f…
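Fixation data like those described above are commonly turned into a density map before comparison. The snippet below is a minimal sketch of that generic step, not the authors' code; the image size, Gaussian width, and fixation tuple format are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def fixation_map(fixations, width=800, height=600, sigma=25.0, max_t=3.0):
    """Build a normalized fixation density map from (x, y, onset_seconds) tuples,
    keeping only fixations that start within the first `max_t` seconds."""
    density = np.zeros((height, width), dtype=float)
    for x, y, t in fixations:
        if t <= max_t and 0 <= int(x) < width and 0 <= int(y) < height:
            density[int(y), int(x)] += 1.0
    density = gaussian_filter(density, sigma=sigma)  # smooth point hits into a map
    return density / density.max() if density.max() > 0 else density


demo = fixation_map([(400, 300, 0.4), (410, 310, 1.1), (120, 80, 2.7)])
print(demo.shape, round(float(demo.max()), 3))
```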
How Our Gaze Reacts to Another Person’s Tears? Experimental Insights Into Eye Tracking Technology
2020
Crying is a ubiquitous human behavior through which an emotion is expressed on the face together with visible tears, and it constitutes a slippery riddle for researchers. To provide an answer to the question “How our gaze reacts to another person’s tears?,” we made use of eye tracking technology to study a series of visual stimuli. By presenting an illustrative example through an experimental setting specifically designed to study the “tearing effect,” the present work aims to offer methodological insight into how to use eye-tracking technology to study non-verbal cues. A sample of 30 healthy young women with normal visual acuity performed a within-subjects task in which they evaluated images of…
Did the three little pigs frighten the wolf? How deaf readers use lexical and syntactic cues to comprehend sentences
2020
Background: The ways in which students with deafness process syntactic and semantic cues while reading sentences are unclear. While some studies have supported a preference for semantic cues, others have not. Aim: To examine differences in the processing of syntactic versus semantic cues during sentence reading among students who are deaf or hard of hearing (DHH). Method: Twenty DHH students (mean age = 12.48 years) and 20 chronologically age-matched students with typical hearing (TH) were asked to read sentences written in Spanish with different grammatical structures and to choose the picture that best matched the sentences’ meaning while their eye movements were being registered…
Comprehension of complex animation: cueing, segmentation and 2D/3D presentations
2011
The goal of our studies was to test the effect of segmentation, cueing, and 2D/3D presentations on fostering complex animation processing. The material was an upright mechanical piano system. We used an eye tracking system that provides information about the direction of learners’ attention during animation processing. We analyzed the effect of the presentation formats and of eye movements during learning. Based on animation and multimedia research, four experiments were conducted. In the first experiment, the effect of presenting simplified external representations on learning from a complex animation was investigated. Experiments two and three aimed at studying the cognitive pr…
Comparing competing views of analogy making using eye-tracking technology
2016
We used eye-tracking to study the time course of analogical reasoning in adults. We considered proportions of looking times and saccades. The main question was whether or not adults would follow the same search strategies for different types of analogical problems (Scene Analogies vs. Classical A:B::C:D vs. a Scene version of A:B::C:D). We then compared these results to the predictions of various models of analogical reasoning. Results revealed a picture of common search patterns, with local adaptations to the specifics of each paradigm in both looking-time duration and the number and types of saccades. These results are discussed in terms of conceptions of analogical r…
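For readers unfamiliar with the looking-time measure mentioned above, here is a minimal sketch of how proportions of looking time per region of a problem could be computed; the (region label, duration) format is a hypothetical simplification, not the study's materials.

```python
from collections import defaultdict


def looking_time_proportions(fixations):
    """fixations: iterable of (region_label, duration_ms) pairs.
    Returns the proportion of total looking time spent on each region."""
    totals = defaultdict(float)
    for region, duration in fixations:
        totals[region] += duration
    grand_total = sum(totals.values())
    return {region: d / grand_total for region, d in totals.items()} if grand_total else {}


# Hypothetical dwell times on the A, B, C, and D terms of an A:B::C:D problem.
print(looking_time_proportions([("A", 220), ("B", 180), ("C", 340), ("D", 260)]))
```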
2020
To successfully learn using open Internet resources, students must be able to critically search, evaluate and select online information, and verify sources. Defined as critical online reasoning (COR), this construct is operationalized on two levels in our study: (1) the student level using the newly developed Critical Online Reasoning Assessment (CORA), and (2) the online information processing level using event log data, including gaze durations and fixations. The written responses of 32 students for one CORA task were scored by three independent raters. The resulting score was operationalized as “task performance,” whereas the gaze fixations and durations were defined as indicators of “pr…
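As a concrete, if simplified, illustration of how gaze-based process indicators can be related to task performance in a study of this kind, the sketch below correlates total fixation duration with task scores; the numbers are made-up placeholders, not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-student values; the actual study scored 32 students' responses.
total_fixation_duration_s = np.array([42.1, 55.3, 61.0, 38.7, 49.9, 58.2])
task_performance_score = np.array([2.0, 3.5, 3.0, 1.5, 2.5, 3.0])

r, p = pearsonr(total_fixation_duration_s, task_performance_score)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```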
How Quickly Can We Predict Users’ Ratings on Aesthetic Evaluations of Websites? Employing Machine Learning on Eye-Tracking Data
2020
This study examines how quickly we can predict users’ ratings of visual aesthetics in terms of simplicity, diversity, colorfulness, and craftsmanship. To predict users’ ratings, we first capture gaze behavior while participants look at websites of high, neutral, and low visual appeal, followed by a survey on users’ perceptions of the visual aesthetics of the same websites. We conduct an experiment with 23 users experienced in online shopping, capture their gaze behavior, and, employing machine learning, examine how quickly we can accurately predict their ratings. The findings show that after 25 s we can predict ratings with an error rate ranging from 9% to 11%, depending on which facet of visual ae…
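To make the time-windowed prediction idea concrete, the following is a minimal sketch of a gaze-features-to-rating pipeline evaluated over growing time windows. It is not the paper's pipeline: the features, window sizes, model choice, and the randomly generated data are all illustrative assumptions, so only the structure of the analysis is meaningful.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_users = 23
ratings = rng.uniform(1, 7, n_users)  # hypothetical aesthetics ratings, one per user

for window_s in (5, 15, 25):
    # Hypothetical per-user gaze features aggregated over the first `window_s` seconds,
    # e.g. fixation count, mean fixation duration, mean saccade amplitude (random here).
    X = rng.normal(size=(n_users, 3))
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    neg_mae = cross_val_score(model, X, ratings, cv=5, scoring="neg_mean_absolute_error")
    print(f"{window_s:>2}s window: cross-validated MAE = {-neg_mae.mean():.2f}")
```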
Evaluation through neuroscience of brand choice at the point of sale. An integrated analysis of eye tracking (ET) and the route…
2019
The subject of this doctoral thesis is the modeling of a brand-choice pattern in a virtual environment that recreates a supermarket, integrating the analysis of data collected through eye tracking, tracking of the spatial route through the store (Human Behaviour Tracking: HBT), and a questionnaire. The objectives are the following: (1) to understand the pattern of visual and spatial behavior in brand choice in a virtual environment; (2) to verify that brand choice at a point of sale follows a sequential process composed of three phases: (a) an orientation phase, (b) an evaluation phase, and (c) a verification phase; and (3) to identify the determinants that trigger the choice of…