Search results for "Computer vision"
Showing 10 of 2,353 documents
Path Following in Non-Visual Conditions.
2018
Path-following tasks have been investigated mostly under visual conditions, that is, when subjects are able to see both the path and the tool, or limb, used for navigation. Moreover, only basic path shapes are usually adopted. In the present experiment, participants had to rely exclusively on continuous, non-speech, ecological auditory and vibrotactile cues to follow a path on a flat surface. Two different, asymmetric path shapes were tested. Participants navigated by moving their index finger over a surface that sensed position and force. Results show that the different non-visual feedback modes did not affect the task's accuracy, yet they affected its speed, with vibrotactile feedback causin…
Pointing to a target from an upright position in human: tuning of postural responses when there is target uncertainty
2000
Human subjects performed, from a standing position, rapid hand pointing movements to visual targets located within or beyond the prehension space. To examine the interaction between posture and the goal-directed movement, we introduced a visual double-step perturbation requiring a reprogramming of the hand movement. Trials directed towards the same spatial goal but differentiated only by the likelihood of a visual double-step were compared. Hand kinematics were not affected by the uncertainty of the visual perturbation; an increased trunk bending, however, was observed. This suggests that uncertainty constraints are integrated in a predictive manner for the optimal coordina…
The remapping of time by active tool-use
2015
Multiple, action-based space representations are each based on the extent to which action is possible toward a specific sector of space, such as near/reachable and far/unreachable. Studies on tool-use revealed how the boundaries between these representations are dynamic. Space is not only multidimensional and dynamic, but it is also known to interact with other dimensions of magnitude, such as time. However, whether time operates on similar action-driven multiple representations and whether it can be modulated by tool-use remains unknown. To address these issues, healthy participants performed a time bisection task in two spatial positions (near and far space) before and after an active …
Influence of stimulus color on the control of reaching-grasping movements.
2001
This kinematic study aimed to determine whether color is a stimulus property involved in the control of reaching-grasping movements. Subjects reached and grasped a target-object located either to the right or to the left of the subject's midline. A distractor, placed along the subject's midline, could be randomly presented. The colors, i.e., both chromaticity (red and green stimuli were presented) and lightness, of the target and distractor were varied in experiment 1. Only stimulus lightness and only stimulus chromaticity were varied in experiments 2 and 3, respectively. In experiment 4, subjects matched with their thumb and index finger the size of the target-stimuli presented in experime…
Planning an action.
1997
The motor control of a sequence of two motor acts forming an action was studied in the present experiment. The two analysed motor acts were reaching-grasping an object (first target) and placing it on a second target of the same shape and size (experiment 1). The aim was to determine whether extrinsic properties of the second target (i.e. target distance) could selectively influence the kinematics of reaching and grasping. Distance, position and size of both targets were randomly varied across the experimental session. The kinematics of the initial phase of the first motor act, that is, velocity of reaching and hand shaping of grasping, were influenced by distance of the second target. No k…
Color and lightness constancy in different perceptual tasks
1998
Color and lightness constancy with respect to changing illumination was studied with three different perceptual tasks: ranking of colored papers according to (1) their lightness and (2) their chromatic similarity in photopic, mesopic, and scotopic states of adaptation, and (3) recognition of remembered colored papers after changes of illumination in photopic vision. Constancy was found in the second task only. Excitations of light receptors and luminance channels were computed to simulate the empirical rank orders. Results of the first task can be predicted with the hypothesis that luminance channels are activated if lightness is asked for. Sequences arranged with respect to chromatic …
Effect of luminance on photopic visual acuity in the presence of laser speckle
1988
Visual acuity in coherent and incoherent light has been determined by using square-wave gratings of 100% contrast. Luminance was varied from 3 to 400 cd/m2. Coherent illumination resulted in a 40% loss of visual acuity. This is probably due to the masking effect of coherent spatial noise (speckle). However, the most interesting finding is the change in shape of the photopic visual-acuity-luminance function. With coherent illumination, the function is vertically displaced and of a different gradient. An increase in luminance produces a decrease in visual acuity. This indicates that the masking effect of the speckle is dependent on luminance. Two observers were used, and similar results were …
Haptic information differentially interferes with visual analysis in reaching-grasping control and in perceptual processes.
1998
We used an interference paradigm in order to study integration between haptic and visual information in motor control and in perceptual analysis. Subjects either reached and grasped a visually presented sphere or matched its size with their left hand while manipulating with their right hand another sphere whose size could be smaller or greater. In four experiments haptic analysis of the manipulated sphere could be either automatically incorporated with or explicitly dissociated from visual analysis. In a fifth experiment reaching-grasping and matching were executed with the right hand, whereas manipulation was executed with the left hand. Manipulation with the right hand influenced finger s…
Does matching of internal and external facial features depend on orientation and viewpoint?
2009
Although it is recognized that external (hair, head and face outline, ears) and internal (eyes, eyebrows, nose, mouth) features contribute differently to face recognition, it is unclear whether both feature classes predominantly stimulate different sensory pathways. We employed a sequential speed-matching task to study face perception with internal and external features in the context of intact faces, and at two levels of contextual congruency. Both internal and external features were matched faster and more accurately in the context of totally congruent/incongruent facial stimuli compared to just featurally congruent/incongruent faces. Matching of totally congruent/incongruent faces was no…
Visual inference of arm movement is constrained by motor representations
2015
Several studies support the idea that motion inference is strongly motor dependent. In the present study, we address the role of biomechanical constraints in motion prediction and how this implicit knowledge can interfere in a spatial prediction task. Right-handed (RHS) and left-handed subjects (LHS) had to estimate the final position of a horizontal arm movement in which the final part of the trajectory was hidden. Our study highlighted a direction effect: end-point prediction accuracy was better when inferring the final position of horizontal motions directed toward the midline of the human body. This finding suggests that the spatial prediction of end point is mapped ont…