
RESEARCH PRODUCT

Exploiting Visual Saliency Algorithms for Object-Based Attention: A New Color and Scale-Based Approach

Edoardo Ardizzone, Francesco Gugliuzza, Alessandro Bruno

subject

Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Computer science; Computer vision; Artificial intelligence; Pattern recognition; Visual saliency; Object-based attention; SIFT (scale-invariant feature transform); Fixation maps; Fixation (visual); Eye tracking; Eye movement; Perception; Experimental data; Dataset

description

Visual Saliency aims to detect the most important regions of an image from a perceptual point of view. More precisely, the goal of Visual Saliency is to build a Saliency Map revealing the salient subset of a given image by analysing bottom-up and top-down factors of Visual Attention. In this paper we propose a new method for saliency detection based on colour and scale analysis, extending our previous work based on SIFT spatial density inspection. We conducted several experiments to study the relationships between saliency methods and object-based attention processes, and we collected experimental data by tracking the eye movements of thirty viewers during the first three seconds of observation of several images. The dataset consists of images with an object in the foreground on a homogeneous background. We are interested in studying the performance of our saliency method with respect to the real fixation maps collected during the experiments. We compared our method with several state-of-the-art methods, with very encouraging results.
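As an illustration only (not the authors' implementation), the sketch below shows the general flavour of the SIFT-spatial-density baseline that the paper extends, and a simple Pearson-correlation comparison between a saliency map and a continuous fixation map. Function names, the smoothing parameter sigma, and the choice of OpenCV are assumptions made for this example.

```python
import cv2
import numpy as np

def sift_density_saliency(image_bgr, sigma=25):
    """Illustrative saliency map from the spatial density of SIFT keypoints:
    accumulate keypoint locations and smooth them with a Gaussian kernel."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    keypoints = cv2.SIFT_create().detect(gray, None)
    density = np.zeros(gray.shape, dtype=np.float32)
    for kp in keypoints:
        x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
        density[y, x] += 1.0
    saliency = cv2.GaussianBlur(density, (0, 0), sigmaX=sigma)
    if saliency.max() > 0:
        saliency /= saliency.max()  # normalise to [0, 1]
    return saliency

def correlation_coefficient(saliency, fixation_map):
    """Pearson correlation (CC) between a saliency map and a real
    fixation map of the same size, a common saliency evaluation metric."""
    s = (saliency - saliency.mean()) / (saliency.std() + 1e-8)
    f = (fixation_map - fixation_map.mean()) / (fixation_map.std() + 1e-8)
    return float((s * f).mean())
```

Usage would amount to computing the saliency map for each dataset image and scoring it against the eye-tracking-derived fixation map, e.g. `correlation_coefficient(sift_density_saliency(img), fix_map)`; the paper's actual method additionally incorporates colour and scale analysis.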

10.1007/978-3-319-68548-9_18
http://hdl.handle.net/10447/336800