Contrast-Based Visual Saliency Estimation

Project Members

Federico Perazzi (Disney Research Zurich)
Yael Pritch (Disney Research Zurich)
Alexander Sorkine-Hornung (Disney Research Zurich)


Saliency estimation has become a valuable tool in image processing. Yet, existing approaches exhibit considerable variation in methodology, and it is often difficult to attribute improvements in result quality to specific algorithm properties. In this work, we reconsider some of the design choices of previous methods and propose a conceptually clear and intuitive algorithm for contrast-based saliency estimation.

Our algorithm consists of four basic steps. First, our method decomposes a given image into compact, perceptually homogeneous elements that abstract away unnecessary detail. Based on this abstraction, we compute two measures of contrast that rate the uniqueness and the spatial distribution of these elements. From the element contrast we then derive a saliency measure that produces a pixel-accurate saliency map, which uniformly covers the objects of interest and consistently separates foreground from background. We show that the complete contrast and saliency estimation can be formulated in a unified way using high-dimensional Gaussian filters. This result contributes to the conceptual simplicity of our method and lends itself to a highly efficient implementation with linear complexity. In a detailed experimental evaluation, we analyze the contribution of each individual feature and show that our method outperformed all approaches that were state of the art at the time of publication.
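For illustration, the following is a minimal Python sketch of the two contrast measures under simplifying assumptions: the abstraction step uses off-the-shelf SLIC superpixels from scikit-image, the pairwise Gaussian-weighted sums are evaluated by brute force in O(N^2) rather than with the paper's linear-time high-dimensional Gaussian filtering, the final step simply assigns each pixel the saliency of its element instead of the paper's Gaussian-weighted pixel-level refinement, and the parameter values (sigma_p, sigma_c, k) are illustrative assumptions, not tuned values from the paper.

    # Sketch of contrast-based saliency from element uniqueness and
    # spatial distribution. Brute-force O(N^2) sums stand in for the
    # paper's linear-time Gaussian filtering; parameters are illustrative.
    import numpy as np
    from skimage.color import rgb2lab
    from skimage.segmentation import slic

    def element_contrast_saliency(image, n_segments=300,
                                  sigma_p=0.25, sigma_c=20.0, k=6.0):
        lab = rgb2lab(image)  # CIELab color space
        labels = slic(image, n_segments=n_segments, compactness=10,
                      start_label=0)
        n = labels.max() + 1

        # Mean CIELab color and normalized mean position of each element.
        h, w = labels.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pos = np.stack([ys / h, xs / w], axis=-1)
        c = np.array([lab[labels == i].mean(axis=0) for i in range(n)])
        p = np.array([pos[labels == i].mean(axis=0) for i in range(n)])

        dc2 = ((c[:, None] - c[None]) ** 2).sum(-1)  # squared color dists
        dp2 = ((p[:, None] - p[None]) ** 2).sum(-1)  # squared position dists

        # Uniqueness: color contrast of each element against all others,
        # weighted by spatial proximity.
        wp = np.exp(-dp2 / (2 * sigma_p ** 2))
        wp /= wp.sum(axis=1, keepdims=True)
        U = (wp * dc2).sum(axis=1)

        # Spatial distribution: how widely an element's color is scattered
        # over the image (low variance means compact, likely foreground).
        wc = np.exp(-dc2 / (2 * sigma_c ** 2))
        wc /= wc.sum(axis=1, keepdims=True)
        mu = wc @ p  # color-similarity-weighted mean position
        D = (wc * ((p[None] - mu[:, None]) ** 2).sum(-1)).sum(axis=1)

        # Combine: unique and compactly distributed elements are salient.
        U = (U - U.min()) / (U.max() - U.min() + 1e-12)
        D = (D - D.min()) / (D.max() - D.min() + 1e-12)
        S = U * np.exp(-k * D)
        return S[labels]  # per-pixel saliency map

Note that each weighted sum above is a Gaussian blur in CIELab or position space; this is what the unified formulation in the paper exploits, evaluating the same quantities in linear time with high-dimensional Gaussian filters instead of explicit pairwise sums.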

Publications


Saliency Filters: Contrast Based Filtering for Salient Region Detection
October 24, 2012
IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2012
Paper File [pdf, 7.33 MB]

Copyright Notice

The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.