
Sparsity Potentials for Detecting Objects with the Hough Transform

Nima Razavi, Nima S. Alvar, Juergen Gall, Luc van Gool
British Machine Vision Conference
2012, in press


Hough transform based object detectors divide an object into a number of patches and combine them using a shape model. For efficient combination of the patches into the shape model, the individual patches are assumed to be independent of one another. Although this independence assumption is key to fast inference, it requires the individual patches to have high discriminative power in predicting the class and location of objects. In this paper, we argue that the sparsity of a patch's appearance in its neighborhood can be a very powerful measure for increasing the discriminative power of a local patch, and we incorporate it as a sparsity potential for object detection. Further, we show that this potential should depend on the appearance of the patch in order to adapt to the statistics of the neighborhood specific to the type of appearance (e.g., texture or structure) it represents. We have evaluated our method on challenging datasets, including PASCAL VOC 2007, and show that using the proposed sparsity potential results in a substantial improvement in detection accuracy.
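The core idea of the abstract — scoring how rare a patch's appearance is within its local neighborhood and using that score to weight the patch's contribution — can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the neighborhood radius, the use of normalized cross-correlation as the appearance similarity, and the similarity threshold are all hypothetical choices; the paper additionally makes the potential depend on the patch's appearance type, which this sketch omits.

```python
import numpy as np

def patch_sparsity(image, y, x, size=8, radius=16, threshold=0.9):
    """Sparsity of the patch at (y, x): the fraction of same-size patches in a
    local neighborhood whose appearance is dissimilar to it. A repeated
    (texture-like) patch scores low; a locally unique patch scores high."""
    patch = image[y:y + size, x:x + size].astype(float)
    patch = (patch - patch.mean()) / (patch.std() + 1e-8)  # zero-mean, unit-norm
    n_neighbors, n_dissimilar = 0, 0
    # Compare against non-overlapping patches on a grid around (y, x).
    for dy in range(-radius, radius + 1, size):
        for dx in range(-radius, radius + 1, size):
            if dy == 0 and dx == 0:
                continue
            ny, nx = y + dy, x + dx
            if ny < 0 or nx < 0 or ny + size > image.shape[0] or nx + size > image.shape[1]:
                continue
            nb = image[ny:ny + size, nx:nx + size].astype(float)
            nb = (nb - nb.mean()) / (nb.std() + 1e-8)
            # Normalized cross-correlation as the appearance similarity (assumption).
            ncc = (patch * nb).mean()
            n_neighbors += 1
            if ncc < threshold:
                n_dissimilar += 1
    return n_dissimilar / max(n_neighbors, 1)
```

In a Hough-based detector, a score like this could multiply the votes cast by the patch, so that patches repeated in their surroundings (which are ambiguous about object location) contribute less to the accumulator than locally distinctive ones.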

@inproceedings{razavi2012sparsity,
  author    = {Nima Razavi and Nima S. Alvar and Juergen Gall and Luc van Gool},
  title     = {Sparsity Potentials for Detecting Objects with the Hough Transform},
  booktitle = {British Machine Vision Conference},
  year      = {2012},
  keywords  = {Object Detection, Hough Transform, Self-Similarity, Sparse, Sparsity},
  note      = {in press}
}