This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Iterative Nearest Neighbors

Radu Timofte and Luc Van Gool
Pattern Recognition
Vol. 48, No. 1, pp. 60–72, January 2015


Representing data as a linear combination of a set of selected known samples is of interest for various machine learning applications such as dimensionality reduction or classification. k-Nearest Neighbors (kNN) and its variants are still among the best-known and most often used techniques. Some popular richer representations are Sparse Representation (SR), based on solving an l1-regularized least squares formulation, Collaborative Representation (CR), based on l2-regularized least squares, and Locally Linear Embedding (LLE), based on an l1-constrained least squares problem. We propose a novel sparse representation, the Iterative Nearest Neighbors (INN). It combines the power of SR and LLE with the computational simplicity of kNN. We empirically validate our representation in terms of sparse support signal recovery and compare with Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP), two other iterative methods. We also test our method in terms of dimensionality reduction and classification, using standard benchmarks for faces (AR), traffic signs (GTSRB), and objects (PASCAL VOC 2007). INN compares favorably to NN, MP, and OMP, and performs on par with CR and SR, while being orders of magnitude faster than the latter. On the downside, INN does not scale well with higher dimensionalities of the data.
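For context, the Matching Pursuit baseline that the abstract compares against can be sketched as follows. This is a minimal, generic MP implementation, not the authors' INN algorithm (whose iteration and weighting scheme are defined in the paper itself); the dictionary `D` and query `y` below are placeholders, and columns of `D` are assumed to be unit-norm.

```python
import numpy as np

def matching_pursuit(D, y, n_iters=10):
    """Greedy Matching Pursuit: approximate y as a sparse linear
    combination of the columns (atoms) of D, assumed unit-norm."""
    r = y.copy()                      # residual starts at the query
    coef = np.zeros(D.shape[1])
    for _ in range(n_iters):
        corr = D.T @ r                # correlation of residual with each atom
        j = np.argmax(np.abs(corr))   # pick the best-matching atom
        coef[j] += corr[j]            # accumulate its coefficient
        r = r - corr[j] * D[:, j]     # remove its contribution from the residual
    return coef, r                    # invariant: y == D @ coef + r
```

OMP differs only in that, after selecting each atom, it re-solves the least squares problem over all atoms chosen so far; INN instead drives the iteration with nearest-neighbor lookups, which is what yields its kNN-like computational cost.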

@article{Timofte2015INN,
  author  = {Radu Timofte and Luc Van Gool},
  title   = {Iterative Nearest Neighbors},
  journal = {Pattern Recognition},
  year    = {2015},
  month   = {January},
  volume  = {48},
  number  = {1},
  pages   = {60--72}
}