This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.


Robust Adaptive Embedded Label Propagation With Weight Learning for Inductive Classification

Z. Zhang, F. Li, L. Jia, J. Qin, L. Zhang and S. Yan
IEEE Transactions on Neural Networks and Learning Systems
2017, in press


We propose a robust inductive semi-supervised label prediction model over the embedded representation, termed adaptive embedded label propagation with weight learning (AELP-WL), for classification. AELP-WL offers several properties. First, it seamlessly integrates robust adaptive embedded label propagation and adaptive weight learning into a unified framework. By jointly minimizing the reconstruction errors over the embedded features and the embedded soft labels, AELP-WL explicitly ensures that the learned weights are jointly optimal for representation and classification, which differs from most existing label propagation (LP) models that learn the weights separately in an independent step before label prediction. Second, existing models usually precalculate the weights over the original samples, which may contain unfavorable features and noise that degrade performance. To address this, our model adds a constraint that decomposes the original data into a sparse component, encoding noise-removed embedded sparse representations of the samples, and a sparse error part that fits the noise; the adaptive weights are then learned over the embedded sparse representations. Third, AELP-WL computes the projected soft labels by trading off manifold smoothness against label fitness errors over the adaptive weights and the embedded representations, which enhances the label estimation power. By including a regressive label approximation error in the joint minimization, which correlates sample features with the embedded soft labels, the out-of-sample problem is solved naturally. By jointly minimizing the reconstruction errors over features and embedded soft labels, the classification error, and the label approximation error, our model delivers state-of-the-art results.
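To make the building blocks concrete, the following is a minimal sketch of standard graph-based label propagation (the manifold-smoothness plus label-fitness trade-off) combined with a ridge-regression projection that gives an out-of-sample extension. This is not the AELP-WL objective itself: AELP-WL additionally learns the weights adaptively over embedded sparse representations, whereas this sketch fixes a Gaussian affinity on the raw features; all function and parameter names here are illustrative assumptions.

```python
import numpy as np

def lp_with_regression(X, y, n_labeled, alpha=0.99, gamma=1.0, sigma=1.0):
    """Generic transductive label propagation followed by a ridge-regression
    projection for out-of-sample prediction (illustrative sketch only)."""
    n, c = X.shape[0], int(y[:n_labeled].max()) + 1
    # Gaussian affinity over raw features; AELP-WL instead learns adaptive
    # weights over embedded, noise-removed sparse representations.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    Dinv = np.diag(1.0 / np.sqrt(W.sum(1) + 1e-12))
    S = Dinv @ W @ Dinv                      # symmetrically normalized affinity
    Y = np.zeros((n, c))
    Y[np.arange(n_labeled), y[:n_labeled]] = 1.0   # one-hot labeled part
    # Closed-form minimizer of  alpha * smoothness + (1 - alpha) * fitness:
    #   F = (1 - alpha) * (I - alpha * S)^{-1} Y
    F = np.linalg.solve(np.eye(n) - alpha * S, (1.0 - alpha) * Y)
    # Regress the soft labels on the features (plus a bias term) so unseen
    # samples can be classified without rebuilding the graph.
    Xb = np.hstack([X, np.ones((n, 1))])
    P = np.linalg.solve(Xb.T @ Xb + gamma * np.eye(Xb.shape[1]), Xb.T @ F)
    return F, P

def predict(P, X_new):
    """Classify unseen samples through the learned linear projection."""
    Xb = np.hstack([X_new, np.ones((X_new.shape[0], 1))])
    return (Xb @ P).argmax(1)
```

The regressive projection plays the same role as the label approximation term described above: soft labels are computed on the graph, and a linear map from features to those labels handles new samples inductively.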

Link to publisher's page
@article{Zhang2017AELPWL,
  author   = {Z. Zhang and F. Li and L. Jia and J. Qin and L. Zhang and S. Yan},
  title    = {Robust Adaptive Embedded Label Propagation With Weight Learning for Inductive Classification},
  journal  = {IEEE Transactions on Neural Networks and Learning Systems},
  year     = {2017},
  keywords = {Robustness, Adaptation models, Data models, Manifolds, Feature extraction, Sparse matrices, Predictive models},
  note     = {in press}
}