Locality Preserving Robust Regression for Jointly Sparse Subspace Learning

Ning Liu, Zhihui Lai, Xuechen Li, Yudong Chen, Dongmei Mo, Heng Kong, Linlin Shen

Research output: Journal Publication › Article › peer-review

11 Citations (Scopus)


As an extended version of conventional ridge regression, L_{2,1}-norm-based ridge regression methods have been widely used in subspace learning, since they are more robust than Frobenius-norm-based regression while also guaranteeing joint sparsity. However, conventional L_{2,1}-norm regression methods suffer from the small-class problem and ignore local geometric structures, which degrades their performance. To address these problems, we propose a novel regression method called Locality Preserving Robust Regression (LPRR). In addition to using the L_{2,1}-norm for jointly sparse regression, we adopt a capped L_{2}-norm in the loss function to further enhance the robustness of the proposed algorithm. Moreover, to exploit local structure information, we integrate the property of locality preservation into our model, since it is of great importance in dimensionality reduction. The convergence analysis and computational complexity of the proposed iterative algorithm are presented. Experimental results on four datasets indicate that the proposed LPRR outperforms several well-known subspace learning methods in classification tasks.
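The two regularity notions named in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, only the two building blocks it mentions: the L_{2,1} norm (sum of row-wise L_2 norms, whose minimization zeroes out whole rows of the projection and yields joint sparsity) and the capped L_2 loss (per-sample residual norms clipped at a threshold, so outliers contribute a bounded amount). All variable names and shapes below are illustrative assumptions.

```python
import numpy as np

def l21_norm(W):
    """L_{2,1} norm: sum of the L_2 norms of the rows of W.
    Penalizing it drives entire rows to zero, i.e. joint sparsity."""
    return np.sum(np.linalg.norm(W, axis=1))

def capped_l2_loss(X, Y, W, eps):
    """Capped L_2 loss: each sample's residual norm is clipped at eps,
    so a gross outlier contributes at most eps to the objective.
    X: (n, d) data, Y: (n, c) targets, W: (d, c) projection (hypothetical shapes)."""
    residuals = np.linalg.norm(X @ W - Y, axis=1)
    return np.sum(np.minimum(residuals, eps))

# Toy illustration: a projection with a single nonzero row is jointly sparse.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
W = np.zeros((3, 2))
W[0] = [1.0, -1.0]          # one active row -> L_{2,1} norm equals sqrt(2)
Y = X @ W                    # perfectly fit targets -> zero capped loss
print(l21_norm(W))           # sqrt(2) ~ 1.4142
print(capped_l2_loss(X, Y, W, 0.5))  # 0.0
```

The cap eps is what distinguishes this loss from a plain L_2 residual: beyond the threshold, increasing an outlier's error no longer changes the objective, which is the robustness property the abstract refers to.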

Original language: English
Article number: 9181612
Pages (from-to): 2274-2287
Number of pages: 14
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 6
Publication status: Published - Jun 2021
Externally published: Yes


Keywords

  • capped L_{2}-norm loss
  • Feature extraction
  • L_{2,1}-regularization
  • subspace learning

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering

