Abstract
As an extension of conventional ridge regression, L_{2,1}-norm based ridge regression learning methods have been widely used in subspace learning, since they are more robust than Frobenius-norm based regression while also guaranteeing joint sparsity. However, conventional L_{2,1}-norm regression methods suffer from the small-class problem and ignore local geometric structures, which degrades their performance. To address these problems, we propose a novel regression method called Locality Preserving Robust Regression (LPRR). In addition to using the L_{2,1}-norm for jointly sparse regression, we adopt a capped L_{2}-norm loss function to further enhance the robustness of the proposed algorithm. Moreover, to exploit local structure information, we integrate the property of locality preservation into our model, since it is of great importance in dimensionality reduction. The convergence analysis and computational complexity of the proposed iterative algorithm are presented. Experimental results on four datasets indicate that the proposed LPRR outperforms several well-known subspace learning methods in classification tasks.
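The two norms the abstract relies on can be illustrated concretely. Below is a minimal NumPy sketch (not the authors' implementation) of the L_{2,1} norm, whose minimization drives whole rows of a projection matrix to zero (joint sparsity), and of a capped L_{2} loss, which bounds each sample's contribution so outliers cannot dominate; the function names and the cap parameter `eps` are illustrative.

```python
import numpy as np

def l21_norm(W):
    """L_{2,1} norm: sum of the L2 norms of the rows of W.
    Minimizing it as a regularizer encourages entire rows to
    vanish, i.e. jointly sparse feature selection."""
    return np.sum(np.linalg.norm(W, axis=1))

def capped_l2_loss(R, eps):
    """Capped L2 loss on a residual matrix R: each sample's
    residual norm is clipped at eps, so a gross outlier adds
    at most eps to the total loss."""
    return np.sum(np.minimum(np.linalg.norm(R, axis=1), eps))

# Toy residuals: the second sample is an outlier (norm 5.0).
R = np.array([[0.1, 0.2],
              [3.0, 4.0]])
print(l21_norm(R))                  # sqrt(0.05) + 5.0
print(capped_l2_loss(R, eps=1.0))   # sqrt(0.05) + 1.0 -> outlier capped
```

Under the capped loss, the outlier's influence drops from 5.0 to the cap 1.0, which is the robustness mechanism the abstract attributes to the capped L_{2}-norm loss.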
Original language | English |
---|---|
Article number | 9181612 |
Pages (from-to) | 2274-2287 |
Number of pages | 14 |
Journal | IEEE Transactions on Circuits and Systems for Video Technology |
Volume | 31 |
Issue number | 6 |
DOIs | |
Publication status | Published - Jun 2021 |
Externally published | Yes |
Keywords
- Feature extraction
- L_{2,1}-regularization
- capped L_{2}-norm loss
- subspace learning
ASJC Scopus subject areas
- Media Technology
- Electrical and Electronic Engineering