Unsupervised discriminant canonical correlation analysis for feature fusion

Sheng Wang, Xingjian Gu, Jianfeng Lu, Jing Yu Yang, Ruili Wang, Jian Yang

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

8 Citations (Scopus)

Abstract

Canonical correlation analysis (CCA) has been widely applied to information fusion. However, it considers only the correlated information of the paired data and ignores the correlated information between samples in the same class. Moreover, although class information is useful for CCA, class labels are often unavailable in real applications, which makes it difficult to exploit the correlated information between same-class samples. To utilize this information, we propose a method named Unsupervised Discriminant Canonical Correlation Analysis (UDCCA). In UDCCA, the class memberships and the mappings are computed iteratively, alternating between normalized spectral clustering and generalized eigenvalue methods. Experimental results on the MFD and ORL datasets show that UDCCA outperforms traditional CCA and its variants in most situations.
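As context for the abstract, the CCA core step that UDCCA alternates with spectral clustering can be sketched as follows. This is a minimal sketch of plain CCA only, not the paper's full algorithm: the function name, the regularization parameter, and the whitening-plus-SVD formulation (which is equivalent to the generalized eigenvalue formulation mentioned in the abstract) are all illustrative assumptions.

```python
import numpy as np

def cca_directions(X, Y, reg=1e-6):
    """Return the first pair of canonical directions (wx, wy) and their
    canonical correlation, for zero-mean paired samples X (n, dx), Y (n, dy).

    Illustrative sketch: solves plain CCA by whitening the cross-covariance
    and taking its leading singular pair, which is equivalent to the
    generalized eigenvalue formulation. The full UDCCA additionally
    alternates this step with normalized spectral clustering.
    """
    n = X.shape[0]
    # Sample (cross-)covariance matrices, with a small ridge term so the
    # inverse square roots below are well defined.
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition (S is SPD).
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    # Leading singular pair of the whitened cross-covariance gives the
    # top canonical correlation and the canonical directions.
    U, s, Vt = np.linalg.svd(Wx @ Sxy @ Wy)
    wx = Wx @ U[:, 0]
    wy = Wy @ Vt[0]
    return wx, wy, s[0]
```

On paired views that share a latent signal, the projected features `X @ wx` and `Y @ wy` correlate at the returned canonical correlation; in a fusion setting these projections would then be concatenated or summed.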

Original language: English
Title of host publication: Proceedings - International Conference on Pattern Recognition
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1550-1555
Number of pages: 6
ISBN (Electronic): 9781479952083
Publication status: Published - 4 Dec 2014
Externally published: Yes
Event: 22nd International Conference on Pattern Recognition, ICPR 2014 - Stockholm, Sweden
Duration: 24 Aug 2014 - 28 Aug 2014

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 22nd International Conference on Pattern Recognition, ICPR 2014
Country/Territory: Sweden
City: Stockholm
Period: 24/08/14 - 28/08/14

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
