Abstract
Many classifier methods cannot identify and remove redundant observations and irrelevant attributes from data, so they often produce classifications that are inconsistent with the actual outputs. Introducing single- or multi-kernel functions into classifier models helps solve non-linearly separable problems, but it reduces predictive interpretability. In this paper, we propose a novel two-stage sparse multi-kernel optimization classifier (TSMOC) that combines the support vector classifier (SVC) with multiple kernel learning (MKL) to address these issues. Using our defined row and column multi-kernel matrices, the proposed method employs iterative updates to compute ℓ0-norm approximations of the coefficients and weights, extracting the observations and attributes important to prediction. In experiments on thirteen real-world datasets, TSMOC generally outperforms seven other classifiers: SVC, ℓ1-norm SVC, least-squares SVC, the LASSO classifier, SimpleMKL, EasyMKL, and DeepMKL. Besides achieving the best classification accuracy, TSMOC extracts the smallest number of observations and attributes important to prediction and provides explainable predictions with their contribution percentages.
| Original language | English |
| --- | --- |
| Article number | 120635 |
| Journal | Expert Systems with Applications |
| Volume | 231 |
| DOIs | |
| Publication status | Published - 30 Nov 2023 |
Keywords
- Classification
- Explainable prediction
- Multiple kernel learning
- Sparse learning
- Support vector classifier
ASJC Scopus subject areas
- General Engineering
- Computer Science Applications
- Artificial Intelligence