Multi-Class Support Vector Machine with Maximizing Minimum Margin

Authors

  • Feiping Nie, Northwestern Polytechnical University
  • Zhezheng Hao, Northwestern Polytechnical University
  • Rong Wang, Northwestern Polytechnical University

DOI:

https://doi.org/10.1609/aaai.v38i13.29361

Keywords:

ML: Classification and Regression, ML: Multi-class/Multi-label Learning & Extreme Classification

Abstract

Support Vector Machine (SVM) stands out as a prominent machine learning technique widely applied in practical pattern recognition tasks. It achieves binary classification by maximizing the "margin", the minimum distance between instances and the decision boundary. Although many efforts have been devoted to extending SVM to the multi-class case through strategies such as one-versus-one and one-versus-rest, satisfactory solutions remain to be developed. In this paper, we propose a novel method for multi-class SVM that incorporates pairwise class losses and maximizes the minimum margin. Building on this concept, we adopt a new formulation that gives multi-class SVM greater flexibility. Furthermore, we analyze the connections between the proposed method and several existing forms of multi-class SVM. The proposed regularizer, rooted in the concept of "margin", can serve as a seamless enhancement of the softmax loss in deep learning, guiding the learning of network parameters. Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-class classification methods. The complete version is available at https://arxiv.org/pdf/2312.06578.pdf. Code is available at https://github.com/zz-haooo/M3SVM.
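To make the min-margin idea concrete, the following is a minimal sketch, not the paper's exact formulation: for a linear multi-class model with class weight vectors w_1, …, w_K, the decision boundary between classes j and k is (w_j − w_k)ᵀx = 0, so the pairwise margin scales with 1/‖w_j − w_k‖ and the minimum margin corresponds to the largest pairwise weight difference. One smooth way to penalize that largest difference is a p-norm surrogate over all pairs. The function names and the choice of surrogate here are illustrative assumptions.

```python
import numpy as np


def pairwise_sq_dists(W):
    """Squared Euclidean distances between all pairs of class weight rows.

    W has shape (K, d): one weight vector per class (assumed linear model).
    """
    K = W.shape[0]
    return np.array([np.sum((W[j] - W[k]) ** 2)
                     for j in range(K) for k in range(j + 1, K)])


def min_margin_regularizer(W, p=8):
    """Smooth surrogate for the largest pairwise squared distance.

    The p-norm over all pairwise distances upper-bounds their maximum and
    approaches it as p grows; minimizing this surrogate therefore pushes
    the largest ||w_j - w_k|| down, which enlarges the minimum pairwise
    margin. (Illustrative choice, not the paper's exact regularizer.)
    """
    d = pairwise_sq_dists(W)
    return np.sum(d ** p) ** (1.0 / p)


# Toy usage: three class weight vectors in 2-D.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, 0.0]])
print(min_margin_regularizer(W))  # slightly above the max pairwise distance, 4
```

Such a term can be added to a softmax cross-entropy objective on the final-layer weights of a network, which is the sense in which the abstract describes it as an enhancement over softmax.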

Published

2024-03-24

How to Cite

Nie, F., Hao, Z., & Wang, R. (2024). Multi-Class Support Vector Machine with Maximizing Minimum Margin. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14466-14473. https://doi.org/10.1609/aaai.v38i13.29361

Section

AAAI Technical Track on Machine Learning IV