Abstract
Support Vector Machine (SVM) was originally proposed as a binary classification model and has achieved great success in many applications. In practice, however, many problems involve more than two classes, so it is natural to extend SVM to a multi-class classifier. Many works have been proposed to construct a multi-class classifier based on binary SVM, such as the one-versus-rest strategy (OvsR), the one-versus-one strategy (OvsO), and Weston's multi-class SVM. The first two split the multi-class problem into multiple binary classification subproblems and therefore require training multiple binary classifiers. Weston's multi-class SVM is formed by enforcing risk constraints and imposing a specific regularization, such as the Frobenius norm; it is not derived by maximizing the margin between the hyperplane and the training data, which is the original motivation of SVM. In this paper, we propose a multi-class SVM model from the perspective of maximizing the margin between training points and the hyperplane, and analyze the relation between our model and other related methods. Experiments show that our model achieves better or comparable results compared with other related methods.
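For a K-class problem, the two decomposition strategies mentioned above differ in how many binary subproblems they create: OvsR trains K classifiers (each class against all the rest), while OvsO trains K(K-1)/2 pairwise classifiers. The following is a minimal sketch of the decomposition step alone; the helper names are illustrative, not from the paper, and any binary SVM could be plugged in to solve each subproblem.

```python
from itertools import combinations

def ovsr_subproblems(classes):
    # One binary subproblem per class: that class (positive) vs. the rest (negative).
    return [(c, [o for o in classes if o != c]) for c in classes]

def ovso_subproblems(classes):
    # One binary subproblem per unordered pair of classes.
    return list(combinations(classes, 2))

classes = ["cat", "dog", "bird", "fish"]
print(len(ovsr_subproblems(classes)))  # 4 subproblems (K)
print(len(ovso_subproblems(classes)))  # 6 subproblems (K(K-1)/2)
```

At prediction time, OvsR typically picks the class whose classifier gives the largest decision value, while OvsO takes a majority vote over the pairwise classifiers.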