Abstract
In pattern classification, it is necessary to efficiently handle two-way data (feature matrices) while preserving their two-way structure, such as spatio-temporal relationships. A classifier for feature matrices is generally formulated by multiple bilinear forms, which together constitute a matrix. The rank of that matrix, i.e., the number of bilinear forms, should be low from the viewpoint of both generalization performance and computational cost. For that purpose, we propose a low-rank bilinear classifier based on efficient optimization. In the proposed method, the classifier is optimized by minimizing the trace norm of the classifier (matrix), which contributes to rank reduction for an efficient classifier without any hard constraint on the rank. We formulate the optimization problem in a tractable convex form and propose a procedure to solve it efficiently with a globally optimal solution. In addition, by considering a kernel-based extension of the bilinear method, we derive a novel multiple kernel learning (MKL) method, called heterogeneous MKL. The method combines both inter-kernels between heterogeneous types of features and ordinary kernels within homogeneous features into a new discriminative kernel in a unified manner using the bilinear model. In experiments on various classification problems using feature arrays, co-occurrence feature matrices, and multiple kernels, the proposed method exhibits favorable performance compared to other methods.
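As a minimal illustrative sketch (not the paper's implementation), the core objects in the abstract can be made concrete in NumPy: a bilinear classifier scores a feature matrix X via f(X) = trace(W^T X), the rank of W equals the number of rank-one bilinear forms u_r v_r^T composing it, and the trace (nuclear) norm penalized during optimization is the sum of W's singular values. All variable names and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: build a rank-2 classifier matrix W
# from two bilinear forms (u_r, v_r), so rank(W) <= 2.
d1, d2, rank = 5, 4, 2
U = rng.standard_normal((d1, rank))
V = rng.standard_normal((d2, rank))
W = U @ V.T

# One two-way feature sample (e.g., a spatio-temporal feature matrix).
X = rng.standard_normal((d1, d2))

# Bilinear classifier score: trace(W^T X) equals the sum of the
# individual bilinear form responses u_r^T X v_r.
score = np.trace(W.T @ X)
score_forms = sum(U[:, r] @ X @ V[:, r] for r in range(rank))
assert np.isclose(score, score_forms)

# Trace (nuclear) norm = sum of singular values of W; minimizing it
# is the convex surrogate that encourages low rank without a hard
# rank constraint, as described in the abstract.
trace_norm = np.linalg.svd(W, compute_uv=False).sum()
print(score, trace_norm)
```

The equivalence checked by the assertion is exactly why penalizing the rank of W controls the number of bilinear forms: trace(W^T X) = Σ_r u_r^T X v_r whenever W = Σ_r u_r v_r^T.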