Abstract
The success of many visual recognition tasks largely depends on a good similarity measure, and distance metric
learning plays an important role in this regard. Meanwhile, the Symmetric Positive Definite (SPD) matrix is receiving increasing attention for feature representation in multiple computer vision applications. However, distance metric learning on SPD matrices has not been sufficiently researched. A few existing works approach this by learning either a d² × p or a d × k transformation matrix for d × d SPD matrices. Different from these methods, this paper proposes a new member of the family of distance metric learning methods for
SPD matrices. It learns only d parameters to adjust the
eigenvalues of the SPD matrices through an efficient optimisation scheme. Also, it is shown that the proposed method
can be interpreted as learning a sample-specific transformation matrix, instead of the fixed transformation matrix
learned for all the samples in the existing works. The optimised d parameters can be used to “massage” the SPD
matrices for better discrimination while still keeping them
in the original space. From this perspective, the proposed
method complements, rather than competes with, the existing linear-transformation-based methods, as the latter can
always be applied to the output of the former to perform
further distance metric learning. The proposed method
has been tested on multiple SPD-based visual representation data sets used in the literature, and the results demonstrate its interesting properties and attractive performance.
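To make the eigenvalue-adjustment idea concrete, the following is a minimal sketch, assuming the d learned parameters act as a positive per-eigenvalue scaling of an SPD matrix's eigen-decomposition; the exact parameterisation and the optimisation scheme are defined in the method section, and the function name `adjust_spd` is hypothetical.

```python
import numpy as np

def adjust_spd(X, w):
    """Return U diag(w * lam) U^T for an SPD matrix X = U diag(lam) U^T.

    Illustrative only: w is a d-dimensional positive parameter vector that
    rescales the eigenvalues, so the result remains a d x d SPD matrix in
    the original space.
    """
    lam, U = np.linalg.eigh(X)               # eigen-decomposition of the SPD matrix
    lam_adj = np.clip(w, 1e-8, None) * lam   # keep adjusted eigenvalues strictly positive
    return (U * lam_adj) @ U.T               # reconstruct U diag(lam_adj) U^T

# Toy usage: a random SPD matrix and a random positive parameter vector
d = 5
A = np.random.randn(d, d)
X = A @ A.T + d * np.eye(d)                  # SPD by construction
w = np.exp(np.random.randn(d))               # d parameters (positive)
X_adj = adjust_spd(X, w)
print(np.allclose(X_adj, X_adj.T), np.all(np.linalg.eigvalsh(X_adj) > 0))
```

Because the adjusted matrix is still SPD, any existing linear-transformation-based metric learning method can be applied to its output, which is the sense in which the two approaches are complementary.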