Interaction-aware Spatio-temporal Pyramid
Attention Networks for Action Classification
Abstract. Local features at neighboring spatial positions in feature
maps are highly correlated since their receptive fields often overlap. Self-attention usually computes each local feature's weight score from a weighted sum (or another function) of that feature's internal elements, which ignores interactions among local features. To address this, we propose an effective interaction-aware self-attention model inspired by PCA
to learn attention maps. Furthermore, since different layers in a deep network capture feature maps of different scales, we use these feature maps
to construct a spatial pyramid and then utilize multi-scale information
to obtain more accurate attention scores, which are used to weight the
local features at all spatial positions of the feature maps to compute attention maps. Moreover, our spatial pyramid attention is not restricted by the number of input feature maps, so it is easily extended to a spatio-temporal version. Finally, our model is embedded in general CNNs to
form end-to-end attention networks for action classification. Experimental results show that our method achieves state-of-the-art results on the UCF101, HMDB51, and untrimmed Charades datasets.
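The core operation described above, scoring every spatial position of a feature map and taking a weighted sum of the local features, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the score projection `score_w` stands in for the learned attention function, and all shapes are assumed for the example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def spatial_attention_pool(feat, score_w):
    """Attention-weighted pooling over spatial positions.

    feat:    (C, H, W) feature map; each spatial position holds a
             C-dimensional local feature.
    score_w: (C,) projection mapping a local feature to a scalar
             attention score (stands in for a learned scoring function).
    Returns a (C,) descriptor: the attention-weighted sum of all
    local features.
    """
    C, H, W = feat.shape
    flat = feat.reshape(C, H * W)        # local features, one per position
    weights = softmax(score_w @ flat)    # one normalized weight per position
    return flat @ weights                # weighted sum of local features

# Toy usage with random data.
rng = np.random.default_rng(0)
feat = rng.normal(size=(8, 4, 4))        # C=8 channels, 4x4 spatial grid
score_w = rng.normal(size=(8,))
out = spatial_attention_pool(feat, score_w)
print(out.shape)
```

In the spatial pyramid version described in the abstract, feature maps from several layers would each contribute scores before the weighted sum; here a single map is used for brevity.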