Abstract. A capsule is a group of neurons that represents different variants of a pattern in the network. The routing scheme ensures that only those higher-layer capsules which resemble their lower-layer counterparts are activated. However, the computational complexity becomes a bottleneck for scaling up to larger networks, since each lower capsule must be routed to every higher capsule. To resolve this limitation, we approximate the routing process with two branches: a master branch that collects primary information from its direct contact in the lower layer, and an aide branch that replenishes the master based on pattern variants encoded in other lower capsules. Compared with the previous iterative, unsupervised routing scheme, these two branches communicate in a fast, supervised, one-pass fashion. The complexity and runtime of the model are therefore reduced by a large margin.
Motivated by routing's goal of making higher capsules agree with lower ones, we extend the mechanism to compensate for the rapid loss of information across nearby layers. We devise a feedback agreement unit that sends higher capsules back as feedback; it can be regarded as an additional regularizer on the network. Feedback agreement is achieved by measuring the optimal transport divergence between two distributions (the lower and higher capsules). This add-on yields a consistent gain in both capsule and vanilla networks. Our proposed EncapNet performs favorably against previous state-of-the-art methods on CIFAR10/100, SVHN, and a subset of ImageNet.
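The optimal transport divergence between the two capsule distributions is commonly approximated with entropy-regularized (Sinkhorn) iterations. The abstract does not specify the algorithm, so the following is a minimal illustrative sketch only, assuming uniform weights over capsules and a squared-Euclidean cost; the function name and parameters are our own, not from the paper.

```python
import numpy as np

def sinkhorn_divergence(x, y, eps=0.5, n_iters=50):
    """Entropy-regularized OT cost between two point sets with uniform weights.

    x: (n, d) array of lower-capsule vectors (illustrative)
    y: (m, d) array of higher-capsule vectors (illustrative)
    """
    # Pairwise squared-Euclidean cost matrix C[i, j] = ||x_i - y_j||^2.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / eps)               # Gibbs kernel
    a = np.full(len(x), 1.0 / len(x))  # uniform source marginal
    b = np.full(len(y), 1.0 / len(y))  # uniform target marginal
    u = np.ones_like(a)
    for _ in range(n_iters):           # Sinkhorn fixed-point iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]    # approximate transport plan
    return (P * C).sum()               # transport cost <P, C>
```

Used as a regularizer, this scalar can simply be added to the task loss; it shrinks as the higher capsules move into agreement with the lower ones.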