Abstract. Efficient CNN designs such as ResNet and DenseNet were proposed to improve the accuracy-efficiency trade-off. They essentially increased connectivity, enabling efficient information flow across layers.
Inspired by these techniques, we propose to model connections between
filters of a CNN using graphs that are simultaneously sparse and well
connected. Sparsity yields efficiency, while good connectivity preserves the expressive power of the CNN. We use a well-studied class of
graphs from theoretical computer science, known as expander graphs,
that satisfies both properties. Modeling connections between filters in CNNs with expander graphs yields networks we call X-Nets. We
present two guarantees on the connectivity of X-Nets: each node influences every node in a layer within logarithmically many steps, and the number of paths
between two sets of nodes is proportional to the product of their sizes.
We also propose efficient training and inference algorithms, making it
possible to train deeper and wider X-Nets effectively.
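The connectivity pattern described above can be sketched in code. The snippet below is an illustrative construction, not the authors' implementation: it builds a sparse filter-connectivity mask as the union of random perfect matchings (a standard construction that produces an expander graph with high probability) and then empirically checks the first guarantee, i.e. that stacking a few such layers lets every output filter be influenced by every input filter. All names (`expander_mask`, `steps_to_full_influence`) and parameter choices are assumptions for illustration.

```python
import numpy as np

def expander_mask(n, degree, seed=0):
    """Sparse connectivity mask between n input and n output filters,
    built as the union of `degree` random perfect matchings -- a
    standard way to obtain an expander graph with high probability.
    Entry [i, j] is True if output filter i reads from input filter j."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    for _ in range(degree):
        perm = rng.permutation(n)        # one random matching
        mask[np.arange(n), perm] = True  # connect output i to input perm[i]
    return mask

def steps_to_full_influence(mask, max_layers=16):
    """Number of stacked layers using this mask after which every output
    is influenced by every input; None if not reached within max_layers."""
    reach = mask.astype(np.int64)
    for t in range(1, max_layers + 1):
        if reach.all():
            return t
        reach = (mask.astype(np.int64) @ reach) > 0
    return None

mask = expander_mask(n=64, degree=8)
print(mask.mean())                    # fraction of dense connections kept
print(steps_to_full_influence(mask))  # small: grows roughly like log(n)
```

With 64 filters and degree 8, the mask keeps at most 12.5% of the dense connections, yet full influence is reached after only a few layers, matching the logarithmic-steps guarantee stated in the abstract.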