clcNet: Improving the Efficiency of Convolutional Neural Network using Channel Local Convolutions
Abstract
Depthwise convolution and grouped convolution have
been successfully applied to improve the efficiency of convolutional neural networks (CNNs). We suggest that these models can be considered as special cases of a generalized convolution operation, named channel local convolution (CLC),
where an output channel is computed using a subset of the
input channels. This definition entails computational dependency relations between input and output channels, which
can be represented by a channel dependency graph (CDG).
By modifying the CDG of grouped convolution (GC), a new CLC
kernel named interlaced grouped convolution (IGC) is created. Stacking IGC and GC kernels results in a convolution
block (named CLC Block) for approximating regular convolution. Using the CDG as an analysis tool, we
derive rules for setting the meta-parameters of IGC and
GC, and a framework for minimizing the computational
cost.
cost. A new CNN model named clcNet is then constructed
using CLC blocks, which shows significantly higher computational efficiency and fewer parameters compared to stateof-the-art networks, when being tested using the ImageNet-
1K dataset
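
To make the block construction concrete, the following is a minimal sketch (not the authors' code) of a CLC block that stacks an IGC kernel and a GC kernel. The class name CLCBlock, the group counts g1 and g2, the 3x3/1x1 kernel sizes, and the realization of interlacing as a channel permutation before a standard grouped convolution are all illustrative assumptions.

import torch
import torch.nn as nn

class CLCBlock(nn.Module):
    """Sketch of a CLC block: interlaced grouped convolution (IGC)
    followed by a grouped convolution (GC). Assumes in_ch is divisible
    by g1 and out_ch is divisible by both g1 and g2."""
    def __init__(self, in_ch, out_ch, g1=4, g2=4):
        super().__init__()
        # Interlaced grouping: channel j of group k reads input channel
        # j * g1 + k, realized here by permuting the channels and then
        # applying a standard grouped convolution (an implementation
        # assumption, not necessarily the authors' approach).
        idx = torch.arange(in_ch).view(-1, g1).t().reshape(-1)
        self.register_buffer("interlace", idx)
        self.igc = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1,
                             groups=g1, bias=False)
        self.gc = nn.Conv2d(out_ch, out_ch, kernel_size=1,
                            groups=g2, bias=False)

    def forward(self, x):
        x = x[:, self.interlace]   # interlace the input channels
        x = self.igc(x)            # grouped 3x3 convolution (IGC)
        return self.gc(x)          # grouped 1x1 convolution (GC)

if __name__ == "__main__":
    block = CLCBlock(in_ch=64, out_ch=64)
    y = block(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])

Because the interlaced permutation spreads each group's inputs across all of the preceding layer's groups, stacking the IGC and GC kernels lets every output channel depend (directly or indirectly) on every input channel, which is what allows the block to approximate a regular convolution at a fraction of the cost.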