Abstract
Binary neural networks (BNNs) have been studied extensively because they run dramatically faster, with lower memory and power consumption, than floating-point networks, thanks to the efficiency of bit operations. However, contemporary BNNs, whose weights and activations are both single bits, suffer from severe accuracy degradation. To understand why,
we investigate the representation ability, speed, and bias/variance of BNNs through extensive experiments. We conclude that the error of BNNs is predominantly caused by intrinsic instability (at training time) and non-robustness (at both training and test time). Inspired by this investigation, we propose the
Binary Ensemble Neural Network (BENN), which leverages ensemble methods to improve the performance of BNNs at a limited efficiency cost. While ensemble techniques have been
widely believed to be only marginally helpful for strong classifiers such as deep neural networks, our analysis and experiments show that they are a natural fit for boosting BNNs. We find that our BENN, which is faster and more robust than state-of-the-art binary networks, can even surpass the accuracy of a full-precision floating-point network with the same architecture.
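
To make the two ingredients named above concrete, the following numpy sketch is our own illustration, not the paper's implementation; the helpers binarize, pack_bits, and xnor_popcount_dot are hypothetical. It computes a binary dot product with a single XNOR and a popcount, which is why bit operations make BNNs fast, and averages the outputs of several independent binary predictors, which is how an ensemble damps the variance of any single BNN.

import numpy as np

def binarize(x):
    # Sign binarization used by BNNs: map real values to {-1, +1}.
    return np.where(x >= 0, 1, -1).astype(np.int8)

def pack_bits(v):
    # Encode a {-1, +1} vector as an integer bit mask (+1 -> 1, -1 -> 0).
    return sum(1 << i for i, s in enumerate(v) if s == 1)

def xnor_popcount_dot(a_bits, b_bits, n):
    # Binary dot product: dot(a, b) = 2 * popcount(XNOR(a, b)) - n,
    # since XNOR counts the positions where the two sign vectors agree.
    matches = bin(~(a_bits ^ b_bits) & ((1 << n) - 1)).count("1")
    return 2 * matches - n

rng = np.random.default_rng(0)
n = 64
x_b = binarize(rng.standard_normal(n))
x_bits = pack_bits(x_b)

# A toy "ensemble" of 5 independent binary weight vectors: averaging their
# outputs reduces the variance a single binary predictor suffers from.
outputs = []
for _ in range(5):
    w_b = binarize(rng.standard_normal(n))
    out = xnor_popcount_dot(x_bits, pack_bits(w_b), n)
    # Sanity check: the bitwise result equals the ordinary dot product.
    assert out == int(np.dot(x_b.astype(int), w_b.astype(int)))
    outputs.append(out)
print("individual outputs:", outputs, "| ensemble mean:", np.mean(outputs))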