HATS: Histograms of Averaged Time Surfaces for Robust
Event-based Object Classification
Abstract
Event-based cameras have recently drawn the attention
of the Computer Vision community thanks to their advantages in terms of high temporal resolution, low power consumption and high dynamic range, compared to traditional
frame-based cameras. These properties make event-based
cameras an ideal choice for autonomous vehicles, robot
navigation or UAV vision, among others. However, the
accuracy of event-based object classification algorithms,
which is of crucial importance for any reliable system working in real-world conditions, still lags far behind that of their frame-based counterparts. Two main reasons for this performance
gap are: 1. The lack of effective low-level representations
and architectures for event-based object classification and
2. The absence of large real-world event-based datasets. In
this paper we address both problems. First, we introduce
a novel event-based feature representation together with a
new machine learning architecture. Compared to previous
approaches, we use local memory units to efficiently leverage past temporal information and build a robust event-based representation. Second, we release the first large
real-world event-based dataset for object classification. We
compare our method to the state-of-the-art with extensive
experiments, showing better classification performance and
real-time computation.
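As a rough illustration of the general idea behind histograms of averaged time surfaces, the sketch below builds, for each event, a local "time surface" (an exponential decay of the timestamps of recent past events in a small neighborhood), then averages these surfaces within spatial cells and normalizes by the event count. This is a simplified sketch based on the general time-surface literature, not the paper's exact formulation; the function names, the event tuple layout `(x, y, t, p)`, and the parameters `rho` (neighborhood radius), `tau` (decay constant), and `cell` (cell size) are all assumptions for illustration.

```python
import numpy as np

def local_time_surface(event, past_events, rho=3, tau=0.5):
    # Time surface around one event: exponential decay of past-event
    # timestamps in a (2*rho+1) x (2*rho+1) neighborhood.
    # (Illustrative sketch; parameter names are assumptions.)
    x, y, t, _ = event
    surface = np.zeros((2 * rho + 1, 2 * rho + 1))
    for px, py, pt, _ in past_events:
        dx, dy = int(px - x), int(py - y)
        if abs(dx) <= rho and abs(dy) <= rho and pt <= t:
            val = np.exp(-(t - pt) / tau)
            # Keep the strongest (most recent) contribution per pixel.
            surface[dy + rho, dx + rho] = max(surface[dy + rho, dx + rho], val)
    return surface

def averaged_time_surfaces(events, width, height, cell=4, rho=3, tau=0.5):
    # Average the per-event time surfaces inside each spatial cell and
    # normalize by the cell's event count, yielding one histogram per cell.
    ncx, ncy = width // cell, height // cell
    hats = np.zeros((ncy, ncx, (2 * rho + 1) ** 2))
    counts = np.zeros((ncy, ncx))
    for i, ev in enumerate(events):
        x, y = int(ev[0]), int(ev[1])
        cx, cy = min(x // cell, ncx - 1), min(y // cell, ncy - 1)
        surf = local_time_surface(ev, events[:i], rho=rho, tau=tau)
        hats[cy, cx] += surf.ravel()
        counts[cy, cx] += 1
    nonzero = counts > 0
    hats[nonzero] /= counts[nonzero][:, None]
    return hats
```

Averaging over a cell's events (rather than using a single event's surface) is what makes this kind of descriptor robust to noise, which is one motivation the abstract gives for leveraging past temporal information through local memory.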