Abstract
The perceived success of recent visual recognition approaches has largely been derived from their performance on classification tasks, where all possible classes are known at training time. But what about open set problems, where unknown classes appear at test time? Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under an assumption of incomplete class knowledge. In this paper, we formulate the problem as one of modeling positive training data at the decision boundary, where we can invoke statistical extreme value theory. A new algorithm is introduced for estimating the unnormalized posterior probability of class inclusion.
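The core idea of modeling positive training data at the decision boundary with extreme value theory can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the scores, the tail size, and the choice of a Weibull fit via `scipy.stats.weibull_min` are all assumptions made for the example.

```python
# Hedged sketch: fit an extreme value (Weibull) model to the tail of
# positive-class decision scores nearest the boundary, and use its CDF
# as an unnormalized probability of class inclusion.
# The data and tail size below are illustrative, not from the paper.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Hypothetical decision scores for positive training examples
# (e.g., signed distances from an SVM decision boundary).
pos_scores = rng.normal(loc=2.0, scale=0.7, size=200)

# Keep only the extreme tail: the smallest positive scores,
# i.e., the positives closest to the decision boundary.
tail_size = 20
tail = np.sort(pos_scores)[:tail_size]

# Fit a Weibull to the tail, anchored just below its minimum.
shape, loc, scale = weibull_min.fit(tail, floc=tail.min() - 1e-6)

def p_inclusion(score):
    """Unnormalized posterior probability that `score` belongs to the class."""
    return weibull_min.cdf(score, shape, loc=loc, scale=scale)

# A score well inside the positive region should receive a probability
# near 1, while a score far below the boundary should be near 0 and
# can be rejected as an unknown class.
print(p_inclusion(3.0), p_inclusion(-1.0))
```

Thresholding `p_inclusion` then gives a rejection rule for unknown classes: inputs whose scores fall outside the modeled tail of every known class are rejected rather than forced into a known label.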