Abstract
We develop a general framework for margin-based multicategory classification in metric spaces. The basic workhorse is a margin-regularized version of the nearest-neighbor classifier. We prove generalization bounds that match the state of the art in sample size n and significantly improve the dependence on the number of classes k. Our point of departure is a nearly Bayes-optimal finite-sample risk bound independent of k. Although k-free, this bound is unregularized and non-adaptive, which motivates our main result: Rademacher and scale-sensitive margin bounds with a logarithmic dependence on k. As the best previous risk estimates in this setting were of order √k, our bound is exponentially sharper. From the algorithmic standpoint, in doubling metric spaces our classifier may be trained on n examples in O(n² log n) time and evaluated on new points in O(log n) time.
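To make the abstract's central object concrete, the following is a minimal sketch of a margin-regularized nearest-neighbor classifier. It is an illustrative proxy only, not the paper's exact construction: here the "margin" of a training point is taken to be its distance to the nearest differently-labeled point, points below a threshold t are discarded, and prediction is by 1-NN over the retained set. The function names and the Euclidean metric are assumptions for the example.

```python
import math

def margin_prune(sample, t):
    """Keep only points whose distance to the nearest differently-labeled
    point is at least t -- a simple stand-in for margin regularization.
    `sample` is a list of (point, label) pairs; points are coordinate tuples."""
    kept = []
    for x, y in sample:
        sep = min(math.dist(x, x2) for x2, y2 in sample if y2 != y)
        if sep >= t:
            kept.append((x, y))
    return kept

def nn_classify(prototypes, x):
    """Predict the label of the nearest retained prototype (1-NN rule)."""
    return min(prototypes, key=lambda p: math.dist(p[0], x))[1]

# Example: an "a"-labeled outlier sitting inside class "b" is pruned away.
sample = [((0.0, 0.0), "a"), ((0.0, 1.0), "a"),
          ((5.0, 5.0), "b"), ((5.0, 6.0), "b"),
          ((4.6, 5.0), "a")]           # low-margin outlier
pruned = margin_prune(sample, t=1.0)   # drops the outlier (and its closest "b")
label = nn_classify(pruned, (4.8, 5.0))
```

A naive implementation like this costs O(n²) distance computations at pruning time and O(n) per query; the paper's O(n² log n) training and O(log n) evaluation times rely on hierarchical data structures for doubling metric spaces, which this sketch does not attempt.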