
Max-Margin Invariant Features from Transformed Unlabeled Data


Abstract 

The study of representations invariant to common transformations of the data is important to learning. Most techniques have focused on local approximate invariance implemented within expensive optimization frameworks lacking explicit theoretical guarantees. In this paper, we study kernels that are invariant to a unitary group while having theoretical guarantees in addressing the important practical issue of the unavailability of transformed versions of labelled data, a problem we call the Unlabeled Transformation Problem, which is a special form of semi-supervised learning and one-shot learning. We present a theoretically motivated alternate approach to the invariant kernel SVM, based on which we propose Max-Margin Invariant Features (MMIF) to solve this problem. As an illustration, we design a framework for face recognition and demonstrate the efficacy of our approach on a large-scale semi-synthetic dataset with 153,000 images and a new challenging protocol on Labelled Faces in the Wild (LFW), while outperforming strong baselines.
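
Since only the abstract is given here, the following minimal Python sketch illustrates the general idea of a kernel made invariant to a finite unitary group by averaging a base kernel over the group's transformations. It is an illustrative assumption, not the paper's actual MMIF construction: the RBF base kernel, the rotation group, and all function names below are hypothetical.

import numpy as np

# A minimal, hypothetical sketch of a group-invariant kernel: average a base
# kernel over all elements of a finite unitary group. This only illustrates
# the kind of invariance discussed in the abstract; it is not the paper's
# MMIF construction, and every name below is an assumption.

def base_kernel(x, y, gamma=0.5):
    # Plain RBF kernel between two vectors.
    return np.exp(-gamma * np.linalg.norm(x - y) ** 2)

def invariant_kernel(x, y, group):
    # K_inv(x, y) = (1/|G|) * sum_g k(x, g y). For a unitary group G and a
    # norm-based base kernel, K_inv(x, h y) = K_inv(x, y) for any h in G,
    # because applying h only reindexes the sum over the group.
    return float(np.mean([base_kernel(x, g @ y) for g in group]))

def rotation(theta):
    # 2-D rotation matrix; rotations are unitary (orthogonal) transformations.
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# The cyclic group of rotations by multiples of 90 degrees.
group = [rotation(k * np.pi / 2) for k in range(4)]

x = np.array([1.0, 0.0])
y = rotation(np.pi / 2) @ x  # y is a transformed (rotated) copy of x

# Both values coincide, showing the kernel cannot distinguish x from its
# transformed versions: the induced representation is invariant to the group.
print(invariant_kernel(x, x, group))
print(invariant_kernel(x, y, group))

In this sketch, invariance follows from the group-averaging step alone; the paper's contribution, per the abstract, is obtaining such invariance with theoretical guarantees when transformed versions of the labelled data are unavailable.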

