Regularizers versus Losses for Nonlinear Dimensionality Reduction

2020-02-28

Abstract

We demonstrate that almost all nonparametric dimensionality reduction methods can be expressed by a simple procedure: regularized loss minimization plus singular value truncation. By distinguishing the role of the loss and regularizer in such a process, we recover a factored perspective that reveals some gaps in the current literature. Beyond identifying a useful new loss for manifold unfolding, a key contribution is to derive new convex regularizers that combine distance maximization with rank reduction. These regularizers can be applied to any loss.
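The recipe named in the abstract can be made concrete with a small sketch. The toy Python code below uses a squared-distance-matching loss, a trace regularizer, and projected gradient descent over the PSD cone, followed by eigenvalue truncation of the learned kernel matrix; these are illustrative stand-ins for the losses and convex regularizers the paper actually analyzes, and the function name two_stage_embed is hypothetical.

import numpy as np

def two_stage_embed(D, dim=2, lam=0.1, steps=300, lr=1e-3):
    # Stage 1: regularized loss minimization over a PSD kernel matrix K.
    # Stage 2: eigenvalue (singular value) truncation of K to read off coordinates.
    # The loss and regularizer here are illustrative placeholders, not the
    # specific choices derived in the paper.
    # D : (n, n) matrix of observed squared pairwise distances.
    n = D.shape[0]
    K = np.eye(n)
    for _ in range(steps):
        d = np.diag(K)
        D_K = d[:, None] + d[None, :] - 2.0 * K            # distances implied by K
        R = D_K - D                                         # residual to the data
        grad_loss = -2.0 * R + 2.0 * np.diag(R.sum(axis=1)) # grad of 0.5*||D_K - D||_F^2
        grad_reg = lam * np.eye(n)                          # grad of lam * trace(K)
        K -= lr * (grad_loss + grad_reg)
        w, V = np.linalg.eigh((K + K.T) / 2.0)              # project back onto the PSD cone
        K = (V * np.clip(w, 0.0, None)) @ V.T
    w, V = np.linalg.eigh(K)                                # truncation: keep the top `dim` components
    idx = np.argsort(w)[::-1][:dim]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))

For a quick check, D can be built from a small point cloud with plain numpy broadcasting, e.g. D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1), and the returned (n, dim) coordinate array plotted directly.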


