
Direct Sparsity Optimization Based Feature Selection for Multi-Class Classification

2019-11-22
Abstract: A novel sparsity optimization method is proposed to select features for multi-class classification problems by directly optimizing an ℓ2,p-norm (0 < p ≤ 1) based sparsity function subject to data-fitting inequality constraints that yield large between-class margins. The direct sparse optimization method circumvents the empirical tuning of regularization parameters required by existing feature selection methods that adopt the sparsity model as a regularization term. To solve the direct sparsity optimization problem, which is non-smooth and non-convex when 0 < p < 1, we propose an efficient iterative algorithm with proven convergence that converts the problem into a convex and smooth optimization problem at every iteration step. The proposed algorithm has been evaluated on publicly available datasets, and the experiments demonstrate that it achieves feature selection performance competitive with state-of-the-art algorithms.
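The abstract does not spell out the formulation or the iteration, so the sketch below only illustrates, under stated assumptions, one generic way an ℓ2,p row-sparsity objective with margin-type data-fitting constraints can be handled in practice: iterative reweighting, where fixing the row weights turns each step into a smooth convex subproblem. The one-vs-rest ±1 label coding, the squared-hinge penalty standing in for hard inequality constraints, the function name `l2p_row_sparse_sketch`, and all parameter values are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def l2p_row_sparse_sketch(X, y, p=0.5, lam=10.0, outer_iters=30,
                          inner_steps=300, eps=1e-4):
    """Minimal sketch of an iteratively reweighted scheme for an
    l_{2,p}-norm (0 < p <= 1) row-sparsity objective.

    The data-fitting inequality constraints are approximated here by a
    squared-hinge penalty on one-vs-rest margins
    Y_ik * (x_i^T w_k + b_k) >= 1 (an illustrative assumption).  With the
    row weights theta held fixed, each subproblem is smooth and convex and
    is solved approximately by gradient descent with a conservative step
    size.  X is assumed to be standardized.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    classes = np.unique(y)
    c = classes.size

    # +/-1 one-vs-rest label coding (assumption).
    Y = np.where(y[:, None] == classes[None, :], 1.0, -1.0)

    W = np.zeros((d, c))
    b = np.zeros(c)
    theta = np.ones(d)                      # uniform row weights on the first pass
    sigma2 = np.linalg.norm(X, 2) ** 2      # squared spectral norm of X

    for _ in range(outer_iters):
        # Conservative step size from an upper bound on the Hessian norm.
        lr = 1.0 / (2.0 * theta.max() + 2.0 * lam * (sigma2 + n))
        for _ in range(inner_steps):
            margins = Y * (X @ W + b)                 # n x c margin matrix
            viol = np.maximum(0.0, 1.0 - margins)     # active hinge residuals
            grad_W = 2.0 * theta[:, None] * W - 2.0 * lam * X.T @ (Y * viol)
            grad_b = -2.0 * lam * (Y * viol).sum(axis=0)
            W -= lr * grad_W
            b -= lr * grad_b

        # Reweighting: theta_j = (p/2) * ||W_j||_2^(p-2), smoothed by eps, so
        # that sum_j theta_j ||W_j||^2 locally majorizes sum_j ||W_j||_2^p.
        row_norms = np.sqrt((W ** 2).sum(axis=1) + eps)
        theta = 0.5 * p * row_norms ** (p - 2.0)

    # Rank features by the l2 norm of the corresponding row of W.
    scores = np.sqrt((W ** 2).sum(axis=1))
    return np.argsort(scores)[::-1], W, b
```

As a usage example under these assumptions, for a standardized feature matrix X of shape (n, d) and integer labels y, `l2p_row_sparse_sketch(X, y)[0][:k]` returns the indices of the k features with the largest row norms in W.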
