Unsupervised Feature Extraction by Time-Contrastive Learning and Nonlinear ICA

Abstract 

Nonlinear independent component analysis (ICA) provides an appealing framework for unsupervised feature learning, but the models proposed so far are not identifiable. Here, we first propose a new intuitive principle of unsupervised deep learning from time series which uses the nonstationary structure of the data. Our learning principle, time-contrastive learning (TCL), finds a representation which allows optimal discrimination of time segments (windows). Surprisingly, we show how TCL can be related to a nonlinear ICA model, when ICA is redefined to include temporal nonstationarities. In particular, we show that TCL combined with linear ICA estimates the nonlinear ICA model up to point-wise transformations of the sources, and this solution is unique — thus providing the first identifiability result for nonlinear ICA which is rigorous, constructive, as well as very general.
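
Below is a minimal sketch of the TCL recipe the abstract describes, assuming scikit-learn: split the time series into segments, train an MLP to discriminate which segment each sample came from, then apply linear ICA to the learned hidden features. The data, network size, and segment count are illustrative placeholders, not the authors' reference implementation.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

# Placeholder data: a multivariate time series X of shape (T, n_channels).
# In TCL the nonstationarity across segments drives identifiability; white
# noise is used here only so the snippet runs end to end.
rng = np.random.default_rng(0)
T, n_channels, n_segments = 6000, 3, 30
X = rng.standard_normal((T, n_channels))

# 1. Split the series into equal-length segments and label every sample
#    with the index of its segment.
seg_len = T // n_segments
labels = np.repeat(np.arange(n_segments), seg_len)

# 2. Time-contrastive learning: train a feature extractor (MLP with a
#    softmax output) to discriminate the segment index of each sample.
mlp = MLPClassifier(hidden_layer_sizes=(32, n_channels), activation="relu",
                    max_iter=500, random_state=0)
mlp.fit(X, labels)

# 3. Read out the last hidden layer as the learned representation by
#    replaying the forward pass through all but the output layer.
def hidden_features(model, data):
    h = data
    for W, b in zip(model.coefs_[:-1], model.intercepts_[:-1]):
        h = np.maximum(h @ W + b, 0.0)  # ReLU hidden units
    return h

features = hidden_features(mlp, X)

# 4. Linear ICA on the learned features; per the paper's result this
#    recovers the sources up to point-wise transformations.
sources = FastICA(n_components=n_channels, random_state=0).fit_transform(features)
print(sources.shape)  # (6000, 3)
```

On realistic nonstationary data, the components returned by the final linear ICA step correspond to the true sources only up to the point-wise transformations mentioned in the abstract, so any downstream use should treat their scale and sign as arbitrary.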

