Abstract
In multi-task learning, a major challenge arises from the notorious issue of negative transfer: sharing knowledge with dissimilar and hard tasks often worsens performance. To circumvent this issue, we propose a novel multi-task learning method that simultaneously learns latent task representations and a block-diagonal Latent Task Assignment Matrix (LTAM). Unlike most previous work, pursuing a block-diagonal structure of the LTAM (which assigns latent tasks to output tasks) alleviates negative transfer by penalizing inter-group knowledge transfer and sharing. Achieving this is challenging, since our notion of the block-diagonal property extends the traditional notion defined for homogeneous, square matrices. In this paper, we propose a spectral regularizer that provably induces the desired structure, together with a practical relaxation scheme that improves the flexibility of the model. Given the resulting objective function, we develop an alternating optimization method, which reveals an interesting connection between our method and the optimal transport problem. Finally, the method is demonstrated on a simulated dataset and three real-world benchmark datasets, and further applied to two personalized attribute learning datasets.
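For intuition, the following is a minimal sketch of one standard spectral device for promoting block-diagonal structure in a non-square assignment matrix; it is an illustration under our own (hypothetical) notation, not necessarily the exact regularizer proposed in the paper. Suppose $S \in \mathbb{R}^{m \times T}$ assigns $m$ latent tasks to $T$ output tasks, and form the bipartite affinity and its Laplacian
\[
A(S) \;=\;
\begin{pmatrix}
0 & |S| \\
|S|^\top & 0
\end{pmatrix},
\qquad
L(S) \;=\; \mathrm{Diag}\!\big(A(S)\mathbf{1}\big) - A(S).
\]
Since the number of zero eigenvalues of a graph Laplacian equals the number of connected components, $S$ is block-diagonal with at least $k$ blocks (up to row and column permutations) if and only if the $k$ smallest eigenvalues of $L(S)$ vanish. One may therefore penalize
\[
\Omega_k(S) \;=\; \sum_{i=1}^{k} \lambda_i\big(L(S)\big)
\;=\; \min_{F \in \mathbb{R}^{(m+T)\times k},\; F^\top F = I_k}
\big\langle L(S),\, FF^\top \big\rangle ,
\]
where $\lambda_1 \le \lambda_2 \le \cdots$ are the eigenvalues in ascending order and the variational form follows from Ky Fan's theorem; the latter form is what makes such regularizers amenable to alternating minimization over $S$ and $F$.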