Efficient Protocol for Collaborative Dictionary Learning in Decentralized Networks
Abstract
This paper is concerned with the task of collaborative density estimation in the distributed multi-task setting. A major application scenario is collaborative anomaly detection among industrial assets owned by different companies that compete with each other. Of critical importance here is to meet two conflicting goals at once: data privacy and collaboration. To this end, we propose a new framework for collaborative dictionary learning. By using a mixture of the exponential family, we show that collaborative learning can be cleanly separated into three steps: local updates, global consensus, and optimization. For the critical step of consensus building, we propose a new algorithm that does not rely on expensive encryption-based multiparty computation. Our theoretical and experimental analysis shows that our method is several orders of magnitude faster than encryption-based alternatives.
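To make the three-step decomposition concrete, the following is a minimal sketch (not the paper's actual protocol) of how an exponential-family mixture, here a one-dimensional Gaussian mixture fit by EM, separates into local updates, global consensus, and optimization. The party data, initialization, and the plain summation standing in for the consensus step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three parties, each holding private 1-D data drawn
# from one of two Gaussian components (an exponential-family mixture).
K = 2
parties = [rng.normal(loc=[-2.0, 2.0][i % K], scale=0.5, size=50) for i in range(3)]

# Shared initial model: component means, variances, and mixing weights.
mu = np.array([-1.0, 1.0])
var = np.ones(K)
pi = np.full(K, 1.0 / K)

for _ in range(20):
    # Step 1: local updates -- each party computes E-step responsibilities
    # and its local sufficient statistics; raw data never leaves the party.
    stats = []
    for x in parties:
        x = x[:, None]
        logp = (-0.5 * (x - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        stats.append((r.sum(0), (r * x).sum(0), (r * x ** 2).sum(0)))

    # Step 2: global consensus -- only aggregated sufficient statistics are
    # combined (the paper's protocol achieves this without encryption-based
    # multiparty computation; a plain sum stands in for it here).
    N, S1, S2 = (sum(s[i] for s in stats) for i in range(3))

    # Step 3: optimization -- closed-form M-step on the consensus statistics.
    mu = S1 / N
    var = S2 / N - mu ** 2
    pi = N / N.sum()

print(np.round(np.sort(mu), 1))
```

With this synthetic data the recovered means settle near the true component centers at -2 and 2; only the per-component statistics (counts, sums, sums of squares) ever cross party boundaries.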