HYPER-SAGNN: A SELF-ATTENTION BASED GRAPH NEURAL NETWORK FOR HYPERGRAPHS

Abstract

Graph representation learning for hypergraphs can be used to extract patterns among higher-order interactions that are critically important in many real world problems. Current approaches designed for hypergraphs, however, are unable to handle different types of hypergraphs and are typically not generic for various learning tasks. Indeed, models that can predict variable-sized heterogeneous hyperedges have not been available. Here we develop a new self-attention based graph neural network called Hyper-SAGNN applicable to homogeneous and heterogeneous hypergraphs with variable hyperedge sizes. We perform extensive evaluations on multiple datasets, including four benchmark network datasets and two single-cell Hi-C datasets in genomics. We demonstrate that Hyper-SAGNN significantly outperforms the state-of-the-art methods on traditional tasks while also achieving great performance on a new task called outsider identification. Hyper-SAGNN will be useful for graph representation learning to uncover complex higher-order interactions in different applications.
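Since the abstract only outlines the model at a high level, the sketch below illustrates the core idea it describes: scoring a variable-sized tuple of nodes as a candidate hyperedge using self-attention over the tuple's members. This is a minimal, hypothetical PyTorch sketch, not the authors' released implementation; the embedding dimension, head count, layer choices, and the static-versus-dynamic embedding comparison are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code): score a variable-sized
# candidate hyperedge with self-attention over its node embeddings.
# All layer sizes and the static/dynamic comparison are assumptions.
import torch
import torch.nn as nn


class HyperedgeScorer(nn.Module):
    def __init__(self, num_nodes: int, dim: int = 64, heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(num_nodes, dim)
        # "Static" embedding: a position-wise feed-forward applied to each node alone.
        self.static = nn.Sequential(nn.Linear(dim, dim), nn.Tanh())
        # "Dynamic" embedding: multi-head self-attention over the nodes of one candidate hyperedge.
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Per-node score from the element-wise squared difference of the two embeddings.
        self.score = nn.Linear(dim, 1)

    def forward(self, node_ids: torch.Tensor) -> torch.Tensor:
        # node_ids: (batch, k) candidate hyperedges, each with k member nodes.
        x = self.embed(node_ids)                   # (batch, k, dim)
        s = self.static(x)                         # context-free embeddings
        d, _ = self.attn(x, x, x)                  # context-aware embeddings
        p = torch.sigmoid(self.score((d - s) ** 2)).squeeze(-1)  # (batch, k)
        return p.mean(dim=-1)                      # probability the tuple is a hyperedge


# Usage: score two candidate 3-node hyperedges in a toy 10-node hypergraph.
model = HyperedgeScorer(num_nodes=10)
candidates = torch.tensor([[0, 3, 7], [1, 2, 9]])
print(model(candidates))                           # tensor of two probabilities
```

In practice, batches mixing hyperedges of different sizes would be padded and the padding masked out (e.g. via the attention layer's key padding mask) so that the self-attention and the final average only see real members; that bookkeeping is omitted here for brevity.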
