Hierarchical Diffusion Attention Network

2019-10-09
Abstract: A series of recent studies formulated the diffusion prediction problem as a sequence prediction task and proposed several sequential models based on recurrent neural networks. However, non-sequential properties exist in real diffusion cascades, which do not strictly follow the sequential assumptions of previous work. In this paper, we propose a hierarchical diffusion attention network (HiDAN), which adopts a non-sequential framework and two-level attention mechanisms, for diffusion prediction. At the user level, a dependency attention mechanism is proposed to dynamically capture historical user-to-user dependencies and extract the dependency-aware user information. At the cascade (i.e., sequence) level, a time-aware influence attention is designed to infer possible future users' dependencies on historical users by considering both inherent user importance and time decay effects. Significantly higher effectiveness and efficiency of HiDAN over state-of-the-art sequential models are demonstrated when evaluated on three real diffusion datasets. The further case studies illustrate that HiDAN can accurately capture diffusion dependencies.
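The cascade-level mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the exponential-style time-decay term, and the `decay` parameter are all assumptions; the paper combines inherent user importance with time decay, and one simple way to realize that is to subtract a decay penalty from each importance score before a softmax.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def time_aware_influence_attention(user_states, importance, t_current, t_history, decay=0.1):
    """Hypothetical sketch of time-aware influence attention.

    user_states: (n, d) array of dependency-aware representations of the
                 n historical users in the cascade.
    importance:  (n,) inherent importance score per historical user.
    t_current:   prediction time; t_history: (n,) activation times.
    decay:       assumed linear time-decay coefficient (illustrative only).
    """
    elapsed = t_current - t_history          # time since each user joined the cascade
    scores = importance - decay * elapsed    # importance attenuated by time decay (assumed form)
    weights = softmax(scores)                # attention weights over historical users
    return weights @ user_states             # attention-weighted cascade representation
```

With equal inherent importance, a user who activated more recently receives a larger attention weight, which matches the intuition that older activations should influence the next prediction less.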

