Structural Deep Embedding for Hyper-Networks

Published in AAAI, 2018

Recommended citation: Tu, Ke, et al. "Structural Deep Embedding for Hyper-Networks". Proceedings of the 32nd AAAI Conference on Artificial Intelligence (2018). /files/2018_AAAI_DHNE.pdf

Network embedding has recently attracted a lot of attention in data mining. Existing network embedding methods mainly focus on networks with pairwise relationships. In the real world, however, the relationships among data points can go beyond pairwise, i.e., three or more objects may be involved in each relationship, represented by a hyperedge, thus forming hyper-networks. These hyper-networks pose great challenges to existing network embedding methods when the hyperedges are indecomposable, that is, no subset of the nodes in a hyperedge forms another hyperedge. Such indecomposable hyperedges are especially common in heterogeneous networks.
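For concreteness, a hyper-network with indecomposable hyperedges can be stored as a set of node tuples rather than pairwise edges. The snippet below is an illustrative sketch (the node names are hypothetical, not from the paper): only the full tuple, never a subset of it, counts as a relationship.

```python
# Illustrative sketch (node names hypothetical): a heterogeneous hyper-network
# stored as a set of indecomposable hyperedges, each tying three typed nodes.
hyperedges = {
    ("user:alice", "movie:inception", "tag:sci-fi"),
    ("user:bob",   "movie:inception", "tag:thriller"),
}

def is_relationship(nodes):
    """Only the complete tuple is a relationship; a proper subset,
    e.g. ("user:alice", "tag:sci-fi"), is not a hyperedge on its own."""
    return tuple(nodes) in hyperedges
```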

In this paper, we propose a novel Deep Hyper-Network Embedding (DHNE) model to embed hyper-networks with indecomposable hyperedges. More specifically, we theoretically prove that any linear similarity metric in the embedding space, as commonly used by existing methods, cannot maintain the indecomposability property of hyper-networks, and we therefore propose a new deep model that realizes a non-linear tuplewise similarity function while preserving both local and global proximities in the learned embedding space. We conduct extensive experiments on four different types of hyper-networks: a GPS network, an online social network, a drug network, and a semantic network. The empirical results demonstrate that our method significantly and consistently outperforms state-of-the-art algorithms.
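To illustrate the non-linear tuplewise similarity idea, the sketch below scores a triple of node embeddings with a small multilayer network (written in PyTorch; the layer sizes and class name are assumptions for illustration, not the paper's exact architecture). A purely linear scoring function of the embeddings would, by the paper's argument, be unable to preserve indecomposability, hence the non-linearity between the two layers.

```python
import torch
import torch.nn as nn

class TuplewiseSimilarity(nn.Module):
    """Non-linear tuplewise similarity over the three nodes of a hyperedge.
    Layer sizes and the two-layer design are illustrative assumptions,
    not the exact DHNE architecture."""

    def __init__(self, embed_dim=64, hidden_dim=32):
        super().__init__()
        self.hidden = nn.Linear(3 * embed_dim, hidden_dim)  # mixes the tuple non-linearly
        self.out = nn.Linear(hidden_dim, 1)                 # scalar similarity score

    def forward(self, a, b, c):
        # a, b, c: (batch, embed_dim) embeddings of the three nodes in a candidate hyperedge
        h = torch.tanh(self.hidden(torch.cat([a, b, c], dim=-1)))
        # Sigmoid output: estimated probability that (a, b, c) forms a hyperedge
        return torch.sigmoid(self.out(h)).squeeze(-1)

# Usage: score a batch of 8 candidate triples of 64-dimensional embeddings.
sim = TuplewiseSimilarity()
a, b, c = torch.randn(8, 64), torch.randn(8, 64), torch.randn(8, 64)
scores = sim(a, b, c)  # shape (8,), each score in (0, 1)
```

A scorer along these lines would typically be trained to assign high scores to observed hyperedges and low scores to sampled non-edges, while the embeddings themselves also preserve neighborhood structure for the global proximity.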

The code is available here.

The datasets are available as follows:

Recommended citation:

@inproceedings{tu2018structural,
  title={Structural deep embedding for hyper-networks},
  author={Tu, Ke and Cui, Peng and Wang, Xiao and Wang, Fei and Zhu, Wenwu},
  booktitle={Thirty-Second AAAI Conference on Artificial Intelligence},
  year={2018}
}