• Stars: 2
• Created over 6 years ago
• Updated over 6 years ago

Repository Details

Network Embedding (NE) plays an important role in real-world network analysis. Most current Network Representation Learning (NRL) models consider only structural information and produce static embeddings. However, the same vertex can exhibit different characteristics when interacting with different vertices. In this paper, we propose a context-aware text-embedding model that seamlessly integrates the structural and textual information of each vertex. We employ a Variational AutoEncoder (VAE) to obtain a static textual representation of each vertex and use a mutual attention mechanism to dynamically assign embeddings to a vertex according to the neighbors it interacts with. Comprehensive experiments were conducted on two publicly available link prediction datasets, and the results demonstrate that our model outperforms the baselines.
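The repository code itself is not shown here, so the following is only a minimal sketch of the mutual attention idea described above: word-level text features of two linked vertices (which, per the abstract, could come from a VAE text encoder) are correlated and pooled so that each vertex's text embedding depends on the neighbor it interacts with. All names, shapes, and the PyTorch framing are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MutualAttention(nn.Module):
    """Hypothetical mutual attention between the text features of two linked vertices.

    Given per-word features P (vertex u) and Q (vertex v), the correlation matrix
    C = tanh(P A Q^T) is pooled along each axis to produce attention weights, so the
    resulting text embedding of each vertex is conditioned on its current neighbor.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.A = nn.Parameter(torch.empty(dim, dim))  # learnable correlation matrix
        nn.init.xavier_uniform_(self.A)

    def forward(self, P: torch.Tensor, Q: torch.Tensor):
        # P: (m, d) word-level features of vertex u's text
        # Q: (n, d) word-level features of vertex v's text
        C = torch.tanh(P @ self.A @ Q.T)            # (m, n) pairwise correlation scores
        alpha = F.softmax(C.mean(dim=1), dim=0)      # attention over u's words
        beta = F.softmax(C.mean(dim=0), dim=0)       # attention over v's words
        u_text = alpha @ P                           # (d,) context-aware embedding of u
        v_text = beta @ Q                            # (d,) context-aware embedding of v
        return u_text, v_text


if __name__ == "__main__":
    # Toy usage with random features standing in for VAE-encoded vertex texts.
    attn = MutualAttention(dim=64)
    P = torch.randn(20, 64)   # 20 words in u's text
    Q = torch.randn(15, 64)   # 15 words in v's text
    u_emb, v_emb = attn(P, Q)
    print(u_emb.shape, v_emb.shape)  # torch.Size([64]) torch.Size([64])
```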