Abstract
Graphs represent real-world relationships, and graph embedding projects the nodes of a graph into a latent space that can simplify downstream tasks. The recent development of graph convolutions in deep learning has significantly improved performance on many graph learning tasks. Unfortunately, prior embedding methods either cannot embed graphs with node features, or fail to produce high-quality embeddings for downstream learning tasks, resulting in a large performance gap compared to direct learning on graphs. We present a versatile and effective embedding method, Conv2Vec, for embedding graphs with or without node features. It is based on graph convolutions, with objective functions motivated by concepts and structures from classical graph algorithms. Conv2Vec produces high-quality embeddings of both plain graphs and graphs with node features for downstream tasks. We evaluate the embeddings generated by Conv2Vec on a transductive node classification task. With the generated embeddings and very simple machine learning methods, we achieve accuracies similar to those of direct learning with graph convolutions. Interestingly, when we strip the node features from the graph, so that the embedding must be learned entirely from the graph topology, node classification with our embedding significantly outperforms direct learning with various graph convolutions. This suggests that structures from classical graph algorithms may play an important role in learning on graphs.