Title
Soft-mask: Adaptive Substructure Extractions for Graph Neural Networks
Abstract
For learning graph representations, not all detailed structures within a graph are relevant to the given graph tasks. Task-relevant structures can be localized or sparse: they may be confined to certain subgraphs, or characterized by the interactions among subgraphs (a hierarchical perspective). A graph neural network should efficiently extract task-relevant structures while remaining invariant to irrelevant parts, which is challenging for general message-passing GNNs. In this work, we propose to learn graph representations from a sequence of subgraphs of the original graph, so as to better capture task-relevant substructures or hierarchical structures and skip noisy parts. To this end, we design a soft-mask GNN layer that extracts the desired subgraphs through a mask mechanism. The soft mask is defined in a continuous space, which maintains differentiability and characterizes the weights of different parts. Compared with existing subgraph or hierarchical representation learning methods and graph pooling operations, the soft-mask GNN layer is not limited by a fixed sample or drop ratio, and is therefore more flexible in extracting subgraphs of arbitrary size. Extensive experiments on public graph benchmarks show that the soft-mask mechanism brings performance improvements. It also provides interpretability: visualizing the mask values in each layer gives insight into the structures learned by the model.
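The core idea in the abstract is that each layer assigns every node a continuous (soft) mask in (0, 1), and messages are scaled by these masks so that irrelevant nodes are softly excluded while the whole operation stays differentiable. A minimal sketch of that idea, assuming a simple formulation where the mask is a sigmoid-scored linear function of node features (the function names, shapes, and scoring rule here are illustrative, not the paper's exact layer):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_mask_layer(H, A, W, w_mask):
    """One illustrative soft-mask message-passing layer (hypothetical form).

    H: (n, d) node features; A: (n, n) adjacency (with self-loops);
    W: (d, d_out) feature transform; w_mask: (d,) mask scorer.
    Each node gets a soft mask in (0, 1); messages are scaled by the
    sender's mask, so low-mask nodes are softly dropped rather than
    removed by a fixed sample/drop ratio.
    """
    mask = sigmoid(H @ w_mask)            # (n,) soft node masks in (0, 1)
    messages = A @ (H * mask[:, None])    # aggregate mask-weighted neighbor features
    return np.maximum(messages @ W, 0.0), mask  # ReLU output; masks returned for inspection

# Tiny usage example on a 3-node graph with self-loops only.
H = np.ones((3, 2))
A = np.eye(3)
W = np.ones((2, 2))
w_mask = np.zeros(2)                      # scores 0 -> masks of exactly 0.5
out, mask = soft_mask_layer(H, A, W, w_mask)
```

Because the mask stays in a continuous space, gradients flow through it during training, and visualizing `mask` per layer is what gives the interpretability mentioned above.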
Year: 2021
DOI: 10.1145/3442381.3449929
Venue: International World Wide Web Conference
Keywords: deep learning, graph neural networks, graph representation learning
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name          Order  Citations  PageRank
Mingqi Yang   1      0          0.68
Yanming Shen  2      11         3.61
Heng Qi       3      44         10.17
Baocai Yin    4      691        124.79