T2-GNN: Graph Neural Networks for Graphs with Incomplete Features and Structure via Teacher-Student Distillation

Cuiying Huo, Di Jin, Yawen Li, Dongxiao He, Yu-Bin Yang, Lingfei Wu
Dec 2022
Graph Neural Networks (GNNs) have been a prevailing technique for tackling various analysis tasks on graph data. A key premise for the remarkable performance of GNNs is complete and trustworthy initial graph descriptions (i.e., node features and graph structure), which is often not satisfied, since real-world graphs are frequently incomplete due to various unavoidable factors. In particular, GNNs face greater challenges when node features and graph structure are both incomplete at the same time. Existing methods focus on either feature completion or structure completion. They usually rely on the matching relationship between features and structure, or employ joint learning of node representation and feature (or structure) completion in the hope of achieving mutual benefit. However, recent studies confirm that mutual interference between features and structure leads to the degradation of GNN performance. When both features and structure are incomplete, the mismatch between them caused by the randomness of the missing entries exacerbates this interference, which may trigger incorrect completions that negatively affect node representation. To this end, in this paper we propose a general GNN framework based on teacher-student distillation to improve the performance of GNNs on incomplete graphs, named T2-GNN. To avoid interference between features and structure, we separately design feature-level and structure-level teacher models that provide targeted guidance to the student model (a base GNN, such as GCN) through distillation. We then design two personalized methods to obtain well-trained feature and structure teachers. To ensure that the teachers' knowledge is comprehensively and effectively distilled to the student, we further propose a dual distillation mode that enables the student to acquire as much expert knowledge as possible.
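The abstract describes a student guided by two separate teachers, one for features and one for structure. As a minimal sketch of that idea, the snippet below reduces it to the loss computation: the student's logits are pulled toward temperature-softened targets from each teacher via two KL-divergence terms. The function names, the exact loss form, the temperature, and the weighting coefficient are all assumptions for illustration; the abstract does not specify the paper's actual distillation objective.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    """Mean KL(p || q) over nodes; eps guards against log(0)."""
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def dual_distillation_loss(student_logits, feat_teacher_logits,
                           struct_teacher_logits, T=2.0, alpha=0.5):
    """Hypothetical dual-teacher objective: a weighted sum of KL terms
    pulling the student toward each teacher's soft labels separately,
    so the feature and structure signals do not interfere."""
    s = softmax(student_logits, T)
    t_feat = softmax(feat_teacher_logits, T)
    t_struct = softmax(struct_teacher_logits, T)
    return alpha * kl_div(t_feat, s) + (1 - alpha) * kl_div(t_struct, s)

# Toy example: 4 nodes, 3 classes, random logits.
rng = np.random.default_rng(0)
student = rng.normal(size=(4, 3))
feat_teacher = rng.normal(size=(4, 3))
struct_teacher = rng.normal(size=(4, 3))
loss = dual_distillation_loss(student, feat_teacher, struct_teacher)
```

Keeping the two teachers as separate KL terms, rather than averaging their logits first, mirrors the abstract's motivation: each teacher's guidance reaches the student without being mixed with the other's.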