Deep Attention-guided Graph Clustering with Dual Self-supervision

Zhihao Peng, Hui Liu, Yuheng Jia, Junhui Hou

Abstract
Existing deep embedding clustering works only consider the deepest layer to learn a feature embedding and thus fail to well utilize the available discriminative information from cluster assignments, resulting in limited performance. To this end, we propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC). Specifically, DAGC first utilizes a heterogeneity-wise fusion module to adaptively integrate the features of an auto-encoder and a graph convolutional network in each layer, and then uses a scale-wise fusion module to dynamically concatenate the multi-scale features from different layers. These modules learn a discriminative feature embedding via an attention-based mechanism. In addition, we design a distribution-wise fusion module that leverages cluster assignments to acquire clustering results directly. To better exploit the discriminative information in the cluster assignments, we develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss. Extensive experiments validate that our method consistently outperforms state-of-the-art methods on six benchmark datasets. In particular, our method improves the ARI by more than 18.14% over the best baseline.
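The abstract does not give the loss formulas, but the soft self-supervision it describes (a triplet Kullback-Leibler divergence over cluster-assignment distributions) can be sketched in a DEC-style form: a sharpened target distribution P is derived from the soft assignments, and KL terms align the branch assignments with P and with each other. The specific triplet form and the names `target_distribution` and `triplet_kl_loss` below are illustrative assumptions, not the paper's exact objective:

```python
import numpy as np

def target_distribution(q):
    # DEC-style sharpened target P from soft assignments Q (rows: samples,
    # cols: clusters). Squaring emphasizes high-confidence assignments;
    # dividing by cluster frequency balances cluster sizes.
    weight = q ** 2 / q.sum(axis=0)
    return weight / weight.sum(axis=1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # Sum of row-wise KL(p_i || q_i); eps guards against log(0).
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def triplet_kl_loss(q_ae, q_gcn, p):
    # Assumed triplet form: align the auto-encoder assignments (q_ae) and the
    # GCN assignments (q_gcn) with the target P, and with each other.
    return kl_div(p, q_ae) + kl_div(p, q_gcn) + kl_div(q_ae, q_gcn)

def pseudo_labels(p):
    # Hard self-supervision: take argmax of the target distribution as
    # pseudo labels for a supervised-style loss (sketch of the idea only).
    return np.argmax(p, axis=1)
```

In this reading, the "soft" strategy trains on full distributions while the "hard" strategy commits to pseudo labels, so the two supervise the network at different confidence levels.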