Continual Graph Convolutional Networks for Text Classification
Nov. 19th, 2022: Accepted to AAAI '23!
Tiandeng WU*, Qijiong LIU*, Yao HUANG, Yi CAO, Xiao-Ming WU#, and Jiandong DING#
*Equal contribution (co-first authors); author ordering determined by dice rolling. #Corresponding authors.
[Code] [Paper]
Abstract
To capture global non-consecutive and long-distance semantic information, graph convolutional networks (GCNs) have been widely used for text classification. While GCN-based methods have achieved great success in offline evaluations, they usually construct fixed document-token graphs and cannot perform inference on new documents. Applying GCNs in online systems that must continually infer on incoming text data therefore remains a challenge. In this work, we present a Continual GCN model (ContGCN) that generalizes inference from observed documents to unobserved documents. Concretely, we propose a novel global-token-local-document paradigm that dynamically updates the document-token graph in every batch of any online system, during both training and testing. Moreover, we design an occurrence memory module and a self-supervised contrastive learning objective so that ContGCN can be updated in any online system in a label-free manner. Extensive offline experiments on five public datasets demonstrate that ContGCN significantly improves inference quality, and a 3-month A/B test on our internal online system shows that ContGCN achieves an 8.86% performance gain over state-of-the-art methods.
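To make the global-token-local-document idea concrete, below is a minimal sketch of how a per-batch document-token graph might be assembled from a shared token vocabulary and a running occurrence memory. All names (`vocab`, `update_memory`, `build_batch_graph`) and the PMI/term-frequency edge weighting are illustrative assumptions in the spirit of TextGCN-style graph construction, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical global vocabulary shared by all batches.
vocab = {"graph": 0, "network": 1, "text": 2, "label": 3}
V = len(vocab)

# Occurrence memory: global token (co-)occurrence counts, updated
# label-free as new batches stream in.
cooc = np.zeros((V, V))   # pairwise token co-occurrence counts
occ = np.zeros(V)         # per-token window counts
num_windows = 0

def update_memory(batch_docs, window=3):
    """Label-free memory update: slide a context window over each document."""
    global num_windows
    for doc in batch_docs:
        ids = [vocab[t] for t in doc if t in vocab]
        for s in range(max(len(ids) - window + 1, 1)):
            span = set(ids[s:s + window])
            num_windows += 1
            for i in span:
                occ[i] += 1
                for j in span:
                    if i != j:
                        cooc[i, j] += 1

def build_batch_graph(batch_docs):
    """Adjacency over V global token nodes plus B local document nodes."""
    B = len(batch_docs)
    A = np.eye(V + B)  # self-loops for all nodes
    # Token-token edges: positive PMI computed from the occurrence memory.
    p_i = occ / max(num_windows, 1)
    p_ij = cooc / max(num_windows, 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(p_ij / np.outer(p_i, p_i))
    A[:V, :V] += np.nan_to_num(np.maximum(pmi, 0.0), posinf=0.0, neginf=0.0)
    # Document-token edges: normalized term frequency within each document.
    for d, doc in enumerate(batch_docs):
        for t in doc:
            if t in vocab:
                A[V + d, vocab[t]] += 1.0 / len(doc)
    A[:V, V:] = A[V:, :V].T  # mirror so the graph stays symmetric
    return A

batch = [["graph", "network", "text"], ["text", "label"]]
update_memory(batch)
print(build_batch_graph(batch).shape)  # -> (6, 6)
```

Under this sketch, token nodes are global and persist across batches while document nodes are created locally per batch, so a trained model can score unseen documents without rebuilding the whole corpus graph.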
Citation
```bibtex
@inproceedings{wu2023contgcn,
  title     = {Continual Graph Convolutional Networks for Text Classification},
  author    = {Wu, Tiandeng and Liu, Qijiong and Huang, Yao and Cao, Yi and Wu, Xiao-Ming and Ding, Jiandong},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
  year      = {2023}
}
```