Article overview
InducT-GCN: Inductive Graph Convolutional Networks for Text Classification
Authors: Kunze Wang; Soyeon Caren Han; Josiah Poon
Date: 1 Jun 2022
Abstract: Text classification aims to assign labels to textual units by making use of
global information. Recent studies have applied graph neural networks (GNNs) to
capture the global word co-occurrence in a corpus. Existing approaches require
that all the nodes (training and test) in a graph be present during training,
which makes them transductive and unable to generalise naturally to unseen
nodes. To make those models inductive, they rely on extra resources, such as
pretrained word embeddings. However, high-quality resources are not always
available and are hard to train. Under the extreme setting of no extra
resources and a limited amount of training data, can we still learn an
inductive graph-based text classification
model? In this paper, we introduce a novel inductive graph-based text
classification framework, InducT-GCN (InducTive Graph Convolutional Networks
for Text classification). Compared to transductive models that require test
documents in training, we construct a graph based on the statistics of training
documents only and represent document vectors with a weighted sum of word
vectors. We then conduct one-directional GCN propagation during testing. Across
five text classification benchmarks, our InducT-GCN outperformed
state-of-the-art methods that are either transductive in nature or rely on
additional pretrained resources. We also conducted scalability testing by
gradually increasing the data size, showing that InducT-GCN reduces time and
space complexity. The code is available at: this https URL.
Source: arXiv, 2206.00265
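
A reading aid, not part of the article: the numpy sketch below illustrates the inference step the abstract describes, where an unseen document's vector is built as a TF-IDF-weighted sum of word vectors over the training vocabulary, and GCN propagation runs one-directionally from word nodes to the new document node. All names here (induct_gcn_predict, A_ww, W1, W2) and the two-layer setup are assumptions made for illustration only; normalization constants and self-loops are folded into the learned weights, and the authors' actual implementation is the one linked above.

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def induct_gcn_predict(tfidf_doc, A_ww, W1, W2):
    # Layer-1 states of the word nodes depend only on the training graph
    # (word-word edges, e.g. PMI-weighted as in TextGCN), so they can be
    # precomputed once and reused for every unseen document.
    H1_words = relu(A_ww @ W1)                    # (V, H)
    # Layer 1 for the new document: its input vector is the TF-IDF-weighted
    # sum of one-hot word vectors, so multiplying by W1 aggregates the word
    # input representations without touching the training graph.
    h1_doc = relu(tfidf_doc @ W1)                 # (H,)
    # Layer 2, one-directional: the document aggregates from word nodes, but
    # word nodes are never updated by the unseen document.
    logits = tfidf_doc @ H1_words @ W2 + h1_doc @ W2
    return softmax(logits)                        # (C,) class probabilities

# Toy usage with hypothetical sizes: V words, H hidden units, C classes.
V, H, C = 50, 16, 4
rng = np.random.default_rng(0)
probs = induct_gcn_predict(tfidf_doc=rng.random(V),
                           A_ww=rng.random((V, V)),
                           W1=0.01 * rng.standard_normal((V, H)),
                           W2=0.01 * rng.standard_normal((H, C)))

Under these assumptions, caching H1_words across the test set keeps per-document inference cost independent of the number of unseen documents, consistent with the scalability result the abstract reports.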