Multi-attention mechanism based on gate recurrent unit for English text classification

Authors

Liu H.

DOI:

https://doi.org/10.4108/eai.27-1-2022.173166

Keywords:

English text classification, multi-attention mechanism, GRU, deep learning

Abstract

This article has been retracted, and the retraction notice can be found here: http://dx.doi.org/10.4108/eai.8-4-2022.173791.


Text classification is one of the core tasks in natural language processing. To address the strengths and weaknesses of current deep-learning-based English text classification methods on long texts, this paper proposes an English text classification model that introduces a multi-attention mechanism based on the gated recurrent unit (GRU) to focus on the important parts of an English text. First, sentences and documents are encoded following the hierarchical structure of English documents. Second, an attention mechanism is applied separately at each level: on top of the global object vector, max pooling is used to extract a sentence-specific object vector, so that the encoded document vector carries more pronounced category features and attends to the most distinctive semantic features of each text. Finally, documents are classified according to the constructed document representation. Experimental results on public datasets show that the model achieves better classification performance on long English texts with hierarchical structure.
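The hierarchical pipeline described in the abstract (word-level GRU + attention to build sentence vectors, then sentence-level GRU with attention against a global object vector concatenated with a max-pooled specific object vector) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the dimensions, the dot-product attention form, the single-direction GRU cell, and all parameter names are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gru_step(x, h, W, U, b):
    """One GRU step; W, U, b stack the update, reset, and candidate gates."""
    d = h.shape[0]
    zr = 1.0 / (1.0 + np.exp(-(W[:2 * d] @ x + U[:2 * d] @ h + b[:2 * d])))
    z, r = zr[:d], zr[d:]
    n = np.tanh(W[2 * d:] @ x + U[2 * d:] @ (r * h) + b[2 * d:])
    return (1 - z) * h + z * n

def encode(seq, params):
    """Run a GRU over a sequence; return all hidden states, shape (T, d)."""
    W, U, b = params
    h = np.zeros(U.shape[1])
    return np.stack([(h := gru_step(x, h, W, U, b)) for x in seq])

def attend(H, u):
    """Dot-product attention of hidden states H against a context vector u."""
    return softmax(H @ u) @ H

emb, d, n_classes = 8, 6, 4  # toy sizes, chosen arbitrarily

def make_gru(d_in):
    return (rng.normal(0, 0.1, (3 * d, d_in)),
            rng.normal(0, 0.1, (3 * d, d)),
            np.zeros(3 * d))

word_gru, sent_gru = make_gru(emb), make_gru(d)
u_word, u_sent = rng.normal(size=d), rng.normal(size=d)  # global object vectors

# Toy document: 3 sentences of 5 word embeddings each (random stand-ins).
doc = [rng.normal(size=(5, emb)) for _ in range(3)]

# Word level: attention over word hidden states yields one vector per sentence.
sent_vecs = np.stack([attend(encode(s, word_gru), u_word) for s in doc])

# Sentence level: attention (global object vector) plus max pooling
# (sentence-specific object vector), concatenated into the document vector.
H = encode(sent_vecs, sent_gru)
doc_vec = np.concatenate([attend(H, u_sent), H.max(axis=0)])

# Classify the document representation.
W_out = rng.normal(0, 0.1, (n_classes, 2 * d))
probs = softmax(W_out @ doc_vec)
print(probs.shape)  # (4,)
```

Concatenating the attention-weighted and max-pooled sentence summaries is one way to realize the paper's pairing of a global object vector with a specific one; the max-pooled half keeps the single most salient activation per dimension across sentences.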

Published

27-01-2022

How to Cite

Liu H. Multi-attention mechanism based on gate recurrent unit for English text classification. EAI Endorsed Scal Inf Syst [Internet]. 2022 Jan. 27 [cited 2024 Nov. 24];9(4):e20. Available from: https://publications.eai.eu/index.php/sis/article/view/339