
Hierarchical Convolutional Attention Networks for Text Classification

by Shang Gao, Arvind Ramanathan, Georgia Tourassi
Publication Type
Conference Paper
Journal Name
Proceedings of The Third Workshop on Representation Learning for NLP
Publication Date
Page Numbers
11 to 23
Conference Name
Proceedings of The Third Workshop on Representation Learning for NLP
Conference Location
Melbourne, Australia
Conference Sponsor
Association for Computational Linguistics
Conference Date

Recent work in machine translation has demonstrated that self-attention mechanisms can be used in place of recurrent neural networks to increase training speed without sacrificing model accuracy. We propose combining this approach with the benefits of convolutional filters and a hierarchical structure to create a document classification model that is both highly accurate and fast to train; we name our method Hierarchical Convolutional Attention Networks. We demonstrate the effectiveness of this architecture by surpassing the accuracy of the current state-of-the-art on several classification tasks while being twice as fast to train.
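
To make the described architecture concrete, below is a minimal PyTorch sketch of the general idea: self-attention whose query/key/value projections come from convolutional filters, applied hierarchically (word-level attention within each sentence, then sentence-level attention over the document). This is not the authors' implementation; the class names (ConvAttentionBlock, HierarchicalClassifier), the single-head attention, the mean pooling between levels, and all layer sizes are illustrative assumptions.

    # Sketch only: a convolutional self-attention block used at two levels of a hierarchy.
    import math
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ConvAttentionBlock(nn.Module):
        """Self-attention whose Q/K/V projections are 1-D convolutions over the sequence."""
        def __init__(self, dim, kernel_size=3):
            super().__init__()
            pad = kernel_size // 2
            # Convolutions let the projections capture local n-gram context before attention.
            self.q = nn.Conv1d(dim, dim, kernel_size, padding=pad)
            self.k = nn.Conv1d(dim, dim, kernel_size, padding=pad)
            self.v = nn.Conv1d(dim, dim, kernel_size, padding=pad)

        def forward(self, x):                      # x: (batch, seq_len, dim)
            h = x.transpose(1, 2)                  # (batch, dim, seq_len) for Conv1d
            q = self.q(h).transpose(1, 2)
            k = self.k(h).transpose(1, 2)
            v = self.v(h).transpose(1, 2)
            # Standard scaled dot-product self-attention over the convolved projections.
            scores = torch.matmul(q, k.transpose(1, 2)) / math.sqrt(q.size(-1))
            return torch.matmul(F.softmax(scores, dim=-1), v)   # (batch, seq_len, dim)

    class HierarchicalClassifier(nn.Module):
        """Word-level block encodes each sentence; sentence-level block encodes the document."""
        def __init__(self, dim, num_classes):
            super().__init__()
            self.word_block = ConvAttentionBlock(dim)
            self.sent_block = ConvAttentionBlock(dim)
            self.out = nn.Linear(dim, num_classes)

        def forward(self, docs):                   # docs: (batch, n_sents, n_words, dim)
            b, s, w, d = docs.shape
            words = self.word_block(docs.view(b * s, w, d))
            sent_vecs = words.mean(dim=1).view(b, s, d)    # pool words into sentence vectors
            sents = self.sent_block(sent_vecs)
            return self.out(sents.mean(dim=1))             # pool sentences into document logits

The hierarchical split mirrors the document structure: because attention and convolution are applied per sentence and then per document rather than over the full token sequence at once, the model avoids the sequential dependencies of recurrent encoders, which is where the reported training-speed advantage comes from.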