Very deep convolutional networks for text classification

Alexis Conneau, Holger Schwenk, Yann LeCun, Loïc Barrault

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The dominant approaches for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks. However, these architectures are rather shallow in comparison to the deep convolutional networks which have pushed the state of the art in computer vision. We present a new architecture (VDCNN) for text processing which operates directly at the character level and uses only small convolutions and pooling operations. We are able to show that the performance of this model increases with depth: using up to 29 convolutional layers, we report improvements over the state of the art on several public text classification tasks. To the best of our knowledge, this is the first time that very deep convolutional nets have been applied to text processing.
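
The abstract describes the architecture only at a high level: character-level input, small (size 3) convolutions interleaved with pooling, and up to 29 convolutional layers. The sketch below, in PyTorch, shows what a network in that style can look like; it is an illustrative reconstruction, not the authors' released model, and the character embedding size, channel widths, number of blocks, and the pooling size are assumptions chosen for the example.

# A minimal VDCNN-style character-level classifier, sketched in PyTorch.
# Illustrative only: layer widths, embedding size, and pooling choices are
# assumptions, not a reproduction of the authors' configuration.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Two size-3 temporal convolutions, each followed by batch norm and ReLU."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)


class VDCNNSketch(nn.Module):
    def __init__(self, vocab_size=70, embed_dim=16, num_classes=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # character lookup table
        self.conv0 = nn.Conv1d(embed_dim, 64, kernel_size=3, padding=1)
        stages = []
        widths = [64, 128, 256, 512]
        for in_ch, out_ch in zip(widths[:-1], widths[1:]):
            stages.append(ConvBlock(in_ch, in_ch))                           # block at current width
            stages.append(nn.MaxPool1d(kernel_size=3, stride=2, padding=1))  # halve temporal length
            stages.append(ConvBlock(in_ch, out_ch))                          # widen channels
        self.stages = nn.Sequential(*stages)
        # Coarse stand-in for k-max pooling: max over 8 temporal bins rather
        # than the 8 largest activations per feature map.
        self.pool = nn.AdaptiveMaxPool1d(8)
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 8, 2048),
            nn.ReLU(inplace=True),
            nn.Linear(2048, num_classes),
        )

    def forward(self, chars):                       # chars: (batch, seq_len) character ids
        x = self.embed(chars).transpose(1, 2)       # -> (batch, embed_dim, seq_len)
        x = self.conv0(x)
        x = self.stages(x)
        x = self.pool(x)
        return self.classifier(x)


if __name__ == "__main__":
    model = VDCNNSketch()
    logits = model(torch.randint(0, 70, (2, 1024)))  # two dummy 1024-character documents
    print(logits.shape)                              # torch.Size([2, 4])

This sketch stacks 13 convolutional layers; the deeper variants reported in the paper are obtained by repeating convolutional blocks within each stage before pooling.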

Original language: English (US)
Title of host publication: Long Papers - Continued
Publisher: Association for Computational Linguistics (ACL)
Pages: 1107-1116
Number of pages: 10
Volume: 1
ISBN (Electronic): 9781510838604
State: Published - Jan 1 2017
Event: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Valencia, Spain
Duration: Apr 3, 2017 - Apr 7, 2017

Other

Other: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017
Country: Spain
City: Valencia
Period: 4/3/17 - 4/7/17

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics

Cite this

Conneau, A., Schwenk, H., LeCun, Y., & Barrault, L. (2017). Very deep convolutional networks for text classification. In Long Papers - Continued (Vol. 1, pp. 1107-1116). Association for Computational Linguistics (ACL).
