A compact encoding for efficient character-level deep text classification

Marinho, Wemerson; Martí, Luis; Sánchez Pi, Nayat

Keywords: Text classification, character-level convolutional neural networks, encoding of words

Abstract

This paper puts forward a new text-to-tensor representation that relies on information compression techniques to assign shorter codes to the most frequently used characters. This representation is language-independent, requires no pretraining, and produces an encoding with no information loss. It provides an adequate description of the morphology of text, as it represents prefixes, declensions, and inflections with similar vectors and can represent even words unseen in the training dataset. Likewise, because it is compact yet sparse, it is well suited to speeding up training with tensor processing libraries. As part of this paper, we show that this technique is especially effective when coupled with convolutional neural networks (CNNs) for character-level text classification. We apply two CNN variants coupled with it. Experimental results show that it drastically reduces the number of parameters to be …
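The abstract does not detail the exact compression scheme, so the following Python sketch is an illustration only, assuming a Huffman-style prefix code over character frequencies: the most frequent characters receive the shortest bit strings, and each code is zero-padded to a fixed width so that a text maps to a compact, sparse 0/1 matrix of shape (text length, maximum code length). The names build_huffman_codes and encode_text are hypothetical and not taken from the paper.

from collections import Counter
import heapq

def build_huffman_codes(corpus):
    # Count character frequencies and build a Huffman tree bottom-up,
    # so the most frequent characters end up with the shortest codes.
    freq = Counter(corpus)
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)                      # tie-breaker so dicts are never compared
    if len(heap) == 1:                   # degenerate one-character corpus
        return {ch: "0" for ch in heap[0][2]}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]                    # {character: bit string}

def encode_text(text, codes, width):
    # Map each character to a fixed-width 0/1 row; because the codes are
    # prefix-free, zero-padding keeps distinct characters distinct.
    rows = []
    for ch in text:
        bits = codes.get(ch, "")         # characters unseen at build time -> all-zero row
        rows.append([int(b) for b in bits] + [0] * (width - len(bits)))
    return rows                          # shape (len(text), width), ready to stack into a tensor

corpus = "the quick brown fox jumps over the lazy dog"
codes = build_huffman_codes(corpus)
width = max(len(c) for c in codes.values())
print(codes[" "], codes["e"])            # frequent characters get the shortest codes
print(encode_text("fox", codes, width))  # one fixed-width row per character

Because the codes are prefix-free, the zero-padded rows stay distinct, so the mapping is lossless; frequent characters simply activate fewer positions, which keeps the resulting tensor both compact and sparse.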

More information

Publisher: IEEE Computer Society
Publication date: 2018
Start/end year: 2018
First page: 1
Last page: 8
Language: English
URL: https://ieeexplore.ieee.org/document/8489139
DOI: 10.1109/IJCNN.2018.8489139