Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations

Araujo, Vladimir; Villa, Andrés; Moens, Marie-Francine

Abstract

Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach can predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance on 6 out of 11 tasks by excelling in discourse relationship detection.
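The abstract describes top-down connections at intermediate encoder layers that predict representations of future sentences. The Python sketch below is a rough illustration of that idea only, not the authors' actual architecture: the class names (PredictiveCodingBert, PCHead), the choice of layers, the use of the [CLS] vector as the sentence representation, and the cosine predictive loss are all assumptions made for this example.

# Illustrative sketch: top-down predictor heads at intermediate BERT layers
# that try to predict the *next* sentence's representation, in the spirit of
# predictive coding. All design choices here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer

class PCHead(nn.Module):
    # Top-down predictor: maps a sentence vector at one layer to a
    # prediction of the next sentence's vector at that same layer.
    def __init__(self, hidden_size):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, sent_vec):
        return self.proj(sent_vec)

class PredictiveCodingBert(nn.Module):
    def __init__(self, layers=(4, 8, 12)):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.layers = layers
        hidden = self.encoder.config.hidden_size
        self.heads = nn.ModuleDict({str(l): PCHead(hidden) for l in layers})

    def sentence_vectors(self, input_ids, attention_mask):
        out = self.encoder(input_ids, attention_mask=attention_mask,
                           output_hidden_states=True)
        # Take the [CLS] position at each selected layer as the sentence vector.
        return {l: out.hidden_states[l][:, 0] for l in self.layers}

    def pc_loss(self, cur_vecs, next_vecs):
        # Penalize cosine mismatch between each layer's top-down prediction
        # and the actual next-sentence vector (target detached).
        loss = 0.0
        for l in self.layers:
            pred = self.heads[str(l)](cur_vecs[l])
            target = next_vecs[l].detach()
            loss = loss + (1.0 - F.cosine_similarity(pred, target, dim=-1)).mean()
        return loss / len(self.layers)

tok = BertTokenizer.from_pretrained("bert-base-uncased")
model = PredictiveCodingBert()
a = tok(["The meeting ran late."], return_tensors="pt")
b = tok(["Everyone left exhausted."], return_tensors="pt")
cur = model.sentence_vectors(a["input_ids"], a["attention_mask"])
nxt = model.sentence_vectors(b["input_ids"], b["attention_mask"])
aux_loss = model.pc_loss(cur, nxt)  # would be combined with the MLM objective

In a pre-training setup of this kind, the auxiliary loss would be added to the usual masked language modeling objective; because only sentence vectors, rather than individual tokens, are predicted, the extra computation stays small.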

More information

Title according to WOS: Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations
Title according to SCOPUS: Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations
Venue: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2021
Conference dates: 7 November 2021 through 11 November 2021
End page: 3022
Language: English
DOI: 10.18653/v1/2021.emnlp-main.240

Notes: ISI, SCOPUS