On the von Neumann entropy of language networks
Abstract
Words are not isolated entities within a language. In this paper, we measure the number of choices transmitted in natural language by means of the von Neumann entropy of language networks. This quantity, borrowed from quantum information theory, provides a detailed characterization of network complexity. The analysis is based on a large parallel corpus of 362 languages across 55 linguistic families, focusing on a sub-sample of 85 languages from the Americas. From this corpus, we constructed language networks as a simple way to describe the word connectivity patterns of each language. We studied several aspects of the von Neumann entropy of language networks. First, we discovered large groups of languages with low average degree and high von Neumann entropy. The results also suggested that high von Neumann entropy is associated with word entropy (a proxy for morphological complexity) and is inversely related to degree regularity. This indicates that pressures are at play that keep a balance between morphological complexity and the patterns of connections between words. We also suggested a strong influence of function words in languages with low von Neumann entropy. Our approach is thus a simple network-based contribution to cross-linguistic comparison from textual data.
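The paper's computational pipeline is not reproduced here, but the quantities named in the abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example: it assumes a simple co-occurrence definition of the language network and the common Laplacian-based density matrix rho = L / Tr(L), whose eigenvalue spectrum gives the von Neumann entropy S = -sum_i lambda_i log lambda_i. The window size, network construction, and normalization used by the authors may differ.

```python
# Illustrative sketch (not the authors' exact pipeline): build a word
# co-occurrence network from a token sequence and compute its von Neumann
# entropy via the rescaled graph Laplacian, plus the Shannon word entropy
# used as a proxy for morphological complexity.
import numpy as np
import networkx as nx


def cooccurrence_network(tokens, window=2):
    """Link each word to the words appearing within `window` positions after it."""
    g = nx.Graph()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            if w != tokens[j]:
                g.add_edge(w, tokens[j])
    return g


def von_neumann_entropy(g):
    """Von Neumann entropy of a graph from the density matrix rho = L / Tr(L)."""
    L = nx.laplacian_matrix(g).toarray().astype(float)
    rho = L / np.trace(L)                  # trace-normalized Laplacian
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]     # drop numerical zeros (0 log 0 = 0)
    return float(-np.sum(eigvals * np.log2(eigvals)))


def word_entropy(tokens):
    """Shannon entropy over word frequencies."""
    _, counts = np.unique(tokens, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))


if __name__ == "__main__":
    text = "the dog saw the cat and the cat saw the dog".split()
    g = cooccurrence_network(text, window=2)
    print("average degree:", 2 * g.number_of_edges() / g.number_of_nodes())
    print("von Neumann entropy:", von_neumann_entropy(g))
    print("word entropy:", word_entropy(text))
```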
More information
Journal: EUROPHYSICS LETTERS
Volume: 136
Publisher: IOP Publishing, EDP Sciences, Italian Physical Society
Publication date: 2021
First page: 68003-p1
Last page: 68003-p7
Language: English
URL: https://doi.org/10.1209/0295-5075/ac39ee