Sufficient conditions for the convergence of the Shannon differential entropy
Keywords: sufficient conditions, Shannon differential entropy, information divergence, total variation, convergence of probability measures, information theory
Abstract
This work revisits and extends results concerning the convergence of the Shannon differential entropy. Concrete connections are established with the convergence of probability measures in total variation and in (direct and reverse) information divergence. In particular, under uniform boundedness conditions on the sequence of probability measures, the results establish that convergence in information divergence is sufficient to guarantee the convergence of the differential entropy functional. © 2011 IEEE.
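The abstract's claim can be illustrated numerically in a simple parametric case. The sketch below (an assumption for illustration, not taken from the paper) uses a sequence of zero-mean Gaussians N(0, σ_n²) with σ_n → σ, for which both the information (KL) divergence to the limit N(0, σ²) and the differential entropy have closed forms; as the divergence vanishes, the differential entropies converge, consistent with the stated sufficiency result.

```python
import math

def diff_entropy_gaussian(sigma):
    # Differential entropy of N(0, sigma^2): 0.5 * ln(2*pi*e*sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def kl_gaussian(sigma_n, sigma):
    # Information divergence D(N(0, sigma_n^2) || N(0, sigma^2))
    # = ln(sigma/sigma_n) + sigma_n^2/(2*sigma^2) - 1/2
    return math.log(sigma / sigma_n) + sigma_n ** 2 / (2 * sigma ** 2) - 0.5

sigma = 1.0
h_limit = diff_entropy_gaussian(sigma)

# As sigma_n -> sigma, both the divergence and the entropy gap shrink to 0.
for n in (1, 10, 100, 1000):
    sigma_n = sigma + 1.0 / n
    kl = kl_gaussian(sigma_n, sigma)
    gap = abs(diff_entropy_gaussian(sigma_n) - h_limit)
    print(f"n={n:5d}  D={kl:.6e}  |h_n - h|={gap:.6e}")
```

This is only a sanity check in one well-behaved family; the paper's contribution is identifying general conditions under which such convergence is guaranteed.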
More information
Journal title: | 2011 IEEE INFORMATION THEORY WORKSHOP (ITW) |
Publisher: | IEEE |
Publication date: | 2011 |
Start page: | 608 |
End page: | 612 |
URL: | http://www.scopus.com/inward/record.url?eid=2-s2.0-83655202640&partnerID=q2rCbXpz |