CHARACTERIZING PROBABILISTIC STRUCTURE IN LEARNING USING INFORMATION SUFFICIENCY

Faraggi, Victor; Egana, A.; Pavez, Eduardo

Abstract

We use the concept of information sufficiency (IS) to represent probabilistic structures in machine learning (ML). Our main result provides a functional expression that characterizes the class of probabilistic models consistent with an IS encoder-decoder latent predictive structure. This result formally justifies the encoder-decoder forward stages many modern ML architectures adopt to learn latent (compressed) representations in data. To illustrate IS as a realistic and relevant model assumption, we revisit some known ML concepts and present some interesting new examples: invariant, robust, sparse, and digital models. © 2024 IEEE.
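For context only (this is a standard formulation from the information-theory literature and may differ in detail from the paper's own definition), a latent representation U = \eta(X) produced by an encoder \eta is said to be information sufficient for predicting Y when it retains all of the predictive information that X carries about Y:

\[
I(\eta(X); Y) = I(X; Y).
\]

For a deterministic encoder this is equivalent to the Markov chain Y -- \eta(X) -- X (i.e., Y \perp X \mid \eta(X)), so a decoder acting on \eta(X) alone loses no predictive power; this is the latent predictive structure the abstract's encoder-decoder result addresses.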

More information

Title according to WOS: CHARACTERIZING PROBABILISTIC STRUCTURE IN LEARNING USING INFORMATION SUFFICIENCY
Title according to SCOPUS: Characterizing Probabilistic Structure in Learning Using Information Sufficiency
Journal title: IEEE International Workshop on Machine Learning for Signal Processing, MLSP
Publisher: IEEE Computer Society
Publication date: 2024
Start/End dates: 22-25 September 2024
Language: English
URL: https://doi.org/10.1109/MLSP58920.2024.10734735
DOI: 10.1109/MLSP58920.2024.10734735

Notes: ISI, SCOPUS