Upper bound Kullback-Leibler divergence for transient hidden Markov models

Silva J.; Narayanan S.

Abstract

This paper reports an upper bound for the Kullback-Leibler divergence (KLD) for a general family of transient hidden Markov models (HMMs). An upper bound KLD (UBKLD) expression for Gaussian mixture models (GMMs) is presented, which is then generalized to the case of HMMs. Moreover, this formulation is extended to the case of HMMs with nonemitting states, where, under some general assumptions, the UBKLD is proved to be well defined for a general family of transient models. In particular, the UBKLD has a computationally efficient closed form for HMMs with a left-to-right topology and a final nonemitting state, which we refer to as left-to-right transient HMMs. Finally, the usefulness of the closed-form expression is experimentally evaluated for automatic speech recognition (ASR) applications, where left-to-right transient HMMs are used to model basic acoustic-phonetic units. Results show that the UBKLD is an accurate discrimination indicator for comparing acoustic HMMs used for ASR. © 2008 IEEE.
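For illustration, the building block behind a UBKLD for GMMs is a component-matched upper bound on the KLD between two mixtures, obtainable from the log-sum inequality. Below is a minimal Python sketch of such a bound, assuming both mixtures have the same number of components and a fixed component pairing; it follows the spirit of the UBKLD described in the abstract, but the paper's exact expression may differ in detail. The function names (kl_gaussian, ubkld_gmm) are illustrative, not from the paper.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KLD between two multivariate Gaussians N(mu0, cov0) || N(mu1, cov1)."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)
        + diff @ cov1_inv @ diff
        - k
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def ubkld_gmm(weights_f, mus_f, covs_f, weights_g, mus_g, covs_g):
    """Component-matched upper bound on D(f || g) for two GMMs with the same
    number of components, via the log-sum inequality:
        D(f || g) <= sum_i w_i * ( log(w_i / v_i) + D(f_i || g_i) )
    """
    bound = 0.0
    for w, mu_f, cov_f, v, mu_g, cov_g in zip(
        weights_f, mus_f, covs_f, weights_g, mus_g, covs_g
    ):
        bound += w * (np.log(w / v) + kl_gaussian(mu_f, cov_f, mu_g, cov_g))
    return bound

if __name__ == "__main__":
    # Sanity check: with single-component "mixtures" the bound reduces to the
    # exact Gaussian KLD, here D(N(0,1) || N(1,1)) = 0.5.
    mu0, mu1, cov = np.array([0.0]), np.array([1.0]), np.eye(1)
    print(ubkld_gmm([1.0], [mu0], [cov], [1.0], [mu1], [cov]))  # -> 0.5
```

Because each term is a closed-form Gaussian KLD weighted by mixture weights, the bound is cheap to evaluate, which is what makes the analogous HMM-level expression attractive for comparing acoustic models in ASR.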

More information

Title according to WOS and SCOPUS: Upper bound Kullback-Leibler divergence for transient hidden Markov models
Journal: IEEE TRANSACTIONS ON SIGNAL PROCESSING
Volume: 56
Issue: 9
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Publication date: 2008
Start page: 4176
End page: 4188
Language: English
URL: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4599176
DOI: 10.1109/TSP.2008.924137

Notes: ISI, SCOPUS