MIHMMs: Mutual Information Hidden Markov Models

This paper proposes a new family of Hidden Markov Models (HMMs)
named Mutual Information Hidden Markov Models (MIHMMs). MIHMMs
have the same graphical structure as HMMs. However, the objective
function optimized during learning is not the joint likelihood of
the observations and the hidden states; rather, it is a convex
combination of the mutual information between the hidden states
and the observations and the joint likelihood of the observations
and the states (sketched below). First, we present both theoretical
and practical motivations for such an objective function. Next, we derive
the parameter estimation (learning) equations for both the
discrete and continuous observation cases. Finally, we demonstrate
the superiority of our approach on different classification tasks
by comparing the classification performance of the proposed Mutual
Information HMMs with that of standard maximum-likelihood HMMs on
synthetic and real, discrete and continuous, supervised and
unsupervised data. We believe that MIHMMs are a powerful tool for
solving many of the problems that arise when HMMs are used for
classification and/or clustering.
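As a rough sketch of the objective described above, with notation that is assumed here rather than taken from this page (Q for the hidden-state sequence, X for the observation sequence, and a trade-off weight alpha), the convex combination can be written in LaTeX as

  \[
    \mathcal{F}_{\alpha} \;=\; (1 - \alpha)\, I(Q; X) \;+\; \alpha \, \log P(X, Q),
    \qquad \alpha \in [0, 1].
  \]

Under this reading, alpha = 1 recovers the standard maximum-likelihood HMM objective, while smaller values of alpha increasingly reward hidden states that are informative about the observations; which term receives which weight is likewise an assumption of this sketch.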

Papers

MIHMMs: Mutual Information Hidden Markov Models
N. Oliver and A. Garg.
In Proceedings of the International Conference on Machine Learning (ICML 2002).
