An Investigation on Mutual Information for the Linear Predictive System and the Extrapolation of Speech Signals
Conference: Sprachkommunikation - Beiträge zur 10. ITG-Fachtagung
09/26/2012 - 09/28/2012 at Braunschweig, Germany
Proceedings: Sprachkommunikation
Pages: 4
Language: English
Type: PDF
Authors:
Taghia, Jalal; Martin, Rainer (Institute of Communication Acoustics, Ruhr-Universität Bochum, Bochum, Germany)
Taghia, Jalil; Leijon, Arne (Sound and Image Processing Lab, KTH Royal Institute of Technology, Stockholm, Sweden)
Abstract:
Mutual information (MI) is an important information-theoretic concept with many applications in telecommunications, blind source separation, and machine learning. More recently, it has also been employed for the instrumental assessment of speech intelligibility, where traditionally correlation-based measures are used. In this paper, we address the difference between MI and correlation from the viewpoint of discovering dependencies between variables in the context of speech signals. We perform our investigation by considering the linear predictive approximation and the extrapolation of speech signals as examples. We compare a parametric MI estimation approach based on a Gaussian mixture model (GMM) with the k-nearest neighbor (KNN) approach, a well-known non-parametric method for estimating MI. We show that the GMM-based MI estimator leads to more consistent results.
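To illustrate the two estimation strategies contrasted in the abstract, the sketch below compares a non-parametric KNN-based MI estimate with a simple parametric GMM-based one on synthetic bivariate Gaussian data, for which the true MI is known in closed form. This is only an illustrative example under stated assumptions; it uses scikit-learn's Kraskov-style KNN estimator and a generic GMM Monte-Carlo approximation, and is not the authors' implementation or their experimental setup with speech signals.

```python
# Minimal sketch: KNN-based vs. GMM-based MI estimation on synthetic data.
# For a bivariate Gaussian with correlation rho, the true MI is -0.5*log(1 - rho^2).
# Assumption: scikit-learn and NumPy are available; this is not the paper's code.
import numpy as np
from sklearn.feature_selection import mutual_info_regression  # Kraskov-style KNN MI estimator
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
rho, n = 0.8, 5000
cov = [[1.0, rho], [rho, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
x, y = xy[:, 0], xy[:, 1]

# (a) Non-parametric KNN estimate of MI between x and y.
mi_knn = mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=3)[0]

# (b) Parametric GMM estimate: fit GMMs to the joint and to each marginal,
#     then approximate MI = E[log p(x,y) - log p(x) - log p(y)] by a sample mean.
gmm_xy = GaussianMixture(n_components=4, random_state=0).fit(xy)
gmm_x = GaussianMixture(n_components=4, random_state=0).fit(x.reshape(-1, 1))
gmm_y = GaussianMixture(n_components=4, random_state=0).fit(y.reshape(-1, 1))
mi_gmm = np.mean(
    gmm_xy.score_samples(xy)
    - gmm_x.score_samples(x.reshape(-1, 1))
    - gmm_y.score_samples(y.reshape(-1, 1))
)

print(f"true MI      : {-0.5 * np.log(1 - rho**2):.3f} nats")
print(f"KNN estimate : {mi_knn:.3f} nats")
print(f"GMM estimate : {mi_gmm:.3f} nats")
```

With enough samples both estimates approach the analytic value; the example merely shows the mechanics of the two families of estimators, whereas the paper evaluates their consistency on linear predictive approximation and extrapolation of speech signals.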