Clustering of Speech Signal Autoregressive Models by Kullback-Leibler Information Divergence Minimum Criterion
Keywords:
Automatic Speech Recognition, Autoregressive Model, Information Divergence, Centroid, Cluster

Abstract
We solve the problem of clustering a set of speech-signal autoregressive models within an information-theoretic framework. To this end, we develop an algorithm that finds the optimal parameters of an autoregressive model in the sense of minimum Kullback-Leibler information divergence, and use it to modify the well-known k-means clustering algorithm. We evaluate the resulting algorithms experimentally on speaker-independent isolated-word recognition with discrete hidden Markov models. The best recognition accuracy is achieved with warped linear predictive coefficients as the feature vector and a vector-quantizer codebook of size 256.
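The clustering scheme described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it approximates the KL divergence rate between two zero-mean Gaussian AR processes through their power spectral densities, via the spectral formula (1/4π)∫[S₁/S₂ − ln(S₁/S₂) − 1]dω, and replaces the paper's centroid derivation with a simpler medoid update (the cluster member minimizing total divergence). All function names and the medoid simplification are assumptions of this sketch.

```python
import numpy as np

def ar_psd(a, noise_var, n_freq=128):
    """Power spectral density of an AR model x[t] = sum_k a[k] x[t-k] + e[t]
    on a uniform frequency grid over [0, pi]."""
    w = np.linspace(0.0, np.pi, n_freq)
    k = np.arange(1, len(a) + 1)
    # |A(e^{-jw})|^2 with A(z) = 1 - sum_k a_k z^{-k}
    denom = np.abs(1.0 - np.exp(-1j * np.outer(w, k)) @ a) ** 2
    return noise_var / denom

def kl_rate(m1, m2, n_freq=128):
    """Approximate KL divergence rate between two stationary Gaussian AR
    processes, computed from the ratio of their spectral densities."""
    s1 = ar_psd(*m1, n_freq)
    s2 = ar_psd(*m2, n_freq)
    r = s1 / s2
    return 0.5 * float(np.mean(r - np.log(r) - 1.0))

def kl_kmedoids(models, k, n_iter=20, seed=0):
    """k-means-style clustering of AR models under KL divergence.
    NOTE: uses a medoid update instead of the paper's optimal centroid."""
    rng = np.random.default_rng(seed)
    n = len(models)
    # Directed divergence matrix: d[i, j] = KL(model_i || model_j).
    d = np.array([[kl_rate(models[i], models[j]) for j in range(n)]
                  for i in range(n)])
    centers = list(rng.choice(n, size=k, replace=False))
    for _ in range(n_iter):
        # Assign each model to the nearest center in KL divergence.
        labels = np.argmin(d[:, centers], axis=1)
        new_centers = []
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members) == 0:
                new_centers.append(centers[c])
                continue
            # Medoid: member minimizing total divergence from the cluster.
            costs = d[np.ix_(members, members)].sum(axis=0)
            new_centers.append(int(members[np.argmin(costs)]))
        if new_centers == centers:
            break
        centers = new_centers
    return labels, centers
```

For example, first-order models with poles at 0.9 and −0.9 have clearly different spectra and end up in different clusters, while nearby poles (0.9 vs. 0.85) cluster together.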