Learning Hidden Markov Models from the state distribution oracle
Document Type
Conference Proceeding
Publication Date
12-1-2004
Abstract
A Hidden Markov Model (HMM) is a probabilistic model that has been widely applied in a number of fields since its inception over 30 years ago. Computational Biology, Speech Recognition, and Image Processing are but a few of the application areas of HMMs. We propose an efficient algorithm for learning the parameters of a first-order HMM from a state distribution (SD) oracle. The SD oracle provides the learner with the state distribution vector corresponding to a query string in the model. The SD oracle is shown to be necessary for polynomial-time learning in the sense that the consistency problem of learning HMM parameters from a training set of state distribution vectors, without the ability to query the SD oracle, is NP-complete. The proposed learning algorithm is based on an algorithm described by Tzeng for learning Probabilistic Automata. ©2004 IEEE.
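The abstract treats the SD oracle as a black box that maps a query string to a state distribution vector. As one plausible reading of that interface, the sketch below (hypothetical function and parameter names, NumPy assumed) computes the distribution over hidden states of a known first-order HMM after it has emitted the query string, i.e., the normalized forward probabilities; the paper's exact oracle semantics may differ.

```python
import numpy as np

def sd_oracle(query, pi, A, B, symbols):
    """Hypothetical SD oracle: return the distribution over hidden states
    after the HMM has emitted the query string (normalized forward pass).

    pi      -- initial state distribution, shape (n_states,)
    A       -- transition matrix, A[i, j] = P(next state j | state i)
    B       -- emission matrix, B[i, k] = P(symbol k | state i)
    symbols -- alphabet, in the column order used by B
    """
    idx = {s: k for k, s in enumerate(symbols)}
    alpha = pi * B[:, idx[query[0]]]          # forward probabilities for the first symbol
    for sym in query[1:]:
        alpha = (alpha @ A) * B[:, idx[sym]]  # propagate through A, weight by emission
    return alpha / alpha.sum()                # normalize to a state distribution vector

# Example: a 2-state HMM over the alphabet {'a', 'b'}
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])
print(sd_oracle("abb", pi, A, B, ['a', 'b']))
```

In the learning setting of the paper, of course, the parameters pi, A, and B are unknown to the learner; the oracle answers such queries on the learner's behalf.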
Publication Source (Journal or Book title)
Proceedings of the 2004 International Conference on Machine Learning and Applications, ICMLA '04
First Page
73
Last Page
80
Recommended Citation
Moscovich, L., & Chen, J. (2004). Learning Hidden Markov Models from the state distribution oracle. Proceedings of the 2004 International Conference on Machine Learning and Applications, ICMLA '04, 73-80. Retrieved from https://repository.lsu.edu/eecs_pubs/2411