We present and analyze three different online algorithms for learning in discrete Hidden Markov Models (HMMs) and compare their performance with that of the Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of the generalization error, we draw learning curves in simplified situations and compare the results. The performance of one of the presented algorithms when learning drifting concepts is analyzed and compared with that of the Baldi-Chauvin algorithm in the same situations. A brief discussion of learning and symmetry breaking, based on our results, is also presented.
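For reference, the generalization error mentioned above is, in its standard form, the Kullback-Leibler divergence between the distribution $P$ of observation sequences generated by the true HMM and the distribution $Q$ generated by the learned model; the paper's exact conditioning and sequence length are not reproduced here, so this is only the textbook definition:

\[
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{\mathbf{y}} P(\mathbf{y}) \,\ln \frac{P(\mathbf{y})}{Q(\mathbf{y})},
\]

where the sum runs over all observable sequences $\mathbf{y}$, and $D_{\mathrm{KL}} = 0$ exactly when the learned model reproduces the true sequence distribution.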
© 2006 American Institute of Physics.