Online learning in discrete hidden Markov models

Abstract

We present and analyze three different online algorithms for learning in discrete Hidden Markov Models (HMMs) and compare their performance with the Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of the generalization error, we draw learning curves in simplified situations and compare the results. The performance of one of the presented algorithms in learning drifting concepts is analyzed and compared with that of the Baldi-Chauvin algorithm in the same situations. A brief discussion of learning and symmetry breaking based on our results is also presented.
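To illustrate the kind of measure the abstract refers to, the sketch below estimates a Kullback-Leibler divergence between a "teacher" HMM and a "student" HMM by Monte Carlo sampling of observation sequences. This is not the authors' code: the HMM parameterisation (initial distribution pi, transition matrix A, emission matrix B), the sequence length T, and the number of sampled sequences are all assumptions made for the example, and the paper's exact definition of the generalization error may differ.

```python
# Minimal sketch (not the authors' code): Monte Carlo estimate of
# KL(teacher || student) over observation sequences of a discrete HMM,
# a common way to quantify generalization error in teacher-student setups.
import numpy as np

rng = np.random.default_rng(0)

def sample_sequence(pi, A, B, T):
    """Draw an observation sequence of length T from a discrete HMM (pi, A, B)."""
    K, M = B.shape                        # K hidden states, M output symbols
    obs = np.empty(T, dtype=int)
    state = rng.choice(K, p=pi)
    for t in range(T):
        obs[t] = rng.choice(M, p=B[state])   # emit symbol from current state
        state = rng.choice(K, p=A[state])    # move to next hidden state
    return obs

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | pi, A, B)."""
    alpha = pi * B[:, obs[0]]
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_like += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_like

def kl_estimate(teacher, student, n_seq=500, T=50):
    """Average of log P_teacher(s) - log P_student(s) over teacher-generated sequences."""
    total = 0.0
    for _ in range(n_seq):
        obs = sample_sequence(*teacher, T)
        total += log_likelihood(obs, *teacher) - log_likelihood(obs, *student)
    return total / n_seq

# Toy teacher and student with 2 hidden states and 3 symbols (illustrative values only).
teacher = (np.array([0.6, 0.4]),
           np.array([[0.7, 0.3], [0.2, 0.8]]),
           np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
student = (np.array([0.5, 0.5]),
           np.array([[0.6, 0.4], [0.3, 0.7]]),
           np.array([[0.4, 0.4, 0.2], [0.2, 0.3, 0.5]]))
print(kl_estimate(teacher, student))
```

In a learning-curve experiment of the kind described in the abstract, an estimate like this would be recomputed as the student's parameters are updated online, and plotted against the number of presented examples.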

Publication DOI: https://doi.org/10.1063/1.2423274
Divisions: College of Engineering & Physical Sciences > Systems Analytics Research Institute (SARI)
College of Engineering & Physical Sciences
Additional Information: © 2007 The Authors
Event Title: Bayesian Inference and Maximum Entropy Methods in Science and Engineering
Event Type: Other
Event Dates: 2006-07-08 - 2006-07-13
Uncontrolled Keywords: Bayesian algorithm, generalization error, HMMs, online algorithm, Physics and Astronomy (all)
ISBN: 978-0-7354-0371-6
Last Modified: 24 Jan 2024 08:33
Date Deposited: 29 Apr 2019 09:32
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
http://scitatio ... .1063/1.2423274 (Publisher URL)
PURE Output Type: Conference contribution
Published Date: 2006-12-29
Authors: Alamino, Roberto C. (ORCID Profile 0000-0001-8224-2801)
Caticha, Nestor

Version: Accepted Version
