A deterministic inference framework for discrete nonparametric latent variable models: learning complex probabilistic models with simple algorithms

Abstract

Latent variable models provide a powerful framework for describing complex data by capturing its structure with a combination of more compact unobserved variables. The Bayesian approach to latent variable models additionally provides a consistent and principled framework for dealing with the uncertainty inherent in the data described by our model. However, in most Bayesian latent variable models we face the limitation that the number of unobserved variables has to be specified a priori. With increasingly large and complex data problems, such parametric models fail to make the most of the available data: any increase in the data passed into the model only affects the accuracy of the inferred posteriors, and the models fail to adapt to adequately capture newly arising structure. Flexible Bayesian nonparametric models can mitigate such challenges and allow us to learn arbitrarily complex representations, provided enough data is available. However, their use is restricted to applications in which computational resources are plentiful, because of the exhaustive sampling methods they require for inference. At the same time, we see that in practice, despite the large variety of flexible models available, simple algorithms such as K-means or the Viterbi algorithm remain the preferred tools for most real-world applications. This has motivated us in this thesis to borrow the flexibility provided by Bayesian nonparametric models, but to derive easy-to-use, scalable techniques which can be applied to large data problems and can be run on resource-constrained embedded hardware. We propose nonparametric model-based clustering algorithms nearly as simple as K-means which overcome most of its challenges and can infer the number of clusters from the data. Their potential is demonstrated in many different scenarios and applications, such as phenotyping Parkinson's disease and Parkinsonism-related conditions in an unsupervised way. With a few simple steps we derive a related approach for nonparametric analysis of longitudinal data which converges a few orders of magnitude faster than currently available sampling methods. The framework is extended to efficient inference in nonparametric sequential models, with example applications in behaviour extraction and DNA sequencing. We demonstrate that our methods can easily be extended to allow for flexible online learning in a realistic setup using severely limited computational resources. We develop a system capable of inferring nonparametric hidden Markov models online from streaming data using only embedded hardware. This allowed us to develop occupancy estimation technology using only a simple motion sensor.
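
To give a flavour of the kind of "K-means-like clustering that grows the number of clusters" mentioned in the abstract, the sketch below shows a DP-means-style hard-assignment loop (in the spirit of Kulis and Jordan's small-variance limit of the Dirichlet process mixture). A point further than a penalty threshold from every existing centre opens a new cluster. This is an illustrative sketch only: the function name `dp_means`, the penalty `lam`, and the Euclidean/spherical-cluster assumption are mine, and the thesis's own deterministic MAP-based algorithms may use a different assignment rule.

```python
# Illustrative DP-means-style sketch (NOT the thesis's exact algorithm):
# a K-means-like loop that can open new clusters when no existing centre
# is close enough, so the number of clusters is inferred from the data.
import numpy as np

def dp_means(X, lam, n_iter=50):
    """Cluster the rows of X (n x d array). A new cluster is created
    whenever the squared distance to the nearest centre exceeds `lam`."""
    centres = [X.mean(axis=0)]               # start with one cluster
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: nearest centre, or a brand-new cluster.
        for i, x in enumerate(X):
            d = [np.sum((x - c) ** 2) for c in centres]
            k = int(np.argmin(d))
            if d[k] > lam:                    # too far from every centre
                centres.append(x.copy())      # open a new cluster at x
                k = len(centres) - 1
            labels[i] = k
        # Update step: recompute centres of non-empty clusters,
        # keeping the old centre if a cluster happens to be empty.
        centres = [X[labels == k].mean(axis=0) if np.any(labels == k)
                   else centres[k]
                   for k in range(len(centres))]
    return labels, np.vstack(centres)

# Example usage on synthetic data:
# rng = np.random.default_rng(0)
# X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
# labels, centres = dp_means(X, lam=9.0)
```

The single penalty `lam` plays the role that the number of clusters K plays in K-means: larger values yield fewer, broader clusters, smaller values yield more, tighter ones.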

Divisions: Aston University (General)
Institution: Aston University
Uncontrolled Keywords: Bayesian nonparametrics, clustering, segmentation, mixture models, hidden Markov models
Last Modified: 08 Dec 2023 08:56
Date Deposited: 17 May 2019 12:31
Completed Date: 2017-06-16
Authors: Raykov, Yordan
