Simple approximate MAP inference for Dirichlet process mixtures

Abstract

The Dirichlet process mixture model (DPMM) is a ubiquitous, flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so that computationally intensive techniques such as Gibbs sampling are required. As a result, DPMM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. For example, they would not be practical for digital signal processing on embedded hardware, where computational resources are at a serious premium. Here, we develop a simplified yet statistically rigorous approximate maximum a-posteriori (MAP) inference algorithm for DPMMs. This algorithm is as simple as DP-means clustering, solves the MAP problem as well as Gibbs sampling, while requiring only a fraction of the computational effort. (For freely available code that implements the MAP-DP algorithm for Gaussian mixtures see http://www.maxlittle.net/.) Unlike related small variance asymptotics (SVA), our method is non-degenerate and so inherits the “rich get richer” property of the Dirichlet process. It also retains a non-degenerate closed-form likelihood which enables out-of-sample calculations and the use of standard tools such as cross-validation. We illustrate the benefits of our algorithm on a range of examples and contrast it to variational, SVA and sampling approaches from both a computational complexity perspective as well as in terms of clustering performance. We demonstrate the wide applicability of our approach by presenting an approximate MAP inference method for the infinite hidden Markov model whose performance contrasts favorably with a recently proposed hybrid SVA approach.
Similarly, we show how our algorithm can be applied to a semiparametric mixed-effects regression model where the random effects distribution is modelled using an infinite mixture model, as used in longitudinal progression modelling in population health science. Finally, we propose directions for future research on approximate MAP inference in Bayesian nonparametrics.
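The flavor of the algorithm family described in the abstract — hard assignments driven by a negative log-likelihood term plus a "rich get richer" log-count term, with a concentration-parameter penalty for opening a new cluster — can be conveyed with a minimal sketch. The sketch below is not the authors' MAP-DP implementation (see the URL in the abstract for that); it assumes spherical Gaussian clusters with a known, shared variance `sigma2`, a zero-mean Gaussian prior with variance `sigma2_0` on cluster means, and concentration parameter `N0`, all hypothetical simplifications for illustration.

```python
import numpy as np

def map_dp_sketch(X, N0=1.0, sigma2=1.0, sigma2_0=10.0, max_iter=100):
    """Illustrative MAP-DP-style hard clustering (NOT the published code).

    Assumes spherical Gaussians with known variance sigma2 and a zero-mean
    Gaussian prior (variance sigma2_0) on cluster means; N0 is the Dirichlet
    process concentration parameter.
    """
    N, D = X.shape
    z = np.zeros(N, dtype=int)  # start with all points in one cluster
    K = 1
    for _ in range(max_iter):
        changed = False
        for i in range(N):
            # cluster sizes excluding point i ("rich get richer" weights)
            counts = np.bincount(np.delete(z, i), minlength=K)
            costs = np.full(K + 1, np.inf)
            for k in range(K):
                if counts[k] == 0:
                    continue
                # crude plug-in mean (includes x_i; acceptable for a sketch)
                mu = X[z == k].mean(axis=0)
                # negative log-likelihood minus log of cluster size
                costs[k] = np.sum((X[i] - mu) ** 2) / (2 * sigma2) \
                    - np.log(counts[k])
            # cost of opening a new cluster: prior predictive, minus log(N0)
            costs[K] = np.sum(X[i] ** 2) / (2 * (sigma2 + sigma2_0)) \
                - np.log(N0)
            k_new = int(np.argmin(costs))
            if k_new != z[i]:
                z[i] = k_new
                changed = True
                if k_new == K:
                    K += 1
        # compact labels, dropping any emptied clusters
        labels = np.unique(z)
        z = np.searchsorted(labels, z)
        K = len(labels)
        if not changed:
            break
    return z, K
```

Unlike DP-means, the `-log(counts[k])` term keeps the size-biased ("rich get richer") preference of the Dirichlet process rather than degenerating to a pure distance rule, which is the qualitative distinction from SVA approaches that the abstract highlights.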

Publication DOI: https://doi.org/10.1214/16-EJS1196
Divisions: College of Engineering & Physical Sciences
College of Engineering & Physical Sciences > Systems analytics research institute (SARI)
Additional Information: Creative Commons Attribution License. Authors retain ownership of the copyright for their article, but authors allow anyone to download, reuse, reprint, modify, distribute, and/or copy articles in EJS, so long as the original authors and source are credited.
Uncontrolled Keywords: Bayesian nonparametrics, clustering, Gaussian mixture model, Statistics and Probability
Publication ISSN: 1935-7524
Last Modified: 22 Jan 2024 08:09
Date Deposited: 22 Nov 2016 08:50
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Article
Published Date: 2016-11
Published Online Date: 2016-11-16
Accepted Date: 2016-11-16
Submitted Date: 2015-05
Authors: Raykov, Yordan P. (ORCID Profile 0000-0003-0753-717X)
Boukouvalas, Alexios
Little, Max A. (ORCID Profile 0000-0002-1507-3822)

Version: Published Version

License: Creative Commons Attribution
