When quoting this document, please refer to the following URN: urn:nbn:de:0030-drops-21148

Seeger, Matthias ; Nickisch, Hannes

Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models



Sparsity is a fundamental concept of modern statistics, and often the only general principle currently available for addressing novel learning applications with many more variables than observations. While much progress has been made recently in the theoretical understanding and algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and there are virtually no algorithms for large-scale applications of these. We provide novel approximate Bayesian inference algorithms for sparse generalized linear models that can be used with hundreds of thousands of variables, and that run orders of magnitude faster than previous algorithms in domains where both apply. By analyzing our methods and establishing some novel convexity results, we settle a long-standing open question about variational Bayesian inference for continuous-variable models: the Gaussian lower bound relaxation, which has been used previously for a range of models, is proved to be a convex optimization problem if and only if the posterior mode is found by convex programming. Our algorithms reduce to the same computational primitives as commonly used sparse estimation methods, but additionally require Gaussian marginal variance estimation. We show how the Lanczos algorithm from numerical mathematics can be employed to compute the latter. Our focus here is Bayesian experimental design (which is mainly driven by efficient approximate inference), a powerful framework for optimizing measurement architectures for complex signals such as natural images. Designs optimized by our Bayesian framework strongly outperform choices advocated by compressed sensing theory, and with our novel algorithms we can scale the approach up to full-size images. Immediate applications of our method lie in digital photography and medical imaging.
We have applied our framework to problems of magnetic resonance imaging design and reconstruction, and part of this work appeared at a conference (Seeger et al., 2008). The present paper describes our methods in much greater generality, and most of the theory is novel. Experiments and evaluations will be given in a later paper.

BibTeX Entry

@InProceedings{SeegerNickisch2009,
  author =	{Matthias Seeger and Hannes Nickisch},
  title =	{Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty },
  year =	{2009},
  editor =	{J{\"u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  number =	{09181},
  series =	{Dagstuhl Seminar Proceedings},
  ISSN =	{1862-4405},
  publisher =	{Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik, Germany},
  address =	{Dagstuhl, Germany},
  URL =		{},
  annote =	{Keywords: Bayesian experimental design, variational inference, sparse estimation}
}

Keywords: Bayesian experimental design, variational inference, sparse estimation
Seminar: 09181 - Sampling-based Optimization in the Presence of Uncertainty
Issue Date: 2009
Date of publication: 30.07.2009
