Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models

Authors: Matthias Seeger, Hannes Nickisch



File: DagSemProc.09181.4.pdf (479 kB, 29 pages)

Cite As

Matthias Seeger and Hannes Nickisch. Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)
https://doi.org/10.4230/DagSemProc.09181.4

Abstract

Sparsity is a fundamental concept of modern statistics, and often the only general principle available at the moment for addressing novel learning applications with many more variables than observations. While much progress has been made recently in the theoretical understanding and algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and there are virtually no algorithms for large scale applications of these. We provide novel approximate Bayesian inference algorithms for sparse generalized linear models, which can be used with hundreds of thousands of variables and run orders of magnitude faster than previous algorithms in domains where both apply. By analyzing our methods and establishing some novel convexity results, we settle a long-standing open question about variational Bayesian inference for continuous variable models: the Gaussian lower bound relaxation, which has been used previously for a range of models, is proved to be a convex optimization problem if and only if the posterior mode is found by convex programming. Our algorithms reduce to the same computational primitives as commonly used sparse estimation methods do, but additionally require estimates of Gaussian marginal variances. We show how the Lanczos algorithm from numerical mathematics can be employed to compute the latter. Our interest here is in Bayesian experimental design (which is mainly driven by efficient approximate inference), a powerful framework for optimizing measurement architectures for complex signals such as natural images. Designs optimized by our Bayesian framework strongly outperform choices advocated by compressed sensing theory, and with our novel algorithms we can scale it up to full-size images. Immediate applications of our method lie in digital photography and medical imaging.
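The abstract mentions using the Lanczos algorithm to estimate Gaussian marginal variances, i.e. the diagonal of the inverse of a precision matrix that is too large to invert directly. A minimal sketch of this idea, assuming a dense symmetric positive-definite precision matrix `A` for illustration (in the paper's setting only matrix-vector products with `A` would be available), is the following: run k Lanczos steps to obtain an orthonormal basis Q and tridiagonal T = QᵀAQ, then approximate diag(A⁻¹) by diag(Q T⁻¹ Qᵀ). The function name and interface here are hypothetical, not the authors' code.

```python
import numpy as np

def lanczos_variances(A, k, rng=None):
    """Approximate diag(A^{-1}) (Gaussian marginal variances for a
    symmetric positive-definite precision matrix A) from k Lanczos
    steps with a random start vector. With k = n and no breakdown
    the result is exact; smaller k gives a cheap approximation."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    Q = np.zeros((n, k))
    alpha = np.zeros(k)           # diagonal of the tridiagonal T
    beta = np.zeros(max(k - 1, 0))  # off-diagonal of T
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for j in range(k):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # full reorthogonalization for numerical stability (costly,
        # but keeps this small sketch simple and correct)
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:   # breakdown: invariant subspace found
                Q = Q[:, : j + 1]
                alpha = alpha[: j + 1]
                beta = beta[:j]
                break
            q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    # diag(Q T^{-1} Q^T) approximates diag(A^{-1}), improving with k
    return np.diag(Q @ np.linalg.solve(T, Q.T)).copy()
```

In practice the Lanczos estimates tend to underestimate the true variances and improve monotonically as k grows, which is what makes a small number of steps usable inside an outer inference loop.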
We have applied our framework to problems of magnetic resonance imaging design and reconstruction, and part of this work appeared at a conference (Seeger et al., 2008). The present paper describes our methods in much greater generality, and most of the theory is novel. Experiments and evaluations will be given in a later paper.
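The experimental design loop sketched in the abstract needs a way to score candidate measurements. A generic sketch of the standard Gaussian information-gain criterion (not necessarily the authors' exact scoring rule): under a Gaussian posterior approximation with covariance Σ, appending a noisy linear measurement uᵀx + ε, ε ~ N(0, σ²), reduces the posterior entropy by ½ log(1 + uᵀΣu / σ²), so a greedy design step picks the candidate filter maximizing this gain. The function name and interface are hypothetical.

```python
import numpy as np

def design_score(Sigma, U, sigma2=1.0):
    """Greedy Bayesian experimental design step under a Gaussian
    posterior approximation.

    Sigma:  (n, n) current posterior covariance (approximation).
    U:      (m, n) candidate measurement filters, one per row.
    sigma2: measurement noise variance.

    Returns the information gains 0.5 * log(1 + u^T Sigma u / sigma2)
    for all candidates, and the index of the best one."""
    # quad[i] = U[i] @ Sigma @ U[i] for every candidate row at once
    quad = np.einsum('ij,jk,ik->i', U, Sigma, U)
    gains = 0.5 * np.log1p(quad / sigma2)
    return gains, int(np.argmax(gains))
```

For example, with Σ = diag(4, 1), unit noise, and the two coordinate filters as candidates, the first filter wins because the posterior is most uncertain along that axis; this is the sense in which design is "driven by" the marginal variance estimates above.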
Keywords
  • Bayesian experimental design
  • variational inference
  • sparse estimation
