Dagstuhl Seminar Proceedings, Volume 9181



Publication Details

  • Published: 2009-07-30
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik

Documents
Document
09181 Abstracts Collection – Sampling-based Optimization in the Presence of Uncertainty

Authors: Jürgen Branke, Barry L. Nelson, Warren Buckler Powell, and Thomas J. Santner


Abstract
This Dagstuhl seminar brought together researchers from statistical ranking and selection; experimental design and response-surface modeling; stochastic programming; approximate dynamic programming; optimal learning; and the design and analysis of computer experiments with the goal of attaining a much better mutual understanding of the commonalities and differences of the various approaches to sampling-based optimization, and to take first steps toward an overarching theory, encompassing many of the topics above.

Cite as

Jürgen Branke, Barry L. Nelson, Warren Buckler Powell, and Thomas J. Santner. 09181 Abstracts Collection – Sampling-based Optimization in the Presence of Uncertainty. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, pp. 1-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{branke_et_al:DagSemProc.09181.1,
  author =	{Branke, J\"{u}rgen and Nelson, Barry L. and Powell, Warren Buckler and Santner, Thomas J.},
  title =	{{09181 Abstracts Collection – Sampling-based Optimization in the Presence of Uncertainty}},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty},
  pages =	{1--15},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9181},
  editor =	{J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09181.1},
  URN =		{urn:nbn:de:0030-drops-21187},
  doi =		{10.4230/DagSemProc.09181.1},
  annote =	{Keywords: Optimal learning, optimization in the presence of uncertainty, simulation optimization, sequential experimental design, ranking and selection, random search, stochastic approximation, approximate dynamic programming}
}
Document
09181 Executive Summary – Sampling-based Optimization in the Presence of Uncertainty

Authors: Jürgen Branke, Barry L. Nelson, Warren Buckler Powell, and Thomas J. Santner


Abstract
This Dagstuhl seminar brought together researchers from statistical ranking and selection; experimental design and response-surface modeling; stochastic programming; approximate dynamic programming; optimal learning; and the design and analysis of computer experiments with the goal of attaining a much better mutual understanding of the commonalities and differences of the various approaches to sampling-based optimization, and to take first steps toward an overarching theory, encompassing many of the topics above.

Cite as

Jürgen Branke, Barry L. Nelson, Warren Buckler Powell, and Thomas J. Santner. 09181 Executive Summary – Sampling-based Optimization in the Presence of Uncertainty. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, pp. 1-3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{branke_et_al:DagSemProc.09181.2,
  author =	{Branke, J\"{u}rgen and Nelson, Barry L. and Powell, Warren Buckler and Santner, Thomas J.},
  title =	{{09181 Executive Summary – Sampling-based Optimization in the Presence of Uncertainty}},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty},
  pages =	{1--3},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9181},
  editor =	{J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09181.2},
  URN =		{urn:nbn:de:0030-drops-21161},
  doi =		{10.4230/DagSemProc.09181.2},
  annote =	{Keywords: Optimal learning, optimization in the presence of uncertainty, simulation optimization, sequential experimental design, ranking and selection, random search, stochastic approximation, approximate dynamic programming}
}
Document
09181 Working Group on Hybridization between R&S, DoE and Optimization

Authors: Chun-Hung Chen, Liu Hong, Paul B. Kantor, David P. Morton, Juta Pichitlamken, and Matthias Seeger


Abstract
This is the report of the working group on the relationship between, and hybrid combination of, design of experiments (DoE), optimization, and ranking and selection (R&S). The rapporteur, Paul Kantor, learned a great deal at the seminar, which he summarized by sharing a cartoon: a student asks the teacher, "... may I be excused? My brain is full" (from a 1986 cartoon by Gary Larson; omitted here for copyright reasons).

Cite as

Chun-Hung Chen, Liu Hong, Paul B. Kantor, David P. Morton, Juta Pichitlamken, and Matthias Seeger. 09181 Working Group on Hybridization between R&S, DoE and Optimization. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{chen_et_al:DagSemProc.09181.3,
  author =	{Chen, Chun-Hung and Hong, Liu and Kantor, Paul B. and Morton, David P. and Pichitlamken, Juta and Seeger, Matthias},
  title =	{{09181 Working Group on Hybridization between R\&S, DoE and Optimization}},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9181},
  editor =	{J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09181.3},
  URN =		{urn:nbn:de:0030-drops-21172},
  doi =		{10.4230/DagSemProc.09181.3},
  annote =	{Keywords: }
}
Document
Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models

Authors: Matthias Seeger and Hannes Nickisch


Abstract
Sparsity is a fundamental concept of modern statistics, and often the only general principle available at the moment to address novel learning applications with many more variables than observations. While much progress has been made recently in the theoretical understanding and algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and there are virtually no algorithms for large-scale applications of these. We provide novel approximate Bayesian inference algorithms for sparse generalized linear models that can be used with hundreds of thousands of variables, and that run orders of magnitude faster than previous algorithms in domains where either applies. By analyzing our methods and establishing some novel convexity results, we settle a long-standing open question about variational Bayesian inference for continuous-variable models: the Gaussian lower bound relaxation, which has been used previously for a range of models, is proved to be a convex optimization problem if and only if the posterior mode is found by convex programming. Our algorithms reduce to the same computational primitives as commonly used sparse estimation methods do, but additionally require Gaussian marginal variance estimation, and we show how the Lanczos algorithm from numerical mathematics can be employed to compute the latter. We are interested here in Bayesian experimental design (which is mainly driven by efficient approximate inference), a powerful framework for optimizing measurement architectures of complex signals, such as natural images. Designs optimized by our Bayesian framework strongly outperform choices advocated by compressed sensing theory, and with our novel algorithms, we can scale it up to full-size images. Immediate applications of our method lie in digital photography and medical imaging.
We have applied our framework to problems of magnetic resonance imaging design and reconstruction, and part of this work appeared at a conference (Seeger et al., 2008). The present paper describes our methods in much greater generality, and most of the theory is novel. Experiments and evaluations will be given in a later paper.
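The greedy variance-driven selection behind Bayesian experimental design can be illustrated on a toy linear-Gaussian model. The sketch below is not the paper's method (which uses sparse non-Gaussian priors and Lanczos-based variance estimation); it only shows the underlying principle, with all dimensions, noise levels, and candidate vectors being made-up illustration values: each round measures where the posterior is currently most uncertain, then updates the posterior covariance.

```python
import numpy as np

rng = np.random.default_rng(1)

d = 8                                        # illustrative latent dimension
sigma_n2, sigma_p2 = 0.1, 1.0                # assumed noise / prior variances
candidates = rng.standard_normal((50, d))    # hypothetical measurement vectors

# Posterior covariance of a linear-Gaussian model, starting at the prior.
Sigma = sigma_p2 * np.eye(d)

chosen = []
for _ in range(5):
    # Score each candidate x by its predictive variance x^T Sigma x:
    # measuring the most uncertain direction is the greedy
    # information-gain choice in the Gaussian setting.
    scores = np.einsum('ij,jk,ik->i', candidates, Sigma, candidates)
    i = int(np.argmax(scores))
    x = candidates[i]
    # Rank-one covariance update (Sherman-Morrison) after observing x.
    Sx = Sigma @ x
    Sigma = Sigma - np.outer(Sx, Sx) / (sigma_n2 + x @ Sx)
    chosen.append(i)

total_var = np.trace(Sigma)   # strictly smaller than the prior trace
```

Each rank-one update strictly reduces the total posterior variance, which is why sequential design keeps picking high-variance measurements.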

Cite as

Matthias Seeger and Hannes Nickisch. Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{seeger_et_al:DagSemProc.09181.4,
  author =	{Seeger, Matthias and Nickisch, Hannes},
  title =	{{Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models}},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9181},
  editor =	{J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09181.4},
  URN =		{urn:nbn:de:0030-drops-21148},
  doi =		{10.4230/DagSemProc.09181.4},
  annote =	{Keywords: Bayesian experimental design, variational inference, sparse estimation}
}
Document
Sequential Parameter Optimization

Author: Thomas Bartz-Beielstein


Abstract
We provide a comprehensive, effective and very efficient methodology for the design and experimental analysis of algorithms. We rely on modern statistical techniques for tuning and understanding algorithms from an experimental perspective. Therefore, we make use of the sequential parameter optimization (SPO) method that has been successfully applied as a tuning procedure to numerous heuristics for practical and theoretical optimization problems. Two case studies, which illustrate the applicability of SPO to algorithm tuning and model selection, are presented.
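The core SPO loop (evaluate an initial design, fit a surrogate, propose the surrogate's best point, re-evaluate, repeat) can be sketched in a few lines. This is a minimal illustration, not the full SPO methodology: the target function `noisy_runtime`, its optimum, the noise level, and the quadratic surrogate are all made-up stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_runtime(param):
    # Hypothetical algorithm-performance measure: a noisy quadratic
    # with its optimum at param = 0.3 (illustration only).
    return (param - 0.3) ** 2 + 0.01 * rng.standard_normal()

# Initial design: a small space-filling sample over the parameter range.
X = list(np.linspace(0.0, 1.0, 5))
y = [noisy_runtime(x) for x in X]

for _ in range(10):
    # Fit a quadratic surrogate to all observations by least squares.
    A = np.vander(np.array(X), 3)
    coeffs = np.linalg.lstsq(A, np.array(y), rcond=None)[0]
    # Propose the surrogate minimizer over a candidate grid,
    # then evaluate it and add the result to the design.
    grid = np.linspace(0.0, 1.0, 201)
    preds = np.vander(grid, 3) @ coeffs
    x_new = grid[int(np.argmin(preds))]
    X.append(x_new)
    y.append(noisy_runtime(x_new))

best = X[int(np.argmin(y))]   # incumbent parameter setting
```

Real SPO replaces the quadratic fit with a stochastic process (Kriging) model and handles noise via repeated evaluations, but the evaluate/model/propose cycle is the same.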

Cite as

Thomas Bartz-Beielstein. Sequential Parameter Optimization. In Sampling-based Optimization in the Presence of Uncertainty. Dagstuhl Seminar Proceedings, Volume 9181, pp. 1-32, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{bartzbeielstein:DagSemProc.09181.5,
  author =	{Bartz-Beielstein, Thomas},
  title =	{{Sequential Parameter Optimization}},
  booktitle =	{Sampling-based Optimization in the Presence of Uncertainty},
  pages =	{1--32},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9181},
  editor =	{J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09181.5},
  URN =		{urn:nbn:de:0030-drops-21159},
  doi =		{10.4230/DagSemProc.09181.5},
  annote =	{Keywords: Optimization, evolutionary algorithms, design of experiments}
}
