Challenges and Perspectives in Deep Generative Modeling (Dagstuhl Seminar 23072)

Authors: Vincent Fortuin, Yingzhen Li, Kevin Murphy, Stephan Mandt, Laura Manduchi, and all authors of the abstracts in this report




File

DagRep.13.2.47.pdf
  • Filesize: 1.91 MB
  • 24 pages

Document Identifiers
  • DOI: 10.4230/DagRep.13.2.47

Author Details

Vincent Fortuin
  • University of Cambridge, GB
Yingzhen Li
  • Imperial College London, GB
Kevin Murphy
  • Google Research - Mountain View, US
Stephan Mandt
  • University of California - Irvine, US
Laura Manduchi
  • ETH Zürich, CH
and all authors of the abstracts in this report

Cite As

Vincent Fortuin, Yingzhen Li, Kevin Murphy, Stephan Mandt, and Laura Manduchi. Challenges and Perspectives in Deep Generative Modeling (Dagstuhl Seminar 23072). In Dagstuhl Reports, Volume 13, Issue 2, pp. 47-70, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)
https://doi.org/10.4230/DagRep.13.2.47

Abstract

Deep generative models, such as variational autoencoders, generative adversarial networks, normalizing flows, and diffusion probabilistic models, have attracted considerable recent interest. However, we believe that several challenges hinder their more widespread adoption: (C1) the difficulty of objectively evaluating the generated data; (C2) challenges in designing scalable architectures for fast likelihood evaluation or sampling; and (C3) challenges in finding reproducible, interpretable, and semantically meaningful latent representations. In this Dagstuhl Seminar, we discussed these open problems in the context of real-world applications of deep generative models, including (A1) generative modeling of scientific data, (A2) neural data compression, and (A3) out-of-distribution detection. By examining challenges C1-C3 in the concrete contexts A1-A3, we worked towards identifying commonly occurring problems and ways to overcome them. We thus expect many future research collaborations to arise from this seminar and the discussed ideas to form the foundation for fruitful avenues of future research. In this report, we first summarize the main results of the seminar and then give an overview of the contributed talks and working group discussions.
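As background for challenge C2, the standard evidence lower bound (ELBO) used to train variational autoencoders illustrates why exact likelihood evaluation is difficult: the marginal likelihood is intractable and is only accessed through a variational bound. This is a textbook formulation given here for orientation, not a result quoted from the report itself:

\[
\log p_\theta(x) \;\geq\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] \;-\; \mathrm{KL}\big(q_\phi(z \mid x) \,\|\, p(z)\big)
\]

Here $q_\phi(z \mid x)$ is the approximate posterior (encoder), $p_\theta(x \mid z)$ the decoder likelihood, and $p(z)$ the latent prior; the gap between the two sides equals the KL divergence between the approximate and true posteriors, which is why the bound is tight only for a perfect encoder.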

Subject Classification

ACM Subject Classification
  • Computing methodologies → Unsupervised learning
  • Computing methodologies → Kernel methods
  • Computing methodologies → Learning in probabilistic graphical models
Keywords
  • deep generative models
  • representation learning
  • generative modeling
  • neural data compression
  • out-of-distribution detection
