Search Results

Documents authored by Kammar, Ohad


Bayesian Inversion by Omega-Complete Cone Duality (Invited Paper)

Authors: Fredrik Dahlqvist, Vincent Danos, Ilias Garnier, and Ohad Kammar

Published in: LIPIcs, Volume 59, 27th International Conference on Concurrency Theory (CONCUR 2016)


Abstract
The process of inverting Markov kernels relates to the important subject of Bayesian modelling and learning. In fact, Bayesian update is exactly kernel inversion. In this paper, we investigate how and when Markov kernels (aka stochastic relations, probabilistic mappings, or simply kernels) can be inverted. We address the question both directly on the category of measurable spaces, and indirectly by interpreting kernels as Markov operators:

- For the direct option, we introduce a typed version of the category of Markov kernels and use the so-called "disintegration of measures". Here, one has to specialise to measurable spaces borne from a simple class of topological spaces, e.g. Polish spaces (other choices are possible). Our method and result greatly simplify a recent development in Ref. [4].

- For the operator option, we use a cone version of the category of Markov operators (kernels seen as predicate transformers). That is to say, our linear operators are not just continuous, but are required to satisfy the stronger condition of being $\omega$-chain-continuous. Prior work shows that one obtains an adjunction in the form of a pair of contravariant and inverse functors between the categories of $L_1$- and $L_\infty$-cones [3]. Inversion, seen through the operator prism, is just adjunction. No topological assumption is needed.

- We show that both categories (Markov kernels and $\omega$-chain-continuous Markov operators) are related by a family of contravariant functors $T_p$ for $1 \leq p \leq \infty$. The $T_p$'s are Kleisli extensions of (duals of) conditional expectation functors introduced in Ref. [3].

- With this bridge in place, we can prove that both notions of inversion agree when both are defined: if $f$ is a kernel and $f^\dagger$ its direct inverse, then $T_\infty(f)^\dagger = T_1(f^\dagger)$.
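
To make the slogan "Bayesian update is exactly kernel inversion" concrete in the simplest possible setting, the following Haskell sketch (not from the paper; the names Dist, Kernel, pushforward and invert are illustrative) computes the Bayesian inverse of a finite discrete kernel with respect to a prior via Bayes' rule. In the finite case no disintegration subtleties arise; the paper's contribution is the measure-theoretic and operator-theoretic generalisation of this picture.

import qualified Data.Map.Strict as M

-- A finitely supported distribution, and a Markov kernel a ~> b.
type Dist a     = M.Map a Double
type Kernel a b = a -> Dist b

-- Pushforward of a prior along a kernel: the marginal distribution on b.
pushforward :: Ord b => Dist a -> Kernel a b -> Dist b
pushforward prior f =
  M.fromListWith (+)
    [ (y, p * q) | (x, p) <- M.toList prior, (y, q) <- M.toList (f x) ]

-- Bayesian inverse of f with respect to the prior: a kernel b ~> a,
-- defined on observations y with nonzero marginal probability.
-- P(x | y) = P(x) * P(y | x) / P(y)
invert :: (Ord a, Ord b) => Dist a -> Kernel a b -> Kernel b a
invert prior f y =
  let marginal = M.findWithDefault 0 y (pushforward prior f)
  in  M.fromList
        [ (x, p * M.findWithDefault 0 y (f x) / marginal)
        | (x, p) <- M.toList prior
        , marginal > 0 ]

-- Example: a uniform prior over a fair and a biased coin, and the kernel
-- that flips the chosen coin. Inverting at the observation Heads gives the
-- posterior over coins: Fair ~ 0.357, Biased ~ 0.643.
data Coin = Fair | Biased deriving (Eq, Ord, Show)
data Side = Heads | Tails deriving (Eq, Ord, Show)

flipCoin :: Kernel Coin Side
flipCoin Fair   = M.fromList [(Heads, 0.5), (Tails, 0.5)]
flipCoin Biased = M.fromList [(Heads, 0.9), (Tails, 0.1)]

posterior :: Dist Coin
posterior = invert (M.fromList [(Fair, 0.5), (Biased, 0.5)]) flipCoin Heads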

Cite as

Fredrik Dahlqvist, Vincent Danos, Ilias Garnier, and Ohad Kammar. Bayesian Inversion by Omega-Complete Cone Duality (Invited Paper). In 27th International Conference on Concurrency Theory (CONCUR 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 59, pp. 1:1-1:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{dahlqvist_et_al:LIPIcs.CONCUR.2016.1,
  author =	{Dahlqvist, Fredrik and Danos, Vincent and Garnier, Ilias and Kammar, Ohad},
  title =	{{Bayesian Inversion by Omega-Complete Cone Duality}},
  booktitle =	{27th International Conference on Concurrency Theory (CONCUR 2016)},
  pages =	{1:1--1:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-017-0},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{59},
  editor =	{Desharnais, Jos\'{e}e and Jagadeesan, Radha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2016.1},
  URN =		{urn:nbn:de:0030-drops-61909},
  doi =		{10.4230/LIPIcs.CONCUR.2016.1},
  annote =	{Keywords: probabilistic models, bayesian learning, markov operators}
}