Variational Bayes via Propositionalization

Authors: Taisuke Sato, Yoshitaka Kameya, Kenichi Kurihara




Cite As

Taisuke Sato, Yoshitaka Kameya, and Kenichi Kurihara. Variational Bayes via Propositionalization. In Probabilistic, Logical and Relational Learning - A Further Synthesis. Dagstuhl Seminar Proceedings, Volume 7161, pp. 1-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008) https://doi.org/10.4230/DagSemProc.07161.10

Abstract

We propose a unified approach to VB (variational Bayes) in symbolic-statistical modeling via propositionalization. By propositionalization we mean, broadly, expressing and computing probabilistic models such as BNs (Bayesian networks) and PCFGs (probabilistic context-free grammars) in terms of propositional logic that treats propositional variables as binary random variables.

Our proposal is motivated by three observations. The first is that PPC (propositionalized probability computation), i.e. probability computation formalized in a propositional setting, has turned out to be general and efficient when variable values are sparsely interdependent. Examples include (discrete) BNs, PCFGs and, more generally, PRISM, a Turing-complete logic programming language with EM-learning ability that we have been developing, which computes probabilities using graphically represented AND/OR Boolean formulas. The efficiency of PPC is classically attested by the Inside-Outside algorithm in the case of PCFGs, and by recent PPC approaches in the case of BNs, such as the one by Darwiche et al. that exploits zero probabilities and CSI (context-specific independence). Dechter et al. also showed, through their formulation of AND/OR search spaces, that PPC is a general computation scheme for BNs.
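
To make PPC concrete, here is a minimal sketch (our own illustration, not code from the paper) of probability computation over an AND/OR graph. It assumes, as PRISM's distribution semantics does, that the disjuncts of an OR node are mutually exclusive and the conjuncts of an AND node are independent, so probabilities sum at OR nodes and multiply at AND nodes; memoization gives the dynamic-programming speedup on shared subgraphs. The node names and numbers are hypothetical.

    from functools import lru_cache

    # Leaf probabilities: each propositional variable is a binary random variable.
    leaf_prob = {"a": 0.6, "not_a": 0.4, "b": 0.3, "c": 0.5}

    # AND/OR graph: node -> (connective, children). Shared subgraphs are
    # evaluated once thanks to memoization.
    graph = {
        "goal": ("or", ["e1", "e2"]),     # two exclusive explanations for the goal
        "e1":   ("and", ["a", "b"]),      # P(e1) = P(a) * P(b)
        "e2":   ("and", ["not_a", "c"]),  # P(e2) = P(not a) * P(c)
    }

    @lru_cache(maxsize=None)
    def prob(node):
        if node in leaf_prob:
            return leaf_prob[node]
        op, children = graph[node]
        ps = [prob(ch) for ch in children]
        if op == "and":                   # independent conjuncts: multiply
            out = 1.0
            for p in ps:
                out *= p
            return out
        return sum(ps)                    # exclusive disjuncts: add

    print(prob("goal"))  # 0.6*0.3 + 0.4*0.5 = 0.38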

The second observation is that while VB has been around for some time as a practically effective approach to Bayesian modeling, its use is still somewhat restricted to simple models such as BNs and HMMs (hidden Markov models), though its usefulness has been established through a variety of applications, from model selection to prediction. On the other hand, it has already been proved that VB can be extended to PCFGs and is efficiently implementable using dynamic programming. Note that PCFGs are just one class of models covered by PPC, and a much more general PPC is realized by PRISM. Accordingly, if VB is extended to PRISM's PPC, we obtain VB for general probabilistic models, far wider in scope than BNs and PCFGs.
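
For intuition, the following is a minimal sketch (an illustration under stated assumptions, not PRISM's actual API) of the update that VB adds on top of EM for a single multinomial parameter vector with a symmetric Dirichlet prior, the building block of PCFG-like models: EM's normalized expected counts are replaced by digamma-based pseudo-probabilities. The function names, counts, and the choice of alpha are hypothetical.

    from math import exp
    from scipy.special import digamma

    def vb_m_step(expected_counts, alpha=1.0):
        """VB-style M-step: exp(digamma(alpha + c_k) - digamma(sum))."""
        posterior = [alpha + c for c in expected_counts]
        total = digamma(sum(posterior))
        return [exp(digamma(a) - total) for a in posterior]

    def em_m_step(expected_counts):
        """Ordinary EM M-step for comparison: simple normalization."""
        total = sum(expected_counts)
        return [c / total for c in expected_counts]

    counts = [3.2, 0.4, 0.4]            # expected counts from an E-step
    print(em_m_step(counts))            # maximum-likelihood estimates
    print(vb_m_step(counts, alpha=0.5)) # sub-normalized, smoothed by the prior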

The last observation is that once VB becomes available in PRISM, it saves us a great deal of time and effort. We no longer have to derive a new VB algorithm from scratch for each model and implement it; all we have to do is write a probabilistic model at the predicate level. The rest of the work is carried out automatically, in a unified manner, by the PRISM system, just as it is in the case of EM learning. Deriving and implementing a VB algorithm is a tedious, error-prone process, and ensuring its correctness beyond PCFGs would be difficult without formal semantics.

PRISM augmented with VB will completely eliminate such needs and make it easy to explore and test new Bayesian models by helping the user cope with data sparseness and avoid over-fitting.

Keywords
  • Variational Bayes
  • propositionalized probability computation
  • PRISM
