Search Results

Documents authored by Kameya, Yoshitaka


Document
Variational Bayes via Propositionalization

Authors: Taisuke Sato, Yoshitaka Kameya, and Kenichi Kurihara

Published in: Dagstuhl Seminar Proceedings, Volume 7161, Probabilistic, Logical and Relational Learning - A Further Synthesis (2008)


Abstract
We propose a unified approach to VB (variational Bayes) in symbolic-statistical modeling via propositionalization. By propositionalization we mean, broadly, expressing and computing probabilistic models such as BNs (Bayesian networks) and PCFGs (probabilistic context-free grammars) in terms of propositional logic that treats propositional variables as binary random variables. Our proposal is motivated by three observations. The first is that PPC (propositionalized probability computation), i.e., probability computation formalized in a propositional setting, has turned out to be general and efficient when variable values are sparsely interdependent. Examples include (discrete) BNs, PCFGs, and, more generally, PRISM, a Turing-complete logic programming language with EM learning ability that we have been developing, which computes probabilities over graphically represented AND/OR Boolean formulas. The efficiency of PPC is classically attested by the Inside-Outside algorithm for PCFGs, and more recently by PPC approaches to BNs such as the one by Darwiche et al. that exploits zero probabilities and CSI (context-specific independence). Dechter et al. also showed, through their formulation of AND/OR search spaces, that PPC is a general computation scheme for BNs. The second observation is that while VB has been around for some time as a practically effective approach to Bayesian modeling, its use is still largely restricted to simple models such as BNs and HMMs (hidden Markov models), even though its usefulness has been established in a variety of applications from model selection to prediction. On the other hand, it has already been shown that VB can be extended to PCFGs and implemented efficiently by dynamic programming. Since PCFGs are just one class of PPC and PRISM realizes a much more general PPC, extending VB to PRISM's PPC yields VB for probabilistic models far more general than BNs and PCFGs. The last observation is that once VB becomes available in PRISM, it saves considerable time and effort: we no longer have to derive a new VB algorithm from scratch for each model and implement it. All we have to do is write a probabilistic model at the predicate level; the rest of the work is carried out automatically, in a unified manner, by the PRISM system, just as in EM learning. Deriving and implementing a VB algorithm is a tedious, error-prone process, and ensuring its correctness beyond PCFGs would be difficult without formal semantics. PRISM augmented with VB eliminates this burden and makes it easy to explore and test new Bayesian models, helping the user cope with data sparseness and avoid over-fitting.
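
To make PPC concrete, here is a minimal sketch of probability computation by dynamic programming over an acyclic AND/OR graph whose leaves are independent binary random variables, assuming (as PRISM's modeling conditions require) that the conjuncts of an AND node are independent and the disjuncts of an OR node are mutually exclusive. The graph encoding and function names are illustrative inventions, not PRISM's actual interface.

def ppc_probability(node, leaf_prob, memo=None):
    """Compute P(node is true) over an acyclic AND/OR graph.

    node encodes the formula:
      ('leaf', name)       -- propositional variable with known probability
      ('and', [children])  -- conjunction of independent subformulas
      ('or',  [children])  -- disjunction of mutually exclusive subformulas
    """
    if memo is None:
        memo = {}
    key = id(node)
    if key in memo:                      # shared subgraphs are computed once
        return memo[key]
    kind = node[0]
    if kind == 'leaf':
        p = leaf_prob[node[1]]
    elif kind == 'and':                  # independence: multiply
        p = 1.0
        for child in node[1]:
            p *= ppc_probability(child, leaf_prob, memo)
    elif kind == 'or':                   # exclusiveness: add
        p = sum(ppc_probability(child, leaf_prob, memo) for child in node[1])
    else:
        raise ValueError(f"unknown node kind: {kind}")
    memo[key] = p
    return p

# Example: P((a AND b) OR (not_a AND c)), the two disjuncts being exclusive.
leaf_prob = {'a': 0.6, 'not_a': 0.4, 'b': 0.5, 'c': 0.3}
formula = ('or', [('and', [('leaf', 'a'), ('leaf', 'b')]),
                  ('and', [('leaf', 'not_a'), ('leaf', 'c')])])
print(ppc_probability(formula, leaf_prob))  # 0.6*0.5 + 0.4*0.3 = 0.42

The memo table plays the role of sharing in the graphically represented formulas: each shared subformula is evaluated once, which is what makes PPC efficient when variable values are sparsely interdependent.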

Cite as

Taisuke Sato, Yoshitaka Kameya, and Kenichi Kurihara. Variational Bayes via Propositionalization. In Probabilistic, Logical and Relational Learning - A Further Synthesis. Dagstuhl Seminar Proceedings, Volume 7161, pp. 1-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{sato_et_al:DagSemProc.07161.10,
  author =	{Sato, Taisuke and Kameya, Yoshitaka and Kurihara, Kenichi},
  title =	{{Variational Bayes via Propositionalization}},
  booktitle =	{Probabilistic, Logical and Relational Learning - A Further Synthesis},
  pages =	{1--8},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{7161},
  editor =	{Luc de Raedt and Thomas Dietterich and Lise Getoor and Kristian Kersting and Stephen H. Muggleton},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.07161.10},
  URN =		{urn:nbn:de:0030-drops-13860},
  doi =		{10.4230/DagSemProc.07161.10},
  annote =	{Keywords: Variational Bayes, propositionalized probability computation, PRISM}
}
Document
Learning through failure

Authors: Taisuke Sato and Yoshitaka Kameya

Published in: Dagstuhl Seminar Proceedings, Volume 5051, Probabilistic, Logical and Relational Learning - Towards a Synthesis (2006)


Abstract
PRISM, a symbolic-statistical modeling language we have been developing since 1997, recently incorporated a program transformation technique to handle failure in generative modeling. I'll show how this feature opens the way to new breeds of symbolic models, including EM learning from negative observations, constrained HMMs, and finite PCFGs.
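
To illustrate why failure complicates generative modeling, here is a minimal sketch (a toy model invented for this note, not one of the paper's examples) of the renormalization it forces: when some generation paths fail, an observed outcome x is distributed as P(x | success) = P(x) / (1 - P(fail)), and any failure-aware EM procedure must account for this normalizing constant when estimating parameters.

from itertools import product

P_H = 0.6  # coin bias; an illustrative parameter

def seq_prob(seq):
    """Probability of a heads/tails sequence under i.i.d. flips."""
    p = 1.0
    for flip in seq:
        p *= P_H if flip == 'h' else (1.0 - P_H)
    return p

def succeeds(seq):
    # Constraint: at least one heads; otherwise generation fails.
    return 'h' in seq

# Success probability: mass of all non-failing length-3 sequences.
p_success = sum(seq_prob(s) for s in product('ht', repeat=3) if succeeds(s))

def conditional_prob(seq):
    """Failure-adjusted probability of an observable sequence."""
    return seq_prob(seq) / p_success

print(conditional_prob(('h', 't', 't')))  # 0.6*0.4*0.4 / (1 - 0.4**3)

In PRISM, such success probabilities would be obtained by dynamic programming over propositionalized formulas derived from the transformed program, rather than by the brute-force enumeration used in this sketch.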

Cite as

Taisuke Sato and Yoshitaka Kameya. Learning through failure. In Probabilistic, Logical and Relational Learning - Towards a Synthesis. Dagstuhl Seminar Proceedings, Volume 5051, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{sato_et_al:DagSemProc.05051.9,
  author =	{Sato, Taisuke and Kameya, Yoshitaka},
  title =	{{Learning through failure}},
  booktitle =	{Probabilistic, Logical and Relational Learning - Towards a Synthesis},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5051},
  editor =	{Luc De Raedt and Thomas Dietterich and Lise Getoor and Stephen H. Muggleton},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05051.9},
  URN =		{urn:nbn:de:0030-drops-4185},
  doi =		{10.4230/DagSemProc.05051.9},
  annote =	{Keywords: Program transformation, failure, generative modeling}
}