3 Search Results for "Sato, Taisuke"


Document
Tabling for infinite probability computation

Authors: Taisuke Sato and Philipp Meyer

Published in: LIPIcs, Volume 17, Technical Communications of the 28th International Conference on Logic Programming (ICLP'12) (2012)


Abstract
Tabling in logic programming has been used to eliminate redundant computation and also to stop infinite loops. In this paper we add a third use of tabling: making infinite probability computation possible for probabilistic logic programs. Using PRISM, a logic-based probabilistic modeling language with a tabling mechanism, we generalize prefix probability computation for PCFGs to probabilistic logic programs. Given a top goal, we search for all SLD proofs by tabled search, regardless of whether they contain loops. We then convert them to a set of linear probability equations and solve the system by matrix operations. The solution gives the probability of the top goal, which is, by nature, an infinite sum of probabilities. Our generalized approach to prefix probability computation through tabling opens a way to logic-based probabilistic modeling of cyclic dependencies.
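
The key computational step described above, turning the (possibly cyclic) explanations found by tabled search into linear probability equations and solving them by matrix operations, can be illustrated independently of PRISM. The following Python sketch uses a made-up two-goal system (not one taken from the paper) to show how the infinite sum of probabilities collapses into solving (I - A)x = b:

# A minimal sketch, not the authors' PRISM implementation: once tabled search
# has collected the explanations for each goal, the goal probabilities x
# satisfy a linear system x = A x + b, whose solution x = (I - A)^(-1) b
# equals the infinite sum b + A b + A^2 b + ...
import numpy as np

# Hypothetical two-goal system (coefficients are made up):
#   p(g1) = 0.5*p(g1) + 0.2*p(g2) + 0.1
#   p(g2) = 0.3*p(g1)             + 0.4
A = np.array([[0.5, 0.2],
              [0.3, 0.0]])
b = np.array([0.1, 0.4])

# Solving (I - A) x = b yields the infinite sum in one matrix operation.
x = np.linalg.solve(np.eye(2) - A, b)
print(dict(zip(["g1", "g2"], x)))   # probabilities of the two goals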

Cite as

Taisuke Sato and Philipp Meyer. Tabling for infinite probability computation. In Technical Communications of the 28th International Conference on Logic Programming (ICLP'12). Leibniz International Proceedings in Informatics (LIPIcs), Volume 17, pp. 348-358, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{sato_et_al:LIPIcs.ICLP.2012.348,
  author =	{Sato, Taisuke and Meyer, Philipp},
  title =	{{Tabling for infinite probability computation}},
  booktitle =	{Technical Communications of the 28th International Conference on Logic Programming (ICLP'12)},
  pages =	{348--358},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-43-9},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{17},
  editor =	{Dovier, Agostino and Santos Costa, V{\'\i}tor},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICLP.2012.348},
  URN =		{urn:nbn:de:0030-drops-36355},
  doi =		{10.4230/LIPIcs.ICLP.2012.348},
  annote =	{Keywords: probability, tabling, PRISM}
}
Document
Variational Bayes via Propositionalization

Authors: Taisuke Sato, Yoshitaka Kameya, and Kenichi Kurihara

Published in: Dagstuhl Seminar Proceedings, Volume 7161, Probabilistic, Logical and Relational Learning - A Further Synthesis (2008)


Abstract
We propose a unified approach to VB (variational Bayes) in symbolic-statistical modeling via propositionalization. By propositionalization we mean, broadly, expressing and computing probabilistic models such as BNs (Bayesian networks) and PCFGs (probabilistic context-free grammars) in terms of propositional logic that treats propositional variables as binary random variables. Our proposal is motivated by three observations.

The first is that PPC (propositionalized probability computation), i.e. probability computation formalized in a propositional setting, has turned out to be general and efficient when variable values are sparsely interdependent. Examples include (discrete) BNs, PCFGs and, more generally, PRISM, a Turing-complete logic programming language with EM learning ability that we have been developing, which computes probabilities using graphically represented AND/OR boolean formulas. The efficiency of PPC is classically attested by the Inside-Outside algorithm in the case of PCFGs, and by recent PPC approaches in the case of BNs such as the one by Darwiche et al. that exploits 0 probability and CSI (context-specific independence). Dechter et al. also showed that PPC is a general computation scheme for BNs through their formulation of AND/OR search spaces.

Second, while VB has been around for some time as a practically effective approach to Bayesian modeling, its use is still somewhat restricted to simple models such as BNs and HMMs (hidden Markov models), even though its usefulness is established through a variety of applications from model selection to prediction. On the other hand, it has already been proved that VB can be extended to PCFGs and is efficiently implementable using dynamic programming. Note that PCFGs are just one class of PPC, and a much more general PPC is realized by PRISM. Accordingly, if VB is extended to PRISM's PPC, we obtain VB for general probabilistic models, far wider than BNs and PCFGs.

The last observation is that once VB becomes available in PRISM, it saves us a lot of time and energy. First, we do not have to derive a new VB algorithm from scratch for each model and implement it. All we have to do is write a probabilistic model at the predicate level; the rest of the work is carried out automatically, in a unified manner, by the PRISM system, just as happens with EM learning. Deriving and implementing a VB algorithm is a tedious, error-prone process, and ensuring its correctness would be difficult beyond PCFGs without formal semantics. PRISM augmented with VB eliminates such needs entirely and makes it easy to explore and test new Bayesian models by helping the user cope with data sparseness and avoid over-fitting.
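
As a rough illustration of the PPC idea described above (a toy rendering only, not PRISM's actual machinery), the Python sketch below computes a goal probability over a small, made-up AND/OR graph whose leaves are binary random variables: AND nodes multiply child probabilities, OR nodes sum them, assuming the disjuncts are mutually exclusive as PRISM's exclusiveness condition requires.

# A toy rendering of propositionalized probability computation (PPC) over an
# acyclic AND/OR graph: leaves are binary random variables, AND nodes multiply
# the probabilities of their children, OR nodes sum them (assuming mutually
# exclusive disjuncts).  The graph and numbers below are made up.

leaf_prob = {"a": 0.6, "b": 0.3, "c": 0.5}

# Internal nodes listed in topological order: every child appears before its
# parent, so a single forward sweep suffices.
nodes = {
    "e1":   ("and", ["a", "b"]),    # explanation 1: a AND b
    "e2":   ("and", ["c"]),         # explanation 2: c
    "goal": ("or",  ["e1", "e2"]),  # goal holds iff some explanation holds
}

def ppc(nodes, leaf_prob):
    prob = dict(leaf_prob)
    for name, (kind, children) in nodes.items():
        vals = [prob[ch] for ch in children]
        if kind == "and":
            p = 1.0
            for v in vals:
                p *= v
        else:                       # "or" over mutually exclusive disjuncts
            p = sum(vals)
        prob[name] = p
    return prob

print(ppc(nodes, leaf_prob)["goal"])  # 0.6*0.3 + 0.5 = 0.68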

Cite as

Taisuke Sato, Yoshitaka Kameya, and Kenichi Kurihara. Variational Bayes via Propositionalization. In Probabilistic, Logical and Relational Learning - A Further Synthesis. Dagstuhl Seminar Proceedings, Volume 7161, pp. 1-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{sato_et_al:DagSemProc.07161.10,
  author =	{Sato, Taisuke and Kameya, Yoshitaka and Kurihara, Kenichi},
  title =	{{Variational Bayes via Propositionalization}},
  booktitle =	{Probabilistic, Logical and Relational Learning - A Further Synthesis},
  pages =	{1--8},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{7161},
  editor =	{Luc de Raedt and Thomas Dietterich and Lise Getoor and Kristian Kersting and Stephen H. Muggleton},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07161.10},
  URN =		{urn:nbn:de:0030-drops-13860},
  doi =		{10.4230/DagSemProc.07161.10},
  annote =	{Keywords: Variational Bayes, propositionalized probability computation, PRISM}
}
Document
Learning through failure

Authors: Taisuke Sato and Yoshitaka Kameya

Published in: Dagstuhl Seminar Proceedings, Volume 5051, Probabilistic, Logical and Relational Learning - Towards a Synthesis (2006)


Abstract
PRISM, a symbolic-statistical modeling language we have been developing since 1997, recently incorporated a program transformation technique to handle failure in generative modeling. I will show that this feature opens the way to new breeds of symbolic models, including EM learning from negative observations, constrained HMMs, and finite PCFGs.
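
As a rough sketch of what handling failure in generative modeling involves (a made-up two-coin example, not the program-transformation machinery of PRISM itself), the Python snippet below shows a generative process that fails when a constraint is violated, so the distribution over observed outcomes is the joint probability renormalized by the success probability, which is, roughly, the adjustment that learning in the presence of failure must account for.

# A toy two-coin model (made up for illustration): the generative process
# flips two biased coins but "fails" unless at least one shows heads.  What
# an observer of successful runs sees is the joint probability renormalized
# by the success probability 1 - P(failure).
from itertools import product

p_heads = {"c1": 0.6, "c2": 0.3}

def prob(outcome):
    """Joint probability of an assignment of faces to coins."""
    p = 1.0
    for coin, face in outcome.items():
        p *= p_heads[coin] if face == "h" else 1.0 - p_heads[coin]
    return p

def ok(outcome):
    """The constraint; the process fails when it is violated."""
    return "h" in outcome.values()

outcomes = [dict(zip(["c1", "c2"], faces)) for faces in product("ht", repeat=2)]
p_success = sum(prob(o) for o in outcomes if ok(o))   # here 1 - 0.4*0.7 = 0.72

for o in outcomes:
    if ok(o):
        print(o, prob(o) / p_success)   # normalized, observable distribution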

Cite as

Taisuke Sato and Yoshitaka Kameya. Learning through failure. In Probabilistic, Logical and Relational Learning - Towards a Synthesis. Dagstuhl Seminar Proceedings, Volume 5051, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{sato_et_al:DagSemProc.05051.9,
  author =	{Sato, Taisuke and Kameya, Yoshitaka},
  title =	{{Learning through failure}},
  booktitle =	{Probabilistic, Logical and Relational Learning - Towards a Synthesis},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5051},
  editor =	{Luc De Raedt and Thomas Dietterich and Lise Getoor and Stephen H. Muggleton},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.05051.9},
  URN =		{urn:nbn:de:0030-drops-4185},
  doi =		{10.4230/DagSemProc.05051.9},
  annote =	{Keywords: Program transformation, failure, generative modeling}
}