4 Search Results for "Maass, Peter"


A Two-Step Soft Segmentation Procedure for MALDI Imaging Mass Spectrometry Data

Authors: Ilya Chernyavsky, Theodore Alexandrov, Peter Maass, and Sergey I. Nikolenko

Published in: OASIcs, Volume 26, German Conference on Bioinformatics 2012


Abstract
We propose a new method for soft spatial segmentation of matrix-assisted laser desorption/ionization imaging mass spectrometry (MALDI-IMS) data, based on probabilistic clustering with subsequent smoothing. Clustering of spectra is done with the Latent Dirichlet Allocation (LDA) model. The clustering results are then smoothed with a Markov random field (MRF), resulting in a soft probabilistic segmentation map. We present several extensions of the basic MRF model specifically tuned for MALDI-IMS data segmentation. We also describe a highly parallel implementation of the smoothing algorithm based on the GraphLab framework and report experimental results.
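The smoothing step of such a two-step pipeline can be sketched in a few lines. The following is a minimal illustrative stand-in, not the paper's GraphLab implementation: it assumes the per-pixel cluster probabilities from the LDA step are already given as an (H, W, K) array, and uses a simple mean-field-style iteration (each pixel's distribution is reweighted toward its 4-neighbourhood average) in place of full MRF inference. The function name and the `beta` coupling parameter are hypothetical.

```python
import numpy as np

def smooth_probability_maps(probs, beta=1.0, n_iter=10):
    """Mean-field-style spatial smoothing of per-pixel cluster
    probabilities (H, W, K): each pixel's distribution is pulled
    toward the average distribution of its 4-neighbourhood.

    `probs` plays the role of the unary (LDA) evidence; `beta`
    plays the role of the pairwise MRF coupling strength.
    """
    p = probs.copy()
    for _ in range(n_iter):
        # average of the 4 neighbour distributions (edges padded)
        pad = np.pad(p, ((1, 1), (1, 1), (0, 0)), mode="edge")
        neigh = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                 pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
        # reweight the unary probabilities by neighbour support
        p = probs * np.exp(beta * neigh)
        p /= p.sum(axis=2, keepdims=True)  # renormalise per pixel
    return p
```

On a noisy two-region map, a single mislabelled interior pixel is restored to the label of its surroundings while the output remains a valid per-pixel distribution.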

Cite as

Ilya Chernyavsky, Theodore Alexandrov, Peter Maass, and Sergey I. Nikolenko. A Two-Step Soft Segmentation Procedure for MALDI Imaging Mass Spectrometry Data. In German Conference on Bioinformatics 2012. Open Access Series in Informatics (OASIcs), Volume 26, pp. 39-48, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{chernyavsky_et_al:OASIcs.GCB.2012.39,
  author =	{Chernyavsky, Ilya and Alexandrov, Theodore and Maass, Peter and Nikolenko, Sergey I.},
  title =	{{A Two-Step Soft Segmentation Procedure for MALDI Imaging Mass Spectrometry Data}},
  booktitle =	{German Conference on Bioinformatics 2012},
  pages =	{39--48},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-44-6},
  ISSN =	{2190-6807},
  year =	{2012},
  volume =	{26},
  editor =	{B\"{o}cker, Sebastian and Hufsky, Franziska and Scheubert, Kerstin and Schleicher, Jana and Schuster, Stefan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.GCB.2012.39},
  URN =		{urn:nbn:de:0030-drops-37163},
  doi =		{10.4230/OASIcs.GCB.2012.39},
  annote =	{Keywords: MALDI imaging mass spectrometry, hyperspectral image segmentation, probabilistic graphical models, latent Dirichlet allocation, Markov random field}
}
One-shot Learning of Poisson Distributions in fast changing environments

Authors: Peter Tino

Published in: Dagstuhl Seminar Proceedings, Volume 10302, Learning paradigms in dynamic environments (2010)


Abstract
In Bioinformatics, Audic and Claverie were among the first to systematically study the influence of random fluctuations and sampling size on the reliability of digital expression profile data. For a transcript representing a small fraction of the library and a large number N of clones, the probability of observing x tags of the same gene will be well-approximated by the Poisson distribution parametrised by its mean (and variance) m>0, where the unknown parameter m signifies the number of transcripts of the given type (tag) per N clones in the cDNA library. On an abstract level, to determine whether a gene is differentially expressed or not, one has two numbers generated from two distinct Poisson distributions and, based on this (extremely sparse) sample, one has to decide whether the two Poisson distributions are identical or not. This can be used, e.g., to determine equivalence of Poisson photon sources (up to time shift) in gravitational lensing. Each Poisson distribution is represented by a single measurement only, which is, of course, very problematic from a purely statistical standpoint. The key instrument of the Audic-Claverie approach is a distribution P over tag counts y in one library informed by the tag count x in the other library, under the null hypothesis that the tag counts are generated from the same but unknown Poisson distribution. P is obtained by Bayesian averaging (infinite mixture) of all possible Poisson distributions with mixing proportions equal to the posteriors (given x) under the flat prior over m. We ask: Given that the tag count samples from SAGE libraries are *extremely* limited, how useful is the Audic-Claverie methodology in practice? We rigorously analyse the A-C statistic P that forms the backbone of the methodology and represents our knowledge of the underlying tag-generating process based on one observation. We will show that the A-C statistic P and the underlying Poisson distribution of the tag counts share the same mode structure.
Moreover, the K-L divergence from the true unknown Poisson distribution to the A-C statistic is minimised when the A-C statistic is conditioned on the mode of the Poisson distribution. Most importantly (and perhaps rather surprisingly), the expectation of this K-L divergence never exceeds 1/2 bit! This constitutes a rigorous quantitative argument, extending the previous empirical Monte Carlo studies, that supports the widespread use of the Audic-Claverie method, even though, by their very nature, SAGE libraries represent very sparse samples.
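The quantities in the abstract are easy to compute numerically. The sketch below assumes equal library sizes, under which the Bayesian average with a flat prior over m is known to reduce to the closed form P(y|x) = C(x+y, y) / 2^(x+y+1); the function names are our own. It lets one check that P(·|x) is a proper distribution, and that the K-L divergence from a Poisson to the mode-conditioned A-C statistic stays below the 1/2-bit bound.

```python
from math import comb, exp, lgamma, log, log2

def ac_statistic(y, x):
    """Audic-Claverie statistic P(y|x): probability of tag count y in one
    library given count x in the other, under the null hypothesis of a
    shared unknown Poisson mean, with a flat prior and equal library
    sizes: P(y|x) = C(x+y, y) / 2^(x+y+1)."""
    return comb(x + y, y) / 2 ** (x + y + 1)

def poisson_pmf(y, m):
    # log-space form avoids overflow in factorial(y) for large y
    return exp(-m + y * log(m) - lgamma(y + 1))

def kl_bits(m, x, y_max=400):
    """K-L divergence (in bits) from Poisson(m) to the A-C statistic
    P(.|x), truncated at y_max (the Poisson tail is negligible there
    for moderate m)."""
    return sum(
        poisson_pmf(y, m) * log2(poisson_pmf(y, m) / ac_statistic(y, x))
        for y in range(y_max + 1)
        if poisson_pmf(y, m) > 0
    )
```

For example, with m = 5 (whose modes are 4 and 5), `kl_bits(5.0, 5)` stays under 0.5 bit, while conditioning on a count far from the mode, such as x = 15, gives a much larger divergence.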

Cite as

Peter Tino. One-shot Learning of Poisson Distributions in fast changing environments. In Learning paradigms in dynamic environments. Dagstuhl Seminar Proceedings, Volume 10302, pp. 1-9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{tino:DagSemProc.10302.4,
  author =	{Tino, Peter},
  title =	{{One-shot Learning of Poisson Distributions in fast changing environments}},
  booktitle =	{Learning paradigms in dynamic environments},
  pages =	{1--9},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10302},
  editor =	{Hammer, Barbara and Hitzler, Pascal and Maass, Wolfgang and Toussaint, Marc},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10302.4},
  URN =		{urn:nbn:de:0030-drops-27998},
  doi =		{10.4230/DagSemProc.10302.4},
  annote =	{Keywords: Audic-Claverie statistic, Bayesian averaging, information theory, one-shot learning, Poisson distribution}
}
Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks

Authors: Peter Tino

Published in: Dagstuhl Seminar Proceedings, Volume 8041, Recurrent Neural Networks - Models, Capacities, and Applications (2008)


Abstract
Optimization dynamics using self-organizing neural networks (SONN) driven by softmax weight renormalization has been shown to be capable of intermittent search for high-quality solutions in assignment optimization problems. However, the search is sensitive to the temperature setting in the softmax renormalization step. The powerful search occurs only at the critical temperature, which depends on the problem size. So far, the critical temperatures have been determined only by tedious trial-and-error numerical simulations. We offer a rigorous analysis of the search performed by SONN and derive analytical approximations to the critical temperatures. We demonstrate on a set of N-queens problems, for a wide range of problem sizes N, that the analytically determined critical temperatures predict the optimal working temperatures for SONN intermittent search very well.
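The temperature dependence can be illustrated on a stripped-down toy version of the dynamics: plain iteration of w <- softmax(w/T) on the probability simplex, without the assignment-problem structure of SONN. Linearizing this map at the uniform fixed point gives the Jacobian eigenvalue 1/(nT) on the simplex, so for this toy map the uniform solution destabilizes below T = 1/n; the paper's analysis of the full SONN dynamics is more involved, and the code below is only a sketch of the symmetry-breaking phenomenon.

```python
import numpy as np

def iterate_softmax(w, T, n_iter=200):
    """Iterate the map w <- softmax(w / T) on the probability simplex."""
    for _ in range(n_iter):
        z = w / T
        z -= z.max()          # shift for numerical stability
        w = np.exp(z)
        w /= w.sum()
    return w

n = 4
w0 = np.array([0.3, 0.25, 0.25, 0.2])  # perturbed uniform point

# For this toy map the critical temperature is T_c = 1/n = 0.25:
w_hot = iterate_softmax(w0, T=1.0)    # T > T_c: uniform point attracts
w_cold = iterate_softmax(w0, T=0.05)  # T < T_c: symmetry breaks, near one-hot
```

Above the critical temperature the iterates collapse back to the uniform vector; below it, the initially largest component is amplified toward a one-hot corner of the simplex.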

Cite as

Peter Tino. Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{tino:DagSemProc.08041.3,
  author =	{Tino, Peter},
  title =	{{Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks}},
  booktitle =	{Recurrent Neural Networks - Models, Capacities, and Applications},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{De Raedt, Luc and Hammer, Barbara and Hitzler, Pascal and Maass, Wolfgang},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.3},
  URN =		{urn:nbn:de:0030-drops-14202},
  doi =		{10.4230/DagSemProc.08041.3},
  annote =	{Keywords: Recurrent self-organizing maps, symmetry breaking bifurcation, N-queens}
}
Perspectives of Neuro-Symbolic Integration – Extended Abstract –

Authors: Kai-Uwe Kühnberger, Helmar Gust, and Peter Geibel

Published in: Dagstuhl Seminar Proceedings, Volume 8041, Recurrent Neural Networks - Models, Capacities, and Applications (2008)


Abstract
There is an obvious tension between symbolic and subsymbolic theories, because both show complementary strengths and weaknesses in corresponding applications and underlying methodologies. The resulting gap in the foundations and the applicability of these approaches is theoretically unsatisfactory and practically undesirable. We sketch a theory that bridges this gap between symbolic and subsymbolic approaches by the introduction of a Topos-based semi-symbolic level used for coding logical first-order expressions in a homogeneous framework. This semi-symbolic level can be used for neural learning of logical first-order theories. Besides a presentation of the general idea of the framework, we sketch some challenges and important open problems for future research with respect to the presented approach and the field of neuro-symbolic integration, in general.

Cite as

Kai-Uwe Kühnberger, Helmar Gust, and Peter Geibel. Perspectives of Neuro-Symbolic Integration – Extended Abstract –. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{kuhnberger_et_al:DagSemProc.08041.4,
  author =	{K\"{u}hnberger, Kai-Uwe and Gust, Helmar and Geibel, Peter},
  title =	{{Perspectives of Neuro--Symbolic Integration -- Extended Abstract --}},
  booktitle =	{Recurrent Neural Networks - Models, Capacities, and Applications},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{De Raedt, Luc and Hammer, Barbara and Hitzler, Pascal and Maass, Wolfgang},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.4},
  URN =		{urn:nbn:de:0030-drops-14226},
  doi =		{10.4230/DagSemProc.08041.4},
  annote =	{Keywords: Neuro-Symbolic Integration, Topos Theory, First-Order Logic}
}
