Dagstuhl Seminar Proceedings, Volume 8041



Publication Details

  • Published: 2008-04-15
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik

Documents

Document
08041 Abstracts Collection – Recurrent Neural Networks - Models, Capacities, and Applications

Authors: Luc De Raedt, Barbara Hammer, Pascal Hitzler, and Wolfgang Maass


Abstract
From January 20 to 25, 2008, the Dagstuhl Seminar 08041 "Recurrent Neural Networks - Models, Capacities, and Applications" was held at the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are collected in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

Cite as

Luc De Raedt, Barbara Hammer, Pascal Hitzler, and Wolfgang Maass. 08041 Abstracts Collection – Recurrent Neural Networks - Models, Capacities, and Applications. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{deraedt_et_al:DagSemProc.08041.1,
  author =	{De Raedt, Luc and Hammer, Barbara and Hitzler, Pascal and Maass, Wolfgang},
  title =	{{08041 Abstracts Collection – Recurrent Neural Networks - Models, Capacities, and Applications}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  pages =	{1--16},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.1},
  URN =		{urn:nbn:de:0030-drops-14250},
  doi =		{10.4230/DagSemProc.08041.1},
  annote =	{Keywords: Recurrent Neural Networks, Neural-Symbolic Integration, Biological Models, Hybrid Models, Relational Learning, Echo State Networks, Spike Prediction, Unsupervised Recurrent Networks}
}
Document
08041 Summary – Recurrent Neural Networks - Models, Capacities, and Applications

Authors: Luc De Raedt, Barbara Hammer, Pascal Hitzler, and Wolfgang Maass


Abstract
The seminar centered around recurrent information processing in neural systems and its connections to brain sciences, on the one hand, and to higher symbolic reasoning, on the other. The goal was to explore connections across the disciplines and to tackle important questions that arise in all sub-disciplines, such as the representation of temporal information, generalization ability, inference, and learning.

Cite as

Luc De Raedt, Barbara Hammer, Pascal Hitzler, and Wolfgang Maass. 08041 Summary – Recurrent Neural Networks - Models, Capacities, and Applications. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{deraedt_et_al:DagSemProc.08041.2,
  author =	{De Raedt, Luc and Hammer, Barbara and Hitzler, Pascal and Maass, Wolfgang},
  title =	{{08041 Summary – Recurrent Neural Networks - Models, Capacities, and Applications}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  pages =	{1--4},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.2},
  URN =		{urn:nbn:de:0030-drops-14243},
  doi =		{10.4230/DagSemProc.08041.2},
  annote =	{Keywords: Recurrent networks}
}
Document
Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks

Authors: Peter Tino


Abstract
Optimization dynamics using self-organizing neural networks (SONN) driven by softmax weight renormalization has been shown to be capable of intermittent search for high-quality solutions in assignment optimization problems. However, the search is sensitive to the temperature setting in the softmax renormalization step. The powerful search occurs only at a critical temperature that depends on the problem size. So far, the critical temperatures have been determined only by tedious trial-and-error numerical simulations. We offer a rigorous analysis of the search performed by SONN and derive analytical approximations to the critical temperatures. We demonstrate on a set of N-queens problems, for a wide range of problem sizes N, that the analytically determined critical temperatures predict the optimal working temperatures for SONN intermittent search very well.
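
A minimal sketch of the temperature effect, in Python (NumPy): this is illustrative only, not code from the paper. It iterates the bare map w <- softmax(w/T) on the probability simplex; for this simplified map the uniform equilibrium loses stability near T = 1/N, whereas the paper analyzes critical temperatures for the full SONN dynamics.

import numpy as np

def softmax(w, T):
    # Temperature-T softmax; subtracting the max avoids overflow.
    z = (w - w.max()) / T
    e = np.exp(z)
    return e / e.sum()

def iterate_ism(w0, T, steps=500):
    # Repeatedly apply the iterative softmax map  w <- softmax(w / T).
    w = w0.copy()
    for _ in range(steps):
        w = softmax(w, T)
    return w

N = 8
rng = np.random.default_rng(0)
w0 = rng.dirichlet(np.ones(N))   # random starting point on the simplex

for T in (0.5, 0.2, 0.1, 0.05):
    w = iterate_ism(w0, T)
    print(f"T={T:4.2f}  largest coordinate: {w.max():.3f}")
# Above the critical temperature the iteration settles at the uniform
# vector (largest coordinate ~ 1/N = 0.125); below it, symmetry breaks
# and the state is driven toward a vertex of the simplex.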

Cite as

Peter Tino. Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{tino:DagSemProc.08041.3,
  author =	{Tino, Peter},
  title =	{{Equilibria of Iterative Softmax and Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.3},
  URN =		{urn:nbn:de:0030-drops-14202},
  doi =		{10.4230/DagSemProc.08041.3},
  annote =	{Keywords: Recurrent self-organizing maps, symmetry breaking bifurcation, N-queens}
}
Document
Perspectives of Neuro-Symbolic Integration – Extended Abstract

Authors: Kai-Uwe Kühnberger, Helmar Gust, and Peter Geibel


Abstract
There is an obvious tension between symbolic and subsymbolic theories, because both show complementary strengths and weaknesses in corresponding applications and underlying methodologies. The resulting gap in the foundations and the applicability of these approaches is theoretically unsatisfactory and practically undesirable. We sketch a theory that bridges this gap between symbolic and subsymbolic approaches by introducing a Topos-based semi-symbolic level for coding first-order logical expressions in a homogeneous framework. This semi-symbolic level can be used for neural learning of first-order logical theories. Besides presenting the general idea of the framework, we sketch some challenges and important open problems for future research, both for the presented approach and for the field of neuro-symbolic integration in general.

Cite as

Kai-Uwe Kühnberger, Helmar Gust, and Peter Geibel. Perspectives of Neuro-Symbolic Integration – Extended Abstract. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{kuhnberger_et_al:DagSemProc.08041.4,
  author =	{K\"{u}hnberger, Kai-Uwe and Gust, Helmar and Geibel, Peter},
  title =	{{Perspectives of Neuro--Symbolic Integration – Extended Abstract --}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.4},
  URN =		{urn:nbn:de:0030-drops-14226},
  doi =		{10.4230/DagSemProc.08041.4},
  annote =	{Keywords: Neuro-Symbolic Integration, Topos Theory, First-Order Logic}
}
Document
The Grand Challenges and Myths of Neural-Symbolic Computation

Authors: Luis C. Lamb


Abstract
The construction of computational cognitive models integrating the connectionist and symbolic paradigms of artificial intelligence is a standing research issue in the field. The combination of logic-based inference and connectionist learning systems may lead to the construction of semantically sound computational cognitive models in artificial intelligence, computer and cognitive sciences. Over the last decades, results regarding the computation and learning of classical reasoning within neural networks have been promising. Nonetheless, there still remains much to be done. Artificial intelligence, cognitive and computer science are strongly based on several non-classical reasoning formalisms, methodologies and logics. In knowledge representation, distributed systems, hardware design, theorem proving, and systems specification and verification, classical and non-classical logics have had a great impact on theory and real-world applications. Several challenges for neural-symbolic computation are pointed out, in particular for classical and non-classical computation in connectionist systems. We also analyse myths about neural-symbolic computation and shed new light on them considering recent research advances.

Cite as

Luis C. Lamb. The Grand Challenges and Myths of Neural-Symbolic Computation. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{lamb:DagSemProc.08041.5,
  author =	{Lamb, Luis C.},
  title =	{{The Grand Challenges and Myths of Neural-Symbolic Computation}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  pages =	{1--16},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.5},
  URN =		{urn:nbn:de:0030-drops-14233},
  doi =		{10.4230/DagSemProc.08041.5},
  annote =	{Keywords: Connectionist non-classical logics, neural-symbolic computation, non-classical reasoning, computational cognitive models}
}
Document
The role of recurrent networks in neural architectures of grounded cognition: learning of control

Authors: Frank Van der Velde and Marc de Kamps


Abstract
Recurrent networks have been used as neural models of language processing, with mixed results. Here, we discuss the role of recurrent networks in a neural architecture of grounded cognition. In particular, we discuss how the control of binding in this architecture can be learned. We trained a simple recurrent network (SRN) and a feedforward network (FFN) for this task. The results show that information from the architecture is needed as input for these networks to learn control of binding. Thus, both control systems are recurrent. We found that the recurrent system consisting of the architecture and an SRN or an FFN as a "core" can learn basic (but recursive) sentence structures. Problems with control of binding arise when the system with the SRN is tested on a number of new sentence structures. In contrast, control of binding for these structures succeeds with the FFN. Yet, for some structures with (unlimited) embeddings, difficulties arise due to dynamical binding conflicts in the architecture itself. In closing, we discuss potential future developments of the architecture presented here.
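
For orientation, a generic Elman-style SRN forward pass in Python (NumPy) is sketched below. It is a hypothetical illustration, not the authors' control network or training setup; it only shows the structural difference between an SRN and an FFN, namely the feedback of the previous hidden state.

import numpy as np

rng = np.random.default_rng(1)

class ElmanSRN:
    # Simple recurrent network: the hidden layer receives the current
    # input plus its own previous activation (the "context" layer).
    def __init__(self, n_in, n_hid, n_out):
        self.Wx = rng.normal(0.0, 0.1, (n_hid, n_in))   # input   -> hidden
        self.Wh = rng.normal(0.0, 0.1, (n_hid, n_hid))  # context -> hidden
        self.Wy = rng.normal(0.0, 0.1, (n_out, n_hid))  # hidden  -> output

    def run(self, xs):
        h = np.zeros(self.Wh.shape[0])               # context starts empty
        ys = []
        for x in xs:                                 # one step per symbol
            h = np.tanh(self.Wx @ x + self.Wh @ h)   # recurrent update
            ys.append(self.Wy @ h)
        return np.array(ys)

# An FFN control network drops the Wh @ h term; any sequence memory must
# then come from its external input -- here, feedback from the binding
# architecture itself, which is why the combined system is still recurrent.
net = ElmanSRN(n_in=5, n_hid=16, n_out=3)
seq = np.eye(5)[[0, 2, 1, 4]]    # toy one-hot input sequence
print(net.run(seq).shape)        # (4, 3): one output vector per time step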

Cite as

Frank Van der Velde and Marc de Kamps. The role of recurrent networks in neural architectures of grounded cognition: learning of control. In Recurrent Neural Networks - Models, Capacities, and Applications. Dagstuhl Seminar Proceedings, Volume 8041, pp. 1-18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{vandervelde_et_al:DagSemProc.08041.6,
  author =	{Van der Velde, Frank and de Kamps, Marc},
  title =	{{The role of recurrent networks in neural architectures of grounded cognition: learning of control}},
  booktitle =	{Recurrent Neural Networks- Models, Capacities, and Applications},
  pages =	{1--18},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8041},
  editor =	{Luc De Raedt and Barbara Hammer and Pascal Hitzler and Wolfgang Maass},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08041.6},
  URN =		{urn:nbn:de:0030-drops-14213},
  doi =		{10.4230/DagSemProc.08041.6},
  annote =	{Keywords: Grounded representations, binding control, combinatorial structures, neural architecture, recurrent network, learning}
}
