Counting to Ten with Two Fingers: Compressed Counting with Spiking Neurons

Authors: Yael Hitron, Merav Parter

File

LIPIcs.ESA.2019.57.pdf
  • Filesize: 1.72 MB
  • 17 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ESA.2019.57

Author Details

Yael Hitron
  • Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, Rehovot 76100, Israel
Merav Parter
  • Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, Rehovot 76100, Israel

Acknowledgements

We are grateful to Cameron Musco, Renan Gross and Eylon Yogev for various useful discussions.

Cite As

Yael Hitron and Merav Parter. Counting to Ten with Two Fingers: Compressed Counting with Spiking Neurons. In 27th Annual European Symposium on Algorithms (ESA 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 144, pp. 57:1-57:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019) https://doi.org/10.4230/LIPIcs.ESA.2019.57

Abstract

We consider the task of measuring time with probabilistic threshold gates implemented by bio-inspired spiking neurons. In the spiking neural network model, the network evolves in discrete rounds: in each round, neurons fire in pulses in response to a sufficiently high membrane potential. This potential is induced by spikes from neighboring neurons that fired in the previous round; these spikes can have either an excitatory or an inhibitory effect.
Discovering the underlying mechanisms by which the brain perceives the duration of time is one of the largest open enigmas in computational neuroscience. To gain a better algorithmic understanding of these processes, we introduce the neural timer problem. In this problem, one is given a time parameter t, an input neuron x, and an output neuron y. The goal is to design a minimum-sized neural network (measured by the number of auxiliary neurons) in which every spike from x in a given round i makes the output y fire for the subsequent t consecutive rounds.
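To make the input/output specification concrete, the following plain-Python sketch simulates only the required behavior of a neural timer; it is not a threshold-gate construction, and the conventions that y starts firing in the round after x spikes and that a new input spike restarts the countdown are our assumptions.

    # Sketch of the neural timer specification: after every round in which the
    # input x spikes, the output y must fire for the following t consecutive
    # rounds; a new input spike restarts the countdown (assumed behavior).
    def timer_spec(x_spikes, t):
        """x_spikes: list of 0/1 input values per round; returns y's firing pattern."""
        y = [0] * (len(x_spikes) + t)
        remaining = 0                      # rounds y must still keep firing
        for rnd, spike in enumerate(x_spikes):
            if spike:
                remaining = t              # an input spike (re)starts the countdown
            if remaining > 0:
                y[rnd + 1] = 1             # y fires in the round after the trigger
                remaining -= 1
        return y

    print(timer_spec([1, 0, 0, 0, 0, 0, 1, 0, 0, 0], t=3))
    # -> [0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0]

One natural way for a deterministic network to maintain such a countdown is in binary, which is consistent with the Theta(log t) threshold-gate bound discussed next.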
We first consider a deterministic implementation of a neural timer and show that Theta(log t) (deterministic) threshold gates are both sufficient and necessary. This raises the question of whether randomness can be leveraged to reduce the number of neurons. We answer this question in the affirmative by considering neural timers with spiking neurons, where the neuron y is required to fire for t consecutive rounds with probability at least 1-delta and should stop firing after at most 2t rounds with probability 1-delta, for some input parameter delta in (0,1). Our key result is a construction of a neural timer with O(log log 1/delta) spiking neurons. Interestingly, this construction uses only one spiking neuron, while the remaining neurons can be deterministic threshold gates. We complement this construction with a matching lower bound of Omega(min{log log 1/delta, log t}) neurons. This provides the first separation between deterministic and randomized constructions in the setting of spiking neural networks.
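To illustrate why randomness helps, here is a hedged Monte-Carlo sketch of one way a single random neuron can compress a timer; it is not the paper's construction, and the constants (the factor 20 and the 1.5t centering) are loose choices made only for this example. The random neuron fires each round independently with probability p, the network waits for its k-th spike with k = O(log 1/delta), and a deterministic counter that counts up to k needs only O(log k) = O(log log 1/delta) threshold gates.

    # Illustrative sketch (not the paper's construction): one random neuron fires
    # each round with probability p; waiting for its k-th spike takes about 1.5t
    # rounds in expectation, and counting k = O(log 1/delta) spikes
    # deterministically needs only O(log k) = O(log log 1/delta) gates.
    import math
    import random

    def simulate_randomized_timer(t, delta, trials=2000):
        k = math.ceil(20 * math.log(1.0 / delta))   # loose constant, chosen for the sketch
        p = k / (1.5 * t)                           # per-round firing probability
        in_window = 0
        for _ in range(trials):
            rounds, spikes = 0, 0
            while spikes < k:                       # wait for the k-th spike
                rounds += 1
                if random.random() < p:
                    spikes += 1
            in_window += (t <= rounds <= 2 * t)     # desired timing window [t, 2t]
        return in_window / trials

    print(simulate_randomized_timer(t=1000, delta=0.01))   # typically close to 1

The point of the compression is that the deterministic part of the circuit only has to count up to k, a quantity that depends on delta but not on t.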
Finally, we demonstrate the usefulness of compressed counting networks for synchronizing neural networks. In the spirit of distributed synchronizers [Awerbuch-Peleg, FOCS'90], we provide a general transformation (or simulation) that takes any synchronized network solution and simulates it in an asynchronous setting (where edges have arbitrary response latencies), while incurring a small overhead w.r.t. the number of neurons and the computation time.
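For intuition about what such a simulation must guarantee, here is a hedged discrete-event sketch in the classical distributed-computing style of the Awerbuch-Peleg synchronizers, not the paper's neural transformation: each node advances to logical round r+1 only after the round-r pulses from all of its in-neighbors have arrived, so an algorithm written for lockstep rounds remains correct under arbitrary finite edge latencies. The toy graph and latencies are assumptions made only for this example.

    # Discrete-event sketch of the synchronizer idea: nodes advance in lockstep
    # logical rounds even though pulses travel with arbitrary (finite) latencies.
    import heapq

    def simulate_async(graph, latency, num_rounds):
        """graph: node -> list of out-neighbors; latency: (u, v) -> delay."""
        events = []                                   # (arrival_time, sender, receiver, round)
        current_round = {v: 0 for v in graph}
        received = {v: {} for v in graph}             # node -> round -> set of senders
        in_neighbors = {v: {w for w in graph if v in graph[w]} for v in graph}

        def broadcast(u, rnd, now):
            for v in graph[u]:
                heapq.heappush(events, (now + latency[(u, v)], u, v, rnd))

        for u in graph:                               # every node emits its round-0 pulse
            broadcast(u, 0, 0.0)
        while events:
            now, u, v, rnd = heapq.heappop(events)
            received[v].setdefault(rnd, set()).add(u)
            r = current_round[v]
            # advance v while all round-r pulses from its in-neighbors are in
            while r < num_rounds and received[v].get(r, set()) >= in_neighbors[v]:
                current_round[v] = r + 1
                broadcast(v, r + 1, now)
                r += 1
        return current_round

    graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
    latency = {("a", "b"): 2.0, ("b", "a"): 0.5, ("b", "c"): 3.0, ("c", "b"): 1.0}
    print(simulate_async(graph, latency, num_rounds=3))   # all nodes reach round 3

The contribution of the paper is achieving this kind of round simulation with spiking neurons, using the compressed timers above to keep the overhead in neurons and computation time small.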

Subject Classification

ACM Subject Classification
  • Theory of computation → Design and analysis of algorithms
Keywords
  • stochastic neural networks
  • approximate counting
  • synchronizer

References

  1. Edgar D Adrian. The impulses produced by sensory nerve endings. The Journal of Physiology, 61(1):49-72, 1926.
  2. Melissa J Allman, Sundeep Teki, Timothy D Griffiths, and Warren H Meck. Properties of the internal clock: first- and second-order principles of subjective time. Annual Review of Psychology, 65:743-771, 2014.
  3. Douglas B Armstrong, Arthur D Friedman, and Premachandran R Menon. Design of asynchronous circuits assuming unbounded gate delays. IEEE Transactions on Computers, 100(12):1110-1120, 1969.
  4. Baruch Awerbuch and David Peleg. Network Synchronization with Polylogarithmic Overhead. In 31st Annual Symposium on Foundations of Computer Science, St. Louis, Missouri, USA, October 22-24, 1990, Volume II, pages 514-522, 1990.
  5. Tobias Bjerregaard and Shankar Mahadevan. A survey of research and practices of network-on-chip. ACM Computing Surveys (CSUR), 38(1):1, 2006.
  6. Chi-Ning Chou, Kai-Min Chung, and Chi-Jen Lu. On the Algorithmic Power of Spiking Neural Networks. In 10th Innovations in Theoretical Computer Science Conference, ITCS 2019, January 10-12, 2019, San Diego, California, USA, pages 26:1-26:20, 2019.
  7. RE Lee DeVille and Charles S Peskin. Synchrony and asynchrony in a fully stochastic neural network. Bulletin of Mathematical Biology, 70(6):1608-1633, 2008.
  8. Huawei Fan, Yafeng Wang, Hengtong Wang, Ying-Cheng Lai, and Xingang Wang. Autapses promote synchronization in neuronal networks. Scientific Reports, 8(1):580, 2018.
  9. Gerald T Finnerty, Michael N Shadlen, Mehrdad Jazayeri, Anna C Nobre, and Dean V Buonomano. Time in cortical circuits. Journal of Neuroscience, 35(41):13912-13916, 2015.
  10. Philippe Flajolet. Approximate Counting: A Detailed Analysis. BIT, 25(1):113-134, 1985.
  11. Wulfram Gerstner, Andreas K Kreiter, Henry Markram, and Andreas VM Herz. Neural codes: firing rates and beyond. Proceedings of the National Academy of Sciences, 94(24):12740-12741, 1997.
  12. Scott Hauck. Asynchronous design methodologies: An overview. Proceedings of the IEEE, 83(1):69-93, 1995.
  13. Kaori Ikeda and John M Bekkers. Autapses. Current Biology, 16(9):R308, 2006.
  14. Fabian Kuhn, Joel Spencer, Konstantinos Panagiotou, and Angelika Steger. Synchrony and asynchrony in neural networks. In Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms, pages 949-964. SIAM, 2010.
  15. Robert A. Legenstein, Wolfgang Maass, Christos H. Papadimitriou, and Santosh Srinivas Vempala. Long Term Memory and the Densest K-Subgraph Problem. In 9th Innovations in Theoretical Computer Science Conference, ITCS 2018, January 11-14, 2018, Cambridge, MA, USA, pages 57:1-57:15, 2018.
  16. Benjamin Lindner. Some unsolved problems relating to noise in biological systems. Journal of Statistical Mechanics: Theory and Experiment, 2009(01):P01008, 2009.
  17. Nancy Lynch and Cameron Musco. A Basic Compositional Model for Spiking Neural Networks. arXiv preprint, 2018. URL: http://arxiv.org/abs/1808.03884.
  18. Nancy Lynch, Cameron Musco, and Merav Parter. Computational Tradeoffs in Biological Neural Networks: Self-Stabilizing Winner-Take-All Networks. In 8th Innovations in Theoretical Computer Science Conference, ITCS 2017, 2017.
  19. Nancy Lynch, Cameron Musco, and Merav Parter. Spiking Neural Networks: An Algorithmic Perspective. In 5th Workshop on Biological Distributed Algorithms (BDA 2017), July 2017.
  20. Nancy A. Lynch, Cameron Musco, and Merav Parter. Neuro-RAM Unit with Applications to Similarity Testing and Compression in Spiking Neural Networks. In 31st International Symposium on Distributed Computing, DISC 2017, October 16-20, 2017, Vienna, Austria, pages 33:1-33:16, 2017.
  21. Jun Ma, Xinlin Song, Wuyin Jin, and Chuni Wang. Autapse-induced synchronization in a coupled neuronal network. Chaos, Solitons & Fractals, 80:31-38, 2015.
  22. Wolfgang Maass. Lower Bounds for the Computational Power of Networks of Spiking Neurons. Electronic Colloquium on Computational Complexity (ECCC), 1(19), 1994. URL: http://eccc.hpi-web.de/eccc-reports/1994/TR94-019/index.html.
  23. Wolfgang Maass. On the computational power of noisy spiking neurons. In Advances in Neural Information Processing Systems (NIPS 1995), 1996.
  24. Wolfgang Maass. Networks of spiking neurons: the third generation of neural network models. Neural Networks, 10(9):1659-1671, 1997.
  25. Rajit Manohar and Yoram Moses. The eventual C-element theorem for delay-insensitive asynchronous circuits. In 2017 23rd IEEE International Symposium on Asynchronous Circuits and Systems (ASYNC), pages 102-109. IEEE, 2017.
  26. Hugo Merchant, Deborah L Harrington, and Warren H Meck. Neural basis of the perception and estimation of time. Annual Review of Neuroscience, 36:313-336, 2013.
  27. Robert Morris. Counting large numbers of events in small registers. Communications of the ACM, 21(10):840-842, 1978.
  28. Christos H Papadimitriou and Santosh S Vempala. Random projection in the brain and computation with assemblies of neurons. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2018.
  29. Jens Sparsø. Asynchronous circuit design - a tutorial. Chapters 1-8 in "Principles of Asynchronous Circuit Design - A Systems Perspective". Kluwer Academic Publishers, 2001.
  30. Misha V Tsodyks and Henry Markram. The neural code between neocortical pyramidal neurons depends on neurotransmitter release probability. Proceedings of the National Academy of Sciences, 94(2):719-723, 1997.
  31. Barbeeba Wang and Nancy Lynch. Integrating Temporal Information to Spatial Information in a Neural Circuit. arXiv preprint, 2019. URL: http://arxiv.org/abs/1903.01217.