On the Algorithmic Power of Spiking Neural Networks

Authors: Chi-Ning Chou, Kai-Min Chung, Chi-Jen Lu




File
  • LIPIcs.ITCS.2019.26.pdf
  • Filesize: 1.49 MB
  • 20 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ITCS.2019.26

Author Details

Chi-Ning Chou
  • School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, USA
Kai-Min Chung
  • Institute of Information Science, Academia Sinica, Taipei, Taiwan
Chi-Jen Lu
  • Institute of Information Science, Academia Sinica, Taipei, Taiwan

Cite As

Chi-Ning Chou, Kai-Min Chung, and Chi-Jen Lu. On the Algorithmic Power of Spiking Neural Networks. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019) https://doi.org/10.4230/LIPIcs.ITCS.2019.26

Abstract

Spiking Neural Networks (SNNs) are mathematical models used in neuroscience to describe the dynamics of a set of neurons that interact with each other by firing instantaneous signals, known as spikes. Interestingly, a recent advance in neuroscience [Barrett-Denève-Machens, NIPS 2013] showed that the neurons' firing rate, i.e., the average number of spikes fired per unit of time, can be characterized by the optimal solution of a quadratic program defined by the parameters of the dynamics. This suggests that SNNs potentially have the computational power to solve non-trivial quadratic programs. However, the observation was justified only empirically, without rigorous analysis.
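
To make the model concrete, below is a minimal simulation sketch of a simplified integrate-and-fire network of the kind studied here. The explicit Euler discretization, the signed-spike rule, and all parameter values (the matrix A, input b, threshold eta, step size dt, horizon T) are our own illustrative choices rather than the paper's; the final printout checks that the empirical firing rate approximately satisfies the first-order optimality condition A^T(Ax - b) = 0 of the associated quadratic program.

```python
import numpy as np

# Sketch of a simplified integrate-and-fire SNN (illustrative parameters, not
# the paper's exact setup). Each neuron integrates a constant input current
# I = A^T b minus recurrent inhibition C s(t) with C = A^T A, and emits a
# signed spike whenever its membrane potential crosses the threshold +-eta.

rng = np.random.default_rng(0)
m, n = 5, 12                              # underdetermined: many solutions
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)            # unit-norm columns, so C_ii = 1
x_sparse = np.zeros(n); x_sparse[[1, 4]] = [1.0, 0.5]
b = A @ x_sparse                          # a feasible, sparse target

I_ext, C = A.T @ b, A.T @ A
eta, dt, T = 5.0, 1e-3, 500_000           # threshold, step size, #steps
u = np.zeros(n)                           # membrane potentials
spikes = np.zeros(n)                      # signed cumulative spike counts

for _ in range(T):
    s = np.sign(u) * (np.abs(u) > eta)    # spikes fired in this step
    spikes += s
    u += I_ext * dt - C @ s               # integrate input, subtract feedback

rate = spikes / (T * dt)                  # empirical firing rate
# If the rate solves the quadratic program, this residual is small:
print("||A^T (A rate - b)||_inf:", np.abs(C @ rate - I_ext).max())
print("||A rate - b||_2:        ", np.linalg.norm(A @ rate - b))
```
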
We put this in the context of natural algorithms and investigate the algorithmic power of SNNs, emphasizing a rigorous asymptotic analysis of their performance in solving optimization problems. To enable a theoretical study, we first identify a simplified SNN model that is tractable for analysis. Next, we confirm the empirical observation of Barrett et al. by giving an upper bound on the convergence rate of SNNs in solving the quadratic program. Further, we observe that when there are infinitely many optimal solutions, SNNs tend to converge to the one with the smallest l_1 norm. We substantiate this observation by showing that SNNs can solve the l_1 minimization problem under certain regularity conditions.
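
As a reference point for the minimum-l_1 claim, the problem min ||x||_1 subject to Ax = b (basis pursuit) can be solved directly as a linear program by writing x = p - q with p, q >= 0. The SciPy reformulation below is the standard one, not code from the paper; with the same random seed and setup as the sketch above, its optimum can be compared against the SNN firing rate computed there.

```python
import numpy as np
from scipy.optimize import linprog

# Reference minimum-l_1 solution via the standard LP reformulation of basis
# pursuit: min 1^T(p + q) s.t. A(p - q) = b, p, q >= 0 (illustrative setup).

rng = np.random.default_rng(0)
m, n = 5, 12
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)
x_sparse = np.zeros(n); x_sparse[[1, 4]] = [1.0, 0.5]
b = A @ x_sparse

c = np.ones(2 * n)                        # objective = sum(p) + sum(q) = ||x||_1
A_eq = np.hstack([A, -A])                 # encodes A p - A q = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_l1 = res.x[:n] - res.x[n:]

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]   # min-l_2 solution, for contrast
print("l_1 norm, basis pursuit :", np.abs(x_l1).sum())
print("l_1 norm, least squares :", np.abs(x_ls).sum())
```
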
Our main technical insight is a dual view of the SNN dynamics, under which an SNN can be viewed as a new natural primal-dual algorithm for the l_1 minimization problem. We believe that the dual view is of independent interest and may find interesting interpretations in neuroscience.
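
The sketch below illustrates one way to read this dual view; it reflects our interpretation of the abstract, with illustrative names and scalings. Tracking an m-dimensional state v(t) with dv/dt = b - A s(t) makes the membrane potentials exactly u(t) = A^T v(t), so the time-averaged, threshold-normalized v behaves like a feasible point of the LP dual max b^T y subject to ||A^T y||_inf <= 1, while the firing rate plays the primal role; weak duality then brackets the optimal l_1 value.

```python
import numpy as np

# Dual-view sketch (our reading, illustrative parameters): run the same
# dynamics as before, but track v(t) with dv/dt = b - A s(t) and recover the
# potentials as u(t) = A^T v(t). The averaged, eta-normalized v is then an
# (approximately) feasible dual point whose objective lower-bounds ||rate||_1.

rng = np.random.default_rng(0)
m, n = 5, 12
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)
x_sparse = np.zeros(n); x_sparse[[1, 4]] = [1.0, 0.5]
b = A @ x_sparse

eta, dt, T = 5.0, 1e-3, 500_000
v = np.zeros(m)                           # m-dimensional dual state
spikes = np.zeros(n)
v_avg = np.zeros(m)                       # running time-average of v

for _ in range(T):
    u = A.T @ v                           # potentials are exactly A^T v
    s = np.sign(u) * (np.abs(u) > eta)
    spikes += s
    v += b * dt - A @ s                   # dual dynamics
    v_avg += v / T

y = v_avg / eta                           # candidate dual solution
rate = spikes / (T * dt)
print("dual feasibility ||A^T y||_inf:", np.abs(A.T @ y).max())   # ~<= 1
print("primal value ||rate||_1      :", np.abs(rate).sum())
print("dual value b^T y             :", b @ y)   # weak duality: <= primal
```
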

Subject Classification

ACM Subject Classification
  • Theory of computation → Models of computation
Keywords
  • Spiking Neural Networks
  • Natural Algorithms
  • l_1 Minimization

References

  1. Moshe Abeles. Corticonics: Neural circuits of the cerebral cortex. Cambridge University Press, 1991.
  2. Christina Allen and Charles F. Stevens. An evaluation of causes for unreliability of synaptic transmission. Proceedings of the National Academy of Sciences, 91(22):10380-10383, 1994.
  3. Arunava Banerjee. Learning Precise Spike Train-to-Spike Train Transformations in Multilayer Feedforward Neuronal Networks. Neural Computation, 28(5):826-848, 2016.
  4. David G. Barrett, Sophie Denève, and Christian K. Machens. Firing rate predictions in optimal balanced networks. In Advances in Neural Information Processing Systems, pages 1538-1546, 2013.
  5. Amir Beck and Marc Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183-202, 2009.
  6. Yoshua Bengio, Thomas Mesnard, Asja Fischer, Saizheng Zhang, and Yuhuai Wu. STDP-compatible approximation of backpropagation in an energy-based model. Neural Computation, 29(3):555-577, 2017.
  7. William Bialek, Fred Rieke, R. R. de Ruyter van Steveninck, and David Warland. Reading a neural code. Science, 252(5014):1854-1857, 1991.
  8. Jonathan Binas, Giacomo Indiveri, and Michael Pfeiffer. Spiking Analog VLSI Neuron Assemblies as Constraint Satisfaction Problem Solvers. arXiv preprint arXiv:1511.00540, 2015.
  9. Anmol Biswas, Sidharth Prasad, Sandip Lashkare, and Udayan Ganguly. A simple and efficient SNN and its performance & robustness evaluation method to enable hardware implementation. arXiv preprint arXiv:1612.02233, 2016.
  10. Vincenzo Bonifaci, Kurt Mehlhorn, and Girish Varma. Physarum can compute shortest paths. Journal of Theoretical Biology, 309:121-133, 2012.
  11. J. Frédéric Bonnans and Alexander Shapiro. Optimization problems with perturbations: A guided tour. SIAM Review, 40(2):228-264, 1998.
  12. Olaf Booij and Hieu Tat Nguyen. A gradient descent rule for spiking neurons emitting multiple spikes. Information Processing Letters, 95(6):552-558, 2005.
  13. Stephen Boyd and Lieven Vandenberghe. Convex optimization. Cambridge University Press, 2004.
  14. Nicolas Brunel and Peter E. Latham. Firing rate of the noisy quadratic integrate-and-fire neuron. Neural Computation, 15(10):2281-2306, 2003.
  15. Lars Buesing, Johannes Bill, Bernhard Nessler, and Wolfgang Maass. Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons. PLoS Computational Biology, 7(11):e1002211, 2011.
  16. Bernard Chazelle. Natural algorithms. In Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 422-431. Society for Industrial and Applied Mathematics, 2009.
  17. Bernard Chazelle. Natural algorithms and influence systems. Communications of the ACM, 55(12):101-110, 2012.
  18. Scott Shaobing Chen, David L. Donoho, and Michael A. Saunders. Atomic decomposition by basis pursuit. SIAM Review, 43(1):129-159, 2001.
  19. Peter U. Diehl and Matthew Cook. Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Frontiers in Computational Neuroscience, 9:99, 2015.
  20. A. Aldo Faisal, Luc P. J. Selen, and Daniel M. Wolpert. Noise in the nervous system. Nature Reviews Neuroscience, 9(4):292-303, 2008.
  21. Richard FitzHugh. Impulses and physiological states in theoretical models of nerve membrane. Biophysical Journal, 1(6):445-466, 1961.
  22. Nicolas Fourcaud-Trocmé, David Hansel, Carl van Vreeswijk, and Nicolas Brunel. How spike generation mechanisms determine the neuronal response to fluctuating inputs. Journal of Neuroscience, 23(37):11628-11640, 2003.
  23. Wulfram Gerstner. Time structure of the activity in neural network models. Physical Review E, 51(1):738, 1995.
  24. Tim Gollisch and Markus Meister. Rapid neural coding in the retina with relative spike latencies. Science, 319(5866):1108-1111, 2008.
  25. Walter Heiligenberg. Neural nets in electric fish. MIT Press, Cambridge, MA, 1991.
  26. James L. Hindmarsh and R. M. Rose. A model of neuronal bursting using three coupled first order differential equations. Proc. R. Soc. Lond. B, 221(1222):87-102, 1984.
  27. Alan L. Hodgkin and Andrew F. Huxley. A quantitative description of membrane current and its application to conduction and excitation in nerve. The Journal of Physiology, 117(4):500, 1952.
  28. John J. Hopfield. Pattern recognition computation using action potential timing for stimulus representation. Nature, 376(6535):33, 1995.
  29. Eugene M. Izhikevich. Simple model of spiking neurons. IEEE Transactions on Neural Networks, 14(6):1569-1572, 2003.
  30. Zeno Jonke, Stefan Habenschuss, and Wolfgang Maass. A theoretical basis for efficient computations with noisy spiking neurons. arXiv preprint arXiv:1412.5862, 2014.
  31. Zeno Jonke, Stefan Habenschuss, and Wolfgang Maass. Solving constraint satisfaction problems with networks of spiking neurons. Frontiers in Neuroscience, 10, 2016.
  32. Saeed Reza Kheradpisheh, Mohammad Ganjtabesh, and Timothée Masquelier. Bio-inspired unsupervised learning of visual features leads to robust invariant object recognition. Neurocomputing, 205:382-392, 2016.
  33. Werner M. Kistler, Wulfram Gerstner, and J. Leo van Hemmen. Reduction of the Hodgkin-Huxley equations to a single-variable threshold model. Neural Computation, 9(5):1015-1045, 1997.
  34. Nobuyuki Kuwabara and Nobuo Suga. Delay lines and amplitude selectivity are created in subthalamic auditory nuclei: the brachium of the inferior colliculus of the mustached bat. Journal of Neurophysiology, 69(5):1713-1724, 1993.
  35. Louis Lapicque. Recherches quantitatives sur l'excitation électrique des nerfs traitée comme une polarisation. J. Physiol. Pathol. Gen., 9(1):620-635, 1907.
  36. Tsung-Han Lin and Ping Tak Peter Tang. Dictionary Learning by Dynamical Neural Networks. arXiv preprint arXiv:1805.08952, 2018.
  37. Adi Livnat and Christos Papadimitriou. Sex as an algorithm: the theory of evolution under the lens of computation. Communications of the ACM, 59(11):84-93, 2016.
  38. Adi Livnat, Christos Papadimitriou, Aviad Rubinstein, Gregory Valiant, and Andrew Wan. Satisfiability and evolution. In 55th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 524-530. IEEE, 2014.
  39. Nancy Lynch and Cameron Musco. A Basic Compositional Model for Spiking Neural Networks. arXiv preprint arXiv:1808.03884, 2018.
  40. Nancy Lynch, Cameron Musco, and Merav Parter. Spiking Neural Networks: An Algorithmic Perspective. In Workshop on Biological Distributed Algorithms (BDA), Washington, DC, USA, 2017.
  41. Nancy A. Lynch, Cameron Musco, and Merav Parter. Computational Tradeoffs in Biological Neural Networks: Self-Stabilizing Winner-Take-All Networks. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017), pages 15:1-15:44, 2017. URL: http://dx.doi.org/10.4230/LIPIcs.ITCS.2017.15.
  42. Nancy A. Lynch, Cameron Musco, and Merav Parter. Neuro-RAM Unit with Applications to Similarity Testing and Compression in Spiking Neural Networks. In 31st International Symposium on Distributed Computing (DISC 2017), pages 33:1-33:16, 2017. URL: http://dx.doi.org/10.4230/LIPIcs.DISC.2017.33.
  43. Wolfgang Maass. Lower bounds for the computational power of networks of spiking neurons. Neural Computation, 8(1):1-40, 1996.
  44. Wolfgang Maass. Fast sigmoidal networks via spiking neurons. Neural Computation, 9(2):279-304, 1997.
  45. Wolfgang Maass. Networks of spiking neurons: the third generation of neural network models. Neural Networks, 10(9):1659-1671, 1997.
  46. Wolfgang Maass. Computing with spiking neurons. In Pulsed Neural Networks, 85, 1999.
  47. Wolfgang Maass. To Spike or Not to Spike: That Is the Question. Proceedings of the IEEE, 103(12):2219-2224, 2015.
  48. Wolfgang Maass and Christopher M. Bishop. Pulsed neural networks. MIT Press, 2001.
  49. Catherine Morris and Harold Lecar. Voltage oscillations in the barnacle giant muscle fiber. Biophysical Journal, 35(1):193-213, 1981.
  50. Hesham Mostafa, Lorenz K. Müller, and Giacomo Indiveri. An event-based architecture for solving constraint satisfaction problems. Nature Communications, 6, 2015.
  51. Toshiyuki Nakagaki, Hiroyasu Yamada, and Ágota Tóth. Intelligence: Maze-solving by an amoeboid organism. Nature, 407(6803):470, 2000.
  52. Bruno A. Olshausen and David J. Field. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature, 381(6583):607, 1996.
  53. Hélène Paugam-Moisy and Sander Bohte. Computing with spiking neuron networks. In Handbook of Natural Computing, pages 335-376. Springer, 2012.
  54. Fred Rieke and David Warland. Spikes: exploring the neural code. MIT Press, 1999.
  55. Christopher J. Rozell, Don H. Johnson, Richard G. Baraniuk, and Bruno A. Olshausen. Sparse coding via thresholding and local competition in neural circuits. Neural Computation, 20(10):2526-2563, 2008.
  56. Rufin Van Rullen and Simon J. Thorpe. Rate coding versus temporal order coding: what the retinal ganglion cells tell the visual cortex. Neural Computation, 13(6):1255-1283, 2001.
  57. Michael N. Shadlen and William T. Newsome. Noise, neural codes and cortical organization. Current Opinion in Neurobiology, 4(4):569-579, 1994.
  58. Samuel Shapero, Christopher Rozell, and Paul Hasler. Configurable hardware integrate and fire neurons for sparse approximation. Neural Networks, 45:134-143, 2013.
  59. Samuel Shapero, Mengchen Zhu, Jennifer Hasler, and Christopher Rozell. Optimal sparse approximation with integrate and fire neurons. International Journal of Neural Systems, 24(05):1440001, 2014.
  60. Sumit Bam Shrestha and Qing Song. Robust learning in SpikeProp. Neural Networks, 86:54-68, 2017.
  61. Richard B. Stein. A theoretical analysis of neuronal variability. Biophysical Journal, 5(2):173, 1965.
  62. Ping Tak Peter Tang. Convergence of LCA Flows to (C)LASSO Solutions. arXiv preprint arXiv:1603.01644, 2016.
  63. Ping Tak Peter Tang, Tsung-Han Lin, and Mike Davies. Sparse Coding by Spiking Neural Networks: Convergence Theory and Computational Results. arXiv preprint arXiv:1705.05475, 2017.
  64. Wondimu Teka, Toma M. Marinov, and Fidel Santamaria. Neuronal spike timing adaptation described with a fractional leaky integrate-and-fire model. PLoS Computational Biology, 10(3):e1003526, 2014.
  65. Atsushi Tero, Ryo Kobayashi, and Toshiyuki Nakagaki. A mathematical model for adaptive transport network in path finding by true slime mold. Journal of Theoretical Biology, 244(4):553-564, 2007.
  66. Simon Thorpe, Arnaud Delorme, and Rufin Van Rullen. Spike-based strategies for rapid processing. Neural Networks, 14(6-7):715-725, 2001.
  67. Simon Thorpe, Denis Fize, and Catherine Marlot. Speed of processing in the human visual system. Nature, 381(6582):520, 1996.
  68. Joel Zylberberg, Jason Timothy Murphy, and Michael Robert DeWeese. A sparse coding model with synaptically local plasticity and spiking neurons can account for the diverse shapes of V1 simple cell receptive fields. PLoS Computational Biology, 7(10):e1002250, 2011.