Abstract
We study input compression in a biologically inspired model of neural computation. We demonstrate that a network consisting of a random projection step (implemented via random synaptic connectivity) followed by a sparsification step (implemented via winner-take-all competition) can reduce well-separated high-dimensional input vectors to well-separated low-dimensional vectors. By augmenting our network with a third module, we can efficiently map each input (along with any small perturbations of the input) to a unique representative neuron, solving a neural clustering problem.
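For intuition, here is a minimal sketch of the two-step compression described above, written in plain NumPy rather than the paper's spiking-network model: a random 0/1 projection followed by winner-take-all (top-k) sparsification. The dimensions n, m, k and the connection density below are illustrative assumptions, not parameters from the paper.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 10_000, 200, 5          # input dim, projection dim, number of WTA winners (illustrative)

# Random synaptic connectivity: each projection neuron samples input coordinates at random.
W = rng.random((m, n)) < 0.05      # sparse 0/1 connection matrix

def compress(x):
    """Project a binary input pattern, then keep only the k most active neurons."""
    activation = W @ x                         # random projection step
    winners = np.argsort(activation)[-k:]      # winner-take-all competition
    y = np.zeros(m, dtype=int)
    y[winners] = 1                             # sparse low-dimensional code
    return y

x = (rng.random(n) < 0.1).astype(int)          # a random binary input pattern
x_near = x.copy(); x_near[:20] ^= 1            # a small perturbation of x
x_far = (rng.random(n) < 0.1).astype(int)      # an unrelated input pattern

c, c_near, c_far = compress(x), compress(x_near), compress(x_far)
print((c & c_near).sum(), (c & c_far).sum())   # near inputs share most winners; unrelated ones share few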
Both the size of our network and its processing time, i.e., the time it takes the network to compute the compressed output given a presented input, are independent of the (potentially large) dimension of the input patterns and depend only on the number of distinct inputs that the network must encode and the pairwise relative Hamming distance between these inputs. The first two steps of our construction mirror known biological networks, for example, in the fruit fly olfactory system [Caron et al., 2013; Lin et al., 2014; Dasgupta et al., 2017]. Our analysis helps provide a theoretical understanding of these networks and lay a foundation for how random compression and input memorization may be implemented in biological neural networks.
A key technical contribution of our network design is the implementation of a short-term memory. Our network can be given a desired memory time t_m as an input parameter and satisfies the following with high probability: any pattern presented several times within a time window of t_m rounds will be mapped to a single representative output neuron. However, a pattern not presented for c⋅t_m rounds for some constant c > 1 will be "forgotten", and its representative output neuron will be released to accommodate newly introduced patterns.
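To make the memory guarantee concrete, the following is a minimal sketch of these semantics only, written as a plain lookup table rather than the paper's spiking-neuron construction; the window t_m = 100, the constant c = 2, and the output-pool size are illustrative assumptions.

T_M, C = 100, 2                               # memory window t_m and forgetting constant c (illustrative)

class ShortTermMemory:
    def __init__(self, num_outputs):
        self.free = set(range(num_outputs))   # currently unassigned output neurons
        self.table = {}                       # pattern -> (representative neuron, last round seen)

    def present(self, pattern, round_num):
        # Release neurons whose pattern has not been presented for c * t_m rounds.
        for p, (neuron, last) in list(self.table.items()):
            if round_num - last >= C * T_M:
                self.free.add(neuron)
                del self.table[p]
        # Re-use the existing representative, or assign a fresh neuron from the pool.
        if pattern in self.table:
            neuron, _ = self.table[pattern]
        else:
            neuron = self.free.pop()
        self.table[pattern] = (neuron, round_num)
        return neuron

mem = ShortTermMemory(num_outputs=16)
a = mem.present("pattern-A", round_num=0)
assert mem.present("pattern-A", round_num=50) == a    # within t_m: same representative neuron
mem.present("pattern-B", round_num=500)               # after c * t_m rounds: pattern-A is forgotten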
BibTeX Entry
@InProceedings{hitron_et_al:LIPIcs:2020:11708,
author = {Yael Hitron and Nancy Lynch and Cameron Musco and Merav Parter},
title = {{Random Sketching, Clustering, and Short-Term Memory in Spiking Neural Networks}},
booktitle = {11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
pages = {23:1--23:31},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-134-4},
ISSN = {1868-8969},
year = {2020},
volume = {151},
editor = {Thomas Vidick},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2020/11708},
URN = {urn:nbn:de:0030-drops-117087},
doi = {10.4230/LIPIcs.ITCS.2020.23},
annote = {Keywords: biological distributed computing, spiking neural networks, compressed sensing, clustering, random projection, dimensionality reduction, winner-take-all}
}
Keywords: biological distributed computing, spiking neural networks, compressed sensing, clustering, random projection, dimensionality reduction, winner-take-all
Seminar: 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)
Issue Date: 2020
Date of publication: 10.01.2020