An intriguing geometric primitive, "expand-and-sparsify", has been found in the olfactory system of the fly and several other organisms. It maps an input vector to a much higher-dimensional sparse representation, using a random linear transformation followed by winner-take-all thresholding. I will show that this representation has a variety of formal properties, such as locality preservation, that make it an attractive data structure for algorithms and machine learning. In particular, mimicking the fly's circuitry yields algorithms for similarity search and for novelty detection that have provable guarantees and practical performance competitive with state-of-the-art methods. This talk is based on work with Saket Navlakha (Salk Institute), Chuck Stevens (Salk Institute), and Chris Tosh (Columbia).
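As a rough illustration of the primitive described above, the following is a minimal NumPy sketch of expand-and-sparsify: a random projection into a much larger space followed by winner-take-all thresholding that keeps only the top-k units. The expansion factor, the sparsity level k, and the dense Gaussian projection matrix are illustrative assumptions, not the specific choices made in the talk or in the fly's circuit.

```python
import numpy as np

def expand_and_sparsify(x, expansion=20, k=32, seed=0):
    """Map an input vector to a high-dimensional sparse binary code.

    Two steps, as described in the abstract:
      1. a random linear transformation into a much larger space,
      2. winner-take-all thresholding that keeps only the top-k units.

    The expansion factor, k, and the Gaussian projection are
    illustrative choices, not the talk's exact parameters.
    """
    d = x.shape[0]
    m = expansion * d                     # expanded dimensionality
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((m, d))       # random projection matrix
    y = W @ x                             # expand
    code = np.zeros(m, dtype=np.uint8)
    code[np.argsort(y)[-k:]] = 1          # winner-take-all: keep the top-k
    return code

# Example: nearby inputs tend to share active units (locality preservation).
x1 = np.random.default_rng(1).standard_normal(50)
x2 = x1 + 0.05 * np.random.default_rng(2).standard_normal(50)
h1, h2 = expand_and_sparsify(x1), expand_and_sparsify(x2)
print("shared active units:", int((h1 & h2).sum()), "of", int(h1.sum()))
```

The overlap count at the end hints at why such codes support similarity search: inputs that are close tend to activate many of the same units, while dissimilar inputs rarely do.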