Molecular programming - a paradigm in which molecules are engineered to perform computation - shows great potential for applications in nanotechnology, disease diagnostics, and smart therapeutics. A key challenge is to identify systematic approaches for compiling abstract models of computation to molecules. Because of their wide applicability, neural networks are among the most useful abstractions to realize. In prior work, real-valued weights were achieved by individually controlling the concentrations of the corresponding "weight" molecules. However, large-scale preparation of reactants with precise concentrations quickly becomes intractable. Here, we propose to bypass this fundamental problem using Binarized Neural Networks (BNNs), a model that is highly scalable in a molecular setting because it requires only a small number of distinct weight values. We devise a noise-tolerant digital molecular circuit that compactly implements a majority-voting operation on binary-valued inputs to compute the neuron output. The network is also rate-independent: the speed at which individual reactions occur does not affect the computed result, further increasing robustness to noise. We first demonstrate our design on the MNIST classification task by simulating the system as idealized chemical reactions. Next, we map the reactions to DNA strand displacement cascades and provide simulation results that demonstrate the practical feasibility of our approach. We perform extensive noise-tolerance simulations, showing that digital molecular neurons are notably more robust to noise in the concentrations of chemical reactants than their analog counterparts. Finally, we provide initial experimental results for a single binarized neuron. Our work lays the groundwork for building more complex neural network computation.
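As a point of reference for the neuron computation described in the abstract, the sketch below shows a conventional software analogue of a binarized neuron: binary (+1/-1) inputs are combined with binary weights and the output is the sign of the majority vote over the element-wise products. This is an illustrative assumption of the standard BNN neuron model, not the paper's molecular circuit; the function name and example values are hypothetical.

```python
import numpy as np

def binarized_neuron(inputs: np.ndarray, weights: np.ndarray) -> int:
    """Software analogue of a binarized neuron (illustrative sketch).

    inputs, weights: arrays of +1/-1 values.
    Each product is +1 when input and weight agree and -1 otherwise;
    the output is the majority vote (sign of the sum), with ties mapped to +1.
    """
    products = inputs * weights
    return 1 if products.sum() >= 0 else -1

# Example: three of the five input/weight pairs agree, so the majority vote is +1.
x = np.array([+1, -1, +1, +1, -1])
w = np.array([+1, +1, +1, -1, -1])
print(binarized_neuron(x, w))  # -> 1
```

In a molecular realization, each +1/-1 product would correspond to a distinct chemical species, and the majority vote would be carried out by reactions that consume opposing species in pairs, so only the dominant signal survives; the code above is merely the digital reference computation.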