Olivier Bournez, Johanne Cohen, Adrian Wurm
Creative Commons Attribution 4.0 International license
We show the existence of a fixed recurrent network capable of approximating any computable function with arbitrary precision, provided that an encoding of the function is given as part of the initial input. While uniform approximation over a compact domain is a well-known property of neural networks, we go further by proving that our network ensures effective uniform approximation, simultaneously guaranteeing:
- Uniform approximation in the sup-norm sense, guaranteeing precision across the compact domain [0,1]^d;
- Uniformity in the sense of computability theory (also referred to as effectivity or universality), meaning that the same network works for all computable functions.

Our result is obtained constructively, using original arguments. Moreover, our construction bridges computability theory with neural network approximation, providing new insights into the fundamental connections between circuit complexity and function representation.
Furthermore, this connection extends beyond computability to complexity theory. The obtained network is efficient: if a function is computable or approximable in polynomial time in the Turing machine model, then the network requires only a polynomial number of recurrences (iterations) to achieve the same level of approximation, and conversely. Moreover, the recurrent network can be assumed to be very narrow, strengthening the link between our results and existing models of very deep learning, where uniform approximation properties have already been established.
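The sup-norm (uniform) approximation property mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's recurrent construction: it is a classical one-hidden-layer ReLU network realizing the piecewise-linear interpolant of a continuous function on a grid in [0,1], with the sup-norm error estimated by dense sampling; all names below are illustrative.

```python
# Illustrative sketch (not the paper's construction): uniform sup-norm
# approximation on [0,1] by a one-hidden-layer ReLU network that computes
# the piecewise-linear interpolant of f on a uniform grid of width 1/n.
import math

def relu(x):
    return max(0.0, x)

def relu_interpolant(f, n):
    """Width-n ReLU network matching f at the grid points k/n, k = 0..n."""
    xs = [k / n for k in range(n + 1)]
    ys = [f(x) for x in xs]
    # slope of the interpolant on each grid cell [x_k, x_{k+1}]
    slopes = [(ys[k + 1] - ys[k]) * n for k in range(n)]
    # hidden-unit coefficients: c_0 = slope_0, c_k = slope_k - slope_{k-1},
    # so the sum of coefficients up to cell j equals slope_j
    coeffs = [slopes[0]] + [slopes[k] - slopes[k - 1] for k in range(1, n)]
    def net(x):
        return ys[0] + sum(c * relu(x - xk) for c, xk in zip(coeffs, xs))
    return net

def sup_error(f, g, samples=1000):
    """Estimate the sup-norm distance ||f - g|| on [0,1] by dense sampling."""
    return max(abs(f(k / samples) - g(k / samples)) for k in range(samples + 1))

f = lambda x: math.sin(3 * x)
for n in (4, 16, 64):
    print(n, sup_error(f, relu_interpolant(f, n)))
```

Standard interpolation bounds give a sup-norm error of order 1/n^2 for a twice-differentiable target, so the printed errors shrink as the width grows; the paper's contribution is the much stronger claim that a single *fixed* recurrent network achieves such guarantees for every computable function, given its encoding as input.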
@InProceedings{bournez_et_al:LIPIcs.MFCS.2025.29,
author = {Bournez, Olivier and Cohen, Johanne and Wurm, Adrian},
title = {{A Universal Uniform Approximation Theorem for Neural Networks}},
booktitle = {50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
pages = {29:1--29:20},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-388-1},
ISSN = {1868-8969},
year = {2025},
volume = {345},
editor = {Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.29},
URN = {urn:nbn:de:0030-drops-241365},
doi = {10.4230/LIPIcs.MFCS.2025.29},
annote = {Keywords: Models of computation, Complexity theory, Formal neural networks}
}