When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ISAAC.2017.29
URN: urn:nbn:de:0030-drops-82726
URL: http://drops.dagstuhl.de/opus/volltexte/2017/8272/

### Agnostically Learning Boolean Functions with Finite Polynomial Representation


### Abstract

Agnostic learning is an extremely hard task in computational learning theory. In this paper we revisit the results of [Kalai et al., SIAM J. Comput. 2008] on agnostically learning Boolean functions that admit a finite polynomial representation, as well as those that can be approximated by such functions. An example of the former is the class of all Boolean low-degree polynomials. For the former, [Kalai et al., SIAM J. Comput. 2008] introduced the l_1-polynomial regression method, which learns them to error opt+epsilon. We present a simple instantiation of one step in the method and give the corresponding analysis. Moreover, we show that even omitting this step still yields a learning result with error 2opt+epsilon. We then apply the result for concept classes approximable by such functions to learn richer specific classes. Our result is that the class of s-term DNF formulae can be agnostically learned to error opt+epsilon with respect to arbitrary distributions, for any epsilon, in time poly(n^d, 1/epsilon), where d = O(\sqrt{n} \cdot s \cdot \log s \cdot \log^2(1/epsilon)).

### BibTeX - Entry

@InProceedings{ding:LIPIcs:2017:8272,
author =	{Ning Ding},
title =	{{Agnostically Learning Boolean Functions with Finite Polynomial Representation}},
booktitle =	{28th International Symposium on Algorithms and Computation (ISAAC 2017)},
pages =	{29:1--29:11},
series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN =	{978-3-95977-054-5},
ISSN =	{1868-8969},
year =	{2017},
volume =	{92},
editor =	{Yoshio Okamoto and Takeshi Tokuyama},
publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik}
}