eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2017-12-07
29:1
29:11
10.4230/LIPIcs.ISAAC.2017.29
article
Agnostically Learning Boolean Functions with Finite Polynomial Representation
Ding, Ning
Agnostic learning is an extremely hard task in computational learning theory. In this paper we revisit the results of [Kalai et al. SIAM J. Comput. 2008] on agnostically learning Boolean functions that admit a finite polynomial representation, as well as those that can be approximated by such functions. An example of the former is the class of all Boolean low-degree polynomials. For this class, [Kalai et al. SIAM J. Comput. 2008] introduced the l_1-polynomial regression method, which learns it to error opt+epsilon. We present a simple instantiation of one step in the method and give the corresponding analysis. Moreover, we show that even omitting this step still yields a learning result with error 2opt+epsilon. We then apply the result for concept classes that can be approximated by such functions to learn richer specific classes. Our result is that the class of s-term DNF formulae can be agnostically learned to error opt+epsilon with respect to arbitrary distributions, for any epsilon, in time poly(n^d, 1/epsilon), where d=O(\sqrt{n}\cdot s\cdot \log s\cdot\log^2(1/epsilon)).
https://drops.dagstuhl.de/storage/00lipics/lipics-vol092-isaac2017/LIPIcs.ISAAC.2017.29/LIPIcs.ISAAC.2017.29.pdf
Agnostic Learning
Boolean Functions
Low-Degree Polynomials