Abstract
A line of recent works showed that for a large class of learning problems, any learning algorithm requires either superlinear memory size or a superpolynomial number of samples [Raz, 2016; Kol et al., 2017; Raz, 2017; Moshkovitz and Moshkovitz, 2018; Beame et al., 2018; Garg et al., 2018]. For example, any algorithm for learning parities of size n requires either a memory of size Omega(n^{2}) or an exponential number of samples [Raz, 2016].
All these works modeled the learner as a one-pass branching program, allowing only one pass over the stream of samples. In this work, we prove the first memory-samples lower bounds (with a superlinear lower bound on the memory size and superpolynomial lower bound on the number of samples) when the learner is allowed two passes over the stream of samples. For example, we prove that any two-pass algorithm for learning parities of size n requires either a memory of size Omega(n^{1.5}) or at least 2^{Omega(sqrt{n})} samples.
More generally, a matrix M : A x X -> {-1,1} corresponds to the following learning problem: An unknown element x in X is chosen uniformly at random. A learner tries to learn x from a stream of samples, (a_1, b_1), (a_2, b_2), ..., where for every i, a_i in A is chosen uniformly at random and b_i = M(a_i, x).
Assume that k, l, r are such that any submatrix of M of at least 2^{-k} * |A| rows and at least 2^{-l} * |X| columns has a bias of at most 2^{-r}. We show that any two-pass learning algorithm for the learning problem corresponding to M requires either a memory of size at least Omega(k * min{k, sqrt{l}}), or at least 2^{Omega(min{k, sqrt{l}, r})} samples.
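To make the streaming model concrete, here is a minimal Python sketch of the sample stream for the parity instantiation of M (where A = X = {0,1}^n and M(a, x) = (-1)^{<a, x> mod 2}), together with a deliberately naive learner that stores everything. The function names are illustrative, not from the paper; the brute-force learner uses memory and time exponential in n, which is exactly the resource regime the lower bound says cannot be escaped by a small-memory, few-pass algorithm.

```python
import random


def parity_sample_stream(x, num_samples, rng=None):
    """Yield samples (a_i, b_i) with b_i = M(a_i, x) = (-1)^{<a_i, x> mod 2}.

    Models the abstract's stream: each a_i is drawn uniformly from {0,1}^n.
    """
    rng = rng or random.Random(0)
    n = len(x)
    for _ in range(num_samples):
        a = [rng.randrange(2) for _ in range(n)]
        b = (-1) ** (sum(ai * xi for ai, xi in zip(a, x)) % 2)
        yield a, b


def brute_force_parity_learner(n, stream):
    """Keep every candidate x' in {0,1}^n, discard those inconsistent
    with any observed sample. Uses O(2^n) memory -- the trivial upper
    bound that the memory-samples trade-off contrasts against."""
    candidates = [tuple((c >> j) & 1 for j in range(n)) for c in range(2 ** n)]
    for a, b in stream:
        candidates = [x for x in candidates
                      if (-1) ** (sum(ai * xi for ai, xi in zip(a, x)) % 2) == b]
    return candidates
```

With O(n) random samples this learner isolates the hidden x with high probability; the point of the paper's result is that a two-pass learner cannot match this with memory below roughly n^{1.5} unless it sees 2^{Omega(sqrt{n})} samples.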
BibTeX Entry
@InProceedings{garg_et_al:LIPIcs:2019:10844,
author = {Sumegha Garg and Ran Raz and Avishay Tal},
title = {{Time-Space Lower Bounds for Two-Pass Learning}},
booktitle = {34th Computational Complexity Conference (CCC 2019)},
pages = {22:1--22:39},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-116-0},
ISSN = {1868-8969},
year = {2019},
volume = {137},
editor = {Amir Shpilka},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2019/10844},
URN = {urn:nbn:de:0030-drops-108446},
doi = {10.4230/LIPIcs.CCC.2019.22},
annote = {Keywords: branching program, time-space trade-offs, two-pass streaming, PAC learning, lower bounds}
}
Keywords: branching program, time-space trade-offs, two-pass streaming, PAC learning, lower bounds
Seminar: 34th Computational Complexity Conference (CCC 2019)
Issue Date: 2019
Date of publication: 16.07.2019