Abstract
Let F be a multivariate function from a product set Σ^n to an Abelian group G. A k-partition of F with cost δ is a partition of the set of variables V into k nonempty subsets (X_1, …, X_k) such that F(V) is δ-close to F_1(X_1) + … + F_k(X_k) for some F_1, …, F_k with respect to a given error metric. We study algorithms for agnostically learning k-partitions and testing k-partitionability over various groups and error metrics given query access to F. In particular we show that
1) Given a function that has a k-partition of cost δ, a partition of cost O(kn^2)(δ + ε) can be learned in time Õ(n^2 poly(1/ε)) for any ε > 0. In contrast, for k = 2 and n = 3 learning a partition of cost δ + ε is NP-hard.
2) When F is real-valued and the error metric is the 2-norm, a 2-partition of cost √(δ^2 + ε) can be learned in time Õ(n^5/ε^2).
3) When F is Z_q-valued and the error metric is Hamming weight, k-partitionability is testable with one-sided error and O(kn^3/ε) non-adaptive queries. We also show that even two-sided testers require Ω(n) queries when k = 2.
This work was motivated by reinforcement learning control tasks in which the set of control variables can be partitioned. The partitioning reduces the task into multiple lower-dimensional ones that are relatively easier to learn. Our second algorithm empirically increases the scores attained over previous heuristic partitioning methods applied in this context.
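To make the definition concrete, here is a minimal brute-force sketch (not the paper's query-efficient algorithm) for the real-valued, 2-norm setting of result 2. For a fixed partition (X_1, X_2), the L2-optimal additive approximation F_1(X_1) + F_2(X_2) is obtained by averaging F over the complementary block (an ANOVA-style projection); the sketch enumerates all 2-partitions and reports the cheapest. The helper name `best_2partition_cost` and the restriction to Σ = {0,1} are illustrative assumptions, and the search is exponential in n.

```python
import itertools
import math

def best_2partition_cost(F, n):
    """Illustrative brute force (exponential in n), NOT the paper's
    O~(n^5/eps^2) algorithm: find the 2-partition of the n variables of
    F: {0,1}^n -> R with the smallest 2-norm cost.  For a fixed block
    structure, the L2-optimal additive approximation averages F over
    the complementary block (ANOVA projection)."""
    points = list(itertools.product((0, 1), repeat=n))
    mean = sum(F(x) for x in points) / len(points)
    vars_ = list(range(n))

    def proj(block, assign):
        # Average of F over all settings of the variables outside `block`,
        # with the variables in `block` fixed to `assign`.
        rest = [v for v in vars_ if v not in block]
        total = 0.0
        for fill in itertools.product((0, 1), repeat=len(rest)):
            x = [0] * n
            for v, a in zip(block, assign):
                x[v] = a
            for v, a in zip(rest, fill):
                x[v] = a
            total += F(tuple(x))
        return total / (2 ** len(rest))

    best_cost, best_part = float("inf"), None
    # Enumerate nonempty proper subsets X1 (up to complementation).
    for r in range(1, n // 2 + 1):
        for X1 in itertools.combinations(vars_, r):
            X2 = tuple(v for v in vars_ if v not in X1)
            sq_err = 0.0
            for x in points:
                f1 = proj(X1, tuple(x[v] for v in X1))
                f2 = proj(X2, tuple(x[v] for v in X2)) - mean
                sq_err += (F(x) - f1 - f2) ** 2
            cost = math.sqrt(sq_err / len(points))
            if cost < best_cost:
                best_cost, best_part = cost, (X1, X2)
    return best_cost, best_part
```

For an exactly additive function such as F(x) = x_0·x_1 + 2x_2, the sketch recovers the partition ({x_2}, {x_0, x_1}) with cost 0, matching a 2-partition of cost δ = 0 in the definition above.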
BibTeX Entry
@InProceedings{bogdanov_et_al:LIPIcs:2020:11722,
author = {Andrej Bogdanov and Baoxiang Wang},
title = {{Learning and Testing Variable Partitions}},
booktitle = {11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
pages = {37:1--37:22},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-134-4},
ISSN = {1868-8969},
year = {2020},
volume = {151},
editor = {Thomas Vidick},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2020/11722},
URN = {urn:nbn:de:0030-drops-117221},
doi = {10.4230/LIPIcs.ITCS.2020.37},
annote = {Keywords: partitioning, agnostic learning, property testing, sublinear-time algorithms, hypergraph cut, reinforcement learning}
}
Date of publication: 10.01.2020