Support Recovery in Universal One-Bit Compressed Sensing

Authors: Arya Mazumdar and Soumyabrata Pal




File

LIPIcs.ITCS.2022.106.pdf
  • Filesize: 0.84 MB
  • 20 pages

Author Details

Arya Mazumdar
  • Halıcıoğlu Data Science Institute, University of California, San Diego, CA, USA
Soumyabrata Pal
  • College of Information and Computer Sciences, University of Massachusetts Amherst, MA, USA

Cite As

Arya Mazumdar and Soumyabrata Pal. Support Recovery in Universal One-Bit Compressed Sensing. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 106:1-106:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)
https://doi.org/10.4230/LIPIcs.ITCS.2022.106

Abstract

One-bit compressed sensing (1bCS) is an extremely quantized signal acquisition method that has been studied intermittently over the past decade. In 1bCS, linear samples of a high-dimensional signal are quantized to only one bit per sample (the sign of the measurement). The extreme quantization makes it an interesting case study of the more general single-index or generalized linear models. At the same time, it can also be thought of as a "design" version of learning a binary linear classifier, or halfspace learning. Assuming the original signal vector is sparse, existing results in 1bCS either aim to find the support of the vector or to approximate the signal within an ε-ball. The focus of this paper is support recovery, which often also computationally facilitates approximate signal recovery. A universal measurement matrix for 1bCS refers to one set of measurements that works for all sparse signals. With universality, it is known that Θ̃(k²) 1bCS measurements are necessary and sufficient for support recovery (where k denotes the sparsity). In this work, we show that it is possible to universally recover the support with a small number of false positives using only Õ(k^{3/2}) measurements. If the dynamic range of the signal vector is known, then with a different technique this result can be improved to only Õ(k) measurements. Other results on universal but approximate support recovery are also provided in this paper. All of our main recovery algorithms are simple and run in polynomial time.
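The abstract describes the 1bCS acquisition model only at a high level. The snippet below is a minimal sketch of that measurement model for a k-sparse signal; the Gaussian measurement matrix and all parameter values are illustrative assumptions, and neither the paper's universal measurement constructions nor its support-recovery algorithms are reproduced here.

```python
# Minimal sketch of the one-bit compressed sensing (1bCS) measurement model:
# each measurement keeps only the sign of one linear sample of a k-sparse
# signal. The Gaussian design below is an assumption for illustration; it is
# NOT the universal construction studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

n, k, m = 1000, 5, 200          # ambient dimension, sparsity, number of measurements

# k-sparse signal x with a random support and random nonzero values
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

# One-bit measurements: y_i = sign(<a_i, x>), i.e. one bit per linear sample
A = rng.normal(size=(m, n))     # illustrative (non-universal) Gaussian design
y = np.sign(A @ x)

print("true support:", np.sort(support))
print("first 10 one-bit measurements:", y[:10])
```

Note that the sign measurements y are invariant to positive scaling of x, so only directional (and, in particular, support) information survives the quantization; this is why the paper targets support recovery rather than exact signal recovery.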

Subject Classification

ACM Subject Classification
  • Mathematics of computing → Coding theory
  • Mathematics of computing → Combinatorics
  • Mathematics of computing → Information theory
  • Theory of computation → Data compression
Keywords
  • Superset Recovery
  • Approximate Support Recovery
  • List union-free family
  • Descartes’ rule of signs
