Coresets for the Nearest-Neighbor Rule

Authors: Alejandro Flores-Velazco, David M. Mount




File

LIPIcs.ESA.2020.47.pdf
  • Filesize: 4.87 MB
  • 19 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ESA.2020.47

Author Details

Alejandro Flores-Velazco
  • Department of Computer Science, University of Maryland, College Park, MD, USA
David M. Mount
  • Department of Computer Science and Institute for Advanced Computer Studies, University of Maryland, College Park, MD, USA

Acknowledgements

Thanks to Prof. Emely Arráiz for suggesting the problem of condensation while the first author was a student at Universidad Simón Bolívar, Venezuela. Thanks to Ahmed Abdelkader for the helpful discussions and valuable suggestions.

Cite As

Alejandro Flores-Velazco and David M. Mount. Coresets for the Nearest-Neighbor Rule. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 47:1-47:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020) https://doi.org/10.4230/LIPIcs.ESA.2020.47

Abstract

Given a training set P of labeled points, the nearest-neighbor rule predicts the class of an unlabeled query point as the label of its closest point in the set. To improve the time and space complexity of classification, a natural question is how to reduce the training set without significantly affecting the accuracy of the nearest-neighbor rule. Nearest-neighbor condensation deals with finding a subset R ⊆ P such that for every point p ∈ P, p's nearest neighbor in R has the same label as p. This relates to the concept of coresets, which can be broadly defined as subsets of the original set such that an exact result on the coreset corresponds to an approximate result on the original set. Unlike the condensation criterion, however, the guarantees of a coreset must hold for any query point, not only for the points of the training set.
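To make these definitions concrete, here is a minimal sketch of the nearest-neighbor rule and of the condensation (consistency) criterion. This is illustrative code, not from the paper: the helper names nn_classify and is_consistent are hypothetical, and points are assumed to be Euclidean.

```python
import math

def nn_classify(query, points, labels):
    """Nearest-neighbor rule: predict the label of the point closest to query."""
    nearest = min(range(len(points)), key=lambda i: math.dist(query, points[i]))
    return labels[nearest]

def is_consistent(subset_indices, points, labels):
    """Condensation criterion: R = {points[i] : i in subset_indices} is
    consistent if every p in P has a nearest neighbor in R with p's label."""
    R = [points[i] for i in subset_indices]
    R_labels = [labels[i] for i in subset_indices]
    return all(nn_classify(p, R, R_labels) == label
               for p, label in zip(points, labels))
```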
This paper introduces the concept of coresets for nearest-neighbor classification. We extend existing criteria used for condensation, and prove sufficient conditions under which these subsets correctly classify any query point. Additionally, we prove that finding such subsets of minimum cardinality is NP-hard, and propose quadratic-time approximation algorithms with provable upper bounds on the size of the subsets they select. Moreover, we show how to improve one of these algorithms to run in subquadratic time, making it the first subquadratic-time algorithm for nearest-neighbor condensation.
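As a point of comparison for the selection algorithms discussed above, the following sketch implements Hart's classical condensed nearest-neighbor heuristic, a simple greedy baseline for condensation (not one of the algorithms proposed in this paper). It reuses nn_classify from the sketch above: any training point misclassified by the current subset is added, and passes repeat until nothing changes, so the returned subset satisfies the consistency criterion.

```python
def cnn_condense(points, labels):
    """Hart's condensed NN heuristic: greedily grow a subset until it is
    consistent, i.e., until a full pass over P adds no new points."""
    subset = [0]  # seed with an arbitrary training point
    changed = True
    while changed:
        changed = False
        for i, (p, label) in enumerate(zip(points, labels)):
            if i in subset:
                continue
            R = [points[j] for j in subset]
            R_labels = [labels[j] for j in subset]
            if nn_classify(p, R, R_labels) != label:
                subset.append(i)  # p was misclassified; add it to the subset
                changed = True
    return subset
```

On a toy two-class example such as points = [(0,0), (0,1), (5,0), (5,1)] with labels ['a','a','b','b'], this returns the indices [0, 2], one representative per class.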

Subject Classification

ACM Subject Classification
  • Theory of computation → Computational geometry
Keywords
  • coresets
  • nearest-neighbor rule
  • classification
  • nearest-neighbor condensation
  • training-set reduction
  • approximate nearest-neighbor
  • approximation algorithms

