Matrix Estimation, Latent Variable Model and Collaborative Filtering

Author: Devavrat Shah



Cite As

Devavrat Shah. Matrix Estimation, Latent Variable Model and Collaborative Filtering. In 37th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 93, pp. 4:1-4:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)
https://doi.org/10.4230/LIPIcs.FSTTCS.2017.4

Abstract

Estimating a matrix from partial, noisy observations is prevalent in a variety of modern applications, with recommendation systems being a prototypical example. The non-parametric latent variable model provides a canonical representation for such matrix data when the underlying distribution satisfies "exchangeability", with graphons and the stochastic block model being recent examples of interest. Collaborative filtering has been a successfully utilized heuristic in practice since the dawn of e-commerce. In this extended abstract, we argue that collaborative filtering (and its variants) solves matrix estimation for a generic latent variable model with near-optimal sample complexity.
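
To make the collaborative filtering idea in the abstract concrete: under the latent variable model, each observed entry is a noisy evaluation f(x_u, y_i) of an unknown function of latent row and column features, and neighborhood-based collaborative filtering estimates a missing entry by averaging it over rows whose overlapping observations are close. The following is a minimal sketch in that spirit; the function name cf_estimate, the squared-distance row similarity, and the plain k-nearest-row average are illustrative choices and not the specific variant analyzed in the paper.

```python
import numpy as np

def cf_estimate(A, mask, u, i, k=5):
    """Estimate the unobserved entry (u, i) of a partially observed matrix A.

    A    : (n, m) array of observed values (arbitrary where unobserved)
    mask : (n, m) boolean array, True where the corresponding entry is observed
    u, i : target row (user) and column (item)
    k    : number of most similar rows to average over
    """
    n = A.shape[0]
    dists = np.full(n, np.inf)
    for v in range(n):
        if v == u or not mask[v, i]:
            continue
        # Compare rows u and v only on the columns observed in both rows.
        common = mask[u] & mask[v]
        if not common.any():
            continue
        dists[v] = np.mean((A[u, common] - A[v, common]) ** 2)
    # Average the target column over the k rows with the smallest empirical distance.
    neighbours = np.argsort(dists)[:k]
    neighbours = neighbours[np.isfinite(dists[neighbours])]
    if neighbours.size == 0:
        return np.nan  # no usable neighbour; entry is left unestimated
    return A[neighbours, i].mean()

# Example usage on synthetic data with roughly 30% of entries observed.
rng = np.random.default_rng(0)
A = rng.random((50, 40))
mask = rng.random((50, 40)) < 0.3
print(cf_estimate(A, mask, u=0, i=1))
```

The row-comparison step plays the role of identifying "similar users"; the paper's analysis concerns how well such neighborhood averages recover f under exchangeability, which this sketch does not attempt to reproduce.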
Keywords
  • Matrix Estimation
  • Graphon Estimation
  • Collaborative Filtering
