Spectral Norm of Random Kernel Matrices with Applications to Privacy
Kernel methods are an extremely popular set of techniques used for many important machine learning and data analysis applications. In addition to having good practical performance, these methods are supported by a well-developed theory. Kernel methods use an implicit mapping of the input data into a high dimensional feature space defined by a kernel function, i.e., a function returning the inner product between the images of two data points in the feature space. Central to any kernel method is the kernel matrix, which is built by evaluating the kernel function on a given sample dataset.
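The implicit feature-space mapping can be made concrete with a small sketch (our own illustration, not from the paper): for the degree-2 polynomial kernel k(x, z) = (x · z)^2 on R^2, an explicit feature map phi exists, and the kernel value equals the inner product of the mapped points.

```python
import numpy as np

# For k(x, z) = (x . z)^2 on R^2, an explicit feature map is
# phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so that k(x, z) = phi(x) . phi(z).
def phi(x):
    return np.array([x[0]**2, np.sqrt(2.0) * x[0] * x[1], x[1]**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# The kernel computes the feature-space inner product without forming phi:
assert np.isclose((x @ z) ** 2, phi(x) @ phi(z))  # both equal 16.0
```

The point of the kernel trick is the left-hand side: the inner product in the (possibly very high-dimensional) feature space is computed directly from the input vectors.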
In this paper, we initiate the study of non-asymptotic spectral properties of random kernel matrices. These are n × n random matrices whose (i, j)th entry is obtained by evaluating the kernel function on x_i and x_j, where x_1, ..., x_n are n independent random high-dimensional vectors. Our main contribution is to obtain tight upper bounds on the spectral norm (largest eigenvalue) of random kernel matrices constructed using common kernel functions such as polynomials and the Gaussian radial basis function.
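The object of study can be sketched numerically (a minimal illustration with parameters of our own choosing, not taken from the paper): draw n independent random vectors, evaluate a Gaussian RBF kernel on every pair to form the random kernel matrix, and compute its spectral norm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 100                                  # sample size and dimension (illustrative)
X = rng.standard_normal((n, d)) / np.sqrt(d)     # rows are x_1, ..., x_n

# Gaussian RBF kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))
sigma = 1.0
sq_norms = np.sum(X**2, axis=1)
sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
K = np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * sigma**2))

# K is symmetric positive semidefinite, so its spectral norm
# is its largest eigenvalue.
spectral_norm = np.linalg.eigvalsh(K)[-1]
```

Since every entry of K lies in (0, 1] and the diagonal is 1, the largest eigenvalue always falls between 1 and n; the paper's contribution is much sharper bounds on where in this range it concentrates.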
As an application of these results, we provide lower bounds on the distortion needed for releasing the coefficients of kernel ridge regression under attribute privacy, a general privacy notion which captures a large class of privacy definitions. Kernel ridge regression is a standard method for performing non-parametric regression that regularly outperforms traditional regression approaches in various domains. Our privacy distortion lower bounds are the first for any kernel technique, and our analysis assumes realistic scenarios for the input, unlike all previous lower bounds for other release problems, which only hold under very restrictive input settings.
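The released coefficients in question are the dual coefficients of kernel ridge regression. A minimal sketch (kernel choice, data, and ridge parameter are our own illustrative assumptions): the coefficient vector alpha solves the linear system (K + λI) α = y, and predictions are kernel evaluations weighted by alpha.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, lam = 100, 20, 0.1                  # sample size, dimension, ridge parameter
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Degree-2 polynomial kernel matrix on the sample.
K = (1.0 + X @ X.T) ** 2

# Dual coefficients of kernel ridge regression: alpha = (K + lam*I)^{-1} y.
alpha = np.linalg.solve(K + lam * np.eye(n), y)

def predict(X_new):
    """Predict via kernel evaluations against the training sample."""
    return ((1.0 + X_new @ X.T) ** 2) @ alpha
```

Because alpha depends on the whole sample through K, releasing it (even perturbed) can leak information about individual rows of X, which is what the paper's distortion lower bounds quantify.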
Random Kernel Matrices
Spectral Norm
Subgaussian Distribution
Data Privacy
Reconstruction Attacks
898-914
Regular Paper
Shiva Prasad Kasiviswanathan
Mark Rudelson
10.4230/LIPIcs.APPROX-RANDOM.2015.898
Olivier Bousquet, Ulrike von Luxburg, and Gunnar Rätsch, editors. Advanced Lectures on Machine Learning, ML Summer Schools 2003. Springer, 2004.
Xiuyuan Cheng and Amit Singer. The Spectrum of Random Inner-Product Kernel Matrices. Random Matrices: Theory and Applications, 2(04), 2013.
Krzysztof Choromanski and Tal Malkin. The Power of the Dinur-Nissim Algorithm: Breaking Privacy of Statistical and Graph Databases. In PODS, pages 65-76. ACM, 2012.
Anindya De. Lower Bounds in Differential Privacy. In TCC, pages 321-338, 2012.
Irit Dinur and Kobbi Nissim. Revealing Information while Preserving Privacy. In PODS, pages 202-210. ACM, 2003.
Yen Do and Van Vu. The Spectrum of Random Kernel Matrices: Universality Results for Rough and Varying Kernels. Random Matrices: Theory and Applications, 2(03), 2013.
Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam Smith. Calibrating Noise to Sensitivity in Private Data Analysis. In TCC, volume 3876 of LNCS, pages 265-284. Springer, 2006.
Cynthia Dwork, Frank McSherry, and Kunal Talwar. The Price of Privacy and the Limits of LP Decoding. In STOC, pages 85-94. ACM, 2007.
Cynthia Dwork and Sergey Yekhanin. New Efficient Attacks on Statistical Disclosure Control Mechanisms. In CRYPTO, pages 469-480. Springer, 2008.
Arthur E Hoerl and Robert W Kennard. Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 12(1):55-67, 1970.
Prateek Jain and Abhradeep Thakurta. Differentially Private Learning with Kernels. In ICML, pages 118-126, 2013.
Lei Jia and Shizhong Liao. Accurate Probabilistic Error Bound for Eigenvalues of Kernel Matrix. In Advances in Machine Learning, pages 162-175. Springer, 2009.
Noureddine El Karoui. The Spectrum of Kernel Random Matrices. The Annals of Statistics, pages 1-50, 2010.
Shiva Prasad Kasiviswanathan, Mark Rudelson, and Adam Smith. The Power of Linear Reconstruction Attacks. In SODA, pages 1415-1433, 2013.
Shiva Prasad Kasiviswanathan, Mark Rudelson, Adam Smith, and Jonathan Ullman. The Price of Privately Releasing Contingency Tables and the Spectra of Random Matrices with Correlated Rows. In STOC, pages 775-784, 2010.
Gert RG Lanckriet, Nello Cristianini, Peter Bartlett, Laurent El Ghaoui, and Michael I Jordan. Learning the Kernel Matrix with Semidefinite Programming. The Journal of Machine Learning Research, 5:27-72, 2004.
James Mercer. Functions of Positive and Negative Type, and their Connection with the Theory of Integral Equations. Philosophical Transactions of the Royal Society of London, Series A, pages 415-446, 1909.
Martin M Merener. Polynomial-time Attack on Output Perturbation Sanitizers for Real-valued Databases. Journal of Privacy and Confidentiality, 2(2):5, 2011.
S. Muthukrishnan and Aleksandar Nikolov. Optimal Private Halfspace Counting via Discrepancy. In STOC, pages 1285-1292, 2012.
Mark Rudelson. Recent Developments in Non-asymptotic Theory of Random Matrices. Modern Aspects of Random Matrix Theory, 72:83, 2014.
Craig Saunders, Alexander Gammerman, and Volodya Vovk. Ridge Regression Learning Algorithm in Dual Variables. In ICML, pages 515-521, 1998.
Bernhard Schölkopf, Ralf Herbrich, and Alex J Smola. A Generalized Representer Theorem. In COLT, pages 416-426, 2001.
Bernhard Schölkopf and Alexander J Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2001.
John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
John Shawe-Taylor, Christopher KI Williams, Nello Cristianini, and Jaz Kandola. On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA. IEEE Transactions on Information Theory, 51(7):2510-2522, 2005.
Vikas Sindhwani, Minh Ha Quang, and Aurélie C Lozano. Scalable Matrix-valued Kernel Learning for High-dimensional Nonlinear Multivariate Regression and Granger Causality. arXiv preprint arXiv:1210.4792, 2012.
Roman Vershynin. Introduction to the Non-asymptotic Analysis of Random Matrices. arXiv preprint arXiv:1011.3027, 2010.
Creative Commons Attribution 3.0 Unported license
https://creativecommons.org/licenses/by/3.0/legalcode