Practical Performance of Random Projections in Linear Programming

Authors: Leo Liberti, Benedetto Manca, Pierre-Louis Poirion

PDF: 2.09 MB, 15 pages


Author Details

Leo Liberti
  • LIX CNRS, École Polytechnique, Institut Polytechnique de Paris, 91128 Palaiseau, France
Benedetto Manca
  • Department of Mathematics and Informatics, University of Cagliari, Italy
Pierre-Louis Poirion
  • RIKEN Center for Advanced Intelligence Project, Tokyo, Japan

Cite As

Leo Liberti, Benedetto Manca, and Pierre-Louis Poirion. Practical Performance of Random Projections in Linear Programming. In 20th International Symposium on Experimental Algorithms (SEA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 233, pp. 21:1-21:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Abstract

The use of random projections in mathematical programming allows standard solution algorithms to solve much larger instances, at least approximately. Approximation results have been derived in the relevant literature for many specific problems, as well as for several subclasses of mathematical programs. Despite these theoretical developments, it is not always clear that random projections are actually useful for solving mathematical programs in practice. In this paper, we provide a computational assessment of the application of random projections to linear programming.
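The core idea being assessed can be sketched in a few lines. This is a minimal illustration, not the authors' exact experimental setup: the instance is synthetic, and the projected dimension k and the Gaussian projector T are assumption-level choices. Given a standard-form LP min c·x subject to Ax = b, x ≥ 0, one replaces the m equality constraints with k ≪ m aggregated constraints (TA)x = Tb, where T has i.i.d. N(0, 1/k) entries. Since every x with Ax = b also satisfies TAx = Tb, the projected feasible region contains the original one, so the projected optimum is a lower bound on the true optimum.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Hypothetical dense standard-form LP:  min c.x  s.t.  Ax = b, x >= 0.
m, n = 500, 800
A = rng.random((m, n))
b = A @ rng.random(n)       # b chosen so the LP is feasible by construction
c = rng.random(n)

# Random projection of the constraints: a k x m sub-Gaussian matrix T
# with N(0, 1/k) entries replaces Ax = b with the smaller system (TA)x = Tb.
k = 60
T = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, m))

full = linprog(c, A_eq=A, b_eq=b, bounds=(0, None), method="highs")
proj = linprog(c, A_eq=T @ A, b_eq=T @ b, bounds=(0, None), method="highs")

# The projected optimum can only be smaller: aggregating constraints
# enlarges the feasible region, so proj.fun <= full.fun (up to solver tolerance).
print(full.fun, proj.fun)
```

The practical question the paper studies is precisely how tight this bound is, and how much solver time the smaller instance saves, across instance types and projection sizes.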

Subject Classification

ACM Subject Classification
  • Mathematics of computing → Mathematical optimization
  • Theory of computation → Random projections and metric embeddings

Keywords and Phrases
  • Linear Programming
  • Johnson-Lindenstrauss Lemma
  • Computational testing
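The Johnson-Lindenstrauss Lemma listed among the keywords is the result that makes this approach work: n points in a high-dimensional space can be mapped to O(ε⁻² log n) dimensions while preserving all pairwise distances within a factor 1 ± ε. A quick numerical check of this concentration (the dimensions and sample size here are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# n random points in dimension d, projected down to k dimensions
# by a Gaussian matrix with N(0, 1/k) entries.
n, d, k = 100, 10_000, 500
X = rng.normal(size=(n, d))
T = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
Y = X @ T.T

# Ratio of projected to original pairwise distances: by the JL lemma
# these concentrate around 1 for all n*(n-1)/2 pairs simultaneously.
ratios = pdist(Y) / pdist(X)
print(ratios.min(), ratios.max())
```

With these sizes the extreme ratios stay close to 1, even though the ambient dimension dropped by a factor of 20.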




References

  1. D. Achlioptas. Database-friendly random projections: Johnson-Lindenstrauss with binary coins. Journal of Computer and System Sciences, 66:671-687, 2003.
  2. D. Amelunxen, M. Lotz, M. McCoy, and J. Tropp. Living on the edge: phase transitions in convex programs with random data. Information and Inference: A Journal of the IMA, 3:224-294, 2014.
  3. E. Candès and T. Tao. Decoding by linear programming. IEEE Transactions on Information Theory, 51(12):4203-4215, 2005.
  4. C. D'Ambrosio, L. Liberti, P.-L. Poirion, and K. Vu. Random projections for quadratic programs. Mathematical Programming B, 183:619-647, 2020.
  5. S. Damelin and W. Miller. The Mathematics of Signal Processing. CUP, Cambridge, 2012.
  6. G. Dantzig. The diet problem. Interfaces, 20(4):43-47, 1990.
  7. S. Dirksen. Dimensionality reduction with subgaussian matrices: a unified theory. Foundations of Computational Mathematics, 16:1367-1396, 2016.
  8. L. Ford and D. Fulkerson. Flows in Networks. Princeton University Press, Princeton, NJ, 1962.
  9. R. Fourer and D. Gay. The AMPL Book. Duxbury Press, Pacific Grove, 2002.
  10. IBM. ILOG CPLEX 20.1 User's Manual. IBM, 2020.
  11. P. Indyk. Algorithmic applications of low-distortion geometric embeddings. In Foundations of Computer Science, volume 42 of FOCS, pages 10-33, Washington, DC, 2001. IEEE.
  12. P. Indyk and A. Naor. Nearest neighbor preserving embeddings. ACM Transactions on Algorithms, 3(3):Art. 31, 2007.
  13. W. Johnson and J. Lindenstrauss. Extensions of Lipschitz mappings into a Hilbert space. In G. Hedlund, editor, Conference in Modern Analysis and Probability, volume 26 of Contemporary Mathematics, pages 189-206, Providence, RI, 1984. AMS.
  14. E. Jones, T. Oliphant, and P. Peterson. SciPy: open source scientific tools for Python, 2001. [Online; accessed 2016-03-01].
  15. D. Kane and J. Nelson. Sparser Johnson-Lindenstrauss transforms. Journal of the ACM, 61(1):4, 2014.
  16. R. Koenker. Quantile Regression. CUP, Cambridge, 2005.
  17. L. Liberti. Decoding noisy messages: a method that just shouldn't work. In A. Deza, S. Gupta, and S. Pokutta, editors, Data Science and Optimization. Fields Institute, Toronto, pending minor revisions.
  18. L. Liberti, P.-L. Poirion, and K. Vu. Random projections for conic programs. Linear Algebra and its Applications, 626:204-220, 2021.
  19. L. Liberti and K. Vu. Barvinok's naive algorithm in distance geometry. Operations Research Letters, 46:476-481, 2018.
  20. D. Pucci de Farias and B. Van Roy. On constraint sampling in the linear programming approach to approximate dynamic programming. Mathematics of Operations Research, 29(3):462-478, 2004.
  21. G. van Rossum et al. Python Language Reference, version 3. Python Software Foundation, 2019.
  22. S. Vempala. The Random Projection Method. Number 65 in DIMACS Series in Discrete Mathematics and Theoretical Computer Science. AMS, Providence, RI, 2004.
  23. S. Venkatasubramanian and Q. Wang. The Johnson-Lindenstrauss transform: an empirical study. In Algorithm Engineering and Experiments, volume 13 of ALENEX, pages 164-173, Providence, RI, 2011. SIAM.
  24. K. Vu, P.-L. Poirion, C. D'Ambrosio, and L. Liberti. Random projections for quadratic programs over a Euclidean ball. In A. Lodi et al., editors, Integer Programming and Combinatorial Optimization (IPCO), volume 11480 of LNCS, pages 442-452, New York, 2019. Springer.
  25. K. Vu, P.-L. Poirion, and L. Liberti. Random projections for linear programming. Mathematics of Operations Research, 43(4):1051-1071, 2018.
  26. J. Yang, X. Meng, and M. Mahoney. Quantile regression for large-scale applications. SIAM Journal of Scientific Computing, 36(5):S78-S110, 2014.