Query Complexity of Stochastic Minimum Vertex Cover
Abstract
We study the stochastic minimum vertex cover problem for general graphs. In this problem, we are given a graph $G = (V, E)$ and an existence probability $p_e$ for each edge $e \in E$. Edges of $G$ are realized (or exist) independently with these probabilities, forming the realized subgraph $\mathcal{G}$. The existence of an edge in $\mathcal{G}$ can only be verified using edge queries. The goal of this problem is to find a near-optimal vertex cover of $\mathcal{G}$ using a small number of queries.
Previous work by Derakhshan, Durvasula, and Haghtalab [STOC 2023] established the existence of $1.5$-approximation algorithms for this problem with $\tilde{O}(n/p)$ queries, where $p$ is the minimum existence probability. They also show that, under mild correlation among edge realizations, beating this approximation ratio requires querying a subgraph of size $\Omega(n \cdot \mathsf{RS}(n))$. Here, $\mathsf{RS}(n)$ refers to Ruzsa-Szemerédi graphs and represents the largest number of induced edge-disjoint matchings of size $\Theta(n)$ in an $n$-vertex graph.
In this work, we design a simple algorithm for finding a $(1+\varepsilon)$-approximate vertex cover by querying a subgraph of size $\tilde{O}(n \cdot \mathsf{RS}(n))$ for any absolute constant $\varepsilon > 0$. Our algorithm can tolerate up to $\tilde{O}(n \cdot \mathsf{RS}(n))$ correlated edges, hence effectively completing our understanding of the problem under mild correlation.
Keywords and phrases: Sublinear algorithms, Vertex cover, Query complexity
Funding: Zhiyang Xun: Supported by NSF Grant CCF-2312573 and a Simons Investigator Award (#409864, David Zuckerman).
2012 ACM Subject Classification: Theory of computation → Streaming, sublinear and near linear time algorithms
Acknowledgements: The authors thank Sanjeev Khanna for countless insightful discussions and help.
Editors: Raghu Meka
Series and Publisher: Leibniz International Proceedings in Informatics, Schloss Dagstuhl – Leibniz-Zentrum für Informatik
1 Introduction
We study the stochastic minimum vertex cover problem. In this problem, we are given an arbitrary $n$-vertex graph $G = (V, E)$ and an existence probability $p_e$ for each edge $e \in E$. Edges of $G$ are realized (or exist) independently with their corresponding probabilities, forming the realized subgraph $\mathcal{G}$. While $\mathcal{G}$ is unknown, one can query an edge $e \in E$ to see if this edge exists in $\mathcal{G}$. The goal is to find a near-optimal vertex cover of $\mathcal{G}$ by querying a small subset of $E$.
There is extensive literature studying various graph problems in this stochastic setting, such as minimum spanning tree [21, 22], all-pairs shortest paths [24], and maximum matching [13, 5, 12, 25, 3, 6, 9, 10, 8, 16, 15]. Stochastic vertex cover for general graphs was first studied by Behnezhad, Blum, and Derakhshan [7]. They provided a 2-approximation algorithm with $\tilde{O}(n/p)$ queries, where $p = \min_{e \in E} p_e$. Later, this approximation ratio was improved to $1.5$ by the work of Derakhshan, Durvasula, and Haghtalab [14] with the same query complexity. Even though these stochastic settings are primarily concerned with information-theoretic questions, this was the first line of work to provide positive results for a computationally hard problem. However, the key question remains unresolved:
Question 1.
How many non-adaptive edge queries are needed to find a $(1+\varepsilon)$-approximation to the minimum vertex cover problem in the stochastic setting?
In this paper, we take a step toward answering this question. We design a simple algorithm and establish a relation between the number of queries it makes and the density of a well-known class of graphs called Ruzsa-Szemerédi graphs (Definition 2). Our number of queries matches the lower bound of Derakhshan, Durvasula, and Haghtalab [14] on the number of queries needed to surpass a 1.5-approximation in the presence of correlated edges.
Definition 2 (Ruzsa-Szemerédi Graphs [23]).
An $n$-vertex graph is an $(r, t)$-RS graph if its edge set can be decomposed into $r$ edge-disjoint induced matchings of size $t$. We use $\mathsf{RS}(n)$ to denote the maximum $r$ for which an $(r, \Theta(n))$-RS graph exists.
Our main result is formalized in the theorem below.
Result 3.
Given any $\varepsilon > 0$, let $p$ be $\min_{e \in E} p_e$. There exists an algorithm that queries a subgraph of size $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$ and finds a vertex cover with an expected approximation ratio of $1 + \varepsilon$.
Proving lower and upper bounds parameterized by the density of RS-graphs has been of interest in various communities, such as dynamic algorithms [11, 4], streaming algorithms [20, 1, 2], property testing [17], and additive combinatorics [19]. Despite this, our understanding of their density remains incomplete. What we currently know is:
$$n^{\Omega\left(\frac{1}{\log \log n}\right)} \;\le\; \mathsf{RS}(n) \;\le\; \frac{n}{\log^{(x)} n},$$
where $\log^{(x)}$ is the $x$-iterated logarithmic function for a constant $x$. Our result is most interesting if the lower bound turns out to be tight. In such a case, the number of edges queried by our algorithm is $n^{1 + o(1)}$.
On the opposite end, if the actual density of RS-graphs is closer to the known upper bound, our result implies a query bound of roughly $\frac{n^2}{\log^{(x)} n}$, i.e., close to querying the entire graph.
1.1 Technical Overview
We first design an algorithm with an additive error of at most $\varepsilon n$ in Section 2 and later show how it can be modified to obtain the $(1+\varepsilon)$ approximation ratio. To choose which edges to query, we rely on a parameter $q_e$ defined for any edge $e \in E$ as follows:
$$q_e = \Pr\big[e \text{ is covered by } \mathrm{OPT}\big],$$
where $\mathrm{OPT} = \mathrm{MVC}(\mathcal{G})$. In this definition, the randomization comes from the realization of $\mathcal{G}$, and $\mathrm{MVC}(\cdot)$ is a fixed algorithm which returns the minimum vertex cover of a given graph. Basically, $q_e$ is the probability with which an edge $e$ is covered by $\mathrm{OPT}$. (Since we are not concerned with computational efficiency, we assume the $q_e$'s are known. Regardless, similar to [14], our results can also be obtained with estimated values of the $q_e$'s calculated using polynomially many calls to the MVC oracle.)
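As noted above, the $q_e$'s can be estimated with calls to the MVC oracle. The following is a minimal Python sketch of such a Monte Carlo estimator; the brute-force `min_vertex_cover` stands in for the MVC oracle, and the toy graph, probabilities, and sample count are illustrative assumptions rather than anything prescribed by the paper.

```python
import itertools, random

def min_vertex_cover(edges, vertices):
    """Brute-force MVC oracle: smallest vertex set covering all given edges,
    with deterministic tie-breaking (lexicographically first cover of min size)."""
    vertices = sorted(vertices)
    for size in range(len(vertices) + 1):
        for cover in itertools.combinations(vertices, size):
            c = set(cover)
            if all(u in c or v in c for (u, v) in edges):
                return c
    return set(vertices)

def estimate_q(edges, probs, vertices, samples=2000, seed=0):
    """Monte Carlo estimates of q_v = Pr[v in MVC(realized graph)] and
    q_e = Pr[e is covered by MVC(realized graph)]."""
    rng = random.Random(seed)
    v_hits = {v: 0 for v in vertices}
    e_hits = {e: 0 for e in edges}
    for _ in range(samples):
        realized = [e for e in edges if rng.random() < probs[e]]
        opt = min_vertex_cover(realized, vertices)
        for v in opt:
            v_hits[v] += 1
        for (u, v) in edges:
            if u in opt or v in opt:
                e_hits[(u, v)] += 1
    q_v = {v: v_hits[v] / samples for v in vertices}
    q_e = {e: e_hits[e] / samples for e in edges}
    return q_v, q_e

if __name__ == "__main__":
    # Toy instance: a 5-cycle with non-uniform existence probabilities.
    vertices = [0, 1, 2, 3, 4]
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
    probs = {e: p for e, p in zip(edges, [0.9, 0.5, 0.7, 0.8, 0.6])}
    q_v, q_e = estimate_q(edges, probs, vertices)
    print("q_v:", q_v)
    print("q_e:", q_e)
```

With enough samples, the estimates concentrate around the true probabilities by standard concentration bounds; the sketch is only meant to make the definition of $q_e$ concrete.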
Given a parameter $\tau$ whose value we fix later, we set $H$, the subgraph that we query, to be the subgraph of edges with $q_e < 1 - \tau$. I.e.,
$$H = \{\, e \in E : q_e < 1 - \tau \,\}.$$
Since we need the output to be a valid vertex cover, our final solution also needs to cover all the edges that we do not query. Let us denote this subgraph by $\bar{H} = E \setminus H$. Therefore, after we query $H$ and see its realization $\mathcal{G}_H$, the most straightforward way of finding a small vertex cover is to output $\mathrm{MVC}(\mathcal{G}_H \cup \bar{H})$. However, for the sake of analysis, we need to design an alternative algorithm for finding a vertex cover of $\mathcal{G}_H \cup \bar{H}$. We formally explain this in Algorithm 1 and briefly explain the overall ideas below.
After querying $H$ and observing its realization $\mathcal{G}_H$, we also hallucinate the edges in $\bar{H}$ to get $\mathcal{G}_1 = \mathcal{G}_H \cup \widehat{\bar{H}}$. The hallucinated subgraph $\widehat{\bar{H}}$ is a random realization of $\bar{H}$ containing each edge $e$ independently w.p. $p_e$. We then let $V_1 = \mathrm{MVC}(\mathcal{G}_1)$ and commit all its vertices to the final solution. Next, we pick another set of vertices, $V_2$, to cover all the edges in $\mathcal{G}_H \cup \bar{H}$ that were not covered by $V_1$. That is, $V_2 = \mathrm{MVC}(\mathcal{G}_2)$, where $\mathcal{G}_2$ is the set of edges of $\mathcal{G}_H \cup \bar{H}$ with no endpoint in $V_1$. We finally output $V_1 \cup V_2$. This clearly covers both $\mathcal{G}_H$ and $\bar{H}$ and is a valid solution. The extra step of hallucinating $\bar{H}$ is to ensure that $V_1$ has the same distribution as the optimal solution (see Claim 8). This gives us two properties:
1. The expected size of $V_1$ is the same as $\mathbb{E}[|\mathrm{OPT}|]$. Hence, to analyze our approximation ratio, it suffices to upper-bound $\mathbb{E}[|V_2|]$.
2. For any edge $e \in \bar{H}$, we have $\Pr[e \in \mathcal{G}_2] = 1 - q_e$. Since, by our choice of $H$, the edges in $\bar{H}$ all have $q_e \ge 1 - \tau$, any edge in $\bar{H}$ remains in $\mathcal{G}_2$ w.p. at most $\tau$.
Having this, we argue that if the expected size of the solution exceeds $\mathbb{E}[|\mathrm{OPT}|] + \varepsilon n$, then we have $\mathbb{E}[|V_2|] > \varepsilon n$. This means that the subgraph $\mathcal{G}_2$ has a large minimum vertex cover and, consequently, a large maximum matching. Now, suppose we draw $k$ independent subgraphs of $\bar{H}$ from the same distribution as $\mathcal{G}_2$ and call them $\mathcal{G}_2^{(1)}, \ldots, \mathcal{G}_2^{(k)}$. (We can do so by running our algorithm on $k$ different hallucinations of $G$ (see Definition 4).)
From each $\mathcal{G}_2^{(i)}$, we take its maximum matching $M_i$, and from $M_i$ we delete all the edges present in at least one other $\mathcal{G}_2^{(j)}$ for $j \neq i$. We show that this gives us a set of induced matchings. Notice that since any edge is present in $\mathcal{G}_2^{(j)}$ with probability at most $\tau$, we can argue that deleting the edges present in the other $\mathcal{G}_2^{(j)}$'s will not reduce the size of $M_i$ significantly for a suitable choice of $k$. Therefore, we conclude that if $\mathbb{E}[|V_2|]$ is greater than $\varepsilon n$, we can find a dense RS-graph in $\bar{H}$. Finally, by carefully choosing $\tau$, which we relate to $\mathsf{RS}(n)$, we argue that Algorithm 1 has an additive approximation error of at most $\varepsilon n$.
In Section 3, we explain how we can modify this algorithm to design Algorithm 3. The main difference is that we pre-commit a subset of vertices to the final solution at the beginning of the algorithm. These vertices are the ones that have a high chance of being in OPT; therefore, including them does not significantly increase the size of the output. This allows us to argue that the optimal solution restricted to the remaining graph is large enough to turn the additive error into a multiplicative one.
1.2 Preliminaries
Throughout the paper, we denote the input graph by $G = (V, E)$, where $|V| = n$. From this input graph, each edge $e$ is present in $\mathcal{G}$ independently with probability $p_e$. We denote the minimum value among all $p_e$'s by $p$. Given a small constant $\varepsilon > 0$, our goal is to non-adaptively query a sparse subgraph of $G$, represented by $H$, and find a $(1+\varepsilon)$-approximate minimum vertex cover of $\mathcal{G}$.
We let $\mathrm{MVC}(\cdot)$ be a fixed algorithm which, given any graph, outputs its minimum vertex cover. We may also refer to this as the minimum vertex cover oracle. We define $\mathrm{OPT} = \mathrm{MVC}(\mathcal{G})$ to be the optimal solution to our problem. Note that $\mathrm{OPT}$ is a random variable since $\mathcal{G}$ is a random realization of $G$. We also let $\mathsf{opt} = \mathbb{E}[|\mathrm{OPT}|]$ be the expected size of this optimal solution. We let $\mu(G')$ denote the size of the maximum matching of a graph $G'$.
For any vertex $v$, we define $q_v = \Pr[v \in \mathrm{OPT}]$, which is the probability that $v$ joins the optimal solution. This means $\mathsf{opt} = \sum_{v \in V} q_v$. Similarly, for any edge $e = (u, v)$ in the graph, we define $q_e = \Pr[u \in \mathrm{OPT} \text{ or } v \in \mathrm{OPT}]$, the probability that $e$ is covered by $\mathrm{OPT}$.
Definition 4 (Graph Hallucination).
Given any subgraph $F$ of $G$, we define a hallucination of $F$ to be a subgraph $\widehat{F}$ which contains each edge $e$ of $F$ independently with probability $p_e$.
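For concreteness, a hallucination in the sense of Definition 4 is just an independent re-sampling of a subgraph with the original edge probabilities. A minimal Python sketch follows; the example subgraph and probabilities are made up for illustration.

```python
import random

def hallucinate(edge_probs, rng=random):
    """Return a hallucination of the subgraph: each edge e is kept
    independently with probability p_e (Definition 4)."""
    return {e for e, p in edge_probs.items() if rng.random() < p}

# Example: hallucinate a 3-edge subgraph twice; the two samples are independent.
edge_probs = {(0, 1): 0.9, (1, 2): 0.4, (2, 3): 0.7}
print(hallucinate(edge_probs), hallucinate(edge_probs))
```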
We define $\mathsf{RS}(n)$ to be the largest number of induced edge-disjoint matchings of size $\Theta(n)$ in an $n$-vertex graph.
2 The algorithm
In this section, we prove the following theorem.
Theorem 5.
Let $\tau$ be $\frac{1}{4(\mathsf{RS}(n)+1)}$. Running Algorithm 1 with this $\tau$ finds a vertex cover that has an additive approximation error of at most $\varepsilon n$ with respect to the optimal minimum vertex cover and queries a subgraph of size $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{p}\right)$.
To prove this theorem, we design Algorithm 1, which takes a parameter $\tau$ and queries the subgraph $H = \{e \in E : q_e < 1 - \tau\}$. Let us call the set of edges realized in $H$ $\mathcal{G}_H$. We then hallucinate the edges not in $H$, a subgraph which we call $\bar{H}$, to get $\widehat{\bar{H}}$. We then let $\mathcal{G}_1$ be $\mathcal{G}_H \cup \widehat{\bar{H}}$.
This makes $V_1 = \mathrm{MVC}(\mathcal{G}_1)$ have the same distribution as $\mathrm{OPT}$ (see Claim 8 for the proof). Now we commit the vertices in $V_1$ to the final solution, and all that remains to be covered are the edges in $\mathcal{G}_H \cup \bar{H}$ that have not been covered so far. Therefore, we add $V_2 = \mathrm{MVC}(\mathcal{G}_2)$ to the solution, where $\mathcal{G}_2$ is the set of these uncovered edges. All we need to do is argue that $\mathbb{E}[|V_2|]$ is small, since we have $\mathbb{E}[|V_1|] = \mathsf{opt}$.
We show that if $\mathbb{E}[|V_2|]$ is large, we can find a large RS-graph in $\bar{H}$ (see Algorithm 2). So by carefully choosing $\tau$, which we relate to the density of RS-graphs, we prove that $\mathbb{E}[|V_2|]$ is small and hence that Algorithm 1 has an additive error of at most $\varepsilon n$. In the next section, we modify Algorithm 1 slightly to get Algorithm 3 and show that this algorithm finds a $(1+\varepsilon)$-approximation to the minimum vertex cover.
2.1 A Two-Step Algorithm
Here we describe the algorithm for finding a small vertex cover of $\mathcal{G}_H \cup \bar{H}$. We call it the two-step algorithm because the output consists of the union of two sets, $V_1$ and $V_2$; a hedged Python sketch of the procedure is given after the listing.
1. Let $H$ be the subgraph of edges with $q_e < 1 - \tau$.
2. Let $\bar{H}$ be $E \setminus H$.
3. Query edges in $H$ to get its realization $\mathcal{G}_H$.
4. Hallucinate edges in $\bar{H}$ to form $\widehat{\bar{H}}$.
5. Let $\mathcal{G}_1 = \mathcal{G}_H \cup \widehat{\bar{H}}$.
6. Let $V_1 = \mathrm{MVC}(\mathcal{G}_1)$.
7. Let $\mathcal{G}_2$ be the set of edges of $\mathcal{G}_H \cup \bar{H}$ not covered by $V_1$, and let $V_2 = \mathrm{MVC}(\mathcal{G}_2)$.
8. Return $V_1 \cup V_2$.
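Below is a hedged Python sketch of the two-step procedure, written against a brute-force stand-in for the MVC oracle. The query step is simulated by flipping the true coins for the edges of $H$; the helper names, thresholds, and the toy instance in the driver are illustrative assumptions, not part of the paper.

```python
import itertools, random

def min_vertex_cover(edges, vertices):
    """Brute-force stand-in for the fixed MVC oracle."""
    vertices = sorted(vertices)
    for size in range(len(vertices) + 1):
        for cover in itertools.combinations(vertices, size):
            c = set(cover)
            if all(u in c or v in c for (u, v) in edges):
                return c
    return set(vertices)

def two_step_cover(vertices, probs, q_e, tau, rng):
    """Sketch of Algorithm 1: query H = {e : q_e < 1 - tau}, hallucinate
    the rest, and return V1 ∪ V2."""
    edges = list(probs)
    H = [e for e in edges if q_e[e] < 1 - tau]                  # step 1
    H_bar = [e for e in edges if e not in set(H)]               # step 2
    G_H = [e for e in H if rng.random() < probs[e]]             # step 3 (simulated queries)
    H_bar_hat = [e for e in H_bar if rng.random() < probs[e]]   # step 4 (hallucination)
    G_1 = G_H + H_bar_hat                                       # step 5
    V_1 = min_vertex_cover(G_1, vertices)                       # step 6
    # step 7: edges of G_H ∪ H_bar left uncovered by V_1
    G_2 = [(u, v) for (u, v) in G_H + H_bar if u not in V_1 and v not in V_1]
    V_2 = min_vertex_cover(G_2, vertices)
    return V_1 | V_2, len(H)                                    # step 8 (plus #queries)

if __name__ == "__main__":
    rng = random.Random(1)
    vertices = [0, 1, 2, 3]
    probs = {(0, 1): 0.9, (1, 2): 0.2, (2, 3): 0.9, (0, 3): 0.8}
    q_e = {(0, 1): 0.95, (1, 2): 0.35, (2, 3): 0.95, (0, 3): 0.9}  # assumed estimates
    cover, n_queries = two_step_cover(vertices, probs, q_e, tau=0.1, rng=rng)
    print("cover:", cover, "queries:", n_queries)
```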
First, let us prove that the output of Algorithm 1 is a vertex cover.
Claim 6.
The output of Algorithm 1 is a vertex cover for $\mathcal{G}$.
Proof.
Since $V_2$ is a vertex cover of the subgraph of $\mathcal{G}_H \cup \bar{H}$ that $V_1$ does not cover, we can argue that $V_1 \cup V_2$ is a vertex cover for $\mathcal{G}_H \cup \bar{H}$, and therefore for $\mathcal{G} \subseteq \mathcal{G}_H \cup \bar{H}$. Now let us bound the number of queries that Algorithm 1 makes.
Claim 7.
The size of the subgraph $H$ that we query is at most $\tilde{O}\!\left(\frac{n}{p\tau}\right)$, which is $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{p}\right)$ for $\tau = \frac{1}{4(\mathsf{RS}(n)+1)}$ and $p = \min_{e \in E} p_e$.
Proof.
Due to [14], we know that if we take a random minimum vertex cover of $\mathcal{G}$, at most $\tilde{O}(n/p)$ edges of $G$ will not be covered by it with high probability. Here we repeat how this argument works for convenience. Let us call the subgraph of non-covered edges $Q$. For any subgraph with more than $\frac{cn\log n}{p}$ edges (for a large enough constant $c$), the probability of none of its edges being in $\mathcal{G}$ is at most $(1-p)^{cn\log n / p} \le n^{-cn}$. Moreover, since $Q$ is an induced subgraph of $G$ (it is induced on the vertices outside the cover), there are at most $2^n$ possibilities for it. By a union bound, we get that with high probability, $Q$ has at most $\frac{cn\log n}{p}$ edges. That is,
$$\Pr\!\left[|Q| > \frac{cn\log n}{p}\right] \le 2^n \cdot n^{-cn} = o(1).$$
Due to linearity of expectation, we have $\mathbb{E}[|Q|] \le \frac{cn\log n}{p} + o(1)\cdot n^2 = \tilde{O}(n/p)$.
For the edges in $H$ we have $1 - q_e > \tau$, so we can get $\tau \cdot |H| \le \mathbb{E}[|Q|]$, and since we have $\mathbb{E}[|Q|] \le \tilde{O}(n/p)$, we get that $|H| \le \tilde{O}\!\left(\frac{n}{p\tau}\right)$.
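To make the counting above concrete, here is a small hedged simulation on a toy instance: it estimates the $q_e$'s empirically, forms $H$ for a threshold $\tau$, and checks that $\tau\,|H|$ is at most the empirical expected number of uncovered edges. The instance, the value of $\tau$, and the sample count are assumptions made only for illustration.

```python
import itertools, random

def min_vertex_cover(edges, vertices):
    vertices = sorted(vertices)
    for size in range(len(vertices) + 1):
        for cover in itertools.combinations(vertices, size):
            c = set(cover)
            if all(u in c or v in c for (u, v) in edges):
                return c
    return set(vertices)

rng = random.Random(0)
vertices = list(range(6))
edges = [(u, v) for u in vertices for v in vertices if u < v]  # complete graph K6
probs = {e: 0.5 for e in edges}                                # p = 1/2
samples = 4000
covered = {e: 0 for e in edges}
uncovered_sizes = []

for _ in range(samples):
    realized = [e for e in edges if rng.random() < probs[e]]
    opt = min_vertex_cover(realized, vertices)
    q_count = 0
    for (u, v) in edges:
        if u in opt or v in opt:
            covered[(u, v)] += 1
        else:
            q_count += 1
    uncovered_sizes.append(q_count)

q_e = {e: covered[e] / samples for e in edges}
tau = 0.05
H = [e for e in edges if q_e[e] < 1 - tau]
expected_Q = sum(uncovered_sizes) / samples
# Every edge of H is uncovered with probability > tau, so tau*|H| <= E[|Q|].
print(len(H) * tau, "<=", expected_Q)
```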
Claim 8.
The expected size of $V_1$ in Algorithm 1 is equal to $\mathsf{opt}$, and the distribution of $V_1$ is the same as that of $\mathrm{OPT}$.
Proof.
Look at the graph that $V_1$ is a minimum vertex cover of: it is the graph $\mathcal{G}_1 = \mathcal{G}_H \cup \widehat{\bar{H}}$. For any edge $e$, if it is in $H$ then it is present in $\mathcal{G}_H$ with probability $p_e$, and if it is in $\bar{H}$ it is present in $\widehat{\bar{H}}$ with probability $p_e$. Since edges appear independently in both cases, we can see that the distribution of $\mathcal{G}_1$ is the same as the distribution of $\mathcal{G}$, and hence the expected size of the minimum vertex cover of $\mathcal{G}_1$ is the same as that of $\mathcal{G}$, which is $\mathsf{opt}$. Also, since the distribution of $\mathcal{G}_1$ is the same as that of $\mathcal{G}$, we get
$$V_1 = \mathrm{MVC}(\mathcal{G}_1) \;\overset{d}{=}\; \mathrm{MVC}(\mathcal{G}) = \mathrm{OPT}.$$
Hence $V_1$ comes from the same distribution as $\mathrm{OPT}$.
Lemma 9.
If Algorithm 1 does not have an additive approximation error of at most $\varepsilon n$ in expectation, then $\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{\varepsilon n}{2}$.
Proof.
Due to Claim 8, the expected size of $V_1$ is $\mathsf{opt}$. Therefore, if Algorithm 1 has a worse than $\varepsilon n$ additive approximation, then we can conclude $\mathbb{E}[|V_2|] > \varepsilon n$. In any graph, the maximum matching size is at least half of the size of the minimum vertex cover. Applying this statement to the graph $\mathcal{G}_2$ gives us
$$\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{1}{2}\,\mathbb{E}[|V_2|] > \frac{\varepsilon n}{2}.$$
2.2 Finding a Dense RS-graph in
Suppose that in Algorithm 1, instead of querying $H$ and hallucinating $\bar{H}$, we hallucinate all the edges. We repeat this process $k$ times, calling the graph that results in place of $\mathcal{G}_2$ of Algorithm 1 $\mathcal{G}_2^{(i)}$ for each $i$ from $1$ to $k$. For each $\mathcal{G}_2^{(i)}$, we find the maximum matching $M_i$. Then, from each $M_i$, we delete the edges that are present in at least one other $\mathcal{G}_2^{(j)}$ for $j \ne i$, and we define $M_i'$ to be the remaining edges of $M_i$ after these deletions. In Algorithm 2, we formalize this process; a hedged Python sketch follows the listing. Our goal in this section is to argue that if Algorithm 1 does not have an additive error of at most $\varepsilon n$, then the matchings $M_i'$ will be large, and we can find a large RS-graph as a subgraph of $\bar{H}$.
1. For $i$ from $1$ to $k$:
   (a) Let $H$ and $\bar{H}$ be the graphs defined in Algorithm 1.
   (b) Hallucinate the graph $G$ to get $\mathcal{G}_1^{(i)}$.
   (c) Let $V_1^{(i)}$ be $\mathrm{MVC}(\mathcal{G}_1^{(i)})$.
   (d) Let $\mathcal{G}_2^{(i)}$ be the set of edges of $\widehat{H}^{(i)} \cup \bar{H}$ not covered by $V_1^{(i)}$, where $\widehat{H}^{(i)}$ denotes the edges of $H$ present in $\mathcal{G}_1^{(i)}$.
2. Let $M_i$ be the maximum matching of $\mathcal{G}_2^{(i)}$.
3. Delete the edges in $M_i$ present in at least one other $\mathcal{G}_2^{(j)}$ with $j \ne i$ to get $M_i'$.
4. Return $M_1', \ldots, M_k'$.
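The following is a hedged Python sketch of this process. The brute-force `min_vertex_cover` and `max_matching` are stand-ins for the exact oracles used in the analysis, and the toy driver values (edge probabilities, assumed $q_e$ estimates, $\tau$, $k$) are illustrative assumptions only.

```python
import itertools, random

def min_vertex_cover(edges, vertices):
    vertices = sorted(vertices)
    for size in range(len(vertices) + 1):
        for cover in itertools.combinations(vertices, size):
            c = set(cover)
            if all(u in c or v in c for (u, v) in edges):
                return c
    return set(vertices)

def max_matching(edges):
    """Brute-force maximum matching (fine for the tiny sketch instances)."""
    edges, best = list(edges), []
    def grow(i, used, cur):
        nonlocal best
        if len(cur) > len(best):
            best = list(cur)
        for j in range(i, len(edges)):
            u, v = edges[j]
            if u not in used and v not in used:
                grow(j + 1, used | {u, v}, cur + [edges[j]])
    grow(0, set(), [])
    return best

def repeated_hallucination_matchings(vertices, probs, q_e, tau, k, rng):
    """Sketch of Algorithm 2: k fully hallucinated runs, one maximum matching
    per run, then cross-deletion so the surviving matchings are induced."""
    edges = list(probs)
    H = {e for e in edges if q_e[e] < 1 - tau}
    H_bar = [e for e in edges if e not in H]
    G2_list, M_list = [], []
    for _ in range(k):
        G1 = [e for e in edges if rng.random() < probs[e]]   # hallucinate every edge
        V1 = min_vertex_cover(G1, vertices)
        H_hat = [e for e in G1 if e in H]                    # hallucinated copy of H
        G2 = [(u, v) for (u, v) in H_hat + H_bar if u not in V1 and v not in V1]
        G2_list.append(set(G2))
        M_list.append(max_matching(G2))
    pruned = []
    for i, M in enumerate(M_list):
        others = set().union(*(G2_list[:i] + G2_list[i + 1:]))
        pruned.append([e for e in M if e not in others])     # M_i'
    return pruned

if __name__ == "__main__":
    rng = random.Random(3)
    vertices = [0, 1, 2, 3, 4]
    probs = {(0, 1): 0.6, (1, 2): 0.6, (2, 3): 0.6, (3, 4): 0.6, (4, 0): 0.6}
    q_e = {e: 0.8 for e in probs}   # assumed estimates, for illustration only
    for m in repeated_hallucination_matchings(vertices, probs, q_e, tau=0.3, k=3, rng=rng):
        print(m)
```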
The next claim bounds the probability of an edge being in $\mathcal{G}_2^{(i)}$.
Claim 10.
For each edge $e \in E$ and each graph $\mathcal{G}_2^{(i)}$, we have that $\Pr[e \in \mathcal{G}_2^{(i)}] \le \tau$.
Proof.
For the edge $e$ we have two cases: $e \in H$ or $e \in \bar{H}$. If we have $e \in H$ and $e \notin \widehat{H}^{(i)}$, then this edge will not be present in $\mathcal{G}_2^{(i)}$; otherwise, if $e \in \widehat{H}^{(i)}$, then since $V_1^{(i)}$ covers all the edges of $\mathcal{G}_1^{(i)} \supseteq \widehat{H}^{(i)}$, this edge will not be present in $\mathcal{G}_2^{(i)}$ either. So the only case in which $e$ can appear in $\mathcal{G}_2^{(i)}$ is when we have $e \in \bar{H}$. In this case, since we have $q_e \ge 1 - \tau$ and $V_1^{(i)}$ is a random minimum vertex cover of $\mathcal{G}_1^{(i)}$, the edge will be covered by $V_1^{(i)}$ with probability $q_e$ (this is because $V_1^{(i)}$ comes from the same distribution as $\mathrm{OPT}$ according to Claim 8). So this edge is present in $\mathcal{G}_2^{(i)}$ with a probability of at most $1 - q_e \le \tau$.
The following claim proves that the matchings $M_i'$ are induced matchings.
Claim 11.
The matchings $M_1', \ldots, M_k'$ formed in the random process above are induced matchings.
Proof.
Each graph $\mathcal{G}_2^{(i)}$ is the induced subgraph of $\widehat{H}^{(i)} \cup \bar{H}$ on $V \setminus V_1^{(i)}$; in particular, it contains every edge of $\bar{H}$ whose endpoints both lie outside $V_1^{(i)}$. Suppose that the $M_i'$'s are not induced matchings. Since we delete from each $M_i$ the edges present in the other $\mathcal{G}_2^{(j)}$'s, there is no edge that is present in both $M_i'$ and $M_j'$ for $j \ne i$. Hence, there must exist two edges $e_1 = (u_1, v_1)$ and $e_2 = (u_2, v_2)$ present in $M_i'$ and an edge such as $f = (u_1, u_2)$ present in $M_j'$ for some $j \ne i$ to break the property of induced matchings. We call an edge like $f$ a cross edge. In this case, we can argue that the edge $f$ is also present in $\mathcal{G}_2^{(i)}$, since $f \in \bar{H}$ and both of its endpoints are among the vertices of $\mathcal{G}_2^{(i)}$. Since $f$ is present in $\mathcal{G}_2^{(i)}$ and $i \ne j$, this edge gets deleted and cannot be part of $M_j'$. This contradicts what we assumed. Therefore, we can conclude that the $M_i'$'s are induced matchings.
Claim 12.
Suppose we set $k = \left\lfloor \frac{1}{2\tau} \right\rfloor$ and that Algorithm 1 does not have an additive approximation error of at most $\varepsilon n$. Then we have $\mathbb{E}[|M_i'|] \ge \frac{\varepsilon n}{4}$ for every $i$.
Proof.
Due to Lemma 9, we have that $\mathbb{E}[|M_i|] = \mathbb{E}[\mu(\mathcal{G}_2^{(i)})] \ge \frac{\varepsilon n}{2}$. Now we should argue that with the edge deletions, each $M_i$ will lose at most half of its edges in expectation. Take an edge $e$ in $M_i$. For each $j \ne i$ we have $\Pr[e \in \mathcal{G}_2^{(j)}] \le \tau$ due to Claim 10, and for $e$ to be deleted it must be present in at least one other $\mathcal{G}_2^{(j)}$. By an application of the union bound, the edge is present in at least one other $\mathcal{G}_2^{(j)}$ with probability at most $(k-1)\tau \le k\tau \le \frac{1}{2}$. Hence, since the $\mathcal{G}_2^{(j)}$'s for $j \ne i$ are independent of $M_i$, we get
$$\mathbb{E}[|M_i \setminus M_i'|] \le \frac{1}{2}\,\mathbb{E}[|M_i|].$$
And for the size of $M_i'$ we have:
$$\mathbb{E}[|M_i'|] \ge \frac{1}{2}\,\mathbb{E}[|M_i|] \ge \frac{\varepsilon n}{4}.$$
Lemma 13.
Suppose that Algorithm 1 does not have an additive approximation error of at most $\varepsilon n$. Then, by running the process explained above in Algorithm 2, we can find an $(r, t)$-RS graph (see Definition 2) with $r \ge k$ and $t = \frac{\varepsilon n}{8}$ on $\bar{H}$.
Proof.
Due to Claim 12, we can have $k$ matchings, each with expected size at least $\frac{\varepsilon n}{4}$. Putting these matchings together gives us at least $\frac{k\varepsilon n}{4}$ expected edges in total. Suppose we break each matching $M_i'$ into smaller matchings of size $t = \frac{\varepsilon n}{8}$ each. If the number of edges in $M_i'$ is not divisible by $t$, we lose at most the remainder of $|M_i'|$ divided by $t$, which is at most $t$. The new matchings each have size exactly $t$ and are induced matchings. This is because the original $M_i'$'s did not have a cross edge (see Claim 11 for what a cross edge is), and hence the new matchings also do not have a cross edge, since we are just partitioning the edges of each $M_i'$ into smaller matchings. The expected total size of the new set of matchings is at least
$$\frac{k\varepsilon n}{4} - k \cdot \frac{\varepsilon n}{8} = \frac{k\varepsilon n}{8}.$$
This is because the total number of removed edges is bounded by the number of edges removed from each $M_i'$, which is at most $t = \frac{\varepsilon n}{8}$, times the number of $M_i'$'s, which is $k$. Since we are running a random process which in expectation gives us induced matchings of size $t$ with total size $\frac{k\varepsilon n}{8}$, there is an instance in which we have at least $\frac{k\varepsilon n}{8}$ edges in total. From this we can construct an $(r, t)$-RS graph with $r \ge \frac{k\varepsilon n/8}{t} = k$ and $t = \frac{\varepsilon n}{8}$.
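The bookkeeping in this proof (splitting each $M_i'$ into pieces of a fixed size $t$ and discarding each remainder) is elementary; the short Python sketch below just makes the counting explicit on made-up matchings, with vertex labels chosen only for illustration.

```python
def split_into_pieces(matchings, t):
    """Split each matching into edge-disjoint pieces of size exactly t,
    discarding the (< t) leftover of each matching; sub-matchings of an
    induced matching stay induced in the union."""
    pieces = []
    for m in matchings:
        for start in range(0, len(m) - len(m) % t, t):
            pieces.append(m[start:start + t])
    return pieces

# Example: matchings of sizes 7, 5, and 9 split into pieces of size t = 3.
matchings = [[("a%d_%d" % (j, i), "b%d_%d" % (j, i)) for i in range(s)]
             for j, s in enumerate((7, 5, 9))]
pieces = split_into_pieces(matchings, t=3)
print(len(pieces), "pieces,", sum(len(p) for p in pieces), "of", 7 + 5 + 9, "edges kept")
# -> 6 pieces, 18 of 21 edges kept: each matching loses fewer than t edges.
```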
Now we are ready to prove Theorem 5. We restate the theorem below.
Theorem 5 (restated).
Proof.
Suppose that Algorithm 1 does not find a vertex cover with an additive approximation error of at most $\varepsilon n$. Then, by setting $k = \left\lfloor \frac{1}{2\tau} \right\rfloor = 2(\mathsf{RS}(n)+1)$ in Lemma 13, we get at least $2(\mathsf{RS}(n)+1)$ induced matchings, each of size $\frac{\varepsilon n}{8}$. This is in contradiction with what we assumed, since $\mathsf{RS}(n)$ is the maximum number of induced edge-disjoint matchings of size $\Theta(n)$ in an $n$-vertex graph, and we have now found more matchings of this size. This means that indeed Algorithm 1 has an additive approximation error of at most $\varepsilon n$ for $\tau = \frac{1}{4(\mathsf{RS}(n)+1)}$. Due to Claim 7, we have $|H| \le \tilde{O}\!\left(\frac{n}{p\tau}\right)$, which for this $\tau$ gives $|H| \le \tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{p}\right)$.
3 Getting a Multiplicative Approximation
In this section, we modify Algorithm 1 to obtain a multiplicative approximation of $1+\varepsilon$. Our modification is simple yet effective. At the beginning of the algorithm, we set $V_0 = \{v \in V : q_v \ge 1 - \varepsilon/4\}$ and commit $V_0$ to the final solution. The intuition behind this is that since these vertices have high $q_v$, they have a high probability of being in the optimal solution, so it does not hurt us to include them in the final solution.
We do this specifically to be able to prove Lemma 16, which states that $q_v$ for the vertices of a certain subgraph is lower bounded by $\Omega(\varepsilon)$. This means that if we find an algorithm whose additive error is small in terms of the number of vertices of this subgraph, then it is also a multiplicative approximation, because of the lower bound on the $q_v$'s of this subgraph. The modified procedure, Algorithm 3, first commits $V_0$ and then runs the two-step procedure of Algorithm 1 on the remaining graph.
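The following is a hedged Python sketch of this modification: pre-commit the high-$q_v$ vertices, drop the edges they already cover, and then run the two-step procedure of Algorithm 1 on what is left. The commit threshold `1 - eps / 4` mirrors the parameterization used in the analysis below and is an assumption of this sketch, as are the helper names; it can be driven with the same kind of toy inputs and estimated $q$ values as the sketch following Algorithm 1.

```python
import itertools

def min_vertex_cover(edges, vertices):
    """Brute-force stand-in for the fixed MVC oracle."""
    vertices = sorted(vertices)
    for size in range(len(vertices) + 1):
        for cover in itertools.combinations(vertices, size):
            c = set(cover)
            if all(u in c or v in c for (u, v) in edges):
                return c
    return set(vertices)

def modified_cover(vertices, probs, q_v, q_e, eps, tau, rng):
    """Sketch of Algorithm 3: commit V0 = {v : q_v >= 1 - eps/4}, drop the
    edges V0 already covers, then run the two-step procedure on the rest."""
    V0 = {v for v in vertices if q_v[v] >= 1 - eps / 4}
    rest = {e: p for e, p in probs.items() if e[0] not in V0 and e[1] not in V0}
    edges = list(rest)
    H = [e for e in edges if q_e[e] < 1 - tau]
    H_bar = [e for e in edges if e not in set(H)]
    G_H = [e for e in H if rng.random() < rest[e]]              # simulated queries
    H_bar_hat = [e for e in H_bar if rng.random() < rest[e]]    # hallucination
    V1 = min_vertex_cover(G_H + H_bar_hat, vertices)
    G2 = [(u, v) for (u, v) in G_H + H_bar if u not in V1 and v not in V1]
    V2 = min_vertex_cover(G2, vertices)
    return V0 | V1 | V2
```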
The first lemma we prove bounds the expected size of $V_0 \cup V_1$.
Lemma 14.
$\mathbb{E}[|V_0 \cup V_1|] \le \left(1 + \frac{\varepsilon}{2}\right)\cdot \mathsf{opt}$ for $\varepsilon \le 1$.
Proof.
We can write $\mathbb{E}[|V_0 \cup V_1|]$ as follows:
$$\mathbb{E}[|V_0 \cup V_1|] = \mathbb{E}[|V_1|] + \mathbb{E}[|V_0 \setminus V_1|].$$
Due to Claim 8, we know that $V_1$ has the same distribution as $\mathrm{OPT}$. Take an arbitrary vertex $v \in V_0$. Since $V_1$ is a random minimum vertex cover, any vertex $v$ has a probability of $q_v$ of also being in $V_1$. And since for $v \in V_0$ we have $q_v \ge 1 - \frac{\varepsilon}{4}$, we have
$$\mathbb{E}[|V_0 \setminus V_1|] = \sum_{v \in V_0} (1 - q_v) \le \frac{\varepsilon}{4}\,|V_0|. \tag{1}$$
We know that $|V_0| \le \frac{\mathsf{opt}}{1 - \varepsilon/4} \le 2\,\mathsf{opt}$; this is because $\mathsf{opt} = \sum_{v \in V} q_v \ge \sum_{v \in V_0} q_v$ and we have $q_v \ge 1 - \frac{\varepsilon}{4}$ for every $v \in V_0$. From Equation (1), we can get
$$\mathbb{E}[|V_0 \setminus V_1|] \le \frac{\varepsilon}{4} \cdot \frac{\mathsf{opt}}{1 - \varepsilon/4}. \tag{2}$$
Since we have $\varepsilon \le 1$, from Equation (2) we get $\mathbb{E}[|V_0 \setminus V_1|] \le \frac{\varepsilon}{2}\,\mathsf{opt}$. Putting this together with $\mathbb{E}[|V_1|] = \mathsf{opt}$ (due to Claim 8), we get
$$\mathbb{E}[|V_0 \cup V_1|] \le \left(1 + \frac{\varepsilon}{2}\right)\mathsf{opt}.$$
From Lemma 14, we can argue that if Algorithm 3 has a worse than $(1+\varepsilon)$ multiplicative approximation in expectation, then $\mathbb{E}[|V_2|]$ is greater than $\frac{\varepsilon}{2}\,\mathsf{opt}$. We use this fact in the following lemma.
Lemma 15.
If Algorithm 3 does not achieve a multiplicative approximation of at most $1+\varepsilon$ in expectation, then $\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{\varepsilon}{4}\,\mathsf{opt}$.
Proof.
Due to Lemma 14, we have $\mathbb{E}[|V_0 \cup V_1|] \le \left(1 + \frac{\varepsilon}{2}\right)\mathsf{opt}$, so if Algorithm 3 does not have a multiplicative approximation of at most $1+\varepsilon$, then we can conclude $\mathbb{E}[|V_2|] > \frac{\varepsilon}{2}\,\mathsf{opt}$. In any graph, the maximum matching size is at least half the minimum vertex cover size. Applying this statement to the graph $\mathcal{G}_2$ gives us
$$\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{1}{2}\,\mathbb{E}[|V_2|] > \frac{\varepsilon}{4}\,\mathsf{opt}.$$
The next lemma proves a lower bound on $q_v$ for the vertices outside $V_0$ that have at least one edge of $\bar{H}$. We call this graph $\bar{H}_2$ for future reference; that is, we take $\bar{H}$, remove the vertices in $V_0$, and then remove the vertices having no remaining edge.
Lemma 16.
For any vertex $v$ of $\bar{H}_2$, we have $q_v \ge \frac{\varepsilon}{8}$ for $\tau \le \frac{\varepsilon}{8}$.
Proof.
Take an edge $e = (u, v) \in \bar{H}_2$; such an edge exists since $v$ has a degree of at least one in $\bar{H}_2$. For this edge we know that $q_e \ge 1 - \tau$, since we put all the edges with $q_e < 1 - \tau$ in $H$. Also, we have $q_u < 1 - \frac{\varepsilon}{4}$ since $u \notin V_0$. We also know that $q_e \le q_u + q_v$ by a union bound. Putting this all together, we get that $q_v \ge q_e - q_u \ge (1 - \tau) - \left(1 - \frac{\varepsilon}{4}\right) = \frac{\varepsilon}{4} - \tau$, which for a $\tau \le \frac{\varepsilon}{8}$ gives $q_v \ge \frac{\varepsilon}{8}$.
Claim 17.
We have $\mathsf{opt} \ge \frac{\varepsilon}{8}\,|V(\bar{H}_2)|$ for $\tau \le \frac{\varepsilon}{8}$.
Proof.
By Lemma 16, every vertex $v$ of $\bar{H}_2$ has $q_v \ge \frac{\varepsilon}{8}$. Therefore, $\mathsf{opt} = \sum_{v \in V} q_v \ge \sum_{v \in V(\bar{H}_2)} q_v \ge \frac{\varepsilon}{8}\,|V(\bar{H}_2)|$.
Putting Claim 17 and Lemma 15 together, we get that if Algorithm 3 does not give a $(1+\varepsilon)$-approximation, then $\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{\varepsilon}{4}\,\mathsf{opt} \ge \frac{\varepsilon^2}{32}\,|V(\bar{H}_2)|$. The next lemma asserts that if Algorithm 3 does not find a $(1+\varepsilon)$-approximation, then we can find a large RS-graph in $\bar{H}_2$.
Lemma 18.
If Algorithm 3 is not a $(1+\varepsilon)$-approximation, then we can find an $(r, t)$-RS graph (see Definition 2) with $r \ge k$ and $t = \Theta(\varepsilon^2 n_2)$ on $\bar{H}_2$, where $n_2 = |V(\bar{H}_2)|$.
Proof.
Suppose we set $k = \left\lfloor \frac{1}{2\tau} \right\rfloor$. Then we can run the algorithm explained in Algorithm 2 and put together the matchings formed from different realizations of $\mathcal{G}_2$ to find an RS-graph whose matchings each have size $\Theta(\varepsilon^2 n_2)$. Due to having $q_e \ge 1 - \tau$ for every edge of $\bar{H}_2$, we can prove the analogue of Claim 10: for edges in $\bar{H}_2$, we can argue that they exist in $\mathcal{G}_2^{(i)}$ with a probability of at most $\tau$, similar to how we proved it for Algorithm 1 in Claim 10. Moreover, just as we proved for Algorithm 2 that the matchings are induced matchings, we form the $M_i'$'s as in Algorithm 2 and can argue that they are induced matchings. Then we can also prove Claim 12 and Lemma 13 with the bound $\mathbb{E}[\mu(\mathcal{G}_2)] \ge \frac{\varepsilon^2}{32}\,n_2$ in place of $\frac{\varepsilon n}{2}$. Therefore, similar to how we proved Lemma 13, we can find an $(r, t)$-RS graph with $r \ge k$ and $t = \Theta(\varepsilon^2 n_2)$ on $\bar{H}_2$. Now we are ready to prove the main result of this work.
Result 3 (restated).
Proof.
Let us run Algorithm 3 with $\tau = \min\!\left\{\frac{\varepsilon}{8},\ \frac{1}{4(\mathsf{RS}(n)+1)}\right\}$ and $k = \left\lfloor \frac{1}{2\tau} \right\rfloor$, meaning we have $k \ge 2(\mathsf{RS}(n)+1) > \mathsf{RS}(n)$. We proceed similarly to the proof of Theorem 5. Suppose that Algorithm 3 does not find a vertex cover with a $(1+\varepsilon)$-approximation. Then, by Lemma 18, we can find an $(r, t)$-RS graph in $\bar{H}_2$ for $r \ge k$ and $t = \Theta(\varepsilon^2 n_2)$, and since $n_2 \le n$ we have $\mathsf{RS}(n_2) \le \mathsf{RS}(n)$. By our setting of $k$, we get more than $\mathsf{RS}(n_2)$ induced matchings, each of size $\Theta(n_2)$ (recall that $\varepsilon$ is an absolute constant). This is in contradiction with the definition of $\mathsf{RS}(n_2)$ as the maximum number of induced edge-disjoint matchings of this size in an $n_2$-vertex graph. This means that indeed Algorithm 3 has an approximation ratio of at most $1+\varepsilon$ for this choice of $\tau$. Due to Claim 7, we have $|H| \le \tilde{O}\!\left(\frac{n}{p\tau}\right)$, which for our choice of $\tau$ gives $|H| \le \tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$.
Correlated Edges
Suppose that $G$ has some edges that are realized in $\mathcal{G}$ independently, called $E_I$, and some edges that are correlated with each other and with the edges in $E_I$, called $E_C$. This means that the edges in $E_C$ might exist based on which edges in $E_C$ or $E_I$ are realized. We need to adjust the definition of hallucination and realize the correlated edges by drawing them from the given joint distribution rather than realizing them independently. Moreover, if a set of them are queried and we want to hallucinate the rest, it should be done conditionally on the query answers.
Notice that the only place we use the independent realization of edges is in the proof of Claim 7. So we can argue that among the edges in $E_I$, at most $\tilde{O}\!\left(\frac{n}{p\tau}\right)$ edges will be in $H$. Furthermore, we know that $|H| \le |H_I| + |H_C|$, where $H_I$ is the subgraph of $E_I$ that is in $H$ and $H_C$ is the subgraph of $E_C$ that is in $H$. Since the number of correlated edges is at most $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$, we have $|H_C| \le \tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$, so we get:
$$|H| \le \tilde{O}\!\left(\frac{n}{p\tau}\right) + \tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right) = \tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right).$$
Based on the explanation above, we derive the following corollary.
Corollary 19.
Our Algorithm 3 can handle up to $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$ correlated edges for any absolute constant $\varepsilon > 0$ and still provide a $(1+\varepsilon)$-approximation to the optimal solution with $\tilde{O}\!\left(\frac{n \cdot \mathsf{RS}(n)}{\varepsilon p}\right)$ queries.
References
- [1] Sepehr Assadi. A two-pass (conditional) lower bound for semi-streaming maximum matching. In Joseph (Seffi) Naor and Niv Buchbinder, editors, Proceedings of the 2022 ACM-SIAM Symposium on Discrete Algorithms, SODA 2022, Virtual Conference / Alexandria, VA, USA, January 9 - 12, 2022, pages 708–742. SIAM, 2022. doi:10.1137/1.9781611977073.32.
- [2] Sepehr Assadi, Soheil Behnezhad, Sanjeev Khanna, and Huan Li. On regularity lemma and barriers in streaming and dynamic matching. In Barna Saha and Rocco A. Servedio, editors, Proceedings of the 55th Annual ACM Symposium on Theory of Computing, STOC 2023, Orlando, FL, USA, June 20-23, 2023, pages 131–144. ACM, 2023. doi:10.1145/3564246.3585110.
- [3] Sepehr Assadi and Aaron Bernstein. Towards a unified theory of sparsification for matching problems. In Jeremy T. Fineman and Michael Mitzenmacher, editors, 2nd Symposium on Simplicity in Algorithms, SOSA 2019, January 8-9, 2019, San Diego, CA, USA, volume 69 of OASIcs, pages 11:1–11:20. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2019. doi:10.4230/OASICS.SOSA.2019.11.
- [4] Sepehr Assadi and Sanjeev Khanna. Improved bounds for fully dynamic matching via ordered ruzsa-szemerédi graphs. CoRR, abs/2406.13573, 2024. doi:10.48550/arXiv.2406.13573.
- [5] Sepehr Assadi, Sanjeev Khanna, and Yang Li. The stochastic matching problem: Beating half with a non-adaptive algorithm. In Constantinos Daskalakis, Moshe Babaioff, and Hervé Moulin, editors, Proceedings of the 2017 ACM Conference on Economics and Computation, EC ’17, Cambridge, MA, USA, June 26-30, 2017, pages 99–116. ACM, 2017. doi:10.1145/3033274.3085146.
- [6] Sepehr Assadi, Sanjeev Khanna, and Yang Li. The stochastic matching problem with (very) few queries. ACM Trans. Economics and Comput., 7(3):16:1–16:19, 2019. doi:10.1145/3355903.
- [7] Soheil Behnezhad, Avrim Blum, and Mahsa Derakhshan. Stochastic vertex cover with few queries. In Joseph (Seffi) Naor and Niv Buchbinder, editors, Proceedings of the 2022 ACM-SIAM Symposium on Discrete Algorithms, SODA 2022, Virtual Conference / Alexandria, VA, USA, January 9 - 12, 2022, pages 1808–1846. SIAM, 2022. doi:10.1137/1.9781611977073.73.
- [8] Soheil Behnezhad and Mahsa Derakhshan. Stochastic weighted matching: (1−ε) approximation. In Sandy Irani, editor, 61st IEEE Annual Symposium on Foundations of Computer Science, FOCS 2020, Durham, NC, USA, November 16-19, 2020, pages 1392–1403. IEEE, 2020. doi:10.1109/FOCS46700.2020.00131.
- [9] Soheil Behnezhad, Mahsa Derakhshan, Alireza Farhadi, MohammadTaghi Hajiaghayi, and Nima Reyhani. Stochastic matching on uniformly sparse graphs. In Dimitris Fotakis and Evangelos Markakis, editors, Algorithmic Game Theory - 12th International Symposium, SAGT 2019, Athens, Greece, September 30 - October 3, 2019, Proceedings, volume 11801 of Lecture Notes in Computer Science, pages 357–373. Springer, 2019. doi:10.1007/978-3-030-30473-7_24.
- [10] Soheil Behnezhad, Mahsa Derakhshan, and MohammadTaghi Hajiaghayi. Stochastic matching with few queries: (1−ε) approximation. In Konstantin Makarychev, Yury Makarychev, Madhur Tulsiani, Gautam Kamath, and Julia Chuzhoy, editors, Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020, Chicago, IL, USA, June 22-26, 2020, pages 1111–1124. ACM, 2020. doi:10.1145/3357713.3384340.
- [11] Soheil Behnezhad and Alma Ghafari. Fully dynamic matching and ordered ruzsa-szemerédi graphs. In 65th IEEE Annual Symposium on Foundations of Computer Science, FOCS 2024, 2024.
- [12] Soheil Behnezhad and Nima Reyhani. Almost optimal stochastic weighted matching with few queries. In Éva Tardos, Edith Elkind, and Rakesh Vohra, editors, Proceedings of the 2018 ACM Conference on Economics and Computation, Ithaca, NY, USA, June 18-22, 2018, pages 235–249. ACM, 2018. doi:10.1145/3219166.3219226.
- [13] Avrim Blum, John P. Dickerson, Nika Haghtalab, Ariel D. Procaccia, Tuomas Sandholm, and Ankit Sharma. Ignorance is almost bliss: Near-optimal stochastic matching with few queries. In Tim Roughgarden, Michal Feldman, and Michael Schwarz, editors, Proceedings of the Sixteenth ACM Conference on Economics and Computation, EC ’15, Portland, OR, USA, June 15-19, 2015, pages 325–342. ACM, 2015. doi:10.1145/2764468.2764479.
- [14] Mahsa Derakhshan, Naveen Durvasula, and Nika Haghtalab. Stochastic minimum vertex cover in general graphs: A 3/2-approximation. In Barna Saha and Rocco A. Servedio, editors, Proceedings of the 55th Annual ACM Symposium on Theory of Computing, STOC 2023, Orlando, FL, USA, June 20-23, 2023, pages 242–253. ACM, 2023. doi:10.1145/3564246.3585230.
- [15] Mahsa Derakhshan and Mohammad Saneian. Query efficient weighted stochastic matching. CoRR, abs/2311.08513, 2023. doi:10.48550/arXiv.2311.08513.
- [16] Shaddin Dughmi, Yusuf Hakan Kalayci, and Neel Patel. On sparsification of stochastic packing problems. In Kousha Etessami, Uriel Feige, and Gabriele Puppis, editors, 50th International Colloquium on Automata, Languages, and Programming, ICALP 2023, July 10-14, 2023, Paderborn, Germany, volume 261 of LIPIcs, pages 51:1–51:17. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2023. doi:10.4230/LIPICS.ICALP.2023.51.
- [17] Eldar Fischer, Eric Lehman, Ilan Newman, Sofya Raskhodnikova, Ronitt Rubinfeld, and Alex Samorodnitsky. Monotonicity testing over general poset domains. In Proceedings of the thiry-fourth annual ACM symposium on Theory of computing, pages 474–483, 2002. doi:10.1145/509907.509977.
- [18] Jacob Fox. A new proof of the graph removal lemma. Annals of Mathematics, pages 561–579, 2011.
- [19] Jacob Fox, Hao Huang, and Benny Sudakov. On graphs decomposable into induced matchings of linear sizes. Bulletin of the London Mathematical Society, 49(1):45–57, 2017.
- [20] Ashish Goel, Michael Kapralov, and Sanjeev Khanna. On the communication and streaming complexity of maximum bipartite matching. In Yuval Rabani, editor, Proceedings of the Twenty-Third Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2012, Kyoto, Japan, January 17-19, 2012, pages 468–485. SIAM, 2012. doi:10.1137/1.9781611973099.41.
- [21] Michel X. Goemans and Jan Vondrák. Covering minimum spanning trees of random subgraphs. In J. Ian Munro, editor, Proceedings of the Fifteenth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2004, New Orleans, Louisiana, USA, January 11-14, 2004, pages 934–941. SIAM, 2004. URL: http://dl.acm.org/citation.cfm?id=982792.982933.
- [22] Michel X. Goemans and Jan Vondrák. Covering minimum spanning trees of random subgraphs. Random Struct. Algorithms, 29(3):257–276, 2006. doi:10.1002/RSA.20115.
- [23] Imre Z. Ruzsa and Endre Szemerédi. Triple systems with no six points carrying three triangles. Combinatorics (Keszthely, 1976), Colloq. Math. Soc. János Bolyai, 18:939–945, 1978.
- [24] Jan Vondrák. Shortest-path metric approximation for random subgraphs. Random Struct. Algorithms, 30(1-2):95–104, 2007. doi:10.1002/RSA.20150.
- [25] Yutaro Yamaguchi and Takanori Maehara. Stochastic packing integer programs with few queries. In Artur Czumaj, editor, Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018, New Orleans, LA, USA, January 7-10, 2018, pages 293–310. SIAM, 2018. doi:10.1137/1.9781611975031.21.