
One-Way Communication Complexity of Minimum Vertex Cover in General Graphs

Mahsa Derakhshan, Northeastern University, Boston, MA, USA; Andisheh Ghasemi, Northeastern University, Boston, MA, USA; Rajmohan Rajaraman, Northeastern University, Boston, MA, USA
Abstract

We study the communication complexity of the Minimum Vertex Cover (MVC) problem on general graphs within the k-party one-way communication model. Edges of an arbitrary n-vertex graph are distributed among k parties. The objective is for the parties to collectively find a small vertex cover of the graph while adhering to a communication protocol where each party sequentially sends a message to the next until the last party outputs a valid vertex cover of the whole graph. We are particularly interested in the trade-off between the size of the messages sent and the approximation ratio of the output solution.

It is straightforward to see that any constant approximation protocol for MVC requires communicating $\Omega(n)$ bits. Additionally, there exists a trivial 2-approximation protocol where the parties collectively find a maximal matching of the graph greedily and return the subset of vertices matched. This raises a natural question: What is the best approximation ratio achievable using optimal communication of $O(n)$? We design a protocol with an approximation ratio of $(2 - 2^{-k+1} + \epsilon)$ and $O(n)$ communication for any arbitrarily small constant $\epsilon > 0$, which is strictly better than 2 for any constant number of parties. Moreover, we show that achieving an approximation ratio smaller than $3/2$ for the two-party case requires $n^{1+\Omega(1/\lg\lg n)}$ communication, thereby establishing the tightness of our protocol for two parties.

A notable aspect of our protocol is that no edges are communicated between the parties. Instead, for any $1 \le i < k$, the $i$-th party only communicates a constant number of vertex covers of the edges assigned to the first $i$ parties. An interesting consequence is that the communication cost of our protocol is $O(n)$ bits, as opposed to the typical $\Omega(n \log n)$ bits required for many graph problems, such as maximum matching, where protocols commonly involve communicating edges.

Keywords and phrases:
Communication Complexity, Minimum Vertex Cover
Category:
Track A: Algorithms, Complexity and Games
Copyright and License:
© Mahsa Derakhshan, Andisheh Ghasemi, and Rajmohan Rajaraman; licensed under Creative Commons License CC-BY 4.0
2012 ACM Subject Classification:
Theory of computation → Communication complexity
Related Version:
Full Version: https://arxiv.org/pdf/2505.00164
Editors:
Keren Censor-Hillel, Fabrizio Grandoni, Joël Ouaknine, and Gabriele Puppis

1 Introduction

We study the communication complexity of the Minimum Vertex Cover (MVC) problem for general graphs within the k-party one-way communication model. In this model, the edges of an arbitrary n-vertex graph G=(V,E) are distributed among k parties, labeled from 1 to k. The objective is for the parties to collectively find a small vertex cover of the graph G while adhering to the following communication protocol: Each party, in sequence, sends a message to the next party, starting from the first one, and ultimately, the last party outputs a valid vertex cover of G. In this work, we focus particularly on the trade-off between the size of the messages sent and the approximation ratio of the output solution.

Extensive literature addresses various problems within this communication model, such as graph connectivity, set cover, minimum cut, and maximum matching (see, e.g., [17, 18, 23, 5, 16, 7]). The model, first introduced by Yao [26] in 1979, also has close relations with streaming algorithms, which has further contributed to the significant attention it has received. However, to the best of our knowledge, this is the first work to consider the minimum vertex cover problem for general graphs in this communication model.

It is straightforward to see that a message of size Ω(n) is necessary to achieve any constant approximation ratio for the Minimum Vertex Cover (MVC) problem; for example, if the entire graph is given to the first party and the MVC has size n/2. Moreover, with slight adjustments to a lower bound for the maximum bipartite matching problem [17], we prove that to achieve any approximation ratio smaller than 3/2 in the two-party case, ω(n) communication is necessary. On the positive side, there exists a trivial 2-approximation protocol where the parties collectively find a maximal matching of the graph greedily and return the subset of vertices matched in the maximal matching. Therefore, a natural question arises:

Question 1.

Is it possible to achieve better than a 2-approximation with the optimal communication complexity of O(n)?
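For reference, the trivial 2-approximation baseline mentioned above is easy to simulate. The following sketch is ours (a single-machine simulation of the one-way message passing, with hypothetical function and variable names); it is not part of the paper's formal protocol.

```python
# Sketch (ours): the folklore 2-approximation protocol. Each party greedily
# extends a maximal matching over its own edges and forwards the set of
# matched vertices; the final set covers every edge of the whole graph.

def greedy_two_approx(parties_edges):
    """parties_edges: list of edge lists, one per party (edges are vertex pairs)."""
    matched = set()  # the O(n)-bit message passed from party to party
    for edges in parties_edges:          # parties act in sequence (one-way)
        for u, v in edges:
            if u not in matched and v not in matched:
                matched.update((u, v))   # greedily add this edge to the matching
    return matched                       # matched vertices: a cover of size <= 2 * OPT

# Example: a path 1-2-3-4 split between two parties.
print(greedy_two_approx([[(1, 2)], [(2, 3), (3, 4)]]))  # {1, 2, 3, 4}
```

Every edge has a matched endpoint (otherwise the party holding it would have added it), so the returned set is a valid vertex cover, and it is at most twice the optimum because it consists of the endpoints of a matching.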

We note that this question was, in fact, open even with $n^{2-\Omega(1)}$ communication. In this work, we answer the question affirmatively for any constant number of parties. We design a protocol whose approximation ratio is a function of the number of parties. This protocol is optimal when there are only two parties. The approximation ratio and communication complexity of our protocol are as follows.

Theorem 1.

For any $k \ge 2$ and any arbitrarily small $\epsilon > 0$, there exists a randomized MVC protocol in the $k$-party one-way communication model with an expected approximation ratio of $(2 - 2^{-k+1} + \epsilon)$, in which each party communicates a message of size $O_{k,\epsilon}(n)$. (We use the notation $O_{k,\epsilon}(\cdot)$ to indicate that the hidden constant may depend on the parameters $k$ and $\epsilon$.) This approximation ratio is tight for $k=2$ up to a factor of $1+\epsilon$.

An interesting aspect of our protocol is that no edges are communicated between the parties. Instead, for any $1 \le i < k$, the $i$-th party communicates only a constant number of vertex covers (dependent on $\epsilon$ and $k$) of all the edges assigned to the first $i$ parties. As a result, the communication cost of our protocol is $O(n)$ bits, in contrast to the typical $\Omega(n \log n)$ bits required for approximating many graph problems [23], such as maximum matching, where protocols often involve communicating edges.

While the problem studied in this work is information-theoretic rather than computational, it is worth noting that our protocol is not polynomial-time. This is expected, as achieving any approximation ratio better than 2 for the minimum vertex cover problem is known to be computationally hard under the Unique Games Conjecture [19]. Additionally, our protocol can be considered non-explicit, because we provide an existential proof for parts of our protocol rather than explicitly constructing it in full (due to the use of von Neumann's Minimax Theorem). However, the protocol can, in principle, be identified through a brute-force search in doubly exponential time.

Relation to streaming algorithms

A key motivation behind studying the communication complexity of one-way protocols for graph problems is their close relation to the streaming model [15]. In the single-pass streaming model, the edges of an $n$-vertex graph arrive sequentially in a stream in an arbitrary order. A streaming algorithm needs to construct a solution using memory smaller than the input size. Any lower bound established for the communication complexity in the one-way model directly implies a lower bound on the memory requirement in the streaming model. This, indeed, has been the standard approach for proving streaming lower bounds. Similarly, any protocol developed within the communication framework can provide insights into designing streaming algorithms.

A long-standing, central open question in the streaming literature is whether it is possible to achieve a better than $0.5$-approximate matching in $\tilde{O}(n)$ space with only a single pass. Even estimating the size of the maximum matching (the dual of minimum vertex cover) within a factor greater than $0.5$ remains unresolved in this setting. The same holds true for surpassing a 2-approximation for Minimum Vertex Cover (MVC) in bipartite graphs, as well as the seemingly more challenging problem of approximating MVC on general graphs.

Our work highlights an important point for those aiming to prove the impossibility of better-than-2 approximation for streaming MVC within O(n) space: the standard approach of using the one-way communication model with a constant number of parties does not suffice, as we present a protocol that achieves better-than-2 approximation when k is a constant. On a positive note, we hope that our protocol will inspire the development of space-efficient streaming algorithms.

1.1 Our Techniques

In this section, we provide a brief overview of our protocol. A more detailed explanation of the protocol for two parties is presented in Section 2, and the multi-party protocol is formally introduced and analyzed in Section 3. A lower bound for the two-party case is presented in Section 4, and the proof of the main theorem is put together in Section 5.

In our protocol, each party $i$, for $1 \le i < k$, communicates a set of vertex covers of the edges assigned to parties 1 through $i$. This information is clearly sufficient for the last party to obtain a valid vertex cover of the entire graph. Interestingly, it turns out that communicating only a constant number of vertex covers (this constant is independent of $n$, the number of vertices, but depends on $k$, the number of parties, and on $\epsilon$) provides enough information to find a sufficiently small vertex cover of the whole graph. To construct this message, each party $i$ must solve several instances of the following problem: Let $S$ be one of the vertex covers communicated to party $i$, and let $\mathcal{D}_B$ be a distribution on the edges assigned to the future parties ($i+1$ to $k$). Given this information, party $i$ aims to add a subset of vertices to $S$ to cover her edges while ensuring that this does not significantly increase the final approximation ratio.

Let $c_v$ denote the probability that vertex $v$ belongs to the optimal solution given the distribution $\mathcal{D}_B$. Party $i$ must choose which vertices to add to $S$. To make this decision, she assigns weights to the vertices, where higher weights indicate a poorer fit for inclusion in the solution. The party then computes the minimum-weight vertex cover based on these assigned weights. We aim to design the weight assignments so that vertices with higher weights are less likely to be selected. Additionally, we aim to provide earlier parties with less flexibility in their choices, as they have access to less information. To achieve this, we define the weight function as:

$$w_v = 1 - (2 - \beta_{k-i})\,c_v, \qquad (1)$$

where $(2 - \beta_{k-i})$ is the approximation ratio achievable in a protocol with $k-i$ parties (the number of remaining parties). This means that if $i$ is the second to last party (i.e., $k-i=1$), then she sets $w_v = 1 - c_v$, since a single-party protocol finds the optimal solution ($\beta_1 = 1$). Finally, analyzing the approximation ratio boils down to upper-bounding the weight of this minimum weight vertex cover, which we discuss further in Sections 3.1 and 3.3. Once this upper bound is established, we use an inductive proof to show that party $i$ can add a subset of vertices to $S$ ensuring

$$\frac{\mathbb{E}\big[|\text{solution output by the final party}|\big]}{\mathbb{E}\big[|\text{optimal solution}|\big]} \le 2 - 2^{-k+1}, \qquad (2)$$

where the expectation is taken over 𝒟B. Ideally, we would prefer this to be an instance-wise approximation ratio, where the upper bound holds for the expectation of the ratio rather than for the ratio of expectations. In such a case, a simple application of von Neumann’s Minimax Theorem would suffice to establish the existence of our desired protocol. Nevertheless, this upper bound, combined with a simple trick (also used by Assadi and Behnezhad [2]), can still give us the same approximation ratio with a small additive loss.

The key idea is that if the minimum vertex covers of all instances in the support of the distribution $\mathcal{D}_B$ have nearly the same size, then the per-instance approximation ratio and the ratio of expectations will also be approximately equal. To use this, each party discretizes the range of possible optimal solution sizes into a constant number of intervals. Each party then solves a separate instance of the problem under the assumption that the true optimal solution size falls within a particular interval. For each of these size estimates, the party is facing a distinct problem instance. The number of size estimates (or "guesses") each party makes depends on their position among the $k$ parties: parties that appear earlier in the protocol need to cover a wider range of possible sizes and hence make a larger number of guesses. Ultimately, the total number of vertex covers that the last party receives is proportional to the product of the numbers of guesses made by the first $k-1$ parties. See Figure 2 for a depiction of this protocol.

1.2 Related Work

In recent works, several advancements have been made in the communication complexity of various graph-related problems. Assadi et al. [6] established optimal lower bounds for approximating the Set Cover problem in the two-party communication model, which also correspond to streaming lower bounds, thereby addressing both streaming and communication complexity. Additionally, Abboud et al. [1] investigated the same communication model for exact answers, allowing back-and-forth communication, while Dark and Konrad [12] modified this model to include deletions. The model has also been studied by Assadi and Khanna [4] in a simultaneous framework under random partitions. Moreover, Naidu and Shah [22] as well as Chitnis et al. [11] contributed relevant insights in the dynamic data stream model, particularly concerning the vertex cover problem.

Most related to our work is the rich literature on the communication complexity of the maximum matching problem (see, e.g., [17, 10, 3, 21]). Within the $\tilde{O}(n)$ one-way communication regime, Goel, Kapralov, and Khanna established a protocol for bipartite matching that achieves a tight approximation ratio of $2/3$ in the two-party case [17]. For general graphs, the same tight approximation ratio is achieved by Assadi and Bernstein [3]. In scenarios involving more than two parties ($k>2$), the work of Behnezhad and Khanna [9] implies the existence of protocols with approximation ratios of $0.6$ and $0.53$ for three and four parties, respectively, and $0.5+\Omega_{1/k}(1)$ for any $k>4$. Furthermore, Lee and Singla [21] provide $0.5+2^{-O(k)}$-approximation protocols for the more restricted model where each party has to irrevocably add some of their edges to the matching instead of communicating an arbitrary message of size $\tilde{O}(n)$.

Given the duality between maximum matching and minimum vertex cover (MVC) on bipartite graphs, these results can be used to approximate the size of bipartite MVC. However, simply having an approximate matching does not suffice to construct an approximate MVC. Although it would not be surprising if some of these works also had implications for finding an approximate MVC in bipartite graphs, to the best of our knowledge, there is no explicit study of MVC within the model discussed in this paper, even for bipartite graphs. General graphs, however, which are the main focus of this work, are a completely different story, as the MVC can be up to two times the maximum matching. Therefore, the ideas developed for approximating matching are not of much use here.

Another line of work related to this paper is the study of the stochastic minimum vertex cover problem [8, 13, 14]. In this problem, we are given a graph $G=(V,E)$ and an existence probability for each edge $e \in E$. Edges of $G$ are realized (or exist) independently with these probabilities, forming the realized subgraph $\mathcal{G}$. The existence of an edge in $\mathcal{G}$ can only be verified using edge queries. The goal of this problem is to find a near-optimal vertex cover of $\mathcal{G}$ using a small number of queries. Most related to our work is the $1.5$-approximation algorithm designed by Derakhshan et al. [13]. Their algorithm first commits a subset of vertices $S$ to the final solution, then queries any edge in $G$ which is not covered by $S$, and adds an MVC of the realized edges to the solution. This effectively results in a solution that covers all edges of $\mathcal{G}$. The core of their algorithm is their choice of the set $S$.

The stochastic vertex cover algorithm of [13] can be used to attack the special case of the two-party communication model as follows. Let us assume that Alice (the first party), in addition to her own graph, also has a distribution over Bob's graph (the second party). In other words, she has a distribution over the whole graph in which her edges have an existence probability of one. (Unlike the stochastic model, the existence of edges is not independent here.) We observe that in this case, if Alice follows the algorithm of [13] and only communicates the subset $S$ to Bob, they will be able to find a vertex cover whose expected size is at most $1.5$ times the expected size of the optimal solution. Moreover, we show that the assumption regarding the knowledge of the distribution can be lifted via techniques from [2], allowing for a $1.5$-approximation in the two-party communication model. However, for the multi-party case, which is the main focus of this work, while the technical insights from this work are helpful, this approach does not imply any approximation ratio better than 2. To address this challenge, we extend the argument of [13] to bound the cost of a suitably weighted vertex cover in the two-party case and leverage that to further generalize to the $k$-party case, beating the threshold of 2 for the approximation ratio.

1.3 Preliminaries

In this work, we consider the one-way $k$-party communication model, focusing on randomized protocols where each party utilizes private random bits. The primary interest in this model is information-theoretic, with parties assumed to be computationally unbounded. The communication cost is measured by the worst-case length of the messages exchanged between any two parties. For standard definitions and further details, we refer to the textbook by Kushilevitz and Nisan [20].

We have a base graph $G=(V,E)$ which is distributed among $k$ parties in such a way that each party $i$ has the graph $G_i = (V, E_i)$, where $\bigcup_{l=1}^{k} E_l = E$. The objective of the parties is to collectively find a small vertex cover of the graph $G$ while adhering to the following communication protocol: Starting from the first party, each party $1 \le i < k$, in turn, sends a message $M_i$ to party $i+1$, and ultimately, the last party outputs a valid vertex cover of $G$.

Notation.

We define $MVC(\cdot)$ as a function that, for any given graph, returns its minimum vertex cover. The size of the minimum vertex cover of an input graph is denoted by $\tau(\cdot)$, and $OPT$ represents the minimum vertex cover of the input graph $G$, so that $|OPT| = \tau(G)$ is the optimal solution size.
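Since the parties are computationally unbounded, $MVC(\cdot)$ and $\tau(\cdot)$ may be taken to be exact. The following brute-force reference sketch is ours (exponential time, intended only for small illustrative examples); later sketches in this document reuse these two hypothetical helpers.

```python
from itertools import combinations

def mvc(edges, vertices):
    """Smallest vertex cover of the given edge list (brute force over subset sizes)."""
    for size in range(len(vertices) + 1):
        for subset in combinations(vertices, size):
            s = set(subset)
            if all(u in s or v in s for u, v in edges):
                return s          # the first cover found at this size is minimum
    return set(vertices)

def tau(edges, vertices):
    """Size of the minimum vertex cover."""
    return len(mvc(edges, vertices))

# Example: a triangle needs two vertices.
print(tau([(1, 2), (2, 3), (1, 3)], [1, 2, 3]))  # -> 2
```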

2 An Overview of the Two-Party Protocol

As a warm-up, we consider a two-party version of the problem with Alice and Bob as the first and second parties, respectively. Let $G_A = (V, E_A)$ and $G_B = (V, E_B)$ represent the subgraphs given to Alice and Bob. In this section, we provide an overview of our protocol for two parties with $O(n)$ communication for a constant $\epsilon > 0$ that can be made arbitrarily small. We later prove that this is a $(3/2+\epsilon)$-approximate protocol.

Consider a simplified version of the problem where Alice, in addition to $G_A$, also knows the size of the minimum vertex cover of the whole graph $G = G_A \cup G_B$. We will later discuss how to lift this assumption. For this case, the protocol we consider is simple: Alice communicates a carefully constructed vertex cover $X$ of her subgraph (not necessarily the minimum vertex cover) to Bob. This construction is randomized, and we argue that it is possible to pick $X$ in a way that leads to an expected approximation ratio of $3/2$.

Let us analyze the approximation ratio achieved by a fixed $X$. The final solution picked by Bob will be the union of $X$ and a minimum vertex cover of $G_B[V\setminus X]$. Hence, the approximation ratio will be:

$$\frac{\mathbb{E}\big[|X| + \tau(G_B[V\setminus X])\big]}{\tau(G)} \qquad (3)$$
A Two-Player Game.

We can view the problem faced by Alice as a zero-sum two-player game between her and an adversary, both of whom know $G_A$ and the size of the optimal solution, denoted $\mathsf{opt}$. Alice's strategy is to select a subset $X \subseteq V$ of vertices that covers $G_A$. The adversary's strategy is to select $G_B$ such that $\tau(G_B \cup G_A) = \mathsf{opt}$. We let the adversary's utility be the approximation ratio defined in Equation (3), since that is what Alice wants to minimize. Note that a mixed strategy of Alice in this game (a distribution over vertex covers of $G_A$) is equivalent to a randomized algorithm for picking $X$ in the original problem. Therefore, we are interested in proving that Alice has a mixed strategy for selecting $X$ such that no pure strategy of the adversary obtains utility larger than $3/2$ against it. The Minimax Theorem due to von Neumann [24] implies the following about this game:

Proposition 2.

In the described game, if for any mixed strategy chosen by the adversary, there exists a pure strategy X for Alice such that the adversary’s expected utility is at most 3/2, then there exists a mixed strategy for Alice that ensures the expected utility of the adversary is at most 3/2 for any strategy the adversary employs.

Due to Proposition 2, the problem is reduced to proving that given any distribution $\mathcal{D}_B$ over $G_B$, there exists a vertex cover $X$ of $G_A$ with an expected approximation ratio of at most $3/2$. Given the distribution $\mathcal{D}_B$, let $c_v$ denote the probability of a vertex $v$ being in the optimal solution of $G_A \cup G_B$, assuming $G_B$ is drawn from $\mathcal{D}_B$. That is,

$$c_v := \Pr_{G_B \sim \mathcal{D}_B}\big[v \in MVC(G_A \cup G_B)\big].$$

After calculating these probabilities (which is possible without computational limitations), Alice constructs a vertex-weighted subgraph with the weight of any vertex $v \in V$ being $w_v = 1 - c_v$. She then lets $X$ be the minimum weight vertex cover of $G_A$. These steps are formalized in Algorithm 1.

Algorithm 1 Minimum Weighted Vertex Cover 2-Party Algorithm.
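The body of Algorithm 1 is not reproduced above; the following is a minimal sketch of Alice's side that is consistent with the description in this section. The sketch is ours: it represents $\mathcal{D}_B$ as an explicit list of (probability, edge list) pairs and reuses the brute-force mvc helper sketched in the Preliminaries; names are hypothetical.

```python
from itertools import combinations

def alice_message(alice_edges, vertices, dist_B):
    """dist_B: list of (probability, bob_edge_list) pairs describing D_B."""
    # c_v = probability that v is in MVC(G_A u G_B) when G_B ~ D_B.
    c = {v: 0.0 for v in vertices}
    for p, bob_edges in dist_B:
        opt = mvc(alice_edges + bob_edges, vertices)   # exact MVC (brute-force helper)
        for v in opt:
            c[v] += p
    w = {v: 1.0 - c[v] for v in vertices}              # weights w_v = 1 - c_v

    # X = minimum weight vertex cover of G_A under the weights w (brute force).
    best, best_w = set(vertices), sum(w.values())
    for size in range(len(vertices) + 1):
        for subset in combinations(vertices, size):
            s = set(subset)
            if all(u in s or v in s for u, v in alice_edges):
                total = sum(w[v] for v in s)
                if total < best_w:
                    best, best_w = s, total
    return best   # Alice sends X; Bob then adds MVC(G_B[V \ X])
```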

We now provide some intuition behind Algorithm 1. Let $W(X)$ be the sum of the weights of the vertices in $X$. We claim that $W(X)$ captures the expected difference between the size of the solution output by the algorithm and $\mathsf{opt}$. For any vertex in $X$, its weight is essentially the difference between its contribution to $X$ and its contribution to the optimal solution. In other words, this weight is the cost of including that vertex in $X$.

We observe that since Bob calculates the exact minimum vertex cover (MVC) for the edges that $X$ does not cover, the total weight $W(X)$ acts as an upper bound on the expected difference between the solution found by the algorithm and the optimal solution. This is exactly what we want to minimize, which is why Alice sets the weights as $w_v = 1 - c_v$ and takes a minimum weight vertex cover of $G_A$.

Following this observation, to prove the desired approximation ratio, we need to show the upper bound

$$W(X) \le \mathbb{E}_{G_B \sim \mathcal{D}_B}[\tau(G)]/2. \qquad (4)$$

Proving this turns out to be a technically challenging problem, which we tackle in Section 3.3 in more generality (Inequality (4) follows from Lemma 8 with parameter $\beta = 1$). Our proof uses the probabilistic method and shows that the expected weight of a particular randomized vertex cover of $G_A$ is at most half the optimal vertex cover cost of $G$. To construct this randomized vertex cover, we follow a similar approach to the stochastic vertex cover algorithm of [13]. In particular, we organize the vertices into three groups based on the $c_v$ values and a suitable threshold $t \ge 1/2$: $c_v \in (0, 1-t]$, $c_v \in (1-t, t]$, and $c_v \in (t, 1]$. We include all vertices from the third group. The vertices in the first group can be omitted from the vertex cover, as all their edges in $G_A$ go to vertices in the third group. Finally, we include those vertices from the middle group that are selected in a minimum vertex cover of a graph $G_A \cup G_B$, where $G_B$ is drawn randomly from the distribution $\mathcal{D}_B$.
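A small sketch (ours) of this three-group construction: the dictionary c maps each vertex to $c_v$, sample_cover is a minimum vertex cover of one graph $G_A \cup G_B$ drawn from $\mathcal{D}_B$, and the threshold rule follows our reading of the proof of Lemma 8 (found via a discretized search here); all names are hypothetical.

```python
def randomized_cover(c, sample_cover, step=1e-3):
    """Three-group cover: drop low-c vertices, always keep high-c vertices,
    and take middle-group vertices only where the sampled optimal cover does."""
    # Smallest t in [0.5, 1] balancing the two tails (cf. the proof of Lemma 8).
    t = 0.5
    while t <= 1.0 and sum(cv for cv in c.values() if cv > t) > \
                       sum(cv for cv in c.values() if cv < 1.0 - t):
        t += step
    high = {v for v, cv in c.items() if cv > t}                # third group: always include
    mid = {v for v, cv in c.items() if 1.0 - t <= cv <= t}     # middle group: include if sampled
    return high | (mid & sample_cover)
```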

We extend the argument of [13] to analyze the weight of the above randomized vertex cover, which we then leverage both to establish Inequality (4) and, crucially, to generalize to the case of $k$ parties.

Using Inequality (4), we then prove for the two-party case that

$$\frac{|X| + \mathbb{E}_{G_B \sim \mathcal{D}_B}\big[\tau(G_B[V\setminus X])\big]}{\mathbb{E}_{G_B \sim \mathcal{D}_B}[\tau(G)]} \le 3/2. \qquad (5)$$

Here, $|X| + \mathbb{E}_{G_B \sim \mathcal{D}_B}[\tau(G_B[V\setminus X])]$ is the expected size of the output, since Bob returns the union of $X$ and a minimum vertex cover of the remaining graph. Observe that by the definition of the weights, for any vertex we have $c_v + w_v = 1$. As a result, we can write

$$|X| + \mathbb{E}\big[\tau(G_B[V\setminus X])\big] \le \sum_{v\in X}(c_v + w_v) + \sum_{v\notin X} c_v = \mathbb{E}[\tau(G)] + \sum_{v\in X} w_v \overset{(4)}{\le} 3\,\mathbb{E}[\tau(G)]/2.$$

This, however, is not an instance-wise approximation ratio. By invoking Yao's minimax principle and using the assumption that the size of the optimal solution is fixed, we are able to ensure that there is a randomized protocol that achieves an instance-wise approximation ratio of $3/2$ under the fixed-size assumption.

Lifting the Knowledge of Size.

So far, we have discussed our protocol assuming that Alice knows the size of the optimal solution. To lift this assumption, we use a standard approach which is also used by Assadi and Behnezhad [2]. Given a constant $\epsilon \in (0,1)$, we create $\lceil \log_{1+\epsilon} 2 \rceil$ instances of the problem. In the $i$-th instance, Alice guesses the size of the optimal solution to be in the range

$$\big((1+\epsilon)^{i-1}\,\tau(G_A),\ (1+\epsilon)^{i}\,\tau(G_A)\big).$$

For each of her guesses, she communicates a different vertex cover of $G_A$. It is easy to show that if one of these guesses is correct, the subset Alice communicates for that guess allows Bob to find a $(3/2+\epsilon)$-approximate solution. These guesses together cover the case of $\tau(G) < 2\tau(G_A)$. To address the case of $\tau(G) \ge 2\tau(G_A)$, Alice also communicates an MVC of her graph. In this scenario, the size of the output would be at most

$$\tau(G) + \tau(G_A) \le 3\,\tau(G)/2.$$

Putting the pieces together, Alice can guarantee an expected approximation ratio of $3/2 + \epsilon$ by communicating a message of $O(n\,\lceil\log_{1+\epsilon} 2\rceil)$ bits.
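For concreteness, a small sketch (ours, with hypothetical names) of the guessing step described above: tau_GA stands for $\tau(G_A)$, and the returned intervals are the guesses for $\tau(G)$; the case $\tau(G) \ge 2\tau(G_A)$ is handled separately by also sending an MVC of $G_A$.

```python
import math

def size_guesses(tau_GA, eps):
    """Intervals Alice considers for tau(G); covers tau(G) < 2 * tau(G_A)."""
    num = math.ceil(math.log(2) / math.log(1 + eps))   # ceil(log_{1+eps} 2) guesses
    return [((1 + eps) ** (i - 1) * tau_GA, (1 + eps) ** i * tau_GA)
            for i in range(1, num + 1)]

# Example: eps = 0.5 gives ceil(log_{1.5} 2) = 2 interval guesses.
print(size_guesses(10, 0.5))   # [(10.0, 15.0), (15.0, 22.5)]
```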

3 The 𝒌-Party Protocol

In this section, we present the general k-party protocol, which builds on the sketch of the 2-party protocol presented in Section 2. Our protocol simulates a series of two-party protocols between any party i and a second party encapsulating all the remaining parties i+1 to k.

However, it is important to note that a naive approach of composing arbitrary $3/2$-approximate two-party protocols may not yield an effective $k$-party protocol. For instance, consider the graph presented in Figure 1, consisting of sets $A$, $B$, $C$, and $D$, each containing $n/4$ vertices. These sets are connected by two bipartite matchings, $M_1$ between $A$ and $B$ and $M_2$ between $C$ and $D$, and a complete graph $K$ over $B \cup C$. For this graph, the optimal vertex cover is $B \cup C$, of size $n/2$. Suppose the first party receives $M_1$, the second party receives $M_2$, and the third party receives $K$. A $3/2$-approximate two-party protocol may select the vertices of $M_1$ for the first party and the vertices of $M_2$ for the second party, leading to an overall solution of size $n$, which would be 2-approximate. We address this challenge by carefully selecting vertices, prioritized by the $c_v$ values, and setting weights so that the optimal solution for the remaining graph becomes progressively smaller as the protocol proceeds through the parties.

Figure 1: The blue matching $M_1$ is given to the first party, the red matching $M_2$ to the second party, and a complete graph (non-bipartite) among the vertices in the box to the third party.

We now give an overview of our $k$-party protocol. Following the framework of the two-party protocol, the first party makes several guesses about the size of $\mathsf{opt}$, with the number of guesses depending only on $k$ and a parameter $\epsilon > 0$, which can be made arbitrarily small. For each guess (which is an interval), the first party constructs a vertex cover for her graph and communicates all these vertex covers to the next party. Figure 2 depicts our protocol.

For any $1 < i < k$, the $i$-th party receives a message $M$, which is a set of sets of vertices. Each element $S \in M$ is a vertex cover of $\bigcup_{1 \le l < i} G_l$. For each $S \in M$, party $i$ constructs a new instance of the problem. She assumes $S$ will be in the final vertex cover and lets $G'_i = G_i[V\setminus S]$ be the subgraph of $G_i$ not covered by $S$. At this point, party $i$ faces a problem similar to that of the first party in a $(k-i+1)$-party setting, with the assigned subgraph being $G'_i$. Therefore, she starts by making a number of guesses about $\tau(G[V\setminus S])$. We use $b_i$ to refer to the number of guesses party $i$ needs to make and later discuss its exact value. For each guess, she picks a vertex cover of $G'_i$ and takes its union with $S$ as a vertex cover of $\bigcup_{1 \le j \le i} G_j$. Repeating this for all elements of $M$ results in constructing

$$|M| \times \big[\text{number of guesses on } \tau(G[V\setminus S]) \text{ for any } S \in M\big]$$

vertex covers of $\bigcup_{1 \le j \le i} G_j$. She then communicates all of these to the next party. Since the number of guesses is only a function of $k$ and $\epsilon$, the size of the message is $O_{k,\epsilon}(n)$.

Finally, the last party, after receiving a message M, finds the smallest vertex cover of her graph that includes at least one subset of vertices SM. This ensures that the final output is a valid vertex cover for the entire graph.
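A sketch (ours, with hypothetical names) of this last step, reusing the brute-force mvc helper from the Preliminaries: for each received cover $S$, the last party extends it with an exact MVC of her still-uncovered edges and outputs the smallest result.

```python
def last_party_output(incoming, G_k_edges, vertices):
    """incoming: list of vertex sets, each covering the edges of parties 1..k-1."""
    best = set(vertices)
    for S in incoming:
        uncovered = [(u, v) for (u, v) in G_k_edges if u not in S and v not in S]
        rest = [v for v in vertices if v not in S]
        candidate = set(S) | mvc(uncovered, rest)   # valid cover of the whole graph
        if len(candidate) < len(best):
            best = candidate
    return best
```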

Two points remain to be addressed about the protocol. Consider party $i$ and the message $M$ she receives. First, for any $S \in M$, we need to describe how the party forms guesses about the size of the minimum vertex cover of $G[V\setminus S]$. Party $i$ makes $b_i = \lceil \log_{1+\epsilon} 2^{k-i} \rceil + 1$ guesses about $\tau(G[V\setminus S])$. For any $1 \le l < b_i$, the $l$-th guess is:

$$(1+\epsilon)^{l-1}\,\tau(G'_i) \le \tau(G[V\setminus S]) < (1+\epsilon)^{l}\,\tau(G'_i).$$

There is an additional guess $\tau(G[V\setminus S]) \ge (1+\epsilon)^{b_i - 1}\,\tau(G'_i)$ to cover the remaining possibilities.

Figure 2: The tree of communications generated by the parties using Algorithm 2. The message $S$ sent through each edge to the $i$-th layer is a vertex cover of the subgraphs given to the first $i-1$ parties. For any such message the $i$-th party receives, she constructs $b_i$ different vertex covers of $G'_i = G_i[V\setminus S]$ and sends their union with $S$ to the next party. Each edge basically represents a subproblem party $i$ needs to solve. Given $S$, she makes $b_i$ guesses about $\tau(G[V\setminus S])$, and for each of these guesses she needs to solve the following subproblem: find a vertex cover $X$ of $G'_i$ such that committing the subset $S \cup X$ to the final solution results in a good approximation ratio, conditioned on the specific guess about $\tau(G[V\setminus S])$.

The second point to address is the following: given a guess on the optimal vertex cover, how should we construct a vertex cover of $G'_i = G_i[V\setminus S]$ in order to obtain the desired approximation ratio? We discuss this in detail in Section 3.1 via the formulation of a two-player game that captures the decision process for party $i$. We then formalize the protocol in Section 3.2 and establish its communication complexity and approximation ratio.

3.1 A Two-Player Game

In this section, we discuss the sub-problems each party needs to solve in the design of our protocol. In Figure 2, the construction of a message sent through any edge represents one of these sub-problems. Consider any subset of vertices $S$ that party $i$ receives from the previous party (for the first party, this is the empty set), and let $V' = V \setminus S$. Assuming that $S$ will be part of the final solution, and given a guess the party has about the size of the optimal solution on the remaining graph $G' = G[V']$, she needs to send a message to the next party. The message will be the union of $S$ and a vertex cover $X$ of her remaining subgraph $G'_i = G_i[V']$.

To simplify the problem, we can ignore $S$, since it is part of both the incoming message and the outgoing message. Therefore, in the simplified version, party $i$ needs to solve the following problem. The input consists of a subgraph $G'_i = (V', E'_i)$ and a guess about the size of the remaining optimal solution in the form of $O \le \tau(G') \le (1+\epsilon)O$, where $O$ is an integer. The goal of the party is to find a vertex cover $X$ of $G'_i$ that minimizes the size of the final solution when the subset $X$ is committed to the final solution, knowing that the rest of the parties will implement a protocol with an expected approximation ratio of at most $2 - 2^{i-k+1} + 5\epsilon$ on the remaining subgraph $G'[V'\setminus X]$.

We can view these sub-problems as two-party problems between party $i$, which we will refer to as Alice, and a second party, Bob, who encapsulates parties $i+1$ to $k$ of the original problem. In this problem, Bob, instead of being able to find an exact MVC of the remaining graph, can only find a solution with an approximation ratio of at most $2 - 2^{i-k+1} + 5\epsilon$. In this two-party problem, we use $G_A = (V', E_A)$ to refer to $G'_i$ and $G_B = (V', E_B)$ to refer to the union of the subgraphs given to parties $i+1$ to $k$ in the original problem. Hence, Alice's goal here is to pick a randomized $X$ such that it results in an expected approximation ratio of at most $2 - 2^{i-k} + 5\epsilon$ for any a priori fixed $G_B$.

We reformulate this two-party problem as a two-player game. The first player is Alice, and the second player is an adversary who will determine GB.

Definition 3 (MVC Game).

Given three parameters $\beta \in (0,1)$, $\epsilon \in (0,1)$, and $O \in (0,n)$, we define a two-player zero-sum game between Alice and an adversary on a graph with vertex set $V$. Alice has a set of edges $E_A$ between the vertices of $V$, which is known to both players. Alice's strategy is to select a subset $X \subseteq V$ of vertices that covers all edges in $E_A$. The adversary's strategy is to select a set of edges $E_B$ over $V$ satisfying

$$O \le |MVC(E_B \cup E_A)| \le (1+\epsilon)O.$$

The adversary’s utility is defined as:

$$U_B(X, E_B) = \frac{|X| + (2-\beta) \times |MVC(E_B[V\setminus X])|}{|MVC(E_B \cup E_A)|}. \qquad (6)$$

The notation $E_B[V\setminus X]$ represents the edges in $E_B$ that are induced by the vertices in the set $V\setminus X$, meaning it includes only those edges in $E_B$ whose both endpoints are in $V\setminus X$.
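For illustration, a direct transcription (ours) of the utility in Equation (6), reusing the brute-force tau helper from the Preliminaries; all names are hypothetical.

```python
def adversary_utility(X, E_A, E_B, vertices, beta):
    """Evaluates U_B(X, E_B) from Definition 3 for explicit edge lists."""
    induced = [(u, v) for (u, v) in E_B if u not in X and v not in X]   # E_B[V \ X]
    rest = [v for v in vertices if v not in X]
    return (len(X) + (2 - beta) * tau(induced, rest)) / tau(E_A + E_B, vertices)
```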

We begin by proving that, for any given distribution over the adversary's strategies, there exists a deterministic strategy for Alice that guarantees the adversary's expected utility is at most $(2-\beta/2)(1+\epsilon)$.

Lemma 4.

In the MVC game presented in Definition 3, given any randomized (mixed) strategy $E_B$ of the adversary, Alice has a deterministic (pure) strategy $X$ such that

$$\mathbb{E}[U_B(X, E_B)] \le (2-\beta/2)(1+\epsilon).$$
Proof.

For any vertex $v$, let us define $c_v = \Pr[v \in MVC(E_B \cup E_A)]$. Alice chooses a minimum weight vertex cover $X$ of $E_A$, where the weight of a vertex is defined as $w_v = 1 - (2-\beta)c_v$. We will prove that this choice of $X$ satisfies the statement of the lemma. We have

$$\begin{aligned}
\mathbb{E}\big[|X| + (2-\beta)\,|MVC(E_B[V\setminus X])|\big] &\le \Big(\sum_{v\in X} 1\Big) + (2-\beta)\sum_{v\in V\setminus X} c_v\\
&= \Big(\sum_{v\in X} \big(1-(2-\beta)c_v+(2-\beta)c_v\big)\Big) + (2-\beta)\sum_{v\in V\setminus X} c_v\\
&= \Big(\sum_{v\in X} \big(1-(2-\beta)c_v\big)\Big) + (2-\beta)\sum_{v\in V} c_v \qquad (7)
\end{aligned}$$

In the first inequality, we use the fact that

$$\mathbb{E}\big[|MVC(E_B[V\setminus X])|\big] \le \sum_{v\in V\setminus X} c_v. \qquad (8)$$

This is due to the definition of the $c_v$'s, which implies that there is a (random) vertex cover of $E_B \cup E_A$ containing any vertex $v \in V\setminus X$ with probability $c_v$. Since edges in $E_B[V\setminus X]$ can only be covered by vertices in $V\setminus X$, this implies that there is also a vertex cover of $E_B[V\setminus X]$ which contains each vertex $v \in V\setminus X$ with probability $c_v$; hence, by linearity of expectation, we get (8).

A technical part of our proof is to show that $\sum_{v\in X}\big(1-(2-\beta)c_v\big) \le \frac{\beta}{2}\sum_{v\in V} c_v$, which we defer to Lemma 8 in Section 3.3. Combining this with (7) gives us

$$\begin{aligned}
\mathbb{E}\big[|X| + (2-\beta)\,|MVC(E_B[V\setminus X])|\big] &\le \sum_{v\in X}\big(1-(2-\beta)c_v\big) + (2-\beta)\sum_{v\in V} c_v\\
&\le \frac{\beta}{2}\sum_{v\in V} c_v + (2-\beta)\sum_{v\in V} c_v\\
&= (2-\beta/2)\sum_{v\in V} c_v\\
&= (2-\beta/2)\,\mathbb{E}\big[|MVC(E_A\cup E_B)|\big] \le (2-\beta/2)(1+\epsilon)O.
\end{aligned}$$

Hence, we get

$$\mathbb{E}\left[\frac{|X| + (2-\beta)\,|MVC(E_B[V\setminus X])|}{|MVC(E_A\cup E_B)|}\right] \le \frac{\mathbb{E}\big[|X| + (2-\beta)\,|MVC(E_B[V\setminus X])|\big]}{O} \le \frac{(2-\beta/2)(1+\epsilon)O}{O} = (2-\beta/2)(1+\epsilon),$$

completing the proof of the lemma.

Combining Lemma 4 with von Neumann's Minimax Theorem yields the existence of a randomized strategy for Alice that guarantees the adversary's expected utility is at most $(2-\beta/2)(1+\epsilon)$.

Lemma 5.

In the MVC game presented in Definition 3, Alice has a randomized strategy $X$ such that no strategy $E_B$ played by the adversary obtains expected utility larger than $(2-\beta/2)(1+\epsilon)$.

Proof.

Given any parameter $\alpha > 0$, von Neumann's Minimax Theorem [24] (alternatively, Yao's minimax principle [25]) implies that in the described game, if for any mixed strategy chosen by the adversary there exists a pure strategy $X$ for Alice such that the adversary's expected utility is at most $\alpha$, then there exists a mixed strategy for Alice that ensures the expected utility of the adversary is at most $\alpha$ for any strategy the adversary employs. By Lemma 4, for any mixed strategy of the adversary, there exists a pure strategy for Alice that guarantees an expected utility of at most $(2-\beta/2)(1+\epsilon)$ for the adversary. This implies the existence of a mixed strategy for Alice that guarantees the same bound against an arbitrary strategy of the adversary.

3.2 The Protocol

In this section, we provide a formal statement of our protocol in Algorithm 2. We then establish in Lemma 6 that the total communication complexity is O(n) for any constant k. Finally, we prove in Lemma 7 that it results in our desired approximation ratio.

Algorithm 2 k-Party Algorithm for party i.

As mentioned before, parts of our protocol are non-explicit. In particular, we only show the existence of suitable subsets $X_1, \ldots, X_{b_i}$. In our protocol, each party $i$ communicates a set of suitable vertex covers of the graph given to the first $i$ parties. We do not present an efficient procedure for computing these covers, which makes our protocol non-explicit.
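Since the body of Algorithm 2 is not reproduced above, the following sketch (ours, with hypothetical names) shows only the message-passing structure for an intermediate party. The function choose_cover stands in for the existential selection of each $X_l$ via Lemma 5 and is deliberately left abstract.

```python
def party_i_message(incoming, G_i_edges, vertices, b_i, choose_cover):
    """incoming: list of vertex sets, each covering the edges of parties 1..i-1.
    choose_cover(edges, remaining_vertices, guess_index): a vertex cover of `edges`
    picked for one guess about the remaining optimum (existential; see Lemma 5)."""
    outgoing = []
    for S in incoming:                                    # one subproblem per received cover
        remaining = [v for v in vertices if v not in S]
        uncovered = [(u, v) for (u, v) in G_i_edges
                     if u not in S and v not in S]        # edges of G'_i = G_i[V \ S]
        for guess in range(b_i):                          # one cover per size guess
            X = choose_cover(uncovered, remaining, guess)
            outgoing.append(set(S) | set(X))              # S u X covers parties 1..i
    return outgoing                                       # |outgoing| = |incoming| * b_i
```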

Lemma 6.

Given a graph $G=(V,E)$ and $k$ parties, if all the parties use Algorithm 2 to communicate and find the vertex cover of $G$, the communication cost will be $(k-1)!\,\big(\lceil\log_{1+\epsilon} 2\rceil\big)^{k-1}\cdot O(n)$.

Proof.

Since each party sends $b_i$ new subsets for each subset they receive, the number of subsets communicated between the last two parties is

$$b_1 b_2 \cdots b_{k-1} = \big\lceil\log_{1+\epsilon} 2^{k-1}\big\rceil \cdot \big\lceil\log_{1+\epsilon} 2^{k-2}\big\rceil \cdots \big\lceil\log_{1+\epsilon} 2^{1}\big\rceil \le (k-1)!\,\big(\lceil\log_{1+\epsilon} 2\rceil\big)^{k-1}.$$

Moreover, since each subset of vertices can be represented using an $n$-bit string, the communication cost is $(k-1)!\,\big(\lceil\log_{1+\epsilon} 2\rceil\big)^{k-1}\cdot O(n) = O_{\epsilon,k}(n)$.
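A quick numeric illustration (ours) of this bound, with hypothetical parameter values:

```python
import math

def num_forwarded_covers(k, eps):
    """Upper bound from Lemma 6 on the number of covers sent to the last party."""
    per_level = math.ceil(math.log(2) / math.log(1 + eps))   # ceil(log_{1+eps} 2)
    return math.factorial(k - 1) * per_level ** (k - 1)

# For k = 3 and eps = 0.1: 2! * ceil(log_{1.1} 2)^2 = 2 * 8^2 = 128 covers,
# i.e. the last party receives roughly 128n bits in total.
print(num_forwarded_covers(3, 0.1))
```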

We next establish an upper bound on the approximation ratio achieved by the protocol.

Lemma 7.

In the $k$-party minimum vertex cover problem, let $\pi_k(G)$ be the size of the output if all the parties use Algorithm 2 on the base graph $G$. Then

$$\mathbb{E}[\pi_k(G)] \le \Big(2 - \frac{1}{2^{k-1}} + 5\epsilon\Big)\,\tau(G).$$
Proof.

We will use induction on the number of parties. If $k=1$, then the algorithm outputs the minimum-size set in $M_1$. Since $M_0 = \{\emptyset\}$, the set added to $M_1$ on line 11 is a minimum vertex cover of $G' = G[V\setminus\emptyset] = G$. Therefore, the output has size $\tau(G)$, which gives us

$$\mathbb{E}[\pi_1(G)] \le \tau(G) \le \Big(2 - \frac{1}{2^{1-1}} + 5\epsilon\Big)\tau(G) = (1+5\epsilon)\,\tau(G),$$

proving the base case $k=1$. For the inductive step, assume that the lemma statement holds for $k-1$ parties and any base graph $G'$. That is,

$$\mathbb{E}[\pi_{k-1}(G')] \le \Big(2-\frac{1}{2^{k-2}}+5\epsilon\Big)\,\tau(G'). \qquad (9)$$

In line 6 of Algorithm 2, let $X_1, X_2, \ldots, X_{b_1}$ be the subsets generated by the first party using the strategy of Lemma 5 with the $O$ and $\beta$ of line 9, where $b_1 = \lceil\log_{1+\epsilon} 2^{k-1}\rceil$. Let $j$ be the integer satisfying $(1+\epsilon)^{j-1}\,\tau(G_1) \le \tau(G) \le (1+\epsilon)^{j}\,\tau(G_1)$.

Case 1: $j \le b_1$.

Party 2, upon receiving $X_j$, constructs the graph $G' = G[V\setminus X_j]$, since $X_j$ is committed to the final solution and we do not need to consider the edges covered by its vertices. We find a vertex cover of $G'$, where the partitions are $G'_l = G_l[V\setminus X_j]$ for $2 \le l \le k$. Since the lemma statement holds for $k-1$ parties, applying Algorithm 2 to the graph $G'$ gives us a vertex cover of $G'$ with an expected approximation ratio of

$$\Big(2 - \frac{1}{2^{(k-1)-1}} + 5\epsilon\Big).$$

Since this vertex cover of $G'$ covers all the edges in $G[V\setminus X_j]$, its union with $X_j$ is a vertex cover of $G$.

As a result, the final solution has size $\pi_k(G) = |X_j| + \pi_{k-1}(G')$, which yields the following bound on the approximation ratio:

$$\frac{\mathbb{E}[\pi_k(G)]}{\tau(G)} = \frac{\mathbb{E}[|X_j|] + \mathbb{E}[\pi_{k-1}(G')]}{\tau(E_A\cup E_B)} \le \frac{\mathbb{E}\Big[|X_j| + \big(2-\frac{1}{2^{k-2}}+5\epsilon\big)\,\tau(G')\Big]}{\tau(E_A\cup E_B)}. \qquad (10)$$

Now, we construct a version of the MVC game in which Alice is the first party and the adversary is the union of the next $k-1$ parties. Let $\beta = \frac{1}{2^{k-2}} - 5\epsilon$, $O = (1+\epsilon)^{j-1}\,\tau(G_1)$, and $\epsilon \le 1/5$. As described in the game definition, Alice sends a vertex cover to Bob and aims to minimize the utility function (11). The adversary (the next $k-1$ parties) aims to choose the edges $E_B$ so as to maximize his utility. Let $G_B = (V, E_B = \bigcup_{l=2}^{k} E_l)$. The adversary's strategy is to select a set of edges $E_B$ over $V$ such that $O \le \tau(E_B \cup E_A) \le (1+\epsilon)O$.

$$\mathbb{E}[U_B(X_j, E_B)] = \frac{\mathbb{E}\big[|X_j| + (2-\beta)\,\tau(E_B[V\setminus X_j])\big]}{\tau(E_B\cup E_A)} = \frac{\mathbb{E}\Big[|X_j| + \big(2-\frac{1}{2^{k-2}}+5\epsilon\big)\,\tau(G')\Big]}{\tau(E_B\cup E_A)} \qquad (11)$$

Now, using Lemma 5, Alice has a strategy such that no strategy played by the adversary obtains expected utility larger than $(2-\beta/2)(1+\epsilon)$:

$$\frac{\mathbb{E}[\pi_k(G)]}{\tau(G)} \le \mathbb{E}[U_B(X_j, E_B)] \le (2-\beta/2)(1+\epsilon). \qquad (12)$$

By substituting $\beta$ into Equation (12), we get

$$\begin{aligned}
\mathbb{E}[U_B(X_j, E_B)] &\le (2-\beta/2)(1+\epsilon) = \Big(2-\frac{1}{2^{k-1}}+\frac{5}{2}\epsilon\Big)(1+\epsilon)\\
&= 2-\frac{1}{2^{k-1}}+\frac{5}{2}\epsilon+2\epsilon-\frac{1}{2^{k-1}}\epsilon+\frac{5}{2}\epsilon^2 = 2-\frac{1}{2^{k-1}}+\epsilon\Big[\frac{9}{2}-\frac{1}{2^{k-1}}+\frac{5}{2}\epsilon\Big].
\end{aligned}$$

Since $\epsilon \le 1/5$, we have $\frac{9}{2}-\frac{1}{2^{k-1}}+\frac{5}{2}\epsilon \le 5$. The following completes our inductive step for Case 1:

$$\frac{\mathbb{E}[\pi_k(G)]}{\tau(G)} \le (2-\beta/2)(1+\epsilon) \le 2-\frac{1}{2^{k-1}}+5\epsilon.$$
Case 2: $j > b_1$.

In this case, $2^{k-1}\,\tau(G_1) < \tau(G)$. Since $MVC(G_1)$ is a subset in $M_1$, party 2 constructs $G' = G[V\setminus MVC(G_1)]$. Since $\tau(G') \le \tau(G)$, by Equation (9) we get

$$\mathbb{E}[\pi_{k-1}(G')] \le \Big(2-\frac{1}{2^{k-2}}+5\epsilon\Big)\,\tau(G') \le \Big(2-\frac{1}{2^{k-2}}+5\epsilon\Big)\,\tau(G).$$

As in the previous case, the size of the output is at most the sum of the size of the minimum vertex cover of $G_1$ and the algorithm's output on the rest of the partitions:

$$\mathbb{E}[\pi_k(G)] \le \tau(G_1) + \mathbb{E}[\pi_{k-1}(G')] \le \frac{1}{2^{k-1}}\,\tau(G) + \Big(2-\frac{1}{2^{k-2}}+5\epsilon\Big)\,\tau(G) \le \Big(2-\frac{1}{2^{k-1}}+5\epsilon\Big)\,\tau(G).$$

This completes the proof for Case 2 and, with it, the proof of the lemma.

The upper bound of Theorem 1 follows from Lemmas 6 and 7.

3.3 A Technical Lemma

In this section, we establish the following lemma, which is used in the proof of Lemma 4.

Lemma 8.

$$\sum_{v\in X}\big(1-(2-\beta)c_v\big) \le \frac{\beta}{2}\sum_{v\in V} c_v.$$

It helps to first establish the following claim that may be of independent interest. We defer the proof to the full version of the paper.

Lemma 9.

Let $w: (0,1] \to \mathbb{R}_{\ge 0}$ be a function with finite support (we define the support of $w$ to be the set of elements $x \in (0,1]$ for which $w(x) > 0$). Then, we have

$$\Big(\forall\, x \in [1/2, 1]: \ \sum_{y\in[x,1]} w(y)\,y \ \ge \ \sum_{y\in(0,1-x]} w(y)\,y\Big) \;\Longrightarrow\; \sum_{y\in(0,1]} w(y)\,y\,(1-2y) \le 0.$$
Proof of Lemma 8.

Consider the vertex cover $I := \{v: c_v > t\} \cup \big\{v \in MVC(G_i \sim \mathcal{G}): c_v \in [1-t, t]\big\}$, where $MVC(G_i \sim \mathcal{G})$ is the minimum vertex cover of a graph $G_i$ drawn randomly according to the distribution $\mathcal{G}$, and $t$ is the smallest $t \in [0.5, 1]$ such that $\sum_{v: c_v > t} c_v \le \sum_{v: c_v < 1-t} c_v$. Since $X$ is a minimum weighted vertex cover of $G_A$, we have that

$$\sum_{v\in X} w_v \le \mathbb{E}_{G_i\sim\mathcal{G}}\Big[\sum_{v\in I} w_v\Big].$$

Let $S_1 = \{v: c_v < 1-t\}$, $S_2 = \{v: 1-t \le c_v \le t\}$, and $S_3 = \{v: c_v > t\}$. Then,

$$\mathbb{E}\Big[\sum_{v\in(S_1\cup S_3)\cap I} w_v\Big] = \sum_{v\in S_3} w_v = \sum_{v\in S_3}\big(1-2c_v+\beta c_v\big) \le \sum_{v\in S_3}\beta c_v \le \frac{\beta}{2}\Big(\sum_{v\in S_3} c_v+\sum_{v\in S_1} c_v\Big) = \frac{\beta}{2}\sum_{v\in S_1\cup S_3} c_v,$$

where we used $c_v \ge 1/2$ for $v \in S_3$, and $\sum_{v\in S_3} c_v \le \sum_{v\in S_1} c_v$, which holds by the definition of $t$.

We now consider the set S2. We derive

$$\begin{aligned}
\mathbb{E}\Big[\sum_{v\in S_2\cap I} w_v\Big] &= \sum_{v\in S_2} c_v\big(1-(2-\beta)c_v\big) = \sum_{v\in S_2}\big(c_v(1-2c_v)+\beta c_v^2\big)\\
&= \sum_{v\in S_2}\Big(c_v(1-2c_v) - \frac{\beta}{2}\,c_v(1-2c_v) + \frac{\beta}{2}\,c_v\Big)\\
&= \Big(1-\frac{\beta}{2}\Big)\sum_{v\in S_2} c_v(1-2c_v) + \frac{\beta}{2}\sum_{v\in S_2} c_v \ \le\ \frac{\beta}{2}\sum_{v\in S_2} c_v,
\end{aligned}$$

where the last inequality follows from Lemma 9 (with $w(x)$ set to the number of vertices $v \in S_2$ for which $c_v = x$, and $0$ if no vertex in $S_2$ has $c_v = x$).

This yields $\mathbb{E}\big[\sum_{v\in I} w_v\big] \le \frac{\beta}{2}\sum_{v} c_v$, implying that $\sum_{v\in X} w_v \le \frac{\beta}{2}\sum_{v} c_v$. ∎

4 A Lower Bound for the Two-Party Case

We show the following lower bound, which establishes the tightness of our protocol for the two-party case.

Lemma 10.

For any constant $\varepsilon > 0$, there exists a constant $c>0$ such that any vertex cover computed by a two-party protocol with $n^{1+c/\lg\lg n}$ communication complexity has an approximation ratio of at least $3/2 - \varepsilon$.

Proof.

Let $G$ be a Ruzsa–Szemerédi (bipartite) graph $(P, Q, E)$ formed by $n$ vertices on each side and $k$ induced matchings $M_1, \ldots, M_k$, where $k = n^{\Omega(1/\lg\lg n)}$ and $|M_i| = (1/2-\varepsilon_1)n$ for $1 \le i \le k$, where $\varepsilon_1 > 0$ is a constant that can be made arbitrarily small. Ruzsa–Szemerédi graphs with the preceding parameters have been shown to exist [17]. We generate a random bipartite graph $G' = (P\cup P', Q\cup Q', E_1\cup E_2)$, with a total of $(3+2\varepsilon_1)n$ vertices, as follows:

  1. $P$, $Q$ are as in $G$; $P'$ (resp., $Q'$) is a set of $(1/2+\varepsilon_1)n$ vertices disjoint from $P$ (resp., $Q$).

  2. Select a subset $M'_i$ of $\varepsilon_2 n$ edges uniformly at random from $M_i$, for $1 \le i \le k$, independently, where $\varepsilon_2 > 0$ is a constant that can be set arbitrarily small. Set $E_1 = \bigcup_{1\le i\le k} M'_i$; we thus have $|E_1| = k\varepsilon_2 n$.

  3. Choose $r$ uniformly at random from $\{1, \ldots, k\}$. Let $P_r$ and $Q_r$ denote the vertices of $M_r$ in $P$ and $Q$, respectively. Let $M_P$ and $M_Q$ denote a perfect matching between $P'$ and $P\setminus P_r$ and between $Q'$ and $Q\setminus Q_r$, respectively. Set $E_2 = M_P \cup M_Q$.

In the two-party instance, Alice receives the bipartite graph $(P\cup P', Q\cup Q', E_1)$ and Bob receives the bipartite graph $(P\cup P', Q\cup Q', E_2)$. The minimum vertex cover of $G'$ has size at most $\varepsilon_2 n + 2(1/2+\varepsilon_1)n = (1+\varepsilon_2+2\varepsilon_1)n$; for example, if $\tilde{P}$ denotes the set of vertices of $M'_r$ in $P_r$, then $\tilde{P} \cup (P\setminus P_r) \cup (Q\setminus Q_r)$ is a vertex cover of this size. We will argue below that with probability at least $1-o(1)$, any protocol with $o(nk)$ communication produces a vertex cover of size at least $(3/2+2\varepsilon_1-\varepsilon_3)n$, where $\varepsilon_3 > 0$ is a constant that can be made arbitrarily small by suitably setting the other constants, if necessary.
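A sketch (ours) of how one such random instance is sampled, taking the Ruzsa–Szemerédi matchings as given input (their construction is not reproduced here); vertex names and the helper structure are hypothetical.

```python
import random

def sample_instance(P, Q, matchings, eps2):
    """matchings: list of k induced matchings M_1..M_k, each a list of (p, q) edges.
    Returns (E1, E2): Alice's and Bob's edge sets for one random instance G'."""
    E1 = []
    for M in matchings:                                   # subsample eps2*n edges of each M_i
        E1.extend(random.sample(M, int(eps2 * len(P))))
    r = random.randrange(len(matchings))                  # the hidden index r
    P_r = {p for p, _ in matchings[r]}
    Q_r = {q for _, q in matchings[r]}
    P_rest = [v for v in P if v not in P_r]
    Q_rest = [v for v in Q if v not in Q_r]
    # P', Q' are fresh vertices, perfectly matched to P \ P_r and Q \ Q_r.
    P_prime = [("p'", i) for i in range(len(P_rest))]
    Q_prime = [("q'", i) for i in range(len(Q_rest))]
    E2 = list(zip(P_prime, P_rest)) + list(zip(Q_prime, Q_rest))
    return E1, E2
```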

The proof is via a counting argument. Let $\mathcal{G}_A$ denote the collection of all possible graphs that can be presented to Alice. By our construction above, we have

$$|\mathcal{G}_A| = \binom{(1/2-\varepsilon_1)n}{\varepsilon_2 n}^{k} \ge \left(\frac{(1/2-\varepsilon_1)n}{\varepsilon_2 n}\right)^{\varepsilon_2 n k}.$$

Suppose Alice sends a message of $s$ bits to Bob; let $\phi: \mathcal{G}_A \to \{0,1\}^s$ denote the mapping that Alice uses to map the graph she receives to the $s$-bit message she sends. For any graph $H \in \mathcal{G}_A$, let $\Gamma(H) = \{H': \phi(H') = \phi(H)\}$. Since Bob needs to produce a valid vertex cover under all inputs, Bob needs to ensure that the vertex cover output for any input $H$ to Alice covers every edge in the union of all graphs in $\Gamma(H)$. Since the number of possible outputs of $\phi$ is $2^s$, a simple averaging argument implies that there exists an $H$ such that $|\Gamma(H)| \ge |\mathcal{G}_A|/2^s$.

Lemma 11.

For any subset $\mathcal{H}$ of $\mathcal{G}_A$, let $I \subseteq \{1,\ldots,k\}$ be the set of indices such that the union of all of the edges in the graphs in $\mathcal{H}$ has at least $(1/2-\varepsilon_3)n$ edges from $M_i$, for each $i \in I$, where $\varepsilon_3 > \varepsilon_1$ is an arbitrary positive constant. If $|\mathcal{H}| \ge |\mathcal{G}_A|/2^{s(1+o(1))}$ and $s = o(nk)$, then $|I| = k(1-o(1))$.

Proof.

Fix an i in [k]. Let mi denote the number of edges from Mi contained in the union of all of the edges in the graphs in . Then, if I is the set of indices as defined in the lemma, we have the following upper bound on ||.

|| i(miε2n)
((1/2ε1)nε2n)|I|((1/2ε3)nε2n)k|I|
|𝒢A|(((1/2ε3)nε2n)((1/2ε1)nε2n))k|I|
|𝒢A|(1/2ε3ε21/2ε1ε2)ε2n(k|I|)
|𝒢A|/2cn(k|I|),

for some constant c>0, which depends on ε1,ε2,ε3. Since |||𝒢A|/2s(1+o(1)), it follows that cn(k|I|) is at most s(1+o(1)). Since s=o(nk), we obtain |I|k(1o(1)).

For a uniformly at random chosen graph $H \in \mathcal{G}_A$, with probability at least $1-o(1)$, $|\Gamma(H)| \ge |\mathcal{G}_A|/2^{s(1+o(1))}$; this is because the number of graphs $H$ in $\mathcal{G}_A$ that have $|\Gamma(H)|$ less than $|\mathcal{G}_A|/2^{s(1+o(1))}$ is at most $2^s \cdot |\mathcal{G}_A|/2^{s(1+o(1))}$, which is an $o(1)$ fraction of $\mathcal{G}_A$. By Lemma 11, it follows that for a randomly chosen matching $M_r$, with probability $1-o(1)$, the union of all of the edges in the graphs in $\Gamma(H)$ contains at least $(1/2-\varepsilon_3)n$ edges from $M_r$. Since Bob has to cover all the edges in this union, the vertex cover returned includes at least $n/2 - \varepsilon_3 n$ of the vertices of $M_r$ and at least the $2(n/2+\varepsilon_1 n)$ vertices needed to cover the edges of $E_2$. This implies a vertex cover of size at least $3n/2 + 2\varepsilon_1 n - \varepsilon_3 n$ with probability at least $1-o(1)$.

This completes the proof of the desired claim that any protocol with $o(nk)$ communication (in particular, with $n^{1+c/\lg\lg n}$ communication for a suitable constant $c>0$ determined by the hidden constant in the $\Omega$ term of $k = n^{\Omega(1/\lg\lg n)}$) produces a vertex cover of size at least $(3/2+2\varepsilon_1-\varepsilon_3)n$ with probability $1-o(1)$. Therefore, there exists a constant $c>0$ such that no vertex cover computed by a two-party protocol with $n^{1+c/\lg\lg n}$ communication complexity can have an approximation ratio of any constant smaller than $3/2$. Our lower bound applies to bipartite graphs. Our proof follows the approach of [17], who established a similar lower bound for the maximum matching problem in bipartite graphs. While the size of a maximum matching equals the size of a minimum vertex cover in bipartite graphs, there is no direct reduction between the two problems in our communication model. Indeed, though the specific graph construction we use for the vertex cover lower bound follows the same framework as for the maximum matching lower bound, the parameters and the specific arguments are different. Our parameters for the probability distribution over the bipartite graphs are similar to those used in the lower bound for stochastic vertex cover in [13].

5 Proof of the Main Theorem

In this section, we put all the pieces together and provide a formal proof for our main result using Lemma 7, Lemma 6, and Lemma 10.

Theorem 1 (restated).

For any $k \ge 2$ and any arbitrarily small $\epsilon > 0$, there exists a randomized MVC protocol in the $k$-party one-way communication model with an expected approximation ratio of $(2 - 2^{-k+1} + \epsilon)$, in which each party communicates a message of size $O_{k,\epsilon}(n)$. This approximation ratio is tight for $k=2$ up to a factor of $1+\epsilon$.

Proof.

We prove in Lemma 7 that if all the parties use Algorithm 2, then the resulting protocol has an approximation ratio of $(2 - 2^{1-k} + 5\epsilon)$ for any $\epsilon \in (0, 1/5)$. Moreover, based on Lemma 6, the communication cost of this protocol is $(k-1)!\,(\lceil\log_{1+\epsilon} 2\rceil)^{k-1}\,O(n)$, which is linear in $n$ assuming $k$ and $\epsilon$ are constants. Therefore, the communication cost of our protocol is $O_{k,\epsilon}(n)$. By choosing a sufficiently small $\epsilon' \le \epsilon/5$ and running the protocol with $\epsilon'$, we get an approximation ratio of $(2 - 2^{1-k} + \epsilon)$ and a communication cost of $O_{\epsilon,k}(n)$.

Finally, to prove the tightness of our approximation ratio for the two-party case, we have in Lemma 10 a lower bound of $3/2 - \epsilon$ on the expected approximation ratio of two-party protocols with communication cost $n^{1+c/\lg\lg n}$, and in particular $O(n)$. This implies that our protocol for two parties is tight within an additive error of $\epsilon$.

References

  • [1] Amir Abboud, Keren Censor-Hillel, Seri Khoury, and Ami Paz. Smaller cuts, higher lower bounds. ACM Transactions on Algorithms (TALG), 17(4):1–40, 2021. doi:10.1145/3469834.
  • [2] Sepehr Assadi and Soheil Behnezhad. On the robust communication complexity of bipartite matching. In Mary Wootters and Laura Sanità, editors, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques, APPROX/RANDOM 2021, August 16-18, 2021, University of Washington, Seattle, Washington, USA (Virtual Conference), volume 207 of LIPIcs, pages 48:1–48:17. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2021. doi:10.4230/LIPICS.APPROX/RANDOM.2021.48.
  • [3] Sepehr Assadi and Aaron Bernstein. Towards a unified theory of sparsification for matching problems. In Jeremy T. Fineman and Michael Mitzenmacher, editors, 2nd Symposium on Simplicity in Algorithms, SOSA 2019, January 8-9, 2019, San Diego, CA, USA, volume 69 of OASIcs, pages 11:1–11:20. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2019. doi:10.4230/OASICS.SOSA.2019.11.
  • [4] Sepehr Assadi and Sanjeev Khanna. Randomized composable coresets for matching and vertex cover. In Proceedings of the 29th ACM Symposium on Parallelism in Algorithms and Architectures, pages 3–12, 2017. doi:10.1145/3087556.3087581.
  • [5] Sepehr Assadi and Sanjeev Khanna. Tight bounds on the round complexity of the distributed maximum coverage problem. In Artur Czumaj, editor, Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018, New Orleans, LA, USA, January 7-10, 2018, pages 2412–2431. SIAM, 2018. doi:10.1137/1.9781611975031.155.
  • [6] Sepehr Assadi, Sanjeev Khanna, and Yang Li. Tight bounds for single-pass streaming complexity of the set cover problem. In Proceedings of the forty-eighth annual ACM symposium on Theory of Computing, pages 698–711, 2016. doi:10.1145/2897518.2897576.
  • [7] Amir Azarmehr and Soheil Behnezhad. Robust communication complexity of matching: EDCS achieves 5/6 approximation. In Kousha Etessami, Uriel Feige, and Gabriele Puppis, editors, 50th International Colloquium on Automata, Languages, and Programming, ICALP 2023, July 10-14, 2023, Paderborn, Germany, volume 261 of LIPIcs, pages 14:1–14:15. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2023. doi:10.4230/LIPICS.ICALP.2023.14.
  • [8] Soheil Behnezhad, Avrim Blum, and Mahsa Derakhshan. Stochastic vertex cover with few queries. In Joseph (Seffi) Naor and Niv Buchbinder, editors, Proceedings of the 2022 ACM-SIAM Symposium on Discrete Algorithms, SODA 2022, Virtual Conference / Alexandria, VA, USA, January 9 - 12, 2022, pages 1808–1846. SIAM, 2022. doi:10.1137/1.9781611977073.73.
  • [9] Soheil Behnezhad and Sanjeev Khanna. New trade-offs for fully dynamic matching via hierarchical EDCS. In Joseph (Seffi) Naor and Niv Buchbinder, editors, Proceedings of the 2022 ACM-SIAM Symposium on Discrete Algorithms, SODA 2022, Virtual Conference / Alexandria, VA, USA, January 9 - 12, 2022, pages 3529–3566. SIAM, 2022. doi:10.1137/1.9781611977073.140.
  • [10] Aaron Bernstein. Improved bounds for matching in random-order streams. In Artur Czumaj, Anuj Dawar, and Emanuela Merelli, editors, 47th International Colloquium on Automata, Languages, and Programming, ICALP 2020, July 8-11, 2020, Saarbrücken, Germany (Virtual Conference), volume 168 of LIPIcs, pages 12:1–12:13. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2020. doi:10.4230/LIPICS.ICALP.2020.12.
  • [11] Rajesh Chitnis, Graham Cormode, Hossein Esfandiari, MohammadTaghi Hajiaghayi, Andrew McGregor, Morteza Monemizadeh, and Sofya Vorotnikova. Kernelization via sampling with applications to finding matchings and related problems in dynamic graph streams. In Proceedings of the twenty-seventh annual ACM-SIAM symposium on Discrete algorithms, pages 1326–1344. SIAM, 2016. doi:10.1137/1.9781611974331.CH92.
  • [12] Jacques Dark and Christian Konrad. Optimal lower bounds for matching and vertex cover in dynamic graph streams. arXiv preprint arXiv:2005.11116, 2020. arXiv:2005.11116.
  • [13] Mahsa Derakhshan, Naveen Durvasula, and Nika Haghtalab. Stochastic minimum vertex cover in general graphs: A 3/2-approximation. In Barna Saha and Rocco A. Servedio, editors, Proceedings of the 55th Annual ACM Symposium on Theory of Computing, STOC 2023, Orlando, FL, USA, June 20-23, 2023, pages 242–253. ACM, 2023. doi:10.1145/3564246.3585230.
  • [14] Mahsa Derakhshan, Mohammad Saneian, and Zhiyang Xun. Query complexity of stochastic minimum vertex cover. In 16th Innovations in Theoretical Computer Science Conference, ITCS, 2025.
  • [15] Joan Feigenbaum, Sampath Kannan, Andrew McGregor, Siddharth Suri, and Jian Zhang. On graph problems in a semi-streaming model. Theoretical Computer Science, 348(2-3):207–216, 2005. doi:10.1016/J.TCS.2005.09.013.
  • [16] Prantar Ghosh and Vihan Shah. New lower bounds in merlin-arthur communication and graph streaming verification. In Venkatesan Guruswami, editor, 15th Innovations in Theoretical Computer Science Conference, ITCS 2024, January 30 to February 2, 2024, Berkeley, CA, USA, volume 287 of LIPIcs, pages 53:1–53:22. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2024. doi:10.4230/LIPICS.ITCS.2024.53.
  • [17] Ashish Goel, Michael Kapralov, and Sanjeev Khanna. On the communication and streaming complexity of maximum bipartite matching. In Yuval Rabani, editor, Proceedings of the Twenty-Third Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2012, Kyoto, Japan, January 17-19, 2012, pages 468–485. SIAM, 2012. doi:10.1137/1.9781611973099.41.
  • [18] Michael Kapralov. Better bounds for matchings in the streaming model. In Sanjeev Khanna, editor, Proceedings of the Twenty-Fourth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2013, New Orleans, Louisiana, USA, January 6-8, 2013, pages 1679–1697. SIAM, 2013. doi:10.1137/1.9781611973105.121.
  • [19] Subhash Khot and Oded Regev. Vertex cover might be hard to approximate to within 2-epsilon. J. Comput. Syst. Sci., 74(3):335–349, 2008. doi:10.1016/J.JCSS.2007.06.019.
  • [20] Eyal Kushilevitz and Noam Nisan. Communication complexity. Cambridge University Press, 1997.
  • [21] Euiwoong Lee and Sahil Singla. Maximum matching in the online batch-arrival model. ACM Trans. Algorithms, 16(4):49:1–49:31, 2020. doi:10.1145/3399676.
  • [22] Kheeran K Naidu and Vihan Shah. Space optimal vertex cover in dynamic streams. arXiv preprint arXiv:2209.05623, 2022. doi:10.48550/arXiv.2209.05623.
  • [23] Xiaoming Sun and David P. Woodruff. Tight bounds for graph problems in insertion streams. In Naveen Garg, Klaus Jansen, Anup Rao, and José D. P. Rolim, editors, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques, APPROX/RANDOM 2015, August 24-26, 2015, Princeton, NJ, USA, volume 40 of LIPIcs, pages 435–448. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2015. doi:10.4230/LIPICS.APPROX-RANDOM.2015.435.
  • [24] John von Neumann. Zur Theorie der Gesellschaftsspiele. (German) [On the theory of games of strategy]. Math. Ann., 100:295–320, 1928. doi:10.1007/BF01448847.
  • [25] Andrew Chi-Chih Yao. Probabilistic computations: Toward a unified measure of complexity (extended abstract). In 18th Annual Symposium on Foundations of Computer Science, Providence, Rhode Island, USA, 31 October - 1 November 1977, pages 222–227. IEEE Computer Society, 1977. doi:10.1109/SFCS.1977.24.
  • [26] Andrew Chi-Chih Yao. Some complexity questions related to distributive computing (preliminary report). In Proceedings of the 11th Annual ACM Symposium on Theory of Computing, April 30 - May 2, 1979, Atlanta, Georgia, USA, pages 209–213. ACM, 1979. doi:10.1145/800135.804414.