Abstract
We consider the sensitivity of algorithms for the maximum matching problem against edge and vertex modifications. When an algorithm A for the maximum matching problem is deterministic, the sensitivity of A on G is defined as max_{e ∈ E(G)} |A(G) △ A(G − e)|, where G − e is the graph obtained from G by removing an edge e ∈ E(G) and △ denotes the symmetric difference. When A is randomized, the sensitivity is defined as max_{e ∈ E(G)} d_{EM}(A(G), A(G − e)), where d_{EM}(⋅,⋅) denotes the earth mover's distance between two distributions. Thus the sensitivity measures the difference between the output of an algorithm after the input is slightly perturbed. Algorithms with low sensitivity, or stable algorithms, are desirable because they are robust to edge failure or attack.
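To make the deterministic definition concrete, the sketch below computes the edge sensitivity of a small graph by brute force. The greedy maximal matching used here is only an illustrative stand-in for A (the greedy rule, the edge ordering, and the example graph are assumptions of this sketch, not the paper's algorithms):

```python
def greedy_matching(edges):
    """Stand-in for a deterministic algorithm A: greedily add each edge
    whose endpoints are both still unmatched, in list order."""
    matched, matching = set(), set()
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.add((u, v))
            matched.update((u, v))
    return matching

def edge_sensitivity(edges):
    """max over e in E(G) of |A(G) symdiff A(G - e)| for the algorithm above."""
    base = greedy_matching(edges)
    return max(len(base ^ greedy_matching([f for f in edges if f != e]))
               for e in edges)

# A path a-b-c-d: deleting the first greedy edge (a, b) shifts the whole matching.
print(edge_sensitivity([("a", "b"), ("b", "c"), ("c", "d")]))  # → 3
```

Deleting (a, b) makes the greedy rule pick (b, c) instead, which in turn blocks (c, d), so the symmetric difference contains three edges; this cascading behavior is exactly what low-sensitivity algorithms must avoid.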
In this work, we show a randomized (1 − ε)-approximation algorithm with worst-case sensitivity O_ε(1), which substantially improves upon the (1 − ε)-approximation algorithm of Varma and Yoshida (SODA'21) that obtains average sensitivity n^{O(1/(1+ε²))}, and show a deterministic 1/2-approximation algorithm with sensitivity exp(O(log* n)) for bounded-degree graphs. We then show that any deterministic constant-factor approximation algorithm must have sensitivity Ω(log* n). Our results imply that randomized algorithms are strictly more powerful than deterministic ones in that the former can achieve sensitivity independent of n whereas the latter cannot. We also show analogous results for vertex sensitivity, where we remove a vertex instead of an edge.
Finally, we introduce the notion of normalized weighted sensitivity, a natural generalization of sensitivity that accounts for the weights of deleted edges. For a graph with weight function w, the normalized weighted sensitivity is defined to be the total weight of the edges in the symmetric difference of the algorithm's outputs, normalized by the weight of the altered edge, i.e., max_{e ∈ E(G)} (1/w(e)) · w(A(G) △ A(G − e)). Hence the normalized weighted sensitivity measures the weighted difference between the output of an algorithm after the input is slightly perturbed, normalized by the weight of the perturbation. We show that if all edges in a graph have polynomially bounded weight, then given a trade-off parameter α > 2, there exists an algorithm that outputs a 1/(4α)-approximation to the maximum weighted matching in O(m log_α n) time, with normalized weighted sensitivity O(1).
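The weighted quantity can be sketched the same way. Here A is a greedy matching that considers edges in decreasing weight order; again this is an illustrative stand-in chosen for the sketch, not the paper's O(m log_α n)-time algorithm:

```python
def greedy_weighted_matching(wedges):
    """Stand-in for A: greedy matching over (u, v, w) triples, heaviest first."""
    matched, matching = set(), set()
    for u, v, w in sorted(wedges, key=lambda t: -t[2]):
        if u not in matched and v not in matched:
            matching.add((u, v))
            matched.update((u, v))
    return matching

def normalized_weighted_sensitivity(wedges):
    """max over e of w(A(G) symdiff A(G - e)) / w(e) for the algorithm above."""
    weight = {(u, v): w for u, v, w in wedges}
    base = greedy_weighted_matching(wedges)
    best = 0.0
    for e in wedges:
        diff = base ^ greedy_weighted_matching([f for f in wedges if f != e])
        best = max(best, sum(weight[f] for f in diff) / e[2])
    return best

# Deleting the heavy middle edge of a weighted path frees both light edges.
print(normalized_weighted_sensitivity([("a", "b", 1), ("b", "c", 2), ("c", "d", 1)]))
```

On this path, removing the weight-2 edge (b, c) swaps out weight 2 and swaps in weight 1 + 1, giving a normalized sensitivity of (2 + 1 + 1)/2 = 2.0, which illustrates why the normalization by w(e) is the natural scale for weighted perturbations.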
BibTeX Entry
@InProceedings{yoshida_et_al:LIPIcs.ITCS.2021.58,
author = {Yoshida, Yuichi and Zhou, Samson},
title = {{Sensitivity Analysis of the Maximum Matching Problem}},
booktitle = {12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
pages = {58:1--58:20},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-177-1},
ISSN = {1868-8969},
year = {2021},
volume = {185},
editor = {James R. Lee},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2021/13597},
URN = {urn:nbn:de:0030-drops-135979},
doi = {10.4230/LIPIcs.ITCS.2021.58},
annote = {Keywords: Sensitivity analysis, maximum matching, graph algorithms}
}
Keywords: Sensitivity analysis, maximum matching, graph algorithms
Collection: 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)
Issue Date: 2021
Date of publication: 04.02.2021