LIPIcs.MFCS.2024.33.pdf
We consider a graph coloring algorithm that processes the vertices in an order chosen uniformly at random and assigns colors to them using the First-Fit strategy, i.e., each vertex receives the smallest color not already used by any of its previously colored neighbors. We show that, in expectation, this algorithm uses at most (1+o(1))⋅ln n / ln ln n different colors to color any forest with n vertices. We also construct a family of forests witnessing that this bound is best possible.
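As a minimal sketch of the algorithm studied in the abstract, the following Python snippet colors a graph given as an adjacency-list dictionary by shuffling the vertices uniformly at random and applying First-Fit. The function name random_first_fit and the input representation are assumptions for illustration, not the authors' code.

```python
import random

def random_first_fit(adj):
    """Color a graph with First-Fit over a uniformly random vertex order.

    adj: dict mapping each vertex to an iterable of its neighbors.
    Returns a dict mapping each vertex to a color (a non-negative integer).
    """
    order = list(adj)
    random.shuffle(order)  # uniformly random processing order
    color = {}
    for v in order:
        # Colors already used by neighbors that have been processed so far.
        used = {color[u] for u in adj[v] if u in color}
        # First-Fit: take the smallest color absent from the neighborhood.
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# Example usage: a path on 4 vertices (a forest).
if __name__ == "__main__":
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(random_first_fit(adj))
```

On a forest with n vertices, the paper's result says the expected number of distinct colors this procedure produces is at most (1+o(1))⋅ln n / ln ln n.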