Etessami, Kousha
The complexity of analyzing infinite-state Markov chains, Markov decision processes, and stochastic games (Invited talk)
Abstract
In recent years, a considerable amount of research has been devoted to understanding the computational complexity of basic analysis problems, and model checking problems, for finitely-presented countable infinite-state probabilistic systems. In particular, we have studied recursive Markov chains (RMCs), recursive Markov decision processes (RMDPs) and recursive stochastic games (RSGs). These arise by adding a natural recursion feature to finite-state Markov chains, MDPs, and stochastic games. RMCs and RMDPs provide natural abstract models of probabilistic procedural programs with recursion, and they are expressively equivalent to probabilistic and MDP extensions of pushdown automata. Moreover, a number of well-studied stochastic processes, including multi-type branching processes, (discrete-time) quasi-birth-death processes, and stochastic context-free grammars, can be suitably captured by subclasses of RMCs.
A central computational problem for analyzing various classes of recursive probabilistic systems is the computation of their (optimal) termination probabilities. These form a key ingredient for many other analyses, including model checking. For RMCs, and for important subclasses of RMDPs and RSGs, computing their termination values is equivalent to computing the least fixed point (LFP) solution of a corresponding monotone system of polynomial (min/max) equations. The complexity of computing the LFP solution for such equation systems is an intriguing problem, with connections to several areas of research. The LFP solution may in general be irrational. So, one possible aim is to compute it to within a desired additive error epsilon > 0. For general RMCs, approximating their termination probability within any nontrivial constant additive error < 1/2 is at least as hard as longstanding open problems in the complexity of numerical computation which are not even known to be in NP. For several key subclasses of RMCs and RMDPs, computing their termination values turns out to be much more tractable.
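As a concrete illustration of the LFP formulation (not taken from the talk; a minimal one-variable sketch under assumed parameters), consider a hypothetical process that at each step either branches into two independent copies with probability p or terminates with probability 1 - p. Its termination probability is the least nonnegative solution of the monotone polynomial equation x = p*x^2 + (1 - p), and Kleene iteration from 0 converges to that LFP from below:

```python
def kleene_lfp(f, x0=0.0, eps=1e-9, max_iter=100000):
    """Iterate x_{k+1} = f(x_k) starting from x0 = 0.
    For a monotone map on [0, 1] this converges from below
    to the least fixed point (Kleene/value iteration)."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < eps:
            return nxt
        x = nxt
    return x

# Hypothetical parameters: branch with probability p = 0.6,
# terminate with probability 0.4.  The equation x = p*x^2 + (1 - p)
# has roots 1 and (1 - p)/p; the LFP is min(1, (1 - p)/p) = 2/3.
p = 0.6
q = kleene_lfp(lambda x: p * x * x + (1 - p))
```

Kleene iteration converges only linearly here (which is one reason the talk's algorithms turn to Newton's method), but it makes the "least fixed point of a monotone system" reading of termination probabilities concrete.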
In this talk I will survey algorithms for, and discuss the computational complexity of, key analysis problems for classes of infinite-state recursive MCs, MDPs, and stochastic games. In particular, I will discuss recent joint work with Alistair Stewart and Mihalis Yannakakis (in papers that appeared at STOC'12 and ICALP'12), in which we have obtained polynomial time algorithms for computing, to within arbitrary desired precision, the LFP solution of probabilistic polynomial (min/max) systems of equations. Using this, we obtained the first polynomial-time algorithms for computing (within desired precision) the extinction probabilities of multi-type branching processes, the probability that an arbitrary given stochastic context-free grammar generates a given string, and the optimum (maximum or minimum) extinction probabilities for branching MDPs and context-free MDPs. For branching MDPs, their corresponding equations amount to Bellman optimality equations for minimizing/maximizing their termination probabilities. Our algorithms combine variations and generalizations of Newton's method with other techniques, including linear programming. The algorithms are fairly easy to implement, but analyzing their worst-case running time is mathematically quite involved.
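To give a flavor of the Newton-based approach (a toy sketch, not the rounded, preprocessed variant analyzed in the STOC'12 paper), here is plain Newton iteration, started from 0, applied to the hypothetical one-variable monotone equation x = p*x^2 + (1 - p) with assumed parameter p = 0.6. For monotone polynomial systems of this kind, Newton started from 0 converges monotonically to the LFP, and much faster than simple fixed-point iteration:

```python
def newton_lfp(g, dg, x0=0.0, eps=1e-12, max_iter=100):
    """Newton's method on g(x) = 0, started from x0 = 0.
    g is the equation system written as f(x) - x; for monotone
    polynomial systems this converges monotonically to the LFP."""
    x = x0
    for _ in range(max_iter):
        x_new = x - g(x) / dg(x)
        if abs(x_new - x) < eps:
            return x_new
        x = x_new
    return x

# Hypothetical example: g(x) = p*x^2 + (1 - p) - x, whose least
# nonnegative root (the LFP) is min(1, (1 - p)/p) = 2/3 for p = 0.6.
p = 0.6
q = newton_lfp(lambda x: p * x * x + (1 - p) - x,
               lambda x: 2 * p * x - 1)
```

In the multivariate case each Newton step solves a linear system, and the min/max operators of RMDP/RSG equations require the additional care (including the linear programming component) mentioned above; this one-variable sketch only shows the basic iteration.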
BibTeX Entry
@InProceedings{etessami:LIPIcs:2013:3914,
author = {Kousha Etessami},
title = {{The complexity of analyzing infinite-state Markov chains, Markov decision processes, and stochastic games (Invited talk)}},
booktitle = {30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
pages = {1--2},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-939897-50-7},
ISSN = {1868-8969},
year = {2013},
volume = {20},
editor = {Natacha Portier and Thomas Wilke},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2013/3914},
URN = {urn:nbn:de:0030-drops-39143},
doi = {10.4230/LIPIcs.STACS.2013.1},
annote = {Keywords: recursive Markov chains, Markov decision processes, stochastic games, monotone systems of nonlinear equations, least fixed points, Newton's method, co}
}
Date of publication: 26.02.2013