Abstract
Stochastic optimization problems attempt to model uncertainty in the data by assuming that (part of) the input is specified by a probability distribution. We consider the well-studied paradigm of stochastic recourse models, where the uncertainty evolves through a series of stages and one can take decisions in each stage in response to the new information learned. We obtain the first approximation algorithms for a variety of 2-stage and k-stage stochastic integer optimization problems where the underlying random data is given by a "black box" and no restrictions are placed on the costs of the two stages: one can merely sample data from this distribution, but no direct information about the distribution is given. Our results are based on two principal components. First, we show that for a broad class of 2-stage and k-stage linear programs, where k is not part of the input, given only a "black box" from which to draw independent samples from the distribution, one can, for any \epsilon>0, compute a solution of cost guaranteed to be within a (1+\epsilon) factor of the optimum, in time polynomial in 1/\epsilon, the size of the input, and a parameter \lambda that is the ratio of the cost of the same action in successive stages, which is a lower bound on the sample complexity in the "black-box" model. This is based on reformulating the stochastic linear program, which has both an exponential number of variables and an exponential number of constraints, as a compact convex program, and adapting tools from convex optimization to solve the resulting program to near optimality. In doing so, a significant difficulty that we must overcome is that even evaluating the objective function of this convex program at a given point may be provably hard. To the best of our knowledge, this is the first such result for multi-stage stochastic programs.
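The role of sampling and of the cost-ratio parameter \lambda can be illustrated on a toy 2-stage problem (a sketch only, not the paper's algorithm; the single-item model, function names, and the closed-form quantile solution are illustrative assumptions): buy x units at unit cost 1 in stage 1, then pay \lambda per unit of unmet random demand d in stage 2. The sample-average problem, minimize x + \lambda * mean(max(d_i - x, 0)), is minimized at an empirical quantile of the sampled demands.

```python
import random

def saa_first_stage(sample_demand, n_samples, lam, seed=0):
    """Sample average approximation (SAA) for a toy 2-stage problem:
    minimize  x + lam * E[max(d - x, 0)],
    i.e. unit cost 1 in stage 1, recourse cost lam >= 1 in stage 2.
    Setting the derivative to zero gives P(d > x) = 1/lam, so the SAA
    optimum is the empirical (1 - 1/lam)-quantile of the drawn demands.
    """
    rng = random.Random(seed)
    demands = sorted(sample_demand(rng) for _ in range(n_samples))
    k = min(int((1.0 - 1.0 / lam) * n_samples), n_samples - 1)
    return demands[k]

# usage: demand uniform on [0, 100], recourse twice as expensive (lam = 2),
# so the true optimum is the median of the demand distribution (50).
x_hat = saa_first_stage(lambda r: r.uniform(0, 100), 4000, lam=2.0)
```

Note how \lambda enters: the larger the ratio of second-stage to first-stage cost, the higher the quantile one must hedge toward in stage 1, which is consistent with the sample complexity growing with \lambda.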
Second, we give a rounding approach for stochastic integer programs showing that approximation algorithms for a deterministic analogue yield, with a small constant-factor loss, provably near-optimal solutions for the stochastic generalization. Thus we obtain approximation algorithms for several stochastic problems, including the stochastic versions of the set cover, vertex cover, facility location, multicut (on trees) and multicommodity flow problems.
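The flavor of such a rounding can be seen on 2-stage stochastic vertex cover (a simplified sketch under illustrative assumptions, not the paper's general theorem): LP feasibility forces, for every edge in a scenario, the four fractional values x[u], x[v], y[s][u], y[s][v] to sum to at least 1, so thresholding each at 1/4 covers every edge while inflating each LP variable by at most a factor of 4.

```python
def round_2stage_vertex_cover(x, y, scenarios, threshold=0.25):
    """Threshold rounding for 2-stage stochastic vertex cover (a sketch).
    x[v]:    first-stage LP value of vertex v
    y[s][v]: recourse LP value of vertex v in scenario s
    The LP constraint x[u] + y[s][u] + x[v] + y[s][v] >= 1 per scenario
    edge (u, v) guarantees one of the four terms is >= 1/4, so taking
    every vertex whose LP value meets the threshold covers all edges
    at a constant-factor (here 4x) blow-up of the fractional cost.
    """
    stage1 = {v for v, xv in x.items() if xv >= threshold}
    recourse = {s: {v for v, yv in y[s].items() if yv >= threshold}
                for s in scenarios}
    return stage1, recourse

# usage: one scenario "s" containing the single edge (a, b);
# the LP puts 0.3 on a in stage 1 and 0.3 on b in the recourse stage.
x = {"a": 0.3, "b": 0.0}
y = {"s": {"a": 0.0, "b": 0.3}}
s1, rec = round_2stage_vertex_cover(x, y, {"s": [("a", "b")]})
```

The design point is the one the abstract describes: the rounding never inspects the distribution, only a fractional solution, so any deterministic rounding guarantee transfers to the stochastic problem with a constant-factor loss.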
BibTeX Entry
@InProceedings{swamy_et_al:DSP:2005:72,
author = {Chaitanya Swamy and David Shmoys},
title = {Approximation Algorithms for 2-stage and Multi-stage Stochastic Optimization},
booktitle = {Algorithms for Optimization with Incomplete Information},
year = {2005},
editor = {Susanne Albers and Rolf H. M{\"o}hring and Georg Ch. Pflug and R{\"u}diger Schultz},
number = {05031},
series = {Dagstuhl Seminar Proceedings},
ISSN = {1862-4405},
publisher = {Internationales Begegnungs- und Forschungszentrum f{\"u}r Informatik (IBFI), Schloss Dagstuhl, Germany},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2005/72},
annote = {Keywords: Algorithms, Approximation Algorithms, Optimization, Convex Optimization, Stochastic Optimization}
}
Date of publication: 30.05.2005