For security and efficiency reasons, most systems do not give users full access to their information. One key specification formalism for such systems is the Partially Observable Markov Decision Process (POMDP for short), which has been extensively studied in several research communities, including AI and model checking. In this paper we tackle the problem of determining the minimal information a user needs at runtime to achieve a simple goal, modeled as reaching an objective with probability one. More precisely, to achieve her goal, the user can at each step either rely on the partial information available to her, or pay a fixed cost and receive the full information. The natural question is then to minimize the cost the user needs to pay to fulfill her objective. This optimization question gives rise to two different problems, depending on whether we minimize the worst-case cost or the average cost. On the one hand, concerning the worst-case cost, we show that efficient techniques from the model-checking community can be adapted to compute the optimal worst-case cost and to synthesize optimal strategies for the user. On the other hand, we show that the optimal average cost (a question typically considered in the AI community) cannot be computed in general, nor can it be approximated in polynomial time, even up to a large approximation factor.
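To make the setting concrete, here is a minimal Python sketch of the model described above. The class and method names (`CostlyObservationPOMDP`, `step`) are illustrative assumptions, not notation from the paper: at each step the user either keeps the free partial observation or pays the fixed cost to learn the exact successor state.

```python
import random

class CostlyObservationPOMDP:
    """Toy sketch of a POMDP with optional, costly full observation (hypothetical API)."""

    def __init__(self, transitions, observation, goal, cost):
        # transitions[state][action] -> list of (probability, next_state) pairs
        self.transitions = transitions
        self.observation = observation  # observation(state) -> partial signal seen for free
        self.goal = goal                # set of states to be reached with probability one
        self.cost = cost                # fixed price of one full observation

    def step(self, state, action, buy_full_info):
        """Play one step; return (next_state, what_the_user_sees, price_paid)."""
        probs, successors = zip(*self.transitions[state][action])
        nxt = random.choices(successors, weights=probs, k=1)[0]
        if buy_full_info:
            return nxt, nxt, self.cost            # exact state revealed, fixed price paid
        return nxt, self.observation(nxt), 0      # only the partial signal, no price
```

In this reading, the worst-case variant asks for a strategy that reaches the goal with probability one while minimizing the maximal total price paid along any play, whereas the average variant minimizes the expected total price; the results stated in the abstract concern the computability and approximability of these two optima.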