Solvency Markov Decision Processes with Interest


Authors

BRÁZDIL Tomáš, CHEN Taolue, FOREJT Vojtěch, NOVOTNÝ Petr, SIMAITIS Aistis

Year of publication 2013
Type Article in Proceedings
Conference IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2013)
MU Faculty or unit

Faculty of Informatics

Citation
Web http://drops.dagstuhl.de/opus/volltexte/2013/4395/pdf/37.pdf
DOI http://dx.doi.org/10.4230/LIPIcs.FSTTCS.2013.487
Field Informatics
Keywords stochastic systems; Markov decision processes; reward functions
Description Solvency games, introduced by Berger et al., provide an abstract framework for modeling the decisions of a risk-averse investor whose goal is to avoid ever going broke. We study a new variant of this model in which, in addition to the stochastic environment and fixed increments and decrements to the investor's wealth, we introduce interest, which is earned or paid on the current level of savings or debt, respectively. We concentrate on problems related to the minimum initial wealth sufficient to avoid bankruptcy (i.e., a steady decrease of wealth) with probability at least $p$. We present an exponential-time algorithm that approximates this minimum initial wealth, and show that a polynomial-time approximation is not possible unless P = NP. For the qualitative case, i.e., $p = 1$, we show that the problem of deciding whether a given number is larger than or equal to the minimum initial wealth belongs to NP ∩ coNP, and that a polynomial-time algorithm for it would yield a polynomial-time algorithm for mean-payoff games, whose existence is a longstanding open problem. We also identify some classes of solvency MDPs for which this problem is in P. In all of the above cases, the algorithms also yield corresponding bankruptcy-avoiding strategies.
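
The wealth dynamics described above can be made concrete with a small Monte Carlo sketch. Everything in the snippet below (the two actions, the interest rates, the reward distributions, and the bankruptcy cutoff) is an illustrative assumption, not the paper's formal construction; in particular, the paper's notion of bankruptcy is a steady decrease of wealth, which the simulation only approximates crudely with a fixed lower cutoff.

```python
import random

# Hypothetical toy solvency MDP with interest. One state, two actions:
# "safe" (small, low-variance rewards) and "risky" (larger swings).
# Each action maps to a list of (probability, reward) pairs. All
# numbers here are assumed for illustration.

SAVINGS_RATE = 0.02   # interest earned on non-negative wealth (assumed)
DEBT_RATE = 0.05      # interest paid on negative wealth (assumed)

ACTIONS = {
    "safe": [(0.5, 1), (0.5, -1)],
    "risky": [(0.5, 4), (0.5, -5)],
}

def step(wealth, action):
    """Apply interest to the current wealth, then add a sampled reward."""
    rate = SAVINGS_RATE if wealth >= 0 else DEBT_RATE
    wealth *= 1.0 + rate
    u = random.random()
    acc = 0.0
    for p, reward in ACTIONS[action]:
        acc += p
        if u <= acc:
            return wealth + reward
    return wealth + ACTIONS[action][-1][1]

def simulate(initial_wealth, policy, steps=10_000):
    """Run one trajectory; report whether wealth drifted below a cutoff,
    a crude stand-in for 'steady decrease of wealth' (bankruptcy)."""
    w = initial_wealth
    for _ in range(steps):
        w = step(w, policy(w))
        if w < -1_000:          # heuristic bankruptcy cutoff (assumed)
            return True, w
    return False, w

# A simple wealth-dependent policy: gamble only when savings are large.
policy = lambda w: "risky" if w > 20 else "safe"

bankrupt, final = simulate(initial_wealth=10, policy=policy)
print(f"bankrupt={bankrupt}, final wealth={final:.2f}")
```

Running many such trajectories from different initial wealths gives an empirical estimate of the bankruptcy probability as a function of initial wealth; the paper's algorithms instead approximate the minimum initial wealth analytically and come with hardness guarantees.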
