MDPs with energy-parity objectives



Mayr, Richard, Schewe, Sven ORCID: 0000-0002-9093-9518, Totzke, Patrick ORCID: 0000-0001-5274-8190 and Wojtczak, Dominik ORCID: 0000-0001-5560-0546
(2017) MDPs with energy-parity objectives. In: LICS 2017, 2017-6-20 - 2017-6-23, Reykjavik, Iceland.

Text: main.pdf - Author Accepted Manuscript (Download, 342kB)

Abstract

Energy-parity objectives combine $\omega$-regular objectives with the quantitative objectives of reward MDPs. The controller needs to avoid running out of energy while satisfying a parity objective. We refute the common belief that, if an energy-parity objective holds almost-surely, then it can be realised by some finite-memory strategy. We provide a surprisingly simple counterexample that only uses coB\"uchi conditions. We introduce the new class of bounded (energy) storage objectives that, when combined with parity objectives, preserve the finite-memory property. Based on these, we show that almost-sure and limit-sure energy-parity objectives, as well as almost-sure and limit-sure storage parity objectives, are in $\mathit{NP}\cap \mathit{coNP}$ and can be solved in pseudo-polynomial time for energy-parity MDPs.
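As a rough aid to reading the abstract, the following is a minimal sketch of how energy and parity objectives are commonly formalised in the literature; the notation (reward function $r$, colour function $\mathit{col}$, initial credit $k$) is assumed here and the paper's own definitions may differ in detail.

% Sketch of the standard definitions (assumed notation; see the paper for the exact formalisation).
% A run \rho = s_0 s_1 s_2 \ldots of the MDP, with edge rewards r and state colours col.
\[
  \mathrm{Energy}(k) \;=\; \Big\{ \rho \;\Big|\; \forall n \ge 0:\;
      k + \sum_{i=0}^{n-1} r(s_i, s_{i+1}) \;\ge\; 0 \Big\}
\]
\[
  \mathrm{Parity} \;=\; \Big\{ \rho \;\Big|\;
      \min \{ c \mid \mathit{col}(s_i) = c \text{ for infinitely many } i \} \text{ is even} \Big\}
\]
The combined energy-parity objective is the intersection $\mathrm{Energy}(k) \cap \mathrm{Parity}$. It holds almost-surely from a state $s$ if some strategy $\sigma$ yields $\Pr_s^{\sigma}\!\big(\mathrm{Energy}(k) \cap \mathrm{Parity}\big) = 1$, and limit-surely if this probability can be made arbitrarily close to $1$.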

Item Type: Conference or Workshop Item (Other)
Additional Information: dblp entry: https://dblp.org/rec/bib/conf/lics/MayrSTW17 (dblp computer science bibliography, https://dblp.org); timestamp: Fri, 02 Nov 2018 00:00:00 +0100
Uncontrolled Keywords: Markov processes, Games, Memory management, Energy storage, Color, Probability distribution, Probabilistic logic
Depositing User: Symplectic Admin
Date Deposited: 31 May 2017 09:25
Last Modified: 19 Jan 2023 07:03
DOI: 10.1109/LICS.2017.8005131
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3007712