Predictive state representations with state space partitioning

Yunlong Liu, Yun Tang, Yifeng Zeng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Predictive state representations (PSRs) are powerful methods for modeling dynamical systems by representing state through observational data. Most current PSR techniques focus on learning a complete PSR model over the entire state space. Consequently, these techniques often do not scale due to the curse of dimensionality, which limits the applications of PSRs. In this paper, we propose a new PSR learning technique. Instead of directly learning a complete PSR at once, we learn a set of local models, each constructed on a sub-state space, and then combine the learnt models. We employ the landmark technique to partition the entire state space. We further give theoretical guarantees on the learning performance of the proposed technique and present empirical results on multiple domains.
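The partition-then-combine idea in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: where the paper learns a local PSR model per sub-state space, the stand-in below uses a simple frequency predictor of the next observation, and all names (`landmarks`, `LocalModel`, `partition_by_landmarks`) are illustrative assumptions. The sketch only shows the structure: segment trajectories at landmark observations, fit one local model per segment family, and route queries to the local model of the most recently seen landmark.

```python
from collections import Counter, defaultdict

def partition_by_landmarks(trajectory, landmarks):
    """Split an observation sequence into segments, each assigned to the
    most recently seen landmark observation (None before the first)."""
    segments = defaultdict(list)
    current = None
    for obs in trajectory:
        if obs in landmarks:
            current = obs  # enter the sub-state space rooted at this landmark
        segments[current].append(obs)
    return segments

class LocalModel:
    """Toy stand-in for a local PSR: frequency-based next-observation
    predictor fitted only on data from one sub-state space."""
    def __init__(self):
        self.counts = defaultdict(Counter)

    def fit(self, segment):
        for prev, nxt in zip(segment, segment[1:]):
            self.counts[prev][nxt] += 1

    def predict(self, obs):
        c = self.counts.get(obs)
        return c.most_common(1)[0][0] if c else None

def learn_combined_model(trajectories, landmarks):
    """Learn one local model per landmark; together they cover the space."""
    models = defaultdict(LocalModel)
    for traj in trajectories:
        for lm, seg in partition_by_landmarks(traj, landmarks).items():
            models[lm].fit(seg)
    return models

# Usage: a query is routed to the local model of the active landmark.
trajs = [list("AxyAxzBppBpq"), list("AxyBpp")]
models = learn_combined_model(trajs, landmarks={"A", "B"})
print(models["A"].predict("x"))  # → 'y' (most frequent successor of 'x' after landmark A)
```

The design point carried over from the paper is that each local model only ever sees data from its own sub-state space, so no single model must cover the full (exponentially large) space.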

Original language: English
Title of host publication: AAMAS 2015 - Proceedings of the 2015 International Conference on Autonomous Agents and Multiagent Systems
Editors: Edith Elkind, Gerhard Weiss, Pinar Yolum, Rafael H. Bordini
Publisher: International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS)
Pages: 1259-1266
Number of pages: 8
Volume: 2
ISBN (Electronic): 9781450337700
Publication status: Published - 1 Jan 2015
Event: 14th International Conference on Autonomous Agents and Multiagent Systems, Istanbul Congress Centre, Istanbul, Turkey
Duration: 4 May 2015 – 8 May 2015
http://www.ifaamas.org/AAMAS/aamas2015/

Conference

Conference: 14th International Conference on Autonomous Agents and Multiagent Systems
Abbreviated title: AAMAS
Country/Territory: Turkey
City: Istanbul
Period: 4/05/15 – 8/05/15
Internet address: http://www.ifaamas.org/AAMAS/aamas2015/
