Abstract
Predictive state representations (PSRs) are powerful methods for modeling dynamical systems by representing state through observational data. Most current PSR techniques focus on learning a complete PSR model over the entire state space. Consequently, these techniques often do not scale due to the curse of dimensionality, which limits the applicability of PSRs. In this paper, we propose a new PSR learning technique. Instead of directly learning a complete PSR in one pass, we learn a set of local models, each constructed on a sub-state space, and then combine the learned models. We employ a landmark technique to partition the entire state space. We further provide theoretical guarantees on the learning performance of the proposed technique and present empirical results on multiple domains.
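The core idea described in the abstract, partitioning the state space at landmark observations and fitting one local model per partition, can be illustrated with a toy sketch. This is not the paper's actual spectral PSR learning algorithm; it assumes (hypothetically) that each landmark observation uniquely identifies a sub-state space, and uses simple bigram transition counts as stand-ins for local models:

```python
from collections import defaultdict

def learn_local_models(trajectories, landmarks):
    """Toy sketch of landmark-based local model learning.

    `trajectories` is a list of observation sequences; `landmarks` is a
    set of observations assumed to uniquely identify a sub-state space.
    Each trajectory segment is attributed to the region of the most
    recently seen landmark, and a simple next-observation frequency
    model is fit per region (in place of a full local PSR).
    """
    # counts[region][prev][next] = number of times `next` followed `prev`
    # while the process was in `region`.
    counts = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
    for traj in trajectories:
        region = None
        for prev, nxt in zip(traj, traj[1:]):
            if prev in landmarks:          # entering a new sub-state space
                region = prev
            if region is not None:
                counts[region][prev][nxt] += 1
    # Normalize counts into conditional next-observation probabilities,
    # yielding one local model per landmark region.
    models = {}
    for region, table in counts.items():
        models[region] = {
            prev: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for prev, nxts in table.items()
        }
    return models
```

Combining the local models then amounts to switching to a region's model whenever its landmark is observed, which is the intuition behind merging the learned sub-space models into a global one.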
Original language | English |
---|---|
Title of host publication | AAMAS 2015 - Proceedings of the 2015 International Conference on Autonomous Agents and Multiagent Systems |
Editors | Edith Elkind, Gerhard Weiss, Pinar Yolum, Rafael H. Bordini |
Publisher | International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS) |
Pages | 1259-1266 |
Number of pages | 8 |
Volume | 2 |
ISBN (Electronic) | 9781450337700 |
Publication status | Published - 1 Jan 2015 |
Event | 14th International Conference on Autonomous Agents and Multiagent Systems, Istanbul Congress Centre, Istanbul, Turkey. Duration: 4 May 2015 → 8 May 2015. http://www.ifaamas.org/AAMAS/aamas2015/ |
Conference
Conference | 14th International Conference on Autonomous Agents and Multiagent Systems |
---|---|
Abbreviated title | AAMAS |
Country/Territory | Turkey |
City | Istanbul |
Period | 4/05/15 → 8/05/15 |
Internet address | http://www.ifaamas.org/AAMAS/aamas2015/ |