Wednesday, December 7, 2022 - 4:00pm to 4:30pm
LIDS & Stats Tea
We study the task of learning state representations from potentially high-dimensional observations, with the goal of controlling an unknown partially observable system. We pursue a direct latent model learning approach, where a dynamic model in some latent state space is learned by predicting quantities directly related to planning (e.g., costs) without reconstructing the observations. In particular, we revisit Linear Quadratic Gaussian (LQG) control, one of the most fundamental partially observable control problems, and focus on an intuitive cost-driven state representation learning method. As our main results, we establish finite-sample guarantees for finding a near-optimal state representation function and a near-optimal controller using the directly learned latent model. To the best of our knowledge, despite various empirical successes, it was unclear prior to this work whether such a cost-driven latent model learner enjoys finite-sample guarantees. Our work underscores the value of predicting multi-step costs, an idea that is key to our theory and, notably, one that has previously been observed to be empirically valuable for learning state representations.
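To make the cost-driven idea concrete, here is a minimal toy sketch (not the talk's actual algorithm): a scalar partially observed linear system with quadratic cost, where we fit a multi-step cost predictor from a simple feature of the observation by least squares, without ever reconstructing the observation sequence. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy system (illustrative only): latent state
# x_{t+1} = a*x_t + w_t, observed as y_t = x_t + v_t, per-step
# cost c_t = x_t**2 (a scalar LQG-style setup).
a = 0.9
T, K = 4000, 3                      # trajectory length, cost-prediction horizon

x = 0.0
ys, cs = np.empty(T), np.empty(T)
for t in range(T):
    ys[t] = x + 0.01 * rng.standard_normal()   # noisy observation
    cs[t] = x**2                               # quadratic cost
    x = a * x + 0.1 * rng.standard_normal()    # latent linear dynamics

# Cost-driven features: phi_t = [y_t^2, 1]; the targets are the costs
# over the next K steps, so the predictor is trained on multi-step
# costs rather than on observation reconstruction.
phi = np.stack([ys**2, np.ones(T)], axis=1)
idx = np.arange(T - K)
targets = np.stack([cs[idx + k + 1] for k in range(K)], axis=1)

# Fit all K horizons jointly by ordinary least squares.
W, *_ = np.linalg.lstsq(phi[idx], targets, rcond=None)
pred = phi[idx] @ W

mse_model = np.mean((pred - targets) ** 2)
mse_mean = np.mean((targets - targets.mean(axis=0)) ** 2)
```

In this toy case the learned one-step coefficient approximates a**2, since E[x_{t+1}^2 | x_t] = a**2 * x_t**2 + const, illustrating how cost prediction alone can recover the cost-relevant part of the latent dynamics.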
Joint work with: Kaiqing Zhang, Russ Tedrake, and Suvrit Sra.
Yi Tian is a 4th-year Ph.D. student at LIDS and EECS, advised by Prof. Suvrit Sra. His research interests include machine learning, control, and optimization.