Memory Augmented State Space Model for Time Series Forecasting
Yinbo Sun, Lintao Ma, Yu Liu, Shijun Wang, James Zhang, YangFei Zheng, Hu Yun, Lei Lei, Yulin Kang, Linbao Ye
Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 3451-3457.
https://doi.org/10.24963/ijcai.2022/479
The state space model (SSM) provides a general and flexible forecasting framework for time series. A conventional SSM with a fixed-order Markovian assumption often falls short in handling the long-range temporal dependencies and/or highly non-linear correlations in time-series data, which are crucial for accurate forecasting. To this end, we present the External Memory Augmented State Space Model (EMSSM) within the sequential Monte Carlo (SMC) framework. Unlike the common fixed-order Markovian SSM, our model features an external memory system in which we store informative latent-state experience, thereby creating "memoryful" latent dynamics that model complex long-term dependencies. Moreover, conditional normalizing flows are incorporated into our emission model, enabling adaptation to a broad class of underlying data distributions. We further propose a Monte Carlo objective that employs an efficient variational proposal distribution, fusing the filtering and dynamic prior information, to approximate the posterior state with proper particles. Our results demonstrate the competitive forecasting performance of our proposed model compared with other state-of-the-art SSMs.
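The abstract names an SMC-based Monte Carlo objective but does not state its form here. A standard particle-filter variational bound of this kind (in the style of FIVO; the paper's exact objective and proposal parameterization may differ) is

```latex
\mathcal{L}_{\mathrm{SMC}}
  = \mathbb{E}\!\left[\log \hat{p}(x_{1:T})\right],
\qquad
\hat{p}(x_{1:T}) = \prod_{t=1}^{T}\frac{1}{K}\sum_{k=1}^{K} w_t^{k},
\qquad
w_t^{k} = \frac{p_\theta(x_t \mid z_t^{k})\, p_\theta(z_t^{k} \mid z_{t-1}^{k})}
               {q_\phi(z_t^{k} \mid z_{t-1}^{k}, x_t)},
```

where K is the number of particles and the proposal q_phi is where the fused filtering and dynamic-prior information would enter.

To make the moving parts concrete, the following is a minimal, self-contained sketch of one memory-augmented SMC step. Everything in it is an illustrative assumption rather than the paper's implementation: the soft-attention memory read, the Gaussian emission (standing in for the conditional normalizing flow), and the bootstrap proposal (standing in for the learned variational proposal).

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_read(memory, query):
    """Soft-attention read over stored latent states (illustrative memory access)."""
    scores = memory @ query                       # (M,) similarity of query to each slot
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ memory                       # convex combination of memory slots

def smc_step(particles, memory, x_t, trans_std=0.1, obs_std=0.2):
    """One SMC step: propose, weight, resample (bootstrap proposal for simplicity)."""
    K, d = particles.shape
    # Transition conditioned on an external-memory read, not only on z_{t-1}:
    proposed = np.stack([
        z + 0.1 * attention_read(memory, z) + trans_std * rng.standard_normal(d)
        for z in particles
    ])
    # Gaussian emission on the first latent coordinate stands in for the
    # conditional-normalizing-flow likelihood described in the abstract.
    log_w = -0.5 * ((x_t - proposed[:, 0]) / obs_std) ** 2
    m = log_w.max()
    log_increment = m + np.log(np.mean(np.exp(log_w - m)))  # SMC evidence increment
    w = np.exp(log_w - m)
    w /= w.sum()
    idx = rng.choice(K, size=K, p=w)              # multinomial resampling
    return proposed[idx], log_increment

# Toy usage: filter a noisy sine wave and accumulate the log-evidence estimate.
T, K, d, M = 20, 64, 4, 16
x = np.sin(np.linspace(0.0, 3.0, T)) + 0.1 * rng.standard_normal(T)
memory = rng.standard_normal((M, d))              # stored latent-state "experience"
particles = rng.standard_normal((K, d))
log_evidence = 0.0
for t in range(T):
    particles, inc = smc_step(particles, memory, x[t])
    log_evidence += inc
print(f"estimated log p(x_1:T) = {log_evidence:.2f}")
```

In the actual model the memory contents, transition, proposal, and flow-based emission are all learned; the sketch only shows how an external memory read can break the fixed-order Markovian dependence inside a particle-filtering loop.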
Keywords:
Machine Learning: Time-series; Data Streams
Machine Learning: Bayesian Learning
Machine Learning: Probabilistic Machine Learning
Uncertainty in AI: Inference