Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series, Abdul Fatir Ansari★, Alvin Heng★, Andre Lim★, and Harold Soh★, arXiv preprint, 2023

Learning accurate predictive models of real-world dynamic phenomena (e.g., climate, biological systems) remains a challenging task. One key issue is that the data generated by both natural and artificial processes often comprise time series that are irregularly sampled and/or contain missing observations. In this work, we propose the Neural Continuous-Discrete State Space Model (NCDSSM) for continuous-time modeling of time series through discrete-time observations. NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables. Leveraging techniques from continuous-discrete filtering theory, we demonstrate how to perform accurate Bayesian inference for the dynamic states. We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference. Empirical results on multiple benchmark datasets across various domains show improved imputation and forecasting performance of NCDSSM over existing models.
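To give a sense of the continuous-discrete filtering machinery the model builds on, here is a minimal sketch of the classical continuous-discrete Kalman filter for linear-Gaussian latent dynamics with irregularly-spaced observations. This is not the NCDSSM implementation (see the repository for that); all names and the Euler moment integration are illustrative assumptions.

```python
import numpy as np

def cd_kalman_filter(ts, ys, A, H, Q, R, m0, P0, dt=0.01):
    """Continuous-discrete Kalman filter (illustrative sketch, not NCDSSM).

    Latent linear SDE: dx = A x dt + process noise with spectral density Q.
    Discrete observations at irregular times t_k: y_k = H x(t_k) + v_k,
    v_k ~ N(0, R). Between observations, the mean/covariance ODEs
    dm/dt = A m and dP/dt = A P + P A^T + Q are Euler-integrated.
    """
    m, P = m0.copy(), P0.copy()
    t_prev = ts[0]
    means = []
    for t, y in zip(ts, ys):
        # Predict: integrate the moment ODEs across the (irregular) gap.
        steps = max(1, int(np.ceil((t - t_prev) / dt)))
        h = (t - t_prev) / steps
        for _ in range(steps if t > t_prev else 0):
            m = m + h * (A @ m)
            P = P + h * (A @ P + P @ A.T + Q)
        # Update: standard discrete-time Kalman correction at time t.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ (y - H @ m)
        P = P - K @ H @ P
        means.append(m.copy())
        t_prev = t
    return np.array(means)
```

NCDSSM replaces the fixed linear dynamics above with learned parameterizations and filters over auxiliary variables, but the predict-between-observations / update-at-observations structure is the same.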

## Resources

You can find our paper here and our code repository here on GitHub.

## Citation

Please consider citing our paper if you build upon our results and ideas.

Abdul Fatir Ansari★, Alvin Heng★, Andre Lim★, and Harold Soh★, “Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series”, arXiv preprint, 2023

```
@article{fatir2023neural,
  title   = {Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series},
  author  = {Fatir Ansari, Abdul and Heng, Alvin and Lim, Andre and Soh, Harold},
  journal = {arXiv e-prints},
  pages   = {arXiv--2301},
  year    = {2023}
}
```