
Deep Direct Discriminative Decoders for High-dimensional Time-series Data Analysis (2205.10947v2)

Published 22 May 2022 in cs.LG and stat.ML

Abstract: State-space models (SSMs) are widely used in the analysis of time-series data. SSMs rely on an explicit definition of the state and observation processes. Characterizing these processes is not always easy and becomes a modeling challenge when the dimension of the observed data grows or the observed data distribution deviates from the normal distribution. Here, we propose a new SSM formulation for high-dimensional observation processes, which we call the deep direct discriminative decoder (D4). The D4 brings the expressiveness and scalability of deep neural networks to the SSM formulation, letting us build a solution that efficiently estimates the underlying state process from high-dimensional observation signals. We demonstrate the D4 on simulated and real data, including Lorenz attractors, Langevin dynamics, random-walk dynamics, and rat hippocampus spiking neural data, and show that it outperforms traditional SSMs and RNNs. The D4 can be applied to a broader class of time-series data in which the connection between the high-dimensional observations and the underlying latent process is hard to characterize.
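
The abstract describes replacing an explicit, generative observation model with a learned discriminative estimate of the state given the observation, embedded in a recursive filter. As a rough illustration only, here is a minimal sketch of that idea in Python, assuming a Gaussian discriminative network and an independent per-dimension AR(1) state process; the names (DiscriminativeDecoder, d4_filter_step) and all parameter choices are hypothetical and not taken from the paper.

```python
import numpy as np
import torch
import torch.nn as nn


class DiscriminativeDecoder(nn.Module):
    """Hypothetical network mapping a high-dimensional observation y_t to the
    mean and diagonal variance of a discriminative density p(x_t | y_t)."""

    def __init__(self, obs_dim: int, state_dim: int, hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mean = nn.Linear(hidden, state_dim)
        self.log_var = nn.Linear(hidden, state_dim)

    def forward(self, y):
        h = self.body(y)
        return self.mean(h), self.log_var(h).exp()


def d4_filter_step(mu_prev, var_prev, y_t, decoder, a, q):
    """One Gaussian filtering step: propagate the previous posterior through an
    assumed AR(1) state model x_t = a * x_{t-1} + noise(var q), then fuse it
    with the network's discriminative estimate by precision weighting."""
    # Prediction step under the assumed AR(1) state process (per dimension).
    mu_pred = a * mu_prev
    var_pred = a ** 2 * var_prev + q
    # Discriminative update from the observation network.
    with torch.no_grad():
        mu_obs, var_obs = decoder(y_t)
    mu_obs, var_obs = mu_obs.numpy(), var_obs.numpy()
    # Precision-weighted fusion of the prediction and the discriminative term.
    var_post = 1.0 / (1.0 / var_pred + 1.0 / var_obs)
    mu_post = var_post * (mu_pred / var_pred + mu_obs / var_obs)
    return mu_post, var_post


# Example with random tensors (shapes only; not the paper's experiments).
obs_dim, state_dim, T = 100, 2, 50
decoder = DiscriminativeDecoder(obs_dim, state_dim)
a = np.full(state_dim, 0.95)   # assumed state-transition coefficients
q = np.full(state_dim, 0.01)   # assumed state-noise variances
mu, var = np.zeros(state_dim), np.ones(state_dim)
for y in torch.randn(T, obs_dim):
    mu, var = d4_filter_step(mu, var, y, decoder, a, q)
```

The paper's actual D4 formulation is more general than this sketch; the point here is only how a discriminative estimate of the state can take the place of a generative observation model inside a recursive filtering step.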

