Sliding Window Neural Generated Tracking Based on Measurement Model
Abstract: In pursuit of further advancement in the field of target tracking, this paper explores the efficacy of a feedforward neural network in predicting drone tracks, with the eventual aim of comparing the tracks produced by the well-known Kalman filter with those produced by our proposed neural network. The distinguishing feature of our proposed neural network tracker is that it uses only a measurement model to estimate the next states of the track. Object model selection and linearization are persistent challenges in the tracking process. The neural network uses a sliding window to incorporate the history of measurements when estimating the track states. The test results are comparable to those generated by the Kalman filter, especially in cases with low measurement covariance. The complexity of linearization is avoided when using this proposed model.
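The sliding-window idea described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the window length `W`, hidden width `HIDDEN`, and the random stand-in weights are all assumptions; in the paper the network would be trained on simulated tracks.

```python
import numpy as np

W = 5          # sliding-window length (assumed)
DIM = 2        # measurement dimension: (x, y) position
HIDDEN = 16    # hidden-layer width (assumed)

rng = np.random.default_rng(0)

# One-hidden-layer feedforward network with ReLU; the weights here are
# random placeholders for parameters that would be learned offline.
W1 = rng.normal(scale=0.1, size=(W * DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, DIM))
b2 = np.zeros(DIM)

def predict_next(window):
    """Estimate the next position from a (W, DIM) window of measurements."""
    h = np.maximum(window.reshape(-1) @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2

def slide(measurements):
    """Yield successive (W, DIM) windows over a measurement sequence."""
    for t in range(len(measurements) - W + 1):
        yield measurements[t:t + W]

# Simulated straight-line track corrupted by low-covariance noise,
# mirroring the low-measurement-noise regime the abstract highlights.
truth = np.stack([np.arange(20.0), 0.5 * np.arange(20.0)], axis=1)
meas = truth + rng.normal(scale=0.05, size=truth.shape)

estimates = np.array([predict_next(w) for w in slide(meas)])
print(estimates.shape)  # one estimate per window position
```

Note that the tracker consumes only measurements: no object motion model is specified and no linearization step is needed, which is the contrast the abstract draws with (extended) Kalman filtering.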