
Sliding Window Neural Generated Tracking Based on Measurement Model

Published 10 Jun 2023 in cs.CV and eess.SP (arXiv:2306.06434v1)

Abstract: In the pursuit of further advancement in the field of target tracking, this paper explores the efficacy of a feedforward neural network in predicting drone tracks, with the aim of comparing the tracks produced by the well-known Kalman filter against those produced by our proposed neural network. The unique feature of the proposed neural-network tracker is that it uses only a measurement model to estimate the next states of the track; object model selection and linearization are among the challenges that routinely arise in the tracking process. The neural network uses a sliding window to incorporate the history of measurements when estimating the track values. The test results are comparable to those generated by the Kalman filter, especially in cases of low measurement covariance, and the complexity of linearization is avoided with the proposed model.
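The core idea, as stated in the abstract, is to feed a sliding window of past measurements into a feedforward network that predicts the next track state, with no object motion model and hence no linearization step. The following is a minimal sketch of that idea, not the paper's exact architecture: the window length, hidden-layer width, synthetic constant-velocity track, and training setup are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

W = 5   # sliding-window length (assumed)
H = 16  # hidden units (assumed)

# Synthetic "drone" track: slow constant-velocity 2-D motion plus
# measurement noise. Only the noisy measurements are used for training,
# mirroring the measurement-model-only setup described in the abstract.
t = np.arange(200, dtype=float)
truth = np.stack([0.005 * t, 0.003 * t], axis=1)
meas = truth + rng.normal(scale=0.01, size=truth.shape)

# Build (window of W measurements -> next measurement) training pairs.
X = np.stack([meas[i:i + W].ravel() for i in range(len(meas) - W)])
Y = meas[W:]

# One tanh hidden layer, linear output; trained with full-batch
# gradient descent on mean-squared error.
W1 = rng.normal(scale=0.1, size=(X.shape[1], H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 2))
b2 = np.zeros(2)

lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - Y                       # d(MSE)/d(pred), up to a constant
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Estimate the next position from the most recent window of measurements.
window = meas[-W:].ravel()
next_pos = np.tanh(window @ W1 + b1) @ W2 + b2
```

Because the predictor consumes raw measurements directly, no process-model Jacobian is ever formed, which is the linearization burden the paper argues this approach avoids relative to an extended Kalman filter.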

