Quaternion recurrent neural network with real-time recurrent learning and maximum correntropy criterion (2402.14227v2)

Published 22 Feb 2024 in cs.LG

Abstract: We develop a robust quaternion recurrent neural network (QRNN) for real-time processing of 3D and 4D data with outliers. This is achieved by combining the real-time recurrent learning (RTRL) algorithm with the maximum correntropy criterion (MCC) as a loss function. While both the mean square error and the maximum correntropy criterion are viable cost functions, it is shown that the non-quadratic maximum correntropy loss is less sensitive to outliers, making it suitable for applications with multidimensional noisy or uncertain data. Both algorithms are derived based on the novel generalised HR (GHR) calculus, which allows for the differentiation of real functions of quaternion variables and provides product and chain rules, thus enabling elegant and compact derivations. Simulation results in the context of motion prediction of chest internal markers for lung cancer radiotherapy, covering both regular and irregular breathing sequences, support the analysis.

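As background for the quaternion-valued network (not a detail given in the abstract), the elementary operation underlying QRNN layers is the non-commutative Hamilton product; a self-contained NumPy sketch:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as arrays (w, x, y, z).

    Non-commutative: qmul(p, q) != qmul(q, p) in general. The product
    is norm-multiplicative: |pq| = |p| |q|.
    """
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw * qw - px * qx - py * qy - pz * qz,  # real part
        pw * qx + px * qw + py * qz - pz * qy,  # i component
        pw * qy - px * qz + py * qw + pz * qx,  # j component
        pw * qz + px * qy - py * qx + pz * qw,  # k component
    ])

# The defining identities: i*j = k, while j*i = -k.
i = np.array([0.0, 1.0, 0.0, 0.0])
j = np.array([0.0, 0.0, 1.0, 0.0])
print(qmul(i, j))  # k
print(qmul(j, i))  # -k
```

In a QRNN, weights, states, and inputs are all quaternion-valued, and products like the one above replace the real multiplications of a standard RNN; the GHR calculus mentioned in the abstract is what makes gradients of such non-commutative products tractable.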