LaCE-LHMP: Airflow Modelling-Inspired Long-Term Human Motion Prediction By Enhancing Laminar Characteristics in Human Flow (2403.13640v1)

Published 20 Mar 2024 in cs.RO

Abstract: Long-term human motion prediction (LHMP) is essential for safely operating autonomous robots and vehicles in populated environments. It is fundamental for various applications, including motion planning, tracking, human-robot interaction and safety monitoring. However, accurate prediction of human trajectories is challenging due to complex factors, including, for example, social norms and environmental conditions. The influence of such factors can be captured through Maps of Dynamics (MoDs), which encode spatial motion patterns learned from (possibly scattered and partial) past observations of motion in the environment and which can be used for data-efficient, interpretable motion prediction (MoD-LHMP). To address the limitations of prior work, especially regarding accuracy and sensitivity to anomalies in long-term prediction, we propose the Laminar Component Enhanced LHMP approach (LaCE-LHMP). Our approach is inspired by data-driven airflow modelling, which estimates laminar and turbulent flow components and uses predominantly the laminar components to make flow predictions. Based on the hypothesis that human trajectory patterns also manifest laminar flow (that represents predictable motion) and turbulent flow components (that reflect more unpredictable and arbitrary motion), LaCE-LHMP extracts the laminar patterns in human dynamics and uses them for human motion prediction. We demonstrate the superior prediction performance of LaCE-LHMP through benchmark comparisons with state-of-the-art LHMP methods, offering an unconventional perspective and a more intuitive understanding of human movement patterns.
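To make the laminar/turbulent intuition from the abstract concrete, here is a minimal, hypothetical Python sketch (not the authors' implementation): it assumes a Map of Dynamics given as a grid of mean velocity vectors, splits it into a smoothed "laminar" field and a residual "turbulent" field with a simple moving-average filter, and predicts a trajectory by stepping along the laminar field. The function names, grid layout, smoothing choice, and rollout rule are all illustrative assumptions.

```python
# Illustrative sketch only: separate a smooth ("laminar") component of a
# velocity field learned from past trajectories from its noisy ("turbulent")
# residual, then roll out a prediction by following the laminar component.
# Grid layout, smoothing, and rollout rule are assumptions, not the paper's method.
import numpy as np
from scipy.ndimage import uniform_filter

def split_laminar_turbulent(velocity_field, window=5):
    """velocity_field: (H, W, 2) mean flow vectors per grid cell (a simple MoD).
    Returns (laminar, turbulent) fields of the same shape."""
    laminar = np.stack(
        [uniform_filter(velocity_field[..., k], size=window) for k in range(2)],
        axis=-1,
    )
    turbulent = velocity_field - laminar  # residual, treated as unpredictable
    return laminar, turbulent

def rollout(start_xy, laminar, cell_size=1.0, dt=0.4, steps=50):
    """Predict a trajectory by repeatedly stepping along the laminar flow."""
    pos = np.asarray(start_xy, dtype=float)
    path = [pos.copy()]
    H, W, _ = laminar.shape
    for _ in range(steps):
        i = int(np.clip(pos[1] / cell_size, 0, H - 1))  # row index from y
        j = int(np.clip(pos[0] / cell_size, 0, W - 1))  # column index from x
        pos = pos + dt * laminar[i, j]                  # follow the laminar flow
        path.append(pos.copy())
    return np.array(path)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic MoD: rightward mean flow plus noise standing in for turbulence.
    field = np.tile(np.array([1.0, 0.0]), (40, 40, 1))
    field += 0.3 * rng.standard_normal((40, 40, 2))
    lam, turb = split_laminar_turbulent(field)
    print(rollout((2.0, 20.0), lam)[:5])
```

The design choice this sketch mirrors is that only the laminar (predictable) component drives the prediction rollout, while the turbulent residual is treated as noise rather than extrapolated.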
