Intention-aware Long Horizon Trajectory Prediction of Surrounding Vehicles using Dual LSTM Networks (1906.02815v1)

Published 6 Jun 2019 in cs.LG, cs.RO, and stat.ML

Abstract: As autonomous vehicles (AVs) need to interact with other road users, it is important to comprehensively understand the dynamic traffic environment, especially the possible future trajectories of surrounding vehicles. This paper presents an algorithm for long-horizon trajectory prediction of surrounding vehicles using a dual long short-term memory (LSTM) network, which effectively improves prediction accuracy in strongly interactive driving environments. In contrast to traditional approaches, which require trajectory matching and manual feature selection, this method automatically learns high-level spatial-temporal features of driver behaviors from naturalistic driving data through sequence learning. By employing two blocks of LSTMs, the proposed method feeds the sequential trajectory to the first LSTM for driver intention recognition as an intermediate indicator, which is immediately followed by a second LSTM for future trajectory prediction. Test results on real-world highway driving data show that, in comparison with state-of-the-art methods, the proposed method outputs more accurate and reasonable estimates of different future trajectories over a 5 s time horizon, with root mean square errors (RMSE) for longitudinal and lateral prediction below 5.77 m and 0.49 m, respectively.
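The two-block architecture described in the abstract can be sketched as follows. This is a minimal illustration of the data flow, not the authors' implementation: the feature dimension, hidden size, three-way intention set (lane keep / left change / right change), and 10 Hz sampling over a 5 s horizon are all assumptions, and the weights are random rather than trained.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class LSTMCell:
    """Single-layer LSTM with randomly initialized weights (illustrative only)."""
    def __init__(self, input_dim, hidden_dim, rng):
        self.hidden_dim = hidden_dim
        # One stacked weight matrix for the input, forget, cell, and output gates.
        self.W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        return h, c

    def run(self, xs):
        """Process a sequence and return the final hidden state as its summary."""
        h = np.zeros(self.hidden_dim)
        c = np.zeros(self.hidden_dim)
        for x in xs:
            h, c = self.step(x, h, c)
        return h

rng = np.random.default_rng(0)
FEAT, HID, N_INTENT, HORIZON = 4, 32, 3, 50  # assumed sizes; 50 = 5 s at 10 Hz

# Block 1: intention recognition from the observed trajectory.
intent_lstm = LSTMCell(FEAT, HID, rng)
W_intent = rng.standard_normal((N_INTENT, HID)) * 0.1

# Block 2: trajectory prediction conditioned on the intention estimate.
traj_lstm = LSTMCell(FEAT + N_INTENT, HID, rng)
W_traj = rng.standard_normal((2 * HORIZON, HID)) * 0.1

def predict(history):
    """history: (T, FEAT) observed states -> intention probs, (HORIZON, 2) future (x, y)."""
    p_intent = softmax(W_intent @ intent_lstm.run(history))
    # Feed the intention estimate alongside each observed state to the second LSTM.
    augmented = [np.concatenate([x, p_intent]) for x in history]
    future_xy = (W_traj @ traj_lstm.run(augmented)).reshape(HORIZON, 2)
    return p_intent, future_xy

history = rng.standard_normal((30, FEAT))  # e.g. 3 s of observed states at 10 Hz
p, traj = predict(history)
```

The key design point the abstract highlights is that the intention estimate is an intermediate indicator: the second LSTM consumes it together with the raw trajectory, so the predicted path is conditioned on the recognized maneuver rather than regressed directly from positions alone.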

Authors (6)
  1. Long Xin (2 papers)
  2. Pin Wang (31 papers)
  3. Ching-Yao Chan (19 papers)
  4. Jianyu Chen (69 papers)
  5. Shengbo Eben Li (98 papers)
  6. Bo Cheng (51 papers)
Citations (120)
