
Semantic-Fused Multi-Granularity Cross-City Traffic Prediction (2302.11774v2)

Published 23 Feb 2023 in cs.LG

Abstract: Accurate traffic prediction is essential for effective urban management and for improving transportation efficiency. Data-driven traffic prediction methods have recently been widely adopted and outperform traditional approaches, but they typically require large amounts of training data, which is difficult to obtain in regions with inadequate sensing infrastructure. To address this issue, we propose a Semantic-Fused Multi-Granularity Transfer Learning (SFMGTL) model that transfers knowledge across cities with fused semantics at different granularities. Specifically, we design a semantic fusion module that fuses various semantics while preserving static spatial dependencies via reconstruction losses. A fused graph is then constructed from node features through graph structure learning, and hierarchical node clustering is applied to generate graphs at different granularities. To extract feasible meta-knowledge, we further introduce common and private memories and obtain domain-invariant features via adversarial training. Notably, our work jointly addresses semantic fusion and multi-granularity issues in transfer learning. We conduct extensive experiments on six real-world datasets to verify the effectiveness of SFMGTL against state-of-the-art baselines. We also perform ablation and case studies, showing that our model has substantially fewer parameters than baseline models and that knowledge transfer helps it accurately predict demand, especially during peak hours. The code is available at https://github.com/zeonchen/SFMGTL.
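
To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of how its components (semantic fusion with a reconstruction loss, graph structure learning, hierarchical node clustering, and adversarial domain alignment via gradient reversal) might fit together. All class names, dimensions, and design choices are illustrative assumptions for this sketch, not the authors' implementation; the common/private memory modules and meta-knowledge extraction are omitted. See the linked GitHub repository for the actual SFMGTL code.

```python
# Minimal, illustrative sketch of the components named in the abstract.
# Names and dimensions are assumptions; this is not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer used for adversarial domain alignment."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.clone()

    @staticmethod
    def backward(ctx, grad_out):
        # Flip the gradient so the encoder learns domain-invariant features.
        return -ctx.lam * grad_out, None


class SketchSFMG(nn.Module):
    def __init__(self, n_semantics, in_dim, hid_dim, n_clusters, n_domains):
        super().__init__()
        # Semantic fusion: one projection per semantic view, then a gated sum.
        self.view_proj = nn.ModuleList(
            [nn.Linear(in_dim, hid_dim) for _ in range(n_semantics)])
        self.gate = nn.Linear(n_semantics * hid_dim, n_semantics)
        # Decoder that reconstructs the static adjacency, preserving spatial structure.
        self.decode = nn.Linear(hid_dim, hid_dim)
        # Soft cluster assignment producing a coarser-granularity graph.
        self.assign = nn.Linear(hid_dim, n_clusters)
        # Domain classifier trained through the gradient reversal layer.
        self.domain_clf = nn.Sequential(
            nn.Linear(hid_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, n_domains))

    def forward(self, views, adj_static, lam=1.0):
        # views: list of (N, in_dim) node-feature matrices, one per semantic.
        # adj_static: (N, N) float adjacency with entries in [0, 1].
        h_views = [proj(v) for proj, v in zip(self.view_proj, views)]
        gates = torch.softmax(self.gate(torch.cat(h_views, dim=-1)), dim=-1)  # (N, S)
        h = sum(g.unsqueeze(-1) * hv
                for g, hv in zip(gates.unbind(-1), h_views))                   # (N, H)

        # Graph structure learning: similarity of fused node features.
        adj_learned = torch.softmax(h @ h.t() / h.size(-1) ** 0.5, dim=-1)     # (N, N)
        # Reconstruction loss ties fused features to the static spatial graph.
        recon = torch.sigmoid(self.decode(h) @ h.t())
        loss_recon = F.binary_cross_entropy(recon, adj_static)

        # Hierarchical node clustering to a coarser granularity.
        s = torch.softmax(self.assign(h), dim=-1)                              # (N, K)
        h_coarse = s.t() @ h                                                   # (K, H)
        adj_coarse = s.t() @ adj_learned @ s                                   # (K, K)

        # Domain-invariant features via adversarial training.
        domain_logits = self.domain_clf(GradReverse.apply(h, lam))
        return h, adj_learned, h_coarse, adj_coarse, loss_recon, domain_logits
```

In training, a cross-entropy loss over the domain logits would be added to the prediction and reconstruction losses; because the gradient reversal layer flips its gradient, minimizing that loss pushes the fused node features toward domain invariance, the standard mechanism from unsupervised domain adaptation by backpropagation.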

