Prompt Mining for Language-based Human Mobility Forecasting (2403.03544v1)

Published 6 Mar 2024 in cs.AI and cs.CL

Abstract: With the advancement of LLMs, language-based forecasting has recently emerged as an innovative approach for predicting human mobility patterns. The core idea is to use prompts to transform raw mobility data, given as numerical values, into natural language sentences so that LLMs can be leveraged to generate descriptions of future observations. However, previous studies have only employed fixed, manually designed templates to transform numerical values into sentences. Since the forecasting performance of LLMs relies heavily on prompts, using fixed templates may limit their forecasting capability. In this paper, we propose a novel framework for prompt mining in language-based mobility forecasting, aiming to explore diverse prompt design strategies. Specifically, the framework includes a prompt generation stage based on the information entropy of prompts and a prompt refinement stage that integrates mechanisms such as chain-of-thought prompting. Experimental results on large-scale real-world data demonstrate the superiority of the prompts generated by our prompt mining pipeline. Additionally, a comparison of different prompt variants shows that the proposed prompt refinement process is effective. Our study presents a promising direction for further advancing language-based mobility forecasting.
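The abstract describes the pipeline only at a high level. As an illustration of the two ingredients it names, the sketch below shows one plausible way to fill a candidate prompt template from numerical visit counts and to score it by token-level information entropy. The template wording, the to_prompt and token_entropy helpers, and the scoring rule are illustrative assumptions, not the paper's actual design.

import math
from collections import Counter

def to_prompt(place: str, visits: list, template: str) -> str:
    # Fill a candidate template with a place name and its recent visit counts
    # (hypothetical field names; the paper's templates may differ).
    history = ", ".join(str(v) for v in visits)
    return template.format(place=place, history=history)

def token_entropy(text: str) -> float:
    # Shannon entropy (in bits) of the whitespace-token distribution of a prompt,
    # used here as a simple stand-in for the entropy-based generation criterion.
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two hypothetical candidate templates; a prompt-mining loop would generate many
# variants and keep those whose entropy score satisfies its selection criterion.
candidates = [
    "POI {place} was visited {history} times over the past seven days. How many visits will it get tomorrow?",
    "Visit counts for {place}: {history}. Predict the next value.",
]

visits = [12, 15, 9, 14, 11, 13, 10]
for tpl in candidates:
    prompt = to_prompt("Central Station", visits, tpl)
    print(f"{token_entropy(prompt):.2f} bits | {prompt}")

In the full pipeline described in the abstract, such scores would guide which generated templates are kept before the refinement stage adds mechanisms like chain-of-thought instructions.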
