Understanding Inter-Session Intentions via Complex Logical Reasoning (2312.13866v2)

Published 21 Dec 2023 in cs.AI and cs.CL

Abstract: Understanding user intentions is essential for improving product recommendations, navigation suggestions, and query reformulations. However, user intentions can be intricate, involving multiple sessions and attribute requirements connected by logical operators such as And, Or, and Not. For instance, a user may search for Nike or Adidas running shoes across various sessions, with a preference for purple. In another example, a user may have purchased a mattress in a previous session and is now looking for a matching bed frame without intending to buy another mattress. Existing research on session understanding has not adequately addressed making product or attribute recommendations for such complex intentions. In this paper, we present the task of logical session complex query answering (LS-CQA), where sessions are treated as hyperedges of items, and we frame the problem of complex intention understanding as an LS-CQA task on an aggregated hypergraph of sessions, items, and attributes. This is a unique complex query answering task with sessions as ordered hyperedges. We also introduce a new model, the Logical Session Graph Transformer (LSGT), which captures interactions among items across different sessions and their logical connections using a transformer structure. We analyze the expressiveness of LSGT and prove the permutation invariance of the inputs for the logical operators. By evaluating LSGT on three datasets, we demonstrate that it achieves state-of-the-art results.

Summary

  • The paper presents LSGT, a model that leverages a hypergraph of sessions, items, and attributes to answer complex logical queries spanning multiple sessions.
  • The inputs to LSGT's logical operators are permutation invariant, so operations such as AND, OR, and NOT are applied consistently regardless of operand order.
  • Experiments on three datasets show that LSGT outperforms prior models at capturing and predicting evolving user intentions.

Introduction to Logical Session Intention Understanding

Understanding a user's intentions online is a cornerstone of creating personalized experiences such as tailored product recommendations. The task grows more challenging when users interact with a system over multiple sessions with varying preferences, and their intentions are rarely stated explicitly. This challenge motivates models that infer complex, multi-session user intentions by organizing sessions, items, and user interactions into a hypergraph structure.

Proposed Approach

A hypergraph is a generalization of a graph in which an edge can connect more than two vertices, making it an apt data structure for capturing the intricate relationships among products, user sessions, and attributes. The paper frames intention identification as logical session complex query answering (LS-CQA): answering complex logical queries over an aggregated hypergraph of sessions, items, and attributes, where each session is treated as an ordered hyperedge of items. The paper introduces the Logical Session Graph Transformer (LSGT), a model designed for this reasoning task.
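To make the setup concrete, here is a minimal Python sketch of how such an aggregated hypergraph and a logical session query could be represented. The class and relation names (SessionHypergraph, SESSION, ATTR, CATEGORY) are illustrative assumptions for this sketch, not structures defined in the paper.

```python
# Illustrative sketch (not the authors' code): sessions as ordered hyperedges
# over items, item-attribute edges, and a nested-tuple logical query.
from dataclasses import dataclass, field


@dataclass
class SessionHypergraph:
    """Aggregated hypergraph of sessions, items, and attributes."""
    # Each session is an ordered hyperedge: a tuple of item ids.
    sessions: dict = field(default_factory=dict)
    # (item, relation) -> attribute value, e.g. ("nike_pegasus", "color") -> "purple".
    attributes: dict = field(default_factory=dict)

    def add_session(self, session_id, items):
        self.sessions[session_id] = tuple(items)

    def add_attribute(self, item, relation, value):
        self.attributes[(item, relation)] = value


g = SessionHypergraph()
g.add_session("s1", ["nike_pegasus", "adidas_ultraboost"])
g.add_session("s2", ["queen_mattress"])
g.add_attribute("nike_pegasus", "color", "purple")

# An intention such as "items related to session s1 that are purple and are
# not mattresses" can be written as a nested logical query over the hypergraph:
query = ("AND",
         ("SESSION", "s1"),
         ("ATTR", "color", "purple"),
         ("NOT", ("CATEGORY", "mattress")))
```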

Model Design

LSGT is built to handle complex queries that span multiple sessions and attribute requirements. It combines a transformer architecture, which models interactions among items within and across sessions, with neural implementations of the logical operators. The inputs to each logical operator are permutation invariant, so operations such as AND, OR, and NOT behave consistently regardless of the order of their operands.
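The following is a schematic PyTorch sketch of the general idea of encoding a linearized query graph with a standard transformer. The tokenization scheme, node types, embedding sizes, and readout here are assumptions made for illustration and do not reproduce LSGT's exact architecture.

```python
# Schematic sketch (an assumption, not LSGT's exact architecture): nodes of a
# query graph (items, sessions, logical operators) are linearized into tokens
# with type embeddings and encoded with a standard Transformer encoder.
import torch
import torch.nn as nn

NODE_TYPES = {"item": 0, "session": 1, "operator": 2}


class QueryGraphEncoder(nn.Module):
    def __init__(self, num_entities, dim=64, heads=4, layers=2):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.type_emb = nn.Embedding(len(NODE_TYPES), dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, entity_ids, type_ids):
        # entity_ids, type_ids: (batch, num_tokens)
        x = self.entity_emb(entity_ids) + self.type_emb(type_ids)
        h = self.encoder(x)          # (batch, num_tokens, dim)
        return h[:, 0]               # read out the operator/query token


# Tiny example: one AND operator token, one session token, two item tokens.
model = QueryGraphEncoder(num_entities=100)
entity_ids = torch.tensor([[0, 1, 2, 3]])
type_ids = torch.tensor([[2, 1, 0, 0]])
query_embedding = model(entity_ids, type_ids)   # shape (1, 64)
```

Because this sketch uses no positional encodings, self-attention treats the operand tokens as a set, so the representation read out at the operator token does not depend on the order of the operands; this mirrors the operator-wise permutation invariance discussed in the paper. Item order within a session would need to be encoded separately, since sessions are ordered hyperedges.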

Evaluation

The LSGT model is evaluated on three datasets. The results show that it handles logical session complex query answering effectively, achieving state-of-the-art results and outperforming existing query encoders. Its ability to make inferences over the aggregated hypergraph highlights its potential to improve user experience by predicting user intentions across multi-session interactions.

Theoretical Justification

The paper analyzes the expressiveness of LSGT, showing that it is at least as expressive as existing logical query encoding methods. It also proves that the inputs to each logical operator are permutation invariant, which supports the soundness of the model's logical operations on the complex reasoning tasks it targets.
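As a small illustration of the operator-wise permutation-invariance property, the sketch below checks that an intersection (AND) operator built from symmetric pooling produces the same output for any ordering of its operand embeddings. The DeepSets-style design is an assumption chosen for illustration, not the operator used in the paper.

```python
# Illustrative check (an assumed operator design, not the paper's): an AND
# operator built from symmetric (mean) pooling is permutation invariant.
import torch
import torch.nn as nn


class IntersectionOperator(nn.Module):
    """DeepSets-style AND: per-operand MLP, mean pooling, then a readout layer."""
    def __init__(self, dim=64):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.rho = nn.Linear(dim, dim)

    def forward(self, operands):
        # operands: (num_operands, dim); mean pooling ignores operand order.
        return self.rho(self.phi(operands).mean(dim=0))


op = IntersectionOperator()
q = torch.randn(3, 64)                 # three operand embeddings
perm = torch.randperm(3)
assert torch.allclose(op(q), op(q[perm]), atol=1e-5)
```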

Conclusion

The paper concludes by positioning LSGT as an advance in interpreting user intentions, a key step for systems that offer context-aware recommendations. Its ability to capture complex user requirements across sessions, particularly requirements that are implicit or evolve over time, points toward more intuitive and intelligent user interactions in e-commerce and beyond.
