
ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering (2211.07065v1)

Published 14 Nov 2022 in cs.CL and cs.AI

Abstract: Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use the explicit information available in external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose to use the latest pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model achieves better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
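As a rough illustration of the architecture the abstract describes, the sketch below scores each CommonsenseQA answer choice by encoding the question-choice pair with ALBERT and fusing the pooled text representation with a knowledge-graph feature vector. This is a minimal sketch, assuming the Hugging Face `albert-base-v2` checkpoint; the random `graph_vecs`, `GRAPH_DIM`, the fusion MLP, and `score_choice` are hypothetical placeholders standing in for the paper's KG encoder and schema graph expansion, not the authors' implementation.

```python
# Hypothetical sketch: ALBERT text encoding + (placeholder) KG features
# fused to score answer choices. Not the paper's exact KG encoder.
import torch
import torch.nn as nn
from transformers import AlbertTokenizer, AlbertModel

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
albert = AlbertModel.from_pretrained("albert-base-v2")

GRAPH_DIM = 16  # assumed size of the pooled graph-encoder output

# Fusion head: concatenate ALBERT's pooled output with the graph vector
# and map to a scalar plausibility score for one answer choice.
scorer = nn.Sequential(
    nn.Linear(albert.config.hidden_size + GRAPH_DIM, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)

def score_choice(question: str, choice: str, graph_vec: torch.Tensor) -> torch.Tensor:
    """Score one answer choice from text features plus KG features."""
    enc = tokenizer(question, choice, return_tensors="pt", truncation=True)
    with torch.no_grad():
        pooled = albert(**enc).pooler_output  # shape: [1, hidden_size]
    return scorer(torch.cat([pooled, graph_vec], dim=-1))  # shape: [1, 1]

question = "Where would you find a fox that is not real?"
choices = ["storybook", "woods", "zoo"]
# Random stand-ins for the per-choice output of a real KG encoder.
graph_vecs = [torch.randn(1, GRAPH_DIM) for _ in choices]

scores = torch.cat([score_choice(question, c, g) for c, g in zip(choices, graph_vecs)])
print(choices[scores.argmax().item()])
```

In the paper's setting, the placeholder graph vector would instead come from encoding a schema graph retrieved from an external knowledge source (e.g., ConceptNet) and expanded via semantic similarity, which is the component the abstract credits for the gains over KagNet and MHGRN.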

Authors (4)
  1. Byeongmin Choi
  2. YongHyun Lee
  3. Yeunwoong Kyung
  4. Eunchan Kim
Citations (8)