
CLICKER: Attention-Based Cross-Lingual Commonsense Knowledge Transfer (2302.13201v1)

Published 26 Feb 2023 in cs.CL

Abstract: Recent advances in cross-lingual commonsense reasoning (CSR) have been facilitated by the development of multilingual pre-trained models (mPTMs). While mPTMs show the potential to encode commonsense knowledge across languages, transferring commonsense knowledge learned from large-scale English corpora to other languages remains challenging. To address this problem, we propose the attention-based Cross-LIngual Commonsense Knowledge transfER (CLICKER) framework, which minimizes the performance gap between English and non-English languages on commonsense question-answering tasks. CLICKER improves commonsense reasoning for non-English languages by differentiating non-commonsense knowledge from commonsense knowledge. Experimental results on public benchmarks demonstrate that CLICKER achieves notable improvements on the cross-lingual CSR task for languages other than English.
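The abstract describes the mechanism only at a high level. The sketch below illustrates one plausible reading of it: an attention-based split of an mPTM's token representations into a "commonsense" part and a "non-commonsense" residual, plus a loss that aligns the commonsense part across languages. This is not the authors' implementation; the choice of `xlm-roberta-base` as the mPTM, the single learned scorer `cs_query`, and the MSE-alignment and cosine-separation losses are all assumptions made for concreteness.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer


class ClickerSketch(nn.Module):
    """Hypothetical reconstruction of the attention-based knowledge split."""

    def __init__(self, model_name="xlm-roberta-base"):  # assumed mPTM
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Learned scorer: how "commonsense-relevant" is each token?
        self.cs_query = nn.Linear(hidden, 1)

    def split(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        scores = self.cs_query(h).squeeze(-1)                  # (B, T)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        alpha = torch.softmax(scores, dim=-1)                  # attention weights
        # Commonsense vector: attention-weighted pooling of token states.
        cs = torch.einsum("bt,bth->bh", alpha, h)
        # Non-commonsense residual: pooled with complementary weights,
        # with padding positions zeroed out.
        inv = (1.0 - alpha) * attention_mask
        non_cs = (torch.einsum("bt,bth->bh", inv, h)
                  / inv.sum(-1, keepdim=True).clamp(min=1e-6))
        return cs, non_cs


def transfer_loss(model, tok, en_batch, xx_batch):
    """Assumed objective: pull the commonsense part of a non-English
    question toward its English counterpart, while decorrelating the
    commonsense and non-commonsense parts."""
    en = tok(en_batch, return_tensors="pt", padding=True)
    xx = tok(xx_batch, return_tensors="pt", padding=True)
    cs_en, non_en = model.split(en["input_ids"], en["attention_mask"])
    cs_xx, _ = model.split(xx["input_ids"], xx["attention_mask"])
    align = F.mse_loss(cs_xx, cs_en)                       # cross-lingual alignment
    sep = F.cosine_similarity(cs_en, non_en).abs().mean()  # knowledge separation
    return align + sep
```

Under this reading, the alignment term is what lets commonsense knowledge learned from English data benefit other languages, while the separation term keeps language-specific (non-commonsense) information from leaking into the shared representation.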

Authors (5)
  1. Ruolin Su
  2. Zhongkai Sun
  3. Sixing Lu
  4. Chengyuan Ma
  5. Chenlei Guo
