IITK at SemEval-2020 Task 10: Transformers for Emphasis Selection (2007.10820v1)
Published 21 Jul 2020 in cs.CL, cs.AI, and cs.LG
Abstract: This paper describes the system proposed for the research problem posed in Task 10 of SemEval-2020: Emphasis Selection for Written Text in Visual Media. We propose an end-to-end model that takes text as input and, for each word, outputs the probability that the word should be emphasized. Our results show that transformer-based models are particularly effective for this task. We achieved a best Match_m score (described in Section 2.2) of 0.810 and ranked third on the leaderboard.
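The abstract describes a model that maps each word of the input text to an emphasis probability using a transformer encoder. As a rough illustration only (the paper's exact architecture, encoder choice, and word-level pooling are not specified in this abstract), the sketch below shows a pretrained encoder with a per-token sigmoid head, taking the first sub-token's score as each word's probability; the model name, pooling strategy, and all hyperparameters are assumptions.

```python
# Minimal sketch of per-word emphasis scoring with a transformer encoder.
# Not the authors' exact system: encoder name, head, and word-level pooling
# (first sub-token per word) are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer


class EmphasisSelector(torch.nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased"):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(encoder_name)
        self.encoder = AutoModel.from_pretrained(encoder_name)
        # One logit per sub-token; in training this head would be fit with a
        # binary cross-entropy loss against the annotated emphasis scores.
        self.head = torch.nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, words):
        # Tokenize a pre-split word list, keeping the word <-> sub-token alignment.
        enc = self.tokenizer(words, is_split_into_words=True, return_tensors="pt")
        hidden = self.encoder(**enc).last_hidden_state[0]          # (seq_len, hidden)
        probs = torch.sigmoid(self.head(hidden)).squeeze(-1)       # (seq_len,)

        # Use the first sub-token's probability as the word-level score.
        word_probs, seen = [], set()
        for idx, wid in enumerate(enc.word_ids(0)):
            if wid is not None and wid not in seen:
                seen.add(wid)
                word_probs.append(probs[idx].item())
        return word_probs


model = EmphasisSelector()
# Untrained head, so these values are arbitrary; after training they would
# approximate the per-word emphasis probabilities described in the abstract.
print(model(["Make", "every", "word", "count"]))
```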