
Feature-augmented Machine Reading Comprehension with Auxiliary Tasks (2211.09438v1)

Published 17 Nov 2022 in cs.CL

Abstract: Most successful approaches to machine reading comprehension rely on a single training objective: a neural network encodes the question and paragraph, the two encodings are fused, and the encoder is assumed to learn good representations from the loss defined at the prediction layer, which is usually cross entropy. However, because this loss backpropagates from a distant prediction layer, the encoder receives no direct supervision and cannot learn effectively, so its representations remain suboptimal. Based on this observation, we propose injecting multi-granularity information into the encoding layer. Experiments demonstrate that adding multi-granularity information to the encoding layer boosts the performance of machine reading comprehension systems. Finally, an empirical study shows that our approach can be applied to many existing MRC models.
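The core idea, as described in the abstract, is to supervise the encoder directly through an auxiliary objective rather than only through the distant span-extraction loss. The following is a minimal PyTorch sketch of that pattern, not the authors' implementation: the encoder architecture, the choice of a token-tagging auxiliary task, and the loss weight `aux_weight` are all illustrative assumptions, since the abstract does not specify the exact multi-granularity features used.

```python
# Hedged sketch: an MRC model whose shared encoder gets direct supervision
# from an auxiliary per-token tagging loss, in addition to the usual
# span-extraction cross-entropy at the prediction layer. All hyperparameters
# and the auxiliary label set are hypothetical.
import torch
import torch.nn as nn

class FeatureAugmentedMRC(nn.Module):
    def __init__(self, vocab_size, hidden=128, num_aux_tags=17, aux_weight=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Shared encoder over the concatenated question+paragraph sequence.
        self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True,
                               bidirectional=True)
        # Main head: start/end logits for answer-span extraction.
        self.span_head = nn.Linear(hidden, 2)
        # Auxiliary head: per-token tags (e.g., POS or NER), giving the
        # encoder a supervision signal that does not have to travel back
        # from the distant prediction layer.
        self.aux_head = nn.Linear(hidden, num_aux_tags)
        self.aux_weight = aux_weight

    def forward(self, token_ids, start_pos=None, end_pos=None, aux_tags=None):
        enc, _ = self.encoder(self.embed(token_ids))        # (B, T, H)
        start_logits, end_logits = self.span_head(enc).unbind(dim=-1)
        loss = None
        if start_pos is not None:
            ce = nn.CrossEntropyLoss()
            # Each token position is treated as a class for span extraction.
            loss = ce(start_logits, start_pos) + ce(end_logits, end_pos)
            if aux_tags is not None:
                aux_logits = self.aux_head(enc)             # (B, T, num_tags)
                loss = loss + self.aux_weight * ce(
                    aux_logits.flatten(0, 1), aux_tags.flatten())
        return start_logits, end_logits, loss

# Toy usage: a batch of 2 sequences of 20 tokens with random labels.
model = FeatureAugmentedMRC(vocab_size=1000)
toks = torch.randint(0, 1000, (2, 20))
start, end = torch.tensor([3, 5]), torch.tensor([7, 9])
tags = torch.randint(0, 17, (2, 20))
_, _, loss = model(toks, start, end, tags)
loss.backward()  # encoder now receives gradients from both objectives
```

Because the auxiliary head attaches directly to the encoder output, its gradient reaches the encoder without passing through the fusion and prediction layers, which is the mechanism the abstract argues is missing from single-objective MRC training.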
