Exploring Answer Information Methods for Question Generation with Transformers (2312.03483v1)
Abstract: There has been substantial work on question generation in which different methods of providing the target answer as input have been employed, mostly for RNN-based models. We use three different methods and their combinations for incorporating answer information into transformer models and explore their effect on several automatic evaluation metrics. The methods used are answer prompting; a custom product method combining answer embeddings with encoder outputs; selecting sentences from the input paragraph that contain answer-related information; and a separate cross-attention block in the decoder that attends to the answer. We observe that answer prompting without any additional modes obtains the best ROUGE and METEOR scores. Additionally, we use a custom metric to measure how many of the generated questions have the same answer as the answer used to generate them.
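Of the methods listed, answer prompting is the simplest: the target answer is concatenated with the input passage so the encoder sees it as plain text, with no architectural changes. The sketch below is an illustration of that idea and not the paper's released code; the prompt template ("answer: ... context: ...") and the checkpoint name are assumptions.

```python
# A minimal sketch of answer prompting with a Hugging Face seq2seq model,
# assuming it has been fine-tuned on SQuAD-style (context, answer, question)
# triples. The exact prompt format used in the paper is not specified.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder checkpoint; any seq2seq transformer works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

context = "The Eiffel Tower was completed in 1889 for the World's Fair."
answer = "1889"

# Answer prompting: prepend the target answer to the passage in the input text.
inputs = tokenizer(f"answer: {answer} context: {context}", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```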
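The custom answer-consistency metric mentioned at the end of the abstract can be understood as a round-trip check: answer each generated question with a QA model and count how often the prediction matches the answer the question was generated from. A rough sketch under that assumption follows; exact-match after lowercasing is an illustrative choice, and the paper's metric may normalize or compare differently.

```python
# Round-trip answer-consistency check (illustrative, not the paper's code):
# run an off-the-shelf extractive QA model on each generated question and
# compare its prediction to the original target answer.
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

def answer_match_rate(examples):
    """examples: list of dicts with 'context', 'question', and 'answer' keys."""
    hits = 0
    for ex in examples:
        pred = qa(question=ex["question"], context=ex["context"])["answer"]
        # Exact match after whitespace/case normalization (assumed criterion).
        hits += pred.strip().lower() == ex["answer"].strip().lower()
    return hits / len(examples)
```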