How Useful are Educational Questions Generated by Large Language Models? (2304.06638v1)

Published 13 Apr 2023 in cs.CL, cs.AI, cs.CY, and cs.LG

Abstract: Controllable text generation (CTG) by LLMs has a huge potential to transform education for teachers and students alike. Specifically, high quality and diverse question generation can dramatically reduce the load on teachers and improve the quality of their educational content. Recent work in this domain has made progress with generation, but fails to show that real teachers judge the generated questions as sufficiently useful for the classroom setting; or if instead the questions have errors and/or pedagogically unhelpful content. We conduct a human evaluation with teachers to assess the quality and usefulness of outputs from combining CTG and question taxonomies (Bloom's and a difficulty taxonomy). The results demonstrate that the questions generated are high quality and sufficiently useful, showing their promise for widespread use in the classroom setting.
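The abstract describes steering question generation by combining controllable text generation with Bloom's taxonomy and a difficulty taxonomy. The sketch below is a minimal illustration of that idea, not the authors' implementation: the prompt template, the taxonomy labels, and the `generate_fn` stub are assumptions standing in for a real LLM backend.

```python
# Minimal sketch (assumed, not the paper's code): condition an LLM prompt on a
# Bloom's taxonomy level and a difficulty tag to control question generation.
from typing import Callable

BLOOM_LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]
DIFFICULTY_LEVELS = ["easy", "medium", "hard"]


def build_prompt(passage: str, bloom_level: str, difficulty: str) -> str:
    """Compose a control-prefixed prompt for question generation."""
    if bloom_level not in BLOOM_LEVELS:
        raise ValueError(f"unknown Bloom's level: {bloom_level}")
    if difficulty not in DIFFICULTY_LEVELS:
        raise ValueError(f"unknown difficulty: {difficulty}")
    return (
        f"Passage:\n{passage}\n\n"
        f"Write one {difficulty} question at the Bloom's '{bloom_level}' level "
        f"that a teacher could use in class. Question:"
    )


def generate_questions(passage: str, generate_fn: Callable[[str], str]) -> dict:
    """Generate one question per (Bloom's level, difficulty) pair."""
    questions = {}
    for level in BLOOM_LEVELS:
        for difficulty in DIFFICULTY_LEVELS:
            prompt = build_prompt(passage, level, difficulty)
            questions[(level, difficulty)] = generate_fn(prompt)
    return questions


if __name__ == "__main__":
    # Stub generator so the sketch runs without an LLM; swap in a real model call.
    echo = lambda prompt: f"<model output for: {prompt[:40]}...>"
    sample = "Photosynthesis converts light energy into chemical energy in plants."
    for key, question in generate_questions(sample, echo).items():
        print(key, question)
```

In the paper's setup, outputs produced under such taxonomy controls were then rated by teachers for quality and classroom usefulness; the control prefixes shown here are only one plausible way to encode the taxonomies in a prompt.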

Authors (4)
  1. Sabina Elkins (5 papers)
  2. Ekaterina Kochmar (33 papers)
  3. Jackie C. K. Cheung (11 papers)
  4. Iulian Serban (6 papers)
Citations (22)
