
Assessing AI-Generated Questions' Alignment with Cognitive Frameworks in Educational Assessment (2504.14232v1)

Published 19 Apr 2025 in cs.AI and cs.CL

Abstract: This study evaluates the integration of Bloom's Taxonomy into OneClickQuiz, an AI-driven plugin for automating Multiple-Choice Question (MCQ) generation in Moodle. Bloom's Taxonomy provides a structured framework for categorizing educational objectives into hierarchical cognitive levels. Our research investigates whether incorporating this taxonomy can improve the alignment of AI-generated questions with specific cognitive objectives. We developed a dataset of 3691 questions categorized according to Bloom's levels and employed various classification models (Multinomial Logistic Regression, Naive Bayes, Linear Support Vector Classification (SVC), and a Transformer-based model, DistilBERT) to evaluate their effectiveness in categorizing questions. Our results indicate that higher Bloom's levels generally correlate with increased question length, Flesch-Kincaid Grade Level (FKGL), and Lexical Density (LD), reflecting the increased complexity of higher cognitive demands. Multinomial Logistic Regression showed varying accuracy across Bloom's levels, performing best for "Knowledge" and less accurately for higher-order levels. Merging higher-level categories improved accuracy for complex cognitive tasks. Naive Bayes and Linear SVC also demonstrated effective classification for lower levels but struggled with higher-order tasks. DistilBERT achieved the highest performance, significantly improving classification of both lower- and higher-order cognitive levels and reaching an overall validation accuracy of 91%. This study highlights the potential of integrating Bloom's Taxonomy into AI-driven assessment tools and underscores the advantages of advanced models like DistilBERT for enhancing educational content generation.

Summary

Integration of AI with Bloom's Taxonomy for Automated Assessment in Educational Settings

The paper “Assessing AI-Generated Questions’ Alignment with Cognitive Frameworks in Educational Assessment” by Yaacoub, Da-Rugna, and Assaghir presents a comprehensive evaluation of AI-driven question generation within Moodle, focusing on the integration of Bloom’s Taxonomy into the OneClickQuiz plugin. The authors investigate whether incorporating Bloom's Taxonomy can enhance the alignment of AI-generated questions with specific pedagogical objectives, and in doing so offer an overview of how generative AI can align educational content with cognitive frameworks.

Bloom’s Taxonomy categorizes educational objectives into a hierarchy of cognitive levels, facilitating curriculum design and assessment strategies that promote higher-order thinking. The authors built a dataset of 3691 questions categorized by Bloom’s levels and applied several classification models, including Multinomial Logistic Regression, Naive Bayes, Linear Support Vector Classification (SVC), and the Transformer-based DistilBERT, to evaluate their effectiveness in categorizing these questions.
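To make the classification setup concrete, one of the simpler baselines (a linear SVC over text features) could be sketched as follows. This is an illustrative sketch, not the authors' actual implementation: the example questions, labels, and TF-IDF feature choice are assumptions for demonstration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical mini-dataset of (question, Bloom level) pairs; the real
# study used 3691 labeled questions.
questions = [
    "Define the term photosynthesis.",
    "List the planets of the solar system.",
    "Explain why the sky appears blue.",
    "Compare mitosis and meiosis.",
    "Design an experiment to test plant growth under colored light.",
    "Evaluate the strengths and weaknesses of renewable energy policy.",
]
levels = ["Knowledge", "Knowledge", "Comprehension",
          "Analysis", "Synthesis", "Evaluation"]

# TF-IDF features feeding a linear SVM: one plausible baseline pipeline.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(questions, levels)

print(clf.predict(["State the capital of France."]))
```

On a toy dataset this size the prediction is not meaningful; the point is the pipeline shape, which mirrors the paper's comparison of shallow classifiers before moving to DistilBERT.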

Key Findings

  1. Correlation with Cognitive Complexity: The paper notes a correlation between higher Bloom's levels and increased question length, Flesch-Kincaid Grade Level (FKGL), and Lexical Density (LD), mirroring the complexity associated with advanced cognitive demands.
  2. Classification Performance: The DistilBERT model achieved the highest validation accuracy of 91%, significantly outperforming other methods in classifying both lower and higher-order cognitive levels. Traditional models exhibited challenges in accurately classifying higher-order cognitive skills, aligning with prior research.
  3. Implications for AI-driven Assessment Tools: The integration of Bloom’s Taxonomy into AI question generation using advanced models enhances the pedagogical soundness and potential of AI-based educational technologies. The findings underscore the advantages of employing deep learning models like DistilBERT to effectively generate and classify questions that promote critical thinking and deeper learning.
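The readability and density metrics behind the first finding are straightforward to compute. A minimal sketch follows, using the standard FKGL formula with a naive vowel-group syllable counter and a tiny stand-in function-word list; both are simplifications, and the paper's actual tooling is not specified here.

```python
import re

# A tiny stop list standing in for a real function-word lexicon (illustrative only).
FUNCTION_WORDS = {"the", "a", "an", "of", "is", "are", "to", "and", "in", "on"}

def count_syllables(word: str) -> int:
    """Approximate syllables as vowel groups, dropping one for a trailing 'e'."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fkgl(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

def lexical_density(text: str) -> float:
    """Share of tokens that are content (non-function) words."""
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    content = [w for w in words if w not in FUNCTION_WORDS]
    return len(content) / len(words)

question = "Define the term photosynthesis."
print(f"FKGL: {fkgl(question):.2f}, LD: {lexical_density(question):.2f}")
```

Computing these two numbers per question is enough to reproduce the kind of level-versus-complexity comparison the paper reports.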

Discussion and Implications

The paper’s insights into how AI can be integrated with structured cognitive frameworks such as Bloom’s Taxonomy represent a significant contribution to the field of educational technology. By automating MCQ generation in alignment with pedagogical objectives, AI-driven tools can support educators in designing assessments that foster higher-order thinking. This is particularly relevant given the challenges traditional classifiers face when tasked with categorizing complex cognitive tasks.

While the paper demonstrates the potential of AI in automating and enhancing the educational assessment process, it also highlights the necessity for continuous refinement of AI models. Addressing the limitations in categorizing higher-order cognitive levels will require advancements in AI training techniques and the development of hybrid models that combine strengths from various approaches.

Future Directions

Further research should explore refining AI algorithms and expanding training datasets to improve the accuracy and alignment of AI-generated questions across all cognitive levels. Investigations into hybrid models could offer solutions to the complexity challenges noted in classifying higher-order tasks. This paper sets the stage for developing more sophisticated AI-driven tools that are capable of dynamically tailoring educational content to meet nuanced cognitive objectives across varied disciplines and learning contexts.

In conclusion, by integrating Bloom’s Taxonomy into AI-based assessment tools, educators can harness the power of AI to provide tailored learning experiences that align with cognitive goals, enhancing both teaching and evaluation processes in contemporary educational settings.
