- The paper introduces a system that integrates keystroke logging and timed snapshots to capture the writing process for more personalized LLM feedback.
- The system was evaluated with 20 undergraduates: 83% said it captured their core writing challenges more clearly, and 72% said it tracked their revisions effectively.
- The findings imply that integrating real-time writing data can produce targeted, adaptive feedback, thereby advancing educational writing tools.
Teaching LLMs to Understand Student Writing Processes
This paper investigates the potential of leveraging writing process data to enhance feedback from LLMs in educational contexts. Traditional LLM-based feedback systems typically assess only the final draft of student work, overlooking the insights available in the actions and revisions students make while writing. The authors propose a novel system that integrates keystroke logging and timed snapshots to provide process-aware feedback, aiming to offer students more personalized and meaningful guidance.
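To make the data the system relies on concrete, below is a minimal sketch of the kind of process log such a tool might record. The class and field names (KeystrokeEvent, Snapshot, WritingSession) are illustrative assumptions, not the authors' actual schema.

```python
# A minimal, illustrative data model for a writing-process log:
# keystroke-level edit events plus timed snapshots of the full draft.
from dataclasses import dataclass, field
from typing import Literal


@dataclass
class KeystrokeEvent:
    timestamp: float                     # seconds since the writing session began
    action: Literal["insert", "delete"]  # the edit the student made
    position: int                        # character offset in the draft
    text: str                            # characters inserted or removed


@dataclass
class Snapshot:
    timestamp: float  # when the periodic snapshot was captured
    draft: str        # full essay text at that moment


@dataclass
class WritingSession:
    student_id: str
    events: list[KeystrokeEvent] = field(default_factory=list)
    snapshots: list[Snapshot] = field(default_factory=list)
```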
Methodology
A digital writing tool was developed to log keystroke data and capture periodic snapshots of essay drafts, monitoring students' typing behavior and recording how their documents evolved. Writing-process data, including timestamps and revision actions, were transmitted to the LLM in real time, enabling feedback that incorporates the cognitive processes behind students' writing choices.
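As a rough illustration of how such process data could be packaged for the model, the sketch below condenses a keystroke log into a short summary and prepends it to the final draft. The summarization heuristics (for example, the 10-second pause threshold) and the prompt wording are assumptions for illustration; the paper does not publish its exact prompt format.

```python
# Hedged sketch: turn a keystroke log into a brief process summary and a
# process-aware prompt. Events are dicts with "timestamp" and "action" keys,
# mirroring the data-model sketch above.

def summarize_process(events: list[dict], snapshot_count: int,
                      pause_threshold: float = 10.0) -> str:
    """Reduce a keystroke log to a short description of the writing process."""
    insertions = sum(1 for e in events if e["action"] == "insert")
    deletions = sum(1 for e in events if e["action"] == "delete")
    times = [e["timestamp"] for e in events]
    # Long gaps between keystrokes are treated as pauses (possible planning or rereading).
    pauses = sum(1 for a, b in zip(times, times[1:]) if b - a >= pause_threshold)
    return (f"{insertions} insertions, {deletions} deletions, "
            f"{pauses} pauses over {pause_threshold:.0f}s, "
            f"{snapshot_count} draft snapshots captured.")


def build_process_aware_prompt(final_draft: str, events: list[dict],
                               snapshot_count: int) -> str:
    """Combine the final draft with the process summary for the feedback model."""
    return ("You are giving formative feedback on a student essay.\n"
            f"Writing-process summary: {summarize_process(events, snapshot_count)}\n"
            "Consider how the draft evolved, not only the final text.\n\n"
            f"Final draft:\n{final_draft}")


# Example usage with a toy log:
toy_events = [
    {"timestamp": 1.2, "action": "insert"},
    {"timestamp": 14.8, "action": "insert"},
    {"timestamp": 15.0, "action": "delete"},
]
print(build_process_aware_prompt("Climate policy must balance ...", toy_events, snapshot_count=3))
```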
The research involved 20 undergraduate students, who used the tool to write essays under timed conditions. The LLM generated feedback on both the final essays and the accompanying writing-process data. After receiving feedback, students completed surveys assessing its perceived utility and accuracy in relation to their writing processes.
Findings
The paper reports that students generally preferred process-aware LLM feedback, finding that it aligned closely with their own reflective writing strategies. Notably, revision types such as content additions and paragraph restructuring correlated with higher scores in coherence and elaboration.
Qualitative analysis identified key themes in student feedback:
- Capturing Core Writing Issues: 83% of participants reported that the tool effectively identified core writing challenges, such as weak thesis clarity and organizational problems.
- Precision vs. Fairness: Some participants felt unfairly critiqued for errors they had already revised, indicating a gap in contextual awareness.
- Tracking the Revision Journey: The system was commended for understanding revision strategies, with 72% of students acknowledging its ability to track cognitive difficulty and idea development.
- Missing Nuance: Participants felt the system misread certain aspects of their writing, such as emotional and creative elements.
- Aspiration for Personal Growth and Tailored Support: Participants expressed interest in more personalized and adaptive feedback that caters to informal or genre-specific writing styles.
Quantitative data mirrored these sentiments: feedback on thesis clarity and argument structure averaged 4.1 out of 5 in satisfaction, and grammar-related feedback averaged 4.7.
Implications and Future Work
The paper suggests that integrating data on the writing process can make LLM-generated feedback significantly more pedagogically useful. By understanding not only the end result but also the cognitive journey of students, educational tools can provide feedback that is perceived as more relevant and supportive.
Future research should aim to expand on these initial findings by including larger and more diverse samples. There is also potential to rigorously compare LLM-generated feedback with human feedback to understand areas of complementarity and conflict. Additionally, refining user modeling and incorporating affective data could support more personalized, adaptive feedback systems.
Conclusion
The integration of process-sensing capabilities into LLMs represents a promising advancement in educational technology. By focusing on how students think and revise, rather than just the completed text, feedback systems can evolve to better support cognitive and skill development in writing. The findings underscore the importance of creating feedback mechanisms that are attuned to the iterative and dynamic nature of the learning process, thereby fostering more meaningful engagement and improvement in student writing.