How Do Programming Students Use Generative AI? (2501.10091v1)

Published 17 Jan 2025 in cs.HC, cs.AI, and cs.CY

Abstract: Programming students have a widespread access to powerful Generative AI tools like ChatGPT. While this can help understand the learning material and assist with exercises, educators are voicing more and more concerns about an over-reliance on generated outputs and lack of critical thinking skills. It is thus important to understand how students actually use generative AI and what impact this could have on their learning behavior. To this end, we conducted a study including an exploratory experiment with 37 programming students, giving them monitored access to ChatGPT while solving a code understanding and improving exercise. While only 23 of the students actually opted to use the chatbot, the majority of those eventually prompted it to simply generate a full solution. We observed two prevalent usage strategies: to seek knowledge about general concepts and to directly generate solutions. Instead of using the bot to comprehend the code and their own mistakes, students often got trapped in a vicious cycle of submitting wrong generated code and then asking the bot for a fix. Those who self-reported using generative AI regularly were more likely to prompt the bot to generate a solution. Our findings indicate that concerns about potential decrease in programmers' agency and productivity with Generative AI are justified. We discuss how researchers and educators can respond to the potential risk of students uncritically over-relying on generative AI. We also discuss potential modifications to our study design for large-scale replications.

Overview of "How Do Programming Students Use Generative AI?"

The research paper by Christian Rahe and Walid Maalej, titled "How Do Programming Students Use Generative AI?", provides an in-depth examination of how programming students interact with generative AI (GAI) tools such as ChatGPT. The paper is timely, as it addresses growing concerns among educators that over-reliance on these tools could impede students' learning and problem-solving capabilities.

Study Design and Methodology

The researchers conducted a mixed-methods study comprising an exploratory experiment with 37 programming students at the University of Hamburg. The students were given monitored access to ChatGPT while working on a code understanding and improving exercise. The study addressed several research questions, such as which strategies students employ when using generative AI tools, how much of their problem-solving process they delegate to the AI, and whether a student could theoretically pass an introductory programming course using only AI-generated answers.
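
To make the monitored setup concrete, here is a minimal sketch in Python of how chatbot access could be logged for a study of this kind: a thin wrapper around a chat API that records every prompt and response for later analysis. The OpenAI Python client, the model name, and the log file are assumptions for illustration, not the authors' actual instrumentation.

import json
import time
from openai import OpenAI  # assumes the openai Python package is installed

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable
LOG_FILE = "chat_log.jsonl"  # hypothetical log location, one JSON record per prompt

def monitored_chat(student_id: str, history: list[dict], prompt: str) -> str:
    """Send a student's prompt to the chatbot and log the full exchange."""
    messages = history + [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model choice
        messages=messages,
    )
    answer = response.choices[0].message.content
    # One record per prompt, so usage strategies can be coded later.
    with open(LOG_FILE, "a", encoding="utf-8") as log:
        log.write(json.dumps({
            "student": student_id,
            "timestamp": time.time(),
            "prompt": prompt,
            "answer": answer,
        }) + "\n")
    return answer

Logged prompts can then be classified, for example into knowledge-seeking versus solution-generating requests, which mirrors the usage strategies discussed below.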

Key Findings

Generative AI Usage Patterns

The study revealed that a substantial share of the students used ChatGPT to generate complete solutions rather than to understand the underlying coding concepts. The two prevalent strategies observed were using the chatbot to seek knowledge about general concepts and using it to directly generate solutions. Notably, students frequently fell into a cyclical pattern of submitting incorrect AI-generated code and repeatedly asking the chatbot for fixes, resulting in unproductive problem-solving loops.

Influence on Student Performance

Interestingly, students who self-reported regular use of generative AI tended to rely more heavily on it for solution generation. While this indicates a degree of trust in the tool, it also underscores the risk that novice programmers might accept incorrect AI-generated code, potentially eroding code comprehension and debugging skills. The paper also noted that the chatbot contributed little to students' understanding of the code or their own mistakes, in part because students rarely asked for explanations.

Implications and Future Directions

The findings of this study have significant implications for software engineering education. They suggest a need for educators to develop strategies that help students integrate these tools into their learning process without undermining their ability to think critically and solve problems independently. The paper accordingly advocates for a balanced integration of GAI tools in education, ensuring that they complement rather than replace students' learning.

Future research should explore how these AI tools can support guided learning experiences that build understanding rather than merely provide shortcuts to solutions. Studies could also examine the long-term effects on students' programming skills and knowledge retention when generative AI is used frequently in coursework.

Conclusion

Rahe and Maalej's paper is an important contribution to the discussion of generative AI's role in programming education. It underscores the need for critical thinking and responsible usage habits among students working with AI tools. The insights from this research call for an evolution in teaching methodologies to accommodate these emerging technologies, prioritizing pedagogical strategies that promote critical engagement with AI-generated content.

Authors (2)
  1. Christian Rahe (1 paper)
  2. Walid Maalej (41 papers)