
AUGER: Automatically Generating Review Comments with Pre-training Models (2208.08014v2)

Published 17 Aug 2022 in cs.SE

Abstract: Code review is a best practice and a powerful safeguard for software quality. In practice, senior or highly skilled reviewers inspect source code and provide constructive comments that address issues authors may overlook, such as special cases. This collaborative validation between contributors yields higher-quality code with fewer bugs. However, because individual knowledge is limited and varies across reviewers, the efficiency and effectiveness of code review leave room for improvement; delivering useful review comments still takes considerable time and effort. This paper explores the synergy of multiple practical review comments to enhance code review and proposes AUGER (AUtomatically GEnerating Review comments), a review-comment generator built on pre-trained models. We first collect empirical review data from 11 notable Java projects and construct a dataset of 10,882 code changes. By leveraging Text-to-Text Transfer Transformer (T5) models, the framework synthesizes valuable knowledge during training and outperforms baselines by 37.38% in ROUGE-L. According to criteria from prior studies, 29% of the automatically generated review comments are considered useful. Inference takes only 20 seconds, and the model remains open to further training. Moreover, a thorough case study further confirms the improvement in performance.
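The abstract reports a 37.38% improvement over baselines in ROUGE-L, a standard metric for generated text based on the longest common subsequence (LCS) between a candidate and a reference. As a minimal sketch, here is the common sentence-level ROUGE-L F1 computation; whitespace tokenization and the unweighted F1 form are assumptions, and the paper's exact evaluation script may differ:

```python
def lcs_len(a, b):
    """Length of the longest common subsequence via classic dynamic programming."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def rouge_l(reference: str, candidate: str) -> float:
    """Sentence-level ROUGE-L F1 over whitespace tokens (assumed tokenization)."""
    ref, cand = reference.split(), candidate.split()
    lcs = lcs_len(ref, cand)
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)   # fraction of candidate tokens in the LCS
    recall = lcs / len(ref)       # fraction of reference tokens in the LCS
    return 2 * precision * recall / (precision + recall)
```

For example, comparing a generated comment against a reviewer's reference comment with `rouge_l("the cat sat on the mat", "the cat on the mat")` yields 10/11 ≈ 0.909, since five of the candidate's five tokens and five of the reference's six tokens appear in the LCS.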

Authors (8)
  1. Lingwei Li (4 papers)
  2. Li Yang (273 papers)
  3. Huaxi Jiang (2 papers)
  4. Jun Yan (247 papers)
  5. Tiejian Luo (23 papers)
  6. Zihan Hua (2 papers)
  7. Geng Liang (2 papers)
  8. Chun Zuo (8 papers)
Citations (35)
