
ComFormer: Code Comment Generation via Transformer and Fusion Method-based Hybrid Code Representation (2107.03644v1)

Published 8 Jul 2021 in cs.SE

Abstract: Developers often write low-quality code comments due to a lack of programming experience, which can reduce the efficiency of program comprehension. Developers therefore hope for code comment generation tools that illustrate the functionality and purpose of the code. Researchers have recently modeled this problem as a neural machine translation task and tend to use deep learning-based methods. In this study, we propose ComFormer, a novel method based on the Transformer and a fusion method-based hybrid code representation. Moreover, to alleviate the out-of-vocabulary (OOV) problem and speed up model training, we further utilize the Byte-BPE algorithm to split identifiers and the Sim_SBT method to perform AST traversal. We compare ComFormer with seven state-of-the-art baselines from the code comment generation and neural machine translation domains. Comparison results show the competitiveness of ComFormer in terms of three performance measures. Moreover, we perform a human study to verify that ComFormer can generate high-quality comments.
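The abstract is terse about the structural side of the hybrid representation, so here is a minimal sketch of a structure-based traversal (SBT) that linearizes an AST into a token sequence while preserving the tree shape. The paper targets Java and uses its own simplified Sim_SBT variant; Python's standard ast module stands in here purely for illustration, and the node-labeling rules below are assumptions, not the authors' exact method.

```python
import ast

def sbt(node):
    """Emit an SBT-style token sequence: "( label ...children... ) label",
    so the flat sequence still encodes the tree structure.
    ComFormer's Sim_SBT is a simplified variant that shortens this
    sequence; its exact rules are not reproduced here.
    """
    label = type(node).__name__
    # Keep lexical information (identifiers, literals) in the label.
    if isinstance(node, ast.FunctionDef):
        label += "_" + node.name
    elif isinstance(node, ast.Name):
        label += "_" + node.id
    elif isinstance(node, ast.arg):
        label += "_" + node.arg
    elif isinstance(node, ast.Constant):
        label += "_" + repr(node.value)
    tokens = ["(", label]
    for child in ast.iter_child_nodes(node):
        tokens.extend(sbt(child))
    tokens.extend([")", label])
    return tokens

source = "def add(a, b):\n    return a + b"
print(" ".join(sbt(ast.parse(source))))
```

On the lexical side, identifiers would then be split with a byte-level BPE tokenizer (what the abstract calls Byte-BPE; for example, the ByteLevelBPETokenizer in HuggingFace's tokenizers library), so that rare compound names decompose into known subwords and the OOV rate drops.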

Authors (7)
  1. Guang Yang (422 papers)
  2. Xiang Chen (343 papers)
  3. Jinxin Cao (1 paper)
  4. Shuyuan Xu (31 papers)
  5. Zhanqi Cui (3 papers)
  6. Chi Yu (7 papers)
  7. Ke Liu (597 papers)
Citations (20)
