
Unsupervised Learning of General-Purpose Embeddings for Code Changes (2106.02087v2)

Published 3 Jun 2021 in cs.SE and cs.LG

Abstract: Applying machine learning to tasks that operate with code changes requires their numerical representation. In this work, we propose an approach for obtaining such representations during pre-training and evaluate them on two different downstream tasks - applying changes to code and commit message generation. During pre-training, the model learns to apply the given code change in a correct way. This task requires only code changes themselves, which makes it unsupervised. In the task of applying code changes, our model outperforms baseline models by 5.9 percentage points in accuracy. As for the commit message generation, our model demonstrated the same results as supervised models trained for this specific task, which indicates that it can encode code changes well and can be improved in the future by pre-training on a larger dataset of easily gathered code changes.
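The pre-training task described in the abstract needs only (before, after) code pairs mined from version history: the model is given the original code plus a representation of the change and is trained to reproduce the edited code. A minimal, hypothetical sketch of how such training examples could be constructed (the names and the token-level edit script are illustrative stand-ins for the paper's learned neural change embedding, not its actual method):

```python
# Hypothetical sketch: build an unsupervised pre-training example from a raw
# code change. The target is fully determined by the change itself, so no
# manual labels are required.

from dataclasses import dataclass


@dataclass
class CodeChange:
    before: str  # code before the commit
    after: str   # code after the commit


def make_pretraining_example(change: CodeChange) -> dict:
    """Pair the original code with a change representation; the model is
    trained to emit the edited code. Here a naive token-level edit script
    stands in for the learned change embedding."""
    src_tokens = change.before.split()
    tgt_tokens = change.after.split()
    # Tokens that differ position-wise form a crude "edit" signal.
    edit = [(b, a) for b, a in zip(src_tokens, tgt_tokens) if b != a]
    return {"input": change.before, "edit": edit, "target": change.after}


example = make_pretraining_example(
    CodeChange(before="x = foo(1)", after="x = bar(1)")
)
```

In this setup the model would consume `input` and `edit` and be scored on reconstructing `target`, which is the sense in which the pre-training is unsupervised: the supervision signal comes for free from the code change itself.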

Authors (4)
  1. Mikhail Pravilov (1 paper)
  2. Egor Bogomolov (21 papers)
  3. Yaroslav Golubev (40 papers)
  4. Timofey Bryksin (67 papers)
Citations (5)
