Unleashing the Power of Compiler Intermediate Representation to Enhance Neural Program Embeddings (2204.09191v1)

Published 20 Apr 2022 in cs.SE

Abstract: Neural program embeddings have demonstrated considerable promise in a range of program analysis tasks, including clone identification, program repair, code completion, and program synthesis. However, most existing methods generate neural program embeddings directly from program source code, learning from features such as tokens, abstract syntax trees, and control flow graphs. This paper takes a fresh look at how to improve program embeddings by leveraging compiler intermediate representation (IR). We first demonstrate simple yet highly effective methods for enhancing embedding quality by training embedding models on both source code and the LLVM IR generated at default optimization levels (e.g., -O2). We then introduce IRGen, a framework based on genetic algorithms (GA), to identify (near-)optimal sequences of optimization flags that can significantly improve embedding quality.
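The abstract names the two ingredients, IR-augmented training and a GA search over optimization flags, without spelling out the mechanics. The sketch below illustrates the general shape of such a search: candidate pass sequences are applied with LLVM's `opt`, scored by a downstream embedding-quality metric, and evolved via selection, crossover, and mutation. The `clang`/`opt` invocations are real LLVM commands (using the new pass manager's `-passes=` syntax), but the pass subset, function names, and the `embed`/`score` stubs are illustrative assumptions, not IRGen's actual implementation.

```python
import random
import subprocess
import tempfile
from pathlib import Path

# A small pool of LLVM transform passes to search over; the real IRGen
# flag set is much larger (this subset is a hypothetical example).
PASS_POOL = ["mem2reg", "instcombine", "gvn", "licm", "loop-unroll",
             "sccp", "dce", "simplifycfg", "sroa", "indvars"]

def emit_ir(source: str, passes: list[str]) -> str:
    """Compile C source to unoptimized LLVM IR, then apply a pass sequence."""
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp, "prog.c"); src.write_text(source)
        raw = Path(tmp, "prog.ll"); out = Path(tmp, "opt.ll")
        subprocess.run(["clang", "-S", "-emit-llvm", "-O0",
                        str(src), "-o", str(raw)], check=True)
        # opt's -passes= flag (new pass manager) takes a comma-separated list.
        subprocess.run(["opt", "-S", f"-passes={','.join(passes)}",
                        str(raw), "-o", str(out)], check=True)
        return out.read_text()

def fitness(passes, corpus, embed, score):
    """Average embedding quality over a corpus. `embed` and `score` are
    placeholders for the trained embedding model and its task metric
    (e.g., clone-detection accuracy); both are assumptions here."""
    return sum(score(embed(emit_ir(src, passes))) for src in corpus) / len(corpus)

def evolve(corpus, embed, score, pop_size=20, generations=30, seq_len=8):
    """Genetic search for a pass sequence that maximizes embedding quality."""
    pop = [[random.choice(PASS_POOL) for _ in range(seq_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda p: fitness(p, corpus, embed, score),
                        reverse=True)
        survivors = ranked[: pop_size // 2]           # selection: keep top half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, seq_len)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                 # point mutation
                child[random.randrange(seq_len)] = random.choice(PASS_POOL)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda p: fitness(p, corpus, embed, score))
```

In a realistic setting the fitness evaluation would cache compiled IR and batch model inference rather than recompiling every candidate per generation; the loop above keeps the GA structure visible at the cost of efficiency.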

Authors (7)
  1. Zongjie Li (29 papers)
  2. Pingchuan Ma (91 papers)
  3. Huaijin Wang (5 papers)
  4. Shuai Wang (466 papers)
  5. Qiyi Tang (16 papers)
  6. Sen Nie (14 papers)
  7. Shi Wu (9 papers)
Citations (21)