
CRaDLe: Deep Code Retrieval Based on Semantic Dependency Learning (2012.01028v2)

Published 2 Dec 2020 in cs.SE

Abstract: Code retrieval is a common practice for programmers to reuse existing code snippets in open-source repositories. Given a user query (i.e., a natural language description), code retrieval aims at searching for the most relevant snippets from a set of candidates. The main challenge of effective code retrieval lies in mitigating the semantic gap between natural language descriptions and code snippets. With the ever-increasing amount of available open-source code, recent studies resort to neural networks to learn the semantic matching relationships between the two sources. Statement-level dependency information, which captures the dependency relations among program statements during execution, reflects the structural importance of each statement in the code; it is favorable for accurately capturing code semantics but has never been explored for the code retrieval task. In this paper, we propose CRaDLe, a novel approach for Code Retrieval based on statement-level semantic Dependency Learning. Specifically, CRaDLe distills code representations by fusing dependency and semantic information at the statement level, and then learns a unified vector representation for each code and description pair to model the matching relationship. Comprehensive experiments and analysis on real-world datasets show that the proposed approach can accurately retrieve code snippets for a given query and significantly outperforms state-of-the-art approaches to the task.
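The retrieval setup the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the toy embeddings, the out-degree-based importance weighting, and all function names here are assumptions introduced purely to show the general idea of fusing statement-level dependency information into a single code vector and matching it against a query vector.

```python
import numpy as np

def fuse_statements(stmt_embs, dep_matrix):
    """Pool statement embeddings into one code vector, weighting each
    statement by a toy dependency-based importance score.

    dep_matrix[i, j] = 1 means statement j depends on statement i, so a
    statement that many others depend on gets a higher weight.
    """
    importance = 1.0 + dep_matrix.sum(axis=1)   # out-degree + 1
    weights = importance / importance.sum()
    return weights @ stmt_embs                  # weighted average pooling

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_emb, snippets):
    """Rank candidate snippets (statement embeddings + dependency matrix)
    by cosine similarity between query and fused code vectors."""
    scores = [cosine(query_emb, fuse_statements(e, d)) for e, d in snippets]
    return int(np.argmax(scores)), scores

# Toy usage: two 2-statement snippets with no dependencies between statements.
query = np.array([1.0, 0.0])
snippet_a = (np.array([[1.0, 0.0], [1.0, 0.0]]), np.zeros((2, 2)))
snippet_b = (np.array([[0.0, 1.0], [0.0, 1.0]]), np.zeros((2, 2)))
best, scores = retrieve(query, [snippet_a, snippet_b])
print(best)  # snippet_a aligns with the query
```

In CRaDLe the statement embeddings and dependency signals are learned by neural encoders rather than hand-set as here; the sketch only shows the fuse-then-match pipeline shape.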

Authors (7)
  1. Wenchao Gu (10 papers)
  2. Zongjie Li (29 papers)
  3. Cuiyun Gao (97 papers)
  4. Chaozheng Wang (28 papers)
  5. Hongyu Zhang (147 papers)
  6. Zenglin Xu (145 papers)
  7. Michael R. Lyu (176 papers)
Citations (42)
