iJTyper: An Iterative Type Inference Framework for Java by Integrating Constraint- and Statistically-based Methods (2402.09995v1)

Published 15 Feb 2024 in cs.SE

Abstract: Inferring the types of API elements in incomplete code snippets (e.g., those on Q&A forums) is a prerequisite step for working with the code snippets. Existing type inference methods can be mainly categorized as constraint-based or statistically-based. The former imposes stricter requirements on code syntax and often suffers from low recall due to the syntactic limitations of code snippets. The latter relies on statistical regularities learned from a training corpus and does not take full advantage of the type constraints in code snippets, which may lead to low precision. In this paper, we propose an iterative type inference framework for Java, called iJTyper, that integrates the strengths of both constraint- and statistically-based methods. For a code snippet, iJTyper first applies a constraint-based method and augments the code context with the inferred types of API elements. iJTyper then applies a statistically-based method to the augmented code snippet. The predicted candidate types of API elements are further used to improve the constraint-based method by reducing its pre-built knowledge base. iJTyper iteratively executes both methods, performing code context augmentation and knowledge base reduction, until a termination condition is satisfied. Finally, the inference results of both methods are combined. We evaluated iJTyper on two open-source datasets. Results show that 1) iJTyper achieves high average precision/recall of 97.31% and 92.52% on both datasets; 2) iJTyper significantly improves the recall of two state-of-the-art baselines, SnR and MLMTyper, by at least 7.31% and 27.44%, respectively; and 3) iJTyper improves the average precision/recall of the popular LLM, ChatGPT, by 3.25% and 0.51% on both datasets.
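
To make the iteration described in the abstract more concrete, the sketch below outlines the loop of alternating constraint-based and statistically-based passes, code-context augmentation, and knowledge-base reduction. It is a minimal illustration based only on the abstract: the interfaces (ConstraintInferencer, StatisticalInferencer, KnowledgeBase) and the augmentation/combination strategies are hypothetical placeholders, not the authors' actual implementation or API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical placeholder interfaces, assumed for illustration only.
interface ConstraintInferencer {
    Map<String, String> infer(String codeSnippet, KnowledgeBase kb);
}

interface StatisticalInferencer {
    Map<String, String> infer(String augmentedSnippet);
}

interface KnowledgeBase {
    KnowledgeBase reduce(Map<String, String> candidateTypes);
}

final class IJTyperSketch {
    private final ConstraintInferencer constraintBased;
    private final StatisticalInferencer statisticallyBased;
    private final int maxIterations;

    IJTyperSketch(ConstraintInferencer c, StatisticalInferencer s, int maxIterations) {
        this.constraintBased = c;
        this.statisticallyBased = s;
        this.maxIterations = maxIterations;
    }

    Map<String, String> inferTypes(String snippet, KnowledgeBase kb) {
        Map<String, String> constraintTypes = new HashMap<>();
        Map<String, String> statisticalTypes = new HashMap<>();

        for (int i = 0; i < maxIterations; i++) {
            // 1. Constraint-based pass over the (possibly augmented) snippet.
            Map<String, String> newConstraintTypes = constraintBased.infer(snippet, kb);

            // 2. Augment the code context with the inferred types
            //    (simulated here by prepending type-hint comments).
            String augmented = augment(snippet, newConstraintTypes);

            // 3. Statistically-based pass on the augmented snippet.
            Map<String, String> newStatisticalTypes = statisticallyBased.infer(augmented);

            // 4. Use the candidate types to shrink the knowledge base
            //    for the next constraint-based pass.
            kb = kb.reduce(newStatisticalTypes);

            // Terminate once neither method changes its predictions
            // (one possible termination condition; the paper's may differ).
            boolean converged = newConstraintTypes.equals(constraintTypes)
                    && newStatisticalTypes.equals(statisticalTypes);
            constraintTypes = newConstraintTypes;
            statisticalTypes = newStatisticalTypes;
            snippet = augmented;
            if (converged) {
                break;
            }
        }

        // 5. Combine the results: constraint-based predictions take priority,
        //    statistical predictions fill in unresolved elements.
        Map<String, String> combined = new HashMap<>(statisticalTypes);
        combined.putAll(constraintTypes);
        return combined;
    }

    private String augment(String snippet, Map<String, String> types) {
        StringBuilder sb = new StringBuilder();
        types.forEach((element, type) ->
                sb.append("// ").append(element).append(" : ").append(type).append('\n'));
        return sb.append(snippet).toString();
    }
}
```

The loop structure mirrors the abstract's description: each round feeds the constraint-based results into the statistical model via context augmentation, and the statistical candidates back into the constraint solver via knowledge-base reduction, with the two result sets merged at the end.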
