Data Generation for Neural Programming by Example (1911.02624v1)

Published 6 Nov 2019 in cs.LG, cs.NE, cs.PL, and stat.ML

Abstract: Programming by example is the problem of synthesizing a program from a small set of input/output pairs. Recent works applying machine learning methods to this task show promise, but are typically reliant on generating synthetic examples for training. A particular challenge lies in generating meaningful sets of inputs and outputs, which well-characterize a given program and accurately demonstrate its behavior. When the examples used for testing are generated by the same method as the training data, the performance of a model may be partly reliant on this similarity. In this paper we introduce a novel approach using an SMT solver to synthesize inputs which cover a diverse set of behaviors for a given program. We carry out a case study comparing this method to existing synthetic data generation procedures in the literature, and find that data generated using our approach improves both the discriminatory power of example sets and the ability of trained machine learning models to generalize to unfamiliar data.
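The core idea, generating input/output examples that each exercise a distinct behavior of the target program, can be illustrated with a toy sketch. The paper queries an SMT solver to find such inputs; here a simple brute-force search stands in for the solver, and the example program and all function names below are hypothetical, not taken from the paper.

```python
# Toy sketch of behavior-covering example generation for programming by example.
# A brute-force search stands in for the SMT solver used in the paper: for each
# as-yet-unseen behavior, it plays the role of the query "find an input whose
# execution path differs from those already covered."

def program(x: int) -> int:
    """Hypothetical target program with three distinct behaviors (branches)."""
    if x < 0:
        return -x           # behavior A: negate negatives
    if x % 2 == 0:
        return x // 2       # behavior B: halve evens
    return 3 * x + 1        # behavior C: odd step

def behavior(x: int) -> str:
    """Label which branch of `program` a given input exercises."""
    if x < 0:
        return "A"
    return "B" if x % 2 == 0 else "C"

def covering_examples(search_space, wanted_behaviors):
    """Collect one input/output pair per behavior, stopping once all
    wanted behaviors are covered."""
    covered = {}
    for x in search_space:
        b = behavior(x)
        if b not in covered:
            covered[b] = (x, program(x))
        if len(covered) == len(wanted_behaviors):
            break
    return covered

examples = covering_examples(range(-5, 6), wanted_behaviors={"A", "B", "C"})
print(examples)  # one (input, output) pair per branch of `program`
```

The resulting example set demonstrates every branch of the program, which is the property the abstract argues makes example sets more discriminative than ones drawn from a fixed input distribution.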

Authors (5)
  1. Judith Clymo (3 papers)
  2. Haik Manukian (10 papers)
  3. Nathanaël Fijalkow (49 papers)
  4. Adrià Gascón (35 papers)
  5. Brooks Paige (43 papers)
Citations (6)
