
Learning to Combine Per-Example Solutions for Neural Program Synthesis (2106.07175v2)

Published 14 Jun 2021 in cs.LG, cs.AI, cs.PL, and cs.SE

Abstract: The goal of program synthesis from examples is to find a computer program that is consistent with a given set of input-output examples. Most learning-based approaches try to find a program that satisfies all examples at once. Our work, by contrast, breaks the problem into two stages: (a) find programs that each satisfy only one example, and (b) leverage these per-example solutions to yield a program that satisfies all examples. We introduce the Cross Aggregator, a neural network module based on a multi-head attention mechanism that learns to combine the cues present in these per-example solutions to synthesize a global solution. Evaluation across programs of different lengths and under two different experimental settings reveals that, given the same time budget, our technique significantly improves the success rate over PCCoder [Zohar et al., 2018] and other ablation baselines. The code, data, and trained models for our work can be found at https://github.com/shrivastavadisha/N-PEPS.
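To make the aggregation step concrete, the following is a minimal numpy sketch of multi-head attention combining per-example solution embeddings into a single global representation. This is not the authors' implementation (see the linked repository for that): the embedding dimension, head count, and random projection matrices standing in for learned weights are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def aggregate_per_example_solutions(query, keys, values, num_heads=2, seed=0):
    """Attend from a global query over n per-example solution embeddings.

    query:  (1, d_model)  -- global synthesis state (hypothetical)
    keys:   (n, d_model)  -- one embedding per per-example solution
    values: (n, d_model)
    Returns the aggregated (1, d_model) vector and per-head attention weights.
    """
    d_model = query.shape[-1]
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned parameters in this sketch.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def split_heads(x):
        # (seq, d_model) -> (num_heads, seq, d_head)
        return x.reshape(x.shape[0], num_heads, d_head).transpose(1, 0, 2)

    qh = split_heads(query @ Wq)   # (h, 1, d_head)
    kh = split_heads(keys @ Wk)    # (h, n, d_head)
    vh = split_heads(values @ Wv)  # (h, n, d_head)

    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (h, 1, n)
    attn = softmax(scores, axis=-1)
    out = attn @ vh                                        # (h, 1, d_head)
    out = out.transpose(1, 0, 2).reshape(1, d_model)       # merge heads
    return out @ Wo, attn
```

The attention weights give a soft, learned "vote" over which per-example solutions contribute to each part of the global program, which is the core intuition behind the Cross Aggregator.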

References (32)
  1. Synthesis through unification. In International Conference on Computer Aided Verification, pages 163–179. Springer, 2015.
  2. Scaling enumerative program synthesis via divide and conquer. In TACAS, 2017.
  3. Layer normalization, 2016.
  4. DeepCoder: Learning to write programs. In International Conference on Learning Representations, 2016.
  5. Just-in-time learning for bottom-up enumerative synthesis. Proceedings of the ACM on Programming Languages, 4(OOPSLA):1–29, 2020.
  6. Leveraging grammar and reinforcement learning for neural program synthesis. In International Conference on Learning Representations, 2018.
  7. Execution-guided neural program synthesis. In International Conference on Learning Representations, 2018.
  8. Neural Program Meta-Induction. In I Guyon, U V Luxburg, S Bengio, H Wallach, R Fergus, S Vishwanathan, and R Garnett, editors, Advances in Neural Information Processing Systems, volume 30. Curran Associates, Inc., 2017a.
  9. Robustfill: Neural program learning under noisy i/o. In Proceedings of the 34th International Conference on Machine Learning - Volume 70, ICML’17, page 990–998. JMLR.org, 2017b.
  10. Write, execute, assess: Program synthesis with a repl. In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 32. Curran Associates, Inc., 2019.
  11. Sumit Gulwani. Automating string processing in spreadsheets using input-output examples. In Proceedings of the 38th Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, POPL ’11, page 317–330, New York, NY, USA, 2011. Association for Computing Machinery. ISBN 9781450304900. doi: 10.1145/1926385.1926423.
  12. Synthesize, execute and debug: Learning to repair for neural program synthesis. In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H. Lin, editors, Advances in Neural Information Processing Systems, volume 33, pages 17685–17695. Curran Associates, Inc., 2020.
  13. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 770–778, 2016.
  14. Global relational models of source code. In International Conference on Learning Representations, 2020.
  15. Neural-guided deductive search for real-time program synthesis from examples. In International Conference on Learning Representations, 2018.
  16. Adam: A method for stochastic optimization. In Yoshua Bengio and Yann LeCun, editors, 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings, 2015.
  17. Generalization without systematicity: On the compositional skills of sequence-to-sequence recurrent networks. In Jennifer Dy and Andreas Krause, editors, Proceedings of the 35th International Conference on Machine Learning, volume 80 of Proceedings of Machine Learning Research, pages 2873–2882. PMLR, 10–15 Jul 2018.
  18. Accelerating search-based program synthesis using learned probabilistic models. In Proceedings of the 39th ACM SIGPLAN Conference on Programming Language Design and Implementation, PLDI 2018, page 436–449, New York, NY, USA, 2018. Association for Computing Machinery. ISBN 9781450356985. doi: 10.1145/3192366.3192410.
  19. Learning to infer program sketches. In Kamalika Chaudhuri and Ruslan Salakhutdinov, editors, Proceedings of the 36th International Conference on Machine Learning, volume 97 of Proceedings of Machine Learning Research, pages 4861–4870. PMLR, 09–15 Jun 2019.
  20. Learning to represent programs with property signatures. In International Conference on Learning Representations, 2020.
  21. BUSTLE: Bottom-up program synthesis through learning-guided exploration. In International Conference on Learning Representations, 2021.
  22. Neuro-symbolic program synthesis, 2016.
  23. Pytorch: An imperative style, high-performance deep learning library. In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, editors, Advances in Neural Information Processing Systems 32, pages 8024–8035. Curran Associates, Inc., 2019.
  24. Richard E. Pattis. Karel the Robot: A Gentle Introduction to the Art of Programming. John Wiley & Sons, Inc., USA, 2nd edition, 1994. ISBN 0471107026.
  25. Perfect is the enemy of good: Best-effort program synthesis. In 34th European Conference on Object-Oriented Programming (ECOOP 2020). Schloss Dagstuhl-Leibniz-Zentrum für Informatik, 2020.
  26. Test-driven synthesis. ACM Sigplan Notices, 49(6):408–418, 2014.
  27. Self-attention with relative position representations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 464–468, New Orleans, Louisiana, June 2018. Association for Computational Linguistics. doi: 10.18653/v1/N18-2074.
  28. Frangel: component-based synthesis with control structures. Proceedings of the ACM on Programming Languages, 3(POPL):1–29, 2019.
  29. Combinatorial sketching for finite programs. In John Paul Shen and Margaret Martonosi, editors, Proceedings of the 12th International Conference on Architectural Support for Programming Languages and Operating Systems, ASPLOS 2006, San Jose, CA, USA, October 21-25, 2006, pages 404–415. ACM, 2006. doi: 10.1145/1168857.1168907.
  30. Attention is all you need. In Proceedings of the 31st International Conference on Neural Information Processing Systems, NIPS’17, page 6000–6010, Red Hook, NY, USA, 2017. Curran Associates Inc. ISBN 9781510860964.
  31. Weixiong Zhang. Complete anytime beam search. In Proceedings of the Fifteenth National/Tenth Conference on Artificial Intelligence/Innovative Applications of Artificial Intelligence, AAAI ’98/IAAI ’98, page 425–430, USA, 1998. American Association for Artificial Intelligence. ISBN 0262510987.
  32. Automatic program synthesis of long programs with a learned garbage collector. In Advances in Neural Information Processing Systems, pages 2094–2103, 2018.
Authors (3)
  1. Disha Shrivastava (15 papers)
  2. Hugo Larochelle (87 papers)
  3. Daniel Tarlow (41 papers)
Citations (11)

