
Modelling homogeneous generative meta-programming (1602.06568v2)

Published 21 Feb 2016 in cs.PL

Abstract: Homogeneous generative meta-programming (HGMP) enables the generation of program fragments at compile-time or run-time. We present the first foundational calculus which can model powerful HGMP languages such as Template Haskell. The calculus is designed such that we can gradually enhance it with the features needed to model many of the advanced features of real languages. As a demonstration of the flexibility of our approach, we also provide a simple type system for the calculus.
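To make the idea concrete, here is a minimal Template Haskell sketch of the kind of compile-time program-fragment generation the calculus is meant to model. It is not taken from the paper; the module and function names (`Power`, `mkPower`) are illustrative assumptions.

```haskell
{-# LANGUAGE TemplateHaskell #-}
-- Hypothetical example: building an expression fragment at compile time
-- using Template Haskell quotation ([| ... |]) and splicing ($( ... )).
module Power where

import Language.Haskell.TH (Q, Exp)

-- mkPower n constructs the AST of a function that multiplies its
-- argument by itself n times, before the program ever runs.
mkPower :: Int -> Q Exp
mkPower 0 = [| const 1 |]
mkPower n = [| \x -> x * $(mkPower (n - 1)) x |]
```

In a separate module, `power4 = $(mkPower 4)` splices the generated fragment in during compilation. The interplay of quotation, splicing, and evaluation sketched here, at compile time and (in Lisp-style systems) at run time, is the behaviour the paper's foundational calculus aims to capture.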

Citations (7)
