
Distribution Theoretic Semantics for Non-Smooth Differentiable Programming (2207.05946v1)

Published 13 Jul 2022 in cs.PL

Abstract: With the widespread adoption of deep learning and gradient-descent-inspired optimization algorithms, differentiable programming has gained traction. It has since found applications in many other areas as well, such as scientific computing, robotics, and computer graphics. One of its notoriously difficult problems is interpreting programs that are not differentiable everywhere. In this work we define $\lambda_\delta$, a core calculus for non-smooth differentiable programs, and give its semantics using concepts from distribution theory, a well-established area of functional analysis. We also show that $\lambda_\delta$ has better equational properties than other existing semantics, and we use our semantics to reason about a simplified ray-tracing algorithm. Further, we relate our semantics to existing differentiable languages by providing translations to and from other differentiable semantic models. Finally, we provide a proof-of-concept implementation in PyTorch of the novel constructions in this paper.
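To illustrate the problem the abstract describes, the sketch below (an assumed example, not code from the paper) shows how standard reverse-mode autodiff in PyTorch handles a Heaviside step: distribution theory assigns the step's derivative a Dirac delta at the discontinuity, but a naive encoding via a boolean comparison yields no gradient signal at all.

```python
import torch

# Minimal sketch (assumption: not the authors' implementation).
# The derivative of the Heaviside step H(x) is the Dirac delta in
# distribution theory, but writing H as a comparison detaches the
# autograd graph, so the delta contribution is silently dropped.
x = torch.tensor(0.0, requires_grad=True)
step = (x > 0).float()      # Heaviside step written as a comparison
print(step.requires_grad)   # False: autodiff treats dH/dx as 0 almost everywhere
```

A distribution-theoretic semantics such as $\lambda_\delta$ instead records the point mass at the discontinuity, which is exactly the term needed when differentiating, for example, a ray-traced pixel intensity with respect to scene geometry, as in the paper's simplified ray-tracing case study.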

Citations (2)