
Efficient and Sound Differentiable Programming in a Functional Array-Processing Language (2212.10307v1)

Published 20 Dec 2022 in cs.PL, cs.LG, and cs.MS

Abstract: Automatic differentiation (AD) is a technique for computing the derivative of a function represented by a program, and is the de-facto standard for computing derivatives in many machine learning and optimisation software tools. Despite its practicality, the performance of differentiated programs is suboptimal, especially for functional languages and in the presence of vectors. We present an AD system for a higher-order functional array-processing language. The core functional language underlying this system simultaneously supports both source-to-source forward-mode AD and global optimisations such as loop transformations. In combination, gradient computation with forward-mode AD can be as efficient as reverse mode, and the Jacobian matrices required for numerical algorithms such as Gauss-Newton and Levenberg-Marquardt can be computed efficiently.
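To make the forward-mode AD idea concrete, here is a minimal dual-number sketch. This is a generic illustration of the technique the paper builds on, not the paper's actual source-to-source system; the `Dual` class and `derivative` helper are hypothetical names for this example.

```python
# Minimal forward-mode AD via dual numbers: each value carries its
# primal result together with a tangent (derivative) component.
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val   # primal value
        self.der = der   # tangent (derivative seed)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by seeding the input tangent with 1.0."""
    return f(Dual(x, 1.0)).der

# f(x) = x^2 + 3x  =>  f'(x) = 2x + 3, so f'(2) = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Computing a full gradient this way requires one forward pass per input dimension, which is why the paper's combination of forward mode with global loop optimisations is needed to match reverse-mode efficiency.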

Authors (6)
  1. Amir Shaikhha (21 papers)
  2. Mathieu Huot (15 papers)
  3. Shabnam Ghasemirad (3 papers)
  4. Andrew Fitzgibbon (21 papers)
  5. Simon Peyton Jones (14 papers)
  6. Dimitrios Vytiniotis (12 papers)
Citations (1)
