A Data-Centric Optimization Framework for Machine Learning (2110.10802v3)

Published 20 Oct 2021 in cs.LG, cs.DC, and cs.PF

Abstract: Rapid progress in deep learning is leading to a diverse set of quickly changing models, with a dramatically growing demand for compute. However, as frameworks specialize performance optimization to patterns in popular networks, they implicitly constrain novel and diverse models that drive progress in research. We empower deep learning researchers by defining a flexible and user-customizable pipeline for optimizing training of arbitrary deep neural networks, based on data movement minimization. The pipeline begins with standard networks in PyTorch or ONNX and transforms computation through progressive lowering. We define four levels of general-purpose transformations, from local intra-operator optimizations to global data movement reduction. These operate on a data-centric graph intermediate representation that expresses computation and data movement at all levels of abstraction, including expanding basic operators such as convolutions to their underlying computations. Central to the design is the interactive and introspectable nature of the pipeline. Every part is extensible through a Python API, and can be tuned interactively using a GUI. We demonstrate competitive performance or speedups on ten different networks, with interactive optimizations discovering new opportunities in EfficientNet.

Authors (6)
  1. Oliver Rausch (9 papers)
  2. Tal Ben-Nun (53 papers)
  3. Nikoli Dryden (21 papers)
  4. Andrei Ivanov (17 papers)
  5. Shigang Li (25 papers)
  6. Torsten Hoefler (203 papers)
Citations (16)
