
Data Dwarfs: A Lens Towards Fully Understanding Big Data and AI Workloads (1802.00699v2)

Published 1 Feb 2018 in cs.DC and cs.PF

Abstract: The complexity and diversity of big data and AI workloads make them difficult to understand. This paper proposes a new approach to characterizing big data and AI workloads. We consider each big data and AI workload as a pipeline of one or more classes of units of computation performed on different initial or intermediate data inputs. Each class of unit of computation captures common requirements while being reasonably divorced from individual implementations, and hence we call it a data dwarf. For the first time, among a wide variety of big data and AI workloads, we identify eight data dwarfs that take up most of the run time: Matrix, Sampling, Logic, Transform, Set, Graph, Sort, and Statistic. We implement the eight data dwarfs on different software stacks as the micro-benchmarks of an open-source big data and AI benchmark suite, and perform a comprehensive characterization of those data dwarfs from the perspectives of data size, type, source, and pattern as a lens towards fully understanding big data and AI workloads.
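The abstract's core idea — a workload modeled as a pipeline of dwarf-class units of computation — can be sketched as follows. This is an illustrative toy model only, not the paper's benchmark implementation; the pipeline structure, stage functions, and the example workload are assumptions made for clarity.

```python
from enum import Enum, auto

# The eight data dwarf classes identified in the paper.
class Dwarf(Enum):
    MATRIX = auto()
    SAMPLING = auto()
    LOGIC = auto()
    TRANSFORM = auto()
    SET = auto()
    GRAPH = auto()
    SORT = auto()
    STATISTIC = auto()

def run_pipeline(data, stages):
    """Run a workload modeled as a pipeline of (dwarf class, unit of
    computation) stages, each consuming the previous stage's output."""
    for dwarf, fn in stages:
        data = fn(data)  # intermediate data input for the next stage
    return data

# Toy example: a "top-k average" workload viewed as Sort -> Statistic.
pipeline = [
    (Dwarf.SORT, lambda xs: sorted(xs, reverse=True)),
    (Dwarf.STATISTIC, lambda xs: sum(xs[:3]) / 3),  # mean of the top 3
]

result = run_pipeline([5, 1, 9, 3, 7], pipeline)  # mean of 9, 7, 5 -> 7.0
```

The point of the abstraction is that the same `Dwarf.SORT` stage could be backed by different implementations (e.g. on different software stacks) while the workload's dwarf-level structure stays the same.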

Citations (4)
