
cuSZ-$i$: High-Ratio Scientific Lossy Compression on GPUs with Optimized Multi-Level Interpolation (2312.05492v6)

Published 9 Dec 2023 in cs.DC

Abstract: Error-bounded lossy compression is a critical technique for significantly reducing scientific data volumes. Compared to CPU-based compressors, GPU-based compressors deliver substantially higher throughput, making them a better fit for today's HPC applications. However, existing GPU-based compressors suffer from low compression ratios and poor reconstruction quality, which severely restricts their applicability. To overcome these limitations, we introduce cuSZ-$i$, a new GPU-based error-bounded scientific lossy compressor, with the following contributions: (1) a novel GPU-optimized interpolation-based prediction method that significantly improves the compression ratio and the quality of decompressed data; (2) an optimized Huffman encoding module for better efficiency; and (3) the first integration of NVIDIA's Bitcomp lossless compressor as an additional compression-ratio-enhancing module. Evaluations show that cuSZ-$i$ significantly outperforms the latest GPU-based lossy compressors in compression ratio under the same error bound (and hence the same target quality), with a 476% advantage over the second-best. This translates into improved performance for cuSZ-$i$ in several real-world use cases.
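
To make the prediction-plus-quantization idea concrete, below is a minimal CPU-side NumPy sketch of multi-level interpolation prediction with linear-scaling quantization in the style of SZ-family compressors. It is only an illustrative 1D linear-interpolation analogue: the function names (`compress_1d`, `quantize`) are hypothetical, and this is not cuSZ-$i$'s GPU implementation, which uses its own optimized multi-level interpolation and feeds the resulting quantization codes to Huffman encoding and, optionally, NVIDIA Bitcomp.

```python
import numpy as np

def quantize(pred, val, eb):
    # Linear-scaling quantization: a bin width of 2*eb guarantees
    # |val - recon| <= eb for the reconstructed value.
    code = int(round((val - pred) / (2.0 * eb)))
    recon = pred + code * 2.0 * eb
    return code, recon

def compress_1d(data, eb):
    # Hypothetical helper, not cuSZ-i's API: coarse-to-fine multi-level
    # interpolation prediction on a 1D array with per-point quantization.
    n = len(data)
    recon = np.empty(n, dtype=np.float64)
    codes = np.zeros(n, dtype=np.int64)
    recon[0] = data[0]            # anchor value, kept losslessly
    stride = 1
    while stride * 2 < n:         # coarsest stride: largest power of 2 < n
        stride *= 2
    while stride >= 1:
        # Points at odd multiples of `stride` are predicted from their
        # already-reconstructed neighbors on the 2*stride grid.
        for i in range(stride, n, 2 * stride):
            if i + stride < n:
                pred = 0.5 * (recon[i - stride] + recon[i + stride])
            else:
                pred = recon[i - stride]   # boundary: nearest-neighbor fallback
            codes[i], recon[i] = quantize(pred, data[i], eb)
        stride //= 2
    return codes, recon

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.standard_normal(4096))   # smooth-ish synthetic signal
    eb = 1e-2
    codes, recon = compress_1d(x, eb)
    print("max error:", np.max(np.abs(x - recon)), "<= eb:", eb)
    # The mostly near-zero quantization codes would then go to entropy
    # coding (e.g., Huffman) and an optional lossless pass.
```

Because the predictor uses only already-reconstructed values, decompression can replay the same traversal and recover each point as `pred + code * 2 * eb`, preserving the pointwise error bound.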

Authors (13)
  1. Jinyang Liu (51 papers)
  2. Jiannan Tian (30 papers)
  3. Shixun Wu (10 papers)
  4. Sheng Di (58 papers)
  5. Boyuan Zhang (36 papers)
  6. Yafan Huang (4 papers)
  7. Kai Zhao (160 papers)
  8. Guanpeng Li (10 papers)
  9. Dingwen Tao (60 papers)
  10. Zizhong Chen (41 papers)
  11. Franck Cappello (60 papers)
  12. Robert Underwood (26 papers)
  13. Jiajun Huang (30 papers)
Citations (1)
