Out-of-core singular value decomposition (1907.06470v1)

Published 15 Jul 2019 in cs.MS, cs.NA, and math.NA

Abstract: Singular value decomposition (SVD) is a standard matrix factorization technique that produces optimal low-rank approximations of matrices. It has diverse applications, including machine learning, data science, and signal processing. However, many common problems involve very large matrices that cannot fit in the main memory of commodity computers, making it impractical to use standard SVD algorithms that assume fast random access or large amounts of space for intermediate calculations. To address this issue, we have implemented an out-of-core (external memory) randomized SVD solution that is fully scalable and efficiently parallelizable. This solution factors both dense and sparse matrices of arbitrarily large size within arbitrarily small memory limits, efficiently using out-of-core storage as needed. It uses an innovative technique for partitioning matrices that lends itself to out-of-core and parallel processing, memory and I/O planning, automatic load balancing, and performance tuning, and it makes possible a number of other practical enhancements to the current state of the art. Furthermore, by using persistent external storage (generally HDDs or SSDs), users can resume interrupted operations without having to recalculate previously performed steps, solving a major practical problem in factoring very large matrices.
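To make the idea concrete, below is a minimal sketch of how a randomized SVD can be computed while streaming the matrix in row blocks, so that only a small slice ever sits in main memory. This is not the paper's implementation or its partitioning scheme; it illustrates the general randomized-SVD recipe over a memory-mapped array, and the file name, shapes, and block size are hypothetical.

```python
# Block-wise randomized SVD over an out-of-core (memory-mapped) matrix.
# A sketch only, assuming the matrix fits on disk and supports row slicing.
import numpy as np

def blockwise_randomized_svd(A, rank, oversample=10, block_rows=1024):
    """Approximate the top-`rank` SVD of A, reading A in blocks of rows.

    A may be an np.memmap (or any array-like with row slicing), so the
    full matrix never needs to fit in main memory at once.
    """
    m, n = A.shape
    k = rank + oversample
    rng = np.random.default_rng(0)
    Omega = rng.standard_normal((n, k))          # random test matrix

    # Pass 1: Y = A @ Omega, accumulated block by block.
    Y = np.empty((m, k))
    for start in range(0, m, block_rows):
        stop = min(start + block_rows, m)
        Y[start:stop] = A[start:stop] @ Omega

    Q, _ = np.linalg.qr(Y)                        # orthonormal range basis

    # Pass 2: B = Q.T @ A, again accumulated block by block.
    B = np.zeros((k, n))
    for start in range(0, m, block_rows):
        stop = min(start + block_rows, m)
        B += Q[start:stop].T @ A[start:stop]

    # B is only k x n, so an ordinary in-core SVD finishes the job.
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]

if __name__ == "__main__":
    # Hypothetical usage: build a small on-disk matrix, then factor it.
    m, n = 20_000, 500
    A = np.memmap("big_matrix.dat", dtype=np.float64, mode="w+", shape=(m, n))
    A[:] = np.random.default_rng(1).standard_normal((m, n))
    U, s, Vt = blockwise_randomized_svd(A, rank=20)
    print("Top singular values:", s[:5])
```

The two passes over the rows of A are what make the approach amenable to external storage and parallelism: each block product is independent, and intermediate results (Y, Q, B) are small relative to A. The paper's contribution goes further, with a partitioning scheme, I/O planning, load balancing, and resumable computation from persistent storage.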

Citations (8)
