Memory cost of tensor reshaping for lossy Tucker compression

Determine whether, in the setting of lossy compression via Tucker decomposition, rearranging the elements of an Nth-order tensor into an Mth-order tensor with smaller mode dimensions (as performed in the tenSVD algorithm) results in a higher memory cost than retaining the original Nth-order structure. Specifically, compare the storage requirements, defined as the sum of the sizes of the factor matrices plus the size of the core tensor: Σ_n I_n·r_n + Π_n r_n for the original Nth-order tensor versus Σ_m J_m·r_m + Π_m r_m for the reshaped Mth-order tensor. Ascertain whether the reshaped configuration increases, decreases, or preserves the memory cost.

Background

The paper introduces tenSVD, a tensor-based compression algorithm that first reshapes an Nth-order tensor into an Mth-order tensor with smaller mode dimensions, then applies Tucker decomposition. For lossy compression, the authors recall that storage cost is governed by the sizes of the component matrices and the core tensor.

In discussing memory cost, the authors explicitly note uncertainty about how reshaping affects storage. They provide formulas for memory cost in both the original and reshaped configurations, and observe that core tensor storage often dominates, motivating a strategy to retain only the largest-magnitude core entries. However, they do not settle whether reshaping increases or decreases total memory cost in general, leaving this question open.
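The two storage formulas can be compared numerically for any concrete choice of dimensions and ranks. A minimal sketch in Python, using hypothetical dimensions and multilinear ranks (not taken from the paper, which leaves the general comparison open):

```python
from math import prod

def tucker_storage(dims, ranks):
    """Tucker storage cost: sum of factor-matrix sizes plus core size,
    i.e. sum_n I_n * r_n + prod_n r_n."""
    return sum(i * r for i, r in zip(dims, ranks)) + prod(ranks)

# Original 3rd-order tensor, 100 x 100 x 100, with hypothetical ranks (10, 10, 10)
orig = tucker_storage([100, 100, 100], [10, 10, 10])   # 3*100*10 + 10^3 = 4000

# Same 10^6 elements reshaped to a 6th-order tensor, 10 x ... x 10,
# with hypothetical ranks (5, ..., 5)
resh = tucker_storage([10] * 6, [5] * 6)               # 6*10*5 + 5^6 = 15925

print(orig, resh)
```

With these particular ranks the reshaped configuration costs more, driven by the core term Π_m r_m, which is consistent with the authors' observation that core storage often dominates. Other rank choices can reverse the comparison, which is precisely why the question remains open in general.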

References

In the case where the elements are reordered into a higher-order tensor, it is not clear whether the memory cost will be higher.

tenSVD algorithm for compression  (2505.21686 - Gallo, 27 May 2025) in Section 3.2 (Memory Cost)