
Further limitations of the known approaches for matrix multiplication (1712.07246v1)

Published 19 Dec 2017 in cs.CC and cs.DS

Abstract: We consider the techniques behind the current best algorithms for matrix multiplication. Our results are threefold. (1) We provide a unifying framework, showing that all known matrix multiplication running times since 1986 can be achieved from a single very natural tensor: the structural tensor $T_q$ of addition modulo an integer $q$. (2) We show that if one applies a generalization of the known techniques (arbitrary zeroing out of tensor powers to obtain independent matrix products in order to use the asymptotic sum inequality of Sch\"{o}nhage) to an arbitrary monomial degeneration of $T_q$, then there is an explicit lower bound, depending on $q$, on the bound on the matrix multiplication exponent $\omega$ that one can achieve. We also show upper bounds on the value $\alpha$ that one can achieve, where $\alpha$ is such that $n \times n^{\alpha} \times n$ matrix multiplication can be computed in $n^{2+o(1)}$ time. (3) We show that our lower bound on $\omega$ approaches $2$ as $q$ goes to infinity. This suggests a promising approach to improving the bound on $\omega$: for variable $q$, find a monomial degeneration of $T_q$ which, using the known techniques, produces an upper bound on $\omega$ as a function of $q$. Then, take $q$ to infinity. It is not ruled out, and hence possible, that one can obtain $\omega = 2$ in this way.
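The structural tensor $T_q$ of addition modulo $q$ is concrete enough to write down directly. A minimal sketch, assuming the common convention that $T_q$ has a $1$ in position $(a, b, c)$ exactly when $a + b \equiv c \pmod{q}$ (the function name `structural_tensor` is illustrative, not from the paper):

```python
import numpy as np

def structural_tensor(q):
    """Structural tensor T_q of addition modulo q:
    T[a, b, c] = 1 iff a + b = c (mod q), else 0."""
    T = np.zeros((q, q, q), dtype=int)
    for a in range(q):
        for b in range(q):
            T[a, b, (a + b) % q] = 1
    return T

T3 = structural_tensor(3)
print(T3.sum())  # q^2 = 9: exactly one nonzero entry per pair (a, b)
```

Each slice of $T_q$ along the third index is a permutation matrix (a cyclic shift), which reflects the group structure of $\mathbb{Z}_q$ that the paper's framework exploits.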

Citations (39)
