Combinatorial and Recurrent Approaches for Efficient Matrix Inversion: Sub-cubic algorithms leveraging Fast Matrix products (2307.07611v1)

Published 14 Jul 2023 in math.NA, cs.NA, and math.CO

Abstract: In this paper, we introduce novel fast matrix inversion algorithms that leverage triangular decomposition and a recurrent formalism, incorporating Strassen's fast matrix multiplication. We place particular emphasis on triangular matrices, for which we propose a novel computational approach based on combinatorial techniques for finding the inverse of a general non-singular triangular matrix. Unlike iterative methods, our combinatorial approach for (block) triangular-type matrices enables direct computation of the matrix inverse through a nonlinear combination of carefully selected combinatorial entries of the initial matrix. This characteristic makes the proposed method fully parallelizable, offering significant potential for efficient implementation on parallel computing architectures. Our approach also yields recurrent relations for constructing the matrix inverse. By combining the (block) combinatorial approach with a recursive triangular split method for inverting triangular matrices, we develop potentially competitive algorithms that strike a balance between efficiency and accuracy. We provide rigorous mathematical proofs for the newly presented methods and conduct extensive numerical tests to showcase their applicability and efficiency. The evaluation and experimental results presented in the paper confirm the practical utility of the proposed algorithms, demonstrating that they outperform classical approaches in computational efficiency.
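
To make the recursive split concrete, below is a minimal Python sketch of the classical 2×2 block recursion for inverting a non-singular lower-triangular matrix, with the two matrix products marking where a Strassen-type fast multiplication would be plugged in. This illustrates the general scheme the abstract describes, not the paper's exact algorithm; the function name, the `base_size` cutoff, and the use of NumPy's `@` operator are illustrative assumptions.

```python
import numpy as np

def block_triangular_inverse(L, base_size=64):
    """Invert a non-singular lower-triangular matrix L via the classical
    2x2 block-split recursion.  For L = [[L11, 0], [L21, L22]],

        inv(L) = [[inv(L11),                   0       ],
                  [-inv(L22) @ L21 @ inv(L11), inv(L22)]],

    so each level needs two half-size inversions and two matrix products.
    """
    n = L.shape[0]
    if n <= base_size:
        # Base case: dense solve on a small block.
        return np.linalg.solve(L, np.eye(n))
    m = n // 2
    L11, L21, L22 = L[:m, :m], L[m:, :m], L[m:, m:]
    X11 = block_triangular_inverse(L11, base_size)  # independent of X22,
    X22 = block_triangular_inverse(L22, base_size)  # so both can run in parallel
    # The two products below are where a Strassen-type fast matrix
    # multiplication would be substituted to reach sub-cubic cost.
    X21 = -X22 @ (L21 @ X11)
    X = np.zeros_like(L, dtype=float)
    X[:m, :m] = X11
    X[m:, m:] = X22
    X[m:, :m] = X21
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 300
    # Well-conditioned lower-triangular test matrix.
    L = np.tril(rng.standard_normal((n, n))) + n * np.eye(n)
    X = block_triangular_inverse(L)
    print(np.max(np.abs(L @ X - np.eye(n))))  # residual should be tiny
```

Since each level performs two independent half-size inversions plus two multiplications, the cost recurrence I(n) = 2 I(n/2) + O(M(n)) inherits the exponent of the multiplication routine M, which is how a Strassen-style product yields a sub-cubic inversion overall.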
