$(\min,+)$ Matrix and Vector Products for Inputs Decomposable into Few Monotone Subsequences (2309.01136v1)

Published 3 Sep 2023 in cs.DS and cs.CC

Abstract: We study the time complexity of computing the $(\min,+)$ matrix product of two $n\times n$ integer matrices in terms of $n$ and the number of monotone subsequences the rows of the first matrix and the columns of the second matrix can be decomposed into. In particular, we show that if each row of the first matrix can be decomposed into at most $m_1$ monotone subsequences and each column of the second matrix can be decomposed into at most $m_2$ monotone subsequences, such that all the subsequences are non-decreasing or all of them are non-increasing, then the $(\min,+)$ product of the matrices can be computed in $O(m_1m_2n^{2.569})$ time. On the other hand, we observe that if all the rows of the first matrix are non-decreasing and all the columns of the second matrix are non-increasing, or {\em vice versa}, then this case is as hard as the general one. Similarly, we also study the time complexity of computing the $(\min,+)$ convolution of two $n$-dimensional integer vectors in terms of $n$ and the number of monotone subsequences the two vectors can be decomposed into. We show that if the first vector can be decomposed into at most $m_1$ monotone subsequences and the second vector can be decomposed into at most $m_2$ monotone subsequences, such that all the subsequences of the first vector are non-decreasing and all the subsequences of the second vector are non-increasing, or {\em vice versa}, then their $(\min,+)$ convolution can be computed in $\tilde{O}(m_1m_2n^{1.5})$ time. On the other hand, the case when both vectors are non-decreasing or both of them are non-increasing is as hard as the general case.
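
For concreteness, here is a minimal reference sketch (not from the paper) of the two $(\min,+)$ operations studied above, implemented naively in $O(n^3)$ and $O(n^2)$ time; these are the baselines that the paper's decomposition-sensitive bounds improve on for suitable inputs. Function names are illustrative.

```python
def min_plus_product(A, B):
    """Naive (min,+) product of n x n matrices: C[i][j] = min_k (A[i][k] + B[k][j])."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def min_plus_convolution(a, b):
    """Naive (min,+) convolution of n-dim vectors: c[k] = min_{i+j=k} (a[i] + b[j])."""
    n = len(a)
    return [min(a[i] + b[k - i]
                for i in range(max(0, k - n + 1), min(k, n - 1) + 1))
            for k in range(2 * n - 1)]

# Small usage check:
# min_plus_product([[0, 3], [2, 1]], [[1, 0], [4, 2]]) == [[1, 0], [3, 2]]
# min_plus_convolution([0, 1], [2, 5]) == [2, 3, 6]
```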
