
Efficiently updating a covariance matrix and its LDL decomposition (2002.08831v1)

Published 20 Feb 2020 in math.NA, cs.NA, and stat.CO

Abstract: Equations are presented which efficiently update or downdate the covariance matrix of a large number of $m$-dimensional observations. Updates and downdates to the covariance matrix, as well as mixed updates/downdates, are shown to be rank-$k$ modifications, where $k$ is the number of new observations added plus the number of old observations removed. As a result, the update and downdate equations decrease the required number of multiplications for a modification to $\Theta((k+1)m^2)$ instead of $\Theta((n+k+1)m^2)$ or $\Theta((n-k+1)m^2)$, where $n$ is the number of initial observations. Having the rank-$k$ formulas for the updates also allows a number of other known identities to be applied, providing a way of applying updates and downdates directly to the inverse and decompositions of the covariance matrix. To illustrate, we provide an efficient algorithm for applying the rank-$k$ update to the LDL decomposition of a covariance matrix.
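The kind of low-rank factorization update the abstract refers to can be illustrated with the classic rank-1 LDL update (in the style of Gill, Golub, Murray, and Saunders); a rank-$k$ modification can then be applied as $k$ successive rank-1 updates, each costing $\Theta(m^2)$. This is a sketch of the standard technique, not the paper's specific algorithm, and the function name `ldl_rank1_update` is hypothetical:

```python
import numpy as np

def ldl_rank1_update(L, d, alpha, z):
    """In-place rank-1 update of a unit-lower-triangular L and diagonal d,
    so that L @ diag(d) @ L.T becomes A + alpha * z z^T.

    Classic O(m^2) algorithm (Gill/Golub/Murray/Saunders style); this is an
    illustrative sketch, not the paper's rank-k algorithm. Downdates
    (alpha < 0) assume the result remains positive definite.
    """
    m = len(d)
    z = z.astype(float)          # work on a copy; z is consumed column by column
    for j in range(m):
        p = z[j]
        d_new = d[j] + alpha * p * p
        beta = alpha * p / d_new
        alpha = alpha * d[j] / d_new
        d[j] = d_new
        for r in range(j + 1, m):
            z[r] -= p * L[r, j]
            L[r, j] += beta * z[r]
    return L, d

# Demo: update the LDL factorization of a covariance-like SPD matrix.
rng = np.random.default_rng(0)
m = 5
B = rng.standard_normal((m, m))
A = B @ B.T + m * np.eye(m)      # symmetric positive definite
C = np.linalg.cholesky(A)        # A = C C^T
d = np.diag(C) ** 2              # LDL^T from Cholesky: D = diag(C)^2
L = C / np.diag(C)               # unit lower triangular factor
z = rng.standard_normal(m)
ldl_rank1_update(L, d, 0.5, z)
assert np.allclose((L * d) @ L.T, A + 0.5 * np.outer(z, z))
```

Each rank-1 pass touches only the strictly lower triangle once, which is where the $\Theta((k+1)m^2)$ cost in the abstract comes from when $k$ such updates are chained.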


Authors (2)
