
Decompositions of Dependence for High-Dimensional Extremes (1612.07190v4)

Published 20 Dec 2016 in stat.ME

Abstract: Employing the framework of regular variation, we propose two decompositions which help to summarize and describe high-dimensional tail dependence. Via transformation, we define a vector space on the positive orthant, yielding the notion of basis. With a suitably chosen transformation, we show that transformed-linear operations applied to regularly varying random vectors preserve regular variation. Rather than model regular variation's angular measure, we summarize tail dependence via a matrix of pairwise tail dependence metrics. This matrix is positive semidefinite, and eigendecomposition allows one to interpret tail dependence via the resulting eigenbasis. Additionally, this matrix is completely positive, and a resulting decomposition allows one to easily construct regularly varying random vectors which share the same pairwise tail dependencies. We illustrate our methods with Swiss rainfall data and financial return data.
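
The following is a minimal, illustrative Python sketch of the general idea of summarizing pairwise tail dependence in a matrix and inspecting its eigenstructure. It uses a simple empirical conditional-exceedance (χ-type) estimator at a high marginal quantile as a stand-in; this is not the paper's transformed-linear construction or its tail pairwise dependence matrix estimator, and the function name, quantile level, and simulated mixing setup are all hypothetical choices for the example. Unlike the matrix described in the abstract, this empirical stand-in is not guaranteed to be positive semidefinite in finite samples.

```python
import numpy as np

def pairwise_tail_dependence(X, q=0.95):
    """Estimate a matrix of pairwise upper-tail dependence measures.

    Illustrative stand-in: the empirical conditional exceedance
    probability P(X_j > u_j | X_i > u_i) with u_k the q-th marginal
    quantile.  X is an (n, d) array of observations.
    """
    n, d = X.shape
    u = np.quantile(X, q, axis=0)          # marginal thresholds
    exceed = X > u                          # (n, d) exceedance indicators
    p_marginal = exceed.mean(axis=0)        # roughly 1 - q per margin
    Sigma = np.empty((d, d))
    for i in range(d):
        for j in range(d):
            p_joint = np.mean(exceed[:, i] & exceed[:, j])
            Sigma[i, j] = p_joint / p_marginal[i]
    # Symmetrize to guard against finite-sample asymmetry.
    return 0.5 * (Sigma + Sigma.T)

# Example: heavy-tailed factors mixed by a nonnegative matrix.
rng = np.random.default_rng(0)
Z = rng.pareto(2.0, size=(10_000, 5))       # independent heavy-tailed factors
A = np.abs(rng.normal(size=(5, 5)))         # nonnegative mixing matrix
X = Z @ A.T

Sigma = pairwise_tail_dependence(X, q=0.95)
eigvals, eigvecs = np.linalg.eigh(Sigma)    # Sigma is symmetric
print("estimated pairwise tail dependence matrix:\n", np.round(Sigma, 3))
print("eigenvalues (ascending):", np.round(eigvals, 3))
```

The leading eigenvectors of such a matrix give a coarse, interpretable summary of which components tend to be large together, which is the spirit of the eigenbasis interpretation described in the abstract.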

Citations (72)
