
Optimal use of auxiliary information: information geometry and empirical process (2107.00563v1)

Published 1 Jul 2021 in math.ST and stat.TH

Abstract: We incorporate into the empirical measure the auxiliary information given by a finite collection of expectations, in an optimal, information-geometric way. This allows us to unify several methods that exploit side information and to define a unique informed empirical measure. These methods are shown to share the same asymptotic properties. We then study the informed empirical process under true information. We establish the Glivenko-Cantelli and Donsker theorems for the informed empirical measure under minimal assumptions, and we quantify the asymptotic uniform variance reduction. Moreover, we prove that the informed empirical process is more concentrated than the classical empirical process for all sufficiently large $n$. Finally, as an illustration of the variance reduction, we apply some of these results to informed empirical quantiles.
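The abstract's core idea — reweighting the empirical measure so that it matches known expectations, via an information-geometric (Kullback-Leibler) projection — can be sketched numerically. The snippet below is an illustrative exponential-tilting construction, not the paper's exact estimator: the function name `informed_weights`, the single scalar constraint, and the Newton solver are all assumptions made for the example.

```python
import numpy as np

def informed_weights(g_values, target, n_iter=50):
    """Exponentially tilted empirical weights matching a known expectation.

    Finds w_i proportional to exp(lam * g(x_i)) such that
    sum_i w_i * g(x_i) = target, i.e. the I-projection (minimum
    KL divergence) of the uniform empirical weights 1/n onto the
    set of measures satisfying the side information E[g(X)] = target.
    """
    g = np.asarray(g_values, dtype=float)
    lam = 0.0
    w = np.full(g.shape, 1.0 / g.size)
    for _ in range(n_iter):  # Newton iteration on the dual parameter lam
        w = np.exp(lam * g)
        w /= w.sum()
        mean = w @ g
        var = w @ (g - mean) ** 2  # d(mean)/d(lam) for the tilted family
        if var < 1e-12:
            break
        lam += (target - mean) / var
    return w

# Example: side information says the true mean E[X] = 1 is known exactly.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1000)
w = informed_weights(x, target=1.0)
informed_cdf_at_1 = w @ (x <= 1.0)  # informed empirical CDF evaluated at t = 1
```

The informed empirical measure puts weight `w[i]` on observation `x[i]` instead of the uniform `1/n`; plugging these weights into the empirical CDF, as in the last line, gives the kind of informed empirical process whose variance reduction the paper quantifies.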


Authors (1)
