
A note on the normal approximation error for randomly weighted self-normalized sums (1109.5812v1)

Published 27 Sep 2011 in math.PR

Abstract: Let $\mathbf{X}=\{X_n\}_{n\geq 1}$ and $\mathbf{Y}=\{Y_n\}_{n\geq 1}$ be two independent random sequences. We obtain rates of convergence to the normal law of randomly weighted self-normalized sums $$ \psi_n(\mathbf{X},\mathbf{Y})=\sum_{i=1}^{n}X_iY_i/V_n,\quad V_n=\sqrt{Y_1^2+\dots+Y_n^2}. $$ These rates are seen to hold for the convergence of a number of important statistics, such as, for instance, Student's $t$-statistic or the empirical correlation coefficient.
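The statistic $\psi_n$ in the abstract can be simulated directly. The sketch below (an illustration, not code from the paper) draws independent sequences $X$ and $Y$, forms the randomly weighted self-normalized sum, and checks that its empirical distribution is close to standard normal. The choice of an exponential law for the weights $Y_i$ is an arbitrary assumption for the demonstration; when the $X_i$ are i.i.d. standard normal and independent of $\mathbf{Y}$, $\psi_n$ is exactly $N(0,1)$ conditionally on the weights, while for general centered unit-variance $X_i$ the normal law holds only approximately, at the rates the paper quantifies.

```python
import numpy as np

rng = np.random.default_rng(0)

def psi_n(x, y):
    # Randomly weighted self-normalized sum:
    # psi_n(X, Y) = (X_1 Y_1 + ... + X_n Y_n) / V_n,
    # where V_n = sqrt(Y_1^2 + ... + Y_n^2).
    v_n = np.sqrt(np.sum(y**2))
    return np.sum(x * y) / v_n

n, trials = 500, 2000
samples = np.array([
    # X_i: i.i.d. standard normal; Y_i: i.i.d. exponential weights
    # (the exponential choice is illustrative, not from the paper).
    psi_n(rng.standard_normal(n), rng.exponential(size=n))
    for _ in range(trials)
])

# Empirical mean and standard deviation should be near 0 and 1,
# consistent with a standard normal limit.
print(samples.mean(), samples.std())
```

Replacing the standard normal $X_i$ with another centered, unit-variance law (e.g. centered Bernoulli) makes the normality only approximate and lets one observe the finite-$n$ error the paper bounds.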
