The logarithmically averaged Chowla and Elliott conjectures for two-point correlations (1509.05422v4)

Published 17 Sep 2015 in math.NT

Abstract: Let $\lambda$ denote the Liouville function. The Chowla conjecture, in the two-point correlation case, asserts that $$ \sum_{n \leq x} \lambda(a_1 n + b_1) \lambda(a_2 n+b_2) = o(x) $$ as $x \to \infty$, for any fixed natural numbers $a_1,a_2,b_1,b_2$ with $a_1b_2-a_2b_1 \neq 0$. In this paper we establish the logarithmically averaged version $$ \sum_{x/\omega(x) < n \leq x} \frac{\lambda(a_1 n + b_1) \lambda(a_2 n+b_2)}{n} = o(\log \omega(x)) $$ of the Chowla conjecture as $x \to \infty$, where $1 \leq \omega(x) \leq x$ is an arbitrary function of $x$ that goes to infinity as $x \to \infty$, thus breaking the "parity barrier" for this problem. Our main tools are the multiplicativity of the Liouville function at small primes, a recent result of Matomäki, Radziwiłł, and the author on the averages of modulated multiplicative functions in short intervals, concentration of measure inequalities, the Hardy-Littlewood circle method combined with a restriction theorem for the primes, and a novel "entropy decrement argument". Most of these ingredients are also available (in principle, at least) for the higher order correlations, with the main missing ingredient being the need to control short sums of multiplicative functions modulated by local nilsequences. Our arguments also extend to more general bounded multiplicative functions than the Liouville function $\lambda$, leading to a logarithmically averaged version of the Elliott conjecture in the two-point case. In a subsequent paper we will use this version of the Elliott conjecture to affirmatively settle the Erdős discrepancy problem.
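The logarithmically averaged sum in the abstract can be explored numerically. The sketch below, which is not taken from the paper, evaluates the normalized quantity $\frac{1}{\log \omega(x)} \sum_{x/\omega(x) < n \leq x} \lambda(n)\lambda(n+1)/n$ for the simplest case $a_1 = a_2 = 1$, $b_1 = 0$, $b_2 = 1$, with the arbitrary illustrative choice $\omega(x) = \sqrt{x}$; the function names (`liouville_sieve`, `log_averaged_correlation`) are hypothetical. Since $\sum_{x/\omega(x) < n \leq x} 1/n \approx \log \omega(x)$, the printed ratio is trivially bounded, and the theorem asserts that it tends to $0$ as $x \to \infty$.

```python
# Minimal numerical sketch (not from the paper) of the logarithmically
# averaged two-point Chowla correlation for lambda(n) * lambda(n+1).
# Assumptions: omega(x) = sqrt(x) is an arbitrary choice; any omega(x) -> infinity works.

import math


def liouville_sieve(limit: int) -> list:
    """Return lam with lam[n] = Liouville function lambda(n) for 1 <= n <= limit,
    computed via a smallest-prime-factor sieve (lam[0] is unused)."""
    lam = [1] * (limit + 1)
    spf = [0] * (limit + 1)          # smallest prime factor of each n
    for i in range(2, limit + 1):
        if spf[i] == 0:              # i is prime
            for j in range(i, limit + 1, i):
                if spf[j] == 0:
                    spf[j] = i
    for i in range(2, limit + 1):
        # i has exactly one more prime factor (with multiplicity) than i / spf[i]
        lam[i] = -lam[i // spf[i]]
    return lam


def log_averaged_correlation(x: int) -> float:
    """Approximate (1 / log omega(x)) * sum_{x/omega(x) < n <= x} lambda(n) lambda(n+1) / n."""
    omega = math.sqrt(x)                     # illustrative choice of omega(x)
    lower = int(x / omega)
    lam = liouville_sieve(x + 1)             # need lambda up to x + 1 for lambda(n + 1)
    total = sum(lam[n] * lam[n + 1] / n for n in range(lower + 1, x + 1))
    return total / math.log(omega)


if __name__ == "__main__":
    # The normalized correlation should drift toward 0 as x grows.
    for x in (10_000, 100_000, 1_000_000):
        print(f"x = {x:>9}: normalized correlation = {log_averaged_correlation(x):+.4f}")
```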
