Helson's conjecture for smooth numbers


Let $\Psi(x,y)$ denote the count of $y$-smooth numbers below $x$ and $P(n)$ denote the largest prime factor of $n$. We prove that for $f$ a Steinhaus random multiplicative function, the partial sums over $y$-smooth numbers always enjoy better than square-root cancellation, in the sense that $$ \mathbb{E} \Big|\sum_{\substack{1\leq n \leq x\\ P(n) \leq y}} f(n) \Big| = o\left( \Psi(x,y)^{1/2} \right),$$ uniformly in the entire range $2 \leq y \leq x$. The bounds are quantitative and give a large saving when $y$ is not too close to $x$.


💡 Research Summary

The paper studies the behavior of partial sums of a Steinhaus random multiplicative function f over the set of y‑smooth numbers up to x, i.e. numbers whose largest prime factor does not exceed y. The main object is
 S(x,y) = ∑_{1≤n≤x, P(n)≤y} f(n)
and the authors investigate the first absolute moment E|S(x,y)|. Helson conjectured that for the full sum (y = x) one should have better than square-root cancellation, i.e. E|∑_{n≤x} f(n)| = o(√x). This was proved by Harper, who obtained the sharp order ≍ √x/(log log x)^{1/4}. The present work extends the phenomenon to the whole family of smooth numbers, proving that for every 2 ≤ y ≤ x, E|S(x,y)| = o(Ψ(x,y)^{1/2}), uniformly in the whole range. Here Ψ(x,y) = #{n ≤ x : P(n) ≤ y} is the usual smooth-number counting function.
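The objects above can be made concrete with a small brute-force simulation (not from the paper; all parameter choices and helper names here are illustrative). It counts Ψ(x,y) directly and Monte-Carlo estimates E|S(x,y)| for a Steinhaus f, i.e. f(p) i.i.d. uniform on the unit circle, extended completely multiplicatively; the printed ratio to √Ψ(x,y) is the quantity the theorem says tends to 0:

```python
import cmath
import math
import random

def smooth_numbers(x, y):
    """All y-smooth integers n <= x (largest prime factor at most y)."""
    out = []
    for n in range(1, x + 1):
        m, p = n, 2
        while p <= y and m > 1:
            while m % p == 0:
                m //= p
            p += 1
        if m == 1:
            out.append(n)
    return out

def sample_abs_S(smooth, y, rng):
    """One sample of |S(x,y)| for a Steinhaus random multiplicative f."""
    # f(p) uniform on the unit circle, i.i.d. over primes p <= y
    fp = {p: cmath.exp(2j * math.pi * rng.random())
          for p in range(2, y + 1)
          if all(p % q for q in range(2, math.isqrt(p) + 1))}
    total = 0j
    for n in smooth:
        val, m = 1 + 0j, n
        for p, z in fp.items():      # f(n) = prod of f(p)^(v_p(n))
            while m % p == 0:
                val *= z
                m //= p
        total += val
    return abs(total)

x, y = 1000, 10                      # small illustrative parameters
rng = random.Random(1)
smooth = smooth_numbers(x, y)
psi = len(smooth)                    # Psi(x, y)
trials = 200
est = sum(sample_abs_S(smooth, y, rng) for _ in range(trials)) / trials
print(psi, est / math.sqrt(psi))    # ratio < 1 indicates extra cancellation
```

At such tiny parameters the saving over √Ψ(x,y) is modest; the theorem is an asymptotic statement as x → ∞.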

The authors give three quantitative regimes:

  1. Moderately sized y (Theorem 1.2). When (log x)^{1+ε} ≤ y ≤ x^{1/(log log x)^{1+ε}}, they obtain an exponential saving:  E|S(x,y)| ≪ Ψ(x,y)^{1/2} exp(−u(log 2 + o(1))), where u = log x/log y. Since u ≥ (log log x)^{1+ε} throughout this interval, the saving beats any fixed power of log log x, uniformly in y.

  2. Large y close to x (Theorem 1.3). For y as large as x^{1−δ} (δ > 0 fixed) the authors exploit the critical Gaussian multiplicative chaos (GMC) phenomenon discovered by Harper. They show that the Euler product associated with f exhibits a log-correlated Gaussian field whose critical exponent yields a logarithmic improvement:  E|S(x,y)| ≪ Ψ(x,y)^{1/2} (log log x)^{−c} for some absolute c > 0. This mirrors the (log log x)^{−1/4} saving in Harper's result for the full sum.

  3. Very small y (Theorem 1.4). When y is below roughly log x, a different mechanism dominates. The contribution to the Euler product comes from rare events involving primes near y, leading to a super-polynomial saving:  E|S(x,y)| ≪ Ψ(x,y)^{1/2} exp(−c (log log x)^{β}) for suitable constants c, β > 0. This reflects a “super-critical” type of cancellation distinct from the critical GMC case.
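The quantitative shape of regime 1 can be sanity-checked numerically. The snippet below (illustrative only, not from the paper; the sample point x = 10^8, y = 100 is assumed to lie in the Theorem 1.2 range for small ε, and the o(1) is ignored) evaluates the smoothness parameter u = log x/log y and the claimed saving factor exp(−u log 2):

```python
import math

x = 10 ** 8
y = 100                           # assumed inside the Theorem 1.2 range
u = math.log(x) / math.log(y)     # u = log x / log y; here u = 4 exactly
saving = math.exp(-u * math.log(2))   # exp(-u log 2) = 2**(-u) = 1/16
print(u, saving)
```

So at this sample point the bound gains a factor of 2^{−u} = 1/16 over the trivial √Ψ(x,y); the gain grows exponentially as y shrinks relative to x.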

The proof strategy departs from the usual conditioning approach that relates the first moment to a ½-th moment of a random Euler product. Instead, the authors start from Perron’s formula:  S(x,y) = (1/2πi) ∫_{σ−i∞}^{σ+i∞} F_y(s) x^{s} ds/s, where F_y(s) = ∏_{p≤y} (1 − f(p)p^{−s})^{−1} is the random Euler product truncated at y. Choosing σ > 2/3 and applying the triangle inequality yields  E|S(x,y)| ≲ x^{σ/2} E|F_y(σ/2)| log x. For σ > 2/3 the logarithm of the Euler product is approximately Gaussian with variance ≈ ½∑_{p≤y} p^{−σ}. Consequently,  E|F_y(σ/2)| ≈ ζ(σ,y)^{1/4}, and the bound becomes  E|S(x,y)| ≲ x^{σ/2} ζ(σ,y)^{1/4} log x. Optimising over σ leads to the saddle point α(x²,y), giving the uniform bound  E|S(x,y)| ≲ Ψ(x²,y)^{1/4} (log x)^{9/8} (log y)^{1/8}, which is already o(Ψ(x,y)^{1/2}) for the whole range C log x ≤ y ≤ x^{1/(log log x)}. For larger y the critical GMC analysis refines the estimate, while for smaller y a large-deviation analysis of the Euler product near the imaginary axis yields the stronger exponential savings.
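The "better than square root" behaviour of the Euler product itself can be probed by Monte Carlo (a sketch, not from the paper; parameters are illustrative). For a Steinhaus f one has the exact identity E|F_y(σ)|² = ζ(2σ, y) = ∏_{p≤y}(1 − p^{−2σ})^{−1}, so ζ(2σ,y)^{1/4} is the heuristic prediction for E|F_y(σ)|, matching the summary's E|F_y(σ/2)| ≈ ζ(σ,y)^{1/4} after substituting σ → 2σ:

```python
import cmath
import math
import random

def primes_upto(y):
    """Primes <= y via the sieve of Eratosthenes."""
    isp = [True] * (y + 1)
    isp[0] = isp[1] = False
    for i in range(2, math.isqrt(y) + 1):
        if isp[i]:
            for j in range(i * i, y + 1, i):
                isp[j] = False
    return [i for i in range(2, y + 1) if isp[i]]

def sample_abs_F(primes, sigma, rng):
    """|F_y(sigma)| for one Steinhaus sample f."""
    val = 1.0
    for p in primes:
        z = cmath.exp(2j * math.pi * rng.random())   # f(p)
        val *= abs(1.0 / (1.0 - z * p ** (-sigma)))
    return val

y, sigma = 100, 0.8               # illustrative choices
primes = primes_upto(y)
rng = random.Random(0)
trials = 2000
emp = sum(sample_abs_F(primes, sigma, rng) for _ in range(trials)) / trials
# Exact second moment: E|F_y(sigma)|^2 = zeta(2*sigma, y)
zeta = math.prod(1.0 / (1.0 - p ** (-2 * sigma)) for p in primes)
print(emp, zeta ** 0.25, zeta ** 0.5)
```

The empirical first moment lands below the Cauchy–Schwarz bound ζ(2σ,y)^{1/2}, in the direction of the ζ^{1/4} prediction; at such a small y the Gaussian approximation is of course rough.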

Beyond the main theorem, the authors discuss an application to the distribution of smooth numbers in short intervals. By combining their bound with a Perron‑type representation of Ψ(x+h,y)−Ψ(x,y), they obtain an error term o(1) for intervals of length h≈√x, suggesting that one can prove the existence of y‑smooth numbers in intervals substantially shorter than previously known, potentially down to h≫√x/(log log x)^{1/4}.

In summary, the paper establishes that Helson’s conjecture holds uniformly for smooth numbers, provides explicit quantitative bounds in three distinct regimes of y, and introduces novel techniques—Perron‑based reduction, Gaussian approximation of random Euler products, and large‑deviation analysis—that may be useful for other problems involving random multiplicative functions and smooth‑number statistics.

