Ratio of $2$-norm to infinity norm of coefficients in the expansion of $(1+x+x^2+\ldots+x^m)^n$?

This isn't a complete answer, but I feel it's enough of a step in the right direction to post it.

Building off your notation, define $A:\mathbb{N}^2\to\mathbb{R}$, $A(m,n)=||a||_2/||a||_{\infty}$. There are several elementary cases:

  • $A(m,n)\geq A(1,2)= \sqrt{3/2}$ (Cauchy-Schwarz)
  • $A(m,1)=\sqrt{m+1}$
  • $A(1,n)=\sqrt{\binom{2n}{n}} / \binom{n}{\lfloor n/2\rfloor}\approx \sqrt[4]{\frac{\pi }{4}}\sqrt[4]{n+1}$
  • $A(m,2) = \frac{1}{m+1} \sqrt{(m+1) \left(2 (m+1)^2+1\right)/3}\approx\sqrt{\frac{2}{3}} \sqrt{m+1} $. The radicand is the $(m+1)^{st}$ octahedral number.
  • $A(2,n) = (-1)^{-n}\frac{ \sqrt{\, _2F_1\left(\frac{1}{2}-n,-n;1;4\right)}}{\, _2F_1\left(\frac{1}{2},-n;1;4\right)} = O(n^{1/4})$. Here $_2 F_1(a,b;c;z)$ is the hypergeometric function $$ _2 F_1(a,b;c;z) = \sum_{k=0}^{\infty}\frac{(a)_k(b)_k}{(c)_k k!}z^k, $$ where $(a)_k$ is the Pochhammer symbol $(a)_k = \Gamma(a+k)/\Gamma(a)$.
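
For concreteness, here is a small Python sketch (the helper names are mine) that computes $A(m,n)$ directly from the coefficient vector; it can be used to spot-check the elementary cases above.

```python
import numpy as np
from math import comb, sqrt

def coeffs(m, n):
    """Coefficients a_k of (1 + x + ... + x^m)^n, built by repeated convolution."""
    a = np.array([1.0])
    for _ in range(n):
        a = np.convolve(a, np.ones(m + 1))
    return a

def A(m, n):
    """A(m, n) = ||a||_2 / ||a||_inf for the coefficient vector a."""
    a = coeffs(m, n)
    return np.linalg.norm(a) / a.max()

# Spot checks against the cases listed above.
assert abs(A(5, 1) - sqrt(6)) < 1e-9                          # A(m, 1) = sqrt(m + 1)
assert abs(A(1, 7) - sqrt(comb(14, 7)) / comb(7, 3)) < 1e-9   # A(1, n)
assert abs(A(4, 2) - sqrt(5 * (2 * 25 + 1) / 3) / 5) < 1e-9   # A(m, 2) with m = 4
```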

I computed $A(m,n)$ for $1\leq m,n\leq 50$ and fit a power law to the results. I got $$A(m,n)\approx -0.0244157 +0.70197 (m+1)^{0.501309} (n+1)^{0.253279}$$ This would seem to agree with our observations that $A(m,n)=O\big((m+1)^{1/2}\big)$ for fixed $n$ and $A(m,n)=O\big((n+1)^{1/4}\big)$ for fixed $m$. Here's a graph:

(Plot of the computed values of $A(m,n)$ together with the power-law fit.)
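
The fit itself can be reproduced along the following lines (a sketch reusing `A(m, n)` from the snippet above; the grid, initial guess, and choice of optimizer are mine, so the fitted constants will differ slightly from those quoted).

```python
import numpy as np
from scipy.optimize import curve_fit

# Tabulate A(m, n) on the grid 1 <= m, n <= 50 (A as defined in the earlier sketch).
M, N = np.meshgrid(np.arange(1, 51), np.arange(1, 51))
data = np.array([[A(m, n) for m in range(1, 51)] for n in range(1, 51)])

def model(mn, c0, c1, p, q):
    m, n = mn
    return c0 + c1 * (m + 1.0)**p * (n + 1.0)**q

popt, _ = curve_fit(model, (M.ravel(), N.ravel()), data.ravel(),
                    p0=(0.0, 0.7, 0.5, 0.25))
print(popt)   # exponents come out close to 1/2 and 1/4
```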

Many other such pairs are known on OEIS as well; they also have recurrence relations from which it might be possible to compute asymptotics. I would also refer you to the excellent book A=B, which discusses these recurrence relations and might be of use.


Here's a non-rigorous account of what the answer "should be" for fixed $m$ and large $n$, with some loose thoughts on how to get actual bounds.

The question is phrased in terms of polynomials, but it may be easier to think about in terms of (finitely-supported) functions on $\mathbb Z$.

Let $\chi_{[0, m)}:\mathbb Z \to \mathbb R$ be the indicator function given by

$$ \chi_{[0,m)}(k) = \begin{cases} 1 & 0 \leq k < m \\ 0 & \text{otherwise}. \end{cases} $$

You're asking about properties of the $n$-fold convolution power $\chi_{[0,m)}^{\star n}$. (The polynomials in the question statement are the generating functions of these convolution powers.)

Or, stated probabilistically, $m^{-n} \chi_{[0,m)}^{\star n}$ is the probability mass function of the sum of $n$ independent random variables, each chosen uniformly from $\{0, 1, \dots, m-1\}$. In other words, it is the probability mass function of the sum of $n$ rolls of an $m$-sided die (with faces labeled $0$ through $m-1$).

For fixed $m$ and large $n$, the behavior of this is well-understood, essentially described by the central limit theorem and its many variants.

In rough terms, the distribution is well approximated by a normal distribution with the appropriate mean and variance. We don't care about the mean, and the variance is $V = n \frac{m^2 - 1}{12}$. So you expect the ratio of the 2-norm to the $\infty$-norm to approach that of the pdf of the normal distribution with that variance.

(Note that we're approximating a discrete distribution on $\mathbb Z$ with a continuous distribution on $\mathbb R$; the 2-norm and $\infty$-norm on both sides of the comparison "should" still be similar, but this takes a bit of thought.)

This ratio of norms for the pdf of the normal distribution can be computed directly with an integral, and it comes out to (see leonbloy's answer below):

$$ \frac{\| \cdot \|_2}{\| \cdot \|_\infty} = \left(V \pi \right)^{1/4} = \left(n \pi \frac{m^2-1}{12}\right)^{1/4} $$
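
For reference, the integral is routine: with $f_V(x) = (2\pi V)^{-1/2} e^{-x^2/(2V)}$,

$$ \|f_V\|_\infty = f_V(0) = (2\pi V)^{-1/2}, \qquad \|f_V\|_2 = \left(\int_{\mathbb R} f_V(x)^2\,dx\right)^{1/2} = (4\pi V)^{-1/4}, $$

and dividing the two gives $(\pi V)^{1/4}$.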

So far this is all quite loose; making this precise requires you to put some bounds on the estimates given by the central limit theorem. You want good $\ell^2$ bounds and good $\ell^\infty$ bounds.

I think $\ell^2$ should be fairly straightforward. I suspect you can even get good estimates on the 2-norm by working in Fourier space without even bringing the CLT into it. (The Fourier transform of $\chi_{[0,m)}^{\star n}$ is the $n$-th power of a fixed function $\mathbb R / \mathbb Z \to \mathbb C$, and its $L^2$ norm is dominated by its behavior near $0$, since away from $0$ that fixed function is strictly smaller in absolute value than its value at $0$.)
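
To illustrate the Fourier/Parseval route numerically (a minimal sketch; the function names and the use of generic quadrature are mine, not part of the argument above):

```python
import numpy as np
from scipy.integrate import quad

def l2_sq_fourier(m, n):
    """Parseval: sum_k a_k^2 = int_0^1 |1 + e^{-2 pi i t} + ... + e^{-2 pi i (m-1) t}|^{2n} dt,
    where a = chi_[0,m)^{*n}.  The modulus of the transform is |sin(pi m t) / sin(pi t)|."""
    def integrand(t):
        s = np.sin(np.pi * t)
        if abs(s) < 1e-12:                       # removable singularity at t = 0 and t = 1
            return float(m) ** (2 * n)
        return (np.sin(np.pi * m * t) / s) ** (2 * n)
    val, _ = quad(integrand, 0.0, 1.0, limit=200)
    return val

def l2_sq_direct(m, n):
    """The same quantity computed from the coefficients themselves."""
    a = np.array([1.0])
    for _ in range(n):
        a = np.convolve(a, np.ones(m))
    return float(np.sum(a**2))

print(l2_sq_fourier(4, 6), l2_sq_direct(4, 6))   # agree up to quadrature error
```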

It might be more hairy to address how good the $\ell^\infty$ estimate is; it's concerned with a scale at which repeated convolution doesn't actually smooth things out*. I suspect this is a well-studied problem, i.e. I suspect there are known good bounds on whether and how quickly the actual probability mass function at the "central point" for sums of i.i.d. discrete random variables approaches what you'd expect from the CLT, but I'm not familiar with details. It may be straightforward to get this kind of bound by working in Fourier space as well.

*: Come to think of it, the $\ell^2$ norm is also sensitive to behavior at that scale, so some care is required for both, see comments.


Following Anton Malyshev's idea, $a_k$ can be regarded as the probability mass function of the random variable $Y=\sum_{i=1}^n X_i$, where the $X_i$ are i.i.d. discrete uniform variables (up to a normalization factor, which cancels in the ratio, so we ignore it).

For simplicity, let's assume $m$ is odd, so that $X_i$ takes values in $\left\{ -\tfrac{m-1}{2}, \dots, -1, 0, 1, \dots, \tfrac{m-1}{2} \right\}$.

Notice also that $\sum a_k^2 = P(Y_1=Y_2) = P(Y_1 - Y_2 = 0)$ where $Y_1$,$Y_2$ are two iid random variables following the pmf $a_k$.

Let's then define $W_i = X_{i,1} - X_{i,2}$, where $X_{i,1}$ and $X_{i,2}$ are the summands of $Y_1$ and $Y_2$ respectively, so that $Z = \sum W_i = Y_1 - Y_2$.

Then, we want to evaluate the ratio

$$ A(m,n)= \frac{\sqrt{P(Z=0)}}{P(Y=0)}=\frac{\sqrt{P(\sum W_i=0)}}{P(\sum X_i=0)} \tag 1$$

where both $Z$ and $Y$ are integer valued with zero mean.
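
As a quick numerical sanity check of $(1)$ (a sketch; the helper names are mine, and it uses the same convention as this answer: each $X_i$ takes $m$ equally likely values, with $m$ odd so that $0$ is the mode):

```python
import numpy as np

def check_identity(m, n):
    """Compare sqrt(P(Z = 0)) / P(Y = 0) with ||a||_2 / ||a||_inf  (m odd)."""
    p = np.ones(m) / m                     # pmf of one X_i (an index shift is irrelevant here)
    y = np.array([1.0])
    for _ in range(n):
        y = np.convolve(y, p)              # pmf of Y = sum of the X_i
    z = np.convolve(y, y[::-1])            # pmf of Z = Y_1 - Y_2
    ratio_prob = np.sqrt(z[len(z) // 2]) / y.max()   # P(Y = 0) is the central (maximal) mass
    a = y * m**n                           # unnormalized coefficients a_k
    ratio_norm = np.linalg.norm(a) / a.max()
    return ratio_prob, ratio_norm

print(check_identity(3, 5))   # both entries come out ~ 1.855, cf. the table below
```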

For that we use the CLT and, as a refinement, an Edgeworth expansion.

We have the following low-order moments (the odd moments vanish) and fourth cumulants:

$$\begin{array}{c|c|c} & X_i & W_i \\ \hline m_2=\sigma^2 & \dfrac{m^2-1}{12} & \dfrac{m^2-1}{6} \\ m_4 & \dfrac{3 m^4-10 m^2+7}{240} & \dfrac{2 m^4-5 m^2+3}{30} \\ \kappa_4 & -\dfrac{m^4-1}{120} & -\dfrac{m^4-1}{60} \\ \end{array} \tag 2$$
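
The entries of $(2)$ can be checked with exact arithmetic for any specific odd $m$; a small sketch (helper names mine):

```python
from fractions import Fraction

def moments_X(m):
    """Exact m_2, m_4, kappa_4 of X uniform on {-(m-1)/2, ..., (m-1)/2}, m odd."""
    vals = [Fraction(j - (m - 1) // 2) for j in range(m)]
    def mom(r):
        return sum(v**r for v in vals) / m
    m2, m4 = mom(2), mom(4)
    return m2, m4, m4 - 3 * m2**2          # kappa_4 = m_4 - 3 sigma^4 for a centred variable

m = 7
m2, m4, k4 = moments_X(m)
assert m2 == Fraction(m**2 - 1, 12)
assert m4 == Fraction(3*m**4 - 10*m**2 + 7, 240)
assert k4 == Fraction(-(m**4 - 1), 120)
# For W = X_1 - X_2 the variance and fourth cumulant simply double (cumulants add),
# which gives the second column of the table.
```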

Then the zeroth-order approximation gives

$$ A(m,n) \approx \frac{(2 \pi n \,\sigma_X^2)^{1/2}}{(2 \pi n \, \sigma_W^2)^{1/4}} = \left( \pi \,n \, \frac{m^2-1}{12}\right)^{1/4} \triangleq A_0(m,n) \tag 3$$

And the first order correction is

$$ \begin{align} A(m,n) &\approx A_0(m,n)\,\frac{\sqrt{ 1+ \dfrac{\kappa_{4,W}}{8 n \sigma_W^4}}}{1+\dfrac{\kappa_{4,X}}{8 n \sigma_X^4}} \\ &= \left( \pi \,n \, \frac{m^2-1}{12}\right)^{1/4} \frac{\sqrt{1-\dfrac{3 \left( m^2+1\right) }{40 n \left( m^2-1\right) }}}{1-\dfrac{3 \left( m^2+1\right) }{20 n \left( m^2-1\right) }} \triangleq A_1(m,n) \tag 4 \end{align}$$

The approximations are quite good, even for small values of $n$:

$$ \begin{array}{c|c|c|c|c} n & m & {A(m,n) \text{ (exact)}} & A_0(m,n) & A_1(m,n)\\ \hline 3 & 3 & 1.6963 & 1.5832 & 1.6622 \\ 5 & 3 & 1.8553 & 1.7989 & 1.8514 \\ 10 & 3 & 2.1698 & 2.1393 & 2.1699 \\ 20 & 3 & 2.5621 & 2.5440 & 2.5621 \\ \hline 3 & 9 & 2.9627 & 2.8154 & 2.9292 \\ 5 & 9 & 3.2779 & 3.1989 & 3.2750 \\ 10 & 9 & 3.8491 & 3.8042 & 3.8487 \\ 20 & 9 & 4.5504 & 4.5240 & 4.5503 \\ 30 & 9 & 5.0260 & 5.0066 & 5.0260 \\ \end{array} $$

Both are expected to be asymptotically exact (vanishing relative error) for $n \to \infty$. Expanding $(4)$ in powers of $1/n$ (using $\sqrt{1-\epsilon/2}\,/(1-\epsilon) = 1 + \tfrac{3}{4}\epsilon + O(\epsilon^2)$ with $\epsilon = \frac{3(m^2+1)}{20 n (m^2-1)}$), one can conjecture:

$$ A(m,n) = \left( \pi \,n \, \frac{m^2-1}{12}\right)^{1/4} \left( 1 + \frac{1}{n}\frac{9}{ 80} \frac{m^2+1}{m^2-1} + O( n^{-2}) \right) \tag 5 $$
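
A quick numerical check of $(3)$–$(5)$ (a sketch; the function names are mine, and the same convention is used: $m$ equally likely values per variable):

```python
import numpy as np

def exact_A(m, n):
    """Exact ||a||_2 / ||a||_inf for the coefficients of (1 + x + ... + x^{m-1})^n."""
    a = np.array([1.0])
    for _ in range(n):
        a = np.convolve(a, np.ones(m))
    return np.linalg.norm(a) / a.max()

def A0(m, n):
    return (np.pi * n * (m**2 - 1) / 12) ** 0.25

def A1(m, n):
    eps = 3 * (m**2 + 1) / (20 * n * (m**2 - 1))
    return A0(m, n) * np.sqrt(1 - eps / 2) / (1 - eps)

def A_conj(m, n):
    # The conjectured expansion (5), truncated after the 1/n term.
    return A0(m, n) * (1 + (9 / 80) * (m**2 + 1) / (n * (m**2 - 1)))

for m, n in [(3, 5), (3, 20), (9, 10), (9, 30)]:
    print(m, n, exact_A(m, n), A0(m, n), A1(m, n), A_conj(m, n))
```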