## Zeros of random polynomials

Given a polynomial $p(z) = a_0+a_1z+\cdots + a_nz^n$ with random coefficients, what can we say about the distribution of its roots in $\mathbb{C}$? Of course, this depends on what “random” means. Here, “random” means that $(a_n)$ is an i.i.d. sequence of complex random variables.

It turns out that under a rather weak condition on $(a_n)$, as $n\to\infty$ the roots tend to concentrate on the unit circle! (There are many interesting heuristic arguments explaining why this should be true.)
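Before any proof, this is easy to observe numerically. A minimal sketch in NumPy (the degree $200$ and the seed are arbitrary choices): sample the coefficients as complex standard Gaussians, as defined below, and look at the moduli of the roots.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # degree; arbitrary choice

# Complex standard Gaussians: density e^{-|z|^2}/pi, i.e. the real and
# imaginary parts are independent N(0, 1/2).
a = (rng.normal(scale=np.sqrt(0.5), size=n + 1)
     + 1j * rng.normal(scale=np.sqrt(0.5), size=n + 1))

# np.roots expects coefficients from highest degree to lowest.
roots = np.roots(a[::-1])
radii = np.abs(roots)

print(np.median(radii))            # close to 1
print(np.mean(np.abs(radii - 1)))  # small: most roots hug the unit circle
```

Histograms of `radii` for growing $n$ make the concentration even more striking.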

I will give a rigorous proof (and a rigorous formulation) of this result. For simplicity, we will assume that $(a_n)$ are complex standard Gaussians. That is, for any Borel set $B\subseteq \mathbb{C}$, we have

$\displaystyle \mathbf{P}(a_j\in B) = \frac{1}{\pi} \int_B e^{-|z|^2}\,\text{d}m(z)$,

where $m$ is the Lebesgue measure on the complex plane. We will also assume two basic potential theoretic results without proofs, namely

$\displaystyle \frac{1}{2\pi}\Delta \max\{0,\log|z|\} = \frac{1}{2\pi}\,\text{d}\theta \quad \text{and} \quad \frac{1}{2\pi}\Delta \log|z| = \delta_0,$

where the Laplacians are taken in the sense of distributions, and $\frac{1}{2\pi}\text{d}\theta$ denotes the normalized arc-length measure on the unit circle $\mathbb{S}^1$.
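As a sanity check of the second identity, pair $\frac{1}{2\pi}\Delta \log|z|$ with the radial test function $\psi(z) = e^{-|z|^2}$ (not compactly supported, but its rapid decay makes the pairing legitimate); the result should be $\psi(0) = 1$. For radial functions $\Delta\psi = \psi'' + \psi'/r$, so $\Delta e^{-r^2} = (4r^2-4)e^{-r^2}$, and integrating over $\mathbb{C}$ in polar coordinates gives $\text{d}m = 2\pi r\,\text{d}r$. A numerical sketch:

```python
import numpy as np
from scipy.integrate import quad

# (1/2pi) * integral of log|z| * Delta psi over C, in polar coordinates;
# the 1/(2pi) cancels the 2*pi from the angular integral.
integrand = lambda r: np.log(r) * (4 * r**2 - 4) * np.exp(-r**2) * r
val, _ = quad(integrand, 0, np.inf)

print(val)  # should be psi(0) = 1
```

The $r\log r$ singularity at the origin is integrable, so `quad` handles it without trouble.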

For $p_n(z) = \sum_{j=0}^n a_jz^j = a_n\prod_{j=1}^n (z-\zeta_j)$, we write $Z_{p_n} = \frac{1}{n}\sum_{j=1}^n \delta_{\zeta_j}$, the normalized counting measure.  We define the expected normalized counting measure, $\mathbf{E}[Z_{p_n}]$, as follows. For any $\psi\in C_c(\mathbb{C})$,

$\displaystyle \begin{array}{rl} \displaystyle (\mathbf{E}[Z_{p_n}],\psi) &:= \mathbf{E}[(Z_{p_n}, \psi)]\\ &\displaystyle =\frac{1}{\pi^{n+1}}\int_{\mathbb{C}^{n+1}} \frac{1}{n}\sum_{j=1}^n \psi(\zeta_j)e^{-\sum_{j=0}^n |a_j|^2} \,\text{d}m(a_0)\cdots\text{d}m(a_n). \end{array}$

Intuitively, this $\mathbf{E}[Z_{p_n}]$ tells us how the zeros of $p_n$ are distributed “on average”.

**Theorem.** We have $\displaystyle \lim_{n\to\infty} \mathbf{E}[Z_{p_n}] = \frac{1}{2\pi}\text{d}\theta$ in the weak* topology.

We will write $\mu = \frac{1}{2\pi}\text{d}\theta$. The key ingredient of the proof is the orthonormality of the monomials $z^j$ in $L^2(\mu)$. Define

$\displaystyle S_n(z,w) = \sum_{j=0}^n z^j\overline{w^j}$.

By this orthonormality, $S_n$ is a reproducing kernel: $p_n(z) = \int_{\mathbb{S}^1} p_n(w)S_n(z,w)\,\text{d}\mu(w)$. Also, note that

$\displaystyle \frac{1}{2n}\log{S_n(z,z)} \to \max\{0, \log|z|\}$

locally uniformly in $\mathbb{C}$ (an easy calculus exercise). Why is this important? Because, combined with the first identity above, basic potential theory implies

$\displaystyle \frac{1}{2\pi}\Delta\frac{1}{2n}\log{S_n(z,z)} \to \mu$

as $n\to\infty$ in weak*-topology.
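The pointwise limit above is easy to check numerically (a small sketch; the helper name `half_log_Sn`, the degree, and the test points are arbitrary choices):

```python
import numpy as np

def half_log_Sn(z, n):
    # S_n(z, z) = sum_{j=0}^{n} |z|^(2j); computed in log-space
    # (logsumexp style) so that |z| > 1 does not overflow.
    t = 2 * np.arange(n + 1) * np.log(abs(z))
    return (t.max() + np.log(np.exp(t - t.max()).sum())) / (2 * n)

n = 5000
for z, limit in [(0.5, 0.0), (2.0, np.log(2.0)), (1.0, 0.0)]:
    print(z, half_log_Sn(z, n), limit)  # the middle column approaches the last
```

For $|z|<1$ the sum is bounded, so the quotient by $2n$ vanishes; for $|z|>1$ the top term $|z|^{2n}$ dominates, giving $\log|z|$.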

Proof of Theorem: Write $a = (a_0,\ldots, a_n)$ and $b(z) = (1,z,\ldots, z^n)\in\mathbb{C}^{n+1}$. Also write $u(z) = b(z)/\|b(z)\|$, the normalized vector. Then

$\displaystyle p_n(z) = \sum_{j=0}^n a_jz^j = \langle a, b(z)\rangle = \langle a, u(z)\rangle \|b(z)\|.$

Note that $\|b(z)\| = (\sum_{j=0}^n |z^j|^2)^{1/2} = S_n(z,z)^{1/2}$. Thus

$\displaystyle p_n(z) = \langle a, u(z)\rangle S_n(z,z)^{1/2}.$
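A quick numerical confirmation of this decomposition (a sketch; note that the pairing $\langle\cdot,\cdot\rangle$ here is the bilinear one, since $p_n(z) = \sum_j a_j z^j$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
a = (rng.normal(scale=np.sqrt(0.5), size=n + 1)
     + 1j * rng.normal(scale=np.sqrt(0.5), size=n + 1))

z = 0.7 + 0.4j                     # arbitrary evaluation point
b = z ** np.arange(n + 1)          # b(z) = (1, z, ..., z^n)
u = b / np.linalg.norm(b)          # normalized vector u(z)
Sn = np.linalg.norm(b) ** 2        # S_n(z, z) = ||b(z)||^2

lhs = np.sum(a * b)                # p_n(z) evaluated directly
rhs = np.sum(a * u) * np.sqrt(Sn)  # <a, u(z)> * S_n(z, z)^{1/2}

print(abs(lhs - rhs))  # ~ 0 up to rounding
```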

Recall that $Z_{p_n} = \frac{1}{2\pi}\Delta\frac{1}{n}\log|p_n(z)|$; this follows from applying the second identity above to the factorization $p_n(z) = a_n\prod_{j=1}^n (z-\zeta_j)$, since $\Delta$ kills the constant $\log|a_n|$. Also, note that

$\displaystyle \frac{1}{n}\log|p_n(z)| = \frac{1}{2n}\log S_n(z,z) + \frac{1}{n}\log |\langle a, u(z)\rangle|.$

Therefore,

$\displaystyle \begin{array}{rl} (\mathbf{E}[Z_{p_n}],\psi) & \displaystyle = \mathbf{E}[(Z_{p_n}, \psi)]\\ &\displaystyle = \mathbf{E}\left[\left(\frac{1}{2\pi}\Delta\frac{1}{2n}\log S_n(z,z),\psi\right)\right]+\frac{1}{2\pi n} \mathbf{E}[(\Delta \log|\langle a,u(z)\rangle|,\psi )] .\end{array}$

The first term is actually nonrandom, and by the discussion above it converges to $\frac{1}{2\pi}\int_{\mathbb{S}^1} \psi\,\text{d}\theta$. We will show that the second term goes to $0$, which will finish the proof. Because of the $\frac{1}{2\pi n}$ prefactor, it suffices to show that the expectation in the second term is bounded.

Consider

$\displaystyle \begin{array}{rl} \displaystyle \mathbf{E}[(\Delta \log|\langle a,u(z)\rangle|,\psi )] & \displaystyle = \mathbf{E}[(\log|\langle a,u(z)\rangle|, \Delta\psi)] \\ & \displaystyle = \int_\mathbb{C} \Delta \psi(z)\, \mathbf{E}[\log|\langle a,u(z)\rangle|]\,\text{d}m(z),\end{array}$

where in the second equality we used Fubini’s theorem. Now, let’s recall that

$\displaystyle \mathbf{E}[\log|\langle a,u(z)\rangle|] = \frac{1}{\pi^{n+1}}\int_{\mathbb{C}^{n+1}}\log|\langle a,u(z)\rangle| e^{-\sum_{j=0}^n |a_j|^2}\,\text{d}m(a_0)\cdots \text{d}m(a_n).$

Note that the Gaussian measure is invariant under unitary transformations. Therefore, by applying a unitary transformation, we may assume that $u(z) = (1,0,\ldots,0)$ (recall that $u(z)$ is a unit vector). Hence, the integral equals

$\displaystyle \frac{1}{\pi}\int_\mathbb{C} \log|a_0|\,e^{-|a_0|^2}\,\text{d}m(a_0),$

which is a finite constant independent of $n$ and $z$. Since $\psi$ has compact support, $\int_\mathbb{C} |\Delta \psi(z)|\,\text{d}m(z)$ is finite, and we are done.
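In fact this constant can be identified: if $a_0$ is a complex standard Gaussian, then $|a_0|^2$ is exponentially distributed with mean $1$, so $\mathbf{E}[\log|a_0|] = \frac{1}{2}\mathbf{E}[\log|a_0|^2] = -\gamma/2$, where $\gamma$ is the Euler–Mascheroni constant. A quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10**6
a0 = (rng.normal(scale=np.sqrt(0.5), size=N)
      + 1j * rng.normal(scale=np.sqrt(0.5), size=N))

# Sample mean of log|a_0|; the exact value is -euler_gamma / 2.
est = np.mean(np.log(np.abs(a0)))
print(est, -np.euler_gamma / 2)
```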

**Remark.** If you write $p(z) = \sum_{j=0}^n a_jb_j(z)$, where $(b_j)$ is a sequence of orthonormal polynomials in $L^2(\nu)$ for some measure $\nu$ supported on a “nice” compact subset of the complex plane, then the normalized counting measure of the roots of $p$ will converge to the equilibrium measure of the support of $\nu$. This requires more potential theory, so I am not stating the general result here.
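As a rough illustration of the remark (a numerical sketch only, not tied to the precise hypotheses of the general result): take the $b_j$ to be Chebyshev polynomials, which are orthogonal on $[-1,1]$; the equilibrium measure of $[-1,1]$ is the arcsine distribution, and the roots of a random Chebyshev series indeed cluster on that interval.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100  # degree; arbitrary choice

# Random combination of Chebyshev polynomials T_0, ..., T_n with
# Gaussian coefficients; its roots should cluster on [-1, 1].
c = rng.normal(size=n + 1)
roots = np.polynomial.chebyshev.chebroots(c)

# Distance from each root to the segment [-1, 1] of the real axis.
dist = np.abs(roots - np.clip(roots.real, -1, 1))
print(np.median(dist))  # small: most roots sit near [-1, 1]
```

Plotting `roots` in the complex plane makes the contrast with the unit-circle picture above vivid.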
