## The power mean inequality and some related topics

I haven’t blogged here for quite a while; hopefully this blog has not been abandoned. Anyway, this is my first post since arriving in Australia. It is actually a note I prepared in my second undergraduate year about the power mean inequality (I had forgotten about it and only recently found it on my computer). More precisely, these are the exercises from a chapter of a book whose name I’ve forgotten. The material is quite elementary, but perhaps some of you will be interested. If I were writing this note now I would organize it differently, since it is rather loosely structured; but as I do not have much time, I have only done a little formatting before posting it here.

The quantities that provide the upper bound in the AM-GM inequality are special cases of the general means

$\displaystyle M_t=M_t [\textbf{x};\textbf{p}]\equiv \{\sum_{k=1}^n p_k x_k{}^t \}^{1/t} \ \ \ \ \ (1)$

where ${\textbf{p}=(p_1, \ldots, p_n)}$ is a vector of positive weights with total mass ${p_1+\ldots +p_n=1}$ and ${\textbf{x}=(x_1, \ldots, x_n)}$ is a vector of positive real numbers. Unless otherwise stated, we will hereafter assume these conditions on ${\textbf{x}}$ and ${\textbf{p}}$. As we will see, there are reasonable definitions of the formula (1) at ${t=\pm \infty}$ and at ${t=0}$ that make the map ${t\mapsto M_t}$ continuous at these points.

 Proposition 1 (The Geometric Mean as a Limit) The geometric mean can be regarded as the limit of the general mean, $\displaystyle \lim_{t\rightarrow 0}\{\sum_{k=1}^n p_k x_k{}^t\}^{1/t}=\prod_{k=1}^n x_k{}^{p_k}.$ This quantity is defined to be ${M_0}$, so that the map ${t\mapsto M_t}$ is continuous at ${t=0}$.

Proof: Consider

$\displaystyle \begin{array}{rcl} \lim_{t\rightarrow 0}\log(M_t)&=&\lim_{t\rightarrow 0}\frac{\log(\sum_{k=1}^n p_k x_k{}^t)} {t}\\ &=& \lim_{t\rightarrow 0}\frac{\sum_{k=1}^n p_k x_k{}^t \log(x_k)}{\sum_{k=1}^n p_k x_k{}^t} \textrm{ \quad(L'Hospital's rule)}\\ &=& \sum_{k=1}^n p_k \log(x_k). \end{array}$

Applying the exponential function to both sides gives the result. $\Box$
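Proposition 1 is easy to verify numerically. Below is a minimal Python sketch (the helper names `power_mean` and `geometric_mean` are my own, not from the book):

```python
import math

def power_mean(x, p, t):
    """Weighted power mean M_t[x; p] = (sum_k p_k * x_k**t)**(1/t), for t != 0."""
    return sum(pk * xk ** t for pk, xk in zip(p, x)) ** (1.0 / t)

def geometric_mean(x, p):
    """Weighted geometric mean prod_k x_k**p_k, the value assigned to M_0."""
    return math.prod(xk ** pk for pk, xk in zip(p, x))

x = [1.0, 2.0, 5.0]
p = [0.2, 0.3, 0.5]
# As t -> 0, the power mean approaches the geometric mean.
for t in [1.0, 0.1, 0.01, 1e-6]:
    print(t, power_mean(x, p, t))
print("geometric mean:", geometric_mean(x, p))
```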

 Proposition 2 (Power Mean Bound for the Geometric Mean) $\displaystyle \prod_{k=1}^n x_k{}^{p_k} \leq \{\sum_{k=1}^n p_k x_k{}^t\}^{1/t} \textrm{ for all }t>0. \ \ \ \ \ (2)$

Proof:

We first prove the inequality (Siegel’s doubling relation)

$\displaystyle M_t\leq M_{2t} \textrm{ for all }t>0. \ \ \ \ \ (3)$

By Cauchy’s inequality and the splitting trick ${p_k x_k{}^t=p_k{}^\frac{1}{2}p_k{}^\frac{1}{2}x_k{}^t}$, we have

$\displaystyle M_t{}^t=\sum_{k=1}^n p_k x_k{}^t =\sum_{k=1}^n p_k{}^\frac{1}{2}p_k{}^\frac{1}{2}x_k{}^t \leq (\sum_{k=1}^n p_k){}^\frac{1}{2} (\sum_{k=1}^n p_k x_k{}^{2t}){}^\frac{1}{2}=M_{2t}{}^t$

Taking the ${t}$-th root on both sides, we get (3). We deduce that for all ${t>0}$,

$\displaystyle M_{t/2^j} \leq M_{t/2^{j-1}} \leq \ldots \leq M_{t/2} \leq M_t$

Taking the limit ${j\rightarrow \infty}$,

$\displaystyle \prod_{k=1}^n x_k{}^{p_k} =M_0=\lim_{j\rightarrow \infty}M_{t/2^j}\leq M_t= \{\sum_{k=1}^n p_k x_k{}^t\}^{1/t} \textrm{ for all }t>0.$

$\Box$
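Both inequality (2) and the doubling relation (3) can be sanity-checked numerically on random positive data; a small sketch (the helper name `power_mean` is my own):

```python
import math
import random

def power_mean(x, p, t):
    """Weighted power mean M_t[x; p] for t != 0."""
    return sum(pk * xk ** t for pk, xk in zip(p, x)) ** (1.0 / t)

random.seed(0)
x = [random.uniform(0.5, 10.0) for _ in range(5)]
w = [random.random() for _ in range(5)]
total = sum(w)
p = [wk / total for wk in w]  # normalize the weights so they sum to 1

gm = math.prod(xk ** pk for pk, xk in zip(p, x))
for t in [0.25, 0.5, 1.0, 2.0, 4.0]:
    assert gm <= power_mean(x, p, t) + 1e-12                        # inequality (2)
    assert power_mean(x, p, t) <= power_mean(x, p, 2 * t) + 1e-12   # doubling (3)
```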

Siegel’s doubling relation (3) and a plot of the two-term power mean ${(px^t+(1-p)y^t)^{1/t}}$ for ${ x=1, y=2}$ (the plot is not reproduced here) suggest the following inequality:

 Theorem 3 (Power Mean Inequality) The mapping ${t\mapsto M_t}$ is a nondecreasing function on ${\mathbb{R}}$, i.e. $\displaystyle \{\sum_{k=1}^n p_k x_k{}^s\}^{1/s} \leq \{\sum_{k=1}^n p_k x_k{}^t\}^{1/t} \textrm{ for }-\infty<s<t<\infty.$ The equality holds if and only if ${x_1=\ldots =x_n.}$

Proof: For ${0<s<t}$, by Jensen’s inequality for the map ${x\mapsto x^p}$ with ${p>1}$,

$\displaystyle \{ \sum _{k=1}^n p_k x_k\}^p \leq \sum_{k=1}^n p_k x_k{}^p.$

With this inequality, the substitutions ${y_k=x_k{}^s}$ and ${p=t/s>1}$ give

$\displaystyle \{ \sum _{k=1}^n p_k y_k\}^{t/s} \leq \sum_{k=1}^n p_k y_k{}^{t/s},$

so taking ${t}$-th root gives the desired inequality in this case. Moreover, the strict convexity of ${x \mapsto x^p}$ for ${p>1}$ ensures that the equality holds if and only if ${x_1=\ldots=x_n}$.
For ${s<t<0}$, we have ${0<-t<-s}$; then by the above,

$\displaystyle \begin{array}{rcl} \{\sum_{k=1}^n p_k (x_k^{-1})^{-t}\}^{-1/t} &\leq& \{\sum_{k=1}^n p_k (x_k^{-1})^{-s}\}^{-1/s} \\ \Rightarrow \{\sum_{k=1}^n p_k x_k{}^s\}^{1/s} &\leq& \{\sum_{k=1}^n p_k x_k{}^t\}^{1/t} \end{array}$

where the second line follows since ${(x_k^{-1})^{-t}=x_k{}^t}$ and taking reciprocals reverses the inequality.

For ${s<0<t}$: as ${-s>0}$, applying (2) to the reciprocals ${(x_1^{-1}, \ldots, x_n^{-1})}$ gives

$\displaystyle \prod_{k=1}^n x_k{}^{-p_k} \leq \{\sum_{k=1}^n p_k (x_k^{-1})^{-s}\}^{-1/s} \Rightarrow \{\sum_{k=1}^n p_k x_k{}^s\}^{1/s} \leq \prod_{k=1}^n x_k{}^{p_k}. \ \ \ \ \ (4)$

Since ${t>0}$, combining (4) with (2) gives the result in this case. The remaining cases ${0=s<t}$ and ${s<t=0}$ are covered by (2) and (4) respectively.
Finally, as the figure suggests, we define

$\displaystyle M_{-\infty}[\textbf{x}, \textbf{p}]=\min_k x_k \textrm{ and } M_\infty[\textbf{x}, \textbf{p}]=\max_k x_k$

It is easy to show that ${M_{-\infty}[\textbf{x}, \textbf{p}]\leq M_t[\textbf{x}, \textbf{p}]\leq M_\infty[\textbf{x}, \textbf{p}] \textrm{ for }t\in \mathbb{R}.}$ We also have the continuity relations

$\displaystyle \lim_{t\rightarrow \infty}M_t[\textbf{x}, \textbf{p}]=M_\infty[\textbf{x}, \textbf{p}] \textrm{ and }\lim_{t\rightarrow -\infty}M_t[\textbf{x}, \textbf{p}]=M_{-\infty}[\textbf{x}, \textbf{p}]$

For ${t>0}$ and all ${1\leq k \leq n}$,

$\displaystyle p_k x_k{}^t \leq M_t{}^t \leq M_\infty{}^t, \text{ so } p_k{}^{\frac{1}{t}} x_k \leq M_t \leq M_\infty$

As ${p_k>0}$, ${p_k{}^{1/t}\rightarrow 1}$ as ${t \rightarrow \infty}$, so for all ${k}$,

$\displaystyle x_k \leq \lim_{t\rightarrow \infty}M_t \leq M_\infty$

So

$\displaystyle \max_k x_k\leq \lim_{t\rightarrow \infty}M_t \leq M_\infty$

Therefore

$\displaystyle \lim_{t\rightarrow \infty}M_t=M_\infty \textrm{ as }\max_k x_k=M_\infty.$

By the identity ${M_{-t}(x_1, \ldots, x_n; \textbf{p})=M_t{}^{-1}(1/x_1, \ldots, 1/x_n; \textbf{p})}$ for ${t>0}$ and the first continuity relation, the second continuity relation follows. $\Box$
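The monotonicity asserted by Theorem 3, together with the limiting values ${M_{\pm\infty}}$, is easy to check numerically; a minimal Python sketch (the helper name `power_mean` and the sample data are my own):

```python
def power_mean(x, p, t):
    """Weighted power mean M_t[x; p] for t != 0."""
    return sum(pk * xk ** t for pk, xk in zip(p, x)) ** (1.0 / t)

x = [3.0, 1.0, 4.0, 1.5]
p = [0.25] * 4

# t -> M_t is nondecreasing (Theorem 3): check on a grid of nonzero exponents.
ts = [i / 4 for i in range(-40, 41) if i != 0]
vals = [power_mean(x, p, t) for t in ts]
assert all(a <= b + 1e-12 for a, b in zip(vals, vals[1:]))

# M_t approaches max(x) as t -> +infinity and min(x) as t -> -infinity.
assert abs(power_mean(x, p, 200.0) - max(x)) < 0.05
assert abs(power_mean(x, p, -200.0) - min(x)) < 0.05
```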

 Proposition 4 (Harmonic Means and Recognizable Sums) Suppose ${x_1, \ldots, x_n}$ are positive and ${S}$ denotes their sum. Then $\displaystyle \frac{n^2}{2n-1}\leq \frac{S}{2S-x_1}+\ldots +\frac{S}{2S-x_n}.$

Proof: By the AM-HM inequality,

$\displaystyle \begin{array}{rcl} \sum_{k=1}^n \frac{S}{2S-x_k}&\geq& n^2/(\frac{\sum_{k=1}^n (2S-x_k)}{S})\\ &=& n^2/(\frac{2nS-S}{S})\\ &=& \frac{n^2}{2n-1}. \end{array}$

$\Box$
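A direct numerical check of Proposition 4 on some arbitrarily chosen positive data (a sketch):

```python
n = 6
x = [1.0, 2.5, 0.5, 4.0, 3.0, 1.5]
S = sum(x)

lhs = n ** 2 / (2 * n - 1)
rhs = sum(S / (2 * S - xk) for xk in x)  # the "recognizable sum"
assert lhs <= rhs + 1e-12
```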

 Exercise 1 (Polya’s Minimax Characterization) Suppose you have to guess an unknown ${x}$ in the interval ${[a, b]\subset (0, \infty)}$. It is possible to minimize the largest possible relative error of your guess: that is, for $\displaystyle F(p)=\max_{x\in[a, b]}\{\frac{|p-x|}{x}\},$ there is a ${p^*}$ such that $\displaystyle F(p^*)=\min_p F(p)=\min_p \max_{x\in[a, b]}\{\frac{|p-x|}{x}\}.$

Proof: First, for a given ${p}$, we can differentiate ${g(x)=\frac{(p-x)^2}{x^2}}$ with respect to ${x}$ to see that ${g'(x)=0}$ only when ${x=p}$, which is where the relative error is zero. Therefore, for ${\frac{|p-x|}{x}}$ to attain its maximum over ${[a, b]}$, ${x}$ must be equal to ${a}$ or ${b}$. Hence

$\displaystyle F(p)=\max_{x\in[a, b]}\{\frac{|p-x|}{x}\}=\max\{\frac{p-a}{a}, \frac{b-p}{b}\}.$

Then

$\displaystyle \frac{p-a}{a}\geq \frac{b-p}{b} \Leftrightarrow p\geq \frac{2ab}{a+b} \textrm{ and }\frac{p-a}{a}\leq \frac{b-p}{b} \Leftrightarrow p\leq \frac{2ab}{a+b}, \textrm{ where }\frac{2ab}{a+b}\in [a, b].$

Let ${H=\frac{2ab}{a+b}}$, the harmonic mean of ${a}$ and ${b}$ with equal weights. For ${p\geq H}$ we have ${F(p)=\frac{p-a}{a}}$, which is increasing in ${p}$, while for ${p\leq H}$ we have ${F(p)=\frac{b-p}{b}}$, which is decreasing. Therefore ${F(p)}$ attains its minimum at ${p^*=H=\frac{2ab}{a+b}}$.
$\Box$
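The minimax characterization can be confirmed by a brute-force scan over candidate guesses; a small sketch with hypothetical endpoints ${a=2}$, ${b=8}$:

```python
def F(p, a, b):
    """Worst-case relative error of the guess p; the maximum over [a, b]
    is attained at an endpoint, as argued above."""
    return max(abs(p - x) / x for x in (a, b))

a, b = 2.0, 8.0
H = 2 * a * b / (a + b)  # harmonic mean of a and b, the claimed minimizer

grid = [a + i * (b - a) / 1000 for i in range(1001)]
assert all(F(H, a, b) <= F(p, a, b) + 1e-12 for p in grid)
```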

 Proposition 5 (The Geometric Mean as a Minimum) The geometric mean has the representation $\displaystyle \{ \prod_{k=1}^n a_k \}^{1/n}=\min\{\frac{1}{n} \sum_{k=1}^n a_k x_k:(x_1,\ldots, x_n)\in D \},$ where ${D}$ is the region of ${\mathbb{R}^n}$ defined by $\displaystyle D=\{(x_1, \ldots, x_n):\prod_{k=1}^n x_k=1, x_k\geq 0, k=1, \ldots , n\}.$

Proof: Assume ${a_k>0}$ for all ${k}$; by the AM-GM inequality,

$\displaystyle \frac{1}{n} \sum_{k=1}^n a_k x_k\geq (\prod_{k=1}^n a_k x_k)^{1/n}=(\prod_{k=1}^n a_k)^{1/n} \textrm{ as } (x_1, \ldots, x_n)\in D.$

The AM-GM inequality also tells us that equality holds if and only if ${a_1x_1=\ldots =a_nx_n}$. Writing ${a_kx_k=c}$, we have ${x_k=c/a_k}$, and the constraint ${\prod_{k=1}^n x_k=1}$ gives ${1=c^n/(a_1 a_2\ldots a_n)}$, i.e. ${c=(a_1 a_2\ldots a_n)^{1/n}}$. So choosing ${x_k=\frac{(a_1 a_2\ldots a_n)^{1/n}}{a_k}}$, we obtain the result. $\Box$

This result can be used to prove that the geometric mean is superadditive, i.e.

$\displaystyle \begin{array}{rcl} & &\{ \prod_{k=1}^n a_k \}^{1/n}+\{ \prod_{k=1}^n b_k \}^{1/n}\\ &=&\min\{\frac{1}{n} \sum_{k=1}^n a_k x_k:(x_1,\ldots, x_n)\in D \}+\min\{\frac{1}{n} \sum_{k=1}^n b_k x_k:(x_1,\ldots, x_n)\in D \}\\ &\leq& \min\{\frac{1}{n} \sum_{k=1}^n (a_k+b_k) x_k:(x_1,\ldots, x_n)\in D \}\\ &=& \{ \prod_{k=1}^n (a_k+b_k) \}^{1/n}. \end{array}$
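Both the minimum representation and the superadditivity it implies can be checked numerically (a sketch; the helper name `gm` and the sample vectors are my own):

```python
import math

def gm(v):
    """Unweighted geometric mean of a positive vector."""
    return math.prod(v) ** (1.0 / len(v))

a = [2.0, 3.0, 6.0]
b = [1.0, 5.0, 4.0]

# The minimizing point from Proposition 5: x_k = GM(a) / a_k lies in D
# (its entries multiply to 1) and attains the geometric mean.
x = [gm(a) / ak for ak in a]
assert abs(math.prod(x) - 1.0) < 1e-12
assert abs(sum(ak * xk for ak, xk in zip(a, x)) / len(a) - gm(a)) < 1e-12

# Superadditivity: GM(a) + GM(b) <= GM(a + b), summing componentwise.
assert gm(a) + gm(b) <= gm([ai + bi for ai, bi in zip(a, b)]) + 1e-12
```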

 Proposition 6 (More on the Method of Halves) $\displaystyle \frac{\sin x}{x}=\prod_{k=1}^\infty \cos(x/2^k).$

Proof: By ${\sin(x)=2\sin(x/2)\cos(x/2)}$,

$\displaystyle \frac{\sin x}{2 \sin(x/2)}\cdot\frac{\sin(x/2)}{2\sin(x/2^2)}\ldots \frac{\sin(x/2^{n-1})}{2\sin(x/2^n)}=\frac{\sin(x)}{2^n \sin(x/2^n)}=\prod_{k=1}^n \cos(x/2^k)$

So

$\displaystyle \prod_{k=1}^\infty \cos(x/2^k)=\lim_{n\rightarrow\infty} \frac{\sin(x)}{2^n \sin(x/2^n)}$

But

$\displaystyle \lim_{n\rightarrow \infty }\frac{\sin(x/2^n)}{1/2^n}=x.$

So we get the result. $\Box$
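Since ${2^n \sin(x/2^n)\rightarrow x}$ rapidly, the partial products converge quickly; a quick Python check at an arbitrarily chosen point:

```python
import math

x = 1.3  # arbitrary nonzero test point
partial = 1.0
for k in range(1, 30):
    partial *= math.cos(x / 2 ** k)  # partial product of cos(x/2^k)

assert abs(partial - math.sin(x) / x) < 1e-12
```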

Substituting ${x=\pi/2}$ into the identity and using the half-angle formula ${\cos(x/2)=\sqrt{\frac{1+\cos x}{2}}}$, we obtain Viète’s formula

$\displaystyle \frac{2}{\pi}=\frac{\sqrt{2}}{2}\cdot \frac{\sqrt{2+\sqrt{2}}}{2}\cdot \frac{\sqrt{2+\sqrt{2+\sqrt{2}}}}{2}\ldots$
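The nested radicals can be generated by the recursion ${r_{k+1}=\sqrt{2+r_k}}$ starting from ${r_1=\sqrt{2}}$; a short numerical check of the product:

```python
import math

# Partial products of the formula above: each factor is sqrt(2 + previous radical) / 2.
radical, product = 0.0, 1.0
for _ in range(20):
    radical = math.sqrt(2.0 + radical)  # sqrt(2), sqrt(2+sqrt(2)), ...
    product *= radical / 2.0

assert abs(product - 2.0 / math.pi) < 1e-10
```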

 Proposition 7 (A Niven-Zuckerman Lemma for ${p}$-th Powers) Consider a sequence of ${n}$-tuples of nonnegative real numbers ${(a_{1k}, a_{2k}, \cdots, a_{nk})}$, ${k=1, 2, \ldots}$. Suppose there is a ${\mu \geq 0}$ for which $\displaystyle a_{1k}+a_{2k}+\cdots+a_{nk}\rightarrow n\mu \textrm{ as }k\rightarrow \infty,$ and suppose for some ${1<p<\infty}$, $\displaystyle a_{1k}{}^p+a_{2k}{}^p+\cdots+a_{nk}{}^p\rightarrow n\mu^p \textrm{ as }k\rightarrow \infty.$ Then for all ${1 \leq j \leq n}$, $\displaystyle \lim_{k\rightarrow \infty}a_{jk}=\mu.$

Proof: By the power mean inequality, as ${p>1}$, for all ${k}$,

$\displaystyle (\sum_{i=1}^n \frac{a_{ik}{}^p}{n})^{1/p}\geq \sum_{i=1}^n \frac{a_{ik}}{n}.$

Then

$\displaystyle \mu=\lim_{k\rightarrow \infty}(\sum_{i=1}^n \frac{a_{ik}{}^p}{n})^{1/p}\geq \lim_{k\rightarrow \infty}\sum_{i=1}^n \frac{a_{ik}}{n}=\mu.$

Since ${a_{ik}\geq 0}$ and the sums ${a_{1k}+\cdots+a_{nk}}$ converge, each sequence ${(a_{ik})_k}$ is bounded. By Bolzano–Weierstrass, it therefore suffices to show that whenever ${a_{ik}\rightarrow c_i}$ for every ${i}$ along some subsequence, we must have ${c_i=\mu}$ for all ${i}$. Taking limits along such a subsequence gives

$\displaystyle (\sum_{i=1}^n \frac{c_i{}^p}{n})^{1/p}=\sum_{i=1}^n \frac{c_i}{n}=\mu.$

By the equality case of Theorem 3, this forces ${c_1=\ldots=c_n=c}$. But ${a_{1k}+\cdots+a_{nk}\rightarrow n\mu}$ as ${k \rightarrow \infty}$, so ${c=\mu}$. Since every coordinatewise-convergent subsequence has the same limit, the full sequences converge:

$\displaystyle \lim_{k\rightarrow \infty} a_{ik}=\mu \textrm{ for all }i.$

$\Box$