(Nov 28, 2013)

A (constrained) polynomial minimization problem is of the form

$$(P) \qquad f^\ast := \inf_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad g_1(x) \ge 0, \ldots, g_m(x) \ge 0,$$

where $f, g_1, \ldots, g_m$ are polynomials in $x = (x_1, \ldots, x_n)$ with real coefficients.

If $m = 1$ and $g_1 \equiv 1$ (so that the constraint is vacuous), then the above problem becomes $f^\ast = \inf_{x \in \mathbb{R}^n} f(x)$, an unconstrained polynomial minimization problem.

The purpose of writing this series is to introduce the basic principles related to this kind of problem. Before we begin, the prerequisites are:

2. Undergraduate level abstract algebra

3. Undergraduate level real analysis

4. Knowledge about linear programming and semidefinite programming is a plus

We collect all points in $\mathbb{R}^n$ that satisfy the constraints of $(P)$ into the set

$$K := \{x \in \mathbb{R}^n : g_1(x) \ge 0, \ldots, g_m(x) \ge 0\}.$$

This set is called a *basic closed semialgebraic set*. Abstract algebra is employed to study the geometry of $K$. The field that does this is called *real algebraic geometry*.

There is a duality between nonnegative polynomials and a special type of Borel measures on some Euclidean space. This suggests a relation between solving $(P)$ and measure theory.

Now let’s look at some examples of $(P)$.

**Example 1 (Linear Programming).** When the objective function and all constraints are linear polynomials, $(P)$ is a linear programming (LP) problem, which can be expressed as

$$\inf_{x \in \mathbb{R}^n} c^{\mathsf T} x \quad \text{subject to} \quad Ax \ge b,$$

where $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$ and $c \in \mathbb{R}^n$.

If we add the constraints $x_i^2 = x_i$ (equivalently, $x_i \in \{0, 1\}$) for all $i = 1, \ldots, n$ to the LP, we get an (NP-hard) 0/1 LP problem.

**Example 2 (Combinatorial Optimization).** Some well-known problems in combinatorial optimization are also polynomial optimization problems. They include the maximum stable set problem and the maximum cut problem.

**Example 3 (Partition Problem).** This is a popular NP-complete problem: given a sequence $a_1, \ldots, a_n$ of positive integers, is there $\varepsilon \in \{\pm 1\}^n$ such that $\sum_{i=1}^n \varepsilon_i a_i = 0$?

If $f^\ast = 0$ in the unconstrained polynomial minimization problem $\inf_{x \in \mathbb{R}^n} f(x)$ where $f(x) = \left(\sum_{i=1}^n a_i x_i\right)^2 + \sum_{i=1}^n (x_i^2 - 1)^2$, then the sequence can be partitioned into two blocks whose sums are equal.
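As a quick sanity check of this encoding, here is a small Python sketch (the helper names are ours, and the brute-force search is only sensible for tiny $n$): the objective vanishes at a sign vector exactly when that vector splits the sequence into two blocks with equal sums.

```python
from itertools import product

def partition_objective(a, x):
    """f(x) = (sum_i a_i x_i)^2 + sum_i (x_i^2 - 1)^2.
    The second sum forces x into {-1, 1}^n at any zero of f."""
    return sum(ai * xi for ai, xi in zip(a, x)) ** 2 + \
           sum((xi * xi - 1) ** 2 for xi in x)

def has_equal_partition(a):
    """Brute force: does f vanish at some sign vector in {-1, 1}^n?"""
    return any(partition_objective(a, x) == 0
               for x in product((-1, 1), repeat=len(a)))
```

For instance, `has_equal_partition([1, 2, 3])` holds because $1 + 2 = 3$, while `has_equal_partition([1, 2, 4])` fails, matching the fact that the minimum of $f$ is $0$ in the first case and strictly positive in the second.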

**Example 4 (Quadratically constrained quadratic programming).** If the objective and constraints are quadratic polynomials, then $(P)$ is a quadratically constrained quadratic program (QCQP). This type of problem also interests us; we hope to get to it later.

Polynomial optimization is NP-hard in general. One approach to tackle it is to consider relaxations which are easy to solve. Solving the relaxations yields lower bounds on the optimal value of the original problem. Sometimes these lower bounds are close to, or even equal to, the optimal value.

To begin with, notice that since the infimum is the greatest lower bound, $(P)$ is equivalent to solving

$$(P') \qquad f^\ast = \sup\,\{\lambda \in \mathbb{R} : f - \lambda \ge 0 \text{ on } K\}.$$

In other words, we are looking for the maximum $\lambda$ such that $f - \lambda$ is nonnegative on $K$. The next thing we can ask is whether the problem is feasible, i.e. is there a $\lambda$ such that $f - \lambda$ is nonnegative on $K$? Or, put another way: what are the nonnegative polynomials on $K$?

If $K = \mathbb{R}^n$, there is an answer by David Hilbert from the late 19th century. To describe his result we need some notation and a few definitions.

(1) $\mathbb{N} :=$ the set of nonnegative integers.

(2) For $\alpha \in \mathbb{N}^n$, set $|\alpha| := \alpha_1 + \cdots + \alpha_n$.

(3) For $x = (x_1, \ldots, x_n)$ and $\alpha \in \mathbb{N}^n$, set $x^\alpha := x_1^{\alpha_1} \cdots x_n^{\alpha_n}$.

(4) Set $\mathbb{R}[x] := \mathbb{R}[x_1, \ldots, x_n]$, the ring of multivariate polynomials in $n$ variables with real coefficients. A monomial in $\mathbb{R}[x]$ is of the form $x^\alpha$, whose degree is $|\alpha|$. Any $p \in \mathbb{R}[x]$ is of the form $p = \sum_{\alpha \in \mathbb{N}^n} p_\alpha x^\alpha$ where only finitely many $p_\alpha$'s are nonzero. When $p_\alpha \ne 0$, $p_\alpha x^\alpha$ is called a *term* of $p$. The *degree* of $p$ is given by $\deg p := \max\{|\alpha| : p_\alpha \ne 0\}$.

(5) Set $\mathbb{R}[x]_d :=$ the set of polynomials of degree at most $d$.

**Definition 1.** A polynomial $p \in \mathbb{R}[x]$ is said to be *a sum of squares of polynomials* (abbreviation: $p$ is SOS) if

$$p = \sum_{j=1}^k q_j^2$$

for some $q_1, \ldots, q_k \in \mathbb{R}[x]$.

(6) Set $\mathcal{P}_n := \{p \in \mathbb{R}[x] : p \ge 0 \text{ on } \mathbb{R}^n\}$ and $\mathcal{P}_{n,d} := \mathcal{P}_n \cap \mathbb{R}[x]_d$.

(7) Set $\Sigma_n := \{p \in \mathbb{R}[x] : p \text{ is SOS}\}$ and $\Sigma_{n,d} := \Sigma_n \cap \mathbb{R}[x]_d$.

It is straightforward that $\Sigma_n \subseteq \mathcal{P}_n$ and $\Sigma_{n,d} \subseteq \mathcal{P}_{n,d}$. How about the reverse inclusion? Are all nonnegative polynomials on $\mathbb{R}^n$ sums of squares?

First let us look at what we can prove.

**Proposition 1.** Any univariate (i.e. $n = 1$) polynomial that is nonnegative on $\mathbb{R}$ is a sum of two squares.

*Proof.* Let $p$ be a nonnegative polynomial on $\mathbb{R}$. Then the coefficient of its highest degree term must be positive. The roots of $p$ are either real with even multiplicity or appear in complex conjugate pairs. So

$$p(x) = c \prod_i (x - r_i)^{2 m_i} \prod_j \left( (x - a_j)^2 + b_j^2 \right)$$

for some $c > 0$, $r_i, a_j, b_j \in \mathbb{R}$, and $m_i \in \mathbb{N}$. It follows that $p$ is SOS. Using the identity

$$(a^2 + b^2)(c^2 + d^2) = (ac - bd)^2 + (ad + bc)^2$$

(which follows from evaluating both sides, or from a very basic property of complex numbers, namely $|zw| = |z|\,|w|$), we see that $\prod_j \left((x - a_j)^2 + b_j^2\right) = u^2 + v^2$ is a sum of two squares. Since $c \prod_i (x - r_i)^{2m_i} = h^2$ is just a square of some polynomial $h = \sqrt{c} \prod_i (x - r_i)^{m_i}$, $p = (hu)^2 + (hv)^2$ is seen to be a sum of two squares.
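The proof is constructive and can be carried out numerically for a concrete polynomial. Below is a NumPy sketch for the example $p(x) = x^4 + 1$ (chosen by us; it has only simple complex roots, which keeps the root-pairing step clean): collecting one root from each conjugate pair into a complex polynomial $q$ gives $p = q\bar{q}$ on $\mathbb{R}$, so the real and imaginary parts of $q$ provide the two squares.

```python
import numpy as np

# p(x) = x^4 + 1 is positive on R.  Pair the roots: take one root from each
# complex conjugate pair (say, those with positive imaginary part) to form a
# complex polynomial q; then p = q * conj(q), i.e. p = (Re q)^2 + (Im q)^2.
p = np.polynomial.Polynomial([1, 0, 0, 0, 1])   # 1 + x^4
roots = p.roots()
upper = roots[roots.imag > 0]                   # one root per conjugate pair
q = np.polynomial.Polynomial.fromroots(upper)   # monic, complex coefficients
# the leading coefficient of p is 1, so no rescaling by sqrt(c) is needed
q1 = np.polynomial.Polynomial(q.coef.real)      # Re q, here x^2 - 1
q2 = np.polynomial.Polynomial(q.coef.imag)      # Im q, here -sqrt(2) x

xs = np.linspace(-2.0, 2.0, 41)
assert np.allclose(p(xs), q1(xs) ** 2 + q2(xs) ** 2)
```

Indeed $x^4 + 1 = (x^2 - 1)^2 + 2x^2$, a sum of two squares as the proposition promises.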

(Dec 2, 2013)

To prove the next proposition we need the notion of homogenization.

**Definition 2.** A polynomial is said to be *homogeneous* (or a *form*) if all its terms have the same degree. If $p \in \mathbb{R}[x]$ is of degree $d$, write $p = \sum_{|\alpha| \le d} p_\alpha x^\alpha$; then its *homogenization* is the polynomial $\tilde{p} \in \mathbb{R}[x_1, \ldots, x_n, x_{n+1}]$ defined by

$$\tilde{p}(x, x_{n+1}) := \sum_{|\alpha| \le d} p_\alpha x^\alpha x_{n+1}^{d - |\alpha|}.$$

This is indeed the numerator of $p\!\left(\frac{x_1}{x_{n+1}}, \ldots, \frac{x_n}{x_{n+1}}\right)$ after clearing denominators: $\tilde{p}(x, x_{n+1}) = x_{n+1}^d\, p\!\left(\frac{x}{x_{n+1}}\right)$.
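The definition is easy to make concrete. Here is a minimal Python sketch (the dict-based polynomial representation and the example polynomial are our own choices): a polynomial is stored as a map from exponent tuples $\alpha$ to coefficients $p_\alpha$, and homogenization pads each monomial with the power $d - |\alpha|$ of the new variable.

```python
def homogenize(p):
    """Homogenize a polynomial given as a dict mapping exponent tuples
    (a_1, ..., a_n) to real coefficients.  Each monomial x^a is multiplied
    by x_{n+1}^(d - |a|), where d = deg p, so the result is a dict over
    exponent tuples of length n + 1 whose entries all have total degree d."""
    d = max(sum(a) for a in p)
    return {a + (d - sum(a),): c for a, c in p.items()}

# Example: p(x, y) = x^2 y + x + 3, a degree-3 polynomial in 2 variables.
p = {(2, 1): 1.0, (1, 0): 1.0, (0, 0): 3.0}
p_tilde = homogenize(p)
# p_tilde(x, y, z) = x^2 y + x z^2 + 3 z^3: every monomial has degree 3.
assert all(sum(a) == 3 for a in p_tilde)
```

Substituting $z = 1$ recovers $p$, mirroring the fact $p(x) = \tilde{p}(x, 1)$ used in Lemma 3 below.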

**Lemma 2.** If $p$ is SOS, then $\deg p$ is even. Also, any decomposition $p = \sum_{j=1}^k q_j^2$ where $q_j \in \mathbb{R}[x]$ satisfies $\deg q_j \le \frac{\deg p}{2}$ for all $j$.

*Proof.* Assume $p$ is SOS. Then $p \ge 0$ on $\mathbb{R}^n$, so $\deg p$ is even, say $\deg p = 2d$. Write $p = \sum_{j=1}^k q_j^2$ where $q_j \in \mathbb{R}[x]$, and set $t := \max_j \deg q_j$. If $t \le d$ then we are done. Suppose $t > d$. Then decompose each $q_j = h_j + r_j$, where $h_j$ is the homogeneous part of $q_j$ of degree $t$ (possibly zero) and $\deg r_j < t$. Thus

$$p - \sum_{j=1}^k \left( 2 h_j r_j + r_j^2 \right) = \sum_{j=1}^k h_j^2.$$

The right hand side, $\sum_j h_j^2$, is a nonzero homogeneous polynomial of degree $2t$ (the $h_j$ are not all zero, and squares cannot cancel each other), while the left hand side is a polynomial of degree less than $2t$, because $\deg p = 2d < 2t$, and $\deg\left(2 h_j r_j + r_j^2\right) < 2t$ for all $j$. Contradiction.

**Lemma 3.** Given a polynomial $p \in \mathbb{R}[x]$ of degree $d$, let $\tilde{p}$ be its homogenization. Then

(1) $p \ge 0$ on $\mathbb{R}^n$ if and only if $\tilde{p} \ge 0$ on $\mathbb{R}^{n+1}$;

(2) $p$ is SOS if and only if $\tilde{p}$ is SOS.

*Proof.* For (1) and (2), the “if” part follows from the fact $p(x) = \tilde{p}(x, 1)$. Conversely, for (1), if $p \ge 0$ on $\mathbb{R}^n$ then $d$ is even and so $\tilde{p}(x, x_{n+1}) = x_{n+1}^d\, p(x / x_{n+1}) \ge 0$ whenever $x_{n+1} \ne 0$. Letting $x_{n+1} \to 0$ and using the continuity of $\tilde{p}$, $\tilde{p}(x, 0) \ge 0$. So $\tilde{p} \ge 0$ on $\mathbb{R}^{n+1}$. For (2), if $p = \sum_{j=1}^k q_j^2$ where $q_j \in \mathbb{R}[x]$, then $p$ is nonnegative, thus $d$ is even, and hence

$$\tilde{p}(x, x_{n+1}) = x_{n+1}^d\, p\!\left(\frac{x}{x_{n+1}}\right) = \sum_{j=1}^k \left( x_{n+1}^{d/2}\, q_j\!\left(\frac{x}{x_{n+1}}\right) \right)^2.$$

By the previous lemma, $\deg q_j \le d/2$ for all $j$, so each $x_{n+1}^{d/2}\, q_j(x / x_{n+1})$ is a polynomial and $\tilde{p}$ is SOS.

**Proposition 4.** Any nonnegative quadratic polynomial (i.e. $d = 2$ and $n$ is arbitrary) is a sum of squares.

*Proof.* Let $p$ be of the form $p(x) = x^{\mathsf T} A x + 2 b^{\mathsf T} x + c$, where $A$ is a symmetric $n \times n$ matrix, $b \in \mathbb{R}^n$ and $c \in \mathbb{R}$. Its homogenization is $\tilde{p}(x, x_{n+1}) = y^{\mathsf T} \tilde{A}\, y$, where $y = (x, x_{n+1})$ and $\tilde{A} = \begin{pmatrix} A & b \\ b^{\mathsf T} & c \end{pmatrix}$. Using Lemma 3 we know $\tilde{p} \ge 0$ on $\mathbb{R}^{n+1}$, so $\tilde{A}$ is positive semidefinite. As $\tilde{A}$ is symmetric positive semidefinite, it follows from the eigenvalue decomposition with nonnegative eigenvalues that one can write $\tilde{A} = L^{\mathsf T} L$ for some $L \in \mathbb{R}^{r \times (n+1)}$. Thus $\tilde{p}(y) = \|L y\|^2 = \sum_{i=1}^r (L_i y)^2$, where $L_i$ is the $i$-th row of $L$, is SOS, and by Lemma 3 again, $p$ is SOS too.

Keeping track of the proof we see that $p$ above is a sum of $r$ squares, where $r$ is the rank of $\tilde{A}$.
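The eigendecomposition argument is directly computable. Here is a NumPy sketch for a concrete quadratic of our own choosing, $p(x_1, x_2) = (x_1 + x_2 + 1)^2$ written in the $x^{\mathsf T} A x + 2 b^{\mathsf T} x + c$ form: we build $\tilde{A}$, check it is PSD, factor $\tilde{A} = L^{\mathsf T} L$, and verify that the resulting sum of squares of linear forms agrees with $p$.

```python
import numpy as np

# p(x) = x^T A x + 2 b^T x + c with extended matrix A_tilde = [[A, b], [b^T, c]].
# An eigendecomposition A_tilde = sum_i lam_i v_i v_i^T with lam_i >= 0 gives
# p(x) = sum_i (sqrt(lam_i) v_i . y)^2 with y = (x, 1).
A = np.array([[1.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0])
c = 1.0
A_tilde = np.block([[A, b[:, None]], [b[None, :], np.array([[c]])]])

lam, V = np.linalg.eigh(A_tilde)            # symmetric => real eigenvalues
assert lam.min() > -1e-9                    # PSD, as the proof requires
L = np.sqrt(np.clip(lam, 0.0, None))[:, None] * V.T   # A_tilde = L^T L

def p(x):
    return x @ A @ x + 2 * b @ x + c

def p_sos(x):
    y = np.append(x, 1.0)                   # homogenizing coordinate set to 1
    return float(np.sum((L @ y) ** 2))      # sum of squares of linear forms

x0 = np.array([0.3, -1.7])
assert np.isclose(p(x0), p_sos(x0))
```

Here $\tilde{A}$ has rank 1, so, as noted above, $p$ is in fact a single square.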

David Hilbert (1888) also proved that any nonnegative polynomial in $2$ variables with degree $4$ is a sum of three squares, together with this complete classification (we omit the proofs):

**Theorem 5 (Hilbert’s Theorem).** For even $d$, one has

$$\mathcal{P}_{n,d} = \Sigma_{n,d}$$

if and only if $n = 1$, or $d = 2$, or $(n, d) = (2, 4)$.

There is a popular example of a nonnegative polynomial which is NOT SOS.

**Example 5.** T. Motzkin (1967) published the first explicit example of a non-SOS nonnegative polynomial, which is the following bivariate sextic ($n = 2$, $d = 6$) polynomial:

$$M(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1.$$

We see that $M(x, 0) = M(0, y) = 1 > 0$ on the coordinate axes. In fact it is always nonnegative, because of the following (AM–GM) inequality:

$$\frac{x^4 y^2 + x^2 y^4 + 1}{3} \ge \sqrt[3]{x^4 y^2 \cdot x^2 y^4 \cdot 1} = x^2 y^2.$$

To show that $M$ is not SOS, we first assume $M = \sum_{j=1}^k q_j^2$ where $q_j \in \mathbb{R}[x, y]$. Note that $M(x, 0) = 1$, thus $\sum_j q_j(x, 0)^2 = 1$. This means the $q_j(x, 0)$’s (and, by the same argument, the $q_j(0, y)$’s) are constant polynomials. So each $q_j$ has no nonconstant term involving only $x$ or only $y$. But we know each $q_j$ has degree at most 3 (by Lemma 2). Thus we may write

$$q_j = a_j + b_j\, xy + c_j\, x^2 y + d_j\, x y^2.$$

Comparing the coefficients of $x^2 y^2$ in $M = \sum_j q_j^2$, we get

$$-3 = \sum_{j=1}^k b_j^2,$$

which is clearly impossible. So $M$ is not SOS.

Then, by Lemma 3, the *homogeneous Motzkin form*

$$\tilde{M}(x, y, z) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 z^2 + z^6$$

is also a non-SOS nonnegative polynomial.
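As a quick numerical check of the AM–GM argument (not a proof, just a grid evaluation of our own), the Motzkin polynomial stays nonnegative over a sample region, and its minimum value $0$ is attained at $(\pm 1, \pm 1)$:

```python
import numpy as np

def motzkin(x, y):
    """Motzkin polynomial M(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1."""
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 + 1

# AM-GM gives M >= 0 everywhere; equality x^4 y^2 = x^2 y^4 = 1 forces
# |x| = |y| = 1, so the minimum value 0 is attained exactly at (+-1, +-1).
xs, ys = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
vals = motzkin(xs, ys)
assert vals.min() >= -1e-12
assert np.isclose(motzkin(1.0, 1.0), 0.0)
```

Note that non-SOS-ness cannot be seen from sampled values alone; it is the coefficient argument above that rules out a decomposition.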

Other people have also constructed examples of non-SOS nonnegative polynomials, for example, Raphael M. Robinson (slightly after Motzkin), Man-Duen Choi and Tsit-Yuen Lam (1976–77), Anneli and Peter Lax (1978) (cf. the 1971 International Mathematical Olympiad), Konrad Schmüdgen (1979), etc. (Reference: Reznick, Some Concrete Aspects of Hilbert’s 17th Problem)

**Example 6.** An SOS decomposition is in general non-unique. For instance,

$$x^4 + 2x^2 + 1 = (x^2 + 1)^2 = (x^2)^2 + x^2 + x^2 + 1^2.$$

In this session we discussed a representation result about nonnegative polynomials on $\mathbb{R}^n$. Then, how can one compute an SOS decomposition effectively? Also, what about nonnegative polynomials on a general $K$? Is there any representation result? We will take this up next time, and the discussion will become more algebraic.

(End)