Cauchy-Schwarz Inequality

For any two vectors x,y \in \mathbb{R}^n, we have

x^Ty \leq ||x||_2\cdot ||y||_2.

The above inequality holds with equality if and only if x and y are collinear and point in the same direction, that is, one is a non-negative multiple of the other. In other words:

(P) \quad \max\limits_{x: ||x||_2 \leq 1} x^Ty = ||y||_2,

with optimal x given by x^* = y/||y||_2 if y is non-zero.
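As a quick numerical illustration (a sketch, not part of the text: the example vectors below are arbitrary choices), we can check both the inequality and the claimed optimal point:

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm2(x):
    return math.sqrt(dot(x, x))

x = [1.0, 2.0, -1.0]   # arbitrary example vectors
y = [0.5, -3.0, 4.0]

# Cauchy-Schwarz: x^T y <= ||x||_2 * ||y||_2
assert dot(x, y) <= norm2(x) * norm2(y)

# Claimed maximizer of (P): x* = y / ||y||_2 is feasible (unit norm)
# and attains the value ||y||_2.
x_star = [yi / norm2(y) for yi in y]
assert math.isclose(norm2(x_star), 1.0)
assert math.isclose(dot(x_star, y), norm2(y))
```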

Proof: The inequality is trivial if either one of the vectors x,y is zero. Let us assume both are non-zero. Without loss of generality, we may re-scale x and assume it has unit Euclidean norm (||x||_2=1). Let us first prove that

x^Ty \leq ||y||_2.

We consider the polynomial

p(t) = ||tx-y||_2^2 = t^2 -2t(x^Ty) +y^Ty.

Since p(t) is non-negative for every value of t, its discriminant \Delta = (x^Ty)^2 - y^Ty is non-positive, that is, (x^Ty)^2 \leq ||y||_2^2. Taking square roots gives x^Ty \leq |x^Ty| \leq ||y||_2, and re-scaling x recovers the general Cauchy-Schwarz inequality.

The second result is proven as follows. Let v(P) denote the optimal value of the problem, and assume y is non-zero (otherwise the result is trivial). The Cauchy-Schwarz inequality implies that v(P) \leq ||y||_2. To prove that this upper bound is attained, we observe that if x = y/||y||_2, then

x^Ty = \frac{y^Ty}{||y||_2} = ||y||_2.

The vector x= y/||y||_2 is feasible for the optimization problem (P). This establishes a lower bound on the value of (P), v(P):

||y||_2 \leq v(P) = \max\limits_{x: ||x||_2 \leq 1} x^Ty.

Combining the lower and upper bounds yields v(P) = ||y||_2, as claimed.
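The two bounds can also be probed numerically (a sketch under an arbitrary choice of y, not part of the proof): random feasible points of (P) never exceed ||y||_2, while x* = y/||y||_2 attains it exactly.

```python
import math
import random

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm2(x):
    return math.sqrt(dot(x, x))

random.seed(1)
y = [3.0, -1.0, 2.0]  # arbitrary non-zero example vector

# Sample random points and project them into the unit ball,
# so each sample is feasible for (P): ||x||_2 <= 1.
best = -float("inf")
for _ in range(10000):
    x = [random.gauss(0, 1) for _ in range(3)]
    scale = max(norm2(x), 1.0)
    x = [xi / scale for xi in x]
    best = max(best, dot(x, y))

# No feasible sample exceeds the upper bound ||y||_2 ...
assert best <= norm2(y)

# ... and x* = y / ||y||_2 attains the bound.
x_star = [yi / norm2(y) for yi in y]
assert math.isclose(dot(x_star, y), norm2(y))
```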

License

Hyper-Textbook: Optimization Models and Applications Copyright © by L. El Ghaoui. All Rights Reserved.