For any two vectors [latex]x,y \in \mathbb{R}^n[/latex], we have

[latex]\begin{align*} x^Ty &\leq ||x||_2\cdot ||y||_2. \\ \end{align*}[/latex]

The above inequality holds with equality if and only if one of the vectors [latex]x,y[/latex] is a non-negative scalar multiple of the other. In other words:

[latex]\begin{align*} (P) \quad \max\limits_{x: ||x||_2 \leq 1} x^Ty &= ||y||_2, \\ \end{align*}[/latex]

with optimal [latex]x[/latex] given by [latex]x^* = y/||y||_2[/latex] if [latex]y[/latex] is non-zero.
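For a concrete instance (an illustrative example, not part of the original text), take [latex]y = (3,4) \in \mathbb{R}^2[/latex], so that [latex]||y||_2 = 5[/latex]. The maximizer is [latex]x^* = y/||y||_2 = (3/5, 4/5)[/latex], and indeed

[latex]\begin{align*} (x^*)^Ty &= \tfrac{3}{5}\cdot 3 + \tfrac{4}{5}\cdot 4 = \tfrac{25}{5} = 5 = ||y||_2. \end{align*}[/latex]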

Proof: The inequality is trivial if either one of the vectors [latex]x,y[/latex] is zero. Let us assume both are non-zero. Without loss of generality, we may re-scale [latex]x[/latex] and assume it has unit Euclidean norm ([latex]||x||_2=1[/latex]). Let us first prove that

[latex]\begin{align*} x^Ty &\leq ||y||_2. \\ \end{align*}[/latex]

We consider the polynomial

[latex]\begin{align*} p(t) &= ||tx-y||_2^2 = t^2 -2t(x^Ty) +y^Ty. \\ \end{align*}[/latex]

Since it is non-negative for every value of [latex]t[/latex], its (quarter) discriminant [latex]\Delta = (x^Ty)^2-y^Ty[/latex] is non-positive, that is, [latex](x^Ty)^2 \leq y^Ty = ||y||_2^2[/latex]. Hence [latex]x^Ty \leq |x^Ty| \leq ||y||_2[/latex], which proves the claim for unit-norm [latex]x[/latex]; undoing the re-scaling of [latex]x[/latex] yields the Cauchy-Schwarz inequality in general.

The second result is proven as follows. Let [latex]v(P)[/latex] be the optimal value of the problem. The Cauchy-Schwarz inequality implies that [latex]v(P) \leq ||y||_2[/latex]. To prove that this value is attained (that is, equal to its upper bound), we observe that if [latex]x = y/||y||_2[/latex], then

[latex]\begin{align*} x^Ty &= \frac{y^Ty}{||y||_2} = ||y||_2. \\ \end{align*}[/latex]

The vector [latex]x= y/||y||_2[/latex] is feasible for the optimization problem [latex](P)[/latex]. This establishes a lower bound on the value of [latex](P)[/latex], [latex]v(P)[/latex]:

[latex]\begin{align*} ||y||_2 &\leq v(P) = \max\limits_{x: ||x||_2 \leq 1} x^Ty. \end{align*}[/latex]
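As a quick sanity check (a minimal sketch in plain Python, not part of the original text), both the inequality and the claimed maximizer [latex]x^* = y/||y||_2[/latex] can be verified numerically on random vectors:

```python
import math
import random

def dot(x, y):
    # Inner product x^T y of two vectors given as lists.
    return sum(a * b for a, b in zip(x, y))

def norm2(x):
    # Euclidean norm ||x||_2.
    return math.sqrt(dot(x, x))

random.seed(0)

# Cauchy-Schwarz: x^T y <= ||x||_2 * ||y||_2 for random vectors
# (small tolerance added for floating-point round-off).
for _ in range(1000):
    x = [random.uniform(-1.0, 1.0) for _ in range(5)]
    y = [random.uniform(-1.0, 1.0) for _ in range(5)]
    assert dot(x, y) <= norm2(x) * norm2(y) + 1e-12

# The maximizer x* = y / ||y||_2 attains the upper bound ||y||_2.
y = [3.0, 4.0]
x_star = [yi / norm2(y) for yi in y]
assert abs(dot(x_star, y) - norm2(y)) < 1e-12
```

The assertions confirm that the feasible point [latex]x^* = y/||y||_2[/latex] achieves the value [latex]||y||_2[/latex], matching the upper bound from the inequality.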

License


This work (Đại số tuyến tính [Linear Algebra] by Tony Tin) is free of known copyright restrictions.