
Solution to Mathematics for Machine Learning Exercise 4.12


Solution: Let $A=U\Sigma V^T$ be a singular value decomposition (SVD) of $A$, with singular values $\sigma_1\geqslant\sigma_2\geqslant\cdots\geqslant\sigma_n\geqslant 0$. Let $y=V^T x$. Since $V$ is orthogonal, \[\|y\|_2^2=y^Ty=(V^Tx)^TV^Tx=x^TVV^Tx=x^Tx=\|x\|_2^2.\]Then, using $A^TA=V\Sigma^T\Sigma V^T$, we have\begin{align*}\|A x\|_2^2=&\ (Ax)^T(Ax)=x^TA^TAx\\ = &\ x^T V \begin{bmatrix}\sigma_1^2 & 0 & 0\\ 0 & \ddots & 0\\ 0 & 0 &\sigma_n^2\end{bmatrix}V^Tx \\ =&\ y^T \begin{bmatrix}\sigma_1^2 & 0 & 0\\ 0 & \ddots & 0\\ 0 & 0 &\sigma_n^2\end{bmatrix} y \\ = &\ \sigma_1^2 y_1^2+\cdots+\sigma_n^2y_n^2 \\ \leqslant &\ \sigma_1^2 y_1^2+\cdots+\sigma_1^2y_n^2\\ =&\ \sigma_1^2(y_1^2+\cdots+y_n^2)\\ =&\ \sigma_1^2 \|y\|_2^2= \sigma_1^2 \|x\|_2^2.\end{align*}Taking square roots gives \[\max_{x\ne 0}\frac{\|Ax\|_2}{\|x\|_2}\leqslant \sigma_1.\]Moreover, the bound is attained: setting $$x=V(1,0,\dots,0)^T,$$i.e. the first right singular vector, we get $y=(1,0,\dots,0)^T$, so $$\|A x\|_2^2=\sigma_1^2 y_1^2=\sigma_1^2,\quad \|x\|_2=\|y\|_2=1.$$Hence \[\max_{x\ne 0}\frac{\|Ax\|_2}{\|x\|_2}= \sigma_1,\]which proves Theorem 4.24.
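As a sanity check, the identity $\max_{x\ne 0}\|Ax\|_2/\|x\|_2=\sigma_1$ can be verified numerically. The sketch below uses NumPy on a random $5\times 3$ matrix; the shape, seed, and number of random directions are arbitrary choices for illustration, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# sigma_1: the largest singular value (np.linalg.svd returns them in
# descending order)
sigma1 = np.linalg.svd(A, compute_uv=False)[0]

# Sample many random directions x and compute ||Ax|| / ||x|| for each;
# by the proof above, every ratio is bounded by sigma_1
xs = rng.standard_normal((3, 10_000))
ratios = np.linalg.norm(A @ xs, axis=0) / np.linalg.norm(xs, axis=0)
assert ratios.max() <= sigma1 + 1e-12

# The bound is attained at x = V(1,0,...,0)^T, the first right singular
# vector, i.e. the first row of V^T
_, _, Vt = np.linalg.svd(A)
x_star = Vt[0]
assert np.isclose(np.linalg.norm(A @ x_star), sigma1)
```

The random sampling only approaches $\sigma_1$ from below; the last two lines confirm that the specific choice $x=v_1$ attains it exactly (up to floating-point tolerance).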
