1. Solution. A quick calculation shows that $T^*Tv = ||x||^2\langle v, u \rangle u$ for every $v \in V$. The map $R \in \mathcal{L}(V)$ defined by

$$ Rv = \frac{||x||}{||u||}\langle v, u \rangle u $$ is a square root of $T^*T$; indeed, $R^2v = \frac{||x||^2}{||u||^2}\langle v, u \rangle ||u||^2 u = ||x||^2\langle v, u \rangle u = T^*Tv$. Moreover, $\langle Rv, v \rangle = \frac{||x||}{||u||}|\langle v, u \rangle|^2 \ge 0$ for all $v \in V$. It remains to prove that $R$ is self-adjoint. Let $e_1, \dots, e_n$ be an orthonormal basis of $V$. We can write

$$ u = a_1 e_1 + \dots + a_n e_n $$ for some $a_1, \dots, a_n \in \mathbb{F}$. Note that

$$ Re_j = \frac{||x||}{||u||}\langle e_j, u \rangle u = \frac{||x||}{||u||}(a_1 \overline{a_j} e_1 + \dots + a_n \overline{a_j} e_n), $$ since $\langle e_j, u \rangle = \overline{a_j}$. Therefore, the matrix of $R$ with respect to the basis $e_1, \dots, e_n$ has entries given by

$$ \mathcal{M}(R)_{j, k} = \frac{||x||}{||u||}a_j \overline{a_k}. $$ Thus $\mathcal{M}(R)_{j, k} = \overline{\mathcal{M}(R)_{k, j}}$, that is, $\mathcal{M}(R) = \mathcal{M}(R^*)$. Hence $R$ is self-adjoint.
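
As a sanity check, the algebra above can be verified numerically. The sketch below is plain Python with arbitrary hypothetical sample vectors $u, x \in \mathbb{C}^2$, and it assumes the exercise's map is $Tv = \langle v, u \rangle x$; it confirms that $R^2 = T^*T$, that $R$ is self-adjoint, and that $\langle Rv, v \rangle \ge 0$.

```python
from math import sqrt

def inner(a, b):                      # <a, b> = sum_i a_i * conj(b_i)
    return sum(p * q.conjugate() for p, q in zip(a, b))

def norm(a):
    return sqrt(inner(a, a).real)

u = [1 + 2j, 3 - 1j]                  # hypothetical sample vectors
x = [2 - 1j, 1 + 1j]

T      = lambda v: [inner(v, u) * xi for xi in x]        # Tv = <v, u> x (assumed)
T_star = lambda w: [inner(w, x) * ui for ui in u]        # T*w = <w, x> u
R      = lambda v: [norm(x) / norm(u) * inner(v, u) * ui for ui in u]

v = [0.5 + 1j, -2 + 0.25j]
w = [1 - 1j, 2 + 3j]

# R is a square root of T*T
assert all(abs(a - b) < 1e-9 for a, b in zip(R(R(v)), T_star(T(v))))
# R is self-adjoint: <Rv, w> = <v, Rw>
assert abs(inner(R(v), w) - inner(v, R(w))) < 1e-9
# R is positive: <Rv, v> is real and nonnegative
assert abs(inner(R(v), v).imag) < 1e-9 and inner(R(v), v).real >= 0
```

The check passes for any choice of $u \neq 0$ and $x$, up to floating-point rounding.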

2. Solution. Define $T \in \mathcal{L}(\mathbb{C}^2)$ by

$$ \begin{aligned} Te_1 &= 0\\ Te_2 &= 5e_1 \end{aligned} $$ where $e_1, e_2$ is the standard basis of $\mathbb{C}^2$. Then the matrix of $T$ with respect to this basis is $$ \mathcal{M}(T) = \begin{pmatrix}0 & 5\\0 & 0\end{pmatrix}, $$ which is upper triangular. Thus $0$ is the only eigenvalue of $T$. We have

$$ \mathcal{M}(T^*T) = \mathcal{M}(T^*)\mathcal{M}(T) = \begin{pmatrix}0 & 0\\5 & 0\end{pmatrix}\begin{pmatrix}0 & 5\\0 & 0\end{pmatrix} = \begin{pmatrix}0 & 0\\0 & 25\end{pmatrix}. $$ Hence the eigenvalues of $T^*T$ are $0$ and $25$. By 7.52, the singular values of $T$ are $0$ and $5$.
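
For readers who want to double-check the arithmetic, here is a short plain-Python verification of the matrix product above and of the resulting singular values (via 7.52):

```python
from math import sqrt

MT  = [[0, 5],
       [0, 0]]
MTs = [[MT[j][i] for j in range(2)] for i in range(2)]   # M(T*) = conjugate transpose (entries are real)
MTsT = [[sum(MTs[i][k] * MT[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert MTsT == [[0, 0], [0, 25]]

# T*T is diagonal, so its eigenvalues are the diagonal entries, and the
# singular values of T are their nonnegative square roots (7.52)
singular_values = sorted(sqrt(MTsT[i][i]) for i in range(2))
assert singular_values == [0.0, 5.0]
```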

3. Solution. By the Polar Decomposition (7.45), there exists an isometry $S \in \mathcal{L}(V)$ such that $T^* = S\sqrt{TT^*}$. Taking the adjoint of each side, we get

$$ T = (S\sqrt{TT^*})^* = (\sqrt{TT^*})^*S^* = \sqrt{TT^*}S^*, $$ where the last equality follows because $\sqrt{TT^*}$ is self-adjoint. This yields the desired result, because $S^*$ is also an isometry.

4. Solution. Let $v \in V$ be an eigenvector of $\sqrt{T^*T}$ with $||v|| = 1$ corresponding to $s$ and let $S \in \mathcal{L}(V)$ be an isometry such that $T = S\sqrt{T^*T}$. Then $$ ||Tv|| = ||S\sqrt{T^*T}v|| = ||\sqrt{T^*T}v|| = |s|\:||v|| = |s| = s, $$ where the last equality follows because $\sqrt{T^*T}$ is positive.

5. Solution. We have

$$ \mathcal{M}(T^*T) = \mathcal{M}(T^*)\mathcal{M}(T) = \begin{pmatrix}0 & 1\\-4 & 0\end{pmatrix}\begin{pmatrix}0 & -4\\1 & 0\end{pmatrix} = \begin{pmatrix}1 & 0\\0 & 16\end{pmatrix}. $$ Therefore, the singular values of $T$ are $1$ and $4$.
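
Again the arithmetic can be verified in plain Python, using the matrix of $T$ given in the computation above:

```python
from math import sqrt

MT  = [[0, -4],
       [1, 0]]
MTs = [[MT[j][i] for j in range(2)] for i in range(2)]   # M(T*): transpose (real entries)
MTsT = [[sum(MTs[i][k] * MT[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert MTsT == [[1, 0], [0, 16]]

# T*T is diagonal, so the singular values are the nonnegative square roots
# of its diagonal entries (7.52)
singular_values = sorted(sqrt(MTsT[i][i]) for i in range(2))
assert singular_values == [1.0, 4.0]
```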

6. Solution. We will use the orthonormal basis $\sqrt{\frac{1}{2}},\ \sqrt{\frac{3}{2}}x,\ \sqrt{\frac{45}{8}}\left(x^2 - \frac{1}{3}\right)$ of $\mathcal{P}_2(\mathbb{R})$, which was found in the example.

We have

$$ \begin{aligned} D\sqrt{\frac{1}{2}} &= 0\\ D\sqrt{\frac{3}{2}}x &= \sqrt{3}\left(\sqrt{\frac{1}{2}}\right)\\ D\sqrt{\frac{45}{8}}\left(x^2 - \frac{1}{3}\right) &= \sqrt{15}\left(\sqrt{\frac{3}{2}}x\right). \end{aligned} $$ Therefore

$$ \mathcal{M}(D) = \begin{pmatrix} 0 & \sqrt{3} & 0\\ 0 & 0 & \sqrt{15}\\ 0 & 0 & 0 \end{pmatrix}. $$ Thus

$$ \mathcal{M}(D^*D) = \begin{pmatrix} 0 & 0 & 0\\ \sqrt{3} & 0 & 0\\ 0 & \sqrt{15} & 0 \end{pmatrix} \begin{pmatrix} 0 & \sqrt{3} & 0\\ 0 & 0 & \sqrt{15}\\ 0 & 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0\\ 0 & 3 & 0\\ 0 & 0 & 15 \end{pmatrix}. $$ Thus, the singular values of $D$ are $0, \sqrt{3}, \sqrt{15}$.
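
A quick numerical confirmation of $\mathcal{M}(D^*D) = \operatorname{diag}(0, 3, 15)$ (plain Python; the exact values are reproduced only up to floating-point rounding, hence the tolerance):

```python
from math import sqrt, isclose

MD = [[0, sqrt(3), 0],
      [0, 0, sqrt(15)],
      [0, 0, 0]]
MDs = [[MD[j][i] for j in range(3)] for i in range(3)]   # M(D*): transpose (real entries)
MDsD = [[sum(MDs[i][k] * MD[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]

expected = [[0, 0, 0], [0, 3, 0], [0, 0, 15]]
assert all(isclose(MDsD[i][j], expected[i][j], abs_tol=1e-12)
           for i in range(3) for j in range(3))
```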

7. Solution. A quick calculation, like the ones in the previous exercises, shows that

$$ T^*T(z_1, z_2, z_3) = (4z_1, 9z_2, z_3). $$ Thus

$$ \sqrt{T^*T}(z_1, z_2, z_3) = (2z_1, 3z_2, z_3). $$ Define $S \in \mathcal{L}(\mathbb{F}^3)$ by

$$ S(z_1, z_2, z_3) = (z_3, z_1, z_2). $$ $S$ is clearly an isometry (it maps the standard basis, which is orthonormal, to a permutation of the standard basis) and we have $S\sqrt{T^*T} = T$.
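
As a check, assuming (as in the exercise) that the map is $T(z_1, z_2, z_3) = (z_3, 2z_1, 3z_2)$, the factorization $S\sqrt{T^*T} = T$ can be confirmed pointwise:

```python
# T's formula is an assumption taken from the exercise statement, which is
# not reproduced above; sqrtTsT and S are as computed in the solution.
def T(z):       return (z[2], 2 * z[0], 3 * z[1])
def sqrtTsT(z): return (2 * z[0], 3 * z[1], z[2])
def S(z):       return (z[2], z[0], z[1])

for z in [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1 + 2j, -3, 0.5j)]:
    assert S(sqrtTsT(z)) == T(z)
```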

8. Solution. Let $S' \in \mathcal{L}(V)$ be an isometry such that $T = S'\sqrt{T^*T}$ (such an isometry exists by the Polar Decomposition 7.45). Then

$$ \begin{aligned} 0 &= ||Tv||^2 - ||Tv||^2\\ &= ||S^*Tv||^2 - ||S'^*Tv||^2\\ &= ||Rv||^2 - ||\sqrt{T^*T}v||^2\\ &= \langle R^*Rv, v \rangle - \langle T^*Tv, v \rangle\\ &= \langle (R^*R - T^*T)v, v \rangle\\ &= \langle (R^2 - T^*T)v, v \rangle \end{aligned} $$ where the second line follows because the adjoints of the isometries $S$ and $S'$ are also isometries, the third because $S^*T = S^*SR = R$ and $S'^*T = \sqrt{T^*T}$, and the last because $R$ is self-adjoint. 7.16 now implies that $R^2 = T^*T$. Since $R$ is positive and the positive square root of $T^*T$ is unique (by 7.36), it follows that $R = \sqrt{T^*T}$.

9. Solution. Consider the proof of 7.45. If $T$ is invertible, then $\operatorname{dim} (\operatorname{range} T)^\perp = 0$, thus $S_2 = 0$ and so $S = S_1$. Clearly $S_1$ is unique, so $S$ must also be unique. Conversely, if $S$ is unique then $S_2 = 0$, for otherwise we could set $S = S_1 - S_2$ and it would still be an isometry satisfying $T = S\sqrt{T^*T}$. This implies that $m = 0$, because from the definition we see that $S_2$ must be invertible for any positive integer $m$. Thus $\operatorname{dim} (\operatorname{range} T)^\perp = 0$ and so $T$ is invertible.

10. Solution. Suppose $\lambda$ is an eigenvalue of $T$ and $v$ a corresponding eigenvector. Then

$$ T^*Tv = T^2v = \lambda^2 v = |\lambda|^2v, $$ where the last equality follows because the eigenvalues of self-adjoint operators are real (see 7.13). Therefore, the eigenvectors of $T$ are also eigenvectors of $T^*T$, with the corresponding eigenvalues squared. Since $V$ has a basis consisting of eigenvectors of $T$ (by the Spectral Theorem, because $T$ is self-adjoint), the same basis consists of eigenvectors of $T^*T$, and thus every eigenvalue of $T^*T$ is the square of the absolute value of an eigenvalue of $T$. 7.52 now implies that the singular values of $T$ are the absolute values of the eigenvalues of $T$.
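
A small numerical illustration, using the hypothetical self-adjoint matrix $\begin{pmatrix}0 & 2\\2 & 0\end{pmatrix}$ whose eigenvalues are $2$ and $-2$:

```python
from math import sqrt

A = [[0, 2],
     [2, 0]]                      # self-adjoint sample; eigenvalues are 2 and -2
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]         # A real and symmetric, so A*A = A^T A = A^2
assert AtA == [[4, 0], [0, 4]]

# singular values = square roots of the eigenvalues of A*A (7.52)
singular_values = sorted(sqrt(AtA[i][i]) for i in range(2))
eigenvalues = [2, -2]             # roots of det(A - t I) = t^2 - 4
assert singular_values == sorted(abs(t) for t in eigenvalues)
```

Note that the singular values are $2, 2$, not $2, -2$: the absolute value matters.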

11. Solution: It follows from 7.45 that $T=S\sqrt{T^*T}$ for an isometry $S\in\mathcal{L}(V)$. Since $\sqrt{T^*T}$ is self-adjoint and $S^{-1}=S^*$ (7.42 (e), (f)), we have

\[

TT^*=S\sqrt{T^*T}(S\sqrt{T^*T})^*=S\sqrt{T^*T}\sqrt{T^*T}S^*=S(T^*T)S^{-1}.

\]It follows from Problem 15 of Exercise 5A that $TT^*$ and $T^*T$ have the same eigenvalues. Moreover, each eigenvalue has the same multiplicity.

Since $TT^*$ and $T^*T$ are positive operators (see Problem 4 of Exercise 7C), they have nonnegative eigenvalues. Also, the singular values of $T$ are the nonnegative square roots of the eigenvalues of $T^*T$, while the singular values of $T^*$ are the nonnegative square roots of the eigenvalues of $TT^*$ (see 7.52). We conclude that $T$ and $T^*$ have the same singular values. Moreover, each singular value has the same multiplicity.

See Wu Jinyang’s comment. Basically, matrices or operators related as in Problem 15 of Exercise 5A can be considered the “same”: they have almost the same structure. Such operators are said to be similar to each other.

12. Solution. As a counterexample, take the linear map $T$ from Exercise 5. We have

$$ \mathcal{M}((T^2)^*T^2) = \mathcal{M}(T^*)^2\mathcal{M}(T)^2 = \begin{pmatrix}-4 & 0\\0 & -4\end{pmatrix}\begin{pmatrix}-4 & 0\\0 & -4\end{pmatrix} = \begin{pmatrix}16 & 0\\0 & 16\end{pmatrix}. $$ Therefore, the singular values of $T^2$ are $4, 4$. However, the singular values of $T$ are $1, 4$.
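
The counterexample can be computed numerically as well: $T$ from Exercise 5 has singular values $1$ and $4$, but $T^2$ has singular values $4$ and $4$.

```python
from math import sqrt

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def singular_values(A):        # valid here because A^T A comes out diagonal
    AtA = matmul([[A[j][i] for j in range(2)] for i in range(2)], A)
    assert AtA[0][1] == AtA[1][0] == 0
    return sorted(sqrt(AtA[i][i]) for i in range(2))

MT  = [[0, -4], [1, 0]]        # matrix of T from Exercise 5
MT2 = matmul(MT, MT)
assert MT2 == [[-4, 0], [0, -4]]
assert singular_values(MT)  == [1.0, 4.0]
assert singular_values(MT2) == [4.0, 4.0]
```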

13. Solution. Suppose $T$ is invertible. Then

$$ \operatorname{null} T^* = (\operatorname{range} T)^\perp = V^\perp = \{0\}, $$ where the first equality follows from 7.7 and the second because $T$ is surjective. This shows that $T^*$ is also invertible. Therefore $T^*T$ is invertible, so $0$ is not an eigenvalue of $T^*T$ and thus, by 7.52, it cannot be a singular value of $T$.

Conversely, suppose $0$ is not a singular value of $T$. Then $0$ is not an eigenvalue of $T^*T$, so $T^*Tv \neq 0$ for all non-zero $v \in V$. This implies that $Tv \neq 0$ for all non-zero $v \in V$ (indeed, if $Tv = 0$ then $T^*Tv = 0$). Thus $T$ is injective and hence invertible.
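
Both directions can be illustrated numerically with the matrices from Exercises 5 and 2 (plain Python; the helper relies on $T^*T$ being diagonal for these particular samples):

```python
from math import sqrt

def singular_values_2x2_diagonalizing(A):
    # Compute A^T A; works here because it happens to be diagonal for
    # both sample matrices below.
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    assert AtA[0][1] == AtA[1][0] == 0
    return sorted(sqrt(AtA[i][i]) for i in range(2))

invertible     = [[0, -4], [1, 0]]   # det = 4, invertible (Exercise 5)
non_invertible = [[0, 5], [0, 0]]    # det = 0, not invertible (Exercise 2)
assert 0.0 not in singular_values_2x2_diagonalizing(invertible)   # [1, 4]
assert 0.0 in singular_values_2x2_diagonalizing(non_invertible)   # [0, 5]
```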

14. Solution. First we will prove that $\operatorname{null} T = \operatorname{null} \sqrt{T^*T}$. For every $v \in V$ we have

$$ ||Tv||^2 = \langle T^*Tv, v \rangle = \langle \sqrt{T^*T}v, \sqrt{T^*T}v \rangle = ||\sqrt{T^*T}v||^2, $$ where the middle equality follows because $\sqrt{T^*T}$ is self-adjoint. Hence $Tv = 0$ if and only if $\sqrt{T^*T}v = 0$, that is, $\operatorname{null} T = \operatorname{null} \sqrt{T^*T}$. The Fundamental Theorem of Linear Maps (3.22) now gives

$$ \operatorname{dim} \operatorname{range} T = \operatorname{dim} V - \operatorname{dim} \operatorname{null} T = \operatorname{dim} V - \operatorname{dim} \operatorname{null} \sqrt{T^*T} = \operatorname{dim} \operatorname{range} \sqrt{T^*T}. $$

Since $\sqrt{T^*T}$ is self-adjoint, the Spectral Theorem gives an orthonormal basis of $V$ consisting of its eigenvectors, and $\operatorname{dim} \operatorname{range} \sqrt{T^*T}$ equals the number of these basis vectors whose eigenvalues are nonzero, that is, the number of nonzero singular values of $T$. This completes the proof. (Note that the equality $\operatorname{range} T = \operatorname{range} T^*$, which one might be tempted to use here, can fail when $T$ is not normal; see the comments below.)
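
Relatedly, the identity $||Tv|| = ||\sqrt{T^*T}v||$, which holds for every operator, can be spot-checked with the matrices from Exercise 6:

```python
from math import sqrt, isclose

MD     = [[0, sqrt(3), 0], [0, 0, sqrt(15)], [0, 0, 0]]   # M(D) from Exercise 6
MsqrtD = [[0, 0, 0], [0, sqrt(3), 0], [0, 0, sqrt(15)]]   # sqrt(D*D) is diagonal

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def norm(v):
    return sqrt(sum(x * x for x in v))

# ||Dv|| equals ||sqrt(D*D)v|| on every sample vector
for v in [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1.5, -2, 0.25)]:
    assert isclose(norm(apply(MD, v)), norm(apply(MsqrtD, v)), abs_tol=1e-12)
```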

15. Solution. The forward direction is obvious: if $S$ is an isometry, then $\sqrt{S^*S}$ equals the identity, all of whose eigenvalues equal $1$.

Suppose all singular values of $S$ equal $1$. Then $\sqrt{S^*S}$, being a self-adjoint operator all of whose eigenvalues equal $1$, is the identity, and therefore so is $S^*S$. 7.42 now implies that $S$ is an isometry.

16. Solution. Let $S_1, S_2 \in \mathcal{L}(V)$ be isometries such that $T_1 = S_1\sqrt{T_1^*T_1}$ and $T_2 = S_2\sqrt{T_2^*T_2}$, and let $e_1, \dots, e_n$ and $f_1, \dots, f_n$ be orthonormal bases of $V$ consisting of eigenvectors of $\sqrt{T_1^*T_1}$ and $\sqrt{T_2^*T_2}$, respectively, corresponding to the singular values $s_1, \dots, s_n$. Define $S \in \mathcal{L}(V)$ by

$$ Se_j = f_j $$ for each $j = 1, \dots, n$. Then $S$ is also an isometry and we have

$$ \begin{aligned} \sqrt{T_1^*T_1}e_j &= s_je_j\\ &= S^*(s_jf_j)\\ &= S^*\sqrt{T_2^*T_2}f_j\\ &= S^*\sqrt{T_2^*T_2}Se_j. \end{aligned} $$ So $\sqrt{T_1^*T_1} = S^*\sqrt{T_2^*T_2}S$. Therefore $$ T_1 = S_1\sqrt{T_1^*T_1} = S_1S^*\sqrt{T_2^*T_2}S = S_1S^*S_2^*T_2S, $$ where the last equality follows by multiplying both sides of the equation $T_2 = S_2\sqrt{T_2^*T_2}$ by $S_2^*$. This gives the desired result, because the product of isometries is also an isometry.

17. Solution.

(a) We have $Te_j = s_jf_j$. This implies that the matrix of $T$ with respect to the bases $e_1, \dots, e_n$ and $f_1, \dots, f_n$ is the diagonal matrix whose diagonal entries are the singular values of $T$. By 7.10, we have $T^*f_j = s_je_j$ for each $j$. Replacing $v$ with $f_j$ in the right-hand side of the desired formula yields the same thing; therefore

$$ T^*v = s_1\langle v, f_1 \rangle e_1 + \dots + s_n \langle v, f_n \rangle e_n $$ by uniqueness of linear maps (see 3.5).

(b) Apply part (a) to the formula given by the singular value decomposition of $T$.

(c) Note that the $e_j$’s are eigenvectors of $T^*T$ with corresponding eigenvalue $s_j^2$. Thus $\sqrt{T^*T}e_j = s_je_j$. Plugging $e_j$ in the place of $v$ in the right hand side yields the same thing, so the result holds by uniqueness of linear maps again.

(d) The given formula is well defined because, by Exercise 13, none of the $s_j$’s equal $0$ (since $T$ is invertible), and a direct computation shows that it satisfies $TT^{-1} = I$ and $T^{-1}T = I$.

18. Solution.

(a) Use the same notation from the Singular Value Decompositon theorem (7.52). We have

$$ \begin{aligned} ||Tv||^2 &= |s_1\langle v, e_1 \rangle|^2 + \dots + |s_n\langle v, e_n \rangle|^2\\ &= |s_1|^2|\langle v, e_1 \rangle|^2 + \dots + |s_n|^2|\langle v, e_n \rangle|^2\\ &\le |s|^2|\langle v, e_1 \rangle|^2 + \dots + |s|^2|\langle v, e_n \rangle|^2\\ &= |s|^2\left(|\langle v, e_1 \rangle|^2 + \dots + |\langle v, e_n \rangle|^2\right)\\ &= s^2||v||^2 \end{aligned} $$

for all $v \in V$, where the first line follows from the Pythagorean Theorem (6.13) applied to the singular value decomposition of $Tv$, and the last because $||v||^2 = |\langle v, e_1 \rangle|^2 + \dots + |\langle v, e_n \rangle|^2$ (6.30) and $s \ge 0$. Taking square roots shows that $||Tv|| \le s||v||$ for all $v \in V$. The proof that $\hat{s}||v|| \le ||Tv||$ is almost the same.
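
For a concrete check, the operator from Exercise 5 has smallest singular value $\hat{s} = 1$ and largest singular value $s = 4$; a quick script confirms the sandwich $\hat{s}||v|| \le ||Tv|| \le s||v||$ on a few sample vectors:

```python
from math import sqrt

def apply(v):                  # T(x, y) with matrix [[0, -4], [1, 0]]
    return (-4 * v[1], v[0])

def norm(v):
    return sqrt(sum(x * x for x in v))

s_hat, s = 1, 4                # singular values of T, from Exercise 5
for v in [(1, 0), (0, 1), (3, -2), (0.5, 0.25)]:
    assert s_hat * norm(v) <= norm(apply(v)) <= s * norm(v)
```

Note that both bounds are attained, on $(1, 0)$ and $(0, 1)$ respectively.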

(b) Let $v$ be an eigenvector of $T$ corresponding to $\lambda$ with $||v|| = 1$. Then

$$ |\lambda| = |\lambda|\:||v|| = ||Tv|| \le s||v|| = s. $$

Similarly, we have $\hat{s} \le |\lambda|$.

20. Solution. See the comment below.

## Travor Zhang

30 Dec 2020
Why does range T = range T* hold in ex14 without T being normal?

## Allen

18 Feb 2021
Yeah, I don't think we can use this property either. I found another proof of the counterpart, dim(null T) = number of zero singular values of T,

at

https://math.stackexchange.com/questions/1070092/dimension-of-null-and-zero-singular-values

## Xinyu

22 Dec 2020
For Ex. 1, why not check $\langle Rv, w \rangle = \langle v, Rw \rangle$ to prove that R is self-adjoint?

## katharine

17 Aug 2020
For ex. 4, how do you prove range T = range T*? Exercise 5 in section 7A just shows dim(range T) = dim(range T*).

## Mustafa Kemal Turak

8 Jul 2020
Proof Q19: Theorem: f is uniformly continuous if and only if for every Cauchy sequence {xn}, {f(xn)} is a Cauchy sequence. Suppose s is the largest singular value of T. Let {vn} be a Cauchy sequence on the metric space (V, d); then for every ε > 0 there exists N ∈ ℕ such that d(vn, vm) < ε/s for all n, m > N. Hence

d(T(vn), T(vm)) ≤ s d(vn, vm) < ε for all n, m > N. Hence {T(vn)} is a Cauchy sequence, so T is uniformly continuous on (V, d).

## Mustafa Kemal Turak

8 Jul 2020
I fix a misspelling: d(T(vn), T(vm)) ≤ s d(vn, vm) < ε. By the way, the inequality comes from Q18.a.

## Mustafa Kemal Turak

6 Jul 2020
I give a simple proof. R is a positive operator, which by definition implies that R is self-adjoint, so that $R^2$ is self-adjoint; also $R^2 - T^*T$ is self-adjoint. We have

$$\langle (R^2 - T^*T)v, v \rangle = \langle R^2v, v \rangle - \langle T^*Tv, v \rangle = \|Rv\|^2 - \|Tv\|^2 = \|Rv\|^2 - \|SRv\|^2 = 0$$ for every $v \in V$.

Hence, by (7.16), $R^2 = T^*T$, so that $R = \sqrt{T^*T}$ because R is a positive operator (7.36).

## Zixiu Su

22 Jun 2020
For Ex. 13, we can decompose range T by 7.51. Eliminating the zero singular value terms gives us a basis of range T and it's easy to see the dimension as desired.

## Zixiu Su

22 Jun 2020
Sorry, I mean ex. 14.

## Chi Yuan Lau

8 Aug 2019
How to solve ex 20?

## Linearity

9 Aug 2019
Use Ex 18: let $v\in V$, then \begin{align*}\|(S+T)v\|\leqslant \|Sv\|+\|Tv\|\leqslant s\|v\|+t\|v\|.\end{align*}Since $r$ is a singular value of $S+T$, we can take $v$ to be the corresponding eigenvector of $\sqrt{(S+T)^*(S+T)}$, then\[\|(S+T)v\|^2=\langle v,(S+T)^*(S+T)v\rangle =\langle v,r^2v\rangle=r^2\|v\|^2.\]Therefore, we have $$r\|v\|=\|(S+T)v\|\leqslant s\|v\|+t\|v\|.$$Therefore $r\leqslant s+t$.

## risin

16 Jul 2020
In the 4th line, v should be the eigenvector of $\sqrt{(S+T)^*(S+T)}$, not $\sqrt{S+T}$.

## Travor Zhang

30 Dec 2020
v nonzero

## Marcel Ackermann

25 Nov 2017
7D6) T=[[0,1,0],[0,0,2],[0,0,0]], sqrt(T*T)=[[1,0,0],[0,2,0],[0,0,0]], so the singular values are 0, 1, 2.

## Marcel Ackermann

21 Sep 2017
7D10) sqrt(T*T)=sqrt(TT) (because T is self-adjoint). By the spectral theorem T has a diagonal matrix, so sqrt(TT)=sqrt(T') with T' the diagonal entries squared. Then diag(|d_1|, ..., |d_n|) is a positive square root. The singular values are the eigenvalues of this, so they are the absolute values of the eigenvalues of T.

## Marcel Ackermann

17 Sep 2017
7D7) T^{*}(z_1,z_2,z_3)=(2z_2, 3z_3, z_1)

T^*T(z_1,z_2,z_3)=(4z_1,9z_2, z_3)

sqrt(T^*T)(z_1,z_2,z_3) = (2z_1,3z_2,z_3)

This is a positive square root as can easily be seen.

so: S(z_1,z_2,z_3)=(z_3,z_1,z_2)

## Wu Jinyang

27 Aug 2017
Hi, I am stuck at question 11.

I checked the solution manual of the second edition and it is not included there.

Can you upload the answer to this question?

I tried to use 7.52 to prove it

since T*T ej = aj ej

TT* (Tej) = aj Tej

yet Tej can equal zero, making it not a proper eigenvector.

Am I at least heading the correct direction?

## Mohammad Rashidi

27 Aug 2017
Updated.

## Wu Jinyang

28 Aug 2017
That was fast!

Just one trivial thing. To some extent, the proof is still not complete.

Two operators can have the same eigenvalues while the multiplicities of those eigenvalues differ.

For example let V have 2,2,2,3 as eigenvalues. let W have 2,2,3,3 as eigenvalues. their eigenvalues are the same: 2,3. If you get my point.

5A.15 not only shows that they have the same eigenvalues, but also that the dimension of the eigenspace of the two operators with respect to the same eigenvalue is the same. So you might want to slightly rephrase line 4 to convey this.

## Mohammad Rashidi

29 Aug 2017
Thanks.

## Marcel Ackermann

30 Jul 2017
7D1) At first find T*: $T^*(w)=\langle w,x\rangle u$ because $\langle Tv,w\rangle=\langle \langle v,u\rangle x,w\rangle=\langle v,u\rangle\langle x,w\rangle$ and $\langle v,T^*w\rangle=\langle v,\langle w,x\rangle u\rangle=\langle v,u\rangle\langle x,w\rangle$. Let $Rv=\frac{||x||}{||u||}\langle v,u\rangle u$. We need to show that 1) $T^*Tv=RRv$ for all $v$ and 2) $\langle Rv,v\rangle \geq 0$ for all $v$.

1) $T^*Tv=T^*(\langle v,u\rangle x)=\langle \langle v,u\rangle x,x\rangle u=\langle v,u\rangle\langle x,x\rangle u$ and $RRv=\frac{||x||}{||u||}\langle \frac{||x||}{||u||}\langle v,u\rangle u,u\rangle u=\langle v,u\rangle\langle x,x\rangle u$.

2) $\langle Rv,v\rangle=\langle \frac{||x||}{||u||}\langle v,u\rangle u,v\rangle = \frac{||x||}{||u||}\langle v,u\rangle\langle u,v\rangle \geq 0$.

## Célio Passos

20 Sep 2017
You still need to show that $R$ is self-adjoint.

## Scarlet Remilia

19 Jun 2017
I can't open exercise 8