Chapter 5 Exercise A


1. Solution: (a) For any $u\in U$, we have $Tu=0\in U$ since $U\subset \m{null} T$; hence $U$ is invariant under $T$.

(b) For any $u\in U$, we have $Tu\in\m{range} T \subset U$; hence $U$ is invariant under $T$.


2. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 4. Let $\lambda=0$.


3. Solution: For any $u\in \m{range} S$, there exists $v\in V$ such that $Sv=u$. Hence \[Tu=TSv=STv\in \m{range} S.\]Therefore $\m{range} S$ is invariant under $T$.


4. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 1.


5. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 2.


6. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 3.


7. Solution: Let $(x,y)$ be an eigenvector of $T$ corresponding to an eigenvalue $\lambda$, so that \[T(x,y)=\lambda(x,y),\]i.e., $(\lambda x,\lambda y)=(-3y,x)$. Hence $\lambda x=-3y$ and $\lambda y=x$, and it follows that $\lambda^2xy=-3xy$. If $xy\ne 0$, then $\lambda^2=-3$, which is impossible since $\lambda\in\R$.

If $x=0$, then $y=0$ by $\lambda x=-3y$. However, $(x,y)$ is an eigenvector, hence $(x,y)\ne (0,0)$, a contradiction.

If $y=0$, then $x=0$ by $\lambda y=x$. Similarly, we get a contradiction.

Hence no such eigenvectors exist, namely $T$ has no eigenvalues.
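As a quick sanity check (not part of the proof), one can compute the eigenvalues of the matrix of $T$ with respect to the standard basis numerically; they come out as $\pm i\sqrt 3$, so none of them is real. A minimal sketch with numpy:

```python
import numpy as np

# Matrix of T(x, y) = (-3y, x) with respect to the standard basis of R^2
A = np.array([[0.0, -3.0],
              [1.0,  0.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # approximately +-1.732j, i.e. +-i*sqrt(3)

# No eigenvalue is real, consistent with T having no (real) eigenvalues
assert not np.any(np.isclose(eigenvalues.imag, 0))
```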


8. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 5.


9. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 6.


10. Solution: (a) Suppose $v=(v_1,\cdots,v_n)$ is an eigenvector of $T$ corresponding to an eigenvalue $\lambda$. Then $Tv=\lambda v$, hence \begin{equation}\label{5AP1}(v_1,2v_2,\cdots,nv_n)=(\lambda v_1,\lambda v_2,\cdots,\lambda v_n).\end{equation} As $v\ne 0$ by the definition of eigenvectors, there is some $i\in \{1,2,\cdots,n\}$ such that $v_i\ne 0$. Comparing the $i$-th components in $(\ref{5AP1})$ gives $iv_i=\lambda v_i$, hence $\lambda=i$. For $\lambda=i$, it is easy to solve $(\ref{5AP1})$: the corresponding eigenvectors are exactly the vectors $(0,\cdots,0,a,0,\cdots,0)$ with a nonzero $a\in\mb F$ in the $i$-th component. Therefore the eigenvalues of $T$ are $1$, $2$, $\cdots$, $n$, and the eigenvectors corresponding to $i$ are the vectors $(0,\cdots,0,a,0,\cdots,0)$ with nonzero $a\in\mb F$ in the $i$-th component.
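For a concrete numerical illustration (with $n=5$, a value I picked for the example), the matrix of $T$ with respect to the standard basis is the diagonal matrix with diagonal entries $1,\cdots,n$, and numpy recovers exactly the eigenvalues and eigenvectors described above:

```python
import numpy as np

n = 5
# Matrix of T(x_1, ..., x_n) = (x_1, 2 x_2, ..., n x_n): the diagonal matrix diag(1, ..., n)
A = np.diag(np.arange(1, n + 1, dtype=float))

eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))   # [1.0, 2.0, 3.0, 4.0, 5.0]
print(eigenvectors)          # columns are (multiples of) the standard basis vectors

# Each standard basis vector e_i satisfies A e_i = i * e_i
for i in range(n):
    e = np.zeros(n); e[i] = 1.0
    assert np.allclose(A @ e, (i + 1) * e)
```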

(b) Suppose $W$ is an invariant subspace of $T$. Let $e_i=(0,\cdots,0,1,0,\cdots,0)$ with $1$ in the $i$-th component. Then $e_1$, $\cdots$, $e_n$ is a basis of $\mb F^n$ and $e_i$ is an eigenvector of $T$ corresponding to $i$. If $a_1e_1+\cdots+a_ke_k\in W$ with $a_1\cdots a_k\ne 0$, we will show $\m{span}(e_{1},e_{2},\cdots,e_{k})\subset W$. Since $a_1e_1+\cdots+a_ke_k\in W$ and $W$ is invariant under $T$, it follows that \[T(a_1e_1+\cdots+a_ke_k)=a_1e_1+\cdots+ka_ke_k\in W.\]Hence \[k(a_1e_1+\cdots+a_ke_k)-(a_1e_1+\cdots+ka_ke_k)=(k-1)a_1e_1+\cdots+a_{k-1}e_{k-1}\in W,\]and the coefficients are nonzero. Inductively, for every $i\leqslant k$ we obtain some \[ \lambda_1e_1+\cdots+\lambda_ie_i\in W \]with $\lambda_1\cdots\lambda_i\ne 0$ (the scalars $\lambda_1$, $\cdots$, $\lambda_i$ change as $i$ changes). In particular, $\mu_1e_1\in W$ for some $\mu_1\ne 0$, hence $e_1\in W$. Then, considering $\eta_1e_1+\eta_2e_2\in W$ with $\eta_1\eta_2\ne 0$, we get $e_2\in W$. Inductively, we can show that $\{e_{1},e_{2},\cdots,e_{k}\}\subset W$, hence $\m{span}(e_{1},e_{2},\cdots,e_{k})\subset W$. Similarly, if $a_{i_1}e_{i_1}+\cdots+a_{i_k}e_{i_k}\in W$ with $a_{i_1}\cdots a_{i_k}\ne 0$ and the indices $i_1,\cdots,i_k$ distinct, then $\m{span}(e_{i_1},\cdots,e_{i_k})\subset W$.

Now let us determine the general form of $W$. Suppose $W\cap \{e_1,\cdots,e_n\}=\{e_{i_1},\cdots,e_{i_k}\}$; we will show $\m{span}(e_{i_1},\cdots,e_{i_k})= W$. It is obvious that $\m{span}(e_{i_1},\cdots,e_{i_k})\subset W$. Suppose there is some $w\in W$ with $w\notin \m{span}(e_{i_1},\cdots,e_{i_k})$. Then $w$ can be written as \[ w=b_1e_1+\cdots+b_ne_n,\quad b_1,\cdots,b_n\in \mb F, \]where $b_s\ne 0$ for some $s\notin \{i_1,\cdots,i_k\}$. By the previous argument, we have $e_s\in W$. This contradicts $W\cap \{e_1,\cdots,e_n\}=\{e_{i_1},\cdots,e_{i_k}\}$. Hence $\m{span}(e_{i_1},\cdots,e_{i_k})= W$. Moreover, all invariant subspaces of $T$ have this form.

I am not satisfied with this solution. ❗


11. Solution: Suppose $\lambda$ is an eigenvalue of $T$ with an eigenvector $q$, then \[q'=Tq=\lambda q.\]Note that in general $\deg q'<\deg q$ (since we set $\deg 0=-\infty$). If $\lambda\ne 0$, then $\deg (\lambda q)=\deg q>\deg q'$, contradicting $\lambda q=q'$. If $\lambda=0$, then $q'=0$, so $q=c$ for some nonzero $c\in\R$. Hence the only eigenvalue of $T$ is zero, with the nonzero constant polynomials as eigenvectors.


12. Solution: Suppose $\lambda$ is an eigenvalue of $T$ with an eigenvector $q$. Write $q=a_nx^n+\cdots+a_1x+a_0$ with $a_n\ne 0$, then \[\lambda q=Tq=xq',\]namely \[\lambda a_nx^n+\cdots+\lambda a_1x+\lambda a_0=na_nx^n+\cdots+2a_2x^2+a_1x.\]Since $a_n\ne 0$, comparing leading coefficients gives $\lambda =n$. Comparing the remaining coefficients then gives $a_0=a_1=\cdots=a_{n-1}=0$, hence $q=a_nx^n$. Therefore the eigenvalues of $T$ are $0,1,2,\cdots$, and the eigenvectors corresponding to the eigenvalue $m\in\mb{N}$ are the polynomials $cx^m$ with $c\in\R$, $c\ne 0$.
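The same computation can be checked symbolically. The sketch below (using sympy) verifies that each monomial $x^m$ is an eigenvector of $Tp=xp'$ with eigenvalue $m$, and that a polynomial with two nonzero terms is not an eigenvector:

```python
import sympy as sp

x = sp.symbols('x')

def T(p):
    """The operator T p = x p' on polynomials."""
    return sp.expand(x * sp.diff(p, x))

# Each monomial x^m is an eigenvector with eigenvalue m
for m in range(6):
    assert sp.simplify(T(x**m) - m * x**m) == 0

# A polynomial with two nonzero terms is not an eigenvector, e.g. x^2 + x
print(T(x**2 + x))  # 2*x**2 + x, not a scalar multiple of x**2 + x
```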


13. Solution: Choose $\alpha_i\in\mb F$ such that \[ \left|\alpha_i-\lambda\right| = \frac{1}{1000+i},\quad i=1,\cdots,\dim V+1. \]Such $\alpha_i$ exist and are distinct since $\mb F=\R$ or $\C$. Note that each operator on $V$ has at most $\dim V$ distinct eigenvalues by 5.13. Hence there exists some $i\in\{1,2,\cdots,\dim V+1\}$ such that $\alpha_i$ is not an eigenvalue of $T$. Then by 5.6, $T-\alpha_i I$ is invertible; moreover $|\alpha_i-\lambda|=1/(1000+i)<1/1000$.


14. Solution: Note that any $v\in V$ can be written uniquely as $u+w$ with $u \in U$ and $w \in W$ since $V=U\oplus W$. It follows that $P$ is well-defined (one should also check that $P\in\ca L(V)$). Now let us find the eigenvalues of $P$. Suppose $v\ne 0$ and $\lambda\in\mb F$ satisfy $Pv=\lambda v$. Write $v=u+w$ with $u \in U$ and $w \in W$; then $u$ and $w$ cannot both be zero. By the definition of $P$, we have \[Pv=u,\quad \lambda v=\lambda u+\lambda w.\]It follows that $u=\lambda u+\lambda w$, namely $(\lambda-1)u+\lambda w=0$. Since $V=U\oplus W$, it follows that $(\lambda-1)u=\lambda w=0$. If $u\ne 0$, then $\lambda =1$ and hence $w=0$; the corresponding eigenvectors are the nonzero vectors in $U$. If $w\ne 0$, then $\lambda =0$ and hence $u=0$; the corresponding eigenvectors are the nonzero vectors in $W$.
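A concrete instance may make this clearer. In the sketch below (numpy, with a direct sum decomposition of $\R^3$ that I chose only for illustration), the matrix of $P$ is built from $Pu_i=u_i$ and $Pw_1=0$, and its eigenvalues are indeed $1$ and $0$:

```python
import numpy as np

# Illustrative choice: U = span{(1,0,0), (0,1,0)}, W = span{(1,1,1)}, so R^3 = U + W (direct sum)
u1, u2 = np.array([1., 0., 0.]), np.array([0., 1., 0.])
w1 = np.array([1., 1., 1.])

# P is determined by P u1 = u1, P u2 = u2, P w1 = 0; build its matrix in the standard basis
B = np.column_stack([u1, u2, w1])
images = np.column_stack([u1, u2, np.zeros(3)])
P = images @ np.linalg.inv(B)

print(np.round(np.linalg.eigvals(P), 10))   # eigenvalues 1, 1, 0
assert np.allclose(P @ P, P)                # P is a projection: P^2 = P
```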


15. Solution: (a) Suppose $\lambda$ is an eigenvalue of $T$; then there exists a nonzero vector $v\in V$ such that $Tv=\lambda v$. Hence \[S^{-1}TS(S^{-1}v)=S^{-1}Tv=S^{-1}(\lambda v)=\lambda S^{-1}v.\]Note that $S^{-1}v\ne 0$ since $S^{-1}$ is injective, hence $\lambda$ is an eigenvalue of $S^{-1}TS$; that is, every eigenvalue of $T$ is an eigenvalue of $S^{-1}TS$. Similarly, since $S(S^{-1}TS)S^{-1}=T$, every eigenvalue of $S^{-1}TS$ is an eigenvalue of $T$. Hence $T$ and $S^{-1}TS$ have the same eigenvalues.

(b) From the argument in (a), one easily deduces that $v$ is an eigenvector of $T$ if and only if $S^{-1}v$ is an eigenvector of $S^{-1}TS$.
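Both parts can be checked numerically on a random example (a sketch with numpy; the matrices below are arbitrary choices, and the random $S$ is only generically invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n))
S = rng.standard_normal((n, n))              # generically invertible
similar = np.linalg.inv(S) @ T @ S           # the operator S^{-1} T S

# Same characteristic polynomial, hence the same eigenvalues
assert np.allclose(np.poly(T), np.poly(similar))

# If T v = lambda v, then (S^{-1} T S)(S^{-1} v) = lambda (S^{-1} v)
lam, V = np.linalg.eig(T)
v = V[:, 0]
Sinv_v = np.linalg.inv(S) @ v
assert np.allclose(similar @ Sinv_v, lam[0] * Sinv_v)
```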


16. Solution: Although this result also holds for infinite-dimensional vector spaces, I will only consider the finite-dimensional case, since we are considering the matrix of $T$ (otherwise it would be an infinite matrix). Suppose the matrix of $T$ with respect to a basis $e_1$, $\cdots$, $e_n$ of $V$ contains only real entries. Then \[Te_j=A_{1,j}e_1+\cdots+A_{n,j}e_n,\]where $A_{i,j}\in\R$ for all $i,j=1,2,\cdots,n$. Let \[v=k_1e_1+\cdots+k_ne_n\]be an eigenvector corresponding to $\lambda$, where $k_i\in\C$, $i=1,\cdots,n$. Then we have $Tv=\lambda v$, namely \begin{equation}\label{5A161} \lambda\sum_{i=1}^nk_ie_i=\sum_{i=1}^nk_iTe_i=\sum_{i=1}^n\sum_{j=1}^nk_iA_{j,i}e_j. \end{equation} Comparing the coefficients of each $e_j$ in $(\ref{5A161})$ and taking complex conjugates (here we use $A_{i,j}\in\R$ for all $i,j=1,2,\cdots,n$), we get \begin{equation}\label{5A162} \overline{\lambda}\sum_{i=1}^n\overline{k_i}e_i=\sum_{i=1}^n\overline{k_i}Te_i=\sum_{i=1}^n\sum_{j=1}^n\overline{k_i}A_{j,i}e_j. \end{equation} Note that $(\ref{5A162})$ implies \begin{equation}\label{5A163} T(\overline{k_1}e_1+\cdots+\overline{k_n}e_n)=\overline{\lambda}\sum_{i=1}^n\overline{k_i}e_i. \end{equation}Since $v=k_1e_1+\cdots+k_ne_n\ne 0$, not all $k_i$ are zero, hence not all $\overline{k_i}$ are zero. Therefore $\overline{k_1}e_1+\cdots+\overline{k_n}e_n\ne 0$, and $(\ref{5A163})$ tells us $\bar{\lambda}$ is an eigenvalue of $T$.
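Numerically, this is the familiar fact that the non-real eigenvalues of a real matrix occur in conjugate pairs (its characteristic polynomial has real coefficients). A small sanity check with numpy on a random real matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))          # a real matrix, viewed as an operator on C^5

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)

# For every eigenvalue lambda, its conjugate is also an eigenvalue
for lam in eigenvalues:
    assert np.any(np.isclose(eigenvalues, np.conj(lam)))
```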


17. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 23.


18. Solution: Suppose $\lambda$ is an eigenvalue of $T$ with a corresponding eigenvector $(w_1, w_2,\cdots)$. Then not all $w_i$ are zero. Moreover, we have \[(0,w_1, w_2,\cdots)=T(w_1, w_2,\cdots)=\lambda(w_1, w_2,\cdots).\]If $\lambda=0$, then \[ (0,w_1, w_2,\cdots)=0 \]implies $w_i= 0$ for all $i\in\mb N^+$, a contradiction. If $\lambda\ne 0$, consider the first component: $0=\lambda w_1$, hence $w_1=0$. Then the second component gives $\lambda w_2=w_1=0$, hence $w_2=0$. By induction, one easily deduces that $w_i= 0$ for all $i\in\mb N^+$, again a contradiction. Hence $T$ has no eigenvalues.


19. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 7.


20. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 8.


21. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 10. (b) is almost proved there.


22. Solution: Note that we have \[ T(v+w)=Tv+Tw=3w+3v=3(v+w), \]and\[T(v-w)=Tv-Tw=3w-3v=-3(v-w).\]If $v+w$ or $v-w$ is nonzero, then $3$ or $-3$ is an eigenvalue of $T$, respectively. And indeed they cannot both be zero: if $v-w=0$ and $v+w=0$, then $v=w=0$, contradicting $v\ne 0$ and $w\ne 0$.
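For a minimal concrete instance (my own choice, not part of the exercise), take $V=\R^2$, $v=e_1$, $w=e_2$; then the matrix of $T$ with respect to $(v,w)$ is $\left( \begin{array}{cc} 0 & 3 \\ 3 & 0 \\ \end{array} \right)$, and numpy confirms the eigenvalues $3$ and $-3$ with eigenvectors $v+w$ and $v-w$:

```python
import numpy as np

# Matrix of T with respect to (v, w), where T v = 3w and T w = 3v
T = np.array([[0., 3.],
              [3., 0.]])
v = np.array([1., 0.])
w = np.array([0., 1.])

print(np.linalg.eigvals(T))                    # 3 and -3 (in some order)
assert np.allclose(T @ (v + w),  3 * (v + w))  # v + w is an eigenvector for 3
assert np.allclose(T @ (v - w), -3 * (v - w))  # v - w is an eigenvector for -3
```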


23. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 11.


24. Solution: (a) If the sum of the entries in each row of $A$ equals $1$, then one can easily deduce that \[T\left( \begin{array}{c} 1 \\ \vdots \\ 1 \\ \end{array} \right) =\left( \begin{array}{c} 1 \\ \vdots \\ 1 \\ \end{array} \right).\]Hence $1$ is an eigenvalue of $T$ with $\left( \begin{array}{c} 1 \\ \vdots \\ 1 \\ \end{array} \right)$ as a corresponding eigenvector.

(b) This part is more interesting. It would be quick using determinants (the hypothesis says each row of $A^T$ sums to $1$, and $A$ and $A^T$ have the same characteristic polynomial), but here we argue directly. We just need to show that $T-I$ is not invertible; by 5.6 it suffices to show $T-I$ is not surjective. Note that we have \begin{equation}\label{5AP241} (T-I)\left( \begin{array}{c} x_1 \\ \vdots \\ x_n \\ \end{array} \right)=\left( \begin{array}{c} \sum_{i=1}^n A_{1,i}x_i-x_1 \\ \vdots \\ \sum_{i=1}^n A_{n,i}x_i-x_n \\ \end{array} \right)=\left( \begin{array}{c} y_1 \\ \vdots \\ y_n \\ \end{array} \right), \end{equation}where $A_{i,j}$ is the $(i,j)$-entry of $A$. Moreover, we have \[ 1=\sum_{i=1}^n A_{i,j},\quad j=1,\cdots,n. \]Hence \begin{align*} y_1+\cdots+y_n=&\sum_{j=1}^n\sum_{i=1}^n A_{j,i}x_i-\sum_{j=1}^n x_j\nonumber\\ =&\sum_{i=1}^nx_i \sum_{j=1}^n A_{j,i}-\sum_{j=1}^n x_j\\ =&\sum_{i=1}^nx_i-\sum_{j=1}^n x_j=0\nonumber.\end{align*} By $(\ref{5AP241})$ and the previous equation, it follows that \[\m{range}(T-I)\subset \{(x_1,\cdots,x_n)^T\in\mb F^{n}:x_1+\cdots+x_n=0\},\]where $(x_1,\cdots,x_n)^T$ denotes $\left( \begin{array}{c} x_1 \\ \vdots \\ x_n \\ \end{array} \right)$. It follows that $T-I$ is not surjective, completing the proof.
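Both parts are easy to check numerically on random matrices normalized to have the required row or column sums (a sketch with numpy; the matrices are my own random examples):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Part (a): each row sums to 1, so the all-ones vector is an eigenvector with eigenvalue 1
B = rng.random((n, n))
B = B / B.sum(axis=1, keepdims=True)
assert np.allclose(B @ np.ones(n), np.ones(n))

# Part (b): each column sums to 1
A = rng.random((n, n))
A = A / A.sum(axis=0)

assert np.any(np.isclose(np.linalg.eigvals(A), 1.0))   # 1 is an eigenvalue of A

# Every vector in range(A - I) has coordinates summing to 0, so A - I is not surjective
x = rng.standard_normal(n)
print(((A - np.eye(n)) @ x).sum())                      # approximately 0
```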


25. Solution: Let the eigenvalues corresponding to $u,v$ be $\lambda_1,\lambda_2$ respectively, so that \[Tu=\lambda_1u,\quad Tv=\lambda_2 v.\]Suppose the eigenvalue corresponding to $u+v$ is $\lambda$; then\[\lambda(u+v)=T(u+v)=Tu+Tv=\lambda_1u+\lambda_2v.\]It follows that $(\lambda-\lambda_1)u+(\lambda-\lambda_2)v=0$. If $\lambda_1\ne\lambda_2$, then $\lambda-\lambda_1$ and $\lambda-\lambda_2$ cannot both be zero, hence $u$ and $v$ are linearly dependent. But eigenvectors corresponding to distinct eigenvalues are linearly independent by 5.10, a contradiction. Hence $\lambda_1=\lambda_2$.


26. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 12.


27. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 13.


28. Solution: For any nonzero vector $v\in V$, extend it to a basis $v=v_1$, $v_2$, $\cdots$, $v_n$ of $V$. Then \[Tv_1=\sum_{k=1}^n\lambda_kv_k.\]Consider $U=\m{span}(v_1,v_2)$; since $U$ is invariant under $T$ by assumption, $Tv_1\in U$. Hence $\lambda_3=\cdots=\lambda_n=0$. Similarly, considering $U=\m{span}(v_1,v_3)$ (note that $\dim V \ge 3$), we conclude $\lambda_2=\lambda_4=\cdots=\lambda_n=0$. Hence $\lambda_2=\cdots=\lambda_n=0$, which means $v_1$ is an eigenvector of $T$. Since $v$ was chosen arbitrarily, every nonzero vector in $V$ is an eigenvector of $T$. By Problem 26, it follows that $T$ is a scalar multiple of the identity operator.


29. Solution: See Linear Algebra Done Right Solution Manual Chapter 5 Problem 9.


30. Solution: Note that $T$ has at most $\dim(\R^3)=3$ eigenvalues (by 5.13), and $4$, $5$, and $\sqrt{7}$ are eigenvalues of $T$; it follows that $9$ is not an eigenvalue of $T$. Hence $T-9I$ is surjective (by 5.6). Thus there exists $x\in\R^3$ such that $(T-9I)x=(4,5,\sqrt{7})$, namely $Tx-9x=(4,5,\sqrt{7})$.
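For a concrete illustration (the operator below is my own choice of a $T$ with eigenvalues $4$, $5$, $\sqrt 7$, not the one in the exercise), one can actually solve the equation numerically:

```python
import numpy as np

# An illustrative T on R^3 with eigenvalues 4, 5, sqrt(7): the diagonal operator
T = np.diag([4.0, 5.0, np.sqrt(7.0)])
b = np.array([4.0, 5.0, np.sqrt(7.0)])

# 9 is not an eigenvalue, so T - 9I is invertible and (T - 9I)x = b has a solution
x = np.linalg.solve(T - 9 * np.eye(3), b)
print(x)
assert np.allclose(T @ x - 9 * x, b)
```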


31. Solution: If there exists $T\in \ca L(V)$ such that $v_1$, $\cdots$, $v_m$ are eigenvectors of $T$ corresponding to distinct eigenvalues, then $v_1$, $\cdots$, $v_m$ is linearly independent by 5.10.

Conversely, if $v_1$, $\cdots$, $v_m$ is linearly independent, then we can extend it to a basis $v_1$, $\cdots$, $v_m$, $v_{m+1}$, $\cdots$, $v_n$ of $V$. Define $T\in \ca L(V)$ by \[Tv_i=iv_i,\quad i=1,\cdots,n.\]Then $v_1$, $\cdots$, $v_m$ are eigenvectors of $T$ corresponding to the distinct eigenvalues $1$, $\cdots$, $m$, respectively.


32. Solution: Let $V=\m{span}(e^{\lambda_1 x},\cdots,e^{\lambda_n x})$, and define an operator $T\in \ca L(V)$ by $Tf=f'$ (one should check that $T\in \ca L(V)$). Then \[Te^{\lambda_i x}=\lambda_ie^{\lambda_i x}.\]Hence $\lambda_i$ is an eigenvalue of $T$ with corresponding eigenvector $e^{\lambda_i x}$. As $\lambda_1$, $\cdots$, $\lambda_n$ is a list of distinct real numbers, it follows by 5.10 that $e^{\lambda_1 x}$, $\cdots$, $e^{\lambda_n x}$ is linearly independent.
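A numerical illustration (not a proof): if $c_1e^{\lambda_1 x}+\cdots+c_ne^{\lambda_n x}=0$ for all $x$, then sampling at $n$ points gives a linear system $Mc=0$; an invertible sample matrix $M$ forces $c=0$. The $\lambda_i$ and sample points below are arbitrary choices:

```python
import numpy as np

lambdas = np.array([-1.0, 0.0, 0.5, 2.0])    # distinct real numbers
points = np.array([0.0, 0.3, 0.7, 1.0])      # n sample points

M = np.exp(np.outer(points, lambdas))        # M[j, i] = exp(lambda_i * x_j)
print(np.linalg.cond(M))                     # finite condition number: M is invertible
assert np.linalg.matrix_rank(M) == len(lambdas)
```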


33. Solution: By definition, for any $x+\m{range} T\in V/(\m{range} T )$, we have \[ (T/(\m{range} T ))(x+\m{range} T)=Tx+\m{range} T. \]Since $Tx\in \m{range} T$, it follows that $(T/(\m{range} T ))(x+\m{range} T)=0$. As $x+\m{range} T$ was chosen arbitrarily, we conclude that $T/(\m{range} T )=0$.


34. Solution: By definition, for any $x+\m{null} T\in V/(\m{null} T )$, we have \[ (T/(\m{null} T ))(x+\m{null} T)=Tx+\m{null} T. \]Hence $T/(\m{null} T )$ is injective if and only if\[Tx\in \m{null} T\iff x\in\m{null}T.\]We claim this condition is equivalent to $\m{null} T\cap\m{range} T=\{0\}$. Indeed, assume $Tx\in \m{null} T\iff x\in\m{null}T$ and let $v\in\m{null} T\cap\m{range} T$. Then there exists $u\in V$ such that $Tu=v$, and $Tu\in \m{null} T$ implies $u\in\m{null}T$, that is, $v=Tu=0$. Conversely, if $\m{null} T\cap\m{range} T=\{0\}$ and $Tx\in\m{null}T$, then $Tx\in\m{null} T\cap\m{range} T=\{0\}$, so $Tx=0$, i.e. $x\in\m{null}T$ (the other implication always holds). This completes the proof.


35. Solution: Suppose $\lambda\in\mb F$ is an eigenvalue of $T/U$; we need to show $\lambda$ is an eigenvalue of $T$. There exists a nonzero $x+U\in V/U$ (i.e. $x\not\in U$) such that \[(T/U)(x+U)=\lambda(x+U)\Longrightarrow Tx-\lambda x\in U.\]If $\lambda$ is an eigenvalue of $T|_U$, then we are done, since every eigenvalue of $T|_U$ is an eigenvalue of $T$. If $\lambda$ is not an eigenvalue of $T|_U$, then $T|_U-\lambda I:U\to U$ is invertible by 5.6 (here we use $\dim V<\infty$). Hence, since $Tx-\lambda x\in U$, there exists $y\in U$ such that \[ (T|_U-\lambda I)y=Tx-\lambda x\Longrightarrow Ty-\lambda y=Tx-\lambda x. \]Hence we have \[ T(x-y)=\lambda(x-y), \]and $x-y\ne 0$ since $x\not\in U$ and $y\in U$. It follows that $\lambda$ is an eigenvalue of $T$.


36. Solution: In Problem 32, we showed that $1=e^{0x}$, $e^x$, $e^{2x}$, $\cdots$ is linearly independent in the vector space of real-valued functions on $\R$ (every finite sublist is linearly independent). Consider $V=\m{span}(1,e^x,e^{2x},\cdots)$ and $U=\m{span}(e^x,e^{2x},\cdots)$; then $U$ and $V$ are subspaces of the vector space of real-valued functions on $\R$. Define $T\in\ca L(V)$ by $T(f)=e^xf$ (one can check that $T\in \ca L(V)$ and that $U$ is invariant under $T$). Considering $T/U$, we have \[(T/U)(1+U)=e^x+U=0.\]Since $1\not\in U$, it follows that $0$ is an eigenvalue of $T/U$. However, $0$ is not an eigenvalue of $T$: if there were a nonzero $f\in V$ such that $Tf=0$, then $e^xf=0$, hence $f=0$ since $e^x\ne 0$ for any $x\in \R$, a contradiction.

