1. Solution: Suppose that for some basis $v_1$, $\cdots$, $v_n$ of $V$ and some basis $w_1$, $\cdots$, $w_m$ of $W$, the matrix of $T$ has at most $\dim \m{range} T-1$ nonzero entries. Then at most $\dim \m{range} T-1$ of the vectors $Tv_1$, $\cdots$, $Tv_n$ are nonzero. Since $\m{range} T=\m{span}(Tv_1,\cdots,Tv_n)$, it follows that \[\dim \m{range} T\le \dim \m{range} T-1,\]a contradiction. This completes the proof.
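As a quick numerical sanity check (a NumPy sketch, not part of the proof), one can confirm on random examples that the rank of a matrix never exceeds its number of nonzero entries:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random matrices with some entries zeroed out: the rank
# never exceeds the number of nonzero entries.
for _ in range(100):
    A = rng.standard_normal((4, 5))
    A[rng.random((4, 5)) < 0.5] = 0.0   # sparsify at random
    assert np.linalg.matrix_rank(A) <= np.count_nonzero(A)
print("ok")
```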

2. Solution: A basis of $\ca P_3(\R)$ is $x^3$, $x^2$, $x$, $1$, and the corresponding basis of $\ca P_2(\R)$ is $3x^2$, $2x$, $1$ (in the indicated order). (Check it!)
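The claim is easy to verify numerically. In this NumPy sketch (not part of the original solution), polynomials are represented by coefficient vectors with the highest degree first, so that `np.polyder` implements the differentiation map $D$; the resulting matrix has $1$s on the diagonal and a zero last column, as 3.22 promises.

```python
import numpy as np

# Polynomials as coefficient vectors (highest degree first),
# so np.polyder implements the differentiation map D.
V_basis = [np.array([1., 0, 0, 0]),   # x^3
           np.array([0., 1, 0, 0]),   # x^2
           np.array([0., 0, 1, 0]),   # x
           np.array([0., 0, 0, 1])]   # 1
W_basis = [np.array([3., 0, 0]),      # 3x^2
           np.array([0., 2, 0]),      # 2x
           np.array([0., 0, 1])]      # 1

# Column j of M(D) holds the coordinates of D v_j in the chosen basis of P_2(R).
W = np.column_stack(W_basis)
M = np.column_stack([np.linalg.solve(W, np.polyder(v)) for v in V_basis])
print(M)   # 1s on the diagonal, zero last column
```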

3. Solution: We use the notation and the proof of 3.22. Extend $Tv_1$, $\cdots$, $Tv_n$ to a basis $Tv_1$, $\cdots$, $Tv_n$, $\mu_1$, $\cdots$, $\mu_s$ of $W$. Then with respect to the basis $v_1$, $\cdots$, $v_n$, $u_1$, $\cdots$, $u_m$ of $V$ and the basis $Tv_1$, $\cdots$, $Tv_n$, $\mu_1$, $\cdots$, $\mu_s$ of $W$, all entries of $\ca M(T)$ are $0$ except the entries in row $j$, column $j$, which equal $1$ for $1\le j\le \dim \m{range} T$.
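For concrete matrices the same canonical form can be reached numerically via the singular value decomposition, a different route from the proof of 3.22; the following NumPy sketch is only an illustration, with a random rank-$2$ matrix standing in for $\ca M(T)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-2 matrix, playing the role of M(T) in some initial bases.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 5))
r = np.linalg.matrix_rank(A)

# SVD gives A = U S Vt with U, Vt orthogonal, hence change-of-basis matrices.
U, s, Vt = np.linalg.svd(A)

# Absorb the singular values into the basis of W: columns of P = U @ D.
D = np.eye(4)
D[:r, :r] = np.diag(s[:r])
P = U @ D                       # new basis of W (columns)
Q = Vt.T                        # new basis of V (columns)

M = np.linalg.solve(P, A @ Q)   # matrix of T w.r.t. the new bases
print(np.round(M, 10))          # identity block of size r, zeros elsewhere
```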

4. Solution: If $Tv_1=0$, then any basis $w_1$, $\cdots$, $w_n$ of $W$ will satisfy the desired conditions. If $Tv_1\ne 0$, then $Tv_1$ can be extended to a basis $w_1$, $\cdots$, $w_n$ of $W$ with $w_1=Tv_1$, and any such basis satisfies the desired conditions.
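A numerical illustration of the second case (a NumPy sketch, with a random matrix `A` playing the role of $\ca M(T)$): taking $w_1=Tv_1$ and extending by random vectors, the first column of the new matrix becomes $(1,0,\cdots,0)$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # M(T) w.r.t. v_1,...,v_m and some basis of W
c = A[:, 0]                       # coordinates of T v_1 (assumed nonzero)

# New basis of W: w_1 = T v_1, the rest random (invertible with probability 1).
P = rng.standard_normal((4, 4))
P[:, 0] = c

M = np.linalg.solve(P, A)         # M(T) w.r.t. the v's and the new w's
print(np.round(M[:, 0], 10))      # first column is (1, 0, 0, 0)
```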

5. Solution: Let $\nu_1$, $\cdots$, $\nu_m$ be a basis of $V$, and denote the first row of $\ca M(T)$ with respect to the bases $\nu_1$, $\cdots$, $\nu_m$ and $w_1$, $\cdots$, $w_n$ by $(a_1,\cdots,a_m)$. If $(a_1,\cdots,a_m)=0$, then we can choose $v_i=\nu_i$ for $i=1,\cdots,m$. If $(a_1,\cdots,a_m)\ne 0$, suppose $a_i\ne 0$ and let \[ v_1=\frac{\nu_i}{a_i},\quad v_j=\nu_{j-1}-a_{j-1}v_1,\quad v_k=\nu_k-a_kv_1 \]for $j=2,\cdots,i$ and $k=i+1,\cdots,m$. One can check that $v_1$, $\cdots$, $v_m$ form a basis of $V$ satisfying the desired conditions.
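The construction can be checked numerically. In this NumPy sketch the columns of `Q` express $v_1$, $\cdots$, $v_m$ in $\nu$-coordinates, so the matrix with respect to the new basis is $AQ$ (note the $0$-based indices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
A = rng.standard_normal((n, m))   # M(T) w.r.t. nu_1,...,nu_m and w_1,...,w_n
a = A[0]                          # first row (a_1, ..., a_m), assumed nonzero
i = int(np.flatnonzero(a)[0])     # index with a_i != 0 (0-based)

# Columns of Q express the new basis v_1, ..., v_m in nu-coordinates.
Q = np.zeros((m, m))
Q[i, 0] = 1 / a[i]                                    # v_1 = nu_i / a_i
for j in range(1, i + 1):
    Q[:, j] = np.eye(m)[j - 1] - a[j - 1] * Q[:, 0]   # v_j = nu_{j-1} - a_{j-1} v_1
for k in range(i + 1, m):
    Q[:, k] = np.eye(m)[k] - a[k] * Q[:, 0]           # v_k = nu_k - a_k v_1

M = A @ Q                         # M(T) w.r.t. the new v's and the old w's
print(np.round(M[0], 10))         # first row is (1, 0, ..., 0)
```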

6. Solution: Suppose there exist a basis $v_1$, $\cdots$, $v_m$ of $V$ and a basis $w_1$, $\cdots$, $w_n$ of $W$ such that, with respect to these bases, all entries of $\ca M(T)$ equal $1$. Then \[Tv_i=w_1+\cdots+w_n,\quad i=1,\cdots,m.\]Hence $\m{range} T=\m{span}(w_1+\cdots+w_n)$, and it follows that $\dim \m{range} T = 1$.

Conversely, if $\dim \m{range} T = 1$, then $\dim \m{null} T =\dim V-1$. Let $\nu_1$, $\nu_2$, $\cdots$, $\nu_m$ be a basis of $V$ such that $\nu_2$, $\cdots$, $\nu_m\in \m{null} T$. Since $T\nu_1\ne0$, we can extend it to a basis $T\nu_1$, $w_2$, $\cdots$, $w_n$ of $W$. Let $w_1=T\nu_1-w_2-\cdots-w_n$ and $v_1=\nu_1$, $v_i=\nu_i+\nu_1$ for $i=2,\cdots,m$. Then \[ T(v_1)=T(v_i)=w_1+w_2+\cdots+w_n,\quad i=2,\cdots,m. \]It is clear that $v_1$, $\cdots$, $v_m$ is a basis of $V$ and $w_1$, $\cdots$, $w_n$ is a basis of $W$, and one can directly check that all entries of $\ca M(T)$ with respect to these bases equal $1$.
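The converse construction can also be traced numerically. In this NumPy sketch (an illustration only), `A` is $\ca M(T)$ with respect to $\nu_1$, $\cdots$, $\nu_m$ (only the first column is nonzero), `Q` encodes the new basis of $V$, and `P` encodes the new basis of $W$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4                      # n = dim W, m = dim V

# M(T) w.r.t. nu_1, ..., nu_m and the standard basis of W:
# T(nu_1) = u != 0 and nu_2, ..., nu_m span null T, so only column 1 is nonzero.
u = rng.standard_normal(n)
A = np.zeros((n, m))
A[:, 0] = u

# New basis of V: v_1 = nu_1, v_i = nu_i + nu_1 (columns of Q in nu-coordinates).
Q = np.eye(m)
Q[0, :] = 1.0

# Extend u = T(nu_1) to a basis u, w_2, ..., w_n of W, then set
# w_1 = u - w_2 - ... - w_n (columns of P in the old W-coordinates).
W = rng.standard_normal((n, n))
W[:, 0] = u                           # basis u, w_2, ..., w_n
P = W.copy()
P[:, 0] = u - W[:, 1:].sum(axis=1)    # w_1

# Matrix of T with respect to the new bases:
M = np.linalg.solve(P, A @ Q)
print(np.round(M, 10))                # all entries equal 1
```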

7. Solution: Given a basis $v_1$, $\cdots$, $v_m$ of $V$ and a basis $w_1$, $\cdots$, $w_n$ of $W$, denote $\ca M(T)$ and $\ca M(S)$ with respect to these bases by $A$ and $B$, respectively. Then we have \[ Tv_j=\sum_{k=1}^n A_{k,j}w_k \]and \[Sv_j=\sum_{k=1}^n B_{k,j}w_k.\]Hence \[ (T+S)v_j=Tv_j+Sv_j=\sum_{k=1}^n (A_{k,j}+B_{k,j})w_k, \]so the entry in row $k$, column $j$, of $\ca M(T+S)$ with respect to these bases is $A_{k,j}+B_{k,j}$. By 3.35, we deduce that $\ca M(T+S)=\ca M(T)+\ca M(S)$.

8. Verify 3.38.

Solution: It is almost the same as the previous Exercise.

9. Prove 3.52.

Solution: It is almost the same as Problem 11. Just consider the entries.

10. Suppose $A$ is an $m$-by-$n$ matrix and $C$ is an $n$-by-$p$ matrix. Prove that \[(AC)_{j,\cdot}=A_{j,\cdot}C\] for $1\le j\le m$. In other words, show that row $j$ of $AC$ equals (row $j$ of $A$) times $C$.

Solution: It is almost the same as Problem 11. Just consider the entries.
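The identity is easy to confirm numerically (a NumPy sketch, with random matrices of compatible sizes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # m-by-n
C = rng.standard_normal((4, 2))   # n-by-p

# Row j of AC equals (row j of A) times C.
for j in range(3):
    assert np.allclose((A @ C)[j], A[j] @ C)
print("ok")
```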

These exercises are tedious. I prefer solving other, more interesting exercises… If you have questions about them, please leave a comment.

11. Solution: By 3.41, we have \[ (aC)_{1,k}=\sum_{i=1}^na_iC_{i,k}. \]It is obvious that $(a_iC_{i,\cdot})_{1,k}=a_iC_{i,k}$. Hence \[ (aC)_{1,k}=(a_1C_{1,\cdot})_{1,k}+\cdots+(a_nC_{n,\cdot})_{1,k}=(a_1C_{1,\cdot}+\cdots+a_nC_{n,\cdot})_{1,k} \]by 3.35. Thus we deduce that $aC=a_1C_{1,\cdot}+\cdots+a_nC_{n,\cdot}$.
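Numerically, the claim says $aC$ is the linear combination of the rows of $C$ with coefficients $a_1,\cdots,a_n$ (a NumPy sketch, with `a` a $1$-by-$n$ row vector):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(4)        # a 1-by-n row vector
C = rng.standard_normal((4, 3))   # an n-by-p matrix

# aC equals a_1 C_{1,.} + ... + a_n C_{n,.}, a combination of the rows of C.
combo = sum(a[i] * C[i] for i in range(4))
assert np.allclose(a @ C, combo)
print("ok")
```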

12. Solution: Let $A=\left( \begin{array}{cc} 0 & 1 \\ 1 & 0 \\ \end{array} \right) $ and $C=\left( \begin{array}{cc} 1 & 0 \\ 0 & 2 \\ \end{array} \right) $. Then we have \[ AC=\left( \begin{array}{cc} 0 & 2 \\ 1 & 0 \\ \end{array} \right) \]and\[ CA=\left( \begin{array}{cc} 0 & 1 \\ 2 & 0 \\ \end{array} \right) .\]Hence $AC\ne CA$.
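The computation can be confirmed with NumPy:

```python
import numpy as np

A = np.array([[0., 1.], [1., 0.]])
C = np.array([[1., 0.], [0., 2.]])

print(A @ C)   # rows (0, 2) and (1, 0)
print(C @ A)   # rows (0, 1) and (2, 0)
assert not np.array_equal(A @ C, C @ A)
```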

13. Solution: See Linear Algebra Done Right Solution Manual Chapter 3 Problem 17.

14. Solution: See Linear Algebra Done Right Solution Manual Chapter 3 Problem 18.

15. Solution: Note that $A^3=(AA)A$. Denote $AA=B$; then by definition (3.41) \begin{equation}\label{3CP151} B_{j,r}=\sum_{p=1}^n A_{j,p}A_{p,r}. \end{equation}Similarly, the entry in row $j$, column $k$, of $A^3$ is \[ \sum_{r=1}^n B_{j,r}A_{r,k}. \]Hence by $(\ref{3CP151})$, we have \[ (A^3)_{j,k}=\sum_{r=1}^n B_{j,r}A_{r,k}=\sum_{r=1}^n \sum_{p=1}^n A_{j,p}A_{p,r}A_{r,k}=\sum_{p=1}^n\sum_{r=1}^nA_{j,p}A_{p,r}A_{r,k}. \]
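The double-sum formula $\sum_{p=1}^n\sum_{r=1}^nA_{j,p}A_{p,r}A_{r,k}$ for the entries of $A^3$ is easy to confirm numerically (a NumPy sketch on a random matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))

# Entry (j, k) of A^3 via the double sum over p and r.
j, k = 1, 2
triple = sum(A[j, p] * A[p, r] * A[r, k] for p in range(n) for r in range(n))
assert np.isclose(triple, np.linalg.matrix_power(A, 3)[j, k])
print("ok")
```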