Chapter 3 Exercise C

1. Suppose $V$ and $W$ are finite-dimensional and $T\in\ca L(V,W)$. Show that with respect to each choice of bases of $V$ and $W$, the matrix of $T$ has at least $\dim \m{range} T$ nonzero entries.

Solution: Suppose that for some basis $v_1$, $\cdots$, $v_n$ of $V$ and some basis $w_1$, $\cdots$, $w_m$ of $W$, the matrix of $T$ has at most $\dim \m{range} T-1$ nonzero entries. Each nonzero vector among $Tv_1$, $\cdots$, $Tv_n$ contributes at least one nonzero entry to its column of the matrix, so at most $\dim \m{range} T-1$ of the vectors $Tv_1$, $\cdots$, $Tv_n$ are nonzero. Since $\m{range} T=\m{span}(Tv_1,\cdots,Tv_n)$, the range of $T$ is spanned by at most $\dim \m{range} T-1$ vectors, hence $\dim \m{range} T\le \dim \m{range} T-1$. This is a contradiction, completing the proof.
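The counting argument can be sanity-checked numerically. Below is a minimal sketch (the sample matrix and the row-reduction `rank` helper are my own illustration, not part of the proof): for any matrix of $T$, the number of nonzero entries is at least the rank, i.e. at least $\dim \m{range} T$.

```python
# Illustration of Exercise 1: nonzero entries of a matrix >= its rank.
from fractions import Fraction

def rank(M):
    """Row-reduce a copy of M over the rationals and count pivot rows."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        pivot = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

M = [[1, 2, 0],
     [0, 0, 3],
     [2, 4, 0]]          # rank 2 (third row is twice the first)
nonzero = sum(1 for row in M for x in row if x != 0)
assert nonzero >= rank(M)   # 5 nonzero entries >= rank 2
```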

2. Suppose $D\in\ca L(\ca P_3(\R),\ca P_2(\R))$ is the differentiation map defined by $Dp=p'$. Find a basis of $\ca P_3(\R)$ and a basis of $\ca P_2(\R)$ such that the matrix of $D$ with respect to these bases is $\left( \begin{array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ \end{array} \right).$ [Compare the exercise above to Example 3.34. The next exercise generalizes the exercise above.]

Solution: Take the basis $x^3$, $x^2$, $x$, $1$ of $\ca P_3(\R)$ and the basis $3x^2$, $2x$, $1$ of $\ca P_2(\R)$ (in the indicated order). Then $D(x^3)=3x^2$, $D(x^2)=2x$, $D(x)=1$, and $D(1)=0$, so the columns of $\ca M(D)$ are exactly those of the matrix above.
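The claimed bases can be checked mechanically. A small sketch (my own encoding, not from the text): represent a polynomial by its coefficient list $[c_0, c_1, c_2, c_3]$ for $c_0 + c_1 x + c_2 x^2 + c_3 x^3$, differentiate, and read off each column of $\ca M(D)$.

```python
# Check the bases of Exercise 2 by differentiating coefficient lists.
def diff(p):
    """Differentiate a coefficient list: d/dx sum c_k x^k = sum k c_k x^(k-1)."""
    return [k * p[k] for k in range(1, len(p))]

# Basis of P_3(R): x^3, x^2, x, 1 (as coefficient lists of degree <= 3)
V_basis = [[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]]
# Basis of P_2(R): 3x^2, 2x, 1
W_basis = [[0, 0, 3], [0, 2, 0], [1, 0, 0]]

# Column j of M(D) holds the coordinates of D(v_j) in the basis of P_2(R);
# here each D(v_j) is exactly one basis vector (or zero), so the matrix
# can be filled in by direct comparison.
matrix = [[0] * 4 for _ in range(3)]
for j, v in enumerate(V_basis):
    dv = diff(v)
    for i, w in enumerate(W_basis):
        if dv == w:
            matrix[i][j] = 1

assert matrix == [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
```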

3. Suppose $V$ and $W$ are finite-dimensional and $T\in\ca L(V,W)$. Prove that there exist a basis of $V$ and a basis of $W$ such that with respect to these bases, all entries of $\ca M(T)$ are 0 except that the entries in row $j$, column $j$, equal $1$ for $1\le j\le \dim \m{range} T$.

Solution: We use the notation and the proof of 3.22. Since $Tv_1$, $\cdots$, $Tv_n$ is a basis of $\m{range} T$, it is linearly independent in $W$, so we can extend it to a basis $Tv_1$, $\cdots$, $Tv_n$, $\mu_1$, $\cdots$, $\mu_s$ of $W$. Then with respect to the basis $v_1$, $\cdots$, $v_n$, $u_1$, $\cdots$, $u_m$ of $V$ and the basis $Tv_1$, $\cdots$, $Tv_n$, $\mu_1$, $\cdots$, $\mu_s$ of $W$, all entries of $\ca M(T)$ are 0 except that the entries in row $j$, column $j$, equal $1$ for $1\le j\le \dim \m{range} T$.

4. Suppose $v_1$, $\cdots$, $v_m$ is a basis of $V$ and $W$ is finite-dimensional. Suppose $T\in\ca L(V,W)$. Prove that there exists a basis $w_1$, $\cdots$, $w_n$ of $W$ such that all the entries in the first column of $\ca M(T)$ (with respect to the bases $v_1$, $\cdots$, $v_m$ and $w_1$, $\cdots$, $w_n$) are $0$ except for possibly a $1$ in the first row, first column. [In this exercise, unlike Exercise 3, you are given the basis of $V$ instead of being able to choose a basis of $V$.]

Solution: If $Tv_1=0$, then the first column of $\ca M(T)$ is zero, so any basis $w_1$, $\cdots$, $w_n$ of $W$ will satisfy the desired conditions. If $Tv_1\ne 0$, then $Tv_1$ is linearly independent and can be extended to a basis $w_1$, $\cdots$, $w_n$ of $W$ with $w_1=Tv_1$; then $Tv_1=w_1$, so the first column of $\ca M(T)$ is $1$ in the first row and $0$ elsewhere, as desired.

5. Suppose $w_1$, $\cdots$, $w_n$ is a basis of $W$ and $V$ is finite-dimensional. Suppose $T\in\ca L(V,W)$. Prove that there exists a basis $v_1$, $\cdots$, $v_m$ of $V$ such that all the entries in the first row of $\ca M(T)$ (with respect to the bases $v_1$, $\cdots$, $v_m$ and $w_1$, $\cdots$, $w_n$) are $0$ except for possibly a $1$ in the first row, first column. [In this exercise, unlike Exercise 3, you are given the basis of $W$ instead of being able to choose a basis of $W$.]

Solution: Let $\nu_1$, $\cdots$, $\nu_m$ be any basis of $V$, and denote the first row of $\ca M(T)$ with respect to the bases $\nu_1$, $\cdots$, $\nu_m$ and $w_1$, $\cdots$, $w_n$ by $(a_1,\cdots,a_m)$. If $(a_1,\cdots,a_m)=0$, then we can choose $v_i=\nu_i$ for $i=1,\cdots,m$. If $(a_1,\cdots,a_m)\ne 0$, choose $i$ with $a_i\ne 0$ and let $$v_1=\frac{\nu_i}{a_i},\quad v_j=\nu_{j-1}-a_{j-1}v_1,\quad v_k=\nu_k-a_kv_1$$ for $j=2,\cdots,i$ and $k=i+1,\cdots,m$. Then $Tv_1$ has coefficient $a_i/a_i=1$ on $w_1$, while every other $Tv_j$ has coefficient $0$ on $w_1$; moreover $v_1$, $\cdots$, $v_m$ is a basis of $V$, since each $\nu_i$ can be recovered from it. Hence $v_1$, $\cdots$, $v_m$ satisfies the desired conditions.
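The change of basis above can be verified numerically. A sketch with a made-up first row (the numbers are hypothetical, not part of the proof): starting from a first row $(a_1,\cdots,a_m)$ with $a_i\ne 0$, the new basis vectors have first-row entries $1, 0, \cdots, 0$.

```python
# Check the Exercise 5 construction on a sample first row.
from fractions import Fraction

a = [Fraction(x) for x in (0, 3, -2, 5)]         # hypothetical first row
i = next(j for j, x in enumerate(a) if x != 0)   # 0-based index with a_i != 0
m = len(a)

def first_row_entry(coeffs):
    """First-row entry of M(T) applied to the combination sum coeffs[j]*nu_j.

    The first row acts linearly on coordinates, so the entry for a linear
    combination of the old basis vectors is the same combination of the a_j.
    """
    return sum(c * x for c, x in zip(coeffs, a))

def e(j):
    """Coordinates of nu_j in the old basis."""
    return [Fraction(int(k == j)) for k in range(m)]

v1 = [x / a[i] for x in e(i)]                    # v_1 = nu_i / a_i
new_basis = [v1] + [
    [x - a[j] * y for x, y in zip(e(j), v1)]     # nu_j - a_j v_1
    for j in range(m) if j != i
]

new_row = [first_row_entry(v) for v in new_basis]
assert new_row == [1, 0, 0, 0]
```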

6. Suppose $V$ and $W$ are finite-dimensional and $T\in\ca L(V,W)$. Prove that $\dim \m{range} T = 1$ if and only if there exist a basis of $V$ and a basis of $W$ such that with respect to these bases, all entries of $\ca M(T)$ equal $1$.

Solution: Suppose there exist a basis $v_1$, $\cdots$, $v_m$ of $V$ and a basis $w_1$, $\cdots$, $w_n$ of $W$ such that with respect to these bases, all entries of $\ca M(T)$ equal $1$. Then $$Tv_i=w_1+\cdots+w_n,\quad i=1,\cdots,m.$$ Hence $\m{range} T=\m{span}(w_1+\cdots+w_n)$, and $w_1+\cdots+w_n\ne 0$ since $w_1$, $\cdots$, $w_n$ is linearly independent. It follows that $\dim \m{range} T = 1$.

Conversely, if $\dim \m{range} T = 1$, then $\dim \m{null} T =\dim V-1$. Let $\nu_1$, $\nu_2$, $\cdots$, $\nu_m$ be a basis of $V$ such that $\nu_2$, $\cdots$, $\nu_m\in \m{null} T$. Note that $T\nu_1\ne0$, hence we can extend it to a basis $T\nu_1$, $w_2$, $\cdots$, $w_n$ of $W$. Let $w_1=T\nu_1-w_2-\cdots-w_n$, and let $v_1=\nu_1$ and $v_i=\nu_i+\nu_1$ for $i=2,\cdots,m$. Then $$Tv_1=Tv_i=w_1+w_2+\cdots+w_n,\quad i=2,\cdots,m.$$ Moreover, $v_1$, $\cdots$, $v_m$ is a basis of $V$ and $w_1$, $\cdots$, $w_n$ is a basis of $W$, since each list is obtained from a basis by operations that can be undone (each $\nu_i$, and each of $T\nu_1$, $w_2$, $\cdots$, $w_n$, is recoverable). Now we can check directly that all entries of $\ca M(T)$ with respect to these bases equal $1$.
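The converse construction can be run on a concrete rank-1 map. An illustrative sketch (the map $T(x,y,z)=(x,x)$ and the vectors below are my own choices, not from the text): following the recipe above, every column of $\ca M(T)$ comes out as all ones.

```python
# Concrete instance of the Exercise 6 converse: T : R^3 -> R^2 of rank 1.
def T(v):
    return [v[0], v[0]]              # T(x, y, z) = (x, x)

nu = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # nu_2, nu_3 span null T
Tnu1 = T(nu[0])                          # (1, 1), nonzero
w2 = [1, 0]                              # extends T(nu_1) to a basis of R^2
w1 = [a - b for a, b in zip(Tnu1, w2)]   # w_1 = T(nu_1) - w_2
v = [nu[0]] + [[a + b for a, b in zip(nu[i], nu[0])] for i in (1, 2)]

def coords(u):
    """Coordinates of u in the basis w_1, w_2 (solve a 2x2 system by Cramer)."""
    det = w1[0] * w2[1] - w2[0] * w1[1]
    c1 = (u[0] * w2[1] - w2[0] * u[1]) / det
    c2 = (w1[0] * u[1] - u[0] * w1[1]) / det
    return [c1, c2]

# Each T(v_j) equals w_1 + w_2, so each column of M(T) is (1, 1).
MT = [coords(T(vj)) for vj in v]
assert all(col == [1, 1] for col in MT)
```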

7. Verify 3.36.

Solution: Given a basis $v_1$, $\cdots$, $v_m$ of $V$ and a basis $w_1$, $\cdots$, $w_n$ of $W$, denote $\ca M(T)$ and $\ca M(S)$ with respect to these bases by $A$ and $B$, respectively. Then we have $$Tv_j=\sum_{k=1}^n A_{k,j}w_k \quad\text{and}\quad Sv_j=\sum_{k=1}^n B_{k,j}w_k.$$ Hence $$(T+S)v_j=Tv_j+Sv_j=\sum_{k=1}^n (A_{k,j}+B_{k,j})w_k,$$ so the entry in row $k$, column $j$, of $\ca M(T+S)$ with respect to these bases is $A_{k,j}+B_{k,j}$. By 3.35, we deduce that $\ca M(T+S)=\ca M(T)+\ca M(S)$.

8. Verify 3.38.

Solution: It is almost the same as the previous exercise: in the computation above, replace $T+S$ by $\lambda T$ and the sum $A_{k,j}+B_{k,j}$ by the scalar multiple $\lambda A_{k,j}$.

9. Prove 3.52.

Solution: It is almost the same as Exercise 11 below; just consider the entries.

10. Suppose $A$ is an $m$-by-$n$ matrix and $C$ is an $n$-by-$p$ matrix. Prove that $(AC)_{j,\cdot}=A_{j,\cdot}C$ for $1\le j\le m$. In other words, show that row $j$ of $AC$ equals (row $j$ of $A$) times $C$.

Solution: It is almost the same as Exercise 11 below; just consider the entries.
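The row identity $(AC)_{j,\cdot}=A_{j,\cdot}C$ can at least be spot-checked. A sketch with sample matrices (an illustration, not the entrywise proof): row $j$ of the product equals the $1$-by-$p$ product of row $j$ of $A$ with $C$.

```python
# Spot-check (AC)_{j,.} = A_{j,.} C for sample matrices.
def matmul(X, Y):
    """Plain matrix product via (XY)_{j,k} = sum_r X_{j,r} Y_{r,k}."""
    return [[sum(X[j][r] * Y[r][k] for r in range(len(Y)))
             for k in range(len(Y[0]))] for j in range(len(X))]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2-by-3
C = [[1, 0],
     [2, 1],
     [0, 3]]             # 3-by-2

AC = matmul(A, C)
for j in range(len(A)):
    # Row j of AC equals (row j of A, as a 1-by-3 matrix) times C.
    assert AC[j] == matmul([A[j]], C)[0]
```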

These exercises are tedious. I prefer solving other, more interesting exercises… If you have problems regarding them, please leave a comment.

11. Suppose $a=(a_1,\cdots,a_n)$ is a $1$-by-$n$ matrix and $C$ is an $n$-by-$p$ matrix. Prove that $aC=a_1C_{1,\cdot}+\cdots+a_nC_{n,\cdot}.$ In other words, show that $aC$ is a linear combination of the rows of $C$, with the scalars that multiply the rows coming from $a$.

Solution: By 3.41, we have $$(aC)_{1,k}=\sum_{i=1}^na_iC_{i,k}.$$ It is clear that $(a_iC_{i,\cdot})_{1,k}=a_iC_{i,k}$. Hence $$(aC)_{1,k}=(a_1C_{1,\cdot})_{1,k}+\cdots+(a_nC_{n,\cdot})_{1,k}=(a_1C_{1,\cdot}+\cdots+a_nC_{n,\cdot})_{1,k}$$ by 3.35. Thus we deduce that $aC=a_1C_{1,\cdot}+\cdots+a_nC_{n,\cdot}$.
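The identity can also be checked on sample data (an illustration only; the general proof is the entrywise computation above): the $1$-by-$p$ product $aC$ coincides with the linear combination of the rows of $C$ with scalars taken from $a$.

```python
# Check aC = a_1 C_{1,.} + ... + a_n C_{n,.} for sample data.
a = [2, -1, 3]                       # 1-by-3
C = [[1, 0],
     [4, 5],
     [2, 1]]                         # 3-by-2

# Left side: the 1-by-p product aC, entry k = sum_i a_i C_{i,k} (3.41).
aC = [sum(a[i] * C[i][k] for i in range(3)) for k in range(2)]

# Right side: accumulate the scalar multiples a_i * (row i of C).
combo = [0, 0]
for i in range(3):
    combo = [x + a[i] * c for x, c in zip(combo, C[i])]

assert aC == combo
```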

12. Give an example with $2$-by-$2$ matrices to show that matrix multiplication is not commutative. In other words, find $2$-by-$2$ matrices $A$ and $C$ such that $AC\ne CA$.

Solution: Let $A=\left( \begin{array}{cc} 0 & 1 \\ 1 & 0 \\ \end{array} \right)$ and $C=\left( \begin{array}{cc} 1 & 0 \\ 0 & 2 \\ \end{array} \right)$. Then we have $$AC=\left( \begin{array}{cc} 0 & 2 \\ 1 & 0 \\ \end{array} \right)\quad\text{and}\quad CA=\left( \begin{array}{cc} 0 & 1 \\ 2 & 0 \\ \end{array} \right).$$ Hence $AC\ne CA$.
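The counterexample is easy to verify by direct multiplication:

```python
# Verify the 2-by-2 counterexample to commutativity.
def matmul(X, Y):
    return [[sum(X[j][r] * Y[r][k] for r in range(2)) for k in range(2)]
            for j in range(2)]

A = [[0, 1], [1, 0]]
C = [[1, 0], [0, 2]]
assert matmul(A, C) == [[0, 2], [1, 0]]
assert matmul(C, A) == [[0, 1], [2, 0]]
assert matmul(A, C) != matmul(C, A)
```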

13. Prove that the distributive property holds for matrix addition and matrix multiplication. In other words, suppose $A$, $B$, $C$, $D$, $E$, and $F$ are matrices whose sizes are such that $A(B+C)$ and $(D+ E)F$ make sense. Prove that $AB + AC$ and $DF + EF$ both make sense and that $A(B+C)=AB + AC$ and $(D+ E)F= DF + EF$.

Solution: See Linear Algebra Done Right Solution Manual Chapter 3 Problem 17.
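As a quick numerical illustration (sample matrices of my own choosing; the actual proof is the entrywise computation in the referenced solution), both distributive laws hold for a concrete triple:

```python
# Spot-check A(B+C) = AB + AC and (B+C)A = BA + CA for 2-by-2 samples.
def matmul(X, Y):
    return [[sum(X[j][r] * Y[r][k] for r in range(len(Y)))
             for k in range(len(Y[0]))] for j in range(len(X))]

def matadd(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]
assert matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C))
assert matmul(matadd(B, C), A) == matadd(matmul(B, A), matmul(C, A))
```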

14. Prove that matrix multiplication is associative. In other words, suppose $A$, $B$, and $C$ are matrices whose sizes are such that $(AB)C$ makes sense. Prove that $A(BC)$ makes sense and that $(AB)C=A(BC)$.

Solution: See Linear Algebra Done Right Solution Manual Chapter 3 Problem 18.
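A numerical illustration with rectangular sizes (sample matrices; see the referenced solution for the proof): $(AB)C$ and $A(BC)$ agree even when $A$, $B$, $C$ are not square.

```python
# Spot-check (AB)C = A(BC) with compatible rectangular matrices.
def matmul(X, Y):
    return [[sum(X[j][r] * Y[r][k] for r in range(len(Y)))
             for k in range(len(Y[0]))] for j in range(len(X))]

A = [[1, 2, 0],
     [0, 1, 1]]          # 2-by-3
B = [[1, 0],
     [2, 1],
     [0, 3]]             # 3-by-2
C = [[1, 1],
     [0, 2]]             # 2-by-2
assert matmul(matmul(A, B), C) == matmul(A, matmul(B, C))
```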

15. Suppose $A$ is an $n$-by-$n$ matrix and $1\le j, k \le n$. Show that the entry in row $j$, column $k$, of $A^3$ (which is defined to mean $AAA$) is $$\sum_{p=1}^n\sum_{r=1}^nA_{j,p}A_{p,r}A_{r,k}.$$

Solution: Note that $AAA=(AA)A$. Denote $AA=B$; then by definition (3.41) $$\label{3CP151} B_{j,r}=\sum_{p=1}^n A_{j,p}A_{p,r}.$$ Similarly, the entry in row $j$, column $k$, of $A^3$ is $\sum_{r=1}^n B_{j,r}A_{r,k}$. Hence by $(\ref{3CP151})$, we have $$\sum_{r=1}^n B_{j,r}A_{r,k}=\sum_{r=1}^n \sum_{p=1}^n A_{j,p}A_{p,r}A_{r,k}=\sum_{p=1}^n\sum_{r=1}^nA_{j,p}A_{p,r}A_{r,k}.$$
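The double-sum formula can be checked against a direct computation of $A^3$ (a sample $3$-by-$3$ matrix; the proof is the computation above):

```python
# Verify (A^3)_{j,k} = sum_p sum_r A_{j,p} A_{p,r} A_{r,k} numerically.
n = 3
A = [[1, 2, 0],
     [0, 1, 3],
     [4, 0, 1]]

def matmul(X, Y):
    return [[sum(X[j][r] * Y[r][k] for r in range(n)) for k in range(n)]
            for j in range(n)]

A3 = matmul(matmul(A, A), A)         # A^3 = (AA)A
for j in range(n):
    for k in range(n):
        triple = sum(A[j][p] * A[p][r] * A[r][k]
                     for p in range(n) for r in range(n))
        assert A3[j][k] == triple
```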