If you find any mistakes, please make a comment! Thank you.

Solution to Linear Algebra Hoffman & Kunze Chapter 3.2

Exercise 3.2.1

(a) Geometrically, in the $x_1x_2$-plane, $T$ is reflection about the diagonal $x_1=x_2$ and $U$ is projection onto the $x_1$-axis.

(b) We have

  • $(U+T)(x_1,x_2)=(x_2,x_1)+(x_1,0)=(x_1+x_2,x_1)$.
  • $(UT)(x_1,x_2)=U(x_2,x_1)=(x_2,0)$.
  • $(TU)(x_1,x_2)=T(x_1,0)=(0,x_1)$.
  • $T^2(x_1,x_2)=T(x_2,x_1)=(x_1,x_2)$, the identity function.
  • $U^2(x_1,x_2)=U(x_1,0)=(x_1,0)$. So $U^2=U$.
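The formulas above are easy to sanity-check mechanically. The following small Python sketch (the function names `T` and `U` just mirror the operators in the exercise) verifies each identity on a few sample points:

```python
# Sanity check of the composite formulas for Exercise 3.2.1.
def T(v):  # reflection about the line x = y
    x1, x2 = v
    return (x2, x1)

def U(v):  # projection onto the x-axis
    x1, x2 = v
    return (x1, 0)

for v in [(1, 2), (-3, 5), (0, 7)]:
    x1, x2 = v
    assert tuple(a + b for a, b in zip(U(v), T(v))) == (x1 + x2, x1)  # U + T
    assert U(T(v)) == (x2, 0)   # UT
    assert T(U(v)) == (0, x1)   # TU
    assert T(T(v)) == v         # T^2 = I
    assert U(U(v)) == U(v)      # U^2 = U
print("all identities verified")
```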

Exercise 3.2.2

By Theorem 9 part (v), top of page 82, $T$ is invertible if and only if $\{T\epsilon_1,T\epsilon_2,T\epsilon_3\}$ is a basis of $\mathbb C^3$. Since $\mathbb C^3$ has dimension three, it suffices (by Corollary 1 page 46) to show $T\epsilon_1,T\epsilon_2,T\epsilon_3$ are linearly independent. To do this we row reduce the matrix
$$\left[\begin{array}{ccc}1&0&i\\0&1&1\\i&1&0\end{array}\right]$$to row-reduced echelon form. If it reduces to the identity then its rows are independent, otherwise they are dependent. Row reduction follows:
$$\left[\begin{array}{ccc}1&0&i\\0&1&1\\i&1&0\end{array}\right]\rightarrow\left[\begin{array}{ccc}1&0&i\\0&1&1\\0&1&1\end{array}\right]\rightarrow\left[\begin{array}{ccc}1&0&i\\0&1&1\\0&0&0\end{array}\right].$$This is in row-reduced echelon form and is not the identity. Thus $T$ is not invertible.
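The row reduction can be double-checked over $\mathbb C$ numerically. Below is a small sketch (the helper `rank` is ours, a generic Gauss–Jordan routine using Python's built-in complex numbers):

```python
# Row-reduce the matrix of T over the complex numbers to confirm it is singular.
# A rank below 3 means the rows are dependent, so T is not invertible.
A = [[1, 0, 1j],
     [0, 1, 1],
     [1j, 1, 0]]

def rank(M, tol=1e-12):
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, m) if abs(M[i][c]) > tol), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(m):
            if i != r:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

print(rank(A))  # → 2, so T is not invertible
```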

Exercise 3.2.3

The matrix representation of the transformation is
$$\left[\begin{array}{c}x_1\\x_2\\x_3\end{array}\right] \mapsto \left[\begin{array}{ccc}3&0&0\\1&-1&0\\2&1&1\end{array}\right]\cdot\left[\begin{array}{c}x_1\\x_2\\x_3\end{array}\right]$$where we've identified $\mathbb R^3$ with $\mathbb R^{3\times1}$. $T$ is invertible if and only if the matrix of the transformation is invertible. To determine this we row-reduce the matrix; in fact we row-reduce the augmented matrix, so as also to obtain the inverse for the second part of the Exercise.
$$\left[\begin{array}{ccc|ccc}3&0&0 & 1&0&0 \\ 1&-1&0 & 0&1&0 \\ 2&1&1 & 0&0&1\end{array}\right]$$$$\rightarrow\left[\begin{array}{ccc|ccc}1&-1&0 & 0&1&0 \\ 3&0&0 & 1&0&0 \\ 2&1&1 & 0&0&1\end{array}\right]$$$$\rightarrow\left[\begin{array}{ccc|ccc}1&-1&0 & 0&1&0 \\ 0&3&0 & 1&-3&0 \\ 0&3&1 & 0&-2&1\end{array}\right]$$$$\rightarrow\left[\begin{array}{ccc|ccc}1&-1&0 & 0&1&0 \\ 0&1&0 & 1/3&-1&0 \\ 0&3&1 & 0&-2&1\end{array}\right]$$$$\rightarrow\left[\begin{array}{ccc|ccc}1&0&0 & 1/3&0&0 \\ 0&1&0 & 1/3&-1&0 \\ 0&0&1 & -1&1&1\end{array}\right]$$Since the left side transformed into the identity, $T$ is invertible. The inverse transformation is given by
$$\left[\begin{array}{c}x_1\\x_2\\x_3\end{array}\right] \mapsto \left[\begin{array}{ccc}1/3&0&0\\1/3&-1&0\\-1&1&1\end{array}\right]\cdot\left[\begin{array}{c}x_1\\x_2\\x_3\end{array}\right]$$So$$T^{-1}(x_1,x_2,x_3)=(x_1/3,\ \ x_1/3-x_2,\ \ -x_1+x_2+x_3).$$
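The formula for $T^{-1}$ can be confirmed by composing it with $T$ in both orders. A quick sketch using exact rational arithmetic (the names `T` and `Tinv` are ours):

```python
from fractions import Fraction as F

# Check that the formula for T^{-1} really inverts T, using exact arithmetic.
def T(v):
    x1, x2, x3 = v
    return (3 * x1, x1 - x2, 2 * x1 + x2 + x3)

def Tinv(v):
    x1, x2, x3 = v
    return (F(x1, 3), F(x1, 3) - x2, -x1 + x2 + x3)

for v in [(F(1), F(2), F(3)), (F(-4), F(0), F(7))]:
    assert Tinv(T(v)) == v and T(Tinv(v)) == v
print("T and T^{-1} are mutual inverses on the samples")
```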

Exercise 3.2.4

Working with the matrix representation of $T$ we must show $(T^2-I)(T-3I)=0$, where $T$ is the operator of Exercise 3.2.3. By that exercise the matrix of $T$ in the standard basis is
$$A=\left[\begin{array}{ccc}3&0&0\\1&-1&0\\2&1&1\end{array}\right],$$so it suffices to show $(A^2-I)(A-3I)=0$. We compute
$$A^2=\left[\begin{array}{ccc}9&0&0\\2&1&0\\9&0&1\end{array}\right],$$so
$$(A^2-I)(A-3I)=\left[\begin{array}{ccc}8&0&0\\2&0&0\\9&0&0\end{array}\right]\cdot\left[\begin{array}{ccc}0&0&0\\1&-4&0\\2&1&-2\end{array}\right]=\left[\begin{array}{ccc}0&0&0\\0&0&0\\0&0&0\end{array}\right].$$Since the matrix of $(T^2-I)(T-3I)$ is the zero matrix, $(T^2-I)(T-3I)=0$.
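The matrix identity $(A^2-I)(A-3I)=0$ for the matrix $A$ of Exercise 3.2.3 involves only integer arithmetic, so it can be checked exactly (the helper `matmul` is ours):

```python
# Exact check that (A^2 - I)(A - 3I) = 0 for the matrix A of Exercise 3.2.3.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[3, 0, 0], [1, -1, 0], [2, 1, 1]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
A2 = matmul(A, A)
left = [[A2[i][j] - I[i][j] for j in range(3)] for i in range(3)]
right = [[A[i][j] - 3 * I[i][j] for j in range(3)] for i in range(3)]
print(matmul(left, right))  # → [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```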

Exercise 3.2.5

An (ordered) basis for $\mathbb C^{2\times2}$ is given by
$$A_{11}=\left[\begin{array}{cc}1&0\\0&0\end{array}\right],\quad A_{21}=\left[\begin{array}{cc}0&0\\1&0\end{array}\right]$$$$A_{12}=\left[\begin{array}{cc}0&1\\0&0\end{array}\right],\quad A_{22}=\left[\begin{array}{cc}0&0\\0&1\end{array}\right].$$If we identify $\mathbb C^{2\times 2}$ with $\mathbb C^4$ by
$$\left[\begin{array}{cc}a&b\\c&d\end{array}\right]\mapsto (a,c,b,d)$$(the coordinates with respect to the ordered basis $A_{11},A_{21},A_{12},A_{22}$ above) then since
$$A_{11}\mapsto A_{11}-4A_{21}$$$$A_{21}\mapsto -A_{11}+4A_{21}$$$$A_{12}\mapsto A_{12}-4A_{22}$$$$A_{22}\mapsto -A_{12}+4A_{22}$$the matrix of the transformation is given by
$$\left[\begin{array}{cccc}1&-4&0&0\\-1&4&0&0\\0&0&1&-4\\0&0&-1&4\end{array}\right]$$(writing the coordinates of the image of each basis vector as a row; since row rank equals column rank, this convention does not affect the rank). To find the rank of $T$ we row-reduce this matrix:
$$\left[\begin{array}{cccc}1&-4&0&0\\0&0&1&-4\\0&0&0&0\\0&0&0&0\end{array}\right].$$It has two nonzero rows, so the rank of $T$ is $2$.

Note that $T^2(A)=T(T(A))=T(BA)=B(BA)=B^2A$. Thus $T^2$ is given by multiplication by a matrix just as $T$ is, but multiplication by $B^2$ instead of $B$. Explicitly,
$$B^2=\left[\begin{array}{cc}1&-1\\-4&4\end{array}\right]^2=\left[\begin{array}{cc}5&-5\\-20&20\end{array}\right]=5B,$$so $T^2=5T$.
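The claim $B^2=5B$ (hence $T^2=5T$) is a two-line integer computation, checked here:

```python
# Check that B^2 = 5B for the matrix B of Exercise 3.2.5, so T^2 = 5T.
B = [[1, -1], [-4, 4]]
B2 = [[sum(B[i][k] * B[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(B2)  # → [[5, -5], [-20, 20]]
assert all(B2[i][j] == 5 * B[i][j] for i in range(2) for j in range(2))
```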

Exercise 3.2.6

Let $\{\alpha_1,\alpha_2,\alpha_3\}$ be a basis for $\Bbb R^3$. Then $T(\alpha_1),T(\alpha_2),T(\alpha_3)$ must be linearly dependent in $\Bbb R^2$, because $\Bbb R^2$ has dimension $2$. So suppose $$b_1T(\alpha_1)+b_2T(\alpha_2)+b_3T(\alpha_3)=0$$ and not all $b_1,b_2,b_3$ are zero. Then
$$b_1\alpha_1+b_2\alpha_2+b_3\alpha_3\not=0$$ and
$$UT(b_1\alpha_1+b_2\alpha_2+b_3\alpha_3)$$$$=U(T(b_1\alpha_1+b_2\alpha_2+b_3\alpha_3))$$$$=U(b_1T(\alpha_1)+b_2T(\alpha_2)+b_3T(\alpha_3))$$$$=U(0)=0.$$Thus (by the definition at the bottom of page 79) $UT$ is singular, and so by Theorem 9, page 81, $UT$ is not invertible.

The obvious generalization is that if $n>m$ and $T:\Bbb R^n\rightarrow\Bbb R^m$ and $U:\Bbb R^m\rightarrow\Bbb R^n$ are linear transformations, then $UT$ is not invertible. The proof is an immediate generalization of the proof of the special case above: simply replace $\alpha_1,\alpha_2,\alpha_3$ with $\alpha_1,\dots,\alpha_n$.
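In matrix terms, $UT$ is represented by the product of a $3\times2$ matrix with a $2\times3$ matrix, which has rank at most $2$ and hence vanishing determinant. A sketch with arbitrary sample matrices (these particular entries are our own choice, purely for illustration):

```python
# Sample matrices for T: R^3 -> R^2 and U: R^2 -> R^3.
# UT is a 3x3 product of a 3x2 and a 2x3 matrix, so its rank is at most 2
# and its determinant must be zero.
T = [[1, 2, 3],
     [4, 5, 6]]   # 2x3, represents T
U = [[1, 0],
     [0, 1],
     [1, 1]]      # 3x2, represents U
UT = [[sum(U[i][k] * T[k][j] for k in range(2)) for j in range(3)]
      for i in range(3)]

def det3(M):  # determinant of a 3x3 matrix by cofactor expansion
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

print(det3(UT))  # → 0, so UT is singular
```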

Exercise 3.2.7

Identify $\mathbb R^2$ with $\mathbb R^{2\times1}$ and let $T$ and $U$ be given by the matrices
$$A=\left[\begin{array}{cc}1&0\\0&0\end{array}\right],\quad B=\left[\begin{array}{cc}0&1\\0&0\end{array}\right].$$More precisely, for
$$X=\left[\begin{array}{c}x\\y\end{array}\right],$$let $T$ be given by $X\mapsto AX$ and let $U$ be given by $X\mapsto BX$. Then $TU$ is given by $X\mapsto ABX$ and $UT$ is given by $X\mapsto BAX$. But $BA=0$ while $AB\not=0$, so we have the desired example.
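That $AB\not=0$ while $BA=0$ is immediate to verify (the helper `mul` is ours):

```python
# Verify AB != 0 while BA = 0 for the matrices chosen above.
A = [[1, 0], [0, 0]]
B = [[0, 1], [0, 0]]

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(mul(A, B))  # → [[0, 1], [0, 0]], not the zero matrix
print(mul(B, A))  # → [[0, 0], [0, 0]]
```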

Exercise 3.2.8

If $T^2=0$ then the range of $T$ must be contained in the null space of $T$ since if $y$ is in the range of $T$ then $y=Tx$ for some $x$ so $Ty=T(Tx)=T^2x=0$. Thus $y$ is in the null space of $T$.

To give an example of an operator where $T^2=0$ but $T\not=0$, let $V=\mathbb R^{2\times1}$ and let $T$ be given by the matrix
$$A=\left[\begin{array}{cc}0&1\\0&0\end{array}\right].$$Specifically, for
$$X=\left[\begin{array}{c}x\\y\end{array}\right],$$let $T$ be given by $X\mapsto AX$. Since $A\not=0$, $T\not=0$. Now $T^2$ is given by $X\mapsto A^2X$, and $A^2=0$. Thus $T^2=0$.
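A one-line check that this $A$ squares to zero:

```python
# Verify that A != 0 while A^2 = 0.
A = [[0, 1], [0, 0]]
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(A2)  # → [[0, 0], [0, 0]], so T^2 = 0 even though T != 0
```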

Exercise 3.2.9

By the comments in the Appendix on functions, at the bottom of page 389, since $TU=I$ as functions, $T$ is necessarily onto and $U$ is necessarily one-to-one. It then follows immediately from Theorem 9, page 81, that $T$ is invertible. Now $TT^{-1}=I=TU$, and multiplying on the left by $T^{-1}$ gives $T^{-1}TT^{-1}=T^{-1}TU$, which implies $(I)T^{-1}=(I)U$ and thus $U=T^{-1}$.

Let $V$ be the space of polynomial functions in one variable over $\Bbb R$, which is infinite-dimensional. Let $T$ be the differentiation operator $D$ and let $U$ be the integration operator given by $(Uf)(x)=\int_0^x f(t)\,dt$ (compare Example 11, page 80). Then $TU=I$, since differentiating $\int_0^x f(t)\,dt$ recovers $f$; but $(UTf)(x)=f(x)-f(0)$, so $UT$ annihilates the constant polynomials and $UT\not=I$. Thus this example fulfills the requirement: $TU=I$ does not imply $U=T^{-1}$ when $V$ is infinite-dimensional.
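This counterexample can be modeled concretely by representing a polynomial as its list of coefficients $[c_0,c_1,c_2,\dots]$; the operator names `T` (differentiation) and `U` (integration from $0$) below follow that convention:

```python
from fractions import Fraction as F

# Polynomials as coefficient lists [c0, c1, c2, ...]. With T = d/dx and
# U = integration from 0, TU = I on all polynomials but UT drops the constant.
def T(p):  # differentiation
    return [F(k) * p[k] for k in range(1, len(p))] or [F(0)]

def U(p):  # integration with zero constant term
    return [F(0)] + [c / F(k + 1) for k, c in enumerate(p)]

p = [F(3), F(0), F(2)]            # 3 + 2x^2
assert T(U(p)) == p               # TU = I
assert U(T(p)) != p               # UT != I: the constant 3 is lost
print([int(c) for c in U(T(p))])  # → [0, 0, 2], i.e. 2x^2
```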

Exercise 3.2.10

Let $\mathcal B=\{\alpha_1,\dots,\alpha_n\}$ be a basis for $F^{n\times1}$ and let $\mathcal B'=\{\beta_1,\dots,\beta_m\}$ be a basis for $F^{m\times1}$. We can define a linear transformation from $F^{n\times1}$ to $F^{m\times1}$ uniquely by specifying where each member of $\mathcal B$ goes in $F^{m\times1}$. If $m<n$ then we can define a linear transformation that maps at least one member of $\mathcal B$ to each member of $\mathcal B'$ and maps at least two members of $\mathcal B$ to the same member of $\mathcal B'$. Any linear transformation so defined must necessarily be onto without being one-to-one. Similarly, if $m>n$ then we can map each member of $\mathcal B$ to a unique member of $\mathcal B'$ with at least one member of $\mathcal B'$ not mapped to by any member of $\mathcal B$. Any such transformation so defined will necessarily be one-to-one but not onto.

Exercise 3.2.11

Let $\{\alpha_1,\dots,\alpha_n\}$ be a basis for $V$. Then the rank of $T$ is the number of linearly independent vectors in the set $\{T\alpha_1,\dots,T\alpha_n\}$. Suppose the rank of $T$ equals $k$ and suppose WLOG (reordering the basis if necessary) that $\{T\alpha_1,\dots,T\alpha_k\}$ is a linearly independent set. Then $\{T\alpha_1,\dots,T\alpha_k\}$ is a basis for the range of $T$. Applying $T$, it follows that $\{T^2\alpha_1,\dots,T^2\alpha_k\}$ spans the range of $T^2$, and since the dimension of the range of $T^2$ is by hypothesis also equal to $k$, $\{T^2\alpha_1,\dots,T^2\alpha_k\}$ must be a basis for the range of $T^2$. Now suppose $v$ is in the range of $T$. Then $v=c_1T\alpha_1+\cdots+c_kT\alpha_k$. Suppose $v$ is also in the null space of $T$. Then $$0=T(v)=T(c_1T\alpha_1+\cdots+c_kT\alpha_k)=c_1T^2\alpha_1+\cdots+c_kT^2\alpha_k.$$ But $T^2\alpha_1,\dots,T^2\alpha_k$ are linearly independent, so it must be that $c_1=\cdots=c_k=0$, which implies $v=0$. Thus we have shown that if $v$ is in both the range of $T$ and the null space of $T$ then $v=0$, as required.
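The conclusion can be illustrated with a concrete operator. The sample matrix below is our own choice; it satisfies $A^2=A$, so $\operatorname{rank}(A)=\operatorname{rank}(A^2)=2$, and a brute-force scan over a grid of inputs confirms that the only vector in both the range and the null space is zero:

```python
from itertools import product

# Illustration of Exercise 3.2.11 with a sample matrix A (rank A = rank A^2,
# since A^2 = A): every vector in both range(A) and null(A) is zero.
A = [[1, 1, 0],
     [0, 0, 0],
     [0, 0, 1]]

def apply(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(3)) for i in range(3))

for w in product(range(-3, 4), repeat=3):
    v = apply(A, w)                 # v is in the range of A
    if apply(A, v) == (0, 0, 0):    # v is also in the null space of A
        assert v == (0, 0, 0)
print("range(A) and null(A) intersect only in 0 on the sampled grid")
```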

Exercise 3.2.12

We showed in Exercise 2.3.12, page 49, that the dimension of $V$ is $mn$ and the dimension of $W$ is $pn$. By Theorem 9 (iv), page 81, an invertible linear transformation must take a basis to a basis, so if there is an invertible linear transformation between $V$ and $W$ then the two spaces have the same dimension. Thus if $T$ is invertible then $pn=mn$, which implies $p=m$, so $B$ is a square $m\times m$ matrix. The matrix $B$ must then be invertible: if $BX=0$ had a non-trivial solution $X$ (as happens for every non-invertible square matrix, Theorem 13, page 23), then the matrix $A$ with first column $X$ and remaining columns zero would satisfy $T(A)=BA=0=T(0)$ with $A\not=0$, contradicting that $T$ is one-to-one (Theorem 9 (ii), page 81). Conversely, if $p=m$ and $B$ is invertible, then $T^{-1}(A)=B^{-1}A$ defines the inverse transformation, and it follows that $T$ is invertible.

From http://greggrant.org


This website is supposed to help you study Linear Algebra. Please only read these solutions after thinking about the problems carefully. Do not just copy these solutions.

This Post Has One Comment

  1. Question 11 can be proved in this way. It is easy to show that $\operatorname{null}(T)$ is a subspace of $\operatorname{null}(T^2)$ (Theorem 8.2 of Axler). If $\operatorname{rank}(T)=\operatorname{rank}(T^2)$, then by the rank–nullity theorem, $\operatorname{null}(T)=\operatorname{null}(T^2)$.
    Now, suppose that $v$ is in $\operatorname{range}(T)$ and $\operatorname{null}(T)$. Then there exists some $u$ in $V$ such that $Tu=v$. Hence $T(Tu)=Tv=0$, which implies that $u$ is in $\operatorname{null}(T^2)$. Therefore $u$ is in $\operatorname{null}(T)$, which implies that $Tu=v=0$.
