If you find any mistakes, please make a comment! Thank you.

Solution to Linear Algebra Hoffman & Kunze Chapter 6.5


Exercise 6.5.1

Find an invertible real matrix $P$ such that $P^{-1}AP$ and $P^{-1}BP$ are both diagonal, where $A$ and $B$ are the real matrices
$$\text{(a)}\quad A=\left[\begin{array}{cc} 1&2\\0&2\end{array}\right],\quad B=\left[\begin{array}{cc} 3&-8\\0&-1\end{array}\right]$$$$\text{(b)} \quad A=\left[\begin{array}{cc} 1&1\\1&1\end{array}\right],\quad B=\left[\begin{array}{cc} 1&a\\a&1\end{array}\right].$$

Solution: The proof of Theorem 8 shows that if a $2\times2$ matrix $A$ has two distinct characteristic values, then the $P$ that diagonalizes $A$ necessarily also diagonalizes any $B$ that commutes with $A$ (each eigenspace of $A$ is one-dimensional and is invariant under $B$).

(a) The characteristic polynomial of $A$ is $(x-1)(x-2)$, so the characteristic values are $c_1=1$ and $c_2=2$. Characteristic vectors satisfy $(c_iI-A)v=0$:
$$c_1:\quad\left[\begin{array}{cc}0&-2\\0&-1\end{array}\right]\left[\begin{array}{c}1\\0\end{array}\right]=\left[\begin{array}{c}0\\0\end{array}\right]$$$$c_2:\quad\left[\begin{array}{cc}1&-2\\0&0\end{array}\right]\left[\begin{array}{c}2\\1\end{array}\right]=\left[\begin{array}{c}0\\0\end{array}\right]$$So $P=\left[\begin{array}{cc}1&2\\0&1\end{array}\right]$ and $P^{-1}=\left[\begin{array}{cc}1&-2\\0&1\end{array}\right]$.
$$P^{-1}AP=\left[\begin{array}{cc}1&0\\0&2\end{array}\right],\quad P^{-1}BP=\left[\begin{array}{cc}3&0\\0&-1\end{array}\right]$$

(b) The characteristic polynomial of $A$ is $x(x-2)$, so $c_1=0$ and $c_2=2$.
$$c_1:\quad\left[\begin{array}{cc}-1&-1\\-1&-1\end{array}\right]\left[\begin{array}{c}-1\\1\end{array}\right]=\left[\begin{array}{c}0\\0\end{array}\right]$$$$c_2:\quad\left[\begin{array}{cc}1&-1\\-1&1\end{array}\right]\left[\begin{array}{c}1\\1\end{array}\right]=\left[\begin{array}{c}0\\0\end{array}\right]$$So $P=\left[\begin{array}{cc}-1&1\\1&1\end{array}\right]$ and $P^{-1}=\left[\begin{array}{cc}-1/2&1/2\\1/2&1/2\end{array}\right]$.
$$P^{-1}AP=\left[\begin{array}{cc}0&0\\0&2\end{array}\right],\quad P^{-1}BP=\left[\begin{array}{cc}1-a&0\\0&1+a\end{array}\right].$$
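As a quick sanity check (my own addition, not part of the original solution, and assuming SymPy is available), the snippet below confirms that $A$ and $B$ commute in both parts and that $P^{-1}AP$ and $P^{-1}BP$ come out diagonal.

```python
from sympy import Matrix, symbols

a = symbols('a')

# part (a)
A = Matrix([[1, 2], [0, 2]]); B = Matrix([[3, -8], [0, -1]])
P = Matrix([[1, 2], [0, 1]])
assert A * B == B * A                 # A and B commute
print(P.inv() * A * P)                # diag(1, 2)
print(P.inv() * B * P)                # diag(3, -1)

# part (b)
A = Matrix([[1, 1], [1, 1]]); B = Matrix([[1, a], [a, 1]])
P = Matrix([[-1, 1], [1, 1]])
assert A * B == B * A
print((P.inv() * A * P).expand())     # diag(0, 2)
print((P.inv() * B * P).expand())     # diag(1-a, 1+a)
```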


Exercise 6.5.2

Let $\mathcal F$ be a commuting family of $3\times3$ complex matrices. How many linearly independent matrices can $\mathcal F$ contain? What about the $n\times n$ case?

Solution: This turns out to be quite a hard question, so I’m not sure what Hoffman & Kunze had in mind. But a general theorem of I. Schur from 1905 says that a commuting family of $n\times n$ complex matrices contains at most $\left\lfloor\frac{n^2}{4}\right\rfloor+1$ linearly independent matrices, and this bound is attained. For $n=3$ this gives $\lfloor 9/4\rfloor+1=3$. A short proof was published by M. Mirzakhani in the American Mathematical Monthly in 1998.
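To see that the bound is attained, here is one standard construction (my own addition, not from the text, assuming SymPy is available): take the identity together with the matrix units $E_{ij}$ with $i\le\lfloor n/2\rfloor<j$. Any two of these commute, and there are $\lfloor n^2/4\rfloor+1$ of them.

```python
from sympy import Matrix, eye, zeros

def schur_family(n):
    """Identity plus the matrix units E_{ij} with i <= n//2 < j:
    a commuting family of n*n//4 + 1 linearly independent n x n matrices."""
    k = n // 2
    family = [eye(n)]
    for i in range(k):
        for j in range(k, n):
            E = zeros(n, n); E[i, j] = 1
            family.append(E)
    return family

for n in (3, 4, 5):
    fam = schur_family(n)
    assert all(X * Y == Y * X for X in fam for Y in fam)    # pairwise commuting
    flattened = Matrix([list(M) for M in fam])              # one row per matrix
    assert flattened.rank() == len(fam) == n * n // 4 + 1   # linearly independent
    print(n, len(fam))
```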


Exercise 6.5.3

Let $T$ be a linear operator on an $n$-dimensional space, and suppose that $T$ has $n$ distinct characteristic values. Prove that any linear operator which commutes with $T$ is a polynomial in $T$.

Solution: Since $T$ has $n$ distinct characteristic values, $T$ is diagonalizable (Exercise 6.2.7, page 190). Choose a basis $\mathcal B$ in which $T$ is represented by a diagonal matrix $A$; the diagonal entries $a_{11},\dots,a_{nn}$ are the $n$ distinct characteristic values. Suppose the linear operator $S$ commutes with $T$, and let $B$ be the matrix of $S$ in the basis $\mathcal B$. The $ij$-th entry of $AB$ is $a_{ii}b_{ij}$ and the $ij$-th entry of $BA$ is $a_{jj}b_{ij}$. Since $AB=BA$, we have $a_{ii}b_{ij}=a_{jj}b_{ij}$ for all $i,j$; for $i\not=j$ we have $a_{ii}\not=a_{jj}$, so $b_{ij}=0$. Thus $B$ is also diagonal. It remains to find a polynomial $f$ with $f(a_{ii})=b_{ii}$ for all $i=1,\dots,n$; Lagrange interpolation (Section 4.3) provides one. Then $f(A)$ is diagonal with diagonal entries $f(a_{11}),\dots,f(a_{nn})$, so $f(A)=B$ and hence $f(T)=S$.
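A concrete illustration (my own example, assuming SymPy is available): take $T$ with distinct characteristic values $1,2,3$ and an $S$ that is diagonal in the same basis, hence commutes with $T$; interpolating through the pairs $(a_{ii},b_{ii})$ recovers $S$ as a polynomial in $T$.

```python
from sympy import Matrix, Poly, diag, eye, zeros, interpolate, symbols

x = symbols('x')

# T has distinct characteristic values 1, 2, 3; S is diagonal in the same basis,
# so S commutes with T
Q = Matrix([[1, 1, 0], [0, 1, 1], [1, 0, 1]])   # change-of-basis matrix
T = Q * diag(1, 2, 3) * Q.inv()
S = Q * diag(5, 7, 11) * Q.inv()
assert S * T == T * S

# Lagrange interpolation: f(1) = 5, f(2) = 7, f(3) = 11
f = interpolate([(1, 5), (2, 7), (3, 11)], x)

# evaluate f at the matrix T by Horner's rule
fT = zeros(3, 3)
for c in Poly(f, x).all_coeffs():
    fT = fT * T + c * eye(3)
print(fT == S)   # True: S = f(T)
```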


Exercise 6.5.4

Let $A$, $B$, $C$, and $D$ be $n\times n$ complex matrices which commute. Let $E$ be the $2n\times2n$ matrix
$$E=\left[\begin{array}{cc} A & B\\ C & D\end{array}\right].$$Prove that $\det E=\det(AD-BC)$.

Solution: If $A$ is invertible, then\[\left[\begin{array}{cc} A & B\\ C & D\end{array}\right]\cdot \left[\begin{array}{cc} I_n & -A^{-1}B\\ 0 & I_n\end{array}\right]=\left[\begin{array}{cc} A & 0\\ C & D-CA^{-1}B\end{array}\right].\]Hence $$\det E=\det A\det (D-CA^{-1}B)=\det(AD-ACA^{-1}B).$$Since $A$ and $C$ commute, $ACA^{-1}B=CAA^{-1}B=CB$, and since $B$ and $C$ commute, $CB=BC$. Thus $$\det E=\det(AD-ACA^{-1}B)=\det(AD-BC).$$If $A$ is not invertible, set $A_t=A+tI_n$ for a complex number $t$, and let$$E_t=\left[\begin{array}{cc} A_t & B\\ C & D\end{array}\right].$$Since $\det A_t$ is a nonzero polynomial in $t$, it has only finitely many roots, so $A_t$ is invertible for all but finitely many $t$. Moreover, $A_t$ still commutes with $B$, $C$, and $D$. Hence, by the case already proved, $\det E_t=\det(A_tD-BC)$ for all but finitely many $t$. Both sides are polynomials in $t$ that agree at infinitely many values, so they are equal as polynomials; setting $t=0$ gives $\det E=\det(AD-BC)$.
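As a numerical sanity check (my own, assuming SymPy is available): one easy way to produce four commuting matrices is to take polynomials in a single matrix $M$, and the identity $\det E=\det(AD-BC)$ can then be verified directly.

```python
from sympy import Matrix, eye

# polynomials in a single matrix M always commute with one another
M = Matrix([[1, 2, 0], [0, 1, 3], [4, 0, 1]])
I = eye(3)
A = M**2 + 2*M + I
B = 3*M - I
C = M**3
D = M + 5*I

E = Matrix.vstack(Matrix.hstack(A, B), Matrix.hstack(C, D))  # the 2n x 2n block matrix
print(E.det() == (A*D - B*C).det())   # True
```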


Exercise 6.5.5

Let $F$ be a field, $n$ a positive integer, and let $V$ be the space of $n\times n$ matrices over $F$. If $A$ is a fixed $n\times n$ matrix over $F$, let $T_A$ be the linear operator on $V$ defined by $T_A(B)=AB-BA$. Consider the family of linear operators $T_A$ obtained by letting $A$ vary over all diagonal matrices. Prove that the operators in that family are simultaneously diagonalizable.

Solution: If we stack the columns of an $n\times n$ matrix on top of each other, with column one at the top, then left multiplication $B\mapsto AB$ is represented in the standard basis by the block matrix
$$\left[\begin{array}{cccc}
A & & & \\
& A & &\\
& & \ddots & \\
& & & A
\end{array}\right],$$while right multiplication $B\mapsto BA$ is represented by the block matrix whose $(j,k)$ block is $a_{kj}I_n$. When $A$ is diagonal the latter is block-diagonal with blocks $a_{11}I_n,\dots,a_{nn}I_n$, so $T_A$ is represented by
$$\left[\begin{array}{cccc}
A-a_{11}I_n & & & \\
& A-a_{22}I_n & &\\
& & \ddots & \\
& & & A-a_{nn}I_n
\end{array}\right],$$which is diagonal. Equivalently, $T_A(E_{ij})=AE_{ij}-E_{ij}A=(a_{ii}-a_{jj})E_{ij}$ for the standard matrix units $E_{ij}$. Thus if $A$ is diagonal then $T_A$ is diagonalizable.

Now $T_AT_B(C)=ABC-ACB-BCA+CBA$ and $T_BT_A(C)=BAC-BCA-ACB+CAB$, so it suffices to show that $BAC+CAB=ABC+CBA$. The $i,j$-th entry of $BAC+CAB$ is $c_{ij}(a_{ii}b_{ii}+a_{jj}b_{jj})$, and this is exactly the $i,j$-th entry of $ABC+CBA$. Thus any two operators $T_A$ and $T_B$ in the family commute, and since each is diagonalizable, Theorem 8 shows the family can be simultaneously diagonalized.
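A small symbolic check of both steps (my own addition, assuming SymPy is available): for diagonal $A$ the matrix units $E_{ij}$ are characteristic vectors of $T_A$ with characteristic values $a_{ii}-a_{jj}$, and $T_A$, $T_B$ commute on a generic matrix $C$.

```python
from sympy import Matrix, diag, zeros, symbols

n = 3
a = symbols('a1:4')          # diagonal entries of A
b = symbols('b1:4')          # diagonal entries of B
A, B = diag(*a), diag(*b)
C = Matrix(n, n, lambda i, j: symbols(f'c{i}{j}'))   # a generic n x n matrix

T = lambda X, Y: X * Y - Y * X    # T_X(Y) = XY - YX

# each matrix unit E_ij is a characteristic vector of T_A with value a_i - a_j
for i in range(n):
    for j in range(n):
        E = zeros(n, n); E[i, j] = 1
        assert T(A, E) == (a[i] - a[j]) * E

# T_A and T_B commute
assert (T(A, T(B, C)) - T(B, T(A, C))).expand() == zeros(n, n)
print("checks passed")
```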
