1. Solution: If $T$ is invertible, then there exists $S\in\ca L(V)$ such that $TS=ST=I$. Then it follows from 10.4 that\[\ca M(S,(v_1,\cdots,v_n))\ca M(T,(v_1,\cdots,v_n))=\ca M(ST,(v_1,\cdots,v_n))=I\]\[\ca M(T,(v_1,\cdots,v_n))\ca M(S,(v_1,\cdots,v_n))=\ca M(TS,(v_1,\cdots,v_n))=I.\]Hence $\ca M(T,(v_1,\cdots,v_n))$ is invertible.

If $\ca M(T,(v_1,\cdots,v_n))$ is invertible, then there exists $B\in \mb F^{n,n}$ such that \[B\ca M(T,(v_1,\cdots,v_n))=\ca M(T,(v_1,\cdots,v_n))B=I.\]Note that by 3.60 (taking $W=V$ and $w_i=v_i$ for all $i$), the map $\ca M$ is an isomorphism between $\ca L(V)$ and $\mb F^{n,n}$, so we can choose $S\in\ca L(V)$ such that $B=\ca M(S,(v_1,\cdots,v_n))$. Again by 10.4, we have\[\ca M(ST,(v_1,\cdots,v_n))=B\ca M(T,(v_1,\cdots,v_n))=I,\]\[\ca M(TS,(v_1,\cdots,v_n))=\ca M(T,(v_1,\cdots,v_n))B=I.\]Therefore $ST=I$ and $TS=I$, hence $T$ is invertible.

See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 1.

2. Solution: Let $V$ be $\mb F^{n}$, where $n$ is the number of rows of $A$. Fix a basis $v_1,\cdots,v_n$ of $V$. By 3.60, we can choose $T,S\in\ca L(V)$ such that\[\ca M(T,(v_1,\cdots,v_n))=A\quad\text{and}\quad\ca M(S,(v_1,\cdots,v_n))=B.\]Since $AB=I$, it follows from 10.4 that\[\ca M(TS,(v_1,\cdots,v_n))=\ca M(T,(v_1,\cdots,v_n))\ca M(S,(v_1,\cdots,v_n))=AB=I=\ca M(I,(v_1,\cdots,v_n)),\]hence $TS=I$ since $\ca M$ is injective. By Problem 10 of Exercise 3D, we have $ST=I$. Again by 10.4, we have\[BA=\ca M(S,(v_1,\cdots,v_n))\ca M(T,(v_1,\cdots,v_n))=\ca M(ST,(v_1,\cdots,v_n))=I.\]See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 2.
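
Numerically this is easy to spot-check (a minimal NumPy sketch; the matrices here are arbitrary examples, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary square matrix; a random Gaussian matrix is
# invertible with probability 1.
A = rng.standard_normal((4, 4))

# Compute the right inverse B from A B = I ...
B = np.linalg.solve(A, np.eye(4))

# ... and verify it is also a left inverse: B A = I.
print(np.allclose(A @ B, np.eye(4)))  # True
print(np.allclose(B @ A, np.eye(4)))  # True
```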

3. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 3.

4. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 4.

5. Solution: Let $V=\C^n$, where $n$ is the number of rows of $B$. Choose a basis $v_1,\cdots,v_n$ of $V$ and let $T\in\ca L(V)$ be such that\[\ca M(T,(v_1,\cdots,v_n))=B.\]By 8.29, there exists another basis $w_1,\cdots,w_n$ of $V$ such that $\ca M(T,(w_1,\cdots,w_n))$ is an upper-triangular matrix.

Let $A=\ca M(I,(w_1,\cdots,w_n),(v_1,\cdots,v_n))$, then $A$ is invertible. It follows from 10.7 that\[\ca M(T,(w_1,\cdots,w_n))=A^{-1}\ca M(T,(v_1,\cdots,v_n))A=A^{-1}BA.\]Since $\ca M(T,(w_1,\cdots,w_n))$ is an upper-triangular matrix, so is $A^{-1}BA$.
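
Numerically, such a similarity can be computed with the complex Schur decomposition, whose unitary factor plays the role of $A$; a minimal SciPy sketch with an arbitrary matrix `B`:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))  # an arbitrary square matrix

# Complex Schur decomposition: B = Z T Z^* with Z unitary and
# T upper triangular, i.e., Z^{-1} B Z = T.
T, Z = schur(B, output='complex')

print(np.allclose(Z.conj().T @ B @ Z, T))  # True: A = Z works
print(np.allclose(np.tril(T, -1), 0))      # True: T is upper triangular
```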

See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 5.

6. Solution: Let $V=\R^2$ and let $e_1=(1,0)$, $e_2=(0,1)$ be the standard basis of $V$. Let $T$ be the unique operator in $\ca L(V)$ such that $Te_1=e_2$ and $Te_2=-e_1$. Then \[T^2e_1=Te_2=-e_1\quad \text{and}\quad T^2e_2=-Te_1=-e_2.\]Hence $\ca M(T^2,(e_1,e_2))=-I$. In particular, $\m{trace}~T^2=-2 < 0$.
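
With respect to $e_1,e_2$ this operator is rotation by $90^\circ$; a quick NumPy check of the computation above:

```python
import numpy as np

# Matrix of T with respect to e1, e2: the columns are Te1 = e2
# and Te2 = -e1, i.e., rotation by 90 degrees.
M = np.array([[0, -1],
              [1,  0]])

print(M @ M)            # [[-1  0], [ 0 -1]], i.e., M(T^2) = -I
print(np.trace(M @ M))  # -2
```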

See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 6.

7. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 7.

8. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 8.

9. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 9.

10. Solution: Choose an orthonormal basis $e_1,\cdots,e_n$ of $V$. By 7.10, we have

\[\ca M(T^*,(e_1,\cdots,e_n))=\overline{\ca M(T,(e_1,\cdots,e_n))}^T.\]By the definition of the trace of a matrix, we have $\m{trace}~A^T=\m{trace}~A$ for any square matrix $A$. Therefore, by 10.16, we have\begin{align*}\m{trace}~T^*=&\m{trace}(\ca M(T^*,(e_1,\cdots,e_n)))\\=&\m{trace}(\overline{\ca M(T,(e_1,\cdots,e_n))}^T)\\=&\m{trace}(\overline{\ca M(T,(e_1,\cdots,e_n))})\\ =&\overline{\m{trace}(\ca M(T,(e_1,\cdots,e_n)))}\\ =&\overline{\m{trace}~T}.\end{align*}

Here we use the fact that $\m{trace}~\overline A=\overline{\m{trace}~A}$ for any square matrix $A$. Why? Prove it. (Hint: the diagonal entries of $\overline A$ are the conjugates of those of $A$, and the conjugate of a sum is the sum of the conjugates.)
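
A quick NumPy spot check of the matrix facts used above, with an arbitrary complex matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Transposing fixes the diagonal, so trace(A^T) = trace(A).
print(np.isclose(np.trace(A.T), np.trace(A)))                  # True

# Conjugation commutes with sums, so trace(conj(A)) = conj(trace(A)).
print(np.isclose(np.trace(A.conj()), np.conj(np.trace(A))))    # True

# Combined: trace of the conjugate transpose = conjugate of the trace.
print(np.isclose(np.trace(A.conj().T), np.conj(np.trace(A))))  # True
```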

11. Solution: By 7.35, $T$ is self-adjoint and all the eigenvalues of $T$ are nonnegative. By the Spectral Theorem, $T$ has a diagonal matrix with respect to some orthonormal basis of $V$; fix such a basis. Then by 10.16, $\m{trace}(\ca M(T))=\m{trace}~T=0$.

Since $\ca M(T)$ is a diagonal matrix whose diagonal entries are the eigenvalues of $T$, and all the eigenvalues of $T$ are nonnegative, it follows from $\m{trace}(\ca M(T))=0$ that all the eigenvalues of $T$ are zero. Hence $\ca M(T)=0$, which implies $T=0$.
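
The mechanism is easy to see numerically: a positive matrix has nonnegative eigenvalues that sum to its trace, so a zero trace forces every eigenvalue to vanish. A minimal sketch with an arbitrary positive semidefinite matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
C = rng.standard_normal((3, 3))
A = C.T @ C  # positive semidefinite by construction

eigs = np.linalg.eigvalsh(A)  # real eigenvalues of a symmetric matrix
print(np.all(eigs >= -1e-12))               # True: all eigenvalues >= 0
print(np.isclose(np.trace(A), eigs.sum()))  # True: trace = sum of eigenvalues
# Hence trace(A) = 0 would force every eigenvalue, and so A itself, to be 0.
```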

12. Suppose $V$ is an inner product space and $P,Q\in\ca L(V)$ are orthogonal projections. Prove that $\m{trace}(PQ)\geq 0$.

13. Solution: Since $\m{trace}(T)$ is both the sum of the eigenvalues of $T$ and the sum of the diagonal entries of $\ca M(T)$, the third eigenvalue of $T$ is\[51+(-40)+1-(-48)-24=36.\]See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 12.
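
The arithmetic as a two-line check (the diagonal entries $51,-40,1$ and the eigenvalues $-48,24$ are given in the problem):

```python
# trace = sum of the diagonal entries = sum of the eigenvalues
trace = 51 + (-40) + 1        # = 12
third = trace - (-48) - 24    # the remaining eigenvalue
print(third)                  # 36
```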

14. Solution: Choose a basis of $V$. We have $\m{trace}(cT)=\m{trace}(\ca M(cT))$ and $\m{trace}(T)=\m{trace}(\ca M(T))$ by 10.16. Note that by 3.38, we have $\ca M(cT)=c\ca M(T)$. Therefore\[\m{trace}(\ca M(cT))=\m{trace}(c\ca M(T))=c\,\m{trace}(\ca M(T)).\]Hence\[\m{trace}(cT)=\m{trace}(\ca M(cT))=c\,\m{trace}(\ca M(T))=c\,\m{trace}(T).\]

15. Solution: Choose a basis of $V$. By 10.4, 10.14 and 10.16, we have\begin{align*}&\m{trace}(ST)\\ \text{by 10.16}\quad=&\m{trace}(\ca M(ST))\\ \text{by 10.4}\quad=&\m{trace}(\ca M(S)\ca M(T))\\ \text{by 10.14}\quad=&\m{trace}(\ca M(T)\ca M(S))\\ \text{by 10.4}\quad=&\m{trace}(\ca M(TS))\\ \text{by 10.16}\quad=&\m{trace}(TS).\end{align*}
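
A NumPy spot check with arbitrary matrices, which generically do not commute even though their products share a trace:

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4))

print(np.allclose(S @ T, T @ S))                     # False: S, T don't commute
print(np.isclose(np.trace(S @ T), np.trace(T @ S)))  # True: traces agree anyway
```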

16. Solution: Let $V=\mb F^2$ and let $S=T=I$. Then $\m{trace}(ST)=\m{trace}(I)=2$ and\[\m{trace}(S)=\m{trace}(T)=2.\]However,\[\m{trace}(ST)=2\ne 4=\m{trace}(S)\m{trace}(T).\]See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 14.

17. Solution: Let $n=\dim V$ and suppose, to the contrary, that $T\ne 0$. Then $\m{Ker}(T)\ne V$. Let $v_{m+1},\cdots,v_n$ be a basis of $\m{Ker}(T)$; then $m\geq 1$. Extend it to a basis $v_1,\cdots,v_n$ of $V$. By the proof of 3.22, $Tv_1,\cdots,Tv_m$ are linearly independent in $V$, hence we can extend them to a basis $Tv_1,\cdots,Tv_m,w_{m+1},\cdots,w_n$ of $V$. Define $S\in \ca L(V)$ on this basis by $S(Tv_1)=v_1$ (possible since $m\geq 1$), $S(Tv_i)=0$ for $i=2,\cdots,m$, and $Sw_j=0$ for $j=m+1,\cdots,n$. Then \[STv_1=v_1,\quad STv_i=0\ (i=2,\cdots,m),\quad STv_j=S0=0\ (j=m+1,\cdots,n).\]Thus the eigenvalues of $ST$ are $1$, with multiplicity one, and $0$, with multiplicity $n-1$. Since the trace is the sum of the eigenvalues counted with multiplicity, $\m{trace}(ST)=1\ne 0$, a contradiction. Therefore $T=0$.
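
This can also be seen concretely in matrix form (cf. Chuang's comment below): if $T$ has any nonzero entry, one choice of $S$ already gives a nonzero trace. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
T = rng.standard_normal((4, 4))  # any matrix with a nonzero entry

# Pick a nonzero entry T[j0, k0] and let S be zero except for a 1
# in position (k0, j0).
j0, k0 = np.unravel_index(np.argmax(np.abs(T)), T.shape)
S = np.zeros_like(T)
S[k0, j0] = 1.0

# Then trace(ST) = T[j0, k0] != 0, so requiring trace(ST) = 0 for
# every S forces T = 0.
print(np.isclose(np.trace(S @ T), T[j0, k0]))  # True
```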

See also Linear Algebra Done Right Solution Manual Chapter 10 Problem 15.

18. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 16.

19. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 18.

20. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 17.

21. Solution: See Linear Algebra Done Right Solution Manual Chapter 10 Problem 19.

## ming

6 May 2022: For q1, can we use q32 of 3F?

## Chuang

5 Nov 2020: Maybe I have a more compact answer to Q17. $$\mathsf{Trace}(ST) =\sum_{k} (ST)_{kk} = \sum_k \sum_j S_{kj}T_{jk}.$$ Suppose at least one entry of the matrix $T$ is nonzero, say $T_{j_0,k_0}$. Then we can set every entry of $S$ to 0 except $S_{k_0,j_0}$; obviously $$\mathsf{Trace}(ST) = S_{k_0,j_0}T_{j_0,k_0} \ne 0,$$ which contradicts the hypothesis. So $T$ must equal $0$.

## Linearity

5 Nov 2020: This is an easy exercise; the point is to use the methods from the book.

I think the book is trying to avoid using the matrix entries directly.

## Yuheng

31 Aug 2020: I think maybe 12 is straightforward? (Or please point out my mistakes. I'll use parentheses to denote inner products.)

Suppose Q is the orthogonal projection onto U. Choose an orthonormal basis e1,...,en of U and an orthonormal basis f1,...,fm of the orthogonal complement of U; then e1,...,en,f1,...,fm is an orthonormal basis of V, with Qei=ei for all i from 1 to n and Qfi=0 for all i from 1 to m.

Then trace(PQ) = trace(M(PQ,(e1,...,en,f1,...,fm))) = (PQe1,e1)+...+(PQen,en)+(PQf1,f1)+...+(PQfm,fm) = (Pe1,e1)+...+(Pen,en).

For each i from 1 to n, (Pei,ei) >= 0, which can be verified by writing ei as the sum of an element of the subspace onto which P projects and an element orthogonal to it. So trace(PQ) >= 0.

## Larry Baker

23 Jun 2022: Nice matrix proof. Looks good to me!

## Matt Wang

21 Jul 2020: In Problem 5 it should be A = M(I, (v1,...,vn),(w1,...,wn)) instead of M(T,(w1,...,wn),(v1,...,vn)).

## Linearity

22 Jul 2020: You meant $A=M(I,(w_1,\cdots,w_n),(v_1,\cdots,v_n))$?

## Louis Victor

3 Jan 2020: Where can I find the solution to problem 12? Thanks!

## LLL

18 Apr 2020: Sorry, I'm Chinese and my English is poor. We can write $PQ$ as $PIQ$ and then use the change-of-basis formula. With respect to suitable orthonormal bases for $P$ and $Q$, the diagonal of each matrix consists of 1s and 0s, so we can write trace(PQ) = trace(M(PIQ)) = trace(M(P)M(I)M(Q)).

## Larry Baker

23 Jun 2022: To me, there's a gap in your argument. M(P) and M(Q) will be, as you say, diagonal matrices with ones and zeros along their diagonals. However, M(I) is with respect to two different bases, so it won't necessarily be the identity matrix, and it's not clear to me that it won't have negative entries.

## Zen

16 Jul 2020: Here is one idea: prove that the eigenvalues of $PQ$ (or of $(PQ)_C$) are real and nonnegative.

Let $\lambda$ be an eigenvalue of $PQ$ with eigenvector $v$, so $(PQ)v = \lambda v$ (we first consider $P,Q$ on a complex vector space). Since $PQv$ lies in the range of $P$ while $Qv-PQv$ is orthogonal to it,

$$\begin{align}
\langle PQv,Qv \rangle &= \langle PQv,PQv+Qv-PQv\rangle\\
&=\langle PQv,PQv \rangle + \langle PQv,Qv-PQv\rangle\\
&=\langle PQv,PQv \rangle \geq 0.
\end{align}$$

On the other hand, since $Qv$ lies in the range of $Q$ while $v-Qv$ is orthogonal to it,

$$\begin{align}
\langle PQv,Qv \rangle &= \langle \lambda v,Qv \rangle\\
&= \lambda \langle Qv+v-Qv,Qv \rangle\\
&= \lambda ( \langle Qv,Qv \rangle + \langle v-Qv,Qv \rangle)\\
&= \lambda \langle Qv,Qv \rangle.
\end{align}$$

If $Qv=0$, then $\lambda v=(PQ)v=0$ and hence $\lambda=0$; otherwise $\langle Qv,Qv \rangle>0$ and $\lambda = \langle PQv,PQv \rangle/\langle Qv,Qv \rangle$ is real and nonnegative. Therefore $trace(PQ)= \sum_{i=1}^n \lambda_i \geq 0$ (10.9).

If $P,Q \in \mathcal{L}(\R^n)$, then

$$\begin{align}
(PQ)_C(u+vi)&=(PQ)u+i(PQ)v\\
&= P(Qu)+iP(Qv)\\
&=P_C(Qu + iQv)\\
&=P_C(Q_C(u+vi))\\
&=(P_C Q_C)(u+vi).
\end{align}$$

Thus $(PQ)_C = P_C Q_C$.

Also,

$$\begin{align}
(u+vi)- P_C(u+vi) &= (u+vi)-(Pu + iPv)\\
&= (u-Pu)+i(v-Pv).
\end{align}$$

Suppose $W$ is the subspace of $V$ onto which $P$ projects. Then $Pu,Pv \in W$ and $(u-Pu),(v-Pv) \in W^\bot$, thus

$$\begin{align}
&\langle P_C(u+vi),(u+vi)- P_C(u+vi)\rangle = \langle Pu+iPv,(u-Pu)+i(v-Pv)\rangle\\
&= \langle Pu, (u-Pu)\rangle + \langle Pu,i(v-Pv)\rangle + \langle iPv,(u-Pu)\rangle+\langle iPv,i(v-Pv) \rangle\\
&= 0.
\end{align}$$

The same holds for $Q_C$. In short, the properties of $P$ and $Q$ on the real space are preserved under complexification, so the reasoning on a complex vector space applies to the complexifications of real operators.