If you find any mistakes, please make a comment! Thank you.

Solution to Linear Algebra Hoffman & Kunze Chapter 6.6


Exercise 6.6.1

Let $V$ be a finite-dimensional vector space and let $W_1$ be any subspace of $V$. Prove that there is a subspace $W_2$ of $V$ such that $V=W_1\oplus W_2$.

Solution: Since $V$ is finite-dimensional, so is $W_1$. Let $w_1,\dots,w_n$ be a basis of $W_1$. Then by Theorem 5 of page 45, we can extend it to a basis $w_1,\dots,w_n$, $v_1,\dots,v_m$ of $V$.

Let $W_2=\mathrm{span}(v_1,\dots,v_m)$. Then it is clear that $V=W_1\oplus W_2$.
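In more detail: every $v\in V$ is a linear combination of $w_1,\dots,w_n,v_1,\dots,v_m$, so $V=W_1+W_2$. If $v\in W_1\cap W_2$, then\[v=\sum_{i=1}^n a_iw_i=\sum_{j=1}^m b_jv_j\]for some scalars, so $\sum_i a_iw_i-\sum_j b_jv_j=0$; by the linear independence of $w_1,\dots,w_n,v_1,\dots,v_m$, all the $a_i$ and $b_j$ are zero, hence $v=0$. Therefore $W_1\cap W_2=\{0\}$ and $V=W_1\oplus W_2$.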


Exercise 6.6.2

Let $V$ be a finite-dimensional vector space and let $W_1,\dots,W_k$ be subspaces of $V$ such that
$$V=W_1+\cdots+W_k\quad\text{and}\quad\dim(V)=\dim(W_1)+\cdots+\dim(W_k).$$Prove that $V=W_1\oplus\cdots\oplus W_k.$

Solution: See Exercise 2C.16 of Linear Algebra Done Right.
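Alternatively, here is a brief sketch: choose a basis of each $W_i$ and concatenate them into a list $B$. Since $V=W_1+\cdots+W_k$, the list $B$ spans $V$, and it has $\dim(W_1)+\cdots+\dim(W_k)=\dim(V)$ entries, so $B$ is a basis of $V$. Now if\[w_1+\cdots+w_k=0,\qquad w_i\in W_i,\]then expanding each $w_i$ in the chosen basis of $W_i$ expresses $0$ as a linear combination of $B$, so all coefficients vanish and each $w_i=0$. Hence the sum is direct and $V=W_1\oplus\cdots\oplus W_k$.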


Exercise 6.6.3

Find a projection $E$ which projects $\mathbb R^2$ onto the subspace spanned by $(1,-1)$ along the subspace spanned by $(1,2)$.

Solution: Consider the linear map $E:\mathbb R^2\to \mathbb R^2$ such that\[E(1,-1)=(1,-1),\quad E(1,2)=(0,0).\]In terms of the standard basis, we have\[E(1,0)=\frac{1}{3}(2,-2),\quad E(0,1)=\frac{1}{3}(-1,1).\]
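Equivalently, relative to the standard basis of $\mathbb R^2$, the matrix of $E$ is\[[E]=\frac{1}{3}\begin{pmatrix}2&-1\\-2&1\end{pmatrix},\]and a direct computation gives $[E]^2=\frac{1}{9}\begin{pmatrix}6&-3\\-6&3\end{pmatrix}=[E]$, so $E$ is idempotent, its image is spanned by $(1,-1)$, and its null space is spanned by $(1,2)$.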


Exercise 6.6.4

If $E_1$ and $E_2$ are projections onto independent subspaces, then $E_1+E_2$ is a projection. True or false?

Solution: False. Let $E_1$ be the projection in Exercise 6.6.3 and $E_2$ be the projection onto the subspace spanned by $(1,0)$ along the subspace spanned by $(0,1)$.

Then $$(E_1+E_2)(1,-1)=(1,-1)+(1,0)=(2,-1)$$and \begin{align*}(E_1+E_2)^2(1,-1)=(E_1+E_2)(2,-1)&=E_1(2,-1)+E_2(2,-1)\\&=\frac{1}{3}(5,-5)+(2,0)=\left(\tfrac{11}{3},-\tfrac{5}{3}\right)\ne (2,-1).\end{align*}Hence $(E_1+E_2)^2\ne E_1+E_2$, so $E_1+E_2$ is not a projection.
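The same conclusion can also be checked with matrices: in the standard basis,\[E_1=\frac{1}{3}\begin{pmatrix}2&-1\\-2&1\end{pmatrix},\qquad E_2=\begin{pmatrix}1&0\\0&0\end{pmatrix},\qquad E_1+E_2=\frac{1}{3}\begin{pmatrix}5&-1\\-2&1\end{pmatrix},\]and\[(E_1+E_2)^2=\frac{1}{9}\begin{pmatrix}27&-6\\-12&3\end{pmatrix}=\frac{1}{3}\begin{pmatrix}9&-2\\-4&1\end{pmatrix}\ne E_1+E_2.\]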


Exercise 6.6.5

If $E$ is a projection and $f$ is a polynomial, then $f(E)=aI+bE$. What are $a$ and $b$ in terms of the coefficients of $f$?

Solution: If $E$ is a projection, then $E^2=E$. Moreover, we have $$E^{k}=E$$for all $k\geq 1$. Hence, writing $f=c_0+c_1x+\cdots+c_nx^n$, $a$ is the constant term $c_0$ of $f$, while $b$ is the sum $c_1+\cdots+c_n$ of all the other coefficients.
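Explicitly, the computation reads\[f(E)=c_0I+c_1E+c_2E^2+\cdots+c_nE^n=c_0I+(c_1+\cdots+c_n)E,\]so $a=c_0=f(0)$ and $b=c_1+\cdots+c_n=f(1)-f(0)$.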


Exercise 6.6.6

True or false? If a diagonalizable operator has only the characteristic values $0$ and $1$, it is a projection.

Solution: True. Since $T$ is diagonalizable and has only the characteristic values $0$ and $1$, by Theorem 6 of page 204 the minimal polynomial of $T$ is \[p=x(x-1).\]Namely, $$T(T-I)=0\Rightarrow T^2=T.$$Hence $T$ is a projection.


Exercise 6.6.7

Prove that if $E$ is the projection on $R$ along $N$, then $(I-E)$ is the projection on $N$ along $R$.

Solution: We have $V=R\oplus N$. By definition, we have\[Ev_R=v_R,\quad Ev_N=0\]for all $v_R\in R$ and $v_N\in N$. Therefore $$(I-E)v_R=v_R-v_R=0,$$$$(I-E)v_N=v_N-0=v_N.$$Moreover,$$(I-E)^2=I-2E+E^2=I-E.$$Hence $I-E$ is the projection on $N$ along $R$.


Exercise 6.6.8

Let $E_1,\dots, E_k$ be linear operators on the space $V$ such that $E_1+\cdots+E_k=I$.

(a) Prove that if $E_iE_j=0$ for $i\not=j$, then $E_i^2=E_i$ for each $i$.

(b) In the case $k=2$, prove the converse of (a). That is, if $E_1+E_2=I$ and $E_1^2=E_1$, $E_2^2=E_2$, then $E_1E_2=0$.

Solution: For part (a), we have\begin{align*}E_i^2&=E_i(I-E_1-\cdots-E_{i-1}-E_{i+1}-\cdots-E_{k})\\&=E_i-\sum_{j\ne i}E_{i}E_{j}=E_i,\end{align*}since $E_iE_j=0$ for $i\ne j$.

(b) We have $$E_1=E_1(E_1+E_2)=E_1^2+E_1E_2.$$Since $E_1=E_1^2$, we have $E_1E_2=0$. Similarly, one can show that $E_2E_1=0$.


Exercise 6.6.9

Let $V$ be a real vector space and $E$ an idempotent linear operator on $V$, i.e., a projection. Prove that $(I+E)$ is invertible. Find $(I+E)^{-1}$.

Solution: Since $E$ is idempotent, we have $E^2=E$, and hence\[(I+E)(2I-E)=2I+2E-E-E^2=2I+E-E=2I.\]Because $I+E$ and $2I-E$ commute (both are polynomials in $E$), it follows that $I+E$ is invertible and $(I+E)^{-1}=\frac{1}{2}(2I-E)$.
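Alternatively, one can guess an inverse of the form $I+cE$ for a scalar $c$: using $E^2=E$,\[(I+E)(I+cE)=I+cE+E+cE^2=I+(1+2c)E,\]which equals $I$ precisely when $c=-\frac{1}{2}$. This gives the same answer $(I+E)^{-1}=I-\frac{1}{2}E=\frac{1}{2}(2I-E)$.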


Exercise 6.6.10

Let $F$ be a subfield of the complex numbers (or, a field of characteristic zero). Let $V$ be a finite-dimensional vector space over $F$. Suppose that $E_1,\dots,E_k$ are projections of $V$ and that $E_1+\cdots+E_k=I$. Prove that $E_iE_j=0$ for $i\not=j$ (Hint: use the trace function and ask yourself what the trace of a projection is.)

Solution: Let $W_i$ be the image of the projection $E_i$.

Since $E_i$ is a projection, its minimal polynomial is a monic divisor of $x(x-1)$. Since $x(x-1)$ has no multiple roots, by Theorem 6 of page 204, $E_i$ is diagonalizable, and its characteristic values are zero or one. The characteristic value one occurs exactly $\dim W_i$ times, so, as $F$ has characteristic zero, the trace $\mathrm{tr}(E_i)$ is exactly the dimension of $W_i$.
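Concretely, in a basis of characteristic vectors for $E_i$, the matrix of $E_i$ is\[[E_i]=\mathrm{diag}(\underbrace{1,\dots,1}_{\dim W_i},0,\dots,0),\qquad\text{so}\qquad \mathrm{tr}(E_i)=\dim W_i.\]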

Clearly, we have $W_1+W_2+\cdots+W_k\subset V$. Conversely, since $E_1+\cdots+E_k=I$, for any $v\in V$ we have\[v=E_1v+\cdots+E_kv\in W_1+W_2+\cdots+W_k.\]Hence $W_1+W_2+\cdots+W_k=V$. Moreover, taking the trace of $E_1+\cdots+E_k=I$ and using the linearity of the trace, we get\[\dim W_1+\cdots+\dim W_k=\dim V.\]Therefore, we have $V=W_1\oplus\cdots\oplus W_k$ by Exercise 6.6.2.

Taking $w_i\in W_i$, we have (note that $E_i$ fixes its image, so $E_iw_i=w_i$)\[w_i=Iw_i=\sum_{j=1}^k E_jw_i=w_i+\sum_{j\ne i}E_jw_i.\]Thus $\sum_{j\ne i}E_jw_i=0$ with each $E_jw_i\in W_j$, and since $V=W_1\oplus\cdots\oplus W_k$, we have $E_jw_i=0$ for $j\ne i$. Hence $E_jW_i=0$ for all $j\ne i$. Therefore $$E_jE_iv\in E_j W_i=0$$ for all $v\in V$ and $j\ne i$. Namely, $E_jE_i=0$ for $j\ne i$.


Exercise 6.6.11

Let $V$ be a vector space, let $W_1,\dots,W_k$ be subspaces of $V$, and let
$$V_j=W_1+\cdots+W_{j-1}+W_{j+1}+\cdots+W_k.$$Suppose that $V=W_1\oplus\cdots\oplus W_k$. Prove that the dual space $V^*$ has the direct-sum decomposition $V^*=V_1^0\oplus\cdots\oplus V_k^0$.

Solution: We use the transpose from Chapter 3.7. We also use the notation from Theorem 9 of page 212.

By Theorem 9 of page 212, we have projections $E_i$ such that $E_iE_j=0$ for $i\ne j$ and $I=E_1+\cdots+E_k$. Define the map $E_i^t:V^*\to V^*$ by\[(E^t_if)(v)=f(E_iv),\]for all $f\in V^*$ and $v\in V$. If $i\ne j$, we have\[(E_i^tE_j^tf)(v)=(E_j^tf)(E_iv)=f(E_jE_iv)=f(0)=0.\]Hence $E_i^tE_j^t=0$ for $i\ne j$. Similarly, one shows that $E_i^t$ is idempotent.
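Written out, the idempotence computation is\[\big((E_i^t)^2f\big)(v)=(E_i^tf)(E_iv)=f(E_iE_iv)=f(E_iv)=(E_i^tf)(v)\]for all $f\in V^*$ and $v\in V$, so $(E_i^t)^2=E_i^t$.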

We also have\begin{align*}((E_1^t+\cdots+E_k^t)f)v&=f(E_1v)+\cdots+f(E_kv)\\&=f(E_1v+\cdots+E_kv)=f((E_1+\cdots+E_k)v)\\&=f(Iv)=f(v).\end{align*}Hence $E_1^t+\cdots+E_k^t=I$.

Let $S_i$ be the image of $E_i^t$, then by Theorem 9 of page 212, we have\[V^*=S_1\oplus \cdots\oplus S_k.\]Therefore, it suffices to show that $S_i=V_i^0$.

Let $v_i\in V_i$, then $E_iv_i=0$ (page 211). Hence for all $f\in V^*$, we have\[(E_i^tf)v_i=f(E_iv_i)=f(0)=0.\]Hence $E_i^tf\in V_i^0$. Namely $S_i\subset V_i^0$.

Conversely, for $v\in V$, we write $v=w_i+v_i$ where $w_i\in W_i$ and $v_i\in V_i$. Moreover, $E_iv=w_i$. If $f\in V_i^0$, then $$f(v)=f(w_i+v_i)=f(w_i),$$$$(E_i^tf)v=f(E_iv)=f(w_i).$$Therefore, we have $E_i^tf=f$ for $f\in V_i^0$. This implies that $V_i^0\subset S_i$.

We conclude that $S_i=V_i^0$ and we are done.
