Exercise 2.3.1
Suppose $v_1$ and $v_2$ are linearly dependent. If one of them, say $v_1$, is the zero vector, then it is a scalar multiple of the other one: $v_1=0\cdot v_2$. So we may assume both $v_1$ and $v_2$ are non-zero. By dependence there exist scalars $c_1,c_2$, not both zero, such that $c_1v_1+c_2v_2=0$. In fact both $c_1$ and $c_2$ must be non-zero: if, say, $c_1=0$, then $c_2v_2=0$ with $c_2\not=0$, forcing $v_2=0$, a contradiction. Therefore we can write $v_1=-\frac{c_2}{c_1}v_2$.
Exercise 2.3.2
By Corollary 3, page 46, it suffices to determine if the matrix whose rows are the $\alpha_i$’s is invertible. By Theorem 12 (ii) we can do this by row reducing the matrix
$$\left[\begin{array}{cccc}1&1&2&4\\2&-1&-5&2\\1&-1&-4&0\\2&1&1&6\end{array}\right].$$
$$\left[\begin{array}{cccc}1&1&2&4\\2&-1&-5&2\\1&-1&-4&0\\2&1&1&6\end{array}\right]
\rightarrow
\left[\begin{array}{cccc}1&1&2&4\\0&-3&-9&-6\\0&-2&-6&-4\\0&-1&-3&-2\end{array}\right]
\underset{\text{rows}}{\overset{\text{swap}}{\rightarrow}}
\left[\begin{array}{cccc}1&1&2&4\\0&-1&-3&-2\\0&-3&-9&-6\\0&-2&-6&-4\end{array}\right]$$
$$\rightarrow
\left[\begin{array}{cccc}1&1&2&4\\0&1&3&2\\0&-3&-9&-6\\0&-2&-6&-4\end{array}\right]
\rightarrow
\left[\begin{array}{cccc}1&1&2&4\\0&1&3&2\\0&0&0&0\\0&0&0&0\end{array}\right].$$
Since the reduced matrix has rows of zeros, the original matrix is not invertible. Thus the four vectors are not linearly independent.
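For readers who want to double-check the arithmetic, here is a minimal sketch, assuming Python with the sympy library is available, that reproduces the row reduction:

```python
from sympy import Matrix

# The matrix whose rows are the alpha_i's.
A = Matrix([[1, 1, 2, 4],
            [2, -1, -5, 2],
            [1, -1, -4, 0],
            [2, 1, 1, 6]])

# rref() returns the reduced row echelon form and the pivot columns.
R, pivots = A.rref()
print(R)            # only two nonzero rows
print(len(pivots))  # rank 2 < 4, so the rows are linearly dependent
```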
Exercise 2.3.3
In Section 2.5, Theorem 9, page 56, it is proven that row-equivalent matrices have the same row space. The proof is almost immediate, so there seems to be no easier way to prove this than to use that fact. If you multiply a matrix $A$ on the left by another matrix $P$, the rows of the new matrix $PA$ are linear combinations of the rows of the original matrix. Thus the rows of $PA$ generate a subspace of the space generated by the rows of $A$. If $P$ is invertible, then each of the two spaces is contained in the other, since we can go backwards using $P^{-1}$. Thus the rows of row-equivalent matrices generate the same space. Using the row-reduced form of the matrix in Exercise 2, it follows that the space is two-dimensional and generated by $(1,1,2,4)$ and $(0,1,3,2)$.
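The claim about the row space can also be checked mechanically; a sketch under the same assumption that sympy is available:

```python
from sympy import Matrix

A = Matrix([[1, 1, 2, 4],
            [2, -1, -5, 2],
            [1, -1, -4, 0],
            [2, 1, 1, 6]])
R, _ = A.rref()

# Stacking A on top of its row-reduced form does not increase the rank,
# so the two matrices have the same row space, of dimension two.
print(Matrix.vstack(A, R).rank() == A.rank() == R.rank() == 2)  # True
```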
Exercise 2.3.4
By Corollary 3, page 46, to show the vectors are linearly independent it suffices to show the matrix whose rows are the $\alpha_i$’s is invertible. By Theorem 12 (ii) we can do this by row reducing the matrix
$$A=\left[\begin{array}{ccc}1&0&-1\\1&2&1\\0&-3&2\end{array}\right].$$
$$\left[\begin{array}{ccc}1&0&-1\\1&2&1\\0&-3&2\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&-1\\0&2&2\\0&-3&2\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&-1\\0&1&1\\0&-3&2\end{array}\right]$$
$$\rightarrow
\left[\begin{array}{ccc}1&0&-1\\0&1&1\\0&0&5\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&-1\\0&1&1\\0&0&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&0\\0&1&0\\0&0&1\end{array}\right].$$
Since $A$ row-reduces to the identity matrix, it is invertible, so the three vectors are linearly independent. Now to write the standard basis vectors in terms of these vectors, by the discussion at the bottom of page 25 through page 26, we can row-reduce the augmented matrix
$$\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\1&2&1&0&1&0\\0&-3&2&0&0&1\end{array}\right].$$This gives
$$\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\1&2&1&0&1&0\\0&-3&2&0&0&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\0&2&2&-1&1&0\\0&-3&2&0&0&1\end{array}\right]$$
$$\rightarrow
\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\0&1&1&-1/2&1/2&0\\0&-3&2&0&0&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\0&1&1&-1/2&1/2&0\\0&0&5&-3/2&3/2&1\end{array}\right]$$
$$\rightarrow
\left[\begin{array}{ccc|ccc}1&0&-1&1&0&0\\0&1&1&-1/2&1/2&0\\0&0&1&-3/10&3/10&1/5\end{array}\right]
\rightarrow
\left[\begin{array}{ccc|ccc}1&0&0&7/10&3/10&1/5\\0&1&0&-1/5&1/5&-1/5\\0&0&1&-3/10&3/10&1/5\end{array}\right].$$
Thus if
$$P=\left[\begin{array}{ccc}7/10&3/10&1/5\\-1/5&1/5&-1/5\\-3/10&3/10&1/5\end{array}\right]$$then $PA=I$, so we have
$$\frac{7}{10}\alpha_1+\frac{3}{10}\alpha_2+\frac{1}{5}\alpha_3=(1,0,0)$$
$$-\frac{1}{5}\alpha_1+\frac{1}{5}\alpha_2-\frac{1}{5}\alpha_3=(0,1,0)$$
$$-\frac{3}{10}\alpha_1+\frac{3}{10}\alpha_2+\frac{1}{5}\alpha_3=(0,0,1).$$
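Since $PA=I$ with $A$ square, $P$ is simply $A^{-1}$, so the computation is quick to verify; a sketch assuming sympy:

```python
from sympy import Matrix, eye

A = Matrix([[1, 0, -1],
            [1, 2, 1],
            [0, -3, 2]])

P = A.inv()              # exact inverse over the rationals
print(P)                 # matches the matrix P above
print(P * A == eye(3))   # True: row i of P expresses e_i in the alpha_j
```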
Exercise 2.3.5
Let $v_1=(1,0,0)$, $v_2=(0,1,0)$ and $v_3=(1,1,0)$. Then $v_1+v_2-v_3=(0,0,0)$ so they are linearly dependent. We know $v_1$ and $v_2$ are linearly independent as they are two of the standard basis vectors (see Example 13, page 41). Suppose $av_1+bv_3=0$. Then $(a+b,b,0)=(0,0,0)$. The second coordinate implies $b=0$ and then the first coordinate in turn implies $a=0$. Thus $v_1$ and $v_3$ are linearly independent. Analogously $v_2$ and $v_3$ are linearly independent.
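Both claims are easy to confirm numerically; a minimal sketch assuming sympy:

```python
from sympy import Matrix

v1, v2, v3 = Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([1, 1, 0])

print((v1 + v2 - v3).T)  # Matrix([[0, 0, 0]]): the three are dependent
# Each pair of columns has rank 2, hence is linearly independent.
for u, w in [(v1, v2), (v1, v3), (v2, v3)]:
    print(Matrix.hstack(u, w).rank())  # 2, 2, 2
```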
Exercise 2.3.6
Let
$$v_{11}=\left[\begin{array}{cc}1&0\\0&0\end{array}\right],\quad v_{12}=\left[\begin{array}{cc}0&1\\0&0\end{array}\right],$$
$$v_{21}=\left[\begin{array}{cc}0&0\\1&0\end{array}\right],\quad v_{22}=\left[\begin{array}{cc}0&0\\0&1\end{array}\right].$$
Suppose $av_{11}+bv_{12}+cv_{21}+dv_{22}=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$. Then
$$\left[\begin{array}{cc}a&b\\c&d\end{array}\right] = \left[\begin{array}{cc}0&0\\0&0\end{array}\right],$$from which it follows immediately that $a=b=c=d=0$. Thus $v_{11}$, $v_{12}$, $v_{21}$, $v_{22}$ are linearly independent.
Now let $\left[\begin{array}{cc}a&b\\c&d\end{array}\right]$ be any $2\times2$ matrix. Then $\left[\begin{array}{cc}a&b\\c&d\end{array}\right]=av_{11}+bv_{12}+cv_{21}+dv_{22}$. Thus $v_{11}$, $v_{12}$, $v_{21}$, $v_{22}$ span the space of $2\times2$ matrices.
Thus $v_{11}$, $v_{12}$, $v_{21}$, $v_{22}$ are both linearly independent and span the space of all $2\times2$ matrices. Thus they constitute a basis for the space of all $2\times2$ matrices.
Exercise 2.3.7
(a) Let $A=\left[\begin{array}{cc}x&-x\\y&z\end{array}\right]$ and $B=\left[\begin{array}{cc}x'&-x'\\y'&z'\end{array}\right]$ be two elements of $W_1$ and let $c\in F$. Then
$$cA+B=\left[\begin{array}{cc}cx+x'&-cx-x'\\cy+y'&cz+z'\end{array}\right]=\left[\begin{array}{cc}a&-a\\u&v\end{array}\right]$$where $a=cx+x'$, $u=cy+y'$ and $v=cz+z'$. Thus $cA+B$ is in the form of an element of $W_1$. Thus $cA+B\in W_1$. By Theorem 1 (page 35), $W_1$ is a subspace.
Now let $A=\left[\begin{array}{cc}a&b\\-a&d\end{array}\right]$ and $B=\left[\begin{array}{cc}a'&b'\\-a'&d'\end{array}\right]$ be two elements of $W_2$ and let $c\in F$. Then
$$cA+B=\left[\begin{array}{cc}ca+a'&cb+b'\\-ca-a'&cd+d'\end{array}\right]=\left[\begin{array}{cc}x&y\\-x&z\end{array}\right]$$where $x=ca+a'$, $y=cb+b'$ and $z=cd+d'$. Thus $cA+B$ is in the form of an element of $W_2$. Thus $cA+B\in W_2$. By Theorem 1 (page 35), $W_2$ is a subspace.
(b) Let
$$A_1=\left[\begin{array}{cc}1&-1\\0&0\end{array}\right],\quad A_2=\left[\begin{array}{cc}0&0\\1&0\end{array}\right],\quad A_3=\left[\begin{array}{cc}0&0\\0&1\end{array}\right].$$Then $A_1,A_2,A_3\in W_1$ and
$$c_1A_1+c_2A_2+c_3A_3 = \left[\begin{array}{cc}c_1&-c_1\\c_2&c_3\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$$implies $c_1=c_2=c_3=0$. So $A_1$, $A_2$, $A_3$ are linearly independent. Now let $A=\left[\begin{array}{cc}x&-x\\y&z\end{array}\right]$ be any element of $W_1$. Then $A=xA_1+yA_2+zA_3$. Thus $A_1$, $A_2$, $A_3$ span $W_1$. Thus $\{A_1,A_2,A_3\}$ form a basis for $W_1$. Thus $W_1$ has dimension three.
Let
$$A_1=\left[\begin{array}{cc}1&0\\-1&0\end{array}\right],\quad A_2=\left[\begin{array}{cc}0&1\\0&0\end{array}\right],\quad A_3=\left[\begin{array}{cc}0&0\\0&1\end{array}\right].$$Then $A_1,A_2,A_3\in W_2$ and
$$c_1A_1+c_2A_2+c_3A_3 = \left[\begin{array}{cc}c_1&c_2\\-c_1&c_3\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$$implies $c_1=c_2=c_3=0$. So $A_1$, $A_2$, $A_3$ are linearly independent. Now let $A=\left[\begin{array}{cc}x&y\\-x&z\end{array}\right]$ be any element of $W_2$. Then $A=xA_1+yA_2+zA_3$. Thus $A_1$, $A_2$, $A_3$ span $W_2$. Thus $\{A_1,A_2,A_3\}$ form a basis for $W_2$. Thus $W_2$ has dimension three.
Let $V$ be the space of $2\times2$ matrices. We showed in Exercise 6 that $\dim(V)=4$. Now $W_1\subseteq W_1+W_2\subseteq V$. Thus by Corollary 1, page 46, $3\leq\dim(W_1+W_2)\leq4$. Let $A=\left[\begin{array}{cc}1&0\\-1&0\end{array}\right]$. Then $A\in W_2$ but $A\not\in W_1$. Thus $W_1+W_2$ is strictly bigger than $W_1$, so $4\geq\dim(W_1+W_2)>\dim(W_1)=3$. Thus $\dim(W_1+W_2)=4$.
Suppose $A=\left[\begin{array}{cc}a&b\\c&d\end{array}\right]$ is in $W_1\cap W_2$. Then $A\in W_1$ $\Rightarrow$ $b=-a$ and $A\in W_2$ $\Rightarrow$ $c=-a$. So $A=\left[\begin{array}{cc}a&-a\\-a&d\end{array}\right]$. Let $A_1=\left[\begin{array}{cc}1&-1\\-1&0\end{array}\right]$, $A_2=\left[\begin{array}{cc}0&0\\0&1\end{array}\right]$. Suppose $aA_1+dA_2=0$. Then
$$\left[\begin{array}{cc}a&-a\\-a&d\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right],$$which implies $a=d=0$. Thus $A_1$ and $A_2$ are linearly independent. Let $A=\left[\begin{array}{cc}a&-a\\-a&d\end{array}\right]\in W_1\cap W_2$. Then $A=aA_1+dA_2$. So $A_1,A_2$ span $W_1\cap W_2$. Thus $\{A_1,A_2\}$ is a basis for $W_1\cap W_2$. Thus $\dim(W_1\cap W_2)=2$.
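These dimension counts can be cross-checked by flattening each $2\times2$ matrix to a row vector $(a,b,c,d)$ and computing ranks; this sketch (assuming sympy) also uses the identity $\dim W_1+\dim W_2=\dim(W_1\cap W_2)+\dim(W_1+W_2)$ from Theorem 6, page 46:

```python
from sympy import Matrix

# Bases of W1 and W2, flattened row-major: (a, b, c, d) = [[a, b], [c, d]].
W1 = Matrix([[1, -1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
W2 = Matrix([[1, 0, -1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])

dim_sum = Matrix.vstack(W1, W2).rank()
print(dim_sum)                           # 4 = dim(W1 + W2)
print(W1.rank() + W2.rank() - dim_sum)   # 2 = dim(W1 ∩ W2)
```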
Exercise 2.3.8
Let $V$ be the space of all $2\times2$ matrices. Let
$$A_1=\left[\begin{array}{cc}1&0\\0&1\end{array}\right],\quad A_2=\left[\begin{array}{cc}0&0\\0&1\end{array}\right],$$
$$A_3=\left[\begin{array}{cc}1&1\\0&0\end{array}\right],\quad A_4=\left[\begin{array}{cc}0&0\\1&1\end{array}\right].$$
Then $A_i^2=A_i$ for all $i$. Now
$$aA_1+bA_2+cA_3+dA_4=\left[\begin{array}{cc}a+c&c\\d&b+d\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$$implies $c=d=0$, which in turn implies $a=b=0$. Thus $A_1,A_2,A_3,A_4$ are linearly independent. Thus they span a subspace of $V$ of dimension four. But by Exercise 6, $V$ also has dimension four. Thus by Corollary 1, page 46, the subspace spanned by $A_1,A_2,A_3,A_4$ is the entire space. Thus $\{A_1,A_2,A_3,A_4\}$ is a basis.
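Both the idempotence and the independence are easy to confirm mechanically; a sketch assuming sympy:

```python
from sympy import Matrix

As = [Matrix([[1, 0], [0, 1]]), Matrix([[0, 0], [0, 1]]),
      Matrix([[1, 1], [0, 0]]), Matrix([[0, 0], [1, 1]])]

print(all(M * M == M for M in As))  # True: each A_i satisfies A_i^2 = A_i

# Flatten each A_i to a row vector; rank 4 means they are independent.
print(Matrix([list(M) for M in As]).rank())  # 4
```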
Exercise 2.3.9
Suppose $a(\alpha+\beta)+b(\beta+\gamma)+c(\gamma+\alpha)=0$. Rearranging gives $(a+c)\alpha+(a+b)\beta+(b+c)\gamma=0$. Since $\alpha$, $\beta$, and $\gamma$ are linearly independent it follows that $a+c=a+b=b+c=0$. This gives a system of equations in $a,b,c$ with matrix
$$\left[\begin{array}{ccc}1&1&0\\1&0&1\\0&1&1\end{array}\right].$$This row-reduces as follows:
$$\left[\begin{array}{ccc}1&1&0\\1&0&1\\0&1&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&1&0\\0&-1&1\\0&1&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&1&0\\0&1&-1\\0&1&1\end{array}\right]$$
$$\rightarrow
\left[\begin{array}{ccc}1&0&1\\0&1&-1\\0&0&2\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&1\\0&1&-1\\0&0&1\end{array}\right]
\rightarrow
\left[\begin{array}{ccc}1&0&0\\0&1&0\\0&0&1\end{array}\right].$$
Since this row-reduces to the identity matrix, by Theorem 7, page 13, the only solution is $a=b=c=0$. Thus $(\alpha+\beta)$, $(\beta+\gamma)$, and $(\gamma+\alpha)$ are linearly independent.
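The same conclusion drops out of a one-line computation; a sketch assuming sympy:

```python
from sympy import Matrix, eye

C = Matrix([[1, 1, 0],
            [1, 0, 1],
            [0, 1, 1]])

# The rref is the identity, so a = b = c = 0 is the only solution.
print(C.rref()[0] == eye(3))  # True
```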
Exercise 2.3.10
The statement follows from Theorem 4, page 44.
Exercise 2.3.11
(a) It is clear from inspection of the definition of a vector space (pages 28-29) that a vector space over a field $F$ is a vector space over every subfield of $F$, because all properties (e.g. commutativity and associativity) are inherited from the operations in $F$. Let $M$ be the vector space of all $2\times2$ matrices over $\mathbb C$ ($M$ is a vector space; see Example 2, page 29). We will show $V$ is a subspace of $M$ as a vector space over $\mathbb C$. It will follow from the comment above that $V$ is a vector space over $\mathbb R$. Now $V$ is a subset of $M$, so using Theorem 1 (page 35) we must show that whenever $A,B\in V$ and $c\in\mathbb C$, then $cA+B\in V$. Let $A,B\in V$. Write $A=\left[\begin{array}{cc}x&y\\z&w\end{array}\right]$ and $B=\left[\begin{array}{cc}x'&y'\\z'&w'\end{array}\right]$. Then
\begin{equation}
x+w=x'+w'=0.
\label{p2.3.11}
\end{equation}
Now
$$cA+B=\left[\begin{array}{cc}cx+x'&cy+y'\\cz+z'&cw+w'\end{array}\right].$$
To show $cA+B\in V$ we must show $(cx+x')+(cw+w')=0$. Rearranging the left-hand side gives $c(x+w)+(x'+w')$, which equals zero by (\ref{p2.3.11}).
(b) We can write the general element of $V$ as
$$A=\left[\begin{array}{cc}a+bi&e+fi\\g+hi&-a-bi\end{array}\right].$$Let
$$v_1=\left[\begin{array}{cc}1&0\\0&-1\end{array}\right],\quad v_2=\left[\begin{array}{cc}i&0\\0&-i\end{array}\right],$$
$$v_3=\left[\begin{array}{cc}0&1\\0&0\end{array}\right],\quad v_4=\left[\begin{array}{cc}0&i\\0&0\end{array}\right],$$
$$v_5=\left[\begin{array}{cc}0&0\\1&0\end{array}\right],\quad v_6=\left[\begin{array}{cc}0&0\\i&0\end{array}\right].$$
Then $A=av_1+bv_2+ev_3+fv_4+gv_5+hv_6$, so $v_1$, $v_2$, $v_3$, $v_4$, $v_5$, $v_6$ span $V$. Suppose
$$av_1+bv_2+ev_3+fv_4+gv_5+hv_6=0.$$
Then
$$av_1+bv_2+ev_3+fv_4+gv_5+hv_6=\left[\begin{array}{cc}a+bi&e+fi\\g+hi&-a-bi\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$$implies $a=b=e=f=g=h=0$, because a complex number $u+vi=0$ $\Leftrightarrow$ $u=v=0$. Thus $v_1$, $v_2$, $v_3$, $v_4$, $v_5$, $v_6$ are linearly independent. Thus $\{v_1,\dots,v_6\}$ is a basis for $V$ as a vector space over $\mathbb R$, and $\dim(V)=6$.
(c) Let $A,B\in W$ and $c\in\mathbb R$. By Theorem 1 (page 35) we must show $cA+B\in W$. Write $A=\left[\begin{array}{cc}x&y\\-\bar{y}&-x\end{array}\right]$ and $B=\left[\begin{array}{cc}x'&y'\\-\bar{y'}&-x'\end{array}\right]$, where $x,y,x',y'\in\mathbb C$. Then $$cA+B=\left[\begin{array}{cc}cx+x'&cy+y'\\-c\bar{y}-\bar{y'}&-cx-x'\end{array}\right].$$ Since $c$ is real, $-c\bar{y}-\bar{y'}=-\overline{(cy+y')}$, and it follows that $cA+B\in W$. Note that we really do need $c\in\mathbb R$ here: for complex $c$ we have $\overline{cy}=\bar{c}\bar{y}$, not $c\bar{y}$.
It remains to find a basis for $W$. We can write the general element of $W$ as
$$A=\left[\begin{array}{cc}a+bi&e+fi\\-e+fi&-a-bi\end{array}\right].$$Let
$$v_1=\left[\begin{array}{cc}1&0\\0&-1\end{array}\right],\quad v_2=\left[\begin{array}{cc}i&0\\0&-i\end{array}\right],$$
$$v_3=\left[\begin{array}{cc}0&1\\-1&0\end{array}\right],\quad v_4=\left[\begin{array}{cc}0&i\\i&0\end{array}\right].$$
Then $A=av_1+bv_2+ev_3+fv_4$, so $v_1$, $v_2$, $v_3$, $v_4$ span $W$. Suppose $av_1+bv_2+ev_3+fv_4=0$. Then
$$av_1+bv_2+ev_3+fv_4=\left[\begin{array}{cc}a+bi&e+fi\\-e+fi&-a-bi\end{array}\right]=\left[\begin{array}{cc}0&0\\0&0\end{array}\right]$$implies $a=b=e=f=0$, because a complex number $u+vi=0$ $\Leftrightarrow$ $u=v=0$. Thus $v_1$, $v_2$, $v_3$, $v_4$ are linearly independent. Thus $\{v_1,\dots,v_4\}$ is a basis for $W$ as a vector space over $\mathbb R$, and $\dim(W)=4$.
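The real dimensions found in (b) and (c) can be verified by flattening each complex $2\times2$ matrix into a real $8$-vector (real and imaginary parts of each entry) and computing ranks; a sketch assuming sympy:

```python
from sympy import Matrix, I, re, im

def realify(M):
    # Flatten a complex 2x2 matrix into a real 8-vector.
    return [f(z) for z in M for f in (re, im)]

# Part (b): the six matrices spanning V.
V_span = [Matrix([[1, 0], [0, -1]]), Matrix([[I, 0], [0, -I]]),
          Matrix([[0, 1], [0, 0]]), Matrix([[0, I], [0, 0]]),
          Matrix([[0, 0], [1, 0]]), Matrix([[0, 0], [I, 0]])]
print(Matrix([realify(M) for M in V_span]).rank())  # 6 = dim V over R

# Part (c): the four matrices spanning W.
W_span = [Matrix([[1, 0], [0, -1]]), Matrix([[I, 0], [0, -I]]),
          Matrix([[0, 1], [-1, 0]]), Matrix([[0, I], [I, 0]])]
print(Matrix([realify(M) for M in W_span]).rank())  # 4 = dim W over R
```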
Exercise 2.3.12
Let $M$ be the space of all $m\times n$ matrices. Let $M_{ij}$ be the matrix of all zeros except for the $i,j$-th entry, which is a one. We claim $\{M_{ij}\mid 1\leq i\leq m,\ 1\leq j\leq n\}$ constitutes a basis for $M$. Let $A=(a_{ij})$ be an arbitrary matrix in $M$. Then $A=\sum_{ij}a_{ij}M_{ij}$. Thus the $M_{ij}$ span $M$. Suppose $\sum_{ij}a_{ij}M_{ij}=0$. The left-hand side equals the matrix $(a_{ij})$, and this equals the zero matrix if and only if every $a_{ij}=0$. Thus the $M_{ij}$ are linearly independent as well. Thus these $mn$ matrices constitute a basis, and $M$ has dimension $mn$.
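As an illustration for one concrete choice of dimensions (here $m=2$, $n=3$, chosen arbitrarily), a sympy sketch:

```python
from sympy import Matrix, zeros

m, n = 2, 3  # an arbitrary illustrative choice

units = []
for i in range(m):
    for j in range(n):
        M = zeros(m, n)
        M[i, j] = 1   # the matrix unit M_ij
        units.append(M)

# Flatten each M_ij to a row vector; mn independent rows span dimension mn.
print(Matrix([list(M) for M in units]).rank())  # 6 = m*n
```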
Exercise 2.3.13
If $F$ has characteristic two then $$(\alpha+\beta)+(\beta+\gamma)+(\gamma+\alpha)=2\alpha+2\beta+2\gamma=0+0+0=0$$ since in a field of characteristic two, $2=0$. Thus in this case $(\alpha+\beta)$, $(\beta+\gamma)$ and $(\gamma+\alpha)$ are linearly dependent. However, any two of them are linearly independent. For example, suppose $a_1(\alpha+\beta)+a_2(\beta+\gamma)=0$. The left-hand side equals $a_1\alpha+a_2\gamma+(a_1+a_2)\beta$. Since $\alpha$, $\beta$, $\gamma$ are linearly independent, this is zero only if $a_1=0$, $a_2=0$ and $a_1+a_2=0$. In particular $a_1=a_2=0$, so $\alpha+\beta$ and $\beta+\gamma$ are linearly independent.
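Taking $\alpha,\beta,\gamma$ to be the standard basis vectors of $F^3$ with $F$ the field of two elements makes the dependence concrete; a minimal pure-Python sketch:

```python
# Work over GF(2): arithmetic on 3-tuples, coordinatewise mod 2.
alpha, beta, gamma = (1, 0, 0), (0, 1, 0), (0, 0, 1)

def add(u, v):
    return tuple((a + b) % 2 for a, b in zip(u, v))

s1, s2, s3 = add(alpha, beta), add(beta, gamma), add(gamma, alpha)
# The sum of the three vectors is zero, so they are linearly dependent.
print(add(add(s1, s2), s3))  # (0, 0, 0)
```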
Exercise 2.3.14
We know that $\mathbb Q$ is countable and $\mathbb R$ is uncountable. Since the set of $n$-tuples of elements from a countable set is countable, $\mathbb Q^n$ is countable for all $n$. Now suppose $\{r_1,\dots,r_n\}$ is a basis for $\mathbb R$ over $\mathbb Q$. Then every element of $\mathbb R$ can be written as $a_1r_1+\cdots+a_nr_n$ with $a_1,\dots,a_n\in\mathbb Q$. Thus the map $(a_1,\dots,a_n)\mapsto a_1r_1+\cdots+a_nr_n$ takes $\mathbb Q^n$ onto $\mathbb R$, so the cardinality of $\mathbb R$ must be less than or equal to that of $\mathbb Q^n$. But the former is uncountable and the latter is countable, a contradiction. Thus there can be no such finite basis.
From http://greggrant.org