Exercise 2.3.1
Suppose $\alpha$ and $\beta$ are linearly dependent. If one of them, say $\alpha$, is the zero vector, then it is a scalar multiple of the other one: $\alpha = 0\cdot\beta$. So we can assume both $\alpha$ and $\beta$ are non-zero. Then if $c_1\alpha + c_2\beta = 0$ with $(c_1, c_2) \neq (0,0)$, both $c_1$ and $c_2$ must be non-zero: if, say, $c_1 = 0$, then $c_2\beta = 0$ would force $c_2 = 0$ as well, since $\beta \neq 0$. Therefore we can write $\alpha = -\frac{c_2}{c_1}\beta$.
Exercise 2.3.2
By Corollary 3, page 46, it suffices to determine whether the matrix whose rows are the $\alpha_i$'s is invertible. By Theorem 12 (ii) we can do this by row-reducing the matrix
\[ \begin{pmatrix} 1 & 1 & 2 & 4 \\ 2 & -1 & -5 & 2 \\ 1 & -1 & -4 & 0 \\ 2 & 1 & 1 & 6 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 0 & -1 & 2 \\ 0 & 1 & 3 & 2 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}. \]
The row-reduced matrix has two zero rows, so the original matrix is not invertible. Thus the four vectors are not linearly independent.
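As a sanity check, the row reduction can be reproduced mechanically. The snippet below is a minimal Gauss-Jordan sketch over the rationals, assuming the exercise's vectors $\alpha_1 = (1,1,2,4)$, $\alpha_2 = (2,-1,-5,2)$, $\alpha_3 = (1,-1,-4,0)$, $\alpha_4 = (2,1,1,6)$.

```python
from fractions import Fraction

def rref(rows):
    """Gauss-Jordan elimination over the rationals; returns the RREF."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivot = 0
    for col in range(len(M[0])):
        r = next((i for i in range(pivot, len(M)) if M[i][col] != 0), None)
        if r is None:
            continue
        M[pivot], M[r] = M[r], M[pivot]
        M[pivot] = [x / M[pivot][col] for x in M[pivot]]
        for i in range(len(M)):
            if i != pivot and M[i][col] != 0:
                f = M[i][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[pivot])]
        pivot += 1
    return M

# rows are the four vectors from the exercise
A = [[1, 1, 2, 4], [2, -1, -5, 2], [1, -1, -4, 0], [2, 1, 1, 6]]
R = rref(A)
for row in R:
    print([int(x) for x in row])
```

The output has only two non-zero rows, $(1,0,-1,2)$ and $(0,1,3,2)$, confirming that the four vectors are dependent and that their span is two-dimensional (as used in the next exercise).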
Exercise 2.3.3
In Section 2.5, Theorem 9, page 56, it will be proven that row-equivalent matrices have the same row space. The proof of this is almost immediate, so there seems no easier way to prove it than to use that fact. If you multiply a matrix $A$ on the left by another matrix $P$, the rows of the new matrix $PA$ are linear combinations of the rows of the original matrix. Thus the rows of $PA$ generate a subspace of the space generated by the rows of $A$. If $P$ is invertible, then the two spaces must be contained in each other, since we can go backwards with $P^{-1}$. Thus the rows of row-equivalent matrices generate the same space. Thus, using the row-reduced form of the matrix in Exercise 2, it must be that the space is two-dimensional and generated by $(1,0,-1,2)$ and $(0,1,3,2)$.
Exercise 2.3.4
By Corollary 3, page 46, to show the vectors are linearly independent it suffices to show the matrix whose rows are the $\alpha_i$'s is invertible. By Theorem 12 (ii) we can do this by row-reducing the matrix
\[ A = \begin{pmatrix} 1 & 0 & -1 \\ 1 & 2 & 1 \\ 0 & -3 & 2 \end{pmatrix}, \]
which row-reduces to the identity matrix, so $A$ is invertible and the three vectors form a basis.
Now to write the standard basis vectors in terms of these vectors, by the discussion at the bottom of page 25 through page 26, we can row-reduce the augmented matrix
\[ \left(\begin{array}{ccc|ccc} 1 & 0 & -1 & 1 & 0 & 0 \\ 1 & 2 & 1 & 0 & 1 & 0 \\ 0 & -3 & 2 & 0 & 0 & 1 \end{array}\right). \]
This gives
\[ \left(\begin{array}{ccc|ccc} 1 & 0 & 0 & \tfrac{7}{10} & \tfrac{3}{10} & \tfrac{2}{10} \\ 0 & 1 & 0 & -\tfrac{2}{10} & \tfrac{2}{10} & -\tfrac{2}{10} \\ 0 & 0 & 1 & -\tfrac{3}{10} & \tfrac{3}{10} & \tfrac{2}{10} \end{array}\right). \]
Thus if
\[ P = \frac{1}{10}\begin{pmatrix} 7 & 3 & 2 \\ -2 & 2 & -2 \\ -3 & 3 & 2 \end{pmatrix} \]
then $PA = I$, so we have
\[ \epsilon_1 = \tfrac{1}{10}(7\alpha_1 + 3\alpha_2 + 2\alpha_3), \quad \epsilon_2 = \tfrac{1}{10}(-2\alpha_1 + 2\alpha_2 - 2\alpha_3), \quad \epsilon_3 = \tfrac{1}{10}(-3\alpha_1 + 3\alpha_2 + 2\alpha_3). \]
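As a quick check, the coefficients $\tfrac{1}{10}(7,3,2)$, $\tfrac{1}{10}(-2,2,-2)$, $\tfrac{1}{10}(-3,3,2)$ do recover the standard basis vectors; the sketch below assumes the exercise's vectors $\alpha_1 = (1,0,-1)$, $\alpha_2 = (1,2,1)$, $\alpha_3 = (0,-3,2)$.

```python
from fractions import Fraction

alphas = [(1, 0, -1), (1, 2, 1), (0, -3, 2)]
# claimed coefficients (each divided by 10) expressing e1, e2, e3
coeffs = [(7, 3, 2), (-2, 2, -2), (-3, 3, 2)]

standard = []
for row in coeffs:
    # form the linear combination sum_j (c_j / 10) * alpha_j, coordinate by coordinate
    v = [sum(Fraction(c, 10) * a[k] for c, a in zip(row, alphas)) for k in range(3)]
    standard.append([int(x) for x in v])
print(standard)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```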
Exercise 2.3.5
Let $\alpha_1 = (1,0,0)$, $\alpha_2 = (0,1,0)$ and $\alpha_3 = (1,1,0)$. Then $\alpha_3 = \alpha_1 + \alpha_2$, so they are linearly dependent. We know $\alpha_1$ and $\alpha_2$ are linearly independent as they are two of the standard basis vectors (see Example 13, page 41). Suppose $a\alpha_1 + b\alpha_3 = 0$. Then $(a+b, b, 0) = (0,0,0)$. The second coordinate implies $b = 0$, and then the first coordinate in turn implies $a = 0$. Thus $\alpha_1$ and $\alpha_3$ are linearly independent. Analogously $\alpha_2$ and $\alpha_3$ are linearly independent.
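One concrete choice (assumed here: $\alpha_1 = (1,0,0)$, $\alpha_2 = (0,1,0)$, $\alpha_3 = (1,1,0)$) can be checked mechanically: the three vectors satisfy a dependence relation, yet no one of them is a scalar multiple of another.

```python
a1, a2, a3 = (1, 0, 0), (0, 1, 0), (1, 1, 0)

# dependence: a3 is exactly a1 + a2
dependent = tuple(x + y for x, y in zip(a1, a2)) == a3
print(dependent)  # True

def proportional(u, v):
    """For non-zero u and v: the pair is dependent iff one is a scalar
    multiple of the other, i.e. u[i]*v[j] == u[j]*v[i] for all i, j."""
    n = len(u)
    return all(u[i] * v[j] == u[j] * v[i] for i in range(n) for j in range(n))

# every pair is independent
pairs = [proportional(p, q) for p, q in [(a1, a2), (a1, a3), (a2, a3)]]
print(pairs)  # [False, False, False]
```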
Exercise 2.3.6
Let
\[ A_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad A_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad A_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad A_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}. \]
Suppose $c_1A_1 + c_2A_2 + c_3A_3 + c_4A_4 = 0$. Then
\[ \begin{pmatrix} c_1 & c_2 \\ c_3 & c_4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}, \]
from which it follows immediately that $c_1 = c_2 = c_3 = c_4 = 0$. Thus $A_1$, $A_2$, $A_3$, $A_4$ are linearly independent.
Now let $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ be any $2\times2$ matrix. Then $A = aA_1 + bA_2 + cA_3 + dA_4$. Thus $A_1$, $A_2$, $A_3$, $A_4$ span the space of $2\times2$ matrices.
Thus $A_1$, $A_2$, $A_3$, $A_4$ are both linearly independent and they span the space of all $2\times2$ matrices. Thus $A_1$, $A_2$, $A_3$, $A_4$ constitute a basis for the space of all $2\times2$ matrices, which therefore has dimension four.
Exercise 2.3.7
(a) Let $A = \begin{pmatrix} x & -x \\ y & z \end{pmatrix}$ and $B = \begin{pmatrix} x' & -x' \\ y' & z' \end{pmatrix}$ be two elements of $W_1$ and let $c$ be a scalar. Then
\[ cA + B = \begin{pmatrix} cx + x' & -(cx + x') \\ cy + y' & cz + z' \end{pmatrix} = \begin{pmatrix} x'' & -x'' \\ y'' & z'' \end{pmatrix} \]
where $x'' = cx + x'$, $y'' = cy + y'$ and $z'' = cz + z'$. Thus $cA + B$ is in the form of an element of $W_1$. Thus $cA + B \in W_1$. By Theorem 1 (page 35) $W_1$ is a subspace.
Now let $A = \begin{pmatrix} a & b \\ -a & c \end{pmatrix}$ and $B = \begin{pmatrix} a' & b' \\ -a' & c' \end{pmatrix}$ be two elements of $W_2$ and let $k$ be a scalar. Then
\[ kA + B = \begin{pmatrix} ka + a' & kb + b' \\ -(ka + a') & kc + c' \end{pmatrix} = \begin{pmatrix} a'' & b'' \\ -a'' & c'' \end{pmatrix} \]
where $a'' = ka + a'$, $b'' = kb + b'$ and $c'' = kc + c'$. Thus $kA + B$ is in the form of an element of $W_2$. Thus $kA + B \in W_2$. By Theorem 1 (page 35) $W_2$ is a subspace.
(b) Let
\[ B_1 = \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix}, \quad B_2 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad B_3 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}. \]
Then $B_1, B_2, B_3 \in W_1$, and
\[ c_1B_1 + c_2B_2 + c_3B_3 = \begin{pmatrix} c_1 & -c_1 \\ c_2 & c_3 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \]
implies $c_1 = c_2 = c_3 = 0$. So $B_1$, $B_2$, $B_3$ are linearly independent. Now let $A = \begin{pmatrix} x & -x \\ y & z \end{pmatrix}$ be any element of $W_1$. Then $A = xB_1 + yB_2 + zB_3$. Thus $B_1$, $B_2$, $B_3$ span $W_1$. Thus $\{B_1, B_2, B_3\}$ form a basis for $W_1$. Thus $W_1$ has dimension three.
Let
\[ C_1 = \begin{pmatrix} 1 & 0 \\ -1 & 0 \end{pmatrix}, \quad C_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad C_3 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}. \]
Then $C_1, C_2, C_3 \in W_2$, and
\[ c_1C_1 + c_2C_2 + c_3C_3 = \begin{pmatrix} c_1 & c_2 \\ -c_1 & c_3 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \]
implies $c_1 = c_2 = c_3 = 0$. So $C_1$, $C_2$, $C_3$ are linearly independent. Now let $A = \begin{pmatrix} a & b \\ -a & c \end{pmatrix}$ be any element of $W_2$. Then $A = aC_1 + bC_2 + cC_3$. Thus $C_1$, $C_2$, $C_3$ span $W_2$. Thus $\{C_1, C_2, C_3\}$ form a basis for $W_2$. Thus $W_2$ has dimension three.
Let $V$ be the space of $2\times2$ matrices. We showed in Exercise 6 that $\dim V = 4$. Now $W_1 + W_2 \subseteq V$. Thus by Corollary 1, page 46, $\dim(W_1 + W_2) \leq 4$. Let $A = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}$. Then $A \in W_2$ and $A \notin W_1$, since its first row is not of the form $(x, -x)$. Thus $W_1 + W_2$ is strictly bigger than $W_1$. Thus $\dim(W_1 + W_2) > 3$. Thus $\dim(W_1 + W_2) = 4$.
Suppose $A$ is in $W_1 \cap W_2$. Then $A = \begin{pmatrix} x & -x \\ y & z \end{pmatrix}$ and $y = -x$. So $A = \begin{pmatrix} x & -x \\ -x & z \end{pmatrix}$. Let $D_1 = \begin{pmatrix} 1 & -1 \\ -1 & 0 \end{pmatrix}$, $D_2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$. Suppose $c_1D_1 + c_2D_2 = 0$. Then
\[ \begin{pmatrix} c_1 & -c_1 \\ -c_1 & c_2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}, \]
which implies $c_1 = c_2 = 0$. Thus $D_1$ and $D_2$ are linearly independent. Let $A = \begin{pmatrix} x & -x \\ -x & z \end{pmatrix} \in W_1 \cap W_2$. Then $A = xD_1 + zD_2$. So $D_1, D_2$ span $W_1 \cap W_2$. Thus $\{D_1, D_2\}$ is a basis for $W_1 \cap W_2$. Thus $\dim(W_1 \cap W_2) = 2$.
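The four dimensions can be cross-checked by flattening each $2\times2$ matrix into a $4$-vector and counting pivots. A sketch, assuming spanning sets for $W_1$ (matrices of the form $\begin{pmatrix} x & -x \\ y & z \end{pmatrix}$) and $W_2$ (matrices of the form $\begin{pmatrix} a & b \\ -a & c \end{pmatrix}$):

```python
from fractions import Fraction

def rank(rows):
    """Number of pivots under Gauss-Jordan elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivot = 0
    for col in range(len(M[0])):
        r = next((i for i in range(pivot, len(M)) if M[i][col] != 0), None)
        if r is None:
            continue
        M[pivot], M[r] = M[r], M[pivot]
        for i in range(len(M)):
            if i != pivot and M[i][col] != 0:
                f = M[i][col] / M[pivot][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[pivot])]
        pivot += 1
    return pivot

# 2x2 matrices flattened row-major: [[x,-x],[y,z]] -> (x,-x,y,z), etc.
W1 = [(1, -1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]   # spans W1
W2 = [(1, 0, -1, 0), (0, 1, 0, 0), (0, 0, 0, 1)]   # spans W2
print(rank(W1), rank(W2), rank(W1 + W2))  # 3 3 4
# and dim(W1 ∩ W2) = dim W1 + dim W2 - dim(W1 + W2) = 3 + 3 - 4 = 2
```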
Exercise 2.3.8
Let $V$ be the space of all $2\times2$ matrices. Let
\[ A_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad A_2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, \quad A_3 = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}, \quad A_4 = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}. \]
Then $A_j^2 = A_j$ for all $j$. Now
\[ c_1A_1 + c_2A_2 + c_3A_3 + c_4A_4 = \begin{pmatrix} c_1 + c_3 + c_4 & c_3 \\ c_4 & c_2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \]
implies $c_2 = c_3 = c_4 = 0$, which in turn implies $c_1 = 0$. Thus $A_1, A_2, A_3, A_4$ are linearly independent. Thus they span a subspace of $V$ of dimension four. But by Exercise 6, $V$ also has dimension four. Thus by Corollary 1, page 46, the subspace spanned by $A_1, A_2, A_3, A_4$ is the entire space. Thus $\{A_1, A_2, A_3, A_4\}$ is a basis.
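The idempotency requirement $A_j^2 = A_j$ is easy to verify by direct multiplication; the snippet below checks it for one workable choice of four such matrices (assumed here, not uniquely determined by the exercise).

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A1 = [[1, 0], [0, 0]]
A2 = [[0, 0], [0, 1]]
A3 = [[1, 1], [0, 0]]
A4 = [[1, 0], [1, 0]]
basis = [A1, A2, A3, A4]

# each candidate basis matrix is idempotent: A·A == A
print(all(matmul(A, A) == A for A in basis))  # True

# independence: c1*A1 + c2*A2 + c3*A3 + c4*A4 has entries
# [[c1 + c3 + c4, c3], [c4, c2]], which vanishes only when every ci is 0
```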
Exercise 2.3.9
Suppose $c_1(\alpha+\beta) + c_2(\beta+\gamma) + c_3(\gamma+\alpha) = 0$. Rearranging gives $(c_1 + c_3)\alpha + (c_1 + c_2)\beta + (c_2 + c_3)\gamma = 0$. Since $\alpha$, $\beta$, and $\gamma$ are linearly independent it follows that
\[ c_1 + c_3 = 0, \quad c_1 + c_2 = 0, \quad c_2 + c_3 = 0. \]
This gives a system of equations in $c_1, c_2, c_3$ with matrix
\[ \begin{pmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix}. \]
This row-reduces as follows:
\[ \begin{pmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 1 & 1 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 0 & 2 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}. \]
Since this row-reduces to the identity matrix, by Theorem 7, page 13, the only solution is $c_1 = c_2 = c_3 = 0$. Thus $(\alpha+\beta)$, $(\beta+\gamma)$, and $(\gamma+\alpha)$ are linearly independent.
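The row reduction of the coefficient matrix can be confirmed with a minimal Gauss-Jordan sketch over the rationals. Note that the last step divides by $2$, which is exactly why the argument needs a field in which $1 + 1 \neq 0$ (compare Exercise 13).

```python
from fractions import Fraction

def rref(rows):
    """Gauss-Jordan elimination over the rationals; returns the RREF."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivot = 0
    for col in range(len(M[0])):
        r = next((i for i in range(pivot, len(M)) if M[i][col] != 0), None)
        if r is None:
            continue
        M[pivot], M[r] = M[r], M[pivot]
        M[pivot] = [x / M[pivot][col] for x in M[pivot]]
        for i in range(len(M)):
            if i != pivot and M[i][col] != 0:
                f = M[i][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[pivot])]
        pivot += 1
    return M

# coefficient matrix of c1 + c3 = 0, c1 + c2 = 0, c2 + c3 = 0
C = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
R = [[int(x) for x in row] for row in rref(C)]
print(R)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```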
Exercise 2.3.10
The statement follows from Theorem 4 on page 44.
Exercise 2.3.11
(a) It is clear from inspection of the definition of a vector space (pages 28-29) that a vector space over a field $F$ is a vector space over every subfield of $F$, because all properties (e.g. commutativity and associativity) are inherited from the operations in $F$. Let $M$ be the vector space of all $2\times2$ matrices over $\mathbb{C}$ ($M$ is a vector space, see Example 2, page 29). We will show $V$ is a subspace of $M$ as a vector space over $\mathbb{R}$. It will follow from the comment above that $V$ is a vector space over $\mathbb{R}$. Now $V$ is a subset of $M$, so using Theorem 1 (page 35) we must show that whenever $A, B \in V$ and $c \in \mathbb{R}$ then $cA + B \in V$. Let $A, B \in V$. Write
\[ A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \quad \text{and} \quad B = \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix}, \]
where $a_{11} + a_{22} = 0$ and $b_{11} + b_{22} = 0$. Then
\[ cA + B = \begin{pmatrix} ca_{11} + b_{11} & ca_{12} + b_{12} \\ ca_{21} + b_{21} & ca_{22} + b_{22} \end{pmatrix}. \]
To show $cA + B \in V$ we must show $(ca_{11} + b_{11}) + (ca_{22} + b_{22}) = 0$. Rearranging the left hand side gives $c(a_{11} + a_{22}) + (b_{11} + b_{22})$, which equals zero since $a_{11} + a_{22} = 0$ and $b_{11} + b_{22} = 0$.
(b) We can write the general element of $V$ as
\[ A = \begin{pmatrix} x + iy & u + iv \\ s + it & -x - iy \end{pmatrix}, \qquad x, y, u, v, s, t \in \mathbb{R}. \]
Let
\[ M_1 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \; M_2 = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \; M_3 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \; M_4 = \begin{pmatrix} 0 & i \\ 0 & 0 \end{pmatrix}, \; M_5 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \; M_6 = \begin{pmatrix} 0 & 0 \\ i & 0 \end{pmatrix}. \]
Then $A = xM_1 + yM_2 + uM_3 + vM_4 + sM_5 + tM_6$, so $M_1$, $M_2$, $M_3$, $M_4$, $M_5$, $M_6$ span $V$. Suppose $c_1M_1 + c_2M_2 + c_3M_3 + c_4M_4 + c_5M_5 + c_6M_6 = 0$ with $c_1, \dots, c_6 \in \mathbb{R}$. Then
\[ \begin{pmatrix} c_1 + ic_2 & c_3 + ic_4 \\ c_5 + ic_6 & -c_1 - ic_2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \]
implies $c_1 = c_2 = \cdots = c_6 = 0$, because a complex number $x + iy$ with $x, y \in \mathbb{R}$ is zero only if $x = y = 0$. Thus $M_1$, $M_2$, $M_3$, $M_4$, $M_5$, $M_6$ are linearly independent. Thus $\{M_1, \dots, M_6\}$ is a basis for $V$ as a vector space over $\mathbb{R}$, and $\dim_{\mathbb{R}} V = 6$.
(c) Let $A, B \in W$ and $c \in \mathbb{R}$. By Theorem 1 (page 35) we must show $cA + B \in W$. Write $A_{21} = -\overline{A_{12}}$ and $B_{21} = -\overline{B_{12}}$, where the bar denotes complex conjugation. Then
\[ (cA + B)_{21} = cA_{21} + B_{21} = -c\overline{A_{12}} - \overline{B_{12}} = -\overline{cA_{12} + B_{12}} = -\overline{(cA + B)_{12}}. \]
Since $cA + B$ also lies in $V$, it follows that $cA + B \in W$. Note that we definitely need $c \in \mathbb{R}$ for this to be true, since $\overline{cA_{12}} = c\overline{A_{12}}$ requires $\bar{c} = c$.
It remains to find a basis for $W$. We can write the general element of $W$ as
\[ A = \begin{pmatrix} x + iy & u + iv \\ -u + iv & -x - iy \end{pmatrix}, \qquad x, y, u, v \in \mathbb{R}. \]
Let
\[ N_1 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \quad N_2 = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \quad N_3 = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}, \quad N_4 = \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix}. \]
Then $A = xN_1 + yN_2 + uN_3 + vN_4$, so $N_1$, $N_2$, $N_3$, $N_4$ span $W$. Suppose $c_1N_1 + c_2N_2 + c_3N_3 + c_4N_4 = 0$ with $c_1, \dots, c_4 \in \mathbb{R}$. Then
\[ \begin{pmatrix} c_1 + ic_2 & c_3 + ic_4 \\ -c_3 + ic_4 & -c_1 - ic_2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \]
implies $c_1 = c_2 = c_3 = c_4 = 0$, because a complex number $x + iy$ with $x, y \in \mathbb{R}$ is zero only if $x = y = 0$. Thus $N_1$, $N_2$, $N_3$, $N_4$ are linearly independent. Thus $\{N_1, N_2, N_3, N_4\}$ is a basis for $W$ as a vector space over $\mathbb{R}$, and $\dim_{\mathbb{R}} W = 4$.
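The two real dimensions can be double-checked by splitting each complex entry into real and imaginary parts, so each $2\times2$ complex matrix becomes a vector of $8$ real coordinates, and then counting pivots. The candidate bases used below (with $A_{11} + A_{22} = 0$ throughout, and $A_{21} = -\overline{A_{12}}$ for the smaller space) are assumptions matching one natural choice.

```python
from fractions import Fraction

def rank(rows):
    """Number of pivots under Gauss-Jordan elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivot = 0
    for col in range(len(M[0])):
        r = next((i for i in range(pivot, len(M)) if M[i][col] != 0), None)
        if r is None:
            continue
        M[pivot], M[r] = M[r], M[pivot]
        for i in range(len(M)):
            if i != pivot and M[i][col] != 0:
                f = M[i][col] / M[pivot][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[pivot])]
        pivot += 1
    return pivot

def realify(M):
    """Flatten a 2x2 complex matrix into 8 real coordinates."""
    return [part for row in M for z in row for part in (z.real, z.imag)]

V_basis = [[[1, 0], [0, -1]], [[1j, 0], [0, -1j]], [[0, 1], [0, 0]],
           [[0, 1j], [0, 0]], [[0, 0], [1, 0]], [[0, 0], [1j, 0]]]
W_basis = [[[1, 0], [0, -1]], [[1j, 0], [0, -1j]],
           [[0, 1], [-1, 0]], [[0, 1j], [1j, 0]]]

dimV = rank([realify(M) for M in V_basis])
dimW = rank([realify(M) for M in W_basis])
print(dimV, dimW)  # 6 4
```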
Exercise 2.3.12
Let $V$ be the space of all $m \times n$ matrices over $F$. Let $E^{pq}$ be the $m \times n$ matrix of all zeros except for the $p,q$-th place, which is a one. We claim the matrices $E^{pq}$, $1 \le p \le m$, $1 \le q \le n$, constitute a basis for $V$. Let $A$ be an arbitrary matrix in $V$. Then $A = \sum_{p,q} A_{pq}E^{pq}$. Thus the $E^{pq}$ span $V$. Suppose $\sum_{p,q} c_{pq}E^{pq} = 0$. The left hand side equals the matrix whose $p,q$-th entry is $c_{pq}$, and this equals the zero matrix if and only if every $c_{pq} = 0$. Thus the $E^{pq}$ are linearly independent as well. Thus the $mn$ matrices $E^{pq}$ constitute a basis and $V$ has dimension $mn$.
Exercise 2.3.13
If $F$ has characteristic two then $(\alpha+\beta) + (\beta+\gamma) + (\gamma+\alpha) = 2\alpha + 2\beta + 2\gamma = 0$, since in a field of characteristic two, $2 = 1 + 1 = 0$. Thus in this case $(\alpha+\beta)$, $(\beta+\gamma)$ and $(\gamma+\alpha)$ are linearly dependent. However any two of them are linearly independent. For example suppose $c_1(\alpha+\beta) + c_2(\beta+\gamma) = 0$. The LHS equals $c_1\alpha + (c_1 + c_2)\beta + c_2\gamma$. Since $\alpha$, $\beta$, $\gamma$ are linearly independent, this is zero only if $c_1 = 0$, $c_1 + c_2 = 0$ and $c_2 = 0$. In particular $c_1 = c_2 = 0$, so $(\alpha+\beta)$ and $(\beta+\gamma)$ are linearly independent.
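The characteristic-two computation can be mimicked with coordinate vectors mod 2: writing each of the three sums in the (independent) basis $\alpha, \beta, \gamma$ gives the coefficient vectors $(1,1,0)$, $(0,1,1)$, $(1,0,1)$ over $\mathbb{F}_2$.

```python
# coordinates of a+b, b+c, c+a with respect to the basis {a, b, c}, over GF(2)
v1, v2, v3 = (1, 1, 0), (0, 1, 1), (1, 0, 1)

# the three vectors sum to zero mod 2, so they are linearly dependent
print(tuple((x + y + z) % 2 for x, y, z in zip(v1, v2, v3)))  # (0, 0, 0)

# over GF(2) the only scalars are 0 and 1, so a non-trivial relation
# c1*v + c2*w = 0 between two non-zero vectors forces v == w; distinct
# non-zero vectors are therefore automatically independent
print(len({v1, v2, v3}) == 3)  # True
```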
Exercise 2.3.14
We know that $\mathbb{Q}$ is countable and $\mathbb{R}$ is uncountable. Since the set of $n$-tuples of elements from a countable set is countable, $\mathbb{Q}^n$ is countable for all $n$. Now, suppose $\{\alpha_1, \dots, \alpha_n\}$ is a finite basis for $\mathbb{R}$ over $\mathbb{Q}$. Then every element of $\mathbb{R}$ can be written as $c_1\alpha_1 + \cdots + c_n\alpha_n$ with $c_1, \dots, c_n \in \mathbb{Q}$. Thus we can map $n$-tuples of rational numbers onto $\mathbb{R}$ by $(c_1, \dots, c_n) \mapsto c_1\alpha_1 + \cdots + c_n\alpha_n$. Thus the cardinality of $\mathbb{R}$ must be less than or equal to the cardinality of $\mathbb{Q}^n$. But the former is uncountable and the latter is countable, a contradiction. Thus there can be no such finite basis.
From http://greggrant.org