
[Thinking Cap] Directed review of linear algebra



Your ability to appreciate the finer points of the following lectures depends on your background in linear algebra. To that end, here is a directed review of linear algebra.

0. Convince yourself that matrix multiplication is essentially a bunch of dot products between the row vectors of one matrix and the column vectors of the other matrix. This should also give another reason why the inner dimensions of the matrices must match before you can multiply them: you can't define the dot product between two vectors of differing dimensions.
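
If you want to see this concretely, here is a small numpy sketch (the matrices A and B are just made-up examples for illustration):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])      # 2x3
    B = np.array([[7, 8],
                  [9, 10],
                  [11, 12]])       # 3x2

    # Entry (i, j) of A @ B is the dot product of row i of A with column j of B.
    C = np.empty((A.shape[0], B.shape[1]))
    for i in range(A.shape[0]):
        for j in range(B.shape[1]):
            C[i, j] = np.dot(A[i, :], B[:, j])

    assert np.allclose(C, A @ B)   # matches numpy's built-in product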

1. Remember the notion of linear dependence. A vector is considered linearly dependent on another set of vectors if it can be written as a linear combination of that set, i.e., as c1*v1 + c2*v2 + ... + ck*vk, where the vi are the vectors and the ci are scalar constants. Convince yourself that the vector <6, 6> is linearly dependent on the vectors <2, 1> and <1, 2>.
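
A quick sanity check in numpy (just one way to find the coefficients, by solving a 2x2 linear system):

    import numpy as np

    # Columns are the vectors <2, 1> and <1, 2>; solve A c = <6, 6> for c.
    A = np.column_stack([[2, 1], [1, 2]])
    c = np.linalg.solve(A, np.array([6, 6]))
    print(c)    # [2. 2.]  -->  <6, 6> = 2*<2, 1> + 2*<1, 2>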

2. Remember the notion of a space spanned by a set of vectors. Given a set S of vectors, the space spanned by S is the set of all vectors that can be written as a linear combination of the vectors in S.
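
One illustrative way to test span membership numerically is least squares (the vectors below are my own example): if the best linear combination reproduces the target exactly, the target is in the span.

    import numpy as np

    # Is <3, 3, 0> in the span of <1, 0, 0> and <0, 1, 0>?
    S = np.column_stack([[1, 0, 0], [0, 1, 0]])
    target = np.array([3, 3, 0])
    coeffs, *_ = np.linalg.lstsq(S, target, rcond=None)
    print(coeffs)                            # [3. 3.]
    print(np.allclose(S @ coeffs, target))   # True -> in the span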

3. Remember the notion of linear independence. A set of vectors is linearly independent if none of the vectors in that set can be written as a linear combination of the rest of the vectors.
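
A handy numerical test (foreshadowing point 7 below): a set of vectors is linearly independent exactly when the matrix whose columns are those vectors has rank equal to the number of vectors.

    import numpy as np

    V = np.column_stack([[2, 1], [1, 2]])
    print(np.linalg.matrix_rank(V) == V.shape[1])   # True -> independent

    W = np.column_stack([[2, 1], [1, 2], [6, 6]])
    print(np.linalg.matrix_rank(W) == W.shape[1])   # False -> dependent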

4. Remember the notion of "basis" for a space. A set is considered a basis for a space if (a) the set is linearly independent and (b) every vector in that space can be written as a linear combination of the basis vectors. The "dimensionality" of a space is the size of its basis set.
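
For instance, the standard basis of 3-D Euclidean space (a sketch, using numpy again):

    import numpy as np

    # Columns e1, e2, e3: independent, and every <x, y, z> is x*e1 + y*e2 + z*e3,
    # so the dimensionality of the space is 3.
    E = np.eye(3)
    print(np.linalg.matrix_rank(E))   # 3
    v = np.array([4.0, -1.0, 2.0])
    print(np.allclose(E @ v, v))      # the coefficients are just the components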

5. Remember the notion of "orthogonal basis" for a space--which is a set of vectors that is a basis *and* whose vectors are orthogonal to each other (i.e., their pairwise dot product vi*vj is 0 if i != j).
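
Note that all the pairwise dot products of a set of column vectors show up at once in V^T V: the off-diagonal entries. An illustrative check (the basis below is my own example):

    import numpy as np

    V = np.column_stack([[1, 1], [1, -1]])   # orthogonal, but not unit length
    G = V.T @ V
    print(G)                                  # off-diagonal entries are 0
    print(np.allclose(G - np.diag(np.diag(G)), 0))   # True -> orthogonal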

6. Remember the notion of "orthonormal basis" for a space--which is a set of vectors that forms an orthogonal basis *and* whose vectors are all unit vectors.
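
In matrix terms, if the columns of Q form an orthonormal basis then Q^T Q is the identity (the example below just normalizes the orthogonal basis from point 5):

    import numpy as np

    Q = np.column_stack([[1, 1], [1, -1]]) / np.sqrt(2)
    print(np.allclose(Q.T @ Q, np.eye(2)))   # True -> orthonormal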

7. Remember that the row rank of a matrix is the size of the basis set of the space spanned by the row vectors of the matrix, and the column rank is the size of the basis of the space spanned by the column vectors. The row and column ranks of a matrix are always equal, and this common number is the rank of the matrix. The matrix is considered "full rank" if its rank is equal to both its number of rows and its number of columns (so a full-rank matrix has to be a square matrix).
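
numpy reports this common value directly (the matrices below are made up to illustrate):

    import numpy as np

    # Row rank always equals column rank.
    M = np.array([[2, 1, 6],
                  [1, 2, 6]])
    print(np.linalg.matrix_rank(M))   # 2: the 2 rows are independent,
                                      # but the 3 columns are not

    F = np.array([[2, 1],
                  [1, 2]])
    print(np.linalg.matrix_rank(F))   # 2 = #rows = #cols -> full rank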

Now use this knowledge to answer the following to yourself (feel free to enter answers on the blog). 


Q1. For three-dimensional Euclidean space, is the set [<1, 0, 0>, <0, 1, 0>] linearly independent? Is it a basis?

Q2. For two-dimensional Euclidean space, is the set [<1, 1>, <1, 2>] linearly independent? Is it a basis? Is it an orthogonal basis? Is it an orthonormal basis?

Q3. For two-dimensional Euclidean space, how many different basis sets can you get? How many of them are orthogonal bases? How many of them are orthonormal bases?

Q4: If a matrix is full rank, then are its column vectors linearly independent? How about its row vectors? Are they also orthogonal?


That is it for now..

Rao




