Math 20F - Linear Algebra - Instructor: Sam Buss - Spring 2003

Lecture Topics

This is a synopsis of the lecture topics.   I will usually try to update these within 24 hours of the lecture, and will indicate the next lecture's topics so you can read ahead. 

It is strongly suggested that you keep up with the definitions and concepts as they are introduced in the course, because the material in this class builds on itself.  If you are unsure about the definitions or theorems or general concepts, you should be sure to clarify them (by consulting the text or fellow students or a TA or me).

Lecture 1, Monday, March 31, 2003.    Mostly section 1.1 from the textbook.
    Linear equation.
    Systems of linear equations.
    Coefficient matrix.
    Augmented matrix.
    Solution set (Set of solutions to a system of linear equations).
    Equivalent systems of equations.
    Solving by back substitution (introductory version).
    Upper triangular matrix.
    Elementary row operations, and their correspondence to operations on a system of linear equations.
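As a small illustration (not part of the original notes), back substitution on an upper triangular system can be sketched in Python; the matrix and right-hand side below are made-up example values:

```python
# Back substitution: solve Ux = b where U is upper triangular (illustrative 3x3 example).
U = [[2.0, 1.0, -1.0],
     [0.0, 3.0, 2.0],
     [0.0, 0.0, 4.0]]
b = [3.0, 7.0, 8.0]

n = len(U)
x = [0.0] * n
for i in range(n - 1, -1, -1):          # work upward from the last equation
    s = sum(U[i][j] * x[j] for j in range(i + 1, n))
    x[i] = (b[i] - s) / U[i][i]         # each diagonal entry must be nonzero

print(x)
```

The loop mirrors the hand procedure: solve the last equation first, then substitute upward.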

Lecture 2, Wednesday, April 2, 2003
    Examples of solving 2x2 systems of equations.
    Consistent and inconsistent systems of linear equations.
    Solution sets can be (a) empty, (b) have a unique solution, (c) have infinitely many solutions.
    Row echelon form.
    Any matrix can be converted into row echelon form.  (By Gaussian elimination.)
    Lead variables and free variables. 
    How to calculate the solution set by back substitution from row echelon form.
    Reduced Row Echelon Form (RREF).
    Any matrix can be converted to RREF.  (By Gauss-Jordan reduction procedure.)
    Overdetermined and underdetermined systems.
    A consistent system with free variables has infinitely many solutions.
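As an illustrative aside (my own sketch, not from the lecture), the Gauss-Jordan reduction to RREF can be coded in Python with exact fractions; the augmented matrix below is a made-up example:

```python
from fractions import Fraction

def rref(M):
    """Reduce a matrix to reduced row echelon form (Gauss-Jordan), returning a copy."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    r = 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if piv is None:
            continue                                # no pivot in this column
        A[r], A[piv] = A[piv], A[r]                 # row interchange
        pivot = A[r][c]
        A[r] = [x / pivot for x in A[r]]            # scale pivot row so the pivot is 1
        for i in range(rows):
            if i != r and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]   # clear column c
        r += 1
        if r == rows:
            break
    return A

# Augmented matrix of:  x + 2y = 5,  3x + 4y = 11   (unique solution x = 1, y = 2)
print(rref([[1, 2, 5], [3, 4, 11]]))
```

Using Fraction avoids floating-point roundoff, so the pivots come out exactly 1 and 0.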

Lecture 3, Friday, April 4, 2003
    Homogeneous equations.
    Matrices, row vectors, column vectors.
    A column vector is the transpose of a row vector.
    Scalar multiplication.
    Matrix addition.
    Matrix multiplication.
    Linear combination of vectors.
    Linear combination of column vectors.  Solvability of Ax=b.
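A small Python sketch (with illustrative values of my choosing) of the fact that Ax is the linear combination of the columns of A with weights from x:

```python
# Ax is a linear combination of the columns of A: Ax = x1*a1 + ... + xn*an.
A = [[1, 2],
     [3, 4],
     [5, 6]]
x = [2, -1]

# Standard row-by-row matrix-vector product.
Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

# The same vector, assembled column by column.
cols = [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]
combo = [sum(x[j] * cols[j][i] for j in range(len(x))) for i in range(len(A))]

assert Ax == combo
print(Ax)
```

This viewpoint is what makes solvability of Ax = b a question about the columns of A.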

Lecture 4, Monday, April 7, 2003
    Properties such as commutativity, associativity, distributivity.
    Matrix multiplication is not commutative.
    Associative law for matrix product, and its importance.
    Square matrices.
    Identity matrices.
    Invertible matrix, non-singular matrix.
    Uniqueness of the inverse A^(-1).
    If A and B are nonsingular, then (AB)^(-1) = B^(-1) A^(-1).
    Use of inverses to solve an n by n system of linear equations when the coefficient matrix is invertible.
    Transpose of a matrix. 
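A sketch in Python (example matrices are mine, and the 2x2 inverse helper is written just for this illustration) checking the rule (AB)^(-1) = B^(-1) A^(-1):

```python
from fractions import Fraction

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula (requires det != 0)."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

def mul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 5]]   # det = -1, nonsingular
B = [[2, 0], [1, 1]]   # det = 2, nonsingular

# The inverse of a product reverses the order of the factors.
assert inv2(mul2(A, B)) == mul2(inv2(B), inv2(A))
```

Exact fractions make the equality test reliable; with floats one would compare entries up to a tolerance.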

Lecture 5, Wednesday, April 9, 2003
    Row operations as matrices.
    The three kinds of elementary matrices.
    Forming EA is the same as performing a row operation on A (for E an elementary matrix).
    An algorithm to invert a matrix using row operations only.
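A quick numeric check (illustrative matrices of my own) that left-multiplying by an elementary matrix performs the corresponding row operation:

```python
# Multiplying on the left by an elementary matrix E performs a row operation on A.
# Here E adds 3 times row 0 to row 1 (E is the identity with a 3 in position (1,0)).
E = [[1, 0],
     [3, 1]]
A = [[1, 2],
     [4, 5]]

EA = [[sum(E[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# Same result as applying the row operation directly to A.
assert EA == [[1, 2], [4 + 3 * 1, 5 + 3 * 2]]
print(EA)
```

The inversion algorithm from lecture applies such operations until A becomes I, and the same operations turn I into A^(-1).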

Lecture 6, Friday, April 11, 2003
    More on transposes.
    Symmetric and skew-symmetric matrices.
    A big proof of several equivalent conditions for a square matrix to be non-singular.
    Diagonal, upper triangular and lower triangular matrices.
    LU decomposition.
    Block notation for matrices (section 1.5).
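As an illustration (my own sketch with a made-up matrix, assuming no zero pivots arise), Doolittle-style LU decomposition without row interchanges can be written as:

```python
from fractions import Fraction

def lu(A):
    """LU factorization without pivoting: A = LU with L unit lower triangular.
    Assumes no zero pivot is encountered (no row interchanges needed)."""
    n = len(A)
    U = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]              # multiplier that eliminates U[i][k]
            U[i] = [u - L[i][k] * v for u, v in zip(U[i], U[k])]
    return L, U

A = [[2, 1, 1],
     [4, 3, 3],
     [8, 7, 9]]
L, U = lu(A)

# Check the factorization: L times U recovers A.
n = len(A)
LU = [[sum(L[i][k] * U[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
assert LU == [[Fraction(x) for x in row] for row in A]
```

L simply records the multipliers used during Gaussian elimination, and U is the resulting upper triangular matrix.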

Handout for Lectures 7 and 8: Available in PDF and postscript formats.

Lecture 7, Monday, April 14, 2003
    Determinants of 2x2 matrices.
    Determinants of 3x3 matrices.
    Cofactor expansion definition of determinants, along any column or any row.
    Row operations of Type II and determinants.
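The cofactor-expansion definition translates directly into a short recursive function; this is my own illustrative sketch (fine for small matrices, though far too slow for large ones):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in A[1:]]   # delete row 0 and column j
        total += (-1) ** j * A[0][j] * det(minor)          # sign * entry * minor determinant
    return total

# 2x2 case: ad - bc
assert det([[1, 2], [3, 4]]) == -2
# 3x3 example
assert det([[1, 2, 3], [4, 5, 6], [7, 8, 10]]) == -3
```

Expanding along any other row or column gives the same value, which the next lecture exploits.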

Lecture 8, Wednesday, April 16, 2003
    Row operations and determinants.
    If a matrix has two equal rows, its determinant is zero.
    Determinants of upper and lower triangular matrices.
    Finding determinants by Gaussian elimination.
    Elementary matrices and their determinants.
    det(AB) equals  det(A)*det(B).
    det(A^T) equals  det(A).
    Cofactor expansion along any row.   Along any column.
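Finding determinants by Gaussian elimination can be sketched as follows (my own illustration, using exact fractions): reduce to triangular form, track sign flips from row swaps, and multiply the diagonal.

```python
from fractions import Fraction

def det_by_elimination(M):
    """Determinant via Gaussian elimination: reduce to upper triangular form,
    tracking sign changes from row interchanges, then multiply the diagonal."""
    A = [[Fraction(x) for x in row] for row in M]
    n, sign = len(M), 1
    for k in range(n):
        piv = next((i for i in range(k, n) if A[i][k] != 0), None)
        if piv is None:
            return 0                      # no pivot available: determinant is 0
        if piv != k:
            A[k], A[piv] = A[piv], A[k]   # a row interchange flips the sign
            sign = -sign
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            A[i] = [a - m * b for a, b in zip(A[i], A[k])]  # leaves the determinant unchanged
    prod = sign
    for k in range(n):
        prod *= A[k][k]                   # det of a triangular matrix = product of the diagonal
    return prod

assert det_by_elimination([[1, 2, 3], [4, 5, 6], [7, 8, 10]]) == -3
```

For an n x n matrix this takes on the order of n^3 operations, versus n! for naive cofactor expansion.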

Lecture 9, Friday, April 18, 2003
    Wrap up determinants.
    Vector spaces.
    Basic examples:
        R^2, R^n
        R^(n×m) (the vector space of n×m matrices)
    Formal definition of vector spaces.
        Closure conditions.  C1, C2.  Also, the redundant C3 and C4 (contains zero and is closed under negation).
        Axioms A1-A8.
    C[a,b] - Vector space of all continuous functions with domain [a,b].
    Definition of subspace.
    Thm: Any subset of a vector space closed under C1, C2 is a subspace.
    C^n[a,b] - Subspace of C[a,b] of functions with a continuous n-th derivative

Lecture 10, Monday, April 21, 2003
    P_n - vector space of all polynomials of degree less than n.
    P_n is isomorphic to R^n.
    Examples of subspaces and of subsets that are not subspaces.
    Example of a non-vector space.
    Nullspace, N(A).  The Nullspace is always a subspace.
    Span of a set of vectors.  A Span is always a subspace.

Lecture 11, Wednesday, April 23, 2003
    Span of a set of vectors.
    A spanning set for a vector space.
    Various examples.
    Span(u_1, ..., u_n) is a subset of Span(u_1, ..., u_(n+1)).
    They are equal if and only if u_(n+1) is in Span(u_1, ..., u_n).

Lecture 12, Friday, April 25, 2003
    How to determine if a given set of vectors is a spanning set for R^n.
    Linear dependence.
    Linear independence.
    Two nonzero vectors are linearly dependent iff one is a scalar multiple of the other.
    How to determine linear (in)dependence by looking for non-trivial solution of homogeneous equation.
    Fewer than n vectors cannot be a spanning set for R^n.
    A set of n vectors spans R^n iff the matrix with those vectors as columns is non-singular.
    We are skipping the material on Wronskians (pp. 152-153).
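The non-singularity test above can be checked numerically; this is my own small illustration (the vectors are made up, and det3 is a helper written just for it):

```python
def det3(A):
    """3x3 determinant (cofactor expansion along the first row)."""
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
          - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
          + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

# n vectors in R^n are linearly independent (and span R^n) iff the matrix
# having them as columns is nonsingular, i.e. has nonzero determinant.
v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 0]
M = [[v1[i], v2[i], v3[i]] for i in range(3)]   # vectors as columns
assert det3(M) != 0                             # independent: Mx = 0 only trivially

w3 = [1, 1, 2]                                  # w3 = v1 + v2, so this set is dependent
N = [[v1[i], v2[i], w3[i]] for i in range(3)]
assert det3(N) == 0
```

A zero determinant signals a non-trivial solution of the homogeneous system, i.e. a dependence relation among the columns.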

Day 13, Monday, April 28, 2003    MIDTERM EXAM TODAY
    Midterm instead of lecture today.

Lecture 14, Wednesday, April 30, 2003
    Definition of a basis.
    Lemma about size of linearly independent sets (Theorem 3.4.1 on page 157).
    Definition of the dimension of a vector space.
    Examples.
    In vector space of dimension n, any n linearly independent vectors form a basis.
    Any spanning set contains a basis as a subset.
    Any linearly independent set can be extended to a basis.
    How to determine if n vectors in Rn form a basis.

Lecture 15, Friday, May 2, 2003
    Finish up material on bases and dimension.
    Proof of important lemma which is foundation for dimension.
    n linearly independent vectors span a subspace of dimension n.
    Change of basis.  From a basis to the standard basis and vice-versa.
    Between two arbitrary bases.
    Example with P_3.
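A worked change-of-basis computation in R^2 (my own illustrative basis, not the lecture's example): the transition matrix from a basis {u1, u2} to the standard basis has u1 and u2 as columns, and the reverse direction uses its inverse.

```python
from fractions import Fraction

u1, u2 = [1, 1], [1, -1]
U = [[1, 1],
     [1, -1]]          # columns are u1 and u2

# From U-coordinates c to standard coordinates x:  x = U c
c = [3, 2]
x = [U[0][0] * c[0] + U[0][1] * c[1],
     U[1][0] * c[0] + U[1][1] * c[1]]
assert x == [5, 1]     # 3*u1 + 2*u2 = (5, 1)

# Back again: apply the inverse of U (det U = -2).
det = Fraction(U[0][0] * U[1][1] - U[0][1] * U[1][0])
Uinv = [[ U[1][1] / det, -U[0][1] / det],
        [-U[1][0] / det,  U[0][0] / det]]
c_back = [Uinv[0][0] * x[0] + Uinv[0][1] * x[1],
          Uinv[1][0] * x[0] + Uinv[1][1] * x[1]]
assert c_back == c
```

Between two arbitrary bases, one composes these two steps: into standard coordinates and back out.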

Lecture 16, Monday, May 5, 2003
    Row space and column space.
    Rank of a matrix.
    Row space and column space have the same dimension.
    Nullity of a matrix (dimension of the null space).
    How to find a basis for the row space.
    How to find a basis for the column space.
    Row space of A is orthogonal to the Null space of A.

Handout for Lecture 17 (Sections 4.1 and 4.2): available in PDF and postscript formats.

Lecture 17, Wednesday, May 7, 2003
    Norm.  Dot product (also known as scalar product or inner product).
    Linear transformation.  Definition and basic properties.
    Various examples.
    Projection to a line specified by a unit vector.
    Matrix representation of a linear transformation.
    Examples again.
    Theorem about how to represent a linear transformation with a matrix.
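Projection onto a line through a unit vector is a handy concrete example of a linear transformation; here is my own small sketch (vector values are illustrative):

```python
import math

# Projection of x onto the line spanned by a unit vector u:  p = (x . u) u.
# This map is linear; its matrix is u u^T.
x = [3.0, 4.0]
u = [1.0 / math.sqrt(2), 1.0 / math.sqrt(2)]   # unit vector along the line y = x

dot = x[0] * u[0] + x[1] * u[1]                # x . u
p = [dot * u[0], dot * u[1]]

# The projection of (3, 4) onto the line y = x is (3.5, 3.5), up to roundoff.
assert all(abs(pi - 3.5) < 1e-12 for pi in p)
```

Checking where a transformation sends the standard basis vectors is exactly how its matrix representation is built.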

Lecture 18, Friday, May 9, 2003
    Matrix representation of a rotation.
    Matrix representation of a cross product.
    Skew-symmetric matrices.
    Matrix representation of a projection.
    Symmetric matrices.
    Cauchy-Schwarz inequality.
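The rotation matrix is worth seeing in code; this is a minimal sketch of my own (the quarter-turn example is illustrative):

```python
import math

def rotation(theta):
    """Matrix of counterclockwise rotation by theta about the origin in R^2."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

R = rotation(math.pi / 2)       # quarter turn
x = [1.0, 0.0]                  # the standard basis vector e1
Rx = [R[0][0] * x[0] + R[0][1] * x[1],
      R[1][0] * x[0] + R[1][1] * x[1]]

# Rotating e1 by 90 degrees gives (0, 1), up to floating-point roundoff.
assert abs(Rx[0]) < 1e-12 and abs(Rx[1] - 1.0) < 1e-12
```

The columns of R are the images of e1 and e2, matching the general recipe for representing a linear transformation by a matrix.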

Lecture 19, Monday, May 12, 2003
    Orthogonal vectors.
    Orthogonal subspaces.
    Examples of defining planes in R^3.
    Row space of A is orthogonal to the Null space of A.
    Orthogonal complement.
    Range, R(A), of a matrix.
    Pictures of N(A), R(A^T), and R(A).
    Thm: The orthogonal complement of a subspace of dimension d has dimension n-d.
    Thm: The orthogonal complement of the orthogonal complement of U is U.
    Thm: Any vector can be written uniquely as x = u + v where u is in U and v is in the orthogonal complement of U.
    Proofs of theorems postponed until Wednesday lecture.

Lecture 20, Wednesday, May 14, 2003
    Several theorems on orthogonal complements.
    N(A) is the orthogonal complement of R(A^T).
    N(A^T) is the orthogonal complement of R(A).
    dim(complement of S) + dim(S) = n.
    S is the orthogonal complement of the orthogonal complement of S.
    Any x is uniquely writable as sum of u in U and v in the orthogonal complement of U.
    How to find a basis for the orthogonal complement.

Lecture 21, Friday, May 16, 2003
    A consistent system with nontrivial null space has infinitely many solutions.
    Least squares problems.
    How to find p in R(A) that is closest to b.
    How to solve a least squares problem.
    Example of finding projection onto a subspace.
    Best line fitting data example.
    Best quadratic curve fitting data example.
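A worked line-fitting example (data points are made up for illustration): the normal equations A^T A c = A^T y reduce to a 2x2 system when fitting y = c0 + c1*t.

```python
from fractions import Fraction

# Best-fit line y = c0 + c1*t for data (t_i, y_i), via the normal equations
# with A having rows [1, t_i].  Illustrative data:
ts = [0, 1, 2, 3]
ys = [1, 3, 4, 4]

n = len(ts)
# Entries of the 2x2 matrix A^T A and the right-hand side A^T y.
s0, s1, s2 = n, sum(ts), sum(t * t for t in ts)
b0, b1 = sum(ys), sum(t * y for t, y in zip(ts, ys))

# Solve [[s0, s1], [s1, s2]] [c0, c1] = [b0, b1] by Cramer's rule.
det = Fraction(s0 * s2 - s1 * s1)
c0 = (b0 * s2 - b1 * s1) / det
c1 = (s0 * b1 - s1 * b0) / det
print(c0, c1)   # intercept and slope of the least-squares line
```

Fitting a quadratic works the same way, with rows [1, t_i, t_i^2] and a 3x3 normal system.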

Lecture 22, Monday, May 19, 2003
    If A is m×n and has rank n, then A^T A is invertible.
    Quick discussion of inner product spaces and their properties.
    Orthogonal sets of vectors.
    Orthonormal sets of vectors.
    Finding coordinates of x relative to an orthonormal set of vectors.
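With an orthonormal basis the coordinates come straight from dot products, with no system to solve; here is my own small illustration:

```python
import math

# If {u1, u2} is an orthonormal basis, then x = (x . u1) u1 + (x . u2) u2.
r = 1.0 / math.sqrt(2)
u1 = [r, r]
u2 = [r, -r]
x = [3.0, 1.0]

c1 = x[0] * u1[0] + x[1] * u1[1]     # coordinate: x . u1
c2 = x[0] * u2[0] + x[1] * u2[1]     # coordinate: x . u2
recon = [c1 * u1[0] + c2 * u2[0],
         c1 * u1[1] + c2 * u2[1]]

# Reassembling from the coordinates recovers x, up to roundoff.
assert all(abs(a - b) < 1e-12 for a, b in zip(recon, x))
```

This shortcut is what makes orthonormal bases so convenient in the least squares material that follows.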

Lecture 23, Wednesday, May 21, 2003
    Finding the projection of x onto a subspace spanned by a set of orthogonal vectors.
    Orthogonal matrices.
    Rigid transformation intuition.
    A matrix Q has orthonormal columns iff  Q^T Q = I.
    The transpose of an orthogonal matrix is orthogonal.
    Least squares problems with matrices that have orthonormal columns.
    Projection of b onto the subspace spanned by a set of orthonormal vectors is p = Q Q^T b.

Day 24, Friday, May 23, 2003 - MIDTERM EXAM #2 TODAY
    Midterm instead of lecture today.

Lecture 25, Wednesday, May 28, 2003
    Modified Gram-Schmidt method for orthonormalization.
    Eigenvalue
    Eigenvector
    Motivating example (from section 6.1).
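A sketch of the modified Gram-Schmidt procedure (my own illustration with made-up input vectors, assumed linearly independent): each new orthonormal vector is immediately subtracted out of all remaining vectors, which improves numerical behavior over the classical version.

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def mgs(vectors):
    """Modified Gram-Schmidt: orthonormalize linearly independent vectors."""
    vs = [list(v) for v in vectors]
    q = []
    for j in range(len(vs)):
        qj = normalize(vs[j])
        q.append(qj)
        for k in range(j + 1, len(vs)):          # immediately orthogonalize the rest
            dot = sum(a * b for a, b in zip(qj, vs[k]))
            vs[k] = [a - dot * b for a, b in zip(vs[k], qj)]
    return q

q = mgs([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
# The output vectors are orthonormal, up to roundoff.
d = sum(a * b for a, b in zip(q[0], q[1]))
assert abs(d) < 1e-12
assert abs(sum(x * x for x in q[0]) - 1.0) < 1e-12
```

Mathematically the classical and modified versions produce the same basis; they differ only in floating-point behavior.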

Lecture 26, Friday, May 30, 2003
    Eigenvalue.
    Eigenvector.
    Eigenspace.
    Characteristic polynomial.
    How to compute eigenvalues and eigenvectors.
    Trace of a matrix.
    Sums and products of eigenvalues and the characteristic polynomial.
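For a 2x2 matrix the characteristic polynomial is quadratic, so the eigenvalues can be computed directly; this is my own illustrative example:

```python
import math

# Eigenvalues of a 2x2 matrix from its characteristic polynomial:
#   lambda^2 - tr(A) lambda + det(A) = 0.
A = [[4, 1],
     [2, 3]]
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

disc = tr * tr - 4 * det
lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2

# The sum of the eigenvalues is the trace; their product is the determinant.
assert abs(lam1 + lam2 - tr) < 1e-12
assert abs(lam1 * lam2 - det) < 1e-12
print(lam1, lam2)
```

For this matrix tr = 7 and det = 10, giving eigenvalues 5 and 2.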

Lecture 27, Monday, June 2, 2003
    Sums and products of eigenvalues and the characteristic polynomial (Again).
    Similarity.
    Diagonalizability.

Lecture 28, Wednesday, June 4, 2003
    Similarity.
    Diagonalizability.
    Distinct eigenvalues have linearly independent eigenvectors.
    If n linearly independent eigenvectors, then diagonalizable.
    Defective matrices.
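A diagonalization check (my own sketch; the matrix, its eigenvalues 5 and 2, and eigenvectors (1,1) and (1,-2) are an illustrative example): with eigenvectors as the columns of X and eigenvalues on the diagonal of D, we have A = X D X^(-1).

```python
from fractions import Fraction

A = [[4, 1], [2, 3]]
X = [[1, 1], [1, -2]]          # eigenvector columns
D = [[5, 0], [0, 2]]           # matching eigenvalues on the diagonal

det = Fraction(X[0][0] * X[1][1] - X[0][1] * X[1][0])
Xinv = [[ X[1][1] / det, -X[0][1] / det],
        [-X[1][0] / det,  X[0][0] / det]]

def mul(P, Q):
    """Product of two 2x2 matrices."""
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# X D X^-1 reconstructs A exactly (Fraction arithmetic).
assert mul(mul(X, D), Xinv) == A
```

A defective matrix has too few linearly independent eigenvectors, so no such invertible X exists.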

Lecture 29, Friday, June 6, 2003 - LAST DAY OF CLASS!
    Continue reading in sections 6.1 and 6.3.