**Math 20F - Linear Algebra**
**Instructor: Sam Buss - Winter 2003**

**Lecture Topics**

This is a synopsis of the lecture topics. I will try to update these within 24 hours of the lecture, and will indicate the next lecture's topics so you can read ahead.

It is **strongly suggested** that you keep up with the definitions and concepts as they are introduced, because the material in this course builds on itself. If you are unsure about any definitions, theorems, or general concepts, be sure to clarify them (by consulting the text, fellow students, a TA, or me).

**Lecture 1, Monday, January 6, 2003.**
Mostly section 1.1 from the textbook.

Linear equation.

Systems of linear equations.

Coefficient matrix.

Augmented matrix.

Solution set (the set of solutions to a system of linear equations).

Equivalent systems of equations.

Solving by back substitution (introductory version).

Upper triangular matrix.

Elementary row operations, and their correspondence to
operations on a system of linear equations.

**Lecture 2, Wednesday, January 8, 2003**

Examples of solving 2x2 systems of equations.

Consistent and inconsistent systems of linear equations.

Solution sets can be (a) empty, (b) have a unique solution,
(c) have infinitely many solutions.

Row echelon form.

Any matrix can be converted into row echelon form (by **Gaussian elimination**).

Lead variables and free variables.

How to calculate the solution set by back substitution from
row echelon form.

Reduced Row Echelon Form (RREF).

Any matrix can be converted to RREF (by the *Gauss-Jordan reduction* procedure).

Overdetermined and underdetermined systems.

A consistent underdetermined system has infinitely many solutions. Proof: there is at least one free variable in the (reduced) row echelon form.
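
A sketch of Gauss-Jordan reduction in Python may help make the lead/free variable distinction concrete (an illustration, not the textbook's algorithm; the example system is made up):

```python
# Gauss-Jordan reduction to reduced row echelon form (RREF) -- a sketch.
# Lead variables sit in pivot columns; all other columns are free variables.
from fractions import Fraction

def rref(M):
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots = []
    r = 0
    for c in range(cols):
        # find a row at or below r with a nonzero entry in column c
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:
            continue                        # no pivot here -> free variable
        A[r], A[pr] = A[pr], A[r]           # row interchange (type I)
        piv = A[r][c]
        A[r] = [x / piv for x in A[r]]      # scale pivot row (type II)
        for i in range(rows):               # eliminate above and below (type III)
            if i != r and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * p for a, p in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

# A consistent underdetermined system: 2 equations, 3 unknowns
A, pivots = rref([[1, 2, 1, 4],
                  [2, 4, 3, 9]])
print(pivots)   # pivot (lead) columns; the remaining variable is free
```

On this example the pivot columns are 0 and 2, so the middle variable is free, and the system has infinitely many solutions, matching the proof sketched above.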

**Lecture 3, Friday, January 10, 2003**

Homogeneous equations.

Matrices, row vectors, column vectors.

Scalar multiplication.

Matrix addition.

Matrix multiplication.

Properties such as commutativity, associativity,
distributivity.

Matrix multiplication is not commutative.
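
Non-commutativity is easy to see with a tiny numeric check (an illustrative sketch; the two matrices are made-up examples):

```python
# Matrix multiplication is not commutative: a quick 2x2 check.

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 1],
     [0, 1]]
B = [[1, 0],
     [1, 1]]
print(matmul(A, B))   # [[2, 1], [1, 1]]
print(matmul(B, A))   # [[1, 1], [1, 2]]  -- AB != BA
```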

**Lecture 4, Monday, January 13, 2003**

Associative law for matrix product, and its importance.

Square matrices.

Identity matrices.

Invertible matrix, non-singular matrix.

Uniqueness of the inverse A^{-1}.

If A and B are nonsingular, then (AB)^{-1} = B^{-1}A^{-1}.

Linear combination of column vectors. Solvability of A**x**=**b**.

Use of inverses to solve an n by n system of linear equations when the coefficient matrix is invertible.

Transpose of a matrix.
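
For 2x2 matrices the inverse has a closed form, which makes both points above easy to check numerically (a sketch with made-up matrices; exact arithmetic via `Fraction`):

```python
# A^{-1} = (1/det A) [[d, -b], [-c, a]]  for A = [[a, b], [c, d]].
# Sketch: solve Ax = b via x = A^{-1} b, and check (AB)^{-1} = B^{-1} A^{-1}.
from fractions import Fraction

def inv2(A):
    (a, b), (c, d) = A
    det = Fraction(a * d - b * c)
    assert det != 0, "matrix is singular"
    return [[d / det, -b / det], [-c / det, a / det]]

def mul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1], [5, 3]]
B = [[1, 2], [3, 7]]
print(apply(inv2(A), [1, 2]))                         # x = A^{-1}b solves Ax = b
print(mul2(inv2(B), inv2(A)) == inv2(mul2(A, B)))     # True
```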

**Lecture 5, Wednesday, January 15, 2003**

Row operations as matrices.

The three kinds of elementary matrices.

A big proof of several equivalent conditions for a square matrix to be non-singular.

Abstract discussion of how this gives an algorithm to invert
a matrix using row operations only.

Concrete examples had to wait until the next lecture due to
lack of time.

**Lecture 6, Friday, January 17, 2003**

Inverting a matrix with row operations (Gauss-Jordan
reduction).

Diagonal, upper triangular and lower triangular matrices.

LU decomposition.

Block notation for matrices (section 1.5).

Introduction to determinants.

Determinant of 2x2 matrices.

**Martin Luther King Day, Monday, January 20, 2003.**
No lecture - holiday.

**Lecture 7, Wednesday, January 22, 2003**

Determinants of 3x3 matrices.

Determinants of general matrices.

Cofactor expansion along any row. Along any
column.

Determinants of upper and lower triangular matrices.

Row operations of types 1 and 2 and their effect on
determinants.

A theorem about adding rows (or columns).

If a matrix has two equal rows (or two equal columns), its
determinant is zero.

**Handout for Lectures 7 and 8:** Available in PDF and postscript formats.

Professor Bender has prepared similar notes, also available in PDF and postscript formats.

**Lecture 8, Friday, January 24, 2003**

Row operations, column operations and determinants.

Elementary matrices and their determinants.

Finding determinants by Gaussian elimination.

det(AB) equals det(A)*det(B).

det(A^{T}) equals det(A).

det(A)=0 if and only if A is singular. Also, if and
only if A**x** = **0** has a non-trivial solution.

Also, if and only if the (reduced) row echelon form of A has a free variable (and hence a row of all zeroes).

**Lecture 9, Monday, January 27, 2003**

Vector spaces

Basic examples:

**R**^{2}, **R**^{n}

**R**^{n×m}

Formal definition of vector spaces.

Closure conditions C1 and C2.
Also the redundant conditions C3 and C4 (contains the zero vector and is closed under negation).

Axioms A1-A8.

More abstract examples:

P_{n} - vector space of all
polynomials of degree less than n.

C[a,b] - Vector space of all
continuous functions with domain [a,b].

**Lecture 10, Wednesday, January 29, 2003**

Example of a non-vector space, where A6 fails.

Subspace. Proper subspace.

For a subset to be a subspace, it is sufficient that the
closure conditions are satisfied.

Examples involving R^{2}, P_{n}, C[a,b] and C^{n}[a,b], and the solution set of a differential equation.

**Lecture 11, Friday, January 31, 2003**

Nullspace, N(A).

Span of a set of vectors.

A spanning set for a vector space.

How to determine if a given set of vectors is a spanning set
for R^{n}.

**Lecture 12, Monday, February 3, 2003**

Linear dependence.

Linear independence.

How to determine linear (in)dependence by looking for
non-trivial solution of homogeneous equation.

We are skipping the material on Wronskians (pp. 152-153).

**Day 13, Wednesday, February 5, 2003.**
No lecture - Midterm examination.

**Lecture 14, Friday, February 7, 2003**

Definition of a basis.

Lemma about size of linearly independent sets (Theorem 3.4.1
on page 157).

Definition of the dimension of a vector space.

Examples.

In vector space of dimension n, any n linearly independent
vectors form a basis.

How to determine if n vectors in R^{n} form a basis.
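
That test can be sketched in code: put the vectors in a matrix as columns and check that the matrix is nonsingular, e.g. via a nonzero determinant (an illustration with made-up vectors; the recursive determinant here is cofactor expansion):

```python
# Do n vectors form a basis of R^n?  They do iff the matrix having them as
# columns is nonsingular (sketch).

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(len(A)))

def is_basis(vectors):
    n = len(vectors)
    if any(len(v) != n for v in vectors):
        return False                              # need exactly n vectors in R^n
    cols = [list(col) for col in zip(*vectors)]   # place vectors as columns
    return det(cols) != 0

print(is_basis([[1, 0, 0], [1, 1, 0], [1, 1, 1]]))   # True
print(is_basis([[1, 2, 3], [2, 4, 6], [0, 1, 0]]))   # False: v2 = 2*v1
```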

**Lecture 15, Monday, February 10, 2003**

Finish up material on bases and dimension.

Change of basis. From a basis to the standard basis and
vice-versa.

Between two arbitrary bases.

**Lecture 16, Wednesday, February 12, 2003**

Row space and column space.

Rank of a matrix.

Row space and column space have the same dimension.

Nullity of a matrix (dimension of the null space).

**Lecture 17, Friday, February 14, 2003**

Norm. Dot product (also known as scalar product or
inner product).

Linear transformation. Definition and basic properties.

Various examples.

Projection to a line specified by a unit vector.

Matrix representation of a linear transformation.

Examples again.

Theorem about how to represent a linear transformation with a
matrix.

**Handout for parts of lectures 17 and 18:
**"Linear Transformations and Matrix Representations,"
available in PDF and
postscript formats.

**Monday, February 17. No class.
President's Day Holiday.**

**Lecture 18, Wednesday, February 19, 2003**

Examples of linear transformations.

Matrix representation of a rotation.

Matrix representation of a cross product.

Skew-symmetric matrices.

Matrix representation of a projection.

Symmetric matrices.

Cauchy-Schwarz inequality.

Triangle inequality.

Orthogonal vectors.

**Lecture 19, Friday, February 21, 2003**

Orthogonal vectors.

Orthogonal subspaces.

If X and Y are orthogonal subspaces, then their intersection contains only the zero vector.

Examples of defining planes in R^{3}.

Row space of A is orthogonal to the Null space of A.

Examples and a non-example.

**Lecture 20, Monday, February 24, 2003**

Orthogonal complement.

Range, R(A), of a matrix.

Null space of A is the orthogonal complement of R(A^{T}).

Big theorem on orthogonal complements.

S is the orthogonal complement of the orthogonal complement of S.

How to find a basis for the orthogonal complement.

**Lecture 21, Wednesday, February 26, 2003**

A consistent system with nontrivial null space has infinitely
many solutions.

Least squares problems.

How to find **p** in R(A) that is closest to **b**.

How to solve a least squares problem.

If A is m×n and has rank n, then A^{T}A is
invertible.

Best line fitting data example.

Best quadratic curve fitting data example.
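
The line-fitting example boils down to solving the 2x2 normal equations A^{T}A c = A^{T}b, which can be sketched as (illustrative code with made-up data points):

```python
# Least squares line fit (sketch): to fit y = c0 + c1*x, the rows of A are
# (1, x_i), and we solve the normal equations A^T A c = A^T b.
from fractions import Fraction

def lstsq_line(xs, ys):
    n = len(xs)
    # entries of the 2x2 matrix A^T A and the vector A^T b
    Sx, Sxx = sum(xs), sum(x * x for x in xs)
    Sy, Sxy = sum(ys), sum(x * y for x, y in zip(xs, ys))
    det = Fraction(n * Sxx - Sx * Sx)
    c0 = (Sxx * Sy - Sx * Sxy) / det      # Cramer's rule on the 2x2 system
    c1 = (n * Sxy - Sx * Sy) / det
    return c0, c1

# Points lying exactly on y = 1 + 2x are recovered exactly
print(lstsq_line([0, 1, 2], [1, 3, 5]))
```

When A has rank 2 (at least two distinct x values), A^{T}A is invertible, as the theorem above guarantees.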

**Lecture 22, Friday, February 28, 2003**

Quick discussion of inner product spaces and their
properties.

Orthogonal sets of vectors.

Orthonormal sets of vectors.

Finding the coordinates of **x** relative to an orthonormal set of vectors.

Finding the projection of **x** onto a subspace spanned by
a set of orthogonal vectors.

Orthogonal matrices.

Rigid transformation intuition.

**Lecture 23, Monday, March 3, 2003**

Matrices with orthonormal columns.

Orthogonal matrices.

A matrix Q has orthonormal columns iff Q^{T}Q = I.

Transpose of orthogonal matrix is orthogonal.

Least squares problems with matrices that have orthonormal
columns.

Projection of **b** onto the subspace spanned by a set of
orthonormal vectors is **p** = QQ^{T}**b**.

Gram-Schmidt orthonormalization algorithm.
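
The classical Gram-Schmidt step can be sketched as follows (an illustration with made-up input vectors, not the textbook's pseudocode):

```python
# Classical Gram-Schmidt (sketch): subtract from each vector its projections
# onto the previously produced orthonormal vectors, then normalize.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:                       # remove the component along q
            c = dot(q, w)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(dot(w, w))
        if norm > 1e-12:                      # skip linearly dependent vectors
            basis.append([wi / norm for wi in w])
    return basis

Q = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(Q)   # two orthonormal vectors spanning the same subspace
```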

**Day 24, Wednesday, March 5, 2003.**
No lecture - Midterm examination.

**Lecture 25, Friday, March 7, 2003**

Modified Gram-Schmidt method for orthonormalization.

Eigenvalue.

Eigenvector.

Motivating example (from section 6.1).

**Lecture 26, Monday, March 10, 2003**

Eigenvalue.

Eigenvector.

Eigenspace.

Characteristic polynomial.

How to compute eigenvalues and eigenvectors.
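
For a 2x2 matrix the computation reduces to the quadratic formula on the characteristic polynomial, which can be sketched as (illustrative code with a made-up matrix; real eigenvalues assumed):

```python
# Eigenvalues of a 2x2 matrix A from its characteristic polynomial:
#   det(A - lambda*I) = lambda^2 - (trace A)*lambda + det A = 0   (sketch)
import math

def eigenvalues_2x2(A):
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det            # discriminant; assumes it is >= 0
    assert disc >= 0, "complex eigenvalues"
    r = math.sqrt(disc)
    return (tr + r) / 2, (tr - r) / 2

A = [[2, 1],
     [1, 2]]
print(eigenvalues_2x2(A))               # (3.0, 1.0)
# check: the eigenvalues sum to the trace and multiply to the determinant
```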

**Day 27, Wednesday, March 12, 2003**

Trace of a matrix.

Sums and products of eigenvalues and the characteristic polynomial.

Similarity.

Diagonalizability.

Eigenvectors corresponding to distinct eigenvalues are linearly independent.

If an n×n matrix has n distinct eigenvalues, then it is diagonalizable.

**Lecture 28, Friday, March 14, 2003 - Last Day of Class!**

Planned: wrap-up of eigenvalues, eigenvectors, and similarity.

Some applications and other results.

Read more in sections 6.1 and 6.3.