Last modified: December 22, 2017.
Homework assignments will be available on this webpage throughout the term. All homework assignments must be submitted to the drop boxes in the basement of AP&M by 4:00 PM on the deadline.
Answers to these problems are now available here. Note that these contain only the broad strokes of a complete solution, and not necessarily as much detail as you are expected to show.
Go to Wolfram Alpha's System of Linear Equations Problem Generator (or another similar site of your choosing), and practice solving systems of linear equations until you can consistently solve them quickly and correctly. You don't need to submit anything, but the odds of you needing to solve at least one system of equations on the midterm or final are high.
Do a similar thing for sums, differences, and scalings of vectors.
You may also find it helpful to work several of the problems in the textbook.
Answers to these problems are now available here.
Go to Wolfram Alpha's Vector Times Matrix Problem Generator (or another similar site of your choosing), and practice multiplying matrices and vectors until you can consistently do it quickly and correctly. You don't need to submit anything, but the odds of you needing to perform at least one computation of this type on the midterm or final are high.
You may also find it helpful to work several of the problems in the textbook; in particular, practice switching between systems of linear equations, vector equations, and matrix equations.
Suppose $T$ is the following $22\times14$ matrix: $$T = \begin{bmatrix} 56 & 59 & 97 & 62 & 72 & 82 & 7 & 60 & 79 & 6 & 51 & 14 & 2 & 21 \\ 22 & 85 & 84 & 88 & 19 & 41 & 18 & 29 & 39 & 25 & 78 & 95 & 90 & 71 \\ 15 & 86 & 49 & 16 & 22 & 65 & 85 & 76 & 76 & 10 & 89 & 62 & 37 & 50 \\ 21 & 12 & 60 & 42 & 44 & 45 & 24 & 19 & 37 & 69 & 5 & 46 & 47 & 25 \\ 99 & 38 & 93 & 79 & 94 & 44 & 47 & 11 & 26 & 53 & 4 & 7 & 20 & 94 \\ 23 & 18 & 89 & 59 & 91 & 96 & 24 & 27 & 57 & 26 & 9 & 41 & 81 & 2 \\ 17 & 98 & 90 & 29 & 1 & 0 & 4 & 52 & 36 & 71 & 80 & 61 & 33 & 28 \\ 16 & 93 & 64 & 81 & 78 & 48 & 77 & 99 & 72 & 31 & 100 & 35 & 56 & 100 \\ 45 & 73 & 63 & 57 & 9 & 54 & 74 & 80 & 77 & 34 & 98 & 35 & 75 & 53 \\ 31 & 43 & 50 & 48 & 87 & 96 & 67 & 13 & 69 & 15 & 23 & 8 & 58 & 46 \\ 58 & 32 & 51 & 30 & 3 & 34 & 1 & 36 & 42 & 55 & 87 & 84 & 20 & 70 \\ 42 & 55 & 34 & 58 & 52 & 18 & 15 & 5 & 49 & 95 & 70 & 19 & 59 & 39 \\ 2 & 48 & 12 & 24 & 54 & 36 & 27 & 10 & 53 & 98 & 7 & 45 & 43 & 23 \\ 80 & 72 & 99 & 57 & 36 & 53 & 58 & 95 & 5 & 30 & 87 & 17 & 31 & 30 \\ 76 & 47 & 82 & 11 & 37 & 73 & 86 & 29 & 60 & 48 & 24 & 28 & 67 & 85 \\ 38 & 37 & 7 & 100 & 77 & 25 & 1 & 46 & 14 & 31 & 74 & 75 & 78 & 4 \\ 41 & 65 & 91 & 13 & 100 & 68 & 18 & 94 & 84 & 97 & 74 & 68 & 38 & 66 \\ 84 & 23 & 71 & 86 & 76 & 12 & 45 & 51 & 81 & 77 & 90 & 92 & 35 & 29 \\ 11 & 69 & 70 & 88 & 40 & 56 & 44 & 50 & 67 & 87 & 16 & 21 & 0 & 21 \\ 17 & 96 & 3 & 99 & 25 & 79 & 89 & 40 & 9 & 33 & 85 & 51 & 89 & 28 \\ 3 & 71 & 32 & 56 & 44 & 65 & 42 & 55 & 41 & 19 & 54 & 13 & 72 & 98 \\ 35 & 2 & 79 & 20 & 60 & 14 & 52 & 88 & 16 & 22 & 90 & 57 & 27 & 63 \\ \end{bmatrix}.$$ Does the matrix equation $T$$\vec{x} = \vec{b}$ in the variable $\vec{x}$ have a solution for every $\vec{b} \in \mathbb{R}^{22}$?
(Hint: you do not need to perform row reduction to solve this problem.)
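Since solutions to these problems are already posted, a numerical illustration of the relevant dimension count may be useful. The sketch below uses Python with NumPy rather than MATLAB (an assumption of convenience; MATLAB's `rank` command would serve the same purpose): a matrix with only 14 columns has rank at most 14, so its columns span a subspace of $\mathbb{R}^{22}$ of dimension at most 14.

```python
import numpy as np

# Any 22x14 matrix (the particular entries don't matter) has at most 14
# linearly independent columns, so its column space has dimension <= 14 < 22.
rng = np.random.default_rng(0)
T = rng.integers(0, 101, size=(22, 14))  # stand-in for the matrix above

rank = np.linalg.matrix_rank(T)
print(rank)        # at most 14
print(rank < 22)   # True: the columns cannot span all of R^22
```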
Answers to these problems are now available here.
Correction: An earlier version of this problem accidentally included the matrix $\begin{bmatrix}1&0&0\\0&0&1\\0&1&0\end{bmatrix}$ twice. Since the goal of this problem is to demonstrate row reduction as a sequence of matrix multiplications, its presence there defeated the point of the question. If you have already performed the computation with the original sequence of matrices, submitting it will also count as a correct solution to this problem.
Answers to these problems are now available here.
The set of strings of zero or more letters, where "$+$" corresponds to concatenation and for any number $\lambda \in \mathbb{R}$ and any string of letters $\vec{w}$, $\lambda\vec{w} = \vec{w}$.
For example, if $\vec{v} = \textrm{``snow''}$ and $\vec{w} = \textrm{``flake''}$, then $\vec{v} + \vec{w} = \textrm{``snowflake''}$. Similarly, $3\vec{v} = \textrm{``snow''}$. If $\vec{\varepsilon}$ is the string containing zero letters, and $\vec{x}$ is another string, then $\vec{\varepsilon} + \vec{x} = \vec{x} = \vec{x} + \vec{\varepsilon}$. So for example, $\textrm{``''} + \textrm{``snow''} = \textrm{``snow''} = \textrm{``snow''} + \textrm{``''}$.
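The proposed operations can be tested concretely. Below is a small sketch in Python (an illustration only; `add` and `scale` are hypothetical names for the two operations) checking the axiom $(\lambda + \mu)\vec{w} = \lambda\vec{w} + \mu\vec{w}$:

```python
# The proposed "vector space" of strings: addition is concatenation,
# and every scalar multiple of w is just w itself.
def add(v, w):
    return v + w          # "+" is string concatenation

def scale(lam, w):
    return w              # lambda * w = w for every scalar lambda

v, w = "snow", "flake"
print(add(v, w))                     # snowflake
print(scale(3, v))                   # snow

# Test one axiom: is (lambda + mu) * w equal to lambda * w + mu * w ?
lhs = scale(1 + 1, w)                # (1+1)w = w        -> "flake"
rhs = add(scale(1, w), scale(1, w))  # 1w + 1w = w + w   -> "flakeflake"
print(lhs == rhs)                    # False: this axiom fails
```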
Answers to these problems are now available here.
Consider the space of polynomials $\mathbb{P}$. Let $D : \mathbb{P} \to \mathbb{P}$ and $S : \mathbb{P} \to\mathbb{P}$ be linear maps such that for all $k \in \mathbb{N}$, $D(x^k) = kx^{k-1}$ and $S(x^k) = \frac1{k+1}x^{k+1}$.
We know that linear maps between finite-dimensional vector spaces are one-to-one if and only if they are onto (and that this occurs exactly when the standard matrix of such a map is invertible). What we have shown here is that this does not hold for vector spaces of infinite dimension.
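One way to see what $D$ and $S$ do is to represent a polynomial by its list of coefficients. The Python sketch below (the names and the coefficient-list representation are my own, not part of the problem) shows that $D \circ S$ is the identity while $S \circ D$ forgets the constant term, so $D$ is onto but not one-to-one:

```python
def D(p):
    # Derivative: D(x^k) = k x^(k-1); p is a coefficient list [a0, a1, a2, ...]
    return [k * p[k] for k in range(1, len(p))] or [0]

def S(p):
    # "Antiderivative": S(x^k) = x^(k+1)/(k+1), with constant term 0
    return [0] + [p[k] / (k + 1) for k in range(len(p))]

p = [5, 0, 3]            # the polynomial 5 + 3x^2
print(D(S(p)))           # [5.0, 0.0, 3.0]: D(S(p)) = p, so D is onto
print(S(D(p)))           # [0, 0.0, 3.0]: the constant term 5 is lost,
                         # so D is not one-to-one
```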
Given a finite set $S = \{s_1, \ldots, s_k\}$, the free vector space on $S$ is the set of formal linear combinations of elements of $S$: $$\mathcal{F}(S) = \left\{a_1s_1 + \cdots + a_ks_k : a_1, \ldots, a_k \in \mathbb{R}\right\}.$$ Addition is defined by collecting like terms, and scalar multiplication multiplies the weight of each element. $\mathcal{F}(S)$ should be thought of as the vector space obtained by declaring $S$ to be a basis.
Consider the free vector space on the set $Q = \{|0\rangle, |1\rangle\}$; then $\mathcal{F}(Q)$ is a $2$-dimensional vector space, and all vectors are of the form $\alpha|0\rangle + \beta|1\rangle$. Define the vectors $|+\rangle$ and $|-\rangle$ by $$|+\rangle = \frac1{\sqrt2}|0\rangle + \frac1{\sqrt2}|1\rangle \hspace{2in} |-\rangle = \frac1{\sqrt2}|0\rangle - \frac1{\sqrt2}|1\rangle.$$
It turns out that in the vector space $\mathcal{F}_{\mathbb{C}}(Q)$ (which is like $\mathcal{F}(Q)$ with complex coefficients rather than real coefficients), the vectors $\alpha|0\rangle + \beta|1\rangle$ with $|\alpha|^2 + |\beta|^2 = 1$ describe the possible states of a qubit, or quantum bit. If $W$ is the set $\{|w\rangle : w \text{ is a binary string of length } k\}$, then the state of a quantum system with $k$ qubits can be described as the vectors in $\mathcal{F}_{\mathbb{C}}(W)$ with coefficients whose squared absolute values sum to $1$. A quantum computer acts on such a quantum state by applying linear maps with the property that every quantum state is sent to another state (so, for example, such maps must have trivial kernel). There's a lot of fascinating stuff here which is well beyond the scope of this course; Wikipedia is a good place to start reading.
Answers to these problems are now available here.
Consider the space of polynomials of degree at most $4$, $\mathbb{P}_4$. The standard basis for $\mathbb{P}_4$ is given by the monomials: $\mathcal{E} = \{1,x,x^2,x^3,x^4\}$. However, there are other bases for the polynomials which may be more convenient to work with in some settings; two are detailed below.
The Chebyshev polynomials (of the second kind) are useful in approximation theory and polynomial interpolation, and are defined by the conditions $U_0(x) = 1$, $U_1(x) = 2x$, and for $n \geq 1$, $U_{n+1}(x) = 2xU_n(x) - U_{n-1}(x)$. In particular, the first five Chebyshev polynomials are as follows: $$\mathcal{B} = \{1, 2x, 4x^2-1, 8x^3-4x, 16x^4-12x^2+1\}.$$
The Hermite polynomials likewise have many uses, being related to the behaviour of a quantum harmonic oscillator and random matrix theory. Their general form is not as easily presented, but the first five Hermite polynomials are the following: $$\mathcal{C} = \{1, x, x^2-1, x^3-3x, x^4-6x^2+3\}.$$
Compute the following change of basis matrices.
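As an illustration of what such a computation produces (a sketch only, covering just the direction from $\mathcal{B}$ to $\mathcal{E}$; Python with NumPy is used here as a stand-in for hand computation or MATLAB): the columns of the change of basis matrix from $\mathcal{B}$ to $\mathcal{E}$ are the $\mathcal{E}$-coordinate vectors of $U_0, \ldots, U_4$, read off from the polynomials listed above.

```python
import numpy as np

# Columns are the coordinate vectors of U_0,...,U_4 relative to {1, x, x^2, x^3, x^4}.
P_E_from_B = np.array([
    [1, 0, -1,  0,   1],   # constant terms
    [0, 2,  0, -4,   0],   # coefficients of x
    [0, 0,  4,  0, -12],   # coefficients of x^2
    [0, 0,  0,  8,   0],   # coefficients of x^3
    [0, 0,  0,  0,  16],   # coefficients of x^4
], dtype=float)

# The change of basis in the other direction is the inverse matrix.
P_B_from_E = np.linalg.inv(P_E_from_B)

# Example: [x^4]_B, i.e. x^4 written as a combination of Chebyshev polynomials.
coords = P_B_from_E @ np.array([0, 0, 0, 0, 1.0])
print(coords)
```

One can check by hand that $x^4 = \frac18 U_0 + \frac3{16} U_2 + \frac1{16} U_4$, matching the printed coordinates.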
Answers to these problems are now available here.
Answers to these problems are now available here.
Nothing should be submitted for the following exercise, and it should be considered only for interest's sake. Feel free to ignore it if you wish. We will try to gain some insight into the following question: what do the eigenvalues of a matrix chosen at random look like? You should use MATLAB or another tool of your choice for the following.
It turns out that any matrix with real entries which is equal to its own transpose is diagonalizable, i.e., similar to a diagonal matrix, and therefore there is a basis of $\mathbb{R}^n$ consisting of eigenvectors for the matrix. We will make use of that in what follows.
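A sketch of the experiment in Python with NumPy rather than MATLAB (an assumption; `eigvalsh` is NumPy's eigenvalue routine for symmetric matrices):

```python
import numpy as np

# Build a random matrix and symmetrize it: (A + A^T)/2 equals its own
# transpose, so it is diagonalizable with real eigenvalues.
rng = np.random.default_rng(1)
n = 500
A = rng.standard_normal((n, n))
S = (A + A.T) / 2

eigenvalues = np.linalg.eigvalsh(S)  # real eigenvalues, in ascending order
print(eigenvalues.min(), eigenvalues.max())
# Plotting a histogram of eigenvalues / sqrt(n) (e.g. with matplotlib)
# reveals the semicircle shape typical of large random symmetric matrices.
```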
Name | Role | Office | Office hours | Email
Ian Charlesworth | Instructor | AP&M 5880C | Tu 10-12, F 10-11 | ilc@math.ucsd.edu
Xindong Tang | Teaching Assistant | AP&M 6436 | M 10-11, F 2-3 | xit039@ucsd.edu
Matthew Ung | Teaching Assistant | AP&M 5218 | W 9-10 | m2ung@ucsd.edu
My Friday office hour is dedicated to this class, while the time on Tuesday is open both to this course and the other course I am teaching this quarter.
We will be communicating with you and making announcements through an online question and answer platform called Piazza. We ask that when you have a question about the class that might be relevant to other students, you post your question on Piazza instead of emailing us. That way, everyone can benefit from the response. Posts about homework or exams on Piazza should be content-based. While you are encouraged to crowdsource and discuss coursework through Piazza, please do not post complete solutions to homework problems there. Questions about grades should be brought to the instructors in office hours. You can also post private messages to instructors on Piazza, which we prefer over email.
If emailing us is necessary, please do the following:
Section | Date | Time | Location
Lecture C00 (Charlesworth) | Mondays, Wednesdays, Fridays | 11:00am - 11:50am | CENTR 119
Discussion C01 (Xindong Tang) | Thursdays | 2:00pm - 2:50pm | AP&M 2301
Discussion C02 (Xindong Tang) | Thursdays | 3:00pm - 3:50pm | AP&M 2301
Discussion C03 (Xindong Tang) | Thursdays | 4:00pm - 4:50pm | AP&M 2301
Discussion C04 (Xindong Tang) | Thursdays | 5:00pm - 5:50pm | AP&M 2301
Discussion C05 (Matthew Ung) | Thursdays | 6:00pm - 6:50pm | AP&M 2301
Discussion C06 (Matthew Ung) | Thursdays | 7:00pm - 7:50pm | AP&M 2301
Final Exam | Tuesday, Dec 12 | 11:30am - 2:30pm | CENTR 119
Course: Math 18
Title: Linear Algebra
Credit Hours: 4 (Students may not receive credit for both Math 18 and 31AH.)
Prerequisite: Math Placement Exam qualifying score, or AP Calculus AB score of 2, or SAT II Math Level 2 score of 600 or higher, or Math 3C, or Math 4C, or Math 10A, or Math 20A, or consent of instructor.
Catalog Description: Matrix algebra, Gaussian elimination, determinants. Linear and affine subspaces, bases of Euclidean spaces. Eigenvalues and eigenvectors, quadratic forms, orthogonal matrices, diagonalization of symmetric matrices. Applications. Computing symbolic and graphical solutions using Matlab. See the UC San Diego Course Catalog.
Textbook: Linear Algebra and its Applications, by David C. Lay, Steven R. Lay, and Judi J. McDonald; published by Pearson (Addison Wesley).
Subject Material: We will cover parts of chapters 1-7 of the text.
Lecture: Attending the lecture is a fundamental part of the course; you are responsible for material presented in the lecture whether or not it is discussed in the textbook. You should expect questions on the exams that will test your understanding of concepts discussed in the lecture.
Reading: Reading the sections of the textbook corresponding to the assigned homework exercises is considered part of the homework assignment; you are responsible for material in the assigned reading whether or not it is discussed in the lecture.
Calendar of Lecture Topics: The following calendar is subject to revision during the term. The section references are only a guide; our pace may vary from it somewhat.
Week | Monday | Tuesday | Wednesday | Thursday | Friday
0 | Sep 25 | Sep 26 | Sep 27 | Sep 28 | Sep 29: 1.1 Systems of linear equations
1 | Oct 2: 1.2 Row reduction & echelon forms | Oct 3 | Oct 4: 1.3 Vector equations | Oct 5: Discussion | Oct 6: 1.4 Matrix equation \(A\vec{x} = \vec{b}\)
2 | Oct 9: 1.5 Solution sets | Oct 10 | Oct 11: 1.7 Linear independence | Oct 12: Discussion | Oct 13: 1.8 Linear transformations
3 | Oct 16: 1.9 The matrix of a linear transformation | Oct 17 | Oct 18: 2.1 Matrix operations | Oct 19: Discussion | Oct 20: Midterm exam
4 | Oct 23: 2.2, 2.3 Inverse of a matrix | Oct 24 | Oct 25: 4.1 Vector spaces and subspaces | Oct 26: Discussion | Oct 27: 4.2 Null spaces & column spaces
5 | Oct 30: 4.3 Linearly independent sets; bases | Oct 31 | Nov 1: 4.5 Dimension | Nov 2: Discussion | Nov 3: 4.6 Rank; 4.4 Coordinate systems
6 | Nov 6: 4.7 Change of basis | Nov 7 | Nov 8: 3.1, 3.2 Determinants | Nov 9: Discussion | Nov 10: Veterans Day
7 | Nov 13: 3.3 Determinants and volume | Nov 14 | Nov 15: 5.1 Eigenvectors and eigenvalues | Nov 16: Discussion | Nov 17: Midterm exam
8 | Nov 20: 5.2 Characteristic polynomial | Nov 21 | Nov 22: 5.3 Diagonalization | Nov 23: Thanksgiving | Nov 24: Post-Thanksgiving
9 | Nov 27: 6.1, 6.7 Inner product, length, & orthogonality | Nov 28 | Nov 29: 6.2 Orthogonal sets | Nov 30: Discussion | Dec 1: 6.3 Orthogonal projections
10 | Dec 4: 6.4 Gram-Schmidt Orthogonalization | Dec 5 | Dec 6: 7.1 Spectral Theorem | Dec 7: Discussion | Dec 8: Review
11 | Dec 11 | Dec 12 | Dec 13 | Dec 14 | Dec 15
Homework: Homework is a very important part of the course and in order to fully master the topics it is essential that you work carefully on every assignment and try your best to complete every problem. Homework assignments will be made available on the course webpage, above. Your homework can be submitted to the dropbox with your TA's name on it in the basement of the AP&M building. Homework is officially due at 4:00 PM on the due date.
MATLAB: In applications of linear algebra, the theoretical concepts that you will learn in lecture are used together with computers to solve large-scale problems. Thus, in addition to your written homework, you will be required to do homework using the computer language MATLAB. The Math 18 MATLAB Assignments page contains all information relevant to the MATLAB component of Math 18. The first assignment is due in week 2 of the course. You can do the homework on any campus computer that has MATLAB. Questions regarding the MATLAB assignments should be directed to the TAs. There are also tutors available beginning Thursday or Friday of the first week of classes in B432 of AP&M. Please turn in your homework via Gradescope, as described on the MATLAB page, by 11:59pm on the due date indicated on the Math 18 MATLAB Assignments page. Note that late MATLAB homework will not be accepted, but in case you have to miss one MATLAB assignment, your lowest MATLAB homework score will be dropped. There will be a MATLAB quiz at the end of the quarter.
Midterm Exams: There will be two midterm exams given during the quarter. No calculators, phones, or other electronic devices will be allowed during the midterm exams. You may bring at most three four-leaf clovers, horseshoes, maneki-neko, or other such talismans for good luck. There will be no makeup exams.
Final Examination: The final examination will be held at the date and time stated above.
Administrative Links: Here are two links regarding UC San Diego policies on exams:
Regrade Policy:
Administrative Deadline: Your scores for all graded work will be posted to TritonEd.
Grading: Your course grade will be determined by your cumulative average at the end of the term and will be based on the following scale:
A+ | A | A- | B+ | B | B- | C+ | C | C-
97 | 93 | 90 | 87 | 83 | 80 | 77 | 73 | 70
In addition, you must pass the final examination in order to pass the course. Note: Since there are no makeup exams, if you miss a midterm exam for any reason, then your course grade will be computed with the second option. There are no exceptions; this grading scheme is intended to accommodate emergencies that require missing an exam.
Your single worst homework score will be ignored.
Academic Integrity: UC San Diego's code of academic integrity outlines the expected academic honesty of all students and faculty, and details the consequences for academic dishonesty. The main issues are cheating and plagiarism, of course, for which we have a zero-tolerance policy. (Penalties for these offenses always include assignment of a failing grade in the course, and usually involve an administrative penalty, such as suspension or expulsion, as well.) However, academic integrity also includes things like giving credit where credit is due (listing your collaborators on homework assignments, noting books or papers containing information you used in solutions, etc.), and treating your peers respectfully in class. In addition, here are a few of our expectations for etiquette in and out of class.
Here are some additional resources for this course, and math courses in general.
Any remarks pertaining to particular lectures will be posted here throughout the term.
Imagine we are interested in studying some system which can be in two possible states, and evolves over time, moving to a new state every time step randomly but depending on its current state. A simple example with two states may be a traffic light which can be either red or green at any given time (ignoring yellow lights, as most California drivers do). If a green light turns red with probability $\frac34$ and a red light turns green with probability $\frac12$ every twenty seconds, then this system can be described by the following matrix: $$A = \begin{bmatrix}\frac14&\frac12\\\frac34&\frac12\end{bmatrix}.$$ If a light is currently green with probability $p$ and red with probability $1-p$, it can be represented by the vector $\begin{bmatrix}p\\1-p\end{bmatrix}$, and the probability of it being in each state after twenty seconds is given by $A\begin{bmatrix}p\\1-p\end{bmatrix}$, after forty seconds by $A^2\begin{bmatrix}p\\1-p\end{bmatrix}$, after sixty seconds by $A^3\begin{bmatrix}p\\1-p\end{bmatrix}$, and so on.
What we'd like to understand is the probability of the system being in each state after a very long time has passed. We can accomplish this using eigenvectors and eigenvalues. A bit of computation will show that for the matrix above, the eigenvalues are $1$ and $-\frac14$, with corresponding eigenvectors $\begin{bmatrix}2\\3\end{bmatrix}$ and $\begin{bmatrix}1\\-1\end{bmatrix}$. Suppose that we know our system begins with equal chance of being in either state: $x = \begin{bmatrix}\frac12\\\frac12\end{bmatrix}$. We can write $\begin{bmatrix}\frac12\\\frac12\end{bmatrix} = \frac15\begin{bmatrix}2\\3\end{bmatrix} + \frac1{10}\begin{bmatrix}1\\-1\end{bmatrix}$. Then if we apply $A^k$, we see $$A^kx = \frac15A^k\begin{bmatrix}2\\3\end{bmatrix} + \frac1{10}A^k\begin{bmatrix}1\\-1\end{bmatrix} = \frac15(1)^k\begin{bmatrix}2\\3\end{bmatrix} + \frac1{10}\left(-\frac14\right)^k\begin{bmatrix}1\\-1\end{bmatrix}.$$ As $k$ becomes very large, this quickly becomes very close to $\begin{bmatrix}\frac25\\\frac35\end{bmatrix}$. So as the system runs for a long time, its probability of being in the first state approaches $\frac25$, and its probability of being in the second approaches $\frac35$. This tells us something we already know, which is that most traffic lights are red when you reach them.
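This computation is easy to verify numerically. Here is a sketch in Python with NumPy (MATLAB's `eig` command does the same job):

```python
import numpy as np

A = np.array([[1/4, 1/2],
              [3/4, 1/2]])

# The eigenvalues of the transition matrix.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues.real))   # -0.25 and 1.0

# Repeatedly applying A to the 50/50 starting vector approaches the limit.
x = np.array([0.5, 0.5])
for _ in range(30):
    x = A @ x
print(x)   # very close to [0.4, 0.6], i.e. [2/5, 3/5]
```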
The same idea can be applied to much more interesting systems. In any system with a finite number of states, such that the probability of moving from each state to the next is known, one can always write down a transition matrix with $(i,j)$-entry equal to the probability of moving from state $j$ to state $i$. Then all the entries of this matrix will be between $0$ and $1$, and the entries of each column will sum to one. Further, $1$ will always be an eigenvalue of such a matrix (think for a little bit about why $A - \mathrm{Id}$ will not be invertible), and all the eigenvalues will be between $-1$ and $1$. If $-1$ is not an eigenvalue, then the state of the system from any initial configuration will grow closer and closer to an eigenvector with eigenvalue one.
This is the same sort of math that goes on behind Google's PageRank algorithm. They construct a very large matrix whose entries represent the probabilities of moving from any one web page to another, and find the eigenvectors of this matrix. These represent the probability of a user being on a certain site after a long time spent browsing, and sites with a high probability of being visited are ranked more highly in searches.
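A miniature version of the idea can be sketched in Python with NumPy (the three-page "web" and its link probabilities below are invented purely for illustration):

```python
import numpy as np

# Column j gives the probabilities of moving from page j to each page i;
# each column sums to 1, as for any transition matrix.
M = np.array([
    [0.0, 0.0, 1.0],   # links into page 0
    [0.5, 0.0, 0.0],   # links into page 1
    [0.5, 1.0, 0.0],   # links into page 2
])

# Power iteration: applying M repeatedly converges to an eigenvector with
# eigenvalue 1, the long-run probability of a user being on each page.
r = np.ones(3) / 3
for _ in range(100):
    r = M @ r
print(r)   # approximately [0.4, 0.2, 0.4]: pages 0 and 2 rank highest
```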