$$\newcommand{\norm}[1]{\left\|#1\right\|} \newcommand{\paren}[1]{\left(#1\right)} \newcommand{\abs}[1]{\left\lvert#1\right\rvert} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\ang}[1]{\left\langle#1\right\rangle} \newcommand{\C}{\mathbb{C}} \newcommand{\R}{\mathbb{R}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\Z}{\mathbb{Z}} \newcommand{\N}{\mathbb{N}}$$
Due: December 4th, 2017

## Math 18 Assignment 8


1. Compute the following things:
    1. $\norm{\,\begin{bmatrix}4\\-1\\2\\2\\1\end{bmatrix}\,}$
    2. The distance from $\begin{bmatrix}2\\3\\-\frac7{12}\\1\end{bmatrix}$ to $\begin{bmatrix}-1\\3\\2\\\frac85\end{bmatrix}$
    3. A unit vector in the same direction as $\begin{bmatrix}8\\-2\\4\end{bmatrix}$
    4. $\ang{\begin{bmatrix}2\\-1\\3\end{bmatrix}, \begin{bmatrix}-1\\5\\-1\end{bmatrix}}$
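All four parts come down to the standard inner product: $\norm{v} = \sqrt{\ang{v,v}}$, the distance from $u$ to $v$ is $\norm{u - v}$, and a unit vector in the direction of $v$ is $v/\norm{v}$. If you want to check your hand computations afterward, a short Python sketch (MATLAB or any other tool works just as well):

```python
import math

def dot(u, v):
    # standard inner product on R^n
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

def distance(u, v):
    return norm([a - b for a, b in zip(u, v)])

def unit(v):
    n = norm(v)
    return [a / n for a in v]

print(norm([4, -1, 2, 2, 1]))                       # part 1
print(distance([2, 3, -7/12, 1], [-1, 3, 2, 8/5]))  # part 2
print(unit([8, -2, 4]))                             # part 3
print(dot([2, -1, 3], [-1, 5, -1]))                 # part 4
```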
2. Consider the vectors $u, v, w \in \R^5$ given as follows: $$u = \begin{bmatrix}3\\-1\\0\\2\\7\end{bmatrix}\hspace{1in} v = \begin{bmatrix}-1\\5\\3\\6\\0\end{bmatrix}\hspace{1in} w = \begin{bmatrix}1\\-1\\1\\-1\\1\end{bmatrix}.$$
    1. Write down a system of linear equations whose solution space is precisely the set of vectors in $\R^5$ which are orthogonal to $u$, $v$, and $w$ (i.e., find a system of equations describing their orthogonal complement $\{u, v, w\}^\perp$).
    2. The vectors $u$, $v$, and $w$ are linearly independent. What is the dimension of $\{u, v, w\}^\perp$?
    3. Suppose $x_1, \ldots, x_k \in \R^n$ are linearly independent. What is the dimension of $\{x_1, \ldots, x_k\}^\perp$?
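A useful way to organize this problem: a vector $x$ is orthogonal to each of $x_1, \ldots, x_k$ exactly when $Ax = 0$, where $A$ is the $k \times n$ matrix whose rows are the $x_i$, so the complement is the null space of $A$ and rank–nullity gives its dimension. A quick NumPy check for the vectors above (NumPy here standing in for MATLAB):

```python
import numpy as np

# rows are u, v, w from this problem
A = np.array([[ 3, -1, 0,  2, 7],
              [-1,  5, 3,  6, 0],
              [ 1, -1, 1, -1, 1]])

rank = np.linalg.matrix_rank(A)
print(rank)               # 3, since u, v, w are linearly independent
print(A.shape[1] - rank)  # dim {u, v, w}^perp = 5 - 3 = 2
```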
3. Compute the following:
    1. The projection of the vector $\begin{bmatrix}4\\2\\-1\end{bmatrix}$ onto the line spanned by $\begin{bmatrix}1\\1\\1\end{bmatrix}$.
    2. The projection of the vector $\begin{bmatrix}2\\0\\-1\\1\end{bmatrix}$ onto the space spanned by the orthogonal vectors $\begin{bmatrix}3\\1\\0\\-2\end{bmatrix}$ and $\begin{bmatrix}-1\\1\\3\\-1\end{bmatrix}$.
    3. The coefficients needed to express the vector you found in part 2 as a linear combination of $\begin{bmatrix}3\\1\\0\\-2\end{bmatrix}$ and $\begin{bmatrix}-1\\1\\3\\-1\end{bmatrix}$.
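Both projections use the same formula: for an orthogonal set $u_1, \ldots, u_k$, the projection of $b$ onto their span is $\sum_i \frac{\ang{b, u_i}}{\ang{u_i, u_i}}\,u_i$, and those quotients are exactly the coefficients asked for in the last part. A Python sketch for checking your hand work:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(b, basis):
    # orthogonal projection of b onto span(basis); assumes basis is orthogonal
    coeffs = [dot(b, u) / dot(u, u) for u in basis]
    proj = [sum(c * u[i] for c, u in zip(coeffs, basis)) for i in range(len(b))]
    return proj, coeffs

# part 1: onto the line spanned by (1, 1, 1)
p1, _ = project([4, 2, -1], [[1, 1, 1]])
# parts 2 and 3: onto the span of the two orthogonal vectors
p2, coeffs = project([2, 0, -1, 1], [[3, 1, 0, -2], [-1, 1, 3, -1]])
print(p1)
print(p2, coeffs)
```

A good sanity check on any projection: the residual $b - \operatorname{proj}(b)$ should be orthogonal to every spanning vector.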
4. Suppose $W \subseteq \R^6$ is a subspace with basis $\set{\begin{bmatrix}1\\5\\-1\\6\\787\\0\end{bmatrix},\begin{bmatrix}-34\\4\\4\\4\\4\\8\end{bmatrix},\begin{bmatrix}2\\3\\4\\5\\6\\2\end{bmatrix}, \begin{bmatrix}-5\\123\\-4\\12\\-3\\1\end{bmatrix}}$, and let $P : \R^{\color{red}6}\to\R^{\color{red}6}$ be the orthogonal projection onto $W$. The following do not require very much computation.
    1. What is the rank of $P$? How do you know?
    2. What is the dimension of the $1$-eigenspace of $P$? How do you know?
    3. What is the dimension of the null space of $P$? How do you know?
    4. Explain why $P$ must be similar to a diagonal matrix, and find a diagonal matrix it is similar to (note: you are not being asked to find an invertible matrix $Q$ so that $P = QDQ^{-1}$).
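None of these parts needs the actual entries of the basis vectors, but it can be reassuring to see the claims hold numerically. The sketch below (Python/NumPy) builds the projection via the standard formula $P = A(A^{\mathsf T}A)^{-1}A^{\mathsf T}$, where the columns of $A$ are the given basis of $W$; again, this computation is not required for the problem itself:

```python
import numpy as np

# columns are the four basis vectors of W given above
A = np.array([[  1, -34, 2,  -5],
              [  5,   4, 3, 123],
              [ -1,   4, 4,  -4],
              [  6,   4, 5,  12],
              [787,   4, 6,  -3],
              [  0,   8, 2,   1]], dtype=float)

# orthogonal projection onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.linalg.matrix_rank(P))    # 4 = dim W
evals = np.linalg.eigvalsh(P)
print(np.round(evals, 6))          # eigenvalues 0 (twice) and 1 (four times)
```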
5. Consider the space $\mathbb{P}_3$ of polynomials of degree at most $3$, equipped with the inner product defined by $$\ang{p,q} = \frac14\paren{p(-1)q(-1) + p(0)q(0) + p(1)q(1) + p(2)q(2)}.$$
    1. Confirm that the constant polynomial $1$ has length $1$; that is, that $\norm{1} = 1$.
    2. Find real numbers $a$ and $b$ so that the polynomial $ax+b$ is a unit vector orthogonal to the polynomial $1$.
    3. Compute the projection of $x^3-1$ onto the space spanned by $1$ and the vector you found in part 2, i.e., the subspace of polynomials of degree at most $1$. (What you've accomplished is to find the best linear approximation to the data points $(-1, -2)$, $(0,-1)$, $(1, 0)$, and $(2, 7)$. You may find it enlightening to plot these points and the line you've computed, but this is not necessary.)
    4. Use this to find coefficients $j$ and $k$ so that $x^3+jx+k$ is orthogonal to the subspace of polynomials of degree at most $1$.
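Once you have worked these parts by hand, they are easy to sanity-check numerically, since the inner product is just an average of four products. The Python sketch below hard-codes one solution of part 2 (the conditions $\ang{ax+b, 1} = 0$ and $\norm{ax+b} = 1$ force $a = -2b$ and $5b^2 = 1$); substitute your own values if they differ:

```python
import math

PTS = [-1, 0, 1, 2]

def ip(p, q):
    # the inner product from this problem: average of products at the four points
    return sum(p(t) * q(t) for t in PTS) / 4

one = lambda t: 1
p = lambda t: t**3 - 1

# a unit vector of the form a*t + b orthogonal to 1 (here a = -2b, 5 b^2 = 1)
b = -1 / math.sqrt(5)
q = lambda t: -2 * b * t + b

print(ip(one, one), ip(one, q), ip(q, q))  # should be 1, 0, 1

# projection of x^3 - 1 onto span{1, q}; since {1, q} is orthonormal,
# the coefficients are just inner products, with no division needed
c0, c1 = ip(p, one), ip(p, q)
proj = lambda t: c0 * one(t) + c1 * q(t)
print([proj(t) for t in PTS])  # the best-fit line's values at the data points
```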
6. Nothing should be submitted for the following exercise; it is included only for interest's sake, and you should feel free to ignore it. We will try to gain some insight into the following question: what do the eigenvalues of a matrix chosen at random look like? You should use MATLAB or another tool of your choice for the following.

    It turns out that any matrix with real entries which is equal to its own transpose is diagonalizable, i.e., similar to a diagonal matrix, and therefore there is a basis of $\R^n$ consisting of eigenvectors for the matrix. We will make use of this in what follows.

    1. Construct a $25\times25$ upper triangular matrix in MATLAB with entries chosen independently at random from a Gaussian distribution of variance one; you may find the MATLAB functions `triu` and `randn` useful for this.
    2. Use this to construct a symmetric $25\times25$ matrix (one which is equal to its own transpose) whose upper triangular entries are independent random Gaussian variables. Arrange for the diagonal entries to have variance either $1$ or $2$ (i.e., standard deviation $1$ or $\sqrt2$), whichever you prefer.
    3. Multiply the matrix above by the constant $\frac1{\sqrt{25}} = \frac15$ (this causes the rows of the matrix to have, on average, a length of about $1$).
    4. Compute the eigenvalues of the above matrix, and plot them on a histogram. You may find the MATLAB commands `eig` and `histogram` useful for this; you may want something like the command `histogram(e, -2:.1:2)`.
    5. Repeat the above a few times and see if you can divine a pattern.
    6. Now do the same for a much larger $N\times N$ matrix, normalizing by $\frac1{\sqrt{N}}$ instead of $\frac1{\sqrt{25}}$; what do the data look like? Keep trying larger and larger matrices until your computer has a hard time handling it, but try to get to at least $10000\times10000$ matrices (you may need to wait a minute or two between commands for matrices of this size, but not too long).
    7. Now do the same, except use a symmetric matrix whose entries are $\pm1$ chosen at random instead of Gaussian. You may find the MATLAB function `randi` (or `sign` applied to `randn`) helpful.
    It turns out that there is actually a lot of structure lurking within the eigenvalues of a random matrix.
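If MATLAB isn't available, the experiment translates directly to Python with NumPy. A sketch of steps 1–4 (using `np.histogram` with the same bins in place of a plotted histogram, and a fixed seed so the run is reproducible; remove the seed to repeat the experiment as in step 5):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 25

# step 1: upper triangular matrix with iid standard Gaussian entries
A = np.triu(rng.standard_normal((N, N)))

# step 2: symmetrize; subtracting one copy of the diagonal keeps its variance at 1
M = A + A.T - np.diag(np.diag(A))

# step 3: normalize by 1/sqrt(N) so rows have length about 1
M = M / np.sqrt(N)

# step 4: eigenvalues (real, since M is symmetric) and a histogram over [-2, 2]
evals = np.linalg.eigvalsh(M)
counts, edges = np.histogram(evals, bins=np.arange(-2, 2.1, 0.1))
print(counts)
```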