|Todd Kemp||Instructor||APM firstname.lastname@example.org|
|Denise Rava||Teaching Assistant||APM email@example.com|
Our office hours can be found in the following calendar.
|Lecture A00 (Kemp)||Monday, Wednesday, Friday||1:00pm - 1:50pm||APM B402A|
|Lab A01 (Rava)||Wednesday||5:00pm - 5:50pm||APM B432|
|Lab A02 (Rava)||Wednesday||6:00pm - 6:50pm||APM B432|
|Take-Home Midterm Exam||Monday, Feb 10||2:00pm||(Home)|
|Take-Home Final Exam||Friday, Mar 20||11:30am - 2:29pm||(Home)|
Here are lecture notes on topics outside any of the recommended preparatory textbooks.
182.Notes.pdf last updated March 5, 2020.
Here are my lecture notes on Random Matrix Theory. They are intended for a reader who has taken graduate courses in (measure-theoretic) probability
theory, complex analysis, and real analysis, and has some familiarity with (enumerative) combinatorics. Section 4.1 contains material related
to the discussion in Section 2.4 of our current course notes.
The lectures are typically given via tablet, on notes/slides with some information prepared before lecture and some filled in during the lecture. Below, you will find the before and after slides for each lecture (as they are produced).
|13||Not available||Not available|
|19||Not available||Not available|
|20||Not available||Not available|
|27||Not available||Not available|
|28||Not available||Interactive EVT Video here.|
DSC 155 & MATH 182 is a one-quarter topics course on the theory of Principal Component Analysis (PCA), a tool from statistics and data analysis that is extremely widespread and effective for understanding large, high-dimensional data sets. PCA is a method of projecting a high-dimensional data set into a lower-dimensional affine subspace (of chosen dimension) that best fits the data (in the least-squares sense), or equivalently maximizes the preserved variance of the original data. It is a computationally efficient algorithm (utilizing effective numerical methods for the singular value decomposition of matrices), and so is often a first stop for advanced analytics of big data sets. The algorithm produces a canonical basis for the projected subspace; these vectors are called the principal components. The question of choosing the dimension for the projection is subtle. In virtually all real-world applications, a "cut-off phenomenon" occurs: a small number (usually 2 or 3) of the singular values of the sample covariance matrix for the data account for a large majority of the variance in the data set.
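The steps described above (center the data, take the SVD, project onto the top principal components) can be sketched in a few lines of NumPy. The data set and parameter choices below are purely illustrative, not taken from the course labs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: 200 points in R^5 that mostly vary along 2 directions,
# plus a small amount of isotropic noise (all choices here illustrative).
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + 0.05 * rng.normal(size=(200, 5))

# Step 1: center the data -- this is what makes the best-fit subspace affine.
Xc = X - X.mean(axis=0)

# Step 2: singular value decomposition of the centered data matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The principal components are the rows of Vt; the variance preserved by
# each is proportional to the corresponding squared singular value.
explained = s**2 / np.sum(s**2)

# Step 3: project onto the top-k principal components (k = 2 here).
k = 2
Z = Xc @ Vt[:k].T  # the lower-dimensional representation of the data

print(explained[:k].sum())  # fraction of total variance preserved
```

With data like this, the first two squared singular values account for nearly all of the variance, which is exactly the "cut-off phenomenon" described above.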
A full (and honestly still developing) theoretical understanding of this "cut-off phenomenon" has only arisen in the last 15 years, using the tools of random matrix theory. The result, known as the BBP Transition (named after Jinho Baik, Gérard Ben Arous, and Sandrine Péché, who discovered it in 2005), explains the phenomenon in terms of the analysis of outlier singular values in low-rank perturbations of random covariance matrices. This uses a simple model (standard in signal processing and information theory) of a signal in a noisy channel to tease out exactly when an outlier will appear in PCA analysis of noisy data, and further predicts more subtle effects, which newer statistical methods are being developed to correct, yielding even more accurate data analysis.
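A quick numerical experiment makes the transition visible. The sketch below simulates a rank-one "signal plus noise" model (a spiked covariance matrix): with a sufficiently strong spike, the top eigenvalue of the sample covariance matrix detaches from the Marchenko-Pastur bulk as an outlier. All parameter choices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 2000, 400       # samples and dimension; aspect ratio gamma = p / n
gamma = p / n

# Rank-one spiked covariance: Sigma = I + ell * v v^T, with a strong
# spike ell (illustrative value, well above the detectability threshold).
ell = 5.0
v = np.zeros(p)
v[0] = 1.0

# Draw n samples with covariance Sigma: iid Gaussian noise plus an
# independent Gaussian signal of variance ell along the direction v.
X = rng.normal(size=(n, p)) + np.sqrt(ell) * rng.normal(size=(n, 1)) * v

S = X.T @ X / n                     # sample covariance matrix
eigs = np.linalg.eigvalsh(S)        # eigenvalues in increasing order

mp_edge = (1 + np.sqrt(gamma))**2   # right edge of the Marchenko-Pastur bulk
print(eigs[-1], mp_edge)            # top eigenvalue sits well above the edge
```

In this regime the top sample eigenvalue lands near (1 + ell)(1 + gamma/ell), far outside the bulk, while the rest of the spectrum stays below the Marchenko-Pastur edge; shrinking ell toward the critical threshold makes the outlier merge back into the bulk.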
The goal of this course is to present and understand the PCA algorithm, and then analyze it to understand how (and when) it works. Time permitting, we will then apply these ideas to some current problems of interest in data science and computer science, such as community detection in large (random) networks.
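To illustrate the community-detection application mentioned above, here is a minimal spectral sketch on a two-community stochastic block model: the second eigenvector of the adjacency matrix correlates with the hidden community labels. The model parameters are illustrative choices, not values from the course:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-community stochastic block model: n nodes, intra-community edge
# probability p_in, inter-community probability p_out (toy parameters).
n, p_in, p_out = 400, 0.10, 0.02
labels = np.repeat([1, -1], n // 2)  # hidden community assignments

# Symmetric adjacency matrix with block-dependent edge probabilities.
P = np.where(np.equal.outer(labels, labels), p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Spectral step: the top eigenvector of A mostly tracks overall degree,
# while the second eigenvector splits the two communities by sign.
w, V = np.linalg.eigh(A)       # eigenvalues ascending; eigenvectors in columns
guess = np.sign(V[:, -2])

# Recovery accuracy, up to the global sign ambiguity of an eigenvector.
acc = max(np.mean(guess == labels), np.mean(guess == -labels))
print(acc)
```

When the gap between p_in and p_out is large relative to the noise, as here, the sign pattern of the second eigenvector recovers nearly all labels; shrinking the gap reproduces the detectability threshold phenomena studied later in the course.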
There is no textbook that covers this material, so we will sample from different sources (and prepare instructor lecture notes along the way). Because this is a brand new experimental course, which has never been taught at UCSD (or anywhere else), it is hard to say in advance what the schedule of topics will be. Instead, here is a point-form list of the topics we intend to cover, in order.
Prerequisite: The prerequisites are Linear Algebra (MATH 18) and Probability Theory (MATH 180A). MATH 109 (Mathematical Reasoning) is also strongly recommended as a prerequisite or corequisite. Also: MATH 102 (Applied Linear Algebra) would be beneficial, but is not required. For the lab component of the course, some familiarity with Python and MATLAB is helpful, but not required.
Lecture: Attending the lecture is a fundamental part of the course; you are responsible for material presented in the lecture whether or not it is discussed in the notes. You should expect questions on the homework and exams that will test your understanding of concepts discussed in the lecture.
Homework: Homework assignments are posted below, and will be due at 11:59pm on the indicated due date. You must turn in your homework through Gradescope; if you have produced it on paper, you can scan it or simply take clear photos of it to upload. It is allowed and even encouraged to discuss homework problems with your classmates and your instructor and TA, but your final write up of your homework solutions must be your own work.
Labs: The data science labs are accessible through DataHub. The turn-in components should be exported as pdf files and turned in through Gradescope; they are due at 11:59pm on the dates indicated on the labs.
Lab Project: You will choose a real-world high-dimensional data set, and implement the PCA algorithm to analyze it. You will use the tools explored in this class to give a careful analysis of how the PCA algorithm performed, what it discovered about the data, and what structural shortcomings were evident in the analysis. Topics and data sets are to be approved by the instructor.
Take-Home Midterm Exam: There will be a single take-home midterm exam, available immediately after the lecture on Monday, February 10, due the following day before midnight. You are free to use any paper / online resources you like during the exam, but collaboration with other people is not allowed. This will be enforced by the honor system; be warned, we will grade carefully, looking for evidence of collaboration on the exam, and any suspicious cases will be reported as academic integrity violations (with likely severe penalties).
Final Exam: The final examination will be held at the date and time stated above.
Administrative Links: Here are two links regarding UC San Diego policies on exams:
Grading: Your cumulative average will be determined by whichever of the following four weighted averages is higher (for you):
Your course grade will be determined by your cumulative average at the end of the quarter, and will be based on the following scale:
The above scale is guaranteed: for example, if your cumulative average is 80, your final grade will be at least B-. However, your instructor may adjust the above scale to be more generous.
Academic Integrity: UC San Diego's code of academic integrity outlines the expected academic honesty of all students and faculty, and details the consequences for academic dishonesty. The main issues are cheating and plagiarism, of course, for which we have a zero-tolerance policy. (Penalties for these offenses always include assignment of a failing grade in the course, and usually involve an administrative penalty, such as suspension or expulsion, as well.) However, academic integrity also includes things like giving credit where credit is due (listing your collaborators on homework assignments, noting books or papers containing information you used in solutions, etc.), and treating your peers respectfully in class. In addition, here are a few of our expectations for etiquette in and out of class.
Weekly homework assignments are posted here. Homework is due by 11:59pm on the posted date, through Gradescope. Late homework will not be accepted.