D.N. Politis: Current Classes

============================================================================================

Winter 2020: MATH 287A "Time Series Analysis"

Time/Room: 11am at APM 7321

Instructor: Dimitris Politis

Office hours: Fri 4:30-6:30pm at SDSC 217E or by appointment (email: dpolitis@ucsd.edu) --- NO OFFICE HOURS ON FRIDAY JAN. 31

TA: Yiren Wang (email: yiw518@ucsd.edu);

Office hours: Tuesday 2-4pm at APM 1131

Weak and strict stationarity of time series, i.e. stochastic processes in discrete time. Breakdowns of stationarity and remedies (differencing, trend estimation, etc.). Optimal (from the point of view of Mean Squared Error) one-step-ahead prediction and interpolation; connection with Hilbert space methods, e.g. orthogonal projection. Autoregressive (AR), Moving Average (MA), and Autoregressive-Moving Average (ARMA) models for stationary time series; causality, invertibility, and spectral density. Maximum Entropy models for time series, and Kolmogorov's formula for prediction error. Estimation of the ARMA parameters given data; determination of the ARMA model order using criteria such as the AIC and BIC. Nonparametric estimation of the mean, the autocovariance function, and the spectral density in the absence of model assumptions. Confidence intervals for the estimated quantities via asymptotic normality and/or bootstrap methods. Prerequisite: a basic statistics and probability course or instructor consent.
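
To make the order-selection part of the description concrete, here is a minimal R sketch (not part of the course materials): it simulates an AR(2) series and compares candidate ARMA(p,q) fits by AIC. The series length, the AR coefficients, and the cutoff p, q <= 2 are arbitrary choices for illustration.

# Minimal R sketch: ARMA order selection by AIC (illustrative only)
set.seed(1)
x <- arima.sim(model = list(ar = c(0.5, -0.3)), n = 200)   # simulated AR(2) series

# fit all ARMA(p,q) models with p, q <= 2 and record the AIC of each fit
aic.table <- matrix(NA, 3, 3, dimnames = list(paste("p =", 0:2), paste("q =", 0:2)))
for (p in 0:2) for (q in 0:2)
  aic.table[p + 1, q + 1] <- AIC(arima(x, order = c(p, 0, q)))
aic.table   # the smallest entry indicates the selected order; BIC() can be used analogously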

287A Textbooks:

Required textbook: TIME SERIES: THEORY AND METHODS, 2nd ed., by Brockwell and Davis, Springer (1991)

Recommended textbook: TIME SERIES: A FIRST COURSE WITH BOOTSTRAP STARTER, by T.S. McElroy and D.N. Politis, Chapman and Hall/CRC Press, 2020.

GRADES:

HW = 40%, Midterm (in-class) = 20%, Final (take home) = 40%

HW1--due Monday Jan 27

(From Brockwell and Davis): Ch. 1, ex. 4, 7, 8, 10, 11, 12, 13. Ch. 2, ex. 6, 10, 12, 15, 21.

HW2--due Wed Feb 19

(From Brockwell and Davis): Ch. 3, ex. 2, 4, 5, 9, 10, 11, 13, 14, 15, 16, 17, 19, 22, 23.

HW3--due Wed Mar 4

(From Brockwell and Davis): Ch. 4, ex. 2, 3, 6, 7, 9, 11, 16, 17, 18, 19.

MIDTERM IN-CLASS: Mon Mar 2, 11am. Closed book; one sheet of notes (2-sided) allowed.


Download the 287A FINAL EXAM. New rule 3/12/20: DO NOT SUBMIT A HARD COPY! Please submit your final exam by email to dpolitis@ucsd.edu, with cc to yiw518@ucsd.edu. DUE DATE: Wed Mar 18 by 2pm.

------------------------------------------------------------------------------------------------------------------------------

Fall 2019: MATH 282A "Applied Statistics"

Time/Room: 11am at APM 7321

Instructor: Dimitris Politis

Office hours: Fri 4:30-6:30pm at SDSC 217E or by appointment (email: dpolitis@ucsd.edu)

TA: Yiren Wang (email: yiw518@ucsd.edu); office hours: Thu 3-5pm at APM 1131

The theory and practice of linear regression and linear models will be discussed. Least squares will be defined by means of projections in n-dimensional Euclidean space. Choosing the regression model (its dimension, etc.) will also be addressed. Prerequisite: a basic statistics course and linear algebra, or instructor consent.
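
To complement the projection viewpoint mentioned above, here is a minimal R sketch (on simulated data, not course material) checking that the least-squares fit from lm() coincides with the orthogonal projection of y onto the column space of the design matrix.

# Minimal R sketch: least squares as orthogonal projection (illustrative only)
set.seed(1)
n <- 50
x <- runif(n)
y <- 1 + 2 * x + rnorm(n)                # simulated straight-line data

X <- cbind(1, x)                         # design matrix with an intercept column
H <- X %*% solve(t(X) %*% X) %*% t(X)    # hat (projection) matrix onto col(X)
yhat.proj <- H %*% y                     # orthogonal projection of y
yhat.lm <- fitted(lm(y ~ x))             # fitted values from lm()

max(abs(yhat.proj - yhat.lm))            # essentially zero: the two computations agree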

282A Textbooks:

Seber and Lee, Linear Regression Analysis, 2nd ed., Wiley, 2003 (required)
Draper and Smith, Applied Regression Analysis, 5th ed., Wiley (recommended)

HW1: due in class Monday Oct 28 (all HW is from Seber and Lee book).

CHAPTER 1. Set 1a: ex. 3. Set 1b: ex. 2, 5. Set 1c: ex. 1, 3. Misc. Set: ex. 3, 5.
CHAPTER 2. Set 2b: ex. 6. Set 2c: ex. 2. Set 2d: ex. 4. Misc. Set: ex. 1, 13, 15, 17.
PLUS: Work out (with proof) the conditional distribution of Y1 given Y2=y2, where Y1 and Y2 are two random vectors such that the concatenated vector Y=[Y1' Y2']' is multivariate normal with some mean \theta and invertible covariance matrix \Sigma.
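
For reference only (this is the standard result that the proof should establish, with \theta and \Sigma partitioned conformably with Y = [Y1' Y2']'):

Y_1 | Y_2 = y_2 ~ N( \theta_1 + \Sigma_{12} \Sigma_{22}^{-1} (y_2 - \theta_2),  \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21} ).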

HW2: due in class. NEW DUE DATE: Wed Nov 20.

CHAPTER 3. Set 3a: ex. 2, 3, 7; Set 3b: ex. 1, 3, 4; Set 3c: ex. 1, 2; Set 3d: ex. 1, 2; Set 3e: ex. 1, 2; Set 3g: ex. 2, 3; Set 3h: ex. 2; Set 3k: ex. 2, 5.

GRADES:

HW = 20%, Midterm (in-class) = 30%, Final (take home) = 50%

Midterm (in-class) will be on Monday Nov. 25. Closed book, with one sheet of notes (2-sided) allowed.

Bring a Blue book for the midterm!
During the week of Nov 18, Dr. Politis's office hours will take place MW 12-12:50 and F 12-12:20 at APM 5701. The regular Friday office hour is cancelled.
No class on Wednesday, Nov 27.

Handout: Three Pythagorean theorems

Final exam FALL 2019 (due by noon Monday Dec. 9). Download the take-home final exam on Nov. 30.

DO NOT COLLABORATE WITH ANYBODY ON THE FINAL!

============================================================================================

Winter 2011: MATH 282B "Applied Statistics"

Time: 11am-12pm

Room: APM 5829

Office hours: TBA or by appointment (email: dpolitis@ucsd.edu)

Departures from underlying assumptions in regression. Transformations and Generalized Linear Models. Model selection. Nonlinear regression. Introduction to nonparametric regression. Prerequisite: 282A
Check out the R handout (2010). Click here to download R to your computer.
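
As a small illustration of the Generalized Linear Models topic above, here is a minimal R sketch (simulated data, not part of the handout) fitting a Poisson regression with glm().

# Minimal R sketch: fitting a GLM (Poisson regression) on simulated data
set.seed(1)
x <- runif(100)
y <- rpois(100, lambda = exp(0.5 + 1.5 * x))   # counts generated under a log-link model

fit <- glm(y ~ x, family = poisson(link = "log"))
summary(fit)           # estimated coefficients should be near the true values 0.5 and 1.5
plot(fit, which = 1)   # residual diagnostic plot (residuals vs. fitted values)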

282B Textbooks (recommended):

Elements of Statistical Learning, by Hastie, Tibshirani and Friedman, Springer (2002) [2nd edition available free online!]
Nonlinear Regression Analysis and Its Applications, by Bates and Watts, Wiley (1988)
Nonlinear Regression, by Seber and Wild, Wiley (2003)
Nonparametric Smoothing and Lack-of-Fit Tests, by J. Hart, Springer (1997)

282B COMPUTATIONAL HOMEWORK PROBLEM : (due Wed Feb 23, 2011)

Consider the model: Y=f(X)+error where f is a polynomial of finite (but unknown) degree.

The data (with n=21) are below:

Y=(-5.07 -3.63 -1.92 -0.72 -0.79 -2.00 0.69 0.63 -0.92 1.05 -1.33 0.04 -0.14 -2.10 -0.43 -0.71 1.10 0.75 3.36 4.17 4.57)

and X=(-2.0 -1.8 -1.6 -1.4 -1.2 -1.0 -0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8 1.0 1.2 1.4 1.6 1.8 2.0)

a) Use the following four methods to pick the optimal order of the polynomial to be fitted: residual diagnostic plots (searching for patterns), forward F-tests, backward F-tests, and Mallows Cp. Do all four give the same answer? Explain.

b) With the optimal order obtained from Mallows Cp, fit the model and get parameter estimates (including the error variance). If in your final model you see some parameter estimates that are close to zero, perform a test to see whether they are significantly different from zero or not (in which case they should not be included in the model).

c) In your final model, do diagnostic plots to confirm your assumptions (what are they?) on the errors.

d) Apply ridge regression to the problem at hand; produce figures to show how the ridge regression parameter estimates change as the constraint on the L2 norm becomes bigger.

e) Repeat part (d) with the lasso instead of ridge regression, i.e., an L1 norm constraint instead of L2, and compare with the figures from part (d). Do the lasso figures point to the same model that you obtained in part (a)?
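
As a starting point only (not a solution), here is a minimal R sketch showing one way to enter the data above and compare candidate polynomial degrees by Mallows Cp; the maximal degree kmax = 6 is an arbitrary choice, and the error-variance estimate is taken from the largest model considered.

# Minimal R sketch: data entry and Mallows Cp for polynomial degrees (not a full solution)
Y <- c(-5.07, -3.63, -1.92, -0.72, -0.79, -2.00, 0.69, 0.63, -0.92, 1.05, -1.33,
       0.04, -0.14, -2.10, -0.43, -0.71, 1.10, 0.75, 3.36, 4.17, 4.57)
X <- seq(-2, 2, by = 0.2)                        # n = 21 equally spaced design points

kmax <- 6                                        # largest degree considered (arbitrary)
s2 <- summary(lm(Y ~ poly(X, kmax)))$sigma^2     # error-variance estimate from the biggest model
n <- length(Y)

# Mallows Cp for polynomial fits of degree 1, ..., kmax (p = k + 1 parameters incl. intercept)
Cp <- sapply(1:kmax, function(k) {
  fit <- lm(Y ~ poly(X, k))
  sum(resid(fit)^2) / s2 - n + 2 * (k + 1)
})
cbind(degree = 1:kmax, Cp = round(Cp, 2))        # degrees with Cp close to k + 1 look adequate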

============================================================================================

Winter 2008: MATH 281B "Mathematical Statistics"

Instructor: Dimitris Politis, dpolitis@ucsd.edu ; Office hours: MW 1:30-3pm at APM 5747

TA: Michael Scullard, mscullar@math.ucsd.edu ; Office hour: F 1-2pm at APM 6333

GRADES:

HW = 35%, Midterm (in-class) = 25%, Final (take home) = 40%

Homework:

(HW1 and HW2!)

Final exam:

(final exam!)

Handout:

Asymptotic tools handout

=================================================================================

Winter 2018: MATH 287A "Time Series Analysis"

Time/Room: 2pm at APM 2402

Office hours: MWF 10am-10:50am or by appointment (email: dpolitis@ucsd.edu). Office location: APM 5701

TA: Ashley Chen (email: jic102@ucsd.edu); office hour: Thu 4-5pm; office: SDSC 294E

Weak and strict stationarity of time series, i.e. stochastic processes in discrete time. Breakdowns of stationarity and remedies (differencing, trend estimation, etc.). Optimal (from the point of view of Mean Squared Error) one-step-ahead prediction and interpolation; connection with Hilbert space methods, e.g. orthogonal projection. Autoregressive (AR), Moving Average (MA), and Autoregressive-Moving Average (ARMA) models for stationary time series; causality, invertibility, and spectral density. Maximum Entropy models for time series, and Kolmogorov's formula for prediction error. Estimation of the ARMA parameters given data; determination of the ARMA model order using criteria such as the AIC and BIC. Nonparametric estimation of the mean, the autocovariance function, and the spectral density in the absence of model assumptions. Confidence intervals for the estimated quantities via asymptotic normality and/or bootstrap methods. Prerequisite: a basic statistics and probability course or instructor consent.
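
As a small illustration of the nonparametric-estimation topic in the description, here is a minimal R sketch (simulated data, not course material) computing the sample autocorrelation function and a smoothed-periodogram estimate of the spectral density.

# Minimal R sketch: nonparametric ACF and spectral density estimates (illustrative only)
set.seed(1)
x <- arima.sim(model = list(ma = 0.8), n = 400)   # simulated MA(1) series

acf(x, lag.max = 20)                          # sample autocorrelation function
spec.pgram(x, spans = c(7, 7), log = "no")    # periodogram smoothed with a Daniell kernel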

287A Textbooks:

Time Series: Theory and Methods, 2nd ed., by Brockwell and Davis, Springer (1991) (required)

Hamilton, Time Series Analysis, and Priestley, Spectral Analysis and Time Series (recommended)

See also the Handout on Inverse Covariance and eigenvalues of Toeplitz matrices. For the handout, the convention f(w)=\sum_k \gamma (k) exp{ikw} is used (without the 2\pi factor).
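
For reference (my reading of the convention note above, using the symmetry \gamma(k) = \gamma(-k)): the handout's f(w) = \sum_k \gamma(k) exp{ikw} equals 2\pi times the spectral density in the Brockwell-Davis convention, namely f_{BD}(w) = (2\pi)^{-1} \sum_k \gamma(k) exp{-ikw}.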

GRADES:

HW = 40%, Midterm (in-class) = 20%, Final (take home) = 40%

MIDTERM WILL BE IN-CLASS, WED FEB 21. Closed book, one sheet of notes (2-sided) allowed.


HW1--due Monday Jan 29

(From Brockwell and Davis): Ch. 1, ex. 4, 7, 8, 10, 11, 12, 13. Ch. 2, ex. 6, 10, 12, 15, 21.

NOTE: partial solutions for 287A HW1 and partial solutions for 287A HW2 ---DO NOT CIRCULATE!!

HW3--due Monday March 5

(From Brockwell and Davis): From Ch. 4: do exercises 7, 9, 10, 13, 16, 17, 19; and also do the following exercise:

Let X_t=Y_t+W_t where the Y series is independent of the W series. Assume Y_t satisfies an AR(1) model (with respect to some white noise), and W_t satisfies a different AR(1) model (with respect to some other white noise). Show that X_t is not AR(1) but it is ARMA(p,q) and identify p and q. [Hint: show that the spectral density of X_t is of the form of an ARMA(p,q) spectral density.]
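
One way to read the hint (notation mine: \phi and \psi are the two AR(1) coefficients, \sigma_Y^2 and \sigma_W^2 the two white-noise variances; the spectral densities add because Y and W are independent):

f_X(w) = f_Y(w) + f_W(w) = \sigma_Y^2 / ( 2\pi |1 - \phi exp{-iw}|^2 ) + \sigma_W^2 / ( 2\pi |1 - \psi exp{-iw}|^2 ),

and putting the two terms over a common denominator exhibits the ratio-of-squared-moduli form of an ARMA spectral density.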

Download the FINAL EXAM.

Spring 2018: MATH 287C "Advanced Time Series"

287C textbooks (recommended but NOT required!):

Fan, J. and Yao, Q. (2003). "Nonlinear Time Series", Springer, New York.

GARCH Models: Structure, Statistical Inference and Financial Applications by Christian Francq and Jean-Michel Zakoian (2010), John Wiley.

287C Homework--- due Wednesday May 2, 2018.

THIS IS THE NEW HOMEWORK of 2018!

(Homework)

287C Projects will be due Monday June 4, 2018, and will also be presented in class during the last week of classes.

-------------------------------------------------------------------------
