============================================================================================

Time/Room: 1pm at APM B412

Office hours: MWF 10:30-11:30am or by appointment (email: dpolitis@ucsd.edu). Office location: APM 5701.

TA: Srinjoy Das (email: srinjoyd@gmail.com); office hours: Thu 10-11am at APM 5768

The theory and practice of linear regression
and linear models will be discussed. Least
squares will be defined by means of projections
in n-dimensional Euclidean space. Choosing the
regression model (its dimension, etc.) will also
be addressed.
Prerequisite: a basic statistics course and linear algebra, or instructor consent.

Draper and Smith, Applied regression analysis, 3rd ed., Wiley (recommended)

Announcements:

Midterm in-class on Monday 20 November (closed book, but you may bring one sheet of notes, two-sided). Bring a blue book for the midterm!

On Monday Nov 20 the office hour of Dr. Politis (10:30 to 11:30am) will be held by Srinjoy Das instead; please go to APM 5768

No class on Wednesday 22 November

GRADES:

HW = 20%, Midterm (in-class) = 30%, Final (take home) = 50%. DO NOT COLLABORATE WITH ANYBODY ON THE FINAL!

---------------------------------------------------------------------------------------

Time: 11am-12pm

Room: APM 5829

Office hours: TBA or by appointment (email: dpolitis@ucsd.edu)

Departures from underlying assumptions in regression. Transformations
and Generalized Linear Models. Model selection.
Nonlinear regression. Introduction to nonparametric regression.
Prerequisite: 282A

Check out the R handout (2010). R can be downloaded to your computer.

Nonlinear Regression Analysis and Its Applications, by Bates and Watts, Wiley (1988)

Nonlinear Regression, by Seber and Wild, Wiley (2003)

Nonparametric Smoothing and Lack-of-Fit Tests, by J. Hart, Springer (1997)

Consider the model: Y=f(X)+error where f is a polynomial of finite (but unknown) degree.

The data (with n=21) are below:

Y=(-5.07 -3.63 -1.92 -0.72 -0.79 -2.00 0.69 0.63 -0.92 1.05 -1.33 0.04 -0.14 -2.10 -0.43 -0.71 1.10 0.75 3.36 4.17 4.57)

and X=(-2.0 -1.8 -1.6 -1.4 -1.2 -1.0 -0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8 1.0 1.2 1.4 1.6 1.8 2.0)

a) Use the four methods (residual diagnostic plots inspected for patterns, forward F-tests, backward F-tests, and Mallows Cp) to pick the optimal order of the polynomial to be fitted. Do all four give the same answer? Explain.
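As a rough illustration of the Mallows Cp method, here is a minimal sketch in Python/numpy (the course materials use R, so treat this as a translation; capping the candidate degree at 6 is an illustrative choice, not part of the assignment). It fits polynomials of each degree by least squares and computes Cp = RSS_p/s^2 - n + 2p, with s^2 taken from the largest candidate model:

```python
import numpy as np

X = np.linspace(-2.0, 2.0, 21)  # the 21 design points from the problem
Y = np.array([-5.07, -3.63, -1.92, -0.72, -0.79, -2.00, 0.69, 0.63, -0.92,
              1.05, -1.33, 0.04, -0.14, -2.10, -0.43, -0.71, 1.10, 0.75,
              3.36, 4.17, 4.57])
n = len(Y)
max_deg = 6  # illustrative cap on the candidate polynomial degrees

# residual sum of squares for each polynomial degree
rss = {}
for d in range(max_deg + 1):
    A = np.vander(X, d + 1)  # design matrix with columns X^d, ..., X, 1
    beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
    rss[d] = float(np.sum((Y - A @ beta) ** 2))

# error variance estimated from the biggest model considered
s2 = rss[max_deg] / (n - (max_deg + 1))

# Mallows Cp with p = d + 1 fitted parameters
cp = {d: rss[d] / s2 - n + 2 * (d + 1) for d in rss}
best = min(cp, key=cp.get)
print({d: round(v, 2) for d, v in cp.items()}, "-> candidate degree:", best)
```

For the full model, Cp equals the number of fitted parameters by construction, so degrees whose Cp falls near p = d + 1 are the candidates worth inspecting.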

b) With the optimal order obtained from Mallows Cp, fit the model and get parameter estimates (including the error variance). If in your final model you see some parameter estimates that are close to zero, perform a test to see whether they are significantly different from zero (if not, they should not be included in the model).

c) In your final model, do diagnostic plots to confirm your assumptions (what are they?) on the errors.

d) Apply ridge regression to the problem at hand; produce figures to show how the estimated ridge regression parameters change as the constraint on the L2 norm becomes bigger.
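A minimal sketch of the ridge path (ridge penalizes the squared L2 norm of the coefficients; the degree-3 design and the lambda grid below are illustrative assumptions, not part of the assignment), using the closed form beta(lambda) = (A'A + lambda*I)^{-1} A'Y:

```python
import numpy as np

X = np.linspace(-2.0, 2.0, 21)  # the 21 design points from the problem
Y = np.array([-5.07, -3.63, -1.92, -0.72, -0.79, -2.00, 0.69, 0.63, -0.92,
              1.05, -1.33, 0.04, -0.14, -2.10, -0.43, -0.71, 1.10, 0.75,
              3.36, 4.17, 4.57])
deg = 3                        # illustrative polynomial degree
A = np.vander(X, deg + 1)      # columns X^3, X^2, X, 1

lambdas = np.logspace(-3, 3, 25)  # ridge penalties, small to large
path = np.array([np.linalg.solve(A.T @ A + lam * np.eye(deg + 1), A.T @ Y)
                 for lam in lambdas])

# each row of `path` holds the ridge estimates for one lambda;
# plotting the columns of `path` against lambdas shows the shrinkage to zero
print(path[0].round(3))
print(path[-1].round(3))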

e) Repeat part (d) with the lasso instead of ridge regression, i.e., an L1 penalty instead of L2, and compare with the figures from part (d). Do the lasso figures point to the same model that you obtained in part (a)?

============================================================================================

Instructor: Dimitris Politis, dpolitis@ucsd.edu ; Office hours: MW 1:30-3pm at APM 5747

TA: Michael Scullard, mscullar@math.ucsd.edu ; Office hour: F 1-2pm at APM 6333

=================================================================================

Time/Room: 2pm at APM 2402

Office hours: MWF 10-10:50am or by appointment (email: dpolitis@ucsd.edu). Office location: APM 5701.

TA: Ashley Chen (email: jic102@ucsd.edu); office hour: Thu 4-5pm at SDSC 294E

Weak and strict stationarity of time series, i.e. stochastic processes in discrete time.
Breakdowns of stationarity and remedies (differencing, trend estimation, etc.).
Optimal (from the point of view of Mean Squared Error) one-step-ahead prediction
and interpolation; connection with Hilbert space methods, e.g. orthogonal projection.
Autoregressive (AR), Moving Average (MA), and Autoregressive-Moving Average (ARMA)
models for stationary time series; causality, invertibility, and spectral density.
Maximum Entropy models for time series, and Kolmogorov's formula for prediction error.
Estimation of the ARMA parameters given data; determination of the ARMA model order
using criteria such as the AIC and BIC.
Nonparametric estimation of the mean, the autocovariance function, and
the spectral density in the absence of model assumptions.
Confidence intervals for the estimated quantities via asymptotic normality and/or
bootstrap methods.
Prerequisite: a basic statistics and probability course, or instructor consent.

Hamilton: Time Series Analysis
and Priestley: Spectral Analysis and Time Series (recommended)

See also the handout on Inverse Covariance and Eigenvalues of Toeplitz Matrices.
The handout uses the convention f(w) = \sum_k \gamma(k) \exp(ikw) (without the 2\pi factor).
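Under this convention, a causal AR(1) with autocovariance gamma(k) = sigma^2 phi^|k| / (1 - phi^2) has spectral density f(w) = sigma^2 / |1 - phi exp(iw)|^2. A quick numeric sanity check (the value of phi and the truncation point K are illustrative):

```python
import numpy as np

phi, sigma2 = 0.6, 1.0           # illustrative AR(1) parameters
w = np.linspace(-np.pi, np.pi, 200)
K = 200                          # truncation point for the sum over k

# autocovariance gamma(k) of a causal AR(1)
gamma = lambda k: sigma2 * phi ** abs(k) / (1 - phi ** 2)

# f(w) = sum_k gamma(k) exp(ikw), without the 2*pi factor
f_sum = sum(gamma(k) * np.exp(1j * k * w) for k in range(-K, K + 1)).real
f_closed = sigma2 / np.abs(1 - phi * np.exp(1j * w)) ** 2

print(np.max(np.abs(f_sum - f_closed)))  # truncation error only
```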

GRADES:

HW = 40%, Midterm (in-class) = 20%, Final (take home) = 40%. MIDTERM WILL BE IN-CLASS, WED FEB 21. Closed book; one sheet of notes (two-sided) allowed.

NOTE: partial solutions for 287A HW1 and partial solutions for 287A HW2 --- DO NOT CIRCULATE!

Let X_t = Y_t + W_t, where the Y series is independent of the W series.
Assume Y_t satisfies an AR(1) model (with respect to some white noise),
and W_t satisfies a different AR(1) model (with respect to some other white noise).
Show that X_t is not AR(1) but is ARMA(p,q), and identify p and q.
[Hint: show that the spectral density of X_t has the form of an
ARMA(p,q) spectral density.]
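The hint can also be checked numerically (a sketch with illustrative AR coefficients, not a proof): gamma_X(h) = gamma_Y(h) + gamma_W(h) satisfies the AR(2) recursion built from (1 - phi1*B)(1 - phi2*B) for every h >= 2 but fails it at h = 1, which is the autocovariance signature of an ARMA(2,1) process:

```python
phi1, phi2 = 0.5, -0.3   # illustrative AR(1) coefficients for Y and W
s1, s2 = 1.0, 2.0        # innovation variances of the two white noises

# AR(1) autocovariance: gamma(h) = sigma^2 * phi^|h| / (1 - phi^2)
gY = lambda h: s1 * phi1 ** abs(h) / (1 - phi1 ** 2)
gW = lambda h: s2 * phi2 ** abs(h) / (1 - phi2 ** 2)
gX = lambda h: gY(h) + gW(h)     # Y and W are independent

# AR(2) polynomial (1 - phi1*B)(1 - phi2*B) = 1 - a1*B - a2*B^2
a1, a2 = phi1 + phi2, -phi1 * phi2

# the recursion holds for h >= 2 ...
resid2 = [gX(h) - a1 * gX(h - 1) - a2 * gX(h - 2) for h in range(2, 10)]
# ... but fails at h = 1, reflecting the MA(1) part of X
resid1 = gX(1) - a1 * gX(0) - a2 * gX(-1)

print(max(abs(r) for r in resid2), resid1)
```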

Download the
FINAL EXAM.

GARCH Models: Structure, Statistical Inference and Financial Applications by Christian Francq and Jean-Michel Zakoian (2010), John Wiley.

THIS IS THE NEW HOMEWORK of 2018!

-------------------------------------------------------------------------