## Module I: Preliminaries of Nonparametric Regression

- Introduction: course overview; example tasks
- Optimal Predictions and Measures of Accuracy: loss functions; predictive risk; bias-variance trade-off
- Linear Smoothers: definition; basic examples
- A First Look at Shrinkage Methods: ridge regression; lasso
- Choosing the Smoothing Parameter: analytic approaches; cross validation
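
As a rough illustration of the shrinkage methods listed above, here is a minimal NumPy sketch of ridge regression via its closed form, beta_hat = (XᵀX + λI)⁻¹Xᵀy (synthetic data; all names are illustrative, not from the course materials):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge coefficient estimate for penalty lam >= 0 (lam = 0 gives OLS)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=100)

beta_ols = ridge(X, y, lam=0.0)     # ordinary least squares
beta_ridge = ridge(X, y, lam=10.0)  # larger lam shrinks coefficients toward 0
print(np.linalg.norm(beta_ridge), np.linalg.norm(beta_ols))
```

Increasing λ trades higher bias for lower variance, which is the bias-variance trade-off the first lecture covers; in practice λ would be chosen by cross validation as in the last topic above.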

### Lectures:

- 1. Apr 2: Intro. Optimal predictions, predictive performance, bias-variance tradeoff.
  [Intro, predictions, bias-variance tradeoff slides] [Intro, predictions, bias-variance tradeoff annotated slides]
- 2. Apr 4: Linear smoothers, ridge regression, LASSO.
  [Linear smoothers, ridge regression, LASSO slides] [Linear smoothers, ridge regression, LASSO annotated slides]
- 3. Apr 9: LASSO cont'd.
  [LASSO cont'd slides] [LASSO cont'd annotated slides]

## Module II: Splines and Kernel Methods

- Introduction: brief overview
- Spline Methods: piecewise polynomials; natural cubic splines; smoothing splines; B-splines; penalized regression splines
- Kernel Methods: kernel density estimation; the Nadaraya-Watson kernel estimator; local polynomial regression
- Inference for Linear Smoothers: variance estimation; confidence bands
- Spline and Kernel Methods for GLMs: extensions of spline and kernel methods to binomial, Poisson, gamma, and other non-Gaussian data
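
As a quick illustration of the kernel methods above, a minimal sketch of the Nadaraya-Watson estimator with a Gaussian kernel, m̂(x) = Σᵢ K_h(x − xᵢ) yᵢ / Σᵢ K_h(x − xᵢ) (synthetic data; names are illustrative only):

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate of E[Y | X = x0] with bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(1)
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=200)

print(nw_estimate(np.pi / 2, x, y, h=0.3))  # estimate near sin(pi/2) = 1
```

The estimator is a linear smoother in the sense of Module I: the fitted value is a weighted average of the yᵢ with weights that depend only on the xᵢ. The bandwidth h plays the same smoothing-parameter role as λ did for penalized methods.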

### Lectures:

- 4. Apr 11: Smoothing parameters, spline intro
  [Smoothing parameters, spline slides] [Smoothing parameters, spline annotated slides]
- 5. Apr 16: B-splines, penalized regression splines, kernel methods intro
  [B-splines, penalized regression splines, kernel methods intro slides] [B-splines, penalized regression splines, kernel methods intro annotated slides]
- 6. Apr 18: Local polynomial regression, KDE, confidence bands
  [Local polynomial regression, KDE, confidence bands slides] [Local polynomial regression, KDE, confidence bands annotated slides]
- BONUS. May 7: Nonparametrics for GLMs.
  [Nonparametrics for GLMs slides] [Nonparametrics for GLMs annotated slides]

## Module III: Bayesian Nonparametrics

- Introduction: principles of Bayesian nonparametrics
- Regression via Gaussian processes
- Density estimation via Dirichlet process mixture of Gaussians
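
As a small illustration of Gaussian process regression, a sketch of the posterior mean at test points, K✶(K + σ²I)⁻¹y, under a squared-exponential kernel (synthetic data; the kernel, lengthscale, and noise level here are illustrative choices, not the course's defaults):

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=0.1, length=1.0):
    """GP regression posterior mean: K_* (K + noise^2 I)^{-1} y."""
    K = rbf(x_train, x_train, length) + noise**2 * np.eye(len(x_train))
    K_star = rbf(x_test, x_train, length)
    return K_star @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(2)
x_train = np.linspace(0, 5, 30)
y_train = np.sin(x_train) + rng.normal(scale=0.1, size=30)

print(gp_posterior_mean(x_train, y_train, np.array([2.5])))
```

The noise variance and kernel lengthscale are the hyperparameters whose selection is the subject of Lecture 8; they are fixed by hand here.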

### Lectures:

- 7. Apr 23: Variance estimation (cont'd from previous module), intro to Gaussian processes
  [Intro to Gaussian processes slides] [Intro to Gaussian processes annotated slides]
- 8. Apr 25: Gaussian processes cont'd, selecting hyperparameters
  [Gaussian processes, hyperparameters slides] [Gaussian processes, hyperparameters annotated slides]
- 9. Apr 30: Examples section (see notes under HW4)
- 10. May 2: Gaussian processes recap, finite mixture models
  [GP recap, mixture models slides] [GP recap, mixture models annotated slides]
- 11. May 7: Dirichlet process mixture models
  [Dirichlet process mixture models slides] [Dirichlet process mixture models annotated slides]

## Module IV: Nonparametrics with Multiple Predictors

- Introduction: issues when considering multiple predictors
- Generalized Additive Models: GAMs; the backfitting algorithm
- Spline Methods in Several Variables: natural thin plate splines; thin plate regression splines; tensor product splines
- Kernel Methods in Several Variables: extending kernel methods to multidimensional covariates
- Smoothing Parameter Estimation: how to choose level of smoothing in more than one dimension
- Regression Trees: partitioning the covariate space
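
As a rough sketch of the backfitting algorithm for an additive model y = α + f₁(x₁) + f₂(x₂) + ε: each fⱼ is updated in turn by smoothing the partial residuals. The crude moving-average smoother below is a stand-in for the spline or kernel smoothers covered in lecture; data and names are illustrative.

```python
import numpy as np

def smooth(x, r, window=15):
    """Crude moving-average smoother of residuals r against covariate x."""
    order = np.argsort(x)
    sm = np.convolve(r[order], np.ones(window) / window, mode="same")
    out = np.empty_like(r)
    out[order] = sm
    return out

rng = np.random.default_rng(3)
n = 300
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = np.sin(3 * x1) + x2**2 + rng.normal(scale=0.1, size=n)

alpha = y.mean()
f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(20):                  # cycle until (approximate) convergence
    f1 = smooth(x1, y - alpha - f2)  # update f1 holding f2 fixed
    f1 -= f1.mean()                  # center for identifiability
    f2 = smooth(x2, y - alpha - f1)  # update f2 holding f1 fixed
    f2 -= f2.mean()

resid = y - alpha - f1 - f2
print(np.std(resid))  # residual spread; well below the spread of y itself
```

Centering each fⱼ after its update keeps the components identifiable, since otherwise a constant can be shifted freely between α, f₁, and f₂.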

### Lectures:

- 12. May 9: Multidimensional splines
  [Multidimensional splines slides] [Multidimensional splines annotated slides]
- 13. May 14: Multidimensional kernel methods, projection pursuit
  [Multidimensional kernel method and projection pursuit slides] [Multidimensional kernel method and projection pursuit annotated slides]
- 14. May 16: Regression trees
  [Regression tree slides] [Regression tree annotated slides]

## Module V: Classification

- Logistic Regression
- Bayes Classifiers: linear and quadratic classifiers; naive Bayes classifiers using KDE
- Perceptrons for online learning and SVMs
- Boosting
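
As a small illustration of the online-learning material, a sketch of the classic perceptron mistake-driven update, w ← w + yᵢxᵢ on each misclassified point with labels in {−1, +1} (synthetic linearly separable data; names are illustrative):

```python
import numpy as np

def perceptron(X, y, epochs=50):
    """Train a perceptron; returns the weight vector (bias folded in)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:  # mistake: update toward correct side
                w += yi * xi
    return w

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 2))
y = np.where(X @ np.array([1.0, -2.0]) + 0.5 > 0, 1, -1)  # separable labels

w = perceptron(X, y)
Xb = np.hstack([X, np.ones((200, 1))])
acc = np.mean(np.sign(Xb @ w) == y)
print(acc)
```

On linearly separable data the number of updates is bounded, which is the classical perceptron convergence guarantee; kernelizing this update is the bridge to the SVM lectures.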

### Lectures:

- 15. May 21: MARS, classification trees
  [MARS, classification trees slides] [MARS, classification trees annotated slides]
- 16. May 23: Classification intro, logistic regression
  [Classification intro, logistic regression slides] [Classification intro, logistic regression annotated slides]
- 17. May 28: LDA, QDA, KDE for classification, and Naive Bayes
  [LDA, QDA, KDE, and Naive Bayes slides] [LDA, QDA, KDE, and Naive Bayes annotated slides]
- 18. May 30: Mixture models, online learning, and the perceptron algorithm
  [Online learning and perceptron slides] [Online learning and perceptron annotated slides]
- 19. June 4: Kernelized perceptron and SVMs
  [Kernelized perceptron and SVM slides] [Kernelized perceptron and SVM annotated slides]
- 20. June 6: Multiclass SVMs and boosting
  [Multiclass SVMs and boosting slides] [Multiclass SVMs and boosting annotated slides]