Math 70 Spring 2020
Elements of Multivariate Statistics and Statistical Learning
Weekly topics
Week 1. Matrix algebra and calculus for the linear model. Eigenvectors and eigenvalues,
spectral matrix decomposition, generalized inverse, matrix partitioning, and quadratic
forms. Matrix calculus and matrix differentiation. Elements of multivariate
optimization, the gradient and the Hessian. Linear least squares and the multivariate
coefficient of determination. Orthogonal projection and idempotent matrices.
Sources: Demidenko (Appendix), Härdle and Simar (Chapters 2 & 8).
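As a preview of the Week 1 material, a minimal sketch in Python with NumPy (an assumed illustration, not part of the course materials) of linear least squares, the hat matrix as an orthogonal projector, its idempotence, and the coefficient of determination:

```python
import numpy as np

# Assumed illustrative example: least squares and the hat matrix
# H = X (X'X)^{-1} X', the orthogonal projector onto the column space of X.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])  # design matrix with intercept
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=50)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares coefficients
H = X @ np.linalg.inv(X.T @ X) @ X.T               # hat (projection) matrix
# H is symmetric and idempotent: H = H', H @ H = H

y_hat = H @ y                                      # fitted values = projection of y
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)  # coefficient of determination
```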
Week 2. Multivariate distributions. Covariance and correlation matrices. The multivariate
normal distribution. Multivariate least squares and the linear regression model and its
properties.
Sources: Demidenko (Chapter 3), Härdle and Simar (Chapters 3 & 4).
Week 3. Maximum likelihood estimation. Fisher information matrix, Cramér-Rao
inequality and efficient point estimation, estimating equation approach.
Sources: Demidenko (Sections 6.9 and 6.10), Härdle and Simar (Chapter 5).
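The Week 3 ideas can be previewed with a small simulation (an assumed Python/NumPy illustration, not course code). For an Exponential sample with rate λ, the MLE is λ̂ = 1/x̄, the Fisher information is I(λ) = n/λ², and the Cramér-Rao bound is λ²/n; the simulated variance of the MLE approaches that bound:

```python
import numpy as np

# Assumed illustrative example: MLE of the Exponential rate and the
# Cramér-Rao lower bound, checked by Monte Carlo simulation.
rng = np.random.default_rng(0)
lam, n, reps = 2.0, 500, 2000

# MLE of the rate is the reciprocal of the sample mean, lambda_hat = 1 / xbar.
mles = np.array([1 / rng.exponential(scale=1 / lam, size=n).mean() for _ in range(reps)])

cr_bound = lam ** 2 / n        # Cramér-Rao bound = 1 / I(lambda), with I(lambda) = n / lambda^2
var_mle = mles.var(ddof=1)     # simulated variance of the MLE, close to the bound for large n
```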
Week 4. Hypothesis testing with the Wald and likelihood ratio tests. The power function
and sample size determination. Testing the general linear hypothesis.
Sources: Demidenko (Section 9), Härdle and Simar (Chapter 7).
Week 5. Generalized linear models (Poisson and logistic regressions).
Sources: Demidenko (Section 8.8), Härdle and Simar (Chapters 10 & 13), Zelterman.
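To make the Week 5 topic concrete, a sketch of logistic regression fitted by iteratively reweighted least squares, i.e. Newton's method on the log-likelihood (an assumed Python/NumPy illustration with simulated data, not the textbook's code):

```python
import numpy as np

# Assumed illustrative example: logistic regression via IRLS (Newton's method).
def fit_logistic(X, y, n_iter=25):
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ beta))   # fitted probabilities
        W = p * (1 - p)                   # IRLS weights (Bernoulli variances)
        # Newton step: beta += (X' W X)^{-1} X' (y - p)
        beta = beta + np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 1.5])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
beta_hat = fit_logistic(X, y)             # estimates close to true_beta
```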
Week 6. Principal component analysis and discriminant analysis. The ROC curve,
classification, and misclassification probabilities.
Sources: Härdle and Simar (Chapters 10 & 13), Zelterman.
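Principal components can be sketched directly from the eigendecomposition of the sample covariance matrix (an assumed Python/NumPy illustration on simulated data, not course code):

```python
import numpy as np

# Assumed illustrative example: PCA via the eigendecomposition of the
# sample covariance matrix.
rng = np.random.default_rng(0)
# three variables with clearly different scales, so the PCs are well ordered
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.2])

Xc = X - X.mean(axis=0)                 # center the data
S = Xc.T @ Xc / (len(X) - 1)            # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)    # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]       # reorder so PC1 has the largest variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                   # principal component scores (uncorrelated)
explained = eigvals / eigvals.sum()     # proportion of variance per component
```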
Week 7. Soft and hard clustering. Hierarchical clustering and the
K-means algorithm.
Sources: Zelterman; Härdle and Simar (Chapters 9 & 12).
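The K-means algorithm alternates an assignment step and an update step (Lloyd's algorithm). A minimal sketch in Python/NumPy on two simulated, well-separated clusters (an assumed illustration, not course code):

```python
import numpy as np

# Assumed illustrative example: Lloyd's algorithm for K-means.
def kmeans(X, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # random initial centers
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        # update step: each center moves to the mean of its cluster
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (40, 2)),   # cluster around (0, 0)
               rng.normal(5, 0.3, (40, 2))])  # cluster around (5, 5)
labels, centers = kmeans(X, k=2)
```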
Week 8. Elements of nonparametric and Bayesian statistics. Bootstrap, jackknife,
resampling, and cross-validation.
Sources: Hogg et al. (Chapters 10 & 11).
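The nonparametric bootstrap resamples the observed data with replacement to estimate the sampling variability of a statistic. A sketch for the standard error of the sample median (an assumed Python/NumPy illustration, not course code):

```python
import numpy as np

# Assumed illustrative example: bootstrap standard error and percentile
# confidence interval for the sample median.
rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=100)   # observed data

B = 2000
boot_medians = np.empty(B)
for b in range(B):
    # resample the data with replacement and recompute the statistic
    resample = rng.choice(sample, size=len(sample), replace=True)
    boot_medians[b] = np.median(resample)

se_hat = boot_medians.std(ddof=1)               # bootstrap standard error
ci = np.percentile(boot_medians, [2.5, 97.5])   # percentile confidence interval
```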
Week 9. Analysis of time series, binary, and categorical data. Autocorrelation and
autoregression, the Fourier transform. Logistic and Poisson regression, nonlinear
regression, and the Gauss-Newton algorithm.
Sources: Zelterman; Demidenko (Sections 7.1 and 8.1-8.3).
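The Gauss-Newton algorithm linearizes a nonlinear model at the current parameter value and solves a least-squares step using the Jacobian. A sketch for the model y ≈ a·exp(b·x) (an assumed Python/NumPy illustration with simulated data, not course code):

```python
import numpy as np

# Assumed illustrative example: Gauss-Newton iterations for nonlinear
# least squares with the model f(x; a, b) = a * exp(b * x).
def gauss_newton(x, y, theta, n_iter=50):
    for _ in range(n_iter):
        a, b = theta
        f = a * np.exp(b * x)                 # model predictions
        r = y - f                             # residuals
        # Jacobian of f with respect to (a, b)
        J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
        # Gauss-Newton step: theta += (J'J)^{-1} J' r
        theta = theta + np.linalg.solve(J.T @ J, J.T @ r)
    return theta

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 60)
y = 2.0 * np.exp(-1.3 * x) + rng.normal(scale=0.01, size=60)
theta_hat = gauss_newton(x, y, np.array([1.0, 0.0]))  # converges near (2, -1.3)
```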
Week 10. Nonlinear regression, shape, and image statistics. Nonlinear least squares
and its asymptotic properties, optimal design of experiments, random triangles,
Procrustes estimation of shape, Fourier descriptor analysis, image entropy, and the
use of the Kolmogorov-Smirnov test for image comparison. Example: how many stars
are in the sky?