Lecture 11

  • I am going to hold an X-hour tomorrow (Thursday, Feb 4) from 12:30 to 1:20 pm.

  • In Lecture 11, we discussed conditional densities (3.3) and linear regression (3.4). Throughout the course, I am emphasizing the conditional expectation as the predictor of one random variable given another. Least-squares linear regression is a particular case of this idea, obtained by assuming the relation between the two variables is linear. As another example, I mentioned neural-network-based regression: a good way to interpret it is as the conditional expectation computed under the assumption that the relation between the two random variables is described by a neural network (a particular family of functions). See the short sketch below.
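A compact way to write this down (a sketch in my own notation, not taken from the textbook): the conditional expectation is the best mean-squared-error predictor over all functions of X, and the linear and neural-network regressions arise by restricting the family of functions that is minimized over.

```latex
% Conditional expectation: best predictor of Y given X in the mean-squared sense,
% minimizing over all (measurable) functions g.
\mathbb{E}[Y \mid X] \;=\; \arg\min_{g} \; \mathbb{E}\big[(Y - g(X))^{2}\big]

% Least-squares linear regression: restrict g to affine functions g(x) = a + b x.
(\hat{a}, \hat{b}) \;=\; \arg\min_{a,\,b} \; \mathbb{E}\big[(Y - a - bX)^{2}\big]

% Neural-network regression: restrict g to a parametric family f_\theta given by a network.
\hat{\theta} \;=\; \arg\min_{\theta} \; \mathbb{E}\big[(Y - f_{\theta}(X))^{2}\big]
```

In all three problems the objective is the same mean-squared error; only the class of candidate functions changes, which is why both linear and NN-based regression can be read as special cases of the same conditional-expectation prediction problem.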
