Lecture Overview:

SLR in Matrix Terms

The SLR model implies:

$$ Y_1=\beta_0+\beta_1X_1+\epsilon_1\\Y_2=\beta_0+\beta_1X_2+\epsilon_2\\\vdots\\Y_n=\beta_0+\beta_1X_n+\epsilon_n $$

Stacking the equations for $i=1,\dots,n$ into matrices, we define:

$$ \mathbf{Y}=\begin{bmatrix}Y_1\\Y_2\\\vdots\\Y_n\end{bmatrix},\quad \mathbf{X}=\begin{bmatrix}1&X_1\\1&X_2\\\vdots&\vdots\\1&X_n\end{bmatrix},\quad \beta=\begin{bmatrix}\beta_0\\\beta_1\end{bmatrix},\quad \epsilon=\begin{bmatrix}\epsilon_1\\\epsilon_2\\\vdots\\\epsilon_n\end{bmatrix} $$

We can write the linear equation system compactly as

$$ \mathbf{Y}=\mathbf{X}\beta+\epsilon $$

Since $E[\epsilon]=\mathbf{0}$, taking expectations gives $E[\mathbf{Y}]=\mathbf{X}\beta$.
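As a quick sketch (the sample size, coefficient values, and variable names below are illustrative assumptions, not from the lecture), the stacked form can be built directly in NumPy: the design matrix $\mathbf{X}$ gets a leading column of ones for the intercept, and `@` computes the matrix product $\mathbf{X}\beta$:

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed for reproducibility

n = 5
beta = np.array([2.0, 0.5])            # [beta_0, beta_1], illustrative values
x = rng.uniform(0, 10, size=n)         # predictor values X_1, ..., X_n
X = np.column_stack([np.ones(n), x])   # design matrix: column of 1s, then x
eps = rng.normal(0.0, 1.0, size=n)     # error terms epsilon_1, ..., epsilon_n

Y = X @ beta + eps                     # the model Y = X beta + epsilon

print(X.shape)  # (n, 2): one row per observation, one column per coefficient
print(Y.shape)  # (n,): the stacked response vector
```

Note that $\mathbf{X}$ is $n \times 2$ and $\beta$ is $2 \times 1$, so the product $\mathbf{X}\beta$ is $n \times 1$, matching $\mathbf{Y}$.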


Least Squares Estimation in Matrix Terms

Define the estimated regression coefficients as