Mariah Sparks

2022-07-22

Show that $\sum x_i e_i = 0$ and also that $\sum \hat{y}_i e_i = 0$. I believe that being able to solve the first sum will make the solution to the second one clearer. So far I have proved that $\sum e_i = 0$.

Kitamiliseakekw

Here's one way of viewing it. We want to write
$\left[\begin{array}{c}y_1\\ \vdots\\ y_n\end{array}\right]=\hat{\alpha}\left[\begin{array}{c}1\\ \vdots\\ 1\end{array}\right]+\hat{\beta}\left[\begin{array}{c}x_1\\ \vdots\\ x_n\end{array}\right]+\left[\begin{array}{c}\hat{\epsilon}_1\\ \vdots\\ \hat{\epsilon}_n\end{array}\right]$
and choose the values of $\hat{\alpha}$ and $\hat{\beta}$ that minimize $\hat{\epsilon}_1^2+\cdots+\hat{\epsilon}_n^2$. The sum of the first two terms on the right is $[\hat{y}_1,\dots,\hat{y}_n]^T$.
That means the point $\left[\begin{array}{c}\hat{y}_1\\ \vdots\\ \hat{y}_n\end{array}\right]=\hat{\alpha}\left[\begin{array}{c}1\\ \vdots\\ 1\end{array}\right]+\hat{\beta}\left[\begin{array}{c}x_1\\ \vdots\\ x_n\end{array}\right]$ is closer to $\left[\begin{array}{c}y_1\\ \vdots\\ y_n\end{array}\right]$ in ordinary Euclidean distance than any other point in the plane spanned by $\left[\begin{array}{c}1\\ \vdots\\ 1\end{array}\right]$ and $\left[\begin{array}{c}x_1\\ \vdots\\ x_n\end{array}\right]$. The point in a plane that is closest to $\mathbf{y}$ is the one you get by dropping a perpendicular from $\mathbf{y}$ to the plane. That means $\hat{\epsilon}=\mathbf{y}-\hat{\mathbf{y}}$ is perpendicular to the two columns that span the plane, and thus perpendicular to every linear combination of them, such as $\hat{\mathbf{y}}$. "Perpendicular" means the dot product is zero. Q.E.D.
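A quick numerical sanity check of this orthogonality, using a small made-up dataset (the variable names are my own, not from the question):

```python
import numpy as np

# Small illustrative dataset (invented for the check)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix with an intercept column of ones
X = np.column_stack([np.ones_like(x), x])

# Least-squares fit: coef = (alpha_hat, beta_hat)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef
e = y - y_hat  # residuals

# All three sums are zero up to floating-point error
print(np.sum(e))          # ≈ 0
print(np.sum(x * e))      # ≈ 0
print(np.sum(y_hat * e))  # ≈ 0
```

The residual vector comes out orthogonal to the all-ones column, to the $x$ column, and hence to the fitted values, exactly as the perpendicular-projection picture predicts.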
The vector of fitted values $\hat{\mathbf{y}}=\left[\begin{array}{c}\hat{y}_1\\ \vdots\\ \hat{y}_n\end{array}\right]$ is the orthogonal projection of the vector $\mathbf{y}=\left[\begin{array}{c}y_1\\ \vdots\\ y_n\end{array}\right]$ onto the column space of the design matrix $X=\left[\begin{array}{cc}1& x_1\\ \vdots& \vdots\\ 1& x_n\end{array}\right]$.
The orthogonal projection is a linear transformation whose matrix is the "hat matrix" $H=X(X^TX)^{-1}X^T$, an $n\times n$ matrix of rank $2$. Observe that if $\mathbf{w}$ is orthogonal to that column space then $X^T\mathbf{w}=0$, so $H\mathbf{w}=0$; and if $\mathbf{w}$ is in the column space, then $\mathbf{w}=Xu$ for some $u\in\mathbb{R}^2$, so $H\mathbf{w}=X(X^TX)^{-1}X^TXu=Xu=\mathbf{w}$.
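These two properties of $H$ are easy to verify numerically (again with invented data; `w_perp` is chosen by hand so that both column sums $\sum w_i$ and $\sum x_i w_i$ vanish):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([np.ones_like(x), x])  # n x 2 design matrix

# Hat matrix H = X (X^T X)^{-1} X^T
H = X @ np.linalg.inv(X.T @ X) @ X.T

# A vector in the column space: H leaves it fixed
w_in = X @ np.array([1.5, -0.7])
print(np.allclose(H @ w_in, w_in))          # True

# A vector orthogonal to both columns (X^T w = 0): H kills it
w_perp = np.array([1.0, -2.0, 0.0, 2.0, -1.0])
print(np.allclose(X.T @ w_perp, 0.0))       # True: it really is orthogonal
print(np.allclose(H @ w_perp, 0.0))         # True

# H is idempotent, as a projection must be
print(np.allclose(H @ H, H))                # True
```

Idempotence ($H^2=H$) is what makes $H$ a projection: projecting a second time changes nothing.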
It follows that $\hat{\epsilon}=\mathbf{y}-\hat{\mathbf{y}}=(I-H)\mathbf{y}$ is orthogonal to the column space. Since $\hat{\mathbf{y}}$ is in the column space, $\hat{\epsilon}$ is orthogonal to $\hat{\mathbf{y}}$.