vagnhestagn

2022-09-07

In a normal linear model (with intercept), show that if the residuals satisfy $e_i = a + \beta x_i$ for $i = 1, \dots, n$, where $x$ is a predictor in the model, then each residual equals zero.
How can I show this?

Mckenna Friedman

Since your regression model has an intercept, we can assume the design matrix $X$ has the form
$X=\begin{pmatrix}1 & x_1^T\\ 1 & x_2^T\\ \vdots & \vdots\\ 1 & x_n^T\end{pmatrix}.$
Note that the residual vector must be orthogonal to every vector in the column space of $X$. This is because the fitted value is $\hat{Y} = P_X Y$ (where $P_X$ is the orthogonal projection matrix onto the column space of $X$), so the residual vector is $e = Y - \hat{Y} = (I - P_X)Y$. Hence for any vector $c$ of appropriate dimension,
$c^T e = (c^T - c^T P_X)Y.$
Now if $c$ lies in the column space of $X$, then
$P_X c = c,$
or equivalently (since $P_X$ is symmetric),
$c^T P_X = c^T,$
and it follows that for any $c$ in the column space of $X$,
$c^T e = 0.$
Now, your condition implies that $e$ itself lies in the column space of $X$ (it is a linear combination of the intercept column and the column $x$), and hence $e$ must be orthogonal to itself, i.e., $e^T e = \|e\|^2 = 0$, i.e., $e = 0$.
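A quick numerical check of the orthogonality argument above (a sketch using NumPy; the data are made up for illustration): fit OLS by least squares and verify that $X^T e \approx 0$, i.e., the residual vector is orthogonal to every column of $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)  # illustrative data

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x])

# OLS fit via least squares
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta  # residual vector

# Residuals are orthogonal to each column of X, hence to the
# whole column space: c^T e = 0 for any c in that space.
print(np.allclose(X.T @ e, 0))  # True (up to floating-point error)
```

Because the intercept column is in $X$, this also recovers the familiar fact that the residuals sum to zero.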

dalllc

If our linear model is given by
$y_i = w_0 + w_1 x_i + e_i,$
we can substitute $e_i = \beta_0 + \beta_1 x_i$ to obtain
$y_i = w_0 + w_1 x_i + (\beta_0 + \beta_1 x_i)$
$\implies y_i = [w_0 + \beta_0] + [w_1 + \beta_1] x_i$
$\implies y_i = \tilde{w}_0 + \tilde{w}_1 x_i + \tilde{e}_i,$
in which $\tilde{e}_i = 0$. Since the "error" here is not random but a deterministic linear function of $x_i$, the model absorbs it entirely into the fitted coefficients, leaving the residual term $\tilde{e}_i \equiv 0$.
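This absorption argument can also be checked numerically (a sketch; the coefficients $w_0 = 2$, $w_1 = 3$, $\beta_0 = 0.5$, $\beta_1 = -1$ are hypothetical): when the "error" is itself linear in $x$, the least-squares fit absorbs it into $\tilde{w}_0, \tilde{w}_1$ and the residuals vanish identically.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.normal(size=n)

# "Error" that is a linear function of x: beta0 + beta1 * x
e = 0.5 - 1.0 * x

# Response: true line plus the linear-in-x "error";
# this equals (2 + 0.5) + (3 - 1) * x exactly.
y = 2.0 + 3.0 * x + e

X = np.column_stack([np.ones(n), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

print(w)                          # close to the absorbed coefficients [2.5, 2.0]
print(np.allclose(y - X @ w, 0))  # residuals identically zero -> True
```

The fitted intercept and slope come out as $w_0 + \beta_0$ and $w_1 + \beta_1$, exactly as the substitution predicts.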
