# Theorem on "Orthogonal Partitioned Regression"

There is a Theorem on "Orthogonal Partitioned Regression" which says:
"In the multiple linear least squares regression of $y$ on two sets of variables ${X}_{1}$ and ${X}_{2}$, if the two sets of variables are orthogonal, then the separate coefficient vectors can be obtained by separate regressions of $y$ on ${X}_{1}$ alone and $y$ on ${X}_{2}$ alone. ..."

Two vectors $v,w\in {\mathbb{R}}^{n}$ are orthogonal iff ${v}^{t}w=0$, where $t$ indicates the transpose. Really, we're using the dot product given by $\langle v,w\rangle ={v}^{t}w$.
There is a different notion of orthogonality for matrices. Here's one definition of an orthogonal matrix: $O\in {\mathrm{M}}_{\mathrm{n}}\left(\mathbb{R}\right)$ is orthogonal if ${O}^{t}O=I$. Equivalently, this means that the columns of $O$ are orthonormal, i.e., that they are orthogonal and have length $1$. In the theorem above, saying the two sets of variables are orthogonal means every column of ${X}_{1}$ is orthogonal (in the vector sense) to every column of ${X}_{2}$, i.e., ${X}_{1}^{t}{X}_{2}=0$.
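To see the equivalence concretely, here is a small sketch (the rotation angle is an arbitrary choice): a 2D rotation matrix satisfies ${O}^{t}O=I$, and its columns are orthonormal.

```python
import numpy as np

# A rotation matrix is a standard example of an orthogonal matrix.
theta = 0.7  # arbitrary angle for illustration
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Matrix orthogonality: O^T O = I.
print(np.allclose(O.T @ O, np.eye(2)))           # True

# Column orthonormality: unit-length columns, pairwise orthogonal.
c0, c1 = O[:, 0], O[:, 1]
print(np.isclose(c0 @ c0, 1.0), np.isclose(c1 @ c1, 1.0))  # True True
print(np.isclose(c0 @ c1, 0.0))                  # True
```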