There is a Theorem on "Orthogonal Partitioned Regression" which says: "In the multiple linear least squares regression of y on two sets of variables X_1 and X_2, if the two sets of variables are orthogonal, then the separate coefficient vectors can be obtained by separate regressions of y on X_1 alone and y on X_2 alone. ..."

Asked by Stephany Wilkins on 2022-10-25

There is a Theorem on "Orthogonal Partitioned Regression" which says:
"In the multiple linear least squares regression of y on two sets of variables X 1 and X 2 , if the two sets of variables are orthogonal, then the separate coefficient vectors can be obtained by separate regressions of y on X 1 alone and y on X 2 alone. ..."

Answer & Explanation

Answered by na1p1a2pafr on 2022-10-26

Two vectors v, w ∈ R^n are orthogonal iff v^T w = 0, where T indicates the transpose. Really, we're using the dot product given by ⟨v, w⟩ = v^T w.
There is a different notion of orthogonality for matrices. Here's one definition of an orthogonal matrix: O ∈ M_n(R) is orthogonal if O^T O = I. Equivalently, this means that the columns of O are orthonormal, i.e., that they are orthogonal and have length 1.
In the theorem, "orthogonal" refers to the first notion, applied column-wise: the two sets of variables are orthogonal when every column of X_1 is orthogonal to every column of X_2, i.e., X_1^T X_2 = 0. It does not require X_1 or X_2 to be an orthogonal matrix.
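
To see why the theorem then holds, stack the regressors as X = [X_1  X_2] and write the least squares normal equations X^T X b = X^T y in block form (a sketch of the standard argument):

[ X_1^T X_1   X_1^T X_2 ] [ b_1 ]   [ X_1^T y ]
[ X_2^T X_1   X_2^T X_2 ] [ b_2 ] = [ X_2^T y ]

If X_1^T X_2 = 0, the off-diagonal blocks vanish and the system decouples into

b_1 = (X_1^T X_1)^{-1} X_1^T y   and   b_2 = (X_2^T X_2)^{-1} X_2^T y,

which are exactly the coefficient vectors from the separate regressions of y on X_1 alone and of y on X_2 alone.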
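
You can also check this numerically. Here is a minimal sketch using NumPy; the names X1, X2 and the synthetic data are purely illustrative, not part of the original question. The two blocks are taken from disjoint columns of one orthonormal basis, so X1^T X2 = 0 by construction.

import numpy as np

rng = np.random.default_rng(0)

# Build two blocks of regressors with mutually orthogonal columns:
# disjoint columns of one orthonormal basis from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((100, 5)))
X1, X2 = Q[:, :3], Q[:, 3:]          # X1.T @ X2 == 0 (up to rounding)

y = rng.standard_normal(100)

# Full regression of y on [X1, X2] ...
X = np.hstack([X1, X2])
b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# ... versus two separate regressions.
b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
b2, *_ = np.linalg.lstsq(X2, y, rcond=None)

# The partitioned coefficients match the separate ones.
print(np.allclose(b_full[:3], b1))   # True
print(np.allclose(b_full[3:], b2))   # True

With real data the columns of X_1 and X_2 are rarely exactly orthogonal, in which case the theorem does not apply and the coefficients must come from the full joint regression.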
