Let $A$ and $B$ be two positive definite matrices of orders $n$ and $d$, respectively, and let $X$ be an $n \times d$ matrix. Is it true that $A^{-1} X B^{-1} X^\top$ has eigenvalues between zero and one?

genestesya

Answered question 2022-09-11

Let $A$ and $B$ be two positive definite matrices of orders $n$ and $d$, respectively. Let $X$ be an $n \times d$ matrix ($n$ rows and $d$ columns).
Is it true that $A^{-1} X B^{-1} X^\top$ has eigenvalues between zero and one?
I am unsure whether the result holds in this generality, so I give more context below.
This problem arises in the study of contingency tables (i.e. statistics), where $X$ is a design matrix of zeros and ones: each row represents an observation and each column represents the category or cell where that observation falls. Thus $x_{ij}=1$ means that the $i$th object appeared in the $j$th cell, and $x_{ij}=0$ means that it did not. It is assumed that at least one entry of every row and every column is one (different objects may, of course, fall in different categories). The matrices $A$ and $B$ are obtained by normalising the rows and columns of $X$. That is, define
$$x_{i\cdot} = \sum_{j=1}^{d} x_{ij}, \qquad x_{\cdot j} = \sum_{i=1}^{n} x_{ij},$$
and set $A = \operatorname{diag}(x_{i\cdot})$, $B = \operatorname{diag}(x_{\cdot j})$.
I have made more than a few attempts without success, but none of them really used the structure of the problem. For example, if $\lambda$ is an eigenvalue of $A^{-1} X B^{-1} X^\top$, then any associated eigenvector $v$ satisfies $X B^{-1} X^\top v = \lambda A v$. It is tempting to multiply on the left by $v^\top$ and use the fact that both sides are then weighted norms of $v$; this already shows that $\lambda \ge 0$. But the algebra becomes messy and I could not see a path forward to show $\lambda \le 1$. On top of this, I am not really using the structure of $X$, $A$ and $B$, as already mentioned.
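To spell out the non-negativity step just described (a worked version of the computation above, with $v \ne 0$ an eigenvector for $\lambda$):
$$
X B^{-1} X^\top v = \lambda A v
\;\Longrightarrow\;
v^\top X B^{-1} X^\top v = \lambda\, v^\top A v
\;\Longrightarrow\;
\lambda = \frac{\|B^{-1/2} X^\top v\|^2}{\|A^{1/2} v\|^2} \ge 0,
$$
since $A$ and $B$ are positive definite, so $v^\top A v = \|A^{1/2} v\|^2 > 0$ and $v^\top X B^{-1} X^\top v = \|B^{-1/2} X^\top v\|^2 \ge 0$.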
A perhaps useful fact (that I have not been able to exploit) is that the all-ones vector $\mathbf{1}$ is an eigenvector with eigenvalue $1$, and any other eigenvector is orthogonal to it!
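As a quick numerical illustration of this setup (a sketch assuming NumPy; the 0/1 matrix $X$ below is an arbitrary example, not data from the question), one can build $A$ and $B$ from the row and column sums and check both the eigenvalue range and the all-ones eigenvector:

```python
import numpy as np

# Illustrative 0/1 design matrix: 5 observations (rows), 3 cells (columns);
# every row and every column contains at least one 1, as assumed above.
X = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)

A = np.diag(X.sum(axis=1))   # A = diag(x_{i.}), row sums, order n
B = np.diag(X.sum(axis=0))   # B = diag(x_{.j}), column sums, order d

# The matrix in question: A^{-1} X B^{-1} X^T
M = np.linalg.inv(A) @ X @ np.linalg.inv(B) @ X.T

print(np.sort(np.linalg.eigvals(M).real))   # all eigenvalues lie in [0, 1]

ones = np.ones(X.shape[0])
print(np.allclose(M @ ones, ones))          # the all-ones vector has eigenvalue 1
```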

Answer & Explanation

William Collins

Beginner · 2022-09-12 · Added 12 answers

In the particular case you have, it can be shown as follows. For any matrix $\Lambda$ with eigenvalues $(\lambda_1,\dots,\lambda_n)$, one can bound the spectral radius
$$\rho(\Lambda) := \max\{|\lambda_1|,\dots,|\lambda_n|\}$$
by any operator norm of $\Lambda$, that is, $\rho(\Lambda) \le \|\Lambda\|$. These norms satisfy the inequality
$$\|\Lambda\Lambda'\| \le \|\Lambda\|\,\|\Lambda'\|.$$
I'll choose the norm
$$\|\Lambda\|_\infty := \max_i \sum_j |\Lambda_{ij}|,$$
and let $\Lambda := A^{-1}X$, $\Lambda' := B^{-1}X^\top$.
Note that we simply have
$$\Lambda_{ij} = \frac{X_{ij}}{\sum_{\ell=1}^{d} X_{i\ell}}, \qquad \Lambda'_{ij} = \frac{X_{ji}}{\sum_{\ell=1}^{n} X_{\ell i}}.$$
Hence, since all the entries are nonnegative and each row of $\Lambda$ and of $\Lambda'$ sums to one, we have
$$\|\Lambda\|_\infty = \max_{i=1,\dots,n} \sum_{j=1}^{d} \frac{X_{ij}}{\sum_{\ell=1}^{d} X_{i\ell}} = 1,$$
and in the same way
$$\|\Lambda'\|_\infty = \max_{i=1,\dots,d} \sum_{j=1}^{n} \frac{X_{ji}}{\sum_{\ell=1}^{n} X_{\ell i}} = 1.$$
Therefore $\rho(A^{-1}XB^{-1}X^\top) = \rho(\Lambda\Lambda') \le \|\Lambda\Lambda'\|_\infty \le \|\Lambda\|_\infty \|\Lambda'\|_\infty = 1$, so all eigenvalues must lie in the interval $[-1,1]$. Since you already showed that they must be non-negative, they actually all lie in $[0,1]$.
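The norm argument can also be checked numerically (again a sketch assuming NumPy, with the same kind of illustrative 0/1 matrix as above): both infinity norms come out as $1$, and the spectral radius of the product is bounded by their product.

```python
import numpy as np

# Same illustrative 0/1 design matrix as in the question's sketch above.
X = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)

A = np.diag(X.sum(axis=1))
B = np.diag(X.sum(axis=0))

Lam  = np.linalg.inv(A) @ X      # Lambda  = A^{-1} X    (each row sums to 1)
LamP = np.linalg.inv(B) @ X.T    # Lambda' = B^{-1} X^T  (each row sums to 1)

def inf_norm(M):
    """Induced infinity norm: maximum absolute row sum."""
    return np.abs(M).sum(axis=1).max()

print(inf_norm(Lam), inf_norm(LamP))                  # both equal 1.0
rho = np.abs(np.linalg.eigvals(Lam @ LamP)).max()     # spectral radius of A^{-1} X B^{-1} X^T
print(rho <= inf_norm(Lam) * inf_norm(LamP) + 1e-12)  # True: rho <= 1
```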
