Condition for existence of an orthonormal matrix whose column space is orthogonal to the column space of another matrix

While I was reading a statistics paper, I came across one statement that I don't understand (I just have basic linear algebra knowledge).

Assume (in the context of regression) that we have an $n\times p$ data matrix $X$ with $n>p$, and that $X$ has full column rank (so $X^TX$ is invertible). The paper states:

"$U\in {\mathbb{R}}^{n\times p}$ is an orthonormal matrix whose column space is orthogonal to that of $X$, s.t. ${U}^{T}X=0$": such a matrix exists if $n\ge 2p$. I don't understand where this last statement comes from.

I know that the orthogonal complement of the column space of $X$ (the left null space, $\operatorname{null}(X^T)$) has dimension $n-\operatorname{rank}(X)=n-p$ in the full-rank case, and that the $p$ columns of $U$ must form an orthonormal set inside this space. But I don't see the link: why does $U$ exist only if $n\ge p+\operatorname{rank}(X)$, i.e. $n\ge 2p$?
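To make the question concrete, here is a small numerical sketch (my own illustration, not from the paper): when $n\ge 2p$, the orthogonal complement of $\operatorname{col}(X)$ has dimension $n-p\ge p$, so picking any $p$ orthonormal vectors from it, e.g. from a full QR decomposition of $X$, yields a $U$ with $U^TX=0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 3                      # n >= 2p, so n - p = 3 >= p
X = rng.standard_normal((n, p))  # a generic X has full column rank p

# Full QR of X: the last n - p columns of Q form an orthonormal basis
# of the orthogonal complement of col(X) in R^n.
Q, _ = np.linalg.qr(X, mode="complete")
complement = Q[:, p:]            # shape (n, n - p)

# Since n - p >= p, we can take any p of these columns as U.
U = complement[:, :p]            # U in R^{n x p}

print(np.allclose(U.T @ X, 0))           # True: U^T X = 0
print(np.allclose(U.T @ U, np.eye(p)))   # True: orthonormal columns
```

If instead $p < n < 2p$, the complement has fewer than $p$ dimensions, so no such $U$ with $p$ orthonormal columns can fit inside it, which is presumably the point of the paper's condition.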