I am reading about PCA and found an exercise that says

Show that when an N-dimensional set of data points X is projected onto the eigenvectors $V=[{e}_{1}\,{e}_{2}\,...\,{e}_{n}]$ of its covariance matrix $C=X{X}^{T}$, the covariance matrix of the projected data ${C}_{p}=Y{Y}^{T}$ is diagonal, and hence that, in the space of the eigenvector decomposition, the distribution of X is uncorrelated.

What I have so far is

$Y={V}^{T}X$

Therefore

${C}_{p}=Y{Y}^{T}={V}^{T}X({V}^{T}X{)}^{T}={V}^{T}X{X}^{T}V$

but here I got stuck. Any advice on how to proceed? Moreover, what does "the covariance matrix of the projected data is diagonal" mean?

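For intuition, here is a quick numerical sanity check of the claim, written as a NumPy sketch. It assumes the rows of X are zero-mean, so that $XX^T$ is the covariance matrix up to a constant factor; the variable names are my own choices, not from the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 variables, 500 samples; center each row so that
# C = X X^T is a covariance matrix (up to a 1/m factor).
X = rng.standard_normal((3, 500))
X -= X.mean(axis=1, keepdims=True)

C = X @ X.T                      # covariance matrix of the data
eigvals, V = np.linalg.eigh(C)   # C is symmetric, so eigh applies

Y = V.T @ X                      # project data onto the eigenvectors
Cp = Y @ Y.T                     # covariance of the projected data

# Cp should be diagonal, with the eigenvalues of C on its diagonal.
print(np.allclose(Cp - np.diag(np.diag(Cp)), 0))  # off-diagonals vanish
print(np.allclose(np.diag(Cp), eigvals))          # diagonal = eigenvalues
```

Both checks print `True`: numerically, ${C}_{p}={V}^{T}CV$ comes out diagonal, which is exactly what "the covariance matrix of the projected data is diagonal" means, i.e. the projected components have zero covariance with each other.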