
Laila Murphy

2022-11-20

Correlation bound
Let x and y be two random variables such that:
Corr(x, y) = b, where Corr(x, y) denotes the correlation between x and y, and b is a scalar in the range [-1, 1]. Let y' be an estimate of y. An example could be y' = y + (rand(0,1) - 0.5) * 0.1, where rand(0,1) returns a random number between 0 and 1; that is, I am adding some noise to the data.
My questions are:
Is there a way to bound the correlation between x and y', i.e. Corr(x, y')? I mentioned y' in the context of random perturbation, but I would also like to know what happens if I don't have that information and only know that y' is an estimate of y. Is there any literature that covers this?
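For concreteness, here is a minimal NumPy sketch of the perturbation described above (the data-generating process for x and y is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# An illustrative correlated pair (x, y).
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.6, size=n)

# The perturbation from the question: y' = y + (rand(0,1) - 0.5) * 0.1
y_prime = y + (rng.uniform(0, 1, size=n) - 0.5) * 0.1

print("Corr(x, y)  =", np.corrcoef(x, y)[0, 1])
print("Corr(x, y') =", np.corrcoef(x, y_prime)[0, 1])
```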

Answer & Explanation

Julius Haley

Beginner · 2022-11-21 · Added 19 answers

Let $e = y' - y$. Assuming that $e$ is independent of $x$ and $y$ with $\mu_e = E(e) = 0$, we have $\mu_{y'} = E(y') = E(y) = \mu_y$ and:
$$\mathrm{Corr}(x, y') = \frac{E\big((x-\mu_x)(y'-\mu_{y'})\big)}{\sigma_x \sigma_{y'}} = \frac{E\big((x-\mu_x)(y-\mu_y)\big) + E\big((x-\mu_x)\,e\big)}{\sigma_x \sigma_{y'}} = \mathrm{Corr}(x, y)\,\frac{\sigma_y}{\sigma_{y'}}$$
Here $E\big((x-\mu_x)\,e\big) = E(x-\mu_x)\,E(e) = 0$, since $x$ and $e$ are independent.
Now $\sigma_{y'} = \sqrt{\sigma_y^2 + \sigma_e^2}$, again by independence, so:
$$\mathrm{Corr}(x, y') = \mathrm{Corr}(x, y) \cdot \frac{1}{\sqrt{1 + (\sigma_e/\sigma_y)^2}}$$
So definitely $|\mathrm{Corr}(x, y')| < |\mathrm{Corr}(x, y)|$ whenever $\sigma_e > 0$.
For the specific $e$ you have given (uniform on $[-0.05, 0.05]$), we have $\sigma_e = 0.1/\sqrt{12} \approx 0.029$.
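A quick numerical sanity check of this attenuation factor (a sketch; the distributions of x and y are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.normal(size=n)
y = 0.7 * x + rng.normal(scale=0.5, size=n)

# Uniform noise on [-0.05, 0.05], as in the question: sigma_e = 0.1/sqrt(12)
e = (rng.uniform(0, 1, size=n) - 0.5) * 0.1
sigma_e = 0.1 / np.sqrt(12)

measured = np.corrcoef(x, y + e)[0, 1]
predicted = np.corrcoef(x, y)[0, 1] / np.sqrt(1 + (sigma_e / y.std()) ** 2)
print(measured, predicted)  # should agree to about three decimal places
```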
There is no technical meaning to "estimate" by itself. You can always say that $y' - y$ is another random variable. If you don't know that $y' - y$ is independent of $x$, you don't know what $E\big((x-\mu_x)(e-\mu_e)\big)$ is. If you don't know that $e = y' - y$ and $y$ are independent, you don't know $\sigma_{y'}$ in terms of $\sigma_e$ and $\sigma_y$; in particular, you don't know that $\sigma_{y'} > \sigma_y$.
A simple example: if $y' = x$ then $\mathrm{Corr}(x, y') = 1$. So if $x$ and $y$ are close enough that $x$ can be said to be an "estimate of $y$", then $\mathrm{Corr}(x, y') = \mathrm{Corr}(x, x) = 1 > \mathrm{Corr}(x, y)$.
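A tiny sketch of that failure mode, again with made-up distributions: the dependent "estimate" y' = x has correlation exactly 1 with x, larger than Corr(x, y):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = x + rng.normal(scale=0.3, size=x.size)  # x is plausibly an "estimate" of y

y_prime = x  # an "estimate" of y that is not an independent-noise perturbation
print(np.corrcoef(x, y)[0, 1])        # strictly less than 1
print(np.corrcoef(x, y_prime)[0, 1])  # exactly 1.0
```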
mxty42ued

Beginner · 2022-11-22 · Added 4 answers

Let $X, Y$ be random variables with a given correlation $b$. Let $Z$ be any random variable independent of $\sigma(X, Y)$, and suppose $Z$ has strictly positive, finite variance with $E[Z] = 0$. Here $Z$ is "noise" that will contribute to $Y' = Y + Z$.
Notice that $\mathrm{Cov}(X, Z) = 0$ and
$$\mathrm{Cov}(X, Y') = \mathrm{Cov}(X, Y) + \mathrm{Cov}(X, Z) = \mathrm{Cov}(X, Y)$$
since $Z$ is independent of $X$. Moreover, $\mathrm{Var}(Y') = \mathrm{Var}(Y) + \mathrm{Var}(Z) > \mathrm{Var}(Y)$ by independence. We conclude that
$$|\mathrm{Corr}(X, Y')| = \left|\frac{\mathrm{Cov}(X, Y')}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y')}}\right| < \left|\frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}\right| = |\mathrm{Corr}(X, Y)|$$
In other words, adding any zero-mean, independent noise with finite positive variance strictly shrinks the magnitude of the correlation (whenever the original correlation is nonzero).
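The argument is not tied to uniform noise; a sketch with Gaussian Z (all parameters illustrative) shows the same shrinkage:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

x = rng.normal(size=n)
y = 0.9 * x + rng.normal(scale=0.4, size=n)
z = rng.normal(scale=1.5, size=n)  # independent, zero-mean, finite variance

print(abs(np.corrcoef(x, y)[0, 1]))      # |Corr(X, Y)|
print(abs(np.corrcoef(x, y + z)[0, 1]))  # strictly smaller
```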
