What do we mean when we say that correlation does not imply causality? What are some of the ways in which an empirical analyst attempts to disentangle the two?

Provide examples.

wstecznyg5
2022-07-18
Answered

renegadeo41u

Answered 2022-07-19
Author has **9** answers

Correlation: when a change in one variable is accompanied by a change in another variable, the two variables are said to be correlated.

Causality: a relation in which a change in one variable actually produces the change in the other, so the association has an explanation; this is called causation.

Examples distinguishing correlation from causation:

(1)

Drinking coffee makes people more productive.

This is an example of correlation. People may think the effect comes from the caffeine, but it could also be that when people go out to a coffee shop they meet new people and step away from their usual distractions, which can itself lead to a productive day. Since no definite cause-and-effect link is established, this is an example of correlation.

(2)

After exercise, people get exhausted.

This is an example of causation. When people exercise, their muscles work harder than they do during normal activity, so they feel exhausted. Here the effect is the exhaustion and the cause is the exercise, hence it is an example of causation.
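The coffee example can be sketched numerically: a hidden common cause (a confounder) makes two variables correlated even though neither causes the other, and controlling for that confounder, one standard way an empirical analyst tries to disentangle correlation from causation, removes the association. The variable names and numbers below are hypothetical, for illustration only.

```python
import numpy as np

# Hypothetical model: a confounder z (say, "sociability") drives both
# coffee-shop visits x and productivity y. Neither x nor y causes the
# other, yet they come out correlated.
rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)        # confounder
x = z + rng.normal(size=n)    # coffee-shop visits, caused only by z
y = z + rng.normal(size=n)    # productivity, also caused only by z

# Marginal correlation of x and y is substantial (about 0.5 here).
r = np.corrcoef(x, y)[0, 1]
print(round(r, 2))

# Controlling for the confounder: correlate the residuals of x and y
# after removing z. The partial association is near zero, exposing the
# original correlation as non-causal.
rx = x - z                    # residual of x given z (true coefficient is 1)
ry = y - z                    # residual of y given z
print(round(np.corrcoef(rx, ry)[0, 1], 2))
```

This is the logic behind "controlling for" variables in regression: if an association survives after conditioning on plausible confounders, a causal interpretation becomes more (but never automatically) credible.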

asked 2022-05-08

If I have built two linear regression models over sets $A$ and $B$, and now want a linear regression over the set $A\cup B$, is there a way to reuse what I already have?

asked 2022-06-15

Given a regression $x(t)=at+b$, the number of trials, and the ${R}^{2}$ of the regression, how do I find the value and $95\mathrm{\%}$ confidence interval for the value of $V=x/t$?

asked 2022-07-03

Let the joint distribution of (X, Y) be bivariate normal with mean vector $\left(\begin{array}{c}0\\ 0\end{array}\right)$ and variance-covariance matrix

$\left(\begin{array}{cc}1& \rho \\ \rho & 1\end{array}\right)$, where $-1<\rho <1$. Let ${\Phi }_{\rho }(0,0)=P(X\le 0,Y\le 0)$. Then what will Kendall's $\tau$ coefficient between X and Y be equal to?

asked 2022-04-30

Two random variables, X and Y, have the joint density function:

$f(x,y)=\{\begin{array}{ll}2& 0<x\le y<1\\ 0& \text{otherwise}\end{array}$

Calculate the correlation coefficient between X and Y.

asked 2022-08-08

I need to determine the multicollinearity of predictors, but I have only two. So, if $\mathrm{VIF}_j=\frac{1}{1-{R}_{j}^{2}}$, then in the case where there are no other predictors, will the VIF always equal 1? So maybe it's not even possible to gauge multicollinearity if there are only two predictors?
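As a quick numerical check of the VIF formula above (with made-up data): with exactly two predictors, ${R}_{1}^{2}$ is the $R^2$ from regressing one predictor on the other, which equals their squared correlation, so $\mathrm{VIF}_1=\mathrm{VIF}_2=1/(1-r^2)$. That is 1 only when the predictors are uncorrelated, not always.

```python
import numpy as np

# Hypothetical data: two deliberately correlated predictors.
rng = np.random.default_rng(1)
n = 10_000
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)

# With two predictors, R_j^2 is the squared correlation between them,
# so both VIFs reduce to 1 / (1 - r^2).
r = np.corrcoef(x1, x2)[0, 1]
vif = 1.0 / (1.0 - r**2)
print(vif)   # greater than 1, because the predictors overlap
```

So multicollinearity can be gauged with only two predictors; the VIF is just a monotone transform of their sample correlation.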

asked 2022-06-30

If the coefficients of a simple regression line, ${B}_{0}$ and ${B}_{1}$, are the same, then why are the regression lines of $y$ on $x$ and $x$ on $y$ different, given the condition ${r}^{2}<1$? I have tried all the manipulation and graphical analysis I can but can't seem to see why this is happening.

asked 2022-06-26

Given that X and Y are RVs supported on $[2,3]$: if the correlation coefficient of ${X}^{t}$ and ${Y}^{s}$ is 0 for any $s,t\in [2,3]$, then X and Y are independent.