Independent random variables $X,Y,Z,U,V,W$ each have variance equal to 1. Find $\rho (S,T)$, the correlation coefficient of the random variables $S=3X+3Y+2Z+U+V+W$ and $T=9X+3Y+2Z+2U+V+W$.

vangstosiis
2022-07-19
Answered


encoplemt5

Answered 2022-07-20

Step 1

Let $S=3X+3Y+2Z+U+V+W$ and $T=S+6X+U$ .

$\begin{array}{rl}\mathrm{Cov}(S,T)& =\mathrm{Cov}(S,S+6X+U)\\ & =\mathrm{Cov}(S,S)+\mathrm{Cov}(S,6X)+\mathrm{Cov}(S,U)\\ & ={\sigma}_{S}^{2}+6\,\mathrm{Cov}(S,X)+\mathrm{Cov}(S,U)\\ & ={\sigma}_{S}^{2}+6(\mathrm{Cov}(3X,X)+\mathrm{Cov}(3Y,X)+\mathrm{Cov}(2Z,X)+\mathrm{Cov}(U,X)+\mathrm{Cov}(V,X)+\mathrm{Cov}(W,X))\\ & \quad +(\mathrm{Cov}(3X,U)+\mathrm{Cov}(3Y,U)+\mathrm{Cov}(2Z,U)+\mathrm{Cov}(U,U)+\mathrm{Cov}(V,U)+\mathrm{Cov}(W,U))\end{array}$

$X,Y,Z,U,V,W$ are all independent, so every cross-covariance vanishes, leaving $\mathrm{Cov}(S,T)={\sigma}_{S}^{2}+6\,\mathrm{Cov}(3X,X)+\mathrm{Cov}(U,U)=25+18+1=44$. With ${\sigma}_{S}^{2}=9+9+4+1+1+1=25$ and ${\sigma}_{T}^{2}=81+9+4+4+1+1=100$, this gives $\rho (S,T)=\dfrac{44}{5\cdot 10}=0.88$.
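Since the variables are independent with unit variance, $\mathrm{Cov}(S,T)$ reduces to the dot product of the two coefficient vectors, which makes the result easy to verify. A minimal Python check (not part of the original answer):

```python
# Sketch: verify rho(S, T) from the coefficient vectors alone.
# For independent unit-variance variables, Cov(sum a_i X_i, sum b_i X_i)
# is the dot product of the coefficient vectors a and b.
s = [3, 3, 2, 1, 1, 1]   # coefficients of S on (X, Y, Z, U, V, W)
t = [9, 3, 2, 2, 1, 1]   # coefficients of T

cov_st = sum(a * b for a, b in zip(s, t))       # 44
var_s = sum(a * a for a in s)                   # 25
var_t = sum(b * b for b in t)                   # 100
rho = cov_st / (var_s ** 0.5 * var_t ** 0.5)
print(rho)   # 0.88
```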


asked 2022-07-20

Correlation of Rolling Two Dice

If $A$ is a random variable giving the sum of two independent rolls of a die, and $B$ is the value of the first roll minus the value of the second, is it true that $\mathrm{Cov}(A,B)\ne 0$? In other words, is it true that they are correlated?

I've come to the conclusion that they must be correlated because they are not independent; that is, the outcome of $A$ can have an impact on the outcome of $B$. But I remain stuck, because dependence does not necessarily imply correlation.

I know that independence implies uncorrelatedness, but that the converse isn't true.

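For this particular pair the answer turns out to be no: with independent rolls $R_1,R_2$, $\mathrm{Cov}(R_1+R_2,R_1-R_2)=\mathrm{Var}(R_1)-\mathrm{Var}(R_2)=0$, so $A$ and $B$ are uncorrelated despite being dependent. A brute-force check over all 36 equally likely outcomes:

```python
from itertools import product

# Enumerate every outcome of two fair dice and compute Cov(A, B) exactly,
# where A = first + second and B = first - second.
outcomes = list(product(range(1, 7), repeat=2))
n = len(outcomes)

a_vals = [x + y for x, y in outcomes]
b_vals = [x - y for x, y in outcomes]

mean_a = sum(a_vals) / n   # 7.0
mean_b = sum(b_vals) / n   # 0.0
cov_ab = sum((a - mean_a) * (b - mean_b)
             for a, b in zip(a_vals, b_vals)) / n
print(cov_ab)   # 0.0 -- uncorrelated, yet A and B are clearly dependent
```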

asked 2022-05-08

Is the total sum of squares in multiple regression the same as the total sum of squares in ANOVA?

Is ANOVA a test for bivariate correlation or for multiple regression?


asked 2022-07-19

A consumer organization estimates that over a 1-year period 20% of cars will need to be repaired once, 6% will need repairs twice, and 2% will require three or more repairs. What is the probability that a car chosen at random will need repairs?

asked 2022-05-03

Table of values

$\begin{array}{cccccc}x& 1& 2& 3& 4& 5\\ y& 3& 6& 8& 9& 0\\ y& 4& 6& 1& 2& 4\end{array}$

Given this table of values, and knowing that it is a simple linear regression model, what is the value of $n$? I think it is either $5$ or $10$, but I am not sure which.


asked 2022-05-24

I have a set $S$ of coordinates $(x,y)$, and am estimating $f(x)=ax+b$ where $a>0$. I also happen to know that $\forall x,y\;((x,y)\in S\implies y<f(x))$.

The question is how I can utilize this knowledge of the upper bound on values to improve the regression result?

My intuition is to run a "normal" linear regression on all coordinates in $S$, giving $g(x)$, and then construct ${g}^{\prime}(x)=g(x)+c$, with $c$ being the lowest number such that $\forall x,y\;((x,y)\in S\implies y\le {g}^{\prime}(x))$, i.e. such that ${g}^{\prime}$ lies as low as it can whilst still lying on or above every point of $S$, touching at least one point. I do, however, have absolutely no idea whether this is the best way to do it, nor how to devise an algorithm that does it efficiently.

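The shift-by-$c$ construction described above is straightforward to implement: after an ordinary least-squares fit $g$, the smallest feasible shift is $c=\max_i\,(y_i-g(x_i))$. A sketch in Python, with made-up data:

```python
# Sketch (hypothetical data): OLS fit, then shift the line up by the
# smallest c that puts every point on or below it.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 2.9, 2.1, 4.8, 4.1]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
# Ordinary least squares: g(x) = a*x + b
a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
b = my - a * mx

# Smallest c with y_i <= g(x_i) + c for all i: the largest residual.
c = max(y - (a * x + b) for x, y in zip(xs, ys))

def g_prime(x):
    return a * x + b + c

print(a, b, c)
```

Note that $c\ge 0$ always, since OLS residuals sum to zero, so their maximum cannot be negative.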

asked 2022-07-09

In logistic regression, the regression coefficients ($\hat{{\beta}_{0}},\hat{{\beta}_{1}}$) are calculated via the general method of maximum likelihood. For a simple logistic regression, the likelihood function is given as

$\ell ({\beta}_{0},{\beta}_{1})=\prod _{i:{y}_{i}=1}p({x}_{i})\prod _{{i}^{\prime}:{y}_{{i}^{\prime}}=0}(1-p({x}_{{i}^{\prime}})).$

What is the maximum likelihood function for $2$ predictors? Or $3$ predictors?

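For $p$ predictors the likelihood keeps exactly the same product form; only $p(x)$ changes, using the linear predictor ${\beta}_{0}+{\beta}_{1}{x}_{1}+\dots +{\beta}_{p}{x}_{p}$. A sketch in Python (the data and coefficients below are made up):

```python
import math

def p(x, beta):
    """Logistic response p(x) = 1 / (1 + exp(-(b0 + b1*x1 + ... + bp*xp)))."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

def likelihood(X, y, beta):
    """ell(beta) = prod_{i: y_i=1} p(x_i) * prod_{i': y_i'=0} (1 - p(x_i'))."""
    out = 1.0
    for xi, yi in zip(X, y):
        out *= p(xi, beta) if yi == 1 else 1.0 - p(xi, beta)
    return out

# Two predictors (p = 2), hypothetical data:
X = [(0.5, 1.0), (2.0, -1.0), (1.5, 0.5)]
y = [1, 0, 1]
print(likelihood(X, y, beta=(0.1, 0.8, -0.3)))
```

Maximum likelihood then picks the $\beta$ that maximizes this product (in practice, its logarithm).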

asked 2022-07-07

Why is polynomial regression considered a kind of linear regression? For example, the hypothesis function is

$h(x;{t}_{0},{t}_{1},{t}_{2})={t}_{0}+{t}_{1}x+{t}_{2}{x}^{2},$

and the sample points are

$({x}_{1},{y}_{1}),({x}_{2},{y}_{2}),\dots $

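The reason is that $h$ is linear in the parameters ${t}_{0},{t}_{1},{t}_{2}$, even though it is nonlinear in $x$: treating $(1,x,{x}^{2})$ as three separate features turns the fit into ordinary linear least squares. A sketch in Python (made-up data):

```python
import numpy as np

# Fit h(x) = t0 + t1*x + t2*x^2 by *linear* least squares: the model is
# nonlinear in x but linear in the parameters, so a design matrix with
# columns (1, x, x^2) reduces it to ordinary linear regression.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 + 1.0 * x + 0.5 * x ** 2          # exact quadratic, no noise

A = np.column_stack([np.ones_like(x), x, x ** 2])   # design matrix
t, *_ = np.linalg.lstsq(A, y, rcond=None)
print(t)   # approximately [2.0, 1.0, 0.5]
```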