Bryan estimates the cost of a vacation to be $730. The actual cost of the vacation is $850. What is the percent error?

posader86
2022-07-20
Answered

dasse9

Answered 2022-07-21

First, find the error, the difference between the actual cost and the estimate:

$\$850-\$730=\$120$

Next, divide the error by the actual cost:

$\frac{\$120}{\$850}\approx 0.14118$

Multiply by 100% to express this as a percentage:

$0.14118\times 100\mathrm{\%}=14.118\mathrm{\%}$

Thus, the percent error is about 14.118%.
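As a quick sanity check, the same computation can be scripted (a minimal Python sketch; the function name `percent_error` is my own, not from the original problem):

```python
def percent_error(estimate, actual):
    """Percent error: |actual - estimate| / actual * 100."""
    return abs(actual - estimate) / actual * 100

# Bryan's estimate of $730 against the actual cost of $850.
print(round(percent_error(730, 850), 3))  # 14.118
```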


asked 2022-06-30

If the coefficients of a simple regression line, ${B}_{0}$ and ${B}_{1}$, are the same, then why are the regression lines of $y$ on $x$ and $x$ on $y$ different, given that ${r}^{2}<1$? I have tried all the manipulation and graphical analysis I can, but I can't seem to see why this is happening.

asked 2022-07-18

Correlation: Concept to Formula

In digital signal processing, we calculate the correlation between two discrete signals by multiplying corresponding samples of the two signals and then adding the products. Where does this process/formula for correlation come from?

I understand the concept of correlation (similarity) between two signals. But I fail to understand how it translates to the formula that it does.

All the texts I have seen so far start with this formula and explain cross correlation, auto correlation, etc. None of them attempt to explain how the formula was derived in the first place.
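The process the question describes, multiplying corresponding samples and summing the products, can be sketched numerically (a minimal Python illustration; the function name is my own, and this shows only the zero-lag case):

```python
def correlation_sum(x, y):
    # Multiply corresponding samples and add the products,
    # as described in the question (zero-lag cross-correlation).
    return sum(a * b for a, b in zip(x, y))

x = [1, 2, 3]
z = [3, -2, 1]

print(correlation_sum(x, x))  # 14 (a signal with itself: large value)
print(correlation_sum(x, z))  # 2  (dissimilar signals: small value)
```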


asked 2022-05-23

A random sample of size $n$ from a bivariate distribution is denoted by $({x}_{r},{y}_{r}),r=1,2,3,...,n$. Show that if the regression line of $y$ on $x$ passes through the origin of its scatter diagram then

$$\overline{y}\sum _{r=1}^{n}{x}_{r}^{2}=\overline{x}\sum _{r=1}^{n}{x}_{r}{y}_{r}$$

where $(\overline{x},\overline{y})$ is the mean point of the sample.
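The identity can be checked numerically before attempting the proof. In the sketch below (a minimal pure-Python check; the data values are illustrative), a least-squares line is fitted and $y$ is shifted so the fitted intercept is exactly zero, i.e. the regression line passes through the origin; the two sides of the identity then agree:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.0]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Least-squares slope and intercept of y on x.
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
     / sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar

# Shift y so the fitted intercept is exactly 0 (line through the origin).
y = [yi - b0 for yi in y]
ybar = sum(y) / n

lhs = ybar * sum(xi ** 2 for xi in x)
rhs = xbar * sum(xi * yi for xi, yi in zip(x, y))
print(abs(lhs - rhs) < 1e-9)  # True
```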


asked 2022-07-16

Suppose $Z(t)=\sum_{k=1}^{n}X_k e^{j({\omega}_{0}t+{\Phi}_{k})}$, $t\in \mathbb{R}$, where ${\omega}_{0}$ is a constant, $n$ is a fixed positive integer, ${X}_{1},\dots,{X}_{n},{\Phi}_{1},\dots,{\Phi}_{n}$ are mutually independent random variables, $E{X}_{k}=0$, $D{X}_{k}={\sigma}_{k}^{2}$, and ${\Phi}_{k}\sim U[0,2\pi ]$, $k=1,2,\dots,n$. Find the mean function and correlation function of $\{Z(t),\ t\in \mathbb{R}\}$.

I have tried to solve it.

For the mean function,

$m_Z(t)=E\{Z(t)\}=E\left\{\sum_{k=1}^{n}X_k e^{j(\omega_0 t+\Phi_k)}\right\}$

For the correlation function,

$R_Z(s,u)=E\{Z(s)Z(u)\}$

$=E\left\{\sum_{k=1}^{n}X_k e^{j(\omega_0 s+\Phi_k)}\sum_{l=1}^{n}X_l e^{j(\omega_0 u+\Phi_l)}\right\}$

$=E\left\{\sum_{k=1}^{n}\sum_{l=1}^{n}X_k X_l e^{j(\omega_0 s+\Phi_k)}e^{j(\omega_0 u+\Phi_l)}\right\}$

I am stuck here. How do I proceed from this point?


asked 2022-07-22

Correlation and causation data

What are some real life examples (data) in which high correlation:

1) implies causation

2) doesn't imply causation.

I know there is a lot of data out there showing weird correlations, like divorce rate and consumption of margarine, but the problem with such data is that we don't really know whether one is caused by the other, because nobody tested it, so we cannot strictly say that they are unrelated.


asked 2022-05-02

I am getting ${f}_{X,Y}(x,y)={f}_{X}(x){f}_{Y}(y)$ even if the correlation coefficient $\rho \ne 0$

asked 2022-06-02

If the joint density function of $X$ and $Y$ is given by:

$f(x,y)=\{\begin{array}{ll}1/2,& \text{for } |x|+|y|\le 1\\ 0,& \text{otherwise}\end{array}$

Show that $Y$ has constant regression with respect to $X$, but that $X$ and $Y$ are not independent.

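Before proving this, a Monte Carlo sanity check can make both claims plausible (a Python sketch, not a proof; the slice boundaries and sample size are illustrative). The conditional mean of $Y$ in a thin $X$-slice comes out near zero, while conditioning on $|X|>0.5$ forces $|Y|\le 0.5$, so $X$ and $Y$ cannot be independent:

```python
import random

random.seed(1)

# Rejection-sample uniform points from the diamond |x| + |y| <= 1,
# which matches the density f(x, y) = 1/2 on that region.
pts = []
while len(pts) < 200_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if abs(x) + abs(y) <= 1:
        pts.append((x, y))

# Constant regression: E[Y | X in a thin slice] is approximately 0.
slice_ys = [y for x, y in pts if 0.4 < x < 0.5]
slice_mean = sum(slice_ys) / len(slice_ys)
print(abs(slice_mean) < 0.05)

# Dependence: |X| > 0.5 and |Y| > 0.5 cannot both hold inside the
# diamond, so the conditional probability below is exactly 0 while the
# unconditional probability is about 0.25.
p_y = sum(1 for x, y in pts if abs(y) > 0.5) / len(pts)
p_y_given_x = (
    sum(1 for x, y in pts if abs(y) > 0.5 and abs(x) > 0.5)
    / sum(1 for x, y in pts if abs(x) > 0.5)
)
print(p_y, p_y_given_x)
```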