 # Expert Help with Regression Line Equation

Recent questions in Regression

## Which characteristic of a data set makes a linear regression model unreasonable? Bailee Richards 2022-11-26

## Find the meaning of 'Sxx' and 'Sxy' in simple linear regression zweifelndcuv 2022-11-24
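For reference, these are the conventional definitions in simple linear regression (notation varies slightly by textbook):

```latex
S_{xx} = \sum_{i=1}^{n}(x_i - \bar{x})^2, \qquad
S_{xy} = \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}), \qquad
S_{yy} = \sum_{i=1}^{n}(y_i - \bar{y})^2,
```

so that the least-squares slope is $\hat{\beta}_1 = S_{xy}/S_{xx}$.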

## In the least-squares regression line, the desired sum of the errors (residuals) should be a) zero b) positive c) 1 d) negative e) maximized Ricky Arias 2022-11-07
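A small numeric sketch (made-up data) showing that the residuals of an ordinary least-squares fit with an intercept sum to zero up to floating-point error:

```python
import numpy as np

# Fit an ordinary least-squares line to synthetic data and check
# that the residuals sum to (numerically) zero.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 50)

# np.polyfit with degree 1 returns [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

print(abs(residuals.sum()))  # tiny: zero up to floating-point error
```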

## Explain what this notation means: $\min_{w} \|Xw - y\|_2^2$ Messiah Sutton 2022-11-06
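The notation asks for the vector $w$ minimizing the squared Euclidean ($L_2$) norm of the residual $Xw - y$, i.e. ordinary least squares in matrix form. A sketch on made-up data:

```python
import numpy as np

# min_w ||Xw - y||_2^2 : find w minimizing the squared Euclidean norm
# of the residual vector Xw - y. Synthetic data for illustration.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(0, 0.1, 100)

# np.linalg.lstsq solves exactly this minimization
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w_hat)  # close to w_true
```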

## Why is the standardized regression coefficient in a regression model with more than one independent variable not the same as the correlation coefficient between the $x$ of interest and $y$? We have $\hat{\beta}_i = \mathrm{cor}(Y_i, X_i)\cdot\frac{\mathrm{SD}(Y_i)}{\mathrm{SD}(X_i)}$, so $\mathrm{cor}(Y_i, X_i) = \hat{\beta}_i\cdot\frac{\mathrm{SD}(X_i)}{\mathrm{SD}(Y_i)}$. The formula for the standardized regression coefficient is also $\mathit{standardizedBeta} = \hat{\beta}_i\cdot\frac{\mathrm{SD}(X_i)}{\mathrm{SD}(Y_i)}$. So shouldn't it be $\mathit{standardizedBeta} = \mathrm{cor}(Y_i, X_i)$? akuzativo617 2022-11-03
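A numeric sketch (made-up data) of why the two quantities diverge once predictors are correlated: the regression coefficient is a *partial* effect that controls for the other predictor, while the marginal correlation does not.

```python
import numpy as np

# With two *correlated* predictors, the standardized coefficient of x1
# generally differs from cor(y, x1); they coincide only when the
# predictors are uncorrelated (as in simple regression).
rng = np.random.default_rng(2)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)      # correlated with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

std_beta1 = beta[1] * x1.std() / y.std()   # standardized coefficient of x1
cor_y_x1 = np.corrcoef(y, x1)[0, 1]        # marginal correlation

print(std_beta1, cor_y_x1)  # noticeably different
```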

## The gradient of the regression line of $x$ on $y$ is $-0.2$, and the line passes through $(0, 3)$. If the equation of the line is $x = c + dy$, find the values of $c$ and $d$. Kailyn Hamilton 2022-11-02
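One reading of this question (taking the given point as $(x, y) = (0, 3)$ and the gradient of $x$ on $y$ as $dx/dy = d$):

```latex
d = -0.2, \qquad 0 = c + d \cdot 3 = c - 0.6 \;\Rightarrow\; c = 0.6,
```

so the line would be $x = 0.6 - 0.2y$.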

## A simple formula, an example, and an explanation of all the symbols and variables for basic linear regression? Trace Glass 2022-10-31
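A minimal sketch of the standard simple-linear-regression setup, with each symbol named:

```latex
y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad i = 1, \dots, n,
```

where $y_i$ is the response (dependent variable), $x_i$ the predictor (independent variable), $\beta_0$ the intercept, $\beta_1$ the slope, and $\varepsilon_i$ a random error term. For example, $y_i$ could be a house's sale price and $x_i$ its floor area; the fitted line then predicts price from area.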

## How do I reduce the equation $p=\frac{e^{\beta_0+\beta_1\cdot age}}{e^{\beta_0+\beta_1\cdot age}+1}$ to $\log_e\frac{p}{1-p}=\beta_0+\beta_1\cdot age$? Angel Kline 2022-10-18
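A sketch of the algebra, writing $\eta = \beta_0 + \beta_1\cdot age$ for brevity:

```latex
p = \frac{e^{\eta}}{e^{\eta}+1}
\;\Rightarrow\;
1 - p = \frac{1}{e^{\eta}+1}
\;\Rightarrow\;
\frac{p}{1-p} = e^{\eta}
\;\Rightarrow\;
\log_e\frac{p}{1-p} = \beta_0 + \beta_1\cdot age.
```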

## Suppose we have the regression model $y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, where $y_i = (Y_i - \overline{Y})$ and $x_i = (X_i - \overline{X})$. The line passes through the origin iff $\beta_0 = 0$. We immediately see that $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$, where $\hat{\beta}_1$ is given by $\frac{\mathrm{COV}(X,Y)}{\mathrm{VAR}(X)}$. I don't believe this quantity is guaranteed to be 0, so would the answer be that we are unable to determine whether the regression line passes through the origin? bergvolk0k 2022-10-17
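A numeric sketch (made-up data): once both variables are centered, their sample means are zero, so the fitted intercept $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$ vanishes and the fitted line passes through the origin.

```python
import numpy as np

# After centering both variables, the fitted OLS intercept is
# (numerically) zero, so the fitted line passes through the origin.
rng = np.random.default_rng(3)
X = rng.uniform(0, 10, 200)
Y = 4.0 + 2.0 * X + rng.normal(0, 1, 200)

x = X - X.mean()   # centered predictor: mean 0
y = Y - Y.mean()   # centered response:  mean 0

slope, intercept = np.polyfit(x, y, 1)
print(intercept)  # zero up to floating-point error
```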

## Regression analysis is a statistical process for estimating the relationships among variables. Regression analysis is widely used for prediction and forecasting. So why is regression analysis also used as a statistical test? Winston Todd 2022-10-15

## Given this regression model: $y_i = \beta_0 + \beta_1 x_i + E_i$. All the assumptions are valid except that now $E_i \sim N(0, x_i\sigma^2)$. Find the maximum likelihood estimates of $\beta_0$ and $\beta_1$. ecoanuncios7x 2022-09-30
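One way to set this up: under $E_i \sim N(0, x_i\sigma^2)$ (assuming all $x_i > 0$), the log-likelihood is

```latex
\ell(\beta_0, \beta_1) = -\frac{n}{2}\log(2\pi\sigma^2)
  - \frac{1}{2}\sum_{i=1}^{n}\log x_i
  - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\frac{(y_i - \beta_0 - \beta_1 x_i)^2}{x_i},
```

so maximizing over $\beta_0, \beta_1$ amounts to minimizing $\sum_i (y_i - \beta_0 - \beta_1 x_i)^2 / x_i$, i.e. weighted least squares with weights $1/x_i$.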

## Linear regression: $Y = a + bX + \epsilon$. $R^2$ is usually written as a ratio of sums involving $(y_i - \bar{y})$, or as $(S_{xy})^2/(S_{xx}S_{yy})$. Have you come across this form: $R^2 = \frac{\mathrm{Var}(bX)}{\mathrm{Var}(bX) + \mathrm{Var}(\epsilon)}$? streutexw 2022-08-20
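A numeric sketch (made-up data) checking that the three forms of $R^2$ mentioned above agree for a simple linear fit with an intercept:

```python
import numpy as np

# Three equivalent ways to compute R^2 in simple linear regression.
rng = np.random.default_rng(4)
x = rng.normal(size=500)
y = 1.0 + 2.0 * x + rng.normal(0, 1.5, 500)

b, a = np.polyfit(x, y, 1)          # slope b, intercept a
fitted = a + b * x
resid = y - fitted

# (1) 1 - SS_res / SS_tot
r2_ss = 1 - (resid**2).sum() / ((y - y.mean())**2).sum()

# (2) Sxy^2 / (Sxx * Syy)
Sxx = ((x - x.mean())**2).sum()
Syy = ((y - y.mean())**2).sum()
Sxy = ((x - x.mean()) * (y - y.mean())).sum()
r2_s = Sxy**2 / (Sxx * Syy)

# (3) Var(bX) / (Var(bX) + Var(residuals)) -- the form in the question;
# it works because fitted values and residuals are uncorrelated in-sample.
r2_var = np.var(b * x) / (np.var(b * x) + np.var(resid))

print(r2_ss, r2_s, r2_var)  # all numerically equal
```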

## Specify the regression equation to estimate the DiD as $Y = \beta_0 + \beta_1\cdot[Time] + \beta_3\cdot[Time \cdot Intervention] + \beta_4\cdot[Covariates] + \epsilon$ instead of $Y = \beta_0 + \beta_1\cdot[Time] + \beta_2\cdot[Intervention] + \beta_3\cdot[Time \cdot Intervention] + \beta_4\cdot[Covariates] + \epsilon$. Would our $\beta_3$ coefficient still yield the DiD estimator? Annabathuni Seethu keerthana 2022-07-24
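For reference, in the full specification (with the $\beta_2\cdot[Intervention]$ main effect included, and ignoring covariates for brevity) the coefficients map onto group-period means as follows, which is how $\beta_3$ recovers the DiD:

```latex
E[Y \mid Time{=}0, Int{=}0] = \beta_0, \qquad
E[Y \mid Time{=}1, Int{=}0] = \beta_0 + \beta_1,
E[Y \mid Time{=}0, Int{=}1] = \beta_0 + \beta_2, \qquad
E[Y \mid Time{=}1, Int{=}1] = \beta_0 + \beta_1 + \beta_2 + \beta_3,
```

so $\beta_3$ equals the difference of the before/after differences across the two groups.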

## In a poker hand consisting of 5 cards, find the probability of holding (a) 3 aces (b) 4 hearts and 1 club (c) Cards of same suit (d) 2 aces and 3 jacks Annabathuni Seethu keerthana2022-07-24
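A counting sketch for the question above, using binomial coefficients (reading (a) as *exactly* 3 aces and (d) as exactly 2 aces and 3 jacks):

```python
from math import comb

deck = comb(52, 5)  # number of distinct 5-card hands

# (a) exactly 3 aces: choose 3 of 4 aces, then 2 of the 48 non-aces
p_a = comb(4, 3) * comb(48, 2) / deck
# (b) 4 hearts and 1 club
p_b = comb(13, 4) * comb(13, 1) / deck
# (c) all 5 cards of the same suit (any flush, straight/royal included)
p_c = 4 * comb(13, 5) / deck
# (d) 2 aces and 3 jacks
p_d = comb(4, 2) * comb(4, 3) / deck

print(p_a, p_b, p_c, p_d)
```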


## In logistic regression, the regression coefficients ($\hat{\beta}_0, \hat{\beta}_1$) are calculated via the general method of maximum likelihood. For a simple logistic regression, the likelihood function is given as $\ell(\beta_0, \beta_1) = \prod_{i:y_i=1} p(x_i) \prod_{i':y_{i'}=0}\left(1 - p(x_{i'})\right)$. What is the likelihood function for $2$ predictors? Or $3$ predictors? Gretchen Schwartz 2022-07-07
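A sketch of the likely intended answer: the product over observations keeps exactly the same form; only the linear predictor inside $p(\cdot)$ gains terms. For two predictors, for instance:

```latex
p(x_i) = \frac{e^{\beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2}}}{1 + e^{\beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2}}}, \qquad
\ell(\beta_0, \beta_1, \beta_2) = \prod_{i:y_i=1} p(x_i) \prod_{i':y_{i'}=0}\bigl(1 - p(x_{i'})\bigr),
```

and for three predictors one simply adds $\beta_3 x_{i3}$ to the exponent.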

## Why is polynomial regression considered a kind of linear regression? For example, the hypothesis function is $h(x; t_0, t_1, t_2) = t_0 + t_1 x + t_2 x^2$, and the sample points are $(x_1, y_1), (x_2, y_2), \dots$ Kyshma Parson 2022-07-07
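A numeric sketch (made-up data): the hypothesis is nonlinear in $x$ but *linear in the parameters* $t_0, t_1, t_2$, so an ordinary linear least-squares solver recovers them once the design matrix is expanded to $[1, x, x^2]$.

```python
import numpy as np

# A quadratic h(x) = t0 + t1*x + t2*x^2 is linear in t0, t1, t2,
# so plain linear least squares fits it on expanded features.
rng = np.random.default_rng(5)
x = rng.uniform(-3, 3, 200)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(0, 0.1, 200)

X = np.column_stack([np.ones_like(x), x, x**2])  # feature expansion
t, *_ = np.linalg.lstsq(X, y, rcond=None)        # ordinary linear solver

print(t)  # close to [1.0, -2.0, 0.5]
```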

## What is the difference between multi-task lasso regression and ridge regression? The optimization objective of multi-task lasso regression is $\min_w \sum_{l=1}^{L}\frac{1}{N_t}\sum_{i=1}^{N_t} J^l(w,x,y) + \gamma\sum_{l=1}^{L}\|w^l\|_2$, while that of ridge regression is $\min_w \sum_{l=1}^{L}\frac{1}{N_t} J^l(w,x,y) + \gamma\|w^l\|_2$, which looks much the same. To me, the multi-task lasso problem seems equivalent to solving a global ridge regression. So what is the difference between these two regression methods? Both of them use an $L_2$ norm. Or does it mean that in multi-task lasso regression, the shape of $W$ is $(1, n)$? vittorecostao1 2022-07-01
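A small sketch of the distinction as I read it: multi-task lasso sums *unsquared* row norms of the coefficient matrix (an $L_{2,1}$ penalty), which can drive entire rows of $W$ to exactly zero (feature selection shared across tasks), while a ridge-style penalty squares the norm and only shrinks coefficients.

```python
import numpy as np

# Illustrative coefficient matrix W: rows = features, columns = tasks.
W = np.array([[0.0, 0.0, 0.0],    # a feature zeroed out across all tasks
              [1.0, 2.0, 2.0],
              [3.0, 0.0, 4.0]])

# Multi-task lasso penalty: sum over rows of the unsquared L2 norm (L2,1)
l21 = np.linalg.norm(W, axis=1).sum()

# Ridge-style penalty: squared L2 (Frobenius) norm of W
l2_sq = (W**2).sum()

print(l21, l2_sq)
```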