Assume we have ${Y}_{1},\dots,{Y}_{n}$, an iid sample from the uniform distribution $U(0,\theta)$, and let $T({Y}_{1},\dots,{Y}_{n})=\min_{1\le i\le n}{Y}_{i}$.

gfresh86iop
2022-11-19
Answered
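As a quick check of the setup, a minimal simulation sketch (the values $\theta = 2$ and $n = 10$ are illustrative choices, not from the question) confirms the standard fact that $E[T]=\theta/(n+1)$, which follows from $P(T>t)=(1-t/\theta)^{n}$:

```python
import random

# Simulate T = min(Y_1, ..., Y_n) for an iid U(0, theta) sample.
# theta = 2.0 and n = 10 are illustrative choices, not from the question.
random.seed(1)
theta, n, reps = 2.0, 10, 100_000

mins = [min(random.uniform(0, theta) for _ in range(n)) for _ in range(reps)]
est = sum(mins) / reps

# Compare the simulated mean of T with theta / (n + 1)
print(est, theta / (n + 1))
```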


Arely Davila

Answered 2022-11-20
Author has **17** answers

$ty''-ty'+y=1$

Taking the Laplace transform term by term, using $\mathcal{L}\{tf(t)\}=-\frac{d}{ds}\mathcal{L}\{f\}(s)$:

$-{\left({s}^{2}\mathcal{L}(y)-sy(0)-{y}^{\prime}(0)\right)}^{\prime}+{\left(s\mathcal{L}(y)-y(0)\right)}^{\prime}+\mathcal{L}(y)=\frac{1}{s}$

Expanding the derivatives and taking $y(0)=0$,

$(s-{s}^{2}){\mathcal{L}}^{\prime}(y)+(2-2s)\mathcal{L}(y)=\frac{1}{s}$

${\mathcal{L}}^{\prime}(y)+\frac{2}{s}\mathcal{L}(y)=\frac{1}{{s}^{2}(1-s)}$

With integrating factor ${s}^{2}$ we have

$\begin{aligned}({s}^{2}\mathcal{L}(y))' &= \frac{1}{1-s}\\ y &= \mathcal{L}^{-1}\frac{-\ln(1-s)}{{s}^{2}}+Ct\\ &= \int_{0}^{t}\frac{{e}^{x}}{x}(t-x)\,dx+Ct\\ &= t\int_{0}^{t}\frac{{e}^{x}}{x}\,dx-{e}^{t}+1+Ct\\ &= t\,\mathrm{Ei}(t)-{e}^{t}+1+Ct\end{aligned}$
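The closed form can be checked numerically. The sketch below (pure Python; the series truncation for $\mathrm{Ei}$ and the finite-difference step size are implementation choices, not part of the answer) evaluates $y(t)=t\,\mathrm{Ei}(t)-e^{t}+1+Ct$ and verifies that $ty''-ty'+y=1$:

```python
import math

def Ei(t, terms=60):
    # Exponential integral via its power series:
    # Ei(t) = gamma + ln(t) + sum_{k>=1} t^k / (k * k!)
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = gamma + math.log(t)
    term = 1.0
    for k in range(1, terms + 1):
        term *= t / k          # term == t^k / k!
        total += term / k
    return total

def y(t, C=0.0):
    # Candidate solution from the derivation above
    return t * Ei(t) - math.exp(t) + 1 + C * t

def residual(t, C=0.0, h=1e-5):
    # Central-difference check that t*y'' - t*y' + y - 1 == 0
    yp = (y(t + h, C) - y(t - h, C)) / (2 * h)
    ypp = (y(t + h, C) - 2 * y(t, C) + y(t - h, C)) / h ** 2
    return t * ypp - t * yp + y(t, C) - 1

for t in (0.5, 1.0, 2.0):
    print(f"t={t}: residual {residual(t, C=3.0):.1e}")
```

The residuals are at the level of floating-point noise for any $C$, consistent with $Ct$ spanning the homogeneous solutions.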


asked 2022-10-27

Suppose $X\sim N(\mu ,{\sigma}^{2})$. I know that $T(X)=(\overline{X},{S}^{2})$ is a complete sufficient statistic for $\mu ,{\sigma}^{2}$ if $\mu ,{\sigma}^{2}$ are unknown. But if $\mu ,{\sigma}^{2}$ is known, is ${S}^{2}$ still a complete statistic of ${\sigma}^{2}$?

asked 2022-10-30

Difference between the factorization theorem and the Fisher-Neyman theorem for $t$ to be a sufficient estimator of $\theta$

asked 2022-11-02

Derive the Cramér-von Mises test statistic

$$n{C}_{n}=\frac{1}{12n}+\sum _{i=1}^{n}{({U}_{(i)}-\frac{2i-1}{2n})}^{2}$$

where ${U}_{(i)}={F}_{0}({X}_{(i)})$ are the transformed order statistics.
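The computing form of the statistic is easy to evaluate directly; a minimal sketch (the sample values below are made up for illustration):

```python
def cramer_von_mises(u_sorted):
    # n*C_n = 1/(12n) + sum_i (U_(i) - (2i-1)/(2n))^2,
    # where u_sorted holds the ordered values U_(i) = F0(X_(i))
    n = len(u_sorted)
    return 1 / (12 * n) + sum(
        (u - (2 * i - 1) / (2 * n)) ** 2
        for i, u in enumerate(u_sorted, start=1)
    )

# When the U_(i) sit exactly on the grid points (2i-1)/(2n), every
# squared term vanishes and the statistic attains its minimum 1/(12n):
print(cramer_von_mises([0.1, 0.3, 0.5, 0.7, 0.9]))  # 1/60
```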


asked 2022-07-16

Show that if a function of a sufficient statistic is ancillary, then the sufficient statistic is not complete.

asked 2022-10-29

If $\overline{X}$ and ${S}^{2}$ are the usual sample mean and sample variance based on a random sample of $n$ observations from $N(\mu ,{\sigma}^{2})$, and $T=\frac{(\overline{X}-\mu )\sqrt{n}}{S}$, prove that $\operatorname{Cov}(\overline{X},T)=\sigma \frac{\sqrt{n-1}\,\mathrm{\Gamma}\left(\frac{n-2}{2}\right)}{\sqrt{2n}\,\mathrm{\Gamma}\left(\frac{n-1}{2}\right)}$
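The claimed covariance can be sanity-checked by Monte Carlo (a sketch; $\mu = 0$, $\sigma = 1$, $n = 7$, and the replication count are illustrative choices):

```python
import math
import random

# Monte Carlo check of Cov(Xbar, T) against the closed form
# sigma * sqrt(n-1) * Gamma((n-2)/2) / (sqrt(2n) * Gamma((n-1)/2)).
# mu = 0, sigma = 1, n = 7 are illustrative choices.
random.seed(42)
mu, sigma, n, reps = 0.0, 1.0, 7, 40_000

pairs = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    t = (xbar - mu) * math.sqrt(n) / s
    pairs.append((xbar, t))

mx = sum(x for x, _ in pairs) / reps
mt = sum(t for _, t in pairs) / reps
cov = sum((x - mx) * (t - mt) for x, t in pairs) / reps

formula = sigma * math.sqrt(n - 1) * math.gamma((n - 2) / 2) / (
    math.sqrt(2 * n) * math.gamma((n - 1) / 2))
print(cov, formula)
```

The key structural fact behind the closed form is that $\overline{X}$ and $S$ are independent for normal samples, so $E[\overline{X}T]$ factors into $\sqrt{n}\,E[\overline{X}^{2}]\,E[1/S]$.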

asked 2022-11-07

Let $(\mathcal{X},\mathcal{M})$, $(\mathcal{Y},\mathcal{N})$ and $(\mathcal{Z},\mathcal{O})$ be measurable spaces, and let $f:\mathcal{X}\to \mathcal{Y}$ and $g:\mathcal{Y}\to \mathcal{Z}$ be measurable functions. Then $\sigma (g\circ f)=\sigma (f)$ if and only if $g$ is bijective, where $\sigma (g\circ f)=(g\circ f{)}^{-1}(\mathcal{O})={f}^{-1}({g}^{-1}(\mathcal{O}))$ and $\sigma (f)={f}^{-1}(\mathcal{N})$.

asked 2022-10-28

For a Mann-Whitney test do we use the $T$ value from the smallest sample size as the test statistic or the smallest $T$ value?
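For concreteness, here is a sketch of computing both rank sums (no ties assumed; conventions for which value to compare against a critical-value table vary by textbook, though tables are commonly indexed by the smaller sample's rank sum, so check the table you are using):

```python
def rank_sums(x, y):
    # Pool both samples, rank them 1..n, and return each sample's
    # rank sum T (assumes no tied observations)
    pooled = sorted((v, label) for label, sample in (("x", x), ("y", y))
                    for v in sample)
    tx = sum(rank for rank, (_, label) in enumerate(pooled, 1) if label == "x")
    ty = sum(rank for rank, (_, label) in enumerate(pooled, 1) if label == "y")
    return tx, ty

# x takes ranks 1 and 3; y takes ranks 2, 4, 5
print(rank_sums([1.0, 3.0], [2.0, 4.0, 5.0]))  # (4, 11)
```

Note the two rank sums always add to $N(N+1)/2$ for $N$ pooled observations, so either determines the other.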