So my task is to find out whether the sequence of random variables defined by $X_n(t) = n \cdot (-t)^n$ with $t \in [0,1]$ converges almost surely to some random variable $X$ on the probability space $([0,1], \mathcal{B}([0,1]), \lambda_{[0,1]})$. As a first step I tried to find $X$ by computing $\lim_{n \to \infty} X_n(t)$, which gives me the following:

$X(t) = \begin{cases} 0, & 0 \le t < 1 \\ \text{does not exist}, & t = 1 \end{cases}$

(At $t = 1$ the sequence $X_n(1) = n \cdot (-1)^n$ oscillates between $n$ and $-n$, so there is no pointwise limit there, not even $\pm\infty$.)
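A quick numerical sketch (purely illustrative; the function name `X` is my own) of the two pointwise regimes, decay to $0$ for $t < 1$ versus unbounded oscillation at $t = 1$:

```python
def X(n, t):
    """X_n(t) = n * (-t)**n, the sequence from the question."""
    return n * (-t) ** n

# For 0 <= t < 1 the exponential decay of t**n beats the factor n,
# so |X_n(t)| -> 0; sample at t = 0.9:
inside = [X(n, 0.9) for n in (10, 50, 100, 200)]
print(inside)  # magnitudes shrink toward 0

# At t = 1 the sequence is n * (-1)**n, which oscillates with
# growing magnitude, so no limit exists (not even +/- infinity):
boundary = [X(n, 1.0) for n in (10, 11, 12, 13)]
print(boundary)  # [10.0, -11.0, 12.0, -13.0]
```

This matches the case distinction above: the only problematic point is $t = 1$, which is a $\lambda$-null set.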

So my question now is whether it is even possible for this sequence of random variables to converge at all. I would say no, but I'm not sure, because $\{1\}$ is a null set for the Lebesgue measure. If I'm correct, is it therefore okay to conclude that any sequence of random variables without a unique limit function fails to converge under any probability measure?