Assume $S$ follows the geometric Brownian motion dynamics $dS=\mu S\,dt+\sigma S\,dZ$, with $\mu$ and $\sigma$ constants. Derive the stochastic differential equation satisfied by $y=2S$, $y=S^2$, and $y=e^S$.

gobeurzb
2022-09-25
Answered

Deriving stochastic differential equations. I am having difficulty deriving stochastic differential equations from geometric Brownian motion dynamics.

Assume $S$ follows the geometric Brownian motion dynamics $dS=\mu S\,dt+\sigma S\,dZ$, with $\mu$ and $\sigma$ constants. Derive the stochastic differential equation satisfied by $y=2S$, $y=S^2$, and $y=e^S$.



Simeon Hester

Answered 2022-09-26

Step 1

It seems like you want us to do your homework, so here I will explain how to do it yourself.

The key is to use Itô's formula: if $f\in\mathcal{C}^2$, then

$$f(S_t)=f(S_0)+\int_0^t f'(S_s)\,dS_s+\frac{1}{2}\int_0^t f''(S_s)\,d[S]_s$$

Step 2

And so considering the differential form:

$$df(S_t)=f'(S_t)\,dS_t+\frac{1}{2}f''(S_t)\,d[S]_t$$

where of course $[S]_t$ denotes the quadratic variation of $S$. In your case $[S]_t=\int_0^t \sigma^2 S_s^2\,ds$, assuming $Z_t$ is a Brownian motion.

You can apply this formula to $f(x)=2x$, $f(x)=x^2$, and $f(x)=e^x$.
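For instance, carrying the hint through explicitly (my own worked sketch), using $dS=\mu S\,dt+\sigma S\,dZ$ and $d[S]_t=\sigma^2 S^2\,dt$:

For $y=2S$ (so $f'=2$, $f''=0$):

$$dy=2\,dS=2\mu S\,dt+2\sigma S\,dZ=\mu y\,dt+\sigma y\,dZ$$

For $y=S^2$ (so $f'(x)=2x$, $f''(x)=2$):

$$dy=2S\,dS+\sigma^2 S^2\,dt=(2\mu+\sigma^2)\,y\,dt+2\sigma\,y\,dZ$$

For $y=e^S$ (so $f'(x)=f''(x)=e^x$):

$$dy=e^S\,dS+\tfrac{1}{2}\sigma^2 S^2 e^S\,dt=\left(\mu S+\tfrac{1}{2}\sigma^2 S^2\right)e^S\,dt+\sigma S e^S\,dZ$$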

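As a quick numerical sanity check on the $y=S^2$ case (a minimal sketch of my own, not part of the original answer; the parameter values are arbitrary), Itô's formula gives $dy=(2\mu+\sigma^2)y\,dt+2\sigma y\,dZ$, so $\mathbb{E}[S_t^2]=S_0^2 e^{(2\mu+\sigma^2)t}$, which one can compare against exact GBM samples:

```python
import numpy as np

# Monte Carlo check: for y = S^2, Ito's formula gives
#     dy = (2*mu + sigma**2) * y * dt + 2*sigma * y * dZ,
# so E[y_t] = y_0 * exp((2*mu + sigma**2) * t).  We compare this with
# exact GBM samples S_t = S_0 * exp((mu - sigma**2/2)*t + sigma*Z_t).

rng = np.random.default_rng(0)
mu, sigma, S0, t = 0.05, 0.2, 1.0, 1.0   # arbitrary illustration values
n_paths = 200_000

Z_t = np.sqrt(t) * rng.standard_normal(n_paths)            # Z_t ~ N(0, t)
S_t = S0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * Z_t)

mc_mean = np.mean(S_t**2)                                  # Monte Carlo E[S_t^2]
theory = S0**2 * np.exp((2 * mu + sigma**2) * t)           # from the SDE drift

print(mc_mean, theory)                                     # should agree to ~1%
```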

asked 2021-01-02

$w=x\sin y$, $x=e^t$, $y=\pi-t$; find $\frac{dw}{dt}$ using the appropriate Chain Rule and evaluate $\frac{dw}{dt}$ at the given value $t=0$.

asked 2022-03-31

How to solve $K\mathbf{x}''(t)=\left[\begin{array}{cc}-1& 0\\ 0& -1\end{array}\right]\mathbf{x}(t)$

What I'm thinking is to consider the first-order version

$\mathbf{x}'(t)=\left[\begin{array}{cc}-1& 0\\ 0& -1\end{array}\right]\mathbf{x}(t)$

which I know how to solve; the solution is

$\mathbf{x}(t)=\mathbf{c}\left[\begin{array}{c}{e}^{-t}\\ {e}^{-t}\end{array}\right]$

How do I use this to solve the second-order equation?
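One possible route (my own sketch, not part of the question, assuming $K$ is invertible): rewrite the equation as $\mathbf{x}''(t)=-K^{-1}\mathbf{x}(t)$ and diagonalize $K^{-1}$. In the simplest case $K=kI$ with $k>0$, the components decouple into $x_i''=-x_i/k$, whose solutions are sinusoids:

$$x_i(t)=a_i\cos\!\left(\frac{t}{\sqrt{k}}\right)+b_i\sin\!\left(\frac{t}{\sqrt{k}}\right),\qquad i=1,2,$$

with $a_i$, $b_i$ fixed by the initial conditions. Note the second-order equation has a four-dimensional solution space, so the first-order version above does not capture all solutions.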

asked 2022-03-29

Proving single solution of initial value problem is increasing

Given the initial value problem

$y'(x)=y(x)-\sin(y(x)),\quad y(0)=1$


asked 2022-04-10

Trying to understand eigenvalues with respect to differential equations.

I am trying to understand how to find eigenvalues from a matrix consisting of exponential terms, considering a differential equation. The examples I've seen online are ODEs. Without using a vector with exponential terms, here is what I have learned.

$\frac{d}{dt}\overrightarrow{x}\left(t\right)=\lambda \overrightarrow{x}\left(t\right)$

$\frac{d}{dt}\left[\begin{array}{c}{x}_{1}\left(t\right)\\ {x}_{2}\left(t\right)\\ {x}_{3}\left(t\right)\end{array}\right]=\left[\begin{array}{ccc}{\lambda}_{1}& 0& 0\\ 0& {\lambda}_{2}& 0\\ 0& 0& {\lambda}_{3}\end{array}\right]\left[\begin{array}{c}{x}_{1}(t=0)\\ {x}_{2}(t=0)\\ {x}_{3}(t=0)\end{array}\right]$

Here is what I am trying to understand

[Image: section of a paper with an imaginary eigenvalue]

In this paper, the following assumption is made.

$\overrightarrow{J}\left(t\right)=\left|\overrightarrow{J}\right|{e}^{-i\omega t}=\left[\begin{array}{c}\left|{J}_{x}\right|{e}^{-i\omega t}\\ \left|{J}_{y}\right|{e}^{-i\omega t}\\ \left|{J}_{z}\right|{e}^{-i\omega t}\end{array}\right]$

They are using partial derivatives (I believe this can be viewed as an ODE then?). Differentiating with respect to time I believe yields the following. Please correct me if I am wrong.

$\frac{\partial}{\partial t}\overrightarrow{J}\left(t\right)=\frac{\partial}{\partial t}\left[\begin{array}{c}\left|{J}_{x}\right|{e}^{-i\omega t}\\ \left|{J}_{y}\right|{e}^{-i\omega t}\\ \left|{J}_{z}\right|{e}^{-i\omega t}\end{array}\right]=\left[\begin{array}{ccc}-i\omega & 0& 0\\ 0& -i\omega & 0\\ 0& 0& -i\omega \end{array}\right]\left[\begin{array}{c}\left|{J}_{x}\right|{e}^{-i\omega t}{\mid}_{t=0}\\ \left|{J}_{y}\right|{e}^{-i\omega t}{\mid}_{t=0}\\ \left|{J}_{z}\right|{e}^{-i\omega t}{\mid}_{t=0}\end{array}\right]$

Or

$\frac{\partial}{\partial t}\left[\begin{array}{c}\left|{J}_{x}\right|{e}^{-i\omega t}\\ \left|{J}_{y}\right|{e}^{-i\omega t}\\ \left|{J}_{z}\right|{e}^{-i\omega t}\end{array}\right]=\left[\begin{array}{ccc}-i\omega & 0& 0\\ 0& -i\omega & 0\\ 0& 0& -i\omega \end{array}\right]\left[\begin{array}{c}\left|{J}_{x}\right|\\ \left|{J}_{y}\right|\\ \left|{J}_{z}\right|\end{array}\right]$

Are my assumptions correct? If so, is there a deeper analysis of why this is the case with exponential terms?
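A quick numerical check of the derivative identity (my own illustration; the values of $\omega$ and the amplitudes are made up): for $\overrightarrow{J}(t)=|\overrightarrow{J}|e^{-i\omega t}$, the time derivative is $(-i\omega)\overrightarrow{J}(t)$, i.e. the diagonal matrix $\operatorname{diag}(-i\omega)$ acts on the current vector $\overrightarrow{J}(t)$, not on the $t=0$ amplitudes alone.

```python
import numpy as np

# For J(t) = |J| * exp(-i*omega*t), the analytic time derivative is
# (-i*omega) * J(t).  We verify against a finite-difference approximation.

omega = 2.0
J_abs = np.array([1.0, 0.5, 2.0])      # |J_x|, |J_y|, |J_z| (assumed values)
t = 0.3

J = J_abs * np.exp(-1j * omega * t)    # J(t)
dJ_dt = -1j * omega * J                # analytic derivative

h = 1e-7                               # finite-difference step
J_plus = J_abs * np.exp(-1j * omega * (t + h))
fd = (J_plus - J) / h                  # forward-difference approximation

print(np.max(np.abs(fd - dJ_dt)))      # small: the two derivatives agree
```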

asked 2022-04-22

Sinusoids as solutions to differential equations

It is well known that the function

$t\mapsto a\mathrm{cos}\left(\omega t\right)+b\mathrm{sin}\left(\omega t\right)$

is the solution to the differential equation:

$x''(t)=-{\omega}^{2}x(t)$

with the initial conditions $x(0)=a$ and $x'(0)=b\omega$. I was wondering what differential equation will be solved by

$$f(t)=\sum_{j=1}^{k}\left({a}_{j}\cos\left({\omega}_{j}t\right)+{b}_{j}\sin\left({\omega}_{j}t\right)\right).$$

It is obvious that this satisfies $x(t)=\sum_{j=1}^{k}{x}_{j}(t)$ with ${x}_{j}''(t)=-{\omega}_{j}^{2}{x}_{j}(t)$ and initial conditions on ${x}_{j}(0)$ and ${x}_{j}'(0)$. I was wondering if there were any other (perhaps more natural) differential equations that are satisfied by $f$.
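One candidate answer (my own note, not part of the question): each summand is annihilated by the operator $\frac{d^2}{dt^2}+\omega_j^2$, and these constant-coefficient operators commute, so $f$ satisfies a single linear ODE of order $2k$:

$$\prod_{j=1}^{k}\left(\frac{d^2}{dt^2}+{\omega}_{j}^{2}\right)f(t)=0$$

When the $\omega_j$ are distinct and nonzero, the characteristic roots $\pm i\omega_j$ are all distinct, so this is the lowest-order linear constant-coefficient ODE annihilating a generic such $f$.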


asked 2022-03-15

Analytical solution for a separable scalar nonlinear ODE

$$\dot{x}(t)=-x(t)\sqrt{1+x(t)^{4}}\quad\text{with}\quad x(0)={x}_{0}\in \mathbb{R}$$
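A closed-form solution sketch (my own, assuming $x_0\neq 0$; the case $x_0=0$ gives $x\equiv 0$): separating variables and substituting $u=x^2$ yields $\int\frac{dx}{x\sqrt{1+x^4}}=-\frac{1}{2}\operatorname{arcsinh}\frac{1}{x^2}$, so

$$\operatorname{arcsinh}\frac{1}{x(t)^2}=2t+\operatorname{arcsinh}\frac{1}{x_0^2}\quad\Longrightarrow\quad x(t)=\frac{\operatorname{sign}(x_0)}{\sqrt{\sinh\left(2t+\operatorname{arcsinh}\frac{1}{x_0^2}\right)}}$$

One can check $x(0)=x_0$, since $\sinh(\operatorname{arcsinh}(1/x_0^2))=1/x_0^2$.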

asked 2021-06-11

If ${x}^{2}+xy+{y}^{3}=1$, find the value of $y'''$ at the point where $x=1$.