
Arectemieryf0

Answered

2022-07-21

Let $r = r(t)$ and $\theta = \theta(t)$ with $r(t) > 0$. Let $x(t) = r(t)\cos(\theta(t))$ and $y(t) = r(t)\sin(\theta(t))$. Prove that
$$\frac{d\theta}{dt} = \frac{1}{x^2 + y^2}\left(x\,\frac{dy}{dt} - y\,\frac{dx}{dt}\right).$$
The hint is to use $y(t)/x(t)$ and implicit differentiation, but I can't see how to apply that hint to solve this problem.

Answer & Explanation

Tristan Pittman

Expert

2022-07-22 · Added 14 answers

We have
$$\frac{y(t)}{x(t)} = \tan(\theta(t)).$$
Now differentiate both sides with respect to $t$. On the left-hand side, use the quotient rule; on the right-hand side, use the chain rule:
$$\frac{d}{dt}\tan(\theta(t)) = \frac{1}{\cos^2(\theta(t))}\,\frac{d\theta}{dt}.$$
Equating the two sides gives
$$\frac{x\,\frac{dy}{dt} - y\,\frac{dx}{dt}}{x^2} = \frac{1}{\cos^2(\theta)}\,\frac{d\theta}{dt}.$$
Since $\cos(\theta) = x/r$ and $r^2 = x^2 + y^2$, we have $\frac{1}{\cos^2(\theta)} = \frac{x^2 + y^2}{x^2}$, so the factors of $x^2$ cancel when you rearrange for $\frac{d\theta}{dt}$.
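As a sanity check, the identity can be verified symbolically with SymPy. This is only a sketch: `r` and `theta` below stand for arbitrary differentiable functions of $t$, with $r$ assumed positive as in the problem statement.

```python
import sympy as sp

t = sp.symbols('t')
r = sp.Function('r', positive=True)(t)   # arbitrary positive radius r(t)
theta = sp.Function('theta')(t)          # arbitrary angle theta(t)

x = r * sp.cos(theta)                    # x(t) = r(t) cos(theta(t))
y = r * sp.sin(theta)                    # y(t) = r(t) sin(theta(t))

# Right-hand side of the identity to prove
rhs = (x * sp.diff(y, t) - y * sp.diff(x, t)) / (x**2 + y**2)

# The difference from d(theta)/dt should simplify to zero
print(sp.simplify(rhs - sp.diff(theta, t)))  # 0
```

The cross terms involving $r'(t)$ cancel in the numerator, leaving $r^2\,\theta'$ over $r^2$, which is exactly why `simplify` reduces the difference to zero.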
