# If a rocket ship is traveling at .99c for 1 year, and is streaming a video at 30 frames/sec to earth, how would the earth feed be affected? Would it show the video at a much slower rate, would it remain constant, or would it be sped up?

dilettato5t1
So let's say the rocket has a camera that's supposed to take a frame every $1/30$-th of a second. From the earth's point of view, this frequency is reduced by the Lorentz factor $\gamma$ (by how much exactly is left for you as an exercise). For simplicity, let's say that from earth it looks like the camera takes just one frame every (earth-)second.
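If you want to check the exercise, here is a quick sketch of the dilation step (the $\gamma \approx 7.09$ value and the resulting earth-frame rate are computed, not taken from the answer above, which rounds to 1 frame per second for simplicity):

```python
import math

beta = 0.99                          # speed as a fraction of c
gamma = 1 / math.sqrt(1 - beta**2)   # Lorentz factor, ~7.09

# The on-board camera takes 30 frames per second of proper time;
# in the earth frame that clock runs slow by a factor of gamma.
earth_frame_rate = 30 / gamma        # ~4.23 frames per earth-second

print(gamma, earth_frame_rate)
```

So dilation alone already cuts the rate from 30 fps to roughly 4.2 fps before the light-travel delay is even considered.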
Next comes the transmission of the stream back to earth. Let's say the rocket sends each frame individually as a single specially shaped pulse of light. Then in earth time, it sends out a frame every second. But during that second, it will have flown another $0.99c \cdot 1\,\mathrm{s} = 0.99$ light-seconds further away from earth, so each pulse has that much extra distance to cover. Hence, the time between received pulses is actually $1\,\mathrm{s} + 0.99\,\mathrm{s} = 1.99\,\mathrm{s}$.
So instead of receiving 30 frames per second, earth receives only $1/1.99 \approx 0.5$ frames per second, or about 1 frame every 2 seconds.
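The two steps above, applied exactly rather than with the rounded 1-frame-per-second simplification, reproduce the standard relativistic Doppler factor $\sqrt{(1-\beta)/(1+\beta)}$ for a receding source. A small sketch verifying that:

```python
import math

beta = 0.99
gamma = 1 / math.sqrt(1 - beta**2)

# Step 1: time dilation slows the camera's 30 fps by a factor gamma.
# Step 2: each earth-second the rocket recedes 0.99 light-seconds,
# stretching the interval between received pulses by (1 + beta).
received = (30 / gamma) / (1 + beta)          # ~2.13 frames/second

# The textbook relativistic Doppler formula gives the same number.
doppler = 30 * math.sqrt((1 - beta) / (1 + beta))

print(received, doppler)
```

The simplified numbers in the answer (1 fps after dilation, halved again by the delay) follow the same logic; the exact received rate is about 2.1 frames per second.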
Now of course, depending on your age, you may remember the good old days when it took longer to download a video than to watch it. What you'd do then, of course, is wait until enough data had loaded to resume playback (the good old buffering). If we wait long enough, we accumulate a bunch of frames from the rocket, and we can then play them back at any speed we like, including the intended 30 fps. The video will then look totally normal to us.