Rockets in vacuum vs. rockets on Earth

Consider the energy $E_1$ required to remove a mass $m$ of $1\,\text{kg}$ from Earth's gravity well. This is given by:

$$ E_1 = \frac{GMm}{r} $$

where $M$ is the mass of the Earth and $r$ is its radius, and this works out to be about:

$$E_1 = 6.3 \times 10^7 \,\text{J} $$
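This figure is easy to check numerically; the constants below are standard values for the gravitational constant and Earth's mass and radius (they are not given in the text above):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24  # mass of the Earth, kg
r_earth = 6.371e6   # radius of the Earth, m
m = 1.0             # test mass, kg

# Energy to lift 1 kg from the surface out of Earth's gravity well
E1 = G * M_earth * m / r_earth
print(f"E1 = {E1:.2e} J")  # roughly 6.3e7 J
```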

Now consider the energy $E_2$ required to accelerate that $1\,\text{kg}$ to $0.99c$. The total energy is given by the relativistic energy-momentum relation:

$$ E^2 = p^2c^2 + m^2 c^4 $$

If we calculate this for $1\,\text{kg}$ at $0.99c$ and then subtract off the rest energy $mc^2$, we get about:

$$ E_2 = 5.4 \times 10^{17} \,\text{J} $$
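A quick sanity check of $E_2$ and of the tiny ratio quoted below (only the standard value of $c$ is assumed; everything else follows from the equations above):

```python
import math

c = 2.998e8   # speed of light, m/s
m = 1.0       # kg
v = 0.99 * c

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)  # Lorentz factor, ~7.09
E_total = gamma * m * c ** 2                  # total energy, equals sqrt(p^2 c^2 + m^2 c^4)
E2 = E_total - m * c ** 2                     # kinetic energy: subtract the rest energy
print(f"E2 = {E2:.2e} J")                     # roughly 5.4e17 J

E1 = 6.3e7  # escape energy for 1 kg, from above
print(f"E1/E2 = {E1 / E2:.1e}")               # about 1e-10, i.e. ~1e-8 percent
```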

So the energy needed to escape Earth's gravity, $E_1$, is roughly $0.000000012\%$ of the energy, $E_2$, needed to reach the final speed, which is why the difference it makes to the acceleration and deceleration energies is negligible.

A similar argument applies to air resistance. To get to $0.99c$ at a survivable acceleration, i.e. of order $g$, the vast majority of the acceleration would be done after you had left the atmosphere. So the effect of the air resistance would also be negligible.


The answers above are correct; I'd just like to contribute a more "talky", less technical (and therefore also less accurate, I must note) explanation. (High points are in bold.)

If I understand correctly, what you're unclear on is why the teacher says the energy required is the "same" to get going from Earth and to stop at A Centauri. If that's not the part you want to know, correct me.

Assuming it is what you want to know: the friction of the atmosphere and the pull of Earth's gravity are quite insignificant compared to the energy required to accelerate to 0.99c, and it seems to me your teacher was speaking in conceptual terms rather than giving a precise calculation. That is, I don't think they wanted to say the energies for acceleration and deceleration are exactly equal; rather, that they are "broadly equal". In that sense, they were essentially correct.

What your teacher probably didn't consider (possibly because they didn't want to complicate the example further) is that if the ship needs to carry fuel (as a rocket does), it becomes lighter as it continues to burn that fuel. The energy required to sustain a particular acceleration or deceleration (they are the same thing; deceleration is just acceleration in the opposite direction, put very simply) therefore decreases, because the energy required for constant acceleration is a function of the mass of the object being accelerated.

Thus, the ship would actually require less energy to decelerate than it had required to accelerate, if it is powered by a rocket engine or some other engine that consumes significant amounts of fuel. A hypothetical nuclear- or fusion-powered engine, on the other hand, would probably consume fuel much more slowly (it gets much more energy out of burning a given amount of fuel, so it needs to burn less fuel mass overall to make the trip). The mass of the ship would then change less over the trip, and the energies required to accelerate and to decelerate would be closer to equal; possibly much closer than with a rocket.
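The asymmetry from the shrinking fuel load can be sketched with the classical Tsiolkovsky rocket equation. This is a non-relativistic illustration, and every number below (exhaust velocity, delta-v, dry mass) is invented for the example; the point is only that the deceleration burn happens with a lighter ship and so needs less propellant for the same velocity change:

```python
import math

v_e = 4500.0    # exhaust velocity, m/s (illustrative, chemical-rocket scale)
dv = 9000.0     # delta-v per burn, m/s (illustrative)
m_dry = 1000.0  # dry mass of the ship, kg (illustrative)

R = math.exp(dv / v_e)  # Tsiolkovsky mass ratio m_initial/m_final per burn

# Work backwards: the deceleration burn ends at the dry mass...
prop_decel = m_dry * (R - 1.0)
# ...while the acceleration burn must also push the deceleration propellant.
prop_accel = (m_dry + prop_decel) * (R - 1.0)

print(prop_accel > prop_decel)  # True: accelerating costs more propellant
```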

Getting back to planetary gravity and atmospheric drag: In an Earth-like atmosphere, the energy lost to drag (especially over the mere ~100 km of atmosphere the ship needs to climb through from surface to space) is tiny relative to the energy needed to get to 0.99c.
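A crude upper bound makes this concrete. Every parameter here (vehicle size, drag coefficient, ascent speed, the exponential atmosphere model) is an assumption invented for the estimate, not something from the text:

```python
import math

# Assumed vehicle and atmosphere parameters (all illustrative)
C_d = 1.0          # drag coefficient
A = 10.0           # frontal area, m^2
v_ascent = 1000.0  # ascent speed through the atmosphere, m/s
rho0 = 1.225       # sea-level air density, kg/m^3
H = 8500.0         # atmospheric scale height, m
m_ship = 1000.0    # ship mass, kg
c = 2.998e8        # speed of light, m/s

# Drag energy on a vertical ascent through an exponential atmosphere:
# integral of (1/2) rho(h) v^2 C_d A dh from 0 to infinity = (1/2) rho0 H v^2 C_d A
E_drag = 0.5 * rho0 * H * v_ascent ** 2 * C_d * A

# Relativistic kinetic energy of the same ship at 0.99c
gamma = 1.0 / math.sqrt(1.0 - 0.99 ** 2)
E_kin = (gamma - 1.0) * m_ship * c ** 2

print(f"E_drag/E_kin = {E_drag / E_kin:.1e}")  # around 1e-10
```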

Planetary gravity has a somewhat larger effect, but again, negligible compared to the energy needed to accelerate to the final speed. And as with the atmosphere, the gravity becomes weaker the farther you get from the planet, attenuating to basically nothing within a few thousand kilometers (which is practically zero distance relative to a trip about 40,000,000,000,000 km long).
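The falloff of gravity with distance follows directly from the inverse-square law; the Earth values below are standard constants:

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24      # mass of the Earth, kg
r_surface = 6.371e6  # radius of the Earth, m

def g_at(altitude_m):
    """Gravitational acceleration at a given altitude above the surface."""
    r = r_surface + altitude_m
    return G * M / r ** 2

print(f"{g_at(0):.2f} m/s^2")      # ~9.82 at the surface
print(f"{g_at(1.0e7):.2f} m/s^2")  # ~1.5 at 10,000 km altitude
```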

Finally, unless your teacher stated otherwise, there is no reason to assume that the destination is not an Earth-like planet. (We know there are none like that in the A Centauri system IRL, but then, this was a theoretical example.) If it is, then both gravity and atmospheric drag at the destination will be comparable to Earth's, making the situation symmetrical in this regard.

Realistically, if the destination is a rocky planet roughly the size of Earth (such as Alpha Centauri Cb, aka Proxima Centauri b), then it will have more or less Earth's gravity. Even if it has no atmosphere (we cannot currently tell for A Cen Cb), the energy cost of the gravity well is much larger than the atmospheric-drag losses under Earth-like conditions; i.e., the situation would remain at least mostly symmetrical.

I hope that clears it up for you a bit.

Mike

EDIT: About relativistic effects: They're not really important for your question. They would of course be present, but since they are symmetrical between the acceleration and deceleration phases of the trip, and grow in magnitude only as velocity approaches c, they have no real impact on the potential asymmetry of energy requirements for "takeoff" and "landing": in both of those phases the ship will be going incredibly slowly compared to the speed of light.

As to what the "relativistic effects" actually do: Put very very simply, the energy required to achieve the same acceleration for the ship increases as the ship's speed approaches c; very close to c, the increases become huge. (They would be infinite at c, which is, very simply put, the reason no massive object can actually reach c, only approach it to an arbitrarily close fraction.) For 0.99c, the ship would accelerate as if its mass were about 7 times greater than it "actually" is.
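The growth of the Lorentz factor near c can be tabulated directly:

```python
import math

def lorentz(beta):
    """Lorentz factor for speed v = beta * c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

for beta in (0.5, 0.9, 0.99, 0.999, 0.9999):
    print(f"v = {beta}c  ->  gamma = {lorentz(beta):.1f}")
# gamma at 0.99c is about 7.1, matching the "about 7 times" above
```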

The reason: the ship's mass-energy actually would be higher, at least from the point of view of an observer relative to whom the ship is doing 0.99c. Energy is ultimately the same thing as mass, and a moving object has kinetic energy; the kinetic energy at 0.99c is so high that it "weighs" an additional ~6 times the ship's "normal" mass (i.e., its mass at rest).

So, in this case, relativistic effects increase the overall energy required to make the trip, but since they are (effectively) absent at the "takeoff" and "landing" phases that your question centers on, they only serve to further decrease the relative significance of the takeoff and landing energies as a proportion of the total energy cost of the trip.

Make no mistake though, even discounting relativistic effects (which is unphysical, i.e., a thought experiment only) the energy for leaving / entering a planetary atmosphere / gravity well is still insignificant compared to the energy needed for the rest of the trip to 0.99c and back.

This is about as good an explanation as I can give without explaining the basic concept of special relativity in full, which is well beyond the scope of this post.


A huge amount of energy is required to reach a velocity close to the speed of light, even though interstellar space is almost a void.
Let us consider the four-force in SR (special relativity):

$$ F^\mu = \frac{dP^\mu}{d\tau} = m \frac{dU^\mu}{d\tau} \tag{1} $$

where:

- $F^\mu$ is the four-force
- $P^\mu$ is the four-momentum
- $m$ is the rest (proper) mass
- $U^\mu$ is the four-velocity
- $\tau$ is the proper time

However $d/d\tau = \gamma \, d/dt$, where the Lorentz factor is $\gamma = dt/d\tau = 1/\sqrt{1 - v^2/c^2}$ and $v$ is the three-velocity, hence (1) can be written

$$ F^\mu = \gamma m \frac{dU^\mu}{dt} $$
As the velocity $v$ approaches the speed of light, the factor $\gamma$ increases towards infinity, and the force required to boost the spaceship grows with it, demanding a correspondingly large amount of energy. The same reasoning applies when it has to decelerate.
Note: The argument is simplified by assuming the rest mass is constant. Of course, boosting a spacecraft to the desired velocity consumes fuel, thereby reducing the mass; however, the point here is to highlight the fundamental reason why close-to-light-speed travel requires a huge amount of energy. This is a peculiarity of SR that you do not find in Newtonian mechanics.
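To make the contrast with Newtonian mechanics concrete, compare the relativistic kinetic energy $(\gamma - 1)mc^2$ with the Newtonian $\tfrac{1}{2}mv^2$ at $0.99c$:

```python
import math

c = 2.998e8  # speed of light, m/s
m = 1.0      # kg
beta = 0.99
v = beta * c

gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
ke_rel = (gamma - 1.0) * m * c ** 2  # relativistic kinetic energy
ke_newt = 0.5 * m * v ** 2           # Newtonian kinetic energy

print(f"ratio = {ke_rel / ke_newt:.1f}")  # relativity demands >10x more energy
```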