How is the internet able to transmit data so fast?

Perhaps this would be better as a comment, but it is too long for one. I want to address the fact that your statements suggest you perceive the data as being transmitted almost instantaneously, yet none of the observations you present actually prove that it is transmitted quickly at all.

You mention videogames. Game developers are WELL aware that there is significant latency between players, so they pull off several tricks. One of them is having the client guess about information it hasn't yet received from the server or the other players. For example, your client knows your opponent's position and velocity from, say, 50 ms ago. So it extrapolates: "if his motion continues like this, he's probably about here now", and that predicted position is what you see. Most of the time it's pretty accurate (largely because of the effort programmers put into it), and it genuinely feels like there's no latency. Other times it's inaccurate, and it looks to you as though the player was shot when he was really at a different position than the one your console predicted.
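Here is a minimal sketch of that kind of extrapolation (sometimes called dead reckoning); the numbers and the function name are made up for illustration, not taken from any real engine:

```python
# Minimal sketch of client-side extrapolation ("dead reckoning"). All names
# and numbers here are illustrative, not from any real game engine.

def extrapolate_position(last_pos, last_vel, latency_s):
    """Guess where an opponent is now, given their position/velocity
    from `latency_s` seconds ago (e.g. 0.050 for 50 ms of lag)."""
    x, y = last_pos
    vx, vy = last_vel
    return (x + vx * latency_s, y + vy * latency_s)

# Server told us 50 ms ago: opponent was at (10.0, 4.0) moving at 6 m/s along x.
predicted = extrapolate_position((10.0, 4.0), (6.0, 0.0), 0.050)
print(predicted)  # (10.3, 4.0) -- the position the client draws right now
```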

On a related note, most displays run at around 60Hz, and some do something called double-buffering. I won't go into the details here, but this can introduce up to 33ms of latency between when the processor renders a frame and when it's actually displayed. Most people don't notice this, so I think it's reasonable to suggest that even if the network latency were 33ms, you might perceive it as instant, even without any programming tricks.
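For concreteness, the 33 ms figure is just two refresh intervals at 60 Hz:

```python
# Back-of-the-envelope numbers for the display latency mentioned above.
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz                 # ~16.7 ms per refresh at 60 Hz
double_buffer_worst_case_ms = 2 * frame_time_ms   # ~33 ms between render and display
print(round(frame_time_ms, 1), round(double_buffer_worst_case_ms, 1))  # 16.7 33.3
```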

In summary, the internet is not necessarily "fast". But smart people do smart things to make it appear as if it's faster than it really is. If you want more information, you may consider asking the people on gamedev.stackexchange.com.


This isn't something that can be answered in a single post, by a single person. However, I hope this answer provides enough information and links to be helpful.

It is important to understand how signals are transmitted over the Internet. Note, however, that because of noise and the immense number of users, the same signal needs to be encoded, decoded, retransmitted, etc., so the time needed for processing is many orders of magnitude more than the time the electrical signal itself needs to travel. Also keep in mind that a millisecond is a very large amount of time for a computer; an NVIDIA Quadro K6000 graphics card can perform 5,000,000,000+ floating point operations in that much time (5196 GFLOPS times 1 ms).
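To put that millisecond in perspective (the GPU figure is the peak rate quoted above; the light-travel comparison is my own addition):

```python
# How big a millisecond is, using the figures above.
peak_flops = 5196e9              # 5196 GFLOPS, the peak rate quoted above
ms = 1e-3                        # one millisecond, in seconds
ops_per_ms = peak_flops * ms     # ~5.2e9 floating point operations per millisecond

c = 299_792_458                  # speed of light, m/s
light_km_per_ms = c * ms / 1000  # ~300 km travelled by light in that same millisecond

print(f"{ops_per_ms:.2e} ops, {light_km_per_ms:.0f} km")  # 5.20e+09 ops, 300 km
```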

Conductive cables:

The electrons themselves don't move that fast, because they bounce around inside the conducting cable. However, electricity doesn't travel by electrons bumping into one another; rather, each one repels the next through the electromagnetic interaction:

Say you have 3 electrons in a line (assume one-dimensional space). Move the first one a little bit. The distance from the first to the second gets a little smaller, so the electrostatic force between them gets a little larger. According to Coulomb's Law it is: $$\|F\|=k_e\frac{q_1 q_2}{r^2}$$ where: \$\|F\|\$ is the magnitude of the force, \$k_e\$ is Coulomb's constant, \$q_1\$ and \$q_2\$ are the charges of the two particles, and \$r\$ is the distance between them.

As the first particle moves towards the second, the electrostatic force increases almost instantly. This causes the second particle to move a little bit towards the third etc.

"Almost instantly" actually means "at the speed of light" (\$c=299,792,458m/s\$).

There is an extreme number of electrons inside a conducting wire and the physics is a bit more complicated, but the gist of it is that a signal gets across a conductor "almost instantly", though slower than \$c\$.
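As a quick sanity check of the formula, here is a tiny sketch that plugs in the charge of two electrons and shows how the force grows as the distance shrinks (the constants are the standard CODATA values):

```python
# Plugging numbers into Coulomb's law: the force between two electrons,
# and how quickly it grows as they get closer.
K_E = 8.9875517923e9    # Coulomb's constant, N*m^2/C^2
Q_E = -1.602176634e-19  # electron charge, C

def coulomb_force(r):
    """Magnitude of the electrostatic force between two electrons r metres apart."""
    return K_E * abs(Q_E * Q_E) / r**2

for r in (1e-9, 0.5e-9, 0.25e-9):    # halving the distance each time
    print(f"r = {r:.2e} m  ->  F = {coulomb_force(r):.3e} N")
# Halving r quadruples the force, which is what pushes the next electron along.
```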

Optical Fibre:

Optical fibre cables transmit signals using photons instead of electrons. Even in this case, the photons don't travel in a straight line (they bounce along the fibre by total internal reflection). Still, the time a photon needs to travel down the line is very small compared to the processing time for encoding and decoding the signals, as well as packet retransmissions.

Wireless:

Finally, communication satellites as well as numerous types of wireless links are used to transmit signals, well, wirelessly, using a great number of transmission protocols, modulations and frequencies. In this case, signals are transmitted using electromagnetic radiation. This is a very complex subject and I can't possibly cover it all.

Smart ways to encode information into electrical signals:

It is not enough for a voltage pulse to reach the other end of a wire; that voltage is there to convey some information. The act of encoding information by modifying a carrier signal according to the information to be transmitted (carried, hence the name "carrier") is called modulation.
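As a toy illustration (not how any real link works), here is the simplest possible modulation, on-off keying, where the carrier is switched on for a '1' bit and off for a '0' bit; all the frequencies and durations below are assumed values:

```python
import math

# Toy on-off keying (OOK): the carrier is present for a '1' bit and absent
# for a '0' bit. Purely illustrative values.
CARRIER_HZ = 1000.0     # carrier frequency (assumed)
SYMBOL_S = 0.001        # one bit lasts 1 ms (assumed)
SAMPLE_RATE = 16000     # samples per second (assumed)

def modulate(bits):
    samples = []
    for i, bit in enumerate(bits):
        for n in range(int(SYMBOL_S * SAMPLE_RATE)):
            t = i * SYMBOL_S + n / SAMPLE_RATE
            carrier = math.sin(2 * math.pi * CARRIER_HZ * t)
            samples.append(carrier if bit else 0.0)
    return samples

waveform = modulate([1, 0, 1, 1])   # 4 bits -> 64 samples of carrier on/off
print(len(waveform))
```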

Smart ways to share the same channels:

All these communication channels need to be connected, and information needs to travel across this vast network reliably. Initially, for two nodes to communicate with each other, they would reserve a number of cables forming a path from node A to node B; no other node would be able to use that path. This is called circuit switching. The breakthrough that made a network as vast as the internet possible was the ability for numerous nodes to share one particular communication channel, enabled by packet switching. Instead of reserving a circuit just for two nodes, every node simply checks whether the channel is free, transmits a packet containing data and destination info (and some other stuff), and then releases the channel. Packets then need to find their way to their destination; this is called packet routing, which is another huge subject. Routing and the need for modulation are the main reasons a packet takes "so long" to reach its destination compared to how fast electromagnetic waves travel. Routing is also what allows all those users to coexist on the same network.
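A very rough sketch of that idea: each packet carries its own destination, and each router just looks up where to forward it. The addresses and the table are invented for illustration, and the prefix matching is deliberately simplified:

```python
from dataclasses import dataclass

# Each packet carries its own destination; a router forwards it with a table
# lookup instead of holding a reserved circuit open. Everything here is invented.

@dataclass
class Packet:
    src: str        # source address
    dst: str        # destination address
    payload: bytes  # the actual data being carried

# This router's forwarding table: address prefix -> outgoing interface.
FORWARDING_TABLE = {
    "10.":        "eth0",   # anything destined for 10.x.x.x
    "192.168.1.": "eth1",   # the local subnet
}

def next_hop(packet: Packet) -> str:
    # Real routers do longest-prefix matching on binary prefixes;
    # this string-prefix version only shows the idea.
    best = max((p for p in FORWARDING_TABLE if packet.dst.startswith(p)),
               key=len, default=None)
    return FORWARDING_TABLE[best] if best is not None else "default-gateway"

p = Packet(src="192.168.1.5", dst="10.0.3.7", payload=b"hello")
print(next_hop(p))   # eth0 -- this router's choice for packets toward 10.x.x.x
```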

The Internet:

All these things, along with numerous other technologies, are used together to form The Internet.

Lag Compensation:

In many applications, including competitive video games, a few milliseconds of delay would be unacceptable, especially when a server needs to register a "hit". That's where lag compensation comes into play. One common method involves the server keeping a short history of every entity's position and animation state. When a player "fires" their weapon, the server then performs a number of tests and physics simulations to see whether a "hit" would have occurred, based on the player's lag and the velocity and animation state of each entity, plus the world geometry.
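A minimal sketch of that "keep a history and rewind" idea; the data layout and the one-second window are assumptions for illustration, not taken from any particular engine:

```python
import time

# Server-side lag compensation sketch: record where each entity was, then
# rewind to the moment the shooter actually fired.
HISTORY_SECONDS = 1.0
position_history = {}   # entity_id -> list of (timestamp, (x, y)) samples

def record_position(entity_id, pos):
    now = time.time()
    samples = position_history.setdefault(entity_id, [])
    samples.append((now, pos))
    # Drop anything older than the history window.
    position_history[entity_id] = [(t, p) for t, p in samples
                                   if now - t <= HISTORY_SECONDS]

def position_at(entity_id, when):
    """Rewind: return the recorded position closest to time `when`,
    i.e. where the target actually was when the shooter fired."""
    samples = position_history.get(entity_id, [])
    if not samples:
        return None
    return min(samples, key=lambda tp: abs(tp[0] - when))[1]

# Shooter fired 80 ms ago (their measured lag); test the hit against where
# the target was back then, not where it is now:
#   shot_time = time.time() - 0.080
#   target_pos_then = position_at(target_id, shot_time)
```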


I'm surprised people only mention modulation (the process that modifies a higher-energy "carrier", which can be radiated easily over long distances, as a function of the signal) as an additional factor rather than as a key element in increasing the data rate of communication links. Remember those 56 kbps modems at the beginning of the internet? When ADSL modems kicked in, there was an immediate twentyfold+ increase in data rate, mostly due to the modulation used: with QAM, the bit rate became higher than the symbol (clock) rate (by using various amplitudes and phases), like fitting more things in the same bucket travelling at the same speed, just by figuring out a cleverer way to arrange them. And we're not done yet; many other schemes already exist or are being investigated...

[Figures: a) Constellation ("states") map of QAM, one of the RF modulations. b) How the symbol rate can differ from the data rate, illustrated with multilevel amplitude modulation, one of the two components of QAM (the other being phase, which is closely related to FM).]
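To make the "more bits per bucket" point concrete, here is a rough 16-QAM-style mapping: four bits go into every transmitted symbol, so the bit rate is four times the symbol rate. The square constellation below is purely illustrative; real standards use carefully chosen (Gray-coded) mappings:

```python
# A 16-QAM-style mapping: every symbol carries 4 bits, so the bit rate is
# 4x the symbol (clock) rate. The 4x4 grid below is purely illustrative.

LEVELS = [-3, -1, 1, 3]   # amplitude levels used on each axis (I and Q)

def bits_to_symbol(b3, b2, b1, b0):
    """Map 4 bits to one constellation point (an amplitude/phase combination)."""
    i = LEVELS[(b3 << 1) | b2]   # in-phase component from two of the bits
    q = LEVELS[(b1 << 1) | b0]   # quadrature component from the other two
    return complex(i, q)

bits = [1, 0, 1, 1,  0, 0, 1, 0]                                    # 8 bits...
symbols = [bits_to_symbol(*bits[k:k + 4]) for k in range(0, len(bits), 4)]
print(symbols)   # ...become just 2 transmitted symbols: [(1+3j), (-3+1j)]
```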

For your problem in particular, aside from trying to predict what the players are doing, games are typically rendered on your own machine; only a small amount of vital information is transmitted, such as coordinates, velocities, etc. A single web page probably carries more information than that. The real problem is latency (especially for hardcore gamers), and it accumulates with the number of hops your packets pass through, from propagation delays among many other things (distance is also a problem, since the maximum reliable data rate drops with losses, which increase with distance).
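A tiny illustration of how that accumulation works; the per-hop delays below are invented numbers:

```python
# Latency adds up hop by hop. The per-hop delays are invented; real values
# depend on distance, congestion and the equipment along the way.
hops_ms = [0.5, 2.0, 8.0, 25.0, 1.5, 3.0]    # propagation + processing per hop
one_way_ms = sum(hops_ms)
round_trip_ms = 2 * one_way_ms               # a "ping" measures the round trip
print(one_way_ms, round_trip_ms)             # 40.0 80.0 -- already noticeable in a game
```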