Is there a physical limit to data transfer rate?

tl;dr- The maximum data rate you're looking for would be called the maximum entropy flux. Realistically speaking, we don't know nearly enough about physics yet to meaningfully predict such a thing.

But since it's fun to talk about a data transfer cord that's basically a $1\,\mathrm{mm}$ tube containing a stream of black holes being fired near the speed of light, the answer below shows an estimate of $1.3{\cdot}{10}^{75}\frac{\mathrm{bit}}{\mathrm{s}}$, which is about $6.5{\cdot}{10}^{64}$ times faster than the current upper specification for USB, $20\frac{\mathrm{Gbit}}{\mathrm{s}}=2{\cdot}{10}^{10}\frac{\mathrm{bit}}{\mathrm{s}}$.


Intro

You're basically looking for an upper bound on entropy flux:

  • entropy: the number of potential states which could, in theory, codify information;

  • flux: rate at which something moves through a given area.

So,$$\left[\text{entropy flux}\right]~=~\frac{\left[\text{information}\right]}{\left[\text{area}\right]{\times}\left[\text{time}\right]}\,.$$ Note: If you search for this some more, watch out for "maximum entropy thermodynamics"; "maximum" means something else in that context.

In principle, we can't put an upper bound on stuff like entropy flux because we can't claim to know how physics really works. But, we can speculate at the limits allowed by our current models.

Speculative physical limitations

Wikipedia has a partial list of computational limits that might be estimated given our current models.

In this case, we can consider the limit on maximum data density, e.g. as discussed in this answer. Then, naively, let's assume that we basically have a pipeline shipping data at maximum density arbitrarily close to the speed of light.

The maximum data density is limited by the Bekenstein bound:

In physics, the Bekenstein bound is an upper limit on the entropy $S$, or information $I$, that can be contained within a given finite region of space which has a finite amount of energy—or conversely, the maximum amount of information required to perfectly describe a given physical system down to the quantum level.

–"Bekenstein bound", Wikipedia [references omitted]

Wikipedia lists it as allowing up to$$ I ~ \leq ~ {\frac {2\pi cRm}{\hbar \ln 2}} ~ \approx ~ 2.5769082\times {10}^{43}\,mR \,,$$where $R$ is the radius of the system containing the information, $m$ is its mass, and the numerical prefactor gives $I$ in bits when $m$ is in kilograms and $R$ is in meters.
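As a quick sanity check, that numerical prefactor, $\frac{2\pi c}{\hbar \ln 2}$, can be evaluated directly from the physical constants (a minimal sketch using CODATA values):

```python
import math

c = 2.99792458e8        # speed of light, m/s (exact by definition)
hbar = 1.054571817e-34  # reduced Planck constant, J*s

# Bekenstein bound: I <= (2*pi*c / (hbar * ln 2)) * m * R, in bits
# for m in kilograms and R in meters.
prefactor = 2 * math.pi * c / (hbar * math.log(2))
print(f"{prefactor:.7e} bit/(kg*m)")  # ~2.5769e43, matching the quoted value
```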

Then for a black hole, apparently this reduces to$$ I ~ \leq ~ \frac{A_{\text{horizon}}}{4\ln{\left(2\right)}\,{{\ell}_{\text{Planck}}^2}} \,,$$where

  • ${\ell}_{\text{Planck}}$ is the Planck length;

  • $A_{\text{horizon}}$ is the area of the black hole's event horizon.

This is inconvenient, because we wanted to calculate $\left[\text{entropy flux}\right]$ in terms of how fast information could be passed through something like a wire or pipe, i.e. in terms of $\frac{\left[\text{information}\right]}{\left[\text{area}\right]{\times}\left[\text{time}\right]}.$ But the units here don't cooperate, because this line of reasoning leads to the holographic principle, which basically asserts that the maximum information content of a region of space scales not with its volume but with the area of its boundary.

So, instead of having a continuous stream of information, let's go with a stream of discrete black holes inside of a data pipe of radius $r_{\text{pipe}}$. The black holes' event horizons have the same radius as the pipe, and they travel at $v_{\text{pipe}} \, {\approx} \, c$ back-to-back.

So, information flux might be bound by$$ \frac{\mathrm{d}I}{\mathrm{d}t} ~ \leq ~ \frac{A_{\text{horizon}}}{4\ln{\left(2\right)}\,{{\ell}_{\text{Planck}}^2}} {\times} \frac{v_{\text{pipe}}}{2r_{\text{horizon}}} ~{\approx}~ \frac{\pi \, c }{2\ln{\left(2\right)}\,{\ell}_{\text{Planck}}^2} r_{\text{pipe}} \,,$$where the observation that $ \frac{\mathrm{d}I}{\mathrm{d}t}~{\propto}~r_{\text{pipe}} $ is basically what the holographic principle refers to.

Relatively thick wires are about $1\,\mathrm{mm}$ in diameter, so let's go with $r_{\text{pipe}}=5{\cdot}{10}^{-4}\mathrm{m}$ to mirror that to estimate (WolframAlpha):$$ \frac{\mathrm{d}I}{\mathrm{d}t} ~ \lesssim ~ 1.3{\cdot}{10}^{75}\frac{\mathrm{bit}}{\mathrm{s}} \,.$$
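That estimate can be reproduced numerically. A minimal sketch, where the only inputs are the speed of light, the Planck length, and the $0.5\,\mathrm{mm}$ pipe radius assumed above:

```python
import math

c = 2.99792458e8         # speed of light, m/s
l_planck = 1.616255e-35  # Planck length, m
r_pipe = 5e-4            # pipe radius, m (half of a ~1 mm wire diameter)

# dI/dt <= pi * c * r_pipe / (2 * ln(2) * l_planck**2), in bit/s
rate = math.pi * c * r_pipe / (2 * math.log(2) * l_planck**2)
print(f"{rate:.2e} bit/s")  # roughly 1.3e75 bit/s

usb_rate = 20e9  # current maximum USB spec, bit/s
print(f"{rate / usb_rate:.1e} times the USB rate")  # roughly 6.5e64
```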

Wikipedia claims that the maximum USB bitrate is currently $20\frac{\mathrm{Gbit}}{\mathrm{s}}=2{\cdot}{10}^{10}\frac{\mathrm{bit}}{\mathrm{s}}$, so this'd be about $6.5{\cdot}{10}^{64}$ times faster than USB's maximum rate.

However, to be very clear, the above was a quick back-of-the-envelope calculation based on the Bekenstein bound and a hypothetical tube that fires black holes near the speed of light back-to-back; it's not a fundamental limitation to regard too seriously yet.


The Shannon–Hartley theorem tells you the maximum data rate of a communications channel, given its bandwidth and signal-to-noise ratio.

$$ C = B \log_2\left(1+\frac{S}{N}\right) $$

where $C$ is the data rate in bits per second, $B$ is the bandwidth in hertz, $S$ is the signal power, and $N$ is the noise power.

Pure thermal noise power in a bandwidth $B$ at temperature $T$ is given by:

$$ N = k_BTB $$

where $k_B$ is the Boltzmann constant.

So, for example, if we take the bandwidth of a WiFi channel (40 MHz) at room temperature (298 K) with 1 W of signal power, the theoretical maximum data rate for a single channel is:

$$ 40 \times 10^6 \times \log_2\left(1 + \frac{1}{1.38\times 10^{-23} \times 298 \times 40 \times 10^6}\right) \approx 1.7 \times 10^9\;\mathrm{bit\,s^{-1}} = 1.7\;\mathrm{Gbit\,s^{-1}} $$
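The figure follows directly from the two formulas above; a quick sketch (the 40 MHz, 298 K, and 1 W values are the example's assumptions):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_capacity(bandwidth_hz, signal_w, temp_k):
    """Shannon-Hartley capacity with pure thermal noise N = k_B * T * B."""
    noise_w = k_B * temp_k * bandwidth_hz
    return bandwidth_hz * math.log2(1 + signal_w / noise_w)

c_bits = shannon_capacity(40e6, 1.0, 298)
print(f"{c_bits:.2e} bit/s")  # roughly 1.7e9 bit/s
```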

In a practical system, the bandwidth is limited by the cable or antenna and the speed of the electronics at each end. Cables tend to filter out high frequencies, which limits the bandwidth. Antennas will normally only work efficiently across a narrow bandwidth. There will be significantly larger sources of noise from the electronics, and interference from other electronic devices which increases $N$. Signal power is limited by the desire to save power and to prevent causing interference to other devices, and is also affected by the loss from the transmitter to the receiver.

A system like USB uses simple on-off electronic signalling at a single frequency, because that's easy to detect and process. This doesn't fill the bandwidth of the cable, so USB operates a long way from the Shannon–Hartley limit (the limiting factors have more to do with the transceivers, i.e. the semiconductors). On the other hand, 4G (and soon 5G) mobile phone technology does fill its bandwidth efficiently, because everyone has to share the airwaves and operators want to pack in as many users as possible; those systems are rapidly approaching the limit.


No, there is no fundamental limit on overall transfer rate: any process that transfers data at a given rate can be run twice in parallel to transfer data at twice that rate.