What is the scientific explanation for radio waves bending around the Earth?

EDIT: In the interest of avoiding spreading misleading information, I have removed the portions of this answer that have been disputed or refuted in the comments and edits on this question. Specifically, the parts about the ACK/Distance shown on the screen at 42:47 and the calculation of the curvature have been removed. The rest of this answer, however, still stands.

TL;DR: They erroneously assumed that radio antennae behave like lasers. The antennae should still be able to connect even on a curved Earth.

The video pretends that the signal leaving the radio antennae is like a laser beam, focused along the line from transmitter to receiver without diverging. In reality, this isn't even close to true, even for directional radio antennae. Both the transmitted signal and the receiver acceptance get wider farther from the respective antennae, purely due to the diffractive properties of waves. This means that the signal actually propagates in a large ellipsoidal region between the antennae called the Fresnel zone**. The rule of thumb used in engineering systems is that as long as at least 60 percent of the Fresnel zone is unobstructed, signal reception should be possible.

The maximum radius $F$ of the Fresnel zone is given in the same Wikipedia article by

$$F=\frac{1}{2}\sqrt{\frac{cD}{f}}\,,$$

where $c=3\times {10}^8 \, \frac{\mathrm{m}}{\mathrm{s}}$ is the speed of light, $D$ is the propagation distance, and $f$ is the frequency. Using $D=14 \, \mathrm{km}$ and $f=5.880 \, \mathrm{GHz},$ we get $F\approx 13.4 \, \mathrm{m}.$ As you can see, the beam expands massively over such a distance. If you cut the lower $3.84 \, \mathrm{m}$ off that circular cross section, you can find the fraction of the beam that is obstructed for obstruction height $h$ from the formula for the area of the cut-off circular segment:

$$\frac{A_{\text{obstructed}}}{A_{\text{whole beam}}}=\frac{F^2\cos^{-1}\left(\frac{F-h}{F}\right)-(F-h)\sqrt{2Fh-h^2}}{\pi F^2}\,.$$

Evaluating this expression for $F=13.4 \, \mathrm{m}$ and $h=3.84 \, \mathrm{m}$ gives an obstruction fraction of $\frac{A_{\text{obstructed}}}{A_{\text{whole beam}}}\approx 0.088.$
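For anyone who wants to check these numbers, here is a minimal Python sketch of the same calculation; the distance, frequency, and obstruction height are the values assumed above.

```python
import math

# Values assumed above (taken from the video / discussion)
c = 3e8          # speed of light, m/s
D = 14e3         # propagation distance, m
f = 5.88e9       # frequency, Hz
h = 3.84         # obstruction height at the midpoint, m

# Maximum (midpoint) radius of the first Fresnel zone
F = 0.5 * math.sqrt(c * D / f)            # ~13.4 m

# Fraction of the zone's circular cross section blocked by cutting off
# the lower h metres (circular-segment area divided by circle area)
segment = F**2 * math.acos((F - h) / F) - (F - h) * math.sqrt(2 * F * h - h**2)
fraction = segment / (math.pi * F**2)     # ~0.088

print(f"F = {F:.1f} m, obstructed fraction = {fraction:.3f}")
```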

So, even on a curved Earth, only about 9 percent of the beam would be obstructed. This is well within the rule of thumb (which requires less than 40 percent obstruction), so the antennae should still be able to connect on a curved Earth.

**In reality, the propagation of radio waves between two antennae is complicated, and I'm necessarily skipping over a lot of details, or else this post would become a textbook. What I refer to as the "Fresnel zone" is technically the first Fresnel zone, but the distinction does not matter here.


Since the existing answer has a few errors (see my comments on that otherwise excellent answer), I wanted to offer another take. There are three key phenomena at play here: refraction, line of sight, and diffraction. I will tackle each in turn.

Refraction

Since the atmosphere decreases in density as you go up in altitude, it acts to refract radio waves. This has the same root cause as light bending when it passes between media with different refractive indices (e.g. a prism, or the bending of light as you look down into a swimming pool). This means that the optical or radio horizon is actually farther away than the geometric horizon. Assuming standard atmospheric conditions, this can be accounted for by calculating the horizon distance as if the earth's radius were bigger by a factor of 4/3 (Wikipedia link). The distance to the horizon can be computed as $$d_{\rm horizon} = \sqrt{2kRh+h^2}$$ where $R$ is the earth's radius (about 6371 km), $k$ is the multiplicative factor (4/3 for radio waves), and $h$ is the height above ground. Based on the video, let's assume the first antenna is about 1.5 m above the ground; that gives a horizon distance of 5.048 km.
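A quick sketch of that horizon calculation (the 1.5 m antenna height is the assumption from the video):

```python
import math

R = 6371e3    # mean earth radius, m
k = 4 / 3     # effective-radius factor for standard refraction
h1 = 1.5      # assumed height of the first antenna, m

# Radio horizon using the effective-earth-radius model
d_horizon = math.sqrt(2 * k * R * h1 + h1**2)
print(f"radio horizon: {d_horizon / 1e3:.3f} km")   # ~5.048 km
```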

It is also possible that some ducting effects are at play. That would actually lead to a very good transmission and fully explain the successful link by itself.

Line of sight

So we know how far the horizon is based on the correct refraction of radio waves, but how far away can you see the other antenna? Some simple trigonometry reveals that to see another antenna a distance $d_{\rm total}$ away, that antenna must have height $$ h_2 = \sqrt{(kR)^2+(d_{\rm total}-d_{\rm horizon})^2} - kR $$ Plugging in the numbers from the video ($d_{\rm total} = 14.24~{\rm km}$), we find that the second antenna must be at least 4.97 m above the water to have line of sight for radio waves. It's hard to tell from the video, but this might be a bit too high, so we need something other than plain line of sight.
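Here is the same calculation as a small sketch; the near-antenna height and the antenna separation are the assumed values above:

```python
import math

R = 6371e3          # mean earth radius, m
k = 4 / 3           # effective-radius factor for standard refraction
h1 = 1.5            # assumed height of the near antenna, m
d_total = 14.24e3   # antenna separation from the video, m

# Radio horizon of the near antenna, then the minimum height the far
# antenna needs to rise above that horizon for line of sight
d_horizon = math.sqrt(2 * k * R * h1 + h1**2)
h2 = math.sqrt((k * R)**2 + (d_total - d_horizon)**2) - k * R
print(f"required far-antenna height: {h2:.2f} m")    # ~4.97 m
```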

Diffraction

Electromagnetic waves diffract as they pass near objects. This phenomenon causes the waves to essentially bend around corners by some amount, so it is not actually necessary to have line of sight to the EM source in order to receive a signal from that source. The effect of diffraction in this case is probably best captured by estimating the relative strength of the diffracted signal compared to the strength it would have if there were line of sight. Assuming it is actually out of line of sight, we can use nomograms in this publication [1] to estimate the attenuation. The attenuation is very dependent on frequency and on the heights of the transmitter and receiver, but using a 5.8 GHz frequency (as stated in the comments on the original post), $k=4/3$ as above, and assuming antenna heights of around 2-4 m above the water gives somewhere around 25-30 dB of attenuation. While this is a large attenuation factor, it is certainly believable that the transmission could still be received. It is the equivalent of moving the antennas from about 14.24 km apart (if they had line of sight) to 250+ km apart. After all, communication satellites work, and those transmitters are typically in geostationary orbit at an altitude of about 36,000 km.
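As a rough sanity check on that comparison, here is a small sketch that converts the estimated extra loss into an equivalent free-space distance, assuming received power falls off as 20 dB per decade of distance (free-space spreading); this is only an equivalence, not a propagation model.

```python
# Extra diffraction loss expressed as an equivalent line-of-sight distance
d_los = 14.24                       # actual separation, km
for loss_db in (25, 30):            # estimated diffraction loss range, dB
    d_equiv = d_los * 10 ** (loss_db / 20)
    print(f"{loss_db} dB ~ same received power as {d_equiv:.0f} km with line of sight")
# -> roughly 250 km and 450 km
```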

Conclusions

This is a complicated radio propagation problem which is impossible to model fully without knowing the transmit power levels, transmit and receive antenna gain patterns, receiver noise characteristics, waveform properties, and signal processing details. It appears that the receive dish is likely (just) out of line of sight of the transmit dish, even when refraction is considered. However, the combination of refraction and diffraction means that some amount of the transmitted signal will reach the receiver. The attenuation due to the lack of line of sight is large, but it is still possible that the receiver gets enough power to detect the transmission anyway.


[1] "Propagation by Diffraction," International Telecommunication Union, Recommendation ITU-R PN.526–7, 2001.