Why do most FPV 5.8 GHz video transmitters use PAL or NTSC?

As others have said, PAL and NTSC are analog and have virtually no latency. This is extremely important when flying FPV drones at high speed. A laggy video feed shows you where you've been, not where you actually are.

I have a large video/photography drone (Yuneec Q500 4K). It transmits its video digitally on 5.8 GHz using the 802.11a protocol. (Just like your home Wi-Fi network.) I have measured the latency of the video feed at 282 milliseconds. This is because the video has to be digitized and compressed at the camera, transmitted, and then decoded back into a picture at the receiver. That may not sound like a lot, but it is when you are traveling at high speed and trying to avoid obstacles.

Here is a quick chart showing how 282 ms of latency affects where you think you are, compared to where you actually are.

Speed    Error       Speed     Error
[MPH]    [Feet]      [km/h]    [Meters]
----------------------------------------------
 10       4.14        16        1.25
 20       8.27        32        2.51
 30      12.41        48        3.76
 40      16.54        64        5.01
 50      20.68        80        6.27

Considering that some FPV drones can travel at 60+ MPH, you can see how much of an impact that video latency would have.
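
If you want to double-check those numbers, the math is just speed multiplied by latency. Here is a quick Python sketch (my own, using nothing from the answer except the 282 ms figure):

# Position error accumulated during the video latency: error = speed * latency.
LATENCY_S = 0.282           # measured latency from above, in seconds

MPH_TO_FTPS = 5280 / 3600   # miles per hour -> feet per second
KMH_TO_MPS = 1000 / 3600    # kilometers per hour -> meters per second

for mph, kmh in [(10, 16), (20, 32), (30, 48), (40, 64), (50, 80)]:
    err_ft = mph * MPH_TO_FTPS * LATENCY_S
    err_m = kmh * KMH_TO_MPS * LATENCY_S
    print(f"{mph} MPH -> {err_ft:5.2f} ft    {kmh} km/h -> {err_m:4.2f} m")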

As far as FPS goes, 30 frames per second for NTSC and 25 for PAL are the standard rates, and they are more than enough for a smooth picture. Higher resolution would also require larger/heavier hardware. When FPV racers are trying to shave fractions of a gram off of their drones, they will sacrifice resolution for a reduction in weight.

I hope my explanation helps!


I upvoted Nettle Creek's answer, but here is a quick explanation of why latency matters:

Piloting a drone is a feedback system with you in the loop. As in any other feedback system, the usual Nyquist stability criteria apply. Video latency adds lag to the feedback loop, and the effect is the same as in any other feedback system (you are the op-amp): it makes the system less stable. This can be compensated for with various techniques, the easiest of which is to lower the bandwidth, which makes the system slower.
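
To make that concrete, here is a minimal simulation sketch (my own illustration, with made-up gains): a proportional "pilot" steering toward a target, once with no feedback delay and once seeing a feed that is a few samples stale. The delayed loop oscillates until you lower the gain, i.e., the bandwidth:

import collections

def simulate(gain, delay_steps, steps=60):
    # Integrator plant: the pilot nudges position toward the target,
    # but the position the pilot *sees* is delay_steps samples old.
    pos, target = 0.0, 1.0
    seen = collections.deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
    for _ in range(steps):
        seen.append(pos)           # newest measurement enters the pipeline
        error = target - seen[0]   # pilot reacts to the oldest (stale) frame
        pos += gain * error
    return pos

print(simulate(gain=0.5, delay_steps=0))   # settles near 1.0
print(simulate(gain=0.5, delay_steps=4))   # same gain, 4 frames of lag: blows up
print(simulate(gain=0.1, delay_steps=4))   # lower the bandwidth: stable, but slower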

Just imagine a car having a one-second lag between turning the steering wheel and the wheels actually turning. It is more or less like putting a blind person behind the wheel with you in the passenger seat screaming directions. I bet you would take a lot longer to reach your destination...

For example, a few years ago LCD display lag was an issue. Some LCD screens would buffer a frame or two to apply digital processing to the image, which resulted in a lag of one to two frames. This essentially made first-person shooter games unplayable on them. One frame of delay might not sound like much, but it is enough to severely upset the human control system (i.e., the brain), which is not at all used to a lag between moving your finger and seeing the result on screen.

Digital video is compressed, so the compressor must accumulate several frames in RAM and process them. You get very high quality, but high latency.
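
As a rough back-of-the-envelope (my own illustrative numbers, not measurements), every frame the compressor holds in its buffer costs at least one full frame time:

FPS = 30.0                    # NTSC-ish frame rate
frame_time_ms = 1000.0 / FPS  # ~33 ms per frame
for buffered in (1, 2, 4, 8): # frames the compressor might accumulate
    print(f"{buffered} buffered frame(s) -> at least {buffered * frame_time_ms:.0f} ms of latency")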

Analog video has basically zero latency: just a few milliseconds for the display LCD to react, and much less for a CRT. It does not even have "one frame" (1/60 s) of latency, for a subtle reason: say your eye is looking at the very center of the screen. The CCD camera reads the image scanline by scanline, and each scanline is transmitted and displayed immediately. So the image displayed at the spot you are looking at (say, the center) appears only a few milliseconds after the camera acquired it. Latency is very low.

Of course, it will only be updated at the next frame. But each update is very much real time. If your drone is speeding towards a tree trunk, the tree will look smaller at the top of the screen, and larger at the bottom, because during the frame scan the drone got closer.
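
A quick calculation makes the scanline point concrete (standard NTSC timing; the sketch is mine):

FRAME_RATE = 29.97  # NTSC frames per second
LINES = 525         # scanlines per frame, including blanking
line_time_us = 1e6 / (FRAME_RATE * LINES)  # ~63.6 microseconds per line
frame_time_ms = 1000 / FRAME_RATE          # ~33.4 ms per full frame
print(f"one scanline: {line_time_us:.1f} us")
print(f"one full frame: {frame_time_ms:.1f} ms")
# A pixel displayed as soon as its line arrives is tens of microseconds old,
# not a whole frame old -- that is why analog feels instantaneous.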

From my experience playing FPS games, this allows extremely good control. Add just one more frame (~16 ms) and everything breaks down. No more headshots! Aiming becomes impossible. It feels like playing an FPS on a console.

Also, for piloting a drone (or aiming in an FPS game), image quality doesn't matter much; crummy analog PAL will do just fine. Back in the day, on an old underpowered PC, if excellent image-quality settings yielded 30 fps, everyone would set the textures to "Atari 2600" quality to get more frames per second. 30 fps is unplayable except against a predictable computer opponent.


They chose these standards because they are universal, so you can mix and match accessories from different manufacturers. They also chose them because analog is simply faster and needs less computing power to turn the signal back into an image. And a third reason is that a digital signal is less reliable on fast-moving objects, which is exactly where you need low latency.