Transatlantic ping faster than sending a pixel to the screen?

The time to send a packet to a remote host is half the time reported by ping, which measures a round trip time.

The display I was measuring was a Sony HMZ-T1 head-mounted display connected to a PC.

To measure display latency, I have a small program that sits in a spin loop polling a game controller, doing a clear to a different color and swapping buffers whenever a button is pressed. I record video showing both the game controller and the screen with a 240 fps camera, then count the number of frames between the button being pressed and the screen starting to show a change.
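For illustration, here is a minimal sketch of such a test loop. It is written against SDL2, which is an assumption; the original program's windowing and input API is not specified. It spin-polls a game controller and toggles the clear color on each button press, presenting immediately with vsync disabled.

```cpp
// Minimal latency-test sketch (SDL2 assumed, not the author's actual code).
// Spin loop: poll a game controller, toggle the clear color on each button
// press, and present immediately so the change hits the screen as soon as
// the driver and display allow.
#include <SDL.h>

int main(int argc, char** argv) {
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_GAMECONTROLLER);
    SDL_Window* win = SDL_CreateWindow("latency test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    // No SDL_RENDERER_PRESENTVSYNC flag: present without waiting for vsync.
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
    SDL_GameController* pad = SDL_GameControllerOpen(0);  // assumes a pad is plugged in

    bool lit = false;      // current clear color (black or white)
    bool wasDown = false;  // previous button state, for edge detection
    for (bool running = true; running; ) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))        // pumps controller state as a side effect
            if (ev.type == SDL_QUIT) running = false;

        bool down = SDL_GameControllerGetButton(pad, SDL_CONTROLLER_BUTTON_A) != 0;
        if (down && !wasDown)
            lit = !lit;                   // flip color the instant the press is seen
        wasDown = down;

        if (lit) SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
        else     SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_RenderPresent(ren);           // swap buffers
    }
    return 0;
}
```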

The game controller updates at 250 Hz, but there is no direct way to measure the latency on the input path (I wish I could still wire things to a parallel port and use in/out instructions). As a control experiment, I do the same test on an old CRT display with a 170 Hz vertical retrace. Aero and multiple monitors can introduce extra latency, but under optimal conditions you will usually see a color change starting at some point on the screen (vsync disabled) two 240 fps camera frames after the button goes down. It seems there is 8 ms or so of latency going through the USB HID processing, but I would like to nail this down better in the future.

It is not uncommon to see desktop LCD monitors take 10+ 240 fps camera frames to show a change on the screen. The Sony HMZ averaged around 18 frames, or 70+ total milliseconds.
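To put those camera-frame counts into milliseconds (a straightforward conversion, not figures from the original text beyond those quoted above): each 240 fps camera frame is about 4.2 ms.

```cpp
// Convert 240 fps camera-frame counts into approximate milliseconds.
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 240.0;  // ~4.17 ms per camera frame
    std::printf("input-path baseline, 2 frames:   %.1f ms\n",  2 * frame_ms);  // ~8 ms
    std::printf("typical desktop LCD, 10 frames:  %.1f ms\n", 10 * frame_ms);  // ~42 ms
    std::printf("Sony HMZ-T1, 18 frames:          %.1f ms\n", 18 * frame_ms);  // ~75 ms
    return 0;
}
```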

This was in a multi-monitor setup, so a couple of frames are the driver's fault.

Some latency is intrinsic to a technology. LCD panels take 4-20 milliseconds to actually change, depending on the technology. Single-chip LCoS displays must buffer one video frame to convert from packed pixels to sequential color planes. Laser raster displays need some amount of buffering to convert from raster return to back-and-forth scanning patterns. A frame-sequential or top-bottom split stereo 3D display can't update mid-frame half the time.

OLED displays should be among the very best, as demonstrated by an eMagin Z800, which is comparable to a 60 Hz CRT in latency, better than any other non-CRT I tested.

The bad performance on the Sony is due to poor software engineering. Some TV features, like motion interpolation, require buffering at least one frame, and may benefit from more. Other features, like floating menus, format conversions, content protection, and so on, could be implemented in a streaming manner, but the easy way out is to just buffer between each subsystem, which can pile up to a half dozen frames in some systems.

This is very unfortunate, but it is all fixable, and I hope to lean on display manufacturers more about latency in the future.


Some monitors can have significant input lag

Accounting for an excellent internet connection compared with a poor monitor and video card combination, it's possible.

Sources:

Console Gaming: The Lag Factor • Page 2

So, at 30FPS we get baseline performance of eight frames/133ms, but in the second clip where the game has dropped to 24FPS, there is a clear 12 frames/200ms delay between me pulling the trigger, and Niko beginning the shotgun firing animation. That's 200ms plus the additional delay from your screen. Ouch.

A display can add another 5-10 ms.

So, a console can have up to 210 ms of lag.

And, as per David's comment, the best case should be about 70 ms for sending a packet.
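Putting the two sides of the comparison together as a back-of-the-envelope check (the 140 ms round trip below is a hypothetical value chosen to match the ~70 ms one-way figure above):

```cpp
// Rough comparison: worst-case console button-to-photon lag vs. one-way packet time.
#include <cstdio>

int main() {
    const int game_lag_ms    = 200;  // 24 FPS clip from the Digital Foundry test
    const int display_lag_ms = 10;   // upper end of the display's added input lag
    const int ping_rtt_ms    = 140;  // hypothetical transatlantic round trip
    std::printf("button-to-photon: %d ms\n", game_lag_ms + display_lag_ms);  // 210 ms
    std::printf("one-way packet:   %d ms\n", ping_rtt_ms / 2);               // 70 ms
    return 0;
}
```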


It is very simple to demonstrate input lag on monitors: just stick an LCD next to a CRT, show a clock or an animation filling the screen, and record it. One can be a second or more behind. It is something that LCD manufacturers have tightened up on since gamers and others have noticed it more.

E.g. YouTube video: Input Lag Test Vizio VL420M
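A minimal sketch of the kind of test animation described above, again assuming SDL2: sweep a bar across the screen once per second, mirror the desktop to both the LCD and the CRT, and film them side by side; the horizontal offset between the two bars shows the lag.

```cpp
// Input-lag demo sketch (SDL2 assumed): a bar sweeps the full width of the
// window once per second, so two mirrored displays filmed side by side
// reveal their relative lag as a horizontal offset.
#include <SDL.h>

int main(int argc, char** argv) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("lag demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    for (bool running = true; running; ) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT) running = false;

        Uint32 ms = SDL_GetTicks() % 1000;    // position within the current second
        int w = (int)(640 * (ms / 1000.0));   // bar width grows from 0 to full width

        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
        SDL_Rect bar = {0, 0, w, 480};
        SDL_RenderFillRect(ren, &bar);
        SDL_RenderPresent(ren);
    }
    return 0;
}
```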