What is a Matrox GPU and why does my university's UNIX server have one?

General-purpose servers don't need a modern GPU – just enough to show a basic console display. They mostly deal with regular CPU computing and networking.

Matrox G200 VGAs, however, are commonly used in servers because of their integration with a baseboard management controller (BMC – known under vendor names such as iLO or iDRAC, and often referred to loosely as "the IPMI").

This management controller acts as an independent system with its own operating system and lets the server's administrator remotely connect to the console display and keyboard – they can see the BIOS screens, restart the server even if it's completely frozen, and even power it on from a full power-off. For these tasks, the controller must know what the graphics adapter is displaying at any given moment.
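To make that concrete, here is a rough sketch of what out-of-band access through a BMC looks like in practice, assuming the standard ipmitool utility is available; the BMC address and credentials are placeholders, not real values:

```python
# Rough sketch: talking to a server's BMC over the network with ipmitool.
# The BMC address, username and password below are placeholders.
import subprocess

BMC = ["-I", "lanplus", "-H", "10.0.0.42", "-U", "admin", "-P", "secret"]

def ipmi(*args):
    """Run an ipmitool command against the BMC and return its output."""
    result = subprocess.run(["ipmitool", *BMC, *args],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

# These work even if the host OS is frozen or the machine is powered off,
# because the BMC runs independently on standby power.
print(ipmi("chassis", "power", "status"))   # e.g. "Chassis Power is on"
ipmi("chassis", "power", "cycle")           # hard reset a hung server
# ipmi("sol", "activate")                   # attach to the serial console (interactive)
```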

So I would guess that the old Matrox video adapters are used for this because they store the video buffer in system RAM (instead of their own VRAM) and use a sufficiently simple data layout that the BMC can decipher it without any arcane knowledge about the GPU's internals, and without any help from the main OS.

(Or perhaps the opposite – as mentioned in the comments, the G200 is usually built into the BMC itself, possibly giving the BMC completely direct access to the G200's video buffer.)
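As a rough illustration of why a plain linear framebuffer is so easy to scrape, here is a minimal sketch that decodes one pixel from a raw dump of such a buffer. The file name, resolution and 32-bit pixel format are assumptions made up for the example, not details of how a real BMC does it:

```python
# Minimal sketch: reading a pixel out of a plain linear framebuffer dump.
# Width, height, pixel format and the dump file are assumptions for the
# example; a real BMC reads the same kind of layout directly from memory.
WIDTH, HEIGHT, BPP = 1024, 768, 4           # 32-bit XRGB pixels
STRIDE = WIDTH * BPP                        # bytes per scanline

def pixel_at(buf: bytes, x: int, y: int) -> tuple[int, int, int]:
    """Return the (R, G, B) value of the pixel at (x, y)."""
    offset = y * STRIDE + x * BPP           # simple arithmetic, no GPU magic
    b, g, r = buf[offset], buf[offset + 1], buf[offset + 2]
    return r, g, b

with open("framebuffer.dump", "rb") as f:   # e.g. a copy of /dev/fb0
    frame = f.read(HEIGHT * STRIDE)

print(pixel_at(frame, 0, 0))                # colour of the top-left pixel
```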

But even if the server were built for GPU computing, I assume it wouldn't have an "all-in-one graphics card" as PCs do, but rather a set of dedicated compute-only GPGPUs (e.g. from nVidia) for the heavy work – and still the same Matrox VGA for the console.


That Matrox G200eR2 is not a separate video card. It is a chip directly integrated into the server motherboard. It is cheap, very reliable, easy to integrate, and provides excellent text (console) display ability and decent 2D graphics ability. It is also so well known that just about every operating system for Intel hardware has driver support built in for it.

The only purpose for a VGA card there is to get a basic console display that you can use for BIOS setup and initial installation of the server. After that you will probably only ever access the server remotely. It doesn't have to be a good VGA card. You are not going to be gaming on it. But it is a major blessing if it works out of the box with whatever OS you are going to install on the server. And that is all you need and want in a server.

Matrox chips have always been very popular for this purpose, and this particular one was still being used in new Dell servers in 2014 – and probably in some other brands as well.


Why would my university have them in a modern server (CPU was released in late 2013)?

Because a server does not need a high-performance GPU.
And by the way, Matrox had good multi-monitor graphics cards long before ATI/AMD and NVidia did.

So the decision was probably a logical one at the time of purchase.

Tags:

Linux

Unix