Use NVidia GPU from VirtualBox?

I realize a few years have passed, but I wanted to answer since this post shows up pretty high when you google "virtualbox 3d multiple GPU". In the time that has passed, things have gotten a lot simpler and better.

People who stumble upon this thread likely have a laptop or PC with two GPUs, which is quite common these days, especially on gaming laptops. The onboard Intel GPU is used for rendering windows and general applications, but applications that make use of 3D functionality should do so via the higher-performing NVIDIA GPU.

Today, I was building an Ubuntu VM on my laptop to do some cross-platform development. Everything was fine except that the guest VM was extremely slow, with no obvious explanation: CPU, memory, and disk all showed low utilization.

It didn't take long to figure out that video performance was causing the problem. Launching applications, maximizing/minimizing windows -- anything we take for granted in 2019 but that needs 3D acceleration to work at a reasonable speed -- was going through GPU 0.

It was easy to determine this because Windows 10 can now show GPU utilization in Task Manager, under the Performance tab. I could see that as I moved, maximized, and minimized windows, the work was being done by GPU 0 on the host. That GPU is the integrated Intel HD GPU, and I wanted to use the NVIDIA GTX 1050 Ti, which was GPU 1.

After searching around, I didn't find anywhere in VirtualBox itself where you could specify which GPU to use. But this thread, and some others, reminded me that on these kinds of setups you have to go into the NVIDIA Control Panel, then "Manage 3D settings", then the "Program Settings" tab.

You likely won't find "VirtualBox" in the list, but you can press the "Add" button and add VirtualBox.exe. You may have to drill down to the drive/path where your VirtualBox installation lives. Once you have added it, make sure that setting 2, "Select the preferred graphics processor for this program", is set to the GPU you want it to use -- in my case "High-performance NVIDIA processor".

Don't set it to auto, and certainly don't set it to integrated. Of course, you also need the VM's settings to have the 3D acceleration box ticked, and you need the Guest Additions installed in the guest. Once you have set the host video 3D settings as described above, shut down the guest VM, exit VirtualBox, and then re-launch VirtualBox and the VM.
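If you prefer the command line, the same 3D acceleration setting can be toggled with VBoxManage from the host while the VM is powered off (a rough sketch; "Ubuntu Dev" is just a stand-in for your own VM name):

vboxmanage modifyvm "Ubuntu Dev" --accelerate3d on --vram 128

The --vram 128 part simply raises the guest video memory, which also helps UI responsiveness.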

If you use Task Manager's Performance tab and watch which GPU gets used by the "VirtualBox Manager" process as you navigate the guest VM's UI, you should now see it using the better GPU. See the image pasted below.
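Another way to confirm it on the host, assuming the NVIDIA driver's nvidia-smi tool is on your PATH, is to list what is currently running on the NVIDIA GPU while the guest is open:

nvidia-smi

If the setting took effect, the VirtualBox process you added in the NVIDIA Control Panel should appear in the process list at the bottom of the output.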

All that said, don't expect to be able to run games in a guest VM; 3D acceleration passthrough still isn't quite that far along. But you can expect a modern OS and UI in your guest and an acceptable experience. You could play older games in the guest VM, like anything based on DirectX 9. Unfortunately, as the ability to virtualize the GPU evolves, 3D gaming technology evolves quicker.
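You can also sanity-check this from inside the Ubuntu guest. Assuming the mesa-utils package is installed, glxinfo will report which OpenGL renderer the guest is actually using:

sudo apt install mesa-utils
glxinfo | grep -i "opengl renderer"

If the renderer line mentions llvmpipe, the guest is still falling back to software rendering; with working 3D acceleration you should see VirtualBox's virtual GPU (e.g. an SVGA3D renderer) instead.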

Screenshot


Giving the guest full GPU access is probably not possible. If a virtual machine had direct access to your GPU while your host was using it, Bad Things™ would happen, because sharing memory between two effectively different computers is not a thing; pointers, addresses, and whatnot would be very different between them. (No consumer-available card supports servicing two computers at once.)

There are, however, some things you can try. First, set your preferred graphics processor to the good one in the NVIDIA Control Panel (3D Settings > Manage 3D settings > Preferred graphics processor). That might make VirtualBox go with the NVIDIA card for OpenGL.

If that doesn't help, try installing Guest Additions in Safe Mode on the guest.

Finally, on Linux hosts, you can try to pass the GPU through to the virtual machine. This will only work for PCI cards (I wasn't able to find whether yours is PCI), and even so, you stand a good chance of ripping the GPU away from the host or causing other problems. First, find the PCI address (bus, device, and function) of the good card. Then set your VM's chipset to ICH9; this didn't immediately break anything when I tried it. (A sketch of both steps follows the example below.) Finally, use the VBoxManage utility to attach the card:

vboxmanage modifyvm "Your VM Name" --pciattach BB:DD.F@bb:dd.f

Replace Your VM Name as appropriate. BB is the bus number of your GPU on the host; DD is the device; F is the function. After the @, enter the bus, device, and function the card will appear at in the guest. For example:

vboxmanage modifyvm "Windows 7 x64" --pciattach 01:[email protected]

In general, GPU passthrough is more likely to be possible on a Linux host. See How to setup a gaming machine with GPU passthrough.