What are the drawbacks of using an integrated GPU for non-gaming purposes?

Since Intel is one of the few companies that actively support open-source drivers for their GPUs, theirs are probably the most robust GPUs you could use under Linux right now for that reason alone. Personally I would buy an Intel graphics card just for that, but sadly nobody makes video cards with Intel GPUs.

Dual monitor support depends on the connectors on the machine itself. I have seen some machines with recent Intel GPUs that can support two digital (DVI/HDMI/DisplayPort) screens plus one analogue (VGA) for a total of three screens. It just depends on what connectors the manufacturer decides to use.
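If you want to see what connectors your particular machine actually exposes under Linux, the kernel's DRM subsystem lists each one along with whether a screen is plugged in. Here's a minimal sketch that reads `/sys/class/drm`; the connector names (HDMI-A-1, DP-1, VGA-1 and so on) and the number of entries vary from machine to machine.

```python
#!/usr/bin/env python3
"""Sketch: list DRM connectors and whether a monitor is attached.

Reads /sys/class/drm/card*-*/status, which the kernel exposes for each
physical connector (HDMI, DisplayPort, VGA, ...). Connector names vary
between machines, so treat this as illustrative only.
"""
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status_file = conn / "status"
    if status_file.exists():
        status = status_file.read_text().strip()  # "connected" or "disconnected"
        print(f"{conn.name}: {status}")
```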

You'll need to check the specs, but most video cards will output up to 2560x1600 over their digital connections, and 2048x1536 over VGA.

The GPU's performance doesn't decrease just because it's integrated, as it's logically separate from the CPU, and power usage depends on the type of GPU rather than on whether it is integrated or not. Dedicated GPUs are typically faster, so they use more power. An integrated GPU might use a tiny bit less power than an identical dedicated one, since it doesn't need as much supporting circuitry and can usually share the CPU's fan instead of needing its own, but the difference would be pretty small.

If you're not gaming and you're running Linux, the advantage of an Intel integrated GPU is pretty clear: solid open-source driver support.


It would depend on the specific integrated GPU. There are really two main ways of implementing this: the old way was to have the video processor as part of the chipset, as most pre-Sandy Bridge/Ivy Bridge Intel systems and pre-APU AMD systems did. With modern chips it's part of the processor, but it has its own section/block of the die. Using or not using the integrated graphics doesn't take away or add any performance as far as normal processor tasks go.

However, integrated graphics will share RAM, reducing the total you have available. This is probably insignificant if you've loaded up your system with RAM, but it's the only real disadvantage an integrated GPU has compared to an identical non-integrated GPU.
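If you want a rough feel for how much memory has been set aside, one crude check is to compare the RAM you physically installed against what the kernel reports in `/proc/meminfo`. The difference includes the graphics "stolen" memory along with other firmware reservations, so it's only an estimate, and the installed amount below is an assumption you have to fill in yourself.

```python
#!/usr/bin/env python3
"""Rough estimate of RAM reserved away from the OS (includes the iGPU's
'stolen' memory plus other firmware reservations). Only an approximation."""

INSTALLED_GIB = 8  # hypothetical: the RAM you physically installed, in GiB

mem_total_kib = None
with open("/proc/meminfo") as f:
    for line in f:
        if line.startswith("MemTotal:"):
            mem_total_kib = int(line.split()[1])  # reported in kiB
            break

if mem_total_kib is None:
    raise SystemExit("Couldn't read MemTotal from /proc/meminfo")

usable_gib = mem_total_kib / (1024 ** 2)
print(f"Usable RAM: {usable_gib:.2f} GiB")
print(f"Reserved (firmware + iGPU, estimate): {INSTALLED_GIB - usable_gib:.2f} GiB")
```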

You're getting a 'good enough' video adaptor (perfect for normal office-type use), effectively for free. With the Intel HD 4000 series you mentioned, you can take advantage of Quick Sync, which allows seriously fast, efficient hardware-accelerated video encoding and decoding. As things stand, this is the killer feature if for some weird reason you had to decide between using an Intel graphics adaptor and a discrete one.
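As a concrete illustration (not a recipe), driving Quick Sync from a script looks roughly like this. It assumes an ffmpeg build with the h264_qsv encoder enabled, which isn't a given on every distro, and the file names are just examples.

```python
#!/usr/bin/env python3
"""Sketch: hardware-accelerated H.264 encode via Quick Sync using ffmpeg.

Assumes ffmpeg was built with Quick Sync support (the h264_qsv encoder);
check with `ffmpeg -encoders` first. File names are placeholders.
"""
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mkv",      # hypothetical source file
    "-c:v", "h264_qsv",     # Quick Sync H.264 encoder
    "-preset", "fast",
    "-b:v", "4M",           # target bitrate, pick to taste
    "-c:a", "copy",         # leave audio untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```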

At the end of the day, if you're asking yourself whether you need a discrete card, just start with the Intel adaptor, work out what you're missing, and then work out what the best choice is. Your processor is going to come with that functionality anyway, so you might as well see if it works for you.

So:

  1. No. You might see some minimal change in temperatures, but you'd simply be using a part of your CPU you otherwise wouldn't.

  2. Depends on the GPU - the GeForce 660 (which I incidentally run) has one of the best idle power figures of any discrete GPU, at about 5 W at idle; I can't find comparable numbers for integrated adaptors. I'd note, though, that it makes no sense to have a discrete GPU unless you game or intend to do GPGPU work - it's additional cost for not much benefit. For gaming, Nvidia is probably the best, but for GPGPU-type tasks like Bitcoin mining, AMD slaughters it.

  3. Probably, assuming you aren't gaming.

  4. I've not had any with a Core i7 3770 on Windows, or with a previous-generation Core i5. It ought to work.

  5. Intel is about the only company that releases open-source drivers for the video adaptors it develops itself. You're very likely to see it work right out of the box with minimal issues (see the quick check below).
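For what it's worth, here's a quick way to confirm which kernel driver has bound to your GPU; on Intel integrated graphics you'd normally expect to see i915. This is a sketch that assumes the usual sysfs layout with the primary GPU as card0.

```python
#!/usr/bin/env python3
"""Quick check: which kernel driver is bound to the GPU.

Assumes the common sysfs layout with the primary GPU as card0; on an Intel
integrated GPU the answer is normally 'i915'.
"""
import os

driver_link = "/sys/class/drm/card0/device/driver"
if os.path.islink(driver_link):
    print("GPU kernel driver:", os.path.basename(os.readlink(driver_link)))
else:
    print("No driver symlink found at", driver_link)
```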