Minimum standalone graphics card requirements for dual monitor display
I am researching hardware requirements for my new PC. One of my requirements is a dual monitor display to use while programming. I understand that most integrated graphics cards do not support dual monitors (?), and that I will need a standalone graphics card for this. But I also want a quiet PC, and graphics cards can be quite noisy. I am not a gamer, and other than dual monitors, I don't anticipate much need for a dedicated graphics card (though an added boost to UIs would be nice to have).
- What should I look for in a low-end, relatively silent and less power-hungry graphics card for my purpose?
- How do I calculate how much memory it should have?
- What should I look for in terms of output ports? (I am intending to use two 22 or 21 inch 1080p monitors - an area I need to research as well).
I will be running Windows 8.1.
Solution 1:
1. Integrated graphics is enough
Most integrated graphics chips do support dual-monitor output; I have set up a lot of such boxes in my office. All major graphics chip manufacturers do this: nVidia and ATI without any issues, and Intel only on relatively new chipsets (my guess, everything after the 900-series chipsets, and certainly all CPUs with integrated graphics, that is, Sandy Bridge and later, even Celerons).
The only real requirement is on the motherboard, and you can check it in the motherboard specification on the manufacturer's site or in the manual. Note that a separate connector is not always enough: in rare cases (my guess, less than 1% of models) the two connectors either always show the same picture or cannot be used simultaneously.
One more thing: I've seen a lot of motherboards with Intel chipsets lately (Gigabyte, ASUS, ASRock) that have dual-monitor output disabled in the BIOS setup. It's a simple switch to flip, though.
2. Any discrete graphics will do
I know this is a loose statement, yet I have not found any currently sold graphics card with two output connectors that does not support dual-monitor output. So get any of them.
Another thing to note: if you go with ATI, you will be able to use a dual-monitor setup at up to 1920x1200 on each monitor, but, depending on the card, only one port (sometimes more) will support higher resolutions, because on many AMD/ATI cards only one port supports DVI Dual Link. nVidia cards, even the cheapest ones, support DVI Dual Link on all ports, so you can connect a 2560x1600 monitor to each of the two ports. A rough illustration of why Dual Link matters follows below.
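To see why Dual Link matters, here is a back-of-the-envelope pixel clock estimate (my own illustration, not from the original answer). It ignores blanking intervals, which add roughly 10-20% in practice, so real modes need slightly more than these numbers; even so, 2560x1600 at 60 Hz already exceeds the ~165 MHz limit of a single DVI link, while 1920x1200 fits.

```python
# Rough single-link DVI check: a single link tops out at ~165 MHz pixel clock.
# Blanking intervals are ignored, so real requirements are a bit higher.
SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock in MHz for a given mode (no blanking)."""
    return width * height * refresh_hz / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clock = pixel_clock_mhz(w, h)
    verdict = "single link is enough" if clock <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
    print(f"{w}x{h} @ 60 Hz ~ {clock:.0f} MHz -> {verdict}")
```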
Most ATI cards with a sufficient number of connectors allow up to three monitors, and although I have neither set up nor seen such a configuration working in the wild, @Hennes says in the comments that it works (and also explains what you need to make it work). nVidia cards only support two.
If your motherboard has integrated graphics with a single output port, a single-headed discrete graphics card can also be enough, provided you set the BIOS to keep the integrated graphics enabled even when a discrete card is present. There can be conflicts, though, so I wouldn't recommend buying into this blindly.
Refrain from any other graphics chipset manufacturers (anything other than Intel, nVidia and ATI) because of the lack of third-party support and miserable drivers.
3. There are graphics cards with a USB interface, and there are USB monitors
Sometimes you can't plug in a PCI-Express card (the machine isn't yours, company policy, it's a notebook, etc.) but still want to use a second (or third) monitor. In that case USB can save the day.
USB graphics adapters connect to a USB port on one side and to a monitor on the other. They have a conventional output connector: D-Sub (VGA), DVI or HDMI; some of them have two connectors and allow connecting two monitors.
USB monitors connect directly to a USB port and are detected by Windows as a graphics card with a monitor attached. Some of them can be powered over USB 3.0; others require external power, with USB used mostly for the video signal.
A USB 2.0 connection does not provide enough bandwidth for video or even full-screen scrolling, but it works well for a mostly static picture (text editor, web browsing, etc.).
USB 3.0 is fast enough for a conventional refresh rate (60 Hz), so you can even watch Full HD video on the connected monitor (of course, you have to plug the USB 3.0 adapter/monitor into a USB 3.0 port on the PC or notebook). A rough bandwidth estimate is sketched below.
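As a rough sanity check (my own back-of-the-envelope numbers, not from the original answer), an uncompressed 1080p stream at 60 Hz needs about 3 Gbit/s, which is far beyond USB 2.0's 480 Mbit/s but well within USB 3.0's 5 Gbit/s raw signalling rate; in practice USB display adapters also compress the stream, so this is a pessimistic ceiling.

```python
# Back-of-the-envelope bandwidth check for an uncompressed 1080p60 stream.
# Real USB display adapters compress the data, so this is a worst-case figure.
width, height = 1920, 1080
bits_per_pixel = 24          # 8 bits per channel, RGB
refresh_hz = 60

stream_gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"Uncompressed 1080p60 stream: {stream_gbps:.2f} Gbit/s")    # ~2.99 Gbit/s
print(f"Fits in USB 2.0 (0.48 Gbit/s): {stream_gbps <= 0.48}")     # False
print(f"Fits in USB 3.0 (5 Gbit/s):    {stream_gbps <= 5.0}")      # True
```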
Note that this should be your last resort, mostly because the drivers for these devices are quite buggy.
Answers to specific questions:
- What should I look for in a low-end, relatively silent and less power-hungry graphics card for my purpose?
There are graphics cards with passive cooling; they fit your request. As a bonus, they have no fans to wear out and stop.
- How do I calculate how much memory it should have?
I'd say not less than 24 MB per display (a rough framebuffer calculation is sketched below), though I don't think you'll be able to find a card with less than 64 MB of RAM anyway. Integrated graphics uses system memory, so the limits are nowhere near a concern there.
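To illustrate where a per-display figure like that comes from (my own estimate, not part of the original answer): a 1920x1080 desktop at 32-bit colour needs about 8 MB for a single framebuffer, so double or triple buffering lands in the 16-24 MB range per display.

```python
# Rough framebuffer size for one 1080p display at 32-bit colour.
width, height = 1920, 1080
bytes_per_pixel = 4                      # 32-bit colour

framebuffer_mb = width * height * bytes_per_pixel / 2**20
print(f"Single buffer:   {framebuffer_mb:.1f} MB")        # ~7.9 MB
print(f"Triple buffered: {3 * framebuffer_mb:.1f} MB")    # ~23.7 MB per display
```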
- What should I look for in terms of output ports? (I am intending to use two 22 or 21 inch 1080p monitors - an area I need to research as well).
Preferably use DisplayPort, DVI or HDMI. There are no caveats with them at resolutions up to 1920x1200 (regardless of the physical monitor size). At higher resolutions there are specific requirements (usually explained in the manuals of high-resolution monitors).
Conventional D-Sub (analogue VGA) can also be good enough, but it depends on the DAC, and the cheapest ($15-$30) video cards may have noisy DACs; then the picture can shimmer or show other distortions. Integrated graphics, even on low-end motherboards, usually has a good DAC.
Also, when a monitor is connected via an analogue connection, the cable matters. I've seen (and used) cheap cables that produce clearly visible blur and even interfere with the monitor's auto-setup. The cable bundled with the monitor is good enough and will work flawlessly (or return it for replacement).
One more thing to consider: DVI has analogue (D-Sub) pins for compatibility, and there are passive DVI-to-D-Sub adapters. But on many video cards only one DVI port has these pins connected (or none, if there is a separate D-Sub connector), so it can be problematic to connect two monitors to a single video card via analogue connections.
Ultimately, an analogue connection through a good DAC and cable gives a very clear picture that is hard to distinguish from a digital one. Most people can't tell the difference unless the monitor is connected with both cables and they switch inputs, or two monitors are installed side by side, one via D-Sub (analogue) and the other via DVI or another digital connection.