Why do workstation graphics cards cost far more than equivalent consumer graphics cards?

It's primarily market segmentation to allow price discrimination. Businesses that make money from work done on these cards have different requirements than gamers do, and Nvidia and AMD take advantage of that by asking them to pay more.

There are some minor differences that create this rate fence. For example, the Quadro / FirePro models use different drivers, which prioritize rendering accuracy over speed. On the Tesla models, ECC RAM is a selling point for server farms, and Nvidia claims higher reliability for 24/7 operation.

The company I work for designs GPGPU-accelerated software. Our server suppliers will only sell us Tesla (or GRID) systems. I know that if I buy a 1U server with 3x K40 cards, it won't melt in my client's data center. So I'm willingly paying triple the price for my cards. I imagine anyone buying a Quadro card for business has the same rationale.
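
As an aside: if you want to verify that the ECC you paid for is actually switched on, the CUDA runtime reports it per device. A minimal sketch (my own, assuming a machine with the CUDA toolkit installed; the file name is arbitrary):

    /* ecc_check.c -- list each CUDA device and whether ECC is enabled.
       Build with: nvcc ecc_check.c -o ecc_check */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            fprintf(stderr, "no CUDA devices found\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            struct cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            /* ECCEnabled is 1 on Tesla-class parts with ECC turned on;
               consumer GeForce parts report 0 because they lack ECC. */
            printf("device %d (%s): ECC %s\n",
                   i, prop.name, prop.ECCEnabled ? "enabled" : "disabled");
        }
        return 0;
    }

You can get the same information without writing any code via nvidia-smi -q -d ECC.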


It is pretty simple, and certainly not conspiratorial. It has to do with basic economics and finance.

Let's start with the CAD software. CAD software programmers, as a community, are pretty inwardly focused. To be sure, the math behind the user interface is very complex and absorbs a huge amount of resources to develop. There is an effort to make the software more usable and user friendly, but that input is usually driven by company insiders and the IT people at the customers they usually interface with, not actual users (CAD designers). CAD designers generally are not computer geeks; they complain among themselves, not to people who could change things, like the aforementioned IT people, who are not likely to pass on concerns anyway.

That is the back story. So spending time to make their graphics interface more generic is not high on CAD developers' agenda; it would be a huge investment of time and dollars, even though it would let standard drivers interface efficiently with the software. The structure of their business, and of their customers' business, will probably never make this a priority. Ever. This is why SPECviewperf V12.0.1 results vary so wildly across card/software combinations, making it almost impossible to pick a 'beast' card for multiple software packages unless you spend ridiculous amounts of money.

As for the card makers themselves, well, developing drivers ain't cheap: millions of lines of code and thousands of man-hours of debugging. Developing a separate driver for each software package is out of the question; there are not enough users to justify that expense. So they come up with something that kinda works with all CAD. For the large-volume packages, like AutoCAD, they may ship a patch that boosts performance, but most packages get a one-size-fits-all driver, which compromises more for some than for others.
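
The usual shape of those per-application patches is a profile table keyed on the application, with a generic fallback for everything else. A hypothetical sketch (my invention, not real driver code; the executable names and the fast_lines flag are made up):

    /* Hypothetical: look up per-application tuning, fall back to a
       generic one-size-fits-all profile. */
    #include <stdio.h>
    #include <string.h>

    struct profile {
        const char *app;        /* executable name to match, NULL = default */
        int         fast_lines; /* e.g., enable an optimized wireframe path */
    };

    static const struct profile profiles[] = {
        { "acad.exe", 1 },      /* a big-volume package gets a tuned path */
        { NULL,       0 },      /* generic fallback for all other CAD */
    };

    static const struct profile *lookup(const char *app)
    {
        const struct profile *p = profiles;
        while (p->app && strcmp(p->app, app) != 0)
            ++p;
        return p;               /* last entry is the default */
    }

    int main(void)
    {
        printf("acad.exe  -> fast_lines=%d\n", lookup("acad.exe")->fast_lines);
        printf("other.exe -> fast_lines=%d\n", lookup("other.exe")->fast_lines);
        return 0;
    }

The point of the sketch is only the structure: one tuned entry for the package that's worth the engineering time, and a compromise default for everyone else.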

Then they get to certify with the software makers; oh joy. This is a long, arduous, very expensive process, in which the hardware providers make a lot of driver changes and kiss a whole lotta butt. Accommodating one software package while making sure you don't jack things up for the other packages is a game of whack-a-mole that's almost impossible to imagine.

ECC memory in "pro" cards is actually not overkill. With the software packages having such touchy tessellation, and designers making crappy models, bit flips are not that uncommon, and ECC eliminates a lot of graphics-generated crashes (though not all of them). Other than that, the hardware is extremely similar to the consumer cards, as far as I can tell.
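
To make the bit-flip point concrete, here's a toy example (mine, not from any driver or CAD code) showing what one flipped bit does to a 32-bit float vertex coordinate:

    /* Flip a single exponent bit in a float and watch the value explode. */
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void)
    {
        float x = 1.0f;                 /* a vertex coordinate */
        uint32_t bits;
        memcpy(&bits, &x, sizeof bits); /* view the float's raw bits */
        bits ^= 1u << 30;               /* flip one high exponent bit */
        memcpy(&x, &bits, sizeof x);
        printf("after one bit flip: %g\n", x);  /* prints inf */
        return 0;
    }

One flipped bit turns 1.0 into infinity; feed that to a touchy tessellator and a crash is no surprise. ECC detects and corrects exactly this class of single-bit error.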

Then you have to take all of this expense and cram it into roughly 10 percent of the volume of their consumer cards, add a reasonable premium for going through all of that crap, and voilà, you have a quite expensive video card. If you do some research, and have only one CAD package, you may find a consumer card that blows away the high end (e.g. CATIA V6 on an AMD R9 290X) for a quarter of the price, but it's not likely.
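
To put rough numbers on the volume point (entirely made up, just to show the shape of the math; the fixed-cost and volume figures below are hypothetical):

    /* Spreading a fixed driver/certification cost over unit volume.
       All figures are invented for illustration. */
    #include <stdio.h>

    int main(void)
    {
        double fixed_cost     = 50e6;  /* hypothetical driver + certification spend */
        double consumer_units = 5e6;   /* hypothetical consumer card volume */
        double pro_units      = 0.10 * consumer_units;  /* ~10% of consumer volume */

        printf("per consumer card: $%.0f\n", fixed_cost / consumer_units);  /* $10  */
        printf("per pro card:      $%.0f\n", fixed_cost / pro_units);       /* $100 */
        return 0;
    }

Same fixed cost, a tenth of the units: the per-card burden is ten times higher before anyone even adds a premium.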

Those are my thoughts, anyway. I read a lot of stuff from sellers of CAD cards that is a load of poo, and I thought I would add my two cents.