NVIDIA Cuts Fermi Quadro Pricing by 50%

Rob Williams

Editor-in-Chief
Staff member
Moderator
As hyped as it was, AMD's Radeon HD 6800 series launch last week didn't need to be made more exciting, but NVIDIA decided to do so anyway by dropping the prices of its hugely popular GeForce GTX 460 and GTX 470 graphics cards. It appears things didn't quite end there, as multiple sources are reporting that the company has also dropped the prices of its Fermi-based Quadro line-up by 50%. Yes, fifty freaking percent.


Read the rest of our post and discuss it here!
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Wow... is it just me, or does it look like NVIDIA might be going broke?

I am not going to lie... these recent happenings have me a bit worried. In the past month alone, we've seen NVIDIA head to retail and drop prices on both its desktop and workstation cards, and it just doesn't look good. On the desktop side, price drops are common, but the fact that Quadro also saw massive drops is a bit... :confused:

I don't think that Fermi was perfect, far from it, but NVIDIA did bring a lot of interesting things to the table. The problem is, Fermi came late, and an unsympathetic AMD felt the urge to continue steam-rolling it with last week's launch. Fermi still excels in some regards, but it doesn't seem to make a huge difference where pure gamers are concerned.

NVIDIA needs to do something, but I'm not quite sure what. Ideally, we need the GTX 500 series to not just be a refurbished GTX 400 series, but rather a fresh design: one that's not so power-hungry, hot, and built using a trillion transistors (okay, I exaggerate). I don't think NVIDIA is in dangerous water yet, but things need to change soon to reverse its course.

It's strange to think that a mere couple of years ago, things were the stark opposite. It was AMD that was in a rut and NVIDIA that was excelling. How fickle this industry is.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Ahh, great find, thanks a lot for sharing! I'll follow up with our own news post in the AM.

It's actually a relief to hear that this was all just an error... one I'm sure NVIDIA wished it caught before it happened.
 

b1lk1

Tech Monkey
Now, I do realize that nothing in the consumer market could properly use it, but why can't we have 6GB memory-equipped cards for gaming? That is nucking futs!!!! (and so are the prices......)
 

Tharic-Nar

Senior Editor
Staff member
Moderator
More memory, and drivers optimised for OpenGL (and possibly CUDA, in NVIDIA's case). You're paying for licenses and the like that speed up viewports in 3D apps and number crunching. The more mainstream cards support a subset of OpenGL, the 'standard' version; Quadros and FirePros support the extended or full version, which contains an additional 70 or 80 commands. This is why using a mainstream card in a pro app can result in instability, or a lack of support for full live textured previews and lighting. Having 6GB of memory is also useful for rendering (especially with high levels of sub-sampling and 8K-res textures) if a GPU-enabled renderer is used. Math-crunching simulators and the like also gain from the extra memory, with fewer calls to system RAM.

The cards are expensive mainly because of the specialised support they provide to industries where money is no object...
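To put some rough numbers on the memory point above, here's a back-of-the-envelope sketch (my own figures, not from the article) of how fast uncompressed 8K textures chew through VRAM in a GPU renderer:

```python
def texture_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture.

    A full mip chain adds roughly one third on top of the base
    level (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

MIB = 1024 ** 2
GIB = 1024 ** 3

# One 8K texture at 4 bytes per texel (8-bit RGBA), no compression:
one_8k = texture_bytes(8192, 8192, 4)
print(f"One 8K RGBA8 texture + mips: {one_8k / MIB:.0f} MiB")

# Sixteen such textures in a scene:
scene = 16 * one_8k
print(f"16 such textures: {scene / GIB:.1f} GiB")
```

A single 8K RGBA texture with mipmaps lands around 341 MiB, so sixteen of them already need roughly 5.3 GiB: well past any gaming card of the day, but within reach of a 6GB Quadro.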
 