NVIDIA Cuts Fermi Quadro Pricing by 50%

Discussion in 'Video Cards and Displays' started by Rob Williams, Oct 25, 2010.

  1. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    12,080
    1
    Jan 12, 2005
    Atlantic Canada
    Being as hyped as it was, AMD's Radeon HD 6800 series launch last week didn't need to be made more exciting, but NVIDIA decided to make it so anyway by dropping the prices of its hugely popular GeForce GTX 460 and GTX 470 graphics cards. It appears now that things didn't quite end there, as multiple sources are reporting that the company also dropped the prices of its Fermi-based Quadro line-up by 50%. Yes, fifty freaking percent.


    Read the rest of our post and discuss it here!
     
  2. DarkStarr

    DarkStarr Tech Monkey

    585
    0
    Apr 9, 2010
    Wow..... is it just me or does it look like Nvidia might be going broke?
     
  3. Doomsday

    Doomsday Tech Junkie

    1,477
    0
    Nov 24, 2008
    KHI, PAK
    Hah! 2.5 years ago, if you had said that, people would have laughed their asses off! :D
    It seems NVIDIA has taken AMD's spot as the new underdog!
     
  4. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    12,080
    1
    Jan 12, 2005
    Atlantic Canada
    I am not going to lie... these recent happenings have me a bit worried. In the past month alone, we've seen NVIDIA head to retail and drop prices on both its desktop and workstation cards, and it just doesn't look good. On the desktop side, price drops are common, but the fact that Quadro also saw massive drops is a bit... :confused:

    I don't think that Fermi was perfect, far from it, but NVIDIA did bring a lot of interesting things to the table. The problem is, Fermi came late, and an unsympathetic AMD felt the urge to continue steam-rolling it with last week's launch. Fermi still excels in some regards, but it doesn't seem to make a huge difference where pure gamers are concerned.

    NVIDIA needs to do something, but I'm not quite sure what. Ideally, we need the GTX 500 series to be not just a refurbished GTX 400 series, but a fresh design. One that's not so power-hungry, hot, and built using a trillion transistors (okay, I exaggerate). I don't think NVIDIA is in dangerous water yet, but things need to change soon to reverse course.

    It's strange to think that a mere couple of years ago, things were the stark opposite. It was AMD that was in a rut and NVIDIA that was excelling. How fickle this industry is.
     
  5. Mike

    Mike Obliviot

    1
    0
    Oct 27, 2010
  6. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    12,080
    1
    Jan 12, 2005
    Atlantic Canada
    Ahh, great find, thanks a lot for sharing! I'll follow up with our own news post in the AM.

    It's actually a relief to hear that this was all just an error... one I'm sure NVIDIA wished it caught before it happened.
     
  7. b1lk1

    b1lk1 Tech Monkey

    801
    0
    Mar 1, 2006
    Ontario
    Now, I realize that nothing in the consumer market could properly use it, but why can't we have 6GB memory-equipped cards for gaming? That is nucking futs!!!! (and so are the prices......)
     
  8. TheCrimsonStar

    TheCrimsonStar Tech Monkey

    763
    0
    Apr 12, 2010
    Strawberry Plains, TN
    This might sound like a stupid question... but what's the difference between their desktop Fermi and Quadro Fermi?
     
  9. Tharic-Nar

    Tharic-Nar Senior Editor Staff Member Moderator

    1,119
    1
    Nov 25, 2009
    UK
    More memory and optimised drivers for OpenGL (and possibly CUDA in NVIDIA's case). You pay for licenses etc. for speeding up viewports in 3D apps and for number crunching. The more mainstream cards support a sub-set of OpenGL, the 'standard' version. Quadros and FirePros support the extended or full version, which contains an additional 70 or 80 commands. This is why using a mainstream card in a pro app can result in instability or a lack of support for full live textured previews and lighting. Having 6GB of memory is also useful for rendering (especially with high levels of sub-sampling and 8K-resolution textures) if a GPU-enabled renderer is used. Math-crunching simulators and the like also gain from the extra memory: fewer calls out to system RAM.

    The cards are expensive mainly because of the specialised support they provide to industries where money is no object...
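    To put that 6GB figure in perspective, here's a rough back-of-envelope sketch of the VRAM footprint of uncompressed 8K textures. The assumptions (RGBA8 at 4 bytes per texel, a full mipmap chain adding roughly a third on top) are mine for illustration, not anything from NVIDIA's specs:

    ```python
    # Back-of-envelope: why 6 GB of on-card memory matters for 8K textures.
    # Assumes uncompressed RGBA8 (4 bytes per texel) and a full mipmap
    # chain, which adds roughly 1/3 on top of the base level.

    def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
        """Approximate VRAM footprint of one texture, in MiB."""
        size = width * height * bytes_per_texel
        if mipmaps:
            # Mip levels form a geometric series: 1 + 1/4 + 1/16 + ... ~= 4/3
            size = size * 4 // 3
        return size / 2**20

    per_8k = texture_mib(8192, 8192)
    print(f"One 8K RGBA8 texture: {per_8k:.0f} MiB")            # ~341 MiB
    print(f"Roughly fit in 6 GiB: {6 * 1024 / per_8k:.0f} textures")
    ```

    So even before render buffers and geometry are counted, a scene with a couple dozen uncompressed 8K textures eats the whole card, which is why the big frame buffer is a workstation feature rather than a gaming one.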
     
