Where are all the Fermi haters now?

b1lk1

Tech Monkey
Well, as the launch of Fermi nears, the level of hatred around the web towards Nvidia seems to multiply exponentially by the day. The same people who a mere 3 months ago told us how superior the GTX 260/275/280/285/295 were are now the ones at the front of the mob, holding the torches to burn the dreaded Nvidia Fermi-stein.

It is no secret where my loyalties lie (ATI fanboy alert), but I have also tempered my comments with trepidation, knowing that Nvidia has the ability to shock and awe with the best of them.

Let's hear it from you guys. Keep it civil, but what do you all REALLY think about the impending launch?
 

Tharic-Nar

Senior Editor
Staff member
Moderator
I find it amusing when people demand competition from Nvidia because of ATI's massive choke-hold on prices, all $400 of it... for 6 months... Didn't Nvidia massively abuse its market share for 2-3 years with $700 cards?

Just a thought...
 

b1lk1

Tech Monkey
Well, the real problem is that ATI has artificially been able to get a price premium out of their HD4850/4870 cards with the lack of anything new from Nvidia. I also don't expect the MSRP for Fermi to come close to what we are used to from Nvidia upon launch (read: far less than we expect), so it will be VERY interesting to see just where the dust settles.

Also, most people were just fine and happy to spend more on the Nvidia parts since they were always the top performance dogs, but right now people are seemingly stating they will not spend extra no matter the performance lead.
 

Tharic-Nar

Senior Editor
Staff member
Moderator
Well, the real problem is that ATI has artificially been able to get a price premium out of their HD4850/4870 cards with the lack of anything new from Nvidia.

I thought you meant the 5 series. While I understand the prices staying high, if not rising, I don't mind too much since they need to recover some cash. Their CPU line is still taking a beating from Intel, and Nvidia had a choke hold for over 2 years, so ATI had to sell cheap. They aren't abusing the situation like Nvidia did; it took a short, sharp smack from the ATI 4 series to knock some sense into them and make them a little more realistic with prices.

The Fermi release may have little to no effect on ATI's prices. Probably the best we can hope for is that prices for the 5 series return to what they were at launch, which would be about a 10% drop.

I still want to see some real data on Fermi, as well as its impact on the non-gaming sector, like render farms and GPGPU systems. I don't really give a monkey's about PhysX and 3D Vision, to be honest. 3D is expensive, with very limited support from games/apps. PhysX-enabled GPU calculations can choke games on a single card, and you need 2 GPUs to use it at its best. There are the driver fiascos as well: burning out cards, disabling PhysX when a non-Nvidia card is detected, charging $700 for a new GPU on release (more than a lot of PCs cost). They've been making poor decisions, with some quality control issues to contend with.

I am slightly bitter with Nvidia, I will confess; their larger-than-life attitude rubs me the wrong way sometimes (though it can be amusing at times), and the fact I've had 2 of their cards die on me doesn't help. But I can understand their plight, what with the Intel licensing issue and AMD taking to designing its own chipsets again. Nvidia is losing one market but gaining others with their ION and Tegra systems. I think they just became too comfortable with their situation and are now beginning to sweat as it's taken away from them.

Sorry I'm all negative; I would like to see Fermi do well, especially if it can push GPGPU into more widespread use.
 

Relayer

E.M.I.
Seems like nVidia is trying to be too much to too many people with Fermi. I think they need to make separate designs for the GPGPU market and then let the tech trickle down to the consumer end. Instead they've had nothing but rebrands for months.

If Fermi manages to pressure ATI price/performance-wise, which it looks like it's going to, ATI will just cut prices to levels unbearable for nVidia to sell at. I'm of the opinion that they EOL'd the 200 series because of that. That, or they really underestimated how long it was going to be before they got Fermi out. I think they knew full well, though, that they wouldn't have anything to sell for a while; they just decided to quit taking a loss on every card they sold. It's quite possible that they are going to have to do it all over again with Fermi.

They've just taped out the lower-end chip (GF108?), so it'll be months before they have any mainstream (~$150) cards, and I haven't heard a peep about mobile DX11 products. So they aren't back on par by any means yet.
 

b1lk1

Tech Monkey
OOPS!!!!!!!! I did mean the 5850/5870 series cards, Tharic-Nar.

I agree with what you are posting as well, Relayer. Nvidia is in a tough spot, and they put all their eggs in the Fermi basket.
 

Optix

Basket Chassis
Staff member
It's nice being an impartial onlooker who buys based on price versus performance. A lot of what is going on is pretty funny: Nvidia touting PhysX like it is a gift from heaven and bashing DX11 as a useless piece of technology, AMD firing away at Nvidia on (I believe) the Hexus forum... this is the stuff that soap operas are made of.
 

Kougar

Techgage Staff
Staff member
Also, most people were just fine and happy to spend more on the Nvidia parts since they were always the top performance dogs, but right now people are seemingly stating they will not spend extra no matter the performance lead.

Given how it is shaping up, a lot of that has to do with the fact that there isn't any real reason for anyone to buy Fermi. GTX 480 vs 5870... the 480 is marginally faster, yet costs more, draws more power, runs hotter, and costs much more for NVIDIA to produce. Does a 10-15% performance lead really justify all the other areas where AMD wins?

I'll confess I will easily ignore power consumption and heat if the performance is good enough, but given the cards perform so closely to AMD's, I just don't see any justification for Fermi. NVIDIA even hobbled double-precision calculations on non-Tesla-branded cards to 1/4th of the actual performance, so even the GPGPU argument starts to lose steam here.

I'm glad to see NVIDIA innovating with their architecture and designing the core around non-graphics processing, but that isn't enough to justify the price tag for average distributed computing users, let alone pure gamers, who comprise the majority of their consumer market. A lot is going to hinge on whether NVIDIA can expand their Tesla market enough to support GF100. I'll wait until GF100b comes along before I will even consider upgrading now.
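
(A tangent for the GPGPU folks: if you wanted to check that FP64 cap yourself, a crude CUDA micro-benchmark along these lines would show it. This is just a sketch of mine, not anything from NVIDIA; the kernel and names are made up, and the exact ratio will vary with clocks and occupancy, but a capped card should show double running far slower than the roughly 2x you'd expect from the wider type alone.)

```
// Rough FP throughput probe: time a long chain of dependent
// multiply-adds in float and in double, then compare.
// Build with: nvcc -arch=sm_20 fp64_probe.cu (FP64 needs sm_13+)
#include <cstdio>
#include <cuda_runtime.h>

// Each thread runs a dependent FMA chain, so the kernel is bound
// by arithmetic throughput rather than memory bandwidth.
template <typename T>
__global__ void fma_loop(T *out, int iters)
{
    T a = (T)threadIdx.x * (T)0.001 + (T)1.0;
    const T b = (T)1.0000001;
    for (int i = 0; i < iters; ++i)
        a = a * b + (T)0.0000001;                   // dependent FMA chain
    out[blockIdx.x * blockDim.x + threadIdx.x] = a; // keep the result live
}

// Launch the kernel for type T and return the elapsed time in ms.
template <typename T>
float time_kernel(int blocks, int threads, int iters)
{
    T *out;
    cudaMalloc(&out, (size_t)blocks * threads * sizeof(T));
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    fma_loop<T><<<blocks, threads>>>(out, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(out);
    return ms;
}

int main()
{
    const int blocks = 256, threads = 256, iters = 1 << 16;
    float fp32_ms = time_kernel<float>(blocks, threads, iters);
    float fp64_ms = time_kernel<double>(blocks, threads, iters);
    printf("float : %8.2f ms\n", fp32_ms);
    printf("double: %8.2f ms (%.1fx slower)\n", fp64_ms, fp64_ms / fp32_ms);
    return 0;
}
```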
 

Doomsday

Tech Junkie
My 9600GT is showing its age! I need a new GPU, and for some reason my heart does NOT want to buy ATI, even though I like these guys! Something about their drivers puts me off!

Need a new GPU. Any ideas?!? Or wait till GF100b like Kougar?!
 

Optix

Basket Chassis
Staff member
Something about their drivers puts me off!
Ahh, now that needs some explanation, I think.

The 10.3 drivers seem to show a performance improvement in just about all scenarios, and it looks like AMD is giving the drivers the attention that ATI never did.
 

Doomsday

Tech Junkie
Ahh, now that needs some explanation, I think.

The 10.3 drivers seem to show a performance improvement in just about all scenarios, and it looks like AMD is giving the drivers the attention that ATI never did.

Aye, I have noticed recently that their drivers are getting better! Might just go ATI! :D
 

b1lk1

Tech Monkey
I have to agree now. There is no justification for upgrading, and there is little in the way of innovation that screams "BUY ME!". I still stand by the thought that these cards will sell out FAST, but only the fanboys will line up for them. It's sad really; as stated already, there is little chance of sparking a price war now, and we are stuck paying more for cards than we should.

I also need a major explanation of why Nvidia is designing beyond graphics capabilities in a graphics card. If they are THAT desperate to have a CPU, then they should just design one already.
 

Optix

Basket Chassis
Staff member
Just like with cars there will always be those with more money than brains that will want the best of the best no matter how much it costs or what alternatives there are. Those are the people who will snatch up the initial Nvidia offering.
 

Kougar

Techgage Staff
Staff member
I also need a major explanation of why Nvidia is designing beyond graphics capabilities in a graphics card. If they are THAT desperate to have a CPU, then they should just design one already.

Because the money isn't in the consumer gaming market; NVIDIA thinks there is more to be had in servers/GPGPU. They think the costs of building two separate cores outweigh those of redesigning one with an emphasis on computing first and gaming second, at least for the short term. They may in fact be working on one... again, it takes 3-5 years before each architecture gets spun.

What is amusing is that I think they are right. There are over a dozen companies that have plans for, or are already building, servers incorporating Tesla cards... and given NVIDIA can sell Tesla at much higher prices due to the qualifications and testing involved (just like Intel and AMD in the server CPU market), I would bet there is indeed a sizable market there.

Given core architecture development starts ~3-5 years prior to the launch date, what we are seeing now is just a modification of the original Fermi design. Give it a few years and I suspect NVIDIA will be producing different cores for each market. The core size is just untenable as it is... at least with G80 the performance justified the (at the time) huge power draw, heat, noise, and almost the price.
 

Tharic-Nar

Senior Editor
Staff member
Moderator
I really can't see Nvidia in the CPU market. There are other markets available to them with better margins. For one, it would be cheaper for Nvidia to buy VIA just for the x86 license than to go straight to Intel, since the two would squabble away for years; I mean, they couldn't agree on a chipset license, so what are the chances of getting x86? The CPU market is very harsh at the moment. AMD is barely staying afloat, though its server business is doing very well (12-core Opterons scaling near perfectly in 4-socket systems). If Nvidia can't get an x86 license, they'll either have to create a custom platform or use something like ARM, SPARC or PowerPC, which still means they're tied to other companies. Creating a new core logic and getting everyone to follow it would also be an uphill struggle (motherboards, chipsets, NICs, I/O, etc.). The only real choice is accelerator cards, so the push for GPGPU is a justifiable direction for them to take.

Technically it's all come full circle: CPUs were integer units, then floating point units were introduced as accelerator cards for specific needs, then FPUs were integrated (partially) into CPUs, then graphics cards came along (which are massively parallel FPUs), and now both Intel and AMD are integrating the graphics units into the CPU. So Nvidia would be making very big FPUs with API extensions. Their concentration on proprietary software technologies is their downfall, though, since developers would want to be able to program for any FPU rather than Nvidia only (and a specific model at that); it would be like writing software that'll only work on Intel processors.

There is the lucrative mobile market too, which is where their Tegra and ION platforms come in, and they can mix it up with console graphics as well. There really isn't any point in them getting into CPUs when they can provide dedicated accelerator chips for a broad range of apps without the worry of dealing with cross-license agreements and other legal issues. It would be nice if they came up with a new core logic for CPUs, but x86 is so entrenched in high-end computing that they could never replace it. Thus we will be stuck with a 30+ year old architecture with strict backwards compatibility, for that one high-paying customer that can't be bothered to change hardware or re-write their software, for at least another 20 years, or until Intel goes under or opens up x86.
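
(To make the "massively parallel FPU" point concrete, here's roughly the smallest useful GPGPU program there is, a SAXPY in CUDA. Treat it as an illustrative sketch of mine, nothing official, but it also shows the lock-in problem described above: this source only builds with Nvidia's toolchain and only runs on their GPUs.)

```
// Minimal SAXPY (y = a*x + y) in CUDA: the GPU used as one huge
// parallel FPU, one thread per array element.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                  // ~1M elements
    const size_t bytes = n * sizeof(float);

    // Host-side input data.
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy to the GPU, run the kernel, copy the result back.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);   // 256 threads/block
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %.1f (expect 4.0)\n", hy[0]);        // 2*1 + 2 = 4
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}
```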
 

b1lk1

Tech Monkey
I understand both points, guys, and admittedly I was oversimplifying to make a point.

My real point is that ATI is giving the current buying public what they want, and in droves, at many different and seemingly profitable price points. Nvidia has not been successful at selling to the masses for some time, as the 200 series has not been overwhelmingly received, and it is going to be phased out as well.

I understand what they meant to do with Fermi and Tesla, but that is putting all your eggs in one basket in the hope NO ONE else manages to come up with competing technology. No one knows if AMD is working on this behind the scenes, as well as any other company that could pull the rabbit out of the hat and score the death blow on Nvidia. I realize this is unlikely at best, but it is nearly suicidal on the Green Team's part, if you ask me.

In the end, we are all boned, as the price war that should have been is not going to happen and we are back to paying a price premium for graphics cards. Smoke 'em if you got 'em.
 

Envy

Obliviot
To be honest, I don't think it's that bad. The 480 beats the 5970 in some things and it's an estimated $300 cheaper.
 

b1lk1

Tech Monkey
I agree that it is not THAT bad; it's just not as GOOD as most people had hoped.

As for competing with the HD5970, I only have one word: EYEFINITY. Multi-monitor gaming is coming to a theater near you and is more important than you would think. ATI's solution is so far much more elegant and effective.

Time will tell, and we really need to wait for the next revision of Fermi to see what it can REALLY do. I'm just pissed it is nowhere near good enough to cause the long-awaited price war.
 

Kougar

Techgage Staff
Staff member
Don't misunderstand, I never said NVIDIA would enter the CPU market; I think they would ruin their business trying. What I am saying is that the processing their GPUs can do is finding more and more applications in the HPC and supercomputing markets.

Universities are using them to build micro-supercomputers; others are finding some of the strangest uses for them. Several major server vendors already offer motherboards & servers designed around Tesla GPUs, where the CPUs are just there to run everything. NVIDIA has been scoring several deals for large orders... probably the largest one is ORNL's plan to use Fermi to build a next-generation supercomputer. Someone else was planning to build a hybrid AMD CPU/GPU supercomputer that would switch between CPUs/GPUs based on workload type, but I don't recall who that was. At any rate, my point is NVIDIA is scoring all these deals, and I have yet to see anyone choosing ATI GPUs to do it.

To be honest, I don't think it's that bad. The 480 beats the 5970 in some things and it's an estimated $300 cheaper.

Granted I didn't bother to look very hard, but the only tests I saw like that were unrealistic (frame rates below 30 FPS). Given Fermi is at best 10-15% better than the 5870, yet costs $100 more... I don't think it's a good situation. I'd love to fold on a GTX 480, but I am not going to buy it for that reason alone. I think I will sit on my GTX 260 until NVIDIA releases a GTX "485", at the very least.
 

Relayer

E.M.I.
To be honest, I don't think it's that bad. The 480 beats the 5970 in some things and it's an estimated $300 cheaper.

$300 is stretching/spinning it; $150-$200 is more typical. I could spin it the other way and say it's only $100, which it is if you go by MSRP. Let's keep it real, though. As Kougar said, there's no realistic situation where Fermi beats the 5970. If you crank the AA 'til the frame buffer runs out on the 5970 but not the 480, then it does, but by then the frame rates on both cards are below optimum. So it's not likely anyone's really going to run the cards at those settings.
 