ATI HD 4870 1GB vs. NVIDIA GTX 260/216 896MB: Follow-Up

Rob Williams

Editor-in-Chief
Staff member
Moderator
Two weeks ago, we published a performance comparison between NVIDIA's GTX 260/216 and ATI's HD 4870 1GB. What we found was that NVIDIA had the upper hand in both performance and efficiency. Today, we're re-testing ATI's card with its new 8.12 driver to see if it can increase performance enough to sway our decision as to which is the better card.

You can read our follow-up right here and discuss it here.
 

Kougar

Techgage Staff
Staff member
For whatever reason, ATI's card performed better at the high end, while NVIDIA's performed better at the lower end.

Those are closer to the results I remembered from the initial launch! The way it was explained to me, it comes down to the huge memory bandwidth advantage ATI has... which only really comes into play at the absurdly high 30" resolution with AA+AF, and isn't enough to change the outcome at 1920x1200.
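
For a rough sense of the numbers involved, here's a back-of-the-envelope sketch of theoretical peak memory bandwidth. The clocks and bus widths below are the reference specs as I recall them (assumptions; vendor cards vary), not figures from the article:

```python
# Theoretical peak memory bandwidth:
#   effective transfer rate (MT/s) x bus width (bits) / 8 = bytes/second
def bandwidth_gbps(effective_mts: float, bus_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_mts * 1e6 * bus_bits / 8 / 1e9

# Assumed reference specs:
#   HD 4870:     GDDR5 at 900 MHz (3600 MT/s effective), 256-bit bus
#   GTX 260/216: GDDR3 at ~999 MHz (1998 MT/s effective), 448-bit bus
hd4870 = bandwidth_gbps(3600, 256)   # ~115.2 GB/s
gtx260 = bandwidth_gbps(1998, 448)   # ~111.9 GB/s
print(f"HD 4870: {hd4870:.1f} GB/s, GTX 260/216: {gtx260:.1f} GB/s")
```

Note how GDDR5 lets the 4870 reach that bandwidth on a much narrower bus; the headroom matters most when high resolutions plus AA/AF push memory traffic up.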

Regarding prices, have you seen: Link?
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Ahh, thanks for the link. That's right on the "money" then... I found ATI to be exactly $20 less expensive on average. That doesn't look too good for NVIDIA, though. Arguably, they still have some nicer benefits on their card, but the thing went up $50 in the span of two weeks.
 
U

Unregistered

Guest
which 8.12

gidday,

nice article and finally someone includes real man games like UT3 :)
my one question is what version of the 8.12's are you using?
might make a bit of a difference if they are the betas as i found they had no big improvements for me over 8.11

keep up the good work :)
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
gidday,

nice article and finally someone includes real man games like UT3 :)
my one question is what version of the 8.12's are you using?
might make a bit of a difference if they are the betas as i found they had no big improvements for me over 8.11

keep up the good work :)

Whoops, I do believe I forgot to mention that. The 8.12 drivers we used were pre-release... ATI told us that those drivers would be identical to what will be officially released on Wednesday. I'm unsure about the beta drivers you're using, but they may not be the same ones used here.

Overall, the performance increases won't make an obvious difference in all titles, but the boost is nice regardless :)
 
U

Unregistered

Guest
Optimising settings

It would be interesting to see the performance difference once the settings for each driver have been tweaked for performance (without sacrificing too much quality, that is)!

For NVIDIA, that would mean using the latest nHancer 2.4.2 & RivaTuner v2.20, and for ATI, using the latest ATI Tray Tools 1.3.6.1042 (available from guru3d.com).

With ATI Tray Tools, it's necessary to enable a lot of options under 'Tweaks' (I think it's pretty much all the standard and advanced tweaks), plus a few others. If set up correctly for both NVIDIA and ATI, I think the results would be very interesting compared to the normal results. This is without overclocking!
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
It would be interesting to see the performance difference once the settings for each driver have been tweaked for performance (without sacrificing too much quality, that is)!

For NVIDIA, that would mean using the latest nHancer 2.4.2 & RivaTuner v2.20, and for ATI, using the latest ATI Tray Tools 1.3.6.1042 (available from guru3d.com).

With ATI Tray Tools, it's necessary to enable a lot of options under 'Tweaks' (I think it's pretty much all the standard and advanced tweaks), plus a few others. If set up correctly for both NVIDIA and ATI, I think the results would be very interesting compared to the normal results. This is without overclocking!

That's a good idea, but for the most part, simply overclocking the GPU is going to give you the largest benefits, and in the case of the two GPUs here, NVIDIA wins. Their card can be overclocked quite far (at least 75MHz on the core), while ATI's is limited thanks to the CCC.

If someone had the time to go through the nitty-gritty, it might not be a bad idea, but that seems a little too time-consuming for my liking, when overclocking could make an even larger impact.
 
U

Unregistered

Guest
The latest ATI Tray Tools available is actually 1.6.9.1340. I made a mistake before! It's true that overclocking would have the most effect; it would just be interesting to see both a settings-tweaked comparison (as long as anisotropy/anti-aliasing is the same) and an overclocked one. To overclock both the NVIDIA and ATI cards, the latest RivaTuner can do both. I haven't used RivaTuner on ATI, but apparently it's the most effective overclocking tool for both ATI and NVIDIA.

I have an NVIDIA card myself, and I haven't been happy with the driver quality across multiple cards. ATI has the bad rep in terms of drivers, but that's old; NVIDIA are now the worse of the two in my opinion! Every single proper release driver for my card has worked worse than the latest (higher) beta, even on a clean system. That isn't too uncommon from what I've read. Despite it failing in benchmarks, I'm now looking at a 4850 1GB (yes, I know, but games like GTA 4 apparently run much better with more video RAM), just to tide me over until something better comes along.

I agree that the 'The Way It's Meant To Be Played' program is suspect; there would be no reason other than a financial one (whether it be biased technical support or straight monetary) for those titles to actually claim that.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
From what I understood, ATI Tray Tools is defunct, at least where newer ATI cards are concerned. Last time I tried to use it (about a year ago), it wouldn't work with the latest card I had on hand, so I just left it.

"To overclock both the nvidia and ATI, the latest Rivatuner can do both."

How so? I've tried this as well, and haven't figured out how. Is it simply done by adjusting the individual values under the tweaker setting? I don't know why ATI overclocking couldn't have simple sliders like the NVIDIA cards... it would make things so much easier.

I agree that the 'The Way It's Meant To Be Played' program is suspect; there would be no reason other than a financial one (whether it be biased technical support or straight monetary) for those titles to actually claim that.

It's a confusing situation, that's for sure. From what I understood, NVIDIA didn't pay for the privilege of having certain games optimized for their cards, but they probably do pay for the right to put the logo on the box. As it was explained to me in the past, NVIDIA offers tips and help with improving gameplay on their cards, but that's about where it ends. I'm really not sure what to conclude, though... I've heard too many sides to this story.

Oh, and to the Aussie poster earlier, the exact driver version is: 8.561-081201a-072643E-ATI.
 

Kougar

Techgage Staff
Staff member
ATI Tool is the one that is defunct; it hasn't seen a major update in two years, IIRC. ATI Tray Tools, on the other hand, is actively developed, although it doesn't offer anything like what RivaTuner does, or an artifact scanner like ATI Tool did.

Regarding the logo and slogans, I do know companies that agree to side with one GPU maker over the other tend to get unreleased hardware pretty early for their game development. Valve almost always seems to be coding their next generation games on plenty of unreleased ATI hardware, to name one example.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Regarding the logo and slogans, I do know companies that agree to side with one GPU maker over the other tend to get unreleased hardware pretty early for their game development. Valve almost always seems to be coding their next generation games on plenty of unreleased ATI hardware, to name one example.

Ahh, that makes a lot more sense.
 
U

Unregistered

Guest
Thanks for being an HONEST reviewer

Two weeks ago, we published a performance comparison between NVIDIA's GTX 260/216 and ATI's HD 4870 1GB. What we found was that NVIDIA had the upper hand in both performance and efficiency. Today, we're re-testing ATI's card with its new 8.12 driver to see if it can increase performance enough to sway our decision as to which is the better card.

You can read our follow-up right here and discuss it here.

I'd just like to say I enjoyed both articles, and noticed you didn't have an axe-grinding problem or fanboy syndrome anywhere; in fact, you were just being plain honest.

I HAVE to give you a huge thumbs up for that and lots of appreciation. I've seen the dark side and it ain't pretty.

I perused both reviews and didn't wind up angry about anything I read or saw (lol, sadly that is a near first).

Robert, you're a straight shooter, and in my book that is a HUGE plus.

I really can't thank you enough.

SiliconDoc
 
U

Unregistered

Guest
Yes it is interesting the competition and various advantages

I agree that the 'The Way It's Meant To Be Played' program is suspect; there would be no reason other than a financial one (whether it be biased technical support or straight monetary) for those titles to actually claim that.

I have heard a lot about this particular "deal" - but I've never heard mentioned what I'm about to say...

How do you think owning not only a CPU (X2 64 athlon/opteron etc) but ALSO a chipset on many motherboards as well as the GPU ( AMD/ATI ) plays into advantages for optimizing your videocard product ?

Would it be, or isn't it true, that AMD likely has access and knowledge far superior to NVidia's on, for instance, "hardware calls" or whatever technical term is used for knowing the structures of the CPUs, and uses it advantageously? Does AMD/ATI wind up "refusing" NVidia when they ask for very necessary and needed information?
Ooooooooh !
Do they delay giving them information in whatever agreements have been made, are there "errors" delivered "by mistake" that NVidia has to watch out for ?

One should think so IMO, so NVidia did something to combat the added cost of reverse engineering or catching the little - uhh, shall we say "problems"... they CREATED a CHIPSET for motherboards - and yes some run SLI or are made for SLI and many aren't - but the fact remains that's a LOT of motherboards ATI wants their cards sold for and used on... shall we say "LEVERAGE!" pressure so AMD/ATI winds up having to "play fair" ? LOL

You get the idea... and INTEL is in a similar but different position with perhaps a larger advantage than AMD - and Intel makes a lot of chipsets and integrated video, and has traditionally had a much tighter relationship with --- WINDOWS ! ... and owns the #1 cpu for the longest stretch.

Ok, so TWIMTBP is not the only thing going on, and I think owning a CPU and chipset and videocard (AMD/ATI) winds up giving access to OS calls for the vidcards and DirectX, and who knows how many other kernel-related and/or driver-related things that NVidia's licensing probably doesn't include... not to mention AMD/ATI's DEEP, hard, and long look at Intel's CPU core...

( boy, those nForce chipsets probably helped a great deal, NVidia must have been panicking before the nForce2 motherboard chipset! .. seems to me things happen for a reason... lol )

YES, the oft-repeated mantra against the (depending on your perspective) dirty-looking TWIMTBP is only the very tip of the spiderwebbed iceberg...

Anyway, since you commented like others on it - perhaps my thoughts will be interesting to you.

Enjoy your gaming and enthusiastic following and all the great card releases both companies have given us this last year - (and hopefully in the near future too ! )
 
U

Unregistered

Guest
How are both these cards as compared to the GTX260 core 216 55nm?

How are both these cards as compared to the GTX 260 Core 216 55nm? The initial reviews on NewEgg aren't that good, mostly due to the toned-down heatsink on the 55nm.

Also, Mr. Editor, you never tested both these cards for HD playback or in overclocked mode. It would be great if you could try that and add them to the main review OR over here. Even putting in 2-3 lines comparing these 2 cards when overclocked to the max would do. That would have surely made this a complete review. TIME TIME TIME, if that permits.

Regds
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Hi Unregistered:

Simply moving from 65nm to 55nm isn't going to make a large difference at all, where performance is concerned. You might see a small difference, but nothing noticeable in real-world gameplay. I'm not sure I can comment on the heatsink, but NVIDIA cards tend to run a little cooler than ATI cards to begin with, and as long as it keeps within its built-in temp threshold, it should be fine. I can't really see a potential downside on the new cards, to be honest.

As for the overclocking, as I mentioned in the original article (or at least in previous articles), where OC'ing is concerned, NVIDIA is going to win... every... single... time. Part of the reason is that their GPUs get hot, but another is that the Catalyst Control Center doesn't allow a very impressive top overclock. You could use other tools to go even higher, but it's such a hassle, I really don't find it worth it. But then, I don't find overclocking worth it to begin with... the gain usually isn't worth the time and effort.

For HD playback, I haven't delved much into this before, but again, I wouldn't expect either card to be lacking where performance is concerned. I ran 1080p video on a $700 laptop a few weeks ago... so pretty much any GPU on the market is going to do fine. If you mean quality, I'm not sure there... but it's something I'll look at in the future. Will need to get a BD-ROM in...

As for time, it's definitely extremely limited lately (especially this month), but your ideas are good and I'll definitely consider them for future articles. Thanks for the ideas :)
 
U

Unregistered

Guest
Thanks for the Reply!

^^ Many Thanks for the prompt reply. Got many of my doubts cleared.

What I meant by HD playback capabilities is: I read somewhere that ATI does much better. In one review somewhere on the net, the reviewer used a very old CPU (an E6700 or something like that)... just to test how these 2 cards perform wrt HD. It was found that with the 4870, CPU utilization remains very low (around 18%), while with the GTX 260, it is higher (around 25%). To sum it up: GPU and CPU utilization wrt HD playback on these cards.

One more thing. Saw your comparison shots. Didn't find any difference as such (other than slightly higher/lower brightness and contrast in some parts of the image). However, in Far Cry 2, the birds were missing. Don't know whether this is the AI involved or the 'image quality' part of it.

Cheers!
 

goijgjigpe

Obliviot
hi,

I'm utterly confused about which card to get now. Please can someone help me!
Price makes no difference to me; here in the UK the cards are priced virtually identically with the pre-clocked versions, and between the 2 brands. I just want to know which card is better!

Here's my specific problem though: with regards to the GTX, there is only a £20 price difference between a standard card
core clock 576MHz
shader clock 1242MHz
and one that's been overclocked to a 650MHz core and a 1400MHz shader clock.

Now, does this mean that the NVIDIA would be a hands-down clear winner against the 4870 1024MB? Or are there other overclocked ATI versions I'm unaware of? What is the standard clock speed of the 4870 1024MB? I can't even find it on the ATI website. At least then I could compare the standard against the overclocked editions of the card.

i'm so confused!
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
goijgjigpe: I don't really recommend paying £20 more just for an overclocked card... it's not worth it. If you want faster clocks, you could easily take care of that yourself. You might not want to go that high, but in all honesty, overclocking GPUs in general makes very, very minor differences in overall FPS.

To be honest, both cards are going to be great if the price is exactly the same. As I mentioned before, I tend to be drawn towards NVIDIA for various reasons, some of which are pointed out in the article. So if you just want to pick up a card and not worry about anything, pick up whichever one looks better. If you want to take advantage of PhysX or CUDA applications, then NVIDIA is your only option.
 