ATI Radeon HD 4890 & NVIDIA GeForce GTX 275

Rob Williams

Editor-in-Chief
Staff member
Moderator
It's not often we get to take two brand-new GPUs and pit them against each other in a single launch article, but that's what we're doing with ATI's HD 4890 and NVIDIA's GTX 275. Both cards are priced at $249, and both also happen to offer great performance and insane overclocking ability. So, with those and other factors in mind, who comes out on top?

You can read our full write-up on the two new cards here and discuss them here!
 

Greg King

I just kinda show up...
Staff member
Very good review, Rob. I know how much work, time and effort you put into it, and the final result was a solid, complete review.

To be honest, the 4890 looks to be a very solid product, and I don't see it staying that far above the $220 price point for much longer. That's just me though. The performance is there, but it's just below NVIDIA's 275 in most areas.
 

Kougar

Techgage Staff
Staff member
Howdy Rob,

Just a quick note: there isn't any mention of the actual test settings used on the overclock chart/page! That one chart makes for a pretty good snapshot of where things stand; great to see it in there. Nice review. :)
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
DarkSynergy said:
To be honest, the 4890 looks to be a very solid product, and I don't see it staying that far above the $220 price point for much longer.

I'm thinking along the same lines, and it's too bad. I didn't expect the GTX 275 to come along and obliterate the HD 4890 in pretty much every test. It's still a terrific card, but its pricing will undoubtedly have to be dropped a little bit.

Kougar said:
Just a quick note: there isn't any mention of the actual test settings used on the overclock chart/page!

Fixed... I never think to include that information. It will always be the highest settings possible though, which in this case is 2560x1600 (and also the same configurations we used throughout our regular testing). All of the stock results included on that graph were pulled from our 2560x1600 graphs throughout the article.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Was talking to Nate last night about this... it's a strange issue. Honestly, I never noticed that when we did use that particular game (we don't now thanks to DRM), although I admit I didn't pay that close attention. If I ever get that game working again, I'll have to test this out as well, since that's the same area I used for benchmarking.

I really have to wonder if there are other games that experience this sort of thing. Is this what we're coming to? Will benchmarkers really have to analyze each and every scene we use to see if issues like this exist?
 

Doomsday

Tech Junkie
Nice review! Loved the red/green bar chart colors! For some reason I got more interested in the 40nm lineup, lol!
The 40nm cards are gonna be awesome! Though I am really aching to spend some of me $650 on summat, lol! Parents won't agree with the new setup for a 24" LCD, so I will have to go for a 22" one now. Oh well, I'll wait for the 40nm GPUs and buy the 22" LCD and a new GPU then!
The Innovation vs. Rebranding thing was really nice!
 

Greg King

I just kinda show up...
Staff member
Nice review! Loved the red/green bar chart colors! For some reason I got more interested in the 40nm lineup, lol!

I don't personally care about the 40nm chips, to be honest, but I am blown away that they are able to do it.

Think about it. Who else is at that level? ATI will beat NVIDIA to 40nm. Intel has working 32nm silicon, I believe, but it won't be to market before ATI's 40nm parts. Even AMD doesn't have CPUs at that point. This is a HUGE technological achievement, and ATI deserves to be lauded for it.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Well, the thing about 40nm is... we won't be seeing high-end cards built on that process for a little while. The launch cards will be the HD 4700 series, specifically the HD 4770. That card is somewhat similar to the HD 4830, but interestingly enough, it will use GDDR5 rather than GDDR3. That's alongside a narrower bus, at 128-bit. It seems a bit odd to narrow the bus and then use faster GDDR, but AMD said that they expect the performance from that configuration to be quite similar to a 256-bit bus on GDDR3. I want to see it before I believe it, though.
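The math behind that claim works out on paper, at least. Here's a rough peak-bandwidth sketch in Python; the clock speeds below are purely illustrative placeholders, not confirmed HD 4770 or HD 4830 specs:

# Peak memory bandwidth = (bus width in bytes) x (effective data rate)
def peak_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return (bus_width_bits / 8) * effective_mt_s / 1000  # GB/s

# Illustrative numbers only: GDDR5 moves roughly twice the data per clock
# that GDDR3 does, which is how a 128-bit bus can approach a 256-bit
# GDDR3 configuration.
print(peak_bandwidth_gb_s(128, 3200))  # 128-bit GDDR5 at 3.2 GT/s -> 51.2 GB/s
print(peak_bandwidth_gb_s(256, 1800))  # 256-bit GDDR3 at 1.8 GT/s -> 57.6 GB/s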

Greg King said:
Intel has working 32nm silicon, I believe, but it won't be to market before ATI's 40nm parts.

You're right, AMD is first. The CPU guys are skipping over 40nm entirely and going straight to 32nm, and we should be seeing the first units there this fall. AMD's first 40nm will be out next month though, so they are really on a roll.
 

Doomsday

Tech Junkie
You're right, AMD is first. The CPU guys are skipping over 40nm entirely and going straight to 32nm, and we should be seeing the first units there this fall. AMD's first 40nm will be out next month though, so they are really on a roll.

Wohoo! Damn, I knew I should not have put so much money into the 780i and should have gone for the cheaper Asus P5Q mobo. A 3-way SLI mobo, sheesh! And I am never even gonna run 2-way SLI on it, lol!
 

Unregistered

Guest
The original song is by Daft Punk... ;)

Also, I wouldn't say that the GTX 275 dominates the HD 4890... not everyone plays (or benchmarks) at 1680x1050.

If you look at other reviews, you will see that the 4890 actually pulls ahead in many tests at lower resolutions (1440x900, etc.).
 

Greg King

I just kinda show up...
Staff member
If you're going to come onto the boards and trash our review (which varies in results only slightly from the two that you linked to), please do yourself a favor and attempt to string together at least a readable sentence. That's barely English.

But to be perfectly fair, you're entitled to your own opinion. If you don't care for the site, perhaps you should visit one of the ones you do like.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
The original song is by Daft Punk...

Also, I wouldn't say that the GTX 275 dominates the HD 4890... not everyone plays (or benchmarks) at 1680x1050.

If you look at other reviews, you will see that the 4890 actually pulls ahead in many tests at lower resolutions (1440x900, etc.).

To be honest, there is no reason for someone to purchase a $250 graphics card if their resolution is 1680x1050 or lower, unless they are trying to somehow future-proof or they plan on upgrading their monitor in the near future. I think back to when we used to review 9600 GT cards... despite them costing only $100, they could handle games like Call of Duty at 2560x1600 WITH anti-aliasing. It of course wasn't as smooth as on a beefier card, but that's at a huge 4.1-megapixel resolution... it only got better at lower resolutions.
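To put those resolutions in perspective, here's a quick back-of-the-envelope calculation (just an illustrative Python sketch showing where the megapixel figures come from):

# Total pixels pushed per frame at each resolution mentioned in this thread
for width, height in [(2560, 1600), (1680, 1050), (1440, 900)]:
    print(f"{width}x{height} = {width * height / 1_000_000:.1f} megapixels")

# 2560x1600 = 4.1 megapixels
# 1680x1050 = 1.8 megapixels
# 1440x900  = 1.3 megapixels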

That's the great thing about GPUs nowadays (and CPUs even)... you really don't have to spend much money to get stellar performance, especially if you are running more modest resolutions.

Daft Punk kicks ass, by the way ;)
 

Unregistered

Guest
To be honest, there is no reason for someone to purchase a $250 graphics card if their resolution is 1680x1050 or lower, unless they are trying to somehow future-proof or they plan on upgrading their monitor in the near future. I think back to when we used to review 9600 GT cards... despite them costing only $100, they could handle games like Call of Duty at 2560x1600 WITH anti-aliasing. It of course wasn't as smooth as on a beefier card, but that's at a huge 4.1-megapixel resolution... it only got better at lower resolutions.

Are you joking? You really think someone with a 19" monitor (1440x900) wouldn't buy a 4870 when it came out ($300)?

It isn't "future proofing (no such thing in pc gaming)" its getting the performance you want for a price you can pay.... Some people like playing games at (big shock here) high frame rates, not just slightly above 30... Also, Call of Duty is not a graphics intensive game. Try playing Empire Total War with max settings 1440x900 on a 9600gt.... (gl)

As for your numbers, in every other review I have read, the 4890 and the 275 have been neck and neck, with the 4890 leading in some benches while the 275 leads in others. Yet somehow in your review, the 275 is ahead in every single benchmark... (curious)

I agree with the above poster, obviously you have an agenda...
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Thanks for the follow-up comments.

You may be one of the more serious gamers, and I respect that, but I still stand by my opinion that people who game at lower resolutions don't need an expensive graphics card. In the example I gave, I wasn't talking about barely-playable settings, but fully-playable settings. Sure, the gameplay wasn't as buttery-smooth as on a 9800 GTX, but realize I'm talking about the 4.1-megapixel resolution of 2560x1600, not the 1.3-megapixel resolution of 1440x900.

If that $125 card (at the time) could handle the then-top game at max resolution, I'm confident that any GPU in the $150 price range today is going to be stellar for anyone on a lower resolution. Games haven't advanced so much since then that my opinion would change. If you are looking for anti-aliasing, things might vary a bit, but I'm still doubtful that a huge graphics card is needed. Again, I played CoD 4 with 4xAA at a monster resolution with a $125 GPU... things have only gotten better since then (from a price/performance perspective).

Just take a look at some of the results from the lower-end cards in our graphs, such as the $100 HD 4830. It managed to deliver 37.051 FPS in Crysis Warhead (Mainstream) at 1920x1200, 36.5 FPS in Call of Duty: World at War at 1680x1050, 30 FPS in Far Cry 2 at 1680x1050, 88.324 FPS in Left 4 Dead at 1680x1050, 52.484 FPS in Mirror's Edge at 1680x1050 and 58.101 FPS in Need for Speed: Undercover, also at 1680x1050.

Realize that these are higher resolutions than 1280x1024 or 1440x900 (1680x1050 is 36% more pixels than 1440x900), so at lower resolutions, the only place the average FPS has to go is up. That's on a $100 graphics card, and each one of those resolution/setting combinations (all featured 4xAA except for Crysis) was completely playable, with the possible exception of Far Cry 2, which should be at around 40 FPS to become fully playable. I really, really don't understand why a bigger GPU is needed for resolutions even lower than these. If you don't see things that way, we can simply agree to disagree.
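For anyone wondering where that 36% figure comes from, the math is straightforward (again, just an illustrative Python snippet):

# How much larger 1680x1050 is than 1440x900, in raw pixel count
smaller = 1440 * 900   # 1,296,000 pixels
larger = 1680 * 1050   # 1,764,000 pixels
print(f"{(larger / smaller - 1) * 100:.0f}% more pixels")  # prints "36% more pixels"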

As for our numbers not matching up with other websites', this is an issue I'm investigating and take seriously. It's puzzling, but I'm starting to wonder if NVIDIA made a slew of really worthwhile optimizations in their drivers, because as of a few months ago, their cards didn't dominate the charts like this. We posted an "HD 4870 1GB vs. GTX 260/216" article in December, with an almost identical selection of games, and the performance from both cards was near-identical. So, something is up.

We're looking into revising our GPU game suite once again, this time being careful not to choose games that favor one maker's cards over another's (unless it happens to be a blockbuster game, because at that point, it wouldn't be fair to leave it out). It's difficult, though, because there's no guarantee that one company won't optimize its driver to the nines for the various games we use. It's a real problem, especially given the amount of time and effort that goes into re-testing each card (we benchmark 100% manually; we never use timedemos, save for 3DMark).

Trust me when I say that we don't have an agenda, and I wanted nothing more than to see the HD 4890 perform more competitively throughout our results here. If you only knew the hassle that went into wrapping this article up (and the reason we were a full day late in publishing), you would realize that we aren't about to favor NVIDIA just for fun (that's as much as I'm saying on that particular matter).

We report things as we see them, and if there are any inconsistencies with our results, it could be due to driver optimizations for a particular title we're using. If one site happened to have different results for a title we used, it will be due to either a different level (or method of testing) being used, or the results being achieved with a timedemo. As I mentioned (and as our testing methodology page points out), we manually benchmark each game and shun timedemos, in order to deliver the most accurate results possible.
 

Unregistered

Guest
Thanks for the follow-up comments.

You may be one of the more serious gamers, and I respect that, but I still stand by my opinion that people who game at lower resolutions don't need an expensive graphics card.

But that wasn't your original statement. You said that there was "no reason" to buy the expensive graphics card.

I completely understand that people "don't need" a $300 gpu to game at lower resolutions. But, like you said, if you want to max everything, even at 1440x900 a 9600gt just won't cut it.

As for FPS in games, you are talking averages; however, I look more closely at the "min" FPS. If you have an average FPS of 60 but a min of 10, then you are going to get some obvious lag, depending on how often you are hitting, or coming near, this min.

If you go with a card that may seem "overkill" for 1440x900, you are going to maintain a framerate where even the min is very high, meaning lag-free gameplay. This is especially noticeable in fast-paced online gaming.
 

Greg King

I just kinda show up...
Staff member
I agree with the above poster, obviously you have an agenda...

That's not an accusation that should be thrown around lightly, and in this case it's completely unfair.

Every tester has a different hardware setup. Most are testing with an i7 platform, and I have yet to find a site, except for two, that has benched its games at anything lower than 1680x1050. If you look around, most other sites also test at the same resolutions that we have. The differences between the 4890 and the 275 are a handful of frames per second on every respectable site on the web. A difference of a few frames per second is by and large unnoticeable, so the order that the cards appear in on the graphs means little. We are talking 2-6 frames per second apart in most cases, and where there is a larger gap, both cards are over 100 FPS. Show me someone who says they can tell the difference between 90 and 96 frames per second, and I will show you a liar.
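To put that 90 vs. 96 FPS gap into frame-time terms (a quick illustrative Python sketch, nothing more):

# Time spent rendering each frame at a given framerate
for fps in (90, 96):
    print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")

# 90 FPS = 11.1 ms per frame
# 96 FPS = 10.4 ms per frame -> a difference of roughly 0.7 ms per frame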

Our results differ from other sites, but not by much.

To accuse us of any agenda, regardless of the manufacturer, is unfair and unwarranted. If you're complaining about our review because of a frame per second here and there, I bet your mind was made up before coming to the site. We appreciate constructive criticism that helps us mature as a site, but throwing around needless accusations is just not warranted in this case.

Oh, and the Xbit review that was linked to by another anonymous user didn't even use the 275 for comparison's sake, so they seem to be bitching just to bitch.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
To be honest, I don't think my original thought really changed much between posts. For most people who game at resolutions lower than 1680x1050, I think a $250 GPU is going to be a huge waste. As the example I gave above showed, a $100 GPU offered decent framerates at 1680x1050 with anti-aliasing. Imagine a $150 GPU at lower resolutions!

As for the minimum FPS, that's a great point, but I still don't see that being a huge issue at lower resolutions. The figures I gave above were with a $100 GPU at higher resolutions... if you want the best framerate possible at 1440x900 or lower, the best thing to do is to just disable anti-aliasing. With that done, I can't see how the minimum FPS in any current title at low resolutions is going to be an issue. And if AA is that important to you, then you'd essentially be paying $100 on top of a modest GPU for that feature alone. That's quite the premium.

Still, the minimum FPS issue is a good one, and I'll be looking into it deeper as we go forward with our testing.
 

Unregistered

Guest
To be honest, I don't think my original thought really changed much between posts. For most people who game at resolutions lower than 1680x1050, I think a $250 GPU is going to be a huge waste. As the example I gave above showed, a $100 GPU offered decent framerates at 1680x1050 with anti-aliasing. Imagine a $150 GPU at lower resolutions!

As for the minimum FPS, that's a great point, but I still don't see that being a huge issue at lower resolutions. The figures I gave above were with a $100 GPU at higher resolutions... if you want the best framerate possible at 1440x900 or lower, the best thing to do is to just disable anti-aliasing. With that done, I can't see how the minimum FPS in any current title at low resolutions is going to be an issue. And if AA is that important to you, then you'd essentially be paying $100 on top of a modest GPU for that feature alone. That's quite the premium.

Still, the minimum FPS issue is a good one, and I'll be looking into it deeper as we go forward with our testing.

Your thoughts may have been the same, but the two ways you wrote them had totally different meanings.

You're basing your judgment of what GPU is needed on a handful of games... Also, a 4890 can be had for $230 after MIR. I don't see how this is a "waste" for someone who wants to run the latest games on max settings.

You are essentially saying that people who run at 1440x900 should be content with an HD 4850 and not want anything to do with a 4890... I think this makes no sense... If you have the money, why would you not want higher performance? It isn't a waste to get 60 FPS instead of 30, or a waste to be able to run with 8xAA instead of none, or to be able to run the latest games on max settings instead of medium.
 