ATI Radeon HD 4890 & NVIDIA GeForce GTX 275

Rob Williams

Editor-in-Chief
Staff member
Moderator
The keyword in my last post is "most". You happen to be someone who truly appreciates the highest possible framerates in a game, and that's cool - I'm the same way. However, I'd be hard-pressed to find many people who game at 1440x900 who'd be unsatisfied with a $150 GPU. Yes, people like higher performance, but is it REALLY worth the extra $80 - $100 when a $150 card gives completely playable framerates to begin with?

For hardcore online gamers, I don't disagree with a beefy GPU, but like I said earlier, it's rare to see a low-end resolution even properly take advantage of a mid-range GPU... it's a joke for the card to handle. I'd be hard-pressed to find a decent $150 GPU with 1GB of GDDR that couldn't handle just about any game at 1440x900 at great settings and playable frame rates. I'll put that theory to the test though ;-) When I find some time, I'll run the benchmarks and see what I come up with.

I'm going to jump to conclusions here and assume that most people who game at 1440x900 or lower aren't looking to purchase a $250 GPU (19" monitors I've seen cost less than that). They're looking to get a wicked card at the best possible price, and at that resolution, I personally think a $150 card is going to deliver the goods. If you disagree, that's fine. We're all entitled to our own opinion, and you obviously take things pretty seriously.

Personally, since graphics cards are such a great value nowadays at any price, I'd like to think that the best possible upgrade for a gamer would be a new monitor, one that's larger. Not only do you have a larger viewing area, but you'd also have a finer resolution that in some games almost negates the need for anti-aliasing (games like Crysis and CoD: World at War look fantastic without it... games like L4D should have it though).
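To put some rough numbers behind that (the monitor sizes and resolutions below are just illustrative pairings I picked, not measurements of any particular panel), here's a quick pixel-density sketch:

```python
# A rough sketch of why higher native resolutions make aliasing less
# visible: pixel density (PPI) goes up, so the "stair-steps" on polygon
# edges get physically smaller on screen.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1440, 900, 19), 1))   # ~89.4 PPI on a typical 19" panel
print(round(ppi(1920, 1200, 24), 1))  # ~94.3 PPI on a typical 24" panel
```

The density bump is modest, but combined with the finer sampling of the scene (every edge spans more pixels), the jaggies that AA exists to hide are smaller and less obvious to begin with.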

Either way, thanks for your opinions. I definitely see what you're saying, but after putting many different graphics cards to the test, I still feel that anything over $150 for a GPU at a low resolution is overspending, *unless* you want AA in every game out there and the best possible FPS without going broke.
 

Unregistered

Guest

I don't get where you're getting this $150 number though... Are you saying that at 1440x900 it would have been dumb for someone to spend $300 on the HD 4870 when it first came out (if they had the money), but that it's smart to buy one now since it's below $150?

Or are you saying that at current market prices, you wouldn't want to spend above $150 for a 1440x900-capable graphics card?
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
I'm taking a look at things from a current perspective, and where graphics cards are *now*. I didn't say that the HD 4870 was ever worth the money for people running 1440x900, but rather that lower-end cards are going to provide more than enough power for most gamers there. I chose $150 because it's a clean number and $100 less than $250, and there are also a few good GPUs in that price range (the HD 4850 1GB, for example).

As I've said a few times already, the current state of the graphics card market is great for anyone looking for a fantastic card at a great price, and if all you're looking to run is 1440x900, then I don't see the reason to spend over $200 for a card. I've seen $100 cards perform well with current top-rated games at 1680x1050, so under no circumstances would I understand why someone would think they'd *need* a $250 card to get good gaming performance.

It seems to me that you're just trying to defend your purchase and convince yourself that it was a good buy or something, because I have no idea why else you'd drag this debate out to death. I've made my thoughts clear more than once, and it in no way seems like we're going to agree. If you think a $250 GPU is well worth it at that resolution, then great - I hope you're happy with your card. It might be right for you, but I certainly don't think it's a necessary purchase for most people at that resolution.
 

Unregistered

Guest

I don't have to convince myself. I have gotten plenty of good use out of it for the $ :p

I didn't know we were debating; I was just trying to figure out why someone on a tech site would be saying there's no reason to buy high-end overkill graphics cards (people don't go into PC gaming to save $).

We're just talking about different things, I believe. You're saying it's not necessary to get an ultra-powerful graphics card for 1440x900, which I agree with, but I don't agree that there's no reason to get an overkill GPU. Get it?
 

Unregistered

Guest
Heck, you have an Intel Extreme; why am I even explaining the concept of overkill to you?

(joking... sorta)

Anyhow, I'll leave you be now, since I think I am just arguing for the sake of arguing.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
I had that all wrong. I thought I was trying to explain the concept of overkill to you ;-)

As for the Extreme CPU, it wouldn't be in this machine if I didn't have it lying around (review sample). I needed a good CPU that ran well under decent temps, and so it made sense to throw it in here since it was just lying in a box. If it makes you feel any better, I'm running a 9600 GT in this rig (which handles CoD4 fine when I MP it up with my brother ;-)).

Please feel free to check back though, especially with future content. We might not agree on everything, but good ole debate never hurt anyone (err, I should probably take that back!).
 

Bogeyman

Guest
Enjoyed the review, Rob!
Believe I enjoyed the discussion thread just as much. Some very good information here. I'm kind of in the situation of upgrading my video card. I'm still gaming on a 17" CRT, but will probably have purchased a 19" or 20" widescreen by year's end. I'm kind of looking in the HD 4850 to GTS 250 ballpark for a GPU. This thread has made a lot of sense to me. Maybe this would be a good article for you guys to post - you could actually show how, at lower resolutions, the benchies are higher but the extra frames aren't noticeable to the Average Joe. Different games from the normal five everyone tests would also be revealing. Thanks again, keep up the good work!
 

Relayer

E.M.I.
Keep in mind that not everyone plays their games stock. Many use custom models, backgrounds, effects, etc., with much higher-res meshes and textures.
I realize that you can't take every user into account, and I'm not trying to come in from left field here. I don't play at high resolutions, but with a fully modded game I need a high-end card anyway.

Good job, Rob. Changing up the games is a good idea.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Bogeyman: If you have the room, I REALLY recommend a 24" monitor, since gaming is totally sweet on them. They aren't even that expensive anymore. I helped a friend build a PC this past November and he scored a nice Samsung for around $300... and trust me, the extra money is worth it. Also, if you don't plan to upgrade your monitor until later this year, do you really want to upgrade your GPU right now?

Relayer: That's all fine, but it's not something I'm concerned over. Higher-res models and improved graphics might require a faster GPU, but we benchmark at 2560x1600, a resolution that few gamers use, so you can still base your purchasing decision on our results, regardless of whether or not you mod a game.
 

Unregistered

Guest
Wrong settings?

At least in the Left 4 Dead 2560x1600 settings image on page 4, vsync with triple buffering is turned on. How on earth is it then possible for you to measure framerates averaging around 132 FPS? Where do you get a monitor with that resolution and refresh rate? (I want one too.)

Would it not be more plausible that vsync was not on? If it was turned off, however, it certainly raises doubts about the validity of the other settings in that image, does it not?
 

Rob Williams

Editor-in-Chief
Staff member
Moderator

That's a typo, sorry about that. Vsync is never on for our benchmarks... I just forgot to turn it off before taking the settings screenshots.
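
For anyone wondering how the question was so easy to settle: with vsync on, every finished frame waits for the monitor's next refresh before it's shown, so the average frame rate can never exceed the refresh rate. Here's a minimal sketch of that math (the 7.6 ms render time is a number I picked for illustration, not from our benchmark runs):

```python
# Minimal sketch: with vsync, each frame's presentation is delayed to the
# next refresh boundary, so average FPS is capped at the refresh rate.

def average_fps(render_ms_per_frame: float, refresh_hz: float, vsync: bool,
                frames: int = 1000) -> float:
    """Return the average FPS over `frames` simulated frames."""
    refresh_ms = 1000.0 / refresh_hz
    elapsed = 0.0
    for _ in range(frames):
        elapsed += render_ms_per_frame
        if vsync:
            # Presentation stalls until the next vertical refresh boundary.
            intervals = -(-elapsed // refresh_ms)  # ceiling division
            elapsed = intervals * refresh_ms
    return frames / (elapsed / 1000.0)

# A GPU finishing a frame every ~7.6 ms could average ~132 FPS...
print(round(average_fps(7.6, 60.0, vsync=False)))  # ~132
# ...but with vsync on a 60 Hz monitor, the average is pinned to 60.
print(round(average_fps(7.6, 60.0, vsync=True)))   # 60
```

Triple buffering lets the GPU start on the next frame instead of idling while it waits, but the displayed (and measured) rate is still capped at the refresh rate, which is why a 132 FPS average only makes sense with vsync off.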
 