Palit GeForce GTX 280 1GB

Rob Williams

Editor-in-Chief
Staff member
Moderator
With so many options on the market right now, what makes the GTX 280 a good choice for anyone? The fact that it is the highest-performing card out there sure helps, but it's still not for everyone. To join this club, you'd better hope you have one massive resolution to push.

You can read the full article here and discuss it in this thread.
 

Keatah

Guest
ohh well, just another board..

Blah!! Too many video cards these days. Every week a new model is released, and the drivers for last year's cards still aren't perfected. And the performance is irrelevant; you see, next year's integrated graphics chipsets from the likes of Intel will outperform these behemoths, and sip power to the tune of 15 watts as a bonus!

I feel sorry for you early adopters blowing a ton o' money on all these things.. sheesh!
 

Keatah

Guest
one more thing..

With all these variants and model numbers and clock speeds and memory sizes out there, how is one supposed to know what to buy? A lot of folks I know just up and go with integrated graphics, or get the cheapest thing that works. ATI and NVIDIA are just hurting themselves with such intense granularity!!
 

chr1s

Guest
...

It depends what you want. I got an 8800 GTX when they first came out, and it was very expensive; now they are considerably cheaper. The same will happen with these, and it has already happened to some degree, but improving its "bang for buck" doesn't make it feasible in that sense. I won't be buying one of these. Even though I play a lot of high-end games at max settings, an 8800 GTX does me fine, and the extra FPS you get from one of these, however nice it is to have, won't help you that much, not for that cost difference anyway. However, if you are a company that needs a really powerful GPU as a time-saving solution, this is the card. I reckon this is more of an "industrial" card than a gamer's card, considering the competition atm.
 

Kougar

Techgage Staff
Staff member
An Intel integrated GPU fanboy? Now I've seen everything... :)

Nice review, and I am extremely sure you knew this was coming... but you really needed an HD 4870 in there. I know GPUs aren't easy to come by, but unless you use Xfire, it's the next best thing to test against the GTX 280.

I wish both companies would launch their upcoming cards already; I need to buy a GPU and this is killing me! 4870 X2 or GT200b 3xx... or maybe some better prices on the 280/4870... either way it'll be good. :)
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Nice review, and I am extremely sure you knew this was coming... but you really needed an HD 4870 in there. I know GPUs aren't easy to come by, but unless you use Xfire, it's the next best thing to test against the GTX 280.

You couldn't be any more right... I know I need one. Should have had one in long ago, but it's taking a while. Still, what I said about 2560x1600 should prove true in all cases. If you want that resolution along with topped-out settings (like AA), then the GTX 280 is still a proper high-end card. Still expensive, but I believe it would fare much better than an HD 4870 at that same resolution with equal settings.
 

Kougar

Techgage Staff
Staff member
Well, something I noticed (and one site claims it's specifically because of DX10.1 support) is that the Radeons take a very minimal hit from AA use. They simply scale better with AA than NVIDIA cards do now, and at least part of that is the sheer memory bandwidth GDDR5 is giving them. This is the opposite of their last two generations, where AA hurt them.
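To make the "scales better with AA" claim concrete, here is a minimal sketch of the comparison being made; the FPS numbers in it are invented purely for illustration, not taken from any review:

```python
# Compare the percentage FPS drop when 4x AA is enabled.
# The FPS figures below are hypothetical, for illustration only.
results = {
    "HD 4870": {"no_aa": 60.0, "4x_aa": 54.0},
    "GTX 280": {"no_aa": 70.0, "4x_aa": 56.0},
}

for card, fps in results.items():
    hit = (fps["no_aa"] - fps["4x_aa"]) / fps["no_aa"] * 100
    print(f"{card}: {hit:.0f}% performance hit from 4x AA")
```

A smaller percentage hit means better AA scaling, regardless of which card posts the higher absolute frame rate.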

The GTX 280 is unquestionably the best single card. But before the price cuts... it was by far the worst buy! For less than a single GTX 280 you could get two 4870s, or even two 4850s, which lay waste to it.

Now that we have a $150 price cut, things are more questionable... but still, two HD 4850s cost $100 less and give better performance than a GTX 280. ;)
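To put rough numbers on that, here is a quick back-of-the-envelope sketch; the prices are ballpark figures implied by the post above, so treat them as assumptions rather than quotes:

```python
# Back-of-the-envelope cost comparison between a single GTX 280
# (after the price cut) and a pair of HD 4850s in Crossfire.
# All prices are rough assumptions, for illustration only.
gtx_280 = 499             # assumed price after the ~$150 cut
hd_4850 = 199             # assumed per-card price
crossfire_pair = 2 * hd_4850

print(f"GTX 280:    ${gtx_280}")
print(f"2x HD 4850: ${crossfire_pair}")
print(f"Savings:    ${gtx_280 - crossfire_pair}")  # roughly $100 cheaper
```

If the Crossfire pair also benchmarks faster, the price/performance argument follows directly.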

As much as I would prefer to keep a single GPU and avoid any possible Xfire or SLI issues, there has never been a better time to try multiple GPUs. And right now, HD 4850 Crossfire looks to give the absolute best price/performance, and the best performance short of two 4870s, for xHD gaming.

What's even funnier is that GTX 280 SLI scaling is just a joke. HD 4870 Xfire still competes well against $1k+ SLI setups.

Edit:

Okay, let me put it this way. Look at FiringSquad's review... HD 4850 Crossfire beat the GX2 and the GTX 280 in every single test at 1920 or 2560 resolution. The only thing better is two 4870s...
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
FiringSquad doesn't seem to publicize their testing methodology, so I'll assume they stick to timedemos like most everyone else. So, I couldn't care less what their results say. I'll work on getting in two 4870s or another HD 4850 and then perform similar tests, although with real gameplay, since that's where it matters and we aren't so lazy.

Either way, you are right. It's a good time to get into the dual-GPU scheme of things. For $400, you get a lot of power.
 

Kougar

Techgage Staff
Staff member
Well, FiringSquad, Tech Report, Anandtech... all of them have Crossfire tests that show the same trend. The only thing comparable seems to be two 9800 GTX+ cards, but those would cost an extra $60 beyond two 4850s... AMD still comes out on top whichever way it's sliced.

Something that was confusing for me when reading your review here was exactly which tests had AA and which did not. The text in the review seemed to imply only the 1920x1200 score had AA used... since this makes a big difference (and who doesn't use AA?), it's an issue. Just stick it on the charts or something. :) The 4870 by itself plays just fine at 2560x1600 with AA in some games, so there's no reason not to test with 4x or 8x AA in everything else. As I said above, ATI scales much better with AA than NVIDIA now.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Something that was confusing for me when reading your review here was exactly which tests had AA and which did not.

As much as I hate to 'uglify' the charts, I might have to start adding that information to the titles. But there is a reason I include a link to our testing methodology at the top of every single results page. I show a screenshot for every game, with the exact settings used at a given resolution:

http://techgage.com/article/palit_geforce_gtx_280_1gb/2
 

Kougar

Techgage Staff
Staff member
Ah, that's what I get for reading too fast. I see what you meant now with the intro paragraphs, sorry ;)

I don't usually tend to look at the in-game screen settings, because so many sites have had to force AA or AF settings in the driver for some games.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Well, if you read too fast, then a lot of other people do as well, I'm sure. I hate to cloud up the graphs with a bunch of information like that, which is why I made a more robust testing methodology page, and prefaced each performance page with a link to it.

Any recommendations on changes you'd make to the settings I chose? There are a few I wish I had thought more about before settling on them, but for now I think most of them are OK. I'm not sure why I chose no AA for 2560x1600 in CoD4, for example. Makes no real sense. That one I kind of regret.

The sad thing is that changing something means re-installing a whack of GPUs to retest... not a fun task.
 

Kougar

Techgage Staff
Staff member
To be honest, I was mostly comparing results here to other reviews, and did not sit down to fully read the review. At the time I was mostly after some 4870/260 numbers to compare to everything else. I'm not sure if it is a bad habit or not, but before opening my mouth with comments/questions I try to read the full page the comment relates to, if I haven't already, to ensure I didn't miss something. I think this is the third time I'd have found the answer on the Testing Methodology page for a review, so I guess it's just a bad habit of mine. ;)

I can't really make up my mind on any particular GPU setup, so I've looked at everything and then some... 9800GTX SLI vs. 4850 Xfire reviews are hard to find, let alone GTX+ or simulated GTX+ cards. Taking into account the lack of GTX+ cards, and that vanilla GTX cards are well over $200, I think the 4850 is just unmatched... it beats the 9800GTX soundly enough and is still cheaper.

Since you've asked for comments I'll try and actually be helpful for a change. :p

Regardless of whether it's a bad habit or not, I tend to look for AA/AF settings on the page with the benches. Mostly I would suggest picking an AA/AF setting and keeping it constant for the entire review, especially within the same game. Not labeling the graphs is fine, but for lazy bums like me, having the AA/AF mentioned at the start of each new game is easiest if it's not on the graphs. Doubly so if the game doesn't support it or other AA/AF settings are going to be used.

Depending on the graphic card being reviewed, we split up models into two different categories: Low-End to Mid-Range and Mid-Range to High-End. The former will see the GPUs tested using 1280x1024 and 1680x1050 resolutions, since those are the most common resolutions for gamers looking to purchase a GPU in that price-range.

For our Mid-Range to High-End category, we test GPUs at 1680x1050, 1920x1200 and also 2560x1600 to better reflect the resolutions for those looking for a solid GPU offering.

This confused me until I realized that you were only doing Mid-Range to High-End testing for this review. I might suggest leaving the first part out, as it isn't relevant to this particular review, since only the mid-to-high range was tested?

I would make a note of any driver overrides if you had to use them, or else don't use any, since those aren't reflected in the in-game screenshots. I do like that you have screenshots for every resolution there, because some settings do differ between resolutions within the same game, so it's good to know.
 

madstork91

The One, The Only...
Blah!! Too many video cards these days. Every week a new model is released, and the drivers for last year's cards still aren't perfected. And the performance is irrelevant; you see, next year's integrated graphics chipsets from the likes of Intel will outperform these behemoths, and sip power to the tune of 15 watts as a bonus!

I feel sorry for you early adopters blowing a ton o' money on all these things.. sheesh!

It is about time the industry stopped doing this. If Intel releases more than four processors a year... they are wasting their R&D time and money.

If a company releases more than 2-3 video cards a year, they are wasting their R&D time.

Want proof...? Look at some of the problems some of these cards have. Look at the fooking initial drivers.

Here is an idea!

Dear industry:
Don't release a card every 3 weeks just because you guys figured out that you like a certain colored switch better than another one. **** THAT! and **** YOU TOO!
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Kougar, thanks for all of the input, I appreciate it. I'll take a lot of that advice into consideration with the next GPU review. I'll see about adding the information to each graph and making it look good. I'll figure out a way... it'd be a bit easier all-around. The last thing I want to do is confuse our readers.

As for the screenshots, I am not sure what you mean about the driver overrides. I never apply any sort of overrides in the drivers when benchmarking, but rather let the game handle everything. If you think this is an issue, then please let me know. Essentially, the one game it would affect is UT III, where I would not have AA enabled, since it's not available in the game options.

Those screenshots just show the settings I use for each resolution, even if the screenshot itself is 1280x1024. I kept the screen resolution the same for the sake of having the thumbnails look good beside each other, but what's selected in the screenshot is accurate. I use those same screenshots when benchmarking to double-check that I am using the exact same settings each time.

I'd like your input on more things later, if you don't mind. I appreciate your input and constructive criticism, since it's exactly the kind we need.
 

Kougar

Techgage Staff
Staff member
As for the screenshots, I am not sure what you mean about the driver overrides. I never apply any sort of overrides in the drivers when benchmarking, but rather let the game handle everything. If you think this is an issue, then please let me know. Essentially, the one game it would affect is UT III, where I would not have AA enabled, since it's not available in the game options.

My memory sucks sometimes, so I sat down and looked this up. CoH doesn't have Anisotropic Filtering options in-game. BioShock does not offer Antialiasing settings in-game. I think Quake Wars also does something funky, but I'm not sure; I know there was another game with AA/AF issues. Many sites have gotten into the habit of using the driver control panel to override application settings and manually force certain AA and AF settings. It doesn't matter to me, as this often causes problems anyway, or the options are disabled in-game for a reason (the game can't properly support it, etc.). I just look for a little blurb about it if I don't see AA/AF listed.

Those screenshots just show the settings I use for each resolution, even if the screenshot itself is 1280x1024. I kept the screen resolution the same for the sake of having the thumbnails look good beside each other, but what's selected in the screenshot is accurate. I use those same screenshots when benchmarking to double-check that I am using the exact same settings each time.

Hm, I was not questioning the accuracy of the screenshots. I was simply pointing out that you state in the review that you divide the benchmarking into mid-range and high-end categories... but the review itself only tests the high-end category.

I'd like your input on more things later, if you don't mind. I appreciate your input and constructive criticism, since it's exactly the kind we need.

No problem Rob, just trying to be helpful. :)
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
"I do like that you have screenshots for every resolution there, because some settings do differ between resolutions within the same game so it's good to know."

That's the line I was referring to. I just meant that the settings seen on the methodology page are exactly what I use, on a per-resolution basis.

I think I've confused myself enough though.

One thing I do want to do is revisit the settings I've chosen, and perhaps re-benchmark games at certain settings, but I'm unsure, given the amount of work it would take. I do regret not using AA with CoD4 at 2560, though... and a few others.

As for the forced AA/AF... back in the day, I was against that, since it seemed to only create problems. But that might not be the case anymore; I'm unsure. It might very well be a good idea to begin doing that, especially with games like UT III that don't even offer AA settings in-game at all. Why they don't, I have no idea. Such a basic setting...
 

Unregistered

Guest
This guy takes the cake

Keatah, plain and simple: don't buy the card, just use your onboard video.
 

Unregistered

Guest
ATI/NVIDIA comparison

Ah, that's what I get for reading too fast. I see what you meant now with the intro paragraphs, sorry ;)

I don't usually tend to look at the in-game screen settings, because so many sites have had to force AA or AF settings in the driver for some games.

Kougar,

I bought the ATI 3870 and CrossFired two cards. Nothing but problems. Returned them for a 4870 X2 ATI Radeon card... again, nothing but problems. Half the people buying the ATI Radeon 3870 and 4870 and all their new cards are having serious problems. Read all the posts in the AMD and ATI forums; you guys will laugh. I took my 4870 X2 back and got the GTX 280, and what a breeze. Smooth install and great performance. ^5 to NVIDIA, and ATI can kiss muh a%$... LoL... You guys get the hint.
 