NVIDIA's GTX 200-series Specs Revealed

Rob Williams

Editor-in-Chief
Staff member
Moderator
From our front-page news:
While NVIDIA's 9-series launch seemed a bit lackluster, given that the cards were primarily a glorified 8-series (though I can't say enough good things about the 9600 GT), the company's upcoming GTX 200 cards are looking to wipe the disgruntled looks off all our faces.

According to DailyTech, two cards will launch initially: the GTX 260 and GTX 280. As you'd expect, the GTX 280 is the higher-end offering, which will include 240 stream processors (compared to 128 on the 9800 GTX), a huge 512-bit memory bus and support for up to 1 GB of memory (although it should be possible for companies to add more). On top of that, the unified shaders are said to perform 50% faster than those of previous-generation cards.

The lower-spec'd GTX 260 removes 48 stream processors, for 192 total, decreases the memory bus to 448-bit and also lowers the memory to 896 MB. Despite the decreases, it's hard to imagine this card being "gimped" by any standard. Going by specs alone, even the GTX 260 should be absolutely stellar.
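Those GTX 260 numbers aren't arbitrary: they fall out neatly if you assume GT200 is built from 64-bit memory partitions (each backed by 128 MB) and 24-processor shader clusters, with the 260 simply disabling a few. A minimal sketch of that arithmetic, under those assumptions:

```python
# Rumored GTX 280 specs from the DailyTech report.
GTX_280 = {"sp": 240, "bus_bits": 512, "mem_mb": 1024}

def cut_down(full, partitions_disabled, sp_clusters_disabled):
    """Derive a cut-down part by disabling whole memory partitions
    (64-bit bus + 128 MB each) and 24-SP shader clusters.
    The partition/cluster sizes are assumptions, not confirmed specs."""
    return {
        "sp": full["sp"] - 24 * sp_clusters_disabled,
        "bus_bits": full["bus_bits"] - 64 * partitions_disabled,
        "mem_mb": full["mem_mb"] - 128 * partitions_disabled,
    }

gtx_260 = cut_down(GTX_280, partitions_disabled=1, sp_clusters_disabled=2)
print(gtx_260)  # {'sp': 192, 'bus_bits': 448, 'mem_mb': 896}
```

Disabling one partition and two clusters reproduces the reported 192 SPs, 448-bit bus and 896 MB exactly.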

The downside, if there is one, is that DX10.1 will not be supported. This is a bit striking, but it goes to show that NVIDIA either doesn't have much faith in it or doesn't feel the need for its inclusion. It would have been nice to see it added either way, though. The embargo is set to lift on June 18, which is right around the same time that ATI will lift theirs for the Radeon 4000-series.

The GTX 280 enables all features of the D10U processor; the GTX 260 will be a significantly cut-down version of the same GPU. The D10U-30 will enable all 240 unified stream processors designed into the processor. NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

Source: DailyTech
 

madmat

Soup Nazi
I wonder what the prices are going to look like. I agree on the 9600GT though; for the cash-strapped gamer, you can't go wrong with a pair in SLI rather than a 9800GTX.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
I'm really curious about the prices as well. From a specs perspective, these new cards could prove a massive upgrade, so we might very well go back to $400+ for a single GPU... although it's really hard to say right now.

I'm actually kind of excited though.
 

madmat

Soup Nazi
Me too, and as powerful as either card is, $400 could be a bargain in terms of horsepower (megatexels per second) per dollar spent.
 

Kougar

Techgage Staff
Staff member
Nvidia probably doesn't have a solid price set yet anyway. I don't expect ATI to best these cards, but from what I've followed they should come very close... and thanks to their prices, they're likely going to force Nvidia's hand into launching at prices slightly lower than they'd prefer... One can hope, anyway.
 

madmat

Soup Nazi
A megatexel is a term nVidia or 3DFX coined to count millions of TEXtured pixELs way back during the heyday of the TNT/VooDoo3.
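For anyone curious, a card's peak megatexel figure is just texture units multiplied by core clock. A quick back-of-the-envelope sketch, using the GTX 280's widely reported 80 TMUs and 602 MHz core clock (treat both numbers as assumptions, since nothing is official yet):

```python
def texel_fill_rate(tmus, core_mhz):
    """Peak texture fill rate in megatexels per second:
    each texture unit can sample one texel per clock."""
    return tmus * core_mhz

# Rumored GTX 280: 80 TMUs at 602 MHz
print(texel_fill_rate(80, 602))  # 48160 MT/s, i.e. ~48.2 gigatexels/s
```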
 

Kougar

Techgage Staff
Staff member
Just as high, if not higher, more than likely. Just a single GTX 280 has 8-pin + 6-pin PCIe power connectors.

If GT200 is 65nm, it is supposed to have an actual ~250 watt TDP... The infamous R600 drew only around 230W. NVIDIA will probably aim for 55nm though, since this will be a 1-billion+ transistor GPU and they'll want to shrink it if at all possible. If they do launch at 65nm, they'll pull a quick turnaround with 55nm parts though.
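The connector combination itself tells you the power ceiling: the PCI Express spec budgets 75 W for the slot, 75 W per 6-pin and 150 W per 8-pin connector. A minimal sketch of that budget math:

```python
# Power budgets per the PCI Express specifications (watts).
PCIE_BUDGET_W = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(connectors):
    """Slot power plus the budget of every auxiliary connector fitted."""
    return PCIE_BUDGET_W["slot"] + sum(PCIE_BUDGET_W[c] for c in connectors)

print(max_board_power(["8pin", "6pin"]))  # 300 W
```

So an 8-pin + 6-pin board can draw up to 300 W within spec, which leaves comfortable headroom for a ~250 W TDP part.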
 

Merlin

The Tech Wizard
Maybe Nvidia is saying what a lot of people are saying (what has DX10 done for gaming, and do we actually need it?)

Was it Microsoft hype?
I'm usually not gullible, but the hype was at a point where I didn't question the evolution of gaming.
Hype < but not less than = the sum of excitement
----------------------------------------------------------------- >= Profits
Gamers looking for something new [] by 3.14159265358979323846

That about sums it up

:techgage::techgage: Merlin :techgage::techgage:
 

Kougar

Techgage Staff
Staff member
What kind of question is that? It's just a matter of time before game developers release games tailored for DX10... although it would have happened much faster if XP also supported DX10, because as it is, DX10 support covers only a very minor segment of the already shrinking PC gaming market.

--------

A bit more news about GT200. Apparently it now supports Folding@home, and it not only eclipses the PS3 but also the HD 3870 by very significant margins. Finally, I can get excited again about an NVIDIA GPU... :)
 

Merlin

The Tech Wizard
What kind of question is that? It's just a matter of time before game developers release games tailored for DX10... although it would have happened much faster if XP also supported DX10, because as it is, DX10 support covers only a very minor segment of the already shrinking PC gaming market.
The Lost Planet trial came in DX9 and DX10.
Thinking I had the DX10 copy, I was not that impressed; then when I DID get the DX10 version, there seemed to be no difference at all.
So, just what is the difference?
Only 22 games so far are DX10, after it's been out for a good while; seems to be slowing down... what else :)

:techgage::techgage: Merlin :techgage::techgage:
Being curious
 

Kougar

Techgage Staff
Staff member
The Lost Planet trial came in DX9 and DX10.
Thinking I had the DX10 copy, I was not that impressed; then when I DID get the DX10 version, there seemed to be no difference at all.
So, just what is the difference?
Only 22 games so far are DX10, after it's been out for a good while; seems to be slowing down... what else :)

:techgage::techgage: Merlin :techgage::techgage:
Being curious

My problem is that I don't consider afterthought DX10 patches to count as DX10 games. CoH Opposing Fronts has some interesting effects with the weather engine, but the game still looks almost the same played on XP.

Why should a game developer make a great game that shows off DX10? Only a minority of PC users in an already shrinking PC game market could play it. And second, they would likely annoy the larger non-DX10 portion of their fanbase, who would feel they got shorted. Almost all the DX10 "extras" are simply token afterthoughts added to the game.

If there was a genuine reason to build a DX10 targeted title from the ground up then I think the benefits of DX10 could be shown.
 
I really need to know if this card is going to be more or less powerful than a 9800 GX2. I don't know if you guys know already or have seen any leaked specs of the Radeon HD 4800 or the GT200 from NVIDIA, but if you have any info, please respond or message me.
 

Kougar

Techgage Staff
Staff member
Some places say it is, some places say it is not.

Considering the specs... I suspect it is faster in the vast majority of scenarios. They share the same core clock speed, and the 280 is effectively a 9800 GX2 rolled into a single core. Only 16 fewer shaders. ;)
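To spell out the "16 fewer shaders" remark: a 9800 GX2 carries two G92 cores with 128 stream processors each, against the GTX 280's rumored 240 on one die.

```python
# 9800 GX2: two G92 GPUs on one board, 128 stream processors each.
gx2_total_sp = 2 * 128
# GTX 280: rumored single-die stream processor count.
gtx_280_sp = 240

print(gx2_total_sp - gtx_280_sp)  # 16
```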
 

Merlin

The Tech Wizard
Some places say it is, some places say it is not.

Considering the specs... I suspect it is faster in the vast majority of scenarios. They share the same core clock speed, and the 280 is effectively a 9800 GX2 rolled into a single core. Only 16 fewer shaders. ;)
New rumors say it's going to have 64 ROPs, 256 TAUs, and a total of 1024 shaders: a 1024-bit design, 2 GB GDDR4 memory,
1200 MHz core,
2800 MHz shaders,
3400 MHz memory.
Available late 2008, November maybe.
The 8800 GTX Ultra had a 17-month footprint in the market.
You know they won't blow away the cards on the market until the present cards sell out at a certain rate; then they launch a high-end card to keep the flow going. Great marketing approach, but even though it COULD be available now, it's just not going to happen.

:techgage::techgage: Merlin :techgage::techgage:
 

Rory Buszka

Partition Master
Merlin, I don't think the question Nvidia asked was what DX10 did for gaming -- instead, the question was, what did DX10.1 do for gaming? And from what I've read on the subject, the main differences are subtle improvements in image quality, along with some changes that simplify certain rendering procedures.
 

Merlin

The Tech Wizard
GTX 280

Just ordered the EVGA 01G-P3-1284-AR GeForce GTX 280 SSC Edition

After several reviews and comments from people who have the card, I dove in. This is the first time I've bought the high end; usually I wait until another high-end card comes out, then buy the one I've been eyeing.
This is going into the quad-core Q9450, and then I'll put the 8800GT back into the E8400 machine in SLI.

Merlin
 