NVIDIA's GDDR5 Stance Differs from AMD's

Rob Williams

Editor-in-Chief
Staff member
Moderator
From our front-page news:
I posted yesterday about AMD's decision to use GDDR5 chips with their upcoming Radeon 4000-series, and it seems NVIDIA doesn't share that ambition. Over at the Nanotech: The Circuits Blog, a quote from NVIDIA doesn't discredit GDDR5 at all, but the company notes there isn't a need for it right now.

Barry Wagner says, "We aren't particularly attached to any given interface technology", even though both their current generation and their next-generation (GTX 200) parts use GDDR3. He goes on to note that they'd adopt it if it made sense for their business, which at this point doesn't seem to be the case.

The true benefits of GDDR5 might not be seen in gaming, at least until incredibly memory-intensive games arrive, but AMD (like NVIDIA) is keen to improve stream processing and parallel computation, and they feel GDDR5's improved bandwidth would be beneficial there. The same goes for DDR3 in the desktop market: those who own such memory likely know just how specific an application needs to be to fully exploit it, and nothing will be different on the GPU.
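To put the bandwidth claim in perspective, here's a back-of-the-envelope sketch in Python. The clock and bus-width figures are hypothetical, chosen only for illustration; the key point is that GDDR5 moves four bits per pin per clock versus GDDR3's two, doubling peak bandwidth at the same clock and bus width.

```python
def bandwidth_gbps(clock_mhz: float, transfers_per_clock: int, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: clock * transfers/clock * bus width, converted from Mbit/s."""
    return clock_mhz * transfers_per_clock * bus_width_bits / 8 / 1000

# Hypothetical 1 GHz memory clock on a 256-bit bus, for both memory types:
gddr3 = bandwidth_gbps(1000, 2, 256)  # GDDR3 is double data rate
gddr5 = bandwidth_gbps(1000, 4, 256)  # GDDR5 transfers 4 bits per pin per clock

print(f"GDDR3: {gddr3} GB/s, GDDR5: {gddr5} GB/s")  # 64.0 vs 128.0
```

Whether games can actually consume that extra headroom is exactly the question NVIDIA seems to be raising.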

<table align="center"><tbody><tr><td>
qimonda_gddr5_052108.jpg

</td></tr></tbody></table>
Nvidia is supporting the technology but taking a more cautious approach. The Santa Clara, Calif.-based graphics chipmaker holds a vice chair position in the GDDR5 task group, said Barry Wagner, director of technical marketing at Nvidia. "We're involved in the specification of GDDR5 so if we want to build products around it, at least the spec is architected in a way that we would be content with," Wagner said.

Source: Nanotech: The Circuits Blog
 

Kougar

Techgage Staff
Staff member
ATI wouldn't be ATI if they didn't jump on the new RAM bandwagon.

Ironically... GDDR5 draws significantly less power than GDDR3, has error-correcting ability, and, most of all, costs less to put on the PCB. GDDR5 requires a less complex solder configuration, partly due to fewer pins to solder. It holds most of these advantages over GDDR4 as well. I suspect GDDR4 prices are not exactly cheap yet; only ATI uses it, and only on a few cards. It might not have cost much more to step up from GDDR4 to GDDR5, especially after the savings gained on manufacturing costs & PCB complexity.
 

On_Wisconsin

Coastermaker
I think it's a gimmick of sorts... e.g., 'hey look, this has GDDR5, this one doesn't, so the one with GDDR5 must be faster...'
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Well, it is faster, so it's not as though they're lying. It's just a matter of how much bandwidth we need, and quite frankly, we probably don't need that much.

But if AMD has other plans for their GPUs, like robust calculations (or for those who want to fold), then the GDDR5 might prove beneficial.
 