Voltage Overclocking the GPU (GT200 & some 4000 cards)

Kougar

Techgage Staff
Staff member
This one has been making the rounds like wildfire. It appears RivaTuner can adjust the voltage settings on GTX 260 & GTX 280 cards (and some ATI 4000 cards) entirely in software. No hard mods or BIOS flashing are required. From what people are posting, it doesn't work with the newer GTX 285, as the voltage controller was changed.
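For anyone wondering why this works at all: the GTX 260/280 reference boards use a digitally programmable voltage regulator (a Volterra controller, if I recall correctly, which the GTX 285 dropped), so the core voltage can be changed with a register write over the card's I2C bus rather than a pencil or resistor mod. Here's a conceptual sketch of the idea; the register address and VID table are hypothetical, not a real controller's command set:

Code:
# Conceptual sketch only -- the register address and VID table are
# hypothetical, NOT a real Volterra command set. The point is that the
# regulator accepts a digital VID code over I2C, so no hard mod is needed.

VID_REGISTER = 0x15          # hypothetical VID register address
VID_TABLE = {                # hypothetical VID code -> core voltage
    0x48: 1.13,              # GTX 260 stock vCore
    0x52: 1.25,
    0x56: 1.30,
}

def set_core_voltage(i2c_write, target_volts):
    """Write the VID code matching the requested voltage."""
    for code, volts in VID_TABLE.items():
        if abs(volts - target_volts) < 0.005:
            i2c_write(VID_REGISTER, code)   # one software register write
            return volts
    raise ValueError("no VID step for %.2fv" % target_volts)

# Demo with a stub bus write; a real tool would drive the GPU's I2C bus.
set_core_voltage(lambda reg, code: print("reg 0x%02X <- 0x%02X" % (reg, code)), 1.25)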

Original Link & Guide: http://www.xtremesystems.org/forums/showpost.php?p=3600426&postcount=103

Secondary Link with an easier Guide: http://www.ocxtreme.org/forumenus/showthread.php?t=4427

Of course, as GPUs are such hot beasts, this isn't recommended for most users on stock cooling, but I figured I'd post it here. I'm not the only one here that watercools their GPU, I believe.

Some early results: I've taken my factory-overclocked GTX 260 from 602/1296 to 677/1458 stable, and it only required 1.25v, up from the 1.13v default. The bad news is that further voltage doesn't seem to get 700MHz 100% stable, and the next higher "stepping" beyond the 1458 shader clock is not even close to stable. Anyone attempting to push 700MHz on the core will need to unlink the shader and core clocks.

Interestingly, the three voltage regulators on my card tend to run 5-10°C hotter than the core temp under load. The GTX 280 has a fourth voltage regulator plus supporting circuitry that wasn't removed, so it should fare a little better...
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
I'm curious... was it not possible to get anywhere close to 677/1458 without the voltage boost? Do you have a comparable overclock without any need for increased voltage? The entire idea fascinates me anyway, and I'm really glad to see that this is possible via software rather than hard-modding or flashing the BIOS (both of which are far more technical).

I agree on the cooling issue though. It's probably not a great idea to goof around with this unless there's superb airflow in your chassis or you have an aftermarket cooler that's more efficient than the reference one (most reference coolers are cheap and inefficient).

Great information man. If I had time and ambition, I'd be all over this!
 

Kougar

Techgage Staff
Staff member
Yes, naturally I have stock-voltage overclock numbers. This is me you're talking about... ;) Due to the complexity and associated risks of BIOS flashing, I made a custom BIOS but never actually flashed the card. And I don't do hard mods, as I'd like the hardware to last. So I was left with basic overclocking...

As a matter of personal policy I do not overclock the GDDR3. In my experience the memory chips are almost always the first component to fail on modern GPUs, so all the clocks below are with the GDDR3 at stock; I have never once OC'd it on this card.

Code:
                        Core/Shader (MHz)
Stock GTX 260           576/1242
EVGA Superclocked       602/1296
1.13v Stock Voltage OC  648/1404
1.25v Voltage OC        677/1458
1.3v Voltage OC         686/1458 (not sure if it was fully stable)

At stock voltage the 648/1404 OC would (rarely) fail during extended gaming sessions, so I never used it for normal use. In all honesty the card feels more stable with the slight voltage increase, so I now run it at 648/1404 with 1.2v 24/7 and haven't had a single problem. The GPU core temp stays below 60°C.

Even trying as high as 1.3v, the card will not run the next higher stepping beyond 1458 on the shaders, and doesn't want to go above 686 on the core. If I let ATITool run long enough it starts flashing yellow artifacts at 700MHz with 1.3v.
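For what it's worth, every shader clock quoted above is an exact multiple of 54MHz (1242 = 23 steps, 1296 = 24, 1404 = 26, 1458 = 27), so assuming the shader PLL really does step in 54MHz increments - a guess from the numbers, not something I've confirmed - the next stepping beyond 1458 would be 1512, a bigger jump than it looks. A quick sketch of that guess:

Code:
# Guesswork: the shader clocks in the table are all multiples of 54 MHz,
# so this assumes a 54 MHz PLL stepping. Not confirmed against any docs.
STEP_MHZ = 54

def snap_to_stepping(requested_mhz):
    """Round a requested shader clock down to the nearest assumed step."""
    return (requested_mhz // STEP_MHZ) * STEP_MHZ

for clk in (1242, 1296, 1404, 1458):
    print(clk, clk // STEP_MHZ)     # 23, 24, 26, 27 -- all exact steps

print(snap_to_stepping(1500))       # 1458; the next real step up is 1512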

If nothing else, the slight voltage bump will allow original 260 and 280 owners to clock their cards as high as the new 55nm models. I'd be very curious to see what the "cherry-picked" factory-overclocked cards could do with a little more voltage, as (I believe?) they operate at the same 1.13v core voltage.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Thanks for the clock reports... very interesting.

With regard to the core temp staying under 60°C, do you mean at idle, or under load? It sounds good, regardless. I'll be replacing the junk 8800 GT I have in this machine soon with some card lying around here... I can't handle the 75°C idle temps anymore! 60°C load seems like a dream right now, but I'd be happy if I could just attain that at idle, haha.

I truly wish I had time to experiment with all of this. I'm swamped with work (lots of content en route) at the moment, so I can't devote any time to this, but I'd definitely like to hear other opinions, if anyone else out there with a capable card is interested in upping the voltage.

Nice 24/7 clocks though... have you noticed (or even checked) what kind of power draw increase comes with a small voltage bump like that?
 

Kougar

Techgage Staff
Staff member
Generally under 60°C at load unless it's hot indoors; it remains somewhere in the mid-30s at idle. I'd check right now, except the card is overclocked and running Folding@home.

I expected a bit more overclocking room, but considering just how heavily "binned" GPUs are these days, with three to four different factory clock rates per model, I guess I shouldn't be surprised. Still, just a tiny bit more voltage makes those higher stock-vCore overclocks stable, which is always a nice thing...

Actually, I have not checked the power draw. I'll need to find my old data sheet; I know that switching power supplies saved around 25-50W by itself, which makes my old data from the 750W Quad Silencer incomparable... Next time I need to shut down the system I'll slip the power meter in and do some testing. I suspect that if I tried I could now get very close to the 800W mark, as I was able to hit above 700W on my last attempt.
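In the meantime, here's a rough back-of-the-envelope estimate: CMOS dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²), so assuming that scaling holds for the GPU core alone - ignoring leakage and the stock-clocked GDDR3 - the 602MHz/1.13v to 677MHz/1.25v jump works out to nearly 40% more core power. A sketch of the arithmetic, with an assumed (not measured) 120W core baseline just to put a number on it:

Code:
# Back-of-the-envelope only: dynamic power scales as P ~ f * V^2.
# Ignores leakage and the stock-clocked GDDR3, so treat the result
# as a rough guide, not a measurement.

f_old, v_old = 602, 1.13    # factory OC at stock voltage
f_new, v_new = 677, 1.25    # volt-modded OC

scale = (f_new / f_old) * (v_new / v_old) ** 2
print("core power scale factor: %.2fx" % scale)   # ~1.38x

baseline_core_watts = 120   # assumed core-only draw, NOT measured
extra = baseline_core_watts * (scale - 1)
print("estimated extra draw: ~%.0fW" % extra)     # ~45W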
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Kougar said:
it remains somewhere in the mid-30s at idle.

Haha, that's fantastic. I ran a quick 3D game earlier on this PC, and the 8800 GT shot up to 95°C, and right now (idle), it's hovering around 80°C. I really gotta haul that card out later and replace it. I still don't think I'd get anywhere near 30°C though, so you must have some pretty nice room temps.

Kougar said:
I suspect that if I tried I could now get very close to the 800W mark, as I was able to hit above 700W on my last attempt.

How on earth is 800W possible, or even 700W? My Kill-A-Watt died last week, so I'm waiting for a new one. Once it gets here, I'll see how much our Core i7 machine can pull when using two GTX 295's. Can I match your 800W? Probably not. I have a feeling increasing voltage makes a far larger difference than I ever gave it credit for.
 

Kougar

Techgage Staff
Staff member
It's just an end-all-be-all watercooling setup! 1/2" ID tubing, triple radiator, etc... the temps might be good, but it didn't help much with pushing those clocks up!

700W was possible by injudiciously overclocking both the Q6600 and the GTX 260 to the razor's edge... although at stock voltage for the GPU. Pop a music CD into the drive, have five HDDs buzzing away, all fans and the water pump at full speed, a 4GHz Q6600 running IntelBurnTest or Prime95, and the GPU running FurMark... :D There's a reason I bought a DQ6 motherboard; I can put all that fancy power circuitry to good use! I guess I'll just have to take some photos of the readout for ya... after my exam Thursday!
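For the skeptical, a rough tally shows how a load like that adds up at the wall. Every wattage below is an assumed ballpark figure for illustration only, not a measurement:

Code:
# Illustrative ballpark figures only -- assumed, not measured.
draws_watts = {
    "Q6600 @ 4GHz, burn test":   200,
    "GTX 260 OC, FurMark":       200,
    "5 HDDs + optical drive":     60,
    "Pump, fans, board, RAM":     80,
}

dc_load = sum(draws_watts.values())   # what the PSU must deliver
efficiency = 0.80                     # assumed PSU efficiency at this load
wall_draw = dc_load / efficiency      # what the power meter reads

print("DC load: ~%dW" % dc_load)      # ~540W
print("At wall: ~%dW" % wall_draw)    # ~675W -- so 700W+ is plausible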

How on earth did you kill your Kill-A-Watt??
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Ahh man, you really know how to stress that machine, haha. I guess the sheer amount of hard drives definitely helps, but still. Imagine if you had two GTX 295's in there... you'd likely actually hit 1,000W... that's insane.

I don't think I killed the Kill-A-Watt... it just died. It happened right before I began a fresh round of GPU benchmarking, too, so it was rather inconvenient. I won't even mention how inconvenient it is when I have low-powered Intel processors here to benchmark...

I was supposed to receive the new one on Tues, but that's UPS for you. It's really easy to hate a company when they give you reasons to every single week, heh.

As for my personal system woes, hauling out and replacing that 8800 GT isn't as easy as I thought it'd be. I now see why I should have upgraded to a larger chassis when I decided on this Antec P182. To install a larger card, I'd have to remove the middle drive cage and move the drives down to the bottom. I don't want to move the drives, though, because where they are now there's a big fan right behind them. I can't see them faring too well down at the bottom, temperature-wise. Oh well... time to consider a full-tower.
 

Kougar

Techgage Staff
Staff member
Rob Williams said:
Ahh man, you really know how to stress that machine, haha. I guess the sheer amount of hard drives definitely helps, but still. Imagine if you had two GTX 295's in there... you'd likely actually hit 1,000W... that's insane.

I bet I could come awfully close to 1kW with just another GTX 260... if I had two GTX 280's or 285's, then I'm sure I could hit 1kW; no need for Quad SLI for that!

Rob Williams said:
I don't think I killed the Kill-A-Watt... it just died. It happened right before I began a fresh round of GPU benchmarking, too, so it was rather inconvenient. I won't even mention how inconvenient it is when I have low-powered Intel processors here to benchmark...

Oye, your P3 power meter decided to leave you hanging with a low-powered Intel CPU in your machine at the time? Guess it was as good a time as any for it... :D

Rob Williams said:
As for my personal system woes, hauling out and replacing that 8800 GT isn't as easy as I thought it'd be. I now see why I should have upgraded to a larger chassis when I decided on this Antec P182. To install a larger card, I'd have to remove the middle drive cage and move the drives down to the bottom. I don't want to move the drives, though, because where they are now there's a big fan right behind them. I can't see them faring too well down at the bottom, temperature-wise. Oh well... time to consider a full-tower.

Yes, it is rather inconvenient when that happens. That, along with using four drives, meant I only used the lower drive bay in my P180. They sit just behind a 120mm fan in the lower wind tunnel, though, so temps wouldn't be an issue... both the PSU and the 120mm fan pull air across the drives. But it's just another reason I don't use my P180 anymore.
 
The GT300 is rumoured to have a massive die in excess of 500 mm², much like the GT200. And, like the GT200, the derivatives are once again missing. It is worth noting that the GT200 derivatives were announced almost a year late - and are still awaiting widespread availability.

Take the G80, for instance: it was a spectacular GPU back in 2006, when it took the graphics world by storm. It was then shrunk from 90nm to 65nm in fall 2007 for another star performer - the G92 core. Branding confusion ensued, with the G80-based 8800 GTS ending up slower than both G92 products - the 8800 GT and 8800 GTS (again, only differentiated by the memory sizes). A "next-generation" G92-based 9800 GTX was introduced, which ended up not much faster than the 8800 GTX.

Then came ATI's HD 4000 series - and the G92 was shrunk once again, to the 55nm G92b in summer 2008, this time branded as the 9800 GTX+, with clock speeds increased and prices cut to compete with ATI's mainstream HD 4850. By this point, the all-powerful flagship of 2006 was just a mainstream product. Fast forward to 2009, and Nvidia did it again, putting out a thermally improved 9800 GTX+ as the GTS 250. By the middle of 2009, thanks to the HD 4770 launch and further price cuts on the HD 4850 and HD 4870, the once-invincible $800 product (the 8800 Ultra) was effectively selling in the mid-range for $130, and still losing the price/performance battle. Unfortunately, many consumers assumed the GTS 250 was a "next-generation" product, and sales remained strong. This is why Nvidia attracted harsh criticism from several enthusiasts.