EVGA's GPU Voltage Tuner Tool Released

Rob Williams

Editor-in-Chief
Staff member
Moderator
From our front-page news:
Overclocking. Hobby for some, life for others. I used to take overclocking fairly seriously, but to become truly successful at it, you have no choice but to devote a lot of time and energy to studying up, and sometimes even to modding whatever product it is you're trying to overclock. That's been especially true with motherboards, and it's not uncommon to see extra wires soldered from one point to another to gain the upper hand in a benchmarking run.

This isn't that extreme, but to offer even more control to their customers, EVGA has released their long-awaited "GPU Voltage Tuner", which does just what it sounds like. With CPU overclocking, the option to increase the voltage can be found right in the BIOS... it's far from complicated. GPU overclocking can prove much more complex, since easy voltage control has never been available.

With this new tool, which exclusively supports EVGA's own GTX 200-series, the user is able to increase the voltage in what seems to be three or four steps total. As it stands right now, the red area, seen in the picture below, is not yet enabled, and it's easy to guess why: with this tool, heat is going to increase, so it's imperative to make sure you have superb airflow. If your card dies while using this tool, it doesn't look like the damage is covered under warranty, so be careful out there!

evga_tuner_tool_021009.jpg

This utility allows you to set a custom voltage level for your GTX 295, 280 or 260 graphics card (see below for the full supported list). Using this utility may allow you to increase your clock speeds beyond what was possible before, and when coupled with the EVGA Precision Overclocking Utility, it is now easier than ever to get the most from your card. With these features and more, it is clear why EVGA is the performance leader!


Source: EVGA GPU Voltage Tuner
 

Kougar

Techgage Staff
Staff member
It's a solid program. For those who own EVGA cards, it is much quicker to use and test with than the more complex RivaTuner alternative, and it should support a wider range of EVGA cards.

Voltage is "limited" to 1.288v on my EVGA SuperClocked GTX 260. The program advertises support for Windows 7 and works fine under it. It also detects any voltage adjustments already made via RivaTuner.
 

Rob Williams

Given that the example on their site shows a limit of around 1.188v, 1.288v doesn't sound too bad. I'll have to give this tool a go when I toss their GTX 285 in the test rig for benchmarking. I'm mostly interested in the heat factor, but given that NVIDIA cards tend to run much cooler than AMD's, I'm sure there will be a fair amount of wiggle room.
 

Kougar

Well, I'm not sure how my card stands compared to most other GTX 260s, but it is very heat sensitive.

Heat seems to be as much an issue as voltage. At 675/1458 clocks, my GTX 260 takes longer to artifact at 1.20v; artifacts only appear after the core temp crosses ~65°C. ATI Tool finds artifacts instantly if I use 1.288v, as temps jump to ~68°C and climb quickly from there.

I'm sure I could bring the temps down with some tweaking to my loop and GPU Fuzion block, but I'm happy with it as is. I still run 652/1404 @ 1.20v for 24/7 use; anything beyond that is just too close to the edge for stability. Stock was 1.125v.
 