Softmod an AMD Radeon HD 6950 to an HD 6970

Tharic-Nar

Senior Editor
Staff member
Moderator
Hardware enthusiasts are always on the lookout for ways of getting something for nothing, or of squeezing more use out of a given piece of equipment. Overclocking, soft-modding, anything goes for extracting those precious few extra flops or frames from their hardware, or making it perform like a much more expensive model: unlocking cores on AMD CPUs, converting NVIDIA GeForce cards into Quadros, even using software to gain added abilities with sound cards. A couple of days ago, a softmod was released that allows the conversion of an AMD HD 6950 into an HD 6970, not just as a mere overclock, but by actually re-enabling the extra shaders as well.
(Image: amd_hd6970shader_122810.jpg)
You can read the rest of our post and discuss here.

Optix

Basket Chassis
Staff member
I read about this at around 1am this morning. Too bad you couldn't do the same with a 6850. :(

DarkStarr

Tech Monkey
NICE, it looks like it works pretty reliably from their chart.

Manufacturer | Cards tested | Unlocks and works fine | Unlocks but rendering errors | Does not unlock
------------ | ------------ | ---------------------- | ---------------------------- | ---------------
AMD          | 1            | 1                      | 0                            | 0
HIS          | 10           | 10                     | 0                            | 0
ASUS         | 2            | 2                      | 0                            | 0
PowerColor   | 14           | 14                     | 0                            | 0
Sapphire     | 27           | 27                     | 0                            | 0
XFX          | 15           | 15                     | 0                            | 0
Club3D       | 6            | 6                      | 0                            | 0
Gigabyte     | 3            | 3                      | 0                            | 0
Diamond      | 1            | 1                      | 0                            | 0
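
For what it's worth, here's a quick Python sketch that just totals up the chart above (numbers copied straight from the table):

```python
# Tally of the unlock results from the chart above:
# (cards tested, unlocks and works fine, unlocks with rendering errors, does not unlock)
results = {
    "AMD":        (1, 1, 0, 0),
    "HIS":        (10, 10, 0, 0),
    "ASUS":       (2, 2, 0, 0),
    "PowerColor": (14, 14, 0, 0),
    "Sapphire":   (27, 27, 0, 0),
    "XFX":        (15, 15, 0, 0),
    "Club3D":     (6, 6, 0, 0),
    "Gigabyte":   (3, 3, 0, 0),
    "Diamond":    (1, 1, 0, 0),
}

tested = sum(row[0] for row in results.values())
works = sum(row[1] for row in results.values())
print(f"{works}/{tested} cards unlocked and worked fine ({works / tested:.0%})")
# -> 79/79 cards unlocked and worked fine (100%)
```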

Rob Williams

Editor-in-Chief
Staff member
Moderator
I'm curious about the true stability of these things over the course of a couple of weeks or even months, though. It's one thing to do the mod and then run a benchmark and call it 100%, but it's another to do the mod and have it fail down the road. The HD 6970 DOES have an 8-pin connector, for example, and I'm guessing that's for a reason. Could the modded HD 6950 be prone to damage due to lack of a proper power input months down the road? We'll have to wait and see.

Doomsday

Tech Junkie
Maybe it will work fine with the two 6-pins, as long as you don't overclock the unlocked HD 6950, which is where the real need for an 8-pin comes in! :)

Rob Williams

Editor-in-Chief
Staff member
Moderator
Doomsday said:
Maybe it will work fine with the two 6-pins, as long as you don't overclock the unlocked HD 6950, which is where the real need for an 8-pin comes in! :)

The thing is... even at stock speeds, AMD put an 8-pin connector on the HD 6970. If it didn't need it, wouldn't AMD have used two 6-pins instead? After all, that'd look a LOT better than NVIDIA's top-end solutions, which also require an 8-pin and a 6-pin.

I can't help but feel like some people are going to be risking their cards in doing this. I could be wrong, but I don't think AMD would have chosen to use an 8-pin connector just for fun.

DarkStarr

Tech Monkey
Eh, I doubt it really matters. The power should be fine, since the two 6-pin connectors supply 75 W each, which is 150 W, plus the 75 W from the PCIe slot. The other thing is that, in reality, the three 12 V lines in a 6-pin PCIe connector can probably supply far more than the "rated" 75 W. Also, 75 W x 3 is only 225 W, yet the card is pulling 252 W, so obviously the power is there. The 8-pin is there because it technically allows 150 W, but it could likely supply quite a bit more.

EDIT: On top of that, the only difference is that the 8-pin's extra two pins are just ground wires.
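
For anyone who wants to follow that math, here's a rough power-budget sketch in Python; the 75 W and 150 W figures are the official connector ratings, and 252 W is the peak draw quoted above:

```python
# Rough power-budget comparison: reference HD 6970 vs. an unlocked HD 6950.
# Official ratings: PCIe slot 75 W, 6-pin connector 75 W, 8-pin connector 150 W.
PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150
CARD_PEAK_DRAW_W = 252  # peak draw figure quoted above

# Reference HD 6970: slot + 6-pin + 8-pin
reference_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W
# Unlocked HD 6950: slot + 6-pin + 6-pin
modded_budget = PCIE_SLOT_W + SIX_PIN_W + SIX_PIN_W        # 225 W

print(f"Reference HD 6970 budget: {reference_budget} W "
      f"({reference_budget - CARD_PEAK_DRAW_W} W headroom at {CARD_PEAK_DRAW_W} W peak)")
print(f"Unlocked HD 6950 budget:  {modded_budget} W "
      f"({CARD_PEAK_DRAW_W - modded_budget} W over the official rating at peak)")
# -> 300 W budget leaves 48 W of headroom; the 225 W budget is exceeded by 27 W at peak,
#    which is the extra load the slot and 6-pin wiring would have to absorb.
```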

DarkStarr

Tech Monkey
It says it right on the second page there:

NVIDIA also advises against the use of 6-pin in 8-pin cards since many power supplies will not provide sufficient current over the 6-pin power cable. However, this type of setup could potentially support normal operation as long as the customer checks their PSU manuals and ensures that its 6-pin PCI-E rails can handle the same current rating as an 8-pin power cable, which is 150 watts.
I believe the 8-pin is there to ENSURE that the card gets the correct power, since cheaper PSUs may not be able to provide 150 W over a 6-pin PCIe connector, which could also be why a couple of cards in their chart have now unlocked with errors.

"We will however ill-advise the use of the 6pin connector as a replacement of the 8pin, unless you are absolutely sure about the quality of your PSU."

As I said, if it's a good one it should be fine; if you went with a cheap one, probably not.
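
To put rough numbers on that PSU-quality caveat, here's a quick Python sketch of the current involved. Assuming three live +12 V wires per 6-pin cable is my own simplification of typical wiring, and actual wire gauge and 12 V rail limits depend entirely on the PSU:

```python
# How much current the +12 V wires of a PCIe power cable carry at the two ratings
# discussed above. LIVE_WIRES = 3 is an assumption about typical 6-pin wiring.
RAIL_VOLTAGE = 12.0
LIVE_WIRES = 3

for label, watts in (("6-pin at its official 75 W rating", 75),
                     ("6-pin asked to do 8-pin duty (150 W)", 150)):
    total_amps = watts / RAIL_VOLTAGE
    per_wire_amps = total_amps / LIVE_WIRES
    print(f"{label}: {total_amps:.1f} A total, ~{per_wire_amps:.1f} A per +12 V wire")
# -> 75 W  works out to  6.2 A total, ~2.1 A per wire
# -> 150 W works out to 12.5 A total, ~4.2 A per wire
```

Pushing a 6-pin cable to 150 W doubles the per-wire current, which is why the advice above boils down to knowing what your PSU's wiring can actually handle.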

Tharic-Nar

Senior Editor
Staff member
Moderator
Yeah, it does seem like a manufacturing-standards limit rather than a consumer-facing one, e.g. to prevent less reputable manufacturers from producing 6-pin connectors rated for 150 W using thin wire and having them burn out. Checking the ATX standard, it mentions nothing about wire thickness, so there was nothing stopping companies from using less-than-ideal wire for these connectors, just as long as it met the 75 W criterion. Decent PSU manufacturers tend to over-engineer their products, so this 150 W issue is less of a concern. Go to the budget end, where everything is built to exactly meet the minimum specification... and we have a fire hazard.

Reminds me of the days when people battled over 40- and 80-conductor IDE cables... the 80-conductor version just added a whole mass of ground wires to reduce noise and increase throughput... much like CAT 5e compared to CAT 6 (24 AWG versus 22 AWG), thicker insulation/wire to reduce interference and increase throughput... In most cases, cables were interoperable with the previous generations, with a few minor technical exceptions.

DarkStarr

Tech Monkey
Yep, that's how it seems to me, but there is one other thing: the pinout on every 6-pin connector I have seen is:

1 +12v --- 4 Ground
2 +12v --- 5 Ground
3 +12v --- 6 Ground

But I have read that the spec is closed so no one sees it, and it's possible that the spec has the 6-pin listed as:

1 +12v --- 4 Ground
2 Blank --- 5 Ground
3 +12v --- 6 Ground

So that could be why it is listed as 75 W.
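
Just to illustrate that, here's a tiny Python sketch comparing the two pinouts above; spreading the 75 W rating evenly over the live +12 V wires is my own simplification, not something from the spec:

```python
# Compare the two possible 6-pin pinouts listed above and what the 75 W rating
# would mean per live +12 V wire in each case.
pinouts = {
    "all three +12V populated": {1: "+12V", 2: "+12V", 3: "+12V", 4: "GND", 5: "GND", 6: "GND"},
    "pin 2 left blank":         {1: "+12V", 2: "Blank", 3: "+12V", 4: "GND", 5: "GND", 6: "GND"},
}

RATED_WATTS = 75
RAIL_VOLTAGE = 12.0

for name, pins in pinouts.items():
    live = sum(1 for role in pins.values() if role == "+12V")
    amps_per_wire = RATED_WATTS / RAIL_VOLTAGE / live
    print(f"{name}: {live} live +12V wires, ~{amps_per_wire:.1f} A per wire at {RATED_WATTS} W")
# -> three live wires: ~2.1 A each; two live wires: ~3.1 A each.
#    A spec written around the two-wire layout would naturally carry a more
#    conservative wattage rating, which fits the 75 W figure.
```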