Is Intel Cheating for Higher 3DMark Scores?

Rob Williams

Editor-in-Chief
Staff member
Moderator
From our front-page news:
For as long as it's been possible to write a graphics driver in a way that inflates the results of popular benchmarking applications, it's been done. NVIDIA is one company that's been accused of it many times in the past, most recently when it changed its driver to let the graphics card take over work normally handled by the CPU, which inflated the CPU score in 3DMark Vantage to levels no modern-day processor could reach on its own.

But it's not NVIDIA in the hot-seat today; it's Intel. As investigated by our friends at The Tech Report, Intel seems to be writing application-specific optimizations into its driver that end up inflating the final 3DMark Vantage score. The way it works is that when a supported application (as in, executable) is detected, the driver shifts some of the workload over to the processor in order to improve overall gaming performance.
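
To picture what that kind of executable detection boils down to, here's a rough C sketch; the profile list, executable names and function are purely hypothetical and not taken from Intel's actual driver.

```c
/* Minimal, hypothetical sketch (not Intel's real driver code) of how a driver
 * might key an optimization off the name of the running executable. */
#include <stdio.h>
#include <string.h>
#include <stdbool.h>

/* Hypothetical list of profiled executables, standing in for an INF-style table. */
static const char *profiled_apps[] = {
    "3DMarkVantage.exe",
    "Crysis.exe",
    "LostPlanet.exe",
    "CoJ.exe",
};

/* Returns true if CPU-side (software) vertex processing should be enabled. */
static bool should_offload_to_cpu(const char *exe_name)
{
    for (size_t i = 0; i < sizeof(profiled_apps) / sizeof(profiled_apps[0]); i++) {
        if (strcmp(exe_name, profiled_apps[i]) == 0)
            return true;   /* matched a profile: shift vertex work to the CPU */
    }
    return false;          /* everything else stays on the IGP */
}

int main(void)
{
    const char *running = "Crysis.exe";   /* stand-in for the detected process name */
    printf("%s -> CPU vertex processing %s\n",
           running, should_offload_to_cpu(running) ? "ON" : "OFF");
    return 0;
}
```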

That seems like a fair method of doing things to me, because if the CPU isn't being used, an extra boost of performance is going to be appreciated. The problem is that the driver explicitly lists 3DMark Vantage among the applications it targets... it's not a general optimization. With such "trickery", Futuremark might not approve Intel's latest driver for use in published benchmark results. Despite the naysayers, though, Intel says it's very confident that Futuremark will approve the driver.

It should be noted that 3DMark isn't the only affected application. Actual games that will see the feature include Crysis, Lost Planet and Call of Juarez. To keep this fair, Intel should really remove these "profiles" and write the driver to take advantage of this functionality whenever needed... it should be completely application-agnostic. It's not, though, so until Futuremark gives its final word, we'll have to just speculate as to whether or not Intel is trying to cheat the system. What do you guys think?
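
For the sake of argument, an application-agnostic approach could look something like this rough sketch, where the decision is made from load alone rather than from the process name (the thresholds and query functions are invented for illustration):

```c
/* Hypothetical application-agnostic policy: enable CPU-side vertex processing
 * whenever the GPU is the bottleneck and the CPU is mostly idle, regardless
 * of which executable is running. The query functions and thresholds are
 * illustrative stand-ins, not real driver APIs. */
#include <stdio.h>
#include <stdbool.h>

static double gpu_utilization(void) { return 0.98; }  /* pretend the IGP is saturated */
static double cpu_utilization(void) { return 0.30; }  /* pretend the CPU is mostly idle */

/* Decide based on load alone, not the exe name. */
static bool should_offload_to_cpu(void)
{
    return gpu_utilization() > 0.95 && cpu_utilization() < 0.50;
}

int main(void)
{
    printf("CPU vertex processing %s\n", should_offload_to_cpu() ? "ON" : "OFF");
    return 0;
}
```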


Intel's software-based vertex processing scheme improves in-game frame rates by nearly 50% when Crysis.exe is detected, at least in the first level of the game we used for testing. However, even 15 FPS is a long way from what we'd consider a playable frame rate. The game doesn't exactly look like Crysis Warhead when running at such low detail levels, either. Our Warhead results do prove that Intel's optimization can improve performance in actual games, though—if only in this game and perhaps the handful of others identified in the driver INF file.


Source: Tech Report
 

gibbersome

Coastermaker
If anything, this reduces the credibility of 3DMark Vantage scores. Consumers should pay special attention to in-game performance when deciding which chipset to go with.

We shouldn't forget that ATI has used similar optimizations for its own graphics cards in the past:
Benchmarking Ethics and the ATI Radeon

We've come a long way since 2001...or have we? :rolleyes:


If the advantage translates into better game performance (as is the case with Crysis), the question arises: is this optimization or cheating?

Personally, I'm not going to start a debate about Intel's ethics (or lack thereof). Nvidia and ATI have been accused of worse, especially Nvidia recently.

In this case, if unused CPU resources can improve gaming performance, what's wrong with that? I see this as Intel having better drivers, rather than playing unfairly. The performance increase would be even better with quad-/multi-core CPUs, where the idle cores could be used to substantially enhance gaming performance (since we know that games rarely use more than two cores).

It's also worth noting that ATI has been known to lag behind Nvidia with driver support for its video cards. If AMD is following the same trend, this should be a wake-up call for them.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Ahh, I didn't realize ATI was also caught cheating for higher benchmark scores. I didn't take hardware too seriously until I launched Techgage, so I missed out on a lot of history there.

gibbersome said:
If the advantage translates into better game performance (as is the case with Crysis), the question arises: is this optimization or cheating?

Well, no one would consider it cheating if the enhancement affected all gaming applications, but it doesn't... it singles out a select few. In Intel's defence, though, what they're doing is essentially multi-GPU... they're passing some of the workload onto the CPU, which as a result acts as a GPU. So, it might be that Futuremark finds what Intel is doing to be just fine. It would still be nice if the enhancement were universal, though.
 

gibbersome

Coastermaker
Rob Williams said:
Well, no one would consider it cheating if the enhancement affected all gaming applications, but it doesn't... it singles out a select few. In Intel's defence, though, what they're doing is essentially multi-GPU... they're passing some of the workload onto the CPU, which as a result acts as a GPU. So, it might be that Futuremark finds what Intel is doing to be just fine. It would still be nice if the enhancement were universal, though.

Agreed. If in-game performance is improving, it's okay to see the same improvements in the benchmark. The trouble is that this optimization doesn't affect all games, and that's what's wrong here.

Now, if Intel could create a driver that automatically shifts workload from the GPU to the CPU during intense gaming, regardless of the game's name, this story would be moot.

If the potential is there, why not fully use it?
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
gibbersome said:
Now, if Intel could create a driver that automatically shifts workload from the GPU to the CPU during intense gaming, regardless of the game's name, this story would be moot.

Exactly, and this is something I can't quite wrap my head around. If this optimization is possible for some games, then why not all? It seems a little suspicious that the games it does optimize for happen to be ones commonly used in benchmarks for reviews. Either way, you're right... they could just optimize the driver in general and allow the option to be disabled, and there would be no issue.
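
Something as simple as a global heuristic plus a user-facing override would cover it. A hypothetical sketch, where an environment variable stands in for a control-panel setting:

```c
/* Hypothetical user override: the driver applies its CPU-offload heuristic
 * everywhere, but a knob lets the user force it off or on. The variable name
 * and the placeholder heuristic are made up for illustration. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdbool.h>

static bool heuristic_says_offload(void) { return true; }  /* placeholder decision */

static bool offload_enabled(void)
{
    const char *override = getenv("IGP_CPU_VERTEX");       /* hypothetical knob */
    if (override != NULL) {
        if (strcmp(override, "off") == 0) return false;    /* user disabled it */
        if (strcmp(override, "on") == 0)  return true;     /* user forced it on */
    }
    return heuristic_says_offload();                        /* otherwise, automatic */
}

int main(void)
{
    printf("CPU vertex offload: %s\n", offload_enabled() ? "enabled" : "disabled");
    return 0;
}
```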
 

Kougar

Techgage Staff
Staff member
All I will say is this: why else would they want to move the GPU into the CPU, unless the CPU was helping offload GPU processing? Basic GPU overhead doesn't warrant it by itself. ;)

gibbersome said:
If anything, this reduces the credibility of 3DMark Vantage scores. Consumers should pay special attention to in-game performance when deciding which chipset to go with.

It's so very true! Credibility was always the biggest issue with synthetic programs. I do think far too much emphasis is placed on programs like Vantage; they should not be the #1 target when manufacturers tune the firmware/software/hardware in their products for performance.

Rob Williams said:
That seems like a fair method of doing things to me, because if the CPU isn't being used, an extra boost of performance is going to be appreciated.

Ah, but if you notice, 3DMark's CPU score drops by ~500 points with this "hack" enabled, so it does have some effect, although not a large one.
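
Just to put rough numbers on the trade-off (to be clear, this is not Vantage's actual scoring formula, just a made-up, GPU-weighted combination with invented subscores): a ~500-point CPU drop can still be more than cancelled out by a bigger GPU gain.

```c
/* Back-of-the-envelope illustration only: NOT 3DMark Vantage's real scoring
 * formula. It just shows how a modest CPU-subscore drop can be outweighed by
 * a larger GPU-subscore gain when the GPU is weighted more heavily. */
#include <math.h>
#include <stdio.h>

static double combined(double gpu, double cpu)
{
    return pow(gpu, 0.75) * pow(cpu, 0.25);   /* hypothetical GPU-heavy weighting */
}

int main(void)
{
    /* invented subscores before and after the CPU-offload "hack" */
    double before = combined(2000.0, 6000.0);
    double after  = combined(2800.0, 5500.0); /* GPU up a lot, CPU down ~500 */
    printf("before: %.0f  after: %.0f\n", before, after);
    return 0;
}
```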
 