Intel Encourages Game Devs to Utilize the CPU Over the GPU

Rob Williams

Editor-in-Chief
From our front-page news:
It looks as though Intel plans to give an interesting talk at the upcoming Game Developers Conference, one that aims to explain to developers just how useful a CPU can be for gaming compared to a typical GPU. We've been down this debate road time and time again, so there isn't much new to expect from the talk itself, but it could be interesting to hear Intel's latest perspective on things.

As we mentioned in our "Clearing Up Misconceptions about CUDA and Larrabee" article last summer, the CPU and GPU each offer their own set of benefits to gaming, with the CPU being a natural choice for certain gameplay aspects while the GPU is better suited to highly-parallel workloads. You can't have one without the other, clearly, and that's why we're unlikely to see one cancel out the other for quite a while. You simply won't run an entire 3D-accelerated game off of a CPU anytime soon, so the likes of NVIDIA are still safe.

For all we know, this talk might be Intel's sneaky way of prefacing the launch of its Larrabee graphics solution, which could happen at the end of this year or sometime next. Since Larrabee utilizes many x86 cores, now is no doubt a great time to explain the benefits to developers. Of course, it's too early for any of us to speculate on just how successful Intel's graphics card will be, but the wait to find out shouldn't be much longer.


"Not all algorithms and processes map well to a GPU," said Jon Peddie, president of Jon Peddie Research. "You have to have a problem that is naturally parallel, and except for the rendering of, say, a water surface and subsurface and reflections, the wave motion equations will run just fine on a CPU," Peddie said. Intel may also be seeking ways to make better use of its quad-core processors, according to Tom R. Halfhill, an analyst at the Microprocessor Report. But, he added: "I need to be convinced that a CPU can do those 3D effects better than a GPU can."


Source: Nanotech: The Circuits Blog
 