AMD's First Fusion Processors to be Launched in 2010?

Rob Williams

Editor-in-Chief
Staff member
Moderator
At the beginning of the year, Intel launched its Clarkdale and Arrandale processors, built on the Westmere micro-architecture. A couple of things made these special. One of the more important aspects was that the chips were built on a 32nm process. But secondly, and probably more interesting, is that under the one hood lay both a CPU and a GPU.

You can read the rest of our post here
 

Tharic-Nar

Senior Editor
Staff member
Moderator
Let's do the time warp again....

Throw your memories back to that fateful time when the battle of the dual-core processors came into being: there was the 'two glued together' Intel chip, and the AMD fused design. When push came to shove, the performance difference was, and still is, ~5%, while the difference in power draw was closer to ~10%. So there will probably be little difference between glued and soldered, just a small increase in performance and a small decrease in power.... but since we'll be comparing apples and oranges in the final tests, the real values will be hidden, since the comparison will be between an Intel CPU and IGP vs. an AMD CPU and ATI graphics. If the result is an overall more efficient and powerful combo versus the Intel solution, then they'll have a big head start in the mobile and low-power markets; for desktops, I'm sure there will be better solutions.
 

Kougar

Techgage Staff
Staff member
That's all very true. But don't view this Fusion as some great thing, view it as the stepping stone to that great thing. ;)

That "great thing" will be a true hybrid CPU/GPU, where the CPU performs its usual integer, out-of-order, single-threaded workloads... but instead of using its own floating-point units it will use the GPU core. GPUs excel at FLOPs and basic parallel workloads... a hybrid processor will be able to remove duplicate CPU silicon and replace it with GPU silicon that can do the same task more efficiently, and it will be capable of offloading tasks to the appropriate engines.
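To make the offloading idea concrete, here is a minimal Python sketch (my own illustration, not anything from AMD) of the kind of data-parallel floating-point workload, SAXPY (y = a*x + y), that a GPU-style engine chews through in bulk while a scalar FPU grinds through it one element at a time; NumPy's whole-array operation stands in for the wide parallel hardware:

```python
import numpy as np

def saxpy_scalar(a, x, y):
    # One element at a time -- the pattern a CPU's scalar FPU executes.
    return [a * xi + yi for xi, yi in zip(x, y)]

def saxpy_vector(a, x, y):
    # One whole-array operation -- the pattern a GPU (or wide SIMD unit)
    # applies across many elements in parallel.
    return a * np.asarray(x) + np.asarray(y)

a = 2.0
x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
print(saxpy_scalar(a, x, y))           # [12.0, 24.0, 36.0]
print(saxpy_vector(a, x, y).tolist())  # [12.0, 24.0, 36.0]
```

The results are identical; the point is that the second form exposes the whole array as independent FP work, which is exactly what a hybrid chip could hand off to its GPU half instead of duplicating FP silicon on the CPU side.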

AMD supposedly has plans to release a true hybrid in 2015. :) http://www.xbitlabs.com/news/cpu/di...tion_of_AMD_Fusion_Chips_Due_in_2015_AMD.html
 

Tharic-Nar

Senior Editor
Staff member
Moderator
Indeed, I think I made the comparison a while ago, as have others: GPU integration is just another logical step. First there were CPUs as integer processors, later came dedicated FPUs, and the FPUs were then integrated into the CPU. Graphics demanded massive parallel processing, and 3D graphics requires floating-point operations for efficiency, making GPUs massively parallel FPUs. GPUs got bigger and better, then smaller and smaller, and now they're being fitted back into the CPU again.

Dedicated cards aren't going anywhere though; you simply can't fit both a full CPU (even if it's cut back to remove duplication) and a full GPU on the same chip, along with the accompanying memory, while having it operate at a sane temperature. Having shared memory has its advantages, but it also means specific tasks can still thrash the memory controller, starving other operations, etc. It's swings and roundabouts.

Dedicated graphics cards will probably have two major purposes in 5 years' time – high-end gaming and HPC – and both will probably carry the appropriate price premium. For everyone else, i.e. the mainstream, integrated graphics will probably be able to handle 90% of the games out there, since by that time most will be console ports. Games consoles will be due a hardware refresh by then: they get released, new graphics push the envelope, demand for dedicated PC gaming cards increases, and the cycle starts again. Services like OnLive might take off by then, but I seriously doubt that the state of the worldwide Internet infrastructure will have significantly changed, since those services are dictated by latency, not bandwidth, and latency rarely changes at all.

I guess we should stop calling them graphics cards as well - AMD has started to call its chips APUs, or Accelerated Processing Units, but I believe that was in regard to their Fusion chips.

If there weren't a demand for dedicated hardware, we wouldn't have Creative, Asus and Auzentech releasing dedicated sound cards after all these years of integrated audio.
 