Old 08-12-2008, 10:59 PM   #1
Rob Williams
Editor-in-Chief
 
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,351
Clearing Up Misconceptions about CUDA and Larrabee

The non-gaming GPU games are heating up, and Intel is hot on NVIDIA's tail, but can you predict who will reign supreme in two years' time?

Both Intel and NVIDIA have lots to say about their respective GPU architectures, as well as the competition's. Partly because of this, there are numerous misconceptions floating around about both Larrabee and CUDA, so we decided to see if we could put to rest a few of the most common ones.

You can read the full article here and discuss it here.
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11. 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Old 08-13-2008, 03:42 AM   #2
Kougar
Techgage Staff
 
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653

Great article!

Hm, tough call. I can't pretend to truly know anything about the coding and compiling, or whose code is more closely related to C, but Anand's article really does put Larrabee in a positive light, in my opinion.

Whatever NVIDIA may say, their "shaders" are not fully programmable, and this constrains performance tuning to an extent and imposes some large limitations. As NVIDIA themselves pointed out, Intel uses software to manage threads and assign workloads; in general, their software driver replaces what NVIDIA and ATI have always done in pure hardware.

The biggest advantage of this approach, whether it turns out to be a huge boon or a huge performance hit, is that, if I understand things correctly, Intel can update their driver to change their "DX10" Larrabee into a "DX10.1", "DX11", or "DX12" GPU. In effect, all of this mess about which DX version is necessary becomes moot. Even better, game developers can write their games to use whatever they want, since the Larrabee driver can take the code and convert it into something that runs on the hardware. So if I understood things right, Intel is significantly less constrained with their approach.
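To make that idea concrete, here is a toy Python sketch of the driver-as-translation-layer concept. Everything here is invented for illustration (there is no real Larrabee API): the point is simply that when the "hardware" is a generic software core, supporting a newer API version is just a new front-end in the driver.

```python
# Toy model: a software rasterizer exposes one generic core, and each
# "DX version" is just a thin translation layer lowered onto it.

def generic_core(ops):
    """The unchanging core: executes simple (name, args) op tuples."""
    return [f"exec:{name}({args})" for name, args in ops]

def dx10_frontend(draw_calls):
    """DX10-era calls lower directly onto core ops."""
    return [("raster", call) for call in draw_calls]

def dx11_frontend(draw_calls):
    """A later driver update adds a tessellation pass by emitting extra
    core ops, with no hardware change at all."""
    ops = []
    for call in draw_calls:
        ops.append(("tessellate", call))
        ops.append(("raster", call))
    return ops

calls = ["triangle_a", "triangle_b"]
print(generic_core(dx10_frontend(calls)))  # 2 ops executed
print(generic_core(dx11_frontend(calls)))  # 4 ops: tessellation added purely in software
```

The same core runs both workloads; only the front-end changed, which is the flexibility being described above.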

The proof will be in the pudding, though, and Larrabee's advantages mean little if it can't deliver the performance to make them worthwhile. Thankfully, AnandTech pointedly mentioned that Intel's existing GPU driver team is busy working on those IGP drivers, while a new team is working on the Larrabee drivers. I think we all know Intel's poor track record with IGP drivers and missing or delayed integrated graphics features.
__________________
Core i7 4770k 4.2Ghz
Gigabyte Z87X-UD5H
Crucial Ballistix Sport LP 1600MHz 32GB
EVGA GTX 480 HydroCopper FTW
ASUS Xonar DX
Corsair Neutron GTX 240GB | Windows 7 64-bit
Apogee XT + MCP655 & Thermochill Triple 140mm Radiator
Corsair AX1200 PSU | Cooler Master HAF-X

Old 08-13-2008, 02:20 PM   #3
Rob Williams
Editor-in-Chief
 
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,351

I do think it's obvious that NVIDIA cards are not as programmable as Larrabee will be, especially given that Larrabee is essentially a CPU with multiple x86 cores designed for specific workloads. NVIDIA says otherwise, and I'm sure they'll keep claiming that for a while. If I had the ability to go out and test, I'd do so. I think I need to get in contact with current CUDA developers and pick their brains a little bit.
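For readers wondering what "programmable" means in CUDA's case: the model is data-parallel, where one kernel function runs across a grid of thousands of threads, each identified by its index. Below is a rough emulation of that grid/block indexing in plain Python (no real CUDA involved; the `launch` helper and names are invented for illustration).

```python
# Rough Python emulation of CUDA's grid/block thread indexing.
# In real CUDA, the kernel body runs once per GPU thread; here we
# loop sequentially just to show the indexing scheme.

def saxpy_kernel(thread_id, a, x, y, out):
    """Body of a CUDA-style kernel: each 'thread' handles one element."""
    if thread_id < len(x):  # bounds check, as in real kernels
        out[thread_id] = a * x[thread_id] + y[thread_id]

def launch(kernel, grid_dim, block_dim, *args):
    """Emulate a kernel launch: grid_dim blocks of block_dim threads."""
    for block in range(grid_dim):
        for thread in range(block_dim):
            kernel(block * block_dim + thread, *args)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [10.0, 10.0, 10.0, 10.0, 10.0]
out = [0.0] * len(x)
# 2 blocks x 4 threads = 8 threads; the bounds check idles the extra 3.
launch(saxpy_kernel, 2, 4, 2.0, x, y, out)
print(out)  # → [12.0, 14.0, 16.0, 18.0, 20.0]
```

The restriction being debated above is that on NVIDIA hardware these threads run on specialized shader units, whereas Larrabee's threads would run on general x86 cores.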

Quote:
Originally Posted by Kougar
The biggest advantage to this, besides that it could be either a huge boon or a huge performance hit, is if I understand things correctly Intel can update their driver to change their "DX10" Larrabee into a "DX10.1" or "DX11" or "DX12" GPU.
Yes, exactly, they'd be able to do that without issue. Larrabee just needs to be fast enough to handle the increased work.

As for the GPU driver and the like, I agree. Intel has a completely different team working on Larrabee, so to assume that their driver will suffer the same humiliation as their IGP ones (which I can vouch for as being less-than-stellar) would be extremely foolish. Intel has the money, the manpower and some incredibly smart people working on Larrabee. It's not going to suffer due to a lackluster GPU driver, I'm sure of that much.
Old 08-13-2008, 07:03 PM   #4
Kougar
Techgage Staff
 
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653

Well, Intel is great with hardware, but I won't rule out just yet that they might have a few issues to iron out with drivers, especially since this is a first of its kind for them. And their last attempt at a discrete GPU didn't really work out so well. I will just have to see how the "new" Intel handles this one.

Their approach to the entire software/hardware design is simply brilliant though...

Another advantage would be multi-GPU scaling. Since it is all done via software, the workload can be evenly distributed across all the available Larrabee cards, or "multi-cores", with near-perfect scaling. In effect, Intel has the ability to skip the multi-GPU growing pains NVIDIA and ATI have had to work through the hard way...
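That scheduling idea can be sketched in a few lines of Python. This is purely illustrative (a real driver's load balancer is vastly more complex): split a frame's tiles evenly across however many cards are present, so adding a card ideally just shrinks every card's share.

```python
# Toy model of software multi-GPU scheduling: a frame is a set of
# tiles, and the scheduler deals them round-robin across N cards.

def schedule_tiles(num_tiles, num_cards):
    """Return a per-card list of tile indices, balanced round-robin."""
    assignments = [[] for _ in range(num_cards)]
    for tile in range(num_tiles):
        assignments[tile % num_cards].append(tile)
    return assignments

frame_tiles = 12
for cards in (1, 2, 3):
    work = schedule_tiles(frame_tiles, cards)
    # Ideal scaling: per-card load drops as 12 -> 6 -> 4 tiles.
    print(cards, "card(s):", [len(w) for w in work])
```

Because the distribution happens in software before anything touches a card, no AFR/SFR-style hardware coordination (the source of classic SLI/CrossFire growing pains) is needed in this model.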
Old 01-04-2010, 09:47 PM   #5
Unregistered
Guest Poster
 
Posts: n/a
NO CURRENT WINNER? Uhh... WHAT!?!

Gee, that's funny: even back in August, months ago, Intel couldn't produce a hill of beans, so obviously NVIDIA was, and in fact still is, "THE CURRENT WINNER!", PERIOD.

Now it may be that in the future, when Intel, or (far less likely) the red-rooster ATI fanboy freaks' favorite, actually gets something out the nm door, there will be a different current winner THEN, but as for now...

DO NOT DETRACT FROM NVIDIA'S MASSIVE, MASSIVE DOMINANCE !

Thanks.
Old 01-05-2010, 09:51 AM   #6
Psi*
Tech Monkey
 
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785

I agree with Unregistered ... NVIDIA is the winner by default.
__________________
Win 7 64 bit, ASUS P6X58D Premium i7-990 @ 4.5 GHz
24 GB CORSAIR DOMINATOR
NVIDIA Tesla M2090 + NVS 290, Seasonic X750
Swiftech Apogee XT block, Indigo Extreme TIM
Swiftech MCR220-QP Radiator, Eheim 1040 pump, 1/2" ID Tubing
Old 01-05-2010, 04:19 PM   #7
Kougar
Techgage Staff
 
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653

All I will say is that NVIDIA hasn't won anything yet... their new GPU is launching more than a quarter late, and we have yet to see it deliver on performance. The longer they take, the larger an advantage ATI will have.

Additionally, ATI/AMD is already launching mobile 5000-series GPUs... last I checked, NVIDIA's best mobile GPU is the GTX 280M, which is actually a last-generation 9800 GTX core, not a GTX 280 core. So NVIDIA is even further behind in the mobile sector...
Old 01-05-2010, 05:01 PM   #8
Psi*
Tech Monkey
 
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785

Perhaps I have a gross misunderstanding as I associate CUDA with GPGPU as a hardware accelerator for those software developers that choose to implement it. In that regard, NVIDIA has been the only option.
Old 01-05-2010, 09:47 PM   #9
Kougar
Techgage Staff
 
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653

Quote:
Originally Posted by Psi* View Post
Perhaps I have a gross misunderstanding as I associate CUDA with GPGPU as a hardware accelerator for those software developers that choose to implement it. In that regard, NVIDIA has been the only option.
That's pretty much true. OpenCL, ATI's Brook+, and other initiatives haven't been as widely adopted as CUDA has. I'm not sure how DX11's DirectCompute requirements will play into this, either.

The issue is that NVIDIA can't win if they don't have their latest-and-greatest GPU out the door. GT300, G100, whatever you want to call it, was designed completely around CUDA. So not having it launch in November as planned was a setback, and current news says it will be March before they launch it.
Old 01-06-2010, 01:26 AM   #10
Psi*
Tech Monkey
 
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785

NVIDIA Tesla... I didn't think that we were on the same sheet of music. I am considering a couple of these versus traditional raw multi-core systems. Have to see how business goes.

I saw someone's build of a quad-socket, quad-core, water-cooled AMD system with high-end AMD server chips. He had 64 GB of RAM in the box as well. Very impressive build, but the money he put into it would have bought several of these GPGPU cards, which would have used less power and been much faster.

For the variety of number-crunching software represented by the pics on that Tesla web page, the NVIDIA Tesla has been the only option. The C1060 sells for about $1,200. Until very recently I did not think these could be on my plate, so I don't know about the others.

And I know it looks like I depend on Wikipedia too much, but this is a nice summary.

Last edited by Psi*; 01-06-2010 at 12:54 PM.
Old 01-06-2010, 11:34 AM   #11
killem2
Coastermaker
 
Join Date: Jan 2010
Posts: 220

Nothing wrong with wiki
Old 01-06-2010, 02:40 PM   #12
Kougar
Techgage Staff
 
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653

Current Tesla cards use the GT200 core... Fermi will give a significant performance boost because it is the first GPU core completely designed and optimized for CUDA. So again, you wouldn't want to buy any Tesla equipment until after Fermi, aka GT300, finally launches.