Old 11-05-2009, 03:59 PM   #1
Rob Williams
Editor-in-Chief
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,353

AMD vs. NVIDIA: Anti-Aliasing in Batman: Arkham Asylum

From our front-page news:
There's an ongoing war of words between AMD and NVIDIA, and in some cases also Eidos and Rocksteady, regarding the recent PC hit, Batman: Arkham Asylum. The story surfaced well before the game's launch, but spread like wildfire once it reached consumers, as gamers with an ATI card installed discovered a downside: no anti-aliasing. While AA is indeed possible on ATI cards, applying it is complicated compared to any in-game solution.

With all the details boiled down, it appears that NVIDIA is the one in the hotseat, as multiple sources, including developer Rocksteady, claim that the company blocked the in-game anti-aliasing code from running on non-NVIDIA cards. This, of course, enraged AMD and gamers alike. Simply changing your ATI card's reported vendor ID to match NVIDIA's would enable anti-aliasing once again, adding even more fuel to the fire.
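To see why a simple vendor-ID spoof would re-enable the option, here's a minimal sketch of what such a gate looks like. The PCI vendor IDs are real (0x10DE is NVIDIA, 0x1002 is ATI/AMD), but the function and check are purely illustrative — this is not the game's actual code, just the general shape of a vendor-ID gate:

```python
# Real PCI vendor IDs; the gating logic below is a hypothetical illustration.
NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def in_game_aa_available(adapter_vendor_id: int) -> bool:
    """Offer the in-game AA option only when the GPU reports NVIDIA's ID."""
    return adapter_vendor_id == NVIDIA_VENDOR_ID

# An ATI card reports 0x1002, so the option is hidden; a card spoofing
# 0x10DE passes the very same check, which is why the workaround worked.
```

Because the check keys off a single reported number rather than any actual hardware capability, faking that number is all it takes to flip the result.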

The story is long and complicated, but Bright Side of News*'s Theo Valich has taken an exhaustive look at the situation from various angles, and has even gathered comment from developers entirely unconnected to the game. Some have praise for NVIDIA, stating that its dedication to game developers is unparalleled. In some cases, NVIDIA has been known to provide developers not only with hardware, but with support at no cost. AMD, on the other hand, seemingly does the bare minimum.

The case has a sticking point, though. Half a year before the game's release, Rocksteady approached both AMD and NVIDIA regarding Unreal Engine 3's lack of native anti-aliasing support. NVIDIA went ahead and wrote some code, while AMD decided to focus more on DirectX 11 titles, as the company knew it would be way ahead of the curve (and it is, although we've yet to see such titles). The argument is that if NVIDIA wrote the required code, why should it let AMD's graphics cards take advantage? NVIDIA states that AMD did nothing to help with the development of AA in the title, and that AMD, therefore, is the party at fault - not NVIDIA.

Believe it or not, despite the fact that Unreal Engine 3 (mentioned earlier here) is one of the most robust engines on the market in terms of features and performance, it doesn't natively support anti-aliasing. This is evident in almost any UE3-built game, including Unreal Tournament III. Players do have the option of forcing AA in the graphics driver's control panel, but that's a less-than-elegant solution.

Not much is sure to come from this, but two things do seem to be proven. For one, Unreal Engine should include native anti-aliasing support. It's kind of absurd that the engine has been around for years and still lacks a feature that's been standard for well over a decade. Second, AMD really has to step up its game (no pun intended) when it comes to catering to game developers' needs. I've heard this first-hand from game developers in the past, so it does seem to be a real issue.


What got AMD seriously aggravated was the fact that the first step of this code is done on all AMD hardware: "'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code! So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!"
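The complaint quoted above describes a two-step AA implementation where the first (preparation) step runs on every GPU while the second (resolve) step is vendor-gated. As a hypothetical illustration only — these pass names and the function are invented, not engine code — the asymmetry looks like this:

```python
# Illustrative sketch of the quoted complaint: the AA "prepare" pass runs
# on ALL hardware (whether AA is even enabled), while the resolve pass
# only runs on vendor-approved hardware. Names are hypothetical.
def render_frame(vendor_is_nvidia: bool, aa_enabled: bool) -> list:
    passes = []
    # Step 1: write depth values into the alpha channel.
    # Per the quote, this runs unconditionally on every GPU.
    passes.append("write_depth_to_alpha")
    # Step 2: the AA resolve, gated on the vendor check.
    if aa_enabled and vendor_is_nvidia:
        passes.append("aa_resolve")
    return passes
```

The sketch makes AMD's grievance concrete: its hardware pays the cost of step 1 on every frame yet is never allowed to run step 2, so it gets the overhead of the AA path without the benefit.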


Source: Bright Side of News*
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Old 11-05-2009, 04:48 PM   #2
Doomsday
Tech Junkie
Join Date: Nov 2008
Location: KHI, PAK
Posts: 1,559

I've heard a lot about NVIDIA doing more for gamers, especially on PC, than AMD... not an NVIDIA fanboy, but if AMD didn't do anything to help out, then I won't blame the green team for supporting AA on its own cards only...
__________________
PSU: Corsair AX850 - Case: Cooler Master HAF X - CPU:Core i7-2600k - Cooler: Cooler Master V6 GT - Motherboard: Asus Z68 Maximus IV Extreme Z - Memory: Corsair Vengeance 8 GB-1600Mhz - GPU: AMD MSI R6970 Lightning - HDD: WD Caviar Black 1TB, Seagate 2TB Barracuda Green - SSD: Intel 520 Series 120GB - K/B: Razer Lycosa Mirror - Mouse: Logitech G700 - MouseMat: Steel Series 4HD - LCD: Asus VG278H 27" - Speakers: Creative Inspire M4500 4.1 - Headset: Logitech G35 7.1



"Do not look at a man's prayers nor his fasts, rather, measure him by how well he deals with others, the compassion he shows his fellow man, his wisdom and his integrity" - Umar Ibn Al-Khattab


Old 11-06-2009, 02:14 AM   #3
Rob Williams
Editor-in-Chief

It's a complicated story, because both companies continue to contradict each other, so it's hard to take a side. If NVIDIA was the one to develop the code, and AMD had no part whatsoever, then I can't say I feel too bad for AMD. But, in the end, it's unfortunate for gamers who have ATI cards in their machines, because they should have easy access to such a standard option.

I still maintain that Epic should have added native anti-aliasing support to UE3 long ago, and I find it foolish that the support isn't there even now. It's not an inexpensive engine to license, and in all regards, it's one of the most robust out there, so where on earth is the AA support?!
Old 11-06-2009, 02:14 PM   #4
Doomsday
Tech Junkie

Quote:
Originally Posted by Rob Williams
It's a complicated story, because both companies continue to contradict each other, so it's hard to take a side. If NVIDIA was the one to develop the code, and AMD had no part whatsoever, then I can't say I feel too bad for AMD. But, in the end, it's unfortunate for gamers who have ATI cards in their machines, because they should have easy access to such a standard option.

I still maintain that Epic should have added native anti-aliasing support to UE3 long ago, and I find it foolish that the support isn't there even now. It's not an inexpensive engine to license, and in all regards, it's one of the most robust out there, so where on earth is the AA support?!
Well, that is true... kinda unfair to the gamers who spent money on the game, had ATI cards, and weren't able to experience the graphics fully...
__________________
PSU: Corsair AX850 - Case: Cooler Master HAF X - CPU:Core i7-2600k - Cooler: Cooler Master V6 GT - Motherboard: Asus Z68 Maximus IV Extreme Z - Memory: Corsair Vengeance 8 GB-1600Mhz - GPU: AMD MSI R6970 Lightning - HDD: WD Caviar Black 1TB, Seagate 2TB Barracuda Green - SSD: Intel 520 Series 120GB - K/B: Razer Lycosa Mirror - Mouse: Logitech G700 - MouseMat: Steel Series 4HD - LCD: Asus VG278H 27" - Speakers: Creative Inspire M4500 4.1 - Headset: Logitech G35 7.1



"Do not look at a man's prayers nor his fasts, rather, measure him by how well he deals with others, the compassion he shows his fellow man, his wisdom and his integrity" - Umar Ibn Al-Khattab


Doomsday is offline   Reply With Quote
Old 01-06-2010, 09:19 AM   #5
killem2
Coastermaker
Join Date: Jan 2010
Posts: 220

This argument kinda got left in the dust after a shitstorm of insults from both parties. What I finally gathered from it all was this:

NVIDIA came to Eidos with the code to make AA work. Now remember, there's no such thing as an NVIDIA AA code versus an ATI AA code; AA is universal. However, NVIDIA felt that since it did the legwork, the feature should only work with its cards. AMD states that it helped too, but Eidos says only NVIDIA came forward.

Personally, I'm siding with NVIDIA here. Who knows how many past titles they've done AA coding for while AMD sat out. That said, honestly, it should be Eidos doing the legwork to make it work on both; it's their game, and if they want that feature in there, they should be putting forth the effort.

It can still be done; you just have to force it through the Catalyst drivers, I believe.
Old 01-15-2010, 07:08 PM   #6
Rob Williams
Editor-in-Chief

This was discussed in a meeting I had with NVIDIA the other day, and the company took every opportunity to defend itself and set the record straight. As far as it's concerned, it did nothing wrong, and in the end, it's merely ATI that isn't doing enough to support things like AA on its own cards. Throughout the development cycle, NVIDIA worked closely with the developer to get AA support into the game, since it wasn't supported by default (which I still consider bizarre).

I'd better stop talking about it, since I believe this is under embargo, but all I can say is that it does appear that ATI was in the wrong here, and NVIDIA has proof to back it up. I'll talk more about this soon...