AGEIA PhysX.. First Thoughts

Rob Williams

Editor-in-Chief
Staff member
Moderator
AGEIA PhysX.. First Impressions

"Last week at the GDC, the buzz was all around us. At hardware and software booths alike. I was able to see the Ageia PhysX card in action as well as sample ATI and NVIDIA’s approach to physics and gaming."

After reading Greg's report here, discuss it here!
 

Greg King

I just kinda show up...
Staff member
One is available for download at the end of the article and the other one should be there within the next hour or so.
 

Jakal

Tech Monkey
This chip looks like an excellent addition to aid performance. If it works well, it'll tremendously reduce cpu usage for physical interactions within the game. That'll mean higher framerates and better graphics. I love it!
 

supramax

Obliviot
Jakal said:
This chip looks like an excellent addition to aid performance. If it works well, it'll tremendously reduce cpu usage for physical interactions within the game. That'll mean higher framerates and better graphics. I love it!

It's possible to use the graphics processor for more than graphics. I saw an article online about a company that uses graphics processors to help solve complex mathematical problems. They daisy-chain hundreds of graphics processors across different machines and use the CPU and GPU in combination to increase the number of mathematical computations they can do for whatever they are working on.
 

Fr00zen

Obliviot
Jakal said:
This chip looks like an excellent addition to aid performance. If it works well, it'll tremendously reduce cpu usage for physical interactions within the game. That'll mean higher framerates and better graphics. I love it!

I am more interested in more special effects and higher quality graphics. Frame rate is usually already high enough for most games. Besides, the human eye can't distinguish between high frame rates anyway.
 

Greg King

I just kinda show up...
Staff member
Theoretically, the Ageia PPU will add to the GPU's workload. While it takes a lot of load off the CPU, it also makes many more objects available to interact with, which means more objects that have to be rendered. Collisions will be handled by the PPU, but while objects are in motion, the GPU will have to render all of the lighting and shadows relative to where they are on the screen.
 
U

Unregistered

Guest
Well, I'm also very excited about a hardware PPU. But I'm old enough to have been one of the first adopters of the original 3dfx add-on GPU. As soon as the major graphics card producers got into that area, 3dfx quickly became obsolete and incompatible. Perhaps if they maintain DirectX compatibility, this won't happen quite so fast with the PhysX chip, but I'll bet these first add-on PPU cards will be relatively short-lived.

My guess is that within a couple of years, no one will be writing games for the PhysX and we'll all be using NVIDIA or ATI graphics cards with integrated PPUs.

However, I'll still buy an AGEIA card as soon as the apps are there to support it. But I'll do so knowing that it's probably a short-term solution.

Anode
 

Greg King

I just kinda show up...
Staff member
Anode,

This, I hope, just won't be the case with the PhysX PPU. The approach that nVidia and ATi are taking at the moment is completely different from the direction Ageia is leaning. They want to offload the physics work onto one of the GPUs, but as stated earlier, I spoke with some game designers at the GDC, and they were not too keen on the idea of giving up GPU performance to code for physics; they liked the way Ageia is going with a standalone card. I, as a gamer and end user, don't really want to sacrifice any of my SLI performance to incorporate physics. Now, I can understand how nVidia and ATi are justifying this to themselves, because not everyone will be able to afford a PPU when they first ship, so the idea of getting much of the benefit without dropping another $300 on the PhysX card will be appealing. That just isn't the case for me, though, and it sounds like it isn't for you either. We are tech heads and want what's new while it's still new...

Thanks for your input and feel free to stick around.
 
U

Unregistered

Guest
DarkSynergy said:
Anode,

This, I hope, just won't be the case with the PhysX PPU. The approach that nVidia and ATi are taking at the moment is completely different from the direction Ageia is leaning. They want to offload the physics work onto one of the GPUs, but as stated earlier, I spoke with some game designers at the GDC, and they were not too keen on the idea of giving up GPU performance to code for physics; they liked the way Ageia is going with a standalone card. I, as a gamer and end user, don't really want to sacrifice any of my SLI performance to incorporate physics. Now, I can understand how nVidia and ATi are justifying this to themselves, because not everyone will be able to afford a PPU when they first ship, so the idea of getting much of the benefit without dropping another $300 on the PhysX card will be appealing. That just isn't the case for me, though, and it sounds like it isn't for you either. We are tech heads and want what's new while it's still new...

Thanks for your input and feel free to stick around.

I certainly agree with you that a dedicated PPU is the only thing that makes any sense. I would also be opposed to giving up one clock cycle of my expensive SLI setup to another process. But as you also know, it's often not the best hardware that prevails. Remember Aureal A3D? I also bought into that technology, because it was (and still is) far superior to EAX in almost every way. A3D was, in a sense, the audio equivalent of a dedicated PPU: it provided unique capabilities for superior sound placement, reflections, and muffling effects through walls. Unfortunately, Creative put them out of business before they could transition much of the calculation to a dedicated sound processor.

In my opinion, it's clear that the future of PC gaming is in the implementation of dedicated multiprocessors. So I hope you are right and the PhysX PPU (or similar dedicated hardware) prevails against the big two graphics manufacturers.

Anode
 

Buck-O

Coastermaker
Wow, some of the replies in this thread boggle my mind.

Here's my take, and I'm sure there will be plenty of backlash from it... but that's kinda the idea anyway. ;)


So, does the world need a PPU? If this were 1999, the answer would be a resounding YES.

But because this is 2006, the answer is a definite NO. And here's why.

Current processor technology is moving away from the idea of a single-core architecture and will eventually move exclusively to dual-core at minimum. Eventually we will see quads, and perhaps even a six-pack on-die layout for the enterprise folks.

So what does that have to do with a PPU?
Well, the whole purpose of a PPU is negated by a dual core. If you program a game properly, you can have one core doing the game's standard calculations (i.e. all that a single processor bears now) and set a secondary thread of code on the sister core to do nothing but physics. Not only would it be faster than the PhysX PPU, it would also cost a heck of a lot less for the consumer, be much easier to program for, and be less of a cost hit for developers, who wouldn't have to pay licensing for Ageia's PhysX engine. And with Vista on the horizon, with full support for SMP and 64-bit processing, there will be no excuse not to do this.

Plain and simple, Ageia is selling snake oil, in my opinion, and is attempting to market a technology that would have best suited gamers' needs YEARS ago. Currently, the technology offers nothing in the way of a direct performance increase that couldn't easily be recreated on a dual-core CPU.

Think about it for a while, and ask yourself the question, "Is it really worth that extra $300?"

The answer may surprise you.
 
U

Unregistered

Guest
PhysX

Actually, if I read the article right, Ageia is giving developers the license and making its money off the cards!

Yes, dual processors are becoming popular, but games are also getting much, much more complex.

I have been jumping around the game forums, and a lot of people are complaining that they want better AI, bigger maps, and more players per map.

And if you go to the Age 3 web forums, there is a thread called "Ask Sandy" where a developer was asked:

16) will there be much much much larger maps in the expansion pack
A – only if we find out suddenly that everyone’s computers are much much much more powerful. Unlike my answers, the map sizes were not chosen arbitrarily
http://forum.agecommunity.com/ibb/posts.aspx?postID=179228&postRepeater1-p=3

I will take a separate card!!!!!
 

Buck-O

Coastermaker
Unregistered said:
Actually, if I read the article right, Ageia is giving developers the license and making its money off the cards!

Yes, dual processors are becoming popular, but games are also getting much, much more complex.

I have been jumping around the game forums, and a lot of people are complaining that they want better AI, bigger maps, and more players per map.

And if you go to the Age 3 web forums, there is a thread called "Ask Sandy" where a developer was asked:

16) will there be much much much larger maps in the expansion pack
A – only if we find out suddenly that everyone’s computers are much much much more powerful. Unlike my answers, the map sizes were not chosen arbitrarily
http://forum.agecommunity.com/ibb/posts.aspx?postID=179228&postRepeater1-p=3

I will take a separate card!!!!!

Good grief, do you work for Ageia... or an advertising company contracted by Ageia? I'm sorry, but I see very little relevance in this post, and even less actual information that refutes any of the claims I've made. All I see is happy spin on a product that doesn't live up to the hype its marketing has built. As I said in my initial post, 90% of the unregistered user posts in this thread sound fishy, WAY too positive, and I think they are highly suspect as plants. And I think Rob should do everything in his ability to investigate those posts, because as far as I'm concerned, they are bogus.

But I'll certainly take on every point you've attempted to make.

Great, that's fantastic that they are handing over their physics engine for free. It's about the only way any developer would be stupid enough to code for it. But even then, getting handed an SDK for free probably comes with a lot of marketing dollars behind it. Because, again, the PhysX card does not make sense. It's like having a 4WD car and being told that using 4WD is pointless, but that bolting on a special bumper attachment, with a wheezy engine and two additional drive wheels, will make it a better performer. Huh? Why not just use the 4WD in the first place and forget that little attachment ever existed? Because marketing wants you to think otherwise.

Either way, by virtue of handing out the SDK for free and making such a large public spectacle out of it, if a company were to produce a PROPER SMP-based physics engine that could fully utilize a dual core and make better use of resources than a PhysX card could (which, again, wouldn't be difficult given the speed of the PCI bus), Ageia would have a lawsuit all over them for patent infringement and copying of their intellectual property. So really, in the end, all Ageia will turn out to be is a marketing company with a few big patents for a technology that is outdated by modern hardware standards and could easily be coded around.

The best bit is that you debunked your own last point with the very answer you quoted to support it.

Adding a PhysX card to your computer won't do anything to allow for larger levels, more players, or wider expansiveness of the levels. His answer: "only if we find out suddenly that everyone’s computers are much much much more powerful".

Simply plopping in a PhysX card DOES NOTHING TO REMEDY THIS! All it can do is enable some eye candy that your video card may or may not be able to render fully at a reasonable frame rate.

Because, as he said, the limiting factor is the power of the computer. Not the video card, not the PhysX card... the WHOLE COMPUTER. If you've only got 512MB of RAM, a PhysX card won't make Battlefield 2 run any better on your system.

If you have an old Athlon XP 1800+, Quake4 will not run any faster or look any better with a PhysX card.

In fact, the only performance segment where a PhysX card would be worth its while is HIGH-END systems. And most entry-level high-end systems employ dual-core CPUs, at which point proper coding would make the PhysX card completely worthless, because the data could be calculated faster, sent to the video card quicker, and run with less overall system overhead (no crappy legacy PCI bus constraint to get in the way) by using SMP to dedicate one of the cores to nothing but physics calculation.

But even then, the game has to be coded around the lowest common denominator, with eye candy thrown on top for those with greater horsepower behind them. This has been, and always will be, the modus operandi of developers around the world, and the PhysX card will never change that, regardless of how much their marketing might try to make you believe otherwise.

Not that it makes any difference, though, considering all of this is just falling on deaf ears anyway.
 

Greg King

I just kinda show up...
Staff member
Yes!

As will I. There will have to be a great game, or two or three, at launch for it to take off, but I feel that Ageia's separate slot approach is heading down the right path.

And you have read correctly: the SDK is being given away for all to design games with, and the money will be made on the cards themselves. They are somewhat at the mercy of how well games are designed to take advantage of the PPU, but they have some heavy hitters on board, so I am not worried too much about that.

Give me my dedicated card and keep the GPU clock cycles to the games!

You can argue that dual core will be the ultimate coup de grâce for the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics. You can argue that you only need one core for in-game physics, but I can argue that with a dedicated PPU, the entire processing power of the CPU can be used to the full advantage of the gaming end user. We could both be wrong; only time will tell. Stay tuned, as I have a PPU on the way for review, and I will let you all know what I find. If it sucks in real time, then I will let you know, but if it rules, you will know that as well.
 

Greg King

I just kinda show up...
Staff member
Buck-O said:
Good grief, do you work for Ageia... or an advertising company contracted by Ageia? I'm sorry, but I see very little relevance in this post, and even less actual information that refutes any of the claims I've made. All I see is happy spin on a product that doesn't live up to the hype its marketing has built. As I said in my initial post, 90% of the unregistered user posts in this thread sound fishy, WAY too positive, and I think they are highly suspect as plants. And I think Rob should do everything in his ability to investigate those posts, because as far as I'm concerned, they are bogus.

But I'll certainly take on every point you've attempted to make.

Great, that's fantastic that they are handing over their physics engine for free. It's about the only way any developer would be stupid enough to code for it. But even then, getting handed an SDK for free probably comes with a lot of marketing dollars behind it. Because, again, the PhysX card does not make sense.

Either way, by virtue of handing out the SDK for free and making such a large public spectacle out of it, if a company were to produce a PROPER SMP-based physics engine that could fully utilize a dual core and make better use of resources than a PhysX card could (which, again, wouldn't be difficult given the speed of the PCI bus), Ageia would have a lawsuit all over them for patent infringement and copying of their intellectual property. So really, in the end, all Ageia will turn out to be is a marketing company with a few big patents for a technology that is outdated by modern hardware standards and could easily be coded around.

The best bit is that you debunked your own last point with the very answer you quoted to support it.

Adding a PhysX card to your computer won't do anything to allow for larger levels, more players, or wider expansiveness of the levels. His answer: "only if we find out suddenly that everyone’s computers are much much much more powerful".

Simply plopping in a PhysX card DOES NOTHING TO REMEDY THIS! All it can do is enable some eye candy that your video card may or may not be able to render fully at a reasonable frame rate.

Because, as he said, the limiting factor is the power of the computer. Not the video card, not the PhysX card... the WHOLE COMPUTER. If you've only got 512MB of RAM, a PhysX card won't make Battlefield 2 run any better on your system.

If you have an old Athlon XP 1800+, Quake4 will not run any faster or look any better with a PhysX card.

In fact, the only performance segment where a PhysX card would be worth its while is HIGH-END systems. And most entry-level high-end systems employ dual-core CPUs, at which point proper coding would make the PhysX card completely worthless, because the data could be calculated faster, sent to the video card quicker, and run with less overall system overhead (no crappy legacy PCI bus constraint to get in the way) by using SMP to dedicate one of the cores to nothing but physics calculation.

But even then, the game has to be coded around the lowest common denominator, with eye candy thrown on top for those with greater horsepower behind them. This has been, and always will be, the modus operandi of developers around the world, and the PhysX card will never change that, regardless of how much their marketing might try to make you believe otherwise.

Not that it makes any difference, though, considering all of this is just falling on deaf ears anyway.

I agree with much of what you say, but I am also optimistic to see, with the right backing, what Ageia can accomplish. Also, to say that your words "fall on deaf ears" is a bit arrogant, don't you think? You make some valid points, I will admit that, but writing off an entirely new concept before it has even launched to the public at large seems premature to me.

I can also admit that the PPU is geared toward users with high-end systems, but then again, so are SLI and CrossFire, and both of those concepts seem to be doing quite well.

I am more excited about the direction that Ageia is trying to take the industry, rather than where they might or might not actually take it.

No I do not work for them.
 

Buck-O

Coastermaker
DarkSynergy said:
Yes!

As will I. There will have to be a great game, or two or three, at launch for it to take off, but I feel that Ageia's separate slot approach is heading down the right path.

And you have read correctly: the SDK is being given away for all to design games with, and the money will be made on the cards themselves. They are somewhat at the mercy of how well games are designed to take advantage of the PPU, but they have some heavy hitters on board, so I am not worried too much about that.

Give me my dedicated card and keep the GPU clock cycles to the games!

You can argue that dual core will be the ultimate coup de grâce for the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics. You can argue that you only need one core for in-game physics, but I can argue that with a dedicated PPU, the entire processing power of the CPU can be used to the full advantage of the gaming end user. We could both be wrong; only time will tell. Stay tuned, as I have a PPU on the way for review, and I will let you all know what I find. If it sucks in real time, then I will let you know, but if it rules, you will know that as well.

Well, at least this is a post from a respected user...

But again, i fail to see the merit.

"I feel that Ageia's separate slot approach is heading down the right path."

How do you figure? Like I said, it's like adding something extra that doesn't need to be there, and a technology that is two years late and $200 short of being impressive.

"Give me my dedicated card and keep the GPU clock cycles to the games!"

I'm assuming this is a Freudian slip and you actually mean CPU. The GPU does nothing in terms of physics for the game; that's all handled by the CPU. However, the dirty little secret here is that any and all data processed by the PhysX card still has to be rendered by the GPU. If your GPU can't hack it, you still won't see any marked improvement in gameplay. All you will be is a frustrated consumer who just wasted $200 on something he can't get the full benefit of.

"You can argue that dual core will be the ultimate coup de grâce for the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics."

Again, they have to program to the lowest common denominator, making the effectiveness of coding for dual core AND PhysX very dubious at best. And as it stands right now, dual-core implementation in games has been very poor, because dedicated offloading to an individual CPU has been very limited by how well the game's kernel supports multiple threads. Considering that PhysX is essentially a separate entity, offloading it to a secondary CPU would be very simple to implement, even at a user level, within Windows. In fact, I wouldn't be at all surprised if, in the coming months as Ageia-supported games are released, we see hacks that allow the PhysX computations to be offloaded directly to a second core, in its own thread.

"Stay tuned as I have a PPU on the way for review and I will let you all know what is found out. If it sucks in real time, then I will let you know but if it rules, you will know that as well"

Believe me, I'm ripe with anticipation. However, I want to see results in real games, with real advantages, and no special demos to oooh and ahhh at. Because right now, the marketing is about the only thing I see.
 

Greg King

I just kinda show up...
Staff member
There was no slip; the nVidia and ATi approach is to let the GPU take care of the physics load. I talked to a few developers at the GDC about that approach, and they want all the GPU power they can get.

You and I both agree on the demands placed on the GPU. Every little item on the screen that the PPU allows the user to interact with must still be rendered by the GPU, making the load on said GPU that much greater. I was told by Ageia that there really aren't any minimum requirements for using the PPU, but rather minimum requirements for the game itself. The bottleneck will be the GPU in such instances.

I am truly waiting with bated breath for this card, but to discredit a concept before it has even launched is naive at best. These aren't canned "demos" that we have seen; I was actually able to play Cell Factor in real time and interact with all the objects on the screen. The future is wide open for Ageia... it's just up to them to choose the right path.

They are marketing this like they should, so again, you and I agree on this. One can argue against anything and make at least some valid points. I am not saying, by any stretch of the imagination, that I am right, but I will not let you pick apart what I say and then concede that you have been right all along. I see you and me as the same voice, only on different ends of the spectrum: I am for seeing the good, and you only the bad. Debates like this, though, will benefit the end users. End users who do not have the systems that you and I have. End users who want this card but do not know what it is all about. I will give my honest opinion of the PPU in my upcoming review, an opinion that will reflect how I feel about the card and not how Ageia wants me to feel.

I love a good debate so keep it coming. I still think that you and I are only feeding off of each other. :D




Buck-O said:
Well, at least this is a post from a respected user...

But again, I fail to see the merit.

"I feel that Ageia's separate slot approach is heading down the right path."

How do you figure? Like I said, it's like adding something extra that doesn't need to be there, and a technology that is two years late and $200 short of being impressive.

"Give me my dedicated card and keep the GPU clock cycles to the games!"

I'm assuming this is a Freudian slip and you actually mean CPU. The GPU does nothing in terms of physics for the game; that's all handled by the CPU. However, the dirty little secret here is that any and all data processed by the PhysX card still has to be rendered by the GPU. If your GPU can't hack it, you still won't see any marked improvement in gameplay. All you will be is a frustrated consumer who just wasted $200 on something he can't get the full benefit of.

"You can argue that dual core will be the ultimate coup de grâce for the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics."

Again, they have to program to the lowest common denominator, making the effectiveness of coding for dual core AND PhysX very dubious at best. And as it stands right now, dual-core implementation in games has been very poor, because dedicated offloading to an individual CPU has been very limited by how well the game's kernel supports multiple threads. Considering that PhysX is essentially a separate entity, offloading it to a secondary CPU would be very simple to implement, even at a user level, within Windows. In fact, I wouldn't be at all surprised if, in the coming months as Ageia-supported games are released, we see hacks that allow the PhysX computations to be offloaded directly to a second core, in its own thread.

"Stay tuned, as I have a PPU on the way for review, and I will let you all know what I find. If it sucks in real time, then I will let you know, but if it rules, you will know that as well."

Believe me, I'm ripe with anticipation. However, I want to see results in real games, with real advantages, and no special demos to oooh and ahhh at. Because right now, the marketing is about the only thing I see.
 