ShotYourSix
Boldly going nowhere...
+196|7019|Las Vegas
https://img391.imageshack.us/img391/9612/atimeatstack4mv.jpg

This is two X1900s in CrossFire mode with a third X1900 acting as a discrete PPU (physics processing unit).  WAY too spendy, but just a taste of what's coming down the pipeline. 

Here is the text from the article:

ATI announces GPU-based physics acceleration plans
By Geoff Gasior - 9:58 AM, June 6, 2006


TAIPEI, TAIWAN — ATI used the first day of Computex to announce its strategy for GPU-based physics processing. Radeon X1000 series graphics processors will be capable of accelerating the Havok FX physics API as a part of what ATI is calling a "boundless gaming" experience. GPU-based physics acceleration is nothing new, of course; NVIDIA announced its support of Havok FX back in March. However, ATI says its approach is far superior to that of NVIDIA, in part because ATI's implementation can support three graphics cards in a single system.

ATI had a demo system running a pair of Radeon X1900s in CrossFire with a third X1900 card dedicated solely to physics processing. This configuration was appropriately referred to as the "meat stack," and while it produced silky frame rates in a number of demos, it's not the only Radeon configuration that will support GPU physics. In addition to supporting three-card configs, ATI will also allow a pair of its graphics cards to split rendering and physics between them. The graphics card dedicated to physics doesn't even need to match the other graphics card(s) in the system; for example, it's possible to run a high-end Radeon X1900 XTX crunching graphics alongside a more affordable Radeon X1600 series card for physics. In fact, ATI had a demo system set up with a pair of Radeon X1900s in CrossFire and a Radeon X1600 XT accelerating the Havok FX physics API.

With support for three-card configurations and no need to match cards used for graphics and physics, ATI looks to have the most flexible Havok FX acceleration implementation. ATI also claims to have a significant performance advantage when it comes to GPU-based physics acceleration, citing the Radeon X1000 series' ample shader processing power, efficient dynamic branching, and fine-grained threading. Of course, the first games to use Havok FX aren't expected until later this year. Havok FX isn't exactly comparable to what Ageia's doing with hardware physics acceleration, either; Havok FX is limited to "effects physics" that don't affect gameplay, while Ageia's PhysX PPU has no such limitation.
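
To make the "effects physics" bit in that last paragraph concrete, here's a rough sketch of the split (every name in it is made up for illustration and has nothing to do with the real Havok FX or PhysX APIs): gameplay physics feeds back into the game state, so every machine has to compute the same result, while effects physics only feeds what gets drawn, which is why it's safe to shove onto a spare GPU.

```cpp
#include <cstdio>
#include <vector>

struct Crate    { float x, y, vy; };      // gameplay object: players can stand on it
struct Particle { float x, y, vx, vy; };  // effects object: purely cosmetic debris

// Gameplay physics: the results change what happens in the game, so every
// machine in a multiplayer match must compute exactly the same thing.
void simulateGameplay(std::vector<Crate>& crates, float dt) {
    for (auto& c : crates) {
        c.vy -= 9.8f * dt;
        c.y  += c.vy * dt;
        if (c.y < 0.0f) { c.y = 0.0f; c.vy = 0.0f; }
    }
}

// Effects physics: visual only, so it can run on a spare GPU (or not at all)
// and even differ between machines without changing the outcome of the game.
void simulateEffects(std::vector<Particle>& debris, float dt) {
    for (auto& p : debris) {
        p.vy -= 9.8f * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
    }
}

int main() {
    std::vector<Crate>    crates{{0.0f, 10.0f, 0.0f}};
    std::vector<Particle> debris{{0.0f, 5.0f, 1.0f, 2.0f}, {0.0f, 5.0f, -1.0f, 2.0f}};
    const float dt = 1.0f / 60.0f;
    for (int frame = 0; frame < 60; ++frame) {
        simulateGameplay(crates, dt);  // has to stay consistent for everyone
        simulateEffects(debris, dt);   // fair game for GPU offload
    }
    std::printf("crate y after 1s: %.2f, debris particles: %zu\n", crates[0].y, debris.size());
}
```

Only the second kind is what the article says Havok FX handles on the GPU; the first kind is where Ageia's PPU has no such limitation.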
As an X1900XT owner, I find it pretty interesting to think that when I replace it with a new X???? one day, I could simply drop it down to the next PCIe slot, use it as a physics processor, and let the new card take over GPU duties.  Pretty amazing where tech is currently heading in the graphics department.

It's still early to start speculating on whether this will really pan out, as ATI is said to be still working on drivers to support it, but things are definitely heading in the right direction.

This could spell doom for Ageia's PhysX: why would you buy a dedicated PPU when you could just upgrade your GPU and use the old one?  Ageia had better get on the ball if they wish to combat this option.

Last edited by ShotYourSix (2006-07-11 10:43:56)

ssonrats
Member
+221|6945
What FPS do you get on BF2? lol

Last edited by ssonrats (2006-07-11 10:44:34)

DoNER90
Member
+14|6817|South Africa
I'd rather get a PhysX card, as you need a MASSIVE PSU to do this, but nice find.
ShotYourSix
Boldly going nowhere...
+196|7019|Las Vegas
That is not my pic.  It was taken at Computex a month or so ago.  As far as I know, the drivers to support this tech are still under development.  I'm also not sure if Havok has released its Havok FX engine, which will be needed to support this arrangement.  The Havok site states a release date of Q2 2006 for Havok FX, so I would assume we will be hearing a lot more about this subject soon.
ShotYourSix
Boldly going nowhere...
+196|7019|Las Vegas

DoNER90 wrote:

I'd rather get a PhysX card, as you need a MASSIVE PSU to do this, but nice find.
Agreed on the PSU. 

Actually, according to AnandTech, the PhysX cards are REDUCING framerates in PhysX-enabled games.  It seems they are placing an extra burden on the CPU.  I just can't see spending $300 on something that will reduce my FPS.  You can read more about the subject here:


http://www.anandtech.com/showdoc.aspx?i=2759&p=1

Last edited by ShotYourSix (2006-07-11 10:52:29)

Cybargs
Moderated
+2,285|7016

ShotYourSix wrote:

DoNER90 wrote:

I'd rather get a PhysX card, as you need a MASSIVE PSU to do this, but nice find.
Agreed on the PSU. 

Actually, according to AnandTech, the PhysX cards are REDUCING framerates in PhysX-enabled games.  It seems they are placing an extra burden on the CPU.  I just can't see spending $300 on something that will reduce my FPS.  You can read more about the subject here:


http://www.anandtech.com/showdoc.aspx?i=2759&p=1
No, it's an extra burden on the GPU... they found out it's driver related. I'm not surprised by this, since more particles will stress the GPU.

But don't you think that if you use another 1900 XT, it will also drop the FPS? And you need a big ass power supply to run three 1900 XTs.
ShotYourSix
Boldly going nowhere...
+196|7019|Las Vegas
Read the AnandTech link in the post above.  The driver didn't make any substantial improvement in FPS (look at the charts). 

And no, I don't think adding another GPU as a PPU will decrease FPS the way the PhysX card does.  In the case of Ghost Recon (one of the only games out that supports PhysX), you are adding tons of particle effects.  That's the whole point of the PhysX card: adding more particles to explosions, etc.  For one reason or another, this is resulting in reduced FPS. 

When using two GPUs in combo, one for graphics and one for physics, you are not adding all the extra PhysX effects (which are what's dropping the FPS).  You are simply offloading physics calculations from the CPU to the second GPU, which will (in theory) increase FPS.  Kinda hard to explain clearly.
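
Maybe a back-of-the-envelope example helps (the millisecond numbers below are completely made up, just to show the logic): frame rate is roughly limited by whichever of the CPU or GPU takes longer each frame, so moving existing work off the slowest stage helps, while piling brand-new effects work onto the GPU can hurt.

```cpp
#include <algorithm>
#include <cstdio>

// Frame rate is roughly capped by whichever of the CPU or GPU takes
// longer per frame (assuming the two can work in parallel).
static float fps(float cpu_ms, float gpu_ms) {
    return 1000.0f / std::max(cpu_ms, gpu_ms);
}

int main() {
    // Baseline: one GPU, the CPU handles both game logic and physics.
    std::printf("baseline:      %.0f fps\n", fps(6.0f + 5.0f, 9.0f));

    // Old card dropped in as a PPU: the same physics work moves off the CPU
    // and nothing extra gets drawn, so the slowest stage speeds up -> FPS up.
    std::printf("GPU as PPU:    %.0f fps\n", fps(6.0f, 9.0f));

    // PhysX card in a PhysX-enabled game: physics leaves the CPU too, but the
    // whole point is extra debris and particles, which the GPU then has to
    // render on top of everything else -> FPS can actually drop.
    std::printf("PPU + effects: %.0f fps\n", fps(6.0f, 9.0f + 4.0f));
}
```

Obviously the real numbers depend entirely on the game and the drivers, but that's the gist of why one setup should help while the other can hurt.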

We really won't know for sure until some of the tech sites get some hands-on time and produce solid numbers. Ageia may get the PhysX drivers up to snuff, but the driver release that was supposed to fix everything had little to no effect.


Edit: As for the PSU, I don't expect many will run 3 GPUs (that's just ATI showing off).  Most will wait until they replace their GPU and just drop the old one down as a PPU.  Any good 500 W PSU should be just fine for this.

Last edited by ShotYourSix (2006-07-11 11:12:18)

SniperF0x
Member
+49|6825
Wow, what a waste, and I thought 2 GPUs were lame.
Cybargs
Moderated
+2,285|7016
Well, wouldn't that be a waste, to have 3 cards? The PhysX card does decrease FPS... but we will see. I think it's too early for physics cards right now.
BlaZin'Feenix
I'm just that good
+156|6919|Cork, Ireland
Hehe, BF2 would probably run so well, you would get lag because the game couldn't keep up with the machine!
semerkhet83
Member
+1|6800|Four Marks, UK
PPUs may not be worth buying YET, but give it 6-12 months and a chance for developers to get used to using them, and we're gonna get some damn good games... Also, the advantage of using a GPU as a PPU is that developers will get their tech support from the graphics companies, like they have for a long time already, so we may see physics support appear quicker in those games, and those cards will already have a larger user base compared to the people who have, or are going to get, the Ageia PhysX card.
And the ATI setup looks impressive: when they demonstrated the cards, they managed to calculate about double the sphere-to-sphere collisions that the Ageia card can, at least that's what Custom PC mag said.
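
For reference, a sphere-to-sphere collision test is about the simplest unit of physics work you can count: two spheres touch when the distance between their centers is less than the sum of their radii. A minimal sketch (hypothetical struct, not taken from either company's SDK):

```cpp
#include <cstdio>

struct Sphere { float x, y, z, r; };

// Two spheres overlap when the distance between their centers is less than
// the sum of their radii; comparing squared distances avoids the square root.
bool collides(const Sphere& a, const Sphere& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float sumR = a.r + b.r;
    return dx * dx + dy * dy + dz * dz < sumR * sumR;
}

int main() {
    Sphere a{0, 0, 0, 1}, b{1.5f, 0, 0, 1}, c{3, 0, 0, 1};
    std::printf("a-b touching: %d, a-c touching: %d\n", collides(a, b), collides(a, c));
}
```

The comparison is presumably just how many of those checks each card can churn through in a given amount of time.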
Either way, it's looking good for us.
BigglesPiP
Whirlybird Guy
+20|6848|Windermere, GB
Good ol' ATI: "Sod Ageia, we can use our existing cards for physics".

I read about it a while ago.

Won't make a difference to BF2. It's not programmed for physics cards. And it'll be years before physics cards work in multiplayer: EVERYONE in the server would have to have one.
Snipedya14
Dont tread on me
+77|6995|Mountains of West Virginia
I'm still waiting for all games to take advantage of dual core.
BigglesPiP
Whirlybird Guy
+20|6848|Windermere, GB

Snipedya14 wrote:

I'm still waiting for all games to take advantage of dual core.
BF2 is dual core. Both cores of my Pentium D run between 90 and 100%.
