PuckMercury
6 x 9 = 42
+298|6827|Portland, OR USA

DoctorFruitloop wrote:

fon|sl4y3r wrote:

Havoc
I didn't say it wasn't there, just that I didn't see it.
DoctorFruitloop
Level 13 Wrongdoer
+515|6846|Doncaster, UK
I'll let you off
PuckMercury
6 x 9 = 42
+298|6827|Portland, OR USA
shouldn't this all be in tech anyway? </change topic from myself>
DoctorFruitloop
Level 13 Wrongdoer
+515|6846|Doncaster, UK
Hell yeah!

Thread movement request please mods...
=TFF=Omen_NataS
Member
+60|6840
Thank you very much for all your help. [You all are the coolest people I don't know, LOL]
ShotYourSix
Boldly going nowhere...
+196|7019|Las Vegas
Actually, the Havok technology mentioned earlier is not hardware based like PhysX is. The Havok solution (or HavokFX Engine) is simply a software package which game developers can purchase for the games they make. It enables your CPU to offload certain physics calculations to your video card, thus freeing up CPU cycles for other things (this is particularly useful for games which are CPU limited).

The HavokFX Engine should not be confused with Havok Physics, though. Havok Physics has been around for quite some time and is widely used in games today. The HavokFX Engine was slated for release in the second quarter of this year, so it will be a while before any game developers can purchase and implement this technology. HavokFX (and, as far as I know, every other Havok product) is sold to game developers and not to gamers, so it does not directly compete with Ageia's PhysX cards.
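To make the "offload" idea a bit more concrete, here is a rough C++ sketch of how a game loop might split the work. This is NOT the real Havok FX API; GpuEffectsAccelerator and everything else here are made-up names purely for illustration. The point is just that the gameplay-critical physics stays on the CPU while the purely cosmetic stuff (debris, particles) gets handed off, which is where the "freeing up CPU cycles" claim comes from.

Code:

#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Hypothetical stand-in for "run this simple simulation on the video card".
// A real middleware package would upload the data and run a shader instead.
class GpuEffectsAccelerator {
public:
    void simulateDebris(std::vector<Particle>& debris, float dt) {
        for (Particle& p : debris) {
            p.vy -= 9.8f * dt;                              // gravity
            p.x += p.vx * dt; p.y += p.vy * dt; p.z += p.vz * dt;
        }
    }
};

// Gameplay physics the game rules actually depend on -- stays on the CPU.
void simulateGameplayPhysics(float dt) {
    (void)dt; // players, vehicles, hit detection, etc. would go here
}

int main() {
    GpuEffectsAccelerator gpu;
    std::vector<Particle> debris(10000, {0.0f, 1.0f, 0.0f, 1.0f, 5.0f, 0.0f});
    const float dt = 1.0f / 60.0f;

    for (int frame = 0; frame < 3; ++frame) {
        simulateGameplayPhysics(dt);    // CPU: results feed back into game state
        gpu.simulateDebris(debris, dt); // "GPU": thousands of cosmetic particles
    }
    std::printf("first debris particle is now at y = %.2f\n", debris[0].y);
    return 0;
}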

You can read a bit more on the HavokFX Engine here:
http://www.anandtech.com/showdoc.aspx?i=2585&p=1
and the Havok website also offers a lot of useful information.

Contrary to what someone said earlier, graphics cards ARE well suited for physics calculations.  In fact both ATI and Nvidia have already announced plans to use their GPU technology for this (though they each approach the problem in different ways).

In this article...
http://techreport.com/ja.zz?id=127692

...we see the ATI "Meat Stack"
https://img514.imageshack.us/img514/8118/atimeatstack9gi.jpg

ATI announces GPU-based physics acceleration plans
By Geoff Gasior - 9:58 AM, June 6, 2006


TAIPEI, TAIWAN — ATI used the first day of Computex to announce its strategy for GPU-based physics processing. Radeon X1000 series graphics processors will be capable of accelerating the Havok FX physics API as a part of what ATI is calling a "boundless gaming" experience. GPU-based physics acceleration is nothing new, of course; NVIDIA announced its support of Havok FX back in March. However, ATI says its approach is far superior to that of NVIDIA, in part because ATI's implementation can support three graphics cards in a single system.

ATI had a demo system running a pair of Radeon X1900s in CrossFire with a third X1900 card dedicated solely to physics processing. This configuration was appropriately referred to as the "meat stack," and while it produced silky frame rates in a number of demos, it's not the only Radeon configuration that will support GPU physics. In addition to supporting three-card configs, ATI will also allow a pair of its graphics cards to split rendering and physics between them. The graphics card dedicated to physics doesn't even need to match the other graphics card(s) in the system; for example, it's possible to run a high-end Radeon X1900 XTX crunching graphics alongside a more affordable Radeon X1600 series card for physics. In fact, ATI had a demo system set up with a pair of Radeon X1900s in CrossFire and a Radeon X1600 XT accelerating the Havok FX physics API.

With support for three-card configurations and no need to match cards used for graphics and physics, ATI looks to have the most flexible Havok FX acceleration implementation. ATI also claims to have a significant performance advantage when it comes to GPU-based physics acceleration, citing the Radeon X1000 series' ample shader processing power, efficient dynamic branching, and fine-grained threading. Of course, the first games to use Havok FX aren't expected until later this year. Havok FX isn't exactly comparable to what Ageia's doing with hardware physics acceleration, either; Havok FX is limited to "effects physics" that don't affect gameplay, while Ageia's PhysX PPU has no such limitation.

As you can see, this tech is far from being mainstream (or even ready for primetime, for that matter). It will be a while yet before we see how these different approaches (PhysX / ATI / Nvidia-HavokFX) pan out and which will ultimately become the best choice, though most agree that at this point in the game, those who purchase an Ageia PhysX card are not getting much "value" for their money. When I first read about PhysX I swore I would have one on release day. The more I read though, the more I decided to wait and see.

If I were a betting man though (and I am) I'd put my money on ATI's plan.
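
One thing worth spelling out from the TechReport quote above is what "effects physics that don't affect gameplay" actually means in practice. Roughly speaking (again just an illustrative C++ sketch with invented names, not anyone's real engine code): effects objects are simulated purely for looks and the game logic never reads their results back, while gameplay objects have to be read back because they can block players, deal damage, and so on. That one-way data flow is what makes effects physics such a natural fit for a graphics card, and it's also why Havok FX alone can't replace what Ageia is promising.

Code:

#include <cmath>
#include <cstdio>
#include <vector>

struct RigidBody { float x, y, z; bool blocksPlayers; };

struct GameState {
    float playerX = 0.0f;
    int   playerHealth = 100;
};

// Gameplay physics: its output is read back and changes the game's rules.
void stepGameplayPhysics(const std::vector<RigidBody>& bodies, GameState& state) {
    for (const RigidBody& b : bodies) {
        if (b.blocksPlayers && std::fabs(b.x - state.playerX) < 0.5f) {
            state.playerHealth -= 10;   // e.g. a physics-driven crate landed on the player
        }
    }
}

// Effects physics: simulated purely for looks. Results go straight to the
// renderer; game logic never reads them, so the whole step could live on a GPU.
void stepEffectsPhysics(std::vector<RigidBody>& debris, std::vector<float>& renderBuffer) {
    renderBuffer.clear();
    for (RigidBody& d : debris) {
        d.z -= 0.1f;                    // fake gravity on cosmetic debris
        renderBuffer.push_back(d.z);    // one-way: simulation -> draw
    }
}

int main() {
    GameState state;
    std::vector<RigidBody> gameplayBodies = {{0.0f, 0.0f, 0.0f, true}};
    std::vector<RigidBody> debris(100, {0.0f, 0.0f, 5.0f, false});
    std::vector<float> renderBuffer;

    stepGameplayPhysics(gameplayBodies, state); // feeds back into game state
    stepEffectsPhysics(debris, renderBuffer);   // write-only, cosmetic

    std::printf("player health: %d, debris particles drawn: %zu\n",
                state.playerHealth, renderBuffer.size());
    return 0;
}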

Last edited by ShotYourSix (2006-07-11 10:20:50)

Rosse_modest
Member
+76|7076|Antwerp, Flanders
I think that at this point the Ageia card is just a waste of money, especially if you compare what you get as an extra in-game to the damn thing's price tag. I've also heard that it results in more debris with explosions (just to name something), which means more shit for your graphics card to render, which means a drop in your framerate.

Unless you have an insane beast with two 7900s, two Raptor HDDs in RAID, multiple gigs of supersexy RAM and a fiendish CPU at your disposal, and have too much money to spend (which you would if you owned such a machine), I'd consider buying an Ageia PhysX card sheer stupidity.

I think the whole physics card concept is only in its experimental stage right now and is still quite a ways off from maturity.

EDIT: If you really feel like you need to unload some cash you can always send it to me. I promise I'll accept it unconditionally; I won't be asking how you got the money or shit like that...

Last edited by Rosse_modest (2006-07-11 11:33:53)

Cybargs
Moderated
+2,285|7016
Save your money until we get dual GPUs with physics running on one of them.
Slayer
---hates you
+1,137|7056|Hell, p.o box 666

puckmercury wrote:

shouldn't this all be in tech anyway? </change topic from myself>
During my bragging about those useless cards, it seems I forgot to move it. Well, now it's solved *whistles*
BigglesPiP
Whirlybird Guy
+20|6848|Windermere, GB
Don't bother; by the time physics cards are worth it, that one will be out of date. ATi are going to use their existing cards for PhysX.
PuckMercury
6 x 9 = 42
+298|6827|Portland, OR USA
Back to the standards thing, I really don't see that ever being resolved.  Look at every other aspect of the PC.  RAM?  Scores of manufacturers, a dozen GOOD ones too.  HD?  Again, scores of manufacturers.  Fewer good ones, but still multiple choices.  In the case of those, there's little difference in which component you use from a compatibility standpoint as long as there's a plug it fits in (I'm not even gonna get into SDRAM, DDR, DDR2 or <shudder> RDRAM).

Processor?  Two big players, each requiring a totally different mobo interface to use.  No standard, yet both thrive.

GPU? Again, two contenders. Both are usually compatible with the same system, but certain games still cater to one or the other. Even where this is the case, the other works arguably just as well. No standard, yet both thrive.

SPU? Well, Creative kinda dominates this field. There are many integrated components, and Turtle Beach certainly warrants more than an honorable mention. Again, no standard. Here, both still manage to be compatible with the same interface and both are able to thrive; Creative just seems to do a bit MORE thriving.

I would look for PPUs to take off at some point. Whether they are modified GPUs or an entirely different architecture, I think they'll be there. I also think we won't ever see any more standardization in them than we do in any other componentry within the PC. I'd say GPU standardization is as close an analogy as any. Hell, we still have DVD+/-R, +/-RW, RAM and +/-R DL, and they all still work. Some drives cover all of them, but most leave a couple off. There are any number of other instances where standardization has not taken place, nor does it seem it ever will. The reason Betamax failed is the same reason Sony always sucks: they don't open up their standard to any other manufacturer. Same reason the Memory Stick hasn't really done anything outside Sony peripherals and VAIOs. This is also the same reason Macs probably aren't as popular and are more expensive.
Cybargs
Moderated
+2,285|7016
PPUs are a thing of the future, not now. Ageia made a mistake and released it too early, IMO.
stryyker
bad touch
+1,682|7020|California

There are very few programs supporting the TRUE potential of PhysX cards now, but I'm sure in UT2k7 and future such games the card will almost be a must.
Cybargs
Moderated
+2,285|7016
just wait and see
PuckMercury
6 x 9 = 42
+298|6827|Portland, OR USA

stryyker wrote:

There are very few programs supporting the TRUE potential of PhysX cards now, but I'm sure in UT2k7 and future such games the card will almost be a must.
I keep going around agreeing with this guy ... damnit
Elogain
Member
+2|6800
"Contrary to what someone said earlier, graphics cards ARE well suited for physics calculations.  In fact both ATI and Nvidia have already announced plans to use their GPU technology for this (though they each approach the problem in different ways)."

It's more like... we can all play football, but not many of us are pros. Just because they 'can' do it on the GPU doesn't mean it's a good idea. I'm more for the approach where you have one GPU for graphics (just as the G says) and one PPU for physics. That way the cards can be specialized in what they do instead of having one generic "do it all well but not best" card.
