Teflon.DEP wrote:
My point is it shouldn't give performance hits... I can't understand how making jaggies straight is so ridiculously stressful for a GPU.
Well, it's not "stressful" for the GPU. The problem is that in today's games (like BF2) you have a 3D model of a landscape, plus a constantly changing viewing angle and position to display. I.e. you move your soldier around and, presto, you get to see the OTHER side of that hill.
Doing this in real time requires a ridiculous amount of computing power (remember Wolfenstein 3D? The first real 3D shooter, and even that was a heavy thing at a tiny 320x200 res with a handful of colors).
GPUs today CAN do it, but they manage by only rendering the part that's visible from your current viewing angle and position (hence the "geometry" and "view distance" settings in your game).
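Just to make "only render what's in view" concrete, here's a toy sketch of view-distance culling. Everything in it (the camera position, the object list, the 500-unit view distance) is made up for illustration; this is NOT how BF2's engine actually works, just the basic idea:

    import math

    # Hypothetical camera position and view distance (illustrative values).
    camera = (0.0, 0.0, 0.0)
    view_distance = 500.0

    # Made-up scene objects: (name, x, y, z).
    objects = [
        ("hill", 120.0, 0.0, 300.0),
        ("far_tower", 900.0, 0.0, 1200.0),
        ("soldier", 10.0, 0.0, 25.0),
    ]

    def distance(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    # Only objects inside the view distance get sent down the pipeline;
    # everything else is skipped before any expensive rendering work happens.
    visible = [name for (name, x, y, z) in objects
               if distance(camera, (x, y, z)) <= view_distance]
    print(visible)  # ['hill', 'soldier']

Turn your "view distance" slider up and that cutoff moves out, so more objects survive the cull and the GPU has more to draw.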
BUT... you don't want your graphics to be blocky, so we increase the res to 1280x1024. That's a HUGE increase from 320x240 (or whatever is reasonable to expect a GPU to handle): roughly seventeen times as many pixels, so the bandwidth requirement just increased many fold, and so did the processing requirements. Now you also seem to think 16 colors isn't enough for you, so we make it 16-BIT color (16 colors is only 4 bits), and the bandwidth and computing power required go up several fold again. And that's just to compute ONE SINGLE frame. Next thing you know, you'll want the GPU to do this at least 50 times per second (50 fps). Again we're asking the GPU to do a HUGE amount of extra work. The amount of data sent to the GPU is gigantic: we have to give it the color, geometry and scale of hundreds if not thousands of objects, and all of it has to be rendered and displayed in full color many times each second. This requires a LOT of computing power.
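To put rough numbers on that, here's a back-of-the-envelope calculation of raw framebuffer data alone. It deliberately ignores textures, geometry, overdraw and compression, so treat the figures as illustrative, not as real measured memory traffic:

    # Back-of-the-envelope framebuffer size, ignoring textures,
    # geometry, overdraw and compression -- illustrative numbers only.
    def frame_bytes(width, height, bits_per_pixel):
        return width * height * bits_per_pixel // 8

    old = frame_bytes(320, 240, 4)      # 16 colors = 4 bits per pixel
    new = frame_bytes(1280, 1024, 16)   # 16-bit color

    print(old)             # 38400 bytes per frame
    print(new)             # 2621440 bytes (~2.6 MB) per frame
    print(new / old)       # ~68x more data per frame
    print(new * 50 / 1e6)  # ~131 MB/s just for 50 fps of raw frames

So even before any actual 3D work, the jump from 320x240 at 4-bit to 1280x1024 at 16-bit is about a 68x increase in per-frame data.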
Now add insult to injury and ask the GPU to anti-alias those rendered images for you. To anti-alias you need to EITHER identify which edges actually need it, OR simply anti-alias EVERYTHING. Since the GPU generally can't cheaply tell which edges need AA, in practice a programmer will just have EVERYTHING AA'ed. That means that on top of everything else, we have to COMPUTE extra samples for every single pixel, in essence "shading" edges differently so they appear less blocky, DESPITE the fact that you're already running a very high res. That can mean an additional 16 bits (or more!) per PIXEL, so AGAIN you're asking the GPU to increase its workload. Your bandwidth requirement just went up several times over. Now ask for 4x or 8x AA and it gets even worse!
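Rough numbers again: with multisample-style AA the card keeps several color samples per pixel, so the buffer it has to fill grows roughly linearly with the AA factor. This is a toy model of that scaling, not actual driver or hardware behavior:

    # Toy model: AA stores N samples per pixel, so buffer size
    # (and fill bandwidth) grows roughly linearly with the AA factor.
    def aa_frame_bytes(width, height, bits_per_pixel, aa_samples):
        return width * height * bits_per_pixel * aa_samples // 8

    base = aa_frame_bytes(1280, 1024, 16, 1)  # no AA
    for samples in (2, 4, 8):
        cost = aa_frame_bytes(1280, 1024, 16, samples)
        print(samples, cost / base)  # 2x, 4x, 8x the per-frame data

That's why cranking 4x AA up to 8x hurts even on a card that handles 4x fine: you're roughly doubling the per-frame data yet again.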
But hey... go set your res to 320x240 and presto, you have no problem running AA... other than the fact that you can't see shit anyway, but that's not the GPU's fault.
No... the only real "solution" to your problem, or inquiry, is to get a PhysX card: a specially designed physics processor (a PPU, not a GPU) that knows the physical properties of the "blocks" in the environment, i.e. how heavy or dense something is, how bright or reflective it is, etc. The PhysX chip can hold loads of information about the physical properties of an object and how that object behaves in an environment. That lets it calculate the environment without asking more of the display GPU, and maybe... JUST MAYBE, with all that work done by someone else, you won't be stressing the display GPU so much that AA is a "problem". Of course you can also just buy a new GPU every 6 months, and then you won't have a problem either... except perhaps that you'll need to rob a bank or something to always afford the latest equipment.