fon|sl4y3r wrote:
78%? I still don't get it. I run a dual 7800GTX 256MB system and the FPS increase in BF2 is only around 15% at 1600x1200; when I turn it down to 1024x768 or 1280x1024 it's around 5-10%. The maximum increase I've ever tested myself was in Oblivion, around 40%, but that game's engine is designed for dual GPUs.
What benchmarks did you use, and what system settings?
Where SLI/Crossfire really starts to shine is 19x10 resolution or higher. A good high-end card like a 7800, 7900, X1800, or X1900 can generally handle up to 16x12, depending on how complex the shaders are. Anything below that is generally (once again, depending on the game) being choked by your CPU anyway, so throwing another card at the problem does little good; if one card is sitting around waiting for CPU cycles, two cards won't change anything.
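To put rough numbers behind the bottleneck argument, here's a toy frame-time model (just a sketch; the millisecond figures and the 85% scaling factor are invented for illustration, not measured from any real card): a frame can't finish faster than the slower of the CPU's per-frame work and the GPU's per-frame work, and a second GPU only speeds up the GPU side.

# Toy model: frame time is the slower of CPU work and GPU work per frame.
# All numbers are made up to illustrate CPU-bound vs. GPU-bound behavior.
def fps(cpu_ms, gpu_ms, num_gpus=1, scaling=0.85):
    # Assume a second card adds ~85% of a full card's worth of throughput
    # (AFR/driver overhead keeps it from being a perfect 2x).
    effective_gpu_ms = gpu_ms / (1 + (num_gpus - 1) * scaling)
    return 1000.0 / max(cpu_ms, effective_gpu_ms)

# Low resolution, CPU-bound: the GPU finishes early either way.
print(fps(cpu_ms=12.0, gpu_ms=8.0))               # ~83 FPS with one card
print(fps(cpu_ms=12.0, gpu_ms=8.0, num_gpus=2))   # ~83 FPS with two: no gain

# High resolution, GPU-bound: now the second card earns its keep.
print(fps(cpu_ms=12.0, gpu_ms=30.0))              # ~33 FPS with one card
print(fps(cpu_ms=12.0, gpu_ms=30.0, num_gpus=2))  # ~62 FPS with two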
(BTW, Oblivion is NOT designed for dual GPUs; Oblivion just makes extensive use of Shader Model 3.0 features...it's one of the first, but not the last...download the Ghost Recon Advanced Warfighter demo, for example.) When you move up to the native resolution of the fairly popular 24" Dell 2405FPW (19x12), or something like my 37" Westinghouse LVM37 series (19x10), a single card starts getting worked over pretty good, while an SLI/Crossfire setup keeps framerates at very playable levels.
I can disable Crossfire on my rig when playing at 1920x1080 and drop down to 30 FPS or lower in BF2, depending on the scene...with Crossfire active at that resolution I never see lower than 55 FPS. Running 25+ FPS faster or slower is VERY noticeable in BF2, and can be the difference between getting cacked or being the cacker (especially when flying, zooming, driving, etc).
And it only gets worse for a single card after that; for example, the uber Dell 3007WFP is practically worthless as a gaming monitor at its native 25x16 resolution UNLESS you run SLI/Crossfire.
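For a rough sense of why the load climbs so fast, shading/fill work scales more or less with the number of pixels per frame. Quick back-of-the-envelope math (just pixel counts; it ignores AA, overdraw, and the like):

# Pixels per frame at common resolutions, relative to 1024x768.
resolutions = [
    ("1024x768",  1024, 768),
    ("1280x1024", 1280, 1024),
    ("1600x1200", 1600, 1200),
    ("1920x1200", 1920, 1200),   # Dell 2405FPW native
    ("2560x1600", 2560, 1600),   # Dell 3007WFP native
]
base = 1024 * 768
for name, w, h in resolutions:
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpixels, {pixels / base:.1f}x 1024x768")

So the 30-incher is pushing over 5x the pixels of 10x7, and roughly double what 16x12 asks of the card.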
Now, all that might change in a card generation or two, as horsepower per GPU core increases, but that is in the future...for right NOW, a single card can NOT drive the higher-end LCD displays at native resolution with an acceptable framerate (read: one that lets you be competitive rather than frag bait) in a modern first-person shooter, MMO, etc., if you keep the image quality settings high. Keep in mind that LCD prices are continuing to drop, and resolutions/pixel densities are increasing...in a year or two 16x12 will be like the 1024x768 of today: low end.
Ok, so you just lower the resolution on your LCD panel and everything is fine with a single card again, right? True, but why did you drop the coin on a high-resolution display if you're going to cripple it by running it below its native resolution? The same logic applies to turning down detail settings (AA, AF, HDR, etc.) to get the framerate back up.