I posted a while back in the 2142 forums that I was getting some crappy FPS in-game. I was playing at 1280x960 or some res like that, with 2x AA. Average was about 30, lowest 15 (on Titan), highest maybe 45. I have an X1900 XT and was very disappointed by this performance. Today while playing, I found that there was a new resolution available after installing the newest ATI drivers — 1440x1050, I believe, which is close to my native res (1680x1050).
Somehow, I am now magically getting an average of 60fps, with decent highs and lows. I might even push it to 4x AA if the performance stays OK. Do drivers really make that much of a difference in games? Or is there some performance drop when playing at non-native resolutions?
Also, I don't really understand the settings in the ATI Catalyst Control Center. It lets you select 6x AA and 16x AF in there, but when I do, sometimes the game will load without any of those features enabled. How do I get those settings to work in-game?