CrazeD wrote:
ghettoperson wrote:
For all you guys that claim you can see 80fps, you're talking crap. Given that your monitor (well, almost all) has a refresh rate of 60Hz, you won't be able to see anything over that. And movies run at 24fps. I think 30 is perfectly adequate.
Every time. Every, fucking, time.
Every time an FPS subject comes up, someone has to pull the whole "you can't see that much FPS" argument. Which, by the way, is incorrect; the human eye can see 80 FPS.
Furthermore, it makes NO FUCKING DIFFERENCE what your eye can see. The game is hardcoded to only produce a certain number of visible frames. Notice how the images don't appear any faster when the FPS increases.
I can tell you for damn sure that I can tell the difference between 80 FPS and 125+ FPS. It's very clear.
You goddamn sure cannot see the difference between 60 FPS and 9000 FPS on a monitor that has a refresh rate of 60 Hz. No matter how many frames your video card can produce, you will simply never see more frames per second than your monitor can display. On a high-refresh-rate monitor this is totally different, of course: the picture will look way smoother on a 100 Hz+ monitor than on a 60 Hz one at the same FPS, provided the FPS is above those refresh rates.
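To make that concrete, here's a minimal sketch (plain Python, my own illustration, nothing from the game or drivers): model a fixed-refresh monitor as grabbing whichever frame finished most recently at each refresh. Frames rendered in between refreshes simply never hit the screen.

```python
# Sketch: a fixed-refresh monitor can only show one frame per refresh.
# Model (an assumption for illustration): at each refresh it displays the
# most recently completed frame; frames finished in between are skipped.

def frames_shown_per_second(render_fps: int, refresh_hz: int) -> int:
    """Count distinct rendered frames that actually land on screen in one second."""
    shown = set()
    for i in range(refresh_hz):
        # Refresh i happens at time i / refresh_hz seconds; frame k finishes
        # at k / render_fps seconds, so the newest finished frame is:
        shown.add(i * render_fps // refresh_hz)
    return len(shown)

for fps in (30, 60, 125, 9000):
    n = frames_shown_per_second(fps, 60)
    print(f"{fps:>4} FPS rendered -> {n} distinct frames shown on a 60 Hz panel")
```

On a 60 Hz panel this prints 30, 60, 60, 60: everything rendered above the refresh rate caps out at 60 distinct frames actually shown.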
And there is no fixed value the human eye can see. It is generally around 100 individual frames per second, but the "blur effect" makes more frames look better to the eye. Thus, if you have two picture sources, one producing 100 FPS and one 200 FPS, the 200 FPS one will look better, even though your eye can't tell much difference between individual frames. The picture will just appear smoother.
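Some rough arithmetic on that smoothness point (the screen width and object speed below are made-up example numbers, not measurements): a higher FPS means a shorter frame time, so a moving object jumps a smaller distance between successive frames, which is what reads as "smoother" even when you can't pick out single frames.

```python
# Rough arithmetic only: frame time and per-frame motion step at various
# frame rates. The object speed (crossing a 1920 px screen in 0.5 s) is an
# arbitrary example value.

SPEED_PX_PER_S = 1920 / 0.5  # 3840 px/s

for fps in (24, 80, 100, 125, 200):
    frame_time_ms = 1000 / fps
    step_px = SPEED_PX_PER_S / fps  # distance the object jumps between frames
    print(f"{fps:>3} FPS: {frame_time_ms:5.1f} ms/frame, {step_px:5.1f} px jump per frame")
```

At 24 FPS that example object teleports 160 px every frame; at 200 FPS it moves under 20 px per frame, which is why the higher rate looks smoother even past the point where you can identify individual frames.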
The "movies are 24fps, that's all you need for gaming" -debate, however, is stupid. Games are certainly affected by laggyness at 24 fps.
Last edited by DeathUnlimited (2008-12-22 09:44:12)