Catbox
forgiveness
+505|6726
3 8800 Ultras on high and it still only gets 30-40 fps... lol
and fps dropped into the teens and single digits in some areas during gameplay...
They need to figure out this DX10 Vista debacle... quick...

I know from modding the game how much performance is needed, but there aren't enough
people with rigs that can run it... Not to mention the MP hacking that has pretty much ruined this game...

http://www.tomshardware.com/2008/01/08/ … li_crysis/
Love is the answer
AussieReaper
( ͡° ͜ʖ ͡°)
+5,761|6162|what

Damn. Do you think Crysis 2 will have even higher specs? lol

Last edited by TheAussieReaper (2008-01-14 04:41:01)

https://i.imgur.com/maVpUMN.png
FloppY_
­
+1,010|6295|Denmark aka Automotive Hell
Can't wait until SLI 2.0. Nvidia has promised the ability to run two different cards in SLI, and up to 4x SLI.
­ Your thoughts, insights, and musings on this matter intrigue me
TheEternalPessimist
Wibble
+412|6629|Mhz

Seems strange to me. I've not played Crysis on my own rig, but I've seen it running as high as it goes on DX9 (basically all High instead of Very High or whatever), 4x AA at 1920 x 1200, and it ran fine with a single 8800GTX. Why is DX10 so shitty for performance?
SonderKommando
Eat, Lift, Grow, Repeat....
+564|6669|The darkside of Denver
I run the game on all High in DX10 and it's playable... 30 fps max, probably... It doesn't ever seem to get into slideshow mode either, and it looks damn nice too. You can see some screenies I took in the SS thread.
ShadowFoX
I Hate Claymores
+109|6540

TheEternalPessimist wrote:

Seems strange to me. I've not played Crysis on my own rig, but I've seen it running as high as it goes on DX9 (basically all High instead of Very High or whatever), 4x AA at 1920 x 1200, and it ran fine with a single 8800GTX. Why is DX10 so shitty for performance?
Really? I smell BS.
FloppY_
­
+1,010|6295|Denmark aka Automotive Hell

ShadowFoX wrote:

TheEternalPessimist wrote:

Seems strange to me. I've not played Crysis on my own rig, but I've seen it running as high as it goes on DX9 (basically all High instead of Very High or whatever), 4x AA at 1920 x 1200, and it ran fine with a single 8800GTX. Why is DX10 so shitty for performance?
Really? I smell BS.
Either that, or Pessimist doesn't know what the word "fine" implies...
­ Your thoughts, insights, and musings on this matter intrigue me
TheEternalPessimist
Wibble
+412|6629|Mhz

30+ fps at all times = fine to me. Call BS all you like; I built his PC for him, I know how well it runs, and I've sat and played Crysis on it.
FloppY_
­
+1,010|6295|Denmark aka Automotive Hell

TheEternalPessimist wrote:

30+ fps at all times = fine to me. Call BS all you like; I built his PC for him, I know how well it runs, and I've sat and played Crysis on it.
>60 fps = fine...

<30 fps = stuttering & choppy...
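
The frame-time math behind those thresholds, as a quick Python sketch (pure arithmetic, nothing measured):

```python
# Frame time in milliseconds: how long each frame stays on screen.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 30 fps -> 33.3 ms per frame: each frame lingers twice as long as at
# 60 fps (16.7 ms), which is why the drop from 60 to 30 is so visible.
```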

Last edited by FloppY_ (2008-01-15 02:48:47)

­ Your thoughts, insights, and musings on this matter intrigue me
TheEternalPessimist
Wibble
+412|6629|Mhz

30 fps a) doesn't stutter at all and b) is smooth enough for single player, and since his uni network is run by a Nazi-esque regime he can't play online, so that's all it does.
FloppY_
­
+1,010|6295|Denmark aka Automotive Hell
If you are used to playing at a minimum of 60 fps, you can clearly see how bad 30 is...

And once you've played at 60+, you will never want to go back!
­ Your thoughts, insights, and musings on this matter intrigue me
TheEternalPessimist
Wibble
+412|6629|Mhz

I run all my games at over 100 fps; getting Crysis over 40 is a bit of a challenge, so there's nothing to get used to.
topal63
. . .
+533|6728

[TUF]Catbox wrote:

3 8800 Ultras on high and it still only gets 30-40 fps... lol
and fps dropped into the teens and single digits in some areas during gameplay...
They need to figure out this DX10 Vista debacle... quick...

I know from modding the game how much performance is needed, but there aren't enough
people with rigs that can run it... Not to mention the MP hacking that has pretty much ruined this game...

http://www.tomshardware.com/2008/01/08/ … li_crysis/
I have that same setup... Vista (DX10) with 3-way SLI (8800 Ultras). No bugs, problems, or laggy gameplay (at all).

I am getting anywhere from 30-50+ fps... but with no frame delays (those tiny little CPU hangs you sometimes experience in game), so the game plays extremely smoothly with motion blur ON. The only complaint I have is that the environments/maps are simply not optimized and are over-populated (with vegetation and other objects that must be rendered, which taxes the GPU). Because of this, I find the sweet spot for playing the game mostly on Ultra High/DX10 is certainly not 1920 x 1080 (1080p) but 1024 x 768 or 1280 x 720; there are simply too many objects to render, and these resolutions let me get higher frame rates.
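
Rough numbers on why dropping the resolution helps, as a quick Python sketch (pixel counts only, so treat it as a ballpark; geometry and CPU load don't shrink with resolution):

```python
# Pixels shaded per frame at each resolution; GPU pixel/shader load
# scales roughly with this count (geometry and CPU costs do not).
resolutions = {
    "1920 x 1080": 1920 * 1080,
    "1280 x 720":  1280 * 720,
    "1024 x 768":  1024 * 768,
}
full_hd = resolutions["1920 x 1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} px ({full_hd / pixels:.2f}x fewer than 1080p)")

# 1280 x 720 shades ~2.25x fewer pixels than 1080p, and 1024 x 768
# ~2.64x fewer -- which is roughly where the extra fps comes from.
```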

I am running it with almost all settings on Ultra High (DX10), except sound (seems the same at medium, more or less) and model geometry (medium), and I really don't see that much difference (since shaders and textures are on ultra high and high). It looks and plays AMAZING! Like no other game I've ever played.

It's actually a fun game. I start it with "-devmode" added to the command line so I can switch between 1st and 3rd person (F1 toggles this).
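
If anyone wants to script that launch, a minimal Python sketch (the install path below is just a guess for illustration; only the -devmode switch itself is from my setup):

```python
import subprocess

# Hypothetical install location -- adjust for your own machine.
CRYSIS_EXE = r"C:\Program Files\Electronic Arts\Crytek\Crysis\Bin32\Crysis.exe"

# Launch with -devmode; once in game, F1 toggles 1st/3rd person view.
subprocess.run([CRYSIS_EXE, "-devmode"])
```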

I will eventually post some high-def movies of the gameplay, switching between 1st and 3rd person in single-player mode. It looks cool, plays cool, and there is a slow-motion mode too! (As opposed to high-def screenies, which don't really showcase the coolness of this game in motion.)
________

But let's talk about the "Gears of War" PC version on Vista/DX10! OMFG! This game is even better looking than Crysis (detail-wise) in high-def, and the frame rates are wicked high and smooth. The difference between the Xbox 360 and PC versions is ridiculous. There are a few cut-scene movies in-game, and the pre-rendered movies (based on the game engine) are significantly less detailed than what my current configuration renders in real time! It usually is the other way around.
________

PS: Image quality in game is always an issue; if you have an HDTV as a monitor, you have a lot more choices! You can soften pixel edges automatically by running the display in interlaced mode (at 30 Hz). On a good HDTV there is absolutely no flicker; instead, the pixels are not refreshed as often, so they begin to fade a tiny bit, which softens the pixel edges. The overall effect is like applying a Photoshop soften filter; the HDTV's sharpen-image setting does the opposite, with nearly the same effect as a Photoshop sharpen filter.
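
If you want to see the two effects side by side on a screenshot, a quick Pillow sketch (the filenames are placeholders, and the filters only approximate what the TV does in hardware):

```python
from PIL import Image, ImageFilter  # pip install Pillow

img = Image.open("crysis_screenshot.png")  # placeholder filename

# SMOOTH blends each pixel with its neighbours -- roughly the softened
# edges from interlaced fade; SHARPEN boosts local contrast instead.
img.filter(ImageFilter.SMOOTH).save("screenshot_soft.png")
img.filter(ImageFilter.SHARPEN).save("screenshot_sharp.png")
```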

Last edited by topal63 (2008-01-22 11:28:55)

bakinacake
HA HA
+383|5995|Aus, Qld
Some of you guys should check out http://forums.bf2s.com/viewtopic.php?id=84919
https://i.imgur.com/LGvbJjT.jpg
