T4rd wrote:
You happen to have a 1080p HDTV to try Avivo out on, EvilMonkeySlayer? Hah... or anyone else, for that matter? That's what I'll be using with my card here soon, so it'd help me out a lot.
Also, it kinda pisses me off that Vista comes out late this year along with DX10, and these cards supposedly don't support it (I can't find proof either way). If I pay $500 for a card, it's ridiculous if it doesn't last at least a year or two before I need to replace it to run the newest software/games.
Edit: Oh, and you don't HAVE to download the Catalyst Control Center with the ATI drivers. You can just download the drivers alone and control all those settings from Display Properties: hit the "Settings" tab, then the "Advanced" button. That might help out a little with your resources.
I'm running an X1900 CrossFire setup with a 1080p Westinghouse 37" LCD monitor. Avivo works well when playing 1080p source material (the Windows HD showcase stuff), and it's fully prepped to take advantage of H.264 decoding. But where it REALLY shines right now is what it can do with non-HD source material (normal DivX, AVI, etc.), and it also does a great job with DVD.
People with small monitors or lower native resolutions may not appreciate what Avivo can do, but trust me: on a large panel with a native 1920x1080 resolution, standard-definition source material and highly compressed MPEGs (like torrented movies/TV shows) can look like total ass, since a high-quality, large LCD panel makes all the compression flaws and noise stick out like a sore thumb... shit you'll never notice when watching on a 20" low-resolution CRT or LCD. Avivo (or PureVideo, for that matter) can reduce the ass-factor quite a bit.
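To put some rough numbers on why SD looks so bad blown up (the 640x480 source size here is just a typical example rip, nothing Avivo-specific), here's a quick back-of-the-envelope in Python:

# Rough numbers: how much a standard-def rip gets stretched on a 1080p panel.
# The source resolution is just a typical example, not anything Avivo-specific.
panel_w, panel_h = 1920, 1080
source_w, source_h = 640, 480                 # a typical SD/DivX rip

scale_x = panel_w / source_w                  # 3.0
scale_y = panel_h / source_h                  # 2.25
pixels_per_source_pixel = scale_x * scale_y   # ~6.75 screen pixels per source pixel

# An 8x8 MPEG macroblock becomes roughly a 24x18 patch on screen, which is
# why blocking artifacts that hide on a small CRT jump out on a big panel.
block_w, block_h = 8 * scale_x, 8 * scale_y
print(f"each source pixel covers ~{pixels_per_source_pixel:.1f} screen pixels")
print(f"an 8x8 block blows up to ~{block_w:.0f}x{block_h:.0f} on the panel")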
I wouldn't sweat DX10 too much, amigo. Keep in mind that DX9 was released "way back" in 2002 along with SM 2.0, and DX9c with SM 3.0 followed in 2004. Four years later, there are still very few games that take full advantage of DX9's feature set and hardware; most are just using DX8 features with a few DX9 bits tacked on. Games like Oblivion, FEAR, and GRAW are the exception, and FarCry was a rolling tech demo for Shader Model x.0 features... but every one of those exceptions will run fine and still look good with the SM 2.0/3.0 features disabled, basically running in DX8 mode.
Expect the same sort of backward compatibility with DX10: it'll run DX9/DX8/DX7 titles just fine, and most games released between now and 2008 will still be DX9 titles, working under both XP and Vista, with maybe a few extra bells and whistles for Vista/DX10 users. There might be one or two "flagship" heavily subsidized titles (by that I mean MSFT, ATI, or Nvidia paying an ASSTON of money to a dev studio for a DX10-exclusive title, to offset the sales lost by not being able to sell to the much larger DX9/XP user base).
Keep that in mind: XP was released over four years ago, and it took something like two years for it to supplant ME and 2K as the most widely used Windows OS. We'll see the same sort of slow adoption with Vista (some critics say slower), so the majority of users will still be running XP for at least a couple of years. Since that's where the market is, that's who the studios will target... DX9 has a lot of life left in it.
And the DX9 card you're using today will work just fine in Vista. Vista's Aero interface doesn't use any DX10 hardware features, just the new driver model (WDDM), which any Nvidia or ATI DX9 card with updated drivers will support. You'll just be missing a few possible DX10 eye-candy bits and performance improvements when playing DX10 games. In that regard, the X1900 is more future-proof than the 7900 series; those 48 shader pipes should translate into generally better performance in the DX10 era.
The big thing DX10 adds over DX9 is a unified shader model, which lets DX10 and the drivers decide which pipe does which job, giving greater flexibility. In today's cards, the pixel shader units can sit idle while the vertex units are overloaded, or vice versa. Making every pipe able to do any job means less idle time, which can mean higher performance, or lets the developer pile more eye candy into a given scene.
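Here's a toy sketch of the idle-pipe math (all the pipe counts and workload numbers are made up for illustration; this isn't how any real driver schedules work):

# Toy model of fixed vs unified pipes. Workloads and pipe counts are invented;
# the point is just that a unified pool wastes less time when the mix is lopsided.

def frame_time_fixed(vertex_work, pixel_work, vertex_pipes, pixel_pipes):
    # Each pipe type can only chew through its own kind of work, so the
    # frame takes as long as the most overloaded pool needs.
    return max(vertex_work / vertex_pipes, pixel_work / pixel_pipes)

def frame_time_unified(vertex_work, pixel_work, total_pipes):
    # Any pipe can take any work, so the whole pool shares the total load.
    return (vertex_work + pixel_work) / total_pipes

# A lopsided, shader-heavy frame: lots of pixel work, little vertex work.
vertex_work, pixel_work = 10.0, 90.0

print(frame_time_fixed(vertex_work, pixel_work, vertex_pipes=8, pixel_pipes=16))  # 5.625
print(frame_time_unified(vertex_work, pixel_work, total_pipes=24))                # ~4.17

Same total number of pipes, but the unified pool finishes the frame noticeably sooner because nothing sits idle while the pixel side is swamped.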
But the reality is that SM3.0 games (and most DX10 games will support SM3.0 features like HDR) still tend to issue many more shader ops than texture/raster ops, so the X1900's massive number of shader pipes helps a great deal, even if they can't be retasked for other work. That's why the X1900 tends to outperform the 7900 in most shader-heavy SM3.0 games today (while still being more than competitive in "old tech" games like BF2)... and as more and more games go that route, the X1900 will rack up more "wins" against the 7900.
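To see why the ratio matters, here's another quick back-of-the-envelope. The unit counts are the published X1900 XTX and 7900 GTX figures (48 pixel shader units vs 16 texture units, and 24 vs 24), but the 6:1 shader-to-texture instruction mix is an invented example workload, not a measurement from any real game:

# Which card's units become the bottleneck first on a shader-heavy mix?
cards = {
    "X1900 XTX": {"shader_units": 48, "texture_units": 16},
    "7900 GTX":  {"shader_units": 24, "texture_units": 24},
}

shader_ops, texture_ops = 6.0, 1.0   # hypothetical SM3.0-style mix per pixel

for name, c in cards.items():
    # Throughput is limited by whichever unit pool is the bottleneck.
    cost = max(shader_ops / c["shader_units"], texture_ops / c["texture_units"])
    print(f"{name}: relative per-pixel cost {cost:.3f}")
# X1900 XTX: max(6/48, 1/16) = 0.125 ; 7900 GTX: max(6/24, 1/24) = 0.250

On this made-up mix the X1900 gets through each pixel at half the cost; flip the mix toward texture-heavy "old tech" rendering and the gap closes, which matches what we see in benchmarks.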
I'm not saying the 7900 is a bad card; it's a great card. Nvidia just made a different architectural choice than ATI did. Without so many shader pipes, it's a smaller chip with a lower transistor count, which means it runs cooler and draws less power (which can translate into higher clock speeds, as the massive overclocks on 7900s are showing). It also keeps production costs down and yields up, which is always good for the end user. Higher core and RAM clocks can offset fewer shader pipes, and let's face it: other than in the four games I listed above, most people don't really NEED the extra shader pipes on the X1900 series today.
It's just less forward-looking than the X1900. And for many people that's no big deal; they'll pony up again next year for Nvidia's DX10 part... which suits Nvidia just fine.