  • ATI/AMD Radeon HD 2900xt has now been OFFICIALLY launched and reviewed
GC_PaNzerFIN
Work and study @ Technical Uni
+528|6885|Finland

That's right. The NDA on the Radeon HD 2000 series has lifted in Asia, and VR-Zone has posted an official review of the 2900XT.
This is the real deal guys.  finally......
ENJOY READING (at least when it starts working)

Go here if the review's direct link doesn't work: www.vr-zone.com and you'll see it's there.

Here is the review: http://www.vr-zone.com/?i=4946

The site is under heavy load so it might be slow or not working at all yet (I guess someone else wants to read the review too, hahahahah). First to get in and read the review gets a cookie, lol


"ATi's 2000 series was announced to the media during the late April to beginning of May period, at various locations throughout the world. I attended the Press Briefing Day in Malaysia and caught sight of the much awaited Radeon X2900XT in action. Not only was this high-end model introduced to us, the lower-end series in the 2000 Family was also talked about. We took the Radeon HD X2900XT for a spin and a review here..."
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
kylef
Gone
+1,352|6964|N. Ireland
Yay!
TheDarkRaven
ATG's First Disciple
+263|7095|Birmingham, UK
[ATi Radeon 2000 Series Launch: X2900XT Review]

Page Title: Radeon X2000 Series!
Category: GPUs & Graphic Cards
Type: Reviews
Posted By: Shamino
Date: May 14, 2007, 3:55 am
Source: ATi

ATi's 2000 series was announced to the media during the late April to beginning of May period, at various locations throughout the world. I attended the Press Briefing Day in Malaysia and caught sight of the much awaited Radeon X2900XT in action. Not only was this high-end model introduced to us, the lower-end series in the 2000 Family was also talked about.

Briefing the press was Mr Vijay Sharma, Director of Product Marketing, Graphics Product Group.


Two demo setups were employed during the Press Briefing, each housing a Radeon X2900XT, both running AMD processors. Notice the 680W power supply seen above.

1 x PCIE 8-pin connector with 1 x PCIE 6-pin connector. The card will run fine with 2 x PCIE 6-pin power connectors as well.



The speaker talked about the many technology firsts for the 2000 series:

A 65nm version of the X2900XT was of course not available or ready at the time, but 65nm does apply to the lower-end 2600, 2400 and 2300 series.

We were first introduced to the three market segments the HD 2900, 2600 and 2400 are gunning for. The speaker answered our queries regarding power consumption of the above-mentioned cards: over 200W for the X2900XT, around 45W for the X2600 and 25-35W for the X2400.


Among the new features of the 2000 series is a programmable Tessellation Unit. This feature enables the GPU to add detail when support is written into games, so that game developers have a much easier time doing up simpler geometry and letting the GPU do the enhancement. This also supposedly saves on GPU horsepower, as you can see below: you get 2.3x the performance compared to a conventionally rendered scene.

It's definitely a great feature if it works as advertised, but game support is currently non-existent. When asked to name three games coming out with this support, the speaker was unable to disclose details.
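
To get a rough feel for what "let the GPU amplify the geometry" means, here is a toy Python sketch; it is purely illustrative and not ATi's actual tessellator: one coarse triangle is subdivided on the fly and the new vertices are displaced by a stand-in height function, so the mesh the game has to ship stays tiny.

# Toy illustration of geometry amplification, in the spirit of a hardware
# tessellation unit: the "developer" supplies one coarse triangle, and the
# "GPU" subdivides it and displaces the new vertices with a height function.
# Conceptual sketch only, not ATi's actual pipeline.
import math

def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def displace(v, amplitude=0.1):
    # Stand-in for a displacement map lookup: a cheap procedural bump.
    x, y, z = v
    return (x, y, z + amplitude * math.sin(8 * x) * math.cos(8 * y))

def tessellate(tri, levels):
    """Recursively split one triangle into 4, displacing the new vertices."""
    if levels == 0:
        return [tri]
    a, b, c = tri
    ab = displace(midpoint(a, b))
    bc = displace(midpoint(b, c))
    ca = displace(midpoint(c, a))
    out = []
    for sub in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        out.extend(tessellate(sub, levels - 1))
    return out

coarse = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
fine = tessellate(coarse, 4)
# One stored triangle becomes 4**4 = 256 on-chip triangles; the saving the
# slide claims comes from never having to store or send the fine mesh.
print(len(fine))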

~~~

Give me my damn cookie!


~~~

There's 22 pages of the stuff!
I'll be damned if I wait for it all to load...

Last edited by TheDarkRaven (2007-05-13 08:34:18)

Smithereener
Member
+138|6787|California

Vr-Zone wrote:

Pg.1 - Radeon X2000 Series!

ATi's 2000 series was announced to the media during the late April to beginning of May period, at various locations throughout the world. I attended the Press Briefing Day in Malaysia and caught sight of the much awaited Radeon X2900XT in action. Not only was this high-end model introduced to us, the lower-end series in the 2000 Family was also talked about.
http://resources.vr-zone.com.sg/Shamino … unch/4.jpg
Two demo setups were employed during the Press Briefing, each housing a Radeon X2900XT, both running AMD processors. Notice the 680W power supply seen above.
http://resources.vr-zone.com.sg/Shamino … unch/2.jpg
1 x PCIE 8-pin connector with 1 x PCIE 6-pin connector. The card will run fine with 2 x PCIE 6-pin power connectors as well.
http://resources.vr-zone.com.sg/Shamino … nch/32.jpg
We were first introduced to the three market segments the HD 2900, 2600 and 2400 are gunning for. The speaker answered our queries regarding power consumption of the above-mentioned cards: over 200W for the X2900XT, around 45W for the X2600 and 25-35W for the X2400.
http://resources.vr-zone.com.sg/Shamino … nch/12.jpg
Among the new features of the 2000 series is a programmable Tessellation Unit. This feature enables the GPU to add detail when support is written into games, so that game developers have a much easier time doing up simpler geometry and letting the GPU do the enhancement. This also supposedly saves on GPU horsepower, as you can see below: you get 2.3x the performance compared to a conventionally rendered scene.

Pg.2 - New Anti-Aliasing
Anti-Aliasing gets an upgrade with ATi's new CFAA
http://resources.vr-zone.com.sg/Shamino … nch/16.jpg
http://resources.vr-zone.com.sg/Shamino … nch/17.jpg
Custom-Filter Anti-Aliasing's wide-tent filter takes more sample points for the final AA output. We'll see how Anti-Aliasing improves later on.
http://resources.vr-zone.com.sg/Shamino … nch/31.jpg
Edit: In the end, CFAA goes up to 24x using the Edge Detect filter mode on top of 8x, even on a single card, but only with the newer beta drivers.
http://resources.vr-zone.com.sg/Shamino … nch/13.jpg
320 Stream Processors onboard the X2900XT core. Branch Execution Units take over data-flow handling from the Stream Processors, a 'division of labour' specialisation that improves the efficiency at which the Stream Processors work.
http://resources.vr-zone.com.sg/Shamino … nch/21.jpg
The Radeon 2000 series also integrates full HDCP support with an audio controller on the card, providing an audio path from the motherboard through the northbridge and the PCIE slot into the card and out via the HDMI connector for one-stop video/audio output.
http://resources.vr-zone.com.sg/Shamino … nch/22.jpg
http://resources.vr-zone.com.sg/Shamino … nch/23.jpg
http://resources.vr-zone.com.sg/Shamino … nch/33.jpg
Paper specifications of the 2000 series family. This slide is basically showing how the floating-point processing power of a modern-day CPU pales in comparison to a 2000 series GPU.

Pg. 3 - Direct X10 Demos
We were also shown a neat Ruby Demo running on DirectX 10:
http://resources.vr-zone.com.sg/Shamino … unch/9.jpg
http://resources.vr-zone.com.sg/Shamino … nch/10.jpg
http://resources.vr-zone.com.sg/Shamino … nch/11.jpg
Then later, another DirectX 10 demo, this time an upcoming game called Call of Juarez:
http://resources.vr-zone.com.sg/Shamino … nch/18.jpg
http://resources.vr-zone.com.sg/Shamino … nch/19.jpg
Below, you see how different the game looks running DirectX 9 vs DirectX 10:
http://resources.vr-zone.com.sg/Shamino … nch/15.jpg
http://resources.vr-zone.com.sg/Shamino … nch/14.jpg
The most obvious differences are the much better-looking water effects and the High Dynamic Range effects.

Pg. 4 - Radeon Mobility 2000 Series
The ATi Radeon Mobility 2000 series was also talked about:
http://resources.vr-zone.com.sg/Shamino … nch/25.jpg
http://resources.vr-zone.com.sg/Shamino … nch/26.jpg
We should expect to see the first batch of 2000 series mobile GPUs in July.
And finally, the price points, which is what most people will want to see:
http://resources.vr-zone.com.sg/Shamino … nch/24.jpg
Some of the partners' cards:
http://resources.vr-zone.com.sg/Shamino … unch/6.jpg
http://resources.vr-zone.com.sg/Shamino … unch/5.jpg
http://resources.vr-zone.com.sg/Shamino … ch/126.jpg
The small crew from ATi:
http://resources.vr-zone.com.sg/Shamino … nch/27.jpg

Pg. 5 - Radeon HD X2600, X2400 Pictures
http://resources.vr-zone.com.sg/Shamino … nch/48.jpg
Notice that there is no need for a PCIE power connector, which says something about how little power is drawn.
Radeon HD X2400 Series:
http://resources.vr-zone.com.sg/Shamino … nch/44.jpg

Pg. 6 - Radeon X2900XT Pictures
USD$399 for a top-of-range graphics card from ATi seems really attractive. Let's see how this card performs and take a more in-depth look at this new red gem.
http://resources.vr-zone.com.sg/Shamino … ited/1.jpg
http://resources.vr-zone.com.sg/Shamino … ited/2.jpg
A red translucent shroud similar to what we see on the X1950XT covers the main body of the card's cooler.
http://resources.vr-zone.com.sg/Shamino … ited/4.jpg
http://resources.vr-zone.com.sg/Shamino … ited/5.jpg
The card has a 6-pin PCIE power connector alongside an 8-pin power connector to feed it. A minimum of 2 x 6-pin power connectors is needed to run the card, similar to rival NVIDIA's 8800GTX.
http://resources.vr-zone.com.sg/Shamino … ited/3.jpg
Of course all the heat from the supposedly very hot-running card has to go somewhere... that's where the shroud comes into the picture, ducting the hot air out of the case via the exhaust grille at the second slot it takes up. You wouldn't want all that heat in your casing, for sure.
http://resources.vr-zone.com.sg/Shamino … ited/6.jpg
Something we have not seen on the X1950XT is seen on this new flagship: the CrossFire connectors are now embedded onboard, two in fact, for forward and backward data transfer between primary and secondary boards. No more troublesome dongle, nor a need to differentiate between Master and Slave cards.
http://resources.vr-zone.com.sg/Shamino … ited/7.jpg
The back of the card is full of capacitors, as is the norm nowadays. The traditional ATi back-plate holds the main weight of the cooler. Not much more can be explored on the card without stripping off the cooling solution, so I proceeded to do just that.

Pg. 7 - X2900XT Cooling
Let's take a look at the cooling solution employed.
http://resources.vr-zone.com.sg:81//Sha … ted/16.jpg
The cooling solution comprises 5 main parts: the front heatspreader, back heatspreader, blower fan, plastic shroud, and the main copper heatsink aided by heatpipes.
http://resources.vr-zone.com.sg:81//Sha … ted/14.jpg
The front of the card is cooled directly by 2 components, the main heatsink and the red-colored aluminum heatspreader. The heatspreader mainly taps the heat away from the BGA GDDR3 Memory lying on the front of the card, and makes contact with some power management modules. It is also somewhat able to transfer heat to the main chunk of copper heatsink.
http://resources.vr-zone.com.sg:81//Sha … ted/17.jpg
Ah yes, the main cooling unit. This is a lovely massive chunk of shiny copper, fully arrayed with 32 rows of fine, long fins. To hasten heat transfer from the base to the outer area of the fins, 2 copper heatpipes are employed, so heat is drawn upwards from the base and also outwards. One of the best cooling units around for a GPU, I would have to say!
http://resources.vr-zone.com.sg:81//Sha … ted/18.jpg
http://resources.vr-zone.com.sg:81//Sha … ted/15.jpg
The back heatspreader does the same job as the front heatspreader except that it draws the heat away from the Memory on the back of the card.

Pg. 8 - More Pictures
Pictures of the naked card:
http://resources.vr-zone.com.sg:81//Sha … nch/39.jpg
http://resources.vr-zone.com.sg:81//Sha … nch/40.jpg
http://resources.vr-zone.com.sg:81//Sha … nch/42.jpg
http://resources.vr-zone.com.sg:81//Sha … ted/13.jpg
http://resources.vr-zone.com.sg:81//Sha … ted/11.jpg
ATi Theater 200 chipset.
http://resources.vr-zone.com.sg:81//Sha … ited/9.jpg
And there you see the brain of the board, the R600 GPU, a rather big core for sure.
http://resources.vr-zone.com.sg:81//Sha … ted/10.jpg
Wow, it's quite new, made in week 11 of 2007.
http://resources.vr-zone.com.sg:81//Sha … nch/43.jpg
The R600 Core
http://resources.vr-zone.com.sg:81//Sha … ited/8.jpg
Hynix 1.0ns GDDR3 RAM is employed to make up the 512MB of onboard GPU memory.
http://resources.vr-zone.com.sg:81//Sha … ted/12.jpg
The X2900XT uses fully digital PWM for power management of the GPU core and memory VDDQ. A Volterra VT1165 digital regulator governs the core and another of the same handles memory VDDQ. The GPU core is powered by a 6-phase module (2+4). A Volterra VT233 regulates the memory VDD voltage.

Pg. 9 - Clock, Heat, Power
With the AMD GPU Clock Tool, one can check out the 2D and 3D clockspeeds of the X2900XT.
http://resources.vr-zone.com.sg:81//Sha … nch/54.jpg
2D clockspeed is 506MHz Core and 513MHz Memory, while 3D brings about 743MHz Core and 828MHz Memory. 2D voltage for GPU is 1.08v and 3D voltage is 1.18v.
One can also check and monitor running temperature of the card with this tool:
http://resources.vr-zone.com.sg:81//Sha … nch/63.jpg
In my bare, open test setup the X2900XT definitely runs hot, as hot as or even slightly hotter than the 8800GTX. At idle you can see the core at around 54C, while loaded it goes to 71C without any overclocking. And this is a setup lying naked; imagine it in a case! Definitely one of the hottest cards around. The temperature-controlled fan on the card spends little time in slow-spin operation, spinning up not long after the start of operation and running constantly whenever 3D is run. Not too much of a worry since it's not loud, but the noise is definitely audible. What about power consumption?
I used a clamp ammeter to measure the total power consumption of the setup during idle and load situations.
http://resources.vr-zone.com.sg:81//Sha … nch/50.jpg
3DMark06's SM3.0 test "Canyon Flight" was run to record the power consumption. The whole setup was the same except for the video cards. Setup specs are on the next page.

Power Consumption of Total System             Idle         Load
PowerColor HD X2900XT                         245 watts    365 watts
Inno3D 8800GTX 575/900MHz                     250 watts    348 watts
ASUS EN8800GTS 640MB 513/792MHz               240 watts    300 watts
EVGA 8800GTS 320MB Superclocked 576/850MHz    235 watts    276 watts
ASUS EN1950XTX                                195 watts    325 watts

This card is definitely the most power-hungry bugger around, consuming 40W more power under load than its predecessor the X1950XTX and 65W more than an 8800GTS 640MB, though at idle it eats about the same amount of power as the 8800GTS 640MB. Against the 8800GTX at regular clockspeeds, it consumes about 17W more power. For a regular setup I would suggest at least a reliable brand-name 500W power supply and upwards.
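
For anyone wondering where the 40W / 65W / 17W figures come from, they follow from simple subtraction of the total-system load numbers in the table above; with the rest of the setup held constant, the difference between two system totals roughly isolates the difference between the cards themselves. A quick sketch:

# Load power of the whole test system in watts, taken from the table above.
load_watts = {
    "HD X2900XT":       365,
    "8800GTX":          348,
    "8800GTS 640MB":    300,
    "8800GTS 320MB SC": 276,
    "X1950XTX":         325,
}

# Subtracting one system total from another approximates the difference
# between the two video cards, since everything else stayed the same.
xt = load_watts["HD X2900XT"]
for name, watts in load_watts.items():
    if name != "HD X2900XT":
        print(f"X2900XT draws {xt - watts:+d} W vs {name} under load")
# -> +40 W vs X1950XTX, +65 W vs 8800GTS 640MB, +17 W vs 8800GTX, etc.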

Pg. 10 - Test Platforms, Drivers
When I first started testing the card, I was using the 8.36 Catalyst Drivers.
http://resources.vr-zone.com.sg:81//Sha … nch/51.jpg
Then, after I had finished the test runs for the X2900XT, the 8.37 came out and I had to rerun everything.
http://resources.vr-zone.com.sg:81//Sha … nch/52.jpg
Thus, I took the chance to also compare the difference between these two slight driver updates. On the 8.37 there is no 'High Quality' option under AF with the X2900XT, while there is on the 8.36. The option is also there when I put in the X1950XTX.
http://resources.vr-zone.com.sg:81//Sha … nch/86.jpg
http://resources.vr-zone.com.sg:81//Sha … nch/87.jpg
Seeing that the High Quality option for Anisotropic filtering was missing on the X2900XT on the 8.37 Drivers when it was present on the 8.36 drivers, I guessed that AF was automatically set at best quality when enabled for the X2900XT on this new set of drivers. So I ran a check between the 2 drivers with Oblivion to check out the Anisotropic Filtering. 1600x1200, 16x AF (High Quality when option was there), Temporal Anti-Aliasing at 8x Level and Wide-Tent Filter set at 16x Sampling.
http://resources.vr-zone.com.sg:81//Sha … nch/71.jpg
The filtering on the 8.37 is definitely at least on par with, or even better than, the High Quality setting on the 8.36. You get the faint impression that textures are slightly more detailed on the 8.37. So I didn't really care that the High Quality option was missing on the 8.37 drivers with the X2900XT.
http://resources.vr-zone.com.sg:81//Sha … nch/30.jpg
https://img382.imageshack.us/img382/3752/2900xtnh6.png
One look and you can tell which segment this video card is gunning for: the USD$399 price point where the GeForce 8800 GTS, its direct competitor, resides.

The driver used on the X2900XT and X1950XTX is Catalyst 8-37-4-070419a. The driver used for the 8800GTS 320/640MB and 8800GTX is ForceWare 158.22.

MipMap Detail setting on all drivers set to maximum level of High Quality. 16x Anisotropic Filtering was turned on.
http://resources.vr-zone.com.sg:81//Sha … nch/89.jpg
As of the time of testing, we did not have the latest build, issued just 3 days before the NDA was lifted. We were running 8-37-4-070419a.

The latest 8.37.4.2_47323 drivers are supposed to implement a new intelligent algorithm that increases FPS while maintaining similar image quality when running Adaptive Anti-Aliasing. In Oblivion, performance several times faster than previous drivers was claimed to have been achieved using the new adaptive AA algorithm. New optimizations for HDR applications in general resulted in a 5-30% increase in performance.

The 8.37.4.2_47323 is actually a pre-alpha driver, but it includes a preview of new 12xAA and 24xAA modes. These modes use an advanced edge detection filter that delivers edge quality while eliminating blurring.
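
ATi hasn't published the exact filter maths, so purely as a rough idea of what a custom-filter resolve with an edge-detect mode might involve, here's a toy sketch; the weights, threshold and structure below are illustrative guesses, not ATi's actual CFAA implementation:

# Toy sketch of a custom-filter AA resolve with an edge-detect mode.
# All numbers here are made up for illustration.

def tent_weight(dx, dy, radius=1.5):
    """Wide-tent filter: weight falls off linearly with distance from the pixel centre."""
    dist = (dx * dx + dy * dy) ** 0.5
    return max(0.0, 1.0 - dist / radius)

def is_edge(centre_luma, neighbour_lumas, threshold=0.1):
    """Crude edge detect: a large luminance jump to any neighbour marks an edge."""
    return any(abs(centre_luma - n) > threshold for n in neighbour_lumas)

def resolve_pixel(samples, edge_detected):
    """samples: list of (dx, dy, colour) sub-samples around this pixel."""
    if not edge_detected:
        # Interior pixel: a plain box average of the pixel's own samples is enough.
        own = [c for dx, dy, c in samples if abs(dx) <= 0.5 and abs(dy) <= 0.5]
        return sum(own) / len(own)
    # Edge pixel: pull in neighbouring samples too, weighted by the tent filter.
    # Borrowing neighbours' samples is how the higher effective counts (12x, 24x) arise.
    total_w = total_c = 0.0
    for dx, dy, c in samples:
        w = tent_weight(dx, dy)
        total_w += w
        total_c += w * c
    return total_c / total_w

# Example: a pixel sitting on a bright/dark boundary gets the wider filter applied.
samples = [(-1.0, 0.0, 0.1), (-0.25, 0.25, 0.2), (0.25, -0.25, 0.9), (1.0, 0.0, 1.0)]
print(resolve_pixel(samples, is_edge(0.5, [0.1, 1.0])))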

Own Edit: Sweet, pictures are back up.
Alright, I'm done for now. Looks like the site is down for me.

Last edited by Smithereener (2007-05-13 09:28:33)

Hurricane
Banned
+1,153|7101|Washington, DC

Now I just need to be 18 and have "high quality sperm" so I can milk sperm banks for cash.
TheDarkRaven
ATG's First Disciple
+263|7095|Birmingham, UK
Conclusion

As said before, I had results from both the Catalyst 8.36 and Catalyst 8.37 drivers. How did performance improve across this small 0.01 jump in driver version?

Taking a setting of 1600x1200 with 16xAF, I saw a major increase in performance, particularly in Company of Heroes and Quake 4. Performance went up by 11% in COH and 42% in Quake 4! This shows that the drivers are still very raw on this card; with just a minor driver revision boosting performance that much, it gives us quite a lot of hope for a fair bit of improvement to come. Let's hope for that!


In many non-Anti-Aliasing, high-definition game settings, you have seen the X2900XT push ahead of its closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or pulling ahead by a small percentage. In a select few games the GTS is slightly faster, and vice versa. When Anti-Aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games the drivers are optimized for, performing significantly better than the GTS; while AA efficiency is piss-poor in some games due to the raw driver, which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance jumped from drivers 8.36 to 8.37; that shows the potential for performance growth... a whole lot of it to reap.

It is slightly off tradition that the GPU company's flagship product sails off not to meet the flagship of its competitor, but one target lower. Then again, the lower we go down the price pyramid, the bigger the audience: more people with the budget to spend. I'd say there is no clear winner between the 8800 GTS and X2900XT; the GTS displayed more consistent performance behaviour, while the X2900XT fluctuates around due to the immature drivers. I would say that despite the heat thrown out by the GPU, the X2900XT overclocks better than the 8800GTS by 8-10%, but that means putting out a lot more heat and drawing even more power than it already consumes. So this is something potential XT buyers should take note of: the heat produced by the card is no small amount, nor is the power consumed by it, more than 60W over the GTS. What you would be investing in is a higher potential of upcoming performance boosts (including the latest pre-alpha 8.37.4.2_47323 Catalyst just released 3 days before this review) and full HDCP support with an integrated audio controller. And of course the new programmable Tessellation technology, which we will probably not see supported in games until much later.

Not the fastest video card on the market for sure, but it definitely holds its own at its current price point. We only hope that supply will be adequate and not lead to an indirect increase in prices due to shortages. We hope to see some interesting implementations from the various card partners as well, be it overclocked specifications or improved coolers.

85 marks
The Stillhouse Kid
Licensed Televulcanologist
+126|7113|Deep In The South Of Texas
Hurricane
Banned
+1,153|7101|Washington, DC

The Stillhouse Kid wrote:

ZipZoomFly already has them up for orders:

Sapphire @ $399.99
http://www.zipzoomfly.com/jsp/ProductDe … e=10005202

Diamond @ $421.99
http://www.zipzoomfly.com/jsp/ProductDe … e=10005082

HIS @ $424.99
http://www.zipzoomfly.com/jsp/ProductDe … e=10005096
zomg inexpensive
LT.Victim
Member
+1,175|7033|British Columbia, Canada
I hope it drops the 8800's prices.
Poseidon
Fudgepack DeQueef
+3,253|7008|Long Island, New York

LT.Victim wrote:

I hope it drops the 8800's prices.
As do I. I haven't bought my GTX yet but when I do I hope it's down to 500 or even less.
fatmarik
Member
+23|7053|Anywhere i am needed
it gets beaten by the 8800gtx every time, but the price of ati is way lower, so i might buy it in the near future, but i am still an nvidia fan and have never owned an ati, GO NVIDIA!
Cerpin_Taxt
Member
+155|6674
The 8800GTS 640mb seems like a better deal...cheaper and performs better. Although this might just be a driver issue.
Bell
Frosties > Cornflakes
+362|7020|UK

fatmarik wrote:

it gets beaten by the 8800gtx every time, but the price of ati is way lower, so i might buy it in the near future, but i am still an nvidia fan and have never owned an ati, GO NVIDIA!
You would expect it to get beat by the GTX; the fact is it's meant to compete with the GTS, so it's hardly fair to compare it to the GTX card.  I'm ordering mine as soon as it comes up on Overclockers, but I have one small question.  I read that it needs a 750W PSU, wouldn't this suffice?

http://www.overclockers.co.uk/showprodu … =CA-031-EN

Also read something about it needing specific connections for overclocking, anyone know if that one does ^^?

Martyn
CommieChipmunk
Member
+488|7041|Portland, OR, USA
wait wtf? I thought they were going to be better than the gtx...

However, if I decided to get one, would I be SOL because I have an eVGA 680i motherboard with the nvidia chipset... I doubt an ATi card would like an nvidia chipset very much...
Maj.Do
Member
+85|7223|good old CA
Looks pretty good for the price, and isn't it made to compete with the GTS version, not the GTX? Maybe there will be an XTX version.

Last edited by Maj.Do (2007-05-13 11:42:49)

Bell
Frosties > Cornflakes
+362|7020|UK

Yea the XTX is supposed to beat the GTX, but it was delayed, I think they had some problem with the new GDDR4, soooooooo September is the new release (I think).  This XT is to compete with the GTS (on price at least), soooo depending on the scores and benchmarks that will materialise over the next week, we can draw our conclusions as to who wins.

Martyn
Stormscythe
Aiming for the head
+88|7020|EUtopia | Austria
Hm...

WHAT NOW - ATI FANBOYS?

Sorry, had to get that one off as I'm a little disappointed. The HD2900XT is on 80nm, yet it consumes way more power than the 8800 series. It TOTALLY sucks at anti-aliasing; although image quality isn't bad, I'd have more fps with an 8800GTS than with it. Hum, I won't invest 350€ in a card that /wrists at nice AA, which you'd expect in that price range.

At least I hope for the drivers to get a little better (ffs, they had 6 months, didn't they do anything at all?) so the 8800GTS will get a serious competitor and therefore drop a little in price...

PS: Since today I'm counting how many ATI fanboys have all of a sudden become emos...

Last edited by Stormscythe (2007-05-14 05:30:50)

Bell
Frosties > Cornflakes
+362|7020|UK

Agreed ^^^  I just finished reading an article about the card; apparently in Call of Duty 2 the X1950XTX (previous flagship) was actually beating the HD 2900XT on all tested frame rates.  AMD confirmed this as it was the same as what their own tests were showing, and as it is now, there are some pretty bad driver issues which are selling the card short.  It already has its problems (power consumption!); this one shouldn't have occurred, at least not with a card that is 6 months late to market!

Bar that, I think we need to be careful comparing DX10 cards on DX9 games like Call of Duty and BF2/2142. In Call of Juarez, the ATI card beats its nvidia price competitor in early build tests, though in the newer one the nvidia card won, soooo there is some ambiguity as to who really has the better card.  Safer bet, methinks, would still be to buy a GTS; it's really up to the driver team at ATI to sort it out.

Martyn

Last edited by Bell (2007-05-14 06:33:58)

ghettoperson
Member
+1,943|7120

CommieChipmunk wrote:

wait wtf? I thought they were going to be better than the gtx...

However, if I decided to get one, would I be SOL because I have an eVGA 680i motherboard with the nvidia chipset... I doubt an ATi card would like an nvidia chipset very much...
It'll work fine, you just can't use Crossfire.
Bertster7
Confused Pothead
+1,101|7052|SE London

It looks ok.

It doesn't look as good as Nvidia's offerings though, which should shut up all the ATI fanboys who have been saying "ohhh, wait till the R600 is out" for ages and ages and ages.

I can't see it selling particularly well, which will be a real kick in the teeth for AMD, who are now playing second fiddle to Intel/Nvidia.

CommieChipmunk wrote:

wait wtf? I thought they were going to be better than the gtx...

However, if I decided to get one, would I be SOL because I have an eVGA 680i motherboard with the nvidia chipset... I doubt an ATi card would like an nvidia chipset very much...
It makes very, very little difference. Linkboost helps a bit for GPU OCing (particularly with SLI) on Nvidia cards, but that's it - and that makes very little impact on performance, certainly nowhere near the 25% boost Nvidia claim.

Last edited by Bertster7 (2007-05-14 06:32:26)

Maj.Do
Member
+85|7223|good old CA
hmm i guess ill wait a little bit longer... wonder if the XTX will be any better.  other than that *strokes 8800*
Volatile
Member
+252|7175|Sextupling in Empire

I'm somewhat disappointed, as I was expecting ATI to come out with something superior to the 8800GTX. Oh well, guess I'll be sticking with nvidia.
Scorpion0x17
can detect anyone's visible post count...
+691|7237|Cambridge (UK)

Volatile_Squirrel wrote:

I'm somewhat disappointed, as I was expecting ATI to come out with something superior to the 8800GTX. Oh well, guess I'll be sticking with nvidia.
Give it time. Seeing that the 2900s aren't even available in the shops yet, and all video cards improve in performance as the drivers get sorted out, the 2900s will probably end up better than the 8800s.
ReDevilJR
Member
+106|6822
Bertster7
Confused Pothead
+1,101|7052|SE London

ReDevilJR wrote:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127288
Cheapest Ultra OC'd WAY HIGH!!!
Bah.

The 8800 Ultra is overpriced. With adequate cooling you can achieve the same results with a GTX. The architecture is the same. The GTS is the one to buy, 320 or 640 depending on your monitor.


I think ATI are fucked unless they can corner a niche in the market. If they release great-quality mid-range cards that put the 8600s to shame, they'll be ok.