The 2900 XT does generally match the GeForce 8800 series on image quality, which was by no means a foregone conclusion. Kudos to AMD for jettisoning the Radeon X1000 series' lousy angle-dependent aniso in favor of a higher-quality default algorithm. I also happen to like the 2900 XT's custom tent filters for antialiasing an awful lot, an outcome I didn't expect until I saw them in action for myself. Now I'm hooked, and I consider the Radeon HD's image quality to be second to none on the PC as a result. Nvidia may yet even the score with its own custom AA filters, though.
The HDCP support over dual-link DVI ports and HDMI audio support are both welcome additions, too. We haven't yet had time to test CPU utilization during HD-DVD or Blu-ray playback, but we've got that on the list for a follow-up article (along with GPU overclocking, edge-detect AA filters, dual-link DVI with HDCP on the Dell 3007WFP, AMD's Stream computing plans, and a whole host of other items).
Ultimately, though, we can't overlook the fact that AMD built a GPU with 700M transistors that has 320 stream processor ALUs and a 512-bit memory interface, yet it just matches or slightly exceeds the real-world performance of the GeForce 8800 GTS. The GTS is an Nvidia G80 with 25% of its shader core disabled and only 60% of the memory bandwidth of the Radeon HD 2900 XT. That's gotta be a little embarrassing. At the same time, the Radeon HD 2900 XT draws quite a bit more power under load than the full-on GeForce 8800 GTX, and it needs a relatively noisy cooler to keep it in check. If you ask folks at AMD why they didn't aim for the performance crown with a faster version of the R600, they won't say it outright, but they will hint that leakage with this GPU on TSMC's 80HS fab process was a problem. All of the telltale signs are certainly there.
There are many things we don't yet know about the GeForce 8800 and Radeon HD 2900 GPUs, not least of which is how they will perform in DirectX 10 games. I don't think our single DX10 benchmark with a pre-release game tells us much, so we'll probably just have to wait and see. Things could look very different six months from now, even if the chips themselves haven't changed.