Now that I've filled your head with ethereal bits of RV670 theory, here's a look at the hardware.
This thin little number is the Radeon HD 3850, the lower end of the two RV670-based graphics cards. This puppy will feature a 670MHz GPU core and 256MB of GDDR3 memory running at 830MHz (or 1.66GHz effective data rate). AMD says this board uses 95W of power and rates its cooler at 31 dBA.
And this beefier specimen is the Radeon HD 3870. Unlike the GeForce 8800 GT, this beast packs a dual-slot cooler, which is both a curse and a blessing. Yes, it eats more slots, but it also exhausts hot air out the back of the case and should be able to provide more cooling with less noise than a single-slot design. Aided by this cooler, the 3870 reaches core speeds of 775MHz, and it runs its 512MB of GDDR4 memory at 1125MHz. The big cooler belies some fairly modest ratings, though, including a rated board power of 105W and rated noise of 34 dBA.
Both cards have twin dual-link DVI connectors and HDTV-out ports. In other words, they're pretty much what you'd expect from modern video cards.
One nice touch about AMD's new naming scheme: no suffixes like "XT" and "Pro" anymore. In the 3800 series, only numbers denote higher or lower performance, so things are much easier to decode. The folks in AMD graphics seem to have picked up this idea from the AMD CPU people, interestingly enough. Imagine that.
Doing the math and the accounting
Those of you who are familiar with this GPU architecture may be jumping ahead. With a 775MHz core clock and only a 256-bit memory interface, how will the Radeon HD 3870 match up to the GeForce 8800 GT? Let's have a look at some of the key numbers side by side, to give you a hint. Then we'll drop the bomb.
| | Peak pixel fill rate (Gpixels/s) | Peak bilinear texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak memory bandwidth (GB/s) | Peak shader arithmetic (GFLOPS) |
| --- | --- | --- | --- | --- | --- |
| GeForce 8800 GT | 9.6 | 33.6 | 16.8 | 57.6 | 504 |
| GeForce 8800 GTS | 10.0 | 12.0 | 12.0 | 64.0 | 346 |
| GeForce 8800 GTX | 13.8 | 18.4 | 18.4 | 86.4 | 518 |
| GeForce 8800 Ultra | 14.7 | 19.6 | 19.6 | 103.7 | 576 |
| Radeon HD 2900 XT | 11.9 | 11.9 | 11.9 | 105.6 | 475 |
| Radeon HD 3850 | 10.7 | 10.7 | 10.7 | 53.1 | 429 |
| Radeon HD 3870 | 12.4 | 12.4 | 12.4 | 72.0 | 496 |
Here are some of the key metrics for various enthusiast-class cards. We already know that the 3800 series' ostensible competition, the GeForce 8800 GT, pretty much comprehensively outperforms the Radeon HD 2900 XT. As you can see, the HD 3870 slightly surpasses the 2900 XT in terms of pixel and texture throughput and peak shader capacity, but doesn't have the same mammoth memory bandwidth. And, notably, the HD 3870 trails the 8800 GT in terms of texturing capacity and shader arithmetic. (Be aware that simple comparisons between GPU architectures on shader throughput are tricky. Another way of counting would reduce the GeForce 8-series cards' numbers here by a third, and justifiably so.)
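If you want to see where these peak figures come from, they're simple products of clock speeds and functional unit counts. Here's a quick back-of-the-envelope sketch; the ROP, texture unit, and stream processor counts are taken from the cards' public specs, and the flops-per-ALU factor is the accounting choice mentioned above (counting 3 flops per clock for the GeForce 8800 GT's MAD+MUL, versus 2 for the MAD alone):

```python
def peak_metrics(rops, tex_units, core_ghz, bus_bits, mem_ghz,
                 alus, shader_ghz, flops_per_alu):
    """Return (Gpixels/s, Gtexels/s, GB/s, GFLOPS), rounded for display."""
    return (round(rops * core_ghz, 1),             # pixel fill: ROPs x core clock
            round(tex_units * core_ghz, 1),        # texel rate: filter units x core clock
            round(bus_bits / 8 * mem_ghz * 2, 1),  # bandwidth: bytes/clock x DDR data rate
            round(alus * shader_ghz * flops_per_alu, 1))

# Radeon HD 3870: 16 ROPs, 16 texture units, 775MHz core,
# 256-bit bus, 1125MHz GDDR4, 320 stream processors at core clock,
# 2 flops (one MAD) per ALU per clock
print(peak_metrics(16, 16, 0.775, 256, 1.125, 320, 0.775, 2))
# -> (12.4, 12.4, 72.0, 496.0)

# GeForce 8800 GT: 16 ROPs, 56 texture filtering units, 600MHz core,
# 256-bit bus, 900MHz GDDR3, 112 SPs at 1.5GHz, counted at 3 flops
# (MAD + MUL); counting the MAD alone trims the 504 GFLOPS by a third
print(peak_metrics(16, 56, 0.600, 256, 0.900, 112, 1.500, 3))
# -> (9.6, 33.6, 57.6, 504.0)
```

Note the 3870's bandwidth math: a 256-bit bus moves 32 bytes per clock, doubled by GDDR4's two transfers per cycle, so 32 × 1.125GHz × 2 = 72 GB/s, versus the 2900 XT's 512-bit bus at 105.6 GB/s.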
AMD apparently looked at these numbers, thought long and hard, and came to some of the same conclusions we did: doesn't look like the 3870's gonna perform quite as well as the 8800 GT. So here are some additional numbers for you: the Radeon HD 3850 should show up at online retailers today for $179, as should the HD 3870 at $219. This presents an interesting situation. The first wave of 8800 GTs largely sold out at most places, and prices rose above Nvidia's projected "$199 to $249" range as a result. If AMD can supply enough of these cards and keep prices down, they may offer a compelling alternative to the 8800 GT, even if they're not quite as fast overall. That certainly seems to be the hope in the halls of the former ATI. Whether that will come to pass, well, I dunno. Let's see how these things actually perform.