AMD’s Radeon HD 4870 X2 graphics card

This should be fairly painless. We’ve already previewed the Radeon HD 4870 X2 for you, and today that card is becoming official and available for purchase. To mark the occasion, we’ve wrangled not one but two of these dual-GPU monsters and paired them up via CrossFire. We’ve also assembled a murderers’ row of GeForce GTX cards for a comprehensive look at the next generation of multi-GPU madness. Can we set a new record for power draw at the wall socket? Oh, I think you know we can.

An X2 refresher

You really should read our preview of the Radeon HD 4870 X2 if you haven’t already. There we explained the ins and outs of “CrossFire on a stick” technology and even took some naughty pictures of a naked 4870 X2 card. There’s not a lot more to be said about the card, which sports two Radeon HD 4870 graphics processors and twin banks of 1GB GDDR5 memory—one for each GPU. This is the same basic sort of layout we saw in the Radeon HD 3870 X2 before it.

We do know a little more about the 4870 X2 now, though. Some of that knowledge is encapsulated in this nifty logical block diagram of the 4870 X2 provided by AMD:

A nifty logical block diagram of the 4870 X2. Source: AMD.

This diagram confirms what we’d suspected: the X2 provides additional bandwidth between its two RV770 graphics chips by means of a dedicated “sideport” connection. You may have noticed that the sideport offers 5GB/s in each direction, much like a PCIe 2.0 x16 link. AMD says the sideport is electrically similar to PCI Express but simpler, since it’s only intended as a peer-to-peer link between GPUs. This link augments the bandwidth already available via the X2’s PCI Express lanes and its CrossFire bridge interface (CFBI in the diagram), which is used only to pass final frames from one GPU to the other for compositing. The sideport should help in cases where multi-GPU applications have typically had performance scaling problems, such as when texture synchronization between the GPUs becomes a bottleneck.

Whether the sideport connection will be a big help is an open question, though. The PCI Express switch on the X2 already manages traffic for 48 lanes worth of Gen 2 PCIe connectivity, including 16 lanes to each GPU and 16 lanes to the rest of the system. On top of that, AMD says the bridge chip supports a broadcast write function that can relay data to both of the X2’s graphics processors simultaneously, conserving bandwidth.

Another thing we now know about the 4870 X2 is its price. AMD is aiming for the $549 mark, which is just a tad shy of what you’d pay for two Radeon HD 4870 cards. Since the X2 comes with 1GB of memory per GPU rather than 512MB like most current Radeon HD 4870 cards, it’s really not a bad deal, as far as embarrassingly expensive video cards go.

And here’s a bit of a shock: the card pictured above comes from the folks at Palit, who have been hawking GeForce cards on these shores for a little while now. They are an equal opportunity eye candy vendor these days, which leads to another interesting development…

Yes, folks, the atomic frog has joined the red team, and he’s apparently kinda cheesed off about something. Watch out, or he’ll unleash his robotic bagpipes on your ass. They are full of liquid metal awesome.

Speaking of awesomeness, Radeon board vendors have another surprise in store for us soon, as well: Radeon HD 4850 X2 cards. These things will feature two Radeon HD 4850 GPUs, each with 512MB of GDDR3 memory. We haven’t yet procured one of these things, but I’d expect them to perform more or less like a pair of Radeon HD 4850s in CrossFire, which is pretty darned fast. The price? $399, or less than the going rate for a GeForce GTX 280. Now, that’s just not even fair.

AMD says we can expect 4850 X2s anytime now, between the middle of August and the third week of the month.

Considering the CrossFireX possibilities

The Radeon HD 4870 X2 comes with a single CrossFire bridge connector, which opens up some intriguing possibilities. The most obvious of those is the potential of harnessing two X2 cards together for quad-GPU mayhem. We’ve done it, of course. Another possibility is hooking the 4870 X2 together with a card like this one…

Diamond’s Radeon HD 4870

Diamond was incredibly kind to provide this Radeon HD 4870 card for our use in multi-GPU testing, and we’ve done exactly that. This is a single-GPU card with 512MB of GDDR5 memory, my current favorite graphics card value. The X2 can team up with it to achieve a three-way CrossFireX config. You do end up compromising on total memory when you go this route, however, because the CrossFire gods demand symmetry. AMD’s drivers will treat each GPU in the team as if it has 512MB of memory attached to it, since that’s the lowest common denominator. We’ll see whether (and how much) that hurts when we get to our performance results.
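To make the symmetry rule concrete, here’s a trivial sketch of the “lowest common denominator” behavior described above. The dictionary of per-GPU memory sizes is illustrative, not AMD’s actual driver logic:

```python
# Sketch of the "lowest common denominator" memory rule in a mixed
# CrossFireX team (illustrative numbers, not AMD's driver internals).
per_gpu_memory_mb = {
    "Radeon HD 4870 X2, GPU 0": 1024,
    "Radeon HD 4870 X2, GPU 1": 1024,
    "Radeon HD 4870": 512,
}

# The drivers treat every GPU in the team as if it had the smallest
# memory pool present, so the X2's extra 512MB per GPU goes unused here.
effective_mb = min(per_gpu_memory_mb.values())
print(effective_mb)  # -> 512
```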

New GeForces, newer prices

The 4870 X2’s main competition, duh, comes from Nvidia’s GeForce GTX 200-series lineup. That’s elementary, but we do need to recalibrate our expectations somewhat in light of the impressive downward movement of GeForce GTX prices since the cards’ introductions. Let’s have a look at the cards we used for testing as examples.

Like our 4870 X2, many of the GeForce GTX 260 and 280 cards we used in testing came from Palit, who seems to be everywhere lately. They were kind enough to provide us with a big chunk of the cards we used for this article. The two cards pictured above actually are different products, although telling them apart isn’t easy. The one on the right is a GeForce GTX 280, and that card is currently selling for $429.99 at Newegg, along with a (boo! hiss!) mail-in rebate that, if and when you receive the funds, will take the net price down to 400 bucks. That’s a long, long way from the $649 introductory price. On the left in the picture above is a GeForce GTX 260, currently listing for $289.99 at Newegg and $269.99 after rebate. Again, quite the drop from its $399 debut.

In light of these prices, Nvidia claims the most appropriate competition for the Radeon HD 4870 X2 would be a pair of GeForce GTX 260 cards in SLI. That does make some sense, but remember that running a couple of GTX 260 cards in SLI imposes some strict requirements. Most notably, you’ll need to have a motherboard based on an nForce chipset, because Nvidia has restricted SLI to its own core logic. On top of that, the GTX 260 SLI config will chew up two PCIe x16 slots and (including coolers) a total of four expansion slots inside of your PC, and it will require twice as many PCIe power leads as the 4870 X2. And don’t get me started on multi-monitor support. Let’s just say AMD supports dual monitors reasonably well with the X2 and other CrossFire implementations, while SLI requires the user to engage in a manual mode-switching ritual before launching a game.

Test notes

Let me say a word or two about video drivers. When you’ve compiled the sheer volume of test results we have in the following pages, driver revision management is bound to get tricky. Some of the single-card results from prior-gen GPUs were obtained with somewhat older drivers. We decided to go ahead and include them here for those who are interested, although the comparison to newer cards and drivers may not be exact.

Also, the drivers we used for newer cards are pre-release or beta drivers, which may create some confusion. For instance, the driver revision we used for the Radeon HD 4850 and 4870 is newer than the WHQL’ed Catalyst 8.7 release available on AMD’s website, despite the 8.5.xx version string we’ve reported. Similarly, the 177.39 drivers we used for the GeForce cards are newer than the 177.41 WHQL drivers posted on Nvidia’s website. And, of course, the drivers we used on the 4870 X2 are brand-spanking-new and still unreleased.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme QX9650 3.0GHz (both systems)
System bus: 1333MHz (333MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6 (BIOS F9a) / EVGA nForce 780i SLI (BIOS P05p)
North bridge: X38 MCH / 780i SLI SPP
South bridge: ICH9R / 780i SLI MCP
Chipset drivers: INF update 8.3.1.1009 with Matrix Storage Manager 7.8 (X38) / ForceWare 15.17 (nForce 780i)
Memory size: 4GB (4 DIMMs) in both systems
Memory type: 2 x Corsair TWIN2X2048-8500C5D DDR2 SDRAM at 800MHz
Memory timings: CAS latency (CL) 5, RAS to CAS delay (tRCD) 5, RAS precharge (tRP) 5, cycle time (tRAS) 18, 2T command rate
Audio: integrated ICH9R/ALC889A / nForce 780i SLI MCP/ALC885, both with RealTek 6.0.1.5618 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x64 Edition with Service Pack 1 and the DirectX March 2008 update

Graphics configs tested on the X38 system:

Radeon HD 2900 XT 512MB PCIe with Catalyst 8.5 drivers
Asus Radeon HD 3870 512MB PCIe with Catalyst 8.5 drivers
Radeon HD 3870 X2 1GB PCIe with Catalyst 8.5 drivers
Radeon HD 4850 512MB PCIe (single and dual) with Catalyst 8.501.1-080612a-064906E-ATI drivers
Radeon HD 4870 512MB PCIe (single and dual) with Catalyst 8.501.1-080612a-064906E-ATI drivers
Radeon HD 4870 X2 2GB PCIe (single and dual) with Catalyst 8.52-2-080722a-066081E-ATI drivers
Diamond Radeon HD 4870 512MB + Radeon HD 4870 X2 2GB PCIe with Catalyst 8.52-2-080722a-066081E-ATI drivers

Graphics configs tested on the nForce 780i SLI system:

MSI GeForce 8800 GTX 768MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GTX 512MB PCIe with ForceWare 175.16 drivers
XFX GeForce 9800 GTX XXX 512MB PCIe (single and dual) with ForceWare 177.39 drivers
GeForce 9800 GTX+ 512MB PCIe with ForceWare 177.39 drivers
XFX GeForce 9800 GX2 1GB PCIe with ForceWare 175.16 drivers
GeForce GTX 260 896MB PCIe with ForceWare 177.34 drivers
Palit GeForce GTX 260 896MB PCIe (dual and triple) with ForceWare 177.39 drivers
GeForce GTX 280 1GB PCIe with ForceWare 177.34 drivers
Palit + XFX GeForce GTX 280 1GB PCIe (dual and triple) with ForceWare 177.39 drivers

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Because it needs ever more power and a different connector layout, our three-way GeForce GTX 280 SLI system used a PC Power & Cooling Turbo-Cool 1200, instead. Thanks to OCZ for providing all of these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Sizing ’em up

If you haven’t caught on by now, the Radeon HD 4870 X2 is, well, really frickin’ fast. We’ll get to the game benchmarks shortly, but we can quantify the X2’s prowess in several ways. Here it is compared to the most relevant competitors and some older cards of the same class.

                    Peak pixel    Peak bilinear    Peak bilinear    Peak memory
                    fill rate     texel filtering  FP16 filtering   bandwidth
                    (Gpixels/s)   (Gtexels/s)      (Gtexels/s)      (GB/s)

GeForce 8800 GTX    13.8          18.4             18.4             86.4
GeForce 9800 GTX    10.8          43.2             21.6             70.4
GeForce 9800 GX2    19.2          76.8             38.4             128.0
GeForce GTX 260     16.1          36.9             18.4             111.9
GeForce GTX 280     19.3          48.2             24.1             141.7
Radeon HD 2900 XT   11.9          11.9             11.9             105.6
Radeon HD 3870      12.4          12.4             12.4             72.0
Radeon HD 3870 X2   26.4          26.4             26.4             115.2
Radeon HD 4850      10.0          25.0             12.5             63.6
Radeon HD 4870      12.0          30.0             15.0             115.2
Radeon HD 4870 X2   24.0          60.0             30.0             230.4
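For the curious, the X2’s peaks can be reproduced with a little arithmetic. This sketch assumes the RV770’s commonly published specs, which the table itself doesn’t state: a 750MHz core, 16 ROPs, 40 bilinear texture units, and a 256-bit GDDR5 interface at 900MHz per GPU:

```python
# Deriving the 4870 X2's theoretical peaks from assumed RV770 specs.
CORE_MHZ = 750
ROPS = 16
TEX_UNITS = 40
BUS_BITS = 256
MEM_MHZ = 900          # GDDR5 moves 4 bits per pin per memory clock
GPUS = 2               # the X2 carries two RV770s

pixel_fill = GPUS * ROPS * CORE_MHZ / 1000                 # Gpixels/s
texel_rate = GPUS * TEX_UNITS * CORE_MHZ / 1000            # Gtexels/s
fp16_rate = texel_rate / 2                                 # FP16 filters at half rate
bandwidth = GPUS * (BUS_BITS / 8) * (MEM_MHZ * 4) / 1000   # GB/s

print(pixel_fill, texel_rate, fp16_rate, bandwidth)
# -> 24.0 60.0 30.0 230.4
```

The same formulas, with one GPU and each card’s own clocks and unit counts, reproduce the rest of the Radeon rows.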

Those are the theoreticals. Here’s how the cards measure in 3DMark’s synthetic tests.

For what it’s worth, I continue to be befuddled by the Gtexels/second numbers coming out of 3DMark Vantage. The units have to be off; they don’t match the capabilities of the cards. We first discovered this problem and asked Futuremark about it in early June. They have been very polite about telling us several times since then that the people who might fix this problem are on vacation. (Note to self: try to get a job at Futuremark.) For now, we’ll continue to assume the relative performance measured here tracks well, even if the units reported are incorrect.

And the numbers look quite nice for the 4870 X2, which beats out its ostensible competition, the dual GeForce GTX 260 SLI setup, in both tests and proves to be easily the fastest “single card” around in these two key metrics.

Then again, shader processing is quickly becoming the primary performance constraint in newer games. Here’s how the 4870 X2 stacks up in that regard.

                    Peak shader arithmetic (GFLOPS)
                    Single-issue    Dual-issue

GeForce 8800 GTX    346             518
GeForce 9800 GTX    432             648
GeForce 9800 GX2    768             1152
GeForce GTX 260     477             715
GeForce GTX 280     622             933
Radeon HD 2900 XT   475
Radeon HD 3870      496
Radeon HD 3870 X2   1056
Radeon HD 4850      1000
Radeon HD 4870      1200
Radeon HD 4870 X2   2400
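Those peak-GFLOPS figures fall out of simple per-clock arithmetic. Here’s a sketch, assuming the usual published clocks and ALU counts (800 stream processors at 750MHz for the RV770, 240 SPs at a 1296MHz shader clock for the GTX 280), with each unit doing a two-FLOP multiply-add per clock and Nvidia’s dual-issue mode adding a co-issued MUL:

```python
# How the peak shader arithmetic numbers are computed (clocks and ALU
# counts below are assumptions taken from each chip's public specs).
def radeon_gflops(sps, core_mhz):
    # Each stream processor can do one multiply-add (2 FLOPs) per clock.
    return sps * 2 * core_mhz / 1000

def geforce_gflops(sps, shader_mhz, dual_issue=False):
    # MAD (2 FLOPs) per SP per clock; dual-issue co-issues a MUL (3 FLOPs).
    flops_per_clock = 3 if dual_issue else 2
    return sps * flops_per_clock * shader_mhz / 1000

print(radeon_gflops(800, 750))          # HD 4870 -> 1200.0
print(2 * radeon_gflops(800, 750))      # HD 4870 X2 -> 2400.0
print(geforce_gflops(240, 1296))        # GTX 280, single-issue -> 622.08
print(geforce_gflops(240, 1296, True))  # GTX 280, dual-issue -> 933.12
```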

Yeah, uh, 2.4 teraflops will probably do you pretty well. Nothing else out there even comes close. 3DMark has some synthetic shader tests that will give us a sense of the X2’s delivered shader performance.

None of the multi-GPU solutions performs particularly well in the GPU cloth and GPU particles tests, including the X2. Its additional sideport bandwidth doesn’t seem to be of any help, either; two 4870 cards in CrossFire perform similarly in those two tests.

The parallax occlusion mapping and Perlin noise tests are another story altogether. The X2 is the fastest single-card config, and dual X2s are even faster than three GeForce GTX 260 cards.

Call of Duty 4: Modern Warfare

We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. Since these are high-end graphics configs we’re testing, we enabled 4X antialiasing and 16X anisotropic filtering and turned up the game’s texture and image quality settings to their limits.

We’ve chosen to test at 1680×1050, 1920×1200, and 2560×1600—resolutions of roughly 1.8, 2.3, and 4.1 megapixels—to see how performance scales.

Well, this is a great game, but it’s pretty much CPU limited with its quality settings maxed out along with 16X aniso and 4X AA. At 2560×1600, the multi-GPU configs do separate a little bit, and the 4870 X2 comes out ahead of three GTX 260s in SLI, let alone two.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested Episode Two with the in-game image quality options cranked, with 4X AA and 16X anisotropic filtering. HDR lighting and motion blur were both enabled.

Episode Two isn’t quite as CPU-limited as Call of Duty 4, but we’re still talking about frame rate averages over 70 FPS for half of the field at 2560×1600. With that caveat on the table, we can say that the 4870 X2 is again the fastest “single” card around, although it’s a little slower than dual GTX 260s at our top resolution.

Notice, by the way, that two GeForce GTX 260s appear to be quicker than three. This isn’t an unfamiliar sight for those of us who have worked with these exotic three- and four-way GPU configs. Sometimes, the overhead of managing another GPU isn’t worth it.

Enemy Territory: Quake Wars

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though. Shadow and smooth foliage were enabled, but soft particles were disabled. Again, we used a custom timedemo recorded for use in this review.

The 4870 X2 is a little slower than two GeForce GTX 260s in Quake Wars at the highest resolution we tested, but CrossFireX scales well: our three-way 4870 setup is faster than three GTX 260s, and the four-way 4870 rig just edges out three GeForce GTX 280s. Of course, at 115 FPS, it’s all pretty much academic anyhow.

Crysis

Rather than use a timedemo, I tested Crysis by playing the game and using FRAPS to record frame rates. Because this way of doing things can introduce a lot of variation from one run to the next, I tested each card in five 60-second gameplay sessions.
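For clarity, here’s a hypothetical sketch of how five 60-second FRAPS sessions become one average frame rate. The frame counts are invented, and TR’s actual tooling may aggregate the runs differently:

```python
# Hypothetical sketch of the FRAPS procedure: five 60-second gameplay
# sessions per card, reduced to a single average FPS (sample data invented).
SESSION_SECONDS = 60
frames_per_session = [2130, 2088, 2154, 2101, 2119]  # made-up frame counts

# FPS for each session, then a simple mean across the five runs.
fps_per_session = [n / SESSION_SECONDS for n in frames_per_session]
average_fps = sum(fps_per_session) / len(fps_per_session)
print(round(average_fps, 1))  # -> 35.3
```

Running five separate sessions and averaging smooths out the run-to-run variation that manual gameplay introduces, which is the point of the procedure.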

Also, I’ve chosen a new area for testing Crysis. This time, I’m on a hillside in the recovery level, having a firefight with six or seven of the bad guys. As before, I’ve tested at two different settings: the game’s “High” quality presets and, again, its “Very high” ones.

We’ve clearly moved to the more difficult portion of our program. Both AMD and Nvidia have managed to squeeze additional performance out of Crysis using two GPUs, but neither company has had much success with three GPUs or more. At both quality levels we tested, Nvidia has the upper hand in dual-GPU performance, which means even the GeForce 9800 GX2—an older breed of dual-GPU graphics card—outperforms the 4870 X2.

Assassin’s Creed

There has been some controversy surrounding the PC version of Assassin’s Creed, but I couldn’t resist testing it, in part because it’s such a gorgeous, well-produced game. Also, hey, I was curious to see how the performance picture looks for myself. The originally shipped version of this game can take advantage of the Radeon HD 3000- and 4000-series GPUs’ DirectX 10.1 capabilities to get a frame rate boost with antialiasing, and as you may have heard, Ubisoft chose to remove the DX10.1 path in an update to the game. I chose to test the game without this patch, leaving DX10.1 support intact.

I used our standard FRAPS procedure here, five sessions of 60 seconds each, while free-running across the rooftops in Damascus. All of the game’s quality options were maxed out, and I had to edit a config file manually in order to enable 4X AA at this resolution.

Here’s another game where two GPUs can help, but additional ones just get in the way. This time around, though, the Radeons clearly have the upper hand, no doubt due in part to their support for this game’s optimized antialiasing path under DirectX 10.1.

Race Driver GRID

I tested this absolutely gorgeous-looking game with FRAPS, as well, and in order to keep things simple, I decided to capture frame rates over a single, longer session as I raced around the track. This approach has the advantage of letting me report second-by-second frame-rate results. I’ve left out some of the lower-end solutions here for reasons I’ll explain below.

Yikes. So your first question probably is: what happened with the three-card 4870 X2 + 4870 setup? The answer: with only 512MB of memory per GPU, they just couldn’t handle GRID at this resolution. That’s why I’ve excluded some other configs, as well. They just can’t do this. The cards with more than 512MB of memory can, though, and the 4870 X2 is tops among them.

Playing this gorgeous game on a four-way 4870 CrossFire rig at 100+ FPS on a 30″ 2560×1600 display? Yeah, I love my job.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running Half-Life 2 Episode Two at 2560×1600 resolution, using the same settings we did for performance testing.

We didn’t test idle power use for the 4870 X2 in our preview because AMD didn’t have all of its PowerPlay mojo working correctly yet. Now it can be told, and the story isn’t too bad, considering. The 4870 X2 system only draws 20W more at idle than the same system equipped with a single 4870.

When running Episode Two, the 4870 X2-based system draws a little more power than a comparable two-way GeForce GTX 260 SLI rig. And with dual 4870 X2s in the system, we do indeed have what is, I believe, a new power-draw record for one of our test rigs. To its credit, our 750W PC Power & Cooling PSU didn’t explode during this test. In fact, I tested power consumption last and had no idea how close to the edge the PSU must have been throughout our performance testing.

GPU temperatures

Per your requests, I’ve added GPU temperature readings to our results. I captured these using AMD’s Catalyst Control Center and Nvidia’s nTune Monitor, so we’re basically relying on the cards to report their temperatures properly. In the case of multi-GPU configs, I only got one number out of CCC. I used the highest of the numbers from the Nvidia monitoring app. These temperatures were recorded while running the “rthdribl” demo in a window. Windowed apps only seem to use one GPU, so it’s possible the dual-GPU cards could get hotter with both GPUs in action. Hard to get a temperature reading if you can’t see the monitoring app, though.

Our production 4870 X2 cards run a little bit cooler than the pre-production sample we tested, thank goodness. I’d like to have noise level results to present alongside these temperatures, but unfortunately, I just ran out of time to test it all. I may try to add those results in the next day or two, so you might check back here later.

Conclusions

If you’ve been reading TR for any time at all, you know all of the caveats that go along with a multi-GPU graphics card like the Radeon HD 4870 X2. Without proper support in the game or a profile in the video driver, you may only see about half of the performance potential of your video card. There’s no getting around that. But in the case of this particular card, even if the worst happens, you’re falling back on the performance of one of the fastest GPUs around in the Radeon HD 4870. Not only that, but AMD’s seamless dual-monitor support really makes the X2 a more attractive product than it might otherwise be. It’s easily the quickest “single” video card you can buy, and it almost feels like a good deal at $549. You’re really getting all the video card you need for most any game, and it occupies only one PCIe x16 slot (plus a spare adjacent slot, of course). Contrary to its protestations, Nvidia has no real answer for this beast.

Now, whether you should spend $549 on a graphics card is up to you. Personally, I think you could probably stand to put away a little more in your 401K plan or maybe save up for college, but hey, whatever floats your boat. I think we can say that buying more than two GPUs for use in your own system is pretty much a waste, though, no matter your financial outlook. Many times, we found that two GPUs were faster than three, and rarely did we find a game that really needed more than two GPUs. However, if you want the absolute ultimate graphics subsystem, you’ll find it in a pair of Radeon HD 4870 X2 cards, which unspooled a fluid ribbon of track in front of us in GRID at over 100 FPS at 2560×1600 resolution. Now, that same system pulled over 750W at the wall socket, but in return it gave us GRID bliss. And it’s a lot cheaper than buying a Porsche.

Comments closed
    • Damage
    • 11 years ago

    I’ve just updated the theoretical GPU capacity table in this review to correct the fill rate numbers for the GeForce GTX 260. The revised numbers are slightly lower. The performance results remain unaffected.

    • robspierre6
    • 11 years ago

    This is one great review.Techreport is the number 1 hardware site.
    Screw guru3d, Hexus, anandtech, bjorn3d…..and all Nvidia paid out sites.
    Techreport and tomshardware are the best.

      • Nitrodist
      • 11 years ago

      Did you seriously just put TH along with TR?

    • elite3124
    • 11 years ago

    I’m confused. 3D guru did a review of the 280 and got better numbers than the 4870, and in their review they used newer drivers than in the TR review.
    http://www.guru3d.com/article/bfg-geforce-gtx-280-ocx-review/9

      • robspierre6
      • 11 years ago

      3dGuru is funded by XFX.Their reviews are fake.

    • Usacomp2k3
    • 11 years ago

    It just dawned on me that AMD has come full circle. The AMD x2 4800 cpu came out, what, 5 years ago and now we have an AMD 4800 series that comes in X2 flavors, except as video cards and not cpu’s.

    • sigher
    • 11 years ago

    How come these tests always use 16xAF, I find 8 or even 4 does the job fine and the difference between that and 16 is negligible, but they do the AA at 4x and not max, so why AF max and AA more realistic?
    You should do a poll on what AF people generally use, I would be interested to see the results on that one.

    Well at least test are done with some AF, I think you must run test always with at least some AF because playing nowadays with no AF is just silly and unacceptable, and yet you still see sites do test without any AF and I don’t imagine anybody runs games with no AF at all, in fact I wonder why there even is an ‘off’ position in control panels for AF, and I think one of the companies should have the balls to set the minimum at 2xAF, after all there’s no ‘turn off DX acceleration’ button either is there.
    Well maybe for 2D style games it’s handy to be able to turn it off, although those games are hardly taxing the card enough to need to turn it off.

    • shtal
    • 11 years ago

    ATI simply razzing Nvidia – because Nvidia always brag with performance crown. ATI was waiting long time for this moment since the days of R300.

    Check out this video
    http://forum.donanimhaber.com/m_25688835/tm.htm

    • shtal
    • 11 years ago

    I still wonder why people worry so much about heat?

    If company tested and released this beast; who cares if it runs hot, ATI guarantees for 2 years before it dies.

      • Valhalla926
      • 11 years ago

      But some vendor only guarantee 1 year, and I think most people keep these around for more then just 1 year anyway.

      That, and if the card did die on you, and the vendor put you through bureaucratic hell with a replacement, you would care much more about heat.

    • A_Pickle
    • 11 years ago

    I think it’s time to start showing Assassin’s Creed without the DirectX 10.1 codepath, as well as all-the-way up-to-date. It almost seems a little ATI-bent if you keep doing the benchmark while Assassin’s Creed has the DX 10.1 codepath, a lot of players (myself included) will probably own an ATI card, and would like to see how well it does fully updated (for stability and in-game bug reduction) even with an ATI card.

    Also, I’m gonna second the resolution notion… I really think that tests should start at 1440×900 at minimum, and there probably ought to be a set of 1280×1024 benchmarks out there.

    I know it’s hard work, but… I mean, that’s what Tech Report is. Tech Report has the most thorough reviews on this side of the blogosphere, I mean… there’s no comparison. That’s why I’m here at Tech Report, because shy of you and Anandtech, NO ONE ever publishes 3D Studio Max or Adobe Premiere Pro scores for a CPU. Well, the same goes for graphics cards, and while I’m perfectly content observing the scores at 1920×1200, a lot of other people could (and probably are) missing out on the sheer amazing that is Tech Report.

    Just my thoughts…

      • robspierre6
      • 11 years ago

      Anandtech ies a INTEL Nvidia paid out site.I don’t trust their reviews.

    • ish718
    • 11 years ago

    Conclusion:
    HD4870 x2 crossfire is pointless and ineffective but I expected that.
    HD4870 x2 is a great card never the less.

    • Rza79
    • 11 years ago

    Why does Anand report that the Sideports are disabled?
    Can you confirm this?

    • HurgyMcGurgyGurg
    • 11 years ago

    Great review.

    Since we are slowing in the folding department, I think a good way to get us back on course is to really start pushing gpu folding, any idea how many ppd a 2x 4870X2 system with a nice quad core would get?

    Now it is hard to get ppd accurately especially since support for the newest cards isn’t always there at 100% on launch, but still maybe it might be worth including folding benchmarks in gpu reviews now just like in cpu reviews.

    Just a thought.

      • wingless
      • 11 years ago

      F@H needs to optimize their code for ATI hardware. Nvidia’s CUDA makes it relatively easy to program for the G80 and newer arch so they have GeForce cards putting out twice the PPD compared to ATI hardware. The sad part is that ATI hardware is capable of higher output, but it is difficult for them to utilize it. Hopefully some advances in OpenCL will change all that in the coming months.

      F@H also said they want to eventually separate the ATI and Nvidia GPU clients. This should help things along for ATI; however, I wouldn’t count on this happening until 2009.

    • PRIME1
    • 11 years ago

    You can get a 9800GX2 for $280

    It actually beats the 4870X2 in Crysis and stays within a small margin in many other games.

    Not bad for a card that costs upwards of $300 less.

      • Fighterpilot
      • 11 years ago

      Well the GX2 beats the 260 AND 280 in some games so….why buy either of them then?
      http://www.techreport.com/articles.x/14934/10

        • PRIME1
        • 11 years ago

        Well for one none of them cost $560 now.

        Looks like you are on the wrong side of the price gun now. Ha!

      • robspierre6
      • 11 years ago

      The 9800gx2 is a koke .Faster!!!! the 4870×2 is beating 2-3 280gtx’s in sli.
      Are you trolling or what…?

    • Fighterpilot
    • 11 years ago

    Posted at 09:15 AM on Aug 12th 2008, PRIME1: “#1, Can’t find it on Newegg.” Posted at 11:47 AM on Aug 12th 2008, PRIME1: “Newegg has em up now lol” oh no...a 2 hour paper launch 🙂

      • PRIME1
      • 11 years ago

      Well if your going to be a tool about it (and clearly you are)….

      It’s a 1 month paper launch
      http://www.techreport.com/articles.x/15105

      How about your post in the forum saying these were going to cost $449? LOL!

    • Chrispy_
    • 11 years ago

    Sorry to post off-topic here but EVERY time I see this sentence I can’t help but wonder how TR gets away with it 🙂

    “Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.”

    In my books that’s a veiled insult - using the lowest possible baseline to compare them against. Change the context just slightly and the veil becomes more transparent: “Thanks to McDonalds for providing us with food during our testing. Their flavour, service, and aftertaste are easily superior to /[

      • Krogoth
      • 11 years ago

Dude, Damage was talking about no-name, random crap that is dirt-cheap for a reason. In my experience, they almost never operate at rated speed without issues. They also have a nasty tendency toward outright failure.

Any name brand's value line is worth the extra few $$$$ for peace of mind that their stuff has nowhere near the same number of issues.

BTW, I believe Damage is referring to his infamous old article about a cheapo, unnamed stick:

https://techreport.com/discussions.x/2451

Sorry, the pictures and the article's contents are no longer there. Unless Damage has it stored in some obscure area on one of his servers.

      • derFunkenstein
      • 11 years ago

The truth is, scavenging a dumpster is better than MacDo's a lot of the time. 😉

        • Meadows
        • 11 years ago

        I disagree.

          • poulpy
          • 11 years ago

          Good for you Captain Serious!

          • A_Pickle
          • 11 years ago

          Me too. I happen to like a few double cheeseburgers in my diet. No fuss, just a double cheeseburger and a soda.

      • d2brothe
      • 11 years ago

      Point is, your average desktop computer from OEM ‘X’ comes with noname ram…

        • Usacomp2k3
        • 11 years ago

        Dell actually uses name-brand RAM in the business desktops. Crucial, IIRC.

    • PRIME1
    • 11 years ago
    • DrDillyBar
    • 11 years ago

    Great review.
I'm still surprised that after all the buzz about AC and DX10.1, and how it impacted performance in a positive way, it's all of a sudden swept under the rug and only comes to light when TR tests a video card.
I'll take ~20% more performance from a DX10.1 code path any day over a PhysX-enabled driver. Chalk it up to mainstream impressions, I suppose.

      • glynor
      • 11 years ago

Agreed completely. If you look at Anand's "review" (I had to put that in quotes because it was laughably thin compared to most of the others) and compare the Assassin's Creed numbers with these, the difference is substantial. TR's numbers show the 4870 X2 beating not only the 260 SLI setup but actually substantially beating the 280 SLI setup. In Anand's numbers (same resolution, same AA settings), it loses not just to the 280 SLI setup but to a single GTX 280 card.

      Awful fishy, that. Especially considering the money involved in the deal.

    • ChangWang
    • 11 years ago

    Nice review! Looks like a lot of those configurations are bottlenecked by the CPU. Any chance of OCing the CPU and running a test or 3 again?

    • derFunkenstein
    • 11 years ago

    wtf is with all the “lay off the drugs” comments? I like this style of writing – funny and informative. This is the first review where I haven’t just skimmed the benchmarks and shrugged in quite a while. Bring on MORE drugs, says I.

      • flip-mode
      • 11 years ago


        • derFunkenstein
        • 11 years ago

Heck, I got a prescription for Vicodin when I hurt my foot; I was going to offer it to Damage to keep the news posts coming. 😆

    • Forge
    • 11 years ago

    Unimpressive. Looks like multi-GPU scaling is still a sticking point for everyone, and the increasing power of individual GPUs is making multi-GPU less and less relevant each generation.

    • BoBzeBuilder
    • 11 years ago

    Call me BoBzeImpressed. I must say ATI is ahead of Nvidia in terms of multi-gpu maturity. Their 2x 4870X2 crossfire scaled pretty damn well.

      • Kulith
      • 11 years ago

Why is everybody saying it scales well?? From the charts I'm looking at (which are hopefully the same ones you are looking at), it hardly scales at all. In fact, it looks like the majority of the time the 4870 X2 in CrossFire does worse than a single 4870 X2.

        • leor
        • 11 years ago

'Cause the review is for the 4870 X2 and how it scales versus a single 4870. No card really scales well out to 4 GPUs.

          • Kulith
          • 11 years ago

That's true, the 4870 X2 does scale well compared to a single 4870.

But Bobze in particular was saying that "2x 4870X2 in crossfire" scales well compared to a single 4870 X2… which it seems to me it doesn't.

    • Thresher
    • 11 years ago

    Looking forward to a review on the 4850, that’s probably more likely to be my next card.

    To pile on to nVidia’s woes….

    The Inquirer is reporting that G92/G94 desktop parts are failing as well and not just the G84/86 notebook models.

http://www.theinquirer.net/gb/inquirer/news/2008/08/12/nvidia-g92s-g94-reportedly

nVidia has a problem.

      • Valhalla926
      • 11 years ago

      You meant this one?
http://www.techreport.com/articles.x/14967

        • Thresher
        • 11 years ago

I meant the 4850 X2. Go easy on me, I took Flexeril last night and I'm still out of it.

          • sigher
          • 11 years ago

          It was rather easy to guess you meant X2.

    • bhassel
    • 11 years ago

    Great review. Just FYI, page 4, that shader compute number should be 2.4 teraflops, not “gigaflops” 🙂

      • Damage
      • 11 years ago

      Doh… fixed. Thanks.

    • Krogoth
    • 11 years ago

    Great article, Damage. Just take it easy off the drugs. 😉

    Anyway, it seems that the 4870 X2 is making Nvidia sweat. The article also reinforces the fact that there is little benefit from going beyond two cards in a CF/SLI configuration.

Never mind that the majority of gamers still do not need more than a 9800 or 4850. The next step up, for some more $$$$, would be the GTX 260 and 4870.

    • ECH
    • 11 years ago

From what I've read at Anandtech and elsewhere, here are the downsides of the 4870 X2 "FOR NOW" (hopefully this will change in the future):
-The X2 doesn't work properly playing games in a window. Why would someone do that? Well, there are a lot of multitaskers out there who pause a game and do other things, like chat with friends, post messages, do work, etc. So from a resolution standpoint it may not be ideal, but today's PCs have the horsepower to handle a game in the background while you do several other things at once.

-Sideport is currently disabled. Anandtech states that it may never be enabled and speculates that the traces for it could be removed for cost savings. I can't agree with that speculation, but I still haven't forgotten the UVD debacle of the 2900 either.

    The upsides:
    -performance king
    -driver maturity will increase performance
    -all games seem to scale well
    -etc

      • sigher
      • 11 years ago

About that sideport: I don't think you need super-high-speed communication as long as each GPU has its own RAM and its own copies of everything to work on, and only the final results are combined. You only need it for syncing and some data, but not that much; basically the display buffer divided by two, I'm guessing?
But of course the sideport is also supposed to reduce latency, and that might be a real benefit, because although PCIe is fast it isn't without latency.
I wonder if they'll ever make a multi-GPU setup that can share RAM. That would certainly drop the price of cards, since they could drop a whole GB (2GB in CrossFire) of GDDR5 from a product like this 4870 X2. It seems weird to have two GPUs on one card that still can't share RAM, but I guess it would be complex to write the drivers/firmware and design the RAM interface so the GPUs don't get in each other's way. Still, the current ATI GPU can already address several parts of its RAM simultaneously, so it's not that far a leap, you'd think.
Perhaps for that to happen the GPUs would have to stop simply doing half the screen each and instead each handle parts of the actual 3D scene, and perhaps that won't happen until they've moved to raytracing/voxels, if ever.
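That display-buffer guess is easy to sanity-check with quick arithmetic (a sketch; the resolution, frame rate, and alternate-frame compositing pattern below are assumptions, not figures from the article):

```python
# Back-of-envelope estimate of inter-GPU traffic if only finished frames
# cross the link (AFR-style compositing; all figures below are assumptions).
width, height = 2560, 1600   # assumed 30" panel resolution
bytes_per_pixel = 4          # 32-bit color front buffer
fps = 60                     # assumed target frame rate

frame_bytes = width * height * bytes_per_pixel
# With two GPUs rendering alternate frames, half the displayed frames
# must travel to the GPU driving the display.
link_bytes_per_sec = frame_bytes * fps / 2

print(f"frame size: {frame_bytes / 2**20:.1f} MiB")          # ~15.6 MiB
print(f"link traffic: {link_bytes_per_sec / 1e9:.2f} GB/s")  # ~0.49 GB/s
```

Even at 2560×1600 and 60 fps that's roughly half a GB/s, far below the sideport's quoted 5GB/s each way, which supports the idea that frame compositing alone doesn't need the extra link.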

    • Lord.Blue
    • 11 years ago
    • duffy
    • 11 years ago

    It’s no wonder Scott had time to notice how gorgeous GRID is…last place in the race?

    • flip-mode
    • 11 years ago

    Excellent article.


    • Mystic-G
    • 11 years ago

I bet the framerate can go higher in CoD4. The '/com_maxfps' command is probably set to 85-90, thus keeping it from going much higher.

      • Lord.Blue
      • 11 years ago

      Good point.

      • eitje
      • 11 years ago

      agreed.

    • wingless
    • 11 years ago

    I can’t wait until we have CPUs that can push these GPUs to their limits. I hate seeing CPU limited configurations in high-end tests. Maybe heavily overclocked Core i7s can get the job done (prolly not…).

      • WaltC
      • 11 years ago

      I don’t think it’s the cpus so much as it is that game devs haven’t figured out how to tap into the tri-core and quad-core cpus that are already shipping. It looks to me like they are just doing so-so tapping into dual cores at the moment.

    • no51
    • 11 years ago

    Looks like a lot of games are being topped out already. Time for another Crysis.

      • flip-mode
      • 11 years ago

      Time for games to simultaneously look and perform great without needing more powerful cards than we’ve already got. Oh wait, we’ve got that. The last thing we need is another pitiful chunk of code like Crysis.

        • A_Pickle
        • 11 years ago

        Here’s to games that simultaneously look great and perform according to how it looks in an optimized manner. Oh, and innovative gameplay — it’d be nice for a genuinely new game to appear, rather than the same thing rehashed over and over again (Doom 3 -> Quake 4 -> Rage)…

          • Meadows
          • 11 years ago

          You can’t honestly know the first thing about Rage’s gameplay experience. Besides, it’s exactly this cliché which they will try to break with that title.

          Personally I don’t have problems with their style though and I’ll be a sucker for Doom 4 much like I liked the “previous instalment” (it really just started the story from the beginning).

    • El_MUERkO
    • 11 years ago

I like that TR loves GRID 😀

I look at my 4870 on my X38 and think to myself "x3 sounds nice" but no! I will hold out!

    • continuum
    • 11 years ago

According to Anandtech, Sideport is currently disabled: http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3

      • wingless
      • 11 years ago

      TR, can we get confirmation on this? Still the card performs amazingly well so I guess we wouldn’t need it if it is in fact disabled.

    • ub3r
    • 11 years ago

    aww why isnt it red??

      • Meadows
      • 11 years ago

      Red = bad, Green = gooood.
      It’s what games and software have taught me.

        • Fighterpilot
        • 11 years ago

'Cept now it's the other way round 🙂
The performance is killer but so is the price….. ugh, guess the "$449" rumour was too good to be true.
Still…. it just destroys the GTX 280 in pretty much every game (Crysis once again saves the day for the green team), so if you want the fastest video card on the market then it's Radeon 4870 X2 FTW.
Had to laugh at some of the results… 2 in CrossFireX is freakin' outrageous. LOL

          • jdaven
          • 11 years ago

          Except that Anandtech shows the X2 winning in Crysis. They also show the X2 losing in Assassin’s Creed. The direct opposite of TR. Go figure. I guess now you can choose the game and the tech site that gives you the performance you want.

            • Fighterpilot
            • 11 years ago

TR tested Assassin's Creed with the DX10.1 code path… which gave the ATI cards a good boost.

            • jdaven
            • 11 years ago

What about the Crysis test? I didn't read too much into the test setup, but AT had the X2 ahead by a lot while TR had it behind. That game sure is weird when it comes to testing. Maybe it shouldn't be tested at all.

            • jdaven
            • 11 years ago

Now that I look at it, the test for Crysis is really messed up. Anandtech tested the X2 at 34.2 average frames at 2560×1600, high quality/very high shaders. Tech Report tested the X2 at 24.6 average frames at 1920×1200, very high quality. There must be something going on between high quality/very high shaders and very high quality.

            • grantmeaname
            • 11 years ago

If I had to guess, I would say they didn't use FRAPS; they used a timedemo. Also, it may have something to do with the AnandTech rig being different.

            • TREE
            • 11 years ago

If you give Crysis a play, stick it on Very High settings, and compare the sunrise scene against High, you can really see the difference. Very High seems to switch on the DX10 effects, whilst High, without a doubt in my mind, only uses DX9 effects.

            • Meadows
            • 11 years ago

That's just it. Shader quality was set to Very High, and everything else was High.
TR, however, set everything to Very High.

            • sigher
            • 11 years ago

I don't know about Anand, but it seems TechReport tested the CrossFire setup on an X38 with DDR2 RAM and not DDR3 RAM, which is fair since the SLI setup also has DDR2.
If Anand tested with DDR3 (they don't seem to specify the system from a cursory glance), then you would see effects from that, I imagine, due to the CPU's behaviour.

      • Krogoth
      • 11 years ago

It is part of AMD's marketing scheme of promoting black as enthusiast-oriented. Just look at the Black Editions of their Athlon 64 X2 and Phenoms. 😉

    • Meadows
    • 11 years ago

    Holy power consumptions, Batman!

      • Krogoth
      • 11 years ago

Ahem, high-end GPUs have been power hogs under load for a very long time. I believe it all started back with the original Radeon, GeForce 256, Voodoo 5 generation.

        • Aphasia
        • 11 years ago

Even before that… the TNT2 Ultra was probably a bit worse than the GeForce 256. The former didn't work with the Asus (no-name) whitebox K7M, so I had to get a GeForce 256 instead.

      • flip-mode
      • 11 years ago

But it's the fastest.

        • Meadows
        • 11 years ago

        You’re right, I think it’s worth a fuse replacement every other month, after all.

          • flip-mode
          • 11 years ago

          Exaggerate much?

            • bthylafh
            • 11 years ago

            Him? Never.

      • asdsa
      • 11 years ago

As an uber Nvidia fan, you always look at the negative sides (pissed off that the GTX 280 is totally beaten? ;)). It is a dual-processor version of a very potent card that already has high power requirements. Of course it requires a lot of juice. Power consumption scaling is still better than the current-generation competition: 4870 to X2 vs. GTX 280 to GTX 280 SLI.

        • Meadows
        • 11 years ago

        I don’t care whether the GTX cards or the Radeon 4870 variants are better, as I don’t have money for either.

          • Nitrodist
          • 11 years ago

          Then why are you posting about how much the power consumption is if it’s never going to affect you at all?

            • Meadows
            • 11 years ago

For the same reason that others who will probably never own the product still post about the card or the article.

            • moose17145
            • 11 years ago

I will admit I will more than likely never own any of these, but then again I never thought I would own an AGP 3850. Basically, we post about things we may never own because we enjoy discussing new products. On that note… I think it'd be pretty awesome to own a 4x CrossFire setup with about 8 gigs of RAM, a top-of-the-line quad core, and an X-Fi, all humming off a 1kW PSU. Total overkill and over the top… sure… but hey, doesn't mean I can't think about how much fun it would be to own one!

          • eitje
          • 11 years ago

          We’ve secretly replaced Meadows with some store-bought Krogoth. Let’s see if anyone can tell the difference.

            • Meadows
            • 11 years ago

            I can!

    • Nitrodist
    • 11 years ago

So tempted to get this to replace my 8800 GTS 320MB…

      • MarioJP
      • 11 years ago

Will a 950-watt power supply be enough for at least one 4870 X2 and one 4870?

        • Meadows
        • 11 years ago

        Any time. Even less would do.
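For what it's worth, a rough tally backs that answer up (the board-power figures below are assumptions drawn from typical numbers of the era, not from this article):

```python
# Rough worst-case draw against a 950 W PSU (all wattages are assumptions).
hd4870_x2 = 286   # W, assumed HD 4870 X2 board power
hd4870 = 160      # W, assumed HD 4870 board power
platform = 250    # W, generous allowance for CPU, board, drives, fans

total_draw = hd4870_x2 + hd4870 + platform
headroom = 950 - total_draw
print(f"estimated draw: {total_draw} W, headroom: {headroom} W")
```

Under those assumptions a 950W unit has a couple hundred watts to spare, provided it can actually deliver the 12V amperage on its PCIe rails.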

    • dymelos
    • 11 years ago

Good to see it not falling flat on its face; I will be picking one up as soon as it is available. Sad thing is, though, it's the only way to get a 4870 with a gig of memory per core atm. Nice to see performance scaling better at 4-way than it did back when the 3870 X2 launched. Great job, ATI.

      • Nomgle
      • 11 years ago

http://www.powercolor.com/global/products_features.asp?ProductID=2241

      • dragmor
      • 11 years ago

      Depends on your market, several of the local PC stores near me have a 1GB card from Sapphire in stock.

    • matnath1
    • 11 years ago


      • buildcurious
      • 11 years ago

      I have in fact died. The comment killed me. Call the morgue. Order Damage more drugs.

      • fpsduck
      • 11 years ago

      Everything in this review is fine
      except the worried frog that looks very horrible.
      Palit should learn from other companies to make a better box.

    • TechHead
    • 11 years ago

    Awesome review, great card. How about availability though? Have they committed to this being a hard-launch?

      • PRIME1
      • 11 years ago

      Can’t find it on Newegg.

        • L1veSkull
        • 11 years ago

There are already 3 manufacturer versions on Newegg, with the Diamond 4870 X2 already sold out.
