It's been a bad month for celebrities, so forgive the poor taste. We've lost David Carradine, Ed McMahon, Michael Jackson, Farrah Fawcett, and Billy Mays. My loss isn't as great as an entire human life, but the little things always irritate us the most, don't they? I therefore request a moment of silence for my fallen Radeon HD 4870.
Around August of last year—my birthday—I bought myself a very special present: a VisionTek Radeon HD 4870 512MB. It was replacing a pair of CrossFired Radeon HD 3850s from the same company. The 3850s were pretty solid, but I wanted to go back to having a single card. Multi-GPU setups can be finicky creatures, and I hated having to disable CrossFire whenever I wanted to use my third monitor.
First, a bit of history. I've almost always been an ATI guy. I've owned five All-in-Wonder cards in my lifetime and loved them all. (And, as a side note, I do miss that line dearly.) My All-in-Wonder 9800 Pro was an awesome card for its era. When I sold off my desktop and switched to a desktop-replacement laptop, I begrudgingly used a Mobility Radeon X600 (a major disappointment in the mobile and desktop sectors to be sure) before eventually jumping to a GeForce Go 7600 in my next laptop. That 7600 was an absolute demon at the time, and it made me a believer in the Nvidia team. I went on to put a GeForce 7600 GT in the new desktop PC I built for college, and that card performed incredibly for the time. I later made the switch to the even more amazing GeForce 7950 GT, before finally using EVGA's marvelous trade-up program to secure a GeForce 8800 GTS 640MB, largely as a result of reviews on TR (which I wouldn't find myself writing for until two years later).
And that's when the tears came. While Nvidia's coverage sampled antialiasing is fantastic, and its transparency AA is in my opinion superior to AMD's alternative, it's no secret that Nvidia's Vista drivers were dismal and unreliable for the first year or so. While running the 8800 GTS, I used a cheap GeForce 7100 GS to drive my third monitor. OpenGL games crashed on loading unless I disabled the 7100, and one stage of Tomb Raider: Legend simply refused to run on the 8800. While Tomb Raider: Legend is a fantastic game, I could deal with having to play that one section on the 7100. But for better or worse, Quake 4's single-player campaign is one of my favorites, and if that's not running, I've got problems.
So, I defected back to the AMD camp when the Radeon HD 3850 came out. Life was good, for the most part. Then the Radeon HD 4800 series dropped and I, like many of you, got very excited—AMD was back and truly competitive again. I cheerfully picked up the Radeon HD 4870 and brought it home.
Once again, the tears came. While performance was absurdly better and smoother than my 3850s, the 4870's cooler was, at the default settings, woefully incapable of dealing with the heat generated in my case. Raising the fan speed helped, but ultimately, choosing between constant noise and watching my video card combust proved to be an unacceptable compromise. After experimenting with different cooling solutions, I finally settled on a Zalman VF1000 coupled with the red metal backplate from the stock AMD cooler—an "acceptable" compromise recommended on a number of different forums. While the VRMs still ran punishingly hot, the GPU remained cool enough to keep operating, provided there was adequate airflow from the front of the case.
Indeed, once I had this situation in order, the 4870 was problem-free... until last Friday, when it decided it'd had enough amid a game of Ghostbusters. After monitoring the voltage, amperage, and temperatures under stress, then swapping drivers, swapping entire video cards, and experimenting with clock speeds, the culprit finally revealed itself: the memory was dying, if not dead. The card now only operates in 2D mode; visits to 3D mode result in driver crashes if I'm lucky, but more often the system just hard locks.
I still gaze at AMD with something of a fanboy's eyes. I game at 1920x1200, so I must now decide whether to step up to a 4890 or grab a 4850 (or a 4830) as a stopgap until the DirectX 11 lineup arrives later this year. While Nvidia lures me with promises of CUDA-accelerated Adobe Premiere Pro and the chance to fool around with PhysX, I still find myself gravitating toward AMD's GPUs. They generally offer better performance for the price, in my opinion, and AMD remains the underdog. The 4800 series delivers incredible performance at 8xAA, as well, negating my desire for CSAA. I even like to fool around with shader-based antialiasing options.
Still, the disappointment is palpable. While I eye the Radeon HD 4890 for the opportunity to tweak it, I can't help but feel kind of screwed here. It's unfortunate. I didn't want to splash out on another card, since I quite liked my 4870, but what am I gonna do now?