Of course, our seal of approval doesn't guarantee perfection. A product didn't have to be flawless to be one of the Best of 2004; it just had to be better than the competition. We're picky here at TR, so while we're gushing over our favorites, expect to see a few flaws highlighted.
Read on as we hand out awards for processors, chipsets, graphics cards, motherboards, sound cards, and more.
AMD Athlon 64 3500+
AMD's Athlon 64 processor was undoubtedly the best CPU for enthusiasts in 2004. With a speedy on-die memory controller, future-proof 64-bit capabilities, and Cool'n'Quiet technology, there was a lot to love about the Athlon 64. Intel's newest Prescott Pentium 4 processors still had a few things going for them, including the creamy smoothness of Hyper-Threading and excellent video encoding performance, but those virtues weren't enough to overcome the chips' incredibly poor gaming performance, toasty temperatures, and gluttonous power consumption.
Of the Athlon 64 family, the 3500+ gets the nod because it was the first affordable chip to be released for AMD's new 939-pin socket. Socket 939 is the future of the Athlon 64, so the 3500+ was a wise choice for those looking to keep their CPU upgrade options open down the road. Since its initial release, AMD has even added a second 3500+ to its stable, the new 90nm Winchester Athlon 64, which sports lower power consumption and a slightly faster L2 cache.
High-end graphics processor
NVIDIA GeForce 6800 GT
After enduring a rough patch with the lackluster GeForce FX series, NVIDIA came roaring back in 2004 with the introduction of the GeForce 6 series of graphics processors. The graphics race looked like a dead heat when the first GeForce 6 series GPU, the NV40, arrived this past spring alongside ATI's new chips. The picture changed, though, as NVIDIA's plans for the GeForce 6 series unfolded and ATI's Radeon X800 series became mired in a seemingly intractable set of availability problems.
Nowhere was NVIDIA's practical lead in graphics more evident than at the $399 price point, where the 16-pipe GeForce 6800 GT did battle against the 12-pipe Radeon X800 Pro. When Doom 3 arrived, the 6800 GT put the smack down on the X800 Pro and even showed up the more expensive Radeon X800 XT Platinum Edition. The 6800 GT was clearly the best buy among the $399 cards. What's more, the pricier GeForce 6800 Ultra and Radeon X800 XT PE cards were hard to find and didn't offer much more in the way of performance than the 6800 GT.
The 6800 GT looked like an even better proposition as time went along, highlighting the fact that the NV40 GPU packs newer technology than the Radeon X800. A new patch for Far Cry demonstrated the performance benefits of Shader Model 3.0, and another showed off the cards' 16-bit floating-point filtering and blending for use with high-dynamic-range lighting. Then came word that Shader Model 3.0 may continue as the graphics standard for several years, raising the possibility of unusual longevity for the 6800 GT. Finally, NVIDIA unleashed its SLI technology, giving the PCI Express version of the GeForce 6800 GT the ability to run in dual-card configurations with up to twice the performance. The only real chink in the 6800 GT's armor was the fact that the video processor didn't offer the promised WMV 9 decode acceleration in early revisions of the NV40, a flaw remedied in later production runs.
As the year closed, ATI gave us a sneak peek of its ultimate answer to the 6800 GT, the 16-pipe, $299 Radeon X800 XL. If those new ATI cards arrive in volume, the GT may finally meet its match, but 2004 undoubtedly belonged to the GeForce 6800 GT.