Of course, 2006 was also dotted with events and trends that we’d just as soon forget. For every engineer working overtime on an innovative new product, there seemed to be at least one PR rep blowing smoke, one marketing team spinning hype, and one product manager making a poor design decision.
To send 2006 off in style, we’ve singled out the best enthusiast-oriented products of the year for our Best Hardware of 2006 awards. We’ve also whipped up a selection of unique awards to recognize some of the year’s most interesting innovations and annoying trends. And there were plenty of each. Read on to see our picks for the best hardware of the year, and then some.
The best hardware of 2006
Every year we’re inundated with new products, and although many are reasonably good, only a select few deserve special recognition. The winners of our Best Hardware of 2006 awards are the most compelling enthusiast-class products of the year, and not just because they nicely fill a spec sheet or price point. These products all have something special to offer, and in many cases, they break new ground where the competition fears to, or simply can’t, tread.
They’re all exceptional products, but our Best Hardware of 2006 award winners aren’t necessarily the components we’d recommend for every enthusiast-oriented PC. For specific recommendations that take into account current street prices, I suggest you check out the latest revision of our system guide.
Best processor
Intel Core 2 Duo
The NetBurst era was not kind to Intel, and during that time, we saw little reason to recommend Pentium chips over AMD’s Athlon designs. In fact, the market became so lopsided that we stopped reviewing LGA775 motherboards altogether; we just couldn’t bring ourselves to recommend any desktop platform with an Intel processor. Continually losing the performance race to AMD couldn’t have been a pleasant experience for Intel, and Prescott’s power consumption must have been embarrassing. But Intel wouldn’t be down for long.
At the beginning of 2006, Intel employees sounded positively giddy about the company’s upcoming Core 2 Duo processor. They went on and on about the chip’s performance and power efficiency, and at the time, it all sounded too good to be true. But it wasn’t. Midway through 2006, Intel unleashed its Core 2 Duo on the world and changed the desktop processor landscape almost overnight.
Intel’s Core 2 architecture was a stunning achievement, and really much more than a mobile chip re-spun for desktop applications. The chip was Intel’s first four-issue design, and it offered incredible SSE throughput thanks to its ability to execute 128-bit SSE instructions in a single clock cycle. A technology called memory disambiguation, coupled with a monster L2 cache, allowed the Core 2 to offer blistering performance without an on-die memory controller, as well.
The Core 2’s performance really was amazing, and power consumption was even more impressive, especially considering Intel’s power-hungry Prescott history. At its launch, nothing AMD had in its stable could keep up with the Core 2 lineup. Intel hasn’t increased clock speeds over the last six months, and AMD still hasn’t been able to catch up. What Intel has done, however, is put two Core 2 Duo chips on a single package, creating the world’s first quad-core desktop processor.
The quad-core Core 2 Extreme QX6700 might not be the most cost-effective chip in the Core 2 lineup, but it’s a heck of a thing to behold. And while many have criticized the chip’s two-die-per-package design as a kludge, the two-die approach allowed Intel to bring chips to market in volume before the year was out. In fact, it will probably be another six months before we see AMD release its first quad-core design.
Normally, if someone suggested that the Core 2 Duo was not the best processor of 2006, I’d be inclined to call them a hopeless fanboy. However, the Core 2’s performance and power consumption were so impressive, and its dominance so utterly complete, that anyone who doesn’t think the chip was the best processor of 2006 is more likely a madman.
Best graphics chip
Nvidia GeForce 8800
Nvidia’s GeForce 8800 wasn’t released until November of 2006, but this chip was far and away the best GPU of the year. For one, the GeForce 8800 was the first DirectX 10-compatible graphics card on the market. It was also the first unified shader design to be available on a consumer graphics card, a surprise to most of us considering how much Nvidia had downplayed the need for unified shaders in the past.
And then there was the GeForce 8800 series’ performance, which was nothing short of jaw-dropping. We all assumed Nvidia’s next-gen graphics chip would be faster than its previous flagship, of course, but I’m not sure anyone expected the GeForce 8800 to burst out of the gate with the ferocity that it did. Just a single GeForce 8800 GTX proved faster than a pair of GeForce 7900 GTXs running in SLI or two Radeon X1950 XTXs in CrossFire. And the GeForce 8800 did it with the best image quality of the lot.
Nvidia’s previous graphics chips haven’t always had impeccable image quality, especially when anisotropic filtering and antialiasing are thrown into the mix. However, the GeForce 8800 raised the bar on that front thanks to effectively angle-independent aniso and coverage sampled antialiasing.
Apart from its awe-inspiring image quality and performance, the GeForce 8800 graphics chip is a rather impressive bit of engineering. The chip has eight groups of 16 generalized floating-point stream processors that can operate on vertex or pixel data, allowing for effective load balancing depending on the content of a given scene. Those stream processors run at a whopping 1.35GHz, too, more than twice the speed of the rest of the chip. Couple that with a 384-bit memory interface and up to 768MB of onboard memory running at an effective 1.8GHz, and you’ve got yourself one heck of a graphics card.
Of course, the GeForce 8800’s mammoth processing power takes more than just a few transistors. The die is huge, and Nvidia estimates that there are 681 million transistors under the hood, twice as many as you’ll find in the graphics chip that powers the GeForce 7900 GTX. Surprisingly, though, Nvidia did a remarkable job of keeping all those transistors from consuming too much power. The GeForce 8800 GTX consumes less power under load than the Radeon X1900 XTX, and Nvidia equipped cards with a whisper-quiet fan that makes less noise than just about everything else out there.
The GeForce 8800 is sort of the ultimate high-end exotic car; it has plenty of impressive engineering to geek out over, there’s loads of power for a day at the track, and it’s practical enough for the commute to work or short trips to the grocery store for a carton of milk. That makes it the best graphics chip of 2006, by a long shot.
Best chipset
Nvidia nForce 570 SLI for Socket AM2
Last year, we couldn’t find a chipset we liked enough to name the Best Chipset of 2005, so no one took home the award. We were much more impressed with 2006’s crop of core logic chipsets, and we managed to find two worthy candidates for our Best Chipset of 2006 award. But there can be only one winner, and Nvidia’s nForce 570 SLI for Socket AM2 edged out Intel’s P965 Express.
Cue fanboy whining.
Intel launched the P965 as a mid-range chipset for Core 2 processors, but with a more advanced 90nm fabrication process and additional south bridge Serial ATA and USB ports, it looked more like a successor to the high-end 975X Express than a cheaper sidekick. The line between the 975X and P965 became even more blurred this fall when ATI announced that the P965 would pick up official support for CrossFire multi-GPU configurations, a capability previously exclusive to the 975X Express. But really, who could blame ATI? Motherboards based on the P965 were everywhere in the second half of the year, and the chipset did an admirable job of scaling from budget $100 boards for mainstream users to $200 screamers built for overclockers and enthusiasts.
While the P965’s versatility and popularity are impressive, there are two things we really don’t like about the chipset. First, it doesn’t include integrated Gigabit Ethernet networking. Intel makes some fine networking controllers, and we’d really love to see one integrated into at least the company’s high-end “R” south bridge chips, if only to provide a more consistent user experience. As it stands, motherboard makers are free to use whichever GigE chips they please, and we’ve seen performance vary quite a bit.
The P965’s lack of integrated networking would be easier to forgive if Intel hadn’t also dropped “parallel” ATA support from the chipset’s ICH8 series south bridge chips. Don’t get me wrong; I’d love to ditch clunky IDE ribbons, but the summer of 2006 was too early to drop ATA support from such a mainstream chipset. Serial ATA optical drives were still few and far between, and they’re only now becoming available at reasonable prices. The prevalence of ATA devices forced mobo makers to resort to third-party ATA controllers, many of which lacked proper DOS support, creating compatibility problems with older boot CDs and even some versions of Ghost.
In essence, the P965 Express has become a leaner, meaner, and slightly lighter version of the 975X. That’s not a bad thing, but it doesn’t have the makings of the best chipset of 2006. For that honor, we have something even more versatile: Nvidia’s nForce 570 SLI.
Nvidia has a habit of sharing chips across multiple chipsets, and the nForce 570 SLI made its way into about a bazillion different configurations. First, it was available as a single-chip implementation known as the nForce 570 SLI for Socket AM2 processors. That very same chip was also paired with a north bridge component as a part of the nForce 590 SLI chipset, which was available for both AMD and Intel CPUs. As if that wasn’t enough, the 570 was pressed into service again as the nForce 680i SLI’s south bridge.
OK, so maybe we’re short of a bazillion, but you’ve gotta give Nvidia props for picking a pony and riding it. The nForce 570 SLI is really more of a thoroughbred, and a juiced-up one at that. Nvidia managed to squeeze a 1GHz HyperTransport interface, 28 PCI Express lanes with SLI support, six Serial ATA RAID ports, 10 USB ports, “Azalia” High Definition Audio, and two Gigabit Ethernet controllers with TCP/IP offloads into just one chip. The hardware-accelerated Gigabit Ethernet even works this time around, and Nvidia also added outbound packet prioritization to the mix.
With so many integrated features, the nForce 570 SLI and its derivatives were able to offer more consistent peripheral performance than their competition. But there was still a catch: power consumption, or more specifically, the resultant heat output. The nForce 570 SLI runs hot, whether it’s acting on its own as a single chip or working under an assumed name with a north bridge riding shotgun. Heat output is a far cry from Prescott levels, but it’s forced many motherboard makers to employ elaborate heatpipe networks in order to achieve passive cooling.
Fortunately, heat is the only problem we have with the nForce 570 SLI, and it’s really not that big of a problem at all if you consider the issues that plague other chipsets. If that’s the price we have to pay for great peripheral performance, widespread availability on a variety of mid-range and high-end motherboards, great extras like the nTune system utility, and a consistent overall user experience, we’ll pay it gladly.
Best motherboard
Asus P5B Deluxe Wifi-AP Edition
A lot of great motherboards passed through the Benchmarking Sweatshop this year, but Asus’ P5B Deluxe Wifi-AP Edition was the best, in part because it enjoyed so many firsts. This was the first Core 2 motherboard we got our hands on, and one of the first enthusiast-class P965 boards on the market. The P5B Deluxe was also the first to offer Core 2 multiplier control in the BIOS, and it was the first P965 board to support CrossFire multi-GPU configurations.
Oh, and did I mention that the P5B Deluxe was the first Core 2 motherboard we were able to overclock to a front-side bus speed well beyond 400MHz? Yeah, that too.
Our P5B Deluxe’s excellent overclocking performance underscores the fact that this board’s appeal reaches beyond its ability to beat the competition to market. Asus put together one heck of a package with the Deluxe Wifi-AP Edition, including silent heatpipe cooling, just enough expansion slots, an ICH8R south bridge, eSATA connectivity, onboard 802.11g Wi-Fi, plenty of BIOS-level overclocking and fan speed options, and a handful of useful extras. More importantly, the P5B Deluxe Wifi-AP pushed the envelope and offered things you don’t often see from even a high-end motherboard. That makes it the best motherboard of 2006.
Unfortunately, even the best motherboard can’t escape annoying quirks. The board’s BIOS is a little finicky when it comes to flashing, and a flash gone wrong can render the board unable to POST. What’s worse, Asus hasn’t used a standard BIOS chip, so you can’t just pop in a replacement after a failed flash. Recovery is still possible, but the whole issue puts a bit of a smudge on what would have otherwise been a pristine award.
Best BIOS
Abit uGuru
Abit shared this award with DFI last year. uGuru continued to shine throughout 2006, but DFI’s BIOSes failed to consistently implement some of the unique features we really liked about last year’s LANParty boards. No one, including Abit, really went above and beyond in the BIOS department in 2006. That allowed Abit to enjoy the nice lead it built itself with uGuru, which still offers by far the most complete array of BIOS-level hardware monitoring and automatic fan speed control options. No one else even comes close.
Monitoring and fan speed control may be the backbone of uGuru’s appeal, but Abit’s BIOSes also offer a bevy of tweaking and overclocking options, including multiplier control for Core 2 processors and plenty of memory bus dividers and voltage options. That’s exactly what enthusiasts need in order to wring the best performance and highest overclocks from their systems, making Abit’s uGuru an easy choice for the best BIOS of 2006.
Of course, we can’t give this award away without taking a crack at Abit’s one attempt to break new ground in the BIOS realm. In 2006, Abit decided that typical blue or grey BIOS color schemes had to go, and the company served up alternatives in black, red, and pink. The black and red schemes worked well, but the pink was, well, pink. Please remember your market, Abit; enthusiasts and overclockers aren’t so much into pastels.
Best sound card
Creative X-Fi XtremeMusic
Creative’s X-Fi XtremeMusic won this award last year, and it’s back for another round. Nothing has changed, mind you; this is the very same card as a year ago. As much as it might pain me to recognize an identical product two years in a row, there simply isn’t a better consumer-level card on the market. Not that it’s easy to match the X-Fi XtremeMusic. For less than $100, you get impeccable fidelity, support for DVD-Audio playback, and the ability to handle up to 128 simultaneous 3D voices in hardware. That’s an impressive trifecta, and most competitors can’t even match two out of three, not that there are many competitors to begin with.
Creative’s own business practices have a lot to do with the dearth of competition in the sound card market, and that’s drawn the ire of the enthusiast community in the past. Enthusiasts haven’t been pleased with Creative’s driver bloat, either, although the X-Fi line does offer a relatively clean driver-only installation option that gets rid of all the optional junk. We’ve also yet to encounter the hissing and popping problems that some users report with certain motherboards.
Since it’s still the best consumer-level sound card on the market, and even more affordable this year than last, we see no reason not to let the X-Fi XtremeMusic keep its best sound card award for another year. But we’d love to have more options to choose from next year. Is anyone listening?
Best hard drive
Western Digital Raptor X
2006 treated us to a number of interesting new hard drives, including one we’ll recognize a little later on. However, the king of the hill throughout the year was the latest addition to Western Digital’s 10k-RPM Raptor line. The Raptor had long been criticized for offering too little storage, so WD doubled its capacity to 150GB for the new revision. Cache was also doubled to 16MB, and support for Native Command Queuing was added in lieu of the decidedly more obscure Tagged Command Queuing found on the older Raptor WD740GD.
Interestingly, though, WD chose not to support 300MB/s transfer rates with the new Raptor. Nevertheless, the new Raptor was easily the fastest Serial ATA hard drive on the market, thanks in large part to its spindle speed advantage over slower 7,200-RPM drives.
Western Digital actually rolled out two versions of its new Raptor in 2006: the Raptor WD1500ADFD, and the Raptor X. Both offered identical specifications and performance, but the latter featured a windowed view of the drive’s internals. That’s right: Western Digital put a window on a hard drive.
As one might expect, the windowed Raptor X cost more than the vanilla WD1500ADFD: $50 more at launch, to be exact. That premium has shrunk since the Raptor’s launch, and as far as we’re concerned, it’s a small price to pay for the unique cachet that comes with the only window available in a production hard drive. The fact that Western Digital had the engineering expertise and audacity to equip a 10k-RPM hard drive with a window makes the Raptor X an easy pick for the best hard drive of 2006.
Gigabyte i-RAM
Gigabyte’s i-RAM solid state storage device doesn’t really fit into any of our best hardware categories, but for me, it was by far the coolest product of 2006. I’ve been around long enough to remember RAM disks, and that’s essentially what the i-RAM is, but without all the hassle. Through the magic of a field programmable gate array (FPGA) chip, the i-RAM allows DDR memory to appear as a standard Serial ATA hard drive. No special drivers or software are needed, just a Serial ATA cable and standard DDR DIMMs.
In addition to being ridiculously easy to set up, the i-RAM offers blistering performance, phenomenal access times, and complete silence. With a street price hovering around $120, it’s also an incredible value considering the cost of other solid state storage devices. The fact that you can add your own memory is a nice touch, too, and that goes a long way towards keeping the i-RAM’s price affordable.
Unfortunately, the i-RAM isn’t perfect. The current version is limited to 150MB/s Serial ATA transfer rates that it has no problem saturating, and total capacity is capped at 1GB per DIMM, or 4GB total. That all but confines the i-RAM to a niche. Since it acts like a standard Serial ATA hard drive, however, there’s nothing (save for cost considerations) stopping you from building a multi-i-RAM RAID array to achieve higher capacities and even more exceptional performance.
Regardless of its status as a niche product, the i-RAM is undeniably cool. You won’t find a faster storage device that plugs into a Serial ATA port for anything close to the i-RAM’s price, even when fully loaded, and that puts a huge grin on my face.
2006 was full of events and trends that don’t necessarily fit into our best hardware categories, but that doesn’t mean they got off without recognition. We’ve cooked up a special batch of awards to reward some notable moves and to chastise a few things we’d rather forget.
Renaming for marketing
We may have named the nForce 570 SLI the best chipset of the year, but that doesn’t get Nvidia off the hook for being a little too creative with product renaming. You see, although the nForce 570 SLI was available as single-chip core logic, it also picked up a few aliases as Nvidia pressed it into service in other chipsets. In the end, that single chip ended up being known not only as the nForce 570 SLI MCP, but also as the 590 SLI MCP and the 680i SLI MCP.
And it gets worse.
You’d think that the nForce 570 SLI for Socket AM2 would be similar to the nForce 570 SLI for Intel CPUs, but you’d be wrong. The 570 SLI for Intel actually didn’t contain an nForce 570 SLI MCP. Instead, it was merely a rebadged version of an older nForce4 chipset. Confused yet?
Nvidia defended its creative renaming by suggesting that it was merely changing names to help differentiate its products. An nForce4 chipset became the nForce 570 SLI for Intel, Nvidia said, merely to denote support for Core 2 processors. But why not call it the nForce4.1? Or the nForce4 Core? Or anything that doesn’t suggest a nonexistent upgrade to an nForce 500 series chipset?
Regardless of whether Nvidia’s intent was malicious, renaming products to suit marketing agendas doesn’t sit well with us. The market is confusing enough as it is for uninitiated consumers, so be straight with them. Call a product what it is, not what you think you can get away with.
Best move we thought we’d hate
Rebadged Nvidia reference mobos
We’ve bemoaned the lack of variety in the graphics card market on more than one occasion, and 2006 was a banner year for rebadged reference designs. These days, graphics board manufacturers do little more than resticker the heatsink on high-end graphics cards, and that has resulted in a rather dull market.
With restickered reference designs diminishing the number of unique graphics card offerings, we shuddered to learn that Nvidia would be offering complete nForce 680i SLI reference motherboards for resale by its partners. But it really didn’t turn out as badly as we might have expected. The availability of a ready-for-retail nForce 680i board design hasn’t stopped Abit, Asus, and others from coming up with their own designs for the 680i SLI. What it has done, however, is allow smaller players like BFG Tech, EVGA, and even ECS to make a play for enthusiast motherboard dollars.
The real kicker here is that Nvidia actually built quite a good motherboard with its nForce 680i SLI reference design. Apart from Serial ATA signaling problems that appear to have been fixed with a BIOS update, the board offers an excellent layout, competitive performance, a feature-rich BIOS, and loads of overclocking headroom. In fact, the reference design is probably much better than what less experienced mobo makers like BFG and EVGA would have been able to come up with on their own; it’s certainly an improvement over ECS’s attempts to produce a good enthusiast board, and they’ve been building motherboards since the dawn of time.
Of course, just because we like the nForce 680i reference design doesn’t mean we want to see every motherboard maker selling the board with little more than a new sticker. Bigger players like Asus, Abit, DFI, Gigabyte, and MSI should be able to do better, and with the likes of BFG and EVGA nipping at their heels with a capable reference design, we hope it forces them to raise the stakes even higher.
It’s about time
Enthusiasts have been waiting for perpendicular recording technology to make its way into desktop hard drives for years, and we’ve had Hitachi’s “Get Perpendicular” song stuck in our heads for pretty much the whole time. Fortunately, 2006 brought us our first taste of perpendicular recording with Seagate’s Barracuda 7200.10 series. The 7200.10s were the first 3.5″ desktop drives to make use of perpendicular recording tech, and they did so in style, offering a flagship model with a whopping 750GB of storage capacity, 50% more than the next largest drive on the market.
The Barracuda 7200.10 750GB was released in May, and some eight months later, we’re still waiting for a higher capacity hard drive to hit the market. The fact that Seagate was able to hold a 50% storage capacity lead over the competition for the better part of the year is an impressive feat in and of itself, but what’s more striking is that the 750GB flagship didn’t carry an obscene price premium. The ‘cuda was the most expensive SATA hard drive on the market, of course, but its cost per gigabyte was comparable to that of 500GB drives.
Seagate was the only manufacturer to bring perpendicular recording to the desktop in 2006, but it joined Fujitsu and Hitachi in offering mobile drives with perpendicular tech. 2007 looks to be the year that perpendicular recording will really take hold, though. Hitachi has already announced plans to release a five-platter terabyte drive that uses perpendicular platters in the first quarter, and Seagate is promising a four-platter terabyte disk in the first half of the year.
Small form factor systems
Just a few short years ago, small form factor systems captivated our attention by squeezing an entire PC into a barebones form factor the size of a bread box. Small form factor systems were every bit as fast as their full-size counterparts, they were available with all the latest chipsets, and they had enough tweaking and overclocking options to keep the average enthusiast happy.
Shuttle was responsible for most of the craze, and they did small form factors better than anyone else, aggressively releasing models based on new chipsets and sockets, often even before motherboard makers. And then Shuttle decided that it wanted to become a systems vendor, shifting its focus from designing small form factor barebones to putting together complete systems. Suddenly, Shuttle was less concerned with keeping its barebones systems on the cutting edge, leaving the door open for someone else to pick up the torch.
Except no one really did. Some tried, and we’ve seen decent small form factor designs from Biostar, MSI, and others. But none have been exceptionally good, and none have inspired the sort of enthusiasm for the platform that we saw in Shuttle’s heyday.
Perhaps we’re just beginning to see the enthusiast community’s relationship with small form factor systems for what it was: a torrid affair driven by infatuation, but sorely lacking in substance. Still, we miss getting excited about cramming an obscenely powerful system into a chassis the size of a toaster.
AMD Quad FX
When Intel announced that it would bring its Kentsfield quad-core processor to market before the end of the year, AMD seemed to scramble to come up with a response. At first, AMD intended to counter Kentsfield with a “4×4” concept that combined two processors and four GPUs in boutique systems built by the likes of Alienware and Voodoo PC. As enthusiasts who prefer to roll our own systems, we weren’t impressed, and we didn’t pull any punches.
AMD was at least paying attention, because 4×4 soon morphed into the much more reasonable Quad FX platform. Quad FX dropped the four-GPU requirement in favor of a motherboard that was simply capable of supporting up to four graphics cards, and AMD pledged several processor options to meet different budgets. What’s more, AMD said Quad FX CPUs and motherboards would be available as retail products, giving enthusiasts the freedom to build their own systems based on the platform.
In essence, Quad FX became little more than an enthusiast-oriented dual Opteron platform with support for unbuffered memory. That’s not a bad idea, especially considering how suitable AMD’s processor architecture is for multi-socket systems, but the reality didn’t work out quite as well as the ideal. You see, Quad FX did in fact arrive before the end of the year, but it did so only in review sample form. Actual CPUs and motherboards are only now becoming available online, and retailers are selling them at a bit of a premium.
Availability wasn’t Quad FX’s only problem. AMD chose to launch the platform with just one compatible motherboard, and while Asus’ L1N64-SLI WS is a sight to behold, it’s too big for some ATX cases, has two very power-hungry Nvidia MCPs, and really is overkill for enthusiasts looking to revisit the multi-socket glory days of the BP6. Power consumption is actually a rather major issue for the entire Quad FX platform, especially when compared with Intel’s considerably more power-efficient quad-core alternative.
Quad FX may be a fine workstation alternative for those looking to avoid higher Opteron prices, but it’s too awkward to compete with the simplicity of adding a quad-core Kentsfield to a compatible LGA775 motherboard, and it’s not any faster. With 65nm CPUs and some reasonable mid-range motherboard options, Quad FX could have been a very desirable platform. That’s the direction the platform needs to take if AMD wants Quad FX to go anywhere in 2007.
Most underdelivered hype
Hardware physics processing
We’ve had hardware acceleration for graphics, 3D audio, and even Ethernet for years, but in 2006, we were introduced to the first dedicated physics processor. Ageia’s PhysX physics processing unit promised to bring games a new level of environmental interaction and realism that would not be possible without dedicated physics hardware.
The hype sounded good, and games were certainly ripe for an upgrade from lame rag doll effects. Unfortunately, Ageia’s delivery was a little lacking. Game support just wasn’t there, and we found that even Ageia’s own Cell Factor tech demo ran pretty well with just a high-end dual-core processor. Ageia promised more widespread game support, and we’ve been inundated with press releases detailing upcoming games that will make use of PhysX hardware, but we’re still waiting for a compelling reason to recommend a PhysX card.
Of course, Ageia wasn’t the only company pimping physics processing in 2006. ATI and Nvidia also threw their hats into the ring, promising to accelerate eye candy physics (effects physics that doesn’t impact actual gameplay) on the GPU. Both briefed the press about their plans, and ATI even had a live demo running at Computex. Yet we’re still waiting for games that will actually make use of GPU-based physics acceleration.
Game developers aren’t universally enthusiastic about hardware-based physics processing, either. Valve, for one, seems far more interested in doing physics on multi-core processors. Dedicated physics processing’s best shot at a killer app may be Unreal Tournament 2007, but that’s not due until later this year, leaving 2006 drowning in unrealized physics hype.
AMD buys ATI
It doesn’t get much bigger than AMD buying ATI. Although the jury is still out on whether this marriage will be good for both companies, and the market as a whole, there’s no denying that this was the biggest development of 2006. In fact, AMD’s acquisition of ATI had such an impact that we’re still waiting for the dust to settle; we may have to wait until the end of 2007 to have anything really profound to say on the subject.