This article is late because we've been busy trying to pin down ATI on Radeon 8500 clock speeds (which we eventually did) and delving into ATI's apparent use of cheats in Quake III Arena benchmarks. Then we visited with ATI (and with NVIDIA) at Comdex. While we were there, ATI released a new driver for the Radeon 8500, so we had to go back to the drawing board with our testing.
That's all probably just as well, however, because the new drivers make this comparison much more interesting. Before ATI released the latest drivers for the Radeon 8500, this GeForce3 Ti 500-versus-Radeon 8500 comparison would have read like this:
Don't buy a Radeon 8500. Buy a GeForce3. End of story (only with more graphs). However, ATI's latest drivers take care of a great many of the problems (the Quake III "optimizations," Athlon XP incompatibilities, surprisingly low performance) that the Radeon 8500 brought with it when it first arrived on retail shelves. And once you get under that crusty old ATI veneer of lousy drivers and purposely vague public statements, the Radeon 8500 looks like a darned good graphics processor.
Good enough to take on NVIDIA's vaunted GeForce3 Titanium series? Just maybe. Keep reading to find out.
GeForce goes Titanium
The GeForce3 Titanium series cards are new, but not really novel. They're simply GeForce3 chips set to run at different core and memory clock speeds. Rather than use its traditional "Ultra" and "Pro" tags, NVIDIA chose the "Titanium" name for its fall product line this year. The new GeForce3 Ti 500 runs at a 240MHz core clock speed with 500MHz (DDR) memory, just a bit faster than the 200/460MHz of the original GeForce3. The Ti 200, meanwhile, runs at a 175MHz core speed with 400MHz memory, slower than the GeForce3, but then it's priced much lower, too. Beyond the clock speed changes, the Ti series of chips is essentially identical to the original GeForce3.
Not that there's anything wrong with that. In fact, the GeForce3 is still one of the most amazing graphics chips ever. If it weren't for the Radeon 8500, the GeForce3 would have no real rivals. If you aren't familiar with what makes the GF3 so special, go check out my review of the GeForce3 right now. It will bring you up to speed on the new approach to real-time graphics that NVIDIA pioneered with the GeForce3, including fancy-pants things like vertex shaders and pixel shaders. When you're done reading that article, you'll be much better equipped to follow this one.
The Ti cards hit the scene at the same time NVIDIA released its Detonator XP video drivers. These new drivers brought substantial performance gains, and they "turned on" a few features already present in the GeForce3 hardware but not yet implemented in driver software. Two of them stand out: support for 3D textures, and improved occlusion detection.
Once you've grasped the basic idea of a 3D texture, the mind-blowing stuff comes along: NVIDIA's implementation includes quad-linear filtering and up to 8:1 compression of 3D textures.
Poof! Mind blown.
NVIDIA has licensed its 3D texture compression scheme to Microsoft for use in DirectX, but this scheme remains unique to NVIDIA products in OpenGL.
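To make "quad-linear" concrete, here's a minimal sketch in Python of what the term implies. This is purely illustrative (the function names and the nested-list volume layout are mine, not anything from NVIDIA's hardware or drivers): trilinear filtering interpolates along the three axes within one mip level of a volume texture, and the fourth linear step blends between two adjacent mip levels.

```python
# Illustrative sketch of quad-linear filtering of a mipmapped 3D
# texture: trilinear interpolation within a mip level, plus a fourth
# linear blend between two mip levels. Not NVIDIA's implementation.

def lerp(a, b, t):
    return a + (b - a) * t

def trilinear_sample(volume, x, y, z):
    """Trilinear interpolation inside one mip level.
    `volume` is a nested list indexed as volume[z][y][x]."""
    x0, y0, z0 = int(x), int(y), int(z)
    x1 = min(x0 + 1, len(volume[0][0]) - 1)
    y1 = min(y0 + 1, len(volume[0]) - 1)
    z1 = min(z0 + 1, len(volume) - 1)
    fx, fy, fz = x - x0, y - y0, z - z0
    # Interpolate along x on four edges of the surrounding cell...
    c00 = lerp(volume[z0][y0][x0], volume[z0][y0][x1], fx)
    c10 = lerp(volume[z0][y1][x0], volume[z0][y1][x1], fx)
    c01 = lerp(volume[z1][y0][x0], volume[z1][y0][x1], fx)
    c11 = lerp(volume[z1][y1][x0], volume[z1][y1][x1], fx)
    # ...then along y, then along z: three linear steps in total.
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)

def quadlinear_sample(mip_fine, mip_coarse, x, y, z, mip_frac):
    """Fourth linear step: blend trilinear samples from two mip
    levels. Coarse-level coordinates are half the fine-level ones."""
    fine = trilinear_sample(mip_fine, x, y, z)
    coarse = trilinear_sample(mip_coarse, x / 2, y / 2, z / 2)
    return lerp(fine, coarse, mip_frac)
```

The cost is what makes this impressive in hardware: a single quad-linear sample touches sixteen texels (eight per mip level), twice the eight a trilinear 2D sample needs.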
Although its approach isn't as radical as the Kyro II's, the GeForce3 has the ability to determine, at least some of the time, when a pixel will be occluded, so the chip can avoid drawing unneeded pixels. The Detonator XP drivers improve the GeForce3's occlusion detection, boosting the chip's effective pixel-pushing power, or fill rate, substantially.
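The payoff of occlusion detection is easy to see in a toy software rasterizer. The sketch below (my own illustration, not the GeForce3's actual hardware logic) tests each fragment's depth before shading it, so fragments that would be hidden anyway never pay for the expensive per-pixel work:

```python
# Illustrative sketch of early depth rejection: occluded fragments
# are discarded before shading, saving the per-pixel work.

def draw_fragments(fragments, width, height, shade):
    """fragments: iterable of (x, y, depth, data) tuples, where a
    smaller depth is closer to the viewer. Returns the framebuffer
    and a count of how many fragments were actually shaded."""
    zbuffer = [float("inf")] * (width * height)
    framebuffer = [None] * (width * height)
    shaded = 0
    for x, y, depth, data in fragments:
        i = y * width + x
        # Early depth test: skip shading entirely if this fragment
        # is behind what's already in the depth buffer.
        if depth >= zbuffer[i]:
            continue
        zbuffer[i] = depth
        framebuffer[i] = shade(data)  # only visible fragments pay this
        shaded += 1
    return framebuffer, shaded
```

Note that how much work this saves depends on submission order: drawing roughly front-to-back lets the depth test reject the most fragments, which is why occlusion detection helps some scenes far more than others.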