NVIDIA's position has only gotten stronger since the TNT chip. The GeForce series of chips, extending up into the high end with the Ultra and Quadro lines and down into the value market with the MX line, has devastated the competition. Since the introduction of the first GeForce chip, S3 has spun away, Matrox has dug safely into its niche markets, and 3dfx finally conceded defeat, selling out to NVIDIA. ATI remains standing, but it's not clear whether the company is gathering its strength for a renewed attack or waiting wobbly-legged for the knock-out punch; probably it's some combination of the two.
Now comes NVIDIA's GeForce3 chip, the company's most ambitious undertaking yet. It's also one of the most poorly named products in recent memory: the GeForce3 name implies that this chip is a follow-on to the previous GeForce chips, but in reality, the GeForce3 is a completely new graphics chip that pioneers a novel approach to real-time 3D graphics.
It was a risk for NVIDIA to design and produce this chip at this point in time. The company could have concentrated on adding raw pixel-pushing power and refining the conventional approach to real-time 3D graphics. Doing so would arguably have delivered more performance in present-day games, and it would have avoided shaking up a market that NVIDIA more or less owned. The fact that they didn't choose this approach is a credit to the company's leadership; at least, it is assuming the GeForce3 manages to deliver on its promise. Let's take a look and see whether NVIDIA has pulled it off.
The foundation for a new beginning
The keys to the GeForce3's new approach to rendering are two of the chip's main functional units, dubbed vertex and pixel shaders. By now, it's likely you've read a fair amount about these things, and if you're like me, your eyes probably glaze over at the mention of a vertex shader. But I think these two things, vertex shaders and pixel shaders, are worth understanding well.
Besides, they are, at heart, easy concepts.
OK, not "circle is round" simple or "Rosie O'Donnell is annoying" intuitive, but they're easy enough to grasp, nonetheless. My first exposure to the GeForce3 came at Comdex last fall, when NVIDIA showed us early silicon with early drivers and a very early software demo. That demo evolved to become the best single demonstration of both vertex and pixel shaders in action together that I've seen yet.
But before we get to that, I'm gonna have to take a crack at explaining pixel and vertex shaders. If you can already feel your eyes glazing over, skip ahead to the next section. I won't hate you. Much.
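To set the stage, the core idea behind a vertex shader can be sketched in a few lines of ordinary code. This is a conceptual illustration only, not how GeForce3 shaders are actually written (those use DirectX 8's vertex shader assembly language); the function names and the toy lighting math here are my own invention. The point is simply that a vertex shader is a small program the chip runs once for every vertex, replacing the old fixed-function transform-and-lighting stage with something programmable.

```python
# Conceptual sketch only: a vertex shader is a small program run once per
# vertex. Names here (vertex_shader, mvp, light_dir) are illustrative,
# not part of any real API.

def vertex_shader(position, normal, mvp, light_dir):
    """Transform one vertex into clip space and compute a simple
    per-vertex brightness (Lambertian-style lighting)."""
    # Transform the object-space position by a 4x4 matrix (vec4 * mat4).
    clip = [sum(mvp[r][c] * position[c] for c in range(4)) for r in range(4)]
    # Brightness: dot(normal, light direction), clamped at zero.
    brightness = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return clip, brightness

# The hardware applies the same program independently to every vertex:
mvp = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # identity, for illustration
vertices = [([0, 0, 0, 1], [0, 0, 1]), ([1, 0, 0, 1], [0, 0, 1])]
for pos, nrm in vertices:
    clip, lit = vertex_shader(pos, nrm, mvp, [0, 0, 1])
```

Because the per-vertex program is now something the developer writes rather than a fixed circuit, effects that used to require CPU work, like custom deformation or unusual lighting models, can run on the graphics chip itself.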