NVIDIA's position has only gotten stronger since the TNT chip. The GeForce series of chips, extending up into the high end with the Ultra and Quadro lines and down into the value market with the MX line, has devastated the competition. Since the introduction of the first GeForce chip, S3 has spun away, Matrox has dug safely into its niche markets, and 3dfx finally conceded defeat, selling out to NVIDIA. ATI remains standing, but it's not clear whether the company is gathering its strength for a renewed attack or waiting unsteadily for the knockout punch; probably some combination of the two.
Now comes NVIDIA's GeForce3 chip, the company's most ambitious undertaking yet. It's also one of the most poorly named products in recent memory, because the GeForce3 name implies that this chip is a follow-on to the previous GeForce chips. In reality, GeForce3 is a completely new graphics chip that initiates a novel approach to real-time 3D graphics.
It was a risk for NVIDIA to design and produce this chip at this point in time. The company could have concentrated on adding raw pixel-pushing power and refined the conventional approach to real-time 3D graphics. Doing so would have arguably delivered more performance in present-day games, and it would have avoided shaking up a market that NVIDIA more or less owned. The fact that they didn't choose this approach is a credit to the company's leadership, at least assuming GeForce3 manages to deliver on its promise. Let's take a look and see whether NVIDIA has pulled it off.
The foundation for a new beginning
The keys to the GeForce3's new approach to rendering are two of the chip's main functional units, dubbed vertex and pixel shaders. By now, it's likely you've read a fair amount about these things, and if you're like me, your eyes probably glaze over at the mention of a vertex shader. But I think these two things, vertex shaders and pixel shaders, are worth understanding well.
Besides, they are, at heart, easy concepts.
OK, not "circle is round" simple or "Rosie O'Donnell is annoying" intuitive, but they're easy enough to grasp, nonetheless. My first exposure to the GeForce3 came at Comdex last fall, when NVIDIA showed us early silicon with early drivers and a very early software demo. That demo evolved to become the best single demonstration of both vertex and pixel shaders in action together that I've seen yet.
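If it helps to see the idea stripped to its core: a vertex shader is just a small program the chip runs once for every vertex in a scene, and a pixel shader is a small program it runs once for every pixel being drawn. The sketch below illustrates that idea only; the function names and the simplified math are mine, not the GeForce3's actual instruction set, and a real shader works with 4x4 matrices and texture samples rather than these toy inputs.

```python
# Conceptual sketch: shaders as tiny per-vertex and per-pixel programs.
# All names and math here are illustrative, not real GPU code.

def vertex_shader(position, transform):
    """Run once per vertex: map a model-space point through a transform.
    Real hardware uses 4x4 matrices; a 3x3 keeps the idea visible."""
    return tuple(
        sum(transform[row][col] * coord
            for col, coord in enumerate(position))
        for row in range(3)
    )

def pixel_shader(base_color, light_intensity):
    """Run once per pixel: combine inputs into a final RGB color,
    clamped to the usual 0-255 range."""
    return tuple(min(255, int(c * light_intensity)) for c in base_color)

# A vertex passed through the identity transform comes out unchanged;
# a pixel lit at half intensity comes out half as bright.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(vertex_shader((1.0, 2.0, 3.0), identity))  # (1.0, 2.0, 3.0)
print(pixel_shader((200, 100, 50), 0.5))         # (100, 50, 25)
```

The point of making these stages programmable, rather than fixed-function, is that developers get to swap in their own little programs at each step, which is what the rest of this article is about.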
But before we get to that, I'm gonna have to take a crack at explaining pixel and vertex shaders. If you can already feel your eyes glazing over, skip ahead to the next section. I won't hate you. Much.