NVIDIA's position has only gotten stronger since the TNT chip. The GeForce series of chips, extending up into the high end with the Ultra and Quadro lines and down into the value market with the MX line, has devastated the competition. Since the introduction of the first GeForce chip, S3 has spun away, Matrox has dug safely into its niche markets, and 3dfx finally conceded defeat, selling out to NVIDIA. ATI remains standing, but it's not clear whether the company is gathering its strength for a renewed attack or wobbling in wait for its knock-out punch; probably it's some combination of the two.
Now comes NVIDIA's GeForce3 chip, the company's most ambitious undertaking yet. It's also one of the most poorly named products in recent memory, because the GeForce3 name implies that this chip is a follow-on to the previous GeForce chips. In reality, GeForce3 is a completely new graphics chip that initiates a novel approach to real-time 3D graphics.
It was a risk for NVIDIA to design and produce this chip at this point in time. They could have concentrated on adding raw pixel-pushing power and refined the conventional approach to real-time 3D graphics. Doing so would have arguably delivered more performance on present-day games, and it would have avoided shaking up a market that NVIDIA more or less owned. The fact they didn't choose this approach is a credit to the company's leadership, at least, assuming GeForce3 manages to deliver on its promise. Let's take a look and see whether NVIDIA has pulled it off.
The foundation for a new beginning
The keys to the GeForce3's new approach to rendering are two of the chip's main functional units, dubbed vertex and pixel shaders. By now, it's likely you've read a fair amount about these things, and if you're like me, your eyes probably glaze over at the mention of a vertex shader. But I think these two things, vertex shaders and pixel shaders, are worth understanding well.
Besides, they are, at heart, easy concepts.
OK, not "circle is round" simple or "Rosie O'Donnell is annoying" intuitive, but they're easy enough to grasp, nonetheless. My first exposure to the GeForce3 came at Comdex last fall, when NVIDIA showed us early silicon with early drivers and a very early software demo. That demo evolved to become the best single demonstration of both vertex and pixel shaders in action together that I've seen yet.
But before we get to that, I'm gonna have to take a crack at explaining pixel and vertex shaders. If you can already feel your eyes glazing over, skip ahead to the next section. I won't hate you. Much.