
NVIDIA's David Kirk speaks

The chipmaker's Chief Scientist gives us the dirt on the current graphics scene
— 12:00 AM on December 19, 2001

DAVID KIRK HAS BEEN NVIDIA's Chief Scientist since 1997. During that time, he helped build the technology that has propelled NVIDIA to near-dominance in PC graphics. Prior to his tenure at NVIDIA, Dr. Kirk worked at Crystal Dynamics and Hewlett-Packard. He is also a published author, and he holds Mechanical Engineering degrees from MIT and Computer Science degrees, including a Ph.D., from the California Institute of Technology.

Recently, Dr. Kirk shared his views with us on a range of topics related to the current PC graphics landscape. His answers follow.

It is always our goal to provide continuing value and improved performance to our customers with new driver releases. I can't say what's coming, but you can always expect that each release will expose just a little bit more. This is further leveraged by our Unified Driver Architecture strategy. All of our drivers are compatible with all of our hardware, both backwards and forwards. This means that we can very easily expose more and more functionality over time, in a compatible way. I can confirm that there are more hardware features in the GeForce3 that have not yet been exposed in software. Stay tuned!

I believe that the combination of GeForce3, GeForce3 Ti500, and now, GeForce3 Ti200 in the mainstream creates a dynamite "virtual console" platform on the PC. So many gamers and enthusiasts will have graphics processors from the GeForce3 family that this becomes an excellent platform for game development. I expect to see a lot of games that offer exclusive or special support for GeForce3 features.

We choose our manufacturing and production clock rates to strike a balance between performance (as much as possible :) ) and stability. It is important to us that customers who buy our products get a consistent, high-quality experience. Because of how conservative we are, there is ample headroom for overclocking. While we do not advocate this behavior, it is certainly an opportunity for those who want to push the bleeding edge!

There are many techniques for doing shadows. GeForce3 supports not only shadow buffers, but also stencil shadows. Each technique has different benefits and limitations. Even given the choice of shadow buffers, there are multiple ways to achieve the goal. What shadow buffers provide is, simply, shadows. Games and rendered scenes look a lot more realistic when the characters and environments have realistic shadows. The shadow buffer implementation on GeForce3 provides high-quality, smooth-edged soft shadows and objects that can cast shadows on themselves. No other hardware provides equivalent quality.
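For readers unfamiliar with the technique Dr. Kirk is describing, here is a minimal CPU-side sketch of the general shadow-buffer idea, not NVIDIA's hardware implementation. The `ShadowMap` type, the `shadowFactor` function, and the 2x2 filter size are our own illustrative assumptions. The scene is first rendered from the light's point of view, keeping only depth; each visible fragment is then projected into the light's view and its depth compared against the stored value.

```cpp
// Sketch of a shadow-buffer test with a small percentage-closer filter.
// All names here are hypothetical; this is not GeForce3 driver code.
#include <algorithm>
#include <vector>

struct ShadowMap {
    int width, height;
    std::vector<float> depth;  // nearest-occluder depth per texel,
                               // as seen from the light
    float at(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return depth[y * width + x];
    }
};

// Returns a shadow factor in [0, 1]: 0 = fully shadowed, 1 = fully lit.
// (u, v) is the fragment's position projected into the light's image
// plane, and fragDepth is its distance from the light. Averaging the
// four neighboring binary comparisons (a 2x2 percentage-closer filter)
// is what turns hard, aliased shadow edges into smooth, soft ones.
float shadowFactor(const ShadowMap& sm, float u, float v,
                   float fragDepth, float bias = 0.002f) {
    int x0 = static_cast<int>(u * sm.width);
    int y0 = static_cast<int>(v * sm.height);
    float sum = 0.0f;
    for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx)
            // Lit if the fragment is no farther from the light than the
            // nearest occluder in the map. The small bias keeps surfaces
            // from shadowing themselves ("shadow acne"), which is also
            // how an object can correctly cast shadows onto itself.
            sum += (fragDepth - bias <= sm.at(x0 + dx, y0 + dy)) ? 1.0f : 0.0f;
    return sum / 4.0f;
}
```

Because the comparison happens per fragment against a depth image, self-shadowing falls out naturally, which is why Dr. Kirk highlights it as a strength of the shadow-buffer approach over stencil shadows.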

NVIDIA's approach has always been to be API-agnostic. By that, I mean that the hardware supports every feature equally well in both OpenGL and Direct3D. In some cases, there are features that our hardware supports that are not exposed in Direct3D, but that should be remedied over time. Usually, the hardware's full capability set is exposed under OpenGL, so sometimes OpenGL is ahead for a time. I keep hearing that these benchmarks are specifically written for our hardware, and this is just nonsense. It's almost certainly true that these benchmarks were developed ON our hardware; at the time, GeForce3 was the only DX8 hardware available to developers, and it's still all that most developers have. It's clear that a benchmark developed on a particular piece of hardware will run on that hardware!

In theory, yes. In practice, no. The DX8 vertex shader instruction set is identical to the hardware operations on the GeForce3. This happened because Microsoft licensed the technology from NVIDIA. Other implementations probably only approximate the instruction set; they don't actually implement it fully.
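For context, here is a sketch of what that instruction set looks like: a conventional vs.1.1 vertex shader of the era, embedded in a C++ string constant. The shader body and the constant-register layout (a 4x4 transform in c0 through c3) are illustrative assumptions on our part, not code from NVIDIA or Microsoft.

```cpp
// A hypothetical DX8 vertex shader in vs.1.1 assembly. Each line maps to
// one vertex-shader operation: dp4 is a four-component dot product, and
// the four dp4s together multiply the input position v0 by a 4x4 matrix
// held in constant registers c0-c3 (the usual world-view-projection
// transform). mov passes the vertex color through unchanged.
static const char* kExampleVertexShader =
    "vs.1.1            \n"  // shader model version
    "dp4 oPos.x, v0, c0\n"  // clip-space x = dot(position, matrix row 0)
    "dp4 oPos.y, v0, c1\n"  // clip-space y
    "dp4 oPos.z, v0, c2\n"  // clip-space z
    "dp4 oPos.w, v0, c3\n"  // clip-space w
    "mov oD0, v5       \n"; // pass the diffuse vertex color through
```

Dr. Kirk's claim, in these terms, is that each such instruction corresponds one-to-one with a native GeForce3 hardware operation, whereas other chips would have to translate or emulate parts of the set.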