"The GeForce III (NV20) will be officially presented very soon at Comdex. The T&L engine is greatly improved and will be fully DX8-compliant. 0.15 micron, 6 pipelines with 3 texture units each. Core frequency: 300 MHz. Memory: 200 MHz 256-bit DDR RAM. Fillrate: 1.8 Gpixels/s and 5.4 Gtexels/s. Bandwidth: 12.8 GB/s. The expected performance is truly fantastic: 1600x1200x32 at 100 fps in Quake III on a 1 GHz P-III. There's just one problem: the price (US$800?)."

I ripped this from a post at iXBT Labs' forums. I didn't initially think this info was worthy of mention, because some of it seemed bogus, but I'll give you my impressions.
First, the skepticism: The $800 price point is just plain crazy. I suppose NVIDIA could start raising prices if the competition can't keep up, but $800 is a lot of money in PCs these days. A lot. Maybe the pin count for a 256-bit memory interface makes it sound a little more sane, but it's still way too expensive. Also, some of the numbers, like the Quake III frame rate estimate and the "bandwidth" figure, sound like fabrications. No doubt the performance estimate could be right, though.
As for the six pipelines with three texture units each, this bit of the spec seems plausible, as does the "greatly improved," DirectX 8-compliant T&L engine. The clock frequencies sound about right, and the clock-to-pipeline-to-fillrate math works out. The 0.15-micron feature size seems possible in light of the process improvements that produced the GeForce2 Pro and Ultra. The pixel and texel fillrate numbers (especially the texel figure) are theoretical fantasies, but so are the theoretical peak numbers for the current NVIDIA chips.
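If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch using only the numbers claimed in the rumor (none of these are confirmed specs, and the DDR double-pumping assumption is mine):

```python
# Sanity-checking the rumored NV20 figures. All inputs come from the
# rumor itself; nothing here is an official spec.

core_mhz = 300            # rumored core clock
pipelines = 6             # rumored pixel pipelines
tex_units_per_pipe = 3    # rumored texture units per pipeline

# Peak pixel fillrate = core clock * pipelines
pixel_fillrate = core_mhz * 1e6 * pipelines            # pixels/s
# Peak texel fillrate = pixel fillrate * texture units per pipe
texel_fillrate = pixel_fillrate * tex_units_per_pipe   # texels/s

mem_mhz = 200             # rumored memory clock
bus_bits = 256            # rumored bus width
ddr_factor = 2            # DDR transfers twice per clock (my assumption)
# Peak bandwidth = memory clock * 2 (DDR) * bus width in bytes
bandwidth = mem_mhz * 1e6 * ddr_factor * (bus_bits / 8)  # bytes/s

print(pixel_fillrate / 1e9)  # 1.8 Gpixels/s
print(texel_fillrate / 1e9)  # 5.4 Gtexels/s
print(bandwidth / 1e9)       # 12.8 GB/s
```

All three rumored numbers fall straight out of the clock and pipeline figures, which is one reason the spec sheet hangs together internally, whether or not it's real.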
So maybe they've got the basics right.
But the absence of any mention of Z compression, hardware clipping, or occlusion-detection circuitry makes me wonder about this particular set of specs. Maybe we're only getting part of the story, but I kinda think these specs are just someone's guess. Not a bad guess, though.