As we've mentioned in another thread, rumor has it the GF4 MX is coming. That card is probably based on the NV17 chip, which is a hopped-up GeForce2 MX with some new bits tacked on. It will perform well in current games, but it lacks vertex and pixel shaders.
Now, just as the NV17 steals logic units from the GeForce2 MX and the GeForce3, the GeForce4 (not the MX) will probably steal the RAMDACs, TMDS transmitter, TV encoder, and DVD decoder units from the NV17. It will probably contain the same crossbar memory controller used in the GeForce3 and NV17 (among others). So this will be NVIDIA's first dual-display high-end card.
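For those wondering why the crossbar controller matters: instead of one wide memory bus that moves a full bus-width chunk for every access, the GeForce3-style design splits the bus into several narrower, independent subcontrollers, so small accesses waste less bandwidth. A toy sketch of that idea (the widths and request sizes here are illustrative, not NVIDIA's actual design parameters):

```python
# Hedged sketch: why a crossbar memory controller helps.
# A channel moves data in whole channel-width chunks, so small
# requests on a wide monolithic bus drag useless bytes along.

def bytes_transferred(request_bytes, channel_width_bytes):
    """Bytes actually moved to service a request on one channel."""
    chunks = -(-request_bytes // channel_width_bytes)  # ceiling division
    return chunks * channel_width_bytes

# Monolithic 128-bit (16-byte) controller: an 8-byte request still
# moves a full 16 bytes.
mono = bytes_transferred(8, 16)

# Crossbar of four independent 32-bit (4-byte) subcontrollers: the
# same request occupies one channel for two 4-byte chunks, and the
# other three channels stay free to service other requests.
xbar = bytes_transferred(8, 4)

print(mono, xbar)  # → 16 8
```

Same peak bandwidth either way; the crossbar just wastes less of it on the small, scattered accesses real rendering generates.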
It seems likely the GF4 will gain a second vertex shader to bring it up to snuff with the Xbox GPU and Radeon 8500. It's possible the vertex unit will be tweaked, but this might just be two copies of the GF3 vertex shader, much like the XGPU.
As for pixel shaders, one hopes NVIDIA will improve these to match the Radeon 8500's. I expect to see dependent texture reads finally working right. Quite possibly, NVIDIA could even surpass ATI in the pixel shader department. However, that's no foregone conclusion given how far ahead the 8500's pixel shader 1.4 spec already is.
So far, then, all of that only brings the GF4 up to par with the 8500.
Also expect higher fill rates and memory clock speeds on the high-end card, although that's hardly thrilling anymore. In the best of all worlds, NVIDIA might introduce some nifty occlusion detection techniques and really boost effective fill rates. I'd guess there will be some incremental improvements along those lines, but nothing revolutionary.
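To see why occlusion detection boosts *effective* fill rate rather than raw fill rate: if the chip can reject a pixel that fails the depth test before running the pixel pipeline on it, overdrawn pixels cost almost nothing. A toy sketch of that logic (real hardware does this hierarchically over tiles of pixels, not one pixel at a time):

```python
# Hedged sketch of depth-test-before-shade occlusion rejection.
# Each "triangle" is (depth, color) covering one whole scanline.

def render(triangles, width=4):
    zbuf = [float("inf")] * width   # depth buffer, far = infinity
    cbuf = [None] * width           # color buffer
    shaded = 0                      # pixels that actually ran the shader
    for depth, color in triangles:
        for x in range(width):
            if depth < zbuf[x]:     # occlusion test BEFORE shading
                zbuf[x] = depth
                cbuf[x] = color     # only now pay the shading cost
                shaded += 1
    return cbuf, shaded

# Front-to-back order: the far triangle is fully occluded,
# so it never shades a single pixel.
_, n = render([(1.0, "near"), (2.0, "far")])
print(n)  # → 4
```

Two full-coverage triangles, but only one triangle's worth of shading work: that gap between pixels touched and pixels paid for is the "effective fill rate" win.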
Surely improved AA routines are coming. The 8500's Smoothvision is the best thing on the market, and we can't have that, can we? Maybe we'll see a multisampled AA routine with randomized (or partially randomized) sampling. I'd expect a fast, T-buffer-esque blending setup between multiple buffers--again, much like the Radeon 8500.
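The basic math behind that kind of multisampled, multi-buffer blend is simple: take several sub-pixel samples per pixel (jittered or randomized positions), and the blended result is just the fraction of samples covered by the polygon. A toy 1-D sketch; the sample offsets here are made up for illustration, not any shipping sample pattern:

```python
# Hedged sketch of jittered multisample coverage along one scanline.
# A polygon covers everything left of edge_x; each pixel takes several
# jittered samples and averages them, T-buffer style.

def coverage(edge_x, pixel_x, offsets):
    """Fraction of a pixel's samples that land inside the polygon."""
    inside = sum(1 for dx in offsets if pixel_x + dx < edge_x)
    return inside / len(offsets)

offsets = [0.125, 0.375, 0.625, 0.875]  # 4 jittered sample positions

# Polygon edge at x = 2.5: pixel 2 straddles the edge.
print([coverage(2.5, px, offsets) for px in range(4)])
# → [1.0, 1.0, 0.5, 0.0]
```

The straddling pixel blends to 50% coverage instead of snapping fully on or off, which is exactly the smoothed edge AA is after; randomizing the offsets trades the regular stair-step artifacts for less objectionable noise.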
If NVIDIA isn't just playing catch-up here, we may see some interesting new features. I'm betting on seeing floating-point pixel formats, which would make pixel shaders into freaking powerhouses. Can't wait to see it.
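Why would FP pixel formats be such a big deal? Today's 8-bit-per-channel buffers clamp every intermediate result to [0, 1] and quantize it to 1/255 steps, so multi-pass shader math keeps throwing away range and precision. A toy comparison of the two storage formats:

```python
# Hedged illustration of 8-bit fixed-point vs. floating-point pixel
# storage. The scenario (an overbright value darkened in a later pass)
# is a made-up example, not any specific game's rendering path.

def store_8bit(value):
    """Round-trip a color value through an 8-bit fixed-point channel."""
    clamped = min(max(value, 0.0), 1.0)   # clamp to [0, 1]
    return round(clamped * 255) / 255     # quantize to 1/255 steps

bright = 2.5                       # an overbright intermediate result

print(store_8bit(bright))          # → 1.0   (clamped: the range is gone)
print(store_8bit(bright) * 0.25)   # → 0.25  (a later pass can't recover it)
print(bright * 0.25)               # → 0.625 (an FP buffer keeps the detail)
```

With FP intermediates, values above 1.0 and tiny fractions both survive between passes, which is what would let chained pixel shader operations behave like real arithmetic instead of repeatedly squashed approximations.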
I'd also kind of expect some pixel compression techniques to come into play if FP pixels really do happen. Wider pixels eat memory bandwidth, so squeezing them only seems natural.
But certainly FP pixels will require software rewrites before they ever get used.
_________________
Scott Wasson
"Damage"
The Tech Report
http://tech-report.com
[ This Message was edited by: Damage on 2002-01-23 00:10 ]