You'll recall that build 320 was the version of 3DMark for which NVIDIA and ATI optimized their drivers. (ATI has owned up to it, but NVIDIA hasn't so far as I'm aware.) Build 330 of 3DMark03 was written expressly to foil benchmark-specific optimizations and cheats.
Here are my results:
As you can see, ATI's optimizations, which the company claims didn't change image output, barely affect the overall score. NVIDIA's, however, make a substantial difference. The most striking difference between builds 320 and 330 is in the Game 4 test, where performance drops dramatically once the cheats and optimizations are disabled. This test, the "Mother Nature" scene, makes the most extensive use of DirectX 9 and pixel shader 2.0.
Those are the numbers. I don't have time here to dig into all of the related issues, but Dave from Beyond3D sent me a note about a couple of things worth checking out on his site. First, Dave has captured images from ATI and NVIDIA cards in 3DMark03 to show the image quality differences between builds 320 and 330. The NVIDIA drivers clearly have more impact on image quality. Next, to help you sort out what that fact means, have a look at Unreal guru Tim Sweeney's take on cheating versus optimization. The basic principle he outlines seems like a good guide in this case.