A few quirks in the CS: Source beta
Now, about those rendering problems. First, no matter which card we tried, we'd see pixel shader corruption problems and skewed benchmark results if we didn't exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it's something to note.
Second, the GeForce FX cards hit a few bumps in the road in this beta version of CS: Source. The Source engine auto-detects recommended settings for one's video card, and on any card in the GeForce FX line, it practically insists on turning off water reflectivity. As a result, we've benchmarked the GeForce FX line without water reflectivity, and we've put an asterisk next to the name of each FX card in our results, to remind you of that fact. There's no great visual difference between the water's look on the FX line and on other cards, but if the menu settings mean anything, the FX cards are doing less work.
The GeForce FX line also won't do 4X antialiasing in this CS: Source beta. Instead, you get this message:
I'm not sure what the problem here is, but obviously Valve has classified it as a known bug with a beta version of the engine. I'm curious to find out whether this bug has anything to do with the centroid sampling problems on GeForce FX hardware. Whatever the case, we weren't able to benchmark the GeForce FX cards with antialiasing enabled.
DX8 versus DX9 illustrated
We should briefly address the issue of DirectX 8 versus DirectX 9, because Valve originally said last year that it might have to drop back to its DirectX 8 rendering path in order for GeForce FX cards to perform acceptably in Half-Life 2. We have no confirmation from Valve yet about what rendering path the CS: Source beta is using on GeForce FX cards, but I thought I should show you the difference between a DirectX 8-class card, a GeForce FX, and ATI's very latest DX9 card. The image output differences between DX8 and DX9 cards are pretty subtle. Have a look at the screenshots below, and you'll see an example where a difference is apparent.
The DX8-class GeForce4 Ti 4200 manages to render the glow effects reasonably well, but it has less internal color precision than the Radeon X800. You can see some harsher color transitions and some greenish banding at the edge of the light halos in the Ti 4200 screenshot. That is pretty much the sum of the difference between DX8 and DX9 rendering in the CS: Source beta: minor differences in color precision.
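A quick sketch can illustrate why lower internal precision shows up as banding in smooth gradients like these light halos. The bit depths and pass counts below are hypothetical stand-ins, not measurements of either card: the idea is simply that each shader pass re-quantizes its result, and at low per-channel precision the rounding error compounds until a smooth falloff collapses into a handful of visible steps.

```python
def quantize(value, bits):
    """Round an intensity in [0.0, 1.0] to the nearest step
    representable at the given per-channel bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A smooth falloff across a light halo, sampled at 256 points.
gradient = [i / 255 for i in range(256)]

def shades_after_passes(bits, passes=3):
    """Run the gradient through several blended passes, re-quantizing
    after each one, and count the distinct output shades that survive."""
    vals = gradient
    for _ in range(passes):
        # A simple scale-and-bias blend standing in for real shader math.
        vals = [quantize(v * 0.5 + 0.25, bits) for v in vals]
    return len(set(vals))

low = shades_after_passes(5)    # low-precision intermediate (hypothetical)
high = shades_after_passes(12)  # high-precision intermediate (hypothetical)
print(low, high)  # far fewer surviving shades at low precision = banding
```

The low-precision path ends up with only a few distinct shades where the high-precision path keeps the gradient essentially intact, which is the same mechanism behind the harsh color transitions visible in the Ti 4200 screenshot.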
Interestingly enough, the GeForce FX 5950 Ultra renders the scene with enough precision that the banding apparent on the GeForce4 Ti 4200 is banished. The same holds true for the GeForce FX 5700 Ultra. Looks to me like the GeForce FX cards are using a DX9 rendering path of some sort.