A few quirks in the CS: Source beta
Now, about those rendering problems. First, no matter which card we tried, we'd see pixel shader corruption problems and skewed benchmark results if we didn't exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it's something to note.
Second, the GeForce FX cards hit a few bumps in the road in this beta version of CS: Source. The Source engine auto-detects recommended settings for one's video card, and on any card in the GeForce FX line, it practically insists on turning off water reflectivity. As a result, we've benchmarked the GeForce FX line without water reflectivity, and we've put an asterisk next to the name of each FX card in our results, to remind you of that fact. There's no great visual difference between the water's look on the FX line and on other cards, but if the menu settings mean anything, the FX cards are doing less work.
The GeForce FX line also won't do 4X antialiasing in this CS: Source beta. Instead, you get this message:
I'm not sure what the problem here is, but obviously Valve has classified it as a known bug with a beta version of the engine. I'm curious to find out whether this bug has anything to do with the centroid sampling problems on GeForce FX hardware. Whatever the case, we weren't able to benchmark the GeForce FX cards with antialiasing enabled.
DX8 versus DX9 illustrated
We should briefly address the issue of DirectX 8 versus DirectX 9, because Valve originally said last year that it might have to drop back to its DirectX 8 rendering path in order for GeForce FX cards to perform acceptably in Half-Life 2. We have no confirmation from Valve yet about what rendering path the CS: Source beta is using on GeForce FX cards, but I thought I should show you the difference between a DirectX 8-class card, a GeForce FX, and ATI's very latest DX9 card. The image output differences between DX8 and DX9 cards are pretty subtle. Have a look at the screenshots below, and you'll see an example where a difference is apparent.
The DX8-class GeForce4 Ti 4200 manages to render the glow effects reasonably well, but it has less internal color precision than the Radeon X800. You can see some harsher color transitions and some greenish banding at the edge of the light halos in the Ti 4200 screenshot. That is pretty much the sum of the difference between DX8 and DX9 rendering in the CS: Source beta: minor differences in color precision.
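To illustrate the point about internal precision, here's a minimal sketch (not from Valve or the Source engine) of why coarser intermediate precision produces visible banding in smooth gradients like light halos. The function and parameters are hypothetical; the idea is simply that rounding shading math to fewer representable values leaves fewer distinct output shades, which the eye picks up as stepped bands.

```python
# Hypothetical sketch: a smooth brightness ramp quantized at two internal
# precisions, loosely analogous to DX8-era fixed-point shading versus
# DX9 floating-point shading.

def shade_ramp(steps, levels):
    """Compute a gradient, rounding each intermediate value to `levels` shades."""
    out = []
    for i in range(steps):
        value = i / (steps - 1)                           # ideal continuous intensity
        quantized = round(value * (levels - 1)) / (levels - 1)  # limited precision
        out.append(quantized)
    return out

low = shade_ramp(256, 32)     # coarse internal precision: visible steps
high = shade_ramp(256, 4096)  # fine internal precision: smooth ramp

print(len(set(low)), "distinct shades at low precision")    # 32
print(len(set(high)), "distinct shades at high precision")  # 256
```

With only 32 representable shades, neighboring pixels in a subtle halo jump between the same few values, which is exactly the kind of banding visible in the Ti 4200 screenshot.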
Interestingly enough, the GeForce FX 5950 Ultra renders the scene with enough precision that the banding apparent on the GeForce4 Ti 4200 is banished. The same holds true for the GeForce FX 5700 Ultra. Looks to me like the GeForce FX cards are using a DX9 rendering path of some sort.