A few quirks in the CS: Source beta
Now, about those rendering problems. First, no matter which card we tried, we'd see pixel shader corruption problems and skewed benchmark results if we didn't exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it's something to note.
Second, the GeForce FX cards hit a few bumps in the road in this beta version of CS: Source. The Source engine auto-detects recommended settings for one's video card, and on any card in the GeForce FX line, it practically insists on turning off water reflectivity. As a result, we've benchmarked the GeForce FX line without water reflectivity, and we've put an asterisk next to the name of each FX card in our results, to remind you of that fact. There's no great visual difference between the water's look on the FX line and on other cards, but if the menu settings mean anything, the FX cards are doing less work.
The GeForce FX line also won't do 4X antialiasing in this CS: Source beta. Instead, you get this message:
I'm not sure what the problem here is, but obviously Valve has classified it as a known bug with a beta version of the engine. I'm curious to find out whether this bug has anything to do with the centroid sampling problems on GeForce FX hardware. Whatever the case, we weren't able to benchmark the GeForce FX cards with antialiasing enabled.
DX8 versus DX9 illustrated
We should briefly address the issue of DirectX 8 versus DirectX 9, because Valve originally said last year that it might have to drop back to its DirectX 8 rendering path in order for GeForce FX cards to perform acceptably in Half-Life 2. We have no confirmation from Valve yet about what rendering path the CS: Source beta is using on GeForce FX cards, but I thought I should show you the difference between a DirectX 8-class card, a GeForce FX, and ATI's very latest DX9 card. The image output differences between DX8 and DX9 cards are pretty subtle. Have a look at the screenshots below, and you'll see an example where a difference is apparent.
The DX8-class GeForce4 Ti 4200 manages to render the glow effects reasonably well, but it has less internal color precision than the Radeon X800. You can see some harsher color transitions and some greenish banding at the edge of the light halos in the Ti 4200 screenshot. That is pretty much the sum of the difference between DX8 and DX9 rendering in the CS: Source beta: minor differences in color precision.
Interestingly enough, the GeForce FX 5950 Ultra renders the scene with enough precision that the banding apparent on the GeForce4 Ti 4200 is banished. The same holds true for the GeForce FX 5700 Ultra. Looks to me like the GeForce FX cards are using a DX9 rendering path of some sort.