GeForce FX in Source engine investigated

I had an interesting phone conversation with Brandon Bell over at the FiringSquad after the publication of our CS: Source video stress test article. We were puzzling through the GeForce FX cards’ unexpectedly solid performance and whether the Source engine was using DirectX 9 with them. I added some screenshots to our CS: Source article showing the difference between DX8, the FX, and DX9 in an attempt to figure things out, with some success. I learned the FX cards aren’t doing straight-up DX8.0, at least. Brandon went the extra mile, though, testing the various rendering paths on FX cards. The results aren’t entirely conclusive, but he surmises that the FX cards are using DirectX 8.1 and can’t be coaxed into DX9. The screenshots and benchmarks would seem to support that conclusion.

Comments closed
    • vortigern_red
    • 17 years ago

    You misunderstood me.

    (AFAIK) PS2.0 has a lot of optional parts, such as MRT, that the FX range does not support. If Source makes use of any of these optional parts of PS2.0, then the FX range will not be able to run the PS2.0 path.

    It should be noted that NV did say the FX range would support some of these additional render targets at launch, but never exposed them in the drivers, and more recently have said they are unlikely to expose them at all. So it is not unreasonable for the developer of an engine that has been in development for some years to assume that by the time his/her game ships the FX range would support these render targets, because NV told him/her that they would.

    If you run shadermark on an FX many tests will not complete, they never have completed and will probably never complete. If the source engine is using a similar technique for some shaders on the “PS2.0” path then the FX range is unable to run it.

    The 6800 series, however, is PS3.0, and that spec has far fewer optional parts than PS2.0, so this problem should be less common in the future (assuming both IHVs support the same shader model!). Many of the optional parts of SM2.0 are core SM3.0 parts, so the 6800 should support a lot more of PS2.0 than the FX range.

    To sum up: all the above is just me trying to remember things from some time ago, and so may be inaccurate, but this is the general point WRT the following type of comment:

    “My FX is a DX9/SM2.0 card so it can run the DX9 path in source”

    Many parts of DX9/SM2.0 are optional, so just because your card meets the base spec does not mean it will run every shader; to compound this, NV told everyone their cards would support features they have never exposed in the drivers.

    This may well not be the case with Source (it might be the performance of the PS2.0 path on the FX), but don’t assume that because your card is DX9 it will support every feature of that spec.

    • Entroper
    • 17 years ago

    The FiringSquad article describes a console variable that sets the shader model used. Vortigern makes a good point, though; it didn’t occur to me that a driver bug could make the SM2.0 path render incorrectly. The hardware should definitely be capable, though.
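For reference, Source engine games expose this choice through a launch option and a console variable. The names below are the ones the shipping engine uses as best I recall; treat the exact values as assumptions.

```
# Force a rendering path via the game's launch options:
#   -dxlevel 81   forces the DirectX 8.1 path
#   -dxlevel 90   forces the DirectX 9 path
hl2.exe -dxlevel 81

# The current level can also be inspected from the in-game
# developer console via the mat_dxlevel variable.
```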

    • Myrmecophagavir
    • 17 years ago

    But are those options explicitly exposed in the game’s options? I mean, if it goes so far as to have an option to enable 2.x shaders, then those should be used. But if it just vaguely talks about “quality,” then who knows what you’ll really get… is that documented in the readme anywhere?

    • Supreme Dalek
    • 17 years ago

    Hear, hear…

    I have had mine for 18 months. That’s the longest I’ve held a single 3D card since I first ponied up a wad of cash for the Orchid Righteous 3D back in ’97.

    The 9700 pro is even better now since I did that shim thing… 350G/320M

    • vortigern_red
    • 17 years ago

    How do you know that the FX is capable of running the shader2.0 code used in the source engine? Try running shadermark and see if an FX completes all tests.

    • Entroper
    • 17 years ago

    The point is that there are options to choose your rendering path, and they seem to be broken for the FX. The FX cards are capable of running SM2.0 code, so it should be allowed, even if it performs poorly in comparison.

    • Myrmecophagavir
    • 17 years ago

    Please get the terminology right – I find it hard to believe that they’re actually using DX8.1 interfaces just for one card lineup. Surely we’re talking shader version paths here?

    And why is this a surprise anyway – Valve made that presentation ages ago where they explained that in order to get good performance from some FX cards they had to run 1.x shaders instead of 2.x… perhaps they simply decided to do this on all FXs?

    • ExpansionSSS
    • 17 years ago

    I’ve said it once, and I’ll say it again….

    Still got a 9700 pro, still loving it.

    • DreadCthulhu
    • 17 years ago

    Hey, I just got a GeForce FX 5900 to replace my Ti 4600 that died. Of course, I got it for $120 off eBay; there is no way I would have paid retail price for it.

    • hmmm
    • 17 years ago

    True.

    • DukenukemX
    • 17 years ago

    The most ATI is making Valve do is forcing DX8.1, and/or not mixing DX8.1 and DX9 shaders, for FX cards.

    Geforce 6800 cards seem to be just fine or even faster than ATI.

    • Convert
    • 17 years ago

    I hope they’re spending all this time investigating ATI as well, and the game makers (like they did with the Doom 3 shaders; not that those people were right, but it’s good that people are looking).

    • DukenukemX
    • 17 years ago

    This shouldn’t have been a shock to anyone. Most games that make use of DX pixel shaders haven’t performed well for FX cards.

    This includes Far Cry, Deus Ex 2, and Thief 3. Even ShaderMark 2.0.

    http://www.xbitlabs.com/articles/video/display/ati-nvidia-roundup_12.html
    https://techreport.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=11
    http://www.digital-daily.com/video/msi-fx5700ultra/index06.htm

    If you based your FX performance expectations on 3DMark03 and Halo, this could come as a shock.

    • Corrado
    • 17 years ago

    No, not really. I’ve had three 5900XTs and they have all been wonderful cards. It’s what’s in my machine right now. I’ve also had 9600s, 9800s, 9700s, GF4s, GF3s, GF2s, etc., etc. I’ve had LOTS AND LOTS of video cards and I can honestly say that none of them has disappointed me. I had an 8500 on launch day, and it ran fine, no driver issues. I had a Parhelia; I loved it. Maybe I’m just not picky, but I’m certainly not an nVidiot, and I certainly know better.

    • Corrado
    • 17 years ago

    Even Voodoo2’s were $500 when launched… and to get top performance, you needed TWO of them.

    • BabelHuber
    • 17 years ago

    “2 years ago, who would’ve believed you’d have to spend 200 dollars minimum for a decent upper-midrange card, and 500 to 600 for ‘the best’? And only to have the best bested in 4 or more months?”

    I’m sorry, but this is just not the case. The original GeForce 256 DDR was very expensive in December 1999, too. And then in May 2000, various GF2 flavors came to market. The GF2 GTS was expensive, and even more so the Ultra (~$600 upon release, IIRC). In 2001 the GF3 appeared, for $500. And so on and so on. It has always been this way; get over it.

    • --k
    • 17 years ago

    Nvidia and ATI were in talks about doing a yearly refresh instead of a six-month one.

    • random gerbil
    • 17 years ago

    Well said.

    • kvndoom
    • 17 years ago

    It does matter, since some people paid 300 or more dollars for the high-end cards and aren’t getting what they paid for.

    The question is, did Valve code HL2 this way, or is it another Quake.exe/Quack.exe driver fiasco? The video card market is getting too big for its pants, if you ask me. 2 years ago, who would’ve believed you’d have to spend 200 dollars minimum for a decent upper-midrange card, and 500 to 600 for “the best”? And only to have the best bested in 4 or more months? It’s becoming too cutthroat and shady, and the consumers are having to pay for it.

    • DukenukemX
    • 17 years ago

    Does it matter anymore? The only GeForce products worth buying are the 6800s. The GeForce FX was never considered a good product once the Dustbuster was released. Even the GeForce 4 Tis were a better product.

    Only Nvidiots or people who don’t know better bought FX cards.

    • GodsMadClown
    • 17 years ago

    Oh god. Here come the Nvidia driver cheating conspiracy theory-type rants again. I hope somebody makes real news soon. August is a slow month.
