A preview of Half-Life 2 performance?

AT ABOUT 10PM CENTRAL TIME on Wednesday night, a beta version of Counter-Strike based on the new Source game engine became available to those holding an ATI Half-Life 2 voucher and a subscription to Valve’s Steam content delivery system. Counter-Strike: Source is basically just a port of the mega-popular team-based shooter to Half-Life 2’s Source engine. The beta version of CS: Source, though, includes another feature that’s very interesting: a video card benchmark. This benchmark, which Valve has dubbed the “video stress test,” uses the same test level that we saw in our early Half-Life 2 benchmarks nearly a year ago. Of course, this time around, Half-Life 2 is very close to release—rumors abound about the game going gold very shortly.

Based on everything we know, we can only conclude that the CS: Source video stress test is essentially a Half-Life 2 benchmark that’s available to the public right now. Naturally, that piques our curiosity, especially since last time around, the ATI cards were absolutely trouncing the NVIDIA cards in HL2 benchmarks. There was only one thing to do: we rounded up thirteen different DirectX 9-class video cards for a Source engine benchmarking bonanza.

Has ATI maintained its monstrous lead in Half-Life 2 performance over NVIDIA, or have the events of the past year allowed NVIDIA to catch up? Read on to find out.

The Source engine video stress test
The Source engine video stress test included with the CS: Source beta isn’t a real in-game benchmark. It doesn’t use a real Half-Life 2 level, doesn’t test game physics, and doesn’t play back sounds. It is, however, a pretty darned good video card benchmark, because it incorporates a whole range of pixel shader effects, sometimes layering them on top of one another, to produce lots of eye candy. The video stress test is also—lo and behold—very much a stress test; it seems to throw a series of worst-case scenarios at the graphics card to see how it fares. In other words, if a graphics card can make it through the video stress test without choking, I’d expect it to hold up its end of the bargain in Half-Life 2 as well.

To give you some idea what the stress test does, let’s have a look at a few screenshots. The first one is from the opening stage of the stress test, where multiple translucency effects are layered on top of one another. Note, also, the reflective and refractive water below. This scene is packed with DX9 pixel shader effects.
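To make “layering translucency effects” concrete: each translucent surface applies a standard “over” blend to whatever is already in the frame buffer, so a stack of surfaces means repeated read-modify-write passes per pixel, each with its own pixel shader cost. Here is a minimal C++ sketch of that blend, purely my own illustration of the general technique rather than anything from Valve’s code:

```cpp
// "Over" alpha blending: the operation each translucent layer applies to
// the color already in the frame buffer. Illustration only, not Valve's code.
struct Color { float r, g, b; };

Color blendOver(Color src, float alpha, Color dst) {
    // dst' = src * alpha + dst * (1 - alpha)
    return { src.r * alpha + dst.r * (1.0f - alpha),
             src.g * alpha + dst.g * (1.0f - alpha),
             src.b * alpha + dst.b * (1.0f - alpha) };
}

// With, say, four overlapping translucent surfaces, this blend (and the
// shader work feeding it) runs four times for every covered pixel.
```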

Next up is a room illuminated by a fire effect. On the pedestal in the middle of the room, you can just make out a translucent player character, though it is tough to pick out in this shot. Also note the walls, which are covered with very detailed bump or normal maps. The low resolution of this screenshot doesn’t do them justice; the textures look exquisitely detailed at 1600×1200.

Finally, we have a room with a series of virtual TV sets, displaying images from the previous test room through the magic of portal rendering (or, uhm, render to texture). Again, the floor is covered with water, and suspended above the water is a thick, blue volumetric fog.
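For those unfamiliar with the technique, render-to-texture means pointing the device at an off-screen surface, drawing the scene from the TV camera’s viewpoint, and then sampling the result as an ordinary texture in the main pass. A rough Direct3D 9 sketch follows; the names, the 512×512 size, and the omitted error handling are my own assumptions, and this illustrates the general approach rather than the Source engine’s actual code:

```cpp
// Hypothetical render-to-texture setup in Direct3D 9 (illustration only).
// Assumes an initialized IDirect3DDevice9* named "device".
IDirect3DTexture9* tvTexture = nullptr;
device->CreateTexture(512, 512, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &tvTexture, nullptr);

IDirect3DSurface9* tvSurface = nullptr;
tvTexture->GetSurfaceLevel(0, &tvSurface);

// Pass 1: render the previous test room into the off-screen texture.
IDirect3DSurface9* backBuffer = nullptr;
device->GetRenderTarget(0, &backBuffer);
device->SetRenderTarget(0, tvSurface);
device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0, 1.0f, 0);
// ... draw the source room from the TV camera's point of view ...

// Pass 2: switch back to the real back buffer and draw the TV room,
// sampling the just-rendered texture on each TV screen quad.
device->SetRenderTarget(0, backBuffer);
device->SetTexture(0, tvTexture);
// ... draw the TV room geometry ...
```

The cost is apparent from the sketch: every TV screen adds what amounts to a second scene render each frame, which is part of what makes this room a good stress test.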

All of these scenes rendered perfectly on all of the video cards we tested, with a couple of minor exceptions that I’ll describe shortly. Overall, the Source engine’s visuals are much higher quality than those of current games, and they don’t seem to vary widely from card to card, much like we’ve seen in DOOM 3. We have refrained from providing extensive screenshot comparisons between cards because of some limitations in the CS: Source beta, but to the naked eye, there’s little difference between ATI and NVIDIA in terms of image quality. Let’s talk about the differences we were able to spot…

 

A few quirks in the CS: Source beta
Now, about those rendering problems. First, no matter which card we tried, we’d see pixel shader corruption problems and skewed benchmark results if we didn’t exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it’s something to note.

Second, the GeForce FX cards hit a few bumps in the road in this beta version of CS: Source. The Source engine auto-detects recommended settings for one’s video card, and on any card in the GeForce FX line, it practically insists on turning off water reflectivity. As a result, we’ve benchmarked the GeForce FX line without water reflectivity, and we’ve put an asterisk next to the name of each FX card in our results, to remind you of that fact. There’s no great visual difference between the water’s look on the FX line and on other cards, but if the menu settings mean anything, the FX cards are doing less work.


Water on the X800 with world reflectivity enabled


Water on the GeForce FX 5950 Ultra without reflectivity

The GeForce FX line also won’t do 4X antialiasing in this CS: Source beta. Instead, you get this message:

I’m not sure what the problem is here, but Valve has apparently classified it as a known bug in this beta version of the engine. I’m curious to find out whether this bug has anything to do with the centroid sampling problems on GeForce FX hardware. Whatever the case, we weren’t able to benchmark the GeForce FX cards with antialiasing enabled.

DX8 versus DX9 illustrated
We should briefly address the issue of DirectX 8 versus DirectX 9, because Valve originally said last year that it might have to drop back to its DirectX 8 rendering path in order for GeForce FX cards to perform acceptably in Half-Life 2. We have no confirmation from Valve yet about what rendering path the CS: Source beta is using on GeForce FX cards, but I thought I should show you the difference between a DirectX 8-class card, a GeForce FX, and ATI’s very latest DX9 card. The image output differences between DX8 and DX9 cards are pretty subtle. Have a look at the screenshots below, and you’ll see an example where a difference is apparent.


GeForce4 Ti 4200 (Click for lossless PNG version)


Radeon X800 Pro (Click for lossless PNG version)


GeForce FX 5700 Ultra (Click for lossless PNG version)


GeForce FX 5950 Ultra (Click for lossless PNG version)

The DX8-class GeForce4 Ti 4200 manages to render the glow effects reasonably well, but it has less internal color precision than the Radeon X800. You can see some harsher color transitions and some greenish banding at the edges of the light halos in the Ti 4200 screenshot. That is pretty much the sum of the difference between DX8 and DX9 rendering in the CS: Source beta—minor differences in color precision.
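If you are wondering where the banding comes from: lower-precision hardware snaps intermediate results to a coarse grid of representable values, so when several blend passes each round off a little, neighboring pixels collapse onto the same step. Here is a small standalone C++ sketch that simulates the effect; the 8-bit step size and the layer weights are assumptions chosen for illustration, not measurements of any actual card:

```cpp
#include <cmath>
#include <cstdio>

// Snap a color value to n-bit fixed point, roughly what a DX8-class part
// does to intermediate results (the real precision varies by chip).
float quantize(float x, int bits) {
    const float levels = float((1 << bits) - 1);
    return std::round(x * levels) / levels;
}

int main() {
    // Layer four faint glow passes. The float path keeps full precision
    // between passes; the fixed-point path rounds after every blend, so
    // the error compounds -- on screen, that shows up as visible bands
    // where smooth gradients should be.
    float fullPrecision = 0.0f, fixedPoint = 0.0f;
    for (int pass = 0; pass < 4; ++pass) {
        fullPrecision += 0.0617f;
        fixedPoint = quantize(fixedPoint + 0.0617f, 8);
    }
    std::printf("float path: %f   8-bit path: %f\n", fullPrecision, fixedPoint);
    return 0;
}
```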

Interestingly enough, the GeForce FX 5950 Ultra renders the scene with enough precision that the banding apparent on the GeForce4 Ti 4200 is banished. The same holds true for the GeForce FX 5700 Ultra. Looks to me like the GeForce FX cards are using a DX9 rendering path of some sort.

 

Our testing methods
The dialog below shows the settings we used in testing, with the exception of the water detail problem noted above. For the benchmarks done with 4X antialiasing and 8X anisotropic filtering, we used this in-game settings tool to change AA and aniso modes.

Both the ATI and NVIDIA cards were left at their driver default settings for image quality, with the exception that we turned off vertical refresh sync on all cards.

Our test system was configured like so:

Processor: Athlon 64 3800+ 2.4GHz
System bus: HT 16-bit/800MHz downstream, HT 16-bit/800MHz upstream
Motherboard: Asus A8V
BIOS revision: 1006
North bridge: K8T800 Pro
South bridge: VT8237
Chipset drivers: 4-in-1 v.4.51, ATA 5.1.2600.220
Memory size: 1GB (2 DIMMs)
Memory type: Kingston HyperX DDR SDRAM at 400MHz
CAS latency: 2
Cycle time: 5
RAS to CAS delay: 2
RAS precharge: 2
Hard drive: Seagate Barracuda V ATA/100 120GB
Audio: Integrated
Graphics: Radeon 9600 XT 128MB AGP
Radeon 9800 Pro 128MB AGP
Radeon 9800 XT 256MB AGP
Radeon X800 Pro 256MB AGP
Radeon X800 XT 256MB AGP
GeForce FX 5700 Ultra 128MB AGP
GeForce FX 5800 Ultra 128MB AGP
GeForce FX 5900 128MB AGP
GeForce FX 5950 Ultra 256MB AGP
GeForce 6800 128MB AGP
GeForce 6800 GT 256MB AGP
GeForce 6800 Ultra 256MB AGP
GeForce 6800 Ultra “Overclocked” 256MB AGP
OS: Microsoft Windows XP Professional
OS updates: Service Pack 2 RC2, DirectX 9.0c

We used NVIDIA’s ForceWare 61.77 drivers for all of the GeForce cards, and we used ATI’s CATALYST 4.8 drivers with all the Radeon cards.

The test systems’ Windows desktop was set at 1152×864 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

If you have questions about our methods, hit our forums to talk with us about them.

 

Benchmark results
We have bar graphs and line graphs, because each type is useful in its own way. Note that I’ve had to split the line-graph results across two graphs, because we had too many results for a single line graph. To keep things sane, I’ve put the newer, higher-end cards on one graph and the older and mid-range cards on another.

 

4X antialiasing plus 8X anisotropic filtering

 
Conclusions
The video stress test in the CS: Source beta gives us a very different set of results than what we saw in the early Half-Life 2 benchmarks from almost a year ago. The NVIDIA cards are performing much better than they were before, especially relative to the Radeons. We’re not seeing the kind of “class busting” disparities between benchmark results here that we saw recently in DOOM 3, where one company’s $299 card outran the other company’s $399 card. Instead, what we have is rough parity. Of course, the GeForce 6 series of cards is much more potent in DirectX 9 than the GeForce FX line was. Still, the change in the FX cards’ relative performance is something of a surprise. Let’s break it down by class.

Among the $499 “image products,” the Radeon X800 XT Platinum Edition outdoes both flavors of GeForce 6800 Ultra, the “regular” 400MHz version and the 450MHz “overclocked in the box” model. The X800 XT PE’s advantage is most pronounced with antialiasing and anisotropic filtering enabled. For instance, with 4X AA and 8X aniso at 1280×1024 resolution, the Radeon hits 87 frames per second, while the GeForce 6800 Ultra OC averages 80 FPS and the Ultra 74 FPS. This isn’t exactly dominance, but ATI is clearly on top.

Down at $399, though, it’s a different story. The GeForce 6800 GT slightly but surely outperforms the Radeon X800 Pro without aniso and AA. With 4X AA and 8X aniso, the two cards are virtually tied across all four resolutions we tested.

At $299, we approach the sorts of graphics cards that many folks might actually consider buying. Here, the aging Radeon 9800 XT faces off against the brand-new GeForce 6800, and the NVIDIA card has the edge in the majority of our tests. Only in the most brutal conditions, at 1600×1200 with AA and aniso enabled, does the Radeon prevail.


The Source engine’s high-dynamic-range lighting in action

Jump down to the $199-ish range, and the field gets a little crowded, with various flavors of Radeons and GeForce FXs vying for attention. I’d pick the battle of the Radeon 9600 XT versus the GeForce FX 5700 Ultra as the most interesting comparison here. The FX card isn’t doing reflective water and can’t run with antialiasing in this CS: Source beta version, but otherwise, the two cards pump out frames at nearly the same rate.

NVIDIA has started phasing them out now, but there are still lots of GeForce FX 5900-series cards out there on the market, like the FX 5900 and FX 5950 Ultra cards we tested. Amazingly enough, these cards perform nearly as well as their ATI-based counterparts in the CS: Source beta, with the obvious caveats about water reflections and antialiasing. In the vintage sweeps, I had hoped to include a Radeon 9700 Pro in our tests to face off against the GeForce FX 5800 Ultra, but our ancient Radeon 9700 Pro card (a very early review unit) proved incompatible with our test system. The 5800 Ultra deafened me a little, but it turned in some decent benchmark scores, only six frames per second behind the Radeon 9800 Pro at 1280×1024.

Looks to me like Valve and NVIDIA have been working together to improve performance on GeForce FX GPUs, with impressive results. If these numbers are any indication, GeForce FX owners ought to be able to play Half-Life 2 with few compromises. I’m curious to see whether the optimizations for FX cards in Half-Life 2 are robust enough to survive incremental shader modifications and code updates to the game. They may still be rather fragile, as some of NVIDIA’s other optimizations for FX cards have proven. 

Comments closed
    • Rousterfar
    • 15 years ago
    • Ryu Connor
    • 15 years ago

http://www.steampowered.com/platform/update_history/Counter-Strike%20Source%20Beta.html

They fixed some rendering issues with the engine in this most recent patch. There are at least two more bugs I’ve reported that they still haven’t touched, and I’m sure others have been found or are on their own internal list, so we’ll probably see another patch soon.

      • zqw
      • 15 years ago

      8/21 update fixed some FX AA/aniso (also fps for 5900fx with AA/aniso)

      I just downloaded today – so I didn’t try the previous version.
      But, on my 5900fx I can turn on AA and Aniso inside CS. But, when I quit/restart CS, it says ‘can’t set video mode – restarting with defaults.’ I can also set AA+Aniso in the nvidia control panel without problems.

      The water reflections setting doesn’t affect fps, and it still doesn’t stick after quit/restart of CS.

      Also, when I alt-tab out and back in, some/all of the specular maps become pure magenta. It’s quite jolly.

      Anyway, 3.2ghz P4, 5900FX 128MB 1024×768 vsync off 71.66 stock quality settings. AA/aniso set via nvidia control panel.

      400/850(stock speeds)
      ~81fps no AA, trilinear
      ~51fps 4xAA, 8xaniso

      475/950 (5950 speeds, but less memory than 5950)
      ~92fps no AA, trilinear
      ~59fps 4xAA, 8xaniso

    • lyc
    • 15 years ago

    the reason why the current ati cards are beating nvidia here and not in doom3 is because nvidia has concentrated on shader performance in the gf6000 series, and these benchmarks don’t strain core logic at all compared to doom3.

    basically what you’re getting is a memory benchmark ^_^

    this is a lot like why intel beats amd at multimedia stuff, they optimise for streams of information. i used to be an ati fanboy, but they let me down when they didn’t include sm3 in their latest series. i’m hoping to upgrade to a 6800(gt?) soon…

    • Skyline57GTR
    • 15 years ago

The Nvidia 6800 Ultra performs very well in Doom 3, and the ATI X800 is very good in HL2… that’s the difference… hmm… Nvidia’s Doom 3 performance doesn’t surprise me much.

    • sativa
    • 15 years ago

    i just got BANNED from DriverHeaven.net for linking to this article in their CS: Source thread.

I took some screenshots of my post:

http://indeego.com/111686.jpg
http://indeego.com/111690.jpg
http://indeego.com/111692.jpg

Or save indeego’s bandwidth and click on these:

http://www.imagedump.com/index.cgi?pick=get&tp=111686&poll_id=0&warned=y
http://www.imagedump.com/index.cgi?pick=get&tp=111690&poll_id=0&warned=y
http://www.imagedump.com/index.cgi?pick=get&tp=111692&poll_id=0&warned=y

thanks indeego.

      • indeego
      • 15 years ago

http://indeego.com/111686.jpg
http://indeego.com/111690.jpg
http://indeego.com/111692.jpg

Let me know if you want me to host more.

      • Krogoth
      • 15 years ago

      Somebody can’t take the heat. 😉

      • sativa
      • 15 years ago

      here’s his response to an email about why i got banned:

[quote from the http://www.driverheaven.net site owner]

here’s a screenshot:

http://www.imagedump.com/index.cgi?pick=get&tp=111711

my response is as follows:

“Hi, I’m not sure how I’m responsible for other newly registered members. I only registered 1 name. My comment did not violate anything in your terms of service. I didn’t say ‘you guys must have used a GT.’ I said ‘Seems like you guys might have been using a 6800GT or x800 Pro instead of a 6800U.’ That is not inflammatory in any way, and it is quite reasonable seeing as your scores for the 6800U are the exact same as others’ scores for the 6800GT. Will”

I haven’t gotten a response back yet.

        • indeego
        • 15 years ago

“ive no patience left to deal with it.”

Best of luck running a site where you ban dissenting customers. That is just silly. A simple warning or request to change language or something, but an outright ban for something that innocent? Especially since you just saw a rather large influx of potential new users/pageviews.

          • blitzy
          • 15 years ago

          heh, it should be called ATIfanboyheaven.net

            • Illissius
            • 15 years ago

            That’s taken. Rage3D.

      • Convert
      • 15 years ago

      To quote myself: “Jeez I wonder what kind of ruling that reminds me of.”

Yes that’s right, ban everyone that goes against the…

      • sativa
      • 15 years ago

      lol he edited the post to say

http://indeego.com/111686.jpg

        • freshmeat
        • 15 years ago

        Maybe he’s just having a bad day, or maybe he’s an asshole. The bad part of the internet is that one cannot tell which of those possibilities is true. The good part of the internet is that you don’t have to deal with such people if you don’t want to — just ignore his site.

        I’d have to say I had no strong opinions one way or the other about DriverHeaven before, but I feel no real desire to visit them. Between their touchiness about being the data outlier and their poor response to your post and email, I have a hard time giving them much credibility or benefit of the doubt.

        Anyway, I’m overjoyed that my (t)rusty 9700pro should be adequate for HL2 (as it is for Doom3) as long as I don’t want AA or AF — not bad for a geezer of a card :p

        • mercid
        • 15 years ago

that site isn’t all that hot anyway; who really cares when there are sites out there like TR. Anyway, it reminds me a lot of the [H]ard episodes in those forums; i was one of the ones that got banned for dissenting on Kyle’s highly misleading 3.06 HT numbers.

Just don’t go anymore, and make everyone aware of their tactics (as you’re doing here), and they will feel the pain in advertising dollars.

      • Convert
      • 15 years ago

      Amazing, they locked the thread. How could you lock a discussion over a benchmark? People need to be able to discuss the results; you can’t just keep it closed like that.

Furthermore, he mentions this about techreport: “more accurate meaning basically it “suits” the card in your title and giving further weight to my last post.”

How dare he even attempt to bring down techreport’s reputation. Compare the two tests from both sites, excluding the results. My brother could put together a better review, and he’s pretty young. Techreport’s review is far superior in every respect; DriverHeaven will even agree on that.

      • sativa
      • 15 years ago

      another update. here’s his email to me

[quote from the http://www.driverheaven.net site owner]

first of all, he is outright LYING about multiple accounts. My ISP changes my IP once every 3 days, so it’s impossible for there to have been someone else in my area who registered a name with my IP. What a f_cking bastard. I cannot describe how frustrating it is to have him totally fabricate some bullshit. secondly, look at his first 3 sentences. thirdly, i was the only person to have linked to TR’s article at the time, so i DID add something to the thread. unbelievable.

      • sativa
      • 15 years ago

      here’s a link to a screenshot of his BS email to me:
http://www.imagedump.com/index.cgi?pick=get&tp=111961

        • nexxcat
        • 15 years ago

        wow. thin skin.

          • Rousterfar
          • 15 years ago

          Guy sounds like a jerk. Thanks for pointing that out. I won’t be reading his website.

    • cy_a253
    • 15 years ago

    Gentlemen, we have entered the Age of the 256MB videocards.

      • Krogoth
      • 15 years ago

Duh, it was obvious that with AA/AF in newer titles, 256MB would show its edge. Still, 128MB is more than enough for HL2. AA/AF, IMO, is still very overrated; it doesn’t justify spending the price premium for 256MB of VRAM.

128MB cards were already choked in Doom 3 and UT2K4 with AA/AF enabled. HL2 isn’t the first game to create a need for 256MB of VRAM.

    • Ryu Connor
    • 15 years ago

Not sure if it’ll make any difference, but CS: Source (and Steam) were patched today.

    The patch isn’t very big, but it fixed one very irritating bug I was running into that was causing a CTD. No telling what else it fixed (no patch notes).

      • indeego
      • 15 years ago

yeah I know for certain it fixed the server, because I’d get memory read errors on Wednesday night, but Thursday night it worked fine. It brings up an interesting point: it will be harder to benchmark these games if they distribute like this. They patch dynamically; maybe one day even owners of x vid card will receive a patch where y vid card doesn’t. Won’t be as easy to test against older reviews.

      • indeego
      • 15 years ago

They were patched yet again. So this’ll be a daily thing?

    • Damage
    • 15 years ago

    The article has been updated with screenshots showing the difference between DirectX 8 and DirectX 9 rendering paths. Interesting stuff.

    • daniel4
    • 15 years ago

Damn, I’m more interested than ever in buying a 256MB card. The 128MB cards sure are starting to show their age with the latest games.

    • Plazmodeus
    • 15 years ago

    A week before Doom3 came out, my client GAVE me a Radeon 9800XT 256mb which replaced my ti4200. I was all excited and looking forward to having a beefy, manly, high end video card to beat up on D3 and HL2.

Alas, these are strange days, and ‘high end video card’ doesn’t really mean what it used to. My 9800XT is just ‘good’, not ‘wikkid’, and I have to live with that and be ok with it.

    I am going to crawl back under my rock and play UT2k4 at acceptable frame rates.

    • Dposcorp
    • 15 years ago

I am still glad I run an AIW 9700 Pro and a 9800NP. They still have a lot of life left. I would much rather get a new CPU, motherboard, and RAM than a 6800 Ultra or an X800 XT for $499.

    $499 for a video card that does nothing else but be a video card?!?!?!

My AIW 9700 Pro has been so kick-ass in the time I have had it, was less than $300 when purchased, and came with a lot of extra hardware.

    I would spend $499 on a video card, but I expect much more for that money.

(256 or 512 RAM, dual DVI, 1 or maybe 2 HDTV tuners, PIP, VIVO, remote control, component output.)

    Maybe the AIW9700Pro spoiled me.
    Just my .02.

    • Ryu Connor
    • 15 years ago

    My apologies. When I say X800 vanilla, I mean the entire twelve pipe series of designs.

It appears that the extra four pipes and the aggressive clock design of the XT PE were not an easily ramped change. I don’t think it was originally in their plans to build a sixteen-pipe design, and I don’t think it cooperated in yields quite like they hoped.

      • Krogoth
      • 15 years ago

      Blame the 6800U for that 🙂

    • jb
    • 15 years ago

Well, it’s interesting to note this is the first review where the GT beats the X800 Pro, as both FiringSquad and GamersDepot show the X800 Pro slightly faster in all cases. Still, good work.

      • Illissius
      • 15 years ago

      gamersdepot’s results almost exactly reflect TR’s, except the 6800GT retains its lead even with AA+AF… although they used a 6800GT OC. Still, they’re basically neck and neck, same as w/ TR, and 20MHz isn’t going to make a huge difference.

    • TheTechReporter
    • 15 years ago

    Before you go jumping to conclusions about how these cards perform, read this:

http://driverheaven.net/showthread.php?p=423812#post423812

It shows the XT Platinum having a huge lead over the 6800 Ultra - definitely a larger disparity than was seen in Doom 3. Now, I’m not sure which article is more accurate, but don’t buy a 6800 Ultra just because of this article.

      • indeego
      • 15 years ago

Those dudes need some “Ancient-Chinese-Art-of-Excel-Graphing” classes.

      • Dissonance
      • 15 years ago

      From the DriverHeaven article:

      “Each test was run 3 times and the middle result shown in the graphs, Nvidia optimisations were on for the testing.”

      Curious that they would only mention leaving NVIDIA’s optimizations on when we all know that ATI has optimizations of their own, ones you can’t actually turn off. (https://techreport.com/etc/2004q2/filtering/index.x?pg=1)

        • daniel4
        • 15 years ago

        If we all know then why would they need to say anything?

          • Dissonance
          • 15 years ago

          Well NVIDIA’s optimizations aren’t exactly a secret, either, so why did they feel the need to single out those? When you look at their results, it’s all very…. curious.

            • Illissius
            • 15 years ago

Because it’s useful information, unlike whether the ATi ones were used, as they obviously were, since you can’t turn them off.
More interesting is where they use ATi’s 6xAA vs. nV’s 8xS… doesn’t nVidia have a new 8x mode which has a lot less of a performance hit? (i.e., 4xMS and 2xSS, as opposed to the other way around)
Anyways, I wouldn’t trust their results much, as they’re pretty clearly ATi biased.

        • vortigern_red
        • 15 years ago

Erm. If you can’t turn them off, is it not pretty obvious they are on? Whereas NV gives the user the choice, so it makes sense to say whether they are on or off.

ie. reading a review, I wonder, “are these benchmarks with NV’s optimisations on or off?”, so it is useful to be told. I don’t wonder that with ATI, as I know they are on, so I don’t need to be told.

        EDIT: Sorry, forgot to add this in no way validates their seemingly incorrect results.

      • Convert
      • 15 years ago

WTH is up with DriverHeaven’s results? I just don’t understand it… How could they put out a benchmark that is so obviously skewed (by accident, to give them the benefit of the doubt) and not question the results themselves, as so many other users are?

Not to mention anyone that questions it is a hopeless fanboy. No sirs, not only will fanboys question it, but so will people who know BS when they see it. We can’t forget the fact that whoever swallows this without a second thought is also a fanboy.

I mean, it’s so obvious: x brand loses to y brand, so if any x brand supporters question it, they must be fanboys, and their claims are immediately falsified. Jeez, I wonder what kind of ruling that reminds me of.

TR always questions results. Whoever is doing the testing will always make a note to some extent: “Hmm, things don’t look right here; I suspect some funny business.” They will also attempt to explain why there is such a performance delta. With these people it’s just an end-all-be-all of “Well, I think it’s safe to say the results speak for themselves.” No, I don’t think it’s ever safe to say that.

If they did something by accident, then I can understand a mistake. Let’s hope they either justify their results with some answers as to why they are like that, or find the mistakes and correct them. In all honesty, there could be a very reasonable answer for it.

      • absinthexl
      • 15 years ago

      The last pair of benchmarks is interesting, and makes me really question the credibility of the site. They run the ATi card at 6x AA and the nVidia one at 8xS AA, then show the results (and comment on them) as if they’re equal.

      8xS has been shown to drop framerates by 40-50% over 6x. Why do they essentially damn nVidia for allowing higher-quality settings?

        • Chrispy_
        • 15 years ago

“8xS has been shown to drop framerates by 40-50% over 6x. Why do they essentially damn nVidia for allowing higher-quality settings?”

Because Veridian obviously doesn’t know the difference between the two modes (oh dear). If he did, it would have saved his credibility to add just a few words to explain his choice. There’s always the possibility that DH is hosting this controversial review to generate hits before they sell out. That’s a crazy unlikely idea, though, and it’s more likely that the review was written in a hurry, by a n00b.


    • Samlind
    • 15 years ago

    Good. Now maybe ATI will drop their prices some. An X800XT Platinum boojesus shinemyshoes givemeagreatsexlife for less than $400 would be sweet.

    • Illissius
    • 15 years ago

    According to FiringSquad, the GeForce FXs are running in DX 8.1 mode. Can you guys confirm this? It would explain the water reflection thing and otherwise make a lot of sense, as I’d certainly expect them to do a lot worse with DX9 involved…

      • Damage
      • 15 years ago

      He doesn’t say how he knows that, which makes me curious. I’d like to confirm it if I could. Visually, the GeForce FX looks very much like the other cards to my eye. I’m perplexed by his assertion that the FX cards don’t do high dynamic range lighting. After reading that, I went back and checked it out with the FX 5950 Ultra, and HDR lighting appears to be active. That makes me think the FX cards are running a DX9 code path.

      I tried to mail Brandon, but it bounced. Doh! We’ll see if we can corner him somehow and ask what he knows and how.

        • droopy1592
        • 15 years ago

Any way we can verify DriverHeaven’s high-AF benches? Looks like a big difference.

-edit: Oh wait, looks like the DriverHeaven people are already attacking the benches because they are getting higher scores with their 6800 GTs overclocked to Ultra levels with slower processors.

        • Illissius
        • 15 years ago

        Looking at the high res (png) screenies from their site, there -are- some very noticeable differences, to the extent that by the third and fourth images I could tell which was the 5950U and which the 9800XT just by looking at it. The odd part is, the 5950U looks the more detailed of the two… o_O

          • vortigern_red
          • 15 years ago

I thought that there was a huge difference, and I can see why you think the NV shots look more detailed. Very odd.

I spent ages looking at these images and I can’t quite figure it out, but if you look at the images of the cave with the water in the bottom, the R420, R3x0, and NV40 all render similar-looking images. Just look at the water and the wall on the right-hand side: the NV3x renders the water completely differently, and the wall looks far more “contrasty.” The wall at the back is the most obvious difference between the R3x0 and the NV3x, but it is unfortunately the most difficult to judge against the NV40 and R420, due to the fact that they are much closer. The NV40 and R420 shots look more like the NV3x shots to me there, but I’m not sure.

The shots with the fire in them are very difficult to judge, as the lighting and position are different in every shot, but again I think the NV3x looks different from all the others.

            • Damage
            • 15 years ago

            I’ve updated our article with screenshots showing the difference between DX8 and DX9 rendering paths, plus a shot of whatever the FX cards are doing. Have a look.

            • vortigern_red
            • 15 years ago

To me there is no obvious difference between your NV3x shot and the X800 shot. But FS’s shots look way different; Illissius is right, the NV3x shots look far more “textured.”

Look at these:

http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=05
http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=06
http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=09
http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=10

Look at the water and the wall on the right. The first two are pretty much identical, X800 and NV40; the next one is R3x0 and the last NV3x. NV3x looks very different from all the other three, but R3x0 could also look different from the first two shots re the wall on the right (less contrasty perhaps, hence exaggerating the difference between itself and NV3x???), though the water looks the same as the first two. I’m not sure at all what the differences are or why, but they are obviously there; unfortunately, the shots are from quite different positions, so comparing is very difficult.

Also look at these ones:

http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=14
http://firingsquad.com/media/article_image.asp?fs_article_id=1532&pic_id=13

The first is NV3x and the second is R3x0 (no R420 or NV4x shots, unfortunately). You don’t need the PNGs to see the difference in those shots; the NV3x shot looks far more “bumpy,” and one of the cables/pipes along the roof is two completely different colours. There seems to be a massive difference in the lighting/shadows.

            • Damage
            • 15 years ago

            I’m not sure those screenshots you linked are 100% right. I’m not saying they’re wrong, but I am a little wary at this point for various reasons.

            That said, it’s an interesting contrast. The differences between them could be caused by different levels of precision for normal maps, or perhaps some sort of difference in the handling of normal map data between the cards. However, these aren’t exactly super-deep normal maps, so I doubt we’re talking about floating-point texture formats or anything too exotic. The hard part would be assigning a value to one or the other as “better.” I suppose the Valve guys could tell us what they intended and which card gets closer to it, but I’d hate to guess.

            • vortigern_red
            • 15 years ago

DaveB at Beyond3D today wrote:

“5900’s are running in ‘Mixed mode’ which means mixed precision dx9 shaders, and possibly a few more dx8 shaders. Its only the low end FX series that Valve treat as full DX8.”

I’m pretty sure he will be right, so it would concur with your findings.

            • Damage
            • 15 years ago

            I’ve added a screenshot from the FX 5700 Ultra. It, too, appears to be using higher precision like the X800 Pro and 5950 Ultra.

            • vortigern_red
            • 15 years ago

            I would expect it to!

My presumption is that all NV35-based cards (FX 59x0 and 5700) will run mixed mode (or even full) and that all NV30-based cards (ie FX 5800, 5600, 5200) will be the cards that run mainly DX8 or at best mixed mode.

The difference being the cards that have all floating-point pipelines (59x0 and 5700) vs. the cards that still have fixed-function hardware (or integer, I can’t quite remember, it’s too late for me! 12:30 here) (5800, 5600, 5200).

If you wanted to see the difference, then comparing a 5800 (or 5200 or 5600) to a 5900 (or 5700) should show it, if it is really there. Bear in mind the worst possible outcome for these cards should be the GeForce4 picture you posted, and I don’t think that DX8 shaders vs. DX9 shaders has anything to do with the difference in the FS screens.

    • BigMadDrongo
    • 15 years ago

    This article makes me a lot happier about the 6800GT I bought last week 😀 Was having great difficulty choosing between 6800GT and X800Pro, in the end I all but tossed a coin…

Coming from a Ti4200 it is scary how fast this thing is. (Admittedly I also upgraded the CPU from a P4 2.26 to an A64 3000+ – which is…)

    • ChangWang
    • 15 years ago

    man, nvidia is doing big things this round. now I’m sure I’ll be getting a 6800…

    DROOPY! how ya been man. Trey told me a few months back that you traded your Jetty for an A4

      • droopy1592
      • 15 years ago

Yeah man, I gave up the Jetta and bought a 1.8t quattro manual 04. It’s really nice, and hopefully next year I’ll start modding it. I’m going Caractere front and paint the lowers, GIAC chip (the new one), and some S4 brakes.

        • ChangWang
        • 15 years ago

        sweet! I was in an accident in late june and mine was totaled. I just bought another one though, Lagoon Blue GLI. You ever come to any of the GTG’s? I’ll be at the Varsity on the 2nd.

Oh yeah, anyone care to guess how a 5900XT would fare in all this?

          • droopy1592
          • 15 years ago

          VW guys meet up TOO much. Audi group meets once a month, but I may show up anyway.

          Hope you are ok…

      • droopy1592
      • 15 years ago

      Oh yeah, next time you guys go on a mountain run, let me know.

    • droopy1592
    • 15 years ago

I’m an ATi man, but it looks like the GeForce 6800GT is the card to have unless there are some major image issues, which I don’t see… not yet anyway.

    • Spotpuff
    • 15 years ago

    Why does the 9800xt own the 9800 pro so bad? 🙁
    My brand new $200 (CDN) 9800 pro feels bad now 🙁

      • dukerjames
      • 15 years ago

if you look at the test specs, they used a 9800XT 256MB and a 9800Pro 128MB.

The 9800XT isn’t that much faster than the 9800Pro, maybe 10% or so, but the extra memory seems to be helping a lot here.

        • Spotpuff
        • 15 years ago

        Guess it’s large textures, but I can play doom 3 1024×768 high quality no problem whatsoever.

        Sooo… yeah. I wonder if it’s the memory size… we are finally seeing games now where VRAM makes a difference.

        Crap.

        Should have spent the extra $25 on the 256 mb version 🙁

          • Usacomp2k3
          • 15 years ago

wonder how my 9800pro flashed to XT would sit in there, 128MB version

            • MadCatz
            • 15 years ago

I was wondering the same thing, especially since the 128MB GeForce 6800 owns the 256MB 9800XT. Are 4 extra pipes enough to boost the GeForce past the 9800XT with only half the RAM?

            • dukerjames
            • 15 years ago

don’t worry. take a look at this:

https://techreport.com/etc/2004q3/source-engine/cs-640.gif

Our 9800pros owned everything, got a gold medal in the first test. 😛

        • Krogoth
        • 15 years ago

It’s only in AA/AF benches that the extra VRAM comes in handy. The 9800XT also crushes the 9800 Pro 128MB in D3 and UT2K4 if you enable AA/AF, due to having more VRAM.

    • Convert
    • 15 years ago

Great review. That was far from the slaughter I was expecting, heh. In fact, those scores compare well with other DX9 titles; you see the same trend.

    So the ginormous lead ATI was expected to have wasn’t that big after all.

    With every review that goes by the GT looks better and better.

    • FireGryphon
    • 15 years ago

    Judging from the numbers, Doom 3 still stresses graphics cards more. It seems any card from this generation or the last is going to play HL2 extremely well at high IQ’s.

    I’m stoked for HL2. Better finish doom 3 soon…

    • indeego
    • 15 years ago

Ran this last night. I have an AMD64 3200+, gig of ram, and 9600 XT, and I should be getting much lower scores than thou…

Athlon64 3200+
1Gig PC3200
Nforce3 150
ATI Radeon 9600XT
XP SP2 and all patches/drivers up to date

63 fps @ 1024x768
32 fps @ 1280x1024

I was quite pleased with the performance, although it hitched at 1280x1024 several times. Oh well. Wonder if it’s because you have SP2 RC2.

      • Damage
      • 15 years ago

      Your scores are very similar to what I got. That’s no surprise given that the 9600 XT is pretty much shader/fill rate limited here. The faster CPU and memory subsystem won’t matter much when the graphics card is the primary bottleneck.

      • indeego
      • 15 years ago

Just an update. Got a 9800pro on the same system, and now my FPS is more than double at 73.57 fps. Sweet.

    • spworley
    • 15 years ago

This is a great peek at what to expect with HL2!
It means that I will indeed be getting my 6800GT now… between HL2 and Doom 3, that’s going to be hard to beat. ATI is definitely staying competitive, but with the 6800GT doing so well in both of the Big Games, and with shader model 3, and with 32-bit floating point, my decision is now made. I’m sure that an ATI X800 would be OK, but now IMHO Nvidia has the tech edge (again). A great race, though, and it just helps us users!

TR, thanks for such a quick comparison after the demo came out!
