During the preparation of my Radeon X1900 review, I discovered a problem with Radeon X1900 cards in 3DMark06. I meant to mention it in the review itself, but things came down to the wire and I just ran out of time. Here’s the deal: in the “Firefly Forest” scene of 3DMark06, Radeon X1900 cards render an image that’s not quite right. The problem is visible to the naked eye—that’s how I spotted it. The glowing area around the two floating lights in that scene—the “fireflies,” I suppose—looks almost as if the colors were dithered on the Radeon X1900. Like most dithering, this visual artifact is most obvious in motion; the area around the lights creeps, crawls, and “fizzes” as the light moves through the scene.
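To see why a dithering-like artifact "fizzes" in motion, here's a toy illustration (this is not ATI's actual shader bug, just a sketch of classic ordered dithering): intermediate shades get approximated by a fixed on/off pixel pattern, so as the underlying values shift frame to frame, pixels flip between the two extremes.

```python
# Toy 2x2 ordered (Bayer) dithering -- a hypothetical illustration of the
# salt-and-pepper look, NOT the actual X1900 shader bug.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # threshold pattern, scaled below to the 0..255 range

def dither_1bit(gray):
    """Reduce an 8-bit grayscale grid to pure black/white pixels.

    Each pixel is compared against a position-dependent threshold, so a
    flat mid-gray area comes out as a speckled mix of black and white.
    """
    out = []
    for y, row in enumerate(gray):
        out.append([
            255 if px > (BAYER_2X2[y % 2][x % 2] + 0.5) * 64 else 0
            for x, px in enumerate(row)
        ])
    return out
```

A flat mid-gray (128) patch turns into an alternating black/white pattern; nudge the input values slightly each frame and the pattern "crawls," which is exactly why this kind of artifact is easiest to spot in motion.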
Here are some screenshots that clearly capture the problem. First, the Firefly Forest scene on the GeForce 7800 GTX 512 looks like so. Note especially the glow of the green light on the forest floor. Now, look at the same frame as rendered by the Radeon X1900 XTX. The colors on the forest floor take on a rough, salt-and-pepper kind of texture that looks, as I said, like dithering. For those of you who are squinting and straining to see the problem, I’ve blown up the most relevant portion of the frame to four times normal size. Here’s the 7800 GTX 512 and the Radeon X1900 XTX. If you can’t see it there, check with your optometrist.
I pinged ATI to see what they had to say about this problem, and they confirmed that they were able to duplicate it themselves. They said that the image problems are caused by a bug in the compilation of the shadow shader on the Radeon X1900. They explained that this shader is different from the one used on either the X1800 or the GeForce cards because it “employs both Fetch4 and 24-bit DST” in order to best take advantage of the X1900’s capabilities.
Sure enough, the artifacts don’t appear on the Radeon X1800 XT, so ATI’s explanation seems plausible.
Pixel shader programs are compiled in the graphics driver, of course, so ATI should be able to fix the problem with a driver update, and that’s exactly what they expect to do. We don’t yet have any ETA on the fix, but ATI claims it should not adversely affect performance. We’ll have to see about that.

The HL2 engine adapts to the power of the GPU and removes distant objects sooner on slower GPUs. Clearly the GPU was erroneously detected here (or they didn’t use a fresh install of Steam).
You can however force it to behave in a less extreme way via settings and cvars; see the support section at http://www.steampowered.com for info. There’s no reason to panic about it, the HL2 engine has many of these hand-fixable bugs.
how are MISSING OBJECTS the result of AA?
How often do we see image quality issues that only crop up in one game? I think it is actually quite often.
So far, this issue does not seem to happen in any actual game. Like the ones you play, where the mouse and keyboard make things happen on the screen.
ATI’s time would be better spent fixing issues with real games.
And that has exactly what to do with the news article in question, Beomagi? :-/
gtx 512 xxx? does it feature a naked nalu on the sink?
That’s the explanation. He’s trying to be a fly, in his hopelessness to defend ATI. Poor sap. :-/
AA doesn’t explain the bushes that are COMPLETELY missing. There are also some shadows missing in the ATI shot (check out the barrel on the right, for example).
Why am I not surprised.
it doesn’t look better, unless of course you see black dots over everything in real life.
This has to do with ATI’s AA method: it tries so hard to anti-alias extremely thin lines that the thinnest ones end up almost blended completely out. The tree will show what I mean, and the bushes would also be there when you got closer to them. Don’t like it? Turn down the AA. The tree shows more detail in the nvidia shot, but it doesn’t really matter because it looks horrible anyway with all the aliasing.
Look at the barrel at the right of the scene: there’s a shadow for ATI and no shadow for nvidia. Now that is a real bug, in either the driver or the game, more likely the driver.
I think it looks better with the “mistake”
I agree with you. People should be entitled to their opinions without becoming the subject of slander.
Yeah, and you are getting upset about someone else’s opinion, relax guy.
The Geforce is clearly better in this case. The ATI card is not rendering the image properly and it does not look better.
Please do not compound the blindness by agreeing with him. If you can’t see the flaw between the two (zoomed) images, please stop using your computer. “More realistic”? To a Smurf?
This “useless program” just highlighted a bug in the card’s driver, which could potentially affect who knows how many games.
I would only care if there was a large difference in performance; otherwise, no big deal. Probably just a bug they haven’t got around to ironing out.
Yeah, bring on the mathematical diff screenies… I always enjoyed those on the anti-aliasing tests… why not on the image quality test?
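A "diff screenie" is easy enough to sketch. The snippet below is a hypothetical minimal version, assuming both screenshots are available as same-size 2-D grids of (R, G, B) tuples: it takes the per-channel absolute difference and amplifies it, so identical pixels come out black and any rendering discrepancy lights up.

```python
# Minimal per-pixel diff sketch (hypothetical helper, not from the article).
# img_a and img_b are 2-D grids (lists of rows) of (R, G, B) tuples of
# identical dimensions, e.g. as decoded from the two screenshots.

def diff_image(img_a, img_b, gain=8):
    """Return abs(a - b) * gain per channel, clamped to 255.

    The gain factor exaggerates subtle differences (like the X1900's
    faint salt-and-pepper fizz) so they are obvious at a glance.
    """
    out = []
    for row_a, row_b in zip(img_a, img_b):
        out.append([
            tuple(min(255, abs(ca - cb) * gain) for ca, cb in zip(pa, pb))
            for pa, pb in zip(row_a, row_b)
        ])
    return out
```

In practice you'd load the two PNGs with an image library and run the grids through this; matching regions render pure black, and the dithered glow around the fireflies would show up as a bright speckled patch.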
Yeah, you’re a goddamn idiot, the (horrible) difference is obvious. It looks like something from the 1980s.
Seems like it’s time again to compare output as delivered by ATI/Nvidia to that delivered by the MS reference renderer. This technique has already proven useful in the past.
Exactly what I thought–the ground/grass looked like it had more depth, and looked more realistic in the “mistake.”
All this app-specific optimization stuff is pure crap. All cards should produce equal output, yet they cheat and go for higher FPS.
I’m with you on that one. Even though the x1900 is rendering the image improperly, it looks better. Looks more “natural”, like the real outside would.
Nvidia’s had their fair share of new cards + drivers = bugs.
Nah, I couldn’t tell which one was the “bad” one when I put them side by side.
EDIT: Upon further review I do finally see what Damage is talking about, but I think it would be much more visible in motion like he mentioned in the article.
Can anyone try renaming 3Dmark06 to 3Dmurk06?
How about a performance comparison chart showing the 3DMark05 results with the GTX and XTX… It’s the benchmark the majority of TR members are familiar with, and it would have been interesting to see along with the 06 results.
Just checked out the 2 screenshots on the TR frontpage links……oh the horror….
So am I a technoplebe for thinking the “defective” X1900 image actually looks better?
Or…. There is some kewl cheating involved again…
Why would anyone with a working brain give a crap?
The X1900 XT is just new, and ATI’s driver developers have barely had any time to refine its drivers.
Oh God, I’m so sick of this crap.
Where was it said?! Can’t see it 😉
really???
At least the drivers don’t corrupt the entire texturing of the scene like NV’s 8x.xx drivers did for months on the 6x00 series (my 6800 and 6600) in Guild Wars. 🙂 Was fairly laughable and annoying.
Some games, yes; synthetic junk… meh, until those techniques get into future games. Can’t be ignored either way for that reason, though, ’cause it’ll probably come back and bite us in the hiney later 😛
HL2 was endorsed by ATI…
I couldn’t tell there was anything wrong until I looked at the 7800 version and saw the difference. That said, new cards + ATi drivers = bugs.
How important this is really depends on whether the buggy pixel shader compilation is in a benchmark-specific optimization, or the generic pixel shader support. If it’s a problem with an optimization that is specific to 3DM06, then no biggie. But if it is a problem with the generic pixel shader handling, then any games which use this type of pixel shader could also be affected.
Because if it cannot render this properly it means that some games using similar effects may not render properly. Oh, BTW, you don’t play UT2004 with us anymore? :'(
So we don’t care that NVIDIA had a whole lot of ‘bugz’ with 3DM05, and some games too…
I see what 5 is saying: if you look around the forest floor and plants where the green firefly shines, you see more shadow detail on the plants/ground with the 7800. Also, the pink firefly has a trail of light with the 7800, whereas the X1800 does not; the X1900, however, shows the trail.
As it was said “If you can’t see it there, check with your optometrist.”
Well let’s give him grace and assume he meant X1800 XT instead.
Besides, we all know how this industry lovex their exxxxex. Even XFX is now putting out a GTX 512 XXX card (no lie!).
–dv
….There is no X1800 XTX….
If this is limited to synthetic software then it’s not an issue, since those I skip over 😛 The only concern I have, though, is that if synthetics supposedly offer some glimpse into future games, then this may affect future games if ignored, eh?
Just looks to me like the X1800’s image is a little bit darker than the 7800 GTX’s. Turn up your monitor’s brightness? 🙂
Even when I view the X1800 image, it seems the picture is lower quality as well; some shadows/detail are missing when compared to the nVidia screenshot, which looks better.
Anyone else notice this?
Does this mean the nVidia card is processing more things as the ATi is removing detail?
Bah… Can’t see any dithering…
ATI or Valve needs to fix the loss of detail in HL2.
Look at the screenshots from this review at Firingsquad.
http://www.firingsquad.com/hardware/ati_super_aa_vs_nvidia_sli_aa/page3.asp Here is the NVIDIA shot: http://www.firingsquad.com/media/article_image.asp?fs_article_id=1789&pic_id=12 Here is the ATI shot: http://www.firingsquad.com/media/article_image.asp?fs_article_id=1789&pic_id=11 In the ATI shot there are bushes, branches, and other things missing. Rage3D ran into the same problem but could not find out why it happens, other than that HL2 is running at a lower detail setting. This may be on Valve's end. Can anyone else confirm this or shed some light on it? It seems to make HL2 benchmarks uneven.
Any chance you can try taking a screenshot using a Radeon X1600 or X1300 for comparison, as these two parts also support Fetch4 and DF24?
Hanners
Elite Bastards
I really don’t care if a card can run this useless program properly or not.