Further NVIDIA optimizations for 3DMark03?

WELL, I CAN HARDLY BELIEVE we’re back on this subject again, but this saga just won’t die.

Let me try my best to bring you up to speed. First, there was a late, difficult birth for NVIDIA’s new line of GeForce FX chips, especially the stillborn GeForce FX 5800 Ultra. Then, NVIDIA voiced complaints and concerns about FutureMark’s new DirectX 9-based version of 3DMark, 3DMark03. FutureMark fought back, and it was a hoot: a big, public brawl between hardware vendor and benchmark house.

Once GeForce FX review units finally trickled out to reviewers, folks started finding driver optimizations on the FX cards that reduced image fidelity to gain performance. NVIDIA’s spiffy new Detonator FX drivers seemed to fix that problem, and improved performance dramatically. But then ExtremeTech published an article exposing specific optimizations in the Detonator FX drivers intended to inflate 3DMark03 benchmark scores.

Mmm. Fishy.

Soon, FutureMark released an audit report confirming the problems with the NVIDIA drivers and explicitly stating that NVIDIA was cheating. NVIDIA disagreed. Brouhaha ensued. ATI, for its part, was also caught optimizing its drivers. ATI owned up, pointed out that its optimizations didn’t change image output, and pledged to remove them from its next driver release anyhow.

FutureMark released a new build of 3DMark03 that disabled the disputed driver optimizations. (We benchmarked the build with optimizations versus the build without.) Dell then weighed in in support of 3DMark03. Presumably after some behind-the-scenes legal wrangling, FutureMark issued a joint statement with NVIDIA backing down from the “cheat” word.

And that’s where we are now. Phew. Thank goodness for hyperlinks. This story has been unfolding long enough to induce fatigue in certain quarters.

Even more background
If you’re old like me, you may remember the day when ATI was caught optimizing its Radeon 8500 drivers for Quake III Arena timedemos. The trick was simple: ATI’s drivers looked for an executable named “quake3.exe” and turned down texture quality when “quake3.exe” started. Kyle Bennett at HardOCP renamed “quake3.exe” to “quack3.exe” and ran some benchmarks. ATI was busted.

In a funny twist of fate, I got a tip earlier this week about NVIDIA’s Detonator FX drivers. The allegation: if you rename 3DMark03.exe to something else and run the benchmark with anisotropic filtering enabled in the drivers, test scores drop. In other words, NVIDIA appears to be using the same lame technique ATI did way back when: keying on the program’s filename in order to trigger benchmark “optimizations.” In this case, those optimizations appear to be a lower-quality form of texture filtering than the anisotropic filtering method selected in the driver control panel. Many review sites, ours included, benchmark cards with anisotropic filtering and edge antialiasing turned on, so these things do matter.
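Filename keying of this sort is trivial to implement. The sketch below is purely illustrative, written in Python rather than actual driver code, with invented function and mode names; it shows the kind of check being alleged, and why renaming the executable defeats it:

```python
import os

def select_filtering_mode(exe_path: str, requested_mode: str) -> str:
    """Hypothetical driver-side check: return the filtering mode
    actually applied for a given application (illustration only)."""
    exe_name = os.path.basename(exe_path).lower()
    # Key on the benchmark's filename and quietly substitute a cheaper
    # filtering method for the one selected in the control panel.
    if exe_name == "3dmark03.exe" and requested_mode == "8x aniso":
        return "reduced trilinear"
    return requested_mode

# Renaming the executable defeats the check -- the "3DMurk" test:
print(select_filtering_mode("/games/3DMark03.exe", "8x aniso"))  # reduced trilinear
print(select_filtering_mode("/games/3DMurk03.exe", "8x aniso"))  # 8x aniso
```

Any application whose filename doesn’t match gets exactly the filtering it asked for, which is why the rename test is such a clean way to expose this class of optimization.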

Keep in mind that this goes beyond the eight specific Detonator FX “cheats” FutureMark identified in its audit report. This optimization is in addition to all of those, and it works with the latest release version of 3DMark, build 330. It appears NVIDIA has taken yet another measure, one FutureMark didn’t catch in its audit, to boost scores in 3DMark03.

You probably won’t be surprised to learn that my tip came from ATI. The ATI folks say they were studying the Detonator FX driver’s new filtering routines, to see what they could learn, when they discovered this quirk. (Some have claimed NVIDIA was the original source for the Quake/Quack debacle, but I have no first-hand knowledge about that.) Regardless, we felt compelled to test the Detonator FX drivers to see if the allegations were true. So we gathered up a GeForce FX 5800 Ultra, a GeForce FX 5600, and a GeForce FX 5200 Ultra for a little benchmarking session. We brought along our typing fingers, so we could rename “3DMark03.exe” to “3DMurk03.exe” and see what happened. Our results follow.

 

Our testing methods
The benchmark results we’ll present come from testing conducted by our own Geoff Gasior. I also conducted extensive testing myself on a GeForce FX 5800 Ultra as we formulated the best means of testing for these driver optimizations. The results of Geoff’s tests served to confirm my own results. Since Geoff was able to test on a wider range of cards, we’ve elected to publish his results.

As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

  System
Processor Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz
Front-side bus 333MHz (166MHz DDR)
Motherboard Asus A7N8X
Chipset NVIDIA nForce2
North bridge nForce2 SPP
South bridge nForce2 MCP
Chipset drivers NVIDIA 2.03
Memory size 512MB (2 DIMMs)
Memory type Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Sound nForce2 APU
Graphics card GeForce FX 5200 Ultra 128MB
GeForce FX 5600 256MB
GeForce FX 5800 Ultra 128MB
Graphics driver Detonator 44.03
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0a

The “Quality” setting was used with the Detonator FX 44.03 drivers.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

3DMark03 overall score
We tested all of the new NV3x chips we could get our hands on, including the NV30, NV31, and NV34. Of course, we tested with 3DMark03 using two filenames: “3DMark03.exe” and “3DMurk03.exe”. In order to keep a control group of sorts, we tested both with and without anisotropic filtering enabled. As you will see, the “3DMark” and “3DMurk” scores are nearly identical when aniso is disabled. However, with 8X anisotropic filtering set in the driver…

…scores drop in “3DMurk.exe”. The GeForce FX 5800 Ultra’s scores suffer most dramatically, but all three of the GeForce FX chips are affected. In the following pages, you can see the scores on the four individual game tests that contribute the 3DMark03 overall score. After that, we’ll have a look at the differences in image output.

 

Game test 1

Game test 2

 

Game test 3

Game test 4

 
Different filtering, different images
We’ve taken some screenshots using 3DMark03’s image quality tools in order to see if there’s any output difference between “3DMark” and “3DMurk.” The differences are subtle, but unmistakable. Below are images taken from frame 480 of 3DMark03’s game test 3, Troll’s Lair, on a GeForce FX 5800 Ultra with both executable filenames.


Frame 480 from game test three with 3DMark03.exe
(Click for full-sized, lossless, high-bandwidth PNG image)


Frame 480 from game test three with 3DMurk03.exe
(Click for full-sized, lossless, high-bandwidth PNG image)

Differences between the two are not immediately obvious here, even when you’re looking at the full-sized PNG images. 3DMark03 doesn’t present a lot of flat surfaces and bright lighting that might make detecting texture filtering quirks relatively easy. However, using some image editing tools, we can easily identify the differences.

The image below is the result of a mathematical “diff” operation between the two full-size screenshots linked above, enhanced with a histogram function to make the differences more visible.
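For those who want to reproduce the technique, here is a minimal sketch of the diff-and-stretch approach, operating on plain lists of grayscale values rather than the actual PNG files (the function names are ours, for illustration):

```python
def diff_image(pixels_a, pixels_b):
    """Per-pixel absolute difference between two equal-sized images."""
    return [abs(a - b) for a, b in zip(pixels_a, pixels_b)]

def stretch_contrast(pixels, out_max=255):
    """Histogram-style contrast stretch: rescale so the largest
    difference maps to full brightness, making subtle changes visible."""
    peak = max(pixels)
    if peak == 0:
        return pixels[:]          # identical images stay all-black
    return [p * out_max // peak for p in pixels]

# Two nearly identical "screenshots" (grayscale values 0-255):
mark = [120, 121, 119, 200, 201]
murk = [120, 123, 119, 196, 201]   # slightly different filtering result

d = diff_image(mark, murk)         # [0, 2, 0, 4, 0] -- invisible to the eye
print(stretch_contrast(d))         # [0, 127, 0, 255, 0] -- now obvious
```

The same idea works on real screenshots with any image editor that offers a difference mode plus a levels or histogram-equalize tool.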


The difference between the two images
(Click for full-sized, lossless, high-bandwidth PNG image)

Texture filtering methods do indeed appear to have changed when we renamed the executable file.

To give you some context, the output from 3DMark03’s image quality tool is usually consistent from run to run. We captured a pair of screenshots from frame 480, both using “3DMurk03.exe”, and ran a diff between them. Not a single pixel was different. The differences between “3DMark” and “3DMurk” image output are very likely related to the performance differences we’ve observed.

 
Conclusions
Given the source for our tip about these benchmark-specific driver optimizations, we had some initial trepidation about bringing this issue out in the open. In light of that fact, it’s probably not appropriate for us to comment on the nature of these optimizations, whether they qualify as outright cheats, or any of that. We have already commented on many of the issues surrounding driver optimizations and popular benchmarks like 3DMark03, Quake III Arena, and the like. If you read all the linked items in the introduction to this article, you will have a good overview of the issues involved. We will leave it to the reader to draw his own conclusions. 
Comments closed
    • Anonymous
    • 17 years ago

    Post by Boooooommm:

    This is simple: I don’t trust 3DMark benchmark result because they don’t ultimately show the performance on real games available in the market! They do not use engines that are, will or were used in games, so it really doesn’t tell you exactly what your results will be in real games. I would consider it more a RAW POWER benchmark than anything else. As to nVidia and ATi cheating with their drivers the answer is simple: If you call optimizing cheating, then you folks must hate faster cards. It’s a bit ironic how people think these current events can be called cheating. Anyway, there is no gain in optimizing drivers for a test that will not really provide a considerable view on how a card will perform in real game situations. In this case, both are cheating. SO nobody here is saved from the sin. Futuremark doesn’t make a reliable benchmark (even though it is respectable today) because it doesn’t make the cards run on Real Game situations. nVidia optimized for marketing reasons and ATi… well… ATi’s life with drivers is optimizing. So all of them are completely unreliable if this is the case. Nobody, unfortunately is safe.

      • Anonymous
      • 17 years ago

      ATI much RAW POWER it has.

      • Anonymous
      • 17 years ago

At least ATI did not lose IQ when they optimized. They did not make quality suffer to push an extra few frames per second to fool consumers and OEMs like Nvidia has. And Nvidia has done this 3 times now. Regardless of how you think of this benchmark (not too many people bitched about 3dmark2001. Hmmmm), it proves one thing. Nvidia has to doctor benchtests to fool everyone that the 5900 is the fastest kid on the block. And the funny thing is, it is. They did not have to do this. They just wanted the card to pummel the 9800pro. Which it did not do. Not even close. Actually losing to the 9800pro in shader tests and shader games.

    • Anonymous
    • 17 years ago

    It is cheating the consumers. However, is it cheating that most software writers optimize their software to operate better with Pentium based processors? AMD may not be as fast, but if they have to emulate and still outperform the competition no one seems to cry foul. Likewise with the driver optimizations. Both ATI & Nvidia need to spend a little more time correcting their drivers to work with the various types of software programs and spend less time worrying about the benchmarks. The benchmarks will follow!

    • Anonymous
    • 17 years ago

    this is funny…not defending either ati or nvidia…but, is ati in the beta program for future mark…nvidia i guess didnt want to pay fees to be part of the beta program…maybe thats why ati felt they need to say sorry (not coz they care and loved you loyal ati fans) and nvidia doesnt…they both just trying to maintain there image as the two top contenders in graphics…all they want is our hard earn cash…i think both companies make great video cards…benchmark to me is just to see if i my graphics card can handle the games being released today…damn i dont sit and play games and say damn im loosing x amount of fps…who cares whether they optimise…we’re gonna optimise it ourselves anyways…there is no real life comparison when it comes to benchmarks just like cpus…your not really gonna see the numbers…of course there is a difference when your making comparisons with dinosaur components…for the record i think ati is a better video card but, like nvidia coz i dont have to optimise vid card myself…muahahahahah!!! with ati optimising i still have to optimise…this tells u i dont care!!!

    • Anonymous
    • 17 years ago

    ERIC’S OPINION
    When this whole sorry spectacle first broke, there were those who claimed nVidia couldn’t possibly be doing anything wrong. When Futuremark issued a formal statement calling it a cheat, there were those who said nVidia wasn’t really cheating, just optimizing. Now nVidia is apparently doing the exact same things that ATI was castigated for a couple of years ago–fudging image quality during benchmarks to garner higher scores. Many of the voices who are defending nVidia today were the same voices who called for blood from ATI because of Quack vs. Quake. One wonders just how long a self delusion can continue before the truth runs head on into inescapable fact.

    In the final equation, though, this changes very little. Given how wantonly nVidia has distorted the issue to serve its own interests, and how pitifully Futuremark has responded to a blatant lobotomy of its one and only product, can anyone give any credence whatsoever to any benchmark nVidia issues? Futuremark’s reputation has been utterly destroyed here, and the very idea of synthetic benchmarks at all has been called–quite rightly in my opinion–into question.

    That nVidia would stoop so low is disappointing but not surprising. That it is being allowed to get away with it by so many is.

    • Anonymous
    • 17 years ago

    #221,

    I think you meant to say that “Confusing opinions with fact are what HardOCP is good at”.

Didn’t they use this as a rebuttal for ET’s analysis of nVidia’s 3dmark issues? That ET was sour because of not getting to review Doom 3. The “fact” was that they didn’t get to benchmark Doom 3.

    • Anonymous
    • 17 years ago

    Mister Wasson, you disappoint me.

run 3dmurk on two equally priced cards from nvidiati (not those $500 cards; $200 is more than enough), get the results,
and then run whatever games you like and compare the results with the murk ones.
if the results are about the same, then nvidia is the wrongdoer.
and if not, then 3dmark is not so good for benchmarking any more, is it?

    once this graphzilla fight is over, world peace will be achieved and all problems will be over.

    keep it in proportion people.

    • alphaGulp
    • 17 years ago

    What are ‘optimizations’? Basically, the programmers find patterns in the execution of the code and add more intelligence to their own code to take advantage of it. In its essence, this is truly no different than compressing a file.

    Benchmark optimizations (be they from actual games or from benchmark suites) are inherently risky, since the output (the set of frames) is identical from run to run.

This is equivalent to testing pkzip with the exact same file, over and over again. If you wanted to, you could compress the entire file down to zero bits, since from its name alone you could assume what output to generate.

    Evidently, neither NVIDIA nor ATI would ever go so far. My point is that any optimizations should be looked at critically, and that even identical output doesn’t mean that gross cheating isn’t taking place.

    Conversely, optimizations done to the quake3 engine (for example) that result in improvements across all / most benchmarks of games based on the quake3 engine could be taken to be ‘valid’ optimizations.

    Furthermore, if my imagined quake3 engine optimization were to modify the output slightly, that would not seem to me to be that big an issue. (lossy compression of images is evidently something we appreciate in jpegs, for example)

    However, in a ‘future benchmark’ like 3dmark03, I do not believe that aggressive optimizations should be used AT ALL, since you end up optimizing heavily for an estimate of what games will be.

    After all, optimizing aggressively for one game (e.g. Dungeon Siege) does not mean that you have optimized for all other games as well (all shooter games, RPG games, racing games, etc.).

    • alphaGulp
    • 17 years ago

    To those who discount ‘future benchmarking’ as a whole and care only about benchmarks from actual games:

    Simply because the problem space is complex doesn’t mean that we should simply give up.

    An apt analogy can be found in economics, which also has a set of measures that you could decide to discount entirely: ‘So what if the GDP grew by X%, and the unemployment rate grew by Y%? That doesn’t mean that I should sell my shares and buy government bonds’. Evidently, only a fool would whole-handedly knock all the information that economic measures give us, only because they are uncertain.

    Similarly, although a benchmark like 3Dmark03 is not perfect, and could in fact be so skewed that it were made useless, it still is one of the only measures we have of the ability of these cards (which are being marketed as ‘DirectX 9’ cards) to run DX9 code.

    Here is how it breaks down: You can disregard the estimate because you do not plan on playing any future titles on your card, or because you have tangible evidence that it is skewed.

    You can’t reasonably discredit all ‘future benchmarking’ for being inherently weaker than the benchmarks of today’s games, since by doing so you evidently show that you do not understand the actual purpose of the ‘future benchmark’ itself.

    P.S.: as I understand it, there were games being demoed at E3 recently that are using DX9 shaders and whatnot. If DX9 is going to be used in games within a year (or six months at least) then I would say that testing for it is relevant today.

    • Anonymous
    • 17 years ago

You guys might want to check out Quake 3’s image quality using the latest 44.03 drivers from nvidia….

    Might only be my imagination, but comparing 40.xx to 44.03, 44.03 seems to be ‘optimizing’ for Quake 3….

    • Anonymous
    • 17 years ago

    Plastic Surgeon, before you mouth off, read what IpKonfig article said. Here’s a snippet. I think it points finger in right direction………
    +++++++++++++++++++++++++++++++++++++++++++++

    Subject: PURSE POWER (FutureMark Controversy)
    Date: 07 June 2003
    Author: Brian Hall

    Buttered Bread

    FutureMark’s bread, in the short and medium term, is buttered on the manufacturer side. The pre-release “Beta” membership fee can be perceived as a kind of blackmail by FutureMark. It certainly leaves wide open the suspicion that those who pay get special treatment and information, and those who don’t will have to take their chances. (Theoretically, it could even go so far as to involve deliberate crippling of performance on competitors’ products. Remember Dr. DOS?) So FutureMark gets Money and Power. On the other hand, manufacturers know that their fees make a substantial contribution to FutureMark’s profit margin, and can press for special consideration. ……

      • Anonymous
      • 17 years ago

And Nvidia was part of that “beta” program two months before they “left.” Why did they leave 😉

      • PLASTIC SURGEON
      • 17 years ago

Yes, did not Nvidia leave the program 2 months before it was released?? Hmmmmmm.
So let me understand this then. It was FutureMark’s fault that Nvidia cheated? Cheated not once but twice. Hmmmmm.

    • Anonymous
    • 17 years ago

    This article sides with Kyle at HardOCP.
    It says the blame lies with FutureMark’s taking “Fees”.
    Taking ‘fees’ from the very companies (Nvidia for example) that FutureMark software is performing “impartial objective” tests on.

    ipKonfig on purse power: the FutureMark controversy

      • PLASTIC SURGEON
      • 17 years ago

      Yes and HardOCP is the cradle of unbiaised reporting. lol.
      Yes Kyle, keep up the Nvidia ass kissing for that new 5900 Non-ultra. Did they send it to you yet?
      r[

        • Anonymous
        • 17 years ago

        Yes, they do seem to be very unbiased. You guys only look at what fuels your stupid little ideas. Have you not read their card review? I think they tip their hat towards the ATI products over the NV cards all the friggin time.

        Stating your opinions about situations and telling the facts and experiences with hardware are two things that HardOCP is very good at. I think you should read their review before you make such idiotic statements.


    • Anonymous
    • 17 years ago

I see two possibilities here.

1) nVidia feels that 3dMark’s settings are incorrect, so they have their driver detect 3dMark and force it down what they feel is the appropriate rendering path. The higher scores actually represent what happens in games, which are more gracefully path-selected and/or are also forced down the correct path. In other words, nVidia is “cheating” in a way that actually benefits the public with faster games and more realistic 3dMark scores.

    2) As above, except the forced path exists only to improve 3dMark scores, and has nothing to do with actual gameplay. Other specific paths may exist to improve other common benchmarks as well. In other words, nVidia is “cheating” purely to inflate benchmarks.

Given the extreme subtlety of the pic differences, I think it’s possible that nVidia is only doing this to work around path-selection issues. This may have been the only way to do it, since they’re not in the beta program and were until recently hostile with Futuremark.

It seems more likely to me that this is a straight-up cheat. But I’m not going to become a fanATIc just yet.

    Luckily for me, I’m not looking to upgrade video at all for a while!

    — Liquid[TJ]

    (If for some reason anyone wants to reply personally, send a shack message to Liquid.)

      • droopy1592
      • 17 years ago

      Ahem, clipping paths added to reduce objects rendered and reduced filtering quality. It’s cheating, nothing else can be said about it. Even if it’s only Nvidia’s path, it’s not what’s meant to be displayed.

        • Anonymous
        • 17 years ago

        Or played.

    • Anonymous
    • 17 years ago

I start to think that there are a lot of fanATIcs, then I realize it is just the same folks copying and pasting their posts on board after board after board. Quite humorous. I guess they really are starved for attention and need folks to agree with them somewhere.

    You guys need to wake up and smell the coffee as this situation is about more than your precious ATI or NVIDIA, it is an indictment of a tool that the entire industry has put value in and there seems to be none left in it, if there ever was.

    • PLASTIC SURGEON
    • 17 years ago

Anyone defending Nvidia in this cheating nonsense should be shot. Same for ATI if they did the same amount of b.s. now. What Nvidia has done is cheat the consumer. Plain deception. To cheat on a benchmark, then say that very benchmark you cheated in is not worth its salt, then cheat again another 2 times, tells me that Nvidia does not give a rat’s ass about consumers. The lemmings that want to see all this as harmless optimizations from Nvidia should take some drugs and call Dr. Spock in the morning.

      • Anonymous
      • 17 years ago

      Who exactly is defending them?

        • PLASTIC SURGEON
        • 17 years ago

        You would be surprised. 😉

      • us
      • 17 years ago

      no, Canada is letting Gay couple get married, why don’t you let people love nvidia?

    • Anonymous
    • 17 years ago

    quote ” Yawn. Impending death of 3dmark on horizon. Bring on the games.”

Let me say this just once more for the Morons who don’t seem to get it. IT IS NOT FUTUREMARK’S FAULT Nvidia is CHEATING. There is nothing wrong with 3DMark03 as a Benchmark of DX9 Capabilities. I wouldn’t be a bit surprised to learn that Nvidia cheated in the hopes of getting caught just to make Futuremark look bad and discredit their benchmark. Futuremark didn’t want to kiss Nvidia’s ass in the first place and Nvidia dropped out of the Beta program. The rest is just sour grapes on Nvidia’s part because they found out that Futuremark wouldn’t cater to their NV30 rendering path.

    and as for using Games as benchmarks… As soon as you use a Game to test a Video Card it becomes a BENCHMARK itself. If it becomes popular enough, It becomes a target for cheating. Any GAME BENCHMARK can be cheated just as easily as 3DMark03 if the IHV wants to do it. “Use Games” is not an answer.

    NVIDIA is the culprit here, NOT Futuremark. I wish some of you Lamers would get a clue… (or a Brain)

      • YeuEmMaiMai
      • 17 years ago

      that would require some form of intelligence on their part. From what I understand nVidia has re-enabled the cheats in their current drivers dated 5-29-03….

        • us
        • 17 years ago

        nvidiots are funny!

    • Anonymous
    • 17 years ago

    I did the same test as you did by renaming to 3DMurk03 with AF8X and I also ran it before and got exactly the same fps both times. I do have a FX 5800 Ultra using the 44.03’s. I just ran the nature test and got 17.4fps while it was plain 3DMark03.exe and got 17.4fps after renaming it 3DMurk03.exe. I bet them test are fake and you just wanting hits on your site by publishing these bogus figures. I will conduct these tests again when I get the FX 5800 Ultra too. I be they will be the same before and after renaming 3DMark03.

      • Naito
      • 17 years ago

      use a debugger and rename all the references to 3dmark.exe in the executable itself to 3dmurk

      they had to do that for Quake/Quack too……

      • YeuEmMaiMai
      • 17 years ago

      yeah like we are going to trust your esteemed opinion.

      nice try homer simpson…..oh wait a minute even homer simpson can figure that tech-report did it the right way…..

      • Anonymous
      • 17 years ago

Saying that these test results could be fake only shows that you don’t know anything about the Tech-Report. Stupid fanboy… troll somewhere else.

    • Anonymous
    • 17 years ago

“The way it’s meant to be cheated….” should be Nvidia’s new motto. They make me laugh. What a joke of a company now.

    • Anonymous
    • 17 years ago

    The only way Nvidia and to a lesser degree ATI, will regain the trust of the buying public is to publish the app specific “optimisations” for each driver update. A separate tab for these optimisations should be included on the control panel, with an option to switch each on or off.

    For the record I have a 9700 pro and a 4600.

      • WaltC
      • 17 years ago

      I don’t have a problem with ATi.

      Unlike nVidia, ATi has admitted to a small change it made (which affected neither render quality nor workload in 3dMark–unlike nVidia’s code which did both), and further pledged to provide “optimization-free” drivers in the future for benchmarks.

      Unlike nVidia which has admitted, and pledged, nothing.

    • Anonymous
    • 17 years ago

    “….why don’t you read Carmack and Tim’s opinions about the matter? ….”

    Why? Who died and made them God?

      • Anonymous
      • 17 years ago

“….why don’t you read Carmack and Tim’s opinions about the matter? ….”

        • Anonymous
        • 17 years ago

        they’ve said it. nvidia is cheating. simple. any diff in the rendered output to that of the expected output is cheating.

    • Anonymous
    • 17 years ago

    #188 lol. Then ATI goes and does it again. Didn’t you read the futuremark audit? They caught ATI as well, sure its not as much as nvidia, but its an 8% increase on one gaming part of 3dmark03.

    They both do it, and both after doing it in the past. They wont ever stop. Get real


    • Anonymous
    • 17 years ago

    183 quit complaining.
    Nvidia is engaging in legitimate biz practise. It’s making sure its cards do great in benches, beat the competition. That’s biz, baby. Relax.

    • Sargent Duck
    • 17 years ago

    If Nvidia confessed and said they were sorry, they probably wouldn’t be in this mess.
Gamers forgave ATI, because they said they were sorry and wouldn’t do it again. Nvidia, being all high and mighty, has decided that they do not need to say sorry.

    That, and I don’t trust the french at all.

    • Anonymous
    • 17 years ago

ATI cheated, too (but not as much as nvidia did [about 2% performance gain])

    Ati-Statement:

    “The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4 . We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark’s and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST.”

Read all @ http://www.theregister.co.uk/content/3/30880.html

      • droopy1592
      • 17 years ago

      Why does everyone keep saying “ATi cheated?” Get the shit straight. If the output is the same as the reference, it’s optimized. If the output is different from the reference, it is reducing quality to improve speed. Sad thing is, ATi admitted to the ASO, and has said they will remove the ASO.

Now, let’s look at Nvidia. The NV30 had an increase of over 60% with one driver revision; ATi had a 2% increase in the final score. http://tech-report.com/reviews/2003q2/geforcefx-5800ultra/3dmark03.gif Now that T-R has negated the ASOs, all of a sudden, the NV30’s 3dmark03 score is right back where it started. http://tech-report.com/etc/2003q2/3dmurk03/5800u-overall.gif

        • Rousterfar
        • 17 years ago

How do you feel about the cheat ATI did in the past? The Quake 3 one? This whole Nvidia mess is déjà vu. 🙂

          • droopy1592
          • 17 years ago

          We always lose when Anyone cheats like this. I don’t care who it is.


    • Anonymous
    • 17 years ago

    It’s clear that they cheated.
    NVIDIA “optimized” their drivers only for 3DMark03 – does any other 3D Application gain speed by this ? NO!

    shark:
    “If you need to do a Diff in photoshop to see a what was changed, then I don’t really care what they did, I’ll take it. ”

    What benchmark do you look at when you buy a new graphic card? Many people compare the 3DMark results, Quake 3 frame rates,… but is it fair if a Manufactor creates drivers that lower the Image Quality of a game to gain better results…

    For example:
    You compare 2 graphics cards (each selling for $299) in many different benchmarks and buy the better one (at least you think so), but then you find out that the only games where the card is faster are 3DMark03, Quake 3, Jedi Knight 2, Comanche 4…
    (applications often used for benchmarks), and in some it runs faster only because it uses lower texture resolutions -> lower image quality. What would you say??

    @…
    “Some people post like they don’t really see it’s a blantant cheat. It became a cheat the minute they looked for ‘3dmark.exe’ in the drivers – I don’t understand how this can be anything else than a cheat.”

    I don’t think so – it became a cheat because they didn’t inform anybody about their “optimization”.

    NVIDIA should have placed a checkbox in their driver setup to disable the cheat / optimization / … that is turned on by default when their drivers are first installed. Then you could decide whether you want their optimization or not. Maybe if NV had done this, it would have pushed the graphics card manufacturers to include more and more such optimizations in their drivers (I would really appreciate this).

    But it seems to me cheating is stylish at the moment.
    Just look at MSI (P4 3.0 GHz running @ 3.2+ GHz).

    Everybody finding misstaakes (Grammar,..) in this text is allowed to keeeeeep them.

    I didn’t read the whole board (maybe somebody wrote the same 2 pages backwards)

    • Anonymous
    • 17 years ago

    Well, it seems that ATI is doing the same thing by other means (depending on where you enable AF, i.e. driver or 3DMark) and can gain around 10%. Seems ATI forgot to talk to you about that 😉

    http://www.hardware.fr/news/lire/06-06-2003/

      • Anonymous
      • 17 years ago

      Do you really trust a French site? I’d rather wait till it’s in English with some repeatable confirmation.

        • Anonymous
        • 17 years ago

        Well, it’s a trustworthy site, more so than many English ones that I know 😉

        • Anonymous
        • 17 years ago

        The problem with English sites is that they prefer publishing “tips” from commercial companies over researching on their own.

          • gordon
          • 17 years ago

          What are you talking about? Their tests are based on what the tip entailed. Without it they never would have thought to even do such experiments.

        • Rousterfar
        • 17 years ago

        Do you really trust the French? 😉 All kidding aside, I would double-check all the info. I have found that when it comes to news/rumors in the tech and videogame industry, French sites can be some of the worst.

    • Anonymous
    • 17 years ago

    Just thought I’d point something out:

    Anisotropic filtering adds texture detail and anti-aliasing.

    Increasing texture LOD (negative LOD bias) adds texture detail, but doesn’t add anti-aliasing, and aliasing errors tend to increase.

    If you wanted to cheat and gain performance, but have still screenshots look comparable, all you have to do is lower anisotropic filtering, and raise texture LOD, without telling the user. You’d be doing less work, because you wouldn’t be adding the anti-aliasing. If people aren’t looking for anti-aliasing, or the scene has characteristics that serve to hide it, you might get away with it too.

    One thing that might confuse this for some people is thinking that anti-aliasing only happens at the edges of objects.

    Why doing this would be a cheat, even though screenshots would look very similar, is because the performance is being measured…
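The LOD half of that trick can be illustrated with the textbook mip-selection formula. This is a generic sketch of my own, not any vendor’s actual driver code:

```python
import math

# Generic illustration of mip-map level selection (an assumption-laden
# sketch, not anyone's real driver): the hardware derives a mip level
# from the pixel's texture footprint, and a LOD bias is added on top.
# A negative bias picks a sharper (higher-resolution) mip, which can
# visually offset the detail lost by quietly lowering anisotropic
# filtering -- at the cost of more texture aliasing in motion.
def mip_level(texels_per_pixel: float, lod_bias: float = 0.0) -> float:
    """Return the (fractional) mip level for a footprint and bias."""
    return max(0.0, math.log2(max(texels_per_pixel, 1.0)) + lod_bias)

# A 4:1 minification normally lands on mip 2; a -1 bias drops it to
# the sharper mip 1.
```

In a still screenshot the sharper mip looks comparable or even better; only in an animated scene does the extra shimmer give the trick away.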

    • Division 3
    • 17 years ago

    AG #145 has an idea for all of you who believe that gaming and benchmarks have nothing to do with each other: if you want to boost your 3D games, just rename your game’s exe to 3dmark.exe and maybe you’ll get a boost in your game.

    Just a thought.

    • Anonymous
    • 17 years ago

    Reviewers Condemn Video Tweaks

    ATI tweaked its new video card software to make Quake run faster. The company was only trying to please gamers, but instead it upset just about everyone. By Andy Patrizio.

    2:00 a.m. Nov. 28, 2001 PST

    Video card maker ATI Technologies is trying to smooth some ruffled feathers after it was discovered the software for its newest video card manipulated the performance of a popular gaming benchmark.

    The software drivers for ATI’s Radeon 8500 card were tweaked to make Id Software’s Quake 3 Arena game run faster. The driver turned down the quality of the textures in the game and increased the frame rate.

    But the tweak didn’t affect any other games, and ATI has been angrily accused of trying to fudge benchmarks. Quake 3 is the de facto standard benchmark used by hobbyist websites and computer magazines to test the performance of video cards and PC systems.

    Even John Carmack, the highly respected lead programmer of id Software, chimed in on the controversy. “Making any automatic optimization based on a benchmark name is wrong,” he wrote in a note posted to the Web. “It subverts the purpose of benchmarking, which is to gauge how a similar class of applications will perform on a tested configuration, not just how the single application chosen as representative performs.”

    The driver tweaks were discovered by HardOCP, a hobbyist site. They were unearthed when testers renamed the Quake 3 executable and noticed that performance dropped. The testers realized the driver was looking for that particular file name.

    ATI has since released a new driver that doesn’t just crank up Quake 3 performance, but cranks up all games that use the Quake 3 engine, including the newly released Return to Castle Wolfenstein.

    ATI claims the tweaks were intended to merely improve the performance of a very popular game, not to enhance the card’s ranking in performance benchmarks.

    “(The accusations have) caught everybody here totally off guard,” said David Nalasco, technical marketing manager for ATI. “Our driver guys were totally in shock. We thought we were making the experience better for the user.”

    “When you’re developing a driver, you want to look at the applications people are going to use with your product,” he said. “In the case of the 8500, it’s a gaming-oriented product. So you want to start with what you think is most important and work from there.”

    But the issue has left some people a little red-faced, and not just ATI. Maximum PC put the Radeon 8500 on the cover of its November issue, saying it “beats GeForce 3 and scores a perfect 10!” In its January 2002 issue, Maximum PC will re-examine the Radeon 8500 with the new drivers.

    “Maximum PC doesn’t approve of these tactics,” said Jon Phillips, the magazine’s editor in chief. “There should be an implicit trust between ISVs and hardware vendors and journalists that that kind of thing isn’t changed without notification.”

    Phillips said ATI has suffered more in the eyes of his readers than his staff.

    Anand Lal Shimpi, who runs the popular AnandTech website, also ran a glowing review before the driver issue was uncovered. Like Phillips, he said the company has egg on its face.

    “It damaged them a lot in the eyes of their end users,” said Shimpi. “These are their customers, and they are a company being branded as a cheater. As a reviewer I’ve seen everything. This is just another I-can’t-believe-they-did-this.”

    “Die hard ATI fans were saying this is going too far,” Shimpi added. “It’s not murder, but it’s still cheating. It’s still a betrayal of trust I think, and it really comes down to, ‘did they consciously make a decision to do this?'”

    ATI said the driver tweaks were made simply to optimize the performance of a popular game, not monkey with the benchmarks.

    Dean McCarron, co-president of market research company Mercury Research, said he has seen more driver tweaks than he can count. In the past, video card drivers were tweaked to perform well with Ziff-Davis’ 3D WinBench, which was a popular graphics benchmarking tool a few years back. But most users don’t care about driver tweaks, he said.

    “Essentially, as long as the driver works, probably on the order of 90-plus percent, users won’t care,” he said. “You get their attention when it doesn’t work.”

      • Rousterfar
      • 17 years ago

      How soon people forget.

        • Anonymous
        • 17 years ago

        LOL!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Stoopid asses.

          • Rousterfar
          • 17 years ago

          Well this does mean that if people can forgive ATI for it like they have, then there is no reason Nvidia could not redeem itself with time. I wonder what it would take at this point.

    • zAmboni
    • 17 years ago

    OK, I was bored tonight….here are a few pictures to ponder

    First off, here is the rasterized (reference) version of frame 480:
    http://mywebpages.comcast.net/zamboni/RastAF8.png

    Using that reference picture, here are some difference pictures. Note I leveled things a bit using PS6, but not to the extent that Damage did. I tried to level them all the same so you can get a relative lightness difference between all of the pics. For those who want to replicate what I did, here is the process:

    1) Open the rasterized version
    2) Copy the other pic into new Layer 1
    3) Set Layer 1 to the Difference filter at 100%
    4) Flatten the image, convert to grayscale
    5) Select Levels and change Input Levels from 0, 1.0, 255 to 0, 1.0, 70
    6) Save

    Here are the rest of the goods:

    FXMark - FXMurk (same as Damage did): http://mywebpages.comcast.net/zamboni/FXdif1.png
    Rasterized - FXMark: http://mywebpages.comcast.net/zamboni/FXMarkdif1.png
    Rasterized - FXMurk: http://mywebpages.comcast.net/zamboni/FXMurkdif1.png
    Rasterized - ATI9700Mark: http://mywebpages.comcast.net/zamboni/ATIdif1.png
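Those Photoshop steps amount to a per-pixel absolute difference plus a Levels stretch. A stdlib-only sketch of my own (working on grayscale pixel arrays; loading the actual PNGs would need an imaging library such as Pillow):

```python
# Hypothetical re-creation of the diff-and-level procedure described
# above, on grayscale images stored as lists of rows of 0-255 values.
def diff_levels(img_a, img_b, white_point=70):
    """Absolute per-pixel difference, then a Levels-style stretch that
    remaps [0, white_point] to [0, 255] so faint differences show up."""
    return [
        [min(255, abs(a - b) * 255 // white_point)
         for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]
```

Identical inputs come out all black (all zeros); with a white point of 70, even a faint 20-level brightness difference is stretched up to a clearly visible 72.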

      • just brew it!
      • 17 years ago

      Interesting… relative to the reference image…

    • dmitriylm
    • 17 years ago

    #162… uh… shouldn’t the 3dmurk picture be less blurry, since full filtering is enabled? 3dmark is the exe that is being checked by the driver. I think you still need some glasses and a good monitor.

    • Anonymous
    • 17 years ago

    Hey Damage, the differences are obvious to me. And I have pretty crappy eyesight. And my monitor is going bad. 🙂

    The 3dmurk picture is much less defined, blurry even, especially the desk drawers and the books on the shelves. Maybe some of these guys do need LASIK.

    AG

    • Anonymous
    • 17 years ago

    The point is this: reviewers want to compare apples to apples as best they can with settings. They are supposedly picking nVidia’s better AF to show performance comparisons against ATi’s similar-quality AF. Now, if nVidia switches this up through an “optimization” (which is cheating because of what it IS doing, not how it was implemented) and shows better performance using their lower-quality AF, a reader will think they can run nVidia’s high AF and get that same performance. Not so, because not every application is going to do that switch; they’ll be cheated out of what they expected.

    I don’t know how else to explain it to those who think that no drop in noticeable image quality is okay in their book. That’s not the point. The point is nVidia is changing the AF setting to make readers think that their AF has really improved with these newer cards (and it has, but not to the extent they want you to think).

    • gordon
    • 17 years ago

    I thought of a pertinent link for those who can’t see the differences: www.lasik.com. Prices are going down and I think it’s getting very cheap and practical. I myself wear contacts, but since I can see the differences, I figured there’s no need. Good luck!

      • Rousterfar
      • 17 years ago

      Har har har

    • Anonymous
    • 17 years ago

    Some people post like they don’t really see it’s a blatant cheat. It became a cheat the minute they looked for ‘3dmark.exe’ in the drivers – I don’t understand how this can be anything other than a cheat.

    If nvidia doesn’t like Futuremark, then this is a pretty smart way to kill them off: by consistently invalidating the results, thus removing nvidia from the list, which leaves 3dmark without relevance.

    • Anonymous
    • 17 years ago

    You know, it is totally true that the IQ is really not that greatly affected. I personally don’t see a problem if their AF looks like that in any game anywhere.

    The issue here is that they are doing application-specific detection because they have found a way to fiddle with the texture stages or other things, which obviously can’t be used across the board, only in this or other rail-type benchmarks. That’s my take on it.

    It artificially inflates the score and is clearly trickery or hackery. I am not going to call this cheating per se. However, it does deceive buyers and OEMs into thinking the card is regularly capable of something it’s not. I have a strong feeling they are likely pulling the same kind of thing in some of the other major benchmark apps/games as well.

    What it really is, is a bait-and-switch technique, or three-card monte. It’s the consumers who are losing, because they think they are getting the highest-performing card with AA+AF in said applications, when in reality they are not.

    • zAmboni
    • 17 years ago

    I was able to get some screenies using my 9700 Pro, FSAA off, 8x AF, and did the Murk/Mark test. Here are the pics:

    http://mywebpages.comcast.net/zamboni/ATIMarkAF8.png
    http://mywebpages.comcast.net/zamboni/ATIMurkAF8.png

    There looks to be no difference between the two. I’ll let you all compare them to the TR ones… There are a bunch of differences between the ATI and FX shots, but most of them have to do with the shadowing being a pixel or two deeper on the ATI. It’s tough to tell which of the two the ATI screenie is more similar to… but if I had to nitpick and guess, it would be closer to the TR 3DMark screenie. But I’m no image quality expert… maybe someone could make a better determination :)

      • Anonymous
      • 17 years ago

      The difference pic came out completely black (no difference).
      When you flip nVidia’s pics on and off in Photoshop, the difference is clear to the naked eye, but nothing to be concerned about in my opinion.

    • Anonymous
    • 17 years ago

    #151, FSAA stands for Full Scene Anti-Aliasing. Multisampling FSAA (which uses Z-tests to only average pixel colors on edges, thus saving gobs of fillrate) is referred to as MSAA, whereas supersampling FSAA (where the image is either drawn at a larger size and then reduced (Ordered Grid Supersampling, i.e. OGSS) or drawn several times at slightly different sampling offsets (Rotated Grid Supersampling, i.e. RGSS)) is referred to as SSAA.

    Oh, and your explanation of OGSS meaning automatic AF is, well, to put it simply, complete nonsense.
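To make the OGSS “draw larger, then reduce” step concrete, here is a minimal sketch of my own illustration (not any card’s actual hardware path), on a grayscale image stored as a list of rows:

```python
# Minimal illustration of ordered-grid supersampling's reduction step:
# the scene is rendered at 2x width and height, then each 2x2 block of
# samples is box-filtered (averaged) down to one displayed pixel.
def downsample_2x(oversampled):
    h, w = len(oversampled) // 2, len(oversampled[0]) // 2
    return [
        [(oversampled[2 * y][2 * x] + oversampled[2 * y][2 * x + 1] +
          oversampled[2 * y + 1][2 * x] + oversampled[2 * y + 1][2 * x + 1]) // 4
         for x in range(w)]
        for y in range(h)
    ]

# A hard black/white sample pattern averages to mid-grey -- the smoothed
# edge you see with FSAA. MSAA saves fill rate by doing this averaging
# only where Z-tests detect an edge.
```

Note that every texture lookup happens at the oversampled resolution here, which is why supersampling (unlike multisampling) also sharpens texture detail.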

      • Anonymous
      • 17 years ago

      Thank you for briefing me with geek site jargon. For antialiasing, there are two more or less common ways to do it: edge antialiasing and supersampling. How the supersampling kernel is constructed, is an implementation detail. Using jittered kernel (“Queens Problem” is preferred) yields better results than ordered. By using gaussian weighting and overlapping kernels the results are improved further.

      Quote:
      “Oh, and your explaination of OGSS meaning automatic AF is, well, to put it simply, complete nonsense”

      It’s not the same thing, but supersampled antialiasing implies more texture samples taken per pixel. Quality of texturing depends on the number of samples AND how those samples are placed in texture space. Think about averaging a supersampled grid of linearly filtered (isotropic) samples. How are those sampling points located in texture space?

      Ultimately, EWA filter is THE solution.

    • Anonymous
    • 17 years ago

    The difference image is more likely caused, at least partially, by some antialiasing difference rather than anisotropic filtering. However, supersampled antialiasing (FSAA, as geeks call it) implies anisotropic filtering: if twice the pixels are rendered, it doubles the number of texture samples regardless of the texture filtering mode. AF is basically supersampled texturing, the only difference being that the sampling kernel doesn’t have to be isotropic (= uniform).

    AF does not increase the resolution like supersampled AA, and it’s only processed for (causally) visible pixels. Therefore AF is less expensive than supersampled AA.

    These are not very clear images for comparing anisotropic filtering modes. AF does not tend to degrade image quality (as some of you seem to think), but it greatly improves the cases where the Nyquist rate is exceeded. The quality of AF can be rated by (at least) following methods:

    1) For static images, a repetitive or detailed pattern must be used, like a chequerboard or text in perspective.
    – AF reduces, but does not remove, the Moiré effect.
    – As the AF ratio is increased, the image converges to look *what you’d expect*.

    2) Animated scene
    – AF reduces, but does not remove, the Moiré effect.
    – Noise between frames is greatly reduced
    – whatnot

    Blah. That’s it. The results say nVidia is distributing h4x0r3d drivers, which never should pass the WHQL tests. The difference of the static images does not have any meaning to me, except that something has been done for image output. Shame on you nVidia…

    For obvious reasons, I remain anonymous.

    • LJ
    • 17 years ago

    Whoops, I didn’t have a chance to read the article at work and I hadn’t realized that they were sacrificing image quality for sales.

    In that case, this is worse than what ATi did, because they should’ve learned their lesson back then. If they’re going to discredit 3DMark (which is fine by me), then they shouldn’t be doing something like that.

    • zurich
    • 17 years ago

    #133

    Yaay, someone else who agrees with me.

    So lets see here, latest NV drivers BOOST 3dmark aniso speed while also giving better image quality. THE HORROR

    😉

      • droopy1592
      • 17 years ago

      Zurich, you aren’t getting it. If the code for AF was so good, why not use it in all games? Why does it have to be application-specific? It shouldn’t even have to look for 3dmark03.exe or whatever. It’s called…

    • Anonymous
    • 17 years ago

    OMG I can’t tell a freaking difference between the images… yeah, great cheat…

    • meanfriend
    • 17 years ago

    Love the nomenclature. Personally, I would have gone with 3DFarce but 3DMURK is great 🙂

    For my 2 cents, I don’t have a problem with a driver ‘special casing’ an app by detecting it at run time and running app-specific code, as long as the image quality doesn’t change.

    In this case, the image quality changes almost imperceptibly. As others have noted, you have to rapidly switch back and forth between the two screenshots or use a ‘diff’ comparison to even tell what’s changed.

    Ultimately, is this case a cheat? IMHO, yes! Do I really care in this specific case? Not at all. For the small amount of image degradation, you are getting a 10-20% increase in performance. That amount of improvement may not be observed in a real-world app (i.e. an actual game), but many gamers would accept that tradeoff.

    When the IQ gets degraded so much that you can easily tell the difference, then that’s going way too far. That results in a fine (and shifting) line, since IQ means different things to different people. What seems to have gotten everyone riled up is that nvidia did all this without any sort of disclosure, which makes it appear they have something to hide.

    • SnowboardingTobi
    • 17 years ago

    So umm… what happens if you rename the Quake3 EXE file to 3Dmark03.exe?

    In fact… what happens if you rename any 3D game/program to 3Dmark03.exe?

    Try it… I wanna know if things look screwy.

      • Division 3
      • 17 years ago

      That would be kind of cool, have a performance boost for your game. Nice. haha… too bad I don’t have a GFFX card to try it out.

        • us
        • 17 years ago

        Not only that. All the enemies you don’t see won’t shoot you, because they are clipped off 🙂

          • Anonymous
          • 17 years ago

          heh… now see… someone needs to try this. Come on… anyone? I don’t have the right hardware to try this out. I’m curious as to what happens if you rename any 3D application to “3DMark03.exe”

    • Anonymous
    • 17 years ago

    hm.. why all the differences around the edges of surfaces? Looks like anti-aliasing issues, but why they show up when aniso is enabled, who knows.

    • indeego
    • 17 years ago

    hahaha… now we have colorblind shackers complaining because on the graphs you can’t tell the difference between the green and the orange bars. classic, and apt <.< http://www.shacknews.com/ja.zz?id=7730937

      • just brew it!
      • 17 years ago

      Well… colorblindness – especially red/green – is in fact quite common, particularly among men. Relying on the distinction between red/green or orange/green in a graph (or other display) can cause difficulties for a fairly significant percentage of the population. It is in fact a pet peeve of mine (since I suffer from the condition myself), but I have mostly given up commenting on it, because years of experience have shown that people who aren’t colorblind themselves tend not to give a crap about the issue.

        • indeego
        • 17 years ago

        I care! /holds jbi from a man-safe distance <.<

    • zAmboni
    • 17 years ago

    Damage: Did you happen to mislabel the two screenshots? For an explanation… I’ll post what I did in the Ars forum…


      • Damage
      • 17 years ago

      The images are labeled correctly. Although I tend to disagree with you about which looks “better,” some things are difficult to judge in static screenshots, especially when it comes to antialiasing (of textures, in this case). The slower 3DMurk03.exe filtering probably samples more, while the 3DMark03.exe method may look “sharper” in a single static screenshot.

      Sorry about the mediocre sample scene. Maybe we can find something better.

        • zAmboni
        • 17 years ago

        Yea… some things like AF you can’t really tell a difference in a static screenie… plus it is difficult to tell what FM intended the screen to look like… you may be comparing apples to oranges when we should be comparing them to pears. I’ll take your word for it though 🙂

          • eitje
          • 17 years ago

          can you not use a software renderer with 3dmark? i’ve seen it in use elsewhere, and that would certainly give you the exact image produced by the 3dmark software.

            • Pete
            • 17 years ago

            I agree. Damage, did you produce a reference image using 3DM’s DX9 software rendering? Which screenshot does it look closer to? I’d hope it’s equal to the 3DMurk one….

        • Rousterfar
        • 17 years ago

        I agree that some better screen shots to compare are much needed. I keep staring at them and just don’t see it.

    • derFunkenstein
    • 17 years ago

    Fortunately, TR was bold enough to believe its readers are smart enough to come to their own conclusions. Unfortunately, the other sites who weigh in on this (especially HardOCP) don’t have that faith in their readers, and insist on giving us opinions to recite for them.

    • us
    • 17 years ago

    Miksu in B3D suggested:
    Posted: Fri Jun 06, 2003 12:44 pm Post subject:
    ——————————————————————————–
    Anyone tried renaming some other benchmark/game to 3DMark03.exe and running it? It should do something to the game, shouldn’t it?

    I think it’s very good idea. Anyone with NV card might want to try that

      • eitje
      • 17 years ago

      if i owned any FX card, i’d be all over the benchmark renaming. in fact, this is the first time since they were released that i’ve actually wanted to get one. 😉

    • Anonymous
    • 17 years ago

    Personally I don’t give a crap who is the best, NVIDIA or ATI. I do not base my judgment on the freaking LOGO. I buy what fits my needs. It is as simple as that. I have had both ATI and NVIDIA products before. Loyalists are the reason why companies produce crap in the first place. Why, you might ask? Well, it is very simple: because you are supporting them by buying their crap. Maybe it is time to start basing your judgment on the product itself and not the LOGO. Doing so will definitely put pressure on the other team and will force them to do better.

      • Yahoolian
      • 17 years ago

      The point is that other people do base their judgments on 3DMark, even if you don’t.

    • rms
    • 17 years ago

    I suspect that *all* the Nvidia game benchmarks for the 5900 are now subject to this same method of cheating, namely changing the AA method (for instance) on the fly, regardless of what the user has set in the driver properties.

    If true, this completely invalidates *any* benchmarking whatsoever, whether with 3dmark or games, as the driver is lying to the user about what is really being displayed on the screen. This makes it impossible to compare apples to apples.

    I don’t even see how you would go about proving such a thing is occurring. How do you look at an image and tell if it *really* is using 8x AA like you set, or if it’s really displaying 6x on the screen?

    Damn nvidia to hell for opening this can of worms.

    rms

    • Anonymous
    • 17 years ago

    us,
    wtf was that supposed to mean? It made no sense as a response to #103. It was like yelling “NO YUO” in the middle of an argument.

    and it looks like registration is down here too droopy, can’t get the activation email.

    • Rousterfar
    • 17 years ago

    http://www.theregister.co.uk/content/54/31082.html

    The Reg has reported on TR’s article. I wonder how long now before this site gets slashdotted.

      • indeego
      • 17 years ago

      hours ago <.<

      • just brew it!
      • 17 years ago

      Already happened. The story has been linked from the slashdot front page since mid-morning.

      (Kudos to T-R’s service provider, and whoever configured their server… the site didn’t so much as hiccup.)

        • Rousterfar
        • 17 years ago

        Oh wow. I would have never thought it happened already. Usually Slashdot is the great death of servers. I 2nd the kudos to TR’s service provider.

    • Anonymous
    • 17 years ago

    Has anyone really thought about the fact that those frames aren’t necessarily SUPPOSED to be the same? If they are moving at different framerates, frame 480 of each will be slightly different. Comparing them (especially with a diff) is 100% pointless!!

    That is what I interpret it as. If someone can explain how they are supposed to be exactly the same, please reply.

      • just brew it!
      • 17 years ago

      The differences do not appear to be due to motion in the scene. The most obvious discrepancies seem to be errors in lighting calculations (brightness of objects).

      • us
      • 17 years ago

      Then you are talking about a bug in 3DMark. The frame number is meant to refer to the same scene.

      Even without any visible difference, NV is still doing something different due to the .exe name change.

      • eitje
      • 17 years ago

      imagine, if you found an old 8 mm movie reel, you could pull it apart and look at each frame on the reel. each time you went back to that frame on that reel, it would be exactly the same as it was before (ignoring entropy for the moment). when you put it onto a 8 mm movie projector, you could play it for a while, and when that frame appeared, you could stop it, and it would look exactly the same as when you looked at the frame on the reel.

      3d graphics are just a real-time movie reel. each frame is described in the program requesting the rendering. the video card should be a passive projector, simply displaying whatever it is instructed to display. any given frame should look exactly the same as it did on any previous viewing of that frame, with no change from projector to projector (or GPU to GPU), even if one projector runs the movie twice as fast as all the other projectors.

      this situation is much like a projector manufacturer deciding that Rambo would’ve been more impressive with a bigger gun, or that Trinity should have been wearing tighter, shinier leather, or that any other movie should have been different from the way the production studio and director originally intended it to be.

        • Buub
        • 17 years ago

        #110 wrote: “imagine, if you found an old 8 mm movie reel, you could pull it apart and look at each frame on the reel. each time you went back to that frame on that reel, it would be exactly the same as it was before (ignoring entropy for the moment). when you put it onto a 8 mm movie projector, you could play it for a while, and when that frame appeared, you could stop it, and it would look exactly the same as when you looked at the frame on the reel. 3d graphics are just a real-time movie reel. each frame is described in the program requesting the rendering. the video card should be a passive projector, simply displaying whatever it is instructed to display. any given frame should look exactly the same as it did on any previous viewing of that frame, with no change from projector to projector (or GPU to GPU), even if one projector runs the movie twice as fast as all the other projectors.”

        See, that’s the problem. This is the way it’s coded for 3DMark, but this is…

    • Anonymous
    • 17 years ago

    These problems will be addressed with the new Detonator ASS drivers, leaked here:

    http://www.rage3d.com/board/showthread.php?threadid=33690232

    • Anonymous
    • 17 years ago

    OK, most people don’t understand that there is no one correct method for everything. NVIDIA has its own rendering path for AF. They want 3dmark to take advantage of their optimization, so they detect when 3dmark is run and force it to take the faster rendering path. The net result is an undetectable quality difference (I can’t tell the difference at all) – and not necessarily quality degradation… just difference. It’s not like when ATI made its hacks so that Quake 3 was visually horrible on ATI cards.

    Honestly, all this hoopla is just that. 3dmark’s use of generic rendering paths does not properly benchmark the abilities of the vid cards that have optimized paths. Obviously, game developers will take advantage of nvidia’s rendering path for AF on nvidia cards and ati’s rendering path on ati cards, and default to the generic path for everything else. 3dmark is not a good method of benchmarking video cards; your favorite game is the best benchmark. Obviously, there need to be some changes made (at 3dmark, nvidia, and ati) to address using vendor-specific optimizations. And sensationalist journalists like this need to leave their biases out of their reports (or at least think critically about what they’re reporting and realize that the larger issue is the inappropriateness of 3dmark as a proper benchmarking utility).

      • crazybus
      • 17 years ago

      Do you know what you’re talking about? 3dmark has nothing to do with Nvidia’s way of doing anisotropic filtering. The “generic rendering path” you talk about has nothing to do with texture filtering. Here’s an example: the 8500 can’t do trilinear anisotropic, but 3dmark can’t tell that, and neither can Quake 3. Quake 3 reports the texture filtering mode as GL_LINEAR_MIPMAP_LINEAR when it’s actually using bilinear filtering.

      • eitje
      • 17 years ago

      if you want to compare two things in the best apples-to-apples way possible, then there IS only one way to do something; in other words, there IS only one codepath to follow in a benchmarking environment. that’s how it gets to be a same-same comparison. 🙂

      sure, it’s a synthetic environment, but if no GPU manufacturer optimized for that synthetic environment, it would be a CLEAN environment – a level playing field, if you will. i think the issue at hand is that 3dmark is intended to be an even competition, with no optimization or improvement beyond what each vendor could do within its normal software development cycle.

    • Anonymous
    • 17 years ago

    Keep up the good work, Scott. I’m glad someone reported it. I was wondering if anyone has tried renaming the executable for Quake or UT2003 with the new NVIDIA drivers.

    • Anonymous
    • 17 years ago

    For those of you who can’t register at the H and post your thoughts, I posted this:
    http://www.hardforum.com/showthread.php?s=&postid=1024873576#post1024873576

    Hopefully that will satiate you all. :) I basically called him out, in a polite way. If I had been an ass about it, I might not get a response.

    All in all, I do not appreciate NVIDIA deciding what image quality I should play my games at. I want to choose whether I should sacrifice quality for performance. It shouldn't be up to the card manufacturer. ATI from now on. I was loyal to NVIDIA, but they have obviously deceived me and made a fool of me (because I believed them). I don't appreciate that. My friend, who has been pestering me to go ATI for some time now, will be pleased to hear about my changeover.

    • Anonymous
    • 17 years ago

    it’s so funny watching you guys bash Kyle as an NVIDIA lackey when you probably haven’t been keeping up with the [H] for some time. Looking at his last 10 video card reviews, a whopping 2!! were for just NV cards, 2 more pitted NV vs. ATI, and 6!!! were just for ATI. Damn, that’s one hell of an ATI ^H^H^H NVidia fanboy.

      • us
      • 17 years ago

      Probably because you haven’t been keeping up with the most recent developments at [T]ardOCP.

      • Pete
      • 17 years ago

      You realize that ATi has released far more and far better cards in the past six months, don’t you? Not even the biggest “fanboy” could twist recent card releases to reflect well on nVidia.

      Thankfully, things are finally evening out with the 5900, albeit with some obvious turbulence.

        • derFunkenstein
        • 17 years ago

        Evening out? The only “equalizer” that I can see at this point is either in older games or in Doom3, and I’ll bet dollars to doughnuts (I’ll provide the doughnuts, you provide the dollars) that nVidia, having a copy of the game itself and trying to supply demos for “unbiased” testing, has done some heavy “optimizations” that affect image quality. I’m pretty friggin’ tired of it. I’ve owned nVidia cards (and one computer still uses a GF4 Ti4200, another has a TNT2 M64, and a 3rd has a GF2MX 400) but I won’t buy another until this sort of thing is resolved.

    • Anonymous
    • 17 years ago

    Someone might want to check out UT 2003 by the way

    • adisor19
    • 17 years ago

    Well looks like Kyle’s site is not the only one that’s gone weird…

    Seems like Anand is also refusing to admit they posted false Q3 benchmarks that favored NVidia’s NV35… linkage :

    http://forums.anandtech.com/messageview.cfm?catid=44&threadid=1063480&STARTPAGE=1

    It's starting to get depressing... We lost Tom, Kyle, and now it looks like Anand is also losing his integrity. Damage, don't you EVER pull a stunt like that ;)

    Adi

      • us
      • 17 years ago

      You can’t be a big website and keep your integrity?
      Kyle is an asshole; Brent pretends not to be an asshole, but I guess it’s hard not to be 🙂

    • Anonymous
    • 17 years ago

    How strange Kyle is being in this matter. What an asshole he is coming off as. His site HardOCP only became popular after he started railing against Tom’s Hardware Guide for the same blind company devotion. Then the Quack debacle. How times change; it seems too many free lunches buy you a hardware site, whether that site is even aware of the change over time or not.

    • Anonymous
    • 17 years ago

    well, here’s a quote from kyle’s response to me…
    “Well had this been done in a widely played game that is used for a benchmark and it greatly impacted the IQ of such game while playing it, then it would be the same situation.

    3DMark03 is crap, why waste you time?”

    So….. when Kyle found it, it was worthwhile, but when it comes from a place NVIDIA doesn’t support anymore, it’s not worth your time?

    I think the easy answer is that whether you think it’s crap or not, people DO use these benchmark suites and make decisions based on them. Therefore, as a responsible reviewer and site, I think any web page worth its salt should be screaming bloody murder when any manufacturer (be it ATI, NVIDIA, or even Matrox) plays with results in ANY game or benchmark… not just the ones you *THINK* are worthwhile.

    SO, for the record: Kyle thinks 3DM2K3 is crap, so it doesn’t matter that NVIDIA is cheating on it, *EVEN IF THAT INFLUENCES PEOPLE’S DECISIONS*. People who aren’t the most tech-savvy in the world are supposed to know that Kyle thinks it’s crap and disregard it.

    Yeah. 3DMark isn’t established or well-known at all. That’s totally going to happen.

    • Anonymous
    • 17 years ago

    I think the blame falls on NVIDIA’s shoulders. You may not agree with 3DMark as a valid benchmark, but the truth is it’s used to gauge performance in video cards. I would never accept a review based solely on 3DMark, just as I would never accept a review based solely on UT2003 (or any other single benchmark, for that matter). I feel FM sort of wussed out with that NVIDIA joint statement, but if FutureMark was being threatened with legal action, then I find it more acceptable.

    People are saying that there is not a noticeable difference between the two images, but you must remember that the scene is not exactly the best one for showing differences in anisotropic filtering. Second, they are still putting out a false result (other posters have already explained why this affects you). If the quality had been lowered in all games and tests, not just 3dmurk.exe, then it would not be cheating but rather lower IQ for faster frames (3dfx). I don’t care what people say; this is a cheat.

    I remember that after the whole Quack fiasco, many hardware review sites vowed they would pay more attention to IQ and not just run benchmarks. TR has always shown image comparisons, as have some other sites, while others still just run benchmarks with no regard to IQ. I understand that a bug like this is hard to catch, but if sites took the time, they might catch it. I realize that a lot of the time sites are rushed to get their reviews out, so this may not be feasible.

    Finally, Kyle needs to get out of the business. I seriously question those Doom 3 scores now. (Actually, I always have.) When ATI did a similar cheat, he was quick to lay blame on ATI. Now that the shoe is on the other foot, he wants to lay blame on the benchmark. I’m not going to visit HardOCP until he leaves, period. I will not support the trash he spews out!

    • eitje
    • 17 years ago

    i would like to hear about any other situations with executable renaming that might occur in commonly-used games. even one actual game used in GPU testing with this feature would go a long way to wrapping this whole thing up for the non-believers out there.

    also, i’m interested in how the program name recognition works. if you rename another program (like the folding client 😉 ) to 3dmark03.exe, run it, and then run 3dmurk03.exe, does 3dmurk end up producing the same scores as the correctly-named benchmark executable? if this works, what happens when you rename the other program to other commonly-used game benchmarks’ names?
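    nobody outside NVIDIA knows how the Detonator detection is actually implemented, but the behavior everyone is probing is consistent with a simple filename-keyed lookup. here’s a toy sketch of that idea (the profile table and function are entirely hypothetical, not NVIDIA’s code):

```python
import ntpath  # handles Windows-style paths regardless of host OS

# Hypothetical per-application profile table. The weakness being probed by
# the rename experiments: the key is just a filename, so renaming the
# binary defeats the lookup entirely.
APP_PROFILES = {
    "3dmark03.exe": {"fast_af_path": True},
    "quake3.exe": {"reduced_texture_quality": True},
}

def select_profile(exe_path):
    """Pick a driver profile by executable name; fall back to the generic path."""
    name = ntpath.basename(exe_path).lower()
    return APP_PROFILES.get(name, {"generic_path": True})

# The rename trick works both ways: 3dmurk03.exe escapes detection, while
# any unrelated program named 3dmark03.exe would trigger it.
print(select_profile(r"C:\BENCH\3dmark03.exe"))    # {'fast_af_path': True}
print(select_profile(r"C:\BENCH\3dmurk03.exe"))    # {'generic_path': True}
print(select_profile(r"C:\FOLDING\3dmark03.exe"))  # {'fast_af_path': True}
```

    under this model, the answer to the question above would be yes: any program named 3dmark03.exe gets the special path, and the real benchmark renamed to 3dmurk03.exe doesn’t.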

    • Anonymous
    • 17 years ago

    I want to see a 3dmark/3dmurk 2001 comparison of the NV3x chips. Remember how aniso filtering on the geforce4 chips was broken? It might still be on the new chips.

    • LJ
    • 17 years ago

    The comparison is impossible to draw because there were no adverse effects from these optimizations. The ATi optimization, on the other hand, stole image quality from those with ATi cards.

      • Anonymous
      • 17 years ago

      diff shows nvidia’s images to be different/craptasticized.
      ati reordered instructions, no loss of quality.

        • Ishmael
        • 17 years ago

        PS: I don’t know how much use searching for application-specific (Quake->Quack) optimizations is going to be, because both companies have been doing this for quite some time. As far as I understand, it has been an accepted practice to optimize commands for efficiency as long as the output is exactly the same; that’s why you see specific games listed in the changelogs between driver versions from both companies. Clearly, if you can show that the output quality is not the same, then the optimization is a cheat, but boy is that tricky. How many of these hardware sites are actually going to be able to discern a non-obvious drop in quality when Doom 3 gets 20% higher scores between driver versions?

        Anand is also being an asshole about this same issue. Unbelievable.

        • just brew it!
        • 17 years ago

        The image differences are in fact quite small. Brightness of a couple of objects appears to be slightly off (too bright), and there seems to be a minor issue with the rendering of the panes of the window. The rest of the differences are trivial IMO. I think saying that the image has been “craptasticized” is an exaggeration.

        Now, I’m not trying to defend nVidia here… they have clearly done some very shady things, as outlined in the FutureMark whitepaper. However, I hesitate to say that […]

          • droopy1592
          • 17 years ago

          Oh yes! It’s a frikkin’ cheat when your 3DMark score increases by over 2000 points (5341-3305 = cheating bastards). Seems like more than a 50% increase to me (half from the name game, the other half from the precision reduction and path change). It’s one thing when you lose in one game and win a bench in another, but to make bulky, bloated drivers that give application-specific adjustments to boost scores is totally wrong. Maybe that’s why NVIDIA used to come out with so many driver updates: every time a game or a new way to benchmark came out, they had a good time giving ASO to each one.
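          For what it’s worth, the arithmetic on the scores quoted in this post checks out:

```python
# Scores quoted above: 5341 with the name detection active, 3305 with it
# dodged by the 3dmurk rename.
detected, renamed = 5341, 3305
gain = detected - renamed
pct = 100 * gain / renamed
print(gain)           # 2036 points
print(round(pct, 1))  # 61.6 -- indeed "more than a 50% increase"
```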

    • HammerSandwich
    • 17 years ago

    What results do you get when testing at different levels of aniso?

    • Anonymous
    • 17 years ago

    Here is my take on the situation.
    I play games. I don’t play benchmarks. If I have to get out a different tool to see the differences, I don’t care. Also, 3DMark is controlled by ATI; they sit on top of the food chain, and NVIDIA doesn’t pay money to the company.
    So, bottom line: I don’t care that they cheat. When I sit down and play BF1942 or Doom 3, if it runs at 1024 and gives me a good frame rate, that is what I want. I don’t care beyond that. They are just cards. Hell, in the business world, where the money is made, as long as you put up a decent 2D image, that is all managers care about.

    Peace,

      • WaltC
      • 17 years ago

      […]

      • gordon
      • 17 years ago

      The fact that NVIDIA couldn’t sway Futuremark their way, which subsequently led to their departure, clearly shows that Futuremark can’t be and isn’t being controlled by any one member. Furthermore, the differences are fairly easy to spot, and I seriously think some of you guys need to invest in some eyeglasses or maybe LASIK surgery. There is also the consideration that what we are seeing is only the tip of the iceberg; if we had access to multiple viewpoints, it might be more apparent what’s going on.

      Edit: Ahhh, WaltC go away you!

      • WaltC
      • 17 years ago

      I’d like to add for AG #82 here…

      You say you don’t “care” if companies cheat like this. That’s fine–you’ve a right to your opinion. I think you should consider, though, that the target of that cheating was not FutureMark–the target was *you* (and me, and anyone at all who might be tempted to look at 3DMark scores and reach some kind of conclusion about nVidia’s hardware based on those scores.) So, basically, what you’re saying by comments like these is that you don’t care when a company makes an effort to deceive *you.* I thought I’d mention that just in case you might have thought there was something remote and impersonal about cheating of this type.

      (And Gordon…I can’t go away I’m already here…;))

      • droopy1592
      • 17 years ago

      At least someone is honest about his own ignorance and plans to do nothing about it.

    • Shark
    • 17 years ago

    If you need to do a diff in Photoshop to see what was changed, then I don’t really care what they did; I’ll take it.

      • just brew it!
      • 17 years ago

      Yes… I think the jury is still out on this one. If visual quality is not noticeably degraded, […]

      • Anonymous
      • 17 years ago

      But you haven’t got the point, which was so eloquently explained previously.
      I’ll paste it in here, to save you looking for it:

      “Last thing here: I read that at least one person downloaded the .pngs TR did and didn’t see much difference in the visible image quality and so wasn’t sure what all the fuss was about. I think lots of people really aren’t connecting well with the real issue. That is, when TR renamed the 3D Mark 03 to “3dMurk” the nVidia drivers no longer recognize that 3DMark 03 is being run and they behave exactly as they would when running a real 3D game. This means, simply, that in running “real 3D games” (a term nVidia has suddenly fallen in love with) your performance while running 8x AF will be lower than the performance indicated in 3D Mark 03. How much lower? TR’s bar charts provide a rough idea–the point is that “real 3D game” performance under D3d while running the nVidia Dets with 8x AF enabled will always be lower than the performance the Dets simulate for 8x AF under 3D Mark. In other words, the kind of “application optimization” for 8x AF we see in the Dets applies only to 3DMark and to nothing else. Thus the real problem is performance in real 3D games.”

      You are being CHEATED of performance that NVIDIA is trying to make you believe its video card has. It’s as simple as that. And if that doesn’t make you mad (or at least a little irritated!), then you don’t understand the situation. Image quality has nothing to do with it in this case. Hey, I’ll hold my hand up and say I play first-person shooters, where framerate is more important than what level of anisotropy I can run. But if I run an NVIDIA card that ‘beats’ an ATI card in a benchmark (which is a major buying criterion, let’s be honest) but can’t in games because they’ve CHEATED, then who’s lost out? Me. And you. And every other graphics card buyer.

      nVidia must be alternately gutted that they’ve been ‘found out’ and gleeful that so many people are saying ‘ignore 3DMark03’, because that’s exactly what they want you to do. Their cards can’t perform well in that benchmark, so they want it ignored for ever more. Don’t let them get away with that; INSIST your favourite review sites CONTINUE to use 3DMark03!

        • Shark
        • 17 years ago

        well – until someone renames ALL the games we use to benchmark and play, there’s no real way to say which programs are being detected for optimization now, is there?

          • Rousterfar
          • 17 years ago

          I swear I can’t see the diff in these shots. Does anyone have some better images to look at?

            • just brew it!
            • 17 years ago

            If you have Mozilla, load them into different tabs of the same window, and flip back and forth with Ctrl-Tab. With IE, you can get nearly the same effect by loading them into different browser windows, stacking the windows one on top of the other, and using Alt-Tab.

            Yes, the differences are in fact not very significant.
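            Tab-flipping works, but a per-pixel diff is the systematic version of the same comparison. Here’s a minimal sketch of the idea, using small nested lists of RGB tuples in place of the two decoded screenshots (a real run would decode the actual PNGs first; the sample values below are made up):

```python
def pixel_diff(img_a, img_b):
    """Return (count, max_delta): how many pixels differ and the largest
    per-channel difference. Images are same-sized 2D lists of RGB tuples."""
    count, max_delta = 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            delta = max(abs(a - b) for a, b in zip(pa, pb))
            if delta:
                count += 1
                max_delta = max(max_delta, delta)
    return count, max_delta

# Two toy 2x2 "screenshots" that differ in one pixel's brightness:
mark = [[(10, 10, 10), (200, 200, 200)],
        [(90, 90, 90), (30, 30, 30)]]
murk = [[(10, 10, 10), (188, 188, 188)],
        [(90, 90, 90), (30, 30, 30)]]
print(pixel_diff(mark, murk))  # (1, 12) -- one pixel differs, off by 12
```

            A count and maximum delta like this make “the differences are not very significant” a measurable claim rather than an eyeball call.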

    • droopy1592
    • 17 years ago

    Seems like registration has been disabled in the [H] Forums. I’d go over there and lay into Kyle until I got banned.

    Oh wait, that already happened.

    • Anonymous
    • 17 years ago

    Damn. Nvidia just does not stop with all this b.s. And then they get websites to back them up. Got to love that.

    • dukerjames
    • 17 years ago

    after hours and hours of comparing the two PNG pictures, I can now tell the exact differences with my naked eye, pixel by pixel,

    oh did I mention that now I can also see dead people!

    • Dually
    • 17 years ago

    I’m not saying FM is at all the perfect “Gamer’s Benchmark,” but for Kyle to lay all the blame on them when it comes to cheats and (in his world) clear nvidia of all charges is intolerable.

      • dukerjames
      • 17 years ago

      […]

    • adisor19
    • 17 years ago

    Kyle is a sad sad NVidia lackey. He wants to get the latest cards from NVidia and so he plays their game by discrediting FM and protecting NV… sad.

    Adi

    • YeuEmMaiMai
    • 17 years ago

    Here I will post the difference between the two on ATI’s Catalyst 3.4 drivers.

    The following settings were the same for both runs:

    2xFSAA
    2xANISO
    Texture Preference: Quality
    Mipmap level: High Quality
    Vsync: OFF
    TruForm: OFF

    Radeon 9500 Pro, clock speeds 360/306

    3dmark score: 2856
    3dmurk score: 2854

    A whole 2-point difference, or a 0.07% drop.

    Tasks running in the background:

    Folding@Home (DOS box)
    Proxomitron
    Rage3D Tweak
    Outlook
    IE (3 separate instances)

    __________________
    ECS K76SA, Win 2K SP2, 512MB DDR2700 RAM, AMD Athlon XP 1.7GHz clock, Radeon 9500 Pro, 120GB WDC Caviar 7200rpm drive, SB Audigy Platinum, USR 56K PCI hardware modem, Lite-On 40/12/48 CD-RW, Toshiba 12x DVD (region-free), ViewSonic A90F

      • eitje
      • 17 years ago

      thanks. 🙂

        • YeuEmMaiMai
        • 17 years ago

        well it looks like ATi is off the hook on this issue

      • Anonymous
      • 17 years ago

      One of my friends had a 34-point increase after renaming. ATI’s definitely not Quacking….

    • Anonymous
    • 17 years ago

    Baaaa, baaaa, ATI did it so Nvidia is not so bad, baaaa, I could barely tell the difference, baaaa, baaaa, who cares as long as the games play well, baaaa, baaaa, benchmarks suck anyway, baaaa, baaaa.

    Don’t be sheep! Wake up!

    The point is not whether you can see the difference in one frame of a game. The point isn’t that by doing this cheat (no quotes) NVidia can increase its benchmark performance. The point is that their driver software is lying to you. If you select a feature to be available and the driver decides to turn it off at its whim, it is lying to you – and by extension NVidia is lying to you. Period.

    Maybe I’m confused, but when I play a game that I paid for on a video card that I paid for, I like to think that I’m in charge. If I turn on a feature, then I want it left that way. If the game performance sucks, then I can turn it off. But I decide! As a software programmer myself, I know what NVIDIA did was no accident and is definitely not the work of some low-level peon programmer. You had better believe those higher up on the development food chain were aware of this. They either instigated the fraud or acquiesced and allowed it to be perpetrated on their customers. Either way, it is a dereliction of their responsibilities.

    As an owner of two NVidia products, I am pissed as hell about this!

    • Anonymous
    • 17 years ago

    Great article, Scott. TR is really distinguishing itself as the best and most credible review site around. Ethics… what a concept!

    I can’t believe how pathetic NVIDIA has become. I’ve owned a GF3 and a Ti4600. I have a 9700 Pro now and can’t see ever trusting NVIDIA enough to buy another one of their video cards.

    Does anyone else think it’s funny that all of NVIDIA’s problems started when they released the GeForce FX (as in 3dfx) and hyped their ownership of 3dfx assets? It’s like NVIDIA is heading down the same path as 3dfx.

      • Krogoth
      • 17 years ago

      I couldn’t agree more. Come to think of it, I noticed recently that NVIDIA’s marketers have started in with that ad crap in the computer magazines I read, and 3dfx’s marketers did exactly the same thing around 3dfx’s heyday.
      My $.02

    • dolemitecomputers
    • 17 years ago

    Sorry Forge got to quote you from the ATI driver fiasco:

    […]

    • WaltC
    • 17 years ago

    Heh- Kyle’s predilection at [H] to constantly and wilfully blame the benchmark for the fact that nVidia is deliberately cheating it is bizarre. Why didn’t Kyle suggest that people not buy id Software’s Quake when he “uncovered” ATi’s “Quack optimizations” a long time ago? What’s the difference? I mean, if you are going to tell people not to use a certain piece of software simply because nVidia is cheating it, why not be consistent and tell people not to buy Quake, too, at the time? Is there an inconsistency between these two widely spaced events because [H] has developed a hidden conflict of interest in the interim? Or is Kyle’s position completely consistent with an internal bias to promote/protect nVidia whenever the need is dictated to him by his superiors in the nVidia Corporation? If Kyle can ever get around to presenting an intelligent position on 3DMark03 which doesn’t pretend it’s FutureMark’s fault that nVidia has extensively cheated the benchmark, perhaps that will become evident. Why is it he is congenitally unable to understand that criticisms of the benchmark, and nVidia’s cheating of the benchmark, are two entirely separate issues that have nothing to do with each other? He didn’t seem to have a problem separating the Quake software from ATI’s “Quack optimization”, did he?

    Let’s examine what [H] has said so far about these new facts that TR has uncovered:

    […]

    • Anonymous
    • 17 years ago

    Dually,
    You expect Kyle to spend time going back over GF4 reviews just because you might buy one for your parents’ new PC? (Used as an example.) Sites like the [H] and TR tend to review NEW items. This way, they stay close to the cutting edge of technology, where us readers want to be. I don’t want either of them to waste their time reviewing GF4s or 9000/9100/9500s now; it would be pointless. They are here to lead us toward emerging products, not to stand in the middle of the pack looking at what’s on clearance. Get a clue.

      • Dually
      • 17 years ago

      How about going back over some more recent reviews, say, the 5900 Ultra and 5600 Ultra, which, if I recall correctly, are where this whole controversy began…

      Afraid of what we might find if we begin turning over stones?

      And yes, I would like to see the benchmarks re-analysed with ATI hardware as well.

      I’m not the one that needs a clue…

      • droopy1592
      • 17 years ago

      That’s sad, poor AG. You need two or three clues, plus another for good luck. You sound like Kyle himself.

    • Dually
    • 17 years ago

    And it already begins (from HardForums):

    […]

      • thorz
      • 17 years ago

      deleted, deleted

    • atidriverssuck
    • 17 years ago

    ok, so the scores so far are:
    Nvidia = zero credibility
    ATI = zero credibility
    3dfark = zero credibility
    many other benchmarks = zero credibility
    Blind masses = never underestimate their appetite for meaningless graphs

    As usual, I’ll buy based on how well my favourite games run and all-round compatibility. It’s not very hard. Oh wait, Nvidia has the “the way it’s meant to be played” EA program happening. Just another way to break the competition through “optimised” games. OK then, I’ll just get a console. Gamecube, Xbox or PS2? Hell, let’s get all 3 and forget about PC games altogether. Yeah.

      • eitje
      • 17 years ago

      nvidia’s in the xbox. 🙂

        • droopy1592
        • 17 years ago

        But at least they can’t upgrade the drivers.

        […]

      • Anonymous
      • 17 years ago

      Maybe I should quote ExtremeTech on this matter (yeah… prepare to be flamed). When they reviewed the 5200 and 5600, they said:
      […]

      • WaltC
      • 17 years ago

      Relative to the 3D Mark fiasco I’d rate credibility:

      nVidia = 0% (Company has not admitted its cheating yet, has taken no steps to correct the problems found, and has made no “cheat-free” pledge for the future)
      ATi = 95% (ATI slipped in a single optimization which affected aggregate performance <2%, and has admitted it, and pledged its next drivers would not recognize 3D Mark)
      FutureMark = 90% (their benchmark has nothing whatever to do with what’s been done in nVidia’s drivers–they lose some credibility for being bullied into changing the word “cheat” to “application optimization” while maintaining their original audit report findings are unchanged.)

      Heh–dumping your PC for a console because of nVidia’s driver cheats is like reading about a problem with Firestone tires and selling your car and going to a bicycle…;) Smart money buys a better set of tires and keeps the car, and dumps nVidia for a better 3D card and keeps the PC…;)

        • thorz
        • 17 years ago

        CAR ANALOGY, CAR ANALOGY… Yeah! They rock 😀 😀 😀

    • house
    • 17 years ago

    Futuremark is gonna hurt from this one. I don’t think we’ll see another joint statement from them again.

    • droopy1592
    • 17 years ago

    Looks like ATi has enough beef to feed everyone while Nvidia has to trim the fat on everything they do so they can make their meat look like the better beef.

    […]

      • atidriverssuck
      • 17 years ago

      hah. It’s a step up from the car analogies.

        • eitje
        • 17 years ago

        i’ve always tried to use the “physical fitness” analogy. never really caught on, though. 😛
        i guess car information is more readily available to most of us geeks. 😉

    • YeuEmMaiMai
    • 17 years ago

    Wow look at all the nVidiots defending nVidia. Now if this was ATi you guys would be all over them like a $2 whore……

    • Delphis
    • 17 years ago

    Nvidia are such sons of bitches; they can’t make a decent video card, so they cheat. And, as someone pointed out, they’re making the whole benchmarking business as untrustworthy as they are.

    Ugh.

    /me glad I have an ATI 9500Pro

    • Dually
    • 17 years ago

    I would like to see some other applications checked for this cheat. Maybe when similar cheats are found in Q3A, D3, Splinter Cell, UT2003, or others, those who defend every pitiful move nVidia makes will run out of steam. Of course, I’m sure Kyle Bennett will find a way to accuse every one of those game developers and defend his beloved “no cheat” nvidia.

      • droopy1592
      • 17 years ago

      Seriously, Scott, you should check into what this guy says. Take some of the games and timedemos that got the 20% increase from the last driver update, rename them, and see what you get then….

    • dolemitecomputers
    • 17 years ago

    TR’s article did make the ol’ slashdot today:

    http://slashdot.org/articles/03/06/06/1216249.shtml?tid=137&tid=152&tid=185

      • gordon
      • 17 years ago

      Hehe, thanks for the link. I especially liked this comment: “[…]”

    • Anonymous
    • 17 years ago

    excuse me, but this isn’t a good article;
    it’s a good ati commercial.
    you get a tip from ati regarding a benchmarking program that ati sponsors, and you publish this as a “tech report”.
    at least back up your argument with some real applications.
    3dmark is biased from the start, and ati is just making the most out of the money they paid for it, that’s all.

      • just brew it!
      • 17 years ago

      Well, I agree calling this a “cheat” may be premature, since the effect on image quality is quite subtle. However, your comment about it being an ATI benchmark is off base: NVidia was also a sponsor/member of the FutureMark beta program during most of the development of 3DMark03.

    • Anonymous
    • 17 years ago

    My God… It seems a week can’t go by without me seeing something bad that nVidia’s done.

    Cheating, cheating, threatening, cheating.

    When will it stop!!???

    FOR THE LOVE OF GOD!

    • velocityboy
    • 17 years ago

    Kyle’s blurb on the subject:

    q[

    • --k
    • 17 years ago

    I can’t say this is surprising, considering Nv’s latest actions. They seem to have an air of hubris equaled only by that of MS. It’s a dangerous position to be in, since their products are easily replaceable, unlike Windows. Maybe a change in management is in order.

    • Hat Monster
    • 17 years ago

    I assume this doesn’t work on ATI cards?
    I’ll do some testing on my own R9700 with the renamed executable. Doesn’t look like TR bothered checking.

    (hint to driver developers: Use MD5 sums!)
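    The MD5 hint deserves unpacking: a checksum of a file’s contents identifies an application no matter what the file is named, so a contents-based check, unlike a filename check, survives the 3dmark-to-3dmurk rename. A quick sketch using Python’s standard hashlib (and note it cuts both ways: it would also make a cheat’s detection rename-proof):

```python
import hashlib
import os
import shutil
import tempfile

def md5_of(path):
    """MD5 digest of a file's contents; renaming the file doesn't change it."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: the same bytes under two different names hash identically.
tmp = tempfile.mkdtemp()
mark = os.path.join(tmp, "3dmark03.exe")
murk = os.path.join(tmp, "3dmurk03.exe")
with open(mark, "wb") as f:
    f.write(b"pretend benchmark binary")
shutil.copy(mark, murk)
print(md5_of(mark) == md5_of(murk))  # True
```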

    • Anonymous
    • 17 years ago

    Thank you, TR. Once again you are acting for us: consumers, and your readers.

    • Anonymous
    • 17 years ago

    why did you, Scott, make the comment

    “Kyle Bennett at the cold, HardOCP”

    what did you mean by “cold”?

      • TheCollective
      • 17 years ago

      It’s the way he always refers to HardOCP. He didn’t mean anything by it.

      • GodsMadClown
      • 17 years ago

      I think it’s meant to evoke the phrase “cold, hard truth.”

    • TheCollective
    • 17 years ago

    Nice job TR. Damage, remember what I said? Keep fighting the good fight. We will be here. 🙂

    • espetado
    • 17 years ago

    TR has also got my support. Good job. And as stated here before: it’s not that the diffs are small, it’s the fact that there is ANOTHER ‘cheat’ that matters. nVidia Hax0rs sux0rs.

    Now for pointing out the mistakes that Bush made in the same manner….

    • MindlessOath
    • 17 years ago

    i bet nvidia is going to try to do the same thing again, but in a manner that searches for something that cannot be renamed, or detects something being written to a specific spot in memory.

    they are sneaky and have enough money and talent to do so… i wouldn’t be surprised if they do it already. then we really can’t tell what’s going on by renaming stuff. anyways, it’s just a thought, and it would be really hard for them to pull off, but nonetheless i believe it’s possible.

      • Anonymous
      • 17 years ago

      Reading that comment was one of the most painful things i’ve ever done.

        • Hat Monster
        • 17 years ago

        Not to mention that application specific optimisations are perfectly kosher and very much part of both ATI and Nvidia drivers.

          • Hattig
          • 17 years ago

          Application Specific Optimisations are valid with real world applications, as long as the image quality doesn’t change (presumably it is allowed to get better though … interesting thought).

          The whole point of a benchmark is that it tests how well the video card can do a certain amount of work. ASOs here alter the amount of work to be done, thus invalidating the benchmark’s results, because the resultant score is not derived from doing the expected work.

          Even re-ordering instructions in a pixel shader to run more efficiently on your own hardware (what ATI did for 3DMark03) is not a valid optimisation for a benchmark (although it is certainly nowhere near as bad as including static clip planes, etc., like nVidia did).

          If you cheat on a benchmark, your intention is clear – to intentionally mislead consumers and OEMs evaluating your product (thus cheating end users). Deplorable.

    • zurich
    • 17 years ago

    BTW, I should add I don’t think Damage/Diss/anyone @ TR has an agenda of sorts against NVIDIA. However, if you want to dig up dirt on them, how about publishing the word on their 44.03 not-so-floating-point precision? That’s a hell of a lot more criminal than 3dMurk.

    • zurich
    • 17 years ago

    Um, am I the only one who could barely notice a difference between those two screens? I mean, I had them open in separate browser windows, flipping between ’em pretty quickly. And if you ask me, 3dmark (upon unbelievable scrutiny) looked better than 3dmurk, with a little less aliasing (or something).

    ATI did a real number on their mipmap/aniso levels during the Quack deal, but this is really grasping at straws =/

      • Anonymous
      • 17 years ago

      To all those who were saying that the change could not be seen by the naked eye: just look at the open book located a third of the way down from the top and a little less than halfway in from the left. It is open on a stand, facing toward the window. The differences are easily seen by eye.

      • Pete
      • 17 years ago

      Speaking of grasping at straws, it’s been shown that ATi incorrectly MIP-mapped a grand total of five textures in the Quack “fiasco.” Alert the presses!

      If nV can boost their AF performance so greatly with such minimal IQ loss, why not apply it to all games?

    • Anonymous
    • 17 years ago

    Just a thought: is it possible for the owners of nVidia’s products to file a class-action suit? On the grounds that nVidia was cheating with its drivers to make its products look better, and hence was engaging in deceptive marketing (BTW, isn’t marketing all about deception?), which led owners to make the wrong purchase decisions, etc.

    I think the hardware firms (including but not limited to nVidia) need to learn a lesson or two on this.

    Gerbil #XXX

      • indeego
      • 17 years ago

      ON WHAT GROUNDS? They have no grounds, no claims! Nvidia can quote scores on benches all they want; they reached them, and it doesn’t matter HOW they reached them, they just did. If consumers could sue over every bit of market-speak out there, we wouldn’t see anything made. Oracle’s “Unbreakable” campaign comes to mind, as well as Microsoft’s five-nines campaign.

        • Anonymous
        • 17 years ago

        1. Presumably 3DMark is a commonly used and accepted benchmark (this could be challenged…)
        2. nVidia cheated to make its products run faster on 3DMark.
        3. People thought they were paying for better products, which actually wasn’t the case.

        A poor comparison: a car manufacturer claims its car can run at 90 km/h, but THEIR “km/h” is different from the GENERALLY ACCEPTED “km/h.”

        Sure, I think it would be a poor case (as it’s difficult to prove point #1)…

        Let’s think of something else… : )

        Gerbil #XXX

          • indeego
          • 17 years ago

          Let’s not. Cheating in a benchmark or a game is not like cheating in claimed speeds. Nvidia can list specific 3DMark scores and claim honesty; it’s not their fault the bench can be manipulated, and in fact it’s to their advantage to do so. ATI will do the same. It’s actually up to benchmarkers to determine whether it’s worth relying on a third-party synthetic bench as a reliable means of evaluating a part. I say no way, never. They can always cheat on them, and they LOSE NOTHING by cheating on them other than whinefests by geeks.

      • WaltC
      • 17 years ago

      I think that sending a complaint to the FTC and calling it “dishonest trade practices” or “deceptive advertising” (which I think nVidia is clearly guilty of, because its well-documented “up to 50% performance increase” claims pertain mainly to the cheats it enabled in its drivers for 3DMark) might be a more profitable course of action.

      Really, unless you could prove that your purchase decision was based on the inflated 3DMark03 scores created by the cheats nVidia implemented, which led you to erroneously expect normal 3D-game performance to mirror that indicated in the benchmark, you wouldn’t have much of a case, I wouldn’t think. I think there are few who might truthfully say that their purchase of an nv3x-based product was based *solely* on their concept of 3DMark03 performance and how it related to actual 3D-game performance.

      Then there are damages to consider: the maximum per individual, I would think, would be the cost of the card. And then, because nVidia doesn’t sell to the public, there would be the issue of which company was actually liable: would it be nVidia, or the nVidia OEM you bought the card from? Or would it be the retail reseller from whom you purchased the card? Or would all of them be complicit? I just think it would be a mess to litigate, for even more reasons than I’ve specified here (not the least of which is that FM has gotten nVidia off a technical legal hook by being coerced into changing the word “cheat” to “application optimization”). Probably an FTC complaint is the best way to go, not to mention word of mouth, voting with your wallet, etc.

    • Anonymous
    • 17 years ago

    With so many sites losing credibility with their questionable reviewing practices and their seeming bias in the lack of coverage regarding Nvidia’s antics, I’m glad to see the Tech Report emerging as a viable alternative. Keep up the good work!

      • fr0d
      • 17 years ago

      It’s because no one CARES. I wish something exciting would happen in the computer industry so this slow-news-day crap can be relegated to the shortbread.

    • Anonymous
    • 17 years ago

    OK, first of all, they did it with application detection, which means they are intentionally fudging via the driver, regardless of how visible the changes are.

    Secondly, I’d also find ways of testing Q3, Serious Sam, and Unreal Tournament. If you’ll notice, NVIDIA had big gains in AA+AF performance in those games too, but strangely enough only in certain levels: the ones the big sites use. When you look at some of the minor review sites that use different levels for testing, the cards are tied or the 9800 Pro beats the 5900. The 9800 Pro also wins the Splinter Cell and Warcraft benchmarks, which aren’t widely used either.

    It’s pretty fishy to me.

    • getbornagain
    • 17 years ago

    Well, I wish I had the money to worry about stuff like this, but my R8500 is just going to have to last a bit longer here too.

    Ah, nice article by the way 🙂

    PS: Sorry about the deleted post; I was going to reply to that and decided not to. Well, sorry.

      • Anonymous
      • 17 years ago

      Yup. My 8500 is staying with me until I actually see a good game use something more powerful. I figure by this X-mas things will be out that will really start to tax the older cards.

      • indeego
      • 17 years ago

      GF3 Ti200 forever!

    • andurin
    • 17 years ago

    Scott, nicely done. Concise and to the point as well. I would suggest you try this renaming method out in your “evil new means of representing graphics benchmark results,” but since NVIDIA made their statement about optimizing for games, I doubt it would have much impact. However, it might be revealing, if only to see how much “optimization” is going on.

    I found it curious that you chose not to comment. I’m thinking the E-Bay method of obtaining test samples is going to be the norm for you in regards to NVIDIA based cards from here on out. While I doubt you’re worried about incurring NVIDIA’s wrath, because I’m sure you’ve already made your point just by posting this story, I wouldn’t hold it against you if that’s why you chose not to comment. Regardless, I’m sure you have the respect of your readers for actually reporting, since so little of that goes on in some hardware sites.

    edit: looks like Gordon was thinking along the same lines right before I posted.

      • slymaster
      • 17 years ago

      I don’t think Damage has to worry about Nvidia not providing test samples anymore. I am sure ATI will be happy to include a competitive sample whenever they send one of their own cards for testing, as Damage has proven to be a journalist who is unwilling to compromise his principles. Keep up the Good Work, TR !

    • gordon
    • 17 years ago

    Someone needs to do a rundown of a couple popular games to see if they are using similar techniques with AF or AA =/.

    Edit: I liked one of the quotes in the article.

    • Mulch54
    • 17 years ago

    Well, I suppose the obvious question on my mind is: what about ATI? Did they learn from their mistake with the Quake disaster? What happens if you run this same benchmark with their card? I assume, from the fact that they were eager to let this out into the open, that they have covered their rears, but you never know.

    • dmitriylm
    • 17 years ago

    I’m pretty sure it’s humming… lol.

    • dmitriylm
    • 17 years ago

    Well… as long as my Ti4200 keeps humming along just fine, I don’t really care. nVidia is gonna catch some flak for this one, though.

      • danny e.
      • 17 years ago

      are you sure it’s humming? maybe it’s hermming… and if you really made it hum, you wouldn’t be as happy with it.

        • danny e.
        • 17 years ago

        yes i am tired.

    • red0510
    • 17 years ago

    Whatever relationship The Tech-Report had remaining with nVidia just got flushed down the crapper. I’m guessing you won’t be getting any new pre-release cards from them to bench anymore.

    But this is a great thing. In the wake of all the stuff that passes for ‘reviews’ and ‘information’ on the Web these days, not to mention the New York Times fiasco, it sure is nice to see that some people are still willing to stick their necks out to expose the truth.

    I may create a PayPal account just to contribute to TR. You guys rock.

    • Pete
    • 17 years ago

    Good job, Damage (and ATi tipsters). This is getting sad.

    • Loie
    • 17 years ago

    I don’t know. This is a -really- close call. I had to load the two PNGs and Alt-Tab between them to catch any visible difference, and even then it was tough. The only differences I can notice are the carpet under the desk, the top of the chest between the desks, and the arch in the wall behind the top of the bookcase.

    Maybe TR should do this more often, but come up with a “standard”: i.e., if 3dmUrk generates a frame with >3% difference versus the same frame from 3dmArk, the method that caused the difference would qualify as a “cheat.” Just a thought… but something, some kind of -standardized- method, should be determined.
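The standardized comparison suggested here could be as simple as a per-pixel difference ratio between the renamed and unrenamed runs. A minimal sketch (hypothetical Python; the 3% threshold is the commenter's suggestion, the per-channel tolerance is an assumption, and decoding the PNGs into pixel tuples is left to an image library):

```python
def frame_diff_fraction(frame_a, frame_b, tolerance=8):
    """Fraction of pixels whose channel values differ by more than `tolerance`.

    Frames are equal-length sequences of (r, g, b) tuples, e.g. decoded
    from the two screenshots.
    """
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must be the same size")
    differing = sum(
        1 for pa, pb in zip(frame_a, frame_b)
        if any(abs(ca - cb) > tolerance for ca, cb in zip(pa, pb))
    )
    return differing / len(frame_a)

# Toy frames: 100 pixels, 5 of them changed well past the tolerance.
base = [(128, 128, 128)] * 100
altered = list(base)
for i in range(5):
    altered[i] = (200, 128, 128)

print(frame_diff_fraction(base, altered))  # 0.05, i.e. over a 3% threshold
```

The tolerance exists so that benign rounding differences between driver revisions don't count as "cheating"; only deliberate quality reductions should push a frame past the threshold.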

      • getbornagain
      • 17 years ago

      delete delete

    • Forge
    • 17 years ago

    Thing is, you can’t be sure your tests aren’t getting cheated if the methods are publicly reproducible!

    That’s why Nvidia’s cheating is a double-edged sword. They’re killing 3DMark specifically, but they’re also killing benchmarking in general.

    Way to go, morons!

    “Our new card is fast! Really fast! How fast? No one knows, but our PR guys agree it’s the fastest we’ve ever made this week!”

    • John Gatt VIA
    • 17 years ago

    Damage,

    Good job, man. It’s getting harder all the time to know what you’re getting when you’re looking for new hardware. What tests can we trust now? What do we have to do to stop this? I use these tools every day for work. I, for one, will be looking for other ways to benchmark from now on.

    John Gatt
    Web Media Liaison &
    Technical Support
    VIA Technologies, Inc.
    http://www.viaarena.com
    http://www.via.com.tw
    john.gatt@viaarena.com

    • indeego
    • 17 years ago

    Yawn. Impending death of 3DMark on the horizon. Bring on the games.

      • danny e.
      • 17 years ago

      Hmm, are all games written the same? How does a video card know the difference between when it is running a game and when it is running a benchmark? Code is code is code.

      I really need to take a few hours sometime and explain thoroughly why there is no such thing as a “synthetic” benchmark. The benchmark maker may do things that a game writer would never do, but that is the very essence of a benchmark: you are determining how said code will run on said video cards. Saying that games are better for benchmarking is true only to the extent that people play games, not benchmarks. However, games do not make better benchmarks, for the simple reason that games also are not written the same. Look at results for OpenGL games vs. DirectX games: ATI always gets clobbered in Quake, DOOM 3, etc., as opposed to, say, UT2003. To fully know which card ran a particular game faster, you would have to test that specific game. A benchmark is designed to test particular methods of doing things, not particular games.

      I am tired, so I hope I at least made some sense. Maybe if I get un-lazy sometime I will try to make my point better in the forums. I am not, by the way, saying benchmarks are always good or whatever… I am just tired of hearing about “synthetic vs. real.” Code is code. It’s all a matter of what the code does.

      • danny e.
      • 17 years ago

      Sorry there, indeego. I think I’m in need of some sleep; I am tired and grumpy. I just realized your post doesn’t even say anything about “synthetic benchmarks are bad.”
      Anyways… danny e. needs sleepy time.

        • indeego
        • 17 years ago

        there is an edit button, ya silly billy goose.

    • Spotpuff
    • 17 years ago

    What the hell is wrong with nVidia this year? 🙁

    Oh well, my brand new Radeon 9100 is doing fine (don’t laugh 😛 I’m waiting for doom3… yeah that’s it…)

      • Nelliesboo
      • 17 years ago

      I hear ya. I am going to be building maybe 2-3 systems, and they are all getting 9000 Pros for now.

    • thorz
    • 17 years ago

    Haha, the article was taken down for 2-3 minutes; for a second I thought that Damage had been forced to take it down… I was already searching my web cache to save the pages 😀 !!!

    • thorz
    • 17 years ago

    I have just one word for you NVIDIA:

    C-H-E-A-T-E-R-S !!!!!!!!!!!!

    Burn in h$@%#ell !!

    • red0510
    • 17 years ago

    Ha Ha!!! Perry ‘Damage’ Mason is on the case!!

    This is big. Really big. nVidia caught cheating again? I’m curious to hear FutureMark’s response to this. I hope your new servers can handle a good slashdotting. Hell, this may even be linked by CNN.com.
