NVIDIA caught cheating (again) in 3DMark03

We’ve seen a stunning performance boost with the new NVIDIA Detonator FX 44.03 drivers, and it appears NVIDIA has already been caught cheating in 3DMark03. You can read the whole story in this article by Dave Salvator. The Detonator FX drivers use custom pre-defined clip planes and avoid doing key buffer clears in order to boost performance artificially. These kinds of tricks only work when the camera follows a pre-defined path, as it does in the 3DMark03 benchmark.
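For readers who want a sense of the mechanics, the sketch below is a purely hypothetical C++ illustration of the general idea, not NVIDIA’s actual driver code: if the camera is guaranteed to stay on the benchmark’s rail, a driver can hard-code a plane behind which nothing is ever visible and throw that geometry away before doing any real work.

```cpp
// Purely illustrative sketch: hypothetical names and numbers, not NVIDIA code.
#include <cstdio>
#include <vector>

struct Vec3  { float x, y, z; };
struct Plane { float a, b, c, d; };  // plane: ax + by + cz + d = 0

static float SignedDistance(const Plane& p, const Vec3& v) {
    return p.a * v.x + p.b * v.y + p.c * v.z + p.d;
}

int main() {
    // Hand-tuned for ONE known camera path: nothing behind z = 5 is ever
    // visible during the scripted flythrough, so discard it unconditionally.
    const Plane benchmarkClipPlane = {0.0f, 0.0f, 1.0f, -5.0f};
    const bool  cameraOnKnownRail  = true;  // exactly what 3DMark03 guarantees

    const std::vector<Vec3> objects = {{0, 0, 10}, {2, 1, 8}, {0, 0, 2}, {-3, 0, 1}};

    int drawn = 0, culled = 0;
    for (const Vec3& obj : objects) {
        if (cameraOnKnownRail && SignedDistance(benchmarkClipPlane, obj) < 0.0f)
            ++culled;  // skipped entirely (the skipped buffer clears are analogous)
        else
            ++drawn;   // rendered normally
    }
    // On the rail the output looks identical and the score rises; off the
    // rail (e.g. 3DMark's developer-mode free camera) geometry goes missing.
    std::printf("drawn=%d, culled=%d\n", drawn, culled);
    return 0;
}
```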

As you’d expect, these revelations change our opinion somewhat about the benefits of NVIDIA’s Detonator FX drivers, which we cataloged right here. We will be looking into the extent of NVIDIA’s cheating on common benchmarks.

These revelations, however, do not surprise us in any grand way. NVIDIA has always been, top to bottom, ambitious to a fault. Intentional deception of the press, the enthusiast community, add-in board partners, consumers and shareholders with these kinds of tricks is nothing less than what we’d expect from the company. I’m sorry, but if they do not like us writing such things about them, then they should stop behaving as they do.

Comments closed
    • DerekBaker
    • 16 years ago

    That should, of course, have been:

    Can’t see any revisions to Tom’s review mentioning this.

    Derek

    • Anonymous
    • 16 years ago

    Kudos to VR-Zone for mentioning, in their new review of the 5900, that there is controversy over its 3DMark03 scores.

    Can see any revisions to Tom’s review mentioning this.

    Derek

    • SpotTheCat
    • 16 years ago

    In nVidia’s defence, it could be said that they do this to help cinematic rendering. Say you’re playing Doom III and one of their cinematics comes on and wants to be rendered. How nice would it be if you could get a better frame rate on this, and still have it look nice?

    And nVidia has said that they hate benchmarks anyway, so why would they aim their drivers at them? That’s ATI’s job.

      • Anonymous
      • 16 years ago

      If this way of doing things sped up games, it would be fine. It doesn’t; it only works for benchmarks.

      Derek

      • Anonymous
      • 16 years ago

      Nvidia does not hate benchmarks. They just hate 3DMark03 because when it first came out, they got their butts kicked in it by ATI. NVIDIA ALWAYS ENDORSED 3DMARK IN THE PAST WHEN THEY WERE ON TOP OF THE VIDEO CARD FOOD CHAIN... HYPOCRITES THEY ARE...

    • us
    • 17 years ago

    The sickest is HardOCP. Remember their role in the Quack farce?
    If you blame a synthetic benchmark, fine, then don’t use it in your review at all.

    • indeego
    • 17 years ago

    So really all Nvidia would have to do to invalidate 3dmark (at some expense to themselves, of course) is not optimize at all for it, yet on “real-world” benchmarks like HL2 and DIII they just cream ATI. People will be like, WTF, this benchmark is USELESS for me to determine whether or not this card is playing my favorite games.

    They’d have a point.

    On 3DMark’s release ATI creamed Nvidia, yet in other games the percentage was not as great. This is not a realistic way of looking at a card. At the least, 3DMark should have an open membership for all 3D card manufacturers, and possibly game producers, so all have equal say in what is important. Signing ATI and exclusively ATI hurts everyone in one way or another.

    • muyuubyou
    • 17 years ago

    Remember my blacklist??

    I remember how hard they bashed ATI for the Q3A cheat… how dare they do this… :(

    Nvidia added, forever or until they kneel before me begging for mercy and vowing not to do such a thing again.

    It’s gonna be hard to avoid those sweet nForce2 and future Nvidia chipsets…

    Please apologize soon, Nvidiot cheaters…

    • Anonymous
    • 17 years ago

    At 1600×1200 you don’t need much AA, if any.

    Shrug, popular games get optimized for Nvidia, but wait, maybe in a year or two that will be true for ATI too lol

    The 5900 is already fast, OC’s nicely and is a member of Nvidia’s 70% marketshare.

    I’ll be buying one, and the drivers work just fine in games, which is more than I can say for ATI.

    • Anonymous
    • 17 years ago

    If Nvidia wasn’t part of the beta group and all that jazz for 3DMark03, and wasn’t smart enough to have/use this “program” that ET has, then how did they make this “cheat” in the first place? I haven’t seen any allegations from any other source. I don’t know ET’s site very well, but I’m sure that site is getting many, many hits now… good way to get some publicity. They’re not even sure if Nvidia is cheating or not; it’s just complete speculation at this point.

      • jae
      • 17 years ago

      Easy. They know someone else in the beta program. Think about it.

    • derFunkenstein
    • 17 years ago

    HardOCP is famous for this sort of thing. What do you think that “giant” fiasco between H and Van’s Hardware was all about? It was he-said-she-said all the way there, too.

    • crazybus
    • 17 years ago

    I’m very surprised at Hardocp. They are backing up nvidia by attacking ET? huh? Where’s the logic in that? Considering nvidia would have to go through 3dmark frame by frame to do that, how could it possibly not be cheating?

    • Hellsbellboy
    • 17 years ago

    Are we really supposed to believe that ET had some program that Nvidia didn’t/couldn’t get?

      • dolemitecomputers
      • 17 years ago

      Good question. I’m not sure how easy it is to get ahold of that unless you are in the beta program.

        • jae
        • 17 years ago

        That’s a rhetorical question, I assume. :)

    • Hellsbellboy
    • 17 years ago

    Yes, just like ET stated: ‘they think’, but they’re not sure whether or not Nvidia cheated.

    • ChangWang
    • 17 years ago

    Ya know #65, you’re right… [H]ard to believe that this is the same guy that was trying to downplay synthetic benchies because of this and that, yet still he comes to nvidia’s aid in defending their lies and cheats… Talk about being a hypocrite…

    • YeuEmMaiMai
    • 17 years ago

    All of you nVidia card buyers just got screwed over by nVidia. How does it feel that you have the fastest synthetic scores, but when you crank up the IQ to the max you lose to a card whose basic technology has been on the market for about 8 months?

    Take a look at HardOCP’s UT2K3 scores, where the Radeon runs 300% faster at 6x FSAA AND LOOKS BETTER than nVidia’s 8x FSAA… LOL, nice to pay more for less…

    • spooner
    • 17 years ago

    Two days after [H]ard|OCP was given the opportunity by NVIDIA to benchmark DOOM3, they come out swinging in defense of NVIDIA when the company was obviously intentionally inflating benchmark scores in 3DMark03. What is interesting here is that NVIDIA was unaware they would be found out, because they didn’t have access to the tools ExtremeTech used to uncover the method behind the inflations. These tools are not provided to NVIDIA anymore, since they went on the warpath, pulling out of the 3DMark beta program and publicly attempting to discredit the small company behind one of the most popular graphics benchmarks.

    I am pretty sure you will see many uninformed sites jumping on the bandwagon today with lame attempts to defend NVIDIA. Give me a moment to hit this from a different angle.

    First off, it is heavily rumored that [H] is very pleased with NVIDIA at the moment, as they were included in the DOOM3 benchmarks on Monday, and that a bit of giddiness might have precipitated the article at [H], as I was told about their research a while ago. They have made this statement.


      • TheCollective
      • 17 years ago

      My point exactly. Kyle has theories of his own, yet he is willing to share no proof of his assertion.

        • dolemitecomputers
        • 17 years ago

        I didn’t realise that Hardocp had the bench as an exclusive from Nvidia. That does make things more interesting.

          • TheCollective
          • 17 years ago

          Not exclusive, but certainly one of a few.

    • TheCollective
    • 17 years ago

    Hardocp.com now has a snippet on the front page about ET having their own motives for revealing this information. What surprises me is that Kyle makes these accusations based upon no evidence whatsoever. All I see on the front page is “he said, she said” rhetoric. Funny, he provides no screenshots.

      • Dually
      • 17 years ago

      Exactly. First they get on ET’s case for making accusations of cheating without proof that it was indeed cheating and not a bug; then they accuse ET of busting Nvidia’s balls because they didn’t get a D3 demo, without proof that that is at all the reason ET is busting Nvidia’s balls. I guess the [H] stands for…

      • indeego
      • 17 years ago

      Welcome to Kyle’s whinefest. It’s why I rarely visit [H] and never visit Tom’s anymore.

        • Hellsbellboy
        • 17 years ago

        Yes, well, most people are sheep and only go to sites that display their opinions.

        • indeego
        • 17 years ago

        b[<"While our thoughts on this will surely upset some of you, especially the fanATIics, I hope that it will at least let you possibly look at a clouded issue through from a different perspective."<]b I mean, if TR stated something like that, I'd have to just question their integrity, completely. While TR does rant on about apple, and not getting review samples from some card manu's in time (or at all,) I never get the sense that there is animosity towards Nvidia, ATI, or the people that read their siteg{<.<}g

    • thecoldanddarkone
    • 17 years ago

    My suggestion is to look at HardOCP today… they make some good points.

      • Dually
      • 17 years ago

      …makes some good excuses too. :P

        • thecoldanddarkone
        • 17 years ago

        I have to agree a little, especially the part where they did it to show how easily synthetic benchmarks can be manipulated; they did it to prove it to the community…

          • TheCollective
          • 17 years ago


            • Anonymous
            • 17 years ago

            (quoting spooner’s post above in full)

            • thecoldanddarkone
            • 17 years ago

            That is what I was meaning. :) Oops, I need to explain myself better.

            • hmmm
            • 17 years ago

            In my mind, the only thing discredited by this fiasco is nvidia, not 3dmark.

            I was pretty shocked by the [H]’s post. I’ve been a faithful reader of that site for six years or so, and I’m thoroughly disappointed. That post is sad, Tom’s Hardware sad. The first things the [H] says are excuses (hypothetical D3 motive, tools unfairly withheld from nvidia).

            Let’s take a closer look. They talk about ET being upset about being slighted by nvidia. First, [H] is *gasp* guessing at motive (something they don’t want to do regarding nvidia’s driver bug). Second, motive does not invalidate the results. Are they reproducible? I think yes. ET found something. It doesn’t matter one iota why they went looking.

            Next [H] talks about how nvidia–because they won’t pay tons of money–isn’t given the tools that ET used to uncover the ‘bug’. Relevance? NV didn’t need the tools to create the bug. NV *chose* not to buy the tools. That’s fine, but we shouldn’t be implying that there is something wrong with other people choosing to buy the tools.

            Then H talks about ‘uninformed’ sites jumping to conclusions. Well, there’s a bug in NV’s drivers that benefits NV in a benchmark that is perceived as important, in a highly competitive market in which NV was losing across the board until very recently (and has now retaken the lead only at the $500 price point). That circumstantial evidence alone would shift the burden of proof against NV. They had better have a good explanation. Such specific and beneficial accidents come along once in a blue moon.

            Then H says that we shouldn’t guess at motive because we don’t know. Well, you can never *know* what someone is thinking, but you can get a pretty good idea. The H seems perfectly happy to assume ET’s motive based on ‘rumors’. That seems rather hypocritical on its face.

            Then they offer this little ‘there was a bug in NV’s drivers that helped ATI, doesn’t mean NV was cheating for ATI.’ Well thanks for that, Captain Obvious. And holy non-sequitur and false analogy, Batman.

            Then the [H] turns this whole incident into a poster boy for their assault on 3dmark. Without addressing the validity of that assault, there are a few things that can be said. They clearly have a political motive here (and it isn’t backed up by reproducible evidence). The clear subtext of HardOCP’s post is to excuse and distract: ‘Don’t look at NV, look at 3dmark and ExtremeTech…’ The [H] even declines to say that, assuming ET is right, NV should be ashamed. How many people would actually defend that proposition? That indicates how far out there the [H] is on this one issue. There’s no outrage in that post. If it is true, NV betrayed the public trust. That ought to get a rise out of someone. And it isn’t like the [H] is known for calm and emotionally-neutral reporting.

            I really just can’t understand why Kyle made that post. It is one thing to say ‘the jury is still out, let’s wait and see what nvidia has to say.’ Instead he clearly took NV’s side–I don’t see how it could be spun any other way. He said there is no reason to think NV did anything wrong; that even if they did, the benchmark is bad anyway (so the wrong action doesn’t matter); and look at these other guys (they aren’t perfect either). The style of Kyle’s post is as important as its substance.

    • ChangWang
    • 17 years ago

    I don’t know if anyone else has said this yet, but if it is easy to cheat on benches where the camera follows a certain path, who’s to say they haven’t used pre-defined clip planes on ALL of the current benchmarks in use? The first that come to mind are the UT2K3 flyby benches, etc…

      • eitje
      • 17 years ago

      yep, someone said it. ;D

    • thecoldanddarkone
    • 17 years ago

    They came to some conclusions, though not proven if you read the article. Also, if they are questioning the drivers, why don’t they prove the same thing for all the *significant % increases*? I am not going to say whether nvidia cheated or not (*cause they might have*), but there needs to be more info before I cast judgement for myself…

    • Anonymous
    • 17 years ago

    The core issue here is not whether 3DMark03 is a valid benchmark or not. The point is that every timedemo or benchmark currently out there, whether it’s Q3A or UT2k3 or whatever, goes through a scripted sequence. If they’ve gone through the trouble of mapping out the clip boundaries for every frame in those particular sequences, you can be sure they’ve done the same for many other benchmarks. In other words, UT2k3 might suddenly be showing a spectacular increase in the flyby, but it may be doing the exact same thing as 3DMark03 — so you’ll end up seeing zero benefit while actually playing the game.
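    To make the “zero benefit in actual play” point concrete, here is a small hypothetical C++ sketch (every name in it is invented): a visibility table keyed to the scripted sequence pays off only when the camera is on that exact script.

    ```cpp
    // Hypothetical sketch: a lookup table built offline against a scripted flyby.
    #include <cstdio>
    #include <map>
    #include <vector>

    using VisibleSet = std::vector<int>;  // object IDs worth drawing

    // Pretend someone walked the flyby frame by frame and recorded, for each
    // scripted keyframe, the only objects that can possibly be seen.
    static const std::map<int, VisibleSet> flybyVisibleSets = {
        {0, {1, 2}}, {1, {2, 3}}, {2, {3}},
    };

    static VisibleSet ObjectsToDraw(bool onScriptedPath, int keyframe,
                                    const VisibleSet& everything) {
        auto it = flybyVisibleSets.find(keyframe);
        if (onScriptedPath && it != flybyVisibleSets.end())
            return it->second;  // benchmark: draw the cheap precomputed subset
        return everything;      // real gameplay: the table is useless, full cost
    }

    int main() {
        const VisibleSet all = {1, 2, 3, 4, 5};
        std::printf("flyby, keyframe 1: draw %zu objects\n",
                    ObjectsToDraw(true, 1, all).size());   // 2 objects
        std::printf("free play       : draw %zu objects\n",
                    ObjectsToDraw(false, 1, all).size());  // all 5 objects
        return 0;
    }
    ```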

    The other point to consider is that while many of us may not care about 3DMark03 scores, many OEMs base a large part of their purchase decisions on these numbers. That makes this an important, and underhanded, way of winning business.

      • hmmm
      • 17 years ago

      Exactly! That is one of the points I was trying to get at in my post over on the [H] forums.


    • Buub
    • 17 years ago

    I think one thing that needs to be kept in mind here is that there were significant increases in performance across the board in many different tests. Yeah maybe they “quacked” the 3DMark03 benchmark, but that doesn’t negate the fact that there were significant performance increases in other tests, and there were also significant visual quality improvements.

    This behavior is unfortunate and regrettable. But where it differs from “quack” is that it is only a minor part of the changes, and these drivers appear to have real, solid improvements in addition to the “quacking”.

      • eitje
      • 17 years ago

      across the board, excluding Quake 3 Arena and Jedi Knight II. :)

        • Anonymous
        • 17 years ago

        The problem is that the same technique can be applied to ANY benchmark with a scripted sequence (i.e., all of them). This problem is not limited to 3DMark03. Just because there’s a huge increase in the UT2k3 benchmark does not necessarily mean an increase in in-game performance. In fact, given the almost across-the-board increases in performance with these drivers, I would say that this technique is probably being applied to every major in-game benchmark they could get away with.

          • hmmm
          • 17 years ago

          There’s no reason the fly-bys in UT couldn’t be optimized in the same way. Do I think they are? Probably not. The problem is that this behavior should raise that question in everyone’s mind. It also says that nvidia (for some reason) did not feel confident enough in their legitimate gains (from NV35 and the DetFX).

    • NeXus 6
    • 17 years ago

    I don’t think it really matters. Most people are looking at specific gaming benchmarks when deciding on which video card to buy. Seriously, has anyone here bought a video card based solely on 3DMark scores? Yeah, I didn’t think so. They already state that they think it’s a bug in the new FX drivers, so let’s not jump to conclusions until we know for certain it isn’t.

    • Hattig
    • 17 years ago

    I’m not impressed with this behaviour. What happens if you rename the 3DMark executable to something else?

    Game specific optimisations are fine, if the image quality remains the same. Benchmark specific optimisations that take advantage of a benchmarking limitation (in this case, a fixed flight path) are wrong, because this does not happen in real games, and hence the “optimisation” is actually a cheat.

      • Anonymous
      • 17 years ago

      I am sure that both Nvidia and ATI have *very* clever ways of identifying which particular 3D application is using their driver.
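      As a toy illustration of the crudest such trick, and the one the old Quake/“Quack” rename test defeated, here is a hypothetical C++ sketch; real drivers may key on shader fingerprints or other signatures rather than just the executable name:

      ```cpp
      // Hypothetical sketch of executable-name sniffing; not any vendor's real code.
      #include <algorithm>
      #include <cctype>
      #include <cstdio>
      #include <string>

      static bool LooksLikeBenchmark(std::string exeName) {
          // Case-insensitive match against a known benchmark name.
          std::transform(exeName.begin(), exeName.end(), exeName.begin(),
                         [](unsigned char c) { return std::tolower(c); });
          return exeName.find("3dmark") != std::string::npos;
      }

      int main() {
          // 1 = special-case driver path taken; 0 = normal path.
          std::printf("%d\n", LooksLikeBenchmark("3DMark03.exe"));  // 1
          std::printf("%d\n", LooksLikeBenchmark("3DMurk03.exe"));  // 0: renamed
          return 0;
      }
      ```

      Which is exactly why the rename test suggested above is worth running: if scores change with the file name, the driver is special-casing the application.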

        • Anonymous
        • 17 years ago

        Or it could just be that nvidia cares less about the benchmark than about the games, and the mere fact that you have to use the games to benchmark the card is what nvidia wanted anyway… so why would nvidia go out of its way to code the drivers to render the benchmark correctly, when they can instead just sit back and let it mess up until it’s no longer important anyway, i.e. when Doom 3 comes out?

          • Anonymous
          • 17 years ago

          In order to pull off the cheat Extremetech found, Nvidia’s driver team would have had to go through the entire test, frame by frame, and determine which parts would not be visible and thus could be culled out. This is a major effort. All those same engineers could have been working to optimize for real games, like the ones in ET’s Gamegauge test. But they weren’t. What’s Nvidia’s reasoning? “The benchmark is meaningless”. Uh, something doesn’t add up there.

            • Anonymous
            • 17 years ago

            Excuse me, do you have any idea how utterly stupid that idea really is?

            In order to do that they would need to know exactly what YOUR puter was looking at every frame BEFOREHAND… no matter what your puter was doing WHILE running the benchmark, and no matter how you clocked your card, your puter, your monitor, heck, even your freaking sound card.

            Get the timing off by even .1 seconds and blamo, you’re hosed. Have the HD hiccup and blamo, you’re off by 2 frames.

            They could do it if this was a 60 frame per second game where it ALWAYS was 60 frames per second, and at 45.02 seconds it was ALWAYS exactly at frame xxxxx, looking ALWAYS exactly at so and so.

            In short, it’s so utterly pooptastically stupid an idea that only an ATI fanatic could come up with it.

            • Anonymous
            • 17 years ago

            “In short, it’s so utterly pooptastically stupid an idea that only an ATI fanatic could come up with it.”
            I am not a fanATIc, but your line tells me for sure that you are an NVIDIOT :lol:. Stop talking like a STUPID fanboy. Did ya read the article? It says that the sky is rendered first… yeah, the WHOLE screen. But what about those occasions when the sky covers JUST one half or one quarter of the screen… what if the sky never gets rendered in the lower corners? Wow… it’d be rocket science to determine which parts of the sky are never rendered :roll:.

            • Anonymous
            • 17 years ago

            I was talking about the frame by frame nonsense…

            Now the sky bit itself is simple… that’s what the card is SUPPOSED to do. It’s what nvidia promised it would do.

            The only glitch is that the benchmark doesn’t like it when used in dev-mode freeze.

            But it’s not a cheat, as they already said the card reordered the rendering of the scene’s parts… and guess what, that includes the sky.

            • hmmm
            • 17 years ago

            You, sir, are mistaken. If you look at frame 412 from game 4 on your computer, it will be exactly (or at least the data would be) the same as the same frame on my computer or any other box. They do it frame-by-frame, not second-by-second. You say you read the article? You might want to try it again.

            • Anonymous
            • 17 years ago

            The only way the frames would be exactly the same is if the benchmark was actually a list of thousands of frames to render, and in that case you couldn’t freeze the freaking benchmark and swing around.

            But it’s not; it’s a game engine scripted to follow a set PATH. As such, the only thing that’s the same from puter to puter is the path, not how long it takes nor how many frames that entails.

            To understand the difference, it’s like this… say you had a super duper puter and card that could render it at 400 fps. Well, if it did all frames the same, then the benchmark would go so fast you couldn’t see anything. But instead it goes more smoothly and somewhat faster. THUS some of the frames aren’t the same in both.

            • Anonymous
            • 17 years ago

            :roll: :roll: So… maybe you want to tell Futuremark that their benchmark is broken, because you are going to get different pictures when you do your IQ comparison… let me see, you are saying that frame 412 is going to be different on another computer… :roll: :roll: … *sigh*

            • just brew it!
            • 17 years ago

            The benchmark does not have a fixed number of frames, it has a fixed length (in terms of wall clock time). The video card renders as many frames as it can over the course of the benchmark; each frame is a snapshot in time. Higher framerate = smoother motion, because the snapshots are more frequent (closer together). Higher framerate does not make the entire sequence finish quicker (unless you’re talking about one of 3DMark03’s “CPU tests”, which is not what we’re discussing here).

            To give a concrete example… a slow card might render only 600 frames over the course of a 60-second benchmark. A fast card might render 6000 frames over the course of the same 60-second benchmark. So frame 100 on the slow card will correspond (roughly) with frame 1000 on the fast card.
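            A toy loop makes the distinction concrete (invented numbers chosen to match the example above; this is not 3DMark’s actual code):

            ```cpp
            // Sketch of a time-based benchmark: camera position is a pure function
            // of elapsed time, so a faster card renders MORE snapshots of the SAME
            // fixed-length path instead of finishing the path sooner.
            #include <cstdio>

            static float CameraPositionAt(float tSeconds) {
                return tSeconds * 2.0f;  // hypothetical scripted path
            }

            static int RunBenchmark(int simulatedFps, int lengthSeconds) {
                int frames = 0;
                for (int i = 0; i < simulatedFps * lengthSeconds; ++i) {
                    float t = static_cast<float>(i) / simulatedFps;  // snapshot instant
                    CameraPositionAt(t);  // "render" the frame for this instant
                    ++frames;
                }
                return frames;
            }

            int main() {
                std::printf("slow card: %d frames\n", RunBenchmark(10, 60));   // 600
                std::printf("fast card: %d frames\n", RunBenchmark(100, 60));  // 6000
                return 0;
            }
            ```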

            Understand now?

            • Anonymous
            • 17 years ago

            Nope, frankly I don’t care either way.

            I don’t go ATI not because I hate ’em, not because they are evil, not even because of any cheating, which I frankly don’t care about either way.

            I don’t go ATI simply because I wanna try a different company after having had an ATI product.

            If it weren’t for the fact that there are really only 2 such chip makers, I might have gone away from both this cycle, as frankly both sides are getting very… annoying. But it’s like picking between Bush and Gore… someone who can’t possibly be as stupid as he looks and someone who sure as hell is… choices, choices… makes me wish I had taken the red pill, damnit. Or is that the blue pill… babble.

      • just brew it!
      • 17 years ago


        • meanfriend
        • 17 years ago

        all synthetic benchmarks, you mean?

        The most relevant benchmarks for the end user are real-world performance of games/apps, and you can’t expect Doom3 or Half-Life 2 to be open source ;)

      • muyuubyou
      • 17 years ago

      That won’t help at all in this case (they’d just know how to cheat even better).


        • Martrux
        • 17 years ago

        Wait, so if they made all benchmarks open source, and all graphics card makers ended up optimizing their drivers for them, in the end wouldn’t the better card STILL win? It’s a never-ending circle, putting drivers (and the programmers of them) to work optimizing for synthetics rather than wasting even more of their time optimizing for Quake 3 and UT2K3. So how many frames would I get in Super Mario Bros?

    • Anonymous
    • 17 years ago

    Now what?!
    The same nVidia did warn you of these very problems when that lame 3DMark03 came out.

    I’ve always hated 3DMark and most benchmarks… they’re simply useless.
    3DMark is lame because it’s neither really synthetic (it should have little raw-primitives tests, weighted and summed up), nor a real game (it has fixed paths and no interaction).
    All those wasted reviews… all those wasted CPU cycles!

    P.S. I love drivers that cheat to speed up games… especially for my fav games :)
    You just have to check that image quality is the same!

    [EuroGerbil Dan]

      • fr0d
      • 17 years ago

      Agreed. Real world benchmarks are the only things I read.

      Campaign for removal of pointless benchmarks:
      1. 3dmark 200x
      2. “Memory Bandwidth Tests”

      • just brew it!
      • 17 years ago


        • FroBozz_Inc
        • 17 years ago

        I agree… there were significant gains across the board, not just in 3DMark.
        This optimization business has been going on since people reviewed early computers, and now, 3D games.

        As long as there were performance gains in other games, big deal.

        So they tweaked 3DMark to run faster… it still looks good, and other games are faster.

        Let’s not get our panties in a bunch.

          • Anonymous
          • 16 years ago

          So, FroBozz_Inc, you know for a fact that the new drivers actually increase performance in the games, not just the game benchmarks? The fact is that nVidia’s hack can be applied to all known benchmarks (synthetic or game benchmarks). Unless you customize your benchmark (which makes the results incomparable), all benchmarks can be hacked like 3DMark03. Prove nVidia hasn’t hacked all the game benchmarks and that the performance increases actually translate into the games. I guess you’ve got a lot of testing to do. This is what happens when companies cheat: you can no longer trust the results. Get it?

    • Anonymous
    • 17 years ago

    Ugh, Nvidia copying ATI, “Quack”… maybe their next gen will copy ATI some more and not suck ass.

    • Hellsbellboy
    • 17 years ago

    Well, sounds like it’s just an allegation and not FACT as of yet…

      • just brew it!
      • 17 years ago

      Agreed. But it is an allegation with some fairly convincing (IMO) circumstantial evidence to back it up.

        • hmmm
        • 17 years ago

        What would make it fact? Like some internal nvidia document? I’m sure they’re sending those to the Inq ATM. :p Honestly, what more could you expect? There is a narrowly-limited ‘bug’ in the most widely used benchmark that just happens to coincide with a dramatic speed boost. At that point, I think the burden of proof has shifted to nvidia.

          • GodsMadClown
          • 17 years ago

          Proof, schmoof… real users will be busy getting real framerate gains in real games while all the silly people argue about synthetic benchmark piffle.

    • Anonymous
    • 17 years ago

    Hmmm, those drivers aren’t even officially released…

      • just brew it!
      • 17 years ago

      Yes they are. They were released today; I actually downloaded them myself earlier this afternoon.

    • dolemitecomputers
    • 17 years ago

    Either it is:
    A. Really a driver problem, exploited as “cheating” by an Nvidia-biased ExtremeTech writer, or
    B. Nvidia cutting corners.

      • Anonymous
      • 17 years ago

      Did you ever read the article??? Oh my goodness!! If you read it… you’ll know the answer, instead of presenting a highly unlikely option A ;)

        • dolemitecomputers
        • 17 years ago

        Yeah, I did read it. They used the developer version of 3DMark to spot the “bug”. I’m not saying whether it is true or not; I just gave the possible choices.

          • just brew it!
          • 17 years ago

          Well, you left out:

          C. nVIDIA really *was* cheating.

            • dolemitecomputers
            • 17 years ago

            Well, of course, we will never find out C for sure, since they would never admit it. We can only assume, based on whatever we find out.

    • atryus28
    • 17 years ago

    Wow, I don’t even like nvidia and I feel embarrassed for them. Talk about grasping at straws here.

    What’s up with the neon colors? I thought you guys said you hated neon. Well, at least that’s what you said in your review of Chaintech’s stuff.

      • Forge
      • 17 years ago

      Yes, but it wasn’t Matrix Reloaded release day, then.

        • atryus28
        • 17 years ago

        I see. Well then, I will give my wonderful impression of Keanu.

        “Ya gotta go down Matrix, it’s gotta be that way!” This also applies to any other movie he’s been in.

        “Ya gotta go down Dracula, it’s gotta be that way!”

    • Anonymous
    • 17 years ago

    Please keep the color scheme, at least until the weekend. It’s really awesome.

      • indeego
      • 17 years ago


        • Anonymous
        • 17 years ago

        All I can say is the green on black gives me a headache.

          • Anonymous
          • 17 years ago

          I love green on black… do ya want an aspirin? :lol:

          • jae
          • 17 years ago

          I second that. SICK.

    • Anonymous
    • 17 years ago

    Well, so much for the Nvidia ‘Gold Standard’ for drivers, and even Tech Report was only just praising Nvidia. Tsk tsk.

    Anyway, speed and image quality are one thing, but how are the stability and compatibility of the new drivers? This aspect always seems to be treated as an afterthought, and wrongly so.

    • WaltC
    • 17 years ago

    ATi is not nVidia’s worst enemy–nVidia is.

    • Samlind
    • 17 years ago

    Well, we are all reminded of the ATI “Quack” stunt. Like Jimmy Swaggart, they said they repented of that relationship with that woman, and have seen the light. I expect Nvidia to pull a similar conversion of faith.

    Of course right now they are liars, their products are crap, and their public image sucks. Good move.

    In other news, the Matrixization of TR looks pretty good. You should keep it like that… not…

    • Trik
    • 17 years ago

    /ot
    This color really is hideous

      • hmmm
      • 17 years ago

      Agreed…

    • Anonymous
    • 17 years ago

    It’s good that the truth wins out… though one would wonder why they would ‘fix’ the race; they gotta know they are going to get caught.
    Info travels so fast these days around the internet, who are they trying to fool?

    • Anonymous
    • 17 years ago

    ATI owns all of you nvidia fanboys!

    • gordon
    • 17 years ago

    Nvidia doesn’t recommend 3DMark03 as a method of judging a card’s strength, so why the excess effort to optimize for it?

    • LicketySplit
    • 17 years ago

    Sounds like a nice case of fraud… they should haul their arses off to jail pronto! How damned low can you get to make a few sales… whew!

    • Xylker
    • 17 years ago

    Yeah, no surprise, really… Sad, but the 6-month product cycle is a killer, especially when you are on the downside of the slope trying to topple the performance leader.

      • Anonymous
      • 17 years ago

      I really think they need to slow down the product cycles from 6 months to, say, 9-12 months, since a lot of games don’t change engines or make huge strides in graphics complexity every 5-6 months. At least the additional time would give nVidia a chance to find a real fix for the generally tethered GeForce FX (mostly the 5200 line).

      I think it’s bad for any company to exploit and cheat on benchmarks just to look better… mostly when it comes to backlash that comes after the cheat is exposed to the public (be it nVidia, ATI, S3/Via, whomever).

      Personally, I don’t give a crap about how many hundreds of frames a second a game runs at, since it’s not my kind of thang. The majority of people don’t care about it either, especially when they’re playing things like Solitaire or Minesweeper… instead, I’d rather have something that provides the best image and render quality and super-stable drivers. But that’s just my POV on it all.
