Valve steamed over NV3x performance

SEATTLE — Greetings from ATI’s “Shader Days” event, where, in a remarkable presentation, Gabe Newell of Valve, the famed developer of the Half-Life series, just uncorked a shocking set of benchmarks of his company’s new DirectX 9 game. Newell also expressed his frustration with some graphics hardware makers’ attempts to influence the outcome of benchmarks using driver-based optimizations.

As you know if you have been following graphics hardware over the past six months or so, there is a rich thicket of issues surrounding benchmarking, driver optimizations, and next-gen graphics hardware. I will not attempt to bring you up to speed on all of them here. I believe Newell’s presentation summarized the relevant issues nicely and stated Valve’s position with little ambiguity, so I’ll present my pictures of those slides with some commentary as necessary.

The context of Newell’s remarks was ATI’s “Shader Days” event. Select members of the press spent the day here taking in presentations by employees of ATI, Microsoft, and SoftImage about the latest developments in DirectX 9-class shaders, with an emphasis on pixel shader hardware. The briefings were full of useful information about how innovations made possible by DirectX 9-class programmable graphics hardware promise to vastly improve visual fidelity in games and other interactive apps without compromising performance. The capstone of the day’s presentations was Newell’s talk, which was accompanied by a stunning demonstration of Half-Life 2’s Source engine in action.

Newell began by establishing Half-Life 2’s credentials as a true DirectX 9 app.

Gabe Newell of Valve Software states his case

Newell then talked about his consternation over techniques IHVs have used to skew benchmark results, a problem he considers serious, because it threatens the gaming experience for Valve’s customers.

As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn’t exempt other graphics hardware makers from his complaints. He said Valve had seen cases where the graphics driver removed fog entirely from a level in one of Valve’s games in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he’s seen drivers detect screen-capture attempts and output higher-quality data than what’s actually shown in-game.

Newell summed up his problem with bad benchmarks in a nice zinger line: “Our customers will be pissed.”

 
To illustrate the seriousness of the problem, he then shared some benchmarks with us from Half-Life 2 running on current graphics hardware. The test config was like so:

The results are striking, as you can see. I believe the results were recorded at 1024×768 in 32-bit color. This is with only standard trilinear filtering and no texture or edge antialiasing.

NVIDIA’s NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI’s mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600—the slide was mis-labeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra.

Valve even ginned up a value-for-money slide to illustrate the problem with the current price/performance proposition for NVIDIA hardware.

However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve’s sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.
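To make the partial-precision idea concrete, here is a minimal, hypothetical sketch of the kind of change involved; it is not Valve’s shader code, and the shader names and lighting math below are invented for illustration. In DirectX 9 HLSL, declaring quantities as half rather than float signals that 16-bit floating-point precision is acceptable, which NV3x hardware can execute faster than full-precision math.

```cpp
// Illustrative sketch only -- not Valve's shaders. It shows what a
// "partial-precision hint" amounts to in DirectX 9 HLSL: half instead of float.
#include <cstdio>

// Generic DX9 path: full-precision floats throughout (32-bit on NV3x).
static const char* kGenericPixelShader = R"(
    sampler baseMap : register(s0);
    float4 main(float2 uv : TEXCOORD0, float3 light : TEXCOORD1) : COLOR {
        float4 base  = tex2D(baseMap, uv);
        float  nDotL = saturate(dot(normalize(light), float3(0, 0, 1)));
        return base * nDotL;
    }
)";

// Hypothetical "mixed mode" variant: half precision where the reduced range
// and precision are visually tolerable, which NV3x runs faster.
static const char* kMixedModePixelShader = R"(
    sampler baseMap : register(s0);
    half4 main(half2 uv : TEXCOORD0, half3 light : TEXCOORD1) : COLOR {
        half4 base  = tex2D(baseMap, uv);
        half  nDotL = saturate(dot(normalize(light), half3(0, 0, 1)));
        return base * nDotL;
    }
)";

int main() {
    std::printf("generic path:\n%s\nmixed-mode path:\n%s\n",
                kGenericPixelShader, kMixedModePixelShader);
}
```

Valve’s actual mixed-mode path reportedly went well beyond this, all the way to hand-tuned shader code, but the half/float distinction is the core of the partial-precision hint.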

The “mixed mode” NV3x codepath yielded mixed results, with a fairly decent performance gain on the FX 5900 Ultra, but not near enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.

Newell also expressed skepticism about the payoff from NV3x-specific optimizations, noting that the optimization process was arduous, expensive, and less likely to produce performance gains as shader techniques advance. What’s more, he said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem.

He suggested one way of dealing with the issue would be to treat all NV3x hardware as DirectX 8-class hardware, which would cut down significantly on eye candy and new graphics features, but which could yield more acceptable performance on NV3x chips. Obviously, he noted, one could always cut down visual quality in order to achieve higher performance, but in the case of Half-Life 2, falling back to DX8 would require tangible sacrifices.
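As a rough illustration of what that kind of per-hardware codepath decision can look like, here is a hypothetical sketch; the types, thresholds, and vendor check are invented, not Valve’s code. In a real DirectX 9 engine, the supported pixel shader version would come from the capabilities the driver reports (for example, D3DCAPS9::PixelShaderVersion) and the vendor from the adapter identifier.

```cpp
// Hypothetical codepath selection, loosely mirroring the options Newell
// described: a full DX9 path, a special-cased NV3x path, and a DX8 fallback.
#include <cstdio>

enum class RenderPath {
    DX9,        // full DirectX 9 path: ps_2_0, full-precision shaders
    NV3xMixed,  // special-cased path: partial precision, hand-tuned shaders
    DX8         // fallback: ps_1_x, fewer effects, better NV3x frame rates
};

struct GpuInfo {
    bool isNV3x;            // e.g. GeForce FX 5200/5600/5900
    int  psMajor, psMinor;  // highest pixel shader version supported
};

RenderPath ChoosePath(const GpuInfo& gpu) {
    if (gpu.psMajor < 2)
        return RenderPath::DX8;        // DX8-class hardware, e.g. GeForce4 Ti
    if (gpu.isNV3x)
        return RenderPath::NV3xMixed;  // DX9-capable on paper, slow on the generic path
    return RenderPath::DX9;            // e.g. Radeon 9600/9800
}

int main() {
    GpuInfo radeon9800{false, 2, 0}, gffx5900{true, 2, 0}, gf4ti{false, 1, 3};
    std::printf("Radeon 9800: %d\nGeForce FX 5900: %d\nGeForce4 Ti: %d\n",
                static_cast<int>(ChoosePath(radeon9800)),
                static_cast<int>(ChoosePath(gffx5900)),
                static_cast<int>(ChoosePath(gf4ti)));
}
```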

Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.

 
To tie up one of the few loose ends in his case, Newell addressed the possibility of performance improvements in new drivers, casting a skeptical eye on promises of big speed gains. He made a distinction between good and bad optimizations, and even offered to help publications like ours sort through some of these issues when needed.

Finally, Valve’s top dog addressed potential accusations of bias head-on. The company recently signed a deal allowing ATI to distribute Half-Life 2 with its Radeon cards, but Newell made clear this was not a case of the tail wagging the dog.

To aid reviewers in evaluating graphics hardware, Valve will be releasing a new, Half-Life 2-based benchmark in the near future, with a full suite of tools and instrumentation to help reviewers (and end users) find performance bottlenecks. The test will output all kinds of useful data, including frame rate averages, deltas, and second-by-second frame rates. Reviewers will be able to record their own in-game demos for playback, and tools will include per-frame screen captures using either the graphics card or the DirectX reference rasterizer included in Microsoft’s DX9 SDK.
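Valve hasn’t released the tool yet, so the following is only a rough, hypothetical sketch of the kind of statistics such instrumentation derives from per-frame timestamps: an overall average, frame-to-frame deltas, and a second-by-second frame rate. All names and the sample data here are invented.

```cpp
// Invented sketch of timedemo statistics -- not Valve's benchmark code.
#include <cstdio>
#include <vector>

struct DemoStats {
    double averageFps;
    std::vector<double> frameDeltasMs;  // time between consecutive frames
    std::vector<int>    fpsPerSecond;   // frames completed in each whole second
};

DemoStats Analyze(const std::vector<double>& frameTimesSec) {
    DemoStats s{};
    if (frameTimesSec.size() < 2) return s;

    const double duration = frameTimesSec.back() - frameTimesSec.front();
    s.averageFps = (frameTimesSec.size() - 1) / duration;

    s.fpsPerSecond.assign(static_cast<size_t>(duration) + 1, 0);
    for (size_t i = 1; i < frameTimesSec.size(); ++i) {
        s.frameDeltasMs.push_back((frameTimesSec[i] - frameTimesSec[i - 1]) * 1000.0);
        const auto bucket = static_cast<size_t>(frameTimesSec[i] - frameTimesSec.front());
        if (bucket < s.fpsPerSecond.size()) ++s.fpsPerSecond[bucket];
    }
    return s;
}

int main() {
    // Fabricated timestamps (in seconds) standing in for a recorded demo run:
    // a steady ~40 fps over three seconds.
    std::vector<double> t;
    for (int i = 0; i <= 120; ++i) t.push_back(i / 40.0);
    DemoStats s = Analyze(t);
    std::printf("average: %.1f fps, first delta: %.1f ms, frames in second 0: %d\n",
                s.averageFps, s.frameDeltasMs.front(), s.fpsPerSecond[0]);
}
```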

Conclusions
No doubt there will be repercussions from Valve’s unprecedented decision to address this topic in this way. We here at TR will be watching developments with interest, and we eagerly anticipate the opportunity to give the new Half-Life 2 benchmark a spin. Valve’s efforts to build a useful, detailed set of benchmarking tools into its new game are exactly what we want and need from developers.

At the same time, while the NV3x cards’ weak performance numbers in Half-Life 2 with DirectX 9 are jarring, they don’t come as a complete surprise. We have seen NVIDIA go to great lengths to protect its NV3x hardware from proper scrutiny in prominent benchmark applications like 3DMark03 and Unreal Tournament 2003. Results from the few other DX9-class benchmarking tools available, like ShaderMark, have not boded well for NV3x. Valve’s frustrations seem to stem from the disconnect between its expectations, namely that NV3x cards were capable DX9-class graphics hardware, and the apparent reality: the NV3x is a capable performer in DX8-class games, but it lacks the hardware resources (floating-point pixel shaders, register space) necessary to run DX9 games in real time.

We’ve had difficulty confirming this reality because of a lack of good tools to measure DX9 performance, and because NVIDIA has been very guarded with details of the NV3x chips’ internal configurations. It looks as though new games like Half-Life 2, and new benchmarking tools derived from them, are about to change all of that. We will have more to say about all of this soon, once we get our hands on the tests ourselves. 

Comments closed
    • Kuroi Inazuma
    • 17 years ago

    yeah, Steam is such trash. it works half the time and the rest of the time it sucks up all the system resources. my comp is old except for my 9800 pro. i mean i dont have problems with half life, just Steam. valve did a horrible job with steam and im one of the very few who still uses WON just because Steam sucks so much. maybe if they actually made Steam efficient… but if Steam is any reflection of their real ability to write code… then HL2 is going to be a disaster. HL2 and Steam so far have been nothing but one letdown after another. the only good thing about any of it is that apparently ATI cards outpreform NVIDIA. Ive always supported ATI because i always got better preformance their cards. all im waiting for now is for AMD to release their new FX-60 and the rest of their 4000+ series. oh well. ignore me, i dont know what im talking about..

    • Anonymous
    • 18 years ago

    I just want to say that ATI became just what Nvidia was before. It’s ironic how now ATI rules in 3D and Nvidia has the best built quiality and tv-out quality. I bought a Super grace ATI 9700, it was almost dead on arrival, then bought an Sapphire 9700 PRO, it used to display corruption sometimes when logging on to windows. And tv quality is really crap, and the same for driver features. Please people, try concentrating on somehing else then just pure benchmarks

    • Anonymous
    • 18 years ago

    anyone noticed how a company died because they didn’t want to use 32 bit color & stuck with 16 bit. well all i can say is i hope u die quicker nvidia so the video card noobs don’t waste their money on ur crap (unless it’s a dx8 GeF ti card)

    • VTNC
    • 18 years ago

    It’s REALLY funny when you think about the people who bought a GFX just to stick to nVidia.

    • maxxcool
    • 18 years ago

    http://www.anandtech.com/news/shownews.html?i=20525
    what else do i have to say ? $Nvida is stooping really low for this one

      • PLASTIC SURGEON
      • 18 years ago

      And this surprises you? 😉 When you market a product and hype the shit out of it only to find out that it is inferior in almost every way compared to your strongest competitor, you have a few ways to deal with the problem. 2 actually:

      1. Admit there is a problem and devise a plan of action to correct it at all cost
      2. Do nothing and lie.

      We all know what option Nvidia has taken. And continues to take.

    • paulusbospooier
    • 18 years ago

    this is bogus i have a leaked beta of hl2 it runs great on a ip4 2000 and a gf4200 128mb with 8xagp only its in dx8 big deal i pump out 60fps on a res of 1024×786 at 32bit with 3x aa so i dont know what al the fuzz is about????
    for the record i have it on a asusp4p800 deluxe mainplate with 512ddr of GEIL late dudes

    • Anonymous
    • 18 years ago

    ROFL … nice FAKE stats!

    look: 9600 ATI is faster than a 5900Ultra ? ROFL!

    a GeForce 4 Ti is faster than a FX5900Ultra ? ROFL!

    Nice FAKE!

      • Anonymous
      • 18 years ago

      OK Dumbass… lay off the pot. First at all, it is pretty well known that the FX5900/5800 is a piece of crap for anything related to DirectX9 shaders. So that explains why it gets its ass kicked by a medium range card. Second of all,did you even noticed that the Geforce4 is only executing DirectX8 shaders????

      • PLASTIC SURGEON
      • 18 years ago

      What a dumbass you are. Please don’t post here anymore you 12 year old. Those numbers are being confirmed now all over the web. So it’s not just Valve publishing them. Extremetech.com…GamersDepot….Anandtech.com…Tomhardware.com
      are just a few. FX chips perform shaders and dx9 applications poorly. Fact.

    • Anonymous
    • 18 years ago

    What’s wrong with nvidia????

    I will tell you all what’s wrong with nvidia,,,3dfx they destroyed themselves with their stupid ideas of creating their own chips and parrallel proccesors and now their wacko ideas and engineers have infultrated nvidia,, the 5900’s are a joke they remind me of the voodoo 6000’s Nvidia wake up fire those idiot 3dfx dolts before it’s to late… Last best nvidia card ti4200 next stupid nvidia move, an add on card for the 5900’s to double the power at only one resolution

    • Anonymous
    • 18 years ago

    For me its not the performance issue (well, i have a R9700NP 🙂 ) but the fact nVidia lied and continued to lie and cheat in drivers after it got caught and fooled people some of them blame now.. ATI / Valve / MS. :weird: Thats really strange. And if i wanted to play with medium-settings i would just set the details to “medium”.No problem. Why ffs nVidia decides what settings have i play with. “The way Nvidia thinks you should play”..
    #52 LiamC
    Fully agree

      • Anonymous
      • 18 years ago

      Ok first off go read up man ATI is the one who cheated evan futuremark saiad so . Nvidas cheatn or hack was for visual quality only ada minor performance bump by 15 ATIs cheap was for pure performance only and the cheat gave them 40% increse. but if ya read right futremark siad and i quote. Nvida could have not possable had time to make optimizations since they dropped out of the ebta testing race long befor we started production on 3d2k3.} so try agian remeber ATI is the one who leaked Doom 3 alpha thats why John carmak quite hailing ATI as his best friend. but the thing i hate is that vavle is just being a dumb ass because they was alredy told by nvida not to use ther 45. drivers they should have used the 50. drives because they made many DX9 upgrades and fies and new code optimizer sutch as ATI has t use for the Dx9code . naywho iam ot

        • Anonymous
        • 18 years ago

        ARE YOU FOR REAL? Let’s get this straight right now dumbass. Nvidia dropped out of the Beta program 16 months into it. Late December. The finalized 3dmarks2003 was released in Feb. THEY HAD PLENTY OF TIME TO “OPTIMIZE” for it. And they did. 3 or 4 times. Even after they got caught cheating those times and said the benchmark is useless. The program was in place for 18 months. Where the hell did you get your facts that they dropped out BEFORE it started. You need to read your facts. You can look this up at Beyond3d or this very site. Even Extremetech.com And after all that? They REJOINED the program in August. Learn your facts.


        • Anonymous
        • 18 years ago

        Read this quote from GamersDepot. You need to. And there are plenty more like it all over the web……

        “NVIDIA’s Det50’s don’t quite grab the high-needed performance boost we had hoped for – granted they still have some time to improve upon them. However, we can now conclude as to why NVIDIA wanted Valve to use these in the recent testing we did in Half-Life 2 – they make the image quality look like crap as a way to boost performance.

        To us, the choice for which graphics card is quite clear. Do you want to buy a video card that uses DX9 “Smoke and mirrors” to emulate performance, or one that provides best-of-class performance and image quality in one package. You’d truly have to be a NVIDIA fan-boy to buy one of their cards given the current state of affairs.

        We’re told that both ATI and NVIDIA have new products coming out relatively soon; however, both are loosely based off of current generation GPUs, which means – all things holding true to form – that NVIDIA will still be taking in the shorts for the majority of DX9 / “Next-gen” games.

        We do understand that our build of the Det50’s are in beta, and there are some things that could – and obviously need to change – for NVIDIA to keep a running in the race. For starters, they need to crank up the image quality again and make sure, first and foremost, things look good. Secondly, find a way to optimize their shaders so games run much better. It’s unfortunate that NVIDIA’s way of optimizing games is to drop the precision down to 12 or 16-bit.

        It’s in our opinion that both end-users and OEM’s out there should be highly cautioned into buying any of NVIDIA’s current hardware under the assumption that it’ll offer the best gaming experience for DirectX9-class games. Until NV40 rolls out, it appears for now that NVIDIA will be sitting on the side-lines. ”

        • PLASTIC SURGEON
        • 18 years ago

        Here’s something for you to look at…..

        http://www.anandtech.com/mobile/showdoc.html?i=1866&p=11

        • Anonymous
        • 18 years ago

        http://www.beyond3d.com/forum/viewtopic.php?t=7873

        Keep believing Nvidia PR spin.......

    • maxxcool
    • 18 years ago

    again #178, branching from the norm, you have to be prepared for a slap <or> raving success….

    first the game has to come out, i could give 2 shits about vaporware bencharks..when its out in demo and i can run my tests ill have my conclusions..same with the curent issue….which is the tactic of nvidias pr to lie, cheat, and try to hide it and never fess up

    the topic is cleary stated….we are disccussing nvidia tactics, they know the fucked up and wont admit it. and they continue to try to discredit and lie or buy there way out of it….

    ati admited to cheating image quality in q3, and they listend and improved.

    nvidia should do the same, but wont. they have so much vested in 3x that they will rape every consumer who buys the card without knowing the facts. thats what most of us are pissed about. the fleecing of the dumb ass consumer..

    driver issues?? again to all driver whiners….wtf? i have yet to see any issues on the 3 other machines running ati cards…

    my personal game box is a nvidia 5600, which i will replace with whaterver card runs the best image qaulity and speed….

    the others are running

    1 ati 8500 64meg cards @ 280/245
    1 ati 9100 128meg cards @ 275/266 and 245
    1 ati 9500 modded over to 9700….default clocks

    when we game we have no issues from either side save for the fact that none of these card are dx9 save for the semi-9700. including the 5600 i own…

    the driver issue is dead, if you have no other amunition your shooting blanks and havent a clue.

    • Anonymous
    • 18 years ago

    Who say’s Half Life 2 is going to be good anyway?

    After looking at the situation with the various card Manufactorers I’ve just bought a FX5900 Ultra and I’m completely satisfied with it.

    I value stability over and above raw performance. I’d perfer to be able to play a game for several hours at a time rather than just serveral minutes.

    The real key to this for me is what Carmack say’s and does. He admits that the NV30’s do have their faults, but for all the ATI/DX9 flag waving fanfare he still insists that for him OpenGL is the way to go not DX9.

      • Anonymous
      • 18 years ago

      “I value stability over and above raw performance. I’d perfer to be able to play a game for several hours at a time rather than just serveral minutes.”

      In that case an ATI card would perfectly satisfy you…

      “He admits that the NV30’s do have their faults, but for all the ATI/DX9 flag waving fanfare he still insists that for him OpenGL is the way to go not DX9.”

      Well, Gabe said Valve used DX9 instead of OpenGL because DX9 was more mature….

    • maxxcool
    • 18 years ago

    167 : 170 : 173

    what the fucking hell are you on ?

    prerelease 50.xx driver show good improvment but still a loss of massive scale

    this is in shader mark and hl2 and tombraider

    the god damn chip cant handle massive amounts of temp registers how can you cover that up let alone proclaim “ati will be blown out of the water”?? go back to your caves

    how can 3 industry level apps be wrong ? especialy when the use the “standard”

    im not paying 500-1000$ for a card that needs a special driver…nor should any normal person. specialy when you get a card that will not run in its highest quality mode even if you tell it to…

    what part of “smoke and mirrors” dont you get??? i have a 5600, and ill bet you my godamn paycheck i wont be able to play it as a dx9 game……nor will anyone else with anything less that a 5900 ultra

    http://www.3dcenter.org/artikel/cinefx/index_e.php

    there are people alot smarter than i that have already concluded A LONG TIME AGO that the shader engine was desgined with non-dx9 internal code paths and handelers.....sure it *MIGHT* inprove with a new driver but its a god damn emulator....the dx9 titles will *talk down* to the n3x family....this isnt what people want to buy....but the majority of the public doesnt know any better and theye get screwed..... thats what pissess me of....ati did it once, and thet were forced to correct it now it nvidias turn, but when ati got caught they fucking admited it and changed it.....nvida is going alone with milking it for all its worth.....thats morally wrong....

    topic change: i dont give 2 shits about compilers, if were to the point where a standard has been issued, and one comapny thinks there big enough to break form that standard.... then they need to be prepared to take the lumps when thier choice limits there performance becasue thier chip wasnt feed optimized code....to complain about once reslults after making a obviously risky choice is fucking retatrded.. they tried to push a standard and now is haunting them. theres no driver fix that going to put a 5900 on par with a 9800..... the 9800 was desgined to the exact 9x feature set....is that ati or valves fault, or how about shadermark or 3dmark or tombraiders.....i didnt think so....

    if you build a dx9 bencmark...are you going to use the industry standards or are you going to build a "code path".....the point of dx9 is to make game devlopers lives easier so they dont have to recode a engine 2 or 3 times....jeses as of right now were stepping backwards... i dont want to go back to the days were i have to get a "mini driver" to run our favorite game..anyone young enough to remember that shit ? i swear to god were are just a step away from that again.....maybe ill forward this forum to the saint and see whats says about these latest delvolpements on his dx babey....

    if someone build a 3 tired car and says its more effiecient but it turns like crap then whos fault is that ? hey maybe nvidias way is better, but right now they need to fix thier implimenttaion...not hide obvious defects with denial, then lies, then driver "fixes"..... n40 may be the shit, but they better fix this issue right quick or they better net that deal with sony for ps3... but really, to defend a company beaqsue you like it is one thing, its another to start blindly chanting a mantra when the truth is obvious....

    (as allways take nothing personal from me, im venting and pissed)

    Maxx

      • Anonymous
      • 18 years ago

      Whew, breathe, man! You really got worked up there, perhaps you should step away from the keyboard and go have a brew. I agree with your key points, so this reply is merely out of concern for your health.

      Peace.

        • maxxcool
        • 18 years ago

        grrrr….

        ok ok, ill get laid tonite and drink a few beers….. 😉

        maxx out!

    • Anonymous
    • 18 years ago

    This is encouraging for ATI owners. It’s a travesty what nVidia have done, and alot of people have lost faith in them. Im now going to buy an ATI card; I was always put off ATI because their driver support for Softimage XSI was poor, but with XSI and ATI working with Valve, Im sure all that is or will be cleared up. Woohoo!

    • Anonymous
    • 18 years ago

    As a technician of 24 years I have worked with both companies video cards and have always found that nVidia had better drivers…even down to the imperfect vga standard(never was fixed was it?)….Ati’s drivers were problematic at best and the hardware was very tempermental with many systems so I doubt the slidshow you have to be anything more than a way to make more money for that Canadian company…..the proof is in your article….the Geforce 4 Ti4600 was close to the speed of the newer FX cards….but they use new drivers (same driver) also…I will bet you find somthing in the game code that makes the nVidia cards look bad….something that narrows the pipe to the engine and since the engine is dated but tweaked I would say it would almost be a sure thing…..check on it …it will be there…..

      • droopy1592
      • 18 years ago

      Crawl back in your hole because you’ve been absent too long. nVidia’s drivers are more broken than ever, and wayyyyy more bulky I might add, because they have application specific cheats.

      The net is now full of equal amounts of issues with cards from both companies, if not more with nVidia cards.

      Stop living in the past.

      • Anonymous
      • 18 years ago

      For the guy that said the 4600 ran faster than the newer GF-FX cards: Dude, did you read that? First off it’s using Direct X 8. Secondly it shows how nVidia’s new cards aren’t that great! You’re not really defending nVidia here…

      Plus, everyone, give up the driver argument. Since the first Catalyst drivers, ATi’s drivers have been very stable and rock solid. They may have had problems in the past. I don’t know about their past problems. I never used an older ATi card. But I have had newer Ati cards for over a year now (since the 9700 pro was released) and I haven’t had one single problem with any driver issues. The only problem I had was when I tried to use the 9700pro in a cheaply made SiS chipset motherboard. And that was the crap motherboard’s fault, not ATi’s drivers. It was an ASUS P4S8X and it was just a bad board. 15% slower than other P4 boards. I replaced it and everything worked great. I’ve never had my ATi card or drivers “crash my system” or even give me the slightest issue. And I’ve used them on almost every newer game that has been released.

      Sure, every driver has a few minor problems (that most people will never run across) like “Selecting the pin camera in the game MS Links 2003 no longer results in the ground not being drawn correctly in the sub window. This issue was known to occur under Windows XP with the RADEON™ 9000 series card installed”
      Just look on nVidia’s website. Their drivers cause issues just like every video card manufacturer. Such as “Need For Speed Hot Pursuit 2: Application hangs with a black screen when
      starting a race with 4x or 4xS Antialiasing enabled with an NVIDIA non-
      Ultra GeForce™ FX 5200 (Quadro® FX 500) series graphics cards.”

      The argument that ATi’s drivers suck is old and completely inaccurate.

      • us
      • 18 years ago

      you are too old to face the new reality, man

    • Anonymous
    • 18 years ago

    Everyone knows that Source is 5 years old. With their “tweaks” to support low-end computers as well, it is quite clear how much of the latest technology they’ve used in Source, despite what their slides may say.

    That engine has been written and re-written over and over, with lots of inherent flaws. And for errors that are partly their own fault, Valve lashes out at nVIDIA.

    nVIDIA is nVIDIA, and they’re the BEST. Why do you think Epic and ID haven’t complained about nVIDIA. I think it’s time Valve took a serious look into the base code of their engine before lashing out at one of the best GFX card companies out there.

      • HiggsBoson
      • 18 years ago

      Please, no more b[

        • Anonymous
        • 18 years ago

        You obviously don’t know jack shit about game programming, so try to contain your urge to flame. Plus, just like the Higgs Boson is a hypothetical particle, so’s your remark.

      • PLASTIC SURGEON
      • 18 years ago

      What kind of Hallucengenic drugs have you been smoking?


        • Anonymous
        • 18 years ago

        First of all, that is Hallucinogenic. As your own justification states, the problem lies within the driver. 3dmarks stats showed that even the GeForce 4 Ti 4600 came very close to the Radeon 9700 PRO. This would imply that not only the dx9 drivers are at fault, but nVIDIA’s cards can outperform ATI’s cards. nVIDIA should (will is the right word) soon correct this error, and then the Radeon 9800 PRO will be blown right out of the water. I’m not saying ATI is bad, but clear as it is, their drivers suck. Granted that the ATI Radeon has some very impressive technology built into it, but without good drivers, they’re no better than their nVIDIA counterparts.

        And what was that about nVIDIA cards costing $150 more? The GeForce cards have always been much cheaper than the Radeons, the exception being the 5900 Ultra.

        >For Nvidia to run DOOM3 at decent frame rates, Carmack had to code it in fragmented paths in lower FP. Not ATI.

        We’re again talking of asm shaders, and it’s not nVIDIA’s fault that OpenGL was given to card manufacturers to make it vendor-specific. nVIDIA just took advantage of it. And you think ATI doesn’t have their own extensions? With the new OpenGL 2.0 standard, there won’t be vendor-specific API. ATI cards have much poorer performance when it comes to OpenGL games.

        No offense, but some of your points like “Slower performance in dx9 games AND dx9 benchmarks” are based on Gabe’s accusations, which are under question. nVIDIA and ATI are both good, but all these years nVIDIA has been a wee bit ahead, and now they’ve fallen back, not because of their technology, but because of their drivers.

          • JustAnEngineer
          • 18 years ago

          It’s not the drivers. GeForceFX hardware performance (actually the i[

          • PLASTIC SURGEON
          • 18 years ago

          "3dmarks stats showed that even the GeForce 4 Ti 4600 came very close to the Radeon 9700 PRO."

          In test 4 it got slaughtered.

          http://www.extremetech.com/article2/0,3973,1264987,00.asp
          http://www.anandtech.com/video/showdoc.html?i=1863

          The test results Gabe performed are now confirmed by these 2 indenpendant reviews. And expect alot more as the days follow.

          "but all these years nVIDIA has been a wee bit ahead, and now they've fallen back, not because of their technology, but because of their drivers."

          Did you not say that ATI drivers suck? So if they suck then what does that mean for Nvidia drivers then with that last comment of yours? At this point in time, Nvidia is the company producing faulty drivers. Proven. Drivers that are just giving application specific optimizations in certain games and benchmarks. And still they have not addressed the Trilinear filtering issue with Unreal2003.

          http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/hl2_followup/004.htm

          "don’t know how anyone could objectively look at the performance we’ve seen in these benchmarks and conclude that a 5900 Ultra is a smarter buying decision than a 9800 Pro – even the 128MB 9800 Pro (as used in the tests here) trumps the lofty 256MB 5900 Ultra. If you’re still ‘stuck in the past’ and think that ATI is plagued with driver issues, than go ahead and keep your head stuck in the sand like an ostrich, buy a 5900 Ultra and then start crying when your pals are smoking your ass in games like Half Life 2 and Halo because they’re running ATI hardware."

          Buddy your whole defense of Nvidia based fx hardware is flawed. You must work for Nvidia's pr department.

      • droopy1592
      • 18 years ago

      Oh, you mean that id (john carmack) has been complaning. Telling us how long it’s taking him to optimize the game for NV3X series cards? Telling us how frustrating it is.

      He’s gotta concentrate on making the game better for the Nvidia cards instead of just make the game better.

      • Anonymous
      • 18 years ago

      dude, if the Source engine is so bleeding old, why the crap do GFFX’s have such trouble running it!!?!? doom3 will be pushing 4 years when it’s released, it’ll have nearly as many ‘flaws’. and didn’t john carmack state that the gffx had slow shaders?

    • Anonymous
    • 18 years ago

    look how fat that dude is.

    shit…thats fat as FUCK

    • Pete
    • 18 years ago

    Excellent write-up, Damage–another reminder of why I became a TR regular. Thanks.

    As for the HL2 benchmark, I hope TR has its graphing compyudah warmed up. 🙂

    (OT, has TR considered using that new 3D graph patented by those clever kids at Johns Hopkins? It’s such an obvious idea and improvement, and it would look nice with your L2 cache latency diagrams. It does appear to require a license or fee for use, though, which kind of irks me.)

    • Anonymous
    • 18 years ago

    Funny how the HL2 benchmarks seem to jibe nicelywith what the “useless” synthetic benchmarks told us months ago.

      • Anonymous
      • 18 years ago

      hehe, yep. The 3DMark team must have a huge smile right about now.

        • us
        • 18 years ago

        I bet not. They are regreting born to early and lack of bone.

    • Anonymous
    • 18 years ago

    One thing that a lot of people seem to forget is that ATI’s driver quality is still piss poor. You might not get good framerates on the NV side, but you can at least run the game for more than 10 minutes without your system crashing.

      • mattsteg
      • 18 years ago

      I honestly can’t remember crashing any games with any driver for my 9700 Pro. Maybe something a year ago or something, I dunno…

      • PLASTIC SURGEON
      • 18 years ago

      Funny. I bought my 9700pro when it first hit the shelves last year in September. Hmmmm. Not one game crashed on me since i bought it. The only issue i have had with it was Splinter Cell and FSAA. Something that effects ALL cards. Here’s something for Nvidia fanboys such as yourself. Because that old line about ATI drivers being poor is sooo pathetic now it’s comical. I am actually surprised that even Nvidiots can say things about ATI drivers with all the crap Nvidia has done in respect to driver nonesense and stability issues. Hell. Just look at the issues their drivers have with PS2.0 And yo have the balls to say ATI still has piss poor drivers. If they do, what does Nvidia have? Complete and absolute crap drivers?


      • us
      • 18 years ago

      funny you still grab a straw

      • maxxcool
      • 18 years ago

      i think you can shut the f up on your driver issues, given that i can play on a 9600 pro while anyone under a 5900 cant.

      glitches are glicthes, every driver set has them, they all get fixed. but you cant fix a f’ing broken chip beacuse you didnt stick around the dx9 conference and ati did.

      oh and the det 50’s, its a emulator you ass…..it will translate calls and break the instructions into easier to digest code…meaning true dx9 apps will have to talk down to nv3x hardware….yeah good drivers there when you need a emulator……

      • PLASTIC SURGEON
      • 18 years ago
      • Anonymous
      • 18 years ago

      ati had drivers that are fine..i jsut put a 9100 in my roomie’s system without incident..the gf2 was horribly unstable in win2l no matter which driver rev i used..Ati drivers are fine

    • Anonymous
    • 18 years ago

    I think a lot of the posters are missing the point. DirectX, some of the buggiest shittiest code that I have at a pleasure to work on, is an awful standard developed by Microsoft. ATI is a partner with Microsoft where as Nvidia has always been at odds with them from continuing their strong support of OpenGL.

    Big surprise, Nvidia’s cards do not follow the rules set out by Microsoft. They focus on what they think makes a better card, not what Microsoft does. DirectX 9 has a few more bells and whistle than the other DirectXs but it is still shit. Who the hell wants their card to confirm to a shitty standard?

      • --k
      • 18 years ago

      That’s a nice bit of revisionist theory, but you’re wrong. Nvidia and MS have been working hand in hand working on DX until Nvidia tried to patent some ideas in DX9.

        • Anonymous
        • 18 years ago

        Exactly smartass, Nvidia wants to go a different way than Microsoft did with it. Anyhow, I love you fan boys who follow our companies around and act all arrogant about it. You’re the Hollywood insiders of the gaming industry.

          • --k
          • 18 years ago

          In other words Nvidia tried to pull a Rambus patent some ideas that were being discussed openly. Btw what company do you work for?

            • Anonymous
            • 18 years ago

            I work for ATI. Its kind of sad to see how many Nvidia owners will jump ship over one press conference. Luckily we’re in the limelight at the moment but I am sure a lot people will jump ship the moment Nvidia or another company releases better marks than ours in HL2 or Doom3.

            • Anonymous
            • 18 years ago

            I *almost* believe that you work for ATi. If you did you wouldnt make such stupid remarks and im pretty sure you would have registered to give yourself *SOME* type of credablity. DX9 is not like DX8 languages in any way…. Me thinks that you have’nt a clue.

            • Anonymous
            • 18 years ago

            lol, yeah friend, that’s what they say with each release of DirectX, happy trails

            • Anonymous
            • 18 years ago

            no, its really not. But if you even had a remote clue about directx or worked with directx at ALL then you would know better. a nice place to start is

             http://www.thehavok.co.uk/scene/32bits/tutorials/directx/thebasics/

             just to get started, because now im quite convinced you have never worked with directx much less work at ATi.

      • droopy1592
      • 18 years ago

      No, you are missing the point. Nvidia cheats to run games at a reasonable rate. What other point is there?

        • Anonymous
        • 18 years ago

        I think you are wrong there. Would you rather have a locked in rigid standard of hardware that can conform to a different spec if needed or would you rather have something flexible that can be programmed differently depending on what is required of it?

          • Rousterfar
          • 18 years ago

          So you are saying this is all MS’ fault?

          • Anonymous
          • 18 years ago

          Only problem with that thinking is Nvidia programs it’s drivers in such a way to cover it’s weaknesses. And lower IQ in the process. Is that ok? Flexible to cheat?

            • Anonymous
            • 18 years ago

            What you call cheating, others call good design principles. If I were designing a piece of hardware in a constantly evolving environment, I would want it as flexible as possible.

            • Anonymous
            • 18 years ago

            Good point, note that Nvidia’s earlier revisions of the NV3x chipset can still run the game. I do not see the same for ATI.

            • HiggsBoson
            • 18 years ago


            • PLASTIC SURGEON
            • 18 years ago

             So lowering Floating point precision, and inserting static custom clipping planes and disabling AA is what you call "good design principles"?

          • PLASTIC SURGEON
          • 18 years ago

          Programming because of the gpu’s inferior capabilities? I would rather have a gpu that did not need that sort of tinkering to achieve acceptable levels of performance.

      • Krogoth
      • 18 years ago

      A thought just surge in my mind. Well now I understand why Nvidia made the deal with EA games “Play it what it was meant to be”. Any future title that EA games is going to put out it will be optimized for NV3x’s broken DirectX 9 support and it will run like crap on compertior’s products. As seen with HL2 using the correct Directx 9 standard the NV3x series runs like a cow. This possiblity alone among with Nvidia PR deperment’s deception, crappier and Det drivers. Just made me mad enough that I would never invest in a Nvidia card until, the management stuff at Nvidia stop smoking the cheap crack.

        • mattsteg
        • 18 years ago

        Isn’t TR:AOD a “Way it’s meant to be played” game? It runs like ass on nvidia hardware, same as other dx9 games.

    • Anonymous
    • 18 years ago

    That ATI 9600 Pro All-In-Wonder is looking like a great buy now. When does that sucker come out…I can’t wait!!!! Come on ATI…give me the 9600 AIW!!!

    • Krogoth
    • 18 years ago

    This report confirms what I knew what would the NV3X series was going to turned out to be. NV3X series was almost like a NV25 on mad, slilcon steroids with broken DirectX 9 support. I just more glad that my investment in a Radeon 9700 PRO back a year ago was worthwhile despite paying the big $$$ for it.

    • Anonymous
    • 18 years ago

    Yeah, #27, I remember too, in fact I kept a copy of the interview:

    “NVIDIA’s own CEO Jen-Hsun Huang recently gave an interview to The Mercury News and to acknowledge that his company will be back on top soon said the following: “Tiger Woods doesn’t win every day. y[

      • indeego
      • 18 years ago

      uh, no one? He said someday. Gives him a lot of leeway.

        • PLASTIC SURGEON
        • 18 years ago

        Leeway to make himself look like a ass. This current line of fx chips will not dethrone ati. Not even close. The fx’s shader engine is it’s achillies heel. So you wont see nothing change until the NV40 comes out. Because the NV36, NV38 use that same flawed engine…..

    • 5150
    • 18 years ago

    I want to see some G5 benchmarks! Oh wait, is Half Life 2 coming out for Macs? HAHAHA

      • dmitriylm
      • 18 years ago

      5150..you just made a dumb ass post.

        • 5150
        • 18 years ago

        What else is new?

    • Anonymous
    • 18 years ago

    Today I got my Verto FX5900 Ultra by mail.
    I accidentally stumbled to this article while browsing for more good reviews of my new GfX card…I spent most of my working day chasing the truth about these issues from the net.

    Now the box is nex to my computer, and I dare not to open it just so that I won’t lose that “14 days money back guarantee for unused product” -deal.
    I’m seriously thinking of Switching to ATI… :/

      • Anonymous
      • 18 years ago

      you have not switched yet? Good god…..swtich now

      • HiggsBoson
      • 18 years ago

      If you don’t need the video card RIGHT THIS SECOND, just return it and pickup whatever is proven to run and run well after HL2 comes out. My money says it’ll be something Radeon 9X00 but what’s another month or two of waiting?

      • Autonomous Gerbil
      • 18 years ago

      I’d either return it and go ATI or return it and wait and see what the next gen will offer. I think you’d be hard pressed to find anyone that’s been keeping up on the latest events that would suggest you hold on to that 5900.

    • maxxcool
    • 18 years ago

    oh and since im ranting let me clairify one thing…

    i am a perfromace fan boy, not ati’s or nvidia’s….ill buy what works and what looks fantastic on my dispaly…

    if hl2 runs well on a 9600pro or a 9700 / 9800 ill buy that when i need to, and ill buy the one that has the best price/performance ratio

    same for doom 3, (oh thats if it ever gets released).

    • Ryu Connor
    • 18 years ago


      • maxxcool
      • 18 years ago

      *sigh* MIB or NLT ?? (nvidia laywer team?)

      i just hope that gabe doesnt back down. this thread has made its way to the inquirer….i know that nvidia snoops that so they should see this sson enough.

      lets see what they respond with…a answer or lawyers/spin doctor..

      i know whats its like to have released a screwed up product….

      but when my comapny does that we tell the customer “oh by the way this will turn your pc into a smoulder pile of ash” and then we fix it.

      but to continue lying when the truth is begining to take light is just dumb and smaks of thinking us as dumb consumers…sure most of the market doesnt know, bit does that make it right ??

      its like getting a tripple cheese from wendys and getting 2.5 patties…thats false advertisement. but everyone is so intergtrated into everyone elses success that none does anything….that why im shocked and pleased ad gabes honest responce to the issue at hand.

      we need to support ANY vendor to the end that is willing at this point to risk ire for truth..

      maxx

        • Anonymous
        • 18 years ago

        Nvidia HATES the Inquier. LOL. They actually do not even let them in on ANY details about ANY of their products or info since they released the FX line of cards. When TheInquirer.net brought to light issues with the cards, they were black listed almost immediatly. That’s the same for Extremetech.com and even Hardocp is getting the cold shoulder from Nvidia. Nvidia blackmailed FutureMarks in retracting true statements. Nvidia bullied OmegaDrivers to remove it’s driver enhancements for DET drivers off it’s site. A move that only HURTED Nvidia owners. Nvidia has blacklisted any website that gives truthful but non-flaterring reviews of the fx line of cards. I don’t think they will be able to do it to Valve. Especially since all those Shader tests like 3dmarks2003, shadermarks and other shader games are now confirming the weakness of the fx chips. Valves results confirm all this. Yup, Nvidia will be pissed. They already released a statement inregards to the performance in HL2. DET50 drivers will be better they say…….ya right. It’s your hardware Nvidia. You can’t pull the wool over anyones eyes anymore.

    • maxxcool
    • 18 years ago

    RE 82:

    never give up the harware zeal. we are the only ones keeping these bastards honest.

    i know very well if there wasnt such a bitch fest over nvidias consumer humpfest that ati would have the same optimizations, but they dont becasue they know very well that the can capitalize on the demand for hadware that works as advertised.

    if people had’nt pissed all over atis parade in the 2000,2001 years thier drivers would still suck.

    we have the right and duty to dig, find the truth and ask questions….fan boys can suck off and die…its us the testers and lovers of new tech that make this all work…we cant back down

    maxx

    • maxxcool
    • 18 years ago

    Heh, im sorry i paid 140$ for a 5600 replacement fo my 9100, which by the way performed just as fast in dx8 games as this new card….

    but on the other hand i didnt buy it for a dx9 part becase i KNEW beforehand it sucked my ass. (just couldnt get a 9100 replacement local) but these “revelations” are actully a bit shocking and forgive me for speaking frankly fucing lame. to boldly lie outright on the :

    1 architecture….took months to find out what the pipe config was, and then find out they lied again in how it worked

    2 claim dx9 superioity, then ask vendors to igonore dx9 standards and optimize the code to reduce temp resgister usage…..so the damn chip can handle more quad-packed data (that is after it sends a empty packed through to clear its fucking buffers)

    3 beat the hell out of a company for pointing out that thiere harware sucked, then raping them with lawers untill they claimed the sky was green and all was right in the world…

    (who needs 32 bit color in todays games ?? : 3dfx)
    (who needs t&l ?? : 3dfx)

    (who needs 3dmark ?? : nvida)

    anyone else see a spriraling failure of 3dfx proportions unless nvidia comes out and tells the truth (by the way thats the REAL truth, not nvidias) ??

    wtf…..

    140$ for a dx8 replacement thats roughly the same speed as a ati 8500

    wtf….

    tomraider dx9 / doom3 / hl2….major titles using MS code path, and nvidia gets its ass eaten….im glad that gabe came out like this….the author is right…..

    nvidia needs to be called to the carpet and explain what the fuck thier smoking….before thier teams of lawers decend on all the small game vendors and whey begin cowering under legal pressure to “optimize thier games”…fuck that

    u know i started this messge in good cheer because i didnt spend the 500$….let alone 1000$ for that rediculas overclocked piece of shit that a 9600 pro could beat with a stick….but now im pissed the fuck off

    they lied to the whole industry…i buy this shit i support them…wtf is this.. i get fucked in the ass? same with apple…..most power full pc in the world… yeah i can beat some poor bastard in a iron lung but you dont see me bragging about it….die steve jobs.

    im sick of the whole industry, ive neverhad a problem watching out for myself…but now the industry is just rife with raping the consumer….

    big vendors pimping celeron systems with pictures of people gaming, image altering to get better fps, image pre-proccessing…….

    fuck you nvidia. i dare you to respond to my accusations let alone this thread in person.. and not a sales rep.

    but i know that wont happen, theres no advantage here for you is there…so you sit and tweak and release det 5.0 drivers that yet again swap quality for performance

    …hmmpf

    ranting done i guess…now lets see if this will post ..

    maxx

      • us
      • 18 years ago

      calm and peace buddy, it took them a lot time to get 5600 out

      and no, Apple looks nice, or at least looks different.

        • maxxcool
        • 18 years ago

        dont get me wrong, normaly im quite civil…and hardely ever post here.

        but as of late im just sick of getting massivly screwed…

        gasoline that goes up 20c a gallon..why…cause we pay it
        carzy ass phone bills that make no sense (cellies)
        taxes that go, then go up more while i earn same wage….
        “power full computer system” advertisements for crappy pc’s
        nvidias bullcrap
        clueless fan boys that seriously need a ass beating with splintered bamboo
        “optimized benchmarks” , thats for anyone, but seems mostly for apple and nvidia…hmm maybe they should just shag and make a Napple pc and claim its better than a sgi render farm…

        seriously, i like apple. i would use the god damned things if i could build one…..but that asshole steve and his dumb as a post board wont have it, or sell the damn os that i would also pay for.

        but both apple and nvidia share the same marketing team i swear…..all i need to see is nvidia comparing a “optimized” vs a ati rage card and im driving to cali to go postal my self. then stone steve into the ocean fire the board and run it myself…

        the industry is suffering and this is why….is that hard to figure out…fuck its easier to buy a xbox. not have to patch it. not have to worry about drivers and not worry about upgrades..christ its cheaper than a good card…hell its 25$ more than my 5600….

        the market cant have shit like this where the joe consumer who supports these dumb ass developers buys “card a ” and find it sucks ass….but he cant offord another one..gets frustrated and buys a xbox/ps2….this is exactly why when you go to “software etc” you see 80% of the store is ps2/xbox…

        bah….im all worked up agian…..

        no-one take me personal, im just more pissed than ever. i dont care what company does this kind of shit its just messed up…

        maxx out

          • PLASTIC SURGEON
          • 18 years ago

          You hit the nail on the head. The industry is whacked out. Optimized for specific applications….Benchmarks this….benchmarks that…..coded this…coded that….loss IQ….No IQ. Pixel pipelines here…vertex shaders there…. It’s enough to make you drink hard Jamacian 181 proof rum and smoke a big fatty which i am doing right now as i write this post.
          What Nvidia has done with the release of the fx chips is a travesty. They were caught flat footed by the release of the 9700pro and ever since then pulled out every trick in the book minus hardwork to regain the performance crown. And in doing so, has screwed many and turned the whole gpu industry on it’s ear. Which might be a good thing. It brings to light the issues about this benchmarking crap. I feel sorry for the loyal fans of Nvidia that purchased a lie. Many lies to be exact about the NV3xx chips. What makes it worse is the plain fact that evidence is now plainly showing the very weakness and major flaw of Nvidia’s GPU’s and STILL people are defending Nvidia’s actons in hiding this with various dubvious methods and down right cheating. WHAT THE FUCK IS WRONG WITH PEOPLE????? If it smells like shit. Looks like shit. Feels like shit. Guess what people. It is SHIT. No matter how they dress it up. Spice it up. We need more vendors…sites…anyone to speak up about this bullshit practice. Regardless of the company doing it. If ATI pulled this crap. They lose my money and business right there. Nvidia the same. Which happened. Matrox. The same. Intel. No different. Yes these are companies that are out to make money. However, they need to do so without fucking, us the consumer that pays their hydo bills and their salaries.

    • Anonymous
    • 18 years ago

    ATi 4eva!!!!!!! Nvidia = teh sux.

    hahahahahahahahahahahahahahahahahahhaa

    • Anonymous
    • 18 years ago

    Isn’t it also funny how nVidia’s driver downloads page declares that the Detonators have the “industry’s best DirectX 9 support”? In addition, they have a link on the same page to HL2 videos, undoubtedly generated on ATI hardware! Heh.

    • Hockster
    • 18 years ago

    If the benchmark results truly reflect how the final game will perform on the cards, then ATI has truly owned Nvidia.

    I’m just hoping my system with a GeForce Ti 4600 in it will give me good enough performance.

    • Ryu Connor
    • 18 years ago


      • droopy1592
      • 18 years ago

      I forgot my *[[sarcasm]]* tags! my bad.

      No zeal here, I’m just all for performance. I have been twisting hairs out of my arm to figure out what my next processor would be.

      I only bought a 9700 Pro because it was the best card out at the time (by a long shot) and my GF2 just wasn’t cutting it anymore, especially since my latest monitor is high res.

      • TheCollective
      • 18 years ago


        • Ryu Connor
        • 18 years ago


    • --k
    • 18 years ago

    I’ve not seen this kind of lynch mob since the days when Romero bragged about making you his bitch. Good stuff.

    • eitje
    • 18 years ago


    • Anonymous
    • 18 years ago

    I don’t understand people here who just bought FX cards. Wasn’t it obvious from almost every review that ATI cards were a better buy?

      • --k
      • 18 years ago

      It seems that Nvidia is making the best of the bad situation. The HW is obviously not up to snuff, so they are trying to patch things up. There are 2 things Nvidia can do if it wants to save face: do a massive recall and replace the cards with NV40, which probably won’t be out until March of next year or a refund the customers’ money, but accountability is a bad word in this business.

        • JustAnEngineer
        • 18 years ago

        Many years ago, I purchased a Diamond Viper VLB video card with the Weitek processor. Diamond tried and tried, but they never could get the Windows drivers right for the Weitek chip. They finally gave up and gave me and the other Viper owners a letter of apology and a big trade-in credit for new Viper products based on S3 chips.

    • Anonymous
    • 18 years ago

    For everyone out there in the “9700 club ” ( people who took the -risk- a year or more ago and now are reaping the rewards ):

    Whoop ! There it is !
    Whoop ! There it is !

    On a side note: Every new hardware generation has risks and rewards. Engineers + Researchers have goals to achieve. They sit down, set an aim. Then make an equation(s) to make it happen. Then hardware is built to do it ( virtually ). and from there it is tweaked / worked out.

    From here two things could happen:

    1. It could be good, optimized, then eventually produced.

    2. It could suck, need to be completely rewritten, and DELAYS insue. Thats ok, if you have time, sometimes you dont’. Sometimes the PRODUCT IS RUSHED. So a less-than-stellar design is produced, and the Marketing guys have a harder job. So on and so forth.

    So for all we know, the next generation of ATI’s hardware ( next NEW generation, not a revamped/retooled gen, like 9700->9800 ), could fall into number two, and Nvidia’s into number one. Or vice-versa. Or somewhere in between.

    Or S3 could come back and stomp all of them. Q3 2008 baby !

    • PLASTIC SURGEON
    • 18 years ago

    This is NVIDIA’s Official statement: “The Optimiziations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday Sept. 8, 2003. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2”.

    So Nvidia will once again “optimize” for a benchmark or game to bring its results level with ATI’s cards. What a joke. Can’t everyone see that it’s not the software that’s the issue with the FX cards? It’s the HARDWARE. And all these articles, synthetic tests, and game tests are proving it. Nvidia, just fix the flawed shader engine in the FX line. Because we ALL know that in order for Nvidia to improve performance in DX9 games or applications, they will have to program their drivers to run the game at a lower floating-point precision. Lower than the standard FP24. We have been down this CHEATING road, Nvidia. FP24 is the DX9 standard. Anything less is altering IQ, something Nvidia said it would no longer do with its new procedures. I have to see these 50 Dets in action, because you know these drivers will be dissected at every turn.
    And the whole point of buying a new top- or mid-range card is to play games at close to or at maximum settings. How they plan to greatly increase performance in these 50 drivers is beyond me.

    • willox2112
    • 18 years ago

    Being an avid NVIDIA fan since the days of the Riva 128, I must say that their performance lately (hardware, PR, and everything else) is really sad, and is starting to get ridiculous. This is totally unacceptable from a company that holds over 50% of the market and has been put there by a continuous line of excellent products, and by the customers that wanted an excellent product.

    Just think about it, saying NVIDIA used to be akin to saying Intel, or Xerox, or Sony, or Coca-Cola (well, maybe not as big as Coke or Sony, but you get the drift). It was a brand-name. When you thought video cards, that’s probably the first thing that popped into your head. Now that has been corrupted, and in the most unpleasant of ways, by screwing those who brought them up.

    But, also being an open-minded person, I realize that often the best products do not come from those big brand names, since companies that get big also get conceited and do not try as hard as before… relying too much on their name or their fans to pick up whatever they put out (Metallica, for example). It seems to me that NVIDIA is very guilty of this. Let’s hope that they can learn from this and come back with a bang, although it will take a while for people to trust them again. If not, then it is really the 3dfx curse.

    Now folks, think for a moment and don’t let your loyalties cloud your eyes. It is bad enough what NVIDIA has done to the industry, but it is pretty sad to see some of you still in denial that you are being screwed and forced to like it. Either live with it or do something about it, and nothing talks bigger than your hard earned money.

    Myself, I know what I am doing. Right now, that 9600 Pro AIW is starting to look pretty sweet.

      • PLASTIC SURGEON
      • 18 years ago

      You’re not the only one saying and thinking this…

    • Anonymous
    • 18 years ago

    Why is everyone bitching about ATI or nVidia cards? Put away your ego sticks for a second and just ask yourselves why we have these so-called modern games which REQUIRE hardware that costs thousands of $$$?! Several years ago you could get amazing games on machines like the Amiga; now you have to pay $1000+ for these games? Instead of whining about which card is better and how many hundreds of dollars you wasted on it, try not to let the games companies get away with it. They keep releasing this stuff and expecting you to spend more money on “upgrades” because you let them get away with it!

      • droopy1592
      • 18 years ago

      Stay on topic, please.

      • Rousterfar
      • 18 years ago

      Well, the problem is with PC development. Unlike console programming, they are not able to code to the metal.

      • WaltC
      • 18 years ago

      For my Amiga 500 back in ’87 I paid ~$1700 for the cpu, the monitor, and an external floppy (no hard drive, if you remember.) Later I paid $1999 for an A2000 without hard drive or monitor (before they were made standard.) I paid $2999 for an A3000 and about the same for an A4000 (which I still have.) I loved the Amiga at the time.

      Today, my WinXP desktop machine is hundreds of times faster than my A4000, has a storage capacity thousands of times greater, has far more RAM–you name it–and it’s cost me about what my original A500 cost me in ’87.

      The fact is that today’s boxes are far more powerful while costing much less than Amigas did in the late 80’s and early 90’s. There’s nothing to complain about. Incidentally, the games I run today simply weren’t possible on those machines–even if the software had been available. Typically, though, an average game today costs about what it cost in ’87, which is $40-$50. Difference is instead of shipping on 880K floppies consisting of 1-2 megs of data, today’s games ship on CD and contain gigabytes of data–absolutely impossible on any of my Amigas. You need to do some more thinking about this…

        • JustAnEngineer
        • 18 years ago

        I agree with you, WaltC. My Amiga 2000 with a whopping 3MB of RAM, dual floppies, no hard-drive, and a 13″ monitor cost $2400 in ’87.

      • HiggsBoson
      • 18 years ago

      If you’re asserting that game development and fun have taken a back seat to visual/audio effects and other such goodies, then I could see that you have a point. But that being said, there are still lots of high-quality, incredibly fun games being produced out there WITH great visuals.

    • Ruiner
    • 18 years ago

    Not to piss on anybody’s cheerios, but even the 9800 pro did just ‘ok’ at 10×7 with no AA/AF. Granted, it trounced everything else, but this card is obviously barely adequate for playing with the eye candy on. 9800’s don’t take a huge hit from AA/AF, but I can see framerates dropping to 30 or so with 8xAA/16xAF.

    It will take the next generation of cards to make this game shine (as always with groundbreaking engines).

      • --k
      • 18 years ago

      Good point. Playing with anything less than 9800xt is going to be painful.

      • Anonymous
      • 18 years ago

      So what you’re saying is that with AA/AF on, the 5200U and 5600U won’t even be on the board, while the 5900U gets about 1-5 fps…

      *rips the 5600U out of his system*

      Ok, time to get a 9700pro

      • droopy1592
      • 18 years ago

      So then you are saying that the 5900 Ultra will perform like a pageflip animation with eye candy on?

      Some of us gamers are glad to see the software pushing the hardware for a change.

        • Autonomous Gerbil
        • 18 years ago

        I am also glad to see the software push the hardware rather than the other way around. Maybe the next generation of cards will be more revolutionary than evolutionary in terms of speed and visual capabilities. Maybe we need an entirely new architecture to do this? Maybe we need someone to do Kyro-like optimizations on the next generation of hardware so that it can do more with less? Either way, the next-gen hardware is going to be something special – doesn’t anyone else get that sense?

      • Rousterfar
      • 18 years ago

      The game is supposed to be able to run on even low end hardware. I heard the game will even run on a GF3 card. You just have to crank some of the eye candy down. Unlike DOOM3 you won’t have to buy a high end card to run it.

      • Anonymous
      • 18 years ago

      HL2 is CPU-limited at 60fps with full CPU-dependent settings (physics, etc.) on a P4 2.8C. This is blatantly obvious from the fact that the 9800 Pro only gets 1.27x the framerate of a 9600 Pro when it has 1.9x the pixel processing resources. If CPU performance were not an issue, the 9800 Pro would be running at 90fps, not 60 (a rough check of this arithmetic is sketched after this comment).

      Which means the 9800 Pro has plenty of headroom at these settings. On a similar system, you will be able to turn on AF, and bump up AA and/or resolution without suffering much if any performance hit. 1024×768 with full AF and 2xAA will almost certainly come in at ~60 fps. 1024×768 with 4xFSAA and 1280×1024 with 2xFSAA will likely still be pretty close.

      FWIW, Gabe Newell has said that he plays HL2 on his 9800 Pro at 1600×1200 with 2xFSAA. Now, he’s not going to be getting 60fps at those settings, but it’s obviously playable. Mid 30’s or even ~40 fps is not out of the question.
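
      As a rough illustration of the back-of-the-envelope arithmetic in the comment above, here is a minimal sketch in Python. All inputs are figures quoted in this thread (the ~60 fps 9800 Pro result, the claimed 1.27x scaling over the 9600 Pro, and the commenter’s 1.9x pixel-throughput estimate); none of them are independently measured here.

        # Thread-quoted figures (assumptions, not fresh measurements)
        fps_9800 = 60.0          # Radeon 9800 Pro result quoted above
        observed_scaling = 1.27  # 9800 Pro vs. 9600 Pro framerate ratio
        throughput_ratio = 1.9   # 9800 Pro vs. 9600 Pro pixel resources (commenter's estimate)

        fps_9600 = fps_9800 / observed_scaling          # ~47 fps implied for the 9600 Pro
        gpu_limited_fps = fps_9600 * throughput_ratio   # ~90 fps if the test were purely GPU-bound

        print(f"Implied 9600 Pro framerate: {fps_9600:.0f} fps")
        print(f"GPU-limited estimate for the 9800 Pro: {gpu_limited_fps:.0f} fps")
        # The gap between the ~90 fps GPU-limited estimate and the measured 60 fps
        # is what points to a CPU bottleneck at these settings.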

      • Anonymous
      • 18 years ago

      So, you’re trying to tell me that 60 FPS is unplayable? You’re living in a dream world, buddy.

    • Anonymous
    • 18 years ago

    Very insightful, best report I’ve ever read.

    • PLASTIC SURGEON
    • 18 years ago

    It seems 3DMark03 was spot on about Nvidia’s DX9 performance. Spot on.

    • --k
    • 18 years ago

    This event is also a good way to drum up sales for ATI and HL2. Kind of like an infomercial.

    • --k
    • 18 years ago

    Admittedly not a reliable source, but this may explain why Big N sux at the shaders:

    “According to one source, Microsoft has recently held meetings with the major 3D card vendors to flesh out the specifications for the DirectX9 3D API.

    Microsoft is rumoured to have asked the vendors to sign an agreement that they’d keep shtum about DirectX9 and not get all proprietary about details.

    The word is, and this is a rumour, is that NviDiA refused to sign the agreement and Microsoft “summarily excluded” them from the meeting.”

    http://www.theinquirer.net/?article=165

    “One of the most fascinating things he managed to listen in on was the story of Microsoft’s attempt to flex its muscles with DirectX 9. It seems that a story we published a fair while ago uncovered a goodly amount of the truth. When Microsoft was first putting together the specifications for DirectX 9 they gathered various companies together to help build the API. After a very short while, Intel and Nvidia both walked away from the whole thing and now we know why. It seems that Microsoft wanted both companies to agree to reveal any patents they had relating to the technology the Vole was building. Once revealed, the Vole expected the companies to hand the patents over for good. Intel and Nvidia walked away. Only recently has it signed an altered agreement. That is why the GeForce FX does not follow the DirectX 9 specifications to the letter.”

    http://www.theinquirer.net/?article=7781

      • LiamC
      • 18 years ago

      I have heard this rumour more than once, and the first time more than a year ago.

      Trouble is, this does not excuse or hide the fact that NV3x cards perform poorly in DX 9.0 code.

        • Anonymous
        • 18 years ago

        perfectly said Liam

    • 42
    • 18 years ago

    So is it reasonable to assume that either the NV40 or the R420, when released, will be optimised to play D3 or HL2 at mega frame rates?

    • Anonymous
    • 18 years ago

    The issue is that Nvidia’s shaders at full precision are very, very slow; ATI’s cards, however, only have one useful mode and so run at full speed always. Carmack has said he’s using the lower-precision mode on Nvidia’s products to get speed, but at the same time, Doom 3 doesn’t use as many shaders and OpenGL is much faster on Nvidia’s hardware. Nvidia screwed up, and they know it. We must wait for the 4-series cards to see if things remain the same; if so, Nvidia will lose a lot of money.

    • Aphasia
    • 18 years ago

    Well, well, well, I guess what we see now are the fruits of nVidia acquiring some old 3dfx staff and technology and putting it to work. What’s next, will they buy a card maker and start selling their own cards or something…

    😀

      • Over13
      • 18 years ago

      Who manufactured all the 5800 Ultras?

    • AGerbilWithAFootInTheGrav
    • 18 years ago

    Great that TR was called to the event… among just a few sites 🙂 that’s cool…

    as for the results… well we were waiting on those, as Nvidia put a lot of dust in our eyes lately..

    luckily I replaced my GF4 with a Rad9800 non-pro recently… 😉

    • Anonymous
    • 18 years ago

    Dang I just bought a GF5900 128MB non-ultra for $249.

    But with that said, I’ve had 3D cards since the Matrox Mystique era (before DirectX).

    I found this interesting article about how ATI and Nvidia implemented shaders, and it ain’t over yet. Looks like MicroSlop may have coded DX9 to match ATI hardware more than to create an open standard.

    http://www.3dcenter.de/artikel/cinefx/index_e.php

    I guess OpenGL is the real open 3D renderer. And does Valve say that Doom 3 is wrong with OpenGL and with what they say about Nvidia? Time will tell; read the article and then comment. Foo -=FI=- SOF2

      • Anonymous
      • 18 years ago

      sorry to hear that man – seriously take it back 🙁

      – AbRASiON

      • ChangWang
      • 18 years ago

      No, Valve is not wrong in what they are saying; you are just in denial. If Doom 3 used only the “open standard,” which in that case is ARB2, the FX line STILL gets trounced. Your argument on that front is moot.

      • droopy1592
      • 18 years ago

      Nope. ATi’s hardware is the poster boy because it was designed to the DX9 spec, not over or under designed.

      I guess nVidia were planning on wowing us with specs, so they overdesigned the NV3X series. They wanted us to see the extra precision, etc.

      They missed by a long shot.

      I would rather play Doom3 on an ATi card than play Half-Life 2 on an Nvidia card.

        • Anonymous
        • 18 years ago

        Agreed. FX owners will play DOOM3 in lower FP than ATI owners…

      • Rousterfar
      • 18 years ago

      It’s still a good card you have. Just not the best. You will be fine with it.

        • droopy1592
        • 18 years ago

        Don’t lie to him! It sucks, unless he’s ok with less precision, no complete choice of filtering, bulky, buggy drivers, and a hot card.

          • Anonymous
          • 18 years ago

          Well, buggy drivers do it for me. I’m staying away from ATi cards. FiringSquad has tested the Cat 3.7 drivers, and IL2 Sturmovik: Forgotten Battles doesn’t work properly with them; there are issues with Flight Simulator 2004 as well. So here’s more waiting for either driver updates, or something from nVidia which might be <$200 and FAST.

            • Anonymous
            • 18 years ago

            Good luck, pal. You stick with drivers from Nvidia that are heavily “optimized,” and wait until all those DX9 games come out. You will be crying that you bought a 5600 or 5900. The good thing with the Cat drivers is that they regularly update them. I was once an Nvidia owner. The 9700 Pro changed my mind in a hurry. Then ATI fixed its major issues with the release of the Catalyst drivers. The VERY SAME DRIVER TEAM that left Nvidia is now working for ATI. That sold me right there. Until Nvidia can get its act together, it’s ATI for me. I have no favorites, just performance favorites. And Nvidia is lacking with the FX chips. BIG-TIME.

            • droopy1592
            • 18 years ago

            Yeah, that’s real smart. Pass up a company that works on its drivers and is constantly improving for a company that’s bulking up crappy drivers and getting even worse. Yeah, that makes a lot of sense.

            Get what works best with the games you play/ will play.

            • Anonymous
            • 18 years ago

            I’d like to go ATi, but they have issues to work out yet as previously stated in #98. Supposedly the upcoming 3.8 CAT’s are a big driver improvement but this waiting is getting old and so is this onboard NF2 video I’m using. And so it’s wait….wait….wait… 🙁

            • droopy1592
            • 18 years ago

            So your point is? Are you sticking with Nvidia until you get better ATi drivers? But wait, aren’t the Nvidia drivers just as buggy, if not more so?

            Tell me what’s wrong with Cat 3.7 in a game you have experienced. Most folks with R3XX-series cards don’t ever have problems. Me, for one.

            • Anonymous
            • 18 years ago

            My point is I’ll be waiting to see how ATi does on their next driver release. If the bugs are cleaned up in the flight simulators I fly (MS Flight Simulator 2004 and IL2 Sturmovik: Forgotten Battles), then I’m getting a 9600 Pro or RV360. If not, I’ll… see how nVidia fares with their Detonator 50 driver. The 3.7 problems are in the review on the http://www.firingsquad.com site, but I can’t access it right now. It seems these flight sims are OK with nVidia, but the performance of the mid-range cards sucks, as I also play some FPSs. So that’s it in a nutshell.

            • Anonymous
            • 18 years ago

            I can tell you many instances where Nvidia has issues with its current Det drivers. Not all drivers are perfect. But as of late the Cat drivers are far better than Nvidia’s “specific application drivers.” You know the type that only target benchmarks and certain high-profile games. Go take a look at those Det 50 ones; you will see what I mean.

            • Anonymous
            • 18 years ago

            Really? Like waiting for Det driver updates that only target benchmarks? Benchmarks that only idiots and OEMs look at. Take a look at the Det 50 drivers: they give increases in AquaMark only, and not to the same levels as HL2… Hmmm, gee, I wonder why. Keep living with your head in the sand about ATI drivers. Too bad the very same driver team responsible for the unified Det drivers and their stability is now working for ATI.

    • Anonymous
    • 18 years ago

    So which one of you idiots is going to fess up and tell us how you recently spent 500 bucks for a DX8 card in a DX9 skin?

    • Anonymous
    • 18 years ago

    Man, that whole screenshot-specific rendering thing is huge. I hope no one underestimates what that means. Basically, nVidia was attempting to outright LIE to customers. No, it can’t be passed off as an optimization or a bug; the only purpose it could POSSIBLY serve is to make people believe that they weren’t reducing image quality when in fact they were.

    I’m sorry to sound like a spaz, but if I were an nVidia owner, I would be calling my lawyer to consider a possible lawsuit based on that alone…

      • HiggsBoson
      • 18 years ago

      Actually I was wondering… could anyone explain a mechanism for how this could be possible?

      • Anonymous
      • 18 years ago

      Sh!t, I’m with you here, fella. More than anything else in the entire article, the screenshot detection is an incredibly huge story. The implications of such dishonesty have legal ramifications behind them. If true, this would be irrefutable evidence of an IHV using intentionally deceptive business practices. It’s far worse than just having optimizations kick in on application detection. This is bald-faced lying.

      Based on the info, however, we don’t know whose drivers were doing this. We only assume it’s Nvidia based on the context of the slide. Nvidia is, however, looking like a bunch of frauds right about now with their “cinema quality” marketing bullsh!t.

      Damn… what can you trust nowadays, people? Damn.

      • Anonymous
      • 18 years ago

      Hey, they just released the 50 series. And from what they’re saying, it only improves AquaMark and no other DX9 title. Will they fudge FP again to increase speeds? They should not. Nvidia’s own “procedures” forbid altering IQ. And I can’t see how their cards will change this with their faulty shader engine.

    • Anonymous
    • 18 years ago

    OK, so the Valve benchmark will be out Sept. 30, right? By then the Det 50s may be released (or not, depending on the current situation); THEN the Tech Report benchies will tell all.

    The Inquirer says bundling will be with the R360 (9800XT); no mention of the RV360. So you’re still (over)paying a lot of money for HL2 to be bundled with a top-end video card (or is it the other way around? 🙂

      • Anonymous
      • 18 years ago

      I have it on good authority that not only will the RV360 have HL2 bundled, but so will everything else from the 9600 up.

        • Anonymous
        • 18 years ago

        I’ve heard that the version bundled with the ATI card(s) won’t be the *full* version.

        Though it’s not clear to me whether Valve has figured out how many versions it will make at this point.

          • droopy1592
          • 18 years ago

          I heard that too.

          • Autonomous Gerbil
          • 18 years ago

          In the past, it has only been a limited version (was for me at least with my 8500), so that’s probably going to be correct.

      • Anonymous
      • 18 years ago

      If the benchmark comes out on the 30th, when will the game be released?

        • NeXus 6
        • 18 years ago

        It was supposedly pushed back to November, but now I’m reading Sept. 30. I haven’t seen any reports of it going ‘gold’ yet.

    • Anonymous
    • 18 years ago

    who is that guy? The VP of Twinkies!?!?

      • Anonymous
      • 18 years ago

      I know, I kind of pictured Newell looking like the guy in Half-Life, not a fat-ass computer geek.

    • Anonymous
    • 18 years ago

    I think Valve would have much preferred saying that HL2 played really well on both nVidia’s and Ati’s hardware. After all, they are in the business of selling their product to everyone. I think they are legitimately upset and concerned that they have spent all this time and effort creating a DirectX9 product and more than half of the graphics hardware out there (nVidia’s) is too crippled to run it properly.

    • Anonymous
    • 18 years ago

    I remember when the president of Nvidia made a comment about ATI smoking some bad hash or something as a reply to the interviewer who asked him whether ATI’s lead would last.

    I guess Nvidia’s illusion finally caught up with them.

    • indeego
    • 18 years ago

    Nice PR and project management at Valve consistently. Someone from Nvidia or Microsoft needs to hire them away. They hit their targets, they produce quality work, and their marketing speak is not mostly hype, but reality.

    Good work. I will get the game, but likely still play with my GF3 Ti200 for a few months more, then get the card with the fewest issues after the first few months of patching.

      • Despite
      • 18 years ago

      yeah, just look at TF2! oh, wait…

    • Anonymous
    • 18 years ago

    #18 aaaaaaahahahahahahahahahahahahahahahahhahahahahahahahahahaaaaaaaaaa

    • Anonymous
    • 18 years ago

    I guess Ati is softening up Nvidia before launching another offensive…

    The R360 and RV360 are supposed to be bundled with HL2…

    This is only going to get uglier.

      • danny e.
      • 18 years ago

      I am fairly certain HL2 would be bundled with the R360… not the R360 bundled with HL2. 🙂 Although… if they did bundle the R360 with HL2, I would buy it… even if it was the first game to cost $200!! 😉

    • DrDillyBar
    • 18 years ago

    It seems that it took a game producer to really put some light on the…

      • WaltC
      • 18 years ago

      It’s the Futuremark-nVidia saga all over again; the only difference is that Valve has some backbone and isn’t going to let an IHV discredit its software just to cover up that IHV’s shoddy hardware.

    • Anonymous
    • 18 years ago

    OK! Now this is News you can Use!!

    OUTSTANDING report, Damage! Woohoo!

    I think my favorite part was where they decided to treat NVidia junk as DX8, just to get playable framerates. Kinda like treating an Apple like a monochrome TRS-80, just so it can keep up and play with the rest of the kids Bwaaaaaaaaahahahahahah.

    Good stuff, Maynard! 🙂

    • Trident Troll
    • 18 years ago

    Bah!
    What use are these benchmarks when they left out the all-important Parhelia numbers?

    • Anonymous
    • 18 years ago

    gg nVidia. This is HUGE.

    • Anonymous
    • 18 years ago

    I think the Radeon’s advantage in HL2 looks like it will be much bigger than the GFFX’s advantage in Doom 3.

    Here’s the old Doom 3 benches from THG’s FX 5900 review:

    Medium Quality, 1024×768
    R9800 Pro – 68 fps
    GFFX 5900 Ultra – 83 fps

    High Quality, 1024×768
    R9800 Pro – 61 fps
    GFFX 5900 Ultra – 55 fps

    Now look at the HL2 benches:

    DX9, 1024×768
    R9800 Pro – 60 fps
    GFFX 5900 Ultra – 30 fps

    Mixed Mode, 1024×768
    R9800 Pro – 60 fps
    GFFX 5900 Ultra – 47 fps

    So for Doom 3, both cards seem to provide playable frame rates, with the GeForce being about 20% faster in medium quality mode and the Radeon about 10% faster in high quality mode. For HL2, the Radeon is 100% faster in high quality mode and about 25% faster in mixed mode (the ratios are worked out in the quick sketch after this comment).

    I think I know which card I’d rather have for these games. Luckily I already have one 🙂
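
    A quick sanity check of those ratios, using only the frame rates quoted above (the percentages in this thread are rounded; this Python sketch just recomputes them, and the mode labels follow the groupings in the comment):

      # Frame rates quoted above: (Radeon 9800 Pro, GeForce FX 5900 Ultra)
      benches = {
          "Doom 3, Medium Quality": (68, 83),
          "Doom 3, High Quality":   (61, 55),
          "HL2, DX9 path":          (60, 30),
          "HL2, Mixed Mode":        (60, 47),
      }

      for name, (radeon, geforce) in benches.items():
          faster, slower = max(radeon, geforce), min(radeon, geforce)
          winner = "R9800 Pro" if radeon > geforce else "FX 5900 Ultra"
          # Prints ~22%, ~11%, 100%, and ~28%, which the comment rounds to 20/10/100/25%.
          print(f"{name}: {winner} ahead by {100 * (faster / slower - 1):.0f}%")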

    • Anonymous
    • 18 years ago

    They sign a deal with ATI, and then they criticize “Nvidia performance” at an ATI-sponsored event?
    ‘Nuff said.

      • GreenShrike
      • 18 years ago

      “‘Nuff said”? Really?

      So your implication is what exactly? That Valve is lying about benchmark results on a game that’ll be released just around the corner? If that were the case, every tech site would call them on it an hour after the game falls into reviewers’ hands. Valve would be the laughingstock of the gaming community.

      You simply can’t be serious.

        • dmitriylm
        • 18 years ago

        Yeah, the people at Valve are much stronger than that. Stupid accusations such as these need to be eliminated. Valve explained everything clearly enough that it’s easy to see just how badly nVidia is doing (although my Ti4200 still rocks for the money I paid!).

        • rogerw99
        • 18 years ago

        I agree with you on that point. Valve would most certainly become a laughingstock. And that could threaten to put them out of business. Therefore they could not risk anything but the truth. However, nVidia has been playing some VERY shady games (pun intended) with their drivers in order to hide this very weakness for quite some time. WHY??? More than likely the CEOs & investors pushed for new silicon long before it was ready. “IMMEDIATE PROFIT AT ALL COSTS!! SCREW WHAT THE CUSTOMER THINKS LATER WHEN THE TRUTH COMES OUT!! BY THEN WE’LL HAVE THE NEW SILICON AND THEY CAN SPEND EVEN MORE TO GET WHAT WORKS RIGHT!!” Obviously nVidia engineers could not keep up the furious pace to produce effective & efficient new silicon every 18-24 months. Sooner or later they’re going to have to bite the bullet and take more time for R&D in order to get it right. In the long run this guarantees the health, reputation, and ultimately the profitability of the company. As far as I’m concerned, nVidia has lost any credibility they had with me. And I think in many cases nVidia will take a very long time in regaining any trust in their customer base. It’s not like they did not know what was involved with DX9 when they started to develop the FX silicon. As a matter of fact, they had more time than ATI did. Any new core design should be able to run baseline software as it was designed to run. Not by fudging the quality and lying about the settings to fool the testers who are taking a closer look at things. I’m done with nVidia! This is one customer they won’t get back!

          • indeego
          • 18 years ago

          I love these ultimatums. You and everyone else would go back to graphics card maker X if they proved themselves the best. Look at how we all laughed at ATI’s attempts 2-3 years ago, a pathetic joke.

            • rogerw99
            • 18 years ago

            And nVidia is just as pathetic now. They pulled just about every trick in the book to hide the fact that their hardware is NOT fully DX9 capable. Even ATI wasn’t guilty of that. I’m not saying ATI is going to be the best there is forever or that nVidia won’t trounce everyone else later (big maybe). I am saying that nVidia broke trust with its entire customer base; they outright lied to each and every one of us. Would you buy fruit from the vendor who stole your wallet? I don’t think so. You would turn your back on him with your hand firmly covering your backside, and your replacement wallet, as you walked away determined never to do business with that thief again. That’s how I feel, and most likely a great many others as well. How many people do you know who can afford to throw away $400-$500 for a piece of hardware that doesn’t do what was promised? I certainly can’t, and nobody I know can either. “The Way It Was Meant To Be Played.” What a crock!!

            • PLASTIC SURGEON
            • 18 years ago

            So true. Which means Nvidia needs to stop bullshitting and lying and improve their FX chips. As for the NV36 and NV38, they will perform the same: same faulty shader engine… And for us to come back? IMPROVE! That’s the benefit of a free market: if the product is crap, I take my business and hard-earned dough elsewhere.

            • Moridin
            • 18 years ago

            ” Which means Nvidia needs to stop bullshitting and lying…”

            This is my real concern. If I know what a given card will do, there is still a chance I will select the lower-performing card for a variety of other reasons, but attempting to hide the real performance of their cards makes it impossible for me to even consider nVidia for a high-end or mid-range card.

            Things like reducing important IQ settings to levels below what I specify, encrypting drivers to hide what the card is actually doing (thus making it impossible to know if you are making a valid performance comparison), and raising image quality when a screen capture attempt is detected (thus making it impossible to compare IQ) are more damning, IMO, than mediocre performance, since they completely remove a card from consideration. Mediocre performance is only one strike against a card; price and system stability/reliability could still prompt me to buy a lesser-performing card.

          • My Johnson
          • 18 years ago

          Nvidia made the mistake of developing beyond the 9.0 spec and using 128-bit shaders. Oddly, when Valve did 16-bit for shaders, the Nvidia cards still kinda sucked.

      • Anonymous
      • 18 years ago

      Did you actually read the article here? If so, are you completely unaware of what has been demonstrated in recent months about nVidia?

      They won’t even discuss their top GPU’s architecture for crying out loud. It was only in the last couple of months that people began figuring out how many pixel pipes the thing had.

      They tried to sneak one by people, and it isn’t working. They were caught flat-footed when the R300’s came out, and haven’t been able to catch up yet, so they decided to play games and lie for the time being. They quickly hurried out new revisions of their utterly failed NV30, and tried to tide themselves over with them, the NV3x boards. That is all. Maybe their next generation will be OK. Then again, maybe not.

        • Anonymous
        • 18 years ago

        Until the NV40 comes out, the FX line is stunted like a dwarf. All NV3x chips use the same shader engine. Same pixel pipelines. Same vertex pipelines. Which is one fewer than ATI’s offerings.

      • Anonymous
      • 18 years ago

      Many of the guys at Valve are…

    • Anonymous
    • 18 years ago

    My 9500 Pro on a 2500+ outperforms my 5900 Ultra + 2.8C by a noticeable margin in the Halo leak. These HL2 numbers don’t surprise me in the least. As soon as you do anything with FP32 textures, or full-precision PS 2.0, NV35’s performance falls off rapidly.

    gamersdepot.com did a very interesting 5900U vs. 9800 Pro comparison a few days back with the new Tomb Raider and the Halo press version. The NV35 got molested.

      • 42
      • 18 years ago

      What Halo leak, sir? Pray tell more, please…

    • Anonymous
    • 18 years ago

    Woops – should have been HL2, not HF2.

    • Anonymous
    • 18 years ago

    Let me get this straight: For DOOM3 I will need a GFFX and for HF2 I will need a Radeon 9800. Doom3 benchmarks show the GFFX stomping the R9800 and HF2 benches show ATI at double the speed of the GFFX.

    There is one problem though – both games have yet to be released. Besides that, I get the feeling that Nvidia and id are in bed together, which is why id allowed sites to show benches of D3 even when ATI had no chance to work on their drivers… seems like ATI is trying to get the upper hand now by doing the same thing. D3 and HF2 will be huge games that sell a lot of high-end hardware, so my guess is that this is all a bunch of BS spewed forth by the respective marketing departments of each company. Is it really a surprise that a new piece of hardware doesn’t perform very well on a game that isn’t even finished?

    I’ll wait for HL2 to launch before I buy any new hardware – I doubt they will ship a product that doesn’t perform well on Nvidia hardware since that is the bulk of people buying their game. Maybe there is a problem with Nvidia hardware, possibly fixed through a driver update. I’m sure both companies will have fresh drivers for the holiday season so we’ll see in a few months. I think this is just a bunch of FUD with some crappy DX9 drivers thrown in the mix – of course I want to believe that since I dropped $400 on my GFFX 5900 last month! 🙂

    Jeff

      • ChangWang
      • 18 years ago

      Keep in mind that Doom 3 has a specific optimized path for NV3X hardware. If it uses the regular ARB2 path like the latest Radeons do, NV3x gets its arse handed to it. What Gabe has confirmed is what we already knew. The NV3x architecture is simply not efficient and has to be specifically coded for in order to get any decent performance out of it.

        • Anonymous
        • 18 years ago

        And in order for the FX chips to run efficiently, they have to do it at lower floating-point precision.

      • mattsteg
      • 18 years ago

      HL2 runs like crap on NV hardware because NV’s DX9 implementation really is looking more and more half-assed as more DX9 titles come out. Have you seen the Tomb Raider: AOD benches? More of the same. Sure, they’ll have something that might perform OK on NV cards, but it’ll likely be a quality/speed tradeoff. I really think you’re suffering from the same thing nVidia fans have been since before the 9700 was released: over-optimism regarding the NV3x series of cards. I can’t even count the number of people who said buying a 9700 Pro over a year ago was a bad idea, but it’s been by far the best high-end video hardware in recent times – it still rocks hard, and beats nVidia’s newest in a not-insignificant number of tests.

      • Anonymous
      • 18 years ago

      Go re-read Carmack’s commentary on Doom3+NV3*. He’s not entirely pleased with all the work he’s having to put into the NV3X backend to keep it competitive, and he notes that NV3X using the generic ARB2 back end is very slow.

      GFFX owning Doom3 has nothing to do with the hardware and the drivers, and everything to do with a big company paying Carmack enough to make it worth his while to make proper cheats^H^H^H^H^Hoptimizations.

        • Anonymous
        • 18 years ago

        Carmack is so rich he trades Ferraris the way most people change clothes, and he’s funding his own minor space program, for fuck’s sake. I don’t think NVidia is paying him to do the optimizations. So why is he doing it? Because NVidia has a huge installed base, and he wants his game to be at least playable on the widest range of hardware possible.

          • JustAnEngineer
          • 18 years ago

          That’s a good point. If you had made it without profanity, it would not be filtered out for most readers. Can you edit an anonymous message?

          The shadow stencil technique that Carmack uses extensively in Doom3 is one of the things that the GeForceFX architecture does quite well. It is irksome that NVidia claimed that NV3x would do conventional color+Z rendering at the speed that it can do stencils.

    • absinthexl
    • 18 years ago

    Keep in mind that this is at 1024×768 with no AF/AA. Hopefully it’ll still be playable on my 9500 Pro with both enabled…

    • Khujo
    • 18 years ago

    They did clearly say that they signed a deal with ATI because the hardware was good, not that they had good results because of the deal with ATi. Seeing every other DX9 test show similar results leaves no reason to think otherwise. ATi has simply created a good architecture that doesn’t have to take shortcuts to get the proper image on the screen at playable framerates. The 9600 Pro outperforming the 5900 Ultra was surprising, but when both are four-pipeline chips and the ATi shaders are so much stronger, I guess it does make sense. It’s sad that nVidia’s latest and greatest should be considered DX8 hardware.

    • Anonymous
    • 18 years ago

    Keep in mind that this was an ATI-sponsored event… not that I question the validity of the remarks, but there is a difference between a glass that is half-full or half-empty or… half-assed… or were they all the same?

    Audiatur et altera pars; let’s hear a counterstatement before we break the camel’s back with another straw…

    Cheers.

    MS

      • Anonymous
      • 18 years ago

      Here’s my take on a counter-statement: “Unh uh.”

      • Anonymous
      • 18 years ago

      These statements fall in line with ALL shader game results from Nvidia: Splinter Cell, Unreal 2003 in shader-intense scenes, Comanche 4, Tomb Raider, HL2. As for synthetic benchmarks, 3DMark does not seem so wrong now. ShaderMark also.

    • Rousterfar
    • 18 years ago

    Wow…

    I do wish Valve had chosen a non-ATI-sponsored event to say all this; it’s going to cause a lot of people to cry foul. Just when the Nvidia vs. ATI thing was starting to die down, too.

      • GreenShrike
      • 18 years ago


    • danny e.
    • 18 years ago

    Looks like a Radeon 9600 will be sufficient for HL2. That’s nice… since I don’t have the $$ for a 9800 or 9900 yet. With DOOM3 delayed till Q1 2004 at the soonest… it doesn’t make sense to buy a 9800 or faster now.

      • JustAnEngineer
      • 18 years ago

      My Radeon 9700 Pro is over a year old. It looks like it will be a good card for a while longer.

      At $249 (including DVI-VGA adapter for dual-CRT Hydravision), a Radeon 9800 (non-pro) 128MB board looks like a powerful performer… especially when you contemplate the $429 pricetag for a GeForceFX 5900 Ultra.
