How ATI’s drivers ‘optimize’ Quake III

MOST OF YOU ARE probably familiar by now with the controversy surrounding the current drivers for ATI’s Radeon 8500 card. It’s become quite clear, thanks to this article at the HardOCP, that ATI is “optimizing” for better performance in Quake III Arena—and, most importantly, for the Quake III timedemo benchmarks that hardware review sites like us use to evaluate 3D cards. Kyle Bennett at the HardOCP found that replacing every instance of “quake” with “quack” in the Quake III executable changed the Radeon 8500’s performance in the game substantially.

The folks at 3DCenter.de followed Kyle’s trail and discovered that, on the Radeon 8500, “Quack 3” produces much better image quality—texture quality in particular—than Quake III. The FiringSquad observed the same behavior, only they did so in English.

With the publication of these articles, it became a matter of public record that ATI was intentionally sacrificing image quality in Quake III for better benchmark scores. The issue, as far as I was concerned, was settled: ATI was busted.

Only a couple of questions remained: What did ATI have to say for themselves? And how exactly did they implement their cheat? Now we have answers for both.

ATI’s story
Yesterday, the FiringSquad asked ATI for their side of the story. ATI’s response was interesting. I suggest you go read through the whole interview at the FiringSquad if you haven’t already. It’s all worth reading.

I’d like to focus on one particular bit of ATI’s explanation of the situation. It reads like so:

Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users to disable them. The current RADEON 8500 driver revision has an issue that prevents it from correctly interpreting the texture quality slider setting in Quake 3. This issue will be corrected in the next driver release.

Note that the texture quality setting is just one of many possible ways that users can increase image quality in Quake 3 at the expense of performance; forcing on anisotropic filtering or disabling texture compression are alternative methods. It is also important to note that the image quality obtained using all of the standard image settings in Quake 3 (fastest, fast, normal, and high quality) can be observed to be equal to or better than any competing product (try it!); it is only in the special case of selecting “high quality” AND turning the texture quality slider up to the highest setting that a potential discrepancy appears.

Like some of ATI’s previous PR statements, this answer is packed with tricky twists and turns: reader beware. Truth be told, ATI is doing something more than simply misinterpreting the texture quality slider setting. After a little digging, we’ve zeroed in on what they’re doing.

The cheat in action
First, some background. Like Kyle, I was able to modify the Quake III executable to purge it of all instances of the word “quake”. In my case, I simply used a hex editor’s search-and-replace function to replace all instances of “uake” with “uaff”. The result? My own hot new game: quaff3.exe.
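For the curious, the binary patch is easy to reproduce. Here’s a minimal sketch in Python (my own illustration of the technique, not part of the original testing procedure); the one rule to respect is that the replacement must be the same length as the search string, so no offsets inside the executable shift:

```python
# Sketch of the rename trick: swap "uake" for "uaff" everywhere in the
# binary. Both strings are 4 bytes, so the executable's layout is
# untouched, and matching on "uake" catches both "Quake" and "quake".
with open("quake3.exe", "rb") as f:
    data = f.read()

patched = data.replace(b"uake", b"uaff")
print(f"Replaced {data.count(b'uake')} occurrences")

with open("quaff3.exe", "wb") as f:
    f.write(patched)
```

Any hex editor’s search-and-replace will do the same job; the script just makes the length-preserving constraint explicit.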

I tested quaff3.exe with the Radeon 6.13.3276 drivers on the following setup:

Processor: AMD Athlon XP 1800+ – 1.53GHz on a 266MHz (DDR) bus
Motherboard: Gigabyte GA7-DX
Memory: 256MB PC2100 DDR SDRAM
Audio: Creative SoundBlaster Live!
Storage: IBM 75GXP 30.5GB 7200RPM ATA/100 hard drive
OS: Windows XP Professional

Running quaff3.exe with the Radeon 8500 produces quite decent image quality. Like so:


How Quake III ought to look on a Radeon 8500

Running the original quake3.exe isn’t nearly as pretty. Check it out:


How Quake III looks with ATI’s cheats

It’s not hard to see why the Radeon 8500 produces better benchmarks with this driver “issue” doing its thing. The amount of detailed texture data the card has to throw around is much lower: each step down the mip chain quarters the texel count, so pushing mip transitions even two levels closer to the viewer can cut the detailed texture traffic by a factor of sixteen.

 

How the cheat works
The big question has been this: How was ATI able to produce textures that looked a little worse than the standard “high quality” textures, but in close-up detail screenshots, still looked better than the next notch down the slider?

The answer: They are futzing with the mip map level-of-detail settings in the card’s driver whenever the Quake III executable is running. Mip maps—pre-scaled, lower-resolution versions of textures used to avoid texture shimmer when textured objects are far away—are everywhere; the card picks between them (and blends them) as a routine part of texture filtering. ATI is simply playing with them. When the quake3.exe executable is detected, the Radeon 8500 drivers radically alter the card’s mip map level-of-detail settings.
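In application terms, the effect is the same as sliding the mip map LOD bias sharply toward the viewer whenever a particular executable name shows up. The sketch below is purely illustrative: it is not ATI’s driver code, and the detection rule and bias value are my own assumptions for the sake of the example.

```python
# Illustrative pseudocode of executable-keyed LOD biasing. The function
# name, detection rule, and bias amount are assumptions, not ATI's code.
def effective_lod_bias(exe_name: str, user_bias: float) -> float:
    if exe_name.lower() == "quake3.exe":
        # A positive LOD bias selects smaller (blurrier) mip levels
        # sooner, trading texture quality for memory bandwidth.
        return user_bias + 2.0
    return user_bias  # every other application gets what it asked for
```

Under a scheme like this, renaming the executable to quaff3.exe falls through to the default path, which is exactly the behavior the screenshots show.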

To show you what I mean, I’ve used Quake III’s “r_colormiplevels” feature, which colorizes the various mip maps in an image in order to show us what’s happening. Here’s how the mip maps look with quaff3.exe:


Quake III’s standard mip maps colored for visibility
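(To reproduce shots like these yourself, pull down the console and enter “r_colormiplevels 1”; the cvar is latched, so follow it with “vid_restart” for the change to take effect, and set it back to 0 when you’re done.)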

That is as expected; the GeForce3 produces very similar results. Now, here are the mip maps in quake3.exe, when ATI’s “optimizations” are in full effect:


Quake III colored mip maps with ATI’s cheats

ATI has moved the threshold for mip mapping so close to the user’s point of view that mip maps overwhelm the entire image. Even the counters reading “100” at the bottom of the screen are mip maps instead of full-quality textures (notice how they are tinted red). In fact, when Quake III is loading a level with ATI’s “optimizations,” even the game’s static splash screens turn red when r_colormiplevels is enabled.

This is something different from the simple misinterpretation of a texture quality slider. To establish that fact, I’ve included screenshots from quake3.exe and quaff3.exe for the Radeon 8500, plus comparative GeForce3 shots, on the next three pages. You can see for yourself what effect the slider has with each card at each of the game’s four texture quality settings.

 


The Radeon in Quake III, high quality textures


The Radeon in Quake III, medium quality textures


The Radeon in Quake III, low quality textures


The Radeon in Quake III, lowest quality textures

 


The Radeon in “Quaff III,” high quality textures


The Radeon in “Quaff III,” medium quality textures


The Radeon in “Quaff III,” low quality textures


The Radeon in “Quaff III,” lowest quality textures

 


The GeForce3 in Quake III, high quality textures


The GeForce3 in Quake III, medium quality textures


The GeForce3 in Quake III, low quality textures


The GeForce3 in Quake III, lowest quality textures

Conclusions
As you can see, mip mapping is usually affected by texture quality settings, but only a little bit. ATI’s hyper-aggressive “optimization” of mip maps goes well beyond that. Also, quake3.exe shows more aggressive mip mapping than quaff3.exe with at least three of Quake III’s four texture quality settings: high, medium, and low.

(It’s also worth noting that the Radeon 8500’s mip maps are bounded by two lines that intersect at the middle of the screen. The GeForce3’s mip map boundaries come in a smooth arc determined by their distance from the user’s point of view. But I’m getting ahead of myself.)

I find it very difficult to believe that what ATI was doing here wasn’t 100% intentional. Judge for yourself, but personally, I find the evidence mighty compelling. 

Comments closed
    • Anonymous
    • 18 years ago

    F**k up?!?! LOL????

    Quaff works perfectly and Quake works like shit… F**k up my a$$! How blind are you? I am beginning to believe that ATi really does have special eye drops for its customers so they can see it all well…

    • Anonymous
    • 18 years ago

    I agree, why didn’t they go to nvidia for the Detonator 4 problem? Nvidia worked 6 months to get their latest drivers to even use some of the new features on the geforce 3/4 {Ex: the new memory feature}. Let’s say ATI did cheat; what of it? I say good for them. Nvidia delayed the D4 to spoil the radeon 8500 release, so why can’t they fight fire with fire?
    Give ATI 6 months and let’s see what type of driver support they give for the card. And why all this over just one outdated game?
    Yes, quake is used for benching, but ppl are starting to use other games like aquanox. Why don’t you test it on that and see if they need to cheat there??

    • Anonymous
    • 18 years ago

    Well, why didn’t ppl go to nvidia and ask them why they delayed the Detonator 4 release?? ATI had no choice in the matter; if nvidia wants to play unfair, then let ATI play unfair. All is fair in love and war.

    • Anonymous
    • 18 years ago

    Actually, I don’t really fault ATI or NVidia or any other card company. This crap goes on because so-called reviewers
    think that running the Q3 benchmark is the only way to test a video card. There are many things a video card has to do right, not just Q3 or 3dmark. Many times I’ve seen posts from people happy as can be that with the new drivers their 3dmark score went up 50 points; what you don’t read about is that other apps stopped working properly. Oh, but who cares? It still gets a great score on…

    • Anonymous
    • 18 years ago

    The stability of ATI drivers has been dropping for years now. Now they are cheating in their drivers!

    ATI drivers SUCK!

    • IntelMole
    • 18 years ago

    And #122, it’s not only that, it’s the principle…

    The simple truth is that ATI have cheated, and as the GamesMaster (C4, a few years back; it ran for 7 series) once said: “Cheating is not allowed!”

    So go play your Civilisations or whatever on your Voodoo 3500, see if we care,
    IntelMole

    • IntelMole
    • 18 years ago

    Okay #122, note one thing…

    No-one here plays in 640×480 or 800×600 unless they have a reason… the only reason I play that is cos my system is crap… and the difference this makes would be your “1 or 2 percent”

    Oh no, they play in 1024×768, or 1280×1024, or even 1600×1200

    Why? Because they can, because it looks nicer, and also because it means they’re getting their money’s worth on the card…

    At 1024×768+, this cheat will make a huuuuuuuge difference, we’re not talking a few percentage points, we’re talking 5-10 frames/second…

    Think about it, at that resolution, the primary bottleneck is memory bandwidth… 64MB DDR simply cannot move stuff much quicker… so, what do they do? They maul the textures so they look like a dog’s week-old dinner and then they take up far less space… so the card can push more through the memory…

    Result! Instant cheat, and 5-10FPS or more on a system doing 25FPS @ 1280×1024… that’s about 20-40% increase…

    See where I’m coming from?
    IntelMole

    • Anonymous
    • 18 years ago

    I’m glad someone is on their toes and holding ATI’s feet to the fire. I am upgrading my Geforce 2 in the very near future and I am waiting for ATI to sort out their exposed assets. In other words, they don’t think their card is comparable without cheating?


    • Anonymous
    • 18 years ago

    nobody I know who has the card (ie. people who have an idea how to set up a pc) has any trouble with theirs… excellent quality at blazing speeds.
    ——————
    So you’re saying the Tom’s Hardware review was bullshyte, and that the instability experienced with the Radeon was because Lars Weinand doesn’t know how to set his card up properly?

    • Anonymous
    • 18 years ago

    All we need is 3dfx back .. 🙂

    • superchode
    • 18 years ago

    saber, with 9 driver releases since release, including 3 for XP – I wouldn’t be worried about driver updates.

    nobody I know who has the card (ie. people who have an idea how to set up a pc) has any trouble with theirs… excellent quality at blazing speeds.

    in fact, why don’t you get here to pick up two – and you can put one under my christmas tree.


    • Anonymous
    • 18 years ago

    AG 122,

    Uh, you must be talking about YOUR lack of a life.

    I’m happily married and quite successful, thanks.

    Dick.

    • Anonymous
    • 18 years ago

    And nobody was talking to you, anyway.
    ———
    Then keep the off-topic chats private.


    • Anonymous
    • 18 years ago

    #122, LOL . . . I’m running a Matrox G400 right now, since I’d feel like I was cheating on her if I switched to the GeForce2 Pro lying around.

    Uhh . . . and I’m not doing so well in this ‘life’ game. Relationships bunged up, grades going down the toilet . . . where’s the save/load feature, mang?? The one thing I can count on is my baby . . . aahh . . .

    /me hugs his InWin Q500

    • Pete
    • 18 years ago

    124, instead of spouting useless criticism, make it constructive and specify what you consider better sources of news. Saying I watch CNN doesn’t mean I trust them to report the truth, the whole truth, and nothing but, so help them God–it just means they’re the most convenient source. I suppose CNN’s left-wing slant is balanced by Fox’s right-wing pandering–whoop-de-doo.


    • Anonymous
    • 18 years ago

    Geez are you guys still using Quake to measure with? This is so 20th century. Try UT and get with it…

    And now that the flame bait is trolling in the water. I’ll get real.
    I’m running a 3dfx 3500 which works just fine for me. Sure, some of you seem to want the highest FPS rates and worry about every little 1 and 2 percent increment, so naturally companies cheat on the benchmark. For a change, why not recognize that unless you’re trying to hit some mystic epilepsy-producing frame rate that only you know about, all of these cards are Good Enough! Try different games and different results will occur. Keep standardising on one damn benchmark and all you get is optimization to that mark. And who knows, some of you fan boys just might be pleasantly surprised to find that some of those other games are seriously cool too. UT rocks, and Serious Sam is simply too damn funny to take serious. Good game of Mech Warrior, anyone! How about making your mind work with a game of Civ3 or Alpha Centauri. Some things are a good release, but you can exercise your brains as well as your trigger fingers too.
    And for those of you getting really schizo over this card or that card’s realtime frame rates, try looking up and a little to your left. That’s called a window, and the frame rates out there are damn near seamless. Plus if you go out the door over there, take a shower and dress nice, you might even discover something called girls. Talk about 3d action ;>)
    In short, get lives you lamers. There’s bigger sh*t going on than this tempest in a teapot.

    • TargetBoy
    • 18 years ago

    AG101, I appreciate the advice, but I have a Geforce2 MX right now, not an ultra. The TNT2Ultra is the card in my Linux machine. A Geforce3 ti200 would certainly give me a performance boost over my MX.

    • Anonymous
    • 18 years ago

    Yeppers, can’t live a day without communicating with someone’s disembodied virtual head. w00t.
    —————–
    LOL

    Anybody know if the eyes blink on those models? It would be kinda goofy to have ’em blink randomly, but it would be even worse if they didn’t blink at all . . .
    —————
    I never even thought of that. And now I’m really interested to find out. Anyone tried this must-have feature on their matrox and care to report?

    • Pete
    • 18 years ago

    Trident Troll,

    I didn’t think I could get more sick than knowing that there were people willing to die for bin Laden. It’s funny how I always looked at the dark points in history (most recently, Nazis and Communists) as something we’ve overcome…

    Why haven’t I heard this on CNN/etc.?

    • Pete
    • 18 years ago

    113, ATi bought the company (ArtX) that designed the GC video core (Flipper). Its architecture is not really comparable to the Radeon, so there won’t be any cross-development there.


    • cRock
    • 18 years ago

    Matrox baffles me. The G400 core has a lot going for it at this point. Dual head rocks, the drivers are mature, and the image quality is top notch. If you want solid open source drivers, Matrox is THE choice on Linux or FreeBSD. Yet they cripple their cards with fscking 64-bit DDR RAM. This makes no bloody sense at all. With more memory bandwidth, a G400 won’t beat a Geforce, but it would render Quake 3 fast enough and be the PERFECT balance between work and play. The cards would go flying off the shelves. Instead, Matrox cripples the cards and pitches headcasting BS. To make matters worse, they cancel the MAX cards. I’ve never seen a company whose new products are outperformed by their old ones. I hope Matrox releases a new part soon. They have so much potential to fill a vital market niche, but right now they’re slightly missing the mark.

    Blame Canada!


    • Anonymous
    • 18 years ago

    [q]All you need to know is that Microsofts X-Box is going for the Geforce[/q]

    Aren’t Nintendo using an ATi graphics chip in the GameCube?

    Or am I thinking of something else here?

    • IntelMole
    • 18 years ago

    lol Now I’m stuck…

    Building a new PC is daunting enuff for me already, and any1 who’s read my topic on building a new 1 cos “ur all smarter than me” will know this… Fact is, I’m very likely to forget something, then get it and blow it up accidentally 🙂

    Anyway, the point is, I was going to get an All-In-Wonder card, but by the time I’m gonna get it it’ll be a majorly budget card. Hell it is already.

    Anyway, so I’m deciding on a new graphics card, I need one that can do the following:

    TV Tuner,
    DVD,
    Amazing Cutting Edge Graphics,
    and everything else I want

    Any ideas people?

    Oh yeah, it needs to be cheap as well 🙂


    • Anonymous
    • 18 years ago

    Forge, I think he means that Microsoft went for the Geforce architecture in the Xbox as opposed to going with another chip.

    • Forge
    • 18 years ago

    How can Xbox be going after the GeForce, when they’re both made by Nvidia? I think you meant the Xbox is going after the PC.


    • Anonymous
    • 18 years ago

    Yeppers, can’t live a day without communicating with someone’s disembodied virtual head. w00t.

    Anybody know if the eyes blink on those models? It would be kinda goofy to have ’em blink randomly, but it would be even worse if they didn’t blink at all . . .

    • Anonymous
    • 18 years ago

    #99, no I haven’t . . . u have a link to it somewhere?

    • Aphasia
    • 18 years ago

    Well well well, most AGs seem to play q3 and nothing else, no work, no dvd… then by all means buy nvidia. But stop being annoying fanboys over something that has been blown totally out of proportion.

    I for one am waiting to see if matrox announces something new before next year, when I will buy a new comp. If not, then it’s ATI, cause nvidia can’t put out a decent 3d card that has dualhead. Only a few crappy mx cards. But that is just me, and I want dualhead, and by that I mean real dualhead, not two-card dualhead. Which can be annoying as hell sometimes.

    But still, ATI shouldn’t have pulled stunts like this; it’s better to lose in that test than get all this.

    And yes, we need both ATI and Nvidia in the market right now.
    For the price that an 8500 can be had for, it’s a damn good deal.

    cheers


    • Anonymous
    • 18 years ago

    Trident Troll post#99

    you got a link?




    • Anonymous
    • 18 years ago

    If you’re into frame rates in Quake3 (or Quak3, or Quaff3, or WTF ever), then buy a nVidia card (…at inflated prices ’cause nVidia loves taking gamers’ hard-earned green).

    For everything else get a Radeon.

    _______________
    It’s good to see that I can leave this site for a week or so, come back, and see the same old argument still going strong…

    Get over it …it’s just a fscking video card…

    • Forge
    • 18 years ago

    I’m not sure what crack rock those Chinese guys are smoking, but that very definitely is NOT typical Radeon 8500LE DVD out. Looks like they’re using the old Xing player on a 16-bit desktop with overlay support forced off. I’m calling bull on it.

    • Anonymous
    • 18 years ago

    [quote]I think one of the reasons that a lot of NVIDIA users might be aggravated by this would be the reason I am: I was strongly considering getting the 8500 as my next video card. All I have to do is think about how pissed off I’d be if I’d gotten screwed by more ATI driver bullshit and how frustrated I am to now not really have any choice other than NVIDIA. [/quote]

    TargetBoy, you have mirrored my feelings exactly as an nVidia user. I truly want competition, I seriously considered buying an R8500, but what annoys me more than anything else is that ATI have taken something that I had hoped would be good, and tainted it.

    Now I’m left thinking what to get. Stick with good old nVidia, who has delivered nothing but solid performing products ever since the TNT1 (my first nVidia card was a Canopus Spectra 2500 and I’m still amazed to this day how well it runs when within its design parameters of up to 800x600x32), right through to my current GF3. I have owned 3DFX, ATI, Matrox and PowerVR cards in that time-frame too, and in all cases nVidia driver support was there to fix a problem that I had within days or at worst weeks (on the rare occasion I did have a problem). In all cases I was irked by either the lack of driver support or game compatibility by the other vendors.

    You see, I am a swinging hardware buyer. I buy whatever I feel will serve me the best at a particular point in time. I only feel allegiance or affinity to any particular manufacturer if they have earned it. Right now nVidia have earned a lot in terms of offering the consumer an excellent line of products (aside from their back-door business policies). I feel that I can always trust on nVidia to deliver a solid performing video card for my enthusiast needs.

    When I looked forward to the R8500 to fulfill those needs for my next upgrade, I found that all my past bad experiences with owning an ATI card are still there in their gory detail (bodgy driver support).

    How do you think this makes me feel when I consider buying an ATI product? If you say concerned, you’d be right. If you say scared, you may not be far from the truth either.

    ATI did this all by themselves. All you people blaming nVidia need to wake up and realise that nVidia didn’t make ATI employ idiots for their driver team, who in the end created an atmosphere of doubt around an otherwise excellently engineered hardware product all by themselves.



    • Anonymous
    • 18 years ago

    ATi, you are the weakest link. Goodbye!

    As a Canadian, I’m proud of Matrox, but would rather sweep ATi under the rug. I just don’t get how anyone can be a fan.. like, they’re riding on their past success, and buying their way to keep afloat and stay competitive with NVIDIA. Like, they bought ArtX for a console rig / motherboard (that never happened), bought FireGL for a highend product, what’s next? Buying NVIDIA boards to stay competitive with NVIDIA?

    I love NVIDIA. They went from rags to riches, and it was all their own doing. Glad to see a success story like that.

    • superchode
    • 18 years ago

    Oh yeah… I’m in Canada so the pricing is a little different.

    Buying local:

    8500 – approx. $350 CAN

    Ti500 – approx. $575 CAN

    And the 8500 has VIVO and hardware dvd decoding…. add picture quality and the fact that radeon drivers have been nothing but excellent for me in the past and it’s an easy choice.

    Please don’t take it personally.

    • superchode
    • 18 years ago

    [q]moron… idiot…. your mom[/q]

    gerbil #78, if you’re going to make personal attacks against me, at least have the balls to register.

    [q]test/critique/investigate drivers[/q]

    my point is exactly that… how can you possibly critique drivers if you can’t test & investigate performance… you need a card for this. All I was saying is that without the card you have no basis to judge radeon drivers.

    Facts:

    8500 has been out for approximately 5 weeks and there have been a total of 9 driver releases/updates… 3 for XP, 2 for 2k, and 4 for 98. They are definitely not sitting on their asses.

    [q]and what is “shifty timing of driver releases” anyway? im sure anytime nvidia releases drivers its shifty in your opinion[/q]

    it appears I was overestimating tech-report’s readership… you haven’t justified buying from nvidia, you haven’t had to… ignorance is bliss.

    Again… all I’m saying is that if you don’t know shit about the radeon driver’s performance – don’t go posting ATI’s drivers suck.

    • TargetBoy
    • 18 years ago

    Dim, you are listing some of the reasons that were going into my decision to look at ATI instead of NVIDIA. I really do want to see ATI do drivers right. I don’t want to see NVIDIA become another Creative Labs, in fact I am hoping that NVIDIA will enter their marketplace with the sound technology from the nForce and provide competition in a market segment that desperately needs it.

    So, maybe next year’s card will be an ATI, if they can do the right thing. This year’s will probably be a ti200. Damn strange to consign my Hercules TNT2Ultra to the spare parts drawer, once I put my current card in the machine that one is in right now.

    • Anonymous
    • 18 years ago

    #80

    I own an ATi Radeon LE (a downtuned, overclockable like hell radeon DDR, basically) and the damn thing is working fine fine fine. Has it ever crashed in any game I played? No, not once. It has amazing 2D quality and DVD playback.

    Driver issues? I (that’s ME) haven’t seen any of these horrible drivers. I’m pretty aware that I may not be getting the best framerates because they are holding back the hardware, but hell, I’m getting 80 fps in most games!

    So yeah, there are satisfied ATi customers.


    • TargetBoy
    • 18 years ago

    BTW, does anyone have any experience with ATI’s linux support? One of the things I appreciate from NVIDIA are reasonably workable linux drivers that support 2D and 3D. How does ATI stack up here?

    • TargetBoy
    • 18 years ago

    dim, last I recall when NVIDIA tweaked their OpenGL for Quake, they worked on actually optimizing the drivers so they ran faster, not so the image quality was worse. Any games that used the set of calls that quake did would run faster. Too bad Unreal was never a good comparison, since it was so bound up in 3dfx’s hardware architecture.

    Everything else you mention is a normal part of business. Accentuate your strong points and draw focus away from your weak. Put as much product on the market as you can afford to weaken your competitors. Didn’t seem to hurt NVIDIA’s profits much, for being “almost below cost”. BTW, that can also be seen as volume pricing, which is one of the most standard business practices around. Charge less, make less of a profit per unit, but sell more units to make up for it.

    I think one of the reasons that a lot of NVIDIA users might be aggravated by this would be the reason I am: I was strongly considering getting the 8500 as my next video card. All I have to do is think about how pissed off I’d be if I’d gotten screwed by more ATI driver bullshit and how frustrated I am to now not really have any choice other than NVIDIA.

    3dfx committed suicide by ignoring what the market place wanted. NVIDIA helped them along by increasing what desire was already there. 32-bit color is great, I use it on all my single player games. I hated the fact that 3dfx refused to recognize the utility of 32-bit color because of the huge difference in quality it makes. On the flip side, I agreed with NVIDIA’s assessment of FSAA. Until recently there hasn’t been a card that did FSAA fast enough for me to use in single player at 32-bit color at resolutions over 800×600.

    I don’t hate ATI, I just can’t trust them. I’ve known several people who have been screwed by their horrible driver support in the past and I refuse to be caught in the same trap. They have shown, yet again, that they just don’t get it. Until they show that they can produce consistent drivers without screwing around, I will have no interest in doing business with them, because there is a company available that produces good hardware with good drivers and none of the bullshit.

    • Pete
    • 18 years ago

    FiringSquad is singing the praises of the 2D quality of the Leadtek Winfast GF3 Ti500 (saying it’s superior to Matrox at 16×12), as have some people at Ars.

    Still, ATi is the only high-end card to offer dual outputs and the best DVD playback (has anyone disputed that Shortbread link you had of a Chinese site showing ATi to have better DVD quality than nVidia?).

    78–at least 74 is “spewing” first-hand “nonsense.” =p

    • Ryu Connor
    • 18 years ago

    [q]Let’s compare:[/q]

    Okay.

    [q]Ati radeon 8500:
    -$250[/q]

    GeForce 3 Ti200 = $175 and just as fast if not faster.

    [q]-Fast 3D (HyperZ II etc..)[/q]

    I suppose that’s fair.

    [q]-On-board DVD Decoding[/q]

    Yup.

    [q]-Hardware Dual Monitor[/q]

    Yes and no. Still no multiple resolutions and depths per monitor. Still no hardware 3D or overlay support on the second monitor.

    [q]-HDTV Support[/q]

    The GeForce3 has this as well.

    [q]-Smoothvision (available in the latest beta drivers)[/q]

    HRAA is available on the GF3 including Quincunx.

    [q]-Best 2D visuals[/q]

    If you mean most production consistent 2D visuals, I suppose that’s true.

    [q]-Drivers where performance is only going to go UP![/q]

    Well, the fixing of Quake 3 will likely cause performance to fall. 🙂 And yes, the drivers could get worse.

    [q]Geforce 3 Ti500
    -$399[/q]

    $320 actually.

    [q]-Fast 3D (Hardware T&L etc..)[/q]

    The fastest actually.

    [q]Drivers with MAXED out performance[/q]

    And that’s a downside how exactly? And it seems that performance is maturing some more as the Detonator 4’s progress farther into their life.

    [q]Hrmmm which would you choose?[/q]

    The GF3 Ti200. I would choose the 500 if I felt the need to piss on ATI’s grave.


    • sativa
    • 18 years ago

    superchode has a good point about people who don’t own ati cards bitching about the drivers.

    i honestly don’t see that many ati owners ever whining about it in comparison to the number of nvidia owners who whine about a product they have never used.

    but ati still sucks the big one for this whole fiasco

    • sativa
    • 18 years ago

    dissonance, he meant between cards that can actually play games

    • Anonymous
    • 18 years ago

    #74 (moron)

    [q]If you don’t own the card – STFU about the drivers. [/q]

    so if we don’t own the card we can’t test/critique/investigate drivers? if there isn’t any questionable activity going on then you have nothing to worry about, right?

    [q]I don’t endorse what they did with the benchmark… but I don’t really care that much… [/q]

    well that’s cause you’re an idiot

    [q]just like everyone with a nVidia card doesn’t care that much about nVidia’s series of strong-arm tactics and shifty timing of driver releases. [/q]

    you’re just mad cause you (or your mom) spent money on a crap video card… and what is “shifty timing of driver releases” anyway? I’m sure anytime nvidia releases drivers it’s shifty in your opinion… that’s cause you’re used to ati never releasing any… ahahahahaha

    [q]Fact is – the card is more feature rich and at least as fast as a GF3 with ATI’s current drivers [/q]

    please go read a review of your card compared to a Geforce 3 and come back when you accept the truth.

    otherwise stop spewing out nonsense.


    • Anonymous
    • 18 years ago

    frankly I’m pissed off, I was ready to get an ati card. I’ve switched between ati and nvidia since the TNT. after the tnt I bought an ATI MAXX cause it had 64 megs of ram, but that card’s drivers blew. it was the most laggy in framerates that I’ve ever seen. so I went back to nvidia with a geforce 2mx. that card was awesome, no driver problems and no compatibility problems. Nvidia is great, but I like to try different companies often. but this new fiasco with the 8500 makes me not want to try ATI ever again. Tell ATI they did a great job on the marketing department *cough* *cough* deception department

    • Anonymous
    • 18 years ago

    #68

    [q] nVidia spent a lot of money trying to produce a chip to compete with the Radeon 8500 the way it should be, then found out the chip wasn’t quite as powerful, you can bet they’d do whatever they could to give that chip perceived value just to make money on the R&D. [/q]

    actually that’s what R&D is for: to research and develop a decent product.

    • superchode
    • 18 years ago

    docco has a point, and the AG who’s trying to flame him about drivers DOESN’T have an 8500 (or any radeon, I’d bet) – and apparently he doesn’t realize that docco is USING the card – ie. using the drivers… and he’s still pimping his 8500 over his old GF3….

    I’m so VERY sick of people who don’t own radeons complaining about the drivers… I’ve had a radeon DDR for some time and have only had ONE single issue with drivers…. NFS5 wouldn’t run – I changed one D3d setting and it worked…. OH the SUFFERING!!!

    If you don’t own the card – STFU about the drivers.

    I don’t endorse what they did with the benchmark… but I don’t really care that much… just like everyone with a nVidia card doesn’t care that much about nVidia’s series of strong-arm tactics and shifty timing of driver releases.

    Fact is – the card is more feature rich and at least as fast as a GF3 with ATI’s current drivers (which, if you’re watching, are being updated every few weeks), all for less money… ie. it’s the card I want.

    • Anonymous
    • 18 years ago

    get your hands off my video card you damn stinking ape!!

    • Anonymous
    • 18 years ago

    I just wish all those 3rd party ATI manufacturers would pick up the Radeon chips and actually make cards:) would be nice to get Asus or Hercules’ driver departments behind these things…

    Later


    • Anonymous
    • 18 years ago

    -Best 2D visuals
    -Drivers where performance is only going to go UP!
    —————–
    Oh, and u forgot the EightyEye bonus consolation prize:

    Shittest drivers on planet earth. Next!




    • Anonymous
    • 18 years ago

    Isn’t the point that ATi are deliberately modifying the driver’s behavior for Q3 timedemos, in order to lure the quake-engine gamer market into buying their cards?

    It’s like buying a fast car only to discover that it’s actually only as fast as they said it would be on one particular road.

    • TargetBoy
    • 18 years ago

    Starfox,

    The last I knew, when NVIDIA couldn’t compete on being the top-dog for speed, they competed by providing compelling features and a highly aggressive deployment schedule that frequently bought them the speed to compete for the top performance spot.

    I have no sympathy for ATI if they cannot compete without resorting to deceptive practices. Sacrificing image quality for speed is a design decision if it is made across the board; otherwise it is akin to fraud (and may actually be, but I’m not a lawyer).

    The Radeon 8500 was at a price point where it was an attractive option even if it wasn’t faster than the Geforce 3 ti500. The only major reservation that was really left was ATI’s poor driver history, which they had been making strides in reversing this year.

    Until they can prove beyond a shadow of a doubt that they are fully committed to producing and updating quality drivers across platforms, I have no further interest in their products. Before this, they at least had to make a good show of it. Now they will have to be saints before I will trust them again.

    • noko
    • 18 years ago

    Well I was touring the local Best Buy and had in my hands a Radeon 8500, they had like 5 of them. No GF3 TI’s (all sold out). I almost bought it. Well maybe another day when I know the drivers are up to snuff and know that Smoothvision really works and can make a significant difference.

    • Anonymous
    • 18 years ago

    and he has the typo no less…
    why WHO would cheat? WHO?



    • Anonymous
    • 18 years ago

    I am happy when I see pretty colors on my screen. You know what I mean?

    • Anonymous
    • 18 years ago

    56

    Go back under your bridge and don’t come out.


    • Anonymous
    • 18 years ago

    [quote]Don’t know whether this idea has come up before or not, but have you tested the GF3 with “Quaff3.exe”? [/quote]

    HardOCP did, and the GF3 gave the same speed and image quality. They used quack, rather than quaff, but it’s the same concept (ie. not Quake).

    • Anonymous
    • 18 years ago

    ATI made the FASTEST video card on the market and now you MOOCHERS got to make stuff up or Nvidia won’t send you free boards…. THIS IS a SAD day.

    • Anonymous
    • 18 years ago

    Don’t know whether this idea has come up before or not, but have you tested the GF3 with “Quaff3.exe”?


    • Anonymous
    • 18 years ago

    “second tier”..so you only believe the top selling product is the best one?

    enjoy your mcdognuts hamburgers and in stink music then.

    • Anonymous
    • 18 years ago

    If ATI hardware is really all that nice, their software engineers wouldn’t need to go to such lengths to distort performance.

    ATI cards have nice marketing [i]specs[/i], but is their hardware really at the same level as NVidia’s? Take the mipmapping, for example. Until you look through the actual chip designs and implemented algorithms, you just can’t say.

    When SGI, once [i]the[/i] pioneering company in computer graphics, started its downward slide, NVidia snapped up many of their graphics engineers, in the process building a first-rate chip design team that is the thrust behind their meteoric ascent to the heady pinnacle of PC graphics. ATI’s accumulation of technology buzzwords in their press releases is no reason to believe that ATI’s hardware design team is of the same calibre as the NVidia team.

    • Anonymous
    • 18 years ago

    49,

    Many gamers don’t play with v-sync enabled, so it’s a valid benchmark. Don’t even go into the “24 fps is all I need because movies do it” argument; it’s old and tired and totally WRONG.

    Also, try this. Go into quake 3 and run a timedemo at, say, 640×480 32bpp @ 85 hertz with v-sync disabled.

    Then do it with v-sync enabled. Betcha don’t get 85 fps. Betcha it’s a lot less.

    • Anonymous
    • 18 years ago

    #47, yeah, Nvidia said the TNT (and TNT2, if I remember right) would ship at higher clock speeds than what they actually shipped with. But they didn’t lie about it or try to hide that fact from consumers.

    And that 22-bit color fiasco was 3DFX, not Nvidia.

    • Anonymous
    • 18 years ago

    Am I the only one who noticed how pointless these FPS benchmarks are? Ok, I’m running a video mode of 1024×768 at 85 Hz and my frame rate is 150. Uhhh, 150 > 85: the screen updates 85 times/sec while the game is running at 150 frames/sec.
    You only get half the images. Large-screen movies still only run at 24 fps.
    I guess I just don’t play enough games to understand. I’m ready to update my video system; I think I’ll go with a Matrox G450 and get decent 2D performance and dual output. I’m tired of buying a card and then waiting 6-9 months for usable drivers to come out.

    • lenzenm
    • 18 years ago

    AG #47:
    As a proud owner of several 3dfx products, I can tell you that it was the 3dfx cards that rendered 22-bit internally, and output to the monitor at 16-bit. It was the best-looking 16-bit I’d ever seen, but it still looked like ass compared to 32-bit. So, nVidia is off the hook on that one. Not that nVidia didn’t do other despicable things, though… not the least of which is burying all 3dfx tech after acquiring 3dfx’s intellectual assets, thereby leaving me without support.

    • Anonymous
    • 18 years ago

    doesn’t anyone remember nVidia’s fiasco with the original spec’d clock rates of the TNT? in the quest for the almighty dollar nothing is sacred. 22-bit color is just as good as 32! yeah, right.

    • noko
    • 18 years ago

    I liked the article, and I’ve noted at Beyond3d.com how Nvidia mipmaps are spherical while the Radeon and now the Radeon 8500 are rectangular. This is the reason I believe the Radeon had the severe moire patterns when using texture filtering such as Anisotropic filtering. The Nvidia cards correctly set the texture distance from the point of view, while the ATI cards use an approximation with a rectangular shape, causing under- and then over-sampling of the texture within the mipmap -> moire problem. I was hoping that Wasson would go into this because it does affect image quality; not so much in single image shots, but when in motion the texture aliasing and mip-map boundaries of the ATI cards become more evident. So yes, now would be a good time for a real in-depth image comparison review between the two technologies, and maybe PowerVR technology as well. If someone would clearly point out what is going on between ATI and Nvidia mip-maps and show what this means, it would push ATI to upgrade how they do mip-maps so that textures are properly laid down for the best image quality. That doesn’t mean I won’t buy a Radeon8500, but it is another reason why I might buy a GF3 Ti-500. So Scott, I hope you cover the mip-map techniques used by the industry and the advantages and disadvantages of each one.


    • Anonymous
    • 18 years ago

    How do you figure nVidia did anything like this???

    Some things one might do to increase performance:

    1) offer the user tweaks and settings to balance quality vs. performance

    2) optimize the code

    3) lower the quality without a choice

    4) detect a specific game running and disable half the functionality of the card

    nVidia does 1) and 2). Haven’t noticed them doing 3), but I can’t say for sure – all the cards run the same drivers, presumably using the same interfaces and features (or a subset thereof), from TNT on up to GF3. Be more specific if you think that they have lowered the quality without a user option to bring it back – and if they have, show how it is anything close to the degree represented by the screenshots in this article. Let’s see what happens if you change the name of the quake3 executable on nVidia drivers. I’m guessing: nothing.

    4) is all ATI and it’s really quite different from ANYTHING I’ve ever heard of a purportedly reputable company doing. It’s not EVEN optimizing for a benchmark. It’s a deception. They couldn’t even just “optimize” OpenGL to do this, because they knew it would ruin most OpenGL applications.

    • Anonymous
    • 18 years ago

    So….. ATI is now guilty of what nVidia has been doing for quite a while now (GeForce 1 and 2 series)….

    Namely producing a lower image quality in order to get a few more FPS….. Big deal. Yawn..

    I’ll stay with my G400.

    • Anonymous
    • 18 years ago

    nVidia did NOT try to put 3dfx out of business, 3dfx filed bullshit anti-competitive patent lawsuits against nVidia – fact. At that point nVidia had no choice but to leverage their position to shut 3dfx down. 3dfx = evil.

    Why would you want a radeon? Why would anyone want any weird 2nd-tier product from Canada like Matrox or ATI? I don’t get it. People who want to feel special/elite for having something different, I guess.

    It’s conclusive. ATI is trying to deceive gamers. They look for quake3 to be running and tweak the mipmaps in a horrendously obvious and intentional way.

    This isn’t even like optimizing for a benchmark – you don’t detect the benchmark running and change how the driver works, you optimize the routines that are heavily used by the benchmark. ATI waits for quake3.exe to be running and then disables half the functionality of the card. That’s so dishonest that I wish it could be illegal. Hell, it might well be.

    Inferior memory bandwidth I guess? Who knows or cares why ATI is crap – it’s superfluous. Just be grateful that Intel hasn’t monopolized the graphics chip business and buy nVidia.

    There’s no advantage in weird second rate products, except that people might not know how bad your shit sucks when you brag about it.

    • TargetBoy
    • 18 years ago

    Bah, that was bizarre.

    • TargetBoy
    • 18 years ago

    Heheh. Posting engine didn’t like fifty-percent: 50%

    • TargetBoy
    • 18 years ago

    Yeah, I remember the “strong arm” fiasco.

    I just see muscling news sites as a different category of bad than faking driver optimizations by reducing image quality.

    Last time I remember this happening was back in the early 90’s when some video card company figured out what text string PC Magazine was using to test with and hard-coded an optimization into their driver for that string. Fortunately, they didn’t pay attention to the results of the tweak. When PC Magazine did the test they showed a 500% performance improvement and got suspicious and found the hack.

    Looks like ATI learned from history, but this is just sleazy and there is no excuse for it. I just hope that one of the major industry magazines picks up on this and pillories them for it. Unfortunately, ATI is running some nice 2-page ads in the press right now. Doesn’t seem likely to happen with PC Magazine nowhere near the thickness it used to be back in the early 90’s…

    • Anonymous
    • 18 years ago

    http://hypothermia.gamershardware.com/nv_bs_pt2.htm

    Nvidia blamed a PR rep and wrote it off. I didn’t believe it, but it was enough of a response to take the heat off. PR divisions treat us like children, giving us answers that make little sense, but are enough to give their fans an excuse to like the company again, even if it doesn’t satisfy their detractors.

    Will I buy nVidia, even remembering all the horrible things they’ve done? Oh yes, most definitely. Will I buy ATI, even considering this driver fiasco? Yes, just as definitely. Why? Because I know why it was done. I know that it was done for advertising reasons. I know it was done because most people want to see frame rate. I am glad they were discovered, but, quite frankly, it happens all the time, and no company is above doing it. Look at the major players in the hardware industry and show me one that hasn’t. Via, Intel, AMD, nVidia, ATI: most people have at least 2 of these manufacturers in their machine, and each one of them is guilty of some sort of indiscretion designed to do one thing: make us buy their products.

    Cut ’em some slack. Bad news is big news, and hardware sites are just as guilty of trying to get your ‘business’ as anyone else. This kind of story has the wonderful side-effect of collecting more people to a hardware site, and maybe those people will look at something else while they’re there, but even if they don’t, more traffic shows the site’s bosses, whoever or whatever they may be, that their site is WORTH more. My point is, be careful what you read and where, because sometimes the sites showing the story are guilty of the same kind of things these companies are doing: falsely showing their worth.

    Note: I post this here because I don’t think Damage is doing that; he actually has more info here than most sites, and I learned something more about the situation.

    • Anonymous
    • 18 years ago

    BTW, this article didn’t make a few things clear. Specifically, what were the other settings for graphics besides the texture quality? Was everything set to highest for all the other cards?

    Also, it quotes ATi’s statement about the IQ being equivalent (well, they say better) at settings other than having textures set to the highest quality, yet the only pictures of settings other than high-quality textures have colored mip mapping turned on, making comparison between the two difficult. As best I can tell, the IQ for low and lowest looks the same as the unoptimized, but for medium it is really hard to tell with that severe red tint. Heh, I’m not trying to rationalize what ATi did, but I am trying to point out that the article failed to make the point clear by not including such a picture for comparison.

    • Anonymous
    • 18 years ago

    There people go talking about 3dmark again.

    How and why does anyone care about 3dmark? It’s synthetic.

    Argue for it all you want, I don’t give a rat’s behind about 3dmark, since 3dmark isn’t a GAME that I can PLAY.

    • cRock
    • 18 years ago

    Is there any hope of timely 8500/7500 drivers for XFree86? It looks like ATI dropped Linux/FreeBSD support like a bad habit.

    • Anonymous
    • 18 years ago

    Well, they promised to address this issue in the mid-november driver release (and fix Smoothvision), and maybe even increase speeds of the drivers (i.e., actual optimizations instead of cheat), so we’ll see.

    • noko
    • 18 years ago

    Everyone keep it up, I want to get my new Radeon8500 Retail for $100. 🙂

    • Anonymous
    • 18 years ago

    I wonder if anyone here remembers at all the major crap and shady deals Nvidia did on their quest to kill 3dfx? How soon we forget…. Most of us still use 3DMark to get some idea of a 3D card’s speed even though we all know it was coded for Nvidia’s chips, and if other chips work w/ the program that’s nice too.

    • Anonymous
    • 18 years ago

    http://www.zdnet.de/c/c1.cgi?techexpert/artikel/tuning/200109/nv2181_00-wc.html

    • Anonymous
    • 18 years ago

    Ummm.. Who freaking cares?
    I recently gave up my Radeon DDR for a Geforce2 Pro, and while I’m bummed at the image quality loss in most games, I love the smooth frame rates. ATI, fix this crap so we can get on with playing instead of crying over this.

    • Anonymous
    • 18 years ago

    Nemesis[BOD]:
    ATI can suck on my Geforce 3 Quad Rocket!!!!! EAT IT!!!!

    • Anonymous
    • 18 years ago

    ITS CLEAR this is all Nvidia’s doing… ILL bet tech-report had Nvidia make some Radeon 8500 drivers that cheat so all of us will go out and get the Ti500….

    DIE NVIDIA and all you ATI hating bastards…
    —————-
    Interesting theory. My theory is that you’re an ATI zealot.

    • Anonymous
    • 18 years ago

    The next best thing would have been to come clean about it in an official sort of way, instead of working themselves into a corner.
    ————-
    No; see, official PR/Marketing types (aka droids) are trained to ignore the issue completely and let it simmer down naturally, so as not to fan any flames. They NEVER come clean about anything in any official way…UNLESS the outrage of the user community (ie. ‘consumer’) steadily grows and continues to grow, DEMANDing they do respond officially, in order to save face.

    In other words, they only take action when forced, and when the outcome of an official response is deemed to be more beneficial to business and public perception than ignoring the issue.

    As far as they’re concerned, it’s only a few geeks that know what games they are up to at this stage. I hope it grows and the knowledge spreads out into a more mainstream audience…

    • Anonymous
    • 18 years ago

    How is a company run this way still in business?
    ———
    I hope nvidia rip their balls off. I really do.

    And then I hope an unknown company rips nvidia’s balls off if they ever try the same. And so on.

    That’s the only way to keep the bastards honest.

    • Anonymous
    • 18 years ago

    Damn, this makes me mad. I look at the 8500’s specs and I want one – then I think about the games they play and the lies they tell and then I don’t want one. How can they let the driver guys turn this card into such a big POS? How is a company run this way still in business?

    • BabelHuber
    • 18 years ago

    when I see you goddamned campers anywhere on a server near me you

    • Anonymous
    • 18 years ago

    Ya. We campers actually stop to take in the scenery unlike those bunny-hoppers, and the ATI drivers messes it all up. =(

    • Anonymous
    • 18 years ago

    Day in. Day out. Frag after frag.
    I see no benefit to these ATI drivers.
    I am a camper. wOOt!

    • Anonymous
    • 18 years ago

    how many more articles can we see with this same storyline? let’s just wait and see what difference their new drivers make.

    too many discrepancies in people’s current testing, including the infallible (cough) tech-report.

    • Anonymous
    • 18 years ago

    http://velnias.tripod.ca/q3r_mip.jpg (I actually double-checked this one.. IT WORKS!)

    • Anonymous
    • 18 years ago

    http://velnias.tripod.ca/q3_mip.jpg (the file size is ~71k so all those 56k-ers don’t need to flinch). The shot was taken at 1600x1200x32, all settings maxxed.

    • Anonymous
    • 18 years ago

    http://velnias.tripod.ca/q3_mip.jpg is what I get doing that same r_colormiplevels thing (r_colormiplevels 1). I don’t really know how to interpret the colors, but I do know it’s different from what Scott got. So either it IS a bug, or ATi decided to take it right out of their code as soon as people got up in arms (which I think is more likely).

    I personally think it would hurt ATi a lot less if they could decide on what their position as a company is and stick to that, as opposed to something different every week.

    If I hadn’t been offered the deal I got on this 8500, I would have gone with a GF3Ti200, but I’m happy with my purchase. 91 fps in Q3A is more than enough for me to be passably accurate with a railgun 🙂

    Oh, about the picture: I compressed the file as I was assuming that the colors are what is important, not the image quality. I’ve got no idea what I get for bandwidth off of this site and I didn’t want it all taken up at once.

    • EasyRhino
    • 18 years ago

    guys, the ATI thing is gonna get beat to death, but check out the ‘mini-itx’ motherboard off the shortbread link:

    http://www.via.com.tw/jsp/en/pr/pr_itx.jsp

    it’s SO FRICKING SMALL, IT’S AWESOME! ER

    • Anonymous
    • 18 years ago

    ITS CLEAR this is all Nvidia’s doing… ILL bet tech-report had Nvidia make some Radeon 8500 drivers that cheat so all of us will go out and get the Ti500….

    DIE NVIDIA and all you ATI hating bastards…


    • Forge
    • 18 years ago

    Actually, I’d imagine most college-level coding classes could do far better than ATI’s driver teams, given the same amount of documentation and info availability.

    This crap from ATI’s PR and driver groups is utterly indefensible. They were ready to meet the GF3 Ti on level ground and win, but Nvidia scared them with the Detonator4s, so ATI resorted to cheap tricks. Utter shat. I suggest you all get a GF3 Ti 500 if you can, and a Ti200 if not. Vote with your wallets.

    • Anonymous
    • 18 years ago

    I for one feel sorry for ATi’s hardware engineers. They design a really nice piece of hardware and the driver team sits on its ass.

    I honestly think a high school programming class could make better drivers than these clowns.

    You would think after ATi has been blasted repeatedly for having shit drivers they would get their act together and release a product that wasn’t crippled in one way or another.

    Bullshit.

    • Anonymous
    • 18 years ago

    http://www.tech-report.com/news_reply.x/3041/

    FYI, Nvidia was doing this on their Riva128 and ZX chips in their driver too. They later changed it to an option when the TNT came out.

    • nexxcat
    • 18 years ago

    It’s sad, too. Reports by others here stated that the R200 part+new drivers really rocked. At least GeForce3 image quality has improved on some vendors’ parts. That’d be my next video card, when I have some money to burn on a new rig.


    • Anonymous
    • 18 years ago

    I understand this wasn’t the greatest move by ATI, but is it REALLY that bad?

    I mean, has anyone gone back through the drivers to see if it was ever correct?

    Or if it was correct, when it was optimized?

    Maybe they just really made a mistake here…


    • LiamC
    • 18 years ago

    Frames? How much performance are ATi getting? How big a cheat is this? If it’s 5 fps then stupid ATi.

    If it’s 15fps then their hardware/drivers suck

    Damn fine work Damage

    • Anonymous
    • 18 years ago

    So we’ll be seeing a 7500LE soon….

    • lowlight
    • 18 years ago

    Just an FYI guys, I took a look at colormiplevels on my Radeon 7500, and it’s the same thing… The Radeon is bitten by this ‘bug’ as well (or was it ‘optimizations’?)

    Also, I am getting word that the OEM 7500 is clocked at 270 Mhz core, rather than 290 Mhz on the retail… I’m looking into this as well 🙂

    lowlight

    • Anonymous
    • 18 years ago

    Good article, I like it…

    Or rather, I like the way it’s done. I agree with Firingsquad (somewhat) that if we could turn these “off” it wouldn’t be as big of a deal.

    I mean, geez, the hardware is great, why should they have to do this? Are their driver departments that sucky?

    • Anonymous
    • 18 years ago

    Cheating bastards. How about writing a decent ICD? The hardware is there… Sad thing is, they still lose in this benchmark.

    ATI=teh suck!


    • Anonymous
    • 18 years ago

    ATI is in deep shit.
