ATI’s Radeon 9600 XT graphics card

Manufacturer ATI
Model Radeon 9600 XT
Availability Soon

ATI and NVIDIA are locked in an epic battle for the graphics performance crown, but as sexy and exciting as technology leadership can be, sometimes it’s hard to get really excited about high-end graphics cards. ATI and NVIDIA’s latest flagships, the Radeon 9800 XT and GeForce FX 5900 Ultra, are both capable of rendering stunning environments with fluid frame rates, but $500 price tags keep the cards out of the hands of those of us who don’t have spare organs to hawk on eBay.

Fortunately, the fancy technology found in most high-end graphics cards eventually trickles down to more affordable mid-range products. Mid-range cards might not have enough horsepower to run the latest games at the highest resolutions with antialiasing and anisotropic filtering cranked all the way up, but they’re generally fast enough for all but the most demanding gamers.

Last year, NVIDIA’s GeForce4 Ti 4200 owned the mid-range graphics market, but this year has been dominated by ATI. ATI took over the mid-range graphics performance crown with the Radeon 9500 Pro, which was succeeded by the Radeon 9600 Pro. NVIDIA’s GeForce FX 5600s haven’t been able to keep up. Not content to sit idle, today ATI is beefing up its mid-range graphics line with the Radeon 9600 XT. The 9600 XT promises to set a new standard in affordable graphics performance, but is it really that much faster than the competition? Read on to find out.

The RV360 GPU
The Radeon 9600 XT is based on ATI’s new RV360 GPU, which is quite similar to the RV350 chip found in the Radeon 9600 Pro. The RV360 shares the RV350’s 4×1-pipe architecture and a host of other features that you can read about in my Radeon 9600 Pro review. Rather than rehash all the technology found in the RV360, I’d rather focus on what’s new in the chip. ATI snuck a few surprises into the RV360 that are worth exploring.

Like the recently announced R360 GPU, which powers the Radeon 9800 XT, the RV360 supports GPU core temperature monitoring. Temperature monitoring is necessary for ATI’s new OVERDRIVE automatic overclocking software, which will come to the Radeon 9600 XT in the Catalyst 3.9 driver release, slated for November. (The RV360 has all the necessary hardware support for OVERDRIVE to work, but the Cat 3.9s aren’t ready yet.) Since OVERDRIVE will initially offer the 9600 XT only two overclocked speeds, 513 and 527MHz, old-fashioned manual overclocking may still be the better route for experienced enthusiasts.
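ATI hasn’t published exactly how OVERDRIVE decides which clock to run, so the following Python sketch is purely illustrative: only the 500/513/527MHz clock steps come from ATI’s stated speeds, while the temperature thresholds and the read_gpu_temp_c()/set_core_clock_mhz() helpers are hypothetical stand-ins, not ATI’s driver interface.

import time

CLOCK_STEPS_MHZ = [500, 513, 527]  # stock clock plus OVERDRIVE's two stated overclocked speeds

def read_gpu_temp_c():
    # Stand-in for the RV360's on-die thermal sensor readout; hypothetical.
    raise NotImplementedError

def set_core_clock_mhz(mhz):
    # Stand-in for the driver call that reprograms the core clock; hypothetical.
    raise NotImplementedError

def overdrive_loop(poll_seconds=2.0):
    # Pick the highest clock step the current temperature allows, then poll again.
    # The 60/75 degree thresholds below are invented for illustration only.
    while True:
        temp = read_gpu_temp_c()
        if temp < 60:
            target = CLOCK_STEPS_MHZ[2]   # plenty of thermal headroom
        elif temp < 75:
            target = CLOCK_STEPS_MHZ[1]   # getting warm, take the middle step
        else:
            target = CLOCK_STEPS_MHZ[0]   # too hot, fall back to the stock 500MHz clock
        set_core_clock_mhz(target)
        time.sleep(poll_seconds)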

To help give OVERDRIVE plenty of clock speed headroom, RV360 GPUs are being fabbed on a 0.13-micron manufacturing process using a special “Black Diamond” insulator that has less capacitance than the fluorine-doped silicate glass insulator found in the RV350. Low-capacitance (low-k) insulators can help chips reach higher clock speeds, which explains why ATI is able to clock the RV360 GPU at an even 500MHz on the Radeon 9600 XT, 100MHz higher than the Radeon 9600 Pro. The fact that ATI is rolling out OVERDRIVE support for the 9600 XT suggests the chip can handle clock speeds north of 500MHz, too.
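For a rough sense of why a low-k dielectric buys clock speed, assume interconnect delay scales with R times C; the attainable clock scales inversely with delay, so trimming capacitance raises the ceiling roughly in proportion. The sketch below is just that back-of-the-envelope model; the 15% capacitance reduction is an arbitrary example, not a figure from ATI.

def max_clock_with_lower_capacitance(base_clock_mhz, capacitance_reduction):
    # Assumes delay ~ R*C and f_max ~ 1/delay, with all other factors held fixed.
    # This is a gross simplification of a real critical path, used only to show
    # the direction and rough scale of the effect.
    return base_clock_mhz / (1.0 - capacitance_reduction)

print(max_clock_with_lower_capacitance(400.0, 0.15))  # ~470MHz from a 15% cut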

If you look really close, you can almost see the low-k insulator

The specs
I know you can’t wait to see benchmarks, but first, let’s have a quick peek at the Radeon 9600 XT’s specs.

GPU ATI RV360
Core clock 500MHz*
Pixel pipelines 4
Peak pixel fill rate 2000 Mpixels/s
Texture units/pixel pipeline 1
Textures per clock 4
Peak texel fill rate 2000 Mtexels/s
Memory clock 600MHz
Memory type BGA DDR SDRAM
Memory bus width 128-bit
Peak memory bandwidth 9.6GB/s
Ports VGA, DVI, composite and S-Video outputs
Composite, S-Video inputs
Auxiliary power connector None

If you ignore low-k insulators and core tweaks, the Radeon 9600 XT is basically a faster version of the Radeon 9600 Pro. We’ll see just how much faster in a moment. However, before we get into testing, let’s take in the beauty that is the Radeon 9600 XT:

Ooohhhh. Aaaahhhh.

The Radeon 9600 XT’s heat sink is similar to what you might find on a Radeon 9700 Pro

The card’s BGA memory chips are rated for operation at 300MHz

And no, ATI still isn’t offering dual DVI on its consumer graphics cards

 

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

  System
Processor Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz
Front-side bus 333MHz (166MHz DDR)
Motherboard DFI LANParty NFII Ultra
Chipset NVIDIA nForce2 Ultra 400
North bridge nForce2 Ultra 400 SPP
South bridge nForce2 MCP-T
Chipset drivers NVIDIA 2.45
Memory size 512MB (2 DIMMs)
Memory type Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Graphics cards GeForce FX 5600 Ultra 128MB, Radeon 9600 Pro 128MB, Radeon 9600 XT 128MB
Graphics drivers Detonator FX 45.23 (NVIDIA), CATALYST 3.8 (ATI)
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0b

We’re comparing the Radeon 9600 XT to its most direct competitors, the Radeon 9600 Pro and NVIDIA’s GeForce FX 5600 Ultra. I’ve elected to use only publicly available and supported drivers for this review, which means you won’t find any results using the various beta versions of NVIDIA’s “Release 50” drivers. We’ll be exploring the performance and image quality of NVIDIA’s next official driver release when the driver is finalized, WHQL-certified, and available to the general public.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Fill rate
Theoretical fill rate and memory bandwidth peaks don’t necessarily dictate real world performance, but they’re a good place to start.

  Core clock (MHz) Pixel pipelines Peak pixel fill rate (Mpixels/s) Texture units per pixel pipeline Peak texel fill rate (Mtexels/s) Memory clock (MHz) Memory bus width (bits) Peak memory bandwidth (GB/s)
Radeon 9600 Pro 400 4 1600 1 1600 600 128 9.6
Radeon 9600 XT 500 4 2000 1 2000 600 128 9.6
GeForce FX 5600 Ultra 400 4 1600 1 1600 800 128 12.8

ATI hasn’t done anything to increase the Radeon 9600 XT’s memory bandwidth over the 9600 Pro, but the XT’s pixel and texel fill rates lead the pack.
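For reference, here’s how the theoretical peaks in the table fall out of the clocks and widths; the card figures are taken straight from the table above, and nothing here is a measured result.

def peaks(core_mhz, pipes, tex_units_per_pipe, mem_mhz_effective, bus_bits):
    # Pixel fill rate: core clock times pixel pipelines (Mpixels/s).
    pixel_fill = core_mhz * pipes
    # Texel fill rate: pixel fill times texture units per pipeline (Mtexels/s).
    texel_fill = pixel_fill * tex_units_per_pipe
    # Memory bandwidth: effective memory clock times bus width in bytes (GB/s).
    bandwidth = mem_mhz_effective * (bus_bits / 8) / 1000.0
    return pixel_fill, texel_fill, bandwidth

print(peaks(500, 4, 1, 600, 128))  # Radeon 9600 XT        -> (2000, 2000, 9.6)
print(peaks(400, 4, 1, 600, 128))  # Radeon 9600 Pro       -> (1600, 1600, 9.6)
print(peaks(400, 4, 1, 800, 128))  # GeForce FX 5600 Ultra -> (1600, 1600, 12.8)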

Despite its higher theoretical pixel fill rate peak, the Radeon 9600 XT can’t catch the GeForce FX 5600 Ultra in 3DMark03’s single-textured fill rate test. However, the tables turn when we look at multi-textured performance, where the 9600 XT has a huge lead over even the 9600 Pro.

Shaders

The Radeon 9600 XT improves upon the 9600 Pro’s already impressive pixel shader performance by a small margin, but the XT really shines in vertex shader performance. A 50% improvement there suggests ATI spent a little time tweaking more than just the RV360’s clock speeds.

 

ShaderMark 2.0
ShaderMark 2.0 is brand new and includes some anti-cheat measures to prevent drivers from applying questionable optimizations. The Radeons run the benchmark with straight pixel shader 2.0 code, but I’ve included results for the GeForce FX with partial-precision and extended pixel shaders, as well.
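As a rough numpy illustration of the precision gap involved (this is not ShaderMark code): 16-bit floats, which roughly correspond to the GeForce FX’s partial-precision mode, carry far fewer significant digits than 32-bit floats, while the Radeons compute pixel shaders at an intermediate 24-bit precision that numpy can’t represent directly.

import numpy as np

# Machine epsilon at each precision: the smallest representable relative step.
print(np.finfo(np.float32).eps)   # ~1.19e-07 at 32-bit
print(np.finfo(np.float16).eps)   # ~9.77e-04 at 16-bit

# The same value stored at each precision.
x = 1.0 / 3.0
print(np.float32(x))              # 0.33333334
print(np.float16(x))              # 0.3333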

The Radeons are in a class all their own in ShaderMark 2.0, and the GeForce FX 5600 Ultra’s performance is nothing short of embarrassing. The FX is way behind the Radeons across the board, and refuses to cooperate with a number of the shader tests.

Focusing on the Radeons, the 9600 XT is about 20% faster than the 9600 Pro, which is roughly in line with the XT’s 100MHz (25%) core clock speed advantage over that card.

 

Quake III Arena

Wolfenstein: Enemy Territory

Unreal Tournament 2003

Wolfenstein: Enemy Territory runs fastest on the GeForce FX 5600 Ultra, but Quake III Arena and Unreal Tournament 2003 perform better on the Radeon 9600 XT.

 

Comanche 4

Codecreatures Benchmark Pro

Gun Metal benchmark

The Radeon 9600 XT leads the way in Comanche 4 and Codecreatures, but the card can’t quite keep up with the GeForce FX 5600 Ultra in the Gun Metal benchmark.

 

Serious Sam SE

The GeForce FX 5600 Ultra maintains its lead in Serious Sam SE with and without antialiasing and anisotropic filtering.

 

Splinter Cell

In Splinter Cell, all the cards are closely matched. Overall, the Radeon 9600 XT is the fastest of the lot, but not by much.

 

3DMark03

In 3DMark03, the Radeon 9600 XT leads the way in all but the DirectX 7-class Wings of Fury game test. The 9600 XT’s performance advantage over the 9600 Pro is particularly impressive in the complex Mother Nature test.

 

Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the V49 patch, which includes a performance benchmark, was never intended for public release. Too late: the patch is already public. We’ve used these extreme quality settings from Beyond3D to give the GeForce FX 5600 Ultra a thorough workout in this DirectX 9 game, and the results speak for themselves.

The Radeons wipe the floor with the GeForce FX 5600 Ultra. To be fair, NVIDIA claims that the Tomb Raider benchmark incorrectly uses a generic DirectX 9 code path rather than NVIDIA’s GeForce FX-optimized code path for the game. If that is indeed the case, our results highlight just how much the GeForce FX line needs optimized code paths in order to be competitive.

AquaMark3

The Radeon 9600 XT is out ahead in AquaMark3, which makes extensive use of DirectX 9 shaders.

Halo
I used the “-use20” switch with the Halo benchmark to force the game to use version 2.0 pixel shaders.

The Halo benchmark also shows the Radeons way out ahead, with the 9600 XT leading the way. Halo’s benchmark timedemo renders cut-scene footage, which is important to note since the benchmark results don’t necessarily reflect real-world gameplay performance. However, the timedemo still uses Halo’s graphics engine, so it’s fair game for comparing the relative performance of different graphics cards.

Halo also isn’t compatible with the GeForce FX’s antialiasing, but ATI’s Smoothvision antialiasing seems to work just fine.

Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5600 Ultra’s performance with high-dynamic-range lighting, we logged frame rates via FRAPS in this technology demo at its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
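Averaging the frame rates FRAPS logs is simple post-processing; the sketch below shows one way to do it. The log filename and the one-sample-per-line layout are assumptions for illustration, not FRAPS’s documented format, so adapt the parsing to whatever your FRAPS version actually writes.

def average_fps(log_path="fraps_fps_log.txt"):
    # Read one frame-rate sample per line and average them. Both the filename
    # and the layout are assumed for this sketch, not taken from FRAPS docs.
    samples = []
    with open(log_path) as log:
        for line in log:
            line = line.strip()
            if line:
                samples.append(float(line))
    return sum(samples) / len(samples) if samples else 0.0

if __name__ == "__main__":
    print(f"Average frame rate: {average_fps():.1f} fps")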

The Radeon 9600 XT looks great running this demo, and performance isn’t too shabby, either. The XT isn’t leaps and bounds ahead of the Radeon 9600 Pro, but there’s still a noticeable performance gap between the two cards.

 

Edge antialiasing

The Radeon 9600 XT’s 100MHz core clock speed advantage helps a little with antialiasing, but the performance boost isn’t too dramatic.

 

Texture antialiasing

The Radeon 9600 XT is just a little bit faster with anisotropic filtering, too.

 

Overclocking
In testing, I was able to get my Radeon 9600 XT stable and artifact-free with core and memory clock speeds of 595 and 640MHz. Try as I might, I just couldn’t get the card to behave with a 600MHz core clock speed. Overall, though, I’m quite happy with the 95MHz overclock; it certainly proves that the card has plenty of headroom for OVERDRIVE to work with.

Of course, just because I was able to get my 9600 XT stable at 595/640 doesn’t mean that every card will be stable at those speeds. Some cards may be capable of higher clock speeds, and some may barely overclock at all. Either way, I have to give ATI props for the Catalyst 3.8 driver’s new VPU recovery feature, which attempts to reset the graphics card if it locks up, potentially saving a reboot. It works, too. VPU recovery is incredibly useful for overclocking stability testing, and I can’t wait to see it become a Windows requirement.
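Finding a stable overclock essentially boils down to stepping the core clock up, stress-testing, and backing off at the first artifact or lockup, with VPU recovery as the safety net when a test hangs the card. Here’s a rough sketch of that loop; set_core_clock_mhz() and run_stress_test() are hypothetical stand-ins for a clocking utility and a looping benchmark, not ATI tools.

def set_core_clock_mhz(mhz):
    # Stand-in for a third-party overclocking utility; not an ATI interface.
    raise NotImplementedError

def run_stress_test():
    # Stand-in for a looping benchmark plus an artifact check. Return True only
    # if the run finishes cleanly. If the card hangs, Catalyst 3.8's VPU
    # recovery resets it, so the machine usually survives without a reboot.
    raise NotImplementedError

def find_max_stable_clock(start_mhz=500, step_mhz=5, ceiling_mhz=650):
    best = start_mhz
    clock = start_mhz
    while clock <= ceiling_mhz:
        set_core_clock_mhz(clock)
        if not run_stress_test():
            break                  # first failure: the previous clock stands
        best = clock
        clock += step_mhz
    set_core_clock_mhz(best)       # settle on the highest clock that passed
    return best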

Overclocking the Radeon 9600 XT yields impressive gains in AquaMark3 both with and without antialiasing and anisotropic filtering. It’s official: low-k dielectrics rule.

 

Conclusions
The Radeon 9600 XT’s basic rendering capabilities are unchanged from its predecessor, so we haven’t included an image quality analysis in this review. However, things do change with driver releases, so we will be looking at the comparative 3D image quality of the latest ATI and NVIDIA products with the latest drivers in an upcoming article. For now, I can say that the Radeon 9600 XT’s image quality is, subjectively, very good.

The results of our performance testing are clear. The Radeon 9600 XT improves on the 9600 Pro’s already impressive performance in a wide variety of benchmarks and real-world applications, including titles that take advantage of DirectX 9-class pixel and vertex shaders and floating point data types. NVIDIA’s GeForce FX 5600 Ultra manages to sneak in a few small victories here and there, but those victories are mostly confined to dated games whose graphics engines don’t make use of DirectX 9-class hardware.


Mid-range graphics: ATI Radeon 9600 XT
October 2003

Though the Radeon 9600 XT performs well today, OVERDRIVE support should make the card even more fearsome. I can only hope that ATI will consider bumping up the Catalyst 3.9 driver’s top OVERDRIVE clock speed from 527MHz to something a little bolder. If my sample (which reached nearly 600MHz) is any indication of what the majority of RV360 GPUs are capable of, 527MHz may only scratch the surface of the chip’s overclocking potential.

At the end of the day, I think I’m more excited about the Radeon 9600 XT than I’ve been about any other graphics card. ATI has leveraged some very cool manufacturing technology to produce a graphics chip that’s capable of incredibly high clock speeds without the need for exotic cooling or even an auxiliary power source. The Radeon 9600 XT’s performance in DirectX 9-class applications is great, which bodes well for next-generation titles like Half-Life 2 and Doom 3. What’s probably most appealing about the 9600 XT is the fact that its $199 price tag makes the card very affordable for a wide range of gamers and PC enthusiasts. That $199 price tag includes a copy of Half-Life 2, too. All things considered, it would almost be irresponsible for me not to give the Radeon 9600 XT our coveted Editor’s Choice award for mid-range PC graphics. Today, the Radeon 9600 XT is as good as mid-range graphics gets.

Comments closed
    • daniel4
    • 16 years ago

    Yes, they will make the 9600XT faster so IGNORE THIS REVIEW. This review never happened and Geoff Gasior is crazy.

    • PLASTIC SURGEON
    • 16 years ago

    Looks like ATI will make the 9600xt faster before it’s released retail the same way they did with the 9600pro….

    http://www.theinquirer.net/?article=12161

      • Krogoth
      • 16 years ago

      It will still be slower than the Radeon 9700 PRO regardless of what ATi said; they’d need to clock the core at least 650MHz, which I doubt ATi is willing to push without exotic cooling. Not to mention that the memory needs to clock at 1.2GHz to obtain similar memory bandwidth, because the R9600XT still has a 128-bit memory bus. Nevertheless, the 9600XT will still be one heck of a card for just $200 USD.

    • Anonymous
    • 16 years ago

    If nvidia stumbles on the next release and can’t deliver a world class card, I’m going to switch and never look back.

    Nuff said.

      • Anonymous
      • 16 years ago

      “i’m going to switch and never look back”?

      How ridiculously short-sighted and simple-minded of you. If you’d said the same thing about ATi three years ago, and you probably did, does that mean you’d be buying less cost-effective NVIDIA cards now?

      Sheesh, put away the fanboy specs and just buy whatever gives the best combination of price, performance and quality when you need a new card.

    • JustAnEngineer
    • 16 years ago

    That’s another great review, Diss.

    I certainly hope that the Radeon 9600-class hardware becomes standard for mainstream PCs. We want game developers to be able to target games toward that performance and feature level.

    • Anonymous
    • 16 years ago

    Why did you only compare this card against other low-end cards? I want to know how it stacks up against the 9800 Pro and the 5900 (without having to refer to other articles). Did ATI specify that you could only show results vs. other cards in the same price category??

    • Rousterfar
    • 16 years ago

    What is the ETA on ATI’s and Nvidia’s next real new card generation release?

      • Krogoth
      • 16 years ago

      Sometime next year, either Q1 or Q2 2004.

      • indeego
      • 16 years ago

      What do you mean by “real?” They are all either clock upgrades, architectural pipelining changes, memory changes, or micron changes.

        • Rousterfar
        • 16 years ago

        Well I just meant the next revamp that’s more than a simple clock jump.

        • Krogoth
        • 16 years ago

        The NV40 and R420 will be more than just a die shrink, clock speed ramping and adding more pipelines. These chips will most definitely incorporate some architectural enhancements from the NV3X and R3XX series, just not as revolutionary as the R300.

    • daniel4
    • 16 years ago

    ATI has been through multiple card generations and they are still using those 3.3ns hynix/samsung modules on their midrange cards :(. Musta got a discount on em back in 2001 or something.

      • My Johnson
      • 16 years ago

      From the other articles that included the 9500 pro I noticed that even with less memory bandwidth the 9500 pro sometimes won at high resolutions. So, I don’t think that the 9600pro/XT are really that choked.

    • Rousterfar
    • 16 years ago

    I was wondering if anyone here has an idea of how well this 9600 XT performs next to an older GF Ti 4200. I have been considering upgrading, but am wondering, other than the DirectX 9 support, how big of a jump it would be?

    • malicious
    • 16 years ago

    I’ll just repeat what’s already been said about disappointment with the memory clock. If nVidia can afford to put 2.5ns RAM on their mid-level cards, ATi sure as hell can too. They had their chance to put some more distance between the 9600 line and 5600 one but decided that the only way to keep their high-end products compelling is to actively hinder the less expensive ones.

    • Decelerate
    • 16 years ago

    Question is, will HL2 be the r[

    • Anonymous
    • 16 years ago

    So should I dump my nvidia 4600 for this?

      • DaveJB
      • 16 years ago

      Difficult to say. The 9600XT will perform better in shader-ops, FSAA and AF, but the Ti4600 would have the edge, overall.

      If you want to replace your Ti4600, I’d recommend going for the vanilla 9800, which outperforms the Ti4600 in nearly all situations, and can be overclocked to near-9800 Pro levels.

      • Anonymous
      • 16 years ago

      No, I would wait until next year to be honest. When the new ATI R400 chips come out next year, plus the ability to use a PCI Express slot, the prices should come down even more.
      I would have to wait since graphics cards here usually cost 1 1/2 times more than in the USA, besides the damn taxes.
      I would wait until the new motherboards are equipped with PCI Express slots, since that would free up the graphics power even more.
      It depends on how much you are willing to spend.
      Thanks to stupid HACKERS, Half-Life 2 is being delayed until spring of 2004 because they somehow stole a third of the code for the game.
      Good luck on your decision.

        • Anonymous
        • 16 years ago

        y[

          • --k
          • 16 years ago

          My independent research coincides with what you said. I have a feeling the source leak was not exactly an accident.

      • us
      • 16 years ago

      dump your 4600 for a 9800XT

    • indeego
    • 16 years ago

    $199 is not mid range, IMO. I’d say $130 is mid-range, considering low end is now easily attainable at $50-$75 for poor man’s 3-D.

    The “old” 9600 pro gets 90% or more on every test and it’s ~$60-$80 cheaper.

    Editor’s Choice on performance, but it’s not a good price/performance ratio for mid-range.

      • dolemitecomputers
      • 16 years ago

      For cost I would say it is midrange considering the 9800XT is $300 more.

      • wagsbags
      • 16 years ago

      indeego where are you buying a decent 9600 pro for $130? And not to mention that’s the RETAIL price for the XT…

        • indeego
        • 16 years ago

        Work got me one off newegg last week. ATI branding, OEM. Didn’t even have drivers included. Passive cooling.

    • Anonymous
    • 16 years ago

    such awful framerates for Halo 🙁 So in the two years since it was released on the Xbox there wasn’t as much optimization as there could have been. And even the 9600XT ONLY gets 31.5fps at 1024x768x32 w/PS 2.0. With Microsoft involved here, conspiracy theories abound.

    • Anonymous
    • 16 years ago

    The S3 Virge XT…

      • Krogoth
      • 16 years ago

      S3 Scourge Extreme Edition Type-R XT FX Ultra 600000! 😆

    • Generic Ninja
    • 16 years ago

    While I really like this new low-k process and the resulting core speed increases (soooo fast while running soooo cool!) I do find it disappointing that ATI has switched to 600MHz DDR on the 9600XT. From previous reviews of the 9600 pro and two cards purchased by friends I know that the previous generation had 700MHz memory chips clocked at 600, allowing a healthy memory overclock.

    Hopefully manufacturers like Tyan or Gigabyte will provide faster memory, I can’t wait to see how a 9600 XT would perform with a memory overclock matching or exceeding the core overclock.

    BTW OCs on ATI 9600 pros were 492/720 and 510/708

    • Anonymous
    • 16 years ago

    Pathetic

    AnandTech said that this card was supposed to surpass the performance of the Radeon 9700 Pro…………

    I bought my Radeon 9700 Pro for $220 last week, and I’m really glad that I did.

    • Anonymous
    • 16 years ago

    To be honest, I don’t see the 9600 XT or the 9800 XT to be terribly compelling upgrades for those of us who already own a 9500 Pro/9600 Pro at the mid-range, or a 9700 Pro/9800 Pro at the high end. It’s all about marketing and trying to stay ahead of the competition more than anything else. Within a couple of months of the 9700 Pro being released, it was considered to be the “unofficial” video card of Doom 3, and I doubt that’s changed. Anyway, I think the R400 will be much more interesting than any current offering out there. I’ll leave it to the elitists out there to upgrade their 9800 Pro to a 9800 XT.

    What we have going on between nVidia and ATI is a nuclear arms race of performance not unlike what the Soviet Union and the United States were doing for many years.

    • atryus28
    • 16 years ago

    HHHMM looks like a winner to me. The extra $50 can easily be justified by the copy of HL2 you can get. Sweet.

    • Anonymous Gerbi1
    • 16 years ago

    WOOOOOOOOOOOOOOO!

    Time to put the “old” 9600 Pro on ebay!!!

      • Anonymous
      • 16 years ago

      What?? Did you actually read the review? There is a negligible performance difference between the 9600 XT and 9600 Pro.

    • AGerbilWithAFootInTheGrav
    • 16 years ago

    Great review as usual… and I am glad you used available drivers 🙂

    + this card looks great IMHO… I’d like if you could use this card & Rad 9800 non pro just to compare what is the gap between best mid range & lowest high range cards in the “upcoming review” as there is not too many such reviews around…, and Nvidia equivalents of course…

    • DaveJB
    • 16 years ago

    I wonder whether the OC’s limiting factor was due to the chip, or the cooling solution used. If it was the latter, then imagine what speeds could be achieved with a water-cooling solution!

    Pity ATI stuck with 600MHz memory though, which is probably bottlenecking the card. I’m guessing that anything over 600MHz wasn’t practical from a price/performance point of view, or they intend to release a 9600 XXT with faster memory! :p

      • derFunkenstein
      • 16 years ago

      i’m sure some third party vendor like Tyan or GigaByte (both of whom build their own cards rather than picking stock cards) will come along and take care of the memory issue. 😀

      • AGerbilWithAFootInTheGrav
      • 16 years ago

      I guess that the manufacturers could pair up the chip with better memory too… That would give the card a great boost, imagine this radeon with nvidia memory speeds (up to 1/3 faster)… that would be rather good…

    • Anonymous
    • 16 years ago

    I don’t think NVidia is sitting on its ass. I do think, however, that they just chose the wrong path.

    I think NVidia became powerful enough in the market that they believed (at least at one point, if not now) that everyone would code for their cards, no matter how they designed them. This has backfired in a major way; ATI chose another path, that of optimizing their card for Microsoft’s DirectX codepath. Because of this, ATI’s cards are a lot easier to code for out-of-the-box, developers like this, not much overhead involved as long as you stick with optimizing for best DirectX9 features/performance, and the ATI cards shine in this arena.

    NVidia’s choice to require code to be optimized for them means that if done right, you can probably get good to really great performance on a Geforce FX card, IF you code specifically for it. The market has long shown time and time again what happens when you get too big for your britches and assume everyone will follow your proprietary path (3dFX Glide API, anyone? Late 80s/early 90s IBM brand hardware? Plenty of examples). Looks like NVidia didn’t learn from the past, and condemned themselves to repeating it. It’ll now take some catch-up work from them, and a decision…will they lose trying to force everyone into their camp, or will they adapt to change?

      • derFunkenstein
      • 16 years ago

      GLide was THE NEXT BIG THING though…tons of games came out for GLide because 3DFX was (at the time) about the only manufacturer making a worthwhile video chipset. Sure, they supported it until the end, but they followed along with OpenGL and D3D compliance when it became important. Even Unreal Tournament and Quake II had GLide optimizations, as well as a ton of EA games, like the NFS series, NBA Live, Madden, et al…GLide was a bad example. MicroChannel from IBM, on the other hand, is a good one.

        • Krogoth
        • 16 years ago

          Glide was the only one that really pulled it off for a while, because 3Dfx’s products were second to none in performance/stability compared to S3, Nvidia and ATi’s offerings. It wasn’t until Nvidia’s Riva TNT proved to be a worthy adversary to 3Dfx, with Nvidia’s focus on moving to 32-bit color. 3Dfx refused to adapt their aging Glide to the times, and obviously they shot themselves in the foot two times before going under.

          • derFunkenstein
          • 16 years ago

          That I will grant you, but from 1996 through 1999, almost every game had GLide support, and those that didn’t were less popular because of it.

      • 5150
      • 16 years ago

      Gawd I miss Glide. I still love my Voodoo!

    • Xplosive
    • 16 years ago

    Many people will enjoy this card for Half-Life 2 I can pretty much assume.

    • Anonymous
    • 16 years ago

    ***FIRST***

    Nice card !

    Thank god for ATI while nVidia is still sitting on its ass.

      • Anonymous
      • 16 years ago

        From what I’ve seen on HEXUS.net, the 5700 looks to be a very good competitor to the 9600XT.

        • vortigern_red
        • 16 years ago

        That review is awful!!

        No image quality comparisons, but a few comments about image quality compromises on the NV card/drivers, i.e. still no tri filtering.

        Unreleased card with unreleased drivers on a load of heavily “cheated” benchmarks. I think I’ll wait and see some proper reviews.

          • Ryszard
          • 16 years ago

          Thanks for the comments 🙂 Why are people getting worked up over the NV36 numbers? It’s just a week from release, what I reviewed is what everyone else will review in 7 days.

          How are they cheated benchmarks? We use the same benchmarks that everyone else uses, with renamed .exe’s where applicable (UT2003 doesn’t work if you do) and custom demos if possible.

          I don’t see many IQ comparisons on other reviews either, at least I made the effort, I just didn’t have time for screenshots.

          Anyway, glad you liked it!

          Rys

            • vortigern_red
            • 16 years ago

            I still think the use of the drivers with “no tri filtering but I can not see it” is crap. But I read your response on B3D and look forward to the IQ article you said you will do. Good to see you are taking notice of people’s criticism.
