ATI vs. NVIDIA: The next generation

BOTH ATI AND NVIDIA HAVE recently updated their graphics product lines. Now even a heapy-cheapy graphics card is more powerful than anything you could buy a couple of years ago, and the really high-end cards are absolutely hoss. Packed with 128MB of RAM, these cards are faster than a French surrender. Today, we’ve rounded up a total of nine—count ’em—different graphics card configurations, ranging from the Radeon 7500 and GeForce4 MX to the new 128MB Radeon 8500 cards and GeForce4 Titaniums. We’ve tested them all together in a massive, teeming, silicon-based pack, and we’re here to show you who’s fast—and who’s faster.

Because we’re dealing with a whole lotta cards here, we’re going to limit our focus to performance. We’ll address the newest wrinkles in image quality and anti-aliasing in more depth in a future article, where we can devote more attention to how every mip gets mapped. And we’ve already covered most of the highfalutin 3D theory in our previous articles. We’ve compared the GeForce3 to the Radeon 8500, and we’ve charted the changes contained in the GeForce4. Not only that, but we’ve dug deep into the Radeon 8500’s GPU to see what makes it tick. So we’ll dispense with the theory here. If you’re not up to date on that stuff, do yourself a favor and go read our previous articles before you go on.

Now, let’s take a look at some of the cards we’ll be comparing and see what we’ve got.

ATI’s Radeon 7500
The ATI Radeon 7500 occupies the top rung of the low end of ATI’s retail video card lineup. (Repeat that five times fast.) This card is based on ATI’s original Radeon GPU, but in this implementation, the chip is clocked at 290MHz—over 100MHz faster than the original Radeon. Similarly, the card’s DDR memory runs at 230MHz, or 460MHz in DDR-speak. To put these numbers into perspective, the original Radeon chip was a little more advanced than the GeForce2, so this combination of elements is nothing to sneeze at. With 7.4GB/s of memory bandwidth plus ATI’s Hyper-Z suite of bandwidth-conserving technologies, this thing ought to outclass a GeForce2 Ultra.
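If you want to check our math on that bandwidth figure, it falls straight out of the clocks. Here’s a quick back-of-the-envelope sketch; the 128-bit bus width comes from the spec table later in the article:

```python
# Back-of-the-envelope memory bandwidth for the Radeon 7500's 230MHz DDR.
effective_mhz = 230 * 2              # DDR moves data twice per clock: 460MHz
bus_bytes = 128 // 8                 # 128-bit memory bus = 16 bytes per transfer
bandwidth_gb_s = effective_mhz * 1_000_000 * bus_bytes / 1_000_000_000
print(f"{bandwidth_gb_s:.2f} GB/s")  # 7.36 GB/s, which rounds to the quoted 7.4
```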

The price? About 75 bucks from online retailers.

ATI’s Radeon 7500 card

Pick your jaw up off the floor for a second and consider this: the Radeon 7500 comes complete with dual video outputs: a VGA-out port and a DVI-out connector that can double as a second VGA output. (ATI includes the DVI-to-VGA adapter in the retail box.) With ATI’s HydraVision software, you can drive a pair of monitors in tandem. Not a bad deal for a “budget” card.

The Radeon 7500’s outputs: S-Video, DVI, and VGA


The Radeon 8500 line gets refreshed
ATI’s answer to the new GeForce4 Ti cards is simple: double the RAM on the Radeon 8500. The Radeon 8500 GPU already has many of the new features NVIDIA added to the GeForce4 Ti, like dual vertex shaders and more advanced pixel shaders with dependent texture addressing. [Ed: No theory, please.]

Sorry about that.

But to combat the GeForce line in stores, ATI added a card to the mix, the Radeon 8500LE 128MB. The Radeon 8500LE is nothing more than a Radeon 8500 with core and memory clock speeds of 250MHz, 25MHz lower than the full-fledged 8500. To cut costs, ATI also removed the DVI-out port on the LE card. Retailing for about $199, the 8500LE promises to give GeForce4 cards some serious competition in the bang-for-your-buck category.

ATI’s Radeon 8500LE 128MB

The Radeon 8500LE card has Infineon memory in a square BGA package

For now, the original Radeon 8500 remains at the high end of ATI’s lineup. Like the LE cards, the 8500 now comes with 128MB. In theory, the extra memory ought to allow for better performance when handling complex scenes with lots of texture and geometry data.


Abit’s Siluro GF4 MX 440
Most of you are probably familiar with our gripes about NVIDIA’s GeForce4 MX chip. It’s more of a GeForce2 on steroids than it is a GeForce4-anything (or even a GeForce3-anything). Considering how NVIDIA has spearheaded the move to a new approach to real-time 3D with vertex and pixel shaders, the GeForce4 MX is a bit of a disappointment. Look at it as direct competition for the Radeon 7500, though, and NVIDIA’s thinking begins to make more sense. Clearly, the GF4 MX isn’t a gamer’s card, but it matches up well against the competing product from ATI.

Abit’s version of the GeForce4 MX 440 comes in a stunning shade of blue with a very nice cooler.

The Abit Siluro GF4 MX 440

Abit’s GF4 MX 440 includes 64MB of DDR memory clocked at 400MHz and an S-Video output. Unfortunately, there’s no secondary output, DVI or otherwise, so this card can’t support multiple monitors, even though the GF4 MX chip can. Abit is planning a “VIO” version of this card with a DVI output and a TV encoder.

Of course, the Siluro GF4 MX 440 includes all of the good things the NV17 chip brings to the table, including Accuview anti-aliasing, NVIDIA’s efficient memory architecture, and a full MPEG2 decoder for effortless DVD playback.


VisionTek’s Xtasy GeForce4 Ti 4600
Finally, we have VisionTek’s rendition of the GeForce4 Ti 4600. Not long ago, these cards hit retail, and the weak among us hardware enthusiasts could be seen out on street corners trying to sell our first-born on the open market to fund the purchase of one of these puppies. This is the highest of the high-end consumer video cards, with faster RAM and better bragging rights than just about anything.

Like any product that follows the NVIDIA GeForce4 Ti reference design, this card is large. I’m not sure whether you plug it into a motherboard or plug a motherboard into it.

Top: GeForce4 Ti 4600. Bottom: Radeon 8500LE.

Only the Voodoo 5 5500 can make this thing look small

Either way, the VisionTek card comes with most of what you’d need, including a DVI output, a VGA out, and video in and out ports (via an included splitter cable) for real VIVO functionality. VisionTek even throws in CyberLink’s PowerDirector video editing suite. VisionTek’s lifetime warranty is part of the package, as well. The only thing missing from the box with our review sample was a DVI-to-VGA converter, which you might have to hunt up separately if you want to drive a pair of CRT monitors.

NVIDIA’s reference cooler is mighty fancy

As you’d expect, the GeForce4 Ti 4600 chip promises to make this beast one of the fastest cards you can buy.


The contest
The rest of the field comprises variations on the cards shown above. GeForce4 Ti 4400 cards use the same chip as the Ti 4600 cards, except with lower core and memory clock speeds. And we’ll test the 128MB Radeon 8500 variants against a Radeon 8500 64MB to see if the extra RAM really helps performance at all.

To see exactly how these cards stack up in terms of vital stats, have a look at the table below. I’ve sorted the contenders by peak memory bandwidth, since memory tends to be the single biggest performance bottleneck in most 3D graphics.

Card              Core clock  Pixel  Peak fill rate  Texture units  Peak fill rate  Memory clock     Memory bus    Peak memory
                  (MHz)       pipes  (Mpixels/s)     per pipe       (Mtexels/s)     (effective MHz)  width (bits)  bandwidth (GB/s)
GeForce4 MX 440   270         2      540             2              1080            400              128           6.4
GeForce3 Ti 200   175         4      700             2              1400            400              128           6.4
Radeon 7500       290         2      580             3              1740            460              128           7.4
GeForce3 Ti 500   240         4      960             2              1920            500              128           8.0
Radeon 8500LE     250         4      1000            2              2000            500              128           8.0
GeForce4 Ti 4400  275         4      1100            2              2200            550              128           8.8
Radeon 8500       275         4      1100            2              2200            550              128           8.8
GeForce4 Ti 4600  300         4      1200            2              2400            650              128           10.4
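Incidentally, every derived column in that table is straight multiplication, so you can sanity-check the numbers yourself. Here’s a short sketch that recomputes the fill rates and bandwidth figures from the base specs (transcribed from the table above; the memory clocks are effective DDR rates):

```python
# Recompute the table's derived columns from the base specs.
# Tuples: (card, core MHz, pixel pipes, texture units per pipe,
#          effective memory MHz, bus width in bits).
cards = [
    ("GeForce4 MX 440",  270, 2, 2, 400, 128),
    ("GeForce3 Ti 200",  175, 4, 2, 400, 128),
    ("Radeon 7500",      290, 2, 3, 460, 128),
    ("GeForce3 Ti 500",  240, 4, 2, 500, 128),
    ("Radeon 8500LE",    250, 4, 2, 500, 128),
    ("GeForce4 Ti 4400", 275, 4, 2, 550, 128),
    ("Radeon 8500",      275, 4, 2, 550, 128),
    ("GeForce4 Ti 4600", 300, 4, 2, 650, 128),
]
for name, core, pipes, tmus, mem_mhz, bus_bits in cards:
    mpixels = core * pipes                        # peak pixel fill (Mpixels/s)
    mtexels = mpixels * tmus                      # peak texel fill (Mtexels/s)
    gb_s = mem_mhz * 1e6 * (bus_bits // 8) / 1e9  # peak bandwidth (GB/s)
    print(f"{name:<17} {mpixels:>5} Mpix/s {mtexels:>5} Mtex/s {gb_s:>5.1f} GB/s")
```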

Hardware specs matter, but they aren’t destiny, as you’ll see in the benchmark results below. However, this table helps show us some of the key performance matchups. Among them:

  • Radeon 7500 vs. GeForce4 MX 440 — These two cards are close competitors, and it would appear the Radeon 7500 has the upper hand; it’s got faster memory, gobs more multitextured fill rate, a higher clock speed, and (at present) a lower price tag.
  • GeForce4 Ti 4400 vs. Radeon 8500 — These cards will be priced roughly equal to one another, and they happen to share the same basic specs in terms of clock speed, memory speed, pixel pipelines, and peak fill rates. Here’s a chance to see ATI’s and NVIDIA’s best chips square off on equal footing.
  • The various Radeon 8500 flavors vs. each other — We know the 8500LE’s sweet price tag counts for something. But does the extra 25MHz of core and memory clock really matter? And how much does an extra 64MB of RAM help?

Those aren’t the only interesting matchups, but they are worth keeping your eye on as the test results unfold.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. All tests were run at least twice, and the results were averaged.

The test system was built using:

Processor: Intel Pentium 4 2.2GHz
Front-side bus: 100MHz (400MHz quad-pumped)
Motherboard: Abit BD7-RAID
Chipset: Intel 845
North bridge: 82845 MCH
South bridge: 82801BA ICH2
Memory size: 512MB (2 DIMMs)
Memory type: Micron PC2100 DDR SDRAM (CAS 2)
Sound: Creative SoundBlaster Live!
Storage: Maxtor DiamondMax Plus D740X 40GB 7200RPM hard drive
OS: Microsoft Windows XP Professional

For comparative purposes, we used the following video cards and drivers:

  • ATI Radeon 7500 64MB AGP with 6.13.10.6037 drivers
  • ATI Radeon 8500 64MB AGP with 6.13.10.6037 drivers
  • ATI Radeon 8500LE 128MB AGP with 6.13.10.6037 drivers
  • VisionTek Xtasy 6964 (NVIDIA GeForce3 Ti 500) with Detonator XP 28.32 drivers
  • Abit Siluro GF4 MX 440 64MB AGP with Detonator XP 28.32 drivers
  • VisionTek Xtasy GeForce4 Ti 4600 with Detonator XP 28.32 drivers

We also included a “simulated” GeForce3 Ti 200, because we could. We underclocked our GeForce3 Ti 500 card to Ti 200 speeds and ran the tests. The performance of the card at this speed should be identical to a “real” GeForce3 Ti 200. Likewise, we underclocked the GF4 Ti 4600 card to test it at GF4 Ti 4400 speeds. And perhaps most heinously, we overclocked the Radeon 8500LE 128MB card in order to simulate a Radeon 8500 128MB. (The card showed no signs of problems at the 8500’s 275MHz clock speed—perfectly stable.) If you can’t handle the concept of a simulated graphics card, pretend those results aren’t included.

We used the following versions of our test applications:

The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


Quake III Arena
First up is the well-worn favorite, Quake III. We’ll start at low resolutions, where polygon throughput and driver bottlenecks are the primary limiting factors, and move up quickly to 1600×1200, where pixel-pushing power (or fill rate) is the name of the game.
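To give you a rough feel for what 1600×1200 demands, here’s a back-of-the-envelope sketch. The overdraw figure is an assumed ballpark for a Quake III scene, not something we measured:

```python
# Ballpark fill-rate demand at 1600x1200. The overdraw (depth complexity)
# figure is an assumption for the sketch, not a measurement.
width, height, overdraw, target_fps = 1600, 1200, 3, 100
mpixels_per_second = width * height * overdraw * target_fps / 1e6
print(f"{mpixels_per_second:.0f} Mpixels/s")  # 576 Mpixels/s, single-textured
```

At 100 fps, that figure already exceeds the GeForce4 MX 440’s 540 Mpixels/s peak from our spec table, and that’s before multitexturing enters the picture.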

Obviously, we’re hitting a system-level bottleneck here; something other than the graphics card is limiting performance. This result actually bodes well for the ATI cards. Last time around, the ATI cards were measurably slower at low resolutions. Perhaps ATI has made some progress on its drivers.

As the resolution increases, the pack begins to separate, and we can see how things shake out. At 1600×1200, especially, the picture is clear. Despite a disadvantage in memory bandwidth, the GeForce4 MX 440 outpaces the Radeon 7500. Likewise, the GF4 Ti 4400 is over 20 frames per second faster than the Radeon 8500 at the same clock speed. In both cases, the NVIDIA cards have the performance edge.


Giants: Citizen Kabuto
Giants is one of a select few Direct3D games with a decent built-in benchmark function. Let’s see if the field looks any different here than in Quake III.

At low res, the NVIDIA cards have a small but consistent advantage.

Even at 1600×1200, there’s not too terribly much difference between the cards. The middle of the pack is especially tight, with the Radeon 8500 flavors and the GeForce3 Ti 500 all stacked up with only a few frames per second separating them.


Serious Sam SE
We’ll use Serious Sam SE’s magical benchmarking tools to look at the test data a different way. The graphs below plot frame rates over time in 1-second intervals. It’s a bit of a jumble to read, but bear with me, because there’s something to be learned here.

In order to make this interesting, the results below come from a single representative test run, not an average. I did it this way in order to demonstrate something. Watch for peaks and valleys in the frame rates, because those things (especially the valleys) affect playability much more than an “average frame rate” number.
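To make the point concrete before the real data, here’s a toy sketch with made-up numbers showing how an average can paper over exactly the kind of valleys we’re about to see:

```python
# Two hypothetical runs of per-second frame rates. The spiky run posts the
# HIGHER average, but its worst seconds are what you'd actually feel in-game.
steady = [62, 60, 61, 59, 60, 60, 61, 60, 59, 58]
spiky  = [85, 90, 12, 88, 86, 10, 91, 89, 87, 62]

for name, samples in (("steady", steady), ("spiky", spiky)):
    avg = sum(samples) / len(samples)
    print(f"{name}: average {avg:.0f} fps, worst second {min(samples)} fps")
```

The spiky run posts the higher average, but nobody would call dips into the low teens playable.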

It’s hard to read, but what you’re seeing here is the NVIDIA cards all bunched up just above the ATI cards.

Here, at about 5 seconds into the test, the Radeon 8500 64MB has a sharp spike downward in its frame rate compared to the other cards. It happens again, to a lesser degree, with the Radeon 8500 128MB at about 17 seconds. This kind of problem is a serious playability killer, and it’s something I’ve seen while playing Serious Sam SE on a Radeon 8500 card. I was pleased to be able to capture it in a graph. Check out the next two tests.

All of the Radeon 8500 cards are susceptible to downward spikes in frame rates, and the likelihood of a slowdown goes up as the screen resolution increases. Note that the 128MB cards seem just as prone to the problem as the 64MB cards. With these kinds of slowdowns, you’ll get smoother gameplay out of a GeForce3 Ti 200 card than out of any flavor of Radeon 8500. That’s something an FPS average just won’t show you.

I hesitate to speculate much about the cause of these performance spikes on the Radeon 8500 cards. I’ll leave the heavy lifting to folks more concerned with application-level tuning and the like. It’s possible Serious Sam’s self-tuning nature is causing a problem for the Radeon 8500 here. We use Serious Sam SE as an application-level benchmark, so we let the game engine tune itself. Serious Sam adjusts some video settings for individual graphics chips, and it’s possible the settings for the Radeon 8500 are holding it back. For one thing, certain models in the game are set to use ATI’s Truform technology by default, and those models will have much higher polygon counts than the stock in-game models. However, I tried turning off Truform as I was playing the game myself, and it did nothing to eliminate the intermittent slowdowns. Using smaller texture sizes seemed to help more.

I’m hopeful ATI and the game’s developers can alleviate the problem, but right now, the game isn’t very playable on an 8500.

UPDATE: Just before we went to press, we discovered this thread in the Serious Sam forums explaining that the problem has to do with the texture upload routine in ATI’s driver. Apparently, the problem will be fixed in future driver releases.


Comanche 4 demo
This benchmark is a late addition to the review. We actually held back the release of this article in order to add results for Comanche 4. This game uses pixel and vertex shaders, if available, to produce some stunning effects. It’s gotta be one of the most graphically advanced games out there. The nifty thing about this game engine is that it will use pixel shaders if it can, but if not, it will use multipass rendering to achieve the same effects—even on an older graphics chip without pixel shaders. Rendering in multiple passes can degrade performance and image quality, but Comanche doesn’t show any serious image quality degradation on a Radeon 7500 or GF4 MX.
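We don’t have Comanche 4’s source code, so take this as a purely illustrative sketch of how that kind of fallback logic tends to look. The pass names and the capability check are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Caps:
    pixel_shader_version: tuple  # e.g. (1, 4) on a Radeon 8500, (0, 0) on DX7 parts

def water_passes(caps: Caps) -> list:
    """Schedule render passes for a water effect, Comanche-style (hypothetical)."""
    if caps.pixel_shader_version >= (1, 1):
        return ["water_pixel_shader"]                  # one programmable pass
    # No pixel shaders: build up a similar look from fixed-function passes.
    return ["base_texture", "env_bump_approx", "specular_add"]

print(water_passes(Caps((1, 4))))  # DX8 hardware: single pass
print(water_passes(Caps((0, 0))))  # older hardware: multipass fallback
```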

This test is very much CPU-bound at lower resolutions; there’s not much difference at all between 640×480 and 1024×768. Only the higher resolutions really slow down the slower chips—and the Ti 4600, amazingly, is unfazed even at 1600×1200.

What sticks out like a sore thumb here is the massive performance gap between the Radeon 8500 64MB and the 128MB version. Somehow, Comanche 4’s test scene stresses the memory in the Radeon 8500 pretty seriously. Yet the GF3 Ti 500 card with 64MB of memory doesn’t have that problem.

The low-end cards don’t seem to suffer too much without pixel or vertex shaders. They are slower, but not much slower than they are in other games.


3DMark 2001
3DMark uses DirectX 8.1’s new features to good effect, and it packs loads of polys into many of its scenes. I consider it a decent predictor of performance in upcoming games that take advantage of vertex and pixel shaders.

The GeForce4 Ti cards run away with it, and the Radeon 8500 cards distinguish themselves with a relatively strong performance, as well. Not coincidentally, both the GeForce4 Ti and Radeon 8500 chips have dual vertex shaders and solid pixel shader implementations.

Without either sort of shaders, the Radeon 7500 and GeForce4 MX cards are simply outclassed here.

Game tests
Here’s how the cards performed on 3DMark’s individual game tests.

In the first three game tests, the top two slots and the bottom two slots are pretty well consistent. The GF4 Ti cards are always fastest, and the Radeon 7500 is almost always just a little slower than the GeForce4 MX.

In between, the Radeon 8500/LE cards and the GeForce3 cards battle it out.

The Nature test makes extensive use of pixel shaders, so the low-end cards just aren’t able to run it. Remarkably, the Radeon 8500 128MB outpaces the GF4 Ti 4600 to take the top spot. This scene uses loads of complex geometry and some nice-looking pixel shader effects, so both vertex and pixel shaders are working hard in this one.


Fill rate and pixel shader performance
I’ve tried to group 3DMark’s synthetic tests together as logically as possible. The first two of these tests measure fill rate.

This is pretty much the same pattern we’ve been seeing in games at higher resolutions, with the exception that the Radeon 8500 cards perform relatively better on this synthetic test than they do in most games.

The next two tests measure performance with the two most popular forms of bump mapping. Newer games are supposed to make extensive use of bump mapping, so watch carefully.

Notice that the GeForce4 MX wasn’t able to complete the Environment Bump Mapping test. That’s because the GeForce2/GeForce2 MX 3D engine never could handle this form of bump mapping. With a 3D core lifted right out of the GeForce2 MX chip, the GeForce4 MX can’t do it, either.

Speaking of things the GF4 MX can’t do, let’s look at pixel shader performance.

Running at the same clock and memory speeds, with the same amount of memory, the Radeon 8500 and the GF4 Ti 4400 are neck-and-neck in pixel shader performance. However, the next test was added specifically to capitalize on the more advanced pixel shader 1.4 capabilities of the Radeon 8500. The GeForce3/4 chips can execute the test, but they have to resort to multi-pass rendering (drawing the scene multiple times before pushing it out to the frame buffer) in order to do it.

Whoa. That wasn’t what we were expecting. So what gives?

Here’s the answer from the 3DMark 2001 SE FAQ:

Q: I have an ATI Radeon 8500, which should draw the water surface in the Advanced Pixel Shader test in a single pass, compared to my friend’s system with DX8 hardware that should draw it in two passes. Still I don’t see much performance difference. Shouldn’t my system be twice as fast?

A: The Advance Pixel Shader test is what we call a Feature Test, which means that we, above all, want to present some new technology. It was decided that a fall-back was to be included in addition to the 1.4 pixel shader, since the same looking effect can be achieved using pixel shader 1.0 hardware. These two different modes of that same test work a bit differently and should, therefore, not be directly compared. Both modes could be optimized to show more performance either way, but now the test is just optimized for maximum compatibility. Vertex shader performance also affects the score, somewhat, due to this compatibility optimization.

So the preceding graph was pretty much meaningless. Made you look, though.


Poly throughput and vertex shader performance
Finally, we’ll look at T&L and vertex shader performance. Keep in mind that the low-end cards here both have fixed-function T&L engines to remove some burden from the CPU, but they can’t handle the more complex operations a vertex shader can. As a result, the low-end cards will fall back on the 2.2GHz Pentium 4 processor in our test system to execute vertex programs. In DirectX 8, this fallback mode is automatic, because unlike pixel shaders, vertex shaders can be emulated with relative ease.

Meanwhile, the rest of the cards have vertex shaders, and they implement fixed-function T&L as a vertex program. Essentially, they’re emulating a traditional T&L unit in software, and that software runs on the chip’s vertex shader. Therefore, sometimes older chip designs with hard-wired T&L units can perform very well next to the newer chips.
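If you’re wondering what “fixed-function T&L as a vertex program” boils down to, it’s this: multiply every vertex position by a combined world-view-projection matrix and divide by w. Here’s a minimal sketch (using NumPy for the matrix math, with an identity matrix standing in for a real transform):

```python
import numpy as np

# Fixed-function T&L expressed as the "vertex program" it really is: one
# 4x4 matrix multiply per vertex, then the perspective divide. Vertex shader
# hardware runs this on-chip; the DX8 software fallback runs it on the CPU.
def transform_vertices(positions, wvp):
    homogeneous = np.hstack([positions, np.ones((len(positions), 1))])
    clip = homogeneous @ wvp.T           # transform into clip space
    return clip[:, :3] / clip[:, 3:4]    # perspective divide

verts = np.array([[0.0, 1.0, 5.0], [1.0, -1.0, 4.0]])
wvp = np.eye(4)                          # identity stands in for a real matrix
print(transform_vertices(verts, wvp))
```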

Anyhow, the theory police are on to me, so I’ve gotta move on.

These two tests use only fixed-function T&L, and you can see the GF4 MX outpacing the GF3 Ti 500. However, the GF4 and Radeon 8500 cards are all faster than the GF4 MX and Radeon 7500.

Here’s a true vertex shader test, and yes, even the GF3 Ti 200 can outperform the Pentium 4 2.2GHz running DX8’s software vertex shader routines. As in the Nature test, the Radeon 8500 beats the Ti 4600 here. No doubt ATI’s vertex shaders are formidable. The GeForce3, with only a single vertex shader unit, can’t keep up with the dual-shader chips.

Point sprites or particles are generally handled by the vertex shader, so we’ll lump them in here. Our contestants line up just about like you’d think, but the Radeon 7500 all but has a nervous breakdown. Ouch.


Antialiasing performance
We’ll close out our massive 3DMark comparo with some antialiasing action. We’ll cover AA in all of its forms in a future article, but the tests below will give you a good sense of how these chips perform with AA turned up.

At 640×480, even 4X mode doesn’t cause serious slowdowns with most of these cards. They have ample fill rate to handle 4X mode at this resolution.

Now we’re starting to see the pack separate. Look at the GeForce4 Ti cards go! The GF4 Ti 4400, with the same memory and clock speeds, delivers double the frame rate of the Radeon 8500. Yes, NVIDIA’s multisampling AA method is more efficient than ATI’s supersampling, but this is amazing.
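Here’s a first-order picture of why multisampling wins: at 4X, supersampling runs the pixel pipeline once per sample, while multisampling shades roughly once per pixel and replicates coverage. This sketch ignores bandwidth, compression, and edge cases; it’s just the gist:

```python
# First-order shading cost at 4X AA, 1600x1200: supersampling shades every
# SAMPLE, multisampling shades roughly once per PIXEL.
width, height, samples = 1600, 1200, 4
pixels = width * height
print("supersampling shading ops per frame:", pixels * samples)  # 7,680,000
print("multisampling shading ops per frame:", pixels)            # 1,920,000
```

Both approaches still store and blend four samples per pixel, so memory traffic takes a 4X-ish hit either way; the difference is that supersampling also quadruples the shading and texturing work.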

At this resolution, only NVIDIA’s GF4 Ti cards with 128MB can handle 4X AA mode. Oddly, the Radeon 8500/LE 128MB cards won’t do it; they drop back into 2X AA as if they didn’t have enough frame buffer memory. Perhaps the Radeon 8500’s drivers just haven’t been updated to use the extra RAM for anti-aliasing at higher resolutions yet.
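Some quick arithmetic shows why frame buffer memory becomes the gating factor here. Assuming 32-bit color, a 32-bit Z/stencil buffer, and four stored samples per pixel (a simplification; real buffer layouts and driver overhead vary):

```python
# Rough buffer budget for 4X AA at 1600x1200.
width, height, samples, bytes_per = 1600, 1200, 4, 4
msaa_color = width * height * samples * bytes_per   # multisampled color buffer
msaa_z     = width * height * samples * bytes_per   # multisampled Z/stencil
front      = width * height * bytes_per             # resolved front buffer
print(f"{(msaa_color + msaa_z + front) / 2**20:.0f} MB")  # ~66 MB
```

That’s before textures and geometry data claim their share, which makes 4X AA at 1600×1200 realistically 128MB territory, once the drivers cooperate.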

Anyhow, look how fast the Ti 4600 is. In 2X AA mode, it’s nearly as fast as a Radeon 8500 without antialiasing. NVIDIA has produced an AA monster.


SPECviewperf
I’ve about had enough benchmarks, but we’ll add one more to the list for the sake of completeness. Viewperf’s workstation-class tests use OpenGL for both wireframe and solid rendering tasks, which makes them quite different from our gaming benchmarks.

It’s a back-and-forth battle between the ATI and NVIDIA cards, which is good news for ATI. The Radeon 8500 cards have made great strides in OpenGL performance since last time we tested a Radeon 8500, when we used an older driver set. Advanced Visualizer (AWadvs) and MedMCAD show substantial performance improvements, and if you’re planning on using ProCDRS for industrial design, get a Radeon 8500. (Well, OK, you might want to get a workstation-class card, instead, but you see what I’m saying.)


Conclusions
I’ve been around 3D graphics on the PC since the beginning, and it’s hard not to like what we’re seeing here, from top to bottom. Even the slowest card of the bunch, the $75 Radeon 7500, is breakneck fast at mid-level screen resolutions.

The Radeon 7500 and GF4 MX can’t always cut it when all of the next-gen graphics features are turned up. Then again, they performed pretty well in the Comanche 4 demo, so maybe vertex and pixel shaders won’t be as necessary as we had once thought. I wouldn’t bank on that, though. If you can afford it, move up to a card with real vertex and pixel shaders.

One of the most striking results of our tests is how much performance NVIDIA is able to wring out of a given hardware spec. The GeForce4 MX card has a lower clock speed and slower RAM than the Radeon 7500, but it outperforms the 7500 more often than not, especially in fill rate-limited scenarios like high resolutions. You’d think that with faster RAM, the Radeon 7500 would be faster in such situations. Similarly, the GF4 Ti 4400 matches the Radeon 8500 clock-for-clock, but the Ti 4400 is faster nearly across the board.

Obviously, NVIDIA’s bandwidth conservation methods—like the GF4 line’s crossbar memory controller and occlusion detection abilities—are much more effective than ATI’s. The GeForce3 has the same basic set of bandwidth-saving features as the GF4 line, but those features are greatly improved in the GeForce4 Ti. The progress is most evident when antialiasing is in use. The GeForce4 Ti cards are practically magic when it comes to antialiasing. I thought the line about turning on 2X AA and not seeing any performance drop was just a marketing spiel. Turns out that spiel is mighty close to the truth.

The GeForce4 Ti series doesn’t bring a whole lot of new 3D technology to the scene, but the refinements since the GeForce3 deliver gobs of real-world performance. VisionTek’s Ti 4600 card pretty much blew away everything else in our tests. It’s really no contest. If you want the fastest card on the planet, get a Ti 4600.

That said, ATI has some very solid products in the Radeon 8500 series. Right now, there’s a gaping hole in the middle of NVIDIA’s product lineup, because the GF4 MX 460 is apparently stillborn (I challenge you to find a GF4 MX 460 for sale anywhere). Its apparent replacement, the GF4 Ti 4200, isn’t quite here yet. ATI forced NVIDIA’s hand with the Radeon 8500LE 128MB, which is a bargain at $199 or less. Even after the Ti 4200 cards arrive, the Radeon 8500LE 128MB will be an attractive card for the price. The extra RAM makes the Radeon 8500LE 128MB perform about like a Radeon 8500 64MB, despite the 25MHz clock speed gap. ATI has made notable progress on its drivers, and the Radeon 8500 GPU is still more advanced, in some ways, than even the GeForce4. I’m finding myself recommending ATI cards to friends and readers who want a good all-around card at a decent price.

As for the GF4 Ti 4200, it should hit store shelves near the end of April. The cards will come in two basic configurations: a 250MHz core and 64MB of DDR memory at 500MHz for around $179, and a 250MHz core paired with 128MB DDR memory at 444MHz (don’t ask) for about $199. Once those cards arrive, the Radeon 8500LE will have some real competition. I expect the prices of both the LE and the Ti 4200 to drop pretty quickly.

So that’s the story on performance. We’ll explore some of the intricacies of edge and texture antialiasing in a future review, and we’ll revisit the performance scene once we have a GF4 Ti 4200 card in our hot little hands, which should be very soon. 

Comments closed
    • Anonymous
    • 17 years ago

    Wow that was some pretty comprehensive and informative testing! And after all that i decided to go with the Radeon 7500.
    Oh well. Can’t go wrong for $55

    • Anonymous
    • 17 years ago

    Hi as this test is only a 3D Test you should also think of the other features you get, a very big Problem of Nvidia Cards is the bad Image Quality (because of bad design). An ATI has a very clear Output instead.

    Then when you like watching Movies (DVD, DivX) on your TV the Nvidia Solution is shit. Got myself a Radeon 8500 everything works Great, dual screen, the DVI Output. And in my opinion the difference between is 150 an 120 Frames is completly useless (except to be proud of)

    • Anonymous
    • 17 years ago

    *[

    • Anonymous
    • 17 years ago

    Perhaps it can dither it, and perhaps that explains the often disparate opinions on the quality of displays?

    • cRock
    • 17 years ago

    Assuming your flat panel is not a POS, it should be able to handle a 32 bit color signal and dither it down to 24 bit or whatever it is capable of displaying.

    All I know is that my Sony SDMM51D looks much better on my G450 than my Radeon 7500.

    • Anonymous
    • 17 years ago

    I don’t know if its generally on a gamers consideration list, but from what I can tell, the Radeon is the better card for driving a flat panel, if you happen to have or may possibly get one.

    It appears that the Nvidia solution only drives a panel in 16 bit mode since panels can either run 16 or 24 bit color and only the Radeon has a 24bit color mode.

    I’m thinking come the latter half of this year their may be a lot more people considering them, if for nothing more than a nice second monitor, and its a shame that the Nvidia solution would force them to run 16bit color. Talk about a step back into history.

    • elmopuddy
    • 17 years ago

    I just ordered my Visiontek Ti4600…

    I may just kiss the FedEx guy..

😉

    EP

    • Anonymous
    • 17 years ago

    who cares the english blow!

    • BabelHuber
    • 17 years ago

    [quote]
    Or could it be that you have another culprit in your fathers machine in regards to fps in q3. Like you said the drivers could be a problem…
    [/quote]

    Exactly! I

    • h0rus
    • 17 years ago

    Forge,

    I know that after enabling that enhanced client sampling in OSP, I dislike the idea of playing with it off. The mock feel of playing with a machine that can crank out > 100fps is just amazing. 🙂

    • h0rus
    • 17 years ago

    66,

    That’s just getting used to a certain standard, and not wanting to go below again. I’m certain if I had a leet setup, I wouldn’t want to go back to my shitty one. Just as I don’t ever want to boot up my P133 again! 🙂

    • Anonymous
    • 17 years ago

    by the time it finally does come out it will be obsolete.

    – danny e.

    • Anonymous
    • 17 years ago

    will software ever catch back up to hardware?
    by the time DOOM comes out…. the R300 will be out… and maybe the GeForce 5. .. video cards double in speed about every year…. 3d shooters seem to take nearly 2 years in development…

    hmmm me tired.

    *poof*

    – danny e.

    by the time Duke Nukem Forever is out…. the R5000 will be out… and i will be old an gray or grey even. Seriously… does anyone give a cr*p about that game anymore??? i was really anxious for it to come out about 4 years ago.

    • Hallucinosis
    • 17 years ago

    Yes Forge– as you’ve maintained time and time again that your level of acceptability for sound, video, and various other things is much higher than everyone else, it’s clear to me that you have extrasensory superpowers. I’m glad my sense are dulled just enough to not be irked by 60FPS and 192kbps (hq) MP3s.

    =)

    • Anonymous
    • 17 years ago

    AG64, What? I don’t get it. Are you trying to simulate Half-life/Quake3 on a PC but with Xbox-style controls?

    • Anonymous
    • 17 years ago

    63,

    Or play with the fps_max command in Half-life or com_maxfps in Quake 3. Bind keys to change the values.

    Hell, double blind it. Give one person the keyboard, one the mouse.

    • Anonymous
    • 17 years ago

    *[

    • Forge
    • 17 years ago

    Droopy – It’s a very subjective thing. Comparing ‘acceptable’ fps between people is an exercise in futility.

    For example:

    I am not happy with anything less than ~75, and I prefer >100.
    My buddy Walt is happy with anything ~ or over 60.
    My buddy Keith is happy with anything over roughly 30 fps steady.

    It’s all in your opinion.

    • droopy1592
    • 17 years ago

    Indeego, not to be an asshole, but how many FPS do you want? I thought 60 was enough for GP, and 85 for high paced action. i agree 40 is kinda low, but what’s acceptable?

    • droopy1592
    • 17 years ago

    Nah, his dad’s puter only has 64MB of RAM and running XP. That’s enough to slow you down.

    • Aphasia
    • 17 years ago

    So what you are saying is that either that your Geforce256 DDR get more then double the framrate of mine. Or that the Damage’s benches of the 7500 is wrong.

    Or could it be that you have another culprit in your fathers machine in regards to fps in q3. Like you said the drivers could be a problem…

    What i want to say is that you shouldnt say that the card is crap unless it is. What i have seen it seems to be quite good cards if you just find a driver that works.

    cheers

    • BabelHuber
    • 17 years ago

    Good Review!

    However, I cannot recommend the radeon 7500 for gamers. My father

    • indeego
    • 17 years ago

    #55,
    ummm, your joking, right?

    • Forge
    • 17 years ago

    AG #51 – Not a stripped R200, but a built up R100. It’s a Radeon1 on .15 micron with Hydravision built in. That’s all.

    • Anonymous
    • 17 years ago

    Very good article.

    Is it just me but are these cards well ahead of any software(games) that can actually give them a work out.

    I play Serious Sam SE @1600×1200 on my Radeon 8500 with all the setting set to high just so that I can keep up with the game. Its way to fast @1280 x 1024 or lower. Give me a sustained 40fps with all the eye candy @ 1280 x 1024 and I am happy.

    Or maybe I am getting to old and my feeble hands cannot keep up to my nimble eyes?

    • Anonymous
    • 17 years ago

    Hot damn, you shot my forums’ activity through the roof. Nice article, and nice linkage ๐Ÿ™‚

    “Currently Active Users: 68
    There are currently 15 members and 53 guests on the boards. | Most users ever online was 227 on 04-04-2002 at 05:44 PM.”

    -Rodzilla
    -http://www.seriouscommunity.com

    • Anonymous
    • 17 years ago

    Excellent review.

    Speaking as an owner of both the Radeon 7500 and a nicely-overclocking “classic” GeForce3 (and a small meangerie of other 3D cards), I found the results for those two board groups roughly comparable to my own experiences.

    This was refreshing, because I had been expecting the 7500 to perform faster than a GF3 and was a bit surprised when the GF3 turned out to be a lot faster — especially in HalfLife, which is getting pretty old now. And I had that 7500 clocked pretty high, too.

    -av

    • Anonymous
    • 17 years ago

    *[

    • Anonymous
    • 17 years ago

    wait a minute, I thought the 7000 and 7200 were based on the original version of the Radeon, and the 7500 was a stripped-down version of the 8500 core?

    • Anonymous
    • 17 years ago

    50th Post. wOOt!

    • Anonymous
    • 17 years ago

    Who gives a shit? The french suck.

    • freshmeat
    • 17 years ago

    Doh! Took too long to type . . .

    • freshmeat
    • 17 years ago

    Not to kick the Frenchies while they’re down (why did the French plant trees along the Champs Elysees? So Otto von Bismarck could march in the shade . . .), but I can only go so far as a heavily qualified “four years of resistance”. Despite every old frenchies claim to have been in the resistance, for most of the period of German occupation, the resistance was small and hunted, mainly becasue of a massive collaboration problem. And let us not forget Marshal Petain, hero of Verdun, and the Vichy Regime. So outside of a very small heroic resistance, and the Free French forces in England, you had a very quick surrender followed by active collaboration . . .

    • Anonymous
    • 17 years ago

    A few points, to some posters and to the article writer (retyped in haste as my previous went poof since my password seems to be invalid :/ ):

    1) There are a few 8500 cards with VIVO now. There are the Gigabyte cards, which I think are VIVO, and probably a few of the other new 3rd party cards, and there is the non-DV 8500 AIW from ATI, which is only available with a new Compaq computer I believe. Perhaps it will be generally available soon, or maybe someone is selling it on EBay already.

    2) The Comanche benchmark is extremely questionable. From what I understand, it simulates the water effect using multipass rendering if it doesn’t detect the card as supporting pixel shaders. The problem is that the method of detection at default doesn’t recognize the 8500 cards. A file must be edited to circumvent this, from what some have said, and I just don’t understand why articles that are using this benchmark don’t address this issue. It would be appreciated if it was stated this issue was recognized and dealt with, and that the file editing was observed to have an effect, and what that effect was, if this benchmark is to be used.

    3) I have a bone to pick for recent reviews/articles not using or even mentioning the 6043 and 6052 drivers which fixed a pretty well established, and major, performance issue exhibited by very demanding OpenGL applications. It is possible I missed a mention in the article, however I still don’t see why in the world they couldn’t be used at all.

    4) Using SS numerical benchmark results with no analysis or control of the graphics setting seems worse than useless, it seems to leave the door wide open to negligent misrepresentation. It could be that the 8500 is done a great injustice, or it could be the 4600, but to knowingly use such results without any regard to what the self-tuning changed seems extremely counter-productive.

    • Pete
    • 17 years ago

    Re: Vertex shaders in the 7500

    Nope, the 7500, like the GF4MX, ony offers hardware TnL, but not of the programmable kind. “Shader” denotes programmable, which everything other than a GF3/4/8500 is not. Shaders were introduced with DX8, I believe, and only fully-DX8 cards (not merely compliant ones) can shade in hardware. The earlier ones must resort to shading in software.

    If it makes it easier to understand, think of the shaders as engines that DX7-level hardware is without. Or something. 😉

    • R2P2
    • 17 years ago

    I commented on the slashdottings right after my last post, but the server didn’t respond when I hit “Post”. Those slashdotters are quick. So, to answer the question I asked in the post that didn’t get posted: No, the server doesn’t quite seem to be able to handle 2 slashdottings in one week.

    • Steel
    • 17 years ago

    Wow. Two slashdottings in a week, impressive. Must’ve heard about the nifty new index feature…. 😉

    I’m looking to get a Radeon to replace my GF2 based Asus card for my multimedia machine. The Asus can capture video OK but playing it back to the TV gives me color banding and oversaturated colors. I haven’t seen an 8500 with VIVO ability sans TV tuner aside from a few entries on PriceWatch. The 8500DV has a pretty hefty price tag, I’ll probably wind up getting the AIW 7500.

    • R2P2
    • 17 years ago

    Are you sure about the Radeon 7500 not having vertex shaders? I thought vertex shaders were introduced in the GF2, and therefore the Radeon 7500 should have them, since I think it had all the features of the GF2 plus a couple more. ‘Course, even if it does have a vertex shader, it clearly isn’t too speedy.

    • Anonymous
    • 17 years ago

    Just wanted to know if TR is able to make an update with the 6052 Radeon drivers?

    If been using them for quite some time now, and they kcik ass – no bugs that I have experienced, and damn fast too. I Saw them beating the 6037, by quite a bit in some benchmarks. Cant really see why the review’ers cant use them in test, there are already some who use them?

    If it is such a big problem, why dont try both and show both results – dont really thinks it gives a realistic picture of Radeon 8500 potential(and the performance that most Radeon users are enjoying). The time it takes to use them both maybe? – just want a reason?

    • Anonymous
    • 17 years ago

    i have a question for anyone/everyone…
    i have a Radeon 64MB VIVO now…. and will probably want to upgrade to the R3 this fall… however I LOVE my VIVO for recordering tv to mpeg files…

    does anyone know of a GOOD quality .. relatively inexpensive pci board with doing this? i realize that the radeon vivo is not that great at making mpegs… but its decent and its a side feature…

    – danny e.

    • Anonymous
    • 17 years ago

    *[

    • Forge
    • 17 years ago

    ChangWang – I think that third TMU gets used more than you think. Just off the top of my head, both Serious Sam games and Tribes 2 use it fully.

    In fact, Until the GF3 with it’s loopback texturing, the Radeon was significantly faster in Tribes2. My friend’s AIW R1, at only 166/166, was nearly a match for my GF2 Ultra. That says a hell of a lot in favor of that third TMU.

    ATI did find more pipes to be more useful than extra TMUs, but the third TMU was far from useless.

    • Damage
    • 17 years ago

    ChangWang:

    Ah, yes, we i[

    • rabbit
    • 17 years ago

    enjoy Absolut Hoss wodka

    • ChangWang
    • 17 years ago

    You guys seem to forget that 7500 has a high fillrate because of its 3 texture unit per pipe design. In most games, that 3rd texture unit doesn’t get used, which is why they changed it on the 8500 series.

    • Anonymous
    • 17 years ago

    I nearly sh!te my pants when I saw that index! Good show Damage. I had even stopped reading the articles for a while.

    • Kevin
    • 17 years ago

    There was an index? I didn’t see anything but all those purdy graphs. 🙂

    • IntelMole
    • 17 years ago

    Okay, I’ve decided to stay away from ATI for good now, their tech is sooooo good that it gets beat by lesser hardware 🙂

    Besides, it don’t matter, my card won’t get bought till October-ish 🙂

    Oh, and the index didn’t get seen until you mentioned it here,
    IntelMole

    • Anonymous
    • 17 years ago

    1) 4 years of resistance to Nazi occupation is not what I would call a quick surrender by the French. It was flawed military strategy and the dependence upon the Maginot (sp?) line of static defenses that caused the quick downfall of the French. They may be smelly ๐Ÿ˜‰ but they don’t surrender easily. Maybe you just don’t like the French-Canadians? Come on, give us your real opinion!

    2) An Index. NOOOOO. Now what will happen to all the AGs who posted bitching about no index! Since they never had anything to contribute other than complaints, what will they do now? [i]DOWN WITH THE INDEX, SAVE THE WHINEY AGs[/i] 🙂

    -MadManOriginal

    • Dposcorp
    • 17 years ago

    Great Article. I always enjoy reviews like this, its good to see so many products side by side. makes choosing easier

    • Mr Bill
    • 17 years ago

    Very well written article, I have video card envy. Did’nt notice the index till page 13. Boy you guys are fast when you decide to get after something. Just for fun, you should only have the index on the last page, at the bottom. Ba Bump

    • Anonymous
    • 17 years ago

    http://216.12.218.25/domain/www.beyond3d.com/forum/viewtopic.php?t=564

    For me its not that bad. Really depends on the type of game and the action. I would like to have both optins. Ansio ATI style for games were FPS is a most. Ansio nVidia style where CPUs or other things are the bottle necks...

    • droopy1592
    • 17 years ago

    What’s an index?

    : }

    • Anonymous
    • 17 years ago

    *[

    • Xylker
    • 17 years ago

    technophile: The EPP is an Employee Purchase Program, but the program ended on 4/1/02. I work for a large (with a capital L) OEM that has a reciprocal agreement with ATi. I wish that I could help you out because I REALLY do like this card. I am watching TV right now even as I type this!

    • technophile
    • 17 years ago

    xylker, through ATI’s what? I want a cheap 8500 aiw! Please explain, lest I purchasea meager 7500 aiw instead…

    • monaco
    • 17 years ago

    there’s an index! awesome! thanks for putting this in guys!

    • TheCollective
    • 17 years ago

    Good review as always. Tell me, do anistrophic filtering and trilinear filtering work together on the Radeon 8500s or not?

    I am seriously considering the Hercules card for a new gaming rig im building and need to know this. My other choice is a Gainward GeForce 3 Ti200.

    • Anonymous
    • 17 years ago

    *[

    • Forge
    • 17 years ago

    OK, just a little rant, real quick:

    I have a problem with the phrase ‘Product X is great cause it overclocks to Product Y levels!’. OK, yes, that’s great if all you were looking for was the performance of the higher-clocked part, but I see a growing trend among people to disregard the top-clocked part entirely, since the next-best card can usually reach the speeds of the top part.

    Basically, I’m saying that the Ti4600 is not sitting around at 300/650 waiting for your Ti4400 to catch up. Those suckers are unwinding to 320/725-750 on average. No Ti4400 I’ve heard of has managed to hit 320/750….

    • Anonymous
    • 17 years ago

    Good review!

    The 8500 Serious Sam SE texture thrashing issue has been fixed in the last couple of ‘leaked’ drivers. Course, you couldn’t use those for your tests.

    Overall, I think the 8500’s have strong drivers now. I play lots of games and have no problems. I’m glad I bought this card.

    The 8500 is a card you can go get today for a great price, and feel assured that it will rock with UT2 when released.

    sBarash

    • Anonymous
    • 17 years ago

    *[

    • Anonymous
    • 17 years ago

    Finally an index. Now I don’t have to change the page numbers in the Address bar 😉

    Nice article.

    • Anonymous
    • 17 years ago

    Bah! An index? Now what are we going to complain about?

    I know – lets start a ‘Move the index to the top’ campaign 🙂

    “absolutely hoss” – wtf?

    BTW – good article

    • Anonymous
    • 17 years ago

    *[

    • Despite
    • 17 years ago

    “heapy-cheapy” ? “absolutely hoss” ?? Hopefully any of your international readers will be intelligent enough to figure out what you mean from context.

    heapy-cheapy? Dude, you’re just making up words now.

    btw, index == good.

    • elmopuddy
    • 17 years ago

    Seems the Ti4400 is a good deal, expecially since it’ll probably OC to 4600 levels..

    mmmmm

    Tasty review too!!

    Great job Damage!!

    EP

    • Anonymous
    • 17 years ago

    Question: did you run the quake benchies using max graphics quality settings, or one of the other stock settings? 1600 res with max gfx settings yielding 150+ fps….

    • rabbit
    • 17 years ago

    [q]absolutely hoss!![/q]

    more things in life should be this way… like the way this article is! That 7500 guy would be good to drop into my parent’s system…

    • Pete
    • 17 years ago

    Good work, Damage. Surprising how affordable and feature-pcked a card the 7500 is. Shame it still suffers from ATi’s less-efficient drivers/hardware.

    So…were there any real differences among the cards in 2D quality (text legibility)? Or did you not have sufficient monitor/time?

    • Anonymous
    • 17 years ago

    *[Originally Posted by Trident

    • Iago
    • 17 years ago

    And Damage said, let there be a drop-down menu. And it was good.

    • EasyRhino
    • 17 years ago

    Man, I haven’t even read the story, and I’m tingling with excitement.

    I’m also very afraid.

    Oh well, onwards…

    ER

    • Anonymous
    • 17 years ago

    Excellent article. Nice that you looked into the sustained vs. average framerate on Serious Sam.

    Look at the 7500 cream the MX440 in the multitextured fillrate test, but it gets beaten by a fair margin across the board in real game tests. Interesting stuff. The MX440 looks like it can command a slightly higher price than the 7500, especially considering how bad it trashes the ATI in AA scores.

    Also interesting to notice how far the Ti200 is behind in the DX8 tests. The Ti500 is in the middle of the pack, within a few percent of the GF4 Ti cards, but the Ti200 is just left in the wake purely due to clock speed. One might venture to guess the GF4 Ti and possibly 8500 are really being limited by the system.

    Once again ATI hardware looks really good in theoretical tests, but then just doesn’t put the pieces together for the game tests. Blame drivers, blame poor efficiency, blame Canada. I’m especially put out considering ATI’s texture filter is not quite as accurate as Nvidia’s overall. And I like my saturation boost. Too bad Nvidia can’t get their act together on refresh rates and the damned gamma control.

    ATI definitely needs to get their ass in gear on the AA front. Sure MSAA isn’t all that and a bag of chips, but I think its a more intelligent tradeoff in the AA world than a crappy texture filter is when it comes to filtering textures. (if I set my game to trilinear, it should ****ing show trilinear, damnit!)

    • Anonymous
    • 17 years ago

    I don’t believe it! It feels like I’m on Anandtech! wohooooooo.

    I always go to the conclusions first, and then work my way into the nitty gritty. Anyone else do the same?

    • Anonymous
    • 17 years ago

    third post!

    • technophile
    • 17 years ago

    Well, at least you didn’t say “FIRST POST!”

    • Anonymous
    • 17 years ago

    Oh, my god! There is an index. THANK YOU.

    Now Tech Report has been improved by at least 24.41%.
