Sapphire’s Radeon X800 graphics card

Manufacturer Sapphire
Model Radeon X800
Price (list) $249
Availability Now

THOSE OF YOU who have been hanging around here for a while know that I tend to get rather excited about a really good mid-range graphics card. Sure, the more expensive models are nice, and it’s one of the perks of the job that I get to play around with them now and then. But when a GPU maker takes all of the technology built into a $500 luxury toy and crams it into a $200-ish package that offers decent performance, that’s serious business. For many of us, NewEgg and the ol’ credit card are about to have a rendezvous.

That’s why we were excited when the new generation of eight-pipe graphics cards arrived this past fall, including the GeForce 6600 series and ATI’s answer, the Radeon X700 line. The GeForce 6600 GT, in particular, was an excellent newcomer, and ATI countered by announcing the Radeon X700 XT, which we promptly reviewed. ATI’s new mid-range card was a little slower than the GeForce 6600 GT, but the Radeon X700 XT wasn’t a bad option, save for one thing: you couldn’t buy one. ATI kept promising that we’d see them soon, but very few cards ever materialized. In the end, the Radeon X700 XT was stillborn, and ATI announced a replacement with better performance at the same price: the Radeon X800.

The Sapphire Radeon X800 card that we’re reviewing today is one of the very first Radeon X800 cards available on the market, and it promises to be stiff competition for the GeForce 6600 GT. In fact, given the Radeon X800’s 12-pipeline design, this shouldn’t be a fair fight. But it should be fun to watch.

Sapphire’s new gem
Let us say right up front that Sapphire’s rendition of the Radeon X800 is not the $199 card that ATI predicted. Instead, this card ships with 256MB of GDDR3 memory and a robust bundle of goodies for a list price of $249. That’s fifty bucks well spent to get the extra memory, as far as I’m concerned. Let’s have a gander at the card itself.

Sapphire’s Radeon X800

This is a PCI Express card (sorry, upgraders; ATI hasn’t announced an AGP version yet) that uses the exact same PCB design as its big brother, the sixteen-pipe Radeon X800 XL. That’s expected, because the Radeon X800 is based on the same ATI R430 chip, only it’s had one of its four pixel-pipeline “quads” disabled—ostensibly because that section of the chip didn’t come out quite right, although sometimes a perfectly good section of a graphics chip might be disabled for product positioning purposes.

Like the XL, Sapphire’s Radeon X800 has VGA and DVI outputs, plus a TV-out port. The major visible differences between ATI’s Radeon X800 XL card and Sapphire’s Radeon X800 are the Sapphire’s bright blue hue and its smaller copper cooler. Let’s have a close-up of that puppy, please.

Sapphire’s X800 cooler has a scary-looking chick on it

This cooler is shaped a little like those on the Radeon X700 XT review unit that we tested a while back, but it’s not as heavy as that one was, and it doesn’t make as much noise. I’d say this cooler is roughly as quiet as the one on the GeForce 6600 GT, subjectively speaking. (The sound level meter is out on loan, or I’d have numbers for you. Sorry.) One of the keys to that quietude may be the cooler’s unique blower design. Notice how the blades aren’t angled like a fan’s; they’re straight, intended to scoop air out across the copper fins of the cooler. They really do move a lot of air.

Sapphire doesn’t skimp on the extras bundled with this thing, either.

The X800 comes with a nice mix of accessories and software

The bundled software includes (from left to right in the picture) Sapphire’s home-brewed overclocking utility, a drivers disc, CyberLink’s PowerDVD 5 DVD player, and two very decent games—Splinter Cell: Pandora Tomorrow (not to be confused with Splinter Cell: Peanut Butter Monkey) and Prince of Persia: The Sands of Time. Graphics card game bundles don’t get much better than this one, in my book. Sapphire also packs a clutch of cables and adapters into the box, including a DVI-to-VGA dongle, a composite video adapter, a composite video cable, an S-Video cable, and a component output adapter cable for HDTV. All in all, a very decent package.

There is one place where Sapphire has skimped a little, and that’s the core clock speed of its Radeon X800. The official word from ATI at its introduction was that the Radeon X800 would have a 400MHz core clock and 700MHz memory, but Sapphire’s card ships with a 392MHz core and 700MHz RAM. The 8MHz difference won’t exactly knock the Earth off its axis, but it’s possible that other manufacturers’ versions of the X800 will run at exactly 400MHz, for what it’s worth.
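For perspective, here’s a back-of-envelope sketch (Python; purely illustrative, variable names are mine) of what that 8MHz costs in theoretical fill rate:

```python
# What Sapphire's 392MHz core (vs. ATI's stated 400MHz) costs in theory
pipes = 12
stock, sapphire = 400, 392                      # MHz core clocks
deficit_mpixels = (stock - sapphire) * pipes    # lost theoretical fill rate
pct = (stock - sapphire) / stock * 100
print(deficit_mpixels, f"{pct:.0f}%")           # 96 Mpixels/s, a 2% deficit
```

In other words, a rounding error in practice, which squares with the “won’t knock the Earth off its axis” assessment.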

The big question, of course, is: how does this carefully calibrated combination of Radeon X800 pixel pipelines, memory chips and clock speeds perform in today’s games? For that, we have some answers…


Test notes
I have committed the ultimate sin against marketing types everywhere by comparing the Radeon X800 against a slew of siblings who are using older drivers than the X800. Forgive me. I used these slightly older results because I wanted to be able to compare the X800 against a wide range of competitors without benchmarking myself into a coma.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least twice, and the results were averaged. All graphics driver image quality settings were left at their defaults, with the exception that vertical refresh sync (vsync) was always disabled and geometry instancing was enabled on ATI cards.

Note that some of our graphics cards are fakes. Specifically, the Radeon X850 XT is actually a Radeon X850 XT PE card that’s been underclocked, and the pair of GeForce 6800 GT cards in SLI is actually an underclocked pair of 6800 Ultras.

Our test systems were configured like so:

Processor | Athlon 64 4000+ 2.4GHz | Athlon 64 4000+ 2.4GHz | Athlon 64 4000+ 2.4GHz
System bus | 1GHz HyperTransport | 1GHz HyperTransport | 1GHz HyperTransport
Motherboard | Asus A8V Deluxe | Asus A8N-SLI | NVIDIA reference
BIOS revision | 1008 beta 1 | 1001-009 beta 009 | 4.70
North bridge | K8T800 Pro | nForce4 Ultra | nForce4 Ultra
South bridge | VT8237 | |
Chipset drivers | Hyperion 4.55 | ForceWare 6.31 beta | ForceWare 6.31 beta
Memory size | 1GB (2 DIMMs) | 1GB (2 DIMMs) | 1GB (2 DIMMs)
Memory type | OCZ PC3200 EL DDR SDRAM at 400MHz | OCZ PC3200 EL DDR SDRAM at 400MHz | OCZ PC3200 EL DDR SDRAM at 400MHz
CAS latency (CL) | 2 | 2 | 2
RAS to CAS delay (tRCD) | 2 | 2 | 2
RAS precharge (tRP) | 2 | 2 | 2
Cycle time (tRAS) | 5 | 5 | 5
Hard drive | Maxtor MaXLine III 250GB SATA 150 (all systems)
Audio | Integrated VT8237/ALC850 with 3.66 drivers | Integrated | Integrated
Graphics 1 | GeForce 6800 128MB AGP with ForceWare 66.93 drivers | GeForce 6800 GT 256MB PCI-E with ForceWare 66.93 drivers | GeForce 6600 GT 128MB PCI-E with ForceWare 66.93 drivers
Graphics 2 | Radeon X800 Pro 256MB AGP with 8-08-rc2-019256e drivers | Dual GeForce 6800 GT 256MB PCI-E with ForceWare 66.93 drivers | Radeon X700 XT 128MB PCI-E with 8-08-rc2-019256e drivers
Graphics 3 | Radeon X800 XT Platinum Edition 256MB AGP with 8-08-rc2-019256e drivers | GeForce 6800 Ultra 256MB PCI-E with ForceWare 66.93 drivers | Radeon X800 XT 256MB PCI-E with 8-08-rc2-019256e drivers
Graphics 4 | | | Dual GeForce 6800 Ultra 256MB PCI-E with ForceWare 66.93 drivers
Graphics 5 | | | Radeon X850 XT 256MB PCI-E with 8-08-rc2-019256e drivers
Graphics 6 | | | Radeon X850 XT Platinum Edition 256MB PCI-E with 8-08-rc2-019256e drivers
Graphics 7 | | | Radeon X800 XL 256MB PCI-E with 8-08-rc2-019256e drivers
Graphics 8 | | | Dual GeForce 6600 GT PCI-E with ForceWare 66.93 drivers
Graphics 9 | | | Sapphire Radeon X800 256MB PCI-E with 8-09-041221m-020455C drivers
OS | Microsoft Windows XP Professional (all systems)
OS updates | Service Pack 2, DirectX 9.0c (all systems)

Thanks to OCZ for providing us with memory for our testing. If you’re looking to tweak out your system to the max and maybe overclock it a little, OCZ’s RAM is definitely worth considering.

Also, all of our test systems were powered by OCZ PowerStream power supply units. The PowerStream was one of our Editor’s Choice winners in our latest PSU round-up.

The test systems’ Windows desktops were set at 1152×864 in 32-bit color at an 85Hz screen refresh rate.

We used the following versions of our test applications:

The tests and methods we employed are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


Pixel filling power
Ah, fill rate. It is the pixel-pushing power that leads us to pursue ever higher numbers of pixel pipelines and ever higher clock speeds. Fill rate can determine whether a game runs well at a given resolution with lots of antialiasing and texture filtering, and so it’s one of the primary determinants of overall graphics performance. It’s also tied closely, in real-world scenarios, to memory bandwidth. Here’s how the Radeon X800 stacks up on both fronts.

Card | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s)
GeForce 6200 | 300 | 4 | 1200 | 1 | 1200 | TBD | 128 | TBD
Radeon X300 | 325 | 4 | 1300 | 1 | 1300 | 400 | 128 | 6.4
Radeon X600 Pro | 400 | 4 | 1600 | 1 | 1600 | 600 | 128 | 9.6
GeForce FX 5700 Ultra | 475 | 4 | 1900 | 1 | 1900 | 900 | 128 | 14.4
Radeon 9600 XT | 500 | 4 | 2000 | 1 | 2000 | 600 | 128 | 9.6
Radeon X600 XT | 500 | 4 | 2000 | 1 | 2000 | 740 | 128 | 11.8
GeForce 6600 | 300 | 8* | 1200 | 1 | 2400 | TBD | 128 | TBD
Radeon 9800 Pro | 380 | 8 | 3040 | 1 | 3040 | 680 | 256 | 21.8
Radeon 9800 Pro 256MB | 380 | 8 | 3040 | 1 | 3040 | 700 | 256 | 22.4
GeForce FX 5900 XT | 400 | 4 | 1600 | 2 | 3200 | 700 | 256 | 22.4
Radeon X700 | 400 | 8 | 3200 | 1 | 3200 | 600 | 128 | 9.6
Radeon 9800 XT | 412 | 8 | 3296 | 1 | 3296 | 730 | 256 | 23.4
Radeon X700 Pro | 420 | 8 | 3360 | 1 | 3360 | 864 | 128 | 13.8
Radeon X700 XT | 475 | 8 | 3800 | 1 | 3800 | 1050 | 128 | 16.8
GeForce 6800 | 325 | 12 | 3900 | 1 | 3900 | 700 | 256 | 22.4
GeForce 6600 GT AGP | 500 | 8* | 2000 | 1 | 4000 | 900 | 128 | 14.4
GeForce 6600 GT | 500 | 8* | 2000 | 1 | 4000 | 1000 | 128 | 16.0
Sapphire Radeon X800 | 392 | 12 | 4704 | 1 | 4704 | 700 | 256 | 22.4
Radeon X800 | 400 | 12 | 4800 | 1 | 4800 | 700 | 256 | 22.4
GeForce 6800 GT | 350 | 16 | 5600 | 1 | 5600 | 1000 | 256 | 32.0
Radeon X800 Pro | 475 | 12 | 5700 | 1 | 5700 | 900 | 256 | 28.8
Radeon X800 XL | 400 | 16 | 6400 | 1 | 6400 | 980 | 256 | 31.4
GeForce 6800 Ultra | 425 | 16 | 6800 | 1 | 6800 | 1100 | 256 | 35.2
Radeon X800 XT | 500 | 16 | 8000 | 1 | 8000 | 1000 | 256 | 32.0
Radeon X800 XT Platinum Edition | 520 | 16 | 8320 | 1 | 8320 | 1120 | 256 | 35.8
Radeon X850 XT | 520 | 16 | 8320 | 1 | 8320 | 1120 | 256 | 35.8
Radeon X850 XT Platinum Edition | 540 | 16 | 8640 | 1 | 8640 | 1180 | 256 | 37.8

* GeForce 6600-series GPUs have eight pixel pipes but only four ROPs, so they can write only four pixels per clock when applying a single texture. The pixel pipes and ROPs are connected via a fragment crossbar, so they are used very efficiently.
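The peak numbers in the table are straightforward products of clocks and widths. A quick sketch of the arithmetic (Python; the function names are mine, not anything from ATI or NVIDIA):

```python
def peak_fill_rate_mpixels(core_mhz, pipelines):
    """Theoretical pixel fill rate: one pixel per pipeline per clock."""
    return core_mhz * pipelines

def memory_bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    """Peak bandwidth: bytes per transfer (bus width / 8) times effective clock."""
    return effective_mem_mhz * (bus_width_bits // 8) / 1000

# Sapphire Radeon X800: 392MHz core, 12 pipes, 700MHz GDDR3 on a 256-bit bus
print(peak_fill_rate_mpixels(392, 12))   # 4704 Mpixels/s
print(memory_bandwidth_gbs(700, 256))    # 22.4 GB/s
```

Note that the memory clocks in the table are effective (DDR) rates, and the ROP caveat above means the pixel fill column isn’t always clock times pipes for the GeForce 6600 series.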

Sapphire’s X800 is priced directly between the GeForce 6600 GT and the GeForce 6800. Specs-wise, though, it looks an awful lot like a GeForce 6800, with 4 more pixel pipes and twice the memory bus width of the GeForce 6600 GT. Those attributes give the Radeon X800 more theoretical peak fill rate and memory bandwidth than the 6600 GT. Let’s put that theory to the test with some synthetic fill rate benchmarks.

With multiple textures being applied per pixel, the X800 reaches very near to its theoretical peak fill rate. When doing single texturing, interestingly enough, the X800’s vast theoretical advantage over the GeForce 6600 GT fails to materialize. The 6600 GT’s unique fragment crossbar config, which is tied to only four raster output units, certainly doesn’t seem to harm it in practice. Still, the X800’s 12 pipes give it the edge overall.

That sets the expectations for the X800. Let’s see if it can live up to them.


Doom 3 – Delta Labs
We’ll kick off our gaming benchmarks with Doom 3. Our first Doom 3 test uses a gameplay demo we recorded inside the Delta Labs complex, and it represents the sorts of graphics loads you’ll find in most of the game’s single-player levels. We’ve tested with Doom 3’s High Quality mode, which turns on 8X anisotropic filtering by default.

NVIDIA GPUs have long excelled in Doom 3, and so it is here. The Radeon X800, even with its theoretical fill rate advantage, only manages to catch the GeForce 6600 GT at 1600×1200 with 4X antialiasing. Still, the Radeon X800 hits reasonably playable frame rates at almost every resolution.


Doom 3 – Heat Haze
This next demo was recorded in order to test a specific effect in Doom 3: that cool-looking “heat haze” effect that you see whenever a demon hurls a fireball at you. We figured this effect would be fairly shader intensive, so we wanted to test it separately from the rest of the game.

The X800 largely runs neck and neck with the 6600 GT here, at least once antialiasing is enabled. The 12-pipe GeForce 6800 generally retains a slight edge over the X800.


Half-Life 2 – Route Kanal
Our first Half-Life 2 demo is a longish section of the Route Kanal sequence early in the game. It combines a number of effects, including reflective water and the flashlight, with a lot of running around in simple, dark corridors and a few outside areas.

Half-Life 2 is friendlier ground for the X800, and it begins to make good on its theoretical promise by outrunning the GeForce 6600 GT and the 6800—something the stillborn Radeon X700 XT couldn’t do.


Half-Life 2 – Airboat Battle
In this demo, Gordon is doing battle with a helicopter while running around in an airboat. There’s lots of water here, plus plenty of pyrotechnics.

The field is a little tighter, but the X800 remains on top in the airboat battle, too.


Far Cry – Pier
The Pier level in Far Cry is an outdoor area with dense vegetation, and it makes good use of geometry instancing to populate the jungle with foliage.

Sapphire’s new baby stays on top in Far Cry, especially at 1600×1200 with 4X AA, where its 256MB of RAM probably helps it pad its lead over the competing GeForce cards.


Far Cry – Volcano
Like our Doom 3 “heat haze” demo, the Volcano level in Far Cry includes lots of pixel shader warping and shimmering.

The 6600 GT gives the X800 a run for its money in this more shader-laden test, especially at lower resolutions.


3DMark05 – Game tests
3DMark05 is intended to show us how a system would handle future games, with more demanding graphics loads than even the latest current games.

3DMark shows a consistent, solid edge for the Radeon X800 over both the GeForce 6600 GT and the GeForce 6800.


3DMark05 – Synthetic feature tests

This is interesting. The GeForce 6 series GPUs’ newer pixel shader architecture manages to run 3DMark05’s pixel shader test faster than the X800. However, the tables turn in the vertex shader tests, where the ATI cards, including the X800, have the upper hand.

Power consumption
With each of the graphics cards installed and running, I used a watt meter to measure the power draw of our test systems. The monitor was plugged into a separate power source. The cards were tested at idle in the Windows desktop and under load while running our Doom 3 “heat haze” demo at 1280×1024 with 4X AA.

Note that our power consumption numbers aren’t entirely comparable from card to card because we’re testing entire systems, and those are based on three different motherboards. Two of those motherboards are nForce4 boards, but the third is an AGP system. Also, notice that I’ve limited our testing to actual products that we have on hand, to the exclusion of underclocked “simulated” cards. Some folks have pointed out that different models of graphics cards may vary with respect to the amount of voltage going to the graphics chip or memory.

Oddly enough, Sapphire’s X800 card eats up a little more power than its 16-pipe cousin, the Radeon X800 XL, running in the same system. I suspect the difference might be due to the Sapphire’s blower, which seems to spin at some pretty high RPMs.

I was able to crank up the Sapphire card’s clock speeds to 425MHz for the core and 375MHz for the memory (or 750MHz, once you take DDR’s double data rate into account). That was good enough for a few extra frames per second in Doom 3:

Don’t expect miracles from overclocking the Radeon X800, folks. It’s not that far from its peak clock speed already, which may explain why Sapphire went conservative and chose a 392MHz core clock for its cards.
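Sketching the headroom from that overclock (Python; illustrative only, and the 350MHz stock memory figure is simply the marketed 700MHz DDR rate halved):

```python
# Overclocking headroom on the Sapphire X800 (stock -> overclocked)
stock_core, oc_core = 392, 425        # MHz core clocks
stock_mem, oc_mem = 350, 375          # actual DDR clock; 700/750MHz effective
core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
```

Single-digit percentage gains on both clocks, which is consistent with the modest frame rate bump we saw.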

Conclusions
Sapphire, with a little help from ATI, has produced a graphics card in the Radeon X800 256MB that is generally a better performer than the GeForce 6600 GT 128MB, its most direct competition. The X800 is also generally faster than the GeForce 6800. That much we know.

We also know that Sapphire’s version of the X800 comes with a very decent bundle of software, cables, dongles, and such. Sapphire’s custom cooler isn’t bad, either, although the card we tested wasn’t exactly a heroic overclocker. For the stated $249 list, it isn’t a bad deal given its relative performance in the constellation of graphics cards from $199 and up.

The tricky issue with this card has to do with its price. Sapphire says its suggested retail price is $249, and the company says the card’s street price might even dip to 10% below that. If it does, this card could be a steal, easily worth the premium over ATI’s projected $199 price for 128MB versions of the Radeon X800. (A card with this much rendering power really ought to have 256MB of memory.) However, I see only two listings for this card at online vendors right now, and both are charging about $330 for the OEM version—not the retail one we’ve reviewed with the bundled games and cables. The question is: will the street price really reach $249 or less? Will this new generation of ATI graphics chips be widely available enough for supply and demand to meet at the appropriate point?

I don’t think we know the answers to those questions just yet, but I can’t blame Sapphire for that. I’ll take them at their word that they believe the price will be $249 or less.

ATI, on the other hand, I’m not so sure about. They said the Radeon X700 XT would arrive at $199, and it never did. They’ve also had serious problems keeping the market supplied with chips for some time now, and the new Radeon X800 XL is still selling for $100 over its supposed list price of $299—including Sapphire’s version. I don’t know what kind of deep-seated psychological problems would lead a company clawing its way back from serious supply problems to understate the price of its products by a third, but that appears to be what’s happened. The Radeon X800 could meet a similar fate.

Another possibility is that ATI could begin cranking out R430 chips like the dickens, and all R430-based cards could land near list price. If that happens, we’ll be asking a different question about this Sapphire X800 256MB card: why not pay a little extra and pick up a Radeon X800 XL for $299? Look back over our benchmarks again, and you’ll see that the XL is firmly a step above the Radeon X800. Hmm.

But you probably didn’t want a tortured treatise on product positioning and supply and demand, did you? Here’s the bottom line: if Sapphire delivers this card to the world for $249 or less, it will be a better buy than the GeForce 6600 GT, to which we recently gave a Best of 2004 award. That’s saying something. We should know soon enough how the pricing shakes out. If all goes as planned, the Radeon X800 will be the mid-range card of choice. 

Comments closed
    • CampinCarl
    • 14 years ago

    I would like to say that I just saw the Sapphire Radeon x800 for $199.00 at Newegg (It also has a 30 dollar MIR that brings the effective cost down to $169.00)

    • indeego
    • 15 years ago

    Month later and the card is still above $249.

    • WaltC
    • 15 years ago

    I began reading this thinking I was going to get some in-depth analysis of this particular Sapphire card, but instead I found the info on the poor little Sapphire card buried in giant bar charts featuring fourteen (14) other 3d-card configurations…some costing as much as ~4x the $249 price tag of the Sapphire x800 (even excluding the premium for dual-slot PCIe16 boards over single-slot versions, differences in PSU’s required, etc.) Are we going back to “bait ‘n switch,” again? I’m jaded, but I thought we’d pretty much outgrown that sort of thing in recent years. Ah, well…;)

    Accordingly, the main thing I got from this particular Sapphire x800 review, unfortunately, is that generally speaking it takes two ~$500 nVidia cards running in tandem to beat one ~$500 ATi card–well, that happens some of the time, anyway…;)

    At least the Sapphire’s bar in the bar charts was yellow instead of the orange used for the fourteen (14) other configurations thrown into the Sapphire review (in surprise “cameo appearances,” I guess?) I think that narrowing down the list somewhat–say, restricting the cards contrasted to $100 plus or minus the Sapphire’s $249 MSRP–would have been much more informative. But of course this assumes the actual intent of the Sapphire review was to feature the Sapphire card as opposed to something else, doesn’t it?…:D

    • Flying Fox
    • 15 years ago

    Ok, am I seeing things here? Some 6600GT SLI actually beats 6800 Ultra SLI?

    • NewfieBullet
    • 15 years ago

    One thing that has always bugged me about video card reviews is the inclusion of 3DMark. Other than for bragging rights, what is the point of 3DMark? If I want to know how a particular game runs on a particular card, there’s no better benchmark than the game itself. I know they claim that this will tell us how a card will run future games, but has anyone actually compared how well a card runs today’s games vs. yesterday’s 3DMark to see how accurate that claim is?

      • Dissonance
      • 15 years ago

      Relative performance in 3DMark03 actually tracked pretty well with the first wave of DX9 games. Also, it’s nice to have some raw fill rate and shader performance numbers to help explain why X card may behave in a certain way versus Y card in a given game.

      Using only 3DMark to evaluate a graphics card is never a good idea. But it’s a nice companion to real world gaming tests if you want to dig a little deeper into why performance in the real world pans out the way it does.

        • blitzy
        • 15 years ago

        yep, and we don’t need any more HardOCPs doing their best to get rid of the few benchmarking tools we have =/

        • NewfieBullet
        • 15 years ago

        I’ll buy that. I guess it makes some sense as an analysis tool. That doesn’t seem to be the way it is presented though and summing up performance in one 3dmark number hides any useful data for analysis.

      • WaltC
      • 15 years ago

      Unless it also bugs you to read hard drive reviews which incorporate HD Tach results, it shouldn’t bug you to read 3d-card reviews which incorporate 3d-Mark results. Worthwhile hardware reviews always include a mix of results from both synthetic benches and commercially available programs, imo. A 3d-card review which leaves out a good synthetic benchmark is no better in my view than a 3d-card review that publishes only 3d-Mark results and nothing else.

      BTW, 3d-Mark doesn’t pretend to “demonstrate tomorrow’s 3d game performance” to my knowledge. What it does is to stress your 3d-card’s performance *today* to the maximum extent possible and provide you with ideas about the performance and IQ of your 3d-card as it operates “today.” Whether or not an actual 3d-game comes along which stresses your 3d-card in exactly the same ways as 3d-Mark seems irrelevant to me. The point is that 3d-Mark shows you what you’re getting for your money *today* in case a game requiring the same kind of gpu horsepower 3d Mark tests for happens to wind up being something you’d like to buy.

      It’s true that 3d-Mark won’t tell you how your system will handle Doom3. It’s equally true, though, that running Doom3 won’t tell you anything about how your system will run HL2, and vice-versa, for the same reasons exactly that 3d-Mark won’t tell you how your system will run Doom3…;) There’s a difference in purpose and intent behind synthetic benchmarks and 3d games, and it’s surprising how many folks fail to see the differences. That’s why quality hardware reviews include performance and IQ results from both synthetic benchmarks and shipping games, imo. It’s a practice commonly called “covering all the bases.” Leave off one or the other and you simply aren’t doing that.

        • NewfieBullet
        • 15 years ago


    • Xenolith
    • 15 years ago

    How do you run a Half-life 2 demo? I downloaded the boat7 demo and don’t even know where to place it. Thanks for the help.

    • YeuEmMaiMai
    • 15 years ago

    Since I do not want to give up my ECS L7S7A and Mobile Barton 2500+ running at 2.8GHz, I WANT AN AGP VERSION!!!!! DANG IT ATI, GIVE ME SOMETHING GOOD IN AGP!!!!!!!

    • Dposcorp
    • 15 years ago

    Another fine review Scott, but I do have some comments and questions.

    First, since you are testing a midrange graphics card, what is the reason you didn’t test it on some slower midrange CPUs?

    I would think that people looking to buy a card at this price point probably don’t have a $750 4000+ CPU.

    On your two Nvidia platforms, what is the difference between the Asus A8N-SLI board and the NVIDIA reference board? They look like the same specs and drivers to me, or did I miss something?

    EDIT: I see now you used different cards on each board, but then couldn’t that affect the results? Since you are testing video cards, shouldn’t the CPU, RAM, and motherboard all stay the same?

    I am a huge fan of ATI, but at this point in time, I keep recommending the 6800NU or the 6800GT, which are showing up used on various forums and eBay for good prices. I got my 6800GT for right around $300, and the 6800NUs seem to unlock quite often to give you the full 16 pipes.

      • highlandr
      • 15 years ago

      I can tell you what will happen with a slower CPU: the differences will be less pronounced, except for a few instances where the CPU will be the bottleneck (lower resolutions, mostly)

      I know people like to see how a more balanced system will run, but the review is supposed to show you how the cards differ. To do that, you have to eliminate any other possible bottlenecks. That means putting in a screamer of a CPU.

    • PerfectCr
    • 15 years ago

    I recently got a X700 PRO for $160 at Newegg, a retail “Lite” box from Sapphire. Sure, you can spend an extra $50 here and there to get more, but you have to draw the line somewhere. For that price I am very happy with the performance.

    It’s close to the X700 XT (which never materialized) in terms of performance. I’d call it “mid-range,” but these days it looks like the mid-range has ranges within itself!

    I guess the X700 PRO is lower-mid, and the X800 is mid-mid?

    Why can’t we go back to the days where we had ONE low end, ONE mid range, and ONE high end card to choose from. Is there really an advantage to having all of these different models other than benchmark pissing contests?

      • DreadCthulhu
      • 15 years ago

      Having a myriad of different cards allows ATI & Nvidia to use chips that have some defects in them (say a 16 pipe chip has one bad pipeline, you can still use it in a 12 pipe chip) and sell them, instead of throwing them away.

    • AGerbilWithAFootInTheGrav
    • 15 years ago

    Scott – you have actually coloured one graph wrongly; the X800 Pro is yellow as opposed to the X800… I just noticed it while reading through…

    will go back to report which one…

    found it —

    Doom III High quality Heat Haze 1280*1024…

      • Damage
      • 15 years ago

      Picky, picky. Slave driver!

      Fixed. :)

    • kvndoom
    • 15 years ago

    There are two issues concerning price I want to point out.

    One is that I don’t think cards are going to level out to MSRP because of saturation. There are simply TOO MANY MODELS to choose from at certain price points. The $200 – $250 range is so cluttered it’s ridiculous.

    Also, I don’t think either Nvidia or ATI can be blamed for the current $100+ price inflation on PCI Expre$$ cards. I think it’s the retailers who are doing this because the demand is so viciously outweighing the supply. Something tells me that dealer cost is where it’s supposed to be, but (r)etailers are thinking, “If these fools will pay $100 extra, why not charge them for it?” I think there’s an OPEC mentality in the business right now. Gas doesn’t have to be $2 a gallon, but it is. OPEC sets oil prices mostly on how they feel any given day. And even when prices hit a ridiculous high over the summer, people kept on filling up their 12MPG SUV’s. So it is with the video cards.

      • danny e.
      • 15 years ago

      i tend to agree.
      if you look at NewEgg prices on even 939 Athlons over the past few months, you’ll note that NewEgg is actually charging about $50 MORE now for a 3500+ than they were several months ago.
      I am fairly confident that AMD has not RAISED their price.. so what it comes down to is NewEgg has a limited supply and just keeps raising their prices as long as people keep buying.

      same thing happening in the video card sector.

      • Yahoolian
      • 15 years ago

      So you believe in Communism?

      Those darn businesses, how dare they try to maximize profits!!

        • Technophobe
        • 15 years ago

        Wait, I’m confused. What does not wanting to be price gouged have to do with communism?

          • JustAnEngineer
          • 15 years ago

          Shop around. That’s how your side of the free market economy is supposed to work.

          I actually ordered parts from MonarchComputer, ZipZoomFly and CoolerGuys this week because they were less expensive from those fine vendors than from our beloved NewEgg.

            • Cuhulin
            • 15 years ago

            Shopping around is PART of how a free economy works.

            The other part of a free economy is providing the social structure (contracts, courts, police, and the like) in which it works. For that, we are entitled to expect vendors to act in accord with our ethics. They keep their ownership, their management, and so on (i.e., no communism); we get a minimal level of ethics.

            The problem with the Yahoolian type of comment is that it ignores the society’s part of the social contract that is needed for free enterprise to work.

    • vortigern_red
    • 15 years ago


      • Chrispy_
      • 15 years ago

      Agreed, I second this question.

      This shady area of the two companies’ rivalry is out of the spotlight now that the performance gap between them is closer, but we KNOW that neither company can be trusted to be honest in their drivers.

      In terms of real world differences, it really isn’t very important, because the image quality loss from turning the optimisations on is negligible compared to the FPS increase, but it would be nice to check that these results are still like-for-like.

      • PRIME1
      • 15 years ago

      SM 3.0 should have been used in the Far Cry tests. Even though ATI cards don’t support it, the NVIDIA cards would have seen a boost and it would have been a better comparison of what each card has to offer.

        • Damage
        • 15 years ago

        We tested with Far Cry 1.3, which makes use of SM2.0b/3.0 by default. We also, as it says in the testing methods section, enabled geometry instancing in the ATI drivers.

      • Damage
      • 15 years ago

      As it says in the testing methods, we largely used the driver defaults. That means Catalyst AI was enabled.

        • vortigern_red
        • 15 years ago

        Thanks, I did read that page about 3 times but missed it; of course, not owning a current NV card, I don’t actually know what the defaults are. But I presume it’s safe to assume all optimisations are on :) (which is fair enough in the vast majority of cases).

        Given that Cat AI was on then the Doom3 results are a little disappointing.

    • Klyith
    • 15 years ago

    On page 8 (Far Cry – Pier), the GeForce 6800 is left out of the final graph, the 1600×1200 + AA + AF one. Was this an oversight, or did you not have the 6800 data for that instance?

      • Damage
      • 15 years ago

      Our 6800 didn’t complete several of the tests at 16×12 with 4X AA. Some kind of driver or card problem combining an NV40 with only 128MB of RAM, I guess.

    • spworley
    • 15 years ago

    What makes this card so power hungry? Over 50 watts more than an X800 Pro or a 6800. I don’t think it can be the big fan, though I guess the power (= heat) is why there’s a big fan in the first place.

      • Damage
      • 15 years ago

      Careful there. That’s system power consumption, and the cards you mentioned are AGP cards on a different motherboard.

    • blitzy
    • 15 years ago

    Phew, it was alright but not quite enough to give me buyer’s remorse. If they don’t come below the $250 mark, I think they’d be too expensive for my taste, especially once I factor in the extra cost of buying here in NZ. I don’t think I could have waited till they actually hit the streets here either.

    I just got my MSI 6600GT today. Can’t say how it runs since I am still waiting for other hardware, but it sure had a huge bundle: 15 CDs all up, including Prince of Persia: SOT, XIII, URU: Ages Beyond Myst, a 14-in-1 demo CD, InterVideo WinDVD, and some other things.

    That said, the X800 would have to be the best mid-range card if you can find one for the right price; otherwise the X800 XL might steal its thunder.

      • PRIME1
      • 15 years ago

      I think the MSI 6600GT is the best mid-range card on the market right now.

      Not only does it have a great software bundle, VIVO, SM 3.0, HDL, HDTV-out cables, and a copper heatsink that covers the memory as well as the GPU, but at $216 (what I paid) it’s a good deal.

      My MSI card is due to arrive today. 🙂

        • blitzy
        • 15 years ago

        I paid roughly $248 for mine; a 6800GT or other high-end card would’ve cost $570 or more (especially for a non-crappy brand) — prices in USD.

        Not a great price, but not bad considering I’m in NZ.

        • thebluesgnr
        • 15 years ago

        I’ve read some reviewers saying it’s not a copper heatsink. I guess they’re right; MSI would probably say something on their website if it was.

    • Convert
    • 15 years ago

    Can I get another Holy Poop? Competition is grand, isn’t it? It might have taken ATI quite a few months to get a card out, but better late than never.

    They are really missing out on a lot of people who are still using AGP though.

    The only problem with this card is its price. I mean, it’s definitely worth the extra $50, but for another $50 I’d rather have an X800 XL.

    Of course, I mean all of this when the cards actually hit their MSRP…

      • Tuanies
      • 15 years ago

      It’s a slippery slope; there’s always something better for another $50… and another… and another…

        • Convert
        • 15 years ago

        Indeed it is. Going to be hard turning down an XL for another $50 though.

        Provided they sell for $300 at some point. Seeing this card, though, I have to wonder if ATI will stick with their original $350 price point.

        • Autonomous Gerbil
        • 15 years ago

        After 2 ATI cards in a row (Voodoo1, Voodoo2, GFMX200, 8500, 9600), I switched back and bought a 6600GT for $180 about a month ago. After seeing these tests and knowing how much I can OC the 6600GT, I see it’s at least the equal of the ATI card that costs $70 more (if it ever gets that low) — and at this price, a 75% markup almost puts it in a different category altogether. ATI has had the upper hand for a couple of years, but the pendulum has swung back to Nvidia. A year ago I never would have seen myself buying Nvidia or getting this kind of a deal for <$200. I love having two viable 3D card manufacturers!
