But what if you need a mid-range PCI Express graphics card today?
You have a couple of options: ATI’s Radeon X600 XT and NVIDIA’s GeForce PCX 5900, derived from these companies’ respective AGP offerings, the Radeon 9600 XT and GeForce FX 5900. Both of these PCI-E cards are available for around $200, and we’ve rounded up a trio of cards from Abit, Albatron, and Gigabyte to determine which is worthy of your new motherboard’s PCI Express x16 graphics slot.
Comparing the cards
Before I delve into the details of each card individually, let’s take a moment to compare some important metrics.
| Card | GPU | Core clock (MHz) | Memory clock (MHz) | Memory type | Memory size (MB) | Video outputs | Warranty period | Street price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Abit RX600XT-PCIE | ATI RV380 | 513 | 742 | DDR | 128 | DVI, VGA, S-Video | 15 months* | $181 |
| Albatron Trinity PCX5900 | NVIDIA NV35 | 350 | 550 | DDR | 128 | DVI, VGA, S-Video | 3 years labor, 1 year parts | $215 |
| Gigabyte GV-RX60X128V | ATI RV380 | 500 | 742 | DDR | 128 | DVI, VGA, S-Video | 2 years | $190 |
The RX600XT-PCIE and GV-RX60X128V are both based on ATI’s RV380 GPU, while Albatron’s Trinity PCX5900 uses NVIDIA’s NV35 graphics chip. Unlike RV380, which is a native PCI Express chip, NV35 was built with AGP in mind. NV35 must be paired with NVIDIA’s High Speed Interconnect (HSI) bridge chip in order to interface with a PCI Express x16 graphics slot, and there’s been much discussion of whether that’s an adequate implementation. At least as far as today’s applications are concerned, it’s unlikely that ATI’s native PCI Express implementation will offer tangible performance benefits over NVIDIA’s HSI bridge chip.
PCI Express implementations aside, the RX600XT-PCIE, GV-RX60X128V, and Trinity PCX5900 also differ in their pixel pipeline configurations, memory bus widths, and clock speeds. Here’s how fill rates and memory bandwidth pan out in our trusty chip chart:
| Card | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Abit RX600XT-PCIE | 513 | 4 | 2052 | 1 | 2052 | 742 | 128 | 11.9 |
| Albatron Trinity PCX5900 | 350 | 4 | 1400 | 2 | 2800 | 550 | 256 | 17.6 |
| Gigabyte GV-RX60X128V | 500 | 4 | 2000 | 1 | 2000 | 742 | 128 | 11.9 |
Although the Radeon X600 XT cards’ higher core clock speeds yield better single-texturing fill rates than the Trinity PCX5900, NV35’s 4×2-pipe configuration offers a higher peak multi-texturing fill rate. The Radeon X600 XT-based cards also have higher memory clock speeds, but they’re only running on a 128-bit memory bus. The Trinity PCX5900 enjoys a wider 256-bit path to memory, allowing it significantly more memory bandwidth with lower clock speeds.
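The fill rate and bandwidth figures in the chip chart fall out of simple arithmetic. Here's a quick sketch of the math (the helper function names are ours, for illustration only):

```python
# Peak fill rate and memory bandwidth arithmetic behind the chip chart.
# RV380 (Radeon X600 XT) is a 4x1 design; NV35 (GeForce PCX 5900) is 4x2.

def peak_fill_rates(core_mhz, pipes, tmus_per_pipe):
    """Return (Mpixels/s, Mtexels/s) peak theoretical fill rates."""
    pixel_rate = core_mhz * pipes
    texel_rate = pixel_rate * tmus_per_pipe
    return pixel_rate, texel_rate

def memory_bandwidth_gbps(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s (bus width in bits, 8 bits per byte)."""
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Albatron Trinity PCX5900: 350MHz core, 4x2 pipes, 550MHz memory, 256-bit bus
print(peak_fill_rates(350, 4, 2))       # (1400, 2800)
print(memory_bandwidth_gbps(550, 256))  # 17.6
```

The same functions reproduce the Radeon X600 XT numbers: a 513MHz core with four single-TMU pipes yields 2052 Mpixels/s, and a 742MHz effective memory clock on a 128-bit bus works out to about 11.9GB/s.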
Also note that the RX600XT-PCIE has a slightly higher core clock speed than the GV-RX60X128V. More on that in a moment.
A few extra MHz never hurt anyone

Back in the day, Abit video cards sported graphics chips from NVIDIA. Abit hopped the fence last year, though, and they’ve been churning out Radeon-based graphics cards ever since. The company currently has a packed lineup of no fewer than six Radeon graphics cards for PCI Express, from the high-end Radeon X800 XT Platinum Edition down to affordable Radeon X300-based cards. Today we’ll be looking at a sample from the middle of Abit’s PCI Express lineup, the Radeon X600 XT-based RX600XT-PCIE.
Like Abit’s recent motherboards, this graphics card is dressed in a distinctive shade of not quite red, but not really orange. The card’s color nicely matches Abit’s PCI Express-equipped motherboards, and so far, the unique shade hasn’t been copied.
Speaking of copying, the Abit X600 XT isn’t a standard cookie-cutter reference card. Some of its surface-mounted components differ from what you’ll find on both Gigabyte’s Radeon X600 XT and ATI’s own Radeon X600 reference card.
Oh, and Abit’s cooler definitely isn’t stock, either.
Funky, but maybe not functional. For starters, the cooler’s exquisitely-detailed fan guard is an air flow nightmare. The shroud will no doubt prevent probing fingers from making contact with the fan, but how often are you actually poking around the graphics card fan while the system is running? Not often, if ever. Fortunately, the shroud can be removed with a screwdriver, improving air flow and potentially lowering noise levels by eliminating turbulence.
My other issue with this cooler is its integrated memory heat sinks, which seem to be more about cosmetic appeal than functionality. I don’t dispute that heat sinks can help keep memory chips cool, but the card has memory chips on both sides. The cooler’s memory heat sinks aren’t doing anything to help cool the bare memory chips mounted on the bottom of the card.
The Abit X600 XT is populated with eight Hynix HY5DU283222AF-25 memory chips for a total of 128MB. The chips are rated for operation at speeds of 400MHz (an effective 800MHz, taking DDR’s clock-doubling into account), but they’re only running at an effective 742MHz on the card. That’s a sneaky 2MHz above the Radeon X600 XT’s stock 740MHz memory speed, though hardly enough to matter. The Gigabyte Radeon X600 XT also has an effective 742MHz memory clock.
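DDR’s clock-doubling is easy to sanity-check in a couple of lines; here’s a quick sketch of the headroom math (the `ddr_effective_mhz` helper is just for illustration):

```python
# DDR transfers data on both clock edges, so effective speed = 2 x physical clock.
def ddr_effective_mhz(physical_mhz):
    return physical_mhz * 2

rated = ddr_effective_mhz(400)   # chips rated for an effective 800MHz
actual = 742                     # effective clock the card actually runs
print(rated - actual)            # 58MHz of rated headroom left on the table
```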
This Abit card has a real clock speed trick up its sleeve, though. While stock Radeon X600 cards run with a 500MHz core clock speed, the RX600XT-PCIE’s core clock is 513MHz. This minor clock speed boost probably won’t make much of a difference in the real world, but it’s worth noting. Abit’s warranty covers the card at this speed, and as you’ll see in a moment, we were able to get it running even faster in our overclocking tests.
While I’d normally whine about the lack of dual DVI connectors, the Abit X600 XT’s sub-$200 price tag makes it easy to excuse the card’s standard array of video output ports. It would be nice to see dual DVI pop up on mid-range and low-end cards, but at this price, I don’t feel ripped off by having only single VGA, DVI, and S-Video outputs.
Abit augments the RX600XT-PCIE’s output capabilities with a cable bundle that includes a DVI-to-VGA adapter, an S-Video-to-composite video adapter, and composite and S-Video cables. A copy of PowerDVD 5 is also included in the box, but no game bundle. At the very least, Abit isn’t trying to dress up the card with a dated game bundle or filler discs packed with publicly-available game demos.
Abit’s warranty is perhaps the most complicated of the lot. The card is covered for three years, but parts and labor are only free for the first 15 months. For the final 21 months of the warranty period, Abit charges a flat $25 fee for labor. 15 months of full coverage isn’t great, especially compared with other cards in this comparison. However, Abit’s eRMA service is excellent.
Less leather, more blue

Albatron’s Trinity PCX5900 sits atop the company’s PCI Express graphics lineup, but it’s really only a mid-range card. The PCX5900 is based on the same NV35 GPU that powers the GeForce FX 5900 series, but with a core clock of only 350MHz, the chip is running much slower than it does on even the GeForce FX 5900 XT, which runs at 400MHz. Lower clock speeds, combined with a price tag a little over $200, make the Trinity PCX5900 a decidedly middle-of-the-road part.
At first glance, the Trinity PCX5900 looks like a monster. A big, bright, blue monster.
The card is nearly nine inches long, a full two and a half inches longer than the Radeon X600 XT cards we’re looking at today. This beast might be a tight fit for small form factor systems like Shuttle’s XPC SB81P.
Once you get over the card’s size, you have to come to terms with its color. Everything is blue, and not an understated blue that’s easy to ignore. If you happen to like blue, this card might just be perfect for you. Otherwise, you might want to reconsider that case window.
The Trinity PCX5900 is equipped with cooling implements to cover not only the card’s graphics chip, but also all its memory chips and NVIDIA HSI bridge. Half of the card’s memory chips are actively-cooled by the GPU cooler, while the others share a passive heat sink with the bridge chip.
Albatron populates the Trinity PCX5900 with eight Hynix HY5DU283222AF-28 memory chips rated for operation at speeds up to 350MHz, or an effective 700MHz with DDR clock-doubling in action. The card’s memory chips are only clocked to an effective 550MHz, though, possibly leaving some headroom for overclocking. As I mentioned earlier, the PCX5900 uses NVIDIA’s HSI bridge chip to interface the NV35 GPU with a PCI Express x16 bus. The bridge chip die is tiny, but it comes on a comparatively large package, as you can see in the picture on the right above.
The Trinity has all the ports you’d expect from a mid-range graphics card: VGA, DVI, and S-Video outputs. The port arrangement is a little odd, but doesn’t seem to create any problems.
Albatron ships the Trinity PCX5900 with an S-Video-to-composite video adapter and a composite video cable, but curiously, no S-Video cable or DVI-to-VGA adapter. Given the Trinity’s multimonitor potential, which leans heavily on NVIDIA’s excellent nView software, it’s disappointing that Albatron didn’t at least include a DVI-to-VGA adapter.
In the software department, the Trinity PCX5900 comes with a full copy of WinDVD Creator, Duke Nukem: Manhattan Project, and a cheesy game demo CD. This is the kind of game bundle that makes me cringe. Not only is Manhattan Project ancient, but so are all the games on the demo CD. The GeForce PCX 5900 might not be cutting-edge graphics technology, but it’s certainly capable of playing recent games. Unfortunately, the game bundle doesn’t reflect that.
Albatron covers the Trinity PCX5900 with a three-year labor and one-year parts warranty. That’s not a great deal, but not a horrible one, either.
Make mine VIVO

Gigabyte is one of only a handful of manufacturers to offer PCI Express graphics cards based on both ATI and NVIDIA designs. The company has a full line of GeForce PCX cards in addition to a wide selection of Radeon PCI-E products, all of which slide into a PCI-E x16 graphics slot. Today we’re looking at the GV-RX60X128V, a mid-range card based on the Radeon X600 XT. The GV-RX60X128V carries on a proud Gigabyte tradition of horribly cryptic product names, but the card packs more than enough appeal to transcend its awkward name.
An awkward name isn’t the only Gigabyte tradition the GV-RX60X128V carries on. Like most Gigabyte graphics cards and motherboards, the GV-RX60X128V comes in a lush shade of turquoise blue.
Gigabyte accents the blue board with a muted gold heat sink that’s just a buffing short of bling.
There’s nothing particularly wild or interesting about the cooler, which resembles the Blue Orbs of old. Gigabyte seems content to leave the card’s memory chips free of heat sink-assisted cooling, too.
Speaking of memory, the Gigabyte card uses 128MB of the same Hynix memory chips as the Abit card. As on the Abit, the memory chips are clocked at an effective 742MHz. However, the Gigabyte’s core clock speed is 500MHz, standard for the Radeon X600 XT. The Gigabyte card is packing something the Abit card can’t compete with, though: video in and out (VIVO) capabilities. Thanks to an ATI Rage Theater chip, the Gigabyte X600 XT can capture video streams from S-Video or composite video sources. That’s not quite as slick as an All-in-Wonder card with a TV tuner, but for basic video capture, it gets the job done.
Despite its VIVO capabilities, this card’s port cluster is as standard as they come.
The card’s video input and output ports are elegantly handled by the single splitter cable Gigabyte provides. Gigabyte also includes a DVI-to-VGA adapter, but sadly, neither composite nor S-Video cables.
Gigabyte saves face in a big way by packing its X600 XT with worthwhile software. In addition to copies of PowerDVD 5 and PowerDirector 3, the software bundle includes full versions of Rainbow Six 3: Raven Shield, Counter-Strike: Condition Zero, and Spell Force. Although I’ve never heard of Spell Force, the other two games are definitely big-name titles. According to current prices at EBgames.com, these three games retail for a combined total of $104.97, bringing tangible value to this game bundle.
In addition to a strong game bundle, the Gigabyte X600 XT also has a decent warranty. The card is covered for two years for both parts and labor, which I prefer to the Abit and Albatron three-year warranties, whose coverage degrades after 15 months or one year, respectively.
All tests were run three times, and the results were averaged. We used the following test systems.
| Component | Configuration |
| --- | --- |
| Processor | Pentium 4 2.8E GHz |
| Front-side bus | 800MHz (200MHz quad-pumped) |
| North bridge | Intel 915P MCH |
| South bridge | Intel ICH6R |
| Chipset driver | Intel 184.108.40.2062 |
| Memory size | 1GB (2 DIMMs) |
| Memory type | OCZ PC4400 DDR SDRAM at 400MHz and 2.5-4-4-8 timings |
| Graphics cards | Abit RX600XT-PCIE, Gigabyte GV-RX60X128V, Albatron Trinity PCX5900 |
| Graphics drivers | Catalyst 4.9 (Radeon cards); ForceWare 61.77 (Trinity PCX5900) |
| Hard drive | Western Digital WD360GD 10,000RPM Serial ATA |
| Operating system | Windows XP Professional with Service Pack 2 and DirectX 9.0c |
Since we’re already familiar with the performance of the GeForce FX and Radeon 9600 XT series, I’ve only included a couple of 3D performance tests. You can get a more complete picture of PCI Express graphics performance from our recent GeForce 6600 GT review.
We used the following versions of our test applications:
The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests. All of the 3D gaming tests used the highest possible detail image quality settings except where otherwise noted.
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
DOOM 3 was tested with the game’s High Quality mode, which enables 8X anisotropic filtering.
Although the Trinity PCX5900 has a definite lead in trdemo2, which takes place in one of the game’s early levels, the other cards pull even in our “heat haze” demo, which was recorded in one of the game’s later Hell levels. Since heat haze is a shader-driven effect, I suspect that we can attribute the close scores to the GeForce FX’s comparatively weak shader power.
Our Far Cry testing used a beta version of the game’s 1.2 patch and the game’s High Detail image quality setting.
The X600 XT cards win big in Far Cry, and the PCX5900 doesn’t have Shader Model 3.0 support to help even the score. DOOM 3 may have been close, but this one’s a blowout for the Radeons, especially in the Research demo.
As expected, the Abit X600 XT’s 13MHz core clock speed advantage over the Gigabyte doesn’t translate to much of a performance advantage.
Armed with CoolBits and PowerStrip, I overclocked the cards as far as they would go without sacrificing image quality or system stability. NVIDIA’s optimal clock speed detection routine did a good job of finding the Trinity PCX5900’s core and memory sweet spot, but I was forced to resort to more tedious trial and error in PowerStrip to overclock the X600 XTs.
In testing, I was able to get the following core and memory clock speeds from each card:
- Abit RX600XT-PCIE – 570/840MHz
- Albatron Trinity PCX5900 – 415/681MHz
- Gigabyte GV-RX60X128V – 590/840MHz
Not bad at all. Both Radeon X600 XT cards were stable with an effective memory clock boost of 98MHz, and the Gigabyte card was flawless with a 90MHz core overclock. With the Albatron card, I was able to squeeze an extra 65MHz from the core and a whopping 131MHz from the memory. As always with overclocking, your mileage may vary. So what did overclocking do for performance? I fired up our DOOM 3 heat haze demo to find out.
All the cards see a healthy performance boost from overclocking, but the Trinity PCX5900 comes out as the big winner, especially as we turn up the resolution. Because the card uses a 256-bit memory bus, memory bandwidth gains from each tick over stock speeds are double that of 128-bit cards. The Radeon X600 XT cards both happen to use a 128-bit memory bus, so while their 98MHz memory overclocks yield an extra 1.57GB/sec of memory bandwidth, the Trinity PCX5900’s 131MHz memory overclock gives the card a whopping 4.19GB/sec more memory bandwidth to play with.
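For the curious, those bandwidth-gain figures can be reproduced with a couple of lines; here’s a quick sketch (the function name is ours, for illustration only):

```python
# Extra memory bandwidth gained from a memory overclock scales with bus width,
# so each MHz on a 256-bit bus is worth twice as much as on a 128-bit bus.

def bandwidth_gain_gbps(overclock_mhz, bus_width_bits):
    """Added bandwidth in GB/s from an effective memory overclock."""
    return overclock_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(round(bandwidth_gain_gbps(98, 128), 2))   # 1.57 (Radeon X600 XT cards)
print(round(bandwidth_gain_gbps(131, 256), 2))  # 4.19 (Trinity PCX5900)
```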
I used our trusty watt meter to measure system power consumption, sans monitor, at the outlet. Power consumption was measured at idle and under a load generated by a DOOM 3 trhaze timedemo.
At idle, the Albatron and Abit cards consume about the same amount of power. The Abit card is more frugal under load, though. Curiously, the Gigabyte X600 XT has higher power consumption at idle and under load than the Abit, despite the fact that both cards are based on the same GPU and memory chips. I suspect the Gigabyte card’s Rage Theater chip may be at least partially responsible for the higher power consumption.
Noise levels were measured using an Extech 407727 Digital Sound Level Meter placed four inches from the graphics card. The noise level meter was out of the direct path of air flow. Idle and load environments were identical to those described in our power consumption tests.
The Trinity PCX5900’s blinding blue cooler is the quietest of the bunch, but the Gigabyte’s isn’t far behind. Abit’s RX600XT-PCIE is a full decibel louder than the Gigabyte card under load.
Video signal quality
To test video signal quality, I hooked the cards up to a Mitsubishi Diamond Pro 930SB 19″ monitor and ran the Nokia Monitor Test at 1024×768, 1280×1024, and 1600×1200 at 85Hz. While text clarity and signal quality were golden at 1024×768 and 1280×1024, 1600×1200 was a little less crisp on all the cards. The RX600XT-PCIE fared the best at 1600×1200. While it’s not quite golden, I’d classify its output as very good. Output from the Gigabyte and Albatron cards at 1600×1200 was good, too, but neither was quite as clear as the RX600XT-PCIE. I’d give the Gigabyte X600 XT a slight edge over the Trinity PCX5900, but the two were pretty close.
Before I pick a winner, let’s quickly sum up what each of the cards brings to the table.
- Abit RX600XT-PCIE – At only $181 online, the Abit X600 XT is the least expensive card of the lot. Every bit as fast as the Gigabyte, the Abit card boasts strong video signal quality at high resolutions and a great cable bundle, but the software package is pretty weak. Also, Abit’s warranty doesn’t offer the best coverage. The RX600XT-PCIE is also the noisiest card we looked at today, although only by a decibel.
- Albatron Trinity PCX5900 – If you want to crank up clock speeds, the Trinity looks to have the most potential. The card’s core and memory are running well below their capabilities, at least on our sample, and memory overclocking can yield excellent performance gains thanks to a 256-bit bus. Unfortunately, the Albatron’s game bundle is weak, and its Far Cry performance is off the back. At $215, the Trinity PCX5900 is also the most expensive of the three cards.
- Gigabyte GV-RX60X128V – VIVO support and a great game bundle give the Gigabyte X600 XT a big edge over the competition, and Gigabyte’s two-year parts and labor warranty sweetens the deal. This card is available for $190, making it quite a steal given what’s inside the box.
Picking a winner from this trio isn’t difficult. Gigabyte’s X600 XT offers easily the best game bundle, arguably the best warranty coverage, and VIVO support at a very competitive price. If you need a mid-range PCI Express graphics card today, the GV-RX60X128V should definitely be on your list. However, if you can afford to wait a month or two, it’s probably a good idea to save pennies for a GeForce 6600 GT or whatever ATI has up its sleeve.