
Budget graphics cards compared

ATI, NVIDIA, and S3 duke it out for under $80

HIGH-END GRAPHICS CARDS tend to get a disproportionate amount of attention, and for good reason. They're often the first to showcase new features and capabilities, and they generally represent the cutting edge of graphics technology. Sure, we might not always be able to afford them (or at least justify their cost to our better halves), but sooner or later we know the technology will trickle down to lower price points. In recent years, this trickle down has happened sooner rather than later. Gone are the days when low-end parts were a generation behind their mid-range and high-end counterparts; these days, budget offerings share the same basic architecture as bleeding-edge flagships and often launch on the same day.

While ATI and NVIDIA have successfully massaged their Radeon X1000 and GeForce 7 architectures into more affordable flavors, they're not the only ones duking it out in the budget space. S3 has slowly clawed its way back into the game, and the Chrome S27 is poised to take on the GeForce 7300 GS and Radeon X1300 Pro at the low end of the market. S3 may not have a high-end line from which to borrow technology, but at least on paper, the Chrome S27 looks more potent than the budget offerings from ATI and NVIDIA.

Can S3's Chrome S27 pull off an improbable coup and snatch the budget graphics crown, or will the Radeon X1300 Pro or the GeForce 7300 GS prove superior? We've rounded up a trio of graphics cards that cost under $80 and subjected them to an array of graphics performance and video playback tests to find out.

ATI's Radeon X1300 Pro
A testament to the shrinking generation gap between high- and low-end graphics cards, ATI's budget Radeon X1300 Pro was announced on the same day as the company's then-flagship Radeon X1800 series. The entire Radeon X1000 series shares the same basic GPU architecture, but the RV515 graphics chip in the Radeon X1300 Pro does so with just 105 million transistors—one third that of the R520 GPU that powers the Radeon X1800. Fewer transistors allow ATI to squeeze the Radeon X1300's graphics chip onto a die that measures just 95 mm², according to our handy plastic ruler. Thus, the RV515 is roughly 65% smaller than the R520. Both chips are manufactured by TSMC using the same 90 nm fabrication process, though.
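As a quick sanity check on that percentage, we can back out an approximate R520 die area from the figures above. Note that the R520 area below is derived from the article's own 95 mm² measurement and 65% reduction figure, not measured directly:

```python
# Derive an approximate R520 die area from the RV515 figures.
rv515_area_mm2 = 95.0           # RV515 die area (plastic-ruler measurement)
reduction = 0.65                # RV515 is roughly 65% smaller than R520

# If RV515 = (1 - reduction) * R520, then:
r520_area_mm2 = rv515_area_mm2 / (1 - reduction)
print(round(r520_area_mm2))     # roughly 271 mm²
```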

Despite sporting one third the transistors and a much smaller die area, the Radeon X1300 retains many of the features you'll find in the high-end X1800 series, including a native PCI Express interface, full support for Shader Model 3.0, and ATI's Avivo video suite. The Radeon X1300 Pro even supports CrossFire GPU teaming, although the value proposition for budget multi-GPU graphics is rather poor—just because you can doesn't mean you should.

To stretch a common GPU architecture across high- and low-end products, ATI relies on a modular graphics pipeline that allows the company to add and subtract pixel shaders, vertex shaders, and other functional units to hit transistor and die size targets for various price points. For the Radeon X1300, ATI uses just one of the Radeon X1000 series' pixel shader quads, effectively giving the chip four Radeon X1000-class pixel shader processors with support for Shader Model 3.0 and 32 bits of precision per color channel. That nicely matches the Radeon X1300's four vertex shaders, four texture units, and four render back-ends, making the chip's graphics pipeline rather balanced, unlike some other members of the X1000 series. The Radeon X1600 and X1900 lines are more heavily biased towards pixel shader power, featuring a whopping three times more pixel shader units than texture units or render back-ends.

In addition to sporting fewer shader processors, texture units, and the like, the Radeon X1300 graphics chip is also bound by a couple of other limitations that set it apart from the rest of the X1000 series. For example, the X1300's internal scheduler is limited to 128 threads, one quarter that of the high-end Radeon X1800. The Radeon X1800 also has a more advanced ring-style memory controller with 512-bit internal and 256-bit external pathways, but that doesn't trickle all the way down to the X1300. The RV515 GPU includes a more traditional crossbar memory controller with support for one, two, or four 32-bit memory channels.

ATI and its partners currently offer the Radeon X1300 in two flavors: the vanilla Radeon X1300 and a faster X1300 Pro. Today we'll be limiting our focus to the Pro, which is available for $80 and up from numerous online retailers. Pros feature a core clock of 600 MHz, with memory clocked at 400 MHz (or an effective 800 MHz taking into account DDR's double data rate). That memory leverages all four of the X1300 graphics chip's 32-bit memory channels, yielding an effective 128-bit memory bus with a cool 12.8 GB/s of memory bandwidth.
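The bandwidth arithmetic is straightforward to verify from the numbers above: four 32-bit channels make a 128-bit bus, and DDR moves data twice per clock. A minimal sketch:

```python
# Effective memory bandwidth of the Radeon X1300 Pro's memory subsystem.
bus_width_bits = 32 * 4          # four 32-bit channels = 128-bit bus
clock_hz = 400e6                 # 400 MHz base memory clock
transfers_per_clock = 2          # DDR transfers data on both clock edges

bandwidth_bytes_per_sec = clock_hz * transfers_per_clock * bus_width_bits / 8
print(bandwidth_bytes_per_sec / 1e9)  # prints 12.8 (GB/s)
```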

Interestingly, although some members of the Radeon X1300 series leverage ATI's HyperMemory technology to carve out space in system memory for graphics data and then access it via PCI Express, the Radeon X1300 Pro does not. ATI's partners apparently weren't interested in HyperMemory for 256MB Radeon X1300 Pro cards, and for good reason. There's really no great need for a low-end card like the X1300 Pro to have access to more than 256MB of graphics memory.

Our MSI Radeon X1300 Pro relies on a generic active cooler that lacks automatic fan speed control, subjecting users to the same low-frequency whine regardless of the graphics load. A passive heatsink would be preferable here, but we suspect that the Pro's 600 MHz core clock speed requires more aggressive cooling. Only the vanilla Radeon X1300 appears to be available with passive cooling, but its core runs at between 450 and 500 MHz, depending on the card manufacturer.

Like most budget cards, the MSI Radeon X1300 Pro doesn't offer much in the way of innovative features or fancy extras. Dual DVI outputs are virtually unheard of at this price point, and although the card is advertised as HDTV-ready, it doesn't come with component video cables. MSI does throw a DVI-to-VGA adapter and S-Video cable into the box, though.