NVIDIA’s GeForce FX 5200 GPU

WHEN NVIDIA announced its NV31 and NV34 graphics chips, I have to admit I was a skeptic. The chips, which would go on to power NVIDIA’s GeForce FX 5600 and 5200 lines, respectively, promised full DirectX 9 features and compatibility to the masses. Who could resist?

Me, at least initially. Perhaps I still had a bitter taste in my mouth after the recycled DirectX 7 debacle that was the GeForce4 MX, or maybe it was NVIDIA’s unwillingness to discuss the internal structure of its graphics chips. Maybe it was merely the fact that I didn’t believe NVIDIA could pull off a budget graphics chip with a full DirectX 9 feature set without cutting corners somewhere.

Or maybe I’m just turning into a grumpy old man.

Well, NVIDIA may have pulled it off. Now that I have Albatron’s Gigi FX5200P graphics card in hand, it’s time to take stock of what kind of sacrifices were made to squeeze the “cinematic computing” experience into just 45 million transistors. Have NVIDIA and Albatron really produced a sub-$100 graphics product capable of running the jaw-dropping Dawn demo and today’s 3D applications with reasonably good frame rates? How does the card stack up against its budget competition? Let’s find out.

The NV34 cheat sheet
NVIDIA’s big push with its GeForce FX line is top-to-bottom support for DirectX 9 features, including pixel and vertex shaders 2.0, floating point data types, and gobs of internal precision. As the graphics chip behind NVIDIA’s GeForce FX 5200 and 5200 Ultra, NV34 has full support for the same DirectX 9 features as even the high-end NV30. What’s particularly impressive about NV34 is that NVIDIA has squeezed support for all those DirectX 9 features into a die containing only 45 million transistors—nearly one third as many as NV30.

Beyond its full DirectX 9 feature support, here’s a quick rundown of NV34’s key features and capabilities. A more detailed analysis of NV34’s features can be found in my preview of NVIDIA’s NV31 and NV34 graphics chips.

  • Clearly defined pipelines — NVIDIA has been very clear about the fact that NV34 has four pixel pipelines, each of which is capable of laying down a single texture per pass. Unlike NV30, whose texture units appear dependent on the kind of rendering being done, NV34 is limited to a single texture unit per pipeline for all rendering modes.

  • Arrays of functional units — NVIDIA has been coy about what’s really going on under the hood of its GeForce FX graphics chips. Instead of telling us how many vertex or pixel shaders each chip has, NVIDIA expresses the relative power of each graphics chip in terms of the amount of “parallelism” within its programmable shader. NV30 has more parallelism than NV31, which in turn has more parallelism than NV34. How much more? Well, NVIDIA isn’t being too specific about that, either.

  • Lossless compression lost — Unlike NV30 and NV31, the NV34 graphics chip doesn’t support lossless color and Z compression, which could hamper the chip’s antialiasing performance. The absence of lossless Z compression will also limit the chip’s pixel-pushing capacity.

  • 0.15-micron core — NVIDIA’s mid-range NV31 and high-end NV30 graphics chips are manufactured on a 0.13-micron process, and both feature 400MHz RAMDACs. Since NV34 is targeted at low-end graphics cards, it’s built on a cheaper, more mature 0.15-micron process, which limits its RAMDAC speed to 350MHz. Only those running extremely high resolutions at high refresh rates should bump into that limit, as the quick calculation below illustrates.
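
Whether a 350MHz RAMDAC actually constrains you is easy to estimate. Here’s a minimal Python sketch of the math; the 1.32 blanking-overhead factor is my own approximation of typical CRT timings, not an NVIDIA figure:

```python
# Rough pixel clock needed to drive a CRT display mode. The blanking
# overhead factor (~1.32) approximates typical CRT timings and is an
# assumption, not a spec.
BLANKING_OVERHEAD = 1.32

def required_ramdac_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1024, 768, 85), (1600, 1200, 85), (2048, 1536, 85)]:
    print(f"{w}x{h}@{hz}Hz needs ~{required_ramdac_mhz(w, h, hz):.0f}MHz")

# 1024x768@85Hz  needs  ~88MHz
# 1600x1200@85Hz needs ~215MHz
# 2048x1536@85Hz needs ~353MHz  <- the only mode here that outruns 350MHz
```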


NVIDIA’s NV34 graphics chip: DirectX 9 on a budget

With chip specifics out of the way, let’s take a peek at Albatron’s Gigi FX5200P.

Albatron’s Gigi FX5200P
Because the GeForce FX 5200 is a high-volume, low-cost product, it’s likely that cards from different manufacturers will be very similar to each other. Manufacturers will probably stick to NVIDIA’s reference design and differentiate their products through cosmetics and bundles more than anything else.

Today, we’re looking at Albatron’s Gigi FX5200P, which follows NVIDIA’s GeForce FX 5200 reference design. Check it out:

The Gigi FX5200P’s blue board should nicely match Albatron’s most recent motherboards, which sport the same color scheme.

Because the GeForce FX 5200 supports 64 and 128MB memory configurations, manufacturers have a little room here for potential differentiation. Albatron has taken the high road with its Gigi FX5200P and endowed the card with 128MB of memory. Bucking a recent trend, the Gigi FX5200P’s memory chips are all mounted on one side of the board.

The budget-minded Gigi FX5200P uses TSOP memory chips from Samsung that are rated for operation at speeds as high as 500MHz. Since the GeForce FX 5200 spec calls for a memory clock speed of only 400MHz, users could potentially overclock the Gigi FX5200P’s memory bus by 100MHz without exceeding the capabilities of the memory chips.
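
For the curious, the bandwidth math behind that potential overclock is simple enough to sketch out. Clock speeds here are DDR-effective rates, as quoted throughout this review:

```python
# Peak memory bandwidth = DDR-effective memory clock x bus width in bytes.
def peak_bandwidth_gbps(effective_mhz, bus_width_bits):
    return effective_mhz * (bus_width_bits / 8) / 1000

print(peak_bandwidth_gbps(400, 128))  # stock spec:    6.4 GB/s
print(peak_bandwidth_gbps(500, 128))  # chips' rating: 8.0 GB/s, a 25% bump
```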

Although the Gigi FX5200P’s NV34 graphics chip is built on a 0.15-micron core, the chip is only running at 250MHz. At that speed, a passive heat sink is all that’s needed to keep the chip cool, which is quite a contrast to the GeForce FX 5800 Ultra’s 70dB Dustbuster.

Like every other graphics card in its class, the Gigi FX5200P features a standard array of video output ports. NVIDIA has hinted that we may see some manufacturers offering versions of the GeForce FX 5200 with dual DVI output ports, but those cards may only surface in pre-built systems targeted at business users.

Bundle-wise, the Gigi FX5200P doesn’t have much to offer, but that’s to be expected. At this price point, there’s little room in the budget for extras. Albatron does manage to squeeze an S-Video-to-composite video adapter and a video cable into the Gigi FX5200P’s box, along with a copy of WinDVD and the requisite driver CD.

All right, that’s enough gawking at pictures for now. It’s time to check out some benchmarks.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

System
Processor: Athlon XP ‘Thoroughbred’ 2600+ (2.083GHz)
Front-side bus: 333MHz (166MHz DDR)
Motherboard: Asus A7N8X
Chipset: NVIDIA nForce2 (nForce2 SPP north bridge, nForce2 MCP south bridge)
Chipset drivers: NVIDIA 2.03
Memory: 512MB (2 DIMMs) Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Sound: nForce2 APU
Graphics cards: GeForce4 Ti 4200 8X 128MB, GeForce4 MX 460 64MB, GeForce FX 5200 128MB, Radeon 9000 Pro 64MB, Radeon 9600 Pro 128MB
Graphics drivers: NVIDIA Detonator 43.45, ATI CATALYST 3.2
Storage: Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS: Microsoft Windows XP Professional
OS updates: Service Pack 1, DirectX 9.0

Today, we’re testing the GeForce FX 5200 against a wide range of graphics cards from ATI and NVIDIA. NVIDIA’s own GeForce4 MX 460 and ATI’s Radeon 9000 Pro are the GeForce FX 5200’s closest competitors, price-wise, though results for the GeForce4 Ti 4200 8X and Radeon 9600 Pro have been included to frame the results in a wider perspective.

In order to keep a level playing field, image quality-wise, I tested the NVIDIA cards with the “Application” image quality setting. The Radeon cards were tested using ATI’s “Quality” image quality option, which produces visuals roughly equivalent to NVIDIA’s “Application” setting.

A number of tests were run with 4X antialiasing and 8X anisotropic filtering enabled to illustrate the GeForce FX 5200’s performance with its image quality options pushed to their limits. Since the GeForce4 MX 460 is incapable of 8X anisotropic filtering, it was eliminated from those tests. Also, since my Radeon 9000 Pro features only 64MB of memory, it’s incapable of 4X antialiasing at resolutions above 1024×768.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Fill rate
Theoretical fill rate and memory bandwidth peaks don’t necessarily dictate real world performance, but they’re a good place to start. Here’s how the GeForce FX 5200 stacks up against the competition when it comes to theoretical peaks:

                       Core clock  Pixel  Peak fill rate  Texture units  Peak fill rate  Memory clock  Bus width  Peak bandwidth
                       (MHz)       pipes  (Mpixels/s)     per pipe       (Mtexels/s)     (MHz)         (bits)     (GB/s)
GeForce FX 5200        250         4      1000            1              1000            400           128        6.4
GeForce4 Ti 4200 8X    250         4      1000            2              2000            512           128        8.2
GeForce4 MX 460        300         2      600             2              1200            550           128        8.8
Radeon 9000 Pro        275         4      1100            1              1100            550           128        8.8
Radeon 9600 Pro        400         4      1600            1              1600            600           128        9.6
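
The derived columns in that chart follow directly from the clock speeds and bus widths. Here’s a quick sketch of the arithmetic, with the specs copied from the table above:

```python
# (core MHz, pixel pipes, texture units per pipe, memory MHz, bus bits)
cards = {
    "GeForce FX 5200":     (250, 4, 1, 400, 128),
    "GeForce4 Ti 4200 8X": (250, 4, 2, 512, 128),
    "GeForce4 MX 460":     (300, 2, 2, 550, 128),
    "Radeon 9000 Pro":     (275, 4, 1, 550, 128),
    "Radeon 9600 Pro":     (400, 4, 1, 600, 128),
}

for name, (core, pipes, tmus, mem, bus) in cards.items():
    pixel_rate = core * pipes            # Mpixels/s
    texel_rate = pixel_rate * tmus       # Mtexels/s
    bandwidth = mem * (bus / 8) / 1000   # GB/s (DDR-effective clock)
    print(f"{name:20} {pixel_rate:5} Mpix/s {texel_rate:5} Mtex/s {bandwidth:.1f} GB/s")
```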

As you might expect from the cheapest graphics card of the lot, the GeForce FX 5200 has the humblest specifications. The card’s fill rates match up nicely with ATI’s Radeon 9000 Pro, but with a memory clock speed of only 400MHz, the GeForce FX 5200’s peak memory bandwidth looks like a potential bottleneck.

Of course, the above chart only reflects theoretical peaks; real-world performance can be a different beast altogether. How does the GeForce FX 5200 stack up in some synthetic fill rate tests?

The GeForce FX 5200’s single-texturing performance looks good, putting the card neck-and-neck with the Radeon 9000 Pro. However, when we look at multi-texturing performance, the GeForce FX 5200 falls off the pace, despite the fact that it and the Radeon 9000 Pro are both 4×1-pipe architectures with similar clock speeds. Could the GeForce FX 5200’s limited memory bandwidth be rearing its ugly head?
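
Quite possibly. To see why 6.4GB/s could pinch, consider a rough, cache-free estimate of what sustaining the chip’s peak texel rate would demand. The byte counts below are simplifying assumptions of mine (32-bit texels and color writes, Z traffic lumped in at four bytes); real chips cache and reuse texels aggressively, so treat this as a worst-case sketch:

```python
# Worst-case bandwidth demand at the FX 5200's peak texel rate, ignoring
# texture caching and any compression. Byte counts are assumptions.
peak_texels_per_sec = 1000e6   # 1000 Mtexels/s from the table above
bytes_per_texel = 4            # 32-bit textures
bytes_color_write = 4          # 32-bit framebuffer write
bytes_z_traffic = 4            # Z reads/writes, roughly

demand = peak_texels_per_sec * (bytes_per_texel + bytes_color_write + bytes_z_traffic)
print(f"~{demand / 1e9:.0f} GB/s demanded vs. 6.4 GB/s available")  # ~12 GB/s
```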

To put real world fill rate performance in perspective, let’s revisit those theoretical peaks.

With single texturing, only the GeForce4 MX 460 comes close to realizing all of its theoretical peak fill rate. In multitexturing, the GeForce FX 5200 stands out like a sore thumb. The FX 5200 is the only card that doesn’t come close to realizing all of its peak theoretical multitextured fill rate, suggesting that a memory bottleneck may be lurking behind the scenes.

Occlusion detection
The NV34’s lack of lossless color and Z-compression puts the GeForce FX 5200 at a distinct disadvantage in our occlusion detection test. The GeForce FX 5200 also lacks the “Early Z” occlusion detection algorithms that have all but eliminated overdraw in mid-range and high-end graphics chips.
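
For reference, “Early Z” simply means running the depth test before pixel shading rather than after it, so occluded pixels are discarded before any expensive per-pixel work happens. A toy sketch of the difference, with shade() as a hypothetical stand-in for that work:

```python
def shade(x, y):
    # Hypothetical stand-in for arbitrarily expensive per-pixel shading.
    return (x * 31 + y * 17) & 0xFF

def draw_pixel_late_z(x, y, z, zbuffer, framebuffer):
    color = shade(x, y)               # shading happens unconditionally...
    if z < zbuffer[y][x]:             # ...and depth only gates the write
        zbuffer[y][x] = z
        framebuffer[y][x] = color

def draw_pixel_early_z(x, y, z, zbuffer, framebuffer):
    if z >= zbuffer[y][x]:            # occluded: reject before shading
        return
    zbuffer[y][x] = z
    framebuffer[y][x] = shade(x, y)   # shade only pixels that survive
```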

With antialiasing and anisotropic filtering disabled, the GeForce FX 5200’s performance in VillageMark closely matches the GeForce4 MX 460 and trails the Radeon 9000 Pro. When we enable 4X antialiasing and 8X anisotropic filtering, the GeForce FX 5200’s performance is actually closer to that of the GeForce4 Ti 4200 8X, which puts it ahead of the Radeon 9000 Pro.

Pixel shaders
The GeForce FX 5200 is replacing a pixel shader-less GeForce4 MX line at the low end of NVIDIA’s graphics lineup, so it doesn’t have big shoes to fill. The GeForce FX 5200’s pixel shaders support the pixel shader 2.0 standard, and they can even offer more color precision and longer instruction lengths than the shaders on ATI’s Radeon 9600 Pro.

Despite its support for more advanced pixel shader programs than any of the other graphics cards we’re looking at today, the GeForce FX 5200’s pixel shader performance in 3DMark2001 SE scrapes the bottom of the barrel. The GeForce FX 5200’s scores are especially disappointing in the advanced pixel shader test, where the card is well behind the Radeon 9000 Pro.

In NVIDIA’s own ChameleonMark pixel shader benchmark, the GeForce FX 5200 is at the back of the pack, well behind the Radeon 9000 Pro. The GeForce FX 5200’s poor pixel shader performance thus far underlines the fact that capability and competence are two very different things.

Of the cards we’re looking at today, only the Radeon 9600 Pro and GeForce FX 5200 are capable of running 3DMark03’s pixel shader 2.0 test. Of course, the Radeon 9600 Pro costs more than twice as much as the GeForce FX 5200, and the cards’ relative performance reflects that price disparity. The fact that the GeForce FX 5200 is slow here no matter what resolution is used suggests that simple pixel fill rate isn’t factoring into the equation much; the GeForce FX 5200’s pixel shaders are just slow. At the very least, they could definitely use a little more of that parallelism NVIDIA keeps talking about.

Vertex shaders
The reduced parallelism in GeForce FX 5200’s programmable shaders no doubt hurts its performance in synthetic pixel shader tests, but are the card’s vertex shaders equally hindered?

Yes. In 3DMark2001SE, even the GeForce4 MX 460, which uses DirectX 8’s software vertex shader, turns in a better performance than the GeForce FX 5200. The GeForce4 MX 460 can’t complete 3DMark03’s vertex shader test, but the GeForce FX 5200 is still underpowered and outclassed compared to the Radeon 9000 Pro.

In 3DMark2001 SE’s transform and lighting tests, the GeForce FX 5200 is again at the back of the pack. The card is more competitive in the less complex single light scene, but with eight lights, it doesn’t come close to keeping up with the Radeon 9000 Pro or the GeForce4 MX 460.

Games
Synthetic feature tests are fun and all, but real-world performance is what’s really important. How does the GeForce FX 5200 fare?

Quake III Arena

With antialiasing and anisotropic filtering disabled in Quake III Arena, the GeForce FX 5200 runs with the Radeon 9000 Pro and is slower than the GeForce4 MX 460. With 4X antialiasing and 8X anisotropic filtering, the GeForce FX 5200 stays ahead of the Radeon 9000 Pro and even starts narrowing the performance gap with NVIDIA’s GeForce4 Ti 4200 8X.

Jedi Knight II

The GeForce FX 5200 fares much better in Jedi Knight II, especially with antialiasing and anisotropic filtering enabled. Managing nearly 60 frames per second at 1024×768 with 4X antialiasing and 8X anisotropic filtering is an impressive feat for a graphics card this cheap.

Comanche 4

In Comanche 4, the GeForce FX 5200 is sent to the back of the pack again. Without antialiasing and anisotropic filtering enabled, the GeForce FX 5200 is behind even the GeForce4 MX 460. With antialiasing and anisotropic filtering turned on, the GeForce FX 5200’s performance improves relative to its competition, but overall frame rates are really too low for even casual gaming.

Codecreatures Benchmark Pro

In Codecreatures, the GeForce FX 5200 is left looking a little underpowered again. The Radeon 9000 Pro’s behavior in this test is a little erratic, especially with antialiasing and anisotropic filtering enabled, so its results have been omitted.

Unreal Tournament 2003

The GeForce FX 5200’s performance in Unreal Tournament 2003 isn’t exactly awe-inspiring. The card is beaten handily by the GeForce4 MX 460 with low-quality detail settings, and the FX 5200 just trails the MX 460 with Unreal Tournament 2003’s high-detail settings. In all cases, the GeForce FX 5200 is well behind the Radeon 9000 Pro.

With 4X antialiasing and 8X anisotropic filtering enabled, the GeForce FX 5200 looks like a much better competitor for the Radeon 9000 Pro. However, the frame rates produced by both cards with antialiasing and anisotropic filtering enabled are really only high enough for gaming at low resolutions.

Problems in Unreal Tournament 2003
The GeForce FX 5200’s performance in Unreal Tournament 2003 isn’t particularly impressive, but there are problems beyond simple performance. The “Application” image quality setting in NVIDIA’s drivers forces trilinear filtering in 3D applications. It also creates all kinds of rendering problems for the GeForce FX 5200 in Unreal Tournament 2003. Check it out:


Unreal Tournament 2003 with “Quality” image quality settings


Unreal Tournament 2003 with “Application” image quality settings
The way it’s meant to be played?

If the “Application” image quality setting is changed to “Quality” or “Performance,” the scenes are rendered correctly. However, with the “Application” image quality option selected, the rendering problems persist with even the lowest in-game image quality settings and the latest Unreal Tournament 2003 patch. This same problem also occurs with the GeForce4 MX 460, but not with any other card I’ve tested.

So, while Unreal Tournament 2003 is certainly playable on the GeForce FX 5200, trilinear filtering with the “Application” image quality setting is apparently too much to ask.

Serious Sam SE
We used Serious Sam SE’s “Extreme Quality” add-on to ensure a mostly level playing field between the different graphics cards. In these tests, the Radeons are actually doing 16X anisotropic filtering, the GeForce4 Ti 4200 8X and GeForce FX 5200 are doing 8X anisotropic filtering, and the GeForce4 MX 460 is doing only 2X anisotropic filtering.

The GeForce FX 5200 is at the bottom of the pile in Serious Sam SE in terms of average frame rates. What happens when we look at frame rates over the course of the entire benchmark demo?

The GeForce FX 5200 is consistently the slowest card throughout the demo, but the fact that the card stutters at the start of the demo at higher resolutions sets off a few alarms. What happens when we crank up the antialiasing and anisotropic filtering options?

With 4X antialiasing and 8X anisotropic filtering, the GeForce FX 5200 bottoms out again. How do things look over the course of the benchmark demo?

At the start of the demo, it’s not pretty. The GeForce FX 5200 jumps all over the place in the first moments of the demo, a behavior that gets worse as we crank up the resolution. After its initial fit, the card settles down and produces low but relatively consistent frame rates. These big performance dips point to a severe playability problem with Serious Sam SE on the GeForce FX 5200; perhaps a future driver update will fix them.

3DMark2001 SE

Overall, the GeForce FX 5200 matches up with the GeForce4 MX 460 in 3DMark2001 SE. How do 3DMark2001 SE’s individual game test scores shape up?

The GeForce FX 5200 ties the GeForce4 MX 460 at the back of the pack in one of 3DMark2001 SE’s game tests, and sits behind the GeForce4 MX 460 in another. In the “Nature” game test, which requires at least DirectX 8-class hardware, the GeForce FX 5200 is barely half as fast as the Radeon 9000 Pro.

3DMark03
NVIDIA doesn’t think there’s really any point to using 3DMark03 to evaluate graphics cards, but we beg to differ. While it would be inappropriate for a graphics review to lean heavily on the results of any one graphics benchmark, when used in conjunction with as wide a variety of tests as we’ve assembled today, 3DMark03’s game tests do have value.

Only able to complete one of 3DMark03’s game tests, the GeForce4 MX 460 takes a bullet for the GeForce FX 5200 and ends up at the back of the pack. Let’s look at the individual game tests.

In 3DMark03’s first three tests, the GeForce FX 5200 can’t keep up with the Radeon 9000 Pro.

Only the Radeon 9600 Pro and the GeForce FX 5200 are capable of running the gorgeous “Mother Nature” game test, which requires DirectX 9-class hardware. Since the Radeon 9600 Pro costs more than twice as much as the GeForce FX 5200, it’s no surprise that its performance is much more impressive here.

3DMark03 image quality
The GeForce FX 5200 is capable of running 3DMark03’s DirectX 9-class “Mother Nature” game test, but that doesn’t mean the card’s output looks right. Check out the following pictures taken from frame 1799 of the Mother Nature game test. The first picture is from the reference software DirectX 9 renderer, the second is from ATI’s Radeon 9600 Pro, and the third is from the GeForce FX 5200.


DirectX 9’s software renderer


ATI’s Radeon 9600 Pro


NVIDIA’s GeForce FX 5200

You’ll want to click on the pictures and look at the high-quality PNG images to see the difference more clearly, but the GeForce FX 5200’s sky definitely looks “off,” even when compared with the Radeon 9600 Pro. Is NVIDIA falling back to lower color precision in order to improve performance? Probably, and given the GeForce FX 5200’s performance, it may well need to. Just because the GeForce FX 5200 supports DirectX 9 applications doesn’t mean it will look as good as other DirectX 9-capable cards.
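
As a rough illustration of why precision matters for a smooth gradient like that sky, the sketch below quantizes a gradient at a few bit depths and counts the distinct shades that survive. The bit depths are illustrative only; we don’t know exactly what internal format NVIDIA’s drivers are falling back to:

```python
# Quantize a smooth gradient at a given bit depth to show how reduced
# precision collapses shades into visible bands. Bit depths here are
# illustrative, not NVIDIA's actual internal formats.

def quantize(value, bits):
    levels = (1 << bits) - 1
    return round(value * levels) / levels

gradient = [i / 1023 for i in range(1024)]  # a smooth 0..1 sky-like ramp
for bits in (12, 8, 5):
    shades = len({quantize(v, bits) for v in gradient})
    print(f"{bits} bits -> {shades} distinct shades")  # 1024, 256, 32
```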

SPECviewperf
So far, our performance testing has focused on gaming and synthetic feature tests. How does the GeForce FX 5200 perform in workstation-class applications?

Pretty well, at least when compared with the Radeon 9000 Pro. For a budget workstation, the GeForce FX 5200 appears to be a solid pick.

Antialiasing
Next, we’ll isolate the GeForce FX 5200’s performance in different antialiasing and anisotropic filtering modes. We’ve already had a glimpse of the card’s performance with 4X antialiasing and 8X anisotropic filtering in our game tests, but this section will provide a more thorough look at the card’s performance with these image quality features.

Edge antialiasing

The GeForce FX 5200’s antialiasing performance isn’t particularly impressive, but the card fares relatively well against the Radeon 9000 Pro. The fact that the GeForce FX 5200 is comparatively slow at 640×480 is a little disappointing, though.

Antialiasing quality: Radeon 9600 Pro
For some odd reason, the Radeon 9000 Pro refuses to change antialiasing levels in 3DMark03, so we’re using screenshots taken with a Radeon 9600 Pro as a comparative reference for antialiasing quality.


Radeon 9600 Pro: No antialiasing


Radeon 9600 Pro: 2x antialiasing


Radeon 9600 Pro: 4x antialiasing


Radeon 9600 Pro: 6x antialiasing

Antialiasing quality: GeForce FX 5200


GeForce FX 5200: No antialiasing


GeForce FX 5200: 2x antialiasing


GeForce FX 5200: 4x antialiasing

The GeForce FX doesn’t support 6X antialiasing like the Radeon 9600 Pro. With both cards at 4X AA, the FX 5200 doesn’t smooth out jagged edges as well as the ATI card.
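
Sample count is what drives edge quality here: with n coverage samples per pixel, an edge crossing a pixel can only produce n+1 distinct blend levels, so 6X antialiasing has finer gradations to work with than 4X. A minimal sketch using ordered sample grids (an assumption for simplicity; real GPUs use rotated or sparse patterns that fare even better on near-horizontal and near-vertical edges):

```python
# Estimate per-pixel coverage of the half-plane y < 0.1*x + 0.3 (a shallow
# polygon edge) along one row of pixels, using an N x N ordered sample grid.
# More samples per pixel allow more distinct blend levels along the edge.
def coverage(px, grid):
    inside = 0
    for i in range(grid):
        for j in range(grid):
            x = px + (i + 0.5) / grid   # sample positions at cell centers
            y = (j + 0.5) / grid
            if y < 0.1 * x + 0.3:
                inside += 1
    return inside / (grid * grid)

for grid in (1, 2, 4):                  # 1, 4, and 16 samples per pixel
    levels = sorted({coverage(px, grid) for px in range(10)})
    print(f"{grid * grid:2} samples -> {len(levels)} blend levels: {levels}")
```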

Texture antialiasing
To measure texture antialiasing, I used Serious Sam SE with various texture filtering settings.

The GeForce FX 5200’s anisotropic filtering performance is well behind its competition; even the GeForce4 MX 460 manages better scores with its limited anisotropic filtering capabilities. The GeForce FX 5200 does have one advantage over the Radeon 9000 Pro, though: it can do trilinear and anisotropic filtering at the same time.

We only tested the GeForce FX 5200 with its “Application” image quality setting, but here’s how the card looks with its other image quality options:


GeForce FX 5200: Standard trilinear + bilinear filtering


GeForce FX 5200: “Performance” 8X anisotropic filtering


GeForce FX 5200: “Quality” 8X anisotropic filtering


GeForce FX 5200: “Application” 8X anisotropic filtering

As you can see, NVIDIA’s “Application” quality option produces the sharpest textures.

Anisotropic filtering quality: Radeon 9000 Pro
For comparative purposes, here’s the Radeon 9000 Pro with standard trilinear filtering and with “Performance” 8X anisotropic filtering. Because the Radeon 9000 Pro can’t do anisotropic and trilinear filtering at the same time, this is as good as it gets.


Radeon 9000 Pro: Standard trilinear + bilinear filtering


Radeon 9000 Pro: 8X anisotropic filtering

While 8X anisotropic filtering looks good on the Radeon 9000 Pro, the GeForce FX’s “Application” image quality setting, which uses anisotropic filtering and trilinear filtering, looks much better.

Texture filtering and mip map levels: GeForce FX 5200
Now let’s see exactly what NVIDIA is doing to texture filtering with these various quality slider settings. I’ve used Q3A’s “r_colormiplevels 1” command to expose the various mip-maps in use and the transitions between them.


GeForce FX: bilinear + trilinear filtering


GeForce FX: “Performance” 8X anisotropic filtering


GeForce FX: “Quality” 8X anisotropic filtering


GeForce FX: “Application” 8X anisotropic filtering

The GeForce FX’s “Performance” and “Quality” mip map transitions aren’t as smooth as the “Application” image quality setting, whose transitions are gorgeous.
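
For context on what those transitions actually are: the hardware computes a level-of-detail value from how many texels the texture footprint spans per screen pixel, and trilinear filtering crossfades between the two nearest mip levels using the LOD’s fractional part, while bilinear snaps to a single level and produces the hard bands r_colormiplevels exposes. A simplified sketch:

```python
import math

def mip_lod(texels_per_pixel):
    # LOD rises by one each time the texture footprint doubles.
    return max(0.0, math.log2(texels_per_pixel))

def trilinear_blend(texels_per_pixel):
    """The two mip levels sampled and the blend weight between them."""
    lod = mip_lod(texels_per_pixel)
    lower = int(lod)
    frac = lod - lower      # 0.0 = all lower mip, 1.0 = all upper mip
    return lower, lower + 1, frac

# As the footprint grows with distance, bilinear snaps to round(lod) and
# bands appear; trilinear fades smoothly through fractional weights.
for footprint in (1.0, 1.4, 2.0, 2.8, 4.0):
    print(footprint, trilinear_blend(footprint))
```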

Texture filtering and mip map levels: Radeon 9000 Pro


Radeon 9000 Pro: bilinear + trilinear filtering


Radeon 9000 Pro: 8X anisotropic filtering

As evidenced by the brutal mip map transitions, trilinear filtering is sorely missing on the Radeon 9000 Pro with anisotropic filtering enabled.

Overclocking
Unfortunately, there’s not much I can tell you about my overclocking experiences with the Gigi FX5200P. I was able to get the graphics core up to a stable and artifact-free clock speed of 300MHz with little effort, but every attempt at memory overclocking failed. That’s odd, especially considering that the card’s Samsung memory chips are rated for speeds as high as 500MHz. A little extra memory bandwidth could help the Gigi FX5200P’s performance, too.

Conclusions
There are really two issues to consider as I conclude this review. First, there’s the performance and features of NVIDIA’s NV34 graphics chip and the GeForce FX 5200 cards that make use of it. Second, we have the Gigi FX5200P, Albatron’s take on the GeForce FX 5200.


This is Dawn on the GeForce FX 5200


This is the Dawn we drool over

To NVIDIA’s credit, the GeForce FX 5200 largely makes up for the travesty that was—and still is—the GeForce4 MX. With the GeForce FX 5200, NVIDIA can claim full DirectX 9 feature support across its entire graphics line. Even the cheapest GeForce FX 5200s, which retail for as little as $67 on Pricewatch, support all the DirectX 9 features that make NVIDIA’s “Dawn” demo look so good, and that’s an impressive feat. However, it’s important to make a distinction between feature capability and feature competence. As we’ve seen in our testing, the GeForce FX 5200 is considerably underpowered in situations where even less technically capable graphics cards excel. Sure, the GeForce FX 5200 supports high precision data types, pixel and vertex shaders 2.0, and a host of other advanced features, but it doesn’t seem to perform particularly well when those features are exploited. So, while the GeForce FX 5200 is technically capable of all sorts of DirectX 9-class eye candy, I have to question just how well the card will handle future DirectX 9 games and applications. After all, a slideshow filled with DirectX 9 eye candy is still a slideshow.

The GeForce FX 5200 isn’t as capable a performer as its feature list might suggest, but that doesn’t mean cards based on the chip aren’t worth picking up. At only $67 online, the GeForce FX 5200 is a few dollars cheaper than the Radeon 9000 Pro. For gamers, the Radeon 9000 Pro offers better and more consistent performance. However, for average consumers and business users, the GeForce FX 5200 offers better multimonitor software, more future-proof feature compatibility, and silent and reliable passive cooling. The GeForce FX 5200 is a great feature-rich card for anyone who isn’t looking for the best budget gaming option.

So what about Albatron’s Gigi FX5200P offering?

Unfortunately, it looks like Albatron may have tried to cater to gamers a little too much with the Gigi FX5200P. The Gigi FX5200P retails for $95 on Pricewatch, which is pricey compared to GeForce FX 5200 cards from other manufacturers. The Gigi FX5200P does feature 128MB of memory, but since I wouldn’t recommend a GeForce FX 5200-based graphics card to budget gamers, I don’t see much point in having 128MB of memory on the board. With 128MB of memory, Albatron’s Gigi FX5200P is too slow to appeal to gamers and too expensive to compete with the $67 GeForce FX 5200 64MB cards that will appeal to budget-conscious businesses and consumers.

Albatron does, however, have plans for a whole slew of budget GeForce FX 5200-based offerings, including versions of the card with 64MB of memory, 64-bit memory buses, and even a PCI variant. Those cards should be cheaper than the $95 Gigi FX5200P and more appropriate for consumers and business users.
