A peek at the GeForce FX 5200 and 5600

TODAY NVIDIA UNVEILS its long-awaited NV31 and NV34 graphics chips, which will bring the GeForce FX name down to price points that most of us can afford in what will hopefully be enough volume to meet demand. Though the GeForce FX 5800 Ultra has become everyone’s favorite whipping boy over the past few months, the vast majority of the market doesn’t buy flagship products like the 5800 Ultra. Instead, they wait for new technologies to trickle down to affordable prices in more mainstream products.

For more than a year, the GeForce4 MX and GeForce4 Ti 4200 chips have occupied the mainstream and performance segments of NVIDIA’s product catalog, but they’re being put out to pasture in favor of new additions to the GeForce FX line. In 2002, the GeForce4 Ti 4200 received glowing accolades from reviewers and held the PC graphics price/performance crown for the vast majority of the year; many of us will be sad to see the product taken out behind the barn. The GeForce4 Ti 4200 will be replaced by NVIDIA’s new GeForce FX 5600 Ultra, which will have a tough act to follow.

The GeForce4 MX, on the other hand, wasn’t favored by many, and its lack of hardware pixel and vertex shaders isn’t likely to be missed by anyone. The GeForce4 MX’s replacements, the NV34-powered GeForce FX 5200 and 5200 Ultra, have rather small shoes to fill, so it should be easy for them to impress.

What are the capabilities and limitations of NVIDIA’s new NV31 and NV34 graphics chips and the new GeForce FX cards they’ll be rolling out on? Do these new products share enough technology with NVIDIA’s high-end NV30-based products to be worthy of the GeForce FX name, or is NVIDIA still keeping the mainstream a generation behind? Read on to find out.

Cinematic computing across the line
In a bold move that lays waste to NVIDIA’s much-criticized “MX” philosophy of introducing new low-end graphics chips a generation behind the rest of its lineup, NVIDIA’s new NV31 and NV34 chips both support Microsoft’s latest DirectX 9 spec and even offer a little extra functionality above and beyond DirectX 9’s official requirements. Here’s a quick rundown of the features shared by NV30, NV31, and NV34.

  • Vertex shader 2.0+ – NV30’s support for vertex shader 2.0+ carries over to NV31 and NV34, with all the bells and whistles included. Vertex shader 2.0+ offers some extra functionality over vertex shader 2.0, making the former a little more flexible.
  • Pixel shader 2.0+ – NV31 and NV34 also inherit all the features and functionality of NV30’s pixel shader 2.0+ support, which allows more complex pixel shader programs than even Microsoft requires for DirectX 9. In total, NV31 and NV34 support pixel shader programs up to 1024 instructions long. Most of ATI’s R300-derived GPUs support pixel shader 2.0, whose maximum program length is only 64 instructions, though ATI’s latest Radeon 9800 and 9800 Pro use an “F-buffer” to support shader programs with a theoretically “infinite” number of instructions. At least for now, ATI’s “F-buffer” will only be available on high-end graphics cards, which means NVIDIA still has the edge on mainstream cards.

    Will NV3x’s support for more complex pixel shader programs than even DirectX 9’s requirements go unused? Maybe not. In this .plan update, id Software programmer John Carmack acknowledges that he’s already hit the R300’s limits:

    For developers doing forward looking work, there is a different tradeoff — the NV30 runs fragment programs much slower, but it has a huge maximum instruction count. I have bumped into program limits on the R300 already.

    Games that don’t venture beyond the DirectX 9 spec won’t make use of the NV3x’s support for longer pixel shader programs, but some developers will probably take advantage of support for extra-long shader programs where available.

  • Pixel shader precision – Like NV30, NV31 and NV34 support a maximum internal floating point precision of 128 bits within their pixel shaders. NV3x can also scale down its pixel shader precision to 16 bits of floating-point color per channel, or 64 bits overall, to yield better performance in situations where 128 bits of internal precision is just too slow.

    Unfortunately, it’s hard to compare the NV3x’s pixel shader precision directly with the R300’s. The R300 supports only one level of floating point pixel shader precision, which, at 96 bits, falls between NV3x’s support for 64- and 128-bit modes. Based on the results of early reviews, it looks like NV30’s performance with 128-bit pixel shader precision is a little slow, but the chip can sacrifice precision to improve performance. The quick sketch following this list shows how those per-channel figures add up.
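
For those keeping score, the overall precision figures are just per-channel precision multiplied across the four RGBA channels. Here’s a back-of-the-envelope sketch (in Python, purely for illustration) of how the 64-, 96-, and 128-bit totals quoted above break down:

    # Overall pixel shader precision is simply bits per color channel
    # multiplied across the four RGBA channels. The per-channel figures
    # below are the ones quoted in the text.
    CHANNELS = 4  # R, G, B, A

    modes = {
        "NV3x reduced precision (FP16 per channel)": 16,
        "R300 (FP24 per channel)": 24,
        "NV3x full precision (FP32 per channel)": 32,
    }

    for name, bits_per_channel in modes.items():
        print(f"{name}: {bits_per_channel * CHANNELS} bits overall")

    # NV3x reduced precision (FP16 per channel): 64 bits overall
    # R300 (FP24 per channel): 96 bits overall
    # NV3x full precision (FP32 per channel): 128 bits overall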

The above features are all key components of NVIDIA’s CineFX engine, which means that NV31 and NV34 are both prepared for the “Dawn of cinematic computing” that NVIDIA has been pushing since its NV30 launch. This catch phrase, of course, refers to really, really pretty visual effects that should be easy for developers to create using the additional flexibility offered by NV3x’s support for complex shader programs and high color modes. That’s the theory, anyway. NVIDIA will need to equip a large chunk of the market with CineFX-capable products before developers start targeting the technology, but bringing CineFX to the masses is what NV31 and NV34 are all about.

 

Differentiating features, or the lack thereof
If NV30, NV31, and NV34 share so many key features, how does NVIDIA differentiate between them? First, let’s deal with the easy stuff:

       Lossless color &    Memory           Transistors   Manufacturing   RAMDACs
       Z compression       interface        (millions)    process
NV30   Yes                 128-bit DDR-II   125           0.13-micron     400MHz
NV31   Yes                 128-bit DDR-I    80            0.13-micron     400MHz
NV34   No                  128-bit DDR-I    45            0.15-micron     350MHz

Both NV30 and NV31 use lossless color and Z-compression to improve antialiasing performance, but those features have been left off NV34 (likely to reduce the NV34’s transistor count). The lack of color compression will hinder NV34’s antialiasing performance, and the chip won’t support NVIDIA’s new Intellisample antialiasing technology. Losing Z-compression won’t help performance, either, with AA or in general use.

All of NVIDIA’s NV3x chips will have a 128-bit memory interface, but NV31 and NV34 will use DDR-I memory chips. NVIDIA wouldn’t reveal how fast the memory on its various NV31 and NV34 flavors will run, but at the very least we know the cards will have less memory bandwidth than the vanilla GeForce FX 5800. Currently, the fastest DDR-I-equipped consumer graphics cards use DDR-I memory at 650MHz, which offers just over 10GB/s of memory bandwidth on a 128-bit bus; to equal the GeForce FX 5800’s 12.8GB/s of memory bandwidth, NVIDIA would have to use DDR-I memory chips clocked at 800MHz, which is very unlikely.
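
Those bandwidth numbers fall straight out of the arithmetic: bus width in bytes times the effective (DDR) data rate. A quick sketch using the figures quoted above:

    # Peak memory bandwidth = bus width in bytes x effective DDR data rate.
    def bandwidth_gb_per_s(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

    print(bandwidth_gb_per_s(128, 650))  # 10.4 GB/s: today's fastest DDR-I cards
    print(bandwidth_gb_per_s(128, 800))  # 12.8 GB/s: what it takes to match the 5800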

All of NVIDIA’s NV3x chips will be manufactured by TSMC. Although NV31 will use the same 0.13-micron manufacturing process as NV30, NV34 will use the older, more established 0.15-micron manufacturing process. NVIDIA wouldn’t reveal NV31 or NV34’s final clock speeds. Those speeds have been decided, but they won’t be released until actual reviews hit the web. It doesn’t take much faith to believe that the 0.13-micron NV31 will run at higher clock speeds than the 0.15-micron NV34. Because NVIDIA is guarding the clock speeds of its new chips so closely, it’s almost impossible to speculate on each chip’s performance potential. One wonders why NVIDIA is being so secretive.

There are, however, no secrets when it comes to NV31 and NV34’s integrated RAMDACs. NV34 integrates two 350MHz RAMDACs, while NV31 uses 400MHz RAMDACs. Honestly, NV34’s 350MHz RAMDACs shouldn’t hold many users back. The GeForce4 MX’s 350MHz RAMDACs support 32-bit color in resolutions of 2048×1536 at 60Hz, 1920×1440 at 75Hz, and 1920×1200 at 85Hz. I can think of precious few instances where a relatively low-end NV34-based graphics card would be paired with an ultra high-end monitor capable of resolutions and refresh rates higher than that.
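
As a rough sanity check on those display modes, a mode’s required pixel clock is its visible pixel rate plus blanking overhead, which runs somewhere around 40-45% above the visible rate at these resolutions with standard GTF timings. The sketch below uses a 1.45 fudge factor, so treat the results as approximations rather than exact timings:

    # Approximate pixel clock = width x height x refresh, inflated by a
    # blanking-overhead factor (roughly 40-45% for GTF timings at these modes).
    BLANKING_FACTOR = 1.45

    for width, height, refresh in [(2048, 1536, 60), (1920, 1440, 75), (1920, 1200, 85)]:
        clock_mhz = width * height * refresh * BLANKING_FACTOR / 1e6
        verdict = "fits" if clock_mhz < 350 else "exceeds"
        print(f"{width}x{height}@{refresh}Hz: ~{clock_mhz:.0f}MHz ({verdict} a 350MHz RAMDAC)")

All three modes land comfortably under 350MHz, which squares with the modes NVIDIA lists for the GeForce4 MX’s identical RAMDACs.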

Now that we’ve gone over the easy stuff, it’s probably a good idea to pause and take a deep breath. Things are about to get messy.

Deciphering the pipeline mess
Lately, a bit of a fuss has been made over the internal structure of NV30’s pixel pipelines and how many pixels the chip is capable of laying down in a single clock cycle. NV30’s internal layout is unconventional enough to confuse our trusty graphics chip chart, which only works with more traditional (or at least more clearly defined) graphics chip architectures.

What do we know about NV30 for sure? That it can render four pixels per clock for color+Z rendering, and eight pixels per clock for Z-rendering and stencil, texture, and shader operations. Only newer titles that use features like multi-texturing and shader programs will be able to unlock NV30’s ability to render eight pixels per clock cycle. In fact, even in id’s new Doom game, NV30 will only be rendering eight pixels per clock “most” of the time. That “most” is straight from NVIDIA, too.
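
To put those per-clock figures in perspective, peak throughput is simply the per-clock rate multiplied by the core clock. A minimal sketch using the GeForce FX 5800 Ultra’s announced 500MHz core as a reference point (NV31 and NV34 clocks haven’t been announced, so they’re left out):

    # Peak rate = per-clock rate x core clock. 500MHz is the GeForce FX 5800
    # Ultra's announced core clock; NV31/NV34 clocks are still under wraps.
    NV30_ULTRA_CORE_MHZ = 500

    color_z_pixels_per_clock = 4  # color+Z rendering
    z_stencil_ops_per_clock = 8   # Z/stencil, texture, and shader operations

    print(f"Color+Z: {color_z_pixels_per_clock * NV30_ULTRA_CORE_MHZ / 1000:.1f} Gpixels/s")
    print(f"Z/stencil: {z_stencil_ops_per_clock * NV30_ULTRA_CORE_MHZ / 1000:.1f} Gops/s")
    # Color+Z: 2.0 Gpixels/s
    # Z/stencil: 4.0 Gops/s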


NV31’s mystery-shrouded internals

If that explains NV30, what about NV31 and NV34? According to NVIDIA, both NV31 and NV34 have four pixel pipelines, each of which has a single texture unit. A 4×1-pipe design makes the chips similar to ATI’s Radeon 9500, but comparing NV31 and NV34 with NV30 is more complicated. You didn’t think you were going to get off easy this time, did you?

Because NVIDIA has explicitly stated that NV31 and NV34 are 4×1-pipe designs, it’s probably safe to assume that neither chip can lay down more than four textures in a single clock cycle, and it doesn’t look like there are any situations where NV31 or NV34 can lay down more than four pixels per clock cycle, either.

According to NVIDIA, NV31 will be roughly half as fast as NV30 in situations where NV30 can lay down eight pixels per clock (Z-rendering and stencil, texture, and shader operations). Part of that speed decrease will come from the lack of a second texture unit per pixel pipeline, but NV31 will also be slower because it has “less parallelism” in its programmable shader than NV30. NVIDIA isn’t saying NV31 has half as many shaders as NV30 or that its shader is running at half the speed of NV30’s, just that the shader has “less parallelism.” If NV31’s performance is tied to the amount of parallelism within its shader, a betting man might wager that NV31 achieves “roughly half” the speed of NV30 when dealing with shader operations because NV31’s programmable shader has roughly half the parallelism of NV30’s.

Like NV31, NV34 has half as many texture units per pixel pipeline as NV30, and its programmable shader has “roughly half” as much parallelism. NV31 and NV34 have more in common with each other than they do with NV30, but at least partially because of its lack of color and Z compression, NV34 won’t be quite as fast as NV31. According to NVIDIA, NV34’s performance is very similar to NV31’s in situations where NV30 is capable of rendering four pixels per clock and about 10% slower than NV31 in situations where NV30 would be capable of rendering eight pixels per clock. Those comparative performance estimates refer to non-antialiased scenes; all bets are off when antialiasing is enabled.
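
Assuming equal clock speeds, the texturing half of that comparison is easy to sketch: peak texel fillrate is pipes times texture units per pipe times clock. The clock value below is a placeholder, not an announced speed, so only the relative numbers mean anything:

    # Peak texel fillrate = pipes x texture units per pipe x core clock.
    # The 400MHz clock is a placeholder so the chips can be compared at parity;
    # NVIDIA has not announced NV31 or NV34 clock speeds.
    PLACEHOLDER_CLOCK_MHZ = 400

    chips = {
        "NV30": (4, 2),  # four pipes, two texture units each
        "NV31": (4, 1),  # four pipes, one texture unit each
        "NV34": (4, 1),  # four pipes, one texture unit each
    }

    for name, (pipes, tex_units) in chips.items():
        print(f"{name}: {pipes * tex_units * PLACEHOLDER_CLOCK_MHZ} Mtexels/s")

    # At identical clocks, NV30 lays down twice the texels of NV31 or NV34,
    # which lines up with NVIDIA's "roughly half" characterization even before
    # shader parallelism enters the picture.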

Of course, these relative performance claims for NV30, NV31, and NV34 assume that the chips are running at identical clock speeds, which certainly won’t be true for all cards based on the chips and may not even be true for any. Additionally, any manufacturer’s performance claims should be taken with a grain of salt, at least until independent, verifiable benchmarks are published.

Now that we know about the chips, let’s move on to the cards they’ll be riding on.

 

NV31: GeForce FX 5600 Ultra
NVIDIA will bring NV31 to market as the GeForce FX 5600 Ultra, a performance-oriented product that will replace NV25/28-based graphics cards at a retail price of under $200. The GeForce FX 5600 Ultra’s suggested $199 price tag lines the card up nicely against ATI’s new Radeon 9600 Pro, which will become available in the same timeframe. Both cards will have four pixel pipelines with a single texture unit per pipe, but whether the GeForce FX 5600 Ultra can match the Radeon 9600 Pro’s 400MHz core clock speed remains to be seen.

What might GeForce FX 5600 Ultras look like? Let’s have a peek.


The GeForce FX 5600 Ultra, picture courtesy of NVIDIA

As you can see, the GeForce FX 5600 Ultra doesn’t need a Dustbuster.

And there was much rejoicing.

Seriously, listen. You can actually hear the rejoicing because there isn’t a vacuum whining in the background. The GeForce FX 5600 Ultra’s reference heat sink looks like something one might find on a GeForce4 Ti 4600, and won’t eat up any adjacent PCI slots. Still, it’s usually a good idea to leave the closest PCI slot to an AGP card empty to improve air flow.

From the picture above, we can see that NVIDIA has gone with those nifty BGA memory chips on its GeForce FX 5600 Ultra reference card, which also uses a standard output port config. Thankfully, because NV31 integrates dual RAMDACs and dual TMDS transmitters, there’s nothing stopping manufacturers from building GeForce FX 5600 Ultras with dual DVI or even dual VGA ports.

NVIDIA wouldn’t confirm how many PCB layers are required by the GeForce FX 5600 Ultra, but they did say that the boards would have fewer layers than the 12-layer GeForce FX 5800 Ultra. According to NVIDIA, plenty of different manufacturers will be building their own GeForce FX 5600 Ultra boards; we might even see a few manufacturers stray from NVIDIA’s reference design with more unique GeForce FX 5600 Ultra-based products.

Although NVIDIA makes no official mention of a non-Ultra GeForce FX 5600, I wouldn’t be surprised to see a few popping up in OEM systems. According to NVIDIA, “Ultra” is what sells on retail shelves, so manufacturers won’t be putting much retail emphasis on non-Ultra products. However, I doubt that means that shelves won’t be populated with non-Ultra GeForce FX 5800s since it appears that few GeForce FX 5800 Ultras will be available at all.

NV34: GeForce FX 5200 and 5200 Ultra
Unlike NV31, which only comes in one flavor (at least for now), NV34 will see action in the GeForce FX 5200 and GeForce FX 5200 Ultra, which will be priced at $99 and $149, respectively. The GeForce FX 5200 cards are aimed at mainstream markets, and will all but purge the GeForce4 from NVIDIA’s line. Unfortunately, the GeForce4 MX isn’t quite dead yet. The much-maligned MX will hang out in the sub-$80 price range as a value part, extending the incredible legacy of NVIDIA’s GeForce2 architecture.

Depending on which GeForce FX 5200 we look at, ATI’s Radeon 9600 or 9200 will be the competition. Unfortunately, without clock speeds or samples of the GeForce FX 5200 and GeForce FX 5200 Ultra, it’s hard to speculate about the cards’ performance relative to ATI’s mainstream offerings. At the very least, the Radeon 9600’s antialiasing performance should easily eclipse the GeForce FX 5200 Ultra’s because of the latter’s lack of color and Z-compression.

NVIDIA’s NV31 and NV34 press materials include pictures of the GeForce FX 5200 and 5200 Ultra, but the GeForce FX 5200 Ultra’s picture is identical to that of the GeForce FX 5600 Ultra. Since we’ve already checked out that glossy, let’s have a look at the vanilla GeForce FX 5200.


The GeForce FX 5200, picture courtesy of NVIDIA

How does a passively-cooled, DirectX 9-compatible graphics card for under $100 sound? Deadly, at least for the GeForce4 MX. Who wouldn’t spend an extra $20 for a couple of extra pixel pipelines and a side order of DirectX 9?

If the GeForce FX 5600 Ultra’s board layout is identical to the GeForce FX 5200 Ultra’s, then the latter should be available with BGA memory chips. It looks like the vanilla GeForce FX 5200 will use older TSOP memory chips, but it won’t require any extra power, so it doesn’t have an on-board power connector.

As with its GeForce FX 5600 Ultra cards, NVIDIA isn’t revealing the board layer requirements of the GeForce FX 5200 other than to say that the boards will require fewer layers than the 12-layer GeForce FX 5800 Ultra. I’d expect the same manufacturers that will be building their own GeForce FX 5600 Ultra cards will also be making their own GeForce FX 5200s.

 

Selling features ahead of performance
Because NVIDIA hasn’t announced the clock speeds of its NV31- and NV34-based graphics products, it’s really difficult to pass any kind of judgment on what each card’s performance might look like. NVIDIA has released some no doubt carefully massaged benchmarks comparing its new GeForce FX chips with a handful of older GeForce4s and ATI’s recently replaced Radeons, but a manufacturer’s own benchmarks should always be taken with a healthy dose of salt. The fact that NVIDIA is being so secretive about so many crucial details, from core and memory clock speeds to the internal structure of the shaders, is troubling. At least they’re being up front about the number of pixel pipelines and texture units in each chip.

With few details about the internal structure of NVIDIA’s new performance and mainstream chips and no hint of clock speeds, analyzing NV31 and NV34 is tough. There’s no new technology in either chip, but both extend NVIDIA’s support for DirectX 9 features all the way down the line to even the $99 GeForce FX 5200. That even a budget $99 graphics card will feature support for vertex and pixel shader 2.0+ and full 128-bit internal floating point precision is impressive, and those features alone should give marketing managers plenty of options when it comes to dazzling consumers.

But will these new GeForce FX cards sell on their DirectX 9 support alone? That depends.

Against ATI’s Radeon 9200s, the GeForce FX 5200 should do well, and rightly so. Consumers looking for a graphics card at that price point will more than likely be swayed by the GeForce FX 5200’s generation lead over the Radeon 9200 when it comes to DirectX support. Those consumers also aren’t likely to be bothered by the GeForce FX 5200’s potentially poor antialiasing performance, since few are likely to enable that feature anyway. I am, however, concerned about the GeForce FX 5200 line’s performance in real DirectX 9 titles, especially when using full 128-bit floating point precision. NV34 may have full hardware support for DirectX 9, but supporting DirectX 9 applications and running them with acceptable frame rates are two very different things.

Ideally, the higher clock speeds facilitated by the GeForce FX 5600 Ultra’s 0.13-micron core should yield performance comparable with ATI’s mid-range Radeon 9600 Pro, but it’s hard to say for sure. At least in terms of their support for DirectX 9 features, the Radeon 9600 Pro and GeForce FX 5600 Ultra are on roughly even ground. NVIDIA will have the edge when it comes to supporting longer program lengths in its pixel and vertex shaders. Unfortunately, NV31’s shaders seem to be half as powerful as NV30’s, which isn’t encouraging. Based on what NVIDIA has revealed about the GeForce FX 5600 Ultra, it looks like the card could be significantly slower than the GeForce FX 5800, especially in next-generation titles like the new Doom.

In the end, NVIDIA should be commended for bringing NV30’s CineFX engine down to a price point that everyone can afford; it’s cinematic computing for the masses. However, the fact that NVIDIA is being so tight-lipped about the clock speeds of its new cards, even though those clock speeds have apparently been finalized, sets off all sorts of alarms in my head. NVIDIA’s carefully-picked selection of marketing-tuned benchmarks aren’t enough to allay my concerns, and I have more questions about these products than NVIDIA has straight answers. Color me a skeptic, but it doesn’t look like NVIDIA’s NV31- and NV34-based products will dominate the mainstream marketplace. We’ll have to wait and see. 

Comments closed
    • Anonymous
    • 16 years ago

    Be aware of stupid cards like my 3dClub 9000PRO shit stuf…128MB. Memory is@250MHz (DDR) by default in bios but gives corupted polygons and skins.Only works if memory clock is set to 230MHz (DDR). Not to mention that the catalyst reports it as POWERCOLOUR!!! This card sucks…ati sucks…..i want back my old ASUS GeForce256.

    • Anonymous
    • 16 years ago

    To post 68:

    DON’T BUY A FX5200!!! I bought one of these crap and it performs worst than my old GF2!! I quickly sold it and get a Radeon 9100 and it’s like TWICE the performance!!

    FX5200 is the ultimate crap!!!!

    • Anonymous
    • 16 years ago

    Nvidia and ATI are clearly in league together in a ridiculous scam to take money from poor stupid gamers who have nothing better to worry about than “will upgrading from a 5600 to a 9600 be like 7 polygons/s faster??” What kind of idiot pays $150 for a circuit board anyway?

    • Anonymous
    • 16 years ago

    Fuck you!
    i got a geforce 4 ti 4200(980s) from asus
    and it is faster than a geforce 4 ti 4600(in 3d mark 2001),but in a banchmark(an old one) freshdiagnose a tnt2 draw more texts:
    mine:728,592
    tnt2:858,401

    • Anonymous
    • 16 years ago

    If you’re looking to overclock the FX 5200, but are worried about the passive cooling, then you might want to look at buying an FX 5200 put out by PNY. They mount a small cooling fan on the heatsink for you, and also give you two VGA outs, so you can support dual monitors w/o having to buy any adaptors, plus they give you an S-video out as well. BFG and Asylum don’t do that for you. It was a really good buy for only $100.00.

    • Anonymous
    • 16 years ago

    FUCK YOU ALL
    my geforce 4 will kick your ass 😛

    u fucks av probably ony got tnt2 fuckheads

    • Anonymous
    • 16 years ago

    you all should get a Geforce 4 ti 4400.. its a beast.. i have it overcloocked and running stable over the GeForce 4ti 4600… its nice…

    • Anonymous
    • 16 years ago

    Here’s One Better,..Let’s See Nvidia And Radeon Do This.

    Wildcat VP990 Pro
    512MB 256bit DDR
    450M Tri/Sec
    225M Ver/Sec
    42G AA Samples/Sec
    200 Giga/Flops Processor
    1.2 Tera/Ops VPU

    All For The Price Of Both Top End Cards From Nivida And Radeon. But 4-5 Times The Performance.

    • Anonymous
    • 16 years ago

    Screw Invida And Radeon,..What Y’all Want Is A Wildcat IV From 3d Labs,…If You Work For DELL You Can Snake One For Free.

    • Anonymous
    • 16 years ago

    Well, a TNT2 Pro is not supported by BF:1942, I’ve tried! 🙂

    • Anonymous
    • 16 years ago

    I have a Radeon 9000 PCI and want to know if the geforce fx 5200 card is any better? Can anyone tell me some facts between the two cards? If the 5200 is better, I may scrap the 9000.

    Thanks

    • Anonymous
    • 16 years ago

    geforce 3 is crap

    • Anonymous
    • 16 years ago

    lullululllul

    • Anonymous
    • 16 years ago

    YAY!! Finally, a cheap b[

    • Anonymous
    • 16 years ago

    §[http://www.chip.de/news/c_news_10209939.html]§ You'll probably want babelfish. NV35 > *

    • Anonymous
    • 16 years ago

    Uh, yeah. Considering NO games actually use DX9 right now, and full use of it as industry standard, like DX8.1 will take more than a few months to come into play, by the time DX9 IS mainstream, today’s 400 dollar DX9 cards will be tomorrow’s 179.99 cards anyway.

    • Anonymous
    • 16 years ago

    “When you first start up Unreal Tournament 2003, it puts up a big splash screen with an Nvidia logo that reads, “The Way It’s Meant to Be Played”. However, going by these test results, it looks more like UT is meant to be played on the Radeon 9500 Pro. ”
    YUP. 5200 and the 5600. If you know nothing about hardware. Then those cards are for you. Crap like the MX chips.

    • lovswr
    • 16 years ago

    Ok, It’s Monday now…where are the reviews? The line that srtuck the most from the article is this: “All of the NV3X series will have a [b]128 bit[/b]bus.” If this is true the epic finger pointing over at nvnews will be of epic proportions.

    • Anonymous
    • 16 years ago

    Too late Nvidia! I just picked up an ATI card a few days ago and I’m really happy with it, which is more than I can say for your hyped late-to-market products and stale moldy drivers on your website.

    • Anonymous
    • 16 years ago

    I just want to buy a Video Card that costs $2. And run DOOM3 at 80 FPS with Fsaa+Ansioscopic filtering sampled to the max at resolutions of 1600×1200. I don’t think that is too much to ask for.

    • Anonymous
    • 16 years ago

    If the 5200 is gonna be faster than my GF2 MX (it better!!!) then I’d be interested in buying it.

    And by faster… it better be at least double the speed. 3 chip generations later better make it double the speed. Damn… hope my 1GHz system is okay

    • Anonymous
    • 16 years ago

    posted at THG that some nvidia guy said Direct X 9 cards for $79 bucks.. would be great if the 5200 was $79 bucks… and the 5600 double that.. ~$160.00 that would be sweet.. specially if there is a Price war between the two.. should benifit everyone.. could care less if i have a ATI or Nvidia card, long as it works properly, and gives me the best bang for the buck.. I can’t afford $200+ bucks for a graphics card.

    • Anonymous
    • 16 years ago

    [q]In the standard tests without FSAA and anitotropic filtering, the FX 5600 Ultra seems to be just about the same or slower than a GeForce 4 Ti 4200 8x and Radeon 9500 PRO. This might be due to the reduced pixel pipelines (2×2, as opposed to 4×2). With 4xFSAA, it appears to reach nearly double the performance of the 4200, beating a 4800 as well, but it loses out to the Radeon 9500 PRO. It’s a similar picture with the anisotropic filtering. In the pixel shader tests from 3DMark 2001, it beats the 4200/2800, but loses in the vertex shader tests. In both tests, it clearly loses to the Radeon 9500 PRO. [/q]

    • wesley96
    • 16 years ago

    Considering Trident has been floating around selling their chips even NOW, while they have very little existence in the retail market proves that the fast pimp isn’t feeding the family, indeed.

    • Zenith
    • 16 years ago

    AG#49 – With AA and aniso mind you, and not with well writen or even a single FINAL driver set for it.

    • Anonymous
    • 16 years ago

    MX isn’t dead, not remotedly, it’ll be mass produced for at least 2 years.

    • opinionated
    • 16 years ago

    The naming scheme does make it easier to keep track of which nvidia card is going up against its corresponding ati card. 5200 vs. 9200 , 5600 vs. 9600 , 5800 vs. 9800 .

    • Anonymous
    • 16 years ago

    If we look at where these companies make there profit,,its obviousely not in the high end,,,so lets not say nvidias dead,,, they still make boat loads of money, The enthusiast may choose a 300 dollar rocket right now but the middle class choose whats priced for performance. We have no actual benchmarks right now and do not know the exact price points. You can find the same card priced at one place for 200 dollars ,,down the road its 175,, go figure ,,,dont complain,,, this only makes it better for us when there is competition and more options. And i think the whole naming sceme makes things more intresting,,, pick a box and you never know what your gonna get

    • Anonymous
    • 16 years ago

    I agree with 43. The 5200 just doesn’t look powerful enough to run DX9 games (although for some reason I still feel that it’s going to be fast). The 5200 Ultra has gotta be at least as fast as a GF4 Ti4200 to make me interested!
    And according to Tom’s Hardware Guide, the 5600 just can’t go up with the Radeon 9500 Pro, which is faster than the Ti4200.

    • Zenith
    • 16 years ago

    Ah crap NV is screws. I mean come on, there new flag ship getting its ass creamed by a 9500pro with some Aniso? WHAT GIVES?

    • EasyRhino
    • 16 years ago

    Yay! The MX is dead!

    Almost.

    ER

    • Anonymous
    • 16 years ago

    What idiot did the naming scheme?

    Just looking at the names one would think that the 5200 *is* the replacement for the 4200. Of course, a brief read will tell you that the *5600* is taking the midrange point of the 4200. With the 5200 taking over for the MX and mashing them handily…oh what a surprise (extreme sarcasm), but not beating the 4200 the naming scheme is tragic. It just seems like spectacularly bad marketing.

    • Anonymous
    • 16 years ago

    I wonder if NVIDIA is going the way of 3DFX.

    • Namarrgon
    • 16 years ago

    How fast does a TNT2 M64 run MOH:AA or BF1942? Not very fast, I suspect. Yet nVidia OEMs still sell them by the truckload.

    What’s important is, will Doom3/whatever provide means to cut down on the eye candy until the framerate is acceptable, even on a cheaper card like this? You can bet that they will, or they’ll be missing out on a very large market segment.

    At the very bottom of the market, a TNT2 M64 won’t run Doom3 at all, but a GF2 MX will – just. A GFFX 5200 should have no trouble, but of course with no AA/AF, only (say) 800×600 and probably with no specular highlights or some such.

    Bottom line is, the new low-end cards will run almost anything for years to come, so long as those games/apps allow people to turn down the resolution/texture size/fanciness enough – and I think that’s very likely.

    • Anonymous
    • 16 years ago

    So now we can have beautiful looking DX9 games on a $99 card … at < 10 fps …

    I’d rather have it look a bit rougher and get 50 fps on a DX8.1 card like the 9000 up to 9200 Pro.

    There aren’t many games around that use DX9 – in fact, it is mostly demos.

    Low-end cards are good for todays and yesterdays games, you buy the card, and you replace it 1 year down the line ‘cos it is cheap.

    High end cards are good for next years games, ‘cos you buy the card and then can’t afford to replace it in a year! So you get excellent performance now, and acceptable performance next year.

    I can only see the 5200 tanking badly, with bad DX9 game performance (when one appears), and bad DX8.1 performance to boot.

    • Anonymous
    • 16 years ago

    they should have a gold silver and bronze naming standard.

    • Anonymous
    • 16 years ago

    [q]I was under the impression that even the low end model (5200) should outdue the current ti4200. If not then maybe it’s not that exciting. Direct X 9 for $99 is a good deal.[/q]

    Ti 4200 has more texel fillrate than FX 5200 Ultra, and less or more pixel fillrate depending on whether the pipes are 4×1 (as TR has reported) or 2×2 (as THG has reported). 5200 Ultra has oogles more bandwidth and DX9 support.

    • Anonymous
    • 16 years ago

    Im looking for news posters for §[http://www.3dnewsnet.com]§ e-mail mike!

    • Anonymous
    • 16 years ago

    Directx9 compliant for these cards will be the catch buzz word for non-gamers who no squat. Just because it’s dx9 compliant means squat if it does not have the rendering power to run demanding games with directx9 features. Most newbies don’t know that. And this is where these low-end crap cards are directed too. Try running a $99 directx9 card on DOOM3(which does not even use directx9 features) or any upcoming directx9 game. Or run 3dmarks2003 with those low price crap cards and see how they get slaughtered. More MX line crap.

    • Unanimous Hamster
    • 16 years ago

    From

    63M transistors for a DX8 part (NV25)

    to

    45M transistors for a DX9 part (NV34)

    ?????????????

    How’d they do dat?

    Methinks Nvidia is in league with Satan … or at least Bill Gates.

    • Anonymous
    • 16 years ago

    The NV25 has 63M transistors.

    The NV30 has ~120M transistors.

    The NV31 has ~70M transistors.

    The NV34 has ~45M transistors.

    Something has to give, I wonder what it is…

    • Namarrgon
    • 16 years ago

    Tightlipped about the clock speeds? Well then why do other previews state them all happily up front? (At least, B3D and nVNews do, all I’ve checked so far):

    GeForce FX 5600 Ultra: 350 MHz core, 350 MHz DDR-I memory
    GeForce FX 5200 Ultra: 325 MHz core, 325 MHz DDR-I memory
    GeForce FX 5200: 250 MHz core, 200 MHz DDR-I memory

    It wasn’t that long ago that ATi & Matrox actually did hide their clockspeeds. Thankfully it’s all that much more open now, at least.

    • Anonymous
    • 16 years ago

    fx5200 doesn’t have to outdo a GF4Ti 4200. The fx5600 does. The 5200 is tasked with outdoing a GF4MX 440, which shouldn’t be hard to do at all.

    • opinionated
    • 16 years ago

    Oops. I guess I did see them.

    • opinionated
    • 16 years ago

    Good point. That’s why I would have liked to have seen benches of the 9800 Pro done with the old 3DMark.

    • Anonymous
    • 16 years ago

    *[

    • opinionated
    • 16 years ago

    While nVidia is preventing anyone from reporting actual benchmark numbers before Monday, Tom’s Hardware (for what it’s worth) has run some benchmarks and is reporting general results. In general, they conclude that Ati’s cards are outperforming nVidia’s at similar price points.

    • opinionated
    • 16 years ago

    But will DirectX 9 games be playable on the 5200 Ultra when they (the games) finally begin coming out? I’m concerned the Radeon 9800 Pro may not be enough card by the time we finally get to see DirectX 9 games!

    • Rousterfar
    • 16 years ago

    I was under the impression that even the low end model (5200) should outdue the current ti4200. If not then maybe it’s not that exciting. Direct X 9 for $99 is a good deal.

    • Anonymous
    • 16 years ago

    dissonance
    [q]NVIDIA tells me 5600 Ultra at the beginning of April, 5200s at the end of April.[/q]

    hehe, someone check the calender. Isnt that almost exactly 1 year after the original release of the GF4 Ti4600 and 4200?

    Wait a year, add a thousand. If this keeps up, we can finally have a Geforce10000 sometime in 2008! Of course by then, who know how many digits/superlatives ATI will have after thier cards…

    • Anonymous
    • 16 years ago

    *[

    • leor
    • 16 years ago

    if the fx5200 can’t outdo a ti4200 that would be really sad

    • Anonymous
    • 16 years ago

    So explain to me why I should be happy about vid cards that should be about on par with their GeForce4?

    Their short lived 5800 Ultra is neck and neck with ATI’s 9700 so why should I want something slower when the 9700 has been out for so long now?

    Nvidia’s low end cards really should be performing at the level of a 9700, but alas, nividia fucked up and missed the boat.

    • Anonymous
    • 16 years ago

    *[Originally Posted by Trident

    • Anonymous
    • 16 years ago

    Finally something compelling for the average gamer to talk about in the gfx arena. While the whole ATI vs Nvidia performance lead debate is interesting, it’s not immediately practical for most of us.

    Its moot whether the 9800Pro kicks GFFX ass or vice versa; I am not willing to pay the $$ for either and probably alot of you feel the same way.

    Now that both companies will have new cards in the (sub) $200 range, the battle begins anew and I’ll be watching with keen interest. I bought my GF4 Ti4200 last summer when the only ATI offerings were the 8500 and the 9700Pro so it was an easy choice.

    If I decide to upgrade this fall/winter, I suspect it will be a difficult decision which card to buy 🙂

    BTW, is this just another paper launch? About when are the cards actually due?

    • Anonymous
    • 16 years ago

    Gignic, I think you only missed the Radeon 9100. 🙂

    Yes, the names have gotten out of hand, but at least with nVidia’s cards, they have other things attached to them like “MX”, “Ti”, “FX”. You can at least tell which product line you’re looking at, even if they were bastards with the GF4 MX’s non-DX8 abilities. All you have to go on with a Radeon is a number, and the first digit doesn’t mean much since the 8500 outdoes the 9000 and the 9100 is the 8500 rebadged.

    • Anonymous
    • 16 years ago

    i just got some spam today with something along the lines of extreme growing… not a good name

    Adi

    • Anonymous
    • 16 years ago

    I share #11’s enthusiasm. The FX 5200 Ultra is sitting smack in the middle of my price range. If it can outdo a Ti 4200, I’m there.

    • Anonymous
    • 16 years ago

    *[

    • Anonymous
    • 16 years ago

    If the 9600 Pro has some form of the F-Buffer technology that’s in the 9800 Pro the one advantage the 5600 has, longer pixel shader instructions, will be negated. But since there are no complete specs on the 9600 Pro, we will have to wait a little while.

    • Anonymous
    • 16 years ago

    *[Originally Posted by Trident

    • yarbo
    • 16 years ago

    extreme growing sounds like a catchy name

    • atidriverssuck
    • 16 years ago
    • Anonymous
    • 16 years ago

    XabreII will own the mainstream and value sections. Nvidia & ATI are just fighting a futile battle!

    • eitje
    • 16 years ago

    i think the GFFX Superfliege is my favorite 🙂

    • eitje
    • 16 years ago

    hehe – brought to you by babelfish, and trident troll’s sense of humor.

    [superfly]
    spanish: GeForce FX Mosca Estupenda
    french: GeForce FX Mouche Superbe
    german: GeForce FX Superfliege
    japanese: GeForce FX 極度のはえ
    korean: GeForce FX 최고 비행거리

    when you run the Japanese back through, it comes out as “Extreme Growing”… hmmm…. and the Korean is “GeForce FX Highest Flight Range”.

    • Anonymous
    • 16 years ago

    *[

    • atidriverssuck
    • 16 years ago

    trident troll, remember that these cards sell in non-enlglish speaking countries too, so I think the more numbers, the merrier. Unique names probably wouldn’t work too well in other markets.

    • Rousterfar
    • 16 years ago

    Rember guys it’s these low end cards that sell. If Nvidia can win the crown for best price/performance on the lowend for this round vs ATI they will pull threw this fine. Everyone I know who has a GF4 got a ti4200. Not many people are willing to pop 300 bones down on a graphic card.

    • Anonymous
    • 16 years ago

    *[Originally Posted by Trident

    • Anonymous
    • 16 years ago

    I am excited about the low end cards,, i may just keep my geforce 2 mx 200 for now,,i havn’t found anything in my area with a decent price. So im gonna wait for these cards and se if radeon card prices plummet,,looks good for me!!! the middle class\lower end buyer 😀

    • Forge
    • 16 years ago

    Well, crap. That GFFX we announced and never shipped is already obsolete, so let’s start talking about Rage, Spectre and Mojo instead of getting The Voodoo5 6000 out the door.

    I’ve heard all this before.

    • Anonymous
    • 16 years ago

    Yes! Yesss!! Oh, let me taste your tears, Nvidia! [starts licking the Fairy’s tears off her face] Mm, your tears are so yummy and sweet.

    Oh, the tears of unfathomable sadness! Mm-yummy. [licks the tears off the table and off the Fairy’s face.] Mm-yummy you guys!

    • Anonymous
    • 16 years ago

    Given that the 9500 Pro is only a few % behind the vanilla 5800 when running AA and AF, I can’t imagine the 5600 Ultra beating out the price equivalent 9600 Pro…

    …actually I expect quite the opposite.

    The only place I see ATI falling behind is the lowend 9200 vs the 5200 Ultra.

    Dually
    _non-member_

    • Anonymous
    • 16 years ago

    You guys really, really need to get your hands on a GeForce FX.. other than that, well done as usual!

    • Anonymous
    • 16 years ago

    Hmmmm. Not giving out full details or even some details again. Nvidia doing it again. I hope for their sake that these cards are not stinkers like the 5800. And Nvidia not releasing details about makes me very skeptical too. Pls not another b.s card like the MX line or 5800 Ultra.

    • atidriverssuck
    • 16 years ago

    Color me a skeptic, but it doesn’t look like NVIDIA’s NV31- and NV34-based products will dominate the mainstream marketplace.
    ———–
    yep.
