NVIDIA’s GeForce FX 5900 XT GPU

Manufacturer eVGA
Model GeForce FX 5900 SE
Price (estimated) US$200
Availability Shipping

AFTER FIELDING a number of lackluster mid-range graphics chips based on its GeForce FX architecture, NVIDIA finally got its act together with the GeForce FX 5700 Ultra. When we reviewed the 5700 Ultra back in October, the card’s performance in DirectX 8-class games was at least as good as—and often much better than—ATI’s Radeon 9600 XT. Unfortunately, the 5700 Ultra stumbled in a number of DirectX 9-class games and synthetic benchmarks, casting a shadow of doubt about how the card might handle next-generation DirectX titles.

Not content to bet the mid-range farm on the 5700 Ultra, NVIDIA has added yet another GeForce FX card to its lineup, the GeForce FX 5900 XT. The 5900 XT will share the same $200 price point as NVIDIA’s existing GeForce FX 5700 Ultra, and both will compete with ATI’s $200 Radeon 9600 XT. This new card’s “XT” moniker suggests NVIDIA wants to knock the Radeon 9600 XT off its pedestal. NVIDIA has even whipped up an answer to ATI’s much-lauded Half-Life 2 bundle to sweeten the 5900 XT package.

Can its pseudo-eight-pipe graphics core help elevate the GeForce FX 5900 XT above the competition? There’s only one way to find out.

eVGA’s GeForce FX 5900 SE
Don’t let its name fool you. eVGA’s GeForce FX 5900 SE is very much a 5900 XT. NVIDIA’s partners are free to name cards as they please, and most will be using the GeForce FX 5900 XT name. All GeForce FX 5900 XT cards should share the same core and memory clock speeds, and similar board layouts and memory configurations. Here are the specs on eVGA’s GeForce FX 5900 SE:

GPU NVIDIA NV35
Core clock 400MHz
Pixel pipelines 4*
Peak pixel fill rate 1600 Mpixels/s
Texture units/pixel pipeline 2
Textures per clock 8
Peak texel fill rate 3200 Mtexels/s
Memory clock 700MHz
Memory type BGA DDR2 SDRAM
Memory bus width 256-bit
Peak memory bandwidth 22.4GB/s
Ports VGA, DVI, composite and S-Video outputs
Auxiliary power connector 4-pin Molex

*The NV35 graphics chip renders four conventional (color + Z) pixels per clock, but is capable of performing 8 operations per clock for Z pixels, textures, and stencil and shader ops.
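For the curious, those peak numbers fall straight out of the clock speeds and pipeline layout above. Here’s a minimal sketch of the arithmetic in Python; the formulas are the standard theoretical-peak calculations, and the figures come straight from the spec table:

```python
# Peak-rate arithmetic for the GeForce FX 5900 XT, per the spec table above.

core_clock_mhz = 400   # core clock
pixel_pipes    = 4     # conventional (color + Z) pixel pipelines
tex_units      = 2     # texture units per pixel pipeline
mem_clock_mhz  = 700   # effective (DDR) memory clock
mem_bus_bits   = 256   # memory bus width

# One color + Z pixel per pipe per clock.
pixel_fill = core_clock_mhz * pixel_pipes               # 1600 Mpixels/s

# Each texture unit can sample one texel per clock.
texel_fill = core_clock_mhz * pixel_pipes * tex_units   # 3200 Mtexels/s

# Effective memory clock times bus width in bytes.
bandwidth_gbs = mem_clock_mhz * 1e6 * (mem_bus_bits / 8) / 1e9  # 22.4 GB/s

print(pixel_fill, texel_fill, round(bandwidth_gbs, 1))
```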

eVGA’s GeForce FX 5900 SE is a pretty plain looking card, which is just fine by me. As attractive as colored boards with blinking lights can be, how often are you actually looking at your PC’s internals?

In a move that will no doubt delight owners of Shuttle’s small form factor XPC systems, the GeForce FX 5900 SE uses a single-slot cooler that gets along just fine with Shuttle’s nonstandard AGP slot layout. The 5900 SE’s cooling fan is nice and quiet, too.

eVGA uses large memory heat sinks to cool the GeForce FX 5900 SE’s memory chips. Incidentally, those chips are only found on one side of the board.

Like just about every other consumer graphics card on the planet, eVGA’s 5900 SE has VGA, DVI, and S-Video output ports. The card also comes with a DVI-to-VGA adapter and an S-Video cable.

In addition to a couple of cables, eVGA’s 5900 SE comes with a pretty stacked software bundle. For starters, the bundle includes full versions of NVDVD 2.0, America’s Army, and Ghost Recon, but that’s not all. NVIDIA has also announced an exclusive deal to bundle the recently released WWII shooter Call of Duty with its GeForce FX 5900 XT graphics cards. Just about every one of NVIDIA’s partners, including eVGA, will be getting in on the deal.

NVIDIA’s Call of Duty deal represents roughly $50 of value for those who were planning on picking up the game, which is a pretty sweet deal. NVIDIA is providing its partners with full-version Call of Duty CDs rather than coupons or vouchers, so the game should come right in the box.

NVIDIA certainly isn’t the first graphics chip manufacturer to announce an exclusive, top-tier game bundle for its mid-range graphics cards. ATI announced its Radeon XT Half-Life 2 bundle back in September. Call of Duty might not have the name recognition of Half-Life 2, and it probably wasn’t nearly as eagerly anticipated as Valve’s upcoming sequel. But Call of Duty is available today, and there’s no telling when Radeon XT owners will be able to redeem their Half-Life 2 coupons. In the end, I’ll probably end up spending more time playing Half-Life 2 than I will Call of Duty, but there’s only so much amusement I can wring from a Half-Life 2 coupon. Making paper airplanes and origami is fun for the first five minutes, but the replay value is pretty weak. (In all fairness, ATI and Valve are offering an interim game pack for Radeon buyers that includes a total of six downloadable Valve titles from years past. They are older titles, though.)

 

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

  System
Processor AMD Athlon 64 3200+ 2.0GHz
Front-side bus HT 16-bit/600MHz downstream
HT 8-bit/600MHz upstream
Motherboard Chaintech Zenith ZNF3-150
Chipset NVIDIA nForce3 Pro 150
Chipset drivers nForce 3.13
Memory size 512MB (1 DIMM)
Memory type Corsair XMS3500 PC3200 DDR SDRAM (400MHz)
Graphics card GeForce FX 5900 SE 128MB
GeForce FX 5700 Ultra 128MB
Radeon 9600 XT 128MB
Graphics drivers Detonator FX 52.16 and 53.03 (NVIDIA), CATALYST 3.9 (ATI)
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0b

Today we’ll be looking at the GeForce FX 5900 XT’s performance against its $200 competition: the GeForce FX 5700 Ultra and ATI’s Radeon 9600 XT. I’ve included results for the 5900 XT with NVIDIA’s previous 52.16 drivers and the new 53.03s.

Though ATI’s Catalyst 3.9 drivers support Overdrive automatic overclocking for the Radeon 9600 XT, we ran the card with Overdrive disabled. Since Overdrive is controlled by the GPU core temperature, which is influenced by variable ambient system temperatures and the thermal characteristics of individual graphics cores, it’s hard to come up with reproducible scores.

It’s important to mention that, at press time, the NVIDIA Detonator 53.03 drivers we used for testing weren’t approved by Futuremark for use with 3DMark03; Futuremark has yet to evaluate the drivers in question. Keep in mind that future 3DMark03 patches may combat optimizations in unapproved drivers, which could impact performance in the benchmark.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Fill rate
Theoretical fill rate and memory bandwidth peaks don’t necessarily dictate real-world performance, but they’re a good place to start.

                       Core clock  Pixel      Peak fill rate  Texture units  Peak fill rate  Memory clock  Memory bus    Peak memory
                       (MHz)       pipelines  (Mpixels/s)     per pipeline   (Mtexels/s)     (MHz)         width (bits)  bandwidth (GB/s)
Radeon 9600 Pro        400         4          1600            1              1600            600           128           9.6
Radeon 9600 XT         500         4          2000            1              2000            600           128           9.6
GeForce FX 5600 Ultra  400         4          1600            1              1600            800           128           12.8
GeForce FX 5700 Ultra  475         4          1900            1              1900            906           128           14.4
GeForce FX 5900 XT     400         4          1600            2              3200            700           256           22.4
GeForce FX 5900        450         4          1800            2              3600            850           256           27.2

With a core clock speed of 400MHz, the GeForce FX 5900 XT can’t quite match the 5700 Ultra’s or the Radeon 9600 XT’s single-texturing fill rate. However, the 5900 XT’s multi-texturing fill rate blows away the competition, as does its peak memory bandwidth, which is more than double what’s available with the Radeon 9600 XT. It’s also worth noting that the 5900 XT’s NV35 graphics chip can, under certain circumstances, look a lot like an eight-pipe chip.
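To put those ratios in concrete terms, here’s a quick back-of-the-envelope comparison using only the table’s own numbers; nothing here is measured, it’s just the theoretical peaks:

```python
# Theoretical peak memory bandwidth, from the comparison table above.

def bandwidth_gbs(mem_clock_mhz, bus_bits):
    # Effective memory clock times bus width in bytes per transfer.
    return mem_clock_mhz * 1e6 * (bus_bits / 8) / 1e9

fx5900xt = bandwidth_gbs(700, 256)   # 22.4 GB/s
r9600xt  = bandwidth_gbs(600, 128)   # 9.6 GB/s
fx5700u  = bandwidth_gbs(906, 128)   # ~14.5 GB/s

print(f"5900 XT vs 9600 XT:    {fx5900xt / r9600xt:.2f}x")   # ~2.33x
print(f"5900 XT vs 5700 Ultra: {fx5900xt / fx5700u:.2f}x")   # ~1.55x
```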

Of course, theoretical peaks don’t always hold up in the real world. How does the 5900 XT deliver on its fill rate potential?

Despite having a lower theoretical single-texturing fill rate peak than both the 5700 Ultra and Radeon 9600 XT, the GeForce FX 5900 XT blows them both away in 3DMark03’s single-texturing fill rate tests. The 5900 XT is way out ahead in our multi-texturing test, too, nearly doubling the performance of the 5700 Ultra.

Shader performance
Given the GeForce FX’s unconventional shader architecture, it’s not easy to come up with theoretical expectations for pixel or vertex shader performance, so let’s cut right to the graphs.

The GeForce FX 5900 XT leads the pack in both pixel and vertex shader performance, though pixel shader performance drops noticeably with the latest drivers. Those same drivers dramatically increase vertex shader performance. Win some, lose some, I guess.

 

ShaderMark 2.0
ShaderMark 2.0 is brand new and includes some anti-cheat measures to prevent drivers from applying questionable optimizations. The Radeons run the benchmark with straight pixel shader 2.0 code, but I’ve included results for the GeForce FX cards with partial-precision and extended pixel shaders, as well.

Some of ShaderMark 2.0’s shaders won’t run on the GeForce FX 5900 XT, or indeed any other GeForce FX chip. The problem is apparently related to floating point texture formats, which the GeForce FX architecture supposedly supports in hardware. NVIDIA claims that adding floating point texture format support to its drivers hasn’t been a priority, and even the latest 53.03s don’t appear to support the feature.

It can’t quite catch the Radeon 9600 XT, but the GeForce FX 5900 XT is still pretty fast in ShaderMark 2.0. The 5900 XT is much faster than the 5700 Ultra, no doubt because of the former’s ability to perform up to eight shader ops per clock.

 

Quake III Arena

Call of Duty

Unreal Tournament 2003

The 5900 XT performs well in our first wave of first-person shooters, but the 5700 Ultra steals a rare win in Call of Duty with antialiasing and anisotropic filtering disabled. Despite better performance in Quake III with the latest 53.03 Detonators, the 5900 XT looks more comfortable overall with the older 52.16 drivers.

 

Comanche 4

Gun Metal benchmark

The 5900 XT makes quick work of the competition in Comanche 4 and Gun Metal. The card’s more than twice as fast as a Radeon 9600 XT with 4X antialiasing and 8X aniso.

 

Serious Sam SE

NVIDIA cards have long performed well in Serious Sam SE, and the 5900 XT is no exception. The 5900 XT is way out ahead of both the 5700 Ultra and Radeon 9600 XT, with and without 4X antialiasing and 8X aniso.

The 9600 XT’s performance with 4X antialiasing and 8X aniso at 1600×1200 is wildly erratic through the first 20 seconds of our benchmark test and essentially unplayable at this resolution. This wasn’t the case with ATI’s previous 3.8 Catalyst drivers, so I suspect ATI has a bug on its hands.

 

Splinter Cell

The 5900 XT puts on another show in Splinter Cell, where it’s about 50% faster than the Radeon 9600 XT across all resolutions.

 

3DMark03

No doubt in part because of NVIDIA’s aggressive driver optimizations, the 5900 XT runs away with 3DMark03’s game tests. The latest version of 3DMark03 defeated a number of NVIDIA’s earlier driver optimizations, but the new 53.03s appear to have a whole new set of tricks up their sleeves. Of course, I couldn’t see any difference in rendering quality between the new 53.03s, the old 52.16s, or even ATI’s latest drivers in 3DMark03’s game tests.

 

Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the V49 patch, which includes a performance benchmark, was never intended for public release. The V49 benchmark apparently fails to correctly load Tomb Raider’s GeForce FX-optimized code path, so it’s more a reflection of how the GeForce FX cards perform with default DirectX 9 code than anything else. Running the normal Tomb Raider game executable, without the benchmark mode enabled, loads the correct GeForce FX code path and promises better performance.

We’ve used these extreme quality settings from Beyond3D to give the GeForce FX 5900 SE a thorough workout in this DirectX 9 game.

I had a lot of problems getting the 9600 XT to run the Tomb Raider test at anything other than 1024×768, which is really a shame. ATI’s previous Catalyst 3.8 drivers didn’t have a problem with higher resolution tests, but I couldn’t get the Catalyst 3.9s to work properly above 1024×768.

It’s a pity that the Radeon 9600 XT doesn’t have a full suite of scores here, because it could have been an exciting race. The 5900 XT is quite a bit faster than the 5700 Ultra in Tomb Raider, and could potentially be faster than the 9600 XT. All that with an unoptimized DirectX 9 code path, too.

AquaMark3

The 5900 XT is 50% faster than the competition with antialiasing and anisotropic filtering disabled, but its lead with 4X antialiasing and 8X aniso is much smaller.

Halo
I used the “-use20” switch with the Halo benchmark to force the game to use version 2.0 pixel shaders.
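For anyone trying to reproduce the run, the launch can be scripted like so. This is a sketch only: the “-use20” switch is the one described above, but the install path, executable name, and “-timedemo” switch are assumptions about the shipping benchmark, so check your own setup:

```python
# Sketch: launch Halo's timedemo with version 2.0 pixel shaders forced.
# NOTE: the path, "halo.exe", and "-timedemo" are assumptions; only
# "-use20" comes from the text above.
import subprocess

subprocess.run([r"C:\Games\Halo\halo.exe", "-timedemo", "-use20"], check=True)
```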

In Halo’s benchmark timedemo, the 5900 XT is out in front by about 50% again.

Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5900 SE’s performance with high-dynamic-range lighting, we logged frame rates via FRAPS in this technology demo at its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
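Turning a frame rate log into a single average is simple enough to script. A minimal sketch follows; note that the one-sample-per-line log format and the file name are assumptions for illustration, not FRAPS’ actual output format:

```python
# Average a frame-rate log into a single FPS figure.
# NOTE: assumes a plain-text log with one FPS sample per line; FRAPS'
# real log format differs, so adapt the parsing to your version.

def average_fps(path):
    with open(path) as log:
        samples = [float(line) for line in log if line.strip()]
    return sum(samples) / len(samples)

print(f"{average_fps('rthdribl_640x480.log'):.1f} fps")  # hypothetical log file
```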

None of our GeForce FX cards renders the “rthdribl” demo perfectly, possibly because of their lack of support for floating point texture formats, but the 5900 XT still performs well. The Radeon 9600 XT is faster at lower resolutions, but the 5900 XT pulls even when we hit 1024×768.

 

Edge antialiasing

ATI’s SMOOTHVISION gamma-corrected antialiasing does a better job of camouflaging jagged edges than the GeForce FX’s antialiasing schemes, but the 5900 XT is faster at 2X and 4X AA than the Radeon 9600 XT.

 

Texture antialiasing

The 5900 XT is at the head of the class in our anisotropic filtering tests, though like every other GeForce FX, the 5900 XT’s aniso maxes out at 8X.

 

Overclocking
In testing, I managed to crank eVGA’s GeForce FX 5900 SE up to core and memory clock speeds of 460MHz and 880MHz, respectively. Considering that the vanilla GeForce FX 5900 is clocked at 450/850, that’s not a bad little overclock for a $200 graphics card.

As always, it’s important to note that just because I was able to get my GeForce FX 5900 XT sample stable and artifact-free at 460/880 doesn’t mean that every 5900 XT will be capable of those speeds. Some cards may clock higher, and some may not overclock at all.
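Plugging the overclocked speeds into the same peak-rate formulas used earlier shows what that headroom is worth, on paper at least:

```python
# Theoretical peaks at the 460/880MHz overclock versus stock 400/700MHz.

def peaks(core_mhz, mem_mhz):
    pixels = core_mhz * 4           # 4 pixel pipelines
    texels = core_mhz * 4 * 2       # 2 texture units per pipeline
    gbs = mem_mhz * 1e6 * 32 / 1e9  # 256-bit bus = 32 bytes per transfer
    return pixels, texels, gbs

stock = peaks(400, 700)  # (1600 Mpixels/s, 3200 Mtexels/s, 22.4 GB/s)
oc    = peaks(460, 880)  # (1840 Mpixels/s, 3680 Mtexels/s, 28.16 GB/s)

for unit, s, o in zip(("Mpixels/s", "Mtexels/s", "GB/s"), stock, oc):
    print(f"{unit}: {s:g} -> {o:g} (+{(o / s - 1) * 100:.0f}%)")
```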

Our overclocked 5900 XT gets a nice performance boost from its higher core and memory clock speeds, especially with antialiasing and aniso enabled.

 

Conclusions
If you’ve been following along, you’ve seen the GeForce FX 5900 XT simply dominate its $200 competition. Though the card is a little behind the Radeon 9600 XT in ShaderMark 2.0 and the “rthdribl” high dynamic range lighting demo, the 5900 XT makes up for it in nearly every other test. The fact that the 5900 XT is occasionally 50% faster than the 9600 XT is a little shocking, but it’s certainly good news for gamers looking for great performance at an affordable price.

As good as the 5900 XT is, I feel for NVIDIA’s partners who are also trying to sell 5700 Ultra boards. 5700 Ultras are retailing for just under $190 online, which wasn’t a bad deal until the 5900 XT came along at roughly the same price with much better performance and a copy of Call of Duty in the box. eVGA tells me that its 5900 XT-based GeForce FX 5900 SE is ready to ship, leaving no real reason to go with a 5700 Ultra unless those cards’ prices fall dramatically.

The GeForce FX 5900 XT is no doubt a very fast, very affordable graphics option for enthusiasts and gamers on a budget, and right now I’d recommend it over the Radeon 9600 XT. However, I have some concerns about NVIDIA’s lack of driver support for floating point texture formats, which could become a more important issue as more DirectX 9 titles come to market. Adding support for floating point texture formats apparently hasn’t been a priority for NVIDIA’s driver team, but it should be. Perhaps NVIDIA could take some time away from optimizing for 3DMark03 and dedicate more software engineers to floating point texture support.

Optimization jabs aside, NVIDIA has put together a sweet package with its GeForce FX 5900 XT and the Call of Duty bundle. Don’t let eVGA’s GeForce FX 5900 SE name fool you, either; this card is packing as much goodness as everyone else’s 5900 XT cards.

Comments closed
      • Anonymous
      • 16 years ago

      So you must be really pissed at Nvidia then too as their core has been unchanged pretty much since the geforce4 line. They bothed shared the same type of memory controller and memory interface. The 5900 being the exception……..

      • Anonymous
      • 16 years ago

      For $200+ the 9700pro is still the best buy. And it plays DX9 games without issue. Can’t say that for the FX cards………..

        • Anonymous
        • 16 years ago

        Without issues? Which issues do you mean? TROD comes to mind in the “issues” category.

        I could care less who is king of the graphics industry but perhaps some NON fanboys can clarify something for me.

        What’s the deal behind the “smoke” and “mirrors” comments? I realize that Nvidia had to do some driver tweaking to get their FX line up to snuff but so what? Diss himself said there is no real visible difference between the cards, other than the ATI card having a little better AA (or was it AF? Or both..). Is there a unspoken law somewhere that says cards cant use any form of “optimizations”? I don’t see the point to all the ruckus.

        Please just save flaming comments, these are honest questions I have. You guys are pretty predictable.

    • Anonymous
    • 16 years ago

    You can get a 9800 non-pro non-SE for around 200…and if you use ATi’s trade in program you can get a 9800 Pro for 250…either of which imho are better options than the nvidia cards, at least until nvidia gets their act together. And im not entirely sure, but they both may still come with the HL2 offer if you buy them BBATI.

    • us
    • 16 years ago

    I won’t buy Nvidia until they make real DX9 card

    • Anonymous
    • 16 years ago

    LOL! You like how TR changed their graphs, instead of using bar graphs that would show just how much of a Gigantic lead Nvidia has they use the other ones.

    Is it really so hard to give a unbiased review?

      • Dissonance
      • 16 years ago

      Um, what? We changed our graphs to display performance across multiple resolutions more clearly. We’ve been doing it for a while now, too.

      When we’re only dealing with a single resolution, like 3DMark03, the bar graphs are there.

      Is it really so hard to read a line graph?

        • Anonymous
        • 16 years ago

        For him it is Dissonance. Blame it on his cheap medication…. 😛

    • Anonymous
    • 16 years ago

    HOLY SH*T SON! The nvidia card Smoked the ATI card. I didn’t think Nvidia would have been able to come back other than maybe during the next release of cards.

    I remember TR would say the ATI card would win by a “Huge” amount when there were only a handful of fps difference.

    Nice to see you guys decided to lay off those comments for ATI’s sake.

    Ah but the real question remains, has the image quality been drastically decreased in order to give ATI the old one two knockout?

      • hmmm
      • 16 years ago

      No, there probably aren’t any more cheats than usual for NVIDIA. They’re just selling a $400 card for $200 and confusing ATI’s branding nomenclature.

    • Anonymous
    • 16 years ago

    A correction, page 1:
    “The NV35 graphics chip renders four conventional (color + Z) pixels per clock, but is capable of performing 8 operations per clock for Z pixels, textures, and stencil and shader ops.”

    Nope. The sentence should read
    “<…> but is capable of performing 8 operations per clock for Z and stencil pixels and 8 texture lookups per clock.”

    Less confusing IMO and factually correct to boot 😉

    Note that 8 texture lookups per clock are an automatic capability of all 4×2 architectures, it isn’t unique to the NV30/NV35 design. Also, doing multiple “shader ops” per clock, per pipe, is just as misleading. ATI does that, too.

    Recommended reading on the topic of shader pipe layout:
    http://www.beyond3d.com/forum/viewtopic.php?t=8005

    -zeckensack

      • Pete
      • 16 years ago

      Diss, I’m curious why zeck’s info hasn’t been incorporated into this review? I’m also unsure how the 5900 is capable of two shader ops per clock. I knew about the Z and stencil ops, but two shader ops? I doubt that happens in many cases.

    • Anonymous
    • 16 years ago

    http://www.anandtech.com/video/showdoc.html?i=1896&p=38

      • Anonymous
      • 16 years ago

      Now look at IQ issues with Halo and MaxPayne2. All companies have issues. Nvidia just more for the past year or so regarding IQ. And i wouldnt say “Anandtech” is a unbiased source. It’s almost as bad as Tom’s Hardware……….

        • Anonymous
        • 16 years ago

        Really what part is biased?

          • Anonymous
          • 16 years ago

          All one has to do is look at all the current and past reviews of R3xx based chips vs. NV3xx based chips at Anandtech. Their results compared to almost every other review hardware site contridicted their results in favour of NV3xx chips. They also NEVER mention Nvidia’s poor shader performance in any of their reviews. And the possible effects this will have on future dx9 games.

          And then, when Nvidia was fudging their drivers in the IQ realm to get a little extra speed and this fact was proven at many sites including this one,(which in my opinion is one of the better review unbiased sites)
          Anandtech defended Nvidia to the tooth and nail. Even though we all knew they were cheating. The 51.75 drivers being the Coup de grace.

          And one more issue: Even before the 5950 was a mention on ANY OTHER website, Anandtech just happened to get a Beta Version of the card before anyone else. Including the new drivers. No other site got one. All the other sites criticisied Nvidia at one time or another about poor FX performance and “questionable” driver optimization practices and got snubbed. Anandtech had nothing but good things to say about the card.

          Just ask HardOcp about saying critical things about Nvidia and their products. And also ask TheInquirier.net about that too. Nvidia STILL won’t talk to them. Even TechReport, GamersDepot.com, Extremetech.com and Beyond3d. com.
          When they gave a honest critical assement of Nvidia FX cards and it’s lack luster performance in certain games vs. ATI cards and cheating issues notwithstanding, Nvidia stopped sending those sites “complamentary cards to test. .

          That’s what i call BIASED. Kiss ass and get those free products. Just make sure you give a “good’ review. Thank God TechReport is not like that……………….

          • PLASTIC SURGEON
          • 16 years ago

          r[<"Unfortunately, the 5700 Ultra stumbled in a number of DirectX 9-class games and synthetic benchmarks, casting a shadow of doubt about how the card might handle next-generation DirectX titles."<]r Even TechReport states the 5700 weakness. A honest perception. No ass kissing. Just honest reviews. Something Nvidia has loathed since the release of the FX series of cards. Funny how Anandtech's review of the 5700 NEVER EVEN ONCE MENTIONS the 5700's issue with future DX9 games and current ones? Now ask Damage if he had to buy the 5700Ultra himself to test or did Nvidia being the "graceful" people that they are, donated one for this site to test? Because you damn well know Anandtech got one donated. TechReport. Never sell out.

            • Anonymous
            • 16 years ago

            Well, TRAOD and Halo are DX9 games. They are out. And Nvidia had to go to great lengths and different driver sets just to get those games to run decent. Anandtech never has anything critical to say about Nvidia products. Even when they got busted for cheating. They just turned a blind eye for a good 4 months when this happened. That’s not a safe approach. That’s a weasle approach not to piss off one of your major advertisers……. 😛

            • us
            • 16 years ago

            But he took every possible chances to predict an excellent FX5800 in his 9700 review

        • Logan[TeamX]
        • 16 years ago

        Max Payne 2 is a slick game. Halo does have its issues.

        If Nvidia performed as it should, out of the box, we wouldn’t need all this underclocking and renaming monkey-business. End of story.

          • Anonymous
          • 16 years ago

          Nvidia needs to use all these optimizations just to get these new games to run.

      • Anonymous
      • 16 years ago

      And when you go to that site. You also see that TRAOD has better IQ on ATI cards. As well as overall Anti-Alaising. Gama corrected rotated grid AA is still superior to Nvidia 3 year old method of order sampling.

      “Tests here were too tricky to get close enough to the same frame on both cards for a difference image. We can see, however, that the ground on the ATI card has more lighting effects. We can see the same thing in the frame with anisotropic filtering and antialiasing turned on.”

      Both cards have their strengths and weaknesses. However, Nvidia FX cards just have too many weaknesses for future shader intensive games. And unfortunately for Nvidia and FX owners, those are the future of gaming for now.
      I am not spending $200+ on a card that “might” play dx9 games properly. Or cross my fingers and hope that all game developers code using Nvidia’s labour intensive Cg graphic core just to get to almost the same level as ATI cards with those very games..

      • lethal
      • 16 years ago

      non biased?
      on their midrange review the 9600XT wins 12 of 26 and the GeForce FX 5700 wins 13 of 26 (I didn`t count one test because the difference is lower than 0.5 FPS, 34.3 vs 34.6) and yet they claim “NVIDIA has flipped the tables on ATI in the midrange segment and takes the performance crown with a late round TKO.” note that tomb raider: Angel of Darkness is a DX9 game and they didn

        • Anonymous
        • 16 years ago

        The reason why he didn’t cover it is because it was bad on both cards. He explains why he didn’t add it. Tomb Raider is by no means a AAA title Like Doom III/Half-Life II 😛 I don’t even know why that it is even included for such a boring game.

        • Anonymous
        • 16 years ago

        Anandtech is a pro nvidia site. Their the major advertiser. When Anandtech runs a comparison review they almost never test DX9 games that are out. And they have NEVERED used MaxPayne2 as a test. Any test or game that uses heavy vertex and pixel shader performance is not used by Anandtech, just Quake3 engine type games. Non-biased my ass.

    • Anonymous
    • 16 years ago

    These debates will never end. Nvidia fanboys will support Nvidia to their graves even though the truth is out on subpar performance for PS2.0 and Vertex snader support for Next Generation games.
    If they cheat like they have done, Nvidia fanboys will make up excuses why Nvidia did it. Who needs a PR department when you have blind sheep supporting a flawed product? If IQ is less and lower floating point precision is used, Nvidia fanboys will argue less IQ is better. Slower Frame rates? Same response. And the classic.

    “They still have bad drivers”

    Too bad every review site that runs ATI cards say they are solid. I have no issues with my 9600pro.
    All the lies about the 5800Ultra?

    “Just delays. No issues.”

    Lies about how many piplines it has?

    “It still runs like a 8×1 pipeline card. I swear. Nvidia even said so. I will believe”

    What about the 5900 and the 5950?

    “It still runs like a 8×1 pipeline card. Nvidia said so. I will believe”
    What about all those shader tests and games showing Nvidia hardware having major problems with shader intensive titels and DX9 games?

    “3dmarks2003, Aquamarks, ShaderMarks, Chemeleon Marks, TRAOD, HL2, and Halo, are the total fault of the game deveopers for those games running poorly on Nvidia based hardware even though those games have the (Way it’s meant to be played) on the game boxes. Nvidia is not at fault. They are the golden standard of drivers and video cards. Nvidia said so.”

      • PLASTIC SURGEON
      • 16 years ago

      Too funny. But sadly true. lol

    • Pete
    • 16 years ago

    Great article, Diss. The 5900XT stacks up nicely at $200 retail, and totally invalidates the 5700U. I was surprised to see the 9600XT competing in some AA+AF benches, though–particularly in OGL games!

    Just one correction: regular 5900’s are 400/850, not 450/850 (p. 13). Only the 256MB 5900U is normally clocked 50MHz higher for the core.

    • Anonymous
    • 16 years ago

    FX cards are quite fantastic if you want to run games that are DX7, and DX8 driven. Quite amazing. Holding their own against any R3xx series of cards. But who the hell buys a new video card to play last year games and even further? If you do, you need mental help….

      • Krogoth
      • 16 years ago

      It’s likely that the next generation of DX9 apps will not run super fast at high detail and resolution on R3xx. At least R3xx are faster then NV3x in this aspect. Besides, game developers are still going to develop games using Directx 8. Since DirectX 9 class hardware still hasn’t fully saturated the mainstream market yet and wouldn’t likely happen until Q4 04. So there is some life that FX5900XT has which should be good until the end of next year.

        • Anonymous
        • 16 years ago

        If HL2 is any indication, FX cards our not on my wish list this Christmas

    • Anonymous
    • 16 years ago

    You can find radeon9800 non-pro for almost as low as $200 on ebay or holiday store sales. Personally I bought a samsung mem non-pro for $220 on ebay just recently, and that card flashes to pro and overclocks like a champ on top of that (well beyond pre-overdrive 9800XT speeds). Ofcourse on ebay you get ripped for S&H, so keep that in mind.

    I can see how one can make the fx5900 (XT or ebay non-ultra) look like a solid buy, and I agree that it is; fx5900, radeon9800non-pro and radeon9700pro are all awesome bang for the buck cards. However, with all the bad publicity nVidia has been getting for a year an a half now, you would do good for the industry by picking the underdog. Remember – you can be as loud on the forums as you want, but it is your wallet that you vote with.

    • Anonymous
    • 16 years ago

    Don’t get me wrong, this was a well put together article, but it seems hardly fair to put GeForce FX 5900 cards up against a 9600 XT when it is clear that these cards are aimed at different price points and market segments. If you’re going to include 5900 series cards, then it’s only fair to also include the 9800 XT in your benchmarks.

      • DreadCthulhu
      • 16 years ago

      AG, the GeforceFX 5900 XT IS aimed at the same market and price point as the Radeon 9600 XT – looking at various sites, the GeforceFX card in the article cost maybe $10 more than the Radeon one. Both are around $170-$190. The Radeon 9800 XT’s are all around $450; they compete against the GeforceFX 5950 in that price segment. Seems like a perfectly fair comparison to me.

    • atidriverssuck
    • 16 years ago

    sure, but does it have HeadCasting?

    • Anonymous
    • 16 years ago

    I’m thinking about returning my recently purchased Leadtek 5700 in favor of this card. However, I’m wondering if it’s true that Leadtek uses a higher quality RAMDAC and output filter(s) to achieve superior visual quality, sharpness, etc?? I don’t want to sacrifice ergonomics for 3D processing speed. What do you all think?

      • atidriverssuck
      • 16 years ago

      I seriously doubt Leadtek do anything better than its competitors. I certainly haven’t noticed a difference with their quality compared to any other card.

    • sativa
    • 16 years ago

    am i the only one amazed by the fanboy stuff (for mainly ati, but some for nvidia).

    its a card that puts pictures on your screen, not a religion.

    • Evan_Frame
    • 16 years ago

    #88 ionpro

    Your colors are revealed.

      • ionpro
      • 16 years ago

      What you said is, (and I quote): “The dude behind the counter tells him what to buy.” My point was that ‘the dude behind the counter’, as you so eloquently put it, has no more clue about video cards then the person buying. If you wish to quibble with your own words, be my guest, but that is what you said.

      I maintain that the “guy behind the counter” is only going to pick up a latent knowledge of how relative video cards perform, and there will be mistaken sales to nVidia because of this naming convention. I said it before, and I’ll say it again: Why play naming games when your product can stand on its own merits, unless you feel your card is inferior somehow?

    • Anonymous
    • 16 years ago

    #84 and 80 So does name calling and depreciating comments make the people who bash others feel any better or any more righteous? I believe those individuals were making an honest comment about the lack of nvidia’s products in canada and you made it a personal attack.

    Oh well, i hope you feel better anyways

    • AENIMA
    • 16 years ago

    ATI fanbois, stop being idiots 😉 just because nVidia has a superior product in this price range doesnt mean you have to spout off a total bunch of crap! “oooh, its hard to find them in canada” wow… that makes it such a bad product… SHUT UP! GO ONLINE! BUY FROM NEWEGG LIKE THE REST OF THE PLANET! “ooh ooh, its ACTUALLY a 5900!!!!! but its clocked lower!!!!!” and…. IT COSTS $100 LESS! WOW! THEY MIGHT EVEN BE TRYING TO SELL SOMETHING! “whhhiinee they needed to put a $500 ATI card in the review to ‘balance’ it out” no no no they didnt! have they ever added something that has NOTHING to do with the product they are reviewing? nope. thats about as smart as adding a voodoo5 in a processor review. seriously. I own a radeon 9700 and I bet it could beat this card with a little OCing but so what? I allways thought ATI’s best deals were right in between where nVidia hasn’t currently gotten to, like $230-300. the 9700/9800non-pro are both KILLER deals. even the 9700/9800PROs are awesome deals. so quit bitching!

      • Spotpuff
      • 16 years ago

      Actually, Newegg doesn’t ship to Canada. Most online US computer retailers don’t. 🙁

        • Anonymous
        • 16 years ago

        Actually newegg is one of the very FEW that do not ship to canada, stop crying and goto canadacomputers.com/ or pccanada.com

          • Spotpuff
          • 16 years ago

          I know about cc and pccanada. CC is good for prices and that’s about it; their staff aren’t particularly knowledgable when it comes to specific components. They also won’t order things for you if they don’t carry it, e.g. biostar IDEQ boxes.

          PC Canada’s prices aren’t good and neither is their service.

          And saying we have computer stores up in Canada doesn’t invalidate my statement about newegg not shipping to Canada, nor about other retailers in the US not shipping to Canada.

          Me: A lot of US retailers don’t ship to Canada
          You: GO TO STORES IN CANADA

          Nice argument

      • Anonymous
      • 16 years ago

      As long as you’re keeping yourself amused… w00T!

      Keep it up!!

      readysetgo… CHEERS@@@!!!

      • Greylandra
      • 16 years ago

      Um… Just for the record I Have personally purchased 3 CGs all of them were made by Nvidia I gaurentee my next CG will be from Nvidia’s next gen cards (or a 5900XT if I have the chance to get my hands on one ;). That being said if a graphic card, or any other piece of hardware for that matter, is announced for realease then sent to review companies for benches/opinions all with a price tag half of what the public expects is too good to be true. A large number of these are sold in such miniscule amounts and or discontinued before the public can take advantage the price performance these products promise. It’s not just Canada that gets effected by this bait and switch technique; how many Americans went hunting for an ATI 9500 just to be sold a 9600. As for subjecting CGs to benchmark reviews based on price is probably the most fair way to go. However if Nvidia wants to make ,say, an over clocked 5950 ultra with all the bells and wistles, send one over to techreport.com and muse aloud at the fact that it’ll *probably* sell for $19.95 thereby beating all of thier compition, price perfomance wise, by an astonomical factor and then halt production as soon as they get the free PR they were looking for. No. There needs to be some other checks and balances.

      • Anonymous
      • 16 years ago

      superior product for what? DX7 and DX8 games? lmao. Only problem is when DX9 games come along FX cards crap out…..

        • Anonymous
        • 16 years ago

        What DirectX 9 game that is out now that Nvidia is crapping out? Max Payne 2? That’s a good AAA title that uses DirectX 9. Go read the anandtech site of the new Nvidia drivers. They fixed alot of their problems. Plus their IQ is no different from ATI. Great work on Nvidias part to fix their problems.

          • Anonymous
          • 16 years ago

          Ati cards run it quite faster for one. And HL2, TRAOD and Halo are another. And MaxPayn2 is not a DX9 game. Just the trailer portions of the game is. The only game to use extensive use of PS2.0 at this time is HL2. And it has been proven the FX cards have major problems running applications or games that use those features in full precision. Not too mention HDL and DX9 vertex shaders. Halo and TRAOD still does not have the same level of IQ as ATI R3xx based cards.
          I agree that Nvidia has done a great job on it’s drivers, however even so, games still run slower on them. And do you actually think it will get better for Nvidia based FX hardware when games come out running more DX9 features? HL2 is a prime example. Look what they had to do to even get it to run properly on their cards? Lower FP. Different shader operations(PS1.1, PS1.4) They couldn’t use the required dx9 spec of PS2.0 because when they did the performance was brutal. That’s what is instore for FX owners for DX9 games.

            • PLASTIC SURGEON
            • 16 years ago

            Yup. They give the skinny on it at Beyond3d.com.

            http://www.beyond3d.com/interviews/maxpayne2/

            And even a complex shader DX8.1 game runs quite slower on FX based cards.

            http://www.techtv.com/freshgear/products/jump/0,23009,3576350,00.html
            http://reviews.gaminghorizon.com/media/0,100,120,1,1,840.html

            • Anonymous
            • 16 years ago

            So I can sue Remedy for false advertisement? 😛

            • PLASTIC SURGEON
            • 16 years ago

            And after you sue them or try too, then you can sue UBISOFT for Splinter Cell also. LOL. It tells you to load up DX9 also. And it’s a DX8.1 game also.

    • Anonymous
    • 16 years ago

    *sits here patiently* I’m waiting for Loki.

    • Anonymous
    • 16 years ago

    #79/80/81

    So there’a a total of 5 stores in all of canada that sell the 5700, and that’s online? In vancouver vitrually all the local computers stores have the 9600xt on their websites and even in the bigger chain computer stores have them on the shelf. I cant see the same for the 5700.

    5700 and 5900se/xt mostly paper launches for PR purposes and nothing for normal people.

    To the person who suggested the price network, Canada is a big country and 5 stores dont really cut it for people who dont want to pay for the added cost of shipping.

      • Anonymous
      • 16 years ago

      Damn what WHINERS Canadians are! If it isn’t at the local 7-11, AND that 7-11 better not be more than 2 blocks away, you’re not interested? Won’t pay for shipping? Whaaaaaaaaa!!!

      Sheesh, try to help someone out, get kicked in the teeth. No wonder…

      • Generic Ninja
      • 16 years ago

      Of course the smart thing to do is to buy online out of province and then you save the provincial sales tax. That savings easily covers the shipping unless you are spending less than $100 or are shipping something like a case.

      We actually aren’t too badly off here in Canada if you shop smart. Of course the best thing to do is to shop at New Egg in the US and nip over across the border to pick up your mechandise from a rented post office box. You smuggle it back to avoid duty and there is your value for money. Course I guess you hafta live near the border for that to work. Meh.

    • Anonymous
    • 16 years ago

    I think people should recognize that these 5900 cards are a sham unless they are out in MASS QUANTITIES. Here in canada, finding a 5700 is damn impossible whereas the 9600XT/pros are EVERYWHERE. For me until I see them in the stores the 9600XT kicks ass!

      • Greylandra
      • 16 years ago

      I agree fully. There were four 5700’s shipped to Canada. All of them got lost in the mail… Untill I see a 5900XT on a store shelf I’ll keep it in perspective as follows. There are currently 25 5900XT’s world wide and they are currently in the hands of Harware review companies. One (1) of them may make it to Canada and then sold on e-bay for twice it’s price because it is such a rare find!

        • Anonymous
        • 16 years ago

        So true. Paper launch….lmao..well almost..at least for the consumers….

      • Anonymous
      • 16 years ago

      *eyeroll* C’mon people, you’re not even trying…

      5700 FX
      http://www.pricenetwork.ca/search.php?q=5700+fx&c=all&i=&location=Canada&expert=1

      5900 FX
      http://www.pricenetwork.ca/search.php?q=5900+fx&c=all&i=&location=Canada&expert=1&go.x=6&go.y=6

      That’s using a Canadian price portal to find Canadian dollar prices at Canadian stores. For a tech site, there’s an amazing number of visitors here who just don’t use the technology that’s available…

      …or maybe this is a condition exclusive to Canadians. Put down the beer and pick up the mouse.

        • PLASTIC SURGEON
        • 16 years ago

        Maybe our beer is more appealing then Nvidia FX cards. lololol. At least more appealing then your water downed american beer.. 😉 Anyways, most people probably will pass on this marketing sham and pick up that cold Sleeman’s Dark Ale instead….mmmmmmmmmmmmmm…Dark Ale…….

          • Anonymous
          • 16 years ago

          Please change your name to “please don’t take anything I say seriously, I run around proclaiming ATI has been chosen by the Pope himself”

          Seriously man your all over this conversation with half assed pro ATI comments.

          You are no better than a full fledged nvidiot, people like you make us look bad, lets try to be better than the nvidia fans if that’s not too hard.

          If we are going to praise ATI then lets use the more evolved part of our brain that we used to choose ATI in the first place.

          I think you have had a 12 pack too much of that Canadian beer you enjoy.

            • Anonymous
            • 16 years ago

            #82 I guess then the NV FX5800 were the best cards ever released in mass quantities.. LoL

            • Anonymous
            • 16 years ago

            OH please stop the Nvidia support baby crying you suck face

            • Anonymous
            • 16 years ago

            Great come back… Seriously lay off the beer.

            • Anonymous
            • 16 years ago

            Maybe you need to drink some more beer instead of your baby bottle, Nvidia baby. Anonymous summed it up quite well. “SUCK FACE” :p LOL

            • Anonymous
            • 16 years ago

            Nice way to prove a point, resort to name calling.

            • Anonymous
            • 16 years ago

            You still crying like a 3 year old girl? Please stop. You’re making us all physically ill.

            • Anonymous
            • 16 years ago

            Crying? No. Although you are the one that has proven their age.

            • Anonymous
            • 16 years ago

            The only beer worth drinking is German

      • Dissonance
      • 16 years ago

      For the record, I was at a local shop in Vancouver, BC, the other day (ATIC), and they had loads of 5700 Ultras in stock and available for order.

      • Anonymous
      • 16 years ago

      Alright, what did I say? I said…

        • Anonymous
        • 16 years ago

        Uhh..? I can go to almost any computer store in Canada and pick up a 5700. The card came out not too long ago, the 9600 came out a lot earlier, so NO F*CKING DUH ITS MORE COMMON. What do you go to a store count every last card they have and make sure they have 20 of the specific card before you would even consider buying, then sniff around every store in your area? God your so f*cking anal.

          • Anonymous
          • 16 years ago

          It’s so good to see people can express opinions here without the baby name calling attitude.

    • Anonymous
    • 16 years ago

    Good grief. Just try and look for a 9700pro. It will ensure you can play dx9 titles trouble free without any of the smoke and mirrors Nvidia has been cooking up with it’s drivers and the FX cards…….

      • Anonymous
      • 16 years ago

      Except for when the ATI drivers don’t work with anything and constantly crash.

        • Anonymous
        • 16 years ago

        What a load of crap that statement is. Out of the 60 different games I’ve tried, only 2 (budgetware) games proved to be problematic on my Radeon 9800 Pro.

          • Anonymous
          • 16 years ago

          That’s YOU who didn’t have problems. I have problems with call of duty crashing every hour oh so and Final Fantasy XI texture problems on my 9700 Pro. I’m going to upgrade to Intel just to use ATI videocards. This is my last ATI purchase. I’m going with Nvidia because their drivers work and don’t cause problems. I’m very happy with my Nforce 2/AMD setup and plan to buy AMD in the future. Granted ATI is much better than what they were back in the 90’s when their driver update fixed one thing and broke another.

            • Anonymous
            • 16 years ago

            Can someone please provide a link for this person so he can see that there are people using both Nvidia and ATI graphic cards that are having unique, unusual problems?

            • Anonymous
            • 16 years ago

            What link are you talking about?

            • Anonymous
            • 16 years ago

            What link am I talking about?

            q[<"Can someone please provide a link for this person so he can see that there are people using both Nvidia and ATI graphic cards that are having unique, unusual problems?"<]q It looks like I was unclear in my wording; I apologise. I meant a link to a forum thread (anywhere) or a website that will show the individual, whose post I was replying to, that both ATI and Nvidia have their unique, unusual problems. readysetgo... CHEERS@@@!!!

            • Anonymous
            • 16 years ago

            This is the first time EVER that I have heard of this problem, even in all my years in local tech help and running a computer repair shop.

            Does that mean it doesn’t exist? No. Nvidia is far from perfect at this time so strange happenings shouldnt be surprising. That place you linked to seems to have just jumped on a oportunity to attack Nvidia.

            What scares me more is that Indigo jumped into the mix and didn’t even bother posting resonably. It is a known fact that running either card you normally need to uninstall the old drivers first. We all know from being on tech help forums people run into this exact problem for either brand of card. No duh its not reasonable to expect to have to do that but quit bitching its life.

            I really hope that indigo isnt the same indigo on TR, normally I like Indigo’s posts.

            • PLASTIC SURGEON
            • 16 years ago

            Problems like Det 45.23 drivers? Problems like 51.75 drivers? Then again the whole 40 series and early 50 series inwhich most of them did not even make it to certified status because of all the IQ and other issues they had. (cheating maybe is one reason 😉 )

            The only decent DET drivers released have been the 52.14, 52.16 and the verdict is still out on their new “forceware” drivers. I got rid of my 4600ti not too long ago in my secondary PC and grabbed a 9500pro. It’s been running quite fine with Cat drivers.

            I have had PLENTY of gaming issues with my 4600ti and New Det drivers. Too many too count actually. So please, spare us that old, old crap about how Nvidia drivers are still the…

            • Anonymous
            • 16 years ago

            But, I never had ANY problems with Nvidia cards. I used all the GeForce 1, 2, 3, and 4 no problems. Enter the 9700 Pro and my locksup begin. I hate this damn card. I have to use beta drivers to get Call of Duty to work. I should have waited till Half-Life II went gold before I purchased this card. Oh look Final Fantasy XI has those wonderful textures problems! UGH should have thought twice before switching from Nvidia.

            • Anonymous
            • 16 years ago

            You ever think it’s your PC setup? I am running a Sapphire 9800pro on my Intel board and not one issue. Even with Call of Duty. 1600×1200 with fsaa+aa on sampled…great frame rates. Using the Cat 3.9 drivers.
            And love those ugly half textures i got on my 4600ti with Det drivers. Not to mention TRAOD. Oh and also MaxPayne2. The reason i got rid of it. It can’t play newer games. Stick to a card that won’t play dx9 games..

            • Anonymous
            • 16 years ago

            Yeah it probably is because mine is an AMD setup (Asus A7N8X Delux). I have to get an Intel motherboard/CPU to get 100% compatibility. The people on the Rage3d forum who have virtually no problems are the ones who own Intel system. I’m not going intel anytime soon with how wonderfully hot the Prescott is. I am going Athlon 64 FX next year. That means I’ll be going back to Nvidia.

            • Anonymous
            • 16 years ago

            I am running the same board as you! My 9800pro runs solid! Matter of fact, i just reverted to the Catalyst drivers 3.9. No problems here.
            The only issue i have had since i got the 9800pro 2 months ago was with StarTrek Elite Forces 2. And that issue is fixed now.
            At this time i won’t purchase a Nvidia based FX card until they correct it’s hardware issues. Not to mention their software issues at lowering IQ.

        • PLASTIC SURGEON
        • 16 years ago

        Typical response from a Nvidiot who does not even OWN a ATI card. How old are you? 12? My 9800PRO runs every game perfectly. No issues.

        http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/002.htm

        I love that quote from GamersDepot. It sums up all you Nvidia fanboys who refuse to admit the truth about current FX cards. Have fun running DX9 games when they come out. Got to love those lower precision and altered shader specs…..i will pass. I want my games to be played the way they were ment to be played. Not with lower IQ.

      • Anonymous
      • 16 years ago

      While his statement is a very wide one it holds *some* truths. My 9800 has had a few problems with games. Especially BF I get very strange visual problems while running it. On Tomb raider it wont even run but I switched back to the older dets. On WC3 my units will disappear sometimes, hold the retarded comments they aren’t dieing they flicker. Belonging to a few message boards I have accumulated some very tech savvy friends. I have heard identical problems from some of the ATI adopters. Locally I mentioned it in a conversation with a computer tech and he also has run into similar problems as mentioned above.

      I don’t know what is causing it, perhaps faulty hardware perhaps incompatible system setups ect, ect. While most of the problems are pretty minor I think I could live with them rather than not knowing what Nvidia is doing behind the scenes to get those scores. I say this until one of my favorite games doesn’t want to play on it lol.

    • Spotpuff
    • 16 years ago

    Why the HELL are people complaining about the fact that this is a “marked down” 5900 Ultra?

    If it sells for $200, it sells for $200 and you compare it to other graphics cards in that range; I don’t understand how people can complain considering it makes you life as a consumer that much easier.

    If I can get better performance at a lower price, I will do it. Hence my 1700+ OC’d to 2400+ speed.

      • Anonymous
      • 16 years ago

      If we support this behavior now because it suits our selfish desires, how much say should we have when a product is “marked UP” and labeled in a manner altogether misleading?

      You may wish to live without employing your capacity to comtemplate and critically evaluate a situation, but there is no reason why others should have to do so as well.

      readysetgo… CHEERS@@@!!!

        • Spotpuff
        • 16 years ago

        Um, critically evaluate the situation?

        OK, here we go.

        It costs $200 and it outperforms every other card in its price range.

        WOW that was hard.

        You seem to be unable to pull your head from your fanboy ass; I love ATI too but if Nvidia can beat them at that price point, then no amount of fanboyism is going to help.

        Where were you to complain when the 9500 pro was stomping all over Nvidia’s midrange line?

        When cards are marked UP, then I will do my homework and read sites like TR et al. and make a decision. It’s really not that hard.

          • Anonymous
          • 16 years ago

          Well then where is the problem? You see good in this development, while I see…

    • DrDillyBar
    • 16 years ago

    What’s really sad is that nVidia using “XT” in their naming convention doesn’t really suprise me. I almost expected this to happen when the ATI XT’s hit the market.

    • Evan_Frame
    • 16 years ago

    I love all the bleeding hearts here for Joe Consumer and how he is totally screwed by the naming of these cards. Joe Consumer goes to Future Shop/Best Buy and asks for a good card for himself or his kid that costs between 100 and 200 dollars. The dude behind the counter tells him what to buy.

    Now enthusiasts who go to the extreme of reading tech sites and commenting on articles should be able to distinguish themselves from Joe Consumer by reading comparison reviews and looking up a price.

    5800 9200 XT Pro Ulta etc are merely alpha-numeric designations to separate physically different products. Joe consumer doesn’t care what its called and Joe Enthusiast shouldn’t care. Find a product, find its specs/reviews check price, buy it.

      • Anonymous
      • 16 years ago

      You are making obvious valid point. The point that some of the people…

        • hmmm
        • 16 years ago

        Is readysetgo your screenname or something? There are a bunch of posts that end that way. If so, why don’t you just create an account instead of being a gerbil?

      • ionpro
      • 16 years ago

      What makes you think the guy at Best Buy knows any more than Joe Consumer? Those guys get paid $7-$10 per hour. They’re paid for their ability to sell, not their knowledge of computer hardware. They’ll think the exact same thing when they see a “5900 XT” as Joe Consumer would — namely
      900 > 800
      “XT” > “”
      Therefore, the 5900 XT MUST be better then the 9800 XT, right? But when they find out that it’s not, they’ll get confused. Is XT better then pro? Ultra better then SE? FX XT SE or SE XT Pro? At least if someone told you they had “the new XT card”, you could tell which manufacturer made it. Why confuse the market? Why not let your cards stand on their own merit?

      Unless you feel your cards are inferior somehow.

    • Zenith
    • 16 years ago

    I think we should boycott nvidia until they stop releasing new cards made off the same damn core, with even gayer and more confusing names… Seriously, I just saw the title and hung my head and wondered WTF nVidia is doing.

      • Hellsbellboy
      • 16 years ago

      yeah, and ATI’s naming scheme isn’t any less confusing? Pro, Ultra, non-Pro, XT, Ultimate, SE. A 9000, 9100, and 9200 that are slower than or equal to the 8500; a 9500 Pro that’s faster than the 9600 Pro. hmmm

        • Anonymous
        • 16 years ago

        Oh yes, you are both right. If I were given a list of either ATI’s or Nvidia’s entire “current” product line, I would be unable to list the cards in order of performance. While neither party can be considered a champion of clarity, ATI generally appears to be less of a culprit, especially in light of the latest developments, i.e., the 5900 XT.

        With this latest naming scheme, Nvidia appears to be taking the game of counterintuitive card names to a new level of confusion and haziness.

        l[

    • ionpro
    • 16 years ago

    Maybe it’s just me, but I honestly couldn’t care less about how the card performs. I APPLAUD eVGA for renaming their card. Naming a budget chip after the top-performance chip of your competitor is really a desperate, last-ditch measure to garner a few more sales. It’s cheap, uncalled for, and it has caused me to stop recommending any nVidia product for my clients.

    What would you guys think if AMD chopped half the cache out of the XP 3200+ and renamed it the “XP 3200+ EE”, then sold it for slightly cheaper? This is exactly the same.

    I hope VIA likes my K8T880 purchases.

      • Anonymous
      • 16 years ago

      yeah, it’s HORRIBLE that nvidia releases a lower-clocked 5900 that’s faster than the Radeon 9600, for $200.. and the price will probably drop soon. How dare they.. what are they thinking? hmm, isn’t this like what they did with the Ti4200? hmm, XT, who’s to say what Nvidia means by XT except for nvidia??? who cares what the hell the card is called; if it has superior performance to the 9600xt and costs the same, then it could be called anything for all i care. ionpro, if that’s how you recommend cards or parts to your customers.. glad i’m not one of them.

        • ionpro
        • 16 years ago

        That’s right, I do recommend video cards (as well as myriad other components) in my work as a home computer consultant. Playing naming games is a sure way not to get a recommendation from me. Of course, you have every right to stay away from my business because of that policy — I always tell my customers that there are other parts out there, and I’ll install anything they’d like. But my midrange recommendation is still going to be the 9600 Pro for now.

        I’m sure that nVidia couldn’t care less about my paltry $5k or so in business each year. They make the majority of their money from OEMs. What will hurt them is card manufacturers switching over to ATI because independent consultants and enthusiasts don’t like the games they play with naming. $5k may not be much, but if 1000 of us do the same, then they may feel it a little. Whatever happened to distinguishing your own product, and not basing your success on mistaken identity? Was “XT” really the only pair of letters out there that conveyed the impression they wanted? I feel the same way about the Athlon 64 FX, btw; though I wouldn’t be recommending that in any case (the price is way too high for the performance benefit).

          • AENIMA
          • 16 years ago

          I don’t know why you care about them releasing a SUPERIOR PRODUCT for an EQUAL COST. And the 9600 Pro is worse than the 9600 XT; explain WHY you would recommend a product that isn’t as fast for a little less. It doesn’t make any sense.

            • ionpro
            • 16 years ago

            ATI Radeon 9600 Pro == $144
            ATI Radeon 9600 XT == $189

            95% of the performance at 75% of the price. That’s why I recommend the 9600 Pro over the 9600 XT. It’s the same reason why I still recommend the 9800 Pro over the 9800 XT for the ultra performance segment: 95% of the performance at 75% of the cost.

            For that matter, it’s why I recommend Athlon machines over Pentium 4s in the budget segment: an Athlon XP 2400+ > Celeron 2.6GHz at the same price, and a Duron 1.6GHz == Celeron 2.6GHz at half the price.
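
            To put numbers on that value argument: a minimal sketch in Python, using the two prices quoted above and treating the “95%” performance figure as ionpro’s rough estimate rather than a measured benchmark.

            # Performance per dollar for the two cards, using the quoted
            # prices; the relative-performance values are rough estimates,
            # not benchmark results.
            cards = {
                "Radeon 9600 Pro": {"price_usd": 144, "relative_perf": 0.95},
                "Radeon 9600 XT": {"price_usd": 189, "relative_perf": 1.00},
            }

            for name, c in cards.items():
                value = c["relative_perf"] / c["price_usd"]
                print(f"{name}: {value:.5f} perf per dollar")

            # Radeon 9600 Pro: 0.00660 perf per dollar
            # Radeon 9600 XT: 0.00529 perf per dollar
            # The Pro delivers roughly 25% more performance per dollar.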

            • hmmm
            • 16 years ago

            You don’t seem to understand his point. He isn’t mad that they released a fast card for a cheap price. He’s mad about the dirty tricks involved in the nomenclature. They are separate issues. Doing one thing right doesn’t excuse doing something else wrong, especially when it wasn’t an accident.

          • Dissonance
          • 16 years ago

          Did you stop recommending ATI products when they played the name game with the Radeon 9000/8500?

            • AENIMA
            • 16 years ago

            or the 9800 SE? lol… nice logic

            • ionpro
            • 16 years ago

            9800SE did use a naming convention set by nVidia (think 4800SE, 4400SE), but they used it […]

            • indeego
            • 16 years ago

            Or Nvidia’s TWINVIEW?! WTF I NEVER SAW MARY KATE AND ASHLEY OLSEN ON MY CARD

            • ionpro
            • 16 years ago

            indeego: well, then go to http://www.marykateandashley.com/ on your TwinView-enabled card.

            • Anonymous
            • 16 years ago

            response:

            …..*shudder*

            readysetgo… CHEERS@@@!!!

            • indeego
            • 16 years ago

            My browser shat its pants <.< Gah, the Flash.

            • ionpro
            • 16 years ago

            That is different, and here’s why: the 9000 and 8500 weren’t in active production at the same time, as far as I know. When ATI moved their cards to the 9xxx series, they named their budget offering 9000 because it was the lowest available name. If I had a problem with ATI renaming the 8500 core to 9000, even though it had lower performance, don’t you think I’d have the same issue with the FX 5200 vs. the Ti 4200? True, the FX 5200 is as real a DX9 part as nVidia seems to be able to make these days, but it still has worse performance than the Ti 4200, while having a higher number.

            • Logan[TeamX]
            • 16 years ago

            Ionpro, I already had this argument with Dissonance. Just walk away; it’s easier this way.

            • ionpro
            • 16 years ago

            Btw, Diss, I have absolutely no qualms with the review whatsoever. Tech Report is by far my favorite review site; the uniformly excellent quality and readability of the reviews continues to impress me (and I’ve made several donations because of it). Thanks to you (and Damage) for the wonderful site.

    • Anonymous
    • 16 years ago

    “Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the V49 patch, which includes a performance benchmark, was never intended for public release. The V49 benchmark apparently fails to correctly load Tomb Raider’s GeForce FX-optimized code path, so it’s more a reflection of how the GeForce FX cards perform with default DirectX 9 code than anything else.”

    Did this come from NVIDIA, Eidos, or Core?

      • lethal
      • 16 years ago

      Eidos Interactive said that.

    • Illissius
    • 16 years ago

    How typical. […]

    • Anonymous
    • 16 years ago

    I think the review should have included the 9500 Pro/9700 Pro, 5900U, and 9800 Pro/NP to give a better measurement of where the 5900 XT stands. Price-wise, I can pick up a 9800 NP at Best Buy or Circuit City for $199.

    • Rakhmaninov3
    • 16 years ago

    Looks like Nvidia finally caught up with ATi after all these MONTHS, hehe.

      • Anonymous
      • 16 years ago

      They still have not. DX9 games are still the main issue, as is shader performance, for ALL FX CARDS……

    • indeego
    • 16 years ago

    […]

      • dolemitecomputers
      • 16 years ago

      Well it does have a multiplayer portion. That extends it a lot.

      • dmitriylm
      • 16 years ago

      Length shouldn’t always dictate the worth of a game. I can pull out hundreds of crap games that will give you a good 20 hours of gameplay.. not that you’ll want to go through the whole 20 hours..

        • indeego
        • 16 years ago

        In this case I’d like a longer single-player option. 15-20 hours provides replayability, an epic (for an FPS) feel, and is worth the $50. I’m not complaining, the game looked beautiful, but I’m not a must-have gamer and will pick it up at $20, no more/no less. Same goes for Max Payne 2, which was just gorgeous <.<

          • Anonymous
          • 16 years ago

          I don’t know why people cry about Max Payne. I finished CoD a lot quicker.

            • derFunkenstein
            • 16 years ago

            and I LOVE Max Payne on top of it…well worth the $50 I spent

            • hmmm
            • 16 years ago

            CoD was much shorter than MP2 for me as well. For some reason I didn’t feel cheated out of my money by MP2, but I did by CoD.

          • Anonymous
          • 16 years ago

          No less? I’d pay less if I could. 😉

            • indeego
            • 16 years ago

            You’d have to wait for it. I don’t want to wait <.<

            • Anonymous
            • 16 years ago

            Well, you must know more about how prices work in such things than I do. I only see things on sale by accident now and then, and if it’s something I’d had interest in earlier, then I may buy. So if you can tell when a $50 release-priced game is going to drop to $20, I’m interested in learning about that.

            • indeego
            • 16 years ago

            It happens on “Black Friday” at EB, and almost always within 9 months of a game’s initial release at online stores like GameStop. I don’t use much retail nowadays <.<

            • Anonymous
            • 16 years ago

            Interesting. Thanks.

    • Sargent Duck
    • 16 years ago

    So, Nvidia is taking ATI’s high-end name (XT) and putting it on a “lower” card. Seems almost like a smear campaign or something. Couldn’t ATI open up a lawsuit over this?

    Imagine if ATI did the same thing: relabel a 9200 as a Radeon FX 9200 Ultra, just to get even. *disclaimer* I don’t want ATI to actually do this, as it would just be stupid.

      • dmitriylm
      • 16 years ago

      There is no rule dictating that the designation XT should mean a faster product. In nVidia’s plans, the XT is a cheaper version of the Ultra that competes in the lower-end market, and it does a good job at it.

    • derFunkenstein
    • 16 years ago

    So, is nV dumping 5900 GPUs that didn’t make the grade speed-wise and calling them a budget chip?

    • WaltC
    • 16 years ago

    I’m a bit confused by what is evidently your confusion about nV3x and its “pseudo 8-pipe graphics core.” There’s nothing “pseudo” about it–it has only four (4) pixel pipes and can only render 4 pixels per clock. It’s very unwise to confuse “ops” per clock with “pixels” per clock, because if we did that with R350/60 vs. nV35/38, we’d be talking about the R3x0’s 20+ ops per clock versus the nV38’s 8 ops per clock. As the two are not related directly, it’s far simpler and far more accurate not to confuse ops per clock with pixels per clock. Big distinction. Since nVidia initially misrepresented that nV30/5/8 was an 8 pixels-per-clock chip, instead of the 4-pixels per clock chip that it is, and tried to confuse the issue by confusing “ops” with “pixels,” I suppose your misunderstanding is understandable, although by now I’d expect most everyone to have a bit better grasp of the issue.

    Am I surprised that it takes nVidia’s high-end nV35, clocked 17% lower than in a $500 nV38 reference card, to equal or surpass ATI’s mid-range RV350/60 reference design? Not in the least, as I’ve always suspected that’s where nV35/38 was best suited for competition in the first place, as it, like RV360, is a 4-pixel-per-clock chip (while the R300/50/60 have always been a solid 8 pixels per clock). What will surprise me enormously, though, is to see actual reference designs of the type you tested becoming available for $200 at retail…:) That will surprise me a lot, and it will indicate that nVidia is literally having to give nV35 away to push it into the market. I hope you’ll update your article when the actual production boards begin to surface at some point in the future, or at least do a review on such a product to, if nothing else, spot any changes that may occur between the prototype you’ve tested here and the real McCoy which becomes available for purchase.

    At any rate, should the basic card you tested here become available in reality then, yes, I’d agree it would make a fine purchase for someone interested in a 4-pixel-per-clock chip–definitely. I will feel more sorry for nVidia, though, than I will for nVidia’s AIB-partners, because if nVidia doesn’t heavily discount nV35 to them we’ll never see the product you reviewed here actually ship at the price you estimate…:)

    For those who might be interested: I replaced my GF4 Ti4600 last September (in ’02) with an R300-based 9700P (at one point I actually had two GF4 Ti4600s installed in my machines at home). I was immediately able to bump up the resolution in the games I play anywhere from 1-3 notches (depending on the game), and to turn on FSAA (2x-6x, depending on the game) along with 16X AF, over the resolutions I had used with the GF4 at 0X FSAA and 8X AF, without any other change to system hardware. (FSAA just never looked good to me with the GF4, and the performance hit was excessive, so I didn’t use it, but I did enjoy AF on the GF4.) In most cases the 9700P also ran at a higher, more playable framerate than I had enjoyed with the GF4. The difference in IQ was very evident. That was a big surprise to me, and so I did not return the 9700P, even though I had purchased it from a retail vendor with a 30-day, satisfaction-guaranteed, full-refund policy, which I deliberately chose because I had assumed I would not like the 9700P due to ATi’s past driver problems, which I had experienced. As it turned out, I liked the 9700P so much better than my Ti4600 that I bought a 9800P this past May as well, and have permanently retired both of my GF4s. I only mention this after reading remarks here by people who are currently still using GF4s--just to say that much better products have been shipping for well over a year now.

    I wish nVidia much luck in the future, and can only hope that the state of their future product development will be such relative to their competition that the company will no longer feel compelled to cheat on and denigrate benchmarks, and to fundamentally misrepresent its product specs as they did initially with nV30 (I don’t think I’ll ever forget nVidia advertising an 8×1 organization for nV30 at first–it’s the first time I’ve seen something like this and I hope never to see it again.)
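
    A minimal sketch of the pixels-versus-ops arithmetic WaltC describes, in Python, using the 400MHz core clock of the card reviewed here; the four-pixels-per-clock and eight-ops-per-clock figures are the ones under debate above, not new measurements.

    # Peak rate (M/s) = core clock (MHz) x units active per clock.
    def peak_rate(core_mhz, per_clock):
        return core_mhz * per_clock

    CORE_MHZ = 400  # GeForce FX 5900 XT core clock

    print(peak_rate(CORE_MHZ, 4), "Mpixels/s of color+Z pixels")     # 1600
    print(peak_rate(CORE_MHZ, 8), "Mops/s of Z/stencil/shader ops")  # 3200
    print(peak_rate(CORE_MHZ, 8), "Mtexels/s (4 pipes x 2 TMUs)")    # 3200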

      • vortigern_red
      • 16 years ago

      WaltC,

      This was not an NV ref card, if you think that it was.

      Quote
      /[

        • WaltC
        • 16 years ago

        OK–I had thought that it must be, since the price was estimated instead of actual. Thanks for the correction, though.

      • getbornagain
      • 16 years ago

      http://www.newegg.com/app/ViewProductDesc.asp?description=14-130-179&depa=0

      Did not have to look far :) Almost think I want one.

        • vortigern_red
        • 16 years ago

        I had not actually looked (I don’t want one 🙂 ), but yes, it appears the 5900XT is available even in the UK!! (Dabs’ own brand, £150; I haven’t looked elsewhere.)

        • WaltC
        • 16 years ago

        Thanks for the link, but that’s the SE version, not the XT reviewed here.

          • Dissonance
          • 16 years ago

          From the review.

          http://www.techreport.com/reviews/2003q4/geforcefx-5900xt/index.x?pg=1
          http://www.techreport.com/reviews/2003q4/geforcefx-5900xt/index.x?pg=14

            • WaltC
            • 16 years ago

            Thank you, Dissonance….:) I did, believe it or not, look at your specs page and compare it to the SE’s, but I goofed: I thought you’d listed the amount of onboard RAM in your specs, and thought it was 256MB--but now I see that you didn’t list the amount of RAM onboard the card you tested here, at least in the specs you listed in the table:

            http://www.techreport.com/reviews/2003q4/geforcefx-5900xt/index.x?pg=1

            Sorry….:) I really think, though, that the amount of onboard RAM is a valid spec. I also might suggest that rather than including an “estimated price”, if you can find an in-stock version of the product shipping from a reliable source (such as Newegg), it might be better to list that price and a link, instead of tagging the review with an “estimated price” (which gives the impression you have no exact idea of what the specific product you’ve reviewed will actually sell for).

            Gawd… I despise these monikers--SE/XT, etc.--I don’t care who makes the chips the products are based on--they are for the birds, if for no other reason than that they are inconsistent. I cannot figure out the ridiculous “9800SE” at all, frankly, and have no idea what’s up with that. (What’s the point in doing a 9500-like version of the 9800?) Also, just going by model number nomenclature, one might think the “5900XT” should be compared with the 9800XT, until you look at the price difference between not only the 9800XT and the 5900XT, but also between the 5950U and the 5900XT….:) It’s a mess…:)

            The strangest thing of all is eVGA sending you a 5900XT which, apparently, is being sold currently as a “5900SE”….?….:) If there’s no difference, why is eVGA using different model ID nomenclature? And if the two are the exact same product, why would you estimate the XT’s price to be $15 higher than the SE’s, as listed in stock on Newegg?

            Man, the AIB partners for both these companies seem to be having a field day with confusing and possibly even misleading model numbers. It’s going to be a Caveat Emptor Christmas, no doubt about it…:)

    • My Johnson
    • 16 years ago

    Great card for older games, but once you turn on PS 2.0 the 9600 XT pulls even or ahead at playable lower resolutions.

    I guess I’m still saving my pennies. Nothing out there I care to play on the PC that’s new until HL2 anyway.

    I’m currently running a GF2 Ti. The last PC game I purchased was HL, 5 years ago. Currently goofing around on a Gamecube, and I only play the games designed for it. Not those ugly PS2 ports either.

    • Anonymous
    • 16 years ago

    I see Nvidia is throwing money out the window. The card is just an underclocked 5900. Plus, that seems like an unfair fight: the 9600 XT is missing half its memory bus and bandwidth.

      • Anonymous
      • 16 years ago

      And it costs more; tech sites usually base their comparisons on this strange, weird, odd thing called PRICE.
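
      The bandwidth half of that comparison is simple arithmetic. A rough sketch, assuming the commonly quoted 600MHz effective memory clock and 128-bit bus for the Radeon 9600 XT (an assumption, not a figure from this review):

      # Peak memory bandwidth = effective memory clock x bus width in bytes.
      def bandwidth_gbs(effective_mhz, bus_bits):
          return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

      print(f"GeForce FX 5900 XT: {bandwidth_gbs(700, 256):.1f} GB/s")  # 22.4
      print(f"Radeon 9600 XT:     {bandwidth_gbs(600, 128):.1f} GB/s")  #  9.6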

    • vortigern_red
    • 16 years ago

    […]

      • Anonymous
      • 16 years ago

      Nvidia didn’t remove anything; the latest patch to 3DMark03 disabled parts of the NV drivers. I think it was posted on Tech Report somewhere…

      I could be wrong…

      • vortigern_red
      • 16 years ago

      Sorry, I was not very clear. The newer drivers (not the Futuremark-approved ones, 52.16?, but the 53.xx) seem to restore all the “cheats” disabled by the recent 340 patch, which stops NV from detecting 3DMark via the order of its PS instructions.

      However, FM only stopped the “cheats” on the 4 game tests, not the PS2.0 tests, but NV seems to have removed those “cheats” themselves at the same time as re-implementing the disabled ones in the new 53.xx drivers.

      • Anonymous
      • 16 years ago

      Go to anandtech. They did an article on IQ.

        • vortigern_red
        • 16 years ago

        LOL!!!

        That’s a joke, right?

        […]

    • R2P2
    • 16 years ago

    ATI uses XT for a higher-clocked version of a card, and now nVidia uses it for a lower-clocked version. WHEN WILL THE NAMING MADNESS END!!1!

      • lethal
      • 16 years ago

      ha! Never. It appears almost as if they […]

    • Logan[TeamX]
    • 16 years ago

    What you are all quick to forget is just how weak the card really is. They had to compare it to a 9600 XT in order to see it succeed. Put it against a 9800 Pro 128MB or *gasp* a 9800 XT, and there goes the ballgame. It’s a 5900, which last I checked was supposed to be the competition for the 9800 series. The 9600 is intended as competition for the 5700 and 5600 series. Even ATI’s comparison page readily gives basic tech facts and expected performance ratings to substantiate this. Oh, and I’m not going to even comment on the 53.03 drivers… there has to be something at work there to give you a 700+ point jump in 3DMark03! If the next Catalysts did that, the Nvidia people would riot in the street, screaming about optimizations and whatnot. Now that Nvidia does it again, is anyone surprised? No… they welcome it. I’m always skeptical on both sides of the graphics fence, but this has me floored. A souped-up 5900, compared against a 9600 XT.

    Oh, and check the price for the performance. There’s no doubt it can achieve what it says it can achieve, with the right drivers, but how are they making any money using cores that powerful then? Are they willing to take a loss just to dominate the middle-ground section of the video card industry? I guess so.

    As taken from the ATI.com Product Comparison screen:

    ATI 9600 Pro 128MB: 1.6GPixels/second, 200MTriangles/second

    ATI 9600XT 128MB: 2.0GPixels/second, 250MTriangles/second

    ATI 9800 Pro 128MB: 3.06GPixels/second, 380MTriangles/second.

    Jeez… sounds like the intended competition is either the 9800 Pro or even non-Pro, either of which will soundly spank this price-chopped 5900.
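
    Those comparison-page numbers line up with simple clock-times-pipes arithmetic. A quick check, assuming the commonly cited reference core clocks of 400, 500, and 380MHz (assumptions, not figures from this review):

    # Peak pixel fill = core clock (MHz) x pixel pipelines.
    boards = {
        "Radeon 9600 Pro": (400, 4),  # (assumed core MHz, pipes)
        "Radeon 9600 XT": (500, 4),
        "Radeon 9800 Pro": (380, 8),
    }

    for name, (mhz, pipes) in boards.items():
        print(f"{name}: {mhz * pipes / 1000:.2f} Gpixels/s")

    # Prints 1.60, 2.00, and 3.04 Gpixels/s -- close to ATI's quoted 1.6,
    # 2.0, and 3.06; the last presumably reflects rounding or a slightly
    # different clock.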

      • Anonymous
      • 16 years ago

      Valid points, but I really think you have to compare cards on price/performance, not specs. People are quite happy to buy a card that should retail for, say, £300 at £200; they don’t care if the IHV is selling at a loss or not!

        • Logan[TeamX]
        • 16 years ago

        I think that a Radeon 9600 XT at $279 CDN, for example, is still a better deal than this 5900 “SE” POS. It’s like Chevy deciding to sell the Ecotec Cavaliers at $6500 CDN “because they can”. They’d be nailed by the courts so fast it wouldn’t even be funny.

        I don’t see why it’s right for Nvidia but wrong for everyone else. If you were to compare a 9800SE (256-bit interface) to this card, I think it would be neck-and-neck. How about it, Dissonance?

          • DaveJB
          • 16 years ago

          No point comparing the 9800SE – it only uses a 4×1 pipeline config @ 325MHz, so odds are it’d get walloped by anything faster than a 9600 Pro, even in AA/AF tests.

          • vortigern_red
          • 16 years ago

          See this page (I’ve been pointing everybody at this in the last few days!):

          http://www.digit-life.com/articles2/over2003/index.html

          It’s got its limitations (no AA/AF, etc.) but it’s one of the few places I’ve seen the SE versions of any vid cards tested. The 9800SE 256-bit performs about the same as the 9600 Pro but would perform better with AA due to its better memory bandwidth.

          I don’t see your problem with NV selling the chip cheaper; ATI has done the same thing with the 9500 Pro. AMD and Intel constantly sell chips that are capable of being clocked much higher at mid price points!

          PS the anon poster you are responding to was me having pressed the wrong button

            • Logan[TeamX]
            • 16 years ago

            I just don’t see how AMD sells chips that can be clocked higher. If you’re talking lower multipliers, well then yes, BOTH AMD and Intel are guilty. I’m just saying that they’re taking an established level of graphics processor, keeping the name the same for some unknown reason, and then underclocking it and telling reviewers and enthusiasts that it’s a competitor to a lower class of architecture.

            I’m going to detune my Cadillac Escalade by 50 HP and tell all the automotive reviewers that it competes with the Toyota RAV4 and Honda Pilot. Does my point make sense now? I’m asking you to forget about all the amenities and features in the admission criteria for the comparison, yet account for all of them in the evaluation process.

            It’s like ATI making a 9800 “FX” and lowering it just below Pro speeds, and then telling everyone that matters that it competes with the 5600 and 5700. Of course it’ll clean up, as the architecture of the chip is far superior to either of the models listed from Nvidia. I’m not faulting the Tech Report guys for doing the test; I’m questioning why they fell for such an obvious ruse. It really doesn’t make sense, and it throws the generally accepted review principles for a crap.

            What next… 9200SE vs 5700 Ultra? FX5200 vs. 9800 NP? I see no good coming from this trend of beyond-hucksterism by whichever company embraces such tactics. It just so happens that Nvidia has taken the first (low) shot at it. First the driver debacle, and now this? Why am I not surprised…

            [EDIT] – I hope XT is trademarked by ATI, otherwise I fully expect to see retaliatory labels being used by ATI against Nvidia. This is rather low, especially for Nvidia.

            • jss21382
            • 16 years ago

            Quote: “I’m not faulting the Tech Report guys for doing the test; I’m questioning why they fell for such an obvious ruse. It really doesn’t make sense, and it throws the generally accepted review principles for a crap.”

            I could be mistaken, but aren’t most comparison reviews done based on the price points of the pieces being reviewed?

            • Logan[TeamX]
            • 16 years ago

            It’s obvious you didn’t read the rest of my post. I fully expect ATI to start playing dirty pool, what with the price-chopping and badging of cards with the competitor’s nomenclature. I’ll be the first in line for a Radeon 9800FX that’s priced to whack a GeForce FX 5900. Oh wait, that’s the Radeon 9800 NP. Never mind.

            You people are really blinded by the new range of Nvidia crap, aren’t you?

            • hmmm
            • 16 years ago

            Look, I think it is *you* who doesn’t get it. You’re right, the comparison isn’t fair, but the market isn’t supposed to be fair. If AMD wanted to sell an A64 FX for $50 and have it compete against Celerons, we’d all think they were MAD, but we’d be pretty happy to buy up as many of the chips as they’d sell. If NVIDIA wants to completely erode their margin on a card to be competitive with an inferior architecture (by selling the top range stuff at mid-range prices) then the consumer wins. Now maybe it still isn’t wise to buy the card, especially if you want to play DX9 titles (or especially now that 9800Pro 128’s are available for just over $200!!!). Maybe it isn’t fair of NVIDIA, but in the end it is good for us. That’s what matters.

            Now maybe the review should have said that the architectures were inherently unequal and that NVIDIA must be losing money. That would have been a nice little sentence to include, but it isn’t *that* big a deal.

            Also, I would have liked to see a 9800 NP and a 9800 Pro 128 included in the benchmarks. I believe that the latter would certainly clean up. And it makes sense now that the prices have come way down.

            • Dissonance
            • 16 years ago

            […]

            • Logan[TeamX]
            • 16 years ago

            *sigh*

            Doesn’t matter to me, really… I just wondered why you guys didn’t even comment on the fact!

            • vortigern_red
            • 16 years ago

            No more than Pro or Ultra mean that it has to be faster, but they are perceived by Joe Public to mean that. Would you think that ATI was trying to mislead people if it had called its current 9800SE (already a misleading name) the Radeon FX 9800 Ultra?

            It’s about how difficult it’s becoming for Joe Public to make a selection on a graphics card, because the IHVs and board makers are not clear.

            • Logan[TeamX]
            • 16 years ago

            Actually, yes, I’d despise ATi for doing such a thing, and I’ve never supported the actual use of “SE” for their bottom-end card.

      • wagsbags
      • 16 years ago

      I think it makes infinitely more sense to compare the cards on a price basis. Why would you compare it to a card that costs over $100 more? I would like to see a 9700 Pro thrown in there…

    • Rousterfar
    • 16 years ago

    Still going strong with my Ti 4200. I think I will just wait for the next full round of cards.

    • mirkin
    • 16 years ago

    Is it just me, or am I in limbo until HL2 and Doom 3 go retail and we see some benchies of those games? I’m running a GF3 Ti 500 and would love to upgrade, but feel it’s just a short time until the next-gen games hit that will actually require an upgrade – I’d hate to sell myself short by upgrading prematurely. The battle for the mid-range is insane – we’ve never had so many options and such performance at $200 – what do NV and ATI know about D3 and HL2? If these cards run those games great, we win; if they don’t… there will be much rending of garments and screaming.

    • Nelliesboo
    • 16 years ago

    Why is everyone talking about retiring their card because NV has a winner now? What, you couldn’t do it when ATI was leading the way? (wtf) I have a 4200 right now, but I want to see what the spring cards do before I get a new one. Also, I want to see a game that can use all this power; I mean, I am fine just playing games at 1024×768 (which I do on my 9000-powered laptop). And lastly, while this card looks good, NV driver issues have gotten to the point where I don’t trust benches anymore. Good card, but how good is it really?

      • Anonymous
      • 16 years ago

      People change their minds all the time with hardware. This isn’t odd at all. Remember when AMD was such a crappy processor? Now AMD is king! Stuff like this happens all the time in PC hardware. We’ll see the NV40 in 2004 and see how their new design stacks up. ATI was a nice kick to Nvidia, telling them that if they get lazy, someone is going to take their spot. The same happened to Intel back in the Pentium II days 😛 This is a great time to be a consumer because they will do EVERYTHING to outdo each other, and we can expect massive speed increases in gaming.

        • Anonymous
        • 16 years ago

        […]

          • Anonymous
          • 16 years ago

          The comment directly above is my own. I mistakenly posted it before completing it and signing off. Here is the portion that was left out:

          […] Naturally you should note that these are the words of an individual with a natural […]

    • Anonymous
    • 16 years ago

    I would have liked to have seen more real games used that had private demos, like they do at [H]. All the “canned” benchmarks are worthless, with the cheating ways of NVIDIA so evident nowadays, when it comes to showing me what kind of gaming experience is going to be delivered by the video card.

      • Damage
      • 16 years ago

      Kyle:

      We tested nine games, and we used five custom demos that we created. We were limited by the fact that not every game we wanted to test (and we wanted to test some of them because of their use of certain technologies, like DX9) allows for the creation of custom demos. But you are one of our favorite long-time readers, so we will keep the feedback in mind next time out. 🙂

      Thanks,
      Scott

      • Anonymous
      • 16 years ago

        Don’t forget, back in the day it was ATi that was found to be cheating. This shit happens with everyone.

        • Anonymous
        • 16 years ago

        This is not “cheating” per se; it’s questionable practices. If we, the customers, don’t make noise when we see business practices that can/will negatively impact us, businesses will assume the public is “ok” with their actions, i.e., that their “shady” behaviour will not result in a negative backlash. After all, businesses are in it for profit, and if the easiest, cheapest “method” – which also happens to be shady, borderline illegal – brings in the most profit, then they WILL do it, if they think they can get away with it.

        I will, at the risk of getting called names, point out Microsoft as an example of a company that has operated in this manner.

        readysetgo

        • Anonymous
        • 16 years ago

        Only thing is, Nvidia continues to do it. How many times did they get caught cheating in 3DMark03 alone? 3 times? lol. Not to mention Unreal2k. Not to mention those infamous 51.75 drivers……

        • hmmm
        • 16 years ago

        You’re totally right. ATI is just as bad as NVIDIA. They cheated once in Q3 like three years ago. Then they took out the cheats and improved the drivers legitimately beyond the initial cheating scores. That’s moral equivalency with NVIDIA if I’ve ever seen it.

    • Hance
    • 16 years ago

    Great review as always. Looks like a pretty fast card. It’s way faster than the 5700. The only thing I wonder is how the 5900 XT compares to a 9700 Pro. You can still pick up the 9700 Pro on the net for around 230 or 240 dollars, and I bet it will give the 5900 a run for its money.

      • Krogoth
      • 16 years ago

      I really think that the GeForce FX 5900 XT actually beats the older Radeon 9700 Pro in performance/price ratio. This is the first time since the NV25 that Nvidia has released a solid GPU and card for a reasonable price.

        • PLASTIC SURGEON
        • 16 years ago

        I doubt that. It might beat it in some DX8 and DX7 tests, but who wants a next-generation card for those games when the Ti 4200 and Ti 4600 run them fine? It’s DX9 games that count. And the 9700 Pro STILL has TRUE 8 pixel pipelines, not the 4×2 (sometimes 8×1) the 5900 XT has; it’s still running on the R3xx chipset. Nvidia still has to run games at lower FP precision and with hybrid shader operations to run DX9 games properly (PS1.1, PS1.4… FP16.. FP12). Pure DX9 shader operations and games will suffer on FX cards. This has already been seen, and even in this article it is a worrisome issue for the 5900 XT. Yes, if you want a cheap, top-performing DX7/DX8-class card, then look no further than this card. However, I really don’t think most “informed” gamers will opt for a card that will play upcoming games poorly…..

        http://www.digital-daily.com/video/nvidia-forceware-5216/index07.htm
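
        The 4×2-versus-8×1 point is easy to put into numbers. A minimal sketch, assuming the commonly cited 325MHz core clock for the 9700 Pro (an assumption; the 400MHz 5900 XT clock is from this review): both layouts reach similar texel rates, but single-textured pixels come out at half the pipe count on a 4×2 design.

        # pixels/s = clock x pipes; texels/s = clock x pipes x TMUs per pipe.
        def rates(core_mhz, pipes, tmus_per_pipe):
            return {
                "Mpixels/s (single texture)": core_mhz * pipes,
                "Mtexels/s": core_mhz * pipes * tmus_per_pipe,
            }

        print("FX 5900 XT (4x2 @ 400MHz):", rates(400, 4, 2))
        # {'Mpixels/s (single texture)': 1600, 'Mtexels/s': 3200}
        print("9700 Pro (8x1 @ 325MHz):", rates(325, 8, 1))
        # {'Mpixels/s (single texture)': 2600, 'Mtexels/s': 2600}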

        • vortigern_red
        • 16 years ago

        A 9700 Pro is just slightly faster than (virtually the same as) a 9800 non-Pro. IMHO I really don’t think that this card will touch it in upcoming games. If you can get a 9700 Pro for similar money (or a bit more), I would get that, unless you are dead set on NV or have a particular game or other reason (Linux?) that NV might be more suitable for.

      • Anonymous
      • 16 years ago

      Well yeah the ATI pro lines are supposed to be better than the Nvidia 5900 non-ultra 😛

    • Anonymous
    • 16 years ago

    Sweet card!

    I’ll finally retire my Gf3-200 this spring and this looks like my new card.

    • Captain Ned
    • 16 years ago

    Is it finally time to retire the Ti4200? It just may be.
