Nvidia’s GeForce 8800 Ultra graphics card

WHAT HAPPENS WHEN YOU take the fastest video card on the planet and turn up its clock speeds a bit? You have a new fastest video card on the planet, of course, which is a little bit faster than the old fastest video card on the planet. That’s what Nvidia has done with its former king-of-the-hill product, the GeForce 8800 GTX, in order to create the new hotness it’s announcing today, the GeForce 8800 Ultra.

There’s more to it than that, of course. These are highly sophisticated graphics products we’re talking about here. There’s a new cooler involved. Oh, and a new silicon revision, for you propellerheads who must know these things. And most formidable of all may be the new price tag. But I’m getting ahead of myself.

Perhaps the most salient point is that Nvidia has found a way to squeeze even more performance out of its G80 GPU, and in keeping with a time-honored tradition, the company has introduced a new top-end graphics card just as its rival, the former ATI now owned by AMD, prepares to launch its own DirectX 10-capable GPU lineup. Wonder what the new Radeon will have to contend with when it arrives? Let’s have a look.

It’s G80, Jim, but not as we know it
For us, the GeForce 8800 is familiar territory by now. We’ve reviewed it on its own, paired it up by twos in SLI for killer performance, and rounded up a host of examples to see how they compared. By and large, the GeForce 8800 Ultra is the same basic product as the GeForce 8800 GTX that’s ruled the top end of the video card market since last November. It has the same 128 stream processors, the same 384-bit path to 768MB of GDDR3 memory, and rides on the same 10.5″ board as the GTX. There are still two dual-link DVI ports, two SLI connectors up top, and two six-pin PCIe auxiliary power connectors onboard. The feature set is essentially identical, and no, none of the new HD video processing mojo introduced with the GeForce 8600 series has made its way into the Ultra.

Yet the Ultra is distinct for several reasons. First and foremost, Nvidia says the Ultra packs a new revision of G80 silicon that allows for higher clock speeds in a similar form factor and power envelope. In fact, Nvidia says the 8800 Ultra has slightly lower peak power consumption than the GTX, despite having a core clock of 612MHz, a stream processor clock of 1.5GHz, and a memory clock of 1080MHz (effectively 2160MHz since it uses GDDR3 memory). That’s up from a 575MHz core, 1.35GHz SPs, and 900MHz memory in the 8800 GTX.

Riding shotgun on the Ultra is a brand-new cooler with a wicked hump-backed blower arrangement and a shroud that extends the full length of the board. Nvidia claims the raised fan allows the intake of more cool surrounding air. Whether it does or not, it’s happily not much louder than the excellent cooler on the GTX. Unfortunately, though, the longer shroud will almost certainly block access to SATA ports on many of today’s port-laden enthusiast-class motherboards.

If you dig the looks of the Vader-esque cooling shroud and want the bragging rights that come with the Ultra’s world-beating performance, you’ll have to cough up something north of eight hundred bucks in order to get it. Nvidia expects Ultra prices to start at roughly $829, though they may go up from there depending on how much “factory overclocking” is involved. That’s hundreds of dollars more than current GTX prices, and it’s asking quite a lot for a graphics card, to say the least. I suppose one could argue it offers more for your money than a high-end quad-core processor that costs 1200 bucks, but who can measure the depths of insanity?

The Ultra’s tweaked clock speeds do deliver considerably more computing power than the GTX, at least in theory. Memory bandwidth is up from 86.4GB/s to a stunning 103.7GB/s. Peak shader power, if you just count programmable shader ops, is up from 518.4 to 576 GFLOPS—or from 345.6 to 384 GFLOPS, if you don’t count the MUL instruction that the G80’s SPs can co-issue in certain circumstances. The trouble is that “overclocked in the box” versions of the 8800 GTX are available now with very similar specifications. Take the king of all X’s, the XFX GeForce 8800 GTX XXX Edition. This card has a 630MHz core clock, 1.46GHz shader clock, and 1GHz memory. That’s very close to the Ultra’s specs, yet it’s selling right now for about $630 at online vendors.
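If you want to check that math, the peak numbers fall straight out of the published specs: shader FLOPS are the SP count times the SP clock times the flops issued per clock, and memory bandwidth is the effective data rate times the bus width in bytes. Here’s a minimal sketch in Python of that back-of-the-envelope arithmetic (our own, not an official Nvidia formula):

    # Peak theoretical throughput from the published specs.
    # flops_per_clock is 3 if you count the co-issued MUL, 2 if you count
    # only the MADD the stream processors can always issue.
    def shader_gflops(sps, sp_clock_ghz, flops_per_clock):
        return sps * sp_clock_ghz * flops_per_clock

    def mem_bandwidth_gbs(effective_mhz, bus_width_bits):
        # effective_mhz already reflects GDDR3's double data rate
        return effective_mhz * (bus_width_bits / 8) / 1000

    print(shader_gflops(128, 1.35, 3), shader_gflops(128, 1.5, 3))     # 518.4, 576.0
    print(shader_gflops(128, 1.35, 2), shader_gflops(128, 1.5, 2))     # 345.6, 384.0
    print(mem_bandwidth_gbs(1800, 384), mem_bandwidth_gbs(2160, 384))  # 86.4, 103.68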

So the Ultra is—and this is very technical—what we in the business like to call a lousy value. Flagship products like these rarely offer stellar value propositions, but those revved-up GTX cards are just too close for comfort.

The saving grace for this product, if there is one, may come in the form of hot-clocked variants of the Ultra itself. Nvidia says the Ultra simply establishes a new product baseline, from which board vendors may improvise upward. In fact, XFX told us that they have plans for three versions of the 8800 Ultra, two of which will run at higher clock speeds. Unfortunately, we haven’t yet been able to get likely clock speeds or prices from any of the board vendors we asked, so we don’t yet know what sort of increases they’ll be offering. We’ll have to watch and see what they deliver.

We do have a little bit of time yet on that front, by the way, because 8800 Ultra cards aren’t expected to hit online store shelves until May 15 or so. I expect some board vendors haven’t yet determined what clock speeds they will offer.

In order to size up the Ultra, we’ve compared it against a trio of graphics solutions in roughly the same price neighborhood. There’s the GeForce 8800 GTX, of course, and we’ve included one at stock clock speeds. For about the same price as an Ultra, you could also buy a pair of GeForce 8800 GTS 640MB graphics cards and run them in SLI, so we’ve included them. Finally, we have a Radeon X1950 XTX CrossFire pair, which is presently AMD’s fastest graphics solution.


Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor                 Core 2 Extreme X6800 2.93GHz          Core 2 Extreme X6800 2.93GHz
System bus                1066MHz (266MHz quad-pumped)          1066MHz (266MHz quad-pumped)
Motherboard               XFX nForce 680i SLI                   Asus P5W DH Deluxe
BIOS revision             P26                                   1901
North bridge              nForce 680i SLI SPP                   975X MCH
South bridge              nForce 680i SLI MCP                   ICH7R
Chipset drivers           ForceWare 15.00                       INF update 8.1.1.1010,
                                                                Matrix Storage Manager 6.21
Memory size               4GB (4 DIMMs)                         4GB (4 DIMMs)
Memory type               2 x Corsair TWIN2X2048-8500C5D        2 x Corsair TWIN2X2048-8500C5D
                          DDR2 SDRAM at 800MHz                  DDR2 SDRAM at 800MHz
CAS latency (CL)          4                                     4
RAS to CAS delay (tRCD)   4                                     4
RAS precharge (tRP)       4                                     4
Cycle time (tRAS)         18                                    18
Command rate              2T                                    2T
Hard drive                Maxtor DiamondMax 10 250GB SATA 150   Maxtor DiamondMax 10 250GB SATA 150
Audio                     Integrated nForce 680i SLI/ALC850     Integrated ICH7R/ALC882M
                          with Microsoft drivers                with Microsoft drivers
Graphics                  GeForce 8800 Ultra 768MB PCIe         Radeon X1950 XTX 512MB PCIe
                          with ForceWare 158.18 drivers         + Radeon X1950 CrossFire
                                                                with Catalyst 7.4 drivers
                          GeForce 8800 GTX 768MB PCIe
                          with ForceWare 158.18 drivers

                          Dual BFG GeForce 8800 GTS 640MB PCIe
                          in SLI with ForceWare 158.18 drivers
OS                        Windows Vista Ultimate x86 Edition    Windows Vista Ultimate x86 Edition
OS updates

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to what you get with no-name DIMMs.

Our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1600×1200 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


Pixel-filling power
We’ve talked a little bit about shader power and memory bandwidth, but let’s pause here to look at pixel and texel throughput alongside memory bandwidth. Shader performance is becoming more and more important in newer games, but these old-school metrics still dictate part of the performance picture. As expected, the Ultra’s at the top of the heap in nearly every measure.

                    Core    Pixels/   Peak fill     Textures/   Peak fill     Effective      Memory      Peak memory
                    clock   clock     rate          clock       rate          memory clock   bus width   bandwidth
                    (MHz)             (Mpixels/s)               (Mtexels/s)   (MHz)          (bits)      (GB/s)
GeForce 7950 GT     550     16          8800        24          13200         1400           256          44.8
Radeon X1900 XT     625     16         10000        16          10000         1450           256          46.4
GeForce 7900 GTX    650     16         10400        24          15600         1600           256          51.2
Radeon X1950 XTX    650     16         10400        16          10400         2000           256          64.0
GeForce 8800 GTS    500     20         10000        24          12000         1600           320          64.0
GeForce 8800 GTX    575     24         13800        32          18400         1800           384          86.4
GeForce 8800 Ultra  612     24         14688        32          19584         2160           384         103.7
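Every derived column in that table is a simple product of its neighbors: the fill rates are the core clock times the per-clock rates, and bandwidth is the effective memory clock times the bus width in bytes. A quick sketch of the arithmetic for the Ultra’s row (our own calculation):

    # GeForce 8800 Ultra peak figures, derived from the published specs.
    core_mhz, pixels_per_clock, textures_per_clock = 612, 24, 32
    eff_mem_mhz, bus_bits = 2160, 384

    pixel_fill = core_mhz * pixels_per_clock        # 14688 Mpixels/s
    texel_fill = core_mhz * textures_per_clock      # 19584 Mtexels/s
    bandwidth  = eff_mem_mhz * bus_bits / 8 / 1000  # 103.68 GB/s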

Yep, the 8800 Ultra has about twice the memory bandwidth of the GeForce 7900 GTX, believe it or not, and it leads in the other categories, including pixel fill rate, texturing capacity, and “impresses the chicks.” The only way any existing solution could keep up would be in a multi-GPU configuration. We can measure several of these capabilities via synthetic benchmarks, to see if the Ultra lives up to its potential.

The Ultra comes very near to its theoretical peak for multitexturing, as do the other solutions we’ve pitted against it.


S.T.A.L.K.E.R.: Shadow of Chernobyl
We tested S.T.A.L.K.E.R. by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
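In code terms, the aggregation works something like the sketch below. This is purely illustrative (the function is hypothetical; FRAPS itself just logs the raw numbers), but it captures the average-plus-median-of-lows reduction described above:

    # Average FPS across five 60-second FRAPS sessions, plus the median of
    # the five per-session lows (the median damps outlier results).
    from statistics import mean, median

    def summarize(sessions):
        # sessions: one list of per-second FPS samples per gameplay run
        avg_fps = mean(mean(run) for run in sessions)
        low_fps = median(min(run) for run in sessions)
        return avg_fps, low_fps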

For this test, we set the game to its “maximum” quality settings at 2560×1600 resolution. Unfortunately, the game crashed on both GeForce and Radeon cards when we set it to use dynamic lighting, so we had to stick with its static lighting option. Nevertheless, this is a good-looking game, with some nice shader effects and lots of vegetation everywhere.

The Ultra flies through S.T.A.L.K.E.R., as do the GTX and the pair of 8800 GTS cards in SLI. You’d be hard pressed to tell the difference between any of these three solutions by the seat of your pants. Only the poor Radeon X1950 XTX CrossFire setup struggles here, showing its age.

Supreme Commander
Here’s another new game, and a very popular request for us to try. Like many RTS and isometric-view RPGs, though, Supreme Commander isn’t exactly easy to test well, especially with a utility like FRAPS that logs frame rates as you play. Frame rates in this game seem to hit steady plateaus at different zoom levels, complicating the task of getting meaningful, repeatable, and comparable results. For this reason, we used the game’s built-in “/map perftest” option to test performance, which plays back a pre-recorded game.

Another note: the frame rates you see below look pretty low, but for this type of game, they’re really not bad. We’ve observed in-game frame rates similar to the numbers from the performance test, and they’re still largely acceptable, even at higher resolutions. This is simply different from an action game, where always-fluid motion is required for smooth gameplay.

The Ultra snags the top spot here, aided by the fact that the 8800 GTS in SLI doesn’t appear to scale to two cards well in this game. The median low frame rate numbers, meanwhile, are kind of all over the map, which just shows how variable they are in Supreme Commander, for whatever reason.


Battlefield 2142
We tested this one with FRAPS, much like we did S.T.A.L.K.E.R. In order to get this game to present any kind of challenge to these cards, we had to turn up 16X anisotropic filtering, 4X antialiasing, and transparency supersampling (or the equivalent on the Radeons, “quality” adaptive AA). I’d have run the game at 2560×1600 resolution if it supported that display mode.

BF2142 looks gorgeous and runs well on any of these configs, but the 8800 Ultra looks best, plays best, and turns in the fastest average and low frame-rate numbers.

Half-Life 2: Episode One
This one combines high dynamic range lighting with 4X antialiasing and still has fluid frame rates at very high resolutions.

The Ultra is juuust barely edged out by a pair of Radeon X1950 XTXs in CrossFire here, but it’s extremely close. The GTX again shadows the Ultra, running just behind it.

The Elder Scrolls IV: Oblivion
We turned up all of Oblivion’s graphical settings to their highest quality levels for this test. The screen resolution was set to 1920×1200 resolution, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards’ driver control panels. We tried enabling 4X antialiasing, as well, but got inconsistent results from Nvidia’s current 158.18 drivers for Vista x86. Antialiasing only worked intermittently, and we haven’t yet found a consistent work-around or fix. As a result, we’ve tested without AA.

We strolled around the outside of the Leyawiin city wall, as shown in the picture below, and recorded frame rates with FRAPS. This area has loads of vegetation, some reflective water, and some long view distances.

Grabbing a pair of GTSes will buy you more performance in Oblivion than the Ultra.

Rainbow Six: Vegas
This game is notable because it’s the first game we’ve tested based on Unreal Engine 3. As with Oblivion, we tested with FRAPS. This time, I played through a 90-second portion of the “Dante’s” map in the game’s Terrorist Hunt mode, with all of the game’s quality options cranked. The game engine isn’t compatible with multisampled antialiasing, so we couldn’t enable AA.

This Xbox 360 port will tax any current video card at this resolution, but the Ultra once again comes out ahead of the pack.


3DMark06

The GTS SLI rig scales up to two cards nicely in 3DMark, allowing it to take top honors. The Ultra is all alone in second place, running ahead of the neck-and-neck GeForce 8800 GTX and Radeon X1950 XTX CrossFire.

Through the remainder of 3DMark’s synthetic tests, the Ultra proves again to be just a little faster than the GeForce 8800 GTX.


Power consumption
We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement.

The idle measurements were taken at the Windows desktop. The cards were tested under load running Oblivion at 1920×1200 resolution with 16X anisotropic filtering. We loaded up the game and ran it in the same area where we did our performance testing.

The GeForces were all measured on the same motherboard, but we had to use a different board in order to run the Radeon X1950 XTX in CrossFire, so keep that in mind.

The power consumption numbers we observed in our test scenario aren’t quite what we expected, given Nvidia’s claims about the Ultra’s peak power use being lower than the GTX’s. However, power use can vary from one scenario to the next, and it’s possible the Ultra’s peak power use is still lower, depending on how one tests it. We’ve found our test scene from Oblivion to be very power intensive, and it’s a good real-world test, for what it’s worth.

The other thing to note here is that, for all its speediness, the Ultra draws substantially less power than the dual-GPU solutions that offer similar performance.

Noise levels and cooling
We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the Zalman CNPS9500 LED we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

The Ultra carries on the GTX’s tradition of excellent acoustics. Amazingly, the fastest video card on the market is also one of the quietest.

Conclusions
What’s there to say that hasn’t been said? The GeForce 8800 Ultra’s clock speeds are a little bit higher than the 8800 GTX’s, and as a result, it performs somewhat better. That’s more than sufficient to make this the new Fastest Single Video Card on the Planet. Perhaps the best thing one could say for the Ultra is that Nvidia didn’t blunt the GTX’s virtues—which include gorgeous image quality, fairly reasonable power draw numbers, and whisper-silent cooling—in order to get more performance.

I also prefer the Ultra to the option of running two GeForce 8800 GTS cards in SLI, for a variety of reasons. The 8800 GTS SLI config we tested was faster than the Ultra in some cases, but it was slower in others. Two cards take up more space, draw more power, and generate more heat, but that’s not the worst of it. SLI’s ability to work with the game of the moment has always been contingent on driver updates and user profiles, which is in itself a disadvantage, but SLI support has taken a serious hit in the transition to Windows Vista. We found that SLI didn’t scale well in either Half-Life 2: Episode One or Supreme Commander, and these aren’t minor game titles. I was also surprised to have to reboot in order to switch into SLI mode, since Nvidia fixed that issue in its Windows XP drivers long ago. Obviously, Nvidia has higher priorities right now on the Vista driver front, but that’s just the problem. SLI likely won’t get proper attention until Nvidia addresses its other deficits compared to AMD’s Catalyst drivers for Vista, including an incomplete control panel UI, weak overclocking tools, and some general functionality issues like the Oblivion AA problem we encountered.

The holder of the graphics performance crown is rarely available for $88.88 at Wal-Mart, but the Ultra’s value proposition is more suspect than usual for a top-end part—not because it breaks new ground in graphics card pricing, which it does, but because there are GTX cards already available with strikingly similar clock speeds for about $200 less. That fact tarnishes the performance crown this card wears, in my view. I expect the Ultra to make more sense as a flagship product once we see—if we see—”overclocked in the box” versions offering some nice clock speed boosts above the stock specs. GeForce 8800 Ultra cards may never be killer values, but at least then they might justifiably command their price premiums.

We’ll be keeping an eye on this issue and hope to test some faster-clocked Ultras soon.

When we do, we may be testing them alongside the Ultra’s intended prey, cards based on AMD’s upcoming R600 GPU. Stay tuned. 

Comments closed
    • bozo28
    • 12 years ago

    Kinda like buying a Formula One for 10 millions. But you can’t go at 300 mph in a city. You flash and that’s it.

    • bozo28
    • 12 years ago

    800 dollars for a video card. I just don’t get it. I have the money and could purchase it. But the technology advances so quickly it’s useless spending so much on a video card. That kind of card seems only for people wanting to *flash*. Or just brainwashed.

    Running a video game at 200 FPS or 30-40 FPS. It’s still the same thing. It’s very playable in both cases and the graphics are the same. Even if your card can count a trillion whatever bazillion of billion pixels. IF the game you play doesn’t have that amount lol. Then it’s fucking useless

    The end

    • matnath1
    • 13 years ago

    Final words of article “intended prey”…..Does this imply the Ultra will Eat the HD 2900 xtx???

    When will the NDA be lifted and the real story told?

    • Sniper
    • 13 years ago

    What a steal.

    (As in, nVidia is doing the stealing.)

    Edit: Oh yeah, in 6 months these cards will be obsolete again, and they’ll probably be $400 cheaper.

    • Wintermane
    • 13 years ago

    I have several friends that buy the highest end stuff..

    For a couple of ’em their systems wind up $4000 anyway due to custom cooling rigs and various other things… multiple high-end HDDs, max high-speed RAM… so adding even an 800 buck card isn’t a big deal.

    Another friend of mine is just MANIC about framerate. He gets motion sick very easily so he is obsessed with speed to prevent it.. seems any kizzups trigger it. For him watercooling and so on are baseline…

    Then there is that guy… we all know ’em.. He OBSESSES on every detail and is VERY obsessive compulsive. He has the best for the same reason his garbage disposal is the most powerful made and his car gets exactly 6 zillion mpg and so on.

    And not too many of them are RICH, they just spend a lot of what they have on their obsessions.

    • Forge
    • 13 years ago

    7950GX2 still looking good, especially when bang/$ comes into it.

    That GTS SLI arrangement is pretty close, though. It’s just those few outlier titles that benefit 0% from SLI that still make me leery, that and giving up FOUR slots to graphics hardware.

    • wulfher
    • 13 years ago

    As my Asus 8800GTX runs with the GPU clocked at 670MHz without any problems on the regular stock cooler, I have no need to buy an Ultra, and the 10% more clock in the pixel shader doesn’t make that big a difference.

    Most of us reading this review think the same, as $200+ can be better invested 🙂

    • PerfectCr
    • 13 years ago

    l[

      • Krogoth
      • 13 years ago

      I concur

        • SGT Lindy
        • 13 years ago

        Me too….especially when I read about all the bugs in SC…..PC gaming is just a mess with $800 video cards and games being released ready to be patched.

        Add in Vista and DX10……….a console just looks too easy.

        For $1000 you could get a cheap notebook and a 360.

          • Krogoth
          • 13 years ago

          Bugs in Supreme Commander? It doesn’t have that many. Internet forums attract the whiners and those who exaggerate their issues. The people who are having little or no problems don’t complain about anything.

          I have played and seen far worse offenders in bug development: Outpost, Master of Magic, Master of Orion 3, Battlecruiser 3000AD, some obscure Sierra football title that was made back in 1999, etc.

          Console games also have software glitches and bugs, but the main advantage of consoles is that you just tray-and-play instead of hunting down the latest drivers/patches and tweaking every little thing in a gaming PC.

            • Bensam123
            • 13 years ago

            Like your “amazing performance with 81 x 81 maps, 1000 unit cap and supreme AIs”

            -_-

            Forums, the front news section could be considered one, also attract the “OMG I get 1000FPS in CSS” people as well.

            • Krogoth
            • 13 years ago

            WTF? Stop twisting crap.

            I never said anything about amazing performance. I just said only those conditions can bring down my rig to its knees. Those by and large are rather extreme.

            • Bensam123
            • 13 years ago

            Not twisting, just rewording what you said so it isn’t so sly.

            “I just said only those conditions can bring down my rig to its knees. Those by and large are rather extreme.”

            In other words, my computer can run the game just fine except for extreme conditions. Which is the complete opposite of what just about everyone else has said, which is that the game grinds to a halt in 20 mins.

            I can play CSS at 70 FPS, I can play CSS at 1000 FPS. There is a correlation.

            • Krogoth
            • 13 years ago

            It is the bloody truth for my system.

            I dunno why it performs so well compared to similarly equipped rigs or what I am doing right.

            • Bensam123
            • 13 years ago

            Well damn d00d. I guess I need to start coating my CPU in cheetah blood or something. How many pigeons/goats do you sacrifice?

    • Wintermane
    • 13 years ago

    I expect some rather good overclocked cards will pop up with 1.4 ghz ram and say 2 ghz shaders…

    • Sargent Duck
    • 13 years ago

    I’ve just ordered two. Gonna SLI ’em up. / goofy, not to be taken seriously tag

      • PrincipalSkinner
      • 13 years ago

      You must be totally crazy or sitting on a pile of cash or both.

        • Nullvoid
        • 13 years ago

        he probably uses $100 bills as toilet paper.

          • Sargent Duck
          • 13 years ago

          Actually, I’ve found that the $50’s are much softer. The $100’s are better fire starters. It’s like they were made to burn.

          *I’m just kidding about the differences in money. I know it’s all the same process/material.

        • Sargent Duck
        • 13 years ago

          Actually neither. I forgot to put on the sarcasm tags. Oooops, my bad

          • Chrispy_
          • 13 years ago

          Don’t worry about them, some people can’t think for themselves.

          I, on the other hand, can *[

    • toxent
    • 13 years ago

    I was going to say something witty about the high price and the e-penis, but it looks like everyone’s probably said it already…

    • IntelMole
    • 13 years ago

    I get the feeling that the overclocked cards aren’t going to be overclocked by that much. If this is for bragging rights only, to counter the new AMD hotness, which will apparently be rather good, then Nvidia must not be able to reliably clock these much higher, IMO.

    Course, they could also be allowing themselves some wiggle room for future releases should AMD’s stuff absolutely destroy them, and/or for the overclocking manufacturer crowd…

    • lemonhead
    • 13 years ago

    would have liked to see a quick comparison of XP numbers on 1 game or something to see how the drivers are shaping up since Vista release.

    • Nelliesboo
    • 13 years ago

    You know, with all the crying, isn’t the 2900XT beating the GTS (which is the top most want to pay for a card)? If the 2900XT can hit that $200-250 mark the GTS hits, then we have a winner.

    • herothezero
    • 13 years ago

    “I yearn for those times when I read shootouts between ATi, nVidia, Matrox, S3, 3DFX…”

    Yeah, I think we all do, even though any of the choices out there in the midrange market from red or green are good ones for the average user, and for them, fewer choices = easier computing/gaming. Maybe Intel’s performance graphics entry next year will bring something new to the table…

      • SPOOFE
      • 13 years ago

      EDIT: The heck? Tried to respond to something else.

      • Stefan
      • 13 years ago

      Well, S3 and Matrox never were that much of a competition performance-wise. So if we count them out, we are left with 3 players only (3dfx, ATI, Nvidia). With Intel pushing for higher performance graphics, I hope this will become a similar situation again.
      (Note that I am far from being an Intel fanboy, have always rooted for the underdogs. But you just gotta hope for some competitor that brings the focus back to the mainstream market. And as much as I like to read about the fastest GPU available, I am much more interested in “What is the best graphics option out there for $200”-reviews than in the latest numbers from XY-Corp’s slightly higher clocked top of the line part!)

        • swaaye
        • 13 years ago

        Matrox sold the fastest card with the best visual quality for a few months in ’99. G400 MAX was a sweet card. Even OpenGL got sorted out finally. That was the last decent card they built for gaming, however. Parhelia woulda been neat, if they’d sold it at the same price as a Ti4600 instead of gouging for years and years.

        Intel hasn’t had jack for enthusiasts since their lukewarm i740. And that card wasn’t worth it when it was brand new. Goofy little crippled device, it was. Did you know that to make a PCI version they had to build the card with an AGP interface on board because the GPU could only use AGP texture memory? lol

    • Code:[M]ayhem
    • 13 years ago

    Wake me when Newegg are selling these for $159 *yawn*

      • d0g_p00p
      • 13 years ago

      In 3 years they might be that low. *yawn* In the mean time that TNT2 Ultra might be cheap enough now for you to afford.

    • Usacomp2k3
    • 13 years ago

    I was hoping for a little more performance increase for a $200 price increase. At least the 6800 super-ultra edition had a good bit better performance.

    • FireGryphon
    • 13 years ago

    y[

      • JoshMST
      • 13 years ago

      Yeah, I miss the heyday of multiple, competitive 3D graphics manufacturers. Too bad the cost of entry into the market is now so far above anyone’s head, it just isn’t feasible anymore. I think we are past the days of startups with unique 3D technologies.

      • flip-mode
      • 13 years ago

      😀 I love to make people laugh.

    • PRIME1
    • 13 years ago

    This card is just to poop in ATI’s cornflakes.

    However, there are plenty of people willing to dish out 1K for a CPU so some of these will probably sell.

      • l33t-g4m3r
      • 13 years ago

      *cough* SPOOFE *cough*

        • SPOOFE
        • 13 years ago

        What on Earth makes you think I would buy this thing?

        Oh yeah: I showed that you were a complete liar, and now you’re bitter. Wa-a-a-a-ah…

    • Lazier_Said
    • 13 years ago

    This is just a press stunt to make the green bar bigger than the red bar when the R600 is released.

    That Nvidia felt a jump from 575 to 612MHz was enough to put them ahead of AMD – when almost any 8800GTX with the standard cooler is good for 650, and they could have cherry-picked 675MHz if not 700MHz for this card if they had felt the need – suggests that along with awfully late, R600 is going to be awfully disappointing in performance as well.

    AMD is so dead.

    • herothezero
    • 13 years ago

    Wow…quite underwhelming. Very pleased with my BFG 8800GTX OC2 purchase last week–even moreso now.

    • maroon1
    • 13 years ago

    I heard that 8800 Ultra consume less power than 8800GTX, is that true ?

      • Jigar
      • 13 years ago

      Please read the review.

    • eitje
    • 13 years ago

    Damage, you seem to be getting a little jaded on this whole “bigger/badder/faster/harder” thing that the H/W manufacturers are pushing. 😉

      • Damage
      • 13 years ago

      Hmm. Not exactly. I like fast new stuff, and there’s an awful lot of room for improvement in graphics between now and, well, 10 years from now, from what I can tell. But I do want advancements to bring solid increases in both capability and value without losing out on usability and stability. That may make me spoiled, but when I’m sitting in front of the 3007WFP running Oblivion at 60 FPS at 2560×1600 with 16X AA and 16X AF, I’m feeling far from jaded. 🙂

        • Jigar
        • 13 years ago

        I am jealous Damage.. give me your address i got to come and take all that high end stuff you got there from your house.. 😉

        EDIT: Oh btw can i have your Rig specs ??

        • eitje
        • 13 years ago

        i’m just harkening back to the quad-core review, where there was a similar tone around the hardware companies that release a product only to trump their own product. it’s humorous, to see a review where you pretty much say “i spent my time on this so, by god, i’m publishing these graphs!!”

    • Jigar
    • 13 years ago

    Where is the news about GPU market shares??? It’s missing from the front page.. just disappeared 😕

      • eitje
      • 13 years ago

      https://techreport.com/ja.zz?comments=12380 it was in MY history. ;)

      • Damage
      • 13 years ago

      Looks like JPR sent us incorrect data. We’re following up with them, but the story has been removed from the front page until we can confirm accurate info. We don’t like to pull stories, but we don’t wish to continue reporting information we believe may be wrong. We’ll follow up ASAP with corrected data.

        • Whybee
        • 13 years ago

        Apparently they just mixed up Intel and Nvidia Q4 market shares.

    • data8504
    • 13 years ago

    duggdededd

    Edit: you know, that looks stupider and stupider the more I look at it. Damn. Stupid 10 char minimum.

    What I meant to say was: dugg.

      • Jigar
      • 13 years ago

      X2

      • eitje
      • 13 years ago

      <space>

    • pdjblum
    • 13 years ago

    Hector Ruiz, with his need for celebrity and power, has brought two companies, ATI and AMD, to their knees, allowing the competition to do whatever they please. At least intel is not raping us blind –far from it actually. Too bad the same can’t be said about Nvidia. Nvidia chose margins over volume, which is far from optimum for those of us who would be willing to spend a small fortune, but not everything we have, on a very high end card.

      • Anomymous Gerbil
      • 13 years ago

      I don’t think they care much about either margins or volume on this card…

      • pdjblum
      • 13 years ago

      I guess what I meant to say is Hector Ruiz should be burned at the stake or at least fired and Nvidia are fraken greedy mofos.

        • SPOOFE
        • 13 years ago

        Why should he be fired if he’s bringing success to the company? Your thinking’s all backwards, boy.

          • pdjblum
          • 13 years ago

          What am I missing? Hector Ruiz, CEO of AMD, took over when the company was already on the upswing, after the release of the AMD64 and Opteron, and enjoyed the spoils that came with those fantastic products. He then spent tons of money building capacity so he could get Dell to sell AMD PCs, and tons of money to buy ATI, all the while alienating the very customers that made AMD successful in the first place and neglecting R&D and product development. He enjoyed hanging with Spielberg and Armstrong and the F1 racing guys instead of hanging with the people that made AMD a success.

            • l33t-g4m3r
            • 13 years ago

            you are missing that spoofe is a troll.

            • SPOOFE
            • 13 years ago

            Or just mistaken this time.

            You sure seem to have a hardon for me. Are you trying to ask me out? Because I don’t date folk I meet online. You find some weird ones out there, you understand.

            • SPOOFE
            • 13 years ago

            My mistake. Getting my Hotshot Executives mixed up.

      • Mithent
      • 13 years ago

      Intel aren’t raping us blind probably because they priced their CPUs well in order to take AMD’s market share, not because they’re feeling altruistic? nVidia can charge what they like for this card because a certain class of person will just buy the top card no matter what the cost, and AMD/ATI don’t have anything DX10 and so won’t be considered by the majority of serious gamers, which this card is obviously aimed at (no, there are no games yet, but if you’re going to spend this much you’d hope it would last for a bit..).

    • Stefan
    • 13 years ago

    Of course, this extreme price tag does nothing to dispel the impression that this is just a preemptive paper launch…

    • JoshMST
    • 13 years ago

    I’m guessing PureVideo 2 will be making it into the 65 nm shrink/improvement in the architecture. Of course, with this release, it appears as though we won’t be seeing those cards until Oct/Nov of this year. To me it is pretty amazing that NVIDIA is being so conservative with their process choices. TSMC has been offering 65 nm since last year (admittedly for much simpler ASICs than a 680 million transistor behemoth). Apparently too much risk with that as compared to the 80 nm process they went with on the G84/G86.

    • deathBOB
    • 13 years ago

    I too am disappointed by the lack of R600 news. I guess that rumor about the NDA being May 2nd was wrong.

    Why is the GTS SLI all over the map? I thought SLI was supposed to be useful…

    • Krogoth
    • 13 years ago

    8800 Ultra = factory overclocked 8800GTX for another $200. It is meant to spoil the HD 2900XTX’s thunder, but the 8800 Ultra will end up being another 6800UE (cherry-picked GPUs = very limited quantities).

    8800GTS 640MB utterly crushes it for performance/price in the high-end segment.

    8800GTX can overclock just as high without too much trouble.

      • madgun
      • 13 years ago

      i concur… in fact if one is so tempted to buy it, he might as well look into the eVGA ACS3 8800GTX available for $650, which performs roughly the same. And you get the option of trade-in when the real deal, the 8900 series, hits the market.

    • Lord.Blue
    • 13 years ago

    This is nice and all, but if they had tagged a $50 price hike to it, it would have made more sense than ~$200. Like everyone has mentioned, it is not worth the money when you can get one doing the same or better for much less.

    • Shinare
    • 13 years ago

    Ok, after reading two reviews, the one here, and another place (I will not mention the name out of respect)… but both places failed to mention that there are GTX cards out there with faster clocks than this, and at $549-599. A BFG OC, an eVGA, and a Foxconn to name a few… so, if this is basically an OC’d GTX… then why exactly are you paying $826 for this thing?

      • Sikthskies
      • 13 years ago

      The people who know that won’t pay for it 😉

      • cobalt
      • 13 years ago

      I don’t think there are any GTXes with higher than 2GHz effective memory clock — the Ultra has 2.16GHz.

      And I don’t think the comparison with factory overclocks was ignored here: “The trouble is that “overclocked in the box” versions of the 8800 GTX are available now with very similar specifications. Take the king of all X’s, the XFX GeForce 8800 GTX XXX Edition. This card has a 630MHz core clock, 1.46GHz shader clock, and 1GHz memory.” (near the bottom of page 1.)

    • Sniper
    • 13 years ago

    In my opinion, Supreme Commander is pretty awful, graphics wise. The art is surprisingly low-poly, there are simple shadows for units, and there’s some generic terrain and water.

    As far as the graphics engine goes, I think GasPowered Games has one of the most poorly implemented engines. Other game developers can do better with the hardware available.

      • Krogoth
      • 13 years ago

      ROFL, Supreme Commander graphics are pretty good and make sense for an RTS game. The strategic overview is like playing a game of Axis and Allies. It is a lot easier to manage and see a bunch of icons rather than some tiny, difficult to see models.

      GPG could not use high-poly count models, because Supreme Commander uses hundreds if not thousands of “active,” independent models. Every model has its pathfinding and physics calculations etc. You would need the power of a render farm in order to do the same task with high-poly, high-quality models.

      FPS like Quake 4, FEAR and Half-Life 2 get around it by only using a few dozen active models at any given time. There are also tricks like “disappearing corpses”. Try running the same FPS with hundreds of active baddies and watch your GPU and CPU get hammered.

      Serious Sam series is the only modern FPS which did use tons of active models. The developers had to do some tradeoffs like making the AI dumb, using low-polycount models and a LOD algorithm.

      The point is that there is no CPU and GPU that can handle a “LOTR size battle (high quality models, thousands of active models)” in real-time. GPG took the more realistic approach with modern hardware by making tradeoffs and using an advanced LOD algorithm.

      C&C3 on the other hand is just Generals with a face-lift in environmental effects, but the models themselves are just slightly more complex than those found in Supreme Commander. C&C 3 isn’t as much of a system hog, because the game’s maps and battles are a lot smaller and quicker than anything in Supreme Commander. C&C 3 can still bring down a high-end rig if you try to do epic-size battles like a late-game session of Supreme Commander.

        • DASQ
        • 13 years ago

        Except it runs like crap. What’s the point of an awesome engine that you have to drop down to virtually bottomed-out quality to run smoothly? By the time a single card can chew it up, it’ll look like crap compared to what’s out at that time.

        It’s just a crappy engine. Thumbs down.

          • Krogoth
          • 13 years ago

          Neither of us are programmers by trade nor able to see the source code, so it is unfair for us to say that the engine is “Th3 SUCKZ!”.

          I think the problem is that power users are utterly shocked when a gaming title comes along and utterly crushes their epenis. The same power users are so spoiled and used to the twitch FPS titles of late that a slow-paced RTS running at 20-30FPS seems inadequate. When in fact it is more than sufficient for an enjoyable gameplay experience.

          Oblivion got a lot of flak at release due to its steep GPU requirements. It still managed to be a fairly popular title, despite having some shortcomings in gameplay.

          Supreme Commander just needs a half-decent dual-core CPU and GPU to be playable. It does not need a QX6800 or 8800GTX. Single-core chips don’t cut it for any epic battle. Dual-core chips are finally affordable for the mainstream market from either CPU camp.

          Supreme Commander is the first game for which I would recommend a dual-core chip. I remember a few years back when dual-cores first came out. Enthusiasts were complaining that it would take forever for any game to take advantage of SMP. Well, my friends, we finally have a title that does take advantage of SMP.

            • l33t-g4m3r
            • 13 years ago

            edit: question: does supreme commander work with smp in xp64?
            because I don’t see it with process explorer.

            • Krogoth
            • 13 years ago

            Yes, the game will and does use both cores in x64. The problem is that another running program like an A/V scan tends to eat up CPU2. You have to make sure that no other demanding programs are open while you are playing the game.

            I had run into the “why does the game only use 50%?” problem. I was surprised to discover when I disabled my F@H clients that my minimum FPS went up like 80%.

            What SMP is:

            http://en.wikipedia.org/wiki/Symmetric_multiprocessing

            • l33t-g4m3r
            • 13 years ago

            sorry, accidental mouse gesture in opera made me post before I completed my question.
            please read it again. thanks.

            • l33t-g4m3r
            • 13 years ago

            Huh, I can’t get it to work for some reason, wonder if supreme commander needs to be 64-bit?
            Maybe driver conflict?
            Don’t really have any programs eating resources of the second core.
            (using process explorer to view graphs.)

            • DASQ
            • 13 years ago

            Well, my E6300 @ 3.43GHz and my OC’d 7950GT beg to differ.

            Playing in a window on 1280×1024, on a four player skirmish map with absolutely bottomed out graphics (No AA, no AF, nothing), it runs like absolute garbage (I’d say maybe 1FPS, if not lower) after about 15 minutes of the AI building up their units.

            And it slows down even though it’s all covered by the FoW… it seems like the engine renders absolutely everything on the map regardless of whether or not it’s visible.

            • Krogoth
            • 13 years ago

            You clearly have a configuration problem and the CPU is the bottleneck. Are you sure that you aren’t running any other demanding programs?

            The lowest FPS that I get is 10FPS, and that is only on one of the 81×81 maps with 8 AI players, a 1000 unit count, and an hour of game time passed by.

            (E6300@3.0GHz, 2GBs of PC2-6400 memory, X1900XT) This is using maximum details with AA at 1280×1024.

            • DASQ
            • 13 years ago

            No, there are no background programs running other than my standard ones.

            It’s just seemingly worse in windowed vs. fullscreen as well… even though the windowed is a good chunk lower resolution (1280×1024 vs. 1920 x 1080).

            Other games run just fine, my machine chews through Oblivion just fine, R6V doesn’t run too well, but that’s mostly due to it being a bad, bad console port (great frames in some situations, but random drops for no good reason! Yaaayyy!)

            The median low listed in the review is also rather typical of Supreme Commander… on an 8800GTX. That’s only 1280×1024 as well :/

            • swaaye
            • 13 years ago

            Oh, come on now. A brief visit to the game’s forum will explain to you just how horribly buggy the game is. Performance can vary wildly on rigs with similar hardware.

            One real nasty bug is with the audio. They used some new MS sound API (XACT I think it is) that is causing some major HDD I/O access on some systems. Very odd stuff. Some forum posters that seem to be game devs themselves blame mis-sized buffers, too.

            - AI is very, very broken. Commander suicides are one example. Modders are having trouble fixing things/tweaking because the bugs are in the code, or some such.
            - Nukes will fly around the map, aimlessly, till they blow up something random (like the launcher’s base).
            - There very frequently is a massive delay in unit response thanks to some sort of processing queue they use.. I have had super units stand still and get blown up, ignoring all of my commands to move/fight/whatever.

            I’m very disappointed in Sup Comm. TA is certainly the better of the two.

            • Krogoth
            • 13 years ago

            The audio problem is a bit of a mystery to me. I never experienced it myself with an Audigy 2 ZS and an aging Turtle Beach Santa Cruz. I suspect the problem lies with non-Creative audio solutions (DA BLUE KRAB), which is hardly a surprise.

            I never had any problems with strategic nukes. They are supposed to fly above and around the map like a real-world ICBM. The long range artillery shots arc around a bit at medium to short distances before the dead zone. It is called ballistics. 😉

            Pathfinding is a known problem, because the CPU is held back by other calculations. I have only experienced pathfinding problems when I try to order a huge army (200+ ground forces) to move to one single location. None of the ground experimentals are fast or nimble. You need to use some forethought in order to move your units. 😉

            This isn’t a twitch RTS like Starcraft and C&C where ground forces can move and turn on a dime. TA itself was very infamous for pathfinding issues! Ships and huge armies were the worst offenders.

            AI in Supreme Commander isn’t that great, but none of the non-cheating AIs in any RTS were great either. It is because current AIs lack the ingenuity and adaptability of a thinking person. The only way to make an AI challenging is to give it ridiculous advantages; infinite resources, 4x production speed, etc.

            • swaaye
            • 13 years ago

            I’m not even referring to pathfinding specifically, although it is obviously related. When I can’t get a unit to move for several minutes, I see it as a game killing bug. My favorite SupCom moment so far was saying goodbye to a Colossus bot because it was standing in front of an oncoming army, not moving or firing. At best, I could get it to turn slightly. LOL.

            And I wondered why the AI didn’t show up until the game damn near was on the shelf.

            TA was never, ever remotely this bad. Maybe units would go in circles around things, but they never just sat there and blinked for 3-5 minutes. Even with thousands of units running around.

            And, perhaps cheating AI is “cheap”, but at least it can give you and your 3 friends a major run for your money instead of the pushover “supreme” AI in SupCom. I love a run with a few teamed Banzai AIs w/ hard boost. That is a rush in a game with your buds.

            I’ve played a few skirmishes and watched what the SupCom AI does. Usually, it does very little actually. In fact, there are times when the AI never attacks at all. Or does so in such pathetic ways that it’s laughable (small groups of lvl1 crap). Rarely do you really get pounded, except on small maps where it does some interesting things. In MP especially, the game just chugs slower and slower, but nothing really happens!

            The AI primarily has caused me to put the game down until it either gets modded or patched. I play RTS games as coop experiences with friends, vs. the AI. SupCom doesn’t have that at all right now. I also have seen my framerate go from ~25fps to <1fps while watching one of those heavy ground crawlers get fired on by an army. It wasn’t even all that intense. C2D 3.1 GHz, 8800GTX. Nice coding there.

            It’s almost as if they went in a fundamentally wrong direction with how the game balances itself. The queuing for units is obviously flawed. Never mind the weird hardware quirks, like the audio deal or whatever the hell caused me to see <1fps on a 3.1 GHz C2D w/ 8800GTX…..

            I also get the impression that single core users really, really get screwed over. I have a Pentium M 2.13 GHz notebook w/ 2GB RAM and a 7800 GTX. I run a mere 1440×900. It runs ok at the start, but wow does it go downhill. Considering most of what SupCom does isn’t much of an improvement over TA (gameplay-wise) and that TA is entirely software rendered, I’m not impressed.

            BTW, I haven’t played StarCraft since probably 1999. Aside from some single player stints on some retro PCs I’ve built lately. On the other hand, TA and several TA TCs are on my HDD right now. I’ll most likely be playing that this Friday on the LAN, actually.

            If you want to see a twitch RTS, try out C&C3. Yikes.

            • Krogoth
            • 13 years ago

            Those are third party AIs. Supreme Commander will likely get a few of them in time. 😉 The default AI in TA was very stupid, more so than the “Supreme AI” in Supreme Commander.

            The default AI in OTA would attack with a nice size army, but the second the unit in front gets hit by one of the defense turrets, the army retreats in full force when it could at least inflict some damage. The AI uses the worst build order ever. I almost always see it trying to build several Vulcans and Annihilators without having the proper energy production to support them. o_0

            The Supreme AI’s problem is that it only attacks with either T1 units or experimental units, nothing in between. The only thing the Supreme AI knows how to do well is build one hell of a base defense, which can stop a large wave of siege assault bots or one of the ground experimentals. I have to win by using attrition, spamming nukes, T3 artillery, or experimental units.

            GC isn’t the best example as it is the second slowest unit in the darn game! (It has the worst acceleration and turning rate, though.) GC is a walking juggernaut that is meant to just walk straight through anything in its path. You cannot keep clicking a quick turnaround at the last second and expect it to move on a dime. GC isn’t quite like the Krogoth. It is more like OTA’s “The Can” on steroids, a.k.a. a lumbering giant hitpoint tank that is lethal to anything that gets close enough. Although, the Aeon’s main strength isn’t their experimental units. 😉

            • swaaye
            • 13 years ago

            Yes yes, but the Colossus would not move at all. Not whatsoever. No firing. The legs were totally stationary. LOL. And no this isn’t an isolated incident. I am stunned that you are arguing with me here. Have you really never run into this unit response delay? I get it in every heavy MP game I play on even a LAN. It’s also all over the game’s forum.

            Units plain will not move at all. They won’t fire. They do absolutely nothing until the “queuer doodad” gets to them, or whatever’s happening….

            I agree that before Core Contingency, TA was a rather empty game too. We really do need more units and a lot of fixes. Maybe in a year or so, Sup Com will be a worthy successor to TA.

            I am particularly concerned over the complaints modders have voiced about AI bugs being deep in the code where they can’t alter them at all. That is why TA hasn’t gone farther than it has with mods. They just can’t reverse engineer the EXE well enough, and Infogrames (or whichever company it is these days) is a greedy monolith who won’t budge on the source code. Ah well.

            • Krogoth
            • 13 years ago

            I only encountered unit responsiveness issues in extreme cases (huge 81×81 map, tons of active units on the map, 1+ hour game time, trying to give tons of units the same order, etc). I suppose it rarely happens with me since by the time an hour of game time has passed, over half the AI players would already have been killed by my hands.

            The next patch supposedly will do some heavy optimizations that might resolve most of the pathfinding issues.

            • Bensam123
            • 13 years ago

            d00d, every game I’ve played with more than two supreme AIs and more than four total players with a 500 unit cap results in heavy lag. It usually kicks in about 30 mins into the game. When you can watch 1 second pass on the clock and 5 seconds pass in real life you know something is up.

            Anyone who experiences this lag and has a dual core processor should take a look at the windows processor graph. It goes from using both the cores pretty evenly at the beginning of the game to one core running at 100% and the other running at about 15%. The longer you play the less balanced it becomes. 7 sup AIs, a 1000 unit cap, and 2x resources will bring this up a lot faster.

            Systems I’ve actually watched having these problems IRL:

            P4 3.0Ghz, 1GB memory, x800XL
            Intel 640, 2GB memory, x1800XT
            AMD 64 4400+ x2, 2GB memory, 7800GT
            AMD 64 4600+ x2 AM2, 4GB memory, 7900GT
            Intel C2D 6300, 2GB memory, 8800GTS

            No AVS on any of the systems and they’re clean.

            • Krogoth
            • 13 years ago

            I did suggest that the reason why it doesn’t happen to me is that I usually kill off most of the AI players before they get huge. It is the last two players that manage to get a fully built-up base. At that point I have to resort to T3 artillery, nukes, experimental spam, or wave after wave of SABs. I never encountered any zombie unit response in these instances.

            • tfp
            • 13 years ago

            y[

            • swaaye
            • 13 years ago

            Yeah I’ve read that on the forums. All comps share the load. In practice, I’ve even seen the “host” computer crash and the game continues on without problem. In TA, that meant the game was finished. With TA, I used to set up a rig to host an AI all by itself to cut the load. 🙂

            The problem is that most people still don’t have dual cores and even the fastest CPUs, even quad cores, don’t run this game very well when things get intense.

            • tfp
            • 13 years ago

            I didn’t read it as splitting the load; I read it as everyone runs all of the calcs, then it checks, and if they match it keeps running. So when you see the desync issues, some of the calcs didn’t match up.

            I don’t think it is set up as a distributed computing effort where the calculations are split up between the different machines playing the game. Were that the case, the game should run better in multiplayer (most of the time) than it does in single player.
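
            That reading matches how deterministic lockstep RTS networking generally works: only order packets cross the wire, every machine runs the full simulation from those orders, and periodic checksums of the game state catch divergence (a mismatch is the desync). A toy sketch of the idea, purely illustrative rather than SupCom’s code:

            ```python
            import hashlib

            class Sim:
                """Toy deterministic simulation; real games hash far richer state."""

                def __init__(self):
                    self.state = 0

                def apply_orders(self, orders):
                    # Same orders in, same state out, on every machine.
                    self.state += sum(orders)

                def checksum(self):
                    return hashlib.sha1(self.state.to_bytes(8, "little")).hexdigest()

            def lockstep_tick(sims, orders):
                # Every machine simulates EVERYTHING; nothing is divided up,
                # so extra players don't lighten anyone's CPU load.
                for s in sims:
                    s.apply_orders(orders)
                if len({s.checksum() for s in sims}) > 1:
                    raise RuntimeError("desync: simulations diverged")

            # Three "machines" stepping in lockstep stay in agreement.
            lockstep_tick([Sim(), Sim(), Sim()], orders=[1, 2, 3])
            ```

            It also squares with swaaye’s observation that the game survives a host crash: under lockstep the host mostly relays orders rather than owning the simulation.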

            • JoshMST
            • 13 years ago

            Something sounds goofy there. My buddy and I play the game and we often get up to 1K units. He has an X2 3600 running at 2.85 GHz with a 7900 GTX at 1600 x 1020 (no AA but 16X AF). He does not experience slowdowns at all. I have also played with multiple AIs on that machine, and it doesn’t exhibit that behavior either.

            • bthylafh
            • 13 years ago

            OT: what kind of cooler and RAM do you have, that you got such a good OC from your 6300?

            • DASQ
            • 13 years ago

            Scythe Ninja with 2 x 120mm fans in push-pull, and an OCZ 2x1GB DDR2-800 Platinum XTC Rev. 2 kit on a Gigabyte DS3 (1:2 RAM divider). It does get somewhat hot with the 1.57V I slap into it, but it’s perfectly stable.

      • Bensam123
      • 13 years ago

      I agree… GPG’s games aren’t very pretty.

      I personally like how the ground just looks like one big flat texture. It has little to no depth to it. SupCom is hardly a game for taxing graphics; it’s more along the lines of making a CPU choke and upchuck its internals before graphics even come into play.

      • albundy
      • 13 years ago

      Totally agree. Not worth wasting this kinda horsepower on. Maybe on a GF 4 and up.

    • Shinare
    • 13 years ago

    It certainly makes me wonder what kind of corporate intel nVidia has gotten for them to push their flagship product a little bit further just before R600 launches… Now I really AM looking forward to seeing what DAMMIT has to offer…

    • pluscard
    • 13 years ago

    It would appear the 2900 XT is already generating competition…

    So far, everyone thinks the 8800 Ultra is overpriced…

    Did anyone read the comment on the DailyTech article saying their XTX sample used only GDDR3 RAM?

    That would explain why the performance was similar to the XT.

    • BoBzeBuilder
    • 13 years ago

    You’d have to be an idiot to buy this card. Nvidia’s gone nutz.

      • Jigar
      • 13 years ago

      Nope, Nvidia is not nuts… ATI is not giving them any competition.

        • PetMiceRnice
        • 13 years ago

        Indeed, and it is a reminder to the fanbois of the world who want one company or another to die off. Loss of competition = higher prices and less innovation.

          • l33t-g4m3r
          • 13 years ago

          It’s the perfect price for people dumb enough to pay for it.

            • SPOOFE
            • 13 years ago

            Wealthy people are dumb now?

            Sounds like someone’s jealous. That’s not very leet, now, is it?

            • BoBzeBuilder
            • 13 years ago

            Not this discussion again. Wealthy people didn’t get wealthy by burning their money on crap. There is absolutely no excuse for anyone to pay $200 more for this card; you can get the BFG water-cooled and overclocked version, which is probably as fast as this card, for less money.

            • SPOOFE
            • 13 years ago

            Whatever. Wealthy people buy expensive crap because they’re wealthy. All of them? Hell no. The wealthiest people I know still use Pentium IIIs. 🙂

            But there are enthusiasts with money to burn. You don’t need to be a multi-millionaire to buy one of these. Hell, you don’t even need a six-digit salary. You just need to…

            • albundy
            • 13 years ago

            LoL! so true. cheap wealthy people didn’t become wealthy by spending on useless cr@p that will become obsolete in a month. On the other hand, smart wealthy people can deduct things like this off their corporate taxes. hehe.

            • NeXus 6
            • 13 years ago

            It’s called “e-peen.”

            • SPOOFE
            • 13 years ago

            No different than folks that buy an Enzo or Ferrari, really. Or people that go to a club and spend three grand on a bottle of Grey Goose.

            • eitje
            • 13 years ago

            actually, some of them probably inherited the money, or won it through some lottery system. 🙂

            • l33t-g4m3r
            • 13 years ago

            If I really wanted to, I’d buy it.
            I’m just not a sucker. Or a shill for Nvidia.

            • SPOOFE
            • 13 years ago

            Me neither.

            But there’s no denying that it’s the fastest stock-clocked video card in the world… and probably a decent choice for people with large monitors. I mean, only a fool would deny – or through deliberate inaction avoid admitting – those minor observations.

            😀

      • NeXus 6
      • 13 years ago

      It doesn’t matter how much this card costs because it will be in very limited supply. ANYONE who does manage to get one is just looking to increase their e-peen status.

        • BKA
        • 13 years ago

        Here’s an idea; I might just be crazy or something, but here goes.

        If you like the card and have the money, buy it. Be proud and play games until you can’t anymore.

        If you don’t like it, don’t buy it and leave it at that.

        *Unless, of course, they borrowed the money from you, told you it was for immediate mandatory surgery, and instead bought an 8800 Ultra.

        If it doesn’t come out of your pocket, why would anyone care?

        Just a concept here, guys.

          • NeXus 6
          • 13 years ago

          An overclocked 8800GTX will deliver the same results. The 8800 Ultra is all about e-peen. Why else would you buy one?

            • SPOOFE
            • 13 years ago

            He explained why.

            Why do you care?

            • NeXus 6
            • 13 years ago

            Never said or implied that I cared.

            • SPOOFE
            • 13 years ago

            You cared enough to ask why someone would buy this. So why do you care enough to ask why someone would buy this? It’s their money. Let ’em waste it if they want.

            • NeXus 6
            • 13 years ago

            So…why do you care that I asked why someone would buy one? Who or what are you defending? And WHY!!???

            • SPOOFE
            • 13 years ago

            Morbid curiosity and psychological fascination.

            • Anomymous Gerbil
            • 13 years ago

            Edit: aaah, who cares.

            • NeXus 6
            • 13 years ago

            Apparently somebody with e-peen does. The need to impress everybody with their latest and greatest hardware purchase is too good to pass up. The inner child in them just can’t resist the urge to splurge.

            • SPOOFE
            • 13 years ago

            I notice more commentary from the people with some urgent drive to attack the e-peen crowd than from the e-peen crowd themselves. I really wonder if the whole e-peen thing is just a misperception of the unreasonably insecure.

            • NeXus 6
            • 13 years ago

            Nah, it’s just pointing out people that have more money than brains. Or, to put it more delicately for you, people that have no sense of value. Yes, who cares how people spend their money. Let stupid people be, well…stupid.

            • SPOOFE
            • 13 years ago

            You really can’t stand that there are people that can get something slightly nicer than what you have, can you? It’s sad.

            • NeXus 6
            • 13 years ago

            Sad for who? You? Oh, and keeping this for real, how is an 8800 Ultra nicer than an overclocked 8800GTX? They perform the same, but one costs $300+ more.

            • totoro
            • 13 years ago

            Don’t feed the trolls.

            • Bensam123
            • 13 years ago

            But isn’t this thread just an e-peen war for people that can’t afford a real e-peen?

            Trollers be damned, who’s gonna win the ghetto e-peen war??!?!?!

    • Fighterpilot
    • 13 years ago

    Nice card, crazy fast in games.
    At $800+ tho, it’s way expensive.
    Good review, TR.

    • Dposcorp
    • 13 years ago

    Quick and concise review.
    Looks like a simple respin and clock increase, but thanks for taking the time to do a review, Scott.

    • Ricardo Dawkins
    • 13 years ago

    Oh no… where is the Radeon?
    Is this what happens when there is no competition?

      • Jigar
      • 13 years ago

      At AMD’s factory?? I guess. :rolleyes:

    • Jigar
    • 13 years ago

    Too cold to buy… I would rather pick a factory-OCed GTX (and still OC it some more, heh) and save my $200…

    • Thresher
    • 13 years ago

    Hahahahahahahah.

    They are out of their frickin’ minds. I can understand the need for a “halo” product, but this is nuts.

    • Bet
    • 13 years ago

    Aw nuts. Was hoping for R600 benchmarks this morning! Ridiculous price for this card, too; still hoping the R600s can drive down the price of the GTX some. Fleeting hope.

    • flip-mode
    • 13 years ago

    It’s monumentally disappointing to me that the HD video processing available on the lower-end cards isn’t part of this super-high-end card’s profile. That’s a fistful of sand in the face and then a kick straight to the nuts.

    FWIW, I’d much rather be reading a TR review of an 8500 – you know, something a few of us will actually buy.

      • LoneWolf15
      • 13 years ago

      Absolutely. If they were going to respin the silicon, why not add it to the new Ultra? It would be another selling point for those who have to have the best.

      Without it, it seems to me that it’s just an overclocked 8800 GTX.

        • Lord.Blue
        • 13 years ago

        That’s because it is.

      • coldpower27
      • 13 years ago

      Because it’s still on the 90nm process; it’s just a new stepping on the same process, like AMD did with the F2-to-F3 stepping of its 90nm dual-core K8.

        • flip-mode
        • 13 years ago

        That doesn’t change my point though. They could even have used a separate video processing chip for all I care.
