Nvidia’s GeForce 8800 GTS 512 graphics card

Okay boys and girls, it’s time once again for the introduction of a new video card, a happy ritual performed multiple times each year during the lead-up to Christm… err, Holiday. The video card making its debut today is the GeForce 8800 GTS 512, a brand-new big brother to the GeForce 8800 GT, which is selling like a Nintendo Wii infused with the essence of Cabbage Patch Kids, or something like that.

The 8800 GTS 512 looks to supplant older versions of the GeForce 8800 GTS by offering a more attractive mix of price, performance, capabilities, and power consumption. So, you know, it’s the same old boring story in the world of GPUs. The 8800 GTS 512 is interesting in that it has, more or less, no direct competition from the Radeon camp, and its most potent competitor may be its little brother, the 8800 GT. Is there any reason to spend the extra to reach up to the new GTS? Let’s take a look.


Meet the new GTS, different from the old GTS

Despite the name, the GeForce 8800 GTS 512 is actually quite a different animal than previous versions of the GeForce 8800 GTS. If Nvidia were facing more competition at the high end of the video card market, this product would probably be called the GeForce 8950 GTS or something like that. The new card is based on the same G92 graphics processor that lies under the slim cooler of the GeForce 8800 GT. The G92 traces its design lineage to the G80 GPU that powered the original 8800 GTS cards, but it’s substantially revised, with a new 65nm chip fabrication process, a PCI Express 2.0 interface, more of some things, and less of others.

The most obvious outward indication that the 8800 GTS 512 is a new product, of course, is the amount of memory it carries. Nvidia has decided to key on that fact and let the “512MB” label after the name denote the newer product. Subtle to a fault, I suppose, because the 8800 GTS 512 is a better product in some notable ways.

At its default speeds, the 8800 GTS 512 packs quite a wallop. The G92 GPU in the 8800 GTS 512 has four ROP partitions capable of outputting four pixels each, for a total of 16 pixels per clock. Each ROP partition has a 64-bit path to RAM attached, yielding a total memory pathway that’s 256 bits wide. And the GTS 512 has a total of 128 stream processors clocked at 1625MHz.

So what do those numbers mean? Well, in several cases, like pixel output capacity and memory bandwidth, the new GTS 512 trails the old GTS on a per-clock basis. However, the GTS 512 tends to make up any deficits by running at higher clock speeds. Here’s how some of the key stats look once you multiply per-clock capacity by clock frequency.

                         Peak pixel   Peak bilinear    Peak bilinear    Peak memory   Peak shader
                         fill rate    texel filtering  FP16 filtering   bandwidth     arithmetic
                         (Gpixels/s)  (Gtexels/s)      (Gtexels/s)      (GB/s)        (GFLOPS)
GeForce 8800 GT          9.6          33.6             16.8             57.6          504
GeForce 8800 GTS         10.0         12.0             12.0             64.0          346
GeForce 8800 GTS 512     10.4         41.6             20.8             62.1          624
GeForce 8800 GTX         13.8         18.4             18.4             86.4          518
GeForce 8800 Ultra       14.7         19.6             19.6             103.7         576
Radeon HD 2900 XT        11.9         11.9             11.9             105.6         475
Radeon HD 3850           10.7         10.7             10.7             53.1          429
Radeon HD 3870           12.4         12.4             12.4             72.0          496

The 8800 GTS 512 beats the original 8800 GTS in all respects but peak memory bandwidth, where the two are very close. And the GTS 512 even surpasses the GeForce 8800 GTX in texture filtering capacity and peak shader arithmetic. That bodes very well for its performance overall.
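If you’d like to double-check that arithmetic, the peaks above are simply per-clock capacity multiplied by clock speed. Here’s a minimal sketch in Python; the reference clocks and the three-FLOPS-per-SP accounting (a MAD plus a MUL per clock) are our assumptions about the stock specs, not hot-clocked retail cards:

    # Peak-rate arithmetic behind the table: per-clock capacity times clock speed.
    def peak_rates(pixels_per_clk, filter_units, sps, core_mhz, shader_mhz,
                   bus_bits, mem_mhz):
        return {
            "pixel fill (Gpixels/s)": pixels_per_clk * core_mhz / 1000,
            "bilinear filtering (Gtexels/s)": filter_units * core_mhz / 1000,
            "shader arithmetic (GFLOPS)": sps * 3 * shader_mhz / 1000,     # MAD + MUL
            "memory bandwidth (GB/s)": bus_bits / 8 * mem_mhz * 2 / 1000,  # DDR signaling
        }

    # GeForce 8800 GTS 512: 16 pixels/clock, 64 filter units, 128 SPs, 256-bit bus
    print(peak_rates(16, 64, 128, 650, 1625, 256, 970))  # ~10.4, 41.6, 624, 62.1
    # Original 8800 GTS: 20 pixels/clock, 24 filter units, 96 SPs, 320-bit bus
    print(peak_rates(20, 24, 96, 500, 1200, 320, 800))   # ~10.0, 12.0, 346, 64.0

Both print statements reproduce the corresponding rows of the table to within rounding, which is a decent sanity check on the “higher clocks make up for narrower hardware” story.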

The one caveat, if there is one, is that the GTS 512 doesn’t offer much more pixel fill rate or memory bandwidth than the GeForce 8800 GT. Thanks to an extra SP cluster, the GTS 512 does have more texture filtering and shader capacity than the GT, but those advantages may not always bring better performance. The G92’s ratio of shader and texture processing capacity to pixel output and memory bandwidth is much different than the G80’s. As a result, pixel fill rate and memory bandwidth are more likely to be the primary bottlenecks, which could make it hard for the GTS 512 to separate itself from the 8800 GT.

If that egghead math stuff doesn’t float your boat, you’d probably still want to pick the 8800 GTS 512 over the original GTS in order to see the pretty pictures from HD movies. Unlike G80-based cards, the 8800 GTS 512 supports HDCP over both links of a dual-link DVI connection—necessary for HD movie playback on very high resolution displays—and it includes Nvidia’s VP2 video processing unit that provides hardware assistance with H.264 decoding. We found that the VP2-equipped GeForce 8600 GTS consumes quite a bit less CPU time during H.264 video playback than the G80-based GeForce 8800 GTX.

The cards

Physically, GeForce 8800 GTS 512 cards look very similar to their predecessors.



EVGA’s take on the 8800 GTS 512



A single SLI connector means two-way SLI is probably the limit

The board is 9″ long, with a dual-slot cooler that reaches that full length. The GTS 512’s funky cooler places the blower at an angle from the card, presumably to allow room for air intake when the card is nestled up against another one. In front are a pair of dual-link DVI ports and an HDTV-out port. Around back is a single six-pin PCIe auxiliary power connector. Nvidia rates the card’s power use at 150W, so it’s just able to make do with a single aux plug: a PCIe x16 slot can supply up to 75W, and a six-pin connector another 75W, for exactly 150W. Like the GTS before it, the GTS 512 has just one SLI connector per card, not two like on the GeForce 8800 GTX. That means exotic three- and four-way SLI configurations probably won’t be possible.

The EVGA card pictured above is selling for $359 at Newegg with a bundled copy of Crysis. Like many GeForce cards, this one runs at higher clock speeds than Nvidia’s recommended baseline. The core clock is 670MHz, the shader clock is 1674MHz, and the memory clock is stock at 970MHz.


And here’s XFX’s take on the 8800 GTS 512. XFX plans several versions of the 8800 GTS 512, including a stock-clocked model for $349 and the one pictured above, the XXX version, which has a 678MHz core, 1728MHz shaders, and 986MHz memory, for $379. Our XXX card came with a bundled copy of Lost Planet: Extreme Condition, which means EVGA wins, in my book. Crysis is a much better game. However, XFX seems to have upped the ante by offering a stock-clocked version of the GTS 512 packaged with Company of Heroes for $369 at Newegg.

Overall, these prices are just a little lower than the going rate for a GeForce 8800 GTS 640MB. As we’ve mentioned, the GTS 512 doesn’t really have any direct competition from AMD, unless you count running a pair of Radeon HD 3850 256MB cards in CrossFire. A couple of those cards would cost about the same as an 8800 GTS 512. However, their effective memory size would be only half that of the GTS 512. We’ve tested this config to see how it compares. Although it’s a more expensive configuration, we’ve also tested a pair of Radeon HD 3870 512MB cards in CrossFire.

Since the EVGA card has the more conservative clock speed of the two GTS 512 cards we had on hand, we decided to use that one in single-card testing. We then paired it up with the XFX for SLI.

Test notes

You may notice that I didn’t engage in a lot of GPU geekery this time around. I decided instead to focus on testing these new video cards across a range of the amazing new games and game engines coming out. These GPUs are basically “refresh” parts based on existing technology, and their most compelling attributes, in my view, are their incredibly strong price-performance ratios.

In order to give you a better sense of perspective on the price-performance front, I’ve included a couple of older video cards, in addition to a whole range of new cards. Roughly a year ago, the Radeon X1950 Pro faced off against the GeForce 7900 GS at $199. This year’s crop of similarly priced GPUs has some substantial advantages in terms of specifications and theoretical throughput, but as you’ll see, the gains they offer in real-world performance are even larger, and they do it while delivering image quality that’s sometimes quite noticeably superior to last year’s models, as well.

You’ll find results for both the X1950 Pro and the 7900 GS in several of our gaming tests and in our power and noise measurements. I’ve had to limit their participation to scripted benchmarks because these cards were generally too slow to handle the settings at which we tested manually with FRAPS.

That leads me to another issue. As I said, the older cards couldn’t handle some of the settings we used because, well, they’re quite intensive, with very high resolutions, quality levels, or both. We tested at these settings because we wanted to push the cards to their limits in order to show meaningful performance differences between them. That’s hard to do without hitting a CPU or system-level bottleneck, especially with cards this fast running in multi-GPU configurations. We did test at multiple quality levels with a couple of games in order to give you a sense of performance scaling, which should help.

Also, please note that many of the GeForce cards in the tables below are clocked at higher-than-stock speeds. Nvidia’s board vendors have made a practice of selling their products at multiple clock speeds, and some of our examples are these hot-clocked variants. For instance, the 8800 GTS cards all have 575MHz core clocks (580MHz in the case of the one XFX 320MB card) and correspondingly higher shader clocks. Obviously, that’s going to change the performance picture. We think it makes sense to include these cards because they’re typically fairly plentiful and available for not much of a premium over stock-clocked versions. They’re what we might buy for ourselves.

The one exception to that rule, at least right now, may be the GeForce 8800 GT. The first wave of these cards looks to have sold out at many online vendors, and all variants are going for something of a premium right now—especially the higher clocked ones. We have included one “overclocked” version of the 8800 GT (from MSI) in our tests in order to show you its performance. This card is very fast, but be aware that it is not currently a $199 or even a $249 option.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

(Where the two test systems differ, entries are listed as nForce 680i SLI system / X38 system.)

Processor: Core 2 Extreme X6800 2.93GHz
System bus: 1066MHz (266MHz quad-pumped)
Motherboard: XFX nForce 680i SLI / Gigabyte GA-X38-DQ6
BIOS revision: P31 / F5h
North bridge: nForce 680i SLI SPP / X38 MCH
South bridge: nForce 680i SLI MCP / ICH9R
Chipset drivers: ForceWare 15.08 / INF update 8.3.1.1009 with Matrix Storage Manager 7.6
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 4
RAS to CAS delay (tRCD): 4
RAS precharge (tRP): 4
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated nForce 680i SLI/ALC850 / integrated ICH9R/ALC889A, both with RealTek 6.0.1.5497 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x86 Edition
OS updates: KB36710, KB938194, KB938979, KB940105, DirectX August 2007 Update

Graphics cards tested on the nForce 680i SLI system:
- XFX GeForce 7900 GS 480M 256MB PCIe with ForceWare 169.01 drivers
- Dual XFX GeForce 7900 GS 256MB PCIe with ForceWare 169.01 drivers
- GeForce 8800 GT 512MB PCIe with ForceWare 169.01 drivers
- Dual GeForce 8800 GT 512MB PCIe with ForceWare 169.01 drivers
- MSI NX8800 GT TD512E 512MB PCIe with ForceWare 169.01 drivers
- XFX GeForce 8800 GTS XXX 320MB PCIe with ForceWare 169.01 drivers
- XFX GeForce 8800 GTS XXX 320MB PCIe + MSI NX8800GTS OC 320MB PCIe with ForceWare 169.01 drivers
- EVGA GeForce 8800 GTS 512MB PCIe with ForceWare 169.06 drivers
- EVGA GeForce 8800 GTS 512MB PCIe + XFX GeForce 8800 GTS 678M 512MB PCIe with ForceWare 169.06 drivers
- EVGA GeForce 8800 GTS SC 640MB PCIe with ForceWare 169.01 drivers
- Dual EVGA GeForce 8800 GTS SC 640MB PCIe with ForceWare 169.01 drivers
- MSI GeForce 8800 GTX 768MB PCIe with ForceWare 169.01 drivers
- Dual GeForce 8800 GTX 768MB PCIe with ForceWare 169.01 drivers

Graphics cards tested on the X38 system:
- Radeon X1950 Pro 256MB PCIe with 8.43 drivers
- Dual Radeon X1950 Pro 256MB PCIe with 8.43 drivers
- Radeon HD 2900 XT 512MB PCIe with 8.43 drivers
- Dual Radeon HD 2900 XT 512MB PCIe with 8.43 drivers
- Radeon HD 3850 256MB PCIe with 8.43 drivers
- Dual Radeon HD 3850 256MB PCIe with 8.43 drivers
- Radeon HD 3870 256MB PCIe with 8.43 drivers
- Dual Radeon HD 3870 512MB PCIe with 8.43.1.071115a drivers

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to those of no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Enemy Territory: Quake Wars

We’ll start with Quake Wars since this game’s simple “nettimedemo” allows us to record a gaming session and play it back with precise repeatability on a range of cards at a range of resolutions. Which is what we did. A lot.

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level,” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though, to be somewhat merciful to the 256MB and 320MB cards. Shadows, soft particles, and smooth foliage were enabled where possible, although the Radeon X1950 Pro wasn’t capable of handling soft particles.

The GTS 512 does well enough at lower resolutions, but the pack really starts to separate as the display resolution climbs. At 1600×1200, the GTS 512 is the second-fastest single card in the bunch, just behind the GeForce 8800 GTX and just ahead of the Radeon HD 3870 in CrossFire. The most price-competitive config from AMD, the two Radeon HD 3850 cards in CrossFire, just isn’t in the same league as the GTS 512. Also, the GTS 512 easily outdoes the GeForce 8800 GTS 640MB, in spite of the fact that our GTS 640MB representative is clocked at speeds substantially higher than stock.

The most formidable foe to the GTS 512 is the “factory overclocked” GeForce 8800 GT from MSI. This card is selling for under $300 online and shadows the GTS 512 very closely.

Unreal Tournament 3 demo

We tested the UT3 demo by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
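To make that outlier-damping step concrete, here’s a trivial sketch; the session numbers are hypothetical, for illustration only:

    # Hypothetical minimum frame rates from five 60-second FRAPS sessions.
    # Reporting the median keeps one unusually hitchy (or lucky) run from
    # skewing the published low-FPS number the way a mean would.
    from statistics import median

    session_lows = [34, 29, 41, 31, 38]  # made-up per-run lows
    print(median(session_lows))          # -> 34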

Because the Unreal engine doesn’t support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560×1600 and turned up the demo’s quality sliders to the max. I also disabled the demo’s frame rate cap before testing.

The picture looks much the same in UT3. The 8800 GTS 512 slots in just between the GeForce 8800 GTX and the overclocked 8800 GT from MSI. The big difference here is the performance of the Radeon HD 3850 CrossFire config, which turns out higher average frame rates than the GTS 512. Unfortunately, though, the HD 3850 turns in a much lower median low FPS number. In fact, CrossFire doesn’t appear to help median low frame rates at all on the 3850, which means it’s not a huge help to playability in worst-case scenarios. The same is true of SLI on the GTS 512.

Call of Duty 4

This game is about as sweet as they come, and we also tested it manually using FRAPS. We played through a portion of the “Blackout” mission at 1600×1200 with 4X antialiasing and 16X aniso.

Not to sound like a broken record, but I think we’re getting a sense that the 8800 GTS 512 is faster than the GTS 640MB and nearly as quick as the big daddy, the 8800 GTX. The MSI 8800 GT OC remains the fly in the ointment, however.

Frustratingly, AMD doesn’t appear to have CrossFire working well with CoD4.

TimeShift

This game may be a bizarrely derivative remix of Half-Life 2 and F.E.A.R., but it’s a guilty-pleasure delight for FPS enthusiasts that has a very “action-arcade” kind of feel to it. Like most of the other games, we played this one manually and recorded frame rates with FRAPS. We had all of the in-game quality settings maxed out here, save for “Projected Shadows,” since that feature only works on Nvidia cards.

This time, the GTS 512 even outdoes the 8800 GTX in single-card performance, though not in SLI. The GTS 640MB just isn’t as fast, which is one reason I take issue with the naming of the GTS 512. As ever, though, the 8800 GT OC card shadows the GTS 512 closely.

The Radeon CrossFire scores look pretty good here, but I should mention that, although we were able to test, we encountered annoying visual artifacts (the screen was flashing) while playing on the CrossFire rigs, even with AMD’s latest drivers.

BioShock

We tested this game with FRAPS, just like we did the UT3 demo. BioShock’s default settings in DirectX 10 are already very high quality, so we didn’t tinker with them much. We just set the display res to 2560×1600 and went to town. In this case, I was trying to take down a Big Daddy, a generally unsuccessful effort.

The trends we’ve seen before continue unabated here, mercilessly offering me very little to write about. SLI seems to be no help at all in this game for the GTS 512. None of the GeForce cards gain much from SLI in BioShock, but most of them do gain some. We’re using slightly newer drivers on the GTS 512, which may explain the scaling difference.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running BioShock in DirectX 10 at 2560×1600 resolution, using the same settings we did for performance testing.

Although the GTS 512 performs nearly as well as the GeForce 8800 GTX, its power draw is much lower. That’s a solid improvement on the power-performance front, and nothing less than we expected from a card based on the G92 GPU.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Few cards stand out from the pack here, and that’s generally a good thing, since they’re almost all nice and quiet. The big exceptions are the GeForce 7900 GS and the Radeon HD 2900 XT, which become noisy when running a game. The GTS 512’s cooler doesn’t suffer that fate.

Then again, neither does the GeForce 8800 GT. One thing’s worth noting, though, before we move on: we tested noise levels on an open test bench. I suspect that, inside a hot PC case, the GTS 512’s dual-slot cooler and rear exhaust will keep noise levels down quite a bit better than the single-slot cooler on the GeForce 8800 GT. As a rule, if we don’t mind sacrificing the extra expansion slot, we tend to prefer dual-slot coolers for this reason.

Conclusions

The GeForce 8800 GTS 512 is easily superior to the product it supplants, the GeForce 8800 GTS 640MB. In fact, I’d take the GTS 512 over the more expensive GeForce 8800 GTX, given the choice. The GTS 512 performs substantially better than the GTS 640MB and nearly as well as the GTX, yet it draws less power and offers several new features, including PCIe 2.0 support and better HD video processing. All of this is good.

What’s more, AMD has nothing to compete with this card, save for a Radeon HD 3850 CrossFire config that’s not truly competitive—both for performance reasons and because of the innate problems that plague multi-GPU setups, especially (at least of late) those from AMD. The folks at AMD have been talking a big game about ramping up their multi-GPU strategy as their primary means of competing in high-end graphics. If they’re going to do so, they’ll have to do much better with CrossFire scaling and compatibility than they are right now. None of the games we tested are exactly obscure, yet several of them didn’t work well with CrossFire.

With that said, the GTS 512’s big Achilles’ heel comes in the form of a sibling rivalry. The specs say the GTS 512 is better than the 8800 GT thanks to its additional SP cluster, which grants it more texture filtering and shader processing power. But the real-world games we tested say the true limitations right now look to be pixel fill rates and memory bandwidth, where the GTS 512 barely leads the 8800 GT at all. That’s especially the case once you throw a “factory overclocked” 8800 GT into the mix. We tested one from MSI, but many companies offer similar cards. Even if you factor in the price premium you’ll pay for the higher-clocked version of the 8800 GT and the premium due to the GT’s relative scarcity, the 8800 GT OC looks to be a better deal than the GTS 512.

There is a case to be made for the GTS 512, though, in the right conditions. The 8800 GT has been rather difficult to find in stock at online retailers, and the MSI card we tested is going for nearly 300 bucks. We do prefer the GTS 512’s dual-slot cooler, and the EVGA GeForce 8800 GTS 512 card we reviewed comes with a copy of Crysis for $359. I can see paying that price if you haven’t already finished Crysis like I have, I suppose.

Speaking of which, you may be wondering why we haven’t included Crysis results in our testing today. The reason is that we have bigger plans for Crysis testing soon. Stay tuned.

Comments closed
    • rythex
    • 12 years ago

    Good to see my 8800 GTX I bought well over a year ago is still faster than these new, cheaper cards.. 😛

      • Krogoth
      • 12 years ago

      Not really,

      The 8800GT and 8800GTS are trailing it darn close, for a fraction of what you spent on that GTX at its launch.

        • indeego
        • 12 years ago

        You guys could have replaced your model number epenis war terms with any from the past decade and the argument would still fit. Economics of vid cards hasn’t changed <.<

          • Krogoth
          • 12 years ago

          Actually, it has diversified a bit. There are far more choices at different market segments than there used to be back in the day.

            • swaaye
            • 12 years ago

            That’s because the suits at the companies are trying desperately to make more money by finding/creating new market segments.

        • swaaye
        • 12 years ago

        I guess when your top end card is still basically top end after over a year, maybe it was worth it.

    • indeego
    • 12 years ago

    There are the missing Crysis benches: http://www.gemaga.com/2007/12/12/over-1800-in-video-cards-to-run-crysis-on-very-high

    • spartus4
    • 12 years ago

    What a load of crap!!! The Nvidia cards were only tested on the Nvidia chipset. This review is bogus!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

      • muyuubyou
      • 12 years ago

      oh noes!!1!1!111!!1!!!!!!!!!1!

    • credo
    • 12 years ago

    Quick question about the RAM on the 2nd, 3rd, etc cards. Is it used? I mean, 2 512 cards != 1024, rather 2 512 cards = 512 right? The second, and so on, card(s) is/are used primarily for processing right?

      • Damage
      • 12 years ago

      The two cards draw frames independently, generally, and each card must store the texture, models, and programs it needs in its own local memory. So the effective RAM size doesn’t grow when you add a second card.

        • credo
        • 12 years ago

        ah, i was thinking of it more like a central card telling the others what to do, offsetting task processing. makes more sense your way though :/

    • NIKOLAS
    • 12 years ago

    What a blight it is having all the single card and SLI results jumbled together.

    I have no problem if these same results were also presented in such a graphic format, as long as there was primacy given to results containing just the single cards.

    • gtoulouzas
    • 12 years ago

    Damage, could you in some way incorporate single-card only benches (for example, upon mouseover on the benchmark picture)? I know this sounds like a bizarre request, but the sli/crossfire results make the tables awfully cluttered and many of us are completely uninterested in them.
    Congratulations on the thorough review.

      • danny e.
      • 12 years ago

      i actually agree with this idea-r.. anything to weed out the multi-configs.. or at least separate them.

      • indeego
      • 12 years ago

      Also, there’s no way to please everyone. I’ll bet you if he did that 4 people would chime in “where is SLI?!?!”

        • gtoulouzas
        • 12 years ago

        Not if it’s implemented in a mouseover. 🙂

      • provoko
      • 12 years ago

      I’m sure most people are curious to know how a particular SLI rig compares to a single rig. Without them or if they were on another chart, people would complain.

      However, I will agree that the tables are cluttered, but nothing should be taken off. They would look less cluttered if they were like this: http://img143.imageshack.us/img143/1267/trpictureaf9.jpg

      • My Johnson
      • 12 years ago

      A mouse over that grays the multi-GPU configs…

      • Jigar
      • 12 years ago

      People who browse from the office will face the problem.. i have flash disabled at my office.. what should i do?

        • boing
        • 12 years ago

        Mouseover doesn’t use flash.

      • snowdog
      • 12 years ago

      Ditto on this request. Two separate tables or something. I might buy a GTS, but I have zero interest in SLI. Wading through results looking for the single cards is a pain.

    • Jigar
    • 12 years ago

    Looks like i am better off with my 2X 8800 GT OCed @ 740MHZ Core and 2050 MHZ Memory 😉

      • Faiakes
      • 12 years ago

      How about the Shaders?

        • Jigar
        • 12 years ago

        They don’t OC.. don’t know the reason… Tried everything but Riva Tuner was unable to OC it.. 🙁

          • Faiakes
          • 12 years ago

          Usually you have to set them manually through a simple BIOS flash.

          Just change the number and reflash it. NiBiTor and a bootable usb drive will do it in under 2 minutes!

            • Jigar
            • 12 years ago

            I never had good luck with flashing video card BIOSes; i would rather stay away from it

      • flip-mode
      • 12 years ago

      Your epeen bragging is somewhat repugnant.

        • Jigar
        • 12 years ago

        What can i say, i like to be like that 😉

      • Faiakes
      • 12 years ago

      I’m getting an eVGA 8800GT and will be permanently overclocking it with a BIOS flash.

      My last 8800 GTS was set to 648-1620 (shaders)-2106.
      If you do the flash with a USB flash drive, it’s very quick and safe.

    • bfellow
    • 12 years ago

    According to Tom’s review, at 1280×1024 and 1920×1440 with no AA/AF it beats the 8800GTX in Crysis

      • Flying Fox
      • 12 years ago

      Whose review, huh? =D

    • El_MUERkO
    • 12 years ago

    :O

    how can the crossfired radeons be slower than the standalone card?

      • wingless
      • 12 years ago

      ATI has not optimized the drivers for those specific games. The weak point of crossfire and ATI cards in general is that they have to be optimized for individual software titles/game engines. I tried to play MechWarrior 4: Mercenaries yesterday, which is made by Microsoft, and the game was unplayable. The drivers simply were not compatible, and the game came out in the year 2000. This 2900XT has failed me yet again! (If I were Darth Vader I would have crushed its larynx months ago…..)

      The Crossfire performance is amazing on software that it does work on. It’s damn near as fast as the 8800GTS 512mb in SLI! The scaling with Crossfire is almost exactly 2x and remember, you can add a third and fourth card just for the hell of it. We may see scaling of 3x and 4x if that trend keeps going. Like I posted earlier though, waiting for drivers to come out every month just to play new games is f’ing ridiculous.

        • Silus
        • 12 years ago

        Don’t get your hopes up. You will never see that sort of performance increase in a Quad Crossfire setup. Just like you didn’t with Quad-SLI a year and a half ago, for the very same reason you complained about: driver support.

    • bogbox
    • 12 years ago

    the only /[

    • wingless
    • 12 years ago

    That was a great test of all the modern cards from ATI and Nvidia. One thing I noticed is that ATI’s cards in Crossfire scale almost exactly 2x in any game they’re optimized for. The 3870 in Crossfire matched the 8800GTS in SLI in several tests. The only problem with Crossfire is that if the game isn’t “optimized,” then it pretty much won’t work at all. Waiting for ATI driver engineers to implement support is an F’ing hassle! However, if you bought a 3870 recently you shouldn’t feel too bad about buying a second one. Hell, if you have a 790FX mobo you can buy a third and fourth one too down the road =)

    These cards are great but I would still recommend waiting for the Nvidia 9xxx series due out in February. No card out today is “GOOD” in DX10 yet. However, if you’re a DX9 user then the G92 8800GTS is the best on the planet for that price.

      • ElderDruid
      • 12 years ago

      Agreed.

      I think anyone who buys the 8800 GTS 512 now will be regretting their decision in 3 months.

        • Flying Fox
        • 12 years ago

        Whenever you buy the top end late in its lifetime and/or close to a next-gen release, you are going to regret it anyway.

        The point, though, is that a lot of people can only play games extensively during the holidays. So waiting until February is not realistic for them. After all, you don’t just buy a card; you need to play the games to enjoy the utility the card provides.

          • flip-mode
          • 12 years ago


            • Gerbil Jedidiah
            • 12 years ago

            That’s why I always shoot for the best deal in the middle ground. I paid $300 for my 7950GX2 over a year ago because somebody had to upgrade to the brand new 8800GTX. Here we are a year later, the 7950GX2 is still a fast card, and DX10 is just now becoming something worth having. So now I’m looking for the best card in the middle ground again. Honestly, I think it’s the 3870 due to it being quiet and having low power consumption.

        • Forge
        • 12 years ago

        That’s why eVGA’s Step-Up program covers you for… wait for it… THREE MONTHS.

        I’ll be picking up a GTS512 as soon as my next paycheck hits. I’ll be using Step-Up to a D9E in roughly 2.5-3 months as well.

        Win/win.

          • Flying Fox
          • 12 years ago

          Except if the D9E gets delayed beyond 3 months, then what do you do?

            • credo
            • 12 years ago

            then you didn’t buy a card 3 months before it was obsolete.

    • b_naresh
    • 12 years ago

    GPU temperature readings should have been included along with the power consumption and noise level charts.

    • Jive
    • 12 years ago

    Can anyone tell me the point in buying this card over an 8800 GT again?

      • Voldenuit
      • 12 years ago

      Other than the dual slot cooler, not much.

      One thing TR’s review doesn’t demonstrate is that the single slot cooler on the GT is pretty loud when it ramps up. I had a friend with an overheating GT, and his cooler was much louder under load than my (G80) GTS. He eventually ditched the stock cooler and bought a Thermalright HR03+, which ended up costing him as much as if he’d bought a GTS 512 in the first place, in addition to eating up 3 PCI slots.

      The TR cooler is insanely good, though. His load temps are close to my idle temps.

      Maybe he had a bad sample. But there are other reviews of the GT that have measured much higher sound levels from it than TR has. Like Damage said, it might have something to do with using an open testbed.

        • toyota
        • 12 years ago

        the 8800gt reference cooler was changed about a week ago.

        http://en.expreview.com/?p=78

          • Mithent
          • 12 years ago

          It is better, but still hotter than a dual slot cooler – there’s nowhere to vent that hot air until it’s captured by a case/PSU fan.

          • Voldenuit
          • 12 years ago

          Ooo…thanks for that bit of news!

          A GT might make it into my next build after all…

          • b_naresh
          • 12 years ago

          Damn! I just ordered an EVGA 8800GT from Amazon yesterday! I’ll mostly get the older version! But then again, I needed this by this Friday so I guess I’ll just have to use Riva Tuner to adjust the fan speed and bear with the noise 🙁
          EDIT: Just as I thought. I got the older one with the smaller heatsink 🙁

      • provoko
      • 12 years ago

      No point.

    • provoko
    • 12 years ago

    What a stupid card. It barely beats the 8800GT yet it’s 50-100 more dollars…

    • Gilligan
    • 12 years ago

    8800 GT for 207 bucks at dell.com a few weeks ago was THE deal to grab!

    although I am still waiting…

    but i have exams and a broken mobo anyways 🙁

    • My Johnson
    • 12 years ago

    Is the date of the review 12/10/07 or 12/11/07?

      • danny e.
      • 12 years ago

      🙂 long hours can do that to you.

    • Usacomp2k3
    • 12 years ago

    Did the XFX version of this one gain any ground that the eVGA missed?

    • danny e.
    • 12 years ago

    yes.. i was wondering as i was reading through.. glad to see there is a good reason. hope it comes soon.

      • danny e.
      • 12 years ago

      sadly the GTS doesn’t look like it’s quite to the level I want yet.
      It would have been nice to see some of the GTS overclocked version scores though.. to compare to the GT OC.

    • CampinCarl
    • 12 years ago

    Does anyone know what kind of improvements we can expect to see with the next series of cards? I see in the Beyond3D links that it’s supposed to be due out in February, but it only mentions that it’ll be a DX10.1 card…anyone got anything more than that?

      • BoBzeBuilder
      • 12 years ago

      The cards will supposedly be built using 55nm technology, but I don’t think the next generation cards will be a departure from the current 8000-series architecture.

        • Silus
        • 12 years ago

        The high end based on G9x (D9E) will still use 65nm. It’s the mid-range G9x that will use 55nm.

          • Xenolith
          • 12 years ago

          When are those cards (mid-range 55nm) expected to hit the streets in bulk? That sounds like my next card.

    • ReAp3r-G
    • 12 years ago

    nothing great IMO… it’s not as groundbreaking as it was a year-plus back…

    the 8800GT and GTS prices are enough to push me away… i’m in the search for a bargain… but it’s too pricey now for something that isn’t quite new and not quite groundbreaking as well… good thing i decided to wait on the 9 series…

    • lethal
    • 12 years ago

    Edit: whoops, this was meant to reply to #2.

    I know Nvidia isn’t going to drop the 8800 yet, especially since the GT is selling like hotcakes, but still, most people who follow this news are used to regularly seeing new architectures, tech demos, and the usual drool-inducing game every now and then… things have kinda slowed down in the tech department a bit since G80 was introduced. But this year saw a lot of great games released, so I can’t really complain :P.

    • OptOut
    • 12 years ago

    I skimmed through the other reviews on the web today, but this is the piece I’ve really been waiting for. Thanks for getting it out.

    Have to say I agree with the conclusion; the OC’d GT is a better deal than a stock GTS. I can’t see paying the $359.00, course I already own Crysis.

    • lethal
    • 12 years ago

    No Crysis results? Come on, you even got the CD with the card ;). Nice review, although the 8800 as a whole is getting a little long in the tooth. Hopefully the next gen will bring some substantial performance improvements vs the current tech; the GTX has been pretty much the performance ceiling for way too long.

      • Kulith
      • 12 years ago

      “Okay boys and girls, it’s time once again for the introduction of a new video card, a happy ritual performed multiple times each year during the lead-up to Christm… err, Holiday. The video card making its debut today is the GeForce 8800 GTS 512, a brand-new big brother to the GeForce 8800 GT, which is selling like a Nintendo Wii infused with the essence of Cabbage Patch Kids, or something like that.”

      Lol this site is amazing. I absolutely love to read these reviews. Comments like these make my day.

      Anyways, the GT is still the better deal imho. I’ll be happy if Dell ever gets around to shipping my MSI 8800 GT OC that I managed to get for $207. I really wish the GT came with a dual slot cooler though like this one….

      • danny e.
      • 12 years ago

        did you just skim the graphs?

        • indeego
        • 12 years ago

          And why not? Not much new to the table is introduced here, and the crysis teaser is at the end of the article…

          • danny e.
          • 12 years ago

            well.. you could at least read the intro and conclusion.. then you wouldn’t have to wonder about why Crysis results weren’t included.

    • Krogoth
    • 12 years ago

    It is effectively an 8800 Ultra with 512MB of memory.

    Great review, Damage.

    Nvidia doesn’t want to release a 1GB version, because it would render their 8800U and 8800GTX completely obsolete.

    I can’t wait for the full-blown version of the G9x series.

      • toyota
      • 12 years ago

      the 8800 Ultra and GTX are sometimes faster mainly because of their bandwidth, not because of the amount of memory. having 1024MB on the GTS would still result in the same outcome in 95% of cases against the Ultra and GTX.

        • marvelous
        • 12 years ago

        Most definitely. The whole “faster shaders give you better performance” thing is utter bull in real games. The G92 GTS has a theoretical peak fill rate that surpasses the GTX or Ultra, but it’s never attained because of memory bandwidth.

      • marvelous
      • 12 years ago

        Not quite. It still trails the GTX, except in shader-intensive games, where it ties.

        • Krogoth
        • 12 years ago

          Less than 2% difference? That’s hardly trailing.

          The 8800GTS 512MB proves that most of the extra bandwidth the 8800GTX has goes to waste.

          • marvelous
          • 12 years ago

            The GTX still beats the newer G92 and you say waste? The GTX was the card to have, given what technology allowed at the time, yet it still performs faster than the new G92 GTS, which has almost double the fill rate of the GTX but can’t saturate it because of memory bandwidth, and still loses. If these G92s had a 384-bit or even 512-bit bus, they would surely beat an Ultra and then some.

            • Krogoth
            • 12 years ago

            You should look at the numbers and charts more carefully.

            The GTX has a nice edge in fill rate. The GTS 512 is only faster at shading operations.

            This is reflected in the benchmark suite. In the games that depend more on shader performance, the GTS 512 outruns the GTX (BioShock, TimeShift). In the games that depend more on fill rate, the 8800GTX still manages to outrun the GTS 512 by a little bit (Enemy Territory: Quake Wars).

            I believe the newer 65nm G92 has already proven itself superior to the older 90nm G8x family. The missing part of the puzzle is the lack of a full-blown G9x design (8900GTX).

      • Nitrodist
      • 12 years ago

      I think ASUS is releasing a 1GB version of the 8800 GT. I believe TR reported on it on the 7th or something.
