Nvidia’s GeForce 8800 GT graphics processor

This is an absolutely spectacular time to be a PC gamer. The slew of top-notch and hotly anticipated games hitting store shelves is practically unprecedented, including BioShock, Crysis, Quake Wars, Unreal Tournament 3, and Valve’s Orange Box trio of goodness. I can’t remember a time quite like it.

However, this may not be the best time to own a dated graphics card. The latest generation of high-end graphics cards brought with it pretty much twice the performance of previous high-end cards, and to add insult to injury, these GPUs added DirectX 10-class features that today’s games are starting to exploit. If you have last year’s best, such as a GeForce 7900 or Radeon X1900, you may not be able to drink in all the eye candy of the latest games at reasonable frame rates.

And if you’ve played the Crysis demo, you’re probably really ready to upgrade. I’ve never seen a prettier low-res slide show.

Fortunately, DirectX 10-class graphics power is getting a whole lot cheaper, starting today. Nvidia has cooked up a new spin of its GeForce 8 GPU architecture, and the first graphics card based on this chip sets a new standard for price and performance. Could the GeForce 8800 GT be the solution to your video card, er, Crysis? Let’s have a look.


Meet the G92

In recent years, graphics processor transistor budgets have been ballooning at a rate even faster than Moore’s Law, and that has led to some, um, exquisitely plus-sized chips. This fall’s new crop of GPUs looks to be something of a corrective to that trend, and the G92 is a case in point. This chip is essentially a die shrink of the G80 graphics processor that powers incumbent GeForce 8800 graphics cards. The G92 adds some nice new capabilities, but doesn’t double up on shader power or anything quite that earth-shaking.


Here’s an extreme close-up of the G92, which may convince your boss/wife that you’re reading something educational and technically edifying right about now. We’ve pictured it next to a U.S. quarter in order to further propagate the American hegemonic mindset. Er, I mean, to provide some context, size-wise. The G92 measures almost exactly 18 mm by 18 mm, or 324 mm². TSMC manufactures the chip for Nvidia on a 65nm fab process, which somewhat miraculously manages to shoehorn roughly 754 million transistors into this space. By way of comparison, the much larger G80—made on a 90nm process—had only 681 million transistors. AMD’s R600 GPU packs 700 million transistors into a 420 mm² die area.

Why, you may be asking, does the G92 have so many more transistors than the G80? Good question. The answer is: a great many little additions here and there, including some we may not know about just yet.

One big change is the integration of the external display chip that acted as a helper to the G80. The G92 natively supports twin dual-link DVI outputs with HDCP, without the need for a separate display chip. That ought to make G92-based video cards cheaper and easier to make. Another change is the inclusion of the VP2 processing engine for high-definition video decoding and playback, an innovation first introduced in the G84 GPU behind the GeForce 8600 lineup. The VP2 engine can handle the most intensive portions of H.264 video decoding in hardware, offloading that burden from the CPU.

Both of those capabilities are pulled in from other chips, but here’s a novel one: PCI Express 2.0 support. PCIe 2.0 effectively doubles the bandwidth available for communication between the graphics card and the rest of the system, and the G92 is Nvidia’s first chip to support this standard. This may be the least-hyped graphics interface upgrade in years, in part because PCIe 1.1 offers quite a bit of bandwidth already. Still, PCIe 2.0 is a major evolutionary step, though I doubt it chews up too many additional transistors.
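
For a rough sense of scale, here’s a back-of-the-envelope sketch of my own (not from Nvidia’s materials) of per-direction bandwidth for an x16 graphics slot, counting only the standard 8b/10b link encoding overhead:

```python
# Per-direction bandwidth of an x16 PCIe link. 8b/10b encoding puts 10 bits on
# the wire for every byte of payload; real protocol overhead trims this further.
def x16_bandwidth_gb_per_s(transfer_rate_gt_per_s, lanes=16):
    return transfer_rate_gt_per_s * lanes / 10.0

print(x16_bandwidth_gb_per_s(2.5))  # PCIe 1.1: 4.0 GB/s each way
print(x16_bandwidth_gb_per_s(5.0))  # PCIe 2.0: 8.0 GB/s each way
```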

So where else do the G92’s additional transistors come from? This is where things start to get a little hazy. You see, the GeForce 8800 GT doesn’t look to be a “full” implementation of G92. Although this chip has the same basic GeForce 8-series architecture as its predecessors, the GeForce 8800 GT officially has 112 stream processors, or SPs. That’s seven “clusters” of 16 SPs each. Chip designers don’t tend to do things in odd numbers, so I’d wager an awful lot of Nvidia stock that the G92 actually has at least eight SP clusters onboard.

Eight’s probably the limit, though, because the G92’s SP clusters are “fatter” than the G80’s; they incorporate the G84’s more robust texture addressing capacity of eight addresses per clock, up from four in the G80. That means the GeForce 8800 GT, with its seven SP clusters, can sample a total of 56 texels per clock—well beyond the 24 of the 8800 GTS and 32 of the 8800 GTX. We’ll look at the implications of this change in more detail in a sec.
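
Here’s the per-clock arithmetic behind those figures, as a quick sketch of my own (the 8800 GTS’s G80 has six active SP clusters and the GTX’s has eight, both at four texture addresses per clock):

```python
# Texels sampled per clock = SP clusters x texture addresses per cluster per clock
def texels_per_clock(clusters, addresses_per_cluster):
    return clusters * addresses_per_cluster

print(texels_per_clock(7, 8))  # GeForce 8800 GT  (G92): 56
print(texels_per_clock(6, 4))  # GeForce 8800 GTS (G80): 24
print(texels_per_clock(8, 4))  # GeForce 8800 GTX (G80): 32
```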

Another area where the GeForce 8800 GT may be sporting a bit of trimmed down G92 functionality is in the ROP partitions. These sexy little units are responsible for turning fully processed and shaded fragments into full-blown pixels. They also provide much of the chip’s antialiasing grunt, and in Nvidia’s GeForce 8 architecture, each ROP has a 64-bit interface to video memory. The G80 packs six ROP partitions, which is why the full-blown GeForce 8800 GTX has a 384-bit path to memory and the sawed-off 8800 GTS (with five ROP partitions) has a 320-bit memory interface. We don’t know how many ROP partitions the G92 has lurking inside, but the 8800 GT uses only four of them. As a result, it has a 256-bit memory interface, can output a maximum of 16 finished pixels per clock, and has somewhat less antialiasing grunt on a clock-for-clock basis.
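
The relationship is simple enough to sketch out (my arithmetic, following the description above): each ROP partition contributes a 64-bit memory channel and four pixels of output per clock.

```python
# Each GeForce 8 ROP partition pairs a 64-bit memory channel with four pixels
# of output per clock.
def rop_config(partitions):
    return {"bus_width_bits": partitions * 64, "pixels_per_clock": partitions * 4}

print(rop_config(6))  # 8800 GTX: 384-bit bus, 24 pixels/clock
print(rop_config(5))  # 8800 GTS: 320-bit bus, 20 pixels/clock
print(rop_config(4))  # 8800 GT:  256-bit bus, 16 pixels/clock
```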

How many ROPs does G92 really have? I dunno. I suspect we’ll find out before too long, though.

The 8800 GT up close

What the 8800 GT lacks in functional units, it largely makes up in clock speed. The 8800 GT’s official core clock speed is 600MHz, and its 112 SPs run at 1.5GHz. The card’s 512MB of GDDR3 memory runs at 900MHz—or 1.8GHz effective, thanks to the memory’s doubled data rate.



MSI’s NX8800GT

Here’s a look at MSI’s rendition of the GeForce 8800 GT. Note the distinctive MSI decal. This card is further differentiated in a way that really matters: it comes hot from the factory, with a 660MHz core clock and 950MHz memory. This sort of “overclocking” has become so common among Nvidia’s board partners, it’s pretty much expected at this point. MSI doesn’t disappoint.

I don’t want to give too much away, since we’ve measured noise levels on a decibel meter, but you’ll be pleased to know that the 8800 GT’s single-slot cooler follows in the tradition of Nvidia’s coolers for its other GeForce 8800 cards. The thing is whisper-quiet.

The sight of a single-slot cooler may be your first hint that this is not the sort of video card that will put an ugly dent in your credit rating. Here’s another hint at the 8800 GT’s mainstream aspirations. Nvidia rates the power consumption of the 8800 GT at 110W, which makes the single-slot cooler feasible and also means the 8800 GT needs just one auxiliary PCIe power connector, of the six-pin variety, in order to do its thing.
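
The single plug makes sense once you run the numbers. A PCIe x16 slot can supply up to 75W on its own, and a six-pin connector adds another 75W, so a quick power-budget sketch (my arithmetic, not an official Nvidia breakdown) looks like this:

```python
# Power available to the card versus Nvidia's 110W rating for the 8800 GT
slot_w, six_pin_w, rated_w = 75, 75, 110
available_w = slot_w + six_pin_w
print(available_w, available_w - rated_w)  # 150W available, roughly 40W of headroom
```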



The 8800 GT sports a single six-pin PCIe aux power connector

Another place where the 8800 GT sports only one connector is in the SLI department. That probably means the 8800 GT won’t be capable of ganging up with three or four of its peers in a mega-multi-GPU config. Two-way SLI is probably the practical limit for this card.

Here’s the kicker, though. 8800 GT cards are slated to become available today for between $199 and $249.

Doing the math

So that’s a nice price, right? Well, like so many things in life—and I sure as heck didn’t believe this in high school—it all boils down to math. If you take the 8800 GT’s seven SP clusters and 112 SPs and throw them into the blender with a 1.5GHz shader clock, a 256-bit memory interface, along with various herbs and spices, this is what comes out:

                     Peak pixel   Peak texel      Peak bilinear      Peak bilinear      Peak memory   Peak shader
                     fill rate    sampling rate   texel filtering    FP16 texel         bandwidth     arithmetic
                     (Gpixels/s)  (Gtexels/s)     rate (Gtexels/s)   filtering rate     (GB/s)        (GFLOPS)
                                                                     (Gtexels/s)
GeForce 8800 GT      9.6          33.6            33.6               16.8               57.6          504
GeForce 8800 GTS     10.0         12.0            12.0               12.0               64.0          346
GeForce 8800 GTX     13.8         18.4            18.4               18.4               86.4          518
GeForce 8800 Ultra   14.7         19.6            19.6               19.6               103.7         576
Radeon HD 2900 XT    11.9         23.8            11.9               11.9               105.6         475

In terms of texture sampling rates, texture filtering capacity, and shader arithmetic, the 8800 GT is actually superior to the 8800 GTS. It’s also quicker than the Radeon HD 2900 XT in most of those categories, although our FLOPS estimate for the GeForce GPUs is potentially a little rosy—another way of counting would reduce those numbers by a third, making the Radeon look relatively stronger. Also, thanks to its higher clock speed, the 8800 GT doesn’t suffer much in terms of pixel fill rate (and corresponding AA grunt) due to its smaller ROP count. The 8800 GT’s most noteworthy numbers may be its texture sampling and filtering rates. Since its SPs can grab twice as many texels per clock as the G80’s, its texture filtering performance with standard 8-bit integer color formats could be more than double that of the 8800 GTS.
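
For the curious, here’s a quick sketch of where the 8800 GT’s row in the table comes from (my arithmetic, using the per-clock capacities discussed earlier and assuming FP16 filtering runs at half the integer rate):

```python
# Peak theoretical rates for the GeForce 8800 GT from its clocks and unit counts
core_ghz, shader_ghz, mem_eff_ghz = 0.6, 1.5, 1.8
rops, texels_per_clk, fp16_per_clk, sps, bus_bits = 16, 56, 28, 112, 256

print(rops * core_ghz)                # 9.6  Gpixels/s peak pixel fill
print(texels_per_clk * core_ghz)      # 33.6 Gtexels/s sampling and int8 filtering
print(fp16_per_clk * core_ghz)        # 16.8 Gtexels/s FP16 filtering
print(bus_bits / 8 * mem_eff_ghz)     # 57.6 GB/s memory bandwidth
print(sps * 3 * shader_ghz)           # 504 GFLOPS counting MADD+MUL (3 flops/SP/clock)
print(sps * 2 * shader_ghz)           # 336 GFLOPS counting the MADD alone (the less rosy tally)
```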

In graphics, math like this isn’t quite performance destiny, but it’s close. The only place where the 8800 GT really trails the 8800 GTS or the 2900 XT is in memory bandwidth. And, believe it or not, memory bandwidth is arguably at less of a premium these days, since games produce “richer” pixels that spend more time looping through shader programs and thus occupying on-chip storage like registers and caches.

Bottom line: the 8800 GT should generally be as good as or better than the 8800 GTS, for under 250 bucks. Let’s test that theory.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme X6800 2.93GHz
System bus: 1066MHz (266MHz quad-pumped)
Motherboard: XFX nForce 680i SLI
BIOS revision: P31
North bridge: nForce 680i SLI SPP
South bridge: nForce 680i SLI MCP
Chipset drivers: ForceWare 15.08
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 4
RAS to CAS delay (tRCD): 4
RAS precharge (tRP): 4
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated nForce 680i SLI/ALC850 with RealTek 6.0.1.5497 drivers
Graphics: GeForce 8800 GT 512MB PCIe with ForceWare 169.01 drivers
          XFX GeForce 8800 GTS XXX 320MB PCIe with ForceWare 169.01 drivers
          EVGA GeForce 8800 GTS OC 640MB PCIe with ForceWare 169.01 drivers
          Radeon HD 2900 XT 512MB PCIe with Catalyst 7.10 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x86 Edition
OS updates: KB36710, KB938194, KB938979, KB940105, DirectX August 2007 Update

Please note that we’re using “overclocked in the box” versions of the 8800 GTS 320MB and 640MB, while we’re testing a stock-clocked GeForce 8800 GT reference card from Nvidia.

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to those of no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Crysis demo

The Crysis demo is still fresh from the oven, but we were able to test the 8800 GT in it. Crytek has included a GPU benchmarking facility with the demo that consists of a fly-through of the island in which the opening level of the game is set, and we used it. For this test, we set all of the game’s quality options at “high” (not “very high”) and set the display resolution to—believe it or not—1280×800 with 4X antialiasing.

Even at this low res, these relatively beefy graphics cards chugged along. The game looks absolutely stunning, but obviously it’s using a tremendous amount of GPU power in order to achieve the look.

The demo is marginally playable at these settings, but I’d prefer to turn antialiasing off in order to get smoother frame rates on the 8800 GT. That’s what I did when I played through the demo, in fact.

Notice several things about our results. Although the 8800 GT keeps up with the 8800 GTS 640MB in terms of average frame rates, it hit lower lows of around 10 FPS, probably due to its lesser memory bandwidth or its smaller amount of total RAM onboard. Speaking of memory, the card for which the 8800 GT is arguably a replacement, the 320MB version of the GTS, stumbles badly here. This is why we were lukewarm on the GTS 320MB when it first arrived. Lots of GPU power isn’t worth much if you don’t have enough video memory. GTS 320MB owners will probably have to drop to “medium” quality in order to run Crysis smoothly.

Unreal Tournament 3 demo

We tested the UT3 demo by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
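
Here’s a minimal sketch of that reporting scheme, with made-up session numbers rather than our actual data:

```python
from statistics import mean, median

# One (average FPS, lowest FPS) pair per 60-second FRAPS session -- hypothetical values
sessions = [(52.1, 31), (49.8, 29), (51.3, 33), (50.6, 30), (53.0, 34)]

average_fps = mean(avg for avg, low in sessions)
median_low_fps = median(low for avg, low in sessions)
print(round(average_fps, 1), median_low_fps)  # 51.4 and 31
```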

Because the Unreal engine doesn’t support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560×1600 and turned up the demo’s quality sliders to the max. I also disabled the demo’s 62 FPS frame rate cap before testing.

All of these cards can play the UT3 demo reasonably well at this resolution, the 8800 GT included. I noticed some brief slowdowns on the GTS 320MB right as I started the game, but those seemed to clear up after a few seconds.

Team Fortress 2

For TF2, I cranked up all of the game’s quality options, set anisotropic filtering to 16X, and used 4X multisampled antialiasing at 2560×1600 resolution. I then hopped onto a server with 24 players duking it out on the “ctf_2fort” map. I recorded a demo of me playing as a soldier, somewhat unsuccessfully, and then used the Source engine’s timedemo function to play the demo back and report performance.

The 8800 GT leads all contenders in TF2. Even at 2560×1600 with 4X AA and 16X aniso, TF2 is perfectly playable with this card, although that didn’t help my poor soldier guy much.

BioShock

We tested this game with FRAPS, just like we did the UT3 demo. BioShock’s default settings in DirectX 10 are already very high quality, so we didn’t tinker with them much. We just set the display res to 2560×1600 and went to town. In this case, I was trying to take down a Big Daddy, another generally unsuccessful effort.

A low of 23 FPS for the 8800 GT puts it right on the edge of smooth playability. The 8800 GT pretty much outclasses the Radeon HD 2900 XT here, amazingly enough. The 2900 XT couldn’t quite muster a playable frame rate at these settings, which my seat-of-the-pants impression confirmed during testing.

Lost Planet: Extreme Condition

Here’s another DX10 game. We ran this game in DirectX 10 mode at 1920×1200 with all of its quality options maxed out, plus 4X AA and 16X anisotropic filtering. We used the game’s built-in performance test, which tests two very different levels in the game, a snowy outdoor setting and a cave teeming with flying doodads.

Here’s another case where the 8800 GTS 320MB stumbles, while the 8800 GT does not. Although the Radeon HD 2900 XT lists for $399, it looks like an also-ran in most of our tests.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running BioShock in DirectX 10 at 2560×1600 resolution, using the same settings we did for performance testing.

Nvidia has done a nice job with the G92’s power consumption. Our 8800 GT-based test system draws over 20 fewer watts at idle than any of the others tested. Under load, the story is similar. Mash up these numbers with the performance results, and you get a very compelling power efficiency picture.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Nvidia’s single-slot coolers have too often been gratuitously small and noisy in the past year or two, but the 8800 GT is different. This may be the quietest single-slot cooler we’ve ever tested (save for the passive ones), and it doesn’t grow audibly louder under load. That’s a pleasant surprise, since the thing can get very loud during its initial spin-up at boot time. Fortunately, it never visited that territory for us when running games.

Conclusions

You’ve seen the results for yourself, so you pretty much know what I’m going to say. The 8800 GT does a very convincing imitation of the GeForce 8800 GTS 640MB when running the latest games, even at high resolutions and quality settings, with antialiasing and high-quality texture filtering. Its G92 GPU has all of the GeForce 8800 goodness we’ve come to appreciate in the past year or so, including DX10 support, coverage-sampled antialiasing, and top-notch overall image quality. The card is quiet and draws relatively little power compared to its competitors, and it will only occupy a single slot in your PC. That’s a stunning total package, sort of what it would be like if Jessica Biel had a brain.

With pricing between $199 and $249, I find it hard to recommend anything else—especially since we found generally playable settings at 2560×1600 resolution in some of the most intensive new games (except for Crysis, which is in a class by itself). I expect we may see some more G92-based products popping up in the coming weeks or months, but for most folks, this will be the version to have.

The one potential fly in the ointment for the 8800 GT is its upcoming competition from AMD. As we were preparing this review, the folks from AMD contacted us to let us know that the RV670 GPU is coming soon, and that they expect it to bring big increases in performance and power efficiency along with it. In fact, the AMD folks sound downright confident they’ll have the best offering at this price point when the dust settles, and they point to several firsts they’ll be offering as evidence. With RV670, they expect to be the first to deliver a GPU fabbed on a 55nm process, the first to offer a graphics processor compliant with the DirectX 10.1 spec, and the first to support four-way multi-GPU configs in Windows Vista. DirectX 10.1 is a particular point of emphasis for AMD, because it allows for some nifty things like fast execution of global illumination algorithms and direct developer control of antialiasing sample patterns. Those enhancements, of course, will be pretty much academic if RV670-based cards don’t provide as compelling a fundamental mix of performance, image quality, and power efficiency as the GeForce 8800 GT. We’ll know whether they’ve achieved that very soon.

This concludes our first look at the 8800 GT, but it’s not the end of our evaluation process. I’ve been knee-deep in CPUs over the past month or so, culminating today with our review of the 45nm Core 2 Extreme QX9650 processor, and that’s kept me from spending all of the time with the 8800 GT that I’d like. Over the next week or so, I’ll be delving into multi-GPU performance, some image quality issues, HD video playback, more games, and more scaling tests. We may have yet another new video card for you shortly, too.

Comments closed
    • Varun010
    • 12 years ago

    Hey I was thinking of buying a graphic card and at this point of time am going for pretty much a complete overhaul of my PC.
    My current config is
    AMD Athlon 3000+ 2.1 GHz
    K8 series motherboard
    512MB RAM
    NVidia GeForce 5200 ( How I hate to admit it)
    Well my current plan is to buy the 8800 GT and change the motherboard to a K8 SLi-eSATA2 which supports my 764pin processor.
    Also the memory will be cranked upto 1.5 GB
    Do you think that this would be good enough to last me a year and half or so. Or do I have to change the processor as well. Also 8800 GT seems the winner and is it then a safe bet.

    • danny e.
    • 12 years ago

    based on the Crysis scores, the GT is just begging for more memory. maybe someone will come out with a 1GB version ?

      • BoBzeBuilder
      • 12 years ago

      I’d be so all over a 1GB version.

    • insulin_junkie72
    • 12 years ago

    Crap… forgot to hit the reply button.


    • Vaughn
    • 12 years ago

    Save the attitude my friend its not needed.

    I just found it odd, I don’t think i’ve ever seen any TR videocard reviews that weren’t focusing on SLI running just that res.

    Obviously it will run at the lower res, but I think people still want to see the numbers and not just assume. I guess its just me.

    • Vaughn
    • 12 years ago

    a Single 8800 GT is playable at the native res of a 30" display that very few people that game own?

      • rxc6
      • 12 years ago

      I guess that it takes a genius to realize that if it runs at such high resolution, it will run well at lower resolutions… :roll_eyes:

      • Entroper
      • 12 years ago

      More importantly, today’s games will run smooth as butter on a lesser display, and tomorrow’s games will run acceptably.

        • Vaughn
        • 12 years ago

        that is a very good point also

    • Vaughn
    • 12 years ago

    I must have missed it but why was the testing done at such a high res?

      • BoBzeBuilder
      • 12 years ago

      Its to stress the GPUs. And because these cards can handle such high resolutions.

    • ludi
    • 12 years ago

    Sweet card, but the slide-show crack re: Crysis Demo left me scratching my head. What am I doing wrong, other than not turning the visual options to the max? Medium-everything at 1024×768 seems to run pretty well on a year-old Radeon X1950 Pro…..and certainly far better than I was expecting based on all the forum hissing.

      • A_Pickle
      • 12 years ago

      Keep that attitude, and it won’t be long before you’ll need a new video card to play a new game.

      Wait… Crysis pretty much already does that. Well then… I guess it’s okay. Keep thumbs-upping developers and delivering… shitty products.

        • indeego
        • 12 years ago

        Crysis is great. Exactly what I wanted, I’ve played it through on three difficulty levels and played completely differently (weapons, paths (for the most part), and styles of attack). I know it’s semi-mindless, but it has one of the most important aspects to a game: it’s fun.

      • Shinare
      • 12 years ago

      Its your resolution. At 1024×768 you have about a 3rd of the pixles that a 1600×1200 rez monitor displays. I bet that at 800×600 crysis will run on a 9600pro. It will look like a turd, but it will run.

    • Shinare
    • 12 years ago

    I think you need to address why the 8800GT out performed the 320GTS which out performed the 640GTS on page 4 under UT demo. Seems odd that the 320 would fare better than its beefier brother. I saw that and was left hoping that you would say something about that.

      • Damage
      • 12 years ago

      In that case, the 320MB slightly outdid the 640MB because it was just a tad (5MHz) higher in core clock. (Remember, both of the GTSes I tested were retail products with hot clocks; the 8800 GT was not.)

      As for exactly why the 8800 GT was faster in UT, it probably boils down to shader power. Since UT doesn’t support multisampled AA, the additional ROPs (and perhaps memory bandwidth) in the GTS weren’t much help.

        • Shinare
        • 12 years ago

        Ahhh, thanks Damage. I was just like, WTF? heh.

        • JustAnEngineer
        • 12 years ago

        Why don’t you test more stock products? Doesn’t it muddy the waters more than a little to test some super-clocked KO XXX version of a new board instead of than the stock version that will account for 90% of the sales?

          • Damage
          • 12 years ago

          Believe it or not, we’ve given a lot of thought to that question and have come up with a different answer than the one implied by your assertion about the proportional sales of stock-clocked versus higher-clocked versions of these products. The reality is that each of Nvidia’s model name “brands” has become an umbrella under which a range of products reside, and we’re forced to account for that.

          We debated how to do it, but in this generation, I decided to go ahead and test higher-clocked versions for several reasons. One is that we tend to test what we’d want to buy for ourselves, and these higher-clocked versions can be pretty good values. Even if we concede that they don’t represent the majority of sales via OEMs and retail (and I’ve seen no numbers to prove that), they are the sort of thing I’d order from an e-tailer like Newegg for myself–and I expect most TR readers would, too. Another is that, by now, Nvidia and its board partners have established this tiered-speed model for its cards very well, having done a long string of product launches where the “OCed” versions of cards are available right away. This is more than just a ploy for good reviews. Also, Nvidia seems to have pretty much decided, around the time the 2900 XT launched, that the 8800 GTS should get an effective clock speed bump. We’ve seen an awful lot of 575MHz-core GTS cards since then at very good prices, such as the MSI GTS 640MB card we recommended in the Sweet Spot box in our latest system guide. Given the prices and widespread availability of cards like that, it only seemed fair to test the higher-clocked GTS variants.

          Since most of your questions seem to show a concern for us being fair to ATI/AMD, let me go ahead and address this question on their side of the fence. I had hoped to get some higher-clocked 2900 XTs to test in this generation of products, but it’s proven difficult because ATI’s board partners are fewer and, when I’ve asked them, they’ve told me the higher-clock cards aren’t available for review and won’t be shipping widely into the channel. In fact, I really thought the 1GB version of the 2900 XT we reviewed was going to be a higher-clocked card–right up until I started testing it and found out otherwise. ’twas disappointing, but also a dose of reality.

          We will keep an eye on the issue of stock-vs-hot-clocked cards and what’s “typical” for enthusiasts going forward. Already, that’s leading me to look into adding higher-clocked 8800 GT cards to my next round of tests, in order to be fair–not into lower-clocked GTS cards or the like. At some point, market realities may pull in another direction, though, and hopefully we’ll catch it when it happens.

      • lyc
      • 12 years ago

      it wasn’t a pre-recorded demo, the gameplay was different each time. this was well explained 😛

      • thevagraunt
      • 12 years ago

      I was wondering, I bought a Dell recently, the Inspiron 530, which has a new small form factor (but not the ridiculously small ones). It comes with a PSU that has a steady rate of 300 W (not sure what that means, just repeating what the Dell rep. told me), but a max output of 460 W. I’m not sure about the volts and amps. (doesn’t the GT require 12V?). Anyways, I was looking at the power consumption part of this test, as right now that’s the only concern I have regarding a purchase of this card. The test says that it runs 231 W under load. With my 300 W PSU, will I be able to run a 8800 GT and not have to worry about it crashing my PC or anything? The main reason I want to get a new GPU is for Crysis. Here are my specs:

      Intel Core 2 Duo 2 Ghz (although tests says it runs at 1.99, and since it’s Dell OC’ing is going to be difficult if not impossible)
      3 GB RAM
      -2 GB OCZ Platininum 4-4-4-15, DDR2-6400 at 800 Mhz
      -1 GB of whatever RAM it is Dell sends
      300 W (steady rate) PSU
      GeForce 8300 GS 128 MB

      So with all that stuff using up power from my PSU, will I be able to run an 8800 GT fine on it? Also, what’s with this extra power connector thing I see on the card? Will the 8800 come with said cable, and how will I plug it into my PSU?
      As a final question, do you guys think I should keep the Dell RAM in there or take it out? Right now my memory on the Vista Index is 4.9, but my friend who only has 2 GB of the same OCZ RAM has it indexed at around 5.4 or something. Should I have a lot of slow RAM, or just a little fast RAM?

      Sorry for asking so many questions but I really need help on this purchasing decision. The 8800 GT is out and Call of Duty 4 and Crysis are on their way and I want to be able to play them as soon as possible. I’d really appreciate it if you guys can help me with this.

    • pixel_junkie
    • 12 years ago

    Too bad these cards are going for $260-$310 rather than the announced pricing plan. I know, they will come down, but why announce prices that no one is offering? I’m sure it’s still a good/great value. But no one is selling this card for anywhere near the $199 mark.

      • jobodaho
      • 12 years ago

      Yet, it just came out. All Nvidia can do is set an MSRP, they can’t/won’t tell them to sell it for less.

    • Usacomp2k3
    • 12 years ago

    The gal on the HSF shroud art is looking at me 😮
    Run away!

    • Stefan
    • 12 years ago

    Nice review. However, I would suggest using a slightly higher resolution and only going with 2xAA in Crysis as this is what I understand the majority of gamers (at least the ones I know) use for this game. (At the speed GPUs evolve lately, we can go to 16xAA next year 😉 )

    • SS4
    • 12 years ago

    About EVGa step up program, i got a 8800 GTS 320 mb for 300$, so suppose 8800 GT comes out at 200$, will i get 100$ back or will i have to get 2 of em for 100$ ?

      • Jigar
      • 12 years ago

      i would suggest to pay $100 more and be happy with SLI 😉

      • Gerbil Jedidiah
      • 12 years ago

      the prices on the EVGA trade up site are full retail, so more than likely anything you find in their trade up program that’s faster than your card will cost the same or more. Also, they may let you trade in most any card within 90 days, but they don’t offer all of their lineup to sell you when you trade up. If they have a card that is flying off the shelves like the 8800GT, you probably won’t find it on their list of cards you can purchase via the trade up policy.

      I went through this a year ago when I paid 300 for a superclocked 7900GT. They had the 7900GTO on newegg for 250, but the only cards EVGA offered in the trade up program faster than my card were the 7900GTXs at around 550, or something else similarly overpriced.

    • cappa84
    • 12 years ago

    Nice card. Too bad they are ripping us off in Australia. The bastards want $375AUD for the damn thing!

    • Mithent
    • 12 years ago

    Looking around various reviews – it looks like the 8800 GT keeps up well with the 8800 GTX pretty well, except when antialiasing comes into the equation? I’m trying to decide which card is overall best to get – obviously the GTX is faster and the GT is better value, but I don’t know if the GT is fast enough for reasonable future-proofing.

      • cobalt
      • 12 years ago

      Yes, particularly high resolutions and antialiasing seem to be a little better on the GTX. I don’t see that as a strong reason to recommend the GTX as a better solution to future proofing, though, simply because the GTX is literally twice as much as the GT. You can buy two $250 GT’s in SLI to fill that gap, or if it’s going to be a year before you need it, buy the current $250 GT now and replace it with whatever’s on the market for $250 later (which will undoubtedly be faster than the GTX anyway).

    • matnath1
    • 12 years ago

    NVIDIA ups The GTS to 112 SP’s according to the Hard/Ocp

    “UPDATE – 10/29/07-8:29am: A very interesting addendum to this. I just got off the phone with BFG Tech and NVIDIA has been doing some strange things lately. As of this morning, the GeForce 8800 GTS 640MB (unsure on the 320MB) will have its stream processors officially increased to 112, the same as the GT. This should put the GTS back ahead of the GT as per the paper specs. However, the separation in the products is still going to be very small except for those of you wanting to run high resolutions with AA turned on. To do that you are still going to need a $400+ video card..or so. Our new spec GTS is on the way to us now and we will of course be updating you. Given the GT’s faster clocks and possibly larger texture unit, we will have to wait and see. Undoubtedly though, the 8800 GT remains a stellar value at the expected price points.”
    http://enthusiast.hardocp.com/article.html?art=MTQxMCw2LCxoZW50aHVzaWFzdA==

      • cobalt
      • 12 years ago

      But it… http://enthusiast.hardocp.com/news.html?news=Mjg4ODEsLCxoZW50aHVzaWFzdCwsLDE=

    • Reputator
    • 12 years ago

    “Speaking of memory, the card for which the 8800 GT is arguably a replacement, the 320MB version of the GTS, stumbles badly here. This is why we were lukewarm on the GTS 320MB when it first arrived. Lots of GPU power isn’t worth much if you don’t have enough video memory.” When a game like Bioshock runs the same on the 320MB GTS as the 640MB GTS at 2560x1600 4xAA, on an engine that uses its own memory management system (as opposed to Managed DX engines), wouldn’t it be smarter to assume that this is the texture evict issue rearing its ugly head again?

      • Lord.Blue
      • 12 years ago

      Very likely, but supposedly, MS has released patches that address this, at least somewhat.

    • BoBzeBuilder
    • 12 years ago

    Ah. A sweet way to start the week for PC enthusiasts if you ask me. An avalanche of reviews for both Graphics and CPU’s. AHHhhhh….

    • Hance
    • 12 years ago

    This sucks I had a new RC Heli all picked out to waste money on now I dont know what to do. The chopper is 3 times the money the 8800GT is but I kind of want a new video card too.

    • insulin_junkie72
    • 12 years ago

    l[

      • Vrock
      • 12 years ago

      PAL speedup sucks. Boo for chipmunk voices.

        • insulin_junkie72
        • 12 years ago

        You can finish watching a movie 4% faster, though!

        Think of the time saved! You could save ten minutes on just GONE WITH THE WIND alone!

          • Vrock
          • 12 years ago

          Frankly, my dear, I don’t give a damn.

          😉

    • d0g_p00p
    • 12 years ago

    Nice card. However I am still happy with my 8800GTS 320MB version as it powers my 1680×1050 display at native rez for all games. I have not tried Crysis yet but I know my video card will weep. Plus I only paid $270 for it, so I am still pretty happy.

    However a second gen 98XX (or whatever) will be in my next build.

    • ChrisDTC
    • 12 years ago

    I see this review got Slashdoted, congrats

    • IntelMole
    • 12 years ago

    Good review, however your conclusion is slightly lacking in not pointing out that the AMD cards were the first to their respective process nodes, and were months late and far short of the clockspeed mark.

    Definitely a factor I feel if you’re adopting a wait-and-see attitude towards the chip.

    • My Johnson
    • 12 years ago

    I really thought Geoff would be writing this review after seeing the 6950 review released.

    • lethal
    • 12 years ago

    Any OC results? From other reviews I’ve seen the 8800GT was a hair from boiling itself with load temps near the 90C mark with the stock cooler, while the ol’ GTS with its beefier cooler ussually sailed north of 600 Mhz from stock (20% OC).

      • b_naresh
      • 12 years ago

      91 degrees Celcius under load with an open case according to HardOCP!!! Thats not good at all!

    • gerbilspy
    • 12 years ago

    Now why would anyone want to ruin Jessica Biel by giving her a brain?

      • moritzgedig
      • 12 years ago

      I don’t get that Jessica Biel thing.
      what’s special about her? aren’t there plenty of females like her in Hollywood? I don’t think she is dumber than the crowd.
      And what does this remark have to do in a good article?
      Are you trying to say that she would be the perfect female, for you, if she had a mind/brain of a scientist?
      I think she is the charismatic type not the hot brain like Sharon Stone.

        • sluggo
        • 12 years ago

        Four words for ya: Boo. Tee. Lish. Us.

          • Usacomp2k3
          • 12 years ago

          “Lish” isn’t a word. Oh, and you spelled “tea” wrong
          😉

    • Code:[M]ayhem
    • 12 years ago

    So 30FPS is considered playable the days?, my how times have changed.

      • Krogoth
      • 12 years ago

      No, it always has been like that.

      60FPS only matters for fast-pace games like Quake 3, UT2K4 etc.

      It is just that some ewankers get spoiled by software doesn’t push their hardware to their limit. Once titles come around and make their hardware bent on it kness. They go “OMFG! IT IS TOO SLOW! T3H SUCKS!”

        • Vrock
        • 12 years ago

        Why do you persist with this crap? Why can’t you just accept that different people have different standards of what they consider playable?

          • Ardrid
          • 12 years ago

          Because he’s on point. While intelligent people can certainly debate what amount of FPS feels “better” or “playable” to them, it’s a widely accepted fact that 30 FPS is the bare minimum of playability. And, quite honestly, until you hit 60 FPS, you can’t discern the difference between 30-59 FPS unless your minimums are dropping below 30.

            • Vrock
            • 12 years ago

            It’s totally subjective. What feels nice and smooth to you can be unplayable to someone else. 30fps is torture for me, I see judder everywhere. 24fps movies are even worse. I need the game world in a FPS to be as smooth and fluid as turning my head is in real life. It’s ridiculous for him or anyone else to insult those who don’t find the “bare minimum” tolerable, he does it all the time, and he needs to STFU about it already.

          • Krogoth
          • 12 years ago

          They got some pretty stringent standards or are just downright spoiled.

          60FPS isn’t really that much better than 30FPS outside of fast-pace action games and flicks. 60FPS does help with your twitch-actions and pulling off some split-second moves.

          IMO, none of the games feel completely natural in FOV movement or animation even at vaunted 60FPS. The recent blur motion effects that happens when you move around FOV in recent games looks so forced.

          The crux of the problem is that there is no display technology that match the capabilities of the human eye. 30-60FPS was just found to be a reasonable range for “smooth” animation without running to the wall of diminishing returns.

            • indeego
            • 12 years ago

            “60FPS isn’t really that much better than 30FPS outside of fast-pace action games and flicks. 60FPS does help with your twitch-actions and pulling off some split-second moves.”

            That is exactly why we still buy PC’s and these $250-$400 cards. This twitch and reaction time are crucial to some of us, it makes or breaks a game. 60 fps also means a higher mean. The mean is also vital.

            I’ve seen movies filmed at 60 fps and I much prefer them to movies in “HD”.

            • Krogoth
            • 12 years ago

            True, however those users only represent a small minority of hardcore FPS junkies. 😉

            • insulin_junkie72
            • 12 years ago

            l[

            • bthylafh
            • 12 years ago

            Depends on the film’s graininess, does it not?

            For an example of what I mean, take an old snapshot of you from when you were 5 years old. Have it professionally blown up. You’ll see the grains. More expensive film tends to be finer-grained.

            • insulin_junkie72
            • 12 years ago

            Grain just comes with the territory, I’d say, although it certainly varied with the type of film stock used.

            As an example off the top of my head, the film stock in common usage in the 70s was quite grainy, and pretty distinctive if you’ve watched a ton of movies from the era (what would define the 80s? Soft-focus overload?), but I wouldn’t classify it as ‘lower-resolution’ as a result.

            (I hate DVDs that aggressively filtered out normal film grain during the transfer, although that practice has seemed to diminish some the last few years. It’s FILM! It’s going to have FILM GRAIN! It’s like how newer CDs try to eliminate all the tape hiss from analog recordings during “remastering”…)

            • Vrock
            • 12 years ago

            Yup, the signature look of 80s flat films is soft focus and muted palette IMO.

            • Vrock
            • 12 years ago

            Nope, grain is not a factor here. And blowing up a print from a print is going to give you bad results anyway, regardless of grain inherent in the film stock.

            • Aihyah
            • 12 years ago

            wrong, the difference between film and video games is control. you have no control of film motion, which is why blur is fine. hell i’m sure everyones seen films with fast motion action scenes where you can barely make out anything thats going on because its so fast and there are so many cuts. only the control of the director keeps it from going totally off the rails. if you had to control with such limitations you would suffer. film motion blur is ok, its a great effect and fine for story telling. for gaming and precise control it is horrible. games have no motion blur to hide movement, its all crystal clear so lack of adequate frame rate is very apparent. 30fps is minimum, it is considered passable for playing games. for silky smooth is 60 fps. it all depends on your budget and willingness to spend on which fps level you will settle for.

            • Usacomp2k3
            • 12 years ago

            I find too much motion in film to be annoying and confusing and, to some degree, nauseating.

            • SS4
            • 12 years ago

            ill agree with u on that, i hate motion in movie especially with big figthing scene when u cant see whats happening, looks like a lazy way to do things imo.
            In games motion blur is horrible (NFS anyone???) If the option is there i always turn it off, damn ive never experienced motion blur in real life. Ok, maybe i cant see the asphalt clearly going 200 km/h but damn the world around me doesnt become blurry no matter what i do.

            • Usacomp2k3
            • 12 years ago

            The Bourne series is the worst at this, IMHO.

            • lyc
            • 12 years ago

            i totally agree, it’s a pity because they are pretty hardcore when you can make out wtf is going on.

            • lyc
            • 12 years ago

            the trouble is that no one does PROPER motion blur. a direct analogy can be made between blurring the screen and doing proper antialiasing – antialiasing isn’t just a blur, it’s an integral over all sub-pixels; similarly motion blur isn’t just smearing the screen, it’s an integral over time.

            so to get real motion blur you’d need to render lots of frames and blend them together (remember 3dfx’s t-buffer?). this obviously crushes performance and that’s why no one does it.

            • SS4
            • 12 years ago

            Exactly, they should just stop doing this stupid blur effect since they cant make it right and its totally junk in most game.

            • d0g_p00p
            • 12 years ago

            I don’t know TF2’s motion blur looks quite good.

            • lyc
            • 12 years ago

            haven’t played it, in fact i was really running my mouth because i hardly play any games these days; last ones were bioshock and oblivion, and before that for years i played mainly soldat!

            however, i still think it’s safe to assume that gamedevs won’t be plonking a 16x+ speed decrese just for some proper temporal-antialiasing (motion blur) anytime soon. well, maybe that’s why crysis is so damn slow, i’ve yet to check it out myself (i have a 8800 gts 640mb, quadcore) but from the thumbnailed screenshots i’ve seen it does have some motion blur.

            • indeego
            • 12 years ago

            It has massively annoying motion blur. Reminds me to turn it off next time I play it.

      • crazybus
      • 12 years ago

      Playable fps is completely dependent on the game. Some games are perfectly fine at 20-25fps while others need 50+ to feel smooth.

      • swaaye
      • 12 years ago

      15 fps was “playable” back in the good ‘ol N64 era. Heh.

      • Vrock
      • 12 years ago

      Tell me about it.

      • End User
      • 12 years ago

      If you plan on keeping a video card for a year or two then you want to avoid spending money on a card that maxes out at 30fps with current gaming technology.

      It kills me when people go on about why you need more fps than the eye can see. I want a card that will last more than 3 months before I need to upgrade. If there were a card on the market today that allowed me to play Crysis at a minimum of 100fps then sign me up! I don’t care that 100fps in Crysis is more than I need now, I am looking into the future and how that video card is going to handle Alan Wake!

    • WildBenchv2
    • 12 years ago

    It would have been nice to see the scores of a 1950Pro to compare. That card was arguably previous generations’ best price to (decent) performance card.

    • WallisHall
    • 12 years ago

    If they are raising the price because their normal supplier is out but their secondary supplier provides the card at a $10 premium then it’s not gouging. It’s selling the product at the best price available.

    Think about it.

    • ReAp3r-G
    • 12 years ago

    does DX10.1 need new hardware or can it exist on current 8 series? coz i’d hate to get this one and the 9’s are just around the corner and to find out only the 9’s have DX10.1 support…

      • emorgoch
      • 12 years ago

      10.1 will require new hardware. I believe AMD’s upcoming card(s) this month will be the first to have 10.1 support.

        • alex666
        • 12 years ago

        Is 10.1 really all that important?

          • ew
          • 12 years ago

          Is 10.0 really that important?

            • indeego
            • 12 years ago

            I’ll save some time: g{

            • ReAp3r-G
            • 12 years ago

            i’m just torn between waiting for a decent price and hopping on the GeForce 9k’s bandwagon…

            what significant plus points does that 0.1 add to current DX10?

            other than that global illumination thingy…

            so suggestions…what should i get? a 256? or 512 version of that card? i’m looking for a price range of $200-$250

            so far newegg and what other not vendors are above $250 🙁

            • insulin_junkie72
            • 12 years ago

            The prices are always higher and rather volatile the first week or two on things like this.

            Wait until all the stores have a steady supply coming in, and those prices will come down to list price and lower.

            • My Johnson
            • 12 years ago

            Remember the 9700?

    • flip-mode
    • 12 years ago

    Newegg’s got stock – $260 XFX:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814150252

      • alex666
      • 12 years ago

      Price went up $10 since I first read this, now $270.00.

        • flip-mode
        • 12 years ago

        LOL, gouging FTW.

          • alex666
          • 12 years ago

          I’ve been waiting for the release of this card, as I need to add a card to my new system. I’m anxious to see if more manufacturers release cards today, and at what price point. If this is any indication, I’m going to wait, though it would be fun to be among the first to own this card.

          • Krogoth
          • 12 years ago

          Supply and Demand.

          Newegg is a business not a charity.

          Geez. 🙄

      • alex666
      • 12 years ago

      A second one just popped up, an evga superclocked:

      http://www.newegg.com/Product/Product.aspx?Item=N82E16814130303

        • ew
        • 12 years ago

        Since when did the term superclock replace overclock? Or is there some subtle difference? Or does it just sound cooler?

          • flip-mode
          • 12 years ago

          EVGA’s usual marketing strategy is to sell several different models each at a different level of overclock. So for example:

          8800GT: stock speeds
          8800GT KO: moderate overclock
          8800GT Superclock: really high OC.

            • cobalt
            • 12 years ago

            In this case, at least, the KO is higher (675/1950) than the SC (650/1900). They’re also launching an “SSC” edition at 700/2000.

            http://www.evga.com/articles/378.asp (Interestingly, their product page for the 8800GTS SSC edition lists "96+" SPs.)

          • bthylafh
          • 12 years ago

          Be thankful it’s not extremeclocked or xclocked.

          Wait, though; DAMIT might just market the next gen cards as xxclockedx.

    • deathBOB
    • 12 years ago

    Scott, when you say new Ati cards will be out soon, do you mean weeks or months? Can you give a ballpark figure?

    I really, really want to get one of these but I want to see the Ati cards…

    • elmopuddy
    • 12 years ago

    I’d be curious to see if it puts the smack down on my 7900GTX, ie: a worthy upgrade or not…
    EP

      • Ruiner
      • 12 years ago

      check Anand’s for benches you’re looking for.

    • zgirl
    • 12 years ago

    Great review, but as mentioned the numbers for higher end cards would offer some perspective. It’s always nice to see them against one another.

    • BoBzeBuilder
    • 12 years ago

    An American coin? Blasphemy. Madness.

      • eitje
      • 12 years ago

      THIS…. IS…. TECH REPORT!

      • indeego
      • 12 years ago

      s[

        • Krogoth
        • 12 years ago

        Not exactly. The metal in the coins could be use for various applications. Granted you would need a lot of coins to get a decent quantity. 😉

        Sadly, a coin is worth more in its material cost than paperbacks.

    • Krogoth
    • 12 years ago

    This is what 8600GT should have been. If it wasn’t so darn crippled.

    8800GT is tempting, but I rather stick it out with my X1900XT. It still holds up pretty well and plays Crysis @1280×1024 High details with an acceptable framerate.

    • Jigar
    • 12 years ago

    Great review, Scott. 🙂

    • Entroper
    • 12 years ago

    Wow. I didn’t expect this great an offering from nVidia. Most tempting graphics card release in years from my standpoint…

    • zqw
    • 12 years ago

    kaching! How’s temps?

    • MadManOriginal
    • 12 years ago

    Looking forward to more info and detailed testing at more normal resolutions with various AA/AF options. You could also throw a GTX in there for some more perspective. And I wonder if the ‘hints’ about a full G92 are solid hints? ๐Ÿ˜‰ The 8800GTS is due to get an update on the G92 chip as well according to lots of rumors, I just don’t see how it could be a 112SP chip as well, even if it had more memory and mem bandwidth it would be too close performance-wise.

    You should also try Crysis on a quad core.

    • Vrock
    • 12 years ago

    Thanks for the review. IMO, this is what the 88xx series should have been when it released months ago (price, power, speed, etc). As it stands now, it’s /[

      • bthylafh
      • 12 years ago

      Were you one of those people who bought an iPhone the first day and got mad about the price cut? :rolleyes:

        • Entroper
        • 12 years ago

        Wrong reply.

        • Vrock
        • 12 years ago

        Er, no. My point is that Nvidias (and ATI’s, actually) initial run of DX 10 products just didn’t deliver alot of value for the price, and now with the refresh doing everything they can do better, cheaper, and more efficiently, it’s all the more evident. The people who bought them are stuck with underperforming, overpriced cards. That’s on them, though. I guess the lesson here is wait for the refresh, but I’d still like to see the companies offer the “refresh” first, if you get me.

          • Jigar
          • 12 years ago

          I don’t think 320MB card is a under performing card. We know eventually high end products are going to be replaced by something better. Btw 320MB card was always recommended for people with smaller monitors so i don’t think 320MB is going to under perform at low res.. (Crysis different story)

          • coldpower27
          • 12 years ago

          Why should they? With the new DX10 API they had a chance to cash in on that new feature set and charge a premium rather then concentrate on performance.

          This is nothing new, early adopters always typically pay a premium to have the best performance with the best feature set first.

          • lyc
          • 12 years ago

          you’ll have to agree that nvidia’s timing with the 8800 was really good, caught ati completely off-guard and it’s had a really excellent run.

          obviously we’d all prefer it if nvidia could have just launched a 65nm, 55nm or 32nm gf8 back when… however, there’s a “if you build it they will come” effect going on.

      • flip-mode
      • 12 years ago

      Early adopters always pay a big price. Everyone should know this. Besides, the 8800GTS was worth the money for its DX9 performance. It’s easy to see a few DX10 benchmarks and forget that. So I don’t consider those folks losers, they just made a choice that had very predictable results.

        • Entroper
        • 12 years ago

        I don’t think 8800 GTS buyers got screwed, that’s just early adoption. I do think 8600 GT/GTS buyers got screwed big time, since there was a much larger-than-usual gap between these cards and the high-end. The mid-range is where you’re supposed to get the most bang for your buck, and nVidia didn’t deliver that with the 8600 series. You could probably say these buyers were screwed before the 8800 GT — this just adds insult to injury.

          • swaaye
          • 12 years ago

          8600GT costs $100 or so. It’s also a power-frugal little thing that also has excellent HD vid acceleration. Seems to have value to me, eh.

            • Entroper
            • 12 years ago

            It cost quite a bit more than that when it was released. Right now, it’s priced as a low-end card and gives you the performance of a low-end card, so it’s a reasonable deal. Months ago, it was priced as a mid-range card and gave you the performance of a low-end card.

            • swaaye
            • 12 years ago

            Well if people bought it back then, LOL @ them. 🙂 I got one for $100 because it was $100 and was passively cooled. It actually performs quite well in a number of games. Oblivion, in particular. When I overclock the little thing to 750/950, it does Crysis at medium detail 1680×1050 quite playable.

      • totoro
      • 12 years ago

      Let me check….yup I’m pissed.
      At myself.

        • ReAp3r-G
        • 12 years ago

        ditto…

        waiting to see if DX10.1 is GeForce 9 territory…if it truly is 9 only, then i’ll wait somemore…if not i’ll get the 8800GT for sure

        can’t help to think i should’ve just ported my 7600GT from my older pc…oh well thats what you get for early adoption of DX10 hardware…

        i could live with medium settings on crysis for awhile…til i get better graphics…could someone tell if DX10.1 is ONLY for NVIDIA’s 9k’s or ATI’s 3k’s? or can it work on both 8 and 9’s (same logic applies for ATI’s offerings)

      • crazybus
      • 12 years ago

      I bought an 8600GTS a couple months ago and I’m not pissed. I got a decent deal on it and it got me the best performance for the money I was willing to spend at the time. In any case it’s an EVGA so I can step up if I want to.

        • ReAp3r-G
        • 12 years ago

        ah lucky you…i’m having the gigabyte 8600GT…its pretty decent…got me a medium on the crysis settings…which was more than enough…i could enjoy most of the eye candy…just waiting for the deals of the year to come (hint x’mas!!!) hehe

          • bthylafh
          • 12 years ago

          Must… play… Pac-Man.

        • St. Babu
        • 12 years ago

        I’m pissed. I bought my 8800GTS320 a week ago. I bought from the ‘Egg, so I may just RMA it and get an 8800GT instead.

    • Gerbil Jedidiah
    • 12 years ago

    I’m not so much interested in this card, but the high end 2 GPU variant would be nice, especially if it handles DX10 games as well as my 7950GX2 handles DX9 games. Of course, maybe I have to get something that is DX10.1 compliant. *sigh* it never ends

    Nonetheless, this appears to be an awesome value.

    • gbcrush
    • 12 years ago

    BRAVO! Damage, once again thank you and the crew for an excellent report.

    I just finished the AT report to find a few things that I wanted to know more about. Granted, I know some people have different feelings for AT, but I generally like them. Still, their report on the new GT felt rushed, and while I know that in many cases the 8800GTS 320 has similar performance to the 8800GTS 640, it makes me happy to see you’ve tested them both.

    • flip-mode
    • 12 years ago

    I was about to ask for a good old DX-9 title like FEAR or something just to put the old cards in perspective but you know what, the hell with it, the world moves on.

    Good review. Extremely convincing card. This is why it’s worth waiting for the refresh, not that I’m interested in buying one of these to let it languish in 2D mode its entire life.

    RV670 better be damn good – none of this “best value” mentality, it needs to bring best performance at the same time, well, at least competitive performance.

    Spelling error in the conclusion – Biel should be spelled Alba, in which case the brain comment is no longer necessary. But I can see your reasoning for choosing Biel since no sane, reasonable, trustworthy person would compare any sort of technology in the world to Alba.

    • flip-mode
    • 12 years ago

    Ho! The one-two punch! I’m feelin a bit dizzy now.

      • king_kilr
      • 12 years ago

      Damn, I’m getting nauseous from all the awesome coming out of TR, great works guys!

        • SuperSpy
        • 12 years ago

        Yeah, I saw the front page after reading the CPU review and went “Hrm, TR’s rotating article thingie is brok… OMG ANOTHER REVIEW”

        So much graphy goodness I almost don’t need my morning caffeine.

        Almost.

    • kvndoom
    • 12 years ago

    Guess all the NDA’s lifted today, whether NVIDIA wanted them to or not. Hell of a deal on that card. About time, too.
