This is an absolutely spectacular time to be a PC gamer. The slew of top-notch and hotly anticipated games hitting store shelves is practically unprecedented, including BioShock, Crysis, Quake Wars, Unreal Tournament 3, and Valve’s Orange Box trio of goodness. I can’t remember a time quite like it.
However, this may not be the best time to own a dated graphics card. The latest generation of high-end graphics cards brought with it pretty much twice the performance of previous high-end cards, and to add insult to injury, these GPUs added DirectX 10-class features that today’s games are starting to exploit. If you have last year’s best, such as a GeForce 7900 or Radeon X1900, you may not be able to drink in all the eye candy of the latest games at reasonable frame rates.
And if you’ve played the Crysis demo, you’re probably really ready to upgrade. I’ve never seen a prettier low-res slide show.
Fortunately, DirectX 10-class graphics power is getting a whole lot cheaper, starting today. Nvidia has cooked up a new spin of its GeForce 8 GPU architecture, and the first graphics card based on this chip sets a new standard for price and performance. Could the GeForce 8800 GT be the solution to your video card, er, Crysis? Let’s have a look.

Meet the G92
In recent years, graphics processor transistor budgets have been ballooning at a rate even faster than Moore’s Law, and that has led to some, um, exquisitely plus-sized chips. This fall’s new crop of GPUs looks to be something of a corrective to that trend, and the G92 is a case in point. This chip is essentially a die shrink of the G80 graphics processor that powers incumbent GeForce 8800 graphics cards. The G92 adds some nice new capabilities, but doesn’t double up on shader power or anything quite that earth-shaking.

Here’s an extreme close-up of the G92, which may convince your boss/wife that you’re reading something educational and technically edifying right about now. We’ve pictured it next to a U.S. quarter in order to further propagate the American hegemonic mindset. Er, I mean, to provide some context, size-wise. The G92 measures almost exactly 18 mm by 18 mm, or 324 mm². TSMC manufactures the chip for Nvidia on a 65nm fab process, which somewhat miraculously manages to shoehorn roughly 754 million transistors into this space. By way of comparison, the much larger G80, made on a 90nm process, had only 681 million transistors. AMD’s R600 GPU packs 700 million transistors into a 420 mm² die area.
Why, you may be asking, does the G92 have so many more transistors than the G80? Good question. The answer is: a great many little additions here and there, including some we may not know about just yet.
One big change is the integration of the external display chip that acted as a helper to the G80. The G92 natively supports twin dual-link DVI outputs with HDCP, without the need for a separate display chip. That ought to make G92-based video cards cheaper and easier to make. Another change is the inclusion of the VP2 processing engine for high-definition video decoding and playback, an innovation first introduced in the G84 GPU behind the GeForce 8600 lineup. The VP2 engine can handle the most intensive portions of H.264 video decoding in hardware, offloading that burden from the CPU.
Both of those capabilities are pulled in from other chips, but here’s a novel one: PCI Express 2.0 support. PCIe 2.0 effectively doubles the bandwidth available for communication between the graphics card and the rest of the system, and the G92 is Nvidia’s first chip to support this standard. This may be the least-hyped graphics interface upgrade in years, in part because PCIe 1.1 offers quite a bit of bandwidth already. Still, PCIe 2.0 is a major evolutionary step, though I doubt it chews up too many additional transistors.
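For a sense of scale (these are spec-sheet figures, not measurements): PCIe 1.x signals at 2.5 GT/s per lane and PCIe 2.0 at 5 GT/s, both with 8b/10b encoding, so ten bits cross the wire for every byte of data. A quick back-of-the-envelope sketch:

```python
# Usable PCIe bandwidth per direction, after 8b/10b encoding overhead.
def pcie_bw_gbs(gt_per_s, lanes=16):
    """Bandwidth in GB/s per direction for a slot with the given
    per-lane signaling rate (in GT/s) and lane count."""
    bytes_per_s_per_lane = gt_per_s * (8 / 10) / 8  # GT/s -> GB/s per lane
    return bytes_per_s_per_lane * lanes

print(pcie_bw_gbs(2.5))  # PCIe 1.x x16: 4.0 GB/s
print(pcie_bw_gbs(5.0))  # PCIe 2.0 x16: 8.0 GB/s
```

That works out to 4GB/s each way for a PCIe 1.x x16 slot and 8GB/s for PCIe 2.0, which is the doubling in question.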
So where else do the G92’s additional transistors come from? This is where things start to get a little hazy. You see, the GeForce 8800 GT doesn’t look to be a “full” implementation of G92. Although this chip has the same basic GeForce 8-series architecture as its predecessors, the GeForce 8800 GT officially has 112 stream processors, or SPs. That’s seven “clusters” of 16 SPs each. Chip designers don’t tend to do things in odd numbers, so I’d wager an awful lot of Nvidia stock that the G92 actually has at least eight SP clusters onboard.
Eight’s probably the limit, though, because the G92’s SP clusters are “fatter” than the G80’s; they incorporate the G84’s more robust texture addressing capacity of eight addresses per clock, up from four in the G80. That means the GeForce 8800 GT, with its seven SP clusters, can sample a total of 56 texels per clock, well beyond the 24 of the 8800 GTS and 32 of the 8800 GTX. We’ll look at the implications of this change in more detail in a sec.
Another area where the GeForce 8800 GT may be sporting a bit of trimmed down G92 functionality is in the ROP partitions. These sexy little units are responsible for turning fully processed and shaded fragments into full-blown pixels. They also provide much of the chip’s antialiasing grunt, and in Nvidia’s GeForce 8 architecture, each ROP has a 64-bit interface to video memory. The G80 packs six ROP partitions, which is why the full-blown GeForce 8800 GTX has a 384-bit path to memory and the sawed-off 8800 GTS (with five ROP partitions) has a 320-bit memory interface. We don’t know how many ROP partitions the G92 has lurking inside, but the 8800 GT uses only four of them. As a result, it has a 256-bit memory interface, can output a maximum of 16 finished pixels per clock, and has somewhat less antialiasing grunt on a clock-for-clock basis.
How many ROPs does G92 really have? I dunno. I suspect we’ll find out before too long, though.
The 8800 GT up close
What the 8800 GT lacks in functional units, it largely makes up in clock speed. The 8800 GT’s official core clock speed is 600MHz, and its 112 SPs run at 1.5GHz. The card’s 512MB of GDDR3 memory runs at 900MHz, or 1.8GHz effective, thanks to the memory’s doubled data rate.

MSI’s NX8800GT
Here’s a look at MSI’s rendition of the GeForce 8800 GT. Note the distinctive MSI decal. This card is further differentiated in a way that really matters: it comes hot from the factory, with a 660MHz core clock and 950MHz memory. This sort of “overclocking” has become so common among Nvidia’s board partners, it’s pretty much expected at this point. MSI doesn’t disappoint.
I don’t want to give too much away, since we’ve measured noise levels on a decibel meter, but you’ll be pleased to know that the 8800 GT’s single-slot cooler follows in the tradition of Nvidia’s coolers for its other GeForce 8800 cards. The thing is whisper-quiet.
The sight of a single-slot cooler may be your first hint that this is not the sort of video card that will put an ugly dent in your credit rating. Here’s another hint at the 8800 GT’s mainstream aspirations. Nvidia rates the power consumption of the 8800 GT at 110W, which makes the single-slot cooler feasible and also means the 8800 GT needs just one auxiliary PCIe power connector, of the six-pin variety, in order to do its thing.

The 8800 GT sports a single six-pin PCIe aux power connector
Another place where the 8800 GT sports only one connector is in the SLI department. That probably means the 8800 GT won’t be capable of ganging up with three or four of its peers in a mega-multi-GPU config. Two-way SLI is probably the practical limit for this card.
Here’s the kicker, though. 8800 GT cards are slated to become available today for between $199 and $249.
Doing the math
So that’s a nice price, right? Well, like so many things in life (and I sure as heck didn’t believe this in high school), it all boils down to math. If you take the 8800 GT’s seven SP clusters and 112 SPs and throw them into the blender with a 1.5GHz shader clock, a 256-bit memory interface, along with various herbs and spices, this is what comes out:
| | Peak pixel fill rate (Gpixels/s) | Peak texel sampling rate (Gtexels/s) | Peak bilinear texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak memory bandwidth (GB/s) | Peak shader arithmetic (GFLOPS) |
| GeForce 8800 GT | 9.6 | 33.6 | 33.6 | 16.8 | 57.6 | 504 |
| GeForce 8800 GTS | 10.0 | 12.0 | 12.0 | 12.0 | 64.0 | 346 |
| GeForce 8800 GTX | 13.8 | 18.4 | 18.4 | 18.4 | 86.4 | 518 |
| GeForce 8800 Ultra | 14.7 | 19.6 | 19.6 | 19.6 | 103.7 | 576 |
| Radeon HD 2900 XT | 11.9 | 23.8 | 11.9 | 11.9 | 105.6 | 475 |
In terms of texture sampling rates, texture filtering capacity, and shader arithmetic, the 8800 GT is actually superior to the 8800 GTS. It’s also quicker than the Radeon HD 2900 XT in most of those categories, although our FLOPS estimate for the GeForce GPUs is potentially a little rosy; another way of counting would reduce those numbers by a third, making the Radeon look relatively stronger. Also, thanks to its higher clock speed, the 8800 GT doesn’t suffer much in terms of pixel fill rate (and corresponding AA grunt) due to its smaller ROP count. The 8800 GT’s most noteworthy numbers may be its texture sampling and filtering rates. Since its SPs can grab twice as many texels per clock as the G80’s, its texture filtering performance with standard 8-bit integer color formats could be more than double that of the 8800 GTS.
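For the curious, the 8800 GT’s entries in the table above reduce to simple products of units and clocks. Here’s an illustrative sketch using the stock clocks; the three-flops-per-SP figure reflects the MADD+MUL counting mentioned above, which may be generous:

```python
# Rough peak-rate math for the GeForce 8800 GT at stock clocks.
core_mhz   = 600     # ROP/texture clock
shader_mhz = 1500    # stream processor clock
mem_mhz    = 900     # GDDR3, double data rate
rops       = 16      # 4 ROP partitions x 4 pixels/clock
texels_per_clk = 56  # 7 SP clusters x 8 texel addresses/clock
sps        = 112
bus_bits   = 256

pixel_fill = rops * core_mhz / 1000              # Gpixels/s
texel_rate = texels_per_clk * core_mhz / 1000    # Gtexels/s
bandwidth  = bus_bits / 8 * mem_mhz * 2 / 1000   # GB/s (DDR)
gflops     = sps * shader_mhz * 3 / 1000         # MADD + MUL counting

print(pixel_fill, texel_rate, bandwidth, gflops)
# 9.6 33.6 57.6 504.0
```

Counting only the MADD (two flops per SP per clock) would give 336 GFLOPS instead, which is the “reduce those numbers by a third” caveat in action.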
In graphics, math like this isn’t quite destiny, but it’s close. The only place where the 8800 GT really trails the 8800 GTS or the 2900 XT is in memory bandwidth. And, believe it or not, memory bandwidth is arguably at less of a premium these days, since games produce “richer” pixels that spend more time looping through shader programs and thus occupying on-chip storage like registers and caches.
Bottom line: the 8800 GT should generally be as good as or better than the 8800 GTS, for under 250 bucks. Let’s test that theory.
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.
Our test systems were configured like so:
| Processor | Core 2 Extreme X6800 2.93GHz |
| System bus | 1066MHz (266MHz quad-pumped) |
| Motherboard | XFX nForce 680i SLI |
| BIOS revision | P31 |
| North bridge | nForce 680i SLI SPP |
| South bridge | nForce 680i SLI MCP |
| Chipset drivers | ForceWare 15.08 |
| Memory size | 4GB (4 DIMMs) |
| Memory type | 2 x Corsair TWIN2X2048-8500C5D DDR2 SDRAM at 800MHz |
| CAS latency (CL) | 4 |
| RAS to CAS delay (tRCD) | 4 |
| RAS precharge (tRP) | 4 |
| Cycle time (tRAS) | 18 |
| Command rate | 2T |
| Audio | Integrated nForce 680i SLI/ALC850 with RealTek 6.0.1.5497 drivers |
| Graphics | GeForce 8800 GT 512MB PCIe with ForceWare 169.01 drivers |
| | XFX GeForce 8800 GTS XXX 320MB PCIe with ForceWare 169.01 drivers |
| | EVGA GeForce 8800 GTS OC 640MB PCIe with ForceWare 169.01 drivers |
| | Radeon HD 2900 XT 512MB PCIe with Catalyst 7.10 drivers |
| Hard drive | WD Caviar SE16 320GB SATA |
| OS | Windows Vista Ultimate x86 Edition |
| OS updates | KB36710, KB938194, KB938979, KB940105, DirectX August 2007 Update |
Please note that we’re using “overclocked in the box” versions of the 8800 GTS 320MB and 640MB, while we’re testing a stock-clocked GeForce 8800 GT reference card from Nvidia.
Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support easily outclass what you get with no-name DIMMs.
Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- Crysis demo
- Unreal Tournament 3 demo
- Team Fortress 2
- BioShock 1.0 with DirectX 10
- Lost Planet: Extreme Condition with DirectX 10
- FRAPS 2.9.2
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Crysis demo
The Crysis demo is still fresh from the oven, but we were able to test the 8800 GT in it. Crytek has included a GPU benchmarking facility with the demo that consists of a fly-through of the island in which the opening level of the game is set, and we used it. For this test, we set all of the game’s quality options at “high” (not “very high”) and set the display resolution to (believe it or not) 1280×800 with 4X antialiasing.
Even at this low res, these relatively beefy graphics cards chugged along. The game looks absolutely stunning, but obviously it’s using a tremendous amount of GPU power in order to achieve the look.


The demo is marginally playable at these settings, but I’d prefer to turn antialiasing off in order to get smoother frame rates on the 8800 GT. That’s what I did when I played through the demo, in fact.
Notice several things about our results. Although the 8800 GT keeps up with the 8800 GTS 640MB in terms of average frame rates, it hit lower lows of around 10 FPS, probably due to its lesser memory bandwidth or its smaller amount of total RAM onboard. Speaking of memory, the card for which the 8800 GT is arguably a replacement, the 320MB version of the GTS, stumbles badly here. This is why we were lukewarm on the GTS 320MB when it first arrived. Lots of GPU power isn’t worth much if you don’t have enough video memory. GTS 320MB owners will probably have to drop to “medium” quality in order to run Crysis smoothly.
Unreal Tournament 3 demo
We tested the UT3 demo by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
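That outlier-damping step amounts to very little code. A sketch, with invented frame-rate numbers purely for illustration:

```python
# Report the median of the per-session minimum frame rates, so one
# anomalous run doesn't skew the reported low. Numbers are made up.
from statistics import median

session_lows = [31, 28, 45, 29, 30]  # min FPS from five 60-second runs
reported_low = median(session_lows)
print(reported_low)  # 30
```

Note how the 45 FPS outlier has no effect on the reported figure, whereas a plain average would have been dragged up to 32.6.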
Because the Unreal engine doesn’t support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560×1600 and turned up the demo’s quality sliders to the max. I also disabled the demo’s 62 FPS frame rate cap before testing.


All of these cards can play the UT3 demo reasonably well at this resolution, the 8800 GT included. I noticed some brief slowdowns on the GTS 320MB right as I started the game, but those seemed to clear up after a few seconds.
Team Fortress 2
For TF2, I cranked up all of the game’s quality options, set anisotropic filtering to 16X, and used 4X multisampled antialiasing at 2560×1600 resolution. I then hopped onto a server with 24 players duking it out on the “ctf_2fort” map. I recorded a demo of me playing as a soldier, somewhat unsuccessfully, and then used the Source engine’s timedemo function to play the demo back and report performance.


The 8800 GT leads all contenders in TF2. Even at 2560×1600 with 4X AA and 16X aniso, TF2 is perfectly playable with this card, although that didn’t help my poor soldier guy much.
BioShock
We tested this game with FRAPS, just like we did the UT3 demo. BioShock’s default settings in DirectX 10 are already very high quality, so we didn’t tinker with them much. We just set the display res to 2560×1600 and went to town. In this case, I was trying to take down a Big Daddy, another generally unsuccessful effort.


A low of 23 FPS for the 8800 GT puts it right on the edge of smooth playability. The 8800 GT pretty much outclasses the Radeon HD 2900 XT here, amazingly enough. The 2900 XT couldn’t quite muster a playable frame rate at these settings, which my seat-of-the-pants impression confirmed during testing.
Lost Planet: Extreme Condition
Here’s another DX10 game. We ran this game in DirectX 10 mode at 1920×1200 with all of its quality options maxed out, plus 4X AA and 16X anisotropic filtering. We used the game’s built-in performance test, which tests two very different levels in the game, a snowy outdoor setting and a cave teeming with flying doodads.




Here’s another case where the 8800 GTS 320MB stumbles, while the 8800 GT does not. Although the Radeon HD 2900 XT lists for $399, it looks like an also-ran in most of our tests.
Power consumption
We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement.
The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running BioShock in DirectX 10 at 2560×1600 resolution, using the same settings we did for performance testing.


Nvidia has done a nice job with the G92’s power consumption. Our 8800 GT-based test system draws over 20 fewer watts at idle than any of the others tested. Under load, the story is similar. Mash up these numbers with the performance results, and you get a very compelling power efficiency picture.
Noise levels
We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.
You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.


Nvidia’s single-slot coolers have too often been gratuitously small and noisy in the past year or two, but the 8800 GT is different. This may be the quietest single-slot cooler we’ve ever tested (save for the passive ones), and it doesn’t grow audibly louder under load. That’s a pleasant surprise, since the thing can get very loud during its initial spin-up at boot time. Fortunately, it never visited that territory for us when running games.
Conclusions
You’ve seen the results for yourself, so you pretty much know what I’m going to say. The 8800 GT does a very convincing imitation of the GeForce 8800 GTS 640MB when running the latest games, even at high resolutions and quality settings, with antialiasing and high-quality texture filtering. Its G92 GPU has all of the GeForce 8800 goodness we’ve come to appreciate in the past year or so, including DX10 support, coverage-sampled antialiasing, and top-notch overall image quality. The card is quiet and draws relatively little power compared to its competitors, and it will only occupy a single slot in your PC. That’s a stunning total package, sort of what it would be like if Jessica Biel had a brain.
With pricing between $199 and $249, I find it hard to recommend anything else, especially since we found generally playable settings at 2560×1600 resolution in some of the most intensive new games (except for Crysis, which is in a class by itself). I expect we may see some more G92-based products popping up in the coming weeks or months, but for most folks, this will be the version to have.
The one potential fly in the ointment for the 8800 GT is its upcoming competition from AMD. As we were preparing this review, the folks from AMD contacted us to let us know that the RV670 GPU is coming soon, and that they expect it to bring big increases in performance and power efficiency along with it. In fact, the AMD folks sound downright confident they’ll have the best offering at this price point when the dust settles, and they point to several firsts they’ll be offering as evidence. With RV670, they expect to be the first to deliver a GPU fabbed on a 55nm process, the first to offer a graphics processor compliant with the DirectX 10.1 spec, and the first to support four-way multi-GPU configs in Windows Vista. DirectX 10.1 is a particular point of emphasis for AMD, because it allows for some nifty things like fast execution of global illumination algorithms and direct developer control of antialiasing sample patterns. Those enhancements, of course, will be pretty much academic if RV670-based cards don’t provide as compelling a fundamental mix of performance, image quality, and power efficiency as the GeForce 8800 GT. We’ll know whether they’ve achieved that very soon.
This concludes our first look at the 8800 GT, but it’s not the end of our evaluation process. I’ve been knee-deep in CPUs over the past month or so, culminating today with our review of the 45nm Core 2 Extreme QX9650 processor, and that’s kept me from spending all of the time with the 8800 GT that I’d like. Over the next week or so, I’ll be delving into multi-GPU performance, some image quality issues, HD video playback, more games, and more scaling tests. We may have yet another new video card for you shortly, too.

Post that same thing in the System Builder’s Anonymous forum, and you’ll get plenty of feedback.
https://techreport.com/forums/viewforum.php?f=33
“Lish” isn’t a word. Oh, and you spelled “tea” wrong
😉
Four words for ya: Boo. Tee. Lish. Us.
Hey, I was thinking of buying a graphics card, and at this point in time I’m going for pretty much a complete overhaul of my PC.
My current config is
AMD Athlon 3000+ 2.1 GHz
K8 series motherboard
512MB RAM
Nvidia GeForce 5200 (how I hate to admit it)
Well, my current plan is to buy the 8800 GT and change the motherboard to a K8 SLi-eSATA2, which supports my 754-pin processor.
Also, the memory will be cranked up to 1.5GB.
Do you think this would be good enough to last me a year and a half or so, or do I have to change the processor as well? Also, the 8800 GT seems the winner; is it a safe bet?
Believe it or not, we’ve given a lot of thought to that question and have come up with a different answer than the one implied by your assertion about the proportional sales of stock-clocked versus higher-clocked versions of these products. The reality is that each of Nvidia’s model name “brands” has become an umbrella under which a range of products reside, and we’re forced to account for that.
We debated how to do it, but in this generation, I decided to go ahead and test higher-clocked versions for several reasons. One is that we tend to test what we’d want to buy for ourselves, and these higher-clocked versions can be pretty good values. Even if we concede that they don’t represent the majority of sales via OEMs and retail (and I’ve seen no numbers to prove that), they are the sort of thing I’d order from an e-tailer like Newegg for myself–and I expect most TR readers would, too. Another is that, by now, Nvidia and its board partners have established this tiered-speed model for their cards very well, having done a long string of product launches where the “OCed” versions of cards are available right away. This is more than just a ploy for good reviews. Also, Nvidia seems to have pretty much decided, around the time the 2900 XT launched, that the 8800 GTS should get an effective clock speed bump. We’ve seen an awful lot of 575MHz-core GTS cards since then at very good prices, such as the MSI GTS 640MB card we recommended in the Sweet Spot box in our latest system guide. Given the prices and widespread availability of cards like that, it only seemed fair to test the higher-clocked GTS variants.
Since most of your questions seem to show a concern for us being fair to ATI/AMD, let me go ahead and address this question on their side of the fence. I had hoped to get some higher-clocked 2900 XTs to test in this generation of products, but it’s proven difficult because ATI’s board partners are fewer and, when I’ve asked them, they’ve told me the higher-clock cards aren’t available for review and won’t be shipping widely into the channel. In fact, I really thought the 1GB version of the 2900 XT we reviewed was going to be a higher-clocked card–right up until I started testing it and found out otherwise. ’twas disappointing, but also a dose of reality.
We will keep an eye on the issue of stock-vs-hot-clocked cards and what’s “typical” for enthusiasts going forward. Already, that’s leading me to look into adding higher-clocked 8800 GT cards to my next round of tests, in order to be fair–not into lower-clocked GTS cards or the like. At some point, market realities may pull in another direction, though, and hopefully we’ll catch it when it happens.
Why don’t you test more stock products? Doesn’t it muddy the waters more than a little to test some super-clocked KO XXX version of a new board instead of the stock version that will account for 90% of the sales?
I’d be so all over a 1GB version.
Based on the Crysis scores, the GT is just begging for more memory. Maybe someone will come out with a 1GB version?
I don’t get that Jessica Biel thing.
what’s special about her? aren’t there plenty of females like her in Hollywood? I don’t think she is dumber than the crowd.
And what does this remark have to do in a good article?
Are you trying to say that she would be the perfect female, for you, if she had a mind/brain of a scientist?
I think she is the charismatic type not the hot brain like Sharon Stone.
the prices on the EVGA trade up site are full retail, so more than likely anything you find in their trade up program that’s faster than your card will cost the same or more. Also, they may let you trade in most any card within 90 days, but they don’t offer all of their lineup to sell you when you trade up. If they have a card that is flying off the shelves like the 8800GT, you probably won’t find it on their list of cards you can purchase via the trade up policy.
I went through this a year ago when I paid 300 for a superclocked 7900GT. They had the 7900GTO on newegg for 250, but the only cards EVGA offered in the trade up program faster than my card were the 7900GTXs at around 550, or something else similarly overpriced.
i would suggest to pay $100 more and be happy with SLI 😉
EDIT: I see you did start a forum topic, so I’ll leave this one be.
I will definitely get more of the OCZ stuff, I know it’s better than what my Dell came with. Crucial does do the scan but they don’t tell me what exactly my RAM modules are, just what it is compatible with. And I am using dual channel with 2×1 GB of the OCZ and 2×512 of the Dell stuff.
My monitor’s resolution is 1680×1050 (Dell 22-inch) with a DVI and VGA output. Also, I’m sure the 8800 GT will fit in my case as my friend had an 8800 GTS and I’ve seen the size of it, also since the GT is much thinner than the GTS is.
Here are pics of my PC
http://i236.photobucket.com/albums/ff167/thevagraunt/IMG_0498.jpg
There’s the PSU details. What does it mean? What’s the amperage of my PSU? It’s 5 1/2 long, 3 1/2 tall, and about 6 wide. I look for this because the last time I tried fitting a different PSU into my other Dell it wouldn’t fit, but I wound up finding a 430W that would. How’s the sizing go? Will it be able to fit, say, a 450-500W PSU with two fans and the like?
Inside of the case (couldn’t find the PCIe six-pin connector thing):
http://i236.photobucket.com/albums/ff167/thevagraunt/IMG_0496.jpg
Thanks for all your help guys, appreciate it. :)
Crap… forgot to hit the reply button.
It’s your resolution. At 1024×768 you have about a third of the pixels that a 1600×1200 monitor displays. I bet that at 800×600 Crysis will run on a 9600 Pro. It will look like a turd, but it will run.
I was wondering, I bought a Dell recently, the Inspiron 530, which has a new small form factor (but not the ridiculously small ones). It comes with a PSU that has a steady rate of 300 W (not sure what that means, just repeating what the Dell rep. told me), but a max output of 460 W. I’m not sure about the volts and amps. (doesn’t the GT require 12V?). Anyways, I was looking at the power consumption part of this test, as right now that’s the only concern I have regarding a purchase of this card. The test says that it runs 231 W under load. With my 300 W PSU, will I be able to run a 8800 GT and not have to worry about it crashing my PC or anything? The main reason I want to get a new GPU is for Crysis. Here are my specs:
Intel Core 2 Duo 2GHz (although the test says it runs at 1.99, and since it’s a Dell, OC’ing is going to be difficult if not impossible)
3 GB RAM
-2 GB OCZ Platinum 4-4-4-15, DDR2-6400 at 800MHz
-1 GB of whatever RAM it is Dell sends
300 W (steady rate) PSU
GeForce 8300 GS 128 MB
So with all that stuff using up power from my PSU, will I be able to run an 8800 GT fine on it? Also, what’s with this extra power connector thing I see on the card? Will the 8800 come with said cable, and how will I plug it into my PSU?
As a final question, do you guys think I should keep the Dell RAM in there or take it out? Right now my memory on the Vista Index is 4.9, but my friend who only has 2 GB of the same OCZ RAM has it indexed at around 5.4 or something. Should I have a lot of slow RAM, or just a little fast RAM?
Sorry for asking so many questions but I really need help on this purchasing decision. The 8800 GT is out and Call of Duty 4 and Crysis are on their way and I want to be able to play them as soon as possible. I’d really appreciate it if you guys can help me with this.
Ahhh, thanks Damage. I was just like, WTF? heh.
Crysis is great. Exactly what I wanted. I’ve played it through on three difficulty levels and played completely differently each time (weapons, paths (for the most part), and styles of attack). I know it’s semi-mindless, but it has one of the most important aspects of a game: it’s fun.
Keep that attitude, and it won’t be long before you’ll need a new video card to play a new game.
Wait… Crysis pretty much already does that. Well then… I guess it’s okay. Keep thumbs-upping developers and delivering… shitty products.
Yup, I remember that old 30/60 fps demo 3dfx had. ‘Twas an eye opener, for sure.
Alas for 3dfx. They understood that frame rate was king. Without a good frame rate, all the fancy shaders and high dynamic range lighting in the world aren’t worth squat.
72 fps at most, eh? http://amo.net/NT/02-21-01FPS.html
that is a very good point also
Save the attitude, my friend; it’s not needed.
I just found it odd, I don’t think i’ve ever seen any TR videocard reviews that weren’t focusing on SLI running just that res.
Obviously it will run at the lower res, but I think people still want to see the numbers and not just assume. I guess its just me.
More importantly, today’s games will run smooth as butter on a lesser display, and tomorrow’s games will run acceptably.
I guess that it takes a genius to realize that if it runs at such high resolution, it will run well at lower resolutions… :roll_eyes:
A single 8800 GT is playable at the native res of a 30″ display that very few people who game actually own?
Its to stress the GPUs. And because these cards can handle such high resolutions.
I must have missed it but why was the testing done at such a high res?
Sweet card, but the slide-show crack re: Crysis Demo left me scratching my head. What am I doing wrong, other than not turning the visual options to the max? Medium-everything at 1024×768 seems to run pretty well on a year-old Radeon X1950 Pro…..and certainly far better than I was expecting based on all the forum hissing.
It has massively annoying motion blur. Reminds me to turn it off next time I play it.
it wasn’t a pre-recorded demo, the gameplay was different each time. this was well explained 😛
In that case, the 320MB slightly outdid the 640MB because it was just a tad (5MHz) higher in core clock. (Remember, both of the GTSes I tested were retail products with hot clocks; the 8800 GT was not.)
As for exactly why the 8800 GT was faster in UT, it probably boils down to shader power. Since UT doesn’t support multisampled AA, the additional ROPs (and perhaps memory bandwidth) in the GTS weren’t much help.
haven’t played it, in fact i was really running my mouth because i hardly play any games these days; last ones were bioshock and oblivion, and before that for years i played mainly soldat!
however, i still think it’s safe to assume that game devs won’t be plonking a 16×+ speed decrease on us just for some proper temporal antialiasing (motion blur) anytime soon. well, maybe that’s why crysis is so damn slow; i’ve yet to check it out myself (i have an 8800 GTS 640MB and a quad core), but from the thumbnailed screenshots i’ve seen, it does have some motion blur.
I don’t know, TF2’s motion blur looks quite good.
The human eye can discern 72 FPS at most, so the closer you get to that, the more real it appears; it’s only natural. Past that point there should be no difference whatsoever, though.
Exactly. They should just stop doing this stupid blur effect, since they can’t make it right and it’s totally junk in most games.
the trouble is that no one does PROPER motion blur. a direct analogy can be made between blurring the screen and doing proper antialiasing: antialiasing isn’t just a blur, it’s an integral over all sub-pixels; similarly, motion blur isn’t just smearing the screen, it’s an integral over time.
so to get real motion blur you’d need to render lots of frames and blend them together (remember 3dfx’s T-buffer?). this obviously crushes performance, and that’s why no one does it.
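A toy sketch of that integral-over-time idea: averaging N sub-frames rendered at different shutter times approximates the integral, at N times the render cost. Everything here is made up for illustration (the "renderer" is just a 1-D bright bar), but the averaging step is the T-buffer / accumulation-buffer technique in miniature.

```python
def render(t):
    # Stand-in for a frame renderer (hypothetical): a 1-D "image" of 64
    # pixels containing a bright bar whose position depends on time t.
    return [max(0.0, 1.0 - 8.0 * abs(i / 63 - t)) for i in range(64)]

def motion_blur(t0, t1, subframes):
    # Approximate the integral over the shutter interval [t0, t1] by
    # averaging several sub-frames -- the accumulation/T-buffer idea.
    times = [t0 + (t1 - t0) * k / (subframes - 1) for k in range(subframes)]
    frames = [render(t) for t in times]
    return [sum(px) / subframes for px in zip(*frames)]

sharp = render(0.5)                   # one instantaneous sample: no blur
blurred = motion_blur(0.4, 0.6, 16)   # 16x the render cost for one frame
```

The blurred frame’s peak brightness comes out lower than the sharp frame’s because the bar’s energy is spread along its path, which is exactly what a real camera shutter does, and exactly why doing it honestly is so expensive.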
i totally agree, it’s a pity because they are pretty hardcore when you can make out wtf is going on.
I think you need to address why the 8800 GT outperformed the 320MB GTS, which outperformed the 640MB GTS, on page 4 under the UT demo. Seems odd that the 320 would fare better than its beefier brother. I saw that and was left hoping that you would say something about it.
Yet, it just came out. All Nvidia can do is set an MSRP, they can’t/won’t tell them to sell it for less.
Too bad these cards are going for $260-$310 rather than the announced pricing plan. I know, they will come down, but why announce prices that no one is offering? I’m sure it’s still a good/great value. But no one is selling this card for anywhere near the $199 mark.
91 degrees Celsius under load with an open case, according to HardOCP! That’s not good at all!
The gal on the HSF shroud art is looking at me 😮
Run away!
Nice review. However, I would suggest using a slightly higher resolution and only going with 2xAA in Crysis as this is what I understand the majority of gamers (at least the ones I know) use for this game. (At the speed GPUs evolve lately, we can go to 16xAA next year 😉 )
The Bourne series is the worst at this, IMHO.
Well if people bought it back then, LOL @ them. 🙂 I got one for $100 because it was $100 and was passively cooled. It actually performs quite well in a number of games. Oblivion, in particular. When I overclock the little thing to 750/950, it does Crysis at medium detail 1680×1050 quite playable.
About EVGA’s Step-Up program: I got an 8800 GTS 320MB for $300, so suppose the 8800 GT comes out at $200. Will I get $100 back, or will I have to get two of them for $100?
I’ll agree with you on that. I hate motion blur in movies, especially in big fighting scenes when you can’t see what’s happening. Looks like a lazy way to do things, imo.
In games, motion blur is horrible (NFS, anyone?). If the option is there I always turn it off. Damn, I’ve never experienced motion blur in real life. OK, maybe I can’t see the asphalt clearly going 200 km/h, but damn, the world around me doesn’t become blurry no matter what I do.
Nice card. Too bad they are ripping us off in Australia. The bastards want $375AUD for the damn thing!
Yes, particularly high resolutions and antialiasing seem to be a little better on the GTX. I don’t see that as a strong reason to recommend the GTX as a better solution for future-proofing, though, simply because the GTX is literally twice as much as the GT. You can buy two $250 GTs in SLI to fill that gap, or, if it’s going to be a year before you need it, buy the current $250 GT now and replace it with whatever’s on the market for $250 later (which will undoubtedly be faster than the GTX anyway).
If you plan on keeping a video card for a year or two then you want to avoid spending money on a card that maxes out at 30fps with current gaming technology.
It kills me when people go on about why you need more fps than the eye can see. I want a card that will last more than 3 months before I need to upgrade. If there were a card on the market today that allowed me to play Crysis at a minimum of 100fps then sign me up! I don’t care that 100fps in Crysis is more than I need now, I am looking into the future and how that video card is going to handle Alan Wake!
Looking around various reviews, it looks like the 8800 GT keeps up with the 8800 GTX pretty well, except when antialiasing comes into the equation? I’m trying to decide which card is best overall to get; obviously the GTX is faster and the GT is better value, but I don’t know if the GT is fast enough for reasonable future-proofing.
But it… §[<http://enthusiast.hardocp.com/news.html?news=Mjg4ODEsLCxoZW50aHVzaWFzdCwsLDE=<]§
I find too much motion in film to be annoying and confusing and, to some degree, nauseating.
Wrong. The difference between film and video games is control. You have no control over film motion, which is why blur is fine there. Hell, I’m sure everyone’s seen films with fast-motion action scenes where you can barely make out anything that’s going on because it’s so fast and there are so many cuts; only the control of the director keeps it from going totally off the rails. If you had to play with such limitations, you would suffer. Film motion blur is OK; it’s a great effect and fine for storytelling. For gaming and precise control it is horrible. Games have no motion blur to hide movement; it’s all crystal clear, so a lack of adequate frame rate is very apparent. 30 fps is the minimum considered passable for playing games; 60 fps is silky smooth. It all depends on your budget, and your willingness to spend, which fps level you will settle for.
Very likely, but supposedly, MS has released patches that address this, at least somewhat.
NVIDIA ups the GTS to 112 SPs, according to HardOCP:
“UPDATE – 10/29/07-8:29am: A very interesting addendum to this. I just got off the phone with BFG Tech and NVIDIA has been doing some strange things lately. As of this morning, the GeForce 8800 GTS 640MB (unsure on the 320MB) will have its stream processors officially increased to 112, the same as the GT. This should put the GTS back ahead of the GT as per the paper specs. However, the separation in the products is still going to be very small except for those of you wanting to run high resolutions with AA turned on. To do that you are still going to need a $400+ video card..or so. Our new spec GTS is on the way to us now and we will of course be updating you. Given the GT’s faster clocks and possibly larger texture unit, we will have to wait and see. Undoubtedly though, the 8800 GT remains a stellar value at the expected price points.”
§[<http://enthusiast.hardocp.com/article.html?art=MTQxMCw2LCxoZW50aHVzaWFzdA==<]§
“Speaking of memory, the card for which the 8800 GT is arguably a replacement, the 320MB version of the GTS, stumbles badly here. This is why we were lukewarm on the GTS 320MB when it first arrived. Lots of GPU power isn’t worth much if you don’t have enough video memory.”
When a game like BioShock runs the same on the 320MB GTS as the 640MB GTS at 2560×1600 4xAA, on an engine that uses its own memory management system (as opposed to managed DX engines), wouldn’t it be smarter to assume that this is the texture-evict issue rearing its ugly head again?
Frankly, my dear, I don’t give a damn.
😉
You can finish watching a movie 4% faster, though!
Think of the time saved! You could save ten minutes on just GONE WITH THE WIND alone!
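For what it’s worth, the arithmetic behind that joke roughly checks out, assuming the approximately 238-minute theatrical cut:

```python
# PAL television runs 24 fps film at 25 fps, so playback is 25/24 as fast.
speedup = 25 / 24 - 1                 # fractional speedup: ~4.17%

def pal_runtime(film_minutes):
    # Minutes a 24 fps film lasts when sped up to 25 fps for PAL.
    return film_minutes * 24 / 25

gwtw = 238                            # Gone with the Wind: ~238 minutes
saved = gwtw - pal_runtime(gwtw)      # ~9.5 minutes shaved off
```

The same 25/24 speedup is also what shifts the audio pitch up, hence the chipmunk-voice complaint below.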
Actually, the film runs at 24 fps, I believe, but 23.976 fps is used so 3:2 pulldown can be applied to get NTSC’s 59.94 fields/sec.
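A quick sanity check on those numbers (3:2 pulldown alternates 3 fields and 2 fields per film frame, so 4 frames become 10 fields, or 2.5 fields per frame):

```python
FILM_FPS = 24
NTSC_FACTOR = 1000 / 1001              # NTSC's slight slowdown

film_on_ntsc = FILM_FPS * NTSC_FACTOR  # 23.976... fps
# 3:2 pulldown maps 4 film frames onto 10 video fields (3+2+3+2),
# i.e. 2.5 fields per film frame:
fields_per_sec = film_on_ntsc * 2.5    # 59.94... fields/sec
```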
PAL speedup sucks. Boo for chipmunk voices.
Ah. A sweet way to start the week for PC enthusiasts if you ask me. An avalanche of reviews for both Graphics and CPU’s. AHHhhhh….
This sucks. I had a new RC heli all picked out to waste money on; now I don’t know what to do. The chopper is three times the money the 8800 GT is, but I kind of want a new video card too.
I’m pissed. I bought my 8800GTS320 a week ago. I bought from the ‘Egg, so I may just RMA it and get an 8800GT instead.
Faster is almost always better, but it really isn’t necessary for a lot of games. At the same time, 60 fps is noticeably better than 30 even if 30 will suffice. Just look at Mario Galaxy vs. Mario Sunshine. Neither of these games requires 60 fps to be perfectly playable, but just look at how…
I thought it was 23.97?
24fps in movies is not the same as 24fps in games. Cameras will capture some motion blur, whether from subject action or from camera panning, which helps the eyes smooth out the frame transitions. Sure, games are starting to use various methods of motion blur, but it’s not quite the same. My main issue with 24fps film is telecine judder.
Nice card. However I am still happy with my 8800GTS 320MB version as it powers my 1680×1050 display at native rez for all games. I have not tried Crysis yet but I know my video card will weep. Plus I only paid $270 for it, so I am still pretty happy.
However a second gen 98XX (or whatever) will be in my next build.
I see this review got Slashdoted, congrats
I think it’d be pretty bad marketing for Nvidia to call their next series 9xxx, considering ATI already went there. I know ATI also already did 8xxx, but the 9xxx series was the one that did really well.
No, I think they’ll change it up before then. Maybe to a three digit system?
Good review; however, your conclusion is slightly lacking in not pointing out that the AMD cards were the first to their respective process nodes, and were months late and far short of the clock-speed mark.
Definitely a factor I feel if you’re adopting a wait-and-see attitude towards the chip.
Yup, the signature look of 80s flat films is soft focus and muted palette IMO.
Nope, grain is not a factor here. And blowing up a print from a print is going to give you bad results anyway, regardless of grain inherent in the film stock.
Grain just comes with the territory, I’d say, although it certainly varied with the type of film stock used.
As an example off the top of my head, the film stock in common usage in the 70s was quite grainy, and pretty distinctive if you’ve watched a ton of movies from the era (what would define the 80s? Soft-focus overload?), but I wouldn’t classify it as ‘lower-resolution’ as a result.
(I hate DVDs that aggressively filtered out normal film grain during the transfer, although that practice has seemed to diminish some the last few years. It’s FILM! It’s going to have FILM GRAIN! It’s like how newer CDs try to eliminate all the tape hiss from analog recordings during “remastering”…)
Depends on the film’s graininess, does it not?
For an example of what I mean, take an old snapshot of you from when you were 5 years old. Have it professionally blown up. You’ll see the grains. More expensive film tends to be finer-grained.
Remember the 9700?
I really thought Geoff would be writing this review after seeing the 6950 review released.
Must… play… Pac-Man.
Be thankful it’s not extremeclocked or xclocked.
Wait, though; DAMIT might just market the next gen cards as xxclockedx.
For action flicks, I cannot agree with that statement more.
Slower-paced flicks (admittedly a rarity these days) can get away with a lower FPS.
The whole 24 FPS problem with movies is due to film length (inches per second): 60 FPS would require 2.5 times the film length of 24 FPS.
True, however those users only represent a small minority of hardcore FPS junkies. 😉
Again, 24 fps in movies sucks as well. Go watch a movie at 60 fps that has scenes where the camera pans, and come back here and let us know if there is a difference. It’s huge, and I think it’s more important than HD. <...<
“60 FPS isn’t really that much better than 30 FPS outside of fast-paced action games and flicks. 60 FPS does help with your twitch actions and pulling off some split-second moves.”
That is exactly why we still buy PCs and these $250-$400 cards. Twitch and reaction time are crucial to some of us; it makes or breaks a game. 60 fps also means a higher mean frame rate, and the mean is also vital.
I’ve seen movies filmed at 60 fps and I much prefer them to movies in “HD”. <.<
They got some pretty stringent standards or are just downright spoiled.
60 FPS isn’t really that much better than 30 FPS outside of fast-paced action games and flicks. 60 FPS does help with your twitch actions and pulling off some split-second moves.
IMO, none of these games feels completely natural in FOV movement or animation even at the vaunted 60 FPS. The motion blur effects that happen when you move the FOV around in recent games look so forced.
The crux of the problem is that there is no display technology that matches the capabilities of the human eye. 30-60 FPS was just found to be a reasonable range for “smooth” animation without running into the wall of diminishing returns.
It cost quite a bit more than that when it was released. Right now, it’s priced as a low-end card and gives you the performance of a low-end card, so it’s a reasonable deal. Months ago, it was priced as a mid-range card and gave you the performance of a low-end card.
It’s totally subjective. What feels nice and smooth to you can be unplayable to someone else. 30fps is torture for me, I see judder everywhere. 24fps movies are even worse. I need the game world in a FPS to be as smooth and fluid as turning my head is in real life. It’s ridiculous for him or anyone else to insult those who don’t find the “bare minimum” tolerable, he does it all the time, and he needs to STFU about it already.
if it is indeed GTS 2.0 and it is the only card (besides ATI) that can support 9.0, uber price drop for GTS 1.0? 😀 and maybe being overzealous here…uber^2 GT price drop? 😉
ah lucky you…i’m having the gigabyte 8600GT…its pretty decent…got me a medium on the crysis settings…which was more than enough…i could enjoy most of the eye candy…just waiting for the deals of the year to come (hint x’mas!!!) hehe
trying to make it look movie-ish (24fps is the actual film speed) but 30 is fine…
i didn’t actually mind the lag but then it was just plain weird…been playing at 60fps or more for all of my games and its just weird to have to turn to “slideshow” speeds :p
Because he’s on point. While intelligent people can certainly debate what amount of FPS feels “better” or “playable” to them, it’s a widely accepted fact that 30 FPS is the bare minimum of playability. And, quite honestly, until you hit 60 FPS, you can’t discern the difference between 30-59 FPS unless your minimums are dropping below 30.
Why do you persist with this crap? Why can’t you just accept that different people have different standards of what they consider playable?
I bought an 8600GTS a couple months ago and I’m not pissed. I got a decent deal on it and it got me the best performance for the money I was willing to spend at the time. In any case it’s an EVGA so I can step up if I want to.
Tell me about it.
15 fps was “playable” back in the good ‘ol N64 era. Heh.
Playable fps is completely dependent on the game. Some games are perfectly fine at 20-25fps while others need 50+ to feel smooth.
Any OC results? From other reviews I’ve seen, the 8800 GT was a hair from boiling itself, with load temps near the 90°C mark on the stock cooler, while the ol’ GTS with its beefier cooler usually sailed north of 600 MHz from stock (a 20% OC).
No, it always has been like that.
60FPS only matters for fast-pace games like Quake 3, UT2K4 etc.
It is just that some ewankers get spoiled by software that doesn’t push their hardware to its limit. Once titles come around that bring their hardware to its knees, they go, “OMFG! IT IS TOO SLOW! T3H SUCKS!”
8600GT costs $100 or so. It’s also a power-frugal little thing that also has excellent HD vid acceleration. Seems to have value to me, eh.
you’ll have to agree that nvidia’s timing with the 8800 was really good; it caught ATI completely off-guard, and it’s had a really excellent run.
obviously we’d all prefer it if nvidia could have just launched a 65nm, 55nm, or 32nm GF8 back then; however, there’s an “if you build it, they will come” effect going on.
Now why would anyone want to ruin Jessica Biel by giving her a brain?
The prices are always higher and rather volatile the first week or two on things like this.
Wait until all the stores have a steady supply coming in, and those prices will come down to list price and lower.
So 30 FPS is considered playable these days? My, how times have changed.
i’m just torn between waiting for a decent price and hopping on the GeForce 9k’s bandwagon…
what significant plus points does that 0.1 add to current DX10?
other than that global illumination thingy…
so, suggestions: what should i get, the 256MB or 512MB version of that card? i’m looking at a price range of $200-$250
so far newegg and the other vendors are above $250 🙁
Not exactly. The metal in the coins could be used for various applications. Granted, you would need a lot of coins to get a decent quantity. 😉
Sadly, a coin is worth more in its material cost than paperbacks.
It would have been nice to see the scores of an X1950 Pro for comparison. That card was arguably the previous generation’s best price-to-(decent)-performance card.
In this case, at least, the KO is higher (675/1950) than the SC (650/1900). They’re also launching an “SSC” edition at 700/2000.
§[<http://www.evga.com/articles/378.asp<]§ (Interestingly, their product page for the 8800GTS SSC edition lists "96+" SPs.)
I’ll save some time:
EVGA’s usual marketing strategy is to sell several different models each at a different level of overclock. So for example:
8800GT: stock speeds
8800GT KO: moderate overclock
8800GT Superclock: really high OC.
If they are raising the price because their normal supplier is out but their secondary supplier provides the card at a $10 premium then it’s not gouging. It’s selling the product at the best price available.
Think about it.
Since when did the term superclock replace overclock? Or is there some subtle difference? Or does it just sound cooler?
Is 10.0 really that important?
Is 10.1 really all that important?
10.1 will require new hardware. I believe AMD’s upcoming card(s) this month will be the first to have 10.1 support.
A second one just popped up, an evga superclocked:
§[<http://www.newegg.com/Product/Product.aspx?Item=N82E16814130303<]§
Supply and Demand.
Newegg is a business not a charity.
Geez. 🙄
I’ve been waiting for the release of this card, as I need to add a card to my new system. I’m anxious to see if more manufacturers release cards today, and at what price point. If this is any indication, I’m going to wait, though it would be fun to be among the first to own this card.
ditto…
waiting to see if DX10.1 is GeForce 9 territory…if it truly is 9 only, then i’ll wait somemore…if not i’ll get the 8800GT for sure
can’t help thinking i should’ve just ported my 7600GT from my older pc… oh well, that’s what you get for early adoption of DX10 hardware…
i could live with medium settings on crysis for a while, ’til i get better graphics… could someone tell me if DX10.1 is ONLY for NVIDIA’s 9Ks or ATI’s 3Ks, or can it work on both the 8s and 9s (same logic applies for ATI’s offerings)?
LOL, gouging FTW.
check Anand’s for benches you’re looking for.
Price went up $10 since I first read this, now $270.00.
does DX10.1 need new hardware or can it exist on current 8 series? coz i’d hate to get this one and the 9’s are just around the corner and to find out only the 9’s have DX10.1 support…
November 15th
§[<https://techreport.com/discussions.x/13481<]§
Newegg’s got stock – $260 XFX:
§[<http://www.newegg.com/Product/Product.aspx?Item=N82E16814150252<]§
Why should they? With the new DX10 API they had a chance to cash in on that new feature set and charge a premium, rather than concentrate on performance.
This is nothing new; early adopters typically pay a premium to have the best performance with the best feature set first.
THIS…. IS…. TECH REPORT!
Let me check….yup I’m pissed.
At myself.
I don’t think 8800 GTS buyers got screwed, that’s just early adoption. I do think 8600 GT/GTS buyers got screwed big time, since there was a much larger-than-usual gap between these cards and the high-end. The mid-range is where you’re supposed to get the most bang for your buck, and nVidia didn’t deliver that with the 8600 series. You could probably say these buyers were screwed before the 8800 GT — this just adds insult to injury.
Scott, when you say new Ati cards will be out soon, do you mean weeks or months? Can you give a ballpark figure?
I really, really want to get one of these but I want to see the Ati cards…
I’d be curious to see if it puts the smack down on my 7900GTX, ie: a worthy upgrade or not…
EP
Great review, but as mentioned the numbers for higher end cards would offer some perspective. It’s always nice to see them against one another.
An American coin? Blasphemy. Madness.
This is what the 8600GT should have been, if it weren’t so darn crippled.
8800GT is tempting, but I rather stick it out with my X1900XT. It still holds up pretty well and plays Crysis @1280×1024 High details with an acceptable framerate.
I don’t think the 320MB card is an underperforming card. We know that eventually high-end products are going to be replaced by something better. Btw, the 320MB card was always recommended for people with smaller monitors, so I don’t think the 320MB is going to underperform at low res (Crysis is a different story).
Early adopters always pay a big price. Everyone should know this. Besides, the 8800GTS was worth the money for its DX9 performance. It’s easy to see a few DX10 benchmarks and forget that. So I don’t consider those folks losers, they just made a choice that had very predictable results.
Er, no. My point is that Nvidia’s (and ATI’s, actually) initial run of DX10 products just didn’t deliver a lot of value for the price, and now, with the refresh doing everything they do better, cheaper, and more efficiently, it’s all the more evident. The people who bought them are stuck with underperforming, overpriced cards. That’s on them, though. I guess the lesson here is to wait for the refresh, but I’d still like to see the companies offer the “refresh” first, if you get me.
Great review, Scott. 🙂
Wow. I didn’t expect this great an offering from nVidia. Most tempting graphics card release in years from my standpoint…
Wrong reply.
Were you one of those people who bought an iPhone the first day and got mad about the price cut? :rolleyes:
kaching! How’s temps?
Looking forward to more info and detailed testing at more normal resolutions with various AA/AF options. You could also throw a GTX in there for some more perspective. And I wonder if the ‘hints’ about a full G92 are solid hints? 😉 The 8800GTS is due to get an update on the G92 chip as well according to lots of rumors, I just don’t see how it could be a 112SP chip as well, even if it had more memory and mem bandwidth it would be too close performance-wise.
You should also try Crysis on a quad core.
Thanks for the review. IMO, this is what the 88xx series should have been when it released months ago (price, power, speed, etc.). As it stands now, it’s…
Yeah, I saw the front page after reading the CPU review and went “Hrm, TR’s rotating article thingie is brok… OMG ANOTHER REVIEW”
So much graphy goodness I almost don’t need my morning caffeine.
Almost.
Damn, I’m getting nauseous from all the awesome coming out of TR, great works guys!
I’m not so much interested in this card, but the high end 2 GPU variant would be nice, especially if it handles DX10 games as well as my 7950GX2 handles DX9 games. Of course, maybe I have to get something that is DX10.1 compliant. *sigh* it never ends
Nonetheless, this appears to be an awesome value.
BRAVO! Damage, once again thank you and the crew for an excellent report.
I just finished the AT report to find a few things that I wanted to know more about. Granted, I know some people have different feelings for AT, but I generally like them. Still, their report on the new GT felt rushed, and while I know that in many cases the 8800GTS 320 has similar performance to the 8800GTS 640, it makes me happy to see you’ve tested them both.
I was about to ask for a good old DX-9 title like FEAR or something just to put the old cards in perspective but you know what, the hell with it, the world moves on.
Good review. Extremely convincing card. This is why it’s worth waiting for the refresh, not that I’m interested in buying one of these to let it languish in 2D mode its entire life.
RV670 better be damn good – none of this “best value” mentality, it needs to bring best performance at the same time, well, at least competitive performance.
Spelling error in the conclusion – Biel should be spelled Alba, in which case the brain comment is no longer necessary. But I can see your reasoning for choosing Biel since no sane, reasonable, trustworthy person would compare any sort of technology in the world to Alba.
Ho! The one-two punch! I’m feelin a bit dizzy now.
Guess all the NDA’s lifted today, whether NVIDIA wanted them to or not. Hell of a deal on that card. About time, too.