Nvidia’s GeForce GTX 285 and 295 graphics cards


Yeah, I’ve been playing with the very latest video cards for weeks now, using the newest games and a screaming new test rig. Life’s rough, you know. But now I have just a few hours to write an entire review that’s destined to be packed with more GPU goodness than six regular ones. I’ve been mainlining caffeine ’til I can barely sit still to type, and each key-press seems to be registering ttthhrreeeeee tttiiimmmeesss. So let’s skip the pleasantries and get down to business. The primary subjects of our attention today are the GeForce GTX 285 and 295, two cards that are both based on the new 55nm version of Nvidia’s GT200 graphics processor, the 1.4 billion-transistor monster behind the GeForce GTX lineup.

The move from a 65nm fab process to a 55nm one promises additional goodness from the B-step version of the GT200, affectionately known by some as the GT200b. The goodness comes mainly in the form of reduced power consumption and heat production, but in turn, those things can lead to increased performance headroom. That is, Nvidia can turn up the clock speeds without resorting to quad-slot cooling or an external power supply. Good times, indeed.

Other than the die shrink, the GT200b is more or less unchanged from the GT200. That means it still has 240 gizzywhatchits and 512 hoozydoers or whatever, just like the old version. I dunno, man, go read my GeForce GTX 280 review if you want info on the GPU chiclets and stuff. It’s all in there. What has changed is clock speeds—in the case of the new single-GPU flagship, the GeForce GTX 285—and, thanks to better thermals, the ability to shoehorn two of these GPUs into a single card—that’s the GTX 295.

The GTX 285: GT200b undiluted

Let’s start our whirlwind tour of the zillion graphics cards in this little roundup with the GeForce GTX 285. This puppy more or less mirrors the GeForce GTX 280 that it succeeds, with a gig of RAM onboard and an all-around resemblance to its elder sibling, right down to the 10.5″ board length and the plastic cooler shroud.

Yep. Uh huh.

The 285 should outperform the 280, though, thanks to a bump in clock speeds in all three dimensions: the main GPU clock is up to 648MHz, the shaders are up to 1476MHz, and the memory is up to 1242MHz—or 2484 MT/s, since it’s of the GDDR3 variety and since doing the unit conversion for you makes us sound smart. We’ll talk about what the clock speed changes mean to theoretical performance capacity shortly.
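
If you want to do that unit conversion yourself, the arithmetic is simple enough. Here's a minimal sketch in Python, assuming GDDR3's two transfers per clock and the GT200's 512-bit memory interface; the stock figure is the GTX 285's, and the 1300MHz one belongs to the hot-clocked Asus card discussed below.

```python
# Effective transfer rate and peak bandwidth from a GDDR3 memory clock,
# assuming two transfers per clock (DDR) and a 512-bit interface.

def gddr3_bandwidth_gbps(mem_clock_mhz, bus_width_bits=512):
    transfers_per_second = mem_clock_mhz * 1e6 * 2       # DDR: 1242MHz -> 2484 MT/s
    bytes_per_transfer = bus_width_bits / 8               # 512 bits = 64 bytes
    return transfers_per_second * bytes_per_transfer / 1e9

print(gddr3_bandwidth_gbps(1242))   # ~159.0 GB/s at the GTX 285's stock memory clock
print(gddr3_bandwidth_gbps(1300))   # ~166.4 GB/s for the Asus card's 1300MHz memory
```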

Before we go there, though, we should note the other positive effect of the move to a 55nm chip: this card has two six-pin PCIe power plugs, and that’s it. The GTX 280’s second connector requires an eight-pin plug, an annoying requirement that, for many, would involve either a plug adapter or a new PSU.

Cards like this one should become available today, and Nvidia tells us it expects street prices to be around $379. This particular GTX 285, as you can see, comes from Asus, and it’s clocked a little higher than stock with a 670MHz GPU core, 1550MHz shaders, and 1300MHz memory. The folks at Asus don’t bundle their card with too many frills, but they do include a DVI-to-HDMI adapter, and happily, they provide three years of warranty coverage with no product registration required. Too many board vendors have taken to requiring online registration in order to get warranty coverage beyond a year, a trend I’m pleased to see Asus avoiding.

Ganging up: The GTX 295

Inside of this quintessentially Cash-meets-Vader cooling shroud is a pair of circuit boards, each of which has mounted on it a GT200b GPU and associated memory. Like so:

A mock cutaway of the GeForce GTX 295. Source: Nvidia.

This is how Nvidia likes to do its multi-GPU graphics cards, dating all the way back to the GeForce 7950 GX2. Somewhere in there is a PCI Express switch chip that allows two GPUs to share a single PCIe x16 slot.

What you get from the GTX 295 is not quite as much computing power as a pair of GeForce GTX 285 cards. All 240 stream processors are intact on each GPU, but one ROP partition is deactivated. As a result, each GPU can output 28 pixels per clock rather than 32, with a corresponding drop in antialiasing power. Also, because ROP partitions have memory controllers in ’em, each GPU has a 448-bit aggregate path to memory and 896MB of RAM instead of a full gigabyte. Many of these numbers will no doubt sound familiar if you know about the GeForce GTX 260, which also has one of its ROPs deactivated.

Why the gimpy GPU? Many reasons: to keep chip yields up and costs down, to reduce the power envelope, to make everything fit into a small space, or because it somehow makes sense. Yes, it’s a little weird that the GTX 295 has 1792MB of GDDR3 memory onboard. Doesn’t really matter, though, at the end of the day. The GTX 295 will be plenty potent with a pair of GPUs in this configuration—to say the least.
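
If you're curious where those odd-looking figures come from, here's a quick back-of-the-envelope sketch, assuming the GT200's eight ROP partitions each handle four pixels per clock and each tie to a 64-bit memory channel with 128MB of GDDR3 attached.

```python
# Per-GPU resources on the GTX 295 with one of eight ROP partitions disabled.
# Assumes 4 pixels/clock, a 64-bit memory channel, and 128MB per partition.

ACTIVE_PARTITIONS = 7    # one of the GT200's eight partitions is turned off

pixels_per_clock = ACTIVE_PARTITIONS * 4      # 28, down from 32
bus_width_bits   = ACTIVE_PARTITIONS * 64     # 448-bit, down from 512
memory_per_gpu   = ACTIVE_PARTITIONS * 128    # 896MB, down from a full gigabyte
memory_total     = 2 * memory_per_gpu         # 1792MB across the card's two GPUs

print(pixels_per_clock, bus_width_bits, memory_per_gpu, memory_total)
```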

Yep, the 295 does indeed require an 8-pin PCIe power connection in addition to a 6-pin. That’s pretty much a foregone conclusion. The connector you might not have been expecting, though, is the HDMI port on the back (or is it front?) of the card, which replaces the analog video output. Only makes sense, I suppose. The other connector of note is a single SLI interface, which protrudes from the top of the card, threatening quad SLI.

Despite its dually nature, the GTX 295 is still only 10.5″ long. Well, “only” is a strong word, but let’s say the 295 is no longer than the 285 or the Radeon HD 4870 X2, which is nice. GTX 295 cards like this one seem to be selling right now for a prevailing price of $499. That’s a whole heck of a lot of money, and you’ll presumably be using it with a pricey 30″ LCD monitor or something similar. Otherwise, there’s little point to a video card that costs half a grand.

Not that Nvidia isn’t trying, of course. The firm helpfully points out that the GTX 295’s dual GPUs can either pair up for peak graphics power or be split so that one card handles graphics duties and the other does PhysX calculations. We’re still a fair distance from the day when an entire GT200b GPU is best used for physics, though. We do have a copy of Mirror’s Edge for the PC, which has been fortified with additional PhysX vitamins and minerals, and we plan to have a look at it very soon.

Make room for the big dawg

In addition to testing the new GeForces, we’ve gathered together quite a few intriguing Radeon cards. Many Radeon board vendors have been building custom versions of the 4800-series cards, and several of them are represented here today.

Ladies and gentlemen, this honkin’ contraption is our very first triple-wide video card, the Palit Revolution 700 Deluxe. Yes, it’s a Radeon HD 4870 X2 with a little something extra. Let us marvel at its… girth.

Palit says it chose this triple-slot cooling design in order to keep the card quiet, and that approach seems to have worked. To my ears, this card does indeed sound quieter than the regular dual-slot versions of the 4870 X2. And before you write it off as a gimmick, consider that this card takes up one less slot than a CrossFire pair of Radeon HD 4870s. In its own, large way, it somehow makes sense, like Kim Kardashian in a two-piece.

But it does kinda dwarf a regular old Radeon HD 4870 X2, even though both cards are 10.5″ long.

Palit has made use of all that expansion plate real estate by equipping the Revolution 700 with an output port quad-fecta: DVI, VGA, HDMI, and DisplayPort are all represented. Included in the box is an HDMI-to-DVI adapter, too—a crucial bit for many of us.

Amazingly enough, though, Palit has chosen to stick with the Radeon HD 4870 X2’s stock clock speeds. Surely they could have gotten an extra 50MHz out of that honking cooler, no?

Nevertheless, the Revolution 700 is a nice example of a card maker going out on a limb to offer something a little bit different. Unfortunately, it doesn’t yet appear to be available at online retailers in the U.S. We received this example of the Revolution 700 a few weeks back, and at that time, Palit told us to expect street prices around $540. I would kind of expect that plan to be modified by the arrival of the GTX 295 and by the price cuts some vendors have made on regular ol’ 4870 X2s based on AMD’s reference design. Many of those X2 cards are now selling for $449, and most currently have a $50 mail-in rebate attached, as well. We’ll have to see where prices are when the Revolution 700 arrives in volume, which I believe should happen any day now.

Sapphire’s gem: the 4850 X2

Speaking of creative interpretations of the Radeon HD 4800 series, get a load of this thing:

That’s Sapphire’s Radeon HD 4850 X2, and as far as I know, it’s totally unique. No other vendor seems to offer a dual-GPU version of the Radeon HD 4850, and this board is apparently Sapphire’s own custom design. Perhaps that explains why the board itself is so very long—11 1/8 inches, in fact—longer than even Palit’s triple-slot monstrosity. You’ll want to check very carefully to see whether there’s room inside your PC before ordering one of these.

Sapphire’s 4850 X2 is even longer than a 4870 X2—or a GeForce GTX 295

In fact, the little heatsink pictured above protrudes from the back of the card, grabbing even more space. And yes, the card requires both a 6-pin and an 8-pin PCIe auxiliary power connection.

For all of its quirks, the 4850 X2 has some pretty obvious attractive attributes. Among them is the price-performance prospect of such a thing. Right now, the 4850 X2 is selling for $299 at Newegg, which puts it at roughly the same price as a couple of Radeon HD 4850s. And since its GPUs and memory run at stock speeds, it is very much like a couple of those. But this X2 has one big advantage: it sports 2GB of GDDR3 memory, or a gigabyte per GPU, double the amount on most 4850 cards. And in this performance class, when you’re looking to run the latest games at the highest resolutions and quality settings, a gig is what you want. You’ll see what I mean when we get to our benchmarks.

Asus custom coolers get caught in the CrossFire

The Radeon HD 4850 was an instant hit around these parts, but the first implementations caused us to raise (or is it singe?) an eyebrow over their thermals. Their single-slot coolers were never very beefy, and AMD tuned their fan speed control points pretty aggressively for good acoustics. The result was GPU temperatures approaching 90°C under load and idle temperatures around 80°C. Nothing to be worried about, according to AMD, but those temps were high enough to prompt us to look forward to 4850 cards with aftermarket cooling.

Asus’ EAH4850 TOP looks to be just what we had in mind. This card has a custom-designed dual-slot cooler, and its GPU and memory clocks are tweaked up to 680MHz and 1050MHz, respectively, for a little extra performance. The EAH4850 TOP sells for $164.99 at Newegg right now, with $30 mail-in rebate (for those who enjoy the stimulating combination of paperwork and games of chance that is the MIR.) Like all Asus video cards, the EAH4850 TOP has a three-year warranty with no registration required.

Asus was nice enough to supply us with a pair of these cards for some CrossFire testing, and interestingly enough, we seem to have caught them in a board design transition. As you can see, the components are placed differently on the two cards. They have the same GPU and memory clock speeds, and should be functionally identical. For what it’s worth, the card with the row of capacitors running across the back appears to be the newer layout, while the other one follows AMD’s original reference design.

We should go ahead and address an issue we found with these cards up front, though. You’ll see in our acoustic and temperature testing that the Asus 4850 was among the quietest and coolest cards we tested in a single-card configuration, with GPU temperatures under load that are quite dramatically reduced versus the stock Radeon HD 4850. But look closely at that cooler design, and you’ll notice that it has a fan, not a blower, onboard. That little fan collects air from above itself and pushes that air down over the cooler’s metal fins, an arrangement that’s very effective in a single-card setup. But if you place another video card in the slot directly adjacent, as happens in CrossFire configurations on many motherboards, then Asus’ cooler becomes starved for air, and GPU temperatures begin to climb.

We first noticed this problem during our performance testing, when our test system would lock up at random. There wasn’t any particular pattern to it, except that we could run a game for a while on it without issue, but eventually, inevitably, the screen would go black and the system would lock. Once we started troubleshooting with an eye toward a thermal issue, the problem became clear almost immediately. We didn’t even have to make use of both GPUs. So long as a second card was nestled up against the Asus 4850, the Asus would overheat in a matter of minutes. You could watch it happening. Temperatures would rise, the card’s fan speed would peak, and GPU temps would continue climbing. Eventually, within five to ten minutes, the temperature would climb past the 100°C mark, and shortly thereafter, the screen would go blank. Bam. Game over.

We tried everything we could looking for a fix. Swapping the two cards, which do after all have different PCB designs, was no help. Our Gigabyte EX58-UD5 motherboard has a rather large south bridge cooler that could obstruct airflow, so we tried testing on an Asus P5E3 Premium, as well—same problem. Asus even sent us a matched pair of EAH4850 TOP cards based on the new PCB design, but they showed the exact same behavior. In fact, I gave up on CrossFire altogether and mounted the EAH4850 TOP in the P5E3 Premium with a Radeon X1950 card adjacent to it, and the 4850 still overheated.

More alarmingly, all of this happened on our open-air test bench, where ambient temperatures are much lower than inside of the average PC.

We were in communication with Asus throughout this troubleshooting process, and at first, their R&D department said it couldn’t duplicate the problem. After a little more testing, though, the company changed its tune. Asus now says it has made a revision to its card design, and it expects to be shipping the revised cards going forward. We hope to get our hands on one for testing soon.

In the meantime, you’ll want to avoid buying this card, especially if you plan to run it in CrossFire or with a larger expansion card of any sort in the adjacent slot. There are plenty of other options. AMD’s Radeon HD 4850 reference design may run a little hot, but at least it seems to be properly engineered.

Here’s another Asus Radeon card with a gorgeous custom cooler, the EAH4870 DK 1G. “DK” stands for Dark Knight, and I was shocked to find a picture of a dude in black armor holding a lance on the front of the box. Where’s Batman?

That disappointment aside, the 4870 DK is a nice example of a Radeon HD 4870 with 1GB of GDDR5 memory. You’ll find it for $249.99 at Newegg, with a $20 mail-in rebate. No, it doesn’t run at higher clock speeds than AMD’s reference designs, but when we were gathering cards for use in this article, we had a very difficult time finding the combination of 1GB of RAM and “overclocked” speeds in a Radeon HD 4870.

Sadly, that sweet-looking custom cooler on the EAH4870 DK 1G has the same basic fan design as the Asus 4850. Like the 4850, the EAH4870’s cooler is very effective in a single-card setup. In fact, it had by far the lowest GPU temperature we measured when used alone. But like the 4850, the EAH4870 suffocates when a card is placed in the slot next to it, blocking air to the cooling fan. The 4870 isn’t as quick to heat up as the 4850, but the card still generates heat faster than the cooler is capable of removing it, so it’s headed for the same destination.

Moving to a different motherboard than our EX58-UD5 helped somewhat by slowing down the rate at which the card overheats. Still, we saw the EAH4870’s MEMIO sensor report temperatures in excess of 116°C in testing on the Asus P5E3 Premium, and getting there only took about 30 minutes of continuous GPU use. We found we could almost reverse the equation by using Asus’ SmartDoctor utility to set the fan speed manually at 100%—but doing so was dreadfully loud, as the fan reached a speed it apparently won’t hit when relying on the card’s automatic speed control.

The bottom line is that this cooler is unsuitable for use in any situation where a second card obstructs its fan from taking in air. We don’t yet have word from Asus on how it plans to address this problem with this product, either. Again, I’d advise you to seek out an alternative 4870 card. In this case, AMD’s stock cooler is really quite good.

If you’ve purchased either of these cards and are experiencing system lock-ups or heat-related GPU problems, I’d suggest contacting Asus for a warranty replacement immediately. They should take good care of you. If they give you any trouble about it, feel free to drop us a line at report@techreport.com. I’m not sure we can do anything to help, but we’d like to know if these issues aren’t being addressed properly.

The problems we encountered with these cards aren’t unique to Asus, either. Sapphire’s 4850 X2 cooler is a bit clumsy and loud, to name just one other example. A number of board vendors have latched on to the idea of offering their own coolers, and they seem to target two things: the cooler must be a custom design, and it must use multiple slots. Those things alone aren’t necessarily good, though. I’m perplexed as to why anyone would offer a dual-slot cooler that isn’t expressly designed to direct hot air out the back of the PC. But many do.

AMD and Nvidia clearly invest considerable engineering effort in their stock coolers, to make sure they have good acoustics and tolerate a wide range of conditions. Look at the way the blower on the GTX 285 card is angled slightly; that’s so it can take in air even if another card is right next to it. If a board vendor doesn’t intend to invest sufficient engineering effort into building an even better cooler than the stock one, then why bother?

So who competes with whom?

We have an awful lot of results from different video cards in the following pages, and you probably want to know which cards compete with which others. In some cases, we can give a fairly straightforward answer. For instance, the Radeon HD 4850 generally matches up against the GeForce 9800 GTX+. Prices for both are around $150 right now. Similarly, the Radeon HD 4870 1GB matches up pretty directly against the GeForce GTX 260 at somewhere near 250 bucks, and the Radeon HD 4870 X2 is the closest rival to the GeForce GTX 295 at between $450 and $500.

However, in all of the cases I’ve cited above, AMD’s pricing generally undercuts Nvidia’s. We found this to be true back in November, when we did a pretty extensive analysis of video card pricing, and although prices have dropped considerably since then, the general trend holds. In some cases, like the Radeon HD 4870 X2 versus the GeForce GTX 295, the price difference is substantial—$50 to start, more if you factor in mail-in rebates on the 4870 X2.

Other cards have no clear direct competitor. The GeForce GTX 285 falls into this category. At between $379 and $400, its closest competition from the Radeon camp is either the 4850 X2 at $300 or the 4870 X2 at $450. And none of this is taking rebates into account, which can cloud the picture even more while also ripping some people off. (I love rebates.)

Now that you’re as confused as I am, we can proceed to our performance results.

Test notes

We conducted our tests on a brand-new GPU test rig, which looks like so:

That’s a Gigabyte EX58-UD5 motherboard, 6GB of Corsair DDR3 memory rated for operation at up to 1600MHz, and lurking under the cooler is a Core i7-965 Extreme processor. The cooler itself is impressive, innit? It’s a Thermaltake V1 AX, and it’s nice and quiet, which gives us a little more room for acoustics testing.

All in all, this test system is incredibly fast, and at long last, we’re able to test both CrossFire and SLI on the same motherboard. Ahhh.

The detailed info about our test systems is shown in the table below. One thing I do want to point out is that we’re using the version of the GeForce GTX 260 with 216 SPs, as opposed to the older version with 192. Nvidia says it’s now shipping all new GTX 260s with 216 SPs, so any 192-SP stragglers still on the market should be older cards.

Also, please note that several of the cards we tested have higher clock speeds than the baselines established by AMD and Nvidia. The EVGA GeForce GTX 260 Core 216 card we tested runs faster than stock, as does the Asus GTX 285 and the Asus Radeon HD 4850. These are real, shipping products, and they are warranted by their makers at their given clock frequencies.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core i7-965 Extreme 3.2GHz
System bus: QPI 4.8 GT/s (2.4GHz)
Motherboard: Gigabyte EX58-UD5
BIOS revision: F3
North bridge: X58 IOH
South bridge: ICH10R
Chipset drivers: INF update 9.1.0.1007, Matrix Storage Manager 8.6.0.1007
Memory size: 6GB (3 DIMMs)
Memory type: Corsair Dominator TR3X6G1600C8D DDR3 SDRAM at 1333MHz
CAS latency (CL): 8
RAS to CAS delay (tRCD): 8
RAS precharge (tRP): 8
Cycle time (tRAS): 24
Command rate: 2T
Audio: Integrated ICH10R/ALC889A with Realtek 6.0.1.5745 drivers

Graphics:
  Asus EAH4850 TOP Radeon HD 4850 512MB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Dual Asus EAH4850 TOP Radeon HD 4850 512MB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Visiontek Radeon HD 4870 512MB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Dual Visiontek Radeon HD 4870 512MB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Asus EAH4870 DK 1G Radeon HD 4870 1GB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Asus EAH4870 DK 1G Radeon HD 4870 1GB PCIe + Radeon HD 4870 1GB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Sapphire Radeon HD 4850 X2 2GB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  Palit Revolution R700 Radeon HD 4870 X2 2GB PCIe with Catalyst 8.12 (8.561.3-081217a-073402E) drivers
  GeForce 9800 GTX+ 512MB PCIe with ForceWare 180.84 drivers
  Dual GeForce 9800 GTX+ 512MB PCIe with ForceWare 180.84 drivers
  Palit GeForce 9800 GX2 1GB PCIe with ForceWare 180.84 drivers
  EVGA GeForce GTX 260 Core 216 896MB PCIe with ForceWare 180.84 drivers
  EVGA GeForce GTX 260 Core 216 896MB PCIe + Zotac GeForce GTX 260 (216 SPs) AMP²! Edition 896MB PCIe with ForceWare 180.84 drivers
  XFX GeForce GTX 280 1GB PCIe with ForceWare 180.84 drivers
  GeForce GTX 285 1GB PCIe with ForceWare 181.20 drivers
  Dual GeForce GTX 285 1GB PCIe with ForceWare 181.20 drivers
  GeForce GTX 295 1.792GB PCIe with ForceWare 181.20 drivers

Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x64 Edition
OS updates: Service Pack 1, DirectX November 2008 update

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to those of the no-name alternatives.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Specs and synthetics

Before we get to play any games, we should stop and look at the specs of the various cards we’re testing. Incidentally, the numbers in the table below are derived from the observed clock speeds of the cards we’re testing, not the manufacturer’s reference clocks or stated specifications.

Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak memory bandwidth (GB/s) | Peak shader arithmetic, single-issue (GFLOPS) | Peak shader arithmetic, dual-issue (GFLOPS)
GeForce 9500 GT | 4.4 | 8.8 | 4.4 | 25.6 | 90 | 134
GeForce 9600 GT | 11.6 | 23.2 | 11.6 | 62.2 | 237 | 355
GeForce 9800 GT | 9.6 | 33.6 | 16.8 | 57.6 | 339 | 508
GeForce 9800 GTX+ | 11.8 | 47.2 | 23.6 | 70.4 | 470 | 705
GeForce 9800 GX2 | 19.2 | 76.8 | 38.4 | 128.0 | 768 | 1152
GeForce GTX 260 (192 SPs) | 16.1 | 36.9 | 18.4 | 111.9 | 477 | 715
GeForce GTX 260 (216 SPs) | 17.5 | 45.1 | 22.5 | 117.9 | 583 | 875
GeForce GTX 280 | 19.3 | 48.2 | 24.1 | 141.7 | 622 | 933
GeForce GTX 285 | 21.4 | 53.6 | 26.8 | 166.4 | 744 | 1116
GeForce GTX 295 | 32.3 | 92.2 | 46.1 | 223.9 | 1192 | 1788
Radeon HD 4650 | 4.8 | 19.2 | 9.6 | 16.0 | 384 | —
Radeon HD 4670 | 6.0 | 24.0 | 12.0 | 32.0 | 480 | —
Radeon HD 4830 | 9.2 | 18.4 | 9.2 | 57.6 | 736 | —
Radeon HD 4850 | 10.9 | 27.2 | 13.6 | 67.2 | 1088 | —
Radeon HD 4870 | 12.0 | 30.0 | 15.0 | 115.2 | 1200 | —
Radeon HD 4850 X2 | 20.0 | 50.0 | 25.0 | 127.1 | 2000 | —
Radeon HD 4870 X2 | 24.0 | 60.0 | 30.0 | 230.4 | 2400 | —
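
For the curious, here’s roughly how those peaks fall out of clock speeds and unit counts. This is an illustrative sketch rather than Nvidia’s official math, using the Asus GTX 285’s observed clocks (670MHz core, 1550MHz shaders, 1300MHz memory), the GT200’s unit counts (32 ROPs, 80 texture units, 240 SPs, 512-bit GDDR3 interface), and the usual assumptions that FP16 filtering runs at half rate and that each SP issues a two-flop MAD per clock, with an extra MUL for the dual-issue figure.

```python
# Theoretical peaks for the Asus GeForce GTX 285, derived from observed clocks
# and unit counts. Illustrative sketch; the results match the table above.

core_mhz, shader_mhz, mem_mhz = 670, 1550, 1300
rops, tmus, sps, bus_bits = 32, 80, 240, 512

pixel_fill_gps  = rops * core_mhz / 1000               # ~21.4 Gpixels/s
texel_rate_gts  = tmus * core_mhz / 1000               # ~53.6 Gtexels/s
fp16_rate_gts   = texel_rate_gts / 2                   # ~26.8 Gtexels/s (half rate)
bandwidth_gbps  = mem_mhz * 2 * (bus_bits / 8) / 1000  # ~166.4 GB/s (GDDR3, DDR)
gflops_single   = sps * shader_mhz * 2 / 1000          # ~744 GFLOPS (MAD only)
gflops_dual     = sps * shader_mhz * 3 / 1000          # ~1116 GFLOPS (MAD + MUL)

print(pixel_fill_gps, texel_rate_gts, fp16_rate_gts,
      bandwidth_gbps, gflops_single, gflops_dual)
```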

Theoretically, the GeForce GTX 285 has an edge in almost every category over any other single-GPU graphics card, with the exception being the Radeon HD 4870’s slightly higher peak shader compute rate. In fact, the GTX 285 even beats the Radeon HD 4850 X2 at theoretical fill and texture filtering rates.

Interestingly, the GTX 295 has higher peak fill and filtering rates than the 4870 X2, but the 4870 X2 has slightly higher memory bandwidth and decisively more shader arithmetic potential. Then again, we’ve found that current GPU architectures don’t always behave as their specifications might suggest. Perhaps some synthetic benchmarks will help sort things out for us.

None of the cards reach anything close to their theoretical peak color fill rates in this synthetic test. We’ve found that memory bandwidth often tends to be the limiting factor here. The results do seem to line up accordingly, roughly speaking, with the GTX 285 ahead of both the Radeon HD 4870 and the 4850 X2, while the GTX 295 trails the 4870 X2.

As far as I can tell, the units for 3DMark’s texture fill rate test are just completely horked, and Futuremark has shown no interest in fixing this problem. We’ll probably switch to a different benchmark utility eventually as a result, but for now, I believe we can use these numbers for relative comparisons, regardless. What these results show us is that the Radeon HD 4800 series has a substantial advantage over the current GeForces in delivered texture fill and filtering rates. The GeForce GTX 285 scores just a little lower than the Radeon HD 4850, and the GTX 295 is behind the 4850 X2.

In spite of the Radeons’ theoretical advantage in arithmetic rates, the GeForces tend to be faster in three of the four 3DMark shader tests. In the GPU particles and GPU cloth tests, a second GPU doesn’t seem to help and sometimes even hurts performance. That’s the nature of the beast with multi-GPU schemes. Not all games or applications scale well across multiple graphics processors.

Far Cry 2

We tested Far Cry 2 using the game’s built-in benchmarking tool, which allowed us to test the different cards at multiple resolutions in a precisely repeatable manner. We used the benchmark tool’s “Very high” quality presets with the DirectX 10 renderer and 4X multisampled antialiasing.

This is one game that will punish even the latest video cards at relatively common resolutions—and we weren’t even using the “Ultra high” quality settings. Notice how, even at just 1680×1050, the Radeon HD 4870 with 512MB of memory is slower than the 4870 1GB. That effect is magnified as the resolution increases, and the 512MB cards become completely useless at 2560×1600—even the relatively powerful configs, like dual Radeon HD 4870 512MB cards in CrossFire.

As for the two cards we’re ostensibly reviewing, well, the GTX 285 is indeed the fastest single-GPU solution, and two of them in SLI are faster than anything else we tested. Yet a single GTX 285 turns out to be slower than the (less expensive) Radeon HD 4850 X2, so the ol’ value proposition looks shaky. Meanwhile, the GTX 295 nearly averages 60 FPS at 2560×1600, easily ahead of the 4870 X2 but only a tad faster than two 4870 1GB cards in CrossFire.

Left 4 Dead

We tested Valve’s zombie shooter using a custom-recorded timedemo from the game’s first campaign. We maxed out all of the game’s quality options and used 4X multisampled antialiasing in combination with 16X anisotropic texture filtering.

Yeah, so pretty much any of these cards will run Left 4 Dead fluidly at the highest resolution we can manage. That kind of makes fine-grained analysis seem superfluous, but we can still see clear performance differences at 2560×1600, for what it’s worth.

Call of Duty: World at War

We tested the latest Call of Duty title by playing through the first 60 seconds of the game’s third mission and recording frame rates via FRAPS. Although testing in this manner isn’t precisely repeatable from run to run, we believe averaging the results from five runs is sufficient to get reasonably reliable comparative numbers. With FRAPS, we can also report the lowest frame rate we encountered. Rather than average those, we’ve reported the median of the low scores from the five test runs, to reduce the impact of outliers. (Jeez, dude, you have a religion degree. Quit talking like that.) The frame-by-frame info for each card was taken from a single, hopefully representative play-testing session.
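
To make that “median of the low scores” business concrete, here’s a trivial sketch of how five FRAPS runs boil down into the numbers we report. The frame rates here are made up purely for illustration.

```python
# Illustration only: reduce five FRAPS runs to an average frame rate and a
# median-of-minimums low score, which blunts the effect of a single outlier.

from statistics import mean, median

runs = [
    {"avg": 62.1, "min": 41.0},
    {"avg": 60.8, "min": 38.0},
    {"avg": 63.4, "min": 44.0},
    {"avg": 61.0, "min": 29.0},   # one anomalous low; the median discounts it
    {"avg": 62.5, "min": 42.0},
]

reported_average = mean(r["avg"] for r in runs)
reported_low     = median(r["min"] for r in runs)

print(round(reported_average, 1), reported_low)   # 62.0 41.0
```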

Since this game isn’t too hard on the GPU, we tested with all of its visual quality options at their highest settings and the screen res at 2560×1600. Why such a high resolution? Well, this is a review of video cards that cost upwards of 300 bucks. You don’t need an expensive video card for lower resolutions.

Nvidia pretty much cleans up here, with the GeForce 9800 GTX+ outperforming even the Radeon HD 4870 1GB. Both the GTX 285 and 295 perform relatively well.

Fallout 3

This is another game we tested with FRAPS, this time simply by walking down a road outside of the former Washington, D.C. We used Fallout 3‘s “Ultra high” quality presets, which basically means every slider maxed, along with 4X antialiasing and what the game calls 15X anisotropic filtering.

Here’s another case where cards with 512MB of memory tend to suffer, especially the GeForces. Frame rates for the 9800 GX2 and the single and dual 9800 GTX+ configs were all over the place. They would sometimes decline from run to run until we had to restart the game. I wouldn’t call Fallout 3 playable (at these extreme settings) on any of them.

The new GeForces play Fallout 3 just fine at these settings. However, they don’t really separate themselves from their competition. The Radeon HD 4850 X2 is faster than the GeForce GTX 285, and the GeForce GTX 295 seems to run into the same performance wall as all of the faster configs we tested.

The performance bottleneck at work here is the game’s level-of-detail mechanism making changes as you cross the terrain. The saw-tooth pattern in the frame-by-frame data is caused by those LOD adjustments happening every so often. Frame rates recover, and then another one hits. Of course, when the lowest troughs are at 60 FPS, it’s barely perceptible as you play.

Dead Space

This is a pretty cool game, but it’s something of an iffy console port, and it doesn’t allow the user to turn on multisampled AA or anisotropic filtering. Dead Space also resisted our attempts at enabling those features via the video card control panel. As a result, we simply tested Dead Space at a high resolution with all of its internal quality options enabled. We tested at a spot in Chapter 4 of the game where Isaac takes on a particularly big and nasty, er, bad guy thingy. This fight is set in a large, open room and should tax the GPUs more than most spots in the game.

Wow, uh, it’s a clean sweep for Nvidia. Doesn’t get much more definitive than that. Thing is, if you look at the numbers, even the slowest Radeon runs the game just fine, with its lowest frame rate above 30 FPS. AMD apparently has some driver work to do here, since performance doesn’t scale well at all with multiple GPUs.

Then again, because of the game’s control problems at high frame rates, the GeForce GTX 285 SLI was the least playable config we tested. That can be remedied with a tweak in the Nvidia control panel, though.

Crysis Warhead

This game is sufficient to tax even the fastest GPUs without using the highest possible resolution or quality setting—or any form of antialiasing. So we tested at 1920×1200 using the “Gamer” quality setting. Of course, the fact that Warhead apparently tends to run out of memory and crash (with most cards) at higher resolutions is a bit of a deterrent, as is the fact that MSAA doesn’t always produce the best results in this game. Regardless, Warhead looks great on a fast video card, with the best explosions in any game yet.

Some of the multi-GPU configs turn in decent average frame rates here, but their median lows are kind of iffy. Those configs include the GeForce 9800 GTX+ in SLI, the Radeon HD 4850 in CrossFire, and the GeForce 9800 GX2. Those lows are around 18 to 22 FPS, which is a little dicey. The common thread? They all have 512MB of memory, which seems to be a bit constraining.

Of course, the GeForce GTX 285 and 295 both handle this game without trouble, with the GTX 285 sitting at the top of the heap among single-GPU configs and the GTX 295 turning in the highest average frame rate overall.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running Left 4 Dead at 2560×1600 resolution, using the same settings we did for performance testing.

The GT200 GPU has the highest dynamic range in terms of power use we’ve ever seen, and the 55nm version of the GT200 continues that tradition. At idle, the GeForce GTX 285 draws less power than the Radeon HD 4850, remarkably enough. Even the dual-GPU GTX 295 consumes less power at idle than a single Radeon HD 4870 512MB.

When running a game, the story changes, as the GeForce GTX cards’ power use ramps up. Still, the GTX 285 draws less power than the card it replaces, the GTX 280. Disabling that ROP partition and keeping the clock speeds down on the GTX 295 seems to have had the intended effect, too: it pulls less juice than the Radeon HD 4870 X2, by over 50W.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407738 digital sound level meter. The meter was mounted on a tripod approximately 8″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

This is our first time out with a new and improved sound meter that has a lower sound floor and seems to give more precise measurements than the old one. As a result, we’ve returned to measuring noise levels at idle as well as under load.

At idle, the GTX 295 produces noticeably more hiss than most cards. That’s a little surprising given the low noise levels achieved by a number of cards with similar power draw at idle. When running a game, the GTX 295 is likewise among the loudest cards we tested. By contrast, the GTX 285 is a pretty good acoustic citizen, very much like the GeForce GTX 280 was before it.

The loudest card of the bunch, clearly, is Sapphire’s Radeon HD 4850 X2. The dual fans on its cooler tend to spin pretty quickly, especially at idle, where its noise levels are simply annoying. When we asked Sapphire about this behavior, they offered us a pair of BIOS updates (the card has two BIOSes) intended to lower fan noise levels. We flashed both BIOSes following Sapphire’s instructions, rebooted, and BAM! We’d bricked the card. I mean, it’s fallen, and it can’t get up. No POST, nada. I was going to take Sapphire to task for not providing the updated BIOS files on its website, but I can understand why the company might not want consumers attempting to flash the card’s BIOSes. Bad things.

The other config that’s noisy at idle is the Radeon HD 4870 1GB in CrossFire. It’s loud because the primary card in the pair, the Asus EAH4870 DK 1G, is being starved for air by the second card, and the cooler has ramped up its fan speed as a result. You can see that the Asus card alone is fairly quiet at idle, at 40.3 dB. Adding a second card to the mix creates problems, though.

The dual Asus 4850s, meanwhile, remain nice and quiet as the interior card in the pair overheats.

GPU temperatures

I used GPU-Z to log temperatures during our load testing. In the case of multi-GPU setups, I recorded temperatures on the primary card.

Nvidia seems to be pretty consistent, allowing GPU temperatures to creep into the mid-80s under load, and the new GTX 285 and 295 cards fall right in line. Both the Palit and Sapphire X2 cards keep temperatures much lower than AMD’s own stock coolers do, which is nice.

The shame of it all is that those Asus coolers are reasonably quiet and very effective in a single-card configuration. Look at the single-card temperature on that 4870 1GB! However, with dual cards installed, the Asus coolers essentially fail to do their jobs. These peak temperatures near 100°C were logged just before the system locked up.

Conclusions

The GeForce GTX 285 spent much of its time in our tests being hounded by less expensive multi-GPU solutions, including the GeForce 9800 GX2 and especially the Radeon HD 4850 X2, which has enough memory to really present a challenge. Still, the GTX 285 is the fastest single-GPU video card on the planet, and that counts for a fair bit. When multi-GPU solutions stumble, as they sometimes inescapably do, the GTX 285 should still perform as expected. Impressively, the GTX 285 has the lowest power draw at idle of any card in the field, and its acoustics and power consumption under load are both quite acceptable for this class of graphics card. The GTX 285’s status as the fastest single GPU card makes it an ideal building block for a fast SLI setup, too—the dual GTX 285 config outstripped everything else we tested.

The GTX 285’s $379 price tag makes it undeniably a luxury-priced item in a time when most games simply don’t require the fastest possible video card. The fact that you can buy a Radeon HD 4870 1GB or a GeForce GTX 260 for substantially less money—well over a hundred bucks—may be rather off-putting since the performance delta simply isn’t that huge, especially in terms of real-world playability. But the GTX 285 has essentially captured the single-GPU performance crown, and I suppose there is a price premium to be attached to that.

Speaking of performance crowns, if there’s a dual-slot, dual-GPU, single-card performance crown, then the GeForce GTX 295 has snagged that one. The GTX 295 is an even more extreme solution than the 285, obviously, but it’s the class of the exotics. You really shouldn’t even consider buying one of these right now unless you plan on forking out for a 30″ monitor, as well. But if you’re going to go all out, the GTX 295 isn’t a bad way to go. Its noise levels are a little high compared to that freakish triple-slot Palit 4870 X2, but no worse under load than a pair of GTX 285s.

As for the Radeons, AMD appears to have given its partners more freedom than ever before to come up with their own custom board and cooling designs—enough rope, in fact, to hang themselves. Asus seems to have obliged. The custom coolers on both its 4850 and 4870 cards have severe thermal problems with an adjacent card installed, as we’ve chronicled.

Sapphire has produced more mixed results with its 4850 X2. On one hand, the card is too long and too loud to appeal to most people. I really can’t recommend it as it is. On the other hand, the card’s basic hardware formula is dynamite. The price-performance mix makes a lot of practical sense, as witnessed by the fact that this $299 card was such a menace to the GeForce GTX 285. Sapphire told us at CES that it plans on re-launching the 4850 X2 with some fixes in place soon. If they can lower its noise levels, that would be nice. If they can also reduce the length of the card, then we’d really be talking. We’ve heard rumors that AMD might decide to introduce its own 4850 X2 to combat the GeForce GTX 285. Sounds like a plan to me.

The best custom design we saw from a Radeon board vendor is Palit’s Revolution 700. This thing’s cooler is huge, but it’s also effective and fairly quiet. We’ll have to see how Palit prices this card when it arrives on U.S. shores in volume, but this is one bit of freelancing we can appreciate on its merits.

Comments closed
    • jonybiskit
    • 11 years ago

    no love for the 9600gt…from anyone…

    • 2x4
    • 11 years ago

    YO DAWG i heard you like gaming so we put video card on your video card so you can game, while you game

    • thermistor
    • 11 years ago

    #112

    I try to buy a resolution “up” from what my monitor really is…I game at 1600…so I look at the 1900 resolution just for some headroom. I figure that mitigates the differences between Damage’s Core i7 and my C2Q.

    What affects gaming:

    HDD style, size, cache, RPM
    Memory…stock timings versus performance settings
    Bus speed (HT speed?)
    CPU generation…even at high resolutions
    Non virgin copy of Windows with all patches in place…how much junk do you have running in the background that you don’t bother to turn off?

    In short, you think you are getting measures of ABSOLUTE performance, but the best you can get, even at other sites, is RELATIVE performance.

    Sorry to burst your bubble.

    • Silus
    • 11 years ago

    Nice review Scott!

    There’s a typo in the Call of Duty page (https://techreport.com/articles.x/16229/9) and also the navigation combo box with the article’s pages. It’s referenced as Call of Duty 4: World at War, when it should be Call of Duty 5.

    • TheBulletMagnet
    • 11 years ago

    I really wished Grid had been one of the games used to test. From the GTX 260 vs HD 4870 1gb this game was one that really taxed the cards and showed how much a difference gpu memory could make.

    • BoBzeBuilder
    • 11 years ago

    100th!!! Thank you . Thank you.

      • eitje
      • 11 years ago

      100 and FRIST!

        • Meadows
        • 11 years ago

        Not to mention a typo.

          • eitje
          • 11 years ago

          awww, your grammar nazi made you look dumb.

          it’s an intentional typo, significant to many as an internet meme.

          from now on, when you feel that someone or something has infringed upon your perceptions of truth, please consider if it’s worth alerting others. sometimes, it may not be all that valuable.

            • Meadows
            • 11 years ago

            That’s beside the point.

    • swaaye
    • 11 years ago

    Does anyone else kinda feel like these cards are almost pointless? I mean, the only really demanding game tested here is Crysis. I beat that almost two years ago with a “measly” 8800GTX and don’t really plan to return to it. Its “sequel” is actually a bit toned down on the detail in a few ways and runs better as a result of that and some performance improvements.

    Is there even anything in the pipe that makes such powerful cards worthwhile these days? Hell, my 8800GTX tears through the other games used here. Fallout 3 runs better than Oblivion, probably due to much better data stream loading and because the world is so barren. Dead Space is actually quite playable on a 7800 GTX! 🙂 CoD4 isn’t exactly mind blowing visually and L4D is almost 5 year old technology basically.

    I’m starting to think we’re not going to see a big change in anything until we’re into a new console generation, honestly. Probably best to hold off until then.

    Great review of course though. I like how well NVIDIA is keeping idle power in check. I almost want to replace the 8800GTX with a 285 just for the 50W drop in idle power I’d probably see.

      • indeego
      • 11 years ago

      As mentioned, these are 30″+ cards. If you have an LCD that is that res or above (unlikely for the majority of us, given TR polls and economy,) use these cards. Otherwise, stick with what you have.

        • swaaye
        • 11 years ago

        makes sense.

      • yogibbear
      • 11 years ago

      Yes, like you, i’m still not “blown away”. My 8800gt is still fine at 1680×1050. No point in upgrading.

      • mboza
      • 11 years ago

      These are pointless for almost all of us, but a cutdown version would be quite nice. I am waiting to see what DoW2, and E:TW look like, but I suspect I might survive them without upgrading from my 8800. And I only got it because there was a big jaggy line in one of the CoH cutscenes.

      • Lans
      • 11 years ago

      Is pointless to me just because I am still gaming (lightly) at 1280×1024 but still nice to see price of GTX 260 and GTX 280 drop though! 🙂

      • TurtlePerson2
      • 11 years ago

      It’s too bad that graphical improvements are driven by consoles. Since Crysis, no one has come near that bar of graphical complexity. Engines these days are designed to be playable on consoles only. There aren’t even any exciting engines in the pipe, except maybe Rage, which is being built for consoles.

    • green
    • 11 years ago

    Scott, would it be possible to remeasure the 4870×2 idle power consumption?
    it doesn’t look quite right. specifically in comparing:
    http://www.techreport.com/r.x/geforce-gtx-285/power-idle.gif
    http://www.techreport.com/r.x/gtx260-vs-4870-1gb/power-idle.gif
    With comparative cards the numbers vary by 20-30W, which can be attributed to the different testing platforms, but the 4870 X2 is out by 50W, well outside the changes of other cards.

    • Forge
    • 11 years ago

    Back OT: Damage: What ever became of that ElderAlert 4850 X2? Did Sapphire replace it?

    • swaaye
    • 11 years ago

    It’s pretty easy to “unbrick” a vid card. Drop in a PCI video card or another PCIe card (needs to initialize first) and boot a DOS USB stick or floppy. Re-Flash with another BIOS or your hopefully-backed-up original BIOS with ATIFlash. Read the ATIFlash docs though so you don’t accidentally flash the wrong card and make a bigger mess.

    I suppose this is rather technical in reality. It’s not a big deal for a “veteran” BIOS flasher though…

    It sounds like the geniuses at Sapphire sent the wrong BIOS. Also nice of them to release a card to retail with such horrible fan speeds in the first place.

      • eitje
      • 11 years ago

      my understanding from this and another Etc article Scott wrote is that the card – when installed – now causes the system to not even get past POST, which negates the ability to boot to a DOS USB stick or floppy.

    • eitje
    • 11 years ago

    what’s the ambient sound level for the Damage Labs?

    • indeego
    • 11 years ago

    You know when your card is knocked off the list of TR benchmarked cards, that it is time to think about upgrading…

      • uknowit90
      • 11 years ago

      you also know that when your crossfire cards are cheaper than the 4850 and perform …

        • indeego
        • 11 years ago

        Crossfire/SLI comes with a host of issues I wouldn’t want to tend with, even given $ savings.

        Stability of playing the game > *

      • Krogoth
      • 11 years ago

      It is more like when your favorite upcoming titles can no longer render at native resolution with a smooth framerate.

      It is funny that TR has to use a 24″+ monitor and higher levels of AA/AF to show off the performance difference between the mid-range GPUs and their high-end brothers.

    • uknowit90
    • 11 years ago

    I have dual Asus 3850’s running in crossfire on an Asus P5Q-Pro motherboard. They have the same coolers as the ones here, but I have never had a single overheat problem, even though I overclocked them from 668/800 mhz to 758/1103 mhz (gpu/memory). The board …

    • ecalmosthuman
    • 11 years ago

    $379 is definitely expensive for a video card, but I’d say it’s a step in the right direction when 2 years ago a card in the same tier cost $599 – $649 at launch.

    • RickyTick
    • 11 years ago

    It’s hard to believe that only a year ago the 8800GT was the talk of the town. The video card world sure has changed a lot.

      • Meadows
      • 11 years ago

      And to think I bit the hook that was the 8800 GT back when its price was relatively high – sure, I got the performance earlier than others (and I had several then-current titles that could use it), but I’ll remember not to hurry next time if it’s about videocards. The upside is that this same card can still do quite a lot with my overclocking profile and good driver settings, so I guess I got some value at least.

    • YeuEmMaiMai
    • 11 years ago

    who cares? really? just another over priced innefficient pos………

      • Vasilyfav
      • 11 years ago

      GTX 295 is the card that can FINALLY (after more than a year) actually “run Crysis”.

    • MadManOriginal
    • 11 years ago

    I think the 55nm GT200b is just practice for using 55nm in GT212 which is due out in Q2 which means it could be as soon as 3 months. The power savings are moderate, of course it was rather good to start, and they basically used it up with higher clock speeds. Unless one hasn’t gotten a new card for ages or can snag a really hot deal I’d wait until GT212 at this point. I’d be interested to see an equal clock power draw comparison between a 65 and 55nm GTX280 though.

      • toyota
      • 11 years ago

      no the gt212 is supposed to be Nvidias first 40nm core. theres no way they are going to put anything more into another 55nm high end chip because then they will be right back in the same shape as with the 65nm gt200.

    • homerdog
    • 11 years ago

    Last sentence of first paragraph:

    …

    • Krogoth
    • 11 years ago

    285 and 295 = meh, unless you need to feed a 30″ monster.

    Nvidia got the trophy back from AMD’s hands. It is just that 260 Reloaded and 4870 are too much of a good deal to pass up. 4850/9800GTX+, 4670/9500GT are still good deals if $$$$ is tight.

    • DancinJack
    • 11 years ago

    First I would like to say great review. Another good one. Don’t get mad at me for this, but Call of Duty 4:Modern Warfare and Call of Duty: World at War are two different games. I’m pointing it out because some may become confused while reading. I know it shows a picture of World at War and that should be enough, just wanted to point it out.

    • fpsduck
    • 11 years ago

    Thanks for this review.
    Now I can skip 9800GTX+ without regret
    and go to GTX 260 (reloaded) instead.

    • MadManOriginal
    • 11 years ago

    *cues fight over using overclocked video cards*

    Can the comments go…over 9000?>!

    • Fighterpilot
    • 11 years ago

    Glad to see the 4850 X2 get some love…..nice going TR. They clearly rule the field at that price point.
    The new NVidia cards look pretty good, especially the power and temps…grr wish ATI would just copy their damn coolers and be done with it.
    Still every price bracket now it seems has a very good card from either maker…good times indeed.

    • danny e.
    • 11 years ago

    seems like the Asus cards are the way to go for those of us who never plan on having dual card configs.

      • swampfox
      • 11 years ago

      I have one of those ASUS 4850s, and it works fantastic as a single card (I’m only at 1680×1050), is pretty quiet and very cool.

    • etherelithic
    • 11 years ago

    I recently bought the Sapphire 4850 X2 for one reason alone: it’s the only card that has 4 DVI ports built into it. I have 3 monitors, 2 19″ and 1 24″, as well as an HDTV, and I wanted to use all of them with my computer. This card’s perfect for it, and the added benefit of being able to run it in crossfire mode at the cost of just deactivating half of my LCDs for demanding games is a plus too. My rig’s a bit old, but I’m unwilling to let go of my Athlon 4200+ CPU and MSI K8N Neo4 motherboard until Core i7 prices come down, and having only one PCI-E x1 slot, this card fit the bill perfectly.

    • Meadows
    • 11 years ago

    I immediately skipped to the Crysis page in a bid to find out whether “it plays Crysis”, and apparently, no. CPU seems to matter and for some reason, SLI carries a tax that’s lightest with the GTX 260 cards, and seeing how they give proper performance in other games as well, it’s still the optimal “enthusiast” configuration I believe.

    • willyolio
    • 11 years ago

    …

      • Meadows
      • 11 years ago

      Oh yeah, I feel like at home.

      • yogibbear
      • 11 years ago

      This was hilarious. I love the cheeky journalism going on here Scott 😉 Those first pages of yours are always an entertaining start to what can sometimes be a very technical article.

      • Sargent Duck
      • 11 years ago

      This is what I love about TR. They (Scott and Geoff) put humor in their reviews, and it a) makes them so much more entertaining than other reviews to read, 2) shows they’re having fun with their job. And that’s important. It’s not another “stupid article I have to write” but rather “let’s have fun with this article”. And when you enjoy what you do, you always put out better work.

      Keep up the great job Scott and Geoff for reviews. *sends some love Cyril’s way as well*

        • ssidbroadcast
        • 11 years ago

        Yup. Dashes of humor and TLC in all the right places.

          • _Sigma
          • 11 years ago

          Yeah, I typically don’t lol while reading a GPU review!

            • SomeOtherGeek
            • 11 years ago

            Same here! I have thrown out the newspaper cuz I hate the negativity!

            TR writers and everyone on the staff, keep up the good work! *high five*

    • CampinCarl
    • 11 years ago

    The Dead Space results seem fishy to me. And I don’t mean you guys messed with them–either AMD’s driver…uhh, whatever it is, for Dead Space is downright terrible, or EA/Redwood sabotaged it. Maybe I’m just paranoid, but those kind of results just seem unreasonable. Illogical.

      • HurgyMcGurgyGurg
      • 11 years ago

      Hmm, that seems about right, I actually messed around with my 4870 in the exact same room and at 1650×1080, it gets around 80-90 fps, at 2500×1600 45 fps seems about right. Crossfire doesn’t work evidently, and Dead Space has the TWIMTBP sticker, so Nvidia probably spent a lot more time working with EA than AMD did.

      Correct me if I’m wrong but you tested that in the level 4 Atrium when the Brute/Giant necromorph breaks through the doors?

    • mbutrovich
    • 11 years ago

    My video card isn’t included in the benchmarks anymore, so it must be time for an upgrade.

      • SubSeven
      • 11 years ago

      Haha, I haven’t seen a 6600GT in a review for a LONG LONG time. I wonder what that means for me?

        • toyota
        • 11 years ago

        likely a whole new pc…

          • SubSeven
          • 11 years ago

          Although i wholeheartedly agree, current times make such an investment, especially when not critical, illogical and foolish.

        • green
        • 11 years ago

        i saw my video card in a benchmark so i guess i can hold on to it for longer
        but i don’t remember when …

      • MadManOriginal
      • 11 years ago

      I wouldn’t worry about it in this case, although it does reach ‘down’ to the HD4850 it’s really a high-end review.

    • mattthemuppet
    • 11 years ago

    Any idea when mid-range GPUs based on a cut down GT200 core will be available? All I’ve heard about are rebadged 9/8 series chips.

    Very impressive review though, can’t imagine how much time that must have taken!

    • toyota
    • 11 years ago

    why in the heck would you run Warhead at just gamer and not enthusiast settings for this test? these are high end cards so that seems a little silly because the gtx295 can easily handle that. heck I play Warhead on DX9 enthusiast settings at 1920×1080 with an overclocked gtx260 just fine for the most part.

      • Kurotetsu
      • 11 years ago

      Getting Crysis to play smoothly at high settings using DX9 isn’t a challenge (and hasn’t been for a while). Try running Warhead on DX10 enthusiast settings at 1920 x 1080 with your overclocked card and see how smoothly it runs.

        • toyota
        • 11 years ago

        I understand that DX10 is a little more sluggish and thats why I am in DX9. the point was these cards should have been tested in enthusiasts settings though.

          • Meadows
          • 11 years ago

          You’re probably right, maybe with 2 separate tests to highlight RAM constraints, one with textures at “mainstream” and little/no antialias, and the other maxed out with antialias.

            • toyota
            • 11 years ago

            techreport is getting lazier with each review. bit-tech has the best reviews by far. if a game has DX10 they will show both it and DX9 results, they also test video cards across a wide selection of resolution and AA settings. bit-tech also show minimum framerates which are very important. that is clearly the best site to go to for a comprehensive review.

            • Meadows
            • 11 years ago

            Go ahead and visit them often.
            I feel warm and comfy right here.

            • toyota
            • 11 years ago

            I do when I want a real in depth review of how a product performs.

            • grantmeaname
            • 11 years ago

            you just read that entire review and don’t know how the card performs? Seriously?

            • Meadows
            • 11 years ago

            He wants to know exactly how each electron nudges on the PCB, so Tech Report isn’t “in-depth” enough for him.

            • toyota
            • 11 years ago

            #104 yeah because testing Crysis at one res and setting and only listing average framerates is very in depth. are you really crazy enough to think that bit-techs reviews are not more useful than the simple quick benchmarks that techreports video card reviews are. bit-tech at least takes enough time to test various resolutions, min/max framerates, games settings and API path. that is how you do a real review. I like techreport but their reviews lack any depth whatsoever and I dont senselessly praise them.

            • Damage
            • 11 years ago

            toyota, for Crysis, we listed average frame rates, minimum frame rates, and showed frame-by-frame performance data for all 17 configs tested. You are quite simply incorrect.

            You also have been exceptionally rude and have used our comments section to advertise another site.

            If you would like to request something from us for future reviews, this is most definitely not the way to go about it. You are welcome to email us if you would like to make requests.

            • toyota
            • 11 years ago

            I wasnt trying to advertise that site and didnt even link to it. I was just saying that it would be nice to have an in depth review like they used. more than one res and setting for all games would be nice and makes a lot more sense than the simple test you guys used. also using DX10 and DX9 for games like Crysis helps give a more well rounded review since people run those various settings and need to know what to expect.

            • eitje
            • 11 years ago

            how much difference do you normally see in those tests between DX9 and DX10? is it statistically significant, or just cool to see?

    • HurgyMcGurgyGurg
    • 11 years ago

    Ugh… Starting to worry about the decision to get a 512 MB 4870 instead of a 1 GB, it seems like games finally do need more than 512 MB, at least I still game at 1680×1050 though, I guess I’ll squeeze by for a little longer.

    Oh well, I guess when I get my new monitor it will serve as a great excuse to upgrade.

    Now the long wait until the next big release…

      • DrDillyBar
      • 11 years ago

      meh, I’d wait for Intel to show their hand myself.

        • Kurkotain
        • 11 years ago

        lol, same here…will be interesting to see what larrabe can do…either a good competitor…or a bloodfest…

      • odizzido
      • 11 years ago

      if you don’t use AA the story changes. I don’t know if you do or not. I don’t know how much longer 512 will do without AA as TR never seems to test without it.

        • HurgyMcGurgyGurg
        • 11 years ago

        Yea, I already had to give up 8x AA in a few games to make it run better, of course 4x AA is pretty much the same. But at 1680×1050 jaggies are still pretty noticeable in games like Dead Space.

        I’m just worried because I probably will end up keeping this 4870 for close to two years (Even though I plan to upgrade before that), and once textures get large enough it will really become the limit to an otherwise great card.

          • toyota
          • 11 years ago

          well you cant use your cards AA in Dead Space anyway.

    • Hance
    • 11 years ago

    Man your a slacker I could have beaten you by 15 minutes if I had a First Post in my system LOL.

    Hmm pay raise and work and the wife just made off with my 20 inch LCD. Guess its time for some 24 inch widescreen love. Be my first and I will have to get a new video card to run it. Time to do some research.

    • jdrake
    • 11 years ago

    First!

    I love all the crazy custom video cards – triple wide?

      • ssidbroadcast
      • 11 years ago

      Yeah when I first laid eyes on that triple-wide, for some reason I was inexplicably reminded of my high school sweetheart; the Ultimate Cheeseburger at Jack-in-the-Box.

      In other words, I got an appetite response. (seriously)

      (…)

      • cygnus1
      • 11 years ago

      Yup, I bought the Palit 9600GT with a dual slot cooler. This thing never gets loud except at first boot when the fan spins at 100%.

      Plus it has dual dvi, HDMI, and display port all on one card.
