AMD’s Radeon HD 6790 graphics card

Nvidia’s GeForce GTX 550 Ti is finally settling onto store shelves, having made peace with the fact that it’s slower and considerably less exciting than some of its pricier elders. Already, the latest GeForce faces another challenge—or rather, a challenger that hails from the Radeon camp and wants to throw down with the Ti on its home turf.

AMD’s Radeon HD 6790 is being introduced today, and it bears the exact same $149 price tag as the new kid from Nvidia. According to AMD, the 6790 should “comfortably” outpace its GeForce counterpart by as much as 30%, letting you enjoy games at a 1080p resolution. We found the GTX 550 Ti wasn’t always up to the task of cranking out smooth frame rates at 1680×1050 with antialiasing enabled, so that sounds like an attractive premise—if the Radeon delivers, of course.

Besides being a competitor to the GTX 550 Ti, the 6790 also provides a middle ground between AMD’s Radeon HD 5770, which sells for as little as $120 at Newegg these days, and the quicker Radeon HD 6850, which starts closer to the $165 mark. Before today’s release, Radeon fans had to deal with the truly unbearable dilemma of choosing between those two cards. No longer! That’s called progress, folks.

Now, how did AMD conjure up an answer to Nvidia’s new $149 GPU so quickly? Well, it kind of didn’t. The Radeon HD 6790 is based on the Barts graphics processor already known for its starring roles in the Radeon HD 6850 and 6870. AMD has simply disabled a few bits and pieces to keep the 6790 from nipping at the heels of real 6800-series offerings. That said, you’d never know it from looking at the Radeon HD 6790 engineering sample AMD sent us for review. The card has the same imposing length, cooler, display output arrangement, and dual six-pin power connectors as the Radeon HD 6870:

To keep things from getting confusing here in our labs, we thought it appropriate to slap a sticker on the 6790. Ahh, much better.

Of course, retail Radeon HD 6790 variants won’t have quite the same garage-sale look. AMD tells us board designs “will vary greatly from what you’re seeing on . . . sample boards, including PCB, power connectors, cooler design etc.” Certain retail 6790 cards won’t require dual PCIe power connectors, which is probably a good thing—folks shopping for a $149 graphics card are lucky if their power supply has even one PCIe power plug.

Here’s Sapphire’s take on the Radeon HD 6790, just for the sake of illustration:

Source: AMD.

Note the shorter circuit board and the snazzier-looking cooler. If the side were visible, you’d see a pair of DVI ports, one HDMI port, and a DisplayPort output. Sapphire’s card will apparently have dual power connectors, but AMD tells us a PowerColor offering with only one PCIe plug will hit stores some time after today.

Clearly, the new Radeon has a lot in common with the 6800 series. Why didn’t AMD simply call it the Radeon HD 6830? That kind of nomenclature wouldn’t exactly upset tradition, after all, and it’d undoubtedly be more fitting from an architectural point of view.

The answer is simple: despite being built around a larger and more capable GPU, the Radeon HD 6790 has specifications strikingly similar to those of the Radeon HD 5770, which AMD now sells in pre-built PCs as the Radeon HD 6770.

                          ROP      Textures   Shader  Memory       Estimated   Approx.   Fab
                          pixels/  filtered/  ALUs    interface    transistor  die size  process
                          clock    clock              width        count       (mm²)
                                                      (bits)       (millions)

Juniper (Radeon HD 5770)  16       40         800     128          1040        166       40 nm
Barts (Radeon HD 6790)    16       40         800     256          1700        255       40 nm
Barts (Radeon HD 6850)    32       48         960     256          1700        255       40 nm
Barts (Radeon HD 6870)    32       56         1120    256          1700        255       40 nm

From a bird’s eye view, the 6790’s only notable holdover from the 6800 series is the 256-bit memory interface, which the 5770’s Juniper GPU is physically incapable of matching. AMD couldn’t put Juniper on stilts, so the quick-and-easy alternative was to pay Barts a visit and shatter its tibias with a baseball bat. The fractures incapacitated four of Barts’ SIMD arrays, leaving it with 800 ALUs and the ability to filter only 40 textures per clock. This latest example of GPU hobbling also disabled half of Barts’ ROPs, limiting the chip to pushing only 16 pixels per clock. As a result, the 6790 and the 5770 have identical ALU counts, and they can push the same number of pixels and filter the same number of textures per clock. Make sense?

                            Peak pixel   Peak bilinear  Peak bilinear  Peak shader  Peak           Peak
                            fill rate    integer texel  FP16 texel     arithmetic   rasterization  memory
                            (Gpixels/s)  filtering      filtering      (GFLOPS)     rate           bandwidth
                                         (Gtexels/s)    (Gtexels/s)                 (Mtris/s)      (GB/s)

GeForce GTS 450             12.5         25.1           25.1           601          783            57.7
GeForce GTS 450 AMP!        14.0         28.0           28.0           672          875            64.0
GeForce GTX 550 Ti          21.6         28.8           28.8           691          900            98.5
GeForce GTX 550 Ti Cyclone  22.8         30.4           30.4           730          950            103
GeForce GTX 550 Ti AMP!     24.0         32.0           32.0           768          1000           106
GeForce GTX 460 768MB       16.2         37.8           37.8           907          1350           86.4
GeForce GTX 460 1GB         21.6         37.8           37.8           907          1350           115
GeForce GTX 560 Ti          26.3         52.6           52.6           1263         1644           128
Radeon HD 5770              13.6         34.0           17.0           1360         850            76.8
Radeon HD 5770 SOC          14.4         36.0           18.0           1440         900            76.8
Radeon HD 6790              13.4         33.6           16.8           1344         840            134.4
Radeon HD 6850              24.8         37.2           18.6           1488         775            128
Radeon HD 6870              28.8         50.4           25.2           2016         900            134
Radeon HD 6950              25.6         70.4           35.2           2253         1600           160

Studying maximum theoretical performance numbers sheds further light on the subject. The Radeon HD 6790 has slightly weaker number-crunching capabilities than the 5770 but considerably more memory bandwidth—even more than the Radeon HD 6870, in fact.
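Those peak figures fall straight out of unit counts multiplied by clock speeds. Here’s a quick sketch of the arithmetic for the 6790; the 840 MHz core and 1,050 MHz memory clocks are taken from AMD’s published specs for the card, and note that Barts filters FP16 texels at half its integer rate.

```python
# Peak theoretical rates for the Radeon HD 6790, derived from its unit
# counts and clocks (840 MHz core, 1050 MHz GDDR5 command clock; GDDR5
# moves data at four times its command clock).
core_clock_ghz = 0.840
mem_clock_ghz = 1.050

rops = 16                # pixels written per clock
texture_units = 40       # bilinear INT8 texels filtered per clock
alus = 800               # shader ALUs; a multiply-add counts as 2 FLOPs
bus_width_bits = 256

pixel_fill = rops * core_clock_ghz                       # Gpixels/s -> 13.44
texel_fill = texture_units * core_clock_ghz              # Gtexels/s -> 33.6
fp16_texel_fill = texel_fill / 2                         # Gtexels/s -> 16.8
gflops = alus * 2 * core_clock_ghz                       # GFLOPS    -> 1344
bandwidth_gbps = bus_width_bits / 8 * mem_clock_ghz * 4  # GB/s      -> 134.4
```

Run the same arithmetic with Juniper’s 128-bit bus and you get the 5770’s far lower 76.8 GB/s, which is the whole reason the 6790 exists.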

One traditional downside of hobbling an upmarket GPU to compete at the low end is die area. High-end GPUs tend to be larger, making them costlier to produce. Those higher costs in turn reduce margins, giving the resulting products less wiggle room when the time comes to slide down the pricing scale. The Radeon HD 6790 isn’t in too bad a position, though. While its Barts GPU is indeed quite a bit larger than Juniper, at 255 mm² versus 166 mm², it’s not that much portlier than the GeForce GTX 550 Ti’s GF116 chip, which I measured at about 225 mm². Nvidia might have a cost-efficiency edge, but I doubt it’s a large one. It’s also worth noting that Nvidia needs a fully capable GF116 to make a GeForce GTX 550 Ti, while AMD can slip Barts chips that don’t make the cut for the 6870 and 6850 into the 6790.
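For a rough feel for that cost gap, here’s a back-of-the-envelope dies-per-wafer estimate using a common approximation formula. This is illustrative only; real output depends on yields, scribe lines, and defect density, none of which either company discloses.

```python
import math

def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    """Crude candidate-dies-per-wafer estimate with a simple
    edge-loss correction for partial dies at the wafer's rim."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2)
    return int(wafer_area / die_mm2 - edge_loss)

barts = dies_per_wafer(255)   # Radeon HD 6790's GPU: ~235 candidates
gf116 = dies_per_wafer(225)   # GeForce GTX 550 Ti's GPU: ~269 candidates
```

By this crude measure, each 300 mm wafer yields roughly 15% more GF116 candidates than Barts candidates, a real but hardly decisive edge; and unlike Nvidia, AMD can fill 6790 orders with partially defective Barts dies.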

Now, let’s try to see if the Radeon HD 6790 has the right mix of performance and power efficiency to give the GeForce a run for its money. Time for benchmarks!

Our testing methods

Both AMD and Nvidia have released new graphics drivers since we published our GeForce GTX 550 Ti review, so we tested the new Radeon and re-tested the cards from that review using Nvidia’s GeForce 270.51 beta release and AMD’s Catalyst 8.84.2_RC2 drivers. We configured our Radeons’ Catalyst Control Panel as before, leaving optional AMD optimizations for tessellation and texture filtering disabled.

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.
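We report the median rather than the mean because it keeps a single outlier run from skewing the result. A minimal sketch, with made-up numbers:

```python
from statistics import median

# Three runs of the same benchmark; one hypothetical run hit a hiccup
# (background task, disk activity) and dragged its average FPS down.
runs_fps = [38.9, 39.4, 12.1]

print(median(runs_fps))            # 38.9 -- the outlier run is ignored
print(sum(runs_fps) / len(runs_fps))  # ~30.1 -- the mean gets dragged down
```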

Our test system was configured as follows:

Processor        Intel Core i5-750
Motherboard      MSI P55-GD65
North bridge     Intel P55 Express
South bridge
Memory size      4GB (2 DIMMs)
Memory type      Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings   9-9-9-24 1T
Chipset drivers  INF update 9.2.0.1025, Rapid Storage Technology 10.1.0.1008
Audio            Integrated ALC889 with Realtek R2.57 drivers
Graphics         Gigabyte Radeon HD 5770 Super OC 1GB with Catalyst 8.84.2_RC2 drivers
                 AMD Radeon HD 6790 with Catalyst 8.84.2_RC2 drivers
                 XFX Radeon HD 6850 1GB with Catalyst 8.84.2_RC2 drivers
                 Zotac GeForce GTS 450 1GB AMP! Edition with GeForce 270.51 beta drivers
                 MSI GeForce GTX 550 Ti Cyclone II 1GB with GeForce 270.51 beta drivers
                 Zotac GeForce GTX 550 Ti AMP! Edition 1GB with GeForce 270.51 beta drivers
                 Zotac GeForce GTX 460 1GB with GeForce 270.51 beta drivers
Hard drive       Samsung SpinPoint F1 HD103UJ 1TB SATA
Power supply     Corsair HX750W 750W
OS               Windows 7 Ultimate x64 Edition with Service Pack 1

Thanks to Intel, Kingston, Samsung, MSI, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for Bulletstorm, we used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, to counteract any variability. We’ve also included second-by-second frame rate results from Fraps for that game; there, you’re seeing the results from a single, representative pass through the test sequence.

  • We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1920×1200 resolution with 4X AA and 16X anisotropic filtering.

  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise level was measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing on all cards but the Radeon HD 6790, which wasn’t supported by GPU-Z’s temperature probing component. With that card, we ran the load test in a window and jotted down the temperature reported by AMD’s Catalyst Control Center.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm

I’ve made no secret of my appreciation for Bulletstorm‘s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. This shooter was tested at 1680×1050 with 2X antialiasing, medium post-processing quality, and other detail settings cranked up. Since this game has no built-in benchmarking mode, I played through the first 90 seconds of the “Hideout” echo five times per card, reporting the median of average and low frame rates obtained.

The Radeon HD 6790 is off to a nice start, although right off the bat, we can see it’s closer to our slightly souped-up Radeon HD 5770 than to the 6850. The Radeons’ strong overall showing here is likely due to optimizations in AMD’s latest round of drivers, which have boosted performance in games that use deferred shading, like Bulletstorm.

Civilization V
Civ V has several interesting tests, including a built-in compute shader benchmark that measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.

As we noted in our review of the GeForce GTX 550 Ti, pixel-pushing capabilities may play a part in this test. That would explain why the 6790 falls behind despite its much higher peak GFLOPS—the GTX 550 Ti cards still have higher peak pixel fill rates.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

In this more conventional and shader-intensive rendering test, the Radeon HD 6790 ends up neck-and-neck with the fastest of the two GeForce GTX 550 Ti cards.

Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

The GeForces take the lead here. That said, even 29 FPS should feel relatively smooth in a game like Civilization V, where rapid camera movements aren’t as much of a staple as in fast-paced shooters.

Just Cause 2

Although it’s not the newest kid on the block, JC2 is a good example of a relatively resource-intensive game with flashy DirectX 10 visuals. It doesn’t hurt that the game has a huge, open world and addictively fun gameplay, either.

This title supports a couple of visual effects generated by Nvidia’s CUDA GPU-computing API, but we’ve left them disabled for our testing. The CUDA effects are only used sparingly in the game, anyhow, and we’d like to keep things even between the different GPU brands.

We tested performance with JC2‘s built-in benchmark, using the “Dark Tower” sequence.

Here’s another disappointing showing by the 6790, which barely edges out our souped-up Radeon HD 5770 and slips behind the GTX 550 Tis.

Metro 2033

Sometimes, and especially with low-end GPUs like the GeForce GTX 550 Ti, treating yourself to a decent amount of fancy shader effects without killing frame rates means having to sacrifice antialiasing. We ran Metro 2033‘s built-in benchmark using the “High” graphical preset with 16X anisotropic filtering and no antialiasing. PhysX effects were left disabled to ensure a fair fight between all of our contestants.

Now, that’s better. The Radeon HD 6790 gets back in the lead in Metro, cranking out a comfortable 39 FPS.

Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to their maximums, along with 2X antialiasing and 16X anisotropic filtering.

The Radeons take this one, no question about it.

AvP also gives us a feel for performance scaling as we ramp up the resolution. I’m not sure what to make of AMD’s promises of 1080p gaming with the Radeon HD 6790, though. Perhaps frame rates would be smoother if we’d disabled antialiasing, but we noticed that the 6790 ran AvP a little choppily at 1080p with 2X AA. It certainly looks like you’d want to cough up the extra dough for at least a Radeon HD 6850 (or a GeForce GTX 460 1GB) if you plan on spending a lot of time beyond 1680×1050.

Power consumption

Despite its portly Barts GPU, the Radeon HD 6790 isn’t much of a power hog. Both of our GeForce GTX 550 Tis draw more juice under load—especially the 1000MHz Zotac variant.

Noise levels and GPU temperatures

I wouldn’t pay too much attention to those noise and temperature numbers, for the simple reason that few (if any) retail Radeon HD 6790 graphics cards are likely to sport the same cooler as our engineering sample. Still, for what it’s worth, the quicker Radeon HD 6850 ran cooler and quieter than the 6790 under load.

Conclusions

The scatter plot below, which tracks overall performance on the Y axis and prices on the X axis, sums up the Radeon HD 6790’s showing quite well. (In case you’re wondering, we worked out overall performance by averaging frame rates across all of our game benchmarks at 1680×1050, Civilization V‘s leader and compute-shader tests excepted. Meanwhile, we grabbed prices corresponding to the cards we tested from Newegg and Amazon.)
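The overall-performance figure is a simple per-card average of the game results. A sketch of the bookkeeping, with hypothetical per-game numbers for illustration:

```python
# Average each card's FPS across the 1680x1050 game tests to get the
# single overall-performance number plotted on the scatter plot's Y axis.
# (The per-game figures here are invented for the example.)
results_fps = {
    "Radeon HD 6790":     {"Bulletstorm": 52, "Civ V": 29, "Just Cause 2": 44,
                           "Metro 2033": 39, "AvP": 31},
    "GeForce GTX 550 Ti": {"Bulletstorm": 47, "Civ V": 31, "Just Cause 2": 46,
                           "Metro 2033": 33, "AvP": 26},
}
overall = {card: sum(fps.values()) / len(fps) for card, fps in results_fps.items()}
```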

Yes, the Radeon HD 6790 ends up slightly above the GeForce GTX 550 Ti cards overall, but look how close the Radeon HD 6850 lies on the price axis—and how much of a leap in performance it provides.

The unfortunate truth is that, for $149, the 6790 is kind of a raw deal. A graphics card without a cause, really. Head to Newegg right now, and you can grab a Radeon HD 6850 for $164.99—that’s before a $20 mail-in rebate that’ll bring the card down to just $144.99, as long as you eventually get that check in the mail. The 6850 produces much higher frame rates across the board, draws about as much power, and may actually emit less noise.

No matter how hard I try, I just can’t think of a situation where I’d recommend the 6790.

Perhaps if you have an irrational hatred for mail-in rebates and have a budget so tight that going over by $15 would force you to mortgage your house and take up residence under a bridge. If you’re that strapped for cash, though, perhaps a $149 graphics card isn’t a wise expense to begin with.

Comments closed
    • Veerappan
    • 9 years ago

    Nice review Cyril. I’d definitely take the advice and go for a 6850 over the 6790 for the small price difference.

    I do think you might have been giving the 6790 too little credit with respect to the 5770, since most of your benchmarks were comparing to the OC’d version, but with respect to the 6850, the 6790 is pointless other than allowing AMD to harvest otherwise useless chips.

    I did notice that you flipped your noise/power consumption charts compared to what Geoff/Scott usually produce. Usually they create those charts with lowest power/noise on top. Not sure if you noticed this, but I figured I’d point it out. It’s nice to be consistent.

    • hoohoo
    • 9 years ago

    WRT Civ V as a benchmark: I tend to keep my military units near the frontier where they can respond to invaders.

    • kamikaziechameleon
    • 9 years ago

    ATI is finally price cutting to make there cards the deals they should be. Nvidia doesn’t really like competing in a price performance war. Strange how Nvidia has hung onto market share even when they don’t offer a single compelling product.

    FYI I like nvidia I have a 460 and I’m happy with it, I’m just observing a turn in the market offerings they have

      • Veerappan
      • 9 years ago

      For gaming, I’d choose AMD over Nvidia almost every time.

      For scientific computing apps, I’d probably consider an Nvidia. CUDA is used pretty extensively, and Nvidia’s GPU profiler and GPU debugger are pretty useful.

    • d0g_p00p
    • 9 years ago

    Great another useless card at a $10 -$20 price point difference. They need to move to a $1 to $2 range. That way they can over everyone who wants a video card. Just think we can have the 6970, 6969, 6968, 6966, 6965, 6964, 6963, 6962,6961, 6960, 6959, 6958, 6957, 6956, 6955, 6954,6953, 6952, 6951, 6950, 6949, 6949, 6948, 6947, 6946, 6945, 6944,6943, 6942, 6941

    nVidia can compete by having the GSO 580 GTX,580 GSS 580,GRS 580,GYD 580,GHU 580,GNH 580,GKI 580,GDE 580,GBV 580,GXX 580,GCC 580,GKI 580,GNB 580,GER 580

    Just think of the possibilities.

      • Meadows
      • 9 years ago

      You used GKI 580 twice.

        • Arclight
        • 9 years ago

        That’s because one of the GKI 580 will be a special Vision Extreme Crank that Sh*t up Mega Special Edition which enables the user to use up to 3 displays simultaniously.

      • Nutmeg
      • 9 years ago

      I would support this measure, that way I could tailor my graphics card purchases based on how much I spent on groceries this week. Would be very useful!

      • ClickClick5
      • 9 years ago

      I like the GER one. German maybe?

    • swaaye
    • 9 years ago

    I like the plain, sticker-free cooler anyway. 😉

    • jensend
    • 9 years ago

    Moral of the story: neither the green or red team has a really compelling replacement for Juniper in the “cool and quiet, good-enough performance” niche a year and a half after its introduction. Sure, the 6790 is faster than the 5770, but not by all that much once you take into account the higher price and higher noise and power consumption under load. nV had a chance to displace Juniper in its niche with the 550Ti, but they priced it unreasonably high, and their board partners all tried to soup it up to pit it against cards with twice its compute units, resulting in high power consumption relative to its lead over the 5770. Turks-based cards are so much of an amazing improvement over Redwood (the 5670), and AMD is so confident in their performance, that AMD has quietly shoved them into obscure OEM releases and seems to be hoping the public forgot that project existed.

    • BoBzeBuilder
    • 9 years ago

    Great Review Scott. But how dare you not include my faithful S754 AMD with 9800 Pro?

      • dashbarron
      • 9 years ago

      I had a 9800 pro with my old P4 @ 2.4GHz. Upgraded to a 2.4GHz quad + 8800GT.

      • ClickClick5
      • 9 years ago

      How dare you give credit to Scott for Cyril’s work!

        • NeelyCam
        • 9 years ago

        Seems to be a hot trend here

          • Bensam123
          • 9 years ago

          No one expects the spanish inquisition!

          I honestly didn’t notice Cyril wrote this one till I looked back from the 6450 article to look for errors on this one.

    • can-a-tuna
    • 9 years ago

    1680×1050? Are you kidding?

    • crsh1976
    • 9 years ago

    Might as well grab a cheaper 5770, it ranks just a tiny notch under the 6790..

    • Fred & Ethel
    • 9 years ago

    “AMD couldn’t put Juniper on stilts, so the quick-and-easy alternative was to pay Barts a visit and shatter its tibias with a baseball bat.”

    I love a site that can make me laugh out loud at a videocard review.

      • SomeOtherGeek
      • 9 years ago

      Yea, me too. And also I love a site whose commentor’s names can make me laugh. I also liked this one:

      [quote<]Perhaps if you have an irrational hatred for mail-in rebates and have a budget so tight that going over by $15 would force you to mortgage your house and take up residence under a bridge. If you're that strapped for cash, though, perhaps a $149 graphics card isn't a wise expense to begin with.[/quote<]

      BTW, great review, Scott. Thanks for the honesty.

        • Damage
        • 9 years ago

        Thanks! Cyril wrote it, though.

          • SomeOtherGeek
          • 9 years ago

          Oops!! My bad! Cyril, I beg your forgiveness?

            • Cyril
            • 9 years ago

            It’s cool. TR editors pretty much have a hive mind, anyway.

          • flip-mode
          • 9 years ago

          Dude, you got thumbed down for that? How? I don’t get it. I get thumbed down for no good reason too. We should hang out!

      • khands
      • 9 years ago

      Yup, the 6790 is this generations 5830

    • Corrado
    • 9 years ago

    Why do you have the GTX460 @ $200? That may be the MSRP, but you can get them for < $150.

    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814162055[/url<] $140 after MIR, $180 before for an overclocked 1GB GTX460.

    • dpaus
    • 9 years ago

    I think a lot of you are missing the point: AMD has used their formidably-deep parts bin to come up with a near-perfect competitor to the GTX 550 Ti in very short order, and they have priced to – wait for it – make money!

    The thing that struck me most about the ‘value’ scatter plot is the clean sweep for the red team. I’m wondering when AMD is going to start using those charts in their marketing and presentations.

      • PixelArmy
      • 9 years ago

      Competing with the overpriced, underachieving GTX 550 with another overpriced, underachieving card is an idiot fight. Both are slaughtered by the GTX 460/768MB for the same $150 or the HD 6850 at $165 (both even less after MIR). [b<]The 6790 is only "priced to make money" if the other two don't exist.[/b<] They're not gonna sell many (shouldn't sell any) at the current price, so the margins don't matter. Heck, it's slower, louder, hotter and longer than the HD 6850! Congrats on recycling though.

      Additionally because the GTX 550 sucks, it has dropped in price. In terms of the plots, this would make it a little more ambiguous for the GTX550 vs 6790. However, the 460/768 would make this not so useful for marketing (moot point, AMD doesn't market anyways).

        • SHOES
        • 9 years ago

        We must bear in mind that this could quickly become a price war in the “mid-range” area, which could mean some hefty price drops on both the 6790 and the 550 its just hitting the market give it 30 days and I bet its around $100-125 which would be more appropriate.

    • flip-mode
    • 9 years ago

    AMD, you just couldn’t let the Geforce 550 suck on it’s own. Jeez. At least you could have learned from the Geforce 550 launch and priced your card at $119.99.

    As CaptTomato said: a pointless, slow card. And I’ll add “overpriced” to that.

    Gross.

      • Corrado
      • 9 years ago

      I’m guessing this cost AMD next to nothing to create. Take existing good yield 6770’s, clock them up, and ship em out. If people buy them they make money. If they go for the 6770 instead, they still make money. It costs them virtually nothing.

        • Palek
        • 9 years ago

        The article clearly explains that the 6790 is a cut-down 6870/6850 (Barts) core with higher clocks. AMD are probably using some dies that didn’t have enough working units for a 68×0 card, but inevitably some perfectly good dies also will go to the 6790. Still, they will be getting money for some silicon that would have gone to the bin otherwise.

          • Corrado
          • 9 years ago

          Right, regardless of what they are, they’re not costing AMD anything to put into a product. Its even more of a boon if they were going to otherwise be thrown away.

            • flip-mode
            • 9 years ago

            Er, I’m not clear on the point you are making. Seems like you are saying it is no loss to AMD to push out a product onto the market at an unattractive price when it cost them nothing to make. That’s not making sense to me because:

            1. It’s not good for AMD’s image, although the impact of that may be negligible.

            2. Harvesting defective dies may not cost AMD anything, but creating a full-fledged video card from those does cost money. Packaging and shipping costs money.

            3. After a decade or so of practicing die harvesting, I can all but guarantee that AMD and Nvidia now factor these things into their revenue projections. These dies do cost money to make, harvesting the bad ones just means less money lost, but it’s still accounted for.

            I’ve got no problem with AMD harvesting defective dies, but god golly, I make a point of always taking issue with a product that seems to be priced to high. This is such a product, just as the Geforce 550 is.

            • derFunkenstein
            • 9 years ago

            I don’t think it’ll take long for this thing to wind up where it belongs, at the $120 range retiring the 5770.

            • flip-mode
            • 9 years ago

            Anand’s conclusion has an interesting take on the possibilities there.

      • can-a-tuna
      • 9 years ago

      It’s faster and better priced than GeForce 550 “Ti”. Nvidia will sell tons of 550s anyway. Why can’t AMD do the same?

    • CaptTomato
    • 9 years ago

    A slow pointless card….

    • Anvil
    • 9 years ago

    There’s a place for this card, but the pricing just makes me want to go WHY.jpg.

      • FuturePastNow
      • 9 years ago

      I wouldn’t worry about the price, simply because I don’t see it staying that price for long. Within two months, this will be a $130 card.

      • Vhalidictes
      • 9 years ago

      Good card, WTH pricing. Skip 2 lunches and buy a 6850.

    • jjj
    • 9 years ago

    You guys spend less and less time on reviews and at this rate the site will be dead in a couple of years.You can’t review this card and not compare it with the GTX 460 768MB and the AMD 5830.Not to mention that it was tested in only a handful of games …
    Seriously,don’t want to offend but mobo reviews have next to no overclocking section anymore,half of the reviews are a few days late and many of the reviews that do make it in time are …lets say rushed.

      • toyota
      • 9 years ago

      I agree this review left out the cards you would expect to see it compared too. Anandtech did use the 5830 and gtx460 768mb though if you haven’t looked there.

      • Cyril
      • 9 years ago

      If you’re curious about how the Radeon HD 5830 and GeForce GTX 460 768MB fit into the picture, might I suggest checking our original review of the Radeon HD 6850:

      [url<]https://techreport.com/articles.x/19844/[/url<]

      In it, we found that the 6850 was faster overall than both the 5830 and the GTX 460 768MB. Considering the small price differences between those three products right now, plus what we recently discovered about the GTX 550 Ti and 6790, it's pretty clear that the 6850 is the one to get.

      Why weren't the 5830 and GTX 460 768MB tested in this review? Simple: I had four days, including the weekend, to conduct the testing and write this article up, and I just wasn't able to get other cards in time. Driver refreshes from both AMD and Nvidia also forced me to repeat the testing from the GTX 550 Ti review when I could have been benchmarking more games.

      We at TR always strive to balance quality with quantity—that is, quality of writing, methodology, and analysis versus quantity of reviews and data. More often than not, we sacrifice the latter to the former. That doesn't mean we don't care or that we don't work hard; quite the contrary. We're a small team of very passionate people trying to serve our readers as well as we can. We'd love to offer both quality and quantity 100% of the time, and we are always striving to improve... but tight deadlines, shifting schedules, surprise launches, and last-minute driver updates force us to compromise sometimes.

        • bdwilcox
        • 9 years ago

        I’m disappointed you didn’t include a Voodoo3.

          • Veerappan
          • 9 years ago

          Voodoo3 would be a bit useless, since I think that was still a DX8 card.

          It would be mildly amusing to see a GF 5xxx series card compared (first DX9 card from Nv), and maybe throw in a Radeon 9700/9800.

          Yes, I know you were joking, but I’d kinda like to see that now 🙂

      • esterhasz
      • 9 years ago

      I do not agree. This card is obviously not a dramatic entry into the graphics domain and I prefer that the testers invest their time into more interesting things. The review gives you an overall impression of the card’s performance and variations in other games are probably negligible.

      My only point of criticism is that the list of die sizes for comparable GPUs gets steadily shorter. Die sizes are one of the reasons I fancy TR reviews – it’s a great way to get an idea about production cost, which interests me more than +/- 5 frames. I know that’s probably a niche though…

      • Sargent Duck
      • 9 years ago

      I respectfully disagree. Scott goes into very in-depth reviews when new high-end cards and architectures are launched. This card really doesn’t target the readers of this website as we’d all be going for 6850/70/6950+.

      This was just a short and sweet review of a card that most of us probably won’t take a second glance at.

      • Tamale
      • 9 years ago

      I just want to throw in my $0.02 and point out that as long as it was proven that the 6850 is a better value than this card, there’s no reason for more comparisons.

    • r00t61
    • 9 years ago

    HD 5830 redux.

    Buying cheap crippled cards like these is really like cutting off your nose to spite your face. Either pony up for something better or don’t even bother wasting the money.

    • ColdMist
    • 9 years ago

    This is really sad. I remember when a new generation of video cards would pretty much double, or at least 1.5x, the performance of the last one.

    Now, they shuffle the stuff around, change the prices $5-10, and call it a new name.

    Very disappointing overall.

    • crazybus
    • 9 years ago

    This card would only be interesting if it could be flashed into a 6870.

      • Sam125
      • 9 years ago

      I was thinking the exact same thing.

      • Bensam123
      • 9 years ago

      Like the 6950 to 6970 flash. I wish those two were in the benchmark list or at least the conclusion to add more context to the overall picture.

    • potatochobit
    • 9 years ago

    crossfire one with a 6850 and post numbers

      • Firestarter
      • 9 years ago

      Yes sir! Right on it, sir!

    • ClickClick5
    • 9 years ago

    Heh, Scott gets the good cards while Cyril is stuck with the “Meh” cards.

    Honestly I agree. Once you land in this range of power, you are better off saving that money for either a better card or just be happy with IGP graphics. If a gamer goes to buy a card, he/she buys a card. They don’t piddle in this low of a power range. And if they don’t game, well, no point in buying a card.

    The only thing I see this being useful for is if the person wants to run more than one monitor. Otherwise…shoot for something higher. Much higher….like 6970 and higher. 😉

    (Says the guy with his Arctic XTREME cooled 6970)

      • Sargent Duck
      • 9 years ago

      Scott also owns the site. Life is good when you’re the king!

        • ClickClick5
        • 9 years ago

        -Scott: “Eh what is….wha…EH! This thing….when did it get here? I gotta ditch this quick. (looks around) CYRIL! Hey buddy! Slow news recently huh?”
        -Cyril: “It has been ok, not to…”
        -Scott: “Great to hear! Say, wanna do a review?”
        -Cyril: “Oh golly gee boy do I!”
        -Scott: “Fantastic…. Here.”
        -Cyril: “Um…this is a…”
        -Scott: “I’m aware.”
        -Cyril: “You know, I’m actually kinda busy….um….tweaking the web site.”
        -Scott: (stares blankly) “Go write a review or you’re fired.”

          • Palek
          • 9 years ago

          I know I’m nitpicking, but… Scott’s in the US while Geoff and Cyril are in Canada. Your imaginary conversation should take place over Skype or sumthin’. 😛

      • MrJP
      • 9 years ago

      The extra spend to get to the 6850 is definitely worthwhile, but this is still massively better than any integrated graphics. Look at the benchmarks [url=https://techreport.com/articles.x/20401/16<]here[/url<] for Civ 5 at 1280x800 with everything turned down and compare to the ones from this review at 1680x1050 at high settings with 4x AA. Night and day. Even the class of cards a step below this (GTS 450 or 5670) are still worth the money for a gamer on a really tight budget, as long as they're paired with a price-appropriate system.

      • travbrad
      • 9 years ago

      I would agree based on the current market situation, but it hasn’t always been the case that this price range is too slow for gamers. I bought my HD4830 for $93, which lasted me through 1 1/2 years of gaming at 1080p (not with maxed settings obviously, but still decent).

      Unfortunately I doubt you could find a significantly faster card at that price…2 1/2 years later. While the midrange/highend cards have gotten a lot faster, it hasn’t really happened with the low-end.

    • DrDillyBar
    • 9 years ago

    Nice price for great performance.

      • Vhalidictes
      • 9 years ago

      I don’t think many people have a problem with the performance. People have a problem because it costs almost exactly as much as a 6850. Even if someone is desperate to save $10 USD, the 5770 is still on the market…
