AMD’s Radeon R9 270 graphics card reviewed

AMD’s Radeon R9 series is growing at an alarming rate. Just over a month ago, we were treated to the R9 280X and R9 270X. AMD followed up with the top-of-the-line R9 290X in late October and the slightly less top-of-the-line (yet much more compelling) R9 290 last week.

Today, AMD pulls back the curtain on the R9 270, which extends the R9 family down to the $179 price point. This is the family’s most affordable member, and it’s also the least power-hungry: it gets by with a single six-pin PCI Express power connector, and its “typical board power” is only 150W, by AMD’s count.

The R9 270 will face off against Nvidia’s fastest sub-$200 card, the GeForce GTX 660. Nvidia slashed the GTX 660’s price to $179 last month, although today, a Newegg search shows the card selling for $190 before mail-in rebates. AMD, then, has an opportunity to undercut Nvidia with a newer product. But will the R9 270 be better?

Pitcairn’s latest gig

The Radeon R9 270 is based on the same Pitcairn chip as the 270X and the older Radeon HD 7800 series. As you can see in the table below, the R9 270 features a fully enabled version of the chip, just like the R9 270X and the older Radeon HD 7870.

| GPU | GPU base clock (MHz) | GPU boost clock (MHz) | Shader processors | Textures filtered/clock | ROP pixels/clock | Memory transfer rate | Memory interface width (bits) |
| Radeon HD 7850 2GB | 860 | N/A | 1024 | 55 | 28 | 4.8 GT/s | 256 |
| Radeon HD 7870 GHz | 1000 | N/A | 1280 | 80 | 32 | 4.8 GT/s | 256 |
| Radeon R9 270 | ?? | 925 | 1280 | 80 | 32 | 5.6 GT/s | 256 |
| Radeon R9 270X | ?? | 1050 | 1280 | 80 | 32 | 5.6 GT/s | 256 |

In fact, as far as I can tell, the R9 270 differs from the R9 270X only in its GPU clock speed and power envelope. The reference R9 270’s boost clock is 125MHz lower (though its memory is clocked at the same 5.6 GT/s), and AMD has cut typical board power from 180W to 150W. The trimmed power envelope allows for this:

The R9 270 requires only a single six-pin PCI Express power connector, which is good news if you’re stuck with a lower-wattage power supply. The R9 270X and Radeon HD 7870 both need two six-pin connectors, while the GTX 660 requires only one. Until now, then, Nvidia had a small flexibility advantage on that front.

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| Radeon R7 260X | 18 | 62 | 2.0 | 2.2 | 104 |
| Radeon R9 270 | 30 | 74 | 2.4 | 1.9 | 179 |
| Radeon R9 270 (Asus) | 31 | 78 | 2.5 | 2.0 | 179 |
| Radeon R9 270X | 34 | 84 | 2.7 | 2.1 | 179 |
| Radeon HD 7850 2GB | 28 | 55 | 1.8 | 1.7 | 154 |
| Radeon HD 7870 | 32 | 80 | 2.6 | 2.0 | 154 |
| GeForce GTX 650 Ti Boost 2GB | 25 | 66 | 1.6 | 2.1 | 144 |
| GeForce GTX 650 Ti Boost 2GB (Asus) | 26 | 69 | 1.7 | 2.2 | 144 |
| GeForce GTX 660 | 25 | 83 | 2.0 | 3.1 | 144 |

Based on our theoretical numbers, the drop in clock speed doesn’t put the R9 270 at much of a disadvantage compared to the R9 270X. The Asus version of the R9 270 that AMD sent us actually runs at 975MHz instead of the reference 925MHz, which narrows the gap even further.

Compared to the GTX 660, the R9 270 has, on paper, a higher pixel fill rate, a similar texture filtering rate, higher shader throughput, and much more memory bandwidth. Its only disadvantage is a rasterization rate about two thirds that of the GTX 660.
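As a sanity check on those figures, most of the peak rates in the table can be reproduced from the spec table on the previous page with simple arithmetic: multiply the unit counts by the boost clock, and the memory transfer rate by the bus width. Here’s a minimal sketch for the reference R9 270; the factor of two in the shader math assumes one fused multiply-add per ALU per clock, which is the usual convention for GCN parts, and the rasterization rate is omitted because it depends on details not listed in the table.

```python
# Reference Radeon R9 270: 925MHz boost, 1,280 shader ALUs, 80 texture units,
# 32 ROPs, 5.6 GT/s memory on a 256-bit bus (from the spec table above).
boost_clock_ghz = 0.925
shader_alus     = 1280
texture_units   = 80
rops            = 32
mem_rate_gtps   = 5.6
bus_width_bits  = 256

pixel_fill  = rops * boost_clock_ghz                     # ~30 Gpixels/s
texel_rate  = texture_units * boost_clock_ghz            # ~74 Gtexels/s
shader_rate = shader_alus * 2 * boost_clock_ghz / 1000   # ~2.4 tflops (one FMA = 2 flops)
bandwidth   = mem_rate_gtps * bus_width_bits / 8         # ~179 GB/s

print(f"{pixel_fill:.0f} Gpixels/s, {texel_rate:.0f} Gtexels/s, "
      f"{shader_rate:.1f} tflops, {bandwidth:.0f} GB/s")
```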

The R9 270 has another perk worth mentioning: it comes with a free copy of Battlefield 4. Or, at least, some versions of it do. Or, they’re supposed to.

Yeah, this card’s game bundling situation is clear as mud right now. AMD originally said that all R9-series cards would ship with BF4, but it now tells us that it will be up to retailers and vendors to decide which of their cards ship with the game. As of November 14, we can’t find a single R9 270 in stock at Newegg with BF4 included.

The GeForce GTX 660, by contrast, ships with Assassin’s Creed IV: Black Flag and Splinter Cell Blacklist, plus a $50 discount on the purchase of an Nvidia Shield handheld console. This deal applies to all GTX 660s listed at Newegg. I don’t know many folks interested in buying a Shield, and I expect serious PC gamers to log far more hours in Battlefield 4 than in the titles Nvidia offers. Still, an actual game bundle beats the promise of one.

But I digress. We’ve still got a whole suite of game benchmarks to show you—after a brief detour through our testing methods.

Our testing methods

We tested using our tried-and-true “inside the second” methods. Since we don’t have FCAT equipment up here at TR North, we used Fraps to generate all our performance numbers.

Fraps gives us information about things happening at the start of the rendering pipeline—not, as FCAT does, at the end of the pipeline, when frames reach the display. Having both sets of numbers would be better, but the Fraps data is largely sufficient for the kind of testing we’re doing here. We don’t expect there to be much of a discrepancy between Fraps and FCAT numbers on single-GPU, single-monitor configurations like these.

This time, we’ve run most of our Fraps numbers through a three-frame low-pass filter. This filter is designed to compensate for one of the side effects of triple buffering. It should smooth out irregularities in our frame time measurements that don’t actually affect when the frames are shown on the display. We didn’t apply the filter to our BioShock Infinite numbers, since that game is based on the Unreal Engine. Most UE games don’t use triple buffering, so the filter isn’t appropriate for them.
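For the curious, the filter is conceptually just a short moving average applied to consecutive frame-time samples. Here’s a minimal sketch, assuming a plain three-sample window; the filter we actually applied may weight samples differently.

```python
def three_frame_lowpass(frame_times_ms):
    """Smooth Fraps frame-time samples with a three-sample moving average.

    Illustrative only: it damps single-frame spikes and catch-up frames
    that triple buffering keeps from ever reaching the display.
    """
    smoothed = []
    for i in range(len(frame_times_ms)):
        window = frame_times_ms[max(0, i - 1):i + 2]   # current frame plus its neighbors
        smoothed.append(sum(window) / len(window))
    return smoothed

# A buffered hitch (40 ms) followed by a quick catch-up frame (8 ms)
print(three_frame_lowpass([16.7, 16.7, 40.0, 8.0, 16.7, 16.7]))
```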

Whether filtered or not, our “inside the second” Fraps numbers are far more informative than the raw frames-per-second data produced by more conventional benchmarking techniques. Such data can cover up problems like latency spikes and micro-stuttering, which have a real, palpable impact on gameplay.
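If you’re new to these metrics, the sketch below shows roughly how the headline numbers on the following pages (the FPS average, the 99th-percentile frame time, and “time spent beyond X”) can be derived from a list of per-frame render times. It’s an illustration of the concepts, not the exact analysis script behind our graphs.

```python
def summarize_frame_times(frame_times_ms, threshold_ms=16.7):
    """Boil a run of per-frame render times down to 'inside the second' metrics.

    Illustrative only. 'Time beyond X' here counts just the portion of each
    frame that exceeds the threshold, which is one common way to tally it.
    """
    ordered = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(ordered) / sum(ordered)       # overall frames per second
    p99 = ordered[int(0.99 * (len(ordered) - 1))]        # 99% of frames finish this quickly
    beyond = sum(t - threshold_ms for t in ordered if t > threshold_ms)
    return avg_fps, p99, beyond

# A mostly smooth run with a couple of slow frames mixed in
fps, p99, beyond = summarize_frame_times([15, 16, 17, 16, 45, 16, 15, 38, 16, 17])
print(f"{fps:.0f} FPS average, {p99:.0f} ms 99th percentile, {beyond:.1f} ms beyond 16.7 ms")
```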

For more information about Fraps, FCAT, and our inside-the-second methodology, be sure to read Scott’s articles on the subject: Inside the second: A new look at game benchmarking and Inside the second with Nvidia’s frame capture tools.

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we reported the median results. Our test systems were configured like so:

| Processor | Intel Core i7-3770K |
| Motherboard | Gigabyte Z77X-UD3H |
| North bridge | Intel Z77 Express |
| South bridge | |
| Memory size | 4GB (4 DIMMs) |
| Memory type | AMD Memory & Kingston HyperX DDR3 SDRAM at 1600MHz |
| Memory timings | 9-9-9-28 |
| Chipset drivers | INF update 9.3.0.1021, Rapid Storage Technology 11.6 |
| Audio | Integrated Via audio with 6.0.01.10800 drivers |
| Hard drive | Crucial m4 256GB |
| Power supply | Corsair HX750W 750W |
| OS | Windows 8 Professional x64 Edition |

| | Driver revision | Base GPU clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| AMD Radeon R7 260X | Catalyst 13.11 beta V9 | 1100 | 1625 | 2048 |
| Asus Radeon R9 270 | Catalyst 13.11 beta V9 | 975 | 1400 | 2048 |
| XFX Radeon HD 7850 2GB | Catalyst 13.11 beta V9 | 860 | 1200 | 2048 |
| XFX Radeon HD 7870 | Catalyst 13.11 beta V9 | 1000 | 1200 | 2048 |
| Asus GeForce GTX 650 Ti Boost 2GB | GeForce 331.40 beta | 1020 | 1502 | 2048 |
| Asus GeForce GTX 660 | GeForce 331.40 beta | 980 | 1502 | 2048 |

Thanks to AMD, Corsair, Crucial, and Kingston for helping to outfit our test rig. AMD, Asus, Nvidia, and XFX have our gratitude, as well, for supplying the various graphics cards we tested.

Image quality settings for the graphics cards were left at the control panel defaults, except on the Radeon cards, where surface format optimizations were disabled and the tessellation mode was set to “use application settings.” Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

  • Battlefield 4
  • BioShock Infinite
  • Tomb Raider
  • Crysis 3
  • Far Cry 3

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing 60- or 90-second sequences from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence three times per video card in order to compensate for variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.

  • We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Crysis 3 at the same quality settings used for our performance testing.

  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise level was measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Battlefield 4

Testing multiplayer games involves a lot of finicky variability, so I stuck with Battlefield 4‘s single-player campaign. Benchmarking was done at the start of the Singapore beach landing, which features cataclysmic environmental effects as well as explosions, gunfire, and… well, see for yourself:

I tested at 1080p with every detail setting maxed out except for deferred antialiasing, which was left disabled. That mix of settings caused a negligible decline in AA quality, but it yielded substantially better performance than the “Ultra” preset on the R9 270.



The R9 270 is off to a nice start. It beats the stock-clocked 7870 and winds up with the highest frame rate, the lowest 99th percentile frame time, and the least time spent beyond 16.7 ms of all the cards we tested. (None of the cards spent a significant amount of time above our other thresholds, except for the R7 260X, which is a different class of product.)

Nvidia’s GeForce GTX 660 doesn’t fare terribly well here. It’s barely any faster than the Radeon HD 7850 2GB, which retails for a good $20-30 less.

You might notice one card missing from our comparison: the R9 270X. Alas, we had a short time window to produce this review, and despite our best efforts, we weren’t able to get a 270X up to our labs at TR North. That said, the only difference between our R9 270 sample and a reference 270X is a 75MHz reduction in core clock speed. It’s not hard to imagine how the faster card would fare: negligibly better than its slower sibling.

BioShock Infinite

Irrational Games’ latest BioShock title uses a modified Unreal engine to deliver some truly stunning vistas. To test it, I ran around the “Raffle Square” area for 60 seconds, heading toward a large crowd before doubling back and roaming around empty streets.

Testing was conducted at 1080p using the game’s built-in “Ultra” detail preset.



The R9 270 earns itself another gold medal in BioShock Infinite—though here, it’s so close to the 7870 that there’s hardly any difference between the two.

Nvidia, meanwhile, is at a disadvantage again. The GTX 660 distances itself nicely from the 7850 2GB this time, but it’s still a ways behind the R9 270 and the 7870. The GTX 660 also spends a little more time than the Radeons above our 33-ms frame time threshold, but not much. We’re talking about 35 ms out of a 60-second run here, a nearly negligible span of time.

Tomb Raider

Developed by Crystal Dynamics, this reboot of the famous franchise features a young Lara Croft. I tested it by running around a small mountain village near the beginning of the single-player campaign.

Here, too, I went with the “Ultra” detail preset at 1080p. In Tomb Raider, that preset enables tessellation effects and a whole host of other goodies.



I think some kind of pattern is emerging here. Yet again, the R9 270 edges out the 7870 to snag the gold medal—and yet again, the GTX 660 doesn’t quite measure up.

Crysis 3

There’s not much to say about Crysis 3, except that it’s the latest Crysis game, and it has truly spectacular graphics. To benchmark it, we ran from weapon cache to weapon cache at the beginning of the Welcome to the Jungle level for 60 seconds per run.

Testing was done at 1080p using the medium detail preset with “very high” textures and medium SMAA antialiasing. I thought these cards would be able to handle the “high” preset, but the game just didn’t feel smooth at that setting.



So, this is interesting. With all the cards, the top 5% of frame times are much higher than the rest. Our 99th percentile results reflect this, with all the cards neck and neck around 31-32 ms. That means the last 1% of frames are rendered at the equivalent of 31-32 FPS or less. We’re not seeing a lot of particularly large spikes in frame rendering times, as evidenced by our beyond-a-threshold data, but the game definitely feels less smooth than the FPS averages suggest. Why? Quite likely because of the constant swings between relatively low frame rendering times of around 10 ms and relatively high ones of around 35 ms, as the quick illustration below shows. This problem may be caused by the difficult workload in this section of the game; there’s a tremendous amount of geometry detail in the grass here.
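To put rough numbers on why that pattern feels choppy even when the averages look fine, here’s a quick hypothetical illustration. The alternating 10-ms and 35-ms frame times below are made up to mimic the pattern described above; they aren’t taken from our logs.

```python
# Hypothetical frame times loosely mimicking the Crysis 3 pattern described above
frame_times_ms = [10, 35] * 30          # 60 frames alternating fast/slow

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)   # 22.5 ms
avg_fps = 1000 / avg_frame_time                              # ~44 FPS, looks fine on paper

# Yet half the frames blow past a 60Hz refresh interval (16.7 ms),
# so the animation advances in an uneven stutter-step despite the healthy average.
late_frames = sum(1 for t in frame_times_ms if t > 16.7)

print(f"Average: {avg_fps:.0f} FPS, late frames: {late_frames} of {len(frame_times_ms)}")
```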

Oh, and if you’re wondering which card wins here: excluding the R7 260X, which clearly spends more time above 20 ms than the rest, it’s kind of a toss-up—at least in this particular part of Crysis 3.

Far Cry 3

Far Cry 3: Blood Dragon is a little too technicolor for my taste, so I tested Far Cry 3, which is based on the same engine. I picked one of the first assassination missions, shortly after the dramatic intro sequence and the oddly sudden transition from on-rails action shooter to open-world RPG with guns.

The game was run at 1080p using the “Ultra” detail preset. HBAO was enabled, as well, but MSAA was left disabled.



A-ha! Finally, the GeForce GTX 660 scores a victory, albeit a small one. The Nvidia card is on par with the R9 270 and 7870 in Far Cry 3 in terms of the frame rate average and in our latency-focused “time spent beyond X” metrics, which attempt to quantify the “badness” or lack of smoothness in each card’s performance. However, the GTX 660’s 99th percentile frame time is slightly lower. A look at the full frame latency curve shows that the GTX 660’s frame times don’t rise quite as much after the 95th percentile mark compared to the Radeons. In other words, with the toughest frames to render, when animation smoothness is most threatened, the GTX 660 performs best.

Given how the R9 270 performs in our other games, though, this showing isn’t quite enough to vindicate the GTX 660. The R9 270 really is just about as good in this game.

Power consumption

For some reason, the R9 270 draws more power at idle than our 7870 card—an XFX Black Edition model underclocked to match reference specs. The R9 270 is more power-efficient under load, however, and it’s about the same when the display is switched off. As we saw on the previous pages, the reduced power consumption under load doesn’t come at the cost of performance; on the contrary, the R9 270 generally matches or beats the 7870.

Noise levels and GPU temperatures

Save for the R7 260X, all of these cards have nice and quiet dual-fan coolers. Their noise levels are pretty much on par with one another—not just according to our decibel meter, but subjectively, as well.

The R9 270 draws less power than the 7870 under load, and it’s outfitted with a nice Asus DirectCU II cooler. No wonder it runs so cool.

Conclusions

We’ll once again wrap things up with a couple of value scatter plots. In both plots, the performance numbers are geometric means of data points from all the games we tested. The first plot shows 99th-percentile frame times converted into FPS for easier reading; the second plot shows simple FPS averages. Prices were fetched from Newegg, the GPU vendors, and the card makers, depending on what was appropriate.

The best deals should reside near the top left of each plot, where performance is high and pricing is low. Conversely, the least desirable offerings should be near the bottom right.
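For reference, the performance axis of the first plot is built roughly like this: convert each card’s 99th-percentile frame time in each game to an FPS equivalent, then take the geometric mean across all the games. The sketch below uses placeholder frame times, not our actual results.

```python
from math import prod

def scatter_plot_score(p99_frame_times_ms):
    """Geometric mean of the FPS equivalents of per-game 99th-percentile frame times.

    Placeholder inputs only; this mirrors the conversion used for the value plot.
    """
    fps_equivalents = [1000.0 / t for t in p99_frame_times_ms]
    return prod(fps_equivalents) ** (1.0 / len(fps_equivalents))

# Hypothetical 99th-percentile frame times (ms) across the five games tested
print(round(scatter_plot_score([20.0, 25.0, 22.0, 31.0, 28.0]), 1))
```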


Well, I think this is pretty clear-cut. The Radeon R9 270 outperforms the GeForce GTX 660 overall, and it does so while drawing roughly the same amount of power at idle and under load—and while sipping fewer watts at idle with the display powered off.

The Asus version of the R9 270 we tested may be hot-clocked, but it’s priced at the same $179.99 as other R9 270 cards. That makes it arguably a better choice than the GTX 660, which starts at $189.99.

That is, as long as you don’t start accounting for game bundles.

All GTX 660s listed at Newegg come with Assassin’s Creed IV: Black Flag and Splinter Cell Blacklist plus a $50 Shield discount. AMD says some R9 270s are supposed to ship with Battlefield 4, but we can’t find any such bundles at Newegg right now. If freebies matter more to you than a little extra performance, then the GTX 660 may be the card for you. But if you somehow manage to score an R9 270 with a free copy of BF4, then I’d say that’s the better bargain—simply because Battlefield multiplayer should have much more replay value than the titles Nvidia offers.

The R9 270 has another thing going for it: Mantle. Many Mantle-enabled games are on the way, and the performance gains hinted at by developers sound tantalizing. It could be that the R9 270’s lead over the GTX 660 will grow significantly thanks to the new API. That’s another thing to consider.

As for the R9 270’s big brother, the R9 270X, well… folks with (slightly) deeper pockets may prefer to cough up the extra $20 for it, but given the small difference in clock rates between the two, I’m not sure I’d bother.

Comments closed
    • Trickyday
    • 6 years ago

    Cyril
    “The Radeon R9 270 outperforms the GeForce GTX 660 overall, and it does so while drawing roughly the same amount of power at idle and under load—and while sipping fewer watts at idle with the display powered off”

    “That makes it ARGUABLY a better choice than the GTX 660” ?????

    Cyril, you should be a politician. Howcome you didn’t give the 660 a gold Award?

    • BIF
    • 6 years ago

    And we have another graphic card review and still no folding or other GPGPU test results or even vague information is included.

    I need more information than FPS and what games come free with these cards.

    • erwendigo
    • 6 years ago

    MmmMmmmhh!!!

    A review with ALL its tests (games) that are in the Gaming Evolved program.

    Excellent xxxxjob Cyril!! So the nvidia card is a little slower than the -10$ AMD card (BUT in this case, a OCed card vs reference card), you know, Cyril, if you take this as “a fair comparison” because the Oced card sells at the same price than the factory card, then you must compare it with a OCed nvidia card because you have many models with OC for the SAME PRICE that the reference card (as the Gigabyte GTX 660 OC model, as example).

    And all of this is decorated with a battery test that is copyrighted by AMD (all gaming evolved title). Brilliant!!!

      • NeelyCam
      • 6 years ago

      Oh no! Did your precious NVidia card lose in a benchmark, rendering your life meaningless?

        • moose17145
        • 6 years ago

        RENDERING your life meaningless! hahaha I get it!

        Also… I love how supposedly TR was biased in favor of NVidia not long ago, and now they are suddenly biased in favor of AMD… LOL seriously? Some peoples children…

      • Cyril
      • 6 years ago

      Oh, so we’re biased in AMD’s favor now?

      Jeez. You people. 😉

    • Srsly_Bro
    • 6 years ago

    Thanks to Cyril for putting the old and new GPUs in the same graph for easy comparison. It helps out those who don’t always read articles!

    • Cyril
    • 6 years ago

    I’ve updated the review (including the conclusion) to account for the new information AMD has shared about Battlefield 4 bundles. See here for more details about that:

    [url<]https://techreport.com/news/25660/amd-only-some-r9-cards-will-be-bundled-with-bf4[/url<]

    • anubis44
    • 6 years ago

    I had an Asus 7850 for about a week last year, and although I decided to spring for a Windforce 7950 (I bios flashed it to 1GHz – running solid at that speed for over a year) instead, I was extremely impressed with that little 7850 for ‘only’ $270. Now that this card is out at $180, I am extremely impressed with AMD’s current product line up.

    Looking at the larger picture, AMD really looks like it is deadly, deadly serious about gunning to be ‘THE’ name for product in gaming and shoving nVidia to the pavement in order to win the ‘gaming’ crown at all price points. With AMD winning the triple console wins, these new cards, and soon, Mantle, nVidia should be very, very worried, I think.

    • madmanmarz
    • 6 years ago

    How can I be sure, when ordering one of these cards, that I’m going to receive the BF4 bundle? So far Newegg has not listed them as such.

      • Farting Bob
      • 6 years ago

      I believe you get the steam code in the box, so any supplier will be fine.

        • derFunkenstein
        • 6 years ago

        right but how do you know you’re ordering a box that has a steam code, if that’s the case? Or does the reseller email you the code?

          • anubis44
          • 6 years ago

          Buy it at a bricks & mortar store. That’s how you know what you’re getting – because you can look at the actual box and card you’re getting before you hand over the money. Also, if there’s anything wrong with the card, you can drive right back to the store and exchange it for another one without hassle the same day. That’s why I only buy my graphics cards at a physical store. I don’t mind paying $10-$15 more for the card, since the shipping usually costs about that much anyhow, and if the card is defective, you have to pay ANOTHER $10-$15 to ship it back, and then you get to wait about a week for them to send you another one. And that’s if you’re lucky and nobody loses the paperwork. Screw that.

          Same goes for other higher defect-rate items like motherboards, ram and hard drives. CPUs never seem to be defective, so those I don’t mind ordering online, although I tend to buy those at a physical store, too, if they have stock, just because I like going to buy tech stuff on impulse, and going from deciding to buy something, to getting it home to install it all the same day is a fun rush. I also like supporting my local businesses.

          Hope this helps.

            • derFunkenstein
            • 6 years ago

            Where are you shopping that you’re only paying 10-15 more locally? Locally for me I’m paying usually 15-20% more. Right now I could buy a 2GB GTX 760 for $302 (according to their price list [url=”http://www.computerdeli.com/app/download/6635835704/2013+10+31+Computer+Deli.pdf”]on the web[/url]) or buy the same card for [url=”http://www.amazon.com/EVGA-SuperClocked-Dual-Link-Graphics-02G-P4-2765-KR/dp/B00DHW4HXY/’]$259[/url] on Amazon. Higher end cards have a larger delta. They don’t list 290 cards yet, so I can’t do that comparison.

            • UberGerbil
            • 6 years ago

            Fry’s [url=http://www.frys.com/onlineads/0001507075<]matches[/url<] "internet prices" (with the expected caveats). For me sales tax is still a killer (for a little while longer, until all the etailers are collecting it also).

            • Airmantharp
            • 6 years ago

            Get in now, pay tax, get it later, maybe pay shipping… personally, I’d rather my local Fry’s/Best Buy/MicroCenter stay in business than not. Getting price-matches just lets them know that their prices are a touch too high :).

            • indeego
            • 6 years ago

            How interesting, complete opposite opinion here. Where I live just to drive to a place that has decent cards is ~25 miles (Fry’s) each way, which is about $7-9 in gas. (But really about $20 in maintenance if you consider the life of a car.) It’s a serious time sink. I haven’t been in many years but the prices were easily $40-$50 over online prices, and there was a hassle to even get the card (had to ask associate, who typically tried to upsell you.) You certainly can’t open the box before you pay for it, and returning items at Fry’s, again, is a massive hassle compared to just dropping it back in packaging and putting in mail.

    • mczak
    • 6 years ago

    Someone should give the 7850 numbers on page 1 a look. Don’t think it has 55 tmus and 28 rops 🙂

      • Chrispy_
      • 6 years ago

      it’s 64 for the 7850 and 80 for the 7870

      I’m pretty sure the ROPs are untouched when a Pitcairn is castrated down to a 7850, so all 32 should still be there.

    • ronch
    • 6 years ago

    Ok, I didn’t read the whole article and look at every graph. I just looked at the scatter plot on the last page as well as the power consumption graphs. What I don’t like about the 270 is how it delivers essentially the same performance as the 7870 while consuming more power at idle. Never mind the lower power consumption when the display is off (I only turn off the display when doing a virus scan). As for load power draw, yeah, at least the 270 is better than the 7870 but I reckon most kids out there will be spending most of their computer time on idle, especially the TR kind. You know, us gerbils spend more time reading TR articles and commenting (and perhaps trolling as well.. but the big trolls don’t really take a long time to get banned) than playing games, I believe.

    • blitzy
    • 6 years ago

    Hmm looks to be roughly half the cost and half the performance of a 290, probably not too bad if you only play at 1080p

      • swaaye
      • 6 years ago

      The pricing is pretty nice and makes the fact that it’s a 2 year old chip less annoying. A buddy of mine picked up a 6970 recently for about this price and this is a good bit faster.

      But honestly I think 2GB is probably going to be the new 512MB soon enough. I think new games are going to demand more video memory quickly, since the new consoles will make it less impractical to do so. On the other hand this is a rather cheap card so one can’t expect much. With those $500-$700 cards, on the other hand, I think they skimped frankly (should have gone to 4GB).

        • JustAnEngineer
        • 6 years ago

        AMD [b<]did[/b<] go to 4 GiB of memory on the $400 Radeon R9-290 and the $550 Radeon R9-290X. The $300 Radeon R9-280X / HD7970 and the $240 Radeon HD7950 all have 3 GiB.

          • swaaye
          • 6 years ago

          Yeah I forgot that the 290 has 4GB.

    • Klimax
    • 6 years ago

    ” and I expect most serious PC gamers will log far more hours in Battlefield 4’s multiplayer than in Black Flag and Blacklist combined.”

    WTF is this idiocy??? Sorry, but that is so bad that it should be nuked from orbit.

    • superjawes
    • 6 years ago

    Not enough fanboyism in these comments…

    ROFL, Nvidia is doomed!!!1 this is totally a better value and won’t poison your goldfish like a GeFart card!!

      • Chrispy_
      • 6 years ago

      Well, Spigzone, Dzoner and Mad Doctor have been banned. Now we can just have a sensible discussion on the merits and downsides of the [s<]red[/s<]darker-green-than-the-other-green team's latest efforts.

    • Meadows
    • 6 years ago

    Now this is a card I can agree with more readily. Bonus points to ASUS for a proper cooler.

    • jthh
    • 6 years ago

    I would love to know the Crossfire potential of 2 of these 270’s! Is that a possibility?

      • derFunkenstein
      • 6 years ago

      I’m sure it’s very much like a pair of Radeon 7870s.

    • DPete27
    • 6 years ago

    IMO, if you’re going to suggest avoiding the 270X on the account that it’s not that much faster than the 270, you ought to show it in the scatter plot.

    • MustangSally
    • 6 years ago

    Cyril, were you the designated TR wish fairy for charts and graphs? If so, do you think we could get one scatter plot that shows the relative price-performance data for the full set of the new Radeon cards and the Nvidia competitors? I know that Nvidia will probably respond to this card with price cuts on their existing cards, which might make the data points moving targets in the truest sense of the word, but it’d still be cool to see the whole GPU price-performance spectrum on one chart.

      • DPete27
      • 6 years ago

      Unless you only want a synthetic matchup, they’d have to retest all the cards using the same in-game settings to develop an equal comparison. That poses problems of GPU limitations on lower-end cards and CPU limitations on higher end cards.

        • Bensam123
        • 6 years ago

        They could run one test that is pretty standard across all the cards so they could do something like that. Say like 3D mark… People could extrapolate those results against the rest of the benchmark results for different sets of data then.

          • DPete27
          • 6 years ago

          That’s what I was referring to when I said “synthetic”

          • Chrispy_
          • 6 years ago

          You said 3DMark :\
          That’s not even synthetic, it’s just irrelevant to everything.

      • Damage
      • 6 years ago

      Considering how close the 270 and 270X are, this comes pretty close:

      [url<]https://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/12[/url<]

    • anotherengineer
    • 6 years ago

    Does the ‘x’ version offer true audio or something, or is that on the lower end cards?

    I think I will wait for the GCN 2.0 refresh of the 7850 and see if that offers any further improvements. $179 though, decent price, even better when the 4850/70 was released.

    As for the weird power measurements, it could be the way the BIOS is setup, techpowerup checks clocks and voltages under a few scenarios.

    [url<]http://www.techpowerup.com/reviews/MSI/HD_7870_HAWK/32.html[/url<] [url<]http://www.techpowerup.com/reviews/MSI/R9_270X_HAWK/30.html[/url<]

    • Bensam123
    • 6 years ago

    Almost makes you wonder why they have the 270x. Overclocked variants of the 270 would’ve filled in that gap… Actually, overclocked variants of the 270 may still fill that gap… AMD in general seems to be winning all the price points this time around, sometimes by a large margin and othertimes by a small, but meaningful one (such as with this card). They’re definitely aggressively working at that market share and I can’t say I dislike it.

      • derFunkenstein
      • 6 years ago

      NVidia’s low-midrange looks pretty dumb here. Worse performance for more money is never a winning combination.

        • dpaus
        • 6 years ago

        If their goal is to deny Nvidia a big chunk of their revenue stream, AMD has positioned itself beautifully. This could be significant; AMD has many, many years of surviving and even growing while taking in very little cash. This is a deep-rooted corporate culture that simply cannot be thrown together quickly. Nvidia simply does not have this culture, and cannot put it in place quickly. I haven’t looked up what their cash reserves are, but if I was an active trader in either stock (and I’m not), it’s the first thing I’d be checking this morning.

          • UnfriendlyFire
          • 6 years ago

          And AMD probably has to have some skills with the banks and investors:

          “Yes, we know we’re betting big on APUs because competing against Intel head on didn’t work. Yes, we know that we screwed up our mobile targeting. But please continue to give us loans…”

          • travbrad
          • 6 years ago

          It’s strange how many enthusiasts are questioning Nvidia’s profitability and stock prices when it’s AMD who has been bleeding money for years, and Nvidia has been profitable during that same time period. Nvidia also has roughly double the cash reserves compared to AMD and no debt. Having better performance for the price doesn’t necessarily translate into more profits or even more sales. If it did AMD would have had 90% market share against Intel back in the Athlon64 days.

          Mind you I don’t think I’d be investing in either company right now. Nvidia hasn’t really been able to succeed to any great extent outside of discrete graphics cards, which is such a mature (not growing) market already. AMD well…as I said they have been bleeding money. They do have a more diverse offerings across different parts of the industry, but none of those parts is doing all that well.

            • GeneS
            • 6 years ago

            I think dpaus’s point is that Nvidia is at the top of their stock price history with nowhere to go but down, and AMD is still in the basement with nowhere to go but up.

            • entropy13
            • 6 years ago

            [quote<]I think dpaus's point is that Nvidia is at the top of their stock price history with nowhere to go but down, and AMD is still in the basement with nowhere to go but up.[/quote<] I see, so it's impossible for either of them to actually maintain the status quo i.e no 'going up' nor 'going down'...

            • clone
            • 6 years ago
          • Klimax
          • 6 years ago

          Running on thin margins is really great idea. You can ask pre-Elop Nokia how well it worked for them. Oh and I’d ask CPU side of AMD how great it is to have thin budget for R&D…

          • derFunkenstein
          • 6 years ago

          Yeah, I don’t think so, man. I mean, AMD is used to doing nearly nothing because they have nearly nothing thanks to slim margins, but that doesn’t make it a winning combination.

        • Aerugo
        • 6 years ago
          • derFunkenstein
          • 6 years ago

          I’m holding onto my GTX 460 1GB in the hopes that there will be price drops to line up on performance. If I could get a GTX 760 for $200 I’d bite. Otherwise I’m looking at this R9 270, which is pretty nice for the price.

          • Bensam123
          • 6 years ago

          Makes you wonder how long they can coast at higher prices on their brand name alone… I mean Apple does it and makes top dollar even though like 70% of the market belongs to android.

      • Srsly_Bro
      • 6 years ago

      What about the gap filled by overclocked 270x?

    • jessterman21
    • 6 years ago

    That Crysis 3 level is kind of borked because of the grass physics (no CPU can really handle it). Makes your frametimes graph look like grass…

    Maybe a test sequence from Root of All Evil instead?

    • sweatshopking
    • 6 years ago

    I recently picked up a 7870 for 100$. Seems like it was still a decent buy. If anyone wants bioshock infinite, I’ll selling

      • BoBzeBuilder
      • 6 years ago

      Hey SSK! Welcome back.

      • sweatshopking
      • 6 years ago

      Wtf? Bioshock was game of the year, and NOBODY WANTS TO BUY IT? SCREW YOU PIRATING JERKS!

        • anotherengineer
        • 6 years ago

        No time for games.

        Will your wife clean my house for $50 though??

          • sweatshopking
          • 6 years ago

          Hahahaha. My wife doesn’t clean. I do.

            • Arclight
            • 6 years ago

            Wait, the joke’s on you then.

            • sweatshopking
            • 6 years ago

            she works her butt off in uni. she’s finishing up her nursing degree, then wants to head to med school. She’s maintained a 4.3GPA for the past 3 years. I have to work for a few more years, then i can retire and she’ll bust her butt as a doctor.

            • superjawes
            • 6 years ago

            So you’re going to be a trophy husband? 😆

            Wait, is trophy the right word?

            • derFunkenstein
            • 6 years ago

            More like consolation prize. /zing!

            • superjawes
            • 6 years ago

            Participation award XD

            • sweatshopking
            • 6 years ago

            more like participation punishment.

            • anotherengineer
            • 6 years ago

            stay at home dad………………….my dream too………

        • indeego
        • 6 years ago

        Probably because the game of the year was yet another “Movie” on rails. Just like Bioshock was.

        Wonderful art and story-telling.

        Not the best [i<]game[/i<].

    • wrevilo
    • 6 years ago

    Is this chip “Pitcairn” as per the article or “Curacao” that I have seen mentioned elsewhere?

      • Alexko
      • 6 years ago

      Both, since Pitcairn == Curaçao.

    • albundy
    • 6 years ago

    interesting to see that the Geforce 650ti is in that line up. this card is half the price of the 270, but not half in its performance.

      • LiamC
      • 6 years ago

      Let see; a non-reference (faster than ref.) 650 ti 2GB (twice as much as ref)- TR price grabber price being $160 – and that’s half the 270’s price? Apples to oranges by the looks.

      • Cyril
      • 6 years ago

      We tested the 650 Ti Boost 2GB, not the 650 Ti 2GB. The Boost 2GB starts at ~$150.

      • Wall Street
      • 6 years ago

      I think that the confusing thing here is that the vanilla GTX 650 is half of the price of the R9 270x. However, the GTX 650 Ti Boost and the vanilla R9 270 are being compared in this review and there the price difference is $20-30 depending on which specific cards and price fluctuation. Yay to naming schemes!

      In fact the GTX 650 Ti Boost is not even based on the same silicon as the GTX 650 and has double the shaders and texture units and 50% wider memory interface.

    • wrevilo
    • 6 years ago

    I wonder how far these overclock compared to the 7870 and R270x given the reduced power supply? Will hit some other reviews to find out, but always use TR as my first call!

    One game I would love to see in the review suite is ArmA III. Is there any way TR could include this? The single player campaign should now make it benchmarkable.

    • Stickmansam
    • 6 years ago

    Seems like AMD’s strategy is to fill in every possible gap in performance. I wonder how a 1ghz 7850 would have done with faster memory.

      • Chrispy_
      • 6 years ago

      The 7850 loses texture units – 64 vs 80, I think this hurts it.

      I downclocked a 7870 to 750Mhz and it was still noticeably faster than the 7850 running at 860Mhz – much more than the theoretical 10% difference that clock x shader count would account for.

    • JosiahBradley
    • 6 years ago

    Always nice to read a positive review before bed time. Thanks guys for the non-stop work this release cycle.

      • TwoEars
      • 6 years ago

      The techreport staff is doing a first rate job as usual. Their reader base might not be the biggest but I suspect it’s pretty damn loyal.
