AMD’s FX-8370E processor reviewed

In many ways, AMD’s FX processor series seems to have fallen by the wayside lately. While A-series APUs were refreshed with new Kaveri silicon this past January, the FX family has been trucking along with the same Vishera silicon since 2012. The accompanying 990FX chipset is a year older and begging for a replacement. At this point, one might have expected AMD to let the FX family die a dignified death—then fill in the gaps with high-octane Kaveri APUs.

But that’s not what the company did.

Instead, AMD has just shaken up the FX series with a trio of new models. The additions are based on the same Vishera silicon as before, but the magic of binning has yielded a faster 125W top-of-the-line part as well as two eight-core offerings with 95W thermal envelopes. One of them, the FX-8370E, will be the subject of our review this morning.

Today’s shakeup also involves a round of price cuts, the biggest one of which will send the family’s 220W flagship, the FX-9590, into the same waters as Intel’s Core i5-4690K. Other price cuts apply to the FX-9370 and FX-8320, which are both getting a tad cheaper.

Put together, these are without a doubt the biggest changes AMD’s FX line has seen in well over a year. Let’s look at them one by one before we fire off our benchmarks.

When it came out in June 2013, the FX-9590 could be found only inside select pre-built PCs from system integrators. It took a couple of months for the chip to hit e-tail listings, where it initially sold for a daunting $880. By October 2013, the FX-9590 had fallen to $350, and just prior to today’s price cut, Newegg had it on sale for $299.99.

As of today, the FX-9590 should be available at e-tail for only $229.99. This price pits the FX flagship against Intel’s cheapest Devil’s Canyon processor, the Core i5-4690K, which is available for $239.99. The Core i5 admittedly consumes a fraction of the power, at 88W, and comes with a bundled heatsink and fan, which the FX-9590 does not. (AMD offers a version of the FX-9590 with a liquid cooler in the box, but that kit will set you back $290 after these cuts come into effect.) Still, the FX-9590 is in a more competitive position now than ever.

For users intimidated by the FX-9590’s 220W power envelope, AMD has introduced the FX-8370, its fastest 125W processor yet. Aside from a 100MHz increase in peak Turbo headroom, the FX-8370 has basically the same specs as the FX-8350. Since the new model is $20 more expensive, some may be tempted simply to buy the slower, cheaper chip and overclock it. All members of the FX series, past and present, still have fully unlocked upper multipliers. The FX-8370 is AMD’s first new top-of-the-line FX-8000-series processor since October 2012, though, which has got to count for something.

The most interesting additions to the lineup are these FX “E” chips. They offer up the same eight-core recipe as the aforementioned 125W parts, but in a more reasonable 95W power envelope.

Eight-core FX processors with 95W TDPs have existed before, but these days, the only 95W parts still around have four or six cores. AMD says it intends the “E” chips to serve as upgrades to those models inside systems that “don’t really have the infrastructure” to support a 125W CPU. Simply put, someone with a quad-core FX chip could replace it with an FX-8370E and enjoy a sizeable performance boost without needing a new motherboard, cooler, or power supply. Not even a BIOS update would be necessary, apparently. AMD says “lots” of its users have requested an upgrade path like this, and it was happy to oblige.

In new builds, the “E” chips will vie for supremacy with Core i3 and i5 processors from Intel’s Haswell Refresh series. AMD mentioned the Core i5-4430 and i5-4460 as likely competitors, but the FX-8370E’s most direct opponent will probably be the i5-4590, which carries the same $199.99 asking price at e-tail.

Model      Modules/  Base clock  Max Turbo    Max DDR3      L3 cache  TDP  Old price  New price
           threads   (GHz)       clock (GHz)  speed (MT/s)  (MB)      (W)  (Newegg)   (SEP)
FX-9590    4/8       4.7         5.0          2133          8         220  $299.99    $229.99
FX-9370    4/8       4.4         4.7          2133          8         220  $219.99    $210.99
FX-8370    4/8       4.0         4.3          1866          8         125  n/a        $199.99
FX-8350    4/8       4.0         4.2          1866          8         125  $179.99    $179.99
FX-8320    4/8       3.5         4.0          1866          8         125  $159.99    $146.99
FX-8370E   4/8       3.3         4.3          1866          8         95   n/a        $199.99
FX-8320E   4/8       3.2         4.0          1866          8         95   n/a        $146.99

Here’s a full list of specs and prices for the updated FX lineup. “Old” prices were grabbed from Newegg this past weekend, while new prices are the suggested e-tail figures given to us by AMD.

As you can see, AMD has reduced base clock speeds quite a bit to give the “E” chips their lower TDPs. The Turbo Core peaks are the same as for the non-E models, however.

The FX-8370E, for example, can clock itself as high as the FX-8370 via Turbo. That means it has the potential to be a very strong performer in the lightly threaded workloads that dominate day-to-day PC use—all the while sipping less power than a 125W CPU. The FX-8370E’s lower base speed will probably hinder it in heavily multithreaded tasks, but thanks to its eight hardware threads, it may still put up a decent fight against Intel’s quad-core offerings.
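To put rough numbers on that trade-off, here’s a minimal Python sketch using the clocks from the spec table. It assumes performance scales linearly with clock speed, which ignores Turbo residency, memory bottlenecks, and thermal behavior, so treat it as a ceiling on the gap rather than a prediction.

```python
# Back-of-the-envelope comparison of the FX-8370 and FX-8370E, assuming
# performance scales linearly with clock speed (a simplification).

specs = {
    # model: (base_ghz, max_turbo_ghz), from the spec table above
    "FX-8370":  (4.0, 4.3),
    "FX-8370E": (3.3, 4.3),
}

base_8370, turbo_8370 = specs["FX-8370"]
base_e, turbo_e = specs["FX-8370E"]

# Lightly threaded work runs near the Turbo peak, so the "E" chip gives up nothing.
light_deficit = 1 - turbo_e / turbo_8370   # 0.0

# Heavily threaded work pins all cores near base clock: roughly 17.5% slower.
heavy_deficit = 1 - base_e / base_8370     # ~0.175

print(f"Lightly threaded deficit: {light_deficit:.1%}")
print(f"All-core worst case:      {heavy_deficit:.1%}")
```

The Cinebench results later in this review follow that pattern: identical single-threaded scores, and a multithreaded result that trails the 125W chips.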

That’s the theory, anyhow. AMD sent us an FX-8370E to test, and we put it through our suite to see if the theory matches the reality. Keep reading for the results.

 

Our testing methods

As usual, we ran each test at least three times and have reported the median result. Our test systems were configured like so:

Processor         AMD FX-8350             AMD A6-7400K            Pentium G3258
                  AMD FX-8370             AMD A10-7800            Core i3-4360
                                                                  Core i5-4590
                                                                  Core i7-4790K
Motherboard       Asus Crosshair V        Asus A88X-PRO           Asus Z97-A
                  Formula
North bridge      990FX                   A88X FCH                Z97 Express
South bridge      SB950                   n/a                     n/a
Memory size       16 GB (2 DIMMs)         16 GB (4 DIMMs)         16 GB (2 DIMMs)
Memory type       AMD Performance         AMD Radeon Memory       Corsair Vengeance Pro
                  Series DDR3 SDRAM       Gamer Series DDR3       DDR3 SDRAM
                                          SDRAM
Memory speed      1866 MT/s               1866 MT/s               1333 MT/s
                                          2133 MT/s               1600 MT/s
Memory timings    9-10-9-27 1T            10-11-11-30 1T          8-8-8-20 1T
                                          10-11-11-30 1T          9-9-9-24 1T
Chipset drivers   AMD chipset 13.12       AMD chipset 13.12       INF update 10.0.14
                                                                  iRST 13.0.3.1001
Audio             Integrated              Integrated              Integrated
                  SB950/ALC889 with       A85/ALC892 with         Z97/ALC892 with
                  Realtek 6.0.1.7233      Realtek 6.0.1.7233      Realtek 6.0.1.7233
                  drivers                 drivers                 drivers
OpenCL ICD        AMD APP 1526.3          AMD APP 1526.3          AMD APP 1526.3
IGP drivers       n/a                     Catalyst 14.6 beta      10.18.10.3652

 

Processor         Core i5-2500K           Core i7-4960X           Core i7-5960X
Motherboard       Asus P8Z77-V Pro        Asus P9X79 Deluxe       Asus X99 Deluxe
North bridge      Z77 Express             X79 Express             X99
South bridge      n/a                     n/a                     n/a
Memory size       16 GB (2 DIMMs)         16 GB (4 DIMMs)         16 GB (4 DIMMs)
Memory type       Corsair Vengeance Pro   Corsair Vengeance       Corsair Vengeance LPX
                  DDR3 SDRAM              DDR3 SDRAM              DDR4 SDRAM
Memory speed      1333 MT/s               1866 MT/s               2133 MT/s
Memory timings    8-8-8-20 1T             9-10-9-27 1T            15-15-15-36 1T
Chipset drivers   INF update 10.0.14      INF update 10.0.14      INF update 10.0.17
                  iRST 13.0.3.1001        iRST 13.0.3.1001        iRST 13.1.0.1058
Audio             Integrated              Integrated              Integrated
                  Z77/ALC892 with         X79/ALC898 with         X99/ALC1150 with
                  Realtek 6.0.1.7233      Realtek 6.0.1.7233      Realtek 6.0.1.7233
                  drivers                 drivers                 drivers
OpenCL ICD        AMD APP 1526.3          AMD APP 1526.3          AMD APP 1526.3
IGP drivers       n/a                     n/a                     n/a

They all shared the following common elements:

Hard drive Kingston HyperX SH103S3 240GB SSD
Discrete graphics XFX Radeon HD 7950 Double Dissipation 3GB with Catalyst 14.6 beta drivers
OS Windows 8.1 Pro
Power supply Corsair AX650

Thanks to Corsair, XFX, Kingston, MSI, Asus, Gigabyte, Cooler Master, Intel, and AMD for helping to outfit our test rigs with some of the finest hardware available. Thanks to Intel and AMD for providing the processors, as well, of course.

Some further notes on our testing methods:

  • The test systems’ Windows desktops were set at 1920×1080 in 32-bit color. Vertical refresh sync (vsync) was disabled in the graphics driver control panel.
  • We used a Yokogawa WT210 digital power meter to capture power use over a span of time. The meter reads power use at the wall socket, so it incorporates power use from the entire system—the CPU, motherboard, memory, graphics solution, hard drives, and anything else plugged into the power supply unit. (The monitor was plugged into a separate outlet.) We measured how each of our test systems used power across a set time period, during which time we encoded a video with x264.
  • After consulting with our readers, we’ve decided to enable Windows’ “Balanced” power profile for the bulk of our desktop processor tests, which means power-saving features like SpeedStep and Cool’n’Quiet are operating. (In the past, we only enabled these features for power consumption testing.) Our spot checks demonstrated to us that, typically, there’s no performance penalty for enabling these features on today’s CPUs. If there is a real-world penalty to enabling these features, well, we think that’s worthy of inclusion in our measurements, since the vast majority of desktop processors these days will spend their lives with these features enabled.

The tests and methods we employ are usually publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Power consumption and efficiency

The workload for this test is encoding a video with x264, based on a command ripped straight from the x264 benchmark you’ll see later.


Lower TDP or not, the FX-8370E doesn’t draw any less power than its 125W cousins at idle. The Core i5-4590 consumes 19W less here.

Fire up a video-encoding workload, and the FX-8370E shines compared to its relatives. While its TDP is only 30W lower than the FX-8370’s on paper, the FX-8370E actually draws almost 50W less in this test. Too bad that difference isn’t enough to close the gap with the Core i5-4590.

We can quantify efficiency by looking at the amount of power used, in kilojoules, during the entirety of our test period, when the chips are busy and at idle.

Perhaps our best measure of CPU power efficiency is task energy: the amount of energy used while encoding our video. This measure rewards CPUs for finishing the job sooner, but it doesn’t account for power draw at idle.
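To make the two metrics concrete, here’s a hypothetical Python sketch showing how both fall out of the same wall-socket samples. The wattage figures and durations are invented for illustration; they are not our measured data.

```python
# Hypothetical power log: one wall-power sample per second, in watts.
# 120 s of encoding at 150 W, then 60 s of idle at 60 W (invented numbers).
power_w = [150.0] * 120 + [60.0] * 60
encode_end_s = 120          # the encode finishes at the 120-second mark
sample_interval_s = 1.0

# Energy over the whole test period, busy and idle, in kilojoules (1 W*s = 1 J).
total_kj = sum(p * sample_interval_s for p in power_w) / 1000.0

# Task energy: only samples taken while the encode was running. A faster
# CPU finishes sooner, so fewer high-power samples get counted against it.
task_kj = sum(p * sample_interval_s for p in power_w[:encode_end_s]) / 1000.0

print(f"Whole-period energy: {total_kj:.1f} kJ")  # 21.6 kJ
print(f"Task energy:         {task_kj:.1f} kJ")   # 18.0 kJ
```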

As we’ll see in our performance section, the FX chips chew through this x264 test quite a bit slower than the Intel ones. That means the AMD CPUs spend more time at peak power draw, which compounds the effect of their already-high power consumption. No wonder they’re stuck at the bottom of the graph. The FX-8370E isn’t the worst of the bunch, but it’s still much less efficient than the Core i5-4590.

 

Discrete GPU gaming

The FX-8370 is absent from these performance tests, since we didn’t have time to benchmark it fully. Sorry about that. We’ve been exceptionally busy around here these past few days, as you’ve probably noticed.

Anyhow, in Thief with the Direct3D renderer enabled, the FX-8370E trails the FX-8350 by a small margin—and the Intel pack by a larger one. This game suffers more from the FX-8370E’s lower base clock than it benefits from the chip’s higher Turbo peak, apparently.

AMD’s Mantle API does a pretty solid job of cutting CPU overhead. Enable it in Thief, and the margin between the FX-8370E and the other contenders shrinks to almost nothing. Results like these may become the norm after DirectX 12 arrives late next year. For now, though, they’re the exception.

(If you’re wondering why the dual-core CPU results are missing from the graph above, it’s because the game wouldn’t start in Mantle mode with those chips. Weird.)

 

Productivity

Let’s run through a quick sampling of some desktop-style applications that rely on the CPU cores to do their work.

In these apps, which all put multiple threads to work, the FX-8370E is slower than the FX-8350 across the board. That’s pretty much what we expected, since the FX-8370E has a lower base speed, and it probably doesn’t hit its Turbo peak with all eight cores sweating away.

Still, the FX-8370E’s multithreaded performance is enough to close the gap with the Core i5-4590 in a couple of tests. The two chips are closely matched in Handbrake and GCC, and the Core i5 actually falls behind in 7-Zip.

 

LuxMark OpenCL rendering

LuxMark is a nice example of GPU-accelerated computing. Because it uses the OpenCL interface to access computing power, it can take advantage of graphics processors, CPU cores, and the latest instruction set extensions for both. Let’s see how quickly this application can render a scene using a host of different computing resources.

Here, LuxMark is running on the CPU alone, unaided by discrete or integrated graphics. The FX-8370E falls behind the FX-8350 again, but it edges out the i5-4590.

This is what happens when we run LuxMark on just our discrete GPU. Unsurprisingly, a faster processor doesn’t make much of a difference. The FX processors hold back the discrete Radeon a tad more than the rest, but only just a tad.

In this last test, the CPU and discrete GPU team up to do the work as quickly as the system can manage. The FX-8370E outruns the i5-4590 by just a hair here, and it’s nearly as fast as the FX-8350.

Cinebench rendering

Cinebench also renders a 3D scene, but it uses only CPU power to do so.

The FX-8370 returns for a brief encore here, since Cinebench gives us an important look at single-threaded vs. multithreaded performance.

Cinebench confirms the hypothesis we outlined earlier: in a single-threaded workload, the FX-8370E’s high Turbo speed allows it to perform just as well as its 125W counterpart—and better than the FX-8350. Is that enough to catch up with Intel? Well, no. Even the $70 Pentium G3258 has better single-threaded performance than the FX-8370E.

In our multithreaded test, however, the FX-8370E’s extra threads allow it to edge out the Core i5-4590—despite falling behind the FX-8370 and FX-8350 because of its lower base speed.

 

Conclusions

Let’s wrap things up with one of our famous price vs. performance scatter plots.

We used AMD’s suggested e-tail prices for the FX series and Newegg prices for the rest. On the performance front, we used the geometric mean of results from our full test suite, the same one featured in our review of Intel’s Core i7-5960X processor. We presented an abridged version of that suite on the previous pages of this article, but we used our full slate of numbers for our value calculation.
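For reference, aggregating many benchmark scores into one number is typically done with a geometric mean, since it treats a 10% gain the same regardless of each test’s raw scale. This is a generic sketch with invented scores, not our actual data or tooling:

```python
import math

def geomean(scores):
    """Geometric mean: the nth root of the product of n positive scores.
    Computed in log space to avoid overflow with many values."""
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Invented per-benchmark scores relative to a baseline of 1.0.
chip_a = [1.00, 0.80, 1.25]   # uneven: one weak test, one strong test
chip_b = [1.05, 1.05, 1.05]   # consistently a bit ahead

print(f"Chip A: {geomean(chip_a):.3f}")  # 1.000 -- the outliers cancel out
print(f"Chip B: {geomean(chip_b):.3f}")  # 1.050
```

Python 3.8 and later also ship `statistics.geometric_mean`, which does the same job.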

The FX-8370E caught up with the Core i5-4590 in a handful of our tests, but it doesn’t quite measure up overall. In pure performance terms, the $180 FX-8350 is more of a match for Intel’s $200 quad-core CPU.

Now, the figures above don’t account for power efficiency. The FX-8370E is a better buy than the FX-8350 if you’re all about staving off arctic thawing, or if you’re upgrading a system that’s not equipped to handle a 125W processor. As we saw on page three, however, the reduction from 125W to 95W isn’t enough to close the gap with Intel. The Core i5-4590 is much more power-efficient than the FX-8370E across the board.

That makes the FX-8370E a tough sell for a new PC build. This chip does have an unlocked upper multiplier, which gives it an advantage of sorts over the Core i5. However, overclocking will only worsen the already poor power-efficiency picture—and folks who don’t care about power efficiency should be buying the FX-8350.

Then there’s the accompanying platform. AMD’s 990FX chipset, which still powers high-end Socket AM3+ motherboards, came out in May 2011. It lacks native support for PCI Express 3.0, SATA Express, M.2, and even USB 3.0. ASRock managed to jerry-rig an M.2 slot onto its Fatal1ty 990FX Killer mobo, but that slot is limited to PCIe Gen2 speeds, and the board itself costs a hefty $169.99. Over in the Intel aisle, you can find an Asus board based on Intel’s brand-new Z97 chipset with a Gen3 M.2 slot for 40 bucks less.

So, yeah. If AMD really wants to shake up the FX series, it ought to introduce new silicon based on the Steamroller cores that drive Kaveri, and it ought to offer a new chipset worthy of the FX’s enthusiast aspirations. Price cuts and variations on old themes are all well and good, but they’re not enough if AMD wants to keep up with the blue team.

Comments closed
    • ronch
    • 5 years ago

    OK, so the bottom line here is that the FX-8350 is STILL, in effect, the top-of-the-line FX processor. The 220W models are nothing more than overclocked models and these low-wattage models are nothing more than low-clocked chips. WTH. I guess GF’s 32nm hasn’t really gone anywhere even after more than 3 years.

    • Convert
    • 5 years ago

    I’m honestly curious if there’s really any reason to buy AMD. I’m guessing there must be a price range that makes sense? Grandma’s internet PC or something? Admittedly I don’t actually care enough to research it; for my uses I’ve been buying Intel since the C2D days. I know along the way they were often “close enough” performance-wise and the price differences made them competitive. Nowadays the gap just seems too big. Plus AMD’s platform isn’t all that desirable.

    Back in the day it didn’t seem so clear cut and the fanboy arguments had plenty of fuel on either side. Nowadays it just seems there’s nothing to argue about. I don’t know, maybe I just don’t care anymore.

      • dragontamer5788
      • 5 years ago

      Honestly, AMD’s big moves have been the AM1 platform and the FM2 platform. FM2 HTPCs enjoy a bit of GPU girth for [url=http://forum.doom9.org/showthread.php?t=146228]MadVR[/url] (high settings like Jinc3 still require a discrete-class chip, so you definitely take advantage of the high-speed IGP on FM2) and [url=http://media.bestofmicro.com/5/H/418805/original/grid.png]low-energy gaming[/url] (older 1080p games are possible, and almost everything plays at 720p).

      With AM1, you can build $150 computers for micro-tasks all around the house (router, NAS, etc.). AM1 uses more power than Atom, but it is significantly faster in all the benchmarks I’ve seen and costs basically the same ($40 for a mobo, $40 for an AM1 CPU, $20 for RAM, $30 for a cheap case+PSU, and a $10 USB thumbdrive to boot Linux).

      AM3+ remains AMD’s only platform for a typical PC with a discrete GPU, however. It is only competitive from a price perspective. Since the Intel Pentium and Core i3 shut off AES instructions, the FX-6300 remains competitive at the same price point. Benchmark-wise, the FX-6300 wins in multithreaded tasks and is fast enough to make any single GPU the bottleneck. Haswell has the whole single-threaded thing going for it, but compiling, rendering, Photoshop, and a ton of other tasks will be better on the FX-6300.

      • just brew it!
      • 5 years ago

      When you get down to it, the platform issues aren’t necessarily a deal breaker unless you’re really trying to push the performance envelope. Most people don’t tax the PCIe bus on even AMD’s outdated chipset, and mobo makers have been incorporating USB3 support into their AM3+ designs via an add-on USB3 controller. (That addresses the two most common complaints I hear about AM3+.)

      • BobbinThreadbare
      • 5 years ago

      The FX-6350 is a reasonable competitor to lower end i5s.

      • itachi
      • 5 years ago

      Check my reply to a post above.

      • Anonymus_notthetroll
      • 5 years ago

      Hows the saying go….. “If you dont have anything nice to say, dont say anything at all…?”

      the “Internet FANBOY” missed that day in school I think…

      Sometimes I think I did too…
      Or I was “tardy” at least.

    • Nictron
    • 5 years ago

    Does anyone know if this CPU will work on a ASUS ROG Formula IV Motherboard? The board is AM3+ capable and the review does imply that it might work on older high-end motherboards.

    Thank you in advance.

      • just brew it!
      • 5 years ago

      I’d lean towards no given that the BIOS hasn’t been updated in nearly 2 years. But the only way to know for sure is to find someone who has tried it.

    • uni-mitation
    • 5 years ago

    I hate to be a realist, but I think it’s time to admit reality, people. It is time for them to take out the tumor that is the AMD CPU section, and try to salvage what they can from the GPU. Intel won. There is no do-over, no rewind, nothing. Today I mourn the death of CPU competition and welcome our new overlords. Continued market segmentation to intensify.

    Why pay extra for an underclocked FX chip?

      • Sgttwinkie
      • 5 years ago

      But if AMD stops making CPUs, Intel will overcharge the crap out of theirs.

      • dragontamer5788
      • 5 years ago

      [quote]I hate to be a realist, but I think it came time to admit reality, people. It is time for them to take out the tumor that is the AMD CPU section, and try to salvage what they can from the GPU. Intel won. There is no do-over, no rewind, nothing. Today I mourn the dead of CPU competition and welcome our new overlords. Continued market segmentation to intensify.[/quote]

      The death of high-end CPU competition happened over a year ago, when AMD laid off like 10% of their engineers, sold their headquarters, and laid off 20% of their marketers. Hopefully AMD can ramp back up eventually, but they clearly have no plans to compete against Intel, not while they’re billions of dollars in debt and going to continue bleeding money in all reasonable forecasts.

      That said, I don’t believe the FX-8350 is a bad buy. AMD has kept it competitive by dropping the price over and over again, so while AMD isn’t winning any performance crowns, I can think of plenty of PCs that I’d build using the FX-8350.

      [quote]Why pay extra for an underclocked FX chip?[/quote]

      It offers an upgrade path for all FX-6300 users, which is a 95W TDP part. If you cheaped out and got a budget box, you need to keep the TDP at 95W for all of those $30 motherboards. The cheap mobos are going to throttle the 125W CPUs randomly, causing major stutter issues. So the best CPU for those mATX 760 boards is in fact the FX-8370E (and until this point, it was a niche best satisfied by the FX-6300).

        • just brew it!
        • 5 years ago

        Egads, I didn’t realize exactly how cheap those “cheap mobos” have gotten. You can get an ECS AM3+ motherboard with the 760G chipset [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16813135361]for $25 these days[/url], it would appear. Not that I’d trust it, since A) it’s a $25 mobo; and B) it’s made by ECS!

        • Anonymous Coward
        • 5 years ago

        [quote]Hopefully AMD can ramp back up eventually, but they clearly have no plans to compete against Intel, not while they're billions of $$ in debt and going to continue bleeding money in all reasonable forecasts.[/quote]

        Of course they won’t compete head-on again. They need a business plan that stands a chance of making money, so a simple and effective design, faster than Jaguar/Bobcat but maybe not even as good as the Bulldozer family. Something they can [i]sell at a profit[/i]. A [i]small[/i] profit. So, easy to design, easy to fab....

          • just brew it!
          • 5 years ago

          [quote]So, easy to design, easy to fab....[/quote]

          Sounds like ARM!

      • Anonymous Coward
      • 5 years ago

        If you can’t be the [i]best[/i], might as well do [i]nothing at all[/i]!

        • Anonymus_notthetroll
        • 5 years ago

        this.

        thumbs up +infinity

      • anubis44
      • 5 years ago

      “I hate to be a realist, but I think it came time to admit reality, people. It is time for them to take out the tumor that is the AMD CPU section, and try to salvage what they can from the GPU. Intel won. There is no do-over, no rewind, nothing.”

      If you really ‘hate to be a realist’, you wouldn’t use a word like ‘tumour’ to describe the AMD processor section, and if you were really a realist, you wouldn’t conclude there’s no ‘do-over, no rewind, nothing’ when AMD clearly has started to show the first signs in a long time that they’re revitalizing their CPU efforts. Most people had believed AMD would not be releasing any additional socket AM3+ chips, but despite being respins of existing silicon, here they are. There is also credible intelligence coming out that Jim Keller has been cooking up a new CPU in his lab at AMD, and I’d wait to see what he has come up with before you write AMD off. After all, he’s the only chip engineer who ever beat Intel at the CPU game.

      • itachi
      • 5 years ago

      Just my 2 cents, but when I looked at getting a 4670K, the good overclocking motherboards for it were near 200 bucks, so that’s 230+200. My AMD 8320 cost 150+140 with the M5A99FX Pro, which is a good overclocking board, and I like that. The CPU is currently running at 4.7GHz crystal clear, so that’s a $140 saving. And except for the very few games that depend mostly on single-threaded performance, like Skyrim for instance, I don’t think it’s much of a big deal; I’ll still get stable framerates there anyway.

      Also, yeah, one could argue that the amount you save will be spent on the electricity bill, but with the price of electricity these days I don’t think I’m that close. At 0.13 euros per kWh on average in France, I did the calculations and it’s a bit confusing, but I end up thinking this way: when it’s cold in winter, I literally won’t have to turn the heater on. So my PC is a double-edged sword, rofl.

      Also, there are rumors of FX-8300s coming soon; think about the value you get if you can OC that thing pretty well. We can hope for an aggressive price around $120, too.

    • kvndoom
    • 5 years ago

    Pretty disappointing for the amount of money they want. I really can’t see myself sticking with AMD when I upgrade from my Phenom 965 in a couple years. Too bad, because I’ve been faithful to them for a long time, my last Intel being a Pentium 3.

    My biggest decision may well be which version of Windows to use, 7 or 9…

      • DarkMikaru
      • 5 years ago

      You know, I feel the same way. And I’ll be the first to say this as a die-hard AMD fan over the years. The 8370E makes sense for people on 95W motherboards, though; for those users I’m glad to see it. But the fact that the 8350 is the end of the line for us AM3+ users kinda sucks.

      Granted, the 8350 is plenty fast for my needs and I really won’t need to upgrade for quite some time. I guess it’s just the whole thought of knowing that should I want more power, I’ll have to go to Team Blue… which I dislike. In a way we are stuck between a rock and a hard place, gentlemen. On the one hand, Intel has an absolute killer on its hands with the Core i series, so I understand the appeal. But on the other hand, we can’t let AMD die, as who wants Intel as the only option?

      I love AMD’s APU’s and I’ll continue building those systems for my casual users & business clients. But at the high end, I just can’t downgrade to an APU. Come on AMD… I hope you have something up your sleeve because in 3 more years when that upgrade bug bites me I’d rather give you my money than Intel. 🙁

        • Airmantharp
        • 5 years ago

        You know, as far as most people are concerned (I’m not them, but hey), Intel *IS* the only option, and has been for some time. In most CPU market segments, they’ve essentially been competing with themselves so that they’re also not obsoleted; desktops are a dying breed after all, and mobile is the future of essentially all consumer computing.

        But I do empathize with the wish to support AMD and hopefully help prevent them from failing.

          • DarkMikaru
          • 5 years ago

          Yeah, you’re right. When AMD was handing Intel their &^% with the Athlon, Athlon XP, and 64 lines, they really missed their opportunity to drill into the consumer psyche that “We Are The Value Option”! Many, many years ago I used to work for a retailer selling PCs, and AMD machines were always less expensive yet just as fast. I’d have customers coming back telling me how impressed they were with their little $350 PC and glad they didn’t spend the $600 or $700 for the P4 / Pentium D equivalent.

          But I agree, hopefully things change for the better. Competition is always good for us. The consumer.

        • anubis44
        • 5 years ago

        You probably won’t have to give Intel your money if you can wait until 2016:

        [url]http://linustechtips.com/main/topic/155351-amds-bringing-fx-back-with-jim-kellers-new-x86-core/[/url]

          • just brew it!
          • 5 years ago

          Hmm… if AMD is still around by then and manages to pull this off things could get interesting again. My FX-8350 should last me until 2016!

          • Aquineas
          • 5 years ago

          I have no doubt that they’ll design a technically competent part with Jim Keller leading things. What I wonder about now is their ability to manufacture such a part on a new, unproven 14nm process. AMD (or GlobalFoundries) has historically taken a long time to ramp up new process nodes.

            • ronch
            • 5 years ago

            They’ve surely learned their lesson and come to grips with being a fabless chipmaker, and are designing their future cores to be highly portable across process technologies. They’re not dumb enough to trust GF again and give them all their eggs.

          • Aquineas
          • 5 years ago

          I’d like to think you are right, but I’ve been burned twice, with Phenom and Bulldozer. AMD is going to have to earn my future chip business.

        • ronch
        • 5 years ago

        [quote]I hope you have something up your sleeve because in 3 more years when that upgrade bug bites me I'd rather give you my money than Intel. :([/quote]

        Unless you’ve been living under a rock, you know full well that they’re working on a high-performance x86 core that will supplant the fundamental Bulldozer lineage, right? Estimated release is 2016/2017. Hang in there and don’t count AMD out just yet.

      • Airmantharp
      • 5 years ago

      Windows 9 will likely be launched within your time-frame, and given that it will include (at a minimum) all of the under-the-hood improvements introduced with Windows 8, I can’t see anyone reasonably going for Windows 7 if they don’t have to.

      And hell, given that Windows 8’s interface is so very easy to customize so that it resembles Windows 7 and its predecessors, I don’t see any reason to shun it now either.

      • dragontamer5788
      • 5 years ago

      Zen isn’t going to be out till 2016, and there’s no real word on whether or not Zen is going to be any good (or on time… or on budget).

      But that’s the next hope for a “big core” from AMD.

    • wierdo
    • 5 years ago

    I’m confused about the idle consumption, why’s it the same between the 8370 (4ghz) and 8370E (3.3ghz)? Is there something preventing the CPU from running closer to its idle speed?

    Also fascinating to see how Mantle changes the cpu bottleneck so considerably, I expected some good improvement but that was kinda drastic.

    Thanks!

      • just brew it!
      • 5 years ago

      As a guess, they probably run at the same clock speed when idling. FWIW the FX-8150 I use at work and the FX-8320 I use at home (which have base clocks of 3.6GHz and 3.5GHz respectively) both down-clock idle cores to 1.4GHz when power management is enabled.

      • ronch
      • 5 years ago

      My FX-8350 runs at 1.4GHz when it’s idle. Cool & Quiet is enabled, of course, and I’m using the Balanced power profile under Win7. I would think the 8370 and 8370E also run at 1.4GHz at idle.

        • wierdo
        • 5 years ago

        Interesting, I guess that makes sense, so their idle speeds are the same and not tied to base clock… I expected a relationship between idle and clock multiplier.

          • just brew it!
          • 5 years ago

          AFAICT (limited sample size) they *all* idle at 1.4GHz. (And FWIW AFAICT the older Thuban core Phenom II processors all idled at 800MHz… but I guess we’re getting a bit off-topic!)

      • just brew it!
      • 5 years ago

      C’mon, people. There’s no need to down-vote someone for saying they’re confused (unless done sarcastically, and my sarcasm detector isn’t going crazy here). Down-votes should be reserved for blatant trolling or posting of misinformation/FUD.

    • deruberhanyok
    • 5 years ago

    The most interesting thing for me here is the Thief benchmarks. The large gap with the Direct3D renderer shrinks down to a much smaller, almost negligible gap with Mantle. And as mentioned right there in the article, that could be status quo when Direct3D 12 comes around.

    That right there might be the biggest win for AMD. If CPU overhead can be decreased so much that it can get their processors back into price-competitive performance, suddenly they’re a viable alternative for gaming systems again.

      • albundy
      • 5 years ago

      There's always the video conversion horsepower factor that many take into consideration, as well as power conservation.

      "Too bad that difference isn't enough to close the gap with the Core i5-4590" - yeah, this is what matters when you charge $200 for a CPU that isn't even comparable to an i7-class part.

    • ronch
    • 5 years ago

    I like AMD. I've mostly used AMD in my own machines. But the thing is, the way AMD has been operating since the launch of the Athlon 64 in 2003 will guarantee their demise. The original K7 Athlons and the succeeding Athlon 64 architectures proved that AMD is capable of slipping one under Intel, David-and-Goliath style if you will, despite many claims that it was just a series of flukes. Perhaps that's true, but I think it's more that AMD has been resting on their laurels since 2003. Barcelona was a lazy effort at a time when AMD had become too confident that creating a better architecture than their K8 was impossible, and Bulldozer was an obvious step back from going toe-to-toe with Intel.

    But the thing is, IIRC AMD's most profitable years were from the days of the K6 up to 2006, when Intel gave them a nasty surprise. Some people claim going head-to-head with Intel was disastrous for AMD and that's why they needed to step back a little, but was it really? During the K7 and K8 days AMD only sold CPUs, had to fund their fabs, and had no graphics to augment their revenue. The ATI purchase no doubt tore a big hole in their cash flow, but still, if they had had a competitive part with Barcelona, sales of Phenom chips would've made it easier to pay off their huge debt.

    This was all happening around 2006-2007, during the early days of Bulldozer's development. Things went further downhill when AMD decided that the future of the company would be fabless (remember 'Asset-Light' and all the intrigue that surrounded the rumors during Hector's watch?), and that Bulldozer would be a high core-count CPU with lower IPC, compensated for by high clock speeds and made in a fab they no longer owned or fully controlled. In a way, Bulldozer probably made sense at the time, but AMD tried to blaze their own path, forgetting that in this game Intel sets the rules most of the time and the chances of succeeding down your own path are small.

    So since 2006 AMD hasn't held the performance crown, and they just try to get people excited about their upcoming products by talking them up shortly before launch. This was the case in 2006, when Core 2 came out and beat AMD, and AMD started talking about Barcelona and how it was gonna eat Intel up. Barcelona came and went, and around 2010 AMD talked about Bulldozer and how it was gonna offer better performance. And now we have AMD's next x86 core. I know few people are optimistic about it given how AMD has fallen flat on their promises twice since 2006, but we'll see. In the meantime they can't do anything but flog a dead horse. Don't get me wrong: I don't think Bulldozer is broken. In fact, I think it's a very interesting architecture. But it was too unconventional and took a lot of risks that in the end just didn't work out for AMD. Here's hoping AMD's upcoming x86 core turns things around big time for the company that has given us choice in the x86 space for the longest time.

    • ronch
    • 5 years ago

    Come on, kids. I know these new chips aren't terribly exciting, but let's not put all the blame on AMD either. If GF had given them 22nm last year, we'd (probably) have 5 or 6GHz FX chips by now at 95W or lower.

      • Kretschmer
      • 5 years ago

      That sounds WILDLY optimistic. And it’s quite fair to call a bad product a bad product.

        • Anonymus_notthetroll
        • 5 years ago

        Does “bad” equal “garbage”, “unmarketable”, “unusable”, “unsellable” or anything like that….?

        It just means that there's something better, right?

        And there's ALWAYS something better…just ask my ex girlfriend (cue rimshot, and then completely unnecessary comment from the gallery, "dont lie, you dont have a girlfriend punk" sad face sad face sad face lol). Still true tho… ALWAYS something bigger/faster/stronger.

        6ghz is HILARIOUS btw…lol 6ghz?? SIX???
        upvote for ya, just for that lol… “Wildly” was an understatement.

        No offense to the OP….ur probably right. Not @ 95 watt tho…that ones a stretch lol

          • ronch
          • 5 years ago

          OK, I was exaggerating, but you guys do get my point.

    • LoneWolf15
    • 5 years ago

    “The FX-8370E isn’t the worst of the bunch, but it’s still much less efficient than the Core i5-4590”.

    Heck, it’s much less efficient than the Core i7-4790K, and slightly less efficient than the i7-5960X, according to your testing.

    Those chips may be priced a bit higher than the i5 (so they’re not cost leaders) but when Intel’s performance-class chips beat AMD’s “energy efficient” chips in performance-per-watt….what does that say?

      • ronch
      • 5 years ago

      It says AMD desperately needs some 22nm from GF.

    • GatoRat
    • 5 years ago

    When AMD’s Bulldozer first came out, a colleague quipped that the Athlon 64 series was a fluke, but everyone still takes them seriously. I think there’s a lot of truth in this.

    When the Athlon 64 X2 came out, it blew away everything Intel had. Unfortunately, the chipsets had serious problems, and AMD rested on their success for too long. Intel sat up and revolutionized their CPUs. The difference Nehalem brought is still astonishing, and Intel has built on it magnificently. AMD came out with Bulldozer; it was a bad joke, and AMD doubled down.

      • LoneWolf15
      • 5 years ago

      Heck, Core 2 Duo and Core 2 Quad revolutionized Intel's CPUs. It was kind of like a boxing match where Intel got its bell rung, got a second wind between rounds, and came out swinging.

      Personally, I think Sandy Bridge was the true revolution after Conroe/Penryn Core 2 architecture; Nehalem wasn’t bad (and AMD couldn’t match it), but it was short-term and Intel improved quite a bit on it with SB. I think everything after Sandy has been somewhat incremental, though Intel has won every round since Conroe.

        • hansmuff
        • 5 years ago

        Yep, I had a C2D E8400 for a while, and then Sandy Bridge came out. The performance improvements were so impressive that I bought it the moment Intel addressed the SATA issue in the P67 chipset.

        I still run that chip and that board.. since 2011!
        And I still read reviews of new CPUs and nothing has impressed me enough to upgrade.

        Sandy Bridge was ahead of its time and I am still loving that chip.

        • ronch
        • 5 years ago

        Core 2 was a big wake-up call for AMD, and Nehalem only made it worse: integrating the memory controller was like opening the flood gates, and with it AMD's last bastion was gone. Sandy Bridge built upon Nehalem's excellent wide architecture, did some reshuffling and rebalancing of the execution units, and re-introduced the uop cache, an idea reminiscent of the Pentium 4's trace cache. Ever since then, Intel has been fine-tuning the architecture by adding new instructions, making the caches more efficient, making a few minor tweaks here and there, etc. Intel truly has one of the most advanced CPU cores, far beyond anything any other x86 licensee has ever done. It becomes even more noteworthy when you consider how problematic and inefficient the x86 architecture is, especially in terms of the sophisticated decode hardware it requires, something RISC cores don't have to bother with.

      • just brew it!
      • 5 years ago

      Once is a fluke, more than once is not. The original Athlon and follow-on Athlon XP were also decent. Heck, even the K6-III+ (little-known, hard-to-find, die-shrunk mobile version of the K6-III) was a good CPU for its day.

      Yes, Bulldozer was clearly problematic. But Piledriver parts (ridiculous 200+ watt “stunt” CPUs and oddities like the subject of this review aside) are generally competitive on price/performance, provided you’ve got a multi-threaded workload.

        • Jason181
        • 5 years ago

        I think the problem is that the performance is so uneven. Multithreaded fpu loads don’t fare well, so you’ve basically got a cpu that you probably don’t want for gaming and that only performs really well in certain situations. If you fit that situation, then you’re golden, but if there’s any chance you’ll be doing something else, I just see no reason not to go Intel.

        OpenCL acceleration is nice, and helps AMD remain competitive, but again that’s really a pretty small market. I don’t think most people care about power draw; what I hear is “Why is it so loud?”

          • Airmantharp
          • 5 years ago

          Having eight cores but only four FPUs is killer for games, even though it’s clearly effective for many multitasking workloads.

          • ronch
          • 5 years ago

          I think AMD deliberately designed the Bulldozer core on the assumption that FPUs don't get utilized as heavily as integer execution resources; hence, AMD consciously deprioritized heavy FPU workloads.

        • ronch
        • 5 years ago

        Yep. Just because AMD doesn’t match Intel on per-core performance doesn’t mean the fundamental Bulldozer architecture is a failure. It isn’t one of AMD’s best products ever but it’s something that’ll get the job done for IMHO 95% of people.

        Edit – The FX 8-cores are not the best money-makers for AMD, but come to think of it, they ARE ACTUALLY AMD’s best products ever.

          • just brew it!
          • 5 years ago

          [quote<]The FX 8-cores are not the best money-makers for AMD, but come to think of it, they ARE ACTUALLY AMD's best products ever.[/quote<]

          Depends on how you define "best". In absolute terms, yes they do give you the most performance AMD has ever offered in a single-socket solution. But it arrived too late to compete with Intel on the high end.

          If you define "best" as "gave the market what it wanted, and gave AMD their best shot at competing with Intel on an even footing" I think that would have to be the original Athlon64.

            • ronch
            • 5 years ago

            By ‘best’, I meant from a technological perspective. The FX-8350, for example, beats any Athlon 64 in terms of performance/watt and performance/$$$, and AFAICT also beats the fastest Phenom II’s by a small margin.

      • ronch
      • 5 years ago

      [quote<]AMD rested on their success for too long[/quote<]

      That's true. AMD sure took a long nap after the success of K8 from 2003 to 2006, and when Core 2 came out, it was like Intel gave them the Ice Bucket Challenge while they were snoring. But by 2006 it was too late to draw up a more competitive part than Barcelona, so all they could do was play down Core 2 and brag about their Direct Connect Architecture and all that, and how Barcelona was gonna open up a huge gap between AMD and Intel again. I remember AMD marketing people like Lisa Sobon and AMD chip architect Randy Allen being bullish about Barcelona just before launch. Same thing with John Fruehe, who also practically made up lies about Bulldozer, even [url=http://forums.anandtech.com/showpost.php?p=30364913&postcount=280<]citing a 17% IPC increase[/url<]. I wonder where AMD gets their marketing people.

        • Mat3
        • 5 years ago

        Read his post again. JFAMD did not say “17% increase in IPC”, he said, “Increase for single-threaded workloads will be “a lot” more than 17%”.

        Turns out that wasn’t true either though.

          • ronch
          • 5 years ago

          I was being conservative, so I stuck with 17%.

      • shank15217
      • 5 years ago

      Dead horse beating

        • just brew it!
        • 5 years ago

        Whether or not I agree with you depends on which horse you think is dead. 😉

      • Airmantharp
      • 5 years ago

      You know, I still think that Bulldozer can be tweaked and shrunk to be largely competitive with Intel's best, on an AMD-module-per-Intel-Hyperthreaded-core basis.

      AMD’s modules are like hardware Hyperthreading, which is cool.

        • just brew it!
        • 5 years ago

        So in other words… if Intel fabbed AMD’s CPUs, they’d be competitive with Intel? 😉

        Hyperthreading is done in hardware too, BTW. There are just more resources shared between the two virtual hyperthreaded cores than there are between the cores in an AMD module.

      • anubis44
      • 5 years ago

      ” a colleague quipped that the Athlon 64 series was a fluke, but everyone still takes them seriously. I think there’s a lot of truth in this.”

      The Athlon 64 was not a fluke; it was the product of Jim Keller's design brilliance. After a long hiatus, the same Jim Keller is now head of AMD's CPU division. You can bet he's got another killer CPU brewing in AMD's labs right this very minute, and it will be ready for prime time sometime next year.

      “When the Athlon 64 X2 came out, it blew away everything Intel had. Unfortunately, the chip sets had serious problems and AMD rested on their success for too long.”

      The myth about AMD’s chipsets having across-the-board problems still persists, I see. Well, there were about 5 different companies producing chipsets for AMD motherboards back then, ATI, AMD themselves, nVidia, VIA and SiS. Guess what? The SiS chipsets were just as crappy on Intel systems as they were on AMD systems. The nVidia chipsets also had quirks on the Intel side even as they had different quirks on the AMD side. Same goes for ATI chipsets. Only AMD’s own chipsets seemed to have no issues, and the same thing pretty well goes for Intel’s own chipsets. So I’d say it’s pretty much a wash when it comes to ‘serious problems’ with AMD and Intel chipsets. If some people chose to buy super-cheap motherboards with crappy chipsets to pair up with their superior AMD Athlon 64 processors because somehow, they were still thinking of AMD as a ‘budget’ system build, then that was their own fault, not AMD’s.

        • srg86
        • 5 years ago

        If you don't count the ATI-based chipsets with AMD branding, then I don't think AMD ever made chipsets for the Athlon 64. I still think the nVidia ones were the best; the quirks never affected me.

        BTW, I did once have the AMD 760 chipset for K7; the memory controller developed a fault at 512MB, kind of a strange issue.

        Still, the myth, as you call it, originated in the Super Socket 7 era. At that point it certainly was not a myth; those chipsets were very ropey. But things were much better in the K7 and K8 eras.

    • ronch
    • 5 years ago

    Has anybody here checked out AMD’s website? I understand that depending on your country, you may be presented with a different AMD website, so, here’s what you get with the U.S. site:

    [url=http://tinypic.com/r/1t32hl/8<]This is FXing Serious[/url<]

    If it's meant to sound witty or cool, it's not doing a very good job.

    Edit - Oh heck, typical of AMD's site to [url=http://products.amd.com/en-us/DesktopCPUDetail.aspx?id=850&f1=&f2=&f3=&f4=&f5=&f6=&f7=&f8=&f9=&f10=&f11=&f12=<]lack info on the 8370E[/url<]. Pretty sure it's not just this particular SKU. Not the first time I've found the AMD site lacking polish.

      • just brew it!
      • 5 years ago

      *groan*

      AMD’s legendary marketing prowess in action…

      • tipoo
      • 5 years ago

      “Up to 8 cores of unlocked performance”

      Oh dear. What’s the conversion rate between a core of performance and a Usain Bolt?

    • Bensam123
    • 5 years ago

    What? No overclocking results? I read the end snippet about 'buying the 8350', but seriously… This chip runs with less power, so that may entail better or more efficient OCs. It's like the most interesting prospect about the chip, especially if you're going to OC an AMD product.

    No one is going to buy AMD if they want an efficient chip; they're going to buy AMD if they have a tight budget and want the most bang for their buck. If this OCs better than an 8350, and that's something people consider when they're on a budget and have the expertise to do so, they're going to look at this chip.

      • Meadows
      • 5 years ago

      Did you not read the article?
      The base clock is so low that if you overclock it, you just end up with an 8350.

        • just brew it!
        • 5 years ago

        That's what has me baffled about the rationale for this chip. You're essentially paying [i<]extra[/i<] for what amounts to a factory [i<]underclocked[/i<] part. OK, so you get an extra 100MHz in turbo mode vs. the 8350... that's not going to be noticeable in anything but synthetic benchmarks.

        The only way it makes *any* sense is for system OEMs who want to tick the "8 cores" and "4.3GHz turbo" checkboxes on the system spec sheet, but don't want to deal with the power and cooling requirements of a CPU that can potentially draw 125W.

        • Bensam123
        • 5 years ago

        What does the base clock matter when it's operating at turbo speeds during any sort of intensive workload? If the x264 results are any indication, you can tell it's using less juice at maximal speed. If it were operating at 3.3GHz, there would be a ~20% gap between it and the 8350.

        The article doesn't really stipulate whether or not it's operating at maximum frequency during certain tests, so I can only assume.

        It's an 8350 anyway; that's not the point. The point is whether or not this is a cherry-picked 8350 sample that can operate at the same frequencies for less juice, which would mean it should be able to overclock further.
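A quick back-of-the-envelope check of the clock-scaling reasoning above (a sketch, assuming performance scales roughly linearly with frequency in a CPU-bound test like x264 encoding):

```python
# If the FX-8370E were stuck at its 3.3GHz base clock, how far behind a
# 4.0GHz FX-8350 should it fall in a CPU-bound benchmark?
# (Assumes performance scales linearly with clock speed.)

base_8370e = 3.3  # GHz, FX-8370E base clock
base_8350 = 4.0   # GHz, FX-8350 base clock

# Fraction slower, measured against the 8350:
expected_gap = (base_8350 - base_8370e) / base_8350
print(f"Expected gap at base clocks: {expected_gap:.1%}")  # 17.5%

# Equivalently, the 8350's advantage over a base-clocked 8370E:
print(f"8350 advantage: {base_8350 / base_8370e - 1:.1%}")  # 21.2%
```

Since the review's results show nothing like a 17-21% deficit, the 8370E is evidently spending much of its time above base clock.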

          • derFunkenstein
          • 5 years ago

          you are assuming that under an “intensive workload” that it’s actually running at any sort of turbo speed.

            • Bensam123
            • 5 years ago

            That's what I said, and I also said that the review lacks important information like that. But based on the fact that it's not 20% slower than the 8350, we can assume it's definitely not running at base clock.

          • Jason181
          • 5 years ago

          There is certainly a comment that it must not be operating at full turbo, and it’s borne out by the results. Read, man, READ! 🙂

        • Geonerd
        • 5 years ago

        We don’t know that.

        IMO, the real question is: "When was this chip fabricated?"
        If GloFo has been spitting these out relatively recently, there is every reason to expect a change in the OC headroom (hopefully for the better!).

        The review team needs to report the fab date. If it's a recent chip, I think they would be negligent not to give OC a try.

      • ronch
      • 5 years ago

      I bought my FX because I’ve been very intrigued by the Bulldozer microarchitecture ever since AMD released significant details about it. And of course it seems to offer awesome performance/[insert name of currency here].

        • srg86
        • 5 years ago

        It was certainly gutsy of AMD to try it. Kinda like the K5, where they made a chip much more like the Pentium Pro than a normal Pentium (and if you look at block diagrams, surprisingly similar to a single Bulldozer core inside a module).

        Certainly something that no one else had tried, so who knows if it was going to be a success at the time.

        Sadly, as a “Speed Demon” type design, it always screams to me as AMD’s Netburst 🙁

      • Ryu Connor
      • 5 years ago

      [url<]http://www.guru3d.com/articles_pages/amd_fx_8370_and_8370e_processor_review,18.html[/url<]

      1.46V, 4.7GHz (200 x 23.5), and 360W worth of juice.

      [url<]http://www.xbitlabs.com/articles/cpu/display/amd-fx-9590-9370_7.html#sect0[/url<]

      You'd get better results from a 9590 and 9370. Results tentatively imply that AMD is still binning all the excellent chips for the 9000 series.
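A bit of arithmetic on those Guru3D numbers (a sketch; the stock figures below are assumed nominal values, and the quoted 360W is treated as load power, so system-vs-CPU power accounting would shift the exact ratios):

```python
# Rough perf/watt cost of the 4.7GHz overclock reported by Guru3D,
# relative to an assumed stock FX-8370 operating point.
# Assumes performance scales linearly with clock.

oc_clock_ghz = 0.200 * 23.5   # 200MHz base clock x 23.5 multiplier = 4.7GHz
oc_power_w = 360.0            # load power reported at that overclock

stock_clock_ghz = 4.0         # FX-8370 base clock (assumed reference point)
stock_power_w = 125.0         # rated TDP, used as a stand-in for load power

speedup = oc_clock_ghz / stock_clock_ghz   # ~1.18x
power_ratio = oc_power_w / stock_power_w   # ~2.9x
print(f"~{(speedup - 1) * 100:.1f}% more clock for {power_ratio:.1f}x the power")
```

Roughly 17.5% more clock for nearly triple the power: a stark illustration of how far past its efficiency sweet spot Vishera runs when pushed.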

        • Bensam123
        • 5 years ago

        That's not bad. I got 4.4GHz at 1.43V on an 8350, but that's probably the silicon lottery more than anything.

        It looks like all these chips top out at around that range. I think the best way to see whether this chip requires less voltage is to see where it falls at the same speeds as an 8350. That would winnow out max clock speeds and look more at the voltage in the OC itself.

    • Meadows
    • 5 years ago

    A base clock speed of 3.3 GHz is unreasonably low. I bet they didn’t lower the base voltage much, if at all. Which is a shame, and they should have done so.

    • christos_thski
    • 5 years ago

    Is AMD becoming the new Cyrix? They haven't produced a competitive desktop CPU since Intel released the Core 2 Duo in 2006. I wish they would at least set aside the pipe dream of competitive APUs and focus on well-performing CPUs again. Integrated graphics can't compete with even mid/low-end discrete GPUs; no one needs integrated graphics eating up silicon real estate on the desktop. They're screwed in the mobile sector too, seeing how Intel has rebounded from the atrocious in-order Atoms and produced real, performing mobile CPUs with much lower power consumption…

    I’d hate to see AMD go by the wayside, or become an embedded system/console exclusive CPU manufacturer, they’ve given us some remarkable products over the years. But right now even Qualcomm is a more exciting company, cpu-wise…

      • Meadows
      • 5 years ago

      Actually, their APUs are very good buys in 15-17 inch laptops, which seem to be where value is nowadays.

      The E2-6110 in particular is highly competitive. It offers 2 modules (4 threads) in a reasonable power envelope, it often costs less than any competing Intel laptop, and it performs just as well (if not better). The up-to-date integrated GPU is good enough for most gaming purposes, while competing Intel laptops in that price range are saddled with Intel GPUs which — regardless of how decent they've become — are still noticeably worse. AMD's proposition is very compelling overall.

      But their desktop scene? A proper refresh is painfully overdue.

      • ronch
      • 5 years ago

      [quote<]Is AMD becoming the new Cyrix?[/quote<]

      It just occurred to me that we can draw parallels between Cyrix and AMD. Before Cyrix's demise, they were a force to be reckoned with: their 6x86 was fiercely competitive with Intel's more expensive chips. This was around 1995-96. Just before the end, around 1998, they were falling behind and couldn't seem to put out a chip that could keep up with Intel and AMD. Around this time they were also acquired by NatSemi, who thought integrating everything into one chip was the future (MediaGX), which made Cyrix lose focus on performance and settle for 'just enough computing'. The rest is history.

      If you look at AMD in the last few years, you will see a very clear resemblance to Cyrix. First they got ambitious, decided Fusion was the future, and spent more than they could afford to buy ATI. Then they kept falling further and further behind with their mainstream performance products, with resources stretched between their performance line and these integrated chips, thinking they're the answer to every desktop buyer's dream. Then they spun off their fabs because they couldn't fund them anymore, becoming a fabless chip maker just like Cyrix. There are very clear similarities between Cyrix and AMD, and I'm not sure AMD has realized it is heading down the path Cyrix took.

      Edit - Originally posted this using my Asus Google Nexus 7 (2013) with its built-in Chrome browser. Chrome for Android always goes bonkers when posting long posts here on TR, putting the cursor in other places and not wanting to place it where you left off, killing the whole post. I have no choice but to post it (to avoid wasting everything I wrote) and then edit it on my PC (using Chrome). Even on its latest iteration (4.4.4), I can still see some imperfections with Android, and Chrome for it is even worse. I thought Google would've nailed all the issues by now.

    • ronch
    • 5 years ago

    Would it be possible to hit a 95W TDP by undervolting an FX-8350? That could be a better option than paying $20 more for a 3.3GHz 95W chip.
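For a rough feel of whether that's plausible: dynamic CPU power scales roughly with voltage squared times frequency (P ∝ C·V²·f), so a modest undervolt goes a long way. A minimal sketch, with the caveat that the voltage figures are illustrative assumptions rather than measured FX-8350 values, and that static leakage (significant on 32nm) is ignored:

```python
# Back-of-the-envelope: dynamic power scales roughly as V^2 * f.
# NOTE: the voltages below are hypothetical, not measured values.

stock_v, stock_f, stock_tdp = 1.35, 4.0, 125.0  # volts, GHz, watts (assumed stock)
undervolt_v = 1.21                               # hypothetical stable undervolt

# Same clock, lower voltage: power scales with (V_new / V_old)^2.
est_power = stock_tdp * (undervolt_v / stock_v) ** 2
print(f"Estimated load power after undervolt: {est_power:.0f} W")  # 100 W

# Voltage needed (at the same clock) to squeeze into a 95W envelope:
target_v = stock_v * (95.0 / stock_tdp) ** 0.5
print(f"Voltage for ~95 W at {stock_f} GHz: {target_v:.2f} V")  # 1.18 V
```

So under these assumptions a ~0.14V undervolt gets close to, but not quite into, a 95W envelope at full clocks, which is consistent with the idea that it's doable on a good sample.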

      • Concupiscence
      • 5 years ago

      I dunno about an 8350, but undervolting my FX-8320 to 1.2125v seems to have done the trick. It’s probably doable; for those knowledgeable it may be the best option.

    • Ryhadar
    • 5 years ago

    Bummer. Was hoping the early rumors were true and we’d see another repeat with the Phenom II 95W versus 125W (same specs; just purely lower power consumption).

    I could make a case for Vishera when it was released 2 years ago, but by not going with Intel now you're losing out on things like native USB 3.0, native M.2 and SATA Express support, and Smart Response Technology. Not to mention that motherboards smaller than ATX that support Vishera use a chipset that was released 4 years ago.

    If you can use the cores, more power to you though.

      • just brew it!
      • 5 years ago

      [quote<]Not to mention that motherboards that support vishera smaller than ATX are using a chipset that were released 4 years ago.[/quote<]

      If you're referring to the 760G, that chipset is actually over 5 years old now.

        • Shobai
        • 5 years ago

        The 890GX would be the pick over the 760G for µATX, I think, which Wikipedia pegs at early 2010.

          • just brew it!
          • 5 years ago

          Most of them seem to be using the 760G though.

            • Shobai
            • 5 years ago

            For sure, but there are/were some 890GX boards getting around; Biostar and MSI did one each, Asrock did a couple. For that matter, I believe Asrock also does a number of Geforce 7025 boards that accept 95W FX CPUs, apparently – now, /there’s/ a chipset with legs!

            • just brew it!
            • 5 years ago

            I really don’t understand the relative dearth of post-760G IGP chipset motherboards for Socket AM3+. I can only speculate that AMD has tried to discourage mobo vendors from doing this (maybe by inflating the price of, or prematurely EOLing those chipsets?), in order to encourage adoption of their APUs.

      • Anonymous Coward
      • 5 years ago

      Yeah, seems like the bulldozer family hasn’t improved as much over time as AMD has been able to pull off with previous lines.

    • 6GTX9
    • 5 years ago

    But we're also talking about two different processes here. Did anyone really expect AMD to surpass or even equal Intel on a 32nm process? AMD needs to play catch-up before we can even think about a fair head-to-head.

    As it stands now, we're lining up an old Grand Am against a Corvette. You know damn well it's a laugher before the race starts.

    • Kretschmer
    • 5 years ago

    While we only saw one game, those benchmarks are brutal (neck-and-neck with the Pentium option, unless your game is one of the few Mantle offerings).

    Damage, why did you exclude game benchmarks from the scatter plot (or not offer them as a separate button)? Games are a more common use of CPUs than something like Cinebench.

      • yokem55
      • 5 years ago

      The absence of the gaming scatter plot is interesting given its prominence in the conclusion of the 5960X review.

    • just brew it!
    • 5 years ago

    No real surprises here. This thing is just a re-tuned FX-83xx with the clock speeds and turbo thresholds tweaked to improve power usage under load, at the expense of some performance. I think calling it an 83[u<]70[/u<]E (and charging a premium for it) is just a tad shady, given that it trails the 8350 (albeit not by a lot) in the benchmarks.

    The take-home: AMD is still competitive on price/performance in the market segments where they have products to sell; but power consumption under load continues to be an albatross around their neck.

      • ronch
      • 5 years ago

      Yeah, calling it an 8370E is a little misleading. Intel does the same thing with their ‘S’ models. Can’t say I approve of it either. I’ll blow my whistle whether it’s Intel or AMD that raises my eyebrows.

    • Ninjitsu
    • 5 years ago

    So basically, don’t bother with it for a new build.

    • ptsant
    • 5 years ago

    Choice is always good, but these chips don’t make much sense. Why would anyone pay the $20 to go from the 8350 to 8370 for a mere 0.1 GHz in turbo mode? Similarly, the E versions are castrated in multi-threaded performance, which is the major selling point of the FX series. If you don’t care about multi-threaded performance, you don’t buy FX, you buy Kaveri or Intel.

    Anyway, people stuck with AM3+ motherboards will be happy to have a few more new models at different (reasonable) price points.

    • jdaven
    • 5 years ago

    In summary:

    Single threaded performance on any budget – Intel Haswell processors
    Multithreaded performance on a budget – AMD 8-core FX processors
    Multithreaded performance unlimited budget – Intel Haswell-E 8-core processor

      • DragonDaddyBear
      • 5 years ago

      I’d say a Pentium for the single-threaded on a budget.

        • derFunkenstein
        • 5 years ago

        Those Pentium processors are Haswell-based now, so you’re not correcting him so much as bolstering his claim.

          • DragonDaddyBear
          • 5 years ago

          I was trying to be more specific. All Pentiums are Haswell but not all Haswell are “budget.” But I see your point. I should have clarified that.

            • jdaven
            • 5 years ago

            That’s why I said ‘on ANY budget’. Any means $0 to infinity. Budget doesn’t always mean cheap depending on the use.

            • derFunkenstein
            • 5 years ago

            Yeah, I think you just missed the word “any” on that line.

        • jdaven
        • 5 years ago

        Current Pentium processors are based on Haswell so they are included.

        • Jason181
        • 5 years ago

        I honestly think I’d take the Pentium over the 8350 even for lighter threaded applications (3-4 threads). Of course primarily wanting performance for games, and not afraid to overclock that poor little cpu to within an inch of its life.

      • dragosmp
      • 5 years ago

      Positives:
      *if lucky one can upgrade an old 955BE to an 8-core
      *using the right app the 8370e may actually be faster than the 955BE

      Negatives:
      *pointless for a new system. I want to like these new energy-efficient CPUs, but the 8370e being only about as fast as the 2500k… That's not a good alternative to an H97 + 4-core Haswell, imho.
      *PCIe speed, SATA (for some reason it's slower than Intel's) and the lack of native USB 3.0

        • jdaven
        • 5 years ago

        If the application uses all 8 cores, then the FX is faster than a 2500K. Here is the Anandtech Bench comparison:

        [url<]http://www.anandtech.com/bench/product/697?vs=288[/url<]

        In the Cinebench R11.5 multi-threaded benchmark, the FX 8350 beats the 2500K by 26%. You can see other benchmarks there as well.

      • swaaye
      • 5 years ago

      There’s another consideration: programs that don’t utilize all hardware threads but aren’t single-threaded either. They depend on per-core performance. Thief, for example. I think it utilizes about 3 cores (UE3 often does that).

        • ronch
        • 5 years ago

        Or programs that depend a lot on the FPUs. Some benchmarks suggest that the FX’s FPUs are a bit slower than the ones found in Intel chips. If so, the FX FPUs will inevitably pull performance down even if your app can use all 8 integer cores.

      • travbrad
      • 5 years ago

      The problem for AMD is that single-threaded performance still matters even when you are talking about programs that use 2-4 cores (e.g. most games). Maybe that will all change with DX12 as the article mentioned, but I’d never buy hardware based on promises of future performance that may or may not happen, and could be years away if adoption rates of past DX versions are anything to go by.

    • ronch
    • 5 years ago

    Any hope of ever having an FX lineup built on anything smaller than 32nm or 28nm? Practically all CPUs in recent memory have seen at least one die shrink during their production runs. We can’t really put all the blame on AMD. As long as GF is killing them with slow process ramps, pesky one-time charges (who knows, it may not be over yet), and that crazy wafer supply agreement that AFAIK ties AMD to GF until 2090 or so, AMD will never see much improvement in their business. We really should be buying 22nm FX models by now and expecting 14nm to be imminent.

    • anotherengineer
    • 5 years ago

    Hmmm, the 8320e could actually be an interesting upgrade over my 5-year-old 955-BE, although I don’t see any charts showing how it would compare to the 955 in gaming.

    I do see some other general benchmarks here, though:
    [url<]https://techreport.com/review/26977/intel-core-i7-5960x-processor-reviewed/13[/url<]

    However, I guess I would have to see if Gigabyte updates the BIOS to support the chip first. I have a feeling we’re going to see some reviews popping up about how people’s old 990X/FX mobos don’t recognize the new chip…

    • ronch
    • 5 years ago

    Jumped straight to the power consumption graphs. Was pleased to see the 8370E consume less power. Checked the benchmarks, was a little taken aback by the lower performance, so I checked the specs. 3.3GHz base. OK. So that’s why.

    I was hoping GF’s 32nm node had become better in terms of energy efficiency.

      • derFunkenstein
      • 5 years ago

      I don’t think this is all that terrible. I wouldn’t buy one, mind you, but I think AMD accomplished what they set out to do: address as much of the power consumption complaints as possible while still giving good performance. Saving 50W at load is worth the small 8-core performance loss, and lightly threaded tasks still run about as well as they ever did.

        • ronch
        • 5 years ago

        Well, I guess it’s the best they could do with an outdated process node they’ve been stuck with for 3 years in the FX line.

          • derFunkenstein
          • 5 years ago

          I agree, and I also think they’ve decided it doesn’t make sense to sink money into a die shrink for the Piledriver cores – that they wouldn’t gain enough performance to entice people and make up for the significant investment of back-porting their APU work to AM3+. Their semi-recent hires indicate they’ve gone back to the drawing board, and the publicly available information is kind of exciting, so I get it.

    • SuperSpy
    • 5 years ago

    Poor AMD, that power consumption graph really shows how screwed they are.

      • DragonDaddyBear
      • 5 years ago

      Yeah. But Intel had the same issue with the P4. Hopefully history will repeat once again and AMD will return to their former competitive glory.

        • ludi
        • 5 years ago

        So…AMD cobbles together several K8-derived 64-bit cores at 16nm with doubled cache, gives it a catchy new branding, and finally gives my 3yo i5-2500k a reason to sweat?

        That would be interesting, if it happened, but AMD doesn’t have the resources to keep multiple chip projects going the way Intel did when transitioning past the P4/Itanium fiasco.

      • anotherengineer
      • 5 years ago

      It’s still a trivial amount of power in the total usage of a typical home.

      Wife’s hair dryer is about 1700W, basement lights are about 500W, and they often get left on all the time by others I live with…..sigh…………
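      To put that in perspective, here is a quick back-of-the-envelope sketch. All wattages, hours, and the resulting ratio are illustrative assumptions, not measurements from the review:

      ```python
      # Rough annual energy comparison: a ~50W CPU power delta at load
      # versus 500W of basement lights left on. All inputs are assumed,
      # illustrative values.

      def annual_kwh(watts: float, hours_per_day: float) -> float:
          """Energy used per year, in kilowatt-hours."""
          return watts * hours_per_day * 365 / 1000

      cpu_delta = annual_kwh(50, 4)   # 50W extra CPU draw, 4 hours of load per day
      lights = annual_kwh(500, 6)     # 500W of lights left on 6 hours per day

      print(f"CPU delta: {cpu_delta:.0f} kWh/yr, lights: {lights:.0f} kWh/yr")
      ```

      Under these assumed numbers, the lights dwarf the CPU’s extra draw by roughly 15x, which is the commenter’s point.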

        • DragonDaddyBear
        • 5 years ago

        Not to mention that at idle, where most CPUs spend their time, the gap is even smaller.

        • christos_thski
        • 5 years ago

        But hair dryers are used momentarily, or for a short time at any rate; they don’t factor into a household’s total consumption nearly as much as a device that is almost always on, like a personal computer.

        As for your basement lights, jesus man, that’s A LOT, especially if they’re left on all the time. You should absolutely, positively look into LED bulbs; you’ll save quite a bit.

          • UnfriendlyFire
          • 5 years ago

          My parents refuse to buy LED bulbs because they calculated that it would take 4-5 years to recoup the cost, and by that time they would’ve moved out of the house.
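          The payback math here looks something like the following hedged sketch. The bulb price, wattages, daily hours, and electricity rate are all assumed for illustration, not the parents’ actual figures:

          ```python
          def payback_years(bulb_cost: float, old_watts: float, new_watts: float,
                            hours_per_day: float, rate_per_kwh: float) -> float:
              """Years until an efficient bulb's energy savings cover its price."""
              kwh_saved_per_year = (old_watts - new_watts) * hours_per_day * 365 / 1000
              return bulb_cost / (kwh_saved_per_year * rate_per_kwh)

          # Example: a $10 LED (9W) replacing a 60W incandescent,
          # 2 hours/day of use, $0.12/kWh -- all assumed numbers.
          years = payback_years(10.0, 60, 9, 2, 0.12)
          print(f"Payback: {years:.1f} years")
          ```

          With lighter daily use or cheaper electricity the payback stretches out, which is how a 4-5 year estimate can be reasonable.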

            • just brew it!
            • 5 years ago

            Light bulbs aren’t permanently wedded to the fixtures you install them in. Just stick the old bulbs in a box in the attic and swap them back in if/when you move.

        • UnfriendlyFire
        • 5 years ago

        Three incandescent light bulbs will consume around 180W.

        Bad insulation job? Have fun with the increased heating/cooling bill.

      • ptsant
      • 5 years ago

      People are OK paying top dollar for a 150W workstation Xeon E5, so power consumption per se is not the problem. Decent cooling can take care of that. The problem is the absolute performance you get.

      Intel has been designing around power consumption in order to best serve the lucrative notebook/mobile and server markets. The desktop market does not objectively depend that much on TDP up to, say, 130-150W. If 130W got you double the performance of a 65W Haswell, many people would pay for that (it’s called Haswell-E, in fact), stick a good cooler and forget it’s there.

        • SuperSpy
        • 5 years ago

        My point was not the power consumption, it was the effective efficiency. The AMD chip draws nearly twice the power and takes about 20% longer at that higher draw to do the same amount of work as the Intel chip.

        I would agree it wouldn’t hurt so much if AMD had anywhere near an absolute performance lead over Intel, but they don’t, by quite a large margin. A lot of people, myself included, aren’t really bothered by a high-power chip [i<]if[/i<] there are performance benefits. With AMD, that’s sadly not the case.

        With Intel focused so heavily on playing catch-up with ARM and its tiny power consumption, I don’t see it getting any easier for AMD. I really, really hope they have something new in the works, but I’m worried they’re just gonna run their Bulldozer-style architecture into the ground with nothing but incremental upgrades.
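        Energy per task is power times time, so those two factors multiply. A minimal sketch: the 2x power and 20% longer runtime figures are the commenter’s estimates, and the baseline wattage and runtime are made up for illustration:

        ```python
        def task_energy_joules(watts: float, seconds: float) -> float:
            """Energy consumed to finish a task: power (W) x time (s)."""
            return watts * seconds

        baseline = task_energy_joules(65, 100)              # hypothetical chip: 65W for 100s
        contender = task_energy_joules(65 * 2, 100 * 1.2)   # ~2x power, ~20% longer

        print(f"Energy ratio: {contender / baseline:.1f}x")  # 2.0 * 1.2 = 2.4x per task
        ```

        That multiplicative effect is why "twice the power and 20% slower" is worse than either number suggests on its own.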

      • sschaem
      • 5 years ago

      To be noted, these CPUs use the same 32nm process that was first used by AMD in 2010.
      It’s also pretty much the same architecture that was released in 2011.

      I see the FX-8370e as a 2011 chip design, built on a 2010 fab process, released in mid-2014.

      So, not much investment on AMD’s part to get that done.

      Does this showcase AMD’s current/latest capabilities? I don’t think so.

      Jumping to 20nm and the newly announced architecture (post-Excavator) will let AMD narrow the gap.
      Until then, nothing too revolutionary will happen.

    • DragonDaddyBear
    • 5 years ago

    I think the CPU is a viable buy when you look at total platform costs. The motherboards are cheaper at feature parity, as is the CPU. And PCI Express Gen 2 hasn’t been a limiting factor in any benchmark I’ve ever seen under realistic conditions. I honestly would recommend the exact pairing AMD suggests in the graphic on the first page, the E CPU and a 285, for someone on a very tight budget for a gaming/all-purpose rig.

      • Kretschmer
      • 5 years ago

      For gaming, the Pentium G3258 costs less than half as much as the cheapest “FX-E” and generally offers better performance BEFORE overclocking. Intel motherboards are very competitive if you go with the “H” series like I did for my last build; many “Z” features are extraneous. Going with an i3 would end up being a night and day difference.

      I can’t see recommending one of these new CPUs for any sort of gaming, as single-threaded performance and platform options are the low point of AMD’s offerings.

        • DragonDaddyBear
        • 5 years ago

        I don’t disagree but I think I would rather get an AMD for a budget build that can also do VMs and games and video encoding.

          • Kretschmer
          • 5 years ago

          I guess, but that’s a hell of a niche.

          I’d rather spend a bit more for an i5 and get great performance in all categories.

          • derFunkenstein
          • 5 years ago

          An i3 build can do VT-x, so you’re still not losing anything.

          The “gaming/video encoding” build case that immediately springs to mind is Twitch. At least with nVidia (and maybe AMD, I haven’t been paying attention), the GPU will do the H.264 encoding with no CPU intervention. Unless I’m converting a bunch of Blu-Ray rips to HD .mkvs, I don’t see “video encoding” in the traditional sense as a real thing people do anymore. They’d much rather stream it from Netflix or buy it from Amazon/Apple.

            • dragontamer5788
            • 5 years ago

            [quote<] The "gaming/video encoding" build case that immediately springs to mind is Twitch. At least with nVidia (and maybe AMD, I haven't been paying attention), the GPU will do the H.264 encoding with no CPU intervention. Unless I'm converting a bunch of Blu-Ray rips to HD .mkvs, I don't see "video encoding" in the traditional sense as a real thing people do anymore. They'd much rather stream it from Netflix or buy it from Amazon/Apple.[/quote<]

            GPU encoders seem to be of significantly lower quality than x264/Handbrake, and I'm pretty sure that H.264 encoding on the GPU slows down the GPU anyway. (Most computers have extra CPU power, especially if you're playing on high settings.)

            • Concupiscence
            • 5 years ago

            Netflix’s issue remains one of cyclical availability: when your kid’s favorite show suddenly vanishes from Netflix without warning because the rights holder couldn’t come to an agreement with the streaming provider, I can assure you that the reliable presence of physical media becomes more attractive.

            Amazon and Apple both provide a useful service – there are some HD movies on the iTunes Store that will never get a Blu-ray release, for starters – but I like extra features, commentaries, and knowing that video quality won’t take a dive because I’m torrenting or someone else in the house is downloading. And transcoding Blu-ray rips to DRM-free smaller files is handy as can be! I just don’t share them with others outside my immediate family.

            • dragontamer5788
            • 5 years ago

            [quote<]Netflix's issue remains one of cyclical availability: when your kid's favorite show suddenly vanishes from Netflix without warning because the rights holder couldn't come to an agreement with the streaming provider, I can assure you that the reliable presence of physical media becomes more attractive.[/quote<]

            Indeed. It's why I still buy anime today, especially the shows I enjoy. Even if something streamed on the internet at one point in time or another… sometimes you end up with situations like "[url=http://www.amazon.com/dp/B009TN24KS<]Garden of Sinners[/url<]", where a show becomes extremely rare once the streaming rights run dry.

      • Third_Eye
      • 5 years ago

      IMHO the PC world has moved from Price/Performance to Power/Price/Performance a while back and hence the FX series has seemingly been shunned.

      In the computing division, the only thing keeping AMD afloat is their APUs. There’s basically no future for MPUs based on the Bulldozer core. Sorry folks! There will be no SR-based MPUs, or even Excavator-based ones. Blame it on the design or on the fabbing; either way, the ship has sailed. The last refuge of the Dozer cores seems to be the mobile world.

      What worries me is the painfully slow release of SR-based laptops.
      And Beema-based laptops are only being used as Kabini replacements rather than the step up they clearly are.

        • ronch
        • 5 years ago

        With dwindling funds, I think AMD had to make the tough decision to halt future AM3+ chips. They are concentrating all their time, money, and effort on making their next-generation products more competitive. Remember, Bulldozer and its lineage are the result of past management, and it takes 5 or so years to develop a new CPU core to underpin a lineup. For now they’re just weathering the storm and doing the best they can to get another shot. It may well be their last. Let’s cheer for the little company instead of booing them: they did everything they could, given their limited resources, to design a mind-bogglingly complex microprocessor, and it simply failed to match the products of a far, far bigger company with many times the resources. Really, bashing AMD here is like bashing a small car company for not having the resources to build a luxury car that matches up well against BMW or Mercedes.

      • ronch
      • 5 years ago

      I don’t see why people are splitting hairs here comparing the FX-8350 to Intel’s lineup. I can see many customer comments over at Newegg and they have practically the same thing to say about the FX-8350: terrific performance, great price, horrible fan/heatsink, runs a little hot (although Intel chips are not immune to that either). It’s not a bad CPU to own and it does have the muscle to power through multiple tasks with ease.

        • Anonymus_notthetroll
        • 5 years ago

        First world problems….

        Food… check
        Water… check
        Shelter… check
        EVERYTHING ELSE ANY sane PERSON COULD WANT… check

        Hmmm… still have this “weird” feeling… Like i HAVE to complain about something… hmmmm, now what could i complain about here… OH! How u doin AMD..???

        (says the guy complaining about the complainers) what!? lol

          • ronch
          • 5 years ago

          Sometimes we need to step back a little and see how much we’ve got, and how lucky we are to even be buying FX and Core i5/i7 chips when most of the world doesn’t even have computers, or is getting by with tired old Pentium 4s or faded, dirty 2006-era laptops.
