A subjective look at the A8-7600’s gaming performance

Update — Faulty memory appears to be behind the crashing we experienced with the A8-7600. The AMD-branded DIMMs provided with the Kaveri test system produce errors when running Prime95 alongside the Unigine Valley graphics benchmark. These errors occur with the memory clocked at 2133MHz, the maximum speed officially supported by both the DIMMs and the processor. Dialing back the modules to 1866MHz eliminates the errors, and so does swapping in a pair of Corsair Vengeance DIMMs. The Corsair modules passed a 12-hour stress test at 2133MHz without so much as a single error.
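
The gist of a memory soak test is simple: fill RAM with known bit patterns, read everything back, and treat any mismatch as an error. The Python sketch below is only a toy illustration of that idea, not the procedure we actually used; Prime95 and the Unigine Valley benchmark did the real work here, and they stress memory far harder than this ever could.

```python
# Toy pattern-based memory soak test: fill a buffer with a known
# pattern, read it back, and count mismatches. Real tools (Prime95,
# memtest) hammer memory far harder; this only sketches the concept.
import array

PATTERNS = (0x0000000000000000, 0xFFFFFFFFFFFFFFFF,
            0xAAAAAAAAAAAAAAAA, 0x5555555555555555)

def soak_test(mib=64, passes=2):
    words = mib * 1024 * 1024 // 8          # 64-bit words to allocate
    errors = 0
    for p in range(passes):
        for pat in PATTERNS:
            buf = array.array('Q', [pat]) * words    # fill
            for i, v in enumerate(buf):              # verify
                if v != pat:
                    errors += 1
                    print(f"pass {p}: mismatch at word {i}: {v:#018x}")
    return errors

if __name__ == "__main__":
    print("errors:", soak_test())   # a real soak runs for hours, not minutes
```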

Kaveri is AMD’s first APU to feature integrated graphics based on the latest generation of Radeon graphics cards. As we learned in our review of the A8-7600, even a cut-down version of this DirectX 11-class GPU can keep up with the latest blockbuster games. Battlefield 4, Batman: Arkham Origins, and Tomb Raider are all playable at a 1080p resolution. The frame rates aren’t great—around 25-30 FPS—and the in-game detail settings need to be turned down to get the games running that well. But the action is smooth enough and the graphics are good enough to deliver an enjoyable experience, especially for so-called casual gamers with less refined tastes. Not bad for a $120 processor that can fit inside small-form-factor and all-in-one systems.

Our deadline for the Kaveri review was extremely tight, so there was no time to test the A8-7600 in additional games. However, I wanted to see how the chip handled a broader collection of titles, specifically the older, less demanding games so frequently discounted on Steam. These games may not be the latest and greatest, but they’re still a lot of fun, and they’re very cheap to buy. Perhaps the A8-7600 could run them with fewer compromises.

Since I was pretty much zombified the day after the review went up, I decided to find out. Installing and playing games was my only real hope of productivity in that state. The following are my subjective impressions and some accompanying screenshots. Clicking the screenshots will bring up a larger, full-resolution image that provides a better sense of how things look.

First, here are some shots from the games we tested in the review. (Our full, inside-the-second analysis begins here.)

Battlefield 4 (click to enlarge)

Batman: Arkham Origins (click to enlarge)

Tomb Raider (click to enlarge)

I didn’t include the full-sized images in the initial article, but they’re worth perusing. All three games look better than one might expect from integrated graphics, especially given the display resolution. That said, Batman and Tomb Raider both crashed to the desktop multiple times during testing, and they weren’t the only games to have issues.

Speaking of other games, let’s look at a batch of first-person shooters.

Borderlands 2 (click to enlarge)

Dishonored (click to enlarge)

Serious Sam 3: BFE (click to enlarge)

Borderlands 2 ran reasonably smoothly with only depth of field, ambient occlusion, and antialiasing disabled. The frame rate stuck to around 30 FPS, and I didn’t perceive any obvious stuttering. This game is definitely playable, though it did crash to the desktop twice.

Serious Sam also crashed a couple of times. Otherwise, the game ran pretty well with high details and only ambient occlusion and antialiasing disabled. Frame rates bounced around within the 25-50 FPS range depending on how many baddies there were on the screen. The occasional slowdown was noticeable during the heaviest action, but it didn’t really affect my enjoyment of the game.

Fraps’ frame rate counter showed 30-45 FPS during my Dishonored session. The action felt smooth, with no apparent interruptions to fluid frame delivery. And the game looked decent, too. All the graphical settings were maxed with the exception of the model detail, which was set to normal rather than high, and antialiasing, which was disabled.
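
Since I’m leaning on Fraps’ counter throughout this piece, here’s roughly how a frame-time log becomes numbers like these. This is a minimal Python sketch rather than our actual tooling; it assumes Fraps’ benchmark-mode frametimes CSV, which logs one cumulative millisecond timestamp per frame, and the “time spent beyond 33.3 ms” figure is the same style of badness metric our inside-the-second reviews use.

```python
# Minimal frame-time analysis for a Fraps benchmark log. Assumes a
# "frametimes" CSV whose second column is a cumulative timestamp in
# milliseconds, one row per frame (header row skipped).
import csv

def frame_metrics(path, budget_ms=33.3):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]                    # skip header
    stamps = [float(r[1]) for r in rows]
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]  # per-frame ms
    avg_fps = 1000.0 * len(deltas) / (stamps[-1] - stamps[0])
    # "Time spent beyond X ms" counts only the excess over the budget,
    # so a steady 30 FPS scores zero while intermittent hitches add up.
    beyond = sum(d - budget_ms for d in deltas if d > budget_ms)
    return avg_fps, max(deltas), beyond

fps, worst, beyond = frame_metrics("frametimes.csv")
print(f"avg {fps:.1f} FPS, worst frame {worst:.1f} ms, "
      f"{beyond:.0f} ms spent beyond 33.3 ms")
```

The point of that last metric is that a locked 30 FPS and an average of 30 FPS with intermittent 100-ms hitches look identical in an FPS average but feel completely different in play.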

Next on the shooter front: Mirror’s Edge and Counter-Strike: Global Offensive.

Mirror’s Edge (click to enlarge)

Counter-Strike: Global Offensive (click to enlarge)

Both of these games ran well on the A8-7600. Counter-Strike regularly hit 60 FPS with the details maxed and FXAA turned on. It felt noticeably silkier than the other shooters, which is exactly what you want in a game that relies on quick reactions.

Mirror’s Edge had slightly lower frame rates than Counter-Strike, and I had to disable antialiasing and PhysX effects to make the action stutter-free. After those adjustments, Fraps’ FPS counter never dropped below 35 FPS, and Faith’s free running felt fluid. Or it did until the game crashed. Twice. Noticing a pattern yet?

Dirt: Showdown (click to enlarge)

Need for Speed: Shift 2 Unleashed (click to enlarge)

Dirt: Showdown crashed to the desktop multiple times, too. It was actually part of the original test suite for the A8-7600 review, but I switched to Tomb Raider after encountering a couple of early crashes on the Kaveri setup. Given another shot, Dirt: Showdown ran pretty well, at least between crashes. With high details and antialiasing disabled, the frame rate hovered around 30-35 FPS. There were no obvious stutters or slowdowns.

The only crashing problem in Need for Speed: Shift 2 Unleashed was hitting other cars, and I can’t blame the game or the APU for that. With high in-game detail settings, the A8-7600 cranked out 25-30 FPS. The frame rate dipped to the lower end of that range when there were more cars in front of me, but that didn’t make the gameplay feel sluggish.

Sleeping Dogs (click to enlarge)

Just Cause 2 (click to enlarge)

More game crashes hit when I tackled Sleeping Dogs. One of them even hosed part of the Windows install, forcing me to re-image the system. Ugh.

When it wasn’t crashing, Sleeping Dogs was too choppy with high details. The game ran at 30-45 FPS with medium details, though. Scaling back the eye candy sacrificed the slickness of the environment, but it was necessary to even out the frame delivery and eliminate stuttering. And the graphics still looked all right.

Just Cause 2 is enjoying something of a renaissance thanks to a free multiplayer mod. The mod crashed on me several times, but the standard, single-player version of the game ran without issue. And it ran very well, too. With high details and only antialiasing and ambient occlusion disabled, there were no noticeable slowdowns in the frame rate. Fraps reported 30-45 FPS for the duration of my test session.

Dyad (click to enlarge)

Trials Evolution Gold (click to enlarge)

A couple of more casual games, Dyad and Trials Evolution Gold, performed impeccably with all their in-game detail settings turned up. Not that we should be surprised. These titles are much simpler than the other games we’ve looked at so far.

Dyad and Trials Evolution Gold were immune to crashes, and the Kaveri system was perfectly stable in all our non-gaming tests, including those that tapped the integrated Radeon via OpenCL. The A8-7600 still had problems with exactly half of the games we played, though. That’s a lot, especially since these aren’t overly obscure titles. (I’m not counting Just Cause 2 multiplayer, which could probably use more polish.)

AMD’s OverDrive utility showed no evidence that the APU was overheating. Also, there were no problems with Richland-based APUs running in the same test system and with the same drivers. Those chips have an older integrated graphics architecture that may use a separate driver code path, so perhaps this is just a software issue that can be ironed out with a future Catalyst driver release. Fingers crossed.

In between crashes, the A8-7600’s gaming chops impressed me. This APU is fast enough to run lots of really good titles at 1080p resolution, and it can handle older games without too much sacrifice. That said, there are still some compromises involved. Even in older games, it’s rare to be able to turn the detail settings all the way up, and antialiasing often causes slowdowns. Some visual fidelity is inevitably lost versus what can be achieved with a more powerful GPU.

Some smoothness is lost, as well. Although the A8-7600 was largely stutter-free in the games we tested, the lower frame rates we experienced in more recent titles didn’t feel as fluid as the 60 FPS we got in Counter-Strike. The APU was fast enough to run the games we played at the settings we used, but the performance definitely wasn’t ideal for most of those titles.

For casual audiences with less refined appetites, the A8-7600 is probably fast enough. Connoisseurs are unlikely to be satisfied, though, and I question how well future titles will run on the chip. The Xbox One and PlayStation 4 have a lot more GPU grunt than Kaveri’s integrated Radeon, and developers are likely to target those platforms as their new baseline. Perhaps Kaveri can serve as a sort of gateway drug by giving people a taste of PC gaming without all the trimmings.

Comments closed
    • kamikaziechameleon
    • 6 years ago

    We need to have an integrated solution from AMD that is at least as powerful as a 400 dollar console. This is silly.

    • esterhasz
    • 6 years ago

    1) I really like the “subjective” approach.

    2) Pretty good performance! A-10 with the right memory could be a nice machine for the living room.

    3) Computerbase.de tested different memory configurations and found that not only memory speed but also how the chips are organized on the DIMMs strongly affects Kaveri’s performance. Two dual-rank modules are approximately 5% faster than four single-rank ones at the same speed rating. That’s quite a lot. (http://www.computerbase.de/artikel/prozessoren/2014/amds-kaveri-und-der-speicher/2/)

    • Mat3
    • 6 years ago

    Where’s the A-10 review?

    • Fighterpilot
    • 6 years ago

    Not bad for a first mainstream APU.
    How do Intel chips perform in those games…or is Intel sitting this one out?

    • flip-mode
    • 6 years ago

    I do not want to take away from what AMD has been able to do with integrated graphics. Also, I think that AMD’s progress on HSA is very good to see and has big potential. But I want to talk about pure-CPU performance.

    Over the weekend I was curious how my old 2009-model X4 955 stands in comparison to the fastest Kaveri – the A10 7850. Neither TR nor Anandtech did a direct comparison of these two, which is completely understandable; the X4 955 is a geezer of a CPU. So I used Anandtech Bench, which has the inverse problem of not having info on the newest chips, but it did have data for the A10 5800K, and Anandtech also included the A10 5800K in its Kaveri review. So I compared my X4 955 to the A10 5800K in Anandtech Bench and – again, for purely CPU tasks only – the A10 5800K and the X4 955 are very close in performance, with the A10 5800K having a slight edge overall. Jumping over to Anandtech’s Kaveri review, one can see that in pure-CPU tasks, the A10 7850 takes a slight lead over the A10 5800K, but it is very modest.

    The bottom line is that, as far as I can tell, the A10 7850 is just modestly faster in purely CPU tasks than the five-year-older X4 955. That’s unfortunate. An X4 980 is probably as fast or faster than the A10 7850 on pure-CPU tasks.

    Having said that, I’m hopeful for HSA. Also, I hope that there is a post-Bulldozer era coming and that whatever CPU core AMD is developing for that will be a winner.

      • anotherengineer
      • 6 years ago

      I am also still running a 955. I wonder where/how Deneb and Thuban would be now if they had continually refined them, brought them to 28nm, upped the IMC to 2133MHz, etc.

      Maybe we would have had a 95W 8-core Thuban at 3.8GHz with 1.25× the IPC of the old one for under 200 bucks?

      And it would probably perform better than Steamroller, CPU-wise.

        • flip-mode
        • 6 years ago

        Getting off topic, does your 955 overclock well? Mine sucks. I can get 400 MHz out of it easily, but it falls apart fast after that, with a 600 MHz overclock being about the most mine will do even with generous voltage. In percentage terms it is the worst overclocker I’ve ever had. Black Edition my arse.

          • anotherengineer
          • 6 years ago

          To be honest I have not ever tried to OC it; I have had no reason to. It under-volts quite well though. I disable the low-power C-states in the BIOS and set Win7 to performance mode so CnQ is overridden, but I don’t disable it in the BIOS.

          Runs stock 3.2GHz all the time, but at 1.250V, I have had it to 1.225V, but 1 core gave me a prime95 error after 8+ hours.

          I think mine is C3 stepping, which is apparently better for OC than the C2-stepping models. I upgraded my mobo when I got my SSD so I could have SATA 3, so I am also running DDR3; might make a difference also?

          I have seen a lot of people getting 3.6GHz with this chip at 1.325V (I guess that would be the 400MHz bump for you). It does seem to be about the sweet spot for these chips.

        • OneArmedScissor
        • 6 years ago

        Llano was 32nm K10, but had serious issues. They tried, but it didn’t work out.

        Faster L3 / IMC / HT would have helped FX, too.

        And with all that to deal with, TSMC cancelled the manufacturing process for 28nm Bobcat and delayed 28nm GPUs.

        So it’s not as simple as some CPU cores being inferior. AMD has been mired in stopgap releases for 2 years.

        This is all very similar to the original Phenom’s TLB bug and the 65nm Athlon 64 royally sucking.

        AMD has been stretched thin since acquiring ATI, but hopefully they are more streamlined from here on out.

          • anotherengineer
          • 6 years ago

          Ya it seems like the fab process is their greatest challenge in regards to overall chip power/performance. At least the 45nm Phenom II worked well.

      • Meadows
      • 6 years ago

      Do remember that while performance only increased modestly, so did the typical consumer workloads, and power consumption decreased greatly since the Phenoms were in vogue. That last bit matters more these days than absolute performance, so I’d imagine that’s where AMD want to develop the most. Sadly they still have a ways to go.

      • Lazier_Said
      • 6 years ago

      If that’s not depressing enough, keep in mind that the X2/3/4 of 2009 was where AMD finally caught up with Intel’s Core from 2006.

    • ronch
    • 6 years ago

    Typical scenario when a friend asks for your help when buying a budget gaming PC:

    Friend: Hey bro, I want to buy a new computer but I don’t know anything about computers. Help!

    You: Sure. Let’s go out and get one of those new-fangled APUs.

    Friend: What’s an APU?

    You explain.

    Friend: Cool. Let’s go!

    So you and your friend go out and buy the parts and put them all together. He brings the parts home and eagerly installs his favorite games, which HAPPEN TO BE the games that Geoff gets crashes with.

    Friend calls you: Hey dude, is there something wrong with my computer? My games are crashing all over the place. This computer is NEW!

    You: Oh yeah, man, I forgot to tell you. Those new APUs still have some kinks to be ironed out. You’ll need to wait for AMD to release new drivers which will hopefully fix those crashes.

    Friend: How long is that?

    You: Oh, I dunno. Maybe two or three months.

    And so, you just lost a friend.

      • Deanjo
      • 6 years ago

      What do you mean that I have to buy a video card now if I want to play games? Why didn’t I just get an FX then? Wouldn’t that have given me better performance? Now I’m stuck with a processor with mediocre performance as well? 3 months for good drivers? Are you sure? Didn’t they have like 3 or more years to get that right while the chip was being developed? Or is this another “The current version of windows don’t take advantage or give the best performance for this chip, wait until the next version!”

      YOU FRIGGEN JERK!!!

    • Deanjo
    • 6 years ago

    Obviously Geoff was doing something wrong. I mean, every time someone mentions nvidia having better drivers, every ATI/AMD fanboi comes out of the woodwork and proclaims that AMD/ATI hasn’t had driver issues in years and is every bit as robust as anyone else.

    • Gadgety
    • 6 years ago

    Great. Now it would be good to see a comparison run-through of the same games on the A10-7850K. Will the same games still crash?

    • kilkennycat
    • 6 years ago

    Geoff: Maybe you should run Furmark on the A8-7600?

    That might check whether the crashes were due to overheating of some of the GPU elements of the chip, which might not show up in an overall chip temp measurement. Furmark should just throttle and not crash if the GPU is not the guilty party.

      • Hippo
      • 6 years ago

      I think it may be the VRMs on the motherboard that are overheating.

        • kilkennycat
        • 6 years ago

        Some Arctic Freeze occasionally squirted on the VRM heatsinks while checking out Furmark should prove or disprove that assertion.

    • yokem55
    • 6 years ago

    What about games like Skyrim or Kerbal? You know, games that can use a good CPU?

    Edit: Missing indefinite article.

      • Stickmansam
      • 6 years ago

      Skyrim isn’t that CPU heavy

      http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-15.html
      http://images.hardwarecanucks.com/image//skymtl/CPU/KAVERI-APU/KAVERI-APU-67.jpg

    • LostCat
    • 6 years ago

    Was TR (and everyone else) asked politely to wait for Mantle on dedicated card benches? Kind of surprised no one’s asked yet.

      • chuckula
      • 6 years ago

      The test rig that TR received for the A8-7600 was a small-form-factor case that couldn’t even accommodate a mid- or high-end discrete GPU, so that makes it hard to test.

      Scott mentioned that he received an A10-7850K at CES (just the chip, not a whole system). It’s apparently under test at Damage labs with a standard FM2+ motherboard that can take the full range of standard videocards. The followup review will include far more information about discrete GPU performance and might include Mantle numbers depending on the timing (otherwise Mantle will get its own article).

        • LostCat
        • 6 years ago

        Awesome, thanks.

    • ronch
    • 6 years ago

    The fact that many games were crashing all over the place means it’s not a problem with the games. Either the drivers are broken or the chip is. But given how long GCN has been around, how could the drivers be to blame? All AMD did was put those GCN cores next to those Steamroller cores.

    If those Steamroller cores, or how Kaveri was put together, are the culprit, it could say a lot about AMD’s engineering prowess. But I guess even if the chip’s broken, we’ll never find out. More likely they’ll just patch things up rather than admit their chip is broken and risk tarnishing AMD’s image even further.

      • HisDivineOrder
      • 6 years ago

      I imagine there are differences between how the GCN cores are tied to the CPU in an APU and how they are tied in a system with discrete graphics. I’d imagine those differences might impact how stable the drivers are.

      That said, remember Phenom. It began the modern era of Intel dominance over AMD.

        • ronch
        • 6 years ago

        So they released Kaveri with their Catalyst drivers not fully supporting it yet? So anybody who buys a Kaveri chip today will have their games crashing all over the place until AMD rolls out an updated driver? Shouldn’t the Kaveri platform be acceptably stable upon release?

          • LostCat
          • 6 years ago

          Early adopter pain? UNACCEPTABLE! 😉

            • ronch
            • 6 years ago

            Ok, you buy a computer with one of these chips, eagerly install these games, and lots of them crash. Is that acceptable? You gotta be a real die-hard AMD fanboi to think that’s acceptable.

            Yes, I am aware that anything new MAY have some kinks that should be ironed out in time, but this is a little too sloppy, don’t you think? And sarcasm doesn’t make it the least bit funny. If you think this is ok, why don’t you go out right now and spend a grand on a Kaveri-based PC?

            • Deanjo
            • 6 years ago

            What I would like to see is what happens when a dedicated GPU is added to the system. Do the issues continue or do they disappear?

            • travbrad
            • 6 years ago

            I would guess the problems are with the integrated GPU or GPU drivers. If it was some sort of problem with the CPU cores they would probably be seeing crashes in other applications. Testing with a discrete GPU would be interesting though, just to confirm that’s where the problem lies.

            Of course there is really no reason to buy these APUs if you are going to use a discrete GPU though.

            • LostCat
            • 6 years ago

            Faulty RAM, they already updated the article.

            • chuckula
            • 6 years ago

            AMD isn’t entirely off the hook since running AMD-branded RAM supplied directly by AMD at the officially supported clock frequency that AMD advertised in a system completely built by AMD produced the problems…

            • LostCat
            • 6 years ago

            *shrug* Never said they were. I need my damn processor o.o

            • LostCat
            • 6 years ago

            I already have one built, though I don’t have the processor yet. My 6870 and my OSes also had early adopter pain. I imagine most hardware does.

            (That’s not even mentioning new games. I don’t even know why I bother buying them new.)

          • sschaem
          • 6 years ago

          I have yet to read another review saying that their Kaveri system crashed during gaming.

          Crashing is totally unacceptable and insane.

          Does AMD have nothing to say about TR’s game review?
          Did TR follow up with AMD?

          This should be a huge story: “AMD’s new Kaveri APU can’t play AMD-optimized games without crashing”

            • puppetworx
            • 6 years ago

            “I have yet to read another review saying that their Kaveri system crashed during gaming.”

            We don’t know what caused the crashes; it could have been the APU or driver, it could equally have been the RAM, motherboard, PSU, or heatsink fan. The blog post doesn’t indicate that any of these aspects were scrutinized; rather, many people have presumed that they were, based on the source. Unfortunately this blog post seems to have become a bit of a storm in a teacup amongst the TR community, when this appears to be an isolated incident.

            • DaveBaumann
            • 6 years ago

            Check out the update – faulty DIMM…

            • Nullvoid
            • 6 years ago

            It’s a shame the update was only placed at the start of the article; the main body of text wasn’t changed at all, so it still paints a pretty bad picture of the Kaveri-based system’s reliability.

    • ronch
    • 6 years ago

    Sorry guys, as a long-time AMD fan I must say I’m really disappointed with these APUs. All AMD did was offer Core i3 (aggregate) performance along with a low-end discrete GPU in one piece of silicon. What for?

    Oh, HSA? Yeah. Let’s wait for developers and everyone else in the industry to make AMD’s HSA and OpenCL dreams come true.

      • ermo
      • 6 years ago

      [quote<]"All AMD did was offer Core i3 (aggregate) performance along with a low-end discrete GPU in one piece of silicon. What for?"[/quote<] One word: Laptops. EDIT: From the [url=http://www.kitguru.net/site-news/interviews/jules/exclusive-kaveri-interview-with-nicolas-thibieroz-of-amd/2/<]kitguru article[/url<] from the thursday shortbread (awful article, by the way): [quote<](Nicolas Thibieroz, AMD DevRel guru): "While performance in PC Mark 8 v2 is 8% better when comparing the best Kaveri chips to the best Richland chips from the last generation, the improvement in the sub-45w area is much bigger – as much as 15% faster" "When you measure system scores in 3DMark Fire Strike, you will see over 33% better performance per watt in the mid and high-end chips, but the low end increases by as much as 75%" [/quote<]

        • cynan
        • 6 years ago

        As for that second quote, for a gaming laptop, I’d take a nice discrete mobile GeForce or Radeon, together with a more power efficient Haswell with IGP that I can toggle when I’m actually doing things that make sense to run off of batteries (ie, work, watching videos, etc).

        Sure, you can play a lot of casual games running off of batteries, but playing those that require significant GPU grunt over Intel integrated graphics without being plugged in is a good way to kill your laptop battery quickly.

        That is to say, unless AMD manages to magically reduce power consumption by orders of magnitude while running all out in 3D, I don’t see how these APUs are more compelling for a laptop than discrete mobile graphics configurations that come with more power-efficient CPUs (ie, Intel), other than that they may cost less and perhaps fit into form factors a hair’s breadth thinner.

        I suppose Kaveri is interesting for the budget gaming laptop market (if there is such a thing) or if squeezing the lowest power Kaveri into a larger tablet…

      • Anonymous Coward
      • 6 years ago

      “All AMD did was offer Core i3 (aggregate) performance along with a low-end discrete GPU in one piece of silicon. What for?”

      Better than going out of business.

      • sschaem
      • 6 years ago

      At the risk of repeating myself:

      45W, TrueAudio DSPs, Mantle support with the latest GCN cores (this includes all the tech goodies found in the next-gen consoles), and unified memory/HSA for future compute optimizations.

      Chicken and egg. At some stage, someone needs to take a step forward for things to happen.

      The 95W part seems to be a casualty of GF incompetence… but sub-45W seems decent.

        • LostCat
        • 6 years ago

        I’ll take 95 with GPU obviously. My current part is 140 without.

    • 6GTX9
    • 6 years ago

    This is certainly not enough to placate actual gamers or enough reason not to follow up the FX Vishera chips. Kaveri seems to be nothing more than a novelty at the moment, one that AMD can’t afford to hinge their hopes on…

      • swaaye
      • 6 years ago

      It’s just something new to sell in the same “entry level” segment that Llano, Trinity and Richland sold in. If they could have somehow greatly boosted the memory bandwidth it would look a lot better, but that would add to platform costs and may not have made any sense I guess.

      AMD’s long term goals are a bit of a mystery to me too. It looks like they’ve given up trying to compete with Intel at the high-end directly though. They just don’t have the resources. Intel would have to have a giant mis-step and I am not sure how that could happen with the way things look right now.

      • HisDivineOrder
      • 6 years ago

      Yet they do pin their hopes on it. It’s ridiculous on the face of it. Intel has proven that AMD’s bet on more powerful GPUs paired with less powerful CPUs is the complete opposite of what the industry wants.

      If one needs great GPU performance, they buy a discrete card. If they want reasonable GPU performance they need great CPU performance with great performance per watt, too, and go Intel.

      The niche for poor performance per watt, poor CPU performance, and vaguely better GPU performance than the competition at a higher price is just so small, I don’t see the product having much success.

      You’d think AMD would recognize their version of Netburst (i.e., Bulldozer, Piledriver, Steamroller) and do what Intel did: go back a few generations to the last truly great architecture and rebuild from there. They’ve gone off the rails atm.

      They’re going to wind up screwing us all if they don’t get back on track, because without AMD to compete, Intel and nVidia are going to stop doing much of anything except minor refreshes and price increases (i.e., what they’re mostly doing right now).

        • cynan
        • 6 years ago

        Regardless of what happens to AMD’s CPU business, or AMD as a whole, I find it highly unlikely that AMD’s GPU division would simply be dissolved or parted out and sold. Nvidia’s main competition isn’t going anywhere.

    • puppetworx
    • 6 years ago

    Were you using the driver that came with the review unit? For AnandTech (http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/7) that was a beta driver, so crashes wouldn’t be unprecedented.

    anandtech.com: “...for today’s review we were sent their 13.300 beta 14 and RC2 builds (which at this time have yet to be assigned an official Catalyst version number)”

      • Jigar
      • 6 years ago

      I am surprised TR didn’t mention in this article that, while crashes are happening, they are also using a beta driver from AMD, and hence the official Catalyst release is expected to improve stability (somewhat).

      EDIT: The review article’s table does mention AMD Catalyst 13.30 RC2, but this article doesn’t. A reader who is unaware will get a really bad picture.

    • the Lionheart
    • 6 years ago

    I don’t think Kaveri was meant to be just a CPU+GPU on one chip.
    I see it as a coalition of separate dedicated processors that’s meant to maximize performance by using the right processor for the right workload.

    If I were to buy a Kaveri chip now, it would be on the premise of seeing apps that utilize the HSA capabilities of Kaveri rather than playing games on the Radeon IGP.

    I don’t game much anymore, mostly because the purpose video games are made for, which is providing virtual realities in which one can engage and immerse oneself, is no longer met. The crappy mechanics and AI in games, and to a great extent the visuals, make modern video games a boring activity for a long-time gamer like myself.

    With that said, Kaveri and Kaveri-like vector+scalar combo processors with those nice HSA features are the right hardware to handle next-gen physics and AI in games. This is precisely why Kaveri is important for the gaming industry to evolve.

      • Voldenuit
      • 6 years ago

      “If I were to buy a Kaveri chip now, it would be on the premise of seeing apps that utilize the HSA capabilities of Kaveri rather than playing games on the Radeon IGP.”

      Except full HSA support won’t arrive until 2015 at the earliest. So it doesn’t make much sense to buy now for that.

        • the Lionheart
        • 6 years ago

        Hardware wise, I think HSA is complete. Shared virtual memory and the ability for CPU cores and GPU “cores” to share execution of processes/threads without context switching and going through a recompilation is HSA.

        AMD might try to make CPU microcode runnable on their GCN CUs or cores, which is doable, but I’m not sure if they’re planning on doing that.
        Moving the execution of higher-dimensional (4+) vector instructions (integer and floating-point) to GPU cores is a possibility that we might see on the HSA side, and this would do away with the packet-based division of CPU labor and GPU labor, or what AMD calls hQ.

        With hQ, an app basically splits work up into CPU- or GPU-specific packets and puts them into CPU/GPU-specific memory locations, or “task queues,” in the process’s memory footprint, where those packets are connected through memory pointers which dictate the execution flow of the whole process. Something like the toy sketch below.
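
        To make that concrete, here is a toy Python model of the packet/queue idea. This is not AMD’s actual hQ/HSA interface (real HSA work is expressed as AQL packets placed in user-mode queues), and every name below is invented purely for illustration:

```python
# Toy model of packets steering work to CPU or GPU queues. Invented
# names throughout; real HSA uses AQL packets in user-mode queues.
from collections import deque

class Packet:
    def __init__(self, target, work, next_packet=None):
        self.target = target        # "cpu" or "gpu"
        self.work = work            # callable standing in for a kernel
        self.next = next_packet     # pointer dictating execution flow

cpu_queue, gpu_queue = deque(), deque()

def submit(packet):
    """Walk the packet chain, steering each packet to its queue."""
    while packet is not None:
        (cpu_queue if packet.target == "cpu" else gpu_queue).append(packet)
        packet = packet.next

# A chain: prepare on the CPU, crunch on the GPU, finish on the CPU.
submit(Packet("cpu", lambda: "prepare data",
       Packet("gpu", lambda: "run vector kernel",
       Packet("cpu", lambda: "consume results"))))

for name, queue in (("CPU", cpu_queue), ("GPU", gpu_queue)):
    print(name, "->", [p.work() for p in queue])
```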

        Overall, the future of computing seems to be shaping up on the basis of dedicated processing, which is what HSA is all about.

        Software wise, I have to disagree with you. Video games can certainly start taking advantage of the GPU part of Kaveri to accelerate all kinds of processing. With Mantle around, this is gonna happen soon, AFAICT.
        Other software solutions will start to support HSA if AMD shows enough enthusiasm in the market.

        It’s a matter of time before HSA takes off.

      • ronch
      • 6 years ago

      “If I were to buy a Kaveri chip now, it would be on the premise of seeing apps that utilize the HSA capabilities of Kaveri...”

      How long are you willing to wait after buying your Kaveri chip right now for $190 for AMD’s HSA and OpenCL plans to come true? And by ‘true’, I mean not just a few sporadic apps here and there. I’m talking about industry-wide adoption. I don’t see Intel getting too worried. Waiting for apps to use 8 cores is probably more realistic than actually expecting everyone in the industry to use AMD’s GCN cores for everything from running F@H to the Calculator app.

      “... rather than playing games on the Radeon IGP.”

      It’s no secret (and let’s not deny it) that AMD is targeting gamers with their APUs. I remember Tigerdirect.com displaying a big AMD Kaveri banner ad with Battlefield 4 all over it. And also, if they’re not targeting gamers out there (which is really a stupid assumption), why does everyone (including AMD, I reckon) keep comparing their APUs’ gaming performance with Intel IGPs? And oh yeah, remember, AMD says you can CROSSFIRE these APUs with SOME of their GPUs. What do you think people would use CF for? Don’t kid yourself, Spiggy.

      I love AMD but it seems to me like they’re pushing HSA and OpenCL and all that just to justify their ATI purchase. Perhaps, if they REALLY can’t compete in the CPU space anymore, they should just become a GPU company entirely, just like Nvidia was before they dabbled in SoCs.

    • christos_thski
    • 6 years ago

    AMD has been singing the praises of APUs ever since the ATI acquisition, 8 full years ago, and the best they’ve managed is this? Color me unimpressed. I was actually hoping for an integrated GPU with performance similar to those inside the PS4 and Xbox One, at Kaveri’s prices (especially those of the higher end parts).

      • stmok
      • 6 years ago

      Well, the PS4 and Xbox One are (primarily) gaming platforms, so it’s reasonable that the engineering would focus on IGP and GPU performance. It’s a specific, fixed platform with a defined consumer lifespan. (7 years or so?)

      Whereas Kaveri is for general PC computing markets, ie: a more powerful CPU, but a less powerful IGP, in order to fit into certain TDP specs.

      The only way to improve IGP performance is to get a compatible discrete Radeon GPU for Crossfire mode… and if you’re going that route, you might as well save up a bit and not bother with APUs. Stick to a fast-performing CPU with a discrete GPU for your needs.

      What you’re highlighting is that there is no free lunch and compromises have to be made based on the intended market or usage. (Which applies to all engineering fields.)

      To me, the APU idea seems well-suited for specific markets like HTPCs, notebooks, etc. Systems where you don’t upgrade. You just use them until they don’t fit your needs and you get rid of them. (Sell them off on Ebay, give them to poor or recycle/re-purpose, etc.)

      …Personally, I would rather AMD just dump the Bulldozer-based architecture (CPU-side) completely and bring up something new that gives Intel a run for their money. In reality, we won’t see a brand new architecture until 2016-ish. (Post-Excavator with DDR4 being the norm). Seriously AMD, go back to pure cores!

      When you combine a new, high-performing CPU architecture with GPU technology; then you’ll have enthusiasts like yourself get excited about PC technology again. ie: “I want that in my box!”

      So I agree in that I’m just as unimpressed as you are with the current situation in the PC hardware market. (PC gaming market seems largely unaffected from the people I’ve talked to in the industry.)

        • ahmedabdo
        • 6 years ago

        Well, I see a market for these chips. I live in a poor country (Iraq). People rarely have the financial power to buy things like discrete GPUs in their systems. An APU like this one would be more than welcome in the market. The gaming performance (as I can observe from the screenshots) is quite acceptable by the standards here. A very low-priced PC can be built with decent gaming performance.
        However, the AMD market is very limited, if not rare. Actually, a lot of people never knew there are CPUs being made besides Intel’s!

      • tipoo
      • 6 years ago

      They need GDDR5 versions of Kaveri to come out (which I think are coming out) before they come close to PS4-level performance. The bottleneck is the bandwidth right now, and they have no eDRAM for an in-between solution. They could put 100 compute units on it and it probably wouldn’t advance performance much without more bandwidth.

        • chuckula
        • 6 years ago

        “They need GDDR5 versions of Kaveri to come out”

        That came out LONG ago… it’s called the HD 7750, only they had a higher clock on the GCN cores and they dropped the underperforming CPU cores.

          • anotherengineer
          • 6 years ago

          Dropped the cpu? That would make it a GPU and not part of the Kaveri family then. Correct?

            • chuckula
            • 6 years ago

            This clip is for you: http://www.youtube.com/watch?v=vdHBsWXaHN8

            • HisDivineOrder
            • 6 years ago

            He’s trying to say that product already exists and you don’t need to have a CPU attached to get it because it’s not all that and a bag of chips.

          • tipoo
          • 6 years ago

          Yeah, but I was responding to why there wasn’t a PS4-level Kaveri chip yet. And this HSA stuff may mean nothing right now, but it will continue to mean nothing if they don’t keep iterating these chips.

    • Voldenuit
    • 6 years ago

    Can we get split-screen comparisons to HD 4x00 and HD 5200 at comparable settings/framerates?

    E.g., Kaveri on high vs., say, HD 4600 on medium to provide the same framerates.

      • chuckula
      • 6 years ago

      Prediction: The one on the left wins. It’s science!

        • Meadows
        • 6 years ago

        Prediction: one of them will crash.

          • shank15217
          • 6 years ago

            and the other won’t even launch the game. Please stop making excuses for Intel’s crack GPU drivers.

            • derFunkenstein
            • 6 years ago

            whoa, did you pull a muscle making such a leap?

    • Chrispy_
    • 6 years ago

    I took one thing away from that, Geoff:

    Kaveri drivers aren’t ready yet – crash to desktop seems to be the default result!

      • DPete27
      • 6 years ago

      Yeah, I’m significantly less excited about going out and buying one right now based on this article. Would it be too much to have a revisit once the next set of APU drivers comes out? (Edit: assuming they fix the crashing issues)

      Hopefully you told AMD of your crash findings.

        • HisDivineOrder
        • 6 years ago

        AMD released the product too early and you want him to redo all his work because they couldn’t get their act together by launch? You realize people are buying these things right now, yes? That this is the Day 1 experience…

        Kinda reminds me of the 7970 launch, actually. AMD really needs to learn how NOT to rely on our largesse to succeed in the marketplace.

          • MadManOriginal
          • 6 years ago

          Yeah, so this reflects the current experience; in 3 months or whatever, it might be totally different, which would warrant a second look at that time. It’s not too much to ask, especially in this ‘subjective’ style where he basically just plays around in the games for a few hours.

    • Hattig
    • 6 years ago

    The crashes are worrying. Have you tried different memory sticks?

      • Scrotos
      • 6 years ago

      Looks like this was a good call.

        • Anonymous Coward
        • 6 years ago

        It’s a little unfortunate that crashing is such a significant component of the review when it was discovered to be the memory.

          • superjawes
          • 6 years ago

          Well this isn’t technically a review. This is part of Geoff’s blog, and is advertised as “a subjective look.” Still, it’s nice that he put the update at the beginning of the article so that it can be read before discussing the crashes.

    • chuckula
    • 6 years ago

    Subjective question about those crashes:

    Were they:
    1. Less/More frequent than you experienced with the A10-6800K just after it launched?
    2. Less/More frequent than you normally experience with regular AMD discrete GPUs?
    3. Affected in any way (that you can perceive) by driver updates?

    [Edit: As noted at the top of the article, faulty RAM was to blame… but it was AMD-branded RAM (supplied by AMD too)]

    • gerryg
    • 6 years ago

    I’m assuming the new A-10 will finish making the journey into mid-level gaming reasonableness with a 5-10% boost in frame rates. I still want to know what the Dual Graphics story will be like.

    Fingers crossed on the driver updates fixing crash issues.

      • ALiLPinkMonster
      • 6 years ago

      This. Let’s see a test of Crossfire between Kaveri’s graphics core and, say, a 7750. Might make a decent upgrade option for someone who wants to build now but make sure they can play future games decently.

        • chuckula
        • 6 years ago

        “Let’s see a test of crossfire between Kaveri’s graphics core and say a 7750.”

        Won’t happen. But if you get the right model of GPU, it’s possible: http://wccftech.com/amd-kaveri-dual-graphics-works-ddr3-memory-based-radeon-r7-gpus/

          • ALiLPinkMonster
          • 6 years ago

          Oh yeah, I forgot about the inherent limitations.

            • chuckula
            • 6 years ago

            Those dual-graphics setups will probably be pretty popular for OEMs who want to have base models & higher-end graphics models for Kaveri systems.

            • HisDivineOrder
            • 6 years ago

            I suspect they won’t be popular because AMD’ll have to fix the driver situation before any of them go within 100 yards of it. I don’t think the driver issues will ever be fixed, because I think the configuration requires too much cooperation between parts of the chip that connect via an internal controller and parts that connect via PCIe to make them do well.

            I think those OEM’s will retreat to the same old configuration they’ve always done, which is putting a higher end GPU if they want more GPU performance and skipping the APU’s take on GPU entirely if they upgrade the chip. I suspect it winds up not much more expensive to them in terms of buying actual hardware and INCREDIBLY less expensive when NOT having to deal with the software/driver support issues related to APU’s crossfired to discrete cards.

            I think APU’s crossfired to discrete cards will remain a niche (APU CF to discrete) of a niche (gaming PC’s) of a niche (PC’s) and you’re going to see no uptake and no interest on AMD’s part to really push the matter forward. It’s more in AMD’s interests to sell higher end GPU’s anyway.

      • derFunkenstein
      • 6 years ago

      Tom’s appeared to have a reasonable take on dual graphics, and it was mostly positive other than some jitteriness.

    • Kaleid
    • 6 years ago

    Gotta wonder how the performance would be if there was far greater memory bandwidth. Consoles have 128-bit DDR3 and GDDR5, so why can’t PCs have that?

      • ALiLPinkMonster
      • 6 years ago

      In the first review, they didn’t see significant improvement from DDR3 1600 to 1866 and they couldn’t get much higher than that without running into boot problems. A lot of these problems do seem like driver/software/BIOS issues, since there’s little reason why it shouldn’t be able to handle 2133 or higher memory speeds. I’d very much like to see how it performs with super fast memory.

      On a somewhat related note, I’d love to see what it can do with a Mantle title. If it’s all it’s cracked up to be, we should see BF4 become quite a bit more playable on this little engine that can once the Mantle version is released (is it yet? I don’t know).

        • Kaleid
        • 6 years ago

        Well, going from 1600 to 1866 doesn’t significantly improve the memory bandwidth, but say you go from 64-bit to 128-bit, and I bet you’d see significant improvements as the memory bandwidth doubles. If there were no improvement, then MS would have kept 64-bit for the Xbone instead of going with 128-bit.

        No, Mantle is not yet released, but I think it will come out this month. I haven’t played BF4 myself yet because I’m a bit interested in comparing the two; it’s a bit like the early days of Glide and D3D.

          • alientorni
          • 6 years ago

          If you are using dual-channel DDR3, then it’s 64 bits × 2 = 128 bits. The problem is that AMD’s memory controller is quite bad; you can see in any review that it is hugely outperformed by any Intel counterpart.
          It has improved in this gen, and some of the improvement in dedicated-graphics performance has to do with this, plus 2133 support.

      • the
      • 6 years ago

      The Xbox One and the PS4 have 256-bit-wide memory interfaces.

      And according to Anandtech (http://anandtech.com/show/7702/amd-kaveri-docs-reference-quadchannel-memory-interface-gddr5-option), Kaveri may have both a 256-bit and a GDDR5 memory controller on-die. Obviously, with FM2+ socket compatibility it cannot take advantage of either, but it could be an option for BGA-packaged chips.
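
      To put rough numbers on this subthread: peak bandwidth is just bus width times transfer rate. A minimal sketch using published specs, not measured figures:

```python
# Peak memory bandwidth = bus width in bytes x transfers per second.
# All figures below are published specs, not measurements.
def bandwidth_gbs(bus_bits, mega_transfers):
    return bus_bits / 8 * mega_transfers * 1e6 / 1e9

configs = {
    "Kaveri, dual-channel DDR3-2133 (128-bit)": (128, 2133),
    "Xbox One, DDR3-2133 (256-bit)":            (256, 2133),
    "PS4, GDDR5 at 5500 MT/s (256-bit)":        (256, 5500),
}
for name, (bits, mts) in configs.items():
    print(f"{name}: {bandwidth_gbs(bits, mts):.1f} GB/s")
# Kaveri: 34.1 GB/s, Xbox One: 68.3 GB/s (plus eSRAM), PS4: 176.0 GB/s
```

      That roughly 5x gap between Kaveri’s dual-channel DDR3 and the PS4’s GDDR5 is the bandwidth wall being discussed in this thread.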

        • Kaleid
        • 6 years ago

        I stand corrected; somehow I remembered 128-bit for the Xbone.

          • Narishma
          • 6 years ago

          You’re probably thinking of the Xbox 360, which uses GDDR3 on a 128-bit interface.

    • hendric
    • 6 years ago

    Any chance you could do some MMO testing? I loved seeing Guild Wars 2 listed in some of the previous reviews, and would like to know how well this runs it.

      • ALiLPinkMonster
      • 6 years ago

      RTS too. That would be a great test of the new Steamroller cores.

    • JosiahBradley
    • 6 years ago

    Not bad for having 128 SPs disabled from the full chip. Maybe my next netbook will handle games better. And since I run Linux, I might see HSA enabled apps before they go mainstream.
