AMD’s A8-3800 Fusion APU

When they first hit the scene this summer, we thought AMD’s new A-series APUs—more commonly referred to in certain circles by their code-name, Llano—were a nice fit for laptops. Llano silicon combines four relatively low-speed CPU cores with an integrated Radeon graphics processor, and AMD focused lots of attention on making sure the chip only sips power at idle, to prolong battery life. Intel’s competing Core i3 processors have two fast cores, and Llano’s quad cores give it nearly comparable CPU performance, while its Radeon IGP runs circles around Intel’s rather anemic graphics. In all, the mobile A-series APUs are an attractive alternative to the latest from the Intel juggernaut—no minor achievement these days.

When AMD attempted to migrate Llano’s competitive formula onto the desktop, though, the road got bumpy. The CPU cores needed a big clock frequency boost in order to compete with Intel’s desktop Core i3 processors, and to get there, AMD had to raise the chip’s operating voltage substantially. As a result, the first desktop Llano, the A8-3850 APU, had an outsized power envelope of 100W—5W higher than the Phenom II X4 840 it ostensibly replaced (which was made on an old 45-nm fabrication process) and worlds apart from the competition, the 65W Core i3-2100. Worse, the A8-3850’s four CPU cores were still notably slower than the Core i3’s dual cores, especially in single-threaded applications.

Llano retained its advantage in graphics horsepower over the Intel IGP, but we had trouble seeing the value proposition of a 100W chip whose primary attraction was somewhat nicer integrated graphics than the 65W competition—especially when a cheap video card would provide better graphics than any IGP. In short, Llano wasn’t terribly attractive when it moved too far from its original mission as a low-power solution for highly integrated systems like laptops. Fortunately, AMD also promised to release 65W versions of Llano for the desktop eventually, and we looked forward to those as a potentially smarter choice with more of a natural fit in some parts of the market.

The A8-3800 APU

Although they are currently in short supply due to manufacturing problems, the 65W desktop variants of Llano have intermittently popped up in stock at various online retailers in recent weeks. They are out there, even if they’re a little scarce right now—and fortunately, we’ve gotten our hands on one of ’em, the A8-3800 APU.

The A8-3800 differs from the A8-3850 we’ve already reviewed in just a few respects, the most obvious being its 65W thermal design power (TDP) rating. Also unlike the A8-3850, the A8-3800 makes use of AMD’s Turbo Core dynamic clock frequency tech, which allows the chip to range up to higher clock speeds temporarily when there’s thermal headroom available—that is, when not all of the CPU cores are heavily burdened at once. In this case, the A8-3800 can stray from its 2.4GHz base clock up to 2.7GHz. Those frequencies leave the A8-3800 a bit behind the A8-3850, whose four cores regularly run at 2.9GHz. Happily, though, AMD didn’t have to compromise on the 3800’s graphics in order to fit into the smaller power envelope. The two products share the same Radeon HD 6550D IGP with 400 shader ALUs at 600MHz.

A look at the underside of the A8-3800 and the Socket FM1 mechanism that holds it

The A8-3800’s list price is $129, ten bucks less than the A8-3850’s. That means its primary competition from Intel is similar: the Core i3-2100 at $117, or perhaps more appropriately, the Core i3-2105 at $134. The difference between those two Core i3 models is simple: the i3-2105 has full-on Intel HD 3000 graphics, not the chopped-in-half HD 2000 variant in the i3-2100. Their CPU performance should be the same. Thus, we’ve tested both chips, but we’ve confined the Core i3-2105 to our integrated graphics tests alone.

As if the competition weren’t formidable enough, Intel has recently released a few new models, some at overlapping prices. For instance, the Core i3-2125 lists for $134, runs at 3.3GHz (200MHz faster than the i3-2105), and also has an HD 3000 IGP. The Core i3-2130 runs at 3.4GHz and sells for $138, but has HD 2000 graphics. We haven’t tested them yet, but the slight CPU clock speed boosts should make these newer models even tougher competition for the A8-3800. Still, they’re not likely to alter any key dynamics fundamentally, as you’ll understand once we get into the performance results.

Oh, and the results on the following pages were obtained with the same configurations detailed on this page of our A8-3850 review. We’ve just dropped the A8-3800 into our Llano test rig and added it to the mix.

Power consumption and efficiency

Since the A8-3800’s primary distinction is its 65W power envelope, we’ll begin with a look at power consumption. There’s a lot going on in the graphs below, I’ll admit. Most of the systems are configured as similarly as possible, with the same power supply, storage, memory, and graphics card. Only the motherboards and processors vary.

However, for Llano and friends, we also wanted to take a look at power draw with only integrated graphics, so the results parenthetically marked “IGP” don’t include a separate video card.

Also, a couple of the results, marked “Brick PSU”, involve truly low-power configurations with only integrated graphics and a more efficient laptop-style brick power supply. We pondered putting the Llano and Core i3-2100 systems on the brick PSU, as well, but their total power draw probably wouldn’t mix well with that power supply’s 80W peak rating, so we stuck with our standard PSU instead. Out of necessity and because it’s only fair, we did install the Core i3-2100 in a smaller microATX motherboard, the Intel DH67BL, for the IGP power tests.

We’ll start by looking at the raw power draw results, and then we’ll parse them in various ways.

The A8-3800 system’s power draw at idle is comparable to the Core i3-2100’s, overall. However, at peak, the Core i3-2100 rig draws substantially less power, even though the two chips ostensibly share the same max power (TDP) rating. The gap is largest without a discrete graphics card in the mix, where the A8-3800 setup pulls over 20W more than the Core i3-2100. That gap shrinks when we involve a discrete GeForce and shift the Core i3-2100 to a larger motherboard.

Still, the contrast between the A8-3850 and the A8-3800 is considerable. Our A8-3800-based system requires about 30 fewer watts, with or without a discrete GPU.

If we concentrate on the period when each CPU is processing the rendering task, we can get a nice measure of power-efficient performance. When we do that, the A8-3800 isn’t especially impressive—more efficient than the A8-3850, but less so than the Core i3-2100.
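For the curious, reproducing this sort of task-energy calculation at home is straightforward. The sketch below is not our actual test harness, just a minimal illustration: it assumes you’ve logged timestamped wall-socket power readings to a hypothetical power_log.csv file (one "seconds,watts" pair per line) and integrates over the window when the render was running.

```python
# Minimal illustration (not our test harness): estimate the energy a system
# consumes while a benchmark task runs, given timestamped power readings.
# Assumes a hypothetical "power_log.csv" with one "seconds,watts" pair per line.

import csv

def task_energy_joules(log_path, task_start, task_end):
    """Integrate power over the task window using the trapezoidal rule."""
    samples = []
    with open(log_path, newline="") as f:
        for row in csv.reader(f):
            t, w = float(row[0]), float(row[1])
            if task_start <= t <= task_end:
                samples.append((t, w))
    energy = 0.0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        energy += 0.5 * (w0 + w1) * (t1 - t0)  # joules = average watts * seconds
    return energy

if __name__ == "__main__":
    joules = task_energy_joules("power_log.csv", task_start=30.0, task_end=150.0)
    print(f"Render-task energy: {joules:.0f} J ({joules / 3600:.2f} Wh)")
```

Dividing the work accomplished (one completed render) by the energy consumed is what lets a slower-but-thriftier chip claim an efficiency win over a faster, hungrier one.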

Civilization V

The developers of Civ V have cooked up a number of interesting benchmarks, two of which we’ve used here. The first one tests a late-game scenario where the map is richly populated and there’s lots happening at once. As you can see by the settings screen below, we didn’t skimp on the image quality settings for graphics, either. Doing so wasn’t necessary to tease out clear differences between the CPUs. Civ V also runs the same tests without updating the screen, so we can eliminate any overhead or bottlenecks introduced by the video card and its driver software. We’ve reported those “no render” scores, as well.

The next test populates the screen with a large number of units and animates them all in parallel.

The A8-3800 solidly trails the Core i3-2100 in each of our Civ V tests, generally by fairly large margins. The good news, in my view, is that the A8-3800 isn’t that much slower than the A8-3850. For many purposes, the roughly 30W reduction in peak system power draw would make the move to an A8-3800 a positive tradeoff.

While we’re here, I should pause to point out an unlikely source of competition for the A8-3800 from AMD’s own stable of older processors. Notice that the Athlon II X3 455 performs very much like the A8-3800 in the tests above. The X3 455 is an older, triple-core product based on a 45-nm fabrication process, and it has a higher 95W TDP. Still, the X3 455 costs only $76 right now. You’d need to buy a discrete graphics card or a motherboard with integrated graphics to use with the X3 455, but you’d have plenty of money left over to do so. The X3 may offer similar CPU performance for less—something to watch as we look through the rest of the tests.

F1 2010

CodeMasters has done a nice job of building benchmarks into its recent games, and F1 2010 is no exception. We went to some lengths to fiddle with the game’s multithreaded CPU support in order to get it to make the most of each CPU type. That effort eventually involved grabbing a couple of updated config files posted on the CodeMasters forum, one from the developers and another from a user, to get an optimal threading map for the Phenom II X6. What you see below should be the best possible performance out of each processor.

Metro 2033

Metro 2033 also offers a nicely scriptable benchmark, and we took advantage by testing at four different combinations of resolution and visual quality.

The Core i3-2100 is nearly 50% faster than the A8-3800 in F1 2010. Fortunately, the gap isn’t so great in Metro 2033, and the A8-3800’s 57 FPS average is respectable. Still, we continue to like the tradeoff of taking the A8-3800 over the 3850. The performance difference between the two is fairly minimal.

Battlefield: Bad Company 2

The best thing we can say for the A8-3800 is that it will run this game competently. We’re less pleased with the Athlon II X3 455, whose 45 FPS average and 37 FPS minimum don’t inspire confidence.

Source engine particle simulation

Next up is a test we picked up during a visit to Valve Software, the developers of the Half-Life games. They had been working to incorporate support for multi-core processors into their Source game engine, and they cooked up some benchmarks to demonstrate the benefits of multithreading.

This test runs a particle simulation inside of the Source engine. Most games today use particle systems to create effects like smoke, steam, and fire, but the realism and interactivity of those effects are limited by the available computing horsepower. Valve’s particle system distributes the load across multiple CPU cores.
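This isn’t Valve’s code, of course, but the general idea is easy to illustrate. Purely as a toy sketch of our own: chop the particle array into chunks and hand each chunk to a separate worker, so every available core advances part of the simulation each timestep.

```python
# Toy illustration (not Source engine code) of spreading a particle update
# across CPU cores with a process pool. Particles are plain (x, y, vx, vy)
# tuples advanced one timestep under simple gravity.

from multiprocessing import Pool, cpu_count

DT = 1.0 / 60.0      # timestep, seconds
GRAVITY = -9.8       # downward acceleration

def step_chunk(chunk):
    out = []
    for x, y, vx, vy in chunk:
        vy += GRAVITY * DT
        out.append((x + vx * DT, y + vy * DT, vx, vy))
    return out

def step_all(particles, workers=None):
    workers = workers or cpu_count()
    size = (len(particles) + workers - 1) // workers
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with Pool(workers) as pool:
        return [p for chunk in pool.map(step_chunk, chunks) for p in chunk]

if __name__ == "__main__":
    particles = [(0.0, 10.0, 1.0, 0.0)] * 100_000
    print(step_all(particles)[0])
```

A real engine does far more per particle (collisions, interactions with the game world), which is exactly why extra cores and threads matter here.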

Although the Core i3-2100 has only two cores, each of those cores can track and execute two threads via a feature Intel calls Hyper-Threading. This test is highly multithreaded, and the Core i3-2100 handles it very well, outperforming even the fastest current AMD quad-core processor, the Phenom II X4 980. The A8-3800 trails substantially.

Productivity

Somewhat surprisingly, the A8-3800 becomes much more competitive as we move into productivity-type applications. That’s true in part because, with the exception of SunSpider, the rest of these tests are nicely multithreaded, so the A8’s quad cores are put to good use.

Video encoding

The drop from the A8-3850 to the A8-3800 moves Llano back in the standings only slightly, but it’s enough that the A8-3800 falls behind the Core i3-2100 in x264, interestingly enough.

3D modeling and rendering

Let’s pause here to consider the effects of Turbo Core on the A8. Cinebench is typically a litmus test of sorts for Turbo Core, since it includes both a single-threaded and a multi-threaded component—and since 3D rendering speed tends to scale quite nicely with additional cores and threads. As you know, the A8-3800 has Turbo Core, with a 300MHz potential boost on tap, while the A8-3850 does not.

Yet look at the results here for the two A8 processors. The speedup when going from one thread to four with both of them is almost exactly 4X, and with only one thread active, the 3800 is a fair amount slower than the 3850. Now, one would have expected Turbo Core to grant the A8-3800 a bit of an advantage in the single-threaded scenario, since that one core should be running at 2.7GHz. For the same reason, the A8-3800’s speedup with four threads should have been a little less than 4X. What’s the deal?
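To put numbers on that expectation, here’s the quick back-of-the-envelope math, assuming the quoted clocks hold for the duration of each test:

```python
# Back-of-the-envelope expectation, assuming the quoted clocks hold throughout:
# one core at the 2.7GHz Turbo peak for the single-threaded run, four cores at
# the 2.4GHz base clock for the multithreaded run, and rendering throughput
# that scales with clock speed times active cores.

base_ghz, turbo_ghz, cores = 2.4, 2.7, 4

single_thread_rate = turbo_ghz * 1      # one core, Turbo peak
four_thread_rate = base_ghz * cores     # four cores, base clock

expected_speedup = four_thread_rate / single_thread_rate
print(f"Expected 1-to-4-thread speedup with Turbo active: {expected_speedup:.2f}x")
# Prints about 3.56x, noticeably short of the ~4x scaling we actually measured.
```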

Worried that Turbo Core wasn’t working right, I fired up a copy of CPU-Z while the Cinebench single-threaded test ran, in order to see what was happening with the CPU clock frequency. Fortunately, my fears were unfounded; the A8-3800 was indeed running at 2.7GHz some of the time. The trouble was that it spent far more time at 2.4GHz, even though only one core was busy. Remember, Turbo Core is fundamentally conservative, because it doesn’t want to violate the chip’s overall TDP. To avoid such problems, Turbo Core “dithers” between the base and peak P-states as needed. In the case of the A8-3800, that means Llano doesn’t spend much time at its 2.7GHz peak frequency—at least not with this workload. Moreover, since Turbo Core behavior is programmed at the factory, all A8-3800 chips should behave the same. Bottom line: don’t expect to see your A8-3800 spending a lot of time at 2.7GHz, even in single-threaded applications.
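If you’d like to watch for the same behavior on your own system, you don’t need CPU-Z specifically; any tool that reports the current core clock will do. As a rough sketch of our own (not part of our test methodology), the snippet below uses the third-party psutil package to sample the reported frequency while a single-threaded load runs in another window:

```python
# Rough sketch: poll the CPU frequency the OS reports while a single-threaded
# workload runs elsewhere, to see how often the chip actually sits at its Turbo
# peak. Requires the third-party psutil package (pip install psutil); reporting
# granularity and accuracy depend on the operating system.

import time
import psutil

def sample_frequency(duration_s=30.0, interval_s=0.25):
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        freq = psutil.cpu_freq()      # reported current/min/max in MHz
        if freq is not None:
            samples.append(freq.current)
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    mhz = sample_frequency()
    print(f"{len(mhz)} samples: min {min(mhz):.0f} MHz, "
          f"max {max(mhz):.0f} MHz, mean {sum(mhz) / len(mhz):.0f} MHz")
```

A mean that hugs the base clock, with only occasional spikes toward the peak, is exactly the dithering behavior described above.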

Integrated graphics performance

Obviously, integrated graphics performance is where Llano carves out its place in the market, assuming you care about IGP gaming performance in a desktop processor. Neither the Core i3-2100 nor the i3-2105 can hold a candle to the A8-3800—and we haven’t even talked about Llano’s vast edge in image quality. Beyond that, there are few surprises here. As you can see, there’s virtually no drop-off when going from the A8-3850 to the A8-3800 in IGP gaming. Still, there’s vastly more performance to be had by pairing one of these processors with a $99 discrete graphics card like the Radeon HD 6670.

Conclusions

If we want to consider the A8-3800 solely as a desktop processor, we can summarize its value proposition with one of our famous scatter plots.

At $129, the A8-3800 offers only slightly higher performance than the Athlon II X3 455, which is a $76 chip. A host of other processors occupy the A8’s price range with substantially higher overall test scores, including the A8-3850, the Phenom II X4 840, and the Core i3-2100 and i3-2105. On the price-performance front, Llano’s desktop incarnations continue to seem rather unfortunate, especially for brand-new 32-nm silicon.

Among those desktop CPUs, though, only the Core i3 chips share the A8-3800’s 65W TDP rating. That’s important to note because a 65W processor can go places that a 95-100W CPU can’t—into smaller form factors with quieter cooling and more economical power supplies, for instance. Inside of such systems—some of them, at least—integrated graphics is likely to be the solution of choice. And in some cases, in such systems, the A8-3800’s vastly superior graphics may well be more highly prized than the Core i3’s superior CPU power.

All of which takes us back to where we left off with the A8-3850 several months ago, when we had trouble finding a sensible home for a 100W chip with a burly IGP. As a 65W part, the A8-3800 ought to make sense for a certain class of relatively cheap, compact computers. Those computers probably aren’t the sort that would be built by PC enthusiasts—even in a mini-ITX enclosure, we’d prefer a Core i3 and a cheap graphics card—but they exist in various forms, including the increasingly popular all-in-one systems that follow the iMac template. For fairly basic computing needs, Llano’s integration of four slower cores and AMD’s Radeon technology could end up providing a better user experience than Intel’s Sandy Bridge CPU-IGP hybrid. We suspect PC makers who adopt the A8-3800 won’t be paying anything close to AMD’s $129 list price, either.

With that said, we’d still like to see AMD lower prices on its retail boxed A-series APUs or, better yet, raise the performance bar while keeping TDPs steady once those pesky manufacturing issues are resolved. The A8 isn’t far from being more broadly appealing, but some tweaks would alleviate our doubts. Here’s hoping those happen soon, or it may fall to the next-gen Trinity APU to close the gap.

Comments closed
    • burntham77
    • 8 years ago

    I like the technology here, but I just don’t see this fitting into any sort of desktop environment. Laptops, on the other hand, would eat this up.

    • bjterry62
    • 8 years ago

    Nice review guys! Too bad you have to deal with all the moronic comments about the A Series making no sense in the desktop space. AMD has ALWAYS promoted these parts for the BUDGET DESKTOP / HTPC arena. Let’s see, that would be those on a limited budget getting a decent PC for their kids that can still do a little gaming without breaking the bank. Or how about a low power HTPC with great 3D BR playback! BTW, where are those power consumption graphs for BR playback anyway??? Oh well, no matter. Since they aren’t competitive with higher powered CPUs when matched with discrete cards, they’re a total flop. Llano is for laptops and low power desktops. When was the last time any gamer gave a crap about how much power they were using?

    Bunch of pinheads.

      • NeelyCam
      • 8 years ago

      Did a big bully take your milk this morning?

      • thermistor
      • 8 years ago

      Other than the tone of the comment, I agree with some of the content.

      I have an HTPC running an X4500 integrated Intel graphics and (gasp) a Celeron 440 overclocked to 2.66 GHz. I use the BIOS to pin the fan back for quiet and the system runs at 60C while using Windows Media Center. I need the overclock to keep the system snappy and the X4500 was the first Intel graphics that didn’t stutter under normal usage. I tried X3000, GMA 950, etc., had to go to the G4X chipset to get enough GPU horsepower to solve my problem. My signals are OTA HD, and analog cable.

      Moral of the story? The Llano would fit my system perfectly, definitely enough GPU, and a respectable CPU. But 65W is still too much, I’d prefer a dual core with the Llano graphics.

      • ronch
      • 8 years ago

      The reason why some folks are troubled that AMD is offering only midrange parts is not because of the actual parts per se. The parts are good for the price you pay for them. Rather, it’s the fact that AMD couldn’t offer parts that compete in the high end and give people confidence that AMD has the technology to keep up with Intel and prevent a monopoly. As it is, AMD has no choice but to play their cards at the lower segments of the market. Not very assuring considering BD has also been delayed many times, among many other troubles AMD is currently facing.

    • I.S.T.
    • 8 years ago

    Man, why the hell does Borderlands run so slow? I’ve got a system that can play L4D2 maxed with 16xQCSAA and 16xAF on at 50-75 frames a second, yet I can barely manage 40 in Borderlands without AA, without AF and running at 800×600.

      • Goty
      • 8 years ago

      Borderlands never did run particularly well on ATI hardware. I believe the culprit is a bit of a mix of its use of Unreal Engine 3 and however it goes about achieving the cel shading effect.

        • I.S.T.
        • 8 years ago

        I got Nvidia though… GTS 250(I’m poor. >_>).

        I recall that the original i7 reviews on this site tested using that game, and it runs like crap comparatively even there. I hope the new game is either optimized better, or at least looks like Borderlands 1 runs. I don’t mind trading speed for eye candy.

    • flip-mode
    • 8 years ago

    As much as I would love to see AMD have a competitive product, this is not it. The CPU is too slow. The fact that the CGPU is decent doesn’t make up for that. Better to have a fast CPU and a slow CGPU that can easily be mitigated with a discrete GPU than a slow CPU that cannot be mitigated.

    I hope Bulldozer is better than I’m expecting it to be.

      • axeman
      • 8 years ago

      Well said, sir.

      • Clint Torres
      • 8 years ago

      Exactly. CPU slower than the two-generations-old Core2 Quad? Back to the drawing board, AMD.

        • NeelyCam
        • 8 years ago

        Don’t worry – their drawing board is already full of construction vehicles. These things just take time..

      • NeelyCam
      • 8 years ago

      [quote<]I hope Bulldozer is better than I'm expecting it to be.[/quote<]

      I don't think BD is going to be the breakthrough product everyone hoped. It could somewhat meet Sandy Bridge in performance with a much bigger die, but that's not a way to make money. Furthermore, with 22nm tri-gate coming online next year, AMD simply doesn't have a chance to profitably compete in the high end... mostly because they don't have access to Intel's two-to-three-years-ahead process technology.

      However, I have high hopes for the 28nm Brazos successors. Those chips will smack 32nm Atoms silly, and could offer a low-end option for ultrabook IB. I think this is the market AMD should focus on. Low-cost, fast-enough x86... they could make some good money there.

      Llano's game is not necessarily over yet, though (depending on GloFo, of course). It's not for me - I've long maintained that its GPU is too weak if I wanted gaming, and way oversized for non-gaming purposes. However, I don't have a good understanding of the "casual mid-range gamer" market that Llano is targeting... it might end up being larger than I expected. Once yields are up, we'll see.

        • flip-mode
        • 8 years ago

        If BD matches even a Penryn or a Nehalem on per-clock performance, then it will have exceeded my expectations. And if it matches Sandy Bridge power consumption at idle, it will have exceeded my expectations.

        I don’t personally care at all how big the die is. That’s AMD’s problem to worry about and even if it’s twice as big as Intel’s I’m not going to bother criticizing that if it’s got good performance aspects.

          • NeelyCam
          • 8 years ago

          [quote<]I don't personally care at all how big the die is.[/quote<] I do. There is a lot of pressure on them to make money for the shareholders. As such, they have to price their stuff to have some profit margin... with big dies, it's hard to keep prices low. Moreover, it has an impact on AMD's future competitiveness. Unless they can start being profitable, they have to cut staff/R&D/something, negatively impacting how well their future can pressure Intel to bring prices down.

    • Corrado
    • 8 years ago

    Just want to point out that the 6670 isn’t even $99. I just bought one last week for my father’s computer, for $83 with a $20 rebate at Newegg. That’s an INCREDIBLE deal for the performance it offers. He mostly plays racing simulations, and since most of them are a few years old, he can max out his 1680×1050 screen with all the visual goodies and it runs fluidly.

      • paulWTAMU
      • 8 years ago

      cheap graphics goodness makes me happy.

    • ronch
    • 8 years ago

    I wouldn’t hesitate to grab an A8 instead of Phenom II unless I plan to plug in a better graphics card. I read somewhere before that getting Dual Graphics to work is a pain in the a$$. They’d better fix that before they stop making Phenom II’s, which is still, for me, the more compelling choice if you plan to add a discrete GPU.

    • colinstu
    • 8 years ago

    Conclusion: So uh… Buy an Intel i3-2100… and throw in a gpu if you give any kind of a real shit about gaming?

      • derFunkenstein
      • 8 years ago

      And buy an Intel i3-2100 and skip the GPU if you don’t.

        • colinstu
        • 8 years ago

        Yup!

    • JustAnEngineer
    • 8 years ago

    Any chance that we could see a Core i3-2105 in the mix? For HTPC applications, that’s the most likely competition for Llano.

    I’d also like to see testing with faster memory. PC3-12800 costs about the same as PC3-10600 these days.

      • Rza79
      • 8 years ago

      Read the article. It’s already there!

        • JustAnEngineer
        • 8 years ago

        I see them in one set of charts now.

          • Rza79
          • 8 years ago

          [quote<]Their CPU performance should be the same. Thus, we've tested both chips, but we've confined the Core i3-2105 to our integrated graphics tests alone.[/quote<] Reading != just looking at the graphs

      • TheEmrys
      • 8 years ago

      I would actually like to see the 2105T in the mix. I just built an HTPC around it, and I’ve been quite pleased. All the power at 35w.

    • kalizec
    • 8 years ago

    It would have been nice if this review had actually clocked the memory at 1600 or 1866 speeds as Llano supports instead of hampering its GPU performance by clocking that memory at 1333 speeds. So now this review is actually worthless regarding its GPU performance comparison.

      • obarthelemy
      • 8 years ago

      Indeed. At least for the IGP tests where impact should be higher.

        • derFunkenstein
        • 8 years ago

        You must have missed the article a while back where they went and re-visited the 3850 with high-speed memory. Nearly nothing then, either.

        [url<]https://techreport.com/articles.x/21255[/url<] And I'm sure they've got it set up appropriately because if they were still testing at slower speeds with faster DIMMs, you'd see the results not budge at all. They're at least moving a little bit which shows a TINY bit of improvement. The fact is, caring about IGP gaming performance in a desktop system is like putting a spoiler on a Smart Car.

      • Rza79
      • 8 years ago

      What’s up with these comments? DDR3 1600 & 1800 results are included in the IGP tests.

        • kalizec
        • 8 years ago

        They only tested 3850 with 1600 and 1866… so they didn’t test 3800 with 1600 or 1866.

        Also, why test a CPU for CPU performance with 1333 when it supports 1600 and 1866 and you even have 1600 RAM on the machine… that’s, imho, just misconfiguring.

          • derFunkenstein
          • 8 years ago

          Why would you waste time testing the 3800 with faster memory when for the better CPU it had no effect? It had so little effect for the 3850 it’s pure insanity to expect it to help a CPU that’s even slower.

            • Waco
            • 8 years ago

            People *really* try to find fault in everything don’t they?

            • kalizec
            • 8 years ago

            Because there have been countless reviews showing that memory speed has an effect on GPU performance (in the 20s of percent), as well as showing that even the CAS latency causes several percent of difference in GPU performance.

            And regarding CPU? Even if it’s only 1% of effect you should use it if the CPU in question was designed for it. All CPUs are helped by lower overall memory latency (even if only 1-3%). Anyone claiming that it doesn’t simply doesn’t understand how a memory subsystem (caches + ram) works and affects performance.

            [edit]

            And for those not believing the relevance of fast RAM for the IGP of Llano…

            [url<]http://www.tomshardware.com/reviews/amd-a8-3850-llano,2975-6.html[/url<]

            • TrptJim
            • 8 years ago

            The review shows both the 3850 and 3800 GPU scores being the same at baseline. I assume the performance increase is at most equal with the 1600/1866MHz results shown in the review. This is not a big deal.

            • Waco
            • 8 years ago

            The day I trust TH over TR is the day I’ll quit computers.

            • Farting Bob
            • 8 years ago

            When you only listen to TH reviews, its time to hand in your geek card at the front office.

            • NeelyCam
            • 8 years ago

            Most review sites have good and bad info; you just have to use your own judgment and weed out the bad stuff.

      • flip-mode
      • 8 years ago

      It’s hard to care about the GPU when the CPU is so slow.

        • Krogoth
        • 8 years ago

        >mfw when you consider any of the chips slow, when they can all effortlessly handle mainstream tasks and gaming without a hitch.

          • Prion
          • 8 years ago

          why is this shorthand being used on non-imageboard forums? mfw stands for “my face when” and is typically accompanied by a reaction face image, and in fact makes no damn sense without it

    • BobbinThreadbare
    • 8 years ago

    This chip isn’t for me, but it makes way more sense than the 3850 does for the kind of applications it should find itself in.

      • axeman
      • 8 years ago

      This isn’t the chip for me. </Darrell Sheets>

    • MadManOriginal
    • 8 years ago

    Intel roolz, AMD droolz.

    Now just to wait for the AMD fanboi downrating. Can I beat my record of -10?!? Only time and your vote will give the answer!

      • sweatshopking
      • 8 years ago

      IMMA GIVE YOU A PLUS, BECAUSE I LOVE YOU.

        • derFunkenstein
        • 8 years ago

        Me, too. I think he wants downvoted so the best “downvote” is to actually upvote him.

          • Meadows
          • 8 years ago

          I concur.

          • MadManOriginal
          • 8 years ago

          Drat, you’ve seen through my reverse psychology. OR HAVE YOU?!?!

            • tfp
            • 8 years ago

            So lets vote one up and one down? If this doesn’t show there is a point to the thumb rating system nothing will!

    • willmore
    • 8 years ago

    This looked interesting until I realized that even Scott gave up on it near the end. Hmm, starting with Borderlands there are no more A8-3800 results at all.

    After all, the desktop (and probably the laptop) APUs don’t make much sense with a discrete graphics card. Sure, for the system builder crowd–like most anyone reading this article–the APUs don’t make much sense, but to the system integrator who would never add a discrete GPU (to a desktop or a laptop), the APUs start to look pretty attractive. (unless Intel throws a wad of cash at you) *cough*

      • Damage
      • 8 years ago

      [quote<]Hmm, starting with Borderlands there are no more A8-3800 results at all.[/quote<] Huh? I see them. They're in there.

      • sweatshopking
      • 8 years ago

      yeah, i’ve never understood the “just get a discrete gpu” argument. it defeats the whole purpose, and confuses the market for these chips. that being said, i do see the borderlands data.

    • Johnny5
    • 8 years ago

    I had a nice moment of excitement before I realized this wasn’t a Bulldozer review. :/

      • stmok
      • 8 years ago

      hhh

        • ronch
        • 8 years ago

        Yes, I don’t plan to buy the first BD iteration either, especially if the rumors suggesting that it’s gonna be unable to perform as we hoped it would are true. Second, I guess Rory’s decision to can FM2 for now is right. FM1 just came out and rolling out FM2 would be a bit rash even if it will support FM1 chips. Trinity would be better off sticking with FM1, if they don’t have to make too many compromises to make it work with the socket.

          • khands
          • 8 years ago

          It’s not really going to matter from the sounds of it, they won’t be able to make enough to meet demand anyways due to GloFo. Hopefully 28nm isn’t such a disaster.

            • ronch
            • 8 years ago

            Well, GloFo’s 32nm problems are based on Llano production. Whether it’s really a problem with the 32nm process or Llano’s manufacturability is open to debate.
