AMD’s Brazos platform: The benchmarks

Exactly two weeks have passed since I got to sit down at AMD’s Austin campus and benchmark a Zacate development system. You might have read about my experience, which I seasoned with some fresh insights on the Brazos platform that Zacate powers, in my first preview article on November 8.

As I explained then, AMD put a momentary moratorium on the publication of Zacate benchmarks, so the best I could offer were vague and possibly misleading comparisons… and even vaguer hints about the performance of AMD’s first accelerated processing unit (APU). Well, the moratorium finally lifted tonight. I guess the headline was probably a dead giveaway, but still: we’re now free to publish all of the numbers I collected in my few brief hours of testing.

I’m not going to keep you folks waiting too long for the data—a week is a long enough time to wait. So, we’re going to skip architecture and platform talk. We already provided that information in spades in the aforementioned preview article, as well as Scott’s architectural overview of Bobcat. I should, however, take some time to explain the conditions of the test and lay out a few caveats, since this is one of the very rare cases where we tested neither a retail-boxed computer nor a system we had carefully configured ourselves.

Testing conditions

The misshapen contraption you see above is one of the Zacate test rigs AMD laid out for us lucky testers. Scott saw the same test chassis in San Francisco two months ago, when AMD gave an early demo of its APU running City of Heroes and Internet Explorer 9 with hardware acceleration enabled. This time, however, AMD let me use the system mostly unsupervised for several hours—enough time to run our new mobile benchmark suite, paving the way for a comparison between the Zacate system and several laptops we’ve reviewed so far, including the Nile-powered Toshiba Satellite T235D and a CULV 2010-based ultraportable.

AMD took care of configuring the test machine, outfitting it with the necessary cooling, peripherals, storage, display (an 11.6″, 1366×768 panel), and software. Windows 7 Professional x64 was pre-installed along with a full suite of drivers and some benchmarks.

In a shocking display of rudeness toward my hosts, I pooh-poohed the included benchmarks and instead whipped out a 32GB USB thumb drive containing our mobile test suite—plus a few Steam game backups. Re-installing Windows would have been a good way to ensure a clean testing environment, but that wasn’t really feasible. First, the drivers AMD had installed were pre-release versions not publicly available. And second, I had just enough time with the system to complete my testing. (To give you a rough idea, AMD handed out the keys to the preview system at around 10:00 AM, and I had to head to the airport at around 4:30 PM.)

By the way, that tight schedule also ruled out any battery life tests. Considering AMD quotes run times of at least 8.5 hours for Zacate systems, you can probably guess why.

As far as I could see, though, the Zacate test rig wasn’t up to anything unusual. CPU-Z reported a 1.6GHz CPU clock speed, signaling the quickest Zacate part, and Windows 7 said it had 4GB of memory at its disposal, minus around 400MB requisitioned by the integrated GPU. AMD told us it had set up the machines with solid-state drives, and the Windows Device Manager supported this claim, announcing a 128GB Crucial RealSSD C300. That’s not exactly the kind of drive you’d find on a cheap ultraportable, of course, but its fast read and write speeds proved helpful when installing benchmarks and restoring Steam backups—the clock was ticking, after all.

To sate my paranoid side, I also peeked into the Windows Task Manager and checked the process names. I encountered one service I hadn’t seen before, but after discussing it with AMD, I’m now reasonably satisfied that no tomfoolery was afoot. Eventually, all the poking and prodding gave way to some actual testing…

Our testing methods

Over the next few pages, you’ll see the Zacate test build compared to two full-sized Optimus notebooks, the Asus N82Jv and U33Jc; three consumer ultraportables, the Acer Aspire 1810TZ, Acer Aspire 1830TZ, and Toshiba Satellite T235D; one netbook, the Eee PC 1015PN; and one Consumer Ultra-Low-Voltage nettop, the Zotac Zbox HD-ND22.

The 1015PN, N82Jv, U33Jc, and T235D were all tested twice: once plugged in and once unplugged using included “battery-saving” profiles. In the case of the 1015PN, N82Jv, and U33Jc, we compared the battery-saving results to those obtained using built-in “high-performance” modes. The N82Jv’s battery-saving results were obtained with Asus’ Super Hybrid Engine enabled, as well, which dropped the CPU clock speed from 2.4GHz to 0.9-1GHz depending on load. On the U33Jc, the high-performance profile included by Asus raised the maximum CPU clock speed from 2.4 to 2.57GHz. On the Eee PC 1015PN, the included Super Hybrid Engine “Super Performance Mode” raised the CPU speed by 25MHz, while the “Power Saving Mode” limited the CPU to about 1GHz and disabled the Nvidia GPU.

All tests were run at least three times, and we reported the median of those runs.
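
In script form, that reporting rule boils down to the brief sketch below. The scores shown are hypothetical stand-ins, not measured data.

    # Minimal sketch of the reporting rule above: run each test at least three
    # times and chart the median result. The example scores are made up.
    from statistics import median

    def report_score(scores):
        assert len(scores) >= 3, "each test is run at least three times"
        return median(scores)

    # Hypothetical example: three SunSpider-style timings in milliseconds.
    print(report_score([1183.2, 1176.8, 1191.4]))  # -> 1183.2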

I should note that, in my hurry, I seem to have misplaced the version numbers for the Zacate system’s graphics and audio drivers. Apologies for that. Also, for what it’s worth, CPU-Z wouldn’t report memory timings.

System: AMD Zacate test system | Acer Aspire 1810TZ | Acer Aspire 1830TZ | Asus Eee PC 1015PN | Asus N82Jv | Asus U33Jc | Toshiba Satellite T235D-S1435 | Zotac Zbox HD-ND22
Processor: AMD Zacate engineering sample 1.6GHz | Intel Pentium SU4100 1.3GHz | Intel Pentium U5400 1.2GHz | Intel Atom N550 1.5GHz | Intel Core i5-450M 2.4GHz | Intel Core i3-370M 2.4GHz | AMD Turion II Neo K625 1.5GHz | Intel Celeron SU2300 1.2GHz
North bridge: AMD Hudson FCH | Intel GS45 Express | Intel HM55 Express | Intel NM10 | Intel HM55 Express | Intel HM55 Express | AMD M880G | Nvidia Ion
South bridge: N/A | Intel ICH9 | N/A | N/A | N/A | N/A | AMD SB820 | N/A
Memory size: 4GB (2 DIMMs) | 3GB (2 DIMMs) | 3GB (2 DIMMs) | 1GB (1 DIMM) | 4GB (2 DIMMs) | 4GB (2 DIMMs) | 4GB (2 DIMMs) | 4GB (2 DIMMs)
Memory type: DDR3 SDRAM | DDR2 SDRAM at 667MHz | DDR3 SDRAM at 800MHz | DDR3 SDRAM at 667MHz | DDR3 SDRAM at 1066MHz | DDR3 SDRAM at 1066MHz | DDR3 SDRAM at 800MHz | DDR3 SDRAM at 1066MHz
Memory timings: N/A | 5-5-5-15 | 6-6-6-15 | 6-5-5-12 | 7-7-7-20 | 7-7-7-20 | 6-6-6-15 | 7-7-7-20
Audio: IDT codec | Realtek codec with 6.0.1.869 drivers | Realtek codec with 6.0.1.6043 drivers | Realtek codec with 6.0.1.6186 drivers | Realtek codec with 6.0.1.6024 drivers | Realtek codec with 6.0.1.6029 drivers | Realtek codec with 6.0.1.6072 drivers | Realtek codec with 6.0.1.5845 drivers
Graphics: AMD Radeon HD 6310 | Intel GMA 4500MHD with 15.17.11.2202 drivers | Intel HD Graphics with 8.15.10.2057 drivers | Intel GMA 3150 with 8.14.10.2117 drivers + Nvidia Ion with 8.17.12.5912 drivers | Intel HD Graphics with 8.15.10.2189 drivers + Nvidia GeForce 335M with 8.17.12.5896 drivers | Intel HD Graphics with 8.15.10.2119 drivers + Nvidia GeForce 310M with 8.17.12.5721 drivers | AMD Mobility Radeon HD 4225 with 8.723.2.1000 drivers | Nvidia Ion with 8.17.12.6099 drivers
Hard drive: Crucial RealSSD C300 128GB | Western Digital Scorpio Blue 500GB 5,400 RPM | Toshiba MK3265GSX 320GB 5,400 RPM | Western Digital Scorpio Blue 500GB 5,400 RPM | Seagate Momentus 7200.4 500GB 7,200 RPM | Seagate Momentus 5400.6 500GB 5,400 RPM | Toshiba MK3265GSX 320GB 5,400 RPM | Western Digital Scorpio Black 500GB 5,400 RPM
Operating system: Windows 7 Professional x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Starter x86 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Application performance

SunSpider JavaScript benchmark

The SunSpider JavaScript benchmark has become a staple of browser testing around the web, usually serving to highlight differences in JavaScript execution speeds between browser revisions. Today, we’ll be looking at SunSpider performance with the same browser (Firefox 3.6.9) across multiple notebooks.

Zacate is off to a respectable start, trailing our CULV-based Zbox HD-ND22 by less than 10% and handily outpacing the Eee PC 1015PN’s dual-core Atom running at 1.53GHz.

7-Zip

Odds are, anyone with a computer will need to compress or decompress some files every once in a while. To see how these systems handled that task, we ran 7-Zip’s built-in benchmark and jotted down the results for both compression and decompression.

AMD’s new APU kicks things up a notch in 7-Zip, zooming ahead of entry-level CULV and CULV 2010 systems alike and nipping at the heels of the Toshiba notebook’s Turion II Neo.
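
If you’d like to kick off the same benchmark on your own machine from a script, the rough sketch below is one way to do it; it assumes the 7z command-line executable is installed and on your PATH, which isn’t necessarily how the test was launched here.

    import subprocess

    # "7z b" runs 7-Zip's built-in benchmark; its output includes compression
    # and decompression ratings. Assumes the 7z executable is on the PATH.
    result = subprocess.run(["7z", "b"], capture_output=True, text=True, check=True)
    print(result.stdout)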

TrueCrypt

Next up: file encryption. Because who wants any two-bit thief to have access to his sensitive data? We ran TrueCrypt’s built-in benchmark and averaged the results for all of the different encryption schemes.

x264 video encoding

Last, but not least, we took our notebooks through the x264 high-definition video encoding benchmark.

In TrueCrypt, the Zacate system lies smack-dab in CULV territory. Moving on to a more intense video-encoding test, however, the CULV systems (and even our AMD Nile laptop) speed ahead, leaving the APU in the no-man’s-land between the Atom and the rest of the market. To be fair, video encoders are among those apps that folks generally don’t run on their ultraportables and netbooks.

That gives us a decent glimpse into the performance of Zacate’s dual Bobcat CPU cores. What about the APU’s graphics component?

Gaming

Call of Duty 4: Modern Warfare

Infinity Ward’s first Modern Warfare title is growing somewhat long in the tooth, but it still has a strong following in multiplayer circles. More importantly, it’s a good representative of the type of game you might want to play on a notebook that lacks serious GPU horsepower: not too old, but not too new, either. We tested Call of Duty 4 by running a custom timedemo, first at 800×600 with the lowest detail options, then again at 1366×768 with everything cranked up except for v-sync, antialiasing, and anisotropic filtering, which were all left disabled. (With the Eee PC, we opted for the 1024×600 native resolution instead of 1366×768, and with the Zbox HD-ND22, we were only able to use a resolution of 1360×768.)

Oh my. Although Zacate falls a reasonable distance behind our Toshiba Nile notebook in CPU tests, it speeds past in Modern Warfare. The only system in the same class that comes close is the Zbox, which is powered by a first-generation Nvidia Ion chipset.

Far Cry 2

Ubisoft’s safari-themed shooter has much more demanding graphics than CoD4, so it should really make our notebooks sweat. We selected the “Action” scene from the game’s built-in benchmark and ran it in two configurations: first at 1366×768 in DirectX 10 mode with detail cranked up, then at that same resolution in DX9 mode with the lowest detail preset. Vsync and antialiasing were left disabled in both cases. (Again, the Eee PC was run at 1024×600, since that’s the highest resolution its display supports, and the Zbox was run at 1360×768.)

We see a similar picture in Far Cry 2, where the Zacate and Zbox basically end up neck-and-neck.

That’s it for our conventional gaming benchmarks. Next up: some freestyle game testing.

Off the beaten path

Scientific benchmarks or not, we like to install different games on our laptops and manually tweak the options to see how well they run. A little subjective testing never hurt anybody, right?

I kicked off my freestyle gaming tests with DiRT 2, a long-time TR favorite and one of the better racing games out on the PC. At 1366×768 with the “low” preset, the demo’s Morocco track unfurled at a solid 20 FPS, give or take two or three. Frame rates dropped into the low teens upon crashes, but the game was surprisingly smooth and playable overall.

Next up was Left 4 Dead 2, which I ran at 1366×768 with trilinear filtering, no antialiasing, high shader detail, and medium effect, model, and texture detail. In the first map of the Dead Center campaign, frame rates ranged from a low of about 13 to a high of 36 FPS. From a seat-of-the-pants perspective, the game was completely playable despite notable choppiness during heavy action. Those massive zombie swarms aren’t easy on low-end hardware, but Zacate did a reasonably good job of keeping things smooth.

Feeling emboldened by these good results, I tried the Just Cause 2 demo. Things didn’t go so well there. At 800×600 with all the detail options turned down, the Zacate system yielded frame rates in the 18-20 FPS range in the first town. That’s just not good enough for a fast-paced action game, unfortunately. I had trouble shooting and driving straight, and found myself wishing there were a way to turn the graphics detail down even further.

I ended my freestyle game testing with a short round of Alien Swarm, specifically the single-player training level. At 1366×768 with antialiasing off, trilinear filtering, low shader and effect detail, and medium model and texture detail, the game chugged along with highs of nearly 30 and lows in the 15-17 FPS range during heavy action. I’d say the game was definitely playable overall, and it didn’t look half bad, as you can see above.

Video playback

I tested video decoding performance by playing the Iron Man 2 trailer in a variety of formats. Windows Media Player was used in full-screen mode for the H.264 QuickTime clips, while Firefox was used for the windowed YouTube test. In each case, I used Windows 7’s Performance Monitor to record minimum and maximum CPU utilization for the duration of the trailer.
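
For the curious, the sketch below shows one way to capture similar minimum and maximum CPU utilization figures from a script. It leans on the third-party psutil package rather than Performance Monitor, which is what I actually used, so treat it purely as an illustration.

    import time
    import psutil  # third-party package; not part of the actual test setup

    def sample_cpu(duration_s=150, interval_s=1.0):
        """Sample total CPU utilization for roughly the length of a clip."""
        samples = []
        end = time.time() + duration_s
        while time.time() < end:
            samples.append(psutil.cpu_percent(interval=interval_s))
        return min(samples), max(samples)

    low, high = sample_cpu()
    print(f"CPU utilization: {low:.0f}-{high:.0f}%")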

Clip | CPU utilization | Result
Iron Man 2 H.264 480p | 9-40% | Perfect
Iron Man 2 H.264 720p | 4-53% | Perfect
Iron Man 2 H.264 1080p | 6-62% | Perfect
Iron Man 2 YouTube 720p windowed | 32-82% | Choppy

Rest assured, Zacate’s UVD3 video decoding logic does a terrific job with H.264 content in Windows Media Player. Playing back high-definition Flash clips in Firefox turned out to be another story, though—there were quite a lot of dropped frames, even when I tried running the video in full-screen mode.

I asked the AMD folks about this, and they told me they’d had no trouble getting smooth high-def Flash playback out of the same test systems. Puzzled, I tried updating Firefox and the Flash plug-in to the latest versions. No dice. After some more prodding, I learned that AMD had run its internal tests in Internet Explorer. Sure enough, the same video played back as smooth as silk in Internet Explorer 8 with the latest Flash ActiveX plug-in. Interesting. I’d probably chalk this issue up to immature drivers. Hopefully, final Zacate systems will be able to deliver smooth Flash video in any browser.

Temperatures and power consumption

I didn’t bring an infrared thermometer or a power meter along, but AMD was kind enough to provide some, so I took the bait and got some readings while running the benchmarks you saw on the previous pages.

First, I amused myself with the infrared thermometer, pointing the beam at various surfaces (and the inside of my mouth, when the AMD guys weren’t looking). I measured 27.4°C at the base of the APU heatsink with the system at idle and 32.5°C during our Far Cry 2 benchmark’s third consecutive run. The fan speed seemed pretty much constant, and it was barely louder than a whisper.

I also checked the power meter while the Far Cry 2 test was finishing: it reported 33.6W draw at the wall. In x264, a purely CPU-bound benchmark, power draw at the wall was about 26W. Keep in mind these figures are for the entire system and were taken upstream of the power adapter, which AMD told me has an efficiency of around 83%.
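
If you want a rough feel for what the system itself pulls on the DC side, you can discount those wall readings by the quoted efficiency; the quick sketch below assumes the 83% figure holds at both loads.

    # Wall readings from above, discounted by the ~83% adapter efficiency AMD
    # quoted, to estimate DC-side system draw. Assumes constant efficiency.
    ADAPTER_EFFICIENCY = 0.83

    for workload, wall_watts in [("Far Cry 2", 33.6), ("x264 encoding", 26.0)]:
        print(f"{workload}: ~{wall_watts * ADAPTER_EFFICIENCY:.1f} W DC-side")
    # Prints roughly 27.9 W and 21.6 W, respectively.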

Conclusions

AMD has pulled off quite an impressive feat with Zacate. On the CPU front, the dual Bobcat cores clearly play in the same league as Intel’s entry-level CULV and CULV 2010 processors. That means largely acceptable performance for everyday tasks, from web surfing to file compression. Even enthusiasts using an ultraportable or a nettop as their second or third PC ought to find it adequate.

On the graphics front, meanwhile, Zacate succeeds in outpacing both previous AMD solutions and current Intel ones by a fair margin. The only system that came close was our Zotac nettop, which has an Nvidia Ion integrated graphics chipset. You’ll be hard-pressed to find any ultraportable notebooks toting that kind of hardware these days, though; the closest thing you’d probably find would be something with a discrete, next-generation Ion GPU, and that’s a different class of hardware—nothing like the elegant two-chip Zacate-Hudson combo AMD will offer as part of the Brazos platform.

As I said last time, the only question that remains is battery life. If Zacate manages to match or exceed current solutions in terms of run time, which seems entirely possible considering the Brazos platform’s very spartan power draw, then AMD might just end up with the most attractive ultraportable platform on the market early next year. That would be a refreshing first, after the misses and near-misses we’ve seen from the company so far.

Of course, Intel isn’t asleep at the switch. With such heated competition from its chief rival, perhaps it won’t hesitate to whip up some Sandy Bridge-based CULV 2011 solutions, which could have considerably better graphics performance than current, CULV 2010 offerings. And the next generation of Atom processors, also due next year, might give netbooks some much-needed oomph. There’s no telling whether Intel will be able to reach the sweet spot of power efficiency and performance AMD appears to have achieved with Zacate, though.

Comments closed
    • UberGerbil
    • 9 years ago

    http://www.anandtech.com/show/2828/5 (I haven't seen benchmark numbers for Pineview but I suspect they aren't much better, and certainly not better than the ION systems)

    • IntelMole
    • 9 years ago

    Assuming decent battery life and pricing, in essence, buying this vs. a CULV chip from Intel is taking a bet that GPGPU takes off for the really heavy lifting.

    Given that there are now more than a few examples of GPUs being used, that might be reasonable.

      • FuturePastNow
      • 9 years ago

      I’d pick this simply because I have shelves of games to play that, while old, still require better 3D graphics performance than Intel can deliver. This should run shooters and action games from 5-10 years ago quite well; a CULV or Atom won’t.

      I also need a small, light laptop to travel with, and I like the idea of taking a few older games with me.

    • LiamC
    • 9 years ago

    I want Bobcat for appliances. Think a firewall box or NAS.

      • Voldenuit
      • 9 years ago

      Agreed. A Bobcat Boxee should be interesting.

      It just doesn’t appeal to me as an attractive netbook option since it’s not much faster than dualcore Atom.

        • LiamC
        • 9 years ago

        I too was hoping this would make a good thin and light (13″~14″), but it doesn’t look to have the CPU grunt I want. I can’t figure out if the bottleneck is the CPU or the graphics; it would probably need comparing to a Radeon 5450-equipped CULV.

    • indeego
    • 9 years ago

    meh. My 7-Zip score is 4 times this. Can’t go backwards, even for portability’s sake. <.<

    • burntham77
    • 9 years ago

    I just like the idea of not buying an Intel system. This is good stuff. I’d pick one up as a cheap portable for sitting on the couch.

    • StashTheVampede
    • 9 years ago

    Give me the most powerful chip, a 15″ screen, 4GB of RAM and an SSD — how many hours of battery life and what’s the cost? Swap the screen to 13″ and how many hours of battery life and cost?

    Chip isn’t as fast as Intel C2D, but we’re looking at “good” performance at very low power numbers.

      • axeman
      • 9 years ago

      BAH! The best use for a low power chip is in a small and light laptop, which still has excellent battery life. I’d like to see more 11-12″ laptops with “good enough” performance.

        • StashTheVampede
        • 9 years ago

        Not everyone wants the smaller, sub-13″ screen. MOST of the time those smaller screens lead to smaller keyboards (which annoy me to no end).

        Why not have a 15″ screen on that platform?

    • FuturePastNow
    • 9 years ago

    I was hoping for more, but it’s better than Atom in every way. I can settle for that.

      • NeelyCam
      • 9 years ago

      For now. I think Cedar Trail will change that.

        • FuturePastNow
        • 9 years ago

        Cedar Trail won’t be significantly more powerful than Pineview. The only real performance improvement it adds is hardware 1080p video decoding. It will also use a lot less power. That is the direction Intel wants to take Atom: into potentially smaller devices with better battery life.

        Anyway, Cedar Trail isn’t coming until the middle of next year.

          • NeelyCam
          • 9 years ago

          I was talking about the power consumption reduction. The current Atom chipsets are kind of inefficient… the CPU itself consumes almost no power (see idle vs. full load).

            • OneArmedScissor
            • 9 years ago

            I’m hesitant to think that anything but a complete redo of the way laptops are built will change battery life much. It looks like even Core 2s AND their chipsets hardly use anything.

            The problem is that despite integrating so many parts into one chip, they keep making all the parts do more and more things that most people just don’t need…and now they ROMG TURBO BLOOOOST to bazillion GHz even if that accomplishes nothing for you. Pentium 4, is that you in there?

            Atom and Bobcat are both steps in the right direction, but ARM-based “smartbooks” could very well take a big, steamy dump all over them. Nobody will expect those to have whacky PCIe setups and oodles of desktop speed RAM with a squillion traces running all over an entirely too fancy PCB. Just stick a beefed up smartphone chip in there with a low power SSD and it’s good to go.

            • DavidC1
            • 9 years ago

            Why does everyone think the chipsets are inefficient on the Atom?

            Hello?! The 2010 Atoms are called Pineview, which integrates most of the chipset functions! The companion chipset has a TDP of only 1.5W. Even the nettop companion chipset is only at 2.2W. The NETTOP Atoms suck in power usage, and that’s what they used to compare the Brazos platform against. Let’s see Brazos netbooks vs. Atom netbooks and see how the power consumption comparison goes then.

            • OneArmedScissor
            • 9 years ago

            People often confuse “chipset” with “the way every single PC has been built for years.”

            The “computer” part isn’t changing, folks, which is the point I was trying to make. Put a 1w TDP combination CPU/NB/SB chip in a laptop and it’s still going to suck down 5w or so

            To see any real difference, we’ll need new screens with no backlights, cut down motherboards, different RAM altogether, and netbook-class SSDs instead of server-class.

            The trouble is that you cannot even really do any of that to a laptop by switching parts out yourself. It’s a fundamental PC design issue.

    • swaaye
    • 9 years ago

    It looks a little better than the boring Nano from both power and speed standpoints. The GPU looks a little better than a low end card. Exciting. Heh.

    It will probably make annoyingly warm and noisy netbooks. My Eee900 with its Celeron 900 and GMA 900 already gets quite hot playing anything 3D, and that’s about the same power output, I think.

      • axeman
      • 9 years ago

      This uses less power than the chipset alone does in your case AFAIK.

        • swaaye
        • 9 years ago

        I also forgot that ICH7 is in there too. 🙂

    • OneArmedScissor
    • 9 years ago

    Oh, you kids and your wild speculation on synthetic benchmarks. I’m sure a CPU with integrated just about everything is merely on par with Atom, which has integrated nothing and wonderful FSB interlinks holding it together. You absolutely won’t be able to tell the difference between the two in regular use, just like you can’t tell Atom from an Athlon 64. Wait a minute…

    *Earth to dingdongs! Earth to dingdongs!*

    If it doesn’t compress 14GB ZIP files as fast, but still gets good battery life, I think everyone else who doesn’t obsess over incomplete numbers will live.

    Wait for the battery figures before you get out your pitchforks.

      • NeelyCam
      • 9 years ago

      Well said.

      But speculators need to speculate.

    • NeelyCam
    • 9 years ago

    I wish TR reviews would list the CPUs along with the laptop model names. How am I supposed to know what Asus 85012xkt has in it? Makes comparison difficult.

    I preferred the way Anand’s graphs list the CPUs.

    • kamikaziechameleon
    • 9 years ago

    I would pick AMD over Intel in this arena for a couple of reasons: they will have a true GPU on board, and on top of that, Intel isn’t yet capable of releasing competent drivers.

      • sweatshopking
      • 9 years ago

      that’s really only 1 reason. you like the gpu better.

        • NeelyCam
        • 9 years ago

        Concise and to the point. But where’s the trolling?

        • shank15217
        • 9 years ago

        No, it’s two reasons, good drivers and a fast GPU. Where did you miss that?

      • sweatshopking
      • 9 years ago

      PLUS INTEL CPU’S ARE FART POWERED!

        • NeelyCam
        • 9 years ago

        And there it is.

        It gives you warm and fuzzies to know that there are some things you can always count on.

    • neon
    • 9 years ago

    Does the APU have a product name yet?

    I assume it will not be AMD Zacate E350, but rather AMD Turion3 E350 or AMD Fusion E350 or AMD RockOn 11900/BBQ/XLT

      • sweatshopking
      • 9 years ago

      They’re calling it AMD Awesome Powerful Ultimate (APU) platform.

    • TheEmrys
    • 9 years ago

    What’s with the Ageia Physx installation? Was it a factor in Just Cause 2? This just looks odd to me that Physx would be installed in an AMD cpu/gpu system.

    • flip-mode
    • 9 years ago

    I don’t know what to think. This market does not excite me much. Maybe for netbook users this is good news and hope for future tablets. For me, I just see a really slow processor and GPU paired up together. I’d love to have a file server / linux box based on one of these, but that’s much more of a luxury than a need and it falls pretty low on my prioritized list of luxuries too.

    I hope the product does well for AMD but I don’t see it as useful for me.

      • obarthelemy
      • 9 years ago

      most people I know (except gamers, and even them… ) run incredibly outdated PCs. 5-10 years ago I liked fixing PCs up as a hobby, and selling/giving them on… I got a call last month about a friend of my parents’ with a Celeron 700-based MSI pizza-style mini PC (5371 ?) I fixed up for him when that CPU was already old. He wanted another PC, a laptop, to move around with it. Still perfectly happy with the Celeron, though.

      Even an Architect friend of mine is still using some kind of Athlon (not even XP nor II) PC with an old version of AutoCAD, “‘coz it works fine”.

      So my take on the chip is that it’s what the world needs: powerful enough for almost everything at home and at work, plays all video formats, cheap, more reliable (due to integration). It would be an upgrade for 80-90% of all the PCs I know. ‘cept mine :-p

        • flip-mode
        • 9 years ago

        So true. In fact, I’m typing this on a 4-year old PC I’m presently prepping to pass on to my dad, who is using a PC I passed on to him some 4 or 5 years ago, which is an Athlon XP 2600 with an ATI Rage 128 32MB video card. The only reason he’s decided to upgrade is because he bought a new (used) monitor that’s 1600×1200 and I suggested that his current PC will probably be sluggish on it.

      • shank15217
      • 9 years ago

      A file server nas box wouldn’t need a gpu at all, making these a bad choice.

        • OneArmedScissor
        • 9 years ago

        So a Core i7 would be better?

          • shank15217
          • 9 years ago

          A single-core Nile Athlon at 9W would outperform it on I/O-intensive tasks.

            • OneArmedScissor
            • 9 years ago

            A 1.2 GHz single core wins vs. a 1.6 GHz, almost directly clock for clock comparable dual-core? I don’t think so, Tim.

            Problem: if you’re talking about mobile parts, where are you going to get a mobile board with no GPU? I’m not aware one exists.

            If you go with desktop parts, well, I know what a file server REALLY doesn’t need, and that’s a high power northbridge with over 9,000 PCIe lanes like a 7/890FX board has.

            Bobcat nettops will likely idle at 15-20w. There isn’t a current AMD desktop board that will get even remotely close to that. It would just be a waste of electricity.

        • obarthelemy
        • 9 years ago

        in my experience, having some kind of graphics on any box makes everything so much easier: installation, troubleshooting, updates and upgrades… IGPs are better for this than discrete cards, since they don’t impact reliability, power usage, size… as much

          • UberGerbil
          • 9 years ago

          Agreed. I want an IGP in all my motherboards, just because it’s handy when a discrete GPU is unavailable/being troubleshot/unnecessary/whatever.

            • flip-mode
            • 9 years ago

            Thank gosh there’s someone that shares this exact sentiment with me! Every review of an AMD xxxGX chipset says “but I just don’t see the point of an IGP on an enthusiast motherboard” and I’m like “but it’s great for troubleshooting and for the eventual demotion to 2nd-class computer where low power is awesometown”.

    • HisDivineShadow
    • 9 years ago

    Looks like Apple’s dream CPU+GPU config to me for a MacBook Air. Imagine how thin a system like this could be after a couple of iterations when performance is twice or three times what this one is.

    It does make one wonder how much performance we can expect from its big brother, Bulldozer. Now that APU could well end up being exactly what Apple is looking for.

    Then again, Sandy Bridge might, too, if Intel can get their GPU drivers up to snuff. With Bulldozer so far away for portable systems (and Llano standing in for it in the meantime), it looks like I’ll wind up with another Intel system this coming year. It’s a shame. I wish AMD had put more emphasis on portable systems, less on the ever-declining desktop, but I can’t argue that building an Atom+Ion trouncer wasn’t very smart to do.

    This chip will do 90% of what the market needs from a laptop and could well bring Atom-priced CULV laptops up to CULV performance and slow iPad-domination of the pricepoint. If you can go out and buy a 11-15″ laptop for $300-400 with 8 hour battery life and the performance to run WoW and watch Full High Def videos, then that begins to look fairly attractive to a lot of people.

    I also expect people to grow a little weary of the limitations of an iPad and begin to want to do other things beyond web browsing, which might incline them to investigate more “armed and fully operational battle station”-worthy options. As in, a repeat of when people began buying netbooks en masse, then started buying CULV’s afterward because they found netbooks whetted their appetite for bigger things.

    In other words, I am saying to AMD, great job. I’m impressed. I’d totally look at this for a secondary system and I hope that someone manages to figure out the right combo of interface for Windows 7, hardware for the right price, and tablet to go with it. I’d love a tablet that harnessed Windows AND an APU like this one, but in order to do that, they need to realize Microsoft seems wholly incapable of providing them an appropriate UI and go make one for themselves. And who’s smart enough to do that and do it well?

    • Chrispy_
    • 9 years ago

    \o/

    Cheap, light, all-day battery life ultraportable that can run games less than three years old, and run video.

    I hope this is priced closer to an Atom than a CULV. Atom needs to die, and it won’t happen if there’s a significant enough discount over Zacate.

      • HisDivineShadow
      • 9 years ago

      I’d expect the day these laptops come at $400, Atom laptops will hit $200 regularly, with Ion straddling the middle at $300.

      This is not really that far away from what we’re seeing today (old Atom $230-$275, Atom+Ion $325-400).

        • Anonymous Coward
        • 9 years ago

        I don’t see why an Ion+Atom is going to beat Bobcat on cost.

          • NeelyCam
          • 9 years ago

          I think his point is that they’ll get priced based on the performance of the competition. If Atom+Ion sells for $350, this will sell for $400 because it’s slightly better.

    • codedivine
    • 9 years ago

    Single channel memory looks to be holding it back.

      • Hattig
      • 9 years ago

      Yeah, I think that at some point AMD will need to look at layered DRAM / eDRAM on the Ontario/Zacate package.

        • NeelyCam
        • 9 years ago

        That would make a lot of sense.

      • mczak
      • 9 years ago

      I think dual channel is out of the question, due to the large number of additional I/O pads needed. That said, an obvious improvement would be to use faster DDR3 – nowadays low-voltage (1.35V) DDR3 can be had at speeds up to DDR3-1600, whereas Zacate is limited to DDR3-1066.

    • Fighterpilot
    • 9 years ago

    I hope there was enough magic fairy dust left to sprinkle over Bulldozer as well.
    Looks like pushing the frequency of that Zacate would give a decent performance boost….can you/have you tried upping the clocks at all?

      • Anonymous Coward
      • 9 years ago

      AMD is likely looking for good yields and low engineer time, so I doubt Bobcat will ever clock terribly high. I imagine they choose yields over top-bin clock speed every time the choice comes up.

    • vvas
    • 9 years ago

    Goodbye Ion, it was nice knowing you.

    And now I’m really itching to see benchmark comparisons of Zacate’s GPU and Sandy Bridge’s GPU, because it looks like it could be pretty close!

    • moritzgedig
    • 9 years ago

    I fail to understand the need for HD video on nettops.
    Are they not for surfing and typing?
    It is like with cars: they keep adding small ones but then make them bigger with each iteration, so they have to keep adding new models on the low end.
    HD video is meant for big TVs, not tiny screens.

      • Voldenuit
      • 9 years ago

      HD video is becoming a big component of surfing. Youtube HD and sites like Vimeo for instance. And I can imagine a lot of people will soon find services like Hulu Plus, Netflix streaming etc attractive if they haven’t already.

        • moritzgedig
        • 9 years ago

        I find the 360p setting on YouTube sufficient; it saves data and is good enough for the laughing baby, drunk cat or whatever.

          • HisDivineShadow
          • 9 years ago

          And yet MOST people want 720p video because most videos added today support it. I don’t blame them as long as it doesn’t cost battery life. In this case, it would seem it does not.

          It is noteworthy that one of the biggest advantages of this APU config is the fact that the support for HD video is essentially free because the GPU component includes UVD3 since all modern Radeon parts do. I suspect pulling it out would net little benefit and deprive the system of one of the best benefits of having a built-in GPU.

          It’s also worth mentioning that Intel includes its own attempt at the same thing, even if its drivers are rather shoddy at actually supporting it.

      • Meadows
      • 9 years ago

      No, HD video is meant for devices that support it. Most portable computers can fully display 720p at least, so it’s perfectly acceptable to want to use it and test it.

      • Anonymous Coward
      • 9 years ago

      This AMD chip could end up in all sorts of places. They place no restrictions on its use, such as Intel does with Atom. It could easily show up in a machine with “HD” screen resolution.

      • Bauxite
      • 9 years ago

      hdmi port + hdtv = not bored crazy

      Hotel/office/random visit to relative or friend + netbook that isn’t gimped (e.g. atom) = play tons of stuff.

      Add some kind of internet (wifi/3g/whatever you can get your hands on) and then you can stream or do whatever.

      Screens are everywhere these days, and my 11″ portable has plugged into tons of them. You never know when you’ll want it next. 4 hours in the middle of the night at a vet clinic that I would’ve otherwise had to watch infomercials, read 2 years of golf digest or grab a useless nap which would’ve just made me groggier for the long drive back. Thank you amazing little culv netbook for saving my sanity.

      • Delphis
      • 9 years ago

      Yea .. they said it’s a 1366×768 screen and then run a 1080p video.. hmm.. only you don’t have 1920×1080 resolution, so isn’t that a bit silly? I guess watching videos on a flight might amuse some people.

        • TheEmrys
        • 9 years ago

        Particularly if you already have these videos on a computer.

      • obarthelemy
      • 9 years ago

      2 reasons:

      1- a nettop/netbook can double up as a TV-connected video player, grabbing videos off of my file server, when I’m at home, or plugged into my hosts/hotel tv…

      2- on the road, I don’t want to have to reencode files just for my nettop’s screen resolution. I want to grab whatever I feel like off my server, copy it onto my netbook, and sail into the wild blue yonder. No reencoding.

      • odizzido
      • 9 years ago

      If the video I want to watch happens to be larger than my screen I still want to be able to watch it properly.

      • derFunkenstein
      • 9 years ago

      A nettop is also a prime target for an HTPC. Play back Bluray and DVD, along with streaming videos from Netflix and the like.

    • Meadows
    • 9 years ago

    That no-frills photo reminds me of computers of old.

      • dpaus
      • 9 years ago

      I thought “Oh!! They put that poor APU into a waffle maker!”

    • basket687
    • 9 years ago

    Great article as always, but I have a suggestion (applicable to all your mobile articles): In the performance charts, why not write the (CPU+GPU) next to the name of each system? For example: When you say in the charts (Toshiba T235D), that really means nothing to me unless I go each time and check the “testing methods” page.
    Thank you.

      • Voldenuit
      • 9 years ago

      Seconded. I had to open two tabs and compare two pages side by side just to see which CPU/GPU was scoring where.

      Personally, I think the underlying hardware is what interests people more than the ODM/OEM/badge on the machine, and it would have been more informative to rank the systems by their CPU+GPU.

        • vvas
        • 9 years ago

        Additionally, it would be nice to describe a bit more in text what the contenders are. That specifications table has a lot of information to digest. For example, it took me a while to realize that the Pentium U5400 of the 1830TZ is of the same generation as the i3/i5!

    • Cyril
    • 9 years ago

    I’ve just updated the 7-zip and x264 scores for the Aspire 1830TZ. I only got this system a few hours ago, and those two benchmarks were run first, but the results for them turned out to be abnormally low. The difference doesn’t amount to much, though, and the commentary and conclusion are unaffected.

    • fent
    • 9 years ago

    Honestly, after all of the build up I was expecting it to be faster CPU-wise.

    • UberGerbil
    • 9 years ago

    So… what was the temperature of the inside of your mouth?

      • KarateBob
      • 9 years ago

        I was wondering the same thing, but I didn’t want another one of my posts and its subsequent replies to suddenly disappear.

        • sweatshopking
        • 9 years ago

        I find that to be a common problem.

    • Skrying
    • 9 years ago

    Looks wonderful on paper. So wonderful that I know when laptops using this platform ship none of them will be made with an ounce of care. We’ll get horrible touchpads, horrible keyboards, horrible screens, horrible plastic bodies. It’ll be Intel’s CULV platform all over again. Promising chip derailed by subpar implementation.

      • zima
      • 9 years ago

      Lenovo gives some hope – Brazos should be at home in the successor of the “Thinkpad” (not a true one, but decent) X100e. Here’s hoping that this time they won’t be so afraid of giving it a good battery in a misplaced effort to shield the “big” X-series…

      • Anonymous Coward
      • 9 years ago

      Are you saying Apple should make a Bobcat product?

    • KarateBob
    • 9 years ago

    When are systems based on this platform going to ship?

      • NeelyCam
      • 9 years ago

      JF-AMD in SemiAccurate mentioned Q1, I think

    • Voldenuit
    • 9 years ago

    Not unexpected. The 80-shader GPU is ~2x faster than the 40-shader Radeon Mobility 4225, but still disappointingly behind the GeForce 310M, which is not exactly a powerhouse in any sense of the word.

    They need to get their dualcore part down to 9W and into tablets, because they’re still not super competitive performance wise. They can probably undercut intel on price, but it doesn’t look like a game changer right now.

    • SNM
    • 9 years ago

    Wow. Suddenly I am wondering seriously if AMD will power Apple’s laptops in a few years. I doubt they’ll ever accept Brazos, but the Bulldozer-based Fusion products might give them a satisfactory 2-chip solution before Intel can give them one.

      • NeelyCam
      • 9 years ago

      By that time Intel will have Ivy Bridge, or maybe even Haswell.

      Although, I think Llano has a good window of opportunity, assuming AMD can get it yielding with GF 32nm SOI

      • LiamC
      • 9 years ago

      Some of them yes. Not a few years. A few months

        • derFunkenstein
        • 9 years ago

        Yeah, I tend to agree. Other APU parts, the higher-power, higher-performance parts which will hopefully finally scale beyond 2.5GHz, are a great fit for 13″ MacBooks because it’s an APU and a southbridge-type I/O chip and that’s all. The APU’s graphics will support OpenCL because all of AMD’s current products on OS X do, 4000 and 5000 series alike.
