AMD’s Brazos platform: The benchmarks

Exactly two weeks have passed since I got to sit down at AMD’s Austin campus and benchmark a Zacate development system. You might have read about my experience, which I seasoned with some fresh insights on the Brazos platform that Zacate powers, in my first preview article on November 8.

As I explained then, AMD put a momentary moratorium on the publication of Zacate benchmarks, so the best I could offer were vague and possibly misleading comparisons… and even vaguer hints about the performance of AMD’s first accelerated processing unit (APU). Well, the moratorium finally lifted tonight. The headline was probably a dead giveaway, but still: we’re now free to publish all of the numbers I collected in my few, brief hours of testing.

I’m not going to keep you folks waiting too long for the data—two weeks is a long enough time to wait. So, we’re going to skip the architecture and platform talk. We already provided that information in spades in the aforementioned preview article, as well as in Scott’s architectural overview of Bobcat. I should, however, take some time to explain the conditions of the test and lay out a few caveats, since this is one of the very rare cases where we tested neither a retail-boxed computer nor a system we had carefully configured ourselves.

Testing conditions

The misshapen contraption you see above is one of the Zacate test rigs AMD laid out for us lucky testers. Scott saw the same test chassis in San Francisco two months ago, when AMD gave an early demo of its APU running City of Heroes and Internet Explorer 9 with hardware acceleration enabled. This time, however, AMD let me use the system mostly unsupervised for several hours—enough time to run our new mobile benchmark suite, paving the way for a comparison between the Zacate system and several laptops we’ve reviewed so far, including the Nile-powered Toshiba Satellite T235D and a CULV 2010-based ultraportable.

AMD took care of configuring the test machine, outfitting it with the necessary cooling, peripherals, storage, display (an 11.6″, 1366×768 panel), and software. Windows 7 Professional x64 was pre-installed along with a full suite of drivers and some benchmarks.

In a shocking display of rudeness toward my hosts, I pooh-poohed the included benchmarks and instead whipped out a 32GB USB thumb drive containing our mobile test suite—plus a few Steam game backups. Re-installing Windows would have been a good way to ensure a clean testing environment, but that wasn’t really feasible. First, the drivers AMD had installed were pre-release versions not publicly available. And second, I had just enough time with the system to complete my testing. (To give you a rough idea, AMD handed out the keys to the preview system at around 10:00 AM, and I had to head to the airport at around 4:30 PM.)

By the way, that tight schedule also ruled out any battery life tests. Considering AMD quotes run times of at least 8.5 hours for Zacate systems, you can probably guess why.

As far as I could see, though, the Zacate test rig wasn’t up to anything unusual. CPU-Z reported a 1.6GHz CPU clock speed, signaling the quickest Zacate part, and Windows 7 said it had 4GB of memory at its disposal, minus around 400MB requisitioned by the integrated GPU. AMD told us it had set up the machines with solid-state drives, and the Windows Device Manager supported this claim, announcing a 128GB Crucial RealSSD C300. That’s not exactly the kind of drive you’d find on a cheap ultraportable, of course, but its fast read and write speeds proved helpful when installing benchmarks and restoring Steam backups—the clock was ticking, after all.

To sate my paranoid side, I also peeked into the Windows Task Manager and checked the process names. I encountered one service I hadn’t seen before, but after discussing it with AMD, I’m now reasonably satisfied that no tomfoolery was afoot. Eventually, all the poking and prodding gave way to some actual testing…

Our testing methods

Over the next few pages, you’ll see the Zacate test build compared to two full-sized Optimus notebooks, the Asus N82Jv and U33Jc; three consumer ultraportables, the Acer Aspire 1810TZ, Acer Aspire 1830TZ, and Toshiba Satellite T235D; one netbook, the Eee PC 1015PN; and one Consumer Ultra-Low-Voltage nettop, the Zotac Zbox HD-ND22.

The 1015PN, N82Jv, U33Jc, and T235D were all tested twice: once plugged in and once unplugged using included “battery-saving” profiles. In the case of the 1015PN, N82Jv, and U33Jc, we compared the battery-saving results to those obtained using built-in “high-performance” modes. The N82Jv’s battery-saving results were obtained with Asus’ Super Hybrid Engine enabled, as well, which dropped the CPU clock speed from 2.4GHz to 0.9-1GHz depending on load. On the U33Jc, the high-performance profile included by Asus raised the maximum CPU clock speed from 2.4 to 2.57GHz. On the Eee PC 1015PN, the included Super Hybrid Engine “Super Performance Mode” raised the CPU speed by 25MHz, while the “Power Saving Mode” limited the CPU to about 1GHz and disabled the Nvidia GPU.

All tests were run at least three times, and we reported the median of those runs.
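
For the curious, that reduction from raw runs to a reported score is nothing fancier than a median, which resists the occasional outlier run better than an average would. Here's a minimal sketch; the timings are made-up placeholders, not actual results:

```python
# Reduce repeated benchmark runs to a single reported score.
# A median shrugs off one anomalous run better than a mean does.
from statistics import median

def reported_score(runs):
    """Return the median of at least three benchmark runs."""
    if len(runs) < 3:
        raise ValueError("need at least three runs")
    return median(runs)

# Hypothetical SunSpider-style timings, in milliseconds:
print(reported_score([1520.4, 1498.7, 1503.2]))  # -> 1503.2
```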

I should note that, in my hurry, I seem to have misplaced the version numbers for the Zacate system’s graphics and audio drivers. Apologies for that. Also, for what it’s worth, CPU-Z wouldn’t report memory timings.

System | AMD Zacate test system | Acer Aspire 1810TZ | Acer Aspire 1830TZ | Asus Eee PC 1015PN | Asus N82Jv | Asus U33Jc | Toshiba Satellite T235D-S1435 | Zotac Zbox HD-ND22
Processor | AMD Zacate engineering sample 1.6GHz | Intel Pentium SU4100 1.3GHz | Intel Pentium U5400 1.2GHz | Intel Atom N550 1.5GHz | Intel Core i5-450M 2.4GHz | Intel Core i3-370M 2.4GHz | AMD Turion II Neo K625 1.5GHz | Intel Celeron SU2300 1.2GHz
North bridge | AMD Hudson FCH | Intel GS45 Express | Intel HM55 Express | Intel NM10 | Intel HM55 Express | Intel HM55 Express | AMD M880G | Nvidia Ion
South bridge | N/A | Intel ICH9 | N/A | N/A | N/A | N/A | AMD SB820 | N/A
Memory size | 4GB (2 DIMMs) | 3GB (2 DIMMs) | 3GB (2 DIMMs) | 1GB (1 DIMM) | 4GB (2 DIMMs) | 4GB (2 DIMMs) | 4GB (2 DIMMs) | 4GB (2 DIMMs)
Memory type | DDR3 SDRAM | DDR2 SDRAM at 667MHz | DDR3 SDRAM at 800MHz | DDR3 SDRAM at 667MHz | DDR3 SDRAM at 1066MHz | DDR3 SDRAM at 1066MHz | DDR3 SDRAM at 800MHz | DDR3 SDRAM at 1066MHz
Memory timings | N/A | 5-5-5-15 | 6-6-6-15 | 6-5-5-12 | 7-7-7-20 | 7-7-7-20 | 6-6-6-15 | 7-7-7-20
Audio | IDT codec | Realtek codec with 6.0.1.869 drivers | Realtek codec with 6.0.1.6043 drivers | Realtek codec with 6.0.1.6186 drivers | Realtek codec with 6.0.1.6024 drivers | Realtek codec with 6.0.1.6029 drivers | Realtek codec with 6.0.1.6072 drivers | Realtek codec with 6.0.1.5845 drivers
Graphics | AMD Radeon HD 6310 | Intel GMA 4500MHD with 15.17.11.2202 drivers | Intel HD Graphics with 8.15.10.2057 drivers | Intel GMA 3150 with 8.14.10.2117 drivers + Nvidia Ion with 8.17.12.5912 drivers | Intel HD Graphics with 8.15.10.2189 drivers + Nvidia GeForce 335M with 8.17.12.5896 drivers | Intel HD Graphics with 8.15.10.2119 drivers + Nvidia GeForce 310M with 8.17.12.5721 drivers | AMD Mobility Radeon HD 4225 with 8.723.2.1000 drivers | Nvidia Ion with 8.17.12.6099 drivers
Hard drive | Crucial RealSSD C300 128GB | Western Digital Scorpio Blue 500GB 5,400 RPM | Toshiba MK3265GSX 320GB 5,400 RPM | Western Digital Scorpio Blue 500GB 5,400 RPM | Seagate Momentus 7200.4 500GB 7,200 RPM | Seagate Momentus 5400.6 500GB 5,400 RPM | Toshiba MK3265GSX 320GB 5,400 RPM | Western Digital Scorpio Black 500GB 5,400 RPM
Operating system | Windows 7 Professional x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Starter x86 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64 | Windows 7 Home Premium x64

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Application performance

SunSpider JavaScript benchmark

The SunSpider JavaScript benchmark has become a staple of browser testing around the web, usually serving to highlight differences in JavaScript execution speeds between browser revisions. Today, we’ll be looking at SunSpider performance with the same browser (Firefox 3.6.9) across multiple notebooks.

Zacate is off to a respectable start, trailing our CULV-based Zbox HD-ND22 by less than 10% and handily outpacing the Eee PC 1015PN’s dual-core Atom running at 1.5GHz.

7-Zip

Odds are, anyone with a computer will need to compress or decompress some files every once in a while. To see how these systems handled that task, we ran 7-Zip’s built-in benchmark and jotted down the results for both compression and decompression.

AMD’s new APU kicks things up a notch in 7-Zip, zooming ahead of entry-level CULV and CULV 2010 systems alike and nipping at the heels of the Toshiba notebook’s Turion II Neo.

TrueCrypt

Next up: file encryption. Because who wants any two-bit thief to have access to his sensitive data? We ran TrueCrypt’s built-in benchmark and averaged the results for all of the different encryption schemes.

x264 video encoding

Last, but not least, we took our notebooks through the x264 high-definition video encoding benchmark.

In TrueCrypt, the Zacate system lies smack-dab in CULV territory. Moving on to a more intense video-encoding test, however, the CULV systems (and even our AMD Nile laptop) speed ahead, leaving the APU in the no-man’s-land between the Atom and the rest of the market. To be fair, video encoders are among those apps that folks generally don’t run on their ultraportables and netbooks.

That gives us a decent glimpse into the performance of Zacate’s dual Bobcat CPU cores. What about the APU’s graphics component?

Gaming

Call of Duty 4: Modern Warfare

Infinity Ward’s first Modern Warfare title is growing somewhat long in the tooth, but it still has a strong following in multiplayer circles. More importantly, it’s a good representative of the type of game you might want to play on a notebook that lacks serious GPU horsepower: not too old, but not too new, either. We tested Call of Duty 4 by running a custom timedemo, first at 800×600 with the lowest detail options, then again at 1366×768 with everything cranked up except for v-sync, antialiasing, and anisotropic filtering, which were all left disabled. (With the Eee PC, we opted for the 1024×600 native resolution instead of 1366×768, and with the Zbox HD-ND22, we were only able to use a resolution of 1360×768.)

Oh my. Although Zacate falls a reasonable distance behind our Toshiba Nile notebook in CPU tests, it speeds past in Modern Warfare. The only system in the same class that comes close is the Zbox, which is powered by a first-generation Nvidia Ion chipset.

Far Cry 2

Ubisoft’s safari-themed shooter has much more demanding graphics than CoD4, so it should really make our notebooks sweat. We selected the “Action” scene from the game’s built-in benchmark and ran it in two configurations: first at 1366×768 in DirectX 10 mode with detail cranked up, then at that same resolution in DX9 mode with the lowest detail preset. Vsync and antialiasing were left disabled in both cases. (Again, the Eee PC was run at 1024×600, since that’s the highest resolution its display supports, and the Zbox was run at 1360×768.)

We see a similar picture in Far Cry 2, where the Zacate and Zbox basically end up neck-and-neck.

That’s it for our conventional gaming benchmarks. Next up: some freestyle game testing.

Off the beaten path

Scientific benchmarks or not, we like to install different games on our laptops and manually tweak the options to see how well they run. A little subjective testing never hurt anybody, right?

I kicked off my freestyle gaming tests with DiRT 2, a long-time TR favorite and one of the better racing games out on the PC. At 1366×768 with the “low” preset, the demo’s Morocco track unfurled at a solid 20 FPS, give or take two or three. Frame rates dropped into the low teens upon crashes, but the game was surprisingly smooth and playable overall.

Next up was Left 4 Dead 2, which I ran at 1366×768 with trilinear filtering, no antialiasing, high shader detail, and medium effect, model, and texture detail. In the first map of the Dead Center campaign, frame rates ranged from a low of about 13 to a high of 36 FPS. From a seat-of-the-pants perspective, the game was completely playable despite notable choppiness during heavy action. Those massive zombie swarms aren’t easy on low-end hardware, but Zacate did a reasonably good job of keeping things smooth.

Feeling emboldened by these good results, I tried the Just Cause 2 demo. Things didn’t go so well there. At 800×600 with all the detail options turned down, the Zacate system yielded frame rates in the 18-20 FPS range in the first town. That’s just not good enough for a fast-paced action game, unfortunately. I had trouble shooting and driving straight, wishing there were somehow a way to turn the graphics detail further down.

I ended my freestyle game testing with a short round of Alien Swarm, specifically the single-player training level. At 1366×768 with antialiasing off, trilinear filtering, low shader and effect detail, and medium model and texture detail, the game chugged along with highs of nearly 30 and lows in the 15-17 FPS range during heavy action. I’d say the game was definitely playable overall, and it didn’t look half bad, as you can see above.

Video playback

I tested video decoding performance by playing the Iron Man 2 trailer in a variety of formats. Windows Media Player was used in full-screen mode for the H.264 QuickTime clips, while Firefox was used for the windowed YouTube test. In each case, I used Windows 7’s Performance Monitor to record minimum and maximum CPU utilization for the duration of the trailer.

Clip | CPU utilization | Result
Iron Man 2 H.264 480p | 9-40% | Perfect
Iron Man 2 H.264 720p | 4-53% | Perfect
Iron Man 2 H.264 1080p | 6-62% | Perfect
Iron Man 2 YouTube 720p windowed | 32-82% | Choppy
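
Reducing a utilization log to minimum and maximum figures like those above is straightforward. Here's a minimal sketch in Python; the sample values are hypothetical stand-ins, not what Performance Monitor actually recorded:

```python
# Reduce a CPU-utilization log to the min/max figures reported above.
# The sample values here are hypothetical, not actual perfmon output.
def utilization_range(samples):
    """Return (min, max) CPU utilization over a playback run."""
    if not samples:
        raise ValueError("no samples recorded")
    return min(samples), max(samples)

samples = [9.2, 14.8, 31.5, 40.1, 22.3]  # hypothetical 480p-clip samples
low, high = utilization_range(samples)
print(f"{low:.0f}-{high:.0f}%")  # -> 9-40%
```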

Rest assured, Zacate’s UVD3 video decoding logic does a terrific job with H.264 content in Windows Media Player. Playing back high-definition Flash clips in Firefox turned out to be another story, though—there were quite a lot of dropped frames, even when I tried running the video in full-screen mode.

I asked the AMD folks about this, and they told me they’d been able to get smooth high-def Flash playback out of one of the same test systems without issue. Puzzled, I tried updating Firefox and the Flash plug-in to the latest versions. No dice. After some more prodding, I learned that AMD had run its internal tests in Internet Explorer. Sure enough, the same video played back as smooth as silk in Internet Explorer 8 with the latest Flash ActiveX plug-in. Interesting. I’d probably chalk this issue up to immature drivers. Hopefully, final Zacate systems will be able to deliver smooth Flash video in any browser.

Temperatures and power consumption

I didn’t bring an infrared thermometer or a power meter along, but AMD was kind enough to provide some, so I took the bait and got some readings while running the benchmarks you saw on the previous pages.

First, I amused myself with the infrared thermometer, pointing the beam at various surfaces (and the inside of my mouth, when the AMD guys weren’t looking). I measured 27.4°C at the base of the APU heatsink with the system at idle and 32.5°C during our Far Cry 2 benchmark’s third consecutive run. The fan speed seemed pretty much constant, and it was barely louder than a whisper.

I also checked the power meter while the Far Cry 2 test was finishing: it reported 33.6W draw at the wall. In x264, a purely CPU-bound benchmark, power draw at the wall was about 26W. Keep in mind these figures are for the entire system and were taken upstream of the power adapter, which AMD told me has an efficiency of around 83%.
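
For reference, those wall readings can be converted to rough DC-side numbers using the adapter efficiency AMD quoted. A quick back-of-the-envelope sketch; the 83% figure comes from AMD, and the conversion itself is simple arithmetic:

```python
# Estimate DC-side system draw from wall measurements, assuming the
# ~83% adapter efficiency AMD quoted: DC draw = wall draw * efficiency.
ADAPTER_EFFICIENCY = 0.83

def dc_draw(wall_watts, efficiency=ADAPTER_EFFICIENCY):
    """Approximate power actually delivered to the system, in watts."""
    return wall_watts * efficiency

for label, wall in [("Far Cry 2", 33.6), ("x264", 26.0)]:
    print(f"{label}: {wall:.1f}W at the wall is roughly "
          f"{dc_draw(wall):.1f}W on the DC side")
```

By that estimate, the Far Cry 2 load of 33.6W at the wall works out to just under 28W delivered to the system itself.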

Conclusions

AMD has pulled off quite an impressive feat with Zacate. On the CPU front, the dual Bobcat cores clearly play in the same league as Intel’s entry-level CULV and CULV 2010 processors. That means performance largely adequate for everyday tasks, from web surfing to file compression. Even enthusiasts using an ultraportable or a nettop as their second or third PC ought to find this kind of performance acceptable.

On the graphics front, meanwhile, Zacate succeeds in outpacing both previous AMD solutions and current Intel ones by a fair margin. The only system that came close was our Zotac nettop, which has an Nvidia Ion integrated graphics chipset. You’ll be hard-pressed to find any ultraportable notebooks toting that kind of hardware these days, though; the closest thing you’d probably find would be something with a discrete, next-generation Ion GPU, and that’s a different class of hardware—nothing like the elegant two-chip Zacate-Hudson combo AMD will offer as part of the Brazos platform.

As I said last time, the only question that remains is battery life. If Zacate manages to match or exceed current solutions in terms of run time, which seems entirely possible considering the Brazos platform’s very spartan power draw, then AMD might just end up with the most attractive ultraportable platform on the market early next year. That would be a refreshing first, after the misses and near-misses we’ve seen from the company so far.

Of course, Intel isn’t asleep at the switch. With such heated competition from its chief rival, perhaps it won’t hesitate to whip up some Sandy Bridge-based CULV 2011 solutions, which could have considerably better graphics performance than current, CULV 2010 offerings. And the next generation of Atom processors, also due next year, might give netbooks some much-needed oomph. There’s no telling whether Intel will be able to reach the sweet spot of power efficiency and performance AMD appears to have achieved with Zacate, though.

Comments closed
    • UberGerbil
    • 9 years ago

    http://www.anandtech.com/show/2828/5 (I haven't seen benchmark numbers for Pineview but I suspect they aren't much better, and certainly not better than the ION systems)

      • NeelyCam
      • 9 years ago

      “Casual gamers” – yeah, I’ve heard of those. I appreciate that there is a whole world of people out there that I haven’t met, and this casual gaming market might materialize.

      It’ll be interesting to see if that happens or not. I’m betting on Ontario/Zacate being largely a flop, but I’ve been wrong before…

        • JumpingJack
        • 9 years ago

        Though this is from 2005, at that time an estimated 56 million people played casual games (http://www.businessweek.com/magazine/content/05_40/b3953044.htm): "... an estimated 56 million adults worldwide who may not care when the next Madden NFL Football is released but who still want a little game time. Call them the casual gamers: a set of folks loosely defined as those who have broadband but play less frequently -- and who want less complex games -- than the much smaller hard-core gaming crowd." This would imply that hard-core gamers (those who frag their best buds in a casual match in COD2 or TF2) are in the minority.

        Hard-core gamers typically do not go for the casual games, naturally (http://news.cnet.com/Casual-games-get-serious/2100-1043_3-6071465.html): "Initially RealNetworks targeted hard-core gamers who played games like 'Doom' and 'Quake' and tried to get them to pay to play games online. 'It was abysmal,' said Senior Vice President Michael Schutzler. 'It did not go well at all.'"

        There is good reason why the Wii was so successful (http://www.casualgaming.biz/news/28130/Casual-gaming-underappreciated): a large part of the end-user base does not care for the fancy FPS, blow-it-up-with-4xAA-quality images and graphics; they like their Bejeweled and Plants vs. Zombies better.

      • FuturePastNow
      • 9 years ago

      AMD may also be betting on OpenCL actually seeing use. I’m not sure programmers can be bothered to write software which offloads work to a GPU, but it could be a big boost for things like video encoding.

      (Or, more likely to happen, Microsoft’s DirectCompute API)

    • IntelMole
    • 9 years ago

    Assuming decent battery life and pricing, in essence, buying this vs. a CULV chip from Intel is taking a bet that GPGPU takes off for the really heavy lifting.

    Given that there are now more than a few examples of GPUs being used, that might be reasonable.

      • FuturePastNow
      • 9 years ago

      I’d pick this simply because I have shelves of games to play that, while old, still require better 3D graphics performance than Intel can deliver. This should run shooters and action games from 5-10 years ago quite well; a CULV or Atom won’t.

      I also need a small, light laptop to travel with, and I like the idea of taking a few older games with me.

    • LiamC
    • 9 years ago

    I want Bobcat for appliances. Think a firewall box or NAS.

      • Voldenuit
      • 9 years ago

      Agreed. A Bobcat Boxee should be interesting.

      It just doesn’t appeal to me as an attractive netbook option since it’s not much faster than dual-core Atom.

        • LiamC
        • 9 years ago

        I too was hoping this would make a good thin and light (13″~14″), but it doesn’t look to have the CPU grunt I want. I can’t figure out if the bottleneck is the CPU or the graphics; it would probably need comparing against a Radeon 5450-equipped CULV.

    • indeego
    • 9 years ago

    meh. My 7-zip is 4 times this. Can’t go backwards, even for portability’s sake <.<

    • burntham77
    • 9 years ago

    I just like the idea of not buying an Intel system. This is good stuff. I’d pick one up as a cheap portable for sitting on the couch.

    • StashTheVampede
    • 9 years ago

    Give me the most powerful chip, a 15″ screen, 4GB of RAM, and an SSD: how many hours of battery life, and what’s the cost? Swap the screen to 13″: what are the battery life and cost then?

    Chip isn’t as fast as Intel C2D, but we’re looking at “good” performance at very lower power numbers.

      • axeman
      • 9 years ago

      BAH! The best use for a low power chip is in a small and light laptop, which still has excellent battery life. I’d like to see more 11-12″ laptops with “good enough” performance.

        • StashTheVampede
        • 9 years ago

        Not everyone wants the smaller, sub-13″ screen. MOST of the time those smaller screens lead to smaller keyboards (which annoy me to no end).

        Why not have a 15″ screen on that platform?

    • FuturePastNow
    • 9 years ago

    I was hoping for more, but it’s better than Atom in every way. I can settle for that.

      • NeelyCam
      • 9 years ago

      For now. I think Cedar Trail will change that.

        • FuturePastNow
        • 9 years ago

        Cedar Trail won’t be significantly more powerful than Pineview. The only real performance improvement it adds is hardware 1080p video decoding. It will also use a lot less power. That is the direction Intel wants to take Atom: into potentially smaller devices with better battery life.

        Anyway, Cedar Trail isn’t coming until the middle of next year.

          • NeelyCam
          • 9 years ago

          I was talking about the power consumption reduction. The current Atom chipsets are kind of inefficient… the CPU itself consumes almost no power (see idle vs. full load).

            • OneArmedScissor
            • 9 years ago

            I’m hesitant to think that anything but a complete redo of the way laptops are built will change battery life much. It looks like even Core 2s AND their chipsets hardly use anything.

            The problem is that despite integrating so many parts into one chip, they keep making all the parts do more and more things that most people just don’t need…and now they ROMG TURBO BLOOOOST to bazillion GHz even if that accomplishes nothing for you. Pentium 4, is that you in there?

            Atom and Bobcat are both steps in the right direction, but ARM-based “smartbooks” could very well take a big, steamy dump all over them. Nobody will expect those to have whacky PCIe setups and oodles of desktop speed RAM with a squillion traces running all over an entirely too fancy PCB. Just stick a beefed up smartphone chip in there with a low power SSD and it’s good to go.

            • DavidC1
            • 9 years ago

            Why does everyone think the chipsets are inefficient on the Atom?

            Hello?! The 2010 Atoms are called Pineview, which integrates most of the chipset functions! The companion chipset has a TDP of only 1.5W. Even the nettop companion chipset is only at 2.2W. The NETTOP Atoms suck in power usage, which is what they use to compare the Brazos platform with. Let’s see a Brazos netbook vs. Atom netbooks and see how the power consumption comparison goes then.

            • OneArmedScissor
            • 9 years ago

            People often confuse “chipset” with “the way every single PC has been built for years.”

            The “computer” part isn’t changing, folks, which is the point I was trying to make. Put a 1W TDP combination CPU/NB/SB chip in a laptop and it’s still going to suck down 5W or so.

            To see any real difference, we’ll need new screens with no backlights, cut down motherboards, different RAM altogether, and netbook-class SSDs instead of server-class.

            The trouble is that you cannot even really do any of that to a laptop by switching parts out yourself. It’s a fundamental PC design issue.

    • swaaye
    • 9 years ago

    It looks a little better than the boring Nano from both power and speed standpoints. The GPU looks a little better than a low end card. Exciting. Heh.

    It will probably make annoyingly warm and noisy netbooks. My Eee900 with its Celeron 900 and GMA 900 already gets quite hot playing anything 3D, and that’s about the same power output, I think.

      • axeman
      • 9 years ago

      This uses less power than the chipset alone does in your case AFAIK.

        • swaaye
        • 9 years ago

        I also forgot that ICH7 is in there too. 🙂

    • OneArmedScissor
    • 9 years ago

    Oh, you kids and your wild speculation on synthetic benchmarks. I’m sure a CPU with integrated just about everything is merely on par with Atom, which has integrated nothing and wonderful FSB interlinks holding it together. You absolutely won’t be able to tell the difference between the two in regular use, just like you can’t tell Atom from an Athlon 64. Wait a minute…

    *Earth to dingdongs! Earth to dingdongs!*

    If it doesn’t compress 14GB ZIP files as fast, but still gets good battery life, I think everyone else who doesn’t obsess over incomplete numbers will live.

    Wait for the battery figures before you get out your pitchforks.

      • NeelyCam
      • 9 years ago

      Well said.

      But speculators need to speculate.

    • NeelyCam
    • 9 years ago

    I wish TR reviews would list the CPUs along with the laptop model names. How am I supposed to know what Asus 85012xkt has in it? Makes comparison difficult.

    I preferred the way Anand’s graphs list the CPUs.

    • kamikaziechameleon
    • 9 years ago

    I would pick AMD over Intel in this arena for a couple of reasons: they will have a true GPU on board, and on top of that, Intel isn’t yet capable of releasing competent drivers.

      • sweatshopking
      • 9 years ago

      that’s really only 1 reason. you like the gpu better.

        • NeelyCam
        • 9 years ago

        Concise and to the point. But where’s the trolling?

        • shank15217
        • 9 years ago

        No, it’s two reasons: good drivers and a fast GPU. Where did you miss that?

      • sweatshopking
      • 9 years ago

      PLUS INTEL CPU’S ARE FART POWERED!

        • NeelyCam
        • 9 years ago

        And there it is.

        It gives you warm and fuzzies to know that there are some things you can always count on.

    • neon
    • 9 years ago

    Does the APU have a product name yet?

    I assume it will not be AMD Zacate E350, but rather AMD Turion3 E350 or AMD Fusion E350 or AMD RockOn 11900/BBQ/XLT

      • sweatshopking
      • 9 years ago

      They’re calling it AMD Awesome Powerful Ultimate (APU) platform.

    • TheEmrys
    • 9 years ago

    What’s with the Ageia Physx installation? Was it a factor in Just Cause 2? This just looks odd to me that Physx would be installed in an AMD cpu/gpu system.

    • flip-mode
    • 9 years ago

    I don’t know what to think. This market does not excite me much. Maybe for netbook users this is good news and hope for future tablets. For me, I just see a really slow processor and GPU paired up together. I’d love to have a file server / linux box based on one of these, but that’s much more of a luxury than a need and it falls pretty low on my prioritized list of luxuries too.

    I hope the product does well for AMD but I don’t see it as useful for me.

      • obarthelemy
      • 9 years ago

      most people I know (except gamers, and even them…) run incredibly outdated PCs. 5-10 years ago I liked fixing PCs up as a hobby, and selling/giving them on… I got a call last month about a friend of my parents’ with a Celeron 700-based MSI pizza-style mini PC (5371?) I fixed up for him when that CPU was already old. He wanted another PC, a laptop, to move around with. Still perfectly happy with the Celeron, though.

      Even an Architect friend of mine is still using some kind of Athlon (not even XP nor II) PC with an old version of AutoCAD, “‘coz it works fine”.

      So my take on the chip is that it’s what the world needs: powerful enough for almost everything at home and at work, plays all video formats, cheap, more reliable (due to integration). It would be an upgrade for 80-90% of all the PCs I know. ‘cept mine :-p

        • flip-mode
        • 9 years ago

        So true. In fact, I’m typing this on a 4-year old PC I’m presently prepping to pass on to my dad, who is using a PC I passed on to him some 4 or 5 years ago, which is an Athlon XP 2600 with an ATI Rage 128 32MB video card. The only reason he’s decided to upgrade is because he bought a new (used) monitor that’s 1600×1200 and I suggested that his current PC will probably be sluggish on it.

      • shank15217
      • 9 years ago

        A file server/NAS box wouldn’t need a GPU at all, making these a bad choice.

        • OneArmedScissor
        • 9 years ago

        So a Core i7 would be better?

          • shank15217
          • 9 years ago

          A single-core Nile Athlon at 9W would outperform it on I/O-intense tasks.

            • OneArmedScissor
            • 9 years ago

            A 1.2 GHz single core wins vs. a 1.6 GHz, almost directly clock for clock comparable dual-core? I don’t think so, Tim.

            Problem: if you’re talking about mobile parts, where are you going to get a mobile board with no GPU? I’m not aware one exists.

            If you go with desktop parts, well, I know what a file server REALLY doesn’t need, and that’s a high power northbridge with over 9,000 PCIe lanes like a 7/890FX board has.

            Bobcat nettops will likely idle at 15-20w. There isn’t a current AMD desktop board that will get even remotely close to that. It would just be a waste of electricity.

        • obarthelemy
        • 9 years ago

        in my experience, having some kind of graphics on any box makes everything so much easier: installation, troubleshooting, updates and upgrades… IGPs are better for this than discrete cards, since they don’t impact reliability, power usage, size… as much

          • UberGerbil
          • 9 years ago

          Agreed. I want an IGP in all my motherboards, just because it’s handy when a discrete GPU is unavailable/being troubleshot/unnecessary/whatever.

            • flip-mode
            • 9 years ago

            Thank gosh there’s someone that shares this exact sentiment with me! Every review of an AMD xxxGX chipset says “but I just don’t see the point of an IGP on an enthusiast motherboard” and I’m like “but it’s great for troubleshooting and for the eventual demotion to 2nd-class computer where low power is awesometown”.

    • HisDivineShadow
    • 9 years ago

    Looks like Apple’s dream CPU+GPU config to me for a MacBook Air. Imagine how thin a system like this could be after a couple of iterations, when performance is two or three times what this one offers.

    It does make one wonder how much performance we can expect from its big brother, Bulldozer. Now that APU could well end up being exactly what Apple is looking for.

    Then again, Sandy Bridge might, too, if Intel can get their GPU drivers up to snuff. With Bulldozer so far away for portable systems (and Llano standing in for it in the meantime), it looks like I’ll wind up with another Intel system this coming year. It’s a shame. I wish AMD had put more emphasis on portable systems, less on the ever-declining desktop, but I can’t argue that building an Atom+Ion trouncer wasn’t very smart to do.

    This chip will do 90% of what the market needs from a laptop and could well bring Atom-priced CULV laptops up to CULV performance and slow the iPad’s domination of that price point. If you can go out and buy an 11-15″ laptop for $300-400 with 8-hour battery life and the performance to run WoW and watch full-HD videos, then that begins to look fairly attractive to a lot of people.

    I also expect people to grow a little weary of the limitations of an iPad and begin to want to do other things beyond web browsing, which might incline them to investigate more “armed and fully operational battle station”-worthy options. As in, a repeat of when people began buying netbooks en masse, then started buying CULV’s afterward because they found netbooks whetted their appetite for bigger things.

    In other words, I am saying to AMD: great job. I’m impressed. I’d totally look at this for a secondary system, and I hope that someone manages to figure out the right combo of interface for Windows 7, hardware for the right price, and tablet to go with it. I’d love a tablet that harnessed Windows AND an APU like this one, but in order to do that, they need to realize Microsoft seems wholly incapable of providing them an appropriate UI and go make one for themselves. And who’s smart enough to do that and do it well?

    • Chrispy_
    • 9 years ago

    \o/

    Cheap, light, all-day-battery-life ultraportable that can run games less than three years old, and run video.

    I hope this is priced closer to an Atom than a CULV. Atom needs to die, and that won’t happen if Atom keeps a significant enough discount over Zacate.

      • HisDivineShadow
      • 9 years ago

      I’d expect the day these laptops come at $400, Atom laptops will hit $200 regularly, with Ion straddling the middle at $300.

      This is not really that far away from what we’re seeing today (old Atom $230-$275, Atom+Ion $325-400).

        • Anonymous Coward
        • 9 years ago

        I don’t see why an Ion+Atom is going to beat Bobcat on cost.

          • NeelyCam
          • 9 years ago

          I think his point is that they’ll get priced based on the performance of the competition. If Atom+Ion sells for $350, this will sell for $400 because it’s slightly better.

    • codedivine
    • 9 years ago

    Single channel memory looks to be holding it back.

      • Hattig
      • 9 years ago

      Yeah, I think that at some point AMD will need to look at layered DRAM / eDRAM on the Ontario/Zacate package.

        • NeelyCam
        • 9 years ago

        That would make a lot of sense.

      • mczak
      • 9 years ago

      I think dual channel is out of the question due to the large number of additional I/O pads needed. That said, an obvious improvement would be to use faster DDR3 – nowadays low-voltage (1.35V) DDR3 can be had at speeds up to DDR3-1600, whereas Zacate is limited to DDR3-1066.

    • Fighterpilot
    • 9 years ago

    I hope there was enough magic fairy dust left to sprinkle over Bulldozer as well.
    Looks like pushing the frequency of that Zacate would give a decent performance boost… can you/have you tried upping the clocks at all?

      • Anonymous Coward
      • 9 years ago

      AMD is likely looking for good yields and low engineer time, so I doubt Bobcat will ever clock terribly high. I imagine they choose yields over top-bin clock speed every time the choice comes up.

    • vvas
    • 9 years ago

    Goodbye Ion, it was nice knowing you.

    And now I’m really itching to see benchmark comparisons of Zacate’s GPU and Sandy Bridge’s GPU, because it looks like it could be pretty close!

      • DaveJB
      • 9 years ago

      Only if we’re talking about the Sandy Bridge IGP with 12 EUs. By all indications the 6 EU model isn’t going to be significantly faster than the current Clarkdale IGP, so it’d probably still get its backside handed to it by Zacate.

        • DavidC1
        • 9 years ago

        Even the 6 EU Sandy Bridge graphics should be significantly faster than Clarkdale.

        Each EU is 2x as powerful as the previous gen’s, has a graphics Turbo mode, and can share the L3 cache. There are some other minor changes as well.

      • NeelyCam
      • 9 years ago

      Zacate’s GPU doesn’t matter if the CPU is too slow to compete.

      I think there are four classes of systems:

      1) Compute-focused systems. Only the most powerful, power-efficient CPUs are included. X6’s, i5/i7’s etc. Graphic performance secondary.
      2) Gaming systems. Powerful graphics needed, CPU not as important. Main-stream 2-/4-/6-cores plus discrete graphics. IGPs are worthless in this class.
      3) Main-stream systems. Snappy browsing, “productivity”, maybe HTPC usage. Good CPUs, mediocre graphics, low price. Athlons+IGP chipsets, C2D/i3/i5/SandyBridge… Only low-end discrete graphics or IGPs needed.
      4) Bottom-of-the-barrel low-cost browse/email/facebook tools. Low-performance CPUs + crappy graphics. Atom plus Intel IGP, AMD’s laptop solutions, Via, ARM.

      Zacate doesn’t really fit in any of these categories. Graphics would qualify it for the main-stream systems and then some, but CPU is too weak. They would do well in the bottom-of-the-barrel section, but graphics seem overkill, and will definitely come with a price.

      It’s like AMD is trying to create a new, low-grade gaming class, and I don’t see it happening.

        • flip-mode
        • 9 years ago

        WTF? I suppose if you ignore nettops, netbooks, and high end tablets, there’s no market for these. If you’re interested in a nettop or netbook, I could not see choosing an Atom based device (with or without Ion) over a Brazos based device. You could make the argument that Atom can reach to lower power consumption levels so you could get over 9000 hours of battery life, but the group of people that needs more than 8 hours on such a device has to be extremely small.

          • poulpy
          • 9 years ago

          q[

            • NeelyCam
            • 9 years ago

            Read again – Atom is in “class 4”

          • NeelyCam
          • 9 years ago

          If you’re interested in a nettop, then yes – Zacate could make a bit of sense, but I would still pick something with a more powerful CPU (e.g., CULV).

          Netbook: if you don’t need much CPU horsepower, you pick Atom over Zacate for better battery life. If you need more CPU power, you pick CULV (still with better battery life).

          Zacate, as it is, missed the mark… or rather, hit the wrong mark. It would’ve been better if 1) the CPU were better – good enough to beat CULV – or 2) the die size had been cut in half by shrinking the graphics core, bringing the cost down to undercut Atom in a big way.

          I think option #1 would be better, as the cost of a netbook is mostly in non-CPU stuff: for 10-20% more cost you could have a much better CPU and much better performance. This is why the CULV platform is so fantastic.

          AMD decided to shoot for improving the GPU instead of the CPU, and although they did it in a cost-effective manner, I think they aimed at the wrong target. For users who think an Atom-class CPU is sufficient, such a powerful IGP is overkill.

          And do you really think this is cheaper to sell than Atom, even if the chip is a bit smaller? Intel can extract margins from manufacturing as well, while with Zacate, TSMC takes a cut. We’ll see how cheap these things really are…

          Although few of you would believe it, I was truly excited about Ontario/Zacate when the story was that it would have “90% of mainstream CPU performance” and be much cheaper, with good graphics to boot. This could’ve been the thing to make me switch from Intel to AMD. Unfortunately, it turns out that “mainstream” meant low-grade AMD CPU.

          The sucky CPU performance was disappointing, and I have no reason to switch now.

            • TO11MTM
            • 9 years ago

            Well, one interesting thing I noticed: this Brazos sample platform’s whole-system power draw (including monitor, etc.) matches the Zotac Ion mini-ITX board’s pretty well. And while the desktop Atom likely has a higher power draw, that figure didn’t include driving a monitor either.

            So, in short, Better battery life without too much more power draw is what it sounds like.

            https://techreport.com/articles.x/16893/4

        • OneArmedScissor
        • 9 years ago

        All of those types of computers “didn’t fit” anywhere when they were first conceived. And here we are…

          • NeelyCam
          • 9 years ago

          Fair enough. To me it seems like AMD is trying to create a new class, but it’s just a hybrid of current classes – an Atom with an oversized GPU. Sort of like Atom+Ion platform, with a slightly lower power consumption.

          I never had an Atom+Ion myself, but I’ve heard people bitching about its crappy CPU, and since I had an Atom netbook myself, I can completely understand why they’d say that. Creating yet another shitty CPU is not the way to make a market breakthrough.

            • OneArmedScissor
            • 9 years ago

            I think you’re being a bit judgemental. Atom is built like an ancient CPU. This is extremely modern. Why does it need to be faster than anything else that already exists? If it’s on par, it’s fine.

            It’s not a hybrid of any two things. It’s the modernized version of the Core 2 ULV platform, which Intel has failed to produce.

            I’m not sure they’re trying to make a new anything. They’re just fishing for the “good enough” point for everything in one box, as far as it concerns not only most people, but when you get down to it, all *[

    • moritzgedig
    • 9 years ago

    I fail to understand the need for HD video on nettops.
    Aren’t they for surfing and typing?
    It’s like with cars: they keep adding small ones, but then make them bigger with each iteration, so they have to keep adding new models on the low end.
    HD video is meant for big TVs, not tiny screens.

      • Voldenuit
      • 9 years ago

      HD video is becoming a big component of surfing. Youtube HD and sites like Vimeo for instance. And I can imagine a lot of people will soon find services like Hulu Plus, Netflix streaming etc attractive if they haven’t already.

        • moritzgedig
        • 9 years ago

        I find the 360p setting on YouTube sufficient; it saves data and is good enough for the laughing baby, drunk cat, or whatever.

          • HisDivineShadow
          • 9 years ago

          And yet MOST people want 720p video because most videos added today support it. I don’t blame them as long as it doesn’t cost battery life. In this case, it would seem it does not.

          It is noteworthy that one of the biggest advantages of this APU config is the fact that the support for HD video is essentially free because the GPU component includes UVD3 since all modern Radeon parts do. I suspect pulling it out would net little benefit and deprive the system of one of the best benefits of having a built-in GPU.

          It’s also worth mentioning that Intel includes its own attempt at the same thing, even if its drivers are rather shoddy at actually supporting it.

      • Meadows
      • 9 years ago

      No, HD video is meant for devices that support it. Most portable computers can fully display 720p at least, so it’s perfectly acceptable to want to use it and test it.

      • Anonymous Coward
      • 9 years ago

      This AMD chip could end up in all sorts of places. AMD places no restrictions on its use the way Intel does with Atom. It could easily show up in a machine with “HD” screen resolution.

      • Bauxite
      • 9 years ago

      hdmi port + hdtv = not bored crazy

      Hotel/office/random visit to relative or friend + netbook that isn’t gimped (e.g. atom) = play tons of stuff.

      Add some kind of internet (wifi/3g/whatever you can get your hands on) and then you can stream or do whatever.

      Screens are everywhere these days, and my 11″ portable has plugged into tons of them. You never know when you’ll want it next. Four hours in the middle of the night at a vet clinic, during which I would’ve otherwise had to watch infomercials, read two years of Golf Digest, or grab a useless nap that would’ve just made me groggier for the long drive back. Thank you, amazing little CULV netbook, for saving my sanity.

      • Delphis
      • 9 years ago

      Yea… they said it’s a 1366×768 screen and then ran a 1080p video. Hmm. Only you don’t have 1920×1080 resolution, so isn’t that a bit silly? I guess watching videos on a flight might amuse some people.

        • TheEmrys
        • 9 years ago

        Particularly if you already have these videos on a computer.

      • obarthelemy
      • 9 years ago

      2 reasons:

      1- a nettop/netbook can double up as a TV-connected video player, grabbing videos off my file server when I’m at home, or plugged into my host’s/hotel’s TV…

      2- on the road, I don’t want to have to reencode files just for my nettop’s screen resolution. I want to grab whatever I feel like off my server, copy it onto my netbook, and sail into the wild blue yonder. No reencoding.

      • odizzido
      • 9 years ago

      If the video I want to watch happens to be larger than my screen I still want to be able to watch it properly.

      • derFunkenstein
      • 9 years ago

      A nettop is also a prime target for an HTPC. Play back Blu-ray and DVD, along with streaming videos from Netflix and the like.

    • Meadows
    • 9 years ago

    That no-frills photo reminds me of computers of old.

      • dpaus
      • 9 years ago

      I thought “Oh!! They put that poor APU into a waffle maker!”

    • basket687
    • 9 years ago

    Great article as always, but I have a suggestion (applicable to all your mobile articles): in the performance charts, why not write the CPU+GPU next to the name of each system? For example, when the charts say “Toshiba T235D,” that really means nothing to me unless I go and check the “testing methods” page each time.
    Thank you.

      • Voldenuit
      • 9 years ago

      Seconded. I had to open two tabs and compare two pages side by side just to see which CPU/GPU was scoring where.

      Personally, I think the underlying hardware is what interests people more than the ODM/OEM/badge on the machine, and it would have been more informative to rank the systems by their CPU+GPU.

        • vvas
        • 9 years ago

        Additionally, it would be nice to describe a bit more in text what the contenders are. That specifications table has a lot of information to digest. For example, it took me a while to realize that the Pentium U5400 of the 1830TZ is of the same generation as the i3/i5!

    • Cyril
    • 9 years ago

    I’ve just updated the 7-zip and x264 scores for the Aspire 1830TZ. I only got this system a few hours ago, and those two benchmarks were run first, but the results for them turned out to be abnormally low. The difference doesn’t amount to much, though, and the commentary and conclusion are unaffected.

    • fent
    • 9 years ago

    Honestly, after all of the build up I was expecting it to be faster CPU-wise.

      • NeelyCam
      • 9 years ago

      Yes. I expected a CULV competitor, and got an Atom competitor… and we haven’t even seen how bad the 9W versions are.

        • Kurotetsu
        • 9 years ago

        I’m pretty sure this was stated to be an Atom competitor right from the start?

          • sweatshopking
          • 9 years ago

          it was……

          • NeelyCam
          • 9 years ago

          It was? With that 18W TDP, I would’ve imagined the target was CULV…

          With that power consumption, it’ll have a hard time competing with Atom’s battery life.

            • kc77
            • 9 years ago

            On the contrary… some systems were not downclocking the GPU. Hothardware has a power-comparison graph: Zacate uses less power than Atom with Ion 2 and comes very close to Atom without it at load. At idle it beats them all.

            • accord1999
            • 9 years ago

            It also uses about the same power as a new Mac Mini, with a 2.4GHz C2D. The details from the current tests don’t provide enough information to say how good Bobcat is in terms of power usage.

            • NeelyCam
            • 9 years ago

            Yes. The Atom+Ion comparison point didn’t seem fair. This CULV+Ion runs at the same level, is dirt cheap, and is far more powerful:

            http://www.silentpcreview.com/zotac-zbox

            • kc77
            • 9 years ago

            Atom+Ion is not cheaper, that’s for sure. Any system that uses a discrete GPU will be more expensive to make. The CPU can’t communicate with the GPU via osmosis.

            • kc77
            • 9 years ago

            So what you’re saying is that unless a Mac appears within a test suite, we cannot determine power usage? If anything, including a Mac would distort the test bed, because Apple’s OS is more efficient than Windows.

            • seeker010
            • 9 years ago

            I’m not sure I’ve seen an apples to apples comparison about this. Running Windows on boot camp to test battery puts Windows at the whim of drivers Apple provides. Running Mac OS on an unlicensed notebook results in the same problem.

            • kc77
            • 9 years ago

            Yeah, Anand did a netbook test a while back… Mac OS sips less power and lasts longer.

            • accord1999
            • 9 years ago

            I’m saying that the current power consumption tests with desktop Atoms and CULVs are inadequate to indicate how well Bobcat will compete with mobile Atoms and CULVs in notebooks.

            • sschaem
            • 9 years ago

            Atom+Ion cannot compete with Zacate in terms of CPU power and overall power consumption.

            Check other benchmark reviews with actual numbers. They all point to netbooks/slates that can perform 2x better than Atom+Ion and last 2x longer.

            AMD got the implementation close to perfect; the only way Intel wins is in fabrication.

            • NeelyCam
            • 9 years ago

            I’m not talking about Atom+Ion; I’m talking about Atom. I always considered the Ion thing a bit silly… Nobody’s going to play games on Atom (or at least they shouldn’t). Yes, HD video would’ve been nice, but that was addressed later with the CrystalHD chip. Overall, the Ion platform sounded like a pointless power hog.

            No; to me, Atom+CrystalHD sounds like a good netbook. It can compete with Zacate (the 18W part – the only one that’s been benchmarked so far) as long as 3D benchmarks aren’t considered (do you truly think they really matter..?), and power-efficiency-wise Atom still kicks Zacate’s butt (it’s the Ion that needs to be removed).

            Meanwhile, Intel’s CULV dual-cores with the 4500HD wipe the floor with Zacate (again, as long as 3D benchmarks aren’t considered) – HD decoding is included, and the CPU is more powerful and more power-efficient. Just for kicks, today I tested my SU2300 ultraportable’s idle power: 8W. That’s it. Those 20-something-watt benchmarks on Atom+Ion nettops don’t tell the whole story. Zacate can’t compete with CULV in terms of power or power efficiency. And before you bring up cost: that ultraportable cost me $400 a year ago.

          • JumpingJack
          • 9 years ago

          Nope, AMD has been clear that Zacate was targeted at low-end and small-form-factor notebooks, and that Ontario would compete somewhere between Atom and CULV.

          http://www.techreport.com/articles.x/19937/2

          “Admittedly, AMD said this slide reflects its aim on the pricing front, not performance.”

          I believe AMD is betting on the graphics side to give the product a leg up against Atom and put it on a pricing curve closer to Celeron. The netbook version (Ontario), clocking in at around 1.0 GHz, will likely underperform Atom in most, though not all, CPU-bound workloads (i.e., take roughly 40% off all the CPU-bound results from the reviews you are seeing, and that is what AMD’s netbooks will look like). GPU performance will also be cut roughly in half.

          This looks like a great processor, relative to the competition today, at the 18W envelope, but for netbooks nothing much exciting. The real compelling feature is that Ontario will probably play Flash content better than any non-Ion Atom netbook… other than that, there won’t be much difference.

    • UberGerbil
    • 9 years ago

    So… what was the temperature of the inside of your mouth?

      • KarateBob
      • 9 years ago

      I was wondering the same thing, but I didn’t want another one of my posts and its subsequent replies to suddenly disappear.

        • sweatshopking
        • 9 years ago

        I find that to be a common problem.

    • Skrying
    • 9 years ago

    Looks wonderful on paper. So wonderful that I know when laptops using this platform ship none of them will be made with an ounce of care. We’ll get horrible touchpads, horrible keyboards, horrible screens, horrible plastic bodies. It’ll be Intel’s CULV platform all over again. Promising chip derailed by subpar implementation.

      • zima
      • 9 years ago

      Lenovo gives some hope – Brazos should be at home in the successor to the “ThinkPad” (not a true one, but decent) X100e. Here’s hoping that this time they won’t be so afraid of giving it a good battery in a misplaced effort to shield the “big” X-series…

      • Anonymous Coward
      • 9 years ago

      Are you saying Apple should make a Bobcat product?

    • KarateBob
    • 9 years ago

    When are systems based on this platform going to -[

      • NeelyCam
      • 9 years ago

      JF-AMD on SemiAccurate mentioned Q1, I think.

    • Voldenuit
    • 9 years ago

    Not unexpected. The 80-shader GPU is ~2x faster than the 40-shader Radeon Mobility 4225, but still disappointingly behind the GeForce 310M, which is not exactly a powerhouse in any sense of the word.

    They need to get their dual-core part down to 9W and into tablets, because they’re still not super competitive performance-wise. They can probably undercut Intel on price, but it doesn’t look like a game changer right now.

      • esterhasz
      • 9 years ago

      But that’s the thing, really, good enough performance (which the sluggish Atom just does not provide, even the Atom+Ion in my fallback 311c is actually somewhat unusable) + cheap. If this becomes the baseline for Netbooks, this would be a real step forward. Don’t forget, AMD and Intel now make more than half of their revenue in emerging markets…

        • Voldenuit
        • 9 years ago

        That’s the thing, these benchmark figures don’t really show Bobcat as much of an appreciable improvement over Atom. We still have a CPU that’s slower than a very low end CULV C2D/Pentium.

        Perhaps the Bobcat system is less sluggish in subjective use, but outside of the (semi-artificial) Sunspider benchmark, these benchmarks don’t measure that metric. It’s still a lot closer to Atom performance than it is to CULV systems from 2-3 years ago. Heck, the Turion II is pretty much a netbook CPU and Bobcat still falls far short of it. Then again, maybe Cyril can give us some feedback from his subjective use of the systems.

        IMO, AMD can only get so much mileage out of the hypothetically lower cost of Bobcat, since they have to produce it out of house, and the other component costs in netbooks (RAM, casing, battery, OS fee, etc.) are static and significant. To succeed, they need to go somewhere Intel can’t match with Atom and isn’t already dominating with its CULV C2D/Arrandale chips, and my prediction is tablets. Although I’m not sure they can make it at 18W, since we already have Atom slates like the HP 500.

          • esterhasz
          • 9 years ago

          Yeah, I should have specified that my 311c has a *single core* Atom and there the difference to Bobcat is really quite significant in the benchmarks. Apparently the dual parts are a lot less sluggish, but I’ve never used one. Sure, 20%+ performance would have been nice but if battery life and price are good, Atom in all its non-embedded variants (+Ion +Broadcom) has been rendered obsolete – and that is not that small a market. Intel’s CULV systems will have to get cheaper, too…

          Tablets really could be interesting for Ontario but without an adequate OS? Win7 is simply not an option for mass appeal. Meego!?

            • Voldenuit
            • 9 years ago

            Don’t forget that (as of now, at least) AMD is stuck with TSMC as its sole Bobcat manufacturer. Considering the supply constraints TSMC has had supplying GPU makers at the 40nm node, Bobcat is unlikely to render Atom obsolete even if AMD had knocked performance out of the park (which they haven’t).

            I agree that tablets aren’t ready for prime time, outside of ARM entrants like the iPad. Android is still waiting on Gingerbread, and Windows manoeuvres with all the grace and agility of an oil tanker doing three-point turns.

            I’m not disregarding AMD’s achievements with Bobcat, I’m just saying that they’re unlikely to revolutionize the market with what they’ve got. Is it a surprise? It shouldn’t be, since AMD has long marketed Llano as being above Bobcat, and that is still a Stars architecture CPU. I was just expecting it to have been closer, is all.

            • esterhasz
            • 9 years ago

            You’re right, of course, about the yield question. But I meant “technically obsolete”: why would anybody buy a DC Atom + Ion now? Comparable performance, but probably more expensive (a 3-chip solution if I remember correctly, larger dies, etc.) and more power-consuming. And 75mm² is a relatively tiny chip (1/4 of Juniper), so TSMC might just be able to churn out a bunch of those. Fingers crossed ;-)

            But sure, I’m kind of thinking from the “where AMD is now” perspective and despite the inroads they recently made in the portable sector, they’re still not in a good place. Bobcat has the potential to put a real dent in the Netbook market (Atom and lower end CULV) and it’s been a long time since AMD had such a strong product in the CPU area – even if there’s no revolution, for sure…

            • Voldenuit
            • 9 years ago

            Well, I wasn’t referring to yield so much as TSMC’s backorder and available fab space. They are completely backed up with nvidia and AMD GPU production, and have contractual obligations in place that will make it hard for AMD to ramp up production should Bobcat meet a sudden spike in demand. Especially if nvidia decides to play dirty.

            This in turn will make ODMs/OEMs very wary about committing to Bobcat designs, because if supply gets stalled, they stand to lose a lot of revenue and standing costs.

            You’re right, no one in their right mind should want an Atom after this, and Nvidia will see even less demand for Ion (ouch), but Intel’s production capacity (remember, they’re using their 45nm capacity for Atom) will probably mean we’ll be seeing a lot more Atom solutions on the market than Ontario or Zacate for the near future.

            Then again, maybe I’m being too pessimistic, and these things are small enough that they can be stamped out like crisps.

            I still wish it were closer to Turion performance, though, especially since Nile was already in the 15-18W ballpark.

            • Hattig
            • 9 years ago

            I don’t think AMD will be struggling to make Ontarios and Zacates.

            The die size is quite small – 75mm^2. It should cost around $7 to manufacture a working die (add a few dollars for packaging on top of this).

            80% (yield) * 940 (die candidates on a wafer) = 750 working dies per wafer.
            Wafer cost ~$5000 -> die cost ~$7.
            TSMC were making 40,000 40nm wafers a month I believe, but are meant to double capacity around this time. I can’t find a link confirming the TSMC wafer starts however, only one from 2009 saying that TSMC were hoping to have 40,000 40nm/45nm starts very soon.

            If AMD orders 10,000 wafers (costing them $50m), or one week’s production, for Ontario/Zacate, then they’ve just made 7.5m APUs. That would surely satisfy the market!
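            The arithmetic above can be sanity-checked in a few lines. This is just a restatement of the commenter’s own assumed figures (940 die candidates per wafer, 80% yield, ~$5,000 per 40nm wafer, 10,000 wafers ordered) – none of them confirmed TSMC/AMD numbers:

```python
# Wafer-economics sketch using only the assumptions stated in the comment above.
die_candidates = 940   # gross ~75 mm^2 dies per 300 mm wafer (assumed)
yield_rate = 0.80      # assumed yield
wafer_cost = 5000      # assumed USD per 40 nm wafer

good_dies = int(die_candidates * yield_rate)   # ~750 working dies per wafer
cost_per_die = wafer_cost / good_dies          # ~$7 before packaging

wafers_ordered = 10_000                        # about one week of assumed capacity
total_cost = wafers_ordered * wafer_cost       # $50M
total_apus = wafers_ordered * good_dies        # ~7.5M APUs

print(good_dies, round(cost_per_die, 2), total_cost, total_apus)
```

            The numbers line up with the comment: roughly 750 good dies per wafer at about $7 each, and 7.5m chips for a $50m order.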

            • vvas
            • 9 years ago

            To add another factor to your calculations: AMD has announced both single-core and dual-core models, even though it looks like all the chips made will be dual-core. So if they can use this sort of binning to make use of slightly defective chips, they can probably up their yields even more. No idea what happens with the GPU though (which is the largest part of the chip, after all).

            • NeelyCam
            • 9 years ago

            I would add that whoever is happy with Atom-like performance may not need the graphics capability of Zacate, and would rather have the longer battery life.

            The Zacate demoed here is the highest-end version of the lineup, with the 18W TDP… way too much to compete with Atom, but the CPU performance is in the ballpark.

            Moreover, when these finally come out next quarter, 32nm Atoms might be out as well, which changes the whole performance/power equation again.

            I’d say CPU performance disappoints, and graphics performance is better than expected, but I still wouldn’t want to play games on this… and if I’m not gaming, graphics performance doesn’t matter much to me.

      • mczak
      • 9 years ago

      I suspect most of the difference versus the 310M is due to insufficient memory bandwidth. The 310M doesn’t offer a lot there either, but it uses 64-bit DDR3 at an 800MHz clock. Zacate only supports 64-bit DDR3-1066 (533MHz) – a bit of a shame, IMHO, since low-voltage DDR3 is available up to DDR3-1600 – which is quite a bit less (and of course it’s shared with the CPU too). I suspect that with 50% more memory bandwidth (DDR3-1600) it would be a very close race.

        • Voldenuit
        • 9 years ago

        Anandtech hypothesized that memory bandwidth only became a bottleneck for them in DA:O, when Zacate suddenly fell to the bottom of the pack.

        http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked/4

        Of course, their hypothesis is unproven, and it’s possible the weak CPU could have been the real culprit. Note that on the next page, they go on to claim that DA:O was CPU-bound.

        At any rate, I’m not interested in Bobcat as a gaming platform, because contrary to appearances, I’m not a masochist :P. I’d rather see some well-coded mainstream apps taking advantage of GPGPU.

          • mczak
          • 9 years ago

          Actually, the difference between DDR3-1066 and DDR3-1600 for Zacate can be somewhat easily extrapolated from HD 4350 and HD 4550 numbers – the HD 4550 is very, very similar in performance to the HD 5450, and the only difference between the HD 4350 and HD 4550 is that the former uses DDR2-500 and the latter DDR3-800 memory.
          Based on that, I’d say Zacate would easily gain ~20-30% performance with DDR3-1600 memory in graphics benchmarks (and probably next to nothing in CPU-bound tasks).
          Ah well – maybe next revision :-). But I really think it’s a pity max memory speed was lowered from the DDR3-1333 seen in the early demo systems to DDR3-1066 now. That should have been good for ~15% faster graphics performance with minimal power-draw difference.

            • vvas
            • 9 years ago

            The problem is that, even if Zacate was made to handle DDR3-1600, it’s doubtful whether the notebook manufacturers would actually equip it with that sort of memory. It’s a chip destined for budget systems, after all.

            But yeah, perhaps Krishna/Wichita next year will up the memory clock. :^)

            • mczak
            • 9 years ago

            Oh, that’s quite correct. I don’t think there’s really any price difference between DDR3-1066 and DDR3-1333 nowadays, though, so even if they wanted to save a buck and not use DDR3-1600, DDR3-1333 would be an option from that point of view.
            Certainly, Atom doesn’t have more memory bandwidth (well, today – not sure what Cedarview will support). However, don’t forget that Sandy Bridge graphics should be a lot faster, and those systems will potentially have more than twice the memory bandwidth (dual-channel DDR3-1333) – the cheapest CULV Sandy Bridge systems might also be competitors to Zacate.

    • SNM
    • 9 years ago

    Wow. Suddenly I am wondering seriously if AMD will power Apple’s laptops in a few years. I doubt they’ll ever accept Brazos, but the Bulldozer-based Fusion products might give them a satisfactory 2-chip solution before Intel can give them one.

      • NeelyCam
      • 9 years ago

      By that time, Intel will have Ivy Bridge, or maybe even Haswell.

      Although, I think Llano has a good window of opportunity, assuming AMD can get it yielding on GF's 32nm SOI process.

      • LiamC
      • 9 years ago

      Some of them, yes. Not in a few years – in a few months.

        • derFunkenstein
        • 9 years ago

        Yeah, I tend to agree. The other APU parts – the higher-power, higher-performance ones that will hopefully scale beyond 2.5GHz at last – are a great fit for 13″ MacBooks, because the platform is just an APU plus a southbridge-type I/O chip. The APU’s graphics will support OpenCL, since all of AMD’s current products on OS X do, 4000 and 5000 series alike.
