AMD aims Z-60 APU at Win8 tablets, hybrids

Windows 8 is coming, and it’s bringing a wave of tablets and hybrid devices tuned for the touch-friendly Metro UI. AMD doesn’t have much of a presence in the tablet world right now; only one system selling in North America is based on the tablet-centric Z-01 APU released last year. However, AMD expects multiple Windows 8 devices to employ its new Z-60 processor, otherwise known as Hondo.

This new chip uses the same architecture as the Z-01. Hondo’s CPU comprises dual Bobcat cores clocked at 1GHz, each with 512KB of L2 cache, while the memory controller offers a 64-bit path to DDR3 or DDR3L memory clocked at speeds up to 1066MHz.

Naturally, an integrated GPU shares the die with the CPU and memory controller. The Z-60 has the same Radeon HD 6250 graphics as the Z-01, with 80 DirectX 11-class shader ALUs clocked at 275MHz. There’s a dedicated Universal Video Decoder block onboard, and AMD says Windows 8’s video pipeline makes full use of the available hardware acceleration. The Radeon’s display resolution support tops out at 1920×1200, though. The Z-60 won’t be able to power 2048×1536 panels like the one in the iPad 3.
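To put that resolution ceiling in perspective, here is a quick back-of-the-envelope pixel-throughput comparison. This is a sketch only: it assumes a 60Hz refresh rate, ignores blanking overhead, and the actual limit likely sits in the display interface rather than the shader core.

```python
# Rough pixel-throughput comparison (illustrative; 60 Hz refresh assumed,
# blanking intervals ignored).
def pixels_per_second(width, height, refresh_hz=60):
    """Raw pixels the display pipeline must push each second."""
    return width * height * refresh_hz

z60_ceiling = pixels_per_second(1920, 1200)  # Z-60's stated maximum
ipad3_panel = pixels_per_second(2048, 1536)  # iPad 3's Retina panel

print(f"Z-60 ceiling: {z60_ceiling / 1e6:.0f} Mpix/s")
print(f"iPad 3 panel: {ipad3_panel / 1e6:.0f} Mpix/s "
      f"({ipad3_panel / z60_ceiling:.2f}x the Z-60's limit)")
```

By this crude measure, the Retina panel needs roughly 37% more raw pixel throughput than the Z-60's stated ceiling.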

Although Hondo uses the same architecture and is produced via the same 40-nm fabrication process as its predecessor, AMD has improved the chip’s efficiency by tweaking the power gating. The Z-60 has a 4.5W thermal envelope, down from 5.9W on the Z-01. AMD claims the Z-60 consumes just 1.57W while playing HD video, 1.12W when web browsing, and 0.75W at idle. The firm expects Hondo-based systems to offer "almost" eight hours of web browsing, "up to" six hours of 720p video playback, and about two weeks of standby time.
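AMD's figures are chip-only, so the battery-life claims imply assumptions about the rest of the system. The sketch below shows how those runtimes might pencil out; the 30Wh battery and the flat 2.5W draw for everything else (screen, memory, radios, the separate platform hub) are our hypothetical values, not AMD's.

```python
# Back-of-the-envelope battery-life estimate from AMD's chip-only figures.
# Assumptions (NOT from AMD): a hypothetical 30 Wh battery and a flat 2.5 W
# for everything else in the tablet (screen, memory, radios, platform hub).
BATTERY_WH = 30.0
REST_OF_SYSTEM_W = 2.5

def runtime_hours(soc_watts):
    """Hours of runtime if the SoC adds soc_watts on top of the baseline."""
    return BATTERY_WH / (soc_watts + REST_OF_SYSTEM_W)

for workload, watts in [("web browsing", 1.12), ("HD video", 1.57), ("idle", 0.75)]:
    print(f"{workload}: {runtime_hours(watts):.1f} h")
```

With these assumed figures, the browsing estimate lands near AMD's "almost" eight-hour claim, while the video estimate comes out above the six-hour claim, suggesting video playback draws more power elsewhere in the system than a flat baseline captures.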

Unlike Intel’s Clover Trail-based Atom Z2760, the Z-60’s platform hub resides on a separate chip. This complementary silicon has gone on a crash diet, ditching support for PCIe peripherals and all but one Serial ATA device. However, the platform hub does feature USB 3.0 support. Device makers will be able to choose between offering dual SuperSpeed ports or eight USB 2.0 connections. For tablets and hybrids, I suspect most folks would prefer fewer and faster ports. The ultra-slim systems AMD is targeting really don’t have room for loads of expansion ports, anyway.

Hondo’s lower power consumption is purported to enable devices as thin as 10 mm, only marginally thicker than current ARM-based tablets. AMD expects Hondo-based systems to be available when Windows 8 launches at the end of the month, with more to follow as January’s Consumer Electronics Show approaches. These devices will run the full version of Windows 8 rather than the RT variant reserved for ARM-based processors.

As someone who has been waffling on whether to buy a Windows 8 hybrid to replace my aging ultraportable notebook, I’m curious to see how Hondo stacks up against Clover Trail. While the tablet-focused Atom should consume less power (its TDP is less than 2W), the Z-60’s USB 3.0 support is a nice upgrade, and its integrated Radeon may offer better performance than the PowerVR GPU in Clover Trail.

Comments closed
    • izmanq
    • 7 years ago

    now we need to port android to x86 😀 don’t want to use win 8 :p

      • shank15217
      • 7 years ago

      It’s been done; the issue is that GPU drivers for this chip don’t exist.

    • dpaus
    • 7 years ago

    I’d love to see someone put together a tablet based on this, with a simple desktop charger/stand that makes one of the USB 3.0 ports available, and sell it in a bundle with the [url=http://www.targus.com/us/productdetail.aspx?regionId=7&sku=ACP70USZ&PageName=Docking%20Stations%20by%20Targus&productCategoryId=13&bucketTypeId=0&searchedTerms=&navlevel1=products&cp=&bannertxt=Docking%20Stations]Targus USB 3.0 dual-video docking station[/url]. For anyone who doesn't need high-end processing power (i.e., probably 80% of office PC users), you have an instantly portable and 'perfectly adequate' desktop/tablet system. For anyone who runs an out-of-the-office service business, that could be a huge boon. Combined with Win8, that could be a real winner for AMD - and something you won't get from Clover Trail.

      • sweatshopking
      • 7 years ago

      and it’d be a system you’d never see made, or if they did make it, charge up the ass for it, killing their market.

        • dpaus
        • 7 years ago

        Hmmm, don’t know… I could see HP or Dell making such a system for the ‘volume corporate’ market.

          • sweatshopking
          • 7 years ago

          exactly. and they’d charge up the ass for it.

    • chuckula
    • 7 years ago

    [quote]AMD claims the Z-60 consumes just 1.57W while playing HD video, 1.12W when web browsing, and 0.75W at idle.[/quote]

    Not all that impressive when you read Anand's report on Clover Trail: [url]http://www.anandtech.com/show/6340/intel-details-atom-z2760-clovertrail-for-windows-8-tablets[/url]

    Bear in mind the power consumption numbers presented for Clover Trail are for [b]the entire tablet[/b] while AMD's numbers are just for the chip.

    Edit: For example, HD video playback (including screen + backlight) for the whole tablet is only 3.0 watts, which beats the iPad 3 and Transformer Pad Infinity. Note: the whole tablet at idle with the screen on is using 2.3 watts... of which [b]only 2 milliwatts are being used by the Clover Trail SoC in its deep sleep state[/b]. Using those numbers we get:

    Clover Trail web browsing SoC power usage: ~0.5-0.6 watts.
    Clover Trail HD playback SoC power usage: ~0.7-0.8 watts.

    So basically, it looks like the Z-60 CPU/GPU has about twice the power consumption of the Z2760... and that's not counting the fact that AMD has a two-chip solution with a whole additional watt being used by the separate chipset. (Interesting how AMD's naming convention is suspiciously similar to Intel's previously announced chip....)

    Edit 2: For anyone who doesn't believe me, here is the Fudzilla article: [url]http://fudzilla.com/home/item/29013-amd-z-60-hondo-tablet-chip-is-out[/url] I know, I know: don't trust what Fudzilla writes, but go look at the power consumption chart, which is straight from AMD. Look at the total power consumption and you'll see that this tablet solution has a much higher power consumption profile than competing tablets on the market today.

    I'm glad that there is more competition coming into the tablet market on the x86 side, but anybody thinking that AMD can just slap in a cut-down Brazos chip and take over the market is in for a rude awakening. This is basically AMD getting to the point that Intel was at in about 2009-2010, and they need to do a lot more work, but I think they can be competitive if they keep integrating components and put more work into shrinking the power envelope.

    Edit 7: LMAO, if I had made this exact same post where [insert name of favorite ARM SoC here] were the low-power solution and [insert name of hated Intel chip here] were the high-TDP solution, then I'd probably be sitting at +25 instead of -1..... Haters gonna hate, but that won't stop me.
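The SoC estimate in the comment above boils down to subtracting a fixed non-SoC baseline from whole-tablet measurements. A sketch of that arithmetic, using the figures quoted there (the baseline split is a rough inference from those numbers, not a measured value):

```python
# Estimate the Clover Trail SoC's share of whole-tablet power draw
# (rough inference from the figures quoted in the comment above).
IDLE_TABLET_W = 2.3   # whole tablet at idle, screen on
IDLE_SOC_W = 0.002    # SoC in deep sleep (2 mW)

# Everything that isn't the SoC: screen, backlight, memory, radios, etc.
baseline_w = IDLE_TABLET_W - IDLE_SOC_W

def soc_share(whole_tablet_w):
    """Whole-device power minus the idle baseline approximates the SoC draw."""
    return whole_tablet_w - baseline_w

print(f"HD playback SoC share: ~{soc_share(3.0):.1f} W")
```

This assumes the non-SoC baseline stays flat across workloads, which is why the result is only a ballpark figure.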

      • NeelyCam
      • 7 years ago

      That 0.75W idle is particularly awful (and that’s without the platform hub)

        • raddude9
        • 7 years ago

        You’ve changed your tune; in the A10-5800K article you said that “idle” power wasn’t important. Would that be because AMD had the best idle power numbers there and Intel has the best idle numbers here? 🙂

          • chuckula
          • 7 years ago

          Neely is not being [relatively speaking] that hypocritical: the A10-5800K is a [b]desktop[/b] part that isn't drawing all of its power from a battery. The idle power draw (and the ability to throttle between idle and useful operating modes quickly) is much, much more important on a tablet than on a desktop PC.

          • NeelyCam
          • 7 years ago

          Two completely different applications. In a battery-powered device, idle power is pretty important.

          In the A10-5800K article I said it’s a useless product, but a laptop Trinity chip is a much better value proposal – that’s where idle really matters.

      • sschaem
      • 7 years ago

      But the Z-2760 is also slower - possibly 10 times for GPU-centric apps and 30% for CPU tasks?
      The chart shows that all chips are at a point of diminishing returns, where the system is the #1 consumer of power, even in the case of AMD-based full-x86 Windows 8 tablets.

      I think it will come down to price and GPU power.

      If true, 10 hours in Windows presentation mode, 8 hours of web browsing, and up to 6 hours of video playback might be enough for most budget users who must have the full Windows 8 experience - especially if it's cheaper than a Z-2760-based model.

      BTW, the Z-60 is a low-power C-50.

      And here is the ***reality***: even if AMD worked a miracle and cut the Z-60+hub power consumption in HALF, the platform would only go from 3.9W to 3.04W while web browsing.

      This is not insignificant, but knowing the Z-60 scores 1701 in 3DMark while the Z-2760 only scores 137, there is another side to that coin.

        • chuckula
        • 7 years ago

        [quote]But the Z-2760 is also slower. Possibly 10 times??[/quote]

        You'll need some evidence of that. I've seen live video of hands-on demos of the Clover Trail tablets multi-tasking between multiple games, hi-def video, and web browsing, and it all looked very smooth.

        I've seen AMD's synthetic benchmarks... made against an over-one-year-old Atom that was using a different GPU which is *not* Clover Trail... and I'm not buying that AMD will get some 10x miracle GPU performance in these power envelopes. That's not to say that an AMD APU with 3x the power envelope of Clover Trail will be slower at GPU... I'm sure AMD can win benchmarks when everything is opened up full-tilt. I am saying that Clover Trail certainly appears to have enough GPU power to do the things that people want to do on tablets, including playing games, watching hi-def video, and browsing web sites with a very smooth graphical experience.

          • zorg
          • 7 years ago

          Yeah, live videos and live demos. But one thing they never mention: Clovertrail is the only tablet SoC that doesn't have a developer platform, and there's also no documentation for it.
          In theory the Z-60 IGP is ~6 times faster, but in practice you can actually optimize your app for it. The main platform is almost two years old, and most developers already do some optimization for Brazos; Hondo will automatically profit from it.
          You can also optimize for Clovertrail with the Cedartrail platform, but that's a really hilarious exercise without a working Win8 driver.
          So it's really easy to code a program to run 10 times faster on Hondo. Without documentation you don't even know how to optimize for Clovertrail. Programming for this, and for all Atoms with PowerVR graphics, is a nightmare on Windows.

          • sschaem
          • 7 years ago

          It was a question.

          The Z-60, unless power capped, is a ULV C-50. So we already have a good idea of its DX11/OpenCL 1.1 GPU performance. And this little guy can run games like League of Legends.

        • NeelyCam
        • 7 years ago

        It’s hard to say which is more important for a system like this – good graphics performance or good battery life. Also, it clearly depends on the relative differences of the two platforms in these metrics.

        Clearly, AMD is well ahead of Intel in graphics for these systems, but Intel is similarly well ahead of AMD in platform integration.

        Also, do we know if the CPU in the Z-60 is really faster than the CPU in the Z2760? Brazos wasn't that much better than the Atom of yesteryear clock-for-clock, and here Atom has almost a 2x advantage in clock frequency...

    • sweatshopking
    • 7 years ago

    [quote]up to 6 hours[/quote]

    not long enough. the ipad does what, 10? seriously guys, stop making it a cm thick, make it 1.5, and give me 15 hours!!!!

      • Helmore
      • 7 years ago

      It’s not so much the thickness I’m worried about. It’s weight. I have an iPad 3 and I sometimes think it’s just a bit too heavy. I know that when I buy a new tablet, which won’t be for quite a while, it’ll be quite a bit lighter than my iPad. My iPad is 650 g, and any future ~10″ tablet I buy will be closer to 500 g, preferably lower than that.

        • sweatshopking
        • 7 years ago

        seriously? a lb is too heavy for you? i think it’s time to hit the gym my friend. my wife, WIFE has no problem with the hp touchpad (710g), and she’s 110lbs of bones. i’d WAY rather have additional weight and improved battery life. that’s for everything, phones, laptops, and tablets.

          • Helmore
          • 7 years ago

          Yes, the weight starts bothering me after half an hour or so. By then I usually rest it on my lap, which negates the ‘problem’. That said, I’d like to be able to hold up the tablet like I usually do a book. My main use for my tablet is reading, after all - both manga and books.

            • lilbuddhaman
            • 7 years ago

            Then you’re using the wrong product.

            • NeelyCam
            • 7 years ago

            yep – should’ve had the iPad2 with a 32nm chip in it

            • MrJP
            • 7 years ago

            “You’re holding it wrong.”

      • sschaem
      • 7 years ago

      Stop complaining and buy something like this
      [url]http://www.newtrent.com/store/ipad-external-battery/leather-power-case.html[/url]

      • clone
      • 7 years ago

      +1

    • Arclight
    • 7 years ago

    Any word on the GTX 650 review or the A10-5700? All this tablet news bores me.

      • travbrad
      • 7 years ago

      [quote]Any word on the GTX 650 review[/quote]

      Based on the reviews I've seen, I'd say spend $10 more and get a 7770. Even in the best-case scenarios it struggles to match the 7770, and in some games the 7770 is way faster (as much as 50%). Of course, those reviews haven't used the "inside the second" methods, but such big FPS deficits will certainly have some correlation to frame-time numbers. [url=http://www.techpowerup.com/reviews/MSI/GTX_650_Power_Edition/29.html]TechPowerUp has some value charts at the end of their review.[/url]

        • Arclight
        • 7 years ago

        Amazing dud. Even the good old 6850 runs rings around it (in average FPS). Thanks for the link.

    • Voldenuit
    • 7 years ago

    That SATA port is the biggie. Clover Trail only supports eMMC. No matter the CPU cores or GPU, a Clover Trail tablet is going to drag compared to Hondo because of the slow interface to storage.

    Sadly, AMD is on the back foot with design wins. No matter how good the product is, it doesn’t mean squat if it’s not in devices.

      • chuckula
      • 7 years ago

      Putting a SATA port on the Z-60 is like giving a fish a bicycle since you aren’t going to be using a full SATA drive (or even microdrive) in a form factor like these tablets. Lots of other tablets get along just fine without SATA support already, a SATA drive will use even more power, and a SATA drive will increase the cost substantially, so I don’t see it being a killer feature.

        • raddude9
        • 7 years ago

        Sure, you’re not going to put a 2.5-inch SATA drive in a tablet, but an mSATA drive measures just 3cm x 5cm:
        [url]http://www.techspot.com/review/571-crucial-m4-msata-ssd/[/url]

        There's plenty of room for one of those in a tablet. Also, it would be great if you were able to upgrade a tablet's drive in the future; it would give the expensive tablet hardware more longevity.

          • chuckula
          • 7 years ago

          I included microdrives like the m4-msata in my analysis. If you used the z-60 in a larger netbook-like device then I could see the benefit of SATA, but in a tablet even the msata devices use more space, more power, and add more cost than what OEMs want. Bear in mind that most built-in flash consists of one or two chips soldered directly to the motherboard, which is much much smaller than even a SATA microdrive.

    • destroy.all.monsters
    • 7 years ago

    According to Ars, this chip features UVD 3, which might be the first implementation of it, but on the downside it is not an SoC, giving it the exact same disadvantages as TI’s OMAP. Further, there are no Linux drivers for it at the moment.

      • Helmore
      • 7 years ago

      I don’t get your comment. UVD 3 has been implemented on a variety of chips, such as on the Brazos platform, on Llano and Trinity and of course on a whole bunch of GPUs. Yes, you’re right that it’s not a proper SoC, but I don’t get your comparison to TI’s OMAP chips. All OMAP chips are proper SoCs, just like Tegra, Snapdragon and Exynos.

        • destroy.all.monsters
        • 7 years ago

        I wasn’t aware that UVD 3 had been used elsewhere (apparently it dates back to 2011, something I didn’t know until now) - I just did a quick search on it and some light reading.

        The articles on OMAP I’d read indicated it wasn’t an SoC, and that this was one of the issues leading up to their pulling out of the smartphone market. Apparently I was misinformed.

        Thanks for the update.

    • Price0331
    • 7 years ago

    Why don’t they focus on making a half-decent desktop architecture instead of these “too late to the party” business segments? I miss the old competitive AMD.

    EDIT: I meant desktop architecture as in their CPU lineup.

      • Geistbar
      • 7 years ago

      This is an easier market segment to compete in than desktops and laptops. For smaller devices (such as tablets), you won’t win just by having the best processor. Here, the total package is essential, and that’s somewhere AMD is still very competitive with Intel.

      Probably one of AMD’s greatest mistakes was ceding markets where it was doing well but didn’t make bucketloads of money. They shouldn’t over-diversify either, but it was unwise (though perhaps inevitable, given their financial health at the time) to shed their mobile divisions.

        • blastdoor
        • 7 years ago

        I wouldn’t say it’s really that much easier. Apparently, despite their $5 billion purchase of ATI, they can’t even put a GPU in their APU that can beat the licensed PowerVR GPU that Apple uses, if this statement is true:

        “The Z-60 won’t be able to power 2048×1536 panels like the one in the iPad 3.”

        I maintain that the purchase of ATI was a catastrophic blunder. AMD should have used that money to press its advantage with its multicore Athlons by investing in its manufacturing capacity and in continued refinements to the Athlon/Opteron. The PC market may not be growing much anymore, but AMD had plenty of room to grow its share of that market. Also, there was nothing to stop AMD from licensing PowerVR tech to provide a GPU in a SOC — again, look at what Apple has done with PowerVR tech. If AMD had gone my route, they’d have a much better CPU and a GPU that’s not much worse than they’ve got now

          • Price0331
          • 7 years ago

          My thoughts exactly. They can’t compete with apple tech, and the manufacturers of Windows 8 tablets are probably going to favor other options. I’m going to go ahead and say only 2 manufacturers will carry a tablet with their chip. I just really don’t think they have the money or chip performance to get vendor support at this point.

          • Helmore
          • 7 years ago

          I am pretty darn sure that the GPU in the Z-60 is way faster than the PowerVR SGX543MP4 used in the iPad 3. The GPU in the iPad is probably a bit more power-efficient, but the PowerVR GPU is a lot less capable. It comes nowhere near the DirectX 11 specification of the Z-60.

          • Geistbar
          • 7 years ago

          Of course it’s an easier market to compete in: look at all the companies that are doing so. Off the top of my head, we have NVIDIA, TI, Samsung, and Qualcomm all making mobile chips. AMD has been struggling to remain relevant against Intel, and they’ve managed to accomplish that almost as much because of Intel allowing it as by their own efforts.

          As for ATI, you’re missing the timetables involved. Getting a new fab up is not a quick affair — any money they poured into that wouldn’t pay off for 2-4 years. By the time that would have been available, they wouldn’t have been top dog anymore, nullifying most of the benefit. They were already trying to use 3rd party fabs to meet their new demand, but that didn’t work out, so that isn’t a new alternative option either.

          By comparison, ATI has given them the tools they’ve needed to maintain their current shred of relevance - balanced performance for systems without a discrete GPU. Perhaps the single biggest hole in their portfolio before was that they couldn’t provide a complete system, which made their platform less appealing to business customers. That was also fixed by buying ATI. Now, they definitely overpaid for them, which was an enormous blunder, but… I don’t think it can really be said that they had a clearly evident alternative.

            • blastdoor
            • 7 years ago

            How is the presence of more competitors evidence that it’s an easier market to compete in? If anything, more competitors means more competition, which typically means less room to make money. Intel has 60% margins in the x86 desktop/laptop/server space. That leaves quite a nice price umbrella for anyone who can field a decent product at a lower cost.

            But let’s look at mobile. The two main players in the phone market are Samsung and Apple, and those two companies both design their own SoCs (I know Samsung has used other companies’ SoCs in the past, but I suspect they won’t be doing that much going forward). The rest of the market is small, and split between NV, Samsung (which, unlike Apple, will sell its SoCs to others), Intel, and Qualcomm (TI has already announced it’s leaving this market). I don’t see how that’s an easier market to compete in.

            As for the timetable, AMD didn’t just wake up in 2005 and discover they were in the CPU business with $5 billion burning a hole in their pocket. They should have been making those investments all along, but apparently the management at the time thought it made more sense to set aside cash to overpay for ATI rather than invest in fabs and chip designs.

            As for ATI GPU vs PowerVR GPU - I’m no expert, but according to TR’s write-up, the GPU in the Z-60 isn’t capable of driving an iPad 3 display, so apparently PowerVR isn’t all that bad when employed by a company (like Apple) that’s willing to spend some die space on a GPU. More generally, looking at benchmarks of Apple’s mobile products versus other competitors, Apple’s are routinely at the top of the heap. So if there is an advantage for ATI graphics, it’s totally hypothetical - it doesn’t exist in any product actually being sold in the mobile space.

            Also, looking at AMD’s financial performance, it’s pretty clear that ATI GPU tech in its APUs is NOT making up for the otherwise abysmal performance of the CPU cores. Finally, note that in the PC space, AMD is only competing against Intel, and Intel’s integrated graphics have always sucked. It’s not at all clear that spending $5 billion on ATI was necessary to compete with Intel in graphics performance, but NOT spending money on fab and chip design clearly led AMD to be totally noncompetitive with Intel on CPU performance.

            • Geistbar
            • 7 years ago

            [quote]How is the presence of more competitors evidence that it's an easier market to compete in? If anything, more competitors means more competition, which typically means less room to make money. Intel has 60% margins in the x86 desktop/laptop/server space. That leaves quite a nice price umbrella for anyone who can field a decent product at a lower cost.[/quote]

            More competitors means it's easier to compete in a market, because if it wasn't, there wouldn't be as many companies successfully trying to make money there. x86 has great margins... if you're Intel. Intel has managed to squeeze everyone else out of that market by being so successful as a competitor. In the mobile space, there's a lot more people to compete with, but they're all able to successfully make money for themselves there, and AMD isn't as hugely outsized by the mobile divisions here as they are by Intel.

            [quote]As for the timetable, AMD didn't just wake up in 2005 and discover they were in the CPU business with $5 billion burning a hole in their pocket.[/quote]

            Athlon 64 was launched at the very end of 2003, and Conroe was launched in the middle of 2006. That left AMD with 2.5 years of market dominance. If they had managed to make/borrow enough money on [b]day one[/b] of the A64 launch to get some fabs up, and those fabs miraculously took just two years to get online, then AMD would have had half a year to squeeze money out of that. Before A64, they wouldn't have been able to get that money, and after it, it was too late for fabs to pay off. AMD overpaid for ATI, but that doesn't mean they had some truly superior alternative, like you seem to believe.

            • NeelyCam
            • 7 years ago

            [quote]In the mobile space, there's a lot more people to compete with, but they're all able to successfully make money for themselves there[/quote]

            Price collusion...? No matter. Intel will change this soon; with a far superior cost structure, Intel will be competing heavily on price, and consumers win.

          • derFunkenstein
          • 7 years ago

          The GPU in the iPad is a four-core affair, each core “powering” a quarter of the screen.

          • zorg
          • 7 years ago

           Do you really think that the GPU is the limitation for the resolution? It’s limited by the display interface.

           Also, does anyone think that all of the iPad 3 games are running at native resolution? The Retina display is the biggest bullshit that Apple ever created. The first thing to do for a demanding iPad 3 game is to limit the internal renderer to 1024×768 pixels, then upscale it to native. Yeah, you have a Retina display, but you won’t get more information than on the iPad 1 or 2.

          • clone
          • 7 years ago

          AMD didn’t spend money to buy ATI, they leveraged the ATI purchase in order to borrow the money to buy ATI.

           AMD did not pull funds away from CPU R&D to buy ATI… why ppl continue to push this is beyond me. The ATI purchase didn’t force AMD to go native quad core, putting them back 12 months; it didn’t force AMD to stumble with the TLB bug, which added another 6; and it didn’t cause AMD to lose the race to 65nm, or cause Intel to release C2D.

          AMD had no mobo or Gfx division & without both AMD couldn’t sell to server because they lacked a complete platform to offer and support…. AMD desperately wanted to be an OEM of equal caliber to Intel before they ran out of time….. and in the end they simply ran out of time the day Intel launched C2D…. which had nothing to do with the ATI purchase.

          beyond that no one ever considers the value of ATI’s patent portfolio in regard to the purchase… given Nortel’s sold for $3.5 billion it’s certainly a significant asset.

          • mutarasector
          • 7 years ago

          “I maintain that the purchase of ATI was a catastrophic blunder. AMD should have used that money to press its advantage with its multicore Athlons by investing in its manufacturing capacity and in continued refinements to the Athlon/Opteron.”

          Shoulda, woulda, shoulda…

          “The PC market may not be growing much anymore, but AMD had plenty of room to grow its share of that market. Also, there was nothing to stop AMD from licensing PowerVR tech to provide a GPU in a SOC — again, look at what Apple has done with PowerVR tech. If AMD had gone my route, they’d have a much better CPU and a GPU that’s not much worse than they’ve got now”

          And if the bear wouldn’t have stopped to sh*t in the woods, it would have caught the rabbit…

      • joselillo_25
      • 7 years ago

      They are doing it with trinity

        • Price0331
        • 7 years ago

        If I recall from the TR review, Scott had a hard time recommending Trinity for anyone. There are just too many more viable options out there than AMD’s lineup.

          • khands
          • 7 years ago

          The 65W desktop trinity GPU is great for HTPCs, though that’s a tiny segment. On the laptop side it’s still doing well (just not nearly as well as it should be).

            • Sam125
            • 7 years ago

             [quote]The 65W desktop trinity [b]GPU[/b] is great for HTPCs though that's a tiny segment.[/quote]

             You mean APU? Although the GPU is the distinguishing factor for Trinity, for sure.

      • link626
      • 7 years ago

      I have to agree with you on the “too late” part.

      all the tablet makers have already designed their tablets with intel Atom.

      AMD probably has only 1 design win right now with this z60.

      • thefumigator
      • 7 years ago

      Because AMD is the only one capable of making a chip TODAY with the amount of GPU power to beat all the competitors combined, except perhaps, nvidia.

        • Price0331
        • 7 years ago

        Riiiiiight, and that’s why this chip can’t power an iPad 3 display? Don’t get me wrong, I love their graphics cards, and use one myself; I’m just talking strictly about them trying to gain ground in the tablet space when there are too many competitors. Most people who want a tablet will get either an iPad or a Kindle Fire HD this coming holiday season. I just really don’t see the Windows tablet succeeding, much less whatever percentage of these Windows tablets they will get to put their chip in.

    • Sargent Duck
    • 7 years ago

    I’ll take the 2 USB 3 ports please and thank you. I can always connect a USB hub to one of them if need be…

    • tbone8ty
    • 7 years ago

    hopefully Asus adds this to their transformer line 🙂

    would love a hybrid tablet with keyboard running win8

    though the z-60 is not groundbreaking…it’s a step in the right direction, paving the way for their 28nm quad-core jaguar/GCN chips next yr.

      • mutarasector
      • 7 years ago

      It’s already coming down the pipeline. It’s called the Taichi…

      • NeelyCam
      • 7 years ago

      [quote]28nm quad core jaguar/GCN chips next yr.[/quote]

      Now [i]this[/i] I could be interested in. Is the "platform hub" integrated?

      I'm still waiting for an HTPC that's small and passively cooled, but is still reasonably fast and can drive 4K displays with video. (If I get a new HTPC, it had better be able to do that - I don't want it to become obsolete in a year.)

    • jjj
    • 7 years ago

    The GPU doesn’t sound like much clocked that low, but maybe at least the chip is faster than Atom, and AMD might sell it for as low as $20.

      • Sargent Duck
      • 7 years ago

      I use the Radeon 6250 in my E-350 Zacate desktop. Don’t expect it to play Crysis, but it will play indie/small games just fine. (PvZ, AoE:online, Defense Grid: The Awakening)

        • codedivine
        • 7 years ago

        Actually E350 has the higher clocked version: 6310. The 6310 is clocked substantially higher at 500MHz instead of the 275MHz here. Plus, the CPU in E350 is also clocked 60% higher at 1.6GHz vs 1GHz here. I wouldn’t really expect much from the Z-60.

          • Ryhadar
          • 7 years ago

           I wouldn't count it out just yet. I'm typing on a netbook that has an Athlon L110 at 1.2GHz. From what I can tell, it's essentially a stripped-down single-core 65nm Athlon 64. Actually, the specs aren't considerably different from the Z-60. Here's a spec sheet: [url]http://www.cpu-world.com/CPUs/K8/AMD-Athlon%20Single-Core%20Mobile%20L110%20-%20AMML110HAX4DN.html[/url]

           The biggest differences: the L110 is single core with a dual-channel IMC (I'm currently using it in single channel), while the Z-60 has dual cores and probably a single channel. Oh, and the Z-60's TDP is almost a third of the L110's without even considering the chipset, and you get all the improvements AMD has made since the Athlon 64 era.

           At any rate, my netbook with an x1270 Radeon (an overclocked x1250, from what I can tell) doesn't have much trouble running older games or newer indie games. In fact, just the other day I was playing Return to Castle Wolfenstein (the 2001 one) and was surprised to see my netbook run it extremely well after some tweaking of the settings.

          • willmore
          • 7 years ago

           My C-50 based netbook is basically this chip without the power and feature refinements they’ve added to it. As far as gaming goes, you run out of CPU before you run out of GPU. It has no problem playing ‘indie’ type games at 1366×768. HL2 and L4D are a bit slow, but still playable on low settings.

            • sschaem
            • 7 years ago

             Yes, the Z-60 is a C-50: same CPU clock, same GPU clock, same spec all around.
             The question is, did AMD do massive capping to halve the TDP?

             If it’s just a ULV voltage version, those tablets will perform very well.

             Windows 8 with Metro apps on a tablet will be more GPU-oriented. This should benefit AMD big time.

            • willmore
            • 7 years ago

             Better clock gating, etc. could save some power without a ton of work. My C-50 netbook undervolts by a large amount, so just tuning the manufacturing process (and the die design) would allow lower voltages to run stably. That’s a good route to power savings as well.
