Intel unwraps Clover Trail-based Atom Z2760

In case you hadn’t heard, a whole bunch of Windows 8 tablets and hybrids are scheduled to be released next month. A lot of them will be based on Intel’s Atom Z2760 processor, otherwise known as Clover Trail. This tablet-focused chip isn’t a radical departure from the Medfield SoC that’s popped up in a handful of smartphones, but Intel has made a few tweaks with tablets in mind. The most notable tweak is the addition of a second Atom processor core.

The Z2760’s dual cores can scale as high as 1.8GHz in burst mode, which is just another way to say turbo. Architecturally, the cores are unchanged from previous Atom iterations. They’re fully x86-compliant, allowing Clover Trail devices to run the full-fat version of Windows 8 rather than its cut-down RT counterpart. Hyper-Threading is supported, of course.

Despite boasting dual cores, Clover Trail has a TDP of less than 2W. Intel says the chip also features a new power mode dubbed S0ix. This mode supports the Connected Standby feature in Windows 8, which allows applications to receive updates while the processor is idling in a low-power state. Intel claims devices based on the Z2760 will be able to last for three weeks in Connected Standby mode. They’re supposed to be capable of 10 hours of HD video playback, as well.

1080p video playback will be accelerated by Clover Trail’s integrated GPU, which is based on the PowerVR SGX545. This particular implementation is clocked at 533MHz, and it features an HDMI 1.3 output for external displays. The SGX545 was released nearly two years ago, and I’m curious to see how it fares in the sort of casual games likely to be played on Windows 8 tablets.

The Z2760’s memory controller supports dual channels of DDR2 memory at 800 MT/s. Device makers will be able to equip their systems with up to 2GB of RAM, and they can add solid-state storage via an eMMC interface. USB 2.0 connectivity is built into Clover Trail, too. The chip is fabricated using Intel’s older, 32-nm process technology, but the package still measures a minuscule 14 x 14 mm.
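
A quick back-of-the-envelope calculation puts those numbers in perspective. The Python sketch below assumes two 32-bit LPDDR2 channels for the Z2760, which is the configuration commenters settle on below, and includes Nvidia's Tegra 3 (a single 32-bit channel of up to LPDDR2-1066, per those same comments) for comparison; treat the results as theoretical peaks rather than official figures.

    # Peak theoretical memory bandwidth = channels x bytes per transfer x transfer rate
    def peak_bandwidth_gbps(channels, bus_width_bits, transfers_per_sec):
        return channels * (bus_width_bits / 8) * transfers_per_sec / 1e9

    # Atom Z2760: assumed 2 x 32-bit LPDDR2 channels at 800 MT/s
    print(peak_bandwidth_gbps(2, 32, 800e6))    # 6.4 GB/s

    # Tegra 3, per the comments: 1 x 32-bit channel at up to 1066 MT/s
    print(peak_bandwidth_gbps(1, 32, 1066e6))   # ~4.3 GB/s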

Acer, Asus, Dell, Fujitsu, HP, Lenovo, LG, Samsung, and ZTE have all pledged to offer Windows 8 devices based on Clover Trail. Twenty designs are purportedly in the pipeline; half of those are convertible hybrids, while the rest are standard slates. We can expect devices as thin as 8.5 mm and as light as 1.5 lbs, according to Intel.

Device makers have already shown off several systems equipped with the Z2760. Although prices haven’t been made official, it looks like Atom-based tablets may command a stiff premium over their ARM counterparts—and over netbooks based on existing Atom processors. Intel says the Z2760 isn’t substantially more expensive than previous Atom platforms, and it notes that there are other system components to consider. Most Clover Trail tablets will have fancy IPS touchscreens, solid-state storage, ultra-slim chassis, multiple cameras, accelerometers, and GPS—perks you won’t find in budget netbooks. Clover Trail systems will also come with the full version of Windows 8, which likely costs device makers more than Windows RT. Whether consumers are willing to cough up the extra scratch for compatibility with x86 desktop apps remains to be seen.

Comments closed
    • Arclight
    • 7 years ago

    I have an idea, it may be stupid but hear me out….what if they put all the hardware of a normal ultrabook into the part that has the keyboard, and when the screen (aka tablet) docks to the keyboard, the tablet shuts down non-essential hardware and gives control to the “keyboard side” (normal use dictating that when you use it in “laptop mode” it is supposed to be plugged in)……The only major kink to work out would be the HDD communications between the tablet side and the laptop side….maybe use an eSATA cable and the laptop HDD as an “external HDD”, thus the OS staying all the time on the tablet.

      • vargis14
      • 7 years ago

      Might as well add a lag free wireless 1080p capable video, maybe Intel's WiDi along with a receiver so you can hook it to any HDTV-monitor and a discrete graphics card to complement the Ivy graphics. Also house it in a BIG backlit full sized mechanical BlackWidow Ultimate keyboard type deal with a thick palm rest to make room for all the high end guts for a good gaming rig.
      While we are at it make it so 2 tablets can dock to it in landscape mode, His and Hers 🙂 or just one in landscape. Might as well be able to use those little CPUs while they are docked for something interesting and off the wall like Razer's command center on the Razer laptops
      Sure it would be expensive but could be interesting to see what they could come up with.

      Weird idea 🙂 I just typed it up while my coffee is brewing …hey tablets need mechanical keyboard love too ya know…rdy for the thumbs down:)

        • OneArmedScissor
        • 7 years ago

        They’re actually putting WiDi in the Ivy Bridge tablets. There’s not much they won’t have that “ultrabooks” can, with graphics cards being the biggest exception. That’s probably not going to matter too much by Haswell, though.

      • slaimus
      • 7 years ago

      I think Lenovo had something like this a few years ago. The screen from the laptop would detach and run a simple Linux distro that has basic tablet features.

    • ronch
    • 7 years ago

    Intel obviously needs to make money and make their investors happy, but they already had their run with x86, with which they have displayed their tendencies to be a monopolistic, unfair company unwilling to share the architecture/technology with anyone when they realized x86 was becoming dominant (if you lived through the 80’s and 90’s you’ll easily understand this). Now as many companies are hopping on the ARM bandwagon in the mobile space Intel wants to break that momentum by shoving x86 in and trying to dominate that space as well.

    Well, I don’t like it. I don’t like Intel even on my x86 PC (I’m using AMD), and I certainly don’t want Intel running the show inside my next tablet. ARM has demonstrated they can do well here. The ecosystem is also already here. Why would I want to go with Intel when Android/iOS are already doing well on ARM? To give Intel a shot at dominating this space as well? Sorry Intel, I’m sticking with ARM.

    • Bensam123
    • 7 years ago

    I wonder when they’ll connect the two dots and go back to having a rotating keyboard attached to the monitor… XD

    • blastdoor
    • 7 years ago

    Hey, I just saw this:

    [url<]http://www.eetimes.com/electronics-news/4397207/TI-steering-OMAP-toward-embedded[/url<]

    Was this in a Shortbread and I missed it? Sounds like ARM consolidation is starting to happen.

      • phileasfogg
      • 7 years ago

      well, it’s more like ‘ceding the market to Nvidia and Qualcomm’ rather than consolidation – at least that’s the commentary I saw on barrons dot com yesterday. TI really only has 2 significant-sized design wins for OMAP4 SoCs AFAIK: Kindle Fire+ FireHD and Blackberry. Someone correct me if I missed any other high-volume sockets. (OMAP5 may have some new unannounced design wins). With each new gen of SoC pushing into a new process node and costing ~$75-100M of R&D dollars and very little room for error, they likely decided not to continue to pour future investments into such a high-risk area. Probably a wise choice for TI. This market is also being fiercely contested by Intel (see today’s C-Trail announcement), Mediatek on the lower end and of course, that ever-present hulking giant Samsung, with its own Exynos and a huge r&d expansion in Silicon valley. Enough to cause TI to pull the plug I suppose.

      what the EETimes piece didn’t say is equally as relevant as what it did: TI could hive off the OMAP4/5 tablet+smartphone dev team in its entirety to Amazon. Bezos is a smart cookie – surely he’s realized that he needs much stronger control over the supply of CPUs for current and future-gen tablets.

        • codedivine
        • 7 years ago

        [quote<]well, it's more like 'ceding the market to Nvidia and Qualcomm' rather than consolidation[/quote<]

        Isn't that what consolidation means?

        • blastdoor
        • 7 years ago

        Re: Amazon…. maybe, but I’m skeptical. Amazon has made it pretty clear that their business model is not to make profits off the hardware, and if they were to get into the chip business then that model would become a lot tougher. Far better for them to just buy from the weakest supplier so they can extract the best possible price. My guess is they’ll stick with OMAP for as long as they can, then switch to n-1 generation Tegra. Amazon just wants to sell a “good enough” content delivery device. They don’t need to design their own SOCs to do that.

        I’m guessing NVidia might be the next to fall, maybe in two years or so. Then there will be a big tussle between Qualcomm and Intel that could last a couple more years. Ultimately, I bet the market settles down into Intel and Samsung competing for OEM business, Samsung supplying itself, and Apple using TSMC or GloFo to build Apple-designed SOCs.

        There are just too many players in this market right now. It’s not sustainable.

        • codedivine
        • 7 years ago

        OMAP4 design wins include: Galaxy Nexus, Droid Razr, Galaxy Tab 2 (7.0).

    • phileasfogg
    • 7 years ago

    Take a very close look at what they mean by “dual channel” memory controller. The Atom Z2760 product brief says it’s a dual-channel 32-bit controller, i.e. 16 bits per channel. Little wonder then that the package size is only 14mm x 14mm.

    (C-Trail’s max memory b/w is 3.2GB/s because it is limited to 800 MTransfers/sec, while Tegra3 can support up to LPDDR2-1066). NVidia also claims Tegra3 (Kal-El) can support DDR3L-1500 (at 1.35V).

    What’s also interesting is C-Trail can support an (up to) 8MP primary camera and a 2.1MP secondary camera because of the built-in Image Signal Processor. (Not sure if previous Atom chips included this block.)

      • Rza79
      • 7 years ago

      No, that’s 2 x 32-bit! It’s actually the Tegra 3 that’s bandwidth-starved since it has a single 32-bit controller.
      Z2760: 6.4GB/s
      Tegra 3: 4.2GB/s

        • phileasfogg
        • 7 years ago

        Aah, perhaps you’re right! I can’t see the datasheet for z2760 on ark dot intel dot com but I’ll take your word for it (Anandtech says it’s 2 x 32bit as well so I should have checked before I posted). My apologies.

        Aside: The C-Trail block diagram doesn’t show any PCIe lanes on this SoC. So if a tablet manufacturer wants to design-in a docking connector, can this be done without PCIe support?

          • Rza79
          • 7 years ago

          Basically one USB connection is enough for a docking connector.

          • Beomagi
          • 7 years ago

          None of it matters – Atom isn’t potent enough to make use of even dual channel. Remember socket 754 Athlon 64 vs early socket 940? Almost no real difference, single channel vs dual channel – and those were much more powerful CPUs.

          More disturbing is the handicapped 2GB limit. That’s just weaksauce.
          Also, WHY are they still on DDR2? This isn’t about speed, but power consumption. Isn’t DDR3 more power efficient?

            • vargis14
            • 7 years ago

            My dual channel 940-pin FX-53 would chew up a 754-pin A64…..But the % better was not the same as the % more it cost:)

            • Rza79
            • 7 years ago

            Socket 940 is for Opteron CPUs and shouldn’t be compared to 754 because of the use of registered memory. Compare 754 with 939 and there’s a definite performance difference.

            [quote<]and those were much more powerful CPUs.[/quote<]

            Times have changed, buddy. I sadly have to inform you that those CPUs, as powerful as they were, are now slower than an Atom processor.

            [url<]http://www.cpubenchmark.net/cpu.php?cpu=Intel+Atom+Z2760+%40+1.80GHz[/url<]
            [url<]http://www.cpubenchmark.net/cpu.php?cpu=AMD+Athlon+64+3500%2B[/url<]

            [quote<]More disturbing is the handicapped 2GB limit. ... Also, WHY are they still on DDR2? ... Isn't ddr3 more power efficient?[/quote<]

            Well, no tablet on the market has more than 2GB of RAM and they seem to get along just fine. Also this processor uses [b<]LPDDR2[/b<], which is something very different to DDR2. It uses less power than DDR2 & 3. The LPDDR3 spec was just released, so don't expect products using LPDDR3 soon.

      • bcronce
      • 7 years ago

      They don’t make a 16bit version of DDR. You make no sense.

    • willmore
    • 7 years ago

    [quote<]They're fully x86-compliant, allowing Clover Trail devices to run the full-fat version of Windows 8 rather than its cut-down RT counterpart.[/quote<]

    That brings up a good point. Since current ARM cores (well, stuff that's going to be in the market when this Atom ships) are performance competitive with Atom: Is 'full fat' Windows 8 more resource intensive than RT? If so, how will these chips survive under that increased load? Is there an x86 version of RT for slower Atom chips (say, a single-core phone-type Atom) which would suit the resources of those chips better? Or, is the difference between 'full fat' Win8 and RT just the APIs that are supported? If that's the case, then the difference is that 'full fat' devices will ship with more laptop-like 64-128GB SSDs while RT devices will ship with more tablet-like 16-64GB drives--due to product cost and OS storage needs.

      • Duck
      • 7 years ago

      I expect ‘full fat’ Windows 8 will come with the usual horrendous bloatware preinstalled by ASUS, HP, whoever. That alone could account for a noticeable drop in performance and battery life.

        • brucethemoose
        • 7 years ago

        Well, if atom’s slow enough, they shouldn’t have to slow it down for you.

        They probably will anyway.

        • vargis14
        • 7 years ago

        Unless vizio makes it:)

      • Helmore
      • 7 years ago

      From what I read and saw of hands-ons with several different tablets at the IFA show at the end of August, the Windows RT tablets running on a Tegra 3 chipset were all running very smoothly and had great performance. The couple of Intel Atom Windows 8 tablets that were tested, on the other hand, were choppy and clearly not ready yet. I’m not sure where the difference in performance comes from though, as Intel’s Clover Trail chipset should be faster than Tegra 3.

        • willmore
        • 7 years ago

        There’s the element of graphics driver that can’t be removed. Atom graphics have, traditionally, had horrible driver support–not just on Linux.

          • bthylafh
          • 7 years ago

          That’s mainly been the PowerVR-based Atom graphics with horrible drivers, which IIRC were all on the Z-series Atoms (until this one?).

          This chip is the one that’s not getting driver support from Intel for the power-saving modes unless you’re running Win8, so it’s still going to be crappy for Linux users.

            • willmore
            • 7 years ago

            I’ve never heard of any Atom chip having good driver support. It’s not just been the PowerVR ones. The GMA series was a smelly binary blob–and windows wasn’t much better.

            • Voldenuit
            • 7 years ago

            [quote<]That's mainly been the PowerVR-based Atom graphics with horrible drivers, which IIRC were all on the Z-series Atoms (until this one?).[/quote<]

            These Clover Trail Atoms have PowerVR SGX545 GPUs. Valleyview (originally scheduled for 2012, but delayed until Q4 2013) will incorporate Ivy Bridge graphics.

        • Ricardo Dawkins
        • 7 years ago

        [Citation needed]

        • NeelyCam
        • 7 years ago

        [quote<]The couple of Intel Atom Windows 8 tablets that were tested on the other hand were choppy and clearly not ready yet.[/quote<]

        You mean like this: [url<]http://www.youtube.com/watch?v=xIHuqnBN1CI[/url<]

          • chuckula
          • 7 years ago

          Look Neely! This video of an ARM based Windows system is INSANELY smooth:
          [url<]http://www.youtube.com/watch?v=spTCQtvp6qU[/url<]

          This video of an Intel Tablet shows that Intel is completely hopeless and will never produce a chip that can run in a tablet:

          [url<]http://www.youtube.com/watch?v=spTCQtvp6qU[/url<]

          I think I've just proven my case!

            • NeelyCam
            • 7 years ago

            Um… the same link twice..?

            Maybe this is what you were going for:

            [url<]http://www.youtube.com/watch?v=HWOOefm_rwo&feature=player_detailpage#t=101s[/url<]

            • chuckula
            • 7 years ago

            [quote<]Um... the same link twice..?[/quote<]

            Of course it was... it's just a psychological experiment to do on the ARM fanboys who will say that the exact same video of the exact same tablet is "great" when it supposedly uses ARM and is "unusable" when it supposedly uses Intel....

            • NeelyCam
            • 7 years ago

            Ah, that was too clever for me..

            • cegras
            • 7 years ago

            That video already showed it being pretty choppy on certain transitions.

            • NeelyCam
            • 7 years ago

            Yes it was.. this was worse:

            [url<]http://www.youtube.com/watch?v=HWOOefm_rwo[/url<]

            I love the ending, where it takes 5-10 seconds for two file managers to show up (the demo dude started it twice, since he didn't realize the tablet registered his first touch... because it was so FRIGGING SLOW TO RESPOND)

            • MadManOriginal
            • 7 years ago

            Got anything less than a year old? :p

            • willmore
            • 7 years ago

            I saw tiny little buttons that are easy to hit wrong. If you listen to what he was saying, he was expecting a file manager and IE to launch.

            • Ricardo Dawkins
            • 7 years ago

            oh well, looks like there is some video like this on YouTube where a WinRT tablet got beaten by an Intel Atom Clover Trail tablet

            Check it out: [url<]http://www.youtube.com/watch?v=BPVct0jAAKs[/url<]

            • chuckula
            • 7 years ago

            OK Ricardo, stop posting propaganda videos that contradict the herd-consensus that Atom cannot run a tablet!

            I’ll have you know that one of those animations I saw on the Atom tablet looked a little jerky, so therefore Atom is completely unusable on tablets and Intel has utterly failed! After all, nobody cares about booting a tablet or using so-called “web browsers” really!

          • cegras
          • 7 years ago

          lol, you posted a video of an i5.

            • NeelyCam
            • 7 years ago

            Wow, that was a pathetic miss on my part..

            How about this:

            [url<]http://www.youtube.com/watch?v=ODmaum14Qg4[/url<]

            See how quickly it scrolls and zooms in/out!

    • derFunkenstein
    • 7 years ago

    DDR2? really? Is there a reason to not support DDR3 with its lower cost, lower power consumption, and higher speeds?

      • jensend
      • 7 years ago

      It’s a typo. As you can read at other sites (*cough* anand *cough*), it’s [i<][b<]LP[/b<]DDR2[/i<] - which postdates DDR3 and uses some DDR3 ideas. LPDDR3 standards were just published and no LPDDR3 products will be manufactured until late next year. Regular DDR, whether 2,3, or 4, is a non-starter in the tablet or smartphone space since power is at too much of a premium.

        • willmore
        • 7 years ago

        So LPDDR2 is sort of DDR2.5?

          • Helmore
          • 7 years ago

          Not really. LPDDR2 is meant for a different market, namely the embedded space. You can’t have the same power consumption as laptop DDR2 in a smartphone for example. Another major difference is the amount of chips/packages needed. You just don’t have the same amount of space in a smartphone as in a laptop and ordinary DDR2 takes up way too much space. Then there is the way LPDDR2 can easily be stacked on top of a SoC in a package on package configuration, saving even more space. That’s what LPDDR2 is optimized for. I believe we’ll be seeing LPDDR3 early next year IIRC.

            • Beomagi
            • 7 years ago

            Is this a new DIMM slot type as well? Sounds like searching for upgrades may not be easy at this time.

            • UberGerbil
            • 7 years ago

            LPDDR3 is typically soldered on. Given the differences between DDR4 and earlier standards, the DIMMs (and SO-DIMMs) and slots will be different.

      • Duck
      • 7 years ago

      DDR3 is not automatically better than DDR2. Let us imagine what happens when you start producing DDR2 on a smaller process to at least match DDR3 production… power consumption and operating voltages will drop, for one. Probably matching DDR3. Price per GB will (eventually) drop. Again, probably matching DDR3. As for performance, at least you get much lower latencies compared to DDR3. Bandwidth may not be as high, but I expect it to be higher than what you could achieve when DDR2 was launched for the desktop, thanks to the smaller processes available.

        • willmore
        • 7 years ago

        Considering that the operating voltage is part of the DDR spec, your hypothetical DDR2 won’t actually be DDR2. If you’re going to go through the effort of redesigning a DDR2 part for a smaller process, why not design a DDR3 part instead?

        The latencies are a product of storage element charge and sense amplifier gain (and noise). That’s more a product of process than it is of any other part of the chip.

        I think there might be some confusion about latency here, too. Latency values for memory are in terms of cycles, not in some absolute time value. DDR2 generally has a lower cycle latency, but those cycles take longer. DDR3 has a higher cycle latency, but the cycles are shorter. In the end, the product of cycle time and # of cycles of latency is pretty constant and has been so for a while. There’s no magic in DDR2 that makes it lower latency.
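
        To put ballpark numbers on that, here's a quick Python sketch using typical retail timings as assumptions rather than any specific parts:

            # Absolute CAS latency = CAS cycles x clock period
            # (the command clock runs at half the MT/s data rate)
            ddr2_800_cl5   = 5 / 400e6     # 12.5 ns
            ddr3_1600_cl11 = 11 / 800e6    # 13.75 ns
            print(ddr2_800_cl5 * 1e9, ddr3_1600_cl11 * 1e9)

        Fewer cycles at a slower clock or more cycles at a faster clock both land in the same 12-14 ns neighborhood.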

          • Duck
          • 7 years ago

          If there are no real differences between DDR2 and DDR3, then there would be no need to make DDR2 obsolete and have it replaced. It’s trivial to make an adjustable voltage regulator that could go lower. So it could have been in the original spec. The main problem would be the limit of the memory bus on the motherboard/PCB. But you can have that as a rated spec from the motherboard manufacturer.

          Take a look at some Samsung 30nm DDR3. It seems to have the same latencies as other RAM even though it’s built on a smaller process and is a lower voltage part.

          There seems to be something else going on in the transition from DDR2 to DDR3 that allows the capacity to scale up. Unless you are capacity constrained, and providing DDR3 doesn’t have a die-shrink advantage, DDR2 should be better thanks to lower latencies.

            • bcronce
            • 7 years ago

            Transistors for a given size have a given voltage range. Making transistors smaller requires lower voltages. The voltage for a given memory spec is based on the common operational voltage for the industry at that time.

            One cannot just lower the operational voltage without first shrinking the transistors. This means if you lower the voltage, you must create a new memory controller. Since memory controllers are typically integrated into the CPU, this means you need different versions for a given CPU with different transistor sizes.

            Who would want to fragment their market by doing that?
