Next-gen Samsung SoC has USB 3.0, WQXGA display support

Looks like next-generation Samsung smartphones and tablets will boast PC-class storage interfaces. Samsung’s new Exynos 5 SoC features not only 6Gbps Serial ATA connectivity, but also USB 3.0. The latter should prove particularly useful for folks transferring files to and from their devices, and it makes the Exynos an intriguing option for netbook/tablet hybrids.

Of course, faster storage interfaces aren’t the only new hotness in the Exynos 5. The chip features dual CPU cores based on ARM’s Cortex-A15 architecture. Those cores are clocked at 1.7GHz, according to the official press release, and they’re fed by dual channels of 800MHz LPDDR3 memory. The 12.8GB/s of bandwidth provided by the memory interface allows the chip to power displays with resolutions up to 2560×1600, the same number of pixels as 30" desktop monitors. In what’s claimed to be a first in the mobile industry, Samsung has integrated an Embedded DisplayPort controller and PHY right onto the Exynos SoC. HDMI 1.4a output is supported, as well.
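
For the curious, those figures check out with simple arithmetic. Here’s a quick back-of-the-envelope sketch in Python; the dual 32-bit channel layout and double-data-rate transfers are assumptions on my part, since Samsung’s release quotes only the 800MHz clock and the 12.8GB/s total.

    # Rough sanity check on the quoted numbers (channel layout is assumed)
    io_clock_hz = 800e6                        # 800MHz LPDDR3 I/O clock
    transfers_per_sec = io_clock_hz * 2        # double data rate: two transfers per clock
    bytes_per_transfer = 32 // 8               # assumed 32-bit channel width
    channels = 2                               # dual-channel interface

    bandwidth = transfers_per_sec * bytes_per_transfer * channels / 1e9
    print(f"Memory bandwidth: {bandwidth:.1f} GB/s")    # -> 12.8 GB/s

    pixels = 2560 * 1600
    print(f"WQXGA pixel count: {pixels:,}")             # -> 4,096,000, same as a 30" monitor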

Pixel-pushing horsepower is provided by ARM’s quad-core Mali-T604 GPU, which can also tackle general-purpose computing tasks via OpenCL. Samsung’s Exynos 5 whitepaper claims the GPU doubles the 3D graphics performance of the Mali-400 graphics processor in the Exynos 4. On the video front, the Exynos 5 is purportedly capable of decoding 1080p content at 60 FPS.

There’s no mention of when devices based on the Exynos 5 will hit the market, but I hope we don’t have to wait too long. I really like the idea of an ultra-high-PPI tablet with a 2560×1600 display and USB 3.0 connectivity.

Comments closed
    • sparechange
    • 7 years ago

    If they put this chip in a 10" tablet with all of that, it will be an iPad killer. I think I want one, and I don’t even have a walkabout phone yet – this segment of computing is just getting interesting. Still a PC guy.

      • FrankJ
      • 7 years ago

      I agree, to the extent that it should be better than the iPad, which will be almost instantly replaced with something better than the Samsung offering. Still, it’s a step in the right direction. Most Android offerings are vastly inferior yet cost virtually the same as the current iPad due to Apple’s ruthless deals with parts manufacturers.

    • ronch
    • 7 years ago

    After my bad experience with Samsung’s international warranty policies, I just don’t care what they come up with. They told me they honor int’l warranty claims for refrigerators but not tablets. WTF.

    • kamikaziechameleon
    • 7 years ago

    The new iPad’s display is simply amazing. If only they put a camera on that thing that could use it even a little bit 🙁

    • Farting Bob
    • 7 years ago

    Man, the power usage is going to be pretty brutal. Even for tablets, it seems battery tech/size is growing at a smaller rate than power consumption. Stick one of these onto an HD 10″ display and the battery is going to be hammered when doing anything intensive. Personally, I’d gladly trade an extra 10% weight and thickness for a battery that offers twice as much runtime, but for phones/tablets this still seems like a dream.

      • DancinJack
      • 7 years ago

      Why is power usage going to be brutal?

      • Peldor
      • 7 years ago

      The battery is already 25% or more of the weight of most tablets. If you want to double it, 10% definitely won’t cut it.

      • OneArmedScissor
      • 7 years ago

      [quote<]Even for tablets, it seems battery tech/size is growing at a smaller rate than power consumption.[/quote<]

      Hardly. Just look at the Nexus 7. It actually uses a 35% lower capacity battery than most of its competitors with weaker chips, but lasts as long, or even longer than some. The screen is smaller, but they also came up with an alternative tablet design that cut a few components to pull that off.

      There are still numerous components to either shrink to the current node and tie into the SoC's power control circuits, or just remove due to obsolescence. Samsung is the only manufacturer to use high-k gates so far, and then multigates will follow. No one has shown off a chip with a modern low power core yet, like the Cortex A7. Memristors are just around the corner.

      We haven't even seen some existing technologies utilized in tablets yet. I'm not aware of a single tablet with an OLED backlight, like many phones have. The backlight uses the most power.

      I doubt phones and tablets run into the same battery life wall that laptops have. They're subject to similar limitations, but since it's taking longer to get there, the steady march of progress will carry on right up until 3D chips and completely different displays become viable.

    • albundy
    • 7 years ago

    Hopefully they will wake up from the unsmart TV BS and put this in with Android JB (sorry, Ouya).

    • burntham77
    • 7 years ago

    I would love to see 2560×1600 on a 30 inch monitor from Samsung. Preferably LED, maybe throw in some DisplayPort. USB optional. Thanks!

      • Airmantharp
      • 7 years ago

      I’ll stick with LG panels…

      Now, if they could get response times <8.3ms for 120Hz while supporting the input and switch to a near-glossy matte screen coating with RGB-LED backlights and thin bezels, well, that would rock. Just don’t expect a real gaming or color pro monitor from Samsung. They make consumer rubbish.

      • Wirko
      • 7 years ago

      The same res on 23″-24″ wouldn’t hurt, either.

      • Anonymous Coward
      • 7 years ago

      “And look, we’ve thrown in a fancy SoC for free, so you can install a fully functional version of Linux on your monitor.”

        • stupido
        • 7 years ago

        … and play L4D2 on it… (zombie face & posture): L4D2.. L4D2… must learn linux… must learn linux…

      • FrankJ
      • 7 years ago

      Yes, it’s odd that they have a better display in their $500 tablet than in their $800 monitor (and I own that monitor, lol).

    • bjm
    • 7 years ago

    Meh... I’m still waiting for manufacturers to adopt Intel SoCs en masse, hopefully with open-source drivers, before jumping on the latest and greatest Android train. The Exynos chip is looking nice, but if the drivers aren’t open, updates are still going to come at a snail’s pace.

      • Airmantharp
      • 7 years ago

      You should be less worried about the SoC used than the branding. Nexus devices get updates, everyone else waits, hopes, and prays.

      Intel’s SoCs look promising too but they still have to prove themselves. They’re not a tier 1 SoC competitor yet.

        • NeelyCam
        • 7 years ago

        My prediction is that x86 SoC is the #1 solution by end of 2014.

          • blastdoor
          • 7 years ago

          #1 meaning best or most popular? And for what class of device?

          • Unknown-Error
          • 7 years ago

          When is the successor to ‘Atom Z2580’ coming? Once Intel moves to 22nm (Or is it 14nm?) then they really will kick the competition out. Sounds scary.

            • willmore
            • 7 years ago

            Valley View. It brings open-source graphics (IVB generation) to Atom. That’ll be a large turning point for wide support for Intel almost-SoCs.

            • Unknown-Error
            • 7 years ago

            Thanx. Looks like the usage of PowerVR by Intel was just a temporary thing.

            • Airmantharp
            • 7 years ago

            One can only hope; Intel’s GPU setup more closely resembles what Nvidia and AMD are doing. The mobile IP is great for phones, but I’d rather have desktop-class stuff if I’m using the machine for content creation and not just consumption.

            • BobbinThreadbare
            • 7 years ago

            I’d rather have desktop class stuff for consumption, I like my graphics.

            • willmore
            • 7 years ago

            Nowhere near temporary enough.

          • Airmantharp
          • 7 years ago

          That’s a good target, really.

          Intel should be able to outperform ARM + TSMC/UMC + SoC integrators/vendors easily if they put some effort into it.

        • bjm
        • 7 years ago

        Branding or not, it doesn’t change the fact that practically all the current Android manufacturers use chips whose drivers come in binary form only. This limits the upgrade options for groups like CyanogenMod since they have to use those same drivers for the kernels released by the manufacturer. In addition, there is still the issue that ARM SoCs lack any standard BIOS/UEFI equivalent that the x86 architecture has. Each phone’s firmware has to have code for device initialization, locating the bootloader, executing it, etc. It’s a large duplication of effort for every phone.

        It’s not yet certain if Intel’s SoC will directly address those two drawbacks of ARM, but if the desktop landscape is any indication, they will. Intel’s integrated graphics drivers are already open source, and Intel has used BIOS/UEFI for over three decades.

        If the phones use a similar boot architecture, then you can look forward to releases like: “CyanogenMod 11, now released for Intel based phones. Click here to download now!” Instead of, “CyanogenMod 11, now released for Samsung’s US based phone on AT&T (not Verizon!) that you purchased on a Tuesday. This firmware does not apply to Wednesday purchased versions nor the International version.”

        Okay, so that’s a slight exaggeration, but having open-source drivers for the platform *AND* a standardized boot initialization architecture would make these releases a lot easier to create. It would benefit the Android ecosystem as a whole, and not limit quick updates to just one brand.

    • codedivine
    • 7 years ago

    [quote<]Samsung's Exynos 5 whitepaper claims the GPU doubles the 3D graphics performance of the Mali-400 graphics processor in the Exynos 4[/quote<]

    The whitepaper is a bit ambiguous in places. Is it twice as fast as the older Exynos 4 (in the Galaxy S2) or the new one (in the Galaxy S3)?

    [quote<]general-purpose computing tasks via OpenCL[/quote<]

    As an OpenCL programmer, I'm really looking forward to OpenCL support (with FP64 to boot). The whitepaper claims 72 GFlops, and I have seen numbers in a similar ballpark (68 GFlops) in previous ARM marketing material. I assume that peak is for FP32. Anyway, for comparison, the 18W E-350 had about 80 GFlops FP32 peak. So AMD had better be careful or their lunch will soon be eaten.
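
    For anyone curious where those peaks come from, here's a rough sketch; the E-350 breakdown assumes the Radeon HD 6310's 80 stream processors at roughly 500MHz, while the Mali figure is just the whitepaper's claim.

        # Peak FP32 throughput comparison (E-350 shader config is an assumption)
        mali_t604_gflops = 72                        # Samsung whitepaper claim
        e350_sps, e350_clock_ghz = 80, 0.5           # assumed Radeon HD 6310 config
        e350_gflops = e350_sps * 2 * e350_clock_ghz  # 2 FLOPs per MAD per cycle
        print(f"Mali-T604: ~{mali_t604_gflops} GFLOPS, E-350: ~{e350_gflops:.0f} GFLOPS")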

      • DancinJack
      • 7 years ago

      I don’t know which they’re talking about, but Anand expects performance of the new GPU to fall somewhere between the SGX543MP2 and MP4.

      [url<]http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual[/url<] e: spelling and source

      • bcronce
      • 7 years ago

      I can’t wait for OpenGL to merge with OpenCL. Context switching between the two doesn’t help performance.

      • Flying Fox
      • 7 years ago

      [quote<]So AMD better be careful or their lunch will soon be eaten.[/quote<] I believe their lunch is already being eaten by everybody? It is also one reason why they are looking into building ARM-based chips. If you can't beat them join them...

    • Bensam123
    • 7 years ago

    And tablets edge ever closer to netbooks… Careful, you wouldn’t want to disturb the precious magical lines that somehow separate the two.

      • eofpi
      • 7 years ago

      You mean keyboards?

        • Bensam123
        • 7 years ago

        I was thinking more along the lines of an OS instead of an app store, but that’s pretty close too.

    • Shouefref
    • 7 years ago

    SoC will become the desktop.
    Only question: when exactly?

      • Alexko
      • 7 years ago

      When they can stack a decent amount of DRAM and Flash (or other, equivalent types of memory) on top of them, I would say. The prospect of having an entire computer in a single <1cm³ package is quite appealing.

      • Deanjo
      • 7 years ago

      It’s getting pretty close with Intel’s and AMD’s latest efforts. But even then, x86 SoCs have been around for a long time already, like the offerings from DMP Electronics.

      • sweatshopking
      • 7 years ago

      THE FUTURE

        • CuttinHobo
        • 7 years ago

        But what about when tomorrow is yesterday??

          • sweatshopking
          • 7 years ago

          THEN YOUR HEAD EXPLODES.

            • Haserath
            • 7 years ago

            My head is still

        • ronch
        • 7 years ago

        That’s just three years from now.

      • ronch
      • 7 years ago

      When performance is so high already that any excess power that can be crammed in a desktop part is no longer needed. Right now, that’s not yet the case.

      • ImSpartacus
      • 7 years ago

      Haswell, so that’s next year.

    • XTF
    • 7 years ago

    Am I the only one that doesn’t know what resolution WQXGA is without looking it up?

      • DancinJack
      • 7 years ago

      I know what it is, but I can assure you that you’re not the only one that doesn’t know.

      • Chrispy_
      • 7 years ago

      The VGA-based resolution acronyms went to pot after XGA, because after that the lack of 4:3 aspect ratio added massive ambiguity to the system.

      Without looking it up I am guessing that it stands for Wide Quad XGA:

      [list<][*<]XGA = 1024x768[/*<][*<]QXGA = 2048x1536[/*<][*<]WQXGA = 2560x1536?[/*<][/list<]

      How am I supposed to know how wide? 16:9, 16:10, or something in between like 1280x768 was? It's just meaningless doing all that work and then still not necessarily knowing what the resolution is supposed to be. It's pretty close to the commonly-used 2560x1600 too.

      [i<]Edit: Oh man, it's SUCH A MESS! Here's info on the standard. It's so full of words like 'or', 'usually', 'sometimes', 'near', and 'roughly' that I fail to see how it is defining anything at all:

      "[b<]Wide (W)[/b<] The base resolution increased by increasing the width and keeping the height constant, for square or near-square pixels on a widescreen display, usually with an aspect ratio of either 16:9 (adding an extra 1/3rd width vs a standard 4:3 display) or 16:10 (adding an extra 1/5th). However, it is sometimes used to denote a resolution that would have roughly the same total pixel count as this, but in a different aspect and sharing neither the horizontal OR vertical resolution - typically for a 16:10 resolution which is narrower but taller than the 16:9 option, and therefore larger in both dimensions than the base standard (e.g. compare 1366x768 and 1280x800, both commonly labelled as "WXGA", vs the base 1024x768 "XGA")."[/i<]

        • Alexko
        • 7 years ago

        You forgot “roughly” and “commonly”. :p

        I never use these abbreviations; I prefer to write 2560×1600 or just 1600p, which I think most people will understand because they know what 1080p is.

          • UberGerbil
          • 7 years ago

          Except 1600p suggests to me a horizontal resolution of ~2844 because I associate the “p” notation with HD video and its 16:9 aspect ratio.

            • Alexko
            • 7 years ago

            Well, close enough. 🙂

            • Airmantharp
            • 7 years ago

            We use 1200p for 1920×1200?

            I’m pretty sure it’s arbitrary. Isn’t there a ‘p’ designation for 2.35:1 cinema displays as well? I mean, it just means ‘progressive’ and ‘not interlaced’.

      • JMccovery
      • 7 years ago

      I thought it was in the article…

      “…allows the chip to power displays with resolutions up to 2560×1600, the same number of pixels as 30″ desktop monitors.”

      W(ide)XGA = 1280×800 or 1366×768; in this case, we’ll use 1280×800, quadruple that, and we get 2560×1600, or WQXGA.
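
      In code form, the quadrupling is just a doubling of each dimension (quick sketch; the 1280×800 base is one common 16:10 WXGA variant):

          # "Quad" = 4x the pixels, i.e. double each dimension; "Wide" = 16:10 instead of 4:3
          xga   = (1024, 768)                   # base XGA
          qxga  = (xga[0] * 2, xga[1] * 2)      # QXGA: 2048x1536
          wxga  = (1280, 800)                   # a common WXGA variant
          wqxga = (wxga[0] * 2, wxga[1] * 2)    # WQXGA: 2560x1600

          print(qxga, wqxga)                    # (2048, 1536) (2560, 1600)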

    • MFergus
    • 7 years ago

    Hey, look: the title of the article doesn’t mention "retina" displays. GJ, unlike Ars.

      • Chrispy_
      • 7 years ago

      [b<][u<]Next-gen Samsung SoC has USB 3.0, retina display support[/u<][/b<] Apple litigation in 3... 2... 1...

        • yogibbear
        • 7 years ago

        You can’t patent the retina display. If that happens I will patent your ARMS!
