Say hello to Kal-El, Tegra 2’s quad-core successor

Nvidia has scored a solid number of design wins for its Tegra 2 processor, which is going to find its way into many of the iPad competitors due out this year. The chipmaker isn’t resting on its laurels, though. At the Mobile World Congress in Barcelona, Nvidia says it gave the first public demonstration of Kal-El, Tegra 2’s successor.

Kal-El will feature four ARM cores as well as a “12 core” GeForce GPU—that presumably means a graphics component with 12 ALUs, or stream processors. Samples have already gone out to Nvidia’s customers, who are purportedly planning to kick off production of Kal-El-based gear in August.

The Mobile World Congress demo involved web browsing, games, and streaming of 1440p video on a 2560×1600 panel. As Nvidia points out, the high-res video streaming wasn’t just a show of strength. The firm has made provisions for mobile devices with 10.1″ displays that have a 300-DPI pixel density. If my math is right, a 300-DPI, 10.1″ display with a 16:10 aspect ratio would have a resolution of about 2569×1606, so that’s not far off the mark.
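
For anyone who wants to check that arithmetic, here's a quick back-of-the-envelope sketch in Python (our own illustration, not anything from Nvidia) that turns a diagonal size, aspect ratio, and pixel density into a resolution:

    import math

    def panel_resolution(diagonal_in, aspect_w, aspect_h, dpi):
        # Split the diagonal into physical width and height using the
        # aspect ratio, then convert inches to pixels at the given density.
        ratio_diag = math.hypot(aspect_w, aspect_h)  # e.g. sqrt(16^2 + 10^2)
        width_in = diagonal_in * aspect_w / ratio_diag
        height_in = diagonal_in * aspect_h / ratio_diag
        return round(width_in * dpi), round(height_in * dpi)

    print(panel_resolution(10.1, 16, 10, 300))  # prints (2569, 1606)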

Nvidia didn’t capture the MWC demo on video, but its blog post includes a YouTube clip of a Kal-El development system running a browser benchmark.

There’s also a second video that shows Kal-El running the Coremark benchmark on Android. Apparently, Kal-El managed to outperform Intel’s Core 2 Duo T7200 processor in that test. The benchmark is clearly heavily multi-threaded, so Kal-El’s extra cores must give it an advantage—and, you know, this is a test hand-picked and run by Nvidia. Still, we’re talking about an ARM-based solution here, not an x86 CPU like the Core 2.
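
To see why a heavily multi-threaded benchmark flatters a chip with more cores, here's a minimal sketch in Python; the busy-loop kernel is a made-up stand-in, not CoreMark's actual workload, and the point is only the scaling behavior. An embarrassingly parallel, CPU-bound job split across N worker processes takes roughly the same wall-clock time as long as the machine has at least N cores, so the aggregate throughput "score" grows almost linearly with core count:

    import time
    from multiprocessing import Pool

    def spin(n):
        # CPU-bound busy work standing in for a benchmark kernel
        acc = 0
        for i in range(n):
            acc = (acc + i * i) % 65521
        return acc

    def throughput(workers, work=2_000_000):
        # Each worker gets the same fixed slice of work. With enough
        # physical cores they run concurrently, so elapsed time stays
        # roughly flat and total work per second grows with worker count.
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(spin, [work] * workers)
        return workers * work / (time.perf_counter() - start)

    if __name__ == "__main__":
        for n in (1, 2, 4):
            print(f"{n} workers: {throughput(n):.2e} iterations/s")

On a quad-core machine, the four-worker figure should land near 4X the single-worker one regardless of per-core speed, which is exactly the kind of scaling that favors Kal-El in a test like this.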

What’s next? Nvidia’s MWC blog post shows a roadmap of upcoming Tegra products code-named Wayne, Logan, and Stark, which are scheduled for 2012, 2013, and 2014, respectively. Stark will purportedly be a whopping 75 times quicker than Tegra 2. Crysis 3 on next-gen slates, anyone?

Comments closed
    • moog
    • 9 years ago

    Very nice.

    I’m assuming Kal-El is on 28nm.

    I’m skeptical about the 75x claims for Stark or other generations (schedule looks overly optimistic) since we’re hitting physical limits of silicon within the next couple of yrs. My bet is on Intel having the process advantage once they get their mobile architecture sorted out.

      • shank15217
      • 9 years ago

      Kal-El is on 40nm

      • snowdog
      • 9 years ago

      Nvidia is big on exaggeration. The claimed 5X over Tegra 2 is a lot closer to 2X in practice. They take the best-case CPU boost and the best-case graphics boost and add them together, which doesn’t make any real sense.

    • UberGerbil
    • 9 years ago

    A big chunk of the CoreMark bench [url=http://www.eetimes.com/design/embedded/4212735/CoreMark–A-realistic-way-to-benchmark-CPU-performance?pageNumber=1]involves matrix processing[/url]. If nVidia compiled that to run on their GPU, there's a lot more than 4 ARM A9 cores crunching in that second video. And CoreMark apparently scales pretty much linearly with core count, something that is going to automatically favor a quad (and something that doesn't happen much in the real world, certain embarrassingly parallel applications aside). So ARM matching or beating an x86 [i]anything[/i] (even a two-generation-old 2GHz mobile proc) will raise eyebrows. But hey, sufficiently advanced technology is indistinguishable from a rigged demo. (Not saying it is, but one video from a marketing dog-and-pony presentation is hardly the basis for concluding anything.)

    • d0g_p00p
    • 9 years ago

    So we have Superman (Kal-El), Batman (Wayne), Wolverine (Logan) and Iron Man (Stark) as Tegra code names, nice.

    edit: I should have read the comments before I posted, FAIL!

    • BiffStroganoffsky
    • 9 years ago

    As pointed out elsewhere, the second video would have been more compelling if Kal-El and the C2D had been running the same test: [url]http://images.anandtech.com/reviews/SoC/NVIDIA/Kal-El/DSC_1401.jpg[/url]. Squint a little at the GCC rev. numbers.

      • Game_boy
      • 9 years ago

      And the optimisation levels: -O2 vs. -O3. The other flags affect it by a few percent too.

    • kamikaziechameleon
    • 9 years ago

    If this is in a tablet by next fall, count me in, assuming the power draw is amazingly small.

    • dpaus
    • 9 years ago

    Does anyone see any information about the power draw?

      • Silus
      • 9 years ago

      No more than Tegra 2.

      • Silus
      • 9 years ago

      Forgot to include the link where I got that info:

      [url]http://www.anandtech.com/show/4181/nvidias-project-kalel-quadcore-a9s-coming-to-smartphonestablets-this-year/2[/url]

        • dpaus
        • 9 years ago

        Thanks – now, does anyone know the power draw of a Tegra 2? (I think I asked this a few weeks ago, but I don’t think I got an answer…)

          • Kurotetsu
          • 9 years ago

          A company called CompuLab released an ultra-slim desktop system using Tegra 2 not too long ago:

          [url]https://techreport.com/discussions.x/20313[/url]
          [url]http://www.semiaccurate.com/2011/01/26/compulab-announces-tegra-2-powered-trim-slice/[/url]

          According to SemiAccurate:

          [quote="SemiAccurate"]The Trim Slice is anything but a high-end system, although compared to most devices its size, it really stands out. The chassis measures 130x95x15mm (5.1x3.7x0.6in) and it draws a mere 3W of power on average, not something we’ll see Intel or AMD beat any time soon, although we might get to see some even more power efficient dual core ARM solutions in the near future.[/quote]

          So I'd say around 3W?

            • OneArmedScissor
            • 9 years ago

            Lol, phones have like 5 Wh batteries. The best you can do is look at some phone tests and guesstimate:

            [url]http://www.anandtech.com/show/4144/lg-optimus-2x-nvidia-tegra-2-review-the-first-dual-core-smartphone/15[/url]

            • Silus
            • 9 years ago

            Less than that. Remember that that system has more in it than just the SoC, like the SSD and 1 GB of DDR2, which also consume some power.

            • jdaven
            • 9 years ago

            That’s TOTAL system average power of 3W.

    • jdaven
    • 9 years ago

    See, this is the kind of roadmap and demonstration that shows the huge investment Nvidia is putting into its mobile strategy. We have specs, we have internal model names, and we have a timeline.

    Intel is not going to die, but they currently don’t have the architecture and internal thinking that will bring out a suitable competitor to all this. If Intel starts now, they can dump x86 completely and start from scratch on a brand-new chip architecture. They did it with x86. They did it with Itanium. Why can’t any of the chip companies make another new architecture?

    Development from the ground up, with power reduction baked into the very DNA of the company, is the only way to make it in the upcoming smartphone market. Shoehorning in a 30-year-old chip architecture initially meant for large beige boxes and a connected power source will not cut it in the smartphone world.

      • bhtooefr
      • 9 years ago

      One thing to note is that modern x86 CPUs are far, far closer to modern RISC processors in design, with a decode front-end that breaks x86 instructions down into simpler, RISC-like micro-ops.

      So it’s not really a 30-year-old chip architecture. And once 64-bit OSes take off, almost all of the legacy cruft can be disposed of, provided running old 32-bit OSes isn’t important (32-bit software will still run): the 16-bit and 8-bit modes won’t be used at all, which also means that real mode, the insane segmented addressing, and so on won’t be used, either.

        • maxxcool
        • 9 years ago

        *snort* As long as mom and pop keep buying computers and transition costs stay high, we will see x86 legacy crap for at least 10 more years.

      • OneArmedScissor
      • 9 years ago

      Smartphones as we know them are about to turn into the normal phones that everyone and their dog gets with a regular phone plan. Intel has something else in mind.

      They are working on a higher power x86 platform because they want a higher power x86 platform. The x86 part isn’t necessarily the problem. They are trying to manipulate the direction of the upcoming smart phone market so that it falls in line with their existing PC strategy – edging out the competition in “speed” so they can play marketing-friendly numbers games, sell their brand at a premium, and make competitors look cheap and behind the times.

      Since when did Intel care if they were selling something for a few hundred dollars more that doesn’t work better for the average person?

      It looks like their plan is already working. Without selling a single phone, they’ve spurred an arms race (harharhar!). Everyone is hurrying to get ahead, over-emphasizing speed as a marketing point, just before Intel shows up. There’s a reason they’re sitting on the sidelines right now. It’s easier to let the smaller companies pick each other apart and set the stage for their inevitable laptop in smartphone’s clothing.

      • sjl
      • 9 years ago

      Yeah, Intel came out with x86 and Itanium. They also came out with (ignoring the long obsolete pre-x86 4004, 4040, 8008, 8080, and 8085) the i432, the i860, and the i960. There’s also the StrongARM and XScale; StrongARM came from DEC, and XScale (which is also an ARM chip) was sold to Marvell in 2006.

      Let’s be blunt here. [b]None[/b] of Intel's non-x86 architectures have taken off in the marketplace. The i960 still sees a little use in RAID controllers and other niche applications, but other than that ... bugger all, really. Itanium had high hopes, and (relative to those hopes) has proven to be a massive flop ("Itanic" is far too apt a nickname). In the CPU market, Intel looks very much like a one-trick pony; their history doesn't give me confidence in their ability to build a new low-power architecture. Besides, ARM is already in that space, and doing very nicely indeed; why would an embedded shop give up the massive expertise that has been built up around ARM chips?

      • ludi
      • 9 years ago

      x86 has successfully fended off all would-be assassins for going on 33 years now. Pointing out that Intel managed to do a “whole new architecture” with x86 is a canard when the year in question is 1978 and the modern computing industry, with its four mega-Farads of installed infrastructure, was nowhere in existence.

      IA64 has been effectively niched.

      In short, I have no idea what you’re driving toward. Intel arguably does need to get a significant presence in the mobile sector RSN, but their most recent attempt to move in that direction was Atom, and look how well that’s gone for them. Also, it’s [i]still[/i] x86.

    • alloyD
    • 9 years ago

    As I watched the Coremark video, I noticed (as the commentator mentioned) that the performance is about double that of the dual-core Tegra 2. Shouldn’t that easily follow? Doubling the number of cores should double your score on a multi-threaded CPU benchmark. This leads me to ask: Is Kal-El really the successor to Tegra 2, or is it Tegra 2 with double the cores? I’m not saying that what they’ve done isn’t impressive; I’m just trying to understand whether this is a new chip design or not.

      • Game_boy
      • 9 years ago

      Nvidia didn’t modify the Cortex A9 CPU core from the reference design for either Tegra 2 or this. So on the CPU side it’s just more cores. Denver is Nvidia’s first in-house ARM design.

        • alloyD
        • 9 years ago

        Thanks! That clears that up for me.

        • mczak
        • 9 years ago

        That isn’t quite true. Yes, nvidia didn’t modify the cores, but Tegra 2 does not contain the optional MPE (media processing engine, supporting NEON instructions), whereas Kal-El does.

    • kvndoom
    • 9 years ago

    Why was Superman the first thing to come to mind when I read this headline??

      • flip-mode
      • 9 years ago

      Kal-El: Superman
      Wayne: Batman
      Logan: Wolverine
      Stark: Iron Man

        • khands
        • 9 years ago

        To be more specific, Kal-El is Superman’s birth/Kryptonian name.

        • thanatos355
        • 9 years ago

        This was exactly what I was thinking while reading the article.

      • dpaus
      • 9 years ago

      The first thing I thought of was Leonard Hofstadter’s Facebook page….

    • Silus
    • 9 years ago

    The term “stream processors” is usually reserved for a unified shader architecture, which the GeForce chip in Tegra 3 doesn’t have. It’s an evolution of the GPU in Tegra 2, which had 4 vertex shaders and 4 pixel shaders. The “new” one in Tegra 3 has 12 “cores,” but it sticks with the split vertex/pixel shader arrangement.

    • lilbuddhaman
    • 9 years ago

    1. but does it blend

    2. where and when is a slate/device [that doesn’t have a horrible UI/OS] that uses it?

    • Hattig
    • 9 years ago

    Some useful information here.

    First is that 300-DPI 10.1″ displays will be coming out soon (possibly this year). These will look amazing. It’s potentially enough to put you off buying a tablet before they arrive. It also supports those rumours of a higher-DPI iPad: although it’s unlikely the iPad 2 will have one, there’s now the rumoured iPad 3 coming out pre-holidays.

    Second is that a 2GHz Core 2 Duo may be from an old generation of processors, but it’s a significant processor architecture. This isn’t ARM possibly beating Atom, like the arguments from earlier this year. There are loads of ultra-portables out there still using sub-2GHz Core 2 Duos (the MacBook Air for one), or sub-1.6GHz Core i5/i7s.

    Thirdly, it appears that Nvidia is finally on form with their Tegra line, and they’re not messing around.

      • bhtooefr
      • 9 years ago

      For that matter, last month I received a brand new Core 2 [b]SOLO[/b] SU3500 laptop from my employer. (And the model is available with ULV Core 2 Duos...)

    • bhtooefr
    • 9 years ago

    [quote]As Nvidia points out, the high-res video streaming wasn't just a show of strength. The firm has made provisions for mobile devices with 10.1" displays that have a 300-DPI pixel density. If my math is right, a 300-DPI, 10.1" display with a 16:10 aspect ratio would have a resolution of about 2569x1606, so that's not far off the mark.[/quote]

    I think I just popped a woody.

      • pins
      • 9 years ago

      You’re somehow unsure of whether you have or not?

        • Grigory
        • 9 years ago

        Do *you* have a microscope to hand constantly? I don’t think so!

          • grantmeaname
          • 9 years ago

          He uses a CRT that’s like 16″ and 3840*2400.

            • Grigory
            • 9 years ago

            Actually I was only referring to the unknown state of his pecker. 🙂

            • bhtooefr
            • 9 years ago

            LCD, 22.2″, and to be completely honest, I don’t use it that often – the setup needed to drive it off of my laptop (my main computer) is a little insane, and the 15″ 2048×1536 panel I’ve got in it is enough desktop real estate for most things.

        • bhtooefr
        • 9 years ago

        Touché.

    • Dagwood
    • 9 years ago

    Death of Intel? I think not. The test was hand-picked, and it’s a Core 2 (two generations old). Intel remains king in the desktop/server/workstation world.

    However! Intel really does need to rethink the Atom. The Atom-powered netbook is looking really old now.

    Apple has to be thinking about how it is going to compete with future generations of tablets with either Fusion or Tegra parts. Apple has a history of being first to the market then loosing market share from not keeping pace with the competition.

    My prediction is that the tablet is here to stay. It is the new laptop for the masses.

      • Helmore
      • 9 years ago

      I don’t see why Apple has to rethink anything based on this demo.

      • TheBulletMagnet
      • 9 years ago

      It’s losing. As in losing market share. Not loose.

      • TaBoVilla
      • 9 years ago

      [url]http://theoatmeal.com/comics/misspelling[/url]

        • TheBulletMagnet
        • 9 years ago

        Well there went several minutes of my morning. Awesome link.

          • RickyTick
          • 9 years ago

          It cost me nearly 20 minutes.

        • NeelyCam
        • 9 years ago

        Perfect.

      • bhtooefr
      • 9 years ago

      The thing is, those performance numbers are from Cortex-A9 cores, and I’m guessing in the same clock speed ballpark as Tegra 2, based on that performance.

      While, yes, by the time the A15 is shipping it’ll also be a couple of generations out of date, IIRC it’ll be in Core 2’s performance-per-MHz territory, and in its clock speed territory, too.

      And, as for being a couple generations out of date… there’s the “fast enough” factor, too…

      A 2 GHz Core Duo, my main machine’s CPU, is fast enough for most of what I do with a computer – in fact, I’m RAM constrained more than anything – and is slower than this. Sure, it’s three uarch generations out of date, but it’s fast enough.

    • blastdoor
    • 9 years ago

    Huh… I thought I posted something, but now I don’t see it.

    Anyway…

    I think this is another piece of evidence that the next generation of consoles could be ARM-based.

    And it might also be evidence that Intel is doomed, or at least should have been more willing to license x86.

    • blastdoor
    • 9 years ago

    edit —

    now I see my original post. Problem with comment system?
