Trinity due mid-year in 17W, 35W variants

AMD had a surprise waiting for us at its CES booth: a new demo of the upcoming Trinity APU, some details about the chip’s performance and launch schedule, and a glimpse at a couple of real Trinity chips.

According to AMD’s Raymond Dumbeck, Trinity will roll out in 35W and 17W variants—shown in that order in the image above. Compared to AMD’s existing A-Series APUs, the 35W model’s graphics component will purportedly deliver 50% more gigaFLOPS, while the CPU will offer a 25% increase in compute power. We were told the 17W variant boasts similar performance at half the power, making it ideal for ultra-thin notebooks. AMD expects both versions of Trinity to be out around the middle of the year.

We were also treated to a fresh demo of Trinity in action. The demo rig was running a DirectX 11 racing game on one monitor and a GPU-accelerated video transcoder on another. When Dumbeck popped open the case for the big reveal, we saw that the hardware inside was a Trinity-powered notebook. The machine had a video playing on its display, too, highlighting Trinity’s built-in multi-monitor support.

Comments closed
    • Rza79
    • 8 years ago

    I’ve read on other websites that the video transcoder was running on the CPU. That makes sense since that way the GPU is handling the game, the CPU is handling the transcoding and the UVD is handling the video.

    • shank15217
    • 8 years ago

    Why is it due mid-year? Why is there such a long lead time between demo and production? Their GPU division just pulled a rabbit out of its hat with a 4.3B-transistor graphics monster and probably a 5-6 month lead over Nvidia, while their CPU division is gonna be 3 months behind Intel with a value-competitive part.

      • stmok
      • 8 years ago

      [quote<]Why is it due mid-year? Why is there such a long lead time between demo and production? Their GPU division just pulled a rabbit out of its hat with a 4.3B-transistor graphics monster and probably a 5-6 month lead over Nvidia, while their CPU division is gonna be 3 months behind Intel with a value-competitive part.[/quote<]

      (1) Different design teams. (GPU teams and CPU teams don't work in the same room.)

      (2) GPUs and Bobcat-based APUs are made at TSMC.

      (3) FX/Opteron CPUs and A-series APUs are made at GlobalFoundries.

      (4) GPU-only products have fewer design/manufacturing issues than a CPU/GPU hybrid design. (More design compromises with the latter, as it has to fine-tune and balance between the needs of the CPU and GPU cores on the same silicon.)

      (5) GlobalFoundries needs time to set things up for mass production. At this time, Trinity is undergoing the final stage of manufacturing bring-up: making test samples. (A trial run with a small sample to make sure the mass-production process has minimal issues when it goes live. Time spent here means fewer delays and problems down the line.)

      (6) Given the popularity of the APUs with OEMs, AMD is adding more time so GlobalFoundries has ample production time to meet OEM and retail needs. (Last year, Bobcat-based APUs were in such heavy demand that countries like Australia missed out on certain products due to limited chip supply. ie: They didn't make enough! AMD had to place another order!)

      (7) GlobalFoundries no longer has AMD as its sole focus. They also have other clients that need ARM-based processors, while their engineering talent is focused on future manufacturing processes. (It's why AMD had a tough time with the 32nm SOI process in 2011. GF's engineering folks were assigned to 28nm, 22nm, etc. technology: the kind of key people required to help mature 32nm SOI in a shorter time.)

    • Flatland_Spider
    • 8 years ago

    The 17W APU may be just what I’m looking for to use in a home server.

    I’m not too enthusiastic about the Realtek and Broadcom NICs that usually get bundled with AMD hardware. At least with Intel, I can buy Intel boards and get Intel NICs which have good *nix support.

      • shank15217
      • 8 years ago

      Not true at all; Intel chipsets don’t mean Intel NICs.

        • NeelyCam
        • 8 years ago

        He said “Intel boards”, and yes – that does mean Intel NICs.

          • shank15217
          • 8 years ago

          Very few people buy Intel-branded boards; if he meant that, it’s not obvious.

            • NeelyCam
            • 8 years ago

            No need to downthumb – I was just pointing out what he meant. I’m sorry if it wasn’t obvious to you… but to me, mentioning “Intel boards” and “Intel NICs” in the same sentence makes it pretty obvious that he was talking about “Intel boards” – not “Intel chipsets”.

        • Flatland_Spider
        • 8 years ago

        I meant motherboards from Intel when I said Intel boards. You’re right that Asus, Gigabyte, et al. bundle Realtek and Broadcom NICs with Intel chipsets as well.

    • slaimus
    • 8 years ago

    Finally, the 17W Fusion appears.

    I can see this as:

    Engineer: It is very risky to deliver 17W Llano on a new 32nm process. We can probably do it with Trinity.
    Management: But Apple wanted the 17W chips yesterday for the new MacBook Air. So we will take the risk and promise them the chips, since if we do succeed, this will be a good coup over Intel.

    And then all the AMD managers get fired when Apple gets mad that they could not deliver 17W Llanos and switches even the discrete GPUs in other Macs to Nvidia.

      • sschaem
      • 8 years ago

      The issue was yield…

      And the CEO and all the others that got fired were let go for more serious reasons than losing this potential contract.

    • Abdulahad
    • 8 years ago

    I get a strong feeling INTELIE fanboys would be among the first to rush for Trinity, especially after seeing Ivy Bridge taking curbs and bumps on its own…
    Mooly wasn’t caught red-handed, he was caught hands-up and turned red… :-)

    I sincerely wonder how many more lies went through and still go on undetected….
    INTEL… LIE AHEAD..lol

      • chuckula
      • 8 years ago

      [quote<]I sincerely wonder[/quote<] No you don't. There's nothing sincere about you whatsoever.

    • Abdulahad
    • 8 years ago

    BRING IT ON AMD!!!BRING IT ON!!!
    ULTRABOOKS AT ULTRALOW PRICES WITH ULTRAGRAPHICS..!!!
    And on top of that, no Ultralie!!! A game is a game, no pre-recorded video!!!

    My Turion II still keeps me going smoothly, but I’m so tempted by a Trinity upgrade…
    What an investment; life is cheap and simple with AMD solutions!

      • stupido
      • 8 years ago

      AMD akkbaaarrr! 😀

    • Pantsu
    • 8 years ago

    Mid-year? I thought these were supposed to be out now. Then again it’s AMD, which means mid-year is a paper launch in September.

      • Abdulahad
      • 8 years ago

      It’s better late than a lie…. (Phantom f1 driver)lol
      Shame on Intel or Intelie!!

    • ronch
    • 8 years ago

    Some time ago I think I read at Anandtech that AMD is working on the next BD stepping, presumably B3. It was also mentioned that AMD historically took about 9 months to get new steppings out. If this same time period applies to Bulldozer and its revisions, it means we could expect the new steppings to hit around mid year as well. I’ve been wondering though, whether AMD will use Piledriver on the new steppings, which wouldn’t be simply new steppings anymore if they use Piledriver. And putting out Bulldozer B3 steppings alongside Piledriver doesn’t seem logical to me if both are socket compatible anyway. AMD could just do silicon tweaks on Piledriver-based chips directly and do away with the Bulldozer B3 chips.

      • Game_boy
      • 8 years ago

      Re-laying out Orochi with PD cores instead of BD would take about a year and a whole design team dedicated to it. It isn’t a case of just swapping them out on the design template.

    • ronch
    • 8 years ago

    So there’s a 17W half-power version that puts out similar performance? That’s some sweet cherry-pickin’ going on. Hopefully Trinity could be inside my next laptop.

      • NeelyCam
      • 8 years ago

      They mean similar performance to Llano.

        • paulWTAMU
        • 8 years ago

        D’oh! I had my hopes all up 🙁

          • Arag0n
          • 8 years ago

          Still, a 17W Trinity APU capable of delivering much higher performance than an E-350 with a similar power envelope… It’s no easy target, and an amazing processor for budget and lightweight systems. I would rather buy an “Ultrathin” than an “Ultrabook”… since I don’t expect my “Ultrathin” to be powerful, just to not be laggy and choppy.

            • khands
            • 8 years ago

            Agreed on all points.

            • franzius
            • 8 years ago

            The Trinity APUs are a replacement for the Llano APUs, not the Bobcat APUs.

            • NeelyCam
            • 8 years ago

            They are replacing both.

    • adisor19
    • 8 years ago

    A 17W Trinity sounds like a great candidate for an MBA with proper graphics… IF AMD can deliver it in the numbers that Apple requires.

    Adi

      • ronch
      • 8 years ago

      They could supply Apple well enough, perhaps, but then they’d be leaving all the rest of their OEM partners angry with unfulfilled orders. Happened before when Dell hopped on the AMD bandwagon.

    • no51
    • 8 years ago

    YO DAWG I HEARD YOU LIKE COMPUTERS…

    • lycium
    • 8 years ago

    yo dawg, i herd u liek computers so we put a laptop in ur desktop…

    • jensend
    • 8 years ago

    No word on desktop parts yet I guess… if they’re moving from 35 and 45W laptop parts to 17 and 35W, I wonder what kind of TDP reductions we might see on the desktop side. 2-month-old rumors suggest they’re sticking with 65W and 100W TDPs and trying to focus on performance, but if they released any ~45W parts they might be a compelling alternative to Intel’s xxxxT processors in the low-power SFF and HTPC space.

      • sschaem
      • 8 years ago

      The stated TDP is at stock… but you are welcome to undervolt/underclock to “any TDP” with modern processors.
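      (A minimal sketch of the underclocking half of that, assuming a Linux box that exposes the standard cpufreq sysfs files and run as root; the 1.4 GHz cap below is just an illustrative number, not anything AMD recommends. Undervolting itself is vendor/board specific and not shown.)

      # Minimal sketch: cap the maximum CPU frequency via Linux's cpufreq sysfs
      # interface to trade clock speed for power draw. Assumes the kernel exposes
      # /sys/devices/system/cpu/cpu*/cpufreq/ and that this runs as root.
      import glob

      CAP_KHZ = 1_400_000  # 1.4 GHz, expressed in kHz as cpufreq expects

      for cpu in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq"):
          with open(f"{cpu}/scaling_max_freq", "w") as f:
              f.write(str(CAP_KHZ))        # clamp the top frequency
          with open(f"{cpu}/scaling_governor", "w") as f:
              f.write("powersave")         # and pick a conservative governor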

        • khands
        • 8 years ago

        That will be interesting to see on the 65W chips, [i<]how low, can we go?[/i<]

      • Goty
      • 8 years ago

      I run a 95W Phenom II at full speed inside a completely silent HTPC (well, in terms of cooling, anyhow; I can still hear my WD Caviar Green seeking away inside). The only place you need “energy-efficient” parts in the HTPC space is in those ridiculous SFF cases that don’t really lend themselves well to silence in the first place.

        • Anonymous Coward
        • 8 years ago

        There are some passively cooled cases out there that use lower-wattage chips, e.g. [url<]http://www.tranquilpcshop.co.uk/power-pc-ixls-custom-build/[/url<]

        • Palek
        • 8 years ago

        I own a Shuttle cube and have an Athlon II 250e in it. It’s a 45W part clocked at 3.0GHz. I’d say going for it was definitely worth it; the compy sips power and is very, very quiet. I can only hear the fans when there is complete silence in the room. In other words: my experience says otherwise.

    • tejas84
    • 8 years ago

    Does anybody know whether the GPU on this is VLIW4 or GCN?

      • guilmon14
      • 8 years ago

      It’s VLIPr

      sorry VLIPr

      VLIW4
      %Pr

        • tejas84
        • 8 years ago

        Can anyone else confirm that it is VLIW4?

          • derFunkenstein
          • 8 years ago

          ya, maybe someone who isn’t PWI. :LOL:

      • Palek
      • 8 years ago

      According to [url=http://arstechnica.com/gadgets/news/2012/01/amd-aiming-to-undercut-ultrabooks-with-500-trinity-ultrathins.ars<]Ars[/url<] the Trinity GPU is based on GCN.

        • Goty
        • 8 years ago

        This would actually surprise me somewhat. GCN would definitely be the way to go for integration with the CPU, but VLIW4 would be the better path in terms of efficiency, I think.

          • Palek
          • 8 years ago

          It does sound unlikely, considering that would mean AMD actually had two “first” GCN products in the pipeline, and for two different fabs at that. That would be a pretty risky strategy for a new architecture.

          At the same time it makes perfect sense to inject some of the improved GPGPU mojo of GCN into Trinity – Piledriver probably needs all the help it can get to catch up with Ivy Bridge.

            • wierdo
            • 8 years ago

            From AMD’s past charts, this indeed sounds like the eventual convergence point, where the GPU and CPU merge into a more closely coupled system, at which point I imagine something like GCN would be used as a building block since it does better in handling GPU computing related workloads.

            • Goty
            • 8 years ago

            There isn’t any software out there now that could leverage GCN to catch Trinity up with Ivy Bridge on the non-graphics side.

            • khands
            • 8 years ago

            Well, there isn’t any now; sometime down the line there might be, though, so the earlier they can converge the two, the better it will be for them in the long run.

        • Palek
        • 8 years ago

        Nevermind… According to Anand [url=http://www.anandtech.com/show/5426/amd-clarifies-7000m-strategy<]AMD has confirmed[/url<] that Trinity will in fact use VLIW4, though somewhat improved over Llano.

      • guardianl
      • 8 years ago

      It should be. AMD was going to do GCN on 32nm at TSMC until they gave up—*cough* I mean “skipped” right to 28nm.

    • wierdo
    • 8 years ago

    Thought it was a clever little show, though the “second generation DX11 is much harder” comment was cheesy marketing imho.

    At least VLC was used for playing a baseball clip in this case. Intel chose a bad time to fake their gaming performance presentation right before this one; I don’t know why they did that, they didn’t need to.

      • Game_boy
      • 8 years ago

      Their externally-sourced graphics chip is capable of DX11 (and does it in other devices) but Intel screwed up the design so it only does DX9 in their products. At least that’s what S|A was claiming.

        • willmore
        • 8 years ago

        I believe SA was talking about the graphics in the new Atom processors, not IVB nor Trinity.

    • OneArmedScissor
    • 8 years ago

    I can understand that they’re looking for something to make up for the loss of the 28nm Bobcat SoC, but they really need to just commit to something here and hurry up. “Mid-year” is extremely ambiguous, and I won’t be surprised if there’s yet another delay.

    Their yields might be better than with Llano, but they look like they’re trying to repeat the same mistake of turning it into a befuddled “not really high end but maybe” and “not really low power but maybe” platform.

    They really need to just get the low-power dual cores out there and move on. That is what people will buy and that is what they can surely make enough of. With Llano, they never even got around to this. The better GPU might be a selling point for a few, but only a few, and the quad-core part that comes along with it really isn’t going to be a selling point for much of anyone. Ivy Bridge quad-cores will already have been out, and also at 35W.

      • chuckula
      • 8 years ago

      From this demo + history + my gut instinct:

      17 Watt Trinity vs. 17 Watt IB:
      A 17 watt IB will easily beat it in CPU, but the Trinity will win in GPU. However, the advantage in GPU over IB will be less when compared to Llano’s GPU advantage over Sandy Bridge.

      35 Watt Trinity vs. 17 Watt IB:
      Closes some (maybe not all) of the CPU gap, and opens up the GPU lead to levels similar to Llano vs. Sandy Bridge.

      35 Watt Trinity vs. 35 Watt IB:
      IB is back in a comfortable lead in CPU, and maybe does a little (but not much) to close the gap in GPU. Remember, the 17 Watt IB parts actually come with the highest-end GPU from Intel, so the higher-TDP parts don’t get a big boost in GPU (maybe some clockspeed scaling bumps).

        • NeelyCam
        • 8 years ago

        Yeah, I’m curious to see how well an AMD BD-derivative 17W 32nm non-trigate part will do against an Intel SB-derivative 17W 22nm trigate part…

        I mean, we all remember how BD vs. SB went, and that was 32nm non-trigate vs. 32nm non-trigate…

          • Joe Miller
          • 8 years ago

          Did you enjoy the Ivy demo of Mooly Eden? It was very nice.

            • NeelyCam
            • 8 years ago

            It was a marketing fail, but as Anand showed, IB DX11 works fine on both standard notebooks and ultrabooks.

            But that’s pretty much unrelated to my comment about AMD being practically two nodes behind (1. 22nm vs 32nm, 2. trigate vs. non-trigate) but expecting to have competitive performance at 17W. I predict fail. GPU might work OK because they use more than half of the TDP budget for that, but the chip overall is still fail.

        • shank15217
        • 8 years ago

        Llano’s GPU trashes Intel’s GPU and Trinity is 50% faster, so why wouldn’t it trash IB? Oh, and don’t give frame rates; they don’t mean anything. Intel’s GPUs cut all sorts of graphics-quality corners to get good performance, and IB will probably do the same.

    • geekl33tgamer
    • 8 years ago

    Unless I’m missing something, this little APU was running 3 screens all at the same time. Not only that, it was running Dirt 3 (with DX11 actually enabled and smooth FPS? Dubious) and a GPU-accelerated transcode on the GPU/CPU at the same time…

    …It’s one powerful little chip at face value, then?

      • Ringofett
      • 8 years ago

      Well, we don’t know the resolution it was pushing in Dirt 3, do we? Or other possible IQ tweaks. So that’s an open question.

      The transcode was also using the GPU, but again, how fast was it going and how much was it using the GPU? Maybe it was barely touching the GPU and pegging the CPU?

      The video was probably using the UVD block and a tiny bit of CPU, so that’s no surprise.

      If Dirt 3 wasn’t cranked, the transcode was crawling, and the video using separate acceleration, then the demo is cute but didn’t impress me a ton. That’s just what I expect at this point.

        • BlackStar
        • 8 years ago

        There’s a dedicated transcode chip (codec, actually) so it’s running full-speed. This is indeed where things are heading, but it’s still impressive and opens up new possibilities for ultraportable systems.

          • Ringofett
          • 8 years ago

          That makes sense then. I wondered if it had the same sort of hardware that’s integrated in the 7970.

          Not sure why the -8, though. It’s still an interesting product for laptops, but I don’t see how it’s as compelling for desktops outside of budget builds. Its GPU won’t be good enough for gamers, and its CPU side will get crushed by Ivy Bridge. In other words, Trinity will keep AMD treading in mediocre water, hence I wasn’t impressed. My apologies to those emotionally invested in the company, I guess. Indeed, Ivy Bridge can do DX11, so it might eat a lot of Trinity’s lunch.

          I’ve actually been more impressed by demos of multitasking on certain tablets; the Playbook comes to mind. I just don’t see what’s groundbreaking here. Maybe the multi-monitor support, and maybe that’s useful to some, but what % of the market?

    • indeego
    • 8 years ago

    “And one more thing!” and then proceeding to unscrew a case cover to reveal a notebook within to lackluster applause.

      • NeelyCam
      • 8 years ago

      I thought that was kind of cute.

        • l33t-g4m3r
        • 8 years ago

        I liked the “yo dawg” comment more.

      • helboy
      • 8 years ago

      Well, better than what Mooly did on stage: “dance dance jig jig slip slip on and on…” 🙂 Pathetically disappointing. The blue team should have spent much more time covering up their mistakes on stage. Well, it’s all business at EOD.

    • dpaus
    • 8 years ago

    That tower case needs an exorcist!

    So, a laptop driving two external monitors and its internal display. About time! Good thing, too, since Ivy Bridge will also do that, leaving only Nvidia limiting output to two displays maximum (which I’m sure they’ll address by then too).

      • mczak
      • 8 years ago

      Trinity supporting 3 displays isn’t really news, since some old slides said something along the lines of “enhanced display support” and “Eyefinity”, even though AMD didn’t give any specifics.
      As for Nvidia, since they don’t have any APUs (not x86 ones, at least), as long as they rely on Optimus on notebooks (and I would expect a lot of notebooks to do that), they get triple-display support without even having to do anything (so the first system with a single Nvidia GPU supporting 3 displays might actually be an Optimus-enabled Ivy Bridge system with some older GeForce 5xx chip…). That said, I’d be _very_ surprised if Nvidia didn’t feature more than 2 display outputs in their next-gen chips.

    • chuckula
    • 8 years ago

    Looks interesting! One rather cryptic comment from the story is that the 17-watt TDP chips will have similar performance to the 35-watt parts, just at a lower TDP. If this is true, what sort of tweaks to process/binning are being done to make this happen for mass production of lots of chips?

      • Game_boy
      • 8 years ago

      The claim is similar performance to Llano in 35W. The architecture improvements from Llano -> Bulldozer -> Piledriver core, in particular power states, account for this. Also a year of 32nm development on the process side.

        • chuckula
        • 8 years ago

        OK, that makes more sense and jibes with what AMD has been doing with the new CPU + GPU.

      • DavidC1
      • 8 years ago

      The wording is confusing, but they are saying the 17W Trinity will have similar performance to the 35W Llano. If the 17W Trinity and 35W Trinity were similar in performance, it would be redundant.
