Leak slides spill Haswell chipset details

Intel’s next-generation Haswell CPU is due out next year, and details are trickling out about its supporting Lynx Point chipset. According to a series of slides posted at Chinese site EXPreview, desktop-bound versions of the platform will sport six USB 3.0 ports, 14 USB 2.0 ports, six 6Gbps SATA ports, and eight PCIe 2.0 lanes. Two of the USB 3.0 and 6Gbps SATA ports are listed as “muxed with PCIe,” suggesting some bandwidth sharing with the PCI Express lanes. It’s still nice to see additional high-bandwidth I/O ports, though.
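As a rough illustration of why that muxing makes sense, here is a back-of-the-envelope comparison of the aggregate link bandwidth behind those port counts. The per-port rates are standard spec figures rather than numbers from the slides: 5Gbps per USB 3.0 port, 6Gbps per SATA port, and roughly 4Gbps of usable bandwidth per PCIe 2.0 lane after 8b/10b encoding.

# Back-of-the-envelope bandwidth math for the desktop Lynx Point port counts
# reported above. Per-port rates are the usual spec figures, not slide data.
usb3_ports, sata6_ports, pcie2_lanes = 6, 6, 8
usb3_gbps = usb3_ports * 5.0          # USB 3.0 signals at 5Gbps per port
sata6_gbps = sata6_ports * 6.0        # SATA III signals at 6Gbps per port
pcie2_gbps = pcie2_lanes * 5.0 * 0.8  # 5GT/s per lane; 8b/10b leaves ~4Gbps usable
print(f"USB 3.0 aggregate:    {usb3_gbps:.0f} Gbps raw")
print(f"SATA 6Gbps aggregate: {sata6_gbps:.0f} Gbps raw")
print(f"PCIe 2.0 aggregate:   {pcie2_gbps:.0f} Gbps usable")
# No realistic workload drives every port at once, which is why sharing a
# couple of USB 3.0 and SATA ports with the PCIe lanes is a cheap trade-off.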

Some of those ports will purportedly disappear in Lynx Point-LP, a low-power version of the chip targeted at mobile systems. USB connectivity is cut to eight 2.0 ports and four 3.0 ones. Only four SATA ports remain, and just three of them can hit 6Gbps speeds. The number of PCIe lanes has been trimmed to six, as well. Interestingly, all the SATA ports and two of the USB 3.0 ports are apparently being handled through the chipset’s PCI Express interface.

CPU overclocking support appears to have been dropped from Lynx Point-LP, which isn’t surprising. What’s really interesting is the omission of the DMI and FDI interconnects that usually hook up to the CPU. In their place, Lynx Point-LP will reportedly use a new OPI link with eight lanes of bandwidth. OPI stands for On Package Interface, and one of the slides shows the platform hub sitting on the same package as the Haswell CPU. Oooh.

Lynx Point will be fabbed on a 32-nm process, say the slides, so it shouldn’t take up too much room on the package. It appears the chipset will be part of the CPU’s thermal management scheme, with CPU throttling invoked if the platform hub gets too hot, and vice versa. To further conserve power, it looks like ultra-low-power versions of Haswell will be able to drop their base clocks from 100MHz down to just 24MHz. These so-called ULT chips will additionally support C8, C9, and C10 power states that will be absent from the desktop line.
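Whether those deep idle states actually show up on a given machine is something you can check from the operating system. As a minimal sketch (assuming a Linux system with the standard cpuidle sysfs interface; state names and depths vary by platform and idle driver), the following Python lists the idle states the kernel exposes:

# Minimal sketch: list the idle (C-) states Linux exposes via cpuidle sysfs.
# Assumes /sys/devices/system/cpu/cpu0/cpuidle exists; names vary by driver.
from pathlib import Path

def list_idle_states(cpu=0):
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle")
    states = []
    for state_dir in sorted(base.glob("state*")):
        name = (state_dir / "name").read_text().strip()
        desc = (state_dir / "desc").read_text().strip()
        states.append((name, desc))
    return states

if __name__ == "__main__":
    for name, desc in list_idle_states():
        print(f"{name:10s} {desc}")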

Haswell was designed primarily for thin-and-light mobile systems, so a greater degree of platform integration only makes sense. Given Intel’s history of moving platform components onto the chip package before integrating them into the die, I wonder if Haswell might be the last Intel microarchitecture without SoC-style integrated I/O.

Comments closed
    • Bensam123
    • 7 years ago

    This almost looks like a step backwards… WTF is up with the USB 2.0 and 3.0 options, too? Why are they even offering USB 2.0? SATA 2? They still can’t be trying to push Tbolt… can they?

    About the only thing that is a step forward is the power usage.

    This is the start of even more product segmentation and more money for Intel. Want USB 3? Tough, pay an extra $10 for it. SATA 3? An extra $15. You want SRT? An extra $5. Oh, you don’t actually want your USB and SATA ports dangling off PCIe lanes? Pay an extra $20. Oh, and all these offerings will conveniently be offered in PCIe 1.0 to 3.0 flavors. Since PCIe scales so nicely, we can simply turn down the version in hardware to whatever we want and get you to pay an extra $50 for it!

    Behold the awesome future that doesn’t even involve the actual SPEED of the processor.

    SoCs are terrible for consumers outside of tablets or smartphones. You have absolutely no control of your peripherals from that point on. Motherboard manufacturers can try and cobble together peripherals and hook them up to PCIe lanes (hoping you opted for the $50-more processor to get PCIe 3.0), but that’s not nearly the same.

    Fuck that shit. I think people would buy AMD just to get away from it. Let’s hope they don’t go under, because I’m pretty sure this is a likely scenario of what the future would hold if Intel had their way. We can already see the start of it right now.

      • chuckula
      • 7 years ago

      Uh… where to begin…..

      1. Name one single AMD chipset for Bulldozer/Vishera that includes any native USB 3.0 at all. Why are you bitching that you "only" get [b]six[/b] USB 3.0 ports on a standard Intel chipset? So what if you [b]also[/b] get USB 2.0... are you thinking that your typing rate will quadruple with a USB 3.0 keyboard?

      2. SATA 2!! OMG!! On a low-power Ultrabook chipset that is integrated into the same package as the CPU!! OH THE HORROR of only having 3 SATA 6Gbit connectors in an Ultrabook! What will I do with the 8-drive enterprise storage array I want to put in my Ultrabook! Intel hates progress!

      3. A bunch of weird crap about paying more for features that doesn't have anything to do with the article. YAY! Say, tell me how much I have to pay for an AMD chipset that includes SRT? Oh, and one more time, what is the price for a chipset aimed at Bulldozer/Vishera that includes USB 3.0 again? If I had, say, $100 million in cash, could I walk into AMD's headquarters this afternoon and leave with a Bulldozer/Vishera-compatible motherboard that has native USB 3.0 support?

      4. Some whiny stuff about PCIe... do you hate this standard since AMD hasn't bothered to implement PCIe 3.0 and has no plans to do so at any time before 2014?

      EDIT: 4.5: Whoa... $50 more for PCIe 3.0?? Aside from the fact that PCIe 3.0 is built into the CPU and not the southbridge, so you are completely on crack, please show me the native AMD chipset that gives me PCIe 3.0 [b]at any price[/b]. Let's say I go back to the ATM and this time I withdraw a cool $1 billion and walk into AMD's headquarters this afternoon... will I walk out with a PCIe 3.0-enabled system? I seem to recall that when Intel first introduced PCIe 3.0, it was deemed stupid and useless by none other than... well, you. [b]Shouldn't you be thanking Intel for giving you a $50 discount for a system that isn't burdened with the stupid and useless PCIe 3.0???... Just like every AMD system in existence?[/b]

      5. [quote]SoCs are terrible for consumers outside of tablets or smartphones.[/quote] YEAH, INTEGRATION IS EVIL!! If Intel really cared about consumers, it would pull out the memory controller and put it back on the northbridge, since integrating the memory controller is part of making a SoC! AMD would never ever do that to poor, innocent consumers! Integration is bad! Oh, and don't even get me started about some evil conspiracy to put GRAPHICS on the CPU chip! That's sacrilege! AMD will protect us from all of that evil anti-consumer nonsense!

      6. [quote]SoCs are terrible for consumers outside of tablets or smartphones. You have absolutely no control of your peripherals from that point on. Motherboard manufacturers can try and cobble together peripherals and hook them up to PCIe lanes (hoping you opted for the $50-more processor to get PCIe 3.0), but that's not nearly the same.[/quote] ---> So basically you just said that hooking up peripherals via PCIe is stupid and evil... OK, what magical system does AMD use to hook up peripherals then? I just built a system that uses an AMD graphics card, and the connector looked *suspiciously* like PCIe; should I call Rory Read on the Batphone and have him come out to investigate? Is Intel in a giant evil conspiracy to cripple all of AMD's graphics by hijacking the card shipments en route and forcibly slapping PCIe connectors on them? I'm pretty sure AMD communicates with its GPUs using ground-up unicorn horns, the way REAL men do their I/O!

      7. I'm really hoping your entire post was an intentional troll... please tell me I just missed the humor there or something....

      [b]ADDENDUM:[/b] Hrmm... judging by all the knee-jerk downmods from AMD fanboys who will be slobbering with love when AMD finally catches up with Intel and puts out SoC-like systems, I'm almost regretting the fact that I defended AMD in yesterday's HPC flame-fest. I'm detecting a schizophrenic break in the AMD fanboy base: the same fanboys who rabidly defend the worst of AMD's products, like Bulldozer, simultaneously pretend that the graphics side of AMD doesn't exist, or just give it passing lip service. It's fascinating to watch, since about the only real innovation going on at AMD is on the GPU side of things too... oh, and if you are stupid enough to think that AMD isn't desperately trying to turn Trinity/Vishera into a SoC along the lines of Haswell, then I have a bridge to sell you.

        • travbrad
        • 7 years ago

        [quote]Why are you bitching that you "only" get six USB 3.0 ports on a standard Intel chipset? So what if you also get USB 2.0.. are you thinking that your typing rate will quadruple with a USB 3.0 keyboard?[/quote]

        I have to agree with you on that. Getting a million USB 3.0 ports sounds nice in theory, but how many people have enough USB 3.0 devices that having "only" six 3.0 ports is an issue? My guess would be about 0.1% or less. External HDs are the only "mainstream" devices that are held back by USB 2.0. Even most USB flash drives don't get anywhere near the limits of USB 2.0 (the cheap ones still only do about 7-8MB/s).
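(Editor's aside: as a quick sanity check on the figures in the comment above, here is a minimal arithmetic sketch. The ~35MB/s practical ceiling for USB 2.0 bulk transfers is a rule-of-thumb assumption, not a measured number.)

# USB 2.0 sanity check: 480Mbps is the raw signaling rate; protocol overhead
# typically limits real-world bulk transfers to roughly 35MB/s (assumption).
raw_mbps = 480
raw_MBps = raw_mbps / 8            # 60 MB/s before any overhead
practical_MBps = 35                # rough real-world ceiling (assumption)
cheap_flash_MBps = 8               # figure quoted in the comment above
print(f"USB 2.0: {raw_MBps:.0f} MB/s raw, ~{practical_MBps} MB/s practical")
print(f"A cheap flash drive uses ~{cheap_flash_MBps / practical_MBps:.0%} of that")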

          • Bensam123
          • 7 years ago

          You want USB 3.0 when you need USB 3.0, not because you only have two USB 3.0 devices right now. Think about tomorrow, not just today. You don’t buy a new processor every day… Better yet, think of how many USB 3.0 devices you will have over the processor’s lifetime. Forward thinking is always good when purchasing hardware.

          It costs them absolutely nothing to make all the ports USB 3.0. It’s product segmentation that will negatively affect you and me in the future. The same goes for PCI-E lane sharing between ports, which is an artificial way of creating product segmentation again.

        • Bensam123
        • 7 years ago

        Read my response to trav for #1.

        #2… These changes aren’t just being applied to the mobile sector.

        #3… It has everything to do with the article. You are breaking my post down so much you aren’t even comprehending my overall point.

        #4… Chipsets as they stand are standalone products. You can mix and match chipsets with any processor, including the features on that chipset. That forces them to actually make a good, meaningful chipset that is pretty robust. We as consumers benefit from this. If they’re integrated into the CPU, they can decide to axe features, because you have to buy it no matter what. Then they can obfuscate it behind the processor name you also have to buy.

        #5… Don’t take my post to extremes. Moving some parts of the northbridge to the CPU is a good idea, but they’re just moving everything there to blatantly exploit their ability to manipulate the peripherals. They can easily change what they offer and how much of it. They’re selectively limiting connectivity on purpose. When USB 1 and 2 came along, and SATA 1 and 2, Intel was all over them, same with PCI-E 1.0-2.0. Now they’re selectively limiting its implementation.

        #6… Don’t twist my words. I said that integrated components made by a billion-dollar giant are generally better than trying to dangle third-party controllers off of PCI-E lanes.

        #7… There is no point to this post, you’re just trying to make me look bad.

        I can guarantee you no one rated you down because they’re blatant AMD fanbois; it’s more because of the extreme use of hyperbole and sarcasm, you being an all-around dillhole, and your points not having a lot of merit behind them to back up your bark. You don’t even need to read most of the paragraphs. Your point just consists of basically one sentence and six that follow it up explaining what a ‘stupid’ thought the original point was that you’re replying to. They have no merit, and neither does your reputation when you present yourself this way.

          • chuckula
          • 7 years ago

          OK, Intel is making changes outside of the mobile sector… it’s adding more 6Gbit SATA ports and more USB3 ports to its next generation desktop chipsets *and* adding more than you could ever practically hope to use to its mobile chipsets… oh the horror, I feel so enslaved by this pure evil manipulation from Intel! Won’t Rory come and save us! Oh wait, he’s too busy putting price tags on what’s left of AMD’s assets to worry about us.. shucks.

          Why again is it the worst thing ever that Intel is giving you 6 USB3 ports... [b]when the only AMD chipset that offers USB3 at all gives you four. Wow, 6 is smaller than 4 all of a sudden; this new math will save AMD![/b]

          I feel so forced at gunpoint to use Thunderbolt when every single chipset sold for Ivy Bridge includes 2 USB3 controllers today and next year that number will go up to 6... oh, and even on Haswell parts that can theoretically fit into freakin' tablets, Intel will still be giving you four USB 3 controllers, which is apparently perfectly fine in the small selection of AMD chipsets that have USB3 but pure unadulterated EVIL on a chip that can even run in a tablet!!

          Hey, Mr. "I have so much choice in AMD chipsets!!" Please name me one chipset made by a company other than AMD that will run my Bulldozer/Vishera chip (crickets). OK, how about Trinity, since Trinity is a desktop part and AMD loves consumers so much... give me a giant list of all those Trinity chipsets that are made by Nvidia and Via... (more crickets). OK, well, AMD loves consumers, so obviously you can get all kinds of third-party chipsets that support Brazos, right?!!?!?!?! (crickets burst through the door with the axe and say "Here's Jimmini!")

            • Bensam123
            • 7 years ago

            More compared to the last chipset or compared to all the ports being USB 3 because they can simply do that?

            And no, telling someone what they will or won’t need doesn’t mean anything to me. If I own 8 USB 3 devices and I want to use all 8 of them, I should be able to do it. It doesn’t matter what YOU or Intel believe, for that matter. Stop trying to impose upon me your own views of how I should live my lifestyle.

            This is completely disregarding what I said to Trav. Use a little bit of forethought and think about what you will have 4-5 years down the road when you’re still using the same motherboard and processor. I really have no idea what’s with the trend of people being so terribly shortsighted.

            This isn’t about AMD either; it’s about Intel limiting their processors to artificially segment their product lineup (SoCs, not chipsets). That’s what my original post was all about. This is going to turn into a microtransaction cash cow. I don’t see AMD talking about integrating the same components into their processors in the same way. AMD and Intel have quite a bit more influence over their processors than they do over what chipsets motherboard makers use or opt for. The free market can decide what chipset it wants with a given processor, which I’m sure is quite skewed. Not to mention companies like Nvidia, and ATI before they were absorbed, could make their own chipsets.

            Remember back when Intel and AMD chips could be used on the same board? Or what about AMD making chipsets for Intel chips?

            Dude, once again, out of those three giant paragraphs I picked up four sentences with points to them, and the rest of it is you just being a complete and utter douchebag. It’s almost not even worth arguing with you, and I wouldn’t if those points didn’t exist and I didn’t actually think you believed in them. At least back up your sentences with actual logic or sense instead of getting up on a soapbox and trying to write a short story about what a terrible person someone is. Ridiculous amounts of hyperbole, almost to the point where it’s an unreadable mess.

            • chuckula
            • 7 years ago

            [quote]This is completely disregarding what I said to Trav. Use a little bit of forethought and think about what you will have 4-5 years down the road when you're still using the same motherboard and processor. I really have no idea what's with the trend of people being so terribly shortsighted.[/quote]

            What you said to Trav was stupid and ignorant, and you are living in your fanboy bubble as a way of denying reality about the impending fire-sale at AMD:

            1. I buy Haswell next year (and I will, AMD has nothing even remotely competitive). I get 6 USB 3.0 ports, PCIe 3.0 support, 6Gbit SATA support, etc. etc. And don't even go there about how it's horrible that Intel gives you 6 SATA III ports while AMD supposedly gives you 8... nobody ever uses that many in real life and you know it, so don't even try to lie about it or come up with some nonsensical anecdote. People who need that many disks in real life use dedicated controllers that work best with PCIe 3.0....

            2. In an alternate universe I buy Vishera this year or next year; it doesn't matter, since there aren't any upgrades planned. I get: whatever tacked-on, typically lower-performance USB 3 solution some motherboard manufacturer comes up with, because AMD in its infinite wisdom has decided that I don't need USB 3. PCI Express? Well, sorta; it's not integrated with the CPU so it's slower, and it's only PCIe 2.0, because AMD has decided I don't need that either. 6Gbit SATA? Sure, it's there if I buy the highest-end chipset (wait, didn't I hear you whining that Intel makes you pay more for extra features?).

            3. In alternate universe #2 I buy Trinity or Trinity 1.2 next year: I get [b]four, which is a smaller number than six[/b], USB 3.0 ports, some 6Gbit SATA ports that aren't any better than what Intel gave me, and PCIe [b]2.0[/b], because AMD has decided I don't need anything better than that.

            Oh, let's not forget a few things: the Haswell chip is faster at CPU, has a GPU that's better for HTPC (which is the only real use for an IGP in a desktop machine), uses far less energy, and comes with next-generation instructions that are still on the drawing board at AMD. OK, delusional fanboy: in 4-5 years... which one of those platforms will be doing better? I really can't wait for you to tell me how AMD will sneak in at night and do in-place hardware upgrades with magical elves to keep its systems vastly superior to those big evil Intel chips.

            • Bensam123
            • 7 years ago

            I don’t have a fanboi bubble. In order to have a fanboi bubble I would have to favor either Intel or AMD, and I don’t go either way… I expressed interest in AMD because Intel is going in this direction. My original post wasn’t even about AMD over Intel, but rather Intel’s direction with SoCs and their seemingly decrepit list of peripherals compared to what they could do.

            I don’t understand what a firesale has to do with any of this…

            I don’t understand why you use numbered lists and then abandon them as soon as someone uses the same list to assess points. I suppose if you actually stuck to them you couldn’t mix in nearly as much hyperbole and random BS.

            1… This isn’t about AMD vs Intel, it’s about Intel’s current trends. You seem to think this argument is explicitly about AMD vs Intel; it’s not. Get a grip, man.

            2… There is really no point to this bullet besides badmouthing AMD. You aren’t even getting close to the original topic of discussion: Intel moving parts to a SoC that can’t be changed by the end user, and that resulting in a market where Intel controls microtransactions through it.

            3… Once again random AMD hate that doesn’t even pertain to the original subject matter.

            I’m pretty sure at this point you aren’t even discussing the same thing I am. I was arguing about the direction Intel is taking in limiting connectivity, moving it out of the end user’s hands, and then turning it into a scenario where they could profit hardcore off of it. I did mention after all of this that that would result in people jumping ship to AMD. That was the only reason I mentioned AMD.

            People are right, you really do have a serious bug up your ass about AMD versus Intel. You can’t even see what we’re talking about correctly. Remember that post Geoff made about getting rid of blatant raging fanbois whose sole point is to argue one brand to the point that they don’t even make sense? Yeah…

            It doesn’t help that you’re being a complete douchebag about this too. You did manage to tone it down a bit, but your doucher aura is still coming on pretty strong. It’s not hard to be civil online and not look to insult someone every other sentence. I could say the SAME EXACT THINGS about you, but I chose not to. It’s not because I can’t or it’s too hard; it’s because I’m simply a better human being than you. You can change the way you act, though. No one is stopping you from becoming a better person.

            • chuckula
            • 7 years ago

            [quote]Intel moving parts to a SoC that can’t be changed by the end user, and that resulting in a market where Intel controls microtransactions through it.[/quote]

            Once again: you live in a fantasy world and you create a reality that doesn't need to conform to any facts. You've been doing this for years, going (at least) as far back as saying that TR shouldn't benchmark Nehalem because turbo boost is cheating and no benchmark can ever be valid on a chip that includes turbo boost (interestingly, this logic didn't apply when AMD later copied that feature, but I'm not expecting intellectual consistency out of you).

            Please tell me the last system that you owned from your beloved AMD where you could replace the freakin' chipset in-place on the motherboard... you can't! Why is this great and wonderful, but if Intel does the same thing it's a "microtransaction"? Intel is simply taking the southbridge and moving it onto the CPU package to save space and power in mobile devices. THAT'S IT. There are no "microtransactions"; there's no real change in technology except for it being an upgrade with more I/O options than in existing systems and saving power in notebooks. What is so hard for you to understand about this? Why is it wonderful that AMD segments its chipsets into all sorts of different incompatible combinations, but if Intel simply does something to save power in notebooks it's an evil microtransaction?

            Oh, and try using Trinity in an FM1 socket motherboard... OOPS, looks like AMD is charging you "microtransactions" for upgrades now!

    • anotherengineer
    • 7 years ago

    Insert NeelyCam drooling pic below 😉

      • NeelyCam
      • 7 years ago

      Friggin’ six more months…

        • ULYXX
        • 7 years ago

        Ah dang it, I was in the zone of blissful ignorance about the projected wait time.

    • MrDigi
    • 7 years ago

    What is up with dropping the HDMI and DP display interfaces, or are they just not shown because they now come directly from the CPU? VGA, being legacy, goes through the chipset.

    • Sam125
    • 7 years ago

    [quote]...I wonder if Haswell might be the last Intel microarchitecture without SoC-style integrated I/O.[/quote]

    Both AMD and Intel are uncertain about the future of desktop computing, which means the time is ripe for a disruptive technology to come along and shake up the industry. Isn't it exciting? =)

    • redavni
    • 7 years ago

    English version of the page: [url]http://wccftech.com/intel-haswell-ult-processors-power-saving-features-lynx-point-lp-chipset-detailed/[/url]

    Will it be power-efficient enough to be viable in a tablet?

      • chuckula
      • 7 years ago

      Depends on your definition of “viable”… The ULV parts are capable of running in a tablet form factor (although the tablet will be fatter than an iPad). The battery life won’t be anything spectacular, though.

      • dpaus
      • 7 years ago

      What tablet needs even the LP’s eight USB 2.0 ports, four USB 3.0 ports, four SATA ports and six PCIe lanes??

        • redavni
        • 7 years ago

        Who knows… what if Google or Microsoft or Valve put out a gaming console that was also a tablet?

        • pedro
        • 7 years ago

        My Fatal1ty Tab Extreme.

          • MadManOriginal
          • 7 years ago

          Fatabl1ty

            • BIF
            • 7 years ago

            That’s funny!

            • redavni
            • 7 years ago

            I think you guys underestimate the popularity of a Playstation that lets you play any game on Steam, Final Fantasy, and turns into an Android or iOS tablet when you unplug it, but whatever. I can dream.

    • UberGerbil
    • 7 years ago

    Quite a while ago there were statements/predictions that Broadwell would be a full SoC-style design, and that the “ultrabook” (or whatever term Intel is trademarking at that point) Haswell designs would be a step in that direction (i.e., MCM). Others suggested that Broadwell would only take the MCM step. But with Haswell seemingly ahead of schedule, I imagine Intel is happy to accelerate the timetable for everything.

      • Theolendras
      • 7 years ago

      Would make sense, given that Intel is aggressively seeking to gain share in mobile platforms. I would still bet on the one after Broadwell for SoC-style integration. It would probably be easier to move I/O on-die with a “tick” than a “tock.” I’d like to be proven wrong by Chipzilla, though.

      • dpaus
      • 7 years ago

      So, purportedly, you’re saying they should double-down?

        • pedro
        • 7 years ago

        LOL.

      • Flying Fox
      • 7 years ago

      Timna-inspired idea is coming back with a vengeance?

      • OneArmedScissor
      • 7 years ago

      Maybe it’s three versions?

      1) SoC manufactured completely differently for tablets / ultraportables
      2) MCM for larger laptops
      3) Backwards compatible CPU w/o southbridge for desktops

      They’d potentially only have to build two mainstream CPU chips, comparable to today’s dual-cores and quad-cores.

      Another possibility is that every chip gets two USB 3.0 ports, and the southbridge becomes optional or dynamically disabled. We’re talking about 2014, so native PCIe may be standardized for SSDs.

      I doubt it will be all SoC because they still have too many more important things to integrate:

      1) More VRM circuits
      2) Wireless modem
      3) Thunderbolt
      4) SSD controller
      5) RAM

      And they always blow some of it on a bigger GPU.

      Intel has been able to fit USB and SATA controllers in the CPU for years, but they have chosen not to with anything but the phone version of Atom. That implies they are waiting for the entire southbridge to become “legacy” controllers and just do away with it, rather than blow the R&D on integrating it.

    • chuckula
    • 7 years ago

    [quote]To further conserve power, it looks like ultra-low-power versions of Haswell will be able to drop their base clocks from 100MHz down to just 24MHz.[/quote]

    ... and that + improved granularity in shutting down functional units is how Haswell is getting its [i]purported[/i] 20x idle power consumption improvement over Ivy Bridge.
